CN118092669B - Intelligent glasses control method and system - Google Patents
- Publication number
- CN118092669B (application CN202410481171.4A)
- Authority
- CN
- China
- Prior art keywords
- sliding
- monitoring
- blink
- unlocking
- head
- Prior art date
- Legal status
- Active
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/013—Eye tracking input arrangements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/012—Head tracking input arrangements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/04815—Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
The embodiment of the invention relates to the technical field of intelligent glasses, and particularly discloses an intelligent glasses control method and system. According to the embodiment, eyebrow sliding is monitored in real time; when the unlocking standard is reached, the glasses are unlocked to enter a control homepage and an operation monitoring signal is generated. Eye monitoring and head activity sensing are then performed to generate eye monitoring data and head sensing data. Left and right blink recognition analysis is performed to generate corresponding selection signals, and a left or right sliding selection operation is performed on the control homepage according to the selection signals. Head activity recognition analysis is performed to generate a corresponding click signal, and a confirming or cancelling click operation is performed on the control homepage according to the click signal. By performing eyebrow sliding monitoring, eye monitoring and head activity sensing, the intelligent glasses can be unlocked, selections can be made and clicks can be performed without manual operation or voice control, so the glasses can be used in complex real-world scenarios and their use and popularization are facilitated.
Description
Technical Field
The invention belongs to the technical field of intelligent glasses, and in particular relates to an intelligent glasses control method and system.
Background
Intelligent glasses are a general term for glasses that, like smart phones, have an independent operating system. Users can install software, games and other programs provided by software service providers, complete functions such as adding schedule items, map navigation, interacting with friends, taking photos and videos, and making video calls with friends through voice or motion control, and access the network through mobile communication.
For intelligent glasses to be widely used and popularized, they must accommodate a variety of complex use scenarios, yet existing intelligent glasses are usually operated by manual switching or voice recognition. However, in some special use scenarios the user may be unable to operate the glasses by hand, and the glasses may also be unable to recognize the user's voice instructions. The intelligent functions of the glasses are therefore limited, their use in actual scenarios cannot be satisfied, and their use and popularization are inconvenient.
Disclosure of Invention
The embodiment of the invention aims to provide an intelligent glasses control method and system that solve the problems described in the background section.
In order to achieve the above object, the embodiment of the present invention provides the following technical solutions:
an intelligent glasses control method specifically comprises the following steps:
Performing eyebrow sliding monitoring in real time, generating sliding monitoring data, judging whether the sliding monitoring data reach an unlocking standard, unlocking to enter a control homepage when the sliding monitoring data reach the unlocking standard, and generating an operation monitoring signal;
According to the operation monitoring signal, eye monitoring and head activity sensing are carried out, and eye monitoring data and head sensing data are generated;
performing left and right blink recognition analysis according to the eye monitoring data, generating corresponding selection signals, and performing left and right sliding selection operation on a control homepage according to the selection signals;
and carrying out head activity recognition analysis according to the head sensing data, generating a corresponding click signal, and carrying out a clicking operation of confirming or cancelling in a control homepage according to the click signal.
As a further limitation of the technical solution of the embodiment of the present invention, the performing eyebrow sliding monitoring in real time, generating sliding monitoring data, judging whether an unlocking standard is reached, unlocking to enter the control homepage when the unlocking standard is reached, and generating an operation monitoring signal specifically includes the following steps:
Performing eyebrow sliding monitoring in real time to generate sliding monitoring data;
according to the sliding monitoring data, when the eyebrow slides, calculating the continuous sliding times and the continuous sliding gaps;
When the continuous sliding times are not less than the preset standard sliding times and the continuous sliding gaps are smaller than the preset standard sliding gaps, the unlocking standard is reached;
unlocking to enter a control homepage, and generating an operation monitoring signal.
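For illustration, a minimal Python sketch of the count-and-gap gating described in the steps above follows; the data layout and the concrete threshold values are assumptions made for the example and are not specified by this disclosure.

```python
# Minimal sketch of the eyebrow-slide unlock gating described above.
# The data layout and threshold values are illustrative assumptions only.

PRESET_STANDARD_SLIDE_COUNT = 3     # preset standard number of consecutive slides (assumed)
PRESET_STANDARD_SLIDE_GAP_S = 0.8   # preset standard sliding gap in seconds (assumed)

def reaches_unlock_standard(slide_times):
    """slide_times: ascending time points (in seconds) of detected eyebrow slides."""
    if len(slide_times) < PRESET_STANDARD_SLIDE_COUNT:
        return False                                   # too few consecutive slides
    gaps = [t2 - t1 for t1, t2 in zip(slide_times, slide_times[1:])]
    return all(gap < PRESET_STANDARD_SLIDE_GAP_S for gap in gaps)
```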
As a further limitation of the technical solution of the embodiment of the present invention, the generating the eye monitoring data and the head sensing data by performing the eye monitoring and the head activity sensing according to the operation monitoring signal specifically includes the following steps:
according to the operation monitoring signal, left eye monitoring is carried out, and left eye monitoring data are generated;
according to the operation monitoring signal, right eye monitoring is carried out, and right eye monitoring data are generated;
according to the operation monitoring signal, nodding activity induction is carried out, and nodding induction data are generated;
and according to the operation monitoring signal, performing head swing movement induction to generate head swing induction data.
As a further limitation of the technical solution of the embodiment of the present invention, the performing a blink recognition analysis according to the eye monitoring data, generating a corresponding selection signal, and performing a selection operation of sliding left and right on a homepage according to the selection signal, specifically includes the following steps:
According to the eye monitoring data, when the existence of blinks is determined, calculating the continuous blink times and continuous blink gaps;
Generating a corresponding selection signal when the continuous blink times are not less than the preset standard blink times and the continuous blink gaps are all less than the preset standard blink gaps;
and according to the selection signal, performing a left-right sliding selection operation on the control homepage.
As a further limitation of the technical solution of the embodiment of the present invention, the performing head activity recognition analysis according to the head sensing data, generating a corresponding click signal, and performing a confirming or cancelling click operation on the control homepage according to the click signal specifically includes the following steps:
performing head activity recognition analysis according to the nodding induction data to generate a confirmation click signal;
Performing a confirmation clicking operation on the control homepage according to the confirmation clicking signal;
performing head activity recognition analysis according to the head swing sensing data to generate a cancel click signal;
And according to the cancel click signal, performing cancel click operation on the control homepage.
An intelligent glasses control system, the system comprising an eyebrow sliding monitoring unit, a head activity sensing unit, a blink recognition analysis unit and a head activity analysis unit, wherein:
The eyebrow sliding monitoring unit is used for carrying out eyebrow sliding monitoring in real time, generating sliding monitoring data, judging whether the sliding monitoring data reach an unlocking standard, unlocking to enter a control homepage when the sliding monitoring data reach the unlocking standard, and generating an operation monitoring signal;
the head activity sensing unit is used for performing eye monitoring and head activity sensing according to the operation monitoring signal to generate eye monitoring data and head sensing data;
the blink recognition analysis unit is used for performing left and right blink recognition analysis according to the eye monitoring data, generating corresponding selection signals, and performing left and right sliding selection operation on a control homepage according to the selection signals;
And the head activity analysis unit is used for carrying out head activity identification analysis according to the head induction data, generating a corresponding click signal, and carrying out a clicking operation of confirming or cancelling in a control homepage according to the click signal.
As a further limitation of the technical solution of the embodiment of the present invention, the eyebrow sliding monitoring unit specifically includes:
the sliding monitoring module is used for carrying out eyebrow sliding monitoring in real time and generating sliding monitoring data;
the sliding calculation module is used for calculating continuous sliding times and continuous sliding gaps when the eyebrow slides according to the sliding monitoring data;
the sliding comparison module is used for calculating a comprehensive unlocking index when the continuous sliding times are not less than the preset standard sliding times and the continuous sliding gaps are all less than the preset standard sliding gaps, and judging whether the unlocking standard is reached according to the comprehensive unlocking index;
and the unlocking processing module is used for unlocking to enter the control homepage and generating an operation monitoring signal.
As a further limitation of the technical solution of the embodiment of the present invention, the head activity sensing unit specifically includes:
the left eye monitoring module is used for monitoring left eyes according to the operation monitoring signals and generating left eye monitoring data;
The right eye monitoring module is used for monitoring the right eye according to the operation monitoring signal and generating right eye monitoring data;
The nodding induction module is used for carrying out nodding activity induction according to the operation monitoring signal to generate nodding induction data;
and the head-swing sensing module is used for performing head-swing movement sensing according to the operation monitoring signal to generate head-swing sensing data.
As a further limitation of the technical solution of the embodiment of the present invention, the blink recognition analysis unit specifically includes:
the blink calculation module is used for calculating the continuous blink times and continuous blink gaps when the blinks are determined to exist according to the eye monitoring data;
The blink comparison module is used for generating corresponding selection signals when the continuous blink times are not smaller than the preset standard blink times and the continuous blink gaps are smaller than the preset standard blink gaps;
and the selection operation module is used for performing left-right sliding selection operation on the control homepage according to the selection signal.
As a further limitation of the technical solution of the embodiment of the present invention, the head activity analysis unit specifically includes:
The nodding analysis module is used for carrying out head activity recognition analysis according to the nodding induction data and generating a confirmation click signal;
The confirmation operation module is used for carrying out confirmation clicking operation on the operation homepage according to the confirmation clicking signal;
the head swing analysis module is used for performing head activity recognition analysis according to the head swing sensing data and generating a cancel click signal;
And the cancel operation module is used for carrying out cancel click operation on the control homepage according to the cancel click signal.
Compared with the prior art, the invention has the beneficial effects that:
According to the embodiment of the invention, eyebrow sliding is monitored in real time; when the unlocking standard is reached, the glasses are unlocked to enter the control homepage and an operation monitoring signal is generated. Eye monitoring and head activity sensing are then performed to generate eye monitoring data and head sensing data. Left and right blink recognition analysis is performed to generate corresponding selection signals, and a left or right sliding selection operation is performed on the control homepage according to the selection signals. Head activity recognition analysis is performed to generate a corresponding click signal, and a confirming or cancelling click operation is performed on the control homepage according to the click signal. By performing eyebrow sliding monitoring, eye monitoring and head activity sensing, the intelligent glasses can be unlocked, selections can be made and clicks can be performed without manual operation or voice control, so the glasses can be used in complex real-world scenarios and their use and popularization are facilitated.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings needed in the description of the embodiments or the prior art are briefly introduced below; obviously, the drawings described below illustrate only some embodiments of the present invention.
Fig. 1 shows a flowchart of a method provided by an embodiment of the present invention.
Fig. 2 shows a flowchart of sliding monitoring unlock determination in the method provided by the embodiment of the present invention.
Fig. 3 shows a flowchart of eye monitoring head sensing in a method according to an embodiment of the invention.
Fig. 4 shows a flowchart of a left-right blink recognition analysis in a method according to an embodiment of the present invention.
FIG. 5 shows a flow chart of head activity recognition analysis in a method provided by an embodiment of the invention.
Fig. 6 shows an application architecture diagram of a system provided by an embodiment of the present invention.
Fig. 7 is a block diagram of an eyebrow sliding monitoring unit in the system according to the embodiment of the invention.
Fig. 8 is a block diagram illustrating a structure of a head activity sensing unit in the system according to an embodiment of the present invention.
Fig. 9 shows a block diagram of a blink recognition analysis unit in the system according to an embodiment of the present invention.
Fig. 10 is a block diagram showing the structure of a head activity analysis unit in the system according to the embodiment of the present invention.
Detailed Description
The present invention will be described in further detail with reference to the drawings and examples, in order to make the objects, technical solutions and advantages of the present invention more apparent. It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the scope of the invention.
It can be appreciated that, in the prior art, users of intelligent glasses usually switch functions manually or through voice recognition. However, in some special use scenarios the user may be unable to operate the glasses by hand, and the glasses may also be unable to recognize the user's voice instructions, so the intelligent functions of the glasses are limited, their use in actual scenarios cannot be satisfied, and their use and popularization are inconvenient.
To solve these problems, according to the embodiment of the invention, eyebrow sliding is monitored in real time; when the unlocking standard is reached, the glasses are unlocked to enter the control homepage and an operation monitoring signal is generated. Eye monitoring and head activity sensing are then performed to generate eye monitoring data and head sensing data. Left and right blink recognition analysis is performed to generate corresponding selection signals, and a left or right sliding selection operation is performed on the control homepage according to the selection signals. Head activity recognition analysis is performed to generate a corresponding click signal, and a confirming or cancelling click operation is performed on the control homepage according to the click signal. By performing eyebrow sliding monitoring, eye monitoring and head activity sensing, the intelligent glasses can be unlocked, selections can be made and clicks can be performed without manual operation or voice control, so the glasses can be used in complex real-world scenarios and their use and popularization are facilitated.
Fig. 1 shows a flowchart of a method provided by an embodiment of the present invention.
Specifically, a method for controlling intelligent glasses specifically includes the following steps:
Step S101, performing eyebrow sliding monitoring in real time, generating sliding monitoring data, judging whether an unlocking standard is met, unlocking to enter a control homepage when the unlocking standard is met, and generating an operation monitoring signal.
In some special use scenarios, a user wearing the intelligent glasses cannot conveniently operate them by hand or control their functions by voice, and another convenient operation method suited to such complex scenarios is needed. The inner side of the upper part of the intelligent glasses rests against the user's eyebrows, so eyebrow sliding can be monitored in real time to generate sliding monitoring data. When eyebrow sliding is sensed, the current sliding monitoring data are analyzed and the number of consecutive slides and the gaps between consecutive slides are calculated. When the number of consecutive slides is not less than the preset standard number of slides and every gap is smaller than the preset standard gap, the comprehensive unlocking index is calculated. When the comprehensive unlocking index indicates that the unlocking standard of the intelligent glasses is reached, the glasses are unlocked and the control homepage of the intelligent glasses is entered, so the user can view the control homepage through the glasses, and an operation monitoring signal is generated.
Specifically, fig. 2 shows a flowchart of sliding monitoring and unlocking judgment in the method provided by the embodiment of the invention.
In the preferred embodiment of the present invention, the performing eyebrow sliding monitoring in real time, generating sliding monitoring data, judging whether an unlocking standard is reached, unlocking to enter the control homepage when the unlocking standard is reached, and generating an operation monitoring signal specifically includes the following steps:
Step S1011, performing real-time eyebrow sliding monitoring to generate sliding monitoring data.
Step S1012, according to the sliding monitoring data, when the eyebrow sliding exists, the continuous sliding times and the continuous sliding gaps are calculated.
In step S1013, when the number of continuous sliding times is not less than the preset standard sliding times and the continuous sliding gaps are all less than the preset standard sliding gaps, the comprehensive unlocking index is calculated, and whether the unlocking standard is reached is determined according to the comprehensive unlocking index.
The step S1013 specifically includes the following sub-steps:
Step S1013a, obtaining the number of continuous sliding times, the time point of each sliding, the speed of each sliding and the sliding mode corresponding to each sliding, and calculating to obtain a continuous sliding gap according to the time points of adjacent sliding;
step S1013b, calculating to obtain a dynamic sliding rhythm index according to the continuous sliding times and the time point of each sliding;
The dynamic sliding rhythm index R is calculated from the time points of the slides, where R denotes the dynamic sliding rhythm index, t_i denotes the time point of the i-th slide, and n denotes the number of consecutive slides.
It should be noted here that the dynamic sliding rhythm index R reflects the variation of the sliding rhythm; the smaller its value, the more stable the sliding rhythm.
Step S1013c, calculating the sliding acceleration change rate according to the sliding speed;
The sliding acceleration change rate A is calculated from the slide speeds, where A denotes the sliding acceleration change rate and v_i denotes the speed of the i-th slide.
It should be noted here that the sliding acceleration change rate A reflects the degree of variation of the sliding acceleration; the smaller its value, the more stable the sliding acceleration.
Step S1013d, calculating according to the sliding mode corresponding to each sliding to obtain the sliding entropy;
The sliding entropy H is calculated from the occurrence probabilities of the sliding modes, where H denotes the sliding entropy, p_j denotes the probability of occurrence of the j-th sliding mode, and m denotes the total number of different sliding modes.
It will be appreciated that the sliding entropy is mainly used to quantify the complexity and uncertainty of the sliding pattern. A sliding mode may be defined based on factors such as the speed, acceleration and direction of the slide. The larger the sliding entropy H, the more complex the sliding pattern.
Step S1013e, calculating to obtain a comprehensive unlocking index according to the dynamic sliding rhythm index, the sliding acceleration change rate and the sliding entropy, and determining that the unlocking standard is reached when the comprehensive unlocking index is judged to be smaller than a preset unlocking index threshold.
The comprehensive unlocking index U is calculated as a weighted combination of the dynamic sliding rhythm index, the sliding acceleration change rate and the sliding entropy, where U denotes the comprehensive unlocking index and the weight coefficients determine the contribution of each term.
It can be appreciated that in the comprehensive unlocking index the sliding entropy enters with a negative sign, because the higher the sliding entropy (i.e. the more complex the sliding pattern), the greater the likelihood that an unlock is intended. When the comprehensive unlocking index U is smaller than the preset unlocking index threshold, the unlocking standard is considered to be met.
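As an illustration of steps S1013b to S1013e, the Python sketch below computes one plausible form of the three statistics and their weighted combination, refining the gating sketch given earlier. Since the exact formulas are not reproduced in this text, the concrete statistics (standard deviation of slide intervals, mean absolute change of acceleration, Shannon entropy) and the weights and threshold are assumptions chosen only to match the stated behaviour: smaller rhythm and acceleration values mean a steadier slide, and the entropy term enters with a negative sign.

```python
import math

def rhythm_index(times):
    """Variation of the sliding rhythm computed from the slide time points t_1..t_n."""
    gaps = [t2 - t1 for t1, t2 in zip(times, times[1:])]
    if not gaps:
        return 0.0
    mean = sum(gaps) / len(gaps)
    return math.sqrt(sum((g - mean) ** 2 for g in gaps) / len(gaps))

def acceleration_change_rate(speeds):
    """Degree of variation of the sliding acceleration, derived from slide speeds v_1..v_n."""
    accels = [v2 - v1 for v1, v2 in zip(speeds, speeds[1:])]
    changes = [abs(a2 - a1) for a1, a2 in zip(accels, accels[1:])]
    return sum(changes) / len(changes) if changes else 0.0

def sliding_entropy(modes):
    """Shannon entropy of the observed sliding modes (complexity of the slide pattern)."""
    probs = [modes.count(m) / len(modes) for m in set(modes)]
    return -sum(p * math.log2(p) for p in probs)

def comprehensive_unlock_index(times, speeds, modes, weights=(0.4, 0.3, 0.3)):
    """Weighted combination; the entropy term is subtracted, as described above."""
    w1, w2, w3 = weights
    return (w1 * rhythm_index(times)
            + w2 * acceleration_change_rate(speeds)
            - w3 * sliding_entropy(modes))

UNLOCK_INDEX_THRESHOLD = 0.5   # preset unlocking index threshold (illustrative value)

def unlock_standard_reached(times, speeds, modes):
    return comprehensive_unlock_index(times, speeds, modes) < UNLOCK_INDEX_THRESHOLD
```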
Step S1014, unlock entry manipulation home page, and generate operation monitoring signal.
Further, the intelligent glasses control method further comprises the following steps:
step S102, eye monitoring and head movement sensing are carried out according to the operation monitoring signals, and eye monitoring data and head sensing data are generated.
In the embodiment of the invention, according to the operation monitoring signal, the intelligent glasses are activated to monitor and sense the activities of the user's eyes and head. Specifically, the left eye and the right eye of the user are monitored and captured separately to generate left eye monitoring data and right eye monitoring data, and the user's nodding and head-swinging motions are sensed to generate nodding sensing data and head-swing sensing data, respectively.
It can be appreciated that acceleration detection techniques can be used for motion sensing of the user's head, so that nodding and head-swinging motions can be detected.
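A minimal sketch of such motion sensing follows, assuming a gyroscope-style pitch/yaw interface; the axis convention, thresholds and sample format are illustrative assumptions, not details taken from this disclosure.

```python
# Hedged sketch of nod / head-swing sensing from the glasses' inertial sensor,
# following the acceleration-detection remark above. The axis convention,
# thresholds and sample format are illustrative assumptions only.

PITCH_RATE_THRESHOLD = 1.5   # rad/s peak treated as a nod (assumed value)
YAW_RATE_THRESHOLD = 1.5     # rad/s peak treated as a head swing (assumed value)

def classify_head_motion(gyro_samples):
    """gyro_samples: list of (pitch_rate, yaw_rate) tuples over a short time window.

    Returns 'nod' for pitch-dominant motion, 'swing' for yaw-dominant motion, else None.
    """
    if not gyro_samples:
        return None
    peak_pitch = max(abs(p) for p, _ in gyro_samples)
    peak_yaw = max(abs(y) for _, y in gyro_samples)
    if peak_pitch >= PITCH_RATE_THRESHOLD and peak_pitch > peak_yaw:
        return "nod"      # used to generate the confirmation click signal
    if peak_yaw >= YAW_RATE_THRESHOLD and peak_yaw > peak_pitch:
        return "swing"    # used to generate the cancel click signal
    return None
```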
Specifically, fig. 3 shows a flowchart of eye monitoring head sensing in the method provided in the embodiment of the present invention.
In a preferred embodiment of the present invention, the eye monitoring and head activity sensing are performed according to the operation monitoring signal, and the generating of the eye monitoring data and the head sensing data specifically includes the following steps:
And step S1021, performing left eye monitoring according to the operation monitoring signal, and generating left eye monitoring data.
Step S1022, performing right eye monitoring according to the operation monitoring signal, and generating right eye monitoring data.
Step S1023, according to the operation monitoring signal, the nodding motion sensing is carried out, and nodding sensing data are generated.
Step S1024, according to the operation monitoring signal, the head swing movement sensing is performed, and the head swing sensing data is generated.
Further, the intelligent glasses control method further comprises the following steps:
step S103, performing left and right blink recognition analysis according to the eye monitoring data, generating corresponding selection signals, and performing left and right sliding selection operation on a control homepage according to the selection signals.
In the embodiment of the invention, blink recognition is performed on the user's left eye and right eye according to the left eye monitoring data and the right eye monitoring data. When blinks are present, the number of consecutive blinks and the gaps between consecutive blinks of the left eye or the right eye are calculated. When the number of consecutive blinks is not less than the preset standard number of blinks and every gap is smaller than the preset standard blink gap, a corresponding selection signal is generated: left-eye blinks generate a left-slide selection signal and right-eye blinks generate a right-slide selection signal. A left or right sliding selection operation is then performed on the control homepage according to the selection signal, so that for a left-slide selection signal the function image marked by the selection frame slides to the left, and for a right-slide selection signal it slides to the right.
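The following sketch illustrates this blink-to-selection-signal logic; the per-eye blink timestamps and the preset threshold values are placeholders chosen for the example.

```python
# Hedged sketch of the left/right blink selection logic described above.
# The per-eye blink timestamps and threshold values are illustrative assumptions.

PRESET_STANDARD_BLINK_COUNT = 2     # preset standard number of consecutive blinks (assumed)
PRESET_STANDARD_BLINK_GAP_S = 0.6   # preset standard blink gap in seconds (assumed)

def selection_signal(left_blink_times, right_blink_times):
    """Return 'slide_left', 'slide_right' or None from per-eye blink time points."""
    def qualifies(times):
        if len(times) < PRESET_STANDARD_BLINK_COUNT:
            return False
        gaps = [t2 - t1 for t1, t2 in zip(times, times[1:])]
        return all(gap < PRESET_STANDARD_BLINK_GAP_S for gap in gaps)

    if qualifies(left_blink_times):
        return "slide_left"    # left-eye blinks: slide the marked frame to the left
    if qualifies(right_blink_times):
        return "slide_right"   # right-eye blinks: slide the marked frame to the right
    return None
```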
Specifically, fig. 4 shows a flowchart of a left-right blink recognition analysis in the method provided by the embodiment of the present invention.
In a preferred embodiment of the present invention, the performing a blink recognition analysis according to the eye monitoring data, generating a corresponding selection signal, and performing a selection operation of sliding left and right on a homepage according to the selection signal specifically includes the following steps:
Step S1031, calculating a continuous blink number and a continuous blink interval when it is determined that blink exists according to the eye monitoring data.
Step S1032, when the continuous blink times are not less than the preset standard blink times and the continuous blink gaps are all less than the preset standard blink gaps, generating corresponding selection signals.
Step S1033, according to the selection signal, performing a left-right sliding selection operation on the manipulation homepage.
The step S1033 specifically includes the following sub-steps:
Step S1033a, obtaining a movement amount of the eyeball in the horizontal direction at each blink, a movement speed of the eyeball at each blink, a time point at each blink and a duration time at each blink, and calculating a continuous blink gap according to the time points of adjacent blinks;
Step S1033b, calculating to obtain a basic sliding direction according to the movement amount of the eyeballs in the horizontal direction during each blink;
The basic sliding direction D is obtained by taking the sign of the sum of the horizontal movement amounts of the eyeball over the blinks, where D denotes the basic sliding direction of the eyeball, x_i denotes the movement amount of the eyeball in the horizontal direction at the i-th blink, and k denotes the total number of blinks.
In this case, if the summed movement is positive, D is positive and indicates a rightward slide; if the summed movement is negative, D is negative and indicates a leftward slide; if the summed movement is zero, the selection remains stationary.
Step S1033c, calculating to obtain a sliding power index according to the moving speed of the eyeballs at each blink and the duration time at each blink;
The sliding power index P is calculated from the movement speed of the eyeball at each blink and the duration of each blink, where P denotes the sliding power index, v_i denotes the movement speed of the eyeball at the i-th blink, and d_i denotes the duration of the i-th blink.
It will be appreciated that this formula is used to quantify the average eye movement speed of the user during blinking and the duration of blinking, thereby evaluating the user's sliding intent and power.
Step S1033d, determining a final sliding direction and a final sliding distance based on the basic sliding direction and the sliding power index.
The final sliding direction follows the basic sliding direction D, and the final sliding distance S is obtained by scaling the sliding power index P with an adjustable scale factor that converts the sliding power index into an actual screen sliding distance.
Step S1033e, performing a corresponding sliding operation on the manipulation home page according to the final sliding direction and the final sliding distance.
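As an illustration of steps S1033a to S1033e, a short sketch follows: the sign-of-sum direction rule matches the description above, while the concrete form of the sliding power index and the scale factor are assumptions, since the exact formulas are not reproduced in this text.

```python
# Hedged sketch of the eye-movement slide mapping of steps S1033a-S1033e.
# The power-index statistic and the scale value are illustrative assumptions.

def basic_direction(horizontal_moves):
    """Sign of the summed horizontal eyeball movement over all blinks:
    > 0 slide right, < 0 slide left, 0 remain stationary."""
    total = sum(horizontal_moves)
    return (total > 0) - (total < 0)

def power_index(speeds, durations):
    """Average eye-movement speed weighted by blink duration (assumed form)."""
    return sum(v * d for v, d in zip(speeds, durations)) / sum(durations)

def final_slide(horizontal_moves, speeds, durations, scale=40.0):
    """Return (direction, distance); 'scale' is the adjustable factor converting
    the power index into an actual screen sliding distance (illustrative value)."""
    direction = basic_direction(horizontal_moves)
    distance = scale * power_index(speeds, durations)
    return direction, distance

# Example: three blinks with net rightward eye movement
# final_slide([2.1, 1.8, 2.4], [12.0, 10.5, 11.2], [0.15, 0.12, 0.14])
```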
Further, the intelligent glasses control method further comprises the following steps:
Step S104, performing head activity recognition analysis according to the head sensing data, generating a corresponding click signal, and performing a clicking operation of confirming or cancelling in the operation homepage according to the click signal.
According to the embodiment of the invention, the user's nodding and head-swinging are identified from the nodding sensing data and the head-swing sensing data. When a nod is identified, a confirmation click signal is generated to confirm and open the function image marked by the selection frame; when a head swing is identified, a cancel click signal is generated to cancel opening the function image marked by the selection frame, or to return to the upper-level menu.
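A small dispatch sketch for this confirm/cancel behaviour follows, reusing the 'nod'/'swing' labels from the earlier head-motion sketch; the homepage interface (open_marked_function, cancel_selection, go_to_upper_menu, has_open_function) is a hypothetical stand-in for the glasses' actual UI layer.

```python
# Hedged sketch: dispatching a recognized head gesture to the confirm/cancel click
# behaviour described above. The homepage methods used here are hypothetical.

def dispatch_head_gesture(gesture, homepage):
    if gesture == "nod":
        homepage.open_marked_function()        # confirmation click signal
    elif gesture == "swing":
        # cancel click signal: close the marked function or go back one menu level
        if homepage.has_open_function():
            homepage.cancel_selection()
        else:
            homepage.go_to_upper_menu()
```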
Specifically, fig. 5 shows a flowchart of a head activity recognition analysis in the method according to the embodiment of the present invention.
In a preferred embodiment of the present invention, the performing head activity recognition analysis according to the head sensing data generates a corresponding click signal, and the clicking operation for performing confirmation or cancellation on the manipulation homepage according to the click signal specifically includes the following steps:
step S1041, performing a head activity recognition analysis according to the nodding induction data, and generating a confirmation click signal.
Step S1042, performing a confirmation click operation on the manipulation homepage according to the confirmation click signal.
Step S1043, performing head activity recognition analysis according to the swing sensing data, and generating a cancel click signal.
Step S1044, performing a cancel click operation on the manipulation homepage according to the cancel click signal.
Further, fig. 6 shows an application architecture diagram of the system provided by the embodiment of the present invention.
In another preferred embodiment of the present invention, an intelligent glasses control system includes:
The brow sliding monitoring unit 101 is configured to perform brow sliding monitoring in real time, generate sliding monitoring data, determine whether an unlocking standard is reached, unlock the brow sliding monitoring data to enter a control homepage when the unlocking standard is reached, and generate an operation monitoring signal.
In the embodiment of the invention, in some special use scenarios a user wearing the intelligent glasses cannot conveniently operate them by hand or control their functions by voice, and a more convenient operation method suited to such complex scenarios is needed. The eyebrow sliding monitoring unit 101, attached to the user's eyebrows by the inner side of the upper part of the intelligent glasses, monitors the sliding of the user's eyebrows in real time and generates sliding monitoring data. When eyebrow sliding is sensed, the current sliding monitoring data are analyzed and the number of consecutive slides and the gaps between consecutive slides are calculated. When the number of consecutive slides is not less than the preset standard number of slides and every gap is smaller than the preset standard gap, the unlocking standard of the intelligent glasses is reached; the glasses are then unlocked, the control homepage of the intelligent glasses is entered so that the user can view it through the glasses, and an operation monitoring signal is generated.
Specifically, fig. 7 shows a block diagram of the eyebrow sliding monitoring unit 101 in the system according to the embodiment of the present invention.
In a preferred embodiment of the present invention, the eyebrow sliding monitoring unit 101 specifically includes:
the sliding monitoring module 1011 is used for performing the eyebrow sliding monitoring in real time to generate sliding monitoring data.
And a sliding calculation module 1012 for calculating the number of continuous sliding times and the continuous sliding gap when the brow portion slides according to the sliding monitoring data.
The sliding comparison module 1013 is configured to determine that the unlocking standard is reached when the number of continuous sliding times is not less than the preset standard sliding times and the continuous sliding gaps are all less than the preset standard sliding gaps.
The unlocking processing module 1014 is used for unlocking to enter the control homepage and generating an operation monitoring signal.
Further, the intelligent glasses control system further comprises:
and the head activity sensing unit 102 is used for performing eye monitoring and head activity sensing according to the operation monitoring signal to generate eye monitoring data and head sensing data.
In the embodiment of the present invention, the head activity sensing unit 102 activates the intelligent glasses to monitor and sense the activities of the user's eyes and head according to the operation monitoring signal. Specifically, the left eye and the right eye of the user are monitored and captured separately to generate left eye monitoring data and right eye monitoring data, and the user's nodding and head-swinging motions are sensed to generate nodding sensing data and head-swing sensing data, respectively.
Specifically, fig. 8 shows a block diagram of the head activity sensing unit 102 in the system according to the embodiment of the present invention.
In a preferred embodiment of the present invention, the head activity sensing unit 102 specifically includes:
the left eye monitoring module 1021 is configured to perform left eye monitoring according to the operation monitoring signal, and generate left eye monitoring data.
The right eye monitoring module 1022 is configured to perform right eye monitoring according to the operation monitoring signal, and generate right eye monitoring data.
The nodding induction module 1023 is used for performing nodding activity induction according to the operation monitoring signal to generate nodding induction data.
The head-swing sensing module 1024 is configured to perform head-swing motion sensing according to the operation monitoring signal, and generate head-swing sensing data.
Further, the intelligent glasses control system further comprises:
and the blink recognition analysis unit 103 is used for performing blink recognition analysis on the left and right sides according to the eye monitoring data, generating corresponding selection signals, and performing left and right sliding selection operation on the control homepage according to the selection signals.
In the embodiment of the present invention, the blink recognition analysis unit 103 performs blink recognition on the user's left eye and right eye according to the left eye monitoring data and the right eye monitoring data, and calculates the number of consecutive blinks and the gaps between consecutive blinks of the left eye or the right eye when blinks are present. When the number of consecutive blinks is not less than the preset standard number of blinks and every gap is smaller than the preset standard blink gap, a corresponding selection signal is generated: left-eye blinks generate a left-slide selection signal and right-eye blinks generate a right-slide selection signal. A left or right sliding selection operation is then performed on the control homepage according to the selection signal, so that for a left-slide selection signal the function image marked by the selection frame slides to the left, and for a right-slide selection signal it slides to the right.
Specifically, fig. 9 shows a block diagram of the blink recognition analysis unit 103 in the system according to the embodiment of the present invention.
In a preferred embodiment of the present invention, the blink recognition analysis unit 103 specifically includes:
and a blink calculation module 1031 for calculating a continuous blink number and a continuous blink clearance when it is determined that blinks exist according to the eye monitoring data.
The blink comparison module 1032 is configured to generate a corresponding selection signal when the continuous blink times are not less than a preset standard blink times and the continuous blink intervals are all less than the preset standard blink intervals.
A selection operation module 1033 for performing a selection operation of sliding left and right on the manipulation home page according to the selection signal.
Further, the intelligent glasses control system further comprises:
The head activity analysis unit 104 is configured to perform head activity recognition analysis according to the head sensing data, generate a corresponding click signal, and perform a click operation of confirmation or cancellation on the manipulation homepage according to the click signal.
In the embodiment of the invention, the head activity analysis unit 104 identifies the user's nodding and head-swinging from the nodding sensing data and the head-swing sensing data. When a nod is identified, a confirmation click signal is generated to confirm and open the function image marked by the selection frame; when a head swing is identified, a cancel click signal is generated to cancel opening the function image marked by the selection frame, or to return to the upper-level menu.
Specifically, fig. 10 shows a block diagram of the structure of the head activity analysis unit 104 in the system according to the embodiment of the present invention.
In a preferred embodiment of the present invention, the head activity analysis unit 104 specifically includes:
the nodding analysis module 1041 is configured to perform head activity recognition analysis according to the nodding sensing data, and generate a confirmation click signal.
And a confirmation operation module 1042, configured to perform a confirmation click operation on the manipulation homepage according to the confirmation click signal.
The head swing analysis module 1043 is configured to perform head activity recognition analysis according to the head swing sensing data, and generate a cancel click signal.
And a cancel operation module 1044, configured to perform a cancel click operation on the manipulation homepage according to the cancel click signal.
It should be understood that, although the steps in the flowcharts of the embodiments of the present invention are shown in the order indicated by the arrows, these steps are not necessarily performed in that order. Unless explicitly stated herein, the order of execution is not strictly limited, and the steps may be performed in other orders. Moreover, at least some of the steps in the various embodiments may include multiple sub-steps or stages that are not necessarily performed at the same time but may be performed at different times; these sub-steps or stages are not necessarily performed in sequence, and may be performed in turn or alternately with at least a portion of the sub-steps or stages of other steps.
Those skilled in the art will appreciate that all or part of the processes in the methods of the above embodiments may be implemented by a computer program for instructing relevant hardware, where the program may be stored in a non-volatile computer readable storage medium, and where the program, when executed, may include processes in the embodiments of the methods described above. Any reference to memory, storage, database, or other medium used in embodiments provided herein may include non-volatile and/or volatile memory. The nonvolatile memory can include Read Only Memory (ROM), programmable ROM (PROM), electrically Programmable ROM (EPROM), electrically Erasable Programmable ROM (EEPROM), or flash memory. Volatile memory can include Random Access Memory (RAM) or external cache memory. By way of illustration and not limitation, RAM is available in a variety of forms such as Static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double Data Rate SDRAM (DDRSDRAM), enhanced SDRAM (ESDRAM), synchronous link (SYNCHLINK) DRAM (SLDRAM), memory bus (Rambus) direct RAM (RDRAM), direct memory bus dynamic RAM (DRDRAM), and memory bus dynamic RAM (RDRAM), among others.
The technical features of the above-described embodiments may be combined arbitrarily. For brevity of description, not all possible combinations of the technical features in the above-described embodiments are described; however, as long as there is no contradiction between the combinations of these technical features, they should be considered to fall within the scope of this description.
The foregoing examples illustrate only a few embodiments of the invention and are described in detail herein without thereby limiting the scope of the invention. It should be noted that it will be apparent to those skilled in the art that several variations and modifications can be made without departing from the spirit of the invention, which are all within the scope of the invention. Accordingly, the scope of protection of the present invention is to be determined by the appended claims.
The foregoing description of the preferred embodiments of the invention is not intended to be limiting, but rather is intended to cover all modifications, equivalents, and alternatives falling within the spirit and principles of the invention.
Claims (6)
1. An intelligent glasses control method, characterized by comprising the following steps:
Performing eyebrow sliding monitoring in real time, generating sliding monitoring data, judging whether the sliding monitoring data reach an unlocking standard, unlocking to enter a control homepage when the sliding monitoring data reach the unlocking standard, and generating an operation monitoring signal;
According to the operation monitoring signal, eye monitoring and head activity sensing are carried out, and eye monitoring data and head sensing data are generated;
performing left and right blink recognition analysis according to the eye monitoring data, generating corresponding selection signals, and performing left and right sliding selection operation on a control homepage according to the selection signals;
performing head activity recognition analysis according to the head sensing data, generating a corresponding click signal, and performing a clicking operation of confirming or cancelling in a control homepage according to the click signal;
wherein the performing eyebrow sliding monitoring in real time, generating sliding monitoring data, judging whether the sliding monitoring data reach an unlocking standard, unlocking to enter a control homepage when the unlocking standard is reached, and generating an operation monitoring signal specifically comprises the following steps:
Performing eyebrow sliding monitoring in real time to generate sliding monitoring data;
according to the sliding monitoring data, when the eyebrow slides, calculating the continuous sliding times and the continuous sliding gaps;
When the continuous sliding times are not less than the preset standard sliding times and the continuous sliding gaps are all smaller than the preset standard sliding gaps, calculating a comprehensive unlocking index, and judging whether the unlocking standard is met according to the comprehensive unlocking index;
Unlocking to enter a control homepage, and generating an operation monitoring signal;
According to the operation monitoring signal, eye monitoring and head activity sensing are carried out, and the generation of eye monitoring data and head sensing data specifically comprises the following steps:
according to the operation monitoring signal, left eye monitoring is carried out, and left eye monitoring data are generated;
according to the operation monitoring signal, right eye monitoring is carried out, and right eye monitoring data are generated;
according to the operation monitoring signal, nodding activity induction is carried out, and nodding induction data are generated;
according to the operation monitoring signal, head swing movement induction is carried out, and head swing induction data are generated;
When the continuous sliding times are not less than the preset standard sliding times and the continuous sliding gaps are all smaller than the preset standard sliding gaps, the method for calculating the comprehensive unlocking index and judging whether the unlocking standard is reached according to the comprehensive unlocking index comprises the following steps:
Acquiring continuous sliding times, a time point of each sliding, a speed of each sliding and a sliding mode corresponding to each sliding, and calculating to obtain a continuous sliding gap according to the time points of adjacent sliding;
calculating according to the continuous sliding times and the time point of each sliding to obtain a dynamic sliding rhythm index;
wherein the dynamic sliding rhythm index R is calculated from the time points of the slides; R denotes the dynamic sliding rhythm index, t_i denotes the time point of the i-th slide, and n denotes the number of consecutive slides;
calculating according to the speed of each sliding to obtain the sliding acceleration change rate;
wherein the sliding acceleration change rate A is calculated from the slide speeds; A denotes the sliding acceleration change rate, and v_i denotes the speed of the i-th slide;
calculating according to a sliding mode corresponding to each sliding to obtain a sliding entropy;
wherein the sliding entropy H is calculated from the occurrence probabilities of the sliding modes; H denotes the sliding entropy, p_j denotes the probability of occurrence of the j-th sliding mode, and m denotes the total number of different sliding modes;
calculating to obtain a comprehensive unlocking index according to the dynamic sliding rhythm index, the sliding acceleration change rate and the sliding entropy, and determining that the unlocking standard is reached when the comprehensive unlocking index is judged to be smaller than a preset unlocking index threshold value;
wherein the comprehensive unlocking index U is calculated as a weighted combination of the dynamic sliding rhythm index, the sliding acceleration change rate and the sliding entropy; U denotes the comprehensive unlocking index, and the weight coefficients determine the contribution of each term;
wherein the performing left and right blink recognition analysis according to the eye monitoring data, generating corresponding selection signals, and performing the left and right sliding selection operation on the control homepage according to the selection signals specifically comprises the following steps:
According to the eye monitoring data, when the existence of blinks is determined, calculating the continuous blink times and continuous blink gaps;
Generating a corresponding selection signal when the continuous blink times are not less than the preset standard blink times and the continuous blink gaps are all less than the preset standard blink gaps;
According to the selection signal, performing left-right sliding selection operation on the control homepage;
the method for performing left-right sliding selection operation on the control homepage according to the selection signal comprises the following steps:
Acquiring the movement amount of the eyeballs in the horizontal direction at each blink, the movement speed of the eyeballs at each blink, the time point at each blink and the duration time at each blink, and calculating to obtain a continuous blink clearance according to the time points of adjacent blinks;
calculating to obtain a basic sliding direction according to the movement amount of the eyeballs in the horizontal direction during each blink;
wherein the basic sliding direction D is obtained from the sum of the horizontal movement amounts of the eyeball over the blinks; D denotes the basic sliding direction of the eyeball, x_i denotes the movement amount of the eyeball in the horizontal direction at the i-th blink, the summation is taken over these movement amounts, and k denotes the total number of blinks;
according to the moving speed of eyeballs during each blink and the duration time during each blink, calculating to obtain a sliding power index;
wherein the sliding power index P is calculated from the movement speed of the eyeball at each blink and the duration of each blink; P denotes the sliding power index, v_i denotes the movement speed of the eyeball at the i-th blink, and d_i denotes the duration of the i-th blink;
determining a final sliding direction and a final sliding distance based on the basic sliding direction and the sliding power index;
wherein the final sliding direction follows the basic sliding direction D, and the final sliding distance S is obtained by scaling the sliding power index P with an adjustable scale factor that converts the sliding power index into an actual screen sliding distance;
And executing corresponding sliding operation on the control homepage according to the final sliding direction and the final sliding distance.
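For illustration only, a Python sketch combining steps S1033a–S1033d under the reconstructions above: the basic sliding direction as the sign of the summed horizontal movements, the sliding power index as the sum of speed × duration per blink, and the final distance as the power index scaled by an adjustable factor. All names and numeric values are assumptions:

```python
from math import copysign

def final_slide(per_blink, k=40.0):
    """per_blink: list of (dx, speed, duration) tuples, one per blink.
    dx       -- signed horizontal eyeball movement at that blink
    speed    -- eyeball movement speed at that blink
    duration -- blink duration
    k        -- adjustable scale factor converting the sliding power index
                into an actual screen sliding distance (value is illustrative).
    Returns (direction, distance) for the slide on the control homepage."""
    total_dx = sum(dx for dx, _, _ in per_blink)
    direction = copysign(1.0, total_dx) if total_dx else 0.0           # basic sliding direction
    power = sum(speed * duration for _, speed, duration in per_blink)  # sliding power index
    return direction, k * power                                        # final direction and distance

direction, distance = final_slide([(+2.1, 5.0, 0.12), (+1.8, 4.0, 0.10)])
print(direction, distance)  # 1.0 (slide right, by convention) and 40.0
```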
2. The method for controlling intelligent glasses according to claim 1, wherein the performing of head activity recognition analysis according to the head sensing data to generate a corresponding click signal, and the performing of a confirmation or cancellation clicking operation on the control homepage according to the click signal, specifically comprise the following steps:
performing head activity recognition analysis according to the nodding induction data to generate a confirmation click signal;
Performing a confirmation clicking operation on the control homepage according to the confirmation clicking signal;
performing head activity recognition analysis according to the head swing sensing data to generate a cancel click signal;
And according to the cancel click signal, performing cancel click operation on the control homepage.
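For illustration only, a Python sketch of the claim-2 mapping from head-activity sensing data to click signals (nodding produces a confirmation click, head swing produces a cancel click); the sample format and thresholds are hypothetical:

```python
def head_click_signal(nod_samples, shake_samples,
                      nod_threshold=1.5, shake_threshold=1.5):
    """Map head-activity sensing data to a click signal.
    nod_samples / shake_samples: peak pitch / yaw magnitudes from the
    nodding and head-swing sensors; both thresholds are hypothetical."""
    if nod_samples and max(nod_samples) >= nod_threshold:
        return "CONFIRM_CLICK"   # nodding -> confirmation clicking operation
    if shake_samples and max(shake_samples) >= shake_threshold:
        return "CANCEL_CLICK"    # head swing -> cancel clicking operation
    return None

print(head_click_signal([0.2, 1.8, 0.3], []))  # CONFIRM_CLICK
print(head_click_signal([], [2.0]))            # CANCEL_CLICK
```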
3. An intelligent glasses control system, characterized in that it applies the intelligent glasses control method according to any one of claims 1 to 2, the system comprising an eyebrow sliding monitoring unit, a head activity sensing unit, a blink recognition analysis unit and a head activity analysis unit, wherein:
The eyebrow sliding monitoring unit is used for carrying out eyebrow sliding monitoring in real time, generating sliding monitoring data, judging whether the sliding monitoring data reach an unlocking standard, unlocking to enter a control homepage when the sliding monitoring data reach the unlocking standard, and generating an operation monitoring signal;
the head activity sensing unit is used for performing eye monitoring and head activity sensing according to the operation monitoring signal to generate eye monitoring data and head sensing data;
the blink recognition analysis unit is used for performing left and right blink recognition analysis according to the eye monitoring data, generating corresponding selection signals, and performing left and right sliding selection operation on a control homepage according to the selection signals;
And the head activity analysis unit is used for carrying out head activity identification analysis according to the head induction data, generating a corresponding click signal, and carrying out a clicking operation of confirming or cancelling in a control homepage according to the click signal.
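For illustration only, a Python sketch of how the four claimed units could be wired together; every interface shown here is a hypothetical placeholder, not the claimed implementation:

```python
class SmartGlassesController:
    """Illustrative wiring of the four claimed units; the interfaces
    (monitor, reaches_unlock_standard, sense, analyze, unlock, slide, click)
    are hypothetical placeholders."""

    def __init__(self, slide_monitor, activity_sensor, blink_analyzer, head_analyzer):
        self.slide_monitor = slide_monitor      # eyebrow sliding monitoring unit
        self.activity_sensor = activity_sensor  # head activity sensing unit
        self.blink_analyzer = blink_analyzer    # blink recognition analysis unit
        self.head_analyzer = head_analyzer      # head activity analysis unit

    def run_once(self, homepage):
        slide_data = self.slide_monitor.monitor()
        if not self.slide_monitor.reaches_unlock_standard(slide_data):
            return
        homepage.unlock()                                    # enter the control homepage
        eye_data, head_data = self.activity_sensor.sense()   # driven by the operation monitoring signal
        selection = self.blink_analyzer.analyze(eye_data)
        if selection is not None:
            homepage.slide(selection)                        # left/right sliding selection
        click = self.head_analyzer.analyze(head_data)
        if click is not None:
            homepage.click(click)                            # confirm or cancel click
```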
4. The intelligent glasses control system according to claim 3, wherein the eyebrow sliding monitoring unit specifically comprises:
the sliding monitoring module is used for carrying out eyebrow sliding monitoring in real time and generating sliding monitoring data;
the sliding calculation module is used for calculating continuous sliding times and continuous sliding gaps when the eyebrow slides according to the sliding monitoring data;
the sliding comparison module is used for calculating a comprehensive unlocking index when the continuous sliding times are not less than the preset standard sliding times and the continuous sliding gaps are all less than the preset standard sliding gaps, and judging whether the unlocking standard is reached according to the comprehensive unlocking index;
and the unlocking processing module is used for unlocking to enter the control homepage and generating an operation monitoring signal.
5. The intelligent glasses control system according to claim 4, wherein the head activity sensing unit comprises:
the left eye monitoring module is used for monitoring left eyes according to the operation monitoring signals and generating left eye monitoring data;
The right eye monitoring module is used for monitoring the right eye according to the operation monitoring signal and generating right eye monitoring data;
The nodding induction module is used for carrying out nodding activity induction according to the operation monitoring signal to generate nodding induction data;
and the head-swing sensing module is used for performing head-swing movement sensing according to the operation monitoring signal to generate head-swing sensing data.
6. The intelligent glasses control system according to claim 5, wherein the blink recognition analysis unit specifically comprises:
the blink calculation module is used for calculating the continuous blink times and continuous blink gaps when the blinks are determined to exist according to the eye monitoring data;
The blink comparison module is used for generating corresponding selection signals when the continuous blink times are not smaller than the preset standard blink times and the continuous blink gaps are smaller than the preset standard blink gaps;
the selection operation module is used for performing left-right sliding selection operation on the control homepage according to the selection signal;
The head activity analysis unit specifically includes:
The nodding analysis module is used for carrying out head activity recognition analysis according to the nodding induction data and generating a confirmation click signal;
The confirmation operation module is used for carrying out a confirmation clicking operation on the control homepage according to the confirmation clicking signal;
the head swing analysis module is used for performing head activity recognition analysis according to the head swing sensing data and generating a cancel click signal;
And the cancel operation module is used for carrying out cancel click operation on the control homepage according to the cancel click signal.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202410481171.4A CN118092669B (en) | 2024-04-22 | 2024-04-22 | Intelligent glasses control method and system |
Publications (2)
Publication Number | Publication Date |
---|---|
CN118092669A CN118092669A (en) | 2024-05-28 |
CN118092669B (en) | 2024-07-26
Family
ID=91163367
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202410481171.4A Active CN118092669B (en) | 2024-04-22 | 2024-04-22 | Intelligent glasses control method and system |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN118092669B (en) |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105260017A (en) * | 2015-09-28 | 2016-01-20 | 南京民办致远外国语小学 | Glasses mouse and working method therefor |
CN106774841A (en) * | 2016-11-23 | 2017-05-31 | 上海擎感智能科技有限公司 | Intelligent glasses and its awakening method, Rouser |
Family Cites Families (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103838378B (en) * | 2014-03-13 | 2017-05-31 | 广东石油化工学院 | A kind of wear-type eyes control system based on pupil identification positioning |
KR102326489B1 (en) * | 2014-11-24 | 2021-11-16 | 삼성전자주식회사 | Electronic device and method for controlling dispaying |
CN106708251A (en) * | 2015-08-12 | 2017-05-24 | 天津电眼科技有限公司 | Eyeball tracking technology-based intelligent glasses control method |
CN105676458A (en) * | 2016-04-12 | 2016-06-15 | 王鹏 | Wearable calculation device and control method thereof, and wearable equipment with wearable calculation device |
KR101865329B1 (en) * | 2016-06-23 | 2018-06-08 | 충북대학교 산학협력단 | HMD headset apparatus and input method thereof |
CN110554768A (en) * | 2018-05-31 | 2019-12-10 | 努比亚技术有限公司 | intelligent wearable device control method and device and computer readable storage medium |
KR20210078214A (en) * | 2019-12-18 | 2021-06-28 | 이형빈 | Eyeglasses Mouse Device |
CN111736691B (en) * | 2020-06-01 | 2024-07-05 | Oppo广东移动通信有限公司 | Interaction method and device of head-mounted display device, terminal device and storage medium |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN111857356B (en) | Method, device, equipment and storage medium for recognizing interaction gesture | |
KR20190101327A (en) | Method and apparatus for assessing price for subscription products | |
US11825278B2 (en) | Device and method for auto audio and video focusing | |
KR20190107621A (en) | Apparatus and control method for recommending applications based on context-awareness | |
CN112269650A (en) | Task scheduling method and device, electronic equipment and storage medium | |
CN111488855A (en) | Fatigue driving detection method, device, computer equipment and storage medium | |
CN111936990A (en) | Method and device for waking up screen | |
CN107463903B (en) | Face key point positioning method and device | |
KR20190094316A (en) | An artificial intelligence apparatus for recognizing speech of user and method for the same | |
KR20190096876A (en) | System nad method of unsupervised training with weight sharing for the improvement in speech recognition and recording medium for performing the method | |
KR101979375B1 (en) | Method of predicting object behavior of surveillance video | |
CN106203306A (en) | The Forecasting Methodology at age, device and terminal | |
KR20110074107A (en) | Method for detecting object using camera | |
KR20200080418A (en) | Terminla and operating method thereof | |
CN118092669B (en) | Intelligent glasses control method and system | |
CN112884040B (en) | Training sample data optimization method, system, storage medium and electronic equipment | |
JP6432513B2 (en) | Video processing apparatus, video processing method, and video processing program | |
WO2024152659A9 (en) | Image processing method and apparatus, device, medium, and program product | |
KR20190048630A (en) | Electric terminal and method for controlling the same | |
CN118194880A (en) | Method and device for recommending speaking operation based on large model, storage medium and electronic equipment | |
CN116893886A (en) | Decision confirmation method, decision confirmation device, electronic equipment and computer storage medium | |
US11218803B2 (en) | Device and method of performing automatic audio focusing on multiple objects | |
KR100815291B1 (en) | Method and system for optimizing parameters of face recognizer or face detector by using feedback of user | |
WO2023188264A1 (en) | Information processing system | |
CN118230426B (en) | Human body posture recognition method, apparatus and storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||