CN105739705A - Human-eye control method and apparatus for vehicle-mounted system - Google Patents
- Publication number
- CN105739705A CN105739705A CN201610079868.4A CN201610079868A CN105739705A CN 105739705 A CN105739705 A CN 105739705A CN 201610079868 A CN201610079868 A CN 201610079868A CN 105739705 A CN105739705 A CN 105739705A
- Authority
- CN
- China
- Prior art keywords
- image
- eye
- onboard system
- pupil
- transparent display
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/013—Eye tracking input arrangements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/18—Eye characteristics, e.g. of the iris
- G06V40/19—Sensors therefor
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/18—Eye characteristics, e.g. of the iris
- G06V40/193—Preprocessing; Feature extraction
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- Ophthalmology & Optometry (AREA)
- Health & Medical Sciences (AREA)
- Multimedia (AREA)
- General Health & Medical Sciences (AREA)
- General Engineering & Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- User Interface Of Digital Computer (AREA)
- Eye Examination Apparatus (AREA)
Abstract
The invention relates to a human-eye control apparatus for a vehicle-mounted system. The apparatus comprises a camera, the vehicle-mounted system, and a transparent display, wherein the vehicle-mounted system comprises a video acquisition module, an image processing module, and an eye-control system module. The camera scans the driver area and acquires images within a one-second window; the vehicle-mounted system processes the acquired images and derives vehicle-mounted system control instructions from eye-movement information; and the transparent display feeds vehicle-mounted system information back to the driver, making control convenient. The vehicle-mounted system is controlled entirely through eye movements, from turning the transparent display on to turning it off.
Description
Technical field
The present invention relates to the technical field of vehicle-mounted systems, and in particular to a human-computer interaction technique that controls a vehicle-mounted system by recognizing eye movements.
Background technology
With the wide adoption of automobiles, people place higher demands on the practical functions and ease of operation of vehicle-mounted systems. Everyday vehicle-mounted systems such as in-car multimedia, navigation, and GPS can monitor vehicle speed, provide real-time positioning, and supply guidance directions and shortest paths. During driving, however, a driver may change the destination or route, or may need to switch functions of the vehicle-mounted system, which requires manual operation and thus creates a hazard for safe driving.
CN102609305A discloses a novel human-computer interaction device based on eye-movement and head-movement detection; it achieves real-time control of a computer system by detecting eye-movement and head-movement signals, replacing external devices such as the traditional mouse and keyboard, using head movement for signal detection and simulating the left and right mouse buttons. CN201220002430 discloses a vehicle-mounted voice interaction system based on cloud computing, which mainly uses speech for human-computer interaction. CN103336581A discloses a human-computer interaction method and system based on human eye-movement characteristics, aimed mainly at computers and implemented chiefly as simulated mouse control. GB1418891A proposes applying a head-up display system to automobiles, with adjustments for parameters such as the windshield curvature and the driver's line-of-sight angle; it provides a concrete technical scheme for applying the head-up display to a car and realizes computer responses to human actions, but it does not address deliberate complex operations by the user, such as opening a multimedia or navigation system.
In addition, automobile manufacturers have studied implementing various functions on head-up displays (HUDs). For example, the JPH07257228A patent of Nissan Motor proposes route navigation on a HUD, judging the information the driver needs according to gaze tracking and displaying it, while the actual interaction is carried out by hand or voice. The JPH1096776A patent of Honda Motor displays the distance to the vehicle ahead with marks of different colours on the HUD to remind the driver to keep a safe distance; there, the interaction object is the environment outside the car, and the computer merely shows information to the driver.
Summary of the invention
It is an object of the present invention to provide a human-eye control method and apparatus for a vehicle-mounted system, which simulates control of a transparent display by recognizing eye movements, thereby enabling the human eye to control the vehicle-mounted system.
The technical scheme adopted by the present invention to solve the technical problem is:
A human-eye control apparatus for a vehicle-mounted system, the apparatus comprising a camera, the vehicle-mounted system, and a transparent display, wherein the vehicle-mounted system comprises a video acquisition module, an image processing module, and an eye-control system module.
The camera scans the driver area and acquires images within a one-second window. The vehicle-mounted system processes the acquired images and derives vehicle-mounted system control instructions from eye-movement information. The transparent display feeds vehicle-mounted system information back to the driver for convenient control, so that the whole process, from opening the transparent display to closing it, is controlled by eye movements.
Video acquisition module: acquires images and passes them to the image processing module in matrix or array form.
Image processing module: recognizes and stores the eye features in the image, mainly including the pupil position, the glint (corneal reflection spot), and the eyeball. It completes image information processing comprising head position recognition, eye position recognition, pupil position recognition, blink filtering, and gaze-point localization, and passes the processed information to the eye-control system module. First, the head position is identified by matching the head contour and the person's skin colour. Second, the local region containing the eyes is extracted by facial morphology, the eye position is recognized, and the eye position is calibrated while the equilibrium position is judged. Then, before locating the pupil, a two-dimensional Gabor filtering algorithm first identifies the iris, and the pupil is located within the recognized iris. The pupil localization steps are: first, the iris image is preprocessed with a Gaussian filter; an edge detection algorithm based on the Canny operator then processes the image and determines the gradient direction of each edge point; finally, the position of the pupil centre point is determined according to the gradient direction information. For the pupil centre, a series of discrete pupil radii is first obtained by the Hough transform, and the exact coordinates of the pupil centre point are then obtained using the grey-level characteristic that the pupil is darker than the surrounding sclera. Next, since a person's natural blink lasts between 0.2 and 0.6 s, while a deliberate blink, or the reflex blink caused by strong light, typically lasts about 1 s, denoising and filtering are applied to remove the influence of natural blinks on the images, so that deliberate blinks and stares can be distinguished and the concrete operating instructions for eye control of the vehicle-mounted system can be recognized. Finally, according to the pupil-centre corneal-reflection method, the vector from the pupil centre to one or two corneal reflection points is used through a mapping function to estimate the gaze, thereby determining the gaze-point information.
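The blink-duration filtering described above can be sketched as a small helper. The 0.2-0.6 s natural-blink range is taken directly from the text; treating every longer blink as intentional is a simplifying assumption.

```python
def is_natural_blink(duration_s):
    """True if a blink duration falls in the 0.2-0.6 s range the
    patent gives for natural blinks (deliberate blinks and the
    strong-light reflex blink last about 1 s)."""
    return 0.2 <= duration_s <= 0.6

def remove_natural_blinks(blink_durations):
    """Filter out natural blinks so that only intentional eye
    actions remain for instruction recognition."""
    return [d for d in blink_durations if not is_natural_blink(d)]
```

For example, of the blinks lasting 0.3 s, 1.0 s, 0.5 s, and 0.9 s, only the two roughly one-second blinks would survive the filter as candidate commands.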
Eye-control system module: controls the on-board transparent display by human eye. From the images scanned by the camera, an upward gaze movement is recognized and the transparent display is opened. The display cursor is moved by tracking the gaze through pupil movement. Based on the blink-filtered images, deliberate blinks and stares are distinguished to recognize the concrete operating instructions of eye control of the vehicle-mounted system. The coordinates of the gaze point are translated into a mouse position, eye movements are translated into the corresponding mouse operation instructions and shown on the transparent display, realizing eye-movement manipulation of the vehicle-mounted system. The display can be closed through the system software after the operation is finished, or the system enters sleep automatically when no gaze falls on the display for a long time.
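The translation from a gaze point to a cursor position can be sketched as a simple scaling. The coordinate conventions here (display origin, physical size, pixel resolution, clamping to the screen) are illustrative assumptions, since the patent does not specify them.

```python
def gaze_to_mouse(gaze_xy, display_origin, display_size, resolution):
    """Translate a gaze point (in the display's physical plane) to a
    mouse cursor position in pixels. Coordinates outside the display
    are clamped to its edge."""
    gx, gy = gaze_xy
    ox, oy = display_origin
    w, h = display_size
    rx, ry = resolution
    # Normalize to [0, 1] on the display, then scale to pixel indices.
    nx = min(max((gx - ox) / w, 0.0), 1.0)
    ny = min(max((gy - oy) / h, 0.0), 1.0)
    return round(nx * (rx - 1)), round(ny * (ry - 1))
```

A gaze point a quarter of the way across a 1280x720 display would thus land at pixel (320, 180).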
The method adopted by the present invention to solve the technical problem comprises the following steps:
(a) After the vehicle-mounted system is switched on, it acquires a human-eye calibration image: the driver gazes at a fixed point on the screen, and the image captured by the camera at that moment serves as the calibration image.
(b) The camera actively performs timed image acquisition of the driver area, capturing and storing 25-30 frames per second through a function callback. The stored images are then analysed for eye position recognition and gaze tilt angle, and the system judges in real time whether the transparent display has been opened. If the transparent display is open, the vehicle-mounted system performs step (d); if it is not yet open, it performs step (c).
(c) The camera continues to scan the driver area and judges whether the head-up condition is met, which controls the transparent display switch. The head-up condition is judged by whether, within one second, the eyeball position moves upward from the equilibrium position by more than a threshold. The threshold is determined from the distance between a person's upper and lower eyelids and the pupil position; it is set at 30% of the upper-to-lower eyelid distance, measured from the upper eyelid. If the movement is greater than or equal to this value and the eye position falls on the display screen, the threshold is considered exceeded and the head-up condition is met. If the transparent display is opened, the vehicle-mounted system performs step (d); if it is not opened, the system jumps back to step (b);
(d) Locating where the gaze projects onto the transparent display: by comparing the images stored in step (b) with the human-eye calibration image of step (a), gaze tracking and pupil position recognition are completed;
The specific operation process of pupil position recognition is as follows:
(1) determine the filter window function and perform linear filtering;
(2) perform region segmentation with the Otsu algorithm to achieve binarization;
(3) perform boundary tracing starting from the centre point of the image to obtain the pupil region;
(4) calculate the coordinates of the pupil centre point by the centroid method.
In the process of determining the pupil centre point position, the Otsu algorithm is as follows:
Suppose the image contains L grey levels (0, 1, ..., L-1), the number of pixels with grey value i is Ni, and the total number of pixels is N = N0 + N1 + ... + N(L-1). The probability of a point with grey value i is P(i) = Ni / N. A threshold t divides the whole image into two classes, the dark region c1 and the bright region c2, and the between-class variance σ is then a function of t: σ = a1·a2·(u1 - u2)², where aj is the ratio of the area of class cj to the total image area, a2 = 1 - a1, uj is the mean grey value of class cj, and the subscript j (taking 1 or 2) denotes the class. The method selects the optimal threshold t that maximizes the between-class variance, i.e., with Δu = u1 - u2, σB = max{ a1(t)·a2(t)·Δu² }.
The fixation region is then weight-processed: by the pupil-centre corneal-reflection method, the vector from the pupil centre to one or two corneal reflection points is used through a mapping function to estimate the gaze point, and the centroid position is taken as the gaze-point information. Finally, the person's gaze point is located on the transparent display.
(e) Eye-movement information simulates transparent display operations: control of the vehicle-mounted system's mouse instructions is realized by judging eye movements, achieving the goal of eye control of the vehicle-mounted system.
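The pupil segmentation and centroid calculation of step (d) — steps (2) and (4) of the pupil recognition procedure — can be sketched with NumPy. The 256 grey levels and the convention that the darker class is the pupil follow the text; the exact tie-breaking between equally good thresholds is an implementation choice.

```python
import numpy as np

def otsu_threshold(image, levels=256):
    """Return the threshold t that maximizes the between-class
    variance sigma = a1 * a2 * (u1 - u2)**2, as described above."""
    hist = np.bincount(image.ravel(), minlength=levels).astype(float)
    p = hist / hist.sum()                      # P(i) = Ni / N
    grey = np.arange(levels)
    best_t, best_sigma = 0, 0.0
    for t in range(levels - 1):
        a1 = p[: t + 1].sum()                  # area ratio of dark class c1
        a2 = 1.0 - a1                          # a2 = 1 - a1
        if a1 == 0.0 or a2 == 0.0:
            continue
        u1 = (grey[: t + 1] * p[: t + 1]).sum() / a1   # mean of c1
        u2 = (grey[t + 1:] * p[t + 1:]).sum() / a2     # mean of c2
        sigma = a1 * a2 * (u1 - u2) ** 2       # between-class variance
        if sigma > best_sigma:
            best_sigma, best_t = sigma, t
    return best_t

def pupil_centre(image):
    """Binarize with Otsu, take the dark class as the pupil, and
    return the centroid of its pixel coordinates (the 'centroid
    method' of step (4))."""
    t = otsu_threshold(image)
    ys, xs = np.nonzero(image <= t)            # dark pixels = pupil
    return xs.mean(), ys.mean()
```

On a bright 9x9 test image with a dark 3x3 square in the middle, the threshold separates the two grey levels and the centroid lands at the centre of the square.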
The beneficial effects of the present invention are:
1. Human-eye control of the vehicle-mounted system: by scanning and recognizing eye movements, control of the vehicle-mounted system is completed, replacing manual operation; together with the transparent display, this ensures that the driver can control the system without taking the eyes off the road ahead, reducing safety hazards while driving.
2. Adaptive environment denoising: because a moving car is an open environment, varying conditions of power and of bright and dim light affect the system's pupil recognition and feature extraction; adaptive environment denoising therefore strengthens human-computer interaction and makes the system more intelligent.
3. Accurate gaze localization: accurate localization is the basis of controlling the vehicle-mounted system and prevents erroneous operation of it by the driver. It is also very helpful for future software on the vehicle-mounted system, such as safety-reminder software that warns the driver when the gaze leaves the road ahead while danger lies ahead.
Description of the drawings
Fig. 1 is a schematic diagram of the main flow of the method of the present invention;
Fig. 2 is a schematic diagram of the specific flow in which the in-vehicle camera scans a specific region, starts the display, and simulates control of the transparent display;
Fig. 3 is a schematic diagram of the image processing architecture of the present invention;
Fig. 4 is a schematic structural diagram of the apparatus of the present invention;
Fig. 5 is a schematic structural diagram of the system of the present invention.
Detailed description of the invention
The present invention provides a human-eye control method and apparatus for a vehicle-mounted system. As shown in Fig. 5, its system structure consists of the driver, a camera, the vehicle-mounted system, and a transparent display. As shown in Fig. 4, the vehicle-mounted system module comprises a video acquisition module, an image processing module, and an eye-control system module.
The main flow of the method of the present invention is shown in Fig. 1; its steps mainly include:
Step 11: open the system;
The power supply of the vehicle-mounted system is switched on manually, after which the system starts automatically.
Step 12: perform video acquisition;
Video acquisition mainly includes system parameter initialization and timed image acquisition. System parameter initialization: after the vehicle-mounted system is opened, it acquires a calibration image, i.e., the driver gazes at a fixed point on the screen and the image captured by the camera at that moment serves as the calibration image. Timed image acquisition: the driver area is captured at 25-30 frames per second through a function callback, and the image information is stored after acquisition.
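The "store only the images within one second" behaviour of the timed acquisition can be sketched with a bounded deque. The choice of 30 fps (within the stated 25-30 fps range) and the callback name are assumptions.

```python
from collections import deque

FPS = 30  # assumed acquisition rate within the patent's 25-30 fps range

class FrameBuffer:
    """Keeps only the most recent second of frames; older frames are
    dropped automatically, matching the 'images within one second'
    storage described in the text (frames are arbitrary objects)."""
    def __init__(self, fps=FPS):
        self.frames = deque(maxlen=fps)

    def on_frame(self, frame):
        """The per-frame acquisition callback."""
        self.frames.append(frame)
```

After 60 frames arrive at 30 fps, only the last 30 (i.e., the last second) remain buffered.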
Step 13: perform eye position recognition and gaze tilt angle judgement on the transmitted images;
Eye position recognition and gaze tilt angle judgement mainly perform feature extraction on the transmitted image information. First, the head and face are identified through the head contour and the skin colour. Then, facial morphology is used to extract the local region containing the eyes, the eye position is recognized and calibrated, and the equilibrium position is judged. Pupil position recognition is then performed, and the recognized pupil position information is stored. According to the physiological characteristics that a natural blink lasts between 0.2 and 0.6 s while a deliberate blink, or the reflex blink caused by strong light, typically lasts about 1 s, blinks are processed by means such as denoising and filtering, removing the natural-blink images. Gaze tilt angle analysis finds the pupil equilibrium position from the human-eye calibration image taken at start-up; the pupil equilibrium position is defined as the relative position of the pupil and the eyeball when the gaze point is at the centre of the screen. By comparing the calibration image with the real-time image, the two pupil positions and the two screen gaze points yield two gaze tilt angle vectors, from which the gaze tilt angle is determined.
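The two gaze tilt angle vectors and the angle between them can be sketched as follows; the coordinate conventions (each vector running from a pupil position to its screen gaze point) are illustrative assumptions.

```python
import numpy as np

def gaze_tilt_vectors(pupil_cal, drop_cal, pupil_now, drop_now):
    """Form the two gaze tilt angle vectors described in the text:
    one from the calibration pupil position to its screen gaze point,
    one from the current pupil position to the current gaze point."""
    v_cal = np.asarray(drop_cal, float) - np.asarray(pupil_cal, float)
    v_now = np.asarray(drop_now, float) - np.asarray(pupil_now, float)
    return v_cal, v_now

def tilt_angle(v_cal, v_now):
    """Angle between the two tilt vectors, in degrees."""
    cosang = v_cal @ v_now / (np.linalg.norm(v_cal) * np.linalg.norm(v_now))
    return np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0)))
```

Two perpendicular tilt vectors, for instance, give a gaze tilt angle of 90 degrees.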
Step 14: judge whether the transparent display is open
This step mainly saves computation at run time and speeds up the process: if the display is open, jump to step 16; if it is not open, proceed to step 15.
Step 15: judge the head-up state by scanning the specific region, and control the transparent display switch
The current images are compared with the image information stored in step 12 to judge whether a head-up state exists, i.e., whether the eyeball position moves upward from the equilibrium position by more than a threshold within one second. The threshold is determined from the distance between the upper and lower eyelids and the pupil position, and is set at 30% of the upper-to-lower eyelid distance, measured from the upper eyelid. If the movement is greater than or equal to this value and the eye position falls on the display screen, the threshold is considered exceeded and the transparent display is opened; if the condition is not met, the flow returns to step 12 to continue acquiring video.
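The 30% head-up threshold and the accompanying condition can be sketched directly from the rules above; the parameter names are illustrative.

```python
def head_up_threshold(upper_lid_y, lower_lid_y):
    """Threshold for the head-up condition: 30% of the upper-to-lower
    eyelid distance, measured from the upper eyelid, per the text."""
    return 0.3 * abs(lower_lid_y - upper_lid_y)

def meets_head_up(pupil_rise, upper_lid_y, lower_lid_y, gaze_on_screen):
    """Head-up condition: within one second the eyeball moved upward
    from the equilibrium position by at least the threshold, and the
    eye position falls on the display screen."""
    return gaze_on_screen and pupil_rise >= head_up_threshold(
        upper_lid_y, lower_lid_y)
```

With eyelids 10 units apart, the threshold is 3 units: a rise of 3.0 on-screen opens the display, while a rise of 2.9, or any rise with the gaze off-screen, does not.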
Step 16: locate where the gaze projects onto the transparent display;
By comparing the images stored in step 13 with the human-eye calibration image of step 12, gaze tracking and pupil coordinate localization are completed; the fixation region is then weight-processed and the centroid position is taken as the gaze-point information. Finally, the person's gaze point is located on the transparent display.
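One common concrete form of the mapping function in pupil-centre corneal-reflection gaze estimation is an affine map from pupil-glint vectors to screen coordinates, fitted by least squares from calibration samples. This is a standard simplification used here as a sketch, not necessarily the exact mapping the patent intends.

```python
import numpy as np

def fit_gaze_mapping(pg_vectors, screen_points):
    """Least-squares fit of an affine mapping screen = [vx, vy, 1] @ A
    from pupil-glint vectors to screen coordinates, using at least
    three non-collinear calibration samples."""
    V = np.hstack([np.asarray(pg_vectors, float),
                   np.ones((len(pg_vectors), 1))])
    A, *_ = np.linalg.lstsq(V, np.asarray(screen_points, float), rcond=None)
    return A  # shape (3, 2)

def estimate_gaze(A, pg_vector):
    """Map one pupil-glint vector to a screen coordinate."""
    return np.array([*pg_vector, 1.0]) @ A
```

Fitting on a few calibration pairs recovers the underlying affine relation, after which any new pupil-glint vector maps straight to a screen gaze point.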
Step 17: eye-movement information simulates control of the transparent display;
According to the information processed in step 16, movement of the vehicle-mounted system's mouse and other operation instructions are realized through eye movements, achieving the goal of eye control of the vehicle-mounted system. The movement of the gaze determines the movement of the mouse; operations on the transparent display are simulated by blinking: a double non-natural blink acts as a click, a stare acts as a selection, and gaze movement controls the mouse trajectory, which is shown on the transparent display.
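The eye-action-to-mouse-instruction rules above (double non-natural blink = click, stare = select, otherwise the gaze just moves the cursor) can be sketched as a small classifier. The 0.8 s deliberate-blink cut-off and the event record are assumptions; the text only says deliberate blinks last about 1 s.

```python
from collections import namedtuple

# Hypothetical event record: blink durations (s) observed in the
# decision window, plus whether the gaze stayed fixed ("stare").
EyeEvent = namedtuple("EyeEvent", "blink_durations stare")

DELIBERATE_MIN = 0.8  # assumed cut-off for a ~1 s deliberate blink

def to_mouse_instruction(event):
    """Translate an eye action into a mouse instruction following the
    rules in the text: double deliberate blink -> click,
    stare -> select, otherwise the cursor simply follows the gaze."""
    deliberate = [d for d in event.blink_durations if d >= DELIBERATE_MIN]
    if len(deliberate) >= 2:
        return "click"
    if event.stare:
        return "select"
    return "move"
```

Two roughly one-second blinks produce a click; a stare with only natural blinks produces a selection; anything else just moves the cursor.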
Embodiment 2 of the present invention is described in detail below with reference to Fig. 2:
Step 21: after the driver enters the car, the power supply of the vehicle-mounted system is switched on manually.
Step 22: once everything is ready after the system opens, it sends a video acquisition command to the camera, and the camera scans the driver's seat; in particular, the lower bound of the scan differs with the seat setting for people of different heights. The scanned images are then processed: the colour image is first binarized, and the person's head position is identified by head contour matching and skin-colour matching; the eye location is extracted after recognition, and contour cues such as the inner eye corner and facial symmetry can also be used to confirm that the region is a human eye; the pupil position is then identified, the influence of natural blinks is removed, and the processed image information is stored, keeping only the images within one second.
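The skin-colour matching step is not specified in detail; as a stand-in sketch, one well-known explicit RGB skin rule can be used (the particular thresholds come from that common heuristic, not from the patent).

```python
import numpy as np

def skin_mask(rgb):
    """Per-pixel skin-colour test on an HxWx3 uint8 RGB image using a
    widely used explicit RGB rule (R > 95, G > 40, B > 20, enough
    spread between channels, and R dominant). Returns a boolean mask
    suitable for head/face region matching."""
    r = rgb[..., 0].astype(int)
    g = rgb[..., 1].astype(int)
    b = rgb[..., 2].astype(int)
    spread = rgb.max(axis=-1).astype(int) - rgb.min(axis=-1)
    return ((r > 95) & (g > 40) & (b > 20) & (spread > 15)
            & (abs(r - g) > 15) & (r > g) & (r > b))
```

A warm skin-like pixel such as (200, 120, 90) passes the rule, while a blue pixel such as (50, 50, 200) does not.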
Step 23: open the transparent display. Before a command is sent to the transparent display, the stored images are compared to find the pupil movement direction within the last second. If the pupil has moved, the movement is upward, and it exceeds the threshold, the head-up action is considered met; the system then sends an open command to the transparent display and opens it. Whether or not the display has been opened, the camera keeps acquiring images continuously. If the transparent display is already open, the real-time images need not be judged for the head-up action; if it is not open, return to step 22.
Step 24: the coordinates of the gaze point on the screen are translated into a mouse position, and eye movements are translated into the corresponding mouse operation instructions: by judging eye movements, a double non-natural blink acts as a click, a stare acts as a selection, and gaze movement moves the mouse, which is shown on the transparent display, realizing eye-movement manipulation of the vehicle-mounted system.
Fig. 3 is a schematic diagram of the image processing architecture of the present invention, comprising step 31, head position recognition; step 32, locating the eye rectangle; step 33, pupil position recognition; and step 34, gaze screen target-point localization.
The image processing architecture of the present invention is described below with reference to Fig. 3:
Step 31: the head and face are first identified through the head contour and the skin colour.
Step 32: facial morphology is used to extract the local region containing the eyes, the eye position is recognized and calibrated, and the equilibrium position is judged.
Step 33: the recognized pupil position information is stored. According to the physiological characteristics that a natural blink lasts between 0.2 and 0.6 s while a deliberate blink, or the reflex blink caused by strong light, typically lasts about 1 s, blinks are processed by means such as denoising and filtering, removing the natural-blink images. Gaze tilt angle analysis finds the pupil equilibrium position from the human-eye calibration image taken at start-up; the pupil equilibrium position is defined as the relative position of the pupil and the eyeball when the gaze point is at the centre of the screen.
Step 34: by comparing the calibration image with the real-time image, the two pupil positions and the two screen gaze points yield two gaze tilt angle vectors, from which the gaze tilt angle is determined.
The human-eye control method and apparatus for a vehicle-mounted system provided by the present invention have been described in detail above, and the principles and specific embodiments of the invention have been set forth with examples; the description of the above embodiments is only intended to help understand the method of the present invention and its core idea. Meanwhile, for those of ordinary skill in the art, changes may be made in the specific embodiments and the scope of application according to the idea of the present invention. In summary, the contents of this specification should not be construed as limiting the present invention.
Those of ordinary skill in the art will appreciate that all or part of the steps of the above method embodiments can be completed by hardware instructed by a program; the program can be stored in a computer-readable storage medium and, when executed, performs the steps of the above method embodiments. The human-eye control method can serve as the method of using the vehicle-mounted system, controlling it with the eyes.
Finally, it should be noted that the above embodiments are only intended to illustrate the technical scheme of the present invention, not to limit it. Although the present invention has been described in detail with reference to the foregoing embodiments, those skilled in the art will understand that the technical schemes described in the embodiments can still be modified, or some of their technical features can be replaced by equivalents; such modifications or replacements do not make the essence of the corresponding technical schemes depart from the spirit and scope of the technical schemes of the embodiments of the present invention.
Claims (10)
1. A human-eye control apparatus for a vehicle-mounted system, the apparatus comprising a camera, the vehicle-mounted system, and a transparent display, said vehicle-mounted system comprising a video acquisition module, an image processing module, and an eye-control system module; characterized in that:
said camera scans the driver area and acquires images within a one-second window; the vehicle-mounted system processes the acquired images and derives vehicle-mounted system control instructions from eye-movement information; the transparent display feeds vehicle-mounted system information back to the driver for convenient control, so that the whole process, from opening the transparent display to closing it, is controlled by eye movements;
said video acquisition module acquires images and passes them to the image processing module in matrix or array form;
said image processing module completes image information processing comprising head position recognition, eye position recognition, pupil position recognition, blink filtering, and gaze screen target-point localization, and passes the processed information to the eye-control system module;
said eye-control system module translates the coordinates of the gaze point on the screen into a mouse position, translates eye movements into corresponding mouse operation instructions, controls the mouse movement, and shows it on the transparent display, realizing eye-movement manipulation of the vehicle-mounted system.
2. The human-eye control apparatus for a vehicle-mounted system according to claim 1, characterized in that: said head position recognition identifies a person's head position by matching the head contour and the person's skin colour;
said eye position recognition extracts, by facial morphology, the local region containing the eyes, recognizes the eye position, calibrates the eye position, and judges the equilibrium position.
3. The human-eye control apparatus for a vehicle-mounted system according to claim 1, characterized in that: before locating the pupil, said pupil position recognition first identifies the iris using a two-dimensional Gabor filtering algorithm and locates the pupil within the recognized iris: first, the iris image is preprocessed with a Gaussian filter; then, an edge detection algorithm based on the Canny operator processes the image and determines the gradient direction of each edge point; finally, the position of the pupil centre point is determined according to the gradient direction information; for the pupil centre, a series of discrete pupil radii is first obtained by the Hough transform, and the exact coordinates of the pupil centre point are then obtained using the grey-level characteristic that the pupil is darker than the surrounding sclera.
4. The human-eye control apparatus for a vehicle-mounted system according to claim 1, characterized in that: said gaze screen target-point localization estimates the gaze point on the screen by the pupil-centre corneal-reflection method, based on a mapping function of the vector from the pupil centre to one or two corneal reflection points.
5. A human-eye control method for a vehicle-mounted system using the apparatus according to any one of claims 1 to 4, characterized in that it comprises the following steps:
(a) after the vehicle-mounted system is switched on, it acquires a human-eye calibration image and initializes the parameters;
(b) the camera actively performs timed image acquisition of the driver area and stores the images; the stored images are analysed for eye position recognition and gaze tilt angle, and whether the transparent display has been opened is judged in real time; if the transparent display is open, the vehicle-mounted system performs step (d); if it is not yet open, it performs step (c);
(c) the camera continues to scan the driver area and judges whether the head-up condition is met, controlling the transparent display switch; if the transparent display is opened, the vehicle-mounted system performs step (d); if it is not opened, the system jumps back to step (b);
(d) locating where the gaze projects onto the transparent display: by comparing the images stored in step (b) with the human-eye calibration image of step (a), gaze tracking and pupil position recognition are completed; the fixation region is then weight-processed and the centroid position is taken as the gaze point on the screen; finally, the person's gaze point on the screen is located on the transparent display;
(e) eye-movement information simulates transparent display operations: the coordinates of the gaze point on the screen are translated into a mouse position and eye movements are translated into corresponding mouse operation instructions; by judging eye movements, a double non-natural blink acts as a click, a stare acts as a selection, and gaze movement moves the mouse, which is shown on the transparent display, realizing eye-movement manipulation of the vehicle-mounted system.
6. according to claim 5 onboard system human eye control method, it is characterized in that: it is gather human eye uncalibrated image after onboard system is opened that described systematic parameter initializes, namely driver watches the fixing point on screen attentively, and camera collection image now is as uncalibrated image.
7. according to claim 5 onboard system human eye control method, it is characterised in that: described timing image acquisition is to adjust back with function driver area carries out the image acquisition of 25-30 frame per second and preserves.
8. according to claim 5 onboard system human eye control method, it is characterized in that: described inclination of sighting line angle analysis is according to human eye uncalibrated image when starting, finding out pupil equilbrium position, pupil equilbrium position is defined as sight line drop point when being in screen centre position, the relative position of pupil and eyeball;By contrasting in human eye uncalibrated image and real time imaging, two pupil positions and two screen drop point site obtain 2 inclination of sighting line angle vectors, determine inclination of sighting line angle by inclination of sighting line angle vector.
9. The human-eye control method for a vehicle-mounted system according to claim 5, characterized in that the head-up condition is judged by whether the eyeball moves upward from its equilibrium position by more than a threshold within 1 second; the threshold is determined from the pupil position and the distance between the upper and lower eyelids, and is set at 30% of the upper-to-lower eyelid distance, measured from the upper eyelid; when the upward displacement is greater than or equal to this value, the threshold is considered exceeded, the gaze falls on the display screen, and the head-up condition is met.
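Claim 9's test reduces to a displacement threshold. A sketch assuming image coordinates where y increases downward (so upward movement decreases y); the claim's 1-second window is left to the caller here.

```python
def head_up(equilibrium_y, current_y, eyelid_gap):
    """Head-up condition from claim 9: the pupil must move upward
    from its equilibrium position by at least 30% of the distance
    between the upper and lower eyelids.  Image y grows downward,
    so upward movement means current_y < equilibrium_y."""
    threshold = 0.30 * eyelid_gap
    return (equilibrium_y - current_y) >= threshold

# Pupil 4 px above equilibrium with a 10 px eyelid gap: 4 >= 3, so True.
print(head_up(equilibrium_y=100, current_y=96, eyelid_gap=10))  # → True
print(head_up(equilibrium_y=100, current_y=98, eyelid_gap=10))  # → False
```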
10. The human-eye control method for a vehicle-mounted system according to claim 5, characterized in that the specific procedure of pupil-position identification is as follows:
(1) determine the filter window function and apply linear filtering;
(2) use the Otsu algorithm to perform region segmentation and obtain a binarized image;
(3) perform boundary tracing starting from the center point of the image to obtain the pupil extent;
(4) compute the coordinates of the pupil center point by the centroid method.
In determining the pupil center point, the Otsu algorithm proceeds as follows:
Suppose the image contains L gray levels (0, 1, ..., L-1), the number of pixels with gray value i is N(i), and the total pixel count is N = N(0) + N(1) + ... + N(L-1). The probability of a point having gray value i is P(i) = N(i)/N. A threshold t divides the whole image into two classes, a dark region c1 and a bright region c2; the between-class variance σ is then a function of t:

σ = a1·a2·(u1 − u2)²

where aj is the ratio of the area of class cj to the total image area, with a1 = Σ(i=0..t) P(i) and a2 = 1 − a1; uj is the mean gray value of class cj, with u1 = Σ(i=0..t) i·P(i)/a1 and u2 = Σ(i=t+1..L−1) i·P(i)/a2; here j is the region label and takes the value 1 or 2. The method selects the optimal threshold t that maximizes the between-class variance, i.e., letting Δu = u1 − u2, σb = max over t of { a1(t)·a2(t)·Δu² }.
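The Otsu selection in claim 10 can be transcribed directly from the formulas above: compute P(i) from the histogram, then sweep t and keep the value maximizing a1(t)·a2(t)·(u1 − u2)². A compact sketch; the bimodal example histogram is invented for illustration.

```python
def otsu_threshold(hist):
    """Otsu's method as stated in claim 10: hist[i] is the pixel
    count N(i) for gray level i.  Returns the threshold t that
    maximizes the between-class variance
    sigma = a1 * a2 * (u1 - u2)**2."""
    n = sum(hist)
    p = [ni / n for ni in hist]            # P(i) = N(i)/N
    best_t, best_sigma = 0, 0.0
    for t in range(len(hist) - 1):
        a1 = sum(p[:t + 1])                # dark-region area ratio
        a2 = 1.0 - a1                      # bright-region area ratio
        if a1 == 0.0 or a2 == 0.0:
            continue                       # one class empty: variance undefined
        u1 = sum(i * p[i] for i in range(t + 1)) / a1
        u2 = sum(i * p[i] for i in range(t + 1, len(hist))) / a2
        sigma = a1 * a2 * (u1 - u2) ** 2   # between-class variance
        if sigma > best_sigma:
            best_t, best_sigma = t, sigma
    return best_t

# Hypothetical bimodal 8-level histogram: dark pupil vs bright background.
print(otsu_threshold([40, 30, 5, 0, 0, 5, 30, 40]))  # threshold lands between the two modes
```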
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201610079868.4A CN105739705A (en) | 2016-02-04 | 2016-02-04 | Human-eye control method and apparatus for vehicle-mounted system |
Publications (1)
Publication Number | Publication Date |
---|---|
CN105739705A true CN105739705A (en) | 2016-07-06 |
Family
ID=56245006
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201610079868.4A Pending CN105739705A (en) | 2016-02-04 | 2016-02-04 | Human-eye control method and apparatus for vehicle-mounted system |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN105739705A (en) |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102830797A (en) * | 2012-07-26 | 2012-12-19 | 深圳先进技术研究院 | Man-machine interaction method and system based on sight judgment |
CN103310202A (en) * | 2013-06-27 | 2013-09-18 | 西安电子科技大学 | System and method for guaranteeing driving safety |
CN103440481A (en) * | 2013-08-30 | 2013-12-11 | 山东师范大学 | Method for locating boundaries of non-ideal iris image |
CN103985334A (en) * | 2014-05-04 | 2014-08-13 | 京东方科技集团股份有限公司 | Transparent display system and device with transparent display system |
CN104461005A (en) * | 2014-12-15 | 2015-03-25 | 东风汽车公司 | Vehicle-mounted screen switch control method |
CN105128862A (en) * | 2015-08-18 | 2015-12-09 | 上海擎感智能科技有限公司 | Vehicle terminal eyeball identification control method and vehicle terminal eyeball identification control system |
Non-Patent Citations (1)
Title |
---|
Liu Xiaomin: "Implementation of a Business Club Management System Based on Iris Recognition", 31 August 2015, Hunan Normal University Press *
Cited By (42)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106226911A (en) * | 2016-09-09 | 2016-12-14 | 深圳市世尊科技有限公司 | A kind of intelligent glasses and exchange method being carried out man-machine interaction by sight line |
CN106502423A (en) * | 2016-11-21 | 2017-03-15 | 武汉理工大学 | Based on the automation microoperation method that human eye vision is positioned |
CN106502423B (en) * | 2016-11-21 | 2019-04-30 | 武汉理工大学 | Automation microoperation method based on human eye vision positioning |
WO2018137595A1 (en) * | 2017-01-25 | 2018-08-02 | 丁贤根 | Face recognition method |
CN107179610A (en) * | 2017-06-29 | 2017-09-19 | 深圳市虚拟现实科技有限公司 | Method for displaying image and its device |
CN107380064A (en) * | 2017-07-04 | 2017-11-24 | 上海青研科技有限公司 | A kind of vehicle-mounted Eye-controlling focus device based on augmented reality |
CN107380064B (en) * | 2017-07-04 | 2022-08-19 | 上海青研科技有限公司 | Vehicle-mounted sight tracking device based on augmented reality technology |
CN107472140A (en) * | 2017-08-08 | 2017-12-15 | 上海青研科技有限公司 | A kind of eye control vehicle-mounted HUD systems |
CN107472140B (en) * | 2017-08-08 | 2023-04-07 | 上海青研科技有限公司 | Eye-control vehicle-mounted HUD system |
CN107506751A (en) * | 2017-09-13 | 2017-12-22 | 重庆爱威视科技有限公司 | Advertisement placement method based on the dynamic control of eye |
CN107506751B (en) * | 2017-09-13 | 2019-10-08 | 重庆爱威视科技有限公司 | Advertisement placement method based on eye movement control |
CN107608523A (en) * | 2017-09-30 | 2018-01-19 | 广东欧珀移动通信有限公司 | Control method, device and the storage medium and mobile terminal of mobile terminal |
CN108189787A (en) * | 2017-12-12 | 2018-06-22 | 北京汽车集团有限公司 | Control method and apparatus, storage medium and the vehicle of seat |
CN108334871A (en) * | 2018-03-26 | 2018-07-27 | 深圳市布谷鸟科技有限公司 | The exchange method and system of head-up display device based on intelligent cockpit platform |
WO2020042666A1 (en) * | 2018-08-31 | 2020-03-05 | 北京七鑫易维信息技术有限公司 | Vehicle starting method and apparatus, device, and storage medium |
CN109814448A (en) * | 2019-01-16 | 2019-05-28 | 北京七鑫易维信息技术有限公司 | A kind of vehicle multi-mode state control method and system |
CN109925678A (en) * | 2019-03-01 | 2019-06-25 | 北京七鑫易维信息技术有限公司 | A kind of training method based on eye movement tracer technique, training device and equipment |
CN109968979A (en) * | 2019-03-14 | 2019-07-05 | 百度在线网络技术(北京)有限公司 | Vehicle-mounted projection processing method, device, mobile unit and storage medium |
CN109965843A (en) * | 2019-03-14 | 2019-07-05 | 华南师范大学 | A kind of eye movements system passing picture based on filtering |
CN109968979B (en) * | 2019-03-14 | 2021-12-07 | 阿波罗智联(北京)科技有限公司 | Vehicle-mounted projection processing method and device, vehicle-mounted equipment and storage medium |
CN109965843B (en) * | 2019-03-14 | 2022-05-24 | 华南师范大学 | Eye movement system based on filtering image transmission |
CN110017520A (en) * | 2019-04-15 | 2019-07-16 | 宁波奥克斯厨电制造有限公司 | A kind of smoke exhaust ventilator eyeball tracking control method, device and smoke exhaust ventilator |
CN110017520B (en) * | 2019-04-15 | 2020-04-10 | 宁波奥克斯厨电制造有限公司 | Smoke exhaust ventilator eyeball tracking control method and device and smoke exhaust ventilator |
CN111824043A (en) * | 2019-04-16 | 2020-10-27 | 上汽通用汽车有限公司 | Automobile display screen control system and method and vehicle comprising same |
CN110126740A (en) * | 2019-05-29 | 2019-08-16 | 辽宁科大物联科技有限公司 | A kind of control method and device of automotive electronics rearview mirror |
CN112241663A (en) * | 2019-07-18 | 2021-01-19 | 上汽通用汽车有限公司 | Device and system for allocating a plurality of vehicle-mounted resources |
CN110728256A (en) * | 2019-10-22 | 2020-01-24 | 上海商汤智能科技有限公司 | Interaction method and device based on vehicle-mounted digital person and storage medium |
CN110825228A (en) * | 2019-11-01 | 2020-02-21 | 腾讯科技(深圳)有限公司 | Interaction control method and device, storage medium and electronic device |
CN110758249A (en) * | 2019-11-18 | 2020-02-07 | 王明轩 | Workbench display system |
CN112114671A (en) * | 2020-09-22 | 2020-12-22 | 上海汽车集团股份有限公司 | Human-vehicle interaction method and device based on human eye sight and storage medium |
CN112667084A (en) * | 2020-12-31 | 2021-04-16 | 上海商汤临港智能科技有限公司 | Control method and device for vehicle-mounted display screen, electronic equipment and storage medium |
CN114528039A (en) * | 2021-01-08 | 2022-05-24 | 蒋耕银 | Background application cleaning system based on big data |
CN113341692A (en) * | 2021-05-21 | 2021-09-03 | 浙江大学 | Eye control interaction system developed based on rocker metaphor |
CN114253439A (en) * | 2021-10-30 | 2022-03-29 | 惠州华阳通用智慧车载系统开发有限公司 | Multi-screen interaction method |
CN114253439B (en) * | 2021-10-30 | 2024-06-04 | 惠州华阳通用智慧车载系统开发有限公司 | Multi-screen interaction method |
CN114356482B (en) * | 2021-12-30 | 2023-12-12 | 业成科技(成都)有限公司 | Method for interaction with human-computer interface by using line-of-sight drop point |
CN114356482A (en) * | 2021-12-30 | 2022-04-15 | 业成科技(成都)有限公司 | Method for interacting with human-computer interface by using sight line drop point |
CN115120436A (en) * | 2022-06-27 | 2022-09-30 | 广东技术师范大学 | Eye-controlled intelligent medical bed and control method thereof |
CN115953813B (en) * | 2022-12-19 | 2024-01-30 | 北京字跳网络技术有限公司 | Expression driving method, device, equipment and storage medium |
CN115953813A (en) * | 2022-12-19 | 2023-04-11 | 北京字跳网络技术有限公司 | Expression driving method, device, equipment and storage medium |
CN116185190A (en) * | 2023-02-09 | 2023-05-30 | 江苏泽景汽车电子股份有限公司 | Information display control method and device and electronic equipment |
CN116185190B (en) * | 2023-02-09 | 2024-05-10 | 江苏泽景汽车电子股份有限公司 | Information display control method and device and electronic equipment |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN105739705A (en) | Human-eye control method and apparatus for vehicle-mounted system | |
EP3488382B1 (en) | Method and system for monitoring the status of the driver of a vehicle | |
US11182629B2 (en) | Machine learning based driver assistance | |
CN111931579B (en) | Automatic driving assistance system and method using eye tracking and gesture recognition techniques | |
KR20200131883A (en) | Driving condition analysis method and device, driver monitoring system and vehicle | |
EP2975997B1 (en) | System and method for on-axis eye gaze tracking | |
CN101466305B (en) | Method for determining and analyzing a location of visual interest | |
KR20200132945A (en) | Driving condition detection method and device, driver monitoring system and vehicle | |
TW202036465A (en) | Method, device and electronic equipment for monitoring driver's attention | |
EP3259734B1 (en) | Glare reduction | |
CN101950355B (en) | Method for detecting fatigue state of driver based on digital video | |
US10298911B2 (en) | Visualization of spatial and other relationships | |
US9773179B2 (en) | Vehicle operator monitoring system and method | |
CN110826369A (en) | Driver attention detection method and system during driving | |
CN106965675B (en) | A kind of lorry swarm intelligence safety work system | |
CN110703904A (en) | Augmented virtual reality projection method and system based on sight tracking | |
EP1868138A2 (en) | Method of tracking a human eye in a video image | |
CN111062292B (en) | Fatigue driving detection device and method | |
EP3457321A1 (en) | Control device for vehicle headlight | |
CN113963416B (en) | Eye movement interaction method and system based on laser visual feedback | |
CN111619324A (en) | Intelligent anti-dazzling method and system for sight tracking automobile | |
CN110825216A (en) | Method and system for man-machine interaction of driver during driving | |
CN116883978A (en) | Driver distraction fatigue detection method, system and storage medium based on infrared camera | |
CN109987036A (en) | A kind of control method improving interaction accuracy based on driver's body posture | |
Meng et al. | Application and development of AI technology in automobile intelligent cockpit |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
RJ01 | Rejection of invention patent application after publication | Application publication date: 20160706 |