CN114360043B - Model parameter calibration method, sight tracking method, device, medium and equipment - Google Patents
- Publication number: CN114360043B (application CN202210267130.6A)
- Authority
- CN
- China
- Prior art keywords
- pupil
- eyeball
- center
- point
- model
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Landscapes
- Eye Examination Apparatus (AREA)
Abstract
A model parameter calibration method, a sight tracking method, a device, a medium and equipment are provided. The model parameter calibration method comprises the following steps: respectively acquiring images shot while a user gazes at each preset fixation point on a screen, to obtain a human eye image corresponding to each preset fixation point; performing contour point detection on the pupil projection in the human eye image corresponding to the current preset fixation point to obtain a pupil contour point set; performing corneal refraction correction on the pupil contour point set, and determining the intersection point of the user's sight line and the screen according to the corrected pupil contour point set and an eyeball model; calculating the distance between the position of the intersection point and the real position of the current preset fixation point, and establishing a nonlinear equation corresponding to the current preset fixation point by taking each unknown parameter in the eyeball model as a variable and the distance as the objective function; and solving the equation system formed by the nonlinear equations corresponding to all the preset fixation points. The method facilitates on-site calibration of the model parameters and obtains a more accurate sight direction.
Description
Technical Field
The invention relates to the technical field of image processing, in particular to a model parameter calibration method, a sight tracking method, a device, a medium and equipment.
Background
An eye tracking system for a virtual reality headset is mainly composed of an eye, a camera and a screen. Near-eye display systems for virtual reality headsets typically have an eyepiece between the eye and the screen to magnify the screen. The eyeball tracking algorithm estimates the coordinates or the sight direction of the fixation point on the screen according to the eyeball image shot by the camera.
At present, eye tracking algorithms that estimate the sight direction usually require establishing a system model and measuring the model parameters, or using preset model parameters. Different users have different eye characteristics, so their sight lines differ when gazing at the same point; if uniform model parameters are adopted for every user, sight direction estimation errors result. Therefore, a convenient method for calibrating the model parameters on site for each user is needed, so as to improve the sight estimation accuracy.
Disclosure of Invention
In view of the above, it is necessary to provide a model parameter calibration method, a sight tracking method, a device, a medium and equipment that allow the model parameters to be calibrated on site, so as to improve the accuracy of sight estimation.
A method for calibrating model parameters comprises the following steps:
respectively acquiring images shot when a user watches each preset fixation point on a screen to obtain human eye images corresponding to each preset fixation point, wherein the number of the preset fixation points is set according to the number of unknown parameters in an eyeball model;
carrying out contour point detection on pupil projection in a human eye image corresponding to a current preset fixation point to obtain a pupil contour point set;
performing corneal refraction correction on the pupil contour point set, and determining an intersection point of the sight of the user and the screen according to the corrected pupil contour point set and the eyeball model;
calculating the distance between the position of the intersection point and the real position of the current preset fixation point, and establishing a nonlinear equation corresponding to the current preset fixation point by taking each unknown parameter in the eyeball model as a variable and taking the distance as a target function;
and acquiring a nonlinear equation corresponding to each preset gazing point, and solving an equation set formed by the nonlinear equations corresponding to the preset gazing points to obtain the value of each unknown parameter in the eyeball model.
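The final solving step can be illustrated with a short numerical sketch: the per-fixation-point distances form the residuals of a nonlinear least-squares problem. The forward model `predict_intersection` below is a toy linear stand-in (not the patent's actual refraction-and-intersection pipeline), and all names are illustrative assumptions; a Levenberg-Marquardt solver recovers the 8 unknown parameters when at least 8 fixation points constrain them:

```python
import numpy as np
from scipy.optimize import least_squares

def predict_intersection(theta, measurement):
    # Stand-in forward model: maps the 8 unknown eyeball-model parameters
    # to a predicted 2-D gaze/screen intersection. The real method runs
    # contour detection, corneal refraction correction and line/screen
    # intersection here; a linear map keeps the sketch self-contained.
    A, b = measurement
    return A @ theta + b

def residuals(theta, measurements, true_points):
    # One 2-D residual per preset fixation point: predicted intersection
    # minus the fixation point's real screen position.
    return np.concatenate([predict_intersection(theta, m) - p
                           for m, p in zip(measurements, true_points)])

rng = np.random.default_rng(0)
theta_true = rng.normal(size=8)                      # 8 unknown parameters
measurements = [(rng.normal(size=(2, 8)), rng.normal(size=2))
                for _ in range(8)]                   # 8 fixation points
true_points = [predict_intersection(theta_true, m) for m in measurements]

# Solve the nonlinear equation system as a least-squares problem
# (Levenberg-Marquardt, a common choice for such calibration problems).
sol = least_squares(residuals, x0=np.zeros(8),
                    args=(measurements, true_points), method='lm')
```

With 8 fixation points the 16 scalar residuals over-determine the 8 unknowns, so the solver recovers them up to numerical precision.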
Further, in the above method for calibrating model parameters, the step of performing corneal refraction correction on the set of pupil contour points includes:
establishing a first optimization function related to the pupil contour points according to the intersection conditions, with the eyeball model, of the rays passing from the center of the camera through each contour point in the pupil contour point set;
and establishing a second optimization function related to the pupil center according to the first optimization function and the ellipse center coordinate fitted by the pupil contour point.
Further, in the above method for calibrating model parameters, the first optimization function is:

F_i = Σ_j ρ_ij,

where ρ_ij is the residual of the j-th contour point, defined below according to the intersection case;

the second optimization function is:

G(Θ) = Σ_{i=1}^{N} ( F_i + λ·‖c_i − p̂_i‖² );

wherein case I is that the ray has an intersection point s_ij with the cornea; according to the law of refraction, at the corneal surface the normal vector n̂(s_ij), the ray direction d_ij and the refracted-ray vector u_ij are coplanar, i.e. ⟨u_ij, d_ij × n̂(s_ij)⟩ = 0.

The corresponding refracted-ray parametric equation is r(t2) = s_ij + t2·u_ij, where t2 is the equation parameter, t2 ∈ ℝ, and ℝ represents the set of real numbers; in case I, p_ij is defined as the intersection of the refracted ray and the pupil plane, with residual ρ_ij = | ‖p_ij − P_i‖ − r_i |;

case II is that the ray has an intersection point with the eyeball but no intersection point with the cornea; p_ij is then defined as the intersection of the ray and the eyeball, with residual ρ_ij = ‖p_ij − P_i‖;

case III is that the ray has no intersection point with either the eyeball or the cornea; the ray parameter t0 is then chosen to minimize the distance between the ray and the center of the eyeball, with residual ρ_ij = min_{t0} ‖t0·d_ij − E‖ + ε;

wherein i represents the frame number of the human eye image, j represents the j-th contour point in the pupil contour point set, E is the position coordinate of the eyeball center, r_i represents the pupil radius, P_i represents the coordinates of the pupil center, m_ij represents the j-th pupil image contour point of the i-th frame, p_ij represents the intersection point of the camera ray corresponding to that contour point and the eyeball model, ‖·‖ is the norm sign, ⟨·,·⟩ represents the inner product of vectors, s_ij represents the intersection of the ray with the corneal surface, c_i represents the center of the ellipse fitted to the pupil contour points, p̂_i represents the pupil center coordinate calculated from the model, n is the refractive index of the cornea, n_air is the refractive index of air, G represents the optimization function with respect to the model state, Θ represents the eyeball model states in the 1st to N-th frame images, λ is a Lagrange multiplier, and ε is a positive number such that the residuals satisfying case II are always smaller than the residuals satisfying case III.
Further, in the above method for calibrating model parameters, the step of determining the intersection point of the user's sight line and the screen according to the corrected pupil contour point set and the eyeball model includes:
carrying out ellipse fitting according to the corrected pupil contour point set, and calculating the normal direction of the pupil according to the fitted ellipse to obtain the optical axis direction under a camera coordinate system;
calculating a two-dimensional projection vector of the optical axis direction under a camera coordinate system on a corresponding human eye image;
determining the projection of the eyeball center according to the intersection point of the two-dimensional projection vectors of at least two frames of human eye images;
determining the coordinates of the pupil center under a camera coordinate system according to the projection of the eyeball center, the position coordinates of the eyeball center and the pupil-eyeball distance, wherein the pupil-eyeball distance is the distance between the pupil center and the eyeball center;
converting the coordinates of the pupil center under the camera coordinate system into a world coordinate system;
converting the optical axis direction under the camera coordinate system into a world coordinate system, and correcting the optical axis direction converted under the world coordinate system into the visual axis direction under the world coordinate system according to the visual axis optical axis included angle, wherein the visual axis optical axis included angle is the included angle between the visual axis of the eyes and the optical axis;
and determining the sight of the user according to the coordinates of the pupil center in the world coordinate system obtained through conversion and the visual axis direction in the world coordinate system obtained through conversion, and calculating the intersection point of the sight and the screen.
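The last step above, intersecting the sight line with the screen, reduces to a standard line-plane intersection. A minimal sketch, assuming the screen is given by a point and a normal in world coordinates (function and parameter names are illustrative, not from the patent):

```python
import numpy as np

def gaze_screen_intersection(pupil_center_w, visual_axis_w,
                             screen_point, screen_normal):
    """Intersect the line of sight (pupil center plus visual-axis
    direction, all in world coordinates) with the screen plane, given
    as a point on the plane and its normal."""
    p0 = np.asarray(pupil_center_w, dtype=float)
    d = np.asarray(visual_axis_w, dtype=float)
    n = np.asarray(screen_normal, dtype=float)
    denom = d @ n
    if abs(denom) < 1e-12:
        raise ValueError("line of sight is parallel to the screen")
    t = ((np.asarray(screen_point, dtype=float) - p0) @ n) / denom
    return p0 + t * d

# Screen in the plane z = 0, eye 60 mm in front of it on the z-axis,
# looking straight ahead: the gaze should hit the origin.
hit = gaze_screen_intersection([0.0, 0.0, 60.0], [0.0, 0.0, -1.0],
                               [0.0, 0.0, 0.0], [0.0, 0.0, 1.0])
```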
Further, in the above method for calibrating model parameters, the step of determining the coordinates of the pupil center in the camera coordinate system according to the projection of the eyeball center, the position coordinates of the eyeball center, and the pupil-eyeball distance includes:
calculating the direction of the eyeball center in a camera coordinate system according to the projection of the eyeball center;
determining the distance between the camera and the eyeball center under the world coordinate according to the position of the eyeball center and the translation vector of the camera;
calculating the coordinates of the eyeball center under a camera coordinate system according to the direction of the eyeball center in the camera coordinate system, the distance between the camera and the eyeball center and the pupil-eyeball distance;
and determining the coordinates of the pupil center in the camera coordinate system according to the coordinates of the eyeball center in the camera coordinate system and the optical axis direction in the camera coordinate system.
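The sub-steps above can be sketched as follows. This is a simplified stand-in: the helper name `pupil_center_camera` and its arguments are assumptions, and the camera-to-eye distance is taken as already recovered from the eyeball-center position and the camera translation vector:

```python
import numpy as np

def pupil_center_camera(eyeball_proj_dir, cam_to_eye_dist,
                        optical_axis_cam, pupil_eye_dist):
    """The eyeball center lies along the unit direction recovered from
    its image projection, at the camera-to-center distance; the pupil
    center then sits one pupil-eyeball distance away from it along the
    optical axis, all in the camera coordinate system."""
    e_dir = np.asarray(eyeball_proj_dir, dtype=float)
    e_dir = e_dir / np.linalg.norm(e_dir)
    eye_center = cam_to_eye_dist * e_dir
    axis = np.asarray(optical_axis_cam, dtype=float)
    axis = axis / np.linalg.norm(axis)
    return eye_center + pupil_eye_dist * axis

# Eyeball center straight ahead at 50 mm, optical axis pointing back
# toward the camera, pupil-eyeball distance 10 mm.
p = pupil_center_camera([0.0, 0.0, 1.0], 50.0, [0.0, 0.0, -1.0], 10.0)
```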
Further, in the above method for calibrating model parameters, the unknown parameters of the eyeball model include: the position coordinates of the eyeball sphere center, the distance between the pupil center and the eyeball sphere center, the distance between the cornea curvature center and the eyeball sphere center, the cornea curvature radius and the included angle between the visual axis and the optical axis.
The invention also discloses a sight tracking method, which comprises the following steps:
correcting parameters of the eyeball model according to values of unknown parameters of the eyeball model obtained by the model parameter calibration method;
acquiring a to-be-detected human eye image of the user, and carrying out contour detection on pupil projection in the to-be-detected human eye image to obtain a contour point set;
and performing corneal refraction correction on the contour point set, and determining an intersection point of the sight line of the user and the screen according to the corrected contour point set and the eyeball model with the modified parameters.
The invention also discloses a model parameter calibration device, which comprises:
the first acquisition module is used for respectively acquiring images shot when a user watches each preset fixation point on a screen so as to obtain human eye images corresponding to each preset fixation point, wherein the number of the preset fixation points is set according to the number of unknown parameters in an eyeball model;
the first pupil detection module is used for carrying out contour point detection on pupil projection in a human eye image corresponding to a current preset fixation point to obtain a pupil contour point set;
the first correction module is used for performing corneal refraction correction on the pupil contour point set and determining an intersection point of the sight of the user and the screen according to the corrected pupil contour point set and the eyeball model;
the equation establishing module is used for calculating the distance between the position of the intersection point and the real position of the current preset fixation point, and establishing a nonlinear equation corresponding to the current preset fixation point by taking each unknown parameter in the eyeball model as a variable and taking the distance as a target function;
and the equation solving module is used for acquiring the nonlinear equation corresponding to each preset gazing point and solving an equation set formed by the nonlinear equations corresponding to the preset gazing points so as to obtain the value of each unknown parameter in the eyeball model.
The invention also discloses a sight tracking device, comprising:
the correction module is used for correcting the parameters of the eyeball model according to the value of the unknown parameters of the eyeball model obtained by the model parameter calibration method;
the second acquisition module is used for acquiring the to-be-detected eye image of the user;
the second pupil detection module is used for carrying out contour detection on the pupil projection in the human eye image to be detected to obtain a contour point set;
a second correction module for performing corneal refractive correction on the set of contour points;
and the determining module is used for determining the intersection point of the sight of the user and the screen according to the corrected contour point set and the eyeball model after parameter correction.
The invention also discloses a readable storage medium on which a computer program is stored, which program, when executed by a processor, performs the method of any of the above.
The invention also discloses an electronic device, which comprises a memory, a processor and a computer program stored on the memory and capable of running on the processor, wherein the processor realizes the method of any one of the above items when executing the computer program.
The method guides the user to gaze by utilizing a plurality of preset fixation points on the screen, establishes a nonlinear equation system in the unknown parameters of the eyeball model according to the measurements from the eye images, and solves the nonlinear equation system to obtain the model parameters. The model parameters can thus be corrected for each user during the calibration process, so that a more accurate sight direction can be obtained.
Drawings
FIG. 1 is a flow chart of a method for calibrating model parameters according to a first embodiment of the present invention;
FIG. 2 is a flowchart illustrating the corneal refraction correction procedure for the set of pupil contour points according to an embodiment of the present invention;
FIG. 3 is a flowchart illustrating a gaze tracking method according to a third embodiment of the present invention;
FIG. 4 is a block diagram of a model parameter calibration apparatus according to a fourth embodiment of the present invention;
FIG. 5 is a block diagram of a gaze tracking device in a fifth embodiment of the present invention;
fig. 6 is a schematic structural diagram of an electronic device in an embodiment of the invention.
Detailed Description
Reference will now be made in detail to embodiments of the present invention, examples of which are illustrated in the accompanying drawings, wherein like reference numerals refer to the same or similar elements or elements having the same or similar functions throughout. The embodiments described below with reference to the accompanying drawings are illustrative only for the purpose of explaining the present invention, and are not to be construed as limiting the present invention.
These and other aspects of embodiments of the invention will be apparent with reference to the following description and attached drawings. In the description and drawings, particular embodiments of the invention have been disclosed in detail as being indicative of some of the ways in which the principles of the embodiments of the invention may be practiced, but it is understood that the scope of the embodiments of the invention is not limited correspondingly. On the contrary, the embodiments of the invention include all changes, modifications and equivalents coming within the spirit and terms of the claims appended hereto.
According to the model parameter calibration method, a plurality of preset fixation points on the screen are utilized to guide the user to gaze, a nonlinear equation system in the model parameters is established according to the measurements from the eye images, and the nonlinear equation system is solved to obtain the model parameters, thereby realizing on-site calibration of the model parameters and improving the accuracy of sight direction estimation.
Example 1
Referring to fig. 1, a method for calibrating model parameters according to a first embodiment of the present invention includes steps S11-S15.
Step S11, images shot when a user watches each preset fixation point on the screen are respectively obtained to obtain human eye images corresponding to each preset fixation point, and the number of the preset fixation points is set according to the number of unknown parameters in the eyeball model.
The present embodiment relates to a system formed by a screen, a camera and an eyeball, wherein the known parameters are as follows:
taking the screen (or the virtual image of the screen) as the reference coordinate system: the width w (pixels) and the height h (pixels) of the camera image;
Camera external parameters: a rotation angle vector (r_x, r_y, r_z), the rotation matrix R corresponding to the rotation angle vector, and a translation vector (t_x, t_y, t_z), where the subscripts x, y and z respectively represent the three axis directions of the three-dimensional coordinate system.
The camera internal parameters are calibrated before assembly, and the camera external parameters are calibrated after assembly. The parameters of the model corresponding to the eyeball in the system are unknown. The eyeball model is a physiological model of the eyeball, and the number of preset fixation points is set according to the number of unknown parameters in the eyeball model. For example, the number of unknown parameters of the eyeball model in this embodiment is 8: the position coordinates of the eyeball center E = (E_x, E_y, E_z), the distance d_PE between the pupil center and the eyeball center, the distance d_CE between the cornea curvature center and the eyeball center, the corneal curvature radius R_c, and the two angles (α, β) of the included angle between the visual axis and the optical axis.
By calibrating with at least 8 preset fixation points, the 8 model parameters to be estimated can be solved.
And in the model parameter calibration stage, guiding a user to sequentially watch the 8 preset watching points, and sequentially collecting human eye images of the user to respectively obtain human eye images corresponding to the preset watching points.
And step S12, performing contour point detection on the pupil projection in the human eye image corresponding to the current preset gazing point to obtain a pupil contour point set.
8 human eye images are respectively obtained for the 8 fixation points, and contour detection is performed on the pupil projection in each human eye image to obtain the corresponding pupil contour point set. The contour points of the pupil projection detected in the current human eye image are collected into a set, denoted {m_ij}.
Step S13, performing corneal refraction correction on the pupil contour point set, and determining an intersection point between the user' S gaze and the screen according to the corrected pupil contour point set and the eyeball model.
Analysis shows that in existing sight estimation algorithms, corneal refraction usually introduces a systematic deviation between the measured data and the real eyeball position and gaze direction, resulting in low sight estimation accuracy. Considering the influence of corneal refraction on estimation accuracy, the present embodiment corrects for corneal refraction on the contour points in the pupil contour point set.
Specifically, as shown in fig. 2, in one embodiment of the present invention, the step of performing corneal refraction correction on the set of pupil contour points includes steps S131 to S133.
Step S131, establishing a first optimization function related to the pupil contour points according to the intersection condition of the rays respectively emitted from each contour point in the pupil contour point set through the center of the camera and the eyeball model.
Step S132, establishing a second optimization function related to the pupil center according to the first optimization function and the ellipse center coordinate fitted by the pupil contour point.
Step S133, correcting the contour points in the pupil contour point set according to the second optimization function.
Before corneal refraction correction is carried out, mapping from camera pixel coordinates to space coordinates needs to be established, wherein the mapping comprises mapping of a world coordinate system, a camera coordinate system, an image coordinate system and a pixel coordinate system. Firstly, mapping a pixel coordinate system to an image coordinate system according to camera internal parameters, then mapping the image coordinate system to a camera coordinate system, and finally mapping the camera coordinate system to a world coordinate system through calibrated camera external parameters.
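The pixel-to-space mapping described above can be sketched with a pinhole model. The intrinsic matrix K and the extrinsic convention p_cam = R·p_world + t below are standard assumptions, not taken from the patent:

```python
import numpy as np

def pixel_to_camera_ray(u, v, K):
    """Back-project a pixel through the camera center into a unit-length
    ray direction in the camera frame, using the intrinsic matrix K
    (pinhole model, no distortion assumed)."""
    uv1 = np.array([u, v, 1.0])
    d = np.linalg.inv(K) @ uv1
    return d / np.linalg.norm(d)

def camera_to_world(p_cam, R, t):
    """Map a camera-frame point to world coordinates with the calibrated
    extrinsics, under the convention p_cam = R @ p_world + t."""
    return np.linalg.inv(R) @ (np.asarray(p_cam, dtype=float) - t)

# Example intrinsics: 800 px focal length, principal point (320, 240).
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])
# The principal point back-projects onto the optical axis.
ray = pixel_to_camera_ray(320.0, 240.0, K)
```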
After coordinate mapping is completed, for each contour point, the ray emitted from the center (0, 0, 0) of the camera through that contour point is calculated, together with how it passes through the cornea. It can be understood that a pupil contour point in the human eye image is the perspective projection, through corneal refraction, of a point on the contour of the real pupil onto the camera imaging plane. The ray in this step refers to the ray corresponding to this perspective projection. The intersection conditions of the rays and the eyeball model are divided into three cases: in case I, the intersection point of the ray and the eyeball model is on the corneal surface; in case II, the intersection point of the ray and the eyeball model is on the eyeball; in case III, the ray has no intersection with either the eyeball or the cornea.
Specifically, the first optimization function for the contour points is established according to the intersection of the ray and the eyeball model as follows.
For the j-th pupil image contour point m_ij of the i-th frame, the ray from the center of the camera through the 3D point corresponding to m_ij is defined as r(t1) = t1·d_ij, where t1 is the equation parameter, t1 ∈ ℝ, and ℝ represents the set of real numbers.
case I: the intersection point s_ij of the ray and the eyeball model is on the corneal surface. According to the law of refraction, at the corneal surface the normal vector n̂(s_ij), the ray direction d_ij and the refracted-ray vector u_ij are coplanar, i.e. ⟨u_ij, d_ij × n̂(s_ij)⟩ = 0.
The corresponding refracted-ray parametric equation is r(t2) = s_ij + t2·u_ij, where t2 is the equation parameter, t2 ∈ ℝ;
wherein i represents the frame number of the human eye image, j represents the j-th contour point in the pupil contour point set, u_ij represents the refracted-ray vector, n is the refractive index of the cornea, n_air is the refractive index of air (typically 1), d_ij represents the ray direction, n̂(s_ij) represents the normal vector at the intersection point s_ij, ⟨·,·⟩ represents the inner product of vectors, and s_ij represents the intersection of the ray with the corneal surface.
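The refraction step just described can be sketched with the standard vector form of Snell's law; by construction the returned direction is coplanar with the incident direction and the normal. The corneal refractive index value 1.376 in the demo is a typical physiological figure, used here only as an assumption:

```python
import numpy as np

def refract(d, n_hat, n1, n2):
    """Vector form of Snell's law: incident unit direction d, unit
    surface normal n_hat (pointing against d), refractive indices
    n1 -> n2. Returns the refracted unit direction."""
    d = np.asarray(d, dtype=float)
    n_hat = np.asarray(n_hat, dtype=float)
    eta = n1 / n2
    cos_i = -(d @ n_hat)
    sin2_t = eta**2 * (1.0 - cos_i**2)
    if sin2_t > 1.0:
        raise ValueError("total internal reflection")
    cos_t = np.sqrt(1.0 - sin2_t)
    return eta * d + (eta * cos_i - cos_t) * n_hat

# Normal incidence from air into the cornea: the ray passes undeviated.
t = refract([0.0, 0.0, 1.0], [0.0, 0.0, -1.0], 1.0, 1.376)
```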
The intersection point p_ij of the refracted ray and the pupil plane is calculated from the refracted-ray parametric equation; in case I the residual is defined as ρ_ij = | ‖p_ij − P_i‖ − r_i |.
case II: the intersection point p_ij of the ray and the eyeball model is on the eyeball and not on the cornea. The residual is then defined as ρ_ij = ‖p_ij − P_i‖.
case III: the ray has no intersection with either the eyeball or the cornea. The ray parameter t0 is then chosen to minimize the distance between the ray and the center of the eyeball, and the residual is defined as ρ_ij = min_{t0} ‖t0·d_ij − E‖ + ε.
Correspondingly, the first optimization function is:

F_i = Σ_j ρ_ij

wherein E is the position coordinate of the eyeball center, r_i represents the pupil radius, P_i represents the coordinates of the pupil center, and ε is a positive number such that the residuals satisfying case II are always smaller than the residuals satisfying case III.
Case I means the ray has an intersection point with the cornea, and the distance between the point on the contour circle and the center of the circle is expected to be as close as possible to the radius; case II means the ray has no intersection point with the cornea but has an intersection point with the eyeball, and the distance between the point on the contour circle and the center of the circle is expected to be as small as possible; case III means the ray has no intersection point with either the cornea or the eyeball, i.e., the ray deviates far, and the distance between the ray and the center of the eyeball is expected to be as small as possible.
In practice, the case where the ray does not intersect the cornea may not be considered. According to the first optimization function, it is checked whether the distance between each contour point projected onto the iris and the pupil center is sufficiently close to the pupil radius; when F_i is sufficiently small, the estimated pupil center is close to the true value.
It should be noted that, for Case I, when a ray intersects the cornea, the ray includes three points: camera center, pupil contour point, corneal refraction point. When the cornea model is a spherical surface, the intersection of the ray and the corneal spherical surface has two conditions, namely one intersection and two intersections. When tangent, there is a point of intersection, i.e. only one corneal refraction point, in which case the point of intersection is the target refraction point; when two intersection points are generated after rays pass through the cornea, two corneal refraction points are formed, that is, two refraction rays are respectively formed by refraction, and in this case, an intersection point closer to the camera is selected as a target refraction point among the two refraction points. The refracted ray vector is a vector corresponding to the refracted ray generated by the target refracted point.
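The Case I test above amounts to a standard ray-sphere intersection that keeps the root nearer the camera. A minimal sketch (the function name and the 7.8 mm curvature radius are illustrative assumptions):

```python
import numpy as np

def ray_sphere_nearest(origin, d, center, radius):
    """Intersect the ray origin + t*d (t >= 0, d unit length) with a
    sphere; return the intersection nearer the ray origin (the target
    refraction point), or None if there is no intersection."""
    o = np.asarray(origin, dtype=float) - np.asarray(center, dtype=float)
    d = np.asarray(d, dtype=float)
    b = 2.0 * (o @ d)
    c = o @ o - radius**2
    disc = b * b - 4.0 * c
    if disc < 0:
        return None                      # no intersection (Case II/III)
    t = (-b - np.sqrt(disc)) / 2.0       # smaller root = nearer point
    if t < 0:
        t = (-b + np.sqrt(disc)) / 2.0   # camera inside the sphere
    if t < 0:
        return None
    return np.asarray(origin, dtype=float) + t * d

# Camera at the origin looking down +z at a corneal sphere centered at
# z = 30 mm with radius 7.8 mm: the nearer hit is at z = 30 - 7.8 = 22.2.
hit = ray_sphere_nearest([0.0, 0.0, 0.0], [0.0, 0.0, 1.0],
                         [0.0, 0.0, 30.0], 7.8)
```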
The intersection point of the refracted ray and the pupil plane is calculated by combining the refracted-ray parametric equation, the iris plane equation and the corneal spherical equation.
The corneal spherical equation is determined by the center C of the corneal sphere and the corneal curvature radius R_c, where the center of the corneal sphere is:

C = E + d_CE·ω

where E is the position coordinate of the eyeball center, d_CE is the distance between the cornea curvature center and the eyeball center, and ω is the unit vector of the optical axis;
the iris plane equation is determined by the position coordinate P of the pupil center and the optical axis.
For a given eyeball center E, the center C of the corneal sphere can be obtained from d_CE and the optical axis ω as above. Then, from the center C and the corneal curvature radius R_c, the corneal spherical equation ‖x − C‖ = R_c can be determined.
From the center P of the iris circle (i.e., the pupil center) and the normal ω of the iris (i.e., the optical axis), the point-normal iris plane equation ⟨x − P, ω⟩ = 0 can be determined.
At any time, the state of the eyeball model is determined by the position coordinate E of the eyeball center and a state vector θ_i = (φ_i, ψ_i, r_i), where φ_i and ψ_i represent the zenith angle and azimuth angle of the optical axis in the spherical coordinate system of the eyeball, and r_i represents the pupil radius.
By combining the refracted-ray parametric equation, the iris plane equation and the corneal spherical equation, the coordinates of the intersection point of the refracted ray and the iris plane, i.e., the corresponding point on the pupil contour circle under the current model parameter values, can be calculated.
After the projections of the pupil contour points are calculated, the mapped projection of the pupil center point is calculated. Further, considering that the spatial coordinate of the pupil center should also project close to the fitted ellipse center, a second optimization function can be obtained:

G(Θ) = Σ_{i=1}^{N} ( F_i + λ·‖c_i − p̂_i‖² )

wherein c_i represents the center of the ellipse fitted to the pupil contour points, p̂_i represents the pupil center coordinate calculated from the model, G represents the optimization function with respect to the model state, Θ represents the eyeball model states in the 1st to N-th frame images, and λ is the Lagrange multiplier used when constructing the constrained objective function by the Lagrange multiplier method. When the optimization function G is sufficiently small, the found pupil centers are considered to be the real pupil centers, so that more accurate coordinates of the pupil contour points can be determined.
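As a hedged sketch of how the second optimization function combines the pieces, the objective below sums per-frame contour residuals with a Lagrange-multiplier term on the discrepancy between the fitted ellipse center and the model-predicted pupil-center projection (variable names are illustrative assumptions):

```python
import numpy as np

def second_objective(per_frame_residuals, ellipse_centers,
                     projected_pupil_centers, lam):
    """Combined objective: per-frame contour residual sums F_i plus a
    Lagrange term lam * ||c_i - p_i||^2 tying the model-predicted pupil
    center projection p_i to the fitted ellipse center c_i."""
    total = 0.0
    for F_i, c_i, p_i in zip(per_frame_residuals, ellipse_centers,
                             projected_pupil_centers):
        diff = np.asarray(c_i, dtype=float) - np.asarray(p_i, dtype=float)
        total += F_i + lam * np.sum(diff**2)
    return total

# Two frames: contour residuals 0.1 and 0.2, and a one-pixel offset
# between ellipse center and projected pupil center in frame 2.
val = second_objective([0.1, 0.2], [(10.0, 10.0), (12.0, 11.0)],
                       [(10.0, 10.0), (12.0, 12.0)], lam=0.5)
```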
Further, in one embodiment of the present invention, the step of determining an intersection point of the line of sight of the user and the screen according to the corrected pupil contour point set and the eyeball model includes:
carrying out ellipse fitting according to the corrected pupil contour point set, and calculating the normal direction of the pupil according to the fitted ellipse to obtain the optical axis direction under a camera coordinate system;
calculating a two-dimensional projection vector of the optical axis direction on the corresponding human eye image under a camera coordinate system;
determining the projection of the eyeball center according to the intersection point of the two-dimensional projection vectors of at least two frames of human eye images;
determining the coordinates of the pupil center under a camera coordinate system according to the projection of the eyeball center, the position coordinates of the eyeball center and the pupil-eyeball distance, wherein the pupil-eyeball distance is the distance between the pupil center and the eyeball center;
converting the coordinates of the pupil center under the camera coordinate system into a world coordinate system;
converting the optical axis direction under the camera coordinate system into a world coordinate system, and correcting the optical axis direction converted under the world coordinate system into the visual axis direction under the world coordinate system according to the visual axis optical axis included angle, wherein the visual axis optical axis included angle is the included angle between the visual axis of the eyes and the optical axis;
and determining the sight of the user according to the coordinates of the pupil center in the world coordinate system obtained through conversion and the visual axis direction in the world coordinate system obtained through conversion, and calculating the intersection point of the sight and the screen.
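Since the screen plane coincides with the z = 0 plane of the world coordinate system in this setup, the final step reduces to a line-plane intersection. A minimal sketch under that assumption; the function name and the sample numbers are illustrative only:

```python
def gaze_screen_intersection(pupil_center_w, visual_axis_w):
    """Intersection of the gaze ray with the screen plane z = 0 in world coordinates."""
    px, py, pz = pupil_center_w
    vx, vy, vz = visual_axis_w
    t = -pz / vz                      # solve pz + t*vz = 0
    return (px + t * vx, py + t * vy)

# Pupil 600 mm in front of the screen, gazing slightly off-center.
point = gaze_screen_intersection((10.0, -5.0, 600.0), (0.05, -0.02, -1.0))
```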
The step of determining the coordinates of the pupil center under the camera coordinate system according to the projection of the eyeball center, the position coordinates of the eyeball center and the pupil-eyeball distance comprises the following steps:
calculating the direction of the eyeball center in a camera coordinate system according to the projection of the eyeball center;
determining the distance between the camera and the eyeball center under the world coordinate according to the position of the eyeball center and the translation vector of the camera;
calculating the coordinates of the eyeball center under a camera coordinate system according to the direction of the eyeball center in the camera coordinate system, the distance between the camera and the eyeball center and the pupil-eyeball distance;
and determining the coordinates of the pupil center in the camera coordinate system according to the coordinates of the eyeball center in the camera coordinate system and the optical axis direction in the camera coordinate system.
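The steps above can be sketched as follows. The pinhole back-projection with intrinsics (cx, cy, f), the function names, and the sample distances are assumptions for illustration, not the patent's exact formulation.

```python
import math

def eyeball_direction(proj, cx, cy, f):
    """Unit direction of the eyeball center in the camera frame from its image projection."""
    v = [proj[0] - cx, proj[1] - cy, f]
    n = math.sqrt(sum(x * x for x in v))
    return [x / n for x in v]

def pupil_center_cam(direction, dist_cam_eye, optical_axis, dist_pupil_eye):
    """Eyeball center = distance * direction; the pupil center sits dist_pupil_eye
    from it along the optical axis direction."""
    eye = [dist_cam_eye * c for c in direction]
    return [eye[i] + dist_pupil_eye * optical_axis[i] for i in range(3)]

d = eyeball_direction((320.0, 240.0), 320.0, 240.0, 500.0)   # eye straight ahead
p = pupil_center_cam(d, 600.0, (0.0, 0.0, -1.0), 10.5)       # pupil 10.5 mm toward camera
```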
Step S14, calculating the distance between the position of the intersection point and the real position of the current preset fixation point, and establishing a nonlinear equation corresponding to the current preset fixation point by taking each unknown parameter in the eyeball model as a variable and taking the distance as a target function.
And step S15, acquiring a nonlinear equation corresponding to each preset gazing point, and solving an equation set formed by the nonlinear equations corresponding to the preset gazing points to obtain the values of the unknown parameters in the eyeball model.
In particular, the line of sight of the user and the screen intersect at a point. The coordinates of the current preset fixation point in the world coordinate system give the real position of the current preset fixation point.
With the 8 fixation points, a nonlinear equation set is established and solved to obtain the 8 model parameters to be estimated, wherein the position of the eyeball center contributes parameters along three coordinate axes. The nonlinear equation set may be solved using the Levenberg-Marquardt algorithm.
Specifically, the distance terms respectively represent the distances corresponding to the 8 fixation points; the two-norm is used, i.e., the model parameter x is found that minimizes the sum of squared gaze-point errors.
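The Levenberg-Marquardt algorithm named in the description can be sketched compactly as damped Gauss-Newton with a numerical Jacobian. The version below is restricted to two parameters for brevity, and the toy residuals merely stand in for the eight gaze-point distance equations; all names and the fixed damping factor are illustrative assumptions.

```python
def numeric_jacobian(f, x, h=1e-6):
    """J[i][k] = d r_k / d x_i by forward differences."""
    r0 = f(x)
    J = []
    for i in range(len(x)):
        xp = list(x)
        xp[i] += h
        J.append([(rk - r0k) / h for rk, r0k in zip(f(xp), r0)])
    return r0, J

def solve2(A, g):
    """Cramer's rule for a 2x2 system A x = g."""
    det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
    return [(g[0] * A[1][1] - g[1] * A[0][1]) / det,
            (A[0][0] * g[1] - A[1][0] * g[0]) / det]

def levenberg_marquardt_2d(f, x, iters=100, lam=1e-3):
    """Damped Gauss-Newton step: (J^T J + lam*I) dx = -J^T r (two parameters only)."""
    x = list(x)
    for _ in range(iters):
        r, J = numeric_jacobian(f, x)
        m = len(r)
        A = [[sum(J[i][k] * J[j][k] for k in range(m)) + (lam if i == j else 0.0)
              for j in range(2)] for i in range(2)]
        g = [-sum(J[i][k] * r[k] for k in range(m)) for i in range(2)]
        dx = solve2(A, g)
        x = [x[0] + dx[0], x[1] + dx[1]]
        if abs(dx[0]) + abs(dx[1]) < 1e-10:
            break
    return x

# Toy residuals standing in for the gaze-point distance equations (minimum at (1, -2)).
def toy_residuals(x):
    return [x[0] - 1.0, x[1] + 2.0, x[0] + x[1] + 1.0]

x = levenberg_marquardt_2d(toy_residuals, [0.0, 0.0])
```

A production solver would additionally adapt the damping factor between iterations; the fixed value here keeps the sketch short.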
In this embodiment, a plurality of preset fixation points on the screen are used to guide the user's gaze, a nonlinear equation set in the unknown parameters of the eyeball model is established from the measurements on the eye images, and the system is solved to obtain the model parameters. The model parameters can thus be corrected for each individual user during the calibration process, yielding a more accurate gaze direction.
Example 2
The following describes, as a specific example in practical operation, a step of determining an intersection point of the user's gaze and the screen according to the corrected pupil contour point set and the eyeball model.
The calculation involves a world coordinate system and a camera coordinate system. Let the x-axis and y-axis of the world coordinate system lie in the screen plane and the z-axis be perpendicular to the screen plane. The origin of the world coordinate system is at the center of the screen; the x-axis points horizontally to the right, the y-axis points down, and the z-axis points toward the eyeball. The camera coordinate system roughly coincides with the world coordinate system but is tilted up about the x-axis, with slight rotations about the y-axis and z-axis. Subscripts distinguish the camera coordinate system from the world coordinate system, and superscripts distinguish the optical axis of the eye from its visual axis.
First, the pupil projection is fitted to an ellipse. In a specific implementation, the contour of the corneal-refraction-corrected pupil projection is fitted to an ellipse. From the fitted ellipse, the pupil normal, i.e., the optical axis direction in the camera coordinate system, is calculated, as well as the two-dimensional projection vector of the three-dimensional optical axis direction on the image.
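Under an orthographic approximation, a circular pupil of radius a viewed at tilt angle θ projects to an ellipse with axis ratio b/a = cos θ, and the minor-axis direction of the ellipse carries the in-image component of the disc normal (with a two-fold ambiguity). A sketch under these assumptions — not the patent's exact formulation:

```python
import math

def optical_axis_from_ellipse(a, b, minor_axis_dir):
    """Pupil-disc normal from its elliptical projection (orthographic sketch).
    b/a = cos(theta) for a circle seen at tilt theta; the minor-axis direction
    gives the in-image component of the 3D normal (sign ambiguity not resolved)."""
    cos_t = b / a
    sin_t = math.sqrt(max(0.0, 1.0 - cos_t * cos_t))
    mx, my = minor_axis_dir
    n = math.hypot(mx, my)
    mx, my = mx / n, my / n
    return (sin_t * mx, sin_t * my, -cos_t)  # unit normal, pointing toward the camera

axis = optical_axis_from_ellipse(2.0, 1.0, (1.0, 0.0))  # ellipse half the width of its height
```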
Secondly, initializing eyeball center projection, specifically:
computing the intersection point of the two-dimensional optical-axis projection vectors of at least two images, which is taken as the eyeball center projection;
According to the camera intrinsic parameters and the eyeball center projection, the direction of the eyeball center in the camera coordinate system is calculated and normalized, where cx and cy are respectively the abscissa and ordinate of the center pixel of the image sensor, and f is the focal length of the camera.
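The intersection of two projected optical-axis lines can be computed with Cramer's rule. A minimal 2D sketch; the function name and the sample lines are illustrative:

```python
def line_intersection_2d(p1, d1, p2, d2):
    """Intersection of lines p1 + s*d1 and p2 + u*d2 (the projected optical axes)."""
    # Solve s*d1 - u*d2 = p2 - p1 by Cramer's rule.
    det = d1[0] * (-d2[1]) - (-d2[0]) * d1[1]
    rx, ry = p2[0] - p1[0], p2[1] - p1[1]
    s = (rx * (-d2[1]) - (-d2[0]) * ry) / det
    return (p1[0] + s * d1[0], p1[1] + s * d1[1])

# Two projected axes from two frames, crossing at (1, 1).
center_proj = line_intersection_2d((0.0, 0.0), (1.0, 1.0), (0.0, 2.0), (1.0, -1.0))
```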
Finally, fitting model parameters specifically include:
Calculating the direction of the eye's optical axis in the camera coordinate system from the corresponding eye image;
Calculating the distance between the camera and the eyeball center from the eyeball center coordinates in the world coordinate system and the translation vector in the camera extrinsic parameters;
Calculating the pupil center coordinates in the camera coordinate system using the distance between the pupil center and the eyeball center;
converting the pupil center from the camera coordinate system to the world coordinate system using the camera extrinsic parameters, i.e., the rotation matrix corresponding to the rotation-angle vector;
converting the optical axis direction from the camera coordinate system to the world coordinate system;
correcting the optical axis direction in the world coordinate system to the visual axis direction in the world coordinate system according to the included angle between the visual axis and the optical axis.
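The coordinate conversions in these steps follow one common extrinsics convention, p_world = R^T (p_cam - t); whether the patent defines R and t exactly this way is an assumption here, so treat the sketch as illustrative.

```python
import math

def rot_x(angle):
    """Rotation matrix about the x-axis (the dominant camera-to-world tilt in this setup)."""
    c, s = math.cos(angle), math.sin(angle)
    return [[1.0, 0.0, 0.0], [0.0, c, -s], [0.0, s, c]]

def mat_vec(M, v):
    return [sum(M[i][j] * v[j] for j in range(3)) for i in range(3)]

def cam_to_world(p_cam, R, t):
    """p_world = R^T (p_cam - t), with R, t the camera extrinsics (one common convention)."""
    d = [p_cam[i] - t[i] for i in range(3)]
    Rt = [[R[j][i] for j in range(3)] for i in range(3)]  # transpose of R
    return mat_vec(Rt, d)

p_ident = cam_to_world([1.0, 2.0, 4.0], rot_x(0.0), [1.0, 2.0, 3.0])   # pure translation
p_tilt = cam_to_world([0.0, 0.0, 1.0], rot_x(math.pi / 2), [0.0, 0.0, 0.0])
```

The same machinery, applied with the full rotation-angle vector instead of a single x-axis tilt, covers the small y-axis and z-axis rotations mentioned above.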
Example 3
Referring to fig. 3, the gaze tracking method according to the third embodiment of the present invention includes steps S31 to S33.
Step S31, correcting the parameters of the eyeball model according to the values of the unknown parameters of the eyeball model obtained by the model parameter calibration method.
And step S32, acquiring the human eye image to be detected of the user, and carrying out contour detection on the pupil projection in the human eye image to be detected to obtain a contour point set.
Step S33, performing corneal refraction correction on the contour point set, and determining an intersection point between the user' S sight line and the screen according to the corrected contour point set and the eyeball model after parameter correction.
The parameters of the eyeball model are calibrated by the model parameter calibration method in the above embodiment. A current human eye image of the user to be detected is acquired, and contour detection is performed on the pupil projection in the human eye image to be detected to obtain a contour point set. Corneal refraction correction is performed on the set of contour points to eliminate the effect of corneal refraction. The corrected contour point set is input into the system formed by the screen, the camera and the eyeball, and the intersection point of the user's line of sight and the screen is calculated.
Referring to fig. 4, a model parameter calibration apparatus according to a fourth embodiment of the present invention includes:
a first obtaining module 41, configured to obtain images captured when a user watches each preset gazing point on a screen, respectively, so as to obtain a human eye image corresponding to each preset gazing point, where the number of the preset gazing points is set according to the number of unknown parameters in an eyeball model;
the first pupil detection module 42 is configured to perform contour point detection on pupil projections in a human eye image corresponding to a current preset gaze point to obtain a pupil contour point set;
a first correction module 43, configured to perform corneal refraction correction on the pupil contour point set, and determine an intersection point between the user's gaze and the screen according to the corrected pupil contour point set and the eyeball model;
an equation establishing module 44, configured to calculate a distance between the position of the intersection and a real position of the current preset gaze point, and establish a nonlinear equation corresponding to the current preset gaze point by using each unknown parameter in the eyeball model as a variable and using the distance as a target function;
and the equation solving module 45 is configured to obtain a nonlinear equation corresponding to each preset gazing point, and solve an equation set formed by the nonlinear equations corresponding to each preset gazing point to obtain values of each unknown parameter in the eyeball model.
Referring to fig. 5, a gaze tracking apparatus according to a fifth embodiment of the present invention includes:
a correction module 51, configured to correct parameters of the eyeball model according to values of unknown parameters of the eyeball model obtained by the model parameter calibration method;
a second obtaining module 52, configured to obtain an image of a human eye to be detected of the user;
the second pupil detection module 53 is configured to perform contour detection on a pupil projection in the human eye image to be detected, so as to obtain a contour point set;
a second correction module 54 for performing corneal refractive correction on the set of contour points;
and a determining module 55, configured to determine an intersection point between the line of sight of the user and the screen according to the corrected contour point set and the eyeball model after parameter modification.
The model parameter calibration apparatus provided by this embodiment of the invention has the same implementation principle and technical effect as the method embodiments; for brevity, where the apparatus embodiment is not detailed, reference may be made to the corresponding content in the method embodiments.
In another aspect, the present invention further provides an electronic device, please refer to fig. 6, which includes a processor 10, a memory 20, and a computer program 30 stored on the memory and executable on the processor, wherein the processor 10 executes the computer program 30 to implement the method in the above embodiment.
The electronic device may be, but is not limited to, a virtual reality headset, a computer, a server, and the like. The processor 10 may, in some embodiments, be a Central Processing Unit (CPU), controller, microcontroller, microprocessor or other data-processing chip that executes program code stored in the memory 20 or processes data.
The memory 20 includes at least one type of readable storage medium, which includes a flash memory, a hard disk, a multimedia card, a card type memory (e.g., SD or DX memory, etc.), a magnetic memory, a magnetic disk, an optical disk, and the like. The memory 20 may in some embodiments be an internal storage unit of the electronic device, for example a hard disk of the electronic device. The memory 20 may also be an external storage device of the electronic device in other embodiments, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card), and the like, provided on the electronic device. Further, the memory 20 may also include both an internal storage unit and an external storage device of the electronic apparatus. The memory 20 may be used not only to store application software installed in the electronic device and various types of data, but also to temporarily store data that has been output or will be output.
Optionally, the electronic device may further comprise a user interface, a network interface, a communication bus, etc., the user interface may comprise a Display (Display), an input unit such as a Keyboard (Keyboard), and the optional user interface may further comprise a standard wired interface, a wireless interface. Alternatively, in some embodiments, the display may be an LED display, a liquid crystal display, a touch-sensitive liquid crystal display, an OLED (Organic Light-Emitting Diode) touch device, or the like. The display, which may also be referred to as a display screen or display unit, is suitable, among other things, for displaying information processed in the electronic device and for displaying a visualized user interface. The network interface may optionally include a standard wired interface, a wireless interface (e.g., WI-FI interface), typically used to establish a communication link between the device and other electronic devices. The communication bus is used to enable connection communication between these components.
It should be noted that the configuration shown in fig. 6 does not constitute a limitation of the electronic device, and in other embodiments the electronic device may include fewer or more components than shown, or some components may be combined, or a different arrangement of components.
The invention also proposes a computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the method as in the above-mentioned embodiments.
Those of skill in the art will understand that the logic and/or steps represented in the flowcharts or otherwise described herein, for example an ordered listing of executable instructions for implementing logical functions, can be embodied in any computer-readable medium for use by or in connection with an instruction execution system, apparatus, or device (such as a computer-based system, a system including a processor, or another system that can fetch and execute instructions from the instruction execution system, apparatus, or device). For the purposes of this description, a "computer-readable medium" can be any means that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device.
More specific examples (a non-exhaustive list) of the computer-readable medium would include the following: an electrical connection (electronic device) having one or more wires, a portable computer diskette (magnetic device), a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber device, and a portable compact disc read-only memory (CDROM). Additionally, the computer-readable medium could even be paper or another suitable medium upon which the program is printed, as the program can be electronically captured, via for instance optical scanning of the paper or other medium, then compiled, interpreted or otherwise processed in a suitable manner if necessary, and then stored in a computer memory.
It should be understood that portions of the present invention may be implemented in hardware, software, firmware, or a combination thereof. In the above embodiments, the various steps or methods may be implemented in software or firmware stored in memory and executed by a suitable instruction execution system. For example, if implemented in hardware, as in another embodiment, any one or combination of the following technologies, which are well known in the art, may be used: a discrete logic circuit having a logic gate circuit for implementing a logic function on a data signal, an application specific integrated circuit having an appropriate combinational logic gate circuit, a Programmable Gate Array (PGA), a Field Programmable Gate Array (FPGA), or the like.
In the description of the specification, reference to the description of "one embodiment," "some embodiments," "an example," "a specific example," or "some examples" or the like means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the invention. In this specification, the schematic representations of the terms used above do not necessarily refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples.
The above-mentioned embodiments only express several embodiments of the present invention, and the description thereof is more specific and detailed, but not construed as limiting the scope of the present invention. It should be noted that, for a person skilled in the art, several variations and modifications can be made without departing from the inventive concept, which falls within the scope of the present invention. Therefore, the protection scope of the present patent shall be subject to the appended claims.
Claims (9)
1. A method for calibrating model parameters is characterized by comprising the following steps:
respectively acquiring images shot when a user watches each preset fixation point on a screen to obtain human eye images corresponding to each preset fixation point, wherein the number of the preset fixation points is set according to the number of unknown parameters in an eyeball model;
carrying out contour point detection on pupil projection in a human eye image corresponding to a current preset fixation point to obtain a pupil contour point set;
performing corneal refraction correction on the pupil contour point set, and determining an intersection point of the sight of the user and the screen according to the corrected pupil contour point set and the eyeball model;
calculating the distance between the position of the intersection point and the real position of the current preset fixation point, and establishing a nonlinear equation corresponding to the current preset fixation point by taking each unknown parameter in the eyeball model as a variable and taking the distance as a target function;
acquiring a nonlinear equation corresponding to each preset watching point, and solving an equation set formed by the nonlinear equations corresponding to the preset watching points to obtain values of unknown parameters in the eyeball model;
the step of performing corneal refraction correction on the set of pupil contour points comprises:
establishing a first optimization function related to the pupil contour points according to the intersection condition of rays respectively emitted by each contour point in the pupil contour point set through the center of a camera and the eyeball model;
establishing a second optimization function related to the pupil center according to the first optimization function and the ellipse center coordinate fitted by the pupil contour point;
the first optimization function is:
the second optimization function is:
wherein case I is that the ray intersects the cornea; according to the law of refraction, the normal vector of the corneal surface, the direction of the incident ray, and the vector of the refracted light are coplanar, i.e.:
The corresponding refracted-light parametric equation follows, where t2 is the equation parameter, taking values in the set of real numbers;
case II is that the ray intersects the eyeball but does not intersect the cornea;
case III means that the ray intersects neither the eyeball nor the cornea; in this case the ray parameter t0 is selected so as to minimize the distance between the ray and the eyeball center;
wherein i represents the frame number of the human eye image; j represents the jth contour point in the pupil contour point set; E is the position coordinate of the eyeball sphere center; ri represents the pupil radius; further symbols denote, respectively, the pupil center coordinates, a positive weight, the jth pupil image contour point of the ith pupil, the intersection point of the corresponding camera ray with the eyeball model, the norm sign, the vector inner product, the intersection of the ray with the corneal surface, the center of the ellipse fitted to the pupil contour points, and the pupil center coordinates calculated from the model state; n is the refractive index of the cornea and the refractive index of air also appears; the optimization function is defined with respect to the model state, whose variables are the states of the eyeball model in the 1st to Nth frame images; and λ is the Lagrange multiplier.
2. The method for calibrating model parameters according to claim 1, wherein the step of determining the intersection point of the user's gaze and the screen based on the corrected set of pupil contour points and the eyeball model comprises:
carrying out ellipse fitting according to the corrected pupil contour point set, and calculating the normal direction of the pupil according to the fitted ellipse so as to obtain the optical axis direction under a camera coordinate system;
calculating a two-dimensional projection vector of the optical axis direction under a camera coordinate system on a corresponding human eye image;
determining the projection of the eyeball center according to the intersection point of the two-dimensional projection vectors of at least two frames of human eye images;
determining the coordinates of the pupil center under a camera coordinate system according to the projection of the eyeball center, the position coordinates of the eyeball center and the pupil-eyeball distance, wherein the pupil-eyeball distance is the distance between the pupil center and the eyeball center;
converting the coordinates of the pupil center under the camera coordinate system into a world coordinate system;
converting the optical axis direction under the camera coordinate system into a world coordinate system, and correcting the optical axis direction converted under the world coordinate system into the visual axis direction under the world coordinate system according to the visual axis optical axis included angle, wherein the visual axis optical axis included angle is the included angle between the visual axis of the eyes and the optical axis;
and determining the sight of the user according to the coordinates of the pupil center in the world coordinate system obtained through conversion and the visual axis direction in the world coordinate system obtained through conversion, and calculating the intersection point of the sight and the screen.
3. The method for calibrating model parameters according to claim 2, wherein the step of determining the coordinates of the pupil center in the camera coordinate system according to the projection of the eyeball center, the position coordinates of the eyeball center, and the pupil-eyeball distance comprises:
calculating the direction of the eyeball center in a camera coordinate system according to the projection of the eyeball center;
determining the distance between the camera and the eyeball center under the world coordinate according to the position of the eyeball center and the translation vector of the camera;
calculating the coordinates of the eyeball center under a camera coordinate system according to the direction of the eyeball center in the camera coordinate system, the distance between the camera and the eyeball center and the pupil-eyeball distance;
and determining the coordinates of the pupil center in the camera coordinate system according to the coordinates of the eyeball center in the camera coordinate system and the optical axis direction in the camera coordinate system.
4. The method for calibrating model parameters according to claim 1, wherein the unknown parameters of the eyeball model comprise: the position coordinates of the eyeball center, the distance between the pupil center and the eyeball center, the distance between the cornea curvature center and the eyeball center, the cornea curvature radius and the included angle between the visual axis and the optical axis.
5. A gaze tracking method, comprising:
correcting the parameters of the eyeball model according to the value of the unknown parameters of the eyeball model obtained by the model parameter calibration method according to any one of claims 1 to 4;
acquiring a to-be-detected human eye image of the user, and carrying out contour detection on pupil projection in the to-be-detected human eye image to obtain a contour point set;
and performing corneal refraction correction on the contour point set, and determining the intersection point of the sight of the user and the screen according to the corrected contour point set and the eyeball model after parameter correction.
6. A model parameter calibration device is characterized by comprising:
the first acquisition module is used for respectively acquiring images shot when a user watches each preset fixation point on a screen so as to obtain human eye images corresponding to each preset fixation point, wherein the number of the preset fixation points is set according to the number of unknown parameters in an eyeball model;
the first pupil detection module is used for detecting contour points of pupil projections in a human eye image corresponding to a current preset fixation point to obtain a pupil contour point set;
the first correction module is used for performing corneal refraction correction on the pupil contour point set and determining an intersection point of the sight of the user and the screen according to the corrected pupil contour point set and the eyeball model;
the equation establishing module is used for calculating the distance between the position of the intersection point and the real position of the current preset fixation point, and establishing a nonlinear equation corresponding to the current preset fixation point by taking each unknown parameter in the eyeball model as a variable and taking the distance as a target function;
the equation solving module is used for acquiring a nonlinear equation corresponding to each preset gazing point and solving an equation set formed by the nonlinear equations corresponding to the preset gazing points to obtain values of all unknown parameters in the eyeball model;
the step of performing corneal refraction correction on the set of pupil contour points comprises:
establishing a first optimization function related to the pupil contour points according to the intersection condition of rays respectively emitted by each contour point in the pupil contour point set through the center of a camera and the eyeball model;
establishing a second optimization function related to the pupil center according to the first optimization function and the ellipse center coordinate fitted by the pupil contour point;
the first optimization function is:
the second optimization function is:
wherein case I is that the ray intersects the cornea; according to the law of refraction, the normal vector of the corneal surface, the direction of the incident ray, and the vector of the refracted light are coplanar, i.e.:
The corresponding refracted-light parametric equation follows, where t2 is the equation parameter, taking values in the set of real numbers;
case II is that the ray intersects the eyeball but does not intersect the cornea;
case III means that the ray intersects neither the eyeball nor the cornea; in this case the ray parameter t0 is selected so as to minimize the distance between the ray and the eyeball center;
wherein i represents the frame number of the human eye image; j represents the jth contour point in the pupil contour point set; E is the position coordinate of the eyeball sphere center; ri represents the pupil radius; further symbols denote, respectively, the pupil center coordinates, a positive weight, the jth pupil image contour point of the ith pupil, the intersection point of the corresponding camera ray with the eyeball model, the norm sign, the vector inner product, the intersection of the ray with the corneal surface, the center of the ellipse fitted to the pupil contour points, and the pupil center coordinates calculated from the model state; n is the refractive index of the cornea and the refractive index of air also appears; the optimization function is defined with respect to the model state, whose variables are the states of the eyeball model in the 1st to Nth frame images; and λ is the Lagrange multiplier.
7. A gaze tracking device, comprising:
a correction module, configured to correct the parameters of the eyeball model according to the value of the unknown parameters of the eyeball model obtained by the model parameter calibration method according to any one of claims 1 to 4;
the second acquisition module is used for acquiring the to-be-detected eye image of the user;
the second pupil detection module is used for carrying out contour detection on the pupil projection in the human eye image to be detected to obtain a contour point set;
a second correction module for performing corneal refractive correction on the set of contour points;
and the determining module is used for determining the intersection point of the sight of the user and the screen according to the corrected contour point set and the eyeball model with the parameters corrected.
8. A readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the method according to any one of claims 1 to 5.
9. An electronic device comprising a memory, a processor, and a computer program stored on the memory and executable on the processor, wherein the processor implements the method of any of claims 1 to 5 when executing the computer program.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210267130.6A CN114360043B (en) | 2022-03-18 | 2022-03-18 | Model parameter calibration method, sight tracking method, device, medium and equipment |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210267130.6A CN114360043B (en) | 2022-03-18 | 2022-03-18 | Model parameter calibration method, sight tracking method, device, medium and equipment |
Publications (2)
Publication Number | Publication Date |
---|---|
CN114360043A CN114360043A (en) | 2022-04-15 |
CN114360043B true CN114360043B (en) | 2022-06-17 |
Family
ID=81094291
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202210267130.6A Active CN114360043B (en) | 2022-03-18 | 2022-03-18 | Model parameter calibration method, sight tracking method, device, medium and equipment |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN114360043B (en) |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN115546876B (en) * | 2022-11-07 | 2023-12-19 | 广州图语信息科技有限公司 | Pupil tracking method and device |
CN116052264B (en) * | 2023-03-31 | 2023-07-04 | 广州视景医疗软件有限公司 | Sight estimation method and device based on nonlinear deviation calibration |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2003079577A (en) * | 2001-09-12 | 2003-03-18 | Nippon Telegr & Teleph Corp <Ntt> | Visual axis measuring apparatus and method, visual axis measuring program, and recording medium recording the same |
CN101901485A (en) * | 2010-08-11 | 2010-12-01 | 华中科技大学 | 3D free head moving type gaze tracking system |
CN102520796A (en) * | 2011-12-08 | 2012-06-27 | 华南理工大学 | Sight tracking method based on stepwise regression analysis mapping model |
CN108968907A (en) * | 2018-07-05 | 2018-12-11 | 四川大学 | The bearing calibration of eye movement data and device |
CN110263745A (en) * | 2019-06-26 | 2019-09-20 | 京东方科技集团股份有限公司 | A kind of method and device of pupil of human positioning |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9983709B2 (en) * | 2015-11-02 | 2018-05-29 | Oculus Vr, Llc | Eye tracking using structured light |
ES2909057T3 (en) * | 2017-09-08 | 2022-05-05 | Tobii Ab | Eye tracking using the position of the center of the eyeball |
Non-Patent Citations (2)
Title |
---|
Arantxa Villanueva et al., "A Novel Gaze Estimation System With One Calibration Point," IEEE Transactions on Systems, 2008. *
Liu Dong, "Research on Gaze Estimation Methods Based on Stereo Vision," China Masters' Theses Full-text Database, vol. 2017, no. 5, 2017. *
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP2010259605A (en) | Visual line measuring device and visual line measuring program | |
US9244529B2 (en) | Point-of-gaze estimation robust to head rotations and/or device rotations | |
US9291834B2 (en) | System for the measurement of the interpupillary distance using a device equipped with a display and a camera | |
US7950800B2 (en) | Method of measuring at least one geometrico-physiognomic parameter for positioning a vision correcting eyeglass frame on the face of a wearer | |
CN109690553A (en) | The system and method for executing eye gaze tracking | |
EP3339943A1 (en) | Method and system for obtaining optometric parameters for fitting eyeglasses | |
JP6594129B2 (en) | Information processing apparatus, information processing method, and program | |
CN113808160B (en) | Sight direction tracking method and device | |
US10216010B2 (en) | Determining user data based on image data of a selected eyeglass frame | |
CN112102389A (en) | Method and system for determining spatial coordinates of a 3D reconstruction of at least a part of a physical object | |
CN114360043B (en) | Model parameter calibration method, sight tracking method, device, medium and equipment | |
US10866635B2 (en) | Systems and methods for capturing training data for a gaze estimation model | |
WO2019010959A1 (en) | Method and device for determining sight line, and computer readable storage medium | |
JP6840697B2 (en) | Line-of-sight direction estimation device, line-of-sight direction estimation method, and line-of-sight direction estimation program | |
US10955690B2 (en) | Spectacle wearing parameter measurement system, measurement program, measurement method thereof, and manufacturing method of spectacle lens | |
US11181978B2 (en) | System and method for gaze estimation | |
US20220003632A1 (en) | Method and device for measuring the local refractive power and/or the refractive power distribution of a spectacle lens | |
KR20200006621A (en) | Methods, apparatus, and computer programs for determining near vision points | |
JP2018099174A (en) | Pupil detector and pupil detection method | |
CN116999017B (en) | Auxiliary eye care intelligent control system based on data analysis | |
JP2019215688A (en) | Visual line measuring device, visual line measurement method and visual line measurement program for performing automatic calibration | |
CN113129451B (en) | Holographic three-dimensional image space quantitative projection method based on binocular vision positioning | |
CN116051631A (en) | Light spot labeling method and system | |
JP6906943B2 (en) | On-board unit | |
JP2018101212A (en) | On-vehicle device and method for calculating degree of face directed to front side |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||