CN109696954A - Gaze tracking method, apparatus, device and storage medium - Google Patents

Gaze tracking method, apparatus, device and storage medium

Info

Publication number
CN109696954A
CN109696954A (application number CN201710987005.1A)
Authority
CN
China
Prior art keywords
coordinate
point
pupil
light source
center
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201710987005.1A
Other languages
Chinese (zh)
Other versions
CN109696954B (en)
Inventor
高林
刘婷婷
袁坤
黄婷婷
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tencent Technology Shenzhen Co Ltd
Institute of Computing Technology of CAS
Tencent Cyber Tianjin Co Ltd
Original Assignee
Institute of Computing Technology of CAS
Tencent Cyber Tianjin Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Institute of Computing Technology of CAS, Tencent Cyber Tianjin Co Ltd filed Critical Institute of Computing Technology of CAS
Priority to CN201710987005.1A priority Critical patent/CN109696954B/en
Publication of CN109696954A publication Critical patent/CN109696954A/en
Application granted granted Critical
Publication of CN109696954B publication Critical patent/CN109696954B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 - Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F 3/013 - Eye tracking input arrangements
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 40/00 - Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/10 - Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V 40/18 - Eye characteristics, e.g. of the iris
    • G06V 40/19 - Sensors therefor

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Ophthalmology & Optometry (AREA)
  • Multimedia (AREA)
  • Eye Examination Apparatus (AREA)

Abstract

The present invention relates to a gaze tracking method, apparatus, device and storage medium, comprising: obtaining an eye image; determining, in the three-dimensional coordinate system of a screen, the coordinates of the pupil imaging point in the eye image and the coordinates of the light-source imaging point formed in the eye image through corneal reflection of a light source; determining an optical axis direction according to the coordinates of the pupil imaging point and the light-source imaging point; determining a visual axis direction according to the optical axis direction and the optical-axis/visual-axis angle difference matched to the eye image; and determining, according to the visual axis direction, the gaze point position on the screen. The scheme of this application improves the accuracy of the determined gaze point position.

Description

Gaze tracking method, apparatus, device and storage medium
Technical field
The present invention relates to the field of computer technology, and more particularly to a gaze tracking method, apparatus, device and storage medium.
Background art
With the rapid development of science and technology, gaze tracking technology has attracted more and more attention. Gaze tracking estimates the position of the eye's gaze point by observing the eyes.
In conventional methods, the gaze point of the human eye is estimated by comparing the currently captured eye image with a previously captured eye image, according to the displacement difference of the pupil imaging point between the two eye images. A gaze point position estimated directly from this planar displacement difference of the pupil imaging point is not accurate enough.
Summary of the invention
Based on this, it is necessary to provide a gaze tracking method, apparatus, computer device and storage medium that address the problem that gaze point positions estimated from the displacement difference of the pupil imaging point are not accurate enough.
A gaze tracking method, the method comprising:
obtaining an eye image;
determining, in the three-dimensional coordinate system of a screen, the coordinates of the pupil imaging point in the eye image and the coordinates of the light-source imaging point formed in the eye image through corneal reflection of a light source;
determining an optical axis direction according to the coordinates of the pupil imaging point and the coordinates of the light-source imaging point;
determining a visual axis direction according to the optical axis direction and the optical-axis/visual-axis angle difference matched to the eye image;
determining, according to the visual axis direction, the gaze point position on the screen.
A gaze tracking apparatus, the apparatus comprising:
an obtaining module, configured to obtain an eye image;
a coordinate determining module, configured to determine, in the three-dimensional coordinate system of a screen, the coordinates of the pupil imaging point in the eye image and the coordinates of the light-source imaging point formed in the eye image through corneal reflection of a light source;
an optical axis direction determining module, configured to determine an optical axis direction according to the coordinates of the pupil imaging point and the coordinates of the light-source imaging point;
a visual axis direction determining module, configured to determine a visual axis direction according to the optical axis direction and the optical-axis/visual-axis angle difference matched to the eye image;
a gaze point determining module, configured to determine, according to the visual axis direction, the gaze point position on the screen.
A computer device, comprising a memory and a processor, the memory storing a computer program that, when executed by the processor, causes the processor to perform the following steps:
obtaining an eye image;
determining, in the three-dimensional coordinate system of a screen, the coordinates of the pupil imaging point in the eye image and the coordinates of the light-source imaging point formed in the eye image through corneal reflection of a light source;
determining an optical axis direction according to the coordinates of the pupil imaging point and the coordinates of the light-source imaging point;
determining a visual axis direction according to the optical axis direction and the optical-axis/visual-axis angle difference matched to the eye image;
determining, according to the visual axis direction, the gaze point position on the screen.
A storage medium storing a computer program that, when executed by one or more processors, causes the one or more processors to perform the following steps:
obtaining an eye image;
determining, in the three-dimensional coordinate system of a screen, the coordinates of the pupil imaging point in the eye image and the coordinates of the light-source imaging point formed in the eye image through corneal reflection of a light source;
determining an optical axis direction according to the coordinates of the pupil imaging point and the coordinates of the light-source imaging point;
determining a visual axis direction according to the optical axis direction and the optical-axis/visual-axis angle difference matched to the eye image;
determining, according to the visual axis direction, the gaze point position on the screen.
The above gaze tracking method, apparatus, computer device and storage medium determine the coordinates, in the screen coordinate system, of the pupil imaging point formed by the pupil after corneal refraction and of the light-source imaging point formed by the light source after corneal reflection, thereby converting the pupil imaging point and the light-source imaging point in the planar eye image into coordinates in the three-dimensional screen coordinate system. From the three-dimensional coordinates of the pupil imaging point and the light-source imaging point, the optical axis direction, the line connecting the pupil center and the corneal curvature center of the real eye, is determined, and the visual axis direction is determined according to the angle difference between the optical axis and the visual axis. The visual axis direction represents the sight direction more accurately, and in turn the gaze point position of the eye on the screen determined according to the visual axis direction is more accurate.
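The steps recited above (obtain image, determine imaging point coordinates, determine the optical axis, determine the visual axis, determine the gaze point) can be sketched as a pipeline skeleton. This is an illustrative outline only; the callable names are placeholders, not terms from the patent:

```python
import numpy as np

def track_gaze(eye_image, locate_points, solve_optical_axis, apply_kappa, intersect_screen):
    """Skeleton of the claimed method (steps S202-S210). The four callables
    stand in for the per-step computations detailed later in the description."""
    pupil_pt, glint_pt = locate_points(eye_image)                        # S204: imaging point coordinates
    optical_axis, cornea_center = solve_optical_axis(pupil_pt, glint_pt) # S206: optical axis direction
    visual_axis = apply_kappa(optical_axis)                              # S208: apply the angle difference
    return intersect_screen(cornea_center, visual_axis)                  # S210: gaze point on the screen
```

Taking the corneal curvature center as the origin of the gaze ray is an assumption consistent with Fig. 4, where the visual axis passes through the corneal curvature center.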
Brief description of the drawings
Fig. 1 is a diagram of the application environment of the gaze tracking method in one embodiment;
Fig. 2 is a schematic flowchart of the gaze tracking method in one embodiment;
Fig. 3A is a schematic diagram of the pupil being imaged through corneal refraction in one embodiment;
Fig. 3B is a schematic diagram of the light source being imaged through corneal reflection in one embodiment;
Fig. 4 is a schematic diagram of the optical axis and visual axis directions in one embodiment;
Fig. 5 is a schematic flowchart of the imaging point coordinate determining step in one embodiment;
Fig. 6 is a schematic flowchart of the step of determining the coordinates of the pupil center and the corneal curvature center in one embodiment;
Fig. 7 is a schematic diagram of the first horizontal direction angle and the first vertical direction angle of the optical axis direction in the three-dimensional coordinate system in one embodiment;
Fig. 8 is a schematic flowchart of the gaze tracking method in another embodiment;
Fig. 9 is a structural block diagram of the gaze tracking apparatus in one embodiment;
Fig. 10 is a structural block diagram of the gaze tracking apparatus in another embodiment;
Fig. 11 is a structural block diagram of the gaze tracking apparatus in yet another embodiment;
Fig. 12 is a schematic diagram of the internal structure of the computer device in one embodiment.
Detailed description
In order to make the objectives, technical solutions and advantages of the present invention clearer, the present invention is further described in detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described here are only intended to illustrate the present invention and are not intended to limit it.
Fig. 1 is a diagram of the application environment of the gaze tracking method in one embodiment. Referring to Fig. 1, the application environment includes a head-mounted display 110 and a computer device 120 communicating over a network. The head-mounted display (HMD) 110 may be a head-mounted display device capable of outputting and displaying virtual reality scenes. It can be understood that the head-mounted display is not limited to outputting virtual reality scenes; it may also be a device for observing external things, such as a medical head-mounted display device that can be used to observe organism tissue. The head-mounted display 110 may include an infrared light source generating device 110a, an eye photographing device 110b and a screen 110c. The infrared light source generating device 110a may be used to generate an infrared light source directed at the eye, the eye photographing device 110b may be used to photograph eye images, and the screen 110c may be used to output and display virtual reality scene pictures. The eye photographing device 110b may be a camera. The computer device 120 may be used to perform computations related to virtual reality. The computer device 120 may be a terminal; the terminal may be a desktop computer or a mobile terminal, and the mobile terminal may include at least one of a mobile phone, a tablet computer, a personal digital assistant, a wearable device, and the like.
The infrared light source generating device 110a can emit infrared light toward the eye, and the eye photographing device 110b can photograph an eye image including the pupil imaging point and the light-source imaging point obtained through corneal reflection of the light source. The eye photographing device can send the photographed eye image to the computer device 120. The computer device 120 obtains the eye image and determines, in the three-dimensional coordinate system of the screen 110c, the coordinates of the pupil imaging point in the eye image and the coordinates of the light-source imaging point formed in the eye image through corneal reflection of the light source. The computer device 120 determines the optical axis direction according to the coordinates of the pupil imaging point and the coordinates of the light-source imaging point, and determines the visual axis direction according to the optical-axis/visual-axis angle difference matched to the eye image and the optical axis direction. The computer device 120 determines, according to the visual axis direction, the gaze point position of the photographed eye on the screen 110c.
It should be noted that in other embodiments the infrared light source generating device and the eye photographing device may not be included in the head-mounted display; they may instead be devices independent of the head-mounted display that maintain a fixed positional relationship with it.
Fig. 2 is a schematic flowchart of the gaze tracking method in one embodiment. This embodiment is mainly illustrated by applying the gaze tracking method to a computer device, which may be the computer device 120 in Fig. 1. Referring to Fig. 2, the method specifically includes the following steps:
S202: obtain an eye image.
The eye image includes the pupil imaging point and the light-source imaging point formed through corneal reflection of the light source. The pupil is the small round hole at the center of the iris of the eye, the channel through which light enters the eye. The cornea is the transparent film on the anterior wall of the eyeball, equivalent to a meniscus lens; it bulges forward and is spherically curved.
The pupil imaging point is the image, in the photographed eye image, of the pupil refraction point formed by the pupil center after corneal refraction. The pupil center is the central point of the pupil region. The light-source imaging point is the image, in the photographed eye image, of the reflection point formed by the light source center after corneal reflection. The light source is the incident light directed at the eye; the light source center is the central point of the light source region. In one embodiment, the light source may be an infrared light source.
Fig. 3A is a schematic diagram of the pupil being imaged through corneal refraction in one embodiment. Fig. 3B is a schematic diagram of the light source being imaged through corneal reflection in one embodiment. Referring to Fig. 3A, the eyeball E of the eye entity includes the eyeball center d, the pupil center p, the cornea J and the corneal curvature center c. The corneal curvature center is the center of the sphere obtained by modeling the cornea as a sphere, and the eyeball center is the center of the sphere obtained by treating the eyeball as a sphere. The shooting point of the photographing device S is the point o. The pupil center p, after refraction by the cornea J, yields the pupil refraction point r, and the refracted point r is imaged as the pupil imaging point v in the eye image captured by the photographing device S. Referring to Fig. 3B, the eyeball E of the eye entity includes the eyeball center d, the corneal curvature center c, the pupil center p and the cornea J; the shooting point of the photographing device S is the point o, and the light source center is the point l. The light source center l, after reflection by the cornea J, yields the reflection point q, and the reflected point q is imaged as the light-source imaging point u in the eye image captured by the photographing device S.
S204: determine, in the three-dimensional coordinate system of the screen, the coordinates of the pupil imaging point in the eye image and the coordinates of the light-source imaging point formed in the eye image through corneal reflection of the light source.
The screen is the display screen in the head-mounted display. The three-dimensional coordinate system of the screen is a coordinate system including a horizontal axis (X axis), a vertical axis (Y axis) and a depth axis (Z axis). The X axis of the screen's three-dimensional coordinate system runs along the horizontal direction of the screen, the Y axis runs along the vertical direction of the screen, and the Z axis runs along the normal direction of the screen. In one embodiment, the origin of the screen's three-dimensional coordinate system may be the screen center.
Specifically, the computer device may identify the pupil imaging point and the light-source imaging point in the eye image and then determine, for each of them, the coordinates relative to the three-dimensional coordinate system of the screen, that is, the coordinates of the pupil imaging point and of the light-source imaging point in that coordinate system, taking the screen's three-dimensional coordinate system as the reference.
S206: determine the optical axis direction according to the coordinates of the pupil imaging point and the coordinates of the light-source imaging point.
The optical axis direction is the direction of the line connecting the pupil center and the corneal curvature center. The corneal curvature center is the center of the sphere obtained by modeling the cornea as a sphere. It should be noted that the eyeball center also lies on the optical axis, the line connecting the pupil center and the corneal curvature center. The eyeball center is the center of the sphere when the eyeball is treated as a sphere.
In one embodiment, step S206 includes: determining the coordinates of the pupil center and the coordinates of the corneal curvature center according to the coordinates of the pupil imaging point and the coordinates of the light-source imaging point; and determining, according to the coordinates of the pupil center and of the corneal curvature center, the optical axis direction along the line connecting the pupil center and the corneal curvature center.
It should be noted that the pupil imaging point is the image of the pupil center in the eye image, whereas the pupil center in "the coordinates of the pupil center" and the corneal curvature center in "the coordinates of the corneal curvature center" are the pupil center and the corneal curvature center of the eye entity matched to the eye image. The coordinates of the pupil center and of the corneal curvature center therefore refer, respectively, to the coordinates in the screen's three-dimensional coordinate system of the pupil center and of the corneal curvature center of the eye entity matched to the eye image.
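Once the coordinates of the pupil center and the corneal curvature center are known, the optical axis direction of S206 is the normalized vector along their connecting line. A minimal sketch, assuming NumPy and illustrative coordinate values:

```python
import numpy as np

def optical_axis_direction(pupil_center, cornea_center):
    """Unit vector along the line from the corneal curvature center
    through the pupil center: the optical axis direction of step S206."""
    p = np.asarray(pupil_center, dtype=float)
    c = np.asarray(cornea_center, dtype=float)
    d = p - c
    return d / np.linalg.norm(d)
```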
In one embodiment, since the pupil imaging point is the image of the pupil center in the eye image after corneal refraction, the computer device may obtain the corneal radius matched to the eye image and, combining the corneal radius, the refraction law formula and the camera imaging principle formula, determine the coordinates of the pupil center according to the coordinates of the pupil imaging point. The cornea matched to the eye image is the cornea of the eye entity corresponding to that eye image, namely the eye photographed to generate the eye image. In one embodiment, the corneal radius is the radius of the sphere obtained by modeling the cornea as a sphere; for example, when the cornea is modeled as a sphere, the distance from the corneal curvature center to the surface of the corneal sphere is the corneal radius.
Similarly, in one embodiment, since the light-source imaging point is the image of the light source center in the eye image after corneal reflection, the computer device may obtain the corneal radius matched to the eye image and, combining the corneal radius, the reflection law formula and the camera imaging principle formula, determine the coordinates of the corneal curvature center according to the coordinates of the light-source imaging point. The cornea matched to the eye image is the cornea of the eye entity corresponding to that eye image.
S208: determine the visual axis direction according to the optical-axis/visual-axis angle difference matched to the eye image and the optical axis direction.
The optical-axis/visual-axis angle difference is the angle difference between the optical axis direction and the visual axis direction. The visual axis direction is the direction of the line connecting the fovea on the retina and the corneal curvature center. The optical-axis/visual-axis angle difference includes a horizontal angle difference and a vertical angle difference: the horizontal angle difference is the angle difference between the optical axis direction and the visual axis direction along the X axis of the screen's three-dimensional coordinate system, and the vertical angle difference is the angle difference between them along the Y axis of the screen's three-dimensional coordinate system.
The optical-axis/visual-axis angle difference matched to the eye image is the angle difference present in the eye entity corresponding to that eye image. It can be understood that, for a normal eye, the existing angle difference between the optical axis and the visual axis is fixed; the influence on this angle difference of ocular deformation or other abnormalities of the eye is not considered here.
In one embodiment, the computer device may adjust the optical axis direction according to the optical-axis/visual-axis angle difference matched to the eye image to obtain the visual axis direction.
Fig. 4 is a schematic diagram of the optical axis and visual axis directions in one embodiment. Referring to Fig. 4, the direction corresponding to the line connecting the corneal curvature center c and the pupil center p in the eyeball E is the optical axis direction Op, and the eyeball center d lies on the optical axis direction Op. The direction corresponding to the line connecting the fovea f and the corneal curvature center c is the visual axis direction Ls. As shown in Fig. 4, there is an angle difference between the optical axis direction Op and the visual axis direction Ls.
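One way to realize the adjustment of S208 is to rotate the optical axis by the horizontal angle difference about the screen's vertical (Y) axis and by the vertical angle difference about its horizontal (X) axis. The decomposition into two elementary rotations, and their order, is an assumption; the patent only states that the offset has horizontal and vertical components:

```python
import numpy as np

def visual_axis_from_optical(optical_axis, d_horizontal, d_vertical):
    """Estimate the visual axis by rotating the optical axis by the per-eye
    horizontal/vertical angle differences (radians), as in step S208."""
    ch, sh = np.cos(d_horizontal), np.sin(d_horizontal)
    cv, sv = np.cos(d_vertical), np.sin(d_vertical)
    # rotation about the screen's vertical (Y) axis: horizontal angle difference
    ry = np.array([[ch, 0.0, sh],
                   [0.0, 1.0, 0.0],
                   [-sh, 0.0, ch]])
    # rotation about the screen's horizontal (X) axis: vertical angle difference
    rx = np.array([[1.0, 0.0, 0.0],
                   [0.0, cv, -sv],
                   [0.0, sv, cv]])
    v = rx @ ry @ np.asarray(optical_axis, dtype=float)
    return v / np.linalg.norm(v)
```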
S210: determine, according to the visual axis direction, the gaze point position on the screen.
The gaze point is the point corresponding to the line of sight of the eye entity corresponding to the eye image. The gaze point position on the screen is the position of the point on the screen at which the eye entity corresponding to the eye image is looking.
Specifically, the computer device may take the visual axis direction as the sight direction to determine the gaze point position on the screen. In one embodiment, the computer device may determine the intersection of the visual axis direction with the screen and take that intersection as the gaze point on the screen.
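With the screen's coordinate system placed on the screen itself (for example, with the origin at the screen center, as in the embodiment above), the intersection of S210 reduces to a ray-plane computation against the plane z = 0. A sketch under that assumption, taking the corneal curvature center as the ray origin:

```python
import numpy as np

def gaze_point_on_screen(cornea_center, visual_axis):
    """Intersect the gaze ray (origin: corneal curvature center, direction:
    visual axis) with the screen plane z = 0, yielding the on-screen gaze
    point of step S210."""
    c = np.asarray(cornea_center, dtype=float)
    v = np.asarray(visual_axis, dtype=float)
    if abs(v[2]) < 1e-12:
        raise ValueError("gaze ray is parallel to the screen plane")
    t = -c[2] / v[2]
    return c + t * v  # z component is 0: the gaze point on the screen
```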
In one embodiment, the gaze point position may be the gaze point coordinates in the three-dimensional coordinate system of the screen.
The above gaze tracking method converts the pupil imaging point and the light-source imaging point in the planar eye image into coordinates in the three-dimensional screen coordinate system. From the three-dimensional coordinates of the pupil imaging point and the light-source imaging point, the optical axis direction, the line connecting the pupil center and the corneal curvature center of the real eye, is determined, and the visual axis direction is determined according to the angle difference between the optical axis and the visual axis; the visual axis direction represents the sight direction more accurately. In turn, the gaze point position of the eye on the screen determined according to the visual axis direction is more accurate.
As shown in Fig. 5, in one embodiment, step S204 (the imaging point coordinate determining step for short) specifically includes the following steps:
S502: determine, in the eye image, the pixel position of the pupil imaging point and the pixel position of the light-source imaging point formed through corneal reflection of the light source.
Specifically, the computer device may perform image recognition processing on the eye image to identify the pupil region in the eye image, and determine the central point of the identified pupil region to obtain the pupil imaging point. The computer device may likewise identify the light source region in the eye image, and determine the central point of the identified light source region to obtain the light-source imaging point.
In one embodiment, the computer device may determine, in the image coordinate system corresponding to the eye image, the pixel position corresponding to the pupil imaging point and the pixel position corresponding to the light-source imaging point. The image coordinate system corresponding to the eye image is a two-dimensional coordinate system built on the eye image, used to determine the pixel position of each pixel forming the eye image.
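As a toy illustration of S502, the pupil region can be approximated as the darkest blob in a grayscale eye image and its centroid taken as the pixel position of the pupil imaging point. Real systems use far more robust detection; the fixed threshold here is an arbitrary assumption:

```python
import numpy as np

def dark_region_centroid(gray, threshold=50):
    """Treat pixels darker than the threshold as the pupil region and
    return the (x, y) pixel position of its centroid, or None if no
    pixel qualifies. A deliberately minimal stand-in for the image
    recognition processing of step S502."""
    ys, xs = np.nonzero(np.asarray(gray) < threshold)
    if xs.size == 0:
        return None
    return float(xs.mean()), float(ys.mean())
```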
S504: map the pixel position of the pupil imaging point to the coordinates of the pupil imaging point in the three-dimensional coordinate system, according to the mapping relationship between pixel positions in the eye image and coordinates in the screen's three-dimensional coordinate system.
S506: map the pixel position of the light-source imaging point to the coordinates of the light-source imaging point in the three-dimensional coordinate system, according to the mapping relationship.
The computer device may obtain a preset mapping relationship between pixel positions in the eye image and coordinates in the screen's three-dimensional coordinate system. It can be understood that this preset mapping relationship may be a direct mapping relationship (directly mapping pixel positions in the eye image to coordinates in the screen's three-dimensional coordinate system). It may also be an indirect mapping relationship. In one embodiment, the preset mapping relationship may be a composite mapping relationship formed from the mapping between pixel positions in the eye image and coordinates in the camera coordinate system, and the mapping between coordinates in the camera coordinate system and coordinates in the screen's three-dimensional coordinate system.
According to the mapping relationship between pixel positions in the eye image and coordinates in the screen's three-dimensional coordinate system, the computer device may map the pixel position of the pupil imaging point to the coordinates of the pupil imaging point in the three-dimensional coordinate system and, according to the same mapping relationship, map the pixel position of the light-source imaging point to the coordinates of the light-source imaging point in the three-dimensional coordinate system.
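The indirect (composite) mapping described above might be sketched as a pinhole back-projection into camera coordinates followed by a rigid camera-to-screen transform. The parameter names and the pinhole model itself are assumptions; the patent only states that the two mappings are composed:

```python
import numpy as np

def pixel_to_screen_coords(pixel, k_inv, r_cam_to_screen, t_cam_to_screen, depth):
    """Two-stage mapping of S504/S506: pixel -> camera coordinates
    (inverse intrinsic matrix k_inv at an assumed depth) -> screen
    coordinates (rotation r_cam_to_screen and translation t_cam_to_screen)."""
    u, v = pixel
    p_cam = depth * (k_inv @ np.array([u, v, 1.0]))   # back-project the pixel
    return r_cam_to_screen @ p_cam + t_cam_to_screen  # change to the screen frame
```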
In the above embodiment, the pixel positions of the pupil imaging point and the light-source imaging point in the planar eye image are identified and, according to the mapping relationship between pixel positions and coordinates in the screen coordinate system, converted into coordinates in the three-dimensional screen coordinate system. The optical axis direction of the real eye, along the line connecting the pupil center and the corneal curvature center, can then be accurately determined from the three-dimensional coordinates of the pupil imaging point and the light-source imaging point, guaranteeing the accuracy of the visual axis direction determined from the angle difference between the optical axis and the visual axis, so that the gaze point position of the eye on the screen determined according to the visual axis direction is more accurate.
As shown in Fig. 6, in one embodiment, determining the coordinates of the pupil center and the coordinates of the corneal curvature center according to the coordinates of the pupil imaging point and the coordinates of the light-source imaging point (the pupil center and corneal curvature center coordinate determining step for short) specifically includes the following steps:
S602: obtain, respectively, the light source coordinates of the light source and the shooting point coordinates of the shooting point of the eye image in the three-dimensional coordinate system of the screen.
It can be understood that the computer device may obtain the light source coordinates of the light source in the screen's three-dimensional coordinate system, and obtain the shooting point coordinates of the shooting point of the eye image in the screen's three-dimensional coordinate system.
The light source coordinates are the coordinates of the light source center in the screen's three-dimensional coordinate system.
Specifically, the computer device may directly obtain preset light source coordinates. The computer device may also detect the position of the light source center and, according to the preset mapping relationship between the light source position and coordinates in the screen's three-dimensional coordinate system, map the detected position of the light source center to the light source coordinates in the screen's three-dimensional coordinate system.
The shooting point is the position from which the eye image is photographed. The shooting point coordinates are the coordinates of the shooting point in the screen's three-dimensional coordinate system.
In one embodiment, the shooting point coordinates in the screen's three-dimensional coordinate system may be set in advance, and the computer device may obtain the preset shooting point coordinates. In another embodiment, the mapping relationship between the position of the shooting point and coordinates in the screen's three-dimensional coordinate system may be preset in the computer device; the computer device may detect the position of the shooting point and, according to the preset mapping relationship, map the detected position of the shooting point to the shooting point coordinates in the screen's three-dimensional coordinate system.
S604: determine the coordinates of the pupil refraction point according to the shooting point coordinates and the coordinates of the pupil imaging point.
It can be understood that, since the pupil imaging point is the image, in the eye image, of the pupil refraction point formed by the pupil center after corneal refraction, the computer device can determine the coordinates of the pupil refraction point from the shooting point coordinates and the coordinates of the pupil imaging point according to the camera imaging principle.
Specifically, the computer device may determine the first difference between the shooting point coordinates and the coordinates of the pupil imaging point, and determine the coordinates of the pupil refraction point according to this first difference, the refractive imaging conversion parameter relating the first difference to the second difference between the coordinates of the pupil refraction point and the shooting point coordinates, and the shooting point coordinates. The refractive imaging conversion parameter may be known or unknown. It can be understood that when the refractive imaging conversion parameter is an unknown parameter, the determined coordinates of the pupil refraction point are expressed with the refractive imaging conversion parameter as an unknown.
In one embodiment, the computer device can determine the coordinates of the pupil refraction point according to the following formula:

r − o = k_r (o − v);

wherein r is the coordinate of the pupil refraction point; o is the shooting point coordinate; v is the coordinate of the pupil image point; o − v is the first difference; r − o is the second difference; and k_r is the refraction-imaging conversion parameter.
S606: Determine the coordinates of the light source reflection point according to the shooting point coordinates and the coordinates of the light source imaging point.

It can be understood that, since the light source imaging point is the imaging in the eye image of the light source reflection point (the point at which the light source center appears after reflection by the cornea), the computer device can determine the coordinates of the light source reflection point from the shooting point coordinates and the coordinates of the light source imaging point according to the camera imaging principle.

Specifically, the computer device can determine a third difference between the shooting point coordinates and the coordinates of the light source imaging point, and then determine the coordinates of the light source reflection point according to this third difference, a reflection-imaging conversion parameter relating the third difference to a fourth difference (the difference between the coordinates of the light source reflection point and the shooting point coordinates), and the shooting point coordinates. The reflection-imaging conversion parameter may be known or unknown. It can be understood that when the reflection-imaging conversion parameter is an unknown parameter, the determined coordinates of the light source reflection point are expressed in terms of that unknown parameter.
In one embodiment, the computer device can determine the coordinates of the light source reflection point according to the following formula:

q − o = k_q (o − u);

wherein q is the coordinate of the light source reflection point; o is the shooting point coordinate; u is the coordinate of the light source imaging point; o − u is the third difference; q − o is the fourth difference; and k_q is the reflection-imaging conversion parameter.
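The two back-projection relations above, r − o = k_r (o − v) and q − o = k_q (o − u), share the same form and can be sketched as follows. All numeric values here are made-up illustrations, not values from this description; in practice k_r and k_q may remain unknown symbols:

```python
def back_project(o, image_point, k):
    """Return o + k * (o - image_point), component-wise: the point whose
    image is image_point, under the linear back-projection relation."""
    return tuple(oi + k * (oi - pi) for oi, pi in zip(o, image_point))

o = (0.0, 0.0, 50.0)    # shooting point (camera) coordinate, illustrative
v = (1.2, -0.8, 49.0)   # pupil image point coordinate, illustrative
u = (1.0, -0.5, 49.2)   # light source imaging point coordinate, illustrative
k_r, k_q = 30.0, 28.0   # assumed values for the conversion parameters

r = back_project(o, v, k_r)  # pupil refraction point
q = back_project(o, u, k_q)  # light source reflection point
print([round(x, 6) for x in r])  # [-36.0, 24.0, 80.0]
print([round(x, 6) for x in q])  # [-28.0, 14.0, 72.4]
```

The same helper serves both S604 and S606 because both formulas scale the difference between the shooting point and an image point.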
It should be noted that there may be one or more light sources. When there are multiple light sources, each light source has a corresponding light source reflection point, and each reflection point can be determined separately according to the above formula.
S608: Obtain the pupil-center-to-corneal-curvature-center distance and the corneal radius matched with the eye image.

It can be understood that, for a normal eye, the distance from the pupil center to the corneal curvature center is essentially fixed, while this distance may differ between different eyes. The corneal radius is the radius of the sphere used when modeling the cornea as a sphere. The distance and corneal radius matched with the eye image are the pupil-center-to-corneal-curvature-center distance and the corneal radius of the physical eye to which the eye image corresponds.
S610: Determine the coordinates of the pupil center and the coordinates of the corneal curvature center according to the pupil-center-to-corneal-curvature-center distance, the corneal radius, the coordinates of the light source reflection point, the coordinates of the pupil refraction point, the shooting point coordinates, and the light source coordinates.

In one embodiment, the computer device can determine the coordinates of the pupil center and the coordinates of the corneal curvature center according to the following system of equations:
(r − o) × (c − o) · (p − o) = 0;

n_1 ||(r − c) × (p − r)|| · ||o − r|| = n_2 ||(r − c) × (o − r)|| · ||p − r||;

||r − c|| = R;

(l − o) × (q − o) · (c − o) = 0;

(l − q) · (q − c) · ||o − q|| = (o − q) · (q − c) · ||l − q||;

||q − c|| = R;

||p − c|| = K;

q − o = k_q (o − u);

r − o = k_r (o − v);

wherein r is the coordinate of the pupil refraction point; o is the shooting point coordinate; c is the coordinate of the corneal curvature center; p is the coordinate of the pupil center; n_1 is the refractive index of the cornea; n_2 is the refractive index of air; R is the corneal radius; l is the light source coordinate; q is the coordinate of the light source reflection point; K is the distance from the pupil center to the corneal curvature center; k_q is the reflection-imaging conversion parameter; u is the coordinate of the light source imaging point; k_r is the refraction-imaging conversion parameter; and v is the coordinate of the pupil image point.
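As a sanity check rather than a solver, the following sketch evaluates two of the residuals above, the coplanarity constraint (l − o) × (q − o) · (c − o) = 0 and the sphere constraint ||q − c|| = R, on a contrived configuration chosen to satisfy them. All coordinates are made-up illustrations; in practice c and p are the unknowns being solved for:

```python
import math

def sub(a, b): return tuple(x - y for x, y in zip(a, b))
def dot(a, b): return sum(x * y for x, y in zip(a, b))
def cross(a, b):
    return (a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0])
def norm(a): return math.sqrt(dot(a, a))

o = (0.0, 0.0, 0.0)      # shooting point (illustrative)
l = (10.0, 0.0, 0.0)     # light source (illustrative)
c = (3.0, 1.0, 0.0)      # corneal curvature center, in the plane z = 0 with o and l
R = 0.78                 # corneal radius (illustrative)
q = (3.0, 1.0 + R, 0.0)  # reflection point placed on the corneal sphere, same plane

# (l - o) x (q - o) . (c - o) = 0 : o, l, q and c are coplanar
coplanarity = dot(cross(sub(l, o), sub(q, o)), sub(c, o))
# ||q - c|| = R : the reflection point lies on the corneal sphere
sphere_residual = norm(sub(q, c)) - R

print(abs(coplanarity) < 1e-9, abs(sphere_residual) < 1e-9)  # True True
```

A numerical solver for the full system would drive all nine residuals to zero simultaneously over the unknowns.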
It should be noted that there may be one or more light sources. When there are multiple light sources, each light source has a corresponding light source reflection point, and the light source coordinates of each light source, together with the corresponding reflection point, can each be substituted into the corresponding formulas above for calculation. In one embodiment, there are two light sources.

In the above embodiment, the inherent parameters of the eye (namely the pupil-center-to-corneal-curvature-center distance and the corneal radius) are combined with the coordinates of the light source reflection point and of the pupil refraction point, obtained according to the principles of light reflection and refraction, together with the shooting point coordinates and the light source coordinates, so that the coordinates of the pupil center and the coordinates of the corneal curvature center can be determined accurately.
In one embodiment, determining the optical axis direction of the line connecting the pupil center and the corneal curvature center according to their coordinates includes: subtracting the coordinates of the corneal curvature center from the coordinates of the pupil center to obtain the vector from the corneal curvature center to the pupil center; and taking the ratio of this vector to its norm to obtain a unit vector representing the optical axis direction of the line connecting the pupil center and the corneal curvature center.

It can be understood that the coordinates of the pupil center may be regarded as the vector from the origin of the three-dimensional coordinate system of the screen to the pupil center, and the coordinates of the corneal curvature center as the vector from that origin to the corneal curvature center. Therefore, subtracting the coordinates of the corneal curvature center from the coordinates of the pupil center yields the vector between the two centers.

A unit vector is a vector with norm equal to 1 and a direction; it can be understood that a unit vector is used only to indicate a direction.

In the above embodiment, the optical axis direction can be determined directly by a simple computation on the coordinates of the pupil center and the corneal curvature center, without performing, in three-dimensional space, linkage processing on the pupil center and corneal curvature center that consumes system processing resources. This greatly improves the efficiency of determining the optical axis direction while saving processing resources.
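The subtraction-and-normalization step above can be sketched as follows; the coordinates are illustrative assumptions:

```python
import math

def optical_axis_unit_vector(p, c):
    """Subtract the corneal curvature center c from the pupil center p,
    then divide by the norm to get the optical axis direction."""
    v = tuple(pi - ci for pi, ci in zip(p, c))  # p - c
    m = math.sqrt(sum(x * x for x in v))        # ||p - c||
    return tuple(x / m for x in v)

p = (1.0, 2.0, 2.0)  # pupil center (illustrative)
c = (0.0, 0.0, 0.0)  # corneal curvature center (illustrative)
w = optical_axis_unit_vector(p, c)
print(w)  # (0.333..., 0.666..., 0.666...): unit vector along p - c
```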
In one embodiment, step S208 includes: obtaining the angle difference between the optical axis direction and the visual axis direction (the optic-visual axis angle difference) matched with the eye image; obtaining, according to the unit vector representing the optical axis direction, a first horizontal direction angle and a first vertical direction angle of the optical axis direction in the three-dimensional coordinate system; obtaining, according to the first horizontal direction angle, the first vertical direction angle, and the optic-visual axis angle difference, a second horizontal direction angle and a second vertical direction angle of the visual axis direction in the three-dimensional coordinate system; and obtaining a unit vector representing the visual axis direction according to the second horizontal direction angle and the second vertical direction angle.
In one embodiment, the computer device can obtain the first horizontal direction angle and the first vertical direction angle from the relation between the optical axis unit vector and the two angles (stated here under one common convention):

(p − c)/K = (cos φ sin θ, sin φ, −cos φ cos θ);

wherein p is the coordinate of the pupil center; c is the coordinate of the corneal curvature center; p − c is the vector from the corneal curvature center to the pupil center; (p − c)/K is the unit vector representing the optical axis direction; θ is the first horizontal direction angle; and φ is the first vertical direction angle.
Fig. 7 is a schematic diagram of the first horizontal direction angle and the first vertical direction angle of the optical axis direction in the three-dimensional coordinate system in one embodiment. It should be noted that, to represent these angles more clearly, the three-dimensional coordinate system of the screen is translated so that its origin coincides with the eyeball center d, giving the three-dimensional coordinate system shown in Fig. 7: the X_d axis in Fig. 7 is parallel to the X axis of the screen's three-dimensional coordinate system, the Y_d axis is parallel to its Y axis, and the Z_d axis is parallel to its Z axis. It can be understood that the first horizontal direction angle and the first vertical direction angle represented in the translated coordinate system of Fig. 7 are equivalent in magnitude and direction to those of the optical axis direction in the original three-dimensional coordinate system. Referring to Fig. 7, c is the corneal curvature center and p is the pupil center; the direction of the line from point c to point p represents the optical axis direction; θ is the first horizontal direction angle; φ is the first vertical direction angle; and the distance between points c and p is K.
In one embodiment, the optic-visual axis angle difference includes a horizontal direction angle difference and a vertical direction angle difference. Obtaining the unit vector representing the visual axis direction according to the second horizontal direction angle and the second vertical direction angle includes:

The computer device can obtain the unit vector representing the visual axis direction in the following way (under the convention that a unit vector with horizontal angle θ and vertical angle φ is (cos φ sin θ, sin φ, −cos φ cos θ)):

w = (cos(φ + β) sin(θ + α), sin(φ + β), −cos(φ + β) cos(θ + α));

wherein θ is the first horizontal direction angle; φ is the first vertical direction angle; α is the horizontal direction angle difference; β is the vertical direction angle difference; θ + α is the second horizontal direction angle; φ + β is the second vertical direction angle; and w is the unit vector representing the visual axis direction.
In the above embodiment, the first horizontal direction angle and the first vertical direction angle of the optical axis direction in the three-dimensional coordinate system are obtained from the unit vector representing the optical axis direction; the second horizontal direction angle and the second vertical direction angle of the visual axis direction are obtained from these first angles and the optic-visual axis angle difference; and the unit vector representing the visual axis direction is obtained from the second angles. The visual axis direction can thus be determined by simple computation directly on the unit vector representing the optical axis direction, without performing a rotation adjustment of the optical axis direction in three-dimensional space that consumes processing resources. This greatly improves the efficiency of determining the visual axis direction while saving processing resources.
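The angle decomposition and recombination can be sketched as follows, under the assumed convention that a unit vector with horizontal angle theta and vertical angle phi is (cos(phi)sin(theta), sin(phi), -cos(phi)cos(theta)); the offsets alpha and beta are made-up values:

```python
import math

def to_angles(w):
    """Recover (theta, phi) from a unit vector under the assumed convention
    w = (cos(phi)sin(theta), sin(phi), -cos(phi)cos(theta))."""
    return math.atan2(w[0], -w[2]), math.asin(w[1])

def from_angles(theta, phi):
    return (math.cos(phi) * math.sin(theta),
            math.sin(phi),
            -math.cos(phi) * math.cos(theta))

w_optical = from_angles(math.radians(10.0), math.radians(5.0))  # optical axis
theta, phi = to_angles(w_optical)      # first horizontal / vertical angles

alpha = math.radians(-5.0)  # horizontal direction angle difference (assumed)
beta = math.radians(1.5)    # vertical direction angle difference (assumed)
w_visual = from_angles(theta + alpha, phi + beta)  # visual axis unit vector

t2, p2 = to_angles(w_visual)  # second horizontal / vertical angles
print(round(math.degrees(t2), 6), round(math.degrees(p2), 6))  # 5.0 6.5
```

The round trip shows that applying the two angle differences is the only operation needed; no three-dimensional rotation machinery is involved.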
In one embodiment, step S210 includes: obtaining the coordinate value of the unit vector representing the visual axis direction on the z coordinate axis (the axis of the screen's coordinate system perpendicular to the screen plane); obtaining a gaze linear parameter according to the ratio of the z-coordinate of the corneal curvature center to the opposite number of the obtained coordinate value; and performing a linear transformation according to the unit vector representing the visual axis direction, the gaze linear parameter, and the coordinates of the corneal curvature center, to obtain the fixation point position on the screen.

Wherein, the gaze linear parameter is the parameter used in the linear transformation for determining the fixation point position.

It can be understood that the vector elements of the unit vector representing the visual axis direction correspond respectively to the x, y, and z coordinates in the three-dimensional coordinate system of the screen. For example, if the unit vector is written as w = (w_x, w_y, w_z), then w_x corresponds to the x axis, w_y to the y axis, and w_z to the z axis of the screen's coordinate system. Therefore, the computer device can obtain the element of the unit vector corresponding to the z axis, that is, the coordinate value of the unit vector on the z coordinate axis.

Specifically, the computer device can obtain the coordinate value of the unit vector representing the visual axis direction on the z coordinate axis, determine the opposite number of this coordinate value, and obtain the gaze linear parameter according to the ratio of the z-coordinate of the corneal curvature center to that opposite number.
In one embodiment, the computer device can obtain the gaze linear parameter according to the following formula:

k_g = c_z / (−w_z);

wherein k_g is the gaze linear parameter; c_z is the z-coordinate of the corneal curvature center; and −w_z is the opposite number of the coordinate value, on the z coordinate axis, of the unit vector w representing the visual axis direction.
In one embodiment, performing a linear transformation according to the unit vector representing the visual axis direction, the gaze linear parameter, and the coordinates of the corneal curvature center to obtain the fixation point position on the screen includes:

obtaining the fixation point position on the screen according to the following formula:

g = c + k_g · w;

wherein g is the fixation point coordinate; c is the coordinate of the corneal curvature center; k_g is the gaze linear parameter; and w is the unit vector representing the visual axis direction.
In the above embodiment, the fixation point position on the screen is obtained by simple computation on the unit vector representing the visual axis direction, which greatly improves the efficiency of determining the fixation point position while saving processing resources.
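Under the assumption that the screen lies in the z = 0 plane of its coordinate system (which is what makes k_g = c_z / (−w_z) place g on the screen), the last two formulas can be sketched as follows; the coordinates are illustrative:

```python
def gaze_point(c, w):
    """g = c + k_g * w, with k_g = c_z / (-w_z) so that g_z = 0."""
    k_g = c[2] / (-w[2])
    return tuple(ci + k_g * wi for ci, wi in zip(c, w))

c = (20.0, -10.0, 600.0)   # corneal curvature center, illustrative (e.g. mm)
w = (0.05, 0.02, -0.9986)  # visual axis unit vector pointing toward the screen
g = gaze_point(c, w)
print(abs(g[2]) < 1e-9)  # True: the fixation point lies in the screen plane
```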
In one embodiment, the method further includes: obtaining a calibration eye image captured when the fixation point position on the screen is at a preset calibration point; obtaining, respectively, the calibration coordinates of the pupil image point and of the light source imaging point in the calibration eye image; generating, according to these calibration coordinates, a coordinate prediction value of the preset calibration point expressed with the eyeball parameters as unknown parameters; obtaining a prediction error function according to the coordinate prediction value and the preset coordinates of the preset calibration point; and determining the eyeball parameters that minimize the prediction error function.

Specifically, before determining the visual axis direction according to the optical axis direction and the optic-visual axis angle difference matched with the eye image, the computer device also needs to calibrate the eyeball parameters. Wherein, the eyeball parameters characterize attribute information of the eye, and include at least one of the optic-visual axis angle difference, the corneal radius, the pupil-center-to-corneal-curvature-center distance, and the like.

In one embodiment, calibration points are preset in the computer device, where the coordinates of each calibration point in the three-dimensional coordinate system of the screen are known preset coordinates. The computer device can display a preset calibration point on the screen. The shooting device can capture a calibration eye image while the fixation point position on the screen is at the preset calibration point, and send the calibration eye image to the computer device. It can be understood that when the user stares at the preset calibration point on the screen, the fixation point position on the screen is at that preset calibration point. There may be one or more preset calibration points; in one embodiment, there are 9 preset calibration points. The computer device may obtain one or more calibration eye images; in one embodiment, it obtains 100 calibration eye images.

The computer device can obtain, respectively, the calibration coordinates of the pupil image point and of the light source imaging point in the calibration eye image, both in the three-dimensional coordinate system of the screen, and generate from them a coordinate prediction value of the preset calibration point expressed with the eyeball parameters as unknown parameters. It can be understood that the computer device predicts, from the calibration coordinates of the pupil image point and of the light source imaging point, with the eyeball parameters (for example, the optic-visual axis angle difference) treated as unknown parameters, the coordinates of the preset calibration point corresponding to the calibration eye image; the resulting coordinate prediction value is therefore expressed in terms of the unknown eyeball parameters.

Further, the computer device can obtain a prediction error function according to the coordinate prediction values and the preset coordinates of the preset calibration points, and determine the eyeball parameters that minimize the prediction error function. The eyeball parameters so determined are the eyeball parameters used when determining the fixation point position from an eye image.
In one embodiment, the prediction error function may take the form E = Σ_{i=1}^{n} ||ĝ_i − g_i||², wherein n indicates the number of preset calibration points; ĝ_i indicates the coordinate prediction value of the i-th preset calibration point, expressed with the eyeball parameters as unknowns; and g_i indicates the known preset coordinates of the i-th preset calibration point.
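A toy sketch of the minimization idea: a single made-up scalar stands in for the eyeball parameters, the prediction model is deliberately simplified, and the minimizer of E = Σ_i ||ĝ_i − g_i||² is found by coarse grid search (a real implementation would optimize the full parameter set numerically):

```python
def predict(raw, alpha):
    # hypothetical prediction model: shift the raw gaze estimate horizontally
    return (raw[0] + alpha, raw[1])

raw_estimates = [(98.0, 50.0), (198.0, 50.0), (298.0, 150.0)]   # from eye images
known_points  = [(100.0, 50.0), (200.0, 50.0), (300.0, 150.0)]  # preset calibration points

def error(alpha):
    """E = sum of squared distances between predictions and known points."""
    return sum((px - gx) ** 2 + (py - gy) ** 2
               for (px, py), (gx, gy) in
               [(predict(raw, alpha), g) for raw, g in zip(raw_estimates, known_points)])

# coarse grid search for the minimizing parameter value
best_error, best_alpha = min((error(a / 10.0), a / 10.0) for a in range(-100, 101))
print(best_alpha)  # 2.0
```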
In one embodiment, before step S208, the method further includes: obtaining a calibration eye image captured when the fixation point position on the screen is at a preset calibration point; obtaining, respectively, the calibration coordinates of the pupil image point and of the light source imaging point in the calibration eye image; generating, according to these calibration coordinates, a coordinate prediction value of the preset calibration point expressed with the optic-visual axis angle difference as an unknown parameter; obtaining a prediction error function according to the coordinate prediction value and the preset coordinates of the preset calibration point; and determining the optic-visual axis angle difference that minimizes the prediction error function.

In one embodiment, generating the coordinate prediction value of the preset calibration point with the optic-visual axis angle difference as an unknown parameter includes: determining a calibration optical axis direction according to the calibration coordinates of the pupil image point and of the light source imaging point; determining a calibration visual axis direction by combining the calibration optical axis direction with the optic-visual axis angle difference among the eyeball parameters, treated as an unknown parameter; and predicting, according to the calibration visual axis direction, the on-screen coordinates of the preset calibration point corresponding to the calibration eye image, thereby obtaining a coordinate prediction value expressed with the optic-visual axis angle difference as an unknown parameter.

Further, the computer device can obtain a prediction error function according to the coordinate prediction value and the preset coordinates of the preset calibration point, and determine the optic-visual axis angle difference that minimizes the prediction error function. The optic-visual axis angle difference so determined is then the one used in the processing of step S208.

In this embodiment, the eyeball parameters are obtained through a calibration process, so that the eyeball parameters participating in the fixation point calculation are more accurate, thereby improving the accuracy of the determined fixation point position.
In one embodiment, the eye image is an eye image of both eyes, and the fixation point position is the fixation point position of both eyes.

It can be understood that an eye image of both eyes may be a single eye image containing both eyes, or a separate eye image for each eye (each image containing only one eye, with the computer device obtaining the eye image of each eye separately); this is not limited here. It should be noted that, in each embodiment of this application, the processing that determines a fixation point position on the screen from an eye image is a processing procedure for one eye. If the respective fixation point positions of both eyes need to be determined, the corresponding processing procedure can be executed for each eye, obtaining the fixation point positions of both eyes.
In this embodiment, the method further includes: performing parallax conversion on the fixation point positions of both eyes to obtain the single target point position in the virtual reality scene corresponding to both fixation point positions; and executing interaction processing in the virtual reality scene according to the target point position.

Wherein, the virtual reality scene is a three-dimensional virtual scene.

It can be understood that the eyes perceive a scene as three-dimensional because of the parallax between them; to realize the three-dimensional effect of a virtual reality scene, the scene is generated according to the principle by which binocular parallax produces stereoscopic vision. The computer device can therefore perform parallax conversion on the fixation point positions of both eyes to obtain the single target point position in the virtual reality scene corresponding to them. That is, from the respective on-screen fixation point positions of the two eyes, the computer device obtains one target point position in the three-dimensional virtual reality scene. It should be noted that the target point position is a position in the virtual reality scene, not a coordinate position in the three-dimensional coordinate system of the screen.

In one embodiment, the computer device can obtain the parameters of the virtual camera passed by the virtual reality application into the rendering engine, and determine, according to these virtual camera parameters, the target point position in the virtual reality scene corresponding to the fixation point positions of both eyes.
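One way to realize the parallax conversion (an assumption for illustration; this description does not fix the method) is to cast a ray from each eye through its on-screen fixation point and take the midpoint of the closest approach of the two rays as the target point. The eye positions, the z = 0 screen plane, and all coordinates below are made-up:

```python
def sub(a, b): return tuple(x - y for x, y in zip(a, b))
def add(a, b): return tuple(x + y for x, y in zip(a, b))
def scale(a, s): return tuple(x * s for x in a)
def dot(a, b): return sum(x * y for x, y in zip(a, b))

def target_point(eye_l, gaze_l, eye_r, gaze_r):
    """Midpoint of the closest approach of the two eye->gaze rays
    (standard closest-point-between-two-lines formula)."""
    d1, d2 = sub(gaze_l, eye_l), sub(gaze_r, eye_r)
    w0 = sub(eye_l, eye_r)
    a, b, c = dot(d1, d1), dot(d1, d2), dot(d2, d2)
    d, e = dot(d1, w0), dot(d2, w0)
    denom = a * c - b * b
    s = (b * e - c * d) / denom
    t = (a * e - b * d) / denom
    p1 = add(eye_l, scale(d1, s))
    p2 = add(eye_r, scale(d2, t))
    return scale(add(p1, p2), 0.5)

# Eyes 60 mm apart at z = 600, looking at a virtual target behind the
# z = 0 screen plane; the on-screen fixation points below were computed
# from that target, so the rays should meet there again.
eye_l, eye_r = (-30.0, 0.0, 600.0), (30.0, 0.0, 600.0)
gaze_l, gaze_r = (-30.0 / 7, 0.0, 0.0), (30.0 / 7, 0.0, 0.0)
print(target_point(eye_l, gaze_l, eye_r, gaze_r))  # ~ (0.0, 0.0, -100.0)
```

Taking the midpoint also handles the realistic case where the two gaze rays are skew and do not intersect exactly.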
In one embodiment, executing interaction processing in the virtual reality scene according to the target point position includes: determining, in the virtual reality scene, the virtual object or menu item corresponding to the target point position; selecting the virtual object or menu item; and performing interaction processing on the selected virtual object or menu item.

Wherein, a virtual object is an object in the virtual reality scene and a component of that scene. A virtual object can be any type of object in the scene, for example at least one of a virtual item, a virtual building, a virtual character, a virtual animal, and the like. A menu item is an option in a menu presented in the virtual reality scene. The virtual reality scene can be a virtual reality game scene, a virtual reality social scene, a virtual reality design scene, or the like.

A virtual reality game scene is a scene that presents a game through virtual reality technology. In a virtual reality game scene, a virtual object can be a virtual reality game character, a virtual reality item, a virtual reality building, or the like, and a menu item can be an option for triggering a virtual reality game operation, such as a menu item that triggers an attack operation.

A virtual reality social scene is a scene that presents social interaction through virtual reality technology. In a virtual reality social scene, a virtual object can be a virtual character engaged in social interaction, and a menu item can be an option for triggering a social operation, such as a menu item that triggers a real-time call operation.

A virtual reality design scene is a scene that presents design work through virtual reality technology. In a virtual reality design scene, a virtual object can be a virtual design object, for example virtual clothing, virtual furniture, virtual home decoration, or another virtual design object, and a menu item can be an option for triggering a design operation, such as a menu item that triggers moving a design object.
In one embodiment, the computer device can obtain the gaze duration corresponding to the target point position; when the gaze duration is greater than or equal to a preset gaze duration threshold, it can select the virtual object or menu item corresponding to that target point position. It can be understood that the gaze duration corresponding to the target point position is the duration for which both eyes stare at the fixation point positions corresponding to that target point position.

For example, if both eyes stare at a fixation point position for 5 seconds, then the gaze duration corresponding to the target point position for that fixation point position is 5 seconds. Assuming the preset gaze duration threshold is 4 seconds, the gaze duration of 5 seconds exceeds the 4-second threshold, so the virtual object or menu item corresponding to the target point position can be selected.
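A minimal sketch of this dwell-time selection, assuming gaze samples arrive as (timestamp_seconds, target_id) pairs; the 4-second threshold and the 5-second stare follow the example above, while the sample stream format is a made-up assumption:

```python
DWELL_THRESHOLD = 4.0  # seconds, the threshold from the example above

def select_by_dwell(samples, threshold=DWELL_THRESHOLD):
    """Return the target id once it has been gazed at continuously for at
    least `threshold` seconds, else None."""
    current, start_time = None, None
    for t, target in samples:
        if target != current:
            current, start_time = target, t  # gaze moved: restart the timer
        elif t - start_time >= threshold:
            return current
    return None

samples = [(0.0, "A"), (1.0, "A"), (3.0, "A"), (5.0, "A")]  # 5 s on target A
print(select_by_dwell(samples))  # A
```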
In one embodiment, the computer device can also monitor the blinking of both eyes while the target point position remains unchanged; when the blinking satisfies a preset blink selection condition, it selects the virtual object or menu item corresponding to the target point position.

Wherein, the blink selection condition is the condition under which the virtual object or menu item corresponding to the target point position is selected by blinking; when the blinking of both eyes satisfies this condition, the corresponding virtual object or menu item can be selected. The blink selection condition can be blinking consecutively a preset number of times. It can be understood that the blink selection condition can be a condition set on the blinking of one of the two eyes, for example the left eye blinking 3 times consecutively, or a condition set on the blinking of both eyes; this is not limited here.

For example, if the target point position corresponding to the fixation point positions of both eyes is point A, then while target point position A remains unchanged, blinking both eyes twice consecutively selects the virtual object or menu item corresponding to point A.
In one embodiment, the computer device can also monitor the rotation of the eyeballs while the target point position remains unchanged; when the rotation satisfies a preset rotation selection condition, it selects the virtual object or menu item corresponding to the target point position.

Wherein, the rotation selection condition is the condition under which the virtual object or menu item corresponding to the target point position is selected by rotating the eyeball; when the rotation of the eyeball satisfies this condition, the corresponding virtual object or menu item can be selected.

The rotation selection condition can be a rotation of the eyeball, relative to the visual axis direction toward the fixation point position, in any single direction (such as left, right, up, or down) or in a combination of several such directions. For example, rotating upward from the right is one such combined rotation.

For example, if the eyeball rotates to the right, the virtual object or menu item corresponding to the target point position can be selected; or, if the eyeball rotates upward from the right, the virtual object or menu item corresponding to the target point position can be selected.

It can be understood that the rotation selection condition can be a condition set on the rotation of the eyeball of one of the two eyes, or a condition set on the rotation of the eyeballs of both eyes; this is not limited here.
In one embodiment, when monitoring the rotation of the eyeballs of both eyes, the computer device can obtain the rotation trajectory of the eyeball and judge whether the rotation trajectory satisfies the rotation selection condition.

Further, after selecting a virtual object or menu item, the computer device can perform interaction processing on the selected virtual object or menu item. It can be understood that the computer device can select multiple virtual objects and/or menu items and then perform interaction processing on them together, or perform interaction processing on each selected virtual object or menu item separately; this is not limited here.

In one embodiment, after selecting a virtual object or menu item, the computer device can monitor the rotation of the head; when the rotation trajectory of the head satisfies an interaction operation condition, it performs, on the selected virtual object or menu item, the interaction operation corresponding to that condition.
The interactive operation condition is the condition that a head rotation must satisfy for the corresponding interactive operation to be executed. Interactive operations include at least one of a move operation, a delete operation, a highlight operation, a hide operation, a confirmation operation, a subject content acquisition operation, a resource addition operation, and the like.
The move operation moves the selected virtual object or menu item. The delete, highlight, and hide operations respectively delete, highlight (for example, by highlighted display), and hide the selected virtual object or menu item. The confirmation operation confirms the selected virtual object or menu item. In one embodiment, the confirmation operation may confirm the selected menu item, thereby triggering execution of the function corresponding to that menu item.
The subject content acquisition operation triggers acquisition of the subject content corresponding to a menu item. It can be understood that a menu item is usually a general description of its corresponding subject content; triggering the menu item makes the corresponding subject content available.
The resource addition operation adds a resource to the selected virtual object. The resource may include at least one of a virtual article, a virtual energy value, a virtual health value, and the like. The virtual article may include items such as virtual equipment or a virtual costume; for example, virtual equipment or a virtual costume may be added to a virtual reality game character.
In one embodiment, after a virtual object or menu item is selected, the computer device may also cooperate with a handle (controller) to perform the corresponding interaction processing on it. Specifically, the computer device may acquire an interactive control instruction issued by the handle for the selected virtual object or menu item, and perform the corresponding interactive control on the virtual object or menu item in response to that instruction.
In one embodiment, the computer device may also adjust the virtual reality display picture of the three-dimensional virtual reality scene according to the target point position. Specifically, the computer device may move the virtual reality display picture corresponding to a preset range centered on the target point position to the front-view position of the current two eyes, so that the virtual reality display picture around the target point position can be viewed conveniently.
In the above embodiment, the method described above guarantees the accuracy of the determined gaze point positions. The target point position in the virtual reality scene determined from the binocular gaze point positions is therefore more accurate, and so is the interaction processing performed in the virtual reality scene according to that target point position.
In one embodiment, the method further includes: performing parallax conversion on the gaze point position to obtain the target position corresponding to the gaze point position in the observed organism tissue, and determining the organism tissue corresponding to the target position as the target organism tissue.
The target organism tissue is the organism tissue that is to receive the medical procedure to be performed. It should be noted that the gaze point position used to determine the target organism tissue may be the gaze point position of a single eye (it can be understood that the case where medical personnel perform treatment using a single eye is not excluded here), or the gaze point positions of both eyes; no limitation is imposed here.
In one embodiment, the computer device may control the associated equipment to process the target organism tissue.
In the above embodiment, the gaze tracking method described above guarantees the accuracy of the determined gaze point position, and the target organism tissue that is to receive the medical procedure is then determined from that gaze point position, so the determination of the target organism tissue is more accurate.
As shown in Figure 8, in one embodiment, another gaze tracking method is provided. The method specifically includes the following steps:
S802: Acquire a calibration eye image captured while the gaze point position falls on a preset calibration point on the screen; acquire, respectively, the calibration coordinates of the pupil imaging point and of the light source imaging point in the calibration eye image.
S804: According to the acquired calibration coordinates of the pupil imaging point and of the light source imaging point, generate a coordinate prediction value of the preset calibration point expressed with the optic-visual axis angle difference as an unknown parameter.
S806: Obtain a prediction error function from the coordinate prediction value and the preset coordinates of the preset calibration point; determine the optic-visual axis angle difference that minimizes the prediction error function.
S808: Acquire an eye image; determine, in the eye image, the pixel position of the pupil imaging point and the pixel position of the light source imaging point formed by the light source through corneal reflection.
S810: According to the mapping relationship between pixel positions in the eye image and coordinates in the three-dimensional coordinate system of the screen, map the pixel position of the pupil imaging point to the coordinates of the pupil imaging point in the three-dimensional coordinate system; according to the same mapping relationship, map the pixel position of the light source imaging point to the coordinates of the light source imaging point in the three-dimensional coordinate system.
S812: Acquire, respectively, the light source coordinates of the light source and the shooting point coordinates of the shooting point (camera position) of the eye image in the three-dimensional coordinate system of the screen.
S814: Determine the coordinates of the pupil refraction point from the shooting point coordinates and the coordinates of the pupil imaging point; determine the coordinates of the light reflection point from the shooting point coordinates and the coordinates of the light source imaging point.
S816: Acquire the pupil center-to-corneal curvature center distance and the corneal radius matched with the eye image.
S818: Determine the coordinates of the pupil center and the coordinates of the corneal curvature center from the pupil center-to-corneal curvature center distance, the corneal radius, the coordinates of the light reflection point, the coordinates of the pupil refraction point, the shooting point coordinates, and the light source coordinates.
S820: Subtract the coordinates of the corneal curvature center from the coordinates of the pupil center to obtain the vector from the corneal curvature center to the pupil center.
S822: From the ratio of this vector to its norm, obtain the unit vector representing the optic axis direction along the line connecting the pupil center and the corneal curvature center.
S824: Acquire the optic-visual axis angle difference matched with the eye image; from the unit vector representing the optic axis direction, obtain the first horizontal direction angle and the first vertical direction angle of the optic axis direction in the three-dimensional coordinate system.
S826: From the first horizontal direction angle, the first vertical direction angle, and the optic-visual axis angle difference, obtain the second horizontal direction angle and the second vertical direction angle of the visual axis direction in the three-dimensional coordinate system.
S828: From the second horizontal direction angle and the second vertical direction angle, obtain the unit vector representing the visual axis direction.
S830: Obtain the coordinate value of the unit vector representing the visual axis direction on the vertical coordinate axis; obtain the gaze linear parameter from the ratio between the ordinate of the corneal curvature center and the negated value of that coordinate value.
S832: Perform a linear transformation using the unit vector representing the visual axis direction, the gaze linear parameter, and the coordinates of the corneal curvature center, to obtain the gaze point position on the screen.
S834: Perform parallax conversion on the gaze point positions of the two eyes to obtain the same target point position corresponding to the binocular gaze point positions in the virtual reality scene; perform interaction processing in the virtual reality scene according to the target point position.
The gaze tracking method described above converts the pupil imaging point and the light source imaging point in the planar eye image into coordinates in the three-dimensional screen coordinate system. From the three-dimensional coordinates of the pupil imaging point and of the light source imaging point, it determines the optic axis direction along the line connecting the pupil center and the corneal curvature center of the real eye, and then determines the visual axis direction from the angle difference between the optic axis direction and the visual axis direction. Since the visual axis direction represents the line of sight more accurately, the gaze point position of the eyes on the screen determined from the visual axis direction is more accurate.
Secondly, by combining the intrinsic eye parameters (i.e., the pupil center-to-corneal curvature center distance and the corneal radius) with the principles of light reflection and refraction, the coordinates of the light reflection point and of the pupil refraction point are obtained; together with the shooting point coordinates and the light source coordinates, the coordinates of the pupil center and of the corneal curvature center can then be determined accurately.
Then, the optic axis direction can be determined directly by a simple computation on the coordinates of the pupil center and of the corneal curvature center, without spending system processing resources on processing the line connecting the pupil center and the corneal curvature center in three-dimensional space. This greatly improves the efficiency of determining the optic axis direction while saving processing resources, and thereby improves the efficiency of determining the gaze point position while saving processing resources.
Next, the optic-visual axis angle difference is obtained through a calibration process, so that the angle difference participating in the gaze point computation is more accurate, which in turn improves the accuracy of the determined gaze point position.
Finally, the method described above guarantees the accuracy of the determined gaze point positions, so the target point position in the virtual reality scene determined from the binocular gaze point positions is more accurate, and so is the interaction processing performed in the virtual reality scene according to that target point position.
As shown in Figure 9, in one embodiment, a gaze tracking apparatus 900 is provided. The apparatus includes an acquisition module 902, a coordinate determining module 904, an optic axis direction determining module 906, a visual axis direction determining module 908, and a gaze point determining module 910, in which:
the acquisition module 902 is configured to acquire an eye image;
the coordinate determining module 904 is configured to determine the coordinates, in the three-dimensional coordinate system of the screen, of the pupil imaging point in the eye image, and the coordinates, in the three-dimensional coordinate system, of the light source imaging point formed in the eye image by the light source through corneal reflection;
the optic axis direction determining module 906 is configured to determine the optic axis direction according to the coordinates of the pupil imaging point and the coordinates of the light source imaging point;
the visual axis direction determining module 908 is configured to determine the visual axis direction according to the optic-visual axis angle difference matched with the eye image and the optic axis direction;
the gaze point determining module 910 is configured to determine the gaze point position on the screen according to the visual axis direction.
In one embodiment, the coordinate determining module 904 is further configured to determine, in the eye image, the pixel position of the pupil imaging point and the pixel position of the light source imaging point formed by the light source through corneal reflection; map, according to the mapping relationship between pixel positions in the eye image and coordinates in the three-dimensional coordinate system of the screen, the pixel position of the pupil imaging point to the coordinates of the pupil imaging point in the three-dimensional coordinate system; and map, according to the same mapping relationship, the pixel position of the light source imaging point to the coordinates of the light source imaging point in the three-dimensional coordinate system.
In one embodiment, the optic axis direction determining module 906 is further configured to determine the coordinates of the pupil center and the coordinates of the corneal curvature center according to the coordinates of the pupil imaging point and the coordinates of the light source imaging point; and determine, according to the coordinates of the pupil center and the coordinates of the corneal curvature center, the optic axis direction along the line connecting the pupil center and the corneal curvature center.
In one embodiment, the coordinate determining module 904 is further configured to acquire, respectively, the light source coordinates of the light source and the shooting point coordinates of the shooting point of the eye image in the three-dimensional coordinate system of the screen; determine the coordinates of the pupil refraction point according to the shooting point coordinates and the coordinates of the pupil imaging point; determine the coordinates of the light reflection point according to the shooting point coordinates and the coordinates of the light source imaging point; acquire the pupil center-to-corneal curvature center distance and the corneal radius matched with the eye image; and determine the coordinates of the pupil center and the coordinates of the corneal curvature center according to the pupil center-to-corneal curvature center distance, the corneal radius, the coordinates of the light reflection point, the coordinates of the pupil refraction point, the shooting point coordinates, and the light source coordinates.
In one embodiment, the coordinate determining module 904 is further configured to determine the coordinates of the pupil center and the coordinates of the corneal curvature center according to the following formulas:
(r − o) × (c − o) · (p − o) = 0;
n1 · ‖(r − c) × (p − r)‖ · ‖(o − r)‖ = n2 · ‖(r − c) × (o − r)‖ · ‖(p − r)‖;
‖r − c‖ = R;
(l − o) × (q − o) · (c − o) = 0;
(l − q) · (q − c) · ‖o − q‖ = (o − q) · (q − c) · ‖l − q‖;
‖q − c‖ = R;
‖p − c‖ = K;
where r is the coordinates of the pupil refraction point; o is the shooting point coordinates; c is the coordinates of the corneal curvature center; p is the coordinates of the pupil center; n1 is the refractive index of the cornea; n2 is the refractive index of air; R is the corneal radius; l is the light source coordinates; q is the coordinates of the light reflection point; and K is the pupil center-to-corneal curvature center distance.
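For illustration only (not part of the patent), the seven constraints above can be assembled into a residual vector that vanishes at a solution; a numerical root-finding or least-squares routine could then solve for the unknowns p, c, r, and q. The default parameter values for n1, n2, R, and K below are typical illustrative values assumed here, not values specified by the patent.

```python
import numpy as np

def gaze_model_residuals(p, c, r, q, o, l, n1=1.376, n2=1.0, R=7.8, K=4.2):
    """Residuals of the seven equations; every entry is zero at a solution."""
    cross, norm, dot = np.cross, np.linalg.norm, np.dot
    return np.array([
        dot(cross(r - o, c - o), p - o),                 # coplanarity at refraction point
        n1 * norm(cross(r - c, p - r)) * norm(o - r)
        - n2 * norm(cross(r - c, o - r)) * norm(p - r),  # Snell's law at r
        norm(r - c) - R,                                 # r lies on the corneal sphere
        dot(cross(l - o, q - o), c - o),                 # coplanarity at reflection point
        dot(l - q, q - c) * norm(o - q)
        - dot(o - q, q - c) * norm(l - q),               # law of reflection at q
        norm(q - c) - R,                                 # q lies on the corneal sphere
        norm(p - c) - K,                                 # pupil-to-cornea-center distance
    ])
```

On the degenerate configuration where the camera, the light source, and the eye all lie on the optic axis, every residual vanishes, which is a quick sanity check of the formulas.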
In one embodiment, the optic axis direction determining module 906 is further configured to subtract the coordinates of the corneal curvature center from the coordinates of the pupil center to obtain the vector from the corneal curvature center to the pupil center; and obtain, from the ratio of this vector to its norm, the unit vector representing the optic axis direction along the line connecting the pupil center and the corneal curvature center.
In one embodiment, the visual axis direction determining module 908 is further configured to acquire the optic-visual axis angle difference matched with the eye image; obtain, from the unit vector representing the optic axis direction, the first horizontal direction angle and the first vertical direction angle of the optic axis direction in the three-dimensional coordinate system; obtain, from the first horizontal direction angle, the first vertical direction angle, and the optic-visual axis angle difference, the second horizontal direction angle and the second vertical direction angle of the visual axis direction in the three-dimensional coordinate system; and obtain, from the second horizontal direction angle and the second vertical direction angle, the unit vector representing the visual axis direction.
In one embodiment, the gaze point determining module 910 is further configured to obtain the coordinate value of the unit vector representing the visual axis direction on the vertical coordinate axis; obtain the gaze linear parameter from the ratio between the ordinate of the corneal curvature center and the negated value of that coordinate value; and perform a linear transformation using the unit vector representing the visual axis direction, the gaze linear parameter, and the coordinates of the corneal curvature center, to obtain the gaze point position on the screen.
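As a sketch of the gaze-point computation (illustrative; it assumes the screen lies in the z = 0 plane of the screen coordinate system, so the "vertical coordinate axis" is z), the gaze linear parameter scales the visual-axis unit vector so that the ray from the corneal curvature center reaches the screen plane:

```python
import numpy as np

def gaze_point_on_screen(c, v):
    """Intersect the visual-axis ray starting at the corneal curvature
    center c (direction v, a unit vector) with the screen plane z = 0."""
    t = c[2] / -v[2]   # the "gaze linear parameter"
    return c + t * v   # linear transformation giving the gaze point
```

For example, with c roughly at [10, 20, 500] (millimeters, 500 mm in front of the screen) and v tilted toward the screen, the returned point has a z coordinate of zero, i.e., it lies on the screen.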
As shown in Figure 10, in one embodiment, the apparatus 900 further includes:
a calibration module 901, configured to acquire a calibration eye image captured while the gaze point position falls on a preset calibration point on the screen; acquire, respectively, the calibration coordinates of the pupil imaging point and of the light source imaging point in the calibration eye image; generate, according to the acquired calibration coordinates of the pupil imaging point and of the light source imaging point, a coordinate prediction value of the preset calibration point expressed with the optic-visual axis angle difference as an unknown parameter; obtain a prediction error function from the coordinate prediction value and the preset coordinates of the preset calibration point; and determine the optic-visual axis angle difference that minimizes the prediction error function.
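A minimal sketch of the calibration idea (not the patent's implementation): given a forward prediction function that maps a calibration sample and a candidate angle difference to a predicted calibration-point coordinate, the summed squared prediction error can be minimized, here by a simple grid search. The `predict` callable, the sample format, and the search range are assumptions made for illustration.

```python
import numpy as np

def calibrate_angle_difference(predict, samples, grid=np.linspace(-0.1, 0.1, 41)):
    """Return the (horizontal, vertical) angle difference, in radians, that
    minimizes the squared error between predicted and preset calibration
    points over all calibration samples."""
    best, best_err = (0.0, 0.0), float('inf')
    for da in grid:
        for db in grid:
            err = sum(np.sum((predict(s, da, db) - s['target']) ** 2)
                      for s in samples)
            if err < best_err:
                best, best_err = (da, db), err
    return best
```

In practice a gradient-based optimizer would be used instead of an exhaustive grid, but the grid search keeps the error-minimization structure of the calibration step visible.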
In one embodiment, the eye image is an eye image of both eyes, and the gaze point position is the gaze point positions of both eyes. In this embodiment, as shown in Figure 11, the apparatus 900 further includes:
a virtual reality scene interaction module 912, configured to perform parallax conversion on the gaze point positions of the two eyes to obtain the same target point position corresponding to the binocular gaze point positions in the virtual reality scene, and to perform interaction processing in the virtual reality scene according to the target point position.
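The parallax conversion can be illustrated with a simple stereo-geometry sketch. This is an assumed model, not the patent's formula: the two eyes are placed symmetrically about the origin, the screen lies in the plane z = screen_z, and the two gaze rays are intersected to recover a single 3D target point. The interpupillary distance `ipd` and `screen_z` values are illustrative.

```python
import numpy as np

def vr_target_from_binocular_gaze(g_left, g_right, ipd=0.064, screen_z=0.5):
    """g_left / g_right: (x, y) gaze points of the left/right eye on the screen.
    Eyes sit at (-ipd/2, 0, 0) and (+ipd/2, 0, 0); the screen is at z = screen_z.
    Returns the 3D point where the two gaze rays converge."""
    (xl, yl), (xr, yr) = g_left, g_right
    disparity = xr - xl
    z = ipd * screen_z / (ipd - disparity)  # similar-triangles depth
    t = z / screen_z                        # ray parameter at that depth
    x = -ipd / 2 + t * (xl + ipd / 2)
    y = t * (yl + yr) / 2.0
    return np.array([x, y, z])
```

When both eyes fixate the same on-screen point (zero disparity), the recovered target lies on the screen plane itself; crossed disparity places the target in front of the screen.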
Figure 12 is a schematic diagram of the internal structure of the computer device in one embodiment. Referring to Figure 12, the computer device includes a processor, a non-volatile storage medium, an internal memory, and a network interface connected through a system bus. The non-volatile storage medium of the device can store an operating system and a computer program which, when executed, may cause the processor to perform a gaze tracking method. The processor of the computer device provides computing and control capability to support the operation of the whole device. A computer program can be stored in the internal memory; when executed by the processor, it may cause the processor to perform a gaze tracking method. The network interface of the computer device is used for network communication. The display screen of the computer device may be a liquid crystal display or an electronic ink display.
Those skilled in the art will understand that the structure shown in Figure 12 is only a block diagram of the part of the structure relevant to the solution of this application, and does not limit the computer device to which the solution is applied; a specific computer device may include more or fewer components than shown in the figure, combine certain components, or arrange the components differently.
In one embodiment, the gaze tracking apparatus provided by this application may be implemented in the form of a computer program that can run on a computer device as shown in Figure 12. The non-volatile storage medium of the computer device can store the program modules that constitute the gaze tracking apparatus, for example, the acquisition module 902, the coordinate determining module 904, the optic axis direction determining module 906, the visual axis direction determining module 908, and the gaze point determining module 910 shown in Figure 9. Each program module includes a computer program that causes the computer device to execute the steps of the gaze tracking method of each embodiment of this application described in this specification. For example, the computer device may acquire an eye image through the acquisition module 902 in the gaze tracking apparatus 900 shown in Figure 9, and determine, through the coordinate determining module 904, the coordinates of the pupil imaging point in the eye image in the three-dimensional coordinate system of the screen and the coordinates, in the three-dimensional coordinate system, of the light source imaging point formed in the eye image by the light source through corneal reflection. The computer device may determine the optic axis direction according to the coordinates of the pupil imaging point and the coordinates of the light source imaging point through the optic axis direction determining module 906, determine the visual axis direction according to the optic-visual axis angle difference matched with the eye image and the optic axis direction through the visual axis direction determining module 908, and determine the gaze point position on the screen according to the visual axis direction through the gaze point determining module 910.
In one embodiment, a computer device is provided, including a memory and a processor. The memory stores a computer program which, when executed by the processor, causes the processor to perform the following steps:
acquiring an eye image;
determining the coordinates, in the three-dimensional coordinate system of the screen, of the pupil imaging point in the eye image, and the coordinates, in the three-dimensional coordinate system, of the light source imaging point formed in the eye image by the light source through corneal reflection;
determining the optic axis direction according to the coordinates of the pupil imaging point and the coordinates of the light source imaging point;
determining the visual axis direction according to the optic-visual axis angle difference matched with the eye image and the optic axis direction;
determining the gaze point position on the screen according to the visual axis direction.
In one embodiment, determining the coordinates of the pupil imaging point in the eye image in the three-dimensional coordinate system of the screen, and determining the coordinates, in the three-dimensional coordinate system, of the light source imaging point formed in the eye image by the light source through corneal reflection, includes:
determining, in the eye image, the pixel position of the pupil imaging point and the pixel position of the light source imaging point formed by the light source through corneal reflection;
mapping, according to the mapping relationship between pixel positions in the eye image and coordinates in the three-dimensional coordinate system of the screen, the pixel position of the pupil imaging point to the coordinates of the pupil imaging point in the three-dimensional coordinate system;
mapping, according to the mapping relationship, the pixel position of the light source imaging point to the coordinates of the light source imaging point in the three-dimensional coordinate system.
In one embodiment, determining the optic axis direction according to the coordinates of the pupil imaging point and the coordinates of the light source imaging point includes:
determining the coordinates of the pupil center and the coordinates of the corneal curvature center according to the coordinates of the pupil imaging point and the coordinates of the light source imaging point;
determining, according to the coordinates of the pupil center and the coordinates of the corneal curvature center, the optic axis direction along the line connecting the pupil center and the corneal curvature center.
In one embodiment, determining the coordinates of the pupil center and the coordinates of the corneal curvature center according to the coordinates of the pupil imaging point and the coordinates of the light source imaging point includes:
acquiring, respectively, the light source coordinates of the light source and the shooting point coordinates of the shooting point of the eye image in the three-dimensional coordinate system of the screen;
determining the coordinates of the pupil refraction point according to the shooting point coordinates and the coordinates of the pupil imaging point;
determining the coordinates of the light reflection point according to the shooting point coordinates and the coordinates of the light source imaging point;
acquiring the pupil center-to-corneal curvature center distance and the corneal radius matched with the eye image;
determining the coordinates of the pupil center and the coordinates of the corneal curvature center according to the pupil center-to-corneal curvature center distance, the corneal radius, the coordinates of the light reflection point, the coordinates of the pupil refraction point, the shooting point coordinates, and the light source coordinates.
In one embodiment, determining the coordinates of the pupil center and the coordinates of the corneal curvature center according to the pupil center-to-corneal curvature center distance, the corneal radius, the coordinates of the light reflection point, the coordinates of the pupil refraction point, the shooting point coordinates, and the light source coordinates includes:
determining the coordinates of the pupil center and the coordinates of the corneal curvature center according to the following formulas:
(r − o) × (c − o) · (p − o) = 0;
n1 · ‖(r − c) × (p − r)‖ · ‖(o − r)‖ = n2 · ‖(r − c) × (o − r)‖ · ‖(p − r)‖;
‖r − c‖ = R;
(l − o) × (q − o) · (c − o) = 0;
(l − q) · (q − c) · ‖o − q‖ = (o − q) · (q − c) · ‖l − q‖;
‖q − c‖ = R;
‖p − c‖ = K;
where r is the coordinates of the pupil refraction point; o is the shooting point coordinates; c is the coordinates of the corneal curvature center; p is the coordinates of the pupil center; n1 is the refractive index of the cornea; n2 is the refractive index of air; R is the corneal radius; l is the light source coordinates; q is the coordinates of the light reflection point; and K is the pupil center-to-corneal curvature center distance.
In one embodiment, determining, according to the coordinates of the pupil center and the coordinates of the corneal curvature center, the optic axis direction along the line connecting the pupil center and the corneal curvature center includes:
subtracting the coordinates of the corneal curvature center from the coordinates of the pupil center to obtain the vector from the corneal curvature center to the pupil center;
obtaining, from the ratio of the vector to its norm, the unit vector representing the optic axis direction along the line connecting the pupil center and the corneal curvature center.
In one embodiment, determining the visual axis direction according to the optic-visual axis angle difference matched with the eye image and the optic axis direction includes:
acquiring the optic-visual axis angle difference matched with the eye image;
obtaining, from the unit vector representing the optic axis direction, the first horizontal direction angle and the first vertical direction angle of the optic axis direction in the three-dimensional coordinate system;
obtaining, from the first horizontal direction angle, the first vertical direction angle, and the optic-visual axis angle difference, the second horizontal direction angle and the second vertical direction angle of the visual axis direction in the three-dimensional coordinate system;
obtaining, from the second horizontal direction angle and the second vertical direction angle, the unit vector representing the visual axis direction.
In one embodiment, determining the gaze point position on the screen according to the visual axis direction includes:
obtaining the coordinate value of the unit vector representing the visual axis direction on the vertical coordinate axis;
obtaining the gaze linear parameter from the ratio between the ordinate of the corneal curvature center and the negated value of that coordinate value;
performing a linear transformation using the unit vector representing the visual axis direction, the gaze linear parameter, and the coordinates of the corneal curvature center, to obtain the gaze point position on the screen.
In one embodiment, before determining the visual axis direction according to the optic-visual axis angle difference matched with the eye image and the optic axis direction, the computer program further causes the processor to perform the following steps:
acquiring a calibration eye image captured while the gaze point position falls on a preset calibration point on the screen;
acquiring, respectively, the calibration coordinates of the pupil imaging point and of the light source imaging point in the calibration eye image;
generating, according to the acquired calibration coordinates of the pupil imaging point and of the light source imaging point, a coordinate prediction value of the preset calibration point expressed with the optic-visual axis angle difference as an unknown parameter;
obtaining a prediction error function from the coordinate prediction value and the preset coordinates of the preset calibration point;
determining the optic-visual axis angle difference that minimizes the prediction error function.
In one embodiment, the eye image is an eye image of both eyes, and the gaze point position is the gaze point positions of both eyes. In this embodiment, the computer program further causes the processor to perform the following steps:
performing parallax conversion on the gaze point positions of the two eyes to obtain the same target point position corresponding to the binocular gaze point positions in the virtual reality scene; performing interaction processing in the virtual reality scene according to the target point position.
In one embodiment, a storage medium storing a computer program is provided. When the computer program is executed by one or more processors, the one or more processors perform the following steps:
acquiring an eye image;
determining the coordinates, in the three-dimensional coordinate system of the screen, of the pupil imaging point in the eye image, and the coordinates, in the three-dimensional coordinate system, of the light source imaging point formed in the eye image by the light source through corneal reflection;
determining the optic axis direction according to the coordinates of the pupil imaging point and the coordinates of the light source imaging point;
determining the visual axis direction according to the optic-visual axis angle difference matched with the eye image and the optic axis direction;
determining the gaze point position on the screen according to the visual axis direction.
In one embodiment, determining the coordinates of the pupil imaging point in the eye image in the three-dimensional coordinate system of the screen, and determining the coordinates, in the three-dimensional coordinate system, of the light source imaging point formed in the eye image by the light source through corneal reflection, includes:
determining, in the eye image, the pixel position of the pupil imaging point and the pixel position of the light source imaging point formed by the light source through corneal reflection;
mapping, according to the mapping relationship between pixel positions in the eye image and coordinates in the three-dimensional coordinate system of the screen, the pixel position of the pupil imaging point to the coordinates of the pupil imaging point in the three-dimensional coordinate system;
mapping, according to the mapping relationship, the pixel position of the light source imaging point to the coordinates of the light source imaging point in the three-dimensional coordinate system.
In one embodiment, determining the optical axis direction according to the coordinates of the pupil imaging point and the coordinates of the light source imaging point includes:
Determining the coordinates of a pupil center and the coordinates of a corneal curvature center according to the coordinates of the pupil imaging point and the coordinates of the light source imaging point;
Determining, according to the coordinates of the pupil center and the coordinates of the corneal curvature center, the direction of the optical axis connecting the pupil center and the corneal curvature center.
In one embodiment, determining the coordinates of the pupil center and the coordinates of the corneal curvature center according to the coordinates of the pupil imaging point and the coordinates of the light source imaging point includes:
Obtaining, respectively, the light source coordinates of the light source and the shooting point coordinates of the shooting point (i.e., the camera position) of the eye image in the three-dimensional coordinate system of the screen;
Determining the coordinates of a pupil refraction point according to the shooting point coordinates and the coordinates of the pupil imaging point;
Determining the coordinates of a light reflection point according to the shooting point coordinates and the coordinates of the light source imaging point;
Obtaining a pupil-center-to-corneal-curvature-center distance and a corneal radius matched with the eye image;
Determining the coordinates of the pupil center and the coordinates of the corneal curvature center according to the pupil-center-to-corneal-curvature-center distance, the corneal radius, the coordinates of the light reflection point, the coordinates of the pupil refraction point, the shooting point coordinates, and the light source coordinates.
In one embodiment, determining the coordinates of the pupil center and the coordinates of the corneal curvature center according to the pupil-center-to-corneal-curvature-center distance, the corneal radius, the coordinates of the light reflection point, the coordinates of the pupil refraction point, the shooting point coordinates, and the light source coordinates includes:
Determining the coordinates of the pupil center and the coordinates of the corneal curvature center according to the following formulas:
(r − o) × (c − o) · (p − o) = 0;
n1·||(r − c) × (p − r)||·||o − r|| = n2·||(r − c) × (o − r)||·||p − r||;
||r − c|| = R;
(l − o) × (q − o) · (c − o) = 0;
(l − q) · (q − c) · ||o − q|| = (o − q) · (q − c) · ||l − q||;
||q − c|| = R;
||p − c|| = K;
where r is the coordinates of the pupil refraction point; o is the shooting point coordinates; c is the coordinates of the corneal curvature center; p is the coordinates of the pupil center; n1 is the refractive index of the cornea; n2 is the refractive index of air; R is the corneal radius; l is the light source coordinates; q is the coordinates of the light reflection point; and K is the distance from the pupil center to the corneal curvature center.
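Using the notation above, the constraints can be written as residual functions that a numerical solver would drive to zero when solving for c and p. The following is a sketch of three representative residuals (coplanarity, Snell's law at the refraction point, and the reflection law at the glint), not of the solver itself, which the specification leaves unspecified.

```python
# Small vector helpers over 3-tuples (kept dependency-free for clarity).
def sub(a, b): return tuple(x - y for x, y in zip(a, b))
def dot(a, b): return sum(x * y for x, y in zip(a, b))
def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])
def norm(a): return dot(a, a) ** 0.5

def coplanarity(a, b, x):
    # scalar triple product (a x b) . x: zero when a, b, x are coplanar,
    # as in (r - o) x (c - o) . (p - o) = 0 above
    return dot(cross(a, b), x)

def refraction_residual(r, o, c, p, n1, n2):
    # Snell's law at the pupil refraction point r:
    # n1 ||(r-c) x (p-r)|| ||o-r|| - n2 ||(r-c) x (o-r)|| ||p-r||
    return (n1 * norm(cross(sub(r, c), sub(p, r))) * norm(sub(o, r))
            - n2 * norm(cross(sub(r, c), sub(o, r))) * norm(sub(p, r)))

def reflection_residual(l, q, c, o):
    # equal angles of incidence and reflection at the light reflection point q:
    # (l-q).(q-c) ||o-q|| - (o-q).(q-c) ||l-q||
    return (dot(sub(l, q), sub(q, c)) * norm(sub(o, q))
            - dot(sub(o, q), sub(q, c)) * norm(sub(l, q)))
```

Together with the sphere constraints ||r − c|| = R, ||q − c|| = R and the distance constraint ||p − c|| = K, these residuals form the system that determines c and p.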
In one embodiment, determining, according to the coordinates of the pupil center and the coordinates of the corneal curvature center, the direction of the optical axis connecting the pupil center and the corneal curvature center includes:
Subtracting the coordinates of the corneal curvature center from the coordinates of the pupil center to obtain a vector between the pupil center and the corneal curvature center;
Obtaining, according to the ratio of the vector to its norm, a unit vector representing the direction of the optical axis connecting the pupil center and the corneal curvature center.
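The two steps above reduce to normalizing p − c; the sketch below assumes the convention that the axis points from the corneal curvature center through the pupil center (out of the eye).

```python
def optical_axis_unit(p, c):
    """Unit vector of the optical axis through the pupil center p and
    the corneal curvature center c: (p - c) divided by its norm."""
    v = tuple(pi - ci for pi, ci in zip(p, c))  # vector p - c
    m = sum(x * x for x in v) ** 0.5            # norm of the vector
    return tuple(x / m for x in v)
```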
In one embodiment, determining the visual axis direction according to the optical axis direction and the optical-axis-to-visual-axis angle difference matched with the eye image includes:
Obtaining the optical-axis-to-visual-axis angle difference matched with the eye image;
Obtaining, according to the unit vector representing the optical axis direction, a first horizontal direction angle and a first vertical direction angle of the optical axis direction in the three-dimensional coordinate system;
Obtaining a second horizontal direction angle and a second vertical direction angle of the visual axis direction in the three-dimensional coordinate system according to the first horizontal direction angle, the first vertical direction angle, and the optical-axis-to-visual-axis angle difference;
Obtaining a unit vector representing the visual axis direction according to the second horizontal direction angle and the second vertical direction angle.
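A minimal sketch of this angle correction, assuming a yaw/pitch convention with x horizontal, y vertical, and z perpendicular to the screen; the specification does not fix the axes, so this convention is an assumption.

```python
import math

def direction_to_angles(v):
    """Horizontal (yaw) and vertical (pitch) angles of a unit vector."""
    x, y, z = v
    yaw = math.atan2(x, z)
    pitch = math.asin(max(-1.0, min(1.0, y)))
    return yaw, pitch

def angles_to_direction(yaw, pitch):
    """Rebuild a unit vector from horizontal and vertical angles."""
    return (math.cos(pitch) * math.sin(yaw),
            math.sin(pitch),
            math.cos(pitch) * math.cos(yaw))

def visual_axis(optical_unit, d_yaw, d_pitch):
    """Apply the optical-axis-to-visual-axis angle difference
    (d_yaw, d_pitch) to the optical-axis unit vector."""
    yaw, pitch = direction_to_angles(optical_unit)
    return angles_to_direction(yaw + d_yaw, pitch + d_pitch)
```

With a zero angle difference the visual axis coincides with the optical axis, as expected.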
In one embodiment, determining the fixation point position on the screen according to the visual axis direction includes:
Obtaining the coordinate value, on the vertical coordinate axis, of the unit vector representing the visual axis direction;
Obtaining a gaze linear parameter according to the ratio between the corresponding coordinate of the corneal curvature center and the opposite number of the obtained coordinate value;
Performing a linear transformation according to the unit vector representing the visual axis direction, the gaze linear parameter, and the coordinates of the corneal curvature center to obtain the fixation point position on the screen.
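The linear transformation is a ray-plane intersection; the sketch below assumes the screen lies in the plane z = 0 of the screen coordinate system and takes the "vertical coordinate axis" to be the z axis, an interpretation the specification does not state explicitly.

```python
def fixation_point(c, v):
    """Fixation point g = c + t * v on the screen plane z = 0, where the
    gaze linear parameter t is the ratio of the z coordinate of the
    corneal curvature center c to the opposite number of the z component
    of the visual-axis unit vector v."""
    t = c[2] / (-v[2])
    return tuple(ci + t * vi for ci, vi in zip(c, v))
```

For example, a cornea center half a unit in front of the screen with a gaze straight at it lands on the screen directly in front of the eye.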
In one embodiment, before determining the visual axis direction according to the optical axis direction and the optical-axis-to-visual-axis angle difference matched with the eye image, the computer program further causes the processor to perform the following steps:
Obtaining a calibration eye image captured when the fixation point is located at a preset calibration point on the screen;
Obtaining, respectively, the calibration coordinates of the pupil imaging point and of the light source imaging point in the calibration eye image;
Generating, according to the obtained calibration coordinates of the pupil imaging point and of the light source imaging point, a predicted coordinate value of the preset calibration point expressed with the optical-axis-to-visual-axis angle difference as an unknown parameter;
Obtaining a prediction error function according to the predicted coordinate value and the preset coordinates of the preset calibration point;
Determining the optical-axis-to-visual-axis angle difference that minimizes the prediction error function.
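The minimization can be sketched as a brute-force search over candidate angle differences; `predict` is a hypothetical gaze predictor assembled from the earlier steps and supplied by the caller, and the specification does not prescribe any particular optimizer.

```python
def calibrate_angle_difference(predict, samples, candidates):
    """Return the candidate angle difference that minimizes the summed
    squared prediction error over calibration samples, where each sample
    pairs a measurement with the known calibration-point coordinate."""
    best, best_err = None, float("inf")
    for delta in candidates:
        err = sum((predict(x, delta) - y) ** 2 for x, y in samples)
        if err < best_err:
            best, best_err = delta, err
    return best
```

In practice a gradient-based or least-squares solver would replace the grid of candidates, but the objective being minimized is the same prediction error function.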
In one embodiment, the eye images are eye images of both eyes, and the fixation point positions are fixation point positions of both eyes. In this embodiment, the computer program further causes the processor to perform the following steps:
Performing parallax conversion on the fixation point positions of the two eyes to obtain the same target point position, in a virtual reality scene, corresponding to the fixation point positions of the two eyes; and performing interaction processing in the virtual reality scene according to the target point position.
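The parallax conversion maps the two monocular fixation points to a single target point; the midpoint rule below is a simple placeholder, since the specification leaves the exact form of the conversion to the implementation.

```python
def binocular_target_point(left_gaze, right_gaze):
    """Combine the fixation point positions of the two eyes into one
    target point position. The midpoint is a hypothetical stand-in for
    the parallax conversion described above."""
    return tuple((l + r) / 2.0 for l, r in zip(left_gaze, right_gaze))
```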
Those of ordinary skill in the art will appreciate that all or part of the processes in the methods of the above embodiments may be implemented by a computer program instructing relevant hardware. The program may be stored in a computer-readable storage medium and, when executed, may include the processes of the embodiments of the above methods. The aforementioned storage medium may be a non-volatile storage medium such as a magnetic disk, an optical disc, a read-only memory (Read-Only Memory, ROM), or a random access memory (Random Access Memory, RAM).
The technical features of the above embodiments may be combined arbitrarily. For brevity of description, not all possible combinations of the technical features in the above embodiments have been described; however, as long as a combination of these technical features contains no contradiction, it shall be considered to fall within the scope of this specification.
The above embodiments express only several implementations of the present invention, and their descriptions are relatively specific and detailed, but they shall not therefore be construed as limiting the scope of the patent. It should be pointed out that those of ordinary skill in the art may make various modifications and improvements without departing from the concept of the present invention, all of which fall within the protection scope of the present invention. Therefore, the protection scope of this patent shall be subject to the appended claims.

Claims (15)

1. A gaze tracking method, the method comprising:
obtaining an eye image;
determining coordinates of a pupil imaging point in the eye image in a three-dimensional coordinate system of a screen, and coordinates, in the three-dimensional coordinate system, of a light source imaging point formed in the eye image by corneal reflection of a light source;
determining an optical axis direction according to the coordinates of the pupil imaging point and the coordinates of the light source imaging point;
determining a visual axis direction according to the optical axis direction and an optical-axis-to-visual-axis angle difference matched with the eye image; and
determining a fixation point position on the screen according to the visual axis direction.
2. The method according to claim 1, wherein determining the coordinates of the pupil imaging point in the three-dimensional coordinate system of the screen and the coordinates, in the three-dimensional coordinate system, of the light source imaging point formed in the eye image by corneal reflection of the light source comprises:
determining, in the eye image, the pixel position of the pupil imaging point and the pixel position of the light source imaging point formed by corneal reflection of the light source;
mapping the pixel position of the pupil imaging point to the coordinates of the pupil imaging point in the three-dimensional coordinate system according to a mapping relation between pixel positions in the eye image and coordinates in the three-dimensional coordinate system of the screen; and
mapping, according to the mapping relation, the pixel position of the light source imaging point to the coordinates of the light source imaging point in the three-dimensional coordinate system.
3. The method according to claim 1, wherein determining the optical axis direction according to the coordinates of the pupil imaging point and the coordinates of the light source imaging point comprises:
determining the coordinates of a pupil center and the coordinates of a corneal curvature center according to the coordinates of the pupil imaging point and the coordinates of the light source imaging point; and
determining, according to the coordinates of the pupil center and the coordinates of the corneal curvature center, the direction of the optical axis connecting the pupil center and the corneal curvature center.
4. The method according to claim 3, wherein determining the coordinates of the pupil center and the coordinates of the corneal curvature center according to the coordinates of the pupil imaging point and the coordinates of the light source imaging point comprises:
obtaining, respectively, the light source coordinates of the light source and the shooting point coordinates of the shooting point of the eye image in the three-dimensional coordinate system of the screen;
determining the coordinates of a pupil refraction point according to the shooting point coordinates and the coordinates of the pupil imaging point;
determining the coordinates of a light reflection point according to the shooting point coordinates and the coordinates of the light source imaging point;
obtaining a pupil-center-to-corneal-curvature-center distance and a corneal radius matched with the eye image; and
determining the coordinates of the pupil center and the coordinates of the corneal curvature center according to the pupil-center-to-corneal-curvature-center distance, the corneal radius, the coordinates of the light reflection point, the coordinates of the pupil refraction point, the shooting point coordinates, and the light source coordinates.
5. The method according to claim 4, wherein determining the coordinates of the pupil center and the coordinates of the corneal curvature center according to the pupil-center-to-corneal-curvature-center distance, the corneal radius, the coordinates of the light reflection point, the coordinates of the pupil refraction point, the shooting point coordinates, and the light source coordinates comprises:
determining the coordinates of the pupil center and the coordinates of the corneal curvature center according to the following formulas:
(r − o) × (c − o) · (p − o) = 0;
n1·||(r − c) × (p − r)||·||o − r|| = n2·||(r − c) × (o − r)||·||p − r||;
||r − c|| = R;
(l − o) × (q − o) · (c − o) = 0;
(l − q) · (q − c) · ||o − q|| = (o − q) · (q − c) · ||l − q||;
||q − c|| = R;
||p − c|| = K;
wherein r is the coordinates of the pupil refraction point; o is the shooting point coordinates; c is the coordinates of the corneal curvature center; p is the coordinates of the pupil center; n1 is the refractive index of the cornea; n2 is the refractive index of air; R is the corneal radius; l is the light source coordinates; q is the coordinates of the light reflection point; and K is the distance from the pupil center to the corneal curvature center.
6. The method according to claim 3, wherein determining, according to the coordinates of the pupil center and the coordinates of the corneal curvature center, the direction of the optical axis connecting the pupil center and the corneal curvature center comprises:
subtracting the coordinates of the corneal curvature center from the coordinates of the pupil center to obtain a vector between the pupil center and the corneal curvature center; and
obtaining, according to the ratio of the vector to its norm, a unit vector representing the direction of the optical axis connecting the pupil center and the corneal curvature center.
7. The method according to claim 6, wherein determining the visual axis direction according to the optical axis direction and the optical-axis-to-visual-axis angle difference matched with the eye image comprises:
obtaining the optical-axis-to-visual-axis angle difference matched with the eye image;
obtaining, according to the unit vector representing the optical axis direction, a first horizontal direction angle and a first vertical direction angle of the optical axis direction in the three-dimensional coordinate system;
obtaining a second horizontal direction angle and a second vertical direction angle of the visual axis direction in the three-dimensional coordinate system according to the first horizontal direction angle, the first vertical direction angle, and the optical-axis-to-visual-axis angle difference; and
obtaining a unit vector representing the visual axis direction according to the second horizontal direction angle and the second vertical direction angle.
8. The method according to claim 7, wherein determining the fixation point position on the screen according to the visual axis direction comprises:
obtaining the coordinate value, on the vertical coordinate axis, of the unit vector representing the visual axis direction;
obtaining a gaze linear parameter according to the ratio between the corresponding coordinate of the corneal curvature center and the opposite number of the obtained coordinate value; and
performing a linear transformation according to the unit vector representing the visual axis direction, the gaze linear parameter, and the coordinates of the corneal curvature center to obtain the fixation point position on the screen.
9. The method according to claim 1, wherein before determining the visual axis direction according to the optical axis direction and the optical-axis-to-visual-axis angle difference matched with the eye image, the method further comprises:
obtaining a calibration eye image captured when the fixation point is located at a preset calibration point on the screen;
obtaining, respectively, the calibration coordinates of the pupil imaging point and of the light source imaging point in the calibration eye image;
generating, according to the obtained calibration coordinates of the pupil imaging point and of the light source imaging point, a predicted coordinate value of the preset calibration point expressed with the optical-axis-to-visual-axis angle difference as an unknown parameter;
obtaining a prediction error function according to the predicted coordinate value and the preset coordinates of the preset calibration point; and
determining the optical-axis-to-visual-axis angle difference that minimizes the prediction error function.
10. The method according to any one of claims 1 to 9, wherein the eye image comprises eye images of both eyes, and the fixation point position comprises fixation point positions of both eyes;
the method further comprising:
performing parallax conversion on the fixation point positions of the two eyes to obtain the same target point position, in a virtual reality scene, corresponding to the fixation point positions of the two eyes; and
performing interaction processing in the virtual reality scene according to the target point position.
11. The method according to claim 10, wherein performing interaction processing in the virtual reality scene according to the target point position comprises:
determining, in the virtual reality scene, a virtual object or a menu item corresponding to the target point position;
selecting the virtual object or the menu item; and
performing interaction processing on the selected virtual object or menu item.
12. A gaze tracking apparatus, the apparatus comprising:
an obtaining module, configured to obtain an eye image;
a coordinate determining module, configured to determine coordinates of a pupil imaging point in the eye image in a three-dimensional coordinate system of a screen, and coordinates, in the three-dimensional coordinate system, of a light source imaging point formed in the eye image by corneal reflection of a light source;
an optical axis direction determining module, configured to determine an optical axis direction according to the coordinates of the pupil imaging point and the coordinates of the light source imaging point;
a visual axis direction determining module, configured to determine a visual axis direction according to the optical axis direction and an optical-axis-to-visual-axis angle difference matched with the eye image; and
a fixation point determining module, configured to determine a fixation point position on the screen according to the visual axis direction.
13. The apparatus according to claim 12, wherein the eye image comprises eye images of both eyes, and the fixation point position comprises fixation point positions of both eyes;
the apparatus further comprising:
a calibration module, configured to perform parallax conversion on the fixation point positions of the two eyes to obtain the same target point position, in a virtual reality scene, corresponding to the fixation point positions of the two eyes, and to perform interaction processing in the virtual reality scene according to the target point position.
14. A computer device, comprising a memory and a processor, the memory storing a computer program which, when executed by the processor, causes the processor to perform the steps of the method according to any one of claims 1 to 11.
15. A storage medium storing a computer program which, when executed by one or more processors, causes the one or more processors to perform the steps of the method according to any one of claims 1 to 11.
CN201710987005.1A 2017-10-20 2017-10-20 Sight tracking method, device, equipment and storage medium Active CN109696954B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710987005.1A CN109696954B (en) 2017-10-20 2017-10-20 Sight tracking method, device, equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710987005.1A CN109696954B (en) 2017-10-20 2017-10-20 Sight tracking method, device, equipment and storage medium

Publications (2)

Publication Number Publication Date
CN109696954A true CN109696954A (en) 2019-04-30
CN109696954B CN109696954B (en) 2021-05-07

Family

ID=66226506

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710987005.1A Active CN109696954B (en) 2017-10-20 2017-10-20 Sight tracking method, device, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN109696954B (en)



Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101901485A (en) * 2010-08-11 2010-12-01 华中科技大学 3D free head moving type gaze tracking system
US20140313308A1 (en) * 2013-04-19 2014-10-23 Samsung Electronics Co., Ltd. Apparatus and method for tracking gaze based on camera array
CN106168853A (en) * 2016-06-23 2016-11-30 中国科学技术大学 A kind of free space wear-type gaze tracking system

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Jin Chun, Li Yaping, Gao Qi, Zeng Wei: "Research on gaze point estimation algorithms in gaze tracking systems", Science Technology and Engineering *

Cited By (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110427101A (en) * 2019-07-08 2019-11-08 北京七鑫易维信息技术有限公司 Eye tracking calibration method, apparatus, device and storage medium
CN110537897A (en) * 2019-09-10 2019-12-06 北京未动科技有限公司 Sight tracking method and device, computer readable storage medium and electronic equipment
CN110811645A (en) * 2019-10-15 2020-02-21 南方科技大学 Visual fatigue measuring method and system, storage medium and electronic equipment
TWI725802B (en) * 2020-03-31 2021-04-21 宏達國際電子股份有限公司 Head mounted display
CN112099622A (en) * 2020-08-13 2020-12-18 中国科学院深圳先进技术研究院 Sight tracking method and device
CN114619972A (en) * 2020-12-11 2022-06-14 上海博泰悦臻网络技术服务有限公司 Suspension support and support adjustment method
CN112732080A (en) * 2020-12-30 2021-04-30 宇龙计算机通信科技(深圳)有限公司 Operation instruction generation method and device, storage medium and electronic equipment
CN112698724B (en) * 2020-12-30 2022-02-11 山东大学 Implementation method of penetrating screen system based on camera eye movement tracking
CN112698724A (en) * 2020-12-30 2021-04-23 山东大学 Implementation method of penetrating screen system based on camera eye movement tracking
CN113034608A (en) * 2021-03-11 2021-06-25 东北大学秦皇岛分校 Corneal surface morphology measuring device and method
CN113808160A (en) * 2021-08-05 2021-12-17 虹软科技股份有限公司 Sight direction tracking method and device
WO2023011339A1 (en) * 2021-08-05 2023-02-09 虹软科技股份有限公司 Line-of-sight direction tracking method and apparatus
CN113808160B (en) * 2021-08-05 2024-01-16 虹软科技股份有限公司 Sight direction tracking method and device
CN114063775A (en) * 2021-11-01 2022-02-18 南开大学 Remote gaze interaction device

Also Published As

Publication number Publication date
CN109696954B (en) 2021-05-07

Similar Documents

Publication Publication Date Title
CN109696954A (en) Gaze tracking method, apparatus, device and storage medium
JP7443332B2 (en) Depth plane selection for multi-depth plane display systems with user categorization
US11762462B2 (en) Eye-tracking using images having different exposure times
Chen et al. Probabilistic gaze estimation without active personal calibration
US11315288B2 (en) Systems and techniques for estimating eye pose
Nishino et al. The world in an eye [eye image interpretation]
CN109086726A (en) Local image recognition method and system based on AR smart glasses
CN107358217B (en) Sight estimation method and device
Hennessey et al. Noncontact binocular eye-gaze tracking for point-of-gaze estimation in three dimensions
CN110187855A (en) Intelligent adjustment method of a near-eye display device for preventing holograms from blocking vision
CN108875524A (en) Gaze estimation method, device, system and storage medium
CN104089606B (en) Free-space gaze tracking measurement method
JP7030317B2 (en) Pupil detection device and pupil detection method
CN104809424B (en) Method for realizing gaze tracking based on iris features
US20220253135A1 (en) Eye center of rotation determination with one or more eye tracking cameras
Chi et al. 3-D gaze-estimation method using a multi-camera-multi-light-source system
Sun et al. A Novel Integrated Eye-Tracking System With Stereo Stimuli for 3-D Gaze Estimation
Nitschke Image-based eye pose and reflection analysis for advanced interaction techniques and scene understanding
Lanillos et al. A Bayesian hierarchy for robust gaze estimation in human–robot interaction
Li et al. A smart eye tracking system for virtual reality
CN111587397B (en) Image generation device, spectacle lens selection system, image generation method, and program
D'Angelo et al. Development of a Low-Cost Augmented Reality Head-Mounted Display Prototype
Yang et al. Spatialgaze: towards spatial gaze tracking for extended reality
Xu et al. 3D eye model-based gaze tracking system with a consumer depth camera
Hotta et al. Gaze Calibration of Eye Trackers for Head-Mounted Displays Using Eye-Frontalization Process

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant