CN108399001A - Binocular stereo vision eye-movement analysis method and device for VR/AR - Google Patents


Info

Publication number
CN108399001A (application CN201710065321.3A)
Authority
CN
China
Prior art keywords
optical axis, eye, eyes, left/right eyes, user
Prior art date
Legal status
Withdrawn
Application number
CN201710065321.3A
Other languages
Chinese (zh)
Inventor
傅之成
杜煜
Current Assignee
Shanghai Green Technology Co Ltd
Original Assignee
Shanghai Green Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Shanghai Green Technology Co Ltd filed Critical Shanghai Green Technology Co Ltd
Priority to CN201710065321.3A priority Critical patent/CN108399001A/en
Publication of CN108399001A publication Critical patent/CN108399001A/en


Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 — Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 — Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 — Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F 3/013 — Eye tracking input arrangements

Abstract

The present invention discloses a binocular stereo-vision eye-movement analysis method for VR/AR. A kappa angle is calibrated: the eyes gaze at a calibration point, and the two cameras assigned to each of the left and right eyes, together with at least two light sources, determine the kappa angles of the eyes. The optical axes of the left and right eyes are then determined in real time; the visual-axis regions of the left and right eyes are determined from the kappa angle and the optical axes of the left and right eyes; and the region the user gazes at is determined. A device using the above method is also provided, for use in head-mounted VR or AR equipment. Only one kappa-angle calibration is needed per user, which overcomes the error caused by the head position shifting each time the user puts the device on and eliminates tedious repeated calibration. The method uses an efficient light-source group that forms multiple glints on the user's cornea and automatically screens the reliable glints for eye-movement analysis. When the user wears a head-mounted VR or AR device, all of the user's gaze positions, or the full field of view, are covered, so gaze-position detection remains smooth during saccades, the captured eye images have uniform brightness, and subsequent image processing is convenient.

Description

Binocular stereo-vision eye-movement analysis method and device for VR/AR
Technical field
The invention belongs to the field of eye tracking, and more particularly relates to a binocular stereo-vision eye-movement analysis method and device for VR/AR.
Background technology
The kappa angle is the angle between the optical axis and the visual axis of the eyeball. When the corneal light reflex deviates toward the nasal side, the kappa angle is positive and the user's eyes appear exotropic; when it deviates toward the temporal side, the kappa angle is negative and the eyes appear esotropic. A normal kappa angle is a positive angle within 0–5°; a negative kappa angle, or a positive kappa angle greater than 5°, is pathological and is easily misdiagnosed as strabismus.
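As an illustrative sketch (not part of the patent), the kappa angle is simply the angle between the two axis direction vectors; the example vectors below are hypothetical:

```python
import math

def angle_between_deg(u, v):
    """Angle in degrees between two 3D direction vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(a * a for a in v))
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / (nu * nv)))))

# Hypothetical example: optical-axis vs. visual-axis directions
optical_axis = (0.0, 0.0, 1.0)
visual_axis = (math.sin(math.radians(4.0)), 0.0, math.cos(math.radians(4.0)))
kappa = angle_between_deg(optical_axis, visual_axis)  # 4.0°, within the normal 0–5° range
```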
Head-mounted virtual-reality devices (also known as VR glasses or VR helmets) and head-mounted augmented-reality devices (also known as AR glasses or AR helmets) are virtual-reality and augmented-reality products that are currently developing rapidly and becoming widespread. Eye-tracking technology extracts eyeball feature points based on image processing, and calculates and records in real time the position the eyes are looking at.
In the prior art, head-mounted virtual-reality or augmented-reality devices are mainly based on the dark-pupil technique, using corneal glints as reference points to compute the pupil-to-glint vector. This has the following problems:
One, the number of light sources is small. When the user's eyes rotate through a large angle and the pupil deflects far, the glints of the infrared light sources may fall outside the pupil region and cannot be captured by the infrared camera; the covered viewing angle is small and some viewing angles cannot be detected.
Two, the infrared light sources are few or unevenly distributed and cannot illuminate the eyes uniformly, so the eye images captured by the infrared camera have uneven brightness and poor quality, which degrades eye-tracking accuracy.
Three, the user's normal kappa angle is not taken into account, causing detection error.
Four, each time the user puts on the head-mounted VR device, the device shifts to a different degree relative to the user's eyes, so the eyes must be recalibrated every time the same user reuses the device, which is inconvenient.
Invention content
The present invention provides a binocular stereo-vision eye-movement analysis method and device for VR/AR, to solve the above technical problems.
A binocular stereo-vision eye-movement analysis method for VR/AR, comprising the steps of:
calibrating a kappa angle: the eyes gaze at a calibration point, and the two cameras assigned to each of the left and right eyes, together with at least two of several uniformly distributed light sources, determine the kappa angle of at least one eye;
determining the optical axes of the left and right eyes in real time;
determining the visual-axis regions of the left and right eyes;
determining the user's gaze region.
Preferably, eight uniformly distributed light sources are assigned to each of the left and right eyes.
Preferably, the step of determining the kappa angle of one eye comprises:
the two cameras capture images of that eye including the glints of the at least two light sources; the intersection point of the intersection lines of the corresponding planes determined by the two cameras from the glints is the corneal sphere center; the eye images determine the pupil center of that eye; and the line connecting the corneal sphere center and the pupil center determines the optical axis of that eye;
the line connecting the calibration point and the corneal sphere center determines the visual axis of the eye;
the angle between the visual axis and the optical axis is the kappa angle.
Preferably, determining the optical axes of the left and right eyes in real time comprises:
the two cameras capture images of the left and right eyes, each including at least two glints; the intersection point of the intersection lines of the at least two planes determined by the two cameras from the at least two glints is the corneal sphere center; the eye images determine the pupil centers of the left and right eyes;
the line connecting the corneal sphere center and the pupil center is the optical axis of the respective eye.
Preferably, the user's gaze region is the region where the visual-axis regions of the left and right eyes intersect, or the midpoint position of the visual-axis regions of the left and right eyes.
Preferably, the user's gaze region is the visual-axis region of the left eye, or the visual-axis region of the right eye.
A binocular stereo-vision eye-movement device for VR/AR, characterized by comprising an eye-tracking module for performing eye tracking, the eye-tracking module including two cameras assigned to each eye and several light sources in fixed positions;
the several fixed-position light sources generate at least two glints on the cornea.
Preferably, there are eight light sources;
the two cameras face the eyes and are located below the eight light sources.
Preferably, the light sources are infrared light sources, and the cameras are infrared cameras.
Preferably, the device further comprises a lens cup, the lens cup including a lens holder and a lens, the lens holder including a holder base and holder side walls.
The present invention has the following advantageous effects. By providing an efficient light-source group and a kappa-angle detection method, only one kappa-angle calibration is needed per user, which overcomes the error caused by movement of the head-mounted virtual-reality or augmented-reality device relative to the user's head, without requiring an additional calibration device. The efficient infrared light-source group allows a reasonable number of glints on the cornea to be selected for gaze detection, so that when the user wears a head-mounted virtual-reality or augmented-reality device, all of the user's gaze positions, or the full field of view, are covered and smooth saccadic gaze positions can be detected; at the same time, the eye images captured by the infrared cameras have uniform brightness, which facilitates subsequent image processing.
Description of the drawings
Fig. 1 is an eye image captured when the user looks straight ahead.
Fig. 2a–Fig. 2h are eye images of the user at different viewing angles.
Fig. 3 is a schematic diagram of the calibration eye image of Fig. 1.
Fig. 4 is a schematic diagram of Fig. 2a.
Fig. 5 is a flow diagram of the binocular stereo-vision eye-movement analysis method in VR/AR.
Fig. 6 is a front view of the binocular stereo-vision eye-movement device in VR/AR in one embodiment of the invention.
Fig. 7 is a sectional view of the device of Fig. 6 along A-A′.
Fig. 8 is a working diagram of the device of Fig. 7.
Fig. 9 is a schematic diagram of selecting the glints of two effective light sources and of the kappa angle.
Fig. 10 is a schematic diagram of a single camera and a single light source determining a plane through the corneal sphere center and the camera's optical center.
Fig. 11 is a schematic diagram of a single camera and two light sources determining a line through the corneal sphere center and the camera's optical center.
Fig. 12 is a schematic diagram of the refracted pupil virtual image and of determining the pupil center.
Specific implementation mode
Preferred embodiments of the present invention are provided below in conjunction with the accompanying drawings, to describe the technical solution of the invention in detail.
Since a kappa angle in the range 0–5° is a normal phenomenon, referring to Fig. 9, when the eye 400 gazes at an object (or calibration point) 600, the optical axis 502 of the eye 400 is the straight line through the pupil center 402, the corneal sphere center 4010 (the cornea 401 approximates part of a sphere, and the corneal sphere center 4010 is the center of that sphere) and the center of rotation 403 of the eye 400. The center of rotation 403 is generally not detected; the optical axis 502 of the eye 400 can be determined simply from the line connecting the pupil center 402 and the corneal sphere center 4010. The visual axis 501 of the eye 400 is the straight line through the corneal sphere center 4010 and the center of the fovea 404. The angle between the visual axis 501 and the optical axis 502 is the kappa angle of the eye 400. The kappa angles of a person's left and right eyes are generally considered close. In the prior art, eye tracking generally uses the detected pupil-center position, represented by the optical axis 502; obviously the optical axis 502 cannot accurately point at the target 600 the user is watching, so when the user's possible kappa angle is not taken into account, the eye-tracking result is inaccurate. And since the kappa angle lies within a certain range, if the prior art is to determine the user's gaze region accurately, calibration is required every time the user reuses the device; this calibration is generally performed by gazing at calibration points.
Therefore, the present invention adopts the following binocular stereo-vision eye-movement analysis method in a virtual-reality or augmented-reality device:
Step 1: calibrate the kappa angle. The eyes gaze at a calibration point, and the two cameras assigned to each of the left and right eyes, together with at least two of several uniformly distributed light sources, determine the kappa angle of at least one eye.
Step 2: determine the user's gaze region:
determine the optical axes of the left and right eyes in real time;
determine the visual-axis regions of the left and right eyes;
determine the user's gaze region.
Step 1 is described below.
Calibrating the kappa angle: the eyes gaze at a calibration point, and the two cameras assigned to each of the left and right eyes, together with at least two of six or more uniformly distributed light sources, determine the kappa angle of at least one eye.
The key point of this step is to accurately determine the kappa angle of the user's eye.
As introduced above, the visual axis and the optical axis must be determined. The visual axis is the space vector determined by the line connecting the corneal sphere center and the fovea, and the fovea is difficult to acquire in an ordinary head-mounted virtual-reality or augmented-reality device; the present invention therefore takes the space vector determined by the line connecting the calibration point and the corneal sphere center as the user's visual axis. Because the calibration point is a fixation target during calibration and its position is known accurately, the visual axis determined this way is more accurate. Referring to Fig. 5, determining the corneal sphere center of the eye is introduced below.
The present invention obtains the spatial position of the corneal sphere center by the principle of spherical reflection. Referring to Fig. 10, for a camera (with known optical center O) and a light source L, by the law of reflection the incident ray, the reflected ray and the normal lie in the same plane. This means that when the point light source reflects off the corneal surface, the light source L, the corneal sphere center C (4010), the camera optical center O and the image G of light source L on the camera sensor lie in the same plane.
Referring to Fig. 10, the ray 106A emitted by light source L is reflected by the corneal surface and forms an image point G on the camera sensor. Since the light source L, the corneal sphere center C, the camera optical center O and the image G lie respectively on the incident ray 106A, the normal and the reflected ray 106B, these four points lie together in one plane Π_L. The spatial position of plane Π_L can be calculated from the spatial positions of the camera optical center O, the image point G and the light source L, where the position of light source L is determined by calibration and the image point G is determined by the glint. Thus one camera and one light source that produces a glint can determine one plane through the corneal sphere center.
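As a sketch of this step (not part of the patent; it assumes known 3D positions in the camera coordinate frame, and all coordinates below are hypothetical), the plane through O, G and L — and hence through the corneal sphere center C — can be represented by its normal:

```python
import numpy as np

def glint_plane_normal(o, g, light):
    """Unit normal of the plane through camera optical center o,
    glint image point g and light source position `light`; by the
    law of reflection the corneal sphere center C lies in this plane."""
    n = np.cross(np.asarray(g) - o, np.asarray(light) - o)
    return n / np.linalg.norm(n)

# Hypothetical positions (camera frame, millimetres)
o = np.array([0.0, 0.0, 0.0])        # camera optical center
g = np.array([1.0, 2.0, 5.0])        # back-projected glint image point
light = np.array([30.0, 10.0, 0.0])  # calibrated light-source position
n = glint_plane_normal(o, g, light)
# Any point C in the plane satisfies n · (C - o) = 0
```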
It follows, referring to Fig. 11, that with two effective light sources L1 and L2 and one camera — an effective light source being one whose glint on the cornea the camera can capture, thereby determining the image points G1 and G2 — the detection process obtains two planes Π1 and Π2 through the corneal sphere center C. Since both planes Π1 and Π2 pass through the camera's optical center O and the corneal sphere center C, their intersection line is CO; a single camera can therefore only determine a straight line through the corneal sphere center C, not its spatial position.
Therefore, the present invention uses two cameras, obtaining two straight lines through the corneal sphere center C that pass through the two cameras' optical centers respectively; the intersection point of the two lines is the corneal sphere center C, so the corneal sphere center C of the user's eye can be calculated.
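A minimal sketch of this two-line triangulation (not part of the patent; coordinates are hypothetical). In practice noise makes the two lines skew, so the midpoint of the shortest segment between them is commonly used as the estimate of C:

```python
import numpy as np

def line_intersection_midpoint(p1, d1, p2, d2):
    """Closest-approach midpoint of lines p1 + t*d1 and p2 + s*d2."""
    d1 = np.asarray(d1, float); d2 = np.asarray(d2, float)
    r = np.asarray(p1, float) - np.asarray(p2, float)
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    d, e = d1 @ r, d2 @ r
    denom = a * c - b * b            # nonzero for non-parallel lines
    t = (b * e - c * d) / denom
    s = (a * e - b * d) / denom
    return 0.5 * ((p1 + t * d1) + (p2 + s * d2))

# Hypothetical: two camera optical centers, and the direction of the
# line each camera determines toward the corneal sphere center C
c_true = np.array([10.0, 5.0, 40.0])
o1 = np.array([0.0, 0.0, 0.0]); o2 = np.array([60.0, 0.0, 0.0])
est = line_intersection_midpoint(o1, c_true - o1, o2, c_true - o2)
# est equals c_true here, since the two lines truly intersect
```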
However, when the user is using a head-mounted virtual-reality or augmented-reality device, the viewing angle varies widely, as shown in Fig. 2a–Fig. 2h; it is therefore essential that the above at least two glints can be captured at every viewing angle.
For this purpose, the present invention introduces a binocular stereo-vision eye-movement device for VR/AR that can capture at least two glints at every viewing angle. The device includes an eye-tracking module for performing eye tracking; the module includes two cameras and several light sources in fixed positions, the several light sources generating at least two glints on the cornea.
In another embodiment there are eight fixed-position light sources 106, and multiple fixed arrangements of the eight infrared light sources 106 are possible — uniformly distributed or approximately uniformly distributed — so that the cameras 102 can capture at least two glints at every viewing angle.
Preferably, for ease of implementation, the inventors use a more regular fixed arrangement in this embodiment. Referring to Fig. 6, the eight light sources 106 are distributed uniformly or approximately uniformly around the center of the eye as seen when the user looks straight ahead, and the cameras 102 can capture at least two glints at every viewing angle; the two cameras 102 face the eyes and are arranged below the eight light sources. Fig. 8 is a working diagram of the cameras 102 of Fig. 7; the display screen 200 in the head-mounted virtual-reality or augmented-reality device shows the calibration point, target object or image.
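The ring arrangement can be sketched as follows (not part of the patent; the radius and center are hypothetical, since the patent only specifies a uniform or near-uniform distribution around the eye center):

```python
import math

def led_ring_positions(center, radius, count=8):
    """Positions of `count` LEDs uniformly spaced on a circle
    around the eye center (2D, in the lens plane)."""
    cx, cy = center
    return [(cx + radius * math.cos(2 * math.pi * k / count),
             cy + radius * math.sin(2 * math.pi * k / count))
            for k in range(count)]

# Hypothetical 20 mm ring: eight positions, 45° apart, matching
# the uniform layout of Fig. 6
leds = led_ring_positions(center=(0.0, 0.0), radius=20.0)
```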
Fig. 1 is the eye image captured when the user looks straight ahead with the eight light sources 106 distributed uniformly or approximately uniformly around the eye center; eight glints are clearly visible on the cornea 401. Fig. 3 is the schematic diagram of Fig. 1: starting from the 12-o'clock direction and proceeding clockwise, the eight light sources are 1061, 1062, 1063, 1064, 1065, 1066, 1067, 1068, and the glints corresponding to the eight light sources are, in order: glint 1 (11), glint 2 (12), glint 3 (13), glint 4 (14), glint 5 (15), glint 6 (16), glint 7 (17), glint 8 (18). Fig. 4 is the schematic diagram of Fig. 2a; at that viewing angle the glints 17, 18, 11, 12 of light sources 1067, 1068, 1061, 1062 are clearly visible and detectable on the cornea.
Preferably, the light sources are infrared light sources, and the cameras are infrared cameras.
Preferably, the device further comprises a lens cup. The lens cup includes a lens 105 and a lens holder, as shown in Fig. 6–Fig. 7. The lens holder includes holder side walls 103 and a holder base 104. The lens holder may be designed axially symmetric about the central axis of the lens 105 or asymmetric, provided the working portion of the lens 105 is fully exposed; that is, the holder base 104 adopts a hollow design, such as an annular or square-hole design, so that the base 104 fastens the lens 105 while exposing its working portion. The lens 105 is mounted on the inside of the lens holder base 104. The device is a device in head-mounted virtual-reality or augmented-reality equipment. Because the equipment is head-mounted, the eight fixed-position infrared light sources 106 illuminate the user's eye 400, and the infrared camera 102 then captures the eight glints of the eight infrared light sources 106 on the cornea (roughly covering the region of the iris 2 and the pupil 3): glint 1 (11), glint 2 (12), glint 3 (13), glint 4 (14), glint 5 (15), glint 6 (16), glint 7 (17), glint 8 (18). In both step 1 and step 2, since the head-mounted virtual-reality device is stationary relative to the head — that is, the eight fixed-position infrared light sources 106 are stationary relative to the user's head — the positions of the above glints relative to the eyeball center are absolutely constant.
Preferably, the eight infrared light sources 106 are arranged on the outside of the holder base 104, on the surface of the holder base 104 that supports the lens 105.
Preferably, the infrared light sources 106 are infrared LED light sources; to allow the infrared camera 102 to capture clear eye images, infrared LED light sources with a wavelength of 940 nm are preferably selected.
Preferably, because the images captured by the cameras are easily disturbed by visible light emitted by the display screen when the device is working, an optical filter is placed in front of each camera lens to solve this problem.
It should be noted that the lens 105 can be of many types: it can be the plano-convex lens shown in Fig. 1, a symmetric or asymmetric biconvex lens, or a concave-convex (meniscus) lens; the present invention does not limit the type of the lens 105.
It should be noted that the inventors designed the eight infrared light sources 106 so that at least two glints can be detected in steps 1 and 2, so that gaze-position detection remains smooth during saccades, and so that the eyes are illuminated uniformly; as a result the eye images received by the infrared camera 102 have uniform brightness, the glints on the cornea are easier to identify, and subsequent processing is easier.
The binocular stereo-vision eye-movement device can guarantee, at any viewing angle in the various VR/AR displays described above, the at least two glints on the cornea required to determine the corneal sphere center.
It should be noted that in this embodiment geometrically regular positions on the display screen are used as calibration points; the benefit is that the central portion of the cornea is almost parallel to the display screen, so the measurable area during calibration is maximal. In other embodiments, specific positions or arbitrary positions on the display screen can also be used as calibration points; the present invention is not limited in this respect.
Referring to Fig. 9, the line connecting the calibration point 600 and the determined corneal sphere center 4010 is taken as the visual axis 501.
Referring to Fig. 5 and Fig. 9, since the optical axis 502 passes through the corneal sphere center 4010 and the pupil center 402, once the corneal sphere center 4010 is determined, only the pupil center 402 remains to be determined in order to obtain the optical axis. A method of determining the pupil center 402 is introduced below.
After the corneal sphere center is obtained, the spatial position of the eye's optical axis is reconstructed using the pupil images captured by the two cameras 102. Because of refraction at the corneal surface, the pupil position reconstructed by the two cameras is not the true spatial position of the pupil — the refracted pupil virtual image differs from the true pupil image — but the center of each image is coplanar with the camera optical center O and the corneal sphere center C.
Referring to Fig. 12, P and P′ denote the pupil center and the center of the virtual image of the pupil formed by corneal refraction, respectively. By the law of refraction, the refracted virtual image, the original object and the refraction normal lie in the same plane. The center of the pupil also obeys this rule: in the figure, the pupil center P, the pupil virtual-image center P′ and the refraction normal CO lie in the same plane. From the straight line CO and the spatial position of the pupil virtual-image center P′, a plane through the pupil center P is obtained; this relationship can be expressed by the following formula:
OP · (e_OC × e_OP′) = 0
where e_OC and e_OP′ are the unit vectors from the camera optical center O toward the corneal sphere center C and toward the pupil virtual-image center P′, respectively, and OP is the vector from O to the pupil center P.
When the two cameras both capture the pupil virtual image, the intersection of the plane through one camera's optical center, pupil center and the corneal sphere center with the plane through the other camera's optical center, pupil center and the corneal sphere center is the straight line through the pupil center and the corneal sphere center, i.e. the space vector of the eye's optical axis.
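A sketch of this plane intersection (not part of the patent; coordinates are hypothetical). Each plane is built from one camera's optical center, the pupil-image center that camera observes, and the shared corneal sphere center C; since the refracted pupil-image center is coplanar with O, C and the true pupil center, the true center is used directly here:

```python
import numpy as np

def optic_axis_direction(o1, p1_img, o2, p2_img, c):
    """Direction of the line where the two pupil planes intersect.
    Each plane contains a camera optical center, the pupil-image
    center that camera observes, and the corneal sphere center c."""
    n1 = np.cross(p1_img - o1, c - o1)   # normal of camera-1 plane
    n2 = np.cross(p2_img - o2, c - o2)   # normal of camera-2 plane
    d = np.cross(n1, n2)                 # intersection-line direction
    return d / np.linalg.norm(d)

# Hypothetical setup: true optical axis runs from c toward pupil p
c = np.array([10.0, 5.0, 40.0])
p = c + np.array([0.0, 0.0, -5.0])       # pupil center 5 mm in front of c
o1 = np.array([0.0, 0.0, 0.0]); o2 = np.array([60.0, 0.0, 0.0])
axis = optic_axis_direction(o1, p, o2, p, c)
# axis is parallel to p - c, i.e. (0, 0, -1) up to sign
```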
In turn, the angle between the optical axis 502 and the visual axis 501 is the kappa angle of the user's eye.
It should be noted that the present invention can determine the kappa angle of one eye at calibration time, or determine the kappa angles of both eyes simultaneously. When the kappa angles of both eyes are determined, in step 2 each eye's kappa angle can be combined with that eye's optical axis to determine that eye's visual-axis range, and the user's gaze region can be determined from a single eye's visual-axis range — i.e. monocular determination of the gaze region — or the visual-axis ranges of both eyes can be used jointly to determine the gaze region; the present invention is not limited in this respect.
Step 2 is described below.
Determine the optical axes of the left and right eyes in real time; determine the visual-axis regions of the left and right eyes from the kappa angle and the optical axes of the left and right eyes; determine the region the user gazes at from the visual axes of the left and right eyes.
The key point of this step is to determine the optical axes of the left and right eyes in real time.
As described above, the optical axis passes through the corneal sphere center and the pupil center.
Referring to Fig. 5, the two cameras capture images of each eye, each image including at least two glints; the intersection point of the intersection lines of the at least four planes determined by the two cameras from the at least two glints is the corneal sphere center. The method of determining the corneal sphere center is identical to that in step 1 and is not repeated here.
The pupil centers of the left and right eyes are determined from the eye images captured by the two cameras. The method of determining the pupil center is identical to that in step 1 and is not repeated here.
Preferably, after the at least two glints are obtained, the intersection point of the intersection lines of the two planes that each of the two cameras determines from the two glints of two adjacent light sources among the at least two glints is the corneal sphere center.
The line connecting the corneal sphere center and the pupil center is the optical axis of the respective eye.
The visual-axis regions of the left and right eyes are determined from the kappa angle obtained in step 1 and the optical axes of the left and right eyes described above. The visual-axis region of each eye is a cone-like region extending to the display screen, with that eye's optical axis as its axis, the corneal sphere center as its vertex, and an apex angle of twice the kappa angle.
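A sketch of this cone test (not part of the patent; the geometry is hypothetical). A point lies in an eye's visual-axis region when the angle between the optical axis and the ray from the corneal sphere center to the point is at most the kappa angle, i.e. half the 2×kappa apex angle:

```python
import math

def in_visual_axis_cone(point, apex, axis, kappa_deg):
    """True if `point` lies inside the cone with vertex `apex`,
    axis direction `axis` and half-angle kappa_deg."""
    v = [p - a for p, a in zip(point, apex)]
    dot = sum(a * b for a, b in zip(v, axis))
    nv = math.sqrt(sum(x * x for x in v))
    na = math.sqrt(sum(x * x for x in axis))
    ang = math.degrees(math.acos(max(-1.0, min(1.0, dot / (nv * na)))))
    return ang <= kappa_deg

apex = (0.0, 0.0, 0.0)          # corneal sphere center (hypothetical)
axis = (0.0, 0.0, 1.0)          # optical axis toward the screen
near = in_visual_axis_cone((2.0, 0.0, 50.0), apex, axis, 5.0)   # ~2.3° off-axis
far = in_visual_axis_cone((10.0, 0.0, 50.0), apex, axis, 5.0)   # ~11.3° off-axis
```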
Preferably, the display-screen area corresponding to the visual-axis region of the left eye, or the display-screen area corresponding to the visual-axis region of the right eye, is taken as the user's gaze region; that is, a single eye's visual-axis region determines the user's gaze region.
Considering that the visual axis 501 deviates from the optical axis 502 within a certain range of kappa angles, and that the pupil center 402 and the corneal sphere center 4010 can be measured accurately so that the optical axis 502 is determined accurately, a range in which the visual axis 501 lies is determined from the optical axis 502 and the kappa angle. By the principle of binocular stereo vision, the overlap between the visual-axis ranges of the two eyes, or another effective region, can then accurately determine the user's gaze region. The inventors found that this gaze region can effectively exclude the error caused by movement of the head-mounted VR or AR device relative to the user's head; that is, this binocular stereo-vision eye-movement analysis method, which uses the kappa angle to determine the visual axis, is self-calibrating and excludes the above error. A user therefore needs only one kappa-angle calibration, overcoming the error caused by movement of the head-mounted device relative to the user's head, without an additional calibration device. Preferably, the user's gaze region is determined from the visual-axis regions of the left and right eyes. According to the characteristics of the visual-axis regions of the left and right eyes, there are two possibilities:
In the first case, the visual-axis regions of the left and right eyes have an intersection — non-pathological eyes with normal kappa angles should always have one — and the intersection region is the region the user gazes at.
In the second case, the visual-axis regions of the left and right eyes have no intersection, and the region the user gazes at is the midpoint position between the visual-axis regions of the two eyes.
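The two cases can be sketched together as follows (not part of the patent; coordinates are hypothetical 2D screen positions, and each eye's visual-axis region is modeled as a disc where its cone meets the screen):

```python
import math

def gaze_region(left, right):
    """Combine two per-eye gaze discs (cx, cy, r) on the screen.
    Case 1: the discs intersect -> representative point at the center
    of the overlap along the line joining the disc centers.
    Case 2: no intersection -> midpoint between the two regions."""
    (x1, y1, r1), (x2, y2, r2) = left, right
    dist = math.hypot(x2 - x1, y2 - y1)
    ux, uy = (x2 - x1) / dist, (y2 - y1) / dist
    overlap = dist <= r1 + r2
    if overlap:
        t = (dist - r2 + r1) / 2.0   # midpoint of the overlap span
    else:
        t = dist / 2.0               # midpoint between the regions
    return overlap, (x1 + t * ux, y1 + t * uy)

# Equal radii -> the gaze point falls midway between the two centers
overlap, gaze = gaze_region((100.0, 80.0, 15.0), (110.0, 80.0, 15.0))
```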
The foregoing are only preferred embodiments of the present invention, but the protection scope of the present invention is not limited thereto. Any change or replacement that can readily occur to a person skilled in the art within the technical scope disclosed by the present invention shall be covered by the protection scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.

Claims (10)

1. a kind of binocular stereo vision eye movement analysis method in VR/AR, including step:
Kappa footmarks are fixed, eye gaze calibration point, respectively distribute to 2 video cameras of right and left eyes and several equally distributed light At least two light source in source determines the angles kappa of at least one eye eyeball;
determining the optical axes of the left and right eyes in real time;
determining the visual axis regions of the left and right eyes;
determining the region the user gazes at.
2. The binocular stereo vision eye movement analysis method in VR/AR according to claim 1, characterized in that: 8 uniformly distributed light sources are assigned to each of the left and right eyes.
3. The binocular stereo vision eye movement analysis method in VR/AR according to claim 2, characterized in that: the step of determining the kappa angle of one eye comprises:
the 2 cameras capture images of the one eye including the corneal reflections of the at least two light sources; the intersection point of the intersection lines of the corresponding planes determined by the 2 cameras from the corneal reflections is the cornea ball center; the pupil center of the one eye is determined from the images of the eye; and the line connecting the cornea ball center and the pupil center determines the optical axis of the one eye;
the line connecting the calibration point and the cornea ball center determines the visual axis of the eye;
the angle between the optical axis and the visual axis is the kappa angle.
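The final step of claim 3 reduces to a vector-angle computation: the optical axis runs from the cornea ball center through the pupil center, the visual axis runs from the cornea ball center toward the calibration point, and kappa is the angle between the two. A minimal sketch (our naming; it assumes all three points are expressed in a common 3-D coordinate frame):

```python
import numpy as np

def kappa_angle_deg(cornea_center, pupil_center, calibration_point):
    """Kappa angle in degrees: the angle between the optical axis
    (cornea ball center -> pupil center) and the visual axis
    (cornea ball center -> calibration point)."""
    optical = pupil_center - cornea_center          # optical axis direction
    visual = calibration_point - cornea_center      # visual axis direction
    cos_k = np.dot(optical, visual) / (
        np.linalg.norm(optical) * np.linalg.norm(visual))
    # clip guards against tiny floating-point excursions outside [-1, 1]
    return np.degrees(np.arccos(np.clip(cos_k, -1.0, 1.0)))
```

For a human eye the result is typically on the order of 5 degrees; the computation itself is independent of the magnitude.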
4. The binocular stereo vision eye movement analysis method in VR/AR according to claim 3, characterized in that: determining the optical axes of the left and right eyes in real time comprises:
the 2 cameras capture images of the left and right eyes each including at least two corneal reflections; the intersection point of the intersection lines of the at least two planes determined by the 2 cameras from the at least two corneal reflections is the cornea ball center, and the pupil centers of the left and right eyes are determined from the images of the eyes;
the lines connecting the cornea ball centers and the pupil centers are the optical axes of the left and right eyes.
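As a sketch of how the cornea ball center of claim 4 can be recovered: for a spherical corneal surface, each camera/light-source pair defines a plane of incidence (containing the camera center, the light source, and its corneal reflection) that must pass through the cornea ball center, so the center is the common point of all such planes. The function below solves this as a small least-squares system; it is our illustration, with each plane supplied as a (normal, point-on-plane) pair, not a procedure stated verbatim in the patent.

```python
import numpy as np

def cornea_ball_center(normals, points):
    """Least-squares intersection of the planes n_i . x = n_i . p_i.

    Each camera/light-source pair yields one plane that must contain the
    cornea ball center; with 2 cameras and at least two corneal
    reflections there are enough planes to pin down a unique point.
    """
    A = np.asarray(normals, dtype=float)                  # one row per plane normal
    b = np.einsum('ij,ij->i', A, np.asarray(points, dtype=float))
    center, *_ = np.linalg.lstsq(A, b, rcond=None)        # least-squares solve
    return center
```

With noisy glint detections the system is overdetermined, and the least-squares solution averages the measurement error across all planes.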
5. The binocular stereo vision eye movement analysis method in VR/AR according to claim 1, characterized in that: the region the user gazes at is either the intersection of the visual axis regions of the left and right eyes or the midpoint position between the visual axis regions of the left and right eyes.
6. The binocular stereo vision eye movement analysis method in VR/AR according to claim 1, characterized in that: the region the user gazes at is the visual axis region of the left eye or the visual axis region of the right eye.
7. A binocular stereo vision eye movement device in VR/AR, characterized in that: it comprises an eyeball tracking module for performing eyeball tracking, the eyeball tracking module comprising 2 cameras assigned to each eye and several light sources at fixed positions;
the several light sources at fixed positions generate at least 2 corneal reflections on the cornea.
8. The binocular stereo vision eye movement device in VR/AR according to claim 7, characterized in that: there are 8 light sources;
the 2 cameras are arranged facing the eyes and located below the 8 light sources.
9. The binocular stereo vision eye movement device in VR/AR according to any one of claims 7 to 8, characterized in that: the light sources are infrared light sources, and the cameras are infrared cameras.
10. The binocular stereo vision eye movement device in VR/AR according to claim 7, characterized in that: it further comprises a lens cup, the lens cup comprising a lens cup holder and a lens, and the lens cup holder comprising a holder base and a holder side wall.
CN201710065321.3A 2017-02-06 2017-02-06 Binocular stereo vision eye movement analysis method and device in a kind of VR/AR Withdrawn CN108399001A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710065321.3A CN108399001A (en) 2017-02-06 2017-02-06 Binocular stereo vision eye movement analysis method and device in a kind of VR/AR

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710065321.3A CN108399001A (en) 2017-02-06 2017-02-06 Binocular stereo vision eye movement analysis method and device in a kind of VR/AR

Publications (1)

Publication Number Publication Date
CN108399001A true CN108399001A (en) 2018-08-14

Family

ID=63093596

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710065321.3A Withdrawn CN108399001A (en) 2017-02-06 2017-02-06 Binocular stereo vision eye movement analysis method and device in a kind of VR/AR

Country Status (1)

Country Link
CN (1) CN108399001A (en)

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109240497A (en) * 2018-08-28 2019-01-18 Qingdao Research Institute of Beihang University Automatic calibration method for eye tracking in a virtual reality scene
CN109240497B (en) * 2018-08-28 2021-07-13 Qingdao Research Institute of Beihang University Automatic calibration method for eye tracking in a virtual reality scene
CN112213859A (en) * 2020-10-12 2021-01-12 歌尔科技有限公司 Head-mounted display device imaging method and head-mounted display device
US11892641B2 (en) 2020-10-12 2024-02-06 Goertek, Inc. Imaging method for a head-mounted display and head-mounted display
CN112587083A (en) * 2020-12-07 2021-04-02 东莞市东全智能科技有限公司 Visual processing method, device and computer storage medium
CN112587083B (en) * 2020-12-07 2023-08-15 东莞市东全智能科技有限公司 Visual processing method, device and computer storage medium
CN112613389A (en) * 2020-12-18 2021-04-06 上海影创信息科技有限公司 Eye gesture control method and system and VR glasses thereof
CN112807200A (en) * 2021-01-08 2021-05-18 上海青研科技有限公司 Strabismus training equipment
CN112807200B (en) * 2021-01-08 2022-07-19 上海青研科技有限公司 Strabismus training equipment
WO2023184109A1 (en) * 2022-03-28 2023-10-05 京东方科技集团股份有限公司 Eyeball tracking apparatus and method, and display device

Similar Documents

Publication Publication Date Title
CN108399001A (en) Binocular stereo vision eye movement analysis method and device in VR/AR
CN206505382U (en) Binocular stereo vision eye movement device in VR/AR
CN103458770B (en) Optical measuring device and method for capturing at least one parameter of at least one eye, with adjustable illumination characteristics
US7025459B2 (en) Ocular fundus auto imager
CN106339087B (en) Eyeball tracking method and device based on multidimensional coordinates
US6296358B1 (en) Ocular fundus auto imager
US4993826A (en) Topography measuring apparatus
US7309128B2 (en) Automated stereocampimeter and related method for improved measurement of the visual field
US5873832A (en) Method and apparatus for measuring properties of the eye using a virtual image
US6193371B1 (en) Keratometer/pachymeter
US6659611B2 (en) System and method for eye gaze tracking using corneal image mapping
US5220361A (en) Gaze tracking for field analyzer
US4902123A (en) Topography measuring apparatus
CN102596005B (en) Method and device for automatically measuring at least one refractive characteristic of both eyes of person
US4998819A (en) Topography measuring apparatus
US20110170061A1 (en) Gaze Point Tracking Using Polarized Light
US20190269323A1 (en) Ocular Fundus Imaging Systems Devices and Methods
CN106132284A (en) Optical eye tracking
CN106663183A (en) Eye tracking and user reaction detection
CN104427924B (en) Apparatus and method for measuring at least one objective refraction feature of a patient's eye over a plurality of viewing distances
CN106963335A (en) Subjective optometry apparatus
US7360895B2 (en) Simplified ocular fundus auto imager
AU2015284130B2 (en) System and method for corneal topography with flat panel display
US6860602B2 (en) Apparatus for examining an anterior-segment of an eye
CN109964230A (en) Method and apparatus for eyes measurement acquisition

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WW01 Invention patent application withdrawn after publication (application publication date: 20180814)