CN206505382U - Binocular stereo vision eye-tracking device for VR/AR - Google Patents

Binocular stereo vision eye-tracking device for VR/AR

Info

Publication number
CN206505382U
CN206505382U (Application CN201720107953.7U)
Authority
CN
China
Prior art keywords
user
stereo vision
binocular stereo
eye
cornea
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201720107953.7U
Other languages
Chinese (zh)
Inventor
傅之成
杜煜
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Green Technology Co Ltd
Original Assignee
Shanghai Green Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Green Technology Co Ltd filed Critical Shanghai Green Technology Co Ltd
Priority to CN201720107953.7U priority Critical patent/CN206505382U/en
Application granted granted Critical
Publication of CN206505382U publication Critical patent/CN206505382U/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Abstract

The utility model discloses a binocular stereo vision eye-tracking device for VR/AR. The device includes an eye-tracking module for performing eye tracking; the eye-tracking module includes, for each eye, 2 cameras and several light sources at fixed positions, and the fixed-position light sources produce at least 2 corneal reflections on the cornea. For a head-mounted VR or AR device, only one kappa angle needs to be calibrated per user, which overcomes the error caused by the headset shifting on the head each time it is worn and avoids repeated, cumbersome calibration procedures. The method uses an effective light-source group that forms multiple corneal reflections on the user's cornea and automatically screens reliable corneal reflections for eye-movement analysis. When the user wears the head-mounted VR or AR device, the full range of gaze points (the full viewing angle) is covered, so gaze-point detection remains smooth during saccades, the captured eye images have uniform brightness, and subsequent image processing is convenient.

Description

Binocular stereo vision eye-tracking device for VR/AR
Technical field
The utility model belongs to the field of eye tracking, and in particular relates to a binocular stereo vision eye-tracking device for VR/AR.
Background technology
The kappa angle is the angle between the optical axis and the visual axis of the eyeball. When the corneal light reflex deviates toward the nasal side, the kappa angle is positive and the user's eye appears exotropic; when it deviates toward the temporal side, the kappa angle is negative and the user's eye appears esotropic. A normal person's kappa angle is a positive angle within 0-5°; a negative kappa angle, or a positive kappa angle greater than 5°, is pathological and is easily misdiagnosed as strabismus.
Head-mounted virtual reality devices (also known as VR glasses or VR helmets) and head-mounted augmented reality devices (also known as AR glasses or AR helmets) are virtual reality and augmented reality products that are currently developing rapidly and becoming popular. Eye-tracking technology extracts eyeball feature points based on image processing and calculates and records, in real time, where the eyes are looking.
In the prior art, head-mounted virtual reality or augmented reality devices are mainly based on the dark-pupil technique, using the corneal reflection as a reference point to compute the pupil-corneal-reflection vector. This has the following problems:
First, the number of light sources is small. When the user's eye rotates through a large angle and the pupil deflects substantially, the corneal reflection of the infrared light source may fall outside the pupil region and cannot be captured by the infrared camera, so the covered viewing angle is small and certain viewing angles cannot be detected.
Second, the infrared light sources are few or unevenly distributed and cannot illuminate the eye uniformly, so the eye images captured by the infrared camera have non-uniform brightness and poor quality, which degrades the accuracy of the eye-tracking data.
Third, the user's normal kappa angle is not taken into account, which causes detection errors.
Fourth, each time the user puts on the head-mounted VR device, the device moves relative to the user's eyes to a varying degree, so the eyes must be recalibrated each time the same user reuses the device, which is inconvenient.
Utility model content
The utility model provides a binocular stereo vision eye-tracking device for VR/AR to solve the above technical problems.
A binocular stereo vision eye-tracking device for VR/AR is characterized by: including an eye-tracking module for performing eye tracking, the eye-tracking module including, for each eye, 2 cameras and several light sources at fixed positions;
the several fixed-position light sources produce at least 2 corneal reflections on the cornea.
Preferably, there are 8 fixed-position light sources.
Preferably, the 2 cameras are arranged below the 8 light sources, facing the eye.
Preferably, the light sources are infrared light sources.
Preferably, the cameras are infrared cameras.
Preferably, an optical filter is also arranged on the lens of the camera.
Preferably, the device also includes a lens cup; the lens cup includes a lens-cup support and a lens, and the lens-cup support includes a support base and a support side wall.
Preferably, the infrared camera is arranged on the outside of the lens-cup support, the support base or the support side wall.
Preferably, the infrared camera is arranged on the outer surface of the support side wall.
Preferably, the infrared cameras are arranged symmetrically with respect to the lens cup or the infrared light sources.
The utility model has the following beneficial effects. By providing an effective light-source group and a kappa-angle detection method, only one kappa angle needs to be calibrated per user, which overcomes the error caused by movement of the head-mounted virtual reality or augmented reality device relative to the user's head, without any additional calibration device. By using an effective infrared light-source group, a reasonable number of corneal reflections can be selected on the cornea to detect the gaze point; when the user wears the head-mounted virtual reality or augmented reality device, the full range of gaze points (the full viewing angle) is covered, so gaze-point detection remains smooth during saccades, the eye images captured by the infrared camera have uniform brightness, and subsequent image processing is convenient.
Brief description of the drawings
Fig. 1 is an eye image captured when the user looks straight ahead.
Fig. 2a-2h are eye images of the user at different viewing angles.
Fig. 3 is a schematic diagram of the calibration eye image of Fig. 1.
Fig. 4 is a schematic diagram of Fig. 2a.
Fig. 5 is a front view of the binocular stereo vision eye-tracking device for VR/AR in one embodiment of the utility model.
Fig. 6 is a sectional view of the device in Fig. 5 along A-A'.
Fig. 7 is a schematic diagram of the device in Fig. 6 in operation.
Fig. 8 is a schematic workflow diagram of the utility model.
Fig. 9 is a schematic diagram of the corneal reflections of 2 selected effective light sources and of the kappa angle.
Fig. 10 is a schematic diagram of how a single camera and a single light source determine a plane through the corneal sphere center and the camera optical center.
Fig. 11 is a schematic diagram of how a single camera and two light sources determine the line through the corneal sphere center and the camera optical center.
Fig. 12 is a schematic diagram of the refracted pupil virtual image and the determination of the pupil center.
Embodiment
Preferred embodiments of the utility model are given below with reference to the accompanying drawings to describe the technical solution of the utility model in detail.
Because a kappa angle in the range of 0-5° is a normal phenomenon in normal persons, it must be taken into account. With reference to Fig. 12, when the eye 400 watches a target object (or calibration point) 600, the optical axis 502 of the eye 400 is the straight line through the pupil center 402, the corneal sphere center 4010 of the cornea 401 (the cornea 401 approximates part of a sphere, and the corneal sphere center 4010 is the center of that sphere) and the rotation center 403 of the eye 400; the rotation center 403 is generally not detected, and the line through the pupil center 402 and the corneal sphere center 4010 is sufficient to determine the optical axis 502 of the eye 400. The visual axis 501 of the eye 400 is the straight line through the corneal sphere center 4010 and the center of the retinal fovea 404. The angle between the visual axis 501 and the optical axis 502 is the kappa angle of the eye 400. The kappa angles of a person's left and right eyes are generally considered to be close. The prior art typically detects the pupil center position represented by the optical axis 502 for eye tracking, but the optical axis 502 obviously cannot accurately point at the target 600 the user is watching, so when the kappa angle that may exist in the user's eye is not considered, the eye-tracking result is inaccurate. Moreover, because the kappa angle lies within a certain range, if the prior art needs to determine the user's gaze region accurately, calibration is required every time the user reuses the device, and the calibration is typically carried out by fixating calibration points.
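To make the geometry above concrete, the following is a minimal numerical sketch (an editorial addition, not part of the utility model's text), assuming 3-D coordinates for the corneal sphere center 4010, the pupil center 402 and the calibration point 600 are already available in one common frame; all names and coordinate values are illustrative.

import numpy as np

def kappa_angle_deg(cornea_center, pupil_center, calibration_point):
    # Optical axis 502: corneal sphere center -> pupil center.
    # Visual axis 501: corneal sphere center -> calibration point.
    optical_axis = pupil_center - cornea_center
    visual_axis = calibration_point - cornea_center
    cos_kappa = np.dot(optical_axis, visual_axis) / (
        np.linalg.norm(optical_axis) * np.linalg.norm(visual_axis))
    return np.degrees(np.arccos(np.clip(cos_kappa, -1.0, 1.0)))

# Made-up coordinates in millimetres (device frame):
C = np.array([0.0, 0.0, 30.0])    # corneal sphere center 4010
P = np.array([0.0, 0.2, 25.5])    # pupil center 402
T = np.array([5.0, 0.0, -40.0])   # calibration point 600 on the display
print(kappa_angle_deg(C, P, T))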
Measuring the kappa angle requires at least two corneal reflections. The utility model therefore introduces a binocular stereo vision eye-tracking device for VR/AR that can capture at least two corneal reflections at every viewing angle. The device includes an eye-tracking module for performing eye tracking; the eye-tracking module includes 2 cameras and several light sources at fixed positions, and the light sources produce at least 2 corneal reflections on the cornea.
In another embodiment, the fixed positions of the 8 infrared light sources 106 can be arranged in various ways, in a uniform or approximately uniform distribution, so that the camera 102 can capture at least two corneal reflections at every viewing angle.
Preferably, for ease of implementation, the light sources in this embodiment use relatively regular fixed positions. With reference to Fig. 5, the 8 light sources 106 are distributed uniformly or approximately uniformly around the eye center when the user looks straight ahead, so that the camera 102 can capture at least two corneal reflections at every viewing angle; the 2 cameras 102 are arranged below the 8 light sources, facing the eye. Fig. 7 is a schematic diagram of the camera 102 in Fig. 6 in operation; the display screen 200 of the head-mounted virtual reality or augmented reality device shows calibration points, objects or images.
Fig. 1 is the eye image captured when the user looks straight ahead with the 8 light sources 106 distributed uniformly or approximately uniformly around the eye center; roughly 8 corneal reflections are visible on the cornea 401. Fig. 3 is a schematic diagram of Fig. 1: starting from the 12 o'clock direction and going clockwise, the 8 light sources are numbered 1061, 1062, 1063, 1064, 1065, 1066, 1067 and 1068, and the corneal reflections corresponding to the 8 light sources are, in order, corneal reflection 1 (11), corneal reflection 2 (12), corneal reflection 3 (13), corneal reflection 4 (14), corneal reflection 5 (15), corneal reflection 6 (16), corneal reflection 7 (17) and corneal reflection 8 (18). Fig. 4 is a schematic diagram of Fig. 2a; at that viewing angle the corneal reflections 17, 18, 11 and 12 of light sources 1067, 1068, 1061 and 1062 are clearly visible and detectable on the cornea.
Preferably, the light sources are infrared light sources and the cameras are infrared cameras.
Preferably, the device also includes a lens cup; the lens cup includes a lens 105 and a lens support. As shown in Figs. 5 and 6, the lens support includes a support side wall 103 and a support base 104. The lens support may be axisymmetric about the axis of the lens 105 or asymmetric, provided that the working portion of the lens 105 is fully exposed; that is, the support base 104 adopts a hollow design, such as an annular or square-hole design, so that the base 104 fastens the lens 105 while exposing its working portion. The lens 105 is mounted on the inner side of the lens support base 104. The device is part of a head-mounted virtual reality or augmented reality device; because the device is head-mounted, the user's eye 400 is illuminated by the 8 fixed-position infrared light sources 106, and the infrared camera 102 captures the 8 corneal reflections of the 8 infrared light sources 106 on the cornea (roughly covering the region of the iris 2 and the pupil 3): corneal reflection 1 (11), corneal reflection 2 (12), corneal reflection 3 (13), corneal reflection 4 (14), corneal reflection 5 (15), corneal reflection 6 (16), corneal reflection 7 (17) and corneal reflection 8 (18). Since the head-mounted virtual reality device is stationary relative to the head, i.e. the 8 fixed-position infrared light sources 106 are stationary relative to the user's head, the positions of the above corneal reflections relative to the eyeball center remain fixed.
Preferably, the 8 infrared light sources 106 are arranged on the outside of the support base 104, i.e. on the surface of the support base 104 that holds the lens 105.
Preferably, the infrared light source 106 is an infrared LED; to allow the infrared camera 102 to capture clear eye images, an infrared LED with a wavelength of 940 nm is preferably selected.
Preferably, because the images captured by the camera during operation are easily disturbed by the visible light emitted by the display screen, an optical filter is placed in front of the camera lens to solve this problem.
It should be noted that the lens 105 can be of various types; for example, the lens 105 can be a symmetric or asymmetric plano-convex or biconvex lens, or a concave-convex (meniscus) lens. The utility model does not limit the type of the lens 105.
It should be noted that the inventors designed the 8 infrared light sources so that at least 2 corneal reflections can be detected on the cornea, keeping gaze-point detection smooth during saccades, and so that the eye is illuminated uniformly, giving the eye images received by the infrared camera 102 uniform brightness; this makes the corneal reflections on the cornea easier to identify and simplifies subsequent processing.
The above binocular stereo vision eye-tracking devices for VR/AR ensure that, at any viewing angle, at least the two corneal reflections needed to measure the corneal sphere center appear on the cornea.
It should be noted that in this embodiment geometrically regular positions on the display screen are used as calibration points; the benefit is that the tangent plane at the corneal center is nearly parallel to the display screen, so the measurable area during calibration is maximal. In other embodiments, a specific position or an arbitrary position on the display screen can also be used as the calibration point; the utility model is not limited in this respect.
To detect the kappa angle and use it to determine the user's gaze region, the device adopts the following two-step working method (a schematic code outline of this flow is given after the steps):
Step 1: calibrate the kappa angle: the eye fixates a calibration point, and the 2 cameras assigned to each of the left and right eyes, together with at least two of the uniformly distributed light sources, determine the kappa angle of at least one eye;
Step 2: determine the user's gaze region: determine the optical axes of the left and right eyes in real time; determine the visual-axis regions of the left and right eyes from the kappa angle and the optical axes of the left and right eyes; and determine the user's gaze region from the visual-axis regions of the left and right eyes.
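The following outline (an editorial sketch only; both function names and signatures are hypothetical placeholders, not part of the utility model) summarizes the two-phase flow: a one-time calibration per user, then a per-frame gaze computation.

def calibrate_kappa(calibration_point, cameras, light_sources):
    # Step 1, once per user: while the eye fixates the calibration point,
    # reconstruct the corneal sphere center and the optical axis from the
    # cameras and corneal reflections, and return the angle between the
    # optical axis and the visual axis (cornea center -> calibration point).
    ...

def gaze_region(kappa_left_deg, kappa_right_deg, frame_left, frame_right, screen):
    # Step 2, every frame: reconstruct each eye's optical axis in real time,
    # widen it into a visual-axis cone of apex angle 2 * kappa, project both
    # cones onto the display screen, and fuse the two regions into one gaze region.
    ...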
Step 1 is described below.
Calibrate the kappa angle: the eye fixates a calibration point, and the 2 cameras assigned to each of the left and right eyes, together with at least two of the uniformly distributed light sources, determine the kappa angle of at least one eye.
The key point of this step is to accurately determine the kappa angle of the user's eye.
As introduced above, the visual axis and the optical axis must be determined. The visual axis is the space vector determined by the line through the corneal sphere center and the fovea, and the fovea is difficult to obtain in an ordinary head-mounted virtual reality or augmented reality device; the utility model therefore takes the space vector determined by the line through the calibration point and the corneal sphere center as the visual axis of the user's eye. Because the calibration point is a fixation target during calibration and its position is known accurately, the visual axis determined this way is accurate. With reference to Fig. 8, the determination of the corneal sphere center of the eye is introduced below.
The utility model obtains the spatial position of the corneal sphere center by the principle of specular reflection from a sphere. With reference to Fig. 10, take one camera (with optical center O) and one light source L. By the law of reflection, the incident ray, the reflected ray and the normal lie in the same plane. Thus, when the point light source is reflected at the corneal surface, the light source L, the corneal sphere center C (4010), the camera optical center O and the image G of the light source L on the camera sensor are coplanar.
With reference to Fig. 10, the ray 106A emitted by the light source L is reflected by the corneal surface and forms the image point G on the camera sensor. Because the light source L, the corneal sphere center C, the camera optical center O and the image G lie on the incident ray 106A, the normal and the reflected ray 106B respectively, these four points lie together in one plane ΠL. The spatial position of the plane ΠL can be computed from the camera optical center O, the image point G and the spatial position of the light source L, where the position of the light source L is known from prior calibration and the image point G is determined by the corneal reflection. Therefore, one camera and one light source that produces a corneal reflection determine one plane through the corneal sphere center.
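As an editorial sketch of the plane construction just described (assuming the light-source position L is known from prior calibration and the image point G of the corneal reflection has been back-projected to a 3-D point in the same frame as the camera optical center O; the function name and arguments are illustrative), the plane ΠL is described by O and the normal of the span of the rays O→G and O→L, and the corneal sphere center C must satisfy n·(C - O) = 0.

import numpy as np

def glint_plane_normal(camera_center_O, glint_point_G, light_source_L):
    # Plane Π_L through O, G and L; it also contains the corneal sphere center C.
    n = np.cross(glint_point_G - camera_center_O,
                 light_source_L - camera_center_O)
    return n / np.linalg.norm(n)   # unit normal; plane is {X : n·(X - O) = 0}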
It follows that, with reference to Fig. 11, with 2 effective light sources L1 and L2 and one camera, the detection process yields 2 planes Π1 and Π2 through the corneal sphere center C, where an effective light source is one whose corneal reflection the camera can capture on the cornea, so that the image points G1 and G2 can be determined. Because the planes Π1 and Π2 both pass through the optical center O of this camera and the corneal sphere center C, their line of intersection is CO; that is, only a straight line through the corneal sphere center C can be determined, not the spatial position of the corneal sphere center.
Therefore, the utility model uses 2 cameras, which yield two straight lines through the corneal sphere center C, one through the optical center of each camera; the intersection point of the two lines is the corneal sphere center C, from which the corneal sphere center C of the user's eye is computed.
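In practice the two lines obtained from the two cameras rarely intersect exactly because of measurement noise, so an implementation would typically take the midpoint of their closest approach as the corneal sphere center. Below is a minimal editorial sketch under that assumption; the inputs (a point o and a direction d for each line, in a shared frame) and the function name are illustrative, not taken from the patent, and the two directions are assumed not to be parallel.

import numpy as np

def cornea_center_from_two_lines(o1, d1, o2, d2):
    # Each line o + t*d passes (approximately) through the corneal sphere center C.
    d1 = d1 / np.linalg.norm(d1)
    d2 = d2 / np.linalg.norm(d2)
    # Least-squares parameters t, s of the closest points on the two lines.
    A = np.array([[d1 @ d1, -d1 @ d2],
                  [d1 @ d2, -d2 @ d2]])
    b = np.array([(o2 - o1) @ d1, (o2 - o1) @ d2])
    t, s = np.linalg.solve(A, b)
    return 0.5 * ((o1 + t * d1) + (o2 + s * d2))   # estimate of C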
However, when the user uses a head-mounted virtual reality or augmented reality device, the viewing angle varies over a large range, as shown in Figs. 2a-2h; it is therefore crucial that the above at least two corneal reflections can be captured at every viewing angle.
With reference to Fig. 9, the line through the calibration point 600 and the determined corneal sphere center 4010 is taken as the visual axis 501.
With reference to Figs. 8 and 9, because the optical axis 502 passes through the corneal sphere center 4010 and the pupil center 402, once the corneal sphere center 4010 has been determined, only the pupil center 402 remains to be found to determine the optical axis. A method for determining the pupil center 402 is introduced below.
After the corneal sphere center is obtained, the spatial position of the optical axis of the human eye is reconstructed from the pupil images captured by the two cameras 102. Because of refraction at the corneal surface, the pupil position reconstructed from the two cameras is not the true spatial position of the pupil; that is, the pupil virtual image formed by refraction differs from the real pupil. Nevertheless, the centers of the two lie in the same plane as the camera optical center O and the corneal sphere center C.
With reference to Fig. 12, P and P' denote the pupil center and the center of the virtual image formed when the pupil is refracted through the cornea, respectively. By the law of refraction, the refracted virtual image, the original object and the refraction normal lie in the same plane. The same holds for the center of the pupil: the pupil center P, the pupil virtual-image center P' and the refraction normal CO in the figure lie in the same plane. From the line CO and the spatial position of the pupil virtual-image center P', a plane through the pupil center P is obtained; this relation can be expressed by the following formula:
OP · (e_OC × e_OP') = 0
where OP is the vector from the camera optical center O to the pupil center P, and e_OC and e_OP' are the unit vectors from O toward the corneal sphere center C and toward the pupil virtual-image center P', respectively.
When the 2 cameras capture the pupil virtual image simultaneously, the plane through one camera's optical center, the pupil virtual-image center it observes and the corneal sphere center, and the corresponding plane for the other camera, both contain the true pupil center; their line of intersection is the straight line through the pupil center and the corneal sphere center, i.e. the space vector of the optical axis of the eye.
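A minimal editorial sketch of this two-plane intersection, assuming the corneal sphere center C has been obtained as above and the pupil virtual-image center seen by each camera has been back-projected to a 3-D point (P1v, P2v) along that camera's viewing ray; the symbols are illustrative, not from the patent.

import numpy as np

def optical_axis_direction(C, O1, P1v, O2, P2v):
    # Plane 1 through camera center O1, pupil virtual-image center P1v and C;
    # plane 2 likewise for the second camera. Both contain the true pupil
    # center, so their line of intersection is the optical axis (through C and P).
    n1 = np.cross(P1v - O1, C - O1)
    n2 = np.cross(P2v - O2, C - O2)
    d = np.cross(n1, n2)
    # The sign of d (toward the scene or toward the eye) must be fixed by convention.
    return d / np.linalg.norm(d)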
The angle between the optical axis 502 and the visual axis 501 is then the kappa angle of the user's eye.
It should be noted that the utility model can determine the kappa angle of one eye during calibration, or determine the respective kappa angles of both eyes. When the respective kappa angles of both eyes are determined, in Step 2 each eye's kappa angle can be combined with that eye's optical axis to determine that eye's visual-axis range, and the user's gaze region can be determined from that eye's visual-axis range, i.e. the gaze region is determined monocularly; the visual-axis ranges of both eyes can also be used jointly to determine the user's gaze region. The utility model is not limited in this respect.
Step 2 is described below.
In use: determine the optical axes of the left and right eyes in real time; determine the visual-axis regions of the left and right eyes from the kappa angle and the optical axes of the left and right eyes; and determine the region the user is gazing at from the visual-axis regions of the left and right eyes.
The key point of this step is to determine the optical axes of the left and right eyes in real time.
As stated above, the optical axis passes through the corneal sphere center and the pupil center.
With reference to Fig. 8, the 2 cameras capture images of the left and right eyes containing at least two corneal reflections; the 2 cameras and the at least two corneal reflections determine at least four planes, each camera's planes intersect in a line, and the intersection point of the two cameras' lines is the corneal sphere center. The method for determining the corneal sphere center is the same as in Step 1 and is not repeated here.
The pupil centers of the left and right eyes are determined from the eye images captured by the 2 cameras. The method for determining the pupil center is the same as in Step 1 and is not repeated here.
Preferably, after at least two corneal reflections are obtained, the 2 corneal reflections produced by two adjacent light sources are selected; each of the 2 cameras, together with these 2 corneal reflections, determines 2 planes whose line of intersection passes through the corneal sphere center, and the intersection point of the two cameras' lines is the corneal sphere center.
The line through the corneal sphere center and the pupil center is the optical axis of the left or right eye.
The visual-axis regions of the left and right eyes are determined from the kappa angle obtained in Step 1 and the optical axes of the left and right eyes. The visual-axis region of each eye is the extension to the display screen of a cone-like region whose axis is that eye's optical axis, whose apex is the corneal sphere center, and whose apex angle is twice the kappa angle.
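As an editorial approximation of this cone-to-screen mapping: for a screen roughly perpendicular to the optical axis, the footprint of the cone can be treated as a circle centered where the optical axis meets the screen, with radius equal to the distance to the screen times tan(kappa). The exact footprint of an oblique cone is an ellipse; the function and its inputs below are illustrative only.

import numpy as np

def visual_axis_region_on_screen(C, optical_axis, kappa_deg,
                                 screen_point, screen_normal):
    # Intersect the optical axis (from the corneal sphere center C) with the
    # screen plane, then approximate the visual-axis cone footprint as a circle.
    d = optical_axis / np.linalg.norm(optical_axis)
    n = screen_normal / np.linalg.norm(screen_normal)
    t = np.dot(screen_point - C, n) / np.dot(d, n)   # ray/plane intersection
    center = C + t * d
    radius = abs(t) * np.tan(np.radians(kappa_deg))
    return center, radius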
Preferably, the display-screen area corresponding to the visual-axis region of the left eye is taken as the user's gaze region, or the display-screen area corresponding to the visual-axis region of the right eye is taken as the user's gaze region; that is, the gaze region of the user is determined from a single eye's visual-axis region.
Considering that the visual axis 501 deviates from the optical axis 502 by the kappa angle and that this deviation has a definite range, the pupil center 402 and the corneal sphere center 4010 can be determined accurately, and hence the optical axis 502 can be determined accurately; from the optical axis 502 and the kappa angle, a range in which the visual axis 501 lies is determined. According to the principle of binocular stereo vision, the overlapping region between the visual-axis ranges of the two eyes, or another effective region, can therefore accurately determine the user's gaze region. The inventors found that this gaze region effectively excludes the error caused by movement of the head-mounted VR or AR device relative to the user's head; this binocular stereo vision eye-movement analysis method, which determines the visual axis using the kappa angle, is self-calibrating and excludes the above error. Thus only one kappa angle needs to be calibrated per user, which overcomes the error caused by movement of the head-mounted device relative to the user's head, without any additional calibration device. Preferably, the user's gaze region is determined from the visual-axis regions of the left and right eyes. According to the characteristics of the visual-axis regions of the left and right eyes, there are two possibilities:
One possibility is that the visual-axis regions of the left and right eyes have an intersecting region; when non-pathological eyes have normal kappa angles there should always be an intersection, and the intersecting region is the region the user is gazing at.
The other possibility is that the visual-axis regions of the left and right eyes do not intersect; the region the user is gazing at is then the midpoint position between the visual-axis regions of the left and right eyes. In either case, determining the visual axis using the kappa angle makes this binocular stereo vision eye-movement analysis method self-calibrating and excludes the above error, so only one kappa angle needs to be calibrated per user, without any additional calibration device.
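The two possibilities above can be summarized in a toy fusion rule (an editorial sketch, assuming each eye's visual-axis region has already been reduced to a circle with a center and radius on the screen plane): if the two circles overlap, the gaze region is their intersection, crudely represented here by the midpoint of the two centers; if they do not overlap, the midpoint between the two regions is used, as in the second possibility.

import numpy as np

def fuse_binocular_gaze(center_left, radius_left, center_right, radius_right):
    # True if the two per-eye circular regions intersect on the screen plane.
    overlaps = np.linalg.norm(center_left - center_right) <= (radius_left + radius_right)
    # Midpoint of the two centers: a crude representative of the overlap when
    # the regions intersect, and the fallback point when they do not.
    gaze_point = 0.5 * (center_left + center_right)
    return gaze_point, overlaps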
The above is only a preferred embodiment of the utility model, but the scope of protection of the utility model is not limited thereto. Any change or replacement that a person skilled in the art can readily conceive within the technical scope disclosed by the utility model shall be covered by the scope of protection of the utility model. Therefore, the scope of protection of the utility model shall be defined by the claims.

Claims (10)

1. A binocular stereo vision eye-tracking device for VR/AR, characterized in that: it includes an eye-tracking module for performing eye tracking, the eye-tracking module including, for each eye, 2 cameras and several light sources at fixed positions;
the several fixed-position light sources produce at least 2 corneal reflections on the cornea.
2. The binocular stereo vision eye-tracking device for VR/AR according to claim 1, characterized in that: there are 8 fixed-position light sources.
3. The binocular stereo vision eye-tracking device for VR/AR according to claim 2, characterized in that: the 2 cameras are arranged below the 8 light sources, facing the eye.
4. The binocular stereo vision eye-tracking device for VR/AR according to any one of claims 1 to 3, characterized in that: the light sources are infrared light sources.
5. The binocular stereo vision eye-tracking device for VR/AR according to claim 1, characterized in that: the camera is an infrared camera.
6. The binocular stereo vision eye-tracking device for VR/AR according to claim 5, characterized in that: an optical filter is also arranged on the lens of the camera.
7. The binocular stereo vision eye-tracking device for VR/AR according to claim 5, characterized in that: it also includes a lens cup, the lens cup including a lens-cup support and a lens, the lens-cup support including a support base and a support side wall.
8. The binocular stereo vision eye-tracking device for VR/AR according to claim 7, characterized in that: the infrared camera is arranged on the outside of the lens-cup support, the support base or the support side wall.
9. The binocular stereo vision eye-tracking device for VR/AR according to claim 8, characterized in that: the infrared camera is arranged on the outer surface of the support side wall.
10. The binocular stereo vision eye-tracking device for VR/AR according to claim 7, characterized in that: the infrared camera is arranged symmetrically with respect to the lens cup or the infrared light sources.
CN201720107953.7U 2017-02-06 2017-02-06 Binocular stereo vision eye-tracking device for VR/AR Active CN206505382U (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201720107953.7U CN206505382U (en) 2017-02-06 2017-02-06 Binocular stereo vision eye-tracking device for VR/AR

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201720107953.7U CN206505382U (en) 2017-02-06 2017-02-06 Binocular stereo vision eye-tracking device for VR/AR

Publications (1)

Publication Number Publication Date
CN206505382U true CN206505382U (en) 2017-09-19

Family

ID=59834072

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201720107953.7U Active CN206505382U (en) 2017-02-06 2017-02-06 Binocular stereo vision eye-tracking device for VR/AR

Country Status (1)

Country Link
CN (1) CN206505382U (en)

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108205374A (en) * 2018-01-02 2018-06-26 京东方科技集团股份有限公司 Eye-tracking module and method for video glasses, and video glasses
CN108205374B (en) * 2018-01-02 2020-07-28 京东方科技集团股份有限公司 Eyeball tracking module and method of video glasses and video glasses
US11079839B2 (en) 2018-01-02 2021-08-03 Beijing Boe Optoelectronics Technology Co., Ltd. Eye tracking device and eye tracking method applied to video glasses and video glasses
WO2019237772A1 (en) * 2018-06-14 2019-12-19 北京七鑫易维信息技术有限公司 Light source control method and apparatus
CN109963143A (en) * 2019-02-01 2019-07-02 谷东科技有限公司 Image acquisition method and system for AR glasses
CN113208884A (en) * 2021-01-08 2021-08-06 上海青研科技有限公司 Visual detection and visual training equipment
CN113080844A (en) * 2021-03-31 2021-07-09 上海青研科技有限公司 Visual detection and visual training device for optimizing retina area
CN113080844B (en) * 2021-03-31 2024-01-09 上海青研科技有限公司 Visual inspection and visual training device for preferential retina areas

Similar Documents

Publication Publication Date Title
CN206505382U (en) Binocular stereo vision eye-tracking device for VR/AR
US10416763B2 (en) Eye tracking and user reaction detection
CN108399001A (en) Binocular stereo vision eye-movement analysis method and device for VR/AR
CN103458770B (en) Optical measuring device and method for capturing at least one parameter of at least one eye with adjustable illumination characteristics
CN106132284B (en) Optical eye-movement tracking
CN106339087B (en) Eye-tracking method and device based on multidimensional coordinates
JP6106684B2 (en) System and method for high resolution gaze detection
US6659611B2 (en) System and method for eye gaze tracking using corneal image mapping
US7025459B2 (en) Ocular fundus auto imager
Schaeffel Kappa and Hirschberg ratio measured with an automated video gaze tracker
JP2019512726A (en) Corneal sphere tracking to generate an eye model
CN108351514A Eye tracking using structured light
EP1138254A1 (en) Keratometer/pachymeter
CA2263249A1 (en) Method and apparatus for measuring properties of the eye using a virtual image
AU2004325004A2 (en) Method for designing spectacle lenses taking into account an individual's head and eye movement
EP3164056B1 (en) System and method for corneal topography with flat panel display
CN110226110A (en) Fresnel lens with the dynamic draft for reducing optical artifacts
US10492680B2 (en) System and method for corneal topography with flat panel display
WO2018164104A1 (en) Eye image processing device
CN109964230A (en) Method and apparatus for eyes measurement acquisition
JPH02264632A (en) Sight line detector

Legal Events

Date Code Title Description
GR01 Patent grant
GR01 Patent grant