CN111885367A - Display device and application method - Google Patents

Display device and application method

Info

Publication number
CN111885367A
Authority
CN
China
Prior art keywords
eye
dimensional space
center
point
eyeball
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010701510.7A
Other languages
Chinese (zh)
Inventor
杜煜
胡飞扬
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Qingyan Technology Co ltd
Original Assignee
Shanghai Qingyan Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Qingyan Technology Co ltd filed Critical Shanghai Qingyan Technology Co ltd
Priority to CN202010701510.7A
Publication of CN111885367A
Legal status: Pending

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30 Image reproducers
    • H04N13/385 Image reproducers alternating rapidly the location of the left-right image components on the display screens
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30 Image reproducers
    • H04N13/327 Calibration thereof
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30 Image reproducers
    • H04N13/366 Image reproducers using viewer tracking
    • H04N13/383 Image reproducers using viewer tracking for tracking with gaze detection, i.e. detecting the lines of sight of the viewer's eyes

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)

Abstract

The invention relates to a display device and an application method of the display device. The device comprises an eyeball tracking module that matches the eyeball center positions to the optimal viewpoints of the naked eye 3D display device by computing the three-dimensional positions of the left-eye and right-eye eyeball centers. Because the eyeball center does not move when the eyeball rotates, the left and right eyes remain at the optimal viewpoints when the viewer looks at different areas of the device, which effectively prevents crosstalk.

Description

Display device and application method
Technical Field
The present invention relates to a display device and an application method of the display device.
Background
The basic principle of naked eye 3D display technology is to divide the image shown by the display screen into two parts, a left-eye image and a right-eye image, and to use a slit grating, lenticular lens, directional light source, or similar element so that the left eye sees only the left-eye image and the right eye sees only the right-eye image.
Naked eye 3D displays of different manufacturers and models are generally designed with optimal viewpoints: when the left and right eyes are each at their optimal viewpoint, the left eye sees only the left-eye image and the right eye sees only the right-eye image, giving the ideal visual effect. If the eyes are not at the optimal viewpoints, crosstalk occurs: the left eye sees part of the right-eye image, or the right eye sees part of the left-eye image, producing ghosting and other poor viewing effects.
To match the designed optimal viewpoints to the positions of the eyes, some naked eye 3D schemes use eyeball tracking: image processing computes the three-dimensional position of the pupil or pupil center and matches it to the designed optimal viewpoint to prevent crosstalk. However, when the user looks at different positions on the naked eye 3D display, the pupil moves within a certain range, so during this movement the pupil center is sometimes at the optimal viewpoint and sometimes not, and the best crosstalk prevention cannot be achieved.
Disclosure of Invention
The technical problem to be solved by the invention is as follows: the left and right eye positions of the person do not match the optimal viewpoint for the naked eye 3D display.
In order to solve the above-mentioned technical problem, an aspect of the present invention provides a display device, comprising,
the naked eye 3D display device is used for displaying the L-type images and the R-type images;
the eyeball tracking module is used for measuring the three-dimensional space coordinates of the left-eye and right-eye eyeball centers and feeding the measured coordinates back to the naked eye 3D display device or the observer; by adjusting the relative position of the eyes and the naked eye 3D display device, or by adjusting the display mode or structure of the naked eye 3D display device, the measured eyeball-center coordinates are made to coincide with the optimal viewpoints of the device, so that the left eye sees only the L-class images and not the R-class images, and the right eye sees only the R-class images and not the L-class images.
Preferably, the eyeball tracking module includes two camera units and a calculation unit. The two camera units acquire a left-eye image and a right-eye image respectively, and their positions relative to the naked eye 3D display device are fixed. The calculation unit computes the three-dimensional space coordinates of the left-eye and right-eye eyeball centers by a one-point calibration method, implemented as follows:
the naked eye 3D display device displays a calibration point CP with known three-dimensional space coordinates. After the observer's left or right eye gazes at CP, the camera unit captures a left-eye or right-eye image, from which the calculation unit computes the three-dimensional coordinate point P_L of the left-eye pupil center or P_R of the right-eye pupil center, and hence the left-eye line connecting CP and P_L, or the right-eye line connecting CP and P_R. The point on the left-eye line at a distance of D mm from P_L gives the three-dimensional space coordinates of the left-eye eyeball center, and the point on the right-eye line at a distance of D mm from P_R gives those of the right-eye eyeball center, where D is an empirical value for the distance between the eyeball center and the pupil center.
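The one-point calibration step can be sketched numerically as follows; the function name, variable names, and units (mm) are illustrative assumptions, not taken from the patent:

```python
import numpy as np

# One-point calibration sketch (assumed units: mm): the eyeball center is
# taken to lie on the line from calibration point CP through the pupil
# center, D mm beyond the pupil center.
D_MM = 12.0  # empirical pupil-center-to-eyeball-center distance for adults

def eyeball_center_one_point(cp, pupil, d=D_MM):
    """Extend the line CP -> pupil past the pupil center by d mm."""
    cp = np.asarray(cp, dtype=float)
    pupil = np.asarray(pupil, dtype=float)
    u = pupil - cp
    u /= np.linalg.norm(u)  # unit vector along the sight line
    return pupil + d * u
```

For example, with CP at the origin and the pupil center 400 mm away along the Y axis, the estimated eyeball center lies at 412 mm along the same line.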
Preferably, the eyeball tracking module comprises two camera units and a calculation unit. The two camera units acquire a left-eye image and a right-eye image respectively, and their positions relative to the naked eye 3D display device are fixed. The calculation unit computes the three-dimensional space coordinates of the left-eye and right-eye eyeball centers by an N-point calibration mode with N ≥ 2, implemented as follows:
the naked eye 3D display device displays N calibration points with known three-dimensional space coordinates, located at different positions; the n-th calibration point is denoted CP_n and the m-th CP_m, with n = 1, ..., N, m = 1, ..., N, and n ≠ m.
The observer's left or right eye gazes at calibration point CP_n; the camera unit then captures a left-eye or right-eye image, from which the calculation unit computes the three-dimensional coordinate point P_L^n of the left-eye pupil center, or P_R^n of the right-eye pupil center, and hence the left-eye line l_L^n connecting CP_n and P_L^n, or the right-eye line l_R^n connecting CP_n and P_R^n.
The observer's left or right eye then gazes at calibration point CP_m; the camera unit captures a left-eye or right-eye image, from which the calculation unit computes P_L^m or P_R^m, and hence the left-eye line l_L^m connecting CP_m and P_L^m, or the right-eye line l_R^m connecting CP_m and P_R^m.
The coordinates of the intersection of the left-eye lines l_L^n and l_L^m are the three-dimensional space coordinates of the left-eye eyeball center; the coordinates of the intersection of the right-eye lines l_R^n and l_R^m are the three-dimensional space coordinates of the right-eye eyeball center.
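In practice two measured sight lines are usually skew rather than exactly intersecting, so the "intersection" above can be taken as the midpoint of the shortest segment between the two 3D lines. A sketch under that assumption (names are illustrative, not from the patent):

```python
import numpy as np

# Midpoint of the shortest segment between two 3D lines, used as the
# 'intersection' of two noisy sight lines. Each line is a point p plus
# a direction d.
def closest_point_of_lines(p1, d1, p2, d2):
    p1, d1 = np.asarray(p1, float), np.asarray(d1, float)
    p2, d2 = np.asarray(p2, float), np.asarray(d2, float)
    r = p2 - p1
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    e, f = d1 @ r, d2 @ r
    den = a * c - b * b          # zero only for parallel lines
    t = (e * c - b * f) / den    # parameter of the closest point on line 1
    s = (b * e - a * f) / den    # parameter of the closest point on line 2
    return 0.5 * ((p1 + t * d1) + (p2 + s * d2))
```

When the two lines do intersect exactly, the midpoint coincides with the true intersection point.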
Preferably, the eyeball tracking module comprises two camera units, two light sources and a calculation unit, wherein:
the two point light sources form virtual images in the observer's left and right eyes, and the three-dimensional space coordinates of the corresponding points R_1 and R_2 are known;
the two camera units are fixed relative to the naked eye 3D display device and are respectively used for acquiring a left eye image containing a virtual image and a right eye image containing the virtual image;
the calculation unit calculates the three-dimensional space coordinates of the sphere center of the left eye eyeball and the sphere center of the right eye eyeball in a calibration-free mode, and the calibration-free mode comprises the following steps:
using the left-eye image containing the virtual images, the calculation unit obtains the three-dimensional coordinate points R′_1L and R′_2L of the two virtual images for the left eye, together with the point P_L of the left-eye pupil center; or, using the right-eye image containing the virtual images, it obtains the points R′_1R and R′_2R of the two virtual images for the right eye, together with the point P_R of the right-eye pupil center;
the line R′_1L-R_1 connecting R′_1L and R_1 and the line R′_2L-R_2 connecting R′_2L and R_2 are computed; their intersection is the center O_L of the sphere on which the outer surface of the left cornea lies. The left-eye line connecting P_L and O_L is then computed, and the point on it at a distance of D mm from P_L gives the three-dimensional space coordinates of the left-eye eyeball center, where D is an empirical value for the distance between the eyeball center and the pupil center;
likewise, the line R′_1R-R_1 connecting R′_1R and R_1 and the line R′_2R-R_2 connecting R′_2R and R_2 are computed; their intersection is the center O_R of the sphere on which the outer surface of the right cornea lies. The right-eye line connecting P_R and O_R is then computed, and the point on it at a distance of D mm from P_R gives the three-dimensional space coordinates of the right-eye eyeball center.
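The calibration-free geometry can be sketched as follows, with all coordinates and names illustrative (assumed units: mm): the corneal sphere center is recovered as the intersection of the two source-to-virtual-image lines, and the eyeball center is placed D mm from the pupil along the pupil-to-corneal-center axis:

```python
import numpy as np

def _line_intersection(p1, d1, p2, d2):
    """Midpoint of the shortest segment between two 3D lines (point p, direction d)."""
    r = p2 - p1
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    e, f = d1 @ r, d2 @ r
    den = a * c - b * b
    t, s = (e * c - b * f) / den, (b * e - a * f) / den
    return 0.5 * ((p1 + t * d1) + (p2 + s * d2))

def eyeball_center_from_glints(r1, v1, r2, v2, pupil, d_mm=12.0):
    """r1, r2: point-source positions; v1, v2: their virtual images on the
    cornea; pupil: pupil-center position (all 3D points, assumed in mm)."""
    r1, v1, r2, v2, pupil = (np.asarray(a, float) for a in (r1, v1, r2, v2, pupil))
    oc = _line_intersection(r1, v1 - r1, r2, v2 - r2)  # corneal sphere center
    axis = oc - pupil
    axis /= np.linalg.norm(axis)                       # unit vector into the eye
    return pupil + d_mm * axis
```

With symmetric sources at (±15, 0, 0), glint virtual images on the lines toward a corneal center at (0, 50, 0), and the pupil at (0, 46, 0), the estimated eyeball center lands 12 mm behind the pupil on the eye's axis.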
Preferably, the calculating unit calculates the three-dimensional space coordinates of the sphere center of the left eye eyeball and the three-dimensional space coordinates of the sphere center of the right eye eyeball through a dynamic calibration mode, and the implementation of the dynamic calibration mode comprises the following steps:
an observer randomly looks at N different positions of the naked eye 3D display device, wherein N is more than or equal to 2, the calculation unit calculates to obtain N left eye straight lines or N right eye straight lines, the coordinate of the intersection point of any two left eye straight lines in the N left eye straight lines is the three-dimensional space coordinate of the sphere center of the left eye eyeball, and the coordinate of the intersection point of any two right eye straight lines in the N right eye straight lines is the three-dimensional space coordinate of the sphere center of the right eye eyeball.
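With more than two gaze lines available, a robust alternative to intersecting an arbitrary pair is the least-squares point closest to all N lines; a sketch under that assumption (names are illustrative, not from the patent):

```python
import numpy as np

# Least-squares 'intersection' of N 3D lines: the point x minimizing the
# summed squared distance to the lines solves
#   sum_i (I - u_i u_i^T) x = sum_i (I - u_i u_i^T) p_i,
# where each line passes through p_i with unit direction u_i.
def nearest_point_to_lines(points, dirs):
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for p, u in zip(points, dirs):
        p = np.asarray(p, float)
        u = np.asarray(u, float)
        u /= np.linalg.norm(u)
        M = np.eye(3) - np.outer(u, u)  # projector orthogonal to the line
        A += M
        b += M @ p
    return np.linalg.solve(A, b)
```

When every line passes exactly through one common point, that point is returned; with noisy lines the result is the best-fit eyeball center.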
Another technical solution of the present invention is an application method of the display device: display is performed in a left-eye monocular mode, a right-eye monocular mode, a binocular 2D mode, or a binocular 3D mode, and the eye tracking module performs an eye-movement test during display in any of these modes, wherein:
the left-eye monocular mode is to display L-class images visible only to the left eye;
the right-eye monocular mode is to display an R-class image visible only to the right eye;
the binocular 2D mode is that a left-eye visible L-type image and a right-eye visible R-type image are displayed simultaneously, and the L-type image and the R-type image are the same;
the binocular 3D mode is to simultaneously display an L-type image visible to the left eye and an R-type image visible to the right eye, and the L-type image and the R-type image are stereoscopic images with parallax.
Preferably, when the left-eye monocular mode, the right-eye monocular mode, the binocular 2D mode or the binocular 3D mode is displayed, a vision examination or a vision training is performed.
Preferably, in the left-eye and right-eye monocular modes, the vision test uses the national-standard logarithmic visual acuity chart: the L-class image displayed in the left-eye monocular mode is an eye chart visible only to the left eye, and the R-class image displayed in the right-eye monocular mode is an eye chart visible only to the right eye.
Preferably, the visual training is performed in the binocular 3D mode, the L-class image visible to the left eye and the R-class image visible to the right eye have a certain parallax, and the parallax is periodically changed.
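A periodically varying parallax can be sketched as a simple disparity schedule; the sinusoidal waveform, units, and parameter values below are assumptions for illustration and are not specified by the patent:

```python
import math

# Illustrative periodic-parallax schedule for binocular 3D vision training:
# the disparity between the L-class and R-class images oscillates
# sinusoidally (assumed units: mm and seconds).
def parallax_mm(t_seconds, amplitude_mm=10.0, period_s=8.0):
    return amplitude_mm * math.sin(2 * math.pi * t_seconds / period_s)
```

The renderer would offset the L-class and R-class images by the returned disparity at each frame time.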
The invention has the beneficial effects that: the three-dimensional positions of the left-eye and right-eye eyeball centers are computed and matched to the designed optimal viewpoints of the naked eye 3D display device. Because the eyeball center does not move when the eyeball rotates, the left and right eyes remain at the optimal viewpoints when the viewer looks at different areas of the device, which effectively prevents crosstalk.
Drawings
Fig. 1 is a schematic diagram of an optimal viewpoint position of a slit grating type naked eye 3D display;
FIG. 2 is a schematic diagram of the relationship between the eyeball center, the pupil center and the optimal viewpoint position when the naked eye 3D display is seen at different positions;
fig. 3(a) to 3(d) are schematic views for calculating three-dimensional spatial positions of eyeballs by using a binocular camera;
fig. 4 is a schematic diagram of calculating a three-dimensional spatial position of an eyeball using an imaging system comprising two near-infrared cameras and two near-infrared light sources.
Detailed Description
The invention will be further illustrated with reference to the following specific examples. It should be understood that these examples are for illustrative purposes only and are not intended to limit the scope of the present invention. Further, it should be understood that various changes or modifications of the present invention may be made by those skilled in the art after reading the teaching of the present invention, and such equivalents may fall within the scope of the present invention as defined in the appended claims.
Example 1:
the basic principle of the naked eye 3D display technology is to divide the image of the display device into two parts: the left eye image and the right eye image only enable the left eye to see the left eye image and enable the right eye to see the right eye image through the slit grating, the cylindrical lens, the directional light source and the like.
Taking the slit-grating naked eye 3D display shown in fig. 1 as an example, the display screen 101 shows a left-eye image L and a right-eye image R, and the slit grating 102, in which light-shielding and light-transmitting regions alternate, is located between the eyes and the display screen 101. There thus exists a pair of optimal viewpoint positions A and B: when the left eye is at viewpoint A and the right eye at viewpoint B, the left eye sees only the left-eye image L and the right eye sees only the right-eye image R, achieving the best crosstalk-free visual effect.
Existing naked eye 3D display schemes compute the positions of the left and right pupils by eyeball tracking, aligning the left pupil center with point A and the right pupil center with point B to match the eye positions to the optimal viewpoints. This method has a drawback: when looking at different positions on the display, the pupil moves within a certain range, so during this movement the pupil center is sometimes at the optimal viewpoint and sometimes not, and the best crosstalk prevention cannot be achieved. As shown in fig. 2, taking the left eye as an example, suppose the pupil center is aligned with the optimal viewpoint A when the eye looks at the middle of the screen; when the eye looks at the right side of the screen, the pupil center moves to position A′, no longer aligned with A, and the best visual effect is lost.
To solve this problem, the present invention matches the eyeball center position to the optimal viewpoint of the naked eye 3D display device. As shown in fig. 2, when the eyes look at different positions on the display screen (e.g., the middle point and the right point in the figure), the eyeball center O stays still while the eyeball rotates. With the eyeball center O located at the optimal viewpoint, the eye rotates about its center regardless of the viewing angle of the naked eye 3D display, so the eyeball center always coincides with the optimal viewpoint and the best crosstalk prevention is achieved over the whole viewing-angle range.
Since the center of the eyeball is located inside the eyeball and is not easily obtained directly by the camera or the distance measuring device, the method for obtaining the center of the eyeball in the embodiment is as follows:
the eyeball tracking module comprises two cameras, can perform 3D space positioning on the center of the left pupil and the center of the right pupil, and performs 3D space positioning on the center of the eyeball of the left eye and the center of the eyeball of the right eye through calibration. The calibration method can be divided into two-point calibration and one-point calibration.
The device shown in fig. 3(a) includes a naked eye 3D display and two identical cameras, a left camera 301 and a right camera 302, placed in parallel below the naked eye 3D display, which capture the left-eye and right-eye images. The positions of the left camera 301 and the right camera 302 relative to the naked eye 3D display are fixed, while the relative position of the display and the eyes can be adjusted.
The two-point calibration is realized by the following steps:
step 1: as shown in fig. 3(b), a calibration point C is displayed on the left side of the naked eye 3D display, and the eye of the subject is made to watch the calibration point C. Taking the left eye as an example, the line of sight at this time is the index point C and the pupil center P1C-P of1(through the center of the eye globe of the left eye). Although the visual axis of some people is not completely coincident with the optical axis, a small included angle exists, namely the Kappa angle, but the Kappa angle is generally small in value, so that the calculated effect on the eyeball center is not large, and the calculated effect is ignored.
According to the principle of binocular stereo vision, the three-dimensional space coordinate P_1(x, y, z) of the pupil center P_1 is computed as follows:
as shown in fig. 3(b), the three-dimensional space coordinate uses the optical center E of the left camera 301 as the origin, the straight line of the connecting line EI from the optical center of the left camera 301 to the optical center of the right camera 302 is the X-axis, the straight line of the optical axis EH of the left camera 301 is the Y-axis, and the Z-axis is perpendicular to the XY-plane (not shown in this figure).
The distance between the left camera 301 and the right camera 302 is T; F is the center of the imaging plane of the left camera 301, and J is the center of the imaging plane of the right camera 302. The imaging point of the pupil center P_1 on the left camera 301 is G, and on the right camera 302 it is K. Since the size of a camera's imaging plane is known, the X-axis distance between an imaging point and the imaging-plane center is easy to compute: the distance between G and F projected on the X axis is Δx_1, and the distance between K and J projected on the X axis is Δx_2.
The focal length f of the cameras is known, i.e., EF = IJ = f.
From the principle of similar triangles we obtain equations (1) and (2), taking Δx_1 and Δx_2 as signed offsets along the X axis:

Δx_1 / f = x / y    (1)
Δx_2 / f = (x − T) / y    (2)

Equations (1) and (2) contain only the two unknowns x and y; the other quantities Δx_1, Δx_2, f, and T are known, so they can be solved:

x = T·Δx_1 / (Δx_1 − Δx_2)
y = f·T / (Δx_1 − Δx_2)

In the same way, from the Z-axis distance between the imaging point of the pupil center P_1 and the imaging-plane center (call this distance Δz), the z coordinate of P_1 is obtained:

z = T·Δz / (Δx_1 − Δx_2)

Thus the three-dimensional space coordinate P_1(x, y, z) of the pupil center can be computed.
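The triangulation can be sketched numerically as follows, using signed image-plane offsets; the function and variable names are illustrative assumptions, not from the patent:

```python
import numpy as np

# Binocular triangulation sketch: origin at the left camera's optical
# center, X along the baseline of length T, Y along the optical axis,
# focal length f. dx1, dx2, dz are signed image-plane offsets of the
# pupil's image point from each image-plane center.
def triangulate_pupil(dx1, dx2, dz, f, T):
    disparity = dx1 - dx2
    return np.array([T * dx1 / disparity,   # x
                     f * T / disparity,     # y (depth along the optical axis)
                     T * dz / disparity])   # z
```

For example, a pupil at (30, 400, 10) seen with f = 4 and T = 60 produces offsets Δx_1 = 0.3, Δx_2 = −0.3, Δz = 0.1, and the function recovers the original point.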
Because the positions of the left camera 301 and the right camera 302 relative to the naked eye 3D display are fixed, the three-dimensional space coordinates of calibration point C are known; the line C-P_1 connecting C and the left-eye pupil center P_1 therefore determines the left-eye line of sight.
Step 2: as shown in fig. 3(c), a calibration point D is displayed on the right side of the naked eye 3D display and the subject's eye gazes at it; by the computation of step 1, the left-eye sight line D-P_2 is obtained. As shown in fig. 3(d), the intersection of C-P_1 and D-P_2 in the three-dimensional coordinate system gives the three-dimensional position O_1 of the left-eye eyeball center. While viewing the two calibration points C and D, the head must be kept still, with only the eyeball rotating.
Step 3: by the same method, the three-dimensional position O_2 of the right-eye eyeball center is computed; the details are not repeated here.
In addition, the device of the invention supports one-point calibration, in which the eye need only gaze at a single calibration point to obtain the three-dimensional position of the eyeball center. The principle is physiological: the adult eyeball radius is about 12 mm, and since the pupil center lies approximately on the eyeball sphere, the eyeball center lies on the extension of the line from the calibration point through the pupil center, 12 mm from the pupil center. Compared with two-point calibration, one-point calibration needs only one calibration point and saves calibration time; its drawback is that it suits only adults, because children's eyeballs are smaller than adults' and vary in size.
Example 2:
the difference between this embodiment and embodiment 1 is that the eye tracking module includes at least two near-infrared cameras and at least 2 near-infrared light sources, and can measure three-dimensional positions of the eyeball centers of the left eye and the right eye, and can measure three-dimensional visual lines of the left eye and the right eye.
This embodiment uses two near-infrared cameras located below the display screen, symmetric left to right and 10 cm apart. The two near-infrared light sources, 850 nm near-infrared LED lamps (point sources), are placed outside the two cameras, symmetric left to right and 30 cm apart.
Outside the pupil lies the spherical corneal surface. Treating the outer surface of the cornea as a convex mirror, a point light source forms a virtual image on the other side of the mirror by reflection. From optics, the position of the virtual image is determined by the positions of the light source and the convex mirror, independent of the observer's (i.e., the camera's) position. Moreover, the spatial line connecting the point source and its virtual image passes through the center of the sphere on which the convex mirror lies.
Based on the optical principle of convex-mirror reflection imaging of a point source and the binocular stereo positioning principle of embodiment 1, the three-dimensional coordinates of the virtual images of the two near-infrared reflection points on the outer corneal surface can be computed. As shown in fig. 4, for the left eye the two near-infrared point sources are R_1 and R_2, and their virtual images on the outer corneal surface are R_1′ and R_2′. The line R_1-R_1′ and the line R_2-R_2′ intersect at the center O_c of the sphere on which the outer corneal surface lies. In addition, the three-dimensional coordinates of the pupil center P can be measured using the binocular stereo positioning principle of embodiment 1. The three-dimensional position of the eyeball center can then be computed in a calibration-free, one-point, two-point, or dynamic calibration mode.
The calibration-free mode works as follows: physiologically, the radius of an adult eyeball is about 12 mm, and the eyeball center lies on the extension of the line P-O_c from the pupil center through the corneal sphere center. Since the pupil center lies approximately on the eyeball sphere, for an adult the eyeball center is the point on the extension of P-O_c about 12 mm from the pupil center P, giving its three-dimensional position.
The one-point and two-point calibration methods are the same as in embodiment 1.
Dynamic calibration requires no calibration point displayed at a specific position: the subject simply looks at two or more different places on the display screen, producing two or more lines P-O_c; any two such lines intersect at the eyeball center, whose three-dimensional position can thus be computed. This method suits infants or young children who cannot fixate a calibration point: for example, letting the child watch a short animation produces enough eye movement to compute the three-dimensional position of the eyeball center.
In addition, the device and method provided by the invention can also measure the three-dimensional lines of sight of the left and right eyes. Since the line connecting the pupil center and the corneal sphere center is the sight line of the human eye, the intersection of the sight line with the display screen is the eye-movement point, i.e. the position on the screen that the eye is looking at.
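The eye-movement point described above is a standard ray-plane intersection. A sketch (Python with NumPy; the plane parameterisation is our own illustrative assumption):

```python
import numpy as np

def eye_movement_point(pupil_center, cornea_center, screen_point, screen_normal):
    """Intersection of the sight line with the display plane.

    The sight line runs from the corneal sphere center Oc through the
    pupil center P (outward direction P - Oc); the screen is modelled
    as the plane through screen_point with normal screen_normal."""
    d = pupil_center - cornea_center
    denom = np.dot(screen_normal, d)      # zero if gaze is parallel to screen
    t = np.dot(screen_normal, screen_point - pupil_center) / denom
    return pupil_center + t * d
```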
Example 3:
Through embodiment 1 or embodiment 2 above, the three-dimensional positions of the left-eye and right-eye eyeball centers relative to the naked-eye 3D device can be calculated. Since the position of the designed optimal viewpoint of the naked-eye 3D device is known, when the initial positions of the eyeball centers do not match the designed optimal viewpoints, adjustment can be made in the following ways.
Method one: when the initial left-eye and right-eye eyeball center positions are not at the optimal viewpoints, adjust the position of the naked-eye 3D display device so that the eyeball centers are located at the optimal viewpoints. The device position may be adjusted electrically, mechanically, or manually.
Method two: when the initial left-eye and right-eye eyeball center positions are not at the optimal viewpoints, adjust the display mode of the display module, including the pixel and sub-pixel display modes, so that the eyeball centers are located at the optimal viewpoints.
Method three: when the initial left-eye and right-eye eyeball center positions are not at the optimal viewpoints, adjust the structure of the display module, for example the lateral relative position or distance between the slit grating and the display screen, or the pointing direction of the light source, thereby shifting the optimal viewpoints so that the eyeball centers are located at them.
Method four: when the initial left-eye and right-eye eyeball center positions are not at the optimal viewpoints, adjust the positions of the person's eyes, i.e. the person moves so that the left-eye and right-eye eyeball centers are located at the optimal viewpoints.
In practical use, matching of the eyeball center positions to the optimal viewpoints may be achieved by one of the above methods or by a combination of several. These methods apply not only to slit-grating naked-eye 3D display devices but also to naked-eye 3D display devices of other structures, such as lenticular lenses and directional light sources.
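Whichever of the four methods is used, the correction reduces to the offset between the measured eyeball centers and the designed optimal viewpoints. A minimal sketch (Python with NumPy; the variable names and midpoint comparison are our own illustrative assumptions):

```python
import numpy as np

def viewpoint_offset(left_center, right_center, optimal_left, optimal_right):
    """Translation that would bring the measured eyeball centers onto
    the designed optimal viewpoints, in display coordinates. Method
    four would ask the viewer to move by this offset; method one would
    move the display by its negation. Midpoints are compared because
    the spacing between the two viewpoints is fixed by the design."""
    measured_mid = 0.5 * (left_center + right_center)
    optimal_mid = 0.5 * (optimal_left + optimal_right)
    return optimal_mid - measured_mid
```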
Example 4:
With the above naked-eye 3D display device and the method of matching the eyeball center positions to the optimal viewpoints, display can be performed in a left-eye monocular mode, a right-eye monocular mode, a binocular 2D mode, or a binocular 3D mode. The left-eye monocular mode displays L-class images visible only to the left eye; the right-eye monocular mode displays R-class images visible only to the right eye; the binocular 2D mode simultaneously displays a left-eye-visible L-class image and a right-eye-visible R-class image that are identical; the binocular 3D mode simultaneously displays a left-eye-visible L-class image and a right-eye-visible R-class image that form a stereoscopic pair with parallax.
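The routing logic of the four modes can be sketched as follows (Python; the mode names and the idea of a `blank` frame are our own illustrative assumptions, not the patent's terminology):

```python
def compose_views(mode, l_image, r_image, blank):
    """Select what each eye channel shows in the four display modes
    described above. l_image, r_image and blank stand for frames that
    have already been rendered; only the routing is sketched."""
    if mode == "left_monocular":
        return l_image, blank        # only the left eye sees content
    if mode == "right_monocular":
        return blank, r_image        # only the right eye sees content
    if mode == "binocular_2d":
        return l_image, l_image      # identical views, no parallax
    if mode == "binocular_3d":
        return l_image, r_image      # stereoscopic pair with parallax
    raise ValueError(f"unknown mode: {mode}")
```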
Meanwhile, the eye-tracking module can perform eye-movement tests. An eye-movement test measures the position on the display screen that the eye is looking at; the eye-movement point is the intersection of the sight line with the display screen. In general, the eye-movement points of the left and right eyes should coincide or nearly coincide, but in some cases they differ. To study such differences, conventional methods for measuring monocular eye-movement data must occlude the other eye with an eye patch or the like, so that it does not also see the test content and cause interference. However, when the test must switch between the left eye and the right eye quickly and frequently, repeatedly changing the eye patch wastes time and makes testing inconvenient.
Because the naked-eye 3D display device can display only L-class images visible to the left eye, or only R-class images visible to the right eye, it can be combined with the eye-movement test function to test monocular eye movement without an eye patch.
In this embodiment, the device with two near-infrared cameras and two near-infrared light sources, together with the gaze-measurement method using it, can calculate the eye-movement points of the left eye and the right eye separately. Eye-movement tests can be carried out during display in the left-eye monocular, right-eye monocular, binocular 2D, and binocular 3D modes.
(1) Left-eye monocular mode eye-movement test: the naked-eye 3D display shows only L-class images, visible to the left eye and invisible to the right eye. The eye-tracking module calculates eye-movement point data for both eyes (although the right eye cannot see the L-class images, its eye-movement point is still meaningful).
(2) Right-eye monocular mode eye-movement test: the naked-eye 3D display shows only R-class images, visible to the right eye and invisible to the left eye. The eye-tracking module calculates eye-movement point data for both eyes (although the left eye cannot see the R-class images, its eye-movement point is still meaningful).
(3) Binocular 2D mode eye-movement test: the naked-eye 3D display shows an L-class image and an R-class image simultaneously; the two images have no parallax, so both eyes see what appears to be the same 2D image. The eye-tracking module calculates eye-movement point data for both eyes.
(4) Binocular 3D mode eye-movement test: the naked-eye 3D display shows an L-class image and an R-class image simultaneously; the two images have parallax, so the eyes see a stereoscopic 3D image. The eye-tracking module calculates eye-movement point data for both eyes. In particular, the three-dimensional coordinates of the intersection of the left and right eyes' three-dimensional sight lines with the virtual 3D image can be measured, which has practical value for testing or training stereoscopic vision.
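The 3D fixation point in the binocular 3D mode can be estimated by triangulating the two sight lines. A sketch (Python with NumPy; the closest-approach midpoint is our own illustrative choice, since two measured sight lines rarely intersect exactly):

```python
import numpy as np

def fixation_point(pl, ol, pr, orr):
    """Midpoint of the shortest segment between the left sight line
    (through cornea center ol and pupil center pl) and the right sight
    line (through orr and pr): an estimate of the 3D gaze point on the
    virtual 3D image."""
    d1 = (pl - ol) / np.linalg.norm(pl - ol)
    d2 = (pr - orr) / np.linalg.norm(pr - orr)
    a = np.dot(d1, d2)
    r = pr - pl
    denom = 1.0 - a * a                  # zero only for parallel sight lines
    t1 = (np.dot(r, d1) - a * np.dot(r, d2)) / denom
    t2 = (a * np.dot(r, d1) - np.dot(r, d2)) / denom
    return 0.5 * ((pl + t1 * d1) + (pr + t2 * d2))
```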
Example 5:
With the above naked-eye 3D display device and the method of matching the eyeball center positions to the optimal viewpoints, vision examination and vision training can be performed during display in the left-eye monocular, right-eye monocular, binocular 2D, or binocular 3D mode.
For example: the L-type images displayed on the naked eye 3D display device are national standard logarithmic visual charts (E-shaped charts), and the R-type images are the same national standard logarithmic visual charts. When the left eye vision is inspected, the national standard logarithmic visual acuity chart of the L-type images is displayed on the naked eye 3D display device, the right eye does not need to be shielded at the moment, the visual acuity chart can be seen by the left eye, the visual acuity chart cannot be seen by the right eye, and then the vision inspection of the left eye is performed through the visual acuity chart. Similarly, when the vision of the right eye is checked, the national standard logarithmic visual acuity chart of the B-type images is displayed on the naked eye 3D display device, the left eye does not need to be shielded at the moment, the effect that the right eye can see the visual acuity chart and the left eye cannot see the visual acuity chart can be achieved, and then the vision of the right eye is checked through the visual acuity chart.
For example: by using the naked eye 3D display device and the method for determining the optimal viewpoint, the visual training of the binocular vision function can be performed. The specific method comprises the following steps: the distance and relative position of the naked eye 3D display device and the human eyes are kept unchanged. The naked eye 3D display device simultaneously displays an L-type image only visible to the left eye and an R-type image only visible to the right eye, the L-type image and the R-type image are left and right views of a small sphere with certain parallax, and the parallax is changed periodically. From the viewer's position, the ball will move periodically from far to near and then from near to far. The observer watches the small ball with both eyes to perform visual training of visual function of both eyes.
During vision examination and training, the display content can be adjusted and feedback given in real time according to the eye-movement test data. For example, during a visual acuity test, if the eye-movement test finds that the examinee is not looking at the chart because of inattention, the examinee can be reminded by a flash or a sound. During visual training, the training target can emit light or sound when the trainee's eyes fixate on it, increasing the interest of the training; in addition, the total time the trainee's eyes spend fixating the targets over the whole session can be recorded to judge the effective training time. Such devices and methods can be particularly effective because the subjects of vision examination and training are often young children.

Claims (9)

1. A display device, comprising:
a naked-eye 3D display device for displaying L-class images and R-class images; and
an eye-tracking module for measuring the three-dimensional space coordinates of the left-eye eyeball center and of the right-eye eyeball center, and for feeding the measured coordinates back to the naked-eye 3D display device or to the observer, so that, by adjusting the relative position of the human eyes and the naked-eye 3D display device, or by adjusting the display mode or structure of the naked-eye 3D display device, the left-eye and right-eye eyeball centers are made to coincide with the optimal viewpoints of the naked-eye 3D display device, whereby the left eye sees only the L-class images and not the R-class images, and the right eye sees only the R-class images and not the L-class images.
2. The display device according to claim 1, wherein the eye-tracking module comprises two camera units and a calculation unit, the two camera units respectively acquiring a left-eye image and a right-eye image, the relative positions of the two camera units and the naked-eye 3D display device being fixed, and the calculation unit calculating the three-dimensional space coordinates of the left-eye eyeball center and of the right-eye eyeball center by a one-point calibration method, the one-point calibration method comprising:
the naked-eye 3D display device displays a calibration point CP with known three-dimensional space coordinates; after the observer's left eye or right eye gazes at the calibration point CP, a left-eye image or a right-eye image is acquired by the camera unit; the calculation unit calculates from the left-eye image or the right-eye image the three-dimensional space coordinate point PL of the left-eye pupil center, or the three-dimensional space coordinate point PR of the right-eye pupil center, thereby obtaining the left-eye straight line connecting the calibration point CP and the point PL, or the right-eye straight line connecting the calibration point CP and the point PR; the coordinates of the point on the left-eye straight line at a distance of D mm from PL are the three-dimensional space coordinates of the left-eye eyeball center, and the coordinates of the point on the right-eye straight line at a distance of D mm from PR are the three-dimensional space coordinates of the right-eye eyeball center, D being an empirical value of the distance between the eyeball center and the pupil center.
3. The display device according to claim 1, wherein the eye-tracking module comprises two camera units and a calculation unit, the two camera units respectively acquiring a left-eye image and a right-eye image, the relative positions of the two camera units and the naked-eye 3D display device being fixed, and the calculation unit calculating the three-dimensional space coordinates of the left-eye eyeball center and of the right-eye eyeball center by an N-point calibration method, N ≥ 2, the N-point calibration method comprising:
the naked-eye 3D display device displays N calibration points with known three-dimensional space coordinates at different positions, the n-th calibration point being CPn and the m-th calibration point being CPm, with n = 1, ..., N, m = 1, ..., N, n ≠ m;
the observer's left eye or right eye gazes at the calibration point CPn; a left-eye image or a right-eye image is acquired by the camera unit; the calculation unit calculates from the left-eye image or the right-eye image the three-dimensional space coordinate point PLn of the left-eye pupil center, or the three-dimensional space coordinate point PRn of the right-eye pupil center, thereby obtaining the left-eye straight line connecting CPn and PLn, or the right-eye straight line connecting CPn and PRn;
the observer's left eye or right eye gazes at the calibration point CPm; a left-eye image or a right-eye image is acquired by the camera unit; the calculation unit calculates from the left-eye image or the right-eye image the three-dimensional space coordinate point PLm of the left-eye pupil center, or the three-dimensional space coordinate point PRm of the right-eye pupil center, thereby obtaining the left-eye straight line connecting CPm and PLm, or the right-eye straight line connecting CPm and PRm;
the coordinates of the intersection of the left-eye straight line CPn-PLn and the left-eye straight line CPm-PLm are the three-dimensional space coordinates of the left-eye eyeball center; the coordinates of the intersection of the right-eye straight line CPn-PRn and the right-eye straight line CPm-PRm are the three-dimensional space coordinates of the right-eye eyeball center.
4. The display device according to claim 1, wherein the eye-tracking module comprises two camera units, two point light sources, and a calculation unit, wherein:
the two point light sources form virtual images in the observer's left and right eyes, and the three-dimensional space coordinates of their coordinate points R1 and R2 are known;
the two camera units are fixed relative to the naked-eye 3D display device and respectively acquire a left-eye image containing the virtual images and a right-eye image containing the virtual images;
the calculation unit calculates the three-dimensional space coordinates of the left-eye eyeball center and of the right-eye eyeball center in a calibration-free mode, the calibration-free mode comprising:
the calculation unit obtains, from the left-eye image containing the virtual images, the three-dimensional space coordinate points R'1L and R'2L of the two virtual images for the left eye and the three-dimensional space coordinate point PL of the left-eye pupil center; or obtains, from the right-eye image containing the virtual images, the three-dimensional space coordinate points R'1R and R'2R of the two virtual images for the right eye and the three-dimensional space coordinate point PR of the right-eye pupil center;
the straight line R'1L-R1 connecting R'1L and R1 and the straight line R'2L-R2 connecting R'2L and R2 are calculated; their intersection is the center OL of the sphere on which the outer surface of the left-eye cornea lies; the left-eye straight line connecting PL and OL is then calculated, and the coordinates of the point on this line at a distance of D mm from PL are the three-dimensional space coordinates of the left-eye eyeball center, D being an empirical value of the distance between the eyeball center and the pupil center;
the straight line R'1R-R1 connecting R'1R and R1 and the straight line R'2R-R2 connecting R'2R and R2 are calculated; their intersection is the center OR of the sphere on which the outer surface of the right-eye cornea lies; the right-eye straight line connecting PR and OR is then calculated, and the coordinates of the point on this line at a distance of D mm from PR are the three-dimensional space coordinates of the right-eye eyeball center.
5. The display device according to claim 4, wherein the calculation unit calculates the three-dimensional space coordinates of the left-eye eyeball center and of the right-eye eyeball center by a dynamic calibration method, the dynamic calibration method comprising:
the observer gazes at will at N different positions on the naked-eye 3D display device, N ≥ 2; the calculation unit calculates N left-eye straight lines or N right-eye straight lines; the coordinates of the intersection of any two of the N left-eye straight lines are the three-dimensional space coordinates of the left-eye eyeball center, and the coordinates of the intersection of any two of the N right-eye straight lines are the three-dimensional space coordinates of the right-eye eyeball center.
6. An application method of the display device according to claim 1, wherein display is performed in a left-eye monocular mode, a right-eye monocular mode, a binocular 2D mode, or a binocular 3D mode, and the eye-tracking module performs an eye-movement test during the display, wherein:
the left-eye monocular mode displays L-class images visible only to the left eye;
the right-eye monocular mode displays R-class images visible only to the right eye;
the binocular 2D mode simultaneously displays a left-eye-visible L-class image and a right-eye-visible R-class image, the two images being identical;
the binocular 3D mode simultaneously displays a left-eye-visible L-class image and a right-eye-visible R-class image, the two images being stereoscopic images with parallax.
7. The application method of the display device according to claim 6, wherein vision examination or vision training is performed during display in the left-eye monocular mode, the right-eye monocular mode, the binocular 2D mode, or the binocular 3D mode.
8. The application method of the display device according to claim 6, wherein a visual acuity test is performed in the left-eye monocular mode and the right-eye monocular mode using the national standard logarithmic visual acuity chart; the L-class image displayed in the left-eye monocular mode is a chart that only the left eye can see, and the R-class image displayed in the right-eye monocular mode is a chart that only the right eye can see.
9. The application method of the display device according to claim 6, wherein visual training is performed in the binocular 3D mode, the left-eye-visible L-class image and the right-eye-visible R-class image have a certain parallax, and the parallax varies periodically.
CN202010701510.7A 2020-07-20 2020-07-20 Display device and application method Pending CN111885367A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010701510.7A CN111885367A (en) 2020-07-20 2020-07-20 Display device and application method

Publications (1)

Publication Number Publication Date
CN111885367A true CN111885367A (en) 2020-11-03

Family

ID=73155663

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010701510.7A Pending CN111885367A (en) 2020-07-20 2020-07-20 Display device and application method

Country Status (1)

Country Link
CN (1) CN111885367A (en)


Patent Citations (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101072366A (en) * 2007-05-24 2007-11-14 上海大学 Free stereo display system and method based on light field and binocular vision technology
JP2013025101A (en) * 2011-07-21 2013-02-04 Olympus Corp Image forming apparatus
CA2750287A1 (en) * 2011-08-29 2011-11-02 Microsoft Corporation Gaze detection in a see-through, near-eye, mixed reality display
CN103091849A (en) * 2011-11-08 2013-05-08 原创奈米科技股份有限公司 Three-dimensional image display method
US20130114135A1 (en) * 2011-11-08 2013-05-09 Unique Instruments Co. Ltd Method of displaying 3d image
US20130307948A1 (en) * 2012-05-16 2013-11-21 Samsung Display Co., Ltd. 3-dimensional image display device and display method thereof
US20150268476A1 (en) * 2014-03-18 2015-09-24 Kabushiki Kaisha Toshiba Image display device and image display method
US20150365650A1 (en) * 2014-06-16 2015-12-17 Hyundai Motor Company Method for extracting eye center point
CN106168853A (en) * 2016-06-23 2016-11-30 中国科学技术大学 A kind of free space wear-type gaze tracking system
US20180300589A1 (en) * 2017-04-13 2018-10-18 Modiface Inc. System and method using machine learning for iris tracking, measurement, and simulation
CN109963140A (en) * 2017-12-25 2019-07-02 深圳超多维科技有限公司 Nakedness-yet stereoscopic display method and device, equipment and computer readable storage medium
CN109040736A (en) * 2018-08-08 2018-12-18 上海玮舟微电子科技有限公司 A kind of scaling method, device, equipment and the storage medium of eye space position
US20200195915A1 (en) * 2018-10-31 2020-06-18 Tobii Ab Detecting relocation of a head-mounted device
WO2020095856A1 (en) * 2018-11-05 2020-05-14 ソニー株式会社 Image projection system, image projection device, diffractive optical element for image display, and image projection method

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114040184A (en) * 2021-11-26 2022-02-11 京东方科技集团股份有限公司 Image display method, system, storage medium and computer program product
WO2024123179A1 (en) * 2022-12-08 2024-06-13 Dimenco Holding B.V. Method for displaying a stereoscopic image on an autostereoscopic display device
NL2033691B1 (en) * 2022-12-08 2024-06-14 Dimenco Holding B V Method for displaying a stereoscopic image on an autostereoscopic display device

Similar Documents

Publication Publication Date Title
CN113208884B (en) Visual detection and visual training equipment
CN109558012B (en) Eyeball tracking method and device
US9323075B2 (en) System for the measurement of the interpupillary distance using a device equipped with a screen and a camera
US20170263017A1 (en) System and method for tracking gaze position
JP2013228557A (en) Display device and control method thereof
ES2963724T3 (en) Prism prescription value acquisition system, acquisition procedure, acquisition device and program to correct fixation disparity
Ponto et al. Perceptual calibration for immersive display environments
Bakker et al. Accurate gaze direction measurements with free head movement for strabismus angle estimation
US11822089B2 (en) Head wearable virtual image module for superimposing virtual image on real-time image
TW202216103A (en) Systems and methods for improving binocular vision
CN111885367A (en) Display device and application method
TW201400084A (en) Fixation line measuring method, fixation line measuring device, eyeball rotation center measuring device, and view point measuring device
CN114903760A (en) Strabismus training equipment
JP3322625B2 (en) Pseudo visual device
Wibirama et al. 3D gaze tracking on stereoscopic display using optimized geometric method
Hartle et al. Stereoscopic depth constancy for physical objects and their virtual counterparts
CN112926523B (en) Eyeball tracking method and system based on virtual reality
CN112336301B (en) Strabismus measuring equipment
Guan et al. Perceptual requirements for world-locked rendering in AR and VR
Pala et al. Optical cross-talk and visual comfort of a stereoscopic display used in a real-time application
CN113080836A (en) Non-center gazing visual detection and visual training equipment
CN113138664A (en) Eyeball tracking system and method based on light field perception
Wibirama et al. Design and implementation of gaze tracking headgear for Nvidia 3D Vision®
CN110794590A (en) Virtual reality display system and display method thereof
CN112315423B (en) Eyeball movement measuring equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20201103