CN114326129A - Virtual reality glasses - Google Patents

Virtual reality glasses

Info

Publication number
CN114326129A
Authority
CN
China
Prior art keywords
lens
display
virtual reality
lens array
reality glasses
Prior art date
Legal status
Pending
Application number
CN202210160030.3A
Other languages
Chinese (zh)
Inventor
卢增祥
Current Assignee
Yixin Technology Development Co ltd
Original Assignee
Yixin Technology Development Co ltd
Priority date
Filing date
Publication date
Application filed by Yixin Technology Development Co ltd filed Critical Yixin Technology Development Co ltd
Priority to CN202210160030.3A
Publication of CN114326129A
Priority to PCT/CN2022/118671 (WO2023159919A1)

Classifications

    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)

Abstract

The invention provides virtual reality glasses comprising a display chip and a lens array. The display chip is located on the light-incident side of the lens array, and the virtual image formed by the light emitted from the display chip through the lens array is located on the side of the display chip facing away from the lens array. The display chip comprises a plurality of pixel units arranged in rows and columns along a first direction and a second direction; the pixel units arranged along the first direction have the same polarization state, and along the second direction the polarization states of the pixel units in two adjacent rows are orthogonal; the first direction is perpendicular to the second direction. The lens array comprises a plurality of lenses arranged in rows and columns along a third direction and a fourth direction; the lenses include first lenses and second lenses, with one second lens between every two adjacent first lenses along the third direction and one second lens between every two adjacent first lenses along the fourth direction, and the polarization states of the first lenses and the second lenses are orthogonal. The invention improves the display resolution and alleviates the visual fatigue and dizziness caused by the vergence-accommodation (convergence-focus) conflict.

Description

Virtual reality glasses
Technical Field
The invention relates to the technical field of VR (virtual reality), in particular to virtual reality glasses.
Background
VR: virtual Reality (Virtual Reality) is a Virtual environment created by computer graphics technology, sensor technology, display technology, optical devices, etc. to make a user feel immersive. At present, VR or AR equipment manufacturers have more tables. One of the solutions is to project a micro projector onto a reflective screen by using the principle of optical reflection projection, and refract the micro projector to the eyeball through a lens, so as to form an enlarged virtual screen in front of the eyeball for displaying the image and data. The other solution is that a head-mounted display is mainly used for games, stereoscopic impression is enhanced by binocular parallax, and immersion of the games is improved.
In both of the above VR glasses schemes, magnifying a small screen into a large virtual screen also magnifies the pixels, causing a loss of resolution. In addition, because there is a large difference between the focusing distance of the eyes and the distance perceived by the brain, such VR display devices easily produce symptoms such as fatigue and vertigo. To improve the resolution of VR glasses, the display pixels must be shrunk to a very small size, which runs into the diffraction limit of the optical system: an ideal image point of a pixel cannot be obtained, and the intended resolution of the VR glasses cannot be approached or reached.
Disclosure of Invention
The embodiment of the invention provides virtual reality glasses intended to improve the display resolution and to alleviate the visual fatigue and dizziness caused by the vergence-accommodation conflict.
The embodiment of the invention provides virtual reality glasses comprising a display chip and a lens array, wherein the display chip is located on the light-incident side of the lens array, and the virtual image formed by light emitted from the display chip through the lens array is located on the side of the display chip facing away from the lens array;
the display chip comprises a plurality of pixel units arranged in rows and columns along a first direction and a second direction, the pixel units arranged along the first direction have the same polarization state, and along the second direction the polarization states of pixel units in two adjacent rows are orthogonal; the first direction is perpendicular to the second direction;
the lens array comprises a plurality of lenses arranged in rows and columns along a third direction and a fourth direction, the plurality of lenses comprise first lenses and second lenses, one second lens is arranged between every two adjacent first lenses along the third direction, one second lens is arranged between every two adjacent first lenses along the fourth direction, the polarization states of the first lenses and the second lenses are orthogonal, and the third direction is perpendicular to the fourth direction.
Optionally, the first lens and the second lens have the same focal length; the first lens and the second lens are mounted on different planes.
Optionally, the focal length f of the first lens or the second lens satisfies a constraint (the inequality is reproduced only as a formula image in the original) involving emax, the maximum diameter of the human-eye pupil; emin, the minimum diameter of the human-eye pupil; L, the distance between the first lens and the human-eye pupil; D, the entrance pupil diameter of the lens; and the tracking error of the human-eye pupil position.
Optionally, the first lens and the second lens have different focal lengths; the first lens and the second lens are arranged on the same plane.
Optionally, the distance between the centers of adjacent lenses is t, and t satisfies a constraint (given only as a formula image in the original) involving emin, the minimum diameter of the human-eye pupil; D, the entrance pupil diameter of the lens; and the tracking error of the human-eye pupil position.
Optionally, the display device further comprises a driving module, wherein the driving module is used for driving the display chip and/or the lens array to move along the axial direction and driving the display chip and/or the lens array to vibrate along the vertical axis direction; the axial direction is perpendicular to the plane of the display chip, and the vertical axis direction is parallel to the plane of the display chip.
Optionally, the lens array further includes a lens array support frame, and the lenses are rotatably fixed on the lens array support frame;
the driving module comprises a telescopic device, and the telescopic device is used for driving the lenses to rotate so that the lens array swings in the vertical axis direction.
Optionally, the vertical axis direction is parallel to the second direction.
Optionally, the virtual image includes at least two display layers sequentially arranged; the virtual reality glasses further comprise a sensor and a controller, wherein the sensor is used for detecting the position of the lens array and feeding back the position to the controller in real time, and the controller is used for controlling the time point of lightening the display chip according to the position of the lens array so that the human eyes can see the complete picture displayed by each display layer.
Optionally, the virtual image includes at least two display layers arranged in sequence; the virtual reality glasses further comprise a human eye tracking device and a controller, wherein the human eye tracking device is used for tracking and determining the position of the human-eye pupil, so that a clear image can be observed over any eye movement range; the controller adjusts the display brightness of each display layer by adjusting the brightness of the pixel units, so that the picture formed by visual fusion of the display layers is displayed at a distance that conforms to the visual comfort characteristics of the human eye.
Optionally, the virtual image includes a first display layer, a second display layer, a third display layer, and a fourth display layer, which are sequentially arranged, and the first display layer is located between the display chip and the second display layer.
Optionally, the virtual reality glasses comprise a plurality of display modules, each display module comprises a display chip and a lens array arranged in parallel, the display chips are spliced, and the lens arrays are spliced;
and a non-zero included angle exists between every two adjacent display chips.
Optionally, the first lens and the second lens have the same focal length; the first lens and the second lens are mounted on the same plane.
Optionally, the virtual reality glasses further comprise a mechanical adjusting device, wherein the mechanical adjusting device is used for adjusting the initial position of the lens array and/or the display chip, so as to adapt to human eyes with different vision.
Optionally, the device further comprises a fixing device, a pupil detection device and a driving and detection device;
the fixing device and the pupil detection device comprise an eye tracking device, and the driving and detection device comprises a driving module, a sensor and a controller;
the human eye tracking device is used for tracking and determining the positions of pupils of human eyes, so that clear images can be observed in any eye movement range;
the driving module is used for driving the display chip and/or the lens array to move along the axial direction and driving the display chip and/or the lens array to vibrate along the vertical axis direction;
the sensor is used for detecting the position of the lens array and feeding back the position to the controller in real time.
Optionally, the system further comprises a human eye tracking device and a controller, wherein the human eye tracking device is used for tracking and determining the position of a human eye pupil;
the controller projects images to the pupil of the human eye according to the position of the human-eye pupil, so that the virtual reality glasses have a dynamic exit pupil and a clear image can be observed over any eye movement range.
The embodiment of the invention provides virtual reality glasses comprising a display chip and a lens array. The display chip has pixel units with orthogonal polarization states, the lens array has first lenses and second lenses with orthogonal polarization states, and the first lenses and the second lenses alternate along the third direction and the fourth direction. As a result, pixel units observed by the human eye through lenses of different polarization states do not interfere with one another. Because adjacent lenses have different polarization states, the display area of the display chip available to each lens is enlarged without changing the refresh rate of the display chip or the light-emitting area of the pixel units; the polarization arrangement increases the effective area of the display chip corresponding to a single lens and thus improves the display resolution. It also alleviates the visual fatigue and dizziness caused by the vergence-accommodation conflict.
Drawings
Fig. 1 is a schematic structural diagram of virtual reality glasses according to an embodiment of the present invention;
FIG. 2 is a schematic diagram illustrating a relative relationship between a display chip and a lens array according to an embodiment of the present invention;
fig. 3 is a front view of a display chip according to an embodiment of the present invention;
fig. 4 is a front view of a lens array according to an embodiment of the present invention;
fig. 5 is a perspective view of another lens array provided in the embodiment of the present invention;
FIG. 6 is a side view of the lens array of FIG. 5;
FIG. 7 is a schematic diagram of a display effect according to an embodiment of the present invention;
FIG. 8 is a schematic diagram of a portion of a display chip and a lens array;
FIG. 9 is a schematic view of light spots projected onto the pupils of a human eye when the pupils of the human eye are minimum;
fig. 10 is a schematic optical path diagram of virtual reality glasses according to an embodiment of the present invention;
FIG. 11 is a side view of another lens array provided by an embodiment of the invention;
fig. 12 is a schematic diagram of a first state of another virtual reality glasses according to an embodiment of the present invention;
FIG. 13 is a schematic diagram of the virtual reality glasses of FIG. 12 in a second state;
FIG. 14 is a schematic diagram of another display effect according to an embodiment of the present invention;
FIG. 15 is a schematic diagram of another display effect according to an embodiment of the present invention;
FIG. 16 is a schematic diagram of another display effect according to an embodiment of the present invention;
fig. 17 is a schematic view illustrating a lens rotating according to an embodiment of the present invention;
fig. 18 is a schematic optical path diagram of another virtual reality glasses according to an embodiment of the present invention;
FIG. 19 is a schematic diagram of an effective area of a display chip viewed by human eyes through a single lens under different brightness environments;
fig. 20 is a schematic structural diagram of another virtual reality glasses according to an embodiment of the present invention;
fig. 21 is a schematic structural view of another virtual reality glasses according to an embodiment of the present invention.
Detailed Description
The present invention will be described in further detail with reference to the accompanying drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the invention and are not limiting of the invention. It should be further noted that, for the convenience of description, only some of the structures related to the present invention are shown in the drawings, not all of the structures.
Fig. 1 is a schematic structural diagram of virtual reality glasses according to an embodiment of the present invention, fig. 2 is a schematic structural diagram of a relative relationship between a display chip and a lens array according to an embodiment of the present invention, fig. 3 is a front view of the display chip according to the embodiment of the present invention, fig. 4 is a front view of the lens array according to the embodiment of the present invention, referring to fig. 2 to fig. 4, the virtual reality glasses include a display chip 10 and a lens array 20, the display chip 10 is located on a light incident side of the lens array 20, and a virtual image formed by light emitted by the display chip 10 through the lens array 20 is located on a side of the display chip 10 away from the lens array 20.
The display chip 10 includes a plurality of pixel units arranged in rows and columns along a first direction and a second direction, and the plurality of pixel units arranged along the first direction have the same polarization state, that is, the plurality of pixel units in the same row have the same polarization state. In the second direction, the polarization states of the pixel units in two adjacent rows are orthogonal. The first direction is perpendicular to the second direction. Illustratively, the plurality of pixel units includes a first pixel unit 11 and a second pixel unit 12, and the polarization state of the first pixel unit 11 is orthogonal to the polarization state of the second pixel unit 12.
Illustratively, the mutually orthogonal polarization states may be linear polarization or circular polarization. For example, the polarization state of the first pixel unit 11 is vertically linear polarization, and the polarization state of the second pixel unit 12 is horizontally linear polarization. Alternatively, the polarization state of the first pixel unit 11 is left-handed circular polarization, and the polarization state of the second pixel unit 12 is right-handed circular polarization. Of course, the mutually orthogonal polarization states are not limited to linear polarization and circular polarization, and may be other forms of polarization.
The lens array 20 includes a plurality of lenses arranged in rows and columns in the third direction and the fourth direction, and the plurality of lenses includes a first lens 21 and a second lens 22. Along the third direction, two adjacent first lenses 21 are spaced by one second lens 22, two adjacent second lenses 22 are spaced by one first lens 21, and the first lenses 21 and the second lenses 22 are arranged at intervals one by one. Along the fourth direction, two adjacent first lenses 21 are spaced by one second lens 22, two adjacent second lenses 22 are spaced by one first lens 21, and the first lenses 21 and the second lenses 22 are arranged at intervals one by one. The polarization states of the first lens 21 and the second lens 22 are orthogonal. In one embodiment, the first direction is parallel to the third direction and the second direction is parallel to the fourth direction. In another embodiment, the first direction intersects the third direction and the second direction intersects the fourth direction.
Illustratively, the polarization state of the first lens 21 is vertically linear polarization, and the polarization state of the second lens 22 is horizontally linear polarization. Alternatively, the polarization state of the first lens 21 is left-handed circular polarization, and the polarization state of the second lens 22 is right-handed circular polarization.
It can be understood that when the polarization state of the light emitted by the pixel unit is the same as that of the lens (i.e. the polarization state is parallel), the light can completely pass through the lens; when the polarization state of the light emitted by the pixel unit is orthogonal to the polarization state of the lens, the light does not pass through the lens at all (namely is cut off).
In one embodiment, the polarization state of the first lens 21 is the same as the polarization state of the first pixel unit 11, and is orthogonal to the polarization state of the second pixel unit 12. The polarization state of the second lens 22 is orthogonal to the polarization state of the first pixel unit 11, and is the same as the polarization state of the second pixel unit 12. At this time, the first lens 21 may transmit light emitted from the first pixel unit 11 and cut off light emitted from the second pixel unit 12. The second lens 22 may transmit light emitted from the second pixel unit 12 and cut off light emitted from the first pixel unit 11.
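To make the pass/block behaviour and the interleaved layout concrete, the following Python sketch models them; it is an illustrative model only, and the grid orientation, the function names chip_polarization, lens_polarization and transmits, and the use of two symbolic states "H"/"V" are assumptions rather than anything specified by the patent.

```python
# Illustrative sketch (assumed names and layout): row-alternating pixel
# polarizations, a checkerboard of lens polarizations, and the pass/block rule.
H = "H"  # e.g. horizontal linear (or right-handed circular) polarization
V = "V"  # the orthogonal state

def chip_polarization(row: int, col: int) -> str:
    # Pixel units in the same row (first direction) share one polarization;
    # adjacent rows (second direction) are orthogonal.
    return V if row % 2 == 0 else H

def lens_polarization(lens_row: int, lens_col: int) -> str:
    # First and second lenses alternate along both the third and fourth
    # directions, giving a checkerboard of orthogonal polarization states.
    return V if (lens_row + lens_col) % 2 == 0 else H

def transmits(pixel_state: str, lens_state: str) -> bool:
    # Parallel polarization passes; orthogonal polarization is cut off.
    return pixel_state == lens_state

if __name__ == "__main__":
    # A pixel in row 0 is seen only through lenses whose state matches it.
    for lr in range(2):
        for lc in range(2):
            print(f"pixel(0,0) via lens({lr},{lc}):",
                  transmits(chip_polarization(0, 0), lens_polarization(lr, lc)))
```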
The embodiment of the invention provides virtual reality glasses comprising a display chip 10 and a lens array 20. The display chip 10 has pixel units with orthogonal polarization states, the lens array 20 has first lenses 21 and second lenses 22 with orthogonal polarization states, and the first lenses 21 and second lenses 22 alternate uniformly along the third direction and the fourth direction. As a result, pixel units observed by the human eye through lenses of different polarization states do not interfere with one another. Because adjacent lenses have different polarization states, the display area of the display chip 10 available to each lens is enlarged without changing the refresh rate of the display chip 10 or the light-emitting area of the pixel units; the polarization arrangement increases the effective area of the display chip corresponding to a single lens and thus improves the display resolution. It also alleviates the visual fatigue and dizziness caused by the vergence-accommodation conflict.
Fig. 5 is a perspective view of another lens array according to an embodiment of the invention, fig. 6 is a side view of the lens array shown in fig. 5, and fig. 7 is a schematic diagram of a display effect according to an embodiment of the invention. Referring to figs. 5 to 7, the first lens 21 and the second lens 22 have the same focal length, and the first lens 21 and the second lens 22 are mounted on different planes, so that the virtual images formed by the first lens 21 and the second lens 22 are located on different planes.
For example, referring to fig. 1 to 7, a virtual image formed by light emitted from the display chip 10 after passing through the second lens 22 is located on the first display layer 31, and a virtual image formed by light emitted from the display chip 10 after passing through the first lens 21 is located on the second display layer 32. The first display layer 31 is located between the second display layer 32 and the display chip 10.
According to the Rayleigh criterion, the angular resolution of an optical system satisfies sin β = 1.22 λ/D, where β is the angular resolution, λ is the wavelength and D is the entrance pupil diameter; β is about 0.406 mrad when D = 1.5 mm and λ = 500 nm. However, the depth of field of the lens is relatively small, so to achieve better depth perception the virtual reality glasses need to perform multilayer display and thereby produce more realistic stereoscopic vision.
In practice the acuity of the human eye varies with the environment and usually does not reach 1′ (1/60 degree). In one example of the present invention, the angular resolution of ordinary vision is taken as 1.5′ (1/40 degree, about 0.436 mrad). When a virtual image is formed, the image distance is large and the object distance is close to the focal length, so the angular resolution β of an imaged pixel unit satisfies

β ≈ pitch/f,    (1)

where pitch is the size of the pixel unit, f is the focal length of the lens, and "≈" denotes approximate equality.
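A short numerical check of the two relations above (the arithmetic and the variable names are mine; the wavelength, pupil diameter, visual acuity and the f = 10 mm example are values quoted in the text, while the derived pixel pitch is not a figure stated by the patent):

```python
import math

# Rayleigh criterion: sin(beta) = 1.22 * lambda / D
wavelength = 500e-9      # 500 nm
D = 1.5e-3               # 1.5 mm entrance pupil
beta_rayleigh = math.asin(1.22 * wavelength / D)
print(f"diffraction-limited beta ~ {beta_rayleigh * 1e3:.3f} mrad")  # ~0.407 (text quotes 0.406)

# Formula (1): beta ~ pitch / f.  For the assumed acuity of 1.5 arcmin
# (~0.436 mrad) and the example focal length f = 10 mm, the pixel pitch
# that just reaches that acuity would be:
beta_target = math.radians(1.5 / 60)   # 1.5 arcminutes in radians
f = 10e-3                              # 10 mm, example focal length from the text
pitch = beta_target * f
print(f"pixel pitch for 1.5' at f = 10 mm: {pitch * 1e6:.1f} um")    # ~4.4 um
```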
Fig. 8 is a schematic diagram of a portion of a display chip and a lens array. Referring to fig. 8, the dotted squares mark the areas of the display chip 10 that need to be displayed under the corresponding lenses, and the solid circles represent the lens array. To ensure that the images formed by adjacent lenses are completely spliced together, the area between a solid square and a dotted square in fig. 8 is set as a fusion area; the fusion areas corresponding to adjacent lenses display the same content and overlap after being projected to the pupil of the human eye, so that the images formed by adjacent lenses are fused. The distance between the centers of adjacent lenses is t.
Fig. 9 is a schematic diagram of the light spots projected onto the pupil of the human eye when the pupil is at its minimum. Referring to fig. 9, the distance between the centers of adjacent lenses is t, the polarization states of adjacent lenses are different, and t satisfies constraint (2) (the inequality is reproduced only as a formula image in the original), in which emin is the minimum diameter of the human-eye pupil, D is the entrance pupil diameter of the lens, and the remaining quantity is the tracking error of the human-eye pupil position. When the viewing distance is large, the entrance pupil diameter D is equivalent to the size of the light spot that a pixel unit, imaged by the lens, casts on the pupil. For the pupil to view an image close to infinity and for the brightness of the pixels spliced from adjacent lenses to be consistent, formula (2) must be satisfied: the pupil diameter must be large enough that all light spots emitted from adjacent lenses enter the pupil. With D = 1.5 mm and t = 2 mm (and the pupil-position tracking error value used in the original), the range of emin is obtained as emin = 4.5 mm; that is, a complete and clear picture is seen as long as the pupil diameter while wearing the virtual reality glasses is larger than 4.5 mm. A pupil diameter of 4 mm to 6 mm is comfortable for the human eye in a normal lighting environment, and the pupil contracts below 4 mm only in extremely bright surroundings. In general, because the virtual reality glasses provide light only from the display chip and external light cannot reach the pupil, the pupil is in a dark environment and its diameter is usually larger than 4 mm. When the displayed picture is bright and the pupil contracts, the embodiment of the invention can adjust the overall display brightness of the display chip so that the pupil does not reduce its diameter because the picture is too bright, and the picture seen by the eye is not lost because the display is too bright.
Fig. 10 is a schematic optical path diagram of virtual reality glasses according to an embodiment of the present invention. Referring to fig. 10, the picture displayed by the display chip 10 includes line segment a1-b1 and line segment c1-d1. The lens array 20 includes a lens A, a lens B and a lens C. The size of the pupil 30 of the human eye changes with the ambient light level: when the ambient light is bright the pupil 30 contracts, and when it is dim the pupil 30 dilates. The minimum diameter of the pupil 30 is denoted emin and the maximum diameter emax. The object distance of the imaging is u, i.e., the distance between the display chip 10 and the lens array 20 is u. The distance between the first lens 21 (i.e., the lens array 20) and the pupil 30 is L. f is the focal length of the lens; specifically, when the first lens 21 and the second lens 22 have the same focal length, f is the focal length of the first lens 21 or the second lens 22. The distance between any two adjacent lenses among lens A, lens B and lens C is 2 mm. Each pixel unit on the display chip 10 should be observable by the pupil 30 through the image formed by only one lens, which constrains the divergence angle of the pixel units on the display chip 10. Illustratively, the light beam leaving point b1 through lens B can enter the pupil 30, and the light beam leaving point c1 through lens C can also enter the pupil 30. Since the virtual image formed by the display chip 10 through the lens array 20 lies far away, the points c1 and b1 can be regarded as displayed at the same point on the image plane. When the displayed picture is dark the diameter of the pupil 30 increases, and the light beam leaving point b1 through lens C should not enter the pupil 30, so that display crosstalk is avoided.
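To make the lens A/B/C discussion concrete, here is a small paraxial chief-ray sketch. It is an illustrative assumption-based model (thin lenses, pupil centred on the axis, chief ray through the lens centre, u = 9.6 mm chosen to sit near the f = 10 mm example); the function names and the sample chip point are mine and not taken from the patent.

```python
# Paraxial chief-ray sketch (illustrative assumptions, see lead-in).  A chip
# point at lateral position x, a distance u behind the lens plane, sends a
# chief ray through the centre c of a lens; at the pupil plane, a distance L
# in front of the lenses, that ray arrives at c + (c - x) * L / u.
def ray_at_pupil_mm(x_mm: float, c_mm: float, u_mm: float, L_mm: float) -> float:
    return c_mm + (c_mm - x_mm) * L_mm / u_mm

def enters_pupil(pos_mm: float, pupil_diam_mm: float) -> bool:
    return abs(pos_mm) <= pupil_diam_mm / 2.0

u, L = 9.6, 17.0                            # mm; u close to the 10 mm focal length example
lenses = {"A": -2.0, "B": 0.0, "C": 2.0}    # lens centres at the 2 mm pitch of fig. 10
x = -1.0                                    # an example chip point near lens B (mm)
for pupil in (4.5, 8.0):                    # emin and emax quoted in the text
    hits = [name for name, c in lenses.items()
            if enters_pupil(ray_at_pupil_mm(x, c, u, L), pupil)]
    print(f"pupil {pupil} mm: chip point at {x} mm reaches the eye via {hits}")
# With the 8 mm pupil the point also reaches the eye through the immediately
# adjacent lens, but lenses at the 2 mm pitch alternate polarization, so that
# path is cut off; geometric crosstalk only matters between same-polarization
# lenses, which are 4 mm apart.
```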
Optionally, the focal length f of the lens satisfies a constraint (the inequality is reproduced only as a formula image in the original) involving D, the entrance pupil diameter of the lens, and the tracking error of the human-eye pupil position. Illustratively, with L = 17 mm, D = 1.5 mm, emin = 4.5 mm, emax = 8 mm and the pupil-position tracking error value used in the original, the result f < 13.8 mm is obtained. As can be seen from equation (1), the larger the focal length f of the lens, the smaller the angular resolution β and the higher the resolution. As an example, a lens with a focal length f = 10 mm and an entrance pupil diameter D = 1.5 mm can satisfy the display requirement. With L = 17 mm, the distance between the lens array 20 and the pupil 30 of the human eye is close to that of normally worn glasses.
Fig. 11 is a side view of another lens array according to an embodiment of the invention, and referring to fig. 11, the first lens 21 and the second lens 22 have different focal lengths. The first lens 21 and the second lens 22 are mounted on the same plane. In the embodiment of the present invention, the first lens 21 and the second lens 22 with different focal lengths are adopted, the first lens 21 and the second lens 22 are installed on the same plane, and light emitted by the display chip 10 passes through the first lens 21 and the second lens 22 to form two different display layers respectively.
In another embodiment, the first lens 21 and the second lens 22 have different focal lengths, and the first lens 21 and the second lens 22 are mounted on different planes, and light emitted from the display chip 10 passes through the first lens 21 and the second lens 22 to form two different display layers, respectively.
In the above embodiments, different display layers are formed statically, by mounting lenses of the same focal length on different planes or by mounting lenses of different focal lengths on the same plane. In some subsequent embodiments, the distance between the display chip 10 and the lens array 20 is changed dynamically, so that an original single display layer is displayed as a plurality of display layers at different points in time, or the original two display layers are displayed as a plurality of display layers at different points in time.
Fig. 12 is a schematic diagram of a first state of another virtual reality glasses according to an embodiment of the present invention, fig. 13 is a schematic diagram of a second state of the virtual reality glasses shown in fig. 12, and referring to fig. 12 to 13, the virtual reality glasses further include a driving module (not shown in fig. 12 and 13) for driving the display chip 10 and/or the lens array 20 to move along the axial direction L1, and driving the display chip 10 and/or the lens array 20 to vibrate (e.g., move) along the vertical axis direction L2. The axial direction L1 is perpendicular to the plane of the display chip 10, and the vertical direction L2 is parallel to the plane of the display chip 10. The axial direction L1 is perpendicular to the vertical axis direction L2.
Fig. 14 is a schematic view of another display effect according to an embodiment of the invention, and referring to fig. 12-14, an example is taken in which the driving module drives the lens array 20 to move. As shown in fig. 12, in the first state of the virtual reality glasses, the lens array 20 and the display chip 10 have a first distance therebetween. After the driving module drives the lens array 20 to move along the axial direction L1, as shown in fig. 13, the virtual reality glasses are in the second state, and a second distance is formed between the lens array 20 and the display chip 10. The position before the lens array 20 is moved in the axial direction L1 is indicated by a dashed box in fig. 13. Wherein the first distance is not equal to the second distance. Thus, by changing the distance between the display chip 10 and the lens array 20, the original one display layer is respectively displayed as two display layers at different time points; the original two display layers can also be respectively displayed as four display layers at different time points.
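How a sub-millimetre axial shift of the lens array moves the virtual image by metres can be seen from the ordinary thin-lens relation. The sketch below is a rough illustration under my own assumptions (thin-lens model, f = 10 mm as in the earlier example, image distance measured from the lens plane, and sample separations chosen by me); it is not a calculation reproduced from the patent.

```python
# Thin-lens sketch (assumed model and numbers): with the chip just inside the
# focal length, the virtual-image distance v = f*u / (f - u) is very sensitive
# to the chip-to-lens separation u, so a small axial motion of the lens array
# switches the picture between near and far display layers.
def virtual_image_distance_mm(u_mm: float, f_mm: float) -> float:
    return f_mm * u_mm / (f_mm - u_mm)

f = 10.0                                  # mm, example focal length from the text
for u in (9.62, 9.73, 9.86, 9.975):       # mm, assumed chip-to-lens separations
    v_m = virtual_image_distance_mm(u, f) / 1000.0
    print(f"u = {u} mm  ->  virtual image at ~{v_m:.2f} m")
# The four results (~0.25 m, 0.36 m, 0.70 m, 3.99 m) roughly match the four
# display-layer distances P1-P4 quoted later in the text.
```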
Illustratively, when the lens array 20 is arranged so that "lenses of the same focal length are on different planes, or lenses of different focal lengths are on the same plane", changing the distance between the display chip 10 and the lens array 20 can form the four display layers shown in fig. 14, namely a first display layer 31, a second display layer 32, a third display layer 33 and a fourth display layer 34. The second display layer 32 is located between the first display layer 31 and the third display layer 33, and the third display layer 33 is located between the second display layer 32 and the fourth display layer 34.
In one embodiment, the lens array 20 vibrates along the vertical axis while it moves axially. The vertical axis vibration is vibration in the vertical axis direction L2. Because of the vertical axis vibration, after a pixel unit on the display chip 10 is imaged by a lens with the same polarization state, its imaging point is also scanned by the vibration, tracing a linear array of pixels on the image plane. With the vertical axis vibration and the axial vibration superposed, one pixel unit on the display chip 10 can be imaged through the lens into two linear pixel arrays on two surfaces. Here the axial vibration is vibration in the axial direction L1. During the vertical axis vibration, the vibration direction of the lens array 20 forms an angle with the arrangement direction of the pixel units, that is, the vibration direction of the lens array 20 intersects the first direction, so the scan lines formed by a pixel unit are two oblique lines. The purpose of the vibration of the lens array 20 is to form two imaging display surfaces through the axial vibration and, on those two surfaces, a display effect far exceeding the resolution of the display chip through the vertical axis vibration. In short, the vertical axis vibration is used to improve the display resolution.
Illustratively, when the angle between the second direction and the fourth direction is 7° (tan(7°) ≈ 1/8), the vertical axis vibration makes the lateral resolution of one pixel unit 8 times the original resolution.
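A quick check of the 7° figure (plain arithmetic; the reading that one vertical-axis scan period then addresses about eight lateral sub-positions per pixel is my interpretation of the sentence above):

```python
import math

tilt_deg = 7.0
ratio = math.tan(math.radians(tilt_deg))          # ~0.1228, i.e. roughly 1/8
print(f"tan(7 deg) = {ratio:.4f} ~ 1/{1 / ratio:.1f}")
# With the vibration axis tilted ~7 deg from the pixel columns, scanning one
# pixel period along the long (vertical) axis shifts the image by ~1/8 pixel
# laterally, so one pixel unit covers about 8 lateral sub-positions per scan.
print(f"lateral sub-positions per pixel ~ {round(1 / ratio)}")
```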
In another embodiment, there may be only vertical axis vibration and no axial vibration; that is, the resolution is increased only by the vertical axis vibration, and no display layer is added by axial vibration. This suits the case where display layers are added statically, by setting the first lens 21 and the second lens 22 to the same focal length and mounting them on different planes, or by setting the first lens 21 and the second lens 22 to different focal lengths and mounting them on the same plane, and it also suits scenes where the resolution needs to be increased but the number of display layers does not.
Optionally, the vertical axis direction L2 is parallel to the second direction. The first direction and the third direction form a small included angle, and the second direction and the fourth direction form a small included angle. The vertical axis direction L2 is parallel to the column direction of the pixel units in the display chip 10.
The embodiment of the invention further describes the imaging modes of the virtual reality glasses. In one embodiment, the virtual reality glasses can form at least two display layers and display pictures on the at least two display layers. The display resolution of each display layer approaches or reaches retinal-quality resolution, the display depth of field is increased, and the visual fatigue caused by the vergence-accommodation conflict is alleviated.
Referring to fig. 7, the virtual image includes at least two display layers sequentially disposed. The virtual reality glasses further comprise a sensor and a controller (not shown in fig. 7), wherein the sensor is used for detecting the position of the lens array 20 and feeding back the position to the controller in real time, and the controller is used for controlling the time point of lighting the display chip 10 according to the position of the lens array 20, so that the human eyes can see the pictures displayed by the display layers completely.
Exemplarily, referring to fig. 7, the virtual image includes a first display layer 31 and a second display layer 32. When the lens array 20 is in the first position, the virtual reality glasses form the first display layer 31, and the controller lights up the pixel units of the display chip 10 for realizing the first display layer 31. When the lens array 20 is in the second position, the virtual reality glasses form the second display layer 32, and the controller lights up the pixel units of the display chip 10 for realizing the second display layer 32.
Exemplarily, referring to fig. 14, the virtual image includes a first display layer 31, a second display layer 32, a third display layer 33 and a fourth display layer 34. The distance between the first display layer 31 and the pupil 30 of the human eye is P1, the distance between the second display layer 32 and the pupil 30 is P2, the distance between the third display layer 33 and the pupil 30 is P3, and the distance between the fourth display layer 34 and the pupil 30 is P4. P1 is determined by the distance of distinct vision of the human eye, P4 by the requirement of comfortably viewing distant pictures, and P2 and P3 by the display layers and the lens vibration distance. The fourth display layer 34 is the farthest from the lens array 20, and its depth of field extends to near infinity.
Exemplarily, P1 = 0.25 m, P2 = 0.36 m, P3 = 0.69 m, and P4 = 4 m.
When the pupil 30 of the human eye views the picture of the display chip 10 through the lens array 20, a suitable display area of the display chip 10 is selected according to the relative position of the lens array 20 and the pupil 30, so that after the different areas of the display chip 10 are imaged through different lenses, the image planes (i.e., the display layers) can be completely spliced into one large picture. When the pupil 30 looks through the lens array 20, because the lenses are small and located close to the eye, the pupil 30 can only see the images formed through the lenses by areas of the display chip 10 and cannot see the gaps between the lenses.
In another embodiment, the virtual reality glasses may form at least two display layers, and make at least two of the display layers generate visual fusion, and the pictures are displayed on the virtual display layers after the visual fusion.
Fig. 15 is a schematic view of another display effect according to an embodiment of the present invention. Referring to fig. 15, the virtual image includes at least two display layers arranged in sequence. The virtual reality glasses further include a human eye tracking device and a controller (not shown in fig. 15), wherein the human eye tracking device is used to track and determine the position of the pupil 30 of the human eye, so that a clear image can be observed over any eye movement range. The controller adjusts the display brightness of each display layer by adjusting the brightness of the pixel units, so that the picture formed by visual fusion of the display layers is displayed at a distance that conforms to the visual comfort characteristics of the human eye. The eye movement range refers to the movement range of the pupil; the pupil can rotate up, down, left, right and in other directions. The virtual reality glasses provided by the invention give the eye a large eye movement range (i.e., eye movement distance) and allow a clear image to be observed anywhere within that range.
Exemplarily, referring to fig. 15, the virtual image includes a first display layer 31 and a second display layer 32. When the same picture is displayed with different brightness on the first display layer 31 and the second display layer 32 and viewed through the pupil 30, the two pictures are merged by human vision into one layer, the first virtual display layer 351. The first virtual display layer 351 lies between the first display layer 31 and the second display layer 32; its position is determined by the brightness of the pictures on the two display layers, and it is closer to the brighter display layer.
In a traditional VR eyepiece optical system, the exit pupil and the imaging distance determine the divergence angle of the light emitted by a pixel; when the image is far away the divergence angle is small, which greatly limits the eye movement range, and once the eye moves beyond the eye movement range of the optical system the displayed picture becomes unclear or is lost. In an embodiment of the present invention, the virtual reality glasses further include a human eye tracking device and a controller, wherein the human eye tracking device is configured to track and determine the position of the pupil of the human eye. The controller projects images toward the pupil according to the position of the human-eye pupil, so that the virtual reality glasses have a dynamic exit pupil and a clear image can be observed over any eye movement range. That is, by dynamically tracking the exit pupil, a large eye movement range is supported even though the beam divergence angle of a pixel unit behind the lens is small.
Fig. 16 is a schematic view of another display effect provided by an embodiment of the present invention. Referring to fig. 16, the virtual image includes a first display layer 31, a second display layer 32, a third display layer 33 and a fourth display layer 34. According to the pupil position determined by the eye tracking device, when the lens array 20 vibrates to a specific position, the light-emitting brightness of the pixel units in the display chip 10 is determined, and a virtual display layer is formed by adjusting the brightness of at least two of the four display layers (for example, the second display layer 32 and the third display layer 33 form the second virtual display layer 352). In the embodiment of the invention, by adjusting the display brightness of adjacent display layers, the visually fused picture is displayed at a distance that conforms to the visual comfort characteristics of the human eye, so that, combined with the depth of field of the lens and the accommodation of the eye, the eye can comfortably see a clear picture from the distance of distinct vision out to infinity.
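The text states only that the fused layer lies between the two physical layers and sits closer to the brighter one. The sketch below therefore uses a luminance-weighted interpolation in dioptric space as an assumed model of that behaviour; the weighting rule, the function name and the example numbers are mine, not the patent's.

```python
# Assumed model of visual fusion between two display layers: interpolate in
# dioptres, weighted by relative luminance, so the fused layer lies between the
# two layers and moves toward the brighter one (only this qualitative behaviour
# is stated in the text; the weighting law itself is an assumption).
def fused_depth_m(d_near_m: float, d_far_m: float,
                  lum_near: float, lum_far: float) -> float:
    w = lum_near / (lum_near + lum_far)
    dioptre = w * (1.0 / d_near_m) + (1.0 - w) * (1.0 / d_far_m)
    return 1.0 / dioptre

# Example with the second and third display layers (0.36 m and 0.69 m):
for lum_near, lum_far in ((1.0, 0.0), (0.5, 0.5), (0.0, 1.0)):
    d = fused_depth_m(0.36, 0.69, lum_near, lum_far)
    print(f"luminances {lum_near}/{lum_far} -> fused layer at ~{d:.2f} m")
```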
Fig. 17 is a schematic view illustrating the rotation of a lens according to an embodiment of the present invention. Referring to fig. 17, the lens array further includes a lens array support frame 23, and each lens (the first lens 21 or the second lens 22) is rotatably fixed on the lens array support frame 23. The driving module includes a telescopic device 51, which drives the lenses to rotate so that each lens swings independently, its angle with respect to its own optical axis changing. The swing amplitude can be within ±10°, and the plane of the swing is parallel to the line connecting two adjacent lenses of opposite polarization. Optically this is equivalent to a translational vibration in the vertical axis direction. The embodiment of the invention thus realizes the swing of each lens, and hence of the lens array, in the vertical axis direction.
For example, the telescopic device 51 may be an electrical signal driving device such as a piezoelectric ceramic, a memory metal wire, etc., the telescopic device 51 drives the lens to slightly rotate, at this time, an image formed by the display chip through the lens also rotates, the display pixels on the image plane are displaced, and because the rotation angle is small and the lens has a certain depth of field, the original image plane and the rotated image plane can still be considered to be on the same plane, and only the upper and lower positions of the pixel unit are changed.
Assuming that the rotation angle of the lens is α and the distance between the first lens and the pupil of the human eye is L, the pixel unit moving distance h satisfies h = sin(α)·L.
In one embodiment, one telescopic device 51 may be provided corresponding to a plurality of lenses; in another embodiment, one telescopic device 51 may be provided corresponding to one lens.
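A one-line numerical illustration of h = sin(α)·L (the sample angles are my own; L = 17 mm is the lens-to-pupil distance used elsewhere in the text):

```python
import math

L_mm = 17.0                          # lens-to-pupil distance from the examples
for alpha_deg in (0.1, 0.5, 1.0):    # assumed small rotation angles
    h_um = math.sin(math.radians(alpha_deg)) * L_mm * 1000.0
    print(f"alpha = {alpha_deg} deg  ->  displacement h ~ {h_um:.0f} um")
```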
Fig. 18 is a schematic optical path diagram of another pair of virtual reality glasses according to an embodiment of the present invention. Referring to fig. 18, only the ea region of the display chip 10 is imaged through lens a and seen by the pupil 30 of the human eye, and the eb region is imaged through lens b and seen by the pupil 30. Because L > u and e is very small, the ea and eb regions do not overlap, so light emitted by pixels in the eb region is not imaged to the pupil 30 through lens a, nor is light from the ea region imaged to the pupil 30 through lens b. However, when the ambient light is dim the pupil 30 dilates, with a maximum diameter of up to 8 mm; the ea and eb regions then grow, and there is a risk that they overlap.
Fig. 19 is a schematic view of the effective areas of the display chip seen by the human eye through a single lens under different brightness environments. In fig. 19, area k1 is the effective imaging display area on the display chip 10 corresponding to the minimum pupil diameter emin, and area k2 is the effective display area corresponding to the maximum pupil diameter emax. To keep the pixel units corresponding to each lens displaying independently without affecting the imaging of adjacent lenses, only the k1 area can be used for display. The area between k1 and k2 is therefore wasted, which lowers the utilization of the pixel units of the display chip 10.
Alternatively, the first lens 21 and the second lens 22 have the same focal length and are mounted on the same plane. Because adjacent pixel units have different polarization states and adjacent lenses have different polarization states, the light emitted by a pixel unit is imaged through a lens only when their polarization states are the same; pixel units observed by the pupil through lenses of different polarization states therefore do not interfere with one another, and the display areas neither overlap nor disturb each other's imaging. Because adjacent lenses differ in polarization state, the display area of the display chip can be enlarged to the k3 area shown in the figure without changing the refresh rate of the display chip 10 or the light-emitting area of its pixel units, which improves the display resolution.
Fig. 20 is a schematic structural view of another pair of virtual reality glasses according to an embodiment of the present invention. Referring to fig. 20, the virtual reality glasses include a plurality of display modules, and each display module includes a display chip and a lens array arranged in parallel. The display modules are a first module 1001, a second module 1002 and a third module 1003; each display module includes a display chip 10 and a plurality of lens arrays 20. The display modules are spliced together, i.e., the display chips 10 are spliced and the lens arrays 20 are spliced. Two adjacent display chips form a non-zero included angle, and because the display chip and the lens array within a display module are parallel, two adjacent lens arrays also form a non-zero included angle. By splicing the display modules in this way, the display field of view is enlarged.
Illustratively, the included angle between the first module 1001 and the second module 1002 is θ, with 20° ≤ θ ≤ 30°. The angle between the second module 1002 and the third module 1003 may also be θ; that is, the angle between the first module 1001 and the second module 1002 equals the angle between the second module 1002 and the third module 1003.
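As a rough sense of scale, the following back-of-the-envelope sketch estimates the combined horizontal field of view of the three tilted modules. It rests entirely on my own assumptions (each module covers a field of view fov_single of its own, taken here as 30°, and adjacent modules tilted by θ extend the coverage by θ each, which requires fov_single ≥ θ so that no gap opens up); none of these numbers come from the patent.

```python
# Back-of-the-envelope sketch under assumed numbers: tilting each additional
# module by theta extends the covered field of view by roughly theta, provided
# the single-module field of view is at least theta (so adjacent fields overlap).
def total_fov_deg(fov_single_deg: float, n_modules: int, theta_deg: float) -> float:
    return fov_single_deg + (n_modules - 1) * theta_deg

fov_single = 30.0                    # assumed single-module field of view, degrees
for theta in (20.0, 25.0, 30.0):     # the 20-30 degree range given in the text
    print(f"theta = {theta} deg, 3 modules -> ~{total_fov_deg(fov_single, 3, theta):.0f} deg total")
```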
Fig. 21 is a schematic structural view of another virtual reality glasses according to an embodiment of the present invention, and referring to fig. 21, the virtual reality glasses include a fixing device and pupil detection device 40 and a driving and detection device 50, the fixing device and pupil detection device 40 includes an eye tracking device, and the driving and detection device 50 includes a driving module, a sensor, and a controller. The virtual reality glasses further include a mechanical adjusting device, which may be disposed in the fixing device and the pupil detection device 40 or the driving and detection device 50, or may be disposed separately. The mechanical adjusting device is used for adjusting the initial position of the lens array 20 and/or the display chip 10, and is used for adapting to human eyes with different vision.
As an example, the embodiment of the present invention further provides design parameters for the virtual reality glasses. The lens pitch t is 2 mm, the entrance pupil diameter of the lens is 1.5 mm, and L = 17 mm. The first display layer 31, the second display layer 32, the third display layer 33 and the fourth display layer 34 can be arranged as calculated from the imaging formula (the formula and the accompanying table are reproduced only as images in the original). Here P1 and P2 are the two display layers formed by axial-vibration imaging of the lenses in one array plane, and P3 and P4 are the two display layers formed by the axial vibration of the lenses in the other array plane; the spacing between the two lens array planes is 243 µm, and the axial vibration displacement of the lenses is A1 = 117 µm. Because lenses of different polarization are arranged alternately, the vertical-axis component of the vibration must reach at least the position of the neighbouring lens; the resolution of the display chip 10 is determined by the size of the light-emitting point and by the mounting direction and included angle of the long vibration axis, and the period span is larger because of the polarization arrangement, so the effective amplitude A2 of the vertical-axis component of the lens array vibration is greater than 2 mm. In summary, the ratio of the long (vertical axis) to the short (axial) component of the two-dimensional vibration is k = A2/A1 > 2000/117; the axial displacement of the lenses is small relative to the vertical-axis displacement and can almost be ignored, as can the axial vibration process of the lens array.
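The 243 µm lens-plane spacing and the 117 µm axial stroke can be cross-checked against the display-layer distances P1-P4 (0.25 m, 0.36 m, 0.69 m, 4 m) quoted earlier. The sketch below is my own consistency check under assumptions (thin-lens model, f = 10 mm, image distance measured from the lens plane); it is not the calculation behind the patent's image-only formula and table.

```python
# Consistency check under assumptions (thin lens, f = 10 mm, image distance
# measured from the lens plane): solve the thin-lens relation for the
# chip-to-lens separation u that places the virtual image at distance v,
# u = f*v / (v + f), then compare the separations needed for P1-P4.
def object_distance_mm(v_mm: float, f_mm: float = 10.0) -> float:
    return f_mm * v_mm / (v_mm + f_mm)

P = {1: 250.0, 2: 360.0, 3: 690.0, 4: 4000.0}        # display-layer distances in mm
u = {k: object_distance_mm(v) for k, v in P.items()}

print(f"axial stroke P1 -> P2: {abs(u[2] - u[1]) * 1000:.0f} um")          # ~114 um (text: 117 um)
print(f"axial stroke P3 -> P4: {abs(u[4] - u[3]) * 1000:.0f} um")          # ~118 um (text: 117 um)
print(f"lens-plane spacing (P1 vs P3): {abs(u[3] - u[1]) * 1000:.0f} um")  # ~242 um (text: 243 um)
```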
As an example, the embodiment of the present invention further provides another set of design parameters for the virtual reality glasses. Because the virtual reality glasses can adjust the distance between the display chip 10 and the lens array 20, when the user's eyes are myopic the distance between the lens array 20 and the display chip 10 can be adjusted so that the imaging plane is closer to the eye than it is for a normal-sighted wearer; a myopic viewer can then watch the same display content as a normal-sighted viewer while wearing only the VR glasses. If the eye has 500 degrees of myopia, the picture of the first display layer is moved forward: compared with a non-myopic wearer, the distance between the lens array 20 and the display chip 10 is adjusted by 0.44 mm, which can be realized by the mechanical adjusting device.
The lens pitch t is 2 mm, the entrance pupil diameter of the lens is 1.5 mm, and L = 17 mm. When the eye has 500 degrees of myopia, the first display layer 31, the second display layer 32, the third display layer 33 and the fourth display layer 34 can be set as calculated from the imaging formula (the corresponding table is reproduced only as an image in the original).
it is to be noted that the foregoing is only illustrative of the preferred embodiments of the present invention and the technical principles employed. It will be understood by those skilled in the art that the present invention is not limited to the particular embodiments described herein, but is capable of various obvious modifications, rearrangements, combinations and substitutions as will now become apparent to those skilled in the art without departing from the scope of the invention. Therefore, although the present invention has been described in greater detail by the above embodiments, the present invention is not limited to the above embodiments, and may include other equivalent embodiments without departing from the spirit of the present invention, and the scope of the present invention is determined by the scope of the appended claims.

Claims (16)

1. Virtual reality glasses, characterized by comprising a display chip and a lens array, wherein the display chip is located on the light-incident side of the lens array, and the virtual image formed by light emitted from the display chip through the lens array is located on the side of the display chip facing away from the lens array;
the display chip comprises a plurality of pixel units arranged in rows and columns along a first direction and a second direction, the pixel units arranged along the first direction have the same polarization state, and along the second direction the polarization states of pixel units in two adjacent rows are orthogonal; the first direction is perpendicular to the second direction;
the lens array comprises a plurality of lenses arranged in rows and columns along a third direction and a fourth direction, the plurality of lenses comprise first lenses and second lenses, one second lens is arranged between every two adjacent first lenses along the third direction, one second lens is arranged between every two adjacent first lenses along the fourth direction, the polarization states of the first lenses and the second lenses are orthogonal, and the third direction is perpendicular to the fourth direction.
2. The virtual reality glasses of claim 1, wherein the first lens and the second lens have the same focal length;
the first lens and the second lens are mounted on different planes.
3. The virtual reality glasses according to claim 2, wherein the focal length f of the first lens or the second lens satisfies a constraint (the inequality is reproduced only as a formula image in the original) involving emax, the maximum diameter of the human-eye pupil; emin, the minimum diameter of the human-eye pupil; L, the distance between the first lens and the human-eye pupil; D, the entrance pupil diameter of the lens; and the tracking error of the human-eye pupil position.
4. The virtual reality glasses of claim 1, wherein the first lens and the second lens have different focal lengths;
the first lens and the second lens are arranged on the same plane.
5. The virtual reality glasses according to claim 1, wherein the distance between the centers of adjacent lenses is t, and t satisfies a constraint (given only as a formula image in the original) involving emin, the minimum diameter of the human-eye pupil; D, the entrance pupil diameter of the lens; and the tracking error of the human-eye pupil position.
6. The virtual reality glasses according to claim 1, further comprising a driving module, wherein the driving module is configured to drive the display chip and/or the lens array to move in an axial direction and to drive the display chip and/or the lens array to vibrate in a vertical axis direction;
the axial direction is perpendicular to the plane of the display chip, and the vertical axis direction is parallel to the plane of the display chip.
7. The virtual reality glasses of claim 6, wherein the lens array further comprises a lens array support frame, and the lenses are rotatably mounted on the lens array support frame;
the driving module comprises a telescopic device, and the telescopic device is configured to drive the lenses to rotate so that the lens array swings along the vertical axis direction.
8. The virtual reality glasses of claim 6, wherein the vertical axis direction is parallel to the fourth direction.
9. The virtual reality glasses of claim 6, wherein the virtual image comprises at least two display layers arranged in sequence;
the virtual reality glasses further comprise a sensor and a controller, wherein the sensor is configured to detect the position of the lens array and feed it back to the controller in real time, and the controller is configured to control the time points at which the display chip is lit according to the position of the lens array, so that the human eye sees the complete picture displayed by each display layer.
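A hypothetical sketch of the synchronization described in claim 9: the sensor reports the lens-array position, and the controller lights the display chip only when that position matches the one assigned to a given display layer. The data structure, tolerance and callback below are assumptions for illustration, not taken from the claim.

```python
# Hypothetical control loop: light the display chip only when the oscillating
# lens array is within a small tolerance of the position assigned to a layer.

from dataclasses import dataclass

@dataclass
class LayerSlot:
    layer_id: int
    lens_position_um: float   # axial lens-array position assigned to this layer
    frame: object             # pre-rendered content for this layer

def update(sensor_position_um: float, slots: list[LayerSlot],
           light_up, tolerance_um: float = 2.0) -> None:
    """Called at the sensor feedback rate; light_up(frame) drives the chip."""
    for slot in slots:
        if abs(sensor_position_um - slot.lens_position_um) <= tolerance_um:
            light_up(slot.frame)   # show this layer's picture at the right instant
            break
```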
10. The virtual reality glasses of claim 1, wherein the virtual image comprises at least two display layers arranged in sequence;
the virtual reality glasses further comprise a human eye tracking device and a controller, wherein the human eye tracking device is configured to track and determine the position of the pupil of the human eye, so that clear images can be observed within any eye movement range; the controller adjusts the display brightness of each display layer by adjusting the brightness of the pixel units, so that the picture formed by visual fusion of the display layers is displayed at a distance that matches the visual comfort characteristics of the human eye.
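One way such brightness-based fusion is commonly realized in multi-layer displays is depth-fused rendering, where brightness is split between two adjacent layers according to where the target depth lies between them. The weighting rule below is an assumption used for illustration; the claim does not specify how the brightness is computed.

```python
# Hypothetical depth-fused brightness weighting (the linear-in-dioptres rule
# is an assumption, not taken from the claim).

def fused_layer_weights(target_depth_m: float, near_layer_m: float,
                        far_layer_m: float) -> tuple[float, float]:
    """Split brightness between two adjacent layers so the fused image is
    perceived near target_depth_m."""
    d_t, d_n, d_f = 1 / target_depth_m, 1 / near_layer_m, 1 / far_layer_m
    w_near = (d_t - d_f) / (d_n - d_f)
    w_near = min(max(w_near, 0.0), 1.0)   # clamp to the layer interval
    return w_near, 1.0 - w_near

# Example: fuse a point at 0.8 m between layers at 0.5 m and 2.0 m
# -> (0.5, 0.5), i.e. equal brightness on both layers.
w_near, w_far = fused_layer_weights(0.8, 0.5, 2.0)
```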
11. The virtual reality glasses according to claim 1, wherein the virtual image includes a first display layer, a second display layer, a third display layer, and a fourth display layer arranged in sequence, and the first display layer is located between the display chip and the second display layer.
12. The virtual reality glasses according to claim 1, comprising a plurality of display modules, wherein each display module comprises the display chip and the lens array arranged in parallel; the display chips of the display modules are spliced together, and the lens arrays of the display modules are spliced together;
a non-zero included angle exists between every two adjacent display chips.
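Purely to illustrate the tiled geometry of claim 12 (the chip count and angle are assumed values, not from the claim), the sketch below computes the orientation of each display chip when every two adjacent chips meet at the same non-zero included angle.

```python
# Hypothetical tiling geometry: N display chips placed edge to edge, each plane
# turned by a fixed amount relative to its neighbour.

def chip_orientations(n_chips: int, included_angle_deg: float) -> list[float]:
    """Orientation of each chip plane, in degrees, when adjacent chips meet at
    the given included angle (e.g. 170 deg -> a 10 deg turn per joint)."""
    turn = 180.0 - included_angle_deg
    mid = (n_chips - 1) / 2.0
    return [(i - mid) * turn for i in range(n_chips)]

# Example: five modules meeting at 170-degree included angles turn by
# 4 * 10 = 40 degrees in total, wrapping slightly around the eye.
angles = chip_orientations(5, 170.0)   # [-20.0, -10.0, 0.0, 10.0, 20.0]
```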
13. The virtual reality glasses of claim 1, wherein the first lens and the second lens have the same focal length;
the first lens and the second lens are mounted on the same plane.
14. The virtual reality glasses according to claim 1, further comprising a mechanical adjustment device for adjusting an initial position of the lens array and/or the display chip, so as to accommodate human eyes with different vision.
15. The virtual reality glasses according to claim 14, further comprising a fixing and pupil detection device and a driving and detection device;
the fixing and pupil detection device comprises a human eye tracking device, and the driving and detection device comprises a driving module, a sensor and a controller;
the human eye tracking device is used for tracking and determining the positions of pupils of human eyes, so that clear images can be observed in any eye movement range;
the driving module is used for driving the display chip and/or the lens array to move along the axial direction and driving the display chip and/or the lens array to vibrate along the vertical axis direction;
the sensor is used for detecting the position of the lens array and feeding back the position to the controller in real time.
16. The virtual reality glasses according to claim 1, further comprising a human eye tracking device and a controller, wherein the human eye tracking device is configured to track and determine a position of a pupil of a human eye;
the controller projects images to the pupils of the human eyes according to the positions of the pupils of the human eyes, so that the virtual reality glasses have a dynamic exit pupil, and clear images can be observed within any eye movement range.
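A hypothetical sketch of a dynamic exit pupil: the tracked pupil position is used to select the lens (and hence the sub-image) that best reaches the pupil. The nearest-center rule, pitch and coordinates below are assumptions for illustration and are not taken from the claim.

```python
# Hypothetical dynamic-exit-pupil selection: pick the lens whose center is
# closest to the tracked pupil position projected onto the lens-array plane.

import math

def nearest_lens(pupil_xy_mm: tuple[float, float],
                 lens_centers_mm: list[tuple[float, float]]) -> int:
    """Index of the lens whose center is closest to the projected pupil position."""
    px, py = pupil_xy_mm
    return min(range(len(lens_centers_mm)),
               key=lambda i: math.hypot(lens_centers_mm[i][0] - px,
                                        lens_centers_mm[i][1] - py))

# Example: a 3 x 3 lens array with a 2 mm pitch, pupil tracked at (0.6, -0.4) mm.
pitch = 2.0
centers = [(c * pitch, r * pitch) for r in (-1, 0, 1) for c in (-1, 0, 1)]
active = nearest_lens((0.6, -0.4), centers)   # sub-image behind this lens is shown
```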
CN202210160030.3A 2022-02-22 2022-02-22 Virtual reality glasses Pending CN114326129A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202210160030.3A CN114326129A (en) 2022-02-22 2022-02-22 Virtual reality glasses
PCT/CN2022/118671 WO2023159919A1 (en) 2022-02-22 2022-09-14 Virtual reality glasses

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210160030.3A CN114326129A (en) 2022-02-22 2022-02-22 Virtual reality glasses

Publications (1)

Publication Number Publication Date
CN114326129A true CN114326129A (en) 2022-04-12

Family

ID=81030787

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210160030.3A Pending CN114326129A (en) 2022-02-22 2022-02-22 Virtual reality glasses

Country Status (2)

Country Link
CN (1) CN114326129A (en)
WO (1) WO2023159919A1 (en)

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101255210B1 (en) * 2006-05-04 2013-04-23 삼성전자주식회사 Multiview autostereoscopic display
JP5127530B2 (en) * 2008-03-26 2013-01-23 株式会社東芝 Stereoscopic image display device
CN108445633A (en) * 2018-03-30 2018-08-24 京东方科技集团股份有限公司 A kind of VR head-mounted display apparatus, VR display methods and VR display systems
CN213876192U (en) * 2021-01-28 2021-08-03 成都工业学院 Multi-resolution stereoscopic display device based on polarization grating
CN114326129A (en) * 2022-02-22 2022-04-12 亿信科技发展有限公司 Virtual reality glasses

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110063289A1 (en) * 2008-05-08 2011-03-17 Seereal Technologies S.A. Device for displaying stereoscopic images
CN108886610A (en) * 2016-03-15 2018-11-23 深见有限公司 3D display device, methods and applications
WO2017164573A1 (en) * 2016-03-23 2017-09-28 Samsung Electronics Co., Ltd. Near-eye display apparatus and near-eye display method
CN107505720A (en) * 2017-09-14 2017-12-22 北京邮电大学 A kind of 3 d light fields display device based on cross-polarization
CN108732763A (en) * 2018-05-31 2018-11-02 京东方科技集团股份有限公司 Display screen and head for virtual reality show device and show system with virtual reality head

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023159919A1 (en) * 2022-02-22 2023-08-31 亿信科技发展有限公司 Virtual reality glasses
CN114740625A (en) * 2022-04-28 2022-07-12 珠海莫界科技有限公司 Optical machine, control method of optical machine and AR near-to-eye display device
WO2023207577A1 (en) * 2022-04-28 2023-11-02 珠海莫界科技有限公司 Optical engine, control method for optical engine, and ar near-eye display apparatus

Also Published As

Publication number Publication date
WO2023159919A1 (en) 2023-08-31

Similar Documents

Publication Publication Date Title
JP6821574B2 (en) Display device with total internal reflection
US10685492B2 (en) Switchable virtual reality and augmented/mixed reality display device, and light field methods
US6813085B2 (en) Virtual reality display device
CN104854864B (en) Time multiplexing display with lateral operation pattern and longitudinal operator scheme
US20060033992A1 (en) Advanced integrated scanning focal immersive visual display
KR101660411B1 (en) Super multi-view 3D display apparatus
US20130286053A1 (en) Direct view augmented reality eyeglass-type display
CN107076984A (en) Virtual image maker
WO2023159919A1 (en) Virtual reality glasses
CA2253482A1 (en) Multiple viewer system for displaying a plurality of images
JP2010224129A (en) Stereoscopic image display device
WO2021139204A1 (en) Three-dimensional display device and system
CN114365027A (en) System and method for displaying object with depth of field
CN112346252A (en) Near-to-eye display device
JP2009098326A (en) Three-dimensional image forming apparatus
US20240061246A1 (en) Light field directional backlighting based three-dimensional (3d) pupil steering
JP5487935B2 (en) Display device and display method
CN116420104A (en) Virtual image display system for a virtual reality and augmented reality device
JP2017037304A (en) Video unit for head-up display, head-up display, and method for generating front-facing image for stereoscopic vision using video unit
JP2016110146A (en) Image display device
CN108732772B (en) Display device and driving method thereof
JP2003255265A (en) Stereoscopic image display device
WO2011048854A1 (en) Three-dimensional image display device
US11754768B2 (en) Augmented reality display device
US20230360567A1 (en) Virtual reality display system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
REG Reference to a national code

Ref country code: HK

Ref legal event code: DE

Ref document number: 40063975

Country of ref document: HK