WO2023159919A1 - Virtual reality glasses - Google Patents

Virtual reality glasses

Info

Publication number
WO2023159919A1
Authority
WO
WIPO (PCT)
Prior art keywords
lens
display
human eye
virtual reality
pupil
Application number
PCT/CN2022/118671
Other languages
English (en)
French (fr)
Inventor
卢增祥
Original Assignee
亿信科技发展有限公司
Application filed by 亿信科技发展有限公司
Publication of WO2023159919A1

Classifications

    • G: PHYSICS
    • G02: OPTICS
    • G02B: OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00: Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01: Head-up displays

Definitions

  • The present disclosure relates to the technical field of virtual reality (VR), and in particular to virtual reality glasses.
  • Virtual reality (VR) uses computer graphics technology, sensor technology, display technology, optical devices and the like to create a virtual environment that gives users a sense of immersion.
  • One solution uses the principle of optical reflection projection: a micro-projector projects an image onto a reflective screen, and the image is refracted to the human eye through a lens, forming an enlarged virtual screen in front of the eyes for displaying patterns and data.
  • A head-mounted display is mainly used for games; it uses binocular parallax to enhance the three-dimensional sense and improve the immersion of the game.
  • To improve resolution, the display pixels need to be reduced in size. If a pixel is too small, however, the optical system encounters its diffraction limit: an ideal imaging point for the pixel cannot be obtained, and the resolution of the retina cannot be approached or reached.
  • An embodiment of the present disclosure provides virtual reality glasses to improve display resolution and to solve the visual fatigue and dizziness caused by convergence and focus conflicts.
  • An embodiment of the present disclosure provides virtual reality glasses including a display chip and a lens array; the display chip is located on the light-incident side of the lens array, and the virtual image formed by the light emitted from the display chip through the lens array is located on the side of the display chip away from the lens array;
  • the display chip includes a plurality of pixel units arranged in rows and columns along a first direction and a second direction; the plurality of pixel units arranged along the first direction have the same polarization state, and along the second direction the polarization states of the pixel units in two adjacent rows are orthogonal; the first direction is perpendicular to the second direction;
  • the lens array includes a plurality of lenses arranged in rows and columns along a third direction and a fourth direction; the plurality of lenses include a first lens and a second lens; along the third direction, one second lens is arranged between two adjacent first lenses, and along the fourth direction, one second lens is arranged between two adjacent first lenses; the polarization states of the first lens and the second lens are orthogonal, and the third direction is perpendicular to the fourth direction.
  • FIG. 1 is a schematic structural diagram of a virtual reality glasses provided by an embodiment of the present disclosure
  • FIG. 2 is a schematic diagram of a relative relationship between a display chip and a lens array provided by an embodiment of the present disclosure
  • FIG. 3 is a front view of a display chip provided by an embodiment of the present disclosure.
  • FIG. 4 is a front view of a lens array provided by an embodiment of the present disclosure.
  • FIG. 5 is a perspective view of another lens array provided by an embodiment of the present disclosure.
  • Fig. 6 is a side view of the lens array shown in Fig. 5;
  • FIG. 7 is a schematic diagram of a display effect provided by an embodiment of the present disclosure.
  • FIG. 8 is a schematic diagram of a part of the display chip and lens array
  • Fig. 9 is a schematic diagram of the light spot projected to the pupil of the human eye when the diameter of the pupil of the human eye is the smallest;
  • FIG. 10 is a schematic diagram of an optical path of a virtual reality glasses provided by an embodiment of the present disclosure.
  • FIG. 11 is a side view of another lens array provided by an embodiment of the present disclosure.
  • FIG. 12 is a schematic diagram of the first state of another virtual reality glasses provided by an embodiment of the present disclosure.
  • Fig. 13 is a schematic diagram of a second state of the virtual reality glasses shown in Fig. 12;
  • FIG. 14 is a schematic diagram of another display effect provided by an embodiment of the present disclosure.
  • FIG. 15 is a schematic diagram of another display effect provided by an embodiment of the present disclosure.
  • FIG. 16 is a schematic diagram of another display effect provided by an embodiment of the present disclosure.
  • FIG. 17 is a schematic diagram of rotation of the lens provided by an embodiment of the present disclosure.
  • FIG. 18 is a schematic diagram of an optical path of another virtual reality glasses provided by an embodiment of the present disclosure.
  • FIG. 19 is a schematic diagram of the effective area of the display chip viewed by the human eye through a single lens under different brightness environments;
  • FIG. 20 is a schematic structural diagram of another virtual reality glasses provided by an embodiment of the present disclosure.
  • Fig. 21 is a schematic structural diagram of another virtual reality glasses provided by an embodiment of the present disclosure.
  • Fig. 22 is a schematic diagram of a hardware structure of a controller provided by an embodiment
  • Fig. 23 is a schematic diagram of a partial hardware structure of a virtual reality glasses provided by an embodiment.
  • Referring to FIG. 1 to FIG. 4:
  • the virtual reality glasses include a display chip 10 and a lens array 20; the display chip 10 is located on the light-incident side of the lens array 20, and the virtual image formed by the light emitted from the display chip 10 through the lens array 20 is located on the side of the display chip 10 away from the lens array 20.
  • the display chip 10 includes a plurality of pixel units arranged in rows and columns along the first direction and the second direction, and the plurality of pixel units arranged along the first direction have the same polarization state, that is, the plurality of pixel units in the same row have the same polarization state.
  • the polarization states of the pixel units in two adjacent rows are orthogonal.
  • the first direction is perpendicular to the second direction.
  • the plurality of pixel units includes a first pixel unit 11 and a second pixel unit 12 , and the polarization state of the first pixel unit 11 is orthogonal to the polarization state of the second pixel unit 12 .
  • the mutually orthogonal polarization states may be linear polarization or circular polarization.
  • the polarization state of the first pixel unit 11 is vertical linear polarization
  • the polarization state of the second pixel unit 12 is horizontal linear polarization.
  • the polarization state of the first pixel unit 11 is left-handed circular polarization
  • the polarization state of the second pixel unit 12 is right-handed circular polarization.
  • the mutually orthogonal polarization states may be linear polarization or circular polarization, or other forms of polarization.
  • the lens array 20 includes a plurality of lenses arranged in rows and columns along the third direction and the fourth direction, and the plurality of lenses include a first lens 21 and a second lens 22 .
  • along the third direction, a second lens 22 is arranged between two adjacent first lenses 21 and a first lens 21 is arranged between two adjacent second lenses 22; the same alternation holds along the fourth direction, so the first lenses 21 and the second lenses 22 are arranged alternately in a checkerboard pattern.
  • the polarization states of the first lens 21 and the second lens 22 are orthogonal.
  • the first direction is parallel to the third direction
  • the second direction is parallel to the fourth direction.
  • the first direction intersects the third direction
  • the second direction intersects the fourth direction.
  • the polarization state of the first lens 21 is vertical linear polarization
  • the polarization state of the second lens 22 is horizontal linear polarization
  • the polarization state of the first lens 21 is left-handed circular polarization
  • the polarization state of the second lens 22 is right-handed circular polarization.
  • the polarization state of the first lens 21 is the same as that of the first pixel unit 11 , and is orthogonal to the polarization state of the second pixel unit 12 .
  • the polarization state of the second lens 22 is orthogonal to the polarization state of the first pixel unit 11 , and is the same as the polarization state of the second pixel unit 12 .
  • the first lens 21 can transmit the light emitted by the first pixel unit 11 and block the light emitted by the second pixel unit 12 .
  • the second lens 22 can transmit the light emitted by the second pixel unit 12 and block the light emitted by the first pixel unit 11.
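The polarization-selection rule described above can be sketched in code. This is an illustrative model only; the row/column indexing convention and the "V"/"H" labels are assumptions, not taken from the patent:

```python
def pixel_polarization(row):
    """Pixel rows along the first direction share one state; adjacent rows are orthogonal."""
    return "V" if row % 2 == 0 else "H"  # vertical vs. horizontal linear polarization

def lens_polarization(r, c):
    """First and second lenses alternate in a checkerboard along the third and fourth directions."""
    return "V" if (r + c) % 2 == 0 else "H"

def transmits(pixel_row, lens_r, lens_c):
    """A lens passes light only from pixel units with a matching polarization state."""
    return pixel_polarization(pixel_row) == lens_polarization(lens_r, lens_c)

# A "V" lens passes even ("V") pixel rows and blocks odd ("H") rows:
assert transmits(0, 0, 0) and not transmits(1, 0, 0)
# Its checkerboard neighbour does the opposite, so the two sub-images do not interfere:
assert transmits(1, 0, 1) and not transmits(0, 0, 1)
```

Each lens thus effectively sees only half of the pixel rows, which is how adjacent lenses can share an enlarged display area without crosstalk.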
  • An embodiment of the present disclosure provides virtual reality glasses including a display chip 10 and a lens array 20, wherein the display chip 10 is provided with pixel units whose polarization states are orthogonal, and the lens array 20 is provided with first lenses 21 and second lenses 22 whose polarization states are orthogonal.
  • the first lenses 21 and the second lenses 22 are arranged alternately along the third direction and the fourth direction, so the pixel units observed by the human eye through lenses of different polarization states do not interfere with each other.
  • In this way, the display area of the display chip 10 can be enlarged while the refresh rate of the display chip 10 and the light-emitting area of the pixel units remain unchanged; the effective area of the display chip corresponding to a single lens is increased by setting the polarization states, thereby improving the display resolution and alleviating the visual fatigue and dizziness caused by convergence and focus conflicts.
  • Fig. 5 is a perspective view of another lens array provided by an embodiment of the present disclosure
  • Fig. 6 is a side view of the lens array shown in Fig. 5
  • Fig. 7 is a schematic diagram of a display effect provided by an embodiment of the present disclosure, refer to Fig. 5 - Fig. 7.
  • the first lens 21 and the second lens 22 have the same focal length.
  • the first lens 21 and the second lens 22 are installed on different planes. Therefore, the virtual images formed by the first lens 21 and the second lens 22 are located at different plane positions.
  • the virtual image formed by the light emitted by the display chip 10 after passing through the second lens 22 is located on the first display layer 31, and the virtual image formed after passing through the first lens 21 is located on the second display layer 32.
  • the first display layer 31 is located between the second display layer 32 and the display chip 10 .
  • According to the Rayleigh criterion, the diffraction-limited angular resolution of the lens satisfies θ = 1.22λ / D,
  • where θ is the angular resolution,
  • λ is the wavelength,
  • and D is the diameter of the entrance pupil.
  • For visible light (λ on the order of 0.5 μm) and D = 1.5 mm, θ is approximately equal to 0.406 mrad. But at this time the depth of field of the lens is relatively small; in order to achieve better depth vision, the virtual reality glasses are required to perform multi-layer display to achieve more realistic stereo vision.
  • the vision of the human eye changes with the environment, and its resolution usually does not reach 1' (1/60 of a degree); a typical value is about 1.5' (about 0.436 mrad). When forming a virtual image, because the image distance is large, the object distance is close to the focal point, and the angular resolution ω of the image of a pixel unit satisfies:
  • ω ≈ pitch / f  (1)
  • where pitch is the size of the pixel unit,
  • f is the focal length of the lens,
  • and the symbol "≈" means approximately equal to.
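The two angular-resolution relations above can be checked numerically. The wavelength of 0.5 μm is an assumed visible-light value chosen so the result matches the 0.406 mrad figure quoted here; the pitch and focal length are illustrative, not values from the disclosure:

```python
def rayleigh_limit(lam, D):
    """Diffraction-limited angular resolution: theta = 1.22 * lam / D (radians)."""
    return 1.22 * lam / D

def pixel_angular_resolution(pitch, f):
    """Angular size of a pixel imaged with the object near the focal plane: omega ~ pitch / f."""
    return pitch / f

theta = rayleigh_limit(0.5e-6, 1.5e-3)          # ~0.407 mrad, matching the ~0.406 mrad above
omega = pixel_angular_resolution(5e-6, 10e-3)   # 0.5 mrad for a 5 um pixel and f = 10 mm
print(f"diffraction limit: {theta * 1e3:.3f} mrad")
print(f"pixel resolution:  {omega * 1e3:.3f} mrad")
# Shrinking the pixel pitch below the diffraction limit gains nothing: keep omega >= theta.
```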
  • FIG. 8 is a schematic diagram of a part of the display chip and lens array.
  • the dotted line square array is the area to be displayed by the display chip 10 under the corresponding lens.
  • the circular arrays in solid lines are lens arrays.
  • the area between the solid-line square and the dotted-line square in FIG. 8 is set as a fusion area; the fusion areas corresponding to adjacent lenses display the same image, which overlaps after being projected to the pupil of the human eye, realizing the fusion of the images formed by adjacent lenses.
  • the distance between the centers of adjacent lenses is t.
  • Fig. 9 is a schematic diagram of the light spot projected onto the pupil of the human eye when the pupil diameter is at its minimum. Referring to Fig. 9, the distance between the centers of adjacent lenses is t, the polarization states of adjacent lenses are different, and emin, t and D satisfy formula (2),
  • where emin is the minimum diameter of the human eye pupil,
  • and D is the entrance pupil diameter of the lens.
  • the entrance pupil diameter D is equivalent to the size of the light spot incident on the pupil of the human eye after a pixel unit is imaged by the lens; for the displayed brightness to be consistent, formula (2) above must be satisfied.
  • the pupil diameter should be large enough that the light spots emitted from adjacent lenses can enter the pupil of the human eye.
  • For example, with D = 1.5 mm and t = 2 mm, the pupil diameter must be greater than 4.5 mm in order to view a complete and clear image.
  • 4 mm to 6 mm is the comfortable pupil diameter of the human eye in a normal lighting environment; only in an extremely bright environment is the pupil diameter adjusted to less than 4 mm. Under normal circumstances, since the virtual reality glasses only provide light from the display chip, external light does not irradiate the pupil, so the area where the pupil is located is relatively dark and the pupil diameter is generally greater than 4 mm. When the displayed picture is bright and the pupil contracts, the embodiments of the present disclosure can adjust the overall display brightness of the display chip so that the pupil is not constricted by excessive brightness and no part of the image goes missing.
  • FIG. 10 is a schematic diagram of an optical path of a virtual reality glasses provided by an embodiment of the present disclosure.
  • the image displayed by the display chip 10 includes line segments a1-b1 and line segments c1-d1.
  • the lens array 20 includes a lens A, a lens B and a lens C.
  • the pupil 30 of the human eye changes its size with the external light brightness: when the brightness is high, the diameter of the pupil 30 becomes smaller; when the brightness is low, the diameter of the pupil 30 becomes larger.
  • the minimum value of the diameter of the human pupil 30 is denoted as emin, and the maximum value of the diameter of the human pupil 30 is denoted as emax.
  • the imaging object distance is u, that is, the distance between the display chip 10 and the lens array 20 is u.
  • the distance between the first lens 21 (that is, the lens array 20 ) and the pupil 30 of the human eye is L.
  • f is the focal length of the lens, that is, the focal length of the first lens 21 or the second lens 22.
  • the distance between any two of lens A, lens B and lens C is 2 mm.
  • each pixel unit on the display chip 10 can be observed by the pupil 30 of the human eye through only one imaging lens, which constrains the divergence angle of the light from the pixel units on the display chip 10.
  • the light beam emitted from point b1 through lens B can enter the pupil 30 of the human eye, and the light beam emitted from point c1 through lens C can also enter the pupil 30. Since the virtual image formed by the display chip 10 through the lens array 20 is far away, point c1 and point b1 can be considered to be displayed at approximately the same point on the imaging plane. When the displayed image is dark and the diameter of the pupil 30 becomes larger, the light beam from point b1 exiting through lens C still does not enter the pupil 30, thereby avoiding display crosstalk.
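As a side note, the geometry above (object distance u slightly inside the focal length f, producing a distant virtual image) follows the standard thin-lens relation. The numbers f = 10 mm and u = 9.8 mm below are illustrative assumptions, not values from the disclosure:

```python
def virtual_image_distance(u, f):
    """Thin-lens equation 1/u + 1/v = 1/f; for u < f the result is negative (virtual image)."""
    return 1.0 / (1.0 / f - 1.0 / u)

f_mm, u_mm = 10.0, 9.8                     # assumed focal length and object distance
v_mm = virtual_image_distance(u_mm, f_mm)  # negative: image on the object side
print(f"virtual image at {abs(v_mm):.0f} mm behind the lens")  # ~490 mm
print(f"lateral magnification ~ {abs(v_mm) / u_mm:.0f}x")      # ~50x
# As u approaches f, the virtual image recedes toward infinity, which is why
# the text treats the object distance as "close to the focal point".
```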
  • the focal length f of the lens satisfies:
  • D is the entrance pupil diameter of the lens
  • Δ is the tracking error of the pupil position of the human eye.
  • FIG. 11 is a side view of another lens array provided by an embodiment of the present disclosure.
  • the first lens 21 and the second lens 22 have different focal lengths.
  • the first lens 21 and the second lens 22 are installed on the same plane.
  • In one embodiment, the first lens 21 and the second lens 22 have different focal lengths and are installed on the same plane; the light emitted by the display chip 10 forms two different display layers after passing through the first lenses 21 and the second lenses 22 respectively.
  • In another embodiment, the first lens 21 and the second lens 22 have different focal lengths and are installed on different planes; the light emitted by the display chip 10 likewise forms two different display layers after passing through the first lenses 21 and the second lenses 22 respectively.
  • In the above embodiments, different display layers are formed by disposing lenses with the same focal length on different planes, or by disposing lenses with different focal lengths on the same plane; this is a static arrangement. In some subsequent embodiments, a dynamic arrangement is used: by changing the distance between the display chip 10 and the lens array 20, the original single display layer can be displayed as multiple display layers at different time points, or the original two display layers can be displayed as multiple display layers at different time points.
  • Figure 12 is a schematic diagram of the first state of another virtual reality glasses provided by an embodiment of the present disclosure
  • Figure 13 is a schematic diagram of the second state of the virtual reality glasses shown in Figure 12, referring to Figures 12-13
  • the virtual reality glasses also include a drive module
  • the driving module is configured to drive the display chip 10 and/or the lens array 20 to move along the axial direction L1
  • the axis L1 is perpendicular to the plane where the display chip 10 is located
  • the vertical axis direction L2 is parallel to the plane where the display chip 10 is located.
  • the axial direction L1 and the vertical axis direction L2 are perpendicular to each other.
  • FIG. 14 is a schematic diagram of another display effect provided by an embodiment of the present disclosure.
  • the drive module drives the lens array 20 to move as an example.
  • FIG. 12 in the first state of the virtual reality glasses, there is a first distance between the lens array 20 and the display chip 10 .
  • the driving module drives the lens array 20 to move along the axis L1 .
  • the virtual reality glasses are in the second state, and there is a second distance between the lens array 20 and the display chip 10 .
  • the dotted line box in FIG. 13 shows the position of the lens array 20 before moving along the axis L1.
  • the first distance is not equal to the second distance.
  • the original single display layer can be displayed as two display layers at different time points; similarly, the original two display layers can be displayed as four display layers at different time points.
  • when the lens array 20 satisfies "lenses with the same focal length on different planes, or lenses with different focal lengths on the same plane", changing the distance between the display chip 10 and the lens array 20 can form the four display layers shown in FIG. 14, namely a first display layer 31, a second display layer 32, a third display layer 33 and a fourth display layer 34.
  • the second display layer 32 is located between the first display layer 31 and the third display layer 33
  • the third display layer 33 is located between the second display layer 32 and the fourth display layer 34 .
  • In some embodiments, while the lens array 20 vibrates axially, it also vibrates along the vertical axis.
  • the vertical axis vibration is vibration along the vertical axis direction L2. Due to vertical axis vibration, after a pixel unit on the display chip 10 is imaged by a lens with the same polarization state, its imaging point will also vibrate and scan to form a pixel line array on the image plane.
  • the vertical axis vibration and the axial vibration are superposed and vibrated, and one pixel unit on the display chip 10 can be imaged as two pixel line arrays on two surfaces through the lens.
  • the axial vibration is the vibration along the axial direction L1.
  • the vibration direction of the lens array 20 forms a certain angle with the arrangement direction of the pixel units, that is, the vibration direction of the lens array 20 intersects the first direction; therefore the scanning lines formed by the imaging of a pixel unit are two oblique lines.
  • the purpose of the vibration of the lens array 20 is to form two imaging display surfaces through the axial vibration, and to form, on the two display layers, a display effect with far higher resolution than that of the display chip through the vertical-axis vibration. In short, the vertical-axis vibration serves to improve the display resolution.
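The resolution gain from vertical-axis scanning can be sketched as follows; the pixel pitch, sweep amplitude and strobe count are illustrative assumptions, not figures from the disclosure:

```python
pitch = 5.0    # assumed pixel pitch on the image plane, arbitrary units
phases = 4     # assumed number of strobe instants per vertical-axis sweep

# While the lens sweeps the image by one pixel pitch, strobing the display at
# `phases` instants makes a single physical pixel render `phases` distinct
# sub-positions along the scan line.
positions = [i * pitch / phases for i in range(phases)]
print(positions)  # 4 sub-pixel positions inside one original pixel pitch
```

With an assumed four-phase strobe, one pixel row yields four addressable line positions, i.e. a fourfold increase in effective line density on each display layer.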
  • In the case where the number of display layers is not increased by axial vibration, this scheme can be applied together with the static ways of adding display layers, namely "the first lens 21 and the second lens 22 have the same focal length and are arranged on different planes" or "the first lens 21 and the second lens 22 have different focal lengths and are arranged on the same plane". Alternatively, it is suitable for scenarios where the resolution is increased without increasing the number of display layers.
  • the vertical axis direction L2 is parallel to the second direction.
  • the first direction forms a small angle with the third direction, and the second direction forms a small angle with the fourth direction.
  • the vertical axis direction L2 is parallel to the column direction of the pixel units in the display chip 10 .
  • the embodiment of the present disclosure further introduces the imaging manner of the virtual reality glasses.
  • the virtual reality glasses can form at least two display layers, and display pictures on the at least two display layers.
  • the display resolution of each display layer is close to or reaches retinal resolution, which increases the display depth of field and solves the visual fatigue caused by convergence and focus conflicts.
  • the virtual image includes at least two display layers arranged in sequence.
  • the virtual reality glasses also include a sensor and a controller; the sensor is configured to detect the position of the lens array 20 and feed it back to the controller in real time, and the controller is configured to control the time points at which the display chip 10 is lit according to the position of the lens array 20, so that the human eye can see the picture displayed by each display layer in its entirety.
  • the virtual image includes a first display layer 31 and a second display layer 32 .
  • the virtual reality glasses form the first display layer 31 , and the controller lights up the pixel units in the display chip 10 for realizing the first display layer 31 .
  • the virtual reality glasses form the second display layer 32 , and the controller lights up the pixel units in the display chip 10 for realizing the second display layer 32 .
  • the virtual image includes a first display layer 31 , a second display layer 32 , a third display layer 33 and a fourth display layer 34 .
  • the distance between the first display layer 31 and the pupil 30 of the human eye is P1
  • the distance between the second display layer 32 and the pupil 30 of the human eye is P2
  • the distance between the third display layer 33 and the pupil 30 of the human eye is P3
  • the distance between the fourth display layer 34 and the pupil 30 of the human eye is P4.
  • the value of P1 is determined according to the visual distance of the human eye
  • the value of P4 is determined according to the comfort of viewing distant images
  • the values of P2 and P3 are determined according to each display layer and the vibration distance of the lens.
  • the distance between the fourth display layer 34 and the lens array 20 is relatively long, and its foreground depth is close to infinity.
  • For example, P1 = 0.25 m, P2 = 0.36 m, P3 = 0.69 m, and P4 = 4 m.
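A quick check of these example distances (my own arithmetic, not from the disclosure) shows the four layers are spaced roughly evenly in diopters, which is the usual way multifocal displays distribute depth:

```python
layer_distances_m = {"P1": 0.25, "P2": 0.36, "P3": 0.69, "P4": 4.0}

# Convert each viewing distance to optical power (diopters = 1 / meters).
layer_diopters = {k: round(1.0 / d, 2) for k, d in layer_distances_m.items()}
print(layer_diopters)  # {'P1': 4.0, 'P2': 2.78, 'P3': 1.45, 'P4': 0.25}
# Adjacent layers are ~1.2-1.3 D apart, so each accommodation step is similar,
# and P4 at 0.25 D is close enough to 0 D to stand in for "near infinity".
```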
  • the appropriate display area of the display chip 10 is selected according to the relative position of the lens array 20 and the pupil 30 of the human eye, so that after each area of the display chip 10 is imaged through its corresponding lens, the images can be completely spliced into one large image plane (that is, a display layer). When the pupil 30 views through the lens array 20, since each lens is small and closer than the distance of distinct vision, the pupil 30 can only see the images formed by the areas of the display chip 10 through the lenses and cannot see the gaps between the lenses.
  • the virtual reality glasses can form at least two display layers, and at least two of the display layers can be visually fused, and the picture is displayed on the visually fused virtual display layer.
  • FIG. 15 is a schematic diagram of another display effect provided by an embodiment of the present disclosure.
  • the virtual image includes at least two display layers arranged in sequence.
  • the virtual reality glasses also include a human eye tracking device and a controller.
  • the human eye tracking device is configured to track and determine the position of the pupil 30 of the human eye, so that clear images can be observed within any eye movement range.
  • the controller adjusts the display brightness of each display layer by adjusting the brightness of the pixel units, so that the image formed by the visual fusion of multiple display layers appears at a preset distance; the preset distance may be a distance that meets the visual comfort characteristics of the human eye.
  • the eye movement range refers to the movement range of the pupil of the human eye, and the pupil of the human eye can rotate in various directions such as up, down, left, and right.
  • the virtual reality glasses provided by the present disclosure enable human eyes to have a large eye movement range (ie, eye movement distance), so that clear images can be observed within any eye movement range.
  • the virtual image includes a first display layer 31 and a second display layer 32 .
  • the human vision will fuse the two layers of display pictures into one layer (the first virtual display layer 351).
  • the first virtual display layer 351 is located between the first display layer 31 and the second display layer 32; its position is determined by the brightness of the pictures displayed on the first display layer 31 and the second display layer 32, and the first virtual display layer 351 is closer to the brighter display layer.
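The brightness-dependent fusion position can be illustrated with a simple luminance-weighted model. Weighting the layers' optical powers by relative brightness is a common approximation for depth-fused displays; it is an assumption here, not the patent's stated formula:

```python
def fused_depth_m(d_near, d_far, lum_near, lum_far):
    """Perceived depth of two fused layers: luminance-weighted mean of optical powers."""
    p_near, p_far = 1.0 / d_near, 1.0 / d_far   # layer powers in diopters
    w = lum_near / (lum_near + lum_far)         # relative brightness of the near layer
    return 1.0 / (w * p_near + (1.0 - w) * p_far)

# Equal brightness: the fused image sits midway (in diopters) between 0.25 m and 0.36 m.
print(f"{fused_depth_m(0.25, 0.36, 1.0, 1.0):.3f} m")
# A brighter near layer pulls the fused image toward the near layer, as stated above.
assert fused_depth_m(0.25, 0.36, 3.0, 1.0) < fused_depth_m(0.25, 0.36, 1.0, 1.0)
```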
  • the virtual reality glasses further include a human eye tracking device and a controller, and the human eye tracking device is configured to track and determine the position of the pupil of the human eye.
  • the controller projects the image toward the tracked position of the pupil of the human eye, giving the virtual reality glasses a dynamic exit pupil, so that clear images can be observed over any eye movement range. That is, by dynamically tracking the exit pupil, a large eye movement range is supported even though the divergence angle of a pixel unit's beam after passing through the lens is small.
  • FIG. 16 is a schematic diagram of another display effect provided by an embodiment of the present disclosure.
  • the virtual image includes a first display layer 31 , a second display layer 32 , a third display layer 33 and a fourth display layer 34 .
  • according to the position of the pupil 30 determined by the human eye tracking device, the luminous brightness of the pixel units in the display chip 10 is determined for the moment when the lens array 20 vibrates to a specific position.
  • the visual fusion picture is displayed to a preset distance.
  • the preset distance may be a distance that meets the visual comfort characteristics of the human eye; the human eye can then comfortably see a clear picture anywhere from the near viewing distance to infinity.
  • Fig. 17 is a schematic diagram of the rotation of the lens provided by the embodiment of the present disclosure.
  • the lens array also includes a lens array support frame 23, and each lens (a first lens 21 or a second lens 22) is rotatably mounted on the lens array support frame 23.
  • the driving module includes a telescopic device 51 configured to drive the lens to rotate, such that a single lens independently performs a deflection swing of varying angle relative to its optical axis. The swing range can be less than ±10 degrees, and the swing plane is parallel to the line connecting two adjacent lenses of opposite polarization. Optically, this is equivalent to a translational vibration along the vertical axis; the embodiment of the present disclosure thus realizes the swing of each lens, and hence of the lens array, along the vertical-axis direction.
  • the telescopic device 51 can be an electrically driven device such as a piezoelectric ceramic or a shape-memory wire, and it drives the lens to rotate slightly. The image formed by the display chip through the lens then also rotates, and the displayed pixels on the image plane are displaced. Because the rotation angle is small and the lens has a certain depth of field, the original image plane and the rotated image plane can still be considered to lie in the same plane, with only the vertical positions of the pixel units changing.
  • one retractable device 51 may be provided corresponding to multiple lenses; in another embodiment, one retractable device 51 may be provided corresponding to one lens.
  • FIG. 18 is a schematic diagram of the optical path of another virtual reality glasses provided by the embodiment of the present disclosure.
  • only the e_a area on the display chip 10 is imaged through lens a and seen by the human eye pupil 30, and the e_b area is imaged through lens b and seen by the pupil 30. Because the areas of e_a and e_b do not overlap when L > u and e is small, pixel light in area e_b is not imaged to the pupil 30 through lens a, nor is light from e_a imaged to the pupil 30 through lens b. However, when the ambient light is dim, the pupil 30 dilates (its diameter can reach 8 mm), the areas of e_a and e_b grow accordingly, and there is then a risk that they overlap.
  • Figure 19 is a schematic diagram of the effective area of the display chip viewed by the human eye through a single lens under different brightness environments.
  • the k1 area is the effective imaging display area on the display chip 10 corresponding to the minimum pupil diameter emin, and the k2 area is the effective display area on the display chip 10 corresponding to the maximum pupil diameter emax. To keep the pixel units of each lens displaying independently without affecting the imaging of adjacent lenses, only the k1 area may be displayed; the region between k1 and k2 then wastes pixel units and lowers the pixel-unit utilization of the display chip 10.
  • the first lens 21 and the second lens 22 have the same focal length and are installed on the same plane. Since adjacent pixel units have different polarization states, adjacent lenses also have different polarization states. Only when the polarization state of a pixel unit matches that of a lens can the light beam emitted by that pixel unit pass through the lens and be imaged. The pixel units observed by the pupil through lenses of different polarization states therefore do not interfere with each other, and the overlapping display areas and imaging crosstalk described above do not occur.
  • the display area of the display chip can therefore be expanded to the k3 area shown in the figure; the refresh rate of the display chip 10 and the light-emitting area of its pixel units do not change, while the display resolution is improved.
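The polarization gating described above — a pixel's light passes only through a lens of matching polarization state — can be sketched with Malus's law, I = I0·cos²θ, for the linear-polarization case. This is a simplified illustration; the document also allows circular polarization, which this model does not cover, and all names here are illustrative:

```python
import math

def transmitted_intensity(i0: float, pixel_angle_deg: float, lens_angle_deg: float) -> float:
    """Malus's law: intensity transmitted through an analyzer whose axis
    is at the given relative angle to the incident linear polarization."""
    theta = math.radians(lens_angle_deg - pixel_angle_deg)
    return i0 * math.cos(theta) ** 2

# Matching polarization states (pixel row and lens aligned): full transmission.
same = transmitted_intensity(1.0, 0.0, 0.0)
# Orthogonal states (adjacent row / adjacent lens): light is cut off.
cross = transmitted_intensity(1.0, 0.0, 90.0)
```

The two extremes correspond to the "passes completely" / "does not pass at all" behaviour the disclosure relies on to keep adjacent lenses' images from interfering.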
  • FIG. 20 is a schematic structural diagram of another virtual reality glasses provided by an embodiment of the present disclosure.
  • the virtual reality glasses include multiple display modules, and the display modules include display chips and lens arrays arranged in parallel.
  • the multiple display modules are respectively a first module 1001 , a second module 1002 and a third module 1003 .
  • the display module includes a display chip 10 and a plurality of lens arrays 20 .
  • Multiple display modules are spliced, that is, multiple display chips 10 are spliced, and multiple lens arrays 20 are spliced.
  • a non-zero angle is formed between two adjacent display chips. Since the display chip and the lens array in one display module are arranged in parallel, a non-zero angle is likewise formed between two adjacent lens arrays. Splicing the display modules thus enlarges the display field of view.
  • the included angle between the first module 1001 and the second module 1002 is θ, with 20°≤θ≤30°. The included angle between the second module 1002 and the third module 1003 may also be θ; that is, the angle between the first module 1001 and the second module 1002 equals the angle between the second module 1002 and the third module 1003.
  • FIG. 21 is a schematic structural diagram of another virtual reality glasses provided by an embodiment of the present disclosure.
  • the virtual reality glasses include a fixing and pupil detection device 40 and a driving and detection device 50. The fixing and pupil detection device 40 includes an eye tracking device, and the driving and detection device 50 includes a driving module, a sensor and a controller.
  • the virtual reality glasses also include a mechanical adjustment device, which can be arranged in the fixing and pupil detection device 40, in the driving and detection device 50, or separately.
  • the mechanical adjustment device is configured to adjust the initial positions of the lens array 20 and/or the display chip 10 for use by eyes of different visual acuity.
  • as an example, the embodiment of the present disclosure also provides design parameters for the virtual reality glasses: the lens pitch t is 2 mm, the entrance pupil diameter D of the lens is 1.5 mm, L is 17 mm, and f = 10 mm. P1 and P2 are two display layers formed by axial vibration imaging of the lenses on one array surface, and P3 and P4 are two display layers formed by axial vibration imaging of the lenses on the other array surface. The distance between the two lens array surfaces is 243 µm, and the axial vibration displacement of the lenses is A1 = 117 µm. On the vertical-axis component of the vibration, because lenses of different polarization states alternate, the vibration must reach at least the position of the adjacent lens; the period span is therefore larger, so the effective vertical-axis amplitude of the lens array is A2 > 2 mm.
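The long-to-short axis ratio of the two-dimensional vibration quoted in the description, k = A2/A1 > 2000/117 ≈ 17, is simple arithmetic on the amplitudes above; a quick check:

```python
# Vertical effective amplitude A2 > 2 mm and axial displacement
# A1 = 117 um, both taken from the design example above.
A2_um = 2000.0   # lower bound on A2, in micrometres
A1_um = 117.0
k = A2_um / A1_um
# k > 17, so the axial component of the two-dimensional vibration is
# negligible compared with the vertical-axis component, as the
# description concludes.
```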
  • the embodiment of the present disclosure also provides design parameters of another kind of virtual reality glasses.
  • since the above virtual reality glasses can adjust the distance between the display chip 10 and the lens array 20, when the user's eyes are myopic, the distance between the lens array 20 and the display chip 10 can be adjusted so that the imaging surface is closer to the eyes than for normal vision; a myopic viewer can then watch the same display content as a normal-sighted viewer simply by wearing the VR glasses.
  • for example, when the eye has 500 degrees of myopia, the picture of the first display layer is adjusted forward. Compared with a non-myopic wearer, the distance between the lens array 20 and the display chip 10 is adjusted by 0.44 mm, and this adjustment can be realized by the mechanical adjustment device.
  • the lens pitch t is 2 mm, the entrance pupil diameter D of the lens is 1.5 mm, L is 17 mm, and f = 10 mm; calculated according to the imaging formula, when the eye has 500 degrees of myopia, the first display layer 31, the second display layer 32, the third display layer 33 and the fourth display layer 34 can be set as shown in the following table:
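As a rough plausibility check of the scale of this adjustment, the thin-lens relation 1/u − 1/v = 1/f (u: chip-to-lens distance, v: virtual-image distance, both taken as magnitudes on the same side of the lens) shows that pulling the virtual image in from infinity toward the far point of a 500-degree (−5 D) myopic eye needs only a sub-millimetre change in u when f = 10 mm. This sketch ignores eye relief and the exact layer assignment, so it does not reproduce the document's 0.44 mm figure exactly; the numbers are mine:

```python
def chip_distance_mm(f_mm: float, v_mm: float) -> float:
    """Object (chip) distance u that puts a virtual image at distance v
    for a thin lens of focal length f, from 1/u - 1/v = 1/f."""
    return f_mm * v_mm / (f_mm + v_mm)

u_far = 10.0                            # image at infinity: u equals f
u_near = chip_distance_mm(10.0, 200.0)  # far point of a -5 D eye, ~200 mm
delta = u_far - u_near                  # sub-millimetre, ~0.48 mm here
```

The resulting delta is of the same order as the 0.44 mm cited above, which is consistent with realizing the correction through a small mechanical shift.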
  • FIG. 22 is a schematic diagram of a hardware structure of a controller 200 provided by an embodiment. As shown in FIG. 22 , the electronic device includes: one or more processors 210 and a memory 220 . A processor 210 is taken as an example in FIG. 22 .
  • the electronic device may further include: an input device 230 and an output device 240 .
  • the processor 210, the memory 220, the input device 230 and the output device 240 in the electronic device may be connected through a bus or in other ways; FIG. 22 takes connection through a bus as an example.
  • the memory 220, as a computer-readable storage medium, can be configured to store software programs, computer-executable programs and modules.
  • the processor 210 executes various functional applications and data processing by running software programs, instructions and modules stored in the memory 220, so as to implement any method in the above-mentioned embodiments.
  • the memory 220 may include a program storage area and a data storage area, wherein the program storage area may store an operating system and an application program required by at least one function; the data storage area may store data created according to the use of the electronic device, and the like.
  • the memory may include a volatile memory such as a random access memory (Random Access Memory, RAM), and may also include a non-volatile memory, such as at least one magnetic disk storage device, flash memory device or other non-transitory solid-state storage device.
  • Memory 220 may be a non-transitory computer storage medium or a transitory computer storage medium.
  • the non-transitory computer storage medium includes, for example, at least one magnetic disk storage device, flash memory device, or other non-volatile solid-state storage devices.
  • the memory 220 may optionally include memory located remotely relative to the processor 210, and these remote memories may be connected to the electronic device through a network. Examples of the above-mentioned network may include Internet, enterprise intranet, local area network, mobile communication network and combinations thereof.
  • the input device 230 can be configured to receive input numbers or character information, and generate key signal input related to user settings and function control of the electronic device.
  • the output device 240 may include a display device such as a display screen.
  • This embodiment also provides a computer-readable storage medium storing computer-executable instructions, and the computer-executable instructions are used to execute the above method.
  • all or part of the processes in the methods of the above embodiments may be completed by a computer program instructing relevant hardware; the program may be stored in a non-transitory computer-readable storage medium and, when executed, may include the processes of the above method embodiments, where the non-transitory computer-readable storage medium may be a magnetic disk, an optical disc, a read-only memory (Read-Only Memory, ROM) or RAM, etc.
  • Fig. 23 is a schematic diagram of a partial hardware structure of a virtual reality glasses provided by an embodiment.
  • the sensor 300 , the driving module 400 and the eye tracking device 500 are all electrically connected to the controller 200 .


Abstract

Virtual reality glasses, comprising a display chip (10) and a lens array (20). The display chip (10) is located on the light-incident side of the lens array (20), and the virtual image formed by light emitted from the display chip (10) through the lens array (20) is located on the side of the display chip (10) away from the lens array (20). The display chip (10) comprises a plurality of pixel units (11, 12) arranged in rows and columns along a first direction and a second direction; the pixel units arranged along the first direction have the same polarization state, and along the second direction the polarization states of two adjacent rows of pixel units are orthogonal; the first direction is perpendicular to the second direction. The lens array (20) comprises a plurality of lenses arranged in rows and columns along a third direction and a fourth direction, the lenses comprising first lenses (21) and second lenses (22); along the third direction, one second lens (22) is provided between two adjacent first lenses (21), and along the fourth direction, one second lens (22) is provided between two adjacent first lenses (21); the polarization states of the first lenses (21) and the second lenses (22) are orthogonal.

Description

Virtual reality glasses
This disclosure claims priority to Chinese patent application No. 202210160030.3, filed with the Chinese Patent Office on February 22, 2022, the entire contents of which are incorporated herein by reference.
Technical Field
The present disclosure relates to the technical field of VR, for example, to virtual reality glasses.
Background
Virtual reality (VR) uses computer graphics technology, sensor technology, display technology, optical devices and the like to create a virtual environment, giving the user an immersive sense of presence. There are currently many VR and AR device manufacturers. One solution uses the principle of optical reflective projection: a micro-projector projects onto a reflective screen, and the light is refracted through a lens to the human eye, forming an enlarged virtual screen in front of the eyes for displaying images and data. Another solution is a head-mounted display used mainly for games, which uses binocular parallax to enhance the sense of depth and improve the immersion of the game.
In the above VR glasses solutions, magnifying a small screen into a large virtual screen magnifies the pixels and loses resolution; moreover, because of the large difference between the eye's focusing distance and the distance perceived by the brain, such VR display devices easily cause symptoms such as fatigue and dizziness. To increase the resolution of VR glasses, the display pixels need to be made smaller; but if the pixels are too small, the diffraction limit of the optical system is reached, an ideal imaging point for a pixel cannot be obtained, and retinal resolution cannot be approached or achieved.
Summary
Embodiments of the present disclosure provide virtual reality glasses that increase the display resolution and solve the visual fatigue and dizziness caused by the vergence-accommodation conflict.
An embodiment of the present disclosure provides virtual reality glasses comprising a display chip and a lens array, wherein the display chip is located on the light-incident side of the lens array, and the virtual image formed by light emitted from the display chip through the lens array is located on the side of the display chip away from the lens array;
the display chip comprises a plurality of pixel units arranged in rows and columns along a first direction and a second direction; the pixel units arranged along the first direction have the same polarization state, and along the second direction the polarization states of two adjacent rows of pixel units are orthogonal; the first direction is perpendicular to the second direction;
the lens array comprises a plurality of lenses arranged in rows and columns along a third direction and a fourth direction, the lenses comprising first lenses and second lenses; along the third direction, one second lens is provided between two adjacent first lenses, and along the fourth direction, one second lens is provided between two adjacent first lenses; the polarization states of the first lens and the second lens are orthogonal, and the third direction is perpendicular to the fourth direction.
Brief Description of the Drawings
Fig. 1 is a schematic structural diagram of virtual reality glasses provided by an embodiment of the present disclosure;
Fig. 2 is a schematic diagram of the relative relationship between a display chip and a lens array provided by an embodiment of the present disclosure;
Fig. 3 is a front view of a display chip provided by an embodiment of the present disclosure;
Fig. 4 is a front view of a lens array provided by an embodiment of the present disclosure;
Fig. 5 is a perspective view of another lens array provided by an embodiment of the present disclosure;
Fig. 6 is a side view of the lens array shown in Fig. 5;
Fig. 7 is a schematic diagram of a display effect provided by an embodiment of the present disclosure;
Fig. 8 is a schematic diagram of part of a display chip and a lens array;
Fig. 9 is a schematic diagram of the light spots projected onto the human eye pupil when the pupil diameter is at its minimum;
Fig. 10 is a schematic optical-path diagram of virtual reality glasses provided by an embodiment of the present disclosure;
Fig. 11 is a side view of another lens array provided by an embodiment of the present disclosure;
Fig. 12 is a schematic diagram of a first state of other virtual reality glasses provided by an embodiment of the present disclosure;
Fig. 13 is a schematic diagram of a second state of the virtual reality glasses shown in Fig. 12;
Fig. 14 is a schematic diagram of another display effect provided by an embodiment of the present disclosure;
Fig. 15 is a schematic diagram of another display effect provided by an embodiment of the present disclosure;
Fig. 16 is a schematic diagram of another display effect provided by an embodiment of the present disclosure;
Fig. 17 is a schematic diagram of lens rotation provided by an embodiment of the present disclosure;
Fig. 18 is a schematic optical-path diagram of other virtual reality glasses provided by an embodiment of the present disclosure;
Fig. 19 is a schematic diagram of the effective area of the display chip viewed by the human eye through a single lens under different brightness environments;
Fig. 20 is a schematic structural diagram of other virtual reality glasses provided by an embodiment of the present disclosure;
Fig. 21 is a schematic structural diagram of other virtual reality glasses provided by an embodiment of the present disclosure;
Fig. 22 is a schematic diagram of a hardware structure of a controller provided by an embodiment;
Fig. 23 is a schematic diagram of a partial hardware structure of virtual reality glasses provided by an embodiment.
Detailed Description
The present disclosure is further described in detail below with reference to the drawings and embodiments. It should be understood that the specific embodiments described here are only used to explain the present disclosure. It should also be noted that, for ease of description, the drawings show only the parts related to the present disclosure rather than the entire structure.
Fig. 1 is a schematic structural diagram of virtual reality glasses provided by an embodiment of the present disclosure, Fig. 2 is a schematic diagram of the relative relationship between a display chip and a lens array, Fig. 3 is a front view of a display chip, and Fig. 4 is a front view of a lens array. Referring to Figs. 2-4, the virtual reality glasses include a display chip 10 and a lens array 20; the display chip 10 is located on the light-incident side of the lens array 20, and the virtual image formed by light emitted from the display chip 10 through the lens array 20 is located on the side of the display chip 10 away from the lens array 20.
The display chip 10 includes a plurality of pixel units arranged in rows and columns along a first direction and a second direction. The pixel units arranged along the first direction have the same polarization state, i.e., the pixel units in the same row have the same polarization state. Along the second direction, the polarization states of two adjacent rows of pixel units are orthogonal. The first direction is perpendicular to the second direction. For example, the pixel units include first pixel units 11 and second pixel units 12, and the polarization state of the first pixel units 11 is orthogonal to that of the second pixel units 12.
For example, the mutually orthogonal polarization states can be linear or circular polarization: the first pixel unit 11 vertically linearly polarized and the second pixel unit 12 horizontally linearly polarized, or the first pixel unit 11 left-hand circularly polarized and the second pixel unit 12 right-hand circularly polarized. Of course, the mutually orthogonal polarization states can be linear or circular polarization, or other forms of polarization.
The lens array 20 includes a plurality of lenses arranged in rows and columns along a third direction and a fourth direction, including first lenses 21 and second lenses 22. Along the third direction, one second lens 22 is provided between two adjacent first lenses 21 and one first lens 21 between two adjacent second lenses 22, so the first lenses 21 and second lenses 22 alternate; the same holds along the fourth direction. The polarization states of the first lens 21 and the second lens 22 are orthogonal. In one embodiment, the first direction is parallel to the third direction and the second direction is parallel to the fourth direction; in another embodiment, the first direction crosses the third direction and the second direction crosses the fourth direction.
For example, the first lens 21 is vertically linearly polarized and the second lens 22 horizontally linearly polarized; or the first lens 21 is left-hand circularly polarized and the second lens 22 right-hand circularly polarized.
It can be understood that when the polarization state of the light emitted by a pixel unit is the same as that of a lens (i.e., the polarization states are parallel), the light passes completely; when it is orthogonal to that of the lens, the light does not pass at all (i.e., the light is cut off).
In one embodiment, the polarization state of the first lens 21 is the same as that of the first pixel unit 11 and orthogonal to that of the second pixel unit 12, while the second lens 22 is orthogonal to the first pixel unit 11 and the same as the second pixel unit 12. The first lens 21 then transmits the light emitted by the first pixel unit 11 and blocks that of the second pixel unit 12, and the second lens 22 transmits the light of the second pixel unit 12 and blocks that of the first pixel unit 11.
The embodiment of the present disclosure provides virtual reality glasses including a display chip 10 and a lens array 20, in which the display chip 10 is provided with pixel units of orthogonal polarization states, and the lens array 20 is provided with first lenses 21 and second lenses 22 of orthogonal polarization states, alternating along the third and fourth directions. The pixel units observed by the eye through lenses of different polarization states thus do not interfere with each other. Because adjacent lenses have different polarization states, the display area of the display chip 10 can be expanded while its refresh rate and the light-emitting area of its pixel units remain unchanged; the polarization arrangement enlarges the effective display-chip area corresponding to a single lens, improving the display resolution, and also solves the visual fatigue and dizziness caused by the vergence-accommodation conflict.
Fig. 5 is a perspective view of another lens array, Fig. 6 is a side view of the lens array shown in Fig. 5, and Fig. 7 is a schematic diagram of a display effect. Referring to Figs. 5-7, the first lens 21 and the second lens 22 have the same focal length but are installed on different planes, so the virtual images formed by the first lens 21 and the second lens 22 lie at different plane positions.
For example, referring to Figs. 1-7, the virtual image formed by the light of the display chip 10 after passing through the second lens 22 lies in a first display layer 31, and the virtual image formed after passing through the first lens 21 lies in a second display layer 32; the first display layer 31 is located between the second display layer 32 and the display chip 10.
According to the Rayleigh criterion, the angular resolution of an optical system is sinβ = 1.22λ/D, where β is the angular resolution, λ the wavelength and D the entrance pupil diameter; when D = 1.5 mm and λ = 500 nm, β is approximately 0.406 mRad. The depth of field of the lens is then relatively small, so to achieve good depth vision the virtual reality glasses need multi-layer display for more realistic stereoscopic vision.
In practice, human eyesight varies with the environment, and resolution usually does not reach 1' (1/60 degree). In one example of the present disclosure, the angular resolution of typical vision is assumed to be 1.5' (1/40 degree, about 0.436 mrad). When forming a virtual image, since the image distance is large and the object distance is close to the focal point, the angular resolution β of pixel-unit imaging satisfies:
β ~ pitch/f. (1)
where pitch is the size of the pixel unit, f is the focal length of the lens, and the symbol "~" denotes approximate equality.
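A quick numeric check of the two relations above, using the document's values; the derived pixel pitch is an inference from β ~ pitch/f, not a figure the document itself states:

```python
import math

wavelength_m = 500e-9   # lambda = 500 nm, from the Rayleigh example
D_m = 1.5e-3            # entrance pupil diameter D = 1.5 mm
beta_diff = 1.22 * wavelength_m / D_m   # Rayleigh limit (small-angle form)
# beta_diff ~ 0.407 mrad, matching the ~0.406 mRad quoted above.

f_m = 10e-3             # focal length f = 10 mm from the design example
beta_eye = 0.436e-3     # assumed 1.5' visual acuity, in radians
pitch_m = beta_eye * f_m  # pitch ~ beta * f, a few micrometres
```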
Fig. 8 is a schematic diagram of part of the display chip and the lens array. Referring to Fig. 8, the dashed square array is the area that the display chip 10 needs to display under the corresponding lens, and the solid circular array is the lens array. To ensure that the images formed by adjacent lenses are spliced completely, the region between the solid squares and the dashed squares in Fig. 8 is set as a fusion region: the fusion regions corresponding to adjacent lenses display the same image and overlap after being projected onto the human eye pupil, realizing the fusion of the images formed by adjacent lenses. The distance between the centers of adjacent lenses is t.
Fig. 9 is a schematic diagram of the light spots projected onto the human eye pupil when the pupil diameter is at its minimum. Referring to Fig. 9, the distance between the centers of adjacent lenses is t, adjacent lenses have different polarization states, and the following is satisfied:
Figure PCTCN2022118671-appb-000001
where emin is the minimum diameter of the human eye pupil, D is the entrance pupil diameter of the lens, and
Figure PCTCN2022118671-appb-000002
is the tracking error of the pupil position. At large distances, the entrance pupil diameter D is equivalent to the size of the light spot that a pixel unit, imaged through a lens, casts on the human eye pupil. For the pupil to view an image at near-infinity with consistent brightness where adjacent lenses splice, the above formula (2) must be satisfied: the pupil diameter must be large enough that the spots emerging from adjacent lenses can enter the pupil. When D = 1.5 mm, t = 2 mm and
Figure PCTCN2022118671-appb-000003
emin ~ 4.5 mm; that is, a pupil diameter greater than 4.5 mm is needed to see a complete, clear image when wearing the virtual reality glasses. A pupil diameter of 4 mm-6 mm is comfortable for the human eye in normal lighting, and only in very bright environments does the pupil contract below 4 mm. In general, since only the display chip of the virtual reality glasses provides light and no outside light reaches the eye, the region around the pupil is rather dark and the pupil diameter is generally larger than 4 mm. When the displayed picture is bright and the pupil contracts, the embodiment of the present disclosure can adjust the overall display brightness of the display chip so that the pupil does not contract because of excessive brightness, and the picture seen by the pupil is not partially missing because the display is too bright.
Fig. 10 is a schematic optical-path diagram of virtual reality glasses provided by an embodiment of the present disclosure. Referring to Fig. 10, the image displayed by the display chip 10 includes line segment a1-b1 and line segment c1-d1. The lens array 20 includes lens A, lens B and lens C. The human eye pupil 30 changes its size with the ambient brightness: when the ambient light is bright, its diameter becomes smaller; when dim, larger. The minimum diameter of the pupil 30 is denoted emin and the maximum emax. The imaging object distance is u, i.e., the spacing between the display chip 10 and the lens array 20 is u. The distance between the first lens 21 (i.e., the lens array 20) and the pupil 30 is L. f is the focal length of the lens; specifically, when the first lens 21 and the second lens 22 have the same focal length, f is the focal length of either. The spacing between any two of lenses A, B and C is 2 mm. A pixel unit on the display chip 10 can be observed by the pupil 30 through only one lens, which guarantees the divergence angle of the pixel units on the display chip 10. For example, the beam from point b1 exiting through lens B can enter the pupil 30, and the beam from point c1 exiting through lens C can also enter the pupil 30. Since the virtual image of the display chip 10 formed through the lens array 20 is far away, points c1 and b1 can be regarded approximately as the same display point on the imaging surface. When the displayed image is dark, the diameter of the pupil 30 becomes larger; the beam from point b1 exiting through lens C will not enter the pupil 30, avoiding display crosstalk.
Optionally, the focal length f of the lens satisfies:
Figure PCTCN2022118671-appb-000004
where D is the entrance pupil diameter of the lens and
Figure PCTCN2022118671-appb-000005
is the tracking error of the pupil position.
For example, when L = 17 mm,
Figure PCTCN2022118671-appb-000006
D = 1.5 mm, emin = 4.5 mm and emax = 8 mm, f < 13.8 mm is obtained. From formula (1) above, the larger the focal length f, the smaller the angular resolution β and the higher the resolution. As an example, a lens with a focal length f of 10 mm and an entrance pupil diameter D of 1.5 mm can meet the display requirements. When L = 17 mm, the distance between the lens array 20 and the pupil 30 is close to the way glasses are normally worn.
Fig. 11 is a side view of another lens array provided by an embodiment of the present disclosure. Referring to Fig. 11, the first lens 21 and the second lens 22 have different focal lengths and are installed on the same plane. In this embodiment, by using first and second lenses 21, 22 of different focal lengths mounted on the same plane, the light emitted by the display chip 10 forms two different display layers after passing through the first lens 21 and the second lens 22 respectively.
In another embodiment, the first lens 21 and the second lens 22 have different focal lengths and are installed on different planes; the light emitted by the display chip 10 likewise forms two different display layers after passing through the first lens 21 and the second lens 22 respectively.
In the above embodiments, different display layers are formed by arranging lenses of the same focal length on different planes, or lenses of different focal lengths on the same plane. This is a static arrangement. In some subsequent embodiments a dynamic approach is used: by changing the distance between the display chip 10 and the lens array 20, what was originally one display layer is displayed at different points in time as multiple display layers, or what were originally two display layers are displayed at different points in time as multiple display layers.
Fig. 12 is a schematic diagram of a first state of other virtual reality glasses provided by an embodiment of the present disclosure, and Fig. 13 is a schematic diagram of a second state of the glasses shown in Fig. 12. Referring to Figs. 12-13, the virtual reality glasses further include a driving module configured to drive the display chip 10 and/or the lens array 20 to move along the axial direction L1, and to drive the display chip 10 and/or the lens array 20 to vibrate (e.g., move) along the vertical-axis direction L2. The axial direction L1 is perpendicular to the plane of the display chip 10, the vertical-axis direction L2 is parallel to that plane, and L1 and L2 are mutually perpendicular.
Fig. 14 is a schematic diagram of another display effect provided by an embodiment of the present disclosure. Referring to Figs. 12-14, take the driving module driving the lens array 20 as an example. As shown in Fig. 12, in the first state of the virtual reality glasses there is a first distance between the lens array 20 and the display chip 10. After the driving module moves the lens array 20 along the axial direction L1, as shown in Fig. 13, the glasses are in the second state with a second distance between the lens array 20 and the display chip 10; the dashed box in Fig. 13 indicates the position of the lens array 20 before the axial movement. The first distance is not equal to the second distance. In this way, by changing the distance between the display chip 10 and the lens array 20, one original display layer can be displayed at different points in time as two display layers, and two original display layers can be displayed at different points in time as four display layers.
For example, when the lens array 20 uses lenses of the same focal length on different planes, or lenses of different focal lengths on the same plane, changing the distance between the display chip 10 and the lens array 20 can form the four display layers shown in Fig. 14: the first display layer 31, the second display layer 32, the third display layer 33 and the fourth display layer 34. The second display layer 32 is located between the first display layer 31 and the third display layer 33, and the third display layer 33 is located between the second display layer 32 and the fourth display layer 34.
In one embodiment, the lens array 20 performs vertical-axis vibration while also moving axially; vertical-axis vibration is vibration along the vertical-axis direction L2. Because of the vertical-axis vibration, after a pixel unit on the display chip 10 is imaged through a lens of the same polarization state, its image point also scans with the vibration, sweeping out a pixel line array on the image surface. With vertical-axis vibration superimposed on axial vibration, one pixel unit on the display chip 10 can be imaged through the lens as two pixel line arrays on two surfaces; axial vibration is vibration along the axial direction L1. In the vertical-axis vibration, the vibration direction of the lens array 20 forms an angle with the arrangement direction of the pixel units, i.e., it crosses the first direction, so the scan lines of pixel imaging are two oblique lines. The purpose of vibrating the lens array 20 is to form two imaging display surfaces through axial vibration and, through vertical-axis vibration, to achieve on the two display layers a display effect far exceeding the resolution of the display chip. In short, the vertical-axis vibration is used to increase the display resolution.
For example, when the included angle between the second direction and the fourth direction is 7° (tan(7°) ~ 1/8), the vertical-axis vibration makes the horizontal resolution of one pixel unit 8 times the original.
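The 8x factor follows from the scan geometry: a scan line tilted by about 7° advances roughly 1/8 of a pixel horizontally per pixel vertically, so one sweep interleaves about 8 horizontal sub-positions. A quick check of tan(7°) ~ 1/8:

```python
import math

tilt = math.tan(math.radians(7.0))  # ~0.1228, i.e. close to 1/8
subpositions = round(1.0 / tilt)    # ~8 horizontal sub-samples per sweep
```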
In another embodiment, there may be only vertical-axis vibration and no axial vibration; that is, only the vertical vibration is used to increase the resolution, and axial vibration is not used to add display layers. This is applicable when display layers are added statically, either by setting the first lens 21 and the second lens 22 to the same focal length on different planes, or by setting them to different focal lengths on the same plane; it is also applicable to scenarios where the resolution should be increased without adding display layers.
Optionally, the vertical-axis direction L2 is parallel to the second direction. The first direction forms a small angle with the third direction, and the second direction forms a small angle with the fourth direction. The vertical-axis direction L2 is parallel to the column direction of the pixel units in the display chip 10.
The embodiment of the present disclosure further introduces the imaging modes of the virtual reality glasses. In one embodiment, the glasses can form at least two display layers and display the picture on them. The display resolution of each display layer approaches or reaches retinal-quality resolution, which enlarges the display depth of field and solves the visual fatigue caused by the vergence-accommodation conflict.
Referring to Fig. 7, the virtual image includes at least two display layers arranged in sequence. The virtual reality glasses further include a sensor and a controller: the sensor is configured to detect the position of the lens array 20 and feed it back to the controller in real time, and the controller is configured to control, according to the position of the lens array 20, the points in time at which the display chip 10 is lit, so that the human eye can see the complete picture displayed by each display layer.
For example, referring to Fig. 7, the virtual image includes a first display layer 31 and a second display layer 32. When the lens array 20 is in the first position, the glasses form the first display layer 31, and the controller lights the pixel units of the display chip 10 used to realize the first display layer 31; when the lens array 20 is in the second position, the glasses form the second display layer 32, and the controller lights the pixel units used to realize the second display layer 32.
For example, referring to Fig. 14, the virtual image includes a first display layer 31, a second display layer 32, a third display layer 33 and a fourth display layer 34, whose distances from the human eye pupil 30 are P1, P2, P3 and P4 respectively. P1 is determined from the eye's distance of distinct vision, P4 from comfortable viewing of distant pictures, and P2 and P3 from each display layer and the lens vibration distance. The fourth display layer 34 is far from the lens array 20, and its front depth of field approaches infinity.
For example, P1 = 0.25 m, P2 = 0.36 m, P3 = 0.69 m, P4 = 4 m.
When the pupil 30 views the image of the display chip 10 through the lens array 20, appropriate display areas of the display chip 10 are selected according to the positional difference between the lens array 20 and the pupil 30, so that after the areas of the display chip 10 are imaged through different lenses, they can be spliced completely into a large image surface on the image plane (i.e., the display layer). Moreover, when the pupil 30 looks through the lens array 20, because the lenses are small and within the eye's distance of distinct vision, the pupil 30 can only see the images formed by the areas of the display chip 10 through the lenses and cannot see the gaps between the lenses.
In another embodiment, the virtual reality glasses can form at least two display layers and cause at least two of them to fuse visually, the picture being displayed on the virtual display layer formed after visual fusion.
Fig. 15 is a schematic diagram of another display effect provided by an embodiment of the present disclosure. Referring to Fig. 15, the virtual image includes at least two display layers arranged in sequence. The virtual reality glasses further include an eye tracking device and a controller: the eye tracking device is configured to track and determine the position of the human eye pupil 30, so that a clear image can be observed within any eye-movement range. By adjusting the brightness of the pixel units, the controller adjusts the display brightness of each presented display layer so that the picture formed by the visual fusion of multiple display layers is displayed at a preset distance, which can be a distance that meets the visual comfort characteristics of the human eye. The eye-movement range refers to the range over which the pupil can move; the pupil can rotate up, down, left and right. The virtual reality glasses provided by the present disclosure give the eye a large eye-movement range (i.e., eye-movement distance), so that a clear image can be observed within any eye-movement range.
For example, referring to Fig. 15, the virtual image includes a first display layer 31 and a second display layer 32. When the pupil 30 sees the same displayed picture with different brightness on the first display layer 31 and the second display layer 32, human vision fuses the two pictures into one layer (a first virtual display layer 351), located somewhere between the first display layer 31 and the second display layer 32; its position is determined by the brightness of the pictures displayed on the two layers, the first virtual display layer 351 lying closer to the brighter display layer.
In a traditional VR glasses optical system, the exit pupil and the imaging distance determine the divergence angle of the light from a pixel; when the image is far away, this divergence angle is very small, which greatly limits the eye-movement range, and when the image exceeds the eye-movement range of the optical system, the displayed picture becomes unclear or partially missing. In the embodiment of the present disclosure, the virtual reality glasses further include an eye tracking device and a controller: the eye tracking device is configured to track and determine the position of the pupil, and the controller projects images to the pupil according to its position, giving the glasses a dynamic exit pupil so that a clear image can be observed within any eye-movement range. That is, by dynamically tracking the exit pupil, a large eye-movement range is supported even though the beam divergence of a pixel unit after the lens is very small.
Fig. 16 is a schematic diagram of another display effect provided by an embodiment of the present disclosure. Referring to Fig. 16, the virtual image includes a first display layer 31, a second display layer 32, a third display layer 33 and a fourth display layer 34. According to the pupil position determined by the eye tracking device, the luminous brightness of the pixel units in the display chip 10 is determined when the lens array 20 vibrates to a specific position; by adjusting the brightness of at least one of the four display layers, at least two of them form a virtual display layer (for example, the second display layer 32 and the third display layer 33 form a second virtual display layer 352). By adjusting the display brightness of adjacent display layers, the embodiment of the present disclosure displays the visually fused picture at a preset distance, which can be a distance that meets the visual comfort characteristics of the human eye; combined with the lens depth of field and the accommodation characteristics of the eye, the human eye can comfortably see a clear picture both at the distance of distinct vision and at infinity.
Fig. 17 is a schematic diagram of lens rotation provided by an embodiment of the present disclosure. Referring to Fig. 17, the lens array further includes a lens array support frame 23, and the lens (the first lens 21 or the second lens 22) is rotatably mounted on the lens array support frame 23. The driving module includes a telescopic device 51 configured to drive the lens to rotate, each lens independently performing a deflecting swing that varies its angle relative to its optical axis; the swing amplitude can be less than ±10 degrees, and the swing plane is parallel to the line connecting two adjacent lenses of opposite polarization. Optically this is equivalent to a translational vibration along the vertical-axis direction, so the embodiment realizes the vertical-axis swing of the lens and thereby of the lens array.
For example, the telescopic device 51 can be an electrically driven actuator such as a piezoelectric ceramic or a shape-memory wire. The telescopic device 51 drives the lens to rotate slightly; the image formed by the display chip through the lens then also rotates, and the displayed pixels on the image surface are displaced. Because the rotation angle is small and the lens has a certain depth of field, the original and rotated image planes can still be regarded as the same plane, with only the vertical positions of the pixel units changing.
Assuming the rotation angle of the lens is α and the distance between the first lens and the human eye pupil is L, the pixel-unit displacement h satisfies h = sin(α)·L.
In one embodiment, one telescopic device 51 can be provided for multiple lenses; in another embodiment, one telescopic device 51 can be provided for each lens.
Fig. 18 is a schematic optical-path diagram of other virtual reality glasses provided by an embodiment of the present disclosure. Referring to Fig. 18, only the e_a area on the display chip 10 is imaged through lens a and seen by the pupil 30, and the e_b area is imaged through lens b and seen by the pupil 30. Because the areas of e_a and e_b do not overlap when L > u and e is small, pixel light in area e_b is not imaged to the pupil 30 through lens a, nor is light from e_a imaged to the pupil 30 through lens b. But when the ambient light is dim, the pupil 30 is large (its diameter can reach 8 mm), the areas of e_a and e_b grow accordingly, and there is then a risk that they overlap.
Fig. 19 is a schematic diagram of the effective area of the display chip viewed by the human eye through a single lens under different brightness environments. Referring to Fig. 19, the k1 area is the effective imaging display area on the display chip 10 corresponding to the minimum pupil diameter emin, and the k2 area is the effective display area corresponding to the maximum pupil diameter emax. To let the pixel units of each lens display independently without affecting the imaging of adjacent lenses, only the k1 area needs to be displayed; the region between k1 and k2 then wastes pixel units and lowers the pixel-unit utilization of the display chip 10.
Optionally, the first lens 21 and the second lens 22 have the same focal length and are installed on the same plane. Since adjacent pixel units have different polarization states, adjacent lenses do too; only when the polarization state of a pixel unit matches that of a lens can its beam pass through the lens and be imaged. The pixel units observed by the pupil through lenses of different polarization states thus do not interfere with each other, and the overlapping display areas or imaging crosstalk described above do not occur. Because adjacent lenses have different polarization states, the display area of the display chip can be expanded to the k3 area shown in the figure; the refresh rate of the display chip 10 and the light-emitting area of its pixel units remain unchanged, and the display resolution is improved.
Fig. 20 is a schematic structural diagram of other virtual reality glasses provided by an embodiment of the present disclosure. Referring to Fig. 20, the virtual reality glasses include multiple display modules, each including a display chip and a lens array arranged in parallel. The display modules are a first module 1001, a second module 1002 and a third module 1003. A display module includes a display chip 10 and a plurality of lens arrays 20. Multiple display modules are spliced, i.e., multiple display chips 10 are spliced and multiple lens arrays 20 are spliced. A non-zero angle is formed between two adjacent display chips; since the display chip and the lens array in one display module are parallel, a non-zero angle is likewise formed between two adjacent lens arrays. Splicing the display modules thus enlarges the display field of view.
For example, the included angle between the first module 1001 and the second module 1002 is θ, with 20° ≤ θ ≤ 30°; the included angle between the second module 1002 and the third module 1003 may also be θ, i.e., the two included angles are equal.
Fig. 21 is a schematic structural diagram of other virtual reality glasses provided by an embodiment of the present disclosure. Referring to Fig. 21, the virtual reality glasses include a fixing and pupil detection device 40 and a driving and detection device 50; the fixing and pupil detection device 40 includes an eye tracking device, and the driving and detection device 50 includes a driving module, a sensor and a controller. The glasses further include a mechanical adjustment device, which can be arranged in the fixing and pupil detection device 40, in the driving and detection device 50, or separately, and is configured to adjust the initial positions of the lens array 20 and/or the display chip 10 for use by eyes of different visual acuity.
As an example, the embodiment of the present disclosure also gives design parameters of the virtual reality glasses. The lens pitch t is 2 mm, the entrance pupil diameter D of the lens is 1.5 mm, L is 17 mm, and f = 10 mm. Calculated according to the imaging formula, the first display layer 31, the second display layer 32, the third display layer 33 and the fourth display layer 34 can be set as shown in the following table:
Figure PCTCN2022118671-appb-000007
Here P1 and P2 are the two display layers formed by axial vibration imaging of the lenses on one array surface, and P3 and P4 are the two display layers formed by axial vibration imaging of the lenses on the other array surface; the spacing between the two lens array surfaces is 243 µm, and the axial vibration displacement of the lenses is A1 = 117 µm. On the vertical-axis component of the vibration, because lenses of different polarization states alternate, the vibration must reach at least the position of the adjacent lens; the resolution of the display chip 10 is determined by the size of the light-emitting point and the angle between the mounting direction and the long vibration axis, and because of the polarization the period span must be larger, so the effective vertical-axis amplitude of the lens array is A2 > 2 mm. In summary, the long-axis (vertical) to short-axis (axial) vibration ratio of the two-dimensional vibration is k = A2/A1 > 2000/117 ≈ 17; the axial lens displacement is small relative to the vertical displacement, almost negligible, and the axial vibration process of the lens array can be ignored.
As an example, the embodiment of the present disclosure also gives design parameters of other virtual reality glasses. Since the above virtual reality glasses can adjust the spacing between the display chip 10 and the lens array 20, when the user's eyes are myopic, the distance between the lens array 20 and the display chip 10 can be adjusted so that the imaging surface is closer to the eyes than for normal vision; a myopic viewer can then watch the same display content as a normal-sighted viewer simply by wearing the VR glasses. For example, when the eye has 500 degrees of myopia, the picture of the first display layer is adjusted forward; compared with a non-myopic wearer, the distance between the lens array 20 and the display chip 10 is adjusted by 0.44 mm, which can be realized by the mechanical adjustment device.
The lens pitch t is 2 mm, the entrance pupil diameter D of the lens is 1.5 mm, L is 17 mm, and f = 10 mm. Calculated according to the imaging formula, when the eye has 500 degrees of myopia, the first display layer 31, the second display layer 32, the third display layer 33 and the fourth display layer 34 can be set as shown in the following table:
Figure PCTCN2022118671-appb-000008
Fig. 22 is a schematic diagram of a hardware structure of a controller 200 provided by an embodiment. As shown in Fig. 22, the electronic device includes one or more processors 210 and a memory 220; one processor 210 is taken as an example in Fig. 22.
The electronic device may further include an input device 230 and an output device 240.
The processor 210, the memory 220, the input device 230 and the output device 240 in the electronic device may be connected through a bus or in other ways; Fig. 22 takes connection through a bus as an example.
The memory 220, as a computer-readable storage medium, can be configured to store software programs, computer-executable programs and modules. The processor 210 executes various functional applications and data processing by running the software programs, instructions and modules stored in the memory 220, so as to implement any method in the above embodiments.
The memory 220 may include a program storage area and a data storage area, where the program storage area can store an operating system and an application program required by at least one function, and the data storage area can store data created according to the use of the electronic device, etc. In addition, the memory may include volatile memory such as random access memory (RAM), and may also include non-volatile memory such as at least one magnetic disk storage device, flash memory device or other non-transitory solid-state storage device.
The memory 220 may be a non-transitory computer storage medium or a transitory computer storage medium. The non-transitory computer storage medium includes, for example, at least one magnetic disk storage device, flash memory device or other non-volatile solid-state storage device. In some embodiments, the memory 220 may optionally include memory located remotely relative to the processor 210, and these remote memories may be connected to the electronic device through a network; examples of such networks include the Internet, enterprise intranets, local area networks, mobile communication networks and combinations thereof.
The input device 230 can be configured to receive input digit or character information and to generate key-signal input related to user settings and function control of the electronic device. The output device 240 may include a display device such as a display screen.
This embodiment also provides a computer-readable storage medium storing computer-executable instructions, the computer-executable instructions being used to execute the above method.
All or part of the processes in the method of the above embodiment may be completed by a computer program instructing relevant hardware; the program may be stored in a non-transitory computer-readable storage medium and, when executed, may include the processes of the embodiment of the above method, where the non-transitory computer-readable storage medium may be a magnetic disk, an optical disc, a read-only memory (Read-Only Memory, ROM) or RAM, etc.
Fig. 23 is a schematic diagram of a partial hardware structure of virtual reality glasses provided by an embodiment. In one embodiment, as shown in Fig. 23, the sensor 300, the driving module 400 and the eye tracking device 500 are all electrically connected to the controller 200.

Claims (16)

  1. Virtual reality glasses, comprising a display chip and a lens array, wherein the display chip is located on the light-incident side of the lens array, and the virtual image formed by light emitted from the display chip through the lens array is located on the side of the display chip away from the lens array;
    the display chip comprises a plurality of pixel units arranged in rows and columns along a first direction and a second direction, the pixel units arranged along the first direction have the same polarization state, and along the second direction the polarization states of two adjacent rows of pixel units are orthogonal; the first direction is perpendicular to the second direction;
    the lens array comprises a plurality of lenses arranged in rows and columns along a third direction and a fourth direction, the lenses comprising first lenses and second lenses; along the third direction, one second lens is provided between two adjacent first lenses, and along the fourth direction, one second lens is provided between two adjacent first lenses; the polarization states of the first lens and the second lens are orthogonal, and the third direction is perpendicular to the fourth direction.
  2. The virtual reality glasses according to claim 1, wherein the first lens and the second lens have the same focal length;
    the first lens and the second lens are installed on different planes.
  3. The virtual reality glasses according to claim 2, wherein
    Figure PCTCN2022118671-appb-100001
    where f is the focal length of the first lens or the second lens, emax is the maximum diameter of the human eye pupil, emin is the minimum diameter of the human eye pupil, L is the distance between the first lens and the human eye pupil, D is the entrance pupil diameter of the lens, and
    Figure PCTCN2022118671-appb-100002
    is the tracking error of the pupil position.
  4. The virtual reality glasses according to claim 1, wherein the first lens and the second lens have different focal lengths;
    the first lens and the second lens are installed on the same plane.
  5. The virtual reality glasses according to claim 1, wherein the distance between the centers of adjacent lenses is t,
    Figure PCTCN2022118671-appb-100003
    where emin is the minimum diameter of the human eye pupil, D is the entrance pupil diameter of the lens, and
    Figure PCTCN2022118671-appb-100004
    is the tracking error of the pupil position.
  6. The virtual reality glasses according to claim 1, further comprising a driving module configured to drive at least one of the display chip and the lens array to move along an axial direction, and to drive at least one of the display chip and the lens array to vibrate along a vertical-axis direction;
    wherein the axial direction is perpendicular to the plane of the display chip, and the vertical-axis direction is parallel to the plane of the display chip.
  7. The virtual reality glasses according to claim 6, wherein the lens array further comprises a lens array support frame, and the lens is rotatably mounted on the lens array support frame;
    the driving module comprises a telescopic device configured to drive the lens to rotate, realizing the swing of the lens array along the vertical-axis direction.
  8. The virtual reality glasses according to claim 6, wherein the vertical-axis direction is parallel to the second direction.
  9. The virtual reality glasses according to claim 6, wherein the virtual image comprises at least two display layers arranged in sequence;
    the virtual reality glasses further comprise a sensor and a controller, the sensor being configured to detect the position of the lens array and feed it back to the controller in real time, and the controller being configured to control, according to the position of the lens array, the points in time at which the display chip is lit.
  10. The virtual reality glasses according to claim 1, wherein the virtual image comprises at least two display layers arranged in sequence;
    the virtual reality glasses further comprise an eye tracking device and a controller, the eye tracking device being configured to track and determine the position of the human eye pupil; by adjusting the brightness of the pixel units, the controller adjusts the display brightness of each presented display layer so that the picture formed by the visual fusion of the display layers is displayed at a preset distance.
  11. The virtual reality glasses according to claim 1, wherein the virtual image comprises a first display layer, a second display layer, a third display layer and a fourth display layer arranged in sequence, the first display layer being located between the display chip and the second display layer.
  12. The virtual reality glasses according to claim 1, comprising a plurality of display modules, each display module comprising the display chip and the lens array arranged in parallel, the display chips being spliced and the lens arrays being spliced;
    a non-zero included angle is formed between two adjacent display chips.
  13. The virtual reality glasses according to claim 1, wherein the first lens and the second lens have the same focal length;
    the first lens and the second lens are installed on the same plane.
  14. The virtual reality glasses according to claim 1, further comprising a mechanical adjustment device configured to adjust the initial position of at least one of the lens array and the display chip.
  15. The virtual reality glasses according to claim 14, further comprising a fixing and pupil detection device and a driving and detection device;
    the fixing and pupil detection device comprises an eye tracking device, and the driving and detection device comprises a driving module, a sensor and a controller;
    the eye tracking device is configured to track and determine the position of the human eye pupil;
    the driving module is configured to drive at least one of the display chip and the lens array to move along an axial direction, and to drive at least one of the display chip and the lens array to vibrate along a vertical-axis direction;
    the sensor is configured to detect the position of the lens array and feed it back to the controller in real time.
  16. The virtual reality glasses according to claim 1, further comprising an eye tracking device and a controller, the eye tracking device being configured to track and determine the position of the human eye pupil;
    the controller is configured to project images to the human eye pupil according to the position of the human eye pupil.
PCT/CN2022/118671 2022-02-22 2022-09-14 一种虚拟现实眼镜 WO2023159919A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202210160030.3 2022-02-22
CN202210160030.3A CN114326129A (zh) 2022-02-22 2022-02-22 一种虚拟现实眼镜

Publications (1)

Publication Number Publication Date
WO2023159919A1 true WO2023159919A1 (zh) 2023-08-31

Family

ID=81030787

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/118671 WO2023159919A1 (zh) 2022-02-22 2022-09-14 一种虚拟现实眼镜

Country Status (2)

Country Link
CN (1) CN114326129A (zh)
WO (1) WO2023159919A1 (zh)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114326129A (zh) * 2022-02-22 2022-04-12 亿信科技发展有限公司 一种虚拟现实眼镜
CN114740625B (zh) * 2022-04-28 2023-08-01 珠海莫界科技有限公司 一种光机、光机的控制方法及ar近眼显示装置

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101438599A (zh) * 2006-05-04 2009-05-20 三星电子株式会社 多视角自动立体显示器
JP2009237112A (ja) * 2008-03-26 2009-10-15 Toshiba Corp 立体画像表示装置
CN107505720A (zh) * 2017-09-14 2017-12-22 北京邮电大学 一种基于正交偏振的三维光场显示装置
CN108445633A (zh) * 2018-03-30 2018-08-24 京东方科技集团股份有限公司 一种vr头戴式显示设备、vr显示方法及vr显示系统
CN108732763A (zh) * 2018-05-31 2018-11-02 京东方科技集团股份有限公司 用于虚拟现实的显示屏和头显装置与虚拟现实头显系统
CN108886610A (zh) * 2016-03-15 2018-11-23 深见有限公司 3d显示装置、方法和应用
CN213876192U (zh) * 2021-01-28 2021-08-03 成都工业学院 一种基于偏振光栅的多分辨率立体显示装置
CN114326129A (zh) * 2022-02-22 2022-04-12 亿信科技发展有限公司 一种虚拟现实眼镜

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102008001644B4 (de) * 2008-05-08 2010-03-04 Seereal Technologies S.A. Vorrichtung zur Darstellung von dreidimensionalen Bildern
WO2017164573A1 (en) * 2016-03-23 2017-09-28 Samsung Electronics Co., Ltd. Near-eye display apparatus and near-eye display method

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101438599A (zh) * 2006-05-04 2009-05-20 三星电子株式会社 多视角自动立体显示器
JP2009237112A (ja) * 2008-03-26 2009-10-15 Toshiba Corp 立体画像表示装置
CN108886610A (zh) * 2016-03-15 2018-11-23 深见有限公司 3d显示装置、方法和应用
CN107505720A (zh) * 2017-09-14 2017-12-22 北京邮电大学 一种基于正交偏振的三维光场显示装置
CN108445633A (zh) * 2018-03-30 2018-08-24 京东方科技集团股份有限公司 一种vr头戴式显示设备、vr显示方法及vr显示系统
CN108732763A (zh) * 2018-05-31 2018-11-02 京东方科技集团股份有限公司 用于虚拟现实的显示屏和头显装置与虚拟现实头显系统
CN213876192U (zh) * 2021-01-28 2021-08-03 成都工业学院 一种基于偏振光栅的多分辨率立体显示装置
CN114326129A (zh) * 2022-02-22 2022-04-12 亿信科技发展有限公司 一种虚拟现实眼镜

Also Published As

Publication number Publication date
CN114326129A (zh) 2022-04-12

Similar Documents

Publication Publication Date Title
JP6821574B2 (ja) 全反射を有するディスプレイ装置
WO2023159919A1 (zh) 一种虚拟现实眼镜
JP4263461B2 (ja) 自動立体光学装置
JP6415608B2 (ja) 目用投影システム
JP5417660B2 (ja) 立体プロジェクション・システム
US8648773B2 (en) Three-dimensional display
US20130286053A1 (en) Direct view augmented reality eyeglass-type display
US20210018760A1 (en) Optical display system, ar display device and vr display device
CN104285176A (zh) 立体焦场式眼镜显示器
CN102213832A (zh) 头部安装型显示器及其光学位置调整方法
US10642061B2 (en) Display panel and display apparatus
CN110133859B (zh) 显示装置
CN111856775A (zh) 显示设备
CN114365027A (zh) 显示具有景深的物体的系统与方法
CN114041080A (zh) 具有偏振体全息图的光学系统
US11573419B2 (en) Display device and display method
US10511832B1 (en) Calibration of virtual image system with extended nasal field of view
WO2021139204A1 (zh) 三维显示装置以及系统
CN110286493B (zh) 一种基于双光栅的立体投影装置
CN114200679B (zh) 光学模组/系统、显示装置、头戴式显示设备和显示系统
WO2018161648A1 (zh) 图像显示系统
US9857592B2 (en) Display device
CN209879155U (zh) 一种基于双光栅的立体投影装置
JP2022546174A (ja) 反射偏光子を備えるビームスキャナ
JP2016110146A (ja) 画像表示装置

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22928185

Country of ref document: EP

Kind code of ref document: A1