CN114245092A - Multi-depth near-to-eye display method and device - Google Patents

Multi-depth near-to-eye display method and device

Info

Publication number
CN114245092A
CN114245092A
Authority
CN
China
Prior art keywords
image, image light, preset frequency, eye display, depth near
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210165158.9A
Other languages
Chinese (zh)
Inventor
雍海波
赵鑫
郑昱
Current Assignee
Journey Technology Ltd
Original Assignee
Journey Technology Ltd
Priority date: 2022-02-23
Filing date: 2022-02-23
Publication date: 2022-03-25
Application filed by Journey Technology Ltd filed Critical Journey Technology Ltd
Priority to CN202210165158.9A priority Critical patent/CN114245092A/en
Publication of CN114245092A publication Critical patent/CN114245092A/en
Pending legal-status Critical Current


Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00Details of colour television systems
    • H04N9/12Picture reproducers
    • H04N9/31Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
    • H04N9/3141Constructional details thereof

Abstract

The application provides a multi-depth near-to-eye display method and device. In the multi-depth near-to-eye display method provided by the application, the position of the image light generated by the image source is adjusted at a preset frequency, so that the incident angle at which the image light enters the optical projection device changes at the preset frequency, which in turn changes the image combination depth of the projected image leaving the optical projection device. Because the image combination depth of the projected image then varies periodically within a preset range at the preset frequency, the inward-rotation angle of the viewer's eyes also varies, relieving strain on the eyes.

Description

Multi-depth near-to-eye display method and device
Technical Field
The application belongs to the technical field of near-eye display, and particularly relates to a multi-depth near-eye display method and device.
Background
With the development of science and technology, near-eye display technology has attracted increasing attention and is now widely applied in scientific research, military, industrial, gaming, video, and educational settings. Because the image depth of existing near-eye display devices is fixed, neither visual accommodation (the crystalline lens focusing on objects at different depths) nor visual vergence (the eyes rotating inward to fuse each eye's view into a single aligned image) ever changes. The resulting vergence-accommodation conflict is a problem that urgently needs to be solved.
Disclosure of Invention
The embodiments of the present application provide a multi-depth near-to-eye display method and device that can relieve eye fatigue by varying the vergence angle.
In a first aspect, an embodiment of the present application provides a multi-depth near-eye display method, including:
adjusting the position of image light generated by an image source at a preset frequency;
utilizing an optical lead-in device to couple the image light into an optical projection device in a time sequence;
and projecting the image light by using the optical projection device to obtain a projected image, wherein the projection position of the projected image periodically changes within a preset range at the preset frequency.
In the multi-depth near-to-eye display method provided by the application, the position of the image light generated by the image source is adjusted at the preset frequency, so that the incident angle at which the image light enters the optical projection device changes at the preset frequency, which in turn changes the image combination depth (projection position) of the projected image leaving the optical projection device. Because the image combination depth of the projected image then varies periodically within a preset range at the preset frequency, the inward-rotation angle of the viewer's eyes also varies, relieving strain on the eyes.
In one embodiment, the adjustable position range of the image light is determined according to a design parameter of the optical introduction device.
In one embodiment, the position of the image source is adjusted at the preset frequency within the adjustable position range of the image light.
In one embodiment, the position of the image source is adjusted in at least one of horizontal, vertical and rotational directions at the preset frequency within the adjustable range of positions of the image light.
In one embodiment, the position of the active light emitting area of the image source is adjusted at the preset frequency within the adjustable position range of the image light.
In one embodiment, the position of the active light emitting area of the image source is adjusted in at least one of horizontal, vertical and rotational directions at the preset frequency within the adjustable range of positions of the image light.
In a second aspect, embodiments of the present application provide a multi-depth near-eye display device, including:
an image source for generating image light;
an optical introduction device for receiving the image light;
the optical lead-in device is also used for coupling the image light into the optical projection device in a time sequence manner, and the optical projection device is used for projecting the image light to obtain a projected image; and
and the adjusting device is used for adjusting the position of the image light generated by the image source at a preset frequency so that the projection position of the projection image periodically changes within a preset range at the preset frequency.
In one embodiment, the adjusting device is configured to adjust the position of the image source within the adjustable position range of the image light at the preset frequency.
In one embodiment, the adjusting device is used for adjusting the position of an effective light emitting area of the image source within the adjustable position range of the image light at the preset frequency.
In a third aspect, an embodiment of the present application provides a terminal device, where the terminal device includes: a processor and a memory, the memory for storing a computer program, the processor for invoking and running the computer program from the memory, causing the apparatus to perform the method of any of the first aspects.
In a fourth aspect, the present application provides a computer-readable storage medium, in which a computer program is stored, and when the computer program is executed by a processor, the processor is caused to execute the method of any one of the first aspect.
In a fifth aspect, an embodiment of the present application provides a computer program product, where the computer program product includes: computer program code which, when executed by a computer, causes the computer to perform the method of any of the first aspects.
It is understood that the beneficial effects of the second aspect to the fifth aspect can be referred to the related description of the first aspect, and are not described herein again.
Drawings
To illustrate the technical solutions in the embodiments of the present application more clearly, the drawings needed for the embodiments or the prior-art descriptions are briefly introduced below. The drawings described below show only some embodiments of the present application; those skilled in the art can derive other drawings from them without inventive effort.
Fig. 1 is a schematic flow chart of a multi-depth near-eye display method according to an embodiment of the present disclosure;
FIG. 2 is a schematic flow chart of another multi-depth near-eye display method according to an embodiment of the present disclosure;
FIG. 3 is a schematic flow chart illustrating a further multi-depth near-eye display method according to an embodiment of the present disclosure;
FIG. 4 is a schematic structural diagram of a multi-depth near-eye display device according to an embodiment of the present disclosure;
fig. 5 is a display schematic diagram of a display area of an image source according to an embodiment of the present application;
fig. 6 is a schematic structural diagram of a terminal device according to an embodiment of the present application.
Detailed Description
To make the objects, technical solutions and advantages of the present application more clear, embodiments of the present application will be described in further detail below with reference to the accompanying drawings.
Referring to fig. 1, an embodiment of the present application provides a multi-depth near-eye display method, including:
s10, adjusting the position of the image light generated by the image source with a preset frequency;
s20, coupling the image light into the optical projection device in time sequence by using the optical lead-in device;
and S30, projecting the image light by using the optical projection device to obtain a projection image, wherein the projection position of the projection image periodically changes within a preset range at the preset frequency.
It is understood that the multi-depth near-eye display method may be applied to a near-eye display device. The near-eye display device may be a Virtual Reality (VR) device, an Augmented Reality (AR) device, a Mixed Reality (MR) device, or the like, and may also be a Head-Up Display (HUD), or the like, which is not limited in this application.
In step S10, the image source may be a flat-panel display or a curved-panel display; further optionally, the image source may be a Liquid Crystal Display (LCD), a Liquid Crystal on Silicon (LCoS) display, a reflective projection display, a Light-Emitting Diode (LED) display, an Organic Light-Emitting Diode (OLED) display, or the like.
The manner of adjusting the position of the image light generated by the image source at the preset frequency is not particularly limited: the physical position of the image source may be moved in real time, or the position of the effective light-emitting area of the image source may be adjusted in real time; either achieves the goal of adjusting the position of the image light. The value of the preset frequency is likewise not specifically limited, as long as the eyes are not left viewing an unchanging depth long enough to cause visual fatigue. In one embodiment, the image light performs a periodic motion within the adjustable position range with a period of 1/20 s.
In one embodiment, when the near-eye display device is a binocular near-eye display device, the persistence of vision of the human eye lets the user perceive images at different focus positions as being generated simultaneously, thereby achieving a 3D display effect. Since the human eye can detect refresh rates below about 20 Hz, realizing this 3D effect requires rapidly switching the projection position (focus position) of the projected image at a preset frequency above 20 Hz (for example, 60 Hz).
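As a rough illustration of the timing constraint above, the following sketch shows how time-sequential multiplexing divides a display's refresh rate among several depth planes. The 60 Hz rate and the three-plane count are illustrative assumptions consistent with, but not specified by, the description:

```python
# Time-sequential depth multiplexing sketch (illustrative assumptions:
# 60 Hz display, 3 depth planes; these numbers are not from the patent).

def plane_for_frame(frame_index: int, num_planes: int) -> int:
    """Return which depth plane is displayed on a given frame."""
    return frame_index % num_planes

def per_plane_refresh_hz(display_hz: float, num_planes: int) -> float:
    """Each depth plane is revisited at display_hz / num_planes."""
    return display_hz / num_planes

# With a 60 Hz display and 3 depth planes, each plane refreshes at 20 Hz,
# roughly the threshold at which persistence of vision fuses the frames.
print(plane_for_frame(7, 3))          # 1
print(per_plane_refresh_hz(60.0, 3))  # 20.0
```

This is why the description requires the preset frequency to stay above about 20 Hz per perceived depth: dividing the display rate among too many planes would push each plane below the fusion threshold.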
The optical lead-in device may include a beam-shaping lens group combined with an optical in-coupling element. The beam-shaping lens group shapes the image light into parallel light imaged at infinity. The optical in-coupling element, which couples the parallel light exiting the beam-shaping lens group into the optical projection device, may be an in-coupling prism, an in-coupling grating, or the like. The optical projection device may include an output coupler, which may include a surface grating or a volume grating and may also include a spatial light modulator; alternatively, the optical projection device may include a half-mirror.
In the multi-depth near-to-eye display method provided by the application, the position of the image light generated by the image source is adjusted at the preset frequency, so that the incident angle at which the image light enters the optical projection device changes at the preset frequency, which in turn changes the image combination depth of the projected image leaving the optical projection device. Because the image combination depth of the projected image then varies periodically within a preset range at the preset frequency, the inward-rotation angle of the viewer's eyes also varies, relieving strain on the eyes.
Referring to fig. 2, in one embodiment, the multi-depth near-eye display method further includes step S101: determining the adjustable position range of the image light according to the design parameters of the optical lead-in device. The design parameters of the optical lead-in device may include a design aperture value and an aperture position. The aperture size and position through which the image light actually passes are defined as the effective aperture value and effective aperture position. The design aperture value is typically chosen larger than the effective aperture, which leaves the image light room to move within the adjustable position range.
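The relationship between the design aperture and the effective aperture can be sketched as follows. The function name and the millimeter values are illustrative assumptions, not parameters given in the patent:

```python
# Hedged sketch: the adjustable position range of the image light, taken as
# the clearance between the design aperture and the effective aperture of
# the optical lead-in device. All numbers below are hypothetical.

def adjustable_range_mm(design_aperture_mm: float,
                        effective_aperture_mm: float) -> float:
    """Total travel available to the image light across the aperture."""
    clearance = design_aperture_mm - effective_aperture_mm
    if clearance < 0:
        raise ValueError("design aperture must be >= effective aperture")
    return clearance

# e.g. a 12 mm design aperture with a 10 mm effective aperture leaves
# +/-1 mm of travel about the nominal position.
print(adjustable_range_mm(12.0, 10.0) / 2)  # 1.0
```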
Further, in one embodiment, the multi-depth near-eye display method further includes step S102: adjusting the position of the image source within the adjustable position range of the image light at the preset frequency. For example, an automated moving platform may be provided on which the image source is mounted. The platform drives the image source in a reciprocating motion within the adjustable position range at the preset frequency. In one embodiment, the platform can move in at least one of the horizontal, vertical, and rotational directions at the preset frequency, thereby driving the image light to move in at least one of those directions.
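A reciprocating drive of this kind could be sketched as a sinusoidal offset command for the platform. The frequency, amplitudes, and function names below are hypothetical, since the patent does not specify a drive waveform:

```python
import math

# Sketch of the reciprocating drive signal for an automated moving platform
# carrying the image source. Frequency, amplitudes, and the sinusoidal form
# are assumptions for illustration only.

def platform_offset(t: float, freq_hz: float = 60.0,
                    amp_h_mm: float = 0.5, amp_v_mm: float = 0.5,
                    amp_rot_deg: float = 1.0):
    """Horizontal, vertical, and rotational offsets at time t (seconds)."""
    phase = 2.0 * math.pi * freq_hz * t
    return (amp_h_mm * math.sin(phase),    # horizontal travel, mm
            amp_v_mm * math.sin(phase),    # vertical travel, mm
            amp_rot_deg * math.sin(phase)) # rotation, degrees

# At t = 0 the platform sits at its nominal position.
print(platform_offset(0.0))  # (0.0, 0.0, 0.0)
```

Any of the three amplitudes can be set to zero to restrict the motion to a single direction, matching the "at least one of horizontal, vertical and rotational directions" language of the embodiment.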
Referring to fig. 3, in one embodiment, the multi-depth near-eye display method further includes step S103: adjusting the position of the effective light-emitting area of the image source at the preset frequency within the adjustable position range of the image light. In this case the position of the image source itself may be fixed, and only the effective light-emitting area is moved within the adjustable position range at the preset frequency. Referring to fig. 5, the display area of the image source may include an effective light-emitting area and a redundant area: pixels in the effective light-emitting area participate in generating the image light, while pixels in the redundant area do not. Changing the position of the effective light-emitting area therefore changes the position of the image light at the preset frequency. In one embodiment, within the adjustable position range of the image light, the position of the effective light-emitting area is adjusted in at least one of the horizontal, vertical, and rotational directions at the preset frequency, driving the image light to move in at least one of those directions.
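The shifting of the effective light-emitting area within a fixed panel can be sketched as a clamped window offset. The panel and image dimensions below are hypothetical, chosen only to show the redundant-area margin:

```python
# Sketch of shifting the effective light-emitting area within a fixed
# display panel. Resolutions and offsets are hypothetical examples.

def active_window(panel_w: int, panel_h: int, img_w: int, img_h: int,
                  dx: int, dy: int):
    """Top-left corner of the effective light-emitting area, clamped so the
    image stays on the panel; pixels outside it form the redundant area."""
    max_dx = panel_w - img_w
    max_dy = panel_h - img_h
    x = max(0, min(dx, max_dx))
    y = max(0, min(dy, max_dy))
    return x, y

# A 1920x1080 panel showing a 1800x1000 image leaves a 120x80 pixel margin
# within which the effective light-emitting area can oscillate.
print(active_window(1920, 1080, 1800, 1000, 60, 40))  # (60, 40)
```

Driving `dx` and `dy` periodically at the preset frequency (for instance with a waveform like the platform sketch above would use) moves the image light without any mechanical motion of the source.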
Referring to fig. 4, an embodiment of the present application provides a multi-depth near-eye display device. The multi-depth near-eye display device includes an image source, an optical introduction device, an optical projection device, and an adjustment device.
The image source generates the image light. The optical lead-in device receives the image light and couples it time-sequentially into the optical projection device. The optical projection device projects the image light to obtain a projected image. The adjusting device adjusts the position of the image light generated by the image source at a preset frequency, so that the projection position of the projected image changes periodically within a preset range at the preset frequency.
It is understood that the multi-depth near-eye display device is used to implement the multi-depth near-eye display method. The multi-depth near-eye display device may be a Virtual Reality (VR) device, an Augmented Reality (AR) device, a Mixed Reality (MR) device, or the like, and may also be a Head-Up Display (HUD), or the like, which is not limited in this application.
The image source may be a flat-panel display or a curved-panel display; further optionally, the image source may be a Liquid Crystal Display (LCD), a Liquid Crystal on Silicon (LCoS) display, a reflective projection display, a Light-Emitting Diode (LED) display, an Organic Light-Emitting Diode (OLED) display, or the like.
The optical lead-in device may include a beam-shaping lens group combined with an optical in-coupling element. The beam-shaping lens group shapes the image light into parallel light imaged at infinity. The optical in-coupling element, which couples the parallel light exiting the beam-shaping lens group into the optical projection device, may be an in-coupling prism, an in-coupling grating, or the like. The optical projection device may include an output coupler, which may include a surface grating or a volume grating and may also include a spatial light modulator; alternatively, the optical projection device may include a half-mirror.
The structure of the adjusting device is not particularly limited as long as the position of the image light generated by the image source can be adjusted at a preset frequency. In one embodiment, the adjustment device may adjust the position of the image light by moving the physical location of the image source in real time. In another embodiment, the position of the image light may be adjusted by adjusting the position of the active light emitting area of the image source in real time.
In this embodiment, the adjusting device adjusts the position of the image light generated by the image source at the preset frequency, so that the incident angle at which the image light enters the optical projection device changes at the preset frequency, which in turn changes the image combination depth of the projected image leaving the optical projection device. Because the image combination depth of the projected image then varies periodically within a preset range at the preset frequency, the inward-rotation angle of the viewer's eyes also varies, relieving strain on the eyes.
In one embodiment, the adjustment device comprises an automated moving platform. An image source is disposed on the automated mobile platform. The automatic moving platform drives the image source to reciprocate within an adjustable position range at a preset frequency. In one embodiment, the automatic moving platform can move in at least one of horizontal, vertical and rotational directions at a preset frequency, and then drives the image light to move in at least one of horizontal, vertical and rotational directions.
In one embodiment, the position of the image source can be fixed, and the adjusting device only controls the effective light emitting area to move within the adjustable position range at a preset frequency. Referring to fig. 5, the display area of the image source may include an active light emitting area and a redundant area. The pixels of the effective light emitting area participate in generating the image light, and the pixels of the redundant area do not participate in generating the image light. When the adjusting device controls the position of the effective light emitting area to change, the image light position can be changed at a preset frequency. In one embodiment, within the adjustable position range of the image light, the adjusting device adjusts the position of the active light emitting area of the image source in at least one of horizontal, vertical and rotational directions at the preset frequency, so as to drive the image light to move in at least one of horizontal, vertical and rotational directions.
Based on the same inventive concept, as shown in fig. 6, an embodiment of the present application further provides a terminal device. The terminal device 300 may be a projection device, an Augmented Reality (AR) device, another product oriented toward future technologies, or the like.
As shown in fig. 6, the terminal device 300 of this embodiment includes: a processor 301, a memory 302, and a computer program 303 stored in the memory 302 and operable on the processor 301. The computer program 303 may be executed by the processor 301 to generate instructions, and the processor 301 may implement the steps in the above-described embodiments of the multi-depth near-eye display method according to the instructions. Alternatively, the processor 301 implements the functions of the modules/units in the above-described device embodiments when executing the computer program 303.
Illustratively, the computer program 303 may be divided into one or more modules/units, which are stored in the memory 302 and executed by the processor 301 to accomplish the present application. One or more of the modules/units may be a series of computer program instruction segments capable of performing specific functions, which are used to describe the execution of the computer program 303 in the terminal device 300.
Those skilled in the art will appreciate that fig. 6 is merely an example of the terminal device 300 and does not constitute a limitation of it. The terminal device 300 may include more or fewer components than those shown, combine certain components, or use different components; for example, it may further include input/output devices, network access devices, and buses.
The processor 301 may be a Central Processing Unit (CPU), another general-purpose processor, a Digital Signal Processor (DSP), an Application-Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. A general-purpose processor may be a microprocessor or any conventional processor.
The memory 302 may be an internal storage unit of the terminal device 300, such as a hard disk or internal memory of the terminal device 300. The memory 302 may also be an external storage device of the terminal device 300, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) card, or a flash card provided on the terminal device 300. Further, the memory 302 may include both an internal storage unit and an external storage device of the terminal device 300. The memory 302 is used to store the computer program and other programs and data required by the terminal device 300, and may also be used to temporarily store data that has been or will be output.
The terminal device provided in this embodiment may execute the method embodiments, and the implementation principle and the technical effect are similar, which are not described herein again.
Embodiments of the present application also provide a computer-readable storage medium, on which a computer program is stored, and the computer program, when executed by a processor, implements the method of the above-mentioned method embodiments.
The embodiment of the present application further provides a computer program product, which when running on a terminal device, enables the terminal device to implement the method of the above method embodiment when executed.
If the integrated unit is implemented in the form of a software functional unit and sold or used as a separate product, it may be stored in a computer-readable storage medium. Based on this understanding, all or part of the processes in the methods of the embodiments described above can be implemented by a computer program that is stored in a computer-readable storage medium and, when executed by a processor, implements the steps of those method embodiments. The computer program comprises computer program code, which may be in source-code form, object-code form, an executable file, some intermediate form, or the like. The computer-readable storage medium may include at least: any entity or device capable of carrying the computer program code to the photographing apparatus/terminal apparatus, a recording medium, computer memory, Read-Only Memory (ROM), Random Access Memory (RAM), an electrical carrier signal, a telecommunication signal, and a software distribution medium, such as a USB drive, a removable hard disk, a magnetic disk, or an optical disk.
Reference throughout this application to "one embodiment" or "some embodiments," etc., means that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of the application. Thus, appearances of the phrases "in one embodiment," "in some embodiments," "in other embodiments," or the like, in various places throughout this specification are not necessarily all referring to the same embodiment, but rather "one or more but not all embodiments" unless specifically stated otherwise. The terms "comprising," "including," "having," and variations thereof mean "including, but not limited to," unless expressly specified otherwise.
In the description of the present application, it is to be understood that the terms "first", "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implying any number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include at least one such feature.
In addition, in the present application, unless otherwise explicitly specified or limited, terms such as "connected" are to be construed broadly: a connection may be mechanical or electrical, direct or indirect through an intermediate medium, and may denote communication between the interiors of two elements or interaction between two elements. Those skilled in the art can understand the specific meanings of these terms in the present application according to the specific situation.
The above embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some or all of the technical features may be equivalently replaced; and the modifications or the substitutions do not make the essence of the corresponding technical solutions depart from the scope of the technical solutions of the embodiments of the present application.

Claims (10)

1. A multi-depth near-eye display method, comprising:
adjusting the position of image light generated by an image source at a preset frequency;
utilizing an optical lead-in device to couple the image light into an optical projection device in a time sequence;
and projecting the image light by using the optical projection device to obtain a projected image, wherein the projection position of the projected image periodically changes within a preset range at the preset frequency.
2. The multi-depth near-eye display method of claim 1, comprising:
and determining the adjustable position range of the image light according to the design parameters of the optical leading-in device.
3. The multi-depth near-eye display method of claim 2, comprising:
adjusting the position of the image source at the preset frequency within the adjustable position range of the image light.
4. The multi-depth near-eye display method of claim 3, comprising:
adjusting a position of the image source in at least one of horizontal, vertical, and rotational directions at the preset frequency within the adjustable position range of the image light.
5. The multi-depth near-eye display method of claim 2, comprising:
and adjusting the position of an effective light emitting area of the image source within the adjustable position range of the image light at the preset frequency.
6. The multi-depth near-eye display method of claim 5, comprising:
adjusting a position of an active light emitting area of the image source in at least one of horizontal, vertical, and rotational directions at the preset frequency within the adjustable position range of the image light.
7. A multi-depth near-eye display device, comprising:
an image source for generating image light;
an optical introduction device for receiving the image light;
the optical lead-in device is also used for coupling the image light into the optical projection device in a time sequence manner, and the optical projection device is used for projecting the image light to obtain a projected image; and
and the adjusting device is used for adjusting the position of the image light generated by the image source at a preset frequency so that the projection position of the projection image periodically changes within a preset range at the preset frequency.
8. The multi-depth near-eye display device of claim 7, wherein the adjustment device is to adjust the position of the image source at the preset frequency within an adjustable range of positions of the image light.
9. The multi-depth near-eye display device of claim 7, wherein the adjustment device is configured to adjust a position of an active light emitting area of the image source at the preset frequency within an adjustable range of positions of the image light.
10. A terminal device comprising a memory, a processor and a computer program stored in the memory and executable on the processor, characterized in that the processor implements the method according to any of claims 1 to 6 when executing the computer program.
CN202210165158.9A 2022-02-23 2022-02-23 Multi-depth near-to-eye display method and device Pending CN114245092A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210165158.9A CN114245092A (en) 2022-02-23 2022-02-23 Multi-depth near-to-eye display method and device


Publications (1)

Publication Number Publication Date
CN114245092A 2022-03-25

Family

ID=80747832

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210165158.9A Pending CN114245092A (en) 2022-02-23 2022-02-23 Multi-depth near-to-eye display method and device

Country Status (1)

Country Link
CN (1) CN114245092A (en)

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101795420A (en) * 2010-04-07 2010-08-04 昆山龙腾光电有限公司 Stereo image displaying system and control method thereof
US20120007949A1 (en) * 2010-07-06 2012-01-12 Samsung Electronics Co., Ltd. Method and apparatus for displaying
CN103487939A (en) * 2013-08-28 2014-01-01 成都理想境界科技有限公司 Adjustable head mount display optical system and adjusting method thereof
CN203433194U (en) * 2013-08-28 2014-02-12 成都理想境界科技有限公司 Adjustable head mount display optical system and head mount display
EP3176776A1 (en) * 2015-12-01 2017-06-07 Xiaomi Inc. Luminance adjusting method and apparatus, computer program and recording medium
CN108647001A (en) * 2018-05-21 2018-10-12 云谷(固安)科技有限公司 A kind of display methods and device of protection eyesight
CN108717234A (en) * 2018-05-21 2018-10-30 云谷(固安)科技有限公司 Sight protectio method and display device
CN110192142A (en) * 2019-04-01 2019-08-30 京东方科技集团股份有限公司 Display device and its display methods, display system
CN111694158A (en) * 2020-06-17 2020-09-22 Oppo广东移动通信有限公司 Calibration method, calibration equipment and calibration system for near-eye display device
CN211980054U (en) * 2020-06-04 2020-11-20 京东方科技集团股份有限公司 Display device

Similar Documents

Publication Publication Date Title
US11238836B2 (en) Depth based foveated rendering for display systems
US9298012B2 (en) Eyebox adjustment for interpupillary distance
US20180275410A1 (en) Depth based foveated rendering for display systems
US9052414B2 (en) Virtual image device
US11598966B2 (en) Light projection system including an optical assembly for correction of differential distortion
US8988474B2 (en) Wide field-of-view virtual image projector
US11435576B2 (en) Near-eye display with extended accommodation range adjustment
US20200301239A1 (en) Varifocal display with fixed-focus lens
WO2022135284A1 (en) Display module, and method and apparatus for adjusting position of virtual image
CN114365027A (en) System and method for displaying object with depth of field
Itoh et al. Computational phase-modulated eyeglasses
US11122256B1 (en) Mixed reality system
CN111736350A (en) Near-to-eye display device
US20230077212A1 (en) Display apparatus, system, and method
CN114245092A (en) Multi-depth near-to-eye display method and device
US20170359572A1 (en) Head mounted display and operating method thereof
US11733446B2 (en) Polarization-based multiplexing of diffractive elements for illumination optics
US20230314846A1 (en) Configurable multifunctional display panel
US20240027748A1 (en) Scanning projector performing consecutive non-linear scan with multi-ridge light sources
US11644610B1 (en) Phase plate and fabrication method for color-separated laser backlight in display systems
US20230360567A1 (en) Virtual reality display system
Zhang Design and Prototyping of Wide Field of View Occlusion-capable Optical See-through Augmented Reality Displays by Using Paired Conical Reflectors
WO2023147162A1 (en) Phase plate and fabrication method for color-separated laser backlight in display systems
WO2023219925A1 (en) Virtual reality display system
WO2023147166A1 (en) Phase plate and fabrication method for color-separated laser backlight in display systems

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20220325