WO2024041034A1 - Display module, optical display system, terminal device and imaging method - Google Patents

Display module, optical display system, terminal device and imaging method

Info

Publication number
WO2024041034A1
Authority
WO
WIPO (PCT)
Prior art keywords
light source
light
position information
component
control signal
Application number
PCT/CN2023/093125
Other languages
English (en)
Chinese (zh)
Inventor
秦振韬
毛磊
董天浩
Original Assignee
华为技术有限公司 (Huawei Technologies Co., Ltd.)
Application filed by 华为技术有限公司 (Huawei Technologies Co., Ltd.)
Publication of WO2024041034A1

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01N: INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N 21/00: Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N 21/17: Systems in which incident light is modified in accordance with the properties of the material investigated
    • G01N 21/55: Specular reflectivity
    • G: PHYSICS
    • G02: OPTICS
    • G02B: OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 27/00: Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B 27/01: Head-up displays

Definitions

  • the present application relates to the field of display technology, and in particular, to a display module, an optical display system, a terminal device and an imaging method.
  • Three-dimensional (3D) display can give the observer a strong sense of three-dimensional realism, increase the display depth of field, and enhance the richness of the display content.
  • conventional 3D technology is based on two sets of image information, one for the left eye and one for the right eye, which are projected onto the screen.
  • the observer uses polarized glasses to separate the image information, so that the left eye acquires one set of image information and the right eye acquires the other set.
  • naked-eye 3D display technology is also known as light barrier or parallax barrier technology.
  • naked-eye 3D display technology can achieve stereoscopic visual effects through light barrier technology without the use of external tools such as polarized glasses.
  • however, 3D images displayed based on light barrier technology suffer a loss of resolution.
  • the present application provides a display module, an optical display system, a terminal device and an imaging method to reduce or eliminate the problem of resolution loss in displayed images.
  • the present application provides a display module.
  • the display module includes a light source component, an optical shaping component, a light modulating component and a reflecting component.
  • the optical shaping component is located between the light source component and the light modulating component.
  • the light source component is used to emit the first beam and the second beam in a time-sharing manner, and the first beam and the second beam have different exit positions.
  • the optical shaping component is used to adjust the direction of the first light beam to obtain the third light beam, and adjust the direction of the second light beam to obtain the fourth light beam.
  • the light modulation component is used to modulate the third light beam to obtain first image light carrying first image information, and to modulate the fourth light beam to obtain second image light carrying second image information.
  • the reflective component is used to reflect the first image light to the observer's left eye for imaging, and to reflect the second image light to the observer's right eye for imaging.
  • the optical shaping component is located between the light source component and the light modulation component, the light source component can cooperate with the optical shaping component to generate directional backlight, and the light modulation component loads the first image information or the second image information in a time-division manner.
  • the observer can see the first image information or the second image information modulated by all pixels of the light modulation component with a single eye, therefore, the loss of resolution of the displayed image can be reduced or avoided.
  • the first image light is imaged on the left eye
  • the second image light is imaged on the right eye.
  • the observer can thus observe different parallax images; if the first image information and the second image information are different, a three-dimensional (3D) image effect can be obtained.
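The time-division operation described above can be sketched in code. This is an illustrative sketch only: the names (`light_source.emit`, `modulator.load`, the beam labels) are hypothetical stand-ins for the light source component and light modulation component, not interfaces defined by this application.

```python
from dataclasses import dataclass

@dataclass
class Frame:
    left_image: str   # first image information (left-eye parallax view)
    right_image: str  # second image information (right-eye parallax view)

class Stub:
    """Hypothetical stand-in for the light source / modulation components."""
    def emit(self, beam):
        return ("emit", beam)
    def load(self, image):
        return ("load", image)

def drive_one_frame(frame, light_source, modulator):
    """One display frame in two periods: each period pairs one backlight
    direction with the matching parallax image, so every pixel of the
    modulator contributes to each eye's view (no resolution loss)."""
    events = []
    # First period: the first beam steers light toward the left eye,
    # and the modulator loads the first image information.
    events.append(light_source.emit("first_beam"))
    events.append(modulator.load(frame.left_image))
    # Second period: the second beam steers light toward the right eye,
    # and the modulator loads the second image information.
    events.append(light_source.emit("second_beam"))
    events.append(modulator.load(frame.right_image))
    return events
```

Because the two periods alternate within one frame, each eye sees a full-resolution image rather than half of an interleaved one.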
  • the light source assembly includes a light source array, and the light source array includes a first light source and a second light source.
  • the light source component is configured to, according to the received first control signal, cause the first light source to emit the first light beam during the first period and the second light source to emit the second light beam during the second period.
  • the light source array can realize the division of multiple viewing areas in the eye box, so that observers can watch at different positions of the eye box.
  • the first control signal includes position information of the first light source and position information of the second light source.
  • the light sources in the light source array are distributed in a plane.
  • the planarly distributed light source array can reduce the thickness of the display module and facilitate the processing of dense light source arrays.
  • the light sources in the light source array are distributed in a curved surface.
  • the light source array distributed through the curved surface can make the images seen by the observer at the edge and the center basically the same, thereby expanding the eye box range of the display module.
  • the light sources in the light source array are distributed in polygonal rows.
  • the light source array distributed in a polygon can make the image seen by the observer at the edge and center as consistent as possible while reducing the thickness of the display module as much as possible, thereby expanding the eye box range of the display module.
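As an illustration of the curved-surface arrangement, the sketch below places light sources evenly on a circular arc facing the optical shaping component, so sources at the edge address the eye box much like those at the center. The function and its parameters are hypothetical; the application does not prescribe any particular curve.

```python
import math

def arc_light_source_positions(n, radius, span_deg):
    """Place n (>= 2) light sources evenly on a circular arc of the given
    radius, spanning span_deg degrees. Returns (x, z) coordinates with
    the arc's center of curvature at the origin, so every source faces
    that common center."""
    half = math.radians(span_deg) / 2.0
    step = 2.0 * half / (n - 1)
    angles = [-half + i * step for i in range(n)]
    return [(radius * math.sin(a), radius * math.cos(a)) for a in angles]
```

A polygonal arrangement can be seen as a piecewise-linear approximation of such an arc, trading some uniformity for a thinner module.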
  • the light source assembly includes a surface light source and a light valve; the surface light source is used to emit a surface beam; according to the received second control signal, the light valve at the first position is opened during the third period so that the first beam of the surface beam exits through the first position, and the light valve at the second position is opened during the fourth period so that the second beam of the surface beam exits through the second position.
  • the second control signal includes the first position information of the light valve and the second position information of the light valve.
  • in this way, a pixelated light source with high modulation accuracy can be achieved: switching between the first beam and the second beam is smoother in the eye-tracking state, the image observed by an observer moving within the eye box flickers less and is more stable, and by controlling the opening width of the light valve, a larger viewing depth range can be accurately matched.
  • the surface light source may be a planar surface light source, and/or the light valve may be a planar light valve.
  • Using a flat surface light source and/or a flat light valve helps reduce the thickness of the display module.
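The light-valve behavior can be pictured as a one-dimensional open/closed mask over the surface beam. `light_valve_mask` and its parameters are illustrative assumptions; the point is that the second control signal selects which position transmits, and the opening width is a free parameter that can be matched to the viewing depth range mentioned above.

```python
def light_valve_mask(n_cells, open_index, open_width):
    """Return an open/closed pattern for a 1-D light valve: cells within
    open_width // 2 of open_index transmit the surface beam, all others
    block it."""
    half = open_width // 2
    return [abs(i - open_index) <= half for i in range(n_cells)]
```

During the third period the mask would be centered on the first position, and during the fourth period on the second position.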
  • the optical shaping component includes a lens array. Further, the lenses in the lens array are distributed in a plane.
  • the thickness of the display module can be further reduced. Furthermore, the lens array helps to improve the uniformity of the image formed based on the first image light or the second image light.
  • the lens array includes an aspherical lens array.
  • the aspherical lens array can focus light beams arriving from different directions on the same fixed point; therefore, the aspherical lenses can reduce aberration at the edge of the field of view of the display module.
  • the aspherical lens array not only reduces the aberration of the display module; in the case of a larger field of view (FOV), it also allows the light source component to be designed as a planar light source array, which reduces the complexity of processing and assembly.
  • the lens array includes a Fresnel lens array.
  • imaging through the Fresnel lens keeps the difference between the edge and the center of the image small.
  • the pitch p of the lenses in the lens array (referring to the effective diameter or effective aperture size of a single lens in the lens array) satisfies the following formula 1:
  • the divergence angle in formula 1 is the larger of the divergence angle of the first light beam and the divergence angle of the second light beam
  • h is the lateral luminous width of the first light source or the second light source
  • t is the distance between the light-emitting surface of the light source component and the surface of the optical shaping component close to the light modulation component.
  • when formula 1 is satisfied, the light source (i.e. the first light source or the second light source) that is turned on at a given time can illuminate the entire corresponding lens, thereby avoiding dark areas at the edge of the lens that would affect the image uniformity of the display module.
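Formula 1 itself is not reproduced in this text. A similar-triangle reading consistent with the quantities listed above (divergence angle, source width h, distance t) is that a source of width h illuminates a patch of width h + 2·t·tan(θ/2) on the shaping surface, so full illumination of a lens requires the pitch p not to exceed that width. This is an assumed reconstruction for illustration only, not the patent's actual formula.

```python
import math

def lit_width(h, t, theta_deg):
    """Assumed similar-triangle model (NOT the patent's formula 1):
    width of the patch illuminated on the optical shaping component by
    a source of lateral width h at distance t, with full divergence
    angle theta_deg."""
    return h + 2.0 * t * math.tan(math.radians(theta_deg) / 2.0)

def lens_fully_lit(p, h, t, theta_deg):
    """True if a lens of pitch p is fully illuminated under the model
    above, i.e. no dark areas appear at the lens edge."""
    return p <= lit_width(h, t, theta_deg)
```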
  • the pitch p of the lenses in the lens array satisfies the following formula 2:
  • W_eye is the distance between the virtual eyes on the virtual eye box surface
  • d is the distance between the optical shaping component and the virtual eye box surface
  • the virtual eye box surface is the virtual image formed by the reflection of the actual eye box surface by the reflective component
  • t is the distance between the light-emitting surface of the light source component and the surface of the optical shaping component close to the light modulation component.
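Formula 2 is likewise not reproduced here, but the quantities it is defined over (W_eye, d, t) suggest a projection from the light-source plane through the shaping component onto the virtual eye-box surface. Under an assumed pinhole-lens model (an illustration, not the patent's formula), a lateral offset s on the source plane maps to an offset s·d/t on the eye-box surface:

```python
def eyebox_magnification(d, t):
    """Assumed pinhole-projection magnification from the source plane
    (distance t behind the shaping component) to the virtual eye-box
    surface (distance d in front of it)."""
    return d / t

def source_offset_between_eyes(w_eye, d, t):
    """Source-plane offset that steers a viewing zone by half the
    virtual inter-eye distance w_eye, e.g. from one eye's zone toward
    the other's, under the magnification model above."""
    return (w_eye / 2.0) / eyebox_magnification(d, t)
```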
  • the optical shaping component may also include an aspherical lens.
  • since the curvature of the aspherical lens changes continuously from the center to the edge, light beams in different directions can be focused on the same fixed point; therefore, the aspherical lens can reduce aberration at the edge of the field of view of the display module.
  • the reflective component includes at least one reflective element, and the reflective element may include but is not limited to a plane mirror, a spherical mirror, a free-form mirror, or a semi-transparent mirror.
  • the present application provides an optical display system, which includes a control module and the above-mentioned first aspect or any one of the display modules in the first aspect.
  • the control module is used to control the display module for image display.
  • the light source assembly includes a light source array, the light sources in the light source array are distributed in a plane, and the light source array includes a first light source and a second light source.
  • the control module is used to obtain the observer's left eye position information and right eye position information; determine the position information of the first light source according to the left eye position information, and the position information of the second light source according to the right eye position information; and generate a first control signal according to the position information of the first light source and the position information of the second light source and send the first control signal to the light source assembly.
  • the light source assembly is configured to, according to the received first control signal, cause the first light source to emit the first light beam during the first period and the second light source to emit the second light beam during the second period.
  • the first control signal includes position information of the first light source and position information of the second light source.
  • the light source component includes a surface light source and a light valve
  • the control module is used to obtain the observer's left eye position information and right eye position information; determine the first position information of the light valve according to the left eye position information, and the second position information of the light valve according to the right eye position information; and generate a second control signal according to the first position information and the second position information of the light valve and send the second control signal to the light source assembly.
  • the surface light source is used to emit the surface beam; the light valve at the first position is opened during the third period according to the received second control signal, and the first beam of the surface beam exits through the first position; the light valve at the second position is opened during the fourth period, and the second beam of the surface beam exits through the second position.
  • the surface light source is a planar surface light source and/or the light valve is a planar light valve.
  • the second control signal includes first position information of the light valve and second position information of the light valve.
  • the optical display system also includes an eye tracking module, and the eye tracking module is connected to the control module. Further, optionally, the control module is configured to receive left eye position information and right eye position information from the eye tracking module.
  • the present application provides a terminal device, including the optical display system of the above-mentioned second aspect or any possible implementation of the second aspect, and the optical display system is installed on the terminal device.
  • the reflective component included in the display module may be a windshield on the terminal device.
  • the present application provides an imaging method that can be applied to a display module.
  • the display module includes a light source component, an optical shaping component, a light modulation component and a reflection component.
  • the optical shaping component is located between the light source component and the light modulation component.
  • the method includes: controlling the light source component to emit a first beam and a second beam in a time-sharing manner, the first beam and the second beam having different exit positions, wherein the first beam is adjusted into a third beam by the optical shaping component and the second beam is adjusted into a fourth beam by the optical shaping component; and controlling the light modulation component to optically modulate the third beam to obtain first image light carrying first image information, and to optically modulate the fourth beam to obtain second image light carrying second image information;
  • the first image light is reflected by the reflective component to the observer's left eye for imaging, and the second image light is reflected by the reflective component to the observer's right eye for imaging; a first control signal is sent to the light source component, and a third control signal is sent to the light modulation component.
  • the method can be executed by the control module.
  • the observer's left eye position information and right eye position information are obtained, a control signal is generated based on the left eye position information and the right eye position information, and the control signal is sent to the light source component.
  • the control signal is used to control the light source component to emit the first light beam and the second light beam in a time-divided manner.
  • left eye position information and right eye position information are received from the eye tracking module.
  • the control signal includes a first control signal
  • the light source component includes a light source array
  • the position information of the first light source is determined according to the left eye position information
  • the position information of the second light source is determined according to the right eye position information
  • a first control signal is generated according to the position information of the first light source and the position information of the second light source.
  • the first control signal is used to control the first light source in the light source array to emit the first light beam in the first period, and to control the second light source in the light source array to emit the second light beam in the second period.
  • the control signal includes a second control signal
  • the light source component includes a surface light source and a light valve
  • the first position information of the light valve is determined according to the left eye position information
  • the second position information of the light valve is determined according to the right eye position information
  • a second control signal is generated based on the first position information of the light valve and the second position information of the light valve.
  • the second control signal is used to control the light valve at the first position to be opened during the third period, and the light valve at the second position to be opened during the fourth period.
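Steps like these, which map tracked eye positions to a control signal, can be sketched as follows. The nearest-index rounding and every name here are illustrative assumptions; the application only specifies that eye position information determines which light source (or light-valve position) is active in which period.

```python
def first_control_signal(left_eye_x, right_eye_x, source_pitch, n_sources):
    """Choose which light sources to turn on: the first light source
    (lit in the first period, steering light to the left eye) and the
    second light source (lit in the second period, for the right eye).
    Eye coordinates are assumed to be already projected back onto the
    light-source plane; source_pitch is the lateral source spacing."""
    def to_index(x):
        idx = round(x / source_pitch)           # nearest source
        return max(0, min(n_sources - 1, idx))  # clamp to the array
    return {"first_source": to_index(left_eye_x),
            "second_source": to_index(right_eye_x)}
```

The second control signal for the light-valve variant would follow the same pattern, selecting valve positions instead of source indices.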
  • the present application provides a control device, which is used to implement the method in the above-mentioned fourth aspect or any possible implementation of the fourth aspect, and includes corresponding functional modules respectively used to implement the steps of the above method.
  • Functions can be implemented by hardware, or by hardware executing corresponding software.
  • Hardware or software includes one or more modules corresponding to the above functions.
  • the control device may include: an interface circuit and a processor.
  • the processor is used to implement the corresponding functions of the above-mentioned method through logic circuits or by executing code instructions.
  • the interface circuit is used to receive signals from control devices other than this control device and transmit them to the processor, or to send signals from the processor to control devices other than this control device.
  • the interface circuit can be an independent receiver, an independent transmitter, or a transceiver with integrated transceiver functions.
  • the control device may also include a memory, which may be coupled to the processor and which stores necessary program instructions and data for the control device.
  • the processor cooperates with the interface circuit to control the light source component to emit the first beam and the second beam in a time-sharing manner.
  • the first beam and the second beam have different exit positions.
  • the first beam is adjusted to the third beam by the optical shaping component.
  • the second beam is adjusted into the fourth beam by the optical shaping component; the light modulation component is controlled to modulate the third beam to obtain the first image light carrying the first image information and to modulate the fourth beam to obtain the second image light carrying the second image information; the first image light is reflected by the reflective component to the observer's left eye for imaging, and the second image light is reflected by the reflective component to the observer's right eye for imaging.
  • the interface circuit is used to obtain the observer's left eye position information and right eye position information; the processor is used to generate a control signal according to the left eye position information and the right eye position information, the control signal being used to control the light source component to emit the first beam in a first period and the second beam in a second period; the interface circuit is also used to send the control signal to the light source component.
  • the interface circuit is used to receive left eye position information and right eye position information from the eye tracking module.
  • control signal includes a first control signal
  • the light source assembly includes a light source array.
  • the processor is specifically configured to: determine the position information of the first light source based on the left eye position information, and determine the position information of the second light source based on the right eye position information; and generate the first control signal based on the position information of the first light source and the position information of the second light source.
  • control signal includes a second control signal
  • the light source assembly includes a surface light source and a light valve
  • the processor is specifically configured to: determine the first position information of the light valve according to the left eye position information, and determine the second position information of the light valve according to the right eye position information; and generate the second control signal based on the first position information and the second position information of the light valve.
  • the present application provides a control device, which is used to implement the method in the above-mentioned fourth aspect or any possible implementation of the fourth aspect, including corresponding functional modules, respectively used to implement the above method. step.
  • Functions can be implemented by hardware, or by hardware executing corresponding software.
  • Hardware or software includes one or more modules corresponding to the above functions.
  • the processing module cooperates with the transceiver module to control the light source component to emit the first beam and the second beam in a time-sharing manner.
  • the first beam is adjusted into the third beam by the optical shaping component, and the second beam is adjusted into the fourth beam by the optical shaping component; the light modulation component is controlled to modulate the third beam to obtain the first image light carrying the first image information and to modulate the fourth beam to obtain the second image light carrying the second image information;
  • the first image light is reflected by the reflective component to the observer's left eye for imaging, and the second image light is reflected by the reflective component to the observer's right eye for imaging.
  • the transceiver module is used to obtain the observer's left eye position information and right eye position information; the processing module is used to generate a control signal according to the left eye position information and the right eye position information, the control signal being used to control the light source component to emit the first beam in the first period and the second beam in the second period; the transceiver module is also used to send the control signal to the light source component.
  • the transceiver module is used to receive left eye position information and right eye position information from the eye tracking module.
  • control signal includes a first control signal
  • the light source assembly includes a light source array.
  • the processing module is specifically configured to: determine the position information of the first light source based on the left eye position information, and determine the position information of the second light source based on the right eye position information; and generate the first control signal based on the position information of the first light source and the position information of the second light source.
  • the first control signal is used to control the light source assembly to cause the first light source to emit the first beam during the first period and the second light source to emit the second beam during the second period.
  • control signal includes a second control signal
  • the light source assembly includes a surface light source and a light valve
  • the processing module is specifically configured to: determine the first position information of the light valve according to the left eye position information, and determine the second position information of the light valve according to the right eye position information; and generate the second control signal based on the first position information and the second position information of the light valve.
  • the second control signal is used to control the light valve so that the light valve at the first position is opened during the third period, allowing the first beam of the surface beam emitted by the surface light source to exit through the first position, and the light valve at the second position is opened during the fourth period, allowing the second beam of the surface beam emitted by the surface light source to exit through the second position.
  • the present application provides a computer-readable storage medium.
  • Computer programs or instructions are stored in the computer-readable storage medium.
  • when the computer program or instructions are run on the control device, the control device is caused to execute the method in the above-mentioned fourth aspect or any possible implementation of the fourth aspect.
  • the present application provides a computer program product.
  • the computer program product includes a computer program or instructions.
  • when the computer program or instructions are run on the control device, the control device is caused to execute the method in the above-mentioned fourth aspect or any possible implementation of the fourth aspect.
  • Figure 1a is a schematic diagram of a scene applied to HUD provided by this application.
  • Figure 1b is a schematic diagram of a scenario applied to NED equipment provided by this application.
  • Figure 1c is a schematic diagram of a scene applied to a vehicle-mounted display screen provided by this application.
  • FIG. 2 is a schematic structural diagram of a display module provided by this application.
  • Figure 3 is a schematic diagram of an optical path in a display module provided by this application.
  • Figure 4 is a schematic diagram of the relationship between a first light source and a second light source provided by this application;
  • Figure 5a is a schematic structural diagram of a one-dimensional light source array provided by this application.
  • Figure 5b is a schematic structural diagram of a two-dimensional light source array provided by this application.
  • Figure 5c is a schematic structural diagram of a staggered arrangement of a light source array provided by this application.
  • Figure 5d is a schematic structural diagram of yet another staggered arrangement of light source arrays provided by this application.
  • Figure 5e is a schematic structural diagram of yet another staggered arrangement of light source arrays provided by this application.
  • Figure 5f is a schematic structural diagram of another light source array provided by the present application in which the light sources are arranged staggered in the column direction;
  • Figure 5g is a schematic structural diagram of a light source array provided by the present application, in which part of the light source array is arranged in a staggered arrangement at equal intervals in the row direction and partially arranged in a staggered arrangement at non-equal intervals;
  • Figure 5h is a schematic structural diagram of a light source array provided by the present application in which the staggered arrangement is partially equally spaced in the column direction and the other is non-equally spaced staggered arrangement;
  • Figure 6a is a schematic diagram of a polygonal distribution of light sources in a light source array provided by this application;
  • Figure 6b is a three-dimensional diagram of a regular hexagonally distributed light source array provided by this application.
  • Figure 7a is a schematic structural diagram of a light source array provided by the present application in which the light sources are distributed on a curved surface;
  • Figure 7b is a three-dimensional diagram of a light source array with curved surface distribution provided by this application.
  • Figure 7c is a three-dimensional diagram of a light source array with curved surface distribution provided by this application.
  • Figure 8a is a schematic structural diagram of a surface light source and light valve provided by this application.
  • Figure 8b is a schematic structural diagram of another surface light source and light valve provided by this application.
  • Figure 9a is a schematic structural diagram of a Fresnel lens provided by this application.
  • Figure 9b is a schematic structural diagram of an aspherical lens array provided by this application.
  • Figure 9c is a schematic structural diagram of a spherical lens array provided by this application.
  • Figure 9d is a schematic structural diagram of a Fresnel lens array provided by this application.
  • Figure 10a is a schematic structural diagram of a display screen provided by this application.
  • Figure 10b is a schematic structural diagram of a display screen provided by this application.
  • Figure 11 is a schematic diagram of the optical path of a display screen provided by this application.
  • Figure 12 is a schematic structural diagram of another display module provided by this application.
  • FIG. 13 is a schematic structural diagram of another display module provided by this application.
  • Figure 14 is a schematic structural diagram of another display module provided by this application.
  • Figure 15 is a schematic structural diagram of an optical display system provided by this application.
  • Figure 16 is a schematic diagram of the vergence-accommodation conflict principle provided by this application.
  • Figure 17 is a schematic circuit diagram of an optical display system provided by this application.
  • Figure 18 is an exemplary functional block diagram of a vehicle provided by this application.
  • Figure 19 is a schematic flow chart of an imaging method provided by this application.
  • Figure 20a is a schematic optical path diagram of an observer's left eye and right eye at position 1 provided by this application;
  • Figure 20b is a schematic optical path diagram of an observer's left eye and right eye at position 2 provided by this application;
  • FIG. 21 is a schematic structural diagram of a control device provided by this application.
  • Figure 22 is a schematic structural diagram of a control device provided by this application.
  • the display module provided by this application can also be integrated into a head-up display device (HUD).
  • Figure 1a takes the HUD installed in a vehicle as an example.
  • HUD can project the formed image (called HUD virtual image) into the driver's front field of view and fuse it with real road information, thereby enhancing the driver's perception of the actual driving environment.
  • the HUD can carry navigation information (such as direction arrows, distance, and/or driving time, etc.) and/or vehicle status information (such as driving speed, driving mileage, rotation speed, temperature, fuel level, and/or car light status).
• HUD includes but is not limited to a windshield head-up display (W-HUD), an AR-HUD, etc.
  • the display module provided by this application can also be integrated into a near eye display (NED) device.
  • the NED device may be, for example, an AR device or a virtual reality (VR) device.
  • the AR device may include but is not limited to AR glasses or an AR helmet.
  • the VR device may include but is not limited to VR glasses or a VR helmet. Please refer to Figure 1b, taking AR glasses as an example. Users can wear AR glasses to play games, watch videos, participate in virtual meetings, or video shopping, etc.
  • the display module provided by this application can also be integrated into a vehicle-mounted display screen.
• the vehicle-mounted display screen can be installed on the back of a vehicle seat or at the passenger seat; this application does not limit the installation position of the vehicle-mounted display screen.
  • Figure 1c shows an example of being installed on the back of the seat.
• the display module provided by this application can also be integrated into a device including a picture generation unit (PGU) and a reflective component, or applied to other possible smart virtual-image display scenarios, such as light display, mid-air display, etc., and is not limited to the scenes illustrated in the above examples.
  • this application provides a display module.
• the display module can be used to display images without losing the resolution of the displayed image.
  • the display module includes a light source component, an optical shaping component, a light modulating component and a reflecting component.
  • the optical shaping component is located between the light source component and the light modulating component.
• the light source component is used to emit the first light beam and the second light beam in a time-sharing manner, and the first light beam and the second light beam are emitted from different positions of the light source component. It can also be understood that the light source component emits different light beams from different positions in different time periods.
  • the optical shaping component is used to adjust the direction of the first light beam to obtain the third light beam, and adjust the direction of the second light beam to obtain the fourth light beam.
  • the light modulation component is used to modulate the third light beam to obtain first image light carrying first image information, and to modulate the fourth light beam to obtain second image light carrying second image information.
  • the reflective component is used to reflect the first image light to the observer's left eye for imaging, and to reflect the second image light to the observer's right eye for imaging.
• the propagation light path of the first beam is as follows: the light source component emits the first beam; the first beam is adjusted by the optical shaping component to obtain the third beam; the light modulation component modulates the first image information onto the third beam to obtain the first image light; and the first image light is reflected by the reflective component to the observer's left eye for imaging.
• the propagation light path of the second beam is as follows: the light source component emits the second beam; the second beam is adjusted by the optical shaping component to obtain the fourth beam; the light modulation component modulates the second image information onto the fourth beam to obtain the second image light; and the second image light is reflected by the reflective component to the observer's right eye for imaging.
  • the optical shaping component is located between the light source component and the light modulation component, the light source component can cooperate with the optical shaping component to generate directional backlight, and the light modulation component loads the first image information or the second image information in a time-division manner.
• the observer can see, with one eye, the first image information or the second image information modulated by all the pixels of the light modulation component. It can also be understood that the observer can see, with one eye, the complete first image information or second image information modulated by the light modulation component. Therefore, the loss of resolution of the displayed image can be reduced or avoided.
  • the first image light is imaged on the left eye
  • the second image light is imaged on the right eye.
• the observer's two eyes can observe different parallax images. If the first image information and the second image information are different, a three-dimensional (3D) image effect can be obtained. It can be understood that if the first image information and the second image information are the same, the display module can also display a two-dimensional image.
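• The time-division principle described above can be sketched as follows (a minimal illustration; the function and tuple names are hypothetical, not from this application):

```python
# Minimal illustration (names are hypothetical) of the time-division display
# principle: odd/even periods alternate between the first beam modulated with
# the first image information for the left eye and the second beam modulated
# with the second image information for the right eye. Each eye sees an image
# modulated by all pixels, so no resolution is lost.

def frame_sequence(num_periods):
    """List which (beam, image information, eye) triple is active per period."""
    schedule = []
    for k in range(num_periods):
        if k % 2 == 0:
            schedule.append(("first beam", "first image information", "left eye"))
        else:
            schedule.append(("second beam", "second image information", "right eye"))
    return schedule
```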
  • the light source component is configured to emit the first light beam and the second light beam in a time-divided manner according to the received control signal.
  • the following shows two possible structures of the light source assembly based on examples, and the process of the light source assembly emitting the first light beam and the second light beam in a time-division manner is introduced in detail respectively.
  • the light source component includes a light source array.
• the light source array includes a first light source and a second light source. According to the received first control signal, the light source array emits the first light beam from the first light source during the first period and emits the second light beam from the second light source during the second period. That is, the first control signal includes position information of the first light source and position information of the second light source.
• the first period T1 during which the first light source is turned on and the second period T2 during which the second light source is turned on can be set during the initialization of the display module; alternatively, the first period T1 and the second period T2 may also be carried in the first control signal. It should be noted that there is a preset time interval Δt between turning on the first light source and turning on the second light source; please refer to Figure 4. Usually, the preset time interval Δt is equal to the modulation duration T5 of the light modulation component.
  • the light source array can determine the first light source that needs to be turned on based on the position information of the first light source in the first control signal.
  • the first light source is turned on in the first period T1 , and the turned on first light source is used to emit the first light beam;
  • the second light source that needs to be turned on is determined according to the position information of the second light source.
  • the second light source is turned on in the second period T2 , and the turned on second light source is used to emit the second light beam.
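• The gating process described above can be sketched as follows (an illustrative assumption of how the first control signal might be consumed; all field names such as `first_positions` and `modulation_duration` are hypothetical):

```python
# Hypothetical sketch of a light source array acting on the first control
# signal. The signal carries the position information of the first and second
# light sources; one possible timing model is that the first light sources are
# turned on at t_start for period T1 and, after a preset interval delta_t
# (equal to the modulation duration T5 of the light modulation component), the
# second light sources are turned on for period T2.

def apply_first_control_signal(signal, t_start=0.0):
    """Return (turn-on time, positions) for the first and second light sources."""
    delta_t = signal["modulation_duration"]            # preset interval Δt == T5
    first_on = (t_start, signal["first_positions"])    # emits the first beam in T1
    second_on = (t_start + signal["T1"] + delta_t,
                 signal["second_positions"])           # emits the second beam in T2
    return first_on, second_on
```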
  • the light sources in the light source array are distributed in a plane.
• the light source array includes m×n light sources, where m is an integer greater than 1 and n is a positive integer; or m is a positive integer and n is an integer greater than 1.
  • the light source array may be a one-dimensional array (or called a linear array), or it may be a two-dimensional array (or called an area array).
• the light source array includes 6×1 light sources as an example. It can also be understood that these six light sources are arranged in a strip along the row direction.
  • Case 2 The light sources in the light source array are arranged two-dimensionally.
  • the method can be further divided into the following five situations.
• the light source array includes 6×6 light sources, and the light sources are aligned in both the row direction and the column direction.
  • the light sources in the light source array are staggered at equal intervals in the row direction and aligned in the column direction.
  • FIG. 5c it is a schematic structural diagram of a staggered arrangement of a light source array provided by the present application.
  • the light sources in the light source array are staggered at equal intervals in the row direction.
• a staggered arrangement with a three-row period (also called a period-3 staggered arrangement) is taken as an example.
• the stagger size of any two adjacent light sources in the row direction is equal and is Δ1. It should be noted that the stagger size Δ1 in the row direction is smaller than the distance S1 between the centers of two adjacent light sources.
  • the light sources in the light source array are staggered at equal intervals in the column direction, but are aligned in the row direction.
  • FIG. 5d it is a schematic structural diagram of another staggered arrangement of light source array provided by the present application.
  • the light sources in the light source array are staggered at equal intervals in the column direction.
  • a three-column periodic staggered arrangement is taken as an example.
• the stagger size of any two adjacent light sources in the column direction is Δ2. It should be noted that the stagger size Δ2 in the column direction is smaller than the distance H1 between the centers of two adjacent light sources.
• the stagger size Δ1 in Figure 5c above may be the same as or different from the stagger size Δ2 in Figure 5d above; this application does not limit this.
  • Case 2.4 The light sources in the light source array are staggered at unequal intervals in the row direction and aligned in the column direction.
  • FIG. 5e it is a schematic structural diagram of yet another staggered arrangement in a light source array provided by this application.
  • the light sources in the light source array have at least two different dislocation sizes in the row direction.
  • a 2-row periodic dislocation arrangement is taken as an example, that is, there are two different dislocation sizes in the row direction.
• the stagger size of two adjacent light sources in the row direction is Δ3 or Δ4; in this example, Δ3 is smaller than Δ4. It should be noted that both Δ3 and Δ4 are smaller than the distance S1 between the centers of two adjacent light sources.
• the stagger sizes in an unequal-interval staggered arrangement in the row direction may all be different from each other, or may be partially the same and partially different, which is not limited in this application.
  • FIG. 5f it is a schematic structural diagram of another light source array provided by the present application in which the light sources are arranged staggered in the column direction.
  • the light sources in the light source array have at least two different dislocation sizes in the column direction.
  • a two-column periodic dislocation arrangement is taken as an example, that is, there are two different dislocation sizes in the column direction.
• the stagger size of two adjacent light sources in the column direction is Δ5 or Δ6; in this example, Δ5 is smaller than Δ6. It should be noted that both Δ5 and Δ6 are smaller than the distance H1 between the centers of two adjacent light sources.
  • the stagger sizes may be different from each other, or may be partially the same and partially different, which is not limited in this application.
  • the staggered arrangement of the light sources in the row direction in the light source array can also be a combination of the above situations 2.2 and 2.5, that is, some are equally spaced staggered arrangements, and some are non-equally spaced staggered arrangements.
• the stagger sizes in the row direction are Δ7, Δ8 and Δ7.
  • the staggered arrangement of the light sources in the column direction in the light source array can also be a combination of the above scenarios 2.3 and 2.5, that is, some are equally spaced staggered arrangements, and some are non-equally spaced staggered arrangements.
• the stagger sizes in the column direction are Δ9, Δ10 and Δ9. It can be understood that, in order to reduce optical crosstalk as much as possible, the distance between two adjacent light sources can be set larger; for example, the distance between two adjacent light sources can be set to 400 microns or greater.
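• As a hedged illustration of the equal-interval staggered arrangement (case 2.2), the following sketch generates light source center coordinates; the symbols S1, H1 and Δ1 follow the text, while the function name and parameterization are assumptions:

```python
# Hedged sketch of generating center coordinates for an equal-interval
# staggered light source array (case 2.2): rows repeat with a given period,
# and each row is offset in the row direction by the stagger size delta1,
# which must remain smaller than the center distance s1.

def staggered_centers(rows, cols, s1, h1, delta1, period=3):
    """Return (x, y) centers; s1/h1 are the row/column center distances."""
    assert delta1 < s1, "stagger size must be smaller than center distance S1"
    centers = []
    for r in range(rows):
        offset = (r % period) * delta1     # equal-interval stagger per row
        for c in range(cols):
            centers.append((c * s1 + offset, r * h1))
    return centers
```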
  • the light sources in the light source array given above are all rectangular as examples, and the shape of the light source can also be any other possible shape, such as a circle, an ellipse, a polygon, etc., which is not limited in this application.
• the light sources in the light source array are distributed in a polygon.
  • Figure 6a is a schematic diagram of a polygonal distribution of light sources in a light source array provided by the present application.
  • the light sources in the light source array are distributed in a regular hexagon.
  • Figure 6b is a three-dimensional diagram of a regular hexagonally distributed light source array.
• in this way, the thickness of the display module can be reduced as much as possible, and the images seen by the observer at the edge and at the center are as consistent as possible, thereby expanding the eye box range of the display module.
  • Scenario 3 The light sources in the light source array are distributed on a curved surface.
  • Figure 7a is a schematic structural diagram of a light source array provided by the present application in which the light sources are distributed on a curved surface.
  • Figure 7b or Figure 7c which is a three-dimensional diagram of a light source array distributed on a curved surface.
  • the light source array given above can achieve independent addressing.
  • Independent addressing means that the light sources in the light source array can be independently gated (or called on or turned on or powered on), and the gated light sources are used to emit the first beam or the second beam.
  • a driving current is input to a first light source in the light source array to strobe the first light source.
  • a driving current is input to the second light source in the light source array to strobe the second light source.
  • the addressing mode of the light source array is related to the physical connection relationship of the light sources in the light source array.
• the light sources can be gated point by point, or gated on demand, where on-demand gating can, for example, gate the light source at a specific position at a certain time, and so on.
• for example, if the light sources in the same column of the light source array are connected in series and different columns are connected in parallel, the light sources in the light source array can be gated column by column. For another example, if the light sources in the same row of the light source array are connected in series and different rows are connected in parallel, the light sources in the light source array can be gated row by row.
  • the light sources in the light source array can be gated according to the diagonal lines, etc., which will not be listed one by one here.
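• The wiring-dependent gating described above can be sketched as follows (a hypothetical model of column-by-column gating; the data structures are illustrative, not from this application):

```python
# Hypothetical model of wiring-dependent gating: when the light sources in the
# same column are connected in series and different columns are connected in
# parallel, driving one column's supply line turns on every light source in
# that column.

def gate_columns(rows, cols, driven_columns):
    """Return the set of (row, col) indices of gated (turned-on) light sources."""
    gated = set()
    for col in driven_columns:
        if 0 <= col < cols:
            for row in range(rows):   # series wiring: the whole column lights up
                gated.add((row, col))
    return gated
```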
• the light source in the light source array may be a laser diode (LD), a light-emitting diode (LED), a vertical cavity surface emitting laser (VCSEL), an edge emitting laser (EEL), a diode-pumped solid-state laser (DPSS), a fiber laser, or the like.
• the light source component includes a surface light source and a light valve.
  • the surface light source can be a planar surface light source
  • the light valve can be a planar light valve.
• the light valve can cover the complete surface light source. When the light valve at a certain position is opened, the surface beam emitted by the surface light source can exit from the position where the light valve is opened.
  • the surface light source is used to emit surface beams, and usually the surface light source is always on.
  • the light valve is used to open the light valve at the first position in the third period according to the received second control signal.
  • the first beam of the surface beam emitted by the surface light source is emitted through the first position.
  • the light valve is used to open the light valve at the second position in the fourth period according to the second control signal, and the second light beam in the surface beam emitted by the surface light source is emitted through the second position, please refer to (2) in Figure 8b.
  • the second control signal includes the first position information of the light valve and the second position information of the light valve.
  • the first position information of the light valve includes the first width and the first center coordinate of the light valve that needs to be opened, or includes the first starting position and the first end position of the light valve that needs to be opened.
  • the second position information of the light valve includes the second width and the second center coordinate of the light valve that needs to be opened, or includes the second starting position and the second end position of the light valve that needs to be opened.
  • the first width may be the same as the second width or may be different, which is not limited in this application.
• the third period T3 during which the light valve at the first position is opened and the fourth period T4 during which the light valve at the second position is opened can be set during the initialization of the display module; alternatively, the third period T3 and the fourth period T4 may also be carried in the second control signal.
• the interval between opening the light valve at the first position and opening the light valve at the second position is a preset time interval Δt.
• for the preset time interval Δt, reference may be made to the relevant introduction above, and details are not described again here.
• in other words, according to the received second control signal, the light valve opens at the first position in the third period, during which the first light beam in the surface beam emitted by the surface light source exits through the first position; and the light valve opens at the second position in the fourth period, during which the second light beam in the surface beam emitted by the surface light source exits through the second position.
• the first light beam is a partial beam of the surface beam, and the second light beam is also a partial beam of the surface beam; the first light beam and the second light beam exit from the light valve at different positions.
• with this structure, the first beam and the second beam can be adjusted more precisely, so that in the eye-tracking state the switching between the first beam and the second beam is smoother, and the image observed by an observer moving within the eye box flickers less and is more stable; moreover, by controlling the opening width of the light valve, a larger viewing depth range can be matched accurately.
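• The two forms of light valve position information described above (width plus center coordinate, or start plus end position) can be sketched as follows (field names are assumptions):

```python
# Illustrative handling of the two position-information formats of the second
# control signal: (width, center coordinate) or (start, end position). Both
# forms are normalized to the interval of the light valve to be opened.

def valve_interval(position_info):
    if "center" in position_info:                     # width + center form
        half = position_info["width"] / 2.0
        return (position_info["center"] - half, position_info["center"] + half)
    return (position_info["start"], position_info["end"])  # start + end form

def schedule_valve(second_control_signal):
    """Open the first position during T3 and the second position during T4."""
    return [("T3", valve_interval(second_control_signal["first_position"])),
            ("T4", valve_interval(second_control_signal["second_position"]))]
```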
  • the optical shaping component is used to adjust the direction of the first light beam to obtain the third light beam, and adjust the direction of the second light beam to obtain the fourth light beam.
• in other words, the optical shaping component is used to change the propagation direction of the first light beam from the light source component so that the first light beam is focused to one position, and to change the propagation direction of the second light beam from the light source component so that the second light beam is focused to another position.
• the following exemplarily illustrates two possible structures of the optical shaping component.
  • the optical shaping component includes a lens.
  • the lens includes a spherical lens, an aspheric lens or a Fresnel lens (see Figure 9a).
  • Fresnel lenses can also be called threaded lenses.
• one side of the Fresnel lens is smooth, and the other side is inscribed with concentric circles (also called textures or Fresnel bands) from small to large, so the Fresnel lens can reduce the aberration at the edge of the image. Since the curvature of an aspherical lens changes continuously from the center to the edge, the aspherical lens can focus light beams from different directions onto the same point, thus reducing the aberration at the edge of the display module's field of view.
  • the material of the lens may be optical materials such as glass, resin, or crystal.
• when the material of the lens is resin, the weight of the display module can be reduced.
  • the material of the lens is glass, it helps to improve the imaging quality of the display module and suppress temperature drift. This application does not limit the material of the lens.
  • the optical shaping component includes a lens array.
  • the lens array includes a spherical lens array (see Figure 9b), an aspherical lens array (see Figure 9c) or a Fresnel lens array (see Figure 9d).
• the pitch p of the lens array refers to the effective diameter or effective aperture size of a single lens in the lens array. With an aspherical lens array, the aberration at the edge of the field of view of the display module can be reduced; moreover, in the case of a larger field of view (FOV), the light source component can also be designed as a planar light source array, so that processing and assembly complexity can be reduced.
  • the lens array may be a planarly distributed lens array. Based on this, it helps to reduce the thickness of the display module.
  • the optical shaping component includes a lens array.
  • one lens in the lens array corresponds to at least two light sources in the light source array.
  • part of the light sources among the plurality of light sources corresponding to one lens serve as the first light source
  • other part of the light sources serve as the second light source.
  • one lens can correspond to two light sources, one of which serves as the first light source and the other as the second light source.
  • one lens can correspond to at least four light sources. It can also be understood that for each lens in the lens array, it has a light source corresponding to the position of an observer's left eye and a light source corresponding to the position of the right eye.
• the number of first light sources (or second light sources) turned on in a certain period is related to the lateral width of a single first light source (or second light source), the density of the first light source array (or second light source array), and the number of lenses included in the lens array, where the lateral direction of the first light source (or second light source) refers to the direction parallel to the pitch p of the lenses. Specifically, the total lateral width of the first light sources (or second light sources) that need to be turned on each time is fixed; the smaller the lateral width of a single first light source (or second light source) and the greater the density of the first light source array (or second light source array), the greater the number of first light sources (or second light sources) turned on at one time.
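• The relation described above can be illustrated with simple arithmetic (a sketch under the assumption that the total lateral width per lens is fixed; the function name and parameters are hypothetical):

```python
# Simple arithmetic for the relation above, under the assumption that the
# total lateral width that must be lit per lens is fixed: the smaller the
# lateral width of a single light source, the more light sources are turned
# on per lens, and the total count scales with the number of lenses.
import math

def num_sources_on(total_width_per_lens, single_width, num_lenses):
    per_lens = math.ceil(total_width_per_lens / single_width)
    return per_lens * num_lenses
```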
• the light modulation component is used to modulate the third light beam to obtain the first image light carrying the first image information, and to modulate the fourth light beam to obtain the second image light carrying the second image information. In other words, the light modulation component can load (or modulate) the first image information onto the third light beam to obtain the first image light, and load the second image information onto the fourth light beam to obtain the second image light; the third beam and the fourth beam may therefore be called optical carriers. It can be understood that, in order to achieve a 3D display effect, the first image information and the second image information are different.
• during display, the light modulation component can switch between the first image information and the second image information at a high frequency, so that the images reaching the observer's left eye and right eye are different, thereby achieving 3D display.
• the light modulation component can be a liquid crystal on silicon (LCOS) display, a liquid crystal display (LCD), a digital light processing (DLP) display, a laser beam scanning (LBS) display, an organic light-emitting diode (OLED), a micro light-emitting diode (micro-LED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a quantum dot light-emitting diode (QLED), a digital micromirror device (DMD) reflective display, or the like.
  • the display screen may include a light source component, an optical shaping component and a light modulation component.
  • the light source component takes a surface light source and a light valve as an example.
• the opened position of the light valve can emit part of the surface beam emitted by the surface light source (called the first beam or the second beam), and the emitted first beam or second beam has a divergence angle θ.
  • the optical shaping component takes a spherical lens array as an example
  • the light modulation component takes an LCD as an example.
  • the display screen may include a light source component, an optical shaping component and a light modulation component.
  • the light source component is a planarly distributed light source array disposed on the substrate as an example
  • the optical shaping component is a planarly distributed aspherical lens array as an example
  • the light modulation component is an LCD as an example.
  • the planarly distributed aspherical lens array and the planarly distributed light source array can realize a dense light source array and a short-focus lens array with a small lateral width, thereby reducing the thickness of the display module.
• the thickness of the display module can be compressed to the millimeter level.
  • the aspherical lens array can make the aberration of the edge field of view smaller, thereby reducing the complexity of processing and assembly of the display module in the case of a large FOV.
• the independently lit light source array can realize the division of multiple viewing areas in the eye box, allowing observers to perform 3D viewing at different positions of the eye box.
  • Figure 11 is a schematic diagram of an optical path of a display screen provided in this application.
  • the first light source in the light source array is turned on, and the turned on first light source is used to emit the first light beam (indicated by a solid line).
• the first light beam is adjusted by the corresponding aspherical lens in the aspherical lens array to obtain the third light beam; the third light beam is modulated by the LCD to obtain the first image light carrying the first image information, and the first image light is used for imaging in the left eye.
  • the second light source in the light source array is turned on.
  • the turned on second light source is used to emit a second light beam (indicated by a dotted line).
• the second light beam is adjusted by the corresponding aspherical lens in the aspherical lens array to obtain the fourth light beam; after being modulated by the LCD, the fourth light beam yields the second image light carrying the second image information, and the second image light is used for imaging in the right eye.
  • the reflective component is used to reflect the first image light to the observer's left eye for imaging, and to reflect the second image light to the observer's right eye for imaging, so that the observer can observe a 3D image.
  • Figure 12 is an example in which the reflective component includes a free-form surface reflector.
  • the reflective assembly includes at least one reflective element.
• the reflective element may include, for example but not limited to, a plane mirror, a spherical mirror, a free-form mirror, or a semi-transparent semi-reflective mirror.
• a semi-transparent semi-reflective mirror can also be called a beam splitter: an optical element in which a semi-reflective film is coated on optical glass, or a semi-transparent semi-reflective film is coated on an optical surface of a lens, so as to change the original transmission-to-reflection ratio of the incident light beam. By designing the coated film layer, the transmittance can be increased to raise the transmitted light intensity, or the reflectance can be increased to reduce the transmitted light intensity.
  • the display module includes a display screen and a reflective component.
  • the display screen may include a light source component, an optical shaping component and a light modulation component.
  • the light source component is a planarly distributed light source array disposed on the substrate as an example
  • the optical shaping component is a planarly distributed aspherical lens array as an example
  • the light modulation component is an LCD as an example.
  • the reflective component takes the reflective component shown in Figure 12 as an example.
  • L is the length of the formed light spot (or light bar), and L needs to cover the left eye without affecting the right eye, or it needs to cover the right eye without affecting the left eye.
  • the pitch p of the aspherical lenses in the aspherical lens array satisfies the following formula 1.
• θ is the larger of the divergence angle of the first beam and the divergence angle of the second beam
• t is the distance between the light-emitting surface of the light source array and the surface of the aspherical lens array close to the LCD; h is the lateral luminous width of the first light source or the second light source.
  • the lateral width can be understood as the sum of the lateral widths of all the first light sources corresponding to an aspherical lens or the sum of the lateral widths of all the corresponding second light sources.
  • an aspherical lens corresponds to two first light sources and two second light sources.
  • h is the sum of the lateral widths of the two first light sources, or h is the sum of the lateral widths of the two second light sources.
  • t satisfies the following imaging formula 2, and may be 10 mm, for example.
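If "imaging formula 2" is the standard thin-lens imaging condition 1/t + 1/d = 1/f2, where d is the distance from the lens array to the virtual eyebox surface defined below, then t follows directly from f2 and d. The formula itself is not reproduced in this text, so treat this as an assumption for illustration; the numbers are hypothetical.

```python
def source_to_lens_distance(f2, d):
    """Distance t from the light-emitting surface of the light source array to
    the aspherical lens array, assuming the thin-lens imaging condition
    1/t + 1/d = 1/f2 (a hypothetical form of 'formula 2')."""
    return 1.0 / (1.0 / f2 - 1.0 / d)

# Illustrative values: a ~10 mm equivalent focal length and a 1000 mm eyebox
# distance give t close to the 10 mm example mentioned in the text.
t = source_to_lens_distance(10.0, 1000.0)   # ~10.1 mm
```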
  • the divergence angle of the first light beam and the divergence angle of the second light beam can be set.
  • the divergence angle of the first light source and the emission angle of the second light source can be set, or a light valve that opens the first position can be set. a first width and a second width that opens the light valve in a second position.
  • the divergence angle of the first light beam may be the same as the divergence angle of the second light beam, or may be different, which is not limited in this application.
  • the divergence angle of the first light source in the light source array is the divergence angle of the first light beam
  • the divergence angle of the second light source is the divergence angle of the second light beam.
  • f2 is the equivalent focal length of the aspherical lens array
  • d is the distance between the aspherical lens array and the virtual eyebox surface
  • d satisfies the following formula 3.
  • W is the inter-eye distance between the observer's left eye and right eye, usually about 65mm
  • d1 is the distance between the actual eye box surface and the reflective component
  • d3 is the distance between the virtual eye box surface and the reflective component.
  • d1 and d3 satisfy the following imaging formula 4.
  • f1 is the equivalent focal length of the reflective component.
  • the reflective component is a planar reflective element
  • the size of the actual eye box surface is the same as the size of the virtual eye box surface
  • the distance d1 between the actual eye box surface and the reflective component is the same as the distance d3 between the virtual eye box surface and the reflective component
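The planar-mirror special case can be checked numerically. The sketch assumes "imaging formula 4" is the standard mirror imaging equation 1/d1 + 1/d3 = 1/f1 (an assumption; the patent does not reproduce the formula here). As f1 tends to infinity, the planar reflective element produces a virtual image as far behind the mirror as the object is in front, so d1 = d3 and the virtual eyebox has the same size as the actual eyebox.

```python
def virtual_image_distance(d1, f1):
    """Distance of the virtual eyebox from the reflective component, assuming the
    mirror imaging equation 1/d1 + 1/d3 = 1/f1 (hypothetical form of 'formula 4').
    Returns |d3|; a negative raw d3 indicates a virtual image behind the mirror."""
    if f1 == float("inf"):          # planar reflective element
        return d1                   # image as far behind as the object is in front
    d3 = 1.0 / (1.0 / f1 - 1.0 / d1)
    return abs(d3)

# Planar mirror: d1 == d3, so the virtual eyebox matches the actual eyebox.
assert virtual_image_distance(500.0, float("inf")) == 500.0
```

With a finite-focal-length (curved) reflective component, d3 differs from d1 and the eyebox is magnified accordingly, which is why the text treats the planar case separately.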
  • the pitch p of the aspherical lenses in the aspherical lens array also satisfies the following formula 5.
  • W eye is the virtual inter-eye distance on the virtual eye box surface
  • the virtual eye box surface is the virtual image formed by the reflection of the actual eye box surface by the reflective component.
  • W eye satisfies the following formula 6.
  • when the aspherical lens is close to the light source, the light energy emitted by the light source covers only a very small range.
  • with an aspherical lens array composed of small-pitch aspherical lenses combined with a correspondingly dense light source array, the uniformity of the image displayed by the display module can be improved.
  • the above-mentioned display module may be applied to, but is not limited to, NED devices, vehicle-mounted display screens, HUDs, etc.
  • this application can also provide an optical display system.
  • Figure 15 is a schematic structural diagram of an optical display system provided by this application.
  • the optical display system may include the display module in any of the above embodiments, and the details will not be repeated here.
  • the optical display system may also include a control module, and the control module is connected to the display module.
  • the control module is used to control the display module to display images.
  • the optical display system may also include an eye tracking module, and the eye tracking module is connected to the control module.
  • the control module can receive left eye position information and right eye position information from the eye tracking module.
  • control module and eye tracking module are introduced separately below.
  • the control module can receive the observer's left eye position information and right eye position information from the eye tracking module, determine, based on the left eye position information, the position information of the first light source that the light source component needs to turn on, and determine, based on the right eye position information, the position information of the second light source that needs to be turned on; the first control signal is generated based on the position information of the first light source and the position information of the second light source.
  • the control module can determine the first position information of the light valve based on the left eye position information, determine the second position information of the light valve based on the right eye position information, and generate a second control signal based on the first position information and the second position information of the light valve.
  • the control module may include a processor.
  • the processor may be a circuit with signal (or data) processing capabilities.
  • the processor may be a circuit with instruction reading and execution capabilities.
  • for example, a central processing unit (CPU), a microprocessor, a graphics processing unit (GPU), or a digital signal processor (DSP)
  • the processor can implement certain functions through the logical relationship of the hardware circuit, and the logical relationship of the hardware circuit is fixed or reconfigurable.
  • the processor may be a hardware circuit implemented by an application-specific integrated circuit (ASIC) or a programmable logic device (PLD), such as a field-programmable gate array (FPGA).
  • the process in which the processor loads a configuration file to realize the hardware circuit configuration can be understood as the processor loading instructions to implement the functions of some or all of the above units.
  • the processor can also be a hardware circuit designed for artificial intelligence, which can be understood as an ASIC, such as a neural-network processing unit (NPU), a tensor processing unit (TPU), or a deep learning processing unit (DPU).
  • other examples include an application processor (AP) and an image signal processor (ISP)
  • eye tracking refers to tracking eye movement by measuring the position of the gaze point of the eye or the movement of the eyeball relative to the head.
  • the eye tracking module is a device that can track and measure eye position and eye movement information.
  • the eye tracking module tracks and outputs the position of the observer's eyes in real time and transmits it to the control module, so that the control module turns on the first light source or the second light source corresponding to the position of the human eye, switching at high speed between the first light source corresponding to the left eye and the second light source corresponding to the right eye.
  • the eye tracking module continuously monitors the position of the human eye. When the human eye position moves, the eye tracking module obtains new position information and transmits the position information to the control module.
  • the eye tracking module can specifically determine the convergence depth of the binocular gaze image (i.e., binocular viewpoint).
  • the principle of convergence adjustment conflict is exemplarily shown.
  • the vergence-accommodation conflict arises because, when the human eye observes three-dimensional (3D) content, the focus depth of the two eyes' lenses is always fixed on the screen, while the binocular vergence converges at the target distance defined by the parallax, which may be located in front of or behind the screen.
  • the mismatch between the focus depth and the vergence depth causes a convergence adjustment conflict.
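This mismatch can be made concrete with similar-triangle geometry. The formula below is an illustrative derivation, not from the patent: for a screen at distance D and an on-screen parallax p between the left- and right-eye images, the lines of sight converge at depth z = W·D / (W − p), where W is the inter-eye distance.

```python
def vergence_depth(screen_distance, parallax, inter_eye=65.0):
    """Depth at which the two lines of sight converge (all units mm).

    parallax > 0 (uncrossed): target appears behind the screen.
    parallax < 0 (crossed):   target appears in front of the screen.
    Derived from similar triangles; illustrative, not from the patent.
    """
    return inter_eye * screen_distance / (inter_eye - parallax)

screen = 2000.0                            # focus depth: eyes accommodate here
behind = vergence_depth(screen, 10.0)      # ~2363.6 mm, behind the screen
in_front = vergence_depth(screen, -10.0)   # ~1733.3 mm, in front of the screen
# The gap between `screen` and the vergence depth is the accommodation conflict.
```

Zero parallax puts the vergence depth exactly on the screen, where focus and vergence agree; any nonzero parallax opens the gap the text describes.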
  • the eye tracking module may include, but is not limited to, a camera, an infrared emitter or an infrared detector, etc. Further, optionally, the eye tracking module also includes a processor.
  • FIG. 17 is a schematic circuit diagram of an optical display system provided by this application.
  • the circuit in the optical display system mainly includes a processor 1701, an external memory interface 1702, an internal memory 1703, an audio module 1704, a video module 1705, a power module 1706, a wireless communication module 1707, an input/output (I/O) interface 1708, a video interface 1709, a display circuit 1710, a modulator 1711, a light source 1712, etc.
  • the processor 1701 and its peripheral components, such as the external memory interface 1702, internal memory 1703, audio module 1704, video module 1705, power module 1706, wireless communication module 1707, I/O interface 1708, video interface 1709, and display circuit 1710, can be connected via a bus.
  • the circuit diagram schematically illustrated in this application does not constitute a specific limitation on the optical display system.
  • the optical display system may include more or fewer components than shown in the figures, or some components may be combined, or some components may be separated, or may be arranged differently.
  • the components illustrated may be implemented in hardware, software, or a combination of software and hardware.
  • the processor 1701 includes one or more processing units.
  • the processing unit may be a circuit with signal (or data) processing capabilities.
  • for details of the signal (or data) processing capabilities, please refer to the relevant introduction above, which will not be repeated here.
  • different processing units can be independent devices or integrated in one or more processors.
  • the processor 1701 may also be provided with a memory for storing instructions and data.
  • the memory in the processor 1701 is a cache memory. This memory can store instructions or data that have just been used or recycled by the processor 1701. If the processor 1701 needs to use the instructions or data again, it can call them directly from the memory. This avoids repeated access and reduces the waiting time of the processor 1701, thereby improving the efficiency of the optical display system.
  • the processor 1701 can execute stored instructions to perform the above imaging method.
  • the optical display system may also include a plurality of input/output (I/O) interfaces 1708 connected to the processor 1701 .
  • the I/O interface 1708 may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, a universal serial bus (USB) interface, etc.
  • the above-mentioned I/O interface 1708 can be connected to devices such as a mouse, touch pad, keyboard, camera, speaker, microphone, etc., or to physical buttons on the optical display system (such as volume keys, brightness adjustment keys, power on/off keys, etc.).
  • the external memory interface 1702 can be used to connect an external memory card, such as a Micro SD card, to expand the storage capacity of the optical display system.
  • the external memory card communicates with the processor 1701 through the external memory interface 1702 to implement the data storage function.
  • Internal memory 1703 may be used to store computer executable program code, which includes instructions.
  • the internal memory 1703 may include a program storage area and a data storage area.
  • the stored program area can store the operating system, at least one application program required for the function, etc.
  • the storage data area can store data created during the use of the optical display system, etc.
  • the internal memory 1703 may include random access memory (RAM), flash memory, universal flash storage (UFS), read-only memory (ROM), programmable read-only memory (PROM), erasable programmable read-only memory (EPROM), electrically erasable programmable read-only memory (EEPROM), a register, a hard disk, a mobile hard disk, a CD-ROM, or any other storage medium well known in the field.
  • the processor 1701 executes various functional applications and data processing of the optical display system by executing instructions stored in the internal memory 1703 and/or instructions stored in a memory provided in the processor 1701 .
  • An exemplary storage medium is coupled to the processor such that the processor can read information from the storage medium and write information to the storage medium.
  • the storage medium may also be an integral part of the processor.
  • the processor and storage media may be located in an ASIC.
  • the ASIC can be located in an optical display system.
  • the processor and storage medium may also exist as discrete components in the optical display system.
  • the optical display system can implement audio functions through the audio module 1704 and application processor. For example, music playback, phone calls, etc.
  • the audio module 1704 is used to convert digital audio information into analog audio signal output, and is also used to convert analog audio input into digital audio signals.
  • the audio module 1704 can also be used to encode and decode audio signals, such as playing or recording.
  • the audio module 1704 may be provided in the processor 1701, or some functional modules of the audio module 1704 may be provided in the processor 1701.
  • the video interface 1709 can receive externally input audio and video signals; it may specifically be a high-definition multimedia interface (HDMI), a digital visual interface (DVI), a video graphics array (VGA) interface, a display port (DP), etc.; the video interface 1709 can also output video to the outside.
  • the video interface 1709 can receive speed signals and power signals input from peripheral devices, and can also receive externally input AR video signals.
  • for example, when the optical display system is used as a projector, the video interface 1709 can receive video signals input from an external computer or terminal device.
  • the video module 1705 can decode the video input by the video interface 1709, for example, perform H.264 decoding.
  • the video module can also encode the video collected by the optical display system, such as H.264 encoding of the video collected by an external camera.
  • the processor 1701 can also decode the video input from the video interface 1709 and then output the decoded image signal to the display circuit 1710 .
  • the power module 1706 is used to provide power to the processor 1701 and the light source 1712 according to the input power (such as direct current).
  • the power module 1706 may include a rechargeable battery, and the rechargeable battery may provide power to the processor 1701 and the light source 1712.
  • the light emitted by the light source 1712 can be transmitted to the modulator 1711 for imaging, thereby forming an image light signal.
  • the wireless communication module 1707 can enable the optical display system to communicate wirelessly with the outside world, providing wireless communication solutions such as wireless local area network (WLAN) (e.g., a wireless fidelity (Wi-Fi) network), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near-field communication (NFC), and infrared (IR).
  • the wireless communication module 1707 may be one or more devices integrating at least one communication processing module.
  • the wireless communication module 1707 receives electromagnetic waves through the antenna, frequency modulates and filters the electromagnetic wave signals, and sends the processed signals to the processor 1701 .
  • the wireless communication module 1707 can also receive the signal to be sent from the processor 1701, frequency modulate and amplify it, and convert it into electromagnetic waves for radiation through the antenna
  • the video data decoded by the video module 1705 can also be received wirelessly through the wireless communication module 1707 or read from an external memory.
  • for example, the optical display system can receive video data from a terminal device or a car entertainment system through the wireless LAN in the car; the optical display system can also read the audio and video data stored in an external memory.
  • the display circuit 1710 and the modulator 1711 are used to display corresponding images.
  • the video interface 1709 receives an externally input video source signal.
  • the video module 1705 decodes and/or digitizes the signal and outputs one or more image signals to the display circuit 1710.
  • the display circuit 1710 drives the modulator 1711 according to the input image signal; the modulator 1711 images the incident polarized light and then outputs at least two channels of image light.
  • the processor 1701 can also output one or more image signals to the display circuit 1710 .
  • the optical display system may include but is not limited to HUD, projector, display, vehicle display screen, AR device, VR device, or smart car lights, etc.
  • the AR device may include but is not limited to AR glasses or AR helmet, etc.
  • VR equipment may include but is not limited to VR glasses or VR helmets, etc.
  • based on the structure and functional principles of the optical display system described above, the present application can also provide a terminal device.
  • the terminal device may include the display module in any of the above embodiments.
  • the terminal device may be a vehicle (such as an unmanned vehicle, a smart vehicle, an electric vehicle, or a digital vehicle), a robot, a mapping device, a drone, a smart home device (such as a television, a sweeping robot, a smart desk lamp, an audio system, an intelligent lighting system, an electrical control system, home background music, a home theater system, an intercom system, or video surveillance), intelligent manufacturing equipment (such as industrial equipment), intelligent transportation equipment (such as an AGV or an unmanned transport vehicle), or a smart terminal (such as a mobile phone, computer, tablet, PDA, desktop computer, headset, speaker, wearable device, vehicle-mounted device, virtual reality device, or augmented reality device), etc.
  • the terminal device 1800 may include a propulsion system 1801, a sensing system 1802, a control system 1803, a computer system 1804, a user interface 1805, and an optical display system 1806.
  • the components of terminal device 1800 may be configured to operate in interconnection with each other and/or with other components coupled to various systems.
  • computer system 1804 may be configured to receive data from and control propulsion system 1801, sensing system 1802, control system 1803, and the like.
  • Computer system 1804 may also be configured to generate a display of the image on user interface 1805 and receive input from user interface 1805 .
  • Propulsion system 1801 may provide powered motion for terminal device 1800.
  • Propulsion system 1801 may include an engine/motor, an energy source, a transmission, and wheels/tyres. Propulsion system 1801 may additionally or alternatively include components other than those shown in FIG. 18. This application does not specifically limit this.
  • the sensing system 1802 may include several sensors for sensing information about the environment in which the terminal device 1800 is located, and the like.
  • the sensors of the sensing system 1802 may include, but are not limited to, global positioning system (GPS), inertial measurement unit (IMU), millimeter wave radar, lidar, cameras, and sensors for modifying the position and/or orientation of the brake.
  • Millimeter wave radar may utilize radio signals to sense targets in the surrounding environment of the terminal device 1800 .
  • millimeter wave radar may be used to sense the speed and/or heading of the target.
  • LiDAR can utilize laser light to sense targets in the environment where the terminal device 1800 is located.
  • a lidar may include one or more laser sources and one or more detectors, among other system components.
  • the camera may be used to capture multiple images of the surrounding environment of the terminal device 1800 .
  • the camera can be a still camera or a video camera.
  • GPS may be any sensor used to estimate the geographic location of terminal device 1800 .
  • the GPS may include a transceiver that estimates the position of the terminal device 1800 relative to the Earth based on satellite positioning data.
  • computer system 1804 may be used to estimate the road traveled by terminal device 1800 using GPS in conjunction with map data.
  • the IMU may be used to sense changes in the position and orientation of the terminal device 1800 based on inertial acceleration.
  • the combination of sensors in the IMU may include, for example, an accelerometer and a gyroscope. Additionally, other combinations of sensors in the IMU are possible.
  • the sensing system 1802 may also include sensors that monitor internal systems of the terminal device 1800 (e.g., an in-vehicle air quality monitor, fuel gauge, oil temperature gauge, etc.). Sensor data from one or more of these sensors can be used to detect objects and their corresponding properties (position, shape, orientation, speed, etc.). This detection and identification is a critical function for the safe operation of the terminal device 1800. The sensing system 1802 may also include other sensors. This application does not specifically limit this.
  • the control system 1803 controls the operation of the terminal device 1800 and its components.
  • the control system 1803 may include various elements, including steering units, throttles, braking units, sensor fusion algorithms, computer vision systems, route control systems, and obstacle avoidance systems.
  • the steering system is operable to adjust the forward direction of the terminal device 1800.
  • it may be a steering wheel system.
  • the throttle is used to control the operating speed of the engine and thus the speed of the terminal device 1800 .
  • Control system 1803 may additionally or alternatively include other components in addition to those shown in FIG. 18 . This application does not specifically limit this.
  • the braking unit is used to control the terminal device 1800 to decelerate; the braking unit uses friction to slow down the wheels.
  • the braking unit may convert the kinetic energy of the wheels into electrical current.
  • the braking unit may also take other forms to slow down the wheel speed to control the speed of the terminal device 1800 .
  • the computer vision system may be operable to process and analyze images captured by the camera in order to identify objects and/or features in the environment surrounding the terminal device 1800 . Objects and/or features may include traffic signals, road boundaries, and obstacles.
  • the computer vision system can use target recognition algorithms, structure from motion (SFM) algorithms, video tracking, and other computer vision techniques. In some embodiments, the computer vision system can be used to map an environment, track targets, estimate a target's speed, and so on. The route control system is used to determine the driving route of the terminal device 1800.
  • the route control system may combine data from the sensing system 1802, GPS, and one or more predetermined maps to determine a driving route for the terminal device 1800.
  • An obstacle avoidance system is used to identify, evaluate, and avoid or otherwise negotiate potential obstacles in the environment of terminal device 1800 .
  • control system 1803 may additionally or alternatively include components in addition to those shown and described. Alternatively, some of the components shown above may be reduced.
  • the computer system 1804 may include at least one processor 18041. Further, the computer system 1804 may also include an interface circuit 18042. Processor 18041 executes instructions stored in a non-transitory computer-readable medium such as memory 18043. Computer system 1804 may also be multiple computing devices that control individual components or subsystems of terminal device 1800 in a distributed manner.
  • the processor 18041 may be a circuit with signal (or data) processing capabilities. For details, please refer to the above related introduction, which will not be described again here.
  • although FIG. 18 functionally illustrates the processor, memory, and other elements of computer system 1804 in the same block, the processor and memory may actually comprise multiple processors and memories that are not stored within the same physical housing.
  • the memory may be a hard drive or other storage medium located in a housing different from computer system 1804.
  • some components such as the steering component and the deceleration component, may each have their own processor that only performs calculations related to component-specific functionality.
  • the processor can also be remote from the vehicle but can communicate wirelessly with the vehicle.
  • memory 18043 may contain instructions (eg, program logic) that may be read by processor 18041 to perform various functions of terminal device 1800, including the functions described above.
  • Memory 18043 may also contain additional instructions, including instructions to send data to, receive data from, interact with, and/or control one or more of propulsion system 1801 , sensing system 1802 , and control system 1803 .
  • the memory 18043 may also store data such as road maps, route information, data detected by sensors, vehicle position, direction, speed and other such vehicle data, as well as other information. This information may be used by terminal device 1800 and computer system 1804 when terminal device 1800 is in autonomous, semi-autonomous and/or manual modes.
  • the user interface 1805 is used to provide information to, or receive information from, the user of the terminal device 1800.
  • user interface 1805 may include one or more input/output devices within a collection of peripheral devices, which may include, for example, a wireless communication system, a touch screen, a microphone and/or a speaker, and the like.
  • Computer system 1804 may control the functionality of terminal device 1800 based on input received from various subsystems (eg, propulsion system 1801 , sensing system 1802 , and control system 1803 ) and from user interface 1805 .
  • computer system 1804 may utilize input from control system 1803 in order to control the steering unit to avoid obstacles detected by sensing system 1802 and obstacle avoidance system.
  • computer system 1804 is operable to provide control over many aspects of terminal device 1800 and its subsystems.
  • the optical display system 1806 can be referred to the introduction of any of the previous embodiments, and will not be described again here. It should be noted that the functions of some components in the optical display system can also be implemented by other subsystems of the vehicle.
  • the controller can also be a component in the control system.
  • one or more of the components described above may be installed separately from or associated with the terminal device 1800 .
  • the memory 18043 may exist partially or completely separately from the terminal device 1800.
  • the components described above may be communicatively coupled together in wired and/or wireless manners.
  • the terminal device functional framework shown in Figure 18 is just an example.
  • the terminal device 1800 may include more, fewer, or different systems, and each system may include more, less, or different components.
  • systems and components shown can be combined or divided in any way, which is not specifically limited in this application.
  • this application provides an imaging method, please refer to the introduction of Figure 19.
  • This imaging method can be applied to the display module or optical display system or terminal device shown in any of the above embodiments. It can also be understood that the imaging method can be implemented based on the display module or optical display system or terminal device shown in any of the above embodiments.
  • the following takes the light source component as a light source array as an example. The method includes the following steps:
  • Step 1901 The eye tracking module detects the position of the observer's left eye and right eye.
  • the process and principle of the eye tracking module detecting the position of the observer's left eye and right eye can be found in the relevant introduction mentioned above and will not be described again here.
  • Step 1902 The eye tracking module sends left eye position information and right eye position information to the control module.
  • the control module receives left eye position information and right eye position information from the eye tracking module.
  • the eye tracking module is connected with the control module.
  • Step 1903 The control module determines the position information of the first light source based on the left eye position information and the first corresponding relationship, and determines the position information of the second light source based on the right eye position information and the second corresponding relationship.
  • the first correspondence relationship includes the relationship between the position of the left eye and the position information of the first light source
  • the second correspondence relationship includes the correspondence relationship between the position of the right eye and the position information of the second light source. It should be noted that the first correspondence relationship and the second correspondence relationship may be in the form of a table, or may be in other possible forms, which is not limited in this application.
  • Table 1 and Table 2 exemplarily show the first correspondence relationship and the second correspondence relationship.
  • the position information of the first light source and the position information of the second light source in Table 1 and Table 2 are both represented by the rows and columns of the first light source and the second light source in the light source array.
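This table-driven lookup from step 1903 can be sketched as follows. The table contents, the position quantization step, and the (row, column) values are all hypothetical; the patent's actual Tables 1 and 2 are not reproduced here.

```python
# Hypothetical 'first correspondence': quantized left-eye x-position (mm) -> (row, col)
FIRST_CORRESPONDENCE = {
    -10: (3, 12),
      0: (3, 14),
     10: (3, 16),
}
# Hypothetical 'second correspondence' for the right eye.
SECOND_CORRESPONDENCE = {
     60: (3, 40),
     70: (3, 42),
     80: (3, 44),
}

def quantize(position_mm, step=10):
    """Snap a measured eye position to the nearest table entry."""
    return round(position_mm / step) * step

def build_first_control_signal(left_eye_mm, right_eye_mm):
    """Look up the light sources to turn on and bundle them into a control signal."""
    first = FIRST_CORRESPONDENCE[quantize(left_eye_mm)]
    second = SECOND_CORRESPONDENCE[quantize(right_eye_mm)]
    return {"first_light_source": first, "second_light_source": second}

signal = build_first_control_signal(1.8, 63.2)
# -> {'first_light_source': (3, 14), 'second_light_source': (3, 40)}
```

As the text notes, the correspondence need not be a table; a dictionary like this is just one possible realization, and the row/column pairs mirror how Tables 1 and 2 represent light source positions.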
  • the control module can also obtain the first period T 1 during which the first light source is turned on, the second period T 2 during which the second light source is turned on, the switching frequency v1 of the first light source, the switching frequency v2 of the second light source, etc.
  • the first period T 1 during which the first light source is turned on and the switching frequency v1 of the first light source can be carried in the first control signal and sent to the light source array.
  • the second period T 2 during which the second light source is turned on and the switching frequency v2 of the second light source may be carried in the second control signal and sent to the light source array.
  • Step 1904 The control module generates a first control signal based on the position information of the first light source and the position information of the second light source.
  • the first control signal includes position information of the first light source and position information of the second light source.
  • Step 1905 The control module generates a third control signal based on the position information of the first light source and the position information of the second light source.
  • the third control signal is used to control the frequency at which the light modulation component switches between the first image information and the second image information.
  • Step 1904 may be executed first and then step 1905, step 1905 may be executed first and then step 1904, or step 1904 and step 1905 may be executed synchronously, that is, the first control signal and the third control signal are generated synchronously; this application does not limit this.
  • Step 1906 The control module sends a first control signal to the light source component and a third control signal to the light modulation component.
  • the light source component receives the first control signal from the control module
  • the light modulation component receives the third control signal from the control module.
  • Step 1907 The light source component turns on the first light source in the light source array during the first period and turns on the second light source in the light source array during the second period according to the received first control signal.
  • the first light source that is turned on is used to emit a first light beam
  • the second light source that is turned on is used to emit a second light beam.
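A minimal sketch of step 1907 (the signal fields and driver interface below are assumptions, not the patent's API): the light source component drives the first light source during the first period and the second light source during the second period, so the two beams are emitted in a time-sharing manner.

```python
# Hypothetical sketch of step 1907: build the on/off timeline of one
# switching cycle from a first control signal that carries both light
# source positions and both turn-on periods.
def schedule_cycle(signal):
    timeline = []
    t = 0.0
    for source, duration in ((signal["first_source"], signal["t1"]),
                             (signal["second_source"], signal["t2"])):
        # each entry records which light source is on and for how long
        timeline.append({"source": source, "start": t, "end": t + duration})
        t += duration
    return timeline

cycle = schedule_cycle({"first_source": (0, 0), "second_source": (0, 2),
                        "t1": 1 / 120, "t2": 1 / 120})
```

The second period begins exactly where the first ends, which is what makes the first beam and the second beam time-shared rather than simultaneous.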
  • Step 1908 The light modulation component modulates the third light beam according to the third control signal to obtain the first image light carrying the first image information, and modulates the fourth light beam to obtain the second image light carrying the second image information.
  • the third control signal includes the switching frequency of the first image information and the second image information.
  • This switching frequency can be set based on a minimum requirement (i.e., the flicker frequency cannot be distinguished by the human eye) and user needs.
  • the switching frequency of the first image information and the second image information is equal to the sum of the switching frequency of the first light source and the switching frequency of the second light source.
  • For example, the switching period of the first light source v1 may be 1/60 second, and the switching period of the second light source v2 may be 1/60 second. It can be understood that after the optical display system starts to operate, the switching frequency of the first image information and the second image information is usually fixed.
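The frequency relation stated above can be put into worked numbers (the 60 Hz values are assumed for illustration): the modulator must alternate between the two images at the sum of the two light sources' switching frequencies.

```python
# Worked example of the relation: modulator switching frequency equals
# the sum of the first and second light sources' switching frequencies.
def modulator_switching_frequency(v1_hz, v2_hz):
    """Frequency at which the light modulation component alternates
    between the first and second image information."""
    return v1_hz + v2_hz

f = modulator_switching_frequency(60, 60)  # 120 Hz
frame_time = 1 / f                         # one image every ~8.33 ms
```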
  • the interval between the switching time of the light modulation component and the turn-on time of the first light source is ΔT, where ΔT can also be preset during the initialization process of the display module. It should be understood that ΔT may be equal to 0 or not equal to 0, and this application does not limit this.
  • the display module can be controlled to display images. If the first image information and the second image information are the same, the display module can realize two-dimensional image display. If the first image information and the second image information are different, the display module can realize three-dimensional image display.
  • the eye tracking module continues to monitor the position of the human eye, and the eye tracking module continues to perform the above step 1901.
  • the eye tracking module detects the new positions of the left eye and the right eye, and repeats the above steps 1902 to 1908.
  • if the positions of the left eye and the right eye do not change, the first control signal in step 1902 does not change, and the first control signal continues to be sent to light the corresponding first light source or second light source.
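The tracking behaviour described above can be sketched as a small control loop (all names are invented): while the detected eye position is unchanged the cached control signal is resent, and when it changes the lookup of steps 1902 to 1908 is repeated.

```python
# Illustrative control-loop sketch of the eye-tracking behaviour:
# `lookup` stands in for steps 1902-1908 (position -> control signal).
def control_loop(position_samples, lookup):
    signals = []
    last_pos = None
    last_sig = None
    for pos in position_samples:
        if pos != last_pos:           # new eye position detected (step 1901)
            last_sig = lookup(pos)    # redo steps 1902-1908
            last_pos = pos
        signals.append(last_sig)      # unchanged position: same signal resent
    return signals

history = control_loop(["p1", "p1", "p2"], lambda p: "signal_for_" + p)
```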
  • Figure 20a is an optical path diagram of an observer's left eye and right eye at position 1 provided by this application.
  • when the eye tracking module detects that the observer's left eye and right eye are at position 2, the optical path of the display module can be seen in Figure 20b.
  • the control module determines the first position information of the light valve according to the left eye position information and the third correspondence relationship, and determines the second position information of the light valve according to the right eye position information and the fourth correspondence relationship.
  • Table 3 and Table 4 exemplarily show the third correspondence relationship and the fourth correspondence relationship.
  • Tables 3 and 4 take as an example that the first position information of the light valve includes the first width and the first center coordinate of the light valve, and that the second position information of the light valve includes the second width and the second center coordinate of the light valve.
  • the first width 11, first width 12, ..., first width 1n in Table 3 above can be the same or different,
  • and the second width 21, second width 22, ..., second width 2n can be the same or different.
  • the first width and the second width may be the same or different, which is not limited in this application.
  • the above examples are based on the third correspondence relationship and the fourth correspondence relationship in different tables. It should be understood that the third corresponding relationship and the fourth corresponding relationship may also be in the same table, and this application does not limit this.
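The light-valve variant can be sketched the same way as the light-source lookup, with each eye-position region mapping to an aperture described by a width and a center coordinate. All values below are invented; the patent only specifies the shape of Tables 3 and 4 and that step 1904 builds the second control signal from the two position informations.

```python
# Illustrative sketch (values invented): Table 3 maps a left-eye region
# to the light valve's first width and first center coordinate; Table 4
# does the same for the right eye.
THIRD_CORRESPONDENCE = {"region_1": {"width": 2.0, "center": (-3.0, 0.0)}}
FOURTH_CORRESPONDENCE = {"region_1": {"width": 2.0, "center": (3.0, 0.0)}}

def second_control_signal(left_region, right_region):
    """Pack both looked-up apertures into one control signal (step 1904)."""
    return {
        "first_aperture": THIRD_CORRESPONDENCE[left_region],
        "second_aperture": FOURTH_CORRESPONDENCE[right_region],
    }

sig = second_control_signal("region_1", "region_1")
```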
  • the above step 1904 may specifically include the control module generating a second control signal based on the first position information of the light valve and the second position information of the light valve.
  • control device includes corresponding hardware structures and/or software modules that perform each function.
  • modules and method steps of each example described in conjunction with the embodiments disclosed in this application can be implemented in the form of hardware or a combination of hardware and computer software. Whether a certain function is executed by hardware or computer software driving the hardware depends on the specific application scenarios and design constraints of the technical solution.
  • FIG. 21 and FIG. 22 are schematic structural diagrams of possible control devices provided by the present application. These control devices can be used to implement the functions of the control devices in the above method embodiments, and therefore can also achieve the beneficial effects of the above method embodiments.
  • control device 2100 includes a processing module 2101 and a transceiver module 2102.
  • the control device 2100 is used to implement the functions of the control device in the method embodiment shown in FIG. 19 .
  • the processing module 2101 cooperates with the transceiver module 2102 to control the light source component to emit the first beam and the second beam in a time-sharing manner, where the exit positions of the first beam and the second beam are different.
  • the first beam is adjusted to a third beam through the optical shaping component, and the second beam is adjusted to a fourth beam through the optical shaping component.
  • the light modulation component is controlled to modulate the third beam to obtain the first image light carrying the first image information, and to modulate the fourth beam to obtain the second image light carrying the second image information.
  • the first image light is reflected by the reflective component to the observer's left eye for imaging, and the second image light is reflected by the reflective component to the observer's right eye for imaging.
  • processing module 2101 in the embodiment of the present application can be implemented by a processor or processor-related circuit components
  • transceiver module 2102 can be implemented by a transceiver or transceiver-related circuit components.
  • the control device 2200 may include a processor 2201 and an interface circuit 2202.
  • the processor 2201 and the interface circuit 2202 are coupled to each other.
  • the interface circuit 2202 may be a transceiver or an input-output interface.
  • the control device 2200 may also include a memory 2203 for storing instructions executed by the processor 2201 or input data required for the processor 2201 to run the instructions or data generated after the processor 2201 executes the instructions.
  • when the control device 2200 is used to implement the method shown in Figure 19, the processor 2201 is used to perform the functions of the above-mentioned processing module 2101, and the interface circuit 2202 is used to perform the functions of the above-mentioned transceiver module 2102.
  • a computer program product includes one or more computer programs or instructions.
  • the computer may be a general purpose computer, a special purpose computer, a computer network, a control device, user equipment or other programmable device.
  • a computer program or instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another; for example, a computer program or instructions may be transmitted from a website, computer, server, or data center to another website, computer, server, or data center by wired or wireless means.
  • a computer-readable storage medium can be any available medium that can be accessed by a computer, or a data storage device, such as a server or data center, that integrates one or more available media. Available media can be magnetic media, such as floppy disks, hard disks, and tapes; optical media, such as digital video discs (DVDs); or semiconductor media, such as solid state drives (SSDs).
  • "a, b or c" can mean: a, b, c, "a and b", "a and c", "b and c", or "a and b and c", where a, b, and c can be single or multiple.
  • the character “/” generally indicates that the related objects are in an "or” relationship.
  • in a formula, the character "/" indicates that the related objects are in a "division" relationship.
  • the word "exemplary" is used to mean an example, illustration, or explanation. Any embodiment or design described herein as "exemplary" should not be construed as preferred or more advantageous than other embodiments or designs. Rather, the use of the word "exemplary" is intended to present concepts in a concrete manner and does not constitute a limitation on this application.


Abstract

Provided are a display module, an optical display system, a terminal device and an imaging method, intended to solve the prior-art problem of loss of image display resolution, and applicable to a head-up display (HUD) device, a near-eye display (NED) device, a vehicle-mounted display and the like. The display module comprises: a light source component used to emit a first light beam and a second light beam in different time periods respectively, the exit positions of the first light beam and the second light beam being different; an optical shaping component used to adjust the direction of the first light beam to obtain a third light beam and adjust the direction of the second light beam to obtain a fourth light beam; a light modulation component used to modulate the third light beam to obtain first image light carrying first image information, and modulate the fourth light beam to obtain second image light carrying second image information; and a reflective component used to reflect the first image light to an observer's left eye for imaging and reflect the second image light to the observer's right eye for imaging, the optical shaping component being located between the light source component and the light modulation component. In this way, loss of image resolution can be avoided.
PCT/CN2023/093125 2022-08-26 2023-05-10 Module d'affichage, système d'affichage optique, dispositif terminal et procédé d'imagerie WO2024041034A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202211033849.X 2022-08-26
CN202211033849.XA CN117664924A (zh) 2022-08-26 2022-08-26 一种显示模组、光学显示系统、终端设备及成像方法

Publications (1)

Publication Number Publication Date
WO2024041034A1 (fr)

Family

ID=90012355

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2023/093125 WO2024041034A1 (fr) 2022-08-26 2023-05-10 Module d'affichage, système d'affichage optique, dispositif terminal et procédé d'imagerie

Country Status (2)

Country Link
CN (1) CN117664924A (fr)
WO (1) WO2024041034A1 (fr)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050018288A1 (en) * 2001-12-14 2005-01-27 Peter-Andre Redert Stereoscopic display apparatus and system
CN104950459A (zh) * 2015-05-27 2015-09-30 广东顺德中山大学卡内基梅隆大学国际联合研究院 一种视角增强型指向性背光裸眼立体显示装置
CN110187506A (zh) * 2019-05-28 2019-08-30 京东方科技集团股份有限公司 光学显示系统和增强现实设备
US20210392305A1 (en) * 2020-06-16 2021-12-16 Lightspace Technologies, SIA Display Systems, Projection Units and Methods for Presenting Three-Dimensional Images
CN216351537U (zh) * 2021-07-12 2022-04-19 极瞳科技(北京)有限公司 一种全息近眼三维显示系统

Also Published As

Publication number Publication date
CN117664924A (zh) 2024-03-08

Similar Documents

Publication Publication Date Title
US11838689B2 (en) Rotating LIDAR with co-aligned imager
US11270114B2 (en) AR device and method for controlling the same
US20220365345A1 (en) Head-up display and picture display system
JP2023175794A (ja) ヘッドアップディスプレイ
US20200257124A1 (en) Electronic device
WO2021015171A1 (fr) Affichage tête haute
WO2024021852A1 (fr) Appareil d'affichage stéréoscopique, système d'affichage stéréoscopique et véhicule
US11388390B2 (en) Wearable electronic device on head
WO2024041034A1 (fr) Module d'affichage, système d'affichage optique, dispositif terminal et procédé d'imagerie
WO2024021574A1 (fr) Système de projection 3d, système de projection et véhicule
WO2024065332A1 (fr) Module d'affichage, système d'affichage optique, dispositif terminal et procédé d'affichage d'image
EP3961291B1 (fr) Affichage tête haute de véhicule et unité de source de lumière utilisée à cet effet
WO2023193210A1 (fr) Module d'émission optique, dispositif d'affichage optique, dispositif terminal et procédé d'affichage d'image
US20240036311A1 (en) Head-up display
US20240069335A1 (en) Head-up display
US11442279B2 (en) Wearable glasses-type device and method for providing virtual images of the glasses-type device
US20230152586A1 (en) Image generation device and head-up display
WO2024032057A1 (fr) Appareil d'affichage tridimensionnel, dispositif d'affichage tridimensionnel et procédé d'affichage tridimensionnel
WO2023130759A1 (fr) Dispositif d'affichage et véhicule
WO2024098828A1 (fr) Système de projection, procédé de projection et moyen de transport
US11977243B1 (en) Collaborative navigation system with campfire display
WO2023190338A1 (fr) Dispositif d'irradiation d'image
WO2023216670A1 (fr) Appareil d'affichage tridimensionnel et véhicule
WO2023019442A1 (fr) Système de détection, dispositif de terminal, et procédé de commande de détection

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23856128

Country of ref document: EP

Kind code of ref document: A1