WO2023193210A1 - Optical emission module, optical display device, terminal device and image display method - Google Patents

Optical emission module, optical display device, terminal device and image display method

Info

Publication number
WO2023193210A1
Authority
WO
WIPO (PCT)
Prior art keywords
component
light source
light
image
light beam
Prior art date
Application number
PCT/CN2022/085666
Other languages
English (en)
French (fr)
Inventor
林君翰
翁德正
徐彧
Original Assignee
华为技术有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 华为技术有限公司
Priority to PCT/CN2022/085666
Publication of WO2023193210A1

Links

Images

Classifications

    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B26/00Optical devices or arrangements for the control of light using movable or deformable optical elements
    • G02B26/08Optical devices or arrangements for the control of light using movable or deformable optical elements for controlling the direction of light
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B26/00Optical devices or arrangements for the control of light using movable or deformable optical elements
    • G02B26/08Optical devices or arrangements for the control of light using movable or deformable optical elements for controlling the direction of light
    • G02B26/10Scanning systems
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays

Definitions

  • the present application relates to the field of image display technology, and in particular, to an optical emission module, an optical display device, a terminal device and an image display method.
  • HUD head-up display
  • a HUD device is a device that projects driving-related information (such as instrument information or navigation information, etc.) to the front of the driver's field of vision.
  • the driver can see instrument information and navigation information in front of the driver's field of vision without having to lower his head to look below the steering wheel.
  • not having to look down at the instrument panel or central control display screen shortens the braking reaction time in emergency situations, thereby improving driving safety.
  • AR-HUD devices can be used.
  • the AR-HUD device can display instrument information and navigation information at the same time.
  • the AR-HUD device needs to generate two virtual images with different focal planes (also called dual-screen display).
  • LBS laser beam scanning
  • This application provides an optical emission module, an optical display device, a terminal device and an image display method, which are used to generate virtual images with different focal planes while reducing the volume of the HUD device.
  • the optical emission module may include a first light source component, a transflective component, a first reflective component and a second reflective component.
  • the transflective component includes N reflective areas and M transmissive areas, where N and M are both positive integers; the first light source component is used to emit the first beam and the second beam; the reflective area of the transflective component is used to reflect the received first beam to the first reflective component; the first reflective component is used to reflect the received first beam to the second reflective component; the transmissive area of the transflective component is used to transmit the received second beam to the second reflective component.
  • the second reflective component is used to reflect the first beam from the first reflective component to the first diffusion component, where it is scanned through the rotation of the second reflective component to form a first image, and to reflect the second beam from the transmissive area of the transflective component to the second diffusion component, where it is scanned through the rotation of the second reflective component to form a second image.
  • the first image corresponds to the first virtual image of the far focal plane
  • the second image corresponds to the second virtual image of the near focal plane.
  • the first virtual image and the second virtual image are virtual images formed on two different focal planes.
  • the design of dual optical paths (that is, the propagation optical path of the first beam and the propagation optical path of the second beam) can be realized through the reflective area and the transmissive area of the transflective component, and a single second reflective component can be used to form the first image based on the first beam and the second image based on the second beam, thus helping to reduce the volume of the optical emission module.
  • in this way, virtual images with different focal planes (i.e., dual-screen display) can be generated.
  • the N reflective areas and the M transmissive areas of the transflective component are distributed alternately (interleaved).
  • the reflective area and the transmissive area can be switched, so that the reflective area reflects the received first light beam to the first reflective component and the transmissive area transmits the received second light beam to the second reflective component.
  • the areas of the N reflective regions are the same or different; and/or the areas of the M transmission regions are the same or different.
  • the sum of the areas of the N reflective regions of the transflective component is greater than or equal to the sum of the areas of the M transmissive regions.
  • by making the sum of the areas of the N reflective areas greater than the sum of the areas of the M transmissive areas, the intensity of the reflected first light beam can be increased, which helps to increase the brightness of the first image and thus the brightness of the corresponding first virtual image.
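To illustrate the area condition above, the transflective component can be modeled as a wheel whose reflective and transmissive sectors partition one full revolution; the short Python sketch below (the sector widths are hypothetical, not taken from the application) computes the fraction of each revolution during which the first beam is reflected.

```python
def reflective_fraction(reflective_sectors, transmissive_sectors):
    """Fraction of one wheel revolution spent reflecting the first beam.

    Sector widths are in degrees; the N reflective and M transmissive
    sectors are assumed to tile the full 360-degree wheel.
    """
    total_refl = sum(reflective_sectors)
    total_trans = sum(transmissive_sectors)
    assert abs(total_refl + total_trans - 360.0) < 1e-9, "sectors must tile the wheel"
    return total_refl / 360.0

# Reflective sum (200 deg) > transmissive sum (160 deg): the reflected first
# beam, which forms the far-focal-plane image, gets the larger light share.
share = reflective_fraction([120.0, 80.0], [100.0, 60.0])
```

Under this model, a larger reflective share directly translates into more integration time, and hence more brightness, for the first image.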
  • the first reflective component includes a reflective mirror.
  • using a reflective mirror as the first reflective component helps to simplify the structure of the optical emission module. Moreover, the optical path design is also relatively simple.
  • the second reflective component is a micro-electromechanical system (MEMS) galvanometer.
  • MEMS micro-electromechanical system
  • the first beam can be scanned on the first diffusion component to form a first image
  • the second beam can be scanned on the second diffusion component to form a second image
  • the transflective component includes a color wheel.
  • the color wheel is used as a transflective component, which is simple to implement; and it is easy to control the switching between the reflective area and the transmissive area.
  • the first diffusion component is a first diffusion screen
  • the second diffusion component is a second diffusion screen
  • the first light source component is specifically configured to emit the first light beam according to the received first control signal, where the first control signal is generated based on the information of the first image; or to emit the second light beam according to the received second control signal, where the second control signal is generated based on the information of the second image.
  • the first light source component can determine whether it is necessary to emit the first light beam of the first image or the second light beam of the second image.
  • the transflective component is specifically used to align its reflective area with the propagation optical path of the first light beam according to the received third control signal, where the third control signal is generated based on the information of the first image; or to align its transmissive area with the propagation optical path of the second light beam according to the received fourth control signal, where the fourth control signal is generated based on the information of the second image.
  • the transflective component can switch the reflective area in time to align with the propagation light path of the first light beam, or switch the transmissive area in time to align with the propagation light path of the second light beam.
  • the switching frequency of the reflective area and the transmissive area of the transflective component is equal to the switching frequency of the first light source component emitting the first light beam and the second light beam.
  • the switching frequency of the reflective area and the transmissive area of the transflective component is equal to the switching frequency at which the first light source component alternates between emitting the first beam and the second beam, so that when the first light source component emits the first beam, the reflective area of the transflective component is exactly aligned with the propagation light path of the first beam, and when it emits the second beam, the transmissive area is exactly aligned with the propagation light path of the second beam.
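The equal-frequency condition can be illustrated with a toy timing model (Python; the 50/50 duty cycle and the one-reflective-sector-per-revolution layout are assumptions, not details from the application):

```python
def sector_at(t, wheel_rps):
    """Sector facing the beam at time t: reflective for the first half of
    each revolution, transmissive for the second half (assumed layout)."""
    phase = (t * wheel_rps) % 1.0
    return "reflective" if phase < 0.5 else "transmissive"

def beam_at(t, switch_hz):
    """Beam emitted at time t when the light source alternates at switch_hz."""
    phase = (t * switch_hz) % 1.0
    return "first" if phase < 0.5 else "second"

wheel_rps = 60.0       # wheel revolutions per second (illustrative)
switch_hz = wheel_rps  # equal switching frequency keeps the two phase-locked

# Every first-beam instant coincides with the reflective sector, and every
# second-beam instant with the transmissive sector.
aligned = all(
    (beam_at(t, switch_hz) == "first") == (sector_at(t, wheel_rps) == "reflective")
    for t in (i / 6000.0 for i in range(600))
)
```

If the two frequencies differ, the beams drift out of phase with the sectors and the first beam eventually strikes the transmissive area, which is what the equal-frequency constraint prevents.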
  • the first light source component includes a first light source for emitting red light, a second light source for emitting blue light, a third light source for emitting green light, and a light combining element.
  • the light combining element is used to mix the red light, blue light and green light to obtain the first beam or the second beam.
  • the red light emitted by the first light source, the blue light emitted by the second light source, and the green light emitted by the third light source can be mixed to obtain the first light beam or the second light beam of any desired color, or can be mixed to obtain white light.
  • the light combining element includes a first dichroic mirror and a second dichroic mirror; the first dichroic mirror is used to reflect blue light from the second light source and transmit green light from the third light source; The second dichroic mirror is used to reflect the red light from the first light source, transmit the green light transmitted by the first dichroic mirror, and transmit the blue light reflected by the first dichroic mirror.
  • through the first dichroic mirror and the second dichroic mirror, the third light source for emitting green light can be placed at the position farthest from the light combining element, which makes it possible to balance the brightness of the red, green and blue light in the first image and the second image.
  • the application provides an optical emission module including a second light source component, a third light source component, a first reflective component and a second reflective component; the second light source component is used to emit a first light beam; the third light source component is used to emit a second light beam; the first reflective component is used to reflect the first light beam from the second light source component to the second reflective component.
  • the second reflective component is used to reflect the first light beam from the first reflective component to the first diffusion component, where it is scanned through the rotation of the second reflective component to form the first image, and to reflect the second light beam from the third light source component to the second diffusion component, where it is scanned through the rotation of the second reflective component to form a second image.
  • the first image corresponds to the first virtual image of the far focal plane
  • the second image corresponds to the second virtual image of the near focal plane.
  • the first virtual image and the second virtual image are virtual images formed on two different focal planes.
  • through a dual optical path design (i.e., the propagation light path of the first beam and the propagation light path of the second beam), the first image and the second image are formed in the optical emission module.
  • the first reflective component includes a reflective mirror.
  • using a reflective mirror as the first reflective component helps to simplify the structure of the optical emission module. Moreover, the optical path design is also relatively simple.
  • the second reflective component is a micro-electromechanical system (MEMS) galvanometer.
  • MEMS micro-electromechanical system
  • the first beam can be scanned on the first diffusion component to form a first image
  • the second beam can be scanned on the second diffusion component to form a second image
  • the first diffusion component is a first diffusion screen
  • the second diffusion component is a second diffusion screen
  • the second light source component is specifically configured to emit the first light beam according to the received fifth control signal, which is generated based on the information of the first image; and/or the third light source component is specifically configured to emit the second light beam according to the received sixth control signal, which is generated based on the information of the second image.
  • the second light source component can emit the first light beam of the first image
  • the third light source component can emit the second light beam of the second image
  • the second light source assembly includes a first light source for emitting red light, a second light source for emitting blue light, a third light source for emitting green light, and a light combining element.
  • the light combining element is used to mix the red light, blue light and green light to obtain the first beam.
  • the third light source component may be the same as the second light source component.
  • the red light emitted by the first light source, the blue light emitted by the second light source and the green light emitted by the third light source can be mixed to obtain the first light beam and the second light beam of any desired color, or can also be mixed to obtain white light.
  • the light combining element includes a first dichroic mirror and a second dichroic mirror; the first dichroic mirror is used to reflect blue light from the second light source and transmit green light from the third light source; The second dichroic mirror is used to reflect the red light from the first light source, transmit the green light transmitted by the first dichroic mirror, and transmit the blue light reflected by the first dichroic mirror.
  • through the first dichroic mirror and the second dichroic mirror, the third light source for emitting green light can be placed at the position farthest from the light combining element, which makes it possible to balance the brightness of the red, green and blue light in the first image and the second image.
  • the present application provides an optical display device.
  • the optical display device includes a light amplification module and the optical emission module of the first aspect or any one of the implementations of the first aspect; alternatively, the optical display device includes a light amplification module and the optical emission module of the above-mentioned second aspect or any one of the implementations of the second aspect.
  • the optical display device may include but is not limited to a head-up display (HUD) device, a projector, a vehicle display screen, an augmented reality (AR) device, a virtual reality (VR) device, etc.
  • HUD head-up display device
  • AR augmented reality
  • VR virtual reality
  • the light amplification module includes at least one curved reflector and/or at least one cylindrical mirror.
  • the present application provides a terminal device, which includes a windshield and the third aspect or any one of the optical display devices in the third aspect.
  • this application provides an image display method.
  • the image display method includes: controlling the first light source component of the optical emission module to emit the first light beam, controlling the reflective area of the transflective component of the optical emission module to align with the propagation light path of the first light beam, and controlling the second reflective component of the optical emission module to rotate, so that the first beam is scanned on the first diffusion component through the rotation of the second reflective component to form the first image; and controlling the first light source component to emit the second beam, controlling the transmissive area of the transflective component to align with the propagation light path of the second beam, and controlling the second reflective component to rotate, so that the second beam is scanned on the second diffusion component through the rotation of the second reflective component to form a second image.
  • the first light source component includes a first light source for emitting red light, a second light source for emitting blue light, and a third light source for emitting green light; the method further includes: obtaining the brightness of the first virtual image corresponding to the first image and the brightness of the second virtual image corresponding to the second image; determining the magnitude of the currents input to the first light source, the second light source and the third light source according to the brightness of the first virtual image; and determining the magnitude of the currents input to the first light source, the second light source and the third light source according to the brightness of the second virtual image.
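A minimal sketch of such a brightness-to-current mapping (Python), assuming for illustration that luminous output scales linearly with drive current; a real laser diode would need a calibrated L-I curve and per-channel white-balance factors, and all numbers here are hypothetical:

```python
def drive_currents_ma(target_nits, max_nits=12000.0,
                      max_currents_ma=(800.0, 600.0, 700.0)):
    """Scale the (R, G, B) drive currents, in milliamps, to reach a
    requested virtual-image brightness. Linear scaling and the current
    limits are illustrative assumptions only.
    """
    scale = max(0.0, min(1.0, target_nits / max_nits))
    return tuple(c * scale for c in max_currents_ma)

first_image_currents = drive_currents_ma(9000.0)   # brighter far-plane image
second_image_currents = drive_currents_ma(4500.0)  # dimmer near-plane image
```

Because the two virtual images are drawn in alternation, the currents can be re-derived per image, which is how the brightness of each virtual image can be adjusted independently.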
  • the brightness of the first virtual image and the second virtual image can be flexibly adjusted.
  • information of the first image may be acquired, a first control signal may be generated according to the information of the first image, and the first control signal may be sent to the first light source component, where the first control signal is used to control the first light source component to emit the first beam.
  • the information of the second image can be acquired, a second control signal generated according to the information of the second image, and the second control signal sent to the first light source component, where the second control signal is used to control the first light source component to emit the second beam.
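The per-frame signalling described above might be scheduled as in the following Python sketch; the signal names echo the first-aspect summary, but the ordering and encoding are assumptions for illustration:

```python
def control_stream(frames):
    """Yield interleaved control signals for alternating first/second images.

    Each element of `frames` is a (first_image_info, second_image_info)
    pair; the tuple encoding of a "signal" is hypothetical.
    """
    for first_info, second_info in frames:
        yield ("first_control_signal", first_info)    # light source: emit first beam
        yield ("third_control_signal", first_info)    # transflective: reflective area
        yield ("second_control_signal", second_info)  # light source: emit second beam
        yield ("fourth_control_signal", second_info)  # transflective: transmissive area

signals = list(control_stream([("navigation", "speed/fuel")]))
```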
  • the present application provides a control device, which is used to implement the above fifth aspect or any one of the methods in the fifth aspect, including corresponding functional modules, respectively used to implement the steps in the above methods.
  • Functions can be implemented by hardware, or by hardware executing corresponding software.
  • Hardware or software includes one or more modules corresponding to the above functions.
  • control device is, for example, a chip or a chip system or a logic circuit.
  • the beneficial effects can be found in the description of the fifth aspect above and will not be described again here.
  • the control device may include: a transceiver module and a processing module.
  • the processing module may be configured to support the control device to perform corresponding functions in the method of the fifth aspect, and the transceiver module is used to support interaction between the control device and functional components in the optical emission module and the like.
  • the transceiver module can be an independent receiving module, an independent transmitting module, a transceiver module with integrated transceiver function, etc.
  • the present application provides a control device, which is used to implement the fifth aspect or any one of the methods in the fifth aspect, including corresponding functional modules, respectively used to implement the steps in the above methods.
  • Functions can be implemented by hardware, or by hardware executing corresponding software.
  • Hardware or software includes one or more modules corresponding to the above functions.
  • control device is, for example, a chip or a chip system or a logic circuit.
  • the beneficial effects can be found in the description of the fifth aspect above and will not be described again here.
  • the control device may include: an interface circuit and a processor.
  • the processor may be configured to support the control device to perform corresponding functions in the method of the fifth aspect, and the interface circuit is used to support interaction between the control device and the optical emission module.
  • the control device may also include a memory, which may be coupled to the processor and which stores necessary program instructions and the like for the control device.
  • the present application provides a chip.
  • the chip includes at least one processor and an interface circuit. Further, optionally, the chip may also include a memory.
  • the processor is used to execute computer programs or instructions stored in the memory, so that the chip executes the method in the above fifth aspect or any possible implementation of the fifth aspect.
  • the present application provides a computer-readable storage medium.
  • Computer programs or instructions are stored in the computer-readable storage medium.
  • when the computer program or instructions are run on a control device, they cause the control device to execute the method in the above fifth aspect or any possible implementation of the fifth aspect.
  • the present application provides a computer program product.
  • the computer program product includes a computer program or instructions.
  • when the computer program or instructions are run on a control device, they cause the control device to execute the method in the fifth aspect or any possible implementation of the fifth aspect.
  • Figure 1a is a schematic diagram of a scene where an optical emission module provided by this application is applied to a HUD device;
  • Figure 1b is a schematic diagram of a scenario where an optical emission module provided by this application is applied to NED equipment;
  • Figure 1c is a schematic diagram of a scene where an optical emission module provided by this application is applied to a projector;
  • Figure 1d is a schematic diagram of a scene where an optical emission module provided by this application is applied to a vehicle display screen;
  • FIG. 2 is a schematic structural diagram of a HUD device in the prior art
  • FIG. 3 is a schematic structural diagram of an optical emission module provided by this application.
  • Figure 4a is a schematic structural diagram of a first light source component provided by this application.
  • Figure 4b is a schematic structural diagram of another first light source component provided by this application.
  • Figure 5a is a schematic structural diagram of a transflective component provided by this application.
  • Figure 5b is a schematic structural diagram of another transflective component provided by this application.
  • Figure 5c is a schematic structural diagram of another transflective component provided by this application.
  • Figure 5d is a schematic structural diagram of another transflective component provided by this application.
  • Figure 5e is a schematic structural diagram of another transflective component provided by this application.
  • Figure 6 is a schematic diagram of the scan lines formed by the first beam and the second beam on their corresponding diffusion components after being reflected by the second reflective component provided by the present application;
  • Figure 7a is a schematic structural diagram of a light-diffusing element provided by this application.
  • Figure 7b is a schematic structural diagram of another light-diffusing element provided by this application.
  • Figure 7c is a schematic structural diagram of another light-diffusing element provided by this application.
  • Figure 7d is a schematic structural diagram of another light-diffusing element provided by this application.
  • Figure 8 is a schematic structural diagram of another optical emission module provided by this application.
  • Figure 9 is a schematic structural diagram of another optical emission module provided by the present application.
  • Figure 10 is a schematic structural diagram of an optical display device provided by the present application.
  • Figure 11 is a schematic circuit diagram of an optical display device provided by the present application.
  • Figure 12a is a schematic diagram of a possible functional framework of a vehicle provided by this application.
  • Figure 12b is a simplified schematic diagram of a partial structure of a vehicle provided by this application.
  • Figure 13 is a schematic flow chart of an image display method provided by this application.
  • Figure 14 is a schematic structural diagram of a control device provided by this application.
  • Figure 15 is a schematic structural diagram of a control device provided by this application.
  • VID Virtual image distance
  • the virtual image distance refers to the distance between the center of the eye box and the center of the HUD virtual image. See Figure 1a below.
  • the eye box usually refers to the range within which the driver's eyes can see all virtual images.
  • the general eye box size is 130 mm × 50 mm. Due to differences in the height of different drivers, the eye box has a movement range of approximately ±50 mm in the vertical direction. It can also be understood that the driver can see a clear HUD virtual image within the eye box range; when moving left, right, up or down beyond the eye box range, the driver may see a distorted HUD virtual image or even no HUD virtual image.
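The 130 mm × 50 mm figure translates directly into a simple containment test; a trivial Python sketch of whether an eye position offset from the eye-box centre still sees the full virtual image (the geometry is idealized as a flat rectangle):

```python
EYE_BOX_WIDTH_MM, EYE_BOX_HEIGHT_MM = 130.0, 50.0

def in_eye_box(dx_mm, dy_mm):
    """True if an eye offset (dx, dy) from the eye-box centre lies inside
    the 130 x 50 mm eye box described above."""
    return (abs(dx_mm) <= EYE_BOX_WIDTH_MM / 2
            and abs(dy_mm) <= EYE_BOX_HEIGHT_MM / 2)

inside = in_eye_box(60.0, 20.0)   # within the box
outside = in_eye_box(70.0, 0.0)   # beyond the 65 mm half-width
```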
  • the optical emission module in this application can also be integrated into a head-up display (HUD) device, see Figure 1a.
  • HUD head-up display
  • Figure 1a takes the HUD installed in a vehicle as an example.
  • HUD can project the formed image (called HUD virtual image) into the driver's front field of view and fuse it with real road information, thereby enhancing the driver's perception of the actual driving environment.
  • HUD can superimpose the HUD virtual image carrying navigation information and/or instrument information (such as driving speed, driving mileage, rotation speed, temperature, fuel level, car light status, etc.) on the real environment outside the vehicle, providing the driver with an augmented reality visual experience.
  • it can be applied to augmented reality (AR) navigation, adaptive cruise, lane departure warning and other scenarios.
  • HUD includes but is not limited to augmented reality head up display (AR-HUD) devices, etc.
  • the optical emission module in this application can also be integrated into a near eye display (NED) device.
  • the NED device can be, for example, an AR device or a VR device.
  • the AR device may include but is not limited to AR glasses or AR helmets, and VR equipment may include but is not limited to VR glasses or VR helmets. Please refer to Figure 1b, taking AR glasses as an example. Users can wear AR glasses to play games, watch videos, participate in virtual meetings, or shop by video, etc.
  • the optical emission module in this application can be integrated into a projector. Please refer to Figure 1c.
  • the projector can project images onto a wall or projection screen. Based on the optical emission module of this application, dual projection screen display can be realized.
  • the optical emission module in this application can also be integrated into a vehicle-mounted display screen.
  • the vehicle-mounted display screen can be installed on the back of the seat of the vehicle or the passenger position, etc. This application does not limit the installation location of the vehicle display screen.
  • optical emission module provided by this application can also be applied in other possible scenarios, and is not limited to the scenarios illustrated above.
  • it can also be used in displays as backlight sources.
  • FIG 2 is a schematic structural diagram of a HUD device in the prior art.
  • HUD virtual images can be formed at two different locations.
  • the HUD device includes LBS1, LBS2, curved reflector 1 and curved reflector 2.
  • LBS1 and LBS2 need to be placed at two different designated positions, which are determined based on the virtual image distances of the two virtual images to be formed. Since the two LBS modules occupy a large space, the HUD device becomes bulky, which limits its application scenarios; especially when the HUD device is used in space-constrained scenarios such as vehicles, a miniaturized HUD device is required.
  • this application proposes an optical emission module.
  • the optical display device based on this optical emission module can generate virtual images with different focal planes without increasing the volume of the optical display device.
  • optical emission module proposed in this application will be described in detail below with reference to Figures 3 to 9.
  • the optical display device provided by the present application will be described in detail with reference to FIG. 10 and FIG. 11 .
  • the terminal device provided by this application will be described in detail with reference to Figure 12a and Figure 12b.
  • the image display method provided by this application will be described in detail with reference to Figure 13.
  • the optical emission module may include a first light source component, a transflective component, a first reflective component and a second reflective component.
  • the transflective component includes N reflective areas and M transmissive areas, where N and M are both positive integers. In one possible case, the transflective component includes a reflective area and a transmissive area. In another possible situation, N reflection areas and M transmission areas are cross-distributed.
  • the first light source component is used to emit a first light beam and a second light beam.
  • the reflective area of the transflective component is used to reflect the received first light beam to the first reflective component; the transmissive area of the transflective component is used to transmit the received second light beam to the second reflective component.
  • when the first light source component emits the first light beam, the reflection area of the transflective component is aligned with the propagation light path of the first beam; when the first light source component emits the second light beam, the transmission area of the transflective component is aligned with the propagation light path of the second beam.
  • the first reflective component is used to reflect the received first light beam to the second reflective component. The second reflective component is used to reflect the first light beam from the first reflective component to the first diffusion component and, through its rotation, scan that beam on the first diffusion component to form the first image; it is also used to reflect the second light beam from the transmission area of the transflective component to the second diffusion component and, through its rotation, scan that beam on the second diffusion component to form the second image.
  • the content displayed in the first image may be the same as the content displayed in the second image, or may be different.
  • the first image can display navigation information; the second image can display instrument information, such as driving speed, driving mileage, rotation speed, temperature, fuel level, and car light status.
  • the first image corresponds to the first virtual image of the far focal plane, and the second image corresponds to the second virtual image of the near focal plane.
  • the first virtual image and the second virtual image are virtual images formed on two different focal planes.
  • through the design of dual optical paths (i.e., the propagation optical path of the first beam and the propagation optical path of the second beam) sharing a single second reflective component, the first light beam forms the first image and the second light beam forms the second image, thereby helping to reduce the volume of the optical emission module and enabling dual-virtual-image (dual-screen) display.
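The shared-scanner routing described above can be sketched as a small routing table. All component names below are illustrative labels for this sketch, not identifiers from the application:

```python
# Illustrative sketch of the shared dual optical path: both beams end at the
# same second reflective component (the scanner), but reach it differently.
def beam_path(beam: str) -> list[str]:
    """Ordered components traversed by each beam (hypothetical names)."""
    if beam == "first":   # far-focus beam: reflected by the wheel, folded by a mirror
        return ["first_light_source", "reflective_area",
                "first_reflective_component", "second_reflective_component",
                "first_diffusion_component"]
    if beam == "second":  # near-focus beam: transmitted straight through the wheel
        return ["first_light_source", "transmissive_area",
                "second_reflective_component", "second_diffusion_component"]
    raise ValueError(f"unknown beam: {beam}")
```

The point of the sketch is that both routes contain the same scanner element, which is what saves the volume of a second LBS.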
  • the first light source component is used to emit the first light beam or the second light beam.
  • the first light beam and the second light beam may each be a monochromatic light beam, or may be combined from a plurality of color light beams; this is not limited in this application.
  • the first light source component may include a first light source (or R light source) for emitting red light (Red, R), a second light source (or B light source) for emitting blue light (Blue, B), and a third light source (or G light source) for emitting green light (Green, G).
  • the red light emitted by the first light source, the blue light emitted by the second light source, and the green light emitted by the third light source can be mixed to obtain a first light beam or a second light beam of different colors, or can be mixed to obtain white light.
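The mixing described above can be sketched numerically. The Rec.709 luma weights used below are standard colorimetry values, not values from this application; they illustrate that equal source intensities give white and that green contributes most to brightness:

```python
# Hedged sketch: additive mixing of the three sources and the resulting
# relative luminance. Rec.709 weights are a standard assumption here.
REC709 = (0.2126, 0.7152, 0.0722)

def mix_rgb(r: float, g: float, b: float):
    """Return (mixed_color, relative_luminance) for normalized intensities."""
    for v in (r, g, b):
        if not 0.0 <= v <= 1.0:
            raise ValueError("intensities must lie in [0, 1]")
    luminance = REC709[0] * r + REC709[1] * g + REC709[2] * b
    return (r, g, b), luminance
```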
  • the first light source, the second light source, and the third light source may each be, for example, a laser diode (LD), a light emitting diode (LED), an organic light emitting diode (OLED), or a micro light emitting diode (micro-LED), etc.
  • FIG. 4a is a schematic structural diagram of a first light source component provided by the present application.
  • the first light source assembly includes a first light source, a second light source and a third light source.
  • the three light sources are arranged in a row.
  • the three colors (RGB) of light emitted by the three light sources can be mixed to form a first light beam or a second light beam.
  • each light source also corresponds to a collimating mirror (such as a collimating lens or a curved mirror, etc.): the first light source, the second light source, and the third light source each have their own collimating mirror.
  • a dichroic mirror is not needed, so the structure of the first light source component can be simplified, and the volume of the optical emission module can be further reduced.
  • the first light source assembly includes a first light source, a second light source and a third light source. Further, optionally, the first light source assembly may also include a light combining element, where the light combining element includes a first dichroic mirror and a second dichroic mirror; the three colors (RGB) of light are emitted by the three light sources.
  • the first dichroic mirror is used to reflect the blue light from the second light source and transmit the green light from the third light source to the second dichroic mirror;
  • the second dichroic mirror is used to reflect red light from the first light source, transmit the green light transmitted by the first dichroic mirror, and transmit the blue light reflected by the first dichroic mirror. It can also be understood that, after passing through the second dichroic mirror, the red light from the first light source, the green light from the third light source, and the blue light from the second light source are mixed to form the first light beam or the second light beam. Since green light has the greatest impact on the brightness of the image, the third light source for emitting green light can be placed at the position farthest from the output of the light combining element, its green light passing through both the first dichroic mirror and the second dichroic mirror.
  • each light source can also correspond to a collimating mirror.
  • the positions of the first light source, the second light source and the third light source in the first light source assembly given above can also be interchanged.
  • for example, a third dichroic mirror can be used to replace the second dichroic mirror, where the third dichroic mirror reflects blue light and transmits red and green light. Other combinations are not enumerated here.
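The two combiner variants described above can be written as reflect/transmit truth tables, to check that all three colors reach the combined output in either variant. The per-mirror behaviors are taken from the text; the modeling itself is purely illustrative:

```python
# Sketch of the two light-combining variants as reflect/transmit tables.
COMBINER_V1 = {  # first dichroic + second dichroic
    "mirror1": {"blue": "reflect", "green": "transmit"},
    "mirror2": {"red": "reflect", "green": "transmit", "blue": "transmit"},
}
COMBINER_V2 = {  # third dichroic replacing the second: reflects blue instead
    "mirror1": {"blue": "reflect", "green": "transmit"},
    "mirror2": {"blue": "reflect", "red": "transmit", "green": "transmit"},
}

def combined_colors(combiner: dict) -> set:
    """A color reaches the output if the final mirror either reflects it
    into the output path or transmits it through to the output path."""
    return set(combiner["mirror2"])
```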
  • the first light beam emitted by the first light source component carries the information of the first image (for example, the content information of the first image), and the second light beam carries the information of the second image (for example, the content information of the second image).
  • the first light source component can project the first image pixel by pixel and project the second image pixel by pixel, for example under the control of the control component.
  • for details of the control component, please refer to the introduction of the control component below, which will not be described again here.
  • the transflective component includes N reflective areas and M transmissive areas, where N and M are both positive integers.
  • a transflective component includes a reflective area and a transmissive area.
  • a transflective component includes multiple reflective areas and multiple transmissive areas. Specifically, the reflective area of the transflective component is used to reflect the received first light beam to the first reflective component, and the transmissive area of the transflective component is used to transmit the received second light beam to the second reflective component.
  • Fig. 5a exemplarily shows a schematic structural diagram of a transflective component provided by the present application.
  • the transflective component includes two reflective areas and two transmissive areas, and these two reflective areas and two transmissive areas are cross-distributed.
  • the transflective component also includes a fixed area.
  • the fixed area can be connected to the driving element, and the driving element can be integrated into the optical emission module, or can be independent of the optical emission module.
  • the driving element can drive the fixed area to drive the transflective component to rotate based on the received control signal.
  • the control signal please refer to the following introduction of the control component, which will not be described again here.
  • the driving element can drive the fixed area to rotate and switch around the y-axis with C as the center based on the received control signal.
  • the driving element may be, for example, a driving motor or a servo motor. It can be understood that the driving element can drive the transflective assembly to rotate in the clockwise direction, or it can also rotate in the counterclockwise direction, which is not limited in this application.
  • the driving element can drive the fixed area to rotate around the y-axis with C as the center so that the reflection area aligns with the propagation light path of the first light beam from the first light source component; in other words, the reflective area of the transflective component is aligned with the propagation optical path of the first light beam, and the first light beam is directed to the reflective area of the transflective component.
  • the driving element can drive the fixed area to rotate around the y-axis with C as the center so that the transmission area aligns with the propagation light path of the second light beam from the first light source component; in other words, the transmission area of the transflective component is aligned with the propagation optical path of the second light beam, and the second light beam is directed to the transmission area of the transflective component.
  • the sum of the areas of the N reflective regions included in the transflective component may be greater than, less than, or equal to the sum of the areas of the M transmissive regions. Specifically, the allocation can be made based on the brightness requirements of the first virtual image (also called the far-focus screen) and the second virtual image (also called the near-focus screen) in actual applications.
  • for example, when the total reflective area is larger, the intensity of the reflected first light beam can be increased, which helps to increase the brightness of the first image and thereby the brightness of the first virtual image corresponding to the first image.
  • the transflective component can also include more than 2 reflective areas and more than 2 transmissive areas. Please refer to Figure 5c, which takes a transflective component including 4 reflective areas and 4 transmissive areas as an example; the 4 reflection areas and 4 transmission areas are distributed crosswise. Alternatively, the transflective component can include 1 reflective area and 1 transmissive area, see Figure 5d. It should be noted that this application does not limit the number of reflective areas or transmissive areas included in the transflective component; Figures 5a, 5b, 5c and 5d are only examples. In addition, the numbers of reflective areas and transmissive areas included in the transflective component can be the same or different, see Figure 5e.
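For a wheel with alternating, equally sized sectors (an assumption of this sketch; the application does not fix the sector widths), which kind of area faces the beam is a simple function of the rotation angle:

```python
# Hypothetical model: a transflective wheel with n_pairs reflective and
# n_pairs transmissive sectors, alternating, each 360/(2*n_pairs) degrees.
def region_at(angle_deg: float, n_pairs: int = 4) -> str:
    """Return which sector type faces the beam at a given wheel angle."""
    sector_width = 360.0 / (2 * n_pairs)
    sector = int((angle_deg % 360.0) // sector_width)
    return "reflective" if sector % 2 == 0 else "transmissive"
```

With `n_pairs=4` (the Figure 5c example), sectors are 45 degrees wide, so the area type alternates every 45 degrees of rotation.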
  • the reflective area of the transflective component may be formed by coating a reflective film on glass, and the transmissive area of the transflective component may be formed by coating an anti-reflective film on transparent glass.
  • the reflective film can reflect light beams in the visible light band range.
  • the transflective component may be a color wheel.
  • the optical emission module may include a first reflective component and a second reflective component.
  • the first reflective component may include at least one reflector (such as a plane reflector), or the first reflective component may include a reflective prism (such as a total reflection prism, etc.), and the second reflective component may be, for example, a MEMS galvanometer.
  • when the first reflecting component includes multiple reflecting mirrors, the first light beam can be reflected between the multiple reflecting mirrors, which can further increase the optical path length of the first light beam.
  • the second reflective component reflects the first light beam from the first reflective component to the first diffusion component, and scans on the first diffusion component by rotation to form the first image.
  • the second reflective component reflects the second light beam from the transmission area of the transflective component to the second diffusion component, and scans it on the second diffusion component through rotation to form the second image. It can also be understood that, after the first light beam from the first reflective component and the second light beam from the transmission area of the transflective component are reflected by the same second reflective component, the two beams are projected at a specific fixed angle θ onto their respective diffusion components for back-and-forth linear scanning, producing two real images (i.e., the first image and the second image).
  • the second reflective component takes a MEMS galvanometer as an example.
  • the first beam, after being rotated and reflected by the MEMS galvanometer, scans back and forth linearly on the first diffusion component (see (a) in Figure 6), producing the first image; the second beam, after being rotated and reflected by the MEMS galvanometer, scans back and forth linearly on the second diffusion component (see (b) in Figure 6), producing the second image.
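The back-and-forth linear scan can be modeled as a triangle wave: the scan angle sweeps at constant speed in one direction, then back. This is a toy model with illustrative amplitude and frequency, not parameters from the application:

```python
# Toy model of the back-and-forth linear scan of the MEMS galvanometer:
# a triangle wave gives a constant-speed sweep in each direction.
def scan_angle(t: float, amplitude_deg: float = 10.0, freq_hz: float = 60.0) -> float:
    """Mechanical scan angle in degrees at time t (seconds)."""
    phase = (t * freq_hz) % 1.0                # position within one scan period
    triangle = 4.0 * abs(phase - 0.5) - 1.0    # -1 .. +1, linear segments
    return amplitude_deg * triangle
```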
  • the angle θ between the first light beam and the second light beam after being reflected by the second reflective component is related to information such as the required size and position of the first virtual image and the second virtual image; specifically, it can be determined according to the sizes and positions of the first virtual image and the second virtual image, which can be set in advance.
  • the first diffusion component and the second diffusion component may be, for example, a diffusion screen or called a diffusion plate (Diffuser) or a diffusion sheet.
  • the first diffusion component may be the same as the second diffusion component.
  • the first diffusion component and the second diffusion component can be two parts of one diffusion screen, or they can also be two diffusion screens.
  • the diffusion screen may be a screen whose surface includes a microlens array (MLA) (or a compound eye lens).
  • the microlens array can be imprinted on one side of the substrate to form a diffusion screen (see Figure 7a), or the microlens array can be imprinted on both sides of the substrate to form a diffusion screen (see Figure 7b), or The microlens array and the substrate may be integrated on one side to form a diffusion screen (see Figure 7c), or the microlens array and the substrate may be integrated on both sides to form a diffusion screen (see Figure 7d).
  • the diffusion screen can also be formed in other possible ways. For example, it is formed by glass etching or glass cold processing.
  • the microlens array can control the divergence angle of the first beam or the second beam, making the first beam and the second beam more uniform and their angles controllable, thereby improving the uniformity and clarity of the formed first image and second image; this in turn gives the first virtual image and the second virtual image good uniformity, high definition and high brightness.
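How a microlens array sets the divergence angle can be estimated with paraxial geometry: each lenslet of pitch p and focal length f spreads light over roughly 2·atan(p / 2f). This formula and the numbers in the test are a standard paraxial estimate, not values from the application:

```python
import math

# Paraxial estimate of the full divergence angle of an MLA diffuser.
def mla_divergence_deg(pitch_um: float, focal_um: float) -> float:
    """Full divergence angle (degrees) for lenslet pitch and focal length."""
    return math.degrees(2.0 * math.atan(pitch_um / (2.0 * focal_um)))
```

Shorter lenslet focal lengths (stronger lenslets) give wider, more uniform diffusion, which is the control knob the passage alludes to.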
  • the number of lenses included in the microlens array shown in Figs. 7a to 7d is only an example; the microlens array may include more or fewer lenses than shown in Figs. 7a to 7d, and this application does not limit it. It should be understood that the more microlenses (or sub-eyes) the microlens array includes, the better the light uniformity effect will be.
  • the microlens array may be one row or multiple rows, and this application is not limited thereto.
  • the distance between the center of the first diffusion component and the center of the second diffusion component is a (see Figure 8).
  • the size of the distance a can be specifically determined based on the actual required information such as the position and size of the first virtual image and the second virtual image.
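The link between the diffuser-center spacing a and the beam angle θ can be sketched with small-angle geometry: two beams leaving the scanner with angle θ between them separate by a = 2·d·tan(θ/2) after travelling a distance d. This model (and the assumption that both diffusers sit at the same distance d from the scanner) is illustrative, not stated in the application:

```python
import math

# Assumed geometry: both diffusion components lie d_mm from the scanner,
# so the beam angle needed for a center spacing of a_mm is
# theta = 2*atan(a / (2*d)).
def beam_angle_deg(a_mm: float, d_mm: float) -> float:
    """Angle (degrees) between the two beams for a given center spacing."""
    return math.degrees(2.0 * math.atan(a_mm / (2.0 * d_mm)))
```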
  • the optical emission module may also include a control component, which is introduced below.
  • the control component is used to control the first light source component and the transflective component.
  • the control component controls the first light source component to emit the first beam (that is, emit the far-focus pixel point), and synchronously controls the reflection area of the transflective component to align with the propagation light path of the first beam.
  • the control component controls the first light source component to emit the second light beam (that is, emit the near-focus pixel point), and synchronously controls the transmission area of the transflective component to align with the propagation optical path of the second light beam.
  • the control component may send a first control signal to the first light source component and send a third control signal to the driving element.
  • the first light source component emits the first light beam based on the received first control signal.
  • the driving element drives the rotation of the transflective component to align the propagation light path of the first light beam with the reflection area based on the received third control signal.
  • the control component transmits a second control signal to the first light source component and a fourth control signal to the driving element.
  • the first light source component emits the second light beam based on the received second control signal.
  • the driving element drives the transflective component to rotate to the transmission area to align with the propagation optical path of the second light beam based on the received fourth control signal.
  • the switching frequency of the reflective area and the transmissive area of the transflective component is equal to the switching frequency of the first light source component emitting the first light beam and the second light beam.
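The synchronization requirement above (light source and wheel switch at the same frequency, so each beam always meets its matching area) can be sketched as an interleaved schedule; the slot layout is illustrative:

```python
# Sketch of the synchronized time-multiplexing: even slots draw the
# far-focus (first) image, odd slots the near-focus (second) image.
def control_schedule(n_slots: int):
    """Alternate (beam, wheel area) pairs, one pair per time slot."""
    pairs = []
    for slot in range(n_slots):
        if slot % 2 == 0:
            pairs.append(("first_beam", "reflective_area"))
        else:
            pairs.append(("second_beam", "transmissive_area"))
    return pairs

def is_synchronized(pairs) -> bool:
    """Every first_beam slot must face the reflective area, and vice versa."""
    match = {"first_beam": "reflective_area", "second_beam": "transmissive_area"}
    return all(match[beam] == area for beam, area in pairs)
```

If the two switching frequencies drifted apart, some slots would pair a beam with the wrong area and `is_synchronized` would fail, which is exactly the fault the equal-frequency condition prevents.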
  • details of the control component can be found in the introduction of the processor in Figure 11 below, and will not be described again here.
  • the optical emission module may include a first light source component, a color wheel, a reflector and a MEMS galvanometer.
  • the color wheel shown in FIG. 5a is taken as an example
  • the first light source component is taken as the first light source component shown in FIG. 4b.
  • the first light beam emitted by the first light source component is reflected to the reflector through the reflection area of the color wheel, and the first light beam is reflected to the MEMS galvanometer through the reflector; the second light beam emitted by the first light source component is transmitted to the MEMS galvanometer through the transmission area of the color wheel.
  • the angle between the first beam and the second beam directed to the MEMS galvanometer is θ. Based on the law of reflection, the angle between the first beam and the second beam reflected from the MEMS galvanometer is also θ. Through the rotation of the MEMS galvanometer, the first light beam can form a first image on the first diffusion screen, and the second light beam can form a second image on the second diffusion screen.
  • FIG. 9 is a schematic structural diagram of another optical emission module provided by the present application.
  • the optical emission module includes a second light source component, a third light source component, a first reflective component and a second reflective component.
  • the second light source component is used to emit the first light beam;
  • the third light source component is used to emit the second light beam;
  • the first reflective component is used to reflect the first light beam from the second light source component to the second reflective component;
  • the second reflective component is used to reflect the first light beam from the first reflective component to the first diffusion component and, through its rotation, scan on the first diffusion component to form the first image; it is also used to reflect the second light beam from the third light source component to the second diffusion component and, through its rotation, scan on the second diffusion component to form the second image.
  • through a dual optical path design (i.e., the propagation light path of the first beam and the propagation light path of the second beam), the first image and the second image can be formed in a smaller optical emission module.
  • the second light source component and the third light source component in this example may be the same as the above-mentioned first light source component.
  • the second light source component can emit the first light beam according to a received fifth control signal, the fifth control signal being generated based on the information of the first image; the third light source component can emit the second light beam according to a received sixth control signal, the sixth control signal being generated based on the information of the second image.
  • the fifth control signal may be sent by the above-mentioned control component to the second light source component, and the fifth control signal may be the same as the above-mentioned first control signal.
  • the sixth control signal may also be sent by the above-mentioned control component to the third light source component.
  • the sixth control signal may be the same as the above-mentioned second control signal.
  • the optical display device may include the optical emission module in any of the above embodiments. Further, the optical display device may also include a light amplification module for enlarging the first image and the second image from the optical emission module.
  • the light amplification module includes at least one curved reflector and/or at least one cylindrical mirror.
  • a light amplification module including two curved reflectors is taken as an example. After the first image is amplified by the light amplification module, it can form a first virtual image at a first position after being reflected by the windshield; after the second image is amplified by the light amplification module, it can form a second virtual image at a second position after being reflected by the windshield. The first position and the second position are two different positions; in other words, two virtual images can be formed at two different positions through the windshield.
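Why a curved mirror places a magnified virtual image behind the windshield can be seen from the thin-mirror equation 1/f = 1/do + 1/di: when the real image (acting as the object) sits inside the focal length of a concave mirror, di comes out negative, meaning a virtual image. The sign convention and the numbers below are illustrative, not parameters from the application:

```python
# Thin-mirror sketch: negative di means a virtual image behind the mirror.
# Moving the object (the real image on the diffuser) changes the virtual
# image distance, which is how two diffuser positions yield two focal planes.
def image_distance(f_mm: float, do_mm: float) -> float:
    """Solve 1/f = 1/do + 1/di for di (negative => virtual image)."""
    return 1.0 / (1.0 / f_mm - 1.0 / do_mm)

def magnification(f_mm: float, do_mm: float) -> float:
    """Lateral magnification m = -di/do."""
    return -image_distance(f_mm, do_mm) / do_mm
```

With f = 100 mm, an object at 50 mm gives a virtual image 100 mm behind the mirror at 2x magnification, while an object at 80 mm gives one 400 mm behind: two different object planes, two different virtual image distances.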
  • the optical display device may further include a first diffusion component and a second diffusion component, as shown in FIG. 10 , and for details, please refer to the above related introduction, which will not be described again here.
  • the optical display device may also include other possible structures, which are not limited in this application.
  • the optical display device may further include an optical lens, and the optical lens may include at least one lens, and the lens may be a spherical lens or an aspheric lens.
  • the combination of multiple spherical lenses and/or aspheric lenses can help improve the imaging quality of the optical lens and reduce the aberration of the optical lens.
  • the first light beam forming the first image can be shaped and/or homogenized through an optical lens, thereby helping to improve the quality of the first image formed based on the first light beam.
  • the second light beam forming the second image can be shaped and/or uniformized through the optical lens, thereby helping to improve the quality of the second image formed based on the second light beam.
  • FIG. 11 is a schematic circuit diagram of an optical display device provided by the present application.
  • the circuit in the optical display device mainly includes one or more of a processor 1101, an external memory interface 1102, an internal memory 1103, an audio module 1104, a video module 1105, a power module 1106, a wireless communication module 1107, an I/O interface 1108, a video interface 1109, a display circuit 1110, a modulator 1111, and the like.
  • the processor 1101 and its peripheral components, such as the external memory interface 1102, internal memory 1103, audio module 1104, video module 1105, power module 1106, wireless communication module 1107, I/O interface 1108, video interface 1109 and display circuit 1110, can be connected via a bus.
  • Processor 1101 may be called a front-end processor.
  • the circuit diagram schematically illustrated in the embodiment of the present application does not constitute a specific limitation on the optical display device.
  • the optical display device may include more or less components than shown in the figures, or some components may be combined, or some components may be separated, or may be arranged differently.
  • the components illustrated may be implemented in hardware, software, or a combination of software and hardware.
  • the processor 1101 may be a circuit with signal (or data) processing capabilities.
  • in one implementation, the processor may be a circuit with instruction reading and execution capabilities, such as a central processing unit (CPU), a microprocessor, a graphics processing unit (GPU, which can be understood as a microprocessor), or a digital signal processor (DSP), etc.
  • in another implementation, the processor can realize certain functions through the logical relationship of a hardware circuit, where the logical relationship of the hardware circuit is fixed or reconfigurable.
  • for example, the processor is an application-specific integrated circuit (ASIC) or a programmable logic device (PLD).
  • the process of the processor loading the configuration file and realizing the hardware circuit configuration can be understood as the process of the processor loading instructions to realize the functions of some or all of the above units.
  • the processor can also be a hardware circuit designed for artificial intelligence, which can be understood as an ASIC, such as a neural network processing unit (NPU), a tensor processing unit (TPU), or a deep learning processing unit (DPU), etc.
  • the processor may also be an application processor (AP), an image signal processor (ISP), or other programmable logic devices, transistor logic devices, hardware components, or any combination thereof.
  • the processor 1101 may also be provided with a memory for storing instructions and data.
  • the memory in the processor 1101 is a cache memory. This memory may hold instructions or data that the processor 1101 has recently used or cycled. If the processor 1101 needs to use the instruction or data again, it can be called directly from this memory, which avoids repeated access and reduces the waiting time of the processor 1101, thereby improving the efficiency of the optical display device. The processor 1101 can execute the stored instructions.
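The caching behavior described here can be illustrated with a toy software LRU cache; the actual cache in the processor is a hardware structure, so this sketch is purely illustrative:

```python
from collections import OrderedDict

# Toy LRU cache: recently used items are served from the cache, avoiding
# a repeated fetch from slower memory -- the behavior the text describes.
class TinyCache:
    def __init__(self, capacity: int = 4):
        self.capacity = capacity
        self.store = OrderedDict()
        self.hits = 0
        self.misses = 0

    def get(self, key, fetch):
        """Return cached value, or call fetch() on a miss and remember it."""
        if key in self.store:
            self.store.move_to_end(key)      # mark as recently used
            self.hits += 1
            return self.store[key]
        self.misses += 1
        value = fetch()
        self.store[key] = value
        if len(self.store) > self.capacity:
            self.store.popitem(last=False)   # evict least recently used
        return value
```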
  • the optical display device may also include a plurality of input/output (I/O) interfaces 1108 connected to the processor 1101 .
  • the interface 1108 may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a universal serial bus (USB) interface, etc.
  • the above-mentioned I/O interface 1108 can be connected to devices such as a mouse, touch pad, keyboard, camera, speaker, microphone, etc., or can be connected to physical buttons on the optical display device (such as volume keys, brightness adjustment keys, power on/off keys, etc.).
  • the external memory interface 1102 can be used to connect an external memory card, such as a Micro SD card, to expand the storage capacity of the optical display device.
  • the external memory card communicates with the processor 1101 through the external memory interface 1102 to implement the data storage function.
  • Internal memory 1103 may be used to store computer executable program code, which includes instructions.
  • the internal memory 1103 may include a program storage area and a data storage area. The program storage area can store the operating system, at least one application program required for a function, etc.; the data storage area can store data created during use of the optical display device, etc.
  • the internal memory 1103 may include random access memory (RAM), flash memory, read-only memory (ROM), programmable read-only memory (PROM), erasable programmable read-only memory (EPROM), electrically erasable programmable read-only memory (EEPROM), registers, a hard disk, a removable hard disk, a CD-ROM, or any other form of storage medium well known in the art.
  • An exemplary storage medium is coupled to the processor such that the processor can read information from the storage medium and write information to the storage medium.
  • the storage medium may also be an integral part of the processor.
  • the processor 1101 executes various functional applications and data processing of the optical display device by executing instructions stored in the internal memory 1103 and/or instructions stored in a memory provided in the processor 1101.
  • the optical display device can implement audio functions, such as music playback and phone calls, through the audio module 1104 and an application processor.
  • the audio module 1104 is used to convert digital audio information into analog audio signal output, and is also used to convert analog audio input into digital audio signals.
  • the audio module 1104 can also be used to encode and decode audio signals, such as playing or recording.
  • the audio module 1104 may be provided in the processor 1101, or some functional modules of the audio module 1104 may be provided in the processor 1101.
  • the video interface 1109 can receive external audio and video signals, and may specifically be a high definition multimedia interface (HDMI), a digital visual interface (DVI), a video graphics array (VGA), a display port (DP), etc.; the video interface 1109 can also output video.
  • the video interface 1109 can receive speed signals and power signals input from peripheral devices, and can also receive AR video signals input from the outside.
  • the video interface 1109 can receive video signals input from an external computer or terminal device.
  • the video module 1105 can decode the video input by the video interface 1109, for example, perform H.264 decoding.
  • the video module can also encode the video collected by the optical display device, such as H.264 encoding of the video collected by an external camera.
  • the processor 1101 can also decode the video input from the video interface 1109, and then output the decoded image signal to the display circuit 1110.
  • the display circuit 1110 and the modulator 1111 are used to display corresponding images.
  • the video interface 1109 receives an externally input video source signal.
  • the video module 1105 decodes and/or digitizes the signal and outputs one or more image signals to the display circuit 1110.
  • the display circuit 1110 drives the modulator 1111 according to the input image signal; the modulator 1111 images the incident polarized light and then outputs at least two channels of image light.
  • the processor 1101 can also output one or more image signals to the display circuit 1110 .
  • the display circuit 1110 may also be called a driving circuit.
  • the power module 1106 is used to provide power to the processor 1101 and the light source 1112 based on input power (eg, direct current).
  • the power module 1106 may include a rechargeable battery, and the rechargeable battery may provide power to the processor 1101 and the light source 1112.
  • the light emitted by the light source 1112 can be transmitted to a modulator (or image source) 1111 for imaging, thereby forming image light.
  • the light source 1112 may be the optical emission module in any of the above embodiments.
  • the wireless communication module 1107 can enable the optical display device to communicate wirelessly with the outside world, and can provide wireless communication solutions such as wireless local area network (WLAN) (e.g., a wireless fidelity (Wi-Fi) network), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), and infrared (IR).
  • the wireless communication module 1107 may be one or more devices integrating at least one communication processing module.
  • the wireless communication module 1107 receives electromagnetic waves through the antenna, frequency modulates and filters the electromagnetic wave signals, and sends the processed signals to the processor 1101 .
  • the wireless communication module 1107 can also receive the signal to be sent from the processor 1101, frequency modulate and amplify it, and convert it into electromagnetic waves for radiation through the antenna.
  • the video data decoded by the video module 1105 can also be received wirelessly through the wireless communication module 1107 or read from an external memory.
  • the optical display device can receive video data from a terminal device or an in-vehicle entertainment system through the wireless LAN in the car; the optical display device can also read the audio and video data stored in the external memory.
  • the optical display device may include but is not limited to HUD, projector, display, vehicle display screen, AR device, or VR device, etc.
  • the AR device may include but is not limited to AR glasses or AR helmet, etc.
  • the VR device may include but is not limited to VR glasses or VR helmets, etc.
  • FIG. 12a is a schematic diagram of a possible functional framework of a vehicle provided by this application.
  • Components coupled to or included in vehicle 1200 may include sensor system 1201 , peripherals 1203 , power supply 1204 , computer system 1205 , user interface 1206 , and optical display device 1207 .
  • the components of the vehicle 1200 may be configured to operate in an interconnected manner with each other and/or with other components coupled to various systems.
  • power supply 1204 may provide power to all components of vehicle 1200.
  • Computer system 1205 may be configured to receive data from sensor system 1201 and peripheral devices 1203 and control them.
  • Computer system 1205 may also be configured to generate a display of the image on user interface 1206 and receive input from user interface 1206 .
  • Sensor system 1201 may include a number of sensors for sensing information about the environment in which vehicle 1200 is located, and the like.
  • the sensors of the sensor system 1201 may include, but are not limited to, a global positioning system (GPS), an inertial measurement unit (IMU), millimeter wave radar, lidar, cameras, and actuators for modifying the position and/or orientation of the sensors.
  • Millimeter wave radar can utilize radio signals to sense targets within the surrounding environment of vehicle 1200 .
  • millimeter wave radar may be used to sense the speed and/or heading of the target.
  • LiDAR can utilize laser light to sense targets in the environment in which vehicle 1200 is located.
  • a lidar may include one or more laser sources, scanners, and one or more detectors, among other system components.
  • the camera may be used to capture multiple images of the surrounding environment of the vehicle 1200 .
  • the camera can be a still camera or a video camera.
  • Sensor system 1201 may also include sensors that monitor internal systems of vehicle 1200 (eg, in-vehicle air quality monitors, fuel gauges, oil temperature gauges, etc.). Sensor data from one or more of these sensors can be used to detect objects and their corresponding properties (position, shape, orientation, speed, etc.). This detection and identification is a critical function for the safe operation of the vehicle 1200 . Sensor system 1201 may also include other sensors. This application does not specifically limit this.
  • Peripheral devices 1203 may be configured to allow vehicle 1200 to interact with external sensors, other vehicles, and/or users.
  • peripherals 1203 may include, for example, a wireless communication system, a touch screen, a microphone, and/or a speaker.
  • Peripheral device 1203 may additionally or alternatively include other components than those shown in Figure 12a. This application does not specifically limit this.
  • peripheral device 1203 provides a means for a user of vehicle 1200 to interact with user interface 1206 .
  • a touch screen may provide information to a user of vehicle 1200 .
  • User interface 1206 may also operate a touch screen to receive user input.
  • peripheral device 1203 may provide a means for vehicle 1200 to communicate with other devices located within the vehicle.
  • the microphone may receive audio (eg, voice commands or other audio input) from a user of vehicle 1200 .
  • speakers may output audio to a user of vehicle 1200 .
  • a wireless communication system may wirelessly communicate with one or more devices directly or via a communication network.
  • the wireless communication system may use 3G cellular communications, such as code division multiple access (CDMA), EVDO, or global system for mobile communications (GSM)/general packet radio service (GPRS); 4G cellular communications, such as long term evolution (LTE); or 5G cellular communications.
  • the wireless communication system may communicate with a wireless local area network (WLAN) using WiFi.
  • the wireless communication system may utilize infrared links, Bluetooth, or ZigBee to communicate directly with the device.
  • Other wireless protocols, such as various vehicle communication systems, may also be used.
  • wireless communication system 144 may include one or more dedicated short range communications (DSRC) devices, which may support public and/or private data communication between vehicles and/or roadside stations.
  • Power supply 1204 may be configured to provide power to some or all components of vehicle 1200 .
  • power source 1204 may include, for example, a rechargeable lithium-ion or lead-acid battery.
  • one or more battery packs may be configured to provide power.
  • Other power supply materials and configurations are also possible.
  • the power supply 1204 and energy source may be implemented together, as in some all-electric vehicles.
  • the components of the vehicle 1200 may be configured to operate in an interconnected manner with other components within and/or external to their respective systems. To this end, the components and systems of vehicle 1200 may be communicatively linked together through a system bus, network, and/or other connection mechanisms.
  • Computer system 1205 may include at least one processor 12051 that executes instructions stored in a non-transitory computer-readable medium such as memory 12052.
  • Computer system 1205 may also be a plurality of computing devices that control individual components or subsystems of vehicle 1200 in a distributed manner.
  • Although FIG. 12a functionally illustrates the processor, memory, and other elements of computer system 1205 in the same block, one of ordinary skill in the art will understand that the processor, computer, or memory may actually comprise multiple processors, computers, or memories that may or may not be stored within the same physical housing.
  • the memory may be a hard drive or other storage medium located in a housing different from computer system 1205.
  • a reference to a processor or computer will be understood to include a reference to a collection of processors or computers or memories that may or may not operate in parallel.
  • some components, such as the steering component and the deceleration component, may each have their own processor that performs only calculations related to component-specific functionality.
  • the processor may be located remotely from the vehicle and in wireless communication with the vehicle. In other aspects, some of the processes described herein are performed on a processor disposed within the vehicle and others are performed by a remote processor, including taking the necessary steps to perform a single operation.
  • memory 12052 may contain instructions (eg, program logic) that may be executed by processor 12051 to perform various functions of vehicle 1200 , including those described above.
  • the memory may also contain additional instructions, including instructions to send data to, receive data from, interact with, and/or control one or more of the sensor system 1201 and peripheral device 1203 .
  • the memory may store data such as road maps, route information, data detected by sensors, vehicle position, direction, speed and other such vehicle data, as well as other information. This information may be used by vehicle 1200 and computer system 1205 during operation of vehicle 1200 in autonomous, semi-autonomous and/or manual modes.
  • User interface 1206 is used to provide information to or receive information from a user of vehicle 1200 .
  • user interface 1206 may include one or more input/output devices within a set of peripheral devices 1203, such as a wireless communication system, a touch screen, a microphone, and a speaker.
  • the functions of some elements in the optical display device can also be implemented by other subsystems of the vehicle.
  • the control component can also be an element in the control system 1202.
  • the functions of the control components may be implemented by elements in the control system 1202 .
  • one or more of these components described above may be installed separately from or associated with vehicle 1200 .
  • the memory may exist partially or completely separate from the vehicle 1200 .
  • the components described above may be communicatively coupled together in wired and/or wireless manners.
  • vehicle functional framework shown in Figure 12a is just an example.
  • vehicle 1200 may include more, less, or different systems, and each system may include more, less, or different components.
  • systems and components shown can be combined or divided in any way, which is not specifically limited in this application.
  • the above-mentioned vehicle 1200 can be a car, a truck, a bus, a boat, an airplane, a helicopter, a recreational vehicle, construction equipment, a tram, a train, etc., which is not limited in this application.
  • FIG. 12b is a simplified schematic diagram of a partial structure of a vehicle provided by this application.
  • the vehicle may include a HUD device and a windshield.
  • the HUD device can be located below the steering wheel, for example, in the driver's side instrument panel (IP) below the steering wheel.
  • the HUD device may be the HUD device in any of the above embodiments.
  • Vehicles to which this application applies may have more or fewer components than the vehicle shown in Figure 12b, may combine two or more components, or may have a different configuration of components.
  • the vehicle may also include other devices such as processors, memories, wireless communication devices, sensors, etc.
  • this application provides an image display method, please refer to the introduction in Figure 13.
  • This image display method can be applied to the optical emission module shown in any of the embodiments shown in FIGS. 3 to 8 . It can also be understood that the image display method can be implemented based on the optical emission module shown in any of the embodiments shown in FIGS. 3 to 8 . Alternatively, the image display method may also be applied to the optical display device shown in FIGS. 10 to 11 , the vehicle shown in FIG. 12a , or the vehicle shown in FIG. 12b .
  • the image display method may be executed by a control device, which may belong to the optical display device or terminal equipment, or may be a control device independent of the optical display device or terminal equipment, such as a chip or a chip system.
  • a control device belongs to a terminal device (such as a vehicle)
  • the control device can be a domain processor in the vehicle, or it can also be an electronic control unit (ECU) in the vehicle, etc.
  • the image display method may include the following steps:
  • Step 1301: Control the first light source component of the optical emission module to emit the first light beam, and control the reflection area of the transflective component to align with the propagation optical path of the first light beam.
  • the information of the first image can be acquired, the first control signal is generated according to the information of the first image, and the first control signal is sent to the first light source component to control the first light source component to emit the first light beam. Furthermore, a third control signal can be generated based on the information of the first image, and the third control signal can be sent to the transflective component to control the transflective component to align the reflection area with the propagation optical path of the first light beam.
  • the information of the first image may include, but is not limited to, the content information of the first image to be displayed, for example navigation information, so as to control the light source component to emit the first light beam corresponding to the first image based on the content information of the first image.
  • the information of the first image may also include first indication information (or first identification information) of the first image; it can also be understood that the information of the first image may include information indicating that the image to be displayed is the first image, so as to control the alignment (such as by rotation) of the transflective component based on the information of the first image. Specifically, the first indication information can be carried by the content information of the first image, or it can be other information independent of the content information of the first image; this application does not specifically limit this, as long as it can be used to control the alignment of the reflection area of the transflective component with the propagation optical path of the first light beam.
  • the information of the second image may include, but is not limited to, the content information of the second image to be displayed, such as driving speed information, mileage information, engine speed information, temperature information, fuel level information, vehicle light status information, etc., so as to control the light source component to emit the second light beam corresponding to the second image based on the content information of the second image.
  • the information of the second image may also include second indication information (or second identification information) of the second image; it can also be understood that the information of the second image may include information indicating that the image to be displayed is the second image, so as to control the alignment (such as by rotation) of the transflective component based on the information of the second image. Specifically, the second indication information can be carried by the content information of the second image, or it can be other information independent of the content information of the second image; this application does not limit this. Anything that can control the alignment of the transmission area of the transflective component with the propagation optical path of the second light beam is within the scope of protection of this application.
  • the information of the first image may also be the same as the information of the second image, which is not limited in this application.
  • the first control signal can be generated according to the content information of the first image, and the third control signal can be generated according to the first indication information; the second control signal can be generated according to the content information of the second image, and the fourth control signal can be generated according to the second indication information.
  • Step 1302: Control the first light source component of the optical emission module to emit the second light beam, and control the transmission area of the transflective component to align with the propagation optical path of the second light beam.
  • information of the second image can be obtained, a second control signal is generated according to the information of the second image, and the second control signal is sent to the first light source component to control the first light source component to emit the second beam. Furthermore, a fourth control signal may be generated based on the information of the second image, and the fourth control signal may be sent to the transflective component to control the transmission area of the transflective component to align with the propagation optical path of the second light beam.
  • Alternatively, step 1302 may be executed first and then step 1301; this application does not limit the order.
  • Step 1303: Control the rotation of the second reflection module of the optical emission module.
  • the first beam scans on the first diffusion component through the rotation of the second reflection module to form a first image;
  • the second beam scans on the second diffusion component through the rotation of the second reflection module to form a second image.
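Steps 1301 to 1303 together describe a time-multiplexed display loop: the light source alternates between the two beams while the transflective component switches between reflection and transmission areas at the same frequency, and the second reflection module rotates continuously to scan each beam into its image. A minimal sketch of that loop is given below; all names (`display_cycle`, the event tuples) are illustrative placeholders, not APIs defined by this application.

```python
# Minimal sketch of the time-multiplexed loop in steps 1301-1303.
# Hardware actions are recorded as plain tuples; in a real device each
# entry would be a control signal sent to the light source component,
# the transflective component, or the second reflection module.

def display_cycle(frames):
    """Alternate between the far-plane image (first beam, reflection area)
    and the near-plane image (second beam, transmission area)."""
    events = []
    for _ in range(frames):
        # Step 1301: emit the first beam and align a reflection area with
        # its propagation path; the rotating second reflection module
        # (step 1303) scans it over the first diffusion component.
        events.append(("first_beam", "reflection_area"))
        # Step 1302: emit the second beam and align a transmission area
        # with its propagation path; the same rotating module scans it
        # over the second diffusion component.
        events.append(("second_beam", "transmission_area"))
    return events

log = display_cycle(frames=2)
```

Because a single second reflection module serves both beams, keeping the beam switching frequency equal to the area switching frequency is what guarantees that each beam always meets the matching area of the transflective component.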
  • the brightness of the first virtual image corresponding to the first image and the brightness of the second virtual image corresponding to the second image can also be obtained; the magnitudes of the currents input to the first light source, the second light source, and the third light source are then determined according to the brightness of the first virtual image, and likewise according to the brightness of the second virtual image.
  • the relationship between the brightness of the first virtual image and the weights of the currents input to the three light sources, and the relationship between the brightness of the second virtual image and the weights of the currents input to the three light sources, can be expressed as in Table 1 below.
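As a rough illustration of how such a brightness-to-current mapping could work, the sketch below scales each source's drive current by a per-color weight. The weight values and the `source_currents` helper are hypothetical stand-ins for the actual relationship given in Table 1.

```python
# Hypothetical sketch: derive the drive currents for the red, green, and
# blue light sources from a target virtual-image brightness. The weights
# are illustrative only; the real mapping is given by Table 1.

def source_currents(target_brightness, weights, max_current_ma=100.0):
    """Return per-color currents (mA) for a brightness fraction in [0, 1]."""
    if not 0.0 <= target_brightness <= 1.0:
        raise ValueError("brightness must be a fraction between 0 and 1")
    return {color: target_brightness * weight * max_current_ma
            for color, weight in weights.items()}

# Example: drive the first virtual image (far focal plane) at 80% brightness.
far_weights = {"red": 0.30, "green": 0.45, "blue": 0.25}  # illustrative
currents = source_currents(0.8, far_weights)
# currents is approximately {"red": 24.0, "green": 36.0, "blue": 20.0} (mA)
```

A separate weight set would be looked up for the second virtual image, so the two focal planes can be driven at different brightness levels by the same three sources.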
  • FIG. 14 and FIG. 15 are schematic structural diagrams of possible control devices provided by the present application. These control devices can be used to implement the method shown in Figure 13 in the above method embodiment, and therefore can also achieve the beneficial effects of the above method embodiment.
  • the control device can be the control module in the above-mentioned detection system, or it can also be the processor in the optical display device shown in Figures 10 to 11, in the vehicle shown in Figure 12a, or in the vehicle shown in Figure 12b, or it can also be another independent control device (such as a chip).
  • control device 1400 includes a processing module 1401, and may further include a transceiver module 1402.
  • the control device 1400 is used to implement the method in the above method embodiment shown in FIG. 13 .
  • the processing module 1401 controls the first light source component of the optical emission module to emit the first beam, controls the reflection area of the transflective component to align with the propagation light path of the first light beam, and controls the rotation of the second reflection module of the optical emission module.
  • the first light beam scans on the first diffusion component through the rotation of the second reflection module to form the first image;
  • control the first light source component to emit a second light beam, control the transmission area of the transflective component to align with the propagation light path of the second light beam, and control the rotation of the second reflection module;
  • the second light beam scans on the second diffusion component through the rotation of the second reflection module to form the second image.
  • processing module 1401 in the embodiment of the present application can be implemented by a processor or processor-related circuit components, and the transceiver module 1402 can be implemented by an interface circuit or other related circuit components.
  • control device 1500 may include a processor 1501 and, optionally, an interface circuit 1502.
  • the processor 1501 and the interface circuit 1502 are coupled to each other. It can be understood that the interface circuit 1502 may be an input-output interface.
  • the control device 1500 may also include a memory 1503 for storing computer programs or instructions executed by the processor 1501.
  • the processor 1501 is used to perform the functions of the above-mentioned processing module 1401, and the interface circuit 1502 is used to perform the functions of the above-mentioned transceiver module 1402.
  • for the processor 1501, please refer to the introduction of the processor 1101 in Figure 11 above, which will not be described again here.
  • the chip may include a processor and an interface circuit. Further, optionally, the chip may also include a memory.
  • the processor is used to execute computer programs or instructions stored in the memory, so that the chip executes any of the possible implementations in Figure 13 above. method.
  • the method steps in the embodiments of the present application can be implemented by hardware or by a processor executing software instructions.
  • the software instructions can be composed of corresponding software modules, and the software modules can be stored in a memory.
  • for the memory, please refer to the introduction of the memory in Figure 11 above, which will not be described again here.
  • a computer program product includes one or more computer programs or instructions.
  • the computer may be a general purpose computer, a special purpose computer, a computer network, or other programmable device.
  • a computer program or instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another; for example, a computer program or instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center by wired or wireless means.
  • Computer-readable storage media can be any available media that can be accessed by a computer, or data storage devices such as servers and data centers that integrate one or more available media. Available media can be magnetic media, such as floppy disks, hard disks, and tapes; optical media, such as digital video discs (DVD); or semiconductor media, such as solid state drives (SSD).
  • "a, b or c" can mean: a, b, c, "a and b", "a and c", "b and c", or "a and b and c", where a, b, and c can each be singular or plural.
  • the character “/” generally indicates that the related objects are in an "or” relationship.
  • in a formula, the character "/" indicates that the related objects are in a "division" relationship.
  • the word "exemplarily" is used to mean serving as an example, illustration, or explanation. Any embodiment or design described herein as "exemplary" is not to be construed as preferred or advantageous over other embodiments or designs. Rather, the use of the word "exemplary" is intended to present concepts in a specific manner and does not constitute a limitation on this application.

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Instrument Panels (AREA)
  • Mechanical Optical Scanning Systems (AREA)

Abstract

An optical emission module, an optical display device, a terminal device, and an image display method, used to solve the technical problem that a head-up display (HUD) device becomes larger in volume when generating virtual images at different focal planes. The solution can be applied to fields such as optical display and manual driving. The optical emission module includes a first light source component for emitting a first light beam and a second light beam; N reflection areas of a transflective component are used to reflect the first light beam to a first reflection component, and the first reflection component is used to reflect the received first light beam to a second reflection component; M transmission areas of the transflective component are used to transmit the second light beam to the second reflection component; the second reflection component is used to reflect the first light beam coming from the first reflection component to a first diffusion component and form a first image on the first diffusion component through rotation, and to reflect the second light beam coming from the transmission areas of the transflective component to a second diffusion component and form a second image on the second diffusion component through rotation.

Description

Optical emission module, optical display device, terminal device, and image display method

Technical Field

This application relates to the field of image display technologies, and in particular to an optical emission module, an optical display device, a terminal device, and an image display method.

Background

With the development of display technology, optical display devices are used ever more widely, for example in head up display (HUD) devices for vehicles and other means of transport; the wide application of HUD devices can improve transportation safety. A HUD device projects driving-related information (such as instrument information or navigation information) into the driver's forward field of view, so that the driver can see the instrument information, navigation information, and so on ahead without looking down at the instrument panel below the steering wheel or at the central control display, which shortens the braking reaction time in emergencies and thus improves driving safety.

To further enhance the driver's acquisition of road information and implement functions such as augmented reality (AR) navigation and AR warning, an AR-HUD device may be used. An AR-HUD device can display instrument information and navigation information at the same time; to keep these two types of displayed information from interfering with each other, the AR-HUD device needs to generate virtual images at two different focal planes (also called dual-screen display). At present, the mainstream solution uses two laser beam scanning (LBS) systems to let the AR-HUD device generate virtual images at two different focal planes. However, designing two LBS systems increases the volume of the AR-HUD device and the complexity of the optical path design.

In summary, how to generate virtual images at different focal planes without increasing the volume of the AR-HUD device is a technical problem that urgently needs to be solved.
Summary

This application provides an optical emission module, an optical display device, a terminal device, and an image display method, which are used to generate virtual images at different focal planes while reducing the volume of a HUD device.

In a first aspect, this application provides an optical emission module. The optical emission module may include a first light source component, a transflective component, a first reflection component, and a second reflection component. The transflective component includes N reflection areas and M transmission areas, where N and M are both positive integers. The first light source component is used to emit a first light beam and a second light beam; a reflection area of the transflective component is used to reflect the received first light beam to the first reflection component; the first reflection component is used to reflect the received first light beam to the second reflection component; a transmission area of the transflective component is used to transmit the received second light beam to the second reflection component; the second reflection component is used to reflect the first light beam from the first reflection component to a first diffusion component and, through the rotation of the second reflection component, scan the first diffusion component to form a first image, and to reflect the second light beam from the transmission area of the transflective component to a second diffusion component and, through the rotation of the second reflection component, scan the second diffusion component to form a second image.

The first image corresponds to a first virtual image at a far focal plane, and the second image corresponds to a second virtual image at a near focal plane. In other words, the first virtual image and the second virtual image are virtual images formed at two different focal planes.

Based on the above solution, a dual optical path design (namely the propagation path of the first light beam and the propagation path of the second light beam) can be implemented through the reflection areas and transmission areas of the transflective component, and a single second reflection component suffices to form the first image based on the first light beam and the second image based on the second light beam, which helps reduce the volume of the optical emission module. It can also be understood that, based on the above optical emission module, virtual images can be formed at two different focal planes (also called dual-screen display) by an optical emission module of relatively small volume.
In a possible implementation, the N reflection areas and the M transmission areas of the transflective component are distributed alternately.

With the N reflection areas and M transmission areas distributed alternately, switching between reflection areas and transmission areas makes it possible for a reflection area to reflect the received first light beam to the first reflection component and for a transmission area to transmit the received second light beam to the second reflection component.

In a possible implementation, the areas of the N reflection areas are the same or different; and/or the areas of the M transmission areas are the same or different.

In a possible implementation, the sum of the areas of the N reflection areas of the transflective component is greater than or equal to the sum of the areas of the M transmission areas.

Making the sum of the areas of the N reflection areas greater than the sum of the areas of the M transmission areas can increase the intensity of the reflected first light beam, which helps improve the brightness of the first image and, in turn, the brightness of the first virtual image corresponding to the first image.

In a possible implementation, the first reflection component includes a reflection mirror.

Using a reflection mirror as the first reflection component helps simplify the structure of the optical emission module, and the optical path design is also relatively simple.

In a possible implementation, the second reflection component is a micro-electromechanical system (MEMS) galvanometer mirror.

Through the rotation of the MEMS galvanometer mirror, the first light beam can scan the first diffusion component to form the first image, and the second light beam can scan the second diffusion component to form the second image.

In a possible implementation, the transflective component includes a color wheel.

Using a color wheel as the transflective component is simple to implement and makes it easy to control the switching between the reflection areas and the transmission areas.

In a possible implementation, the first diffusion component is a first diffusion screen, and/or the second diffusion component is a second diffusion screen.

In a possible implementation, the first light source component is specifically configured to emit the first light beam according to a received first control signal, where the first control signal is generated based on information of the first image; or to emit the second light beam according to a received second control signal, where the second control signal is generated based on information of the second image.

Through the received control signal, the first light source component can determine whether it needs to emit the first light beam of the first image or the second light beam of the second image.

In a possible implementation, the transflective component is specifically configured to align a reflection area of the transflective component with the propagation path of the first light beam according to a received third control signal, where the third control signal is generated based on information of the first image; or to align a transmission area of the transflective component with the propagation path of the second light beam according to a received fourth control signal, where the fourth control signal is generated based on information of the second image.

Through the received control signals, the transflective component can switch a reflection area to align with the propagation path of the first light beam in time, or switch a transmission area to align with the propagation path of the second light beam in time.

In a possible implementation, the switching frequency between the reflection areas and the transmission areas of the transflective component is equal to the switching frequency at which the first light source component alternates between emitting the first light beam and the second light beam.

When these two switching frequencies are equal, the reflection area of the transflective component is aligned exactly with the propagation path of the first light beam when the first light source component emits the first light beam, and the transmission area of the transflective component is aligned exactly with the propagation path of the second light beam when the first light source component emits the second light beam.

In a possible implementation, the first light source component includes a first light source for emitting red light, a second light source for emitting blue light, a third light source for emitting green light, and a light combining element; the light combining element is used to mix the red light, the blue light, and the green light to obtain the first light beam or the second light beam.

By mixing the red light emitted by the first light source, the blue light emitted by the second light source, and the green light emitted by the third light source, a first light beam or second light beam of any desired color can be obtained, and white light can also be obtained.

In a possible implementation, the light combining element includes a first dichroic mirror and a second dichroic mirror; the first dichroic mirror is used to reflect the blue light from the second light source and transmit the green light from the third light source; the second dichroic mirror is used to reflect the red light from the first light source, transmit the green light transmitted by the first dichroic mirror, and transmit the blue light reflected by the first dichroic mirror.

Since green light has the greatest influence on image brightness, the first dichroic mirror and the second dichroic mirror make it possible to place the third light source, which emits green light, at the position farthest from the light combining module, thereby balancing as much as possible the contributions of red, green, and blue light to the brightness of the first image and the second image.
In a second aspect, this application provides an optical emission module including a second light source component, a third light source component, a first reflection component, and a second reflection component; the second light source component is used to emit a first light beam; the third light source component is used to emit a second light beam; the first reflection component is used to reflect the first light beam from the second light source component to the second reflection component; the second reflection component is used to reflect the first light beam from the first reflection component to a first diffusion component and, through the rotation of the second reflection component, scan the first diffusion component to form a first image, and to reflect the second light beam from the third light source component to a second diffusion component and, through the rotation of the second reflection component, scan the second diffusion component to form a second image.

The first image corresponds to a first virtual image at a far focal plane, and the second image corresponds to a second virtual image at a near focal plane. In other words, the first virtual image and the second virtual image are virtual images formed at two different focal planes.

Based on the above solution, a dual optical path design (namely the propagation path of the first light beam and the propagation path of the second light beam) can be implemented through the two light source components (namely the second light source component and the third light source component), so that the first image and the second image can be formed in an optical emission module of relatively small volume.

In a possible implementation, the first reflection component includes a reflection mirror.

Using a reflection mirror as the first reflection component helps simplify the structure of the optical emission module, and the optical path design is also relatively simple.

In a possible implementation, the second reflection component is a micro-electromechanical system (MEMS) galvanometer mirror.

Through the rotation of the MEMS galvanometer mirror, the first light beam can scan the first diffusion component to form the first image, and the second light beam can scan the second diffusion component to form the second image.

In a possible implementation, the first diffusion component is a first diffusion screen, and/or the second diffusion component is a second diffusion screen.

In a possible implementation, the second light source component is specifically configured to emit the first light beam according to a received fifth control signal, where the fifth control signal is generated based on information of the first image; and/or the third light source component is specifically configured to emit the second light beam according to a received sixth control signal, where the sixth control signal is generated based on information of the second image.

Through the received control signals, the second light source component can emit the first light beam of the first image, and the third light source component can emit the second light beam of the second image.

In a possible implementation, the second light source component includes a first light source for emitting red light, a second light source for emitting blue light, a third light source for emitting green light, and a light combining element; the light combining element is used to mix the red light, the blue light, and the green light to obtain the first light beam. The third light source component may be the same as the second light source component.

By mixing the red light emitted by the first light source, the blue light emitted by the second light source, and the green light emitted by the third light source, first and second light beams of any desired color can be obtained, and white light can also be obtained.

In a possible implementation, the light combining element includes a first dichroic mirror and a second dichroic mirror; the first dichroic mirror is used to reflect the blue light from the second light source and transmit the green light from the third light source; the second dichroic mirror is used to reflect the red light from the first light source, transmit the green light transmitted by the first dichroic mirror, and transmit the blue light reflected by the first dichroic mirror.

Since green light has the greatest influence on image brightness, the first dichroic mirror and the second dichroic mirror make it possible to place the third light source, which emits green light, at the position farthest from the light combining module, thereby balancing as much as possible the contributions of red, green, and blue light to the brightness of the first image and the second image.
In a third aspect, this application provides an optical display device. The optical display device includes a light amplification module and any one of the optical emission modules of the first aspect above; or, the optical display device includes a light amplification module and any one of the optical emission modules of the second aspect above.

By way of example, the optical display device may include, but is not limited to, a head-up display (HUD) device, a projector, a vehicle-mounted display screen, an augmented reality (AR) device, a virtual reality (VR) device, and the like.

In a possible implementation, the light amplification module includes at least one curved reflection mirror and/or at least one cylindrical mirror.

In a fourth aspect, this application provides a terminal device. The terminal device includes a windshield and any one of the optical display devices of the third aspect above.

In a fifth aspect, this application provides an image display method. The image display method includes: controlling a first light source component of an optical emission module to emit a first light beam, controlling a reflection area of a transflective component of the optical emission module to align with the propagation path of the first light beam, and controlling a second reflection module of the optical emission module to rotate, where the first light beam scans a first diffusion component through the rotation of the second reflection module to form a first image; and controlling the first light source component to emit a second light beam, controlling a transmission area of the transflective component to align with the propagation path of the second light beam, and controlling the second reflection module to rotate, where the second light beam scans a second diffusion component through the rotation of the second reflection module to form a second image.

In a possible implementation, the first light source component includes a first light source for emitting red light, a second light source for emitting blue light, and a third light source for emitting green light. The method further includes: acquiring the brightness of a first virtual image corresponding to the first image and the brightness of a second virtual image corresponding to the second image; determining, according to the brightness of the first virtual image, the magnitudes of the currents input to the first light source, the second light source, and the third light source; and determining, according to the brightness of the second virtual image, the magnitudes of the currents input to the first light source, the second light source, and the third light source.

By controlling the magnitudes of the currents input to the first light source, the second light source, and the third light source, the brightness of the first virtual image and the second virtual image can be flexibly adjusted.

Further, optionally, information of the first image can be acquired, a first control signal is generated according to the information of the first image, and the first control signal is sent to the first light source component; the first control signal is used to control the first light source component to emit the first light beam.

Further, optionally, information of the second image can be acquired, a second control signal is generated according to the information of the second image, and the second control signal is sent to the first light source component; the second control signal is used to control the first light source component to emit the second light beam.

In a sixth aspect, this application provides a control device. The control device is used to implement the fifth aspect or any one of the methods of the fifth aspect, and includes corresponding functional modules respectively used to implement the steps of the above methods. The functions may be implemented by hardware, or by hardware executing corresponding software. The hardware or software includes one or more modules corresponding to the above functions.

In a possible implementation, the control device is, for example, a chip, a chip system, or a logic circuit. For the beneficial effects, refer to the description of the fifth aspect above, which is not repeated here. The control device may include a transceiver module and a processing module. The processing module may be configured to support the control device in performing the corresponding functions in the method of the fifth aspect above, and the transceiver module is used to support interaction between the control device and functional components in the optical emission module and the like. The transceiver module may be an independent receiving module, an independent transmitting module, a transceiver module integrating transmitting and receiving functions, or the like.

In a seventh aspect, this application provides a control device. The control device is used to implement the fifth aspect or any one of the methods of the fifth aspect, and includes corresponding functional modules respectively used to implement the steps of the above methods. The functions may be implemented by hardware, or by hardware executing corresponding software. The hardware or software includes one or more modules corresponding to the above functions.

In a possible implementation, the control device is, for example, a chip, a chip system, or a logic circuit. For the beneficial effects, refer to the description of the fifth aspect above, which is not repeated here. The control device may include an interface circuit and a processor. The processor may be configured to support the control device in performing the corresponding functions in the method of the fifth aspect above, and the interface circuit is used to support interaction between the control device and the optical emission module and the like. Optionally, the control device may further include a memory, which may be coupled to the processor and stores the program instructions and the like necessary for the control device.

In an eighth aspect, this application provides a chip. The chip includes at least one processor and an interface circuit. Further, optionally, the chip may also include a memory. The processor is used to execute computer programs or instructions stored in the memory, so that the chip performs the method in the fifth aspect or any possible implementation of the fifth aspect.

In a ninth aspect, this application provides a computer-readable storage medium storing a computer program or instructions which, when executed by a control device, cause the control device to perform the method in the fifth aspect or any possible implementation of the fifth aspect.

In a tenth aspect, this application provides a computer program product including a computer program or instructions which, when executed by a control device, cause the control device to perform the method in the fifth aspect or any possible implementation of the fifth aspect.

For the technical effects achievable by any one of the sixth to tenth aspects above, refer to the description of the beneficial effects in the fifth aspect above, which is not repeated here.
附图说明
图1a为本申请提供的一种光学发射模组应用于HUD装置的场景示意图;
图1b为本申请提供的一种光学发射模组应用于NED设备的场景示意图;
图1c为本申请提供的一种光学发射模组应用于投影仪的场景示意图;
图1d为本申请提供的一种光学发射模组应用于车载显示屏的场景示意图;
图2为现有技术中的一种HUD装置的结构示意图;
图3为本申请提供的一种光学发射模组的结构示意图;
图4a为本申请提供的一种第一光源组件的结构示意图;
图4b为本申请提供的另一种第一光源组件的结构示意图;
图5a为本申请提供的一种透反组件的结构示意图;
图5b为本申请提供的另一种透反组件的结构示意图;
图5c为本申请提供的又一种透反组件的结构示意图;
图5d为本申请提供的又一种透反组件的结构示意图;
图5e为本申请提供的又一种透反组件的结构示意图;
图6为本申请提供的一种经第二反射组件反射后的第一光束和第二光束在对应的扩散组件上扫描得到的线状图;
图7a为本申请提供的一种匀光元件的结构示意图;
图7b为本申请提供的另一种匀光元件的结构示意图;
图7c为本申请提供的另一种匀光元件的结构示意图;
图7d为本申请提供的另一种匀光元件的结构示意图;
图8为本申请提供的又一种光学发射模组的结构示意图;
图9为本申请提供的又一种光学发射模组的结构示意图;
图10为本申请提供的一种光学显示装置的结构示意图;
图11为本申请提供的一种光学显示装置的电路示意图;
图12a为本申请提供的一种交通工具的可能的功能框架示意图;
图12b为本申请提供的一种车辆部分结构的简化示意图;
图13为本申请提供的一种图像显示方法的方法流程示意图;
图14为本申请提供的一种控制装置的结构示意图;
图15为本申请提供的一种控制装置的结构示意图。
具体实施方式
下面将结合附图,对本申请实施例进行详细描述。
以下,对本申请中的部分用语进行解释说明。需要说明的是,这些解释是为了便于本领域技术人员理解,并不是对本申请所要求的保护范围构成限定。
一、虚像距(virtual image distance,VID)
虚像距是指眼盒中心与HUD虚像的中心之间的距离,可参见下述图1a。其中,眼盒通常是指驾驶员的眼睛能够看到全部虚像的范围。一般眼盒尺寸大小是130毫米(mm)x50mm。由于不同驾驶员的身高的差异,眼盒在垂直方向上有约±50mm的移动范围。也可以理解为,驾驶员在眼盒范围可以看到清晰的HUD虚像,当向左右或上下移动超出眼盒范围时,可能看到扭曲的HUD虚像、甚至看不到HUD虚像。
前文介绍了本申请所涉及到的一些用语,下面介绍本申请可能的应用场景。
在一种可能的应用场景中,本申请中的光学发射模组可以集成于抬头显示(head-up display,HUD)装置,请参阅图1a。图1a以HUD安装于车辆为例进行介绍。HUD可将形成的图像(称为HUD虚像)投射在驾驶员的前方视野范围,并与真实路面信息融合起来,从而可增强驾驶员对于实际驾驶环境的感知。例如,HUD可以将携带导航信息和/或仪表信息(如行驶速度、行驶里程、转速、温度、油量、车灯状态等信息)的HUD虚像叠加在交通工具外的真实环境上,使得驾驶员可获得增强现实的视觉效果。具体可应用于增强现实(augmented reality,AR)导航、自适应巡航、车道偏离预警等场景。其中,HUD包括但不限于增强现实抬头显示(augmented reality head up display,AR-HUD)装置等。
在又一种可能的应用场景中,本申请中的光学发射模组也可以集成于近眼显示(near eye display,NED)设备,NED设备例如可以是AR设备或VR设备,AR设备可以包括但不限于AR眼镜或AR头盔,VR设备可以包括但不限于VR眼镜或VR头盔。请参阅图1b,以AR眼镜为例示例,用户可佩戴AR眼镜设备进行游戏、观看视频、参加虚拟会议、或视频购物等。
在又一种可能应用场景中,本申请中的光学发射模组可以集成于投影仪,请参阅图1c,投影仪可以将图像投影到墙面或投影屏幕上。基于本申请的光学发射模组,可以实现双投影屏幕显示。
在又一种可能的实现方式中,本申请中的光学发射模组也可以集成于车载显示屏,请参阅图1d,车载显示屏可以安装在交通工具的座椅后背或副驾驶位置等,本申请对车载显示屏安装的位置不作限定。
应理解,上述给出的可能的应用场景仅是举例,本申请提供的光学发射模组还可以应用在其它可能的场景,而不限于上述示例出的场景。例如还可应用于显示器,作为背光源等。
请参阅图2,为现有技术中的一种HUD装置的结构示意图。基于该HUD装置可以在两个不同的位置分别形成HUD虚像。该HUD装置包括LBS1、LBS2、曲面反射镜1和曲面反射镜2。为了在两个不同的位置形成HUD虚像,LBS1和LBS2需要放置在两个不同的指定位置,指定位置是根据需要形成的两个虚像的虚像距确定的。由于两个LBS需要占据较大的空间,从而会造成HUD装置体积较大,进而会限制HUD装置的应用场景。特别是当HUD装置应用于车辆等空间有限的场景中时,需要小型化的HUD装置。
鉴于此,本申请提出一种光学发射模组,基于该光学发射模组的光学显示装置可以在不增大光学显示装置体积的情况下,产生不同焦面的虚像。
基于上述内容,下面结合附图3至附图9,对本申请提出的光学发射模组进行具体阐述。结合附图10和附图11对本申请提供的光学显示装置进行具体的阐述。结合附图12a和附图12b对本申请提供的终端设备进行具体阐述。结合附图13对本申请提供的图像显示方法进行具体阐述。
请参阅图3,为本申请提供的一种光学发射模组的结构示意图。该光学发射模组可包括第一光源组件、透反组件、第一反射组件和第二反射组件,透反组件包括N个反射区域和M个透射区域,N和M均为正整数。一种可能的情况下,透反组件包括一个反射区域和一个透射区域。另一种可能的情况下,N个反射区域和M个透射区域交叉分布。第一光源组件用于发射第一光束和第二光束。透反组件的反射区域用于将接收到的第一光束反射至第一反射组件;透反组件的透射区域用于将接收到的第二光束透射至第二反射组件。也可以理解为,第一光源组件发射第一光束,透反组件的反射区域对准第一光束的传播光路;第一光源组件发射第二光束,透反组件的透射区域对准第二光束的传播光路。第一反射组件用于将接收到的第一光束反射至第二反射组件;第二反射组件用于将来自第一反射组件的第一光束反射至第一扩散组件、并通过第二反射组件的转动在第一扩散组件上扫描形成第一图像,以及将来自透反组件的透射区域的第二光束反射至第二扩散组件、并通过第二反射组件的转动在第二扩散组件上扫描形成第二图像。
在一种可能的实现方式中,第一图像显示的内容可以与第二图像显示内容相同,或者也可以不同。例如,第一图像可以显示导航信息;第二图像可以显示仪表信息,如行驶速度、行驶里程、转速、温度、油量和车灯状态等信息。其中,第一图像对应于远焦面的第一虚像,第二图像对应于近焦面的第二虚像。换言之,第一虚像和第二虚像为形成在两个不同焦平面的虚像。
基于上述光学发射模组,通过透反组件的反射区域和透射区域可以实现双光路(即第一光束的传播光路和第二光束的传播光路)设计,而且通过一个第二反射组件即可实现基于第一光束形成第一图像、基于第二光束形成第二图像,从而有助于减小光学发射模组的体积。也可以理解为,基于上述光学发射模组,可以通过较小体积的光学发射模组在两个不同的焦面形成虚像(或称为双屏显示)。
下面对图3所示的各个功能组件分别进行介绍说明,以给出示例性的具体实现方案。
一、第一光源组件
在一种可能的实现方式中,第一光源组件用于发射第一光束或第二光束。其中,第一光束和第二光束可以是单色光束,或者也可以由多种颜色的光束合成,本申请对此不作限定。
示例性地,第一光源组件可包括用于发射红光(Red,R)的第一光源(或称为R光源)、用于发射蓝光(Blue,B)的第二光源(或称为B光源)和用于发射绿光(Green,G)的第三光源(或称为G光源)。通过第一光源发射的红光、第二光源发射的蓝光和第三光源发射的绿光,可以混合得到不同颜色的第一光束或第二光束,也可以混合得到白光。进一步,可选的,第一光源、第二光源、第三光源例如可以是激光二极管(laser diode,LD)、发光二极管(light emitting diode,LED)、有机发光二极管(organic light emitting diode,OLED),或者微型发光二极管(micro light emitting diode,micro-LED)等。
请参阅图4a,为本申请提供的一种第一光源组件的结构示意图。该第一光源组件包括第一光源、第二光源和第三光源,这三个光源排为一列,三个光源发射出的三种颜色(RGB)的光可混合形成第一光束或第二光束。进一步,可选的,为了提高各个光源发出的光束的均匀度,每个光源还对应一个准直镜(例如准直透镜、或曲面反射镜等)。具体的,第一光源对应一个准直镜,第二光源对应一个准直镜,第三光源也对应一个准直镜。基于该第一光源组件,不需要二向镜,从而可简化第一光源组件的结构,进而可进一步减小光学发射模组的体积。
请参阅图4b,为本申请提供的另一种第一光源组件的结构示意图。该第一光源组件包括第一光源、第二光源和第三光源。进一步,可选的,该第一光源组件还可包括合光元件,其中,合光元件包括第一二向镜和第二二向镜。对于三个光源发射出的三种颜色(RGB)的光,第一二向镜用于反射来自第二光源的蓝光,并向第二二向镜透射来自第三光源的绿光;第二二向镜用于反射来自第一光源的红光,透射第一二向镜透射的绿光、以及透射第一二向镜反射的蓝光。也可以理解为,经第二二向镜后,来自第一光源的红光、来自第三光源的绿光和来自第二光源的蓝光混合形成第一光束或第二光束。由于绿光对图像的亮度的影响最大,通过第一二向镜和第二二向镜可以将用于发射绿光的第三光源放置于距离合光元件最远的位置,如此可以尽可能地平衡红光、绿光和蓝光对第一图像和第二图像的亮度的影响。进一步,可选的,为了提高各个光源发出的光束的均匀度,每个光源也可以对应一个准直镜,具体可参见前述图4a的介绍,此处不再赘述。
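下面用一段示意性的Python代码说明图4b中两个二向镜的合光顺序。其中的数据结构与函数名均为便于说明而假设的简化模型,并非本申请限定的实现方式,也不是真实的光学仿真:统计各颜色并入主光路后需要透过的二向镜数量,可以看到绿光需透过两面二向镜,因此将绿光光源放在距离合光元件最远的位置。

```python
# 简化模型(示例性假设):ENTRY 记录各颜色并入主光路时所处的镜位序号,
# MIRRORS 按光路顺序记录每面二向镜反射的颜色集合(对应图4b的布置)。
ENTRY = {"G": 0, "B": 1, "R": 2}   # G 从第一二向镜前透射进入,B/R 分别在两面镜处被反射并入
MIRRORS = [{"B"}, {"R"}]           # 第一二向镜反射蓝光,第二二向镜反射红光

def transmissions(color):
    """返回该颜色并入主光路后需要"透过"的二向镜数量。"""
    n = 0
    for idx, reflects in enumerate(MIRRORS, start=1):
        if ENTRY[color] < idx and color not in reflects:
            n += 1
    return n
```

按此模型,绿光透过两面二向镜、蓝光透过一面、红光不需透过任何一面,与"绿光光源距合光元件最远、其光束穿过的二向镜最多"的描述一致。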
需要说明的是,上述给出的第一光源组件中的第一光源、第二光源和第三光源的位置也可以互换。对于上述图4b所示的第一光源组件的结构,若第一光源和第二光源的位置互换,相应的,可用第三二向镜替换第二二向镜,第三二向镜可反射蓝光,透射红光和绿光。此处不再一一列举。
可以理解的是,第一光源组件发射的第一光束携带有第一图像的信息(例如第一图像的内容信息),第二光束携带有第二图像的信息(例如第二图像的内容信息)。其中,第一光源组件针对第一图像可以逐像素点投射,第一光源组件针对第二图像逐像素点投射,例如可由控制组件控制,具体可参见下述控制组件的介绍,此处不再赘述。
二、透反组件
在一种可能的实现方式中,透反组件包括N个反射区域和M个透射区域,N和M均为正整数。例如,透反组件包括一个反射区域和一个透射区域。再比如,透反组件包括多个反射区域和多个透射区域。具体的,透反组件的反射区域用于将接收到的第一光束反射至第一反射组件,透反组件的透射区域用于将接收到的第二光束透射至第二反射组件。
请参阅图5a,示例性地的示出了本申请提供的一种透反组件的结构示意图。该透反组件以包括2个反射区域和2个透射区域为例,这2个反射区域和2个透射区域交叉分布。
进一步,该透反组件还包括固定区域。固定区域可与驱动元件连接,驱动元件可以集成于光学发射模组,或者也可以独立于光学发射模组。驱动元件可以基于接收到的控制信号驱动该固定区域带动透反组件转动,关于控制信号的相关介绍可参见下述控制组件的介绍,此处不再赘述。具体的,驱动元件可以基于接收到的控制信号带动固定区域以C为中心绕y轴旋转切换。驱动元件例如可以是驱动电机或伺服电机等。可以理解的是,驱动元件可以驱动透反组件沿顺时针方向转动,或者也可以沿逆时针方向转动,本申请对此不作限定。
示例性地,驱动元件基于接收到的第三控制信号,可带动固定区域以C为中心绕y轴旋转至反射区域对准来自第一光源组件的第一光束的传播光路,也可以理解为,透反组件的反射区域对准第一光束的传播光路,第一光束可射向透反组件的反射区域。驱动元件基于接收到的第四控制信号,可带动固定区域以C为中心绕y轴旋转至透射区域对准来自第一光源组件的第二光束的传播光路,也可以理解为,透反组件的透射区域对准第二光束的传播光路,第二光束可射向透反组件的透射区域。
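上述"根据控制信号转动至对应区域对准光路"的过程,可以用如下Python草图示意。区域布局取图5a的2个反射区域与2个透射区域交叉分布,信号名 "third"/"fourth" 仅为示例性假设,并非本申请限定的信号格式:

```python
REGIONS = ["reflect", "transmit", "reflect", "transmit"]  # 图5a:反射/透射区域交叉分布

class Transflector:
    """示意性的透反组件模型:驱动元件按控制信号逐格转动,直至目标区域对准光路。"""

    def __init__(self):
        self.index = 0  # 当前对准光路的区域序号

    def apply(self, control_signal):
        # 第三控制信号 -> 反射区域对准;第四控制信号 -> 透射区域对准
        target = "reflect" if control_signal == "third" else "transmit"
        while REGIONS[self.index] != target:
            self.index = (self.index + 1) % len(REGIONS)  # 绕y轴转动一格
        return REGIONS[self.index]
```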
可以理解的是,透反组件包括的N个反射区域的面积之和可以大于、或小于或等于M个透射区域的面积之和。具体可结合实际应用中对第一虚像(或称为远焦屏)和第二虚像(或称为近焦屏)的亮度的需求进行分配。当N个反射区域的面积之和大于M个透射区域的面积之和(请参阅图5b)时,可增加反射的第一光束的强度,从而有助于提高第一图像的亮度,进而可提高第一图像对应的第一虚像的亮度。
需要说明的是,为了改变第一图像和/或第二图像的亮度,还可以通过控制输入第一光源组件的第一光源、第二光源和第三光源的电流大小来实现,具体可参见下述图13中的相关介绍,此处不再赘述。
可以理解的是,透反组件也可以包括多于2个的反射区域和多于2个的透射区域,请参阅图5c,以透反组件包括4个反射区域和4个透射区域为例,4个反射区域和4个透射区域交叉分布。或者,透反组件也可以包括1个反射区域和1个透射区域,请参阅图5d。需要说明的是,本申请对透反组件包括的反射区域的数量和透射区域的数量不作限定,上述图5a、图5b、图5c和图5d仅是示例。此外,透反组件包括的反射区域和透射区域的数量可以相同或者也可以不同,请参阅图5e。
在一种可能的实现方式中,透反组件的反射区域可以是在玻璃上镀制反射膜形成的,透反组件的透射区域可以是在透明玻璃上镀制抗反射膜形成的。进一步,可选的,反射膜可反射可见光波段范围的光束。示例性的,透反组件可以为色轮。
三、反射组件
在一种可能的实现方式中,光学发射模组可包括第一反射组件和第二反射组件。示例性地,第一反射组件例如可以至少包括一个反射镜(例如平面反射镜),或者第一反射组件例如可以包括反射棱镜(如全反射棱镜等),第二反射组件例如可以是MEMS振镜。若第一反射组件包括多个反射镜,第一光束可在多个反射镜之间反射,可进一步增加第一光束的光程。
其中,第二反射组件将来自第一反射组件的第一光束反射至第一扩散组件,并通过转动在第一扩散组件上扫描形成第一图像。第二反射组件将来自透反组件的透射区域的第二光束反射至第二扩散组件,并通过转动在第二扩散组件上扫描形成第二图像。也可以理解为,来自第一反射组件的第一光束和来自透反组件的透射区域的第二光束经过相同的第二反射组件反射后,第一光束和第二光束以特定的固定的夹角α投射到各自对应的扩散组件上进行来回线状扫描,产生两个实像(即第一图像和第二图像)。具体的,第二反射组件以MEMS振镜为例,经MEMS振镜转动并反射后的第一光束在第一扩散组件上进行来回线状扫描(请参阅图6中的(a)),产生第一图像;经MEMS振镜转动并反射后的第二光束在第二扩散组件上进行来回线状扫描(请参阅图6中的(b)),产生第二图像。
需要说明的是,经第二反射组件反射后的第一光束和第二光束之间的夹角α与需求的第一虚像和第二虚像的大小和位置等信息相关,具体可根据第一虚像和第二虚像的大小和位置确定。其中,第一虚像和第二虚像的大小和位置信息可以预先设置的。
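夹角α与扫描位置的关系可以用一个简化的平面几何草图估算。以下代码仅为示意,振镜到扩散屏的距离等参数均为示例性假设,并非本申请给出的设计公式:振镜转角为θ时,反射光方向按反射定律改变2θ,两束出射光方向始终相差固定的α。

```python
import math

def spot_positions(theta, alpha, dist1, dist2):
    """返回第一/第二光束在各自扩散组件上的横向位置(简化的平面几何模型)。
    theta:振镜转角;alpha:两束光的固定夹角;dist1/dist2:振镜到两块扩散屏的距离。"""
    a1 = 2 * theta          # 振镜转动theta,第一光束出射方向改变2*theta(反射定律)
    a2 = 2 * theta + alpha  # 第二光束出射方向与第一光束恒差alpha
    return dist1 * math.tan(a1), dist2 * math.tan(a2)
```

随着θ来回变化,两个光斑分别在各自扩散屏上来回线状扫描,而二者的角度差始终保持为α。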
在一种可能的实现方式中,第一扩散组件和第二扩散组件例如可以是扩散屏或称为扩散板(Diffuser)或扩散片。其中,第一扩散组件可以与第二扩散组件相同。进一步,可选的,第一扩散组件和第二扩散组件可以是一个扩散屏的两部分,或者也可以是两个扩散屏。
示例性的,扩散屏可以是表面包括微透镜阵列(microlens array,MLA)(或称为复眼透镜)的屏。具体的,可以是在基板的单面压印微透镜阵列形成扩散屏(请参阅图7a),或者也可以是在基板的双面压印微透镜阵列形成扩散屏(请参阅图7b),或者也可以是微透镜阵列与基板的单面一体成型形成扩散屏(请参阅图7c),或者也可以是微透镜阵列与基板的双面一体成型形成扩散屏(请参阅图7d)。需要说明的是,扩散屏也可以是通过其它可能的方式形成。例如,通过玻璃刻蚀或玻璃冷加工等形成的。再比如,也可以是在片材中加入扩散剂等工艺形成。第一光束经第一扩散组件进行匀光,并实现对第一光束的发散角和第一光束的传播光路的方向控制,第二光束经第二扩散组件进行匀光,并实现对第二光束的发散角以及第二光束的传播光路的方向控制。也可以理解为,微透镜阵列可以实现将第一光束或第二光束的发散角进行控制,从而可使得第一光束和第二光束变得更均匀且角度可控,进而可提高形成的第一图像和第二图像的均匀性和清晰度,进而可使得第一虚像和第二虚像的均匀性好、清晰度高且亮度高。
需要说明的是,图7a~图7d中所示的微透镜阵列包括的透镜的数量仅是示例,本申请中微透镜阵列可以包括比图7a~图7d多的透镜,也可以比图7a~图7d少的透镜,本申请对此不作限定。应理解,微透镜阵列包括的微透镜(或称为子眼)越多,匀光效果越好。此外,微透镜阵列可以是一列,也可以是多列,本申请对此也不作限定。
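微透镜阵列对发散角的控制作用可以用一个常见的近似几何公式估算:若单个微透镜口径为p、焦距为f,则出射发散半角约为arctan(p/(2f))。下述Python草图仅为这一常见近似的示意,并非本申请给出的设计公式:

```python
import math

def divergence_half_angle(pitch, focal_length):
    """微透镜阵列出射光束的近似发散半角(弧度):arctan(p / (2f))。
    pitch:单个微透镜口径;focal_length:微透镜焦距。均为示例性参数。"""
    return math.atan(pitch / (2.0 * focal_length))
```

口径越小或焦距越长,发散半角越小,即扩散后的光束角度越可控。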
当第一扩散组件和第二扩散组件为两个独立的扩散屏时,有助于简化光学发射模组的光路的设计。进一步,第一扩散组件的中心和第二扩散组件的中心之间的距离为a(请参阅图8)。距离a的大小具体可根据实际需求的第一虚像和第二虚像的位置和大小等信息确定。
本申请中,光学反射模组还可包括控制组件,下面对控制组件进行介绍。
四、控制组件
在一种可能的实现方式中,控制组件用于控制第一光源组件和透反组件。该控制组件控制第一光源组件发射第一光束(即发射远焦像素点),同步的控制透反组件的反射区域对准第一光束的传播光路。控制组件控制第一光源组件发射第二光束(即发射近焦像素点),同步的控制透反组件的透射区域对准第二光束的传播光路。
具体的,控制组件可以向第一光源组件发送第一控制信号,并向驱动元件发送第三控制信号。相应的,第一光源组件基于接收到的第一控制信号发射第一光束。驱动元件基于接收到的第三控制信号驱动透反组件旋转至反射区域对准第一光束的传播光路。控制组件向第一光源组件发送第二控制信号,并向驱动元件发送第四控制信号。相应的,第一光源组件基于接收到的第二控制信号,发射第二光束。驱动元件基于接收到的第四控制信号驱动透反组件旋转至透射区域对准第二光束的传播光路。关于第一控制信号和第二控制信号的详细介绍可参见下述图11中的相关介绍,此处不再赘述。
在一种可能的实现方式中,透反组件的反射区域和透射区域的切换频率等于第一光源组件发射第一光束和第二光束的切换频率。如此,可使得当第一光源组件发射第一光束时,透反组件的反射区域正好对准第一光束的传播光路,当第一光源组件发射第二光束时,透反组件的透射区域正好对准第二光束的传播光路。
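这一同频同步关系可以用如下Python草图检验(帧序列与名称均为示例性假设):每个切换周期内,第一光束总与反射区域配对、第二光束总与透射区域配对。

```python
def interleave(num_cycles):
    """生成 (光束, 对准区域) 的时序序列:每个周期先发第一光束(反射区域对准),
    再发第二光束(透射区域对准)。"""
    seq = []
    for _ in range(num_cycles):
        seq.append(("beam1", "reflect"))   # 第一光束 <-> 反射区域
        seq.append(("beam2", "transmit"))  # 第二光束 <-> 透射区域
    return seq

def synchronized(seq):
    """检查序列中光束与区域是否始终匹配(即切换频率相等且相位对齐)。"""
    return all((b == "beam1") == (r == "reflect") for b, r in seq)
```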
需要说明的是,控制组件可能的具体形态可参见下述图11中的处理器的介绍,此处不再赘述。
基于上述内容,下面结合具体的硬件结构,给出上述光学发射模组的一种具体实现方式。以便于进一步理解上述光学发射模组的结构及光学发射模组实现双屏显示的过程。需要说明的是,上述给出各个功能组件中,如果没有特殊说明以及逻辑冲突,根据其内在的逻辑关系可以组合形成其它可能的光学发射模组。下面给出的光学发射模组的结构仅是示例。
如图8所示,为本申请提供的又一种光学发射模组的结构示意图。该光学发射模组可包括第一光源组件、色轮、反射镜和MEMS振镜。其中,色轮以上述图5a所示的色轮为例,第一光源组件以上述图4b所示的第一光源组件为例。第一光源组件发射的第一光束经色轮的反射区域反射至反射镜,经反射镜将第一光束反射至MEMS振镜;第一光源组件发射的第二光束经色轮的透射区域透射至MEMS振镜;射向MEMS振镜的第一光束和第二光束之间的夹角为α,基于反射原理,从MEMS振镜反射出的第一光束和第二光束之间的夹角也为α;通过MEMS振镜的转动,第一光束可在第一扩散屏上形成第一图像,第二光束可在第二扩散屏上形成第二图像。
请参阅图9,为本申请提供的又一种光学发射模组的结构示意图。该光学发射模组包括第二光源组件、第三光源组件、第一反射组件和第二反射组件。第二光源组件用于发射第一光束;第三光源组件用于发射第二光束;第一反射组件用于将来自第二光源组件的第一光束反射至第二反射组件;第二反射组件用于将来自第一反射组件的第一光束反射至第一扩散组件、并通过第二反射组件的转动在第一扩散组件上扫描形成第一图像,以及将来自第三光源组件的第二光束反射至第二扩散组件、并通过第二反射组件的转动在第二扩散组件上扫描形成第二图像。
基于该光学发射模组,通过两个光源组件(即第二光源组件、第三光源组件),可以实现双光路设计(即第一光束的传播光路和第二光束的传播光路),从而可以在较小体积的光学发射模组中形成第一图像和第二图像。
该示例中的第二光源组件和第三光源组件可以与上述第一光源组件相同,具体可参见上述第一光源组件的介绍。其它功能组件可分别参见前述相关介绍,此处不再赘述。
在一种可能的实现方式中,第二光源组件可根据接收到的第五控制信号发射第一光束,第五控制信号是基于第一图像的信息生成的;第三光源组件可根据接收到的第六控制信号发射第二光束,第六控制信号是基于第二图像的信息生成的。第五控制信号可以是上述控制组件发送给第二光源组件的,第五控制信号可与上述第一控制信号相同,具体可参见上述第一控制信号的介绍,此处不再赘述。第六控制信号也可以是上述控制组件发送给第三光源组件的,第六控制信号可与上述第二控制信号相同,具体可参见上述第二控制信号的介绍,此处不再赘述。
基于上述描述的光学发射模组的结构和功能原理,本申请还提供一种光学显示装置。请参阅图10,该光学显示装置可包括上述任一实施例中的光学发射模组,进一步,光学显示装置还可包括光放大模组,用于将来自光学发射模组的第一图像和第二图像进行放大。
在一种可能的实现方式中,光放大模组包括至少一个曲面反射镜和/或至少一个柱面镜。图10中以光放大模组包括两个曲面反射镜为例。第一图像经光放大模组的放大后,经风挡反射可在第一位置形成第一虚像,第二图像经光放大模组的放大后,经风挡反射可在第二位置形成第二虚像。其中,第一位置和第二位置为两个不同的位置。也可以理解为,经风挡可以在两个不同的位置形成两个虚像。
在一种可能的实现方式中,光学显示装置还可包括第一扩散组件和第二扩散组件,可参见上述图10,具体可参见前述相关介绍,此处不再赘述。
可以理解的是,光学显示装置还可以包括其它可能的结构,本申请对此不作限定。例如,光学显示装置还可包括光学镜头,光学镜头可包括至少一个镜片,镜片可以是球面透镜,也可以是非球面透镜。通过多片球面透镜和/或非球面透镜的组合,有助于提高光学镜头的成像质量,降低光学镜头的像差。通过光学镜头可对形成第一图像的第一光束进行整形和/或匀光,从而有助于提高基于第一光束形成的第一图像的质量。同理,通过光学镜头可对形成第二图像的第二光束进行整形和/或匀光,从而有助于提高基于第二光束形成的第二图像的质量。
请参阅图11,为本申请提供的一种光学显示装置的电路示意图。该光学显示装置中的电路主要包括处理器1101,外部存储器接口1102,内部存储器1103,音频模块1104,视频模块1105,电源模块1106,无线通信模块1107,I/O接口1108、视频接口1109、显示电路1110和调制器1111等中的一个或多个。其中,处理器1101与其周边的元件,例如外部存储器接口1102,内部存储器1103,音频模块1104,视频模块1105,电源模块1106,无线通信模块1107,I/O接口1108、视频接口1109、显示电路1110可以通过总线连接。处理器1101可以称为前端处理器。
另外,本申请实施例示意的电路图并不构成对光学显示装置的具体限定。在本申请另一些实施例中,光学显示装置可以包括比图示更多或更少的部件,或者组合某些部件,或者拆分某些部件,或者不同的部件布置。图示的部件可以以硬件,软件或软件和硬件的组合实现。
其中,处理器1101可以是一种具有信号(或数据)处理能力的电路。在一种实现中,处理器可以是具有指令读取与运行能力的电路,例如中央处理单元(central processing unit,CPU)、微处理器、图形处理器(graphics processing unit,GPU)(可以理解为一种微处理器)、或数字信号处理器(digital signal processor,DSP)等;在另一种实现中,处理器可以通过硬件电路的逻辑关系实现一定功能,该硬件电路的逻辑关系是固定的或可以重构的,例如处理器为专用集成电路(application-specific integrated circuit,ASIC)或可编程逻辑器件(programmable logic device,PLD)实现的硬件电路,例如现场可编程门阵列(field programmable gate array,FPGA)。在可重构的硬件电路中,处理器加载配置文档、实现硬件电路配置的过程,可以理解为处理器加载指令、以实现以上部分或全部单元的功能的过程。此外,还可以是针对人工智能设计的硬件电路,其可以理解为一种ASIC,例如神经网络处理单元(neural network processing unit,NPU)、张量处理单元(tensor processing unit,TPU)、深度学习处理单元(deep learning processing unit,DPU)等。例如还可以是应用处理器(application processor,AP)、图像信号处理器(image signal processor,ISP)、或者其它可编程逻辑器件、晶体管逻辑器件,硬件部件或者其任意组合等。
处理器1101中还可以设置存储器,用于存储指令和数据。在一些实施例中,处理器1101中的存储器为高速缓冲存储器。该存储器可以保存处理器1101刚用过或循环使用的指令或数据。如果处理器1101需要再次使用该指令或数据,可从存储器中直接调用。避免了重复存取,减少了处理器1101的等待时间,因而提高了光学显示装置的效率。其中,处理器1101可以执行存储的指令。
在一些实施例中,光学显示装置还可以包括多个连接到处理器1101的输入输出(input/output,I/O)接口1108。接口1108可以包括集成电路(inter-integrated circuit,I2C)接口,集成电路内置音频(inter-integrated circuit sound,I2S)接口,脉冲编码调制(pulse code modulation,PCM)接口,通用异步收发传输器(universal asynchronous receiver/transmitter,UART)接口,移动产业处理器接口(mobile industry processor interface,MIPI),通用输入输出(general-purpose input/output,GPIO)接口,用户标识模块(subscriber identity module,SIM)接口,和/或通用串行总线(universal serial bus,USB)接口等。上述I/O接口1108可以连接鼠标、触摸板、键盘、摄像头、扬声器/喇叭、麦克风等设备,也可以连接光学显示装置上的物理按键(例如音量键、亮度调节键、开关机键等)。
外部存储器接口1102可以用于连接外部存储卡,例如Micro SD卡,实现扩展光学显示装置的存储能力。外部存储卡通过外部存储器接口1102与处理器1101通信,实现数据存储功能。
内部存储器1103可以用于存储计算机可执行程序代码,可执行程序代码包括指令。内部存储器1103可以包括存储程序区和存储数据区。其中,存储程序区可存储操作系统,至少一个功能所需的应用程序等。存储数据区可存储光学显示装置使用过程中所创建的数据等。此外,内部存储器1103可以包括随机存取存储器(random access memory,RAM)、闪存、只读存储器(read-only memory,ROM)、可编程只读存储器(programmable ROM,PROM)、可擦除可编程只读存储器(erasable PROM,EPROM)、电可擦除可编程只读存储器(electrically EPROM,EEPROM)、寄存器、硬盘、移动硬盘、CD-ROM或者本领域熟知的任何其它形式的存储介质。一种示例性的存储介质耦合至处理器,从而使处理器能够从该存储介质读取信息,且可向该存储介质写入信息。另一种示例中,存储介质也可以是处理器的组成部分。处理器1101通过运行存储在内部存储器1103的指令,和/或存储在设置于处理器1101中的存储器的指令,执行光学显示装置的各种功能应用以及数据处理。
光学显示装置可以通过音频模块1104以及应用处理器等实现音频功能。例如音乐播放,通话等。
音频模块1104用于将数字音频信息转换成模拟音频信号输出,也用于将模拟音频输入转换为数字音频信号。音频模块1104还可以用于对音频信号编码和解码,例如进行放音或录音。在一些实施例中,音频模块1104可以设置于处理器1101中,或将音频模块1104的部分功能模块设置于处理器1101中。
视频接口1109可以接收外部输入的音视频信号,其具体可以为高清晰多媒体接口(high definition multimedia interface,HDMI),数字视频接口(digital visual interface,DVI),视频图形阵列(video graphics array,VGA),显示端口(display port,DP)等,视频接口1109还可以向外输出视频。当光学显示装置作为抬头显示使用时,视频接口1109可以接收周边设备输入的速度信号、电量信号,还可以接收外部输入的AR视频信号。当光学显示装置作为投影仪使用时,视频接口1109可以接收外部电脑或终端设备输入的视频信号。
视频模块1105可以对视频接口1109输入的视频进行解码,例如进行H.264解码。视频模块还可以对光学显示装置采集到的视频进行编码,例如对外接的摄像头采集到的视频进行H.264编码。此外,处理器1101也可以对视频接口1109输入的视频进行解码,然后将解码后的图像信号输出到显示电路1110。
显示电路1110和调制器1111用于显示对应的图像。在本实施例中,视频接口1109接收外部输入的视频源信号,视频模块1105进行解码和/或数字化处理后输出一路或多路图像信号至显示电路1110,显示电路1110根据输入的图像信号驱动调制器1111将入射的偏振光进行成像,进而输出至少两路图像光。此外,处理器1101也可以向显示电路1110输出一路或多路图像信号。显示电路1110也可以称为驱动电路。
电源模块1106用于根据输入的电力(例如直流电)为处理器1101和光源1112提供电源,电源模块1106中可以包括可充电电池,可充电电池可以为处理器1101和光源1112提供电源。光源1112发出的光可以传输到调制器(或称为图像源)1111进行成像,从而形成图像光。此处,光源1112可以为上述任一实施例中的光学发射模组。
无线通信模块1107可以使得光学显示装置与外界进行无线通信,其可以提供无线局域网(wireless local area networks,WLAN)(如无线保真(wireless fidelity,Wi-Fi)网络),蓝牙(bluetooth,BT),全球导航卫星系统(global navigation satellite system,GNSS),调频(frequency modulation,FM),近距离无线通信技术(near Field communication,NFC),红外技术(infrared,IR)等无线通信的解决方案。无线通信模块1107可以是集成至少一个通信处理模块的一个或多个器件。无线通信模块1107经由天线接收电磁波,将电磁波信号调频以及滤波处理,将处理后的信号发送到处理器1101。无线通信模块1107还可以从处理器1101接收待发送的信号,对其进行调频,放大,经天线转为电磁波辐射出去。
另外,视频模块1105进行解码的视频数据除了通过视频接口1109输入之外,还可以通过无线通信模块1107以无线的方式接收或从外部存储器中读取,例如光学显示装置可以通过车内的无线局域网从终端设备或车载娱乐系统接收视频数据,光学显示装置还可以读取外部存储器中存储的音视频数据。
示例性地,光学显示装置可以包括但不限于HUD、投影机、显示器、车载显示屏、AR设备、或VR设备等,其中,AR设备可以包括但不限于AR眼镜或AR头盔等,VR 设备可以包括但不限于VR眼镜或VR头盔等。
基于上述描述的光学显示装置的结构和功能原理,本申请还可以提供一种终端设备。如下以终端设备为交通工具为例介绍。请参见图12a,为本申请提供的一种交通工具的可能的功能框架示意图。耦合到交通工具1200或包括在交通工具1200中的组件可以包括传感器系统1201、外围设备1203、电源1204、计算机系统1205、用户接口1206以及光学显示装置1207。交通工具1200的组件可以被配置为以与彼此互连和/或与耦合到各系统的其它组件互连的方式工作。例如,电源1204可以向交通工具1200的所有组件提供电力。计算机系统1205可以被配置为从传感器系统1201和外围设备1203接收数据并对它们进行控制。计算机系统1205还可以被配置为在用户接口1206上生成图像的显示并从用户接口1206接收输入。
传感器系统1201可以包括用于感测关于交通工具1200所位于的环境的信息等的若干个传感器。示例性地,传感器系统1201的传感器可以包括但不限于全球定位系统(global positioning system,GPS)、惯性测量单元(inertial measurement unit,IMU)、毫米波雷达、激光雷达、相机以及用于修改传感器的位置和/或朝向的制动器。毫米波雷达可利用无线电信号来感测交通工具1200的周边环境内的目标。在一些实施例中,除了感测目标以外,毫米波雷达还可用于感测目标的速度和/或前进方向。激光雷达可利用激光来感测交通工具1200所位于的环境中的目标。在一些实施例中,激光雷达可包括一个或多个激光源、扫描器以及一个或多个探测器,以及其他系统组件。相机可用于捕捉交通工具1200的周边环境的多个图像。相机可以是静态相机或视频相机。
传感器系统1201还可包括用于监视交通工具1200内部系统的传感器(例如,车内空气质量监测器、燃油量表、机油温度表等)。来自这些传感器中的一个或多个的传感器数据可用于检测对象及其相应特性(位置、形状、方向、速度等)。这种检测和识别是交通工具1200安全操作的关键功能。传感器系统1201还可以包括其它传感器。本申请对此不做具体限定。
外围设备1203可以被配置为允许交通工具1200与外部传感器、其它交通工具和/或用户交互。为此,外围设备1203可以包括例如无线通信系统、触摸屏、麦克风和/或扬声器。外围设备1203可以额外地或可替换地包括除了图12a所示出的组件以外的其他组件。本申请对此不做具体限定。
在一些实施例中,外围设备1203提供交通工具1200的用户与用户接口1206交互的手段。例如,触摸屏可向交通工具1200的用户提供信息。用户接口1206还可操作触摸屏来接收用户的输入。在其他情况中,外围设备1203可提供用于交通工具1200与位于车内的其它设备通信的手段。例如,麦克风可从交通工具1200的用户接收音频(例如,语音命令或其他音频输入)。类似地,扬声器可向交通工具1200的用户输出音频。无线通信系统可以直接地或者经由通信网络来与一个或多个设备无线通信。例如,无线通信系统可使用3G蜂窝通信,例如码分多址(code division multiple access,CDMA)、EVDO、全球移动通信系统(global system for mobile communications,GSM)/通用分组无线服务技术(general packet radio service,GPRS),或者4G蜂窝通信,例如长期演进(long term evolution,LTE),或者5G蜂窝通信。无线通信系统可利用WiFi与无线局域网(wireless local area network,WLAN)通信。在一些实施例中,无线通信系统可利用红外链路、蓝牙或ZigBee与设备直接通信。也可以使用其他无线协议,例如各种交通工具通信系统:无线通信系统可包括一个或多个专用短程通信(dedicated short range communications,DSRC)设备,这些设备可实现交通工具和/或路边台站之间的公共和/或私有数据通信。
电源1204可以被配置为向交通工具1200的一些或全部组件提供电力。为此,电源1204可以包括例如可再充电锂离子或铅酸电池。在一些示例中,一个或多个电池组可被配置为提供电力。其它电源材料和配置也是可能的。在一些示例中,电源1204和能量源可以一起实现,如一些全电动车中那样。交通工具1200的组件可以被配置为以与在其各自的系统内部和/或外部的其它组件互连的方式工作。为此,交通工具1200的组件和系统可以通过系统总线、网络和/或其它连接机制通信地链接在一起。
交通工具1200的部分或所有功能受计算机系统1205控制。计算机系统1205可包括至少一个处理器12051,处理器执行存储在例如存储器12052这样的非暂态计算机可读介质中的指令。计算机系统1205还可以是采用分布式方式控制交通工具1200的个体组件或子系统的多个计算设备。
处理器12051可以参见上述图11中的处理器1101的介绍,存储器12052可参见上述图11中的内部存储器1103的介绍,此处不再赘述。尽管图12a功能性地图示了处理器、存储器、和在相同块中的计算机系统1205的其它元件,但是本领域的普通技术人员应该理解该处理器、计算机、或存储器实际上可以包括可以或者可以不存储在相同的物理外壳内的多个处理器、计算机、或存储器。例如,存储器可以是硬盘驱动器或位于不同于计算机系统1205的外壳内的其它存储介质。因此,对处理器或计算机的引用将被理解为包括对可以或者可以不并行操作的处理器或计算机或存储器的集合的引用。不同于使用单一的处理器来执行此处所描述的步骤,诸如转向组件和减速组件的一些组件每个都可以具有其自己的处理器,处理器只执行与特定于组件的功能相关的计算。
在此处所描述的各个方面中,处理器可以位于远离该交通工具并且与该交通工具进行无线通信。在其它方面中,此处所描述的过程中的一些在布置于交通工具内的处理器上执行而其它则由远程处理器执行,包括采取执行单一操纵的必要步骤。
在一些实施例中,存储器12052可包含指令(例如,程序逻辑),指令可被处理器12051执行来执行交通工具1200的各种功能,包括以上描述的那些功能。存储器也可包含额外的指令,包括向传感器系统1201和外围设备1203中的一个或多个发送数据、从其接收数据、与其交互和/或对其进行控制的指令。
除了指令以外,存储器还可存储数据,例如道路地图,路线信息,传感器检测到的数据,交通工具的位置、方向、速度以及其它这样的交通工具数据,以及其他信息。这种信息可在交通工具1200在自主、半自主和/或手动模式中操作期间被交通工具1200和计算机系统1205使用。
用户接口1206,用于向交通工具1200的用户提供信息或从其接收信息。可选地,用户接口1206可包括在外围设备1203的集合内的一个或多个输入/输出设备,例如无线通信系统、触摸屏、麦克风和扬声器。
光学显示装置1207可参见前述相关介绍,此处不再赘述。需要说明的是,光学显示装置中的部分元件的功能也可以由交通工具的其它子系统来实现,例如,控制组件也可以为控制系统1202中的元件。换言之,控制组件的功能可由控制系统1202中的元件实现。
可选地,上述这些组件中的一个或多个可与交通工具1200分开安装或关联。例如,存储器可以部分或完全地与交通工具1200分开存在。上述组件可以按有线和/或无线方式来通信地耦合在一起。
需要说明的是,图12a给出的交通工具功能框架只是一个示例,在其它示例中,交通工具1200可以包括更多、更少或不同的系统,并且每个系统可以包括更多、更少或不同的组件。此外,示出的系统和组件可以按任意种的方式进行组合或划分,本申请对此不做具体限定。
上述交通工具1200可以为轿车、卡车、公共汽车、船、飞机、直升飞机、娱乐车、施工设备、电车和火车等,本申请对此不作限定。
以交通工具为车辆为例,请参阅图12b,为本申请提供的一种车辆部分结构的简化示意图。该车辆可包括HUD装置和风挡。该HUD装置可位于方向盘下方,例如可以位于方向盘下方的驾驶侧仪表台(instrument panel,IP)里面。HUD装置可以是上述任意实施例中的HUD装置。
应理解,图12b所示的硬件结构仅是一个示例。本申请所适用的车辆可以具有比图12b中所示车辆更多的或者更少的部件,可以组合两个或更多的部件,或者可以具有不同的部件配置。例如,该车辆还可以包括其他器件,例如处理器、存储器、无线通信装置和传感器等。
基于上述内容和相同的构思,本申请提供一种图像显示方法,请参阅图13的介绍。该图像显示方法可应用于上述图3至图8任一实施例所示的光学发射模组。也可以理解为,可以基于上述图3至图8任一实施例所示的光学发射模组来实现图像显示方法。或者,该图像显示方法也可以应用于上述图10至图11所示的光学显示装置或图12a所示的交通工具或图12b所示的车辆。
该图像显示方法可由控制装置执行,该控制装置可以属于光学显示装置或终端设备,或者也可以是独立于光学显示装置或终端设备的控制装置,例如芯片或芯片系统等。当该控制装置属于终端设备(如车辆)时,该控制装置可以是车辆中的域处理器,或者也可以是车辆中的电子控制单元(electronic control unit,ECU)等。
如图13所示,该图像显示方法可包括以下步骤:
步骤1301,控制光学发射模组的第一光源组件发射第一光束,并控制光学发射模组的透反组件的反射区域对准第一光束的传播光路。
其中,可获取第一图像的信息,根据第一图像的信息生成第一控制信号,并向第一光源组件发送第一控制信号,以控制第一光源组件发射第一光束。进一步,还可根据第一图像的信息生成第三控制信号,并向透反组件发送第三控制信号,以控制透反组件将反射区域对准第一光束的传播光路。其中,第一图像的信息可以包括但不限于待显示的第一图像的内容信息,例如可以是导航信息等,以实现基于第一图像的内容信息控制光源组件发射对应第一图像的第一光束。一种设计中,第一图像的信息还可以包括第一图像的第一指示信息(或称为第一标识信息),或者也可以理解为,第一图像的信息还可包括用于指示待显示的图像为第一图像的信息,以实现基于第一图像的信息控制透反组件的对准(例如转动);具体的,该第一指示信息可以通过第一图像的内容信息承载,也可以为独立于第一图像的内容信息的其它信息,本申请不做具体限定,以能用于控制透反组件的反射区域与第一光束的传播光路对准为准。第二图像的信息可以包括但不限于待显示的第二图像的内容信息,例如行驶速度信息、行驶里程信息、转速信息、温度信息、油量信息和车灯状态信息等,以实现基于第二图像的内容信息控制光源组件发射对应第二图像的第二光束。进一步,可选的,第二图像的信息还可以包括第二图像的第二指示信息(或称为第二标识信息),或者也可以理解为,第二图像的信息还可包括用于指示待显示的图像为第二图像的信息,以实现基于第二图像的信息控制透反组件的对准(例如转动);具体的,该第二指示信息可以通过第二图像的内容信息承载,也可以是独立于第二图像的内容信息的其它信息,本申请对此不作限定,凡是可以实现控制透反组件的透射区域与第二光束的传播光路对准的均在本申请的保护范围之内。需要说明的是,第一图像的信息也可以与第二图像的信息相同,本申请对此不作限定。示例性的,可根据第一图像的内容信息生成第一控制信号,根据第一指示信息生成第三控制信号;根据第二图像的内容信息生成第二控制信号,根据第二指示信息生成第四控制信号。
步骤1302,控制光学发射模组的第一光源组件发射第二光束,并控制透反组件的透射区域对准第二光束的传播光路。
在一种可能的实现方式中,可获取第二图像的信息,根据第二图像的信息生成第二控制信号,并向第一光源组件发送第二控制信号,以控制第一光源组件发射第二光束。进一步,还可根据第二图像的信息生成第四控制信号,并向透反组件发送第四控制信号,以控制透反组件的透射区域对准第二光束的传播光路。
需要说明的是,上述步骤1301和步骤1302之间的顺序也可以调换,即可先执行步骤1302后执行步骤1301,本申请对此不作限定。
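步骤1301和步骤1302中"由图像信息生成控制信号"的逻辑可以用如下Python草图示意。字段名 indicator/content 与信号名均为示例性假设,并非本申请限定的信号格式:

```python
def make_signals(image_info):
    """根据待显示图像的信息生成一对控制信号:
    第一/第二控制信号控制第一光源组件发射对应光束,
    第三/第四控制信号控制透反组件对准反射/透射区域。"""
    if image_info["indicator"] == "image1":               # 指示信息:待显示第一图像
        beam_signal = ("signal1", image_info["content"])  # 控制发射第一光束
        align_signal = "signal3"                          # 反射区域对准传播光路
    else:                                                 # 待显示第二图像
        beam_signal = ("signal2", image_info["content"])  # 控制发射第二光束
        align_signal = "signal4"                          # 透射区域对准传播光路
    return beam_signal, align_signal
```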
步骤1303,控制光学发射模组的第二反射模组转动。
其中,第一光束通过第二反射模组的转动在第一扩散组件上扫描形成第一图像,第二光束通过第二反射模组的转动在第二扩散组件上扫描形成第二图像。
在一种可能的实现方式中,还可获取第一图像对应的第一虚像的亮度以及所述第二图像对应的第二虚像的亮度;根据第一虚像的亮度确定输入第一光源、第二光源和第三光源的电流大小;根据第二虚像的亮度,确定输入第一光源、第二光源和第三光源的电流的大小。
具体的,第一虚像的亮度与输入三个光源的电流权重之间的关系、以及第二虚像的亮度与输入三个光源的电流权重之间的关系均可用下述表1表示。
表1 电流的权重与虚像的亮度之间的对应关系

电流权重          亮度
I11:I12:I13       B1
I21:I22:I23       B2
需要说明的是,通过表的形式表示电流权重与虚像亮度的对应关系仅是一种示例,还可以通过其它可能的形式,本申请对此不作限定。
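表1的查表关系可以用如下Python草图示意。其中的权重数值为虚构示例,仅说明"目标亮度→三个光源电流权重→各光源电流"的映射过程:

```python
# 示例性查找表:亮度档位 -> (R, G, B) 电流权重(对应表1中的 I11:I12:I13 等,数值为虚构)
CURRENT_WEIGHTS = {
    "B1": (0.30, 0.55, 0.15),
    "B2": (0.25, 0.60, 0.15),
}

def currents_for(brightness_level, total_current):
    """根据虚像的目标亮度档位与总电流,返回输入R、G、B三个光源的电流。"""
    w_r, w_g, w_b = CURRENT_WEIGHTS[brightness_level]
    return (total_current * w_r, total_current * w_g, total_current * w_b)
```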
基于上述内容和相同构思,图14和图15为本申请提供的可能的控制装置的结构示意图。这些控制装置可以用于实现上述方法实施例中如图13中的方法,因此也能实现上述方法实施例所具备的有益效果。在本申请中,该控制装置可以是上述光学发射模组中的控制组件,或者也可以是上述图10至图11所示的光学显示装置、图12a所示的交通工具或图12b所示的车辆中的处理器,或者也可以是其它独立的控制装置(如芯片)等。
如图14所示,该控制装置1400包括处理模块1401,进一步,还可包括收发模块1402。控制装置1400用于实现上述图13中所示的方法实施例中的方法。
当控制装置1400用于实现图13所示的方法实施例的方法时:处理模块1401控制光学发射模组的第一光源组件发射第一光束,控制光学反射模组的透反组件的反射区域对准第一光束的传播光路,并控制光学发射模组的第二反射模组转动,第一光束通过第二反射模组的转动在第一扩散组件上扫描形成第一图像;控制第一光源组件发射第二光束,控制透反组件的透射区域对准第二光束的传播光路,并控制第二反射模组转动,第二光束通过第二反射模组的转动在第二扩散组件上扫描形成第二图像。
应理解,本申请实施例中的处理模块1401可以由处理器或处理器相关电路组件实现,收发模块1402可以由接口电路等相关电路组件实现。
基于上述内容和相同构思,如图15所示,本申请还提供一种控制装置1500。该控制装置1500可包括处理器1501,进一步,可选的,还可包括接口电路1502。处理器1501和接口电路1502之间相互耦合。可以理解的是,接口电路1502可以为输入输出接口。可选地,控制装置1500还可包括存储器1503,用于存储处理器1501执行的计算机程序或指令等。
当控制装置1500用于实现图13所示的方法时,处理器1501用于执行上述处理模块1401的功能,接口电路1502用于执行上述收发模块1402的功能。其中,处理器1501可参见上述图11中处理器1101的介绍,此处不再赘述。
基于上述内容和相同构思,本申请提供一种芯片。该芯片可包括处理器和接口电路,进一步,可选的,该芯片还可包括存储器,处理器用于执行存储器中存储的计算机程序或指令,使得芯片执行上述图13中任意可能的实现方式中的方法。
本申请的实施例中的方法步骤可以通过硬件的方式来实现,也可以由处理器执行软件指令的方式来实现。软件指令可以由相应的软件模块组成,软件模块可以被存储在存储器中,存储器可参见上述图11中的存储器的介绍,此处不再赘述。
在上述实施例中,可以全部或部分地通过软件、硬件、固件或者其任意组合来实现。当使用软件实现时,可以全部或部分地以计算机程序产品的形式实现。计算机程序产品包括一个或多个计算机程序或指令。在计算机上加载和执行计算机程序或指令时,全部或部分地执行本申请实施例的流程或功能。计算机可以是通用计算机、专用计算机、计算机网络或者其它可编程装置。计算机程序或指令可以存储在计算机可读存储介质中,或者从一个计算机可读存储介质向另一个计算机可读存储介质传输,例如,计算机程序或指令可以从一个网站站点、计算机、服务器或数据中心通过有线或无线方式向另一个网站站点、计算机、服务器或数据中心进行传输。计算机可读存储介质可以是计算机能够存取的任何可用介质或者是集成一个或多个可用介质的服务器、数据中心等数据存储设备。可用介质可以是磁性介质,例如,软盘、硬盘、磁带;也可以是光介质,例如,数字视频光盘(digital video disc,DVD);还可以是半导体介质,例如,固态硬盘(solid state drive,SSD)。
在本申请的各个实施例中,如果没有特殊说明以及逻辑冲突,不同的实施例之间的术语和/或描述具有一致性、且可以相互引用,不同的实施例中的技术特征根据其内在的逻辑 关系可以组合形成新的实施例。
本申请中,“均匀”不是指绝对的均匀,可以允许有一定工程上的误差。“垂直”不是指绝对的垂直,可以允许有一定工程上的误差。“至少一个”是指一个或者多个,“多个”是指两个或两个以上。“和/或”,描述关联对象的关联关系,表示可以存在三种关系,例如,A和/或B,可以表示:单独存在A,同时存在A和B,单独存在B的情况,其中A,B可以是单数或者复数。“以下至少一项(个)”或其类似表达,是指这些项中的任意组合,包括单项(个)或复数项(个)的任意组合。例如,a,b或c中的至少一项(个),可以表示:a,b,c,“a和b”,“a和c”,“b和c”,或“a和b和c”,其中a,b,c可以是单个,也可以是多个。在本申请的文字描述中,字符“/”,一般表示前后关联对象是一种“或”的关系。在本申请的公式中,字符“/”,表示前后关联对象是一种“相除”的关系。另外,在本申请中,“示例性地”一词用于表示作例子、例证或说明。本申请中被描述为“示例”的任何实施例或设计方案不应被解释为比其它实施例或设计方案更优选或更具优势。或者可理解为,使用示例的一词旨在以具体方式呈现概念,并不对本申请构成限定。
可以理解的是,在本申请中涉及的各种数字编号仅为描述方便进行的区分,并不用来限制本申请的实施例的范围。上述各过程的序号的大小并不意味着执行顺序的先后,各过程的执行顺序应以其功能和内在逻辑确定。术语"第一"、"第二"等类似表述,是用于区别类似的对象,而不必用于描述特定的顺序或先后次序。此外,术语"包括"和"具有"以及他们的任何变形,意图在于覆盖不排他的包含,例如,包含了一系列步骤或单元。方法、系统、产品或设备不必限于清楚地列出的那些步骤或单元,而是可包括没有清楚地列出的或对于这些过程、方法、产品或设备固有的其它步骤或单元。
以上,仅为本申请的具体实施方式,但本申请的保护范围并不局限于此,任何熟悉本技术领域的技术人员在本申请揭露的技术范围内,可轻易想到变化或替换,都应涵盖在本申请的保护范围之内。因此,本申请的保护范围应以权利要求的保护范围为准。

Claims (26)

  1. 一种光学发射模组,其特征在于,包括第一光源组件、透反组件、第一反射组件和第二反射组件,所述透反组件包括N个反射区域和M个透射区域,所述N和M均为正整数;
    所述第一光源组件,用于发射第一光束和第二光束;
    所述透反组件的反射区域,用于将接收到的所述第一光束反射至所述第一反射组件;
    所述第一反射组件,用于将接收到的所述第一光束反射至所述第二反射组件;
    所述透反组件的透射区域,用于将接收到的所述第二光束透射至所述第二反射组件;
    所述第二反射组件,用于将来自所述第一反射组件的第一光束反射至第一扩散组件、并通过所述第二反射组件的转动在所述第一扩散组件上扫描形成第一图像,以及将来自所述透反组件的透射区域的第二光束反射至第二扩散组件、并通过所述第二反射组件的转动在所述第二扩散组件上扫描形成第二图像。
  2. 如权利要求1所述的模组,其特征在于,所述N个反射区域与所述M个透射区域交叉分布。
  3. 如权利要求1或2所述的模组,其特征在于,所述N个反射区域的面积相同或不同;和/或,
    所述M个透射区域的面积相同或不同。
  4. 如权利要求1~3任一项所述的模组,其特征在于,所述透反组件的N个反射区域的面积之和大于或等于所述M个透射区域的面积之和。
  5. 如权利要求1~4任一项所述的模组,其特征在于,所述第一反射组件包括反射镜。
  6. 如权利要求1~5任一项所述的模组,其特征在于,所述第二反射组件包括微机电系统MEMS振镜。
  7. 如权利要求1~6任一项所述的模组,其特征在于,所述透反组件包括色轮。
  8. 如权利要求1~7任一项所述的模组,其特征在于,所述第一光源组件,具体用于:
    根据接收到的第一控制信号,发射所述第一光束,所述第一控制信号是基于所述第一图像的信息生成的;或者,
    根据接收到的第二控制信号,发射所述第二光束,所述第二控制信号是基于所述第二图像的信息生成的。
  9. 如权利要求8所述的模组,其特征在于,所述透反组件,具体用于:
    根据接收到的第三控制信号,将所述透反组件的反射区域对准所述第一光束传播的光路,所述第三控制信号是基于所述第一图像的信息生成的;或者,
    根据接收到的第四控制信号,将所述透反组件的透射区域对准所述第二光束的传播光路,所述第四控制信号是基于所述第二图像的信息生成的。
  10. 如权利要求9所述的模组,其特征在于,所述透反组件的反射区域和透射区域的切换频率等于所述第一光源组件发射所述第一光束和所述第二光束的切换频率。
  11. 如权利要求1~10任一项所述的模组,其特征在于,所述第一光源组件包括第一光源、第二光源、第三光源和合光元件;
    所述第一光源,用于发射红光;
    所述第二光源,用于发射蓝光;
    所述第三光源,用于发射绿光;
    所述合光元件,用于混合所述红光、所述蓝光与所述绿光,得到所述第一光束或所述第二光束。
  12. 如权利要求11所述的模组,其特征在于,所述合光元件包括第一二向镜和第二二向镜;
    所述第一二向镜,用于反射来自所述第二光源的蓝光,并透射来自所述第三光源的绿光;
    所述第二二向镜,用于反射来自所述第一光源的红光,并透射所述第一二向镜透射的绿光,并透射所述第一二向镜反射的蓝光。
  13. 一种光学发射模组,其特征在于,包括第二光源组件、第三光源组件、第一反射组件和第二反射组件;
    所述第二光源组件,用于发射第一光束;
    所述第三光源组件,用于发射第二光束;
    所述第一反射组件,用于将来自所述第二光源组件的第一光束反射至所述第二反射组件;
    所述第二反射组件,用于将来自所述第一反射组件的第一光束反射至第一扩散组件、并通过所述第二反射组件的转动在所述第一扩散组件上扫描形成第一图像,以及将来自所述第三光源组件的所述第二光束反射至第二扩散组件、并通过所述第二反射组件的转动在所述第二扩散组件上扫描形成第二图像。
  14. 如权利要求13所述的模组,其特征在于,所述第一反射组件为反射镜。
  15. 如权利要求13或14所述的模组,其特征在于,所述第二反射组件为微机电系统MEMS振镜。
  16. 如权利要求13~15任一项所述的模组,其特征在于,所述第二光源组件,具体用于:
    根据接收到的第五控制信号,发射所述第一光束,所述第五控制信号是基于所述第一图像的信息生成的;和/或,
    所述第三光源组件,具体用于:
    根据接收到的第六控制信号,发射所述第二光束,所述第六控制信号是基于所述第二图像的信息生成的。
  17. 一种光学显示装置,其特征在于,包括光放大模组、以及如权利要求1~12任一项所述的光学发射模组或如权利要求13~16任一项所述的光学发射模组;
    所述光放大模组,用于将来自所述光学发射模组的所述第一图像和所述第二图像进行放大。
  18. 如权利要求17所述的装置,其特征在于,所述光放大模组包括以下任一项或任多项组合:
    至少一个曲面反射镜;
    至少一个柱面镜。
  19. 如权利要求17或18所述的装置,其特征在于,所述光学显示装置还包括所述第一扩散组件和所述第二扩散组件。
  20. 一种终端设备,其特征在于,包括风挡以及如权利要求17或18所述的光学显示装置;
    所述风挡,用于将来自所述光学显示装置的第一图像进行反射,并在第一位置形成第一虚像,以及将所述第二图像进行反射,并在第二位置形成第二虚像。
  21. 一种图像显示方法,其特征在于,包括:
    控制光学发射模组的第一光源组件发射第一光束,控制所述光学发射模组的透反组件的反射区域对准所述第一光束的传播光路,并控制所述光学发射模组的第二反射模组转动,所述第一光束通过所述第二反射模组的转动在第一扩散组件上扫描形成第一图像;
    控制所述第一光源组件发射第二光束,控制所述透反组件的透射区域对准所述第二光束的传播光路,并控制所述第二反射模组转动,所述第二光束通过所述第二反射模组的转动在第二扩散组件上扫描形成第二图像。
  22. 如权利要求21所述的方法,其特征在于,所述第一光源组件包括用于发射红光的第一光源、用于发射蓝光的第二光源和用于发射绿光的第三光源;
    所述方法还包括:
    获取所述第一图像对应的第一虚像的亮度、以及所述第二图像对应的第二虚像的亮度;
    根据所述第一虚像的亮度确定输入所述第一光源、所述第二光源和所述第三光源的电流大小;根据所述第二虚像的亮度,确定输入所述第一光源、所述第二光源和所述第三光源的电流的大小。
  23. 如权利要求21或22所述的方法,其特征在于,控制所述第一光源组件发射第一光束,包括:
    获取所述第一图像的信息;
    根据所述第一图像的信息生成第一控制信号,所述第一控制信号用于控制所述第一光源组件发射所述第一光束;
    向所述第一光源组件发送所述第一控制信号。
  24. 如权利要求21~23任一项所述的方法,其特征在于,控制所述第一光源组件发射第二光束,包括:
    获取所述第二图像的信息;
    根据所述第二图像的信息生成第二控制信号,所述第二控制信号用于控制所述第一光源组件发射第二光束;
    向所述第一光源组件发送所述第二控制信号。
  25. 一种芯片,其特征在于,包括至少一个处理器和接口电路,所述芯片用于执行如权利要求21~24中的任一项所述方法。
  26. 一种计算机可读存储介质,其特征在于,所述计算机可读存储介质中存储有计算机程序或指令,当所述计算机程序或指令被控制装置执行时,使得所述控制装置执行如权利要求21~24中任一项所述的方法。
PCT/CN2022/085666 2022-04-07 2022-04-07 光学发射模组、光学显示装置、终端设备及图像显示方法 WO2023193210A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/CN2022/085666 WO2023193210A1 (zh) 2022-04-07 2022-04-07 光学发射模组、光学显示装置、终端设备及图像显示方法


Publications (1)

Publication Number Publication Date
WO2023193210A1 true WO2023193210A1 (zh) 2023-10-12

Family

ID=88243768


Country Status (1)

Country Link
WO (1) WO2023193210A1 (zh)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2016133746A (ja) * 2015-01-22 2016-07-25 株式会社Jvcケンウッド 描画装置及び描画方法
JP2017194548A (ja) * 2016-04-20 2017-10-26 三菱電機株式会社 表示装置
CN107894660A (zh) * 2016-10-04 2018-04-10 矢崎总业株式会社 车辆用显示装置
US20180373027A1 (en) * 2016-02-09 2018-12-27 Miho Higuchi Image display device and image display method
CN112578566A (zh) * 2020-12-28 2021-03-30 广景视睿科技(深圳)有限公司 一种投影光学系统及汽车的抬头显示装置
WO2021246232A1 (ja) * 2020-06-05 2021-12-09 株式会社小糸製作所 車両用表示装置



Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22936153

Country of ref document: EP

Kind code of ref document: A1