WO2023184276A1 - A display method, display system and terminal device - Google Patents

A display method, display system and terminal device (一种显示方法、显示系统和终端设备)

Info

Publication number
WO2023184276A1
Authority
WO
WIPO (PCT)
Application number
PCT/CN2022/084201
Other languages
English (en)
French (fr)
Inventor
翁德正
徐彧
Original Assignee
华为技术有限公司 (Huawei Technologies Co., Ltd.)
Application filed by 华为技术有限公司
Priority to PCT/CN2022/084201
Publication of WO2023184276A1

Classifications

    • G PHYSICS / G02 OPTICS / G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 17/00 Systems with reflecting surfaces, with or without refracting elements; G02B 17/02 Catoptric systems, e.g. image erecting and reversing system; G02B 17/06 Catoptric systems using mirrors only, i.e. having only one curved mirror
    • G02B 27/00 Optical systems or apparatus not provided for by any of the groups G02B 1/00 - G02B 26/00, G02B 30/00; G02B 27/01 Head-up displays
    • G02B 27/10 Beam splitting or combining systems; G02B 27/14 Beam splitting or combining systems operating by reflection only

Definitions

  • Embodiments of the present application relate to the field of intelligent driving, and more specifically, to a display method, a display system and a terminal device.
  • The head-up display first appeared on fighter jets: to keep pilots from losing concentration by frequently looking down at instruments, important information was projected onto a piece of transparent glass in front of the line of sight. The technology was also introduced into vehicles a few years ago.
  • The imaging principle of a HUD is to generate an image with an image generation unit (PGU), increase the optical depth of field and magnification with a reflector, and then project the result onto the windshield to form a virtual image with depth of field in front of the vehicle.
  • Embodiments of the present application provide a display method, a display system, and a terminal device, which help reduce the size and cost of the display system.
  • the terminal device in the embodiment of this application may include a vehicle, a helmet, an aircraft or a drone, etc.
  • The vehicle in this application (sometimes referred to simply as a vehicle) is a vehicle in the broad sense, which can be a means of transportation (such as a car, truck, motorcycle, train, airplane, subway, or ship), an industrial vehicle (such as a forklift, trailer, or tractor), an engineering vehicle (such as an excavator, bulldozer, or crane), agricultural equipment (such as a lawn mower or harvester), amusement equipment, a toy vehicle, etc.
  • This application does not limit the types of vehicles.
  • In a first aspect, a display system is provided, including an image generation unit PGU and a reflective component, wherein the PGU is used to send a first light and a second light to the reflective component, the first light being light corresponding to visible light and the second light being light corresponding to invisible light; the reflective component is used to reflect the received first light to a first display device and to reflect the second light to a second display device including fluorescent material; the first light passes through the first display device to form a virtual image, and the second light passes through the second display device to form a real image.
  • a first light corresponding to visible light and a second light corresponding to invisible light are generated by the PGU.
  • the first light passes through the first display device to form a virtual image
  • the second light passes through the second display device to form a real image.
  • the display device may include a HUD, a side window display system, and a rear windshield display system.
  • the PGU includes an invisible light laser diode and a laser diffusion film
  • the laser diffusion film includes a first region, wherein the PGU is used to pass the invisible light beam emitted by the invisible light laser diode through the first region to generate the second light.
  • an invisible light laser diode is added to the PGU, and the second light beam is generated by the invisible light beam emitted by the invisible light laser diode, so that a real image is finally formed through the second display device.
  • the first area does not include a micro lens array (MLA).
  • the PGU further includes a visible light laser diode
  • the laser diffusion film further includes a second region
  • the second region includes an MLA
  • the PGU is used to pass the visible light beam emitted by the visible light laser diode through the second area to form the first light.
  • the first light is generated by the visible light laser diode in the PGU, thereby finally forming a virtual image through the first display device.
  • the first light and the second light can be formed through different areas of the laser diffusion film. In this way, by adding an invisible light laser diode to the PGU and changing the structure of the laser diffusion film, two images with different focal lengths can be displayed, which helps to reduce the volume and cost of the display system.
  • the PGU further includes a light combining system and a micro-electromechanical system MEMS, wherein the light combining system is used to combine the invisible light beam and the visible light beam into a first beam and send the first beam to the MEMS; the MEMS is used to scan the first beam to obtain the first scanning light and the second scanning light; the MEMS is also used to project the first scanning light to the second area and the second scanning light to the first area.
  • the technical solutions of the embodiments of this application can be applied to HUDs with a laser beam scanning (LBS) architecture.
  • the invisible light beam and the visible light beam are combined into one beam through the light combining system, and the combined beam is sent to the MEMS for scanning.
  • the first scanning light and the second scanning light can be obtained.
  • the first light and the second light can be obtained and finally reflected to the first display device and the second display device through the reflective component, making it possible to display two images with different focal lengths, which helps to reduce the size and cost of the display system.
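  • For illustration only (not part of the disclosure), the dataflow described above can be sketched in code; the class and method names below are invented for the example.

```python
from dataclasses import dataclass
from enum import Enum

class Band(Enum):
    VISIBLE = "visible"      # first light: forms the far-focus virtual image
    INVISIBLE = "invisible"  # second light: forms the near-focus real image

@dataclass
class Ray:
    band: Band
    payload: str  # e.g. navigation image information or instrument image information

class ReflectiveComponent:
    """Routes each received ray to the display device that images it."""
    def reflect(self, ray: Ray) -> str:
        if ray.band is Band.VISIBLE:
            return f"virtual image via first display device: {ray.payload}"
        # Invisible light reaches the fluorescent-material display device and is
        # excited into visible light there, so a real image appears on that device.
        return f"real image via second display device: {ray.payload}"

class PGU:
    """Emits one visible ray and one invisible ray, as in the first aspect."""
    def emit(self) -> list[Ray]:
        return [Ray(Band.VISIBLE, "navigation image information"),
                Ray(Band.INVISIBLE, "instrument image information")]

reflector = ReflectiveComponent()
for ray in PGU().emit():
    print(reflector.reflect(ray))
```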
  • the first scanning light is a scanning light corresponding to the visible light beam
  • the second scanning light is a scanning light corresponding to the invisible light beam
  • the display system further includes a control device, wherein the control device is used to control the laser diffusion film to be in the first position, wherein when the laser diffusion film is in In the first position, the first scanning light can be projected to the second area and the second scanning light can be projected to the first area.
  • a control device is added to the display system, so that the position of the laser diffusion film is controlled by the control device.
  • the first scanning light can be projected to the second area and the second scanning light can be projected to the first area, so that two images with different focal lengths can be displayed.
  • the invisible light laser diode includes an infrared laser diode.
  • the infrared ray emitted by the infrared laser diode forms the second light, and the second light is reflected by the reflective component to the second display device including fluorescent material, so that a real image can be presented. After the second light encounters the fluorescent material, the invisible light can be excited into visible light, thereby displaying a real image on the second display device.
  • the invisible light laser diode can also be a laser diode that can emit radio waves, microwaves, infrared light, ultraviolet light, x-rays or gamma rays, etc.
  • the first region is a hollow structure.
  • the first region of the laser diffusion film is designed as a hollow structure, which helps to reduce the design complexity of the laser diffusion film and also helps to reduce the cost of the laser diffusion film.
  • the second scanning light and the second light can be the same light if the first region is a hollow structure.
  • the above first region can be a hollow structure; or, the first region can also be a region without optical design, that is, the laser diffusion film can only include the second region.
  • the laser diffusion film includes the second region, and the partial region other than the second region can be regarded as the “first region” mentioned above.
  • the first light carries navigation image information; and/or the second light carries instrument image information.
  • the display system is a head-up display system HUD, wherein the first display device is a windshield.
  • the technical solution of the embodiment of the present application can be applied to a head-up display system, which helps the user in the driver's or passenger's position to see the virtual image and the real image respectively through the first display device and the second display device.
  • the second display device is located in the windshield.
  • the second display device can be located in the windshield, which helps the user in the driver's or passenger's position to see a virtual image outside the cockpit and a real image on the windshield without increasing the size of the HUD. And on the premise of reducing costs, it helps to improve the user experience.
  • the fluorescent material is located in a polyvinyl butyral (PVB) film of the windshield.
  • the second display device may be a fluorescent film, and the fluorescent film may be attached to the side of the windshield close to the cabin.
  • In a second aspect, a display method is provided, including: controlling the image generation unit PGU to send a first light and a second light to the reflective component, where the first light is light corresponding to visible light and the second light is light corresponding to invisible light; and controlling the reflective component to reflect the received first light to the first display device and to reflect the second light to a second display device including fluorescent material, where the first light forms a virtual image through the first display device and the second light forms a real image through the second display device.
  • the PGU includes an invisible light laser diode and a laser diffusion film
  • the laser diffusion film includes a first region
  • controlling the image generation unit PGU to send the first light and the second light to the reflective component includes: controlling the invisible light beam emitted by the invisible light laser diode to pass through the first area to form the second light.
  • the PGU further includes a visible light laser diode
  • the laser diffusion film further includes a second area
  • the second area includes a micro lens array (MLA)
  • controlling the image generation unit PGU to send the first light and the second light to the reflective component includes: controlling the visible light beam emitted by the visible light laser diode to pass through the second area to form the first light.
  • the PGU further includes a light combining system and a micro-electromechanical system MEMS; the invisible light beam and the visible light beam pass through the light combining system to form a first light beam, and the light combining system can project the first light beam to the MEMS
  • controlling the image generation unit PGU to send the first light and the second light to the reflective component includes: controlling the MEMS to scan the first light beam to obtain the first scanning light and the second scanning light, and controlling the MEMS to project the first scanning light to the second area and the second scanning light to the first area.
  • the laser diffusion film is in a first position, wherein when the laser diffusion film is in the first position, the first scanning light can be projected to the second area and the second scanning light can be projected to the first area.
  • the first position may be a fixed position.
  • the method further includes: controlling the laser diffusion film to be in the first position.
  • the invisible light laser diode includes an infrared laser diode.
  • the first region is a hollow structure.
  • the first light carries navigation image information; and/or the second light carries instrument image information.
  • the display system is a head-up display system HUD, wherein the first display device is a windshield.
  • the second display device is located in the windshield.
  • the fluorescent material is located in the polyvinyl butyral PVB film of the windshield.
  • a third aspect provides a terminal device, which includes the display system in any possible implementation of the first aspect.
  • the terminal device is a vehicle.
  • a fourth aspect provides a terminal device, which includes a processor and a display system, wherein the display system includes the display system in any of the possible implementations of the first aspect.
  • the terminal device is a vehicle.
  • a computer-readable medium stores program code.
  • When the computer program code is run on a terminal device, it causes the terminal device to execute the method in the second aspect.
  • a computer program product includes: computer program code.
  • When the computer program code is run on a terminal device, it causes the terminal device to execute the method in the second aspect.
  • Embodiments of the present application provide a chip system.
  • the chip system includes a processor for calling a computer program or computer instructions stored in a memory, so that the processor executes the method in the second aspect.
  • the processor is coupled with the memory through an interface.
  • the chip system further includes a memory, and a computer program or computer instructions are stored in the memory.
  • In an eighth aspect, a display system is provided, including an image generation unit PGU and a reflective component, wherein the PGU is used to send a second light to the reflective component, and the second light is light corresponding to invisible light;
  • the reflective component is used to reflect the received second light to a second display device including fluorescent material, and the second light passes through the second display device to form a real image.
  • the PGU further includes a micro-electromechanical system MEMS, wherein the MEMS is used to scan the invisible light beam to obtain the second light; the MEMS is also used to project the second light to the reflective component.
  • the display system is a head-up display system, wherein the second display device is located in the windshield.
  • the second display device may be a fluorescent film, and the fluorescent film may be attached to a side of the windshield close to the cabin.
  • the fluorescent material is located in the PVB film of the windshield.
  • Figure 1 is a schematic functional block diagram of a vehicle provided by an embodiment of the present application.
  • Figure 2 is a schematic structural diagram of a windshield-type head-up display system provided by an embodiment of the present application.
  • FIG. 3 is a schematic structural diagram of the PGU.
  • Figure 4 is a schematic diagram of scanning through the fast and slow axes.
  • Figure 5 is a schematic diagram of the HUD system architecture used to form virtual images at two different locations.
  • Figure 6 is a schematic structural diagram of a display system provided by an embodiment of the present application.
  • Figure 7 is another schematic structural diagram of a PGU provided by an embodiment of the present application.
  • Figure 8 is a schematic structural diagram of a laser diffusion film provided by an embodiment of the present application.
  • FIG. 9 is a schematic structural diagram of the windshield provided in the embodiment of the present application.
  • FIG. 10 is another schematic structural diagram of a display system provided by an embodiment of the present application.
  • FIG 11 is a schematic diagram of the MEMS scanning process provided by the embodiment of the present application.
  • Figure 12 is a schematic diagram of the display effect of the bifocal HUD provided by the embodiment of the present application.
  • Figure 13 is another structural schematic diagram of a display system provided by an embodiment of the present application.
  • Figure 14 is a schematic diagram of the display effect of a single-focus HUD provided by an embodiment of the present application.
  • Figure 15 is a schematic diagram of another display effect of a single-focus HUD provided by an embodiment of the present application.
  • FIG. 16 is a schematic diagram of switching between bifocal mode and single-focal mode provided by an embodiment of the present application.
  • Figure 17 is another structural schematic diagram of a display system provided by an embodiment of the present application.
  • Figure 18 is a schematic diagram of the display effect of the audio and video mode provided by the embodiment of the present application.
  • Figure 19 is another schematic structural diagram of a PGU provided by an embodiment of the present application.
  • Figure 20 is another schematic structural diagram of a PGU provided by an embodiment of the present application.
  • Figure 21 is a schematic diagram of the effect of the technical solution of the embodiment of the present application applied to the vehicle side window display system.
  • Figure 22 is another schematic diagram of the effect of the technical solution of the embodiment of the present application applied to the vehicle side window display system.
  • Figure 23 is a schematic diagram of the effect of the technical solution of the embodiment of the present application applied to the vehicle rear windshield display system.
  • Figure 24 is a schematic block diagram of a display system provided by an embodiment of the present application.
  • Figure 25 is a schematic flow chart of a display method provided by an embodiment of the present application.
  • Prefixes such as “first” and “second” are used in the embodiments of this application only to distinguish different description objects, and have no limiting effect on the position, order, priority, quantity or content of the described objects.
  • the use of ordinal words and other prefixes to distinguish the described objects does not limit the described objects, nor do these words constitute a redundant restriction.
  • plural means two or more.
  • FIG. 1 is a functional block diagram of a vehicle 100 provided by an embodiment of the present application.
  • the vehicle 100 may include a perception system 120 , a display device 130 , and a computing platform 150 , where the perception system 120 may include several types of sensors that sense information about the environment surrounding the vehicle 100 .
  • the sensing system 120 may include one or more of a positioning system (for example, a global positioning system (GPS), a BeiDou system or another positioning system), an inertial measurement unit (IMU), a lidar, a millimeter-wave radar, an ultrasonic radar, and a camera device.
  • the computing platform 150 may include processors 151 to 15n (n is a positive integer).
  • the processor is a circuit with signal processing capabilities.
  • For example, the processor may be a circuit with instruction reading and execution capabilities, such as a central processing unit (CPU), a microprocessor, a graphics processing unit (GPU), or a digital signal processor (DSP).
  • Alternatively, the processor can realize certain functions through the logical relationship of a hardware circuit, where the logical relationship of the hardware circuit is fixed or reconfigurable; for example, the processor may be a hardware circuit implemented as an application-specific integrated circuit (ASIC) or a programmable logic device (PLD), such as an FPGA.
  • the process of the processor loading the configuration file and realizing the hardware circuit configuration can be understood as the process of the processor loading instructions to realize the functions of some or all of the above units.
  • It can also be a hardware circuit designed for artificial intelligence, which can be understood as an ASIC, such as a neural network processing unit (NPU), a tensor processing unit (TPU), or a deep learning processing unit (DPU), etc.
  • the computing platform 150 may also include a memory, which is used to store instructions. Some or all of the processors 151 to 15n may call instructions in the memory and execute the instructions to implement corresponding functions.
  • the display device 130 in the cockpit is mainly divided into two categories.
  • the first category is a vehicle-mounted display screen;
  • the second category is a projection display screen, such as a head-up display (HUD).
  • the vehicle display screen is a physical display screen and an important part of the vehicle infotainment system.
  • There can be multiple display screens in the cockpit, such as the digital instrument display, the central control screen, the display in front of the passenger in the co-pilot seat (also known as the front passenger), the display in front of the left rear passenger, and the display in front of the right rear passenger; even a car window can be used as a display.
  • A head-up display, also known as a head-up display system, is mainly used to display driving information such as speed and navigation on a display device in front of the driver (such as the windshield). This reduces the driver's gaze-shift time, avoids pupil changes caused by gaze shifts, and improves driving safety and comfort.
  • HUD includes combined head-up display (combiner-HUD, C-HUD) system, windshield-type head-up display (windshield-HUD, W-HUD) system, augmented reality head-up display (augmented reality HUD, AR-HUD) system, etc.
  • Figure 2 shows a schematic structural diagram of a W-HUD system provided by an embodiment of the present application. As shown in Figure 2, the system includes a HUD body 210, a PGU 220, a first reflective component 230 (for example, a plane mirror or a curved mirror), a second reflective component 240 (for example, a plane mirror or a curved mirror), and a windshield 250.
  • The optical imaging principle of the W-HUD system is as follows: the image generated by the PGU is reflected to the eye box 260 through the first reflective component 230, the second reflective component 240 and the windshield 250; because the optical system increases the optical path length and provides magnification, the user can see a virtual image 270 with depth of field outside the car.
  • W-HUD systems are widely used in vehicle assisted driving, such as vehicle speed, warning lights, navigation, and advanced driving assist system (ADAS).
  • the eye box usually refers to the range within which the driver's eyes can see the entire displayed image, as shown in Figure 2.
  • The typical eye box size is 130 mm × 50 mm, that is, the driver's eyes can move within a range of about 130 mm laterally and 50 mm vertically. If the driver's eyes are within the eye box, a complete and clear image can be seen; if the driver's eyes move beyond the eye box, the image may be distorted or not visible.
  • the virtual image distance refers to the distance between the center of the eyebox and the center of the virtual image.
  • the angle formed by the left and right boundaries of the virtual image to the driver's eyes is the horizontal field of view angle, and the angle formed by the upper and lower boundaries of the projected virtual image to the driver's eyes is the vertical field of view angle.
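  • As a worked example (not from the patent; the numbers below are assumptions), the horizontal and vertical field-of-view angles defined above can be computed from the virtual image size and the virtual image distance.

```python
import math

def field_of_view_deg(image_extent_mm: float, virtual_image_distance_mm: float) -> float:
    """Angle subtended at the center of the eye box by one pair of virtual-image boundaries."""
    return math.degrees(2 * math.atan((image_extent_mm / 2) / virtual_image_distance_mm))

# Assumed example: a virtual image of 1310 mm x 460 mm at a virtual image distance of 7500 mm.
vid_mm = 7500
horizontal_fov = field_of_view_deg(1310, vid_mm)  # left/right boundaries -> horizontal FOV
vertical_fov = field_of_view_deg(460, vid_mm)     # upper/lower boundaries -> vertical FOV
print(f"horizontal FOV ~ {horizontal_fov:.1f} deg, vertical FOV ~ {vertical_fov:.1f} deg")
```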
  • The PGU can adopt different architectures. A TFT-LCD (thin film transistor liquid crystal display) architecture can be composed of a light-emitting diode (LED) light source and a TFT-LCD; its principle is similar to that of a liquid crystal display (LCD) screen: the white light source is divided into red, green, and blue colors using a prism, and the projection effect is achieved through the liquid crystal unit.
  • The DLP architecture can be composed of LEDs and a digital micro-mirror device (DMD); its principle is that the DMD, with its ultra-micro mirrors, can project the strong light source after reflection.
  • The LBS (laser beam scanning) architecture uses laser diodes together with a laser diffusion film (diffuser).
  • FIG. 3 shows a schematic structural diagram of the PGU.
  • This PGU adopts the LBS-with-laser-diffusion-film architecture.
  • The LBS includes a red laser diode (R LD) 310, a blue laser diode (B LD) 320, a green laser diode (G LD) 330, a light combining system (for example, a polarizing beam splitter (PBS)) 340 and a micro-electro-mechanical system (MEMS) 350.
  • the principle is that red, blue, and green (RGB) three-color laser light passes through PBS and becomes a combined beam of light.
  • the combined beam of light is incident on the MEMS for fast and slow axis rotation scanning.
  • the combined beam of light is scanned by MEMS and the scanning light is projected to the laser diffusion film 360 to generate an image.
  • LBS determines the FOV and focal length of the image; the laser diffusion film determines the optical diffusion angle, chief ray angle (CRA), resolution and eye box specifications.
  • the visible light beams emitted by the red laser diode 310, the blue laser diode 320, and the green laser diode 330 can enter the light combining system 340 after passing through corresponding collimators.
  • the function of the collimating lens may be to make the light emitted by the laser diode become parallel light.
  • Figure 4 shows a schematic diagram of scanning through the fast and slow axes.
  • the fast axis is the horizontal axis (for example, the fast axis can be scanned with a sine wave);
  • the slow axis is the vertical axis (for example, the slow axis can be scanned with a triangular wave).
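  • The fast-axis/slow-axis scanning described above can be sketched as follows (illustrative only; the waveform choices follow the example in the text, while the frequencies are assumed).

```python
import numpy as np

def mems_scan_trajectory(t: np.ndarray, f_fast: float, f_slow: float):
    """Normalized (x, y) mirror deflections at times t.

    Fast (horizontal) axis: sine wave; slow (vertical) axis: triangular wave.
    """
    x = np.sin(2 * np.pi * f_fast * t)                    # fast axis, sinusoidal scan
    y = 2 * np.abs(2 * ((f_slow * t) % 1.0) - 1.0) - 1.0  # slow axis, triangle wave in [-1, 1]
    return x, y

# Assumed example: 27 kHz fast axis, 60 Hz slow axis (about 60 frames per second).
t = np.linspace(0.0, 1.0 / 60.0, 1000)  # one slow-axis period
x, y = mems_scan_trajectory(t, f_fast=27_000, f_slow=60)
```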
  • FIG. 5 shows a schematic diagram of the HUD system architecture for forming virtual images at two different locations.
  • the HUD includes PGU1 and PGU2, curved reflector 1 and curved reflector 2.
  • PGU1 and PGU2 need to be placed at specified positions.
  • the designated positions where PGU1 and PGU2 need to be placed are determined based on the virtual image distance of the two virtual images that need to be formed.
  • the current dual-PGU architecture has not yet reached the mass production stage, and the dual-PGU architecture will make the HUD bulky and costly.
  • Embodiments of the present application provide a display method, a display system and a terminal device.
  • a first light corresponding to visible light and a second light corresponding to invisible light are generated through the PGU.
  • the first light passes through the first display device to form a virtual image, and the second light forms a real image through the second display device.
  • The PGU may include visible light laser diodes (for example, an R LD, a B LD and a G LD) and invisible light laser diodes (for example, an infrared laser diode (IR LD)).
  • FIG. 6 shows a schematic structural diagram of a display system 600 provided by an embodiment of the present application.
  • the display system 600 includes a main control board 610, a control board 620, an LBS 630, a laser diffusion film 640 and a back-end optical module 650.
  • the main control board 610 can control the control board 620, the LBS 630, the laser diffusion film 640 and the back-end optical module 650 through an input/output (I/O) interface.
  • the control board 620 can be used to control the brightness and darkness of the laser emitted by the visible light laser diode and the invisible light laser diode in the LBS 630, and to control the motion behavior of the MEMS.
  • the rear optical module 650 may include the above-mentioned first reflective component 230 and the second reflective component 240 for controlling the optical path difference and optical magnification.
  • In the embodiment of this application, an invisible light laser diode can be added to the LBS 630 and matched with the original visible light laser diodes; through the laser diffusion film 640 and a PVB film with added phosphor particles, the function of a bifocal HUD can be realized and multiple scenes can be supported (for example, single focus mode, audio and video mode, etc.).
  • the main control board 610 can receive instructions sent by the computing platform 150 in FIG. 1 , and the instructions can be used to control one or more of the control board 620 , the laser diffusion film 640 and the back-end optical module 650 . After receiving the instruction, the main control board 610 can control one or more of the control board 620, the laser diffusion film 640, and the rear optical module 650.
  • the main control board 610 can also receive image information sent by the GPU in the computing platform 150 , where the image information is image information to be displayed by the display system 600 .
  • the main control board 610 can generate an instruction according to the image information, thereby controlling one or more of the control board 620, the laser diffusion film 640, and the rear optical module 650 through the instruction.
  • the display system 600 may also include a GPU.
  • the image information is sent to the main control board 610 through the GPU, so that the main control board 610 generates instructions based on the image information.
  • the position of the laser diffusion film may be fixed; or it may not be fixed.
  • the display system may further include a control device, and the control device may control the laser diffusion film to be in different positions.
  • Figure 7 shows another schematic structural diagram of a PGU provided by an embodiment of the present application.
  • This PGU further includes an invisible light laser diode (for example, an IR LD 370).
  • the visible light beams emitted by the red laser diode 310, the blue laser diode 320, and the green laser diode 330 and the invisible light beam emitted by the infrared laser diode 370 can enter the light combining system after passing through corresponding collimating mirrors.
  • FIG 8 shows a schematic structural diagram of a laser diffusion film provided by an embodiment of the present application.
  • the laser diffusion film can be divided into a first area and a second area.
  • the second area may have a micro lens array (MLA), and the first area may be transparent (or the first area may be an area without optical design or the first area may be a hollow structure).
  • Light corresponding to visible light can be projected into the second area, and light corresponding to invisible light can be projected into the first area.
  • In Figure 8, the case where the second area is located below and the first area is located above is taken as an example.
  • This embodiment of the present application does not limit this in any way.
  • the second area can be located above and the first area can be located below; or the second area can be located on the right and the first area can be located on the left; or the second area can be located on the left and the first area can be located on the right side; alternatively, the second area may be located on the outside and the first area may be located on the inside; alternatively, the second area may be located on the inside and the first area may be located on the outside.
  • Figure 9 shows a schematic structural diagram of the windshield provided in the embodiment of the present application.
  • The PVB film in the middle of the windshield is doped with substances (for example, phosphor particles) used to excite invisible light into visible light. When the invisible light in the image generated by the PGU is reflected to the windshield through the reflective component, it can excite visible light, thereby forming a near-focus real image displayed on the windshield.
  • FIG. 10 shows a schematic structural diagram of a display system 1000 provided by an embodiment of the present application.
  • the display system 1000 can be used to implement a bifocal HUD function.
  • the display system includes a HUD body 1010, an LBS 1020, a laser diffusion film 1030, a first reflective component 1040, a second reflective component 1050 and a windshield 1060.
  • the structure of the LBS 1020 can be shown in FIG. 7
  • the structure of the laser diffusion film 1030 can be shown in FIG. 8 .
  • The R LD, B LD, G LD and IR LD all emit laser light, and the lasers are combined into one beam.
  • After scanning by the MEMS, the scanning beam passes through the laser diffusion film 1030, the first reflective component 1040 and the second reflective component 1050, and the windshield 1060 finally presents a far-focus virtual image and a near-focus real image, thus realizing the function of a bifocal HUD.
  • the image formed at the near-focus position is a real image 1061; the image formed at the far-focus position is a virtual image 1070.
  • the imaging process of the far-focus virtual image and the near-focus real image is introduced below.
  • The imaging process of the far-focus virtual image: the visible light emitted by the R LD, B LD and G LD in the LBS 1020 is combined into one beam, and full-color display can be achieved by adjusting the energy ratio of each light source.
  • the scanning beam obtained after MEMS scanning will first pass through the second area of the laser diffusion film 1030 to become a real image, and the optical diffusion angle and chief ray angle of the light will be controlled by the MLA structure in the second area.
  • the light will pass through the first reflective component 1040, the second reflective component 1050, the windshield 1060 and other optical lens components (also called the rear optical module), whose function is to transmit the real image through the optical lens components to generate a virtual image of the target size at the target distance.
  • the optical path can be referred to the solid line shown in Figure 10: the real image generated by the laser diffusion film passes through the first reflective component 1040, the second reflective component 1050, and the windshield 1060 and then is reflected to the eye box 1080, so that the user can see the virtual image 1070 .
  • The imaging process of the near-focus real image: the infrared beam emitted by the IR LD and the visible light are combined into one beam.
  • After scanning by the MEMS, the infrared beam passes through the first area of the laser diffusion film, the first reflective component 1040 and the second reflective component 1050, and is projected onto the windshield, where visible light is excited by the phosphor particles in the PVB interlayer film.
  • the optical path can be referred to the dotted line shown in Figure 10: after the infrared scanning beam output by the LBS 1020 passes through the first area of the laser diffusion film 1030, the first reflective component 1040, and the second reflective component 1050, it passes through the PVB interlayer film of the windshield.
  • The phosphor particles excite monochromatic visible light, and the user can see a real image 1061 on the windshield. This process is similar to the principle of lighting: infrared rays are reflected by the reflective components onto the phosphor-doped PVB film to produce a specific single-color bright spot, and the on/off (light and dark) control of the IR LD is then used to display a monochrome image area on the windshield, forming the near-focus real image.
  • The PGU can calculate the optimal frame rate and the resolution in the vertical direction based on the frequency of the fast axis, while the resolution in the horizontal direction is related to the size of the laser spot.
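  • One common way to make the fast-axis/frame-rate relationship concrete (an assumption for illustration, not a formula from the patent): if each fast-axis period draws one scan line in each sweep direction, the drawable vertical line count per frame is roughly 2 * f_fast / frame_rate.

```python
def vertical_lines_per_frame(f_fast_hz: float, frame_rate_hz: float, bidirectional: bool = True) -> int:
    """Rough upper bound on vertical resolution of a raster-scanned LBS image."""
    lines_per_second = f_fast_hz * (2 if bidirectional else 1)
    return int(lines_per_second / frame_rate_hz)

# Assumed example: a 27 kHz fast axis at 60 frames per second gives about 900 lines per frame.
print(vertical_lines_per_frame(27_000, 60))
```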
  • (a) in Figure 11 shows a schematic coordinate diagram of a laser display area after MEMS scanning.
  • the coordinates of the upper left corner of the laser diffusion film can be (0, 0) and the coordinates of the lower right corner can be (Xn, Yn); the timing of laser lighting for each coordinate can be determined by the scanning frequencies of the fast and slow axes.
  • For example, the display area 2 of the image formed by the RGB lasers is (0, 200)-(600, 600), and the display area 1 of the image formed by the IR laser is (0, 0)-(600, 200).
  • The IR laser can be made to light up only in display area 1, with its on/off timing controlled according to the display image; likewise, the RGB lasers can be made to light up only in display area 2, with their on/off timing and brightness controlled according to the display image.
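  • The per-coordinate laser gating described above can be sketched as follows (illustrative only; the area boundaries are taken from the example coordinates above, and the function names are hypothetical).

```python
# Display area 1 (IR laser): (0, 0)-(600, 200); display area 2 (RGB lasers): (0, 200)-(600, 600).
IR_AREA = ((0, 0), (600, 200))
RGB_AREA = ((0, 200), (600, 600))

def in_area(x: int, y: int, area) -> bool:
    (x0, y0), (x1, y1) = area
    return x0 <= x <= x1 and y0 <= y <= y1

def laser_enables(x: int, y: int, ir_pixel_on: bool, rgb_pixel):
    """Decide which lasers fire while the MEMS mirror points at scan coordinate (x, y).

    ir_pixel_on: whether the near-focus (instrument) image lights this coordinate.
    rgb_pixel:   (R, G, B) brightness of the far-focus (navigation) image at this coordinate.
    """
    ir_on = in_area(x, y, IR_AREA) and ir_pixel_on
    rgb = rgb_pixel if in_area(x, y, RGB_AREA) else (0, 0, 0)
    return ir_on, rgb

# Example: at coordinate (100, 150) only the IR laser may fire; the RGB lasers stay dark.
print(laser_enables(100, 150, ir_pixel_on=True, rgb_pixel=(255, 128, 0)))
```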
  • Figure 12 shows a schematic diagram of the display effect of the bifocal HUD provided by the embodiment of the present application.
  • the user can see images including area 1201 and area 1202 in the cockpit.
  • the image in area 1201 is a far-focus virtual image, and the vehicle's navigation information can be displayed in this far-focus virtual image (for example, prompting the user to turn left);
  • the image in area 1202 is a near-focus real image, in which the vehicle's instrument information (for example, an anti-lock braking system (ABS) indicator) can be displayed.
  • FIG. 13 shows another schematic structural diagram of a display system 1000 provided by an embodiment of the present application.
  • In the LBS 1020, the IR LD emits laser light.
  • After scanning by the MEMS, the scanning beam does not need to pass through the laser diffusion film, but directly passes through the first reflective component 1040, the second reflective component 1050 and the windshield 1060, finally presenting a near-focus real image on the windshield, thereby realizing the function of a single-focus HUD.
  • Figure 14 shows a schematic diagram of the display effect of a single-focus HUD provided by an embodiment of the present application. For example, in parking mode, warning information can be displayed through the single-focus HUD.
  • Figure 15 shows another schematic diagram of the display effect of the single-focus HUD provided by the embodiment of the present application.
  • the user's mobile phone number information can be displayed through the single-focus HUD.
  • the near-focus real image can be seen by both users in the cockpit and users outside the cockpit.
  • the single-focus HUD function can also display other prompt information (for example, the current vehicle has turned on sentry mode, advertising information).
  • Figure 16 shows a schematic diagram of switching between bifocal mode and single-focal mode provided by an embodiment of the present application.
  • the laser diffusion film can be placed in a vertical state. At this time, the function of the bifocal HUD can be realized.
  • the laser diffusion film can be rotated into a horizontal state by a motor-driven gear; at this time, the function of a single-focus HUD can be realized.
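  • A small control-logic sketch of this mode switching (assumed class and attribute names, not from the patent):

```python
from enum import Enum

class HudMode(Enum):
    BIFOCAL = "bifocal"        # far-focus virtual image + near-focus real image
    SINGLE_FOCUS = "single"    # near-focus real image only

class DiffusionFilmActuator:
    """Hypothetical wrapper around the motor/gear that rotates the laser diffusion film."""
    def __init__(self):
        self.position = "vertical"  # vertical: film sits in the optical path (bifocal mode)

    def set_mode(self, mode: HudMode) -> None:
        # Bifocal mode keeps the film vertical so the visible beam passes the MLA area;
        # single-focus mode rotates the film horizontal so the beam bypasses it.
        self.position = "vertical" if mode is HudMode.BIFOCAL else "horizontal"

actuator = DiffusionFilmActuator()
actuator.set_mode(HudMode.SINGLE_FOCUS)
print(actuator.position)  # -> horizontal
```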
  • FIG. 17 shows another schematic structural diagram of a display system 1000 provided by an embodiment of the present application.
  • In the LBS 1020, the R LD, B LD and G LD emit laser light.
  • After scanning by the MEMS, the scanning beam does not pass through the laser diffusion film, but passes through the first reflective component 1040, the second reflective component 1050 and the projection screen 1090, and the image is finally presented on the projection screen 1090.
  • the function of LBS is similar to that of a micro projector, which can be applied to the audio and video mode when parking and resting.
  • Figure 18 shows a schematic diagram of the display effect of the audio and video mode provided by the embodiment of the present application. For example, users can watch their favorite videos or pictures through the projection screen in audio-visual mode.
  • In the above description, a PGU with the LBS-with-laser-diffusion-film architecture is used as an example.
  • the technical solution of the embodiment of the present application can also be applied in a DLP architecture or LCoS architecture.
  • FIG 19 shows another schematic structural diagram of a PGU provided by an embodiment of the present application.
  • This PGU is a PGU of DLP architecture.
  • The PGU includes an R LD 1901, a B LD 1902, a G LD 1903, an IR LD 1904, a collimating lens 1905, a light combining system 1906, a lens array 1907, a DMD 1908, a prism 1909, a projection lens (projector lens) 1910 and a laser diffusion film 1911, in which the function of the lens array is to convert the point light source into a surface light source.
  • the visible light beams emitted by R LD1901, B LD1902 and G LD1903 and the invisible light beam emitted by IR LD1904 enter the light combining system 1906 through the corresponding collimating lens 1905, thereby becoming combined light beams.
  • the combined light beam passes through the lens array 1907 and then passes through the prism 1909.
  • the prism 1909 can rotate the incident angle of the combined light beam by 90° counterclockwise and emit the rotated light to the DMD 1908.
  • the DMD 1908 can transmit the light through the prism 1909 and project it to the projection lens 1910.
  • the projection lens 1910 can divide the light into light corresponding to visible light and light corresponding to invisible light, so that the light corresponding to the invisible light is projected to the first area of the laser diffusion film (the first area does not include the MLA) and the light corresponding to the visible light is projected to the second area of the laser diffusion film (the second area includes the MLA).
  • FIG 20 shows another schematic structural diagram of a PGU provided by an embodiment of the present application.
  • the PGU is a PGU of LCoS architecture.
  • The PGU includes an R LD 2001, a B LD 2002, a G LD 2003, an IR LD 2004, a collimating lens 2005, a light combining system 2006, a polarizer 2007, a lens array 2008, an LCoS 2009, a prism 2010, a projection lens 2011 and a laser diffusion film 2012.
  • the function of the polarizing film 2007 is to convert the combined light generated by the light combining system 2006 into polarized light
  • the function of the lens array 2008 is to convert the point light source into a surface light source.
  • The combined light beam passes through the polarizing film 2007 to form polarized light, and after passing through the lens array 2008 the polarized light changes from a point light source to a surface light source.
  • The lens array 2008 projects the light onto the prism 2010, where the incident angle of the light is rotated 90° counterclockwise, and the prism 2010 emits the rotated light to the LCoS 2009.
  • The LCoS 2009 can transmit the light through the prism 2010 and project it to the projection lens 2011.
  • The projection lens 2011 can divide the light into light corresponding to visible light and light corresponding to invisible light, so that the light corresponding to the invisible light is projected to the first area of the laser diffusion film (the first area does not include the MLA) and the light corresponding to the visible light is projected to the second area of the laser diffusion film (the second area includes the MLA).
  • The transmission process of the light can refer to the description in the above embodiment and will not be repeated here.
  • the head-up display system is used as an example for explanation, and the embodiments of the present application are not limited thereto.
  • the technical solutions provided by the embodiments of the present application can also be applied to vehicle side window display systems and vehicle rear windshield display systems.
  • Figure 21 shows a schematic diagram of the effect of the technical solution of the embodiment of the present application applied to the vehicle side window display system.
  • The side window display system can be located on the roof of the vehicle. Taking the display system for the left side of the second row as an example, fluorescent material may be included in the side window glass of the second-row left area, and users in that area can see virtual images outside the side window and real images on the side window. As shown in Figure 21, the user can see news information content 2101 and advertising content 2102 on the side window.
  • Figure 22 shows a schematic diagram of the effect of the technical solution of the embodiment of the present application applied to the vehicle side window display system.
  • The vehicle can use sensors (such as lidar, millimeter-wave radar, cameras, etc.) to detect whether a vehicle or pedestrian is passing by outside the corresponding door on the left side of the second row, so as to determine whether there is a risk of collision after that door is opened. If there is a risk of collision, the side window display system can display a prompt message on the side window in single focus mode (for example, "Please be careful! There is a risk of collision when opening the door").
  • FIG 23 shows a schematic diagram of the effect of the technical solution of the embodiment of the present application applied to the vehicle rear windshield display system.
  • the rear windshield display system can be located on the roof of the vehicle, and the rear windshield glass can include fluorescent materials.
  • the vehicle can display the prompt message "autonomous driving" on the rear windshield through single focus mode. In this way, pedestrians or other vehicles behind the vehicle can use the prompt information to know that the vehicle is currently in an autonomous driving state.
  • Figure 24 shows a schematic block diagram of a display system 2400 provided by an embodiment of the present application.
  • The display system 2400 includes a PGU 2410 and a reflective component 2420, where the PGU 2410 is used to send a first light and a second light to the reflective component 2420; the first light is light corresponding to visible light, and the second light is light corresponding to invisible light.
  • The reflective component 2420 is used to reflect the received first light to the first display device and to reflect the second light to the second display device including fluorescent material; the first light passes through the first display device to form a virtual image, and the second light passes through the second display device to form a real image.
  • the PGU includes an invisible light laser diode and a laser diffusion film.
  • the laser diffusion film includes a first region, wherein the PGU 2410 is used to pass the invisible light beam emitted by the invisible light laser diode through the first region to form the second light.
  • the PGU 2410 further includes a visible light laser diode
  • the laser diffusion film further includes a second area including a micro lens array (MLA), wherein the PGU 2410 is used to pass the visible light beam emitted by the visible light laser diode through the second area to form the first light.
  • the light-emitting device that can emit visible light or invisible light can also be a light-emitting diode (LED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, etc.
  • the PGU 2410 also includes a light combining system and a microelectromechanical system MEMS, wherein the light combining system is used to combine the invisible light beam and the visible light beam into a first light beam and send the first light beam to the MEMS;
  • the MEMS is used to scan the first beam to obtain a first scanning light and a second scanning light;
  • the MEMS is also used to project the first scanning light to the second area and the second scanning light to the first area.
  • the first scanning light is projected to the second area to form the first light
  • the second scanning light is projected to the first area to form the second light.
  • the PGU 2410 also includes a control device, wherein the control device is used to control the laser diffusion film to be in a first position, wherein when the laser diffusion film is in the first position, the first scanning light can be projected to the second area and the second scanning light can be projected to the first area.
  • the invisible light laser diode includes an infrared laser diode.
  • the first region is a hollow structure.
  • the second scanning light and the second light may be the same light.
  • the first light carries navigation image information; and/or the second light carries instrument image information.
  • the display system is a head-up display system HUD, wherein the first display device is a windshield.
  • the second display device is located in the windshield.
  • the fluorescent material is located in the polyvinyl butyral PVB film of the windshield.
  • the second display device may be a fluorescent film attached to the side of the windshield close to the cabin.
  • Figure 25 shows a schematic flowchart of a display method 2500 provided by an embodiment of the present application. As shown in Figure 25, the method includes: controlling the image generation unit PGU to send a first light and a second light to the reflective component, where the first light is the light corresponding to visible light and the second light is the light corresponding to invisible light.
  • controlling the PGU to send the first light and the second light to the reflective component includes: the first processor sending a first instruction to the PGU, the first instruction being used to control the PGU to send the first light and the second light to the reflective component.
  • the first processor may be the processor in the above-mentioned main control board 610, or the first processor may also be the processor (eg, CPU) in the above-mentioned computing platform 150.
  • the PGU may execute the first instruction, causing the PGU to send the first light and the second light to the reflective component.
  • the first processor may also receive image information sent by the second processor, where the image information is related to the above-mentioned real image and virtual image. After receiving the image information, the first processor can generate the first instruction according to the image information.
  • the second processor may be the GPU in the above-mentioned computing platform 150, and the first processor may be the processor in the above-mentioned main control board 610.
  • the GPU may send the image information to the first processor, so that the first processor generates the first instruction according to the image information.
  • both the second processor and the first processor may be processors in the above-mentioned main control board.
  • the above first instruction can include control of the laser diode and control of the MEMS.
  • the laser diode may include a visible light laser diode and an invisible light laser diode.
  • the first light carries navigation image information; and/or the second light carries instrument image information.
  • the display system is a head-up display system HUD, wherein the first display device is a windshield.
  • the second display device is located in the windshield.
  • the fluorescent material is located in the polyvinyl butyral PVB film of the windshield.
  • the function of the above fluorescent materials is to excite invisible light into visible light.
  • the PGU includes an invisible light laser diode and a laser diffusion film
  • the laser diffusion film includes a first region
  • controlling the PGU to send the first light and the second light to the reflective component includes: controlling the invisible light beam emitted by the invisible light laser diode to pass through the first area to form the second light.
  • controlling the invisible light beam emitted by the invisible light laser diode to pass through the first area to form the second light includes: the first processor sending a second instruction to the PGU, the second instruction being used to control the invisible light beam emitted by the invisible light laser diode to pass through the first area to form the second light.
  • the PGU can execute the second instruction, so that the invisible light beam emitted by the invisible light laser diode passes through the first area to form the second light.
  • the above second instruction may be a sub-instruction of the first instruction.
  • the above second instruction can include control of the invisible light laser diode and control of the MEMS.
  • the image information sent by the second processor to the first processor includes image information corresponding to the real image and image information corresponding to the virtual image.
  • the first processor may generate the second instruction according to the image information corresponding to the real image.
  • the first processor may control the turning on of the invisible light laser diode through a circuit (for example, by adjusting parameters such as voltage or current to turn on the invisible light laser diode).
  • the first processor can send the parameter information of the fast axis scan and the parameter information of the slow axis scan in the MEMS to the MEMS, so that the MEMS can scan the beam according to the parameter information of the fast axis scan and the parameter information of the slow axis scan.
  • the invisible light laser diode includes an infrared laser diode.
  • the first region is a hollow structure.
  • the PGU also includes a visible light laser diode
  • the laser diffusion film also includes a second area
  • the second area includes a microarray lens MLA
  • the control PGU sends the first light and the second light to the reflective component, including: controlling The visible light beam emitted by the visible light laser diode passes through the second area to form the first light ray.
  • controlling the visible light beam emitted by the visible light laser diode to pass through the second area to form the first light includes: the first processor sending a third instruction to the PGU, where the third instruction is used to control the visible light beam emitted by the visible light laser diode to pass through the second area to form the first light; after receiving the third instruction, the PGU can execute it, so that the visible light beam passes through the second area to form the first light.
  • the above third instruction may be a sub-instruction of the first instruction.
  • the above third instruction can include control of the visible light laser diode and control of the MEMS.
  • the image information sent by the second processor to the first processor includes image information corresponding to the real image and image information corresponding to the virtual image.
  • the first processor may generate the third instruction according to the image information corresponding to the virtual image.
  • the first processor can, through the circuit, control the visible light laser diode to turn on and control its brightness.
  • the first processor can control the MEMS to scan the light beam according to the parameter information of fast axis scanning and slow axis scanning.
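As a rough illustration of the brightness control mentioned above, the following sketch maps one row of virtual-image pixels to R/G/B drive levels that the first processor could stream to the laser drivers in step with the fast axis sweep. The linear mapping from an 8-bit pixel value to a drive current is an assumption made purely for illustration.

```python
from typing import List, Tuple

def rgb_drive_levels(pixel_row: List[Tuple[int, int, int]],
                     max_current_ma: float = 100.0) -> List[Tuple[float, float, float]]:
    """Convert one row of 8-bit RGB pixels into per-channel drive currents.

    Streaming these values in step with the fast axis sweep lets each scan
    position show the intended color and brightness of the virtual image.
    """
    levels = []
    for r, g, b in pixel_row:
        levels.append((r / 255.0 * max_current_ma,
                       g / 255.0 * max_current_ma,
                       b / 255.0 * max_current_ma))
    return levels

# Example: a short row fading from black to white.
print(rgb_drive_levels([(0, 0, 0), (64, 64, 64), (128, 128, 128), (255, 255, 255)]))
```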
  • the PGU further includes a light combining system and a microelectromechanical system MEMS; the invisible light beam and the visible light beam pass through the light combining system to form a first light beam, and the light combining system can project the first light beam to the MEMS.
  • controlling the PGU to send the first light and the second light to the reflective component includes: controlling the MEMS to scan the first light beam to obtain the first scanning light and the second scanning light; and controlling the MEMS to project the first scanning light to the second area and the second scanning light to the first area.
  • the above process of controlling the MEMS to scan the first beam, projecting the scanned first scanning light to the second area, and projecting the second scanning light to the first area can also be implemented by the first processor sending instructions to the PGU.
  • the control instruction for the MEMS sent by the first processor can be carried in the above-mentioned first instruction; or it can also be carried in the above-mentioned second instruction; or it can also be carried in the above-mentioned third instruction.
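The area-gated scanning described above can be summarized in a few lines. The 600×600 raster and the split into an IR display area of (0,0)–(600,200) and an RGB display area of (0,200)–(600,600) follow the example given with Figure 11 in the description; the function itself is only an illustrative sketch.

```python
def active_channels(x: int, y: int,
                    ir_region=((0, 0), (600, 200)),
                    rgb_region=((0, 200), (600, 600))) -> set:
    """Return which laser channels may be lit when the MEMS scan spot is at (x, y).

    The IR channel is lit only inside the first area of the diffusion film,
    while the R/G/B channels are lit only inside the second area (the MLA area).
    """
    def inside(region):
        (x0, y0), (x1, y1) = region
        return x0 <= x < x1 and y0 <= y < y1

    channels = set()
    if inside(ir_region):
        channels.add("IR")                # second scanning light -> first area -> real image
    if inside(rgb_region):
        channels.update({"R", "G", "B"})  # first scanning light -> second area -> virtual image
    return channels

print(active_channels(300, 100))  # {'IR'}
print(active_channels(300, 400))  # {'R', 'G', 'B'} (set order may vary)
```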
  • the laser diffusion film is in a first position, wherein when the laser diffusion film is in the first position, the first scanning light can be projected to the second area and the second scanning light can be projected to the first area.
  • the position of the above laser diffusion film can be fixed and does not need to be controlled by the first processor.
  • the position of the laser diffusion film can also be changed, and the position of the laser diffusion film can be controlled by the first processor.
  • the first processor can send a fourth instruction to the control device, and the fourth instruction can be used to instruct the laser diffusion film to be set to the first position.
  • the control device can receive the fourth instruction through the I/O interface, so that the laser diffusion film can be set to the first position according to the fourth instruction.
  • the above fourth instruction may be a sub-instruction of the first instruction; or, the fourth instruction may be an instruction independent of the first instruction.
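A minimal sketch of the fourth-instruction path follows, assuming the control device steps a motor/gear to place the laser diffusion film. The position names and the callback are hypothetical; the application only requires that the film can be set to the first position when the scanned light must reach the first and second areas of the film.

```python
from enum import Enum

class DiffuserPosition(Enum):
    FIRST = "first"          # scanning light reaches the first and second areas of the film
    ALTERNATE = "alternate"  # any other position (purely illustrative)

class DiffuserControlDevice:
    """Invented stand-in for the control device that receives the fourth instruction."""

    def __init__(self) -> None:
        self.position = DiffuserPosition.ALTERNATE

    def on_fourth_instruction(self, target: DiffuserPosition) -> None:
        # Received from the first processor over the I/O interface.
        self.move_to(target)

    def move_to(self, target: DiffuserPosition) -> None:
        # A real device would step a motor/gear here; this sketch only records the state.
        self.position = target
        print(f"laser diffusion film set to the {target.value} position")

device = DiffuserControlDevice()
device.on_fourth_instruction(DiffuserPosition.FIRST)
```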
  • control the reflective component to reflect the received first light to the first display device and to reflect the second light to the second display device including fluorescent material; the first light forms a virtual image through the first display device, and the second light forms a real image through the second display device.
  • in the above process of forming a real image, the invisible light beam emitted by the invisible light laser diode passes through the collimator and then enters the light combining system.
  • the first beam output by the light combining system is scanned by the MEMS to obtain the second scanning light.
  • the MEMS projects the second scanning light to the first area of the laser diffusion film to form the second light.
  • the second light is reflected by the reflective component to the second display device including fluorescent material, thereby forming a real image.
  • in the process of forming a virtual image, the visible light emitted by the visible light laser diode passes through the collimator and enters the light combining system.
  • the first beam output from the light combining system is scanned by the MEMS to obtain the first scanning light.
  • the MEMS projects the first scanning light to the second area of the laser diffusion film to form the first light.
  • the first light is reflected to the first display device through the reflective component, thereby forming a virtual image.
  • controlling the reflective component to reflect the received first light to the first display device and to reflect the second light to the second display device including fluorescent material includes: the first processor sending a fifth instruction to the reflective component.
  • the fifth instruction is used to instruct the reflective component to reflect the first light to the first display device and to reflect the second light to the second display device.
  • the reflective component can reflect the first light to the first display device and the second light to the second display device according to the fifth instruction.
  • the fifth instruction may include the rotation angle of the reflective component.
  • if the reflective component includes multiple reflective components (for example, the first reflective component 1040 and the second reflective component 1050 in Figure 10), the fifth instruction can carry the rotation angle of each of the multiple reflective components.
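The fifth instruction can be pictured as a small message carrying one rotation angle per reflective component, for example for the first reflective component 1040 and the second reflective component 1050 of Figure 10. The encoding and the angle values below are assumed for illustration only.

```python
from dataclasses import dataclass
from typing import Dict

@dataclass
class FifthInstruction:
    """Rotation angle in degrees for each reflective component, keyed by component id."""
    rotation_angles_deg: Dict[str, float]

def apply_fifth_instruction(instr: FifthInstruction) -> None:
    """Rotate every reflective component to its commanded angle so that the first light
    reaches the first display device and the second light reaches the second display device."""
    for component_id, angle in instr.rotation_angles_deg.items():
        print(f"{component_id}: rotate to {angle:.1f} degrees")

apply_fifth_instruction(FifthInstruction(
    rotation_angles_deg={"reflective_component_1040": 32.5,
                         "reflective_component_1050": 118.0}))
```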
  • the above virtual image is an image that the user sitting in the cockpit can see outside the cockpit, although in fact there is no image information at that location outside the cockpit. The visible light is reflected by the reflective component to the windshield and from the windshield to the human eye; since the display system increases the optical path and magnifies the image, the user can see a virtual image with depth of field at a location outside the cockpit.
  • the above real image refers to the image seen by the user on the second display device.
  • the real image is actually displayed on the second display device. That is to say, users at all positions in the cockpit or users outside the cockpit can see the real image through the second display device.
  • An embodiment of the present application also provides a terminal device, which may include the above display system.
  • the terminal device is a vehicle.
  • the disclosed systems, devices and methods can be implemented in other ways.
  • the device embodiments described above are only illustrative.
  • the division of the units is only a logical functional division; in actual implementation there may be other division methods.
  • multiple units or components may be combined or integrated into another system, or some features may be ignored or not implemented.
  • the mutual coupling, direct coupling or communication connection shown or discussed may be implemented through some interfaces; the indirect coupling or communication connection between devices or units may be electrical, mechanical or in other forms.
  • the units described as separate components may or may not be physically separated, and the components shown as units may or may not be physical units, that is, they may be located in one place, or they may be distributed to multiple network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of this embodiment.
  • each functional unit in each embodiment of the present application can be integrated into one processing unit, each unit can exist physically alone, or two or more units can be integrated into one unit.
  • if the functions are implemented in the form of software functional units and sold or used as independent products, they can be stored in a computer-readable storage medium.
  • the technical solution of the present application, in essence, or the part that contributes to the existing technology, or a part of the technical solution, can be embodied in the form of a software product.
  • the computer software product is stored in a storage medium and includes several instructions used to cause a computer device (which may be a personal computer, a server, a network device, etc.) to execute all or part of the steps of the methods described in the various embodiments of this application.
  • the aforementioned storage media include: USB flash drives, removable hard disks, read-only memory (ROM), random access memory (RAM), magnetic disks, optical discs, and other media that can store program code.

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Instrument Panels (AREA)

Abstract

A display method, a display system (2400) and a terminal device. The display system (2400) includes an image generation unit (2410) and a reflective component (2420), where the image generation unit (2410) is configured to send a first light ray and a second light ray to the reflective component (2420), the first light ray being light corresponding to visible light and the second light ray being light corresponding to invisible light; the reflective component (2420) is configured to reflect the received first light ray to a first display device and to reflect the second light ray to a second display device that includes a fluorescent material, the first light ray forming a virtual image through the first display device and the second light ray forming a real image through the second display device. Such a display system (2400) can be applied to intelligent vehicles or electric vehicles; no additional hardware needs to be added to the display system (2400), which helps reduce the volume and cost of the display system (2400).

Description

一种显示方法、显示系统和终端设备 技术领域
本申请实施例涉及智能驾驶领域,并且更具体地,涉及一种显示方法、显示系统和终端设备。
背景技术
抬头显示装置(head up display,HUD)最早出现在战斗机上,为了让飞行员不用频繁低头看仪表集中注意力,将重要信息在视线前方的一块透明玻璃上显示。这项技术在几年前也被引入了车辆中。
HUD的成像原理为通过影像生成模块(picture generating unit,PGU)产生图像,然后通过反射镜增加光学景深和放大倍率后,投射至风挡玻璃并在车辆前方形成具有景深的虚像。当前为了使HUD产生两个不同焦距的图像,已有公司推出了在HUD中配置两组PGU,以实现HUD产生不同焦距的图像。但是这样的双PGU设计会使得HUD的体积庞大且成本较高。
发明内容
本申请实施例提供一种显示方法、显示系统和终端设备,有助于降低显示系统的体积,且有助于降低显示系统的成本。
本申请实施例中的终端设备可以包括车辆、头盔、飞行器或者无人机等。本申请中的车辆(有时简称为车)为广义概念上的车辆,可以是交通工具(如:汽车,卡车,摩托车,火车,飞机,地铁,轮船等),工业车辆(如:叉车,挂车,牵引车等),工程车辆(如:挖掘机,推土机,吊车等),农用设备(如割草机、收割机等),游乐设备,玩具车辆等,本申请对车辆的类型不做限定。
第一方面,提供了一种显示系统,该显示系统包括图像产生单元PGU和反射组件,其中,该PGU用于向该反射组件发送第一光线和第二光线,该第一光线为可见光对应的光线,该第二光线为不可见光对应的光线;该反射组件用于将接收到该第一光线反射至第一显示装置以及将该第二光线反射至包括荧光材料的第二显示装置,该第一光线经该第一显示装置形成虚像,该第二光线经过该第二显示装置形成实像。
本申请实施例中,通过PGU生成可见光对应的第一光线和不可见光对应的第二光线,第一光线经过第一显示装置形成虚像,第二光线经过第二显示装置形成实像。从而通过形成的虚像和实像实现两个不同焦距的图像的显示。无需在显示系统中增加额外的硬件,有助于降低显示系统的体积,且有助于降低显示系统的成本。
在一些可能的实现方式中,该显示装置可以包括HUD、侧窗显示系统以及后风挡显示系统。
结合第一方面,在第一方面的某些实现方式中,该PGU包括不可见光激光二极管和 激光扩散膜,该激光扩散膜包括第一区域,其中,该PGU用于将该不可见光激光二极管发出的不可见光光束通过该第一区域生成该第二光线。
本申请实施例中,通过在PGU中增加不可见光激光二极管,并通过不可见光激光二极管发出的不可见光光束生成第二光线,从而最终通过第二显示装置形成实像。这样,通过在PGU中增加占用体积较小且成本较低的不可见光激光二极管,就可以实现两个不同焦距的图像的显示,有助于降低显示系统的体积,且有助于降低显示系统的成本。
在一些可能的实现方式中,该第一区域中不包括微数组透镜(micro lens array,MLA)。
结合第一方面,在第一方面的某些实现方式中,该PGU还包括可见光激光二极管,该激光扩散膜还包括第二区域,该第二区域包括MLA,其中,该PGU用于将该可见光激光二极管发出的可见光光束通过该第二区域形成该第一光线。
本申请实施例中,通过PGU中的可见光激光二极管生成第一光线,从而最终通过第一显示装置形成虚像。通过激光扩散膜中不同的区域可以分别形成第一光线和第二光线。这样,通过在PGU中增加不可见光激光二极管以及对激光扩散膜的结构进行改变,就可以实现两个不同焦距的图像的显示,有助于降低显示系统的体积,且有助于降低显示系统的成本。
结合第一方面,在第一方面的某些实现方式中,该PGU还包括合光系统和微机电系统MEMS,其中,该合光系统用于将该不可见光光束和该可见光光束合成为第一光束并向该MEMS发送该第一光束;该MEMS用于对该第一光束进行扫描,得到第一扫描光线和第二扫描光线;该MEMS还用于将该第一扫描光线投射至该第二区域且将该第二扫描光线投射至该第一区域。
本申请实施例的技术方案可以应用于激光扫描(laser beam scanning,LBS)架构的HUD中,通过合光系统将不可见光光束和可见光光束合成一束光并将该合成后的光束发送给MEMS进行扫描。经过MEMS扫描后就可以得到第一扫描光线和第二扫描光线。在分别将第一扫描光线投射至第二区域且将第二扫描光线投射至第一区域后就可以得到第一光线和第二光线,最后通过反射组件反射至第一显示装置和第二显示装置就可以实现两个不同焦距的图像的显示,有助于降低显示系统的体积,且有助于降低显示系统的成本。
在一些可能的实现方式中,该第一扫描光线为该可见光光束对应的扫描光线,该第二扫描光线为该不可见光光束对应的扫描光线。
结合第一方面,在第一方面的某些实现方式中,该显示系统中还包括控制装置,其中,该控制装置用于控制该激光扩散膜处于第一位置,其中,当该激光扩散膜处于第一位置时,该第一扫描光线能够投射至该第二区域且该第二扫描光线能够投射至该第一区域。
本申请实施例中,通过在显示系统中增加控制装置,从而通过控制装置来控制激光扩散膜所处的位置。这样,当激光扩散膜处于第一位置时,该第一扫描光线能够投射至该第二区域且该第二扫描光线能够投射至该第一区域,从而可以实现两个不同焦距的图像的显示。
结合第一方面,在第一方面的某些实现方式中,该不可见光激光二极管包括红外线激光二极管。
本申请实施例中,通过红外线激光二极管发出的红外线形成第二光线,并通过反射组件将第二光线反射至包括荧光材料的第二显示装置可以呈现实像。第二光线遇到荧光材料 后可以将不可见光激发为可见光,从而在第二显示装置处显示实像。
在一些可能的实现方式中,该不可见光激光二极管还可以为能发出无线电波,微波,红外光,紫外光,x射线或者γ射线等的激光二极管。
结合第一方面,在第一方面的某些实现方式中,该第一区域为中空结构。
本申请实施例中,通过将激光扩散膜的第一区域设计为中空结构,这样有助于降低激光扩散膜的设计复杂度,同时也有助于降低激光扩散膜的成本。
在一些可能的实现方式中,若该第一区域为中空结构,那么该第二扫描光线和该第二光线可以为同一光线。
以上第一区域可以为中空结构;或者,该第一区域也可以是无光学设计的区域,也即激光扩散膜可以只包括该第二区域。在这种情况下,激光扩散膜包含第二区域,第二区域以外的部分区域可以认为是以上所涉及的“第一区域”。
结合第一方面,在第一方面的某些实现方式中,该第一光线携带导航图像信息;和/或,该第二光线携带仪表图像信息。
结合第一方面,在第一方面的某些实现方式中,该显示系统是抬头显示系统HUD,其中,该第一显示装置为风挡玻璃。
本申请实施例的技术方案可以应用于抬头显示系统中,这样有助于主驾或者副驾位置上的用户通过第一显示装置和第二显示装置分别看到虚像和实像。
结合第一方面,在第一方面的某些实现方式中,该第二显示装置位于该风挡玻璃中。
本申请实施例中,该第二显示装置可以位于风挡玻璃中,这样有助于主驾或者副驾位置上的用户在座舱外看到虚像且在风挡玻璃上看到实像,在不增加HUD的体积以及成本的前提下,有助于提升用户的使用体验。
结合第一方面,在第一方面的某些实现方式中,该荧光材料位于该风挡玻璃的聚乙烯醇缩丁醛(poly vinyl butyral,PVB)膜中。
在一些可能的实现方式中,该第二显示装置可以为荧光膜,该荧光膜可以贴在风挡玻璃靠近座舱的一侧。
第二方面,提供了一种显示方法,该方法包括:控制图像产生单元PGU向反射组件发送第一光线和第二光线,该第一光线为可见光对应的光线,该第二光线为不可见光对应的光线;控制该反射组件将接收到的该第一光线反射至第一显示装置以及将该第二光线反射至包括荧光材料的第二显示装置,该第一光线经该第一显示装置形成虚像,该第二光线经过该第二显示装置形成实像。
结合第二方面,在第二方面的某些实现方式中,该PGU包括不可见光激光二极管和激光扩散膜,该激光扩散膜包括第一区域,该控制图像产生单元PGU向反射组件发送第一光线和第二光线包括:控制该不可见光激光二极管发出的不可见光光束通过该第一区域形成该第二光线。
结合第二方面,在第二方面的某些实现方式中,该PGU还包括可见光激光二极管,该激光扩散膜还包括第二区域,该第二区域包括微数组透镜MLA,该控制图像产生单元PGU向反射组件发送第一光线和第二光线,包括:控制该可见光激光二极管发出的可见光光束通过该第二区域形成该第一光线。
结合第二方面,在第二方面的某些实现方式中,该PGU还包括合光系统和微机电系 统MEMS,该不可见光和该可见光光束经过该合光系统形成第一光束且该合光系统能够将该第一光束投射至该MEMS,该控制图像产生单元PGU向反射组件发送第一光线和第二光线包括:控制该MEMS对该第一光束进行扫描,得到第一扫描光线和第二扫描光线;控制该MEMS将该第一扫描光线投射至该第二区域且将该第二扫描光线投射至该第一区域。
结合第二方面,在第二方面的某些实现方式中,该激光扩散膜处于第一位置,其中,当该激光扩散膜处于该第一位置时,该第一扫描光线能够投射至该第二区域且该第二扫描光线能够投射至该第一区域。
在一些可能的实现方式中,该第一位置可以是固定位置。
在一些可能的实现方式中,该方法还包括:控制该激光扩散膜处于该第一位置。
结合第二方面,在第二方面的某些实现方式中,该不可见光激光二极管包括红外线激光二极管。
结合第二方面,在第二方面的某些实现方式中,该第一区域为中空结构。
结合第二方面,在第二方面的某些实现方式中,该第一光线携带导航图像信息;和/或,该第二光线携带仪表图像信息。
结合第二方面,在第二方面的某些实现方式中,该显示系统为抬头显示系统HUD,其中,该第一显示装置为风挡玻璃。
结合第二方面,在第二方面的某些实现方式中,该第二显示装置位于该风挡玻璃中。
结合第二方面,在第二方面的某些实现方式中,该荧光材料位于该风挡玻璃的聚乙烯醇缩丁醛PVB膜中。
第三方面,提供了一种终端设备,该终端设备包括上述第一方面中任一项可能的实现中的显示系统。
结合第三方面,在第三方面的某些实现方式中,该终端设备为车辆。
第四方面,提供了一种终端设备,该终端设备包括处理器和显示系统,其中,该显示系统包括上述第一方面中任一项可能的实现中的显示系统。
结合第四方面,在第四方面的某些实现方式中,该终端设备为车辆。
第五方面,提供了一种计算机可读介质,所述计算机可读介质存储有程序代码,当所述计算机程序代码在终端设备上运行时,使得终端设备执行上述第二方面中的方法。
第六方面,提供了一种计算机程序产品,所述计算机程序产品包括:计算机程序代码,当所述计算机程序代码在终端设备上运行时,使得终端设备执行上述第二方面中的方法。
第七方面,本申请实施例提供了一种芯片系统,该芯片系统包括处理器,用于调用存储器中存储的计算机程序或计算机指令,以使得该处理器执行上述第二方面中的方法。
结合第七方面,在一种可能的实现方式中,该处理器通过接口与存储器耦合。
结合第七方面,在一种可能的实现方式中,该芯片系统还包括存储器,该存储器中存储有计算机程序或计算机指令。
第八方面,提供了一种显示系统,该显示系统包括图像产生单元PGU和反射组件,其中,该PGU用于向该反射组件发送第二光线,该第二光线为不可见光对应的光线;该反射组件用于将接收到的该第二光线反射至包括荧光材料的第二显示装置,该第二光线经过该第二显示装置形成实像。
结合第八方面,在一种可能的实现方式中,该PGU还包括微机电系统MEMS,其中,该MEMS用于对该不可见光光束进行扫描,得到该第二光线;该MEMS还用于将该第二光线投射至该反射组件。
结合第八方面,在一种可能的实现方式中,该显示系统为抬头显示系统,其中,该第二显示装置位于风挡玻璃中。
结合第八方面,在一种可能的实现方式中,该第二显示装置可以为荧光膜,该荧光膜可以贴在风挡玻璃靠近座舱的一侧。
结合第八方面,在一种可能的实现方式中,该荧光材料位于该风挡玻璃的PVB膜中。
附图说明
图1是本申请实施例提供的车辆的一个功能框图示意。
图2是本申请实施例提供的一种风挡型抬头显示系统的结构示意图。
图3是PGU的结构示意图。
图4是通过快慢轴进行扫描的示意图。
图5是用于形成两个不同位置的虚像的HUD系统架构示意图。
图6是本申请实施例提供的显示系统的结构示意图。
图7是本申请实施例提供的PGU的另一结构示意图。
图8是本申请实施例提供的激光扩散膜的结构示意图。
图9是本申请实施例中提供的风挡玻璃的结构示意图。
图10是本申请实施例提供的显示系统的另一结构示意图。
图11是本申请实施例提供的MEMS扫描过程的示意图。
图12是本申请实施例提供的双焦HUD的显示效果示意图。
图13是本申请实施例提供的显示系统的另一结构示意图。
图14是本申请实施例提供的单焦HUD的显示效果示意图。
图15是本申请实施例提供的单焦HUD的另一显示效果示意图。
图16是本申请实施例提供的双焦模式和单焦模式切换的示意图。
图17是本申请实施例提供的显示系统的另一结构示意图。
图18是本申请实施例提供的影音模式的显示效果示意图。
图19是本申请实施例提供的PGU的另一结构示意图。
图20是本申请实施例提供的PGU的另一结构示意图。
图21是本申请实施例的技术方案应用于车辆侧窗显示系统的效果示意图。
图22是本申请实施例的技术方案应用于车辆侧窗显示系统的另一效果示意图。
图23是本申请实施例的技术方案应用于车辆后风挡显示系统的效果示意图。
图24是本申请实施例提供的一种显示系统的示意性框图。
图25是本申请实施例提供的一种显示方法的示意性流程图。
具体实施方式
下面将结合本申请实施例中的附图,对本申请实施例中的技术方案进行描述。其中,在本申请实施例的描述中,除非另有说明,“/”表示或的意思,例如,A/B可以表示A或B; 本文中的“和/或”仅仅是一种描述关联对象的关联关系,表示可以存在三种关系,例如,A和/或B,可以表示:单独存在A,同时存在A和B,单独存在B这三种情况。
本申请实施例中采用诸如“第一”、“第二”的前缀词,仅仅为了区分不同的描述对象,对被描述对象的位置、顺序、优先级、数量或内容等没有限定作用。本申请实施例中对序数词等用于区分描述对象的前缀词的使用不对所描述对象构成限制,对所描述对象的陈述参见权利要求或实施例中上下文的描述,不应因为使用这种前缀词而构成多余的限制。此外,在本实施例的描述中,除非另有说明,“多个”的含义是两个或两个以上。
图1是本申请实施例提供的车辆100的一个功能框图示意。车辆100可以包括感知系统120、显示装置130和计算平台150,其中,感知系统120可以包括感测关于车辆100周边的环境的信息的若干种传感器。例如,感知系统120可以包括定位系统,定位系统可以是全球定位系统(global positioning system,GPS),也可以是北斗系统或者其他定位系统、惯性测量单元(inertial measurement unit,IMU)、激光雷达、毫米波雷达、超声雷达以及摄像装置中的一种或者多种。
车辆100的部分或所有功能可以由计算平台150控制。计算平台150可包括处理器151至15n(n为正整数),处理器是一种具有信号的处理能力的电路,在一种实现中,处理器可以是具有指令读取与运行能力的电路,例如中央处理单元(central processing unit,CPU)、微处理器、图形处理器(graphics processing unit,GPU)(可以理解为一种微处理器)、或数字信号处理器(digital signal processor,DSP)等;在另一种实现中,处理器可以通过硬件电路的逻辑关系实现一定功能,该硬件电路的逻辑关系是固定的或可以重构的,例如处理器为专用集成电路(application-specific integrated circuit,ASIC)或可编程逻辑器件(programmable logic device,PLD)实现的硬件电路,例如FPGA。在可重构的硬件电路中,处理器加载配置文档,实现硬件电路配置的过程,可以理解为处理器加载指令,以实现以上部分或全部单元的功能的过程。此外,还可以是针对人工智能设计的硬件电路,其可以理解为一种ASIC,例如神经网络处理单元(neural network processing unit,NPU)、张量处理单元(tensor processing unit,TPU)、深度学习处理单元(deep learning processing unit,DPU)等。此外,计算平台150还可以包括存储器,存储器用于存储指令,处理器151至15n中的部分或全部处理器可以调用存储器中的指令,执行指令,以实现相应的功能。
座舱内的显示装置130主要分为两类,第一类是车载显示屏;第二类是投影显示屏,例如抬头显示装置(head up display,HUD)。车载显示屏是一种物理显示屏,是车载信息娱乐系统的重要组成部分,座舱内可以设置有多块显示屏,如数字仪表显示屏,中控屏,副驾驶位上的乘客(也称为前排乘客)面前的显示屏,左侧后排乘客面前的显示屏以及右侧后排乘客面前的显示屏,甚至是车窗也可以作为显示屏进行显示。抬头显示,也称平视显示系统。主要用于在驾驶员前方的显示设备(例如风挡玻璃)上显示例如时速、导航等驾驶信息。以降低驾驶员视线转移时间,避免因驾驶员视线转移而导致的瞳孔变化,提升行驶安全性和舒适性。
HUD包括组合型抬头显示(combiner-HUD,C-HUD)系统、风挡型抬头显示(windshield-HUD,W-HUD)系统、增强现实型抬头显示(augmented reality HUD,AR-HUD)系统等。图2示出了本申请实施例提供的一种W-HUD系统的结构示意图。如 图2所示,该系统包括HUD本体210、PGU220、第一反射组件230(例如,平面镜或者曲面镜)、第二反射组件240(例如,平面镜或者曲面镜)、风挡玻璃250。W-HUD系统的光学成像原理为:PGU产生的影像经过第一反射组件230、第二反射组件240以及风挡玻璃250反射至眼盒260处,因为光学系统有增加光程与放大效果,用户可以在车室外看到有景深的虚像270。目前W-HUD系统广泛应用于车载辅助驾驶,例如,车速、警示灯号、导航、高级驾驶辅助系统(advanced driving assistant system,ADAS)。
以下,对本申请实施例中的部分术语进行解释。需要说明的是,这些解释是为了便于本领域技术人员理解,并不是对本申请所要求的保护范围构成限定。
(1)眼盒(eyebox)
眼盒通常是指驾驶员的眼睛能够看到全部显示图像的范围,如图2所示。为了适应驾驶员的身高的差异,一般眼盒尺寸是130mm×50mm,即驾驶员的眼睛在纵向上有±50mm的移动范围,在横向上有±130mm的移动范围。若驾驶员的眼睛处于眼盒的范围内,可以看到完整且清晰的图像。若驾驶员的眼睛超出眼盒范围,可能会看到图像扭曲或者无法看到图像。
(2)虚像距(virtual image distance,VID)
虚像距是指眼盒(eyebox)中心与虚像的中心之间的距离。
(3)视场角(field of view,FOV)
虚像的左右边界对驾驶员眼睛所成的角度为水平视场角,投影虚像的上下边界对驾驶员的眼睛所成的角度为垂直视场角。
PGU常用的架构包括但不限于以下几种:
(1)发光二极管(light-emitting diode,LED)背光搭配薄膜晶体管液晶显示器(thin film transistor liquid crystal display,TFT-LCD)显示架构。其原理与液晶显示器(liquid crystal display,LCD)屏幕的原理类似,将白光光源用棱镜分为红、绿、蓝三色,经过液晶单元达到投影的效果。
(2)数字光处理(digital light processing,DLP)架构。DLP架构可以由LED和数字微镜单元(digital micro-mirror device,DMD)组成。其原理是通过超微型镜片的DMD可以将强光源经过反射后投影出来。
(3)激光扫描(laser beam scanning,LBS)搭配激光扩散膜(diffuser)架构。其具有低功率、高亮度、高对比度以及FOV大的特点,适用于AR-HUD系统。
(4)硅基液晶(liquid crystal on silicon,LCoS)架构。
图3示出了PGU的结构示意图。该PGU为LBS搭配激光扩散膜的架构。LBS中包括红色激光二极管(red laser diode,R LD)310、蓝色激光二极管(blue laser diode,B LD)320、绿色激光二极管(green laser diode,G LD)330、合光系统(例如,偏振分光棱镜(polarizing beam splitter,PBS))340以及微机电系统(micro-electro-mechanical system,MEMS)350。其原理为红蓝绿(red blue green,RGB)三色激光透过PBS变成合束光线。合束光线入射至MEMS进行快慢轴旋转扫描。合束光线经过MEMS扫描得到的扫描光线投射至激光扩散膜360产生影像。LBS决定影像的FOV与焦距的大小;激光扩散膜决定光学扩散角、主光线角(chief ray angle,CRA)、分辨率和眼盒(eye box)的规格。
一个实施例中,红色激光二极管310、蓝色激光二极管320和绿色激光二极管330发 出的可见光光束可以分别经过对应的准直镜(collimator)后进入合光系统340。该准直镜的作用可以是使得激光二极管发出的光线变成平行光。
图4示出了通过快慢轴进行扫描的示意图。其中,快轴为横轴(例如,快轴可以以正弦波扫描);慢轴为纵轴(例如,慢轴可以以三角波扫描)。
图5示出了用于形成两个不同位置的虚像的HUD系统架构示意图。该HUD包括PGU1和PGU2、曲面反射镜1和曲面反射镜2。为了实现形成两个不同虚像距的虚像,PGU1和PGU2需要放置在指定的位置。PGU1和PGU2需要放置的指定位置是根据需要形成的两个虚像的虚像距来确定的。当前的双PGU架构还未能达到量产阶段且双PGU架构会使得HUD的体积庞大且成本较高。同时,在成像时需要考虑错开PGU1和PGU2生成的光线,这样会导致光学设计复杂;也会导致虚像1和虚像2之间的间距较大,视场角FOV的规格难以满足车厂需求。
本申请实施例中提供了一种显示方法、显示系统和终端设备,通过PGU生成可见光对应的第一光线和不可见光对应的第二光线,第一光线经过第一显示装置形成虚像,第二光线经过第二显示装置形成实像。从而通过形成的虚像和实像实现两个不同焦距的图像的显示。无需在显示系统中增加额外的硬件,有助于降低显示系统的体积,且有助于降低显示系统的成本。
示例性的,以LBS架构的PGU为例,在原有的LBS中可见光激光二极管(例如,R LD、B LD以及G LD)的基础上搭配不可见光激光二极管(例如,红外线激光二极管(infrared laser diode,IR LD)),通过激光扩散膜位置的切换与添加荧光粉粒子的PVB膜,从而实现双焦HUD的功能。其光学设计简单,可以在不增大HUD体积的前提下实现双焦HUD的功能,且成本较低。
图6示出了本申请实施例提供的显示系统600的结构示意图。该显示系统600中包括主控板610、控制板620、LBS630、激光扩散膜640和后段光学模块650。其中,主控板610可以通过输入/输出(input/output,I/O)接口控制控制板620、LBS630、激光扩散膜640和后段光学模块650。控制板620可以用于控制LBS630中可见光激光二极管和不可见光激光二极管发出的激光的亮暗,以及控制MEMS的运动行为。后段光学模块650可以包括上述第一反射组件230和第二反射组件240,用于控制光程差与光学放大倍率。本申请实施例中,可以在LBS630中新增不可见光激光二极管并搭配原有的可见光激光二极管,通过激光扩散膜640以及添加荧光粉粒子的PVB膜,从而实现双焦HUD的功能,并支持多场景(例如,单焦模式、影音模式等)的运用。
一个实施例中,主控板610可以接收图1中的计算平台150发送的指令,该指令可以用于控制控制板620、激光扩散膜640和后段光学模块650中的一个或多个。主控板610在接收到该指令后,可以实现对控制板620、激光扩散膜640和后段光学模块650中的一个或多个的控制。
一个实施例中,主控板610也可以接收计算平台150中的GPU发送的图像信息,该图像信息为待通过该显示系统600显示的图像信息。主控板610可以根据该图像信息生成指令,从而通过该指令控制控制板620、激光扩散膜640和后段光学模块650中的一个或者多个。
一个实施例中,该显示系统600中也可以包括GPU。通过GPU向主控板610发送图 像信息,从而使得主控板610根据该图像信息生成指令。
一个实施例中,该激光扩散膜的位置可以是固定的;或者,也可以是不固定的,例如,显示系统中还可以包括控制装置,控制装置可以控制激光扩散膜处于不同的位置。
图7示出了本申请实施例提供的PGU的另一结构示意图。相比于图3中所示的LBS结构,在LBS中除了R LD、B LD以及G LD以外,还增加了不可见光激光二极管(例如,IR LD 370)。
一个实施例中,红色激光二极管310、蓝色激光二极管320和绿色激光二极管330发出的可见光光束以及红外线激光二极管370发出的不可见光光束可以分别经过对应的准直镜后进入合光系统。
图8示出了本申请实施例提供的激光扩散膜的结构示意图。激光扩散膜可以分成第一区域和第二区域。第二区域中可以具有微数组透镜(micro lens array,MLA),第一区域可以是透明的(或者,第一区域可以是无光学设计的区域或者第一区域为中空结构)。可见光对应的光线可以投射于第二区域中,不可见光对应的光线可以投射于第一区域中。
应理解,图8中以第二区域位于下方,第一区域中位于上方为例进行说明,本申请实施例对此不作任何限定。例如,第二区域可以位于上方,第一区域可以位于下方;或者,第二区域可以位于右侧,第一区域可以位于左侧;或者,第二区域可以位于左侧,第一区域可以位于右侧;或者,第二区域可以位于外侧,第一区域可以位于内侧;或者,第二区域可以位于内侧,第一区域可以位于外侧。
图9示出了本申请实施例中提供的风挡玻璃的结构示意图。该风挡玻璃中间的PVB膜中添加了用于将不可见光激发为可见光的物质(例如,荧光粉粒子),PGU产生的影像中的不可见光在通过反射组件反射至风挡玻璃时,可以激发出可见光,从而形成近焦实像并显示于风挡玻璃上。
图10示出了本申请实施例提供的显示系统1000的结构示意图。该显示系统1000可以用于实现双焦HUD功能。该显示系统包括HUD主体1010、LBS1020、激光扩散膜1030、第一反射组件1040、第二反射组件1050和风挡玻璃1060。其中,LBS1020的结构可以如图7所示,激光扩散膜1030的结构可以如图8所示。LBS1020中R LD、B LD、G LD以及IR LD均发出激光,激光经过合束后合成一束光,通过MEMS扫描后的扫描光束经过激光扩散膜1030、第一反射组件1040、第二反射组件1050、风挡玻璃1060,最终呈现远焦虚像和近焦实像,从而实现双焦HUD的功能,其中,近焦位置所成的像为实像1061;远焦位置所成的像为虚像1070。下面对远焦虚像和近焦实像的成像过程进行介绍。
远焦虚像的成像过程:通过LBS1020中的R LD、B LD以及G LD发出的可见光经过合束合成一束光,并通过调配各个光源能量配比可以达到全彩显示。通过MEMS扫描后得到的扫描光束首先会经过激光扩散膜1030的第二区域成为实像,并通过第二区域中的MLA结构控制光线的光学扩散角和主光线角。接下来光线会经过第一反射组件1040、第二反射组件1050和风挡玻璃1060等光学镜片组件(或称为后段光学模块),其功能是将实像透过光学镜片组件产生目标大小和目标距离的虚像。其光学路径可以参考图10所示的实线:通过激光扩散膜产生的实像经过第一反射组件1040、第二反射组件1050、风挡玻璃1060后反射至眼盒1080,从而用户可以看到虚像1070。
近焦实像的成像过程:IR LD发出的红外线光束与可见光合成为一束光,通过MEMS 扫描后的红外线光束经过激光扩散膜上的第一区域、第一反射组件1040和第二反射组件1050,投射经过风挡玻璃的PVB夹层膜中的荧光粉颗粒,即可激发出可见光。其光学路径可以参考图10所示的虚线:LBS1020输出的红外线扫描光束经过激光扩散膜1030的第一区域、第一反射组件1040、第二反射组件1050后,经过风挡玻璃的PVB夹层膜中的荧光粉颗粒,即可激发出单色可见光,用户可以在风挡玻璃上看到的实像1061。该过程类似照明原理中红外线透过反射组件反射至有荧光粉的PVB膜可产生特定单色亮点。再运用IR LD亮暗控制即可在风挡玻璃中显示单色图像区域,成为近焦实像。
下面结合图11介绍本申请实施例提供的MEMS通过扫描将合束光束中的可见光投射至激光扩散膜的第二区域以及将合束光束中的不可见光投射至激光扩散膜的第一区域的过程。以图5示出的MEMS通过快慢轴进行扫描为例,PGU可以基于快轴的频率计算最佳的帧率(frame rate)与垂直方向上的解析度,水平方向上的解析度与激光光斑的大小相关。示例性的,图11中的(a)示出了一种经过MEMS扫描后的激光显示区域的坐标示意图,该激光扩散膜的左上角坐标可以为(0,0),右下角坐标可以为(Xn,Yn),通过快慢轴的扫描频率可以确定各坐标激光点亮的时机。
如图11中的(b)所示,假设显示区域的解析度为600*600,RGB激光所形成的图像的显示区域2为(0,200)-(600,600),IR激光所形成的图像的显示区域1为(0,0)-(600,200)。通过对MEMS的控制,可以使得IR激光只在显示区域1点亮,并且根据显示图像控制IR激光的开启/关闭时机;也可以使得RGB激光只在显示区域2点亮,并根据显示图像控制RGB激光的开启/关闭时机以及RGB激光的亮度。
图12示出了本申请实施例提供的双焦HUD的显示效果示意图。如图12所示,用户在座舱内可以看到包括区域1201和区域1202在内的图像。其中,区域1201中的图像为远焦虚像,该远焦虚像中可以显示车辆的导航信息(例如,提示用户左转);区域1202中的图像为近焦实像,该近焦实像中可以显示车辆的实时速度、剩余电量(或者剩余油量)、防抱死制动系统(anti-lock braking system,ABS)功能是否开启。
以上结合图10至图12介绍了实现双焦HUD的过程,下面结合图13至图15介绍实现单焦HUD的过程。
图13示出了本申请实施例提供的显示系统1000的另一结构示意图。相比于图10所示的结构,LBS1020中仅IR LD发出激光,通过MEMS扫描后的扫描光束可以不经过激光扩散膜,而是直接经过第一反射组件1040、第二反射组件1050、风挡玻璃1060,最终在风挡玻璃上呈现近焦实像,从而实现单焦HUD的功能。
应理解,图13中呈现近焦实像的过程可以参考上述实施例中的描述,为了简洁,此处不再赘述。
图14示出了本申请实施例提供的单焦HUD的显示效果示意图。例如,在驻车模式下,可以通过单焦HUD显示告警信息。
图15示出了本申请实施例提供的单焦HUD的另一显示效果示意图。例如,在驻车模式下,可以通过单焦HUD显示用户的手机号码的信息。
应理解,有别于远焦虚像只有座舱内的用户(例如,主驾位置上的用户)可以看到,近焦实像是座舱内的用户和座舱外的用户均可以看到的。通过单焦HUD功能还可以显示其他提示信息(例如,当前车辆已经开启哨兵模式、广告信息)。
图16示出了本申请实施例提供的双焦模式和单焦模式切换的示意图。如图16中的(a)所示,通过马达控制齿轮旋转,可以使得激光扩散膜处于竖直状态,此时可以实现双焦HUD的功能。如图16中的(b)所示,通过马达控制齿轮旋转,可以使得激光扩散膜处于水平状态,此时可以实现单焦HUD的功能。
以上结合图13至图16介绍实现单焦HUD的过程。下面结合图17至图18介绍实现通过HUD实现影音模式的过程。
图17示出了本申请实施例提供的显示系统1000的另一结构示意图。相比于图10所示的结构,LBS1020中仅R LD、B LD以及G LD发出激光,通过MEMS扫描后的扫描光束不经过激光扩散膜,而是经过第一反射组件1040、第二反射组件1050和投影幕布1090,并最终在投影幕布1090上呈现影像。此场景下LBS的功能类似于微投影机,可以适用于驻车休息时的影音模式。
图18示出了本申请实施例提供的影音模式的显示效果示意图。例如,用户可以在影音模式下,通过投影幕布观看自己喜欢的视频或者图片。
以上实施例中是以PGU为LBS搭配激光扩散膜架构为例进行说明的,本申请实施例的技术方案还可以运用在DLP架构或者LCoS架构中。
图19示出了本申请实施例提供的PGU的另一结构示意图。该PGU为DLP架构的PGU。该PGU包括R LD1901、B LD1902、G LD1903、IR LD1904、准直镜1905、合光系统1906、镜片阵列(lens array)1907、DMD1908、棱镜(prism)1909、投影透镜(projector lens)1910以及激光扩散膜1911,其中,镜片阵列的作用是将点光源变成面光源。R LD1901、B LD1902和G LD1903发出的可见光光束和IR LD1904发出的不可见光光束经过对应的准直镜1905进入合光系统1906,从而变成合束光线。合束光线经过镜片阵列1907后经过棱镜1909。棱镜1909可以将合束光线的入射角度逆时针旋转90°,并将入射角度逆时针旋转90°的光线发射至DMD1908。DMD1908在接收到棱镜1909反射的光线后可以将该光线穿透棱镜1909并投射至投影镜头1910。投影镜头1910可以将该光线分为可见光对应的光线和不可见光对应的光线。从而将不可见光对应的光线投射到激光扩散膜的第一区域(该第一区域中不包括MLA)且将可见光对应的光线投射到激光扩散膜的第二区域(该第二区域中包括MLA)。
图20示出了本申请实施例提供的PGU的另一结构示意图。该PGU为LCoS架构的PGU。该PGU包括R LD2001、B LD2002、G LD2003、IR LD2004、准直镜2005、合光系统2006、偏光膜(polarizer)2007、镜片阵列2008、LCoS2009、棱镜2010、投影透镜2011以及激光扩散膜2012,其中,偏光膜2007的作用是将合光系统2006产生的合束光线变成偏振光;镜片阵列2008的作用是将点光源变成面光源。
R LD2001、B LD2002和G LD2003发出的可见光光束和IR LD2004发出的不可见光光束经过对应的准直镜2005进入合光系统2006,从而变成合束光线。合束光线经过偏光膜2007后形成偏振光。偏振光经过镜片阵列2008后从点光源变成面光源。镜片阵列2008将光线投射到棱镜2010后,光线的入射角度逆时针旋转90°。棱镜2010将入射角度逆时针旋转90°的光线发射至LCoS2009。LCoS2009在接收到棱镜2010反射的光线后可以将该光线穿透棱镜2010并投射至投影镜头2011。投影镜头2011可以将该光线分为可见光对应的光线和不可见光对应的光线。从而将不可见光对应的光线投射到激光扩散膜的第 一区域(该第一区域中不包括MLA)且将可见光对应的光线投射到激光扩散膜的第二区域(该第二区域中包括MLA)。
应理解,不可见光对应的光线投射到激光扩散膜的第一区域且可见光对应的光线投射到激光扩散膜的第二区域后,光线的传输过程可以参考上述实施例中的描述,此处不再赘述。
以上实施例中是以抬头显示系统为例进行说明的,本申请实施例并不限于此。示例性的,本申请实施例提供的技术方案还可以应用于车辆侧窗显示系统以及车辆后风挡显示系统。
图21示出了本申请实施例的技术方案应用于车辆侧窗显示系统的效果示意图。该侧窗显示系统可以位于车辆顶棚。以侧窗显示系统为二排左侧区域的显示系统为例。二排左侧区域的侧窗玻璃中可以包括荧光材料。二排左侧区域的用户可以在侧窗外看到虚像以及在侧窗上看到实像。如图21所示,用户可以在侧窗外看到新闻资讯内容2101以及在侧窗上看到广告内容2102。
图22示出了本申请实施例的技术方案应用于车辆侧窗显示系统的效果示意图。当车辆检测到二排左侧的用户即将打开车门下车时,车辆可以通过传感器(例如,激光雷达、毫米波雷达、摄像头等)检测二排左侧对应的车门外是否有车辆或者行人经过,从而确定二排左侧的车门打开后是否有碰撞风险。若存在碰撞风险,则该侧窗显示系统可以单焦模式在侧窗上显示提示信息(例如,“请当心!打开车门有碰撞风险”)。
图23示出了本申请实施例的技术方案应用于车辆后风挡显示系统的效果示意图。该后风挡显示系统可以位于车辆顶棚,后风挡玻璃上可以包括荧光材料。示例性的,当车辆处于自动驾驶状态时,车辆可以通过单焦模式在后风挡玻璃上显示“自动驾驶中”这一提示信息。这样,车辆后方的行人或者其他车辆可以通过该提示信息获知该车辆目前处于自动驾驶状态。
图24示出了本申请实施例提供的一种显示系统2400的示意性框图。如图24所示,该显示系统2400包括PGU2410和反射组件2420,其中,该PGU2410用于向该反射组件2420发送第一光线和第二光线,该第一光线为可见光对应的光线,该第二光线为不可见光对应的光线;该反射组件2420用于将接收到该第一光线反射至第一显示装置以及将该第二光线反射至包括荧光材料的第二显示装置,该第一光线经该第一显示装置形成虚像,该第二光线经过该第二显示装置形成实像。
可选地,该PGU包括不可见光激光二极管和激光扩散膜,该激光扩散膜包括第一区域,其中,该PGU2410用于将该不可见光激光二极管发出的不可见光光束通过该第一区域形成该第二光线。
可选地,该PGU2410还包括可见光激光二极管,该激光扩散膜还包括第二区域,该第二区域包括微数组透镜MLA,其中,该PGU2410用于将该可见光激光二极管发出的可见光光束通过该第二区域形成该第一光线。
以上是以激光二极管为例进行说明的,本申请实施例并不限于此。例如,可以发出可见光或者不可见光的发光装置还可以为发光二极管(light-emitting diode,LED),有源矩阵有机发光二极体或主动矩阵有机发光二极体(active-matrix organic light emitting diode,AMOLED),柔性发光二极管(flex light-emitting diode,FLED),Mini-LED或者Micro-LED 等。
可选地,该PGU2410还包括合光系统和微机电系统MEMS,其中,该合光系统用于将该不可见光光束和该可见光光束合成为第一光束并向该MEMS发送该第一光束;该MEMS用于对该第一光束进行扫描,得到第一扫描光线和第二扫描光线;该MEMS还用于将该第一扫描光线投射至该第二区域且将该第二扫描光线投射至该第一区域。该第一扫描光线投射至该第二区域后形成该第一光线,该第二扫描光线投射至该第一区域后形成该第二光线。
可选地,该PGU2410中还包括控制装置,其中,该控制装置用于控制该激光扩散膜处于第一位置,其中,当该激光扩散膜处于该第一位置时,该第一扫描光线能够投射至该第二区域且该第二扫描光线能够投射至该第一区域。
可选地,该不可见光激光二极管包括红外线激光二极管。
可选地,该第一区域为中空结构。
当第一区域为中空结构或者无光学设计的区域时,该第二扫描光线和该第二光线可以为同一光线。
可选地,该第一光线携带导航图像信息;和/或,该第二光线携带仪表图像信息。
可选地,该显示系统是抬头显示系统HUD,其中,该第一显示装置为风挡玻璃。
可选地,该第二显示装置位于该风挡玻璃中。
可选地,该荧光材料位于该风挡玻璃的聚乙烯醇缩丁醛PVB膜中。
可选地,该第二显示装置可以为荧光膜,该荧光膜贴在风挡玻璃靠近座舱的一侧。
图25示出了本申请实施例提供的一种显示方法2500的示意性流程图。如图25所示,该方法包括:
S2501,控制图像产生单元PGU向反射组件发送第一光线和第二光线,该第一光线为可见光对应的光线,该第二光线为不可见光对应的光线。
可选地,该控制PGU向反射组件发送第一光线和第二光线,包括:第一处理器向PGU发送第一指令,该第一指令用于控制该PGU向该反射组件发送该第一光线和该第二光线。
示例性的,该第一处理器可以是上述主控板610中的处理器,或者,该第一处理器也可以是上述计算平台150中的处理器(例如,CPU)。PGU在接收到该第一指令后,可以执行该第一指令,从而使得PGU向该反射组件发送该第一光线和该第二光线。
示例性的,该第一处理器在向PGU发送该第一指令之前,还可以先接收到第二处理器发送的图像信息,该图像信息与上述实像和虚像相关。第一处理器在接收到该图像信息后,可以根据该图像信息生成该第一指令。例如,该第二处理器可以是上述计算平台150中的GPU,该第一处理器可以为上述主控板610中的处理器。GPU可以将该图像信息发送给该第一处理器,从而使得该第一处理器根据该图像信息生成该第一指令。
示例性的,该第二处理器和该第一处理器均可以为上述主控板中的处理器。
以LBS架构的PGU为例,以上第一指令可以包括对激光二极管和MEMS的控制。其中,激光二极管可以包括可见光激光二极管和不可见光激光二极管。
可选地,该第一光线携带导航图像信息;和/或,该第二光线携带仪表图像信息。
可选地,该显示系统为抬头显示系统HUD,其中,该第一显示装置为风挡玻璃。
可选地,该第二显示装置位于该风挡玻璃中。
可选地,该荧光材料位于该风挡玻璃的聚乙烯醇缩丁醛PVB膜中。
以上荧光材料的作用是将不可见光激发为可见光。
可选地,该PGU包括不可见光激光二极管和激光扩散膜,该激光扩散膜包括第一区域,该控制PGU向反射组件发送第一光线和第二光线,包括:控制该不可见光激光二极管发出的不可见光光束通过该第一区域形成该第二光线。
可选地,该控制该不可见光激光二极管发出的不可见光光束通过该第一区域形成该第二光线,包括:该第一处理器向该PGU发送第二指令,该第二指令用于控制该不可见光激光二极管发出的不可见光光束通过该第一区域形成该第二光线。PGU在接收到该第二指令后,可以执行该第二指令,从而使得该不可见光激光二极管发出的不可见光光束通过该第一区域形成该第二光线。
以上第二指令可以是第一指令的一条子指令。以LBS架构的PGU为例,以上第二指令中可以包括对不可见光激光二极管的控制和MEMS的控制。
示例性的,第二处理器向第一处理器发送的图像信息中包括该实像对应的图像信息和该虚像对应的图像信息。第一处理器可以根据该实像对应的图像信息生成该第二指令。例如,第一处理器可以通过电路来控制不可见光激光二极管的开启(例如,通过调节电压或者电流等参数使得不可见光激光二极管开启)。第一处理器可以将MEMS中快轴扫描的参数信息以及慢轴扫描的参数信息发送给MEMS,从而MEMS可以根据该快轴扫描的参数信息和慢轴扫描的参数信息对光束进行扫描。
可选地,该不可见光激光二极管包括红外线激光二极管。
可选地,该第一区域为中空结构。
可选地,该PGU还包括可见光激光二极管,该激光扩散膜还包括第二区域,该第二区域包括微数组透镜MLA,该控制PGU向反射组件发送第一光线和第二光线,包括:控制该可见光激光二极管发出的可见光光束通过该第二区域形成所述第一光线。
可选地,该控制该可见光激光二极管发出的可见光光束通过该第二区域形成所述第一光线,包括:该第一处理器向该PGU发送第三指令,该第三指令用于控制该可见光激光二极管发出的可见光光束通过该第二区域形成所述第一光线。PGU在接收到该第三指令后,可以执行该第三指令,从而使得该可见光激光二极管发出的可见光光束通过该第二区域形成所述第一光线。
以上第三指令可以是第一指令的一条子指令。以LBS架构的PGU为例,以上第三指令中可以包括对可见光激光二极管的控制和MEMS的控制。
示例性的,第二处理器向第一处理器发送的图像信息中包括该实像对应的图像信息和该虚像对应的图像信息。第一处理器可以根据该虚像对应的图像信息生成该第三指令。例如,第一处理器可以通过电路来控制可见光激光二极管开启以及控制可见光激光二极管的亮暗。第一处理器可以控制MEMS根据快轴扫描的参数信息和慢轴扫描的参数信息对光束进行扫描。
可选地,该PGU还包括合光系统和微机电系统MEMS,该不可见光和该可见光光束经过该合光系统形成第一光束且该合光系统能够将该述第一光束投射至该MEMS,该控制PGU向反射组件发送第一光线和第二光线,包括:控制MEMS对该第一光束进行扫描,得到第一扫描光线和第二扫描光线;控制MEMS将该第一扫描光线投射至该第二区域且 将该第二扫描光线投射至该第一区域。
以上控制MEMS对第一光束进行扫描、将扫描得到的第一扫描光线投射至第二区域和第二扫描光线投射至第一区域的过程也可以通过第一处理器向PGU发送指令的方式实现。第一处理器发送的对MEMS的控制指令可以携带在上述第一指令中;或者,也可以携带在上述第二指令中;或者,还可以携带在上述第三指令中。
可选地,该激光扩散膜处于第一位置,其中,当该激光扩散膜处于该第一位置时,该第一扫描光线能够投射至该第二区域且该第二扫描光线能够投射至该第一区域。
以上激光扩散膜的位置可以是固定的,无需通过第一处理器进行控制。或者,该激光扩散膜的位置也可以是变化的,通过第一处理器可以对激光扩散膜的位置进行控制。例如,当需要通过显示系统显示上述虚像和实像时,第一处理器可以向控制装置发送第四指令,该第四指令可以用于指示将激光扩散膜设置到该第一位置。控制装置可以通过I/O接口接收该第四指令,从而可以根据该第四指令将激光扩散膜设置到第一位置。
以上第四指令可以是第一指令的一条子指令;或者,该第四指令也可以是和第一指令相互独立的指令。
S2502,控制该反射组件将接收到的该第一光线反射至第一显示装置以及将该第二光线反射至包括荧光材料的第二显示装置,该第一光线经该第一显示装置形成虚像,该第二光线经过该第二显示装置形成实像。
以LBS架构的PGU为例,以上形成实像的过程可以为不可见光激光二极管发出的不可见光光束经过准直镜后进入合光系统。合光系统输出的第一光束经过MEMS扫描后得到第二扫描光线。MEMS将该第二扫描光线投射至激光扩散膜的第一区域后形成该第二光线。该第二光线经过反射组件反射至包括荧光材料的该第二显示装置,从而形成实像。
形成虚像的过程可以为可见光激光二极管发出的可见光经过准直镜后进入合光系统。合光系统输出的第一光束经过MEMS扫描后得到第一扫描光线。MEMS将该第一扫描光线投射至激光扩散膜的第二区域后形成该第一光线。该第一光线经过反射组件反射至该第一显示装置,从而形成虚像。
可选地,该控制该反射组件将接收到的该第一光线反射至第一显示装置以及将该第二光线反射至包括荧光材料的第二显示装置,包括:第一处理器向反射组件发送第五指令,该第五指令用于指示将该第一光线反射至第一显示装置以及将该第二光线反射至该第二显示装置。该反射组件在接收到该第五指令后,可以根据该第五指令,将该第一光线反射至第一显示装置以及将该第二光线反射至该第二显示装置。示例性的,该第五指令中可以包括反射组件的旋转角度。例如,若该反射组件中包括多个(例如,图10中的第一反射组件1040和第二反射组件1050),那么该第五指令中可以分别携带多个反射组件中每个反射组件的旋转角度。
以上虚像是用户坐在座舱内时可以在座舱外看到的图像,而实际上座舱外的该位置处并没有图像信息。可见光经过反射组件反射至风挡玻璃,并由风挡玻璃反射至人眼,由于显示系统有增加光程和放大的效果,使得用户可以在座舱外的某个位置看到具有景深的虚像。
以上实像是指用户在第二显示装置上看到的图像。该实像是真实显示在第二显示装置上的。也就是说,座舱内所有位置上的用户或者座舱外的用户均可以通过该第二显示装置 看到该实像。
本申请实施例还提供了一种终端设备,该终端设备可以包括上述显示系统。
可选地,该终端设备为车辆。
本领域普通技术人员可以意识到,结合本文中所公开的实施例描述的各示例的单元及算法步骤,能够以电子硬件、或者计算机软件和电子硬件的结合来实现。这些功能究竟以硬件还是软件方式来执行,取决于技术方案的特定应用和设计约束条件。专业技术人员可以对每个特定的应用来使用不同方法来实现所描述的功能,但是这种实现不应认为超出本申请的范围。
所属领域的技术人员可以清楚地了解到,为描述的方便和简洁,上述描述的系统、装置和单元的具体工作过程,可以参考前述方法实施例中的对应过程,在此不再赘述。
在本申请所提供的几个实施例中,应该理解到,所揭露的系统、装置和方法,可以通过其它的方式实现。例如,以上所描述的装置实施例仅仅是示意性的,例如,所述单元的划分,仅仅为一种逻辑功能划分,实际实现时可以有另外的划分方式,例如多个单元或组件可以结合或者可以集成到另一个系统,或一些特征可以忽略,或不执行。另一点,所显示或讨论的相互之间的耦合或直接耦合或通信连接可以是通过一些接口,装置或单元的间接耦合或通信连接,可以是电性,机械或其它的形式。
所述作为分离部件说明的单元可以是或者也可以不是物理上分开的,作为单元显示的部件可以是或者也可以不是物理单元,即可以位于一个地方,或者也可以分布到多个网络单元上。可以根据实际的需要选择其中的部分或者全部单元来实现本实施例方案的目的。
另外,在本申请各个实施例中的各功能单元可以集成在一个处理单元中,也可以是各个单元单独物理存在,也可以两个或两个以上单元集成在一个单元中。
所述功能如果以软件功能单元的形式实现并作为独立的产品销售或使用时,可以存储在一个计算机可读取存储介质中。基于这样的理解,本申请的技术方案本质上或者说对现有技术做出贡献的部分或者该技术方案的部分可以以软件产品的形式体现出来,该计算机软件产品存储在一个存储介质中,包括若干指令用以使得一台计算机设备(可以是个人计算机,服务器,或者网络设备等)执行本申请各个实施例所述方法的全部或部分步骤。而前述的存储介质包括:U盘、移动硬盘、只读存储器(read-only memory,ROM)、随机存取存储器(random access memory,RAM)、磁碟或者光盘等各种可以存储程序代码的介质。
以上所述,仅为本申请的具体实施方式,但本申请的保护范围并不局限于此,任何熟悉本技术领域的技术人员在本申请揭露的技术范围内,可轻易想到变化或替换,都应涵盖在本申请的保护范围之内。因此,本申请的保护范围应以所述权利要求的保护范围为准。

Claims (26)

  1. 一种显示系统,其特征在于,所述显示系统包括图像产生单元PGU和反射组件,其中,
    所述PGU,用于向所述反射组件发送第一光线和第二光线,所述第一光线为可见光对应的光线,所述第二光线为不可见光对应的光线;
    所述反射组件,用于将接收到所述第一光线反射至第一显示装置以及将所述第二光线反射至包括荧光材料的第二显示装置,所述第一光线经所述第一显示装置形成虚像,所述第二光线经过所述第二显示装置形成实像。
  2. 如权利要求1所述的显示系统,其特征在于,所述PGU包括不可见光激光二极管和激光扩散膜,所述激光扩散膜包括第一区域,其中,
    所述PGU,用于将所述不可见光激光二极管发出的不可见光光束通过所述第一区域形成所述第二光线。
  3. 如权利要求2所述的显示系统,其特征在于,所述PGU还包括可见光激光二极管,所述激光扩散膜还包括第二区域,所述第二区域包括微数组透镜MLA,其中,
    所述PGU,用于将所述可见光激光二极管发出的可见光光束通过所述第二区域形成所述第一光线。
  4. 如权利要求3所述的显示系统,其特征在于,所述PGU还包括合光系统和微机电系统MEMS,其中,
    所述合光系统,用于将所述不可见光光束和所述可见光光束合成为第一光束并向所述MEMS发送所述第一光束;
    所述MEMS,用于对所述第一光束进行扫描,得到第一扫描光线和第二扫描光线;
    所述MEMS,还用于将所述第一扫描光线投射至所述第二区域且将所述第二扫描光线投射至所述第一区域。
  5. 如权利要求4所述的显示系统,其特征在于,所述PGU中还包括控制装置,其中,
    所述控制装置,用于控制所述激光扩散膜处于第一位置,其中,当所述激光扩散膜处于所述第一位置时,所述第一扫描光线能够投射至所述第二区域且所述第二扫描光线能够投射至所述第一区域。
  6. 如权利要求2至5中任一项所述的显示系统,其特征在于,所述不可见光激光二极管包括红外线激光二极管。
  7. 如权利要求2至6中任一项所述的显示系统,其特征在于,所述第一区域为中空结构。
  8. 如权利要求1至7中任一项所述的显示系统,其特征在于,所述第一光线携带导航图像信息;和/或,
    所述第二光线携带仪表图像信息。
  9. 如权利要求1至8中任一项所述的显示系统,其特征在于,所述显示系统是抬头显示系统HUD,其中,所述第一显示装置为风挡玻璃。
  10. 如权利要求9所述的显示系统,其特征在于,所述第二显示装置位于所述风挡玻 璃中。
  11. 如权利要求10所述的显示系统,其特征在于,所述荧光材料位于所述风挡玻璃的聚乙烯醇缩丁醛PVB膜中。
  12. 一种显示方法,其特征在于,所述方法包括:
    控制图像产生单元PGU向反射组件发送第一光线和第二光线,所述第一光线为可见光对应的光线,所述第二光线为不可见光对应的光线;
    控制所述反射组件将接收到的所述第一光线反射至第一显示装置以及将所述第二光线反射至包括荧光材料的第二显示装置,所述第一光线经所述第一显示装置形成虚像,所述第二光线经过所述第二显示装置形成实像。
  13. 如权利要求12所述的方法,其特征在于,所述PGU包括不可见光激光二极管和激光扩散膜,所述激光扩散膜包括第一区域,所述控制图像产生单元PGU向反射组件发送第一光线和第二光线包括:
    控制所述不可见光激光二极管发出的不可见光光束通过所述第一区域形成所述第二光线。
  14. 如权利要求13所述的方法,其特征在于,所述PGU还包括可见光激光二极管,所述激光扩散膜还包括第二区域,所述第二区域包括微数组透镜MLA,所述控制图像产生单元PGU向反射组件发送第一光线和第二光线,包括:
    控制所述可见光激光二极管发出的可见光光束通过所述第二区域形成所述第一光线。
  15. 如权利要求14所述的方法,其特征在于,所述PGU还包括合光系统和微机电系统MEMS,所述不可见光和所述可见光光束经过所述合光系统形成第一光束且所述合光系统能够将所述第一光束投射至所述MEMS,所述控制图像产生单元PGU向反射组件发送第一光线和第二光线包括:
    控制所述MEMS对所述第一光束进行扫描,得到第一扫描光线和第二扫描光线;
    控制所述MEMS将所述第一扫描光线投射至所述第二区域且将所述第二扫描光线投射至所述第一区域。
  16. 如权利要求15所述的方法,其特征在于,当所述激光扩散膜处于第一位置时,所述第一扫描光线能够投射至所述第二区域且所述第二扫描光线能够投射至所述第一区域;
    其中,所述第一位置为固定位置,或者,
    所述方法还包括:
    控制所述激光扩散膜处于所述第一位置。
  17. 如权利要求13至16中任一项所述的方法,其特征在于,所述不可见光激光二极管包括红外线激光二极管。
  18. 如权利要求13至17中任一项所述的方法,其特征在于,所述第一区域为中空结构。
  19. 如权利要求12至18中任一项所述的方法,其特征在于,所述第一光线携带导航图像信息;和/或,
    所述第二光线携带仪表图像信息。
  20. 如权利要求12至19中任一项所述的方法,其特征在于,所述显示系统为抬头显 示系统HUD,其中,所述第一显示装置为风挡玻璃。
  21. 如权利要求20所述的方法,其特征在于,所述第二显示装置位于所述风挡玻璃中。
  22. 如权利要求21所述的方法,其特征在于,所述荧光材料位于所述风挡玻璃的聚乙烯醇缩丁醛PVB膜中。
  23. 一种终端设备,其特征在于,包括如权利要求1至11中任一项所述的显示系统。
  24. 如权利要求23所述的终端设备,其特征在于,所述终端设备为车辆。
  25. 一种计算机可读存储介质,其特征在于,其上存储有计算机程序,所述计算机程序被计算机执行时,以使得实现如权利要求12至22中任一项所述的方法。
  26. 一种芯片,其特征在于,所述芯片包括处理器与数据接口,所述处理器通过所述数据接口读取存储器上存储的指令,以执行如权利要求12至22中任一项所述的方法。
PCT/CN2022/084201 2022-03-30 2022-03-30 一种显示方法、显示系统和终端设备 WO2023184276A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/CN2022/084201 WO2023184276A1 (zh) 2022-03-30 2022-03-30 一种显示方法、显示系统和终端设备

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2022/084201 WO2023184276A1 (zh) 2022-03-30 2022-03-30 一种显示方法、显示系统和终端设备

Publications (1)

Publication Number Publication Date
WO2023184276A1 true WO2023184276A1 (zh) 2023-10-05

Family

ID=88198553

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/084201 WO2023184276A1 (zh) 2022-03-30 2022-03-30 一种显示方法、显示系统和终端设备

Country Status (1)

Country Link
WO (1) WO2023184276A1 (zh)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020120916A1 (en) * 2001-01-16 2002-08-29 Snider Albert Monroe Head-up display system utilizing fluorescent material
US20020175880A1 (en) * 1998-01-20 2002-11-28 Melville Charles D. Augmented retinal display with view tracking and data positioning
CN101872067A (zh) * 2009-04-02 2010-10-27 通用汽车环球科技运作公司 全挡风玻璃hud增强:有限像素化视场结构
JP2012163613A (ja) * 2011-02-03 2012-08-30 Denso Corp 虚像表示装置
CN213987029U (zh) * 2020-08-21 2021-08-17 未来(北京)黑科技有限公司 双层成像抬头显示装置、抬头显示系统及交通设备
WO2022037703A1 (zh) * 2020-08-21 2022-02-24 未来(北京)黑科技有限公司 多层图像显示装置、抬头显示器以及交通设备

Similar Documents

Publication Publication Date Title
US11333521B2 (en) Head-up display, vehicle device, and information display method
US10732408B2 (en) Projection type display device and projection display method
US9678340B2 (en) Vehicular display apparatus
US9958676B2 (en) Projector device
WO2022188096A1 (zh) 一种hud系统、车辆及虚像的位置调节方法
US11878586B2 (en) Information display apparatus and spatial sensing apparatus
JPH07195960A (ja) 動力車用のヘッドアップ式表示システム
JP2019051823A (ja) 情報表示装置
JP7228646B2 (ja) 情報表示装置
US20210131818A1 (en) Head-up display, vehicle device, and information display method
WO2021054277A1 (ja) ヘッドアップディスプレイおよび画像表示システム
KR102145455B1 (ko) 헤드-업 디스플레이 디바이스 및 방법
WO2021015171A1 (ja) ヘッドアップディスプレイ
WO2024021574A1 (zh) 立体投影系统、投影系统和交通工具
TWI443377B (zh) 抬頭顯示裝置
WO2020110598A1 (ja) ヘッドアップディスプレイ
WO2023184276A1 (zh) 一种显示方法、显示系统和终端设备
JP6370793B2 (ja) ヘッドアップディスプレイ装置及び方法
WO2022124028A1 (ja) ヘッドアップディスプレイ
CN115685654A (zh) 投影装置、交通工具和显示设备
US20220197025A1 (en) Vehicular head-up display and light source unit used therefor
WO2020189636A1 (en) Information providing system, moving body, information providing method, and information providing program
US20240073379A1 (en) Picture generation apparatus, projection apparatus, and vehicle
WO2024021852A1 (zh) 立体显示装置、立体显示系统和交通工具
WO2023098228A1 (zh) 显示装置、电子设备以及交通工具

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22934127

Country of ref document: EP

Kind code of ref document: A1