WO2024098828A1 - Projection system, projection method and means of transport - Google Patents

Projection system, projection method and means of transport

Info

Publication number
WO2024098828A1
Authority
WO
WIPO (PCT)
Prior art keywords
projection system
image information
imaging
imaging light
light
Application number
PCT/CN2023/107815
Other languages
English (en)
Chinese (zh)
Inventor
王金蕾
李肖
陈宇宸
Original Assignee
华为技术有限公司 (Huawei Technologies Co., Ltd.)
Application filed by 华为技术有限公司 (Huawei Technologies Co., Ltd.)
Publication of WO2024098828A1

Classifications

    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B21/00Projectors or projection-type viewers; Accessories therefor
    • G03B21/14Details

Definitions

  • the present application relates to the field of display, and in particular to a projection system, a projection method and a vehicle.
  • a head-up display (HUD), also known as a head-up display system, is a device that projects information such as speed and navigation in front of the driver, so that the driver can see the instrument information without lowering his head.
  • in a HUD, in order to improve visual enjoyment, different image information can be projected to different distances from the driver.
  • for example, instrument-related information is projected 2.5 meters (m) away from the driver.
  • the HUD includes two projection systems. One projection system is used to project instrument-related information. The other projection system is used to project augmented reality (AR) information.
  • the cost of the projection system is high, which leads to the high cost of the HUD.
  • the present application provides a projection system, which can adjust the visual distance by adjusting the parallax information of two pieces of image information, thereby reducing the number of projection systems and the cost of the HUD, and which can improve the user experience by controlling the vergence-accommodation conflict (VAC).
  • the present application provides a projection system.
  • the projection system includes an image generating component and an optical element.
  • the image generating component is used to output two imaging lights.
  • the two imaging lights include a first imaging light and a second imaging light.
  • the first imaging light carries first image information.
  • the second imaging light carries second image information.
  • the optical element is used to reflect or transmit the two imaging lights, and the two imaging lights after reflection or transmission generate a virtual image on the virtual image plane.
  • the two imaging lights after reflection or transmission are irradiated to two positions of a receiving surface.
  • the two positions correspond to the two imaging lights one by one.
  • the distance between the two positions is m.
  • the distance between the virtual image plane and the receiving surface is d1.
  • the first parallax information will cause the two image information viewed by the user to generate a first parallax angle, so that the image information observed by the user is located in the first visual plane.
  • the distance between the receiving surface and the first visual plane is d2.
  • d2 is determined based on the first parallax information between the first image information and the second image information, m and d1.
  • the optical element is a windshield, which includes a first glass layer, a second glass layer, and an intermediate layer for bonding the first glass layer and the second glass layer.
  • the two imaging lights are linearly polarized lights.
  • the intermediate layer is used to absorb the linearly polarized light.
  • when the two imaging lights are incident on the optical element, they are reflected by the two glass layers in contact with the air inside and outside the optical element, thereby forming two virtual images at the receiving position of the human eye.
  • the two virtual images form a ghost image due to partial overlap. Ghosting can seriously affect the clarity of the HUD display and driving safety.
  • the impact of ghosting can be reduced, thereby improving the clarity of the HUD display and driving safety.
  • the middle layer is a wedge-shaped structure.
  • the wedge-shaped structure can reduce the impact of ghosting, thereby improving the clarity of the HUD display and driving safety.
  • the value range of d1 is between 2.5 m and 7 m.
  • the thickness of the intermediate layer at different positions is the same. Intermediate layers of the same thickness can reduce the cost of the optical element, thereby reducing the cost of the projection system.
  • d2 is less than d1.
  • the projection system will form a ghost image.
  • d2 is less than d1, the influence of the ghost image can be reduced, thereby improving the clarity of the HUD display and driving safety.
  • the value range of d1 is between 10 m and 15 m.
  • when the value of d1 is too small, the impact of ghosting is relatively large.
  • when the value of d1 is too large and VAC is less than 0.25 diopters, the value of d2 is large, thereby affecting the user experience. Therefore, in the present application, by controlling the value of d1, the impact of ghosting can be reduced and the user experience can be improved.
  • the first imaging light also carries third image information.
  • the second imaging light also carries fourth image information.
  • the distance between the receiving surface and the second visual plane is d3.
  • d3 is determined based on second parallax information between the third image information and the fourth image information, m and d1.
  • d3 is equal to d1.
  • when d1 is equal to d3, VAC-free display can be achieved for content displayed at a close distance.
  • the focal length of the optical element is f.
  • the distance between the image generating component and the optical element is d.
  • d is smaller than f.
  • the image information carried by the two imaging lights can be amplified, thereby improving the user experience.
  • the distance between the virtual image plane and the optical element is d0.
  • d0 satisfies the following formula: 1/d0 = 1/d - 1/f, that is, d0 = f·d/(f - d).
  • the image generating component includes a backlight component and a spatial light modulator.
  • the backlight component is used to output two light beams to the spatial light modulator at different angles in a time-sharing manner.
  • the spatial light modulator is used to modulate the two light beams in a time-sharing manner according to different image information to obtain two imaging lights, and output the two imaging lights at different angles.
  • the image generating component includes a backlight component, a spatial light modulator and a spectroscopic element.
  • the backlight component is used to generate a light beam to be modulated.
  • the spectroscopic element is used to split the light beam to be modulated to obtain two sub-beams to be modulated.
  • the spatial light modulator is used to modulate the two sub-beams to be modulated and output two imaging lights.
  • the spectroscopic element can reduce the refresh rate of the spatial light modulator, thereby improving the reliability of the image generating component.
  • the spectroscopic element can be a cylindrical grating, a liquid crystal grating, a barrier grating, an electronic grating, a diffraction element, etc.
  • the image generating component further includes a diffusion screen.
  • the diffusion screen is used to receive two paths of imaging light from the spatial light modulator, diffuse the two paths of imaging light, and output the diffused two paths of imaging light at different angles.
  • the two light beams include a first light beam and a second light beam.
  • the backlight assembly is used to output the first light beam at a first position and output the second light beam at a second position. By moving the backlight assembly, the backlight assembly can output the first light beam and the second light beam through one light source, thereby reducing the cost of the backlight assembly.
  • the projection system further includes a human eye tracking module and a processor.
  • the human eye tracking module is used to obtain the position of the receiving surface.
  • the processor is used to adjust the first parallax information of the first image information and the second image information according to the position of the receiving surface.
  • the second aspect of the present application provides a projection method.
  • the projection method can be applied to a projection system.
  • the projection method includes the following steps: the projection system outputs two imaging lights.
  • the two imaging lights include a first imaging light and a second imaging light.
  • the first imaging light carries the first image information.
  • the second imaging light carries the second image information;
  • the projection system reflects or transmits the two imaging lights, and the two imaging lights after reflection or transmission generate a virtual image on the virtual image plane.
  • the two imaging lights after reflection or transmission are irradiated to two positions of the receiving surface.
  • the distance between the two positions is m.
  • the distance between the virtual image plane and the receiving surface is d1, and the distance between the receiving surface and the first visual plane is d2.
  • d2 is determined based on the first parallax information between the first image information and the second image information, m and d1.
  • VAC satisfies the following relationship: VAC = |1/d1 - 1/d2|. Among them, VAC is less than 0.25 diopters.
  • the third aspect of the present application provides a vehicle, wherein the vehicle comprises a projection system as described in the first aspect or any optional manner of the first aspect, and the projection system is installed on the vehicle.
  • FIG1 is a first optical path schematic diagram of a projection system provided in an embodiment of the present application.
  • FIG2 is a first structural schematic diagram of a windshield provided in an embodiment of the present application.
  • FIG3 is a second structural schematic diagram of a windshield provided in an embodiment of the present application.
  • FIG4 is a third structural schematic diagram of a windshield provided in an embodiment of the present application.
  • FIG5 is a second optical path schematic diagram of the projection system provided in an embodiment of the present application.
  • FIG6 is a first structural diagram of a VAC provided in an embodiment of the present application.
  • FIG7 is a second structural diagram of a VAC provided in an embodiment of the present application.
  • FIG8 is a third optical path schematic diagram of the projection system provided in an embodiment of the present application.
  • FIG9 is a first structural diagram of an image generation component provided in an embodiment of the present application.
  • FIG10 is a second structural diagram of an image generation component provided in an embodiment of the present application.
  • FIG11 is a third structural schematic diagram of the image generation component provided in an embodiment of the present application.
  • FIG12 is a circuit diagram of a projection system provided in an embodiment of the present application.
  • FIG13 is a schematic diagram of a projection system installed in a vehicle according to an embodiment of the present application.
  • FIG14 is a schematic diagram of a possible functional framework of a vehicle provided in an embodiment of the present application.
  • FIG. 15 is a flow chart of the projection method provided in an embodiment of the present application.
  • the present application provides a projection system that can adjust the visual distance by adjusting the parallax information of two pieces of image information, thereby reducing the number of projection systems and the cost of the HUD; and by controlling the vergence-accommodation conflict (VAC), the user experience can be improved.
  • the "first”, “second”, “target”, etc. used in this application are only used for the purpose of distinguishing the description, and cannot be understood as indicating or implying relative importance, nor can they be understood as indicating or implying order.
  • reference numbers and/or letters are repeated in multiple figures of the present application. Repetition does not indicate a strict limiting relationship between various embodiments and/or configurations.
  • the projection system in this application is applied to the display field.
  • the head-up display in order to improve visual enjoyment, the head-up display (HUD) can project different image information to different distances from the driver.
  • the HUD includes two projection systems. One projection system is used to project instrument-related information.
  • the other projection system is used to project augmented reality (AR) information. Therefore, the cost of HUD is relatively high.
  • FIG. 1 is a first optical path schematic diagram of a projection system provided in an embodiment of the present application.
  • the projection system includes an image generation component 101 and an optical element 102.
  • the image generation component 101 is used to output two imaging lights.
  • the optical element 102 may be a reflector, a windshield, a lens or a diffraction element, etc.
  • the optical element 102 is used to reflect or transmit two imaging lights, and there is an angle between the two imaging lights after reflection or transmission.
  • the two imaging lights after reflection or transmission are respectively irradiated to two different positions of the receiving surface 105, such as the left eye (upper eye position) and the right eye (lower eye position) of the user.
  • the distance between the two positions is m.
  • the two positions correspond one to one to the two imaging lights.
  • the position of the human eye can also be called a viewpoint.
  • the above-mentioned projection system can provide multiple viewpoints for multiple people to watch.
  • the image generation component 101 can produce multiple groups of imaging lights for different people to watch. Among them, a group of imaging lights includes two imaging lights. This embodiment takes a viewpoint as an example, that is, the image generation component 101 generates two imaging lights as an example to illustrate the imaging process of the projection system.
  • the two-path imaging light includes a first path of imaging light and a second path of imaging light.
  • the first path of imaging light carries the first image information.
  • the second path of imaging light carries the second image information.
  • the two paths of imaging light after reflection or transmission form a virtual image on the virtual image plane 103.
  • P1 and P2 are both image points on the virtual image plane 103.
  • the light beams corresponding to P1 and P2 will be irradiated to different positions of the receiving surface 105.
  • the light beam corresponding to P1 belongs to the first path of imaging light.
  • the light beam corresponding to P2 belongs to the second path of imaging light.
  • the light beam corresponding to P1 irradiates the upper eye position of the user.
  • the light beam corresponding to P2 irradiates the lower eye position of the user.
  • the light beam corresponding to P1 will not irradiate the lower eye position of the user.
  • the light beam corresponding to P2 will not irradiate the upper eye position of the user.
  • the distance between the virtual image plane 103 and the receiving surface 105 is d1.
  • the first parallax information between the first image information and the second image information can be the distance between the two pixel points in a pixel group on the virtual image plane (referred to as the first distance).
  • the first image information includes N first pixel points.
  • the second image information includes N second pixel points. N is an integer greater than 0.
  • the N first pixel points correspond one to one with the N second pixel points.
  • the first image information and the second image information include N pixel groups.
  • a pixel group includes a first pixel point and a second pixel point corresponding thereto. Two pixel points in a pixel group are used to display the same point of an object.
  • both pixel points are used to display the center of a circle.
  • both pixel points are used to display the tip of a person's nose.
  • both pixel points are used to display the vertex of the number "1".
  • the first distances of the N pixel groups are the same.
  • the first parallax information will cause the two image information viewed by the user to generate a first parallax angle.
  • the first parallax angle is related to d1 and m.
  • P1 and P2 are two pixel points in the pixel group.
  • the distance between P1 and P2 is the first distance dm.
  • P1, P2 and the two positions illuminated by the two imaging lights form an isosceles trapezoid.
  • the first parallax angle θ1 is obtained from dm, m and d1.
  • the image information observed by the user is located on the visual plane 104.
  • P1 and P2 observed by the user are P10 located on the visual plane 104.
  • the distance between the visual plane 104 and the receiving surface 105 is d2.
  • d2 is also called the visual distance.
  • d2, θ1 and m satisfy the following relationship: d2 = m/(2·tan(θ1/2)).
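  • as an illustrative sketch (not part of the original disclosure; the function names and example values below are our assumptions), the trapezoid geometry above can be checked numerically:

```python
import math

def parallax_angle(dm: float, m: float, d1: float) -> float:
    """First parallax angle theta1 (radians) for a pixel pair separated by dm on a
    virtual image plane at distance d1, viewed from two positions separated by m.
    Uncrossed disparity (0 < dm < m) places the perceived point beyond the plane."""
    return 2.0 * math.atan((m - dm) / (2.0 * d1))

def visual_distance(theta1: float, m: float) -> float:
    """Visual distance d2 at which the two rays converge: d2 = m / (2*tan(theta1/2))."""
    return m / (2.0 * math.tan(theta1 / 2.0))

m = 0.065    # assumed interpupillary distance, ~65 mm
d1 = 4.5     # assumed virtual-image-plane distance in meters
dm = 0.013   # assumed first distance (pixel-pair separation) in meters

theta1 = parallax_angle(dm, m, d1)
d2 = visual_distance(theta1, m)
print(f"theta1 = {math.degrees(theta1):.3f} deg, d2 = {d2:.2f} m")
# Equivalent closed form from similar triangles: d2 = m * d1 / (m - dm)
assert abs(d2 - m * d1 / (m - dm)) < 1e-9
```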
  • the visual distance d2 can be adjusted by adjusting the parallax information of the two image information (the first image information and the second image information). Therefore, when the projection system in the present application is installed on a vehicle, the projection system can project different image information to different distances from the driver. For example, the instrument-related information is projected to 2.5 m from the driver, and the AR information is projected to 10 m from the driver. Therefore, the embodiment of the present application can reduce the number of projection systems, thereby reducing the cost of the HUD. In addition, by controlling the VAC, the user experience can be improved.
  • the two imaging lights after reflection or transmission are respectively irradiated to two different positions of the receiving surface 105.
  • the two positions correspond to two different light spots.
  • m can be the center distance of the two light spots.
  • m can be the pupil distance of the eyes.
  • FIG. 2 is a first structural schematic diagram of a windshield provided in an embodiment of the present application.
  • the windshield includes a first glass layer 201, a second glass layer 203, and an intermediate layer 202.
  • the intermediate layer 202 is used to bond the first glass layer 201 and the second glass layer 203.
  • the thickness of the intermediate layer 202 at different positions in the target area is the same.
  • the target area refers to the area through which the two imaging lights pass. It should be understood that there may be processing errors in the thickness of the intermediate layer 202 at different positions. Therefore, the same thickness of the intermediate layer 202 at different positions means that the thickness deviation of the intermediate layer 202 at different positions is less than 1 mm.
  • when any one of the two imaging lights is incident on the windshield, it will be reflected by the two glass layers in contact with the air inside and outside the windshield, thereby forming two virtual images at the receiving position of the human eye.
  • any one of the two imaging lights is incident on the second glass layer 203.
  • the imaging light reflected by the second glass layer 203 is reflected to the receiving position of the human eye.
  • the imaging light reflected by the second glass layer 203 forms a main image on the outside of the second glass layer 203.
  • the imaging light transmitted by the second glass layer 203 reaches the first glass layer 201 after passing through the middle layer.
  • the imaging light reflected by the first glass layer 201 is reflected to the receiving position of the human eye.
  • the imaging light reflected by the first glass layer 201 forms a virtual image on the outside of the first glass layer 201.
  • the virtual image and the main image are located at different positions.
  • the two virtual images form a ghost image because of partial overlap. Ghosting will seriously affect the clarity of the HUD display and driving safety.
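  • for intuition only (the example thickness, refractive index and incidence angle are our assumptions, not values from this application), the separation between the main and ghost rays of a flat windshield follows from basic plate optics:

```python
import math

def ghost_ray_separation(t: float, n: float, theta_deg: float) -> float:
    """Perpendicular separation between the rays reflected by the outer and inner
    air-glass surfaces of a parallel-sided windshield of thickness t (m) and
    refractive index n, at incidence angle theta_deg (degrees)."""
    theta_i = math.radians(theta_deg)
    theta_r = math.asin(math.sin(theta_i) / n)       # Snell's law inside the glass
    return 2.0 * t * math.tan(theta_r) * math.cos(theta_i)

# Example: 5 mm laminated stack, n = 1.5, 60-degree incidence (typical HUD geometry)
print(f"{ghost_ray_separation(0.005, 1.5, 60.0) * 1000:.2f} mm")  # ~3.54 mm
```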
  • the present application can reduce the impact of ghosting in any one or more of the following ways.
  • FIG. 3 is a second structural schematic diagram of the windshield provided in an embodiment of the present application. As shown in FIG. 3 , based on FIG. 2 , any one of the two imaging lights is incident on the second glass layer 203. The imaging light reflected by the second glass layer 203 is reflected to the receiving position of the human eye. The imaging light reflected by the second glass layer 203 forms a virtual image on the outside of the second glass layer 203. The imaging light transmitted by the second glass layer 203 is absorbed by the intermediate layer 202.
  • FIG. 4 is a third structural schematic diagram of a windshield provided in an embodiment of the present application.
  • the windshield includes a first glass layer 201, a second glass layer 203 and a middle layer 202.
  • the middle layer 202 is used to bond the first glass layer 201 and the second glass layer 203.
  • the middle layer 202 is a wedge-shaped structure in the target area. Any one of the two imaging lights is incident on the second glass layer 203.
  • the imaging light reflected by the second glass layer 203 is reflected to the receiving position of the human eye.
  • the imaging light reflected by the second glass layer 203 forms a main image on the outside of the second glass layer 203.
  • the imaging light transmitted by the second glass layer 203 reaches the first glass layer 201 after passing through the middle layer.
  • the imaging light reflected by the first glass layer 201 is reflected to the receiving position of the human eye.
  • the imaging light reflected by the first glass layer 201 forms a virtual image on the outside of the first glass layer 201.
  • the virtual image and the main image are located at the same position.
  • FIG. 5 is a second optical path schematic diagram of the projection system provided in an embodiment of the present application.
  • the projection system includes an image generation component 101 and an optical element 102.
  • the image generation component 101 is used to output two imaging lights.
  • the optical element 102 is used to reflect or transmit the two imaging lights, and there is an angle between the two imaging lights after reflection or transmission.
  • the two imaging lights after reflection or transmission are respectively irradiated to two different positions of the receiving surface 105. The distance between the two positions is m.
  • the two imaging lights include a first imaging light and a second imaging light.
  • the first imaging light carries the first image information.
  • the second imaging light carries the second image information.
  • the two imaging lights after reflection or transmission form a virtual image on the virtual image plane 103.
  • P1 and P2 are both image points on the virtual image plane 103.
  • the light beams corresponding to P1 and P2 will be irradiated to different positions of the receiving surface 105.
  • the light beam corresponding to P1 belongs to the first imaging light.
  • the light beam corresponding to P2 belongs to the second imaging light.
  • the light beam corresponding to P1 is irradiated to the lower eye position of the user.
  • the light beam corresponding to P2 is irradiated to the upper eye position of the user.
  • the first parallax information causes the two image information viewed by the user to produce a first parallax angle.
  • the first parallax angle is θ2.
  • the image information observed by the user is located on the visual plane 104.
  • P1 and P2 observed by the user are P10 located on the visual plane 104.
  • VAC can be less than 0.25 diopters.
  • the value of VAC is related to d1 and d2.
  • the relationship between VAC and d1 and d2 is described below in an exemplary manner.
  • Figure 6 is a first structural diagram of VAC provided in an embodiment of the present application.
  • the ordinate is VAC, and the unit is diopter.
  • the abscissa is distance, and the unit is meter.
  • Figure 6 provides 12 curves.
  • the starting point of each curve lies on the abscissa axis.
  • the abscissa of the starting point represents d1.
  • the abscissa of the starting point of curve 601 is 4.5.
  • 4.5 is d1 of curve 601.
  • the abscissa of any point on curve 601 represents d2.
  • the abscissa of point 602 is 8.
  • the ordinate of point 602 is 0.1.
  • curve 601 and point 602 represent that when d1 is equal to 4.5 and d2 is equal to 8, VAC is equal to 0.1. It should be understood that for the description of other curves, reference can be made to the description of curve 601.
  • when VAC is less than 0.25 diopters, the value of d2 is related to the value of d1. According to the description of FIG. 1 above, d2 is the visual distance. Therefore, the value of d2 affects the user experience. In the embodiment of the present application, in order to improve the user experience, the value range of d2 can be controlled by controlling the value of d1. For example, the value range of d1 is between 2.5 m and 7 m, wherein d1 can be 2.5 m or 7 m.
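  • reading VAC as |1/d1 - 1/d2| in diopters (our reconstruction of the elided relationship; it reproduces the FIG6 reading that d1 = 4.5 and d2 = 8 give VAC ≈ 0.1), the admissible range of d2 for a given d1 follows directly:

```python
def vac(d1: float, d2: float) -> float:
    """Vergence-accommodation conflict in diopters: accommodation at d1 (the
    virtual image plane), vergence at d2 (the visual plane), both in meters."""
    return abs(1.0 / d1 - 1.0 / d2)

def d2_range(d1: float, limit: float = 0.25) -> tuple:
    """Interval of visual distances d2 that keeps VAC below `limit` diopters."""
    lower = 1.0 / (1.0 / d1 + limit)
    upper = 1.0 / (1.0 / d1 - limit) if 1.0 / d1 > limit else float("inf")
    return lower, upper

print(f"{vac(4.5, 8.0):.2f}")            # 0.10, matching point 602 on curve 601
for d1 in (2.5, 7.0, 10.0, 15.0):
    lo, hi = d2_range(d1)
    print(f"d1 = {d1:4.1f} m -> d2 in [{lo:.2f}, {hi:.2f}] m")
```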
  • FIG7 is a second schematic diagram of the structure of VAC provided in an embodiment of the present application.
  • the ordinate is VAC, in diopters.
  • the abscissa is distance, in meters.
  • FIG7 provides six curves. The starting point of each curve lies on the abscissa axis.
  • the abscissas of the starting points of the six curves are 10, 10, 12, 12, 14 and 14, respectively. It should be understood that for the description of any of the six curves, reference may be made to the description of curve 601 in FIG6.
  • the value of d1 is between 10 m and 15 m. Among them, d1 can be 10 m or 15 m.
  • the value of d2 is related to d1.
  • the value of d2 is also related to the position of the receiving plane 105.
  • the projection system may further include an eye tracking module and a processor.
  • the eye tracking module is used to obtain the position of the receiving surface 105.
  • the processor is used to adjust the first parallax information of the first image information and the second image information according to the position of the receiving surface 105. By adjusting the parallax information, the value of d2 can be adjusted.
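  • a minimal sketch of this adjustment (the helper below is hypothetical; it simply inverts the closed-form relation d2 = m·d1/(m - dm) given above):

```python
def required_disparity(d2_target: float, m: float, d1: float) -> float:
    """First distance dm on the virtual image plane that places the perceived
    point at d2_target. dm > 0: uncrossed disparity (d2 > d1); dm < 0: crossed."""
    return m * (1.0 - d1 / d2_target)

m, d1 = 0.065, 4.5                       # assumed eye separation and image distance
print(required_disparity(10.0, m, d1))   # push AR content out to 10 m
print(required_disparity(2.5, m, d1))    # pull instrument content in to 2.5 m
```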
  • the first imaging light of the two imaging lights can be reflected or transmitted to the left eye of the user through the optical element 102.
  • the second imaging light of the two imaging lights can be reflected or transmitted to the right eye of the user through the optical element 102.
  • different process errors exist at different positions of the optical element 102.
  • the process error will cause the zoom factor and imaging position of the image observed by the user to have display differences relative to the ideal position, thereby reducing the user experience. Therefore, in an embodiment of the present application, image information with different parallaxes can be preprocessed.
  • the display difference is compensated by preprocessing, thereby enhancing the display effect.
  • the processor can perform one or more of the following processing on the left eye image and/or the right eye image loaded by the image generation component 101: translate the entire or part of the left eye image or the right eye image. Enlarge or reduce the entire or part of the left eye image or the right eye image. Distort the entire or part of the left eye image or the right eye image.
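  • such preprocessing is, in effect, a per-eye 2-D warp applied before display. A minimal sketch with a scale-and-translate term only; in practice the coefficients would come from calibrating the optical element, and distortion terms would be added:

```python
import numpy as np

def affine_warp_points(points: np.ndarray, scale: float, tx: float, ty: float) -> np.ndarray:
    """Apply a scale-and-translate compensation to (N, 2) pixel coordinates."""
    A = np.array([[scale, 0.0, tx],
                  [0.0, scale, ty],
                  [0.0, 0.0, 1.0]])
    homogeneous = np.hstack([points, np.ones((points.shape[0], 1))])
    return (homogeneous @ A.T)[:, :2]

# Example: shrink the left-eye image by 1% and shift it 2 px right, 1 px down
corners = np.array([[0.0, 0.0], [1920.0, 1080.0]])
print(affine_warp_points(corners, scale=0.99, tx=2.0, ty=1.0))
```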
  • the projection system can project different image information to the user at different distances in a time-sharing or simultaneous manner. This is described below.
  • FIG8 is a third optical path schematic diagram of the projection system provided in an embodiment of the present application.
  • the first imaging light carries the first image information group.
  • the first image information group includes the first image information and the third image information.
  • the second imaging light carries the second image information group.
  • the second image information group includes the second image information and the fourth image information.
  • the two imaging lights reflected or transmitted by the optical element 102 form a virtual image on the virtual image plane 103.
  • the virtual image also includes the first image information group and the second image information group.
  • the different image information groups observed by the user are located in different visual planes. These are described separately below.
  • P1 and P2 are image points on the virtual image plane 103.
  • the light beams corresponding to P1 and P2 will be irradiated to different positions of the receiving surface 105.
  • the light beam corresponding to P1 belongs to the first image information in the first imaging light.
  • the light beam corresponding to P2 belongs to the second image information in the second imaging light.
  • the light beam corresponding to P1 is irradiated to the upper eye position of the user.
  • the light beam corresponding to P2 is irradiated to the lower eye position of the user.
  • the first parallax information will cause the two image information viewed by the user to produce a first parallax angle.
  • the first parallax angle is θ1.
  • the image information observed by the user is located on the visual plane 104.
  • P1 and P2 observed by the user are P10 located on the visual plane 104.
  • P3 and P4 are image points on the virtual image plane 103.
  • the light beams corresponding to P3 and P4 will be irradiated to different positions of the receiving surface 105.
  • the light beam corresponding to P3 belongs to the third image information in the first imaging light.
  • the light beam corresponding to P4 belongs to the fourth image information in the second imaging light.
  • the light beam corresponding to P3 is irradiated to the upper eye position of the user.
  • the light beam corresponding to P4 is irradiated to the lower eye position of the user.
  • for the second parallax information, reference can be made to the aforementioned description of the first parallax information.
  • the second parallax information will cause the two image information viewed by the user to generate a second parallax angle.
  • the second parallax angle is related to d1 and m.
  • the second parallax angle is θ3.
  • the distance between the visual plane 801 and the receiving surface 105 is d3.
  • d3 is determined based on the second parallax information between the third image information and the fourth image information, m and d1.
  • the image information observed by the user is located on the visual plane 801.
  • P3 and P4 observed by the user are P11 located on the visual plane 801.
  • the projection system projects different image information to different distances from the user in a time-sharing manner.
  • the first imaging light output by the image generation component 101 carries the first image information.
  • the second imaging light output by the image generation component 101 carries the second image information.
  • the two imaging lights reflected or transmitted by the optical element 102 form a virtual image on the virtual image plane 103.
  • the image information observed by the user is located on the visual plane 104.
  • P1 and P2 observed by the user on the virtual image plane 103 are P10 located on the visual plane 104.
  • the first imaging light output by the image generation component 101 carries the third image information.
  • the second imaging light output by the image generation component 101 carries the fourth image information. The two imaging lights reflected or transmitted by the optical element 102 form a virtual image on the virtual image plane 103.
  • the third image information and the fourth image information have second parallax information.
  • the image information observed by the user is located on the visual plane 801.
  • P3 and P4 observed by the user on the virtual image plane 103 are P11 located on the visual plane 801.
  • the first moment and the second moment are alternately distributed.
  • the projection system projects different image information to different distances of the user in a time-sharing manner.
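  • schematically, the time-sharing scheme alternates the disparity applied to the frame pair. In the sketch below, the schedule and distances are our assumptions and rendering is reduced to a trace:

```python
import itertools

m, d1 = 0.065, 4.5  # assumed eye separation and virtual-image distance (meters)

def disparity_for(d2: float) -> float:
    """First distance that places content at visual distance d2 (see above)."""
    return m * (1.0 - d1 / d2)

# First moment: instrument content near 2.5 m; second moment: AR content near 10 m.
schedule = [("instruments", disparity_for(2.5)), ("ar_overlay", disparity_for(10.0))]

for moment, (content, dm) in zip(range(4), itertools.cycle(schedule)):
    # A real system would render the first/second image information with
    # parallax dm here; this loop only traces the alternating schedule.
    print(f"moment {moment}: {content:11s} dm = {dm * 1000:+.1f} mm")
```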
  • FIG. 8 is only an example of a projection system provided by an embodiment of the present application. In practical applications, those skilled in the art can design a projection system according to requirements. For example, in FIG. 8, d3 and d2 are greater than d1. In practical applications, d3 and/or d2 may be less than d1. For another example, the first imaging light and the second imaging light may also carry more image information. More image information corresponds to more visual planes.
  • the third image information and the fourth image information can be used to form a 3D image, or can be used to form a 2D image.
  • d2 or d3 can be equal to d1.
  • FIG9 is a first structural schematic diagram of an image generation component provided by an embodiment of the present application.
  • the image generation component 101 includes a backlight component 901 and a spatial light modulator 902.
  • the backlight component 901 is used to output two light beams to the spatial light modulator 902 at different angles in a time-sharing manner.
  • the spatial light modulator 902 can be a liquid crystal display (LCD), liquid crystal on silicon (LCOS), a digital micro-mirror device (DMD), or a micro-electro-mechanical system (MEMS).
  • the spatial light modulator 902 is used to modulate the two light beams in a time-sharing manner according to different image information to obtain two imaging lights, and output the two imaging lights at different angles.
  • the spatial light modulator 902 is used to modulate the first light beam according to the first image information to obtain the first imaging light.
  • the spatial light modulator 902 is used to modulate the second light beam according to the second image information to obtain the second imaging light.
  • the two-path imaging light is irradiated to the optical element.
  • the two-path imaging light reflected or transmitted by the optical element is irradiated to different viewpoints respectively.
  • the cost of the spatial light modulator 902 can be reduced, thereby reducing the cost of the image generation component 101.
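  • functionally, one spatial light modulator is time-multiplexed between the two viewpoints. The sketch below is purely illustrative; the hardware callbacks are hypothetical placeholders, not an API from this application:

```python
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class Frame:
    image: str        # stands in for the image information to be modulated
    angle_deg: float  # backlight angle steering toward one viewpoint

def time_share(frames: List[Frame],
               set_backlight_angle: Callable[[float], None],
               modulate: Callable[[str], None]) -> None:
    """Drive one SLM for two viewpoints: steer the backlight, then modulate."""
    for frame in frames:
        set_backlight_angle(frame.angle_deg)  # beam 1 or beam 2, as in FIG9
        modulate(frame.image)                 # load first/second image information

time_share(
    [Frame("first image information", +2.0), Frame("second image information", -2.0)],
    set_backlight_angle=lambda a: print(f"backlight angle {a:+.1f} deg"),
    modulate=lambda img: print(f"SLM modulates: {img}"),
)
```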
  • the image generation component can also be a display that does not require a backlight, such as an organic light-emitting diode (OLED) display or a micro-LED display.
  • FIG10 is a second structural schematic diagram of an image generating assembly provided in an embodiment of the present application.
  • the image generating assembly 101 includes a backlight assembly 901 and a spatial light modulator 902.
  • the backlight assembly 901 moves between position A and position B.
  • the backlight assembly 901 is used to output a first light beam at a first position (position A) and output a second light beam at a second position (position B).
  • moving the backlight assembly 901 may mean moving only some of the devices in the backlight assembly 901.
  • the backlight assembly 901 includes a light source device and a non-fixed element.
  • the non-fixed element may be a lens, a reflector, a prism, or a Fresnel mirror, etc.
  • the backlight assembly 901 may output two light beams in a time-sharing manner by moving the non-fixed element.
  • the backlight assembly may output the first light beam and the second light beam through one light source, thereby reducing the cost of the backlight assembly.
  • FIG11 is a third structural schematic diagram of the image generation component provided in an embodiment of the present application.
  • the image generation component 101 also includes a lens 1101 and a diffuser screen 1102.
  • the spatial light modulator 902 is used to output two paths of imaging light to the lens 1101 at different angles.
  • the lens 1101 is used to change the transmission direction of the two paths of imaging light and transmit the two paths of transmission light to the diffuser screen 1102.
  • the diffuser screen 1102 is used to diffuse the two paths of imaging light and output the diffused two paths of imaging light at different angles.
  • the diffused two paths of imaging light can be irradiated to different viewpoints through optical elements.
  • the embodiment of the present application can project different image information to different distances from the user in a time-sharing manner.
  • the first imaging light at the first moment carries the first image information.
  • the second imaging light at the first moment carries the second image information.
  • the first imaging light at the second moment carries the third image information.
  • the second imaging light at the second moment carries the fourth image information.
  • the projection system can project different image information to different distances from the user in another time-sharing manner.
  • the first imaging light at the first moment carries the first image information and the third image information.
  • the second imaging light at the second moment carries the second image information and the fourth image information.
  • the focal length of the optical element 102 is f.
  • the distance between the image plane of the image generating component 101 (the display surface of the image) and the optical element 102 is d.
  • the image plane of the image generating component 101 can be a pixel component or a diffusion screen.
  • d can be the farthest vertical distance between the optical element 102 and the image plane of the image generating component 101.
  • d can be the straight-line distance between the central pixel of the image plane of the image generating component 101 and the target point on the optical element 102.
  • the central pixel is one or more pixels at the center position of the image plane.
  • the imaging light output by the central pixel irradiates the target point on the optical element 102.
  • the distance between the virtual image formed by the two imaging lights after reflection or transmission and the optical element is d0, that is, the distance between the virtual image plane 103 and the optical element 102 is d0.
  • d0, d and f satisfy the following formula: 1/d0 = 1/d - 1/f, that is, d0 = f·d/(f - d).
  • d may be smaller than f.
  • the optical element 102 can magnify the image. Therefore, even when the distance between the user and the optical element 102 is relatively small, the user can see an enlarged virtual image, thereby improving the user experience.
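  • with the formula above, d0 and the resulting magnification can be checked numerically (example values are our assumptions; the magnification ratio d0/d is the standard imaging result, stated here as our reading):

```python
def virtual_image_distance(d: float, f: float) -> float:
    """Distance d0 of the virtual image from the optical element when the image
    plane sits at distance d < f from an element of focal length f:
    1/d0 = 1/d - 1/f, i.e. d0 = f*d / (f - d)."""
    if d >= f:
        raise ValueError("a magnified virtual image requires d < f")
    return f * d / (f - d)

f = 0.8   # assumed focal length of the optical element (meters)
d = 0.6   # assumed image-plane-to-element distance (meters), d < f
d0 = virtual_image_distance(d, f)
print(f"d0 = {d0:.2f} m, magnification d0/d = {d0 / d:.1f}x")  # 2.40 m, 4.0x
```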
  • the optical element 102 reflects or transmits two imaging lights through the reflection or transmission area.
  • the target point may refer to any point in the reflection or transmission area.
  • the optical element 102 reflects or transmits two imaging lights through two reflection or transmission areas.
  • the two imaging lights correspond to the reflection or transmission areas one by one.
  • the target point may refer to the center point between the two reflection or transmission areas.
  • FIG. 12 is a circuit diagram of a projection system provided in an embodiment of the present application.
  • the circuit in the projection system mainly includes a processor 1001, an internal memory 1002, an external memory interface 1003, an audio module 1004, a video module 1005, a power module 1006, a wireless communication module 1007, an I/O interface 1008, a video interface 1009, a Controller Area Network (CAN) transceiver 1010, a display circuit 1028, and a display panel 1029.
  • the processor 1001 and its peripheral components, such as the internal memory 1002, the CAN transceiver 1010, the audio module 1004, the video module 1005, the power module 1006, the wireless communication module 1007, the I/O interface 1008, the video interface 1009, the touch unit 1010, and the display circuit 1028 can be connected through a bus.
  • the processor 1001 can be called a front-end processor.
  • circuit diagrams shown in the embodiments of the present application do not constitute a specific limitation on the projection system.
  • the projection system may include more or fewer components than shown in the figure, or combine certain components, or split certain components, or arrange the components differently.
  • the components shown in the figure may be implemented in hardware, software, or a combination of software and hardware.
  • the processor 1001 includes one or more processing units, for example, the processor 1001 may include an application processor (AP), a modem processor, a graphics processor (GPU), an image signal processor (ISP), a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural network processor (NPU), etc.
  • a memory may also be provided in the processor 1001 for storing instructions and data.
  • the operating system of the projection system, the AR Creator software package, etc. may be stored.
  • the memory in the processor 1001 is a cache memory.
  • the memory can store instructions or data that the processor 1001 has just used or uses cyclically. If the processor 1001 needs to use the instructions or data again, they can be called directly from this memory, which avoids repeated access, reduces the waiting time of the processor 1001, and improves the efficiency of the system.
  • the functions of the processor 1001 can be implemented by a domain processor on the vehicle.
  • the projection system may further include a plurality of input/output (I/O) interfaces 1008 connected to the processor 1001.
  • the interface 1008 may include, but is not limited to, an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a universal serial bus (USB) interface, etc.
  • the I/O interface 1008 may be connected to devices such as a mouse, touch screen, keyboard, camera, speaker, microphone, etc., and may also be connected to physical buttons on the projection system (such as volume buttons, brightness adjustment buttons, power buttons, etc.).
  • the internal memory 1002 can be used to store computer executable program codes, which include instructions.
  • the internal memory 1002 may include a program storage area and a data storage area.
  • the program storage area may store an operating system, an application required for at least one function (such as a call function, a time setting function, an AR function, etc.), etc.
  • the data storage area may store data created during the use of the projection system (such as a phone book, world time, etc.), etc.
  • the internal memory 1002 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one disk storage device, a flash memory device, a universal flash storage (Universal Flash Storage, UFS), etc.
  • the processor 1001 executes various functional applications and data processing of the projection system by running instructions stored in the internal memory 1002 and/or instructions stored in a memory provided in the processor 1001.
  • the external memory interface 1003 can be used to connect an external memory (such as a Micro SD card).
  • the external memory can store data or program instructions as needed, and the processor 1001 can perform operations such as reading and writing these data or programs through the external memory interface 1003.
  • the audio module 1004 is used to convert digital audio information into analog audio signal output, and is also used to convert analog audio input into digital audio signals.
  • the audio module 1004 can also be used to encode and decode audio signals, such as playing or recording.
  • the audio module 1004 can be arranged in the processor 1001, or some functional modules of the audio module 1004 can be arranged in the processor 1001.
  • the projection system can realize audio functions through the audio module 1004 and the application processor.
  • the video interface 1009 can receive external audio and video input, which can be a high-definition multimedia interface (HDMI), a digital visual interface (DVI), a video graphics array (VGA), a display port (DP), a low voltage differential signal (LVDS) interface, etc.
  • the video interface 1009 can also output video to the outside.
  • the projection system receives the audio and video input through the video interface.
  • the video module 1005 can decode the video input by the video interface 1009, for example, by performing H.264 decoding.
  • the video module can also encode the video collected by the projection system, for example, by performing H.264 encoding on the video collected by the external camera.
  • the processor 1001 can also decode the video input by the video interface 1009, and then output the decoded image signal to the display circuit.
  • the stereoscopic projection system further includes a CAN transceiver 1010, which can be connected to the CAN bus (CAN BUS) of the car.
  • the stereoscopic projection system can communicate with the in-vehicle entertainment system (music, radio, video module), the vehicle status system, etc.
  • the user can turn on the in-vehicle music playback function by operating the projection system.
  • the vehicle status system can send vehicle status information (doors, seat belts, etc.) to the stereoscopic projection system for display.
  • the display circuit 1028 and the display panel 1029 jointly realize the function of displaying an image.
  • the display circuit 1028 receives the image signal output by the processor 1001, processes the image signal and then inputs it into the display panel 1029 for imaging.
  • the display circuit 1028 can also control the image displayed by the display panel 1029. For example, it controls parameters such as display brightness or contrast.
  • the display circuit 1028 may include a driving circuit, an image control circuit, etc.
  • the above-mentioned display circuit 1028 and the display panel 1029 may be located in the pixel component 502.
  • the display panel 1029 is used to modulate the light beam input by the light source according to the input image signal, so as to generate a visible image.
  • the display panel 1029 can be a liquid crystal on silicon panel, a liquid crystal display panel or a digital micromirror device.
  • the video interface 1009 can receive input video data (or called a video source), and the video module 1005 decodes and/or digitally processes the data and outputs an image signal to the display circuit 1028.
  • the display circuit 1028 drives the display panel 1029 to image the light beam emitted by the light source according to the input image signal, thereby generating a visible image (emitting imaging light).
  • the power module 1006 is used to provide power to the processor 1001 and the light source according to the input power (e.g., direct current), and the power module 1006 may include a rechargeable battery, which can provide power to the processor 1001 and the light source.
  • the light emitted by the light source can be transmitted to the display panel 1029 for imaging, thereby forming an image light signal (imaging light).
  • the power module 1006 can be connected to a power module of a car (eg, a power battery), and the power module of the car supplies power to the power module 1006 of the projection system.
  • the wireless communication module 1007 enables the projection system to communicate wirelessly with the outside world, and can provide wireless local area networks (WLAN) (such as wireless fidelity (Wi-Fi) networks), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR) and other wireless communication solutions.
  • the wireless communication module 1007 can be one or more devices integrating at least one communication processing module.
  • the wireless communication module 1007 receives electromagnetic waves via an antenna, modulates the frequency of the electromagnetic wave signal and performs filtering, and sends the processed signal to the processor 1001.
  • the wireless communication module 1007 can also receive the signal to be sent from the processor 1001, modulate the frequency of the signal, amplify it, and convert it into electromagnetic waves for radiation through the antenna.
  • the video data decoded by the video module 1005 can also be wirelessly received through the wireless communication module 1007 or read from the internal memory 1002 or the external memory.
  • the projection system can receive video data from the terminal device or the in-vehicle entertainment system through the wireless LAN in the car, and the projection system can also read the audio and video data stored in the internal memory 1002 or the external memory.
  • the embodiment of the present application also provides a vehicle, which is equipped with any of the aforementioned stereoscopic projection systems.
  • the projection system is used to output two imaging lights.
  • the two imaging lights carry different image information.
  • the two output imaging lights are irradiated to the receiving surface through the windshield to form a virtual image.
  • the virtual image is located on one side of the windshield, and the driver or passenger is located on the other side of the windshield.
  • the two imaging lights after reflection or transmission are respectively irradiated to the eyes of the driver or the passenger.
  • the first imaging light is irradiated to the left eye of the passenger.
  • the second imaging light is irradiated to the right eye of the passenger.
  • FIG. 13 is a schematic diagram of the projection system installed in a vehicle provided in the embodiment of the present application.
  • the windshield of the vehicle can be used as an optical element in the projection system.
  • the image generation component 101 in the projection system is located on the same side of the windshield.
  • the image generation component 101 is used to output two imaging lights.
  • the two imaging lights carry different image information.
  • the windshield is used to reflect or transmit the two imaging lights to form a virtual image.
  • the virtual image is located on one side of the windshield, and the driver or passenger is located on the other side of the windshield.
  • the two imaging lights after reflection or transmission are respectively irradiated to the eyes of the driver or the passenger.
  • the first imaging light is irradiated to the left eye of the passenger.
  • the second imaging light is irradiated to the right eye of the passenger.
  • the vehicle can be a car, truck, motorcycle, bus, ship, airplane, helicopter, lawn mower, recreational vehicle, amusement park vehicle, construction equipment, tram, golf cart, train, and cart, etc., which are not particularly limited in the embodiments of the present application.
  • the projection system can be installed on the instrument panel (IP) of the vehicle, located at the co-pilot position or the main driver position, or it can be installed on the back of the seat.
  • HUD can be used to display navigation information, vehicle speed, power/fuel level, etc.
  • FIG. 14 is a schematic diagram of a possible functional framework of a vehicle provided in an embodiment of the present application.
  • the functional framework of the vehicle may include various subsystems, such as the control system 14, the sensor system 12, one or more peripheral devices 16 (one is shown as an example), the power supply 18, the computer system 20, and the display system 32.
  • the vehicle may also include other functional systems, such as an engine system that provides power for the vehicle, etc., which is not limited in this application.
  • the sensor system 12 may include several detection devices, which can sense the measured information and convert the sensed information into electrical signals or other required forms of information output according to certain rules.
  • these detection devices may include a global positioning system (GPS), a vehicle speed sensor, an inertial measurement unit (IMU), a radar unit, a laser rangefinder, a camera device, a wheel speed sensor, a steering sensor, a gear position sensor, or other components for automatic detection, etc., and this application does not limit them.
  • the control system 14 may include several components, such as the steering unit, brake unit, lighting system, automatic driving system, map navigation system, network timing system and obstacle avoidance system shown in the figure.
  • the control system 14 may also include components such as a throttle processor and an engine processor for controlling the vehicle's speed, which are not limited in this application.
  • the peripheral device 16 may include several components, such as the communication system, touch screen, user interface, microphone, and speaker shown in the figure.
  • the communication system is used to realize network communication between the vehicle and other devices other than the vehicle.
  • the communication system may use wireless communication technology or wired communication technology to realize network communication between the vehicle and other devices.
  • the wired communication technology may refer to communication between the vehicle and other devices through network cables or optical fibers.
  • the power source 18 represents a system that provides power or energy for the vehicle, which may include but is not limited to a rechargeable lithium battery or a lead-acid battery, etc. In practical applications, one or more battery components in the power source are used to provide power or energy for starting the vehicle, and the type and material of the power source are not limited in this application.
  • the computer system 20 may include one or more processors 2001 (one processor is shown as an example) and a memory 2002 (also referred to as a storage device).
  • the memory 2002 may be inside the computer system 20, or may be outside the computer system 20, for example as a cache in the vehicle, which is not limited in this application.
  • Processor 2001 may include one or more general-purpose processors, such as a graphics processing unit (GPU). Processor 2001 may be used to run the relevant programs or instructions corresponding to the programs stored in memory 2002 to implement the corresponding functions of the vehicle.
  • the memory 2002 may include a volatile memory, such as a random access memory (RAM); it may also include a non-volatile memory, such as a read-only memory (ROM), a flash memory, or a solid state drive (SSD); the memory 2002 may also include a combination of the above types of memories.
  • the memory 2002 may be used to store a set of program codes or instructions corresponding to the program codes, so that the processor 2001 calls the program codes or instructions stored in the memory 2002 to implement the corresponding functions of the vehicle.
  • these functions include but are not limited to some or all of the functions in the vehicle functional framework diagram shown in FIG. 14.
  • a set of program codes for vehicle control may be stored in the memory 2002, and the processor 2001 calls the program codes to control safe driving of the vehicle; how safe driving is achieved is described in detail later in the present application.
  • the memory 2002 may also store information such as road maps, driving routes, sensor data, etc.
  • the computer system 20 may be combined with other elements in the vehicle functional framework diagram, such as sensors in the sensor system, GPS, etc., to implement relevant functions of the vehicle.
  • the computer system 20 may control the driving direction or driving speed of the vehicle based on the data input from the sensor system 12, which is not limited in this application.
  • the display system 32 may include several components, such as a processor, an optical element, and the stereoscopic projection system 100 described above.
  • the processor is used to generate images according to user instructions (for example, images containing vehicle status information such as vehicle speed and power/fuel level, and images of augmented reality (AR) content), and send the image content to the stereoscopic projection system 100.
  • the stereoscopic projection system 100 is used to output two paths of imaging light carrying different image information.
  • the windshield serves as the optical element; it reflects or transmits the two paths of imaging light so that a virtual image corresponding to the image content is presented in front of the driver or passenger.
  • the functions of some components in the display system 32 can also be implemented by other subsystems of the vehicle.
  • the processor can also be a component in the control system 14.
  • the four subsystems shown in FIG. 14 of the present application, namely the sensor system 12, the control system 14, the computer system 20 and the display system 32, are only examples and do not constitute a limitation.
  • a vehicle may combine several of its components according to different functions to obtain subsystems with correspondingly different functions.
  • vehicles can include more or fewer systems or components, and this application does not limit them.
  • FIG. 15 is a flow chart of a projection method provided in an embodiment of the present application.
  • the projection method can be applied to a projection system or a vehicle equipped with a projection system.
  • below, the projection method is described by taking its application to the projection system as an example.
  • the projection method includes the following steps.
  • the projection system outputs two paths of imaging light.
  • the two paths of imaging light include a first imaging light and a second imaging light.
  • the first imaging light carries first image information
  • the second imaging light carries second image information.
  • the projection system includes an image generating component 101 and an optical element 102.
  • the projection system outputs two paths of imaging light through the image generating component 101.
  • the image generating component 101 includes a backlight component 901 and a spatial light modulator 902.
  • the backlight component 901 is used to output two light beams to the spatial light modulator 902 at different angles in a time-sharing manner.
  • the spatial light modulator 902 is used to modulate the two light beams in a time-sharing manner according to different image information to obtain two paths of imaging light.
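As an illustration of the time-shared drive just described, the following sketch alternates the backlight exit angle and the image loaded into the spatial light modulator within one frame. It is a minimal sketch, not the claimed implementation: the class names, method names, and angle values are all assumptions introduced here.

```python
# Illustrative sketch (not the claimed implementation) of the
# time-shared drive: the backlight alternates between two exit
# angles, and the spatial light modulator is loaded with the image
# information matching each time slot. Names and values are assumed.

class Backlight:
    def set_angle(self, angle_deg: float) -> None:
        print(f"backlight beam steered to {angle_deg:+.1f} deg")

class SpatialLightModulator:
    def modulate(self, image_info: str) -> None:
        print(f"SLM modulating beam with: {image_info}")

def drive_one_frame(backlight: Backlight, slm: SpatialLightModulator) -> None:
    # Slot 1: the beam aimed at one eye position carries the
    # first image information (first imaging light).
    backlight.set_angle(-3.0)
    slm.modulate("first image information")
    # Slot 2: the beam aimed at the other eye position carries the
    # second image information (second imaging light).
    backlight.set_angle(+3.0)
    slm.modulate("second image information")

drive_one_frame(Backlight(), SpatialLightModulator())
```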
  • the projection system reflects or transmits the two paths of imaging light.
  • the two paths of imaging light after reflection or transmission generate a virtual image on the virtual image plane.
  • the two paths of imaging light are respectively irradiated to two positions on the receiving surface.
  • the distance between the two positions is m.
  • the distance between the virtual image plane and the receiving surface is d1.
  • the distance between the receiving surface and the first visual plane is d2.
  • d2 is determined based on the first parallax information between the first image information and the second image information, m, and d1. d1 and d2 meet the first condition.
  • the two paths of imaging light after reflection or transmission are respectively irradiated to different positions on the receiving surface 105, such as the left eye (the upper eye position in the figure) and the right eye (the lower eye position in the figure) of the user.
  • the two paths of imaging light after reflection or transmission generate a virtual image on the virtual image plane 103.
  • the virtual image plane 103 is located on one side of the optical element 102, and the receiving surface 105 is located on the other side of the optical element 102.
  • the first parallax information causes the two pieces of image information viewed by the user to produce a first parallax angle.
  • the image information observed by the user is located on the visual plane 104.
  • the distance between the visual plane 104 and the receiving surface 105 is d2.
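The dependence of d2 on the parallax can be made explicit with a standard stereoscopic triangulation. The relation below is not recited in this embodiment; it is a common geometric model consistent with it. Here p denotes the separation, on the virtual image plane, between the two points that the first and second image information assign to the same scene point, with crossed parallax taken as positive:

```latex
% Similar triangles with apex at the convergence point:
%   m / d_2 = p / (d_1 - d_2)
% which rearranges to
d_2 = \frac{m\,d_1}{m + p}, \qquad p > 0 \;\Rightarrow\; d_2 < d_1 .
```

Under this model, increasing the crossed parallax pulls the visual plane 104 closer to the receiving surface 105, which is how adjusting the parallax information between the two pieces of image information adjusts the visual distance d2.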
  • d1 and d2 meet the first condition, and the first condition can be any one or more of the following conditions.
  • d2 is smaller than d1.
  • the projection system may form a ghost image.
  • the ghost image will affect the clarity of the HUD display and driving safety.
  • when d2 is smaller than d1, the impact of the ghost image can be reduced, thereby improving the clarity of the HUD display and driving safety.
  • the VAC satisfies VAC = |1/d2 - 1/d1| and is less than 0.25 diopters.
  • the magnitude of the VAC is negatively correlated with the user experience; by controlling the VAC, the user experience can be improved.
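The relation VAC = |1/d2 - 1/d1| is given in the abstract, with d1 and d2 in metres and the result in diopters. A minimal numeric check of the 0.25 diopter limit follows; the sample distances are assumed for illustration only.

```python
# VAC = |1/d2 - 1/d1|; distances in metres, result in diopters.
# The relation comes from the abstract; the distances below are
# assumed for illustration only.

def vac_diopters(d1_m: float, d2_m: float) -> float:
    return abs(1.0 / d2_m - 1.0 / d1_m)

d1, d2 = 7.5, 5.0  # virtual image plane at 7.5 m, visual plane at 5 m
vac = vac_diopters(d1, d2)
print(f"VAC = {vac:.3f} D, within the 0.25 D limit: {vac < 0.25}")
# prints: VAC = 0.067 D, within the 0.25 D limit: True
```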
  • the projection system outputs two light beams in a time-sharing manner, and obtains two imaging lights by modulating the two light beams in a time-sharing manner.
  • the projection system outputs two imaging lights at a first moment.
  • the first imaging light in the two imaging lights carries the first image information.
  • the second imaging light in the two imaging lights carries the second image information.
  • the first image information and the second image information carry the first parallax information.
  • the projection method further includes the following step: the projection system outputs two paths of imaging light at a second moment.
  • the first imaging light in the two imaging lights carries the third image information.
  • the second imaging light in the two imaging lights carries the fourth image information.
  • the third image information and the fourth image information carry the second parallax information.
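Combining the two-moment output with the triangulation sketched earlier: if the second parallax information differs from the first, the same projection system places content on two different visual planes at alternating moments. The short sketch below uses assumed values for m, d1, and the two parallaxes.

```python
# Two parallax values rendered at successive moments place the
# first/second and third/fourth image information on two different
# visual planes. The formula follows the triangulation sketched
# above; every numeric value here is assumed for illustration.

m = 0.065   # separation of the two positions on the receiving surface, metres
d1 = 7.5    # receiving surface to virtual image plane, metres

def visual_distance(p: float) -> float:
    # d2 = m * d1 / (m + p); crossed parallax p > 0 gives d2 < d1.
    return m * d1 / (m + p)

p_first, p_second = 0.06, 0.01   # first / second parallax information
print(f"first moment:  d2 = {visual_distance(p_first):.2f} m")
print(f"second moment: d2 = {visual_distance(p_second):.2f} m")
# prints 3.90 m and 6.50 m: two visual distances from one system.
```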


Abstract

A projection system, applied to the field of display. The projection system comprises an image generation component (101) and an optical element (102). The image generation component (101) is used to output two paths of imaging light. The two paths of imaging light comprise a first imaging light and a second imaging light. The first imaging light carries first image information. The second imaging light carries second image information. The optical element (102) is used to reflect or transmit the two paths of imaging light. The two paths of imaging light generate a virtual image on a virtual image plane (103). The two paths of imaging light are irradiated to two positions on a receiving surface (105). The distance between the two positions is m. The distance between the virtual image plane (103) and the receiving surface (105) is d1. The distance between the receiving surface (105) and a first visual plane is d2. The vergence-accommodation conflict (VAC) satisfies the following relation: VAC = |1/d2 - 1/d1|. The VAC is less than 0.25 diopters. By adjusting the parallax information of the two pieces of image information, the visual distance d2 can be adjusted, so that the number of projection systems can be reduced and the cost of a head-up display (HUD) can be reduced.
PCT/CN2023/107815 2022-11-11 2023-07-18 Projection system, projection method and vehicle WO2024098828A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202211414939.3 2022-11-11
CN202211414939.3A CN118033971A (zh) 2022-11-11 Projection system, projection method and vehicle

Publications (1)

Publication Number Publication Date
WO2024098828A1 (fr)

Family

ID=90988068

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2023/107815 WO2024098828A1 (fr) 2022-11-11 2023-07-18 Projection system, projection method and vehicle

Country Status (2)

Country Link
CN (1) CN118033971A (fr)
WO (1) WO2024098828A1 (fr)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20180024355A1 (en) * 2016-07-19 2018-01-25 The Board Of Trustees Of The University Of Illinoi Method and system for near-eye three dimensional display
KR20200022890A (ko) * 2018-08-24 2020-03-04 (주) 태진금속 수렴 및 조절 불일치 문제가 없는 입체 영상 표시 장치
CN112130325A (zh) * 2020-09-25 2020-12-25 东风汽车有限公司 车载抬头显示器视差矫正系统、方法、存储介质和电子设备
CN114137725A (zh) * 2020-09-04 2022-03-04 未来(北京)黑科技有限公司 一种可显示三维图像的抬头显示系统
CN114185171A (zh) * 2020-09-14 2022-03-15 未来(北京)黑科技有限公司 一种成像距离可变的抬头显示装置及抬头显示系统
CN114787690A (zh) * 2019-12-10 2022-07-22 奇跃公司 用于混合现实显示的增加的景深


Also Published As

Publication number Publication date
CN118033971A (zh) 2024-05-14

Similar Documents

Publication Publication Date Title
CN112639581B (zh) Head-up display and head-up display method
WO2024021852A1 (fr) Stereoscopic display apparatus, stereoscopic display system, and vehicle
WO2024021574A1 (fr) 3D projection system, projection system, and vehicle
WO2024017038A1 (fr) Image generation apparatus, display device, and vehicle
WO2024098828A1 (fr) Projection system, projection method, and vehicle
CN115639673B (zh) Display device and display method
CN217360538U (zh) Projection system, display device, and vehicle
WO2023216670A1 (fr) Three-dimensional display apparatus and vehicle
WO2023185293A1 (fr) Image generation apparatus, display device, and vehicle
WO2023130759A1 (fr) Display device and vehicle
CN115542644B (zh) Projection apparatus, display device, and vehicle
WO2024021563A1 (fr) Display device and vehicle
WO2023040669A1 (fr) Head-up display device and vehicle
WO2023138138A1 (fr) Display device and vehicle
WO2024065332A1 (fr) Display module, optical display system, terminal device, and image display method
WO2023103492A1 (fr) Image generation apparatus, display device, and vehicle
WO2023098228A1 (fr) Display apparatus, electronic device, and vehicle
CN220983636U (zh) Display device, vehicle, and in-vehicle system
WO2023138076A1 (fr) Display apparatus and vehicle
WO2023087739A1 (fr) Projection apparatus, display device, and vehicle
WO2023071548A1 (fr) Optical display apparatus, display system, vehicle, and color adjustment method
WO2024041034A1 (fr) Display module, optical display system, terminal device, and imaging method
CN117991569A (zh) Projection apparatus, display device, and vehicle
CN115826332A (zh) Image generation apparatus, related device, and image projection method