WO2024098828A1 - Projection system, projection method, and transportation means

Publication number
WO2024098828A1
Authority
WO
WIPO (PCT)
Application number
PCT/CN2023/107815
Other languages
French (fr)
Chinese (zh)
Inventor
王金蕾
李肖
陈宇宸
Original Assignee
华为技术有限公司
Application filed by 华为技术有限公司
Publication of WO2024098828A1

Classifications

    • G: PHYSICS
    • G02: OPTICS
    • G02B: OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00: Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01: Head-up displays
    • G: PHYSICS
    • G03: PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B: APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B21/00: Projectors or projection-type viewers; Accessories therefor
    • G03B21/14: Details

Definitions

  • the present application relates to the field of display, and in particular to a projection system, a projection method and a vehicle.
  • A head-up display (HUD), also known as a head-up display system, is a device that projects information such as vehicle speed and navigation in front of the driver, so that the driver can see the instrument information without lowering his or her head.
  • in a HUD, in order to improve the visual experience, different image information can be projected to different distances from the driver. For example, instrument-related information is projected 2.5 meters (m) away from the driver, and augmented reality (AR) information is projected 10 m away from the driver. To achieve this, the HUD includes two projection systems: one projection system is used to project the instrument-related information, and the other projection system is used to project the AR information. The cost of the projection systems is high, which leads to a high cost of the HUD.
  • the present application provides a projection system that can adjust the visual distance by adjusting the parallax information between two pieces of image information, thereby reducing the number of projection systems and reducing the cost of the HUD, and that can improve the user experience by controlling the vergence-accommodation conflict (VAC).
  • the present application provides a projection system.
  • the projection system includes an image generating component and an optical element.
  • the image generating component is used to output two imaging lights.
  • the two imaging lights include a first imaging light and a second imaging light.
  • the first imaging light carries first image information.
  • the second imaging light carries second image information.
  • the optical element is used to reflect or transmit the two imaging lights, and the two imaging lights after reflection or transmission generate a virtual image on the virtual image plane.
  • the two imaging lights after reflection or transmission are irradiated to two positions of a receiving surface.
  • the two positions correspond to the two imaging lights one by one.
  • the distance between the two positions is m.
  • the distance between the virtual image plane and the receiving surface is d1.
  • the first parallax information will cause the two pieces of image information viewed by the user to generate a first parallax angle, so that the image information observed by the user is located on the first visual plane.
  • the distance between the receiving surface and the first visual plane is d2.
  • d2 is determined based on the first parallax information between the first image information and the second image information, m and d1.
  • the optical element is a windshield, which includes a first glass layer, a second glass layer, and an intermediate layer for bonding the first glass layer and the second glass layer.
  • the two imaging lights are linearly polarized lights.
  • the intermediate layer is used to absorb the linearly polarized light.
  • when the two imaging lights are incident on the optical element, they are reflected by the two glass layers in contact with the air on the inside and outside of the optical element, thereby forming two virtual images at the receiving position of the human eye.
  • the two virtual images form a ghost image due to their partial overlap. Ghosting can seriously affect the clarity of the HUD display and driving safety.
  • the impact of ghosting can be reduced, thereby improving the clarity of the HUD display and driving safety.
  • the middle layer is a wedge-shaped structure.
  • the wedge-shaped structure can reduce the impact of ghosting, thereby improving the clarity of the HUD display and driving safety.
  • the value range of d1 is between 2.5 m and 7 m.
  • the thickness of the intermediate layer at different positions is the same. Intermediate layers of the same thickness can reduce the cost of the optical element, thereby reducing the cost of the projection system.
  • d2 is less than d1.
  • the projection system will form a ghost image. When d2 is less than d1, the influence of the ghost image can be reduced, thereby improving the clarity of the HUD display and driving safety.
  • the value range of d1 is between 10m and 15m.
  • the impact of ghosting is relatively large.
  • if the value of d1 is too large and VAC is less than 0.25 diopters, the value of d2 is large, thereby affecting the user experience. Therefore, in the present application, by controlling the value of d1, the impact of ghosting can be reduced and the user experience can be improved.
  • the first imaging light also carries third image information.
  • the second imaging light also carries fourth image information.
  • the distance between the receiving surface and the second visual plane is d3.
  • d3 is determined based on the second parallax information between the third image information and the fourth image information, m and d1.
  • d3 is equal to d1. When d1 is equal to d3, the vergence distance coincides with the virtual image plane, so a VAC-free display can be achieved for content displayed at a close distance.
  • the focal length of the optical element is f.
  • the distance between the image generating component and the optical element is d.
  • d is smaller than f.
  • in this way, the image information carried by the two imaging lights can be magnified, thereby improving the user experience.
  • the distance between the virtual image plane and the optical element is d0.
  • d0 satisfies the following formula:
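  • the formula itself is not reproduced in this text. A plausible reconstruction, assuming the standard Gaussian imaging relation for an optical element of focal length f with the image source at a distance d < f (so that a magnified virtual image is formed), is $\frac{1}{d}-\frac{1}{d_0}=\frac{1}{f}$, that is, $d_0=\frac{d\,f}{f-d}$.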
  • the image generating component includes a backlight component and a spatial light modulator.
  • the backlight component is used to output two light beams to the spatial light modulator at different angles in a time-sharing manner.
  • the spatial light modulator is used to modulate the two light beams in a time-sharing manner according to different image information to obtain two imaging lights, and output the two imaging lights at different angles.
  • the image generating component includes a backlight component, a spatial light modulator and a spectroscopic element.
  • the backlight component is used to generate a light beam to be modulated.
  • the spectroscopic element is used to split the light beam to be modulated to obtain two sub-beams to be modulated.
  • the spatial light modulator is used to modulate the two sub-beams to be modulated and output two imaging lights.
  • the spectroscopic element can reduce the refresh rate of the spatial light modulator, thereby improving the reliability of the image generating component.
  • the spectroscopic element can be a cylindrical grating, a liquid crystal grating, a barrier grating, an electronic grating, a diffraction element, etc.
  • the image generating component further includes a diffusion screen.
  • the diffusion screen is used to receive two paths of imaging light from the spatial light modulator, diffuse the two paths of imaging light, and output the diffused two paths of imaging light at different angles.
  • the two light beams include a first light beam and a second light beam.
  • the backlight assembly is used to output the first light beam at a first position and output the second light beam at a second position. By moving the backlight assembly, the backlight assembly can output the first light beam and the second light beam through one light source, thereby reducing the cost of the backlight assembly.
  • the projection system further includes a human eye tracking module and a processor.
  • the human eye tracking module is used to obtain the position of the receiving surface.
  • the processor is used to adjust the first parallax information of the first image information and the second image information according to the position of the receiving surface.
  • the second aspect of the present application provides a projection method.
  • the projection method can be applied to a projection system.
  • the projection method includes the following steps: the projection system outputs two imaging lights.
  • the two imaging lights include a first imaging light and a second imaging light.
  • the first imaging light carries the first image information.
  • the second imaging light carries the second image information;
  • the projection system reflects or transmits the two imaging lights, and the two imaging lights after reflection or transmission generate a virtual image on the virtual image plane.
  • the two imaging lights after reflection or transmission are irradiated to two positions of the receiving surface.
  • the distance between the two positions is m.
  • the distance between the virtual image plane and the receiving surface is d1, and the distance between the receiving surface and the first visual plane is d2.
  • d2 is determined based on the first parallax information between the first image information and the second image information, m and d1.
  • VAC satisfies the following relationship, where VAC is less than 0.25 diopters:
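  • the relationship itself does not survive in this text. Assuming the conventional definition of the vergence-accommodation conflict as the absolute difference between the reciprocal of the accommodation distance (the distance d1 of the virtual image plane) and the reciprocal of the vergence distance (the visual distance d2), a plausible reconstruction is $\mathrm{VAC}=\left|\frac{1}{d_1}-\frac{1}{d_2}\right|<0.25$ diopters.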
  • the third aspect of the present application provides a vehicle, wherein the vehicle comprises a projection system as described in the first aspect or any optional manner of the first aspect, and the projection system is installed on the vehicle.
  • FIG1 is a first optical path schematic diagram of a projection system provided in an embodiment of the present application.
  • FIG2 is a first structural schematic diagram of a windshield provided in an embodiment of the present application.
  • FIG3 is a second structural schematic diagram of a windshield provided in an embodiment of the present application.
  • FIG4 is a third structural schematic diagram of a windshield provided in an embodiment of the present application.
  • FIG5 is a second optical path schematic diagram of the projection system provided in an embodiment of the present application.
  • FIG6 is a first structural diagram of a VAC provided in an embodiment of the present application.
  • FIG7 is a second structural diagram of a VAC provided in an embodiment of the present application.
  • FIG8 is a third optical path schematic diagram of the projection system provided in an embodiment of the present application.
  • FIG9 is a first structural diagram of an image generation component provided in an embodiment of the present application.
  • FIG10 is a second structural diagram of an image generation component provided in an embodiment of the present application.
  • FIG11 is a third structural schematic diagram of the image generation component provided in an embodiment of the present application.
  • FIG12 is a circuit diagram of a projection system provided in an embodiment of the present application.
  • FIG13 is a schematic diagram of a projection system installed in a vehicle according to an embodiment of the present application.
  • FIG14 is a schematic diagram of a possible functional framework of a vehicle provided in an embodiment of the present application.
  • FIG. 15 is a flow chart of the projection method provided in an embodiment of the present application.
  • the present application provides a projection system that can adjust the visual distance by adjusting the parallax information between two pieces of image information, thereby reducing the number of projection systems and reducing the cost of the HUD; in addition, by controlling the vergence-accommodation conflict (VAC), the user experience can be improved.
  • the "first”, “second”, “target”, etc. used in this application are only used for the purpose of distinguishing the description, and cannot be understood as indicating or implying relative importance, nor can they be understood as indicating or implying order.
  • reference numbers and/or letters are repeated in multiple figures of the present application. Repetition does not indicate a strict limiting relationship between various embodiments and/or configurations.
  • the projection system in this application is applied to the display field.
  • in order to improve the visual experience, the head-up display (HUD) can project different image information to different distances from the driver.
  • the HUD includes two projection systems. One projection system is used to project instrument-related information.
  • the other projection system is used to project augmented reality (AR) information. Therefore, the cost of HUD is relatively high.
  • FIG. 1 is a first optical path schematic diagram of a projection system provided in an embodiment of the present application.
  • the projection system includes an image generation component 101 and an optical element 102.
  • the image generation component 101 is used to output two imaging lights.
  • the optical element 102 may be a reflector, a windshield, a lens or a diffraction element, etc.
  • the optical element 102 is used to reflect or transmit two imaging lights, and there is an angle between the two imaging lights after reflection or transmission.
  • the two imaging lights after reflection or transmission are respectively irradiated to two different positions of the receiving surface 105, such as the left eye (upper eye position) and the right eye (lower eye position) of the user.
  • the distance between the two positions is m.
  • the two positions correspond one to one to the two imaging lights.
  • the position of the human eye can also be called a viewpoint.
  • the above-mentioned projection system can provide multiple viewpoints for multiple people to watch.
  • the image generation component 101 can produce multiple groups of imaging lights for different people to watch. Among them, a group of imaging lights includes two imaging lights. This embodiment takes a viewpoint as an example, that is, the image generation component 101 generates two imaging lights as an example to illustrate the imaging process of the projection system.
  • the two-path imaging light includes a first path of imaging light and a second path of imaging light.
  • the first path of imaging light carries the first image information.
  • the second path of imaging light carries the second image information.
  • the two paths of imaging light after reflection or transmission form a virtual image on the virtual image plane 103.
  • P1 and P2 are both image points on the virtual image plane 103.
  • the light beams corresponding to P1 and P2 will be irradiated to different positions of the receiving surface 105.
  • the light beam corresponding to P1 belongs to the first path of imaging light.
  • the light beam corresponding to P2 belongs to the second path of imaging light.
  • the light beam corresponding to P1 irradiates the upper eye position of the user.
  • the light beam corresponding to P2 irradiates the lower eye position of the user.
  • the light beam corresponding to P1 will not irradiate the lower eye position of the user.
  • the light beam corresponding to P2 will not irradiate the upper eye position of the user.
  • the distance between the virtual image plane 103 and the receiving surface 105 is d1.
  • the first parallax information between the first image information and the second image information can be the distance between two pixel points in the pixel group on the virtual image plane (referred to as the first distance for short).
  • the first image information includes N first pixel points.
  • the second image information includes N second pixel points. N is an integer greater than 0.
  • the N first pixel points correspond one to one with the N second pixel points.
  • the first image information and the second image information include N pixel groups.
  • a pixel group includes a first pixel point and a second pixel point corresponding thereto. Two pixel points in a pixel group are used to display the same point of an object.
  • both pixel points are used to display the center of a circle.
  • both pixel points are used to display the tip of a person's nose.
  • both pixel points are used to display the vertex of the number "1".
  • the first distances of the N pixel groups are the same.
  • the first parallax information will cause the two pieces of image information viewed by the user to generate a first parallax angle.
  • the first parallax angle is related to d1 and m.
  • P1 and P2 are two pixel points in the pixel group.
  • the distance between P1 and P2 is the first distance dm.
  • P1, P2 and the two positions illuminated by the two imaging lights form an isosceles trapezoid.
  • the first parallax angle θ1 is obtained from dm, m and d1.
  • the image information observed by the user is located on the visual plane 104.
  • P1 and P2 observed by the user are P10 located on the visual plane 104.
  • the distance between the visual plane 104 and the receiving surface 105 is d2.
  • d2 is also called the visual distance.
  • d2, θ1 and m satisfy the following relationship:
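  • the relationship itself is not reproduced in this text. A plausible reconstruction, assuming the isosceles-trapezoid geometry described above (P1 and P2 separated by the first distance $d_m$ on the virtual image plane at distance $d_1$, and the two illuminated positions separated by $m$ on the receiving surface), is $\tan\frac{\theta_1}{2}=\frac{m\mp d_m}{2d_1}$ and $d_2=\frac{m}{2\tan(\theta_1/2)}=\frac{m\,d_1}{m\mp d_m}$, where the upper sign corresponds to uncrossed parallax ($d_2>d_1$) and the lower sign to crossed parallax ($d_2<d_1$); for small angles this reduces to $d_2\approx m/\theta_1$.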
  • the visual distance d2 can be adjusted by adjusting the parallax information between the two pieces of image information (the first image information and the second image information). Therefore, when the projection system in the present application is installed on a vehicle, the projection system can project different image information to different distances from the driver. For example, the instrument-related information is projected to 2.5 m away from the driver, and the AR information is projected to 10 m away from the driver. Therefore, the embodiment of the present application can reduce the number of projection systems, thereby reducing the cost of the HUD. In addition, by controlling the VAC, the user experience can be improved.
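  • as a minimal numerical sketch of this mechanism, assuming the trapezoid geometry reconstructed above (the helper name, the eye separation of 0.065 m and the example distances are illustrative assumptions, not values from this application):

```python
def disparity_for_visual_distance(d1: float, d2: float, m: float = 0.065) -> float:
    """Disparity dm (in metres, measured on the virtual image plane at distance d1)
    that places the fused image at visual distance d2, for eye separation m.
    Positive dm corresponds to uncrossed parallax (d2 > d1), negative to crossed (d2 < d1)."""
    return m * (1.0 - d1 / d2)

# Instrument content perceived at 2.5 m and AR content at 10 m, with the
# virtual image plane at d1 = 4.5 m (example values only):
dm_instrument = disparity_for_visual_distance(4.5, 2.5)   # negative: crossed parallax
dm_ar = disparity_for_visual_distance(4.5, 10.0)          # positive: uncrossed parallax
print(dm_instrument, dm_ar)
```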
  • the two imaging lights after reflection or transmission are respectively irradiated to two different positions of the receiving surface 105.
  • the two positions correspond to two different light spots.
  • m can be the center distance of the two light spots.
  • m can be the pupil distance of the eyes.
  • FIG. 2 is a first structural schematic diagram of a windshield provided in an embodiment of the present application.
  • the windshield includes a first glass layer 201, a second glass layer 203, and an intermediate layer 202.
  • the intermediate layer 202 is used to bond the first glass layer 201 and the second glass layer 203.
  • the thickness of the intermediate layer 202 at different positions in the target area is the same.
  • the target area refers to the area through which the two imaging lights pass. It should be understood that there may be processing errors in the thickness of the intermediate layer 202 at different positions. Therefore, the same thickness of the intermediate layer 202 at different positions means that the thickness deviation of the intermediate layer 202 at different positions is less than 1 mm.
  • when any one of the two imaging lights is incident on the windshield, it will be reflected by the two glass layers in contact with the air on the inside and outside of the windshield, thereby forming two virtual images at the receiving position of the human eye.
  • any one of the two imaging lights is incident on the second glass layer 203.
  • the imaging light reflected by the second glass layer 203 is reflected to the receiving position of the human eye.
  • the imaging light reflected by the second glass layer 203 forms a main image on the outside of the second glass layer 203.
  • the imaging light transmitted by the second glass layer 203 reaches the first glass layer 201 after passing through the middle layer.
  • the imaging light reflected by the first glass layer 201 is reflected to the receiving position of the human eye.
  • the imaging light reflected by the first glass layer 201 forms a virtual image on the outside of the first glass layer 201.
  • the virtual image and the main image are located at different positions.
  • the two virtual images form a ghost image because of their partial overlap. Ghosting will seriously affect the clarity of the HUD display and driving safety.
  • the present application can reduce the impact of ghosting in any one or more of the following ways.
  • FIG. 3 is a second structural schematic diagram of the windshield provided in an embodiment of the present application. As shown in FIG. 3 , based on FIG. 2 , any one of the two imaging lights is incident on the second glass layer 203. The imaging light reflected by the second glass layer 203 is reflected to the receiving position of the human eye. The imaging light reflected by the second glass layer 203 forms a virtual image on the outside of the second glass layer 203. The imaging light transmitted by the second glass layer 203 is absorbed by the intermediate layer 202.
  • FIG. 4 is a third structural schematic diagram of a windshield provided in an embodiment of the present application.
  • the windshield includes a first glass layer 201, a second glass layer 203 and a middle layer 202.
  • the middle layer 202 is used to bond the first glass layer 201 and the second glass layer 203.
  • the middle layer 202 is a wedge-shaped structure in the target area. Any one of the two imaging lights is incident on the second glass layer 203.
  • the imaging light reflected by the second glass layer 203 is reflected to the receiving position of the human eye.
  • the imaging light reflected by the second glass layer 203 forms a main image on the outside of the second glass layer 203.
  • the imaging light transmitted by the second glass layer 203 reaches the first glass layer 201 after passing through the middle layer.
  • the imaging light reflected by the first glass layer 201 is reflected to the receiving position of the human eye.
  • the imaging light reflected by the first glass layer 201 forms a virtual image on the outside of the first glass layer 201.
  • the virtual image and the main image are located at the same position.
  • FIG. 5 is a second optical path schematic diagram of the projection system provided in an embodiment of the present application.
  • the projection system includes an image generation component 101 and an optical element 102.
  • the image generation component 101 is used to output two imaging lights.
  • the optical element 102 is used to reflect or transmit the two imaging lights, and there is an angle between the two imaging lights after reflection or transmission.
  • the two imaging lights after reflection or transmission are respectively irradiated to two different positions of the receiving surface 105. The distance between the two positions is m.
  • the two imaging lights include a first imaging light and a second imaging light.
  • the first imaging light carries the first image information.
  • the second imaging light carries the second image information.
  • the two imaging lights after reflection or transmission form a virtual image on the virtual image plane 103.
  • P1 and P2 are both image points on the virtual image plane 103.
  • the light beams corresponding to P1 and P2 will be irradiated to different positions of the receiving surface 105.
  • the light beam corresponding to P1 belongs to the first imaging light.
  • the light beam corresponding to P2 belongs to the second imaging light.
  • the light beam corresponding to P1 is irradiated to the lower eye position of the user.
  • the light beam corresponding to P2 is irradiated to the upper eye position of the user.
  • the first parallax information causes the two pieces of image information viewed by the user to produce a first parallax angle.
  • the first parallax angle is θ2.
  • the image information observed by the user is located on the visual plane 104.
  • P1 and P2 observed by the user are P10 located on the visual plane 104.
  • VAC can be less than 0.25 diopters.
  • the value of VAC is related to d1 and d2.
  • the relationship between VAC and d1 and d2 is described below in an exemplary manner.
  • Figure 6 is a first structural diagram of VAC provided in an embodiment of the present application.
  • the ordinate is VAC, and the unit is diopter.
  • the abscissa is distance, and the unit is meter.
  • Figure 6 provides 12 curves.
  • the starting point of each curve lies on the abscissa.
  • the abscissa of the starting point represents d1.
  • the abscissa of the starting point of curve 601 is 4.5.
  • 4.5 is d1 of curve 601.
  • the abscissa of any point on curve 601 represents d2.
  • the abscissa of point 602 is 8.
  • the ordinate of point 602 is 0.1.
  • curve 601 and point 602 represent that when d1 is equal to 4.5 and d2 is equal to 8, VAC is equal to 0.1. It should be understood that for the description of other curves, reference can be made to the description of curve 601.
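  • as a numerical check on curve 601 and point 602, using the VAC relation reconstructed earlier: $\mathrm{VAC}=\left|\frac{1}{d_1}-\frac{1}{d_2}\right|=\left|\frac{1}{4.5}-\frac{1}{8}\right|\approx 0.1$ diopters, which matches the ordinate of point 602.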
  • when VAC is less than 0.25 diopters, the value of d2 is related to the value of d1. According to the description of FIG. 1 above, d2 is the visual distance, so the value of d2 affects the user experience. In the embodiment of the present application, in order to improve the user experience, the value range of d2 can be controlled by controlling the value of d1. For example, the value range of d1 is between 2.5 m and 7 m, wherein d1 can be 2.5 m or 7 m.
  • FIG7 is a second schematic diagram of the structure of VAC provided in an embodiment of the present application.
  • the ordinate is VAC, in diopters.
  • the abscissa is distance, in meters.
  • FIG7 provides six curves. The starting point of each curve lies on the abscissa.
  • the abscissas of the starting points of the six curves are 10, 10, 12, 12, 14 and 14, respectively. It should be understood that for the description of any of the six curves, reference may be made to the description of curve 601 in FIG6.
  • the value of d1 is between 10m and 15m. Among them, d1 can be 10m or 15m.
  • the value of d2 is related to d1.
  • the value of d2 is also related to the position of the receiving surface 105.
  • the projection system may further include an eye tracking module and a processor.
  • the eye tracking module is used to obtain the position of the receiving surface 105.
  • the processor is used to adjust the first parallax information of the first image information and the second image information according to the position of the receiving surface 105. By adjusting the parallax information, the value of d2 can be adjusted.
  • the first imaging light of the two imaging lights can be reflected or transmitted to the left eye of the user through the optical element 102.
  • the second imaging light of the two imaging lights can be reflected or transmitted to the right eye of the user through the optical element 102.
  • different process errors exist at different positions of the optical element 102.
  • the process error will cause the zoom factor and imaging position of the image observed by the user to have display differences relative to the ideal position, thereby reducing the user experience. Therefore, in an embodiment of the present application, image information with different parallaxes can be preprocessed.
  • the display difference is compensated by preprocessing, thereby enhancing the display effect.
  • the processor can perform one or more of the following kinds of processing on the left-eye image and/or the right-eye image loaded by the image generation component 101: translating all or part of the left-eye image or the right-eye image; enlarging or reducing all or part of the left-eye image or the right-eye image; or distorting (warping) all or part of the left-eye image or the right-eye image. A sketch of such pre-processing is given below.
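  • a minimal sketch of such per-eye pre-processing (translation and scaling about the image centre), assuming an OpenCV-style affine warp; the parameter values are illustrative placeholders, not calibration data from this application:

```python
import cv2
import numpy as np

def preprocess_eye_image(img: np.ndarray, dx: float, dy: float, scale: float) -> np.ndarray:
    """Translate the image by (dx, dy) pixels and scale it about its centre."""
    h, w = img.shape[:2]
    cx, cy = w / 2.0, h / 2.0
    # 2x3 affine matrix: scale about the centre, then translate.
    m = np.float32([[scale, 0.0, (1.0 - scale) * cx + dx],
                    [0.0, scale, (1.0 - scale) * cy + dy]])
    return cv2.warpAffine(img, m, (w, h))

# Example: compensate a small horizontal offset and a magnification difference
# between the left-eye and right-eye optical paths (hypothetical values).
left = np.zeros((480, 640, 3), dtype=np.uint8)
right = np.zeros((480, 640, 3), dtype=np.uint8)
left_corrected = preprocess_eye_image(left, dx=3.0, dy=0.0, scale=1.00)
right_corrected = preprocess_eye_image(right, dx=-3.0, dy=0.5, scale=0.98)
```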
  • the projection system can project different image information to the user at different distances in a time-sharing or simultaneous manner. This is described below.
  • FIG8 is a third optical path schematic diagram of the projection system provided in an embodiment of the present application.
  • the first imaging light carries the first image information group.
  • the first image information group includes the first image information and the third image information.
  • the second imaging light carries the second image information group.
  • the second image information group includes the second image information and the fourth image information.
  • the two imaging lights reflected or transmitted by the optical element 102 form a virtual image on the virtual image plane 103.
  • the virtual image also includes the first image information group and the second image information group.
  • the different image information groups observed by the user are located in different visual planes. These are described separately below.
  • P1 and P2 are image points on the virtual image plane 103.
  • the light beams corresponding to P1 and P2 will be irradiated to different positions of the receiving surface 105.
  • the light beam corresponding to P1 belongs to the first image information in the first imaging light.
  • the light beam corresponding to P2 belongs to the second image information in the second imaging light.
  • the light beam corresponding to P1 is irradiated to the upper eye position of the user.
  • the light beam corresponding to P2 is irradiated to the lower eye position of the user.
  • the first parallax information will cause the two pieces of image information viewed by the user to produce a first parallax angle.
  • the first parallax angle is θ1.
  • the image information observed by the user is located on the visual plane 104.
  • P1 and P2 observed by the user are P10 located on the visual plane 104.
  • P3 and P4 are image points on the virtual image plane 103.
  • the light beams corresponding to P3 and P4 will be irradiated to different positions of the receiving surface 105.
  • the light beam corresponding to P3 belongs to the third image information in the first imaging light.
  • the light beam corresponding to P4 belongs to the fourth image information in the second imaging light.
  • the light beam corresponding to P3 is irradiated to the upper eye position of the user.
  • the light beam corresponding to P4 is irradiated to the lower eye position of the user.
  • for the second parallax information, reference can be made to the aforementioned description of the first parallax information.
  • the second parallax information will cause the two pieces of image information viewed by the user to generate a second parallax angle.
  • the second parallax angle is related to d1 and m.
  • the second parallax angle is θ3.
  • the distance between the visual plane 801 and the receiving surface 105 is d3.
  • d3 is determined based on the second parallax information between the third image information and the fourth image information, m and d1.
  • the image information observed by the user is located on the visual plane 801.
  • P3 and P4 observed by the user are P11 located on the visual plane 801.
  • the projection system projects different image information to different distances from the user in a time-sharing manner.
  • the first imaging light output by the image generation component 101 carries the first image information.
  • the second imaging light output by the image generation component 101 carries the second image information.
  • the two imaging lights reflected or transmitted by the optical element 102 form a virtual image on the virtual image plane 103.
  • the image information observed by the user is located on the visual plane 104.
  • P1 and P2 observed by the user on the virtual image plane 103 are P10 located on the visual plane 104.
  • the first imaging light output by the image generation component 101 carries the third image information.
  • the second imaging light output by the image generation component 101 carries the fourth image information. The two imaging lights reflected or transmitted by the optical element 102 form a virtual image on the virtual image plane 103.
  • the third image information and the fourth image information have second parallax information.
  • the image information observed by the user is located on the visual plane 801.
  • P3 and P4 observed by the user on the virtual image plane 103 are P11 located on the visual plane 801.
  • the first moment and the second moment are alternately distributed.
  • the projection system projects different image information to different distances of the user in a time-sharing manner.
  • FIG. 8 is only an example of a projection system provided by an embodiment of the present application. In practical applications, those skilled in the art can design a projection system according to requirements. For example, in FIG. 8, d3 and d2 are greater than d1. In practical applications, d3 and/or d2 may be less than d1. For another example, the first imaging light and the second imaging light may also carry more image information. More image information corresponds to more visual planes.
  • the third image information and the fourth image information can be used to form a 3D image, or can be used to form a 2D image.
  • d2 or d3 can be equal to d1.
  • FIG9 is a first structural schematic diagram of an image generation component provided by an embodiment of the present application.
  • the image generation component 101 includes a backlight component 901 and a spatial light modulator 902.
  • the backlight component 901 is used to output two light beams to the spatial light modulator 902 at different angles in a time-sharing manner.
  • the spatial light modulator 902 can be a liquid crystal display (LCD), a liquid crystal on silicon (LCOS) device, a digital micro-mirror device (DMD), or a micro-electro-mechanical system (MEMS).
  • the spatial light modulator 902 is used to modulate the two light beams in a time-sharing manner according to different image information to obtain two imaging lights, and output the two imaging lights at different angles.
  • the spatial light modulator 902 is used to modulate the first light beam according to the first image information to obtain the first imaging light.
  • the spatial light modulator 902 is used to modulate the second light beam according to the second image information to obtain the second imaging light.
  • the two-path imaging light is irradiated to the optical element.
  • the two-path imaging light reflected or transmitted by the optical element is irradiated to different viewpoints respectively.
  • the cost of the spatial light modulator 902 can be reduced, thereby reducing the cost of the image generation component 101.
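  • a toy sketch of this frame-sequential (time-sharing) drive, in which the backlight alternates between two illumination angles while the spatial light modulator is loaded with the matching image information; the class and method names are hypothetical placeholders, not an interface defined in this application:

```python
import itertools
import time

class BacklightAssembly:
    def set_angle(self, angle_deg: float) -> None:
        print(f"backlight angle -> {angle_deg} deg")

class SpatialLightModulator:
    def load(self, image_name: str) -> None:
        print(f"SLM loads {image_name}")

def run_frames(frame_period_s: float = 1 / 120, n_frames: int = 4) -> None:
    backlight = BacklightAssembly()
    slm = SpatialLightModulator()
    # Alternate (angle, image) pairs: the first beam carries the first image
    # information, the second beam carries the second image information.
    sequence = itertools.cycle([(+2.0, "first_image_information"),
                                (-2.0, "second_image_information")])
    for _, (angle, image) in zip(range(n_frames), sequence):
        backlight.set_angle(angle)
        slm.load(image)
        time.sleep(frame_period_s)

run_frames()
```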
  • the image generation component can also be a display that does not require a backlight, such as an organic light-emitting diode (OLED) display or a Micro LED.
  • FIG10 is a second structural schematic diagram of an image generating assembly provided in an embodiment of the present application.
  • the image generating assembly 101 includes a backlight assembly 901 and a spatial light modulator 902.
  • the backlight assembly 901 moves between position A and position B.
  • the backlight assembly 901 is used to output a first light beam at a first position (position A) and output a second light beam at a second position (position B).
  • moving the backlight assembly 901 may refer to moving some of the devices in the backlight assembly 901.
  • the backlight assembly 901 includes a light source device and a non-fixed element.
  • the non-fixed element may be a lens, a reflector, a prism, or a Fresnel mirror, etc.
  • the backlight assembly 901 may output two light beams in a time-sharing manner by moving the non-fixed element.
  • the backlight assembly may output the first light beam and the second light beam through one light source, thereby reducing the cost of the backlight assembly.
  • FIG11 is a third structural schematic diagram of the image generation component provided in an embodiment of the present application.
  • the image generation component 101 also includes a lens 1101 and a diffuser screen 1102.
  • the spatial light modulator 902 is used to output two paths of imaging light to the lens 1101 at different angles.
  • the lens 1101 is used to change the transmission direction of the two paths of imaging light and transmit the two paths of imaging light to the diffuser screen 1102.
  • the diffuser screen 1102 is used to diffuse the two paths of imaging light and output the diffused two paths of imaging light at different angles.
  • the diffused two paths of imaging light can be irradiated to different viewpoints through optical elements.
  • the embodiment of the present application can project different image information to different distances from the user in a time-sharing manner.
  • the first imaging light at the first moment carries the first image information.
  • the second imaging light at the first moment carries the second image information.
  • the first imaging light at the second moment carries the third image information.
  • the second imaging light at the second moment carries the fourth image information.
  • the projection system can project different image information to different distances from the user in another time-sharing manner.
  • the first imaging light at the first moment carries the first image information and the third image information.
  • the second imaging light at the second moment carries the second image information and the fourth image information.
  • the focal length of the optical element 102 is f.
  • the distance between the image plane of the image generating component 101 (the display surface of the image) and the optical element 102 is d.
  • the image plane of the image generating component 101 can be a pixel component or a diffusion screen.
  • d can be the farthest vertical distance between the optical element 102 and the image plane of the image generating component 101.
  • d can be the straight-line distance between the central pixel of the image plane of the image generating component 101 and the target point on the optical element 102.
  • the central pixel is one or more pixels at the center position of the image plane.
  • the imaging light output by the central pixel irradiates the target point on the optical element 102.
  • the distance between the virtual image formed by the two imaging lights after reflection or transmission and the optical element is d0, that is, the distance between the virtual image plane 103 and the optical element 102 is d0.
  • d0, d and f satisfy the following formula:
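  • the formula is again not reproduced in this text; assuming the same Gaussian relation sketched earlier, $d_0=\frac{d\,f}{f-d}$, an illustrative calculation with hypothetical values $d=0.25$ m and $f=0.3$ m gives $d_0=\frac{0.25\times 0.3}{0.3-0.25}=1.5$ m, so the virtual image lies well beyond the optical element and is magnified relative to the source image.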
  • d may be smaller than f.
  • the optical element 102 may magnify the virtual image. Therefore, even when the distance between the user and the optical element 102 is relatively small, the user can see an enlarged virtual image, thereby improving the user experience.
  • the optical element 102 reflects or transmits two imaging lights through the reflection or transmission area.
  • the target point may refer to any point in the reflection or transmission area.
  • the optical element 102 reflects or transmits two imaging lights through two reflection or transmission areas.
  • the two imaging lights correspond to the reflection or transmission areas one by one.
  • the target point may refer to the center point between the two reflection or transmission areas.
  • FIG. 12 is a circuit diagram of a projection system provided in an embodiment of the present application.
  • the circuit in the projection system mainly includes a processor 1001, an internal memory 1002, an external memory interface 1003, an audio module 1004, a video module 1005, a power module 1006, a wireless communication module 1007, an I/O interface 1008, a video interface 1009, a Controller Area Network (CAN) transceiver 1010, a display circuit 1028, and a display panel 1029.
  • the processor 1001 and its peripheral components, such as the internal memory 1002, the CAN transceiver 1010, the audio module 1004, the video module 1005, the power module 1006, the wireless communication module 1007, the I/O interface 1008, the video interface 1009, the touch unit 1010, and the display circuit 1028 can be connected through a bus.
  • the processor 1001 can be called a front-end processor.
  • circuit diagrams shown in the embodiments of the present application do not constitute a specific limitation on the projection system.
  • the projection system may include more or fewer components than shown in the figure, or combine certain components, or split certain components, or arrange the components differently.
  • the components shown in the figure may be implemented in hardware, software, or a combination of software and hardware.
  • the processor 1001 includes one or more processing units, for example, the processor 1001 may include an application processor (AP), a modem processor, a graphics processor (GPU), an image signal processor (ISP), a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural network processor (NPU), etc.
  • a memory may also be provided in the processor 1001 for storing instructions and data.
  • the operating system of the projection system, the AR Creator software package, etc. may be stored.
  • the memory in the processor 1001 is a cache memory.
  • the memory may store instructions or data that the processor 1001 has just used or cyclically used. If the processor 1001 needs to use the instruction or data again, it may be directly called from the memory. Repeated access is avoided, the waiting time of the processor 1001 is reduced, and the efficiency of the system is improved.
  • the functions of the processor 1001 can be implemented by a domain processor on the vehicle.
  • the projection system may further include a plurality of input/output (I/O) interfaces 1008 connected to the processor 1001.
  • the interface 1008 may include, but is not limited to, an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a universal serial bus (USB) interface, etc.
  • the I/O interface 1008 may be connected to devices such as a mouse, touch screen, keyboard, camera, speaker, microphone, etc., and may also be connected to physical buttons on the projection system (such as volume buttons, brightness adjustment buttons, power buttons, etc.).
  • the internal memory 1002 can be used to store computer executable program codes, which include instructions.
  • the internal memory 1002 may include a program storage area and a data storage area.
  • the program storage area may store an operating system, an application required for at least one function (such as a call function, a time setting function, an AR function, etc.), etc.
  • the data storage area may store data created during the use of the projection system (such as a phone book, world time, etc.), etc.
  • the internal memory 1002 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one disk storage device, a flash memory device, a universal flash storage (Universal Flash Storage, UFS), etc.
  • the processor 1001 executes various functional applications and data processing of the projection system by running instructions stored in the internal memory 1002 and/or instructions stored in a memory provided in the processor 1001.
  • the external memory interface 1003 can be used to connect an external memory (such as a Micro SD card).
  • the external memory can store data or program instructions as needed, and the processor 1001 can perform operations such as reading and writing these data or programs through the external memory interface 1003.
  • the audio module 1004 is used to convert digital audio information into analog audio signal output, and is also used to convert analog audio input into digital audio signals.
  • the audio module 1004 can also be used to encode and decode audio signals, such as playing or recording.
  • the audio module 1004 can be arranged in the processor 1001, or some functional modules of the audio module 1004 can be arranged in the processor 1001.
  • the projection system can realize audio functions through the audio module 1004 and the application processor.
  • the video interface 1009 can receive external audio and video input, which can be a high-definition multimedia interface (HDMI), a digital visual interface (DVI), a video graphics array (VGA), a display port (DP), a low voltage differential signal (LVDS) interface, etc.
  • the video interface 1009 can also output video to the outside.
  • the projection system receives the audio and video input through the video interface.
  • the video module 1005 can decode the video input by the video interface 1009, for example, by performing H.264 decoding.
  • the video module can also encode the video collected by the projection system, for example, by performing H.264 encoding on the video collected by the external camera.
  • the processor 1001 can also decode the video input by the video interface 1009, and then output the decoded image signal to the display circuit.
  • the stereoscopic projection system further includes a CAN transceiver 1010, which can be connected to the CAN bus (CAN BUS) of the car.
  • the stereoscopic projection system can communicate with the in-vehicle entertainment system (music, radio, video module), the vehicle status system, etc.
  • the user can turn on the in-vehicle music playback function by operating the projection system.
  • the vehicle status system can send vehicle status information (doors, seat belts, etc.) to the stereoscopic projection system for display.
  • the display circuit 1028 and the display panel 1029 jointly realize the function of displaying an image.
  • the display circuit 1028 receives the image signal output by the processor 1001, processes the image signal and then inputs it into the display panel 1029 for imaging.
  • the display circuit 1028 can also control the image displayed by the display panel 1029. For example, it controls parameters such as display brightness or contrast.
  • the display circuit 1028 may include a driving circuit, an image control circuit, etc.
  • the above-mentioned display circuit 1028 and the display panel 1029 may be located in the pixel component 502.
  • the display panel 1029 is used to modulate the light beam input by the light source according to the input image signal, so as to generate a visible image.
  • the display panel 1029 can be a liquid crystal on silicon panel, a liquid crystal display panel or a digital micromirror device.
  • the video interface 1009 can receive input video data (or called a video source), and the video module 1005 decodes and/or digitally processes the data and outputs an image signal to the display circuit 1028.
  • the display circuit 1028 drives the display panel 1029 to image the light beam emitted by the light source according to the input image signal, thereby generating a visible image (emitting imaging light).
  • the power module 1006 is used to provide power to the processor 1001 and the light source according to the input power (e.g., direct current), and the power module 1006 may include a rechargeable battery, which can provide power to the processor 1001 and the light source.
  • the light emitted by the light source can be transmitted to the display panel 1029 for imaging, thereby forming an image light signal (imaging light).
  • the power module 1006 can be connected to a power module of a car (eg, a power battery), and the power module of the car supplies power to the power module 1006 of the projection system.
  • the wireless communication module 1007 enables the projection system to communicate wirelessly with the outside world, and can provide wireless local area networks (WLAN) (such as wireless fidelity (Wi-Fi) networks), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR) and other wireless communication solutions.
  • the wireless communication module 1007 can be one or more devices integrating at least one communication processing module.
  • the wireless communication module 1007 receives electromagnetic waves via an antenna, modulates the frequency of the electromagnetic wave signal and performs filtering, and sends the processed signal to the processor 1001.
  • the wireless communication module 1007 can also receive the signal to be sent from the processor 1001, modulate the frequency of the signal, amplify it, and convert it into electromagnetic waves for radiation through the antenna.
  • the video data decoded by the video module 1005 can also be wirelessly received through the wireless communication module 1007 or read from the internal memory 1002 or the external memory.
  • the projection system can receive video data from the terminal device or the in-vehicle entertainment system through the wireless LAN in the car, and the projection system can also read the audio and video data stored in the internal memory 1002 or the external memory.
  • the embodiment of the present application also provides a vehicle, which is equipped with any of the aforementioned stereoscopic projection systems.
  • the projection system is used to output two imaging lights.
  • the two imaging lights carry different image information.
  • the two output imaging lights are illuminated to the receiving surface through the windshield to form a virtual image.
  • the virtual image is located on one side of the windshield, and the driver or passenger is located on the other side of the windshield.
  • the two imaging lights after reflection or transmission are respectively irradiated to the eyes of the driver or the passenger.
  • the first imaging light is irradiated to the left eye of the passenger.
  • the second imaging light is irradiated to the right eye of the passenger.
  • FIG. 13 is a schematic diagram of the projection system installed in a vehicle provided in the embodiment of the present application.
  • the windshield of the vehicle can be used as an optical element in the projection system.
  • the image generation component 101 in the projection system is located on the same side of the windshield as the driver or passenger.
  • the image generation component 101 is used to output two imaging lights.
  • the two imaging lights carry different image information.
  • the windshield is used to reflect or transmit the two imaging lights to form a virtual image.
  • the virtual image is located on one side of the windshield, and the driver or passenger is located on the other side of the windshield.
  • the two imaging lights after reflection or transmission are respectively irradiated to the eyes of the driver or the passenger.
  • the first imaging light is irradiated to the left eye of the passenger.
  • the second imaging light is irradiated to the right eye of the passenger.
  • the vehicle can be a car, truck, motorcycle, bus, ship, airplane, helicopter, lawn mower, recreational vehicle, amusement park vehicle, construction equipment, tram, golf cart, train, and cart, etc., which are not particularly limited in the embodiments of the present application.
  • the projection system can be installed on the instrument panel (IP) of the vehicle, located at the co-pilot position or the main driver position, or it can be installed on the back of the seat.
  • HUD can be used to display navigation information, vehicle speed, power/fuel level, etc.
  • FIG. 14 is a schematic diagram of a possible functional framework of a vehicle provided in an embodiment of the present application.
  • the functional framework of the vehicle may include various subsystems, such as the control system 14, the sensor system 12, one or more peripheral devices 16 (one is shown as an example), the power supply 18, the computer system 20, and the display system 32.
  • the vehicle may also include other functional systems, such as an engine system that provides power for the vehicle, etc., which is not limited in this application.
  • the sensor system 12 may include several detection devices, which can sense the measured information and convert the sensed information into electrical signals or other required forms of information output according to certain rules.
  • these detection devices may include a global positioning system (GPS), a vehicle speed sensor, an inertial measurement unit (IMU), a radar unit, a laser rangefinder, a camera device, a wheel speed sensor, a steering sensor, a gear position sensor, or other components for automatic detection, etc., and this application does not limit them.
  • the control system 14 may include several components, such as the steering unit, brake unit, lighting system, automatic driving system, map navigation system, network timing system and obstacle avoidance system shown in the figure.
  • the control system 14 may also include components such as a throttle processor and an engine processor for controlling the vehicle's speed, which are not limited in this application.
  • the peripheral device 16 may include several components, such as the communication system, touch screen, user interface, microphone, and speaker shown in the figure.
  • the communication system is used to realize network communication between the vehicle and other devices other than the vehicle.
  • the communication system may use wireless communication technology or wired communication technology to realize network communication between the vehicle and other devices.
  • the wired communication technology may refer to communication between the vehicle and other devices through network cables or optical fibers.
  • the power source 18 represents a system that provides power or energy for the vehicle, which may include but is not limited to a rechargeable lithium battery or a lead-acid battery, etc. In practical applications, one or more battery components in the power source are used to provide power or energy for starting the vehicle, and the type and material of the power source are not limited in this application.
  • the computer system 20 may include one or more processors 2001 (one processor is shown as an example) and a memory 2002 (also referred to as a storage device).
  • the memory 2002 may be inside the computer system 20, or it may be outside the computer system 20, for example, as a cache in the vehicle, which is not limited in this application.
  • Processor 2001 may include one or more general-purpose processors, such as a graphics processing unit (GPU). Processor 2001 may be used to run the relevant programs or instructions corresponding to the programs stored in memory 2002 to implement the corresponding functions of the vehicle.
  • the memory 2002 may include a volatile memory, such as a RAM; the memory may also include a non-volatile memory, such as a ROM, a flash memory or a solid state drive (SSD); the memory 2002 may also include a combination of the above-mentioned types of memories.
  • the memory 2002 may be used to store a set of program codes or instructions corresponding to the program codes, so that the processor 2001 calls the program codes or instructions stored in the memory 2002 to implement the corresponding functions of the vehicle.
  • the functions include but are not limited to some or all of the functions in the vehicle functional framework diagram shown in FIG. 14.
  • a set of program codes for vehicle control may be stored in the memory 2002, and the processor 2001 calls the program codes to control the safe driving of the vehicle; how safe driving of the vehicle is achieved is described in detail below in the present application.
  • the memory 2002 may also store information such as road maps, driving routes, sensor data, etc.
  • the computer system 20 may be combined with other elements in the vehicle functional framework diagram, such as sensors in the sensor system, GPS, etc., to implement relevant functions of the vehicle.
  • the computer system 20 may control the driving direction or driving speed of the vehicle based on the data input from the sensor system 12, which is not limited in this application.
  • the display system 32 may include several components, such as a processor, an optical element, and the stereoscopic projection system 100 described above.
  • the processor is used to generate images according to user instructions (such as generating images containing vehicle status such as vehicle speed, power/fuel level, and images of augmented reality AR content), and send the image content to the stereoscopic projection system 100.
  • the stereoscopic projection system 100 is used to output two paths of imaging light carrying different image information.
  • the windshield is an optical element. The windshield is used to reflect or transmit the two paths of imaging light so that a virtual image corresponding to the image content is presented in front of the driver or passenger.
  • the functions of some components in the display system 32 can also be implemented by other subsystems of the vehicle.
  • the processor can also be a component in the control system 14.
  • FIG. 14 of the present application includes four subsystems; the sensor system 12, the control system 14, the computer system 20 and the display system 32 are only examples and do not constitute limitations.
  • the vehicle can combine several of its components according to different functions to obtain subsystems with correspondingly different functions.
  • vehicles can include more or fewer systems or components, and this application does not limit them.
  • FIG15 is a flow chart of a projection method provided in an embodiment of the present application.
  • the projection method can be applied to a projection system or a vehicle equipped with a projection system.
  • the projection method is described below by taking its application to the projection system as an example.
  • the projection method includes the following steps.
  • the projection system outputs two paths of imaging light.
  • the two paths of imaging light include a first imaging light and a second imaging light.
  • the first imaging light carries first image information
  • the second imaging light carries second image information.
  • the projection system includes an image generating component 101 and an optical element 102.
  • the projection system outputs two paths of imaging light through the image generating component 101.
  • the image generating component 101 includes a backlight component 901 and a spatial light modulator 902.
  • the backlight component 901 is used to output two light beams to the spatial light modulator 902 at different angles in a time-sharing manner.
  • the spatial light modulator 902 is used to modulate the two light beams in a time-sharing manner according to different image information to obtain two paths of imaging light.
  • the projection system reflects or transmits two imaging lights.
  • the two imaging lights after reflection or transmission generate a virtual image on the virtual image plane.
  • the two imaging lights are respectively irradiated to two positions of the receiving surface.
  • the distance between the two positions is m.
  • the distance between the virtual image plane and the receiving surface is d1.
  • the distance between the receiving surface and the first visual plane is d2.
  • d2 is determined based on the first parallax information between the first image information and the second image information, m and d1. d1 and d2 meet the first condition.
  • the two paths of imaging light after reflection or transmission are respectively irradiated to different positions of the receiving surface 105, such as the left eye (upper eye position) and the right eye (lower eye position) of the user.
  • the two paths of imaging light after reflection or transmission generate a virtual image on the virtual image plane 103.
  • the virtual image plane 103 is located on one side of the optical element 102, and the receiving surface 105 is located on the other side of the optical element 102.
  • the first parallax information will cause the two image information viewed by the user to produce a first parallax angle.
  • the image information observed by the user is located on the visual plane 104.
  • the distance between the visual plane 104 and the receiving surface 105 is d2.
  • d1 and d2 meet the first condition, and the first condition can be any one or more of the following conditions (a brief numerical check of these conditions is sketched after this list).
  • d2 is smaller than d1.
  • the projection system may form a ghost image.
  • the ghost image will affect the clarity of the HUD display and driving safety.
  • when d2 is smaller than d1, the impact of the ghost image can be reduced, thereby improving the clarity of the HUD display and driving safety.
  • VAC is less than 0.25 diopters.
  • the VAC size is negatively correlated with the user experience. By controlling VAC, the user experience can be improved.
  • the projection system outputs two light beams in a time-sharing manner, and obtains two imaging lights by modulating the two light beams in a time-sharing manner.
  • the projection system outputs two imaging lights at a first moment.
  • the first imaging light in the two imaging lights carries the first image information.
  • the second imaging light in the two imaging lights carries the second image information.
  • the first image information and the second image information carry the first parallax information.
  • the projection method further includes the following steps: the projection system outputs two imaging lights at a second moment.
  • the first imaging light in the two imaging lights carries the third image information.
  • the second imaging light in the two imaging lights carries the fourth image information.
  • the third image information and the fourth image information carry the second parallax information.
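As referenced above, the following is a minimal numerical sketch of how the two sub-conditions of the first condition could be checked. It only restates relations given in this document (d2 smaller than d1, and VAC = |1/d2 - 1/d1| below 0.25 diopters, with d1 and d2 in meters); the function name and the example values are illustrative and not part of the original disclosure.

```python
def first_condition_checks(d1_m: float, d2_m: float, vac_limit: float = 0.25) -> dict:
    """Evaluate the two sub-conditions of the 'first condition' described above.

    d1_m: distance (m) from the receiving surface to the virtual image plane.
    d2_m: distance (m) from the receiving surface to the first visual plane.
    """
    vac = abs(1.0 / d2_m - 1.0 / d1_m)  # vergence-accommodation conflict, in diopters
    return {
        "d2_smaller_than_d1": d2_m < d1_m,
        "vac_below_limit": vac < vac_limit,
        "vac_diopters": vac,
    }

# Example (assumed values): d1 = 10 m, d2 = 7.5 m -> VAC = |1/7.5 - 1/10| ~= 0.033 D,
# so both sub-conditions hold.
print(first_condition_checks(10.0, 7.5))
```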

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Instrument Panels (AREA)
  • Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)

Abstract

A projection system, applied to the field of display. The projection system comprises an image generation component (101) and an optical element (102). The image generation component (101) is used for outputting two paths of imaging light. The two paths of imaging light comprise a first path of imaging light and a second path of imaging light. The first path of imaging light carries first image information. The second path of imaging light carries second image information. The optical element (102) is used for reflecting or transmitting the two paths of imaging light. The two paths of imaging light generate a virtual image on a virtual image plane (103). The two paths of imaging light are irradiated to two positions on a receiving plane (105). The distance between the two positions is m. The distance between the virtual image plane (103) and the receiving plane (105) is d1. The distance between the receiving plane (105) and a first visual plane is d2. The vergence-accommodation conflict (VAC) satisfies the following relation: VAC=|1/d2-1/d1|. The VAC is less than 0.25 diopters. By adjusting parallax information of the two pieces of image information, the visual distance d2 can be adjusted, so that the number of projection systems can be reduced, and the cost of a head-up display (HUD) can be lowered.

Description

Projection system, projection method and vehicle
This application claims priority to the Chinese patent application filed with the State Intellectual Property Office of China on November 11, 2022, with application number CN202211414939.3 and entitled "Projection System, Projection Method and Vehicle", the entire contents of which are incorporated by reference in this application.
Technical Field
The present application relates to the field of display, and in particular to a projection system, a projection method and a vehicle.
Background Art
Head-up display (HUD), also known as a head-up display system, is a device that projects information such as speed and navigation in front of the driver, so that the driver can see instrument information without lowering his head. In a HUD, in order to improve visual enjoyment, different image information can be projected to different distances from the driver. For example, instrument-related information is projected 2.5 meters (m) away from the driver, and augmented reality (AR) information is projected 10 m away from the driver. In this case, the HUD includes two projection systems: one projection system is used to project the instrument-related information, and the other projection system is used to project the AR information.
In practical applications, the cost of the projection system is high, which leads to the high cost of the HUD.
Summary of the Invention
The present application provides a projection system, which can adjust the visual distance by adjusting the parallax information of two pieces of image information, thereby reducing the number of projection systems and the cost of the HUD, and which can improve the user experience by controlling the vergence accommodation conflict (VAC).
In a first aspect, the present application provides a projection system. The projection system includes an image generating component and an optical element. The image generating component is used to output two paths of imaging light. The two paths of imaging light include a first imaging light and a second imaging light. The first imaging light carries first image information. The second imaging light carries second image information. The optical element is used to reflect or transmit the two paths of imaging light, and the reflected or transmitted imaging light generates a virtual image on a virtual image plane. The reflected or transmitted imaging light is irradiated to two positions on a receiving surface, and the two positions correspond to the two paths of imaging light one by one. The distance between the two positions is m. The distance between the virtual image plane and the receiving surface is d1. There is first parallax information between the first image information and the second image information. The first parallax information causes the two pieces of image information viewed by the user to produce a first parallax angle, so that the image information observed by the user is located on a first visual plane. The distance between the receiving surface and the first visual plane is d2. d2 is determined based on the first parallax information between the first image information and the second image information, m and d1. The vergence accommodation conflict (VAC) satisfies the following relationship: VAC = |1/d2 - 1/d1|, where VAC is less than 0.25 diopters and d1 and d2 are in meters.
In an optional manner of the first aspect, the optical element is a windshield. The windshield includes a first glass layer, a second glass layer, and an intermediate layer bonding the first glass layer and the second glass layer.
In an optional manner of the first aspect, the two paths of imaging light are linearly polarized light, and the intermediate layer is used to absorb the linearly polarized light. When the two paths of imaging light are incident on the optical element, they are reflected by the inner and outer glass layers that are in contact with air, so that two virtual images are formed at the receiving position of the human eye. The two virtual images partially overlap and form a ghost image. Ghosting seriously affects the clarity of the HUD display and driving safety. By absorbing the linearly polarized light, the impact of ghosting can be reduced, thereby improving the clarity of the HUD display and driving safety.
In an optional manner of the first aspect, the intermediate layer is a wedge-shaped structure. The wedge-shaped structure can reduce the impact of ghosting, thereby improving the clarity of the HUD display and driving safety.
In an optional manner of the first aspect, the value of d1 is in the range of 2.5 m to 7 m.
In an optional manner of the first aspect, the thickness of the intermediate layer is the same at different positions. An intermediate layer of uniform thickness can reduce the cost of the optical element, thereby reducing the cost of the projection system.
In an optional manner of the first aspect, d2 is smaller than d1. When the thickness of the intermediate layer is the same at different positions, the projection system forms a ghost image. When d2 is smaller than d1, the impact of the ghost image can be reduced, thereby improving the clarity of the HUD display and driving safety.
In an optional manner of the first aspect, the value of d1 is in the range of 10 m to 15 m. When the value of d1 is too small, the impact of ghosting is relatively large. When the value of d1 is too large and VAC is less than 0.25 diopters, the value of d2 becomes large, which affects the user experience. Therefore, in the present application, by controlling the value of d1, the impact of ghosting can be reduced and the user experience can be improved.
In an optional manner of the first aspect, the first imaging light also carries third image information, and the second imaging light also carries fourth image information. The distance between the receiving surface and a second visual plane is d3. d3 is determined based on second parallax information between the third image information and the fourth image information, m and d1. By carrying two sets of image information with different parallax information in the two paths of imaging light, the two sets of image information can be projected to different distances from the driver, thereby improving the user experience and reducing the cost of the HUD.
In an optional manner of the first aspect, d3 is equal to d1. When d1 is equal to d3, VAC-free display can be achieved for content displayed at a close distance.
In an optional manner of the first aspect, the focal length of the optical element is f, the distance between the image generating component and the optical element is d, and d is smaller than f. When d is smaller than f, the image information carried by the two paths of imaging light can be magnified, thereby improving the user experience.
In an optional manner of the first aspect, the distance between the virtual image plane and the optical element is d0. d0 satisfies the following formula: 1/f = 1/d - 1/d0, that is, d0 = d·f / (f - d).
In an optional manner of the first aspect, the image generating component includes a backlight component and a spatial light modulator. The backlight component is used to output two light beams to the spatial light modulator at different angles in a time-sharing manner. The spatial light modulator is used to modulate the two light beams in a time-sharing manner according to different image information to obtain the two paths of imaging light, and to output the two paths of imaging light at different angles. By modulating the two light beams in a time-sharing manner, the cost of the spatial light modulator can be reduced, thereby reducing the cost of the image generating component.
In an optional manner of the first aspect, the image generating component includes a backlight component, a spatial light modulator and a light-splitting element. The backlight component is used to generate a light beam to be modulated. The light-splitting element is used to split the light beam to be modulated into two sub-beams to be modulated. The spatial light modulator is used to modulate the two sub-beams to be modulated and output the two paths of imaging light. The light-splitting element can reduce the refresh rate of the spatial light modulator, thereby improving the reliability of the image generating component. The light-splitting element may be a lenticular grating, a liquid crystal grating, a barrier grating, an electronic grating, a diffraction element, or the like.
In an optional manner of the first aspect, the image generating component further includes a diffusion screen. The diffusion screen is used to receive the two paths of imaging light from the spatial light modulator, diffuse the two paths of imaging light, and output the diffused imaging light at different angles.
In an optional manner of the first aspect, the two light beams include a first light beam and a second light beam. The backlight component is used to output the first light beam at a first position and output the second light beam at a second position. By moving the backlight component, the backlight component can output the first light beam and the second light beam through one light source, thereby reducing the cost of the backlight component.
In an optional manner of the first aspect, the projection system further includes a human eye tracking module and a processor. The human eye tracking module is used to obtain the position of the receiving surface. The processor is used to adjust the first parallax information of the first image information and the second image information according to the position of the receiving surface. When the distance between the human eye and the optical element changes, the value of d2 changes, which may affect the user experience. By adjusting the parallax information, the value of d2 can be controlled, thereby improving the user experience.
In a second aspect, the present application provides a projection method. The projection method can be applied to a projection system. The projection method includes the following steps: the projection system outputs two paths of imaging light, where the two paths of imaging light include a first imaging light carrying first image information and a second imaging light carrying second image information; the projection system reflects or transmits the two paths of imaging light, and the reflected or transmitted imaging light generates a virtual image on a virtual image plane and is irradiated to two positions on a receiving surface. The distance between the two positions is m, the distance between the virtual image plane and the receiving surface is d1, and the distance between the receiving surface and a first visual plane is d2. d2 is determined based on the first parallax information between the first image information and the second image information, m and d1. VAC satisfies the following relationship: VAC = |1/d2 - 1/d1|, where VAC is less than 0.25 diopters.
It should be understood that the projection method of the second aspect and the projection system of the first aspect have features in common. Therefore, for the description of any optional manner of the second aspect, reference may be made to the description of the corresponding optional manner of the first aspect.
In a third aspect, the present application provides a vehicle. The vehicle includes the projection system described in the first aspect or any optional manner of the first aspect, and the projection system is installed on the vehicle.
BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1 is a first optical path schematic diagram of a projection system provided in an embodiment of the present application;
FIG. 2 is a first structural schematic diagram of a windshield provided in an embodiment of the present application;
FIG. 3 is a second structural schematic diagram of a windshield provided in an embodiment of the present application;
FIG. 4 is a third structural schematic diagram of a windshield provided in an embodiment of the present application;
FIG. 5 is a second optical path schematic diagram of the projection system provided in an embodiment of the present application;
FIG. 6 is a first schematic diagram of VAC provided in an embodiment of the present application;
FIG. 7 is a second schematic diagram of VAC provided in an embodiment of the present application;
FIG. 8 is a third optical path schematic diagram of the projection system provided in an embodiment of the present application;
FIG. 9 is a first structural schematic diagram of an image generating component provided in an embodiment of the present application;
FIG. 10 is a second structural schematic diagram of the image generating component provided in an embodiment of the present application;
FIG. 11 is a third structural schematic diagram of the image generating component provided in an embodiment of the present application;
FIG. 12 is a circuit schematic diagram of a projection system provided in an embodiment of the present application;
FIG. 13 is a schematic diagram of a projection system installed in a vehicle according to an embodiment of the present application;
FIG. 14 is a schematic diagram of a possible functional framework of a vehicle provided in an embodiment of the present application;
FIG. 15 is a flow chart of a projection method provided in an embodiment of the present application.
Detailed Description of Embodiments
The present application provides a projection system, which can adjust the visual distance by adjusting the parallax information of two pieces of image information, thereby reducing the number of projection systems and the cost of the HUD, and which can improve the user experience by controlling the vergence accommodation conflict (VAC). It should be understood that terms such as "first", "second" and "target" used in this application are only used for the purpose of distinguishing the description, and cannot be understood as indicating or implying relative importance or order. In addition, for the sake of simplicity and clarity, reference numbers and/or letters are repeated in multiple figures of the present application. Such repetition does not indicate a strictly limiting relationship between the various embodiments and/or configurations.
The projection system in the present application is applied to the field of display. In the display field, in order to improve visual enjoyment, a head-up display (HUD) can project different image information to different distances from the driver. In this case, the HUD includes two projection systems: one projection system is used to project instrument-related information, and the other projection system is used to project augmented reality (AR) information. Therefore, the cost of the HUD is relatively high.
To this end, the present application provides a projection system. FIG. 1 is a first optical path schematic diagram of a projection system provided in an embodiment of the present application. As shown in FIG. 1, the projection system includes an image generating component 101 and an optical element 102. The image generating component 101 is used to output two paths of imaging light. The optical element 102 may be a reflector, a windshield, a lens, a diffraction element, or the like. The optical element 102 is used to reflect or transmit the two paths of imaging light, and there is an angle between the two paths of imaging light after reflection or transmission. The reflected or transmitted imaging light is irradiated to two different positions on the receiving surface 105, such as the left eye (upper eye position) and the right eye (lower eye position) of the user. The distance between the two positions is m, and the two positions correspond to the two paths of imaging light one by one. The position of the human eye may also be called a viewpoint. The above projection system can provide multiple viewpoints for multiple people to watch. Correspondingly, the image generating component 101 can produce multiple groups of imaging light for different viewers, where one group of imaging light includes two paths of imaging light. This embodiment takes one viewpoint, that is, the image generating component 101 generating two paths of imaging light, as an example to illustrate the imaging process of the projection system.
The two paths of imaging light include a first imaging light and a second imaging light. The first imaging light carries first image information, and the second imaging light carries second image information. The reflected or transmitted imaging light forms a virtual image on the virtual image plane 103. For example, P1 and P2 are both image points on the virtual image plane 103, and the light beams corresponding to P1 and P2 are irradiated to different positions on the receiving surface 105. In FIG. 1, the light beam corresponding to P1 belongs to the first imaging light and the light beam corresponding to P2 belongs to the second imaging light. The light beam corresponding to P1 is irradiated to the upper eye position of the user, and the light beam corresponding to P2 is irradiated to the lower eye position of the user. In order to avoid crosstalk between the light beams, the light beam corresponding to P1 is not irradiated to the lower eye position of the user, and the light beam corresponding to P2 is not irradiated to the upper eye position of the user.
The distance between the virtual image plane 103 and the receiving surface 105 is d1. There is first parallax information between the first image information and the second image information. The first parallax information may be the distance between the two pixel points of a pixel group on the virtual image plane (referred to as the first distance for short). The first image information includes N first pixel points, the second image information includes N second pixel points, and N is an integer greater than 0. The N first pixel points correspond to the N second pixel points one by one. The first image information and the second image information form N pixel groups, where one pixel group includes one first pixel point and the corresponding second pixel point. The two pixel points in a pixel group are used to display the same point of an object. For example, both pixel points are used to display the center of a circle; or both pixel points are used to display the tip of a person's nose; or both pixel points are used to display the vertex of the number "1". The first distances of the N pixel groups are the same.
The first parallax information causes the two pieces of image information viewed by the user to produce a first parallax angle. The first parallax angle is related to d1 and m. For example, in FIG. 1, P1 and P2 are the two pixel points of a pixel group, and the distance between P1 and P2 is the first distance dm. P1, P2 and the two positions illuminated by the two paths of imaging light form an isosceles trapezoid. In this case, the first parallax angle β1 is obtained from dm, m and d1. The image information observed by the user is located on the visual plane 104. For example, P1 and P2 observed by the user appear as P10 located on the visual plane 104. The distance between the visual plane 104 and the receiving surface 105 is d2, which is also called the visual distance. d2, β1 and m satisfy the following relationship: tan(β1/2) = m/(2·d2), that is, d2 = m/(2·tan(β1/2)).
In FIG. 1, the vergence accommodation conflict (VAC) satisfies the following relationship: VAC = |1/d2 - 1/d1|, where d1 and d2 are in meters. When the value of VAC is too large, the two paths of imaging light make the user dizzy, thereby degrading the user experience. Therefore, in practical applications, VAC can be less than 0.25 diopters.
In the embodiment of the present application, the visual distance d2 can be adjusted by adjusting the parallax information of the two pieces of image information (the first image information and the second image information). Therefore, when the projection system in the present application is installed on a vehicle, the projection system can project different image information to different distances from the driver. For example, instrument-related information is projected 2.5 m away from the driver, and AR information is projected 10 m away from the driver. Therefore, the embodiment of the present application can reduce the number of projection systems, thereby reducing the cost of the HUD. In addition, by controlling the VAC, the user experience can be improved.
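The following is a minimal numerical sketch of this adjustment, assuming an interpupillary distance m of 63 mm and a virtual image plane at d1 = 4 m (both illustrative values, not taken from the original text). The disparity relation dm = m·|d1 - d2|/d2 is derived here from the similar-triangle geometry of FIG. 1 and is offered only as an illustration; the function names are hypothetical.

```python
import math

def required_disparity_m(m: float, d1: float, d2: float) -> float:
    """Separation dm (same units as m) of a pixel pair on the virtual image plane
    so that the fused point is perceived at visual distance d2 (similar triangles)."""
    return m * abs(d1 - d2) / d2

def vac_diopters(d1: float, d2: float) -> float:
    """Vergence-accommodation conflict VAC = |1/d2 - 1/d1|, d1 and d2 in meters."""
    return abs(1.0 / d2 - 1.0 / d1)

m = 0.063   # assumed interpupillary distance, 63 mm
d1 = 4.0    # assumed virtual image plane distance, m
for d2 in (2.5, 10.0):  # target visual distances, m (instrument vs. AR content)
    beta = 2.0 * math.degrees(math.atan(m / (2.0 * d2)))  # convergence angle, degrees
    print(f"d2={d2:5.1f} m  dm={1000 * required_disparity_m(m, d1, d2):6.1f} mm  "
          f"beta={beta:5.3f} deg  VAC={vac_diopters(d1, d2):5.3f} D")
```

With these assumed values both targets stay below 0.25 diopters (VAC = 0.15 D in each case), which is the sense in which a single virtual image plane can serve both display distances.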
According to the above description, the reflected or transmitted imaging light is irradiated to two different positions on the receiving surface 105, and the two positions correspond to two different light spots. It should be understood that m may be the center-to-center distance of the two light spots. Alternatively, when the user's two eyes are located at the two positions and the two positions correspond to the two eyes one by one, m may be the interpupillary distance of the eyes.
In practical applications, when the projection system in the present application is installed on a vehicle, the optical element 102 may be a windshield. FIG. 2 is a first structural schematic diagram of a windshield provided in an embodiment of the present application. As shown in FIG. 2, the windshield includes a first glass layer 201, a second glass layer 203 and an intermediate layer 202. The intermediate layer 202 is used to bond the first glass layer 201 and the second glass layer 203. In FIG. 2, the thickness of the intermediate layer 202 is the same at different positions in the target area, where the target area refers to the area through which the two paths of imaging light pass. It should be understood that there may be processing errors in the thickness of the intermediate layer 202 at different positions. Therefore, "the same thickness at different positions" means that the thickness deviation of the intermediate layer 202 at different positions is less than 1 millimeter.
In FIG. 2, when either of the two paths of imaging light is incident on the windshield, it is reflected by the inner and outer glass layers of the windshield that are in contact with air, so that two virtual images are formed at the receiving position of the human eye. Specifically, either of the two paths of imaging light is incident on the second glass layer 203. The imaging light reflected by the second glass layer 203 is reflected to the receiving position of the human eye and forms a main image on the outer side of the second glass layer 203. The imaging light transmitted by the second glass layer 203 passes through the intermediate layer and reaches the first glass layer 201. The imaging light reflected by the first glass layer 201 is reflected to the receiving position of the human eye and forms a virtual image on the outer side of the first glass layer 201. This virtual image and the main image are located at different positions, and the two images partially overlap to form a ghost image. Ghosting seriously affects the clarity of the HUD display and driving safety. To this end, the present application can reduce the impact of ghosting in any one or more of the following ways.
In the first way, the two paths of imaging light are linearly polarized light, and the intermediate layer 202 is used to absorb the linearly polarized light. FIG. 3 is a second structural schematic diagram of a windshield provided in an embodiment of the present application. As shown in FIG. 3, on the basis of FIG. 2, either of the two paths of imaging light is incident on the second glass layer 203. The imaging light reflected by the second glass layer 203 is reflected to the receiving position of the human eye and forms a virtual image on the outer side of the second glass layer 203. The imaging light transmitted by the second glass layer 203 is absorbed by the intermediate layer 202.
In the second way, the intermediate layer 202 is a wedge-shaped structure. FIG. 4 is a third structural schematic diagram of a windshield provided in an embodiment of the present application. As shown in FIG. 4, the windshield includes a first glass layer 201, a second glass layer 203 and an intermediate layer 202. The intermediate layer 202 is used to bond the first glass layer 201 and the second glass layer 203, and has a wedge-shaped structure in the target area. Either of the two paths of imaging light is incident on the second glass layer 203. The imaging light reflected by the second glass layer 203 is reflected to the receiving position of the human eye and forms a main image on the outer side of the second glass layer 203. The imaging light transmitted by the second glass layer 203 passes through the intermediate layer and reaches the first glass layer 201. The imaging light reflected by the first glass layer 201 is reflected to the receiving position of the human eye and forms a virtual image on the outer side of the first glass layer 201. The virtual image and the main image are located at the same position.
In the third way, the parallax information of the two pieces of image information is controlled so that d2 is smaller than d1. FIG. 5 is a second optical path schematic diagram of the projection system provided in an embodiment of the present application. As shown in FIG. 5, the projection system includes an image generating component 101 and an optical element 102. The image generating component 101 is used to output two paths of imaging light. The optical element 102 is used to reflect or transmit the two paths of imaging light, and there is an angle between the two paths of imaging light after reflection or transmission. The reflected or transmitted imaging light is irradiated to two different positions on the receiving surface 105, and the distance between the two positions is m. The two paths of imaging light include a first imaging light carrying first image information and a second imaging light carrying second image information. The reflected or transmitted imaging light forms a virtual image on the virtual image plane 103. For example, P1 and P2 are both image points on the virtual image plane 103, and the light beams corresponding to P1 and P2 are irradiated to different positions on the receiving surface 105. In FIG. 5, the light beam corresponding to P1 belongs to the first imaging light and is irradiated to the lower eye position of the user, and the light beam corresponding to P2 belongs to the second imaging light and is irradiated to the upper eye position of the user. There is first parallax information between the first image information and the second image information, which causes the two pieces of image information viewed by the user to produce a first parallax angle. In FIG. 5, the first parallax angle is β2. In this case, the image information observed by the user is located on the visual plane 104. For example, P1 and P2 observed by the user appear as P10 located on the visual plane 104.
In the foregoing description of FIG. 1, in order to improve the user experience, VAC can be less than 0.25 diopters. The value of VAC is related to d1 and d2. The relationship between VAC and d1, d2 is described below by way of example.
FIG. 6 is a first schematic diagram of VAC provided in an embodiment of the present application. In FIG. 6, the ordinate is VAC in diopters and the abscissa is distance in meters. FIG. 6 provides 12 curves. The starting point of each curve lies on the abscissa, and the abscissa of the starting point represents d1. For example, the abscissa of the starting point of curve 601 is 4.5, so 4.5 is the d1 of curve 601. The abscissa of any point on curve 601 represents d2. For example, the abscissa of point 602 is 8, and the ordinate of point 602 is 0.1. In this case, curve 601 and point 602 indicate that when d1 is equal to 4.5 and d2 is equal to 8, VAC is equal to 0.1. It should be understood that for the description of the other curves, reference can be made to the description of curve 601.
According to FIG. 6, when VAC is less than 0.25 diopters, the value of d2 is related to the value of d1. According to the description of FIG. 1 above, d2 is the visual distance, so the value of d2 affects the user experience. In the embodiment of the present application, in order to improve the user experience, the value range of d2 can be controlled by controlling the value of d1. For example, the value of d1 is in the range of 2.5 m to 7 m, where d1 may be 2.5 m or 7 m.
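A short numerical check of this reading of FIG. 6, and of the d2 range implied by the VAC limit, might look as follows. The VAC expression is the one given in this document; the helper d2_range_for and the printed ranges are an illustration added here, not content from the original.

```python
def vac(d1: float, d2: float) -> float:
    """VAC = |1/d2 - 1/d1| in diopters, with d1 and d2 in meters."""
    return abs(1.0 / d2 - 1.0 / d1)

# Numerical reading of FIG. 6: curve 601 (d1 = 4.5 m) at d2 = 8 m.
print(round(vac(4.5, 8.0), 3))  # ~0.097, i.e. roughly the 0.1 D of point 602

def d2_range_for(d1: float, vac_limit: float = 0.25):
    """Range of visual distances d2 that keep VAC below the limit."""
    lo = 1.0 / (1.0 / d1 + vac_limit)                                   # nearest allowed d2
    hi = 1.0 / (1.0 / d1 - vac_limit) if 1.0 / d1 > vac_limit else float("inf")
    return lo, hi

print(d2_range_for(2.5))  # (~1.54 m, ~6.67 m)
print(d2_range_for(7.0))  # (~2.55 m, inf), since 1/7 < 0.25
```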
FIG. 7 is a second schematic diagram of VAC provided in an embodiment of the present application. In FIG. 7, the ordinate is VAC in diopters and the abscissa is distance in meters. FIG. 7 provides 6 curves. The starting point of each curve lies on the abscissa, and the abscissas of the starting points of the 6 curves are 10, 10, 12, 12, 14 and 14, respectively. It should be understood that for the description of any of the 6 curves, reference can be made to the description of curve 601 in FIG. 6.
In the aforementioned third way, when the value of d1 is too small, the impact of ghosting is relatively large. When the value of d1 is too large and VAC is less than 0.25 diopters, the value of d2 becomes large, which affects the user experience. Therefore, in the embodiment of the present application, the impact of ghosting can be reduced and the user experience can be improved by controlling the value of d1. For example, the value of d1 is in the range of 10 m to 15 m, where d1 may be 10 m or 15 m.
According to the descriptions of FIG. 6 and FIG. 7, the value of d2 is related to d1. In practical applications, the value of d2 is also related to the position of the receiving surface 105. When the position of the user relative to the optical element 102 changes, the value of d2 changes accordingly. In order to improve the user experience, the projection system may further include a human eye tracking module and a processor. The human eye tracking module is used to obtain the position of the receiving surface 105, and the processor is used to adjust the first parallax information of the first image information and the second image information according to the position of the receiving surface 105. By adjusting the parallax information, the value of d2 can be adjusted.
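As a sketch of how such a processor might use the eye-tracking result, the following recomputes the on-screen disparity so that the perceived distance stays at a target d2 when the reported eye-to-virtual-image distance changes. It reuses the illustrative similar-triangle relation introduced above; the function name and the example values are hypothetical and not part of the original disclosure.

```python
def updated_disparity(m: float, d1_new: float, d2_target: float) -> float:
    """Disparity (on the virtual image plane) that keeps the perceived distance at
    d2_target after the tracking module reports a new eye-to-virtual-image
    distance d1_new. Illustrative similar-triangle relation only."""
    return m * abs(d1_new - d2_target) / d2_target

m, d2_target = 0.063, 10.0                     # assumed pupil distance and target distance
for d1_new in (3.8, 4.0, 4.2):                 # driver leans forward / sits back (assumed)
    print(d1_new, round(1000 * updated_disparity(m, d1_new, d2_target), 1), "mm")
```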
According to the foregoing description, the first of the two paths of imaging light can be reflected or transmitted by the optical element 102 to the user's left eye, and the second of the two paths of imaging light can be reflected or transmitted by the optical element 102 to the user's right eye. In practical applications, different positions of the optical element 102 have different process errors. The process errors cause the magnification and imaging position of the image observed by the user to deviate from the ideal position, which degrades the user experience. Therefore, in the embodiment of the present application, the image information with different parallaxes can be pre-processed, and the display deviation can be compensated through the pre-processing, thereby enhancing the display effect. For example, the processor may perform one or more of the following operations on the left-eye image and/or the right-eye image loaded by the image generating component 101: translating the whole or a part of the left-eye image or the right-eye image; enlarging or reducing the whole or a part of the left-eye image or the right-eye image; distorting the whole or a part of the left-eye image or the right-eye image.
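A minimal sketch of such per-eye pre-processing is shown below. The shift, scale and radial-distortion terms stand in for a measured calibration of the optical element; the parameter values and the function name are placeholders and do not describe the actual calibration procedure of this application.

```python
def precorrect_point(x: float, y: float,
                     dx: float = 0.0, dy: float = 0.0,
                     sx: float = 1.0, sy: float = 1.0,
                     k1: float = 0.0):
    """Map an ideal image coordinate to the coordinate actually loaded into the
    image generating component, compensating a shift (dx, dy), a scale (sx, sy)
    and a simple radial distortion term k1 (all hypothetical calibration values)."""
    r2 = x * x + y * y
    xd, yd = x * (1.0 + k1 * r2), y * (1.0 + k1 * r2)   # radial pre-distortion
    return sx * xd + dx, sy * yd + dy

# Example: left-eye image nudged 2 px to the right and scaled 1 % larger.
print(precorrect_point(100.0, 50.0, dx=2.0, sx=1.01, sy=1.01))
```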
In practical applications, in order to improve the user experience, the projection system can project different image information to different distances from the user either simultaneously or in a time-sharing manner. These two cases are described separately below.
The projection system simultaneously projects different image information to different distances from the user. FIG. 8 is a third optical path schematic diagram of the projection system provided in an embodiment of the present application. As shown in FIG. 8, on the basis of FIG. 1, the first imaging light carries a first image information group, which includes the first image information and third image information. The second imaging light carries a second image information group, which includes the second image information and fourth image information. The two paths of imaging light reflected or transmitted by the optical element 102 form a virtual image on the virtual image plane 103, and the virtual image also includes the first image information group and the second image information group. The different image information groups observed by the user are located on different visual planes, as described separately below.
P1 and P2 are image points on the virtual image plane 103, and the light beams corresponding to P1 and P2 are irradiated to different positions on the receiving surface 105. In FIG. 8, the light beam corresponding to P1 belongs to the first image information in the first imaging light, and the light beam corresponding to P2 belongs to the second image information in the second imaging light. The light beam corresponding to P1 is irradiated to the upper eye position of the user, and the light beam corresponding to P2 is irradiated to the lower eye position of the user. There is first parallax information between the first image information and the second image information, which causes the two pieces of image information viewed by the user to produce a first parallax angle. In FIG. 8, the first parallax angle is β1. In this case, the image information observed by the user is located on the visual plane 104; P1 and P2 observed by the user appear as P10 located on the visual plane 104.
P3 and P4 are image points on the virtual image plane 103, and the light beams corresponding to P3 and P4 are irradiated to different positions on the receiving surface 105. In FIG. 8, the light beam corresponding to P3 belongs to the third image information in the first imaging light, and the light beam corresponding to P4 belongs to the fourth image information in the second imaging light. The light beam corresponding to P3 is irradiated to the upper eye position of the user, and the light beam corresponding to P4 is irradiated to the lower eye position of the user. There is second parallax information between the third image information and the fourth image information; for the description of the second parallax information, reference can be made to the foregoing description of the first parallax information. The second parallax information causes the two pieces of image information viewed by the user to produce a second parallax angle, which is related to d1 and m. In FIG. 8, the second parallax angle is β3. The distance between the visual plane 801 and the receiving surface 105 is d3, and d3 is determined based on the second parallax information between the third image information and the fourth image information, m and d1. In this case, the image information observed by the user is located on the visual plane 801; for example, P3 and P4 observed by the user appear as P11 located on the visual plane 801.
The projection system projects different image information to different distances from the user in a time-sharing manner. For example, as shown in FIG. 8, at a first moment, the first imaging light output by the image generating component 101 carries the first image information and the second imaging light carries the second image information. The two paths of imaging light reflected or transmitted by the optical element 102 form a virtual image on the virtual image plane 103. There is first parallax information between the first image information and the second image information, so the image information observed by the user is located on the visual plane 104; for example, P1 and P2 observed on the virtual image plane 103 appear as P10 located on the visual plane 104. At a second moment, the first imaging light output by the image generating component 101 carries the third image information and the second imaging light carries the fourth image information. The two paths of imaging light reflected or transmitted by the optical element 102 form a virtual image on the virtual image plane 103. There is second parallax information between the third image information and the fourth image information, so the image information observed by the user is located on the visual plane 801; for example, P3 and P4 observed on the virtual image plane 103 appear as P11 located on the visual plane 801. The first moments and the second moments are distributed alternately. In this way, the projection system projects different image information to different distances from the user in a time-sharing manner.
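The time-sharing scheme above can be sketched as a simple frame scheduler that alternates the two image-information pairs. The labels and the generator below are illustrative only; the actual frame rate and rendering pipeline are not specified in this application.

```python
import itertools

# The first pair is rendered with the first parallax (perceived on visual plane 104),
# the second pair with the second parallax (perceived on visual plane 801).
pair_for_plane_104 = ("first_image_info", "second_image_info")
pair_for_plane_801 = ("third_image_info", "fourth_image_info")

def frame_schedule():
    """Alternate the two pairs: first moment, second moment, first moment, ..."""
    yield from itertools.cycle((pair_for_plane_104, pair_for_plane_801))

frames = frame_schedule()
for _ in range(4):
    first_path_info, second_path_info = next(frames)
    print(first_path_info, "-> first imaging light;", second_path_info, "-> second imaging light")
```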
It should be understood that the foregoing FIG. 8 is only an example of the projection system provided in the embodiment of the present application. In practical applications, those skilled in the art can design the projection system as required. For example, in FIG. 8, d3 and d2 are greater than d1; in practical applications, d3 and/or d2 may be smaller than d1. For another example, the first imaging light and the second imaging light may also carry more image information, and more image information corresponds to more visual planes.
It should be understood that, in the embodiment of the present application, the third image information and the fourth image information can be used to form a 3D image or a 2D image. When the third image information and the fourth image information are used to form a 2D image, d2 or d3 may be equal to d1.
FIG. 9 is a first structural schematic diagram of an image generating component provided in an embodiment of the present application. As shown in FIG. 9, the image generating component 101 includes a backlight component 901 and a spatial light modulator 902. The backlight component 901 is used to output two light beams to the spatial light modulator 902 at different angles in a time-sharing manner. The spatial light modulator 902 may be a liquid crystal display (LCD), a liquid crystal on silicon (LCOS) device, a digital micro-mirror device (DMD), a micro-electro-mechanical system (MEMS), or the like. The spatial light modulator 902 is used to modulate the two light beams in a time-sharing manner according to different image information to obtain two paths of imaging light, and to output the two paths of imaging light at different angles. For example, at a first moment, the spatial light modulator 902 modulates the first light beam according to the first image information to obtain the first imaging light; at a second moment, the spatial light modulator 902 modulates the second light beam according to the second image information to obtain the second imaging light. For the description of the two paths of imaging light, reference can be made to the foregoing description of FIG. 1, FIG. 5 or FIG. 8. For example, the two paths of imaging light are irradiated to the optical element, and the imaging light reflected or transmitted by the optical element is irradiated to different viewpoints. In the embodiment of the present application, by modulating the two light beams in a time-sharing manner, the cost of the spatial light modulator 902 can be reduced, thereby reducing the cost of the image generating component 101. In addition, the image generating component may also be a display that does not require a backlight, for example, an organic light-emitting diode (OLED) display or a Micro LED display.
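A rough sketch of the time-sharing synchronization between the backlight component 901 and the spatial light modulator 902 is given below, assuming two alternating "moments" per cycle. The data structure and names are hypothetical; the sketch only illustrates that each beam direction is paired with its own image information so that each eye position receives only its own imaging light.

```python
from dataclasses import dataclass

@dataclass
class Frame:
    beam: str          # which backlight beam is active (direction toward a viewpoint)
    image_info: str    # image information loaded on the spatial light modulator

def timeshare_frames(first_image: str, second_image: str, cycles: int):
    """Alternate beam direction and modulator content, one pair per cycle."""
    frames = []
    for _ in range(cycles):
        frames.append(Frame("beam 1 (toward upper eye position)", first_image))   # first moment
        frames.append(Frame("beam 2 (toward lower eye position)", second_image))  # second moment
    return frames

for f in timeshare_frames("first image information", "second image information", cycles=2):
    print(f.beam, "|", f.image_info)
```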
FIG. 10 is a second schematic structural diagram of the image generating component provided in an embodiment of the present application. As shown in FIG. 10, the image generating component 101 includes a backlight assembly 901 and a spatial light modulator 902. The backlight assembly 901 moves between position A and position B. The backlight assembly 901 is configured to output a first light beam at a first position (position A) and output a second light beam at a second position (position B). It should be understood that moving the backlight assembly 901 may mean moving only some components within the backlight assembly 901. For example, the backlight assembly 901 includes a light source device and a non-fixed element. The non-fixed element may be a lens, a reflector, a prism, a Fresnel mirror, or the like. The backlight assembly 901 may output the two light beams in a time-sharing manner by moving the non-fixed element. In this embodiment of the present application, by moving the backlight assembly, the backlight assembly can output the first light beam and the second light beam with a single light source, thereby reducing the cost of the backlight assembly.
FIG. 11 is a third schematic structural diagram of the image generating component provided in an embodiment of the present application. As shown in FIG. 11, on the basis of FIG. 9 or FIG. 10, the image generating component 101 further includes a lens 1101 and a diffuser screen 1102. The spatial light modulator 902 is configured to output the two paths of imaging light to the lens 1101 at different angles. The lens 1101 is configured to change the transmission direction of the two paths of imaging light and transmit them to the diffuser screen 1102. The diffuser screen 1102 is configured to diffuse the two paths of imaging light and output the diffused two paths of imaging light at different angles. The diffused two paths of imaging light can be irradiated to different viewpoints through the optical element.
According to the foregoing description of FIG. 8, the embodiments of the present application can project different image information to different distances from the user in a time-sharing manner. Specifically, the first path of imaging light at the first moment carries the first image information; the second path of imaging light at the first moment carries the second image information; the first path of imaging light at the second moment carries the third image information; and the second path of imaging light at the second moment carries the fourth image information. In practical applications, with reference to the related description of FIG. 9, the projection system may also project different image information to different distances from the user in another time-sharing manner. Specifically, the first path of imaging light at the first moment carries the first image information and the third image information, and the second path of imaging light at the second moment carries the second image information and the fourth image information.
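The two time-sharing schemes just described can be summarized as data. The following Python sketch is purely illustrative; the moment labels and key names are assumptions introduced here, not terms from this application.

```python
# Two hypothetical time-sharing schedules consistent with the description above.
# Keys are (moment, path); values list the image information carried.

SCHEME_A = {  # each moment serves both eyes with one depth plane
    ("moment_1", "first_path"):  ["first_image_information"],
    ("moment_1", "second_path"): ["second_image_information"],
    ("moment_2", "first_path"):  ["third_image_information"],
    ("moment_2", "second_path"): ["fourth_image_information"],
}

SCHEME_B = {  # each moment serves one eye with both depth planes
    ("moment_1", "first_path"):  ["first_image_information", "third_image_information"],
    ("moment_2", "second_path"): ["second_image_information", "fourth_image_information"],
}
```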
As shown in FIG. 1, the focal length of the optical element 102 is f. The distance between the image plane of the image generating component 101 (the display surface of the image) and the optical element 102 is d. The image plane of the image generating component 101 may be a pixel assembly or a diffuser screen. Each point on the optical element 102 has a perpendicular distance to the image plane of the image generating component 101, and d may be the largest perpendicular distance between the optical element 102 and the image plane of the image generating component 101. Alternatively, d may be the straight-line distance between the central pixel of the image plane of the image generating component 101 and a target point on the optical element 102. The central pixel is one or more pixels at the center of the image plane, and the imaging light output by the central pixel irradiates the target point on the optical element 102. The distance between the virtual image formed by the two paths of imaging light after reflection or transmission and the optical element is d0, that is, the distance between the virtual image plane 103 and the optical element 102 is d0. d0, d and f satisfy the following formula: 1/d - 1/d0 = 1/f, that is, d0 = d·f / (f - d). In the embodiments of the present application, d may be smaller than f. When d is smaller than f, the optical element 102 magnifies the virtual image. Therefore, when the distance between the user and the optical element 102 is relatively small, the user can see an enlarged virtual image, thereby improving the user experience.
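As a worked numerical illustration of the imaging relation reconstructed above (the relation is the standard Gaussian imaging formula and the numbers are invented for illustration; neither comes from this application):

```latex
% Illustrative values: focal length f = 300 mm, image-plane distance d = 250 mm < f
\[
\frac{1}{d} - \frac{1}{d_0} = \frac{1}{f}
\quad\Rightarrow\quad
d_0 = \frac{d\,f}{f - d} = \frac{250 \times 300}{300 - 250}\,\text{mm} = 1500\,\text{mm},
\]
\[
\text{lateral magnification } M = \frac{d_0}{d} = \frac{1500}{250} = 6 .
\]
% With d < f the virtual image plane lies well behind the optical element and the
% virtual image is magnified, consistent with the description above.
```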
In practical applications, the optical element 102 reflects or transmits the two paths of imaging light through a reflection or transmission area, and the target point may be any point in the reflection or transmission area. Alternatively, the optical element 102 reflects or transmits the two paths of imaging light through two reflection or transmission areas, where the two paths of imaging light are in one-to-one correspondence with the two reflection or transmission areas, and the target point may be the center point between the two reflection or transmission areas.
Refer to FIG. 12, which is a schematic circuit diagram of a projection system provided in an embodiment of the present application.
As shown in FIG. 12, the circuit in the projection system mainly includes a processor 1001, an internal memory 1002, an external memory interface 1003, an audio module 1004, a video module 1005, a power module 1006, a wireless communication module 1007, an I/O interface 1008, a video interface 1009, a controller area network (Controller Area Network, CAN) transceiver 1010, a display circuit 1028 and a display panel 1029. The processor 1001 and its peripheral components, such as the internal memory 1002, the CAN transceiver 1010, the audio module 1004, the video module 1005, the power module 1006, the wireless communication module 1007, the I/O interface 1008, the video interface 1009, the touch unit and the display circuit 1028, may be connected through a bus. The processor 1001 may be called a front-end processor.
In addition, the circuit diagram shown in this embodiment of the present application does not constitute a specific limitation on the projection system. In other embodiments of the present application, the projection system may include more or fewer components than shown in the figure, or combine some components, or split some components, or have a different component arrangement. The components shown in the figure may be implemented by hardware, software, or a combination of software and hardware.
The processor 1001 includes one or more processing units. For example, the processor 1001 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU). Different processing units may be independent devices or may be integrated in one or more processors.
A memory may further be provided in the processor 1001 for storing instructions and data, for example the operating system of the projection system, the AR Creator software package, and the like. In some embodiments, the memory in the processor 1001 is a cache. The memory may store instructions or data that the processor 1001 has just used or uses cyclically. If the processor 1001 needs to use the instructions or data again, they can be called directly from the memory. This avoids repeated access, reduces the waiting time of the processor 1001, and therefore improves the efficiency of the system.
In addition, if the projection system in this embodiment is installed on a vehicle, the functions of the processor 1001 may be implemented by a domain processor on the vehicle.
In some embodiments, the projection system may further include a plurality of input/output (I/O) interfaces 1008 connected to the processor 1001. The interface 1008 may include, but is not limited to, an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a universal serial bus (USB) interface. The I/O interface 1008 may be connected to devices such as a mouse, a touch screen, a keyboard, a camera, a speaker and a microphone, and may also be connected to physical buttons on the projection system (such as a volume button, a brightness adjustment button and a power button).
The internal memory 1002 may be used to store computer-executable program code, where the executable program code includes instructions. The internal memory 1002 may include a program storage area and a data storage area. The program storage area may store an operating system, an application required by at least one function (such as a call function, a time setting function, an AR function), and the like. The data storage area may store data created during use of the projection system (such as a phone book, world time), and the like. In addition, the internal memory 1002 may include a high-speed random access memory, and may further include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or a universal flash storage (UFS). The processor 1001 executes various functional applications and data processing of the projection system by running the instructions stored in the internal memory 1002 and/or the instructions stored in the memory provided in the processor 1001.
The external memory interface 1003 may be used to connect an external memory (for example, a Micro SD card). The external memory may store data or program instructions as required, and the processor 1001 may perform operations such as reading and writing on these data or programs through the external memory interface 1003.
The audio module 1004 is used to convert digital audio information into an analog audio signal for output, and is also used to convert an analog audio input into a digital audio signal. The audio module 1004 may further be used to encode and decode audio signals, for example for playback or recording. In some embodiments, the audio module 1004 may be disposed in the processor 1001, or some functional modules of the audio module 1004 may be disposed in the processor 1001. The projection system can implement audio functions through the audio module 1004, the application processor, and the like.
The video interface 1009 can receive externally input audio and video, and may specifically be a high definition multimedia interface (HDMI), a digital visual interface (DVI), a video graphics array (VGA) interface, a display port (DP), a low voltage differential signaling (LVDS) interface, or the like. The video interface 1009 can also output video. For example, the projection system receives, through the video interface, video data sent by a navigation system or video data sent by a domain processor.
The video module 1005 can decode the video input through the video interface 1009, for example perform H.264 decoding. The video module can also encode video captured by the projection system, for example perform H.264 encoding on video captured by an external camera. In addition, the processor 1001 can also decode the video input through the video interface 1009 and then output the decoded image signal to the display circuit.
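As a software analogue of the decoding step performed by the video module 1005, the following sketch uses the open-source PyAV bindings to FFmpeg to decode an H.264 stream frame by frame. The file name and pixel format are illustrative assumptions, and the actual video module may be a hardware decoder rather than software.

```python
import av  # PyAV: Python bindings to FFmpeg

def decode_h264_frames(path="input_h264.mp4"):
    """Yield decoded video frames as numpy arrays (illustration only)."""
    with av.open(path) as container:
        for frame in container.decode(video=0):
            # Convert each decoded frame to a BGR image that a downstream
            # display circuit / panel pipeline could consume.
            yield frame.to_ndarray(format="bgr24")
```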
Further, the above stereoscopic projection system further includes a CAN transceiver 1010, and the CAN transceiver 1010 can be connected to the CAN bus (CAN BUS) of the automobile. Through the CAN bus, the stereoscopic projection system can communicate with the in-vehicle entertainment system (music, radio, video module), the vehicle status system, and the like. For example, the user can turn on the in-vehicle music playback function by operating the projection system. The vehicle status system can send vehicle status information (doors, seat belts, etc.) to the stereoscopic projection system for display.
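As a minimal sketch of how such CAN-bus communication might look in software (using the open-source python-can package; the arbitration ID, payload layout and decoding rule are invented for illustration and are not defined by this application or by any particular vehicle):

```python
import can  # python-can package

# Hypothetical arbitration ID under which the vehicle status system broadcasts
# door / seat-belt status; real IDs and payload layouts are vehicle-specific.
VEHICLE_STATUS_ID = 0x3A1

def read_vehicle_status(channel="can0"):
    """Listen on the CAN bus and decode one hypothetical status frame."""
    with can.Bus(channel=channel, interface="socketcan") as bus:
        for msg in bus:
            if msg.arbitration_id == VEHICLE_STATUS_ID:
                doors_open = bool(msg.data[0] & 0x0F)  # assumed bit layout
                belts_ok   = bool(msg.data[1] & 0x01)  # assumed bit layout
                return {"doors_open": doors_open, "belts_fastened": belts_ok}
```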
The display circuit 1028 and the display panel 1029 jointly implement the function of displaying an image. The display circuit 1028 receives the image signal output by the processor 1001, processes the image signal and then inputs it to the display panel 1029 for imaging. The display circuit 1028 can also control the image displayed by the display panel 1029, for example control parameters such as display brightness or contrast. The display circuit 1028 may include a driving circuit, an image control circuit, and the like. The display circuit 1028 and the display panel 1029 may be located in the pixel assembly 502.
The display panel 1029 is used to modulate, according to the input image signal, the light beam input by the light source, so as to generate a visible image. The display panel 1029 may be a liquid crystal on silicon panel, a liquid crystal display panel or a digital micromirror device.
In this embodiment, the video interface 1009 can receive input video data (also called a video source), the video module 1005 performs decoding and/or digital processing and then outputs an image signal to the display circuit 1028, and the display circuit 1028 drives, according to the input image signal, the display panel 1029 to image the light beam emitted by the light source, thereby generating a visible image (emitting imaging light).
The power module 1006 is used to supply power to the processor 1001 and the light source according to the input power (for example, direct current). The power module 1006 may include a rechargeable battery, and the rechargeable battery can supply power to the processor 1001 and the light source. The light emitted by the light source can be transmitted to the display panel 1029 for imaging, thereby forming an image light signal (imaging light).
In addition, the power module 1006 may be connected to a power supply module of the automobile (for example, a power battery), and the power supply module of the automobile supplies power to the power module 1006 of the projection system.
The wireless communication module 1007 enables the projection system to communicate wirelessly with the outside world, and can provide wireless communication solutions such as wireless local area networks (WLAN) (for example, a wireless fidelity (Wi-Fi) network), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), and infrared (IR). The wireless communication module 1007 may be one or more devices integrating at least one communication processing module. The wireless communication module 1007 receives electromagnetic waves via an antenna, performs frequency modulation and filtering on the electromagnetic wave signals, and sends the processed signals to the processor 1001. The wireless communication module 1007 can also receive a signal to be sent from the processor 1001, perform frequency modulation and amplification on it, and convert it into electromagnetic waves for radiation through the antenna.
In addition, the video data decoded by the video module 1005 can not only be input through the video interface 1009, but also be received wirelessly through the wireless communication module 1007 or read from the internal memory 1002 or the external memory. For example, the projection system can receive video data from a terminal device or the in-vehicle entertainment system through the wireless local area network in the vehicle, and the projection system can also read audio and video data stored in the internal memory 1002 or the external memory.
An embodiment of the present application further provides a vehicle, on which any one of the foregoing stereoscopic projection systems is installed. The projection system is used to output two paths of imaging light, and the two paths of imaging light carry different image information. The two output paths of imaging light are directed to the receiving surface via the windshield, forming a virtual image. The virtual image is located on one side of the windshield, and the driver or passenger is located on the other side of the windshield. The two paths of imaging light after reflection or transmission are irradiated respectively to the two eyes of the driver or passenger. For example, the first path of imaging light is irradiated to the left eye of the passenger, and the second path of imaging light is irradiated to the right eye of the passenger.
An embodiment of the present application further provides a vehicle, on which the projection system of the foregoing FIG. 1, FIG. 5, FIG. 8 or FIG. 12 is installed. FIG. 13 is a schematic diagram of the projection system installed on a vehicle according to an embodiment of the present application. The windshield of the vehicle can serve as the optical element in the projection system. The image generating component 101 of the projection system is located on the same side of the windshield. The image generating component 101 is used to output two paths of imaging light, and the two paths of imaging light carry different image information. The windshield is used to reflect or transmit the two paths of imaging light to form a virtual image. The virtual image is located on one side of the windshield, and the driver or passenger is located on the other side of the windshield. The two paths of imaging light after reflection or transmission are irradiated respectively to the two eyes of the driver or passenger. For example, the first path of imaging light is irradiated to the left eye of the passenger, and the second path of imaging light is irradiated to the right eye of the passenger.
Exemplarily, the vehicle may be a car, a truck, a motorcycle, a bus, a boat, an airplane, a helicopter, a lawn mower, a recreational vehicle, an amusement park vehicle, construction equipment, a tram, a golf cart, a train, a handcart, or the like, which is not particularly limited in the embodiments of the present application. The projection system may be installed on the instrument panel (IP) of the vehicle, at the front passenger position or the driver position, or may be installed on the back of a seat. When the above projection system is applied to a vehicle, it may be called a HUD and may be used to display navigation information, vehicle speed, battery level/fuel level, and the like.
FIG. 14 is a schematic diagram of a possible functional framework of a vehicle provided in an embodiment of the present application.
As shown in FIG. 14, the functional framework of the vehicle may include various subsystems, for example the control system 14, the sensor system 12, one or more peripheral devices 16 (one is shown as an example in the figure), the power supply 18, the computer system 20 and the display system 32 in the figure. Optionally, the vehicle may further include other functional systems, for example an engine system that provides power for the vehicle, which is not limited in this application.
The sensor system 12 may include several detection apparatuses. These detection apparatuses can sense the measured information and convert the sensed information into an electrical signal or other required form of information output according to a certain rule. As shown in the figure, these detection apparatuses may include a global positioning system (GPS), a vehicle speed sensor, an inertial measurement unit (IMU), a radar unit, a laser rangefinder, a camera apparatus, a wheel speed sensor, a steering sensor, a gear position sensor, or other elements for automatic detection, which is not limited in this application.
The control system 14 may include several elements, for example the steering unit, the braking unit, the lighting system, the automatic driving system, the map navigation system, the network time synchronization system and the obstacle avoidance system shown in the figure. Optionally, the control system 14 may further include elements such as a throttle processor and an engine processor for controlling the driving speed of the vehicle, which is not limited in this application.
The peripheral device 16 may include several elements, for example the communication system, the touch screen, the user interface, the microphone and the speaker shown in the figure. The communication system is used to implement network communication between the vehicle and devices other than the vehicle. In practical applications, the communication system may use wireless communication technology or wired communication technology to implement network communication between the vehicle and other devices. The wired communication technology may refer to communication between the vehicle and other devices through a network cable or an optical fiber.
The power supply 18 represents a system that provides power or energy for the vehicle, and may include but is not limited to a rechargeable lithium battery, a lead-acid battery, or the like. In practical applications, one or more battery components in the power supply are used to provide the electrical energy or power for starting the vehicle, and the type and material of the power supply are not limited in this application.
Several functions of the vehicle are controlled and implemented by the computer system 20. The computer system 20 may include one or more processors 2001 (one processor is shown as an example in the figure) and a memory 2002 (which may also be called a storage apparatus). In practical applications, the memory 2002 may be inside the computer system 20, or may be outside the computer system 20, for example serving as a cache in the vehicle, which is not limited in this application.
For the description of the processor 2001, reference may be made to the description of the processor 1001 in the foregoing FIG. 12. The processor 2001 may include one or more general-purpose processors, for example a graphics processing unit (GPU). The processor 2001 may be used to run a related program stored in the memory 2002, or instructions corresponding to the program, so as to implement corresponding functions of the vehicle.
The memory 2002 may include a volatile memory, for example a RAM; the memory may also include a non-volatile memory, for example a ROM, a flash memory or a solid state drive (SSD); the memory 2002 may also include a combination of the above types of memories. The memory 2002 may be used to store a set of program code, or instructions corresponding to the program code, so that the processor 2001 can call the program code or instructions stored in the memory 2002 to implement corresponding functions of the vehicle. These functions include but are not limited to some or all of the functions in the schematic diagram of the vehicle functional framework shown in FIG. 14. In this application, a set of program code for vehicle control may be stored in the memory 2002, and the processor 2001 calls the program code to control safe driving of the vehicle. How safe driving of the vehicle is achieved is described in detail below in this application.
Optionally, in addition to storing program code or instructions, the memory 2002 may also store information such as a road map, a driving route and sensor data. The computer system 20 may implement related functions of the vehicle in combination with other elements in the schematic diagram of the vehicle functional framework, for example the sensors in the sensor system and the GPS. For example, the computer system 20 may control the driving direction or driving speed of the vehicle based on data input from the sensor system 12, which is not limited in this application.
The display system 32 may include several elements, for example a processor, an optical element, and the stereoscopic projection system 100 described above. The processor is used to generate an image according to a user instruction (for example, generate an image containing vehicle status such as vehicle speed and battery level/fuel level, and an image of augmented reality (AR) content), and send the image content to the stereoscopic projection system 100. The stereoscopic projection system 100 is used to output two paths of imaging light carrying different image information. The windshield serves as the optical element and is used to reflect or transmit the two paths of imaging light, so that a virtual image corresponding to the image content is presented in front of the driver or passenger. It should be noted that the functions of some elements in the display system 32 may also be implemented by other subsystems of the vehicle; for example, the processor may also be an element in the control system 14.
FIG. 14 of this application includes four subsystems; the sensor system 12, the control system 14, the computer system 20 and the display system 32 are only examples and do not constitute a limitation. In practical applications, the vehicle may combine several elements in the vehicle according to different functions, so as to obtain subsystems with correspondingly different functions. In practical applications, the vehicle may include more or fewer systems or elements, which is not limited in this application.
FIG. 15 is a schematic flowchart of a projection method provided in an embodiment of the present application. The projection method may be applied to a projection system or to a vehicle on which a projection system is installed. The projection method applied to a projection system is taken as an example for description here. As shown in FIG. 15, the projection method includes the following steps.
In step 1501, the projection system outputs two paths of imaging light. The two paths of imaging light include a first path of imaging light and a second path of imaging light. The first path of imaging light carries first image information, and the second path of imaging light carries second image information.
For the description of the projection system, reference may be made to the description in any one of the foregoing FIG. 1 to FIG. 12. For example, the projection system includes the image generating component 101 and the optical element 102, and the projection system outputs the two paths of imaging light through the image generating component 101. For another example, the image generating component 101 includes the backlight assembly 901 and the spatial light modulator 902; the backlight assembly 901 is used to output two light beams to the spatial light modulator 902 at different angles in a time-sharing manner, and the spatial light modulator 902 is used to modulate the two light beams in a time-sharing manner according to different image information to obtain the two paths of imaging light.
In step 1502, the projection system reflects or transmits the two paths of imaging light. The two paths of imaging light after reflection or transmission generate a virtual image on the virtual image plane. The two paths of imaging light are irradiated respectively to two positions on the receiving surface, and the distance between the two positions is m. The distance between the virtual image plane and the receiving surface is d1. The distance between the receiving surface and the first visual plane is d2. d2 is determined according to the first parallax information between the first image information and the second image information, m and d1. d1 and d2 satisfy a first condition.
The two paths of imaging light after reflection or transmission are irradiated respectively to different positions on the receiving surface 105, for example the left eye (upper eye position) and the right eye (lower eye position) of the user. The two paths of imaging light after reflection or transmission generate a virtual image on the virtual image plane 103. The virtual image plane 103 is located on one side of the optical element 102, and the receiving surface 105 is located on the other side of the optical element 102. First parallax information exists between the first image information and the second image information. The first parallax information causes a first parallax angle between the two pieces of image information viewed by the user. In this case, the image information observed by the user is located on the visual plane 104, and the distance between the visual plane 104 and the receiving surface 105 is d2. d1 and d2 satisfy the first condition, and the first condition may be any one or more of the following conditions.
1. d2 is smaller than d1. When the optical element is a windshield, the projection system may form a ghost image. The ghost image affects the clarity of the HUD display and driving safety. When d2 is smaller than d1, the influence of the ghost image can be reduced, thereby improving the clarity of the HUD display and driving safety.
2. The VAC is less than 0.25 diopters. The magnitude of the VAC is negatively correlated with the user experience; by controlling the VAC, the user experience can be improved. A computational sketch of how d2 and the VAC relate is given after these conditions.
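The following is a minimal, hypothetical Python sketch of how d2 and the VAC could be computed from the quantities named above. It assumes a simple similar-triangles model in which the on-screen disparity p (the separation of corresponding points on the virtual image plane), the eye separation m and d1 determine d2, and it assumes the VAC is measured as the dioptric difference between the accommodation distance d1 and the vergence distance d2; neither assumption is spelled out in this document.

```python
def visual_plane_distance(m: float, d1: float, p: float) -> float:
    """Distance d2 of the fused visual plane from the receiving surface.

    m  : separation of the two positions on the receiving surface (metres)
    d1 : distance from the receiving surface to the virtual image plane (metres)
    p  : disparity between the first and second image information on the
         virtual image plane (metres); assumed sign convention: p > 0 pulls
         the visual plane closer than d1, p < 0 pushes it farther away.
    """
    return m * d1 / (m + p)   # similar-triangles model (assumption)

def vac_diopters(d1: float, d2: float) -> float:
    """Vergence-accommodation conflict as a dioptric difference (assumption)."""
    return abs(1.0 / d1 - 1.0 / d2)

# Illustrative numbers (not from the patent): m = 65 mm, d1 = 10 m, p = 5 mm.
d2 = visual_plane_distance(m=0.065, d1=10.0, p=0.005)
print(round(d2, 2), round(vac_diopters(10.0, d2), 3))  # ~9.29 m, ~0.008 D < 0.25 D
```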
It should be understood that the content of the projection method is similar to the content of the foregoing projection system. Therefore, for the description of the projection method, reference may be made to the related description of the foregoing projection system. For example, in step 1501, the projection system outputs two light beams in a time-sharing manner and modulates the two light beams in a time-sharing manner to obtain the two paths of imaging light. For another example, in step 1501, the projection system outputs the two paths of imaging light at a first moment, where the first path of imaging light carries the first image information, the second path of imaging light carries the second image information, and the first image information and the second image information carry the first parallax information. The projection method further includes the following step: the projection system outputs the two paths of imaging light at a second moment, where the first path of imaging light carries third image information, the second path of imaging light carries fourth image information, and the third image information and the fourth image information carry second parallax information.
It should be understood that, in the description of this specification, specific features, structures, materials or characteristics may be combined in a suitable manner in any one or more embodiments or examples. The foregoing descriptions are merely specific implementations of the present application, but the protection scope of the present application is not limited thereto. Any person skilled in the art can readily conceive of changes or replacements within the technical scope disclosed in the present application, and such changes or replacements shall fall within the protection scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (18)

  1. A projection system, characterized in that it comprises an image generating component and an optical element, wherein:
    the image generating component is configured to output two paths of imaging light, the two paths of imaging light comprising a first path of imaging light and a second path of imaging light, the first path of imaging light carrying first image information, and the second path of imaging light carrying second image information;
    the optical element is configured to reflect or transmit the two paths of imaging light, the two paths of imaging light after reflection or transmission generate a virtual image on a virtual image plane, and the two paths of imaging light after reflection or transmission are irradiated to two positions on a receiving surface, the distance between the two positions being m;
    the distance between the virtual image plane and the receiving surface is d1, the distance between the receiving surface and a first visual plane is d2, and d2 is determined according to first parallax information between the first image information and the second image information, m and d1, wherein the vergence-accommodation conflict VAC satisfies the following relationship:
    VAC = |1/d1 - 1/d2|,
    wherein the VAC is less than 0.25 diopters.
  2. The projection system according to claim 1, characterized in that the optical element is a windshield, and the windshield comprises a first glass layer, a second glass layer, and an intermediate layer bonding the first glass layer and the second glass layer.
  3. The projection system according to claim 2, characterized in that the two paths of imaging light are linearly polarized light, and the intermediate layer is used to absorb the linearly polarized light.
  4. The projection system according to claim 2, characterized in that the intermediate layer has a wedge-shaped structure.
  5. The projection system according to claim 3 or 4, characterized in that the value of d1 ranges from 2.5 m to 7 m.
  6. The projection system according to claim 2, characterized in that the thickness of the intermediate layer is the same at different positions.
  7. The projection system according to claim 6, characterized in that d2 is smaller than d1.
  8. The projection system according to claim 7, characterized in that the value of d1 ranges from 10 m to 15 m.
  9. The projection system according to any one of claims 1 to 8, characterized in that the first path of imaging light further carries third image information, the second path of imaging light further carries fourth image information, the distance between the receiving surface and a second visual plane is d3, and d3 is determined according to second parallax information between the third image information and the fourth image information, m and d1.
  10. The projection system according to claim 9, characterized in that d3 is equal to d1.
  11. The projection system according to any one of claims 1 to 10, characterized in that the focal length of the optical element is f, the distance between the image generating component and the optical element is d, and d is smaller than f.
  12. The projection system according to claim 11, characterized in that the distance between the virtual image plane and the optical element is d0, and d0 satisfies the following formula:
    1/d - 1/d0 = 1/f, that is, d0 = d·f / (f - d).
  13. The projection system according to any one of claims 1 to 12, characterized in that the image generating component comprises a backlight assembly and a spatial light modulator;
    the backlight assembly is configured to output two light beams to the spatial light modulator at different angles in a time-sharing manner;
    the spatial light modulator is configured to modulate the two light beams in a time-sharing manner according to different image information to obtain the two paths of imaging light, and to output the two paths of imaging light at different angles.
  14. The projection system according to claim 13, characterized in that the image generating component further comprises a diffuser screen;
    the diffuser screen is configured to receive the two paths of imaging light from the spatial light modulator, diffuse the two paths of imaging light, and output the diffused two paths of imaging light at different angles.
  15. The projection system according to claim 13 or 14, characterized in that the two light beams comprise a first light beam and a second light beam;
    that the backlight assembly is configured to output the two light beams to the spatial light modulator at different angles in a time-sharing manner comprises: the backlight assembly is configured to output the first light beam at a first position and output the second light beam at a second position.
  16. The projection system according to any one of claims 1 to 15, characterized in that the projection system further comprises an eye tracking module and a processor;
    the eye tracking module is configured to obtain the position of the receiving surface;
    the processor is configured to adjust the first parallax information of the first image information and the second image information according to the position of the receiving surface.
  17. A projection method, characterized by comprising:
    outputting two paths of imaging light, the two paths of imaging light comprising a first path of imaging light and a second path of imaging light, the first path of imaging light carrying first image information, and the second path of imaging light carrying second image information;
    reflecting or transmitting the two paths of imaging light, wherein the two paths of imaging light after reflection or transmission generate a virtual image on a virtual image plane, the two paths of imaging light after reflection or transmission are irradiated to two positions on a receiving surface, the distance between the two positions is m, the distance between the virtual image plane and the receiving surface is d1, the distance between the receiving surface and a first visual plane is d2, and d2 is determined according to first parallax information between the first image information and the second image information, m and d1, wherein the vergence-accommodation conflict VAC satisfies the following relationship:
    VAC = |1/d1 - 1/d2|,
    wherein the VAC is less than 0.25 diopters.
  18. A vehicle, characterized by comprising the projection system according to any one of claims 1 to 16, wherein the projection system is installed on the vehicle.
PCT/CN2023/107815 2022-11-11 2023-07-18 Projection system, projection method, and transportation means WO2024098828A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202211414939.3 2022-11-11
CN202211414939.3A CN118033971A (en) 2022-11-11 2022-11-11 Projection system, projection method and vehicle

Publications (1)

Publication Number Publication Date
WO2024098828A1 true WO2024098828A1 (en) 2024-05-16

Family

ID=90988068

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2023/107815 WO2024098828A1 (en) 2022-11-11 2023-07-18 Projection system, projection method, and transportation means

Country Status (2)

Country Link
CN (1) CN118033971A (en)
WO (1) WO2024098828A1 (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20180024355A1 (en) * 2016-07-19 2018-01-25 The Board Of Trustees Of The University Of Illinoi Method and system for near-eye three dimensional display
KR20200022890A (en) * 2018-08-24 2020-03-04 (주) 태진금속 Three dimensional image display apparatus without vergence-accommodation conflict
CN112130325A (en) * 2020-09-25 2020-12-25 东风汽车有限公司 Parallax correction system and method for vehicle-mounted head-up display, storage medium and electronic device
CN114137725A (en) * 2020-09-04 2022-03-04 未来(北京)黑科技有限公司 Head-up display system capable of displaying three-dimensional image
CN114185171A (en) * 2020-09-14 2022-03-15 未来(北京)黑科技有限公司 Head-up display device with variable imaging distance and head-up display system
CN114787690A (en) * 2019-12-10 2022-07-22 奇跃公司 Increased depth of field for mixed reality displays


Also Published As

Publication number Publication date
CN118033971A (en) 2024-05-14

Similar Documents

Publication Publication Date Title
CN112639581B (en) Head-up display and head-up display method
WO2024021852A1 (en) Stereoscopic display apparatus, stereoscopic display system, and vehicle
WO2024021574A1 (en) 3d projection system, projection system, and vehicle
WO2024017038A1 (en) Image generation apparatus, display device and vehicle
WO2024098828A1 (en) Projection system, projection method, and transportation means
CN115639673B (en) Display device and display method
CN217360538U (en) Projection system, display device and vehicle
WO2023216670A1 (en) Three-dimensional display apparatus and vehicle
WO2023185293A1 (en) Image generation apparatus, display device, and vehicle
WO2023130759A1 (en) Display device and vehicle
CN115542644B (en) Projection device, display equipment and vehicle
WO2024021563A1 (en) Display device and vehicle
WO2023040669A1 (en) Head-up display device and vehicle
WO2023138138A1 (en) Display device and vehicle
WO2024065332A1 (en) Display module, optical display system, terminal device and image display method
WO2023103492A1 (en) Image generation apparatus, display device and vehicle
WO2023098228A1 (en) Display apparatus, electronic device and vehicle
CN220983636U (en) Display device, vehicle and vehicle-mounted system
WO2023138076A1 (en) Display apparatus and vehicle
WO2023087739A1 (en) Projection apparatus, display device, and vehicle
WO2023071548A1 (en) Optical display apparatus, display system, vehicle, and color adjustment method
WO2024041034A1 (en) Display module, optical display system, terminal device, and imaging method
CN117991569A (en) Projection device, display apparatus and vehicle
CN115826332A (en) Image generation device, related equipment and image projection method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23887524

Country of ref document: EP

Kind code of ref document: A1