WO2023216670A1 - Three-dimensional display apparatus and vehicle - Google Patents

Three-dimensional display apparatus and vehicle

Info

Publication number
WO2023216670A1
WO2023216670A1 · PCT/CN2023/076650 · CN2023076650W
Authority
WO
WIPO (PCT)
Prior art keywords
imaging
light
display device
image
image information
Prior art date
Application number
PCT/CN2023/076650
Other languages
English (en)
Chinese (zh)
Inventor
邓宁
贺俊妮
邹冰
常泽山
黄志勇
常天海
Original Assignee
华为技术有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 华为技术有限公司 filed Critical 华为技术有限公司
Publication of WO2023216670A1

Classifications

    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B30/00Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images
    • G02B30/20Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer's left and right eyes
    • G02B30/26Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer's left and right eyes of the autostereoscopic type
    • G02B30/33Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer's left and right eyes of the autostereoscopic type involving directional light or back-light sources
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B30/00Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images
    • G02B30/20Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer's left and right eyes
    • G02B30/34Stereoscopes providing a stereoscopic pair of separated images corresponding to parallactically displaced views of the same object, e.g. 3D slide viewers
    • G02B30/35Stereoscopes providing a stereoscopic pair of separated images corresponding to parallactically displaced views of the same object, e.g. 3D slide viewers using reflective optical elements in the optical path between the images and the observer

Definitions

  • the present application relates to the field of display, and in particular, to a three-dimensional display device and a vehicle including the three-dimensional display device.
  • Stereoscopic display requires providing image information with different parallaxes to both eyes. Compared with 2D display, stereoscopic display can give people a better experience.
  • Naked-eye (autostereoscopic) display technology is a stereoscopic display solution in which users do not need to wear polarized glasses or shutter glasses.
  • the stereoscopic display device outputs two channels of imaging light to the user's left and right eyes respectively.
  • the stereoscopic display device can output the two channels of imaging light in a time-sharing manner.
  • in a first time period, one channel of imaging light output by the stereoscopic display device irradiates one eye of the user.
  • in a second time period, the other channel of imaging light output by the stereoscopic display device irradiates the other eye of the user.
  • the first time period and the second time period alternate.
  • the two channels of imaging light carry image information with different parallaxes, thereby providing users with three-dimensional visual enjoyment.
  • stereoscopically displayed images require a large format. Therefore, when the distance between the user and the stereoscopic display device is short, the user experience is degraded.
  • the present application provides a stereoscopic display device and a vehicle, which can enlarge the stereoscopically displayed image through a curved mirror or lens. Therefore, when the distance between the user and the stereoscopic display device is relatively short, the user experience can be improved.
  • a first aspect of this application provides a three-dimensional display device.
  • the stereoscopic display device includes an image generating component and a curved mirror.
  • the image generation component is used to generate two channels of imaging light.
  • the two imaging lights carry image information with different parallaxes.
  • the curved mirror is used to reflect the two channels of imaging light. There is an included angle between the two reflected channels of imaging light.
  • the focal length of the curved mirror is f.
  • the distance between the image surface of the image generation component and the curved mirror is d. d is less than f.
  • because d is smaller than f, the curved mirror can magnify the stereoscopically displayed image. Therefore, when the distance between the user and the stereoscopic display device is relatively short, the user experience can be improved. Moreover, a curved mirror can occupy less volume than a lens, thereby reducing the volume of the stereoscopic display device.
  • the distance between the virtual image formed by the two reflected imaging lights and the curved mirror is D.
  • D satisfies the following formula:
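The formula body for D is not reproduced here. As a hedged sketch, assuming the standard concave-mirror imaging equation (which may differ from the patent's exact formula), an image plane placed at d < f produces a magnified virtual image at D = f·d/(f − d):

```python
def virtual_image_distance(f: float, d: float) -> float:
    """Distance D of the virtual image behind the curved mirror when the
    image plane of the image generation component sits at d < f.
    Derived from the mirror equation 1/d - 1/D = 1/f, i.e. D = f*d/(f - d).
    Assumption: the patent's exact formula may differ from this sketch."""
    if not 0 < d < f:
        raise ValueError("a magnified virtual image requires 0 < d < f")
    return f * d / (f - d)

def lateral_magnification(f: float, d: float) -> float:
    """Magnification of the virtual image: D/d = f/(f - d)."""
    return f / (f - d)
```

For example, with f = 400 mm (the bound stated for the first aspect) and d = 300 mm, the virtual image forms 1200 mm behind the mirror at 4× magnification.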
  • there is an included angle θ1 between the two channels of imaging light after being reflected by the curved mirror.
  • S is the distance between the receiving position of the two imaging lights and the curved mirror.
  • the value of E ranges from 53 mm to 73 mm.
  • w is the width at the receiving position of at least one of the two imaging lights after being reflected by the curved mirror.
  • there is an included angle θ2 between the two channels of imaging light before being reflected by the curved mirror.
  • S is the distance between the receiving position of the two imaging lights and the curved mirror.
  • E ranges from 53 mm to 73 mm.
  • w is the width at the receiving position of at least one of the two imaging lights after being reflected by the curved mirror.
  • the divergence angle of each of the two imaging lights before being reflected by the curved mirror is α.
  • w satisfies the following formula: w is less than 73 mm.
  • the image generating component includes a first light source component and a pixel component.
  • the first light source component is used to output the first light beam and the second light beam in different emission directions to the pixel component in a time-sharing manner.
  • the pixel component is used to separately modulate the first light beam and the second light beam using different image information to generate two paths of imaging light.
  • the first light source component includes a first light source device and a second light source device.
  • the first light source device and the second light source device are used to alternately output the first light beam and the second light beam in time division.
  • the image generation component further includes a timing control unit.
  • the timing control unit is used to control the first light source device and the second light source device to alternately output the first light beam and the second light beam in time division.
  • the timing control unit is also used to control the pixel component to use different image information to modulate the first light beam and the second light beam in a time-sharing manner.
  • the two channels of imaging light include a first channel of imaging light and a second channel of imaging light.
  • the image generating component includes a second light source component, a pixel component and a lens array.
  • the second light source component is used to output the third light beam to the pixel component.
  • the pixel component is used to modulate the third light beam according to different image information to generate the first imaging light and the second imaging light.
  • the lens array is used to transmit the first imaging light and the second imaging light at different angles.
  • the pixel component includes a first pixel and a second pixel.
  • the first pixel is used to modulate the third light beam according to the first image information to generate a first path of imaging light.
  • the second pixel is used to modulate the third light beam according to the second image information to generate a second path of imaging light. Due to process errors in the curved mirror, the zoom factor and imaging position of the image observed by the user deviate from their ideal values. Such display deviations can cause dizziness and other physiological discomfort, reducing the user experience. To compensate for them, the image information can be preprocessed.
  • the third beam includes a first sub-beam and a second sub-beam.
  • the first pixel is used to modulate the third light beam according to the first image information and generate the first imaging light.
  • the first pixel is used to modulate the first sub-beam according to the first image information to generate the first imaging light.
  • the second pixel is used to modulate the third light beam according to the second image information and generate the second imaging light.
  • the second pixel is used to modulate the second sub-beam according to the second image information to generate the second imaging light.
  • the second light source component is used to simultaneously generate the first sub-beam and the second sub-beam.
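The spatial-multiplexing scheme above, in which first pixels and second pixels simultaneously modulate the two sub-beams with different image information, can be sketched as a hypothetical column interleave (the alternating-column layout is an assumption for illustration; the actual pixel arrangement depends on the lens array):

```python
def interleave_views(left, right):
    """Assign even pixel columns (first pixels) the left-eye image and
    odd columns (second pixels) the right-eye image; the lens array then
    steers the two pixel groups toward different viewpoints."""
    assert len(left) == len(right), "parallax images must match in size"
    frame = []
    for lrow, rrow in zip(left, right):
        assert len(lrow) == len(rrow)
        frame.append([lrow[c] if c % 2 == 0 else rrow[c]
                      for c in range(len(lrow))])
    return frame
```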
  • the image information of different disparities includes first image information and second image information.
  • the stereoscopic display device also includes a processor. Wherein, the processor can be set inside the image generation component, or can be set outside the image generation component.
  • the processor is configured to preprocess the third image information to obtain the first image information.
  • the processor is used to preprocess the fourth image information to obtain the second image information.
  • the processor is further configured to obtain first coordinate information of the first position and/or second coordinate information of the second position.
  • one of the two channels of imaging light irradiates the first position.
  • the other channel of imaging light irradiates the second position.
  • the processor is configured to preprocess the third image information including: the processor is configured to preprocess the third image information according to the first coordinate information.
  • the processor is configured to preprocess the fourth image information including: the processor is configured to preprocess the fourth image information according to the second coordinate information.
  • when both eyes of the user receive the same image information, the processor may preprocess only one piece of image information.
  • the processor may obtain coordinate information of the middle position of the user's eyes.
  • the processor preprocesses the image information according to the coordinate information of the intermediate position.
  • the processor can also preprocess different image information respectively according to the coordinate information of the intermediate position.
  • the coordinate information of the intermediate position corresponds to two correction parameters.
  • the processor can preprocess different image information respectively according to two correction parameters.
  • by using coordinate information of different locations for preprocessing the accuracy of preprocessing can be improved, thereby improving user experience.
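A minimal sketch of the viewpoint-dependent preprocessing described above, assuming a hypothetical calibration lookup `correction_for` that maps an eye (or mid-eye) coordinate to a correction parameter, modeled here as a simple scale factor:

```python
def preprocess(image, coord, correction_for):
    """Pre-compensate one piece of image information for the display
    deviation expected at viewpoint 'coord'. 'correction_for' is a
    hypothetical calibration lookup returning a scale factor."""
    k = correction_for(coord)
    return [[px * k for px in row] for row in image]

def preprocess_both(left, right, mid_coord, corrections_for):
    """Per the text, a single mid-eye coordinate can map to two
    correction parameters, one applied to each eye's image."""
    k_left, k_right = corrections_for(mid_coord)
    return (preprocess(left, mid_coord, lambda _: k_left),
            preprocess(right, mid_coord, lambda _: k_right))
```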
  • f is less than 400 mm.
  • the present application can reduce the volume of the stereoscopic display device.
  • the image generating component includes a projector and a diffusion screen.
  • the projector is used to generate two channels of imaging light.
  • the diffusion screen is used to receive two channels of imaging light, diffuse the two channels of imaging light, and output the two channels of diffused imaging light.
  • the curved mirror is used to reflect the diffused two-way imaging light.
  • a second aspect of the present application provides a three-dimensional display device.
  • the stereoscopic display device includes an image generating component and a lens.
  • the image generation component is used to generate two channels of imaging light.
  • the two imaging lights carry image information with different parallaxes.
  • the lens is used to transmit the two channels of imaging light. There is an included angle between the two transmitted channels of imaging light.
  • the focal length of the lens is f.
  • the distance between the image plane of the image generating component and the lens is d. d is less than f.
  • because d is smaller than f, the lens can magnify the stereoscopically displayed image. Therefore, when the distance between the user and the stereoscopic display device is relatively short, the user experience can be improved.
  • the distance between the virtual image formed by the two transmitted imaging lights and the lens is D.
  • D satisfies the following formula:
  • there is an included angle θ1 between the two channels of imaging light after being transmitted through the lens.
  • S is the distance between the receiving position of the two imaging lights and the lens.
  • E ranges from 53 mm to 73 mm.
  • w is the width, at the receiving position, of at least one of the two channels of imaging light after being transmitted through the lens.
  • there is an included angle θ2 between the two channels of imaging light before being transmitted through the lens.
  • S is the distance between the receiving position of the two imaging lights and the lens.
  • E ranges from 53 mm to 73 mm.
  • w is the width at the receiving position of at least one of the two channels of imaging light after being transmitted through the lens.
  • the divergence angle of each of the two imaging lights before being transmitted through the lens is α.
  • w satisfies the following formula: w is less than 73 mm.
  • the image generating component includes a first light source component and a pixel component.
  • the first light source component is used to output the first light beam and the second light beam in different emission directions to the pixel component in a time-sharing manner.
  • the pixel component is used to modulate the first light beam and the second light beam respectively according to different image information to generate two paths of imaging light.
  • the first light source component includes a first light source device and a second light source device.
  • the first light source device and the second light source device are used to alternately output the first light beam and the second light beam in time division.
  • the image generation component further includes a timing control unit.
  • the timing control unit is used to control the first light source device and the second light source device to alternately output the first light beam and the second light beam in time division.
  • the timing control unit is also used to control the pixel component to use different image information to modulate the first light beam and the second light beam in a time-sharing manner.
  • the pixel component includes a first pixel and a second pixel.
  • the first pixel is used to modulate the first light beam to obtain the first imaging light of the two imaging lights.
  • the second pixel is used to modulate the second light beam to obtain the second imaging light of the two imaging lights.
  • the two channels of imaging light include a first channel of imaging light and a second channel of imaging light.
  • the image generating component includes a second light source component, a pixel component and a lens array.
  • the second light source component is used to output the third light beam to the pixel component.
  • the pixel component is used to modulate the third light beam using different image information to generate the first imaging light and the second imaging light.
  • the lens array is used to transmit the first imaging light and the second imaging light at different angles.
  • the pixel component includes a first pixel and a second pixel.
  • the first pixel is used to modulate the third light beam according to the first image information to generate a first path of imaging light.
  • the second pixel is used to modulate the third light beam according to the second image information to generate a second path of imaging light.
  • the third beam includes a first sub-beam and a second sub-beam.
  • the first pixel is used to modulate the first sub-beam according to the first image information to generate a first path of imaging light.
  • the second pixel is used to modulate the second sub-beam according to the second image information to generate a second path of imaging light.
  • the second light source component is used to simultaneously generate the first sub-beam and the second sub-beam.
  • the image information of different disparities includes first image information and second image information.
  • the stereoscopic display device also includes a processor.
  • the processor is used to preprocess the third image information to obtain the first image information.
  • the processor is used to preprocess the fourth image information to obtain the second image information.
  • the processor is further configured to obtain first coordinate information of the first position and/or second coordinate information of the second position.
  • one of the two channels of imaging light irradiates the first position.
  • the other channel of imaging light irradiates the second position.
  • the processor is configured to preprocess the third image information including: the processor is configured to preprocess the third image information according to the first coordinate information.
  • the processor is configured to preprocess the fourth image information including: the processor is configured to preprocess the fourth image information according to the second coordinate information.
  • f is less than 300 mm.
  • the image generating component includes a projector and a diffusion screen.
  • the projector is used to generate two channels of imaging light.
  • the diffusion screen is used to receive two channels of imaging light, diffuse the two channels of imaging light, and output the two channels of diffused imaging light.
  • the lens is used to transmit the diffused two-way imaging light.
  • the third aspect of this application provides a vehicle.
  • the vehicle includes a three-dimensional display device as described in the aforementioned first aspect, any optional manner of the first aspect, the second aspect, or any optional manner of the second aspect.
  • the stereoscopic display device is installed on the vehicle.
  • Figure 1 is a first structural schematic diagram of a three-dimensional display device provided by an embodiment of the present application.
  • Figure 2 is a schematic diagram of the first light path projection of the three-dimensional display device provided by the embodiment of the present application.
  • Figure 3 is a second structural schematic diagram of a three-dimensional display device provided by an embodiment of the present application.
  • Figure 4 is a schematic diagram of the second optical path projection of the three-dimensional display device provided by the embodiment of the present application.
  • Figure 5 is a first structural schematic diagram of the image generation component provided by the embodiment of the present application.
  • Figure 6 is a second structural schematic diagram of the image generation component provided by the embodiment of the present application.
  • Figure 7a is a third structural schematic diagram of the image generation component provided by the embodiment of the present application.
  • Figure 7b is a fourth structural schematic diagram of the image generation component provided by the embodiment of the present application.
  • Figure 8 is a schematic structural diagram of a pixel component and a lens array provided by an embodiment of the present application.
  • Figure 9a is a third structural schematic diagram of a three-dimensional display device provided by an embodiment of the present application.
  • Figure 9b is a fourth structural schematic diagram of a three-dimensional display device provided by an embodiment of the present application.
  • Figure 10a is a fifth structural schematic diagram of a three-dimensional display device provided by an embodiment of the present application.
  • Figure 10b is a sixth structural schematic diagram of a three-dimensional display device provided by an embodiment of the present application.
  • Figure 11 is a seventh structural schematic diagram of a three-dimensional display device provided by an embodiment of the present application.
  • Figure 12 is an eighth structural schematic diagram of a three-dimensional display device provided by an embodiment of the present application.
  • Figure 13 is a schematic diagram of the third optical path projection of the three-dimensional display device provided by the embodiment of the present application.
  • Figure 14 is a circuit schematic diagram of a three-dimensional display device provided by an embodiment of the present application.
  • Figure 15 is a schematic structural diagram of a vehicle provided by an embodiment of the present application.
  • Figure 16 is a schematic diagram of a possible functional framework of the vehicle provided by the embodiment of the present application.
  • the present application provides a stereoscopic display device and a vehicle, which can enlarge the stereoscopically displayed image through a curved mirror or lens. Therefore, when the distance between the user and the stereoscopic display device is relatively short, the user experience can be improved.
  • first, second, etc. used in this application are only used for the purpose of distinguishing descriptions, and cannot be understood as indicating or implying relative importance, nor can they be understood as indicating or implying order.
  • reference numbers and/or letters are repeated in multiple drawings of the embodiments of this application. Repetition does not imply a strictly limiting relationship between the various embodiments and/or configurations.
  • the three-dimensional display device in this application may also be called a 3D display device.
  • Stereoscopic display devices are used in the field of projection technology.
  • directional backlight devices can be used to provide users with three-dimensional visual enjoyment.
  • stereoscopically displayed images require a large format. Therefore, when the distance between the user and the stereoscopic display device is short, the user experience is degraded.
  • FIG. 1 is a first structural schematic diagram of a three-dimensional display device provided by an embodiment of the present application.
  • the stereoscopic display device 100 includes an image generating component 101 and a curved mirror 102 .
  • the image generating component 101 is used to generate two paths of imaging light.
  • each solid line connected to the image generating component 101 represents a path of imaging light.
  • the curved mirror 102 is used to reflect two paths of imaging light. There is an angle between the two reflected imaging lights. Therefore, the two reflected imaging lights can illuminate different locations. For example, one channel of imaging light irradiates the user's left eye, and another channel of imaging light irradiates the user's right eye.
  • the two channels of imaging light carry image (pattern) information with different parallaxes, thereby providing users with three-dimensional visual enjoyment.
  • the position of the human eye can be called the viewpoint.
  • the above-mentioned three-dimensional display device can provide multiple viewpoints for viewing by multiple people.
  • the image generating component 101 can produce multiple channels of imaging light for viewing by different people. This embodiment takes a viewpoint as an example, that is, the image generation component 101 generates two paths of imaging light, to illustrate the imaging process of the stereoscopic display device.
  • the focal length of the curved mirror 102 is f.
  • the distance between the image surface (display surface of the image) of the image generation component 101 and the curved mirror 102 is d.
  • d may be the furthest vertical distance between the curved mirror 102 and the image plane of the image generating component 101 .
  • d may be the straight-line distance between the center pixel of the image surface of the image generation component 101 and the target point on the curved mirror 102 .
  • the center pixel is one or more pixels at the center of the image plane.
  • the imaging light output by the central pixel irradiates the target point on the curved mirror 102 .
  • d is less than f.
  • the curved mirror 102 can amplify the virtual image. Therefore, when the distance between the user and the stereoscopic display device 100 is relatively close, the user can see the enlarged virtual image, thereby improving the user experience.
  • FIG. 2 is a first optical path projection schematic diagram of the three-dimensional display device provided by the embodiment of the present application.
  • the image generating component 101 is used to generate two paths of imaging light.
  • the divergence angle of each of the two imaging lights is α.
  • the dotted line in Figure 2 represents one of the two imaging lights.
  • the solid line in Figure 2 represents the other imaging light among the two imaging lights.
  • the curved mirror 102 is used to reflect two paths of imaging light.
  • D can be obtained from f and d.
  • the two reflected imaging lights can illuminate different locations.
  • the position where the two imaging lights are illuminated is also called the receiving position of the two imaging lights, such as the user's eyes.
  • the interpupillary distance (pupillary distance) of both eyes is E.
  • the value range of E can be between 53 mm and 73 mm.
  • for example, E is 53 mm or 73 mm.
  • the distance between the eyes and the curved mirror 102 is S.
  • the width of each of the two reflected imaging lights is w.
  • w is related to S, α and D. According to the following formula 2, w can be obtained from S, α, and D.
  • if the width w of each imaging light is too large, one channel of imaging light may cover both of the user's eyes, resulting in beam crosstalk.
  • w can be smaller than the maximum value of E.
  • the maximum value of E is 73 mm.
  • for example, the value of S ranges from 0 mm to 5000 mm.
  • the reflected two imaging lights can be illuminated at appropriate locations, such as the user's eyes.
  • the distance M between the two reflected imaging lights is related to θ1, D, and S. According to the following formula 3, M can be obtained from θ1, D, and S.
  • the distance M between the two reflected imaging lights is related to θ2, D, and S. According to the following formula 4, M can be obtained from θ2, D, and S.
  • the relationship between M, E, and w can be set. Specifically, when the value of S is between 0 mm and 5000 mm, E-w ≤ M ≤ E+2w. w can be w1 or w2. w1 is the width of the first imaging light. w2 is the width of the second imaging light. In practical applications, w can be both w1 and w2. In that case, when the value of S is between 0 mm and 5000 mm, E-w1 ≤ M ≤ E+2w1 and E-w2 ≤ M ≤ E+2w2.
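Formulas 2-4 are referenced but not reproduced above. The geometric constraints can still be sketched under a simple assumption: the reflected rays appear to diverge from the virtual image at distance D behind the mirror, with the eyes S in front of it. The beam-width expression below is that assumption, not necessarily formula 2; the E-w ≤ M ≤ E+2w condition is quoted from the text:

```python
from math import radians, tan

def beam_width(D, S, alpha_deg):
    """Approximate width of one imaging beam at the receiving plane,
    assuming its rays diverge with full angle alpha from the virtual
    image D behind the mirror, and the eyes sit S in front of it.
    (Geometric assumption standing in for the unreproduced formula 2.)"""
    return 2 * (D + S) * tan(radians(alpha_deg) / 2)

def crosstalk_free(E, w, M):
    """Condition from the text: E - w <= M <= E + 2*w keeps each beam
    on one eye only (E: interpupillary distance, M: separation of the
    two beams at the receiving position)."""
    return E - w <= M <= E + 2 * w
```

With E = 63 mm, a beam separation M near E and a beam width w below 73 mm satisfy the condition; a much larger w risks one beam covering both eyes.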
  • FIG. 3 is a second structural schematic diagram of a three-dimensional display device provided by an embodiment of the present application.
  • the stereoscopic display device 100 includes an image generating component 101 and a lens 301 .
  • the image generating component 101 is used to generate two paths of imaging light.
  • each solid line connected to the image generating component 101 represents a path of imaging light.
  • Lens 301 is used to transmit two paths of imaging light. There is an angle between the two transmitted imaging lights. Therefore, the two transmitted imaging lights can illuminate different locations.
  • one channel of imaging light irradiates the user's left eye, and another channel of imaging light irradiates the user's right eye.
  • the two channels of imaging light carry image information with different parallaxes, thereby providing users with three-dimensional visual enjoyment.
  • the focal length of the lens is f.
  • the distance between the image plane of the image generating component 101 and the lens 301 is d.
  • d is less than f.
  • the lens 301 can amplify the virtual image. Therefore, when the distance between the user and the stereoscopic display device 100 is relatively short, the user experience can be improved.
  • the parameters of the two imaging lights are the same.
  • the divergence angles of the two imaging lights are both α.
  • alternatively, the divergence angle of the first imaging light is α1.
  • the divergence angle of the second imaging light is α2.
  • two values of w can be obtained through α1, α2 and the aforementioned formula 1.
  • the two ws include the width w1 of the first imaging light and the width w2 of the second imaging light. In order to reduce or avoid beam crosstalk, both w1 and w2 can be smaller than the maximum value of E.
  • FIG. 4 is a second optical path projection schematic diagram of the three-dimensional display device provided by the embodiment of the present application.
  • the image generating component 101 is used to generate two paths of imaging light.
  • the divergence angle of each of the two imaging lights is α.
  • the dotted line in Figure 4 represents one of the two imaging lights.
  • the solid line in Figure 4 represents the other imaging light among the two imaging lights.
  • the lens 301 is used to transmit two channels of imaging light, and the propagation directions of the two channels of imaging light are deflected after transmission.
  • the distance between the virtual image formed by the transmitted two imaging lights and the lens 301 is D.
  • FIG. 5 is a first structural schematic diagram of an image generation component provided by an embodiment of the present application.
  • the image generation component 101 includes a first light source component 501 and a pixel component 502 .
  • the first light source component 501 can be a light emitting diode (LED) light source or a laser diode (LD) light source.
  • the first light source component 501 is used to output the first light beam and the second light beam in different emission directions to the pixel component 502 in a time-sharing manner.
  • the dotted line connected to the first light source assembly 501 represents the first light beam.
  • the solid line connected to the first light source assembly 501 represents the second light beam.
  • the pixel component 502 may be a liquid crystal display (LCD), liquid crystal on silicon (LCOS), digital micro-mirror device (DMD), etc. Pixel component 502 may be referred to as an image modulator.
  • the pixel component 502 is used to respectively modulate the first light beam and the second light beam using different image information to generate two paths of imaging light.
  • the two channels of imaging light include a first channel of imaging light and a second channel of imaging light.
  • the dotted line connected to the pixel component 502 represents the first path of imaging light.
  • the solid line connected to the pixel component 502 represents the second path of imaging light.
  • the first light source assembly 501 may include multiple light source devices.
  • FIG. 6 is a second structural schematic diagram of an image generation component provided by an embodiment of the present application. As shown in FIG. 6, the first light source assembly 501 includes a first light source device 505 and a second light source device 506. In addition to the structure shown in FIG. 5, the image generation component 101 further includes a timing control unit 504. The timing control unit 504 is used to control the first light source device 505 and the second light source device 506 to alternately output the first light beam and the second light beam in a time-sharing manner.
  • the timing control unit 504 is also used to control the pixel component 502 to alternately display (load) images with different parallaxes in a time-sharing manner. For example, in the first time period, the timing control unit 504 controls the pixel component 502 to display the left-eye image. In the second time period, the timing control unit 504 controls the first light source device 505 to output the first light beam, and the pixel component 502 modulates the first light beam with the left-eye image to obtain the first path of imaging light. In the third time period, the timing control unit 504 controls the pixel component 502 to display the right-eye image.
  • in the fourth time period, the timing control unit 504 controls the second light source device 506 to output the second light beam.
  • the pixel component 502 modulates the second light beam with the right-eye image to obtain the second path of imaging light.
  • the first time period, the second time period, the third time period and the fourth time period are alternately distributed.
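The four alternating time periods described above can be sketched as a simple control loop (a minimal Python sketch; the period and action names are illustrative, not from the source):

```python
# Minimal sketch of the four-period time-sharing control described above.
# P1: load the left-eye image into the pixel component.
# P2: light source device 505 outputs the first beam -> first imaging light.
# P3: load the right-eye image.
# P4: light source device 506 outputs the second beam -> second imaging light.

def timing_cycle(num_frames):
    """Yield (period, action) tuples for num_frames full display cycles."""
    schedule = [
        ("P1", "load_left_image"),
        ("P2", "light_source_505_on"),   # pixel component modulates beam 1
        ("P3", "load_right_image"),
        ("P4", "light_source_506_on"),   # pixel component modulates beam 2
    ]
    for _ in range(num_frames):
        for period, action in schedule:
            yield period, action

actions = [action for _, action in timing_cycle(2)]
```

Because the two light sources are only lit in the periods when the matching parallax image is loaded, each eye only ever receives light modulated by its own image.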
  • the image generating component 101 may further include a beam control unit 503 located between the first light source component 501 and the pixel component 502.
  • the beam control unit 503 may be a Fresnel screen, a cylindrical lens or a lens array, etc.
  • the beam control unit 503 is used to change the divergence angle of the first light beam and/or the second light beam, thereby improving the light utilization efficiency of the first light source assembly 501 and increasing the brightness of the generated imaging light, thus improving the brightness of the stereoscopic display device.
  • FIG. 7a is a third structural schematic diagram of an image generation component provided by an embodiment of the present application.
  • the image generation component 101 includes a second light source component 701, a pixel component 502 and a lens array 702.
  • the second light source component 701 may be an LED light source or an LD light source, or the like.
  • the second light source component 701 is used to output the third light beam to the pixel component 502 .
  • the solid line connected to the second light source assembly 701 represents the third light beam.
  • the pixel component 502 is used to modulate the third light beam according to different image information to generate a first path of imaging light and a second path of imaging light output from different directions.
  • the first imaging light and the second imaging light have certain directionality and divergence angle.
  • the dotted line connected to the pixel component 502 represents the first path of imaging light.
  • the solid line connected to the pixel component 502 represents the second path of imaging light.
  • the pixel component 502 may include left-eye pixels and right-eye pixels. The left-eye pixels are used to display left-eye images, and the right-eye pixels are used to display right-eye images. The left-eye pixels modulate the light to emit the first path of imaging light, and the right-eye pixels modulate the light to emit the second path of imaging light.
  • the imaging light emitted by the pixel component 502 is input to the lens array 702.
  • the lens array 702 is used to transmit the first imaging light and the second imaging light at different angles, so that the first imaging light and the second imaging light output by the lens array 702 have different output (propagation) directions and propagate to the user's left eye and right eye respectively.
  • the dotted line connected to the lens array 702 represents the first path of imaging light.
  • the solid line connected to the lens array 702 represents the second imaging light.
  • the distance between the image surface of the image generation component 101 and the curved mirror 102 is d.
  • the image surface of the image generating component 101 may be a pixel component or a diffusion screen.
  • the two light beams output by the second light source component 701 are light beams that do not carry image information
  • the image plane of the image generation component 101 is the pixel component 502.
  • FIG. 7b is a fourth structural schematic diagram of an image generation component provided by an embodiment of the present application. As shown in Figure 7b, the image generation component 101 includes a projector 703, a diffusion screen 704 and a lens array 702.
  • the projector 703 outputs a third light beam, and the third light beam carries image information.
  • the diffusion screen 704 is not a pixelated device.
  • the diffusion screen 704 is used to amplify the divergence angle of the third light beam output by the projector 703.
  • the third light beam can carry image information with different parallaxes in a time-sharing manner, and the diffusion screen 704 can output two channels of imaging light, and the two channels of imaging light carry image information with different parallaxes.
  • the lens array 702 is used to transmit two imaging lights at different angles.
  • the two channels of imaging light include a first channel of imaging light and a second channel of imaging light.
  • the dotted line connected to the lens array 702 represents the first path of imaging light.
  • the solid line connected to the lens array 702 represents the second imaging light.
  • the image surface of the image generation component 101 is a diffusion screen 704.
  • FIG. 8 is a schematic structural diagram of a pixel component and a lens array provided by an embodiment of the present application.
  • the pixel component 502 includes N pixel groups 801 .
  • N is an integer greater than 0.
  • Each pixel group 801 includes a first pixel and a second pixel.
  • the first pixel is used to modulate the third light beam and output a first sub-imaging light.
  • the second pixel is used to modulate the third light beam and output a second sub-imaging light.
  • the first sub-imaging light and the second sub-imaging light have certain directionality and divergence angle.
  • the dotted line connected to the first pixel represents the first sub-imaging light.
  • the solid line connected to the second pixel represents the second sub-imaging light.
  • Lens array 702 includes N lenses 802 .
  • Each lens 802 is used to transmit a first sub-imaging light and a second sub-imaging light.
  • Each lens 802 is used to output a first sub-imaging light and a second sub-imaging light in a certain direction.
  • the dotted line connected to the lens 802 represents the first sub-imaging light.
  • the solid line connected to lens 802 represents the second sub-imaging light.
  • N pixel groups 801 and N lenses 802 correspond one to one.
  • N lenses 802 are used to output N first sub-imaging lights and N second sub-imaging lights. The N first sub-imaging lights converge to form a first path of imaging light.
  • the N second sub-imaging lights converge to form a second path of imaging light.
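The one-to-one pairing of pixel groups and lenses described above can be modeled as an interleaved layout: each of the N pixel groups contributes one first (left-view) sample and one second (right-view) sample, and the N samples of each kind converge into one imaging path (a hypothetical Python sketch; the pair-per-group data layout is an assumption for illustration):

```python
def split_views(pixel_groups):
    """Each pixel group is a (first_pixel, second_pixel) pair.

    The N first pixels converge into the first path of imaging light
    (e.g. the left view); the N second pixels converge into the second
    path (the right view), mirroring the N-group / N-lens correspondence.
    """
    first_path = [group[0] for group in pixel_groups]
    second_path = [group[1] for group in pixel_groups]
    return first_path, second_path

# Three pixel groups, each holding one left-view and one right-view sample.
groups = [("L0", "R0"), ("L1", "R1"), ("L2", "R2")]
left_view, right_view = split_views(groups)
```

Each lens 802 would then steer the two samples of its own group toward the left and right eyes respectively.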
  • FIG. 9a is a third structural schematic diagram of a three-dimensional display device provided by an embodiment of the present application.
  • Figure 9b is a fourth structural schematic diagram of a three-dimensional display device provided by an embodiment of the present application.
  • the stereoscopic display device 100 includes an image generating component 101 and a curved mirror 102 .
  • for details of the stereoscopic display device 100, reference may be made to the relevant description of FIG. 1.
  • the first imaging light of the two imaging lights is reflected to the left eye through point A of the curved mirror 102 .
  • the second imaging light of the two imaging lights is reflected to the right eye through point B of the curved mirror 102 .
  • different process errors exist at different positions of the curved mirror 102 .
  • process errors cause the zoom factor and imaging position of the image observed by the user to deviate from the ideal position, resulting in display differences.
  • the two virtual images observed by the user's eyes are at different locations. Display differences can cause users to experience dizziness and other physiological discomfort, reducing user experience. Therefore, in the embodiment of the present application, image information of different disparities can be preprocessed. Compensate for display differences through preprocessing to enhance the display effect.
  • the stereoscopic display device may perform one or more of the following processes on the left eye image and/or right eye image loaded by the pixel component 502:
  • Figure 10a is a fifth structural schematic diagram of a three-dimensional display device provided by an embodiment of the present application.
  • Figure 10b is a sixth structural schematic diagram of a three-dimensional display device provided by an embodiment of the present application.
  • the stereoscopic display device 100 includes a processor 1001 and an image generation component 101.
  • the processor 1001 may be a central processing unit (CPU), a network processor (NP), or a combination of CPU and NP.
  • the processor may further include a hardware chip or other general-purpose processor.
  • the above-mentioned hardware chip can be an application specific integrated circuit (ASIC), a programmable logic device (PLD) or a combination thereof.
  • the processor 1001 is used to obtain third image information, and preprocess the third image information to obtain first image information.
  • the processor 1001 is used to obtain first coordinate information of a first location.
  • the first position may be the position of the user's left eye.
  • the processor 1001 is used to obtain the mapping table.
  • the mapping table contains the corresponding relationship between coordinate information and correction parameters.
  • the processor 1001 searches the mapping table for the first correction parameter corresponding to the first coordinate information.
  • the processor 1001 preprocesses the third image information according to the first correction parameter to obtain the first image information.
  • the first correction parameter may be a translation of 2 pixels to the left.
  • the processor 1001 can control the first path of imaging light to shift to the right to obtain the stereoscopic display device shown in Figure 10a.
  • before the offset, the image generating component 101 generates the first path of imaging light through pixel 1. After the offset, the image generating component 101 generates the first path of imaging light through pixel 2. The position of pixel 2 is the position of pixel 1 translated 2 pixels to the right.
  • the processor 1001 can control the second path of imaging light to shift to the left.
  • the processor 1001 can control the first imaging light to shift to the left to obtain the stereoscopic display device shown in Figure 10b.
  • the processor 1001 can control the second imaging light to shift to the right.
  • the processor 1001 may also be used to obtain fourth image information, and preprocess the fourth image information to obtain second image information.
  • the processor 1001 may be used to obtain second coordinate information of the second location.
  • the second position may be the position of the user's right eye.
  • the processor 1001 searches the mapping table for the second correction parameter corresponding to the second coordinate information, and preprocesses the fourth image information according to the second correction parameter to obtain the second image information.
  • the second correction parameter may be an overall reduction of 5%.
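The eye-position-dependent correction described above can be sketched as a table lookup followed by a geometric adjustment of the loaded image (a hedged Python sketch: the table keys, the correction encoding, and the nearest-neighbor resampling are all assumptions for illustration, not the patent's method):

```python
# Hypothetical mapping table: eye coordinate -> correction parameters.
MAPPING_TABLE = {
    (120, 80): {"translate_x": -2, "scale": 1.0},   # e.g. shift 2 pixels left
    (130, 80): {"translate_x": 0,  "scale": 0.95},  # e.g. overall 5% reduction
}

def preprocess(image, correction):
    """Apply horizontal translation and uniform scaling about the image
    center using nearest-neighbor sampling. `image` is a list of rows."""
    h, w = len(image), len(image[0])
    s = correction["scale"]
    tx = correction["translate_x"]
    cy, cx = (h - 1) / 2, (w - 1) / 2
    out = []
    for y in range(h):
        row = []
        for x in range(w):
            # Inverse-map each output pixel to its source pixel.
            sx = (x - cx - tx) / s + cx
            sy = (y - cy) / s + cy
            ix, iy = round(sx), round(sy)
            row.append(image[iy][ix] if 0 <= ix < w and 0 <= iy < h else 0)
        out.append(row)
    return out

def correct_for_eye(image, eye_coord):
    """Look up the correction for the detected eye position and apply it."""
    correction = MAPPING_TABLE[eye_coord]
    return preprocess(image, correction)
```

With `translate_x = -2`, the output image content shifts 2 pixels to the left, matching the example correction parameter in the text.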
  • Pixel component 502 in image generation component 101 may include display circuitry and a display panel.
  • the display circuit can also be called a display controller (DC), which has a display control function.
  • the display circuit is used to receive the first image information and the second image information output by the processor 1001.
  • the display circuit is also used to control the display panel to display the first image and the second image according to the first image information and the second image information.
  • the first image information corresponds to the first image.
  • the second image information corresponds to the second image.
  • the function of the above timing control unit 504 can be implemented by a display circuit.
  • FIG. 11 is a seventh structural schematic diagram of a three-dimensional display device provided by an embodiment of the present application.
  • the first light source component 501 outputs the first light beam and the second light beam in a time-sharing manner. Specifically, LED1 in the first light source assembly 501 generates the first light beam, and LED2 in the first light source assembly 501 generates the second light beam.
  • FIG. 12 is an eighth structural schematic diagram of a three-dimensional display device provided by an embodiment of the present application. As shown in FIG. 12, the first light source component 501 outputs the first light beam and the second light beam in a time-sharing manner.
  • LED3 in the first light source assembly 501 generates a first light beam.
  • LED2 in the first light source assembly 501 generates a second light beam.
  • the first beam and the second beam pass through the beam control unit 503 and then reach the pixel component 502 .
  • the pixel component 502 modulates the first light beam and the second light beam through different pixels in a time-sharing manner to obtain two paths of imaging light. Specifically, the pixel component 502 modulates the first light beam through pixel 2 to obtain the first path of imaging light, which is reflected by the curved mirror 102 and then enters the left eye.
  • the pixel component 502 modulates the second beam through the pixel 1 to obtain the second imaging light, which enters the right eye after being reflected by the curved mirror 102.
  • the two virtual images observed by the user's eyes are at the same position, so dizziness does not occur, which improves the stereoscopic display effect.
  • the description with respect to Figure 12 is only an example.
  • the pixel component 502 may change the pixels used for both paths at the same time. Specifically, the pixel component 502 modulates the first light beam through pixel 2 to obtain the first path of imaging light.
  • the pixel component 502 modulates the second light beam through the pixel 3 to obtain a second path of imaging light.
  • pixel 1, pixel 2, first pixel or second pixel, etc. may refer to one pixel point, or may refer to a set of multiple pixel points. This application does not limit this.
  • LED1 or LED2 may refer to one LED or a collection of multiple LEDs.
  • the first light source device or the second light source device may also refer to one LED, or may refer to a collection of multiple LEDs.
  • FIG. 13 is a schematic diagram of the third optical path projection of the three-dimensional display device provided by the embodiment of the present application.
  • two pixels of the image generating component 101 output two paths of imaging light.
  • the two channels of imaging light include a first channel of imaging light and a second channel of imaging light.
  • the two pixels include pixel 1 and pixel 2.
  • the coordinates of pixel 1 are (X_Oleft, Y_Oleft).
  • the coordinates of pixel 2 are (X_Oright, Y_Oright).
  • the curved mirror 102 is used to reflect the first imaging light output by the pixel 1.
  • the reflected first imaging light strikes the user's left eye.
  • the coordinates of the user's left eye are (X_left, Y_left).
  • the curved mirror 102 is also used to reflect the second path of imaging light output by the pixel 2.
  • the reflected second imaging light irradiates the user's right eye.
  • the coordinates of the user's right eye are (X_right, Y_right).
  • the virtual images corresponding to pixel 1 and pixel 2 coincide at the same virtual image point on the virtual image plane 1301.
  • the coordinates of the virtual image point are (X_V, Y_V).
  • the virtual image corresponding to pixel 1 is on virtual image point 1 of the virtual image plane 1301.
  • the coordinates of virtual image point 1 are (X_V1, Y_V1).
  • the virtual image corresponding to pixel 2 is on virtual image point 2 of the virtual image plane 1301.
  • the coordinates of virtual image point 2 are (X_V2, Y_V2).
  • the display error Δ is calculated from the coordinates of virtual image point 1 and virtual image point 2, for example as the distance between the two points.
  • the above-mentioned pixel 1 and pixel 2 are a pair of sampling points, respectively displaying the left eye image and the right eye image.
  • if the display error Δ is greater than or equal to the threshold, it indicates that the processor 1001 needs to perform preprocessing.
  • the threshold may be tan(2.5 mrad) × S, where S is the distance between the user's eyes and the curved mirror 102.
  • the pixels included in a pair of sampling points can be changed.
  • a pair of sample points includes pixel 1 and pixel 3.
  • the virtual image of pixel 3 is projected on virtual image point 3 of virtual image plane 1301.
  • the coordinates of virtual image point 3 are (X_V3, Y_V3).
  • the display error Δ is then recalculated. If the display error Δ is less than the threshold, the display error after correction is within an acceptable range, and the processor 1001 does not need to further preprocess the image information. If the display error Δ is still greater than or equal to the threshold after preprocessing, the processor 1001 may perform further preprocessing until the display error Δ is less than the threshold.
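The error check described above can be written out directly: compute the display error Δ between the two virtual image points, compare it with the threshold tan(2.5 mrad) · S, and decide whether (further) preprocessing is needed (a Python sketch; taking Δ as the Euclidean distance between the virtual image points is an assumption, as the exact formula is not given in the text):

```python
import math

def display_error(point1, point2):
    """Δ: distance between virtual image point 1 and virtual image point 2,
    each given as (X_V, Y_V) coordinates on the virtual image plane."""
    return math.hypot(point1[0] - point2[0], point1[1] - point2[1])

def needs_preprocessing(point1, point2, s):
    """Threshold from the text: tan(2.5 mrad) * S, where S is the distance
    between the user's eyes and the curved mirror. Returns True when the
    display error is large enough that preprocessing is required."""
    threshold = math.tan(2.5e-3) * s
    return display_error(point1, point2) >= threshold
```

For example, at S = 2000 mm the threshold is roughly 5 mm, so virtual image points 10 mm apart would trigger preprocessing while points 1 mm apart would not.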
  • FIG. 14 is a schematic circuit diagram of a three-dimensional display device provided by an embodiment of the present application.
  • the circuit in the display device mainly includes a processor 1001, an internal memory 1002, an external memory interface 1003, an audio module 1004, a video module 1005, a power module 1006, a wireless communication module 1007, an I/O interface 1008, a video interface 1009, a controller area network (CAN) transceiver 1010, a display circuit 1028, a display panel 1029, etc.
  • the processor 1001 and its peripheral components, such as the memory 1002, the CAN transceiver 1010, the audio module 1004, the video module 1005, the power module 1006, the wireless communication module 1007, the I/O interface 1008, the video interface 1009, the touch unit, and the display circuit 1028, can be connected through a bus.
  • Processor 1001 may be called a front-end processor.
  • circuit diagram schematically illustrated in the embodiment of the present application does not constitute a specific limitation on the display device.
  • the display device may include more or fewer components than shown in the figures, or some components may be combined, or some components may be separated, or may be arranged differently.
  • the components illustrated may be implemented in hardware, software, or a combination of software and hardware.
  • the processor 1001 includes one or more processing units.
  • the processor 1001 may include an application processor (Application Processor, AP), a modem processor, a graphics processor (Graphics Processing Unit, GPU), an image signal processing unit. (Image Signal Processor, ISP), video codec, digital signal processor (Digital Signal Processor, DSP), baseband processor, and/or neural network processor (Neural-Network Processing Unit, NPU), etc.
  • different processing units can be independent devices or integrated in one or more processors.
  • the processor 1001 may also be provided with a memory for storing instructions and data.
  • the memory in the processor 1001 is a cache memory. This memory can hold instructions or data that the processor 1001 has just used or reuses cyclically. If the processor 1001 needs to use the instructions or data again, they can be called directly from this memory. This avoids repeated accesses and reduces the waiting time of the processor 1001, thereby improving system efficiency.
  • the functions of the processor 1001 can be implemented by a domain controller on the vehicle.
  • the display device may also include a plurality of input/output (I/O) interfaces 1008 connected to the processor 1001 .
  • the interface 1008 may include, but is not limited to, an Inter-Integrated Circuit (I2C) interface, an Inter-Integrated Circuit Sound (I2S) interface, a Pulse Code Modulation (PCM) interface, a Universal Asynchronous Receiver/Transmitter (UART) interface, a Mobile Industry Processor Interface (MIPI), a General-Purpose Input/Output (GPIO) interface, a Subscriber Identity Module (SIM) interface, and/or a Universal Serial Bus (USB) interface, etc.
  • the above-mentioned I/O interface 1008 can be connected to devices such as a mouse, a touch screen, a keyboard, a camera, a speaker, a microphone, etc., or can be connected to physical buttons on the display device (such as volume keys, brightness adjustment keys, and power on/off keys).
  • Internal memory 1002 may be used to store computer executable program code, which includes instructions.
  • the memory 1002 may include a program storage area and a data storage area.
  • the stored program area can store the operating system, at least one application program required for the function (such as call function, time setting function, AR function, etc.).
  • the storage data area can store data created during use of the display device (such as phone book, world time, etc.).
  • the internal memory 1002 may include high-speed random access memory, and may also include non-volatile memory, such as at least one disk storage device, flash memory device, Universal Flash Storage (UFS), etc.
  • the processor 1001 executes instructions stored in the internal memory 1002 and/or instructions stored in a memory provided in the processor 1001 to execute various functional applications and data processing of the display device.
  • the external memory interface 1003 can be used to connect an external memory (such as a Micro SD card).
  • the external memory can store data or program instructions as needed.
  • the processor 1001 can read and write these data or program instructions through the external memory interface 1003.
  • the audio module 1004 is used to convert digital audio information into analog audio signal output, and is also used to convert analog audio input into digital audio signals.
  • the audio module 1004 can also be used to encode and decode audio signals, such as playing or recording.
  • the audio module 1004 may be disposed in the processor 1001, or some functional modules of the audio module 1004 may be disposed in the processor 1001.
  • the display device can implement audio functions through the audio module 1004 and an application processor.
  • the video interface 1009 can receive external audio and video input, and may specifically be a High Definition Multimedia Interface (HDMI), a Digital Visual Interface (DVI), a Video Graphics Array (VGA) interface, a DisplayPort (DP), a Low Voltage Differential Signaling (LVDS) interface, etc.
  • the video interface 1009 can also output video.
  • the display device receives video data sent by the navigation system or receives video data sent by the domain controller through the video interface.
  • the video module 1005 can decode the video input by the video interface 1009, for example, perform H.264 decoding.
  • the video module can also encode the video collected by the display device, such as H.264 encoding of the video collected by an external camera.
  • the processor 1001 can also decode the video input from the video interface 1009, and then output the decoded image signal to the display circuit.
  • the above-mentioned display device also includes a CAN transceiver 1010, and the CAN transceiver 1010 can be connected to the CAN bus (CAN BUS) of the car.
  • the display device can communicate with the in-vehicle entertainment system (music, radio, video module), vehicle status system, etc.
  • the user can activate the car music playback function by operating the display device.
  • the vehicle status system can send vehicle status information (doors, seat belts, etc.) to the display device for display.
  • the display circuit 1028 and the display panel 1029 jointly implement the function of displaying images.
  • the display circuit 1028 receives the image signal output by the processor 1001, processes the image signal, and then inputs it into the display panel 1029 for imaging.
  • the display circuit 1028 can also control the image displayed by the display panel 1029, for example, control parameters such as display brightness or contrast.
  • the display circuit 1028 may include a driving circuit, an image control circuit, and the like.
  • the above-mentioned display circuit 1028 and display panel 1029 may be located in the pixel component 502.
  • the display panel 1029 is used to modulate the light beam input from the light source according to the input image signal, thereby generating a visible image.
  • the display panel 1029 may be a liquid crystal on silicon panel, a liquid crystal display panel or a digital micromirror device.
  • the video interface 1009 can receive input video data (also called a video source).
  • the video module 1005 decodes and/or digitizes the data and then outputs the image signal to the display circuit 1028.
  • the display circuit 1028 responds to the input image signal.
  • the display panel 1029 is driven to image the light beam emitted by the light source, thereby generating a visible image (emitting imaging light).
  • the power module 1006 is used to provide power to the processor 1001 and the light source according to the input power (eg, direct current).
  • the power module 1006 may include a rechargeable battery, and the rechargeable battery may provide power to the processor 1001 and the light source.
  • the light emitted by the light source can be transmitted to the display panel 1029 for imaging, thereby forming an image light signal (imaging light).
  • the power supply module 1006 can be connected to a power supply module (such as a power battery) of a car, and the power supply module of the car supplies power to the power supply module 1006 of the display device.
  • the wireless communication module 1007 can enable the display device to communicate wirelessly with the outside world, and can provide Wireless Local Area Networks (WLAN) (such as Wireless Fidelity (Wi-Fi) network), Bluetooth (Bluetooth, BT), Global Navigation Satellite System (GNSS), Frequency Modulation (FM), Near Field Communication (NFC), Infrared (IR) and other wireless communication solutions.
  • the wireless communication module 1007 may be one or more devices integrating at least one communication processing module.
  • the wireless communication module 1007 receives electromagnetic waves through the antenna, frequency modulates and filters the electromagnetic wave signals, and sends the processed signals to the processor 1001 .
  • the wireless communication module 1007 can also receive the signal to be sent from the processor 1001, frequency modulate it, amplify it, and convert it into electromagnetic waves through the antenna for radiation.
  • the video data decoded by the video module 1005 can also be received wirelessly through the wireless communication module 1007 or read from the internal memory 1002 or an external memory.
  • the display device can receive video data from the terminal device or the vehicle entertainment system through the in-car wireless LAN, and the display device can also read the audio and video data stored in the internal memory 1002 or the external memory.
  • An embodiment of the present application also provides a vehicle equipped with any one of the aforementioned three-dimensional display devices.
  • the two imaging lights carry image information with different parallaxes.
  • the output two-way imaging light is reflected to the windshield through the reflector, and the windshield further reflects the two-way imaging light to form a virtual image.
  • the virtual image is on one side of the windshield, with the driver or passenger on the other side.
  • the reflected two paths of imaging light respectively illuminate the eyes of the driver or passenger. For example, the first path of imaging light illuminates the passenger's left eye, and the second path of imaging light illuminates the passenger's right eye.
  • FIG. 15 is a schematic diagram of a three-dimensional display device installed on a vehicle according to an embodiment of the present application.
  • the windshield of a vehicle can be used as a curved mirror or lens in a stereoscopic display device.
  • the image generating assembly 101 and the driver or passenger are located on the same side of the windshield.
  • the image generating assembly 101 and the driver or passenger are located on different sides of the windshield.
  • the image generating component 101 is used to output two channels of imaging light. The two imaging lights carry image information with different parallaxes.
  • the windshield is used to reflect or transmit two-way imaging light to form a virtual image.
  • the virtual image is on one side of the windshield, with the driver or passenger on the other side.
  • the two paths of imaging light after reflection or transmission respectively illuminate the eyes of the driver or passenger. For example, the first path of imaging light illuminates the passenger's left eye, and the second path of imaging light illuminates the passenger's right eye.
  • the vehicle may be a car, truck, motorcycle, bus, boat, airplane, helicopter, lawn mower, recreational vehicle, amusement park vehicle, construction equipment, trolley, golf cart, train, handcart, etc.
  • the three-dimensional display device can be installed on the instrument panel (IP) of the vehicle, at the front passenger's or the driver's position, or it can be installed on the back of a seat.
  • when the above three-dimensional display device is used in a vehicle, it can be called a head-up display (head-up display, HUD), and can be used to display navigation information, vehicle speed, power/fuel level, etc.
  • Figure 16 is a schematic diagram of a possible functional framework of the vehicle provided by the embodiment of the present application.
  • the functional framework of the vehicle may include various subsystems, such as the control system 14, the sensor system 12, one or more peripheral devices 16 (one is shown as an example), the power supply 18, the computer system 20, and the display system 32.
  • the vehicle may also include other functional systems, such as an engine system that provides power for the vehicle, etc., which is not limited in this application.
  • the sensor system 12 may include several detection devices, each of which can sense measured information and convert it, according to certain rules, into an electrical signal or another required form of information output.
  • these detection devices may include a global positioning system (GPS), a vehicle speed sensor, an inertial measurement unit (IMU), a radar unit, a laser rangefinder, a camera device, a wheel speed sensor, a steering sensor, a gear sensor, or other components used for automatic detection; this is not limited in this application.
  • the control system 14 may include several elements, such as the illustrated steering unit, braking unit, lighting system, automatic driving system, map navigation system, network time synchronization system and obstacle avoidance system.
  • the control system 14 may also include components such as a throttle controller and an engine controller for controlling the driving speed of the vehicle, which are not limited in this application.
  • Peripheral device 16 may include several elements, such as a communication system, a touch screen, a user interface, a microphone and a speaker as shown, among others.
  • the communication system is used to implement network communication between the vehicle and devices other than the vehicle.
  • the communication system can use wireless communication technology or wired communication technology to implement this network communication.
  • wired communication technology may refer to communication between the vehicle and other devices through network cables or optical fibers.
  • the power supply 18 represents a system that provides power or energy for the vehicle, and may include, but is not limited to, a rechargeable lithium battery or a lead-acid battery. In practical applications, one or more battery components in the power supply provide electric energy or energy for starting the vehicle; the type and material of the power supply are not limited in this application.
  • the computer system 20 may include one or more processors 2001 (one processor is shown as an example) and a memory 2002 (which may also be referred to as a storage device).
  • the memory 2002 may be inside the computer system 20 or outside the computer system 20, for example as a cache in the vehicle; this is not limited in this application.
  • Processor 2001 may include one or more general-purpose processors, such as a graphics processing unit (GPU).
  • the processor 2001 may be used to run relevant programs or instructions corresponding to the programs stored in the memory 2002 to implement corresponding functions of the vehicle.
  • the memory 2002 may include volatile memory (volatile memory), such as RAM; it may also include non-volatile memory (non-volatile memory), such as ROM, flash memory (flash memory), or a solid state drive (solid state drive, SSD); the memory 2002 may also include a combination of the above types of memory.
  • the memory 2002 can be used to store a set of program codes, or instructions corresponding to the program codes, so that the processor 2001 can call the program codes or instructions stored in the memory 2002 to implement the corresponding functions of the vehicle. These functions include, but are not limited to, some or all of the functions in the vehicle functional framework diagram shown in Figure 16. In this application, a set of program codes for vehicle control can be stored in the memory 2002, and the processor 2001 calls the program codes to control safe driving of the vehicle; how safe driving of the vehicle is achieved is described in detail below in this application.
  • the memory 2002 may also store information such as road maps, driving routes, sensor data, and the like.
  • the computer system 20 may be combined with other elements in the vehicle functional framework diagram, such as the sensors and GPS in the sensor system, to implement vehicle-related functions.
  • the computer system 20 can control the driving direction or driving speed of the vehicle based on data input from the sensor system 12 , which is not limited in this application.
  • the display system 32 may include several elements, such as a controller and the stereoscopic display device 100 described above.
  • the controller is configured to generate an image according to user instructions (for example, generate an image including vehicle status such as vehicle speed and power/fuel level, and an image of augmented reality (AR) content), and send the image content to the stereoscopic display device 100.
  • the image generation module 101 in the stereoscopic display device 100 is used to output two channels of imaging light carrying different image information.
  • the curved screen 102 in the stereoscopic display device 100 is a windshield.
  • the windshield is used to reflect or transmit two-way imaging light, so that a virtual image corresponding to the image content is presented in front of the driver or passenger.
  • the functions of some components in the display system 32 can also be implemented by other subsystems of the vehicle.
  • the controller can also be a component in the control system 14 .
  • Figure 16 of this application is shown as including four subsystems; the sensor system 12, the control system 14, the computer system 20, and the display system 32 are only examples and do not constitute a limitation.
  • a vehicle can combine several of its components according to different functions to obtain subsystems with correspondingly different functions.
  • the vehicle may include more or fewer systems or components, which is not limited by this application.
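The magnified virtual image described in the bullets above arises because the image surface lies within the focal length of the curved mirror or windshield (d < f, in the notation used elsewhere in this application). A minimal numerical sketch of that geometry under the thin-mirror approximation; the focal length and distance values below are illustrative only and are not taken from the application:

```python
def virtual_image(d: float, f: float) -> tuple[float, float]:
    """Image distance and magnification for an object (the image
    surface) placed at distance d from a curved mirror of focal
    length f, using the thin-mirror equation 1/d + 1/v = 1/f.

    This sketch assumes 0 < d < f, the configuration described in
    this application: v comes out negative, meaning the image is
    virtual (formed behind the mirror) and upright, and the
    magnification m = f / (f - d) is greater than 1.
    """
    if not 0.0 < d < f:
        raise ValueError("this sketch assumes 0 < d < f")
    v = d * f / (d - f)   # negative: virtual image behind the mirror
    m = -v / d            # equals f / (f - d) > 1
    return v, m

# Illustrative values: image surface 40 mm from a mirror of
# 60 mm focal length.
v, m = virtual_image(d=40.0, f=60.0)
print(v, m)  # -120.0 3.0: virtual image 120 mm behind the mirror, 3x size
```

The same relation is why the virtual image appears on the far side of the windshield from the driver or passenger, as stated above.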


Abstract

Provided is a three-dimensional display apparatus (100), applied to the field of display. The three-dimensional display apparatus (100) comprises an image generation module (101) and a curved mirror (102). The image generation module (101) is used to generate two rays of imaging light, the two rays of imaging light carrying image information of different parallaxes. The curved mirror (102) is used to reflect the two rays of imaging light, and there is an included angle between the two reflected rays of imaging light. The focal length of the curved mirror (102) is f, and the distance between an image surface of the image generation module (101) and the curved mirror (102) is d, d being less than f. The curved mirror (102) can magnify a virtual image. When the distance between a user and the three-dimensional display apparatus (100) is relatively short, the user experience can be improved.
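The magnification claimed in the abstract follows directly from the thin-mirror equation; a short derivation in the abstract's notation (focal length f, image-surface distance d, with d < f), under the usual sign convention:

```latex
\frac{1}{d} + \frac{1}{v} = \frac{1}{f}
\quad\Longrightarrow\quad
v = \frac{d f}{d - f} < 0 \quad (d < f),
\qquad
m = -\frac{v}{d} = \frac{f}{f - d} > 1 .
```

The negative image distance v places the virtual image behind the mirror, on the opposite side from the viewer, and m > 1 is the magnification; for example, d = f/2 gives m = 2.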
PCT/CN2023/076650 2022-05-10 2023-02-17 Appareil d'affichage tridimensionnel et véhicule WO2023216670A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202210505264.7A CN117075359A (zh) 2022-05-10 2022-05-10 立体显示装置和交通工具
CN202210505264.7 2022-05-10

Publications (1)

Publication Number Publication Date
WO2023216670A1 true WO2023216670A1 (fr) 2023-11-16

Family

ID=86469458

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2023/076650 WO2023216670A1 (fr) 2022-05-10 2023-02-17 Appareil d'affichage tridimensionnel et véhicule

Country Status (2)

Country Link
CN (2) CN117075359A (fr)
WO (1) WO2023216670A1 (fr)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2019219555A (ja) * 2018-06-21 2019-12-26 創智車電股▲ふん▼有限公司Conserve&Associates,Inc. ディスプレイ装置、および、それを用いた自動車のヘッドアップディスプレイシステム(display device and automobile head−up display system using the same)
JP2021021914A (ja) * 2019-07-30 2021-02-18 怡利電子工業股▲ふん▼有限公司 裸眼3d反射型拡散片ヘッドアップディスプレイ装置
CN112526748A (zh) * 2019-09-02 2021-03-19 未来(北京)黑科技有限公司 一种抬头显示设备、成像系统和车辆
CN112639581A (zh) * 2020-10-31 2021-04-09 华为技术有限公司 抬头显示器和抬头显示方法
JP2021067909A (ja) * 2019-10-28 2021-04-30 日本精機株式会社 立体表示装置及びヘッドアップディスプレイ装置
CN213457538U (zh) * 2020-09-08 2021-06-15 未来(北京)黑科技有限公司 抬头显示装置及抬头显示系统
CN114137725A (zh) * 2020-09-04 2022-03-04 未来(北京)黑科技有限公司 一种可显示三维图像的抬头显示系统

Family Cites Families (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20130035587A (ko) * 2011-09-30 2013-04-09 엘지디스플레이 주식회사 입체영상 디스플레이장치 및 그 제조 방법
KR101322910B1 (ko) * 2011-12-23 2013-10-29 한국과학기술연구원 다수의 관찰자에 적용가능한 동적 시역 확장을 이용한 다시점 3차원 영상표시장치 및 그 방법
CN104536578B (zh) * 2015-01-13 2018-02-16 京东方科技集团股份有限公司 裸眼3d显示装置的控制方法及装置、裸眼3d显示装置
CN105025289B (zh) * 2015-08-10 2017-08-08 重庆卓美华视光电有限公司 一种立体显示方法及装置
CN105404011B (zh) * 2015-12-24 2017-12-12 深圳点石创新科技有限公司 一种抬头显示器的3d图像校正方法以及抬头显示器
CN108663807B (zh) * 2017-03-31 2021-06-01 宁波舜宇车载光学技术有限公司 平视显示光学系统和装置及其成像方法
JP6873850B2 (ja) * 2017-07-07 2021-05-19 京セラ株式会社 画像投影装置及び移動体
CN110874867A (zh) * 2018-09-03 2020-03-10 广东虚拟现实科技有限公司 显示方法、装置、终端设备及存储介质
CN109462750A (zh) * 2018-12-29 2019-03-12 上海玮舟微电子科技有限公司 一种抬头显示系统、信息显示方法、装置及介质
WO2020261830A1 (fr) * 2019-06-26 2020-12-30 株式会社Jvcケンウッド Dispositif d'affichage tête haute
CN114153066A (zh) * 2020-09-08 2022-03-08 未来(北京)黑科技有限公司 抬头显示装置及抬头显示系统
CN112752085A (zh) * 2020-12-29 2021-05-04 北京邮电大学 基于人眼跟踪的裸眼3d视频播放系统及方法


Also Published As

Publication number Publication date
CN117075359A (zh) 2023-11-17
CN116184686A (zh) 2023-05-30

Similar Documents

Publication Publication Date Title
CN112639581B (zh) 抬头显示器和抬头显示方法
WO2021054277A1 (fr) Affichage tête haute et système d'affichage d'image
WO2024021852A1 (fr) Appareil d'affichage stéréoscopique, système d'affichage stéréoscopique et véhicule
WO2024021574A1 (fr) Système de projection 3d, système de projection et véhicule
WO2024017038A1 (fr) Appareil de génération d'image, dispositif d'affichage et véhicule
WO2021015171A1 (fr) Affichage tête haute
WO2023216670A1 (fr) Appareil d'affichage tridimensionnel et véhicule
CN217360538U (zh) 一种投影系统、显示设备和交通工具
WO2024098828A1 (fr) Système de projection, procédé de projection et moyen de transport
WO2023185293A1 (fr) Appareil de génération d'images, dispositif d'affichage et véhicule
JP7492971B2 (ja) ヘッドアップディスプレイ
WO2023130759A1 (fr) Dispositif d'affichage et véhicule
US20240069335A1 (en) Head-up display
US20240036311A1 (en) Head-up display
CN115542644B (zh) 投影装置、显示设备及交通工具
WO2023098228A1 (fr) Appareil d'affichage, dispositif électronique et véhicule
WO2024001225A1 (fr) Appareil d'affichage d'image virtuelle, procédé et appareil de génération de données d'image, et dispositif associé
WO2024065332A1 (fr) Module d'affichage, système d'affichage optique, dispositif terminal et procédé d'affichage d'image
WO2020218072A1 (fr) Affichage tête haute de véhicule et unité de source de lumière utilisée à cet effet
WO2024041034A1 (fr) Module d'affichage, système d'affichage optique, dispositif terminal et procédé d'imagerie
US20230152586A1 (en) Image generation device and head-up display
WO2022009605A1 (fr) Dispositif de génération d'images et affichage tête haute
WO2023040669A1 (fr) Dispositif d'affichage tête haute et véhicule
WO2023138076A1 (fr) Appareil d'affichage et véhicule

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23802446

Country of ref document: EP

Kind code of ref document: A1