CN116184686A - Stereoscopic display device and vehicle - Google Patents


Info

Publication number
CN116184686A
Authority
CN
China
Prior art keywords
imaging light
light
paths
image
stereoscopic display
Prior art date
Legal status
Pending
Application number
CN202211091089.8A
Other languages
Chinese (zh)
Inventor
邓宁
贺俊妮
邹冰
常泽山
黄志勇
常天海
Current Assignee
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd filed Critical Huawei Technologies Co Ltd
Priority to CN202211091089.8A
Publication of CN116184686A

Classifications

    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B30/00 Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images
    • G02B30/20 Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer's left and right eyes
    • G02B30/26 Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer's left and right eyes of the autostereoscopic type
    • G02B30/33 Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer's left and right eyes of the autostereoscopic type involving directional light or back-light sources
    • G02B30/34 Stereoscopes providing a stereoscopic pair of separated images corresponding to parallactically displaced views of the same object, e.g. 3D slide viewers
    • G02B30/35 Stereoscopes providing a stereoscopic pair of separated images corresponding to parallactically displaced views of the same object, e.g. 3D slide viewers using reflective optical elements in the optical path between the images and the observer


Abstract

The application provides a stereoscopic display device, which is applied to the field of display. The stereoscopic display apparatus includes an image generating assembly and a curved mirror. The image generation assembly is used for generating two paths of imaging light. The two paths of imaging light carry image information of different parallaxes. The curved mirror is used for reflecting the two paths of imaging light. An included angle exists between the two paths of reflected imaging light. The focal length of the curved mirror is f. The image plane of the image generating assembly is spaced from the curved mirror by a distance d, where d is less than f. In the present application, the curved mirror can magnify the virtual image. Therefore, when the distance between the user and the stereoscopic display device is relatively short, the user experience can be improved.

Description

Stereoscopic display device and vehicle
This application is a divisional application; the application number of the original application is CN202210505264.7, the filing date of the original application is May 10, 2022, and the entire contents of the original application are incorporated herein by reference.
Technical Field
The present disclosure relates to the field of display, and more particularly, to a stereoscopic display device and a vehicle including the stereoscopic display device.
Background
Stereoscopic display requires providing both eyes with image information carrying different parallaxes. Compared with 2D display, stereoscopic display can give better experience to people.
Autostereoscopic display is a stereoscopic display scheme in which the user does not need to wear polarized glasses or shutter glasses. The stereoscopic display device outputs two paths of imaging light to the left and right eyes of a user, respectively. Specifically, the stereoscopic display device can output the two paths of imaging light in a time-sharing manner: in a first time period, one path of imaging light output by the stereoscopic display device irradiates one eye of the user; in a second time period, the other path of imaging light output by the stereoscopic display device irradiates the other eye of the user. The first time period and the second time period alternate. The two paths of imaging light carry image information with different parallaxes, thereby providing stereoscopic visual enjoyment for users.
However, to enhance the user experience, the stereoscopically displayed image needs a large size. Therefore, when the distance between the user and the stereoscopic display device is relatively short, the user experience is poor.
Disclosure of Invention
The application provides a stereoscopic display device and a vehicle, wherein a stereoscopic display image can be magnified through a curved mirror or a lens. Therefore, when the distance between the user and the stereoscopic display device is relatively short, the user experience can be improved.
The first aspect of the application provides a stereoscopic display device. The stereoscopic display apparatus includes an image generating assembly and a curved mirror. The image generation assembly is used for generating two paths of imaging light. The two paths of imaging light carry image information of different parallaxes. The curved mirror is used for reflecting two paths of imaging light. An included angle exists between the two paths of reflected imaging light. The focal length of the curved mirror is f. The image plane of the image generating assembly is spaced from the curved mirror by a distance d. d is less than f.
In this application, d is smaller than f, and the curved mirror can magnify the stereoscopic image. Therefore, when the distance between the user and the stereoscopic display device is relatively short, the user experience can be improved. Also, the volume of the curved mirror may be smaller than that of the lens, so that the volume of the stereoscopic display device may be reduced.
In an alternative form of the first aspect, the virtual image formed by the two reflected imaging light paths is separated from the curved mirror by a distance D. D satisfies the following formula:
D = f × d/(f-d)
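The relationship between d, f and the virtual-image distance D can be checked numerically. The sketch below assumes the standard mirror equation for a magnified virtual image (D = f·d/(f-d) when d < f); the numeric values are illustrative, not taken from the patent.

```python
# Hedged sketch: virtual-image distance behind a curved mirror, assuming
# the standard mirror equation 1/d - 1/D = 1/f (virtual image when d < f).
def virtual_image_distance(f_mm: float, d_mm: float) -> float:
    """Distance D of the magnified virtual image behind the curved mirror."""
    if d_mm >= f_mm:
        raise ValueError("a magnified virtual image requires d < f")
    return f_mm * d_mm / (f_mm - d_mm)

# Example: f = 300 mm, d = 200 mm gives D = 600 mm, i.e. a 3x magnification.
D = virtual_image_distance(300.0, 200.0)
print(D)          # 600.0
print(D / 200.0)  # 3.0 (lateral magnification D/d)
```

As d approaches f the virtual image recedes to infinity, which is why the application constrains d to be strictly smaller than f.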
In an alternative form of the first aspect, an included angle γ1 exists between the two imaging light paths reflected by the curved mirror. When the value of S is a certain value between 0 mm and 5000 mm, γ1 satisfies the following formula: E-w < tan(γ1) × (S+D) < E+2w. Here S is the distance between the receiving position of the two imaging light paths and the curved mirror. E has a value ranging from 53 mm to 73 mm. w is the width, at the receiving position, of at least one of the two imaging light paths reflected by the curved mirror. When both imaging light paths irradiate the same eye of a user, beam crosstalk occurs and the user experience is reduced. By controlling the included angle γ1, the two reflected imaging light paths can be directed to different eyes of the user. Therefore, the user experience can be further improved.
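The γ1 condition above can be expressed as a simple predicate. The sketch below is illustrative only; the interpupillary distance, beam width and distances are assumed sample values, not figures from the patent.

```python
import math

# Hedged sketch of the condition E - w < tan(γ1)·(S + D) < E + 2w:
# the separation M of the two reflected beams at the eyes must roughly
# match the interpupillary distance E so each beam lands on one eye.
def gamma1_satisfies(gamma1_rad: float, S_mm: float, D_mm: float,
                     E_mm: float, w_mm: float) -> bool:
    """True if the reflected beams land on separate eyes (no crosstalk)."""
    M = math.tan(gamma1_rad) * (S_mm + D_mm)  # beam separation at the eyes
    return E_mm - w_mm < M < E_mm + 2 * w_mm

# E = 63 mm (a typical interpupillary distance), w = 20 mm,
# S = 800 mm, D = 600 mm: pick the angle that centers M on E.
gamma1 = math.atan(63.0 / (800.0 + 600.0))
print(gamma1_satisfies(gamma1, 800.0, 600.0, 63.0, 20.0))  # True
```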
In an alternative form of the first aspect, there is an angle γ2 between the two paths of imaging light before being reflected by the curved mirror. When the value of S is a certain value between 0 mm and 5000 mm, γ2 satisfies the following formula:
(E-w) × D/(d × (S+D)) < tan(γ2) < (E+2w) × D/(d × (S+D))
where S is the distance between the receiving position of the two imaging light paths and the curved mirror, E has a value ranging from 53 mm to 73 mm, and w is the width, at the receiving position, of at least one of the two imaging light paths reflected by the curved mirror. By controlling the included angle γ2, the two reflected imaging light paths can be directed to different eyes of the user. Therefore, the user experience can be further improved.
In an alternative form of the first aspect, the divergence angle of each of the two imaging light paths before being reflected by the curved mirror is α. When the value of S is a certain value between 0 mm and 5000 mm, w satisfies the following formula:
w = 2 × tan(α/2) × (S+D)
w is less than 73 mm. When the width of each imaging light path is too large, each imaging light path may cover both eyes of the user, thereby causing crosstalk of light beams. By controlling the size of w, one path of imaging light can be prevented from entering both eyes of the user. Therefore, the user experience can be further improved.
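The beam-width bound can be evaluated directly. This sketch assumes the reconstruction w = 2·tan(α/2)·(S + D), which follows from treating each beam as diverging with angle α from the virtual-image plane; the exact formula in the original patent drawing may differ, and the numeric inputs are illustrative.

```python
import math

# Hedged sketch: width w of one imaging light path at the receiving
# position, assuming w = 2·tan(α/2)·(S + D).
def beam_width(alpha_rad: float, S_mm: float, D_mm: float) -> float:
    return 2.0 * math.tan(alpha_rad / 2.0) * (S_mm + D_mm)

# α = 2 degrees, S = 800 mm, D = 600 mm.
w = beam_width(math.radians(2.0), 800.0, 600.0)
print(round(w, 1))  # about 48.9 mm
print(w < 73.0)     # True: narrow enough not to cover both eyes
```

With these sample values the beam stays below the 73 mm ceiling, so a single path cannot span both eyes of the user.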
In one alternative of the first aspect, the image generation assembly includes a first light source assembly and a pixel assembly. The first light source component is used for outputting first light beams and second light beams with different emergent directions to the pixel component in a time-sharing mode. The pixel assembly is used for modulating the first light beam and the second light beam respectively by using different image information to generate two paths of imaging light.
In an alternative form of the first aspect, the first light source assembly comprises a first light source device and a second light source device. The first light source device and the second light source device are used for outputting a first light beam and a second light beam alternately in a time sharing mode.
In an alternative form of the first aspect, the image generation assembly further comprises a timing control unit. The time sequence control unit is used for controlling the first light source device and the second light source device to alternately output the first light beam and the second light beam in a time-sharing mode. The timing control unit is further configured to control the pixel assembly to time-share the first light beam and the second light beam using different image information.
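The time-sharing scheme above can be sketched as a simple control loop: a timing controller alternately fires the two light source devices while the pixel assembly loads the matching parallax image. The class and method names are illustrative, not from the patent.

```python
# Hedged sketch of the timing control unit: alternate two light sources
# in time-sharing, pairing each with the corresponding parallax image.
class TimingController:
    def __init__(self, frames):
        self.frames = frames  # [(left_eye_image, right_eye_image), ...]
        self.log = []

    def run(self):
        for left, right in self.frames:
            # First time period: source 1 on, pixels show the left-eye image.
            self.log.append(("source1", left))
            # Second time period: source 2 on, pixels show the right-eye image.
            self.log.append(("source2", right))
        return self.log

log = TimingController([("L0", "R0"), ("L1", "R1")]).run()
print(log)
# [('source1', 'L0'), ('source2', 'R0'), ('source1', 'L1'), ('source2', 'R1')]
```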
In an alternative form of the first aspect, the two paths of imaging light comprise a first path of imaging light and a second path of imaging light. The image generation assembly includes a second light source assembly, a pixel assembly, and a lens array. The second light source component is used for outputting a third light beam to the pixel component. The pixel component is used for modulating the third light beam according to different image information to generate the first path of imaging light and the second path of imaging light. The lens array is used for transmitting the first path of imaging light and the second path of imaging light at different angles. By using a single third light beam, the cost of the stereoscopic display device can be reduced.
In an alternative form of the first aspect, the pixel assembly comprises a first pixel and a second pixel. The first pixel is used for modulating the third light beam according to the first image information to generate the first path of imaging light. The second pixel is used for modulating the third light beam according to the second image information to generate the second path of imaging light. Due to process errors of the curved mirror, the magnification and imaging position of the image observed by the user differ from their ideal values. This display difference can cause physiological discomfort to the user, such as dizziness, and reduce the user experience. To address this, the display difference can be compensated by preprocessing the image information. When the first light beam and the second light beam are modulated using the same pixel, the image information of different parallaxes can only be preprocessed to the same degree. Preprocessing to the same degree tends to create new display errors, thereby reducing the user experience. When the first light beam and the second light beam are modulated using different pixels, image information of different parallaxes can be preprocessed to different degrees. Preprocessing to different degrees can reduce or eliminate display errors, thereby improving the user experience.
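One way to realize distinct pixel groups for the two parallax images is spatial interleaving: even-indexed panel columns act as "first pixels" carrying the left-eye image, odd-indexed columns as "second pixels" carrying the right-eye image. This interleaving layout is an illustrative assumption, not the patent's exact pixel arrangement.

```python
# Hedged sketch: interleave one scanline of each parallax image so that
# alternating panel columns serve as first pixels and second pixels.
def interleave_columns(left_row, right_row):
    """Merge one scanline of each parallax image into one panel row."""
    assert len(left_row) == len(right_row)
    panel = []
    for l_px, r_px in zip(left_row, right_row):
        panel.extend([l_px, r_px])  # first pixel, then second pixel
    return panel

print(interleave_columns(["L0", "L1"], ["R0", "R1"]))
# ['L0', 'R0', 'L1', 'R1']
```

A lens array placed over such a panel can then send the two column groups out at different angles, as described above.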
In an alternative form of the first aspect, the third beam comprises a first sub-beam and a second sub-beam. The first pixel is configured to modulate the third light beam according to the first image information, and generating the first path of imaging light includes: the first pixel is used for modulating the first sub-beam according to the first image information to generate a first path of imaging light. The second pixel is configured to modulate the third light beam according to second image information, and generating a second path of imaging light includes: the second pixel is used for modulating the second sub-beam according to the second image information to generate a second path of imaging light. The second light source assembly is used for generating the first sub-beam and the second sub-beam simultaneously.
In an alternative form of the first aspect, the image information of the different disparities includes first image information and second image information. The stereoscopic display device further includes a processor. The processor may be disposed in the image generating assembly or may be disposed outside the image generating assembly.
In an alternative form of the first aspect, the processor is configured to pre-process the third image information to obtain the first image information. The processor is used for preprocessing the fourth image information to obtain second image information. By preprocessing the image information with different parallaxes to different degrees, the display error can be reduced or eliminated, and therefore the user experience is improved.
In an alternative form of the first aspect, the processor is further configured to obtain first coordinate information of a first position and/or second coordinate information of a second position. One of the two imaging light paths irradiates the first position; the other irradiates the second position. The processor being configured to preprocess the third image information includes: the processor is used for preprocessing the third image information according to the first coordinate information. The processor being configured to preprocess the fourth image information includes: the processor is used for preprocessing the fourth image information according to the second coordinate information. In the field of 2D display, both eyes of a user receive the same image information, so the processor need only preprocess one piece of image information. Specifically, the processor may acquire coordinate information of the middle position between the user's eyes and preprocess the image information according to that coordinate information. In this application, the processor may also preprocess different image information according to the coordinate information of the middle position. For example, the coordinate information of the middle position may correspond to two correction parameters, and the processor may preprocess the two pieces of image information according to the two correction parameters, respectively. Preprocessing according to the coordinate information of different positions improves the accuracy of the preprocessing, thereby improving the user experience.
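Per-eye preprocessing can be sketched as applying a separate correction parameter to each parallax image, looked up from the corresponding eye coordinates. The warp model here (a plain horizontal pixel shift) is an illustrative stand-in, since the patent does not specify the correction function.

```python
# Hedged sketch: preprocess each parallax image with its own correction
# parameter derived from that eye's coordinate information.
def preprocess(image_row, shift_px, fill=0):
    """Shift one scanline right by shift_px pixels, padding with `fill`."""
    if shift_px <= 0:
        return list(image_row)
    return [fill] * shift_px + list(image_row)[:-shift_px]

# Different correction parameters for the two eyes (illustrative values).
left_corrected = preprocess([1, 2, 3, 4], shift_px=1)   # left-eye correction
right_corrected = preprocess([1, 2, 3, 4], shift_px=2)  # right-eye correction
print(left_corrected)   # [0, 1, 2, 3]
print(right_corrected)  # [0, 0, 1, 2]
```

Applying independent parameters per eye, rather than one shared correction, is what lets the display compensate the two optical paths to different degrees.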
In an alternative form of the first aspect, f is less than 400 millimeters. When f is too large, the volume of the stereoscopic display device is large. Therefore, the volume of the stereoscopic display device can be reduced.
In an alternative form of the first aspect, the image generation assembly comprises a projector and a diffusion screen. The projector is used for generating two paths of imaging light. The diffusion screen is used for receiving the two paths of imaging light, diffusing the two paths of imaging light and outputting the diffused two paths of imaging light. The curved mirror is used for reflecting the two paths of diffused imaging light.
A second aspect of the present application provides a stereoscopic display device. The stereoscopic display apparatus includes an image generating assembly and a lens. The image generation assembly is used for generating two paths of imaging light. The two paths of imaging light carry image information of different parallaxes. The lens is used for transmitting the two paths of imaging light. An included angle exists between the two paths of transmitted imaging light. The focal length of the lens is f. The distance between the image plane of the image generation assembly and the lens is d. d is less than f.
In this application, d is less than f and the lens can magnify the stereoscopic image. Therefore, when the distance between the user and the stereoscopic display device is relatively short, the user experience can be improved.
In an alternative form of the second aspect, the virtual image formed by the two transmitted imaging light paths is separated from the lens by a distance D. D satisfies the following formula:
D = f × d/(f-d)
In an alternative form of the second aspect, an included angle γ1 exists between the two imaging light paths after transmission through the lens. When the value of S is a certain value between 0 mm and 5000 mm, γ1 satisfies the following formula: E-w < tan(γ1) × (S+D) < E+2w. Where S is the distance between the receiving position of the two imaging light paths and the lens. E has a value ranging from 53 mm to 73 mm. w is the width, at the receiving position, of at least one of the two imaging light paths transmitted through the lens.
In an alternative form of the second aspect, there is an included angle γ2 between the two paths of imaging light before being transmitted through the lens. When the value of S is a certain value between 0 mm and 5000 mm, γ2 satisfies the following formula:
(E-w) × D/(d × (S+D)) < tan(γ2) < (E+2w) × D/(d × (S+D))
where S is the distance between the receiving position of the two imaging light paths and the lens. E has a value ranging from 53 mm to 73 mm. w is the width, at the receiving position, of at least one of the two imaging light paths transmitted through the lens.
In an alternative form of the second aspect, the divergence angle of each of the two imaging light paths before being transmitted through the lens is α. When the value of S is a certain value between 0 mm and 5000 mm, w satisfies the following formula:
w = 2 × tan(α/2) × (S+D)
w is less than 73 mm.
In one alternative of the second aspect, the image generation assembly includes a first light source assembly and a pixel assembly. The first light source component is used for outputting first light beams and second light beams with different emergent directions to the pixel component in a time-sharing mode. The pixel component is used for modulating the first light beam and the second light beam respectively according to different image information to generate two paths of imaging light.
In an alternative form of the second aspect, the first light source assembly comprises a first light source device and a second light source device. The first light source device and the second light source device are used for outputting a first light beam and a second light beam alternately in a time sharing mode.
In an alternative form of the second aspect, the image generation assembly further comprises a timing control unit. The time sequence control unit is used for controlling the first light source device and the second light source device to alternately output the first light beam and the second light beam in a time-sharing mode. The timing control unit is further configured to control the pixel assembly to time-share the first light beam and the second light beam using different image information.
In an alternative form of the second aspect, the pixel assembly comprises a first pixel and a second pixel. The first pixel is used for modulating the first light beam to obtain a first imaging light path of the two imaging light paths. The second pixel is used for modulating a second light beam to obtain a second imaging light path in the two imaging light paths.
In an alternative form of the second aspect, the two paths of imaging light comprise a first path of imaging light and a second path of imaging light. The image generation assembly includes a second light source assembly, a pixel assembly, and a lens array. The second light source component is used for outputting a third light beam to the pixel component. The pixel assembly is used for modulating the third light beam by using different image information to generate a first path of imaging light and a second path of imaging light. The lens array is used for transmitting the first path of imaging light and the second path of imaging light at different angles.
In an alternative form of the second aspect, the pixel assembly comprises a first pixel and a second pixel. The first pixel is used for modulating the third light beam according to the first image information to generate a first path of imaging light. The second pixel is used for modulating the third light beam according to the second image information to generate a second path of imaging light.
In an alternative form of the second aspect, the third beam comprises a first sub-beam and a second sub-beam. The first pixel is used for modulating the first sub-beam according to the first image information to generate a first path of imaging light. The second pixel is used for modulating the second sub-beam according to the second image information to generate a second path of imaging light. The second light source assembly is used for generating the first sub-beam and the second sub-beam simultaneously.
In an alternative form of the second aspect, the image information of the different parallaxes includes first image information and second image information. The stereoscopic display device further includes a processor. The processor is used for preprocessing the third image information to obtain first image information. The processor is used for preprocessing the fourth image information to obtain second image information.
In an alternative form of the second aspect, the processor is further configured to obtain first coordinate information of the first location and/or second coordinate information of the second location. One of the two imaging lights irradiates the first position. The other imaging light of the two imaging lights irradiates the second position. The processor is configured to pre-process the third image information including: the processor is used for preprocessing the third image information according to the first coordinate information. The processor is configured to pre-process the fourth image information including: the processor is used for preprocessing the fourth image information according to the second coordinate information.
In an alternative form of the second aspect, f is less than 300 mm.
In an alternative form of the second aspect, the image generation assembly comprises a projector and a diffuser screen. The projector is used for generating two paths of imaging light. The diffusion screen is used for receiving the two paths of imaging light, diffusing the two paths of imaging light and outputting the diffused two paths of imaging light. The lens is used for transmitting the two paths of diffused imaging light.
A third aspect of the present application provides a vehicle. The vehicle comprises a stereoscopic display device as described in the first aspect, any one of the alternatives of the first aspect, the second aspect or any one of the alternatives of the second aspect. The stereoscopic display device is mounted on a vehicle.
Drawings
Fig. 1 is a schematic diagram of a first structure of a stereoscopic display device according to an embodiment of the present application;
fig. 2 is a schematic view of a first optical path projection of a stereoscopic display device according to an embodiment of the present application;
fig. 3 is a schematic diagram of a second structure of the stereoscopic display device according to the embodiment of the present application;
fig. 4 is a schematic view of a second optical path projection of the stereoscopic display device according to the embodiment of the present application;
FIG. 5 is a schematic diagram of a first configuration of an image generating assembly according to an embodiment of the present application;
FIG. 6 is a second schematic structural view of an image generating assembly according to an embodiment of the present application;
FIG. 7a is a schematic diagram of a third configuration of an image generation assembly according to an embodiment of the present application;
FIG. 7b is a schematic diagram of a fourth configuration of an image generation assembly according to an embodiment of the present application;
fig. 8 is a schematic structural diagram of a pixel assembly and a lens array according to an embodiment of the present disclosure;
Fig. 9a is a schematic diagram of a third structure of the stereoscopic display device according to the embodiment of the present application;
fig. 9b is a schematic diagram of a fourth structure of the stereoscopic display device according to the embodiment of the present application;
fig. 10a is a schematic view of a fifth structure of the stereoscopic display device according to the embodiment of the present application;
fig. 10b is a schematic view of a sixth structure of the stereoscopic display device according to the embodiment of the present application;
fig. 11 is a seventh schematic structural diagram of a stereoscopic display device according to an embodiment of the present application;
fig. 12 is an eighth schematic structural view of the stereoscopic display device according to the embodiment of the present application;
fig. 13 is a schematic view of a third optical path projection of the stereoscopic display device according to the embodiment of the present application;
fig. 14 is a schematic circuit diagram of a stereoscopic display device according to an embodiment of the present disclosure;
FIG. 15 is a schematic structural diagram of a vehicle according to an embodiment of the present disclosure;
fig. 16 is a schematic diagram of one possible functional framework of a vehicle provided in an embodiment of the present application.
Detailed Description
The application provides a stereoscopic display device and a vehicle, wherein a stereoscopic display image can be magnified through a curved mirror or a lens. Therefore, when the distance between the user and the stereoscopic display device is relatively short, the user experience can be improved. It is to be understood that the use of "first," "second," etc. herein is for descriptive purposes only and is not to be construed as indicating or implying any relative importance or order. In addition, for simplicity and clarity, reference numbers and/or letters are repeated throughout the several figures of the embodiments of the present application. This repetition does not by itself indicate a relationship between the various embodiments and/or configurations.
The stereoscopic display device in the present application may also be referred to as a 3D display device. The stereoscopic display device is applied to the technical field of projection. In the field of projection technology, stereoscopic visual enjoyment can be provided to a user by a directional backlight. However, to enhance the user experience, the stereoscopically displayed image needs a large size. Therefore, when the distance between the user and the stereoscopic display device is relatively short, the user experience is poor.
To this end, the present application provides a stereoscopic display device. Fig. 1 is a schematic diagram of a first structure of a stereoscopic display device according to an embodiment of the present application. As shown in fig. 1, the stereoscopic display apparatus 100 includes an image generation assembly 101 and a curved mirror 102. The image generation assembly 101 is used to generate two paths of imaging light. In fig. 1, each solid line connected to the image generation assembly 101 represents one path of imaging light. The curved mirror 102 is used to reflect the two paths of imaging light. An included angle exists between the two paths of reflected imaging light. Thus, the two reflected imaging light paths can be directed to different positions. For example, one imaging light path is directed to the left eye of the user and the other to the right eye. The two paths of imaging light carry image information of different parallaxes, thereby providing stereoscopic visual enjoyment for the user. The location of a human eye may be referred to as a viewpoint. The stereoscopic display device can provide multiple viewpoints for multiple people to watch; correspondingly, the image generation assembly 101 may produce multiple paths of imaging light for viewing by different persons. This embodiment describes the imaging process of the stereoscopic display device taking one viewpoint as an example, that is, taking the case in which the image generation assembly 101 generates two paths of imaging light.
In the present embodiment, the focal length of the curved mirror 102 is f. The distance between the image plane (the display plane of the image) of the image generation assembly 101 and the curved mirror 102 is d. Each point on the curved mirror 102 has its own perpendicular distance from the image plane of the image generation assembly 101; d may be the largest such perpendicular distance. Alternatively, d may be the linear distance from the center pixel of the image plane of the image generation assembly 101 to a target point on the curved mirror 102, where the center pixel is one or more pixels at the center position of the image plane and the imaging light output from the center pixel irradiates the target point on the curved mirror 102. d is less than f. When d is less than f, the curved mirror 102 magnifies the virtual image. Therefore, even when the distance between the user and the stereoscopic display apparatus 100 is relatively short, the user sees an enlarged virtual image, which improves the user experience.
Fig. 2 is a schematic view of a first optical path projection of the stereoscopic display device according to the embodiment of the present application. As shown in fig. 2, the image generation assembly 101 is configured to generate two paths of imaging light. The divergence angle of each of the two imaging lights is α. The dot-dash line in fig. 2 indicates one of the two imaging lights. The solid line in fig. 2 represents the other of the two imaging lights. An included angle gamma 2 exists between the two imaging lights. The curved mirror 102 is used to reflect two paths of imaging light. An included angle gamma 1 exists between the two paths of reflected imaging light. The virtual image formed by the reflected two imaging light paths is separated from the curved mirror 102 by a distance D. D can be obtained by f and D according to the following equation 1.
D = f × d / (f − d) formula 1
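The virtual-image distance can be sketched numerically. This is a minimal illustration, assuming the standard curved-mirror relation D = f·d/(f − d) for an object inside the focal length; the function name and sample values are hypothetical, not taken from the application.

```python
def virtual_image_distance(f_mm: float, d_mm: float) -> float:
    """Distance D of the magnified virtual image behind the curved mirror,
    assuming D = f*d / (f - d); valid only for 0 < d < f."""
    if not 0 < d_mm < f_mm:
        raise ValueError("a magnified virtual image requires 0 < d < f")
    return f_mm * d_mm / (f_mm - d_mm)

# e.g. a mirror with f = 200 mm and an image plane at d = 150 mm gives
# D = 200*150/50 = 600 mm, with lateral magnification D/d = 4x.
```

As d approaches f the virtual image recedes to infinity, which is why the design keeps d strictly below f.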
The two reflected paths of imaging light can be irradiated to different positions. The positions irradiated by the two paths of imaging light are also referred to as the receiving positions of the two paths, for example, the two eyes of the user. The interpupillary distance of the two eyes is E. E may range from 53 mm to 73 mm; for example, E is 53 mm or 73 mm. The two eyes are at a distance S from the curved mirror 102. At the position of the two eyes, the width of each of the two reflected paths of imaging light is w. w is related to S, α, and D, and can be obtained from them according to the following formula 2. When the width w of each path of imaging light is too large, one path of imaging light may cover both eyes of the user, resulting in crosstalk of the light beams. Therefore, in order to reduce or avoid crosstalk of the light beams, w may be smaller than the maximum value of E when S takes a value between 0 mm and 5000 mm. The maximum value of E is 73 mm, and S may take the value 0 mm or 5000 mm.
w = tan(α) × (S + D) formula 2
In practical application, by setting the distance between the two reflected paths of imaging light, the two paths can be irradiated to suitable positions, such as the two eyes of a user. The distance M between the two reflected paths of imaging light is related to the included angle γ1 and the distances D and S. M can be obtained from γ1, D, and S according to the following formula 3.
M = tan(γ1) × (S + D) formula 3
Similarly, in practical applications, the distance M between the two reflected paths of imaging light is related to the included angle γ2 and the distances d, D, and S. M can be obtained from γ2, d, D, and S according to the following formula 4.
M = tan(γ2) × d × (S + D) / D formula 4
In order to make the two reflected paths of imaging light strike the two eyes of the user respectively, a relationship between M, E, and w may be set. Specifically, when S takes a value between 0 mm and 5000 mm, E − w < M < E + 2w. Here w may be w1 or w2, where w1 is the width of the first path of imaging light and w2 is the width of the second path. In practical applications, w may be both w1 and w2; in that case, when S takes a certain value between 0 mm and 5000 mm, E − w1 < M < E + 2w1 and E − w2 < M < E + 2w2.
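The beam-width and beam-separation geometry, and the eye-hitting condition, can be checked with a short sketch. This assumes the forms w = tan(α)·(S + D) and M = tan(γ1)·(S + D); the numeric values are illustrative assumptions, not values from the application.

```python
import math

def beam_width(alpha_rad: float, S_mm: float, D_mm: float) -> float:
    # Width of one path of imaging light at the eye plane, which is S
    # from the mirror and S + D from the virtual image that the
    # reflected beam appears to diverge from (assumed formula 2 form).
    return math.tan(alpha_rad) * (S_mm + D_mm)

def beam_separation(gamma1_rad: float, S_mm: float, D_mm: float) -> float:
    # Separation of the two reflected paths at the eye plane:
    # M = tan(γ1) × (S + D), as in formula 3.
    return math.tan(gamma1_rad) * (S_mm + D_mm)

def hits_both_eyes(M_mm: float, w_mm: float, E_mm: float) -> bool:
    # The condition E − w < M < E + 2w stated in the text.
    return E_mm - w_mm < M_mm < E_mm + 2 * w_mm

# Illustrative check with S = 1000 mm, D = 600 mm, E = 63 mm: a beam
# with α ≈ 0.01 rad is about 16 mm wide at the eyes, well under the
# 73 mm maximum interpupillary distance, so it cannot cover both eyes.
```

A separation M near E with w a fraction of E keeps each path on its own eye, which is the design intent of the constraint.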
In fig. 1 and 2, the magnification of the virtual image is achieved by the reflection principle of a curved mirror. In practical applications, the magnification of the virtual image can also be achieved by the transmission principle of a lens. Fig. 3 is a schematic diagram of a second structure of the stereoscopic display device according to the embodiment of the present application. As shown in fig. 3, the stereoscopic display apparatus 100 includes an image generation component 101 and a lens 301. The image generation component 101 is used to generate two paths of imaging light. In fig. 3, each solid line connected to the image generation component 101 represents one path of imaging light. The lens 301 is used to transmit the two paths of imaging light. An included angle exists between the two transmitted paths of imaging light, so they can be irradiated to different positions. For example, one path of imaging light is directed to the left eye of the user and the other to the right eye. The two paths of imaging light carry information of images with different parallaxes, thereby providing a stereoscopic visual experience for users. The focal length of the lens is f. The distance between the image plane of the image generation component 101 and the lens 301 is d, and d is less than f. When d is less than f, the lens 301 forms a magnified virtual image. Therefore, even when the user is relatively close to the stereoscopic display apparatus 100, the user can see an enlarged virtual image, which improves the user experience.
In the embodiment of the present application, for convenience of description, the parameters of the two paths of imaging light are taken to be the same. For example, in fig. 2 and 4, the divergence angle of both paths of imaging light is α. In practical applications, there may be a deviation between the divergence angles of the two paths. For example, the divergence angle of the first path of imaging light is α1 and the divergence angle of the second path of imaging light is α2. In that case, two values of w can be obtained from α1, α2, and the foregoing formula 2: the width w1 of the first path of imaging light and the width w2 of the second path. To reduce or avoid crosstalk of the beams, w1 and w2 may both be smaller than the maximum value of E.
Fig. 4 is a schematic view of a second optical path projection of the stereoscopic display device according to the embodiment of the present application. As shown in fig. 4, the image generation component 101 is configured to generate two paths of imaging light. The divergence angle of each of the two imaging lights is α. The dot-dash line in fig. 4 indicates one of the two imaging lights. The solid line in fig. 4 represents the other of the two imaging lights. An included angle gamma 2 exists between the two paths of imaging light generated by the image generating component 101. The lens 301 is configured to transmit two paths of imaging light, and the propagation directions of the transmitted two paths of imaging light are deflected. An included angle gamma 1 exists between the two paths of transmitted imaging light. The distance between the virtual image formed by the transmitted two imaging lights and the lens 301 is D. With respect to the descriptions of D, w, E, M, etc., reference may be made to the foregoing related descriptions of fig. 2. As can be seen from fig. 2 and 4, there are some similarities between a stereoscopic display device realized by the principle of transmission and a stereoscopic display device realized by the principle of reflection. Therefore, as for the description of the stereoscopic display device realized by the transmission principle, reference may be made to the description of the stereoscopic display device realized by the reflection principle. In the following examples, a stereoscopic display device implemented by the reflection principle will be described as an example.
As can be seen from the foregoing description, the image generation component 101 is configured to generate two paths of imaging light. The structure of the image generation component 101 is exemplarily described below. Fig. 5 is a schematic diagram of a first structure of an image generation component according to an embodiment of the present application. As shown in fig. 5, the image generation component 101 includes a first light source assembly 501 and a pixel assembly 502. The first light source assembly 501 may be a light emitting diode (light emitting diode, LED) light source, a laser diode (laser diode, LD) light source, or the like. The first light source assembly 501 is configured to output a first light beam and a second light beam with different emission directions to the pixel assembly 502 in a time-sharing manner. In fig. 5, the dash-dot line connected to the first light source assembly 501 indicates the first light beam, and the solid line indicates the second light beam. The pixel assembly 502 may be a liquid crystal display (liquid crystal display, LCD), liquid crystal on silicon (liquid crystal on silicon, LCOS), a digital micromirror device (digital micromirror device, DMD), or the like. The pixel assembly 502 may also be referred to as an image modulator. The pixel assembly 502 is configured to modulate the first light beam and the second light beam with different image information, respectively, to generate two paths of imaging light, which include a first path of imaging light and a second path of imaging light. In fig. 5, the dash-dot line connected to the pixel assembly 502 indicates the first path of imaging light, and the solid line indicates the second path of imaging light.
In practical applications, the first light source assembly 501 may include a plurality of light source devices in order to alternately output the first light beam and the second light beam in a time-sharing manner. Fig. 6 is a second schematic structural diagram of an image generating assembly according to an embodiment of the present application. As shown in fig. 6, the first light source assembly 501 includes a first light source device 505 and a second light source device 506. On the basis of fig. 5, the image generation component 101 further comprises a timing control unit 504. The timing control unit 504 is configured to control the first light source device 505 and the second light source device 506 to alternately output the first light beam and the second light beam in a time-sharing manner. The timing control unit 504 is also used for controlling the pixel component 502 to alternately display (load) images with different parallaxes in a time-sharing manner. For example, during a first period of time, the timing control unit 504 is used to control the pixel assembly 502 to display an image for the left eye. In the second period, the timing control unit 504 is used to control the first light source device 505 to output the first light beam. The pixel assembly 502 modulates the first light beam with an image of the left eye to obtain a first imaged light path. In the third period, the timing control unit 504 is used to control the pixel assembly 502 to display an image for the right eye. In the fourth period, the timing control unit 504 is configured to control the second light source device 506 to output the second light beam. The pixel assembly 502 modulates the second beam with the image of the right eye to obtain a second imaged light path. The first period, the second period, the third period, and the fourth period are alternately distributed.
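The four alternating periods driven by the timing control unit 504 can be sketched as a repeating schedule. This is a hypothetical illustration; the device names and action strings are placeholders, not an API from the application.

```python
from itertools import cycle

# Alternating four-period schedule: the timing controller interleaves
# "load image" steps on the pixel assembly with "emit beam" steps on
# the two light-source devices, so left and right images are shown in
# a time-sharing manner.
SCHEDULE = cycle([
    ("pixel_assembly", "load left-eye image"),   # first period
    ("light_source_1", "emit first beam"),       # second period
    ("pixel_assembly", "load right-eye image"),  # third period
    ("light_source_2", "emit second beam"),      # fourth period
])

def next_steps(n: int):
    """Return the next n (device, action) steps of the schedule."""
    return [next(SCHEDULE) for _ in range(n)]
```

Because the schedule is a cycle, the first, second, third, and fourth periods repeat indefinitely, matching the alternating distribution described above.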
The first and second paths of imaging light have a certain directivity and divergence angle, and likewise the first and second light beams have a certain directivity and divergence angle. However, in practical applications, the divergence angle of the first and second light beams may be too large or too small. Accordingly, the image generation component 101 may further include a beam control unit 503 between the first light source assembly 501 and the pixel assembly 502. The beam control unit 503 may be a Fresnel screen, a lenticular lens, a lens array, or the like. The beam control unit 503 is configured to change the divergence angle of the first beam and/or the second beam, thereby improving the light utilization of the first light source assembly 501 and the brightness of the generated imaging light, and in turn the brightness of the stereoscopic display device.
In the previous examples of fig. 5 or 6, the pixel assembly 502 obtains different imaging light by modulating different light beams. In practice, the pixel assembly 502 may obtain different imaging lights by modulating the same light beam. Fig. 7a is a schematic diagram of a third structure of an image generating assembly according to an embodiment of the present application. As shown in fig. 7a, the image generation assembly 101 comprises a second light source assembly 701, a pixel assembly 502 and a lens array 702. The second light source assembly 701 may be an LED light source or an LD light source, etc. The second light source component 701 is configured to output a third light beam to the pixel component 502. In fig. 7a, the solid line connected to the second light source assembly 701 represents the third light beam. The pixel component 502 is configured to modulate the third light beam according to different image information, and generate a first path of imaging light and a second path of imaging light output from different directions. The first path of imaging light and the second path of imaging light have certain directivity and divergence angle.
In fig. 7a, the dash-dot line connected to the pixel assembly 502 represents the first path of imaging light. The solid line connected to pixel assembly 502 represents the second path of imaging light. The pixel component 502 may include a left-eye pixel for displaying a left-eye image and a right-eye pixel for displaying a right-eye image. The left eye pixels are modulated to emit a first path of imaging light, and the right eye pixels are modulated to emit a second path of imaging light. The imaging light emitted by the pixel component 502 is input to the lens array 702, and the lens array 702 is used for transmitting the first path of imaging light and the second path of imaging light at different angles, so that the first path of imaging light and the second path of imaging light output by the lens array 702 have different output (transmission) directions, and further the first path of imaging light and the second path of imaging light are respectively transmitted to the left eye and the right eye of a person. In fig. 7a, the dot-dash line connected to the lens array 702 represents the first path of imaging light. The solid line connected to lens array 702 represents the second path of imaging light.
As can be seen from the foregoing description of fig. 1, the image plane of the image generation component 101 is at a distance d from the curved mirror 102. The image plane of the image generation component 101 may be a pixel assembly or a diffusion screen. For example, in fig. 7a, the light beam output by the second light source component 701 does not carry image information, so the image plane of the image generation component 101 is the pixel assembly 502. For another example, fig. 7b is a schematic diagram of a fourth structure of an image generation component according to an embodiment of the present application. As shown in fig. 7b, the image generation component 101 includes a projector 703, a diffusion screen 704, and a lens array 702. The projector 703 outputs a third light beam carrying image information. The diffusion screen 704 is a pixelated device used to enlarge the divergence angle of the third light beam output from the projector 703. The third beam can carry image information with different parallaxes in a time-sharing manner, so the diffusion screen 704 can output two paths of imaging light carrying image information with different parallaxes. The lens array 702 is used to transmit the two paths of imaging light at different angles. The two paths comprise a first path of imaging light and a second path of imaging light. In fig. 7b, the dash-dot line connected to the lens array 702 represents the first path of imaging light, and the solid line represents the second path. In fig. 7b, the image plane of the image generation component 101 is the diffusion screen 704.
Fig. 8 is a schematic structural diagram of a pixel assembly and a lens array according to an embodiment of the present application. As shown in fig. 8, the pixel assembly 502 includes N pixel groups 801, where N is an integer greater than 0. Each pixel group 801 includes one first pixel and one second pixel. The first pixel is used to modulate the third light beam and output a first sub-imaging light; the second pixel is used to modulate the third light beam and output a second sub-imaging light. The first and second sub-imaging lights have a certain directivity and divergence angle. In fig. 8, the dash-dot line connected to the first pixel indicates the first sub-imaging light, and the solid line connected to the second pixel indicates the second sub-imaging light. The lens array 702 includes N lenses 802. Each lens 802 is configured to transmit a first sub-imaging light and a second sub-imaging light, outputting them in different directions. In fig. 8, the dash-dot line connected to the lens 802 indicates the first sub-imaging light, and the solid line indicates the second sub-imaging light. The N pixel groups 801 are in one-to-one correspondence with the N lenses 802. The N lenses 802 output N first sub-imaging lights and N second sub-imaging lights. The N first sub-imaging lights converge to form the first path of imaging light, and the N second sub-imaging lights converge to form the second path of imaging light.
In the field of display technology, the imaging light received by the left and right eyes of a user comes from different positions on the curved mirror or lens, whether in 2D display or stereoscopic display. Fig. 9a is a schematic diagram of a third structure of the stereoscopic display device according to the embodiment of the present application. Fig. 9b is a schematic diagram of a fourth structure of the stereoscopic display device according to the embodiment of the present application. As shown in fig. 9a and 9b, the stereoscopic display apparatus 100 includes an image generation component 101 and a curved mirror 102. For the description of the stereoscopic display apparatus 100, reference may be made to the related description of fig. 1. The first of the two paths of imaging light is reflected to the left eye through point A of the curved mirror 102, and the second is reflected to the right eye through point B. In practice, different process errors exist at different positions of the curved mirror 102. Process errors can cause the magnification and imaging position of the image observed by the user to deviate from their ideal values. For example, in fig. 9a and 9b, the two virtual images observed by the user's two eyes are at different positions. Such display differences can cause physiological discomfort such as dizziness, reducing the user experience. Therefore, in the embodiment of the present application, image information of different parallaxes can be preprocessed, and the display difference compensated for by the preprocessing, thereby enhancing the display effect. For example, the stereoscopic display device may process the left-eye image and/or the right-eye image loaded by the pixel assembly 502 by one or more of:
1. The whole or part of the left eye image or the right eye image is translated.
2. The left eye image or the right eye image is enlarged or reduced in whole or in part.
3. The whole or part of the left eye image or the right eye image is distorted.
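The first two operations above, whole-image translation and scaling, can be sketched with NumPy. This is a minimal illustration under assumed conventions (zero fill for vacated columns, nearest-neighbour resampling); it is not the application's actual preprocessing pipeline.

```python
import numpy as np

def translate(img: np.ndarray, dx: int) -> np.ndarray:
    """Shift the whole image dx pixels to the right (negative = left),
    filling the vacated columns with zeros (operation 1)."""
    out = np.zeros_like(img)
    if dx >= 0:
        out[:, dx:] = img[:, : img.shape[1] - dx]
    else:
        out[:, :dx] = img[:, -dx:]
    return out

def scale(img: np.ndarray, factor: float) -> np.ndarray:
    """Nearest-neighbour resize of the whole image (operation 2)."""
    h, w = img.shape[:2]
    rows = (np.arange(int(h * factor)) / factor).astype(int)
    cols = (np.arange(int(w * factor)) / factor).astype(int)
    return img[rows][:, cols]
```

Partial translation or distortion (operation 3) would apply similar index remapping to a sub-region or through a per-pixel warp map rather than to the whole array.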
Fig. 10a is a schematic diagram of a fifth structure of the stereoscopic display device according to the embodiment of the present application. Fig. 10b is a schematic diagram of a sixth structure of the stereoscopic display device according to the embodiment of the present application. As shown in fig. 10a and 10b, on the basis of fig. 1, the stereoscopic display apparatus 100 includes a processor 1001 and an image generation component 101. The processor 1001 may be a central processing unit (central processing unit, CPU), a network processor (network processor, NP), or a combination of a CPU and an NP. The processor 1001 may also be a hardware chip or another general-purpose processor. The hardware chip may be an application specific integrated circuit (application specific integrated circuit, ASIC), a programmable logic device (programmable logic device, PLD), or a combination thereof.
The processor 1001 is configured to obtain third image information and preprocess it to obtain first image information. For example, the processor 1001 obtains first coordinate information of a first position, which may be the position of the user's left eye. The processor 1001 obtains a mapping table that records the correspondence between coordinate information and correction parameters, searches the mapping table for the first correction parameter corresponding to the first coordinate information, and preprocesses the third image information according to the first correction parameter to obtain the first image information. The first correction parameter may be, for example, a shift of 2 pixels to the left. For fig. 9a, the processor 1001 may control the first path of imaging light to shift to the right, resulting in the stereoscopic display device shown in fig. 10a. Before the shift, the image generation component 101 generates the first path of imaging light through pixel 1; after the shift, it generates the first path through pixel 2, whose position is that of pixel 1 shifted 2 pixels to the right. Alternatively, the processor 1001 may control the second path of imaging light to shift to the left. Similarly, for fig. 9b, the processor 1001 may control the first path of imaging light to shift to the left, resulting in the stereoscopic display device shown in fig. 10b, or may control the second path of imaging light to shift to the right.
The processor 1001 may further be configured to acquire fourth image information and preprocess it to obtain second image information. For example, the processor 1001 obtains second coordinate information of a second position, which may be the position of the user's right eye. The processor 1001 searches the mapping table for the second correction parameter corresponding to the second coordinate information and preprocesses the fourth image information according to the second correction parameter to obtain the second image information. The second correction parameter may be, for example, an overall reduction of 5%.
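The mapping-table lookup can be sketched as a dictionary keyed by quantised eye coordinates. Everything here is a hypothetical illustration: the cell size, table entries, and parameter names are assumptions, and a real table would come from per-device calibration of the curved mirror.

```python
# Hypothetical mapping table: coordinate cells of the eye-position
# space map to correction parameters (shift in pixels, overall scale).
CORRECTION_TABLE = {
    (0, 0): {"shift_px": -2, "scale": 1.00},  # e.g. "shift 2 pixels left"
    (0, 1): {"shift_px": 0, "scale": 0.95},   # e.g. "reduce by 5% overall"
}

def correction_for(eye_xy_mm, cell_mm: float = 50.0):
    """Quantise an eye position (in mm) to a table cell and return its
    correction parameters; fall back to no correction if unmapped."""
    key = (int(eye_xy_mm[0] // cell_mm), int(eye_xy_mm[1] // cell_mm))
    return CORRECTION_TABLE.get(key, {"shift_px": 0, "scale": 1.0})
```

The fallback entry means an untracked or out-of-range eye position simply leaves the image untouched rather than applying a stale correction.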
The pixel component 502 in the image generation component 101 can include a display circuit and a display panel. The display circuit may also be referred to as a display controller (display controller, DC) having a display control function. The display circuit is configured to receive the first image information and the second image information output from the processor 1001. The display circuit is also used for controlling the display panel to display the first image and the second image according to the first image information and the second image information. Wherein the first image information corresponds to the first image. The second image information corresponds to a second image. The functions of the timing control unit 504 described above may be implemented by a display circuit.
In fig. 5, the first light source assembly 501 outputs the first light beam and the second light beam in a time-sharing manner. Before the processor 1001 preprocesses the image information of different parallaxes, the pixel assembly 502 may therefore modulate the first and second light beams with the same pixel to obtain the two paths of imaging light corresponding to the left eye and the right eye. Fig. 11 is a seventh schematic structural diagram of a stereoscopic display device according to an embodiment of the present application. As shown in fig. 11, the first light source assembly 501 outputs the first light beam and the second light beam in a time-sharing manner. Specifically, LED1 in the first light source assembly 501 generates the first light beam, and LED2 generates the second light beam. The first and second light beams reach the pixel assembly 502 after passing through the beam control unit 503. The pixel assembly 502 modulates the first or second light beam through pixel 1 in a time-sharing manner to obtain the two paths of imaging light; that is, the pixel assembly 502 uses the same pixel to display the left-eye image and the right-eye image. After the processor 1001 preprocesses the image information of different parallaxes, the pixel assembly 502 can modulate the first and second light beams with different pixels to obtain the two paths of imaging light; that is, the pixel assembly 502 uses different pixels to display the left-eye image and the right-eye image. Fig. 12 is an eighth schematic structural diagram of the stereoscopic display device according to the embodiment of the present application. As shown in fig. 12, on the basis of fig. 11, the first light source assembly 501 outputs the first light beam and the second light beam in a time-sharing manner. For example, LED3 in the first light source assembly 501 generates the first light beam.
LED2 in the first light source assembly 501 generates the second light beam. The first and second light beams pass through the beam control unit 503 to reach the pixel assembly 502. The pixel assembly 502 modulates the first and second light beams through different pixels in a time-sharing manner to obtain the two paths of imaging light. The pixel assembly 502 modulates the first light beam through pixel 2 to obtain the first path of imaging light, which is reflected by the curved mirror 102 and enters the left eye. The pixel assembly 502 modulates the second light beam through pixel 1 to obtain the second path of imaging light, which enters the right eye after being reflected by the curved mirror 102. The two virtual images observed by the user's two eyes are then located at the same position, dizziness does not occur, and the stereoscopic display effect is improved. It should be understood that the description of fig. 12 is only one example. In practice, after the displayed image is preprocessed, the pixel assembly 502 may change the pixels used on both sides at the same time. Specifically, the pixel assembly 502 modulates the first light beam with pixel 2 to obtain the first path of imaging light, and modulates the second light beam with pixel 3 to obtain the second path of imaging light.
It is understood that, in the embodiment of the present application, the pixel 1, the pixel 2, the first pixel, the second pixel, or the like may refer to one pixel point, or may refer to a set of a plurality of pixel points. The present application is not limited in this regard. Similarly, LED1 or LED2 may refer to one LED or a collection of LEDs. Similarly, the first light source device or the second light source device may also refer to one LED, or may refer to a collection of LEDs.
As can be seen from the foregoing description, this embodiment can compensate for display errors through preprocessing by the processor 1001. In practical applications, to avoid unnecessary processing by the processor 1001, the preprocessing may be performed only when the display error is greater than a threshold. The following is one example of calculating a display error. Fig. 13 is a schematic view of a third optical path projection of the stereoscopic display device according to the embodiment of the present application. As shown in fig. 13, two pixels of the image generation component 101 output the two paths of imaging light, which include a first path and a second path. The two pixels are pixel 1 and pixel 2. The coordinates of pixel 1 are (X_left, Y_left), and the coordinates of pixel 2 are (X_origin, Y_origin). The curved mirror 102 reflects the first path of imaging light output by pixel 1; the reflected first path irradiates the user's left eye, whose coordinates are (x_left, y_left). The curved mirror 102 also reflects the second path of imaging light output by pixel 2; the reflected second path irradiates the user's right eye, whose coordinates are (x_right, y_right).
In an ideal state where there is no display error, the virtual images corresponding to the pixel 1 and the pixel 2 are on the same virtual image point on the virtual image plane 1301. The coordinates of the virtual image point are (x_v, y_v). When there is a display error, the virtual image corresponding to the pixel 1 is on the virtual image point 1 of the virtual image plane 1301. The coordinates of the virtual image point 1 are (x_v1, y_v1). The virtual image corresponding to the pixel 2 is on the virtual image point 2 of the virtual image plane 1301. The coordinates of the virtual image point 2 are (x_v2, y_v2).
At this time, the display error is calculated as Δ = |(x_v1, y_v1) − (x_v2, y_v2)|, that is, the distance between the two virtual image points. Pixel 1 and pixel 2 form a pair of sampling points, displaying the left-eye image and the right-eye image respectively.
If the average of the display errors Δ over a plurality of pairs of sampling points within a certain area is greater than the threshold, this indicates that the processor 1001 needs to perform preprocessing. For example, the threshold may be tan(2.5 mrad) × S, where S is the distance between the user's eyes and the curved mirror 102.
Through the preprocessing described above, the pixels included in a pair of sampling points can be changed. For example, after preprocessing, a pair of sampling points includes pixel 1 and pixel 3. The virtual image of pixel 3 is projected on virtual image point 3 of the virtual image plane 1301, whose coordinates are (x_v3, y_v3). By the same method as above, the display error Δ = |(x_v1, y_v1) − (x_v3, y_v3)| is calculated. If the display error Δ is less than the threshold, the display error has been corrected to within an acceptable range, and the processor 1001 does not need to preprocess the image information further. If the display error Δ is still greater than or equal to the threshold after preprocessing, the processor 1001 may perform further preprocessing until Δ is less than the threshold.
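The error check described above can be sketched in a few lines. This assumes Δ is the Euclidean distance between the paired virtual image points and uses the tan(2.5 mrad) × S threshold from the text; function names and sample coordinates are illustrative.

```python
import math

def display_error(p1, p2) -> float:
    """Δ between two virtual-image points (x, y) on the virtual image
    plane, taken here as their Euclidean distance."""
    return math.hypot(p1[0] - p2[0], p1[1] - p2[1])

def needs_preprocessing(pairs, S_mm: float,
                        threshold_rad: float = 2.5e-3) -> bool:
    """True when the mean Δ over the sampled left/right virtual-image
    pairs exceeds the angular threshold tan(2.5 mrad) × S."""
    mean_delta = sum(display_error(a, b) for a, b in pairs) / len(pairs)
    return mean_delta > math.tan(threshold_rad) * S_mm

# At S = 1000 mm the threshold is ≈ 2.5 mm: a 5 mm mean offset between
# paired virtual image points triggers preprocessing, a 1 mm one does not.
```

Iterating until `needs_preprocessing` returns False mirrors the repeat-until-below-threshold loop described in the text.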
Referring to fig. 14, fig. 14 is a schematic circuit diagram of a stereoscopic display device according to an embodiment of the present application.
As shown in fig. 14, the circuits in the display device mainly include a processor 1001, an internal memory 1002, an external memory interface 1003, an audio module 1004, a video module 1005, a power module 1006, a wireless communication module 1007, an I/O interface 1008, a video interface 1009, a controller area network (Controller Area Network, CAN) transceiver 1010, a display circuit 1028, a display panel 1029, and the like. The processor 1001 and its peripheral elements, such as the internal memory 1002, the CAN transceiver 1010, the audio module 1004, the video module 1005, the power module 1006, the wireless communication module 1007, the I/O interface 1008, the video interface 1009, the touch unit, and the display circuit 1028, may be connected through a bus. The processor 1001 may be referred to as a front-end processor.
In addition, the circuit diagram illustrated in the embodiment of the present application does not constitute a specific limitation of the display device. In other embodiments of the present application, the display device may include more or less components than illustrated, or certain components may be combined, or certain components may be split, or different arrangements of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor 1001 includes one or more processing units, for example: the processor 1001 may include an application processor (Application Processor, AP), a modem processor, a graphics processor (Graphics Processing Unit, GPU), an image signal processor (Image Signal Processor, ISP), a video codec, a digital signal processor (Digital Signal Processor, DSP), a baseband processor, and/or a Neural network processor (Neural-Network Processing Unit, NPU), etc. Wherein the different processing units may be separate devices or may be integrated in one or more processors.
A memory may also be provided in the processor 1001 for storing instructions and data, for example, an operating system of the display device, an AR Creator software package, and the like. In some embodiments, the memory in the processor 1001 is a cache. The cache may hold instructions or data that the processor 1001 has just used or uses cyclically. If the processor 1001 needs to use the instructions or data again, it can fetch them directly from the cache. This avoids repeated accesses and reduces the waiting time of the processor 1001, thereby improving system efficiency.
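The caching behavior described above (keep recently used instructions or data close to the processor so a slow trip to main memory is avoided) can be illustrated with a minimal Python sketch. The class and names below are hypothetical, for illustration only, and are not the patent's implementation:

```python
class SimpleCache:
    """Minimal illustration of the processor cache described above:
    recently used items are kept in a fast local store so repeated
    accesses do not go back to main memory."""

    def __init__(self):
        self._store = {}
        self.hits = 0
        self.misses = 0

    def fetch(self, address, load_from_memory):
        if address in self._store:   # cache hit: no memory access needed
            self.hits += 1
            return self._store[address]
        self.misses += 1             # cache miss: load once, keep a copy
        value = load_from_memory(address)
        self._store[address] = value
        return value


cache = SimpleCache()
memory = {0x10: "instruction_a", 0x20: "data_b"}  # stand-in for main memory
cache.fetch(0x10, memory.__getitem__)
cache.fetch(0x10, memory.__getitem__)  # second access is served from the cache
```

The second `fetch` of the same address never touches `memory`, which is the latency saving the paragraph describes.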
In addition, if the display device in the present embodiment is mounted on a vehicle, the functions of the processor 1001 may be implemented by a domain controller on the vehicle.
In some embodiments, the display device may also include a plurality of input/output (Input/Output, I/O) interfaces 1008 connected to the processor 1001. The I/O interface 1008 may include, but is not limited to, an inter-integrated circuit (Inter-Integrated Circuit, I2C) interface, an inter-integrated circuit sound (Inter-Integrated Circuit Sound, I2S) interface, a pulse code modulation (Pulse Code Modulation, PCM) interface, a universal asynchronous receiver/transmitter (Universal Asynchronous Receiver/Transmitter, UART) interface, a mobile industry processor interface (Mobile Industry Processor Interface, MIPI), a general-purpose input/output (General-Purpose Input/Output, GPIO) interface, a subscriber identity module (Subscriber Identity Module, SIM) interface, and/or a universal serial bus (Universal Serial Bus, USB) interface, among others. The I/O interface 1008 may be connected to a mouse, a touch screen, a keyboard, a camera, a speaker, or a microphone, or may be connected to physical keys on the display device (e.g., a volume key, a brightness adjustment key, a power key, etc.).
The internal memory 1002 may be used to store computer-executable program code, which includes instructions. The internal memory 1002 may include a program storage area and a data storage area. The program storage area may store an operating system, an application program required by at least one function (such as a call function, a time setting function, an AR function, etc.), and the like. The data storage area may store data created during use of the display device (e.g., a phone book, world time, etc.). In addition, the internal memory 1002 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, a universal flash storage (Universal Flash Storage, UFS), etc. The processor 1001 performs the various functional applications and data processing of the display device by running instructions stored in the internal memory 1002 and/or instructions stored in the memory provided in the processor 1001.
The external memory interface 1003 may be used to connect an external memory (for example, a Micro SD card). The external memory may store data or program instructions as needed, and the processor 1001 may read and write such data, or execute such program instructions, through the external memory interface 1003.
The audio module 1004 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal. The audio module 1004 may also be used to encode and decode audio signals, such as for playback or recording. In some embodiments, the audio module 1004 may be provided in the processor 1001, or a part of functional modules of the audio module 1004 may be provided in the processor 1001. The display device may implement audio functions through an audio module 1004, an application processor, and the like.
The video interface 1009 may receive externally input audio and video, and may specifically be a high-definition multimedia interface (High Definition Multimedia Interface, HDMI), a digital visual interface (Digital Visual Interface, DVI), a video graphics array (Video Graphics Array, VGA), a display port (Display Port, DP), a low-voltage differential signaling (Low Voltage Differential Signaling, LVDS) interface, or the like. The video interface 1009 may also output video. For example, the display device receives, through the video interface, video data sent by the navigation system or video data sent by the domain controller.
The video module 1005 may decode the video input through the video interface 1009, for example, perform H.264 decoding. The video module may also encode video collected by the display device, for example, perform H.264 encoding on video collected by an external camera. In addition, the processor 1001 may decode the video input through the video interface 1009 and then output the decoded image signal to the display circuit.
Further, the display device includes a CAN transceiver 1010, and the CAN transceiver 1010 may be connected to the CAN bus (CAN BUS) of the automobile. Through the CAN bus, the display device can communicate with the in-vehicle entertainment system (music, radio, video module), the vehicle status system, and the like. For example, the user may turn on the in-vehicle music playing function by operating the display device. The vehicle status system may send vehicle status information (doors, seat belts, etc.) to the display device for display.
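Vehicle-status information of the kind mentioned above travels over the CAN bus as short fixed-size frames. The sketch below decodes a hypothetical status frame; the message ID, payload layout, and speed scaling are invented for illustration (a real vehicle defines these in the manufacturer's own message database), not taken from the patent:

```python
import struct

# Hypothetical CAN message (illustrative only).
VEHICLE_STATUS_ID = 0x3A0

def parse_vehicle_status(can_id, payload):
    """Decode an assumed 8-byte vehicle-status frame into fields the
    display device could show: door state, seat-belt state, speed."""
    if can_id != VEHICLE_STATUS_ID or len(payload) != 8:
        return None  # not our message, or malformed frame
    # Assumed layout: 1 byte doors, 1 byte belts, 2 bytes speed, 4 bytes padding.
    doors, belts, speed_raw = struct.unpack(">BBH4x", payload)
    return {
        "doors_open": bool(doors),
        "belts_fastened": bool(belts),
        "speed_kmh": speed_raw * 0.1,  # assumed 0.1 km/h resolution
    }


# 0x01F4 = 500 raw units, i.e. 50.0 km/h under the assumed scaling.
status = parse_vehicle_status(0x3A0, bytes([0, 1, 0x01, 0xF4, 0, 0, 0, 0]))
```

The display device would render the returned fields (for example the seat-belt reminder) as described in the paragraph above.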
The display circuit 1028 and the display panel 1029 jointly implement the function of displaying an image. The display circuit 1028 receives the image signal output by the processor 1001, processes the image signal, and inputs the processed image signal to the display panel 1029 to form an image. The display circuit 1028 can also control the image displayed on the display panel 1029, for example, control parameters such as display brightness or contrast. The display circuit 1028 may include a driving circuit, an image control circuit, and the like. The display circuit 1028 and the display panel 1029 may be located in the pixel assembly 502.
The display panel 1029 is used to modulate the light beam input from the light source according to the input image signal, thereby generating a visible image. The display panel 1029 may be a liquid crystal on silicon panel, a liquid crystal display panel, or a digital micromirror device.
In this embodiment, the video interface 1009 may receive input video data (also referred to as a video source), the video module 1005 decodes and/or digitizes it and then outputs an image signal to the display circuit 1028, and the display circuit 1028 drives the display panel 1029 to image the light beam emitted by the light source according to the input image signal, so as to generate a visible image (that is, to emit imaging light).
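The data flow just described (video interface, then video module decoding, then display circuit driving the panel) can be sketched as a chain of stages. All function names below, and the string stand-in for H.264 decoding, are illustrative assumptions rather than the patent's implementation:

```python
# Sketch of the pipeline: video interface -> video module -> display circuit.
def video_interface(source):
    # receives the video source, e.g. an encoded stream
    return {"encoded": source}

def video_module(stream):
    # stand-in for decoding: strips an assumed "h264:" prefix
    return {"image_signal": stream["encoded"].replace("h264:", "")}

def display_circuit(image_signal, brightness=1.0):
    # drives the panel with the image signal and display parameters
    # (brightness/contrast control as described above)
    return {"pixels": image_signal["image_signal"], "brightness": brightness}


frame = display_circuit(video_module(video_interface("h264:frame0")))
```

Each stage consumes only the previous stage's output, mirroring the one-directional signal path in the paragraph.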
The power module 1006 is configured to provide power to the processor 1001 and the light source based on input power (e.g., direct current), and the power module 1006 may include a rechargeable battery that may provide power to the processor 1001 and the light source. Light emitted from the light source may be transmitted to the display panel 1029 for imaging, thereby forming an image light signal (imaging light).
In addition, the power module 1006 may be connected to a power module (e.g., a power battery) of the vehicle, and the power module 1006 of the display device is powered by the power module of the vehicle.
The wireless communication module 1007 may enable the display device to communicate wirelessly with the outside world, and may provide wireless communication solutions such as wireless local area network (Wireless Local Area Networks, WLAN) (e.g., a wireless fidelity (Wireless Fidelity, Wi-Fi) network), Bluetooth (Bluetooth, BT), global navigation satellite system (Global Navigation Satellite System, GNSS), frequency modulation (Frequency Modulation, FM), near field communication (Near Field Communication, NFC), and infrared (Infrared, IR). The wireless communication module 1007 may be one or more devices integrating at least one communication processing module. The wireless communication module 1007 receives an electromagnetic wave via an antenna, performs frequency modulation and filtering on the electromagnetic wave signal, and sends the processed signal to the processor 1001. The wireless communication module 1007 may also receive a signal to be transmitted from the processor 1001, frequency-modulate and amplify it, and convert it into an electromagnetic wave for radiation via the antenna.
In addition to the video data input through the video interface 1009, the video data decoded by the video module 1005 may be received wirelessly by the wireless communication module 1007, or read from the internal memory 1002 or the external memory. For example, the display device may receive video data from a terminal device or an in-vehicle entertainment system through a wireless LAN in the vehicle, or read audio/video data stored in the internal memory 1002 or the external memory.
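The possible video sources listed above can be modeled as a simple fallback chain. The priority order and the function below are assumptions for illustration, since the text lists the sources but does not mandate an order:

```python
def select_video_source(video_interface=None, wireless=None,
                        internal_memory=None, external_memory=None):
    """Pick the first available video source in an assumed priority order:
    video interface, wireless module, internal memory, external memory."""
    for name, data in (("video_interface", video_interface),
                       ("wireless", wireless),
                       ("internal_memory", internal_memory),
                       ("external_memory", external_memory)):
        if data is not None:
            return name, data
    return None, None  # no source available


# e.g. only a wirelessly received clip is available
source = select_video_source(wireless="clip_from_terminal_device")
```

Whichever source is chosen, its data passes through the same decode-and-display pipeline described earlier.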
An embodiment of the present application further provides a vehicle on which any one of the foregoing stereoscopic display devices is mounted. The two paths of imaging light carry image information of different parallaxes. The two paths of imaging light are reflected to the windshield by the reflecting mirror, and the windshield further reflects the two paths of imaging light to form a virtual image. The virtual image is located on one side of the windshield, and the driver or passenger is located on the other side. The two reflected paths of imaging light irradiate the two eyes of the driver or passenger respectively; for example, the first path of imaging light irradiates the left eye of the passenger, and the second path of imaging light irradiates the right eye of the passenger.
An embodiment of the present application further provides a vehicle on which any one of the foregoing stereoscopic display devices is mounted. Fig. 15 is a schematic diagram of a stereoscopic display device according to an embodiment of the present application installed in a vehicle. The windshield of the vehicle may act as the curved mirror or the lens in the stereoscopic display device. When the windshield acts as the curved mirror, the image generation assembly 101 and the driver or passenger are located on the same side of the windshield; when the windshield acts as the lens, they are located on different sides of the windshield. The image generation assembly 101 is configured to output two paths of imaging light carrying image information of different parallaxes. The windshield reflects or transmits the two paths of imaging light to form a virtual image. The virtual image is located on one side of the windshield, and the driver or passenger is located on the other side. The reflected or transmitted two paths of imaging light irradiate the two eyes of the driver or passenger respectively; for example, the first path of imaging light irradiates the left eye of the passenger, and the second path of imaging light irradiates the right eye of the passenger.
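For the two paths to land on different eyes, claim 3 of this application requires E-w < tan(γ1) × (S+D) < E+2w, where E is in the range 53 mm to 73 mm (interpupillary distance), S is the distance from the receiving position to the curved mirror, D the virtual-image distance, and w the beam width at the receiving position. The numeric values below are illustrative only, not taken from the patent:

```python
import math

def separates_eyes(gamma1_deg, s_mm, d_mm, e_mm, w_mm):
    """Check the claim-3 condition E - w < tan(g1) * (S + D) < E + 2w:
    the lateral separation of the two imaging-light paths at the viewing
    position must roughly match the interpupillary distance E, so that
    each path lands on a different eye.  All lengths in millimetres."""
    separation = math.tan(math.radians(gamma1_deg)) * (s_mm + d_mm)
    return e_mm - w_mm < separation < e_mm + 2 * w_mm


# Illustrative numbers: viewing distance S = 800 mm, virtual-image
# distance D = 2000 mm, E = 63 mm, beam width at the eye w = 20 mm.
ok = separates_eyes(gamma1_deg=1.3, s_mm=800, d_mm=2000, e_mm=63, w_mm=20)
```

With these numbers the separation is about 63.5 mm, inside the allowed band, so an included angle of roughly 1.3 degrees would satisfy the condition; a much smaller angle would put both paths on the same eye.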
By way of example, the vehicle may be a car, truck, motorcycle, bus, boat, airplane, helicopter, lawn mower, recreational vehicle, amusement park vehicle, construction equipment, electric vehicle, golf cart, train, trolley, or the like; this is not specifically limited in the embodiments of the present application. The stereoscopic display device may be installed on the instrument panel (Instrument Panel, IP) of the vehicle, at the front passenger position or the driver position, or on the back of a seat. When applied to a vehicle, the stereoscopic display device may be referred to as a head-up display (Head Up Display, HUD), and may be used to display navigation information, vehicle speed, battery level/fuel level, and the like.
Fig. 16 is a schematic diagram of one possible functional framework of a vehicle provided in an embodiment of the present application.
As shown in fig. 16, the functional framework of the vehicle may include various subsystems, such as the control system 14, the sensor system 12, one or more peripheral devices 16 (one is shown in the figure), the power supply 18, the computer system 20, and the display system 32. Optionally, the vehicle may also include other functional systems, such as an engine system that powers the vehicle, which is not limited herein.
The sensor system 12 may include a plurality of sensing devices that sense the measured information and convert it, according to a certain rule, into an electrical signal or another required form of output. As shown in the figure, these sensing devices may include, without limitation, a global positioning system (global positioning system, GPS), a vehicle speed sensor, an inertial measurement unit (inertial measurement unit, IMU), a radar unit, a laser rangefinder, an imaging device, a wheel speed sensor, a steering sensor, a gear sensor, or other elements for automatic detection.
The control system 14 may include several elements, such as the steering unit, braking unit, lighting system, autonomous driving system, map navigation system, network timing system, and obstacle avoidance system shown in the figure. Optionally, the control system 14 may also include elements such as a throttle controller and an engine controller for controlling the speed of the vehicle; this is not limited in the present application.
The peripheral devices 16 may include several elements, such as the communication system in the figure, a touch screen, a user interface, a microphone, and a speaker. The communication system is used to implement network communication between the vehicle and devices other than the vehicle. In practical applications, the communication system may employ wireless or wired communication technology to implement such network communication; wired communication may mean that the vehicle communicates with other devices through a network cable, an optical fiber, or the like.
The power source 18 represents a system that provides power or energy to the vehicle, which may include, but is not limited to, a rechargeable lithium battery or lead acid battery, or the like. In practical applications, one or more battery packs in the power supply are used to provide electrical energy or power for vehicle start-up, and the type and materials of the power supply are not limited in this application.
Several functions of the vehicle are performed under the control of the computer system 20. The computer system 20 may include one or more processors 2001 (one is shown in the figure) and a memory 2002 (which may also be referred to as a storage device). In practical applications, the memory 2002 may be inside the computer system 20, or outside the computer system 20, for example, as a cache in the vehicle; this is not limited in the present application.
For a description of the processor 2001, reference may be made to the foregoing description of the processor 1001. The processor 2001 may include one or more general-purpose processors, for example, a graphics processing unit (graphic processing unit, GPU). The processor 2001 may be used to run related programs, or instructions corresponding to the programs, stored in the memory 2002 to implement the corresponding functions of the vehicle.
The memory 2002 may include volatile memory, such as RAM; it may also include non-volatile memory, such as ROM, flash memory, or a solid state drive (solid state drives, SSD); the memory 2002 may also include a combination of the above types of memory. The memory 2002 may be used to store a set of program codes, or instructions corresponding to the program codes, so that the processor 2001 invokes them to implement the corresponding functions of the vehicle, including, but not limited to, some or all of the functions in the vehicle functional framework shown in fig. 16. In the present application, the memory 2002 may store a set of program codes for vehicle control, and the processor 2001 invokes the program codes to control the safe driving of the vehicle; how safe driving is achieved is not described in detail here.
Optionally, in addition to program codes or instructions, the memory 2002 may also store information such as road maps, driving routes, and sensor data. The computer system 20 may implement the relevant functions of the vehicle in combination with other elements in the functional framework of the vehicle, such as the sensors or the GPS in the sensor system. For example, the computer system 20 may control the travel direction or travel speed of the vehicle based on data input from the sensor system 12; this is not limited in the present application.
The display system 32 may include several elements, such as a controller and the stereoscopic display device 100 described above. The controller is configured to generate an image according to a user instruction (e.g., an image containing vehicle states such as vehicle speed and battery level/fuel level, and an image of augmented reality (AR) content) and send the image content to the stereoscopic display device 100. The image generation assembly 101 in the stereoscopic display device 100 is configured to output two paths of imaging light carrying different image information. The curved mirror 102 in the stereoscopic display device 100 is the windshield, which reflects or transmits the two paths of imaging light so that a virtual image corresponding to the image content is presented in front of the driver or passenger. It should be noted that the functions of some elements of the display system 32 may be implemented by other subsystems of the vehicle; for example, the controller may be an element in the control system 14.
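The controller's role just described (assemble vehicle-state and AR content, then hand two parallax views to the display device) can be sketched as follows; the data structure and the pixel offset used to represent parallax are hypothetical, chosen only to make the idea concrete:

```python
def build_display_content(speed_kmh, battery_pct, ar_overlay=None):
    """Assemble the content the controller sends to the stereoscopic
    display device: the same information rendered as two views with
    different parallax (here mimicked by a small horizontal offset)."""
    base = {"speed": f"{speed_kmh} km/h", "battery": f"{battery_pct} %"}
    if ar_overlay:
        base["ar"] = ar_overlay
    # Two paths of imaging light carry views of the same content with
    # different parallax; an assumed +/-2 px offset stands in for that.
    return {"left_view": dict(base, offset_px=-2),
            "right_view": dict(base, offset_px=+2)}


content = build_display_content(60, 80, ar_overlay="lane guidance")
```

The image generation assembly would then modulate each view onto its own path of imaging light, as described for the device above.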
The functional framework shown in fig. 16, which includes four subsystems (the sensor system 12, the control system 14, the computer system 20, and the display system 32), is only an example and is not limiting. In practical applications, the vehicle may combine several of its elements according to different functions to obtain subsystems with correspondingly different functions. In practice, the vehicle may include more or fewer systems or elements, and the present application is not limited thereto.
In the description of the present specification, a particular feature, structure, material, or characteristic may be combined in any suitable manner in one or more embodiments or examples.
The foregoing is merely specific embodiments of the present application, but the scope of the present application is not limited thereto, and any person skilled in the art can easily think about changes or substitutions within the technical scope of the present application, and the changes or substitutions are intended to be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (15)

1. A stereoscopic display device comprising an image generation assembly and a curved mirror, wherein:
the image generation component is used for generating two paths of imaging light which carry image information with different parallaxes;
The curved mirror is used for reflecting the two paths of imaging light, an included angle exists between the two paths of imaging light after reflection, the focal length of the curved mirror is f, the distance between the image surface of the image generating component and the curved mirror is d, the image surface of the image generating component is a diffusion screen, and d is smaller than f.
2. The stereoscopic display apparatus according to claim 1, wherein the distance between the virtual image formed by the reflected two imaging light paths and the curved mirror is D, and D satisfies the following formula:
D = f × d/(f-d)
3. The stereoscopic display apparatus according to claim 2, wherein an included angle γ1 exists between the two reflected paths of imaging light, and γ1 satisfies the following formula: E-w < tan(γ1) × (S+D) < E+2w, wherein S is the distance between the receiving position of the two paths of imaging light and the curved mirror, the value range of E is 53 mm to 73 mm, and w is the width, at the receiving position, of at least one of the two paths of imaging light reflected by the curved mirror.
4. The stereoscopic display apparatus according to claim 2, wherein an included angle γ2 exists between the two paths of imaging light before being reflected by the curved mirror, and γ2 satisfies the following formula:
Figure FDA0003837266630000012
Figure FDA0003837266630000013
The distance between the receiving position of the two paths of imaging light and the curved mirror is S, the value range of E is 53-73 mm, and w is the width of at least one path of imaging light in the two paths of imaging light reflected by the curved mirror at the receiving position.
5. A stereoscopic display apparatus according to claim 3, wherein a divergence angle of each of the two paths of imaging light before being reflected by the curved mirror is α, and the w satisfies the following formula:
Figure FDA0003837266630000014
wherein w is less than 73 mm.
6. The stereoscopic display apparatus according to any one of claims 1 to 5, wherein the image generating assembly comprises a first light source assembly and a pixel assembly;
the first light source component is used for outputting first light beams and second light beams with different emergent directions to the pixel component in a time-sharing mode;
the pixel component is used for modulating the first light beam and the second light beam respectively according to different image information to generate the two paths of imaging light.
7. The stereoscopic display apparatus according to claim 6, wherein the first light source assembly includes a first light source device and a second light source device for alternately outputting the first light beam and the second light beam in a time-sharing manner.
8. The stereoscopic display device according to claim 7, wherein the image generating assembly further comprises a timing control unit;
the time sequence control unit is used for controlling the first light source device and the second light source device to alternately output the first light beam and the second light beam in a time-sharing way;
the timing control unit is further configured to control the pixel assembly to modulate the first light beam and the second light beam using different image information time-sharing.
9. The stereoscopic display apparatus according to any one of claims 1 to 5, wherein the two paths of imaging light include a first path of imaging light and a second path of imaging light, and the image generation assembly includes a second light source assembly, a pixel assembly, and a lens array;
the second light source component is used for outputting a third light beam to the pixel component;
the pixel component is used for modulating the third light beam by using different image information to generate a first path of imaging light and a second path of imaging light;
the lens array is used for transmitting the first path of imaging light and the second path of imaging light at different angles.
10. The stereoscopic display device of claim 9, wherein the pixel assembly comprises a first pixel and a second pixel;
The first pixel is used for modulating the third light beam according to first image information to generate the first path of imaging light;
the second pixel is used for modulating the third light beam according to second image information to generate the second path of imaging light.
11. The stereoscopic display apparatus according to any one of claims 1 to 10, wherein the image information of different parallaxes includes first image information and second image information;
the stereoscopic display device further includes a processor;
the processor is used for preprocessing third image information to obtain the first image information;
the processor is further used for preprocessing fourth image information to obtain second image information.
12. The stereoscopic display apparatus according to claim 11, wherein,
the processor is further used for acquiring first coordinate information of a first position and second coordinate information of a second position, one of the two imaging light beams irradiates the first position, and the other imaging light beam irradiates the second position;
the processor is configured to pre-process the third image information including: the processor is used for preprocessing third image information according to the first coordinate information;
The processor is configured to pre-process the fourth image information, including: the processor is used for preprocessing fourth image information according to the second coordinate information.
13. The stereoscopic display apparatus according to any one of claims 1 to 12, wherein f is less than 300 mm.
14. The stereoscopic display apparatus according to any one of claims 1 to 13, wherein the image generating assembly comprises a projector and the diffusion screen, the projector is configured to generate the two paths of imaging light, and the diffusion screen is configured to receive and diffuse the two paths of imaging light and output the diffused two paths of imaging light;
the curved mirror is used for reflecting the two paths of imaging light and comprises: the curved mirror is used for reflecting the two paths of diffused imaging light.
15. A vehicle, wherein the stereoscopic display device according to any one of claims 1 to 14 is mounted on the vehicle.
CN202211091089.8A 2022-05-10 2022-05-10 Stereoscopic display device and vehicle Pending CN116184686A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211091089.8A CN116184686A (en) 2022-05-10 2022-05-10 Stereoscopic display device and vehicle

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202211091089.8A CN116184686A (en) 2022-05-10 2022-05-10 Stereoscopic display device and vehicle
CN202210505264.7A CN117075359A (en) 2022-05-10 2022-05-10 Stereoscopic display device and vehicle

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
CN202210505264.7A Division CN117075359A (en) 2022-05-10 2022-05-10 Stereoscopic display device and vehicle

Publications (1)

Publication Number Publication Date
CN116184686A true CN116184686A (en) 2023-05-30

Family

ID=86469458

Family Applications (2)

Application Number Title Priority Date Filing Date
CN202211091089.8A Pending CN116184686A (en) 2022-05-10 2022-05-10 Stereoscopic display device and vehicle
CN202210505264.7A Pending CN117075359A (en) 2022-05-10 2022-05-10 Stereoscopic display device and vehicle

Family Applications After (1)

Application Number Title Priority Date Filing Date
CN202210505264.7A Pending CN117075359A (en) 2022-05-10 2022-05-10 Stereoscopic display device and vehicle

Country Status (2)

Country Link
CN (2) CN116184686A (en)
WO (1) WO2023216670A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117192779A (en) * 2023-09-20 2023-12-08 江苏泽景汽车电子股份有限公司 Head-up display

Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20130035587A (en) * 2011-09-30 2013-04-09 엘지디스플레이 주식회사 Display apparatus for displaying three dimensional picture and manufacturing method for the same
CN104137538A (en) * 2011-12-23 2014-11-05 韩国科学技术研究院 Device for displaying multi-view 3d image using dynamic visual field expansion applicable to multiple observers and method for same
CN104536578A (en) * 2015-01-13 2015-04-22 京东方科技集团股份有限公司 Control method and device for naked eye 3D display device and naked eye 3D display device
CN105025289A (en) * 2015-08-10 2015-11-04 重庆卓美华视光电有限公司 Stereo display method and device
CN105404011A (en) * 2015-12-24 2016-03-16 深圳点石创新科技有限公司 3D image correction method of head up display and head up display
CN108663807A (en) * 2017-03-31 2018-10-16 宁波舜宇车载光学技术有限公司 Head-up display optical system and device and its imaging method
CN109462750A (en) * 2018-12-29 2019-03-12 上海玮舟微电子科技有限公司 A kind of head-up-display system, information display method, device and medium
CN110874867A (en) * 2018-09-03 2020-03-10 广东虚拟现实科技有限公司 Display method, display device, terminal equipment and storage medium
US20200150431A1 (en) * 2017-07-07 2020-05-14 Kyocera Corporation Image projection apparatus and mobile body
CN112752085A (en) * 2020-12-29 2021-05-04 北京邮电大学 Naked eye 3D video playing system and method based on human eye tracking
CN113661432A (en) * 2019-06-26 2021-11-16 Jvc建伍株式会社 Head-up display device
CN114153066A (en) * 2020-09-08 2022-03-08 未来(北京)黑科技有限公司 Head-up display device and head-up display system

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7114146B2 (en) * 2018-06-21 2022-08-08 創智車電股▲ふん▼有限公司 DISPLAY DEVICE AND AUTOMOBILE HEAD-UP DISPLAY SYSTEM USING THE SAME
JP2021021914A (en) * 2019-07-30 2021-02-18 怡利電子工業股▲ふん▼有限公司 Naked eye 3d reflection type diffusion piece head-up display device
CN112526748A (en) * 2019-09-02 2021-03-19 未来(北京)黑科技有限公司 Head-up display device, imaging system and vehicle
JP7358909B2 (en) * 2019-10-28 2023-10-11 日本精機株式会社 Stereoscopic display device and head-up display device
CN114137725A (en) * 2020-09-04 2022-03-04 未来(北京)黑科技有限公司 Head-up display system capable of displaying three-dimensional image
CN213457538U (en) * 2020-09-08 2021-06-15 未来(北京)黑科技有限公司 Head-up display device and head-up display system
WO2022088159A1 (en) * 2020-10-31 2022-05-05 华为技术有限公司 Head-up display and head-up display method

Also Published As

Publication number Publication date
CN117075359A (en) 2023-11-17
WO2023216670A1 (en) 2023-11-16

Similar Documents

Publication Publication Date Title
CN114466761B (en) Head-up display and image display system
WO2021065617A1 (en) Vehicular display system and vehicle
WO2023216670A1 (en) Three-dimensional display apparatus and vehicle
JP2023175794A (en) Head-up display
WO2024021852A1 (en) Stereoscopic display apparatus, stereoscopic display system, and vehicle
WO2024021574A1 (en) 3d projection system, projection system, and vehicle
WO2021015171A1 (en) Head-up display
US20240036311A1 (en) Head-up display
US20230152586A1 (en) Image generation device and head-up display
CN115586647A (en) Image generation device, display equipment and vehicle
CN116500784A (en) Display device and vehicle
WO2020218072A1 (en) Vehicular head-up display and light source unit used therefor
WO2024098828A1 (en) Projection system, projection method, and transportation means
CN114503010B (en) Head-up display
US20240069335A1 (en) Head-up display
WO2023185293A1 (en) Image generation apparatus, display device, and vehicle
WO2023130759A1 (en) Display device and vehicle
WO2023040669A1 (en) Head-up display device and vehicle
WO2024065332A1 (en) Display module, optical display system, terminal device and image display method
WO2024041034A1 (en) Display module, optical display system, terminal device, and imaging method
CN118859549A (en) Projection lens, image generating device, display device and vehicle
CN117331271A (en) Projection device, display equipment and vehicle
CN116203726A (en) Display device, electronic apparatus, and vehicle
CN117492224A (en) Image generation device, display equipment, vehicle and image generation method
CN117369127A (en) Virtual image display device, image data generation method and device and related equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination