WO2024065332A1 - Display module, optical display system, terminal device and image display method
- Publication number: WO2024065332A1 (application PCT/CN2022/122330)
- Authority
- WO
- WIPO (PCT)
Classifications
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B30/00—Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images
Definitions
- the present application relates to the field of display technology, and in particular to a display module, an optical display system, a terminal device and an image display method.
- A transparent display does not obstruct the user's view of the ambient scene while still presenting image information.
- Transparent displays are mainly divided into active light-emitting types and projection types.
- An example of the active light-emitting type is organic light-emitting diode (OLED) technology.
- An example of the projection type is adding scattering particles or fluorescent powder to a windshield or window glass. Such a windshield emits light and produces an image under irradiation by a light source. By controlling the concentration of the scattering particles, or by making the scattering particles wavelength-selective, the windshield remains highly transparent to ambient light, thereby realizing transparent display.
- Three-dimensional (3D) display gives observers a strong sense of stereoscopic realism, increases the display depth, and enriches the displayed content.
- In one known approach, the display device emits left-eye and right-eye image light with different polarization states.
- The left and right lenses of matching glasses each admit only image light of the corresponding polarization state. Therefore, the two eyes observe different images, and 3D vision is formed through visual fusion in the brain.
- the present application provides a display module, an optical display system, a terminal device and an image display method for realizing 3D transparent display.
- the present application provides a display module, which includes a scattering layer and a reflective layer.
- the scattering layer is used to receive different image lights from a picture generation unit (PGU) and scatter the different image lights to corresponding positions of the reflective layer, wherein the image lights carry image information.
- the reflective layer is used to reflect the image lights from corresponding positions of the scattering layer to corresponding viewpoints and transmit ambient light.
- the different image lights carry different image information; furthermore, the polarization states of the different image lights may be the same or different.
- the reflective layer includes an array of concave cylindrical reflectors; the surface of each concave cylindrical reflector close to the scattering layer is a concave surface, and the concave surface is covered with a reflective film.
- an image formed based on the image light from the scattering layer can have parallax and three-dimensional effect.
- the projection of the concave surface close to the scattering layer on at least a first plane is an arc, wherein the first plane is, for example, a horizontal plane.
- the image formed by the image light can have parallax and stereoscopic effect in the first direction parallel to the first plane.
- the reflectivity of the reflective film is greater than a first reflectivity threshold and less than a second reflectivity threshold.
- the reflective film can both reflect the image light from the scattering layer and transmit the ambient light, thereby achieving 3D transparent display.
- the reflectivity of the reflective film is related to the requirement for the transmittance of the reflective film.
- the application scenario requirements of the display module can thus be met. For example, when the display module is arranged on a windshield, the transmittance of the windshield needs to be greater than 70%, so the transmittance of the reflective film needs to be greater than 70%; accordingly, the reflectivity of the reflective film is less than 30% and greater than 0.
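The transmittance/reflectivity trade-off above can be sketched with an ideal lossless film, where transmittance and reflectivity sum to 1 (an assumption that ignores absorption; the 70% figure is the windshield example from the text):

```python
def reflectivity_bounds(min_transmittance: float) -> tuple[float, float]:
    """Return the (lower, upper) reflectivity bounds for an ideal
    lossless reflective film, where transmittance + reflectivity = 1.

    The film must still reflect some image light (reflectivity > 0)
    while meeting the transmittance requirement of the scenario.
    """
    if not 0.0 < min_transmittance < 1.0:
        raise ValueError("min_transmittance must be in (0, 1)")
    # The upper bound follows directly from the transmittance requirement.
    return (0.0, 1.0 - min_transmittance)

# Windshield scenario: transmittance must exceed 70%, so the
# reflectivity must lie strictly between 0 and 30%.
lo, hi = reflectivity_bounds(0.70)
```

A real film also absorbs some light, so the achievable reflectivity range is narrower than this idealized bound.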
- the reflective film includes a nanometal film or a dielectric film.
- the concave surface is filled with a filler, and the refractive index of the filler and the refractive index of the concave cylindrical reflector meet a preset error requirement.
- the refractive index of the filler is the same as the refractive index of the concave cylindrical reflector.
- the material of the filler is the same as that of the concave cylindrical reflector.
- the display module can be made not to affect the propagation direction of the ambient light, thereby achieving better transparent display.
- the concave cylindrical reflector array is a one-dimensional array
- one concave cylindrical reflector in the concave cylindrical reflector array corresponds to N columns of regions of the scattering layer, the N columns of regions are used to receive different image lights, and N is an integer greater than 1.
- the display module can produce a 3D transparent display with a stronger stereoscopic effect.
- the concave cylindrical reflector array is a two-dimensional array, and one concave cylindrical reflector in the concave cylindrical reflector array corresponds to one area of the scattering layer.
- the image formed by the image light has parallax and stereoscopic effect in the first direction, and also has parallax and stereoscopic effect in the second direction.
- the reflective layer includes a holographic reflective medium layer. Further, the holographic reflective medium layer is manufactured by holographic exposure.
- the thickness of the display module can be reduced, thereby contributing to making the display module lighter and thinner.
- the incident angle of the image light scattered to the corresponding position of the reflective layer entering the scattering layer belongs to a preset angle range.
- By setting the scattering layer to have angle selectivity, the scattering layer scatters only light that falls within a preset angle range and is transparent to ambient light entering from other directions. In this way, the display module can achieve 3D transparent display.
- the scattering layer is obtained by holographic exposure.
- the reference laser during exposure has the same field of view and incident angle as the image light emitted by the image generation unit, so that only the image light with the same field of view and incident angle can excite the scattering layer.
- the ambient light is directly transmitted because it does not meet the angle selection conditions of the scattering layer.
- the display module based on this scattering layer has high transparency.
- the present application provides an optical display system, which includes an image generating unit and the above-mentioned first aspect or any one of the display modules in the first aspect; the image generating unit is used to emit different image lights.
- polarization states of different image lights emitted by the image generating unit are the same.
- the image generating unit includes, but is not limited to, a projector.
- the present application provides a terminal device, which includes the above-mentioned second aspect or any one of the optical display systems in the second aspect, and the optical display system is installed on the terminal device.
- the present application provides an image display method, comprising: obtaining the coordinates of a first viewpoint and/or a second viewpoint; determining K eye reliefs according to the coordinates of the first viewpoint and/or the second viewpoint and the coordinates of K target positions in a display area of the windshield, the K target positions corresponding one-to-one to the K rows of an image projected onto the display area, wherein the parallax between the first viewpoint and the second viewpoint, the eye relief and the virtual image distance satisfy a corresponding relationship; and controlling the display of the image by adjusting the parallax of the K rows of the image.
- the virtual image distance of each row of the displayed image can be made the same by correcting the parallax of each row, thereby achieving a non-tilted (upright) display of the image.
- ΔP_i is the parallax between the first viewpoint and the second viewpoint in the i-th row
- T is the distance between the first viewpoint and the second viewpoint
- VID is the virtual image distance
- ER_i is the eye relief of the i-th row.
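The "corresponding relationship" among parallax, eye relief and virtual image distance is not reproduced in this text, so the sketch below uses a common similar-triangles model as an assumption, not the patent's exact formula: a point at depth VID viewed by two eyes separated by T projects onto the display plane at eye relief ER_i with parallax ΔP_i = T·(VID − ER_i)/VID. All numeric values are hypothetical.

```python
def row_parallax(T: float, VID: float, eye_reliefs: list[float]) -> list[float]:
    """Per-row parallax for a virtual image at distance VID.

    Illustrative similar-triangles model (an assumption): a point at
    depth VID seen by two eyes separated by T projects onto the display
    plane at eye relief ER_i with parallax T * (VID - ER_i) / VID.
    Correcting each of the K rows this way keeps the virtual image
    distance uniform, so the image appears upright rather than tilted.
    """
    return [T * (VID - er) / VID for er in eye_reliefs]

# Hypothetical numbers: 65 mm eye separation, 7.5 m virtual image
# distance, eye relief varying row by row because the windshield is
# inclined (all distances in meters).
parallaxes = row_parallax(T=0.065, VID=7.5, eye_reliefs=[0.9, 1.0, 1.1])
```

Rows whose target position sits farther from the eyes (larger ER_i) need slightly less parallax, which is exactly the row-by-row correction the method describes.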
- the present application provides a chip, which includes at least one processor and an interface circuit. Further, optionally, the chip may also include a memory, and the processor is used to execute computer programs or instructions stored in the memory, so that the chip executes the method in the above-mentioned fourth aspect or any possible implementation of the fourth aspect.
- the present application provides a computer-readable storage medium, in which a computer program or instruction is stored.
- the control device executes the method in the above-mentioned fourth aspect or any possible implementation of the fourth aspect.
- the present application provides a computer program product, which includes a computer program or instructions.
- the control device executes the method in the above-mentioned fourth aspect or any possible implementation of the fourth aspect.
- FIG. 1a is a schematic diagram of a display module provided by the present application applied to a front windshield of a vehicle;
- FIG. 1b is a schematic diagram of a display module provided by the present application applied to a vehicle window;
- FIG. 1c is a schematic diagram of a display module provided by the present application applied to a skylight;
- FIG. 1d is a schematic diagram of a display module provided by the present application applied to a NED device;
- FIG. 1e is a schematic diagram of a display module provided by the present application applied to a vehicle-mounted display screen;
- FIG. 1f is a schematic diagram of a display module provided by the present application applied to a display;
- FIG. 2 is a schematic diagram of the structure of a display module provided by the present application;
- FIG. 3a is a schematic diagram of the relationship between ambient light and image light provided by the present application;
- FIG. 3b is another schematic diagram of the relationship between ambient light and image light provided by the present application;
- FIG. 4a is a schematic diagram of the principle of manufacturing a scattering layer by holographic exposure provided by the present application;
- FIG. 4b is a schematic diagram of the principle of scattering layer reconstruction provided by the present application;
- FIG. 5a is a schematic diagram of the structure of a scattering layer provided by the present application;
- FIG. 5b is a schematic diagram of the structure of another scattering layer provided by the present application;
- FIG. 6 is a schematic diagram of the structure of a concave cylindrical reflector array provided by the present application;
- FIG. 7a is a schematic diagram of the projection of a concave cylindrical reflector array on a first plane provided by the present application;
- FIG. 7b is a schematic diagram of the projection of a concave surface on a first plane provided by the present application;
- FIG. 8 is a schematic diagram of the projection of a concave cylindrical reflector array on a second plane provided by the present application;
- FIG. 9 is a schematic diagram of the correspondence between a concave cylindrical reflector array and regions of the scattering layer provided by the present application;
- FIG. 10 is a schematic diagram of the correspondence between another concave cylindrical reflector array and regions of the scattering layer provided by the present application;
- FIG. 11 is a schematic diagram of the structure of another concave cylindrical reflector array provided by the present application;
- FIG. 12a is a schematic diagram of a holographic reflective medium layer obtained by a lens array provided by the present application;
- FIG. 12b is a schematic diagram of a holographic reflective medium layer obtained by a reflector array provided by the present application;
- FIG. 13 is a light path diagram of a display module provided by the present application;
- FIG. 14 is a schematic diagram of an optical display system architecture provided by the present application;
- FIG. 15 is a flowchart of an image display method provided by the present application;
- FIG. 16a is a schematic diagram of the positional relationship among eye relief, virtual image distance and parallax provided by the present application;
- FIG. 16b is a schematic diagram of the relationship between parallax and the position of a display module provided by the present application;
- FIG. 17 is a circuit diagram of an optical display system provided by the present application;
- FIG. 18 is an exemplary functional block diagram of a vehicle provided by the present application.
- Scattering refers to the phenomenon that part of light deviates from its original direction when it passes through a medium.
- the light that deviates from its original direction is called scattered light.
- the first step of holographic technology uses the principle of interference to record the object's light-wave information (the shooting process): the photographed object forms a diffuse object beam under laser irradiation, while another part of the laser serves as a reference beam that is directed onto the holographic film, where it overlaps with the object beam and produces interference.
- Interference refers to the phenomenon that two or more waves overlap or cancel each other when they meet in space to form a new waveform.
- the phase and amplitude of each point on the object's light wave are converted into spatially varying intensities, so that all the information of the object's light wave can be recorded by using the contrast and interval between the interference fringes.
- the film recording the interference fringes becomes a holographic layer (or holographic photo) after development, fixing and other processing.
- the second step is to use the principle of diffraction to reproduce the object's light wave information, which is the imaging process.
- the viewpoint refers to the position where the image is observed. Specifically, the viewpoint can be two or more positions in the eye box.
- the eye box usually refers to the range where the driver's eyes can see the entire displayed image, which can be combined with the following Figure 1a.
- a typical eye box size is 130 mm × 50 mm. If the driver's eyes are within the eye box, they can see a complete and clear image; if the driver's eyes are beyond the eye box, they may see a distorted image or incorrect colors, or even be unable to see the image.
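The eye-box condition above reduces to a rectangle containment test. In the sketch below, the 130 mm × 50 mm default matches the typical size quoted, while the eye-box centre coordinates are a hypothetical reference point:

```python
def in_eye_box(eye_xy: tuple[float, float],
               center_xy: tuple[float, float] = (0.0, 0.0),
               width_mm: float = 130.0, height_mm: float = 50.0) -> bool:
    """Check whether an eye position lies inside a rectangular eye box.

    The eye sees the complete image only while it stays within
    half the box width horizontally and half the box height
    vertically of the eye-box centre.
    """
    dx = abs(eye_xy[0] - center_xy[0])
    dy = abs(eye_xy[1] - center_xy[1])
    return dx <= width_mm / 2 and dy <= height_mm / 2

# An eye 40 mm right and 10 mm above centre still sees the full image;
# 80 mm to the side falls outside the 130 mm-wide box.
```

In practice this test would be fed by the eye-tracking (viewpoint-coordinate) stage of the image display method.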
- virtual image distance (VID): the distance between the viewpoint and the center of the image.
- Parallax refers to the difference in direction when observing the same object from two viewpoints at a certain distance (T).
- the display module provided in the present application can be arranged on the front windshield of the vehicle (with an inclination angle of about 30° to 60°) and used in combination with a head-up display device (HUD), see FIG1a.
- the HUD can project the formed image (called the HUD virtual image) into the driver's front field of view and integrate it with the real road information, thereby enhancing the driver's perception of the actual driving environment.
- the HUD can superimpose a HUD virtual image carrying navigation information (such as direction arrows, distance, and/or driving time) and/or vehicle status information (such as driving speed, mileage, engine speed, temperature, fuel level, and/or headlight status) on the real environment outside the vehicle, together with driving-assistance content (such as safe following distance, surrounding obstacles, and/or reversing images), so that the driver obtains an augmented-reality visual effect.
- the vehicle can realize functions such as augmented reality (AR) navigation, adaptive cruise, and lane departure warning.
- the function of the advanced driving assistant system (ADAS) can be combined to realize assisted driving or intelligent driving of the vehicle.
- the virtual image distance of the HUD virtual image formed based on the vehicle's status information is about 2 to 3 meters.
- the virtual image distance of the HUD virtual image formed based on the navigation information is about 7 to 15 meters.
- HUDs include, but are not limited to, the windshield HUD (W-HUD) and the AR-HUD.
- the display module provided by the present application can also be set on a vehicle window (such as a side window or a rear window) and used in conjunction with an image generation unit, which can project image light onto the display module of the vehicle window, see FIG1b.
- the image generation unit can include, for example, but is not limited to, a projector.
- the display module provided by the present application can also be set on the skylight, and used in conjunction with the image generation unit, and the image generation unit can project image light onto the display module of the skylight, see Figure 1c.
- the image generation unit can include, for example, but is not limited to, a projector.
- the display module provided by the present application can also be integrated into a near-eye display (NED) device.
- the NED device can be, for example, an augmented reality (AR) device or a virtual reality (VR) device.
- the AR device can include but is not limited to AR glasses or AR helmets
- the VR device can include but is not limited to VR glasses or VR helmets.
- FIG. 1d takes AR glasses as an example: users can wear AR glasses to play games, watch videos, participate in virtual meetings, shop by video, and so on.
- the display module provided by the present application can also be integrated into a vehicle display screen. Please refer to FIG1e.
- the vehicle display screen can be installed on the back of a vehicle seat or the co-pilot position, etc.
- the present application does not limit the location of the vehicle display screen installation.
- FIG1e is an example of installation on the back of a seat.
- the display module provided in the present application can also be integrated into a display as a desktop display, please refer to FIG. 1f.
- the present application provides a display module, which can realize 3D transparent display.
- the display module proposed in this application is described in detail below with reference to FIGS. 2 to 13 .
- the display module includes a scattering layer and a reflecting layer.
- the scattering layer is used to receive different image lights from the image generation unit, and scatter the different image lights to the corresponding positions of the reflecting layer.
- the image light carries image information, and the image information carried by different image lights can be the same or different, and the polarization states of different image lights can be the same or different.
- the reflecting layer is used to reflect the image light from the corresponding position of the scattering layer to the corresponding viewpoint, and transmit the ambient light.
- the reflecting layer is used to reflect the first image light from the scattering layer to the first viewpoint, and the second image light to the second viewpoint.
- FIG2 is an example of the image generation unit emitting the first image light and the second image light, and the first image light converges at the first viewpoint after passing through the scattering layer and the reflecting layer, and the second image light converges at the second viewpoint after passing through the scattering layer and the reflecting layer.
- the ambient light refers to light other than the image light.
- the propagation direction of the ambient light is opposite to the propagation direction of the image light, see FIG3a.
- the propagation direction of the ambient light is consistent with the propagation direction of the image light, see FIG3b. It should be understood that the ambient light shown in FIG3a and FIG3b is only an example, and the propagation direction of the ambient light in the present application may also be other possible directions, which are not limited thereto.
- the image generating unit may include, but is not limited to, a projector; a projector is taken as an example in the description below.
- the scattering layer has angle selectivity, and only light incident at a specific incident angle can be scattered, and it is transparent to ambient light incident from other directions.
- the scattering layer is used to scatter image light whose incident angle meets a preset angle range. In other words, the incident angle of the image light scattered to the corresponding position of the reflective layer entering the scattering layer belongs to the preset angle range.
- the scattering layer can be made by holographic exposure; see FIG. 4a: a first laser beam passes through a conventional diffuser (such as frosted glass) and serves as the object beam, and a reference laser beam, directed through a semi-transparent, semi-reflective mirror, interferes with the object beam on the holographic recording layer.
- the field of view angle (or light cone angle) of the reference laser beam is the same as the field of view angle of the image light emitted by the projector, and the incident angle of the reference laser beam entering the scattering layer is equal to the incident angle of the image light emitted from the projector entering the scattering layer.
- the obtained holographic recording layer can be used as the scattering layer of the display module.
- assume the field angle (light cone angle) of the image light emitted by the projector is θ and the incident angle of the central image light entering the scattering layer is α;
- during exposure, the field angle of the reference laser beam is controlled to be θ, and the incident angle of the central ray of the reference laser beam entering the holographic recording layer is set to α by adjusting the semi-transparent, semi-reflective mirror.
- when the scattering layer is prepared by holographic exposure in this way, the reference laser beam has the same light cone angle and incident angle as the image light from the projector, so only image light with that light cone angle and incident angle can excite the scattering layer, while ambient light is transmitted directly because it does not meet the angle selection conditions.
- the scattering layer has higher transparency.
- the above-mentioned method of making a scattering layer by holographic exposure is only an example.
- the scattering layer in the present application can also be made by other possible methods, for example, the scattering layer can be obtained by doping scattering particles, combining a micro-nano structure with a phase compensation layer, etc. The present application does not limit this.
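The angle selectivity described above can be modeled as a simple angular gate. In the sketch below, `tolerance_deg` is a hypothetical acceptance half-width; a real holographic layer has a gradual angular response rather than a hard cut-off:

```python
def scattering_response(incident_deg: float,
                        recorded_deg: float,
                        tolerance_deg: float) -> str:
    """Idealised angle-selective scattering layer.

    A holographically recorded layer responds only to light arriving
    near the reference beam's incident angle; light from any other
    direction passes straight through.
    """
    if abs(incident_deg - recorded_deg) <= tolerance_deg:
        return "scattered"      # image light within the preset angle range
    return "transmitted"        # ambient light from other directions

# Image light arriving at the recorded angle (e.g. 45 deg) is scattered
# toward the reflective layer; ambient light near normal incidence is
# simply transmitted, which is what keeps the display transparent.
```

This is the mechanism that lets the same layer both form an image from the projector and stay clear to the outside scene.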
- FIG5a is a schematic diagram of the structure of a scattering layer provided in the present application.
- the image light from the projector includes a first image light and a second image light.
- the scattering layer includes a first area and a second area. The first area and the second area are divided based on the received image light, and are represented by different fillings in FIG5a.
- the first area is used to scatter the first image light
- the second area is used to scatter the second image light.
- one first image light corresponds to one first area
- one second image light corresponds to one second area.
- the image light from the projector includes the first image light, the second image light, the third image light and the fourth image light.
- the scattering layer includes the first area, the second area, the third area and the fourth area as an example. Among them, the first area, the second area, the third area and the fourth area are divided based on the received image light, and are represented by different fillings in Figure 5b.
- a first image light corresponds to a first area
- a second image light corresponds to a second area
- a third image light corresponds to a third area
- a fourth image light corresponds to a fourth area.
- the first area is used to scatter the first image light
- the second area is used to scatter the second image light
- the third area is used to scatter the third image light
- the fourth area is used to scatter the fourth image light.
- the scattering layers shown in Figures 5a and 5b can be made based on the holographic exposure method shown in Figure 4a, or can also be made based on doping scattering particles or micro-nano structures, etc., and this application does not limit this. Generally, the sizes of the various regions included in the scattering layer can be the same.
- the reflective layer is used to reflect image light from a corresponding position of the scattering layer to a corresponding viewpoint and transmit ambient light.
- the concave cylindrical reflector array can be manufactured by injection molding or nano-imprinting, etc. Two possible reflective layer structures are shown below as examples.
- the reflective layer includes an array of concave cylindrical reflective elements.
- depending on whether the concave cylindrical reflector array is a one-dimensional array or a two-dimensional array, the following two cases can be distinguished.
- Case 1: the concave cylindrical reflector array is a one-dimensional concave cylindrical reflector array.
- FIG 6 is a schematic diagram of the structure of a concave cylindrical reflector array provided by the present application.
- the concave cylindrical reflector array is a one-dimensional concave cylindrical reflector array.
- the surface of the concave cylindrical reflector in the concave cylindrical reflector array close to the scattering layer is a concave surface.
- the projection of the concave surface on the first plane (xoy plane) is an arc, please refer to Figure 7a.
- the arc height or coordinates (x, y) can be expressed by the following formula 1.
- c is the curvature
- the arc height or x-coordinate can be expressed by the following formula 2, that is, a polynomial term can be added to the basis of formula 1:
- the arc includes, but is not limited to, a parabola, a circular arc, an elliptical arc, a hyperbolic arc or another possible curve; for example, the edge of the arc may include serrations (see FIG. 7b).
- the projection of the concave surface on the first plane can be a regular arc or an irregular arc, and the present application does not limit this.
- the aperture (caliber) of the arc is usually between 100 micrometers (μm) and 10 millimeters (mm).
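Formulas 1 and 2 are referenced above but not reproduced in this text. As a hedged illustration only, the sketch below uses the standard conic-plus-polynomial arc profile (an assumption, not the patent's formula), with curvature c, conic constant k, and optional polynomial coefficients:

```python
import math

def arc_sag(x, c, k=0.0, poly=()):
    """Sag y(x) of a conic arc cross section with optional polynomial terms.

    Assumed stand-in for formulas 1 and 2 (not reproduced in the text):
        y = c*x^2 / (1 + sqrt(1 - (1 + k)*c^2*x^2)) + sum_i a_i * x^(i+1)
    where c is the curvature, k is the conic constant (k = 0 gives a
    circular arc), and poly holds coefficients a_1, a_2, ...
    """
    conic = c * x**2 / (1.0 + math.sqrt(1.0 - (1.0 + k) * c**2 * x**2))
    return conic + sum(a * x**(i + 1) for i, a in enumerate(poly))

# Circular arc (k = 0) with a 1 mm radius of curvature, sampled at a point
# inside the 100 um to 10 mm aperture range mentioned above:
sag = arc_sag(50e-6, c=1.0 / 1.0e-3)
```

With k = 0 this reduces to an exact circular arc; parabolic, elliptical and hyperbolic arcs follow from other values of k, matching the arc shapes listed above.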
- the concave cylindrical reflectors in the one-dimensional concave cylindrical reflector array are columnar structures in the second plane (xoz plane) (e.g., vertical plane); see FIG. 8.
- the second plane is perpendicular to the first plane, the first plane can be a horizontal plane, and the second plane is a vertical plane.
- the direction parallel to the first plane is called the first direction, and the direction parallel to the second plane is called the second direction.
- the image formed by the image light has parallax and stereoscopic effect in the first direction, but no stereoscopic effect in the second direction.
- one concave cylindrical reflector in the one-dimensional concave cylindrical reflector array corresponds to at least two columns of regions of the scattering layer. It can be understood that the columns of regions of the scattering layer corresponding to one concave cylindrical reflector are related to the image light emitted by the projector. Please refer to Figure 9.
- the projector emits the first image light and the second image light.
- One concave cylindrical reflector corresponds to a column of first regions and a column of second regions of the scattering layer.
- the scattering layer in this example can refer to the introduction of Figure 4a above.
- the first region and the second region of the scattering layer are at different positions relative to the concave surface of the concave cylindrical reflector.
- the first image light passing through the first region of the scattering layer and the second image light passing through the second region of the scattering layer will be scattered to the corresponding position of the concave surface of the concave cylindrical reflector.
- the first image light is reflected to the first viewpoint and the second image light is reflected to the second viewpoint through the corresponding position of the concave surface of the concave cylindrical reflector.
- a concave cylindrical reflector corresponds to a first area, a second area, a third area and a fourth area of the scattering layer
- the first area, the second area, the third area and the fourth area of the scattering layer are at different positions relative to the concave surface of the concave cylindrical reflector
- the first image light passing through the first area of the scattering layer, the second image light passing through the second area of the scattering layer, the third image light passing through the third area of the scattering layer, and the fourth image light passing through the fourth area of the scattering layer will be scattered to the corresponding position of the concave surface of the concave cylindrical reflector
- the scattering layer in this example can refer to the introduction of FIG.
- the display module can produce a 3D transparent display with a stronger stereoscopic sense.
- f represents the focal length of the concave cylindrical reflector, which is equal to the distance from the scattering layer to the concave cylindrical reflector array
- Wp represents the length of the first region or the second region of the scattering layer in the first plane (i.e., the xoy plane)
- p represents the aperture of the concave cylindrical reflector
- L represents the distance from the first viewpoint and the second viewpoint to the display module
- T represents the spacing between the first viewpoint and the second viewpoint. It can be understood that the first viewpoint and the second viewpoint can be the positions of the left and right eyes of the observer.
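The exact relation among f, Wp, p, L and T is not reproduced in this text. Under that caveat, a simple similar-triangles sketch relates the scattering-layer region width to the viewpoint spacing (an assumption for illustration, not the patent's formula):

```python
def region_width(f, L, T):
    """Estimate the width Wp of one scattering-layer region.

    Similar-triangles sketch (assumed, since the exact formula is not
    reproduced in the text): a reflector of focal length f maps a region
    of width Wp onto a viewpoint zone of width T at distance L, giving
    Wp / f ~ T / L.
    """
    return f * T / L

# Example values (hypothetical): f = 5 mm, viewer at L = 2 m,
# viewpoint spacing T = 65 mm:
Wp = region_width(f=5e-3, L=2.0, T=65e-3)  # about 0.16 mm
```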
- Case 2: the concave cylindrical reflector array is a two-dimensional concave cylindrical reflector array.
- FIG. 11 is a schematic diagram of the structure of another concave cylindrical reflector array provided by the present application.
- the concave cylindrical reflector array is a two-dimensional concave cylindrical reflector array.
- in the two-dimensional concave cylindrical reflector array, the projection of the concave surface of the concave cylindrical reflector (the surface near the scattering layer) on the first plane is an arc, and its projection on the second plane is also an arc.
- the concave surface can be a paraboloid, a sphere, an ellipsoid, a hyperboloid, etc.
- the sag (sagittal height) or x-coordinate of the concave surface can be expressed by the following formula 5.
- the sag or coordinates (x, y, z) of the concave surface can be expressed by the following formula 6; that is, polynomial terms may be added on the basis of formula 5:
- N is the order of the polynomial (i.e., the highest power).
- the concave surface may be, for example, a spherical surface, an ellipsoidal surface, a hyperboloidal surface, or another possible surface shape, which is not limited in the present application.
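Formulas 5 and 6 are referenced above but not reproduced in this text. A hedged reconstruction, assuming the standard conic surface sag with radial coordinate r² = y² + z², curvature c and conic constant k, would take the form:

```latex
% Hypothetical reconstruction (formulas 5 and 6 are not reproduced in
% this text); r^2 = y^2 + z^2, c is the curvature, k the conic constant.
\[
  x(r) \;=\; \frac{c\,r^{2}}{1 + \sqrt{1 - (1+k)\,c^{2}r^{2}}}
  \qquad \text{(cf. formula 5)}
\]
\[
  x(r) \;=\; \frac{c\,r^{2}}{1 + \sqrt{1 - (1+k)\,c^{2}r^{2}}}
  \;+\; \sum_{i=1}^{N} a_{i}\,r^{i}
  \qquad \text{(cf. formula 6)}
\]
```

Here the spherical, ellipsoidal, paraboloidal and hyperboloidal cases listed above correspond to different values of the conic constant k.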
- the image formed by the image light has parallax and stereoscopic effect in the first direction, and also has parallax and stereoscopic effect in the second direction.
- the concave surface of the concave cylindrical reflector in the concave cylindrical reflector array is covered with a reflective film whose reflectivity is greater than the first reflectivity threshold and less than the second reflectivity threshold. The reflective film can therefore reflect the image light from the scattering layer and transmit the ambient light, thereby realizing 3D transparent display.
- the reflective film is a partially reflective and partially transmissive reflective film.
- the reflective film may include but is not limited to a nanometal film or a dielectric film, and the nanometal film or the dielectric film may realize partial reflection and partial transmission.
- the reflectivity of the reflective film is greater than 0 and less than 100%.
- the reflectivity of the reflective film is related to the requirement of the transmittance of the reflective film.
- for example, the transmittance of the reflective film needs to be greater than 70%; since the sum of the reflectivity and transmittance of the reflective film equals 100%, the reflectivity of the reflective film is greater than 0 and less than 30%. It can also be understood that the reflective film must both transmit the ambient light and reflect the received image light.
- the first reflectivity threshold of the reflective film needs to satisfy that the reflected image light can realize 3D image display
- the second reflectivity threshold of the reflective film needs to satisfy that the transmitted ambient light can meet the application scenario of the display module.
- the reflectivity of the reflective film can be set between 10% and 30%, that is, the first reflectivity threshold is equal to 10%, and the second reflectivity threshold is equal to 30%. Based on this reflective film, the display module can realize 3D transparent display.
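The arithmetic above can be sketched directly: assuming an ideal (lossless) film where reflectivity + transmittance = 100%, a minimum ambient-light transmittance caps the reflectivity.

```python
def reflectivity_bounds(min_transmittance):
    """Bounds on film reflectivity given a minimum transmittance.

    Assumes an ideal lossless film where reflectivity + transmittance
    equals 100%, as stated above. Returns (lower, upper) reflectivity
    bounds as fractions of incident light.
    """
    return 0.0, 1.0 - min_transmittance

# Transmittance > 70% implies 0 < reflectivity < 30%:
low, high = reflectivity_bounds(0.70)
```

A practical design would then pick a reflectivity inside this band (e.g., the 10%-30% range mentioned above) to balance image brightness against see-through clarity.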
- the concave surface of the concave cylindrical reflector is filled with a filler, and the refractive index of the filler and the refractive index of the concave cylindrical reflector meet the preset error requirement. Further, the refractive index of the filler is the same as the refractive index of the concave cylindrical reflector.
- the material of the filler and the material of the concave cylindrical reflector can be the same or different, and this application does not limit this. It can be understood that due to the small thickness of the reflective film, the influence of the reflective film on the propagation direction of the ambient light can be ignored.
- the concave cylindrical reflector array based on the above structure 1 is used as a reflective layer, and there is no need for a traditional transmissive cylindrical lens array.
- the display module can be made transparent to ambient light directly by using a concave cylindrical reflector array whose concave surfaces are covered with a reflective film and filled with a filler, thereby realizing 3D transparent display.
- Structure 2: the reflective layer includes a holographic reflective medium layer.
- the holographic reflective medium layer has angle selectivity: only the image light scattered by the scattering layer, entering the holographic reflective medium layer from the front, excites it; that is, only image light entering from the front is reflected (or diffracted) by the holographic reflective medium layer to the corresponding viewpoint, while most of the ambient light passes directly through.
- the holographic reflective medium layer can be manufactured by holographic exposure. Two implementation methods of manufacturing the holographic reflective medium layer by holographic exposure are shown below.
- Implementation method 1 is to prepare a holographic reflective medium layer by holographic exposure using a lens array.
- the second laser beam is vertically incident on the holographic reflective medium layer from one side
- the third laser beam is incident on the lens array from the other side. After being diffused by the lens array, it is incident on the holographic reflective medium layer.
- the size of the lens in the lens array is the same as the size of the concave cylindrical reflector in the concave cylindrical reflector array in the display module
- the focal length of the lens in the lens array is the same as the focal length of the concave cylindrical reflector in the concave cylindrical reflector array in the display module. Based on this, a holographic reflective medium layer is obtained, and the image light scattered by the scattering layer can excite the holographic reflective medium layer, and most of the ambient light passes directly through.
- the lens array may be a cylindrical lens array.
- the cylindrical lens array may be arranged in one dimension, and the cylindrical lens array arranged in one dimension may generate parallax and stereoscopic vision in a first direction.
- the cylindrical lens array may be arranged in two dimensions, and the cylindrical lens array arranged in two dimensions may generate parallax and stereoscopic vision in both the first direction and the second direction.
- Implementation method 2 is to prepare a holographic reflective medium layer by holographic exposure using a reflector array.
- the fourth laser beam vertically enters the holographic reflective medium layer from one side, is transmitted to the concave reflector array, and is reflected by the concave reflector array and enters the holographic reflective medium layer again.
- the size of the concave reflectors in the concave reflector array is the same as the size of the concave cylindrical reflectors in the concave cylindrical reflector array in the display module, and the focal length of the concave reflectors in the concave reflector array is the same as the focal length of the concave cylindrical reflectors in the concave cylindrical reflector array in the display module. Based on this, a holographic reflective medium layer is obtained, and the image light scattered by the scattering layer can excite the holographic reflective medium layer, and most of the ambient light passes directly through.
- the concave reflector array may be a concave cylindrical reflector array.
- the arrangement of the cylindrical concave reflector array may refer to the arrangement of the cylindrical lens, which will not be described in detail here.
- the holographic reflective medium layer is prepared by the third laser beam, which places low requirements on environmental vibration and has a low exposure cost. Moreover, the holographic reflective medium layer prepared by the holographic exposure method is thin (e.g., <100 μm), which is conducive to making the display module thinner.
- the holographic reflective medium layer manufactured by the above implementation method 1 or implementation method 2 is a volume grating structure with strong angle selectivity.
- the holographic reflective medium layer is used as a reflective layer, and the optical path of the display module is reproduced as shown in Figure 13.
- the image light emitted from the projector is scattered to the holographic reflective medium layer through the scattering layer to excite the holographic reflective medium layer, and the first image light is reflected to the first viewpoint and the second image light is reflected to the second viewpoint through the holographic reflective medium layer. Most of the ambient light will directly pass through the holographic reflective medium layer.
- the present application also provides an optical display system.
- the optical display system includes an image generation unit and a display module in any of the above embodiments.
- the image generation unit is used to emit different image lights.
- the image generation unit emitting the first image light and the second image light is taken as an example
- the scattering layer included in the display module is taken as an example of Figure 5a above
- the reflective layer included in the display module is taken as an example of Figure 6 above.
- the first image light emitted by the image generation unit is converged at the first viewpoint through the display module, and the second image light emitted by the image generation unit is converged at the second viewpoint through the display module.
- the ambient light can pass through the display module, so that 3D transparent display can be achieved.
- the image generation unit includes a light source component and a light modulation component.
- the light source component is used to emit a first light beam and a second light beam.
- the light modulation component is used to modulate the first light beam to obtain a first image light carrying the first image information, and to modulate the second light beam to obtain a second image light carrying the second image information.
- the light modulation component can load (or modulate) the first image information onto the first light beam to obtain the first image light carrying the image information, and load the second image information onto the second light beam to obtain the second image light carrying the image information.
- the first light beam and the second light beam can be called optical carriers.
- the light source assembly may be, for example, a laser diode (LD), a light-emitting diode (LED), a vertical cavity surface emitting laser (VCSEL), an edge emitting laser (EEL), a diode pumped solid state laser (DPSS), or a fiber laser, etc. It is to be understood that the light source assembly given above is only an example and the present application does not limit this.
- the light modulation component can be, for example, a liquid crystal on silicon (LCOS) display, a liquid crystal display (LCD), a digital light processing (DLP) display, a laser beam scanning (LBS) display, an organic light emitting diode (OLED), a micro light emitting diode (micro-LED), an active-matrix organic light emitting diode (AMOLED), a flexible light-emitting diode (FLED), a quantum dot light emitting diode (QLED), a reflective display based on a digital micro-mirror device (DMD), etc.
- the above-mentioned image generating unit may be, for example, a projector.
- the present application provides an image display method, please refer to the introduction of Figure 15.
- the image display method can be applied to the display module shown in any embodiment of Figures 2 to 13 above, or to the optical display system shown in Figure 14 above. It can also be understood that the image display method can be implemented based on the display module shown in any embodiment of Figures 2 to 13 above, or the image display method can be implemented based on the optical display system shown in Figure 14 above.
- an image display method provided by the present application includes the following steps:
- Step 1501: Obtain the coordinates of the first viewpoint and/or the second viewpoint.
- the coordinates of the first viewpoint are (x1, y1), and the coordinates of the second viewpoint are (x2, y2).
- the left and right eyes of the observer may be located at the first viewpoint and the second viewpoint.
- the coordinates of the first viewpoint and the coordinates of the second viewpoint may be default eye point coordinates.
- the first viewpoint and the second viewpoint may be the positions of both eyes of the observer, and the position coordinates of both eyes may be obtained by an eye tracker.
- Eye tracking refers to tracking eye movement by measuring the position of the eye's gaze point or the movement of the eye relative to the head.
- An eye tracker is a device that can track and measure the position of the eye and eye movement information. The eye tracker can track and output the position coordinates of the observer's eyes in real time.
- the eye tracker may include, but is not limited to, a camera (such as a driver monitor system (DMS) camera), an infrared transmitter or an infrared detector, etc.
- Step 1502: Determine K eye reliefs according to the coordinates of the first viewpoint and/or the second viewpoint and the coordinates of K target positions in the display area of the windshield.
- the K target positions correspond one-to-one to the K rows of the image projected onto the display area.
- one row of the image projected onto the display area can correspond to one target position on the windshield.
- the coordinates of the three target positions are: the coordinates of target position 1 (X1, Y1, Z1), the coordinates of target position 2 (X2, Y2, Z2), and the coordinates of target position 3 (X3, Y3, Z3), and these three target positions correspond to three different rows of the image. It can be understood that the coordinates of the target positions of the display area of the windshield can be pre-stored.
- the eye relief refers to the distance between the first viewpoint (or the second viewpoint) and the target position of the display area of the windshield.
- the eye relief ER_i satisfies the following formula 7. It can be understood that the ER is the same within the same plane (see Figure 16b); therefore, the ER is the same for the observer's left and right eyes.
- if the above step 1501 obtains the coordinates of the first viewpoint, formula 7 can use the coordinates of the first viewpoint; if it obtains the coordinates of the second viewpoint, formula 7 can use the coordinates of the second viewpoint; if it obtains the coordinates of both viewpoints, formula 7 can use the average of the two sets of coordinates.
- the eye reliefs ER_1, ER_2 and ER_3 can be determined.
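Formula 7 is not reproduced in this text. A hedged sketch of the step above, assuming the eye relief is the Euclidean distance between the (possibly averaged) viewpoint and a pre-stored target position:

```python
import math

def eye_relief(viewpoint, target):
    """Eye relief ER_i between a viewpoint and a windshield target position.

    Formula 7 is not reproduced in the text; the Euclidean distance is
    assumed here as a plausible reading of 'the distance between the
    viewpoint and the target position of the display area'.
    """
    return math.dist(viewpoint, target)

def averaged_viewpoint(vp1, vp2):
    """Midpoint of the two viewpoints, for the case where step 1501
    obtains both sets of coordinates (the text suggests averaging)."""
    return tuple((a + b) / 2.0 for a, b in zip(vp1, vp2))

# Hypothetical coordinates (not from the text): eyes 65 mm apart and
# one of the three pre-stored target positions on the windshield.
vp = averaged_viewpoint((0.0, 1.2, 0.0), (0.065, 1.2, 0.0))
er_1 = eye_relief(vp, (0.0, 1.0, 0.8))
```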
- the parallax between the first viewpoint and the second viewpoint, the eye relief, and the virtual image distance satisfy the following formula 8.
- ΔP_i is the parallax between the first viewpoint and the second viewpoint in the i-th row.
- VID is the virtual image distance.
- ER_i is the eye relief of the i-th row.
- T is the distance between the first viewpoint and the second viewpoint. If the first viewpoint and the second viewpoint are the left and right eyes of the observer, T is the observer's interpupillary distance, which is usually about 65 millimeters (mm).
- Step 1503: Control the display of the image by adjusting the parallax of the K rows of the image.
- the parallax ΔP_1 between the first viewpoint and the second viewpoint in the first row, the parallax ΔP_2 in the second row, and the parallax ΔP_3 in the third row can be determined; please refer to Figure 16a. Therefore, the parallax ΔP_i of each row can be adjusted so that a non-tilted image is observed.
- a non-tilted image can thus be displayed; that is, based on the above method, the tilt of the 3D image caused by the tilted windshield can be corrected.
- the target VID can be input first, and the parallax ΔP_i can be adjusted according to the above formula 8 and the target VID so that the image is displayed at the imaging position required by the user. It can be understood that increasing the parallax ΔP_i extends the imaging distance (i.e., increases the virtual image distance), while reducing the parallax ΔP_i shortens the imaging distance (i.e., reduces the virtual image distance).
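Formula 8 is not reproduced in this text. As a hedged sketch, a common similar-triangles relation for a virtual image at distance VID viewed through a surface at eye relief ER_i with eye spacing T is assumed here; it reproduces the qualitative behavior described above (larger parallax, larger virtual image distance):

```python
def row_parallax(vid, er_i, t=65e-3):
    """Parallax ΔP_i for the i-th row of the image.

    Formula 8 is not reproduced in the text; this assumes the common
    similar-triangles relation
        ΔP_i = T * (VID - ER_i) / VID
    for a virtual image at distance VID seen through a surface at eye
    relief ER_i with eye spacing T (default 65 mm, as above).
    """
    return t * (vid - er_i) / vid

# Rows at different eye reliefs on a tilted windshield get different
# parallax values so that the image appears un-tilted at one target VID:
parallaxes = [row_parallax(vid=7.5, er_i=er) for er in (0.9, 1.0, 1.1)]
```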
- the image display method can be executed by a control module, which can belong to the optical display system or be independent of the optical display system.
- the control module may include a processor, which may be a circuit having the ability to process signals (or data).
- in one implementation, the processor may be a circuit having the ability to read and run instructions, such as a central processing unit (CPU), a microprocessor, a graphics processing unit (GPU) (which can be understood as a microprocessor), or a digital signal processor (DSP). In another implementation, the processor may implement certain functions through the logical relationship of a hardware circuit, which is fixed or reconfigurable, such as an application-specific integrated circuit (ASIC) or a programmable logic device (PLD), for example a field programmable gate array (FPGA).
- the process of the processor loading a configuration document to implement the hardware circuit configuration can be understood as the process of the processor loading instructions to implement the functions of some or all of the above units.
- the processor can also be a hardware circuit designed for artificial intelligence, which can be understood as an ASIC, such as a neural network processing unit (NPU), a tensor processing unit (TPU), a deep learning processing unit (DPU), etc.
- the control module can be a domain processor in the vehicle, or it can be an electronic control unit (ECU) in the vehicle.
- Figure 17 is a circuit diagram of an optical display system provided by the present application.
- the circuit in the optical display system mainly includes a processor 1701, an external memory interface 1702, an internal memory 1703, an audio module 1704, a video module 1705, a power module 1706, a wireless communication module 1707, an input/output (I/O) interface 1708, a video interface 1709, a display circuit 1710, a modulator 1711 and a light source 1712, etc.
- the processor 1701 and its peripheral components such as an external memory interface 1702, an internal memory 1703, an audio module 1704, a video module 1705, a power module 1706, a wireless communication module 1707, an I/O interface 1708, a video interface 1709, and a display circuit 1710 can be connected through a bus.
- the circuit diagrams shown in the present application do not constitute a specific limitation on the optical display system.
- the optical display system may include more or fewer components than shown in the figure, or combine certain components, or split certain components, or arrange the components differently.
- the components shown in the figure may be implemented in hardware, software, or a combination of software and hardware.
- the processor 1701 includes one or more processing units, and the processing unit may be a circuit having the ability to process signals (or data). For details, please refer to the above-mentioned related introduction, which will not be repeated here. Different processing units may be independent devices or integrated in one or more processors.
- the processor 1701 may also be provided with a memory for storing instructions and data.
- the memory in the processor 1701 is a cache memory.
- the memory may store instructions or data that the processor 1701 has just used or cyclically used. If the processor 1701 needs to use the instruction or data again, it may be directly called from the memory. This avoids repeated access, reduces the waiting time of the processor 1701, and thus improves the efficiency of the optical display system.
- the processor 1701 may execute the stored instructions to perform the above-mentioned imaging method.
- the optical display system may further include a plurality of input/output (I/O) interfaces 1708 connected to the processor 1701.
- the I/O interface 1708 may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a universal serial bus (USB) interface, etc.
- the above-mentioned I/O interface 1708 can be connected to devices such as a mouse, touchpad, keyboard, camera, speaker, microphone, etc., and can also be connected to physical buttons on the optical display system (such as volume buttons, brightness adjustment buttons, power buttons, etc.).
- the external memory interface 1702 can be used to connect an external memory card, such as a Micro SD card, to expand the storage capacity of the optical display system.
- the external memory card communicates with the processor 1701 through the external memory interface 1702 to realize the data storage function.
- the internal memory 1703 can be used to store computer executable program codes, and the executable program codes include instructions.
- the internal memory 1703 can include a program storage area and a data storage area.
- the program storage area can store an operating system, an application required for at least one function, etc.
- the data storage area can store data created during the use of the optical display system, etc.
- the internal memory 1703 can include a random access memory (RAM), a flash memory, a universal flash storage (UFS), a read-only memory (ROM), a programmable read-only memory (PROM), an erasable programmable read-only memory (EPROM), an electrically erasable programmable read-only memory (EEPROM), a register, a hard disk, a mobile hard disk, a CD-ROM, or any other form of storage medium known in the art.
- the processor 1701 executes various functional applications and data processing of the optical display system by running instructions stored in the internal memory 1703 and/or instructions stored in a memory provided in the processor 1701.
- An exemplary storage medium is coupled to a processor so that the processor can read information from the storage medium and write information to the storage medium.
- the storage medium can also be a component of the processor.
- the processor and the storage medium can be located in an ASIC.
- the ASIC can be located in an optical display system.
- the processor and the storage medium can also exist in the optical display system as discrete components.
- the optical display system can implement audio functions such as music playing and calls through the audio module 1704 and the application processor.
- the audio module 1704 is used to convert digital audio information into analog audio signal output, and is also used to convert analog audio input into digital audio signals.
- the audio module 1704 can also be used to encode and decode audio signals, such as playing or recording.
- the audio module 1704 can be arranged in the processor 1701, or some functional modules of the audio module 1704 can be arranged in the processor 1701.
- the video interface 1709 can receive external audio and video signals, which can be specifically a high definition multimedia interface (HDMI), a digital visual interface (DVI), a video graphics array (VGA), a display port (DP), etc.
- the video interface 1709 can also output video to the outside.
- the video interface 1709 can receive speed signals and power signals input from peripheral devices, and can also receive external AR video signals.
- the video interface 1709 can receive video signals input from an external computer or terminal device.
- the video module 1705 can decode the video input by the video interface 1709, for example, by performing H.264 decoding.
- the video module can also encode the video collected by the optical display system, for example, by performing H.264 encoding on the video collected by the external camera.
- the processor 1701 can also decode the video input by the video interface 1709, and then output the decoded image signal to the display circuit 1710.
- the power module 1706 is used to provide power to the processor 1701 and the light source 1712 according to the input power (e.g., direct current), and the power module 1706 may include a rechargeable battery, which can provide power to the processor 1701 and the light source 1712.
- the light emitted by the light source 1712 can be transmitted to the modulator 1711 for imaging, thereby forming an image light signal.
- the wireless communication module 1707 enables the optical display system to communicate wirelessly with the outside world, and can provide wireless local area networks (WLAN) (such as wireless fidelity (Wi-Fi) networks), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR) and other wireless communication solutions.
- the wireless communication module 1707 can be one or more devices integrating at least one communication processing module.
- the wireless communication module 1707 receives electromagnetic waves via an antenna, modulates the frequency of the electromagnetic wave signal and performs filtering, and sends the processed signal to the processor 1701.
- the wireless communication module 1707 can also receive the signal to be sent from the processor 1701, modulate the frequency of the signal, amplify it, and convert it into electromagnetic waves for radiation through the antenna.
- the video data decoded by the video module 1705 can also be wirelessly received through the wireless communication module 1707 or read from an external memory.
- the optical display system can receive video data from a terminal device or an in-vehicle entertainment system through the wireless LAN in the vehicle, and the optical display system can also read audio and video data stored in an external memory.
- the display circuit 1710 and the modulator 1711 are used to display the corresponding image.
- the video interface 1709 receives an external video source signal, and the video module 1705 decodes and/or digitally processes the video signal and outputs one or more image signals to the display circuit 1710.
- the display circuit 1710 drives the modulator 1711 to image the incident polarized light according to the input image signal, and then outputs at least two image lights.
- the processor 1701 can also output one or more image signals to the display circuit 1710.
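The decode-and-drive path described above (video interface, video module, display circuit, modulator) can be sketched as a toy pipeline. All class and method names below are illustrative assumptions for this sketch, not the patent's actual components:

```python
# Hypothetical sketch of the signal path: video source -> decode ->
# image signals -> display circuit -> modulator -> image lights.
from dataclasses import dataclass
from typing import List


@dataclass
class ImageSignal:
    viewpoint: int      # which viewpoint this image targets
    pixels: List[int]   # decoded frame data (simplified)


class VideoModule:
    """Decodes a raw video payload into per-viewpoint image signals."""

    def decode(self, payload: bytes, viewpoints: int) -> List[ImageSignal]:
        frame = list(payload)
        return [ImageSignal(viewpoint=v, pixels=frame) for v in range(viewpoints)]


class Modulator:
    """Images the incident polarized light according to an image signal."""

    def modulate(self, signal: ImageSignal) -> str:
        return f"image light for viewpoint {signal.viewpoint}"


class DisplayCircuit:
    """Drives the modulator once per incoming image signal."""

    def __init__(self, modulator: Modulator):
        self.modulator = modulator

    def drive(self, signals: List[ImageSignal]) -> List[str]:
        return [self.modulator.modulate(s) for s in signals]


signals = VideoModule().decode(b"\x01\x02\x03", viewpoints=2)
image_lights = DisplayCircuit(Modulator()).drive(signals)
print(image_lights)  # one image light per viewpoint
```

The key point the sketch illustrates is that a single decoded frame fans out into at least two image lights, one per viewpoint, which is what enables the 3D display described elsewhere in this application.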
- the present application can also provide a terminal device.
- the terminal device may include the optical display system in any of the above embodiments.
- the terminal device can be, for example, a vehicle (e.g., an unmanned vehicle, a smart car, an electric car, or a digital car, etc.), a robot, a mapping device, a drone, a smart home device (e.g., a TV, a sweeping robot, an intelligent desk lamp, a sound system, an intelligent lighting system, an electrical control system, a home background music system, a home theater system, an intercom system, or a video surveillance system, etc.), an intelligent manufacturing device (e.g., an industrial device), an intelligent transportation device (e.g., an AGV, an unmanned transport vehicle, or a truck, etc.), or an intelligent terminal (a mobile phone, a computer, a tablet computer, a PDA, a desktop, a headset, a sound system, a wearable device, or a vehicle-mounted device, etc.).
- FIG. 18 is an exemplary functional block diagram of a vehicle provided by the present application.
- Components coupled to or included in the terminal device 1800 may include a propulsion system 1801, a sensing system 1802, a control system 1803, a computer system 1804, a user interface 1805, and an optical display system 1806.
- the components of the terminal device 1800 can be configured to work in a manner interconnected with each other and/or with other components coupled to each system.
- the computer system 1804 can be configured to receive data from the propulsion system 1801, the sensing system 1802, and the control system 1803 and control them.
- the computer system 1804 can also be configured to generate a display of an image on the user interface 1805 and receive input from the user interface 1805.
- the propulsion system 1801 can provide power movement for the terminal device 1800.
- the propulsion system 1801 may include an engine/motor, an energy source, a transmission, and wheels/tires.
- the propulsion system 1801 may additionally or alternatively include components other than those shown in FIG. 18. This application does not specifically limit this.
- the sensing system 1802 may include several sensors for sensing information about the environment in which the terminal device 1800 is located.
- the sensors of the sensing system 1802 may include, but are not limited to, a global positioning system (GPS), an inertial measurement unit (IMU), a millimeter wave radar, a laser radar, a camera, and an actuator for modifying the position and/or orientation of the sensor.
- the millimeter wave radar may use radio signals to sense targets in the surrounding environment of the terminal device 1800.
- the millimeter wave radar may also be used to sense the speed and/or forward direction of the target.
- the laser radar may use lasers to sense targets in the environment in which the terminal device 1800 is located.
- the laser radar may include one or more laser sources and one or more detectors, as well as other system components.
- the camera may be used to capture multiple images of the surrounding environment of the terminal device 1800.
- the camera may be a still camera or a video camera.
- the GPS may be any sensor for estimating the geographic location of the terminal device 1800.
- the GPS may include a transceiver to estimate the position of the terminal device 1800 relative to the earth based on satellite positioning data.
- the computer system 1804 can use the GPS in combination with map data to estimate the road on which the terminal device 1800 is traveling.
- the IMU can be used to sense position and orientation changes of the terminal device 1800 based on inertial acceleration.
- the combination of sensors in the IMU may include, for example, an accelerometer and a gyroscope. In addition, other combinations of sensors in the IMU are also possible.
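One common way to fuse the accelerometer and gyroscope readings mentioned above is a complementary filter; the minimal sketch below is a standard technique and is not taken from this application. The gyro rate is integrated for short-term accuracy and corrected by the accelerometer's gravity-derived angle to cancel long-term drift; the sample values are invented:

```python
# Complementary filter: fuse gyro integration with accelerometer angle.
def complementary_filter(angle, gyro_rate, accel_angle, dt, alpha=0.98):
    """Return the fused tilt estimate (degrees) after one time step."""
    gyro_angle = angle + gyro_rate * dt  # integrate angular rate
    # Weight the drift-prone but smooth gyro estimate against the
    # noisy but drift-free accelerometer estimate.
    return alpha * gyro_angle + (1 - alpha) * accel_angle


angle = 0.0
# Simulated samples: gyro reports 10 deg/s, accelerometer reads ~1 deg tilt.
for _ in range(100):
    angle = complementary_filter(angle, gyro_rate=10.0, accel_angle=1.0, dt=0.01)
print(round(angle, 2))
```

Because the accelerometer term keeps pulling the estimate back toward its own reading, the fused angle converges to a bounded value instead of drifting without limit as pure gyro integration would.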
- the sensing system 1802 may also include sensors that monitor the internal systems of the terminal device 1800 (e.g., an in-vehicle air quality monitor, a fuel gauge, an oil temperature gauge, etc.). Sensor data from one or more of these sensors can be used to detect objects and their corresponding characteristics (position, shape, direction, speed, etc.). Such detection and identification is a key function for the safe operation of the terminal device 1800.
- the sensing system 1802 may also include other sensors. This application does not specifically limit this.
- the control system 1803 is configured to control the operation of the terminal device 1800 and its components.
- the control system 1803 may include various elements, including a steering unit, a throttle, a brake unit, a sensor fusion algorithm, a computer vision system, a route control system, and an obstacle avoidance system.
- the steering unit can be operated to adjust the heading of the terminal device 1800; for example, it can be a steering wheel system.
- the throttle is used to control the operating speed of the engine and thus control the speed of the terminal device 1800.
- the control system 1803 may additionally or alternatively include components other than those shown in Figure 18. This application does not specifically limit this.
- the brake unit is used to control the terminal device 1800 to decelerate.
- the brake unit can use friction to slow down the wheel.
- the brake unit can convert the kinetic energy of the wheel into electric current.
- the brake unit can also take other forms to slow down the wheel speed to control the speed of the terminal device 1800.
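The regenerative idea above, converting the wheel's kinetic energy into electric current while braking, can be put into numbers with the familiar kinetic-energy formula. The mass, speeds, and the 70% conversion efficiency below are illustrative assumptions, not figures from this application:

```python
# Energy recoverable by regenerative braking: the kinetic energy shed
# while slowing down, times a conversion efficiency.
def recoverable_energy_j(mass_kg, v_start_ms, v_end_ms, efficiency=0.7):
    """Kinetic energy released during deceleration, scaled by efficiency."""
    delta_ke = 0.5 * mass_kg * (v_start_ms**2 - v_end_ms**2)
    return delta_ke * efficiency


# A 1500 kg vehicle braking from 20 m/s (72 km/h) down to 10 m/s (36 km/h):
energy = recoverable_energy_j(1500, 20.0, 10.0)
print(f"{energy / 1000:.1f} kJ recovered")  # 0.5*1500*300*0.7 = 157500 J
```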
- the computer vision system can be operated to process and analyze the images captured by the camera in order to identify the target and/or feature in the surrounding environment of the terminal device 1800.
- the target and/or feature may include traffic signals, road boundaries and obstacles.
- the computer vision system may use target recognition algorithms, structure from motion (SFM) algorithms, video tracking, and other computer vision techniques.
- the computer vision system may be used to map the environment, track targets, estimate the speed of targets, and the like.
- the route control system is used to determine the driving route of the terminal device 1800.
- the route control system may combine data from the sensing system 1802, GPS, and one or more predetermined maps to determine the driving route for the terminal device 1800.
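Combining a predetermined map with a position fix to pick a driving route can be sketched as a shortest-path search over a road graph; Dijkstra's algorithm below is a standard technique, and the node names and edge weights are invented for illustration:

```python
import heapq

# Dijkstra's shortest path over a toy road graph (adjacency lists of
# (neighbor, cost) pairs). Costs could represent distance or travel time.
def shortest_route(graph, start, goal):
    """Return (total_cost, node_list) for the cheapest path, or (inf, [])."""
    queue = [(0, start, [start])]
    seen = set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == goal:
            return cost, path
        if node in seen:
            continue
        seen.add(node)
        for neighbor, weight in graph.get(node, []):
            if neighbor not in seen:
                heapq.heappush(queue, (cost + weight, neighbor, path + [neighbor]))
    return float("inf"), []


road_map = {
    "A": [("B", 2), ("C", 5)],
    "B": [("C", 1), ("D", 4)],
    "C": [("D", 1)],
    "D": [],
}
print(shortest_route(road_map, "A", "D"))  # (4, ['A', 'B', 'C', 'D'])
```

In a real route control system the graph would come from the predetermined maps, the start node from the GPS fix, and the edge weights could be refreshed from sensing-system data.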
- the obstacle avoidance system is used to identify, evaluate, avoid, or otherwise overcome potential obstacles in the environment of the terminal device 1800.
- the control system 1803 may additionally or alternatively include components other than those shown and described, or some of the components shown above may be omitted.
- the terminal device 1800 may include at least one processor 18041, and further, the computer system 1804 may also include an interface circuit 18042.
- the processor 18041 executes instructions stored in a non-transitory computer-readable medium such as a memory 18043.
- the computer system 1804 may also be a plurality of computing devices that control individual components or subsystems of the terminal device 1800 in a distributed manner.
- Processor 18041 may be a circuit having the capability of processing signals (or data). For details, please refer to the above-mentioned related introduction, which will not be repeated here.
- although FIG. 18 functionally illustrates the processor, memory, and other elements of the computer system 1804 in the same block, those skilled in the art will appreciate that the processor and memory may in fact comprise multiple processors or memories that are not stored in the same physical housing.
- the memory may be a hard drive or other storage medium located in a housing different from the computer system 1804.
- some components such as a steering assembly and a deceleration assembly may each have their own processor that performs only calculations related to the functions specific to the component.
- the processor may also be remote from the vehicle but may communicate wirelessly with the vehicle.
- memory 18043 may include instructions (e.g., program logic) that can be read by processor 18041 to perform various functions of terminal device 1800, including the functions described above.
- Memory 18043 may also include additional instructions, including instructions to send data to, receive data from, interact with, and/or control one or more of propulsion system 1801, sensor system 1802, and control system 1803.
- memory 18043 may also store data, such as road maps, route information, data detected by sensors, vehicle location, direction, speed, and other such vehicle data, as well as other information. This information can be used by terminal device 1800 and computer system 1804 in autonomous, semi-autonomous, and/or manual modes at terminal device 1800.
- the memory may refer to the introduction of the internal memory 1703 in FIG. 17 above, which will not be described again here.
- the user interface 1805 is used to provide information to or receive information from a user of the terminal device 1800.
- the user interface 1805 may include one or more input/output devices in a set of peripheral devices, and the peripheral devices may include, for example, a wireless communication system, a touch screen, a microphone and/or a speaker, etc.
- Computer system 1804 may control the functions of terminal device 1800 based on input received from various subsystems (e.g., propulsion system 1801, sensing system 1802, and control system 1803) and from user interface 1805.
- computer system 1804 may utilize input from control system 1803 in order to control a steering unit to avoid obstacles detected by sensing system 1802 and obstacle avoidance system.
- computer system 1804 may be operable to provide control over many aspects of terminal device 1800 and its subsystems.
- the optical display system 1806 can refer to the introduction of any of the above embodiments, which will not be described here in detail. It should be noted that the functions of some components in the optical display system can also be implemented by other subsystems of the vehicle, for example, the controller can also be a component in the control system.
- one or more of the above components may be installed or associated separately from the terminal device 1800.
- the memory 18043 may be partially or completely separate from the terminal device 1800.
- the above components may be communicatively coupled together in a wired and/or wireless manner.
- the terminal device functional framework given in Figure 18 is only an example.
- the terminal device 1800 may include more, fewer, or different systems, and each system may include more, fewer, or different components.
- the systems and components shown may be combined or divided in any manner, and this application does not specifically limit this.
- the method steps in the embodiments of the application can be implemented by hardware or by a processor executing software instructions.
- the software instructions can be composed of corresponding software modules, and the software modules can be stored in a storage medium.
- for the storage medium, please refer to the introduction of the aforementioned memory 18043, which will not be repeated here.
- An exemplary storage medium is coupled to the processor so that the processor can read information from the storage medium and write information to the storage medium.
- the storage medium can also be a component of the processor.
- the computer program product includes one or more computer programs or instructions.
- the computer may be a general-purpose computer, a special-purpose computer, a computer network, or other programmable device.
- the computer program or instruction may be stored in a computer-readable storage medium, or transmitted from one computer-readable storage medium to another computer-readable storage medium, for example, the computer program or instruction may be transmitted from one website site, computer, server or data center to another website site, computer, server or data center by wired or wireless means.
- the computer-readable storage medium may be any available medium that a computer can access or a data storage device such as a server, data center, etc. that integrates one or more available media.
- the available medium may be a magnetic medium, for example, a floppy disk, a hard disk, a tape; it may also be an optical medium, for example, a digital video disc (DVD); it may also be a semiconductor medium, for example, a solid state drive (SSD).
- "a, b, or c" can represent: a; b; c; "a and b"; "a and c"; "b and c"; or "a and b and c", where a, b, and c can each be singular or plural.
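The seven combinations listed above are exactly the nonempty subsets of {a, b, c}, which can be enumerated mechanically:

```python
from itertools import combinations

# Enumerate all nonempty combinations of {a, b, c}.
items = ["a", "b", "c"]
subsets = [
    " and ".join(combo)
    for r in range(1, len(items) + 1)
    for combo in combinations(items, r)
]
print(subsets)
print(len(subsets))  # 7 nonempty subsets (2**3 - 1)
```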
- the character "/" generally indicates that the associated objects before and after it are in an "or" relationship.
- in a formula, the character "/" indicates that the associated objects before and after it are in a "division" relationship.
- the word "exemplary" is used to mean serving as an example, illustration, or description. Any embodiment or design described as "exemplary" in this application should not be interpreted as more preferred or advantageous than other embodiments or designs. Rather, the word is used to present concepts in a concrete way and does not constitute a limitation on this application.
Landscapes
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Optics & Photonics (AREA)
- Instrument Panels (AREA)
Abstract
The present application discloses a display module, an optical display system, a terminal device, and an image display method. The method can be used in fields such as head-up display (HUD) devices, near-eye display (NED) devices, vehicle-mounted displays, or display screens. The display module comprises a scattering layer and a reflective layer. The scattering layer is used to receive different image lights from an image generation unit and to scatter the different image lights to corresponding positions on the reflective layer, the image lights carrying image information. The reflective layer is used to reflect the image lights from the corresponding positions of the scattering layer toward corresponding viewpoints, and to transmit ambient light. Through the cooperation of the scattering layer and the reflective layer, different image lights can be reflected toward corresponding viewpoints, so that a 3D image can be formed. Moreover, since ambient light can be transmitted through the reflective layer, transparent 3D display can be achieved based on the display module.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/CN2022/122330 WO2024065332A1 (fr) | 2022-09-28 | 2022-09-28 | Module d'affichage, système d'affichage optique, dispositif terminal et procédé d'affichage d'image |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/CN2022/122330 WO2024065332A1 (fr) | 2022-09-28 | 2022-09-28 | Module d'affichage, système d'affichage optique, dispositif terminal et procédé d'affichage d'image |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2024065332A1 true WO2024065332A1 (fr) | 2024-04-04 |
Family
ID=90475308
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2022/122330 WO2024065332A1 (fr) | 2022-09-28 | 2022-09-28 | Module d'affichage, système d'affichage optique, dispositif terminal et procédé d'affichage d'image |
Country Status (1)
Country | Link |
---|---|
WO (1) | WO2024065332A1 (fr) |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105812778A (zh) * | 2015-01-21 | 2016-07-27 | 成都理想境界科技有限公司 | 双目ar头戴显示设备及其信息显示方法 |
US20170168294A1 (en) * | 2015-12-11 | 2017-06-15 | Panasonic Intellectual Property Management Co., Ltd. | Screen |
CN107333121A (zh) * | 2017-06-27 | 2017-11-07 | 山东大学 | 曲面屏幕上移动视点的沉浸式立体渲染投影系统及其方法 |
CN108761802A (zh) * | 2017-08-25 | 2018-11-06 | 深圳京龙睿信科技有限公司 | 一种裸眼3d-hud显示装置 |
CN114137725A (zh) * | 2020-09-04 | 2022-03-04 | 未来(北京)黑科技有限公司 | 一种可显示三维图像的抬头显示系统 |
CN114153066A (zh) * | 2020-09-08 | 2022-03-08 | 未来(北京)黑科技有限公司 | 抬头显示装置及抬头显示系统 |
CN114609782A (zh) * | 2022-02-14 | 2022-06-10 | 广东未来科技有限公司 | 裸眼3d抬头显示装置及裸眼3d抬头显示方法 |
- 2022-09-28 WO PCT/CN2022/122330 patent/WO2024065332A1/fr unknown
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20220365345A1 (en) | Head-up display and picture display system | |
JP2023175794A (ja) | ヘッドアップディスプレイ | |
US20220063510A1 (en) | Head up display and control method thereof | |
WO2024017038A1 (fr) | Appareil de génération d'image, dispositif d'affichage et véhicule | |
WO2021015171A1 (fr) | Affichage tête haute | |
WO2024021852A1 (fr) | Appareil d'affichage stéréoscopique, système d'affichage stéréoscopique et véhicule | |
CN115629515B (zh) | 立体投影系统、投影系统和交通工具 | |
WO2023216670A1 (fr) | Appareil d'affichage tridimensionnel et véhicule | |
WO2024065332A1 (fr) | Module d'affichage, système d'affichage optique, dispositif terminal et procédé d'affichage d'image | |
US20240036311A1 (en) | Head-up display | |
US12061335B2 (en) | Vehicular head-up display and light source unit used therefor | |
WO2024041034A1 (fr) | Module d'affichage, système d'affichage optique, dispositif terminal et procédé d'imagerie | |
US20230152586A1 (en) | Image generation device and head-up display | |
WO2024188007A1 (fr) | Appareil d'affichage et moyen de transport | |
WO2024098828A1 (fr) | Système de projection, procédé de projection et moyen de transport | |
WO2023193210A1 (fr) | Module d'émission optique, dispositif d'affichage optique, dispositif terminal et procédé d'affichage d'image | |
US20240069335A1 (en) | Head-up display | |
WO2023184276A1 (fr) | Procédé d'affichage, système d'affichage et dispositif terminal | |
WO2023225902A1 (fr) | Module de transmission, appareil de détection et dispositif terminal | |
CN220983636U (zh) | 一种显示装置、交通工具和车载系统 | |
CN115933185B (zh) | 虚像显示装置、图像数据的生成方法、装置和相关设备 | |
US20240319343A1 (en) | SYSTEMS AND TECHNIQUES FOR MITIGATING CROSSTALK AND INTERFERENCE FOR FLASH IMAGING LIGHT DETECTION AND RANGING (LiDAR) DEVICES | |
CN116203726A (zh) | 显示装置、电子设备以及交通工具 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 22959975 Country of ref document: EP Kind code of ref document: A1 |