WO2024045704A1 - Display device, display apparatus, and vehicle (显示装置、显示设备及交通工具) - Google Patents

Display device, display apparatus, and vehicle

Info

Publication number
WO2024045704A1
WO2024045704A1 (PCT/CN2023/095982; CN2023095982W)
Authority
WO
WIPO (PCT)
Prior art keywords
image
area
pixels
distortion
display device
Prior art date
Application number
PCT/CN2023/095982
Other languages
English (en)
French (fr)
Inventor
许志高 (Xu Zhigao)
毛磊 (Mao Lei)
Original Assignee
华为技术有限公司 (Huawei Technologies Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 华为技术有限公司 (Huawei Technologies Co., Ltd.)
Publication of WO2024045704A1

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60R VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R11/00 Arrangements for holding or mounting articles, not otherwise provided for
    • B60R11/02 Arrangements for holding or mounting articles, not otherwise provided for, for radio sets, television sets, telephones, or the like; Arrangement of controls thereof
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 Head-up displays
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09F DISPLAYING; ADVERTISING; SIGNS; LABELS OR NAME-PLATES; SEALS
    • G09F9/00 Indicating arrangements for variable information in which the information is built-up on a support by selection or combination of individual elements
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G3/00 Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
    • G09G3/20 Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes, for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix, no fixed position being assigned to or needed to be assigned to the individual characters or partial characters

Definitions

  • The present application relates to the field of optical display technology, and in particular to a display device, a display apparatus, and a vehicle.
  • Light display products include desktop displays, head-up displays (HUD), and projection lights. Their application scenarios include but are not limited to transportation, such as cars and trains; desktop display devices can also be used in fields such as education, for example in various desktops with a desktop display function.
  • A characteristic of the above light display products is that, through an optical imaging unit, they can present a virtual image with an enlarged field of view and frame and map it into the eyes of the observer, thereby giving the observer a large-screen viewing experience.
  • The optical imaging unit is generally composed of optical lenses. Because the manufacturing accuracy of the optical lenses and the assembly process of the optical imaging unit vary, the optical imaging unit causes optical distortion of the virtual image when imaging; that is, the virtual image presented is deformed, which affects the observer's viewing experience.
  • Embodiments of the present application provide a display device, a display apparatus including the display device, and a vehicle including the display device or the display apparatus.
  • The main purpose is to provide a display device that can present an ideal, distortion-free virtual image without loss of image resolution.
  • This application provides a display device, which can be used in scenarios such as audio and video entertainment, office work, learning, and assisted driving, including but not limited to desktop displays, near-eye displays, head-up displays, and projectors.
  • The display device can be a flat display device or a three-dimensional display device, and can be used alone or integrated as a component into other equipment.
  • the display device includes an image generation unit and an optical imaging unit; the image generation unit is used to generate a real image according to the image signal input by the processor.
  • The image signal is an image signal corresponding to an ideal, distortion-free image to be displayed, and the real image has pre-distortion relative to the image to be displayed.
  • The optical imaging unit is used to image the real image to form a virtual image corresponding to the real image, for example an enlarged virtual image, and the pre-distortion is compensated by the optical distortion caused to the virtual image. In this way, a relatively ideal virtual image can be formed to ensure the user's viewing experience.
  • The image generation unit directly generates a pre-distorted real image of the image to be displayed based on the ideal image signal of the image to be displayed input by a processor peripheral to the display device, rather than generating a real image of a pre-corrected image based on a pre-correction of the image to be displayed. The pixels of the image generation unit are therefore effectively utilized, avoiding the loss of image resolution.
  • This application achieves pre-correction of optical distortion through the image generation unit in the "real image generation stage"; there is no need to use a processor peripheral to the display device to pre-correct the image to be displayed, so no additional power consumption or cost is incurred.
  • the optical imaging unit corresponds to a first distortion parameter
  • the first distortion parameter is a parameter characterizing optical distortion
  • The image generation unit includes a light modulator, configured to optically modulate the input optical signal according to the input image signal to generate a real image; the size and/or distribution of the pixels on the light modulator matches a second distortion parameter; the second distortion parameter is determined based on the first distortion parameter and can characterize the pre-distortion.
  • The size and/or distribution of the pixels of the light modulator is designed according to the first distortion parameter, that is, the optical distortion of the optical imaging unit. The real image generated by the light modulator therefore has, relative to the image to be displayed, a pre-distortion that can be characterized by the second distortion parameter, and this pre-distortion can offset the optical distortion caused when the subsequent optical imaging unit performs virtual image imaging. In this way, pre-correction of the optical distortion is implemented in the "real image generation stage", which not only avoids loss of image resolution but also avoids additional power consumption and cost.
  • optical distortion may include barrel distortion and pincushion distortion.
  • the optical distortion caused by the optical imaging unit is barrel distortion
  • The size of some or all pixels on the light modulator is compressed, that is, the pixel size is reduced, so that the image generation unit generates a real image with pincushion distortion.
  • The barrel distortion caused to the virtual image offsets the pincushion distortion of the real image, and finally an ideal virtual image is formed.
  • the optical distortion caused by the optical imaging unit is pincushion distortion
  • the size of some or all pixels on the light modulator is expanded, that is, the pixel size is increased, so that the image generation unit generates a real image with barrel distortion.
  • When the real image enters the optical imaging unit and the optical imaging unit performs virtual image imaging on it, the pincushion distortion caused to the virtual image offsets the barrel distortion of the real image, and finally an ideal virtual image is formed.
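  • The mutual cancellation described above can be sketched numerically. The snippet below uses a first-order radial distortion law r' = r·(1 + k·r²), where k > 0 gives pincushion and k < 0 gives barrel distortion; the coefficients are invented for illustration and are not from this application, and the cancellation is only approximate at this order.

```python
import numpy as np

def radial_distort(points, k):
    """First-order radial distortion: r' = r * (1 + k * r^2).
    k > 0 gives pincushion, k < 0 gives barrel (normalized coordinates)."""
    r2 = np.sum(points**2, axis=-1, keepdims=True)
    return points * (1.0 + k * r2)

# Optical imaging unit assumed to introduce barrel distortion (k_optics < 0).
k_optics = -0.05
# Pre-distortion applied in the real-image stage: pincushion of equal magnitude.
k_pre = +0.05

grid = np.stack(np.meshgrid(np.linspace(-1, 1, 5),
                            np.linspace(-1, 1, 5)), axis=-1).reshape(-1, 2)

pre_distorted = radial_distort(grid, k_pre)              # real image with pincushion pre-distortion
virtual_image = radial_distort(pre_distorted, k_optics)  # after the optics' barrel distortion

residual = np.max(np.abs(virtual_image - grid))
print(f"max residual deviation: {residual:.4f}")  # small but nonzero: first-order cancellation
```

A practical design would derive the pre-distortion from the measured first distortion parameter rather than simply flipping the sign, but the sketch shows the principle: the two deformations largely cancel, leaving a near-ideal virtual image.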
  • Alternatively, only the pixels in a certain area may be compressed or expanded, which can still ensure the formation of an ideal virtual image.
  • The light modulator includes a first area and a second area, and the pixels in the first area and the pixels in the second area are distributed in an array; the size of the pixels in the first area is different from the size of the pixels in the second area, and/or the array distribution mode of the pixels in the first area is different from that of the pixels in the second area.
  • The size and distribution of the pixels may specifically be the size and distribution of the light-emitting area in the sub-pixel region where each sub-pixel is located. The dimensions include width, height, area, etc.; the distribution includes the position and, for an array distribution, the array direction, spacing, etc.
  • The first area is farther from the center of the light modulator than the second area; the size and/or array distribution of the pixels in the first area matches the second distortion parameter. In other words, the size and/or array distribution of the pixels in the first area corresponds to the first distortion parameter.
  • The light modulator is thus divided into two parts, and the size and/or array distribution of the pixels in the two parts can follow different rules. The size and/or array distribution of the pixels in the first area is designed based on the second distortion parameter, that is, it follows the rule corresponding to the second distortion parameter, such as a compression design rule or an expansion design rule. The size and/or array distribution of the pixels in the second area follows another rule, such as the standard design rule for light modulators. The standard design rule here can be understood to mean that the pixel size and/or array distribution pattern has nothing to do with the second distortion parameter, that is, nothing to do with the imaging defects of the optical imaging unit.
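  • As a purely illustrative sketch of this two-rule layout, the snippet below assigns every pixel a size scale: pixels in a hypothetical "first area" (beyond a normalized radius threshold) follow a compression rule driven by an assumed second distortion parameter `k2`, while "second area" pixels keep the standard size. Both the threshold and `k2` are invented for illustration, not taken from this application.

```python
import numpy as np

def pixel_scale_map(n_rows, n_cols, r_threshold=0.6, k2=0.08):
    """Per-pixel size scale for a light modulator (illustrative sketch).

    Pixels whose normalized distance from the panel center exceeds
    r_threshold (the 'first area') are compressed according to a
    hypothetical second distortion parameter k2; pixels in the
    'second area' keep the standard size (scale 1.0)."""
    ys = np.linspace(-1, 1, n_rows)
    xs = np.linspace(-1, 1, n_cols)
    yy, xx = np.meshgrid(ys, xs, indexing="ij")
    r2 = xx**2 + yy**2
    scale = np.ones((n_rows, n_cols))
    first_area = r2 > r_threshold**2
    # Compression rule for the first area: shrink pixels with distance from
    # the center, producing a pincushion-shaped real image (to offset barrel optics).
    scale[first_area] = 1.0 / (1.0 + k2 * r2[first_area])
    return scale

scale = pixel_scale_map(9, 16)
print(scale[0, 0], scale[4, 8])  # corner pixel compressed, center pixel unchanged
```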
  • When the optical distortion is barrel distortion, the width of the light modulator along the column direction of the pixels gradually increases from the middle to both ends in the row direction, so that the shape of the real image is "pincushion".
  • the size of the pixels in the first area is smaller than the size of the pixels in the second area, so that the pre-distortion of the real image becomes pincushion distortion, that is, it exhibits pincushion distortion.
  • a row of pixels in the first area is distributed along a curve, wherein pixels near both ends of the curve are farther away from the second area than pixels near the midpoint of the curve.
  • a row of pixels in the first area can gradually move away from the center line of the light modulator parallel to the row direction from the middle to both ends in the row direction.
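  • The curved-row layout just described can be sketched as follows. A simple quadratic bow displaces the end pixels of a row farther from the panel center line (and hence from the second area), and, matching the pincushion pre-distortion case where first-area pixels are compressed, the pixel width shrinks from the middle toward both ends. The `bow` and `narrow` parameters are invented for illustration.

```python
import numpy as np

def curved_row(n_pixels, row_y, bow=0.03, narrow=0.2):
    """Illustrative sketch of one pixel row in the 'first area'.

    The row bows away from the panel center line (pixels near both ends
    sit farther from the second area than pixels near the midpoint), and
    the relative pixel width decreases from the middle toward both ends."""
    x = np.linspace(-1, 1, n_pixels)
    # Quadratic bow: end pixels displaced farther from the center line.
    y = row_y + np.sign(row_y) * bow * x**2
    width = 1.0 - narrow * x**2          # relative pixel width
    return x, y, width

x, y, w = curved_row(11, row_y=0.8)
print(y[0] > y[5], w[0] < w[5])  # ends farther out and narrower than the middle
```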
  • Optical pincushion distortion here can be understood as pincushion distortion caused by factors such as assembly errors and manufacturing accuracy of the optical lenses; its formation principle differs from that of the pincushion distortion produced by the light modulator in this application.
  • The width of the light modulator along the column direction of the pixels gradually decreases from the middle to both ends in the row direction, so that the shape of the real image is "barrel".
  • the size of the pixels in the first area is larger than the size of the pixels in the second area, so that the pre-distortion of the real image becomes barrel distortion, that is, barrel distortion is exhibited.
  • a row of pixels in the first area is distributed along a curve, wherein pixels close to the midpoint of the curve are farther away from the second area than pixels close to both ends of the curve.
  • a row of pixels in the first area gradually moves away from the center line of the light modulator parallel to the row direction from the middle to both ends in the row direction.
  • the width of the pixels gradually increases from the middle to both ends.
  • Such an array distribution of pixels in the first area can make the barrel distortion imparted to the real image by the light modulator closer to optical barrel distortion, and thus more closely match the pincushion distortion caused by the optical imaging unit; after the two offset each other, the virtual image is more ideal.
  • The field of view angle corresponding to the pixels in the first area may be greater than or equal to a preset field of view angle; that is to say, the first area is a large-field-of-view area. In this case, there is no need to compress or expand all the pixels of the light modulator, and an ideal virtual image can still be formed within the error tolerance of the human eye.
  • the light modulator further includes an edge region far away from the center of the light modulator relative to the first region, and the size of the pixels in the edge region is smaller than the size of the pixels in the first region.
  • the smoothness of the edge of the light modulator can be further optimized by adjusting the position of the pixels at the edge of the light modulator, for example, by staggering continuous pixels, thereby optimizing the smoothness of the edge of the real image and optimizing the image display effect.
  • the display device further includes: a light diffusion element located on the light exit side of the image generation unit, used to diffuse the first light beam corresponding to the real image to improve the uniformity of the imaging picture.
  • the optical imaging unit includes the above-mentioned light diffusion element.
  • the optical imaging unit includes a folded optical path element and a virtual image imaging lens group.
  • the folded light path element is used to receive the first light beam and transmit the first light beam to the virtual image imaging lens group;
  • the virtual image imaging lens group is used to reflect the first light beam, and the reflected light reaches the folding light path element;
  • The folded light path element is also used to transmit the light beam reflected by the virtual image imaging lens group, so as to emit a second light beam for forming the virtual image.
  • the first light beam is circularly polarized light
  • the folded optical path element includes a polarization converter and a first film layer, and the first film layer is located on the side of the polarization converter away from the virtual image imaging lens group ;
  • the polarization converter is used to receive the first light beam and convert the first light beam into the linearly polarized light of the first polarization state;
  • The first film layer is used to reflect the first light beam of the first polarization state from the polarization converter back to the polarization converter;
  • the polarization converter is also used to convert the first light beam of the first polarization state from the first film layer into circularly polarized light, and emits it to the virtual image imaging lens group;
  • The polarization converter is also used to convert the light beam reflected by the virtual image imaging lens group into linearly polarized light of the second polarization state; the first film layer is also used to transmit the light beam of the second polarization state from the polarization converter, finally emitting the second light beam.
  • The above-mentioned polarization converter can be a quarter-wave plate.
  • the folded optical path element can reflect the first beam to the virtual image imaging lens group, and the virtual image imaging lens group processes the first beam.
  • The folded light path element can also transmit the light beam emitted by the virtual image imaging lens group so that it can enter the user's eyes; that is, it can fold the light path and improve the integration of the display device, which is conducive to miniaturization. At the same time, it can be seen from the propagation process of the above beam that no loss of the beam is caused, so the imaging quality can be guaranteed.
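  • The polarization bookkeeping above can be checked with a small Jones-calculus sketch. This is an idealized model under one common sign convention: mirror reflections are treated as the identity on the Jones vector, which is sufficient to show that three passes through the quarter-wave plate turn the reflected ("first") linear state into the orthogonal transmitted ("second") state, so the fold is in principle lossless.

```python
import numpy as np

# Jones-calculus sketch of the folded light path (idealized; sign
# conventions simplified, mirror reflections modeled as identity).
QWP = np.diag([1, 1j])                        # polarization converter (quarter-wave plate)
circular_in = np.array([1, 1j]) / np.sqrt(2)  # first light beam, circularly polarized

lin_first = QWP @ circular_in   # after converter: linear, "first polarization state" (film reflects it)
circ_back = QWP @ lin_first     # reflected by film, second pass -> circular, toward lens group
lin_second = QWP @ circ_back    # reflected by lens group, third pass -> "second polarization state"

# The two linear states are orthogonal, so the film that reflected the
# first state transmits the second: the path is folded without (ideal) loss.
overlap = abs(np.vdot(lin_first, lin_second))
print(f"overlap between first and second states: {overlap:.3f}")
```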
  • the virtual image imaging lens group includes an optical free-form surface, and the optical free-form surface can be provided by an optical free-form mirror.
  • the folded light path element includes a second film layer, a third film layer and a film-carrying sheet.
  • The third film layer and the second film layer may be located, in sequence, on the side of the film-carrying sheet away from the virtual image imaging lens group.
  • the second film layer is used to reflect a part of the light in the first light beam to the virtual image imaging module, and transmit a part of the light beam reflected by the virtual image imaging module to the third film layer.
  • the third film layer is used to transmit the transmitted light from the second film layer.
  • the transmitted light of the third film layer is the second light beam.
  • the second light beam continues to propagate and finally enters the user's eyes, allowing the user to see an enlarged virtual image.
  • the structure of the folded optical path element is simple and easy to implement.
  • a part of the first light beam can be reflected to the virtual image imaging lens group, and the part of the light beam can be processed by the virtual image imaging lens group.
  • A part of the light beam emitted by the virtual image imaging lens group can be transmitted, allowing the user to see an enlarged virtual image; the optical path is thus folded, improving the integration of the display device and being conducive to its miniaturization.
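  • A simple energy budget illustrates the trade-off of this partial-reflection fold: if the second film layer has reflectance R (ideal films, no absorption assumed), a fraction R of the first beam is reflected toward the lens group and a fraction (1 - R) of the returning beam is transmitted, so the delivered fraction is R·(1 - R), maximized at R = 0.5 with only 25% delivered. This is why the partial-reflection fold is simple but lossier than the polarization-based fold described above.

```python
import numpy as np

# Energy-budget sketch for the partial-reflection fold: the delivered
# fraction of the first beam is R * (1 - R) (reflect once, transmit once).
R = np.linspace(0.0, 1.0, 101)
delivered = R * (1 - R)
best = R[np.argmax(delivered)]
print(f"best reflectance R = {best:.2f}, delivered fraction = {delivered.max():.2f}")
# -> maximum of 25% at R = 0.50
```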
  • The present application provides a display apparatus, including a processor and any one of the above display devices.
  • the processor is configured to send an image signal to an image generation unit in the display device.
  • The present application provides a vehicle, which includes a reflective element and any one of the above display devices or display apparatuses.
  • The display device or display apparatus is installed on the vehicle.
  • The display device or display apparatus is used to emit a light beam corresponding to the virtual image to the reflective element; the reflective element is used to reflect the light beam to the user's eyes.
  • The reflective element is, for example, a windshield of the vehicle, and in the display device, the optical imaging unit includes the reflective element.
  • The vehicle has a seat, and the display device or display apparatus is installed on the backrest of the seat.
  • Figure 1 is a schematic diagram of optical distortion given in an embodiment of the present application.
  • Figure 2 is a schematic diagram of the imaging process of a display device according to an embodiment of the present application.
  • Figure 3 is a schematic diagram of an application scenario of a display device provided by an embodiment of the present application.
  • Figure 4 is a schematic structural diagram of a display device provided by an embodiment of the present application.
  • Figure 5 is a schematic diagram of the imaging process of another display device provided by an embodiment of the present application.
  • Figure 6 is a schematic diagram of the relationship between the optical modulator and the first distortion parameter and the second distortion parameter according to the embodiment of the present application;
  • Figure 7 is a schematic diagram of the relationship between the optical modulator structure and the real image provided by the embodiment of the present application.
  • Figure 8 is a schematic diagram of an optical modulator provided by an embodiment of the present application.
  • Figure 9 is a schematic diagram of local area pixels in the light modulator provided by the embodiment of the present application.
  • Figure 10 is a schematic diagram of an optical modulator provided by an embodiment of the present application.
  • Figure 11 is a schematic diagram of another optical modulator provided by an embodiment of the present application.
  • Figure 12 is a schematic structural diagram of another display device provided by an embodiment of the present application.
  • Figure 13 is a schematic structural diagram of an optical imaging unit provided by an embodiment of the present application.
  • Figure 14 is a schematic structural diagram of another display device including the optical imaging unit shown in Figure 13 provided by an embodiment of the present application;
  • Figure 15 is a schematic diagram of an optical path in an optical imaging unit provided by an embodiment of the present application.
  • Figure 16 is a schematic diagram of an optical path in another optical imaging unit provided by an embodiment of the present application.
  • Figure 17 is a schematic diagram of a display device provided by an embodiment of the present application.
  • Figure 18 is a schematic diagram of applying the display device provided in the embodiment of the present application to a head-up display device HUD of a vehicle;
  • Figure 19 is a schematic diagram of a possible functional framework of a vehicle according to the embodiment of the present application.
  • The terms "first" and "second" are used for descriptive purposes only and shall not be understood as indicating or implying relative importance or implicitly indicating the quantity of the indicated technical features. Therefore, features defined as "first" and "second" may explicitly or implicitly include one or more of these features. In the description of these embodiments, unless otherwise specified, "plurality" means two or more.
  • Light display products include desktop displays, head-up displays (HUD), and projection lights. Their application scenarios include but are not limited to transportation, such as cars and trains; desktop display devices can also be used in fields such as education, for example in various desktops with a desktop display function.
  • A characteristic of the above light display products is that, through an optical imaging unit, they can present a virtual image with an enlarged field of view and frame and map it into the eyes of the observer, thereby giving the observer a large-screen viewing experience.
  • The optical imaging unit is generally composed of optical lenses. The virtual image formed by the optical imaging unit, relative to the real image generated by the image generation unit, loses similarity, that is, it undergoes geometric deformation. This geometric deformation is also called optical distortion.
  • The optical distortion here can also be understood as follows: for an ideal optical imaging unit, the magnification of the virtual image relative to the real image is constant, that is, there is high similarity between the virtual image and the real image, and no optical distortion occurs.
  • For an actual optical imaging unit, the magnification of the virtual image varies with the field of view, and the larger the field of view, the greater the change in magnification; the virtual image therefore loses similarity with the real image, that is, optical distortion occurs.
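  • This definition of distortion, as deviation from a constant magnification, can be made concrete with a toy computation. The magnification law below (the values of `m0` and `dm`) is invented purely for illustration; the point is that relative distortion grows with field angle, matching the statement above.

```python
import numpy as np

def distortion_percent(theta_deg, m0=10.0, dm=2.0):
    """Relative optical distortion at field angle theta for a toy
    magnification law m(theta) = m0 * (1 + dm * theta_rad**2).
    m0 and dm are invented for illustration."""
    theta = np.radians(theta_deg)
    m = m0 * (1.0 + dm * theta**2)
    return 100.0 * (m - m0) / m0

for theta in (0, 5, 10):
    print(f"field angle {theta:2d} deg -> distortion {distortion_percent(theta):.2f}%")
```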
  • The optical distortion caused by the optical imaging unit to the virtual image can be pincushion distortion as shown in (b) of Figure 1, or barrel distortion as shown in (c) of Figure 1, in which one grid cell represents one pixel.
  • Barrel distortion causes some pixel positions of the actual virtual image to shift away from the center of the image, and the pixel size becomes larger; pincushion distortion causes some pixel positions of the actual virtual image to shift toward the center of the image, and the pixel size becomes smaller.
  • This imaging defect of the optical imaging unit causes the user to see a geometrically deformed image, which will undoubtedly affect the user's viewing experience.
  • In one conventional implementation, a processor peripheral to the display device performs pre-correction processing on the image to be displayed to obtain a pre-corrected image, and the image signal of the pre-corrected image is input to the image generation unit, so that the image generation unit generates a real image of the pre-corrected image. After the real image of the pre-corrected image is processed by the optical imaging unit, a relatively ideal virtual image can be formed.
  • This process can be divided into three stages. The first stage is the "pre-correction processing stage": through pre-correction processing, a pre-corrected image with geometric deformation relative to the image to be displayed is obtained. The second stage is the "real image generation stage": after the image signal of the pre-corrected image is input to the image generation unit, the image generation unit generates a real image of the pre-corrected image. The third stage is the "virtual image formation stage": after the real image of the pre-corrected image passes through the optical imaging unit, the optical distortion caused by the optical imaging unit can offset the geometric deformation of the pre-corrected image to a certain extent, thereby forming a relatively ideal virtual image.
  • For example, the image to be displayed is rectangular while the pre-corrected image is "barrel-shaped", which means that the positions of a large number of pixels shift and the pixel sizes also change. Therefore, when the image generation unit generates a real image of the pre-corrected image, the pixels of the image generation unit are not fully utilized, resulting in a loss of image resolution.
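  • The resolution loss can be estimated with a toy model: map each panel pixel of a rectangular image through a barrel-shaped pre-correction and count how much of the panel the compressed image still covers. The boundary model and coefficient `k` below are invented for illustration, not taken from this application; the point is only that software pre-correction leaves a fraction of the panel's pixels unused, whereas generating the pre-distorted real image directly in the light modulator does not.

```python
import numpy as np

def used_fraction(n=512, k=0.15):
    """Fraction of an n x n panel covered when the rectangular image is
    barrel-pre-corrected with r' = r * (1 - k * r^2) (toy model)."""
    ys, xs = np.meshgrid(np.linspace(-1, 1, n), np.linspace(-1, 1, n), indexing="ij")
    r2 = xs**2 + ys**2
    fx, fy = xs * (1 - k * r2), ys * (1 - k * r2)   # pre-corrected pixel positions
    ix = np.clip(((fx + 1) / 2 * (n - 1)).round().astype(int), 0, n - 1)
    iy = np.clip(((fy + 1) / 2 * (n - 1)).round().astype(int), 0, n - 1)
    covered = np.zeros((n, n), dtype=bool)
    covered[iy, ix] = True                          # mark panel cells the image lands on
    return covered.mean()

print(f"panel utilization with software pre-correction: {used_fraction():.1%}")
```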
  • In addition, a processor peripheral to the display device needs to be used to pre-correct the image to be displayed; that is, this implementation requires corresponding hardware support, thus resulting in additional power consumption and cost.
  • embodiments of the present application provide a display device, which can also be called a projection device.
  • The display device can form a relatively ideal virtual image to ensure the user's viewing experience, and can also make full use of the pixels of the image generation unit to avoid loss of image resolution, without increasing power consumption and cost.
  • the display device can be used in audio and video entertainment, education, office, assisted driving, medical and other scenarios.
  • The display device can be a flat display device that displays two-dimensional images, or a three-dimensional display device based on light-splitting stereoscopic display technology that provides a left-eye image and a right-eye image for the user's left eye and right eye respectively, giving the user a three-dimensional visual experience.
  • the display device can be integrated into other equipment as a component or used alone.
  • the display device can be a desktop display device, that is, the display window is located on the desktop of a table structure such as a game table, an office desk, a study table, etc.
  • the desktop display device can display larger-format images within the user's front field of view, thereby providing a more immersive and shocking visual experience and reducing the user's visual fatigue.
  • The imaging beam generated by the desktop display device is reflected by the display window into the user's eyes, thus presenting an image in the user's front field of view.
  • The magnified virtual image of an image can contain book content, auxiliary learning content, images of online courses, etc. In this way, users do not need to lower their heads to read books or take online courses through small-screen smart devices, which helps protect users' eyesight and provides better learning conditions.
  • the display device can be integrated into a near eye display (NED)
  • the NED device may be, for example, an augmented reality (AR) device or a virtual reality (VR) device.
  • the AR device may include but is not limited to AR glasses or AR helmets.
  • the VR device may include but is not limited to VR glasses or VR helmet.
  • As shown in (b) of Figure 3, taking the NED device being VR glasses as an example, when the user wears the VR glasses, the VR glasses can present an enlarged virtual image corresponding to the image in the field of view in front of the user, thereby immersing the user in virtual reality scenarios for playing games, watching videos, participating in virtual meetings, video shopping, etc.
  • the display device can be integrated into the HUD.
  • the HUD can project the imaging beam to the eyes of the user (driver), thereby presenting an enlarged virtual image of the image in the user's forward field of view.
  • the image can include instrument information, navigation information, etc., so that the user does not need to lower his head to view this information and avoid affecting driving safety.
  • The types of HUD include but are not limited to the windshield HUD (W-HUD), the augmented reality head-up display (AR-HUD), etc.
  • the display device can also be integrated into the car light.
  • The car lights can also implement an adaptive driving beam (ADB) system, which can display in front of the vehicle an enlarged virtual image of content such as text, traffic signs, and video images, thereby providing users with assisted driving or audio-visual entertainment functions.
  • the display device may be a projector.
  • the projector can project the enlarged virtual image corresponding to the image onto the wall or projection screen.
  • FIG. 4 is a schematic structural diagram of a display device 400 provided by an embodiment of the present application.
  • the display device 400 includes an image generation unit 410 and an optical imaging unit 420 .
  • the image generation unit 410 can generate a real image of the image to be displayed corresponding to the image signal according to the input image signal, that is, generate the light beam P1 containing the image information to be displayed, and the light beam P1 can form a real image of the image to be displayed.
  • The optical imaging unit 420 is used to fold (i.e., change the propagation direction of), amplify, and otherwise process the light beam P1, and finally form an enlarged virtual image corresponding to the real image, that is, form a light beam P2 containing the image information to be displayed; the light beam P2 can form the magnified virtual image.
  • the light beam P1 generated by the image generating unit 410 is called the first light beam
  • the light beam P2 emitted by the optical imaging unit 420 is called the second light beam.
  • the positional relationship between the image generation unit 410 and the optical imaging unit 420 may be different.
  • the positional relationship between the image generation unit 410 and the optical imaging unit 420 depends on the optical path design in the specific application scenario, which is not limited in this application.
  • the optical imaging unit 420 is usually located on the light exit side of the image generating unit 410.
  • in order to form a relatively ideal virtual image while fully utilizing the pixels of the image generation unit 410, avoiding loss of image resolution, and without increasing power consumption and cost, in the embodiment of the present application, as shown in Figure 5, the real image of the image to be displayed generated by the image generation unit 410 has pre-distortion relative to the image to be displayed. After the real image enters the optical imaging unit 420, the optical imaging unit 420 forms a corresponding enlarged virtual image, and the optical distortion introduced by the optical imaging unit 420 compensates for the aforementioned pre-distortion. In this way, a relatively ideal virtual image is formed, ensuring the user's viewing experience.
  • the image generation unit 410 directly generates a real image with pre-distortion based on the image signal of the image to be displayed input by a processor peripheral to the display device, rather than generating a real image from the image signal of a pre-corrected image. Therefore, the pixels of the image generation unit 410 are effectively utilized, and loss of image resolution is avoided.
  • this application implements pre-correction of optical distortion in the "real image generation stage" through the image generation unit 410, omitting the "pre-correction processing stage"; it does not require peripheral hardware support of the display device 400, and therefore incurs no additional power consumption or cost.
  • the optical imaging unit 420 corresponds to a distortion parameter, which can be understood as a characteristic parameter of the optical imaging unit 420 .
  • the distortion parameter can characterize the optical distortion caused by the optical imaging unit 420 .
  • the distortion parameter corresponding to the optical imaging unit 420 is referred to as the first distortion parameter below.
  • image generation unit 410 includes a light source and a light modulator.
  • the light modulator is located on the light exit side of the light source, and the light source is used to provide basic light, such as natural light or primary color light, for the light modulator.
  • the light modulator is also called a display panel, such as a liquid crystal display (LCD) or an organic light-emitting diode (OLED) panel. It is used to perform light modulation according to the signal of the image to be displayed to generate a first light beam, i.e., the light beam P1 shown in FIG. 2 above; this first light beam forms a real image of the image to be displayed.
  • the size and/or distribution pattern of the pixels of the light modulator matches the second distortion parameter, and the second distortion parameter is determined based on the above-mentioned first distortion parameter.
  • the size and/or distribution of the pixels of the light modulator is designed according to the first distortion parameter, i.e., the optical distortion of the optical imaging unit 420. The real image generated by the light modulator therefore has, relative to the image to be displayed, a pre-distortion that can be characterized by the second distortion parameter, and this pre-distortion offsets the optical distortion introduced when the optical imaging unit 420 subsequently forms the virtual image. Pre-correction of the optical distortion is thus implemented in the "real image generation stage", which avoids both loss of image resolution and additional power consumption and cost.
  • the above-mentioned optical distortion is defined as the deviation of the real virtual image relative to the ideal virtual image.
  • the deviation can be the position deviation of the same feature point between the real virtual image and the ideal virtual image, or the deviation in height and/or width of the same feature area between the real virtual image and the ideal virtual image.
  • assuming the optical imaging unit 420 were an ideal optical imaging unit with zero distortion, the ideal virtual image is the virtual image that would be formed after the real image passes through that ideal unit.
  • the real virtual image is a virtual image formed after the real image passes through the actual optical imaging unit 420 .
  • the relationship between the real virtual image and the ideal virtual image can be the relationship between the position coordinates of a feature point on the real virtual image and the position coordinates of the same feature point on the ideal virtual image, and the parameter used to characterize/describe this relationship is the first distortion parameter.
  • the first distortion parameter is also a parameter that characterizes/describes optical distortion
  • the first distortion parameter corresponds to the optical imaging unit 420 .
  • the feature points can be pixel positions.
  • (x', y') represents the position coordinates of a pixel in the real virtual image, and (x, y) represents the position coordinates of the same pixel in the ideal virtual image. A formula (for example, a radial model such as x' = x(1 + k1·r² + k2·r⁴) and y' = y(1 + k1·r² + k2·r⁴), where r² = x² + y²) describes the relationship between x' and x and between y' and y; the parameters of that relationship, such as k1 and k2, are the first distortion parameters.
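The patent does not reproduce its formula here, but a standard two-term radial model of the kind the k1/k2 parameters suggest can be sketched as follows (the function name and the center-origin normalization are illustrative assumptions, not taken from the patent):

```python
def radial_distort(x, y, k1, k2):
    """Map a pixel position (x, y) in the ideal virtual image to its
    position (x', y') in the real virtual image under a two-term radial
    distortion model; k1 and k2 act as the first distortion parameters.
    Coordinates are normalized with the image center at the origin."""
    r2 = x * x + y * y                       # squared distance from the center
    scale = 1.0 + k1 * r2 + k2 * r2 * r2     # radial scale factor
    return x * scale, y * scale
```

With k1 < 0 points move toward the center (barrel-like displacement); with k1 > 0 they move outward (pincushion-like displacement).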
  • the optical distortion involved in this application is the distortion caused by the virtual image formed by the optical imaging unit 420.
  • the optical distortion can have different definitions depending on the object to which the virtual image is compared.
  • the first distortion parameter can also be different.
  • the above-mentioned optical distortion is defined as the deviation amount of the virtual image formed by the optical imaging unit 420 relative to the real image entering the optical imaging unit 420 .
  • since the virtual image and the real image have a corresponding relationship, when there are multiple feature points (or feature areas) on the real image, the virtual image also has multiple feature points (or feature areas), and the feature points (or feature areas) in the real image correspond one-to-one to those in the virtual image.
  • the deviation amount of the virtual image relative to the real image can be the position deviation of corresponding feature points in the virtual image and the real image, or the deviation in height and/or width of corresponding feature areas in the virtual image and the real image.
  • the relationship between the virtual image and the real image can be the relationship between the position coordinates of corresponding feature points in the two, and the parameters used to characterize/describe this relationship can serve as the first distortion parameters corresponding to the optical imaging unit 420, i.e., the parameters characterizing the optical distortion of the virtual image.
  • the first distortion parameter corresponding to the optical imaging unit 420 can be obtained through actual measurement or simulated measurement. It should be understood that, under the different definitions of optical distortion above, the measurement methods differ accordingly. Guided by the definitions of optical distortion given in the embodiments of the present application, those skilled in the art can select an appropriate measurement method to obtain the first distortion parameter corresponding to the optical imaging unit 420; the embodiments of the present application do not limit the specific measurement method.
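One concrete way to obtain such a parameter from measured data, sketched here under the assumption of a two-term radial model (the function, the model, and the measurement setup are illustrative, not the patent's procedure): fit k1 and k2 by least squares to matched point pairs taken from the ideal and real virtual images.

```python
def fit_distortion(points_ideal, points_real):
    """Least-squares estimate of (k1, k2) assuming x' = x*(1 + k1*r^2 + k2*r^4)
    and likewise for y. points_* are lists of (x, y) tuples in center-origin
    coordinates; each ideal point is matched to its measured real position."""
    # Accumulate the 2x2 normal equations of the linear system in (k1, k2).
    a11 = a12 = a22 = b1 = b2 = 0.0
    for (x, y), (xd, yd) in zip(points_ideal, points_real):
        r2 = x * x + y * y
        for c, cd in ((x, xd), (y, yd)):
            u, v = c * r2, c * r2 * r2          # regressors for k1 and k2
            a11 += u * u; a12 += u * v; a22 += v * v
            b1 += u * (cd - c); b2 += v * (cd - c)
    det = a11 * a22 - a12 * a12
    return ((a22 * b1 - a12 * b2) / det, (a11 * b2 - a12 * b1) / det)
```

Given point pairs generated by the model itself, this recovers the parameters exactly; with real measurement noise it returns the least-squares best fit.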
  • FIG. 6 exemplarily shows the relationship between the light modulator and the first distortion parameter and the second distortion parameter.
  • the first distortion parameter describes the relationship between the virtual image R formed by the optical imaging unit 420 and the real image input to the optical imaging unit 420.
  • since the virtual image R formed by the optical imaging unit 420 is expected to be the ideal virtual image, the virtual image R can be regarded as known.
  • the position coordinates of each pixel in the virtual image R, together with the first distortion parameter, can be used to determine the coordinates of the corresponding pixel in the real image D, and thus the real image D can be obtained.
  • the real image D is a pre-distorted real image generated by the desired image generation unit 410, and its pre-distortion can be characterized by the second distortion parameter, that is, the relationship between the real image D and the image I to be displayed can be characterized by the second distortion parameter. In this way, when the size and/or distribution pattern of the pixels of the light modulator matches the second distortion parameter, based on the light modulator, the desired real image D can be generated according to the image to be displayed.
  • the ideal virtual image R is used as the virtual image imaging target, and the real image D generated by the desired image generation unit 410 can be determined in combination with the first distortion parameter.
  • the relationship between the real image D and the image I to be displayed, that is, the second distortion parameter can be further obtained. Designing the size and/or distribution of the pixels of the light modulator based on the second distortion parameter can make the real image generated by the image generation unit 410 a desired real image D with pre-distortion.
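The step from the target virtual image R back to the pre-distorted real image D amounts to inverting the optics' distortion map. Under the same illustrative two-term radial model, the inverse has no closed form, but a short fixed-point iteration recovers it for mild distortion (the function name, model, and iteration count are assumptions, not the patent's method):

```python
def predistort(x, y, k1, k2, iters=25):
    """Find the pre-distorted coordinates (xd, yd) in the real image D such
    that the optics' radial distortion x' = x*(1 + k1*r^2 + k2*r^4) maps
    them back onto the target ideal-virtual-image point (x, y)."""
    xd, yd = x, y
    for _ in range(iters):
        r2 = xd * xd + yd * yd
        s = 1.0 + k1 * r2 + k2 * r2 * r2
        xd, yd = x / s, y / s        # fixed-point update toward the inverse map
    return xd, yd
```

Applying the forward distortion to the returned point reproduces the target point, which is exactly the cancellation the text describes: the optics' distortion offsets the pre-distortion.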
  • a light modulator has a panel area including pixels, each pixel including a plurality of sub-pixels.
  • each pixel is composed of the three primary colors red, green, and blue (RGB).
  • Each color on each pixel is a "sub-pixel.”
  • in some implementations, each pixel includes 4 sub-pixels: one red, one green, and two blue.
  • the panel area includes sub-pixel areas distributed according to certain rules, and the sub-pixels are located within the sub-pixel area.
  • the sub-pixel area includes a light-emitting area, and the aperture ratio of the sub-pixel area represents the area ratio of the light-emitting area in the sub-pixel area.
  • the sub-pixels here are commonly known as pixels.
  • the size and distribution method of the pixels mentioned in the embodiments of the present application can specifically be the size and distribution method of the light-emitting area in the sub-pixel area where the sub-pixel is located.
  • the dimensions include width, height, area, etc.
  • the distribution method includes position, array distribution direction, spacing, etc.
  • the optical distortion caused by the optical imaging unit 420 appears as a change in the shape of the image.
  • the images after distortion shown in (b) and (c) in Figure 3 have obvious shape changes compared to the image before distortion shown in (a) in Figure 3.
  • the undistorted shape is referred to as the ideal shape and the distorted shape is referred to as the non-ideal shape.
  • the real image generated by the image generation unit 410 is deformed relative to the image to be displayed; in other words, the image to be displayed has an ideal shape, while the real image is pre-deformed and takes on a non-ideal shape.
  • the above-mentioned pre-distortion includes this pre-distortion.
  • the shape of the real image depends on the shape of the light modulator, whose shape in turn depends on the size and distribution of the pixels on it. Therefore, when the size and/or distribution of the pixels on the light modulator matches the second distortion parameter, the shape of the light modulator may be non-ideal, ensuring that a real image with a non-ideal shape is generated and, in turn, that the optical imaging unit 420 forms a virtual image with an ideal shape.
  • the real image generated by it has an ideal shape, specifically a rectangle.
  • the real image enters the optical imaging unit 420, due to the optical distortion caused by the optical imaging unit 420, the virtual image formed has a non-ideal shape and is no longer a rectangle.
  • by contrast, when the shape of the light modulator is non-ideal, the real image generated by it has a non-ideal shape, and the virtual image formed can then have the ideal shape, that is, a rectangle.
  • when the shape of the light modulator is irregular, the area bounded by its edge region is irregular in shape and size. Limited by this shape and size, the edge area cannot accommodate pixels of the same size as the internal area, which easily causes defects such as "burrs", "gaps" and color casts in the real image. Therefore, to avoid such defects in the generated real image, in one possible implementation, the pixels distributed at the edge of the light modulator (the "edge area" for short) are smaller than the other pixels.
  • FIG. 8 exemplarily shows the size and distribution of pixels in the light modulator.
  • the size of pixels distributed in the edge area is smaller than the size of other pixels.
  • the sub-pixel area located in the edge area is smaller and has an irregular shape.
  • for pixels located in the edge area, the size and position can be adapted to the size and aperture ratio of the corresponding sub-pixel area, thereby avoiding defects such as "burrs", "gaps" and color casts in the generated real image.
  • the application is not limited to this.
  • the optical distortion caused by the optical imaging unit 420 not only manifests itself as a change in the shape of the image, but also as changes in pixel size and pixel position, that is, distortion of the picture, such as the barrel distortion and pincushion distortion shown in Figure 3.
  • barrel distortion is a distortion phenomenon in which the picture expands into a barrel shape; viewed in terms of pixel size changes, the pixels become larger.
  • pincushion distortion is a distortion phenomenon in which the picture is compressed into a pincushion shape; viewed in terms of pixel size changes, the pixels become smaller. It should be understood that barrel distortion and pincushion distortion can be accompanied by changes in the shape of the image.
  • the size and array distribution of the pixels on the light modulator are designed according to the type of optical distortion caused by the optical imaging unit 420 .
  • the optical distortion caused by the optical imaging unit 420 is barrel distortion
  • the size of some or all pixels on the light modulator is compressed and designed so that the image generation unit 410 generates a real image with pincushion distortion.
  • the barrel distortion caused by the virtual image offsets the pincushion distortion of the real image, and finally forms an ideal virtual image.
  • the optical distortion caused by the optical imaging unit 420 is pincushion distortion
  • the size of some or all of the pixels on the light modulator is designed to be expanded, so that the image generation unit 410 generates a real image with barrel distortion.
  • the pincushion distortion caused to the virtual image offsets the barrel distortion of the real image, and finally forms an ideal virtual image.
  • compressing the size of the pixel can be understood as reducing the size of the pixel, that is, reducing the size of the light-emitting area in the sub-pixel area.
  • Expanding the size of the pixel can be understood as increasing the size of the pixel, that is, increasing the size of the light-emitting area in the sub-pixel area.
  • the pixels obtained through compression design can be shown in (b) in Figure 9
  • the pixels obtained through expansion design can be shown in (c) in Figure 9 .
  • the pixels obtained by the compression design are smaller than the normal-sized pixels
  • the pixels obtained by the expansion design are larger than the normal-sized pixels.
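The compression/expansion rule can be put in numbers. Assuming, purely for illustration (this is not the patent's design formula), that the optics' local magnification at normalized radius r is m(r) = 1 + k1·r² + k2·r⁴, with m > 1 at the edge for the "pixels become larger" barrel case described above, the compensating modulator pixel is scaled by the reciprocal:

```python
def compensating_pixel_scale(x, y, k1, k2):
    """Relative size of a light-modulator pixel at normalized position (x, y)
    that cancels an optics-side local magnification m = 1 + k1*r^2 + k2*r^4.
    For barrel optics that enlarge edge pixels (m > 1 there), the result is
    below 1: a compression design, i.e. a pincushion-pre-distorted real image.
    For pincushion optics (m < 1 at the edge), the result exceeds 1: an
    expansion design, i.e. a barrel-pre-distorted real image."""
    r2 = x * x + y * y
    return 1.0 / (1.0 + k1 * r2 + k2 * r2 * r2)
```

Center pixels keep their normal size; the further a pixel sits from the center, the stronger the compression (or expansion), matching the text's description that the pre-distortion mainly reshapes the outer pixels.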
  • the real image generated does not have barrel distortion or pincushion distortion.
  • the real image enters the optical imaging unit 420, if the optical distortion caused by the optical imaging unit 420 is barrel distortion, the virtual image formed by it has barrel distortion. If the optical distortion caused by the optical imaging unit 420 is pincushion distortion, the virtual image formed by it has pincushion distortion.
  • the optical distortion caused by the optical imaging unit 420 is barrel distortion
  • the size of some or all of the pixels of the light modulator is designed through compression, that is, smaller than the normal size
  • the real image generated by it has pincushion distortion.
  • the barrel distortion caused by the optical imaging unit 420 can compensate the pincushion distortion of the real image, thereby forming an ideal virtual image.
  • the light modulator includes a first area and a second area, and the pixels in both areas are distributed in arrays; the size of the pixels in the first area differs from the size of the pixels in the second area, and/or the array distribution of pixels in the first area differs from that in the second area. In other words, the light modulator is divided into two parts whose pixel size and/or array distribution may follow different rules.
  • the first area is further away from the center of the light modulator relative to the second area.
  • the size and/or array distribution of pixels in the first area matches the second distortion parameter.
  • the size and/or array distribution of the pixels in the first area corresponds to the first distortion parameter. That is to say, the size and/or array distribution of the pixels in the first area is designed according to the second distortion parameter, i.e., in compliance with the rules corresponding to the second distortion parameter, such as compression design rules or expansion design rules.
  • the size and/or array distribution of the pixels in the second area conform to another rule, such as the standard design rules for light modulators.
  • the standard design rule here can be understood to mean that the pixel size and/or array distribution pattern has nothing to do with the second distortion parameter, i.e., nothing to do with the imaging defects of the optical imaging unit 420.
  • the field of view angle corresponding to the pixels in the first area may be greater than or equal to a preset field of view angle; that is, the first area is the large-field-of-view area. In this case, not all pixels need to be compressed or expanded to ensure that a virtual image which is ideal within the tolerance of the human eye is formed.
  • the size of the pixels in the first area corresponds to the first distortion parameter.
  • the size of the pixels in the first area is smaller than the size of the pixels in the second area, so that the pre-distortion of the real image is pincushion distortion.
  • the size of the pixels in the first area is larger than the size of the pixels in the second area, so that the pre-distortion of the real image is barrel distortion.
  • the array distribution of pixels in the first area corresponds to the first distortion parameter. Specifically, a row of pixels in the first area is distributed along a curve. If the optical distortion generated by the optical imaging unit 420 is barrel distortion, the pixels near both ends of the curve are farther from the second area than the pixels near the midpoint of the curve; if it is pincushion distortion, the pixels near the midpoint of the curve are farther from the second area than the pixels near both ends of the curve.
  • the size and array distribution of the pixels in the first area may both correspond to the first distortion parameter.
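The first-area/second-area split is driven by a field-of-view threshold. A minimal sketch of that test, using a simple pinhole-eye geometry (the pixel pitch, eye relief, and function name are illustrative assumptions, not values from the patent):

```python
import math

def in_first_area(px, py, pitch_mm, eye_relief_mm, fov_threshold_deg):
    """Decide whether the pixel at grid offset (px, py) from the modulator
    center belongs to the first area, i.e. whether its field-of-view angle
    meets or exceeds the preset threshold."""
    x_mm, y_mm = px * pitch_mm, py * pitch_mm
    # Angle subtended at the eye between this pixel and the optical axis.
    angle_deg = math.degrees(math.atan2(math.hypot(x_mm, y_mm), eye_relief_mm))
    return angle_deg >= fov_threshold_deg
```

Pixels near the center fall below the threshold and keep the standard design (second area); outlying pixels exceed it and receive the compression or expansion design (first area).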
  • FIG. 10 illustrates a possible light modulator.
  • the light modulator includes a first area and a second area, and the first area surrounds the periphery of the second area.
  • the pixels in the first area and the second area are distributed in an array, and the field of view angle corresponding to each pixel position in the first area is greater than the preset field of view angle.
  • the size and array distribution of the pixels in the first area are designed according to the second distortion parameter, that is, they match the second distortion parameter.
  • the width of the light modulator along the pixel column direction gradually increases from the middle toward both ends in the row direction, so that the shape of the real image is "pincushion".
  • the size of the pixels in the first area is smaller than the size of the pixels in the second area, so that the pre-distortion of the real image becomes pincushion distortion, that is, pincushion distortion is exhibited. That is to say, the light modulator shown in FIG. 10 can generate a real image with pincushion distortion relative to the image to be displayed.
  • a row of pixels in the first area is distributed along a curve, wherein pixels near both ends of the curve are farther away from the second area than pixels near the midpoint of the curve.
  • a row of pixels located on the left side of the second area in the first area is arrayed along the extension direction of curve S1, and a row of pixels located on the right side of the second area is arrayed along curve S2; a row of pixels located on the upper side of the second area in the first area is arrayed along the extension direction of curve S3, and a row of pixels located on the lower side of the second area is arrayed along curve S4;
  • the pixels in the second area are arranged in arrays along both the height direction and the width direction of the light modulator.
  • the array distribution of pixels in the first area in Figure 10 can also be described as follows: a row of pixels in the first area gradually moves away from the center line of the light modulator parallel to the row direction from the middle to both ends in the row direction.
  • the center line parallel to the row direction is a straight line parallel to the row direction and passing through the geometric center of the light modulator.
  • in addition, in a row of pixels in the first area, along the row direction, the width of the pixels gradually decreases from the middle toward both ends.
  • the pixel size and array distribution in the first area are designed in such a way that the pincushion distortion caused by the light modulator on the real image is closer to the optical pincushion distortion, and thus more closely matched with the barrel distortion caused by the optical imaging unit.
  • optical pincushion distortion can be understood as pincushion distortion caused by assembly errors of optical lenses, manufacturing accuracy and other factors. Its formation principle is different from the principle of pincushion distortion caused by the light modulator in this application.
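The curved row layout of Figure 10 can be sketched numerically: an offset (away from the second area) that is largest at the two ends of a row and zero at its midpoint. A parabola stands in for the actual curves S1-S4, whose exact form the text does not give:

```python
def pincushion_row_offsets(n_cols, max_offset):
    """Per-column offsets, away from the second area, for one first-area
    row of n_cols pixels (n_cols > 1): largest at both ends and zero at the
    midpoint, mimicking the Figure-10 layout. The parabolic curve shape is
    purely illustrative."""
    mid = (n_cols - 1) / 2.0
    return [max_offset * ((i - mid) / mid) ** 2 for i in range(n_cols)]
```

For the barrel-shaped layout of Figure 11 the sign of the offset would simply be flipped, pushing the midpoint (rather than the ends) farther from the second area.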
  • Figure 11 exemplarily shows another possible light modulator.
  • the width of the light modulator along the pixel column direction gradually decreases from the middle toward both ends in the row direction, so that the shape of the real image is "barrel-shaped".
  • the size of the pixels in the first area is larger than the size of the pixels in the second area, so that the pre-distortion of the real image becomes barrel distortion, that is, barrel distortion is exhibited. That is to say, the light modulator shown in FIG. 11 can generate a real image with barrel distortion relative to the image to be displayed.
  • a row of pixels in the first area is distributed along a curve, in which pixels near the midpoint of the curve are farther away from the second area than pixels near both ends of the curve.
  • different from the light modulator shown in Figure 10, in the height direction of the light modulator, a row of pixels in the first area located on the left side of the second area is arrayed along the extension direction of curve S5, and a row located on the right side of the second area is arrayed along curve S6; in the width direction of the light modulator, a row of pixels in the first area located on the upper side of the second area is arrayed along the extension direction of curve S7, and a row located on the lower side of the second area is arrayed along curve S8.
  • the array distribution of pixels in the first area in Figure 11 can also be described as follows: a row of pixels in the first area gradually approaches the center line of the light modulator parallel to the row direction, going from the middle toward both ends in the row direction. In addition, in a row of pixels in the first area, along the row direction, the width of the pixels gradually increases from the middle to both ends.
  • the pixel size and array distribution in the first area are designed in such a way that the barrel distortion caused by the light modulator on the real image is closer to the optical barrel distortion, and thus more closely matched with the pincushion distortion caused by the optical imaging unit. After cancellation, the virtual image formed is more ideal.
  • the light modulators shown in Figure 10 and Figure 11 respectively apply compression and expansion designs to the pixels in the first area, under a given resolution (such as 1400×1050) and panel size. In other words, with the resolution fixed, after the compression design the pixel size in the first area is smaller than that in the second area, and after the expansion design the pixel size in the first area is larger than that in the second area.
  • for example, the second area includes 800×600 pixels, and the area outside the second area includes (1400×1050)−(800×600) pixels; some or all of these pixels are located in the first area, and the size of the pixels in the first area is smaller than the size of the pixels in the second area.
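The pixel budget in this example works out as follows:

```python
# Pixel budget for the example above: a 1400 x 1050 modulator whose central
# second area holds 800 x 600 pixels; the remainder is available for the
# first area (and, in some implementations, an edge area).
total_pixels = 1400 * 1050          # 1,470,000 pixels overall
second_area_pixels = 800 * 600      # 480,000 pixels in the second area
outside_second_area = total_pixels - second_area_pixels
print(outside_second_area)          # pixels available outside the second area
```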
  • the light modulator can also be divided into the above-mentioned first area, second area and edge area.
  • the first area is farther away from the center of the light modulator than the second area
  • the edge area is farther away from the center than the first area.
  • the size of the pixels in the edge area is smaller than the size of the pixels in the first area.
  • the size of the pixels in the edge area may be smaller than the size of the pixels in the second area, or may be larger than the size of the pixels in the second area.
  • the light modulators mentioned in the above embodiments include but are not limited to LCDs and OLEDs.
  • they can also be liquid crystal on silicon (LCOS) displays or digital light processing (DLP) displays.
  • when the display device 400 of the present application uses different types of light modulators, different manufacturing processes need to be used to produce a light modulator whose pixel size and/or distribution pattern matches the second distortion parameter.
  • those skilled in the art can use manufacturing processes well known in the art to prepare the light modulator provided in the embodiments of the present application without creative effort; the available manufacturing processes are not repeated here.
  • the display device 400 may further include a light diffusion element 430 , and the light diffusion element 430 may also be called a diffusion screen.
  • the light diffusing element 430 is located on the light exit side of the image generating unit 410 and is used to diffuse the first light beam P1 emitted by the image generating unit 410 to improve the uniformity of the imaging screen.
  • the light diffusion element as a type of optical lens, will also cause optical distortion of the image when its production accuracy is low and/or it includes a curved surface structure.
  • therefore, when measuring the first distortion parameter corresponding to the optical imaging unit 420, the light diffusion element needs to be regarded as part of the optical imaging unit 420, to ensure the accuracy of the first distortion parameter, i.e., to ensure that the optical distortion it describes includes the optical distortion caused by the light diffusion element. In this way, the light modulator designed according to the first distortion parameter can achieve pre-correction of the optical distortion.
  • the optical imaging unit 420 includes a folded optical path element 421 and a virtual image imaging lens group 422 .
  • the folded optical path element 421 is used to receive the first light beam P1 emitted by the image generation unit 410 and transmit it to the virtual image imaging lens group 422; the virtual image imaging lens group 422 is used to reflect the first light beam to form the second light beam; the folded optical path element 421 is also used to transmit the second light beam so that it enters the user's eyes, allowing the user to see a virtual image.
  • This structural design of the optical imaging unit 420 improves the integration level of the display device 400 and is conducive to miniaturization of the display device 400 .
  • the light diffusion element 430 is located on the optical path between the image generation unit 410 and the folded optical path element 421; the first light beam P1 is diffused by the light diffusion element 430 and then travels to the folded optical path element 421.
  • the virtual image imaging lens group 422 includes an optical freeform surface, which may be provided by an optical freeform mirror.
  • the light output side of the light modulator in the image generation unit 410 may also be provided with a 1/4 wave plate.
  • the quarter wave plate located on the light exit side of the light modulator is used to convert the light beam into circularly polarized light, that is, the first light beam becomes circularly polarized light.
  • for the case where the virtual image imaging lens group 422 includes an optical free-form mirror and the first light beam is circularly polarized light, the specific implementation of the folded optical path element 421 is introduced below.
  • the folded optical path element 421 includes a polarization converter 421a and a first film layer 421b.
  • the first film layer 421b and the polarization converter 421a are located, in sequence, on the side of the substrate away from the optical free-form mirror 422a. Although they are drawn separately in the figure, it should be understood that in practical applications the first film layer 421b and the polarization converter 421a are sequentially attached to the side of the substrate away from the optical free-form mirror 422a.
  • the substrate is used to carry the first film layer 421b and the polarization converter 421a without affecting the light beam.
  • the substrate may be a transparent glass sheet.
  • the polarization converter 421a is used to receive the first light beam P1 (circle), and convert the first light beam P1 (circle) into linearly polarized light of the first polarization state.
  • the first light beam P1 (circle) is converted into S light, that is, P1-s.
  • the first film layer 421b is used to reflect light of the first polarization state. Thus, when P1-s reaches the first film layer 421b, it is reflected back to the polarization converter 421a; under the action of the polarization converter 421a, P1-s is converted into circularly polarized light, namely P1 (circle), and transmitted to the optical freeform mirror 422a.
  • the optical free-form mirror 422a reflects P1 (circle), and the reflected light is the second light beam.
  • the polarization converter 421a is also used to receive the reflected light of the free-form mirror 422a, that is, P1 (circle), and convert it into the polarized light of the second polarization state, that is, P1-p.
  • the P1-p reaches the first film layer 421b.
  • the first film layer 421b is also used to transmit polarized light in the second polarization state. In this way, after P1-p reaches the first film layer 421b, it will be transmitted through the first film layer 421b, continue to propagate, and finally enter the user's eyes, allowing the user to see an enlarged virtual image.
  • the polarization converter 421a can be a quarter wave plate, and the first film layer 421b can be a reflection-transmission polarizing film.
  • In the implementation shown in Figure 15, because the first light beam can be circularly polarized, the folded optical path element can reflect the first light beam to the virtual image imaging lens group, where it can be processed, for example magnified.
  • the folded optical path element can also transmit the light beam emitted by the virtual image imaging lens group so that it can enter the user's eyes; that is, it folds the optical path, improves the integration level of the display device, and is conducive to its miniaturization. At the same time, the beam propagation process described above causes no loss of light, so imaging quality is preserved.
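The round trip described above can be traced with a small state-machine sketch. This is a toy model, not part of the patent: the state names and transition rules simply encode the quarter-wave-plate, polarizing-film, and mirror behaviors stated in the bullets above.

```python
# Toy trace of the polarization-folded light path (Figure 15 implementation).
# States: "circ_L"/"circ_R" = circularly polarized light of either handedness,
# "s"/"p" = linearly polarized light of the first/second polarization state.

def quarter_wave(state):
    # A quarter-wave plate converts circular light to linear and back;
    # which linear state results depends on the handedness (toy rule).
    return {"circ_L": "s", "circ_R": "p", "s": "circ_L", "p": "circ_R"}[state]

def mirror_reflect(state):
    # Reflection at the freeform mirror flips the handedness of circular light.
    return {"circ_L": "circ_R", "circ_R": "circ_L"}.get(state, state)

def film(state):
    # The reflection-transmission polarizing film reflects the first
    # polarization state ("s") and transmits the second ("p").
    return "reflect" if state == "s" else "transmit"

def trace(beam="circ_L"):
    log = []
    beam = quarter_wave(beam)        # P1(circle) -> P1-s
    log.append((beam, film(beam)))   # film reflects P1-s back toward the QWP
    beam = quarter_wave(beam)        # P1-s -> P1(circle), toward the mirror
    beam = mirror_reflect(beam)      # handedness flipped on reflection
    beam = quarter_wave(beam)        # -> P1-p
    log.append((beam, film(beam)))   # film now transmits, beam exits to the eye
    return log

if __name__ == "__main__":
    print(trace())  # [('s', 'reflect'), ('p', 'transmit')]
```

The final "transmit" step is what lets the beam leave the folded path toward the user without, in this idealized model, any loss of light.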
  • the folded optical path element 421 includes a second film layer 421c and a third film layer 421d.
  • the third film layer 421d and the second film layer 421c can be located in sequence on the side of the substrate away from the optical freeform mirror 422a.
  • the second film layer 421c and the third film layer 421d are shown separately. It should be understood that in practical applications, the third film layer 421d and the second film layer 421c can be sequentially attached to the surface of the substrate.
  • the second film layer 421c is used to reflect a part of the light in the first light beam P1.
  • the reflected part of the light is P11, which travels to the freeform mirror 422a.
  • the second film layer 421c simultaneously transmits the remaining light in the first light beam P1 (not shown in the figure).
  • the optical freeform mirror 422a is used to reflect P11. After the reflected P11 reaches the second film layer 421c, a part of it, P111, is transmitted through the second film layer 421c to the third film layer 421d, and the other part (not shown in the figure) is reflected by the second film layer 421c.
  • the third film layer 421d is used to transmit P111.
  • the transmitted light is P2, which continues to propagate and eventually enters the user's eyes, allowing the user to see an enlarged virtual image.
  • the second film layer 421c may be a semi-transmissive and semi-reflective film
  • the third film layer 421d may be a transmission-absorption polarizing film.
  • the structure of the folded optical path element is simple and easy to implement.
  • a part of the first light beam can be reflected to the virtual image imaging lens group, and the part of the light beam can be processed by the virtual image imaging lens group.
  • a part of the light beam emitted by the virtual image imaging lens group can be transmitted. This allows the user to see an enlarged virtual image, thereby folding the optical path, improving the integration of the display device, and conducive to miniaturization of the display device.
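For comparison of the two folded-path variants, a quick estimate of the light budget of this second implementation can be made. The patent only says "a part of the light" is reflected at the second film layer 421c; the 50/50 split below is an illustrative assumption, not a figure from the patent.

```python
# Hypothetical efficiency estimate for the semi-transmissive, semi-reflective
# folding design (Figure 16).  The split ratio is an assumed parameter.

def half_mirror_throughput(reflectance=0.5):
    # First pass: the second film layer reflects a fraction R toward the mirror.
    # Return pass: it transmits a fraction (1 - R) toward the third film layer.
    return reflectance * (1.0 - reflectance)

print(half_mirror_throughput())     # 0.25 -> at most 25% of P1 survives
print(half_mirror_throughput(0.6))  # 0.24 -> any other split does no better
```

Under this assumption at most a quarter of the first light beam reaches the third film layer, which is why the polarization-folding variant of Figure 15, lossless in this idealized sense, can preserve imaging quality better.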
  • the embodiment of the present application also provides a display device.
  • the display device may be a projection display device used in homes, classrooms, conference rooms, auditoriums, cinemas, stadiums, squares, etc.
  • the display device may also be a vehicle-mounted display, a screen or display integrated in a smart home appliance, etc.; it may also be a network TV, a smart TV, or an Internet Protocol TV (IPTV), or be integrated into such devices.
  • Figure 17 is a schematic diagram of a display device provided by an embodiment of the present application.
  • the circuit in the display device mainly includes a processor 1701, internal memory 1702, external memory interface 1703, audio module 1704, video module 1705, power module 1706, wireless communication module 1707, I/O interface 1708, video interface 1709, Controller Area Network (CAN) transceiver 1710, display circuit 1711, and any of the above display devices 400, etc.
  • the processor 1701 and its peripheral components, such as the internal memory 1702, CAN transceiver 1710, audio module 1704, video module 1705, power module 1706, wireless communication module 1707, I/O interface 1708, video interface 1709, and display circuit 1711, can be connected through a bus.
  • the processor 1701 may be called a front-end processor.
  • the processor 1701 includes one or more processing units.
  • the processor 1701 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc.
  • different processing units can be independent devices or integrated in one or more processors.
  • the processor 1701 may also be provided with a memory for storing instructions and data.
  • the memory in the processor 1701 is a cache memory. It may hold instructions or data that the processor 1701 has just used or uses cyclically. If the processor 1701 needs to use the instructions or data again, it can call them directly from this memory, which avoids repeated accesses and reduces the waiting time of the processor 1701, thus improving system efficiency.
  • the function of the processor 1701 can be implemented by a domain controller on the vehicle.
  • the display device may also include a plurality of input/output (I/O) interfaces 1708 connected to the processor 1701 .
  • the interface 1708 may include, but is not limited to, an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a universal serial bus (USB) interface.
  • the above-mentioned I/O interface 1708 can be connected to devices such as a mouse, touch screen, keyboard, camera, speaker, microphone, etc., or to physical buttons on the display device (such as volume keys, brightness adjustment keys, and the power on/off key).
  • Internal memory 1702 may be used to store computer executable program code, including instructions.
  • the internal memory 1702 may include a program storage area and a data storage area.
  • the program storage area can store the operating system and at least one application required for a function (such as a call function, a time-setting function, an AR function, etc.).
  • the data storage area can store data created during use of the display device (such as a phone book, world time, etc.).
  • the internal memory 1702 may include high-speed random access memory, and may also include non-volatile memory, such as at least one disk storage device, flash memory device, universal flash storage (UFS), etc.
  • the processor 1701 executes various functional applications and data processing of the display device by executing instructions stored in the internal memory 1702 and/or instructions stored in a memory provided in the processor 1701 .
  • the external memory interface 1703 can be used to connect an external memory (such as a Micro SD card).
  • the external memory can store data or program instructions as needed.
  • the processor 1701 can read and write these data or program instructions through the external memory interface 1703.
  • the audio module 1704 is used to convert digital audio information into analog audio signal output, and is also used to convert analog audio input into digital audio signals.
  • the audio module 1704 can also be used to encode and decode audio signals, such as playing or recording.
  • the audio module 1704 may be provided in the processor 1701, or some functional modules of the audio module 1704 may be provided in the processor 1701.
  • the display device can implement audio functions through the audio module 1704 and an application processor.
  • the video interface 1709 can receive external audio and video input and may specifically be a high-definition multimedia interface (HDMI), a digital visual interface (DVI), a video graphics array (VGA) interface, a display port (DP), a low-voltage differential signaling (LVDS) interface, etc.
  • the video interface 1709 can also output video.
  • the display device receives video data sent by the navigation system through the video interface or receives video data sent by the domain controller.
  • the video module 1705 can decode the video input by the video interface 1709, for example, perform H.264 decoding.
  • the video module can also encode video collected by the display device, for example performing H.264 encoding on video collected by an external camera.
  • the processor 1701 can also decode the video input from the video interface 1709 and then output the decoded image signal to the display circuit 1711.
  • the above-mentioned display device also includes a CAN transceiver 1710.
  • the CAN transceiver 1710 can be connected to the CAN bus (CAN BUS) of the car.
  • the display device can communicate with the in-vehicle entertainment system (music, radio, video module), vehicle status system, etc.
  • the user can activate the car music playback function by operating the display device.
  • the vehicle status system can send vehicle status information (doors, seat belts, etc.) to the display device for display.
  • the display circuit 1711 and the display device 400 jointly implement the function of displaying images.
  • the display circuit 1711 receives the image signal output by the processor 1701, processes the image signal, and then inputs it into the image generation unit 410 of the display device 400 to generate a real image.
  • the display circuit 1711 can also control the image displayed by the image generation unit 410. For example, control parameters such as display brightness or contrast.
  • the display circuit 1711 may include a driving circuit, an image control circuit, etc.
  • the video interface 1709 can receive input video data (also called a video source).
  • the video module 1705 decodes and/or digitizes the data and then outputs the image signal to the display circuit 1711.
  • the display circuit 1711, in response to the input image signal, drives the image generation unit 410 to image the light beam emitted by the light source, thereby generating a visible image (emitting imaging light).
  • the power module 1706 is used to provide power to devices such as the processor 1701 and the projection device according to the input power (eg, direct current).
  • the power module 1706 may include a rechargeable battery.
  • the above-mentioned power module 1706 can be connected to a power supply module (such as a power battery) of the car, and the power supply module of the car supplies power to the power module 1706 of the display device.
  • the wireless communication module 1707 can enable the display device to communicate wirelessly with the outside world and can provide wireless communication solutions such as wireless local area network (WLAN), wireless fidelity (Wi-Fi), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), and infrared (IR).
  • the wireless communication module 1707 may be one or more devices integrating at least one communication processing module.
  • the wireless communication module 1707 receives electromagnetic waves through the antenna, frequency modulates and filters the electromagnetic wave signals, and sends the processed signals to the processor 1701 .
  • the wireless communication module 1707 can also receive the signal to be sent from the processor 1701, frequency modulate it, amplify it, and convert it into electromagnetic waves through the antenna for radiation.
  • the video data decoded by the video module 1705 can also be received wirelessly through the wireless communication module 1707 or read from the internal memory 1702 or external memory.
  • the display device can receive video data from a terminal device or the in-vehicle entertainment system over a wireless LAN, and can also read the audio and video data stored in the internal memory 1702 or an external memory.
  • the circuit diagram schematically illustrated in this embodiment of the present application does not constitute a specific limitation on the display device.
  • the display device may include more or fewer components than shown in the figures, or some components may be combined, or some components may be separated, or may be arranged differently.
  • the components illustrated may be implemented in hardware, software, or a combination of software and hardware.
  • the above display device can also provide broadcast and television reception functions.
  • the above display device can be integrated into a network TV, a smart TV, or an Internet Protocol TV (IPTV).
  • the embodiment of the present application also provides a desk structure.
  • the desk structure has a desktop with a display window. Any of the above display devices 400 is installed in the desk structure; the second light beam emitted by the display device 400 can reach the display window, and the display window reflects the second light beam so that it enters the user's eyes, allowing the user to see an enlarged virtual image.
  • Embodiments of the present application also provide a vehicle, which includes a reflective element and any of the foregoing display devices or display apparatuses.
  • the reflective element is used to reflect the second light beam from the display device so that the second light beam can be received by the observer's eyes, whereby the observer sees an enlarged virtual image.
  • the above display device is integrated into a head-up display device (HUD) of the vehicle.
  • the reflective element included in the vehicle may specifically be its front windshield.
  • FIG. 18 is a schematic diagram of applying the display device according to the present application to a head-up display device HUD of a vehicle.
  • the optical imaging unit of the display device includes the front windshield of the vehicle.
  • the second light beam corresponding to the virtual image is reflected from the front windshield to the driver's eyes, so that an enlarged virtual image of the image M is formed on one side of the front windshield.
  • the vehicle may be a car, truck, motorcycle, bus, boat, airplane, helicopter, lawn mower, recreational vehicle, amusement-park vehicle, construction equipment, tram, golf cart, train, handcart, etc.; the embodiments of the present application impose no particular limitation.
  • the content of the above image M includes but is not limited to map auxiliary information, indication information of external objects, vehicle status information, entertainment information, etc.
  • Map auxiliary information is used to assist driving.
  • map auxiliary information includes but is not limited to direction arrows, distance and driving time, etc.
  • the indication information of external objects includes but is not limited to safe distance between vehicles, surrounding obstacles and reversing images.
  • the status information of the vehicle is generally the information displayed on the instrument of the vehicle, also known as instrument information, including but not limited to information such as driving speed, driving mileage, fuel level, water temperature, and car light status.
  • the display device may be installed on the back of a seat of the vehicle. Taking a passenger car as an example, the display device can be installed on the backrest of a front seat; in this way, the light beam emitted by the display device enters the eyes of rear passengers, allowing them to view the enlarged virtual image of the image to be displayed.
  • Figure 19 is a schematic diagram of a possible functional framework of a vehicle provided by an embodiment of the present application.
  • the functional framework of the vehicle may include various subsystems, such as the control system 1901, the sensor system 1902, one or more peripheral devices 1903 (one is shown as an example), the power supply 1904, the computer system 1905, and the display system 1906.
  • the vehicle may also include other functional systems, such as an engine system that provides power for the vehicle, etc., which is not limited in this application.
  • the sensor system 1902 may include several detection devices, which can sense the measured information and convert the sensed information into electrical signals or other required forms of information output according to certain rules.
  • these detection devices may include a global positioning system (GPS), a vehicle speed sensor, an inertial measurement unit (IMU), a radar unit, a laser rangefinder, a camera device, a wheel speed sensor, a steering sensor, a gear sensor, or other components used for automatic detection, etc., which are not limited in this application.
  • the control system 1901 may include several elements, such as the illustrated steering unit, braking unit, lighting system, automatic driving system, map navigation system, network time synchronization system, and obstacle avoidance system.
  • the control system 1901 may also include components such as a throttle controller and an engine controller for controlling the driving speed of the vehicle, which are not limited in this application.
  • Peripheral device 1903 may include several elements, such as a communication system, a touch screen, a user interface, a microphone, a speaker, etc. as shown.
  • the communication system is used to realize network communication between vehicles and other devices other than vehicles.
  • the communication system can use wireless communication technology or wired communication technology to realize network communication between the vehicle and other devices.
  • the wired communication technology may refer to communication between vehicles and other devices through network cables or optical fibers.
  • Power supply 1904 represents a system that provides power or energy to the vehicle, which may include, but is not limited to, rechargeable lithium batteries or lead-acid batteries, etc. In practical applications, one or more battery components in the power supply are used to provide electric energy or energy for starting the vehicle. The type and material of the power supply are not limited in this application.
  • the computer system 1905 may include one or more processors (one processor is shown as an example) and a memory (which may also be referred to as a storage device).
  • the memory may be internal to the computer system 1905 or external to it, for example serving as a cache in the vehicle, which is not limited by this application.
  • the processor may include one or more general-purpose processors, such as a graphics processing unit (GPU).
  • the processor can be used to run relevant programs or instructions corresponding to the programs stored in the memory to implement corresponding functions of the vehicle.
  • the memory may include volatile memory, such as RAM; it may also include non-volatile memory, such as ROM, flash memory, or a solid-state drive (SSD); the memory may also include a combination of the above types of memory.
  • the memory can be used to store a set of program codes or instructions corresponding to the program codes, so that the processor can call the program codes or instructions stored in the memory to implement the corresponding functions of the vehicle.
  • a set of program codes for vehicle control can be stored in the memory, and the processor can call the program codes to control the safe driving of the vehicle.
  • the memory may also store information such as road maps, driving routes, sensor data, and the like.
  • the computer system 1905 can be combined with other elements in the vehicle functional framework diagram, such as sensors in the sensor system, GPS, etc., to implement vehicle-related functions.
  • the computer system 1905 can control the driving direction or driving speed of the vehicle based on data input from the sensor system, which is not limited by this application.
  • the display system 1906 may include several elements, such as a windshield, a controller, and the display device 400 described above.
  • the controller is used to generate an image according to user instructions (for example an image containing vehicle status such as vehicle speed and power/fuel level, or an image of augmented reality (AR) content) and send the image content to the projection device; the projection device projects the light carrying the image content onto the windshield, and the windshield reflects that light so that a virtual image corresponding to the image content is presented in front of the driver.
  • the functions of some components in the display system 1906 can also be implemented by other subsystems of the vehicle.
  • the controller can also be a component in the control system.
  • Figure 19 of this application shows four subsystems; the sensor system, the control system, the computer system, and the display system are only examples and do not constitute a limitation.
  • vehicles can combine several components in the vehicle according to different functions to obtain subsystems with corresponding different functions.
  • the vehicle may include more or fewer systems or components, which is not limited by this application.


Abstract

A display device (400) in the field of optical display technology, comprising an image generation unit (410) and an optical imaging unit (420). The image generation unit (410) generates a real image from an image signal input by a processor (1701); the real image is pre-distorted relative to the image corresponding to the image signal. The optical imaging unit (420) images the real image to form a corresponding virtual image, and the optical distortion it introduces into the virtual image compensates for the pre-distortion, thereby forming a relatively ideal virtual image that preserves the user's viewing experience. Moreover, all pixels of the image generation unit (410) are used effectively, avoiding loss of image resolution. A display apparatus and a vehicle are also disclosed.

Description

Display device, display apparatus, and vehicle
This application claims priority to Chinese patent application No. 202211073982.8, entitled "Display device, display apparatus, and vehicle", filed with the China National Intellectual Property Administration on September 2, 2022, the entire contents of which are incorporated herein by reference.
Technical Field
This application relates to the field of optical display technology, and in particular to a display device, a display apparatus, and a vehicle.
Background
Optical display products include desktop display devices, head-up displays (HUD), projection headlamps, and the like. Their application scenarios include, but are not limited to, vehicles such as cars and trains; desktop display devices can also be used in education, for example in desks with desktop display functionality.
A common characteristic of these optical display products is that their optical imaging unit presents a virtual image with an enlarged field of view and picture size, which is projected into the observer's eyes, giving the observer a large-screen viewing experience.
However, because the optical imaging unit is generally composed of optical lenses, deviations in lens manufacturing precision and in the assembly process of the optical imaging unit cause optical distortion in the virtual image formed by the unit: the presented virtual image is geometrically deformed, which degrades the observer's viewing experience.
Summary
Embodiments of this application provide a display device, a display apparatus containing the display device, and a vehicle containing the display device or the display apparatus. The main objective is to provide a display device that can present a distortion-free, ideal virtual image of an image without losing image resolution.
In a first aspect, this application provides a display device that can be applied in scenarios such as audio-visual entertainment, office work, study, and driver assistance, including but not limited to desktop display, near-eye display, head-up display, projectors, in-vehicle display apparatus, and vehicle lamps. The display device may be a flat display device or a stereoscopic display device, and may be used on its own or integrated into other equipment as a component.
The display device includes an image generation unit and an optical imaging unit. The image generation unit generates a real image from an image signal input by a processor. The image signal corresponds to the ideal, undistorted image to be displayed, and the real image is pre-distorted relative to that image. The optical imaging unit images the real image to form a corresponding virtual image, for example an enlarged virtual image, and the optical distortion it introduces into the virtual image compensates for the pre-distortion. In this way a relatively ideal virtual image is formed, preserving the user's viewing experience.
Furthermore, because the image generation unit directly generates the pre-distorted real image of the image to be displayed from the ideal image signal input by a processor external to the display device, rather than generating the real image of a pre-corrected image that is itself distorted relative to the image to be displayed, all pixels of the image generation unit are used effectively and no image resolution is lost. In addition, since this application pre-corrects the optical distortion in the "real-image generation stage" via the image generation unit, there is no need for an external processor to pre-correct the image to be displayed, so no additional power consumption or cost is incurred.
In a possible implementation of the first aspect, the optical imaging unit corresponds to a first distortion parameter that characterizes the optical distortion. The image generation unit includes a light modulator that modulates an input optical signal according to the input image signal to generate the real image, where the size and/or distribution of the pixels on the light modulator matches a second distortion parameter; the second distortion parameter is determined from the first distortion parameter and can characterize the pre-distortion.
In other words, the size and/or distribution of the light modulator's pixels is designed according to the first distortion parameter (the optical distortion) of the optical imaging unit, i.e., it corresponds to the first distortion parameter. The real image generated by this light modulator therefore carries, relative to the image to be displayed, a pre-distortion characterized by the second distortion parameter, and this pre-distortion cancels the optical distortion introduced when the optical imaging unit subsequently forms the virtual image. Pre-correction of the optical distortion is thus achieved in the "real-image generation stage", avoiding loss of image resolution as well as additional power consumption and cost.
As described above, in embodiments of this application the size and distribution of the pixels on the light modulator are designed according to the optical distortion introduced by the optical imaging unit. The optical distortion may include barrel distortion and pincushion distortion.
For example, if the optical distortion introduced by the optical imaging unit is barrel distortion, then the sizes of some or all pixels on the light modulator are compressed, i.e., the pixel size is reduced, so that the image generation unit generates a real image with pincushion distortion. When this real image enters the optical imaging unit, the barrel distortion that the unit introduces while forming the virtual image cancels the pincushion distortion of the real image, ultimately yielding an ideal virtual image. Conversely, if the optical distortion introduced by the optical imaging unit is pincushion distortion, the sizes of some or all pixels on the light modulator are expanded, i.e., the pixel size is increased, so that the image generation unit generates a real image with barrel distortion; the pincushion distortion introduced by the optical imaging unit while forming the virtual image then cancels the barrel distortion of the real image, again yielding an ideal virtual image.
In embodiments of this application, an ideal virtual image can be ensured by applying the compression or expansion design only to pixels in a partial region.
In a possible implementation of the first aspect, the light modulator includes a first area and a second area, and the pixels in the first area and in the second area are each arranged in an array; the size of the pixels in the first area differs from that in the second area, and/or the array arrangement of the pixels in the first area differs from that in the second area.
Here, the size and distribution of a pixel specifically refer to the size and distribution of the light-emitting region within the sub-pixel region occupied by a sub-pixel: size includes width, height, area, and so on; distribution includes position and, for array arrangements, the array direction, pitch, and so on.
In a possible implementation of the first aspect, the first area is farther from the center of the light modulator than the second area, and the size and/or array arrangement of the pixels in the first area matches the second distortion parameter; in other words, it corresponds to the first distortion parameter.
That is to say, the light modulator is divided into two parts whose pixel sizes and/or array arrangements may follow different rules. The size and/or array arrangement of the pixels in the first area is designed according to the second distortion parameter, i.e., it follows the rule corresponding to that parameter, such as a compression design rule or an expansion design rule, while the pixels in the second area follow another rule, such as the light modulator's standard design rule. A standard design rule here means that pixel size and/or arrangement is unrelated to the second distortion parameter, i.e., unrelated to the imaging defects of the optical imaging unit.
Based on this, in a possible implementation of the first aspect, when the optical distortion is barrel distortion, the width of the light modulator along the pixel column direction gradually increases from the middle of the row direction toward the two ends, so that the real image takes a "pincushion" shape.
Further, the size of the pixels in the first area is smaller than that of the pixels in the second area, so that the pre-distortion of the real image is pincushion distortion, i.e., the real image exhibits pincushion deformation.
Further, a row of pixels in the first area is distributed along a curve in which the pixels near the two ends of the curve are farther from the second area than the pixels near its midpoint. In other words, a row of pixels in the first area may move gradually away, from the middle of the row direction toward the two ends, from the centerline of the light modulator parallel to the row direction.
In addition, within a row of pixels in the first area, the pixel width gradually decreases from the middle toward the two ends along the row direction. Designing the pixel sizes and array arrangement in the first area in this way makes the pincushion distortion that the light modulator imparts to the real image closer to optical pincushion distortion, and thus better matched to the barrel distortion introduced by the optical imaging unit; after the two cancel, the resulting virtual image is closer to ideal. Here, optical pincushion distortion can be understood as pincushion distortion caused by factors such as assembly errors and manufacturing precision of the optical lenses; its formation mechanism differs from the mechanism by which the light modulator introduces pincushion distortion in this application.
Alternatively, in another possible implementation, when the optical distortion is pincushion distortion, the width of the light modulator along the pixel column direction gradually decreases from the middle of the row direction toward the two ends, so that the real image takes a "barrel" shape.
Further, the size of the pixels in the first area is larger than that of the pixels in the second area, so that the pre-distortion of the real image is barrel distortion, i.e., the real image exhibits barrel deformation.
Further, a row of pixels in the first area is distributed along a curve in which the pixels near the midpoint of the curve are farther from the second area than the pixels near its two ends. In other words, a row of pixels in the first area moves gradually, from the middle of the row direction toward the two ends, relative to the centerline of the light modulator parallel to the row direction.
In addition, within a row of pixels in the first area, the pixel width gradually increases from the middle toward the two ends along the row direction.
With the pixels in the first area arranged in such an array, the barrel distortion that the light modulator imparts to the real image is closer to optical barrel distortion and thus better matched to the pincushion distortion introduced by the optical imaging unit; after the two cancel, the resulting virtual image is closer to ideal.
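The compression and expansion rules above can be sketched numerically. The quadratic width profile and the coefficient `k` below are illustrative assumptions, not a functional form given in the patent; the sketch only shows how pixel widths in a first-area row would shrink toward the two ends for the compression design (pincushion pre-distortion) and grow for the expansion design (barrel pre-distortion).

```python
def row_pixel_widths(n, base_width, k):
    # x runs from -1 (left end) to +1 (right end) of the row.
    # k > 0 compresses widths toward the ends (compression design),
    # k < 0 expands them (expansion design).  Quadratic profile is assumed.
    widths = []
    for i in range(n):
        x = -1.0 + 2.0 * i / (n - 1)
        widths.append(base_width * (1.0 - k * x * x))
    return widths

compress = row_pixel_widths(9, 10.0, 0.2)   # pincushion pre-distortion row
expand = row_pixel_widths(9, 10.0, -0.2)    # barrel pre-distortion row
print(compress)
print(expand)
```

In the compression row the widest pixel sits at the row center and the narrowest at the ends; the expansion row inverts this, matching the two cases described above.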
For both barrel and pincushion distortion, the larger the field of view of a region, the more pronounced the distortion, while in regions corresponding to a small field of view the distortion does not affect the user's viewing experience. Therefore, the field-of-view angle corresponding to the pixels in the first area may be greater than or equal to a preset field-of-view angle; that is, the first area is a large-field-of-view area. In this case, a virtual image that is ideal within the error tolerance of the human eye can be formed without applying the compression or expansion design to all pixels of the light modulator.
In a possible implementation of the first aspect, the light modulator further includes an edge area farther from the center of the light modulator than the first area, and the size of the pixels in the edge area is smaller than that of the pixels in the first area. Shrinking the edge pixels ensures that even the smaller sub-pixel regions at the edge of the light modulator contain complete light-emitting regions, which prevents the generated real image from having defects such as ragged or notched edges. In addition, the positions of the pixels at the edge of the light modulator can be adjusted, for example by staggering consecutive pixels, to further smooth the edge of the light modulator, thereby smoothing the edge of the real image and improving the display effect.
In a possible implementation of the first aspect, the display device further includes a light diffusion element located on the light-exit side of the image generation unit, which diffuses the first light beam corresponding to the real image to improve the uniformity of the displayed picture.
In a possible implementation of the first aspect, the optical imaging unit includes the above light diffusion element.
In a possible implementation of the first aspect, the optical imaging unit includes a folded optical path element and a virtual image imaging lens group. The folded optical path element receives the first light beam and directs it to the virtual image imaging lens group; the virtual image imaging lens group reflects the first light beam, and the reflected light reaches the folded optical path element; the folded optical path element then transmits the beam reflected by the virtual image imaging lens group, emitting the second light beam that forms the virtual image. This structural design of the optical imaging unit can improve the integration level of the display device and is conducive to its miniaturization.
In a possible implementation of the first aspect, the first light beam is circularly polarized, and the folded optical path element includes a polarization converter and a first film layer, the first film layer being located on the side of the polarization converter away from the virtual image imaging lens group. The polarization converter receives the first light beam and converts it into linearly polarized light of a first polarization state. The first film layer reflects the first light beam of the first polarization state from the polarization converter back to the polarization converter. The polarization converter then converts the first light beam of the first polarization state from the first film layer into circularly polarized light and emits it to the virtual image imaging lens group. The polarization converter also converts the beam reflected by the virtual image imaging lens group into linearly polarized light of a second polarization state, and the first film layer transmits the beam of the second polarization state from the polarization converter, finally emitting the second light beam. The polarization converter may be a quarter-wave plate. In this implementation, exploiting the fact that the first light beam can be circularly polarized, the folded optical path element can on the one hand reflect the first light beam to the virtual image imaging lens group, which processes it, for example by magnification, and on the other hand transmit the beam emitted by the virtual image imaging lens group so that it can enter the user's eyes. It thus folds the optical path, improving the integration level of the display device and facilitating its miniaturization. At the same time, as the beam propagation process above shows, no light is lost, so imaging quality is preserved.
In a possible implementation of the first aspect, the virtual image imaging lens group includes an optical freeform surface, which may be provided by an optical freeform mirror.
In a possible implementation of the first aspect, the folded optical path element includes a second film layer, a third film layer, and a film substrate; the third film layer and the second film layer may be located in sequence on the side of the film substrate away from the virtual image imaging lens group.
The second film layer reflects a part of the first light beam to the virtual image imaging lens group and transmits a part of the beam reflected by the virtual image imaging lens group to the third film layer. The third film layer transmits the light transmitted by the second film layer; the light it transmits is the second light beam, which continues to propagate and finally enters the user's eyes, allowing the user to see an enlarged virtual image. In this implementation, the structure of the folded optical path element is simple and easy to implement. On the one hand, it reflects part of the first light beam to the virtual image imaging lens group, which processes that part of the beam; on the other hand, it transmits part of the beam emitted by the virtual image imaging lens group so that it can enter the user's eyes and the user sees an enlarged virtual image. It thus folds the optical path, improving the integration level of the display device and facilitating its miniaturization.
In a second aspect, this application provides a display apparatus including a processor and any of the above display devices, the processor being configured to send an image signal to the image generation unit in the display device.
In a third aspect, this application provides a vehicle including a reflective element and any of the above display devices or display apparatuses, the display device or display apparatus being installed on the vehicle.
In a possible implementation of the third aspect, the display device or display apparatus emits the light beam corresponding to the virtual image to the reflective element, and the reflective element reflects the beam to the user's eyes. The reflective element may be, for example, the windshield of the vehicle, and the optical imaging unit of the display device includes the reflective element.
In another possible implementation of the third aspect, the vehicle has a seat, and the display device or display apparatus is installed on the backrest of the seat.
For the technical effects of any of the designs in the second and third aspects, reference may be made to the technical effects of the corresponding designs in the first aspect, which are not repeated here.
Brief Description of the Drawings
Figure 1 is a schematic diagram of optical distortion according to an embodiment of this application;
Figure 2 is a schematic diagram of the imaging process of a display device according to an embodiment of this application;
Figure 3 is a schematic diagram of application scenarios of the display device provided by an embodiment of this application;
Figure 4 is a schematic structural diagram of a display device provided by an embodiment of this application;
Figure 5 is a schematic diagram of the imaging process of another display device provided by an embodiment of this application;
Figure 6 is a schematic diagram of the relationship between the light modulator and the first and second distortion parameters according to an embodiment of this application;
Figure 7 is a schematic diagram of the relationship between the light modulator structure and the real image provided by an embodiment of this application;
Figure 8 is a schematic diagram of a light modulator provided by an embodiment of this application;
Figure 9 is a schematic diagram of pixels in a local area of the light modulator provided by an embodiment of this application;
Figure 10 is a schematic diagram of a light modulator provided by an embodiment of this application;
Figure 11 is a schematic diagram of another light modulator provided by an embodiment of this application;
Figure 12 is a schematic structural diagram of another display device provided by an embodiment of this application;
Figure 13 is a schematic structural diagram of an optical imaging unit provided by an embodiment of this application;
Figure 14 is a schematic structural diagram of another display device containing the optical imaging unit shown in Figure 13, provided by an embodiment of this application;
Figure 15 is a schematic diagram of the light path in an optical imaging unit provided by an embodiment of this application;
Figure 16 is a schematic diagram of the light path in another optical imaging unit provided by an embodiment of this application;
Figure 17 is a schematic diagram of a display apparatus provided by an embodiment of this application;
Figure 18 is a schematic diagram of applying the display device of an embodiment of this application in a head-up display (HUD) of a vehicle;
Figure 19 is a schematic diagram of a possible functional framework of a vehicle provided by an embodiment of this application.
Detailed Description
The technical solutions in the embodiments of this application are described below with reference to the accompanying drawings.
In the following, the terms "first" and "second" are used for descriptive purposes only and shall not be understood as indicating or implying relative importance or implicitly indicating the number of technical features referred to. Thus, a feature qualified by "first" or "second" may explicitly or implicitly include one or more such features. In the description of these embodiments, unless otherwise stated, "a plurality of" means two or more.
Optical display products include desktop display devices, head-up displays (HUD), projection headlamps, and the like. Their application scenarios include, but are not limited to, vehicles such as cars and trains; desktop display devices can also be used in education, for example in desks with desktop display functionality.
A common characteristic of these optical display products is that their optical imaging unit presents a virtual image with an enlarged field of view and picture size, which is projected into the observer's eyes, giving the observer a large-screen viewing experience.
However, the optical imaging unit is generally composed of optical lenses. In practice, deviations in lens manufacturing precision and in the assembly process of the optical imaging unit, together with the structural characteristics of the lenses (such as optical freeform surfaces or various spherical surfaces), cause the virtual image formed by the optical imaging unit to lose similarity to the real image generated by the image generation unit, that is, to undergo geometric deformation. This geometric deformation is called optical distortion. It can also be understood as follows: for an ideal optical imaging unit, the magnification of every part of the virtual image relative to the corresponding part of the real image is constant, so the virtual image closely resembles the real image and no optical distortion arises. For a real optical imaging unit, however, the magnification of the virtual image varies with the field of view, and the larger the field of view, the greater the variation in magnification, so the virtual image loses similarity to the real image, that is, optical distortion arises.
For example, the optical distortion that the optical imaging unit introduces into the virtual image may be pincushion distortion, as shown in Figure 1(b), or barrel distortion, as shown in Figure 1(c), where each grid cell represents one pixel. Simply put, if the distorted virtual image is called the actual virtual image, then relative to the ideal virtual image shown in Figure 1(a), barrel distortion shifts some pixel positions of the actual virtual image away from the image center and enlarges the pixels, while pincushion distortion shifts some pixel positions toward the image center and shrinks the pixels. For both barrel and pincushion distortion, the larger the field of view at a position, the more pronounced the distortion. This imaging defect of the optical imaging unit means that the user sees a geometrically deformed image, which inevitably degrades the viewing experience.
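A common textbook way to model such radial distortion, not taken from the patent and with sign conventions that vary between sources, maps an ideal image radius r to r·(1 + k·r²); one sign of k pushes outer pixels away from the image center and the other pulls them toward it, matching the two behaviors just described.

```python
def distort(point, k):
    # Generic radial distortion: r -> r * (1 + k * r^2), angle preserved.
    # The sign of k selects which of the two distortion types is modeled.
    x, y = point
    r2 = x * x + y * y
    s = 1.0 + k * r2
    return (x * s, y * s)

corner = (1.0, 1.0)
out = distort(corner, 0.1)    # outer pixels pushed away from the center
inw = distort(corner, -0.1)   # outer pixels pulled toward the center
print(out, inw)               # (1.2, 1.2) (0.8, 0.8)
```

Because the displacement grows with r², points near the image center barely move while corner pixels move the most, which is why the distortion is most visible at large fields of view.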
To address the optical distortion introduced by the optical imaging unit, in some implementations, as shown in Figure 2, a processor external to the display device pre-corrects the image to be displayed to obtain a pre-corrected image, and the image signal of the pre-corrected image is input to the image generation unit so that the image generation unit generates a real image of the pre-corrected image. After the real image of the pre-corrected image is processed by the optical imaging unit, a relatively ideal virtual image can be formed.
To better understand this approach, the process can be divided into three stages. The first stage is the "pre-correction stage": pre-correction produces a pre-corrected image that is geometrically deformed relative to the image to be displayed. The second stage is the "real-image generation stage": after the image signal of the pre-corrected image is input to the image generation unit, the unit generates a real image of the pre-corrected image. The third stage is the "virtual-image formation stage": when the real image of the pre-corrected image passes through the optical imaging unit, the optical distortion introduced by the unit cancels, to a certain extent, the geometric deformation of the pre-corrected image, thereby forming a relatively ideal virtual image.
However, on the one hand, because the pre-corrected image is geometrically deformed relative to the image to be displayed (for example, in Figure 2 the image to be displayed is rectangular while the pre-corrected image is "barrel"-shaped), the positions of many pixels of the pre-corrected image are shifted and the pixel sizes change. Consequently, when the image generation unit generates the real image of the pre-corrected image, its pixels are not fully utilized, which causes a loss of image resolution. On the other hand, because a processor external to the display device must pre-correct the image to be displayed, this approach requires corresponding hardware support and therefore incurs additional power consumption and cost.
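The resolution loss of this external pre-correction approach can be illustrated with a toy pixel count. The inverse-warp model and distortion strength below are illustrative assumptions, not values from the patent: panel pixels whose pre-warp source coordinate falls outside the original image stay dark and are therefore wasted.

```python
def lit_fraction(n=64, k=0.2):
    # Normalize panel pixel centers to [-1, 1] x [-1, 1].  A panel pixel is
    # lit only if its pre-warp source coordinate still lies inside the image;
    # here the assumed inverse warp pushes coordinates outward by (1 + k*r^2).
    lit = 0
    for i in range(n):
        for j in range(n):
            x = -1.0 + 2.0 * (i + 0.5) / n
            y = -1.0 + 2.0 * (j + 0.5) / n
            s = 1.0 + k * (x * x + y * y)
            if abs(x * s) <= 1.0 and abs(y * s) <= 1.0:
                lit += 1
    return lit / (n * n)

frac = lit_fraction()
print(f"{frac:.2%} of panel pixels carry image content")  # the rest are dark
```

With zero distortion every panel pixel is used; as the assumed distortion strength grows, an increasing border of pixels goes dark, which is the resolution loss that the patent's in-panel pre-distortion is designed to avoid.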
有鉴于此,本申请实施例提供一种显示装置,亦可称之为投影装置,该显示装置既可形成相对理想的虚像,保证用户的观看体验,还能够充分利用图像生成单元的像素,避免损失图像分辨率,且不增加功耗和成本。
该显示装置可被应用于影音娱乐、教育、办公、辅助驾驶、医疗等场景中。该显示装置可以是显示二维图像的平面显示装置,也可以是基于分光立体显示技术,为用户的左眼和右眼分别提供左眼图像和右眼图像,以使得用户产生立体视觉体验的立体显示装置。具体应用时,该显示装置可以作为部件集成在其他设备中,也可以单独使用。
比如,在又一些可能的应用场景中,该显示装置可以为桌面显示装置,即其显示窗位于如游戏桌、办公桌、学习桌等桌体结构的桌面上。桌面显示装置可以在用户的前方视野范围内显示更大画幅的图像,从而提供沉浸感和震撼感更强的视觉体验,并减轻用户的视觉疲劳。如图3中(a)所示,以桌面显示装置安装于学习桌为例,桌面显示装置产生的成像光束通过显示窗的反射,映入到用户双目,从而在用户的前方视野范围呈现出图像的放大虚像。示例的,图像中可以包含图书内容、辅助学习内容或者网络课程的画面等,这样,用户无需低头翻看书本,也无需通过小屏智能设备学习网络课程,便于保护用户视力,同时为用户提供更好的学习条件。
在另一些可能的应用场景中,该显示装置可以集成于近眼显示(near eye display,NED) 设备,NED设备例如可以是增强现实(augmented reality,AR)设备或虚拟现实(virtual reality,VR)设备,AR设备可以包括但不限于AR眼镜或AR头盔,VR设备可以包括但不限于VR眼镜或VR头盔。如图3中(b)所示,以NED设备为VR眼镜为例,用户佩戴VR眼镜时,VR眼镜可以在用户前方的视野范围呈现出图像对应的放大的虚像,从而使用户沉浸在虚拟现实场景,进行游戏、观看视频、参加虚拟会议、或视频购物等。
在又一些可能的应用场景中,该显示装置可以集成于HUD中。如图3中(c)所示,以HUD安装于交通工具为例,HUD可将成像光束投射到用户(驾驶者)的双目,从而在用户的前方视野范围呈现出图像的放大虚像。示例的,图像中可以包括仪表信息、导航信息等,从而,用户无需低头查看这些信息,避免影响驾驶安全。其中,HUD的类型包括但不限于风挡(Windshield,W)-HUD、增强现实抬头显示(AR-HUD)等。
在又一种可能的应用场景中,该显示装置也可以集成于车灯中。如图3中(d)所示,除了实现照明功能,车灯还可以实现自适应远光系统(adaptive driving beam,ADB),可以在行车前方,呈现出包含文字、交通标志、视频画面等内容的图像的放大虚像,从而为用户提供辅助驾驶功能或影音娱乐功能。
在又一些可能的应用场景中,该显示装置可以是投影仪。如图3中(e)所示,投影仪可以将图像对应的放大的虚像投影到墙面或投影屏幕上。
需要说明的是,上述给出的应用场景仅是举例,本申请实施例提供的显示装置还可以应用在其它可能的场景,例如集成在医疗设备中,用于辅助医疗,本申请对此不予限定。
图4为本申请实施例提供的一种显示装置400的结构示意图。如图4所示,该显示装置400包括图像生成单元410和光学成像单元420。其中,图像生成单元410能够根据输入的图像信号,生成该图像信号对应的待显示图像的实像,也即产生包含待显示图像信息的光束P1,光束P1可以形成待显示图像的实像。光学成像单元420用于对光束P1进行折叠(如改变光束的传播方向)、放大等处理,最终形成该实像对应的放大的虚像,也即形成包含待显示图像信息的光束P2,光束P2可以形成放大的虚像。为便于区分和说明,以下将图像生成单元410产生的光束P1,称为第一光束,将光学成像单元420射出的光束P2称为第二光束。
应理解的是,在不同应用场景中,图像生成单元410与光学成像单元420的位置关系可能不同。例如,上述显示装置400应用于桌面显示场景时和应用于HUD中时,其图像生成单元410与光学成像单元420的位置关系可能不同。或者可以这样理解,对于特定应用场景,图像生成单元410与光学成像单元420的位置关系取决于该特定应用场景下的光路设计,本申请对此不予限定。不过,为了保证第一光束可以入射到光学成像单元420,光学成像单元420通常位于图像生成单元410的出光侧。
为了能够在形成相对理想的虚像的同时,充分利用图像生成单元410的像素,避免损失图像分辨率,且不增加功耗和成本,本申请实施例中,如图5所示,图像生成单元410生成的待显示图像的实像,相对于待显示图像具有预畸变。该实像进入到光学成像单元420后,光学成像单元420可形成对应的放大的虚像,且光学成像单元420对该虚像造成的光学畸变可补偿前述预畸变。这样,即可形成相对理想的虚像,保证用户的观看体验。
其中值得注意的是,由于图像生成单元410是根据显示装置外围的处理器输入的待显示图像的图像信号,直接生成具有预畸变的实像,而非根据预校正图像的图像信号生成对应的实像,因此图像生成单元410的像素均被有效利用,避免了图像分辨率的损失。另外,由此还可以看出,本申请是通过图像生成单元410,在"实像生成阶段"实现对光学畸变的预校正,省略了"预校正处理阶段",也就无需显示装置400外围的硬件支持,因此不会产生额外的功耗及成本。
光学成像单元420对应有畸变参数,其可以理解为光学成像单元420的特性参数,该畸变参数可以表征光学成像单元420引起的光学畸变。为便于说明,以下将光学成像单元420对应的畸变参数称为第一畸变参数。
在一些实施例中,图像生成单元410包括光源和光调制器。光调制器位于光源的出光侧,光源用于为光调制器提供基础光,如自然光或者基色光。光调制器亦称为显示面板,如液晶显示器(liquid crystal display,LCD)和有机发光二极管(organic light-emitting diode,OLED),用于对待显示图像的信号进行光调制,以生成第一光束,即如上述图4中示出的光束P1,该第一光束用于形成待显示图像的实像。其中,该光调制器的像素的尺寸和/或分布方式与第二畸变参数匹配,该第二畸变参数根据上述第一畸变参数确定。此处还可以理解为,光调制器的像素的尺寸和/或分布方式是根据光学成像单元420的第一畸变参数/光学畸变设计得到的,即与第一畸变参数对应,因此基于该光调制器生成的实像,相对于待显示图像具有可通过第二畸变参数进行表征的预畸变,且该预畸变能够与后续光学成像单元420进行虚像成像时所引起的光学畸变相抵消,从而在"实像生成阶段"实现对该光学畸变的预校正,不但可以避免损失图像分辨率,还能避免产生额外功耗和成本。
以下先依次对第一畸变参数、光调制器与第一畸变参数和第二畸变参数之间的关系进行解释说明,再对光调制器相关的具体实现方式予以介绍。
在一种可能的实现方式中,将上述光学畸变定义为真实虚像相对于理想虚像的偏差量,该偏差量可以为真实虚像和理想虚像中相同特征点的位置偏移量,还可以是真实虚像和理想虚像中相同特征区域的高度和/或宽度的偏差。其中,理想虚像是在假设光学成像单元420为零畸变的理想光学成像单元的情况下,某一实像经过该理想光学成像单元后所形成的虚像。真实虚像则为该实像经过实际的光学成像单元420后所形成的虚像。
如下式(1-1)和式(1-2)所示,若以所述偏差量为相同特征点的位置偏移量为例,那么真实虚像与理想虚像之间的关系,则可以为真实虚像上特征点的位置坐标和理想虚像上该特征点的位置坐标之间的关系,而用于表征/描述这种关系的参数,即为第一畸变参数。应理解的是,该第一畸变参数同时也是表征/描述光学畸变的参数,该第一畸变参数与光学成像单元420相对应。示例的,特征点可以为像素位置。
x′ = x + x[k₁(x² + y²) + k₂(x² + y²)²]   式(1-1)
y′ = y + y[k₁(x² + y²) + k₂(x² + y²)²]   式(1-2)
上式中,(x′,y′)表示真实虚像中像素的位置坐标,(x,y)表示理想虚像中像素的位置坐标,式中描述x′与x之间关系和y′与y之间关系的参数,如k₁和k₂,即为第一畸变参数。
需要说明的是,上述式(1-1)和式(1-2)作为一种示例,并不对本申请实施例涉及的第一畸变参数构成限定。实际上,根据光学畸变的类型的不同,如径向畸变(包括上述提及的桶形畸变和枕形畸变)和切向畸变,真实虚像上特征点的位置坐标和理想虚像上该特征点的位置坐标之间的关系也会有所不同,第一畸变参数也会有所不同,而不限于式(1-1)和式(1-2)示出的函数关系及k₁和k₂。
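作为帮助理解式(1-1)和式(1-2)的示意性示例(并非本申请实施例的组成部分,其中的系数取值均为假设),下面用Python演示该径向畸变模型:按本文对畸变表现的描述,k₁>0时特征点向远离像中心的方向偏移,k₁<0时向靠近像中心的方向偏移,且视场(即到像中心的距离)越大,偏移越明显。

```python
def distort(x, y, k1, k2=0.0):
    """按式(1-1)、式(1-2)的径向畸变模型,将理想虚像坐标 (x, y)
    映射为真实虚像坐标 (x', y')。坐标以像中心为原点,k1、k2 为第一畸变参数。"""
    r2 = x * x + y * y
    factor = k1 * r2 + k2 * r2 * r2
    return x + x * factor, y + y * factor

# 假设的系数取值,仅用于演示符号与偏移方向的关系
xp, yp = distort(0.5, 0.5, k1=0.2)    # k1 > 0:远离像中心
xn, yn = distort(0.5, 0.5, k1=-0.2)   # k1 < 0:靠近像中心

r_ideal = (0.5**2 + 0.5**2) ** 0.5
r_pos = (xp**2 + yp**2) ** 0.5
r_neg = (xn**2 + yn**2) ** 0.5
print(r_pos > r_ideal, r_neg < r_ideal)  # True True
```

由于偏移量正比于r²的多项式,远离像中心的特征点偏移更大,这与文中"视场越大,畸变表现越明显"的描述一致。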
另外,本申请涉及的光学畸变是光学成像单元420对其形成的虚像造成的畸变,基于与虚像进行对比的对象不同,该光学畸变可以具有不同的定义。针对不同定义下的光学畸变,第一畸变参数也可以不同。
比如,在另一种可能的实现方式中,将上述光学畸变定义为光学成像单元420所形成的虚像相对于进入光学成像单元420的实像的偏差量。由于该虚像和该实像具有对应关系,那么,在实像上具有多个特征点(或者特征区域)的情况下,虚像同样具有多个特征点(或者特征区域),且实像中的多个特征点(或者特征区域)与虚像中的多个特征点(或者特征区域)一一对应。基于此,该虚像相对于该实像的偏差量,可以为虚像和实像中相对应的特征点的位置偏移量,还可以为虚像和实像中相对应的特征区域的高度和/或宽度的偏差。若以所述偏差量为相对应的特征点的位置偏移量为例,那么虚像与实像之间的关系,则可以为虚像和实像中相对应的特征点的位置坐标之间的关系,而用于表征/描述这种关系的参数,即可作为光学成像单元420对应的第一畸变参数,也即为表征虚像所产生的光学畸变的参数。
针对于上述任一种实现方式,都可以通过相应的实际测量或者仿真测量的方式,获得光学成像单元420对应的第一畸变参数。应理解的是,基于上述光学畸变的定义的不同,所采用的畸变参数的测量方式也有所不同。本领域技术人员可以在本申请实施例给出的光学畸变的定义的启示下,选择合适的测量方式,以获得光学成像单元420对应的第一畸变参数,本申请实施例对具体的测量方式不予限定。
图6示例性示出了光调制器与第一畸变参数和第二畸变参数之间的关系,该示例中,假设第一畸变参数描述的是光学成像单元420所形成的虚像R与进入光学成像单元420的实像D之间的关系。请参阅图6,由于期望光学成像单元420形成的虚像R为理想虚像,因此可以将虚像R视为已知。由于实像D中的像素imgD与虚像R中的像素imgR一一对应,且相对应的imgR和imgD的像素值相同,因此在已获得第一畸变参数且虚像R已知的情况下,根据虚像R中每个像素imgR的位置坐标和第一畸变参数,可以确定实像D中每个像素imgD的坐标,进而可以得到实像D。实像D则为期望图像生成单元410生成的预畸变实像,其预畸变可以通过第二畸变参数进行表征,也即实像D与待显示图像I之间的关系可以通过第二畸变参数进行表征。如此,在光调制器的像素的尺寸和/或分布方式与第二畸变参数匹配的情况下,基于该光调制器,能够根据待显示图像生成期望的实像D。
简而言之,在获得光学成像单元420对应的第一畸变参数的情况下,以理想的虚像R作为虚像成像目标,结合第一畸变参数可以确定期望图像生成单元410生成的实像D。在确定出实像D的情况下,可以进一步获得实像D与待显示图像I之间的关系,即第二畸变参数。基于第二畸变参数设计光调制器的像素的尺寸和/或分布方式,即可使得图像生成单元410生成的实像为符合期望的具有预畸变的实像D。
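上述"由理想虚像R反推预畸变实像D"的过程,可以用一个示意性的数值例子来说明(假设第一畸变参数采用前文式(1-1)、式(1-2)的径向模型,系数取值为假设,仅用于演示,并非本申请实施例的组成部分):给定目标虚像坐标,通过数值求逆得到对应的预畸变坐标,再正向施加畸变验证其落回目标位置。

```python
def distort(x, y, k1, k2=0.0):
    """式(1-1)、式(1-2)的径向畸变模型(坐标以像中心为原点)。"""
    r2 = x * x + y * y
    f = 1.0 + k1 * r2 + k2 * r2 * r2
    return x * f, y * f

def predistort(x_t, y_t, k1, k2=0.0, iters=30):
    """数值求逆:寻找预畸变坐标 (x, y),使其经 distort 后落在目标坐标 (x_t, y_t)。
    此处用定点迭代仅为示意,未讨论极端畸变下的收敛性。"""
    x, y = x_t, y_t
    for _ in range(iters):
        r2 = x * x + y * y
        f = 1.0 + k1 * r2 + k2 * r2 * r2
        x, y = x_t / f, y_t / f
    return x, y

# 以理想虚像中的一点 (0.5, 0.5) 为目标,反推预畸变位置并验证
x0, y0 = predistort(0.5, 0.5, k1=0.2)
xv, yv = distort(x0, y0, k1=0.2)
print(abs(xv - 0.5) < 1e-9, abs(yv - 0.5) < 1e-9)  # True True
```

对虚像R中的每个像素执行这一反推,即可得到实像D中全部像素的位置,进而确定第二畸变参数所应表征的预畸变。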
通常,光调制器的面板区域上具有像素,每个像素包括多个子像素。例如,对于多数LCD,每个像素由红绿蓝(RGB)三原色组成,每个像素上的每种颜色即为一个"子像素";对于多数OLED,每个像素包括4个子像素,分别对应红色、绿色和两个蓝色。面板区域包括按照某种规则分布的子像素区域,子像素位于子像素区域内。子像素区域包括发光区域,通过子像素区域的开口率表征发光区域在子像素区域中的面积占比。这里的子像素即俗称的像素点。本申请实施例提及的像素的尺寸和分布方式,具体可以为子像素所在的子像素区域中发光区域的尺寸和分布方式,尺寸如宽、高、面积等,分布方式则包括位置、阵列分布时的阵列分布方向、间距等。
在一些实施例中,光学成像单元420引起的光学畸变表现为像的形状的变化。比如,由图1即可看出,图1中(b)和(c)示出的畸变后的像,相对于图1中(a)示出的畸变前的像而言,具有明显的形状的改变。为便于说明,以下将未发生畸变的形状称为理想形状,将发生畸变的形状称为非理想形状。
为了使得光学成像单元420形成的虚像具有理想形状,在一种可能的实现方式中,图像生成单元410生成的实像相对于待显示图像发生变形,或者说,待显示图像呈理想形状,而该实像发生预变形,呈非理想形状。上述预畸变包括该预变形。这样的话,该实像进入到光学成像单元420后,光学成像单元420在对其进行虚像成像时,通过对该虚像造成的形状的改变,抵消实像的预变形,使得最终形成的虚像呈理想形状。
其中,由于实像的形状取决于光调制器的形状,而光调制器的形状取决于其上像素的尺寸和分布方式,因此,在光调制器的像素的尺寸和/或分布方式与第二畸变参数匹配的情况下,光调制器的形状可以为非理想形状,从而保证生成非理想形状的实像,进而保证光学成像单元420形成理想形状的虚像。
如图7中(a)所示,假设光调制器呈理想形状,具体为矩形,那么其生成的实像呈理想形状,具体也为矩形。该实像进入到光学成像单元420后,由于光学成像单元420引起的光学畸变,形成的虚像呈非理想形状,而不再是矩形。
如图7中(b)所示,针对于同一光学成像单元420,假设光调制器呈非理想形状,那么其生成的实像也呈非理想形状。该实像进入到光学成像单元420后,由于光学成像单元420引起的光学畸变,形成的虚像可以呈理想形状,即矩形。
应理解的是,如果光调制器的形状不规则,意味着其边缘区域的区域形状和尺寸不规则。受限于前述的区域形状和尺寸,边缘区域无法容纳与内部区域同等大小的像素,因此容易导致实像出现"毛边"、"缺口"和色偏等缺陷。因此,为了避免其生成的实像具有"毛边"、"缺口"等缺陷,在一种可能的实现方式中,分布于光调制器的边缘(可以简称为"边缘区域")的像素的尺寸小于其他像素的尺寸。通过缩小像素的尺寸,保证边缘区域内较小的子像素区域内也具有完整的发光区域,即可避免其生成的实像具有"毛边"、"缺口"、色偏等缺陷。除此之外,还可以通过调整光调制器边缘区域的像素的位置,例如,使连续的像素交错分布,进一步优化光调制器边缘区域的光滑度、避免色偏现象,进而优化实像边缘的光滑度和色彩均匀度,优化图像显示效果。
以图7中(b)示出的光调制器为例,图8示例性示出了光调制器内像素的尺寸和分布情况。如图8所示,分布于边缘区域的像素的尺寸小于其他像素的尺寸。相对于其他子像素区域,位于边缘区域的子像素区域的面积较小,形状也不甚规则,但由于其内像素的尺寸较小,因此此类子像素区域也具有完整的发光区域。
需要说明的是,对于位于边缘区域的像素,其尺寸大小和位置可以适应于对应子像素区域的尺寸及开口率,从而避免生成的实像具有“毛边”、“缺口”和色偏等缺陷,本申请对此不予限定。
在一些实施例中,光学成像单元420引起的光学畸变不仅表现为像的形状的变化,还表现为像素尺寸的变化和像素位置的变化,即像的失真,例如图1示出的桶形畸变和枕形畸变。桶形畸变又称为桶形失真,它是画面呈桶形膨胀状的失真现象,从像素尺寸变化的角度来说,画面呈桶形膨胀状,即像素尺寸变大。枕形畸变又称为枕形失真,它是画面呈枕形压缩状的失真现象,从像素尺寸变化的角度来说,画面呈枕形压缩状,即像素尺寸变小。应理解的是,桶形畸变和枕形畸变可以伴随着像的形状的改变。
基于此,本申请实施例中,根据光学成像单元420引起的光学畸变的类型,设计光调制器上像素的尺寸和阵列分布方式。其中,如果光学成像单元420引起的光学畸变为桶形畸变,那么对光调制器上部分或者全部像素的尺寸进行压缩设计,以使得图像生成单元410生成具有枕形畸变的实像。这样的话,该实像进入到该光学成像单元420后,光学成像单元420在对其进行虚像成像时,通过对该虚像造成的桶形畸变,抵消实像的枕形畸变,最终形成理想虚像。如果光学成像单元420引起的光学畸变为枕形畸变,那么对光调制器上部分或者全部像素的尺寸进行膨胀设计,以使得图像生成单元410生成具有桶形畸变的实像。这样的话,该实像进入到该光学成像单元420后,光学成像单元420在对其进行虚像成像时,通过对该虚像造成的枕形畸变,抵消实像的桶形畸变,最终形成理想虚像。
其中,对像素的尺寸进行压缩设计,可以理解为缩小像素的尺寸,也即缩小子像素区域内发光区域的尺寸。对像素的尺寸进行膨胀设计,可以理解为增大像素的尺寸,也即增大子像素区域内发光区域的尺寸。
示例的,通过压缩设计得到的像素可如图9中(b)所示,通过膨胀设计得到的像素可如图9中(c)所示。相对于图9中(a)示出的正常尺寸的像素而言,压缩设计得到的像素小于正常尺寸的像素,膨胀设计得到的像素大于正常尺寸的像素。
如图7中(a)所示,假设光调制器的所有像素的尺寸均相同,那么其生成的实像不具有桶形畸变,也不具有枕形畸变。该实像进入到光学成像单元420后,如果光学成像单元420引起的光学畸变为桶形畸变,则其形成的虚像具有桶形畸变。如果光学成像单元420引起的光学畸变为枕形畸变,则其形成的虚像具有枕形畸变。
假设光学成像单元420引起的光学畸变为桶形畸变,且光调制器的部分或者全部像素的尺寸是经过压缩设计得到的,也即小于正常尺寸,那么其生成的实像具有枕形畸变。该实像进入到光学成像单元420后,光学成像单元420引起的桶形畸变可补偿实像的枕形畸变,进而形成理想虚像。
实际应用中,可以仅对部分像素进行压缩设计或者膨胀设计,即可保证形成理想虚像。当然,这里的理想虚像并非绝对理想的虚像,而是指在人眼误差范围内的理想虚像,即人眼无法辨识出图像的畸变。具体来说,光调制器包括第一区域和第二区域,第一区域内的像素和第二区域内的像素均呈阵列分布;其中,第一区域内像素的尺寸与第二区域内像素的尺寸不同,和/或,第一区域内像素的阵列分布方式与第二区域内像素的阵列分布方式不同。此处可以这样理解,光调制器被划分为两部分,该两部分区域中像素的尺寸和/或阵列分布方式可以符合不同的规则。
示例的,第一区域相对于第二区域远离光调制器的中心。第一区域内像素的尺寸和/或阵列分布方式与第二畸变参数匹配。或者说,第一区域内像素的尺寸和/或阵列分布方式与第一畸变参数对应。也就是说,第一区域内像素的尺寸和/或阵列分布方式是根据第二畸变参数设计的,即符合第二畸变参数对应的规则,如压缩设计规则,或者膨胀设计规则。而第二区域内像素的尺寸和/或阵列分布方式则符合另一种规则,如光调制器的标准设计规则。这里的标准设计规则,可以理解为像素的尺寸和/或阵列分布方式与第二畸变参数无关,即与光学成像单元420的成像缺陷无关。
无论是桶形畸变,还是枕形畸变,对应视场越大的区域,畸变表现得越明显;而对应视场较小的区域,畸变不影响用户的观看体验。因此,上述第一区域内的像素对应的视场角可以大于或者等于预设视场角,也就是说,第一区域为大视场区域。这样的话,无需对所有像素进行压缩设计或者膨胀设计,即可保证形成在人眼误差范围内的理想虚像。
在一种实现方式中,第一区域内像素的尺寸与第一畸变参数对应。其中,如果光学成像单元420产生的光学畸变是桶形畸变,则第一区域内像素的尺寸小于第二区域内像素的尺寸,以使得实像具有的预畸变为枕形畸变。如果光学成像单元420产生的光学畸变是枕形畸变,则第一区域内像素的尺寸大于第二区域内像素的尺寸,以使得实像具有的预畸变为桶形畸变。
在另一种可能的实现方式中,第一区域内像素的阵列分布方式与第一畸变参数对应。具体而言,第一区域中的一行像素沿曲线分布。其中,如果光学成像单元420产生的光学畸变是桶形畸变,则靠近该曲线的两端的像素相对于靠近曲线的中点的像素,远离第二区域;如果光学成像单元420产生的光学畸变是枕形畸变,则靠近该曲线的中点的像素相对于靠近该曲线的两端的像素,远离第二区域。
当然,在一些实施例中,第一区域内像素的尺寸和阵列分布方式可以都与第一畸变参数对应。
图10示例性示出了一种可能的光调制器,如图10所示,该光调制器包括第一区域和第二区域,第一区域环绕在第二区域的外围。第一区域和第二区域内的像素均呈阵列分布,并且,第一区域每个像素位置对应的视场角均大于预设视场角。其中,第一区域内的像素的尺寸和阵列分布方式是根据第二畸变参数设计得到的,即与第二畸变参数相匹配。
如图10所示,光调制器沿像素的列方向的宽度,由行方向的中部向两端逐渐增大,以使得实像的形状呈“枕形”。同时,第一区域内像素的尺寸小于第二区域内像素的尺寸,以使得实像的预畸变为枕形畸变,即表现出枕形失真。也就是说,图10示出的光调制器,可生成相对于待显示图像具有枕形畸变的实像。
由图10还可以看出,第一区域中的一行像素沿曲线分布,其中,靠近该曲线的两端的像素相对于靠近该曲线的中点的像素,远离第二区域。比如,在图10中,在光调制器的高度方向上,第一区域内的位于第二区域左侧的一行像素沿曲线S1的延伸方向阵列分布,位于第二区域右侧的一行像素沿曲线S2阵列分布;在光调制器的宽度方向上,第一区域内的位于第二区域上侧的一行像素沿曲线S3的延伸方向阵列分布,位于第二区域下侧的一行像素沿曲线S4阵列分布;而第二区域内的像素,在光调制器的高度方向上,即沿该高度方向阵列分布,在宽度方向上,即沿该宽度方向阵列分布。
或者,图10中第一区域内像素的阵列分布方式还可以这样描述:第一区域中的一行像素,由行方向的中部向两端逐渐远离光调制器的平行于行方向的中线。其中,平行于行方向的中线即为平行于行方向且过光调制器的几何中心的直线。此外,第一区域内的一行像素中,沿其行方向,由中部向两端像素的宽度逐渐减小。第一区域中的像素尺寸和阵列分布方式如此设计,能够使得光调制器对实像造成的枕形畸变更加贴近光学上的枕形畸变,进而与光学成像单元引起的桶形畸变更加匹配,二者抵消后,形成的虚像更加理想。其中,光学上的枕形畸变可以理解为由光学镜片的组装误差、制造精度等因素引起的枕形畸变,其形成原理与本申请中光调制器引起枕形畸变的原理不同。
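以图10所示的排布规律为例,下面用一段示意性代码生成第一区域中一行像素的中心位置与宽度:行中部到两端的像素逐渐远离中线、宽度逐渐减小。其中 bow、shrink 等设计系数以及二次函数的形式均为假设,仅用于直观演示该排布规律,并非本申请实施例规定的具体设计方法。

```python
def row_layout(n, y0, bow=0.1, shrink=0.3):
    """示意性生成第一区域中一行像素(n ≥ 2)的 (x, y, 宽度):
    y0 为该行到中线的基准距离;两端像素更远离中线且更窄。"""
    centers = []
    for i in range(n):
        t = (i - (n - 1) / 2) / ((n - 1) / 2)   # 归一化位置,-1..1,0 为行中部
        x = t
        y = y0 * (1.0 + bow * t * t)            # 两端逐渐远离中线
        w = (1.0 - shrink * t * t) / n          # 两端像素宽度逐渐减小
        centers.append((x, y, w))
    return centers

row = row_layout(11, y0=0.8)
print(abs(row[0][1]) > abs(row[5][1]), row[5][2] > row[0][2])  # True True
```

将 bow、shrink 取相反符号的设计,即对应图11中行中部远离中线、两端像素更宽的"桶形"排布。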
图11示例性示出了另一种可能的光调制器,与图10所示光调制器不同的是,光调制器 沿像素的列方向的宽度,由行方向的中部向两端逐渐减小,以使得实像的形状呈“桶形”。同时,第一区域内像素的尺寸大于第二区域内像素的尺寸,以使得实像的预畸变为桶形畸变,即表现出桶形失真。也就是说,图11示出的光调制器,可生成相对于待显示图像具有桶形畸变的实像。
由图11还可以看出,第一区域中的一行像素沿曲线分布,其中,靠近该曲线的中点的像素相对于靠近该曲线的两端的像素,远离第二区域。比如,在图11中,与图10所示光调制器不同的是,在光调制器的高度方向上,第一区域内的位于第二区域左侧的一行像素沿曲线S5的延伸方向阵列分布,位于第二区域右侧的一行像素沿曲线S6阵列分布;在光调制器的宽度方向上,第一区域内的位于第二区域上侧的一行像素沿曲线S7的延伸方向阵列分布,位于第二区域下侧的一行像素沿曲线S8阵列分布。
或者,图11中第一区域内像素的阵列分布方式还可以这样描述:第一区域中的一行像素,由行方向的中部向两端逐渐远离光调制器的平行于行方向的中线。此外,第一区域内的一行像素中,沿其行方向,由中部向两端像素的宽度逐渐增大。第一区域中的像素尺寸和阵列分布方式如此设计,能够使得光调制器对实像造成的桶形畸变更加贴近光学上的桶形畸变,进而与光学成像单元引起的枕形畸变更加匹配,二者抵消后,形成的虚像更加理想。
需要说明的是,上述图10和图11示出的光调制器,是在给定分辨率(如1400×1050)和像素面板尺寸的情况下,分别对第一区域内的像素进行了压缩和膨胀设计。换句话说,在分辨率确定的情况下,对第一区域内的像素进行压缩设计后,第一区域内的像素尺寸小于第二区域内的像素尺寸;对第一区域内的像素进行膨胀设计后,第一区域内的像素尺寸大于第二区域内的像素尺寸。具体以图11示出的光调制器为例,假设其第二区域的尺寸为800×600,那么,其第二区域内即包括800×600个像素,其第二区域以外的区域即包括(1400×1050)-(800×600)个像素,该(1400×1050)-(800×600)个像素中的部分或者全部位于第一区域内,且第一区域内的像素的尺寸小于第二区域内像素的尺寸。
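沿用上段的示例数值,各区域的像素数可以直接验证:

```python
total = 1400 * 1050           # 光调制器的总像素数(给定分辨率)
second = 800 * 600            # 第二区域的像素数
outside = total - second      # 第二区域以外的像素数,其部分或全部位于第一区域
print(total, second, outside)  # 1470000 480000 990000
```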
应理解,在本申请的另一些实施例中,还可以将光调制器划分成上述的第一区域、第二区域和边缘区域。第一区域相对第二区域远离光调制器的中心,边缘区域相对于第一区域远离该中心。其中,边缘区域内像素的尺寸小于第一区域内像素的尺寸。在第一区域内像素的尺寸大于第二区域内像素尺寸的实施情形中,边缘区域内的像素尺寸可以小于第二区域内像素的尺寸,也可以大于第二区域内像素的尺寸。
上述实施例中提及的光调制器包括但不限于LCD和OLED,例如,还可以为硅基液晶(liquid crystal on silicon,LCOS)显示器和数字光处理(digital light processing,DLP)显示器。容易理解的是,不同类型的光调制器的光显示原理不同,结构也不同,因此制作工艺不同。进而,当本申请显示装置400采用不同类型的光调制器时,需要采用不同的制作工艺,制作得到像素的尺寸和/或分布方式与第二畸变参数匹配的光调制器。出于这一目的,本领域技术人员能够在不付出创造性劳动的情况下,采用本领域公知的制作工艺,制备得到本申请实施例提供的光调制器,本文对可实现的制作工艺不予赘述。
在上述任一种实施例基础上,如图12所示,显示装置400还可以包括光扩散元件430,该光扩散元件430还可以被称为扩散屏。该光扩散元件430位于图像生成单元410的出光侧,用于对图像生成单元410发出的第一光束P1进行扩散,以提升成像画面的均匀性。
容易理解的是,光扩散元件作为光学镜片的一种,当其制作精度低和/或包括曲面结构时,也会引起像的光学畸变。对于这样的光扩散元件,在测得光学成像单元420对应的第一畸变参数时,需要将其视为光学成像单元420的一部分,才能保证第一畸变参数的准确性,才能保证第一畸变参数所描述的光学畸变包含光扩散元件引起的光学畸变。这样的话,才能保证根据第一畸变参数设计的光调制器能够实现对光学畸变的预校正。
请参阅图13,在一些实施例中,光学成像单元420包括折叠光路元件421和虚像成像镜组422。其中,折叠光路元件421用于接收图像生成单元410发出的第一光束P1,将第一光束P1发射至虚像成像镜组422;虚像成像镜组422用于对第一光束进行反射,以形成第二光束;折叠光路元件421还用于对第二光束进行透射,使得该第二光束进入用户双眼,使用户看到虚像。光学成像单元420的这种结构设计,提高了显示装置400的集成度,有利于显示装置400的小型化。
请参阅图14,在显示装置400还包括光扩散元件430的设计中,光扩散元件430位于图像生成单元410与折叠光路元件421之间的光路上,第一光束P1经光扩散元件430的扩散后,至折叠光路元件421。
在一些实施例中,虚像成像镜组422包括光学自由曲面,该光学自由曲面可由光学自由曲面镜提供。另外,图像生成单元410中光调制器的出光侧还可以设有1/4波片。在光调制器发出的光束为线偏振光的设计中,位于光调制器出光侧的1/4波片用于将该光束转换成圆偏振光,也即使得第一光束成为圆偏振光。
下面以虚像成像镜组422包括光学自由曲面镜,且第一光束为圆偏振光为例,对折叠光路元件421的具体实现方式予以介绍。
在一种可能的实现方式中,如图15所示,折叠光路元件421包括偏振转换器421a和第一膜层421b,第一膜层421b和偏振转换器421a依次位于基底远离光学自由曲面镜422a的一侧。为了便于说明折叠光路元件421的作用,本实施例的图示图15中,将第一膜层421b与偏振转换器421a分开设置。应理解的是,实际应用中,第一膜层421b和偏振转换器421a依次贴设于基底远离光学自由曲面镜422a的一侧。其中,基底用于承载第一膜层421b和偏振转换器421a,而不会对光束产生影响。例如,基底具体可以为透明玻璃片。
其中,偏振转换器421a用于接收第一光束P1(圆),并将第一光束P1(圆)转换成第一偏振态的线偏光。例如,在图15中,将第一光束P1(圆)转换成S光,即P1-s。
第一膜层421b用于对第一偏振态的光进行反射,这样,当P1-s到达第一膜层421b,会被其反射至偏振转换器421a,在偏振转换器421a的作用下,P1-s转换成圆偏振光,即P1(圆),并传输至光学自由曲面镜422a。
光学自由曲面镜422a则对P1(圆)进行反射,反射光即为第二光束。
偏振转换器421a还用于接收自由曲面镜422a的反射光,即P1(圆),并将其转换成第二偏振态的偏振光,即P1-p,P1-p到达第一膜层421b。
第一膜层421b还用于对第二偏振态的偏振光进行透射。这样,P1-p到达第一膜层421b后,会透射过第一膜层421b,继续传播,最终进入用户双眼,使用户看到放大的虚像。
在图15所示实现方式中,偏振转换器421a可以为1/4波片,第一膜层421b可以为反射-透射型偏振膜。
在图15所示实现方式中,基于第一光束可以为圆偏振光的这一特点,折叠光路元件一方面可以将第一光束反射至虚像成像镜组,由虚像成像镜组对第一光束进行处理,如放大处理;另一方面还可以将虚像成像镜组射出的光束进行透射,以使其能够进入用户的眼睛,也即实现对光路进行折叠的作用,提高显示装置的集成度,有利于显示装置的小型化。与此同时,由上述光束的传播过程可知,并未造成光束的损失,因此可以保证成像质量。
在另一种可能的实现方式中,如图16所示,折叠光路元件421包括第二膜层421c和第三膜层421d,该第三膜层421d和第二膜层421c可以依次位于基底的远离光学自由曲面镜422a的一侧。为了便于说明折叠光路元件421的作用,本实施例的图示图16中,将第二膜层421c和第三膜层421d分开设置。应理解的是,实际应用中,第三膜层421d和第二膜层421c可以依次贴设于基底的表面。
其中,第二膜层421c用于对第一光束P1中的一部分光进行反射,如反射的部分光为P11,使得P11射向自由曲面镜422a。第二膜层421c同时透射第一光束P1中的剩余光(图中未示出)。
光学自由曲面镜422a则用于对P11进行反射,反射光P11传输至第二膜层421c后,其中的一部分P111透射过第二膜层421c,并到达第三膜层421d,另一部分(图中未示出)则被第二膜层421c反射。
第三膜层421d用于对P111进行透射。透射光即为P2,P2继续传播,最终进入用户双眼,使用户看到放大的虚像。
在图16所示实现方式中,第二膜层421c可以为半透半反膜,第三膜层421d可以为透射-吸收型偏振膜。在这种实现方式中,折叠光路元件的结构简单,易于实现。其一方面可以将第一光束中的一部分反射至虚像成像镜组,由虚像成像镜组对该部分光束进行处理;另一方面还可以将虚像成像镜组射出的光束的一部分进行透射,以使其能够进入用户的眼睛,使用户看到放大的虚像,也即实现对光路进行折叠的作用,提高显示装置的集成度,有利于显示装置的小型化。
本申请实施例还提供了一种显示设备,该显示设备具体可以是在家庭、教室、会议室、大礼堂、影院、球场、广场等场合使用的投影显示设备,该显示设备还可以是车载显示屏或者集成在智能家电设备中的显示屏等,还可以是网络电视、智能电视、互联网协议电视(IPTV),或者集成于其中。
图17是本申请实施例提供的一种显示设备的示意图。如图17所示,显示设备中的电路主要包括处理器1701,内部存储器1702,外部存储器接口1703,音频模块1704,视频模块1705,电源模块1706,无线通信模块1707,I/O接口1708、视频接口1709、控制器局域网(Controller Area Network,CAN)收发器1710,显示电路1711,以及上述任意一种显示装置400等。其中,处理器1701与其周边的元件,例如内部存储器1702,CAN收发器1710,音频模块1704,视频模块1705,电源模块1706,无线通信模块1707,I/O接口1708、视频接口1709、显示电路1711,可以通过总线连接。
其中,处理器1701可以称为前端处理器。处理器1701包括一个或多个处理单元,例如:处理器1701可以包括应用处理器(application processor,AP),调制解调处理器,图形处理器(graphics processing unit,GPU),图像信号处理器(image signal processor,ISP),控制器,视频编解码器,数字信号处理器(digital signal processor,DSP),基带处理器,和/或神经网络处理器(neural-network processing unit,NPU)等。其中,不同的处理单元可以是独立的器件,也可以集成在一个或多个处理器中。
处理器1701中还可以设置存储器,用于存储指令和数据。例如,存储显示设备的操作系统、AR Creator软件包等。在一些实施例中,处理器1701中的存储器为高速缓冲存储器。该存储器可以保存处理器1701刚用过或循环使用的指令或数据。如果处理器1701需要再次使用该指令或数据,可从所述存储器中直接调用。避免了重复存取,减少了处理器1701的等待时间,因而提高了系统的效率。
另外,如果本实施例中的显示设备安装在交通工具上,处理器1701的功能可以由交通工具上的域控制器来实现。
在一些实施例中,显示设备还可以包括多个连接到处理器1701的输入输出(input/output,I/O)接口1708。接口1708可以包括但不限于集成电路(inter-integrated circuit,I2C)接口,集成电路内置音频(inter-integrated circuit sound,I2S)接口,脉冲编码调制(pulse code modulation,PCM)接口,通用异步收发传输器(universal asynchronous receiver/transmitter,UART)接口,移动产业处理器接口(mobile industry processor interface,MIPI),通用输入输出(general-purpose input/output,GPIO)接口,用户标识模块(subscriber identity module,SIM)接口,和/或通用串行总线(universal serial bus,USB)接口等。上述I/O接口1708可以连接鼠标、触摸屏、键盘、摄像头、扬声器/喇叭、麦克风等设备,也可以连接显示设备上的物理按键(例如音量键、亮度调节键、开关机键等)。
内部存储器1702可以用于存储计算机可执行程序代码,所述可执行程序代码包括指令。存储器1702可以包括存储程序区和存储数据区。其中,存储程序区可存储操作系统,至少一个功能所需的应用程序(比如通话功能,时间设置功能,AR功能等)等。存储数据区可存储显示装置使用过程中所创建的数据(比如电话簿,世界时间等)等。此外,内部存储器1702可以包括高速随机存取存储器,还可以包括非易失性存储器,例如至少一个磁盘存储器件,闪存器件,通用闪存存储器(universal flash storage,UFS)等。处理器1701通过运行存储在内部存储器1702的指令,和/或存储在设置于处理器1701中的存储器的指令,执行显示设备的各种功能应用以及数据处理。
外部存储器接口1703可以用于连接外部存储器(例如Micro SD卡),外部存储器可以根据需要存储数据或程序指令,处理器1701可以通过外部存储器接口1703对这些数据或程序指令执行读写等操作。
音频模块1704用于将数字音频信息转换成模拟音频信号输出,也用于将模拟音频输入转换为数字音频信号。音频模块1704还可以用于对音频信号编码和解码,例如进行放音或录音。在一些实施例中,音频模块1704可以设置于处理器1701中,或将音频模块1704的部分功能模块设置于处理器1701中。显示装置可以通过音频模块1704以及应用处理器等实现音频功能。
视频接口1709可以接收外部输入的音视频,其具体可以为高清晰多媒体接口(high definition multimedia interface,HDMI),数字视频接口(digital visual interface,DVI),视频图形阵列(video graphics array,VGA),显示端口(display port,DP),低压差分信号(low voltage differential signaling,LVDS)接口等,视频接口1709还可以向外输出视频。例如,显示设备通过视频接口接收导航系统发送的视频数据或者接收域控制器发送的视频数据。
视频模块1705可以对视频接口1709输入的视频进行解码,例如进行H.264解码。视频模块还可以对显示设备采集到的视频进行编码,例如对外接的摄像头采集到的视频进行H.264编码。此外,处理器1701也可以对视频接口1709输入的视频进行解码,然后将解码后的图像信号输出到显示电路1711。
进一步的,上述显示设备还包括CAN收发器1710,CAN收发器1710可以连接到汽车的CAN总线(CAN BUS)。通过CAN总线,显示设备可以与车载娱乐系统(音乐、电台、视频模块)、车辆状态系统等进行通信。例如,用户可以通过操作显示设备来开启车载音乐播放功能。车辆状态系统可以将车辆状态信息(车门、安全带等)发送给显示设备进行显示。
显示电路1711和显示装置400共同实现显示图像的功能。显示电路1711接收处理器1701输出的图像信号,对该图像信号进行处理后输入显示装置400的图像生成单元410中生成实像。显示电路1711还可以对图像生成单元410显示的图像进行控制。例如,控制显示亮度或对比度等参数。其中,显示电路1711可以包括驱动电路、图像控制电路等。
在本实施例中,视频接口1709可以接收输入的视频数据(或称为视频源),视频模块1705进行解码和/或数字化处理后输出图像信号至显示电路1711,显示电路1711根据输入的图像信号驱动图像生成单元410将光源发出的光束进行成像,从而生成可视图像(发出成像光)。
电源模块1706用于根据输入的电力(例如直流电)为处理器1701和投影装置等器件提供电源,电源模块1706中可以包括可充电电池。此外,上述电源模块1706可以连接到汽车的供电模块(例如动力电池),由汽车的供电模块为显示设备的电源模块1706供电。
无线通信模块1707可以使得显示设备与外界进行无线通信,其可以提供无线局域网(wireless local area networks,WLAN),无线保真(wireless fidelity,Wi-Fi)网络,蓝牙(bluetooth,BT),全球导航卫星系统(global navigation satellite system,GNSS),调频(frequency modulation,FM),近距离无线通信技术(near field communication,NFC),红外技术(infrared,IR)等无线通信的解决方案。无线通信模块1707可以是集成至少一个通信处理模块的一个或多个器件。无线通信模块1707经由天线接收电磁波,将电磁波信号调频以及滤波处理,将处理后的信号发送到处理器1701。无线通信模块1707还可以从处理器1701接收待发送的信号,对其进行调频,放大,经天线转为电磁波辐射出去。
另外,视频模块1705进行解码的视频数据除了通过视频接口1709输入之外,还可以通过无线通信模块1707以无线的方式接收或从内部存储器1702或外部存储器中读取,例如显示设备可以通过车内的无线局域网从终端设备或车载娱乐系统接收视频数据,显示设备还可以读取内部存储器1702或外部存储器中存储的音视频数据。
另外,本申请实施例示意的电路图并不构成对显示设备的具体限定。在本申请另一些实施例中,显示设备可以包括比图示更多或更少的部件,或者组合某些部件,或者拆分某些部件,或者不同的部件布置。图示的部件可以以硬件,软件或软件和硬件的组合实现。
上述显示设备除了提供上述功能外,还可以提供广播接收电视功能。比如,上述显示设备可以集成于网络电视、智能电视、互联网协议电视(IPTV)中。
本申请实施例还提供了一种桌体结构,该桌体结构具有桌面,桌面上设有显示窗,上述任意一种显示设备或者显示装置400安装在该桌体结构中,并且,显示装置400发出的第二光束可至显示窗,显示窗对该第二光束进行反射,以使得第二光束进入用户眼睛,使用户看到放大的虚像。
本申请实施例还提供了一种交通工具,该交通工具包括反射元件,和前述任一种显示装置或者前述任一种显示设备。该反射元件用于将来自显示装置的第二光束进行反射,以使得第二光束能被观察者的双眼接收,从而使得观察者看到放大的虚像。
在可能的实现方式中,上述显示设备集成在交通工具的抬头显示装置HUD中。该交通工具所包括的反射元件具体可以为其前侧挡风玻璃(即前风挡)。
比如,图18是将本申请涉及的显示装置应用在交通工具的抬头显示装置HUD中的示意图。在该应用方式中,显示装置(或者上述显示设备)的光学成像单元包括交通工具的前风挡。如图18所示,虚像对应的第二光束由前风挡反射至驾驶员的眼睛,使得在前风挡的一侧形成图像M的放大的虚像。示例性的,交通工具可以为轿车、卡车、摩托车、公共汽车、船、飞机、直升飞机、割草机、娱乐车、游乐场车辆、施工设备、电车、高尔夫球车、火车和手推车等,本申请实施例不作特别的限定。
上述图像M的内容包括但不限于地图辅助信息、外界物体的指示信息、交通工具的状态信息和娱乐信息等。地图辅助信息用作辅助驾驶,例如,地图辅助信息包括但不限于方向箭头、距离和行驶时间等。外界物体的指示信息包括但不限于安全车距、周围障碍物和倒车影像等。以汽车为例,交通工具的状态信息一般是显示在交通工具仪表上的信息,也称为是仪表信息,包括但不限于行驶速度、行驶里程、燃油量、水温和车灯状态等信息。
在另一些实施例中,显示装置(或者上述显示设备)可以安装于交通工具的座椅的靠背上。以交通工具为小型车辆为例,显示装置可以安装于车辆的前排座椅的靠背上。这样,显示装置射出的光束进入到后排乘客的眼睛,即可使得后排乘客观看到待显示图像的放大虚像。
图19是本申请实施例提供的一种交通工具的一种可能的功能框架示意图。
如图19所示,交通工具的功能框架中可包括各种子系统,例如,图示中的控制系统1901、传感器系统1902、一个或多个外围设备1903(图示以一个为例示出)、电源1904、计算机系统1905、显示系统1906。可选地,交通工具还可包括其他功能系统,例如,为交通工具提供动力的引擎系统等等,本申请这里不做限定。
其中,传感器系统1902可包括若干检测装置,这些检测装置能感受到被测量的信息,并将感受到的信息按照一定规律将其转换为电信号或者其他所需形式的信息输出。如图示出,这些检测装置可包括全球定位系统(global positioning system,GPS)、车速传感器、惯性测量单元(inertial measurement unit,IMU)、雷达单元、激光测距仪、摄像装置、轮速传感器、转向传感器、档位传感器、或者其他用于自动检测的元件等等,本申请并不做限定。
控制系统1901可包括若干元件,例如图示出的转向单元、制动单元、照明系统、自动驾驶系统、地图导航系统、网络对时系统和障碍规避系统。可选地,控制系统1901还可包括诸如用于控制车辆行驶速度的油门控制器及发动机控制器等元件,本申请不做限定。
外围设备1903可包括若干元件,例如图示中的通信系统、触摸屏、用户接口、麦克风以及扬声器等等。其中,通信系统用于实现交通工具和除交通工具之外的其他设备之间的网络通信。在实际应用中,通信系统可采用无线通信技术或有线通信技术实现交通工具和其他设备之间的网络通信。该有线通信技术可以是指车辆和其他设备之间通过网线或光纤等方式通信。
电源1904代表为车辆提供电力或能源的系统,其可包括但不限于再充电的锂电池或铅酸电池等。在实际应用中,电源中的一个或多个电池组件用于提供车辆启动的电能或能量,电源的种类和材料本申请并不限定。
交通工具的若干功能均由计算机系统1905控制实现。计算机系统1905可包括一个或多个处理器(图示以一个处理器为例示出)和存储器(也可称为存储装置)。在实际应用中,该存储器可在计算机系统1905内部,也可在计算机系统1905外部,例如作为交通工具中的缓存等,本申请不做限定。其中,
处理器可包括一个或多个通用处理器,例如,图形处理器(graphic processing unit,GPU)。处理器可用于运行存储器中存储的相关程序或程序对应的指令,以实现车辆的相应功能。
存储器可以包括易失性存储器(volatile memory),例如,RAM;存储器也可以包括非易失性存储器(non-volatile memory),例如,ROM、快闪存储器(flash memory)或固态硬盘(solid state drives,SSD);存储器还可以包括上述种类的存储器的组合。存储器可用于存储一组程序代码或程序代码对应的指令,以便于处理器调用存储器中存储的程序代码或指令以实现车辆的相应功能。本申请中,存储器中可存储一组用于车辆控制的程序代码,处理器调用该程序代码可控制车辆安全行驶。
可选地,存储器除了存储程序代码或指令之外,还可存储诸如道路地图、驾驶线路、传感器数据等信息。计算机系统1905可以结合车辆功能框架示意图中的其他元件,例如传感器系统中的传感器、GPS等,实现车辆的相关功能。例如,计算机系统1905可基于传感器系统的数据输入控制交通工具的行驶方向或行驶速度等,本申请不做限定。
显示系统1906可包括若干元件,例如,挡风玻璃、控制器和前文中描述的显示装置400。控制器用于根据用户指令生成图像(如生成包含车速、电量/油量等车辆状态的图像以及增强现实AR内容的图像),并将该图像内容发送至投影装置;投影装置将承载图像内容的光投射至挡风玻璃,挡风玻璃用于反射承载图像内容的光,以使在驾驶员前方呈现图像内容对应的虚像。需要说明的是,显示系统1906中的部分元件的功能也可以由车辆的其它子系统来实现,例如,控制器也可以为控制系统中的元件。
其中,本申请图19示出的包括传感器系统、控制系统、计算机系统和显示系统这四个子系统的框架仅为示例,并不构成限定。在实际应用中,交通工具可根据不同功能对车辆中的若干元件进行组合,从而得到相应不同功能的子系统。在实际应用中,交通工具可包括更多或更少的系统或元件,本申请不做限定。

Claims (18)

  1. 一种显示装置,其特征在于,包括图像生成单元和光学成像单元;
    所述图像生成单元,用于根据处理器输入的图像信号,生成实像,所述实像相对于所述图像信号对应的图像具有预畸变;
    所述光学成像单元,用于对所述实像进行成像,以形成所述实像对应的虚像,并通过对所述虚像造成的光学畸变补偿所述预畸变。
  2. 根据权利要求1所述的显示装置,其特征在于,所述光学成像单元对应有畸变参数,所述畸变参数为表征所述光学畸变的参数;
    所述图像生成单元包括:
    光调制器,用于根据所述图像信号对接收的光信号进行光调制,生成所述实像;其中,所述光调制器上像素的尺寸和/或分布方式与所述畸变参数对应。
  3. 根据权利要求2所述的显示装置,其特征在于,所述光调制器包括第一区域和第二区域,所述第一区域内的像素和所述第二区域内的像素均呈阵列分布;其中,所述第一区域内像素的尺寸与所述第二区域内像素的尺寸不同,和/或,所述第一区域内像素的阵列分布方式与所述第二区域内像素的阵列分布方式不同。
  4. 根据权利要求3所述的显示装置,其特征在于,所述第一区域相对于所述第二区域远离所述光调制器的中心;所述第一区域内像素的尺寸和阵列分布方式与所述畸变参数对应。
  5. 根据权利要求4所述的显示装置,其特征在于,所述光学畸变为桶形畸变;所述第一区域内像素的尺寸小于所述第二区域内像素的尺寸,以使得所述预畸变为枕形畸变。
  6. 根据权利要求4或5所述的显示装置,其特征在于,所述第一区域中的一行像素沿曲线分布,其中,靠近所述曲线的两端的像素相对于靠近所述曲线的中点的像素,远离所述第二区域。
  7. 根据权利要求4所述的显示装置,其特征在于,所述光学畸变为枕形畸变;所述第一区域内像素的尺寸大于所述第二区域内像素的尺寸,以使得所述预畸变为桶形畸变。
  8. 根据权利要求4或7所述的显示装置,其特征在于,所述第一区域中的一行像素沿曲线分布,其中,靠近所述曲线的中点的像素相对于靠近所述曲线的两端的像素,远离所述第二区域。
  9. 根据权利要求4-6任一项所述的显示装置,其特征在于,所述第一区域内像素对应的视场角大于或者等于预设视场角。
  10. 根据权利要求4-9任一项所述的显示装置,其特征在于,所述光调制器还包括相对于所述第一区域远离所述中心的边缘区域,所述边缘区域内像素的尺寸小于所述第一区域内像素的尺寸。
  11. 根据权利要求1-10任一项所述的显示装置,其特征在于,所述显示装置还包括:
    光扩散元件,位于所述图像生成单元的出光侧,用于对所述实像对应的第一光束进行扩散。
  12. 根据权利要求1-11任一项所述的显示装置,其特征在于,所述光学成像单元包括折叠光路元件和虚像成像镜组;
    所述折叠光路元件,用于接收所述实像对应的第一光束,将所述第一光束发射至所述虚像成像镜组;
    所述虚像成像镜组,用于对所述第一光束进行反射,反射光到达所述折叠光路元件;
    所述折叠光路元件,还用于对所述虚像成像镜组的反射光进行透射,透射出的第二光束用于形成所述虚像。
  13. 根据权利要求12所述的显示装置,其特征在于,所述第一光束为圆偏振光,所述折叠光路元件包括偏振转换器和第一膜层,所述第一膜层位于所述偏振转换器远离所述虚像成像镜组的一侧;
    所述偏振转换器,用于接收所述第一光束,并将所述第一光束转换成第一偏振态的线偏振光;
    所述第一膜层,用于将来自所述偏振转换器的第一偏振态的第一光束反射至所述偏振转换器;
    所述偏振转换器,还用于将来自所述第一膜层的第一光束转换成圆偏振光,并射出至所述虚像成像镜组;
    所述偏振转换器,还用于将所述虚像成像镜组反射的光束转换成第二偏振态的线偏振光;
    所述第一膜层,还用于透射来自所述偏振转换器的第二偏振态的线偏振光。
  14. 根据权利要求12或13所述的显示装置,其特征在于,所述虚像成像镜组包括光学自由曲面镜。
  15. 根据权利要求12-14任一项所述的显示装置,其特征在于,所述偏振转换器为1/4波片。
  16. 一种显示设备,其特征在于,包括处理器以及如权利要求1至15任一项所述的显示装置,所述处理器用于向所述图像生成单元发送所述图像信号。
  17. 一种交通工具,其特征在于,包括如权利要求16的显示设备,所述显示设备安装在所述交通工具上。
  18. 根据权利要求17所述的交通工具,其特征在于,所述交通工具具有座椅,所述显示设备安装于所述座椅的靠背。
PCT/CN2023/095982 2022-09-02 2023-05-24 显示装置、显示设备及交通工具 WO2024045704A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202211073982.8 2022-09-02
CN202211073982.8A CN117687210A (zh) 2022-09-02 2022-09-02 显示装置、显示设备及交通工具

Publications (1)

Publication Number Publication Date
WO2024045704A1 true WO2024045704A1 (zh) 2024-03-07

Family

ID=90100324

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2023/095982 WO2024045704A1 (zh) 2022-09-02 2023-05-24 显示装置、显示设备及交通工具

Country Status (2)

Country Link
CN (1) CN117687210A (zh)
WO (1) WO2024045704A1 (zh)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106094200A (zh) * 2016-08-24 2016-11-09 深圳市视睿迪科技有限公司 一种像素结构、显示面板及显示装置
CN106920475A (zh) * 2017-04-25 2017-07-04 京东方科技集团股份有限公司 显示面板、显示装置及显示面板的驱动方法
CN109477968A (zh) * 2016-07-07 2019-03-15 麦克赛尔株式会社 平视显示装置
CN210488131U (zh) * 2019-10-10 2020-05-08 浙江水晶光电科技股份有限公司 一种光学模组和智能眼镜
JP2020149063A (ja) * 2020-05-22 2020-09-17 マクセル株式会社 ヘッドアップディスプレイ装置
CN111929906A (zh) * 2020-09-25 2020-11-13 歌尔光学科技有限公司 图像显示结构和头戴显示设备


Also Published As

Publication number Publication date
CN117687210A (zh) 2024-03-12


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23858743

Country of ref document: EP

Kind code of ref document: A1