CN117687210A - Display device, display equipment and vehicle - Google Patents


Info

Publication number: CN117687210A (application CN202211073982.8A)
Authority: CN (China)
Prior art keywords: image, pixels, display device, distortion, light
Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis)
Other languages: Chinese (zh)
Inventors: 许志高, 毛磊
Current and original assignee: Huawei Technologies Co Ltd (the listed assignee may be inaccurate)
Application filed by Huawei Technologies Co Ltd
Priority: CN202211073982.8A; PCT/CN2023/095982 (published as WO2024045704A1)

Classifications

    • B60R11/00 Arrangements for holding or mounting articles, not otherwise provided for
    • B60R11/02 Arrangements for holding or mounting radio sets, television sets, telephones, or the like; arrangement of controls thereof
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00–G02B26/00, G02B30/00
    • G02B27/01 Head-up displays
    • G09F9/00 Indicating arrangements for variable information in which the information is built up on a support by selection or combination of individual elements
    • G09G3/00 Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
    • G09G3/20 Control arrangements or circuits for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix, no fixed position being assigned to the individual characters or partial characters

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Mechanical Engineering (AREA)
  • Optics & Photonics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Hardware Design (AREA)

Abstract

Embodiments of this application provide a display device, display equipment, and a vehicle, and relate to the field of optical display technologies. The main aim is to provide a display device that can present an undistorted, ideal virtual image of an image without losing image resolution; it can be applied to specific scenarios such as desktop display, near-eye display, head-up display, projection display, and vehicle-mounted display. The display device includes an image generating unit and an optical imaging unit. The image generating unit generates a real image according to an image signal input by a processor, where the real image is predistorted relative to the image corresponding to the image signal. The optical imaging unit images the real image to form a corresponding virtual image, for example a magnified virtual image, and compensates for the predistortion through the optical distortion it causes to the virtual image. In this way, a relatively ideal virtual image can be formed, the user's viewing experience is ensured, and the pixels of the image generating unit are used effectively, avoiding any loss of image resolution.

Description

Display device, display equipment and vehicle
Technical Field
The present disclosure relates to the field of optical display technologies, and in particular, to a display device, a display apparatus, and a vehicle.
Background
Optical display products include desktop display devices, head-up displays (HUDs), projection lamps, and the like. Their application scenarios include, but are not limited to, vehicles such as automobiles and trains; desktop display devices can also be used in education, for example in desks with a built-in desktop display function.
A common feature of these optical display products is that their optical imaging unit can display a virtual image with an enlarged field of view and a magnified picture, which is relayed into the observer's eyes so that the observer enjoys a large-screen viewing experience.
However, because the optical imaging unit generally consists of optical lenses, and because of deviations in lens manufacturing accuracy and in the assembly process of the optical imaging unit, the optical imaging unit introduces optical distortion into the virtual image during imaging; that is, the displayed virtual image is distorted, which degrades the observer's viewing experience.
Disclosure of Invention
Embodiments of this application provide a display device, a display apparatus including the display device, and a vehicle including the display device or display apparatus. The main aim is to provide a display device that can present an ideal, undistorted virtual image of an image without losing image resolution.
In a first aspect, this application provides a display device that may be applied to scenarios such as video entertainment, office work, learning, and driving assistance, including but not limited to specific scenarios such as desktop display, near-eye display, head-up display, projectors, vehicle-mounted display devices, and vehicle lamps. The display device may be a flat display device, a stereoscopic display device, a stand-alone device, or a component integrated into another device.
The display device includes an image generating unit and an optical imaging unit. The image generating unit generates a real image according to an image signal input by a processor, where the image signal corresponds to an ideal, undistorted image to be displayed, and the real image is predistorted relative to the image to be displayed. The optical imaging unit images the real image to form a corresponding virtual image, for example a magnified virtual image, and compensates for the predistortion through the optical distortion it causes to the virtual image. In this way, a relatively ideal virtual image can be formed, ensuring the user's viewing experience.
In addition, the image generating unit generates the predistorted real image directly from the ideal image signal of the image to be displayed, input by a processor external to the display device, rather than from a pre-corrected image that is itself distorted relative to the image to be displayed. The pixels of the image generating unit are therefore used effectively, avoiding any loss of image resolution. Moreover, because the image generating unit realizes the pre-correction of optical distortion in the "real image generation stage", no external processor is needed to pre-correct the image to be displayed, so no extra power consumption or cost is incurred.
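The predistort-then-compensate idea can be sketched numerically. The radial model r' = r·(1 + k·r²) and the coefficient below are illustrative assumptions, not taken from this application; the point is only that generating the real image with the inverse warp lets the optics' own distortion restore the ideal geometry.

```python
def distort(r: float, k: float) -> float:
    """Radial distortion of the optical imaging unit: r' = r * (1 + k*r*r)."""
    return r * (1.0 + k * r * r)

def predistort(r: float, k: float, iters: int = 30) -> float:
    """Numerically invert distort() by fixed-point iteration, so that
    distort(predistort(r, k), k) is (approximately) r again."""
    r_pre = r
    for _ in range(iters):
        r_pre = r / (1.0 + k * r_pre * r_pre)
    return r_pre

k = -0.15                          # illustrative distortion coefficient
r_ideal = 0.8                      # normalized field height of one pixel
r_real = predistort(r_ideal, k)    # where the pixel is generated in the real image
r_virtual = distort(r_real, k)     # where the optics place it in the virtual image
```

Here `r_virtual` lands back on `r_ideal`: the predistortion built into the real image and the optical distortion of the imaging unit cancel out.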
In a possible implementation of the first aspect, the optical imaging unit corresponds to a first distortion parameter, which characterizes the optical distortion. The image generating unit includes a light modulator that modulates an input optical signal according to the input image signal to generate the real image, where the size and/or distribution of the pixels on the light modulator is matched to a second distortion parameter; the second distortion parameter is determined from the first distortion parameter and characterizes the predistortion.
That is, the size and/or distribution of the pixels of the light modulator is designed according to (i.e., corresponds to) the first distortion parameter of the optical imaging unit. The real image generated by the light modulator therefore carries, relative to the image to be displayed, a predistortion characterized by the second distortion parameter, and this predistortion cancels the optical distortion introduced when the optical imaging unit subsequently forms the virtual image. Because the pre-correction of optical distortion is realized in the "real image generation stage", both the loss of image resolution and additional power consumption and cost are avoided.
As can be seen from the foregoing, in the embodiments of the present application, the size and distribution of the pixels on the light modulator are designed according to the optical distortion caused by the optical imaging unit. The optical distortion may include barrel distortion and pincushion distortion, among others.
For example, if the optical distortion caused by the optical imaging unit is barrel distortion, the sizes of some or all of the pixels on the light modulator are compressed, that is, the pixel size is reduced, so that the image generating unit generates a real image with pincushion distortion. After this real image enters the optical imaging unit, the pincushion distortion of the real image is counteracted by the barrel distortion the unit causes to the virtual image, and an ideal virtual image is finally formed. Conversely, if the optical distortion caused by the optical imaging unit is pincushion distortion, the sizes of some or all of the pixels on the light modulator are expanded, that is, the pixel size is increased, so that the image generating unit generates a real image with barrel distortion. After this real image enters the optical imaging unit, the barrel distortion of the real image is counteracted by the pincushion distortion the unit causes to the virtual image, and an ideal virtual image is finally formed.
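The compress-or-expand rule for pixel sizes can be sketched with the same kind of radial model (an illustrative assumption, not from the application): each modulator pixel is scaled by the inverse of the local magnification the optics will apply at its field position, so that all pixels come out at the nominal size in the virtual image.

```python
def local_magnification(r: float, k: float) -> float:
    """d(r')/dr for the illustrative radial model r' = r * (1 + k*r*r)."""
    return 1.0 + 3.0 * k * r * r

def modulator_pixel_width(nominal: float, r: float, k: float) -> float:
    """Pre-compensated pixel width on the light modulator: scaled by the
    inverse of the local magnification so the imaged width is uniform."""
    return nominal / local_magnification(r, k)

k = 0.1          # illustrative coefficient; its sign selects the distortion type
nominal = 1.0
radii = (0.0, 0.5, 1.0)
widths = [modulator_pixel_width(nominal, r, k) for r in radii]
imaged = [w * local_magnification(r, k) for w, r in zip(widths, radii)]
```

With k > 0 the local magnification grows toward the edge, so edge pixels are compressed on the modulator, yet every imaged width equals the nominal width.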
In this embodiment of the application, an ideal virtual image can be formed even when only the pixels in part of the areas are compressed or expanded.
In a possible implementation manner of the first aspect, the light modulator includes a first area and a second area, and pixels in the first area and pixels in the second area are distributed in an array; the size of the pixels in the first area is different from the size of the pixels in the second area, and/or the array distribution mode of the pixels in the first area is different from the array distribution mode of the pixels in the second area.
The size and the distribution mode of the pixels may specifically be the size and the distribution mode of the light emitting area in the sub-pixel area where the sub-pixels are located, where the size is, for example, width, height, area, etc., and the distribution mode includes a position, an array distribution direction, a pitch, etc. when the array is distributed.
In a possible implementation of the first aspect, the first area is farther from the center of the light modulator than the second area, and the size and/or array distribution of the pixels within the first area is matched to the second distortion parameter. Alternatively, the size and/or array distribution of the pixels in the first area corresponds to the first distortion parameter.
That is, the light modulator is divided into two parts, and the size and/or array distribution of the pixels in the two regions may follow different rules. The size and/or array distribution of the pixels in the first area is designed according to the second distortion parameter, that is, it follows a rule corresponding to the second distortion parameter, such as a compression rule or an expansion rule. The size and/or array distribution of the pixels in the second area follows another rule, such as the standard design rule of the light modulator, where "standard" means that the size and/or array distribution of the pixels is independent of the second distortion parameter, i.e., of the imaging defects of the optical imaging unit.
Based on this, in one possible implementation manner of the first aspect, in the case that the optical distortion is barrel distortion, the width of the light modulator along the column direction of the pixels gradually increases from the middle to both ends in the row direction, so that the shape of the real image is "pincushion".
Further, the size of the pixels in the first region is smaller than the size of the pixels in the second region, so that the predistortion of the real image is pincushion distortion, i.e. exhibits pincushion distortion.
Further, a row of pixels in the first area may be distributed along a curve, where the pixels near the two ends of the curve are farther from the second area than the pixels near its midpoint. Equivalently, moving from the middle of the row toward both ends, the pixels of the row are progressively farther from a midline of the light modulator parallel to the row direction.
Further, within one row of pixels in the first area, the pixel width gradually decreases from the middle of the row toward both ends. Designing the pixel sizes and array distribution in the first area in this way makes the pincushion distortion the light modulator imposes on the real image closer to optical pincushion distortion, and therefore better matched to the barrel distortion caused by the optical imaging unit; after the two cancel, the resulting virtual image is closer to ideal. Optical pincushion distortion here means pincushion distortion caused by factors such as optical lens assembly errors and manufacturing accuracy; its origin differs from that of the pincushion distortion produced by the light modulator in this application.
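The first-region row layout described above — pixel centers on a curve bowing away from the midline, widths shrinking from the middle of the row toward its ends — can be sketched as follows. The quadratic profiles and all coefficients are illustrative assumptions.

```python
def row_layout(n: int, pitch: float, bow: float, shrink: float):
    """Return (x, y, width) for each of n pixels in one predistorted row.

    y is the offset from the modulator midline (ends bow away from it),
    and width decreases quadratically from the row middle to its ends,
    giving the real image a pincushion predistortion."""
    half = (n - 1) / 2.0
    layout = []
    for i in range(n):
        u = (i - half) / half                # -1 at left end, 0 at middle, +1 at right
        x = i * pitch
        y = bow * u * u                      # ends farther from the midline
        w = pitch * (1.0 - shrink * u * u)   # width shrinks toward the ends
        layout.append((x, y, w))
    return layout

row = row_layout(n=9, pitch=10.0, bow=2.0, shrink=0.2)
```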
Alternatively, in another possible implementation, in the case where the optical distortion is pincushion distortion, the width of the light modulator in the column direction of the pixels gradually decreases from the middle to both ends in the row direction, so that the shape of the real image is "barrel-shaped".
Further, the size of the pixels in the first region is larger than the size of the pixels in the second region, so that the predistortion of the real image is barrel distortion, i.e. exhibits barrel distortion.
Further, a row of pixels in the first area may be distributed along a curve, where the pixels near the midpoint of the curve are farther from the second area than the pixels near its two ends. Equivalently, moving from the middle of the row toward both ends, the pixels of the row are progressively closer to a midline of the light modulator parallel to the row direction.
Further, in one row of pixels in the first region, the width of the pixels gradually increases from the middle to the both ends in the row direction thereof.
With the pixels in the first area arrayed in this way, the barrel distortion the light modulator imposes on the real image is closer to optical barrel distortion, and therefore better matched to the pincushion distortion caused by the optical imaging unit; after the two cancel, the resulting virtual image is closer to ideal.
Whether barrel or pincushion, distortion becomes more pronounced as the field of view grows, while in regions with a smaller field of view it does not noticeably affect the user's viewing experience. Therefore, the field angle corresponding to the pixels in the first area may be greater than or equal to a preset field angle; that is, the first area is a large-field area. In this way, an ideal virtual image (within the tolerance of the human eye) can be ensured without compressing or expanding all of the pixels of the light modulator.
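The large-field-area criterion can be sketched as a simple classification: predistort only those pixels whose field angle meets the preset threshold. The geometry (viewing distance, pixel coordinates in millimeters) and the threshold value are illustrative assumptions.

```python
import math

def field_angle_deg(x_mm: float, y_mm: float, view_dist_mm: float) -> float:
    """Field angle of a pixel at (x, y) seen from a point view_dist_mm away
    on the optical axis."""
    return math.degrees(math.atan2(math.hypot(x_mm, y_mm), view_dist_mm))

def needs_predistortion(x_mm: float, y_mm: float,
                        view_dist_mm: float = 500.0,
                        threshold_deg: float = 10.0) -> bool:
    """True if the pixel lies in the large-field (first) area and should be
    compressed/expanded; False if it belongs to the standard (second) area."""
    return field_angle_deg(x_mm, y_mm, view_dist_mm) >= threshold_deg

center = needs_predistortion(0.0, 0.0)      # small field: standard design
corner = needs_predistortion(190.0, 110.0)  # large field: predistorted design
```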
In a possible implementation of the first aspect, the light modulator further includes an edge area farther from the center of the light modulator than the first area, and the pixels in the edge area are smaller than those in the first area. Reducing the pixel size at the edge ensures that even the smaller sub-pixel regions at the edge of the light modulator contain a complete light-emitting area, avoiding defects such as rough or notched edges in the generated real image. In addition, the smoothness of the light modulator's edge can be further optimized by adjusting the positions of the edge pixels, for example by staggering consecutive pixels, which smooths the edge of the real image and improves the display effect.
In a possible implementation manner of the first aspect, the display device further includes: the light diffusion element is positioned at the light emitting side of the image generation unit and used for diffusing the first light beam corresponding to the real image so as to improve the uniformity of an imaging picture.
In a possible implementation manner of the first aspect, the optical imaging unit includes the light diffusing element described above.
In a possible implementation manner of the first aspect, the optical imaging unit includes a folded light path element and a virtual image imaging lens group. A folded light path element for receiving the first light beam and transmitting the first light beam to the virtual image imaging lens group; the virtual image imaging lens group is used for reflecting the first light beam, and the reflected light reaches the folding light path element; the folding light path element is also used for transmitting the light beam reflected by the virtual image imaging lens group so as to emit a second light beam for forming a virtual image. The structural design of the optical imaging unit can improve the integration level of the display device and is beneficial to miniaturization of the display device.
In a possible implementation of the first aspect, the first light beam is circularly polarized, and the folded light path element includes a polarization converter and a first film layer, the first film layer being located on the side of the polarization converter away from the virtual image imaging lens group. The polarization converter receives the first light beam and converts it into linearly polarized light in a first polarization state; the first film layer reflects this light back to the polarization converter; the polarization converter converts it into circularly polarized light and emits it toward the virtual image imaging lens group; the polarization converter then converts the light beam reflected by the virtual image imaging lens group into linearly polarized light in a second polarization state; finally, the first film layer transmits the light in the second polarization state, emitting the second light beam. The polarization converter may be a quarter-wave plate. In this implementation, based on the first light beam being circularly polarized, the folded light path element on the one hand directs the first light beam to the virtual image imaging lens group, which processes it, for example by magnifying it, and on the other hand transmits the light beam returning from the lens group so that it can enter the user's eyes. This folding of the light path improves the integration of the display device and facilitates its miniaturization.
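A minimal Jones-calculus sketch of why the returning beam is transmitted rather than reflected again: a double pass through a quarter-wave plate with its fast axis at 45° acts as a half-wave plate, rotating the first linear polarization state into the orthogonal second state. The sign conventions (and the coordinate flip at the reflection) are simplified assumptions folded into the matrices.

```python
# 2x2 complex Jones matrices represented as nested lists (stdlib only).

def matmul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def apply(m, v):
    return [m[0][0] * v[0] + m[0][1] * v[1],
            m[1][0] * v[0] + m[1][1] * v[1]]

s = 1.0 / 2.0 ** 0.5
QWP45 = [[s, -1j * s], [-1j * s, s]]   # quarter-wave plate, fast axis at 45 deg

# Out through the plate, reflection (its coordinate flip absorbed into the
# convention), back through the plate: equivalent to a half-wave plate at 45 deg.
round_trip = matmul(QWP45, QWP45)

v_in = [1.0, 0.0]                # linear polarization in the first state (x)
v_out = apply(round_trip, v_in)  # ends up in the orthogonal second state (y)
```

The output amplitude sits entirely in the orthogonal component, which is exactly the state the first film layer transmits.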
Meanwhile, the propagation process of the light beam can ensure the imaging quality without causing the loss of the light beam.
In a possible implementation manner of the first aspect, the virtual image imaging lens group includes an optical free-form surface, which may be provided by an optical free-form surface lens.
In a possible implementation of the first aspect, the folded light path element includes a second film layer, a third film layer, and a carrier film, where the third film layer and the second film layer may be located, in that order, on the side of the carrier film away from the virtual image imaging lens group.
The second film layer reflects part of the light in the first light beam to the virtual image imaging lens group and transmits part of the light reflected back by the lens group to the third film layer. The third film layer transmits the light arriving from the second film layer; this transmitted light is the second light beam, which continues to propagate and finally enters the user's eyes, so that the user sees the magnified virtual image. In this implementation, the folded light path element is structurally simple and easy to implement: it reflects part of the first light beam to the virtual image imaging lens group for processing, and transmits part of the light beam returning from the lens group so that it can reach the user's eyes. This again achieves the light-path-folding effect, improving the integration of the display device and facilitating its miniaturization.
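A numerical aside on this simpler, partially reflective folded path (an illustrative energy-budget sketch, not a claim from the application): because the same film must first reflect a fraction r of the beam and later transmit the returning fraction 1 − r, at most r·(1 − r) of the light can survive the fold, peaking at 25% for a half-mirror. The polarization-based implementation described earlier avoids this trade-off.

```python
def folded_path_efficiency(r: float) -> float:
    """Fraction of the first beam that exits as the second beam when one
    partially reflective film both reflects (r) and transmits (1 - r) it."""
    return r * (1.0 - r)

# Scan reflectivities from 0 to 1 in 1% steps to find the best throughput.
best = max(folded_path_efficiency(i / 100.0) for i in range(101))
```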
In a second aspect, the present application provides a display apparatus comprising a processor and any one of the above display devices, the processor being configured to send an image signal to an image generating unit in the display device.
In a third aspect, the present application provides a vehicle comprising a reflective element and any one of the display devices or display apparatus described above, the display device or display apparatus being mounted on the vehicle.
In a possible implementation manner of the third aspect, the display device or the display apparatus is configured to emit a light beam corresponding to the virtual image to the reflective element; the reflective element is used to reflect the light beam to both eyes of a user, and the reflective element, such as a windshield of a vehicle, is included in an optical imaging unit of a display device.
In another possible implementation of the third aspect, the vehicle has a seat, and the display device or the display apparatus is mounted on a backrest of the seat.
For the technical effects of any of the designs of the second and third aspects, refer to the technical effects of the corresponding designs of the first aspect; they are not repeated here.
Drawings
FIG. 1 is a schematic view of optical distortion according to an embodiment of this application;
FIG. 2 is a schematic diagram of an imaging process of a display device according to an embodiment of this application;
FIG. 3 is a schematic diagram of an application scenario of a display device according to an embodiment of this application;
FIG. 4 is a schematic structural diagram of a display device according to an embodiment of this application;
FIG. 5 is a schematic diagram of an imaging process of another display device according to an embodiment of this application;
FIG. 6 is a schematic diagram of the relationship between a light modulator and first and second distortion parameters according to an embodiment of this application;
FIG. 7 is a schematic diagram of the relationship between a light modulator structure and a real image according to an embodiment of this application;
FIG. 8 is a schematic structural diagram of a light modulator according to an embodiment of this application;
FIG. 9 is a schematic diagram of pixels in a local area of a light modulator according to an embodiment of this application;
FIG. 10 is a schematic structural diagram of a light modulator according to an embodiment of this application;
FIG. 11 is a schematic structural diagram of another light modulator according to an embodiment of this application;
FIG. 12 is a schematic structural diagram of another display device according to an embodiment of this application;
FIG. 13 is a schematic structural diagram of an optical imaging unit according to an embodiment of this application;
FIG. 14 is a schematic diagram of another display device including the optical imaging unit shown in FIG. 13 according to an embodiment of this application;
FIG. 15 is a schematic diagram of an optical path in an optical imaging unit according to an embodiment of this application;
FIG. 16 is a schematic diagram of an optical path in another optical imaging unit according to an embodiment of this application;
FIG. 17 is a schematic diagram of a display device according to an embodiment of this application;
FIG. 18 is a schematic diagram of a display device according to an embodiment of this application applied to a head-up display (HUD) of a vehicle;
FIG. 19 is a schematic diagram of a possible functional framework of a vehicle according to an embodiment of this application.
Detailed Description
The technical solutions in the embodiments of the present application will be described below with reference to the drawings in the embodiments of the present application.
In the following, the terms "first", "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defining "a first" or "a second" may explicitly or implicitly include one or more such feature. In the description of the present embodiment, unless otherwise specified, the meaning of "plurality" is two or more.
Optical display products include desktop display devices, head-up displays (HUDs), projection lamps, and the like. Their application scenarios include, but are not limited to, vehicles such as automobiles and trains; desktop display devices can also be used in education, for example in desks with a built-in desktop display function.
A common feature of these optical display products is that their optical imaging unit can display a virtual image with an enlarged field of view and a magnified picture, which is relayed into the observer's eyes so that the observer enjoys a large-screen viewing experience.
However, the optical imaging unit is generally composed of optical lenses. In practice, due to the manufacturing precision of the optical lenses, deviations in the assembly process of the optical imaging unit, and the structural characteristics of the lenses (such as optical free-form surfaces or various spherical surfaces), the virtual image formed by the optical imaging unit loses similarity with respect to the real image generated by the image generating unit; that is, geometric deformation occurs. This geometric deformation is known as optical distortion. It can also be understood as follows: for an ideal optical imaging unit, the magnification of the virtual image relative to the real image is constant across the whole virtual image, so the virtual image is highly similar to the real image and no optical distortion arises. For a real optical imaging unit, however, the magnification varies with the field of view, and the larger the field of view, the greater the variation in magnification; the virtual image therefore loses similarity to the real image, that is, optical distortion occurs.
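The "magnification varies with the field of view" description can be made concrete with a simple polynomial model (an illustrative assumption, not from the text): the relative distortion grows with field height, so larger fields of view show visibly more distortion.

```python
def relative_distortion(r_ideal: float, k: float) -> float:
    """(r_actual - r_ideal) / r_ideal for the illustrative model
    r_actual = r_ideal * (1 + k * r_ideal**2)."""
    return k * r_ideal * r_ideal

# Relative distortion at three normalized field heights, small to large.
fields = [0.2, 0.5, 1.0]
growth = [abs(relative_distortion(r, 0.1)) for r in fields]
```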
The optical distortion caused to the virtual image by the optical imaging unit may be, for example, the pincushion distortion shown in FIG. 1(b) or the barrel distortion shown in FIG. 1(c), where each grid cell represents one pixel. In brief, calling the distorted virtual image the actual virtual image, and comparing it with the ideal virtual image shown in FIG. 1(a): barrel distortion shifts some pixel positions of the actual virtual image away from the image center and enlarges those pixels, while pincushion distortion shifts some pixel positions toward the image center and shrinks those pixels. Whether barrel or pincushion, the larger the field of view, the more pronounced the distortion. Such imaging defects of the optical imaging unit present the user with a geometrically deformed image, which undoubtedly degrades the viewing experience.
In order to solve the problem of optical distortion caused by the optical imaging unit, in some implementations, as shown in fig. 2, a processor at the periphery of the display device is used to perform pre-correction processing on an image to be displayed to obtain a pre-corrected image, and an image signal of the pre-corrected image is input to the image generating unit, so that the image generating unit generates a real image of the pre-corrected image. The real image of the pre-corrected image is processed by the optical imaging unit to form a relatively ideal virtual image.
To better understand the principle of this implementation, the process can be divided into three stages. The first stage is the pre-correction processing stage, in which a pre-corrected image that is geometrically deformed relative to the image to be displayed is obtained. The second stage is the real-image generation stage: after the image signal of the pre-corrected image is input to the image generating unit, the image generating unit generates a real image of the pre-corrected image. The third stage is the virtual-image formation stage: after the real image of the pre-corrected image passes through the optical imaging unit, the optical distortion caused by the optical imaging unit offsets, to a certain extent, the geometric deformation of the pre-corrected image, so that a relatively ideal virtual image is formed.
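As a toy illustration (not part of the embodiments; the one-dimensional model and the coefficient K below are assumptions for exposition only), the three stages can be sketched as a pre-correction that numerically inverts the optical distortion, so that composing pre-correction and optics is approximately the identity:

```python
# Toy 1-D model of the three stages. The optics are modeled as a
# field-dependent radial stretch r -> r * (1 + K * r^2); K is an
# arbitrary illustrative coefficient, not a parameter from this patent.
K = 0.05

def optics(r):
    """Stage three: virtual-image formation adds optical distortion."""
    return r * (1 + K * r * r)

def pre_correct(r, iters=20):
    """Stage one: numerically invert optics() by fixed-point iteration,
    yielding the pre-corrected position whose distorted image lands
    back on the target position r."""
    s = r
    for _ in range(iters):
        s = r / (1 + K * s * s)
    return s

# Stage two (real-image generation) simply displays pre_correct(r);
# composing the stages then approximately restores the target.
r_target = 0.8
residual = abs(optics(pre_correct(r_target)) - r_target)
```

After a handful of iterations the residual falls below 1e-9, i.e., the optical distortion cancels the pre-correction to within numerical precision, which is exactly the role of the pre-correction stage described above.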
However, on the one hand, the pre-corrected image is geometrically deformed relative to the image to be displayed; in fig. 2, for example, the image to be displayed is rectangular while the pre-corrected image is "barrel-shaped", meaning that the positions of a large number of pixels of the pre-corrected image are shifted and the pixel sizes are changed. As a result, the pixels of the image generating unit are not fully utilized when generating the real image of the pre-corrected image, which leads to a loss of image resolution. On the other hand, because a processor at the periphery of the display device is required to perform the pre-correction processing, that is, the implementation requires corresponding hardware support, additional power consumption and cost are incurred.
In view of this, the embodiments of the present application provide a display device, which may also be referred to as a projection device, that can form a relatively ideal virtual image and ensure the user's viewing experience while fully utilizing the pixels of the image generating unit, thereby avoiding any loss of image resolution without increasing power consumption or cost.
The display device can be applied to video entertainment, education, office, driving assistance, medical treatment, and other scenarios. The display device can be a planar display device that displays two-dimensional images, or a stereoscopic display device that, based on a light-splitting stereoscopic display technology, provides separate left-eye and right-eye images to the user's left and right eyes so as to produce a stereoscopic visual experience. In specific applications, the display device can be integrated into other equipment as a component or used independently.
For example, in some possible application scenarios, the display device may be a desktop display device, i.e., its display window is located on the desktop of a table-body structure such as a game table, desk, or learning table. The desktop display device can display a larger-sized image within the user's forward field of view, providing a more immersive and striking visual experience and reducing visual fatigue. As shown in fig. 3 (a), taking a desktop display device mounted on a learning table as an example, the imaging light beam generated by the desktop display device is reflected by the display window into the user's eyes, so that an enlarged virtual image of the image is presented in the user's forward view. The image may contain book content, supplementary learning content, or online-course pictures, so the user neither needs to bow their head to read a book nor take online courses on a small-screen smart device, which helps protect the user's eyesight and provides better learning conditions.
In other possible application scenarios, the display device may be integrated into a near-eye display (NED) device, which may be, for example, an augmented reality (AR) device or a virtual reality (VR) device. AR devices include, but are not limited to, AR glasses and AR helmets, and VR devices include, but are not limited to, VR glasses and VR helmets. As shown in fig. 3 (b), taking VR glasses as an example of a NED device, when a user wears the VR glasses, they can present an enlarged virtual image corresponding to an image in the field of view in front of the user, so that the user is immersed in a virtual reality scene to play games, watch videos, participate in virtual conferences, shop by video, and the like.
In still other possible application scenarios, the display device may be integrated into a HUD. As shown in fig. 3 (c), taking a HUD mounted in a vehicle as an example, the HUD may project an imaging light beam toward both eyes of the user (driver) so as to present an enlarged virtual image of an image in the user's forward view. For example, the image may include instrument information, navigation information, etc., so that the user does not need to look down to view this information, avoiding any impact on driving safety. Types of HUDs include, but are not limited to, the windshield HUD (W-HUD), the augmented reality head-up display (AR-HUD), and the like.
In further possible application scenarios, the display device can also be integrated into a vehicle lamp. As shown in fig. 3 (d), in addition to its lighting function, the vehicle lamp may also implement an adaptive driving beam (ADB) system that presents, in front of the vehicle, an enlarged virtual image of an image containing text, traffic signs, video frames, etc., thereby providing the user with driving-assistance or audio/video entertainment functions.
In still other possible application scenarios, the display device may be a projector. As shown in (e) of fig. 3, the projector may project an enlarged virtual image corresponding to the image onto a wall surface or a projection screen.
It should be noted that the application scenario given above is merely an example, and the display device provided in the embodiment of the present application may also be applied to other possible scenarios, for example, integrated in a medical device, for assisting medical treatment, which is not limited in this application.
Fig. 4 is a schematic structural diagram of a display device 400 according to an embodiment of the present application. As shown in fig. 4, the display apparatus 400 includes an image generation unit 410 and an optical imaging unit 420. The image generating unit 410 is capable of generating a real image of an image to be displayed corresponding to an input image signal, that is, generating a light beam P1 containing information of the image to be displayed, where the light beam P1 may form the real image of the image to be displayed. The optical imaging unit 420 is configured to perform processes such as folding (e.g. changing a propagation direction of the light beam) and amplifying the light beam P1, so as to form an amplified virtual image corresponding to the real image, that is, form a light beam P2 containing image information to be displayed, where the light beam P2 may form the amplified virtual image. For convenience of distinction and explanation, the light beam P1 generated by the image generating unit 410 will be referred to as a first light beam, and the light beam P2 emitted by the optical imaging unit 420 will be referred to as a second light beam.
It should be appreciated that the positional relationship of the image generation unit 410 and the optical imaging unit 420 may be different in different application scenarios. For example, when the display apparatus 400 is applied to a desktop display scene and applied to a HUD, the positional relationship between the image generating unit 410 and the optical imaging unit 420 may be different. Or it may be understood that, for a specific application scenario, the positional relationship between the image generating unit 410 and the optical imaging unit 420 depends on the optical path design in the specific application scenario, which is not limited in the present application. However, to ensure that the first light beam may be incident on the optical imaging unit 420, the optical imaging unit 420 is typically located on the light exit side of the image generation unit 410.
In order to make full use of the pixels of the image generating unit 410 while forming a relatively ideal virtual image, avoiding losing image resolution and not increasing power consumption and cost, in the embodiment of the present application, as shown in fig. 5, the real image of the image to be displayed generated by the image generating unit 410 has predistortion with respect to the image to be displayed. After the real image enters the optical imaging unit 420, the optical imaging unit 420 may form a corresponding amplified virtual image, and the optical distortion caused by the optical imaging unit 420 on the virtual image may compensate for the pre-distortion. In this way, a relatively ideal virtual image can be formed, and the viewing experience of the user is ensured.
It should be noted that, since the image generating unit 410 directly generates the real image with predistortion according to the image signal of the image to be displayed inputted by the processor at the periphery of the display device, rather than generating the corresponding real image according to the image signal of the pre-corrected image, the pixels of the image generating unit 410 are effectively utilized, and the loss of the resolution of the image is avoided. In addition, it can be further seen that, in the present application, the image generating unit 410 implements the pre-correction of the optical distortion in the "real image generating stage", and omits the "pre-correction processing stage", that is, no hardware support on the periphery of the display device 400 is required, so that no additional power consumption and no additional cost are generated.
The optical imaging unit 420 corresponds to a distortion parameter, which may be understood as a characteristic parameter of the optical imaging unit 420, which may characterize the optical distortion caused by the optical imaging unit 420. For convenience of description, the distortion parameter corresponding to the optical imaging unit 420 will be referred to as a first distortion parameter hereinafter.
In some embodiments, the image generation unit 410 includes a light source and a light modulator. The light modulator is located on the light-exit side of the light source, and the light source provides the light modulator with base light, such as natural light or primary-color light. The light modulator, also called a display panel, such as a liquid crystal display (LCD) panel or an organic light-emitting diode (OLED) panel, modulates the base light according to the signal of the image to be displayed to generate the first light beam, i.e., the light beam P1 shown in fig. 4 above, which forms the real image of the image to be displayed. The size and/or distribution of the pixels of the light modulator match a second distortion parameter, and the second distortion parameter is determined according to the first distortion parameter. This can also be understood as follows: the size and/or distribution of the pixels of the light modulator are designed according to, i.e., correspond to, the first distortion parameter of the optical imaging unit 420, so that the real image generated by the light modulator has, relative to the image to be displayed, a pre-distortion that can be characterized by the second distortion parameter. This pre-distortion offsets the optical distortion caused when the optical imaging unit 420 subsequently forms the virtual image, so that the pre-correction of the optical distortion is implemented in the "real image generation stage", which not only avoids loss of image resolution but also avoids additional power consumption and cost.
The relationships between the first distortion parameter and the light modulator, and between the first distortion parameter and the second distortion parameter, are explained in order below, followed by the specific implementations of the light modulator.
In one possible implementation, the optical distortion is defined as the amount of deviation of the real virtual image from the ideal virtual image, where the deviation may be the positional deviation of the same feature point in the real and ideal virtual images, or the deviation in height and/or width of the same feature region in the two. Here, assuming an ideal optical imaging unit 420 with zero distortion, the ideal virtual image is the virtual image that a given real image would form through that ideal unit, while the real virtual image is the virtual image the same real image forms through the actual optical imaging unit 420.
Taking the positional deviation of the same feature point as an example, as shown in formulas (1-1) and (1-2) below, the relationship between the real virtual image and the ideal virtual image may be the relationship between the position coordinates of a feature point on the real virtual image and the position coordinates of the same feature point on the ideal virtual image, and the parameters characterizing/describing this relationship are the first distortion parameters. It should be appreciated that the first distortion parameter, which corresponds to the optical imaging unit 420, is thus also a parameter characterizing/describing the optical distortion. By way of example, the feature points may be pixel positions.
x′ = x + x·[k1(x² + y²) + k2(x² + y²)²]    (1-1)
y′ = y + y·[k1(x² + y²) + k2(x² + y²)²]    (1-2)
In the above formulas, (x′, y′) are the position coordinates of a pixel in the real virtual image and (x, y) are the position coordinates of the corresponding pixel in the ideal virtual image; the parameters describing the relationship between x′ and x and between y′ and y, namely k1 and k2, are the first distortion parameters.
It should be noted that formulas (1-1) and (1-2) are given as examples and are not intended to limit the definition of the first distortion parameters involved in the embodiments of the present application. In practice, the relationship between the position coordinates of a feature point on the real virtual image and those on the ideal virtual image differs with the type of optical distortion, such as radial distortion (including the barrel and pincushion distortion mentioned above) and tangential distortion, and the first distortion parameters differ accordingly; they are not limited to the functional relationship and the parameters k1 and k2 shown in formulas (1-1) and (1-2).
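Formulas (1-1) and (1-2) can be evaluated directly. The sketch below (the coefficient values for k1 and k2 are arbitrary illustrative assumptions, not measured parameters of any actual optical imaging unit) maps an ideal virtual-image coordinate to its distorted counterpart:

```python
def radial_distort(x, y, k1, k2):
    """Apply formulas (1-1)/(1-2): map the ideal virtual-image pixel
    coordinate (x, y) to the real virtual-image coordinate (x', y')."""
    r2 = x * x + y * y                  # x^2 + y^2
    factor = k1 * r2 + k2 * r2 * r2     # k1*(x^2+y^2) + k2*(x^2+y^2)^2
    return x + x * factor, y + y * factor

# Illustrative coefficients: positive k1/k2 push off-center points
# outward, while the image center (0, 0) stays fixed.
xp, yp = radial_distort(0.5, 0.5, k1=0.1, k2=0.01)
```

Note that the displacement grows with the distance from the image center, which matches the earlier observation that the larger the field of view, the more pronounced the distortion.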
In addition, the optical distortion referred to in the present application is distortion caused to a virtual image formed by the optical imaging unit 420, and may have different definitions based on a subject to be compared with the virtual image. The first distortion parameters may also be different for optical distortions under different definitions.
For example, in another possible implementation, the above optical distortion is defined as the amount of deviation of the virtual image formed by the optical imaging unit 420 from the real image entering the optical imaging unit 420. Since the virtual image and the real image correspond to each other, where the real image has a plurality of feature points (or feature regions), the virtual image has a matching plurality, and the feature points (or regions) in the real image correspond one-to-one with those in the virtual image. The deviation of the virtual image from the real image may be the positional deviation of corresponding feature points, or the deviation in height and/or width of corresponding feature regions. Taking the positional deviation as an example, the relationship between the virtual image and the real image may be the relationship between the position coordinates of corresponding feature points in the two, and the parameters characterizing/describing this relationship may be the first distortion parameters corresponding to the optical imaging unit 420, i.e., the parameters characterizing the optical distortion of the virtual image.
For any of the above implementations, the first distortion parameter corresponding to the optical imaging unit 420 may be obtained by corresponding actual measurement or simulated measurement. It will be appreciated that the manner in which the distortion parameters are measured will vary based on the definition of optical distortion described above. Those skilled in the art may select an appropriate measurement mode to obtain the first distortion parameter corresponding to the optical imaging unit 420 in light of the definition of the optical distortion given in the embodiment of the present application, and the embodiment of the present application is not limited to the specific measurement mode.
Fig. 6 exemplarily shows the relationship between the light modulator and the first and second distortion parameters; in this example, it is assumed that the first distortion parameter describes the relationship between the virtual image R formed by the optical imaging unit 420 and the real image D entering the optical imaging unit 420. Referring to fig. 6, since the virtual image R formed by the optical imaging unit 420 is desired to be an ideal virtual image, R can be regarded as known. The pixels imgD in the real image D and the pixels imgR in the virtual image R correspond one-to-one, and corresponding imgD and imgR have the same pixel value. Therefore, once the first distortion parameter is obtained and the virtual image R is known, the coordinates of each pixel imgD in the real image D can be determined from the position coordinates of the corresponding pixel imgR in the virtual image R and the first distortion parameter, and the real image D can thus be obtained. The real image D is the pre-distorted real image that the image generating unit 410 is expected to generate, and its pre-distortion may be characterized by the second distortion parameter; that is, the relationship between the real image D and the image I to be displayed may be characterized by the second distortion parameter. In this way, when the size and/or distribution of the pixels of the light modulator match the second distortion parameter, the desired real image D can be generated from the image to be displayed.
In short, once the first distortion parameter corresponding to the optical imaging unit 420 is obtained, with the ideal virtual image R as the virtual-image imaging target, the real image D that the image generating unit 410 is expected to generate can be determined in conjunction with the first distortion parameter. Once the real image D is determined, the relationship between the real image D and the image I to be displayed, i.e., the second distortion parameter, can be further obtained. The size and/or distribution of the pixels of the light modulator are then designed based on the second distortion parameter, i.e., so that the real image generated by the image generating unit 410 is the desired pre-distorted real image D.
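Under the radial model of formulas (1-1)/(1-2), determining the pre-distorted real image D from the ideal virtual image R amounts to inverting that mapping for each pixel. A minimal sketch, with illustrative (assumed) coefficients and an iterative inversion that is one of several possible numerical approaches:

```python
def forward(x, y, k1, k2):
    """Formulas (1-1)/(1-2): real-image point -> virtual-image point."""
    r2 = x * x + y * y
    f = 1 + k1 * r2 + k2 * r2 * r2
    return x * f, y * f

def invert_radial(xp, yp, k1, k2, iters=25):
    """Given a desired ideal virtual-image position (xp, yp), find the
    real-image position (x, y) that the optics map onto it, by
    fixed-point iteration on the radial scale factor."""
    x, y = xp, yp
    for _ in range(iters):
        r2 = x * x + y * y
        f = 1 + k1 * r2 + k2 * r2 * r2
        x, y = xp / f, yp / f
    return x, y

# Pre-distorted positions for a few ideal virtual-image pixels.
ideal_grid = [(-0.5, -0.5), (0.5, 0.5), (0.2, -0.4)]
pre_distorted = [invert_radial(x, y, 0.1, 0.01) for x, y in ideal_grid]
```

Applying forward() to each pre-distorted point reproduces the corresponding ideal target to within numerical precision, mirroring how the pre-distortion of D is cancelled by the optical distortion to yield the ideal virtual image R.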
Typically, the panel region of a light modulator includes pixels, and each pixel includes a plurality of sub-pixels. For example, for most LCDs, each pixel is made up of the three primary colors red, green, and blue (RGB), with one sub-pixel per color, while for most OLEDs each pixel includes four sub-pixels: one red, one green, and two blue. The panel region includes sub-pixel regions distributed according to a certain rule, and the sub-pixels are located in these sub-pixel regions. Each sub-pixel region includes a light-emitting area, and the area share of the light-emitting area in the sub-pixel region is expressed by the aperture ratio of the sub-pixel region. The sub-pixels here are commonly referred to as pixel points. The size and distribution of the pixels mentioned in the embodiments of the present application may specifically be the size and distribution of the light-emitting areas in the sub-pixel regions where the sub-pixels are located, where the size is, for example, the width, height, or area, and the distribution includes the position and, for array distributions, the array direction and pitch.
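The geometric quantities in the previous paragraph can be captured in a small data structure. The type and field names below are hypothetical illustrations (not from the patent), and the dimensions are in arbitrary units:

```python
from dataclasses import dataclass

@dataclass
class SubPixelRegion:
    """One sub-pixel region of the panel: its overall extent and the
    extent of the light-emitting area inside it (arbitrary units)."""
    width: float
    height: float
    emit_width: float
    emit_height: float

    @property
    def aperture_ratio(self) -> float:
        """Area share of the light-emitting area in the sub-pixel region."""
        return (self.emit_width * self.emit_height) / (self.width * self.height)

# An edge sub-pixel region with a shrunken emitter: an 8x8 emitter in a
# 10x10 region gives an aperture ratio of 0.64.
edge_region = SubPixelRegion(10.0, 10.0, 8.0, 8.0)
```

A full-aperture region (emitter equal to the region) would have a ratio of 1.0; shrinking only the emitter, as in the edge-region design discussed later, lowers the ratio while keeping the light-emitting area complete.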
In some embodiments, the optical distortion caused by the optical imaging unit 420 manifests as a change in the shape of the image. For example, the distorted images shown in fig. 1 (b) and (c) show an obvious change in shape relative to the undistorted image shown in fig. 1 (a). For convenience of explanation, the undistorted shape will be referred to as the ideal shape and the distorted shape as the non-ideal shape.
In order to make the virtual image formed by the optical imaging unit 420 take the ideal shape, in one possible implementation the real image generated by the image generating unit 410 is deformed relative to the image to be displayed; in other words, while the image to be displayed has the ideal shape, the real image is pre-deformed into a non-ideal shape. The pre-distortion includes this pre-deformation. In this way, after the real image enters the optical imaging unit 420, the change in shape that the optical imaging unit 420 imposes when forming the virtual image counteracts the pre-deformation of the real image, so that the finally formed virtual image takes the ideal shape.
Since the shape of the real image depends on the shape of the light modulator, which in turn depends on the size and distribution of the pixels on it, when the size and/or distribution of the pixels of the light modulator match the second distortion parameter, the shape of the light modulator may be a non-ideal shape. This ensures that a real image of non-ideal shape is generated, which in turn ensures that the optical imaging unit 420 forms a virtual image of ideal shape.
As shown in fig. 7 (a), assuming the light modulator has the ideal shape, specifically a rectangle, the real image it generates also has the ideal shape, i.e., it is likewise rectangular. After the real image enters the optical imaging unit 420, the virtual image formed has a non-ideal, non-rectangular shape due to the optical distortion caused by the optical imaging unit 420.
As shown in fig. 7 (b), assuming for the same optical imaging unit 420 that the light modulator has a non-ideal shape, the real image it generates likewise has a non-ideal shape. After this real image enters the optical imaging unit 420, the optical distortion caused by the optical imaging unit 420 results in a virtual image of the ideal shape, i.e., a rectangle.
It should be understood that if the shape of the light modulator is non-ideal, the regions bounded by its edges are irregular in shape and size. Because of this, the edge region cannot accommodate pixels of the same size as the inner region, and defects such as "burrs", "gaps", and color shift are likely to appear in the real image. Therefore, to avoid such defects, in one possible implementation the pixels distributed at the edge of the light modulator (the "edge region") are smaller than the other pixels. Reducing the pixel size ensures that a complete light-emitting area still fits within the smaller sub-pixel regions of the edge region, so that "burr", "gap", and color-shift defects in the generated real image can be avoided. In addition, the positions of the pixels in the edge region can be adjusted, for example by staggering consecutive pixels, to further smooth the edge of the light modulator, avoid color shift, improve the smoothness and color uniformity of the real-image edge, and optimize the display effect.
Taking the light modulator shown in fig. 7 (b) as an example, fig. 8 exemplarily shows the size and distribution of pixels within the light modulator. As shown in fig. 8, the size of the pixels distributed in the edge area is smaller than that of the other pixels. The sub-pixel regions located in the edge regions are of a smaller area and less regular shape than the other sub-pixel regions, but due to the smaller size of the pixels therein, it is ensured that such sub-pixel regions also have a complete light emitting region.
The size and position of a pixel located in the edge region may be adapted to the size and aperture ratio of its sub-pixel region, so as to avoid defects such as "burrs", "gaps", and color shift in the generated real image.
In some embodiments, the optical distortion caused by the optical imaging unit 420 manifests not only as a change in the shape of the image but also as changes in pixel size and pixel position, i.e., distortion of the image, such as the barrel distortion and pincushion distortion shown in fig. 1. Barrel distortion is a distortion phenomenon in which the picture expands in a barrel shape; from the viewpoint of pixel size, the pixels become larger. Pincushion distortion is a distortion phenomenon in which the picture is compressed in a pincushion shape; from the viewpoint of pixel size, the pixels become smaller. It should be appreciated that barrel and pincushion distortion may be accompanied by a change in the shape of the image.
Based on this, in the embodiments of the present application, the size and array distribution of the pixels on the light modulator are designed according to the type of optical distortion caused by the optical imaging unit 420. If the optical distortion caused by the optical imaging unit 420 is barrel distortion, the size of some or all of the pixels on the light modulator is given a compression design, so that the image generating unit 410 generates a real image with pincushion distortion. In this way, after the real image enters the optical imaging unit 420, the pincushion distortion of the real image is counteracted by the barrel distortion that the optical imaging unit 420 imposes on the virtual image, and an ideal virtual image is finally formed. If the optical distortion caused by the optical imaging unit 420 is pincushion distortion, the size of some or all of the pixels on the light modulator is given an expansion design, so that the image generating unit 410 generates a real image with barrel distortion. In this way, after the real image enters the optical imaging unit 420, the barrel distortion of the real image is counteracted by the pincushion distortion imposed on the virtual image, and an ideal virtual image is finally formed.
Compression design of the pixel size is understood as reducing the pixel size, that is, reducing the size of the light-emitting area in the sub-pixel region. Expansion design of the pixel size is understood as increasing the pixel size, i.e., increasing the size of the light-emitting area in the sub-pixel region.
For example, the pixel obtained by the compression design may be as shown in (b) of fig. 9, and the pixel obtained by the expansion design may be as shown in (c) of fig. 9. Compared to the normal-size pixel shown in fig. 9 (a), the pixel obtained by the compression design is smaller than the normal-size pixel, and the pixel obtained by the expansion design is larger than the normal-size pixel.
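One way to choose a per-pixel compression or expansion factor is sketched below under the radial model of formulas (1-1)/(1-2); both the rule and the coefficient values are illustrative assumptions, not the patent's prescribed method. The idea is to divide each pixel's nominal size by the local radial magnification, so compression results where the optics enlarge pixels and expansion where they shrink them:

```python
def pixel_size_scale(x, y, k1, k2):
    """Scale factor applied to the nominal pixel size at position (x, y).
    m = 1 + k1*r^2 + k2*r^4 is the local radial magnification of the
    model; dividing by m compresses pixels where the optics enlarge
    them (m > 1) and expands them where the optics shrink them (m < 1)."""
    r2 = x * x + y * y
    m = 1 + k1 * r2 + k2 * r2 * r2
    return 1.0 / m

# With positive (barrel-like) coefficients, edge pixels are compressed
# while the center pixel keeps its nominal size.
center_scale = pixel_size_scale(0.0, 0.0, 0.1, 0.01)
edge_scale = pixel_size_scale(0.5, 0.5, 0.1, 0.01)
```

Flipping the signs of the coefficients (a pincushion-like case) yields scale factors above 1, i.e., the expansion design.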
As shown in fig. 7 (a), assuming that the dimensions of all pixels of the light modulator are the same, the resulting real image has neither barrel nor pincushion distortion. After the real image enters the optical imaging unit 420, if the optical distortion caused by the optical imaging unit 420 is barrel distortion, a virtual image formed by the real image has barrel distortion. If the optical distortion caused by the optical imaging unit 420 is a pincushion distortion, a virtual image formed thereof has the pincushion distortion.
Assuming that the optical distortion caused by the optical imaging unit 420 is barrel distortion, and that the size of some or all of the pixels of the light modulator is given a compression design, i.e., made smaller than the normal size, the resulting real image has pincushion distortion. After the real image enters the optical imaging unit 420, the barrel distortion caused by the optical imaging unit 420 can compensate the pincushion distortion of the real image, so that an ideal virtual image is formed.
In practical applications, compressing or expanding only some of the pixels may be sufficient to form an ideal virtual image. Of course, the ideal virtual image here is not absolutely ideal but ideal within the error tolerance of the human eye, i.e., the distortion of the image cannot be perceived by the human eye. Specifically, the light modulator includes a first region and a second region, and the pixels in both regions are distributed in arrays; the size of the pixels in the first region differs from that in the second region, and/or the array distribution of the pixels in the first region differs from that in the second region. This can be understood as dividing the light modulator into two parts whose pixel sizes and/or array distributions may follow different rules.
Illustratively, the first region is farther from the center of the light modulator than the second region. The size and/or array distribution of the pixels within the first region matches the second distortion parameter; equivalently, it corresponds to the first distortion parameter. That is, the size and/or array distribution of the pixels in the first region is designed according to the second distortion parameter, i.e., it follows a rule matched to the second distortion parameter, such as a compression or expansion design rule. The size and/or array distribution of the pixels in the second region follows another rule, such as the standard design rule of the light modulator. The standard design rule here may be understood to mean that the size and/or array distribution of the pixels is independent of the second distortion parameter, i.e., of the imaging defects of the optical imaging unit 420.
Whether barrel or pincushion, the distortion is more pronounced in regions with a larger field of view, while in regions with a smaller field of view it does not noticeably affect the user's viewing experience. Therefore, the field angle corresponding to the pixels in the first region may be greater than or equal to a preset field angle, that is, the first region is a large-field region. In this way, an ideal virtual image within the error tolerance of the human eye can be formed without compressing or expanding all pixels.
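The partition into a large-field first region can be expressed as a simple threshold test on the field angle. The geometry below (a viewing distance along the optical axis and a threshold angle) is an illustrative assumption, not taken from the patent:

```python
import math

def in_first_region(x, y, view_distance, threshold_deg):
    """Return True if the field angle of panel position (x, y), as seen
    from view_distance along the optical axis, meets the preset
    threshold -- i.e., the pixel lies in the large-field first region."""
    angle = math.degrees(math.atan(math.hypot(x, y) / view_distance))
    return angle >= threshold_deg

# With a viewing distance of 1.0 and a 10-degree threshold, a corner
# position falls in the first region while a near-center one does not.
corner = in_first_region(0.5, 0.5, 1.0, 10.0)
near_center = in_first_region(0.01, 0.0, 1.0, 10.0)
```

Positions passing this test would receive the compression or expansion design described above, while the remaining (second-region) pixels keep the standard design.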
In one possible implementation, the size of the pixels in the first region corresponds to the first distortion parameter. If the optical distortion produced by the optical imaging unit 420 is barrel distortion, the size of the pixels in the first region is smaller than the size of the pixels in the second region, so that the real image carries pincushion predistortion. If the optical distortion produced by the optical imaging unit 420 is pincushion distortion, the size of the pixels in the first region is larger than the size of the pixels in the second region, so that the real image carries barrel predistortion.
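The cancellation logic above can be sketched with a single-coefficient radial distortion model (an illustrative assumption; the patent does not specify the mathematical form of the distortion parameters). Under the usual sign convention, k < 0 pulls edge points inward (barrel) and k > 0 pushes them outward (pincushion), so pre-distorting with the opposite sign approximately cancels the optics' distortion:

```python
def distort(x, y, k):
    """Apply radial distortion to a normalized image point (x, y)."""
    r2 = x * x + y * y
    return x * (1.0 + k * r2), y * (1.0 + k * r2)

def predistort(x, y, k):
    """Pre-distort with the opposite-signed coefficient (first-order
    compensation of the optics' distortion)."""
    return distort(x, y, -k)

k_optics = -0.1            # barrel distortion of the imaging optics
x0, y0 = 0.8, 0.6          # an edge point of the ideal image

# Without pre-correction the optics displace the point noticeably:
xu, yu = distort(x0, y0, k_optics)
# With pincushion pre-distortion applied first, the residual shrinks:
xp, yp = predistort(x0, y0, k_optics)
xv, yv = distort(xp, yp, k_optics)
assert abs(xv - x0) < abs(xu - x0) and abs(yv - y0) < abs(yu - y0)
```

The residual is not exactly zero because the compensation is first-order; this mirrors the patent's point that the virtual image only needs to be ideal within the tolerance of the human eye.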
In another possible implementation, the array distribution of the pixels in the first region corresponds to the first distortion parameter. Specifically, a row of pixels in the first region is distributed along a curve. If the optical distortion produced by the optical imaging unit 420 is barrel distortion, the pixels near the two ends of the curve are farther from the second region than the pixels near the midpoint of the curve; if the optical distortion produced by the optical imaging unit 420 is pincushion distortion, the pixels near the midpoint of the curve are farther from the second region than the pixels near the two ends of the curve.
Of course, in some embodiments, the size and array distribution of pixels within the first region may both correspond to the first distortion parameter.
Fig. 10 illustrates one possible light modulator. As shown in fig. 10, the light modulator includes a first region and a second region, the first region surrounding the periphery of the second region. The pixels in the first region and in the second region are distributed in arrays, and the field angle corresponding to each pixel position in the first region is greater than the preset field angle. The size and array distribution of the pixels in the first region are designed according to the second distortion parameter, that is, matched to the second distortion parameter.
As shown in fig. 10, the width of the light modulator in the column direction of the pixels gradually increases from the middle toward the two ends of the row direction, so that the shape of the real image is pincushion-like. Meanwhile, the size of the pixels in the first region is smaller than that of the pixels in the second region, so that the predistortion carried by the real image is pincushion distortion. That is, relative to the image to be displayed, the light modulator shown in fig. 10 generates a real image having pincushion distortion.
It can also be seen from fig. 10 that a row of pixels in the first region is distributed along a curve, the pixels near the two ends of the curve being farther from the second region than the pixels near the midpoint of the curve. For example, in fig. 10, in the height direction of the light modulator, the pixels of the first region on the left side of the second region are arrayed along the extending direction of the curve S1, and those on the right side of the second region are arrayed along the curve S2; in the width direction of the light modulator, the pixels of the first region on the upper side of the second region are arrayed along the extending direction of the curve S3, and those on the lower side of the second region are arrayed along the curve S4. The pixels in the second region, by contrast, are arrayed along the height direction and the width direction of the light modulator.
Alternatively, the array distribution of the pixels in the first region in fig. 10 may be described as follows: a row of pixels in the first region gets gradually farther, from the middle of the row toward its two ends, from a centre line of the light modulator parallel to the row direction. The centre line parallel to the row direction is a straight line that is parallel to the row direction and passes through the geometric center of the light modulator. Further, within one row of pixels in the first region, the width of the pixels gradually decreases from the middle of the row toward its two ends. Designing the pixel size and array distribution in the first region in this way makes the pincushion distortion that the light modulator imposes on the real image closer to optical pincushion distortion, and therefore better matched to the barrel distortion caused by the optical imaging unit; after the two cancel, the virtual image formed is closer to ideal. Optical pincushion distortion here is understood as pincushion distortion caused by factors such as assembly error and manufacturing precision of the optical lenses, and its origin differs from that of the pincushion distortion produced by the light modulator of the present application.
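The layout rule just described, a first-region row that bows away from the centre line while its pixel width shrinks toward the row ends, can be sketched as follows (the parameterization and values are hypothetical, chosen only for illustration; the patent does not publish the actual design data):

```python
def row_layout(n, base_width, bow, shrink):
    """Return (x_center, y_offset, width) for each of n pixels in one
    first-region row of the Fig. 10 (compression) design.
    bow    - extra distance from the centre line at the row ends
    shrink - fractional width reduction at the row ends
    """
    pixels = []
    x = 0.0
    for i in range(n):
        t = abs(2.0 * i / (n - 1) - 1.0)   # 0 at the middle, 1 at the ends
        width = base_width * (1.0 - shrink * t)
        y = bow * t * t                    # parabolic bow, like curves S3/S4
        pixels.append((x + width / 2.0, y, width))
        x += width
    return pixels

row = row_layout(9, 1.0, 0.5, 0.3)
mid, end = row[4], row[0]
# middle pixel sits on the centre line and is the widest:
assert mid[1] < end[1] and mid[2] > end[2]
```

Swapping the signs of `bow` and `shrink` would give the opposite (barrel) behaviour of fig. 11.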
Fig. 11 illustrates another possible light modulator. Unlike the light modulator shown in fig. 10, the width of this light modulator in the column direction of the pixels gradually decreases from the middle toward the two ends of the row direction, so that the shape of the real image is barrel-like. Meanwhile, the size of the pixels in the first region is larger than that of the pixels in the second region, so that the predistortion carried by the real image is barrel distortion. That is, relative to the image to be displayed, the light modulator shown in fig. 11 generates a real image having barrel distortion.
It can also be seen from fig. 11 that a row of pixels in the first region is distributed along a curve, the pixels near the midpoint of the curve being farther from the second region than the pixels near the two ends of the curve. For example, in fig. 11, unlike the light modulator shown in fig. 10, in the height direction of the light modulator, the pixels of the first region on the left side of the second region are arrayed along the extending direction of the curve S5, and those on the right side of the second region are arrayed along the curve S6; in the width direction of the light modulator, the pixels of the first region on the upper side of the second region are arrayed along the extending direction of the curve S7, and those on the lower side of the second region are arrayed along the curve S8.
Alternatively, the array distribution of the pixels in the first region in fig. 11 may be described as follows: a row of pixels in the first region gets gradually farther, from the middle of the row toward its two ends, from a centre line of the light modulator parallel to the row direction. Further, within one row of pixels in the first region, the width of the pixels gradually increases from the middle of the row toward its two ends. Designing the pixel size and array distribution in the first region in this way makes the barrel distortion that the light modulator imposes on the real image closer to optical barrel distortion, and therefore better matched to the pincushion distortion caused by the optical imaging unit; after the two cancel, the virtual image formed is closer to ideal.
It should be noted that the light modulators shown in fig. 10 and fig. 11 apply, respectively, a compression design and an expansion design to the pixels in the first region at a given resolution (e.g., 1400×1050) and pixel panel size. In other words, with the resolution fixed, the compression design makes the size of the pixels in the first region smaller than the size of the pixels in the second region, while the expansion design makes the size of the pixels in the first region larger than the size of the pixels in the second region. Taking the light modulator shown in fig. 10 as an example, assuming that the second region has a size of 800×600, the second region includes 800×600 pixels, the area outside the second region includes (1400×1050)−(800×600) pixels, part or all of which are located in the first region, and the size of the pixels in the first region is smaller than the size of the pixels in the second region.
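The pixel counts in this example work out as follows:

```python
# Pixel-count bookkeeping for the example above: a 1400x1050 modulator
# with an 800x600 second region leaves the remaining pixels for the
# first region (and, in some embodiments, an edge region).
total = 1400 * 1050          # 1,470,000 pixels at the given resolution
second = 800 * 600           # 480,000 pixels in the second region
remaining = total - second
assert remaining == 990_000  # (1400x1050) - (800x600)
```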
It should be understood that, in other embodiments of the present application, the light modulator may also be divided into the first region, the second region, and the edge region described above. The first region is farther from the center of the light modulator than the second region, and the edge region is farther from the center than the first region. The size of the pixels in the edge region is smaller than the size of the pixels in the first region. In the embodiments where the size of the pixels in the first region is larger than that of the pixels in the second region, the size of the pixels in the edge region may be either smaller or larger than the size of the pixels in the second region.
The light modulators mentioned in the above embodiments include, but are not limited to, LCD displays, OLED displays, liquid crystal on silicon (liquid crystal on silicon, LCOS) displays, and digital light processing (digital light processing, DLP) displays. It is easy to understand that, because different types of light modulators have different optical display principles and different structures, their manufacturing processes also differ; accordingly, when the display device 400 of the present application adopts different types of light modulators, different manufacturing processes are needed to produce a light modulator whose pixel size and/or distribution is matched to the second distortion parameter. For this purpose, a person skilled in the art can manufacture the light modulator provided in the embodiments of the present application using manufacturing processes known in the art without creative effort, and those processes are not described again herein.
On the basis of any of the above embodiments, as shown in fig. 12, the display device 400 may further include a light diffusing element 430, which may also be referred to as a diffusion screen. The light diffusing element 430 is located at the light-emitting side of the image generation unit 410 and is configured to diffuse the first light beam P1 emitted by the image generation unit 410, so as to improve the uniformity of the displayed image.
It is easily understood that the light diffusing element, being itself an optical element, also introduces optical distortion into the image when its manufacturing precision is limited and/or it includes a curved-surface structure. For such a light diffusing element, when the first distortion parameter corresponding to the optical imaging unit 420 is measured, the light diffusing element needs to be treated as part of the optical imaging unit 420, so as to ensure the accuracy of the first distortion parameter, that is, to ensure that the optical distortion described by the first distortion parameter includes the optical distortion caused by the light diffusing element. In this way, a light modulator designed according to the first distortion parameter can pre-correct the optical distortion.
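The calibration point can be made concrete: the first distortion parameter should characterize the cascade of diffuser and imaging optics, not the optics alone. A minimal sketch, reusing a single-coefficient radial model (an assumed form, not taken from the patent):

```python
def distort(x, y, k):
    """Single-coefficient radial distortion of a normalized point."""
    r2 = x * x + y * y
    return x * (1.0 + k * r2), y * (1.0 + k * r2)

def through_diffuser_and_optics(x, y, k_diffuser, k_optics):
    """Light passes the diffuser first, then the imaging optics."""
    return distort(*distort(x, y, k_diffuser), k_optics)

x0, y0 = 0.8, 0.6
full = through_diffuser_and_optics(x0, y0, -0.02, -0.1)
optics_only = distort(x0, y0, -0.1)
# Measuring the optics alone misses the diffuser's contribution, so a
# modulator designed from that measurement would under-correct:
assert full != optics_only
```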
Referring to fig. 13, in some embodiments, the optical imaging unit 420 includes a folded light path element 421 and a virtual image imaging lens group 422. The folded light path element 421 is configured to receive the first light beam P1 emitted by the image generation unit 410 and transmit the first light beam P1 to the virtual image imaging lens group 422; the virtual image imaging lens group 422 is configured to reflect the first light beam to form a second light beam; and the folded light path element 421 is further configured to transmit the second light beam, so that the second light beam enters both eyes of the user and the user sees a virtual image. Such a structural design of the optical imaging unit 420 improves the integration level of the display device 400 and is advantageous for its miniaturization.
Referring to fig. 14, in designs where the display device 400 further includes the light diffusing element 430, the light diffusing element 430 is located on the optical path between the image generation unit 410 and the folded light path element 421, and the first light beam P1 is diffused by the light diffusing element 430 before reaching the folded light path element 421.
In some embodiments, the virtual image imaging lens group 422 includes an optical free-form surface, which may be provided by an optical free-form surface mirror. In addition, a 1/4 wave plate may be further provided on the light-emitting side of the light modulator in the image generation unit 410. In designs where the light beam emitted by the light modulator is linearly polarized, the 1/4 wave plate on the light-emitting side of the light modulator converts that beam into circularly polarized light, i.e., it makes the first light beam circularly polarized.
Specific implementations of the folded light path element 421 are described below, taking as an example the case where the virtual image imaging lens group 422 includes an optical free-form surface mirror and the first light beam is circularly polarized light.
In one possible implementation, as shown in fig. 15, the folded light path element 421 includes a polarization converter 421a and a first film layer 421b, the first film layer 421b and the polarization converter 421a being located in sequence on a side of the substrate away from the optical free-form surface mirror 422a. To facilitate explanation of the function of the folded light path element 421, the first film layer 421b is drawn separately from the polarization converter 421a in fig. 15. It should be understood that, in practical applications, the first film layer 421b and the polarization converter 421a are sequentially attached to the side of the substrate away from the optical free-form surface mirror 422a. The substrate carries the first film layer 421b and the polarization converter 421a without affecting the light beam; for example, the substrate may be a transparent glass sheet.
The polarization converter 421a is configured to receive the first light beam P1 (circle) and convert it into linearly polarized light with a first polarization state. For example, in fig. 15, the first light beam P1 (circle) is converted into s-polarized light, i.e., P1-s.
The first film layer 421b is configured to reflect light of the first polarization state. When P1-s reaches the first film layer 421b, it is reflected back to the polarization converter 421a and, under the action of the polarization converter 421a, converted into circularly polarized light, i.e., P1 (circle), which is transmitted to the optical free-form surface mirror 422a.
The optical free-form surface mirror 422a reflects P1 (circle), and the reflected light is the second light beam.
The polarization converter 421a is further configured to receive the light reflected by the free-form surface mirror 422a, i.e., P1 (circle), and convert it into polarized light of a second polarization state, i.e., P1-p, which then reaches the first film layer 421b.
The first film layer 421b is also configured to transmit polarized light of the second polarization state. Thus, after reaching the first film layer 421b, P1-p is transmitted through it and continues to propagate, finally entering both eyes of the user, so that the user sees an enlarged virtual image.
In the implementation shown in fig. 15, the polarization converter 421a may be a 1/4 wave plate, and the first film layer 421b may be a reflective-transmissive polarizing film.
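The beam path of fig. 15 can be traced as a simple state machine over polarization labels (a toy model for illustration, not a full Jones-calculus treatment; the handedness conventions are assumptions): the quarter-wave plate converts circular light to linear and back, reflection at the free-form surface mirror flips the circular handedness, and the reflective-transmissive polarizing film reflects s-light while transmitting p-light.

```python
def quarter_wave(state):
    """421a: circular handedness <-> orthogonal linear states."""
    return {"RCP": "s", "s": "RCP", "LCP": "p", "p": "LCP"}[state]

def freeform_mirror(state):
    """422a: reflection flips the circular handedness."""
    return {"RCP": "LCP", "LCP": "RCP"}[state]

beam = "RCP"                  # first beam P1, circularly polarized
beam = quarter_wave(beam)     # -> "s", reflected by the first film 421b
beam = quarter_wave(beam)     # back through 421a -> "RCP", toward 422a
beam = freeform_mirror(beam)  # handedness flipped -> "LCP"
beam = quarter_wave(beam)     # -> "p", transmitted by 421b to the user
assert beam == "p"
```

The trace shows why the same film can first reflect and later transmit the very same beam: the round trip through the wave plate and mirror rotates the polarization by 90 degrees.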
In the implementation shown in fig. 15, based on the first light beam being circularly polarized, the folded light path element can, on the one hand, reflect the first light beam to the virtual image imaging lens group, which processes it (for example, magnifies it), and, on the other hand, transmit the light beam emitted by the virtual image imaging lens group so that it can enter the eyes of the user. The effect of folding the light path is thereby achieved, which improves the integration level of the display device and facilitates its miniaturization. Meanwhile, this propagation process causes no loss of the light beam, which ensures the imaging quality.

In another possible implementation, as shown in fig. 16, the folded light path element 421 includes a second film layer 421c and a third film layer 421d, the third film layer 421d and the second film layer 421c being located in sequence on a side of the substrate away from the optical free-form surface mirror 422a. To facilitate explanation of the function of the folded light path element 421, the second film layer 421c and the third film layer 421d are drawn separately in fig. 16. It should be understood that, in practical applications, the third film layer 421d and the second film layer 421c may be sequentially attached to the surface of the substrate.
The second film layer 421c is configured to reflect a portion of the first light beam P1, for example a portion denoted P11, so that P11 is directed to the optical free-form surface mirror 422a. The second film layer 421c meanwhile transmits the remaining light (not shown) of the first light beam P1.
The optical free-form surface mirror 422a is configured to reflect the light P11; after the reflected light is transmitted to the second film layer 421c, a portion P111 of it is transmitted through the second film layer 421c toward the third film layer 421d, while another portion (not shown) is reflected by the second film layer 421c.
The third film layer 421d is configured to transmit P111. The transmitted light is P2, which continues to propagate and finally enters the eyes of the user, so that the user sees an enlarged virtual image.
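For comparison with the loss-free polarization-based path of fig. 15, the theoretical throughput of this transflective path can be estimated, assuming an ideal lossless 50/50 second film (an assumption; the patent does not state the split ratio):

```python
# The useful beam interacts with the second film 421c twice: it is
# reflected once (P1 -> P11) and transmitted once (mirror return -> P111).
reflectance = 0.5
transmittance = 0.5
throughput = reflectance * transmittance   # upper bound before 421d
assert throughput == 0.25                  # at most 25% of P1 survives
```

This is the classic trade-off of a transflective fold: structural simplicity at the cost of light efficiency.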
In the implementation shown in fig. 16, the second film layer 421c may be a transflective (half-transmissive, half-reflective) film, and the third film layer 421d may be a transmissive-absorptive polarizing film. In this implementation, the folded light path element is simple in structure and easy to implement: on the one hand, it can reflect part of the first light beam to the virtual image imaging lens group, which processes that part of the beam; on the other hand, it can transmit part of the light beam emitted by the virtual image imaging lens group, so that the beam can enter the eyes of the user and the user sees the enlarged virtual image. The effect of folding the light path is thereby achieved, which improves the integration level of the display device and facilitates its miniaturization.

An embodiment of the present application further provides a display device, which may be a projection display device used in settings such as homes, classrooms, meeting rooms, auditoriums, cinemas, courts, and squares, or may be a vehicle-mounted display screen or a display screen integrated in a smart household appliance; the display device may also be integrated in a network television, a smart television, or an Internet Protocol television (IPTV).
Fig. 17 is a schematic diagram of a display device according to an embodiment of the present application. As shown in fig. 17, the circuits in the display device mainly include a processor 1701, an internal memory 1702, an external memory interface 1703, an audio module 1704, a video module 1705, a power supply module 1706, a wireless communication module 1707, an I/O interface 1708, a video interface 1709, a controller area network (Controller Area Network, CAN) transceiver 1710, a display circuit 1711, any of the above display apparatuses 400, and the like. The processor 1701 and its peripheral components, such as the internal memory 1702, the CAN transceiver 1710, the audio module 1704, the video module 1705, the power supply module 1706, the wireless communication module 1707, the I/O interface 1708, the video interface 1709, and the display circuit 1711, may be connected by a bus.
The processor 1701 may be referred to as a front-end processor. The processor 1701 includes one or more processing units, such as: the processor 1701 may include an application processor (application processor, AP), a modem processor, a graphics processor (graphics processing unit, GPU), an image signal processor (image signal processor, ISP), a controller, a video codec, a digital signal processor (digital signal processor, DSP), a baseband processor, and/or a neural network processor (neural-network processing unit, NPU), etc. Wherein the different processing units may be separate devices or may be integrated in one or more processors.
A memory may also be provided in the processor 1701 for storing instructions and data, such as the operating system of the display device and an AR Creator software package. In some embodiments, the memory in the processor 1701 is a cache. It may hold instructions or data that the processor 1701 has just used or uses cyclically; if the processor 1701 needs the instruction or data again, it can be fetched directly from this memory. This avoids repeated accesses and reduces the waiting time of the processor 1701, thereby improving the efficiency of the system.
In addition, if the display device in the present embodiment is mounted on a vehicle, the function of the processor 1701 may be implemented by a domain controller on the vehicle.
In some embodiments, the display device may also include a plurality of input/output (I/O) interfaces 1708 connected to the processor 1701. The I/O interface 1708 may include, but is not limited to, an inter-integrated circuit (inter-integrated circuit, I2C) interface, an inter-integrated circuit sound (inter-integrated circuit sound, I2S) interface, a pulse code modulation (pulse code modulation, PCM) interface, a universal asynchronous receiver/transmitter (universal asynchronous receiver/transmitter, UART) interface, a mobile industry processor interface (mobile industry processor interface, MIPI), a general-purpose input/output (general-purpose input/output, GPIO) interface, a subscriber identity module (subscriber identity module, SIM) interface, and/or a universal serial bus (universal serial bus, USB) interface, among others. The I/O interface 1708 may be connected to a mouse, a touch screen, a keyboard, a camera, a speaker, or a microphone, or to physical keys on the display device (e.g., a volume key, a brightness adjustment key, and a power key).
The internal memory 1702 may be used to store computer-executable program code, which includes instructions. The internal memory 1702 may include a program storage area and a data storage area. The program storage area may store the operating system and the application programs required for at least one function (such as a call function, a time setting function, and an AR function). The data storage area may store data created during use of the display device (e.g., a phone book and world time). In addition, the internal memory 1702 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or universal flash storage (universal flash storage, UFS). The processor 1701 performs the various functional applications and data processing of the display device by executing the instructions stored in the internal memory 1702 and/or the instructions stored in the memory provided in the processor 1701.
The external memory interface 1703 may be used to connect an external memory (e.g., a Micro SD card). The external memory may store data or program instructions as needed, and the processor 1701 may read or write such data, or execute such programs, through the external memory interface 1703.
The audio module 1704 is used to convert digital audio information to an analog audio signal output and also to convert an analog audio input to a digital audio signal. The audio module 1704 may also be used to encode and decode audio signals, such as for playback or recording. In some embodiments, the audio module 1704 may be disposed in the processor 1701 or some functional modules of the audio module 1704 may be disposed in the processor 1701. The display device may implement audio functions through the audio module 1704, an application processor, and the like.
The video interface 1709 may receive externally input audio and video, which may specifically be a high-definition multimedia interface (high definition multimedia interface, HDMI), a digital video interface (digital visual interface, DVI), a video graphics array (video graphics array, VGA), a Display Port (DP), a low voltage differential signal (low voltage differential signaling, LVDS) interface, etc., and the video interface 1709 may also output video to the outside. For example, the display device receives video data transmitted from the navigation system or video data transmitted from the domain controller through the video interface.
The video module 1705 may decode video input by the video interface 1709, such as h.264 decoding. The video module can also encode the video collected by the display device, for example, H.264 encoding is carried out on the video collected by the external camera. The processor 1701 may decode the video input from the video interface 1709, and output the decoded image signal to the display circuit 1711.
Further, the display device further includes a CAN transceiver 1710, and the CAN transceiver 1710 may be connected to a CAN BUS (CAN BUS) of an automobile. Through the CAN bus, the display device CAN communicate with in-vehicle entertainment systems (music, radio, video modules), vehicle status systems, etc. For example, the user may turn on the in-vehicle music play function by operating the display device. The vehicle status system may send vehicle status information (doors, seat belts, etc.) to a display device for display.
The display circuit 1711 and the display device 400 realize a function of displaying an image together. The display circuit 1711 receives an image signal output from the processor 1701, processes the image signal, and then inputs the processed image signal to the image generating unit 410 of the display device 400 to generate a real image. The display circuit 1711 can also control an image displayed by the image generation unit 410. For example, parameters such as display brightness or contrast are controlled. The display circuit 1711 may include a driver circuit, an image control circuit, and the like.
In this embodiment, the video interface 1709 may receive input video data (or referred to as a video source), the video module 1705 decodes and/or digitizes the input video data, and outputs an image signal to the display circuit 1711, and the display circuit 1711 drives the image generating unit 410 to image a light beam emitted from the light source according to the input image signal, so as to generate a visual image (emit imaging light).
The power module 1706 is configured to provide power to the processor 1701 and projection device based on input power (e.g., dc power), and may include a rechargeable battery. In addition, the power module 1706 may be coupled to a power module (e.g., a power battery) of the vehicle that provides power to the power module 1706 of the display device.
The wireless communication module 1707 may enable the display device to communicate wirelessly with the outside world, providing solutions for wireless communication such as wireless local area network (wireless local area networks, WLAN), for example a wireless fidelity (wireless fidelity, Wi-Fi) network, Bluetooth (BT), global navigation satellite system (global navigation satellite system, GNSS), frequency modulation (frequency modulation, FM), near field communication (near field communication, NFC), and infrared (IR) technologies. The wireless communication module 1707 may be one or more devices integrating at least one communication processing module. The wireless communication module 1707 receives electromagnetic waves via an antenna, performs frequency modulation and filtering on the electromagnetic wave signals, and sends the processed signals to the processor 1701. The wireless communication module 1707 may also receive a signal to be transmitted from the processor 1701, perform frequency modulation and amplification on it, and convert it into electromagnetic waves for radiation via the antenna.
In addition, the video data decoded by the video module 1705 may, apart from being input via the video interface 1709, be received wirelessly through the wireless communication module 1707 or read from the internal memory 1702 or the external memory. For example, the display device may receive video data from a terminal device or an in-vehicle entertainment system via a wireless LAN in the vehicle, and may also read the audio/video data stored in the internal memory 1702 or the external memory.
In addition, the circuit diagrams illustrated in the embodiments of the present application do not constitute a specific limitation on the display device. In other embodiments of the present application, the display device may include more or less components than illustrated, or certain components may be combined, or certain components may be split, or different arrangements of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The above display device may further provide a broadcast-receiving television function in addition to the above functions. For example, the display device may be integrated in a network television, a smart television, or an Internet Protocol television (IPTV).
An embodiment of the present application further provides a table body structure. The table body structure has a table top provided with a display window, and any one of the above display devices or display apparatuses 400 is arranged inside the table body structure. The second light beam emitted by the display apparatus 400 can reach the display window, and the display window reflects the second light beam so that it enters the eyes of the user, allowing the user to see an enlarged virtual image.
An embodiment of the present application further provides a vehicle, which includes a reflective element and any one of the above display devices or display apparatuses. The reflective element is configured to reflect the second light beam from the display apparatus so that the second light beam can be received by both eyes of an observer, enabling the observer to see an enlarged virtual image.
In a possible implementation, the display apparatus described above is integrated in a head-up display (HUD) of the vehicle. The reflective element included in the vehicle may specifically be its front windshield.
For example, fig. 18 is a schematic diagram of the display apparatus of the present application applied to a head-up display (HUD) of a vehicle. In this application scenario, the reflective element used with the display apparatus (or the display device described above) includes the front windshield of the vehicle. As shown in fig. 18, the second light beam corresponding to the virtual image is reflected by the front windshield into the eyes of the driver, so that an enlarged virtual image of the image M is formed on one side of the front windshield. By way of example, the vehicle may be a car, truck, motorcycle, bus, boat, airplane, helicopter, lawn mower, recreational vehicle, amusement park vehicle, construction equipment, electric car, golf cart, train, or trolley, and the embodiments of the present application are not particularly limited in this respect.
The content of the above image M includes, but is not limited to, map assistance information, indication information of external objects, vehicle state information, entertainment information, and the like. The map assistance information is used for driving assistance and includes, for example but not limited to, directional arrows, distances, travel time, and the like. The indication information of external objects includes, but is not limited to, safe following distance, surrounding obstacles, reversing camera images, and the like. Taking an automobile as an example, the vehicle state information is generally the information displayed on the vehicle's instrument cluster, also referred to as meter information, including, but not limited to, travel speed, mileage, fuel level, water temperature, lamp status, and the like.
In other embodiments, the display device (or the display equipment described above) may be mounted on the backrest of a vehicle seat. Taking a small passenger car as an example, the display device may be mounted on the back of a front seat. In this way, the light beam emitted by the display device enters the eyes of a rear passenger, so that the rear passenger can view an enlarged virtual image of the image to be displayed.
Fig. 19 is a schematic view of one possible functional framework of a vehicle provided in an embodiment of the present application.
As shown in fig. 19, the functional framework of the vehicle may include various subsystems, such as a control system 1901, a sensor system 1902, one or more peripheral devices 1903 (one is shown in the figure), a power supply 1904, a computer system 1905, and a display system 1906. Optionally, the vehicle may also include other functional systems, such as an engine system that powers the vehicle, which is not limited herein.
The sensor system 1902 may include a plurality of sensing devices that sense measured information and convert it, according to a certain rule, into an electrical signal or another desired form of information output. As illustrated, these sensing devices may include, without limitation, a global positioning system (GPS), a vehicle speed sensor, an inertial measurement unit (IMU), a radar unit, a laser rangefinder, an imaging device, a wheel speed sensor, a steering sensor, a gear sensor, or other elements for automatic detection.
The control system 1901 may include several elements, such as a steering unit, a braking unit, a lighting system, an autopilot system, a map navigation system, a network time synchronization system, and an obstacle avoidance system, as shown. Optionally, the control system 1901 may also include elements such as a throttle controller and an engine controller for controlling the vehicle travel speed, which are not limited in this application.
The peripheral device 1903 may include several elements, such as a communication system, a touch screen, a user interface, a microphone, and a speaker, as shown. The communication system is used to implement network communication between the vehicle and devices other than the vehicle. In practical applications, the communication system may employ wireless or wired communication technology to enable such network communication. Wired communication may refer to communication between the vehicle and other devices through a network cable, an optical fiber, or the like.
The power supply 1904 represents a system that provides power or energy to the vehicle, which may include, but is not limited to, a rechargeable lithium battery or lead acid battery, or the like. In practical applications, one or more battery packs in the power supply are used to provide electrical energy or power for vehicle start-up, and the type and materials of the power supply are not limited in this application.
Several functions of the vehicle are controlled by the computer system 1905. The computer system 1905 may include one or more processors (one processor is shown in the figure) and a memory (which may also be referred to as storage). In practical applications, the memory may be internal to the computer system 1905 or external to it, for example serving as a cache in the vehicle; this is not limited in this application.
The processor may include one or more general-purpose processors, for example a graphics processing unit (GPU). The processor may be configured to execute a program, or instructions corresponding to the program, stored in the memory to perform the corresponding functions of the vehicle.
The memory may include volatile memory, for example random access memory (RAM); the memory may also include non-volatile memory, such as read-only memory (ROM), flash memory, or a solid state drive (SSD); the memory may also comprise a combination of the above types of memory. The memory may be used to store a set of program code, or instructions corresponding to the program code, so that the processor can invoke them to perform the corresponding functions of the vehicle. In this application, the memory may store a set of program code for controlling the vehicle, and the processor may call this code to control the safe driving of the vehicle.
Alternatively, the memory may store information such as road maps, driving routes, sensor data, and the like, in addition to program codes or instructions. The computer system 1905 may implement the functions associated with the vehicle in conjunction with other elements in the vehicle's functional framework schematic, such as sensors in the sensor system, GPS, etc. For example, the computer system 1905 may control the direction of travel or speed of travel of the vehicle, etc., based on data input from the sensor system, without limitation.
The display system 1906 may include several elements, such as a windshield, a controller, and the display device 400 described previously. The controller is configured to generate an image according to a user instruction (for example, an image containing vehicle states such as speed and battery/fuel level, or an image of augmented reality (AR) content) and send the image content to the display device 400. The display device 400 projects the light bearing the image content onto the windshield, and the windshield reflects this light so that a virtual image corresponding to the image content is presented in front of the driver. It should be noted that the functions of some elements in the display system 1906 may be implemented by other subsystems of the vehicle; for example, the controller may be an element of the control system.
Fig. 19 shows four subsystems; the sensor system, the control system, the computer system, and the display system are merely examples and are not limiting. In practical applications, the vehicle may combine several of its elements according to different functions, thereby obtaining subsystems with correspondingly different functions. In practice, the vehicle may include more or fewer systems or elements, which is not limited in this application.

Claims (18)

1. A display device comprising an image generation unit and an optical imaging unit;
the image generation unit is used for generating a real image according to an image signal input by the processor, wherein the real image has predistortion relative to an image corresponding to the image signal;
the optical imaging unit is used for imaging the real image to form a virtual image corresponding to the real image, wherein the predistortion is compensated by the optical distortion introduced in forming the virtual image.
2. The display device according to claim 1, wherein the optical imaging unit corresponds to a distortion parameter, which is a parameter characterizing the optical distortion;
the image generation unit includes:
a light modulator for modulating the received optical signal according to the image signal to generate the real image; the size and/or the distribution mode of the pixels on the light modulator corresponds to the distortion parameter.
3. The display device of claim 2, wherein the light modulator comprises a first region and a second region, the pixels in the first region and the pixels in the second region each being distributed in an array; the size of the pixels in the first area is different from the size of the pixels in the second area, and/or the array distribution mode of the pixels in the first area is different from the array distribution mode of the pixels in the second area.
4. The display device according to claim 3, wherein the first region is farther from the center of the light modulator than the second region; the size and the array distribution mode of the pixels in the first region correspond to the distortion parameter.
5. The display device of claim 4, wherein the optical distortion is barrel distortion; the size of the pixels in the first region is smaller than the size of the pixels in the second region so that the predistortion is a pincushion distortion.
6. The display device according to claim 4 or 5, wherein a row of pixels in the first region is distributed along a curve, and pixels near both ends of the curve are farther from the second region than pixels near the middle of the curve.
7. The display device of claim 4, wherein the optical distortion is a pincushion distortion; the size of the pixels in the first region is larger than the size of the pixels in the second region so that the predistortion is barrel distortion.
8. The display device according to claim 4 or 7, wherein a row of pixels in the first region is distributed along a curve, and pixels near the middle of the curve are farther from the second region than pixels near both ends of the curve.
9. The display device according to any one of claims 4 to 6, wherein the angle of view corresponding to the pixels in the first region is greater than or equal to a preset angle of view.
10. The display device according to any one of claims 4 to 9, wherein the light modulator further comprises an edge region farther from the center than the first region, and the size of the pixels in the edge region is smaller than the size of the pixels in the first region.
11. The display device according to any one of claims 1 to 10, wherein the display device further comprises:
and the light diffusion element is positioned at the light emitting side of the image generation unit and used for diffusing the first light beam corresponding to the real image.
12. A display device according to any one of claims 1 to 11, wherein the optical imaging unit comprises a folded light path element and a virtual image imaging lens group;
the folding light path element is used for receiving a first light beam corresponding to the real image and transmitting the first light beam to the virtual image imaging lens group;
the virtual image imaging lens group is used for reflecting the first light beam, and the reflected light reaches the folding light path element;
the folding light path element is also used for transmitting the light reflected by the virtual image imaging lens group, and the transmitted second light beam is used for forming the virtual image.
13. The display device according to claim 12, wherein the first light beam is circularly polarized light, and the folded light path element comprises a polarization converter and a first film layer, the first film layer being located on a side of the polarization converter away from the virtual image imaging lens group;
the polarization converter is used for receiving the first light beam and converting the first light beam into linearly polarized light with a first polarization state;
the first film layer is used for reflecting the first light beam with the first polarization state from the polarization converter to the polarization converter;
the polarization converter is further used for converting the first light beam from the first film layer into circularly polarized light and emitting the circularly polarized light to the virtual image imaging lens group;
the polarization converter is further used for converting the light beam reflected by the virtual image imaging lens group into linearly polarized light with a second polarization state;
the first film layer is also configured to transmit linearly polarized light of a second polarization state from the polarization converter.
14. The display device according to claim 12 or 13, wherein the virtual image imaging lens group comprises an optical freeform mirror.
15. The display device according to claim 13 or 14, wherein the polarization converter is a quarter-wave plate.
16. Display equipment, comprising a processor and the display device according to any one of claims 1 to 15, wherein the processor is configured to transmit the image signal to the image generation unit.
17. A vehicle, comprising the display equipment of claim 16 mounted on the vehicle.
18. The vehicle of claim 17, wherein the vehicle has a seat and the display equipment is mounted on the back of the seat.
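The predistortion scheme of claim 1 (the image generation unit renders a pre-distorted real image so that the optical imaging unit's own distortion cancels it) can be illustrated numerically. The sketch below is illustrative only: it assumes a single-coefficient radial distortion model and a made-up coefficient `k`; the patent does not specify any particular distortion model, and the names `barrel` and `predistort` are this example's own.

```python
def barrel(r, k):
    # Single-coefficient radial model: r_d = r * (1 - k * r**2).
    # With k > 0 this pulls points inward, i.e. barrel distortion.
    return r * (1 - k * r ** 2)

def predistort(r, k, iters=20):
    # Invert the barrel model by fixed-point iteration: find r_p such that
    # barrel(r_p, k) == r. The pre-distorted grid bulges outward, i.e. it
    # has pincushion distortion, matching claim 5.
    r_p = r
    for _ in range(iters):
        r_p = r / (1 - k * r_p ** 2)
    return r_p

k = 0.05  # illustrative coefficient, not from the patent
for r in (0.2, 0.6, 1.0):
    r_p = predistort(r, k)
    assert r_p > r                          # pincushion: points pushed outward
    assert abs(barrel(r_p, k) - r) < 1e-9   # the optics' barrel distortion cancels it
```

The complementary case of claim 7 follows by flipping the sign of `k`: a pincushion-distorting optic is compensated by a barrel-shaped pre-distorted image.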
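The folded light path of claim 13 is the familiar polarization-folding arrangement: a quarter-wave plate converts the circularly polarized beam to linear, and a double pass through the same plate acts as a half-wave plate, rotating the polarization by 90 degrees so that the first film layer transmits light it previously reflected. A minimal Jones-calculus check of these two facts, assuming an ideal quarter-wave plate with its fast axis at 45 degrees (the angle is this example's assumption, not stated in the claims):

```python
import numpy as np

# Jones matrix of an ideal quarter-wave plate, fast axis at 45 degrees
# (global phase factor dropped).
QWP45 = 0.5 * np.array([[1 + 1j, 1 - 1j],
                        [1 - 1j, 1 + 1j]])

H = np.array([1.0, 0.0], dtype=complex)       # horizontal linear polarization
RCP = np.array([1.0, -1.0j]) / np.sqrt(2.0)   # right circularly polarized light

# One pass: circular light becomes linearly polarized (here: vertical),
# analogous to converting the first beam to the first polarization state.
lin = QWP45 @ RCP
assert np.isclose(abs(lin[0]), 0.0) and np.isclose(abs(lin[1]), 1.0)

# Two passes (out to the mirror and back) act as a half-wave plate:
# horizontal becomes vertical.
V = QWP45 @ (QWP45 @ H)
assert np.allclose(V, [0.0, 1.0])
```

The 90-degree rotation after the double pass is what lets the first film layer reflect the beam on its first encounter and transmit it on the second, folding the optical path into a compact volume.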
CN202211073982.8A 2022-09-02 2022-09-02 Display device, display equipment and vehicle Pending CN117687210A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202211073982.8A CN117687210A (en) 2022-09-02 2022-09-02 Display device, display equipment and vehicle
PCT/CN2023/095982 WO2024045704A1 (en) 2022-09-02 2023-05-24 Display apparatus, display device, and vehicle


Publications (1)

Publication Number Publication Date
CN117687210A (en) 2024-03-12

Family

ID=90100324


Country Status (2)

Country Link
CN (1) CN117687210A (en)
WO (1) WO2024045704A1 (en)

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6726742B2 (en) * 2016-07-07 2020-07-22 マクセル株式会社 Head up display device
CN106094200B (en) * 2016-08-24 2018-09-14 宁波视睿迪光电有限公司 A kind of dot structure, display panel and display device
CN106920475B (en) * 2017-04-25 2020-03-06 京东方科技集团股份有限公司 Display panel, display device and driving method of display panel
CN210488131U (en) * 2019-10-10 2020-05-08 浙江水晶光电科技股份有限公司 Optical module and intelligent glasses
JP7095018B2 (en) * 2020-05-22 2022-07-04 マクセル株式会社 Head-up display device
CN111929906B (en) * 2020-09-25 2021-01-22 歌尔光学科技有限公司 Image display structure and head-mounted display device

Also Published As

Publication number Publication date
WO2024045704A1 (en) 2024-03-07


Legal Events

Date Code Title Description
PB01 Publication