WO2022061584A1 - Image display method and image display device - Google Patents

Image display method and image display device

Info

Publication number
WO2022061584A1
Authority
WO
WIPO (PCT)
Prior art keywords
data
uniformity
image
sub
compensated
Prior art date
Application number
PCT/CN2020/117141
Other languages
English (en)
French (fr)
Inventor
费永浩
欧阳世宏
范霍文安东尼
Original Assignee
华为技术有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 华为技术有限公司
Priority to PCT/CN2020/117141 (published as WO2022061584A1)
Priority to CN202080101879.2A (published as CN115698821A)
Publication of WO2022061584A1
Priority to US18/187,728 (published as US20230221554A1)

Classifications

    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B 27/01 Head-up displays
    • G02B 27/0101 Head-up displays characterised by optical features
    • G02B 27/017 Head mounted
    • G02B 27/0172 Head mounted characterised by optical features
    • G02B 27/42 Diffraction optics, i.e. systems including a diffractive element being designed for providing a diffractive effect
    • G02B 27/4233 Diffraction optics having a diffractive element [DOE] contributing to a non-imaging application
    • G02B 27/4244 Diffraction optics having a diffractive element [DOE] contributing to a non-imaging application in wavelength selecting devices
    • G02B 2027/0118 Head-up displays characterised by optical features comprising devices for improving the contrast of the display / brilliance control visibility
    • G02B 2027/014 Head-up displays characterised by optical features comprising information/image processing systems

Definitions

  • the present application relates to the field of augmented reality, and in particular, to an image display method and an image display device.
  • Augmented reality (AR) technology can combine the virtual environment with the real environment, and superimpose the objects in the real environment and the virtual environment on the same screen in real time, so as to realize the interaction between the real environment and the virtual environment.
  • the AR head-mounted display device is a wearable device that implements AR technology and can be worn on the human head for display.
  • the AR head-mounted display device is generally implemented in the form of glasses or a helmet, which can display pictures in front of the user's eyes and enhance the user's sense of reality.
  • the display part of the AR head-mounted display device includes a light engine and a diffractive light guide.
  • the light engine includes a light source, a lens group, a display screen, a lens and other devices.
  • the light source may include a light emitting diode (LED) or a laser diode (LD);
  • the display screen may include a liquid crystal on silicon (LCOS) display screen, or may include a digital light processing (DLP) display screen.
  • the principle of the display part is as follows: the light emitted by the light source is transmitted through the lens group to the display screen for imaging, and the image formed by the display screen serves as the image source; the light it emits is modulated by the lens to form an exit pupil, enters the diffractive optical waveguide, and finally undergoes pupil expansion in the waveguide to display a virtual image. The virtual image displayed by the display part of the AR head-mounted display device thus enters the human eye, realizing near-eye display.
  • the core component of the above-mentioned diffractive optical waveguide is the diffraction grating, which splits incident light into several diffraction orders and disperses it. Since incident light of different colors has different wavelengths, the diffraction angles also differ; therefore, when incident light of different colors is transmitted in the diffractive optical waveguide, it propagates at different angles and along different paths, resulting in poor image uniformity and affecting user experience.
  • the present application provides an image display method and an image display device, which help improve the uniformity of an image displayed through a diffractive optical waveguide, thereby improving user experience.
  • in a first aspect, an image display method is provided, applied to a device comprising a light engine and a diffractive optical waveguide. The method comprises: obtaining uniformity data of a first image obtained through the diffractive optical waveguide; determining, according to the uniformity data, the data to be compensated of the light engine; adjusting, based on the data to be compensated, the brightness distribution of the light source in the light engine; and displaying a second image through the adjusted light engine and the diffractive optical waveguide.
  • the image display method of the embodiment of the present application compensates the brightness of the image displayed by the diffractive optical waveguide by adjusting the brightness distribution of the light source in the light engine, which improves the uniformity of the image displayed through the diffractive optical waveguide and thereby improves user experience.
  • since the light source can include an LED array light source or an LD light source, this embodiment can adjust the brightness distribution of the LED array light source, and can also adjust the brightness distribution of the LD light source.
  • it should be understood that the uniformity of an image includes the uniformity of its brightness and the uniformity of its color. Brightness and color are related: when the brightness changes, the color also changes. Therefore, by adjusting the brightness distribution of the light source in the light engine, the embodiments of the present application actually adjust the brightness and color uniformity of the image presented by the light engine, so that the non-uniformity of the image output by the light engine complements that of the image output by the diffractive optical waveguide, finally presenting an image with higher uniformity.
  • in some implementations, obtaining the uniformity data of the first image obtained through the diffractive optical waveguide includes: testing the uniformity of the first image corresponding to multiple regions to obtain a plurality of uniformity sub-data; and determining the uniformity data according to the plurality of uniformity sub-data and the weights of the multiple regions.
  • through region division, the uniformity data of the first image can be determined according to the uniformity sub-data corresponding to the divided regions and the weights of those regions, so that the data to be compensated is determined based on the uniformity data, the brightness distribution of the light source in the light engine is adjusted accordingly, and the second image is displayed through the light engine and the diffractive optical waveguide.
  • the method of this embodiment of the present application takes into account all the areas where the human eye may be located, so that the uniformity of the displayed image is improved for the different areas gazed at by different users, or by the same user at different times, thereby improving user experience.
  • since the human eye can move, the eyes of the same user may be at different positions in the eye box at different times, and the eyes of different users will be at different positions in the eye box relative to the same device. Therefore, the above-mentioned multiple regions in this embodiment may include regions obtained by dividing the eye box. In different regions, the human eye sees the first image with different degrees of uniformity, and this embodiment uses uniformity sub-data to represent the degree of uniformity of the first image in each region.
  • optionally, the uniformity data of the first image may be obtained by a weighted summation of the uniformity sub-data corresponding to the multiple regions.
  • in one possible implementation, considering that a higher proportion of users' eyes fall in the middle area and a lower proportion in the edge areas, the middle area of the multiple areas has the highest weight and the edge areas the lowest. This better matches actual usage and helps obtain more accurate uniformity data.
  • in some implementations, before obtaining the uniformity data of the first image obtained through the diffractive optical waveguide, the method further includes: determining a target area according to eye tracking technology;
  • obtaining the uniformity data of the first image obtained through the diffractive optical waveguide then includes: acquiring the uniformity sub-data of the first image corresponding to the target area; and determining the uniformity sub-data of the first image corresponding to the target area as the uniformity data.
  • the image display method of the embodiment of the present application uses eye tracking technology to track the position of the human eye in real time, obtains the uniformity data of the first image according to the eye position, then adjusts the brightness distribution of the light source in the light engine in real time, and displays the second image through the light engine and the diffractive optical waveguide.
  • the method of the embodiment of the present application can flexibly adapt to different users: according to the current user's eyeball position, it determines the image uniformity of the target area corresponding to that position and performs compensation based on the image uniformity of the target area, which can provide users with a good experience.
  • in some implementations, acquiring the uniformity sub-data of the first image corresponding to the target area includes: selecting, from a plurality of uniformity sub-data of the first image corresponding to a plurality of areas, the uniformity sub-data of the first image corresponding to the target area, wherein the plurality of areas include the target area.
  • before determining the target area, the image display device has tested the uniformity of the first image corresponding to the plurality of areas, obtained a plurality of uniformity sub-data, and saved them. In this case, after determining the target area, the image display device can directly select the uniformity sub-data corresponding to the target area from the plurality of uniformity sub-data corresponding to the multiple areas, which shortens the time delay of image display and improves its efficiency.
  • in some implementations, before determining the data to be compensated of the light engine according to the uniformity data, the method further includes: obtaining a brightness value of the environment where the device is located; determining the data to be compensated of the light engine according to the uniformity data then includes: determining the data to be compensated according to the uniformity data and the brightness value.
  • in some implementations, adjusting the brightness distribution of the light source in the light engine based on the data to be compensated includes: determining a target current value according to the data to be compensated; and adjusting the brightness distribution of the light source based on the target current value.
  • it should be understood that, since a larger current value makes the light source brighter and a smaller current value makes it dimmer, the embodiment of the present application can adjust the brightness distribution of the light source in the light engine based on the target current value determined from the data to be compensated.
  • for example, if the above-mentioned light source is an LED array light source in which different LEDs have different brightnesses, the image display device may first determine the relationship between these brightnesses and current values, then determine the brightness required by each LED in the LED array and convert it into the current value required by each LED, which is the above target current value.
  • in a second aspect, an image display device is provided for executing the method in any possible implementation of the above first aspect.
  • specifically, the apparatus includes modules for executing the method in any one of the possible implementations of the first aspect above.
  • in a third aspect, another image display device is provided, including a processor coupled to a memory, which can be used to execute instructions in the memory so as to implement the method in any one of the possible implementations of the first aspect.
  • the apparatus further includes a memory.
  • the apparatus further includes a communication interface to which the processor is coupled.
  • in one implementation, the image display device is an AR head-mounted display device.
  • the communication interface may be a transceiver, or an input/output interface.
  • in another implementation, the image display device is a chip configured in an AR head-mounted display device.
  • the communication interface may be an input/output interface.
  • in a fourth aspect, a processor is provided, including an input circuit, an output circuit, and a processing circuit.
  • the processing circuit is configured to receive the signal through the input circuit and transmit the signal through the output circuit, so that the processor executes the method in any one of the possible implementation manners of the above first aspect.
  • in a specific implementation process, the above-mentioned processor may be a chip, the input circuit may be an input pin, the output circuit may be an output pin, and the processing circuit may be a transistor, a gate circuit, a flip-flop, or various logic circuits.
  • the input signal received by the input circuit may be received and input by, for example but not limited to, a receiver; the signal output by the output circuit may be, for example but not limited to, output to and transmitted by a transmitter; and the input circuit and the output circuit may be the same circuit, which is used as the input circuit and the output circuit at different times.
  • the embodiments of the present application do not limit the specific implementation manners of the processor and various circuits.
  • in a fifth aspect, a processing apparatus is provided, including a processor and a memory.
  • the processor is configured to read the instructions stored in the memory, and can receive signals through the receiver and transmit signals through the transmitter, so as to execute the method in any one of the possible implementation manners of the first aspect.
  • optionally, there are one or more processors and one or more memories.
  • the memory may be integrated with the processor, or the memory may be provided separately from the processor.
  • in a specific implementation process, the memory can be a non-transitory memory, such as a read-only memory (ROM), which can be integrated with the processor on the same chip or provided separately on a different chip; the embodiment of the present application does not limit the type of the memory or the manner in which the memory and the processor are arranged.
  • it should be understood that a related data interaction process, for example sending indication information, may be a process of outputting the indication information from the processor, and receiving capability information may be a process of the processor receiving the input capability information. Specifically, the data output by the processor can be output to the transmitter, and the input data received by the processor can come from the receiver.
  • the transmitter and the receiver may be collectively referred to as a transceiver.
  • the processing apparatus in the fifth aspect may be a chip, and the processor may be implemented by hardware or by software. When implemented by hardware, the processor may be a logic circuit, an integrated circuit, or the like; when implemented by software, the processor can be a general-purpose processor realized by reading software code stored in a memory, and the memory can be integrated in the processor or located outside the processor and exist independently.
  • in a sixth aspect, a computer program product is provided, including a computer program (also referred to as code, or instructions) which, when run, causes a computer to execute the method in any one of the possible implementations of the first aspect.
  • in a seventh aspect, a computer-readable storage medium is provided, which stores a computer program (also referred to as code, or instructions) that, when run on a computer, causes the computer to execute the method in any one of the possible implementations of the first aspect.
  • FIG. 1 is a schematic diagram of the principle of image display;
  • FIG. 2 is a schematic flowchart of an image display method provided by an embodiment of the present application.
  • FIG. 3 is a schematic diagram of image uniformity corresponding to an image display method provided by an embodiment of the present application.
  • FIG. 4 is a schematic flowchart of another image display method provided by an embodiment of the present application.
  • FIG. 5 is a schematic diagram of weight distribution of multiple regions provided by an embodiment of the present application.
  • FIG. 6 is a schematic flowchart of another image display method provided by an embodiment of the present application.
  • FIG. 7 is a schematic flowchart of another image display method provided by an embodiment of the present application.
  • FIG. 8 is a schematic block diagram of an image display device provided by an embodiment of the present application.
  • FIG. 9 is a schematic block diagram of another image display device provided by an embodiment of the present application.
  • FIG. 10 is a schematic block diagram of another image display apparatus provided by an embodiment of the present application.
  • an optical waveguide is a medium that guides light waves to propagate in it, also known as a dielectric optical waveguide.
  • Diffractive optical waveguides are optical waveguides including diffraction gratings, which can utilize the diffraction properties of light and the total reflection properties of the optical waveguide medium to realize the transmission of imaging beams.
  • Diffractive optical waveguides are mainly divided into two types: surface relief grating waveguides fabricated by photolithography technology and holographic volume grating waveguides fabricated based on holographic interference technology.
  • the diffraction grating is the core element of a diffractive optical waveguide.
  • a diffraction grating is an optical element with a periodic structure. The period can consist of peaks and valleys embossed on the surface of the material, or of "light and dark interference fringes" formed inside the material by holographic exposure.
  • the role of the diffraction grating is to induce periodic changes of the refractive index in the material. This period is generally on the micro-nano scale, which is of the same order of magnitude as the wavelength of visible light (400-700 nm), so the grating can act effectively on light.
  • the light engine may include a light source, a lens group, a display screen, a lens and other devices.
  • the light emitted by the light source can be transmitted through the lens group to the display screen for imaging, and the image formed by the display screen can serve as the image source; that is, the display screen is the image plane of the light source. Since the display screen is the image plane of the light source, the brightness distribution of the image formed by the display screen corresponds one-to-one with the brightness distribution of the light source.
  • the light source may include an LED light source, an LD light source, or other types of light sources;
  • the display screen may include an LCOS display screen, a DLP display screen, or other types of display screens, which are not limited in this application.
  • the LED light source may include multiple LEDs, that is, an LED array. Therefore, the LED light source may also be called an LED array light source.
  • the LED array can be of any size, for example, a rectangle of 5 ⁇ 8, or a square of 5 ⁇ 5, which is not limited in this application.
  • the above-mentioned LED array can be located on an LED substrate, forming an LED module together with the substrate; the LED module can also include a diffuser and a brightness enhancement film (BEF) for reducing the divergence angle of the LED array.
  • image display can be achieved by the light engine and the diffractive optical waveguide. Specifically, the light emitted by the light source is transmitted through the lens group to the display screen for imaging, modulated by the lens to form an exit pupil, coupled into the diffractive optical waveguide, transmitted to the diffraction grating by diffraction and total reflection to perform pupil expansion, and finally coupled out of the diffractive optical waveguide, displaying the image.
  • the function of the light engine can be understood as a projector, and the function of the diffractive optical waveguide can be understood as being responsible for transmitting the image of the projector to the human eyes. It should be understood that when the diffractive optical waveguide transmits the image, the image will not be enlarged or reduced.
  • the uniformity of the image refers to the degree of difference between the pixels of the image at different positions on the display screen, and the uniformity can be measured by parameters such as brightness and color.
  • brightness is also called lightness, which indicates the lightness and darkness of a color.
  • Brightness and color are related to a certain extent, and when the brightness changes, the color will also change.
  • the eye box, for an AR head-mounted display device, refers to a cone-shaped area between the display part of the device and the eyeball; it is the area within which the displayed content is clearest, and outside of which the displayed content is incomplete or even not displayed at all.
  • Eye tracking technology can track the trajectory of the human eye by measuring the position of the gaze point of the human eye or the movement of the human eye relative to the head. Specifically, through image processing technology, the position of the pupil can be located, the coordinates of the center of the pupil can be obtained, and the gaze point of the human eye can be calculated through an algorithm to track the movement trajectory of the human eye.
  • for example, the pupil-corneal reflection tracking method can be used: an eye camera captures an image of the eye, image processing obtains the center position of the pupil, and the corneal reflection point serves as the base point for the relative position between the eye camera and the human eye; from the pupil center position, the coordinates of the sight vector of the human eye can be obtained, so as to determine the gaze point and track the movement trajectory of the human eye.
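  • as an illustration of the gaze-point calculation just described, the following is a minimal Python sketch; the pupil and glint coordinates are assumed to come from the eye camera's image processing, and the 2×3 affine calibration matrix (calib) is a hypothetical stand-in for a per-user calibration step that the text does not detail.

    import numpy as np

    def gaze_point(pupil_px, glint_px, calib):
        # The corneal glint is the base point: work with the pupil-glint offset.
        offset = pupil_px - glint_px
        # Map the offset (pixels) to a gaze point in normalized [0, 1] display
        # coordinates with an affine calibration (illustrative values below).
        return calib @ np.array([offset[0], offset[1], 1.0])

    calib = np.array([[0.010, 0.000, 0.5],
                      [0.000, 0.012, 0.5]])
    print(gaze_point(np.array([312.0, 240.0]), np.array([300.0, 228.0]), calib))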
  • FIG. 1 is a schematic diagram of the principle of image display. FIG. 1 includes a light engine 101, a diffractive optical waveguide 102, and a human eye 106, wherein the diffractive optical waveguide 102 includes an input grating 103, a diffraction grating 104, and an output grating 105.
  • the light emitted by the light source in the above-mentioned light engine 101 is imaged by the display screen, modulated by the lens to form the exit pupil, and then enters the diffractive optical waveguide 102.
  • the input grating 103 is responsible for receiving the optical signal, and transmitting the optical signal to the diffraction grating 104 in the form of total reflection.
  • the diffraction grating 104 is used to expand the received optical signal (pupil expansion) and then transmit the optical signal in the form of total reflection to the output grating 105, which couples the optical signal out and projects it to the human eye 106 for imaging.
  • the above pupil expansion refers to converting an input optical signal with a low field of view into an output optical signal with a high field of view, and includes horizontal pupil expansion and vertical pupil expansion.
  • the field of view can be measured by the field of view angle (FOV), which refers to the angle subtended at the human eye by the edges of the display.
  • the field of view can include a horizontal field of view and a vertical field of view.
  • the above-mentioned diffraction grating 104 expands the pupil of the incident light, splitting the incident light into several diffraction orders and dispersing it.
  • diffraction gratings may include one-dimensional gratings and two-dimensional gratings. In order to avoid ghost images and stray light, a diffraction grating generally retains only one diffraction order, namely the positive or negative first order.
  • the diffraction order of a one-dimensional grating can be +1 or -1; the diffraction orders of a two-dimensional grating can include (-1,1), (-1,-1), (1,1), (1,-1), (1,0), (-1,0), (0,-1), (0,1).
  • after the diffraction grating 104 divides the incident light into several diffraction orders, the light of each diffraction order can continue to propagate along different directions in the diffractive optical waveguide 102.
  • in this respect, the operation of the diffraction grating 104 on the incident light is mainly to change the propagation direction of the incident light.
  • the diffraction efficiency for incident light of a certain diffraction order (i.e., a certain direction) can be maximized through parameters of the diffraction grating 104 such as the material refractive index, grating shape, thickness, and duty cycle, so that after diffraction most of the light propagates mainly in that direction, thereby reducing the loss of light in other diffraction directions.
  • the diffraction grating 104 can generate different diffraction angles for incident light of different wavelengths, and the longer the wavelength of light, the larger the diffraction angle.
  • for example, the incident light includes red, green, and blue light. Since the wavelength of red light > the wavelength of green light > the wavelength of blue light, the diffraction angle of red light > the diffraction angle of green light > the diffraction angle of blue light. Because the diffraction angles differ, the path lengths covered by the incident light in one total reflection also differ; therefore, the number of total reflections of red light < the number of total reflections of green light < the number of total reflections of blue light. Based on these differences, the image finally displayed through the diffractive optical waveguide 102 by incident light containing red, green, and blue components may have uneven brightness and color.
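  • the wavelength dependence described above follows from the standard grating equation (not quoted in the original text): for a grating of period Λ and diffraction order m = ±1, sin θ_m = sin θ_i + mλ/Λ, so at a fixed incidence angle θ_i a longer wavelength λ yields a larger diffraction angle θ_m. A steeper angle in turn means a longer path between successive total reflections, which is why red light (≈630 nm) undergoes fewer total reflections than green (≈530 nm) or blue (≈460 nm) over the same waveguide length.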
  • in view of this, the present application provides an image display method that compensates the brightness of the image displayed by the diffractive optical waveguide by adjusting the brightness distribution of the light source in the light engine, improving the uniformity of the image displayed through the diffractive optical waveguide and thereby improving user experience.
  • the image display method of the embodiments of the present application can be applied to an image display device including a light engine and a diffractive optical waveguide, for example, an AR helmet, AR glasses, a mobile phone, a tablet, a computer, a vehicle head-up display (HUD), an AR intelligent interactive device, smart glasses, and the like, which is not limited in the embodiments of the present application.
  • the terms "first", "second", and various ordinal numbers are only used for convenience of description and are not intended to limit the scope of the embodiments of the present application, for example, to distinguish between different images or different regions.
  • "at least one" means one or more, and "plurality" means two or more.
  • "and/or" describes an association relationship between associated objects and indicates that three relationships can exist; for example, "A and/or B" can mean: A exists alone, both A and B exist, or B exists alone, where A and B can be singular or plural.
  • the character "/" generally indicates that the associated objects are in an "or" relationship.
  • "at least one item(s) of the following" or similar expressions refer to any combination of these items, including any combination of a single item or plural items.
  • for example, "at least one of a, b, and c" may represent: a, or b, or c, or a and b, or a and c, or b and c, or a, b, and c, where a, b, and c can each be singular or plural.
  • FIG. 2 is a schematic flowchart of an image display method 200 provided by an embodiment of the present application.
  • the method 200 can be performed by an image display device and includes the following steps: S201, obtain uniformity data of a first image obtained through the diffractive optical waveguide; S202, determine, according to the uniformity data, the data to be compensated of the light engine; S203, adjust, based on the data to be compensated, the brightness distribution of the light source in the light engine; S204, display a second image through the adjusted light engine and the diffractive optical waveguide.
  • the first image may be an image output through the diffractive optical waveguide alone. Since incident light of different colors is transmitted in the diffractive optical waveguide at different angles and along different paths, the uniformity of the first image is poor.
  • since the light source can include an LED array light source or an LD light source, in this embodiment the brightness distribution of the LED array light source can be adjusted, and the brightness distribution of the LD light source can also be adjusted.
  • since the data to be compensated is determined according to the uniformity data of the first image, and the uniformity data of the first image reflects the uniformity of the image displayed by the diffractive optical waveguide, the data to be compensated can be used to compensate the uniformity of the image displayed by the diffractive optical waveguide.
  • the second image is the image displayed after brightness compensation is performed for the diffractive optical waveguide. Owing to the brightness compensation, the second image is more uniform than the first image.
  • the image display method of the embodiment of the present application compensates the brightness of the image displayed by the diffractive optical waveguide by adjusting the brightness distribution of the light source in the light engine, which improves the uniformity of the image displayed through the diffractive optical waveguide and thereby improves user experience.
  • it should be understood that the uniformity of an image includes the uniformity of its brightness and the uniformity of its color. Brightness and color are related: when the brightness changes, the color also changes. Therefore, by adjusting the brightness distribution of the light source in the light engine, the embodiments of the present application actually adjust the brightness and color uniformity of the image presented by the light engine, so that the non-uniformity of the image output by the light engine complements that of the image output by the diffractive optical waveguide, finally presenting an image with higher uniformity.
  • FIG. 3 is a schematic diagram of the image uniformity corresponding to the above method 200, wherein the image obtained through the diffractive optical waveguide is image 1 (equivalent to the above-mentioned first image), the image presented by the light engine is image 2, and the image displayed through both the light engine and the diffractive optical waveguide is image 3 (equivalent to the second image above).
  • image 1, image 2, and image 3 are each divided into 4×4 areas, and each area has its own brightness value, showing the brightness distribution of the image.
  • the brightness distributions of image 1, image 2, and image 3 are respectively shown with lines in FIG. 3, where different line patterns represent different brightness values. As can be seen from FIG. 3, image 1 and image 2 are each divided into 4×4 areas, and the areas of image 1 correspond one-to-one with the areas of image 2.
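  • the complementary relationship between image 1 and image 2 can be illustrated with a small numerical sketch; the multiplicative brightness model and the 4×4 values below are assumptions for illustration, not figures from the patent.

    import numpy as np

    # Hypothetical 4x4 relative brightness map of image 1: the waveguide output
    # when the light engine is driven uniformly (values are illustrative).
    rng = np.random.default_rng(0)
    waveguide_map = rng.uniform(0.6, 1.0, size=(4, 4))

    # Image 2: drive the light engine dimmer where the waveguide is brighter,
    # so that the two non-uniformities cancel.
    engine_map = waveguide_map.min() / waveguide_map

    # Image 3: the combined output is uniform in this toy model.
    combined = engine_map * waveguide_map
    print(np.allclose(combined, combined[0, 0]))  # True: all areas equal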
  • in the embodiments of the present application, the uniformity data of the first image obtained through the diffractive optical waveguide can be obtained in many different ways; that is, the above S201 can be realized in many different ways. Two possible ways are introduced below.
  • in one possible implementation, the above S201 may include the following steps: S401, test the uniformity of the first image corresponding to multiple regions to obtain multiple uniformity sub-data; S402, determine the uniformity data of the first image according to the multiple uniformity sub-data and the weights of the multiple regions.
  • since the human eye can move, the eyes of the same user may be at different positions in the eye box at different times, and the eyes of different users will be at different positions in the eye box relative to the same device. Therefore, the above-mentioned multiple regions in this embodiment may include regions obtained by dividing the eye box. In different regions, the human eye sees the first image with different degrees of uniformity, and this embodiment uses uniformity sub-data to represent the degree of uniformity of the first image in each region.
  • the multiple areas have respective weights; the weights of some areas may be the same, and the weights of some areas may be different, which is not limited in this embodiment of the present application.
  • in one possible implementation, considering that a higher proportion of users' eyes fall in the middle area and a lower proportion in the edge areas, the middle area of the multiple areas has the highest weight and the edge areas the lowest. This better matches actual usage and helps obtain more accurate uniformity data.
  • for example, the eye box can be divided into 3×5 areas, in which case the above-mentioned multiple areas include 15 areas, namely area 1, area 2, ..., area 15. The human eye can be located in any of the 15 areas, and the above plurality of uniformity sub-data includes the 15 uniformity sub-data corresponding to the 15 areas.
  • the weight of area 8 is A; the weight of areas 3, 7, 9, and 13 is B; the weight of areas 2, 4, 6, 10, 12, and 14 is C; and the weight of areas 5, 11, and 15 is D. Here A to D represent the levels of the weights: A is the highest, followed by B and then C, and D is the lowest.
  • optionally, the uniformity data of the first image may be obtained by a weighted summation of the uniformity sub-data corresponding to the multiple regions.
  • for example, the uniformity data of the first image may be the sum of the uniformity sub-data corresponding to area 1 × 10%, the uniformity sub-data corresponding to area 2 × 3.3%, ..., and the uniformity sub-data corresponding to area 15 × 2.5%.
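  • S402 can be expressed as the following minimal Python sketch; the 3×5 weight grid below follows the A > B > C > D pattern described above, but the specific numbers (and the measured sub-data) are illustrative assumptions, not values from the patent.

    import numpy as np

    # Hypothetical weights for the 3x5 eye-box division (rows x columns):
    # center area 8 gets the highest weight (A), corners the lowest (D).
    weights = np.array([
        [0.02, 0.05, 0.10, 0.05, 0.02],
        [0.05, 0.10, 0.22, 0.10, 0.05],
        [0.02, 0.05, 0.10, 0.05, 0.02],
    ])
    assert abs(weights.sum() - 1.0) < 1e-9  # weights form a weighted average

    # S401: uniformity sub-data measured per area (illustrative values).
    sub_data = np.random.default_rng(0).uniform(0.7, 1.0, size=(3, 5))

    # S402: weighted summation over all 15 areas.
    uniformity_data = float((weights * sub_data).sum())
    print(uniformity_data)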
  • through region division, the uniformity data of the first image can be determined according to the uniformity sub-data corresponding to the divided regions and the weights of those regions, so that the data to be compensated is determined based on the uniformity data, the brightness distribution of the light source in the light engine is adjusted accordingly, and the second image is displayed through the light engine and the diffractive optical waveguide.
  • the method of this embodiment of the present application takes into account all the areas where the human eye may be located, so that the uniformity of the displayed image is improved for the different areas gazed at by different users, or by the same user at different times, thereby improving user experience.
  • in another possible implementation, the above S201 may include the following steps: S601, determine a target area according to eye tracking technology; S602, acquire the uniformity sub-data of the first image corresponding to the target area; S603, determine the uniformity sub-data of the first image corresponding to the target area as the uniformity data of the first image.
  • specifically, the image display device can first determine the relative position of the pupil of the eyeball according to eye tracking technology, and then determine the gaze point of the human eye. The area where the gaze point is located is the above-mentioned target area, and the uniformity sub-data of the first image corresponding to the target area is the uniformity data of the first image.
  • the image display method of the embodiment of the present application uses eye tracking technology to track the position of the human eye in real time, obtains the uniformity data of the first image according to the eye position, then adjusts the brightness distribution of the light source in the light engine in real time, and displays the second image through the light engine and the diffractive optical waveguide.
  • the method of the embodiment of the present application can flexibly adapt to different users: according to the current user's eyeball position, it determines the image uniformity of the target area corresponding to that position and performs compensation based on the image uniformity of the target area, which can provide users with a good experience.
  • the uniformity sub-data of the first image corresponding to the target area may be acquired in various ways, which is not limited in this embodiment of the present application.
  • in method 1, the image display device can measure the uniformity sub-data of the first image corresponding to the target area in real time.
  • in method 2, after the image display device uses eye tracking technology to determine the target area, it can select the uniformity sub-data of the first image corresponding to the target area from the plurality of uniformity sub-data of the first image corresponding to the multiple areas, where the plurality of areas include the target area.
  • that is, before determining the target area, the image display device has performed the above S401: it has tested the uniformity of the first image corresponding to the multiple areas, obtained multiple uniformity sub-data, and saved them. After the target area is determined, the uniformity sub-data corresponding to the target area can be directly selected from the plurality of uniformity sub-data corresponding to the plurality of areas.
  • for example, if the image display device determines, according to eye tracking technology, that the target area is area 7, the image display device can select the uniformity sub-data corresponding to area 7 from the 15 uniformity sub-data corresponding to the 15 areas and determine it as the uniformity data of the first image.
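  • method 2 then amounts to a table lookup, sketched below; the placeholder sub-data values and the mapping from a normalized gaze point to a row-major area number are assumptions for illustration.

    # Saved in S401: one uniformity sub-datum per area (placeholder values).
    ROWS, COLS = 3, 5
    sub_data = {i: 0.80 + 0.01 * i for i in range(1, 16)}

    def area_index(x: float, y: float) -> int:
        """Row-major area number (1..15) for a gaze point with x, y in [0, 1)."""
        row = min(int(y * ROWS), ROWS - 1)
        col = min(int(x * COLS), COLS - 1)
        return row * COLS + col + 1

    target = area_index(0.30, 0.55)     # e.g. the gaze falls in area 7
    uniformity_data = sub_data[target]  # S602/S603: sub-data becomes the data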
  • compared with each other, method 1 does not need to measure the image uniformity of multiple areas in advance or to save multiple uniformity sub-data, which helps save device energy consumption and memory; method 2 does not need to measure the image uniformity of the target area after the target area is determined, which shortens the time delay of image display and improves its efficiency.
  • optionally, before determining the data to be compensated of the light engine according to the uniformity data, that is, before S202, the above method 200 further includes: obtaining the brightness value of the environment where the image display device is located. Determining the data to be compensated of the light engine according to the uniformity data then includes: determining the data to be compensated according to the uniformity data of the first image and the brightness value.
  • for example, the image display device can use an ambient light sensor to measure the brightness value of the current environment. If the uniformity data of the first image is 20 nits and the brightness value of the environment where the image display device is located is 100 nits, the data to be compensated may be 100 divided by 20, which equals 5.
  • the above-mentioned adjusting the brightness distribution of the light source in the light engine based on the data to be compensated includes: determining a target current value based on the data to be compensated; adjusting the brightness distribution of the light source according to the target current value.
  • it should be understood that, since a larger current value makes the light source brighter and a smaller current value makes it dimmer, the embodiment of the present application can adjust the brightness distribution of the light source in the light engine based on the target current value determined from the data to be compensated.
  • for example, if the above-mentioned light source is an LED array light source in which different LEDs have different brightnesses, the image display device may first determine the relationship between these brightnesses and current values, then determine the brightness required by each LED in the LED array and convert it into the current value required by each LED, which is the above target current value.
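  • the brightness-to-current conversion can be sketched as follows; the per-LED calibration table (drive current in mA versus brightness in nits) and the compensation-factor arithmetic reuse the 100/20 = 5 example above, but all numbers are illustrative assumptions rather than values from the patent.

    import numpy as np

    # Hypothetical calibration of one LED: drive current (mA) vs. brightness (nits).
    cal_current = np.array([5.0, 10.0, 20.0, 40.0, 80.0, 160.0])
    cal_brightness = np.array([30.0, 65.0, 140.0, 290.0, 580.0, 1100.0])

    def target_current(required_nits: float) -> float:
        """Invert the brightness-vs-current calibration by interpolation."""
        return float(np.interp(required_nits, cal_brightness, cal_current))

    # Data to be compensated: ambient 100 nits / uniformity data 20 nits = 5.
    factor = 100 / 20
    # An LED whose area currently shows 60 nits would need 60 * 5 = 300 nits.
    print(target_current(60 * factor))  # target current value for that LED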
  • FIG. 7 shows a schematic flow chart of another image display method 700 proposed by the present application.
  • the method can be executed by an image display device and can include the following steps:
  • in S703, eye tracking technology is used to determine a target area from a plurality of areas.
  • S704 includes two possible implementations.
  • in one implementation, the image display device executes S703 and then S704; that is, the image display device determines the target area and determines the uniformity sub-data of the first image corresponding to the target area as the uniformity data of the first image.
  • in the other implementation, the image display device does not perform S703 and directly performs S704 after performing S702; that is, the image display device determines the uniformity data of the first image according to the uniformity sub-data of the first image corresponding to the multiple regions and the weights of the multiple regions.
  • in S707, the target current value is determined according to the data to be compensated of the light engine.
  • optionally, the method 700 may further include: S710, judging whether the second image achieves the expected effect.
  • specifically, the image display device can test the uniformity of the second image to obtain the uniformity data of the second image, and then judge, according to the uniformity data of the second image and the brightness value of the environment where the image display device is located, whether the uniformity of the second image achieves the expected effect. If it does, the process ends and the second image is output to the user; if it does not, the above S703 to S710 are repeated until the expected effect is achieved.
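  • the S703-S710 loop can be illustrated with a runnable toy model; the 4×4 maps, the multiplicative brightness model, the min/max uniformity metric, and the 0.95 threshold are all assumptions for illustration, since in the real method the uniformity is measured optically and the light source is driven by currents.

    import numpy as np

    rng = np.random.default_rng(1)
    waveguide_map = rng.uniform(0.6, 1.0, size=(4, 4))  # fixed waveguide non-uniformity
    engine_map = np.ones((4, 4))                        # adjustable light-engine brightness

    def uniformity(img):
        return float(img.min() / img.max())             # simple uniformity metric

    for step in range(10):
        second_image = engine_map * waveguide_map       # S709: displayed second image
        if uniformity(second_image) >= 0.95:            # S710: expected effect reached?
            break
        # S704-S708: compensate the light engine against the measured image.
        engine_map = engine_map * second_image.mean() / second_image

    print(step, uniformity(engine_map * waveguide_map))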
  • S701, S702, and S704-S709 constitute one image display method. Through area division, this method can determine the uniformity data of the first image according to the uniformity sub-data corresponding to the divided areas and the weights of those areas, so as to determine the data to be compensated based on the uniformity data, then adjust the brightness distribution of the light source of the light engine, and display the second image through the light engine and the diffractive optical waveguide.
  • this method takes all the areas where the human eye may be located into account, so that the uniformity of the image displayed is improved to some extent for the different areas gazed at by different users, or by the same user at different times, thereby improving user experience.
  • S701-S710 constitute another image display method, which uses eye tracking technology to track the position of the human eye in real time, obtains the uniformity data of the first image according to the eye position, then adjusts the brightness distribution of the light source in the light engine in real time, displays the second image through the light engine and the diffractive optical waveguide, and checks whether the uniformity of the second image achieves the expected effect; if not, the method is repeated until the expected effect is achieved.
  • this method can flexibly adapt to different users: according to the current user's eyeball position, it determines the image uniformity of the target area corresponding to that position, performs compensation based on the image uniformity of the target area, and verifies the compensation result until the expected effect is achieved, which can provide users with a good experience.
  • FIG. 8 shows an image display apparatus 800 provided by an embodiment of the present application.
  • the apparatus 800 includes a processing unit 810 and a display unit 820 .
  • the display unit 820 includes an optical engine and a diffractive optical waveguide.
  • the above-mentioned processing unit 810 is configured to: obtain the uniformity data of the first image obtained based on the light engine and the diffractive optical waveguide, determine the data to be compensated of the light engine according to the uniformity data, and adjust, based on the data to be compensated, the brightness distribution of the light source in the light engine; the above-mentioned display unit 820 is configured to: display the second image based on the adjusted light source and the diffractive optical waveguide.
  • the processing unit 810 is specifically configured to: test the uniformity of the first image corresponding to the multiple regions to obtain multiple uniformity sub-data; and determine the uniformity data according to the multiple uniformity sub-data and the weights of the multiple regions.
  • the processing unit 810 is further configured to: determine the target area according to eye tracking technology; acquire the uniformity sub-data of the first image corresponding to the target area; and determine the uniformity sub-data of the first image corresponding to the target area as the uniformity data.
  • the processing unit 810 is specifically configured to: select the uniformity sub-data of the first image corresponding to the target area from the uniformity sub-data of the first image corresponding to the multiple areas, wherein the multiple areas include the target area.
  • the processing unit 810 is further configured to: obtain the brightness value of the environment where the device is located; and determine the data to be compensated according to the uniformity data and the brightness value.
  • the processing unit 810 is specifically configured to: determine the target current value according to the data to be compensated; and adjust the brightness distribution of the light source based on the target current value.
  • the apparatus 800 here is embodied in the form of functional modules.
  • the term "unit" as used herein may refer to an application-specific integrated circuit (ASIC), an electronic circuit, a processor (for example, a shared processor, a dedicated processor, or a group of processors) and memory for executing one or more software or firmware programs, a combinational logic circuit, and/or other suitable components that support the described functions.
  • those skilled in the art can understand that the apparatus 800 may specifically be the image display device in the foregoing embodiments, or the functions of the image display device in the foregoing embodiments may be integrated into the apparatus 800; the apparatus 800 may be used to execute the processes and/or steps corresponding to the image display device in the above method embodiments, and to avoid repetition, details are not repeated here.
  • the above-mentioned apparatus 800 has the function of implementing the corresponding steps performed by the image display device in the above method; the above functions may be implemented by hardware, or by hardware executing corresponding software.
  • the hardware or software includes one or more modules corresponding to the above functions.
  • the apparatus 800 in FIG. 8 may also be a chip or a system of chips, such as a system on chip (system on chip, SoC).
  • FIG. 9 shows another image display apparatus 900 provided by an embodiment of the present application.
  • the apparatus 900 includes a processor 910, a light engine 920, and a diffractive optical waveguide 930.
  • the processor 910 is configured to: obtain the uniformity data of the first image obtained based on the light engine 920 and the diffractive optical waveguide 930, determine the data to be compensated of the light engine 920 according to the uniformity data, and adjust, based on the data to be compensated, the brightness distribution of the light source in the light engine 920; the light engine 920 and the diffractive optical waveguide 930 are configured to: display the second image based on the adjusted light source.
  • the apparatus 900 may specifically be the image display device in the above embodiments, or the functions of the image display device in the above embodiments may be integrated into the apparatus 900; the apparatus 900 may be used to execute the steps and/or processes corresponding to the image display device in the above method embodiments.
  • FIG. 10 shows yet another image display apparatus 1000 provided by an embodiment of the present application.
  • the apparatus 1000 includes a memory 1010, a processor 1020, a controller 1030, a driver 1040, a light source 1050, a display screen 1060, and a diffractive optical waveguide 1070.
  • the above devices may be connected through internal paths; the memory 1010 is used to store data (for example, the uniformity data of the first image) and instructions, and the processor 1020 is used to execute the instructions stored in the memory 1010 to perform a preprocessing operation, that is, to determine, based on the uniformity data of the first image, the data to be compensated of the light source 1050, so that the controller 1030 controls the driver 1040 to adjust the brightness distribution of the light source 1050.
  • the light emitted by the adjusted light source 1050 can be transmitted to the display screen 1060 for imaging, and then enter the human eye through the diffractive optical waveguide 1070 .
  • the image that the user can see is the second image displayed by the adjusted light source and the diffractive optical waveguide.
  • the apparatus 1000 may specifically be the image display device in the above embodiments, or the functions of the image display device in the above embodiments may be integrated into the apparatus 1000; the apparatus 1000 may be used to execute the steps and/or processes corresponding to the image display device in the above method embodiments.
  • the memory may include read-only memory and random access memory, and provide instructions and data to the processor.
  • a portion of the memory may also include non-volatile random access memory.
  • the memory may also store device type information.
  • the processor may be configured to execute the instructions stored in the memory, and when the processor executes the instructions, the processor may execute various steps and/or processes corresponding to the image display device in the above method embodiments.
  • the processor may be a central processing unit (CPU), or may be another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like.
  • a general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
  • the above-mentioned controller may be a microcontroller unit (MCU).
  • each step of the above-mentioned method can be completed by a hardware integrated logic circuit in a processor or an instruction in the form of software.
  • the steps of the methods disclosed in conjunction with the embodiments of the present application may be directly embodied as executed by a hardware processor, or executed by a combination of hardware and software modules in the processor.
  • the software modules may be located in random access memory, flash memory, read-only memory, programmable read-only memory or electrically erasable programmable memory, registers and other storage media mature in the art.
  • the storage medium is located in the memory, and the processor executes the instructions in the memory, and completes the steps of the above method in combination with its hardware. To avoid repetition, detailed description is omitted here.
  • the disclosed system, apparatus and method may be implemented in other manners.
  • the apparatus embodiments described above are only illustrative.
  • the division of the units is only a logical function division, and in actual implementation there may be other division manners; for example, multiple units or components may be combined or integrated into another system, or some features may be ignored or not implemented.
  • the shown or discussed mutual coupling, direct coupling, or communication connection may be indirect coupling or communication connection through some interfaces, devices, or units, and may be in electrical, mechanical, or other forms.
  • the units described as separate components may or may not be physically separated, and components displayed as units may or may not be physical units, that is, may be located in one place, or may be distributed to multiple network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution in this embodiment.
  • each functional unit in each embodiment of the present application may be integrated into one processing unit, or each unit may exist physically alone, or two or more units may be integrated into one unit.
  • the functions, if implemented in the form of software functional units and sold or used as independent products, may be stored in a computer-readable storage medium.
  • the technical solutions of the present application, in essence, or the part contributing to the prior art, or a part of the technical solutions, can be embodied in the form of a software product; the computer software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to execute all or part of the steps of the methods described in the various embodiments of the present application.
  • the aforementioned storage medium includes various media that can store program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Spectroscopy & Molecular Physics (AREA)

Abstract

The present application provides an image display method and an image display device, which help improve the uniformity of an image displayed through a diffractive optical waveguide, thereby improving user experience. The method is applied to a device comprising a light engine and a diffractive optical waveguide and includes: obtaining uniformity data of a first image obtained through the diffractive optical waveguide; determining, according to the uniformity data, data to be compensated of the light engine; adjusting, based on the data to be compensated, the brightness distribution of a light source in the light engine; and displaying a second image through the adjusted light engine and the diffractive optical waveguide.

Description

Image display method and image display apparatus

Technical Field

This application relates to the field of augmented reality, and in particular, to an image display method and an image display apparatus.

Background

Augmented reality (AR) technology combines a virtual environment with the real environment, superimposing objects of the real environment and the virtual environment into the same picture in real time, thereby enabling interaction between the real environment and the virtual environment. An AR head-mounted display (HMD) device is a wearable device that implements AR technology and can be worn on the head for display; it generally takes the form of glasses or a helmet and presents images in front of the user's eyes to enhance the user's sense of reality.

The display part of an AR HMD device includes a light engine and a diffractive optical waveguide. The light engine includes components such as a light source, a lens group, a display panel, and a lens. For example, the light source may include a light emitting diode (LED) or a laser diode (LD); the display panel may include a liquid crystal on silicon (LCOS) panel or a digital light processing (DLP) panel. The display part works as follows: light emitted by the light source is transmitted through the lens group to the display panel, where an image is formed; the image formed on the display panel serves as the image source, and the light it emits is modulated by the lens into an exit pupil, enters the diffractive optical waveguide, and finally undergoes pupil expansion in the waveguide to display a virtual image. The virtual image displayed by the display part of the AR HMD device thus enters the human eye, realizing near-eye display.

The core component of the diffractive optical waveguide is the diffraction grating, which splits incident light into several diffraction orders and disperses it. Because incident light of different colors has different wavelengths and therefore different diffraction angles, light of different colors propagates through the waveguide at different angles and along different paths, resulting in poor image uniformity and a degraded user experience.
Summary

This application provides an image display method and an image display apparatus, which help improve the uniformity of an image displayed through a diffractive optical waveguide and thereby improve user experience.

According to a first aspect, an image display method is provided, applied to an apparatus including a light engine and a diffractive optical waveguide. The method includes: obtaining uniformity data of a first image obtained through the diffractive optical waveguide; determining, based on the uniformity data, to-be-compensated data of the light engine; adjusting, based on the to-be-compensated data, the brightness distribution of a light source in the light engine; and displaying a second image through the adjusted light engine and the diffractive optical waveguide.

In the image display method of the embodiments of this application, the brightness of the image displayed by the diffractive optical waveguide is compensated by adjusting the brightness distribution of the light source in the light engine, which improves the uniformity of the image displayed through the waveguide and thereby improves user experience.

Because the light source may include an LED array light source or an LD light source, this embodiment may adjust the brightness distribution of either type.

It should be understood that in the embodiments of this application, image uniformity includes both brightness uniformity and color uniformity. Brightness and color are correlated: when brightness changes, color changes as well. Therefore, by adjusting the brightness distribution of the light source in the light engine, the embodiments of this application in effect adjust the brightness and color uniformity of the image presented by the light engine, so that the uniformity of the image output by the light engine and that of the image output by the diffractive optical waveguide complement each other, finally presenting an image with high uniformity.
With reference to the first aspect, in some implementations of the first aspect, the obtaining uniformity data of the first image obtained through the diffractive optical waveguide includes: testing uniformity of the first image corresponding to a plurality of regions to obtain a plurality of uniformity sub-data; and determining the uniformity data based on the plurality of uniformity sub-data and weights of the plurality of regions.

In the image display method of the embodiments of this application, through region division, the uniformity data of the first image can be determined from the uniformity sub-data corresponding to the divided regions and the weights of those regions; the to-be-compensated data is then determined from the uniformity data, the brightness distribution of the light source in the light engine is adjusted accordingly, and the second image is displayed through the light engine and the diffractive optical waveguide. The method of the embodiments of this application takes into account every region where the human eye may be located, so that the uniformity of the displayed image improves in the different regions gazed at by different users, or by the same user at different moments, thereby improving user experience.

Because the human eye can move, the eyes of the same user may be at different positions within the eye box at different moments, and the eyes of different users will also be at different positions in the eye box relative to the same device. Therefore, the plurality of regions in the embodiments of this application may include regions obtained by dividing the eye box. In different regions, the human eye sees the first image with different degrees of uniformity; the embodiments of this application use uniformity sub-data to represent the degree of uniformity of the first image in the different regions.

In the embodiments of this application, each of the plurality of regions has its own weight; some regions may have the same weight and some may have different weights, which is not limited in the embodiments of this application. Optionally, the uniformity data of the first image may be obtained as a weighted sum of the uniformity sub-data corresponding to the plurality of regions.

In a possible implementation, considering that the proportion of users corresponding to the central region is relatively high while the proportion corresponding to the edge regions is relatively low, the central region of the plurality of regions has the highest weight and the edge regions have the lowest weight. This better matches actual usage and helps obtain more accurate uniformity data.
With reference to the first aspect, in some implementations of the first aspect, before the obtaining uniformity data of the first image obtained through the diffractive optical waveguide, the method further includes: determining a target region based on eye tracking; and the obtaining uniformity data of the first image obtained through the diffractive optical waveguide includes: obtaining uniformity sub-data of the first image corresponding to the target region; and determining the uniformity sub-data of the first image corresponding to the target region as the uniformity data.

In the image display method of the embodiments of this application, eye tracking is used to track the position of the human eye in real time, the uniformity data of the first image is obtained from that position, the brightness distribution of the light source in the light engine is then adjusted in real time, and the second image is displayed through the light engine and the diffractive optical waveguide. The method of the embodiments of this application flexibly adapts to different users: based on the current user's eye position, it determines the image uniformity of the target region corresponding to that position and compensates on that basis, providing a good user experience.

With reference to the first aspect, in some implementations of the first aspect, the obtaining uniformity sub-data of the first image corresponding to the target region includes: selecting, from a plurality of uniformity sub-data of the first image corresponding to a plurality of regions, the uniformity sub-data of the first image corresponding to the target region, where the plurality of regions includes the target region.

Before the target region is determined, the image display device has already tested the uniformity of the first image corresponding to the plurality of regions, obtained the plurality of uniformity sub-data, and stored them. In this case, once the target region is determined, the device can directly select the sub-data corresponding to the target region from the stored plurality of sub-data, which shortens the display latency and improves display efficiency.

With reference to the first aspect, in some implementations of the first aspect, before the determining, based on the uniformity data, to-be-compensated data of the light engine, the method further includes: obtaining the brightness value of the environment in which the apparatus is located; and the determining, based on the uniformity data, to-be-compensated data of the light engine includes: determining the to-be-compensated data based on the uniformity data and the brightness value.

With reference to the first aspect, in some implementations of the first aspect, the adjusting, based on the to-be-compensated data, the brightness distribution of the light source in the light engine includes: determining a target current value based on the to-be-compensated data; and adjusting the brightness distribution of the light source based on the target current value.

It should be understood that because a larger current makes the light source brighter and a smaller current makes it dimmer, the embodiments of this application can adjust the brightness distribution of the light source in the light engine based on the target current value determined from the to-be-compensated data.

For example, if the light source is an LED array light source in which different LEDs have different brightness, the image display device may first determine the relationship between these brightness levels and current values, then determine the brightness required for each LED in the array, and convert each LED's required brightness into the current value that LED requires, which is the target current value.
According to a second aspect, an image display apparatus is provided, configured to perform the method in any possible implementation of the first aspect. Specifically, the apparatus includes modules configured to perform the method in any possible implementation of the first aspect.

According to a third aspect, another image display apparatus is provided, including a processor. The processor is coupled to a memory and may be configured to execute instructions in the memory to implement the method in any possible implementation of the first aspect. Optionally, the apparatus further includes the memory. Optionally, the apparatus further includes a communication interface, and the processor is coupled to the communication interface.

In one implementation, the image display apparatus is an AR HMD device. When the image display apparatus is an AR HMD device, the communication interface may be a transceiver or an input/output interface.

In another implementation, the image display apparatus is a chip configured in an AR HMD device. When the image display apparatus is a chip configured in an AR HMD device, the communication interface may be an input/output interface.

According to a fourth aspect, a processor is provided, including an input circuit, an output circuit, and a processing circuit. The processing circuit is configured to receive a signal through the input circuit and transmit a signal through the output circuit, so that the processor performs the method in any possible implementation of the first aspect.

In a specific implementation process, the processor may be a chip, the input circuit may be an input pin, the output circuit may be an output pin, and the processing circuit may be transistors, gate circuits, flip-flops, various logic circuits, and the like. The input signal received by the input circuit may be, for example but not limited to, received and input by a receiver, and the signal output by the output circuit may be, for example but not limited to, output to and transmitted by a transmitter; the input circuit and the output circuit may be the same circuit, used as the input circuit and the output circuit at different moments. The specific implementations of the processor and the various circuits are not limited in the embodiments of this application.

According to a fifth aspect, a processing apparatus is provided, including a processor and a memory. The processor is configured to read instructions stored in the memory, receive signals through a receiver, and transmit signals through a transmitter, to perform the method in any possible implementation of the first aspect.

Optionally, there are one or more processors and one or more memories.

Optionally, the memory may be integrated with the processor, or the memory and the processor may be disposed separately.

In a specific implementation process, the memory may be a non-transitory memory, such as a read-only memory (ROM), which may be integrated with the processor on the same chip or disposed on different chips. The type of the memory and the arrangement of the memory and the processor are not limited in the embodiments of this application.

It should be understood that a related data exchange process, for example, sending indication information, may be a process of outputting the indication information from the processor, and receiving capability information may be a process of the processor receiving input capability information. Specifically, data output by processing may be output to the transmitter, and input data received by the processor may come from the receiver. The transmitter and the receiver may be collectively referred to as a transceiver.

The processing apparatus in the fifth aspect may be a chip. The processor may be implemented by hardware or software. When implemented by hardware, the processor may be a logic circuit, an integrated circuit, or the like; when implemented by software, the processor may be a general-purpose processor implemented by reading software code stored in a memory, where the memory may be integrated in the processor or may be located outside the processor and exist independently.

According to a sixth aspect, a computer program product is provided. The computer program product includes a computer program (which may also be referred to as code or instructions) that, when run, causes a computer to perform the method in any possible implementation of the first aspect.

According to a seventh aspect, a computer-readable storage medium is provided. The computer-readable storage medium stores a computer program (which may also be referred to as code or instructions) that, when run on a computer, causes the computer to perform the method in any possible implementation of the first aspect.
Brief Description of Drawings

FIG. 1 is a schematic diagram of the image display principle;
FIG. 2 is a schematic flowchart of an image display method according to an embodiment of this application;
FIG. 3 is a schematic diagram of the image uniformity corresponding to an image display method according to an embodiment of this application;
FIG. 4 is a schematic flowchart of another image display method according to an embodiment of this application;
FIG. 5 is a schematic diagram of the weight distribution of a plurality of regions according to an embodiment of this application;
FIG. 6 is a schematic flowchart of still another image display method according to an embodiment of this application;
FIG. 7 is a schematic flowchart of yet another image display method according to an embodiment of this application;
FIG. 8 is a schematic block diagram of an image display apparatus according to an embodiment of this application;
FIG. 9 is a schematic block diagram of another image display apparatus according to an embodiment of this application;
FIG. 10 is a schematic block diagram of still another image display apparatus according to an embodiment of this application.
Detailed Description

The technical solutions of this application are described below with reference to the accompanying drawings.

For ease of understanding, related terms used in the embodiments of this application are introduced first.

1. Diffractive waveguide

An optical waveguide is a medium that guides light waves propagating within it, also called a dielectric optical waveguide.

A diffractive optical waveguide is an optical waveguide that includes a diffraction grating; it uses the diffraction of light together with the total internal reflection of the waveguide medium to transmit the imaging beam. Diffractive optical waveguides mainly fall into two types: surface-relief grating waveguides fabricated by photolithography, and volume holographic grating waveguides fabricated by holographic interference.

The diffraction grating is the core element of a diffractive optical waveguide. A diffraction grating is an optical element with a periodic structure; the period may consist of peaks and valleys embossed in relief on the material surface, or of "bright-dark interference fringes" formed by holographic exposure inside the material. The grating's function is to induce a periodic variation of the refractive index in the material. The period is generally at the micro- or nanometer scale, on the same order as visible-light wavelengths (400-700 nm), so it can act effectively on light.

2. Light engine

A light engine may include components such as a light source, a lens group, a display panel, and a lens. Light emitted by the light source is transmitted through the lens group to the display panel to form an image, and the image formed on the display panel serves as the image source; that is, the display panel is the image plane of the light source. Because the display panel is the image plane of the light source, the brightness distribution of the image formed on the display panel corresponds one-to-one to the brightness distribution of the light source.

The light source may include an LED light source, an LD light source, or another type of light source; the display panel may include an LCOS panel, a DLP panel, or another type of panel, which is not limited in this application.

Taking the LED light source as an example, it may include multiple LEDs, that is, an LED array; hence an LED light source may also be called an LED array light source. The array may be of any size, for example a 5×8 rectangle or a 5×5 square, which is not limited in this application. The LED array may be located on an LED substrate and together with it form an LED module; the module may further include a diffuser and a brightness enhancement film (BEF) to reduce the divergence angle of the LED array.
3. Image display

Image display is realized through the light engine and the diffractive optical waveguide. Specifically, light emitted by the light source is transmitted through the lens group to the display panel to form an image, modulated by the lens into an exit pupil, input into the diffractive optical waveguide, transmitted by diffraction and total internal reflection to the diffraction grating for pupil expansion, and finally output from the waveguide to display the image.

Simply put, the light engine can be understood as a projector, and the diffractive optical waveguide as the component responsible for delivering the projector's image to the human eye. It should be understood that the waveguide neither magnifies nor shrinks the image while transmitting it.

4. Image uniformity

Image uniformity refers to the degree of difference between pixels of the image at different positions on the display; it can be measured by parameters such as brightness and color. Brightness, also called lightness, represents how light or dark a color is. Brightness and color are correlated to some degree: when brightness changes, color changes as well.

5. Eye box

The eye box is a concept specific to AR HMD devices: a cone-shaped region between the display part of the device and the eyeball, and also the region where the displayed content is clearest. Outside this region, problems such as color errors, incomplete content, or even no content at all may occur.

6. Eye tracking

Eye tracking technology tracks the trajectory of the human eye by measuring the position of the eye's gaze point or the movement of the eye relative to the head. Specifically, image processing can locate the pupil and obtain the coordinates of the pupil center, and algorithms compute the gaze point of the eye, thereby tracking the eye's trajectory.

For example, a pupil-corneal reflection tracking method may be used: an eye camera captures images of the eye, image processing yields the pupil center position, and the corneal reflection point serves as the reference for the relative position of the camera and the eye. From the pupil center position obtained by image processing, the gaze vector coordinates of the eye can be derived, the gaze point determined, and the eye's trajectory tracked.
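As a minimal sketch of the pupil-corneal reflection idea, assuming a calibrated eye camera and omitting the calibration that maps image-plane offsets to gaze points on the eye box, the Python helper below (the function name and simplification are assumptions, not part of the disclosure) approximates the gaze direction in the camera image plane:

    import numpy as np

    def gaze_vector(pupil_center, glint):
        """Normalized offset of the pupil center from the corneal
        reflection (glint); a calibrated mapping would then convert
        this image-plane vector into a gaze point on the eye box."""
        v = np.asarray(pupil_center, dtype=float) - np.asarray(glint, dtype=float)
        n = np.linalg.norm(v)
        return v / n if n else v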
FIG. 1 is a schematic diagram of the image display principle. FIG. 1 includes a light engine 101, a diffractive optical waveguide 102, and a human eye 106, where the waveguide 102 includes an input grating 103, a diffraction grating 104, and an output grating 105.

Light emitted by the light source in the light engine 101 forms an image on the display panel, is modulated by the lens into an exit pupil, and enters the waveguide 102. The input grating 103 receives the optical signal and transmits it by total internal reflection to the diffraction grating 104; the grating 104 expands the pupil of the received signal and transmits it, again by total internal reflection, to the output grating 105, which outputs the signal and projects it to the human eye 106 for imaging.

It should be understood that pupil expansion here refers to converting a low field-of-view input optical signal into a high field-of-view output signal, including horizontal and vertical pupil expansion. The field of view (FOV) is measured as the angle between the edge of the display and the line to the human eye, and includes a horizontal FOV and a vertical FOV.

In FIG. 1, the diffraction grating 104 expands the pupil of the incident light, splitting it into light of several diffraction orders and dispersing it. A diffraction grating may be one-dimensional or two-dimensional. To avoid ghost images and stray light, diffraction gratings generally use only first-order diffraction, positive and negative: for example, the diffraction order of a one-dimensional grating may be +1 or -1, and those of a two-dimensional grating may include (-1,1), (-1,-1), (1,1), (1,-1), (1,0), (-1,0), (0,-1), and (0,1).

After the grating 104 splits the incident light into several diffraction orders, light of each order continues to propagate in the waveguide 102 along a different direction. In this respect, the grating mainly changes the propagation direction of the incident light. By optimizing parameters such as the refractive index of the grating material and the grating's shape, thickness, and duty cycle, the diffraction efficiency of a certain order (that is, a certain direction) can be maximized, so that after diffraction most of the light propagates mainly along that direction, reducing the loss of light in the other diffraction directions.

As for dispersion, the grating 104 produces different diffraction angles for incident light of different wavelengths: the longer the wavelength, the larger the diffraction angle. For example, suppose the incident light includes red, green, and blue components; since the wavelength of red > green > blue, the diffraction angle of red > green > blue. With different diffraction angles, the path length traveled per total internal reflection also differs, so the number of total internal reflections of red < green < blue. Because of these differences, the image finally displayed through the waveguide 102 from red-green-blue incident light exhibits non-uniform brightness and color.
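As an illustrative note from standard diffraction theory, not part of the original disclosure, the wavelength dependence above follows from the grating equation for a grating of period $d$ and diffraction order $m$:

    $n_2 \sin\theta_m - n_1 \sin\theta_i = \dfrac{m\lambda}{d}$

where $n_1$ and $n_2$ are the refractive indices on the incidence and diffraction sides, $\theta_i$ is the incidence angle, and $\theta_m$ the diffraction angle. For fixed $m$, $d$, and $\theta_i$, $\sin\theta_m$ grows with the wavelength $\lambda$, which is why red diffracts at a larger angle than green and green than blue; the larger angle in turn lengthens the in-waveguide path per total internal reflection, giving red the fewest reflections.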
Therefore, when incident light of different colors propagates in the diffractive optical waveguide, it travels at different angles and along different paths, resulting in poor image uniformity and degrading the user experience.

In view of this, this application provides an image display method that compensates the brightness of the image displayed by the diffractive optical waveguide by adjusting the brightness distribution of the light source in the light engine, improving the uniformity of the image displayed through the waveguide and thereby improving user experience.

It should be understood that the image display method in the embodiments of this application can be applied to an image display device that includes a light engine and a diffractive optical waveguide, for example, an AR helmet, AR glasses, a mobile phone, a tablet, a computer, a vehicle head-up display (HUD), an AR smart interactive device, or smart glasses, which is not limited in the embodiments of this application.

Before the image display method and image display apparatus provided in the embodiments of this application are described, the following points are noted.

First, in the embodiments shown below, terms and English abbreviations such as uniformity data, uniformity sub-data, and compensation data are examples given for ease of description and shall not constitute any limitation on this application. This application does not exclude the possibility that other terms achieving the same or similar functions are defined in existing or future protocols.

Second, "first", "second", and the various ordinal numbers in the embodiments below are used merely for distinction and ease of description, for example to distinguish different images or different regions, and are not intended to limit the scope of the embodiments of this application.

Third, "at least one" means one or more, and "a plurality of" means two or more. "And/or" describes an association between associated objects and indicates three possible relationships; for example, A and/or B may mean: A alone, both A and B, or B alone, where A and B may be singular or plural. The character "/" generally indicates an "or" relationship between the associated objects. "At least one of the following" or a similar expression refers to any combination of these items, including a single item or any combination of multiple items. For example, at least one of a, b, and c may mean: a, or b, or c, or a and b, or a and c, or b and c, or a, b, and c, where each of a, b, and c may be singular or plural.

The image display method and image display apparatus provided in this application are described in detail below with reference to the accompanying drawings.
FIG. 2 is a schematic flowchart of an image display method 200 according to an embodiment of this application. The method 200 may be performed by an image display device and includes the following steps:

S201: Obtain uniformity data of a first image obtained through the diffractive optical waveguide. It should be understood that the first image may be an image output through the diffractive optical waveguide alone. Because incident light of different colors propagates through the waveguide at different angles and along different paths, the uniformity of the first image is poor.

S202: Determine to-be-compensated data of the light engine based on the uniformity data.

S203: Adjust the brightness distribution of the light source in the light engine based on the to-be-compensated data. Because the light source may include an LED array light source or an LD light source, this embodiment may adjust the brightness distribution of either type.

Because the to-be-compensated data is determined from the uniformity data of the first image, and the uniformity data of the first image reflects the degree of uniformity of the image displayed by the waveguide, the to-be-compensated data can compensate the uniformity of the image displayed by the diffractive optical waveguide.

S204: Display a second image through the adjusted light source and the diffractive optical waveguide. It should be understood that the second image is the image displayed after the brightness of the waveguide output has been compensated; because of the compensation, the second image has higher uniformity than the first image.

In the image display method of the embodiments of this application, the brightness of the image displayed by the diffractive optical waveguide is compensated by adjusting the brightness distribution of the light source in the light engine, which improves the uniformity of the displayed image and thereby improves user experience.

It should be understood that in the embodiments of this application, image uniformity includes both brightness uniformity and color uniformity. Brightness and color are correlated: when brightness changes, color changes as well. Therefore, by adjusting the brightness distribution of the light source in the light engine, the embodiments of this application in effect adjust the brightness and color uniformity of the image presented by the light engine, so that the uniformity of the image output by the light engine and that of the image output by the diffractive optical waveguide complement each other, finally presenting an image with high uniformity.

The principle of the embodiments of this application is described below with reference to FIG. 3.

FIG. 3 is a schematic diagram of the image uniformity corresponding to the method 200, where the image obtained through the diffractive optical waveguide is image 1 (corresponding to the first image above), the image formed by the light engine is image 2, and the image displayed through the waveguide and the light engine together is image 3 (corresponding to the second image above). In FIG. 3, images 1, 2, and 3 are each divided into 4×4 regions, and each region has its own brightness value, thus showing the brightness distribution of the image. The brightness distributions of images 1, 2, and 3 are indicated by the line patterns in FIG. 3, with different patterns representing different brightness values. As FIG. 3 shows, the brightness distributions of image 1 and image 2 are each non-uniform, while the brightness distribution of image 3, obtained by superimposing images 1 and 2, is uniform. In this example, image 1 is divided into 4×4 regions, so image 2 is also divided into 4×4 regions, and the regions of image 1 correspond one-to-one to the regions of image 2.
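A minimal numerical sketch of this complementarity, assuming a multiplicative superposition model and arbitrary example brightness values (neither the model nor the numbers is specified in the disclosure):

    import numpy as np

    def engine_gain(waveguide_map, target=None):
        """Per-region brightness gain for image 2 so that the superimposed
        image 3 becomes uniform, under a multiplicative model."""
        waveguide_map = np.asarray(waveguide_map, dtype=float)
        if target is None:
            target = waveguide_map.max()  # brightest region sets the target
        return target / waveguide_map

    image1 = np.array([[20, 25, 30, 28],
                       [22, 35, 40, 30],
                       [24, 38, 42, 32],
                       [21, 26, 31, 27]], dtype=float)  # measured 4x4 map

    gain = engine_gain(image1)   # complementary brightness for image 2
    image3 = image1 * gain       # every region now equals 42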
As an optional embodiment, the uniformity data of the first image obtained through the diffractive optical waveguide may be obtained in multiple different ways; that is, S201 may be implemented in multiple different ways. Two possible implementations are described below.

In a first possible implementation, as shown in FIG. 4, S201 may include the following steps:

S401: Test the uniformity of the first image corresponding to a plurality of regions to obtain a plurality of uniformity sub-data.

S402: Determine the uniformity data of the first image based on the plurality of uniformity sub-data and the weights of the plurality of regions.

Because the human eye can move, the eyes of the same user may be at different positions within the eye box at different moments, and the eyes of different users will also be at different positions in the eye box relative to the same device. Therefore, the plurality of regions in the embodiments of this application may include regions obtained by dividing the eye box. In different regions, the human eye sees the first image with different degrees of uniformity; the embodiments of this application use uniformity sub-data to represent the degree of uniformity of the first image in the different regions.

In the embodiments of this application, each of the plurality of regions has its own weight; some regions may have the same weight and some may have different weights, which is not limited in the embodiments of this application.

In a possible implementation, considering that the proportion of users corresponding to the central region is relatively high while the proportion corresponding to the edge regions is relatively low, the central region of the plurality of regions has the highest weight and the edge regions have the lowest weight. This better matches actual usage and helps obtain more accurate uniformity data.

For example, as shown in FIG. 5, the eye box may be divided into 3×5 regions, so the plurality of regions includes 15 regions, denoted region 1, region 2, ..., region 15; the human eye may be located in any of the 15 regions. The plurality of uniformity sub-data includes the 15 uniformity sub-data corresponding to these 15 regions. Region 8 has weight A; regions 3, 7, 9, and 13 have weight B; regions 2, 4, 6, 10, 12, and 14 have weight C; and regions 1, 5, 11, and 15 have weight D. A through D represent the magnitude of the weights: A is the highest, B and C follow, and D is the lowest. For example, A may be set to 60%; the weights of the regions with weight B may sum to 30%, i.e., B is 7.5%; the weights of the regions with weight C may sum to 20%, i.e., C is 3.3%; and the weights of the regions with weight D may sum to 10%, i.e., D is 2.5%. However, the embodiments of this application are not limited thereto.

Optionally, the uniformity data of the first image may be obtained as a weighted sum of the uniformity sub-data corresponding to the plurality of regions.

In the above example, the uniformity data of the first image may be the sum of the uniformity sub-data corresponding to region 1 × 2.5%, the uniformity sub-data corresponding to region 2 × 3.3%, ..., and the uniformity sub-data corresponding to region 15 × 2.5%.
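A short sketch of this weighted sum, laying the example weights of FIG. 5 out on the 3×5 grid; the function name is an assumption, and the percentages are taken verbatim from the example above (note that the example quotes them without normalization):

    import numpy as np

    # FIG. 5 example weights on the 3x5 eye-box grid (regions 1-15, row by row)
    A, B, C, D = 0.60, 0.075, 0.033, 0.025
    weights = np.array([[D, C, B, C, D],
                        [C, B, A, B, C],
                        [D, C, B, C, D]])

    def uniformity_data(sub_data):
        """S402: weighted sum of the 15 per-region uniformity sub-data."""
        return float((np.asarray(sub_data, dtype=float) * weights).sum())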
In the image display method of the embodiments of this application, through region division, the uniformity data of the first image can be determined from the uniformity sub-data corresponding to the divided regions and the weights of those regions; the to-be-compensated data is then determined from the uniformity data, the brightness distribution of the light source in the light engine is adjusted accordingly, and the second image is displayed through the light engine and the diffractive optical waveguide. The method of the embodiments of this application takes into account every region where the human eye may be located, so that the uniformity of the displayed image improves in the different regions gazed at by different users, or by the same user at different moments, thereby improving user experience.

In a second possible implementation, as shown in FIG. 6, S201 may include the following steps:

S601: Determine a target region based on eye tracking.

S602: Obtain the uniformity sub-data of the first image corresponding to the target region.

S603: Determine the uniformity sub-data of the first image corresponding to the target region as the uniformity data of the first image.

Specifically, the image display device may first determine the relative position of the pupil based on eye tracking and thereby determine the eye's gaze point. The region in which the gaze point is located is the target region. In the embodiments of this application, the uniformity sub-data of the first image corresponding to the target region is the uniformity data of the first image.

In the image display method of the embodiments of this application, eye tracking is used to track the position of the human eye in real time, the uniformity data of the first image is obtained from that position, the brightness distribution of the light source in the light engine is then adjusted in real time, and the second image is displayed through the light engine and the diffractive optical waveguide. The method flexibly adapts to different users: based on the current user's eye position, it determines the image uniformity of the target region corresponding to that position and compensates on that basis, providing a good user experience.

As an optional embodiment, the uniformity sub-data of the first image corresponding to the target region may be obtained in multiple different ways, which is not limited in the embodiments of this application.

Manner 1: After the image display device determines the target region by eye tracking, it may measure the uniformity sub-data of the first image corresponding to the target region in real time.

Manner 2: After the image display device determines the target region by eye tracking, it may select, from a plurality of uniformity sub-data of the first image corresponding to a plurality of regions, the uniformity sub-data of the first image corresponding to the target region, where the plurality of regions includes the target region.

It should be understood that before determining the target region, the image display device has already performed S401: it has tested the uniformity of the first image corresponding to the plurality of regions, obtained the plurality of uniformity sub-data, and stored them. In this case, once the target region is determined, the device can directly select the sub-data corresponding to the target region from the stored plurality of sub-data.

For example, in FIG. 5, the image display device may determine by eye tracking that the target region is region 7; it may then select the uniformity sub-data corresponding to region 7 from the 15 sub-data corresponding to the 15 regions and determine it as the uniformity data of the first image.
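A trivial sketch of Manner 2 as a table lookup; the helper name and the stored values are assumptions used only for illustration:

    def sub_data_for(target_region, stored_sub_data):
        """Manner 2: select the pre-measured uniformity sub-data of the
        target region from the stored per-region measurements."""
        return stored_sub_data[target_region]

    stored = {region: 20.0 + region for region in range(1, 16)}  # nits, made up
    uniformity = sub_data_for(7, stored)  # region 7 -> 27.0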
Comparing the two possible manners: manner 1 requires neither measuring the image uniformity of multiple regions in advance nor storing multiple uniformity sub-data, which helps save device power and memory; manner 2 avoids having to measure the target region's image uniformity after the target region is determined, which shortens the display latency and improves display efficiency.

As an optional embodiment, before the to-be-compensated data of the light engine is determined based on the uniformity data, that is, before S202, the method 200 further includes: obtaining the brightness value of the environment in which the image display device is located. Determining the to-be-compensated data of the light engine based on the uniformity data then includes: determining the to-be-compensated data based on the uniformity data of the first image and the brightness value.

For example, the image display device may measure the brightness value of its current environment with an ambient light sensor. If the uniformity data of the first image is 20 nits and the ambient brightness value is 100 nits, the to-be-compensated data may be 100 divided by 20, which equals 5.
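A one-function sketch of this quotient; the function name and the error check are assumptions, and the arithmetic reproduces the worked example above:

    def to_be_compensated(uniformity_nits, ambient_nits):
        """To-be-compensated data = ambient brightness / uniformity data."""
        if uniformity_nits <= 0:
            raise ValueError("uniformity data must be positive")
        return ambient_nits / uniformity_nits

    assert to_be_compensated(20.0, 100.0) == 5.0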
As an optional embodiment, adjusting the brightness distribution of the light source in the light engine based on the to-be-compensated data, that is, S203, includes: determining a target current value based on the to-be-compensated data; and adjusting the brightness distribution of the light source based on the target current value.

It should be understood that because a larger current makes the light source brighter and a smaller current makes it dimmer, the embodiments of this application can adjust the brightness distribution of the light source in the light engine based on the target current value determined from the to-be-compensated data.

For example, if the light source is an LED array light source in which different LEDs have different brightness, the image display device may first determine the relationship between these brightness levels and current values, then determine the brightness required for each LED in the array, and convert each LED's required brightness into the current value that LED requires, which is the target current value.
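A sketch of that brightness-to-current conversion, assuming a made-up calibration table for one LED type; a real device would use measured, possibly per-LED, calibration data:

    import numpy as np

    cal_current_mA = np.array([2.0, 5.0, 10.0, 20.0, 40.0])        # drive currents
    cal_brightness = np.array([30.0, 80.0, 170.0, 350.0, 700.0])   # measured nits

    def target_currents(required_brightness):
        """Invert the calibration: per-LED required brightness -> target
        drive current, by linear interpolation between calibration points."""
        return np.interp(required_brightness, cal_brightness, cal_current_mA)

    needed = np.full(40, 200.0)          # e.g. a 5x8 array, flattened, in nits
    currents = target_currents(needed)   # target current per LED, in mA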
For a better understanding of the image display method proposed in this application, the method is described in detail below with reference to FIG. 7.

FIG. 7 shows a schematic flowchart of another image display method 700 proposed in this application. The method may be performed by an image display device and may include the following steps:

S701: Obtain a first image through the diffractive optical waveguide.

S702: Test the uniformity of the first image corresponding to a plurality of regions to obtain a plurality of uniformity sub-data.

Optionally, S703: Determine a target region from the plurality of regions by eye tracking.

S704: Determine the uniformity data of the first image.

It should be understood that S704 includes two possible implementations.

In the first possible implementation, the image display apparatus may perform S703 and then S704; that is, the image display device determines the target region and determines the uniformity sub-data of the first image corresponding to that target region as the uniformity data of the first image.

In the second possible implementation, the image display apparatus may skip S703 and perform S704 directly after S702; that is, the image display device may determine the uniformity data of the first image based on the uniformity sub-data of the first image corresponding to the plurality of regions and the weights of those regions.

S705: Obtain the brightness value of the environment in which the image display device is located.

S706: Determine the to-be-compensated data of the light engine based on the uniformity data of the first image and the obtained brightness value.

S707: Determine a target current value based on the to-be-compensated data of the light engine.

S708: Adjust the brightness distribution of the light source of the light engine based on the target current value.

S709: Display a second image through the adjusted light engine and the diffractive optical waveguide.

Optionally, in the first possible implementation above, that is, when the uniformity data of the first image is the uniformity data of the target region determined by eye tracking, the method 700 may further include: S710: Determine whether the second image achieves the expected effect.

Specifically, the image display apparatus may test the uniformity of the second image to obtain uniformity data of the second image, and then determine, based on that uniformity data and the brightness value of the environment in which the device is located, whether the uniformity of the second image achieves the expected effect. If it does, the current procedure ends and the second image is output to the user; if it does not, S703 to S710 are repeated until the expected effect is achieved.
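A compact sketch of this closed loop; every device.* helper, and the acceptance test against expected_nits, is a hypothetical stand-in rather than an API from the disclosure:

    def display_loop(device, ambient_nits, expected_nits, max_iters=10):
        """Iterate S703-S710 until the second image reaches the expected
        uniformity or the iteration budget runs out."""
        for _ in range(max_iters):
            region = device.eye_tracking_target_region()      # S703
            u = device.uniformity_sub_data(region)            # S704
            compensation = ambient_nits / u                   # S705-S706
            currents = device.currents_for(compensation)      # S707
            device.drive_light_source(currents)               # S708
            u2 = device.measure_second_image_uniformity()     # S709-S710
            if u2 >= expected_nits:                           # expected effect?
                return True
        return False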
It should be understood that S701, S702, and S704 to S709 constitute one image display method. Through region division, this method determines the uniformity data of the first image from the uniformity sub-data corresponding to the divided regions and the weights of those regions, determines the to-be-compensated data from that uniformity data, adjusts the brightness distribution of the light source of the light engine accordingly, and displays the second image through the light engine and the diffractive optical waveguide. This method takes into account every region where the human eye may be located, so that the uniformity of the displayed image improves in the different regions gazed at by different users, or by the same user at different moments, thereby improving user experience.

S701 to S710 constitute another image display method. This method uses eye tracking to track the position of the human eye in real time, obtains the uniformity data of the first image from that position, adjusts the brightness distribution of the light source in the light engine in real time, displays the second image through the light engine and the diffractive optical waveguide, and checks whether the uniformity of the second image achieves the expected effect; if not, the method is repeated until it does. This method flexibly adapts to different users: based on the current user's eye position, it determines the image uniformity of the corresponding target region, compensates on that basis, and checks the compensation result until the expected effect is achieved, thereby providing a good user experience.

It should be understood that the sequence numbers of the foregoing processes do not imply an order of execution; the execution order of the processes should be determined by their functions and internal logic, and shall not constitute any limitation on the implementation processes of the embodiments of this application.

The image display method according to the embodiments of this application has been described in detail above with reference to FIG. 2 to FIG. 7; the image display apparatus according to the embodiments of this application is described in detail below with reference to FIG. 8 to FIG. 10.

FIG. 8 shows an image display apparatus 800 according to an embodiment of this application. The apparatus 800 includes a processing unit 810 and a display unit 820, where the display unit 820 includes a light engine and a diffractive optical waveguide.

The processing unit 810 is configured to: obtain uniformity data of a first image obtained based on the light engine and the diffractive optical waveguide; determine, based on the uniformity data, to-be-compensated data of the light engine; and adjust, based on the to-be-compensated data, the brightness distribution of the light source in the light engine. The display unit 820 is configured to: display a second image based on the adjusted light source and the diffractive optical waveguide.

Optionally, the processing unit 810 is specifically configured to: test the uniformity of the first image corresponding to a plurality of regions to obtain a plurality of uniformity sub-data; and determine the uniformity data based on the plurality of uniformity sub-data and the weights of the plurality of regions.

Optionally, the processing unit 810 is further configured to: determine a target region based on eye tracking; obtain the uniformity sub-data of the first image corresponding to the target region; and determine the uniformity sub-data of the first image corresponding to the target region as the uniformity data.

Optionally, the processing unit 810 is specifically configured to: select, from a plurality of uniformity sub-data of the first image corresponding to a plurality of regions, the uniformity sub-data of the first image corresponding to the target region, where the plurality of regions includes the target region.

Optionally, the processing unit 810 is further configured to: obtain the brightness value of the environment in which the apparatus is located; and determine the to-be-compensated data based on the uniformity data and the brightness value.

Optionally, the processing unit 810 is specifically configured to: determine a target current value based on the to-be-compensated data; and adjust the brightness distribution of the light source based on the target current value.

It should be understood that the apparatus 800 here is embodied in the form of functional modules. The term "unit" here may refer to an application-specific integrated circuit (ASIC), an electronic circuit, a processor (such as a shared processor, a dedicated processor, or a group processor) and memory for executing one or more software or firmware programs, a merged logic circuit, and/or other suitable components that support the described functions. In an optional example, a person skilled in the art may understand that the apparatus 800 may specifically be the image display device in the foregoing embodiments, or the functions of the image display device in the foregoing embodiments may be integrated in the apparatus 800; the apparatus 800 may be configured to perform the procedures and/or steps corresponding to the image display device in the foregoing method embodiments. To avoid repetition, details are not described here again.

The apparatus 800 has the function of implementing the corresponding steps performed by the image display device in the foregoing method; the function may be implemented by hardware, or by hardware executing corresponding software. The hardware or software includes one or more modules corresponding to the foregoing function.

In the embodiments of this application, the apparatus 800 in FIG. 8 may also be a chip or a chip system, for example, a system on chip (SoC).
FIG. 9 shows another image display apparatus 900 according to an embodiment of this application. The apparatus 900 includes a processor 910, a light engine 920, and a diffractive optical waveguide 930. The processor 910 is configured to: obtain uniformity data of a first image obtained based on the light engine 920 and the diffractive optical waveguide 930; determine, based on the uniformity data, to-be-compensated data of the light engine 920; and adjust, based on the to-be-compensated data, the brightness distribution of the light source in the light engine 920. The light engine 920 and the diffractive optical waveguide 930 are configured to: display a second image based on the adjusted light source.

It should be understood that the apparatus 900 may specifically be the image display device in the foregoing embodiments, or the functions of the image display device in the foregoing embodiments may be integrated in the apparatus 900; the apparatus 900 may be configured to perform the steps and/or procedures corresponding to the image display device in the foregoing method embodiments.

Further, the light engine may include a light source and a display panel. FIG. 10 shows still another image display apparatus 1000 according to an embodiment of this application. The apparatus 1000 includes a memory 1010, a processor 1020, a controller 1030, a driver 1040, a light source 1050, a display panel 1060, and a diffractive optical waveguide 1070.

The foregoing components may be connected through internal paths. The memory 1010 is configured to store data (for example, the uniformity data of the first image) and instructions; the processor 1020 is configured to execute the instructions stored in the memory 1010 to perform the preprocessing operation, that is, to determine the to-be-compensated data of the light source 1050 based on the uniformity data of the first image, so as to control, through the controller 1030, the driver 1040 to adjust the brightness distribution of the light source 1050. Light emitted by the adjusted light source 1050 is transmitted to the display panel 1060 to form an image and then enters the human eye through the diffractive optical waveguide 1070. In this embodiment, the image the user sees is the second image displayed through the adjusted light source and the diffractive optical waveguide. For related details, refer to the descriptions of the foregoing method embodiments; details are not described here again.

It should be understood that the apparatus 1000 may specifically be the image display device in the foregoing embodiments, or the functions of the image display device in the foregoing embodiments may be integrated in the apparatus 1000; the apparatus 1000 may be configured to perform the steps and/or procedures corresponding to the image display device in the foregoing method embodiments.

Optionally, the memory may include a read-only memory and a random access memory, and provide instructions and data to the processor. A part of the memory may further include a non-volatile random access memory. For example, the memory may further store information about the device type. The processor may be configured to execute the instructions stored in the memory, and when executing the instructions, the processor may perform the steps and/or procedures corresponding to the image display device in the foregoing method embodiments.

It should be understood that in the embodiments of this application, the processor may be a central processing unit (CPU), or another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or another programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. The general-purpose processor may be a microprocessor, or the processor may be any conventional processor. The controller may be a microcontroller unit (MCU).
In an implementation process, the steps of the foregoing method may be completed by an integrated logic circuit of hardware in the processor or by instructions in the form of software. The steps of the method disclosed with reference to the embodiments of this application may be directly embodied as being performed by a hardware processor, or performed by a combination of hardware and software modules in the processor. The software module may be located in a storage medium mature in the art, such as a random access memory, a flash memory, a read-only memory, a programmable read-only memory, an electrically erasable programmable memory, or a register. The storage medium is located in the memory; the processor executes the instructions in the memory and completes the steps of the foregoing method in combination with its hardware. To avoid repetition, details are not described here again.

A person of ordinary skill in the art may be aware that the units and algorithm steps of the examples described with reference to the embodiments disclosed herein can be implemented by electronic hardware or a combination of computer software and electronic hardware. Whether the functions are performed by hardware or software depends on the particular application and design constraints of the technical solution. A person skilled in the art may use different methods to implement the described functions for each particular application, but such implementation shall not be considered to go beyond the scope of this application.

A person skilled in the art may clearly understand that, for convenience and brevity of description, for the specific working processes of the systems, apparatuses, and units described above, reference may be made to the corresponding processes in the foregoing method embodiments; details are not described here again.

In the several embodiments provided in this application, it should be understood that the disclosed systems, apparatuses, and methods may be implemented in other manners. For example, the apparatus embodiments described above are merely illustrative; for example, the division of the units is merely a logical function division, and there may be other division manners in actual implementation; for example, multiple units or components may be combined or integrated into another system, or some features may be ignored or not performed. In addition, the mutual couplings, direct couplings, or communication connections shown or discussed may be implemented through some interfaces; the indirect couplings or communication connections between apparatuses or units may be electrical, mechanical, or in other forms.

The units described as separate components may or may not be physically separate, and components displayed as units may or may not be physical units; that is, they may be located in one place or distributed over multiple network units. Some or all of the units may be selected according to actual needs to achieve the objectives of the solutions of the embodiments.

In addition, the functional units in the embodiments of this application may be integrated into one processing unit, or each of the units may exist alone physically, or two or more units may be integrated into one unit.

When the functions are implemented in the form of software functional units and sold or used as independent products, they may be stored in a computer-readable storage medium. Based on such an understanding, the technical solutions of this application essentially, or the part contributing to the prior art, or part of the technical solutions, may be embodied in the form of a software product. The computer software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to perform all or some of the steps of the methods described in the embodiments of this application. The foregoing storage medium includes any medium that can store program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.

The foregoing descriptions are merely specific implementations of this application, but the protection scope of this application is not limited thereto. Any variation or replacement readily figured out by a person skilled in the art within the technical scope disclosed in this application shall fall within the protection scope of this application. Therefore, the protection scope of this application shall be subject to the protection scope of the claims.

Claims (15)

  1. An image display method, applied to an apparatus comprising a light engine and a diffractive optical waveguide, the method comprising:
    obtaining uniformity data of a first image obtained through the diffractive optical waveguide;
    determining, based on the uniformity data, to-be-compensated data of the light engine;
    adjusting, based on the to-be-compensated data, a brightness distribution of a light source in the light engine; and
    displaying a second image through the adjusted light engine and the diffractive optical waveguide.
  2. The method according to claim 1, wherein the obtaining uniformity data of the first image displayed through the diffractive optical waveguide comprises:
    testing uniformity of the first image corresponding to a plurality of regions to obtain a plurality of uniformity sub-data; and
    determining the uniformity data based on the plurality of uniformity sub-data and weights of the plurality of regions.
  3. The method according to claim 1, wherein before the obtaining uniformity data of the first image displayed through the diffractive optical waveguide, the method further comprises:
    determining a target region based on eye tracking; and
    the obtaining uniformity data of the first image displayed through the diffractive optical waveguide comprises:
    obtaining uniformity sub-data of the first image corresponding to the target region; and
    determining the uniformity sub-data of the first image corresponding to the target region as the uniformity data.
  4. The method according to claim 3, wherein the obtaining uniformity sub-data of the first image corresponding to the target region comprises:
    selecting, from a plurality of uniformity sub-data of the first image corresponding to a plurality of regions, the uniformity sub-data of the first image corresponding to the target region, wherein the plurality of regions comprises the target region.
  5. The method according to any one of claims 1 to 4, wherein before the determining, based on the uniformity data, to-be-compensated data of the light engine, the method further comprises:
    obtaining a brightness value of an environment in which the apparatus is located; and
    the determining, based on the uniformity data, to-be-compensated data of the light engine comprises:
    determining the to-be-compensated data based on the uniformity data and the brightness value.
  6. The method according to any one of claims 1 to 5, wherein the adjusting, based on the to-be-compensated data, the brightness distribution of the light source in the light engine comprises:
    determining a target current value based on the to-be-compensated data; and
    adjusting the brightness distribution of the light source based on the target current value.
  7. An image display apparatus, comprising:
    a processor, a light engine, and a diffractive optical waveguide;
    wherein the processor is configured to: obtain uniformity data of a first image obtained based on the light engine and the diffractive optical waveguide; determine, based on the uniformity data, to-be-compensated data of the light engine; and adjust, based on the to-be-compensated data, a brightness distribution of a light source in the light engine; and
    the light engine and the diffractive optical waveguide are configured to: display a second image based on the adjusted light source.
  8. The apparatus according to claim 7, wherein the processor is specifically configured to:
    test uniformity of the first image corresponding to a plurality of regions to obtain a plurality of uniformity sub-data; and
    determine the uniformity data based on the plurality of uniformity sub-data and weights of the plurality of regions.
  9. The apparatus according to claim 7, wherein the processor is further configured to:
    determine a target region based on eye tracking;
    obtain uniformity sub-data of the first image corresponding to the target region; and
    determine the uniformity sub-data of the first image corresponding to the target region as the uniformity data.
  10. The apparatus according to claim 9, wherein the processor is specifically configured to:
    select, from a plurality of uniformity sub-data of the first image corresponding to a plurality of regions, the uniformity sub-data of the first image corresponding to the target region, wherein the plurality of regions comprises the target region.
  11. The apparatus according to any one of claims 7 to 10, wherein the processor is further configured to:
    obtain a brightness value of an environment in which the apparatus is located; and
    determine the to-be-compensated data based on the uniformity data and the brightness value.
  12. The apparatus according to any one of claims 7 to 11, wherein the processor is specifically configured to:
    determine a target current value based on the to-be-compensated data; and
    adjust the brightness distribution of the light source based on the target current value.
  13. An image display apparatus, comprising: a processor, wherein the processor is coupled to a memory, the memory is configured to store a computer program, and when the processor invokes the computer program, the apparatus is caused to perform the method according to any one of claims 1 to 6.
  14. A computer-readable storage medium, configured to store a computer program, wherein the computer program comprises instructions for implementing the method according to any one of claims 1 to 6.
  15. A computer program product, comprising computer program code, wherein when the computer program code is run on a computer, the computer is caused to implement the method according to any one of claims 1 to 6.
PCT/CN2020/117141 2020-09-23 2020-09-23 Image display method and image display apparatus WO2022061584A1 (zh)

Priority Applications (3)

Application Number Priority Date Filing Date Title
PCT/CN2020/117141 WO2022061584A1 (zh) 2020-09-23 2020-09-23 Image display method and image display apparatus
CN202080101879.2A CN115698821A (zh) 2020-09-23 2020-09-23 Image display method and image display apparatus
US18/187,728 US20230221554A1 (en) 2020-09-23 2023-03-22 Image display method and image display apparatus

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2020/117141 WO2022061584A1 (zh) 2020-09-23 2020-09-23 Image display method and image display apparatus

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US18/187,728 Continuation US20230221554A1 (en) 2020-09-23 2023-03-22 Image display method and image display apparatus

Publications (1)

Publication Number Publication Date
WO2022061584A1 true WO2022061584A1 (zh) 2022-03-31

Family

ID=80844543

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/117141 WO2022061584A1 (zh) 2020-09-23 2020-09-23 图像显示方法和图像显示装置

Country Status (3)

Country Link
US (1) US20230221554A1 (zh)
CN (1) CN115698821A (zh)
WO (1) WO2022061584A1 (zh)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116107089B (zh) * 2022-11-17 2024-02-09 江西凤凰光学科技有限公司 Method for compensating the uniformity of a diffractive optical waveguide

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20170235219A1 (en) * 2015-02-09 2017-08-17 Pasi KOSTAMO Generating Electronic Components
CN107340567A (zh) * 2017-09-01 2017-11-10 上海誉沛光电科技有限公司 Planar optical waveguide and display device
US20170357089A1 (en) * 2016-06-09 2017-12-14 Microsoft Technology Licensing, Llc Wrapped Waveguide With Large Field Of View
CN107870438A (zh) * 2017-12-04 2018-04-03 华为技术有限公司 Augmented reality apparatus, light engine component, and method
CN110426849A (zh) * 2019-06-26 2019-11-08 华为技术有限公司 Projection system and augmented reality apparatus
CN110543022A (zh) * 2019-07-31 2019-12-06 华为技术有限公司 Augmented reality apparatus and wearable device
CN110764260A (zh) * 2018-07-28 2020-02-07 华为技术有限公司 Augmented reality apparatus
CN111487774A (zh) * 2020-05-15 2020-08-04 北京至格科技有限公司 Augmented reality display apparatus


Also Published As

Publication number Publication date
CN115698821A (zh) 2023-02-03
US20230221554A1 (en) 2023-07-13

Similar Documents

Publication Publication Date Title
US10929997B1 (en) Selective propagation of depth measurements using stereoimaging
US11650426B2 (en) Holographic optical elements for eye-tracking illumination
KR102231910B1 (ko) 초점 이동에 반응하는 입체적 디스플레이
US10852817B1 (en) Eye tracking combiner having multiple perspectives
US9134700B2 (en) Display device
WO2020041067A1 (en) Diffractive gratings for eye-tracking illumination through a light-guide
KR20170059476A (ko) 스위칭가능 회절 격자를 이용한 도파관 눈 추적
US10725302B1 (en) Stereo imaging with Fresnel facets and Fresnel reflections
JP2022183245A (ja) 別個の位相および振幅変調器を伴う接眼3dディスプレイ
WO2021007134A1 (en) Apodized optical elements for optical artifact reduction
US20230221554A1 (en) Image display method and image display apparatus
US11914767B2 (en) Glint-based eye tracker illumination using dual-sided and dual-layered architectures
WO2022203827A1 (en) Eye tracker illumination through a waveguide
US9934583B2 (en) Expectation maximization to determine position of ambient glints
WO2022098454A1 (en) Waveguide assembly with virtual image focus
US11237628B1 (en) Efficient eye illumination using reflection of structured light pattern for eye tracking
US20200033595A1 (en) Method and system for calibrating a wearable heads-up display having multiple exit pupils
US11988828B1 (en) Multi-pupil display and eye-tracking with interferometric sensing
US20230258937A1 (en) Hybrid waveguide to maximize coverage in field of view (fov)
US20240029218A1 (en) Gaze-aware tone mapping and chromatic adaptation
US11924536B2 (en) Augmented reality device including variable focus lenses and operating method thereof
US11927758B1 (en) Multi-laser illuminated mixed waveguide display with volume Bragg grating (VBG) and mirror
US20240069347A1 (en) System and method using eye tracking illumination
US11803238B1 (en) Eye and hand tracking utilizing lensless camera and machine learning
US20230314716A1 (en) Emission of particular wavelength bands utilizing directed wavelength emission components in a display system

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20954438

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20954438

Country of ref document: EP

Kind code of ref document: A1