WO2023051479A1 - Display module and imaging control method - Google Patents

Display module and imaging control method

Info

Publication number
WO2023051479A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
pixel
sub
polarization
displayed
Application number
PCT/CN2022/121488
Other languages
English (en)
Chinese (zh)
Inventor
邱孟
高少锐
吴巨帅
冯国华
罗伟城
孙上
Original Assignee
华为技术有限公司 (Huawei Technologies Co., Ltd.)
Application filed by 华为技术有限公司 (Huawei Technologies Co., Ltd.)
Publication of WO2023051479A1

Classifications

    • G PHYSICS
    • G02 OPTICS
    • G02F OPTICAL DEVICES OR ARRANGEMENTS FOR THE CONTROL OF LIGHT BY MODIFICATION OF THE OPTICAL PROPERTIES OF THE MEDIA OF THE ELEMENTS INVOLVED THEREIN; NON-LINEAR OPTICS; FREQUENCY-CHANGING OF LIGHT; OPTICAL LOGIC ELEMENTS; OPTICAL ANALOGUE/DIGITAL CONVERTERS
    • G02F1/00 Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics
    • G02F1/01 Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics for the control of the intensity, phase, polarisation or colour
    • G02F1/13 Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics for the control of the intensity, phase, polarisation or colour based on liquid crystals, e.g. single liquid crystal display cells
    • G02F1/133 Constructional arrangements; Operation of liquid crystal cells; Circuit arrangements
    • G02F1/1333 Constructional arrangements; Manufacturing methods
    • G02F1/1335 Structural association of cells with optical devices, e.g. polarisers or reflectors
    • G PHYSICS
    • G02 OPTICS
    • G02F OPTICAL DEVICES OR ARRANGEMENTS FOR THE CONTROL OF LIGHT BY MODIFICATION OF THE OPTICAL PROPERTIES OF THE MEDIA OF THE ELEMENTS INVOLVED THEREIN; NON-LINEAR OPTICS; FREQUENCY-CHANGING OF LIGHT; OPTICAL LOGIC ELEMENTS; OPTICAL ANALOGUE/DIGITAL CONVERTERS
    • G02F1/00 Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics
    • G02F1/01 Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics for the control of the intensity, phase, polarisation or colour
    • G02F1/13 Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics for the control of the intensity, phase, polarisation or colour based on liquid crystals, e.g. single liquid crystal display cells
    • G02F1/133 Constructional arrangements; Operation of liquid crystal cells; Circuit arrangements
    • G02F1/1333 Constructional arrangements; Manufacturing methods
    • G02F1/1335 Structural association of cells with optical devices, e.g. polarisers or reflectors
    • G02F1/13363 Birefringent elements, e.g. for optical compensation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00 Geometric image transformation in the plane of the image
    • G06T3/40 Scaling the whole image or part thereof
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09F DISPLAYING; ADVERTISING; SIGNS; LABELS OR NAME-PLATES; SEALS
    • G09F9/00 Indicating arrangements for variable information in which the information is built-up on a support by selection or combination of individual elements
    • G09F9/30 Indicating arrangements for variable information in which the information is built-up on a support by selection or combination of individual elements in which the desired character or characters are formed by combining individual elements
    • G09F9/33 Indicating arrangements for variable information in which the information is built-up on a support by selection or combination of individual elements in which the desired character or characters are formed by combining individual elements being semiconductor devices, e.g. diodes
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09F DISPLAYING; ADVERTISING; SIGNS; LABELS OR NAME-PLATES; SEALS
    • G09F9/00 Indicating arrangements for variable information in which the information is built-up on a support by selection or combination of individual elements
    • G09F9/30 Indicating arrangements for variable information in which the information is built-up on a support by selection or combination of individual elements in which the desired character or characters are formed by combining individual elements
    • G09F9/35 Indicating arrangements for variable information in which the information is built-up on a support by selection or combination of individual elements in which the desired character or characters are formed by combining individual elements being liquid crystals
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00 Details of colour television systems
    • H04N9/12 Picture reproducers
    • H04N9/31 Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
    • H ELECTRICITY
    • H10 SEMICONDUCTOR DEVICES; ELECTRIC SOLID-STATE DEVICES NOT OTHERWISE PROVIDED FOR
    • H10K ORGANIC ELECTRIC SOLID-STATE DEVICES
    • H10K59/00 Integrated devices, or assemblies of multiple devices, comprising at least one organic light-emitting element covered by group H10K50/00
    • H10K59/10 OLED displays

Definitions

  • the embodiments of the present application relate to the field of optical technologies, and in particular, to a display module and an imaging control method.
  • Display devices, such as virtual reality (VR) devices, achieve deep immersion by providing a large field of view (FOV) and high resolution.
  • the content viewed in currently used VR devices only reaches 10-20 pixels per degree (PPD), which cannot meet the 1' angular resolution limit of the human eye (60 PPD), so the image seen by the user exhibits a screen-door effect.
  • high-resolution display screens can be used, but in order to meet the size requirements of VR devices, silicon-based organic light-emitting diode (micro-OLED) display technology must be used, and high-resolution micro-OLED displays are costly.
  • the resolution of the display screen can be indirectly improved through a resolution enhancement technology, but currently there is no feasible resolution enhancement solution applied to a display device.
  • Embodiments of the present application provide a display module and an imaging control method, which are used to provide a resolution enhancement solution applied to a VR device.
  • an embodiment of the present application provides a display module, including a display component, a pixel position adjustment component, and a control component.
  • the display component includes a plurality of pixels, and each of the plurality of pixels includes a plurality of sub-pixels. The display component is used to display multiple frames of images in a time-division manner under the control of the control component. The multiple frames of images are obtained by sub-pixel decomposition of the image to be displayed; the resolution of each of the multiple frames of images is the same as that of the display component and is smaller than the resolution of the image to be displayed. The pixel position adjustment component is used to adjust, in a time-division manner under the control of the control component, the position of each frame of image displayed by the display component; the time at which the display component displays a first image is synchronized with the time at which the pixel position adjustment component adjusts the first image, where the first image is any one of the multiple frames of images.
  • in this way, the human eye perceives a high-resolution image by exploiting persistence of vision and visual synthesis. Decomposing the image to be displayed by sub-pixel sampling improves edge smoothness and reduces jaggedness.
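  • To illustrate the sub-pixel sampling decomposition described above, the following minimal Python sketch (not taken from the patent; the array layout, function name and nearest-neighbour sampling are assumptions, whereas the patent uses weighted same-colour sampling) derives two panel-resolution sub-frames from a higher-resolution source by sampling at positions offset by half a pixel:

    import numpy as np

    def sample_subframe(source, panel_h, panel_w, offset):
        # source: high-resolution image to be displayed, shape (H, W, 3)
        # offset: sampling offset of this sub-frame in panel-pixel units,
        #         e.g. (0.0, 0.0) for the first sub-frame and (0.5, 0.5)
        #         for the second one (diagonal super-resolution)
        src_h, src_w, _ = source.shape
        rows = ((np.arange(panel_h) + offset[1]) * src_h / panel_h).astype(int)
        cols = ((np.arange(panel_w) + offset[0]) * src_w / panel_w).astype(int)
        rows = np.clip(rows, 0, src_h - 1)
        cols = np.clip(cols, 0, src_w - 1)
        return source[np.ix_(rows, cols)]          # shape (panel_h, panel_w, 3)

    # two sub-frames for a 1000x1000 panel from a higher-resolution source
    src = np.random.randint(0, 256, (1414, 1414, 3), dtype=np.uint8)
    frame0 = sample_subframe(src, 1000, 1000, (0.0, 0.0))
    frame1 = sample_subframe(src, 1000, 1000, (0.5, 0.5))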
  • the pixel position adjustment component includes a polarization converter and a polarization displacement device. The polarization converter is used to adjust, in a time-division manner under the control of the control component, the polarization direction of the target polarized light that it outputs, where the target polarized light carries one frame of the multiple frames of images. The polarization displacement device is used to output the target polarized light at a first position when the polarization direction of the target polarized light output by the polarization converter is a first polarization direction, and to output the target polarized light at a second position when the polarization direction of the target polarized light output by the polarization converter is a second polarization direction.
  • control component is specifically used to receive the image to be displayed, decompose the image to be displayed at sub-pixel level to obtain multiple frames of images, and send the multiple frames of images to the display component in time division.
  • the polarization shifting device is a birefringent device or a polarization grating.
  • the birefringent device is a non-diffractive device and does not have inherent dispersion characteristics, so that the light after passing through the birefringent device will not be dispersed, further improving the imaging effect.
  • the polarization converter includes twisted nematic liquid crystals or in-plane switching (IPS) liquid crystals; or the polarization converter includes cholesteric liquid crystals and a quarter-wave plate.
  • the pixel position adjustment component is a motor.
  • the sub-pixels included in the first pixel in the first image in the multi-frame image are obtained by sampling from the sub-pixels included in at least h adjacent pixels included in the image to be displayed;
  • h is the number of images in the multi-frame images
  • the first image is any image in the multi-frame images
  • the first pixel is any pixel in the first image.
  • the pixel value of the first sub-pixel included in the first pixel point is determined according to the pixel value of the sub-pixel that is the same color as the first sub-pixel included in the set area of the image to be displayed;
  • the geometric center of the set area is the sampling position of the first sub-pixel in the image to be displayed.
  • the sub-pixel value at the sampling position is determined by the sub-pixel values of the same color around the sampling position, which can reduce the occurrence of color fringes around the displayed image.
  • this method may also be used only for determining sub-pixel values within a set range around the periphery of the image.
  • the pixel value of the first sub-pixel included in the first pixel point is obtained by a weighted sum of the pixel values of the sub-pixels of the same color as the first sub-pixel within the set area;
  • the weight of a sub-pixel of the same color as the first sub-pixel within the set area is inversely proportional to the sub-pixel distance, where the sub-pixel distance is the distance between that same-color sub-pixel and the sampling position of the first sub-pixel in the image to be displayed.
  • the pixel value of the first sub-pixel satisfies the condition shown in the following formula:
  • q(i, j) represents the pixel value of the first sub-pixel
  • i represents the abscissa of the pixel point of the first sub-pixel in the image to be displayed
  • j represents the ordinate of the pixel point of the first sub-pixel in the image to be displayed
  • Q(i, j) represents the pixel value of the sub-pixel at the sampling position of the first sub-pixel in the image to be displayed
  • a1, a2, a3, a4 and a5 represent the respective weights.
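  • The formula itself is not reproduced in this text. Based only on the surrounding description (a weighted sum over same-colour sub-pixels around the sampling position, with each weight a_k inversely proportional to the distance d_k between the corresponding same-colour sub-pixel and the sampling position), a plausible form with five weights is the sketch below in LaTeX notation; the choice of the four neighbours, the normalization, and the symbol names a1-a5 are assumptions, since the original symbols are garbled in this text:

    q(i,j) = a_1\,Q(i,j) + a_2\,Q(i-1,j) + a_3\,Q(i+1,j) + a_4\,Q(i,j-1) + a_5\,Q(i,j+1),
    \qquad a_k \propto \frac{1}{d_k}, \qquad \sum_{k=1}^{5} a_k = 1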
  • the pixel value of the first sub-pixel satisfies the condition shown in the following formula:
  • q(i, j+1) represents the pixel value of the first sub-pixel
  • i represents the abscissa of the pixel point of the first sub-pixel in the image to be displayed
  • j+1 represents the ordinate of the pixel point of the first sub-pixel in the image to be displayed
  • Q(i, j+1) represents the pixel value of the sub-pixel at the sampling position of the first sub-pixel in the image to be displayed
  • a1, a2, a3, a4, b1, b2, b3 and b4 represent the respective weights.
  • the multi-frame images include a first image and a second image
  • the control component is specifically configured to input the multi-frame images to the display component in time division, so that the display component displays the multi-frame images in time division
  • in a first time unit, the control component controls the pixel position adjustment component to output the first image at a first position
  • in a second time unit, the control component controls the pixel position adjustment component to output the second image at a second position.
  • the distance between the first position and the second position in the horizontal direction is Px/2; or, the distance between the first position and the second position in the vertical direction is Py/2; or, the distance between the first position and the second position in the horizontal direction is Px/2 and the distance between the first position and the second position in the vertical direction is Py/2. The first time unit and the second time unit are temporally adjacent.
  • the control component is specifically configured to control the polarization converter to adjust the polarization direction of the input target polarized light carrying the first image to the first polarization direction within the first time unit, and to control the polarization converter to adjust the polarization direction of the input target polarized light carrying the second image to the second polarization direction within the second time unit, so that the first position at which the polarization displacement device outputs the target polarized light carrying the first image and the second position at which it outputs the target polarized light carrying the second image are separated in the horizontal direction by Px/2; Px represents the distance between adjacent pixels of the first image or the second image in the horizontal direction.
  • the control component is specifically configured to control the polarization converter to adjust the polarization direction of the input target polarized light carrying the first image to the first polarization direction within the first time unit, and to control the polarization converter to adjust the polarization direction of the input target polarized light carrying the second image to the second polarization direction within the second time unit, so that the first position at which the polarization displacement device outputs the target polarized light carrying the first image and the second position at which it outputs the target polarized light carrying the second image are separated in the vertical direction by Py/2; Py represents the distance between adjacent pixels of the first image or the second image in the vertical direction.
  • the control component is specifically configured to control the polarization converter to adjust the polarization direction of the input target polarized light carrying the first image to the first polarization direction within the first time unit, and to control the polarization converter to adjust the polarization direction of the input target polarized light carrying the second image to the second polarization direction within the second time unit, so that the first position at which the polarization displacement device outputs the target polarized light carrying the first image and the second position at which it outputs the target polarized light carrying the second image are separated by Py/2 in the vertical direction and offset by Px/2 in the horizontal direction; Py represents the distance between adjacent pixels of the first image or the second image in the vertical direction, and Px represents the distance between adjacent pixels of the first image or the second image in the horizontal direction.
  • the polarization converter includes twisted nematic liquid crystals or in-plane switching (IPS) liquid crystals; or the polarization converter includes cholesteric liquid crystals and a quarter-wave plate.
  • the polarization displacement device is a birefringent device
  • the birefringent device is made of quartz crystal, barium borate crystal, lithium niobate crystal, titanium dioxide crystal, or a liquid crystal polymer.
  • the display module further includes a folded optical path located between the display component and the pixel position adjustment component; the folded optical path is used to transmit the target polarized light carrying any of the multiple frames of images to the pixel position adjustment component.
  • an embodiment of the present application provides an imaging control method. The method is applied to a display device, where the display device includes a display component and a pixel position adjustment component, the display component includes a plurality of pixels, and each of the plurality of pixels includes a plurality of sub-pixels. The method includes: receiving the image to be displayed and decomposing it at the sub-pixel level to obtain multiple frames of images, where the resolution of each frame of the multiple frames of images is the same as the resolution of the display component and is smaller than the resolution of the image to be displayed; and controlling the pixel position adjustment component to adjust, in a time-division manner, the position of each frame of image displayed by the display component, where the time at which the display component displays a first image is synchronized with the time at which the pixel position adjustment component adjusts the first image, and the first image is any one of the multiple frames of images.
  • sub-pixel-level decomposition of the image to be displayed is performed to obtain multi-frame images, including:
  • the image to be displayed is decomposed at the sub-pixel level to obtain a multi-frame image.
  • the method also includes:
  • the image to be displayed is down-sampled to obtain the image to be processed
  • the image to be processed is output at a set position through a pixel position adjustment component.
  • the sub-pixels included in the first pixel in the first image in the multi-frame image are sampled from the sub-pixels included in at least h adjacent pixels included in the image to be displayed;
  • h is the number of images in the multi-frame images
  • the first image is any image in the multi-frame images
  • the first pixel is any pixel in the first image.
  • the pixel value of the first sub-pixel included in the first pixel point is determined according to the pixel value of the sub-pixel that is the same color as the first sub-pixel included in the set area of the image to be displayed;
  • the geometric center of the set area is the sampling position of the first sub-pixel in the image to be displayed.
  • the pixel value of the first sub-pixel included in the first pixel point is obtained by weighting and summing the pixel values of the sub-pixels included in the set area with the same color as the first sub-pixel;
  • the weight of the sub-pixel with the same color as the first sub-pixel included in the setting area is inversely proportional to the distance between the sub-pixels; the distance between the sub-pixels is the sub-pixel with the same color as the first sub-pixel and the first sub-pixel The distance between sample locations in the image to be displayed.
  • the display device is a wearable device
  • the size of the set area is related to the distance between the display component and the imaging plane of the wearable device.
  • the size of the setting area is related to the pixel size of the display component.
  • the size of the setting area is related to the display content of the display component.
  • the pixel value of the first sub-pixel satisfies the condition shown in the following formula:
  • q(i, j) represents the pixel value of the first sub-pixel
  • i represents the abscissa of the pixel point of the first sub-pixel in the image to be displayed
  • j represents the ordinate of the pixel point of the first sub-pixel in the image to be displayed
  • Q(i, j) represents the pixel value of the sub-pixel at the sampling position of the first sub-pixel in the image to be displayed
  • a1, a2, a3, a4 and a5 represent the respective weights.
  • the pixel value of the first sub-pixel satisfies the condition shown in the following formula:
  • q(i, j+1) represents the pixel value of the first sub-pixel
  • i represents the abscissa of the pixel point of the first sub-pixel in the image to be displayed
  • j+1 represents the ordinate of the pixel point of the first sub-pixel in the image to be displayed
  • Q(i, j+1) represents the pixel value of the sub-pixel at the sampling position of the first sub-pixel in the image to be displayed
  • a1, a2, a3, a4, b1, b2, b3 and b4 represent the respective weights.
  • the multiple frames of images include the first image and the second image
  • controlling the pixel position adjustment component to adjust, in a time-division manner, the position of each frame of image displayed by the display component includes: inputting the multiple frames of images to the display component in a time-division manner, so that the display component displays the multiple frames of images in time division; controlling the pixel position adjustment component to output the first image at the first position in a first time unit; and controlling the pixel position adjustment component to output the second image at a second position in a second time unit.
  • the distance between the first position and the second position in the horizontal direction is Px/2; or, the distance between the first position and the second position in the vertical direction is Py/2; or, the distance between the first position and the second position in the horizontal direction is Px/2 and the distance between the first position and the second position in the vertical direction is Py/2; the first time unit and the second time unit are temporally adjacent.
  • the pixel position adjustment component includes a polarization converter and a polarization displacement device. Controlling the pixel position adjustment component to output the first image at the first position in the first time unit and to output the second image at the second position in the second time unit includes: controlling the polarization converter to adjust the polarization direction of the target polarized light in a time-division manner, where the target polarized light is generated when the display component displays the multiple frames of images in time division; the polarization displacement device outputs the target polarized light at the first position when the polarization direction of the target polarized light output by the polarization converter is the first polarization direction, and outputs the target polarized light at the second position when the polarization direction of the target polarized light output by the polarization converter is the second polarization direction.
  • the polarization converter is controlled to adjust, in a time-division manner, the polarization direction of the output target polarized light, including:
  • in the first time unit, the polarization converter is controlled to adjust the polarization direction of the output target polarized light carrying the first image to the first polarization direction, and in the second time unit, the polarization converter is controlled to adjust the polarization direction of the output target polarized light carrying the second image to the second polarization direction, so that the first position at which the polarization displacement device outputs the target polarized light carrying the first image and the second position at which it outputs the target polarized light carrying the second image are separated horizontally by Px/2; Px represents the distance between adjacent pixels of the first image or the second image in the horizontal direction.
  • the sub-pixels included in the first pixel in the first image are obtained by sampling from the sub-pixels included in two horizontally adjacent pixels included in the image to be displayed;
  • the sub-pixels included in the second pixel in the second image are obtained by sampling from the sub-pixels included in two horizontally adjacent pixels;
  • the position coordinates of the first pixel in the first image are the same as the position coordinates of the second pixel in the second image.
  • the polarization converter is controlled to adjust, in a time-division manner, the polarization direction of the input target polarized light, including:
  • in the first time unit, the polarization converter is controlled to adjust the polarization direction of the output target polarized light carrying the first image to the first polarization direction, and in the second time unit, the polarization converter is controlled to adjust the polarization direction of the output target polarized light carrying the second image to the second polarization direction, so that the first position at which the polarization displacement device outputs the target polarized light carrying the first image and the second position at which it outputs the target polarized light carrying the second image are separated vertically by Py/2; Py represents the distance between adjacent pixels of the first image or the second image in the vertical direction.
  • the sub-pixels included in the first pixel in the first image are obtained by sampling from the sub-pixels included in two adjacent pixels in the vertical direction in the image to be displayed;
  • the sub-pixels included in the second pixel in the second image are sampled from the sub-pixels included in two adjacent pixels in the vertical direction;
  • the position coordinates of the first pixel in the first image are the same as the position coordinates of the second pixel in the second image.
  • the polarization converter is controlled to adjust, in a time-division manner, the polarization direction of the output target polarized light, including:
  • in the first time unit, the polarization converter is controlled to adjust the polarization direction of the input target polarized light carrying the first image to the first polarization direction, and in the second time unit, the polarization converter is controlled to adjust the polarization direction of the input target polarized light carrying the second image to the second polarization direction, so that the first position at which the polarization displacement device outputs the target polarized light carrying the first image and the second position at which it outputs the target polarized light carrying the second image are separated by Py/2 in the vertical direction and offset by Px/2 in the horizontal direction; Py represents the distance between adjacent pixels of the first image or the second image in the vertical direction, and Px represents the distance between adjacent pixels of the first image or the second image in the horizontal direction.
  • the sub-pixels included in the first pixel in the first image are obtained by sampling from sub-pixels included in two adjacent pixels in a diagonal direction in the image to be displayed;
  • the sub-pixels included in the second pixel in the second image are sampled from the sub-pixels included in two adjacent pixels in the diagonal direction;
  • the position coordinates of the first pixel in the first image are the same as the position coordinates of the second pixel in the second image.
  • the embodiment of the present application provides a display module, including a display component, at least one adjustment component and a control component; the adjustment component includes a polarization rotator and a birefringent device. The display component is used to receive the image to be processed and display it, where the resolution of the image to be processed is the same as the resolution of the display component. The polarization rotator is used to adjust the polarization direction of the light beam of each pixel of the image to be processed under the control of the control component. The birefringent device is used to decompose the light beam of each pixel included in the image to be processed, outputting first target polarized light for projecting a first sub-image at a first position and outputting second target polarized light for projecting a second sub-image at a second position; wherein the decomposition ratio of a first pixel is different from that of a second pixel, the first pixel and the second pixel being two pixels with different polarization directions in the image to be processed, and …
  • the polarization rotator is combined with the birefringent device, and the principle that the birefringent device outputs two beams of light is used to output two sub-images, so that the superposition of the two sub-images approximates the source image to be displayed, improving the resolution of the displayed image.
  • the position of the first sub-image and the position of the second sub-image are separated by Py/2 in the vertical direction and/or by Px/2 in the horizontal direction, where Py represents the pixel pitch in the vertical direction and Px represents the pixel pitch in the horizontal direction.
  • control component is specifically used to control the polarization rotator to adjust the polarization direction of the light beams of the sub-pixels included in each pixel.
  • the image to be processed is obtained by down-sampling the image to be displayed;
  • the control component is specifically configured to: estimate, according to the luminous intensity of each pixel of the image to be displayed, the luminous intensity of each pixel to be projected as the first sub-image and the luminous intensity of each pixel to be projected as the second sub-image, where the first sub-image and the second sub-image have the same resolution, which is smaller than the resolution of the image to be displayed; and control the polarization rotator to adjust the polarization direction of the light beam of each pixel of the image to be processed according to the luminous intensity of each pixel to be projected as the first sub-image and the luminous intensity of each pixel to be projected as the second sub-image.
  • the image to be processed is obtained by down-sampling the image to be displayed; the control component is specifically used to: estimate and adjust the luminous intensity of each pixel to be projected as the first sub-image and the luminous intensity of each pixel to be projected as the second sub-image, so that the similarity between the image formed by the superimposed projection of the adjusted first sub-image and second sub-image and the image to be displayed is greater than a set threshold, where the set threshold is determined based on the human eye's perception of image differences; and control the polarization rotator to adjust the polarization direction of the light beam of each pixel of the image to be processed according to the adjusted luminous intensity of each pixel of the first sub-image and the luminous intensity of each pixel of the second sub-image.
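  • As an illustration of this intensity decomposition (a minimal Python sketch, not the patent's algorithm; the sampling-based estimate, the cos²/sin² split model and all names are assumptions), the following snippet estimates two sub-image intensities and maps their ratio to a polarization rotation angle:

    import numpy as np

    def estimate_subimage_intensities(target):
        # target: high-resolution luminance of the image to be displayed,
        #         shape (2H, 2W), values in [0, 1]
        # crude estimate: sample the target on the two interleaved lattices
        # that the two half-pixel-shifted sub-images will cover
        I1 = target[0::2, 0::2].copy()   # projected at the first position
        I2 = target[1::2, 1::2].copy()   # projected at the shifted position
        return I1, I2

    def rotation_angle(i1, i2):
        # decomposition ratio i1:i2 realised as cos^2(theta):sin^2(theta),
        # assuming the birefringent device splits the rotated polarization
        # into its o- and e-components (Malus-type projection)
        total = i1 + i2
        if total == 0.0:
            return 0.0
        return float(np.arccos(np.sqrt(i1 / total)))

    target = np.random.rand(200, 200)
    I1, I2 = estimate_subimage_intensities(target)
    theta = rotation_angle(I1[0, 0], I2[0, 0])   # per-pixel rotator setting

  In the patent, the estimated intensities are further adjusted until the superposed projection is sufficiently similar to the image to be displayed; that refinement step is omitted in this sketch.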
  • an embodiment of the present application provides an imaging control method. The method is applied to a wearable device, where the wearable device includes a display component and a pixel position adjustment component, and the pixel position adjustment component includes a polarization rotator and a birefringent device. The method includes: receiving the image to be displayed, performing down-sampling processing on the image to be displayed to obtain the image to be processed, and inputting the image to be processed to the display component, so that the display component emits target polarized light carrying the image to be displayed, where the target frame image is one frame of the multiple frames of images and the resolution of the image to be processed is the same as the resolution of the display component; and controlling the polarization rotator to adjust the polarization direction of the light beam of each pixel of the output image to be processed, so that the birefringent device decomposes the light beam of each pixel included in the image to be processed, and the first target polarized light for projecting the first sub-image is output at the …
  • the first pixel and the second pixel are two pixels with different polarization directions in the image to be processed.
  • the decomposition ratio of the first pixel is the ratio of the luminous intensity of the pixel projected onto the first sub-image to the luminous intensity of the pixel projected onto the second sub-image after the light beam of the first pixel is decomposed; the decomposition ratio of the second pixel is the ratio of the luminous intensity of the pixel projected onto the first sub-image to the luminous intensity of the pixel projected onto the second sub-image after the light beam of the second pixel is decomposed.
  • the position of the first sub-image and the position of the second sub-image are separated by Py/2 in the vertical direction and/or by Px/2 in the horizontal direction, where Py represents the pixel pitch in the vertical direction and Px represents the pixel pitch in the horizontal direction.
  • controlling the polarization rotator to adjust the polarization direction of the beam of each pixel of the output image to be processed includes: controlling the polarization rotator to adjust the polarization of the beam of the sub-pixel included in each pixel of the image to be processed direction.
  • controlling the polarization rotator to adjust the polarization direction of the light beam of each pixel of the output image to be processed includes: estimating, according to the luminous intensity of each pixel of the image to be displayed, the luminous intensity of each pixel to be projected as the first sub-image and the luminous intensity of each pixel to be projected as the second sub-image, where the first sub-image and the second sub-image have the same resolution, which is smaller than the resolution of the image to be displayed; and controlling the polarization rotator to adjust the polarization direction of the light beam of each pixel of the image to be processed according to the luminous intensity of each pixel to be projected as the first sub-image and the luminous intensity of each pixel to be projected as the second sub-image.
  • controlling the polarization rotator to adjust the polarization direction of the light beam of each pixel of the output image to be processed includes: estimating and adjusting the luminous intensity of each pixel to be projected as the first sub-image and the luminous intensity of each pixel to be projected as the second sub-image, so that the similarity between the image formed by the superimposed projection of the adjusted first sub-image and second sub-image and the image to be displayed is greater than a set threshold, where the set threshold is determined based on the human eye's perception of image differences; and controlling the polarization rotator to adjust the polarization direction of the light beam of each pixel of the image to be processed according to the adjusted luminous intensity of each pixel of the first sub-image and the luminous intensity of each pixel of the second sub-image.
  • the present application provides a control device, which is used to implement any one of the methods in the second aspect or the fourth aspect above, and includes corresponding functional modules, respectively used to implement the steps in the above methods.
  • the functions may be implemented by hardware, or may be implemented by executing corresponding software through hardware.
  • Hardware or software includes one or more modules corresponding to the above-mentioned functions.
  • the present application provides a computer-readable storage medium, in which a computer program or instructions are stored; when the computer program or instructions are executed by a display device, the display device executes the method in the above-mentioned second aspect or fourth aspect.
  • the present application provides a computer program product; the computer program product includes a computer program or instructions, and when the computer program or instructions are executed by the control device, the method in any possible implementation of the above-mentioned second aspect or fourth aspect is implemented.
  • Figure 1A is a schematic diagram of the RGB Stripe arrangement
  • Figure 1B is a schematic diagram of the arrangement of pentile RGBG
  • Fig. 1C is a schematic diagram of delta RGB arrangement
  • Figure 1D is a schematic diagram of the arrangement of pentile RGBW
  • FIG. 2 is a schematic structural diagram of display components in an embodiment of the present application.
  • FIG. 3A is a schematic diagram of the structure of the display module in the embodiment of the present application.
  • FIG. 3B is a schematic diagram of the structure of the display module in the embodiment of the present application.
  • Fig. 4 is a schematic diagram of the principle of diagonal super-resolution in the embodiment of the present application.
  • Fig. 5A is a schematic diagram of realizing resolution enhancement in combination with a polarization grating
  • FIG. 5B is a schematic diagram of image pixel offset to achieve resolution enhancement
  • FIG. 6 is a schematic diagram of beam transmission of a display module using a birefringent device in an embodiment of the present application
  • Fig. 7 is a schematic diagram of the light output principle of the birefringent device in the embodiment of the present application.
  • Fig. 8 is the relationship between the thickness of the birefringent device and the refractive index difference Δn between the o light and the e light of the birefringent device in the embodiment of the present application;
  • Fig. 9 is a schematic diagram of the structure of the display module in the embodiment of the present application.
  • FIG. 10 is a schematic diagram of the concept of sub-pixel sampling in the embodiment of the present application.
  • Fig. 11A is a schematic diagram of sample decomposition of an image to be displayed under diagonal super-resolution in the embodiment of the present application;
  • Fig. 11B is a schematic diagram of sample decomposition of the image to be displayed under diagonal super-resolution in the embodiment of the present application;
  • Fig. 12 is a schematic diagram of sample decomposition of images to be displayed under vertical super-resolution in the embodiment of the present application.
  • FIG. 13 is a schematic diagram of control timing in the embodiment of the present application.
  • Fig. 14 is a schematic diagram of the beam transmission direction in the display module in the embodiment of the present application.
  • FIG. 15 is a schematic diagram of the overlay display of low-resolution subframes under diagonal super-resolution in the embodiment of the present application.
  • FIG. 16A is a schematic diagram of sampling and decomposition of an image to be displayed in the embodiment of the present application.
  • FIG. 16B is a schematic diagram of sampling and decomposition of an image to be displayed in the embodiment of the present application.
  • FIG. 17 is a schematic diagram of low-resolution subframe superposition display in the embodiment of the present application.
  • FIG. 18 is a schematic diagram of the calculation principle of the pixel value of the sub-pixel in the embodiment of the present application.
  • FIG. 19 is a schematic diagram of the calculation principle of the pixel value of the sub-pixel in the embodiment of the present application.
  • Figure 20 is a schematic diagram of the structure of the display module in the embodiment of the present application.
  • Figure 21 is a schematic diagram of the structure of the display module in the embodiment of the present application.
  • Fig. 22 is a schematic diagram of the polarized light converted and output by the polarization rotator in the embodiment of the present application;
  • Fig. 23 is a schematic diagram of the principle of resolution enhancement in the embodiment of the present application.
  • FIG. 24 is a schematic flow chart of an imaging control method in an embodiment of the present application.
  • FIG. 25 is a schematic flowchart of an imaging control method in an embodiment of the present application.
  • Near-eye display is the display method used by AR display devices and VR display devices.
  • Angular resolution, which can also be referred to as spatial resolution, refers to the average number of pixels filling 1 degree of the field of view.
  • the smallest image unit on a display is a pixel.
  • a pixel is composed of three sub-pixels of different colors; sub-pixels may also be referred to as subpixels. For example, a red (R) sub-pixel, a green (G) sub-pixel and a blue (B) sub-pixel form one pixel.
  • the pixel arrangement adopted by the display screen may include RGB stripe arrangement, pentile RGBG arrangement, pentile RGBW arrangement, delta RGB arrangement and the like.
  • FIG. 1A shows RGB Stripe, a commonly used pixel arrangement method for traditional display screens. It is arranged in long strips, and each pixel includes one R, one G, and one B.
  • Figure 1B shows the arrangement of pentile RGBG, each pixel contains two sub-pixels, which alternately appear in the combination of RG and BG.
  • Figure 1C is a delta RGB arrangement, in which each pixel contains three RGB sub-pixels and two adjacent pixels share sub-pixels. Taking row a in Figure 1C as an example, pixel 1 and pixel 2 share a blue sub-pixel, and pixel 2 and pixel 3 share the red sub-pixel and the green sub-pixel.
  • Figure 1D is a pentile RGBW arrangement, each pixel includes a red (R) sub-pixel, a green (G) sub-pixel, a blue (B) sub-pixel and a white (W) sub-pixel.
  • in the RGB Stripe arrangement, three pixels are composed of 9 sub-pixels; in the RGBG arrangement, three pixels are composed of 6 sub-pixels; and in the RGB delta arrangement, three pixels are composed of 6 sub-pixels. Therefore, when the number of pixels is the same, the RGBG arrangement and the RGB delta arrangement require fewer sub-pixels than the RGB Stripe arrangement.
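  • The sub-pixel counts stated above can be summarized in a trivial Python sketch that simply restates the figures from the text:

    # sub-pixels needed to form three pixels in each arrangement
    subpixels_per_three_pixels = {
        "RGB stripe":   9,   # each pixel has its own R, G and B
        "pentile RGBG": 6,   # two sub-pixels per pixel, RG/BG alternating
        "delta RGB":    6,   # adjacent pixels share sub-pixels
    }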
  • the display device may be, for example, a terminal device with a display screen, such as a mobile phone, a monitor, or a TV.
  • the display device can also be a wearable device.
  • the wearable device may be a near eye display (near eye display, NED) device, such as VR glasses, or a VR helmet.
  • users wear NED devices to play games, read, watch movies (or TV series), participate in virtual conferences, participate in video education, or video shopping.
  • FIG. 2 is a schematic structural diagram of a display module provided by an embodiment of the present application.
  • the display module includes a display component 100 , at least one pixel position adjustment component 200 and a control component 300 .
  • the display module includes a pixel position adjustment component 200 as an example.
  • the display component 100 is used for displaying images.
  • the pixel position adjustment component 200 is configured to adjust the position of the image displayed by the display component 100 .
  • the display device is a wearable device, and the pixel position adjustment component 200 adjusts the imaging of the image displayed by the display component 100 to a position on a virtual image plane at a certain distance from the display component 100 .
  • the display component 100 displays multiple frames of images in time division.
  • the multi-frame images may be obtained by down-sampling the image to be displayed by the control component 300 .
  • the control component 300 splits a high-resolution image to be displayed into multiple low-resolution images.
  • the low resolution image has the same resolution as the display component.
  • the control component 300 sends multiple low-resolution images to the display component 100 in time division for display.
  • the pixel position adjustment component is used to adjust the position of each frame of image displayed by the display component in a time-division manner under the control of the control component; the time when the display component displays the first image is synchronized with the time when the pixel position adjustment component adjusts the first image, and the first image is any one of the multiple frames of images.
  • the pixel position adjustment component may be a motor, and the control component 300 controls the mechanical movement of the motor in time division, so as to realize the time division adjustment of the position of each frame of image displayed by the display component.
  • the motor may be an ultrasonic motor or a servo motor or the like.
  • the pixel position adjustment component 200 may include a polarization converter 210 and a polarization displacement device 220 .
  • the polarization converter 210 is configured to time-divisionally adjust the polarization direction of the target polarized light output by the polarization converter 210 under the control of the control component 300 , and the target polarized light carries one frame of images among multiple frames of images.
  • the polarization displacement device 220 is used to output the target polarized light at the first position when the polarization direction of the target polarized light output by the polarization converter 210 is the first polarization direction, and to output the target polarized light at the second position when the polarization direction of the target polarized light output by the polarization converter 210 is the second polarization direction; the display time of the first image is synchronized with the time at which the polarization direction of the target polarized light carrying the first image is adjusted, and the first image is any frame of the multi-frame images.
  • the display module can realize super resolution in the horizontal direction, or super resolution in the vertical direction, or super resolution in the diagonal direction.
  • Horizontal-direction super resolution can be understood as follows: the distance between the two frames of images formed by the beams output by the polarization displacement device in the horizontal direction is Px/2, where Px represents the distance between adjacent pixels of the two frames of images in the horizontal direction, thereby doubling the resolution in the horizontal direction.
  • the polarization shifting device therefore needs to be able to realize a shift vector of (Px/2, 0).
  • For vertical-direction super resolution, the distance between the two frames of images output by the polarization displacement device in the vertical direction is Py/2, where Py represents the vertical distance between adjacent pixels of the two frames of images, thereby doubling the resolution in the vertical direction.
  • the polarization shifting device therefore needs to be able to realize a shift vector of (0, Py/2).
  • Diagonal-direction super resolution can be understood as follows: the distance between the two frames of images output by the polarization displacement device in the vertical direction is Py/2 and the distance in the horizontal direction is Px/2, thereby doubling the resolution.
  • the polarization shifting device needs to have the ability to realize the shift vector as (Px/2, Py/2).
  • if the polarization displacement device can realize an offset with offset vector (Px/2, Py/2), then through time-division multiplexing, the effect of doubling the equivalent number of display pixels can be achieved.
  • if the offset vector of the polarization shifting device is (Px/2, 0), the resolution in the horizontal direction can be doubled.
  • if the offset vector of the polarization displacement device is (0, Py/2), the resolution in the vertical direction can be doubled.
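  • The three offset modes can be summarized as follows in an illustrative Python sketch; the names and the equivalent-pixel calculation are assumptions, not taken from the patent:

    # shift vector (dx, dy) of the second sub-frame relative to the first,
    # in units of the pixel pitches Px and Py (from the text above)
    SHIFT_VECTORS = {
        "horizontal": (0.5, 0.0),   # (Px/2, 0):    doubles horizontal resolution
        "vertical":   (0.0, 0.5),   # (0, Py/2):    doubles vertical resolution
        "diagonal":   (0.5, 0.5),   # (Px/2, Py/2): doubles the equivalent pixel count
    }

    def equivalent_pixels(panel_w, panel_h, n_subframes=2):
        # time-division multiplexing: each shifted sub-frame contributes a
        # full panel's worth of distinct pixel sites
        return panel_w * panel_h * n_subframes

    print(SHIFT_VECTORS["diagonal"], equivalent_pixels(2000, 2000))  # (0.5, 0.5) 8000000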
  • multiple pixel position adjustment components 200 can also be connected in series to realize the offset of multiple frames of images, so as to achieve a resolution increase of more than 2 times.
  • the two pixel position adjustment components are respectively referred to as the first pixel position adjustment component 200a and the second pixel position adjustment component 200b.
  • the polarization converter of the pixel position adjustment component 200a is called a first polarization converter 210a
  • the polarization converter of the pixel position adjustment component 200b is called a second polarization converter 210b.
  • the polarization displacement device in the pixel position adjustment component 200a is called the first polarization displacement device 220a
  • the polarization displacement device in the pixel position adjustment component 200b is called the second polarization displacement device 220b.
  • FIG. 3A and FIG. 3B are respectively introduced and described below to give an exemplary specific implementation solution.
  • the reference numerals of the components in the display module will not be exemplified.
  • the polarization shifting device may be a polarization grating or a birefringent device.
  • the polarization displacement device 220 is a polarization grating as an example.
  • a combination of polarization gratings and polarization modulation devices is used to achieve resolution enhancement.
  • a polarization grating, also known as a Pancharatnam-Berry deflector (PBD), is a diffractive optical device.
  • the polarization grating uses the geometric phase to generate a periodic phase grating structure, so that under different incident circularly polarized light, it can generate +1 order and -1 order diffraction in different directions.
  • the polarization state of the light beam output from the display screen is changed by using an electronically controlled polarization modulation device, so that the light beam input into the polarization grating is deflected.
  • P is the distance between two adjacent pixels in the horizontal direction or the distance in the vertical direction.
  • each pixel in the image output by the display screen is shifted diagonally by a polarization converter and a polarization grating. Then, by displaying the unshifted image and the shifted image in a time-division manner, the resolution of the image displayed on the display screen is equivalently doubled, and the horizontal interval of the shifted pixels is P/2.
  • a combination of a polarization converter and a birefringent element is used to deflect the image-bearing light emitted by the display assembly.
  • the polarization converter can time-divisionally adjust the polarization direction of the input target polarized light under the control of the control component.
  • Target polarized light is used to carry low-resolution images.
  • when the polarization direction of the target polarized light output by the polarization converter is the first polarization direction, the birefringent device outputs the target polarized light at the first position; when the polarization direction of the target polarized light output by the polarization converter is the second polarization direction, the birefringent device outputs the target polarized light at the second position, as shown in FIG. 6.
  • the time at which the display component displays the first image is synchronized with the time at which the polarization converter adjusts the polarization direction of the target polarized light carrying the first image.
  • the polarization grating is a diffraction device
  • the diffraction device has inherent dispersion characteristics.
  • the deflection angle of the grating is linearly related to the wavelength of the input beam, which causes the red, green and blue rays of different wavelengths emitted by the display screen to disperse after passing through the polarization grating, resulting in poor imaging effects for the human eye, such as rainbow-edge effects, blurring of the image seen by the human eye, and color distortion.
  • the birefringent device is a non-diffractive device and does not have inherent dispersion characteristics, so the light after passing through the birefringent device will not be dispersed, and the imaging effect is better than that of the polarization grating.
  • the display component may be an ordinary liquid crystal display (liquid crystal display, LCD), an ordinary organic light emitting diode display (organic light emitting diode, OLED) or a silicon-based OLED, or other display devices, which are not specifically limited in this application.
  • the polarization converter may be an electronically controlled polarization switch (ECPS).
  • the electronically controlled polarization converter can be any of nematic liquid crystals, vertical alignment (VA) liquid crystals, in-plane switching (IPS) liquid crystals, electronically controlled twisted nematic liquid crystals (TNLC), electrically controlled nonlinear crystals, or electrically controlled ferroelectric liquid crystals.
  • the control component controls the polarization converter to keep the polarization direction of the input polarized light unchanged when it is not powered (that is, OFF), which can be understood as the input polarized light and the output polarized light having the same polarization direction, or as the polarization converter merely transmitting the input polarized light. In the embodiments of this application, input polarized light that is merely transmitted and the corresponding output polarized light are considered to be the same polarized light.
  • the control component controls the polarization converter to be powered on, for example, the applied voltage exceeds the threshold voltage Vc, and is used to rotate the polarization direction of the input polarized light, for example, rotate the polarization direction of the input polarized light by 90 degrees.
  • control component controls the polarization converter to be powered on, for example, the applied voltage exceeds a threshold voltage, so as to maintain the polarization direction of the input polarized light.
  • the control component controls the polarization converter to rotate the polarization direction of the input polarized light when it is not powered on, for example, to rotate the polarization direction of the input polarized light by 90 degrees.
  • a TNLC is composed of a liquid crystal layer sandwiched between two conductive substrates.
  • when no voltage is applied, the polarization direction of incident polarized light passing through the TNLC is rotated by 90 degrees; when the voltage applied to the twisted nematic liquid crystal exceeds the threshold voltage Vc, the liquid crystal molecules in the TNLC stand up, the polarization direction of the incident polarized light remains unchanged, and light with the same polarization state as the incident polarized light is emitted. If the applied voltage is between 0 and Vc, the polarization direction of the incident polarized beam is rotated by 0 to 90 degrees after passing through the TNLC; the specific rotation angle depends on the applied voltage and the specific material of the TNLC.
  • the control component may be, for example, a processor, a microprocessor, a controller or another control component; for example, it may be a general-purpose central processing unit (CPU), a general-purpose processor, a digital signal processor (DSP), an application specific integrated circuit (ASIC), a field programmable gate array (FPGA) or another programmable logic device, a transistor logic device, a hardware component, or any combination thereof.
  • FIG. 7 is a schematic diagram of the principle of a birefringent device.
  • the optical axis of the birefringent device lies in-plane, that is, the direction parallel to the plane of the paper.
  • the output light of the birefringent device can be divided into o light and e light.
  • the o light, also called ordinary light, has a polarization direction perpendicular to the plane of incidence.
  • the e light, also called extraordinary light, has a polarization direction lying in the plane of incidence.
  • the e light is displaced relative to the o light, and the shift amount is a.
  • the value of the displacement a is related to the specifications of the birefringent device.
  • the value of the displacement a is related to the thickness of the birefringent device and the included angle θ, which is the angle between the optical axis of the birefringent device and the normal to the surface of the birefringent device. For example, the displacement a satisfies the condition shown in formula (1):
  • where T represents the thickness of the birefringent device, n_o represents the refractive index for the o light, and n_e represents the refractive index for the e light.
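  • As a rough illustration of how the displacement scales (the patent's formula (1) is not reproduced above), the lateral shift of the e ray through a uniaxial plate at normal incidence can be estimated with the standard textbook walk-off relation. The Python sketch below is a hedged example under that assumption; the quartz indices and the 45-degree axis angle are illustrative values, not values taken from the embodiment.

        import math

        def walkoff_displacement(T, theta_deg, n_o, n_e):
            """Lateral displacement a of the e ray after a plate of thickness T
            (same unit as T), with the optic axis at angle theta to the surface
            normal, using the standard uniaxial walk-off relation
            tan(rho) = (n_o^2 - n_e^2)*sin(theta)*cos(theta)
                       / (n_e^2*cos(theta)^2 + n_o^2*sin(theta)^2), and a = T*tan(rho)."""
            th = math.radians(theta_deg)
            tan_rho = ((n_o ** 2 - n_e ** 2) * math.sin(th) * math.cos(th)
                       / (n_e ** 2 * math.cos(th) ** 2 + n_o ** 2 * math.sin(th) ** 2))
            return abs(T * tan_rho)

        # Example: a 1 mm quartz plate (n_o ~ 1.544, n_e ~ 1.553), optic axis at 45 degrees
        print(walkoff_displacement(1.0, 45.0, 1.544, 1.553))  # ~0.006 mm, i.e. a few microns

  • The small value in this illustrative example shows why low-birefringence crystals lead to thick plates, which is the motivation given below for preferring liquid-crystal materials.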
  • the birefringent device can use crystals, such as quartz crystals, barium borate crystals, lithium niobate crystals, or titanium dioxide crystals.
  • generally speaking, the birefringence of such crystals is small, which results in a thicker device and hence a larger volume for the VR device or the AR device.
  • liquid crystal can be used as the material of the birefringent device; the larger birefringence of liquid crystal materials allows the birefringent device to be made thinner, so that it fits better in a VR device or an AR device with limited space.
  • the liquid crystal material may be a liquid liquid crystal material or a liquid crystal polymer material cured by ultraviolet or heat.
  • the display module may further include a folded optical path, as shown in FIG. 9 , where the polarization shifting device is a birefringent device as an example.
  • the display module sequentially includes a display component, a polarization converter, a birefringent device and a folded optical path according to the beam transmission direction.
  • the folded light path is used to make the light beam output by the birefringent device incident on the human eye.
  • the folded optical path includes a half mirror, a reflective polarizer, one or more imaging lenses and multiple phase films. The phase films can be used to change the polarization state of incident light.
  • the imaging lens can be a single spherical lens or an aspheric lens, or a combination of multiple spherical or aspheric lenses.
  • the combination of multiple spherical or aspherical lenses can improve the imaging quality of the system and reduce the aberration of the system.
  • the present application may also use an optical path with other structures to make the light beam output by the birefringent device incident on the human eye, which is not specifically limited in this embodiment of the present application.
  • the image to be displayed is decomposed into multiple frames of low-resolution images.
  • the multiple frames of images can be obtained by sub-pixel sampling, that is, each sub-pixel in a pixel unit is treated as a separate pixel that participates in sampling; the sub-pixel values of each pixel in each frame are then calculated, for example by SPR, to obtain multiple frames of low-resolution images.
  • when sub-pixel sampling is performed, sub-pixels may be sampled from multiple adjacent pixel points to serve as the sub-pixels of one pixel point.
  • FIG. 10 takes the RGB Stripe arrangement and a slanted line as an example: (a) in FIG. 10 is a schematic diagram of sampling by pixel unit, (b) in FIG. 10 is a schematic diagram of sampling by sub-pixel, and (c) in FIG. 10 is a schematic diagram of sampling that takes into account the spacing between red, green and blue sub-pixels. It can be seen from FIG. 10 that after sub-pixel sampling is used, oblique lines are smoother and jaggedness is weakened. It should be noted that when the display components adopt different sub-pixel arrangements, the maximum resolution multiples that can be achieved by the solutions provided in the embodiments of the present application may differ, as shown in Table 1, for example.
  • the sub-pixels included in the first pixel in the first image are obtained by sampling from sub-pixels included in at least h adjacent pixels included in the image to be displayed.
  • h represents the number of low-resolution images that need to be decomposed for the image to be displayed.
  • the number of low-resolution images decomposed is 2.
  • the image source (with a resolution of P × Q) is resampled to obtain the image to be displayed.
  • the resolution of the image to be displayed is 2M × 2N
  • the resolution of the display component is M × N (P > M, Q > N)
  • multiple sub-pixels can be collected from each group of 4 adjacent pixels to form the sub-pixels of one pixel of a low-resolution image.
  • two frames of low-resolution images include a first image and a second image.
  • the sub-pixels included in the first pixel in the first image are sampled from the sub-pixels included in 4 adjacent pixels in the image to be displayed; the sub-pixels included in the second pixel in the second image are obtained by sampling from sub-pixels included in 4 adjacent pixel points; and the position coordinates of the first pixel point in the first image are the same as the position coordinates of the second pixel point in the second image.
  • take the RGB Stripe arrangement as an example.
  • the image to be displayed may be decomposed into two images by sub-pixel interval sampling.
  • for example, referring to FIG. 11A, (a) in FIG. 11A represents the image to be displayed before sampling, (b) in FIG. 11A represents the low-resolution image subframe 1 after sampling, and (c) in FIG. 11A represents the low-resolution image subframe 2 after sampling. It can be understood that the R sub-pixel, the G sub-pixel and the B sub-pixel included in each thickened rectangular box in (b) of FIG. 11A are regarded as one pixel point of subframe 1. It should be understood that FIG. 11A is only used as an example and does not specifically limit the sampling manner.
  • for another example, referring to FIG. 11B, (a) in FIG. 11B represents the image to be displayed before sampling, (b) in FIG. 11B represents the low-resolution image subframe 1 after sampling, and (c) in FIG. 11B represents the low-resolution image subframe 2 after sampling. It can be understood that the R sub-pixel, the G sub-pixel and the B sub-pixel included in each thickened rectangular box in (b) of FIG. 11B are regarded as one pixel point of subframe 1.
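  • To make the sub-pixel interval sampling concrete, the following Python sketch splits a 2M × 2N RGB image into two M × N subframes by taking diagonally opposite sub-pixels from each 2 × 2 block of adjacent pixels. The exact pattern is an illustrative assumption; FIG. 11A and FIG. 11B define the patterns actually used, and the per-sub-pixel values would additionally be refined by SPR as described later.

        import numpy as np

        def split_into_two_subframes(img):
            """img: image to be displayed, shape (2M, 2N, 3).
            Returns two (M, N, 3) subframes sampled from each 2x2 block of
            adjacent pixels (illustrative diagonal pattern)."""
            H, W, _ = img.shape
            M, N = H // 2, W // 2
            sub1 = np.zeros((M, N, 3), dtype=img.dtype)
            sub2 = np.zeros((M, N, 3), dtype=img.dtype)
            for c in range(3):                        # sample each colour channel separately
                block = img[:, :, c].reshape(M, 2, N, 2)
                sub1[:, :, c] = block[:, 0, :, 0]     # top-left pixel of each 2x2 block
                sub2[:, :, c] = block[:, 1, :, 1]     # bottom-right pixel (diagonal offset)
            return sub1, sub2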
  • when implementing horizontal super-resolution, the image source (with a resolution of P × Q) is resampled to obtain the image to be displayed.
  • the resolution of the image to be displayed is 2M × N, and the resolution of the display component is M × N.
  • multiple sub-pixels can be collected from every two horizontally adjacent pixels to form the sub-pixels of one pixel of a low-resolution image.
  • the resolution increase factor is 2.
  • when implementing vertical super-resolution, the image source (with a resolution of P × Q) is resampled to obtain the image to be displayed.
  • the resolution of the image to be displayed is M × 2N, the resolution of the display component is M × N, and multiple sub-pixels can be collected from every 2 vertically adjacent pixels to form the sub-pixels of one pixel of a low-resolution image.
  • the image to be displayed can be decomposed into two images by sub-pixel interval sampling. For example, as shown in FIG. 12, (a) in FIG. 12 represents the image to be displayed before sampling, (b) in FIG. 12 represents the low-resolution image subframe 1 after sampling, and (c) in FIG. 12 represents the low-resolution image subframe 2 after sampling. It should be understood that FIG. 12 is only used as an example.
  • the number of decomposed low-resolution images is four.
  • the image source (with a resolution of P × Q) is resampled to obtain the image to be displayed.
  • the resolution of the image to be displayed is 2M × 2N, the resolution of the display component is M × N, and multiple sub-pixels can be collected from each group of 4 adjacent pixels to form the sub-pixels of one pixel of a low-resolution image.
  • in general, the image source may be resampled first so that the resolution of the image to be displayed is kM × LN, where k and L are positive integers.
  • if the resolution of the image source is already kM × LN, the image source itself is the image to be displayed.
  • the control component obtains the image to be displayed, and then sends the image to be displayed to the display component in time division.
  • after the control component obtains the image source, it processes the image source according to the resolution of the image source and the resolution of the display component to obtain multiple frames of low-resolution images.
  • the image to be displayed can also be obtained by a mobile terminal connected to a display device (such as an AR device or a VR device); for example, the mobile terminal obtains the image to be displayed according to the resolution of the image source and the resolution of the display component, and then sends the image to be displayed to the control component.
  • the image to be displayed may be obtained by a mobile terminal connected to the AR device or VR device, for example, the mobile terminal obtains the image to be displayed according to the resolution of the image source and the resolution of the display component.
  • the mobile terminal decomposes the image to be displayed into multiple frames of low-resolution images, and then sends them to the control component, and sends a control signal to the control component, instructing the display component to display the display signal of multiple frames of low-resolution images.
  • the mobile terminal may be, for example, a mobile phone, a tablet computer, or a personal computer (PC).
  • take the pixel position adjustment component including a polarization converter and a birefringent device as an example, as shown in FIG. 3A and FIG. 3B.
  • the implementation manner is similar to the implementation manner using the polarization converter and the birefringent device, and will not be repeated in this application.
  • the control component or the terminal device may decompose the image to be displayed into two frames of low-resolution images.
  • the implementation of diagonal super-resolution is taken as an example.
  • the control component synchronously sends modulation signals to the polarization converter, and sends display signals to the display component. For example, at time T0, as shown in FIG. 13 , a high-level AC signal is sent to the polarization converter, and a low-resolution image sub-frame 1 is sent to the display component synchronously.
  • the polarization converter does not convert the polarization direction of the input target polarized light, so that the birefringent device emits o light, that is, no position offset is applied to the input target beam, see FIG. 13.
  • the control component then sends a low-level AC signal to the polarization converter, and synchronously sends the low-resolution image subframe 2 to the display component.
  • the polarization converter transforms the polarization direction of the target polarized light so that the birefringent device emits e light, that is, the position of the input target beam is shifted, as shown in FIG. 13.
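  • A minimal control-loop sketch of this time-division drive is shown below. It assumes a hypothetical driver API with set_level() and show() methods and a fixed subframe period; the mapping between signal level and shift follows the description above and is otherwise device-specific.

        import time

        def run_diagonal_superres(display, pol_converter, subframe1, subframe2, period_s=1 / 120):
            """One synthesis cycle: subframe 1 with a high-level signal (no shift),
            then subframe 2 with a low-level signal (diagonal shift).
            `display` and `pol_converter` are hypothetical driver objects."""
            for frame, high_level in ((subframe1, True), (subframe2, False)):
                pol_converter.set_level(high_level)   # hypothetical: drive the converter
                display.show(frame)                   # hypothetical: refresh the panel
                time.sleep(period_s)                  # hold for one subframe period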
  • the image transmitted to the human eye through the optical path is equivalent to the superposition of the low-resolution image subframe 1 and the low-resolution image subframe 2, as shown in FIG. 15 .
  • Each bold black box in Figure 15 represents a pixel.
  • the equivalent resolution of the display component can be increased by 2 times.
  • a higher multiple of resolution can also be achieved, but the combination of multiple sets of polarization converters and birefringent devices is required.
  • Resolution enhancement in horizontal, vertical and arbitrary directions can be achieved by controlling the azimuth angle between the birefringent device and the display assembly.
  • the resolution enhancement algorithm of the present application displays the low-resolution images in a time-division manner and utilizes the persistence of vision and visual synthesis functions of the human eye to synthesize a high-resolution image in the brain, so the frame rate of the synthesized high-resolution image is relatively low.
  • that is, the frame rate of the synthesized image is reduced by a certain proportion relative to the maximum frame rate of the display component.
  • the structure of the display module is shown in FIG. 3B as an example.
  • the control component or the terminal device may decompose the image to be displayed into 4 frames of low-resolution images.
  • the first pixel position adjustment component and the second pixel position adjustment component are combined to implement horizontal super-resolution, vertical super-resolution and diagonal super-resolution.
  • the resolution of the display component is M × N. If the resolution of the image source is not 2M × 2N, the image source may be resampled to obtain the image to be displayed, so that the resolution of the image to be displayed is 2M × 2N.
  • when decomposing the image to be displayed, 4 frames of images can first be collected by sampling, and then the pixel value of each pixel in each frame of image can be further calculated, for example by SPR.
  • the sub-pixel value of a pixel in the frame of image can be obtained by sampling from sub-pixels included in 4 adjacent pixels.
  • the image to be displayed is decomposed into 4 frames of images by sub-pixel sampling.
  • referring to FIG. 16A, (a) in FIG. 16A represents the image to be displayed before sampling, and (b) to (e) in FIG. 16A represent the low-resolution image subframes 1 to 4 after sampling.
  • FIG. 16A is only an example, and does not specifically limit the specific sampling manner.
  • referring to FIG. 16B, (a) in FIG. 16B represents the image to be displayed before sampling, and (b) to (e) in FIG. 16B represent the low-resolution image subframes 1 to 4 after sampling.
  • FIG. 16B is only used as an example, and does not specifically limit the specific sampling manner.
  • the O optical axis of the second birefringent device coincides with the O optical axis of the first birefringent device, and the e optical axis of the second birefringent device is perpendicular to the e optical axis of the first birefringent device.
  • when the polarization converter is OFF, the polarization converter does not convert the polarization direction of the input target polarized light, so that the birefringent device emits o light, that is, the position of the input target beam is not shifted.
  • when the polarization converter is ON, the polarization converter converts the polarization direction of the input target polarized light, so that the birefringent device emits e light.
  • when the first birefringent device emits the e ray, it shifts the target beam horizontally, and when the second birefringent device emits the e ray, it shifts the target beam vertically. See Table 2.
  • the control component controls the display component to display subframe 1, controls the first polarization converter to be in an ON state, and controls the second polarization converter to be in an OFF state.
  • the control component controls the display component to start displaying subframe 1, and sends high-level modulation signals to the first polarization converter and the second polarization converter. Therefore, the target polarized light carrying the sub-frame 1 does not shift after passing through the first birefringent device and the second birefringent device.
  • the control component controls the display component to display the subframe 2, controls the first polarization converter to be in the OFF state, and controls the second polarization converter to be in the ON state.
  • the control component controls the display component to start displaying subframe 2, sends a low-level modulation signal to the first polarization converter, and sends a high-level modulation signal to the second polarization converter. Therefore, the target polarized light carrying the sub-frame 2 is shifted horizontally after passing through the first birefringent device, and the beam does not continue to shift after passing through the second birefringent device. It can be understood that the target polarized light carrying the sub-frame 2 is shifted horizontally after passing through the first birefringent device and the second birefringent device.
  • the control component controls the display component to display the subframe 3 , controls the first polarization converter to be in the ON state, and controls the second polarization converter to be in the OFF state.
  • the control component controls the display component to start displaying subframe 3, sends a high-level modulation signal to the first polarization converter, and sends a low-level modulation signal to the second polarization converter. Therefore, the target polarized light carrying the sub-frame 3 passes through the first birefringent device without beam shift, and then passes through the second birefringent device and then undergoes a vertical shift of the beam.
  • the target polarized light carrying the sub-frame 3 is vertically shifted after passing through the first birefringent device and the second birefringent device.
  • the control component controls the display component to display the subframe 4, controls the first polarization converter to be in the OFF state, and controls the second polarization converter to be in the OFF state.
  • the control component controls the display component to start displaying subframe 4, sends a high-level modulation signal to the first polarization converter, and sends a high-level modulation signal to the second polarization converter.
  • the target polarized light carrying the sub-frame 4 is horizontally shifted through the first birefringent device, and then continues to be vertically shifted after passing through the second birefringent device. It can be understood that the target polarized light carrying the sub-frame 4 is diagonally shifted after passing through the first birefringent device and the second birefringent device.
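  • The four-subframe sequence described above can be summarized, as a sketch, by mapping each subframe to the shift it should receive; how a given pair of converter drive levels realizes each shift is device-specific and is only assumed here.

        # Shift schedule, in units of the pixel pitches Px (horizontal) and Py (vertical).
        SHIFT_SCHEDULE = {
            1: (0.0, 0.0),   # subframe 1: no shift
            2: (0.5, 0.0),   # subframe 2: horizontal shift of Px/2
            3: (0.0, 0.5),   # subframe 3: vertical shift of Py/2
            4: (0.5, 0.5),   # subframe 4: diagonal shift of Px/2 and Py/2
        }

        def converter_states(subframe, shift_to_states):
            """shift_to_states is a device-specific (hypothetical) mapping from the desired
            (dx, dy) shift to the ON/OFF states of the first and second polarization
            converters; the schedule itself follows the subframe description above."""
            return shift_to_states(*SHIFT_SCHEDULE[subframe])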
  • the image transmitted to the human eye through the optical path is equivalent to the superimposition of the low-resolution image subframe 1-low-resolution image subframe 4, as shown in FIG. 17 .
  • the 4 low-resolution image subframes shown above the arrow are superimposed to obtain the effect shown below the arrow. It can be seen that the resolution is increased by a factor of 4.
  • the equivalent resolution of the display component can be increased by 4 times.
  • the resolution enhancement algorithm of the present application displays the low-resolution images in a time-division manner and utilizes the persistence of vision and visual synthesis functions of the human eye to synthesize a high-resolution image in the brain, so the frame rate of the synthesized high-resolution image is relatively low.
  • that is, the frame rate of the synthesized image is reduced by a certain proportion relative to the maximum frame rate of the display component.
  • the embodiment of the present application uses a sub-pixel-level decomposition method to split the image to be displayed. After sub-pixel sampling is used, edges are smoother and jaggedness is weakened, but color fringing may appear. To weaken the color fringing, the embodiment of the present application provides a sub-pixel rendering (SPR) method to determine the pixel value of each sub-pixel.
  • the pixel value of the set-color sub-pixel of the pixel (i, j) may be determined by weighting the pixel value of the set-color sub-pixel of the pixel (i, j) and the pixel values of the set-color sub-pixels of surrounding pixels.
  • the weight of a sub-pixel in the set area with the same color as the first sub-pixel is inversely proportional to the sub-pixel distance; the sub-pixel distance is the distance between that same-color sub-pixel and the sampling position of the first sub-pixel in the image to be displayed.
  • the size of the set area is related to the distance between the display component and the imaging plane of the wearable device. For example, the greater the distance between the display component and the imaging plane of the wearable device, the larger the set area. It can be understood that the size of the set area increases with the distance between the display component and the imaging plane of the wearable device.
  • the size of the setting area is related to the pixel size of the display component. The size of the setting area can be configured according to the pixel size of the display component. In some other embodiments, the size of the setting area is related to the display content of the display component. For example, the size of the setting area can be adaptively adjusted according to the display content of the display component.
  • the size of the set area may be adaptively adjusted according to the scene to which the image displayed by the display component belongs. For example, the set area corresponding to a task scene is larger than the set area corresponding to a grassland scene.
  • the pixel value of the sub-pixel of the sampled pixel point can be determined by the following formula, taking the first sub-pixel as an example:
  • where q(i, j) represents the pixel value of the first sub-pixel, i and j represent the abscissa and ordinate of the first sub-pixel in the pixels of the image to be displayed, Q(i, j) represents the pixel value of the sub-pixel at the sampling position of the first sub-pixel in the image to be displayed, and ω1, ω2, ω3, ω4 and ω5 represent the weights.
  • the pixel value of the sub-pixel of the sampled pixel point can be determined by the following formula, taking the first sub-pixel as an example:
  • where q(i, j+1) represents the pixel value of the first sub-pixel, i and j+1 represent the abscissa and ordinate of the first sub-pixel in the pixels of the image to be displayed, Q(i, j+1) represents the pixel value of the sub-pixel at the sampling position of the first sub-pixel in the image to be displayed, and α1, α2, α3, α4, β1, β2, β3 and β4 represent the weights.
  • the pixel value g(i,j) of the green sub-pixel in the pixel at position (i,j) can be jointly determined by the green G components of the pixels within the diamond-shaped region defined by positions (i-2,j), (i-1,j-1), (i,j-2), (i+1,j-1), (i+2,j), (i+1,j+1), (i-1,j+1) and (i,j+2).
  • green sub-pixels of pixels whose area within the rhombic range is not greater than 50% do not participate in the determination.
  • the green sub-pixel value at position (i,j) is obtained by weighting the pixel values of the green sub-pixels of the five pixels at positions (i,j), (i-1,j), (i,j-1), (i+1,j) and (i,j+1):
  • g(i,j) = ω1*G(i-1,j) + ω2*G(i,j-1) + ω3*G(i+1,j) + ω4*G(i,j+1) + ω5*G(i,j).
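  • A minimal Python sketch of this weighted determination is given below; the five-pixel neighbourhood is as listed above, while the specific weight values are illustrative (the embodiment only requires that the weights fall off with distance from the sampling position).

        def spr_green(G, i, j, w=(0.125, 0.125, 0.125, 0.125, 0.5)):
            """Weighted green sub-pixel value g(i, j). G is a 2-D array of the green
            components of the image to be displayed, indexed as G[i, j];
            w = (w1..w5) are illustrative weights summing to 1."""
            w1, w2, w3, w4, w5 = w
            return (w1 * G[i - 1, j] + w2 * G[i, j - 1]
                    + w3 * G[i + 1, j] + w4 * G[i, j + 1]
                    + w5 * G[i, j])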
  • the pixel value r(i, j+1) of the red sub-pixel in the pixel at position (i, j+1) can be jointly determined by the red R components of the pixels within the diamond-shaped region defined by positions (i-2, j+1), (i-1, j), (i, j-1), (i+1, j), (i+2, j+1), (i+1, j+2), (i, j+3) and (i-1, j+1).
  • red sub-pixels of pixels whose area within the rhombic range is not more than 50% do not participate in the determination.
  • the red sub-pixel value at position (i, j+1) is obtained from the red sub-pixels of the pixels at positions (i-1, j), (i, j-1), (i+1, j), (i, j+1) and (i, j+2).
  • the eight pixels can be divided into two categories: the red sub-pixels in the four pixels above, below, to the left and to the right of position (i, j+1) form the second category, and the red sub-pixels in the remaining pixels form the first category.
  • the weight distribution ratio of the two types of sub-pixels is 0.5:0.5.
  • the pixel value r(i,j+1) of the red sub-pixel at position (i,j+1) can be determined as follows:
  • r(i,j+1) = α1*R(i-1,j) + α2*R(i,j-1) + α3*R(i+1,j) + α4*R(i,j+1)
  • the weights corresponding to the 8 sub-pixels can be determined by the distances between the geometric centers of the pixels where the 8 sub-pixels are located and the red sub-pixel at position (i, j+1). Take a sub-pixel aspect ratio of 1:3 as an example.
  • the weights corresponding to the eight sub-pixels are respectively determined by the following formulas, one for the first category and one for the second category:
  • where x characterizes how the proportion of the pixel value allocated to the red sub-pixel varies with distance, and x ≥ 0.
  • the pixel value of the red sub-pixel is determined by weighting the pixel values of the surrounding sub-pixels and its own pixel value.
  • the weights corresponding to the 8 sub-pixels involved in determining the pixel value of the red sub-pixel at (i, j+1) can be seen in FIG. 19: the black solid dot represents the red sub-pixel at (i, j+1), and the black dashed hollow dots represent the center positions of the 8 pixels contributing to the pixel value of the red sub-pixel at (i, j+1).
  • the weight can be determined according to the distance between the black dotted hollow dots and the black solid dots.
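  • Since the category-specific weight formulas are not reproduced above, the sketch below shows one simple inverse-distance weighting consistent with the stated idea that weights decrease with the distance between a contributing pixel's center and the target red sub-pixel; the exponent x plays the role of the tunable quantity x mentioned above.

        import math

        def distance_weights(centers, target, x=1.0):
            """Weights that decrease with distance and are normalised to sum to 1.
            `centers` are the geometric centres of the contributing pixels (assumed
            distinct from `target`), `target` is the position of the red sub-pixel
            being computed, and x >= 0 controls the fall-off.
            Illustrative scheme, not the embodiment's exact formulas."""
            raw = [1.0 / (math.dist(c, target) ** x) for c in centers]
            total = sum(raw)
            return [r / total for r in raw]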
  • by analogy, the method of determining the pixel value b(i, j) of the blue sub-pixel in the SPR process can be obtained.
  • the present application supports the display module operating in two working modes:
  • a super-resolution mode and a normal mode.
  • when the control component determines that the super-resolution mode is enabled, it performs super-resolution processing; when it determines that the normal mode is enabled, the resolution remains unchanged and the frame rate does not decrease.
  • in the normal mode, the TNLC uniformly keeps the polarization direction of the target polarized light unchanged (or uniformly rotates the polarization direction by 90 degrees).
  • the control component can resample the image source to the resolution of the display component, and directly output to the display component. This normal mode can be applied to some high frame rate scenarios, such as games.
  • another display module which realizes super-resolution through a polarization converter and a birefringent device.
  • the display module includes a display component 2301, a polarization rotator 2302, a birefringent device 2303 and a control component 2304.
  • the display component 2301, the polarization rotator 2302 and the birefringent device 2303 are placed in sequence along the transmission direction of the optical path.
  • the display module further includes a folding optical path 2305 .
  • the display component 2301 is used to display images.
  • the polarization rotator 2302 can modulate the polarization direction of the incident polarized light according to the magnitude of the voltage signal applied to it, so that polarization rotation at any angle can be achieved, and separate pixel-level or sub-pixel-level control can be achieved through the transistor array on the polarization rotator 2302.
  • the display component 2301 displays the image to be processed, where the resolution of the image to be processed is the same as that of the display component.
  • the polarization rotator 2302 adjusts the polarization direction of the light beam of each pixel (or each sub-pixel in the pixel) of the image to be processed under the control of the control component 2304, and outputs it to the birefringent device 2303.
  • the birefringent device 2303 decomposes the light beam of each pixel included in the image to be processed, outputs, at the first position, the first target polarized light used to project the first sub-image, and outputs, at the second position, the second target polarized light used to project the second sub-image.
  • the polarization directions of the input polarized light at different pixel positions may be different, and the decomposition ratios of two pixels with different polarization directions are also different.
  • the output light of the birefringent device can be divided into o light and e light.
  • the o light, also called ordinary light, has a polarization direction perpendicular to the plane of incidence.
  • the e light, also called extraordinary light, has a polarization direction lying in the plane of incidence.
  • the birefringent device outputs only o light when a beam of one particular polarization direction is input, and outputs only e light when a beam of another particular polarization direction is input; for the remaining polarization directions, both o light and e light are output.
  • the embodiment of the present application uses this principle: target polarized light with different polarization directions is input into the birefringent device for different pixel or sub-pixel positions, so that after the pixels at the two positions output by the birefringent device 2303 are superimposed, their pixel values are approximately the same as, or infinitely close to, the pixel values of the corresponding pixels in the image to be displayed.
  • the first pixel and the second pixel are two pixels with different polarization directions in the image to be processed.
  • the decomposition ratio of the first pixel is different from that of the second pixel.
  • the decomposition ratio of the first pixel is the ratio of the luminous intensity of the pixel projected onto the first sub-image to the luminous intensity of the pixel projected onto the second sub-image after the light beam of the first pixel is decomposed; the decomposition ratio of the second pixel is the ratio of the luminous intensity of the pixel projected onto the first sub-image to the luminous intensity of the pixel projected onto the second sub-image after the light beam of the second pixel is decomposed.
  • the polarization rotator 2302 may be any of a twisted nematic polarization rotator (TNPR), an in-plane switching polarization rotator (IPSPR), a combination of a cholesteric liquid crystal (CLC) and a quarter-wave plate, a TNLC with a thin film transistor (TFT) circuit, etc.
  • the TNLC with TFT is composed of a liquid crystal layer sandwiched between two conductive substrates.
  • when the voltage applied to the TNLC with TFT is between 0 and Vc, the polarization direction of the incident polarized beam is rotated by 0 to 90 degrees after passing through the TNLC; the specific rotation angle is related to the applied voltage and the specific material of the TNLC.
  • the display component 2301 can be an ordinary liquid crystal display (LCD), an OLED, or a more advanced micro-LED display, where the OLED display has higher luminous efficiency and a higher contrast ratio, and the micro-LED display has higher luminous brightness and can be applied to scenes that require stronger brightness.
  • the control component 2304 may be, for example, a processor, a microprocessor, a controller or another control component; for example, it may be a general-purpose central processing unit (CPU), a general-purpose processor, a digital signal processor (DSP), an application specific integrated circuit (ASIC), a field programmable gate array (FPGA) or another programmable logic device, a transistor logic device, a hardware component, or any combination thereof.
  • the birefringent device can use crystals, such as quartz crystals, barium borate crystals, quartz crystals, lithium niobate crystals, or titanium dioxide crystals. Generally speaking, the birefringence of the crystal is small, resulting in a larger volume of the VR device or the AR device.
  • liquid crystal can be used as the material of the birefringent device; the larger birefringence of liquid crystal materials allows the birefringent device to be made thinner, so that it fits better in a VR device or an AR device with limited space.
  • the liquid crystal material may be a liquid liquid crystal material or a liquid crystal polymer material cured by ultraviolet or heat.
  • the birefringent device 2303 uses the liquid crystal polymer RM 257. After surface orientation or electric-field orientation, the angle between the director of the liquid crystal molecules and the surface is approximately 45 degrees, and the liquid crystal polymer is then formed through ultraviolet curing.
  • in the following description, the devices in the display module are not numbered.
  • the voltage signal applied to the polarization rotator is controlled by the control component, and the polarization rotator can rotate vertical linearly polarized light into linearly polarized light at any angle. Since the polarization rotator can realize sub-pixel-level control, the polarization directions of the light from each pixel, or even each sub-pixel, of the display component are not necessarily consistent after passing through the polarization rotator.
  • the light beam reaches the birefringent device after passing through the polarization rotator, and is divided into two beams by the birefringent device, o light and e light.
  • the intensity ratio of the two beams is determined by the polarization direction or polarization angle of the incident light, as shown in Figure 22.
  • a beam of light will be divided into two beams after passing through the birefringent device, and the two beams of light correspond to two images, which are the first sub-image and the second sub-image respectively.
  • with the birefringent device arranged in the display module, the position of the first sub-image and the position of the second sub-image are separated by Py/2 in the vertical direction and/or Px/2 in the horizontal direction, where Py represents the pixel pitch in the vertical direction and Px represents the pixel pitch in the horizontal direction.
  • the persistence of vision and visual synthesis functions of the human eye are used to synthesize a high-resolution image in the brain, so the frame rate of the synthesized high-resolution image is reduced, but the resolution is improved.
  • the control component can estimate and adjust the luminous intensity of each pixel to be projected in the first sub-image and the luminous intensity of each pixel to be projected in the second sub-image, so that the similarity between the image obtained by superimposing and projecting the adjusted first sub-image and second sub-image and the image to be displayed is greater than a set threshold; in other words, the superimposed and projected image is substantially the same as the image to be displayed.
  • the set threshold is determined according to human eyes' ability to perceive image differences.
  • the control component can control the polarization rotator to adjust the light beam of each pixel of the image to be processed according to the adjusted luminous intensity of each pixel of the first sub-image and the adjusted luminous intensity of each pixel of the second sub-image direction of polarization.
  • different polarization directions correspond to different distribution ratios of the output luminous intensity between the o light and the e light. Therefore, after the luminous intensity of each pixel of the first sub-image and the second sub-image is determined, the polarization direction of the input light beam corresponding to each pixel can be determined according to this correspondence.
  • different polarization directions correspond to different applied voltages, so that the control component can apply a corresponding applied voltage to each pixel or sub-pixel according to the applied voltage corresponding to the polarization direction.
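  • Under the usual assumption that a linear polarization at angle phi to the o axis splits with intensities proportional to cos^2(phi) (o light) and sin^2(phi) (e light), the polarization angle required for a desired split can be computed as in the sketch below; the voltage needed to reach that angle depends on the specific rotator and is not modelled here.

        import math

        def polarization_angle_deg(I_o, I_e):
            """Angle (degrees) between the linear polarization and the o axis of the
            birefringent device that yields the intensity split I_o : I_e, assuming an
            ideal cos^2/sin^2 (Malus-type) split."""
            return math.degrees(math.atan2(math.sqrt(I_e), math.sqrt(I_o)))

        print(polarization_angle_deg(0.5, 0.5))  # equal split between the sub-images -> 45.0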
  • the voltage signal strength of each pixel (or sub-pixel) on the polarization rotator and the content to be shown on the display component can be obtained by the following optimization formula (1):
  • S is a column vector composed of all pixel values of the target image
  • R is the high-resolution image synthesized from the low-resolution first sub-image V1 composed of o light and the low-resolution second sub-image V2 composed of e light, which can be understood as the image actually displayed.
  • R is arranged into a column vector of pixels or sub-pixels in the same way as S.
  • M is a mapping matrix that maps the first sub-image V1 and the second sub-image V2 to R.
  • the image to be processed, after passing through the birefringent device, is decomposed into a first sub-image (o-light subframe V1) and a second sub-image (e-light subframe V2).
  • the image synthesized from the first sub-image and the second sub-image is shown in FIG.
  • for example, if the ninth pixel of R is synthesized from the 2nd pixel of V1 and the 6th pixel of V2, then in the ninth row of the mapping matrix M, the 2nd element and the (P+6)th element are 1 and the other elements are 0 (where P is the total number of pixels of a subframe).
  • mapping matrix M is as follows:
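  • A hedged sketch of the optimization, assuming that formula (1) amounts to a least-squares fit of R = M·[V1; V2] to the target vector S, is shown below; real systems would also impose display-range and non-negativity constraints that are omitted here.

        import numpy as np

        def solve_subframes(S, M):
            """Find the stacked sub-frame vector V = [V1; V2] (each part of length P) so
            that R = M @ V is as close as possible to the target column vector S in the
            least-squares sense."""
            V, *_ = np.linalg.lstsq(M, S, rcond=None)
            P = V.shape[0] // 2
            return V[:P], V[P:]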
  • an embodiment of the present application further provides an imaging control method, which is applied to a wearable device.
  • the wearable device includes a display component and a pixel position adjustment component, and the pixel position adjustment component includes a polarization converter and a birefringence device.
  • the display device includes a display component and a pixel position adjustment component.
  • the display component includes a plurality of pixels, and each pixel of the plurality of pixels includes a plurality of sub-pixels.
  • the time when the display component displays the first image is synchronized with the time when the pixel position adjustment component adjusts the first image, and the first image is any one of the multiple frames of images.
  • the display device supports two modes: a super-resolution mode and a normal mode.
  • the image to be displayed is decomposed at the sub-pixel level to obtain a multi-frame image.
  • the image to be displayed is down-sampled to an image to be processed; the image to be processed is input to the display component, so that the display component displays the image to be processed; and the pixel position adjustment component is used to output the image to be processed at a set position.
  • the polarization converter no longer switches its state, that is, the polarization direction of the output target polarized light no longer needs to be adjusted.
  • the sub-pixels included in the first pixel in the first image of the multi-frame images are obtained by sampling from sub-pixels included in at least h adjacent pixels included in the image to be displayed.
  • h is the number of images in the multi-frame images
  • the first image is any image in the multi-frame images
  • the first pixel is any pixel in the first image.
  • the pixel value of the first sub-pixel included in the first pixel point is determined according to the pixel value of the sub-pixel that is the same color as the first sub-pixel included in the set area of the image to be displayed;
  • the geometric center of the set area is the sampling position of the first sub-pixel in the image to be displayed.
  • the pixel value of the first sub-pixel included in the first pixel point is obtained by weighting and summing the pixel values of the sub-pixels included in the set area with the same color as the first sub-pixel;
  • the weight of a sub-pixel in the set area with the same color as the first sub-pixel is inversely proportional to the sub-pixel distance; the sub-pixel distance is the distance between that same-color sub-pixel and the sampling position of the first sub-pixel in the image to be displayed.
  • the display device is a wearable device
  • the size of the set area is related to the distance between the display component and the imaging plane of the wearable device.
  • the size of the setting area is related to the pixel size of the display component.
  • the size of the setting area is related to the display content of the display component.
  • the pixel value of the first sub-pixel satisfies the condition shown in the following formula:
  • where q(i, j) represents the pixel value of the first sub-pixel, i and j represent the abscissa and ordinate of the first sub-pixel in the pixels of the image to be displayed, Q(i, j) represents the pixel value of the sub-pixel at the sampling position of the first sub-pixel in the image to be displayed, and ω1, ω2, ω3, ω4 and ω5 represent the weights.
  • the pixel value of the first sub-pixel satisfies the condition shown in the following formula:
  • where q(i, j+1) represents the pixel value of the first sub-pixel, i and j+1 represent the abscissa and ordinate of the first sub-pixel in the pixels of the image to be displayed, Q(i, j+1) represents the pixel value of the sub-pixel at the sampling position of the first sub-pixel in the image to be displayed, and α1, α2, α3, α4, β1, β2, β3 and β4 represent the weights.
  • the pixel position adjustment component includes a polarization converter and a polarization shifting device
  • controlling the pixel position adjustment component to adjust, in time division, the position of each frame of image displayed by the display component includes: inputting the multiple frames of images to the display component in time division, so that the display component emits target polarized light in time division, where the target polarized light is used to carry each frame of image in the multiple frames of images; and controlling the polarization converter to adjust, in time division, the polarization direction of the output target polarized light, so that when the polarization direction of the target polarized light output by the polarization converter is the first polarization direction, the polarization shifting device outputs the target polarized light at the first position, and when the polarization direction of the target polarized light output by the polarization converter is the second polarization direction, the polarization shifting device outputs the target polarized light at the second position.
  • the multiple frames of images include a first image and a second image
  • Control the polarization converter to adjust the polarization direction of the output target polarized light in time division including:
  • in the first time unit, the polarization converter is controlled to adjust the polarization direction of the output target polarized light bearing the first image to the first polarization direction, and in the second time unit, the polarization converter is controlled to adjust the polarization direction of the output target polarized light bearing the second image to the second polarization direction, so that the first position at which the polarization shifting device outputs the target polarized light bearing the first image and the second position at which it outputs the target polarized light bearing the second image are spaced horizontally by Px/2; Px represents the distance between adjacent pixels of the first image or the second image in the horizontal direction.
  • the sub-pixels included in the first pixel in the first image are sampled from the sub-pixels included in two horizontally adjacent pixels included in the image to be displayed;
  • the sub-pixels included in the second pixel in the second image are obtained by sampling from the sub-pixels included in two horizontally adjacent pixels;
  • the position coordinates of the first pixel in the first image are the same as the position coordinates of the second pixel in the second image.
  • the multiple frames of images include a first image and a second image
  • Control the polarization converter to adjust the polarization direction of the output target polarized light in time division including:
  • in the first time unit, the polarization converter is controlled to adjust the polarization direction of the output target polarized light bearing the first image to the first polarization direction, and in the second time unit, the polarization converter is controlled to adjust the polarization direction of the output target polarized light bearing the second image to the second polarization direction, so that the first position at which the polarization shifting device outputs the target polarized light bearing the first image and the second position at which it outputs the target polarized light bearing the second image are spaced vertically by Py/2; Py represents the distance between adjacent pixels of the first image or the second image in the vertical direction.
  • the sub-pixels included in the first pixel in the first image are obtained by sampling from sub-pixels included in two adjacent pixels in the vertical direction in the image to be displayed;
  • the sub-pixels included in the second pixel in the second image are sampled from the sub-pixels included in two adjacent pixels in the vertical direction;
  • the position coordinates of the first pixel in the first image are the same as the position coordinates of the second pixel in the second image.
  • the multiple frames of images include a first image and a second image
  • Control the polarization converter to adjust the polarization direction of the output target polarized light in time division including:
  • in the first time unit, the polarization converter is controlled to adjust the polarization direction of the input target polarized light bearing the first image to the first polarization direction, and in the second time unit, the polarization converter is controlled to adjust the polarization direction of the input target polarized light bearing the second image to the second polarization direction, so that the first position at which the polarization shifting device outputs the target polarized light bearing the first image and the second position at which it outputs the target polarized light bearing the second image are offset by Py/2 in the vertical direction and by Px/2 in the horizontal direction, where Py represents the vertical distance between adjacent pixels of the first image or the second image, and Px represents the horizontal distance between adjacent pixels of the first image or the second image.
  • the sub-pixels included in the first pixel in the first image are obtained by sampling from the sub-pixels included in two adjacent pixels in a diagonal direction in the image to be displayed;
  • the sub-pixels included in the second pixel in the second image are sampled from the sub-pixels included in two adjacent pixels in the diagonal direction;
  • the position coordinates of the first pixel in the first image are the same as the position coordinates of the second pixel in the second image.
  • the embodiment of the present application also provides another imaging control method, which is applied to a wearable device.
  • the wearable device includes a display component and a pixel position adjustment component, and the pixel position adjustment component includes a polarization rotator and a birefringent device.
  • FIG. 25 it is a schematic flowchart of a possible imaging control method.
  • the decomposition ratio of the first pixel is different from that of the second pixel; the first pixel and the second pixel are two pixels with different polarization directions in the image to be processed; the decomposition ratio of the first pixel is the ratio of the luminous intensity of the pixel projected onto the first sub-image to the luminous intensity of the pixel projected onto the second sub-image after the light beam of the first pixel is decomposed; and the decomposition ratio of the second pixel is the ratio of the luminous intensity of the pixel projected onto the first sub-image to the luminous intensity of the pixel projected onto the second sub-image after the light beam of the second pixel is decomposed.
  • the position of the first sub-image and the position of the second sub-image are separated by Py/2 in the vertical direction and/or Px/2 in the horizontal direction, where Py represents the pixel pitch in the vertical direction and Px represents the pixel pitch in the horizontal direction.
  • controlling the polarization rotator to adjust the polarization direction of the light beam of each pixel of the output image to be processed includes:
  • the polarization rotator is controlled to adjust the polarization direction of the light beams of the sub-pixels included in each pixel of the image to be processed.
  • controlling the polarization rotator to adjust the polarization direction of the light beam of each pixel of the output image to be processed includes:
  • the polarization rotator is controlled to adjust the polarization direction of the light beam of each pixel of the image to be processed.
  • controlling the polarization rotator to adjust the polarization direction of the light beam of each pixel of the output image to be processed includes:
  • the luminous intensity of each pixel of the first sub-image and of the second sub-image is adjusted so that the similarity between the superimposed projected image and the image to be displayed is greater than a set threshold; the set threshold is determined according to the human eye's ability to perceive image differences;
  • the polarization rotator is controlled to adjust the polarization direction of the light beam of each pixel of the image to be processed.
  • the method steps in the embodiments of the present application may be implemented by means of hardware, or may be implemented by means of a processor executing software instructions.
  • Software instructions can be composed of corresponding software modules, and software modules can be stored in random access memory (random access memory, RAM), flash memory, read-only memory (read-only memory, ROM), programmable read-only memory (programmable ROM) , PROM), erasable programmable read-only memory (erasable PROM, EPROM), electrically erasable programmable read-only memory (electrically EPROM, EEPROM), register, hard disk, mobile hard disk, CD-ROM or known in the art any other form of storage medium.
  • An exemplary storage medium is coupled to the processor such that the processor can read information from, and write information to, the storage medium.
  • the storage medium may also be a component of the processor.
  • the processor and storage medium can be located in the ASIC.
  • the ASIC may be located in a head-mounted display device or a terminal device.
  • the processor and the storage medium may also exist in the head-mounted display device or the terminal device as discrete components.
  • all or part of them may be implemented by software, hardware, firmware or any combination thereof.
  • When implemented using software, it may be implemented in whole or in part in the form of a computer program product.
  • the computer program product comprises one or more computer programs or instructions. When the computer program or instructions are loaded and executed on the computer, the processes or functions described in the embodiments of the present application are executed in whole or in part.
  • the computer may be a general purpose computer, a special purpose computer, a computer network, network equipment, user equipment, or other programmable devices.
  • the computer program or instructions can be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another computer-readable storage medium; for example, the computer program or instructions can be transmitted from a website, computer, server or data center to another website, computer, server or data center by wired or wireless means.
  • the computer-readable storage medium may be any available medium that can be accessed by a computer, or a data storage device such as a server or a data center integrating one or more available media. The usable medium may be a magnetic medium, for example, a floppy disk, hard disk or magnetic tape; it may also be an optical medium, for example, a digital video disc (DVD); or it may be a semiconductor medium, for example, a solid state drive (SSD).
  • At least one means one or more, and “multiple” means two or more.
  • “And/or” describes the association relationship of associated objects, indicating that there may be three types of relationships, for example, A and/or B, which can mean: A exists alone, A and B exist simultaneously, and B exists alone, where A, B can be singular or plural.
  • “At least one of the following” or similar expressions refer to any combination of these items, including any combination of single or plural items.
  • at least one item (piece) of a, b or c can mean: a, b, c, "a and b", “a and c", “b and c", or "a and b and c ", where a, b, c can be single or multiple.
  • the character “/” generally indicates that the contextual objects are an “or” relationship. In the formulas of this application, the character “/” indicates that the front and back related objects are in a “division” relationship.
  • the symbol "(a, b)" means an open interval, with a range greater than a and less than b; "[a, b]" means a closed interval, with a range greater than or equal to a and less than or equal to b; "(a, b]" means a half-open, half-closed interval, with a range greater than a and less than or equal to b; and "[a, b)" means a half-closed, half-open interval, with a range greater than or equal to a and less than b.
  • The word “exemplary” is used to mean serving as an example, instance, or illustration. Any embodiment or design described as “exemplary” in this application should not be construed as preferred over or more advantageous than other embodiments or designs. Rather, the use of the word “exemplary” is intended to present a concept in a concrete manner and does not constitute a limitation on this application.

Abstract

Disclosed are a display module and an imaging control method, which are used to provide a super-resolution scheme for a display device. In the method, a control component (300) performs sub-pixel-level decomposition on a high-resolution image to obtain a plurality of frames of low-resolution images; the low-resolution images are then displayed by a display component (100) in a time-division manner; and a pixel-position adjustment component (200) is controlled, likewise in a time-division manner, to adjust the position of each frame of image displayed by the display component (100), that is, the plurality of frames of low-resolution images are displayed in a time-division manner. By means of the visual persistence and visual synthesis functions of the human eye, the plurality of frames of low-resolution images are superimposed in the human eye, so that the human eye sees a high-resolution image; and the jagged (aliasing) impression is reduced by the sub-pixel-level decomposition, so that the imaging effect is improved.
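As an illustration of the scheme summarized above, the following Python sketch shows one simplified way in which a high-resolution image could be decomposed into several low-resolution frames for time-division display. It interleaves whole pixels on a regular grid rather than performing the finer sub-pixel (RGB) rearrangement described in the application, and the function name, the decomposition factor, and the half-pixel shift sequence mentioned in the comments are illustrative assumptions, not details taken from the disclosure.

```python
import numpy as np

def decompose_subpixel(high_res: np.ndarray, factor: int = 2) -> list:
    """Split a high-resolution image into factor*factor low-resolution frames.

    Frame (i, j) keeps the pixels at row offset i and column offset j of the
    high-resolution grid; displaying the frames one after another, each shifted
    by the matching fraction of a pixel pitch, lets the eye integrate them back
    into the original image (illustrative decomposition only).
    """
    h, w = high_res.shape[:2]
    assert h % factor == 0 and w % factor == 0, "dimensions must be divisible by factor"
    frames = []
    for i in range(factor):        # row offset inside each factor x factor block
        for j in range(factor):    # column offset inside each block
            frames.append(high_res[i::factor, j::factor, ...])
    return frames

if __name__ == "__main__":
    # A 2160 x 3840 RGB image becomes four 1080 x 1920 frames that would be shown
    # in time division with (0, 0), (0, 0.5), (0.5, 0) and (0.5, 0.5) pixel-pitch
    # shifts applied by a pixel-position adjustment component.
    img = np.random.randint(0, 256, (2160, 3840, 3), dtype=np.uint8)
    frames = decompose_subpixel(img)
    print([f.shape for f in frames])  # four arrays of shape (1080, 1920, 3)
```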
PCT/CN2022/121488 2021-09-30 2022-09-26 Module d'affichage et procédé de commande d'imagerie WO2023051479A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202111164540.XA CN115909913A (zh) 2021-09-30 2021-09-30 Display module and imaging control method
CN202111164540.X 2021-09-30

Publications (1)

Publication Number Publication Date
WO2023051479A1 true WO2023051479A1 (fr) 2023-04-06

Family

ID=85729581

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/121488 WO2023051479A1 (fr) 2021-09-30 2022-09-26 Module d'affichage et procédé de commande d'imagerie

Country Status (2)

Country Link
CN (1) CN115909913A (fr)
WO (1) WO2023051479A1 (fr)

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0463332A (ja) * 1990-07-02 1992-02-28 Nippon Telegr & Teleph Corp <Ntt> Projection display device
JP2003222832A (ja) * 2001-11-22 2003-08-08 Sharp Corp Image shift element and image display device
CN1533563A (zh) * 2001-05-09 2004-09-29 Conversion of one sub-pixel format data to another sub-pixel data format
CN104680949A (zh) * 2015-03-25 2015-06-03 京东方科技集团股份有限公司 Pixel array, display driving method, display driving device and display device
CN106023909A (zh) * 2016-08-12 2016-10-12 京东方科技集团股份有限公司 Display device and display method thereof
CN111866588A (zh) * 2019-04-30 2020-10-30 深圳光峰科技股份有限公司 Image splitting method and image display method
CN112005161A (zh) * 2018-03-30 2020-11-27 华为技术有限公司 Imaging device, display apparatus and imaging equipment
CN113156744A (zh) * 2020-01-23 2021-07-23 亘冠智能技术(杭州)有限公司 DMD projection system with resolution doubling based on the birefringence principle
CN113452942A (zh) * 2021-06-03 2021-09-28 华东师范大学 4K-resolution video image pre-processing method for a digital micromirror chip
CN113489960A (zh) * 2021-06-30 2021-10-08 青岛海信激光显示股份有限公司 Projection display device, method and system
CN113542701A (zh) * 2020-04-20 2021-10-22 青岛海信激光显示股份有限公司 Projection display method and projection device

Also Published As

Publication number Publication date
CN115909913A (zh) 2023-04-04

Similar Documents

Publication Publication Date Title
Zhan et al. Multifocal displays: review and prospect
Haas 40‐2: Invited paper: Microdisplays for augmented and virtual reality
TWI592919B (zh) Ultra-high resolution display using serial panels
Lueder 3D Displays
US8681075B2 (en) Display device and electronic appliance
US9667954B2 (en) Enhanced image display in head-mounted displays
JP5631739B2 (ja) Method and device for providing privacy for a display
US11169380B2 (en) Ghost image mitigation in see-through displays with pixel arrays
US20230324744A1 (en) Geometries for mitigating artifacts in see-through pixel arrays
JP2010078653A (ja) Stereoscopic image display device
US10546521B2 (en) Resolutions by modulating both amplitude and phase in spatial light modulators
US20200333662A1 (en) Alignment cells for modulating both amplitude and phase in spatial light modulators
US20230350248A1 (en) Spatial light modulators modulating both amplitude and phase
US11521572B2 (en) Holographic displays with light modulation in amplitude and phase
US10775617B2 (en) Eye tracked lens for increased screen resolution
Zhang et al. A resolution-enhanced digital micromirror device (DMD) projection system
WO2023051479A1 (fr) Module d'affichage et procédé de commande d'imagerie
Wetzstein et al. State of the art in perceptual VR displays
Lu et al. 59‐1: Liquid Crystal Technology for Solving Key Optical Challenges in Virtual and Augmented Realities
Wu et al. 70‐3: Invited Paper: High‐Resolution Light‐Field VR LCD
JP6329792B2 (ja) Display device
US10540930B1 (en) Apparatus, systems, and methods for temperature-sensitive illumination of liquid crystal displays
Yoo et al. 15 focal planes head-mounted display using led array backlight
US20230282177A1 (en) Foveated display and driving scheme
US11422458B2 (en) Nano-stamping to create two different gratings to modulate light in amplitude and phase via liquid crystals

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22874864

Country of ref document: EP

Kind code of ref document: A1