CN115909913A - Display module and imaging control method - Google Patents


Info

Publication number
CN115909913A
CN115909913A CN202111164540.XA
Authority
CN
China
Prior art keywords
image
pixel
sub
polarization
displayed
Prior art date
Legal status
Pending
Application number
CN202111164540.XA
Other languages
Chinese (zh)
Inventor
邱孟
高少锐
吴巨帅
冯国华
罗伟城
孙上
Current Assignee
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd filed Critical Huawei Technologies Co Ltd
Priority to CN202111164540.XA priority Critical patent/CN115909913A/en
Priority to PCT/CN2022/121488 priority patent/WO2023051479A1/en
Publication of CN115909913A publication Critical patent/CN115909913A/en


Classifications

    • GPHYSICS
    • G02OPTICS
    • G02FOPTICAL DEVICES OR ARRANGEMENTS FOR THE CONTROL OF LIGHT BY MODIFICATION OF THE OPTICAL PROPERTIES OF THE MEDIA OF THE ELEMENTS INVOLVED THEREIN; NON-LINEAR OPTICS; FREQUENCY-CHANGING OF LIGHT; OPTICAL LOGIC ELEMENTS; OPTICAL ANALOGUE/DIGITAL CONVERTERS
    • G02F1/00Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics
    • G02F1/01Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics for the control of the intensity, phase, polarisation or colour 
    • G02F1/13Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics for the control of the intensity, phase, polarisation or colour  based on liquid crystals, e.g. single liquid crystal display cells
    • G02F1/133Constructional arrangements; Operation of liquid crystal cells; Circuit arrangements
    • G02F1/1333Constructional arrangements; Manufacturing methods
    • G02F1/1335Structural association of cells with optical devices, e.g. polarisers or reflectors
    • GPHYSICS
    • G02OPTICS
    • G02FOPTICAL DEVICES OR ARRANGEMENTS FOR THE CONTROL OF LIGHT BY MODIFICATION OF THE OPTICAL PROPERTIES OF THE MEDIA OF THE ELEMENTS INVOLVED THEREIN; NON-LINEAR OPTICS; FREQUENCY-CHANGING OF LIGHT; OPTICAL LOGIC ELEMENTS; OPTICAL ANALOGUE/DIGITAL CONVERTERS
    • G02F1/00Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics
    • G02F1/01Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics for the control of the intensity, phase, polarisation or colour 
    • G02F1/13Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics for the control of the intensity, phase, polarisation or colour  based on liquid crystals, e.g. single liquid crystal display cells
    • G02F1/133Constructional arrangements; Operation of liquid crystal cells; Circuit arrangements
    • G02F1/1333Constructional arrangements; Manufacturing methods
    • G02F1/1335Structural association of cells with optical devices, e.g. polarisers or reflectors
    • G02F1/13363Birefringent elements, e.g. for optical compensation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00Geometric image transformation in the plane of the image
    • G06T3/40Scaling the whole image or part thereof
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09FDISPLAYING; ADVERTISING; SIGNS; LABELS OR NAME-PLATES; SEALS
    • G09F9/00Indicating arrangements for variable information in which the information is built-up on a support by selection or combination of individual elements
    • G09F9/30Indicating arrangements for variable information in which the information is built-up on a support by selection or combination of individual elements in which the desired character or characters are formed by combining individual elements
    • G09F9/33Indicating arrangements for variable information in which the information is built-up on a support by selection or combination of individual elements in which the desired character or characters are formed by combining individual elements being semiconductor devices, e.g. diodes
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09FDISPLAYING; ADVERTISING; SIGNS; LABELS OR NAME-PLATES; SEALS
    • G09F9/00Indicating arrangements for variable information in which the information is built-up on a support by selection or combination of individual elements
    • G09F9/30Indicating arrangements for variable information in which the information is built-up on a support by selection or combination of individual elements in which the desired character or characters are formed by combining individual elements
    • G09F9/35Indicating arrangements for variable information in which the information is built-up on a support by selection or combination of individual elements in which the desired character or characters are formed by combining individual elements being liquid crystals
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00Details of colour television systems
    • H04N9/12Picture reproducers
    • H04N9/31Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
    • HELECTRICITY
    • H10SEMICONDUCTOR DEVICES; ELECTRIC SOLID-STATE DEVICES NOT OTHERWISE PROVIDED FOR
    • H10KORGANIC ELECTRIC SOLID-STATE DEVICES
    • H10K59/00Integrated devices, or assemblies of multiple devices, comprising at least one organic light-emitting element covered by group H10K50/00
    • H10K59/10OLED displays

Abstract

The application discloses a display module and an imaging control method for providing a super-resolution scheme for display devices. The control component performs sub-pixel-level decomposition of a high-resolution image to obtain multiple frames of low-resolution images and displays them through the display component in a time-shared manner, while also controlling, in a time-shared manner, the pixel position adjustment component to shift the position of each frame displayed by the display component. Because of the persistence of vision and visual synthesis of the human eye, the time-shared low-resolution frames are superimposed in the eye, so the viewer perceives a high-resolution image. Since the image is decomposed at the sub-pixel level, jagged artifacts are reduced and the imaging effect is improved.

Description

Display module and imaging control method
Technical Field
Embodiments of this application relate to the field of optical technologies, and in particular to a display module and an imaging control method.
Background
Display devices, such as virtual reality (VR) devices, achieve an immersive sense of depth by providing both a large field of view (FOV) and high resolution. Content viewed on current VR devices reaches only 10-20 pixels per degree (PPD) of angular resolution, which falls short of the 1-arcminute resolution limit of the human eye (60 PPD), so the user sees a screen-door effect. To address this, one option is a higher-resolution display screen: to meet the size constraints of VR devices, a silicon-based organic light-emitting diode (micro-OLED) display can be used, but high-resolution micro-OLED panels are relatively expensive. Another option is to raise the effective resolution of the display screen through a resolution enhancement technique, but at present there is no practical resolution enhancement scheme applicable to display devices.
Disclosure of Invention
Embodiments of this application provide a display module and an imaging control method, which supply a resolution enhancement scheme applicable to VR devices.
In a first aspect, an embodiment of this application provides a display module that includes a display component, a pixel position adjustment component, and a control component. The display component includes a plurality of pixels, and each pixel includes a plurality of sub-pixels. Under the control of the control component, the display component displays multiple frames of images in a time-shared manner. The multiple frames are obtained by performing sub-pixel-level decomposition on the image to be displayed; the resolution of each frame is the same as the resolution of the display component and lower than the resolution of the image to be displayed. Under the control of the control component, the pixel position adjustment component adjusts, in a time-shared manner, the position of each frame displayed by the display component. The time at which the display component displays a first image is synchronized with the time at which the pixel position adjustment component adjusts the first image, where the first image is any one of the multiple frames. In this way, multiple low-resolution frames are displayed in a time-shared manner, and the persistence of vision and visual synthesis of the human eye allow the viewer to perceive a high-resolution image. Because the image to be displayed is decomposed by sub-pixel-level sampling, edge smoothness improves and jagged artifacts are reduced.
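As a rough illustration of this decomposition into time-shared frames, the sketch below (hypothetical Python; the patent does not prescribe an implementation, and later designs refine the sampling with distance-based weights rather than plain column sampling) splits a high-resolution image into h lower-resolution frames:

```python
import numpy as np

def decompose_horizontal(high_res, h=2):
    """Split a high-resolution image (rows, cols, 3) into h frames by
    taking every h-th pixel column, each frame starting at a different
    offset. Each frame has 1/h the horizontal resolution."""
    return [high_res[:, k::h, :] for k in range(h)]

# A 4x8 RGB image decomposed into two 4x4 frames for time-shared display.
img = np.arange(4 * 8 * 3).reshape(4, 8, 3)
f0, f1 = decompose_horizontal(img, h=2)
```

Displayed alternately, with the second frame shifted by half a pixel pitch, the two frames superimpose in the eye to approximate the original horizontal resolution.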
In one possible design, the pixel position adjustment assembly includes a polarization converter and a polarization displacement device; the polarization converter is used for adjusting the polarization direction of target polarized light output by the polarization converter in a time-sharing manner under the control of the control assembly, and the target polarized light bears one frame of image in the multi-frame images; and the polarization displacement device is used for outputting the target polarized light at the first position when the polarization direction of the target polarized light output by the polarization converter is a first polarization direction, and outputting the target polarized light at the second position when the polarization direction of the target polarized light output by the polarization converter is a second polarization direction.
In one possible design, the control component is specifically configured to receive an image to be displayed, perform sub-pixel level decomposition on the image to be displayed to obtain a multi-frame image, and send the multi-frame image to the display component in a time-sharing manner.
In one possible design, the polarization displacement device is a birefringent device or a polarization grating. A birefringent device is non-diffractive and has no inherent dispersion characteristic, so light passing through it is not dispersed, further improving the imaging effect.
In one possible design, the polarization converter comprises twisted nematic liquid crystals or in-plane rotating liquid crystals; or the polarization converter comprises cholesteric liquid crystals and a 1/4 wave plate.
In one possible design, the pixel position adjustment assembly is a motor.
In one possible design, the sub-pixels of a first pixel point in a first image among the multiple frames are obtained by sampling the sub-pixels of at least h adjacent pixel points in the image to be displayed, where h is the number of frames, the first image is any one of the multiple frames, and the first pixel point is any pixel point in the first image.
In one possible design, the pixel value of the first sub-pixel included in the first pixel point is determined according to the pixel value of the sub-pixel with the same color as the first sub-pixel included in the set region of the image to be displayed; the geometric center of the set area is the sampling position of the first sub-pixel in the image to be displayed.
In the above design, the sub-pixel value at a sampling position is determined from surrounding sub-pixel values of the same color, which reduces color fringing at the edges of the displayed image. In some embodiments, this approach may also be applied only to sub-pixel values within a set range around the periphery of the image.
In one possible design, the pixel value of the first sub-pixel included in the first pixel point is obtained by a weighted summation of the pixel values of the sub-pixels in the set region that have the same color as the first sub-pixel, where the weight of each such sub-pixel is inversely proportional to its distance from the sampling position of the first sub-pixel in the image to be displayed.
In one possible design, the pixel value of the first sub-pixel satisfies the condition shown in the following formula:
q(i,j) = α1*Q(i-1,j) + α2*Q(i,j-1) + α3*Q(i+1,j) + α4*Q(i,j+1) + α5*Q(i,j);
where q(i,j) represents the pixel value of the first sub-pixel; i and j represent the abscissa and ordinate of the first sub-pixel's pixel point in the image to be displayed; Q(i,j) represents the pixel value of the sub-pixel at the sampling position of the first sub-pixel in the image to be displayed; and α1, α2, α3, α4, and α5 represent the weights.
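As a hedged illustration, the weighted cross-neighborhood sum above can be computed as follows (the weight values here are hypothetical; the patent only requires that weights be inversely proportional to distance from the sampling position, and array row/column indexing stands in for the image coordinates):

```python
import numpy as np

def subpixel_value(Q, i, j, weights):
    """q(i,j) = a1*Q(i-1,j) + a2*Q(i,j-1) + a3*Q(i+1,j)
              + a4*Q(i,j+1) + a5*Q(i,j)
    Q: 2D array of same-color sub-pixel values; weights = (a1..a5)."""
    a1, a2, a3, a4, a5 = weights
    return (a1 * Q[i - 1, j] + a2 * Q[i, j - 1]
            + a3 * Q[i + 1, j] + a4 * Q[i, j + 1]
            + a5 * Q[i, j])

# Hypothetical normalized weights: the four neighbors at distance 1 share
# equal weight; the sampling position itself is weighted most heavily.
Q = np.full((5, 5), 100.0)
w = (0.15, 0.15, 0.15, 0.15, 0.40)
q = subpixel_value(Q, 2, 2, w)   # uniform input stays 100.0
```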
In one possible design, the pixel value of the first sub-pixel satisfies the condition shown in the following formula:
q(i,j+1) = α1*Q(i-1,j) + α2*Q(i,j-1) + α3*Q(i+1,j) + α4*Q(i,j+1) + β1*Q(i-1,j+1) + β2*Q(i,j) + β3*Q(i+1,j+1) + β4*Q(i,j+2);
where q(i,j+1) represents the pixel value of the first sub-pixel; i and j+1 represent the abscissa and ordinate of the first sub-pixel's pixel point in the image to be displayed; Q(i,j+1) represents the pixel value of the sub-pixel at the sampling position of the first sub-pixel in the image to be displayed; and α1, α2, α3, α4, β1, β2, β3, and β4 represent the weights.
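A sketch of this second formula, for a sampling position that falls between the pixel points (i, j) and (i, j+1) (weight values are hypothetical, chosen only to sum to 1):

```python
import numpy as np

def subpixel_value_between(Q, i, j, alphas, betas):
    """q(i,j+1) as defined above: a weighted sum over the neighbors of
    (i, j) (alpha terms) and the neighbors of (i, j+1) (beta terms)."""
    a1, a2, a3, a4 = alphas
    b1, b2, b3, b4 = betas
    return (a1 * Q[i - 1, j] + a2 * Q[i, j - 1]
            + a3 * Q[i + 1, j] + a4 * Q[i, j + 1]
            + b1 * Q[i - 1, j + 1] + b2 * Q[i, j]
            + b3 * Q[i + 1, j + 1] + b4 * Q[i, j + 2])

Q = np.full((5, 6), 80.0)
q = subpixel_value_between(Q, 2, 2,
                           alphas=(0.10, 0.10, 0.10, 0.10),
                           betas=(0.15, 0.15, 0.15, 0.15))
```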
In one possible design, the multiple frames include a first image and a second image. The control component is specifically configured to input the multiple frames to the display component in a time-shared manner, so that the display component displays them in a time-shared manner, and to control the pixel position adjustment component to output the first image at a first position in a first time unit and the second image at a second position in a second time unit. The interval between the first position and the second position is Px/2 in the horizontal direction; or Py/2 in the vertical direction; or Px/2 in the horizontal direction and Py/2 in the vertical direction. The first time unit and the second time unit are adjacent in time.
In one possible design, the control component is specifically configured to control the polarization converter to adjust the polarization direction of the input target polarized light bearing the first image to a first polarization direction in the first time unit, and to adjust the polarization direction of the input target polarized light bearing the second image to a second polarization direction in the second time unit, so that the first position at which the polarization displacement device outputs the target polarized light bearing the first image and the second position at which it outputs the target polarized light bearing the second image are separated by Px/2 in the horizontal direction, where Px represents the pitch of adjacent pixels of the first image or the second image in the horizontal direction. This design provides a simple and effective way to achieve horizontal super-resolution.
In one possible design, the control component is specifically configured to control the polarization converter to adjust the polarization direction of the input target polarized light bearing the first image to a first polarization direction in the first time unit, and to adjust the polarization direction of the input target polarized light bearing the second image to a second polarization direction in the second time unit, so that the first position at which the polarization displacement device outputs the target polarized light bearing the first image and the second position at which it outputs the target polarized light bearing the second image are separated by Py/2 in the vertical direction, where Py represents the pitch of adjacent pixels of the first image or the second image in the vertical direction. This design provides a simple and effective way to achieve vertical super-resolution.
In one possible design, the control component is specifically configured to control the polarization converter to adjust the polarization direction of the input target polarized light bearing the first image to a first polarization direction in the first time unit, and to adjust the polarization direction of the input target polarized light bearing the second image to a second polarization direction in the second time unit, so that the first position at which the polarization displacement device outputs the target polarized light bearing the first image and the second position at which it outputs the target polarized light bearing the second image are separated by Py/2 in the vertical direction and offset by Px/2 in the horizontal direction, where Py represents the pitch of adjacent pixels of the first image or the second image in the vertical direction and Px represents the pitch of adjacent pixels in the horizontal direction. This design provides a simple and effective way to achieve diagonal super-resolution.
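The three shift patterns (horizontal, vertical, diagonal) amount to choosing a half-pitch offset for the second time unit; a minimal sketch (function and mode names are hypothetical):

```python
def shift_offset(mode, px, py):
    """Offset of the second frame's output position relative to the
    first, in the same units as the pixel pitches px and py."""
    offsets = {
        "horizontal": (px / 2, 0.0),   # Px/2 horizontal super-resolution
        "vertical": (0.0, py / 2),     # Py/2 vertical super-resolution
        "diagonal": (px / 2, py / 2),  # combined diagonal shift
    }
    return offsets[mode]
```

For instance, `shift_offset("diagonal", px, py)` gives the (Px/2, Py/2) displacement that the polarization displacement device applies in the second time unit.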
In one possible design, the polarization converter comprises twisted nematic liquid crystals or in-plane rotating liquid crystals; or the polarization converter comprises cholesteric liquid crystals and a 1/4 wave plate.
In one possible design, the polarization displacement device is a birefringent device, which may use a quartz crystal, a barium borate crystal, a lithium niobate crystal, a titanium dioxide crystal, or a liquid crystal polymer.
In a possible design, the display module further comprises a folded optical path, the folded optical path is located between the display component and the pixel position adjusting component, and the folded optical path is used for transmitting the target polarized light bearing any image of the multi-frame image to the pixel position adjusting component.
In a second aspect, an embodiment of this application provides an imaging control method applied to a display device that includes a display component and a pixel position adjustment component, where the display component includes a plurality of pixels and each pixel includes a plurality of sub-pixels. The method includes: receiving an image to be displayed, and performing sub-pixel-level decomposition on it to obtain multiple frames of images, where the resolution of each frame is the same as that of the display component and lower than that of the image to be displayed; and controlling the pixel position adjustment component to adjust, in a time-shared manner, the position of each frame displayed by the display component, where the time at which the display component displays a first image is synchronized with the time at which the pixel position adjustment component adjusts the first image, and the first image is any one of the multiple frames.
In one possible design, performing sub-pixel decomposition on an image to be displayed to obtain a multi-frame image includes:
and when the super-resolution mode of the display equipment is enabled, performing sub-pixel level decomposition on the image to be displayed to obtain a multi-frame image.
In one possible design, the method further includes:
when the super-resolution mode of the display device is not enabled, down-sampling the image to be displayed to obtain an image to be processed;
inputting the image to be processed to the display component, so that the display component displays the image to be processed;
and outputting the image to be processed at a set position through a pixel position adjusting component.
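The enabled/disabled branching can be sketched as follows (hypothetical Python; the patent does not specify the down-sampling method, so 2x2 block averaging is used here purely for illustration):

```python
import numpy as np

def prepare_frames(image, sr_enabled):
    """With super-resolution enabled, decompose the image into two
    half-width frames for time-shared display; otherwise produce a
    single down-sampled image to show at a fixed position."""
    if sr_enabled:
        return [image[:, 0::2], image[:, 1::2]]
    h, w = image.shape
    return [image.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))]

img = np.arange(16.0).reshape(4, 4)
sr_frames = prepare_frames(img, sr_enabled=True)    # two 4x2 frames
plain = prepare_frames(img, sr_enabled=False)       # one 2x2 frame
```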
In one possible design, the sub-pixels of a first pixel point in a first image among the multiple frames are obtained by sampling the sub-pixels of at least h adjacent pixel points in the image to be displayed, where h is the number of frames, the first image is any one of the multiple frames, and the first pixel point is any pixel point in the first image.
In one possible design, the pixel value of the first sub-pixel included in the first pixel point is determined according to the pixel value of the sub-pixel with the same color as the first sub-pixel included in the set region of the image to be displayed;
the geometric center of the set area is the sampling position of the first sub-pixel in the image to be displayed.
In one possible design, the pixel value of the first sub-pixel included in the first pixel point is obtained by a weighted summation of the pixel values of the sub-pixels in the set region that have the same color as the first sub-pixel, where the weight of each such sub-pixel is inversely proportional to its distance from the sampling position of the first sub-pixel in the image to be displayed.
In one possible design, the display device is a wearable device, and the size of the set region is related to the distance between the display component and an imaging plane of the wearable device.
In one possible design, the size of the set region is related to the pixel size of the display component.
In one possible design, the size of the set region is related to the display content of the display component.
In one possible design, the pixel value of the first sub-pixel satisfies the condition shown in the following equation:
q(i,j) = α1*Q(i-1,j) + α2*Q(i,j-1) + α3*Q(i+1,j) + α4*Q(i,j+1) + α5*Q(i,j);
where q(i,j) represents the pixel value of the first sub-pixel; i and j represent the abscissa and ordinate of the first sub-pixel's pixel point in the image to be displayed; Q(i,j) represents the pixel value of the sub-pixel at the sampling position of the first sub-pixel in the image to be displayed; and α1, α2, α3, α4, and α5 represent the weights.
In one possible design, the pixel value of the first sub-pixel satisfies the condition shown in the following formula:
q(i,j+1) = α1*Q(i-1,j) + α2*Q(i,j-1) + α3*Q(i+1,j) + α4*Q(i,j+1) + β1*Q(i-1,j+1) + β2*Q(i,j) + β3*Q(i+1,j+1) + β4*Q(i,j+2);
where q(i,j+1) represents the pixel value of the first sub-pixel; i and j+1 represent the abscissa and ordinate of the first sub-pixel's pixel point in the image to be displayed; Q(i,j+1) represents the pixel value of the sub-pixel at the sampling position of the first sub-pixel in the image to be displayed; and α1, α2, α3, α4, β1, β2, β3, and β4 represent the weights.
In one possible design, the multiple frames of images include the first image and the second image, and the controlling the pixel position adjustment component time-divisionally adjusts the position of each frame of image displayed by the display component includes: inputting the plurality of frames of images to the display component in a time-sharing manner, so that the display component displays the plurality of frames of images in a time-sharing manner; controlling the pixel position adjusting assembly to output the first image at a first position in a first time unit, and controlling the pixel position adjusting assembly to output the second image at a second position in a second time unit; the interval between the first position and the second position in the horizontal direction is Px/2; or the interval between the first position and the second position in the vertical direction is Py/2; or the interval between the first position and the second position in the horizontal direction is Px/2 and the interval between the first position and the second position in the vertical direction is Py/2; the first time unit and the second time unit are adjacent in time.
In one possible design, the pixel position adjustment assembly includes a polarization converter and a polarization displacement device; controlling the pixel position adjusting component to output the first image at a first position in a first time unit, and controlling the pixel position adjusting component to output the second image at a second position in a second time unit, including: controlling the polarization converter to adjust the polarization direction of target polarized light in a time-sharing manner, wherein the target polarized light is generated when the display assembly displays a plurality of frames of images in a time-sharing manner; when the polarization direction of the target polarized light output by the polarization converter is a first polarization direction, the polarization displacement device outputs the target polarized light at a first position, and when the polarization direction of the target polarized light output by the polarization converter is a second polarization direction, the polarization displacement device outputs the target polarized light at a second position.
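A control-loop sketch of this time-shared polarization switching (the labels "P1"/"P2" and "first"/"second" are illustrative placeholders for the two polarization directions and the corresponding output positions):

```python
def control_sequence(frames):
    """For each time unit, pair the displayed frame with the polarization
    direction the converter applies and the position at which the
    polarization displacement device therefore outputs it."""
    schedule = []
    for t, frame in enumerate(frames):
        polarization = "P1" if t % 2 == 0 else "P2"
        position = "first" if polarization == "P1" else "second"
        schedule.append((t, frame, polarization, position))
    return schedule

sched = control_sequence(["frame_A", "frame_B"])
```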
In one possible design, the controlling the polarization converter to adjust the polarization direction of the output target polarized light in a time-sharing manner includes:
controlling the polarization converter to adjust the polarization direction of the output target polarized light bearing the first image to be a first polarization direction in a first time unit, and controlling the polarization converter to adjust the polarization direction of the output target polarized light bearing the second image to be a second polarization direction in a second time unit, so that the interval between a first position of the output target polarized light bearing the first image and a second position of the output target polarized light bearing the second image of the polarization displacement device in the horizontal direction is Px/2; px represents a pitch of adjacent pixels of the first image or the second image in the horizontal direction.
In one possible design, the sub-pixels included in the first pixel point in the first image are obtained by sampling the sub-pixels included in two horizontally adjacent pixel points included in the image to be displayed;
the sub-pixels included by the second pixel points in the second image are obtained by sampling the sub-pixels included by two horizontally adjacent pixel points;
the position coordinates of the first pixel points in the first image are the same as the position coordinates of the second pixel points in the second image.
In one possible design, the controlling the polarization converter to adjust the polarization direction of the output target polarized light in a time-sharing manner includes:
controlling the polarization converter to adjust the polarization direction of the output target polarized light bearing the first image to be a first polarization direction in the first time unit, and controlling the polarization converter to adjust the polarization direction of the output target polarized light bearing the second image to be a second polarization direction in the second time unit, so that a first position of the output target polarized light bearing the first image of the polarization displacement device and a second position of the output target polarized light bearing the second image are vertically separated by Py/2; py denotes a pitch of adjacent pixels of the first image or the second image in the vertical direction.
In one possible design, the sub-pixels included in the first pixel point in the first image are obtained by sampling the sub-pixels included in two adjacent pixel points in the vertical direction in the image to be displayed;
the sub-pixels included by the second pixel points in the second image are obtained by sampling the sub-pixels included by two adjacent pixel points in the vertical direction;
the position coordinates of the first pixel points in the first image are the same as the position coordinates of the second pixel points in the second image.
In one possible design, the controlling the polarization converter to adjust the polarization direction of the output target polarized light in a time-sharing manner includes:
the polarization converter is controlled to adjust the polarization direction of input target polarized light bearing a first image into a first polarization direction in the first time unit, the polarization converter is controlled to adjust the polarization direction of input target polarized light bearing a second image into a second polarization direction in the second time unit, so that a first position of the target polarized light bearing the first image output by the polarization displacement device and a second position of the target polarized light bearing the second image output by the polarization displacement device are vertically spaced by Py/2 and horizontally offset by Px/2, py represents the distance between adjacent pixels of the first image or the second image in the vertical direction, and Px represents the distance between adjacent pixels of the first image or the second image in the vertical direction.
In a possible design, sub-pixels included in a first pixel point in a first image are obtained by sampling sub-pixels included in two adjacent pixel points in the diagonal direction in an image to be displayed;
the sub-pixels included by the second pixel points in the second image are obtained by sampling the sub-pixels included by two adjacent pixel points in the diagonal direction;
the position coordinates of the first pixel points in the first image are the same as the position coordinates of the second pixel points in the second image.
In a third aspect, an embodiment of the present application provides a display module, including a display module, at least one adjusting module, and a control module; the adjusting component comprises a polarization rotator and a birefringent device; the display component is used for receiving the image to be processed and displaying the image to be processed, and the resolution of the image to be processed is the same as that of the display component; a polarization rotator for adjusting a polarization direction of a light beam of each pixel of the image to be processed under the control of the control assembly; the double refraction device is used for decomposing the light beam of each pixel included in the image to be processed, outputting first target polarized light for projecting a first sub-image at a first position and outputting second target polarized light for projecting a second sub-image at a second position; the decomposition proportion of the first pixel is different from that of the second pixel, the first pixel and the second pixel are two pixels with different polarization directions in the image to be processed, the decomposition proportion of the first pixel is the proportion of the luminous intensity of the light beam of the first pixel projected on the pixel of the first sub-image after decomposition to the luminous intensity of the pixel projected on the second sub-image, and the decomposition proportion of the second pixel is the proportion of the luminous intensity of the light beam of the second pixel projected on the pixel of the first sub-image after decomposition to the luminous intensity of the pixel projected on the second sub-image. 
In the embodiment of the application, the polarization rotator is combined with the birefringent device, and the two sub-images are output by utilizing the principle that the birefringent device outputs two beams of light, so that the superposition of the two sub-images is close to the source image to be displayed, and the resolution of the displayed image is improved.
In one possible design, the position of the first sub-image is vertically spaced from the position of the second sub-image by Py/2 and/or horizontally spaced by Px/2, py representing the pitch of the pixels in the vertical direction and Px representing the pitch of the pixels in the vertical direction.
In one possible design, the control component is specifically configured to control the polarization rotator to adjust the polarization direction of the light beam of the sub-pixel comprised by each pixel.
In one possible design, the image to be processed is obtained by performing down-sampling processing on the image to be displayed; the control assembly is specifically configured to: estimating the luminous intensity of each pixel to be projected as a first sub-image and the luminous intensity of each pixel to be projected as a second sub-image according to the luminous intensity of each pixel of the image to be displayed; the resolution of the first sub-image is the same as that of the second sub-image and is smaller than that of the image to be displayed; and controlling the polarization rotator to adjust the polarization direction of the light beam of each pixel of the image to be processed according to the luminous intensity of each pixel to be projected as the first sub-image and the luminous intensity of each pixel to be projected as the second sub-image.
In one possible design, the image to be processed is obtained by performing down-sampling processing on the image to be displayed; the control assembly is specifically configured to: estimating and adjusting the luminous intensity of each pixel to be projected as a first sub-image and the luminous intensity of each pixel to be projected as a second sub-image, so that the similarity between the image after the overlapped projection of the adjusted first sub-image and the second sub-image and the image to be displayed is greater than a set threshold; the set threshold is determined according to the perception capability of human eyes to image difference; and controlling the polarization rotator to adjust the polarization direction of the light beam of each pixel of the image to be processed according to the adjusted luminous intensity of each pixel of the first sub-image and the adjusted luminous intensity of each pixel of the second sub-image.
In a fourth aspect, an embodiment of the present application provides an imaging control method, where the method is applied to a wearable device, where the wearable device includes a display component and a pixel position adjustment component, and the pixel position adjustment component includes a polarization rotator and a birefringent device; the method comprises the following steps: receiving an image to be displayed, performing down-sampling processing on the image to be displayed to obtain an image to be processed, and inputting the image to be processed to a display assembly, so that the display assembly emits target polarized light bearing the image to be displayed, wherein the target frame image is one of a plurality of frames of images, and the resolution of the image to be processed is the same as that of the display assembly; controlling the polarization rotator to adjust the polarization direction of the output light beam of each pixel of the image to be processed, so that the birefringent device decomposes the light beam of each pixel included in the image to be processed, and outputs a first target polarized light for projecting a first sub-image at a first position and a second target polarized light for projecting a second sub-image at a second position; the decomposition ratio of the first pixel is different from that of the second pixel, the first pixel and the second pixel are two pixels with different polarization directions in the image to be processed, the decomposition ratio of the first pixel is the ratio of the luminous intensity of the pixel projected on the first sub-image to the luminous intensity of the pixel projected on the second sub-image after the light beam of the first pixel is decomposed, and the decomposition ratio of the second pixel is the ratio of the luminous intensity of the pixel projected on the first sub-image to the luminous intensity of the pixel projected on the second sub-image after the light beam of the second 
pixel is decomposed.
In one possible design, the position of the first sub-image is vertically spaced from the position of the second sub-image by Py/2 and/or horizontally spaced by Px/2, py representing the pitch of the pixels in the vertical direction and Px representing the pitch of the pixels in the vertical direction.
In one possible design, controlling the polarization rotator to adjust the polarization direction of the light beam of each pixel of the output image to be processed includes: the polarization rotator is controlled to adjust the polarization direction of the light beam of the sub-pixel comprised by each pixel of the image to be processed.
In one possible design, controlling the polarization rotator to adjust the polarization direction of the light beam of each pixel of the output image to be processed includes: estimating the luminous intensity of each pixel to be projected as a first sub-image and the luminous intensity of each pixel to be projected as a second sub-image according to the luminous intensity of each pixel of the image to be displayed; the resolution of the first sub-image is the same as that of the second sub-image and is smaller than that of the image to be displayed; and controlling the polarization rotator to adjust the polarization direction of the light beam of each pixel of the image to be processed according to the luminous intensity of each pixel to be projected as the first sub-image and the luminous intensity of each pixel to be projected as the second sub-image.
In one possible design, controlling the polarization rotator to adjust the polarization direction of the light beam of each pixel of the output image to be processed includes: estimating and adjusting the luminous intensity of each pixel to be projected as a first sub-image and the luminous intensity of each pixel to be projected as a second sub-image, so that the similarity between the image after the overlapped projection of the adjusted first sub-image and the second sub-image and the image to be displayed is greater than a set threshold; the set threshold value is determined according to the perception capability of human eyes to image difference; and controlling the polarization rotator to adjust the polarization direction of the light beam of each pixel of the image to be processed according to the adjusted luminous intensity of each pixel of the first sub-image and the adjusted luminous intensity of each pixel of the second sub-image.
In a fifth aspect, the present application provides a control device, configured to implement any one of the methods in the second or fourth aspects, including corresponding functional modules, respectively configured to implement the steps in the above methods. The functions may be implemented by hardware, or by hardware executing corresponding software. The hardware or software includes one or more modules corresponding to the above-described functions.
In a sixth aspect, the present application provides a computer readable storage medium, in which a computer program or instructions are stored, which, when executed by a display device, cause the display device to perform the method in any possible implementation manner of the second aspect or the fourth aspect.
In a seventh aspect, the present application provides a computer program product comprising a computer program or instructions for implementing the method of any possible implementation manner of the second or fourth aspect when the computer program or instructions are executed by a control device.
Drawings
FIG. 1A is a schematic diagram of an arrangement of RGB strips;
FIG. 1B is a schematic diagram of pentile RGBG arrangement;
FIG. 1C is a schematic view of a delta RGB arrangement;
FIG. 1D is a schematic diagram of pentile RGBW arrangement;
FIG. 2 is a schematic diagram of a display module according to an embodiment of the present disclosure;
FIG. 3A is a schematic structural diagram of a display module according to an embodiment of the present disclosure;
FIG. 3B is a schematic diagram of a display module according to an embodiment of the present disclosure;
FIG. 4 is a schematic diagram of a diagonal over-shoot principle in an embodiment of the present application;
FIG. 5A is a schematic diagram of resolution enhancement achieved in conjunction with a polarization grating;
FIG. 5B is a schematic diagram of resolution enhancement achieved by shifting pixels of an image;
FIG. 6 is a schematic diagram of beam transmission of a display module using a birefringent device according to an embodiment of the present application;
FIG. 7 is a schematic diagram of the light output of a birefringent device in an embodiment of the present application;
FIG. 8 is a graph showing the relationship between the thickness of a birefringent device and the refractive index difference Δ n between o-light and e-light of the birefringent device in an embodiment of the present application;
FIG. 9 is a schematic view of a display module according to an embodiment of the present disclosure;
FIG. 10 is a conceptual diagram of sub-pixel sampling according to an embodiment of the present application;
fig. 11A is a schematic view illustrating decomposition of sampling of an image to be displayed under diagonal super-resolution in the embodiment of the present application;
FIG. 11B is a schematic diagram illustrating decomposition of a sample of an image to be displayed under diagonal super-resolution in an embodiment of the present application;
FIG. 12 is a schematic diagram illustrating an exploded sampling of an image to be displayed under vertical super-resolution in an embodiment of the present application;
FIG. 13 is a schematic control timing diagram according to an embodiment of the present application;
FIG. 14 is a schematic view illustrating a transmission direction of light beams in the display module according to the embodiment of the present application;
FIG. 15 is a schematic diagram illustrating superimposed display of low-resolution subframes under diagonal over-resolution in an embodiment of the present application;
FIG. 16A is a sample exploded view of an image to be displayed according to an embodiment of the present disclosure;
FIG. 16B is a sample exploded view of an image to be displayed according to an embodiment of the present disclosure;
FIG. 17 is a schematic diagram of a low-resolution subframe overlay display in an embodiment of the present application;
FIG. 18 is a schematic diagram illustrating a principle of calculating a pixel value of a sub-pixel in the embodiment of the present application;
FIG. 19 is a schematic diagram illustrating a principle of calculating a pixel value of a sub-pixel in an embodiment of the present application;
FIG. 20 is a schematic view of a display module according to an embodiment of the present disclosure;
FIG. 21 is a schematic view of a display module according to an embodiment of the present disclosure;
FIG. 22 is a diagram illustrating polarized light output by a deflection rotator according to an embodiment of the present application;
FIG. 23 is a schematic diagram illustrating a resolution enhancement principle in an embodiment of the present application;
FIG. 24 is a flowchart illustrating an imaging control method according to an embodiment of the present application;
fig. 25 is a flowchart illustrating an imaging control method according to an embodiment of the present application.
Detailed Description
Embodiments of the present application will be described in detail below with reference to the accompanying drawings.
Hereinafter, some terms in the present application will be explained. It should be noted that these explanations are for the convenience of those skilled in the art, and do not limit the scope of protection claimed in the present application.
(1) Near-eye display:
the display is close to the eyes, and is a display mode of an AR display device or a VR display device.
(2) The angular resolution may also be referred to as spatial resolution, which refers to the number of pixels filled in each 1 degree included angle on average of the field angle. Before the definition reaches the limit of human eyes to being able to distinguish, the more the number of pixel points filled in the picture of a unit area in the human eye visual field is, the clearer and finer the visual perception is. The larger the PPD is, the more the number of pixel points filled in the picture of the unit area in the human eye field is, and the clearer the user feels on the display picture.
(3) And (3) sub-pixel:
the smallest picture element on the display screen is a pixel (pixel). A pixel is composed of three sub-pixels of different colors. A sub-pixel may also be referred to as a sub-pixel. For example, a red (R) sub-pixel, a green (G) sub-pixel, and a blue (B) sub-pixel constitute one pixel. The pixel arrangement mode adopted by the display screen can comprise an RGB stripe (stripe) arrangement, a pentile RGBG arrangement, a pentile RGBW arrangement, a delta RGB arrangement and the like.
Fig. 1A shows a conventional RGB Stripe arrangement of a display panel, where each pixel includes one of R, G, and B.
Fig. 1B shows a pentile RGBG arrangement, where each pixel includes two sub-pixels, which alternate with the combination of RG and BG.
Fig. 1C shows a delta RGB arrangement, each pixel includes three RGB sub-pixels, and two adjacent pixels share a common sub-pixel, and as an example of row a in fig. 1C, pixel 1 and pixel 2 share a blue sub-pixel, and pixel 2 and pixel 3 share a red sub-pixel and a green sub-pixel.
Fig. 1D is a pentile RGBW arrangement, each pixel including a red (R) sub-pixel, a green (G) sub-pixel, a blue (B) sub-pixel, and a white (W) sub-pixel.
It can be seen that in the RGB Stripe arrangement, three pixels are composed of 9 sub-pixels; in the RGBG arrangement, three pixels are composed of 6 sub-pixels; in the RGB delta arrangement, the three pixels consist of 6 sub-pixels. Therefore, when the number of pixels is the same, the RGBG arrangement and the RGB delta arrangement require fewer sub-pixels than the RGB Stripe arrangement.
The embodiment of the application is applied to the display equipment. Such as terminal equipment with a display screen, such as a mobile phone, a display, a television, etc. The display device may also be a wearable device. The wearable device may be a Near Eye Display (NED) device, such as VR glasses, or a VR headset, etc. For example, the user wears the NED device to play a game, read, watch a movie (or tv show), participate in a virtual meeting, participate in video education, or video shopping, etc.
In order to realize the effect that human eyes see high-resolution images by using the low-resolution display screen, a resolution enhancement mode, which may also be called super-resolution, may be used to improve the resolution of the low-resolution display screen. Fig. 2 is a schematic view of a display module structure provided in the embodiment of the present application. The display module comprises a display element 100, at least one pixel position adjusting element 200 and a control element 300. In FIG. 2, a pixel position adjustment assembly 200 is illustrated in the display module. A display assembly 100 for displaying an image. The pixel position adjusting assembly 200 is used for adjusting the position of the image displayed by the display assembly 100. For example, the display device is a wearable device, and the pixel position adjustment assembly 200 adjusts an image displayed by the display assembly 100 to be imaged to a position on a virtual image plane at a certain distance from the display assembly 100. The display component 100 time-divisionally displays a plurality of frames of images under the control of the control component 300. The multi-frame image may be obtained by down-sampling the image to be displayed by the control component 300. For example, the control component 300 splits the high-resolution image to be displayed into a plurality of low-resolution images. The resolution of the low-resolution image is the same as the resolution of the display assembly. The control unit 300 time-divisionally transmits the plurality of low-resolution images to the display unit 100 to be displayed. 
The pixel position adjusting component is used for adjusting the position of each frame of image displayed by the display component in a time-sharing way under the control of the control component; the time of the display component for displaying the first image is synchronous with the time of the pixel position adjusting component for adjusting the first image, and the first image is any one of the multi-frame images.
In one possible embodiment, the pixel position adjusting component may be a motor, and the control component 300 may mechanically move the motor by time-sharing control, so as to achieve time-sharing adjustment of the position of each frame of image displayed by the display component. The motor may be an ultrasonic motor or a servo motor, etc.
In another possible embodiment, referring to fig. 3A, the pixel position adjustment assembly 200 may include a polarization converter 210 and a polarization displacement device 220. The polarization converter 210 is configured to adjust the polarization direction of the target polarized light output by the polarization converter 210 in a time-sharing manner under the control of the control component 300, where the target polarized light carries one image of the multiple images. A polarization shifting device 220 for outputting the target polarized light at a first position when the polarization direction of the target polarized light output by the polarization converter 210 is a first polarization direction, and outputting the target polarized light at a second position when the polarization direction of the target polarized light output by the polarization converter 210 is a second polarization direction; the display time of the first image is synchronous with the polarization direction adjustment time of the target polarized light bearing the first image, and the first image is any one of the multi-frame images.
As an example, taking a multi-frame image as two frames of images as an example, the display module provided in the embodiment of the present application can achieve horizontal direction over-separation, vertical direction over-separation, or diagonal direction over-separation. The horizontal direction is over-divided, which can be understood that the separation distance of two frames of images formed by the light beams output by the polarization displacement device in the horizontal direction is Px/2; and Px represents the distance between adjacent pixels of the two frames of images in the horizontal direction, so that the resolution in the horizontal direction is doubled. When the horizontal direction super-polarization is adopted, the polarization displacement device needs to have the capability of realizing the offset vector of (Px/2, 0). The vertical direction is over-divided, which can be understood that the separation distance of two frames of images output by the polarization displacement device in the vertical direction is Py/2; the Py represents the vertical distance between adjacent pixels of two frame images, thereby achieving a vertical resolution doubling. When the vertical polarization is adopted, the polarization displacement device needs to have the capability of realizing an offset vector of (0, py/2). The diagonal direction is over-divided, which can be understood as that the separation distance between two frames of images output by the polarization displacement device in the vertical direction is Py/2, and the separation distance in the horizontal direction is Px/2; thereby achieving resolution doubling. When the diagonal polarization is adopted, the polarization displacement device needs to have the capability of realizing an offset vector of (Px/2, py/2). Referring to fig. 
4, taking a 4-by-4 pixel array as an example (the pixel arrangement is RGB stripe arrangement), assuming that the polarization displacement device can realize the offset with the offset vector of (Px/2, py/2), the effect of doubling the equivalent number of display pixels can be realized by time division multiplexing. Of course, when the offset vector of the polarization displacement device is (Px/2, 0), the resolution in the horizontal direction can be doubled. When the offset vector of the polarization displacement device is (0, py/2), resolution multiplication in the vertical direction is achieved. Of course, the shift of multiple frames of images can be realized by serially connecting multiple pixel position adjusting components 200, so as to realize resolution improvement larger than 2 times.
See, for example, fig. 3B. The two pixel position adjustment components are referred to as a first pixel position adjustment component 200a and a second pixel position adjustment component 200b, respectively, for ease of distinction. The polarization converter of the pixel position adjustment assembly 200a is referred to as a first polarization converter 210a, and the polarization converter of the pixel position adjustment assembly 200b is referred to as a second polarization converter 210b. The polarization displacement device in the pixel position adjustment assembly 200a is referred to as a polarization displacement device 220a, and the polarization displacement device 2 in the pixel position adjustment assembly 200b is referred to as a polarization displacement device 220b.
The functional components and structures of fig. 3A and 3B are described separately below to give exemplary embodiments. Reference numerals for the respective components in the display module are not illustrated as follows.
The polarization displacement device may be a polarization grating or a birefringent device.
In one possible example, the polarization displacement device 220 is a polarization grating 220 a. The resolution enhancement is realized by adopting a combination mode of combining a polarization grating with a polarization modulation device.
A polarization grating, which may also be referred to as a Pancharatnam-Berry reflector (PBD), is a diffractive optical device. The polarization grating utilizes the geometric phase to generate a periodic phase grating structure, so that under the incidence of different circularly polarized light, the diffraction of +1 order and-1 order in different directions can be respectively generated. Referring to fig. 5A, the polarization state of the light beam output from the display screen is changed by the electrically controlled polarization modulation device, so that the light beam input to the polarization grating is deflected. Taking each pixel as a rectangle as an example, P is a horizontal pitch or a vertical pitch of two adjacent pixels. For example, see FIG. 5BTaking diagonal super-resolution as an example, shifting each pixel in an image output by a display screen along a diagonal direction is realized by a polarization converter and a polarization grating
Figure BDA0003291244240000101
And P, displaying the image which is not subjected to the offset and the image after the offset in a time-sharing manner, equivalently improving the resolution of the image displayed by the display screen by 2 times, wherein the horizontal interval of the pixel after the offset is P/2.
In another possible example, a combination of a polarization converter and a birefringent device is used to shift the image-bearing light rays emitted by the display assembly. The polarization converter can adjust the polarization direction of the input target polarized light in a time-sharing way under the control of the control assembly. The target polarized light is used to carry low resolution images. When the polarization directions of the input polarized light are different, the positions of the output target polarized light are different. For example, the birefringent device outputs the target polarized light at the first position when the polarization direction of the target polarized light output by the polarization converter is the first polarization direction, and outputs the target polarized light at the second position when the polarization direction of the target polarized light output by the polarization converter is the second polarization direction, as shown in fig. 6. It should be noted that, for the same image, for example, a first image in a plurality of images, the display module displays the time of the first image, and the polarization converter adjusts the time synchronization for the polarization direction of the target polarized light bearing the first image. Through the method, the multi-frame low-resolution images are displayed in a time-sharing manner, and the high-resolution images seen by human eyes are displayed by utilizing the functions of vision persistence and vision synthesis of the human eyes.
It should be noted that, since the polarization grating is a diffraction device, the diffraction device has inherent dispersion characteristics. The deflection angle of the grating is linearly related to the wavelength of the input light beam, so that red, green and blue light rays with different wavelengths emitted by the display screen are subjected to dispersion after passing through the polarization grating, and the imaging effect of human eyes is poor, for example, the effect similar to a rainbow edge appears, so that the image seen by the human eyes is blurred, and the color is distorted. The double refraction device is a non-diffraction device and has no inherent dispersion characteristic, so that light passing through the double refraction device is not dispersed, and compared with a polarization grating, the imaging effect is better.
The display component may be a common Liquid Crystal Display (LCD), a common Organic Light Emitting Diode (OLED), or a silicon-based OLED, or other display devices, which is not limited in this application.
The polarization converter may be an Electronically Controlled Polarization Switch (ECPS). Illustratively, the electrically controlled polarization converter may be any one of nematic liquid crystal (nematic liquid crystals), vertical Alignment (VA) liquid crystal, in-plane switching (IPS) liquid crystal, electrically controlled Twisted Nematic Liquid Crystal (TNLC), electrically controlled nonlinear liquid crystal, or electrically controlled ferroelectric liquid crystal.
In one possible example, the control component controls the polarization converter to be in a non-powered (i.e., OFF) state for maintaining the polarization direction of the input polarized light, which may be understood as the input polarized light being the same as the polarization direction of the output polarized light, or as the polarization converter transmitting only the input polarized light. It should be noted that, when a light beam is transmitted through a certain optical component, there may be energy loss, but information carried in the light beam is not changed, and based on this, the input polarized light and the output polarized light which are only subjected to transmission processing are regarded as the same polarized light in the embodiments of the present application. The control component controls the polarization converter to be in a power-on condition, such as the applied voltage exceeds the threshold voltage Vc, for performing a rotation process on the polarization direction of the input polarized light, such as rotating the polarization direction of the input polarized light by 90 degrees.
In another possible example, the control component controls the polarization converter to be in a powered-up condition, such as an applied voltage exceeding a threshold voltage, for maintaining the polarization direction of the input polarized light. The control component controls the polarization converter to be in a non-powered state and is used for performing rotation processing on the polarization direction of the input polarized light, such as rotating the polarization direction of the input polarized light by 90 degrees.
Taking TNLC as an example, TNLC consists of two conductive substrates sandwiching a liquid crystal layer. When the TNLC is not powered, the polarization direction of incident polarized light passing through the TNLC is rotated by 90 degrees; when a voltage is applied to the twisted nematic liquid crystal over a threshold voltage Vc and liquid crystal molecules in the TNLC are erected, the polarization direction of incident polarized light passing through the TNLC remains unchanged, and polarized light with the same polarization state as the incident polarized light is still emitted. If the applied voltage is between 0 and Vc, the polarization direction of the incident polarized beam will rotate by 0-90 degrees after passing through the TNLC, depending on the applied voltage and the specific material of the TNLC.
In one possible implementation, the control component may be, for example, a processor, a microprocessor, a controller, or another control component, such as a general-purpose central processing unit (CPU), a general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a transistor logic device, a hardware component, or any combination thereof.
The structure of the birefringent device is described next; before that, its operating principle is introduced. Referring to fig. 7, a schematic diagram of a birefringent device is shown. The optical axis of the birefringent device lies in the plane of incidence, i.e. in a direction parallel to the plane of the paper. Depending on the polarization direction of the incident light, the birefringent device splits the output light into o light and e light. The o light, also called ordinary light, has its polarization direction perpendicular to the plane of incidence; the e light, also called extraordinary light, has its polarization direction in the plane of incidence.
Since the o light and the e light propagate in the birefringent device with different refractive indices, they may be displaced relative to each other after passing through the device, for example by a displacement amount a. The displacement amount a is related to the specification of the birefringent device; specifically, it depends on the thickness of the device and the included angle θ between the optical axis of the device and its surface normal. For example, the displacement amount a satisfies the condition shown in the following formula (1):
a = T · (n_o² − n_e²) · tanθ / (n_e² + n_o² · tan²θ)      (1)

where T represents the thickness of the birefringent device, n_o denotes the refractive index for o light, and n_e denotes the refractive index for e light.
From formula (1) it can be seen that, for the same displacement amount a, the smaller the thickness of the birefringent device, the larger the required refractive index difference Δn between o light and e light. Illustratively, the relationship between the thickness of the birefringent device and Δn is shown in fig. 8, taking θ = 45° as an example. The applicant has found through study that, for the same refractive index difference Δn, the thickness of the birefringent device is smallest in the vicinity of θ = 45°.
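The θ ≈ 45° observation can be checked numerically. The sketch below assumes the standard uniaxial walk-off expression a = T·(n_o² − n_e²)·tanθ / (n_e² + n_o²·tan²θ) and uses illustrative calcite-like indices (these specific index values are an assumption, not taken from the patent):

```python
import math

def walkoff_displacement(thickness: float, theta_deg: float,
                         n_o: float, n_e: float) -> float:
    """Lateral displacement of the e ray for a uniaxial plate at normal
    incidence, where theta is the angle between the optic axis and the
    surface normal."""
    t = math.tan(math.radians(theta_deg))
    return thickness * (n_o ** 2 - n_e ** 2) * t / (n_e ** 2 + n_o ** 2 * t ** 2)

# Illustrative calcite-like indices (assumed values for this sketch).
n_o, n_e = 1.658, 1.486

# Find the optic-axis angle that maximizes displacement per unit thickness;
# a plate cut near this angle can be thinnest for a given shift.
best_theta = max(range(1, 90),
                 key=lambda th: walkoff_displacement(1.0, th, n_o, n_e))
```

For these indices the maximum falls a few degrees below 45°, consistent with the text's statement that the thickness is smallest "in the vicinity of θ = 45°".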
The birefringent device may be a crystal, such as a quartz crystal, a barium borate crystal, a lithium niobate crystal, or a titanium dioxide crystal. Generally, the birefringence of a crystal is small, which results in a bulky VR or AR device. In the embodiments of the present application, liquid crystal may instead be used as the material of the birefringent device; the larger birefringence of liquid crystal materials allows a thinner device to be manufactured, which fits better into VR or AR equipment with limited space. For example, the material may be an ordinary liquid crystal material, or a liquid crystal polymer material cured by ultraviolet light or heat.
In a possible implementation, the display module may further include a folded optical path, as shown in fig. 9, taking a birefringent device as the polarization displacement device as an example. In the beam transmission direction, the display module comprises, in sequence, a display component, a polarization converter, a birefringent device, and a folded optical path. The folded optical path directs the light beam output by the birefringent device into the human eye. Illustratively, the folded optical path includes a half mirror, a reflective polarizer, one or more imaging lenses, and a plurality of phase films. A phase film may be used to change the polarization state of incident light. The imaging lens may be a single spherical or aspheric lens, or a combination of multiple spherical or aspheric lenses; such a combination can improve the imaging quality of the system and reduce its aberrations. Other optical path structures may also be used to direct the light output by the birefringent device into the human eye, which is not specifically limited in the embodiments of the present application.
In some embodiments, the image to be displayed is decomposed into multiple frames of low-resolution images. In a possible embodiment, the multiple frames may be obtained by sub-pixel sampling, that is, each sub-pixel in a pixel unit is sampled as a separate pixel, and the pixel value of each sub-pixel of each pixel point in each frame is then calculated to obtain the multiple frames of low-resolution images; for example, the pixel values may be determined by sub-pixel rendering (SPR). During sub-pixel sampling, the sub-pixels of one pixel point of a low-resolution image may be sampled from several adjacent pixel points of the image to be displayed.
For example, fig. 10 takes an RGB stripe arrangement and an oblique line as an example. Fig. 10 (a) is a schematic diagram of sampling by pixel unit, fig. 10 (b) is a schematic diagram of sampling by sub-pixel, and fig. 10 (c) is a schematic diagram of sampling at red, green, and blue sub-pixel intervals. As can be seen from fig. 10, after sub-pixel sampling is used, the oblique line is smoother and the jagged appearance is weaker. It should be noted that, when the display module adopts different sub-pixel arrangements, the maximum resolution improvement factor achievable with the scheme provided in the embodiments of the present application may differ; see, for example, table 1.
TABLE 1

Sub-pixel arrangement                    RGB stripe   Pentile RGB   Pentile RGBW   Delta RGB
Number of sub-pixels per pixel unit      3            2             4              2
Maximum resolution improvement factor    3            2             4              2
Take the first image of the multiple frames as an example. The sub-pixels included in a first pixel point of the first image are sampled from the sub-pixels of at least h adjacent pixel points of the image to be displayed, where h represents the number of low-resolution images into which the image to be displayed needs to be decomposed.
For example, the number of decomposed low-resolution images is 2. For example, when diagonal super-resolution is implemented, an image source (with a resolution of P × Q) is resampled to obtain the image to be displayed, whose resolution is 2M × 2N, while the resolution of the display module is M × N (P > M, Q > N); each group of 4 adjacent pixel points can then contribute several sub-pixels to form one pixel point of a low-resolution image. For example, the two frames of low-resolution images include a first image and a second image. The sub-pixels of a first pixel point in the first image are sampled from the sub-pixels of 4 adjacent pixel points in the image to be displayed; the sub-pixels of a second pixel point in the second image are sampled from the sub-pixels of the same 4 adjacent pixel points; and the position coordinates of the first pixel point in the first image are the same as the position coordinates of the second pixel point in the second image. Take the RGB stripe arrangement as an example. Illustratively, the image to be displayed may be decomposed into two images by sub-pixel interval sampling. For example, referring to fig. 11A, (a) in fig. 11A represents the image to be displayed before sampling, (b) represents sampled low-resolution image sub-frame 1, and (c) represents sampled low-resolution image sub-frame 2. It is to be understood that the R, G, and B sub-pixels inside each bold rectangular frame in (b) of fig. 11A serve as one pixel point of sub-frame 1. It should be understood that fig. 11A is only an example and does not limit the sampling manner. For another example, referring to fig. 11B, (a) in fig. 11B represents the image to be displayed before sampling, (b) represents sampled low-resolution image sub-frame 1, and (c) represents sampled low-resolution image sub-frame 2. It is to be understood that the R, G, and B sub-pixels inside each bold rectangular frame in (b) of fig. 11B serve as one pixel point of sub-frame 1.
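The decomposition into two sub-frames can be sketched as follows. This is a simplified illustration assuming an RGB stripe panel, a 2M × 2N source image stored as nested lists of (R, G, B) tuples, and one plausible sampling pattern; the patent's figures permit other patterns.

```python
def decompose_diagonal(image, M, N):
    """Split a 2M x 2N RGB image into two M x N sub-frames.

    Each sub-frame pixel gathers its R, G, B sub-pixels from the
    corresponding 2x2 block of source pixels; sub-frame 2 samples
    positions complementary to those used by sub-frame 1 (an assumed
    pattern for illustration).
    """
    sub1, sub2 = [], []
    for i in range(M):
        row1, row2 = [], []
        for j in range(N):
            block = [image[2 * i][2 * j], image[2 * i][2 * j + 1],
                     image[2 * i + 1][2 * j], image[2 * i + 1][2 * j + 1]]
            # Sub-frame 1: R from top-left, G from top-right, B from bottom-left.
            row1.append((block[0][0], block[1][1], block[2][2]))
            # Sub-frame 2: R from bottom-right, G from bottom-left, B from top-right.
            row2.append((block[3][0], block[2][1], block[1][2]))
        sub1.append(row1)
        sub2.append(row2)
    return sub1, sub2
```

In the patent's scheme the sampled values would additionally be refined by SPR weighting rather than copied directly.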
For another example, when horizontal super-resolution is implemented, an image source (with a resolution of P × Q) is resampled to obtain the image to be displayed, whose resolution is 2M × N, while the resolution of the display module is M × N; each group of 2 horizontally adjacent pixel points can contribute several sub-pixels to form one pixel point of a low-resolution image. For another example, with a resolution improvement factor of 2, when vertical super-resolution is implemented, the image source (with a resolution of P × Q) is resampled to obtain the image to be displayed, whose resolution is M × 2N, while the resolution of the display component is M × N; each group of 2 vertically adjacent pixel points can contribute several sub-pixels to form one pixel point of a low-resolution image. Take the RGB stripe arrangement as an example. The image to be displayed can be decomposed into two images by sub-pixel interval sampling. For example, referring to fig. 12, (a) in fig. 12 represents the image to be displayed before sampling, (b) represents sampled low-resolution image sub-frame 1, and (c) represents sampled low-resolution image sub-frame 2. It should be understood that fig. 12 is intended only as an example.
As another example, the number of decomposed low-resolution images is 4. For example, when horizontal, vertical, and diagonal super-resolution are implemented simultaneously, an image source (with a resolution of P × Q) is resampled to obtain the image to be displayed, whose resolution is 2M × 2N, while the resolution of the display component is M × N; each group of 4 adjacent pixel points can contribute several sub-pixels to form one pixel point of a low-resolution image.
It should be noted that, if the resolution of the image source is not kM × LN, the image source may be resampled so that the resolution of the image to be displayed is kM × LN, where k and L are positive integers whose specific values are determined by the resolution improvement factor and the super-resolution direction of the system. For example, when resolution doubling in the diagonal direction is required, k = 2 and L = 2. In some embodiments, the resolution of the image source is already kM × LN, i.e. the image source is the image to be displayed. In one possible example, the control component obtains the image to be displayed and then sends it to the display component in a time-division manner. For example, after the control component obtains the image source, it processes the image source according to the resolution of the image source and the resolution of the display component to obtain multiple frames of low-resolution images. In another possible example, the image to be displayed may be obtained by a mobile terminal associated with a display device (such as an AR device or a VR device); for example, the mobile terminal obtains the image to be displayed according to the resolution of the image source and the resolution of the display component, and then sends the image to be displayed to the control component. In yet another possible example, the image to be displayed may be obtained by a mobile terminal connected to the AR device or the VR device; for example, the mobile terminal obtains the image to be displayed according to the resolution of the image source and the resolution of the display component, decomposes the image to be displayed into multiple frames of low-resolution images, sends these images to the control component, and sends the control component a control signal instructing the display component to display the multiple frames of low-resolution images. The mobile terminal may be, for example, a mobile phone, a tablet computer, or a personal computer (PC).
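The selection of k and L from the super-resolution direction can be sketched as follows. This is a minimal illustration; the direction names and the table of factors are assumptions of this sketch, grounded in the examples given in the text (diagonal doubling uses k = L = 2; the combined horizontal/vertical/diagonal case also uses a 2M × 2N image).

```python
def target_resolution(direction: str, M: int, N: int):
    """Return (k, L) and the kM x LN resolution to which the image source
    must be resampled, given the super-resolution direction and the
    display resolution M x N."""
    factors = {
        "horizontal": (2, 1),  # 2M x N image, 2 sub-frames
        "vertical": (1, 2),    # M x 2N image, 2 sub-frames
        "diagonal": (2, 2),    # 2M x 2N image, 2 sub-frames
        "all": (2, 2),         # 2M x 2N image, 4 sub-frames (two stages)
    }
    k, L = factors[direction]
    return (k, L), (k * M, L * N)
```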
The following description takes the case where the pixel position adjustment assembly includes a polarization converter and a birefringent device as an example, as shown in fig. 3A and 3B. When the pixel position adjustment assembly adopts another structure, the implementation is similar to that with a polarization converter and a birefringent device, and the details are not repeated in this application.
The scheme provided by the embodiments of the present application is described below with reference to the structure of the display module shown in fig. 3A, taking a resolution improvement factor of 2 as an example. In this implementation, the control component or the terminal device can decompose the image to be displayed into two frames of low-resolution images; diagonal super-resolution is taken as an example. The sub-frames are displayed in a time-division manner, with the control component synchronously sending the modulation signal to the polarization converter and the display signal to the display component. For example, at time T0, a high-level AC signal is sent to the polarization converter while low-resolution image sub-frame 1 is synchronously sent to the display component, as shown in fig. 13. Taking a TNLC as the polarization converter, at time T0, as shown in fig. 14, the polarization converter does not convert the polarization direction of the input target polarized light, so the birefringent device emits o light, i.e., does not shift the position of the input target light beam, as shown in fig. 13. At time T1, as shown in fig. 13, the control component sends a low-level AC signal to the polarization converter while synchronously sending low-resolution image sub-frame 2 to the display component; the polarization converter then converts the polarization direction of the input target polarized light, so the birefringent device emits e light, i.e., shifts the position of the input target light beam, as shown in fig. 13. The image transmitted to the human eye through the optical path is therefore equivalent to the superposition of low-resolution image sub-frame 1 and low-resolution image sub-frame 2, as shown in fig. 15. Each bold black box in fig. 15 represents one pixel.
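The T0/T1 drive cycle just described can be sketched as a schedule that pairs each sub-frame with the modulation level sent synchronously to the polarization converter. The half-pixel diagonal offset is an assumption of this sketch (the actual offset is set by the birefringent device's walk-off).

```python
def drive_schedule():
    """One display period for 2x diagonal super-resolution.

    High level: polarization maintained -> o light -> no beam shift.
    Low level:  polarization rotated    -> e light -> beam shifted.
    """
    return [
        {"time": "T0", "subframe": 1, "level": "high", "shifted": False},
        {"time": "T1", "subframe": 2, "level": "low", "shifted": True},
    ]

def perceived_positions(schedule, offset=(0.5, 0.5)):
    """Positions (in display-pixel units) at which each sub-frame lands,
    assuming the shifted sub-frame is displaced by half a pixel
    diagonally."""
    return [offset if entry["shifted"] else (0.0, 0.0) for entry in schedule]
```

Superposing the two sub-frames at these interleaved positions is what the eye integrates into the higher-resolution image.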
With this method, the equivalent resolution of the display module can be improved by a factor of 2. Higher improvement factors can also be achieved, but require multiple groups of polarization converter and birefringent device combinations. By controlling the azimuthal angle between the birefringent device and the display assembly, resolution enhancement in the horizontal, vertical, or any other direction can be achieved. Because this resolution enhancement scheme displays the low-resolution images in a time-division manner and relies on the persistence of vision and visual synthesis of the human eye to synthesize the high-resolution image in the viewer's mind, the frame rate of the synthesized high-resolution image is reduced in proportion relative to the maximum frame rate of the display assembly.
As another example, take the RGB stripe arrangement and the display module structure shown in fig. 3B. In this implementation, the control component or the terminal device can decompose the image to be displayed into 4 frames of low-resolution images. Illustratively, the first pixel position adjustment component and the second pixel position adjustment component combine to achieve horizontal, vertical, and diagonal super-resolution. In some embodiments, the resolution of the display assembly is M × N. If the resolution of the image source is not 2M × 2N, the image source can be resampled so that the resolution of the image to be displayed is 2M × 2N. When decomposing the image to be displayed, 4 frames of images may be collected by sampling, after which the pixel value of each sub-pixel of each pixel point in each frame is calculated; for example, the pixel values may be determined by SPR. Taking one frame as an example, the sub-pixel values of a pixel point of that frame may be obtained by sampling the sub-pixels of 4 adjacent pixel points. Illustratively, the image to be displayed is decomposed into 4 frames by sub-pixel sampling. For example, referring to fig. 16A, (a) in fig. 16A represents the image to be displayed before sampling, and (b)-(e) in fig. 16A represent sampled low-resolution image sub-frames 1-4. Fig. 16A is merely an example and does not limit the specific sampling manner. For another example, as shown in fig. 16B, (a) in fig. 16B represents the image to be displayed before sampling, and (b)-(e) in fig. 16B represent sampled low-resolution image sub-frames 1-4. Fig. 16B is likewise merely an example and does not limit the specific sampling manner.
It will be appreciated that the o-axis of the second birefringent device coincides with the o-axis of the first birefringent device, and the e-axis of the second birefringent device is perpendicular to the e-axis of the first birefringent device. See table 1 for an example. When a polarization converter is in the ON state (driven with a high-level signal), it does not convert the polarization direction of the input target polarized light, so the corresponding birefringent device emits o light, i.e., does not shift the position of the target light beam. When a polarization converter is in the OFF state (driven with a low-level signal), it converts the polarization direction of the input target polarized light, so the corresponding birefringent device emits e light. The first birefringent device shifts the target beam in the horizontal direction when emitting e light, and the second birefringent device shifts the target beam in the vertical direction when emitting e light. In a first time unit, the control component controls the display component to display sub-frame 1 and controls both the first polarization converter and the second polarization converter to be in the ON state. That is, the control component controls the display component to start displaying sub-frame 1 at time T0 and sends a high-level modulation signal to both the first polarization converter and the second polarization converter. Therefore, the target polarized light carrying sub-frame 1 undergoes no beam shift after passing through the first birefringent device and the second birefringent device. In a second time unit, that is, when time T1 arrives, the control component controls the display component to display sub-frame 2, controls the first polarization converter to be in the OFF state, and controls the second polarization converter to be in the ON state.
That is, at time T1 the control component controls the display component to start displaying sub-frame 2, sends a low-level modulation signal to the first polarization converter, and sends a high-level modulation signal to the second polarization converter. Therefore, the target polarized light carrying sub-frame 2 undergoes a horizontal beam shift at the first birefringent device and no further shift at the second birefringent device; in other words, after passing through the first and second birefringent devices, the target polarized light carrying sub-frame 2 has been shifted horizontally. In a third time unit, that is, when time T2 arrives, the control component controls the display component to display sub-frame 3, controls the first polarization converter to be in the ON state, and controls the second polarization converter to be in the OFF state. That is, at time T2 the control component controls the display component to start displaying sub-frame 3, sends a high-level modulation signal to the first polarization converter, and sends a low-level modulation signal to the second polarization converter. Therefore, the target polarized light carrying sub-frame 3 undergoes no beam shift at the first birefringent device and a vertical beam shift at the second birefringent device; in other words, after passing through the first and second birefringent devices, the target polarized light carrying sub-frame 3 has been shifted vertically. In a fourth time unit, that is, when time T3 arrives, the control component controls the display component to display sub-frame 4, controls the first polarization converter to be in the OFF state, and controls the second polarization converter to be in the OFF state.
That is, at time T3 the control component controls the display component to start displaying sub-frame 4 and sends a low-level modulation signal to both the first polarization converter and the second polarization converter. Therefore, the target polarized light carrying sub-frame 4 undergoes a horizontal beam shift at the first birefringent device and a further vertical beam shift at the second birefringent device; in other words, after passing through the first and second birefringent devices, the target polarized light carrying sub-frame 4 has been shifted diagonally. The image transmitted to the human eye via the optical path then corresponds to the superposition of low-resolution image sub-frames 1 to 4, as shown in fig. 17: overlaying the 4 low-resolution image sub-frames above the arrow yields the effect below the arrow. It can be seen that the resolution is improved by a factor of 4.
It should be noted that, in the embodiment of the present application, the sequence of displaying the subframes 1 to 4 is not specifically limited.
TABLE 1

Time unit   Sub-frame   First polarization converter   Second polarization converter   Beam shift
First       1           ON (high level)                ON (high level)                 none
Second      2           OFF (low level)                ON (high level)                 horizontal
Third       3           ON (high level)                OFF (low level)                 vertical
Fourth      4           OFF (low level)                OFF (low level)                 diagonal
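The four-sub-frame drive sequence can be sketched as a schedule plus a rule that derives the resulting shift. This is an illustration of the scheme described above, under the convention that a high-level signal maintains polarization (no shift in that stage) and a low-level signal rotates it (that stage shifts the beam); stage 1 shifts horizontally, stage 2 vertically.

```python
def four_frame_schedule():
    """Drive levels for 4x super-resolution with two polarization
    converter / birefringent device stages."""
    return [
        {"subframe": 1, "conv1": "high", "conv2": "high"},  # no shift
        {"subframe": 2, "conv1": "low",  "conv2": "high"},  # horizontal
        {"subframe": 3, "conv1": "high", "conv2": "low"},   # vertical
        {"subframe": 4, "conv1": "low",  "conv2": "low"},   # diagonal
    ]

def shift_of(entry):
    """(horizontal, vertical) shift flags produced by a schedule entry.

    Because the e-axes of the two devices are perpendicular and their
    o-axes coincide, each stage shifts the beam exactly when its own
    converter is driven low (rotating the polarization onto that stage's
    e-axis)."""
    return (entry["conv1"] == "low", entry["conv2"] == "low")
```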
With this method, the equivalent resolution of the display component can be improved by a factor of 4. In the embodiments of the present application, resolution enhancement in the horizontal, vertical, or any other direction can be achieved by controlling the azimuthal angle between the birefringent device and the display component. Because this resolution enhancement scheme displays the low-resolution images in a time-division manner and relies on the persistence of vision and visual synthesis of the human eye to synthesize the high-resolution image in the viewer's mind, the frame rate of the synthesized high-resolution image is reduced in proportion relative to the maximum frame rate of the display assembly.
The embodiments of the present application split the image to be displayed by sub-pixel-level decomposition. After sub-pixel sampling, edges are smoother and the jagged appearance is weaker, but color fringing can occur at edges. To weaken such color fringing, the embodiments of the present application provide a sub-pixel rendering (SPR) method for computing the pixel value of each sampled sub-pixel. When determining the pixel value of a sampled sub-pixel, the determination may be based on the pixel values of sub-pixels of the same color within a set region around the sampling position. For example, the pixel value of a given-color sub-pixel of pixel (i, j) may be determined by weighting the pixel value of that sub-pixel of pixel (i, j) together with the pixel values of the same-color sub-pixels of surrounding pixels. Illustratively, the weight of each same-color sub-pixel in the set region is inversely related to its distance from the first sub-pixel, where the distance is measured between that same-color sub-pixel and the sampling position of the first sub-pixel in the image to be displayed.
In some embodiments, the size of the set region is related to the distance between the display assembly and the imaging plane of the wearable device. For example, the larger this distance, the larger the size of the set region; that is, the size of the set region is proportional to the distance between the display assembly and the imaging plane. In some embodiments, the size of the set region is related to the pixel size of the display element and may be configured according to it. In still other embodiments, the size of the set region is related to the display content of the display component and may be adaptively adjusted according to it: the set region may be enlarged if the display content is rich in detail and reduced if it is not. For another example, the size of the set region may be adaptively adjusted according to the scene to which the displayed image belongs; for example, comparing a character scene with a grassland scene, the set region corresponding to the character scene is larger than that corresponding to the grassland scene.
In some embodiments, the pixel value of a sampled sub-pixel may be determined using the following formula, taking the first sub-pixel as an example:

q(i, j) = α1·Q(i−1, j) + α2·Q(i, j−1) + α3·Q(i+1, j) + α4·Q(i, j+1) + α5·Q(i, j);

where q(i, j) represents the pixel value of the first sub-pixel; i and j represent the abscissa and ordinate of the pixel point of the first sub-pixel in the image to be displayed; Q(i, j) represents the pixel value of the sub-pixel at the sampling position of the first sub-pixel in the image to be displayed; and α1, α2, α3, α4, and α5 represent the respective weights.
In other embodiments, the pixel value of a sampled sub-pixel may be determined using the following formula, again taking the first sub-pixel as an example:

q(i, j+1) = α1·Q(i−1, j) + α2·Q(i, j−1) + α3·Q(i+1, j) + α4·Q(i, j+1) + β1·Q(i−1, j+1) + β2·Q(i, j) + β3·Q(i+1, j+1) + β4·Q(i, j+2);

where q(i, j+1) represents the pixel value of the first sub-pixel; i and j+1 represent the abscissa and ordinate of the pixel point of the first sub-pixel in the image to be displayed; Q(i, j+1) represents the pixel value of the sub-pixel at the sampling position of the first sub-pixel in the image to be displayed; and α1, α2, α3, α4, β1, β2, β3, and β4 represent the respective weights.
As an example, taking the RGB stripe sub-pixel arrangement, in the SPR process the pixel value g(i, j) of the green sub-pixel in the pixel at position (i, j) can be determined jointly by the green (G) components within the diamond-shaped region formed by the green sub-pixels at positions (i−2, j), (i−1, j−1), (i, j−2), (i+1, j−1), (i+2, j), (i+1, j+1), (i−1, j+1), and (i, j+2). Illustratively, green sub-pixels whose pixels have no more than 50% of their area within the diamond-shaped region do not participate in the determination. On this basis, the green sub-pixel at position (i, j) is determined by weighting the pixel values of the green sub-pixels of the five pixels at positions (i, j), (i−1, j), (i, j−1), (i+1, j), and (i, j+1).
Exemplarily, g(i, j) = α1·G(i−1, j) + α2·G(i, j−1) + α3·G(i+1, j) + α4·G(i, j+1) + α5·G(i, j). The weights α1, α2, α3, α4, and α5, corresponding to positions (i−1, j), (i, j−1), (i+1, j), (i, j+1), and (i, j), respectively, may be determined based on the distance of each pixel from the pixel at position (i, j). For example, α1 = α2 = α3 = α4 = 0.125 and α5 = 0.5.
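The green-sub-pixel weighting just described can be sketched as follows, using the example weights 0.125 and 0.5 from the text; representing the green plane as a list of lists is an assumption of this sketch.

```python
def spr_green(G, i, j, w_center=0.5, w_neighbor=0.125):
    """Sub-pixel-rendered green value at (i, j): a weighted sum of the
    co-located green sub-pixel and its four axial neighbours.

    G is a 2-D list of green components indexed G[i][j]; the caller must
    ensure (i, j) is an interior position so all four neighbours exist.
    """
    return (w_neighbor * (G[i - 1][j] + G[i][j - 1]
                          + G[i + 1][j] + G[i][j + 1])
            + w_center * G[i][j])
```

Because the five weights sum to 1, a uniform region is left unchanged, while an isolated bright sub-pixel is spread slightly into its neighbours, which is what softens color fringes at edges.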
For example, the sampled pixel value of the green sub-pixel of pixel (i, j) of low-resolution sub-frame 1 shown in fig. 18 may be determined by weighting the pixel values of the green sub-pixels of the pixels at the black-dot positions in the image to be displayed together with its own pixel value.
As another example, the pixel value r(i, j+1) of the red sub-pixel in the pixel at position (i, j+1) may be determined jointly by the red (R) components within the diamond-shaped region formed by the red sub-pixels at positions (i−2, j+1), (i−1, j), (i, j−1), (i+1, j+2), (i, j+3), and (i−1, j+1). Illustratively, red sub-pixels whose pixels have no more than 50% of their area within the diamond-shaped region do not participate in the determination.
On this basis, the red sub-pixel at position (i, j+1) is determined by weighting the pixel values of the red sub-pixels of the eight pixels at positions (i−1, j), (i, j−1), (i+1, j), (i, j+1), (i−1, j+1), (i, j), (i+1, j+1), and (i, j+2).
Illustratively, the eight pixels may be classified into two types: the red sub-pixels of the four pixels above, below, left of, and right of position (i, j+1) form the second type, and the red sub-pixels of the remaining pixels form the first type. The weight distribution between the two types of sub-pixels is 0.5 : 0.5. The pixel value r(i, j+1) of the red sub-pixel at position (i, j+1) can then be determined as follows:
r(i, j+1) = α1·R(i−1, j) + α2·R(i, j−1) + α3·R(i+1, j) + α4·R(i, j+1) + β1·R(i−1, j+1) + β2·R(i, j) + β3·R(i+1, j+1) + β4·R(i, j+2).
the weight corresponding to each of the 8 sub-pixels can be determined by the distance between the geometric center of the pixel where the 8 sub-pixels are located and the red sub-pixel in the position (i, j + 1). Take a sub-pixel aspect ratio of 1: 3 as an example.
Illustratively, the weights corresponding to the 8 sub-pixels are determined by distance-based formulas, three for the first type and two for the second type [the original formulas are rendered as images and are not reproduced here], in which each weight depends on the corresponding distance raised to the power x, where x represents the exponent governing how the weight assigned in calculating the red sub-pixel's pixel value falls off with distance; x is less than 0.
For example, the sampled pixel value of the red sub-pixel of pixel (i, j+1) of low-resolution sub-frame 1 shown in fig. 19 may be determined by weighting the pixel values of the red sub-pixels of the pixels at the black-dot positions in the image to be displayed together with its own pixel value. The weights corresponding to the 8 sub-pixels involved in determining the pixel value of the red sub-pixel at (i, j+1) can be seen in fig. 19, where the solid black dot indicates the red sub-pixel at (i, j+1) and the dashed hollow dots indicate the center positions of the 8 pixels whose red sub-pixels contribute to that pixel value. Each weight can then be determined from the distance between the corresponding dashed hollow dot and the solid black dot.
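Since the exact weight formulas are not reproduced in this text, the following is only a hedged sketch of the scheme they describe: it assumes that within each type a sub-pixel's weight is proportional to d**x with x < 0 (closer sub-pixels weigh more), and that each type's weights sum to 0.5, per the stated 0.5 : 0.5 split between the two types.

```python
def red_weights(distances_type1, distances_type2, x=-1.0):
    """Distance-based weights for the 8 red sub-pixels.

    Assumption: within each type, weight ~ d**x (x < 0), normalized so
    that each type contributes a total weight of 0.5. The distances are
    measured from each pixel's geometric center to the red sub-pixel
    being computed.
    """
    def norm(ds):
        raw = [d ** x for d in ds]
        total = sum(raw)
        return [0.5 * r / total for r in raw]
    return norm(distances_type1), norm(distances_type2)
```

With x = −1, a first-type sub-pixel at twice the distance of another receives half its weight, while the four second-type (axial) sub-pixels at equal distances each receive 0.125.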
Based on the manner of determining the pixel value of the red sub-pixel, the manner of determining the pixel value b(i, j + 1) of the blue sub-pixel at the (i, j + 1) position in the SPR process can be obtained.
In some embodiments, the display module supports two working modes: a super-resolution mode and a normal mode. The control component performs the super-resolution processing when determining that the super-resolution mode is enabled, and keeps the resolution unchanged and the frame rate not reduced when determining that the normal mode is enabled. In the normal mode, the TNLC uniformly keeps the polarization direction of the target polarized light unchanged (or uniformly rotates the polarization direction by 90 degrees). In some embodiments, the control component may resample the image source to the resolution of the display component for direct output to the display component. The normal mode may be suitable for some high-frame-rate scenes, such as games.
In the embodiments of the application, in order to improve the resolution of an image and improve the imaging effect, another display module is further provided, in which super-resolution is realized through a polarization rotator and a birefringent device.
Referring to fig. 20, the display module includes a display component 2301, a polarization rotator 2302, a birefringent device 2303 and a control component 2304. The display component 2301, the polarization rotator 2302, and the birefringent device 2303 are disposed in this order along the optical path transmission direction. Referring to fig. 21, the display module may further include a folded light path 2305. The display component 2301 is used to display images. The polarization rotator 2302 can modulate the polarization direction of incident linearly polarized light according to the amplitude of a voltage signal applied to it, enabling polarization rotation by any angle, and can implement individual pixel-level or sub-pixel-level control through an array of transistors on the polarization rotator 2302. The display component 2301 receives the image to be processed and displays it, where the resolution of the image to be processed is the same as the resolution of the display component. The polarization rotator 2302 adjusts the polarization direction of the light beam of each pixel (or each sub-pixel in the pixel) of the image to be processed under the control of the control component 2304, and outputs the result to the birefringent device 2303. The birefringent device 2303 decomposes the light beam of each pixel included in the image to be processed, outputs a first target polarized light for projecting a first sub-image at a first position, and outputs a second target polarized light for projecting a second sub-image at a second position. The polarization directions of the polarized light input at different pixel positions may be different, and the decomposition ratios of two pixels with different polarization directions are also different. The operation of the birefringent device 2303 can be seen in the description of fig. 6.
The birefringent device divides its output light into o light and e light according to the polarization direction of the incident light. The o light may be referred to as ordinary light, with its polarization direction perpendicular to the plane of incidence. The e light may be referred to as extraordinary light, with its polarization direction lying in the plane of incidence. In some embodiments, the birefringent device outputs only o light when a light beam of one polarization direction is input, and outputs only e light when a light beam of another polarization direction is input; for the remaining polarization directions, both o light and e light are output. By using this principle, target polarized light with different polarization directions is input into the birefringent device for different pixel or sub-pixel positions, so that after the pixels at the two positions output by the birefringent device 2303 are superposed, their pixel values are approximately the same as, or infinitely close to, the pixel values of the pixels at the corresponding positions of the image to be displayed. Taking the first pixel and the second pixel as an example, the first pixel and the second pixel are two pixels with different polarization directions in the image to be processed, and the decomposition ratio of the first pixel is different from that of the second pixel. The decomposition ratio of the first pixel is the ratio of the luminous intensity of the light beam of the first pixel projected, after decomposition, onto the pixel of the first sub-image to the luminous intensity projected onto the pixel of the second sub-image, and the decomposition ratio of the second pixel is the ratio of the luminous intensity of the light beam of the second pixel projected, after decomposition, onto the pixel of the first sub-image to the luminous intensity projected onto the pixel of the second sub-image.
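As an illustrative sketch only, the intensity split between o light and e light for linearly polarized input can be modeled with an idealized Malus-law relation: a fraction cos²θ goes to one component and sin²θ to the other, where θ is the angle between the input polarization and the crystal's principal plane. This simplified model is an assumption for illustration, not the patent's stated formula.

```python
import math

def oe_split(theta_deg):
    """Return (o_fraction, e_fraction) of the input intensity for a
    linearly polarized beam at angle theta_deg to the principal plane,
    under an idealized Malus-law model of the birefringent device."""
    t = math.radians(theta_deg)
    return math.cos(t) ** 2, math.sin(t) ** 2
```

At 0 degrees all light exits as one component, at 90 degrees as the other, and at 45 degrees the beam splits evenly, matching the text's observation that intermediate polarization directions output both o light and e light.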
The polarization rotator 2302 may be any one of a twisted nematic liquid crystal polarization rotator (TNPR), an in-plane switching polarization rotator (IPSPR), a cholesteric liquid crystal (CLC) combined with a quarter-wave plate, a TNLC with a thin film transistor (TFT) circuit, and the like.
Taking the TNLC with TFT as an example, it is composed of two conductive substrates sandwiching a liquid crystal layer. When the voltage applied to the TNLC through the TFT is between 0 and Vc, the polarization direction of the incident polarized beam is rotated by 0 to 90 degrees after passing through the TNLC, depending on the applied voltage and the specific material of the TNLC.
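A minimal sketch of the voltage-to-rotation mapping described above. A linear response over [0, Vc] is assumed purely for illustration; the real curve depends on the applied voltage and the specific TNLC material, as the text notes.

```python
def tnlc_rotation(voltage, v_c, max_rotation=90.0):
    """Rotation angle (degrees) for an applied voltage in [0, v_c],
    assuming (for illustration) a linear voltage-to-rotation response."""
    if not 0.0 <= voltage <= v_c:
        raise ValueError("voltage outside the modulation range [0, Vc]")
    return max_rotation * voltage / v_c
```

For example, half of Vc would rotate the polarization by 45 degrees under this assumed linear model.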
The display component 2301 may be a general liquid crystal display (LCD), an OLED display, or a more advanced micro-LED display. The OLED display has higher light-emitting efficiency and a higher contrast ratio, while the micro-LED display has higher luminous brightness and can be applied to scenes requiring stronger brightness.
In one possible implementation, the control component 2304 may be a processor, a microprocessor, a controller, or another control component, such as a general-purpose central processing unit (CPU), a general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field programmable gate array (FPGA) or other programmable logic device, a transistor logic device, a hardware component, or any combination thereof.
The birefringent device may be a crystal such as a quartz crystal, a barium borate crystal, a lithium niobate crystal, or a titanium dioxide crystal. Generally, the birefringence of such crystals is small, resulting in a bulky VR or AR device. In the embodiments of the application, a liquid crystal can instead be used as the material of the birefringent device; by exploiting the larger birefringence of liquid crystal materials, the manufactured birefringent device is thinner and can be better accommodated in a VR or AR device with limited space. For example, the material may be a liquid crystal, or a liquid crystal polymer cured by ultraviolet light or heat. Illustratively, the birefringent device 2303 adopts the liquid crystal polymer RM 257; after surface orientation or electric-field orientation, the included angle between the director of the liquid crystal molecules and the surface is 45 degrees, and the liquid crystal polymer is then formed through ultraviolet curing.
For convenience of description, the components in the display module are not numbered again. Referring to fig. 22, for example, the light beam emitted by the display component is vertically linearly polarized light. After the vertically polarized light enters the polarization rotator, the control component controls the voltage signal applied to the polarization rotator, and the polarization rotator can rotate the vertically linearly polarized light into linearly polarized light of any angle. Because the polarization rotator can realize sub-pixel-level control, the polarization direction of each pixel, or even each sub-pixel, of the display component after passing through the polarization rotator is not necessarily the same. The light beam then reaches the birefringent device and is divided by it into two beams, o light and e light. The ratio of the intensities of the two beams is determined by the polarization direction (angle) of the incident light, as shown in fig. 22.
From the above, after passing through the birefringent device, one beam of light is in effect divided into two beams, which correspond to two images: a first sub-image and a second sub-image. Depending on how the birefringent device is arranged in the display module, the position of the first sub-image and the position of the second sub-image are separated by Py/2 in the vertical direction and/or by Px/2 in the horizontal direction, where Py represents the pitch of the pixels in the vertical direction and Px represents the pitch of the pixels in the horizontal direction. After the two sub-images enter the human eyes, a high-resolution image is synthesized in the mind by the persistence-of-vision and visual-synthesis functions of the human visual system, so that the resolution is improved without reducing the frame rate of the synthesized high-resolution image.
In this embodiment of the application, the control component may estimate and adjust the luminous intensity of each pixel to be projected as the first sub-image and the luminous intensity of each pixel to be projected as the second sub-image, so that the similarity between the image obtained by superimposed projection of the adjusted first sub-image and second sub-image and the image to be displayed is greater than a set threshold, or so that the superimposed projected image is substantially the same as the image to be displayed. The set threshold is determined according to the capability of human eyes to perceive image differences. The control component may then control the polarization rotator to adjust the polarization direction of the light beam of each pixel of the image to be processed according to the adjusted luminous intensity of each pixel of the first sub-image and of the second sub-image. For the polarization rotator, there is a correspondence between the polarization direction and the ratio in which the luminous intensity is distributed between the output o light and e light. Therefore, after the luminous intensity of each pixel of the first sub-image and the second sub-image is determined, the polarization direction of the input light beam required for each pixel can be determined according to this correspondence. For the polarization rotator, different polarization directions correspond to different applied voltages, so the control component can apply the corresponding voltage to each pixel or sub-pixel according to the voltage corresponding to the required polarization direction.
Illustratively, the voltage signal strength of each pixel (or sub-pixel) on the polarization rotator and the content of the display component can be obtained by the following optimization equation (1):
min_{V1, V2} || S - M*[V1; V2] ||^2    (1)
where [V1; V2] denotes the column vector obtained by stacking the first sub-image V1 on top of the second sub-image V2.
wherein S is a column vector consisting of all pixel values of the target image; R is the high-resolution image composed of the low-resolution first sub-image V1 formed by the o light and the low-resolution second sub-image V2 formed by the e light, and can be understood as the actually displayed image. The pixels or sub-pixels in R are arranged into a column vector in a manner similar to the arrangement of the pixels or sub-pixels in S.
M is a mapping matrix that maps the first sub-image V1 and the second sub-image V2 to R.
Illustratively, taking diagonal super-resolution as an example, referring to fig. 23, the image to be processed is decomposed after the birefringent device into a first sub-image (o-light sub-frame V1) and a second sub-image (e-light sub-frame V2). The image composed of the first sub-image and the second sub-image is shown in fig. 23. Taking pixel 9 in the high-resolution image as an example, pixel 9 is composed of pixel 2 of the o-light sub-frame V1 and pixel 6 of the e-light sub-frame V2; accordingly, in the mapping matrix M, all elements of the ninth row are 0 except the 2nd element and the (P + 6)th element, which are 1 (where P is the total number of pixels of a sub-frame).
Illustratively, the mapping matrix M is as follows:
[The mapping matrix M appears as an image in the original publication and is not reproduced here.]
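As an illustrative sketch, the structure of M described in the text can be built programmatically: each row of M holds a 1 at the contributing V1 pixel index and a 1 at P plus the contributing V2 pixel index. Only the single row discussed in the text (pixel 9 from V1 pixel 2 and V2 pixel 6) is reproduced; the full pixel-to-sub-frame assignment and the sub-frame size P = 16 used below are assumptions, since the complete matrix appears as an image in the original.

```python
def build_mapping(pairs, n_high, n_sub):
    """Build the mapping matrix M (as nested lists) for equation (1).

    pairs  : dict {high_res_pixel: (v1_pixel, v2_pixel)}, 1-based indices
    n_high : number of pixels in the high-resolution image
    n_sub  : P, the total number of pixels of one sub-frame
    """
    M = [[0.0] * (2 * n_sub) for _ in range(n_high)]
    for r, (v1, v2) in pairs.items():
        M[r - 1][v1 - 1] = 1.0          # contribution from o-light sub-frame V1
        M[r - 1][n_sub + v2 - 1] = 1.0  # contribution from e-light sub-frame V2
    return M

# Toy example reproducing the row discussed in the text.
P = 16  # assumed sub-frame size for illustration
M = build_mapping({9: (2, 6)}, n_high=25, n_sub=P)
```

Given such an M and the target column vector S, the sub-frame contents V1 and V2 would then be obtained by solving the least-squares problem of equation (1).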
Based on the above and the same technical concept, an embodiment of the present application further provides an imaging control method, which is applied to a wearable device. The wearable device includes a display component and a pixel position adjusting component, and the pixel position adjusting component includes a polarization converter and a polarization displacement device.
Fig. 24 is a schematic flow chart of a possible imaging control method. The method is applied to a display device which comprises a display component and a pixel position adjusting component, wherein the display component comprises a plurality of pixels, and each pixel in the plurality of pixels comprises a plurality of sub-pixels.
2301, receiving an image to be displayed, and performing sub-pixel-level decomposition on the image to be displayed to obtain multi-frame images; the resolution of each frame image in the multi-frame images is the same as that of the display component and is smaller than that of the image to be displayed.
2302, controlling the pixel position adjusting component to adjust the position of each frame of image displayed by the display component in a time-sharing manner.
The time at which the display component displays a first image is synchronized with the time at which the pixel position adjusting component adjusts the position of the first image, where the first image is any one of the multi-frame images.
In some embodiments, the display device supports two modes: a super-resolution mode and a normal mode.
When the super-resolution mode of the display device is enabled, sub-pixel-level decomposition is performed on the image to be displayed to obtain multi-frame images.
When the super-resolution mode of the display device is not enabled, that is, when the normal mode is enabled: the image to be displayed is down-sampled to obtain an image to be processed; the image to be processed is input to the display component, so that the display component displays it; and the image to be processed is output at a set position through the pixel position adjusting component. It can be understood that the polarization converter no longer adjusts its switching state, i.e., it is no longer necessary to control the polarization direction of the output target polarized light.
In some embodiments, the sub-pixels included in the first pixel point in the first image in the multi-frame image are sampled from the sub-pixels included in at least h adjacent pixel points included in the image to be displayed.
And h is the image quantity of the multi-frame images, the first image is any one of the multi-frame images, and the first pixel point is any one of the pixel points in the first image.
In some embodiments, the pixel value of a first sub-pixel included in the first pixel point is determined according to the pixel values of the sub-pixels, of the same color as the first sub-pixel, included in a set region of the image to be displayed;
the geometric center of the set area is the sampling position of the first sub-pixel in the image to be displayed.
In some embodiments, the pixel value of the first sub-pixel included in the first pixel point is obtained by weighted summation of the pixel values of the sub-pixels, of the same color as the first sub-pixel, included in the set region;
wherein the weight of each sub-pixel of the same color as the first sub-pixel in the set region is inversely proportional to the sub-pixel distance, and the sub-pixel distance is the distance between that sub-pixel and the sampling position of the first sub-pixel in the image to be displayed.
In some embodiments, the display device is a wearable device, and the size of the set region is related to the distance between the display component and the imaging plane of the wearable device.
In some embodiments, the size of the set region is related to the pixel size of the display component.
In some embodiments, the size of the set region is related to the display content of the display component.
In some embodiments, the pixel value of the first sub-pixel satisfies the condition shown by the following formula:
q(i, j) = α1*Q(i-1, j) + α2*Q(i, j-1) + α3*Q(i+1, j) + α4*Q(i, j+1) + α5*Q(i, j);
wherein q(i, j) represents the pixel value of the first sub-pixel; i represents the abscissa and j the ordinate of the pixel point of the first sub-pixel in the image to be displayed; Q(i, j) represents the pixel value of the sub-pixel at the sampling position of the first sub-pixel in the image to be displayed; and α1, α2, α3, α4 and α5 represent the respective weights.
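A minimal sketch of the five-tap weighted sum above. The pixel values and the particular weight tuple are illustrative assumptions; the only property taken from the text is the structure of the sum (the centre sample plus its four axial neighbours, each with its own weight). Normalising the weights to sum to 1, which preserves brightness on uniform regions, is also an assumption.

```python
def five_tap(Q, i, j, weights):
    """q(i, j) as the alpha-weighted sum of the sample at (i, j) and its
    four axial neighbours. Q is a dict keyed by (i, j) coordinates;
    weights = (a1, a2, a3, a4, a5) in the order of the formula above."""
    a1, a2, a3, a4, a5 = weights
    return (a1 * Q[(i - 1, j)] + a2 * Q[(i, j - 1)] +
            a3 * Q[(i + 1, j)] + a4 * Q[(i, j + 1)] + a5 * Q[(i, j)])
```

With weights summing to 1, a uniform neighbourhood is passed through unchanged, as expected of a resampling filter.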
In some embodiments, the pixel value of the first sub-pixel satisfies the condition shown by the following formula:
q(i, j+1) = α1*Q(i-1, j) + α2*Q(i, j-1) + α3*Q(i+1, j) + α4*Q(i, j+1) + β1*Q(i-1, j+1) + β2*Q(i, j) + β3*Q(i+1, j+1) + β4*Q(i, j+2);
wherein q(i, j+1) represents the pixel value of the first sub-pixel, i represents the abscissa and j+1 the ordinate of the pixel point of the first sub-pixel in the image to be displayed, Q(i, j+1) represents the pixel value of the sub-pixel at the sampling position of the first sub-pixel in the image to be displayed, and α1, α2, α3, α4, β1, β2, β3 and β4 represent the respective weights.
In some embodiments, the pixel position adjustment assembly includes a polarization converter and a polarization displacement device;
the method for controlling the pixel position adjusting component to adjust the position of each frame of image displayed by the display component in a time-sharing way comprises the following steps: inputting the multi-frame images to a display assembly in a time-sharing manner, so that the display assembly emits target polarized light in the time-sharing manner, wherein the target polarized light is used for bearing each frame of image in the multi-frame images; and controlling the polarization converter to adjust the polarization direction of the output target polarized light in a time-sharing manner, so that when the polarization direction of the target polarized light output by the polarization converter is the first polarization direction, the polarization displacement device outputs the target polarized light at the first position, and when the polarization direction of the target polarized light output by the polarization converter is the second polarization direction, the polarization displacement device outputs the target polarized light at the second position.
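The time-shared control described above can be sketched as a simple loop that alternates sub-frames and polarization states in lockstep. The `display` and `converter` callables are hypothetical stand-ins for the display-component and polarization-converter driver interfaces, and the two state names are placeholders; only the alternation pattern comes from the text.

```python
FIRST, SECOND = "first_polarization", "second_polarization"

def time_shared_display(frames, display, converter):
    """Show each sub-frame while the polarization converter selects the
    direction that routes it to the matching output position: odd frames
    go to the first position, even frames to the second."""
    directions = [FIRST, SECOND]
    schedule = []
    for k, frame in enumerate(frames):
        direction = directions[k % 2]
        converter(direction)   # set the polarization direction for this frame
        display(frame)         # display component emits this sub-frame
        schedule.append((frame, direction))
    return schedule
```

The synchronization requirement of the text (frame display time matches adjustment time) is represented here simply by ordering the two calls within one loop iteration.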
In some embodiments, the multi-frame image includes a first image and a second image;
the method for controlling the polarization converter to adjust the polarization direction of the output target polarized light in a time-sharing manner comprises the following steps:
controlling the polarization converter to adjust the polarization direction of the output target polarized light bearing the first image to be a first polarization direction in a first time unit, and controlling the polarization converter to adjust the polarization direction of the output target polarized light bearing the second image to be a second polarization direction in a second time unit, so that the interval between a first position of the output target polarized light bearing the first image and a second position of the output target polarized light bearing the second image of the polarization displacement device in the horizontal direction is Px/2; px represents a pitch of adjacent pixels of the first image or the second image in the horizontal direction.
In some embodiments, the sub-pixels included in the first pixel point in the first image are obtained by sampling the sub-pixels included in two horizontally adjacent pixel points included in the image to be displayed;
the sub-pixels included by the second pixel points in the second image are obtained by sampling the sub-pixels included by two horizontally adjacent pixel points;
the position coordinates of the first pixel points in the first image are the same as the position coordinates of the second pixel points in the second image.
In some embodiments, the multi-frame image includes a first image and a second image;
the method for controlling the polarization converter to adjust the polarization direction of the output target polarized light in a time-sharing manner comprises the following steps:
controlling the polarization converter to adjust the polarization direction of the output target polarized light bearing the first image to be a first polarization direction in the first time unit, and controlling the polarization converter to adjust the polarization direction of the output target polarized light bearing the second image to be a second polarization direction in the second time unit, so that the interval between a first position of the output target polarized light bearing the first image and a second position of the output target polarized light bearing the second image, which are output by the polarization displacement device, in the vertical direction is Py/2; py denotes a pitch of adjacent pixels of the first image or the second image in the vertical direction.
In some embodiments, the sub-pixels included in the first pixel point in the first image are obtained by sampling the sub-pixels included in two adjacent pixel points in the vertical direction in the image to be displayed;
the sub-pixels included by the second pixel points in the second image are obtained by sampling the sub-pixels included by two adjacent pixel points in the vertical direction;
the position coordinates of the first pixel points in the first image are the same as the position coordinates of the second pixel points in the second image.
In some embodiments, the multi-frame image comprises a first image and a second image;
the method for controlling the polarization converter to adjust the polarization direction of the output target polarized light in a time-sharing manner comprises the following steps:
The polarization converter is controlled to adjust the polarization direction of the input target polarized light bearing the first image to a first polarization direction in the first time unit, and to adjust the polarization direction of the input target polarized light bearing the second image to a second polarization direction in the second time unit, so that the first position at which the polarization displacement device outputs the target polarized light bearing the first image and the second position at which it outputs the target polarized light bearing the second image are vertically spaced by Py/2 and horizontally spaced by Px/2, where Py represents the pitch of adjacent pixels of the first image or the second image in the vertical direction, and Px represents the pitch of adjacent pixels of the first image or the second image in the horizontal direction.
In some embodiments, the sub-pixels included in the first pixel point in the first image are obtained by sampling the sub-pixels included in two diagonally adjacent pixel points in the image to be displayed;
the sub-pixels included by the second pixel points in the second image are obtained by sampling the sub-pixels included by two adjacent pixel points in the diagonal direction;
the position coordinates of the first pixel points in the first image are the same as the position coordinates of the second pixel points in the second image.
The embodiment of the application also provides another imaging control method, and the imaging control method is applied to wearable equipment. The wearable device includes a display component and a pixel position adjustment component including a polarization rotator and a birefringent device.
Referring to fig. 25, a flowchart of a possible imaging control method is shown.
2401, receiving an image to be displayed, performing down-sampling processing on the image to be displayed to obtain an image to be processed, and inputting the image to be processed to the display component, so that the display component emits target polarized light bearing the image to be processed, wherein the resolution of the image to be processed is the same as that of the display component;
2402, controlling the polarization rotator to adjust the polarization direction of the output light beam of each pixel of the image to be processed, so that the birefringent device decomposes the light beam of each pixel included in the image to be processed, outputs a first target polarized light for projecting a first sub-image at a first position, and outputs a second target polarized light for projecting a second sub-image at a second position;
the decomposition ratio of the first pixel is different from that of the second pixel, the first pixel and the second pixel are two pixels with different polarization directions in the image to be processed, the decomposition ratio of the first pixel is the ratio of the luminous intensity of the pixel projected on the first sub-image to the luminous intensity of the pixel projected on the second sub-image after the light beam of the first pixel is decomposed, and the decomposition ratio of the second pixel is the ratio of the luminous intensity of the pixel projected on the first sub-image to the luminous intensity of the pixel projected on the second sub-image after the light beam of the second pixel is decomposed.
In some embodiments, the position of the first sub-image is vertically spaced from the position of the second sub-image by Py/2 and/or horizontally spaced by Px/2, where Py represents the pitch of the pixels in the vertical direction and Px represents the pitch of the pixels in the horizontal direction.
In some embodiments, controlling the polarization rotator to adjust the polarization direction of the light beam of each pixel of the output image to be processed includes:
the polarization rotator is controlled to adjust the polarization direction of the light beam of the sub-pixel comprised by each pixel of the image to be processed.
In some embodiments, controlling the polarization rotator to adjust the polarization direction of the light beam for each pixel of the output image to be processed comprises:
estimating the luminous intensity of each pixel to be projected as a first sub-image and the luminous intensity of each pixel to be projected as a second sub-image according to the luminous intensity of each pixel of the image to be displayed; the resolution of the first sub-image is the same as that of the second sub-image and is smaller than that of the image to be displayed;
and controlling the polarization rotator to adjust the polarization direction of the light beam of each pixel of the image to be processed according to the luminous intensity of each pixel to be projected as the first sub-image and the luminous intensity of each pixel to be projected as the second sub-image.
In some embodiments, controlling the polarization rotator to adjust the polarization direction of the light beam of each pixel of the output image to be processed includes:
estimating and adjusting the luminous intensity of each pixel to be projected as a first sub-image and the luminous intensity of each pixel to be projected as a second sub-image, so that the similarity between the image after the overlapped projection of the adjusted first sub-image and the second sub-image and the image to be displayed is greater than a set threshold; the set threshold is determined according to the perception capability of human eyes on image difference;
and controlling the polarization rotator to adjust the polarization direction of the light beam of each pixel of the image to be processed according to the adjusted luminous intensity of each pixel of the first sub-image and the adjusted luminous intensity of each pixel of the second sub-image.
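As an illustrative sketch of the last step, once the two target intensities for a pixel are known, the required polarization angle can be recovered from the intensity correspondence. The idealized Malus-law split (first intensity proportional to cos²θ, second to sin²θ) is an assumption standing in for the correspondence the text refers to; the actual relation depends on the polarization rotator and birefringent device used.

```python
import math

def polarization_angle(i_first, i_second):
    """Polarization angle (degrees) whose assumed Malus-law split
    reproduces the two target intensities:
    i_first ∝ cos²θ, i_second ∝ sin²θ  =>  θ = atan2(√i_second, √i_first)."""
    return math.degrees(math.atan2(math.sqrt(i_second), math.sqrt(i_first)))
```

For example, equal target intensities for the two sub-images correspond to a 45-degree polarization angle, at which the birefringent device splits the beam evenly. The control component would then apply the voltage associated with this angle to the pixel's region of the polarization rotator.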
The method steps in the embodiments of the present application may be implemented by hardware, or may be implemented by software instructions executed by a processor. The software instructions may consist of corresponding software modules that may be stored in random access memory (RAM), flash memory, read-only memory (ROM), programmable ROM, erasable PROM (EPROM), electrically EPROM (EEPROM), registers, a hard disk, a removable hard disk, a CD-ROM, or any other form of storage medium known in the art. An exemplary storage medium is coupled to the processor such that the processor can read information from, and write information to, the storage medium. Of course, the storage medium may also be integral to the processor. The processor and the storage medium may reside in an ASIC. In addition, the ASIC may be located in a head-mounted display device or a terminal device. Of course, the processor and the storage medium may also reside as discrete components in a head-mounted display device or a terminal device.
In the above embodiments, the implementation may be wholly or partially realized by software, hardware, firmware, or any combination thereof. When implemented in software, the embodiments may be implemented in whole or in part in the form of a computer program product. The computer program product includes one or more computer programs or instructions. When the computer program or instructions are loaded and executed on a computer, the processes or functions described in the embodiments of the present application are performed in whole or in part. The computer may be a general purpose computer, a special purpose computer, a computer network, a network appliance, a user device, or other programmable apparatus. The computer program or instructions may be stored in a computer readable storage medium or transmitted from one computer readable storage medium to another computer readable storage medium; for example, the computer program or instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center by wire or wirelessly. The computer readable storage medium may be any available medium that can be accessed by a computer, or a data storage device such as a server or data center that integrates one or more available media. The usable medium may be a magnetic medium, such as a floppy disk, a hard disk, or a magnetic tape; an optical medium, such as a digital video disk (DVD); or a semiconductor medium, such as a solid state drive (SSD).
In the embodiments of the present application, unless otherwise specified or conflicting with respect to logic, the terms and/or descriptions in different embodiments have consistency and may be mutually cited, and technical features in different embodiments may be combined to form a new embodiment according to their inherent logic relationship.
In the present application, "at least one" means one or more, and "a plurality" means two or more. "And/or" describes an association relationship between associated objects and indicates that three relationships may exist; for example, "A and/or B" may mean: A exists alone, both A and B exist, or B exists alone, where A and B may be singular or plural. "At least one of the following" or similar expressions refer to any combination of the listed items, including any combination of a single item or plural items. For example, "at least one (item) of a, b, or c" may represent: a, b, c, "a and b", "a and c", "b and c", or "a and b and c", where a, b, and c may each be singular or plural. In the text descriptions of this application, the character "/" generally indicates an "or" relationship between the associated objects; in the formulas of this application, the character "/" indicates a "division" relationship between the associated objects. In the present application, the notation "(a, b)" denotes an open interval, greater than a and less than b; "[a, b]" denotes a closed interval, greater than or equal to a and less than or equal to b; "(a, b]" denotes a half-open interval, greater than a and less than or equal to b; and "[a, b)" denotes a half-open interval, greater than or equal to a and less than b. Additionally, in this application, the word "exemplary" is used to mean serving as an example, instance, or illustration.
It is to be understood that the various numerical designations referred to in this application are merely for ease of description and are not intended to limit the scope of the embodiments of the present application. The sequence numbers of the above-mentioned processes do not imply an execution order; the execution order of the processes should be determined by their functions and inherent logic. The terms "first," "second," and the like, if any, are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. Furthermore, the terms "comprises" and "comprising," as well as any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements, but may include other steps or elements not expressly listed or inherent to such a process, method, article, or apparatus.
Although the present application has been described in conjunction with specific features and embodiments thereof, it will be evident that various modifications and combinations can be made thereto without departing from the spirit and scope of the application. Accordingly, the specification and figures are merely illustrative of the concepts defined by the appended claims and are intended to cover any and all modifications, variations, combinations, or equivalents within the scope of the application.
It will be apparent to those skilled in the art that various changes and modifications may be made in the present application without departing from the scope of the application. Thus, if such modifications and variations of the embodiments of the present application fall within the scope of the claims of the present application and their equivalents, the present application is also intended to encompass such modifications and variations.

Claims (31)

1. A display module is characterized by comprising a display component, a pixel position adjusting component and a control component, wherein the display component comprises a plurality of pixels, and each pixel in the plurality of pixels comprises a plurality of sub-pixels;
the display component is used for displaying a plurality of frames of images in a time-sharing manner under the control of the control component; the multi-frame images are obtained by performing sub-pixel level decomposition on an image to be displayed, the resolution of each frame of the multi-frame images is the same as that of the display component, and the resolution of each frame is smaller than that of the image to be displayed;
the pixel position adjusting component is used for adjusting the position of each frame of image displayed by the display component in a time-sharing manner under the control of the control component;
the time of displaying the first image by the display component is synchronous with the time of adjusting the first image by the pixel position adjusting component, and the first image is any one of the multi-frame images.
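The sub-pixel level decomposition of claim 1 can be illustrated with a minimal sketch (not part of the claims; the function name, the NumPy representation, and the common 2×2 four-frame case are all assumptions). A high-resolution image to be displayed is split into four low-resolution frames by sampling the four interleaved half-pixel grids, each frame matching the display component's resolution:

```python
import numpy as np

def decompose_subpixel(image, h=4):
    """Split a (2H, 2W, 3) image to be displayed into h = 4 frames of
    shape (H, W, 3), one per interleaved sampling grid; each frame has
    the display resolution, which is smaller than the source resolution."""
    assert h == 4, "sketch covers only the 2x2 (four-frame) case"
    offsets = ((0, 0), (0, 1), (1, 0), (1, 1))  # (row, col) grid offsets
    return [image[dy::2, dx::2] for dy, dx in offsets]
```

Under this model, the frames would then be sent to the display component one per time unit, while the pixel position adjusting component shifts each frame to its sampling position.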
2. The display module of claim 1, wherein the pixel position adjustment assembly comprises a polarization converter and a polarization displacement device;
the polarization converter is used for adjusting the polarization direction of target polarized light output by the polarization converter in a time-sharing manner under the control of the control component, and the target polarized light bears one frame image in the multi-frame images;
the polarization displacement device is used for outputting the target polarized light at a first position when the polarization direction of the target polarized light output by the polarization converter is a first polarization direction, and outputting the target polarized light at a second position when the polarization direction of the target polarized light output by the polarization converter is a second polarization direction.
3. The display module of claim 2, wherein the polarization displacement device is a birefringent device or a polarization grating.
4. A display module according to claim 2 or 3, wherein the polarization converter comprises twisted nematic liquid crystals or in-plane rotating liquid crystals; or the polarization converter comprises cholesteric liquid crystals and a 1/4 wave plate.
5. The display module according to any one of claims 2 to 4, wherein the birefringent device is a birefringent crystal or a liquid crystal polymer, and the birefringent crystal is a quartz crystal, a barium borate crystal, a lithium niobate crystal, or a titanium dioxide crystal.
6. The display module of claim 1, wherein the pixel position adjustment assembly is a motor.
7. The display module according to any one of claims 1 to 6, wherein the control component is specifically configured to receive the image to be displayed, perform sub-pixel level decomposition on the image to be displayed to obtain the multiple frames of images, and send the multiple frames of images to the display component in a time-sharing manner.
8. The display module according to any one of claims 1-7, further comprising a folded optical path between the display component and the pixel position adjustment component, the folded optical path configured to transmit target polarized light carrying any one of the multi-frame images to the pixel position adjustment component.
9. An imaging control method is applied to a display device, the display device comprises a display component and a pixel position adjusting component, the display component comprises a plurality of pixels, and each pixel in the plurality of pixels comprises a plurality of sub-pixels; the method comprises the following steps:
receiving an image to be displayed, and performing sub-pixel level decomposition on the image to be displayed to obtain a multi-frame image; the resolution of each frame of image in the multi-frame images is the same as that of the display assembly, and the resolution of the multi-frame images is smaller than that of the image to be displayed;
controlling the pixel position adjusting component to adjust the position of each frame of image displayed by the display component in a time-sharing manner;
the time of displaying the first image by the display component is synchronous with the time of adjusting the first image by the pixel position adjusting component, and the first image is any one of the multi-frame images.
10. The method according to claim 9, wherein the sub-pixel level decomposing the image to be displayed to obtain a plurality of frames of images comprises:
and when the super-resolution mode of the display device is enabled, performing sub-pixel level decomposition on the image to be displayed to obtain the multi-frame images.
11. The method of claim 10, wherein the method further comprises:
when the super-resolution mode of the display device is not enabled, performing down-sampling processing on the image to be displayed to obtain an image to be processed;
inputting the image to be processed to the display component, so that the display component displays the image to be processed;
and outputting the image to be processed at a set position through the pixel position adjusting component.
12. The method according to any one of claims 9 to 11, wherein the sub-pixels included in the first pixel point in the first image are sampled from the sub-pixels included in at least h adjacent pixel points included in the image to be displayed;
the h is the number of the multi-frame images, the first image is any one of the multi-frame images, and the first pixel point is any one of the pixel points in the first image.
13. The method according to claim 12, wherein the pixel value of a first sub-pixel included in the first pixel point is determined according to the pixel values of sub-pixels in a set region of the image to be displayed that have the same color as the first sub-pixel;
the geometric center of the set area is the sampling position of the first sub-pixel in the image to be displayed.
14. The method according to claim 13, wherein the pixel value of the first sub-pixel included in the first pixel point is obtained by weighted summation of pixel values of sub-pixels included in the setting region and having the same color as the first sub-pixel;
wherein, within the set region, the weight of each sub-pixel having the same color as the first sub-pixel is inversely proportional to the sub-pixel distance; the sub-pixel distance is the distance between that sub-pixel and the sampling position of the first sub-pixel in the image to be displayed.
15. The method of claim 13 or 14, wherein the display device is a wearable device, and wherein the size of the set region is related to a distance between the display component and an imaging plane of the wearable device.
16. The method of any of claims 13-15, wherein the size of the set region is related to a pixel size of the display component.
17. The method of any of claims 13-16, wherein the size of the set region is related to the display content of the display component.
18. The method according to any one of claims 13 to 17, wherein the pixel value of the first sub-pixel satisfies a condition shown by the following formula:
q(i,j) = α1*Q(i-1,j) + α2*Q(i,j-1) + α3*Q(i+1,j) + α4*Q(i,j+1) + α5*Q(i,j);
wherein q(i,j) represents the pixel value of the first sub-pixel; i represents the abscissa and j the ordinate of the pixel point of the first sub-pixel in the image to be displayed; Q(i,j) represents the pixel value of the sub-pixel at the sampling position of the first sub-pixel in the image to be displayed; and α1, α2, α3, α4, and α5 respectively represent the weights.
19. The method of any one of claims 13-17, wherein the pixel value of the first sub-pixel satisfies a condition represented by the following equation:
q(i,j+1) = α1*Q(i-1,j) + α2*Q(i,j-1) + α3*Q(i+1,j) + α4*Q(i,j+1) + β1*Q(i-1,j+1) + β2*Q(i,j) + β3*Q(i+1,j+1) + β4*Q(i,j+2);
wherein q(i,j+1) represents the pixel value of the first sub-pixel; i represents the abscissa and j+1 the ordinate of the pixel point of the first sub-pixel in the image to be displayed; Q(i,j+1) represents the pixel value of the sub-pixel at the sampling position of the first sub-pixel in the image to be displayed; and α1, α2, α3, α4, β1, β2, β3, and β4 respectively represent the weights.
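The weighted-sampling formulas of claims 18 and 19 can be sketched as straightforward array code (an illustration only, not part of the claims; the function names and the NumPy convention of addressing Q[i, j] by abscissa i and ordinate j are assumptions):

```python
import numpy as np

def q_claim18(Q, i, j, alpha):
    """Claim 18: q(i,j) as a weighted sum of Q at (i,j) and its four
    axis-aligned neighbours; alpha = (a1, a2, a3, a4, a5)."""
    a1, a2, a3, a4, a5 = alpha
    return (a1 * Q[i - 1, j] + a2 * Q[i, j - 1] + a3 * Q[i + 1, j]
            + a4 * Q[i, j + 1] + a5 * Q[i, j])

def q_claim19(Q, i, j, alpha, beta):
    """Claim 19: q(i,j+1) as a weighted sum over eight surrounding
    positions; alpha = (a1..a4), beta = (b1..b4)."""
    a1, a2, a3, a4 = alpha
    b1, b2, b3, b4 = beta
    return (a1 * Q[i - 1, j] + a2 * Q[i, j - 1] + a3 * Q[i + 1, j]
            + a4 * Q[i, j + 1] + b1 * Q[i - 1, j + 1] + b2 * Q[i, j]
            + b3 * Q[i + 1, j + 1] + b4 * Q[i, j + 2])
```

Presumably the weights sum to 1 so that overall brightness is preserved, consistent with the inverse-distance weighting of claim 14.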
20. The method of any one of claims 9-19, wherein the plurality of frames of images include the first image and a second image, and wherein controlling the pixel position adjustment component to time-divisionally adjust the position of each frame of image displayed by the display component comprises:
inputting the plurality of frames of images to the display component in a time-sharing manner, so that the display component displays the plurality of frames of images in a time-sharing manner;
controlling the pixel position adjusting assembly to output the first image at a first position in a first time unit, and controlling the pixel position adjusting assembly to output the second image at a second position in a second time unit;
the interval between the first position and the second position in the horizontal direction is Px/2; or,
the interval between the first position and the second position in the vertical direction is Py/2; or,
the interval between the first position and the second position in the horizontal direction is Px/2, and the interval between the first position and the second position in the vertical direction is Py/2;
the first time unit and the second time unit are adjacent in time.
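The half-pixel displacement of claim 20 can be sketched as follows (illustrative only, not part of the claims; Px and Py denote the horizontal and vertical pixel pitches, and the function name is an assumption). The first image is emitted at the nominal pixel grid in the first time unit, and the second image at a grid offset by half a pitch in the adjacent time unit, so the two frames interleave on the imaging plane:

```python
def pixel_centres(n_rows, n_cols, px, py, dx=0.0, dy=0.0):
    """Centre coordinates of the displayed pixels, shifted by (dx, dy).
    Claim 20 uses dx = px/2 and/or dy = py/2 for the second image."""
    return [(c * px + dx, r * py + dy)
            for r in range(n_rows) for c in range(n_cols)]

# First time unit: first image output at the first position.
first = pixel_centres(2, 2, 1.0, 1.0)
# Adjacent second time unit: second image shifted horizontally by Px/2.
second = pixel_centres(2, 2, 1.0, 1.0, dx=0.5)
```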
21. The method of claim 20, wherein the pixel position adjustment component comprises a polarization converter and a polarization displacement device;
controlling the pixel position adjusting component to output the first image at a first position in a first time unit, and controlling the pixel position adjusting component to output the second image at a second position in a second time unit, comprising:
controlling the polarization converter to adjust the polarization direction of target polarized light in a time-sharing manner, wherein the target polarized light is generated when the display assembly displays a plurality of frames of images in a time-sharing manner; when the polarization direction of the target polarized light output by the polarization converter is a first polarization direction, the polarization displacement device outputs the target polarized light at a first position, and when the polarization direction of the target polarized light output by the polarization converter is a second polarization direction, the polarization displacement device outputs the target polarized light at a second position.
22. A display module is characterized by comprising a display component, at least one adjusting component and a control component; the adjusting component comprises a polarization rotator and a birefringent device;
the display component is used for receiving an image to be processed and displaying the image to be processed, and the resolution of the image to be processed is the same as that of the display component;
the polarization rotator is used for adjusting the polarization direction of the light beam of each pixel of the image to be processed under the control of the control component;
the birefringent device is used for decomposing the light beam of each pixel included in the image to be processed, outputting first target polarized light for projecting a first sub-image at a first position and outputting second target polarized light for projecting a second sub-image at a second position;
the decomposition ratio of the first pixel is different from that of the second pixel, the first pixel and the second pixel are two pixels with different polarization directions in the image to be processed, the decomposition ratio of the first pixel is the ratio of the luminous intensity of the light beam of the first pixel projected on the pixel of the first sub-image after being decomposed to the luminous intensity of the light beam projected on the pixel of the second sub-image, and the decomposition ratio of the second pixel is the ratio of the luminous intensity of the light beam of the second pixel projected on the pixel of the first sub-image after being decomposed to the luminous intensity of the light beam projected on the pixel of the second sub-image.
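The per-pixel decomposition ratio of claim 22 can be illustrated by Malus's law (a physical sketch, not a statement from the claims): a birefringent device splits a linearly polarized beam into two orthogonal components whose intensities depend on the angle θ between the beam's polarization direction and the device's optic axis, so rotating the polarization per pixel sets the split between the first and second sub-images. The function name is an assumption:

```python
import math

def decomposition_ratio(intensity, theta):
    """Intensity split of a linearly polarised pixel beam by a
    birefringent device: cos^2(theta) of the power exits at the first
    position, sin^2(theta) at the second (Malus's law)."""
    i_first = intensity * math.cos(theta) ** 2
    i_second = intensity * math.sin(theta) ** 2
    return i_first, i_second
```

Two pixels whose beams have different polarization directions (different θ) therefore have different decomposition ratios, as the claim requires.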
23. The display module according to claim 22, wherein the positions of the first sub-image and the second sub-image are vertically spaced by Py/2 and/or horizontally spaced by Px/2, where Py represents the pitch of the pixels in the vertical direction and Px represents the pitch of the pixels in the horizontal direction.
24. The display module according to claim 22 or 23, wherein the control component is specifically configured to control the polarization rotator to adjust the polarization direction of the light beam of the sub-pixel included in each pixel.
25. The display module according to any one of claims 22-24, wherein the image to be processed is obtained by down-sampling the image to be displayed;
the control assembly is specifically configured to:
estimating the luminous intensity of each pixel to be projected as a first sub-image and the luminous intensity of each pixel to be projected as a second sub-image according to the luminous intensity of each pixel of the image to be displayed; the resolution of the first sub-image is the same as that of the second sub-image, and is smaller than that of the image to be displayed;
and controlling the polarization rotator to adjust the polarization direction of the light beam of each pixel of the image to be processed according to the luminous intensity of each pixel to be projected as the first sub-image and the luminous intensity of each pixel to be projected as the second sub-image.
26. The display module according to any one of claims 22-24, wherein the image to be processed is obtained by down-sampling the image to be displayed;
the control assembly is specifically configured to:
estimating and adjusting the luminous intensity of each pixel to be projected as a first sub-image and the luminous intensity of each pixel to be projected as a second sub-image, so that the similarity between the image after the overlapped projection of the adjusted first sub-image and the second sub-image and the image to be displayed is greater than a set threshold; the set threshold is determined according to the perception capability of human eyes on image difference;
and controlling the polarization rotator to adjust the polarization direction of the light beam of each pixel of the image to be processed according to the adjusted luminous intensity of each pixel of the first sub-image and the adjusted luminous intensity of each pixel of the second sub-image.
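The estimation-and-adjustment step of claim 26 can be sketched as a small optimization (illustrative only, not part of the claims): adjust the per-pixel intensities of the two sub-images until their superimposed, half-pixel-shifted projection approximates the image to be displayed. A 1-D model, gradient descent, and mean-squared error standing in for the similarity measure are all assumptions here:

```python
import numpy as np

def project(a, b):
    """Superimpose two 1-D sub-images (pixel width 2 on the fine grid),
    with b shifted by half a pixel, onto a grid of 2*len(a) samples."""
    n = len(a)
    p = np.zeros(2 * n)
    for k in range(n):
        p[2 * k] += a[k]          # a[k] covers fine samples 2k and 2k+1
        p[2 * k + 1] += a[k]
        p[2 * k + 1] += b[k]      # b[k] covers fine samples 2k+1 and 2k+2
        if 2 * k + 2 < 2 * n:
            p[2 * k + 2] += b[k]
    return p

def fit_subimages(target, steps=3000, lr=0.05):
    """Adjust the per-pixel luminous intensities of the two sub-images by
    gradient descent on the squared error between the superimposed
    projection and the image to be displayed."""
    n = len(target) // 2
    a, b = np.zeros(n), np.zeros(n)
    for _ in range(steps):
        r = project(a, b) - target            # residual on the fine grid
        ga, gb = np.zeros(n), np.zeros(n)     # gradients w.r.t. a and b
        for k in range(n):
            ga[k] = r[2 * k] + r[2 * k + 1]
            gb[k] = r[2 * k + 1] + (r[2 * k + 2] if 2 * k + 2 < 2 * n else 0.0)
        a -= lr * ga
        b -= lr * gb
    return a, b
```

In the claim, iteration would stop once the similarity to the image to be displayed exceeds the set threshold derived from human visual perception; the fixed step count here is a simplification.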
27. An imaging control method is applied to a wearable device, the wearable device comprises a display component and a pixel position adjusting component, and the pixel position adjusting component comprises a polarization rotator and a birefringent device; the method comprises the following steps:
receiving an image to be displayed, performing down-sampling processing on the image to be displayed to obtain an image to be processed, and inputting the image to be processed to the display component, so that the display component emits target polarized light bearing the image to be processed, wherein the resolution of the image to be processed is the same as that of the display component;
controlling the polarization rotator to adjust the polarization direction of the output light beam of each pixel of the image to be processed, so that the birefringent device decomposes the light beam of each pixel included in the image to be processed, outputs a first target polarized light for projecting a first sub-image at a first position and outputs a second target polarized light for projecting a second sub-image at a second position;
the decomposition ratio of the first pixel is different from that of the second pixel, the first pixel and the second pixel are two pixels with different polarization directions in the image to be processed, the decomposition ratio of the first pixel is the ratio of the luminous intensity of the light beam of the first pixel projected on the pixel of the first sub-image after being decomposed to the luminous intensity of the light beam projected on the pixel of the second sub-image, and the decomposition ratio of the second pixel is the ratio of the luminous intensity of the light beam of the second pixel projected on the pixel of the first sub-image after being decomposed to the luminous intensity of the light beam projected on the pixel of the second sub-image.
28. The method according to claim 27, wherein the position of the first sub-image is vertically spaced from the position of the second sub-image by Py/2 and/or horizontally spaced by Px/2, where Py represents the pitch of the pixels in the vertical direction and Px represents the pitch of the pixels in the horizontal direction.
29. The method of claim 27 or 28, wherein the controlling the polarization rotator to adjust the polarization direction of the output light beam for each pixel of the image to be processed comprises:
and controlling the polarization rotator to adjust the polarization direction of the light beams of the sub-pixels included in each pixel of the image to be processed.
30. The method according to any of claims 27-29, wherein said controlling the polarization rotator to adjust the polarization direction of the output light beam for each pixel of the image to be processed comprises:
estimating the luminous intensity of each pixel to be projected as a first sub-image and the luminous intensity of each pixel to be projected as a second sub-image according to the luminous intensity of each pixel of the image to be displayed; the resolution of the first sub-image is the same as that of the second sub-image and is smaller than that of the image to be displayed;
and controlling the polarization rotator to adjust the polarization direction of the light beam of each pixel of the image to be processed according to the luminous intensity of each pixel to be projected as the first sub-image and the luminous intensity of each pixel to be projected as the second sub-image.
31. The method according to any one of claims 27-29, wherein said controlling the polarization rotator to adjust the polarization direction of the output light beam for each pixel of the image to be processed comprises:
estimating and adjusting the luminous intensity of each pixel to be projected as a first sub-image and the luminous intensity of each pixel to be projected as a second sub-image, so that the similarity between the image after the overlapped projection of the adjusted first sub-image and the second sub-image and the image to be displayed is greater than a set threshold; the set threshold is determined according to the perception capability of human eyes on image difference;
and controlling the polarization rotator to adjust the polarization direction of the light beam of each pixel of the image to be processed according to the adjusted luminous intensity of each pixel of the first sub-image and the adjusted luminous intensity of each pixel of the second sub-image.
CN202111164540.XA 2021-09-30 2021-09-30 Display module and imaging control method Pending CN115909913A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202111164540.XA CN115909913A (en) 2021-09-30 2021-09-30 Display module and imaging control method
PCT/CN2022/121488 WO2023051479A1 (en) 2021-09-30 2022-09-26 Display module and imaging control method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111164540.XA CN115909913A (en) 2021-09-30 2021-09-30 Display module and imaging control method

Publications (1)

Publication Number Publication Date
CN115909913A true CN115909913A (en) 2023-04-04

Family

ID=85729581

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111164540.XA Pending CN115909913A (en) 2021-09-30 2021-09-30 Display module and imaging control method

Country Status (2)

Country Link
CN (1) CN115909913A (en)
WO (1) WO2023051479A1 (en)

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2813041B2 (en) * 1990-07-02 1998-10-22 日本電信電話株式会社 Projection display device
US7123277B2 (en) * 2001-05-09 2006-10-17 Clairvoyante, Inc. Conversion of a sub-pixel format data to another sub-pixel data format
JP3973524B2 (en) * 2001-11-22 2007-09-12 シャープ株式会社 Image shift element and image display device
CN104680949B (en) * 2015-03-25 2017-03-15 京东方科技集团股份有限公司 Pel array, display drive method, display drive apparatus and display device
CN106023909B (en) * 2016-08-12 2018-07-24 京东方科技集团股份有限公司 Display device and its display methods
CN112005161B (en) * 2018-03-30 2021-12-10 华为技术有限公司 Imaging device, display apparatus, and imaging apparatus
CN115278367A (en) * 2019-04-30 2022-11-01 深圳光峰科技股份有限公司 Image splitting method and image display method
CN113156744A (en) * 2020-01-23 2021-07-23 亘冠智能技术(杭州)有限公司 Double-refraction-principle-based DMD projection system with multiplied resolution
CN113542701A (en) * 2020-04-20 2021-10-22 青岛海信激光显示股份有限公司 Projection display method and projection equipment
CN113452942B (en) * 2021-06-03 2022-07-05 华东师范大学 4k resolution video image preprocessing method for digital micromirror chip
CN113489960A (en) * 2021-06-30 2021-10-08 青岛海信激光显示股份有限公司 Projection display device, method and system

Also Published As

Publication number Publication date
WO2023051479A1 (en) 2023-04-06


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination