CN115220241A - 3D display device and 3D display driving method - Google Patents

3D display device and 3D display driving method

Info

Publication number
CN115220241A
CN115220241A (application CN202210954249.0A)
Authority
CN
China
Prior art keywords
sub
pixel
cylindrical lenses
pixels
display device
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210954249.0A
Other languages
Chinese (zh)
Inventor
孙艳六
梁蓬霞
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
BOE Technology Group Co Ltd
Original Assignee
BOE Technology Group Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by BOE Technology Group Co Ltd filed Critical BOE Technology Group Co Ltd
Priority to CN202210954249.0A priority Critical patent/CN115220241A/en
Publication of CN115220241A publication Critical patent/CN115220241A/en
Priority to PCT/CN2023/110974 priority patent/WO2024032461A1/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B30/00Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images
    • G02B30/20Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer's left and right eyes
    • G02B30/26Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer's left and right eyes of the autostereoscopic type
    • G02B30/27Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer's left and right eyes of the autostereoscopic type involving lenticular arrays
    • G02B30/29Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer's left and right eyes of the autostereoscopic type involving lenticular arrays characterised by the geometry of the lenticular array, e.g. slanted arrays, irregular arrays or arrays of varying shape or size

Abstract

The invention relates to a 3D display device comprising a display panel and a micro-lens array located on a light-emitting surface of the display panel, wherein the display panel comprises a plurality of sub-pixels; sub-pixels of the same color are arranged along a first direction, sub-pixels of different colors are arranged along a second direction perpendicular to the first direction, and the first direction is parallel to the line connecting the viewer's two eyes. The micro-lens array comprises a plurality of cylindrical lenses arranged along the first direction, the extending direction of the cylindrical lenses being parallel to the second direction; in the first direction, each cylindrical lens covers M corresponding sub-pixels, where M is a natural number. The invention also relates to a 3D display driving method.

Description

3D display device and 3D display driving method
Technical Field
The invention relates to the technical field of display product manufacturing, in particular to a 3D display device and a 3D display driving method.
Background
Current naked-eye (autostereoscopic) 3D display technology mainly relies on obliquely arranged cylindrical lenses to separate viewpoints, which increases crosstalk between viewpoints. Naked-eye 3D applied to OLED panels is limited by the size of the PDL (pixel defining layer) gap, so a large viewing space with ultra-many continuous viewpoints cannot be realized. The existing technology also reduces the resolution of 3D display, loses pixels at some viewpoints, and keeps overall power consumption high, which severely limits the development of naked-eye 3D display.
Disclosure of Invention
In order to solve the above technical problems, the present invention provides a 3D display device and a 3D display driving method, which solve the problem of low resolution of 3D display.
To achieve this purpose, an embodiment of the invention adopts the following technical scheme: a 3D display device comprises a display panel and a micro-lens array arranged on the light-emitting surface of the display panel,
the display panel comprises a plurality of sub-pixels, sub-pixels of the same color are arranged along a first direction, sub-pixels of different colors are arranged along a second direction perpendicular to the first direction, and the first direction is parallel to the line connecting the viewer's two eyes;
the micro-lens array comprises a plurality of cylindrical lenses arranged along the first direction, the extending direction of the cylindrical lenses is parallel to the second direction, and in the first direction each cylindrical lens covers M corresponding sub-pixels, where M is a natural number.
Optionally, the cylindrical lenses are periodically arranged along the first direction, each period includes 2 or more cylindrical lenses, and the projection regions corresponding to the cylindrical lenses in the same period cross-complement one another to form a continuous imaging surface.
Optionally, the number of cylindrical lenses and the number of sub-pixels in the same period are relatively prime.
Optionally, the cylindrical lenses in the same period include at least one first cylindrical lens, and the number of the sub-pixels corresponding to the first cylindrical lens is different from the number of the sub-pixels corresponding to the remaining cylindrical lenses in the same period.
Optionally, in the first direction, the first cylindrical lens is located on one side of the remaining cylindrical lenses in the same period.
Optionally, one period includes N cylindrical lenses, each sub-pixel includes an opening area and a non-opening area, and in the first direction the width of the opening area is 1/N of the width of the sub-pixel.
Optionally, the number of sub-pixels in one period is m; in the first direction the cylindrical lenses have the same width, and the width of each cylindrical lens is m times the width of the opening area, where m is a natural number.
Optionally, along the first direction, orthographic projections of different cylindrical lenses located in the same period on the display panel sequentially shift by a first width, where the first width is equal to the width of the opening area.
Optionally, the number of cylindrical lenses in different periods is the same.
Optionally, the plurality of periodically arranged cylindrical lenses includes at least one first period, and the number of cylindrical lenses in the first period is different from the number of cylindrical lenses in the other periods.
Optionally, the light emitting surface of the display panel is located on the focal plane of the cylindrical lens.
Optionally, the distance H between the cylindrical lens and the display panel satisfies the formula H = n × f, where n is the refractive index of the medium between the light-emitting surface and the cylindrical lens, and f is the focal length of the cylindrical lens.
The embodiment of the present invention further provides a 3D display driving method, which is implemented by using the display device, where along the first direction, a plurality of cylindrical lenses are periodically arranged, and a plurality of sub-pixels corresponding to one period form a pixel island, and the method includes the following steps:
providing a three-dimensional object, and determining actual image information obtained by observing the three-dimensional object when human eyes are positioned at different positions;
determining the corresponding relation between the position of each sub-pixel and the position of a viewpoint;
acquiring an angular spectrum boundary of a projection area corresponding to the sub-pixel according to the angular spectrum distribution of the projection area corresponding to the sub-pixel;
determining the corresponding relation between the angular spectrum position of the projection area corresponding to the sub-pixel and the sub-pixel;
determining the coordinates of the eyebrow center and the visual angle of human eyes of a viewer, and acquiring corresponding image information;
and obtaining the position of the pixel island corresponding to the eyebrow center according to the included angle between the eyebrow center and each pixel island, and loading the image information on the pixel island and the pixel islands around the pixel island.
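The last two steps above can be sketched in Python: find the pixel island whose angular-spectrum range contains the viewing angle from the eyebrow center, then load image information onto it. The data layout (island centers plus per-island angular bounds) and all names here are illustrative assumptions; the patent only names the steps.

```python
import math

def island_for_viewer(eyebrow, islands):
    """Pick the pixel island whose angular-spectrum range contains the
    viewing angle from the eyebrow center (a sketch of the step above).

    eyebrow: (x, z) eyebrow-center coordinate in panel coordinates.
    islands: list of (x_center, (angle_min_deg, angle_max_deg)).
    """
    ex, ez = eyebrow
    for i, (cx, (lo, hi)) in enumerate(islands):
        # included angle between the eyebrow center and this pixel island
        angle = math.degrees(math.atan2(ex - cx, ez))
        if lo <= angle <= hi:
            return i
    return None  # viewer outside every island's angular spectrum

# Two islands 10 mm apart, each serving a narrow ±1° angular range;
# a viewer 500 mm away directly in front of an island matches it.
islands = [(0.0, (-1.0, 1.0)), (10.0, (-1.0, 1.0))]
```

Per the text, the image information would then be loaded onto the matched pixel island and the pixel islands around it.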
The embodiment of the invention also provides a 3D display driving method, which is realized by adopting the display device and comprises the following steps:
determining the coordinates of pupils of human eyes;
determining, according to the pupil coordinates, the position of a first sub-pixel forming the viewpoint that enters the human eye;
and lighting the first sub-pixel, or lighting the first sub-pixel and a preset number of sub-pixels around the first sub-pixel.
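A minimal sketch of this second driving method in Python, assuming the projection-region boundaries of the sub-pixels along the viewing line are known; the boundary list and function names are illustrative, not from the patent.

```python
import bisect

def subpixels_to_light(pupil_x, boundaries, halo=0):
    """Return the index of the first sub-pixel whose projection region
    contains the pupil, plus `halo` neighbours on each side — covering
    both options named above (the first sub-pixel alone, or together
    with a preset number of surrounding sub-pixels).

    boundaries: sorted projection-region edges; region i spans
    [boundaries[i], boundaries[i + 1]).
    """
    i = bisect.bisect_right(boundaries, pupil_x) - 1
    n_regions = len(boundaries) - 1
    if not 0 <= i < n_regions:
        return []  # pupil outside every projection region
    lo = max(0, i - halo)
    hi = min(n_regions - 1, i + halo)
    return list(range(lo, hi + 1))
```

Only the returned sub-pixels would be lit, which is what makes the power-consumption savings of fig. 21 possible.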
The beneficial effects of the invention are as follows: sub-pixels of the same color are arranged along a first direction parallel to the line connecting a person's two eyes, and sub-pixels of different colors are arranged along a second direction perpendicular to the first direction. In the first direction there is no need to consider color cross-contamination during evaporation, so the width of the PDL (pixel defining layer) between adjacent same-color sub-pixels can be compressed to its limit, thereby increasing the pixel aperture ratio.
Drawings
FIG. 1 is a schematic view showing a cylindrical lens arrangement in the related art;
FIG. 2 is a schematic diagram showing the arrangement of sub-pixels and cylindrical lenses in an embodiment of the present invention;
FIG. 3 is a schematic diagram showing the relationship between the aperture ratio of a sub-pixel and the lifetime of a display device;
FIG. 4 is a schematic view of a projection area of a sub-pixel;
FIG. 5 shows a schematic projection of a sub-pixel during a period;
FIG. 6 is a schematic diagram showing the superposition of the translation of the projection regions corresponding to two cylindrical lenses in one period;
FIG. 7 is a schematic diagram showing a comparison of the optical path of the light-emitting surface of the display panel at the focal plane of the cylindrical lens and the optical path at a non-focal plane;
FIG. 8 is a cross-talk diagram illustrating a display panel with the light emitting surface in a non-focal plane of a cylindrical lens;
FIG. 9 is a cross-talk diagram illustrating the light emitting surface of the display panel at the focal plane of the cylindrical lens;
FIG. 10 shows a first schematic diagram of the main lobe and the side lobes of the projection area;
fig. 11 shows a second schematic diagram of the main lobe and the side lobes of the projected area;
FIG. 12 shows viewpoint indices and sub-pixel indices;
FIG. 13 shows a viewpoint angular-spectrum distribution;
FIG. 14 shows a schematic view of the angular spectral boundaries of a sub-pixel;
FIG. 15 is a schematic view of an angle between the eyebrow center and the display panel;
FIG. 16 is a schematic diagram showing the determination of the sub-pixel position based on the eyebrow center coordinates;
FIG. 17 is a schematic diagram of multiple images corresponding to different positions of the human eye;
FIG. 18 is a schematic view of an image corresponding to a human eye in a first position;
fig. 19 is a schematic diagram showing an image obtained by loading corresponding image information to a corresponding sub-pixel and performing display driving;
FIG. 20 is a schematic diagram illustrating coverage of sub-pixels corresponding to pupils of a human eye;
fig. 21 is a diagram showing the relationship between power consumption and the number of sub-pixels to be turned on.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions of the embodiments of the present invention will be clearly and completely described below with reference to the drawings of the embodiments of the present invention. It is to be understood that the embodiments described are only a few embodiments of the present invention, and not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the described embodiments of the invention, are within the scope of the invention.
In the description of the present invention, it should be noted that the terms "center", "upper", "lower", "left", "right", "vertical", "horizontal", "inner", "outer", etc., indicate orientations or positional relationships based on the orientations or positional relationships shown in the drawings, and are only for convenience of description and simplicity of description, but do not indicate or imply that the device or element being referred to must have a particular orientation, be constructed and operated in a particular orientation, and thus, should not be construed as limiting the present invention. Furthermore, the terms "first," "second," and "third" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance.
Referring to fig. 1, in the related art, viewpoints are arranged along an oblique direction (the cylindrical lenses 100 are arranged obliquely). Taking viewpoints 2 and 6 in fig. 1 as examples, viewpoints 2 and 6 are located substantially at the edges of their corresponding cylindrical lenses 100, and their projections in space substantially coincide, which tends to increase crosstalk between viewpoints. In addition, the boundary of the cylindrical lens 100 crosses through the middle of viewpoint 2, i.e., only 50% to 70% of the light-emitting area can enter its intrinsic projection region, while the remaining light-emitting area interferes with other viewpoints, causing a certain degree of vertigo. Furthermore, 2D display pixels have their RGB sub-pixels (red sub-pixel 10, green sub-pixel 20, blue sub-pixel 30) arranged along the horizontal direction, but once the obliquely arranged cylindrical lenses 100 are added, the RGB sub-pixels are effectively arranged along the vertical direction, which reduces the vertical resolution of the screen to 1/3 or even less, resulting in resolution imbalance and 2D/3D display mismatch.
Referring to fig. 2, in order to solve the above technical problem, the present embodiment provides a 3D display device, which includes a display panel and a micro lens array disposed on a light emitting surface of the display panel,
the display panel comprises a plurality of sub-pixels, the sub-pixels with the same color are arranged along a first direction (refer to the X direction in fig. 2), the sub-pixels with different colors are arranged along a second direction (refer to the Y direction in fig. 2) which is perpendicular to the first direction, and the first direction is parallel to a connecting line of two eyes of a viewer;
the micro lens array comprises a plurality of cylindrical lenses 1 arranged along the first direction, the extending direction of the cylindrical lenses 1 is parallel to the second direction, in the first direction, each cylindrical lens covers M corresponding sub pixels, and M is a natural number.
In this embodiment, sub-pixels of the same color are arranged along a first direction parallel to the line connecting a person's two eyes, and sub-pixels of different colors are arranged along a second direction perpendicular to the first direction. In the first direction there is no need to consider color cross-contamination during evaporation, so the width of the PDL (pixel defining layer) between adjacent same-color sub-pixels can be compressed to its limit, for example from 20 um in the related art to 5-6 um; that is, in the first direction the width of the pixel defining layer between adjacent sub-pixels is 5-6 um. This increases the pixel aperture ratio, which can rise from 15% to 30% or more; since the aperture ratio directly affects the lifetime of the display device, the reliability of the display device is ensured. Fig. 3 is a schematic diagram of the correspondence between the sub-pixel aperture ratio and the lifetime of the display device: the rectangular bars represent LT95 (lifetime), i.e., the time taken for the luminance to decay to 95%, read on the left vertical axis, and the dotted curve represents the attenuation ratio relative to the maximum luminance, read on the right vertical axis.
It should be noted that the display panel in fig. 2 includes a red sub-pixel 10, a green sub-pixel 20, and a blue sub-pixel 30, and the red sub-pixel 10, the green sub-pixel 20, and the blue sub-pixel 30 are respectively arranged along the first direction.
Since there is still a small PDL Gap between the same-color sub-pixels arranged in the first direction (i.e. a PDL pixel defining layer is disposed between adjacent same-color sub-pixels, and the pixel defining layer has a certain width in the first direction), there must be bright and dark regions (as shown in fig. 4) when the cylindrical lens is projected in the viewing space, forming moire fringes, which seriously affects the display effect. In view of this problem, in an implementation manner of this embodiment, along the first direction, a plurality of cylindrical lenses 1 are periodically arranged, each period includes 2 or more cylindrical lenses, and projection regions corresponding to the plurality of cylindrical lenses 1 in the same period are crossed and complemented to form a continuous imaging surface.
That the projection regions of the cylindrical lenses 1 in the same period cross-complement each other to form a continuous imaging surface means the following. The projection region of each cylindrical lens 1 includes a bright region 200 and a dark region 300. The cylindrical lenses 1 are periodically arranged along the first direction, and the projection regions formed by different cylindrical lenses 1 in the same period are superposed with a translation: the bright region 200 of the projection region of one cylindrical lens 1 overlaps (at least partially) the dark region 300 of the projection region of another cylindrical lens 1, so that the width of the dark region is reduced or the dark region is even eliminated, forming a continuous imaging surface. Fig. 6 schematically shows the translated superposition of the projection regions of the two cylindrical lenses 1 in one period; the patterned areas mark where the bright region of one lens's projection region overlaps the dark region of the other's, so the bright and dark regions stand in a cross-complementary relationship and form a continuous imaging surface.
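The cross-complement argument can be checked with a toy simulation: each lens projection is bright over 1/N of every sub-pixel pitch, successive lenses are shifted by one opening width, and the union of N such patterns leaves no dark gap. All quantities here are normalized and illustrative, not from the patent.

```python
def coverage_is_continuous(n, samples=1000):
    """Toy model of the bright/dark cross-complement: n lens projections,
    each bright over a 1/n fraction of every (normalized) sub-pixel pitch
    and shifted by one opening width; returns True if every sampled
    position falls inside at least one bright region."""
    pitch = 1.0
    opening = pitch / n

    def bright(x, shift):
        # bright region of one lens's projection, repeating every pitch
        return (x - shift) % pitch < opening

    return all(
        any(bright(k / samples, j * opening) for j in range(n))
        for k in range(samples)
    )
```

For 2 or 3 lenses per period this yields a gapless union, matching the continuous imaging surface described above.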
In the exemplary embodiment, the number of cylindrical lenses 1 and the number of sub-pixels in the same period are relatively prime.
The number of sub-pixels corresponding to each cylindrical lens 1 may be determined from the total number m of sub-pixels corresponding to all cylindrical lenses 1 in the same period and the number N of cylindrical lenses 1 in that period. Specifically, let m / N give quotient a with remainder b (b < N). Then, on the basis that each cylindrical lens 1 corresponds to a sub-pixels, the count for b of the lenses is increased by 1, i.e., b cylindrical lenses 1 each correspond to a + 1 sub-pixels (this ensures that all cylindrical lenses have the same width in the first direction). For example, if m = 11 and N = 2, then 11 / 2 = 5 remainder 1, i.e., a = 5, b = 1; of the 2 cylindrical lenses 1, one corresponds to 5 sub-pixels and the other to 6 (refer to fig. 5). If m = 11 and N = 3, then 11 / 3 = 3 remainder 2, i.e., a = 3, b = 2; of the 3 cylindrical lenses 1, one corresponds to 3 sub-pixels and the remaining two to 4 each.
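The quotient/remainder rule can be written compactly; a sketch, with an illustrative function name:

```python
def subpixels_per_lens(m, n):
    """Distribute the m sub-pixels of one period over its n cylindrical
    lenses: with m / n = a remainder b, b lenses each cover a + 1
    sub-pixels and the remaining n - b lenses cover a each."""
    a, b = divmod(m, n)
    return [a + 1] * b + [a] * (n - b)
```

The two worked examples in the text follow directly: `subpixels_per_lens(11, 2)` gives one lens with 6 sub-pixels and one with 5, and `subpixels_per_lens(11, 3)` gives two lenses with 4 and one with 3.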
The periodic arrangement rule of the cylindrical lenses 1 is derived from the ratio of the width of the dark region 300 to the width of the bright region 200 in the projection region of a single cylindrical lens 1, so that the projection boundary of the first sub-pixel corresponding to one cylindrical lens 1 in a period is parallel to the projection boundary of the second sub-pixel corresponding to another cylindrical lens 1 in the same period, enabling the bright and dark regions of the projection regions of different sub-pixels to superpose.
It should be noted that the sub-pixels corresponding to each cylindrical lens 1 are numbered along the same direction, and the first sub-pixel and the second sub-pixel have the same index. For example, if the sub-pixels under each cylindrical lens 1 are numbered from left to right, the first sub-pixel is the first sub-pixel under its cylindrical lens, and the second sub-pixel is the first sub-pixel under the other cylindrical lens. Referring to fig. 5, the projection boundary a of the first sub-pixel 1001 and the projection boundary b of the second sub-pixel 1002 are parallel.
In an exemplary embodiment, the cylindrical lenses 1 in the same period include at least one first cylindrical lens, and the number of sub-pixels corresponding to the first cylindrical lens is different from the number of sub-pixels corresponding to the remaining cylindrical lenses in the same period.
If all cylindrical lenses 1 in the same period corresponded to the same number of sub-pixels, their projection regions would be identical, i.e., the arrangement and positions of the bright and dark regions would coincide; superposing them could then never place the bright region of one lens's projection over the dark region of another's, and a continuous imaging surface could not be realized. Therefore, in this embodiment the number of sub-pixels corresponding to the first cylindrical lens differs from the number corresponding to the other cylindrical lenses in the same period, so that a continuous imaging surface is realized. Referring to fig. 5, one period contains 2 cylindrical lenses, one corresponding to 6 sub-pixels and the other to 5.
In the exemplary embodiment, in the first direction, the first cylindrical lens is located on one side of the remaining cylindrical lenses 1 in the same period.
When arranging the cylindrical lenses 1 and their corresponding sub-pixels, because lenses with different sub-pixel counts exist, the sub-pixels corresponding to different cylindrical lenses 1 shift along one direction. If the first cylindrical lens were located in the middle of the period, the corresponding projection regions could not be guaranteed to compensate and superpose according to the preset rule.
In the exemplary embodiment, N cylindrical lenses 1 are included in one period, each of the sub-pixels includes an open area and a non-open area, and the width of the open area is 1/N of the width of the sub-pixel in the first direction.
Referring to fig. 5, in the first direction the width of the opening area is c and the width of the non-opening area is d; one period contains 2 cylindrical lenses 1 and c = d, i.e., the width of the opening area is 1/2 of the width of the sub-pixel.
With this scheme, the superposition of the bright region of one cylindrical lens 1's projection region with the dark region of another's within the same period is effectively realized, while each cylindrical lens 1 keeps the same width in the first direction. For example, if the number m of sub-pixels in a period is 11 and N = 2, the opening area of each sub-pixel is 1/2 of its total width; each of the 2 cylindrical lenses 1 is then m opening-widths wide in the first direction, the sub-pixel positions under the 2 lenses are staggered by one opening width, and at the junction of the 2 lenses the two together cover one complete sub-pixel. If m = 11 and N = 3, the opening area of each sub-pixel is 1/3 of its total width, and the sub-pixels under each of the 3 cylindrical lenses 1 are shifted by one opening width relative to those under the adjacent lens. Referring to fig. 5, the opening area of the sub-pixel at the right edge of the left-hand cylindrical lens is covered by the left-hand lens, while its non-opening area is covered by the right-hand lens.
In an exemplary embodiment, the number of the sub-pixels in one period is m, the plurality of the cylindrical lenses 1 have the same width in the first direction, and each of the cylindrical lenses 1 has a width m times the opening area in the first direction, where m is a natural number.
In an exemplary embodiment, in the first direction, orthographic projections of the different cylindrical lenses located in the same period on the display panel are sequentially shifted by a first width equal to a width of the opening area.
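The three geometric relations above (opening width = sub-pixel width / N, lens width = m opening widths, successive per-period offset = one opening width) can be combined in a small sketch; units and names are illustrative assumptions.

```python
def lens_layout(m, n, subpixel_width):
    """Geometry of the n lenses of one period along the first direction:
    opening width, common lens width, and the sequential orthographic-
    projection offsets of the lenses, per the relations in the text.
    Units are whatever subpixel_width uses."""
    opening = subpixel_width / n               # opening is 1/n of sub-pixel
    lens_width = m * opening                   # each lens spans m openings
    offsets = [i * opening for i in range(n)]  # shift by one opening each
    return opening, lens_width, offsets
```

For the m = 11, N = 2 example with a hypothetical 10 um sub-pixel pitch, each lens is 55 um wide and the second lens is offset by one 5 um opening width.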
In an exemplary embodiment, the number of cylindrical lenses in different periods is the same, i.e. all periods include the same number of cylindrical lenses, e.g. 2 cylindrical lenses are included in each period.
In an exemplary embodiment, the plurality of cylindrical lenses arranged periodically includes at least one first period, and the number of cylindrical lenses included in the first period is different from the number of cylindrical lenses included in other periods.
In an exemplary embodiment, the plurality of cylindrical lenses arranged periodically includes at least one first period and at least one second period, and the number of cylindrical lenses in the first period is different from the number of cylindrical lenses in the second period.
Illustratively, the plurality of periodically arranged cylindrical lenses includes at least one third period in addition to the first and second periods; the number of cylindrical lenses in the first period differs from that in the third period, and the number of cylindrical lenses in the second period differs from that in the third period.
In an exemplary embodiment, a light emitting surface of the display panel is located on a focal plane of the cylindrical lens.
Referring to fig. 7, in an exemplary embodiment, a distance H between the lenticular lens 1 and the display panel satisfies the following formula: h = n × f, where n is a refractive index of a medium located between the light emitting surface 1003 of the display panel and the cylindrical lens 1, and f is a focal length of the cylindrical lens.
To reduce crosstalk between adjacent viewpoints, the light-emitting surface of the display panel (for an OLED display device, the actual light-emitting surface is that of the EL (emitting layer), also referred to as the display panel light-emitting surface or sub-pixel light-emitting surface) needs to be placed on the focal plane of the cylindrical lens 1. As shown in fig. 7, when the light-emitting surface lies on focal plane C, light from each point on it is approximately collimated after passing through the lens (see the light path labeled 101); when it lies on a non-focal plane D, light from each point exits the cylindrical lens with a certain inclination angle (see the light path labeled 102 in fig. 7), and the larger the inclination angle, the greater the crosstalk with adjacent sub-pixels (compare fig. 8 and fig. 9). To ensure the light-emitting surface is in the focal plane of the cylindrical lens, the placement height of the cylindrical lens 1 (i.e., the distance between the cylindrical lens and the display panel) should satisfy H = n × f, where H is the placement height, n is the refractive index of the medium between the light-emitting surface of the display panel and the cylindrical lens, and f is the focal length of the cylindrical lens.
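The placement-height formula reduces to a one-line computation; the example values below are illustrative, not from the patent.

```python
def lens_placement_height(n_medium, focal_length):
    """H = n * f: lens-to-panel distance that puts the panel's
    light-emitting surface on the cylindrical lens's focal plane."""
    return n_medium * focal_length

# e.g. a filler medium of refractive index 1.5 with a 0.2 mm lens focal
# length would call for roughly a 0.3 mm placement height
H = lens_placement_height(1.5, 0.2)
```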
When a three-dimensional object is observed in practice, human eyes at different positions obtain different pictures: from the front of the object a front view is obtained, and from its side a side view is obtained. To better realize the three-dimensional effect, achieve an ultra-large visible space, and restore the real world, an embodiment of the invention further provides a 3D display driving method implemented with the display device described above, in which a plurality of cylindrical lenses are periodically arranged along the first direction and the plurality of sub-pixels corresponding to one period form a pixel island. The method comprises the following steps:
providing a three-dimensional object, and determining actual image information obtained by observing the three-dimensional object when human eyes are positioned at different positions;
determining the corresponding relation between the position of each sub-pixel and the position of a viewpoint;
acquiring an angular spectrum boundary of a projection area corresponding to the sub-pixel according to the angular spectrum distribution of the projection area corresponding to the sub-pixel;
determining the corresponding relation between the angular spectrum position of the projection area corresponding to the sub-pixel and the sub-pixel;
determining the eyebrow coordinates and the eye visual angles of the viewers, and acquiring corresponding image information;
and obtaining the position of the pixel island corresponding to the eyebrow center according to the included angle between the eyebrow center and each pixel island, and loading the image information on the pixel island and the pixel islands around the pixel island.
As shown in fig. 10, the projection in space of a cylindrical lens directly facing its corresponding sub-pixels is a main-lobe projection area 400, and the other projections of those sub-pixels in space form multi-level side-lobe projection areas 500. Splicing the main-lobe projection area 400 and the side-lobe projection areas 500 yields a large number of continuous 3D viewpoints. However, when the two eyes fall into different lobes (referring to fig. 11, in which the abscissa is the light-emitting angle, the ordinate is the relative brightness, and a first-level side-lobe projection area 501 and a second-level side-lobe projection area 502 are arranged in sequence on one side of the main-lobe projection area 400), the actual 3D effect is reversed with respect to the real world. Viewpoint reversal means that, whereas normally the left eye views the left view, the right eye views the right view, and the brain fuses them into a correct 3D image, with reversal the left eye views the right view and the right eye views the left view, so the brain cannot fuse them into a correct 3D image but obtains an opposite or inverted one; the visual space is likewise limited. To solve this problem, the following optical tracking method is provided, comprising the steps of:
providing a three-dimensional object, and determining actual image information obtained by observing the three-dimensional object when human eyes are positioned at different positions (for example, if the human eyes are positioned right in front of the three-dimensional object, a front view of the three-dimensional object is obtained, and if the human eyes are positioned on the side of the three-dimensional object, a side view of the three-dimensional object is obtained);
and determining the relationship between the viewpoint index and the physical index, namely the correspondence between the position of each sub-pixel and the viewpoint formed by that sub-pixel. For example, when a sub-pixel is denoted by (1), the corresponding viewpoint is denoted by 1, and this correspondence is stored. Referring to fig. 12, fig. 12 shows the index correspondence between sub-pixels and viewpoints in one embodiment (it should be noted that this correspondence is not limited to that shown in fig. 12).
It should be noted that, in this embodiment, the cylindrical lenses are arranged periodically and the plurality of sub-pixels corresponding to one period form one pixel island. Fig. 12 shows only the sub-pixels of one period, i.e., the index correspondence between the 11 sub-pixels of one pixel island and their viewpoints: the sub-pixels covered by the orthographic projection on the display panel of the cylindrical lenses of one period, the viewpoints of the main-lobe projection area formed by those lenses, and the viewpoints located in the side-lobe projection areas;
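The viewpoint-index bookkeeping for one pixel island can be held in a small lookup table. The identity mapping below is only a hypothetical example following the patent's "(1) corresponds to viewpoint 1" illustration; the real mapping depends on the lens layout and is not limited to fig. 12:

```python
# Hypothetical mapping for a pixel island of 11 sub-pixels: sub-pixel
# (k) drives viewpoint k, as in the example "(1) -> viewpoint 1".
SUBPIXEL_TO_VIEWPOINT = {k: k for k in range(1, 12)}

def viewpoint_of(subpixel: int) -> int:
    """Return the viewpoint index formed by the given sub-pixel."""
    return SUBPIXEL_TO_VIEWPOINT[subpixel]
```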
actually measuring the angular spectrum distribution of the projection area corresponding to each sub-pixel. Each sub-pixel corresponds to one viewpoint and one projection area, i.e., the angle of each sub-pixel's projection area relative to the plane of the sub-pixel (parallel to the light-emitting surface of the display panel) is different. To obtain this angle for each sub-pixel, refer to fig. 13, in which the abscissa is the light-emitting angle and the ordinate is the relative brightness;
extracting the angular-spectrum boundaries of the projection areas corresponding to the sub-pixels, and acquiring the angles of the boundaries of each sub-pixel's projection area; refer to fig. 14, in which the abscissa is the light-emitting angle and the ordinate is the relative brightness. The reference numerals 6, 7, 8, 9, 10, 11, 1, 2, 3, 4, 5, 6 and 7 in fig. 14 are the labels of the sub-pixels and correspond to (1), (2), etc. in fig. 12, except that no parentheses are used;
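One way to extract such a boundary from measured curves is to take the angle at which the brightness curves of two adjacent sub-pixels come closest to crossing. This is only a sketch: the angle and brightness arrays are assumed to come from the actual angular-spectrum measurement behind fig. 13 and fig. 14.

```python
import numpy as np

def crossover_angle(angles, brightness_a, brightness_b):
    """Return the measured angle at which two adjacent sub-pixels'
    relative-brightness curves come closest to crossing; this angle
    serves as the boundary between their projection areas."""
    diff = np.abs(np.asarray(brightness_a) - np.asarray(brightness_b))
    return float(np.asarray(angles)[int(np.argmin(diff))])
```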
establishing the correspondence between the angular-spectrum position of the projection area corresponding to a sub-pixel and the physical index of that sub-pixel. Specifically, the projection area corresponding to the sub-pixels is divided into a plurality of angle intervals with a preset angle as the step length, and each angle interval is associated with the corresponding sub-pixel; refer to the following table:
[Table: projection-area angle positions in steps of the preset angle (0.02 degrees) and the corresponding sub-pixel labels, given for sub-pixel data set 1 and sub-pixel data set 2]
The specific value of the preset angle may be set according to actual needs; in the table it is 0.02 degrees. For sub-pixel data set 1, the -70-degree and -69.98-degree positions in the projection area both correspond to sub-pixel 3, while the -69.96-degree, -69.94-degree, -69.92-degree, -69.90-degree and -69.88-degree positions all correspond to sub-pixel 4.
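Querying such an interval table reduces to a sorted-boundary lookup. In the sketch below only the boundary between sub-pixels 3 and 4 (somewhere between -69.98 and -69.96 degrees) is implied by the example above; the boundary value -69.97 and everything else are placeholders, not calibration data from the patent.

```python
import bisect

# Ascending boundary angles (degrees) between adjacent sub-pixels'
# projection areas, and the sub-pixel labels on either side of them.
# -69.97 is a hypothetical midpoint between the patent's example
# positions -69.98 (sub-pixel 3) and -69.96 (sub-pixel 4).
BOUNDARIES = [-69.97]
SUBPIXEL_LABELS = [3, 4]

def subpixel_at(angle_deg: float) -> int:
    """Map an angular position in the projection area to the label of
    the sub-pixel whose light covers that angle."""
    return SUBPIXEL_LABELS[bisect.bisect_right(BOUNDARIES, angle_deg)]
```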
It should be noted that, if every period contains the same number of cylindrical lenses in the first direction, then when the projection area corresponding to the sub-pixels is divided into intervals with the preset angle as the step length, the label of the sub-pixel corresponding to a given angle is the same for the cylindrical lenses of different periods, and only one data set exists. If the periods contain different numbers of cylindrical lenses (for example, one period contains 2 cylindrical lenses and another contains 3), the labels of the sub-pixels corresponding to the same angle in the projection areas formed by the cylindrical lenses of different periods differ, and different data sets exist (periods with the same number of cylindrical lenses share the same data set); the table above therefore shows two data sets, sub-pixel data set 1 and sub-pixel data set 2.
It should be noted that the numbers of the sub-pixels in the above table correspond to those in fig. 12, except that no parentheses are used in the table; i.e., the sub-pixel 3 corresponding to -70 degrees is identical to sub-pixel (3) in fig. 12.
Determining the image information (including left-eye image information and right-eye image information) viewed by the human eyes according to the eyebrow-center coordinates and the viewing angles of the eyes;
determining the eyebrow-center coordinates and eye viewing angles of the viewer and acquiring the corresponding image information, i.e., determining the image information corresponding to the positions of the eyes from the eyebrow-center coordinates and viewing angles. For example, when the eyebrow-center coordinate is at the side of the three-dimensional object, the corresponding side view is acquired. Fig. 17 shows the different images corresponding to different positions, and fig. 18 shows the image information corresponding to the positions of the human eyes in one embodiment.
Obtaining the position of the pixel island corresponding to the eyebrow center according to the included angle between the eyebrow center and each pixel island (a plurality of cylindrical lenses are periodically arranged along the first direction, and the plurality of sub-pixels corresponding to one period form one pixel island) and the viewing angle of the eyes. Referring to fig. 15, fig. 15 is a schematic view of the included angle, in the first direction, between the eyebrow-center coordinate and the center of each pixel island on the display panel 4;
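Selecting the pixel island corresponding to the eyebrow center reduces to finding the island center that subtends the smallest first-direction included angle. A minimal sketch with hypothetical coordinates: x runs along the first direction and z is the viewing distance from the panel; units only need to be consistent.

```python
import math

def facing_island(eyebrow_x: float, eyebrow_z: float,
                  island_centers_x: list) -> int:
    """Return the index of the pixel island whose center makes the
    smallest included angle with the eyebrow-center coordinate."""
    def included_angle(cx: float) -> float:
        # Angle in the first direction between the island center and
        # the eyebrow center, as in fig. 15.
        return abs(math.atan2(eyebrow_x - cx, eyebrow_z))
    return min(range(len(island_centers_x)),
               key=lambda k: included_angle(island_centers_x[k]))
```

The image information would then be loaded onto this island and its neighbors along the first direction, as the next step describes.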
loading the corresponding image information onto the pixel island corresponding to the eyebrow center and the pixel islands located at its periphery along the first direction. Fig. 19 shows the displayed image obtained when the image information of fig. 18 is loaded.
Fig. 16 is a schematic diagram of the image information loaded on the plurality of sub-pixels corresponding to the k-th pixel island (labeled 1-11 in the figure without parentheses, corresponding to the physical indexes of the sub-pixels in fig. 12; for example, sub-pixel 1 is equivalent to sub-pixel (1)). One pixel island in fig. 16 includes 11 sub-pixels, and 11 parallax images (views) are loaded onto those 11 sub-pixels.
With the above 3D display driving method, as the human eyes move, the corresponding image information is loaded onto the sub-pixels corresponding to the eyes, which solves the problem of viewpoint reversal caused by the two eyes falling into different projection lobes and at the same time enlarges the visual space. With this method, human eyes at different positions around the three-dimensional object obtain images at different viewing angles, i.e., the image information that would be obtained at the actual viewing position: eyes in front of the object obtain the front view, and eyes at its side obtain a side view. The real world is thus restored and the user's 3D viewing experience improved.
Illustratively, an embodiment of the present invention further provides a 3D display driving method, which is implemented by using the display device described above, where the display panel includes a plurality of sub-pixels, and each sub-pixel is driven independently, and the method includes the following steps:
determining the coordinates of the pupil of the human eye;
determining, according to the pupil coordinates of the human eyes, the position of a first sub-pixel that forms a viewpoint entering the human eyes;
and lighting the first sub-pixel or lighting the first sub-pixel and a preset number of sub-pixels around the first sub-pixel.
With full-screen ultra-high-resolution display, all projection viewpoints are open when all sub-pixels are lit; since the pupil of the human eye is only about 3 mm wide, most viewpoint light rays cannot enter the eye, which wastes power. To solve this problem, based on eye tracking and backward ray tracing, the invention lights only the first sub-pixel whose light can enter the human eye, or the first sub-pixel together with a preset number of surrounding sub-pixels.
As shown in fig. 20, the pupil of the human eye S corresponds, through the cylindrical lens 1, to a region on the display panel 4. According to the law of refraction, n0 × sin θ1 = n × sin θ2, where n0 is the refractive index of the medium outside the cylindrical lens, n is the refractive index of the cylindrical lens, θ1 is the incidence angle of the eye's line of sight on the cylindrical lens, and θ2 is the refraction angle traced backward to the light-emitting surface of the display panel. Only part of the sub-pixels are lit, which greatly reduces the overall power consumption of the display. Referring to the following table, if the number of sub-pixels whose viewpoints can enter the human eye is 1-2, then lighting only 1-2 sub-pixels may give a non-uniform picture in close-range viewing; lighting 3-4 sub-pixels greatly improves the uniformity of the picture and basically satisfies normal display requirements, while the power consumption is only 34.7% of lighting all sub-pixels, as shown in fig. 21 (fig. 21 shows the relationship between power consumption and the number of lit sub-pixels; the ordinate is power consumption and the abscissa is the number of lit sub-pixels). That is, when lighting only the first sub-pixel gives poor uniformity, the first sub-pixel and a preset number of surrounding sub-pixels may be lit, the preset number being set according to actual needs.
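The backward trace of fig. 20 can be sketched with Snell's law. The default values of n0, n and the lens height below are illustrative assumptions, not figures from the patent:

```python
import math

def backtrace_offset(theta1_deg: float, n0: float = 1.0,
                     n: float = 1.5, height: float = 0.3) -> float:
    """Given the incidence angle theta1 of the eye's line of sight on a
    cylindrical lens, apply n0*sin(theta1) = n*sin(theta2) and return
    the lateral offset height*tan(theta2) of the traced region on the
    display panel's light-emitting surface (same units as height)."""
    theta2 = math.asin(n0 * math.sin(math.radians(theta1_deg)) / n)
    return height * math.tan(theta2)
```

An on-axis eye (θ1 = 0) traces straight down to the sub-pixel under the lens, and larger incidence angles shift the traced region sideways, so only the few sub-pixels in that region need to be driven.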
Lit left-eye sub-pixels | Left-eye brightness | Right-eye brightness (crosstalk to right-eye viewpoint)
1-2                     | 100%                | 0.5%
3-4                     | 100%                | 1.6%
All                     | 100%                | 3.4%
The above table also shows that when only the sub-pixels corresponding to the left eye are lit (those corresponding to the right eye being off), the crosstalk produced at the right-eye viewpoint is reduced. When 1-2 left-eye sub-pixels are lit, the left-eye brightness is 100% and the right-eye brightness is 0.5% of it, i.e., 0.5% crosstalk to the right-eye viewpoint; when 3-4 left-eye sub-pixels are lit, the right-eye brightness is 1.6% of the left-eye brightness, i.e., 1.6% crosstalk; when all left-eye sub-pixels are lit, the right-eye brightness is 3.4% of the left-eye brightness, i.e., 3.4% crosstalk. The larger the number of lit sub-pixels, the higher the picture uniformity, but also the higher the crosstalk.
It will be understood that the above embodiments are merely exemplary embodiments taken to illustrate the principles of the present invention, which is not limited thereto. It will be apparent to those skilled in the art that various modifications and improvements can be made without departing from the spirit and scope of the invention, and such modifications and improvements are also considered to be within the scope of the invention.

Claims (14)

1. A 3D display device comprising a display panel and a micro-lens array arranged on the light emergent surface of the display panel,
the display panel comprises a plurality of sub-pixels, the sub-pixels with the same color are arranged along a first direction, the sub-pixels with different colors are arranged along a second direction perpendicular to the first direction, and the first direction is parallel to a connecting line of two eyes of a viewer;
the micro-lens array comprises a plurality of cylindrical lenses arranged along the first direction, the extending direction of the cylindrical lenses is parallel to the second direction, in the first direction, each cylindrical lens covers M corresponding sub-pixels, and M is a natural number.
2. The 3D display device according to claim 1, wherein along the first direction, a plurality of cylindrical lenses are periodically arranged, each period comprises 2 or more cylindrical lenses, and projection regions corresponding to the cylindrical lenses in the same period are crossed and complemented to form a continuous imaging plane.
3. The 3D display device according to claim 2, wherein the number of cylindrical lenses and the number of sub-pixels in the same period are relatively prime.
4. The 3D display device according to claim 2, wherein the cylindrical lenses in a same period comprise at least one first cylindrical lens, and the number of sub-pixels corresponding to the first cylindrical lens is different from the number of sub-pixels corresponding to the rest of the cylindrical lenses in the same period.
5. The 3D display device according to claim 4, wherein the first cylindrical lens is located on one side of the remaining cylindrical lenses in the same period in the first direction.
6. The 3D display device according to claim 2, wherein one period includes N of the cylindrical lenses, each of the sub-pixels includes an open area and a non-open area, and in the first direction a width of the open area is 1/N of a width of the sub-pixel.
7. The 3D display device according to claim 6, wherein the number of the sub-pixels in one period is m, a width of the plurality of the cylindrical lenses is the same in the first direction, and a width of each of the cylindrical lenses is m times the opening area in the first direction, m being a natural number.
8. The 3D display device according to claim 6, wherein orthographic projections of the different cylindrical lenses located in the same period on the display panel are sequentially shifted by a first width in the first direction, the first width being equal to a width of the opening area.
9. 3D display device according to claim 2, characterized in that the number of cylindrical lenses in different periods is the same.
10. The 3D display device according to claim 2, wherein the plurality of cylindrical lenses arranged periodically includes at least one first period, and the number of cylindrical lenses in the first period is different from the number of cylindrical lenses in periods other than the first period.
11. A 3D display device according to claim 1, wherein the light emitting face of the display panel is located on the focal plane of the cylindrical lens.
12. The 3D display device according to claim 11, wherein a distance H between the cylindrical lens and the display panel satisfies the following formula: H = n × f, where n is the refractive index of the medium between the light emitting face and the cylindrical lens, and f is the focal length of the cylindrical lens.
13. A 3D display driving method implemented by the 3D display device according to any one of claims 1 to 12, wherein along the first direction, a plurality of the cylindrical lenses are periodically arranged, and a plurality of sub-pixels corresponding to one period form one pixel island, the method comprising the steps of:
providing a three-dimensional object, and determining actual image information obtained by observing the three-dimensional object when human eyes are positioned at different positions;
determining the corresponding relation between the position of each sub-pixel and the position of a viewpoint;
acquiring an angular spectrum boundary of a projection area corresponding to the sub-pixel according to the angular spectrum distribution of the projection area corresponding to the sub-pixel;
determining the corresponding relation between the angular spectrum position of the projection area corresponding to the sub-pixel and the sub-pixel;
determining the eyebrow coordinates and the eye visual angles of the viewers, and acquiring corresponding image information;
and obtaining the position of the pixel island corresponding to the eyebrow center according to the included angle between the eyebrow center and each pixel island, and loading the image information on the pixel island and the pixel islands around the pixel island.
14. A 3D display driving method, implemented using the 3D display device of any one of claims 1-12, the method comprising the steps of:
determining the coordinates of the pupil of the human eye;
determining, according to the pupil coordinates of the human eyes, the position of a first sub-pixel that forms a viewpoint entering the human eyes;
and lighting the first sub-pixel, or lighting the first sub-pixel and a preset number of sub-pixels around the first sub-pixel.
CN202210954249.0A 2022-08-10 2022-08-10 3D display device and 3D display driving method Pending CN115220241A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202210954249.0A CN115220241A (en) 2022-08-10 2022-08-10 3D display device and 3D display driving method
PCT/CN2023/110974 WO2024032461A1 (en) 2022-08-10 2023-08-03 3d display device and 3d display driving method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210954249.0A CN115220241A (en) 2022-08-10 2022-08-10 3D display device and 3D display driving method

Publications (1)

Publication Number Publication Date
CN115220241A true CN115220241A (en) 2022-10-21

Family

ID=83616101

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210954249.0A Pending CN115220241A (en) 2022-08-10 2022-08-10 3D display device and 3D display driving method

Country Status (2)

Country Link
CN (1) CN115220241A (en)
WO (1) WO2024032461A1 (en)


Also Published As

Publication number Publication date
WO2024032461A1 (en) 2024-02-15


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination