CN115866230A - Stereoscopic display device and display method thereof - Google Patents


Info

Publication number
CN115866230A
Authority
CN
China
Prior art keywords
lens array
processing circuit
image
stereoscopic display
actual position
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111123519.5A
Other languages
Chinese (zh)
Inventor
陈瑞麟
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Acer Inc
Original Assignee
Acer Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Acer Inc filed Critical Acer Inc
Priority to CN202111123519.5A priority Critical patent/CN115866230A/en
Publication of CN115866230A publication Critical patent/CN115866230A/en
Pending legal-status Critical Current

Landscapes

  • Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)

Abstract

The invention provides a stereoscopic display device and a display method thereof. The stereoscopic display device includes a display panel, a lens array, an image sensor, and a processing circuit. The display panel displays a three-dimensional image. The lens array is disposed on a transmission path of the three-dimensional image. The image sensor acquires a sensing image of the viewing field of the display panel. The processing circuit is coupled to the lens array and the image sensor. The processing circuit calculates the actual position of the user's eyes in the viewing field according to the reference coordinates of a reference position in the sensing image and the coordinates of the user's eyes in the sensing image. The processing circuit adjusts the liquid crystal rotation angle of the lens array according to the actual eye position, so that the viewing position of the three-dimensional image matches the actual position of the user's eyes.

Description

Stereoscopic display device and display method thereof
Technical Field
The present invention relates to a stereoscopic display device, and more particularly, to a stereoscopic display device and a display method thereof, in which a rotation angle of a lens can be adjusted.
Background
Current stereoscopic display technologies can be classified into glasses-type technologies, in which an observer must wear specially designed glasses, and glasses-free (naked-eye) technologies, in which an observer can view stereoscopic images directly. Naked-eye stereoscopic display technologies can be further classified into parallax barrier, lenticular lens, and directional backlight approaches. The lenticular-lens method places a series of vertically oriented cylindrical convex lens films in front of the display screen. Light rays change their traveling direction as they pass through the lenticular lenses. The left-eye image and the right-eye image are interleaved in vertical strips corresponding to the positions of the lenticular lenses. Through the refraction of the lenses, the user's left eye and right eye respectively see the corresponding left-eye and right-eye images, producing parallax and thereby a stereoscopic effect.
However, a conventional lenticular lens sheet is arranged in a single direction and fixedly attached to the surface of the display screen, or liquid crystal is injected into the lenticular lenses to control their refraction angle. In either case, refraction can be adjusted only in the horizontal direction according to the user's horizontal position. Therefore, when the display screen is tilted too far relative to the user, or when the vertical viewing angle becomes too large because of the user's height, sitting posture, or other factors, the stereoscopic effect of the displayed image may be greatly reduced; the stereoscopic image may even appear visually deviated or fail to form at all, degrading the user experience.
Disclosure of Invention
In view of this, the present invention provides a stereoscopic display device and a display method thereof, which can dynamically adjust the liquid crystal rotation angle of a lens array according to the actual position of the user's eyes to maintain the presentation of stereoscopic images.
In an embodiment according to the present invention, the stereoscopic display device includes a display panel, a lens array, an image sensor, and a processing circuit. The display panel is used for displaying a three-dimensional image. The lens array is disposed on a transmission path of the three-dimensional image. The image sensor is used for acquiring a sensing image of a viewing field of the display panel. The processing circuit is coupled to the lens array and the image sensor. The processing circuit calculates the actual position of the user's eyes in the viewing field according to the reference coordinates of a reference position in the sensing image and the coordinates of the user's eyes in the sensing image. The processing circuit adjusts the liquid crystal rotation angle of the lens array according to the actual eye position, so that the viewing position of the three-dimensional image matches the actual position of the user's eyes.
In an embodiment according to the present invention, a display method of a stereoscopic display device includes: displaying a three-dimensional image through a display panel of the stereoscopic display device; disposing a lens array of the stereoscopic display device on a transmission path of the three-dimensional image; acquiring a sensing image of a viewing field of the display panel through an image sensor of the stereoscopic display device; calculating the actual position of the user's eyes in the viewing field through a processing circuit of the stereoscopic display device according to the reference coordinates of a reference position in the sensing image and the coordinates of the user's eyes in the sensing image; and adjusting the liquid crystal rotation angle of the lens array through the processing circuit according to the actual eye position, so that the viewing position of the three-dimensional image matches the actual position of the user's eyes.
Based on the above, the stereoscopic display device and the display method thereof according to the embodiments of the invention obtain a sensing image of the viewing field of the display panel through the image sensor, and calculate the actual position of the user's eyes in the viewing field from the reference coordinates and the eye coordinates in the sensing image through the processing circuit, so as to adjust the liquid crystal rotation angle of the lens array according to the actual eye position. In this way, the viewing position of the three-dimensional image displayed by the display panel can match the actual position of the user's eyes, maintaining the presentation of the three-dimensional image and thereby optimizing the user experience.
Drawings
The accompanying drawings are included to provide a further understanding of the invention, and are incorporated in and constitute a part of this specification. The drawings illustrate embodiments of the invention and together with the description serve to explain the principles of the invention.
FIG. 1 is a schematic usage scenario diagram of a stereoscopic display device according to an embodiment of the invention;
FIG. 2 is a schematic circuit block diagram of a stereoscopic display device according to an embodiment of the invention;
FIG. 3 is a flowchart of a display method of a stereoscopic display device according to an embodiment of the invention;
FIG. 4 is a schematic diagram illustrating a sensing scenario of a stereoscopic display device according to an embodiment of the invention;
FIG. 5 is a schematic y-z plane view illustrating the sensing scenario of FIG. 4 according to an embodiment of the invention;
FIG. 6 is a partial optical path diagram illustrating the stereoscopic display apparatus shown in FIGS. 1 and 2 according to an embodiment of the invention;
FIG. 7 is a schematic diagram of a hardware architecture of a stereoscopic display device according to another embodiment of the invention;
FIG. 8 is a partial structural schematic diagram of a stereoscopic display device according to an embodiment of the invention.
Description of the reference numerals
100, 200: stereoscopic display device
110: display panel
120, 121, 122: lens array
130: image sensor
140: processing circuit
141: memory
142: processor
143: analog-to-digital conversion circuit
144: timing controller
145: scaling circuit
150: backlight module
C_0: origin coordinates
C_G: eye position
C_S: reference position
d: distance
DP: display pixel
Eye_L: left-eye coordinates
Eye_R: right-eye coordinates
FD: viewing field
fv: imaging distance
GS1, GS2: glass substrate
I1: three-dimensional image
I2: sensed image
IM1: left parallax image
IM2: right parallax image
LC: liquid crystal layer
PC: polarizing coating film
S210 to S250: steps
SLC: switching liquid crystal layer
t: compensation vector
u, v, x, y, z: coordinate axes
u_G, v_G, u_S, v_S, x_E, y_E, z_E, x_F, y_F, z_F: coordinates
VC: voltage
VP: imaging plane
X_E: actual eye position
X_EyeL: left eye
X_EyeR: right eye
X_F: default viewing position
Detailed Description
Reference will now be made in detail to exemplary embodiments of the invention, examples of which are illustrated in the accompanying drawings. Wherever possible, the same reference numbers will be used throughout the drawings and the description to refer to the same or like parts.
The term "coupled" as used throughout this specification, including the claims, may refer to any direct or indirect means of connection. For example, if a first device is coupled (or connected) to a second device, this should be interpreted to mean that the first device may be directly connected to the second device, or that the first device may be indirectly connected to the second device through some other device or connection means. The terms "first," "second," and the like, as used throughout this specification, including the claims, are used to name components or to distinguish between different embodiments or ranges, and are not used to limit the number of components or their order. Components/parts/steps that share reference numerals or terms across different embodiments may refer to one another's descriptions.
Fig. 1 is a schematic usage scenario diagram of a stereoscopic display apparatus 100 according to an embodiment of the invention. Fig. 2 is a schematic circuit block diagram of the stereoscopic display device 100 according to an embodiment of the invention. Please refer to fig. 1 and fig. 2. In the embodiment shown in fig. 1 and fig. 2, the stereoscopic display apparatus 100 includes a display panel 110, a lens array 120, an image sensor 130, and a processing circuit 140. The processing circuit 140 is coupled to the lens array 120 and the image sensor 130. Depending on the actual application, the display panel 110 may be a liquid crystal display (LCD), a light-emitting diode (LED) display, a field emission display (FED), an organic light-emitting diode (OLED) display, an active-matrix organic light-emitting diode (AMOLED) display, a flexible display, a transparent light-emitting diode display, or another display unit providing a display function.
The display panel 110 may display the three-dimensional image I1 to the viewing field FD. The lens array 120 is disposed on the transmission path of the three-dimensional image I1. In some embodiments, the lens array 120 may be directly contacted or attached to the display panel 110 to simplify the manufacturing process or optimize the optical effect, which is not limited in the embodiments.
The related functions of the processing circuit 140 can be implemented as hardware using a hardware description language (e.g., Verilog HDL or VHDL) or another suitable programming language, according to design requirements. For example, the related functions of the processing circuit 140 may be implemented in various logic blocks, modules, and circuits of one or more microcontrollers, microprocessors, application-specific integrated circuits (ASICs), digital signal processors (DSPs), field-programmable gate arrays (FPGAs), and/or other processing units. In software and/or firmware, the related functions of the processing circuit 140 may be implemented as program code, for example, using a general-purpose programming language (e.g., C, C++, or assembly language) or another suitable programming language. The program code may be recorded/stored in a non-transitory computer-readable medium, including, for example, a read-only memory (ROM), a tape, a disk, a card, a semiconductor memory, a programmable logic circuit, and/or a storage device. A central processing unit (CPU), microcontroller, or microprocessor can read and execute the program code from the non-transitory computer-readable medium to achieve the relevant functions.
Fig. 3 is a flowchart illustrating a display method of a stereoscopic display device according to an embodiment of the invention. The stereoscopic display device 100 shown in fig. 1 and 2 can refer to the related description of fig. 3. Please refer to fig. 1, fig. 2 and fig. 3. In step S210, the display panel 110 may display the three-dimensional image I1 to project the three-dimensional image I1 to the viewing field FD. In step S220, the lens array 120 of the stereoscopic display device 100 may be disposed on the transfer path of the three-dimensional image I1.
In detail, the viewing field FD of fig. 1 includes a default viewing position X_F and the actual position X_E of the user's eyes; the actual eye position X_E may be the same as or different from the default viewing position X_F. It should be noted that, in the present embodiment, each lens in the lens array 120 is injected with liquid crystal, and the rotation angle of the liquid crystal can be controlled to change the focusing characteristic of the lens and thereby the refraction angle of light passing through the lens. In the present embodiment, assuming that the liquid crystal rotation angle of the lens array 120 is at a preset initial angle, the viewing position (ideal position) of the three-dimensional image I1 displayed on the display panel 110 is the default viewing position X_F in the viewing field FD. Depending on the actual design, in some embodiments the default viewing position X_F may be located on the normal vector at the center of the display panel 110, but other embodiments are not limited thereto.
In step S230, the image sensor 130 may acquire the sensed image I2 of the viewing field FD of the display panel 110 and transfer the sensed image I2 to the processing circuit 140. In step S240, the processing circuit 140 may perform recognition on the sensed image I2 to obtain the user's eye coordinates in the sensed image I2. The default viewing position X_F in the viewing field FD corresponds to a reference position in the sensed image I2. The present embodiment assumes that the reference position is the center position of the sensed image I2; in other embodiments, however, the reference position may be another position in the sensed image I2. Based on the reference coordinates of the reference position in the sensed image I2 and the coordinates of the user's eyes in the sensed image I2, the processing circuit 140 can calculate the actual position X_E of the user's eyes in the viewing field FD (step S240).
For example, fig. 4 is a schematic sensing situation diagram of the stereoscopic display apparatus 100 according to an embodiment of the invention. In the embodiment shown in fig. 4, the sensing direction of the image sensor 130 toward the user is defined as the z-axis, with the image sensor 130 as the reference of the x-y-z three-dimensional space of the viewing field FD; the x-axis, y-axis, and z-axis are mutually perpendicular. The default viewing position X_F in the viewing field FD can be expressed as (x_F, y_F, z_F), and the actual position X_E of the user's eyes in the viewing field FD can be expressed as (x_E, y_E, z_E).
In the present embodiment, the sensed image I2 acquired by the image sensor 130 for the viewing field FD corresponds to a virtual imaging plane VP located at the focal length of the image sensor 130. The image sensor 130 senses the actual eye position X_E in the viewing field FD and obtains the sensed image I2. The imaging plane VP (sensed image I2) is a u-v two-dimensional plane formed by the u-axis and the v-axis. The eye position C_G in the imaging plane VP (sensed image I2) corresponds to the actual eye position X_E in the viewing field FD, and the reference position C_S in the imaging plane VP (sensed image I2) corresponds to the default viewing position X_F in the viewing field FD. On the imaging plane, with the origin coordinates C_0 (0, 0) as reference, the reference position C_S has reference coordinates (u_S, v_S) and the eye position C_G has eye coordinates (u_G, v_G). In the embodiment shown in fig. 4, the x-axis of the viewing field FD is parallel to the u-axis of the imaging plane VP, and the y-axis of the viewing field FD is parallel to the v-axis of the imaging plane VP. Thus, with the default viewing position X_F in the viewing field FD preset or otherwise known, the processing circuit 140 may use the reference coordinates (u_S, v_S) of the reference position C_S and the eye coordinates (u_G, v_G) of the eye position C_G in the sensed image I2 to calculate the actual position X_E of the user's eyes in the viewing field FD.
For example, FIG. 5 is a schematic y-z plane view illustrating the sensing scenario of fig. 4 according to one embodiment of the invention. Please refer to fig. 4 and fig. 5. In the embodiment shown in fig. 5, it is assumed that the z-axis coordinate z_E of the user in the viewing field FD (i.e., the depth from the display panel 110 to the actual eye position X_E) equals the z-axis coordinate z_F of the default viewing position X_F (the depth from the display panel 110 to the default viewing position X_F). In the embodiment shown in fig. 5, with the image sensor 130 as reference, the default viewing position X_F in the y-z plane may be expressed as (y_F, z_F), the actual eye position X_E may be expressed as (0, z_F), and the relative distance between the default viewing position X_F and the actual eye position X_E on the y-axis is y_F. As such, in the present embodiment, the relation between the reference coordinates (u_S, v_S) of the reference position C_S and the eye coordinates (u_G, v_G) of the eye position C_G in the imaging plane VP can be expressed as follows:

d = v_G - v_S -(1)

y_F / z_F = d / fv -(2)

ΔEyeball = (ΔX, d·z_F/fv), where ΔX = ((u_G - u_S)/fv)·z_F -(3)

where d denotes the relative distance on the v-axis (y-axis variable) from the reference position C_S to the eye position C_G in the imaging plane VP; fv denotes the imaging distance between the display panel 110 (image sensor 130) and the imaging plane VP corresponding to the sensed image I2, i.e., the relative distance on the z-axis (z-axis variable) between the image sensor 130 and the eye position C_G; ΔX denotes the relative distance on the x-axis (x-axis variable) between the actual eye position X_E and the default viewing position X_F; and ΔEyeball denotes the actual displacement between the actual eye position X_E and the default viewing position X_F in the x-y two-dimensional plane. In some embodiments, the processing circuit 140 may calculate the distance d from the reference coordinates (u_S, v_S) of the reference position C_S and the eye coordinates (u_G, v_G) of the eye position C_G in the sensed image I2. In some embodiments, the processing circuit 140 may derive the imaging distance fv from the focal length of the image sensor 130. Depending on the actual application, in some embodiments the stereoscopic display device 100 may optionally include a depth sensor (not shown) that senses the depth information z_F from the display panel 110 to the user (the actual eye position X_E) in the viewing field FD. As such, in some embodiments, the processing circuit 140 can use the depth information z_F and the reference coordinates (u_S, v_S) of the reference position C_S through relations (1) to (3) above to calculate the default viewing position X_F corresponding to the reference position C_S in the viewing field FD.
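The similar-triangles relation between offsets on the imaging plane and displacements in the viewing field can be sketched in code. The following is an illustrative Python sketch, not part of the patent; the function name and the sample values are assumptions:

```python
# Sketch of the pinhole-camera relation behind equations (1)-(3): a
# displacement measured on the imaging plane at imaging distance fv maps to
# a real-world displacement at viewer depth z_f by the ratio z_f / fv.

def eye_offset_from_image(u_g, v_g, u_s, v_s, fv, z_f):
    """Estimate the eye's (x, y) offset from the default viewing position.

    (u_g, v_g): eye coordinates on the imaging plane (sensed image)
    (u_s, v_s): reference coordinates on the imaging plane
    fv:         imaging distance (from the sensor focal length)
    z_f:        depth from the display panel to the viewer
    """
    d = v_g - v_s                     # relative distance on the v-axis, eq. (1)
    delta_y = d * z_f / fv            # y-axis displacement, eq. (2)
    delta_x = (u_g - u_s) * z_f / fv  # x-axis displacement, same ratio
    return delta_x, delta_y           # eq. (3): displacement in the x-y plane
```

For example, with an eye detected 10 units right and 5 units up of the reference point at fv = 50 and z_f = 500, the estimated real-world offset is (100.0, 50.0).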
In step S250 shown in fig. 3, the processing circuit 140 may adjust the liquid crystal rotation angle of the lens array 120 according to the actual eye position X_E, so that the viewing position (ideal position) of the three-dimensional image I1 matches the actual eye position X_E. It should be noted that, in the embodiment shown in fig. 1 and fig. 2, the processing circuit 140 can change the refraction direction of light by adjusting the liquid crystal rotation angle of the lens array 120, so that the viewing position of the three-dimensional image I1 moves from the default viewing position X_F to the actual eye position X_E (or as close to the actual eye position X_E as possible). Taking the scenario shown in fig. 4 as an example, for the user's eyes the movement of the light rays (image) is merely a movement of the three-dimensional image I1 in the u-v (or x-y) two-dimensional plane, but for the lens array 120 and the image sensor 130 it is a stereoscopic transformation in the x-y-z three-dimensional space. Thus, in some embodiments, the processing circuit 140 may preset or calculate a spatial transformation matrix that transforms u-v two-dimensional coordinates into x-y-z three-dimensional coordinates. The processing circuit 140 may use the spatial transformation matrix to convert the eye coordinates (u_G, v_G) of the eye position C_G in the sensed image I2 into the coordinates (x_E, y_E, z_E) of the actual eye position X_E, and then adjust the liquid crystal rotation angle of the lens array 120 according to the coordinates (x_E, y_E, z_E) of the actual eye position X_E.
For example, in some embodiments, the spatial transformation matrix may be one or more fixed matrices preset before the stereoscopic display device 100 leaves the factory. Alternatively, in some embodiments, the processing circuit 140 may dynamically compute the spatial transformation matrix from the reference coordinates (u_S, v_S) of the reference position C_S in the sensed image I2 and the coordinates (x_F, y_F, z_F) of the default viewing position X_F corresponding to the reference position C_S in the viewing field FD. For example, in some embodiments, the relation between the reference position C_S and the default viewing position X_F can be expressed as follows:

C_S = K_F·X_F -(4)

C_S·X_F^t = K_F·(X_F·X_F^t) -(5)

C_S·X_F^t·(X_F^t·X_F)^(-1) = K_F·X_F·X_F^t·(X_F^t·X_F)^(-1) -(6)

K_F = C_S·X_F^t·(X_F^t·X_F)^(-1) -(7)
where K_F is the spatial transformation matrix between the reference position C_S on the two-dimensional plane and the default viewing position X_F in the three-dimensional space. In some embodiments, the reference position C_S may be a 2x1 matrix, the spatial transformation matrix K_F may be a 2x3 matrix, and the default viewing position X_F may be a 3x1 matrix. In some embodiments, the default viewing position X_F is a non-square matrix with no inverse, so the processing circuit 140 can solve for the spatial transformation matrix K_F by the least-squares (pseudoinverse) method of equations (5) to (7) above. Similarly, in some embodiments, the relation between the eye position C_G and the actual eye position X_E can be expressed as follows:

C_G = K_E·X_E -(8)

where K_E is the spatial transformation matrix between the eye position C_G on the two-dimensional plane and the actual eye position X_E in the three-dimensional space. In some embodiments, the eye position C_G may be a 2x1 matrix, the spatial transformation matrix K_E may be a 2x3 matrix, and the actual eye coordinates X_E may be a 3x1 matrix. In some embodiments, the spatial transformation matrix K_E may be equal to the spatial transformation matrix K_F. Thus, the processing circuit 140 can use the spatial transformation matrix K_F calculated by equation (7) above, and then convert the eye coordinates (u_G, v_G) of the eye position C_G into the coordinates (x_E, y_E, z_E) of the actual eye position X_E by equation (8).
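The least-squares fit of the 2x3 spatial transformation matrix can be sketched numerically. The following NumPy sketch is illustrative only (the numeric values are assumptions, not from the patent); it solves C_S = K_F·X_F for K_F via the minimum-norm pseudoinverse formula and verifies that the fitted matrix reproduces the observed projection:

```python
import numpy as np

# Illustrative sketch of equations (4)-(7): given a 3x1 default viewing
# position X_F and its 2x1 projection C_S on the imaging plane, solve for
# a 2x3 transformation K_F with K_F = C_S . X_F^t . (X_F^t X_F)^-1.

X_F = np.array([[0.0], [100.0], [500.0]])   # assumed default viewing position, 3x1
K_true = np.array([[0.1, 0.0, 0.02],
                   [0.0, 0.1, 0.01]])       # assumed "true" mapping, 2x3
C_S = K_true @ X_F                          # observed reference coordinates, 2x1

# Minimum-norm least-squares solution; (X_F^t X_F) is a scalar here.
K_F = C_S @ X_F.T / (X_F.T @ X_F).item()

# The fitted K_F reproduces the observed projection of X_F exactly:
assert np.allclose(K_F @ X_F, C_S)
```

Because only one correspondence (X_F, C_S) is used, K_F is the minimum-norm solution rather than a unique one; in practice several calibration points would constrain it further.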
Furthermore, for the lens array 120 and the image sensor 130, the movement of the viewing position of the three-dimensional image I1 can be regarded as a coordinate transformation of the default viewing position X_F in three-dimensional space. Thus, in some embodiments, the relation between the default viewing position X_F and the actual eye position X_E can be expressed as follows:

X_E = R_F·X_F + t -(9)

where t represents a compensation vector between the plurality of lenses in the lens array 120, and R_F represents the rotation matrix of the coordinate transformation of the default viewing position X_F in three-dimensional space. In some embodiments, the actual eye position X_E may be a 3x1 matrix, the rotation matrix R_F may be a 3x3 matrix, the default viewing position X_F may be a 3x1 matrix, and the compensation vector t may be a 3x1 matrix. As such, when the compensation vector t approaches 0, the processing circuit 140 may substitute the default viewing position X_F in the viewing field FD and the actual eye position X_E calculated by equation (8) above into equation (9) to calculate the rotation matrix R_F. The processing circuit 140 may then calculate the liquid crystal rotation angle of the lens array 120 from the rotation matrix R_F.
In other embodiments, equations (8) and (9) above can be combined into the following relations:

C_G = K_E·(R_F·X_F + t) -(10)

R_F = K_E^+·C_G·X_F^t·(X_F^t·X_F)^(-1) -(11)

where K_E^+ denotes the pseudoinverse of the non-square matrix K_E. In some embodiments, assuming that the compensation vector t approaches 0 and the spatial transformation matrix K_E equals the spatial transformation matrix K_F, equation (10) becomes C_G = K_F·(R_F·X_F). In other words, the processing circuit 140 can also calculate the rotation matrix R_F from the eye position C_G in the sensed image I2 sensed by the image sensor 130, the default viewing position X_F (preset or calculated by relations (1) to (3) above), and the spatial transformation matrix K_F calculated by equation (7) above. In this way, once the image sensor 130 captures the eye position C_G, the processing circuit 140 can directly calculate the rotation matrix R_F by equations (10) and (11) above and determine the required rotation angle of the lens array 120 from the rotation matrix R_F.
In detail, the coordinate transformation of a three-dimensional coordinate point (e.g., the default viewing position X_F) in the x-y-z three-dimensional space can be decomposed into rotations of the point about the x-axis, the y-axis, and the z-axis respectively, and its rotation matrix R_F can be expressed as follows:

R_F = R_y(θ_y)·R_x(θ_x)·R_z(θ_z) -(12)

R_x(θ_x) = [[1, 0, 0], [0, cos θ_x, -sin θ_x], [0, sin θ_x, cos θ_x]] -(13)

R_y(θ_y) = [[cos θ_y, 0, sin θ_y], [0, 1, 0], [-sin θ_y, 0, cos θ_y]] -(14)

R_z(θ_z) = [[cos θ_z, -sin θ_z, 0], [sin θ_z, cos θ_z, 0], [0, 0, 1]] -(15)

where R_x, R_y, and R_z respectively represent the rotation matrices when the three-dimensional coordinate point rotates about the x-axis, the y-axis, and the z-axis alone, with rotation angles θ_x, θ_y, and θ_z respectively. In other words, when the three-dimensional coordinate point rotates about the x-axis alone, the coordinate transformation affects only the y-z two-dimensional plane, i.e., the y-z plane in which the point lies rotates about the x-axis. When the point rotates about the y-axis alone, the transformation affects only the x-z two-dimensional plane, i.e., the x-z plane in which the point lies rotates about the y-axis. When the point rotates about the z-axis alone, the transformation affects only the x-y two-dimensional plane, i.e., the x-y plane in which the point lies rotates about the z-axis.
Thus, taking the embodiment of fig. 4 as an example, in the x-y-z three-dimensional space, for the lens array 120 (or the image sensor 130), the transfer of the viewing position of the three-dimensional image I1 from the default viewing position X_F to the actual eye position X_E can be regarded as a rotation about the z-axis of the x-y two-dimensional plane in which the default viewing position X_F lies. The rotation matrix R_F in equations (9) to (11) above can therefore be represented by the rotation matrix R_z(θ_z) in equation (15) above. In this way, after calculating the rotation matrix R_F, the processing circuit 140 can determine from R_z(θ_z) the rotation angle θ_z by which the default viewing position X_F rotates about the z-axis. The processing circuit 140 then adjusts the rotation angle of the liquid crystal in the lens array 120 to change the refraction direction of the passing light, so that the viewing position of the three-dimensional image I1 matches the actual eye position X_E, thereby maintaining the presentation of the stereoscopic image.
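The z-axis rotation matrix and the recovery of its angle θ_z can be sketched as follows (an illustrative Python sketch; the function names are assumptions, not part of the patent):

```python
import math

# Sketch of equation (15) and the angle extraction described above: build
# the elementary rotation matrix about the z-axis, then recover theta_z
# from its entries, as the processing circuit would when the viewing
# position moves within the x-y plane.

def rot_z(theta):
    """3x3 rotation matrix about the z-axis by angle theta (radians)."""
    c, s = math.cos(theta), math.sin(theta)
    return [[c, -s, 0.0],
            [s,  c, 0.0],
            [0.0, 0.0, 1.0]]

def theta_z_from_matrix(r):
    """For a pure z-axis rotation, theta_z = atan2(R[1][0], R[0][0])."""
    return math.atan2(r[1][0], r[0][0])

angle = math.radians(30)
recovered = theta_z_from_matrix(rot_z(angle))
assert abs(recovered - angle) < 1e-12
```

Using atan2 on the (1,0) and (0,0) entries recovers θ_z over the full (-π, π] range, not just where cos θ_z is nonzero.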
Fig. 6 is a partial optical path diagram illustrating the stereoscopic display apparatus 100 shown in fig. 1 and 2 according to an embodiment of the invention. In the embodiment shown in fig. 6, the display panel 110 may simultaneously display at least two sub-images for forming a three-dimensional image, and the lens array 120 may include a plurality of sequentially arranged lenses respectively corresponding to the sub-images, so as to respectively project the sub-images to the user's eyes. The number, shape, and arrangement of the lenses in the lens array 120 may be adjusted according to actual requirements, and the embodiment is not limited thereto. For example, in the present embodiment, the three-dimensional image displayed by the display panel 110 may include a left parallax image IM1 and a right parallax image IM2, and the lens array 120 may include a lens array 122 and a lens array 121 respectively corresponding to the user's left eye X_EyeL and right eye X_EyeR, which respectively transmit the left parallax image IM1 and the right parallax image IM2 to the user's left eye X_EyeL and right eye X_EyeR, so that a stereoscopic effect is produced by parallax. Corresponding to the actual position X_E of the user's eyes (left eye X_EyeL and right eye X_EyeR), the liquid crystal rotation angles of the lens arrays 121 and 122 must be calculated separately.
In detail, in the present embodiment, the image sensor 130 may sense the user's left eye X_EyeL and/or right eye X_EyeR to generate a sensed image. The image plane VP corresponding to the sensed image may include the left-eye coordinate Eye_L and/or the right-eye coordinate Eye_R corresponding to the user's left eye X_EyeL and/or right eye X_EyeR. Then, in some embodiments, taking the actual position of the right eye X_EyeR as the base, the above formula (9) can be rewritten as follows:
Eye_L = K_E(R_EL · X_EyeL)  -(16)
Eye_R = K_E(R_ER · X_EyeR + t)  -(17)
In the present embodiment, the compensation vector t may be related to the horizontal resolution of the display panel 110. For example, taking a display panel 110 with a resolution of 1920 x 1080 pixels as an example, the compensation vector t may correspond to a horizontal inter-pixel spacing of 0.17925 millimeters (mm); in x-y-z three-dimensional space, the compensation vector t can then be represented as the 3x1 matrix t = [0.17925, 0, 0]^T (in mm, along the horizontal x-axis).
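The 0.17925 mm spacing is consistent with dividing an assumed active-area width of 344.16 mm (a typical 15.6-inch 16:9 panel; this width is an assumption for illustration, not stated in the patent) by the 1920 horizontal pixels:

```python
PANEL_WIDTH_MM = 344.16   # assumed active-area width; not given in the patent
H_PIXELS = 1920           # horizontal resolution of the display panel

pixel_pitch_mm = PANEL_WIDTH_MM / H_PIXELS   # horizontal inter-pixel spacing
t = [pixel_pitch_mm, 0.0, 0.0]               # 3x1 compensation vector along the x-axis
```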
R_EL and R_ER represent the rotation matrices of the lens arrays 121 and 122, respectively. In the present embodiment, it is assumed that the lens arrays 121 and 122 are sequentially arranged along the horizontal x-axis, and that the direction in which the display panel 110 displays the image through the lens array 120 is the z-axis. The processing circuit 140 can use the above formulas (16) and (17) to calculate the rotation matrices R_EL and R_ER corresponding to the lens arrays 121 and 122 from the actual position of the human eye X_E (left eye X_EyeL or right eye X_EyeR) and the compensation vector t. The processing circuit 140 may further use the z-axis rotation matrix R_Z(θ_z) of formula (15) to calculate the rotation angle θ_z of each of the rotation matrices R_EL and R_ER, and thereby adjust the liquid crystal rotation angle of the lens array 121 and the liquid crystal rotation angle of the lens array 122 independently.
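A sketch of the per-lens-array angle computation follows. Projecting positions onto the x-y plane and adding t to the right-eye term are assumptions made for illustration (matching the reading of formula (17) above), and all function names are hypothetical:

```python
import numpy as np

def z_angle_between(x_from: np.ndarray, x_to: np.ndarray) -> float:
    # signed angle about the z-axis carrying x_from's x-y projection
    # onto x_to's x-y projection
    a = np.arctan2(x_from[1], x_from[0])
    b = np.arctan2(x_to[1], x_to[0])
    return float(b - a)

def lens_array_angles(x_default, x_eye_l, x_eye_r, t):
    # theta_z for the left-eye and right-eye lens arrays; the compensation
    # vector t is applied to the right-eye position (an assumption)
    x_default = np.asarray(x_default, float)
    theta_l = z_angle_between(x_default, np.asarray(x_eye_l, float))
    theta_r = z_angle_between(x_default, np.asarray(x_eye_r, float) + np.asarray(t, float))
    return theta_l, theta_r
```

In practice the two angles would then drive the liquid crystal rotation of the two lens arrays separately, as the paragraph above describes.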
Fig. 7 is a schematic hardware architecture diagram of a stereoscopic display apparatus 200 according to another embodiment of the invention. In the embodiment shown in fig. 7, the stereoscopic display apparatus 200 includes a display panel 110, a lens array 120, an image sensor 130, and a processing circuit 140. The embodiments of the display panel 110, the lens array 120 and the image sensor 130 can be inferred by analogy from the related descriptions of fig. 1 and 2, and are therefore not repeated here. In some embodiments, the processing circuit 140 may include a memory 141, a processor 142, an analog-to-digital conversion circuit 143, a timing controller 144, and/or a scaling circuit 145, where the implementation of the processor 142 can be inferred by analogy from the related description of the processing circuit 140 shown in fig. 1 and 2. In some embodiments, the stereoscopic display apparatus 200 may further include a backlight module 150. The memory 141, the analog-to-digital conversion circuit 143, the timing controller 144, the scaling circuit 145 and the backlight module 150 can be implemented with components known to those skilled in the art of image display devices, and the embodiment is not limited thereto.
Fig. 8 is a partial structural schematic diagram of a stereoscopic display device according to an embodiment of the invention. The display panel 110 and/or the lens array 120 shown in fig. 8 can serve as an example of the display panel 110 and/or the lens array 120 in the stereoscopic display device 100 shown in fig. 1 and 2, or in the stereoscopic display device 200 shown in fig. 7. In the embodiment shown in fig. 8, the display panel 110 may include a switching liquid crystal layer (SLC), a polarized coating (PC) and a plurality of display pixels (DP) sequentially stacked. The lens array 120 may include a glass substrate GS1, a liquid crystal layer LC, and a glass substrate GS2 sequentially stacked. The liquid crystal rotation angle of the liquid crystal layer LC in the lens array 120 may be adjusted to change the light transmission direction of the image displayed by the display pixels DP. These components can be implemented as in stereoscopic display devices known to those skilled in the art, and the embodiment is not limited thereto.
In summary, the stereoscopic display devices 100 and 200 according to the embodiments of the invention acquire, through the image sensor 130, a sensing image I2 of the viewing field FD of the display panel 110, and use the processing circuit 140 to calculate the actual position X_E of the user's eyes in the viewing field FD from the reference coordinate of the reference position C_S in the sensing image I2 and the human eye coordinate C_G in the sensing image I2, and then adjust the liquid crystal rotation angle of the lens array 120 according to the actual position of the human eye X_E. In this way, the viewing position of the three-dimensional image I1 displayed on the display panel 110 can be matched with the actual position of the human eye X_E, thereby optimizing the user experience.
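The overall flow summarized above can be sketched as a single update step; the spatial conversion matrix, the coordinate conventions, and all names below are assumptions for illustration, not the patent's implementation:

```python
import numpy as np

def update_lens_array(eye_px, ref_px, M_spatial, x_default):
    # One hypothetical update step: image-plane eye coordinates relative to the
    # reference position are mapped to an actual 3D position X_E via a
    # precomputed 3x3 spatial conversion matrix, then the z-rotation angle
    # that would drive the liquid crystal is derived.
    d = np.append(np.asarray(eye_px, float) - np.asarray(ref_px, float), 1.0)
    x_eye = M_spatial @ d  # actual position X_E in the viewing field
    theta_z = (np.arctan2(x_eye[1], x_eye[0])
               - np.arctan2(x_default[1], x_default[0]))
    return x_eye, float(theta_z)
```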
Finally, it should be noted that the above embodiments are only intended to illustrate the technical solutions of the present invention, not to limit them. Although the invention has been described in detail with reference to the foregoing embodiments, those skilled in the art will understand that the technical solutions described in the foregoing embodiments may still be modified, or some or all of their technical features may be equivalently replaced, and that such modifications or substitutions do not cause the essence of the corresponding technical solutions to depart from the scope of the technical solutions of the embodiments of the present invention.

Claims (12)

1. A stereoscopic display device, characterized in that the stereoscopic display device comprises:
a display panel for displaying a three-dimensional image;
a lens array disposed on a transmission path of the three-dimensional image;
an image sensor to acquire a sensing image for a viewing field of the display panel; and
a processing circuit coupled to the lens array and the image sensor, wherein the processing circuit calculates an actual position of eyes of a user in the viewing field according to a reference coordinate of a reference position in the sensing image and coordinates of the eyes of the user in the sensing image, so as to adjust a liquid crystal rotation angle of the lens array according to the actual position of the human eyes and make the viewing position of the three-dimensional image match the actual position of the human eyes.
2. The stereoscopic display apparatus of claim 1, wherein the processing circuit calculates a spatial transformation matrix from the reference coordinates and a default viewing position corresponding to the reference coordinates in the viewing field.
3. The stereoscopic display apparatus of claim 1, wherein the processing circuitry converts the human eye coordinates to the human eye actual position using a spatial conversion matrix.
4. The stereoscopic display apparatus according to claim 1, wherein the processing circuit calculates a rotation matrix according to a default viewing position corresponding to the reference coordinate in the viewing field and the actual position of the human eyes, and calculates the liquid crystal rotation angle according to the rotation matrix.
5. The stereoscopic display apparatus according to claim 1, further comprising:
a depth sensor to sense depth information from the display panel to the user in the viewing field, wherein the processing circuit uses the depth information and the reference coordinates to calculate a default viewing position corresponding to the reference coordinates in the viewing field.
6. The stereoscopic display apparatus according to claim 1, wherein the lens arrays comprise a first lens array and a second lens array corresponding to the left eye and the right eye of the user, respectively, and the processing circuit adjusts the liquid crystal rotation angle of the first lens array and the liquid crystal rotation angle of the second lens array according to the actual position of the human eye and a compensation vector, wherein the compensation vector is related to the horizontal resolution of the display panel.
7. A display method of a stereoscopic display device, the display method comprising:
displaying a three-dimensional image through a display panel of the stereoscopic display device;
configuring a lens array of the stereoscopic display device on a transmission path of the three-dimensional image;
acquiring a sensing image for a viewing field of the display panel through an image sensor of the stereoscopic display device;
calculating, by a processing circuit of the stereoscopic display device, an actual position of eyes of a user in the viewing field according to a reference coordinate of a reference position in the sensed image and coordinates of eyes of the user in the sensed image; and
adjusting, by the processing circuit, the liquid crystal rotation angle of the lens array according to the actual position of the human eyes, so that the viewing position of the three-dimensional image matches the actual position of the human eyes.
8. The display method according to claim 7, further comprising:
calculating, by the processing circuit, a spatial transformation matrix from the reference coordinates and a default viewing position corresponding to the reference coordinates in the viewing field.
9. The display method according to claim 7, further comprising:
converting, by the processing circuit, the human eye coordinates into the actual position of the human eyes using a spatial conversion matrix.
10. The display method according to claim 7, further comprising:
calculating a rotation matrix by the processing circuit according to the default viewing position corresponding to the reference coordinate in the viewing field and the actual position of the human eyes; and
calculating, by the processing circuit, the liquid crystal rotation angle according to the rotation matrix.
11. The display method according to claim 7, further comprising:
sensing, by a depth sensor of the stereoscopic display apparatus, depth information from the display panel to the user in the viewing field; and
using, by the processing circuit, the depth information and the reference coordinates to calculate a default viewing position corresponding to the reference coordinates in the viewing field.
12. The display method according to claim 7, wherein the lens arrays include a first lens array and a second lens array corresponding to a left eye and a right eye of the user, respectively, and the display method further comprises:
adjusting, by the processing circuit, the liquid crystal rotation angle of the first lens array and the liquid crystal rotation angle of the second lens array according to the actual position of the human eyes and a compensation vector, wherein the compensation vector is related to the horizontal resolution of the display panel.
CN202111123519.5A 2021-09-24 2021-09-24 Stereoscopic display device and display method thereof Pending CN115866230A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111123519.5A CN115866230A (en) 2021-09-24 2021-09-24 Stereoscopic display device and display method thereof

Publications (1)

Publication Number Publication Date
CN115866230A true CN115866230A (en) 2023-03-28


