CN117880484A - 3D display device, display method thereof, electronic device and storage medium - Google Patents


Info

Publication number
CN117880484A
Authority
CN
China
Prior art keywords
sub, grating, unit, pixel, positions
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202410101544.0A
Other languages
Chinese (zh)
Inventor
吴坤
薛海林
彭晓青
朱劲野
李艳云
冯茜
秦伟达
商世明
冯中英
魏伟
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
BOE Technology Group Co Ltd
Beijing BOE Display Technology Co Ltd
Original Assignee
BOE Technology Group Co Ltd
Beijing BOE Display Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by BOE Technology Group Co Ltd, Beijing BOE Display Technology Co Ltd filed Critical BOE Technology Group Co Ltd
Priority to CN202410101544.0A priority Critical patent/CN117880484A/en
Publication of CN117880484A publication Critical patent/CN117880484A/en
Pending legal-status Critical Current

Links

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 13/30 Image reproducers
    • H04N 13/302 Image reproducers for viewing without the aid of special glasses, i.e. using autostereoscopic displays
    • H04N 13/327 Calibration thereof
    • H04N 13/366 Image reproducers using viewer tracking
    • H04N 13/383 Image reproducers using viewer tracking for tracking with gaze detection, i.e. detecting the lines of sight of the viewer's eyes


Abstract

The application provides a 3D display device, a display method thereof, an electronic device and a storage medium. The method includes: acquiring the positions of a viewer's two eyes when facing the display screen, the binocular positions being the spatial positions of the eyes relative to the display screen; judging whether the binocular positions fall into the optimal viewing area of the naked eye 3D visual area; and, if they do not, adjusting the sub-pixel lighting area corresponding to the grating unit and/or the position of the grating unit so as to adjust the spatial position of the optimal viewing area. No matter what area the viewer's eyes are located in, the best viewing effect can thus be obtained by adjusting the sub-pixel lighting area and/or the position of the grating unit, improving the viewing experience of the user.

Description

3D display device, display method thereof, electronic device and storage medium
Technical Field
The application relates to the technical field of display devices, and in particular to a 3D display device, a display method thereof, an electronic device and a storage medium.
Background
Naked eye 3D refers to obtaining a stereoscopic viewing effect with the naked eye alone, without auxiliary viewing equipment. Its principle is that the viewer's left and right eyes each receive a different picture; after the brain superimposes this image information, an image with a stereoscopic display effect is constructed. Naked eye 3D display devices are popular because viewers can watch 3D images without wearing glasses, helmets or any other viewing aids.
In current mainstream naked eye 3D display devices, the optical structure and the corresponding pixel width are relatively fixed and cannot be adjusted in time, in particular not according to changes in the viewer's position. As a result, the 3D display effect experienced by the viewer varies with viewing position, which seriously affects the viewing experience.
Disclosure of Invention
In view of this, an object of the present application is to propose a 3D display device, a display method thereof, an electronic device, and a storage medium.
In view of the above object, a first aspect of the present application provides a display method of a 3D display device, the 3D display device including a display screen including a plurality of sub-pixels and a grating structure including a plurality of grating units arranged in sequence, the method comprising:
acquiring the positions of two eyes of a viewer facing a display screen, wherein the positions of the two eyes are the spatial positions of the two eyes relative to the display screen;
judging whether the two eye positions fall into an optimal viewing area of the naked eye 3D visual area or not;
and if the two eye positions do not fall into the optimal viewing area, adjusting the sub-pixel lighting area corresponding to the grating unit and/or the position of the grating unit so as to adjust the spatial position of the optimal viewing area.
Optionally, if the binocular positions do not fall into the optimal viewing area, the adjusting the sub-pixel lighting area corresponding to the grating unit and/or the position of the grating unit includes:
determining a central grating unit and lateral grating units based on the binocular positions, and determining a target pixel boundary position of each lateral grating unit;
if the orthographic projection of the target pixel boundary position of the lateral grating unit on the display screen is positioned outside the current sub-pixel lighting area of the lateral grating unit, determining the deviation amount and the deviation direction of the lateral grating unit based on the target pixel boundary position and the current pixel boundary position;
and adjusting the position of the sub-pixel lighting area corresponding to the grating unit and/or the position of the grating unit based on the deviation amount and the deviation direction.
Optionally, the determining a central grating unit and a lateral grating unit based on the binocular positions includes:
determining a binocular central point position of the viewer based on the binocular positions;
and if the orthographic projection of the grating units on the display screen covers the orthographic projection of the binocular central point positions on the display screen, determining the grating units as the central grating units, and determining the grating units except the central grating units in all the grating units as lateral grating units.
Optionally, the determining the target pixel boundary position of each of the lateral raster units includes:
establishing a coordinate system in a first extending direction perpendicular to a plane of the display screen and a second extending direction parallel to the plane of the display screen;
and determining the target pixel boundary position of each lateral grating unit based on the two-eye center point position, the coordinate system and a preset maximum human eye viewing angle.
Optionally, the adjusting the sub-pixel lighting area corresponding to the grating unit based on the deviation amount and the deviation direction includes:
determining the actual adjustment number of the sub-pixels based on the deviation amount and the preset minimum adjustment width of the pixels;
and adjusting the sub-pixel lighting area corresponding to the grating unit based on the actual adjustment number of the sub-pixels and the deviation direction.
Optionally, the adjusting the sub-pixel lighting area corresponding to the grating unit based on the sub-pixel adjustment number and the deviation direction includes:
and translating the sub-pixel lighting areas corresponding to the grating units from the current lighting areas along the deviation direction according to the actual adjustment number of the sub-pixels so as to adjust the sub-pixel lighting areas corresponding to the grating units.
Optionally, the grating structure is an electrically driven liquid crystal lens, the electrically driven liquid crystal lens comprises a plurality of electrodes arranged in parallel, and the grating unit is a lens area formed by a plurality of adjacent electrodes;
the adjusting the position of the grating unit based on the deviation amount and the deviation direction includes:
determining the actual adjustment number of the electrodes based on the deviation amount and the preset minimum electrode adjustment width;
and adjusting the position of the grating unit based on the actual adjustment number of the electrodes and the deviation direction.
Optionally, the adjusting the position of the grating unit based on the electrode adjustment number and the deviation direction includes:
and translating the lens area of the grating unit from the current lens area according to the actual adjustment number of the electrodes along the deviation direction so as to adjust the position of the grating unit.
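The two claim groups above share one quantization step: a continuous deviation amount is converted into a whole number of sub-pixel or electrode shifts using a preset minimum adjustment width. A minimal sketch of that step (the helper name and rounding choice are illustrative assumptions, not specified by the application):

```python
def steps_from_deviation(deviation_mm: float, min_width_mm: float) -> int:
    """Convert a continuous deviation amount (mm) into an integer number of
    adjustment steps (sub-pixels or electrodes), rounding to the nearest step.
    The sign of the deviation is handled separately as the deviation direction."""
    if min_width_mm <= 0:
        raise ValueError("minimum adjustment width must be positive")
    return round(abs(deviation_mm) / min_width_mm)
```

For example, with a 0.03 mm minimum pixel adjustment width, a 0.07 mm deviation quantizes to 2 sub-pixel steps; the same helper would serve for electrodes with the preset minimum electrode adjustment width substituted.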
Optionally, the adjusting the positions of the sub-pixel lighting area corresponding to the grating unit and of the grating unit includes:
after adjusting the sub-pixel lighting areas corresponding to the grating units, continuously acquiring the binocular positions of a viewer facing the display screen;
judging whether the two eye positions fall into an optimal viewing area of the naked eye 3D visual area or not;
and if the two eye positions do not fall into the optimal viewing area, adjusting the positions of the grating units.
Based on the same inventive concept, a second aspect of the present application provides a 3D display device, comprising:
a display screen including a plurality of subpixels;
the grating structure comprises a plurality of grating units which are sequentially arranged;
the human eye tracker is used for acquiring the positions of the eyes of a viewer when the eyes face the display screen, wherein the positions of the eyes are the spatial positions of the eyes relative to the display screen; judging whether the binocular positions fall into an optimal viewing area of the naked eye 3D visual area or not;
and the image processor is used for adjusting the sub-pixel lighting area corresponding to the grating unit and/or the position of the grating unit when the eye tracker determines that the two eye positions do not fall into the optimal viewing area so as to adjust the spatial position of the optimal viewing area.
Optionally, the image processor includes a target pixel boundary determining module, a judging module, a deviation determining module, and an adjusting module;
the target pixel boundary determining module is used for determining a central grating unit and lateral grating units based on the binocular positions and determining the target pixel boundary position of each lateral grating unit;
the judging module is used for judging whether the orthographic projection of the target pixel boundary position of the lateral grating unit on the display screen is positioned outside the current sub-pixel lighting area of the lateral grating unit;
if the orthographic projection of the target pixel boundary position of the lateral grating unit on the display screen is located outside the current sub-pixel lighting area of the lateral grating unit, the deviation determining module is used for determining the deviation amount and the deviation direction of each lateral grating unit based on the target pixel boundary position and the current pixel boundary position;
the adjusting module is used for adjusting the sub-pixel lighting area corresponding to the grating unit and/or the position of the grating unit.
Optionally, the adjustment module includes a pixel adjustment unit;
the pixel adjusting unit is used for determining the actual adjustment number of the sub-pixels based on the deviation amount and the preset minimum pixel adjustment width; and translating the sub-pixel lighting areas corresponding to the grating units from the current lighting areas along the deviation direction according to the actual adjustment number of the sub-pixels, so as to adjust the sub-pixel lighting areas corresponding to the grating units.
Optionally, the grating structure is an electrically driven liquid crystal lens, the electrically driven liquid crystal lens comprises a plurality of electrodes arranged in parallel, and the grating unit is a lens area formed by a plurality of adjacent electrodes;
the adjusting module further comprises an electrode adjusting unit;
the electrode adjusting unit is used for determining the actual adjustment number of the electrodes based on the deviation amount and the preset minimum electrode adjustment width; and translating the lens area of the grating unit from the current lens area along the deviation direction according to the actual adjustment number of the electrodes, so as to adjust the position of the grating unit.
Based on the same inventive concept, a third aspect of the present application provides an electronic device, comprising a memory, a processor and a computer program stored on the memory and executable on the processor, the processor implementing the method according to any one of the first aspect above when executing the program.
Based on the same inventive concept, a fourth aspect of the present application provides a computer readable storage medium storing computer instructions for causing a computer to perform the method of any one of the above first aspects.
As can be seen from the foregoing, in the 3D display device, the display method thereof, the electronic device and the storage medium provided by the present application, when the binocular positions do not fall into the optimal viewing area, the viewing effect is poor. The spatial position of the optimal viewing area can then be adjusted by adjusting the sub-pixel lighting area corresponding to the grating unit and/or the position of the grating unit, so that the binocular positions fall into the adjusted optimal viewing area and the best viewing effect is obtained. Therefore, no matter what area the viewer's eyes are located in, the best viewing effect can be obtained by adjusting the sub-pixel lighting area and/or the position of the grating unit, improving the viewing experience of the user.
Drawings
In order to more clearly illustrate the technical solutions of the present application or the related art, the drawings required in the description of the embodiments or the related art are briefly introduced below. It is apparent that the drawings in the following description show only embodiments of the present application, and that other drawings may be obtained from them by those of ordinary skill in the art without inventive effort.
Fig. 1 is a schematic diagram of the optical design and implementation of a 3D display device;
Fig. 2 is a schematic diagram of the optical design of a 3D display device;
Fig. 3 is a schematic diagram of a viewer looking at a center position;
Fig. 4 is a schematic diagram of a viewer looking at a non-center position;
Fig. 5 is a first flowchart of a display method of the 3D display device according to an embodiment of the present application;
Fig. 6 is a first schematic diagram of the principle of a display method of a 3D display device according to an embodiment of the present application;
Fig. 7A is a schematic diagram of a first equivalent lens principle of an electrically driven liquid crystal lens according to an embodiment of the present application;
Fig. 7B is a schematic diagram of a second equivalent lens principle of an electrically driven liquid crystal lens according to an embodiment of the present application;
Fig. 8 is a second flowchart of a display method of the 3D display device according to an embodiment of the present application;
Fig. 9 is a second schematic diagram of the principle of a display method of the 3D display device according to an embodiment of the present application;
Fig. 10 is a third schematic diagram of the principle of a display method of the 3D display device according to an embodiment of the present application;
Fig. 11 is a fourth schematic diagram of the principle of a display method of the 3D display device according to an embodiment of the present application;
Fig. 12 is a fifth schematic diagram of the principle of a display method of the 3D display device according to an embodiment of the present application;
Fig. 13 is a schematic diagram of a sub-pixel arrangement according to an embodiment of the present application;
Fig. 14 is a sixth schematic diagram of the principle of a display method of a 3D display device according to an embodiment of the present application;
Fig. 15 is a schematic structural diagram of a 3D display device according to an embodiment of the present application;
Fig. 16 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
In the figure: p, grating unit pitch; w, display pixel width; l, the optimal viewing distance; D. the distance between the optical structure and the display panel; x, a first offset; y, a second offset; m, the position of the central point of the actual optimal viewing area; n, the positions of the center points of the eyes of the left-offset viewer; o, the positions of the center points of the eyes of the right-offset viewer; w, the amount of deviation; r, deviation direction;
1. a 3D display device; 11. a display screen; 12. a grating structure; 121. a grating unit; 13. an eye tracker; 14. an image tracker;
2. A space distribution diagram of each viewpoint of the sub-pixel; 3. a current lighting area; 4. the area is lightened after adjustment; 5. a current lens region; 6. adjusting the rear lens region; 7. an optimal viewing area; 8. an electrode; 9. an equivalent lens; 10. the actual arrangement of the views of the sub-pixels is schematically shown.
Detailed Description
For the purposes of making the objects, technical solutions and advantages of the present application more apparent, the present application will be further described in detail below with reference to the accompanying drawings.
It should be noted that unless otherwise defined, technical or scientific terms used in the embodiments of the present application should be given the ordinary meaning as understood by one of ordinary skill in the art to which the present application belongs. The terms "first," "second," and the like, as used in embodiments of the present application, do not denote any order, quantity, or importance, but rather are used to distinguish one element from another. The word "comprising" or "comprises", and the like, means that elements or items preceding the word are included in the element or item listed after the word and equivalents thereof, but does not exclude other elements or items. The terms "connected" or "connected," and the like, are not limited to physical or mechanical connections, but may include electrical connections, whether direct or indirect. "upper", "lower", "left", "right", etc. are used merely to indicate relative positional relationships, which may also be changed when the absolute position of the object to be described is changed.
With the rapid development of stereoscopic display technology, the demand for stereoscopic display devices in industry is increasing. Among the technologies for realizing three-dimensional stereoscopic display, naked eye stereoscopic display is favored because viewers do not need to wear glasses. Naked eye three-dimensional display technology is widely applied in products such as mobile phones, tablets, notebooks and televisions, achieving full-size coverage.
At present, naked eye stereoscopic display technology is mainly realized by arranging a grating in front of or behind a display panel and dividing the pixel units of the display panel into odd-numbered and even-numbered columns of pixels in the horizontal direction, thereby providing two different images to the viewer's left and right eyes respectively; the parallax between the left-eye and right-eye images forms a depth of field, producing a stereoscopic display effect.
The grating comprises a shielding type grating and a light splitting type grating, wherein the shielding type grating is divided into a black-white parallax barrier grating and a liquid crystal slit grating, and the light splitting type grating is divided into a columnar physical lens, a switchable liquid crystal lens and the like. The switchable liquid crystal lenses are further classified into three types, including: switchable resin type lenticular liquid crystal lens, switchable polarizing liquid crystal lens, and switchable electrode type liquid crystal lens. The most widely used gratings today are fixed cylindrical physical lenses (also known as lenticular gratings).
Fig. 1 shows an optical design and implementation schematic of a naked eye 3D product. As shown in fig. 1, taking the grating structure 12 as a cylindrical grating as an example, the cylindrical grating includes a plurality of grating units 121; the pitch of one grating unit 121 corresponds to a plurality of sub-pixels, and the light emitted by these sub-pixels enters the viewer's left and right eyes respectively through refraction by the cylindrical grating and is then integrated by the brain into a stereoscopic impression.
Based on the naked eye 3D optical design principle, as shown in fig. 2, for triangles ΔABC and ΔAEF the principle of similar triangles gives (L+D)/L = W/P, from which it follows that the grating unit pitch P is smaller than the corresponding display pixel width W; this is the current general design scheme.
In some embodiments, consider a display device such as a 15.6-inch 4K-resolution display screen 11, where the width of one sub-pixel is 0.03 mm. If a naked eye 3D optical scheme is designed in which the pitch of one optical structure corresponds to 4 sub-pixels, the display pixel width W corresponding to one pitch is 0.12 mm. With the optimal viewing distance L set to 500 mm, a two-viewpoint design, and a human pupil distance of 65 mm, a pitch of about 0.11989 mm can be calculated, the difference between the pitch and the display pixel width being about 0.00011 mm. The display screen 11 has 3840 × 3 = 11520 sub-pixels in the transverse direction, i.e. 2880 display pixel widths, so according to the per-pitch difference the total difference across the screen is about 0.3168 mm.
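The worked example can be reproduced with a few lines of arithmetic. This is a sketch under stated assumptions: the lens-to-panel gap D is inferred from the view-separation triangles as D = L·(W/2)/e (per-eye view width W/2 in a two-viewpoint design), and the pitch then follows from the similar-triangle relation (L+D)/L = W/P; note that the computed pitch comes out slightly smaller than W, as that relation requires.

```python
# Reproduce the 15.6-inch 4K worked example (all lengths in mm).
sub_px = 0.03          # width of one sub-pixel
W = 4 * sub_px         # display pixel width covered by one pitch (0.12 mm)
L = 500.0              # optimal viewing distance
e = 65.0               # human pupil distance, two-viewpoint design

D = L * (W / 2) / e    # lens-to-panel gap from the view-separation triangles
P = W * L / (L + D)    # grating unit pitch from (L+D)/L = W/P
diff = W - P           # per-pitch shortfall of the pitch vs. the pixel width

n_pitches = 3840 * 3 // 4   # 11520 sub-pixels across -> 2880 pitches
total = n_pitches * diff    # cumulative offset across the panel

print(f"P = {P:.5f} mm, diff = {diff:.5f} mm, total = {total:.4f} mm")
```

The unrounded total comes out near 0.319 mm; a quoted figure of 0.3168 mm corresponds to multiplying 2880 pitches by the per-pitch difference rounded to 0.00011 mm.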
For the naked eye 3D display device, the design is generally referenced to the center line of the display screen 11. Extending from the center toward the two sides of the display screen 11 yields the structure shown in fig. 3: the grating unit pitch P is offset from the display pixel width W, i.e. a first offset X occurs. In this case the viewer obtains the best display effect when viewing at the center of the screen; but if the viewer is at a non-center position, such as the position shown in fig. 4, the offset becomes the second offset Y, which is significantly larger than the first offset X. Since the grating unit 121 remains at the position shown by the solid line, the crosstalk of the 3D display increases seriously, resulting in a poor 3D experience.
As shown in fig. 4, for a better 3D experience the grating unit 121 should actually be located at the position shown by the dotted line. However, since the grating unit 121 and the pixel width are relatively fixed, no adjustment can be performed, in particular no flexible adjustment according to changes in the viewer's position. The 3D display effect experienced by the viewer therefore varies with viewing position, which seriously affects the viewing experience and is also unfavorable to the popularization of the product.
Based on this, referring to fig. 5 and 6, in some embodiments, the present application provides a display method of a 3D display device, where the 3D display device includes a display screen 11 and a grating structure 12, the display screen 11 includes a plurality of sub-pixels, and the grating structure 12 includes a plurality of grating units 121 sequentially arranged, and the method specifically includes the following steps:
step S100, obtaining the positions of the eyes of a viewer facing the display screen 11, wherein the positions of the eyes are the spatial positions of the eyes relative to the display screen 11;
step 200, judging whether the two eye positions fall into an optimal viewing area 7 of the naked eye 3D visual area;
step S300, if the two eye positions do not fall into the optimal viewing area 7, adjusting the sub-pixel lighting area corresponding to the raster unit 121 and/or the position of the raster unit 121, so as to adjust the spatial position of the optimal viewing area 7.
The binocular positions of the viewer when facing the display screen 11, i.e. the spatial positions of the eyes relative to the display screen 11, may be acquired by a device such as a camera or an eye tracker provided on the 3D display device. Illustratively, the acquired binocular positions are expressed as position coordinates; for example, the acquired left eye position is (1, 2) or (a, b), and the acquired right eye position is (3, 4) or (c, d).
After the binocular positions are acquired, it is judged whether they fall into the optimal viewing area 7 of the naked eye 3D visual area. The optimal viewing area 7 is obtained by pre-designing and testing the 3D display device; an exemplary optimal viewing area 7 is shown in fig. 6, where M represents the center point position of the actual optimal viewing area 7.
When the positions of the two eyes lie within the optimal viewing area 7, the best viewing effect is obtained and the visual crosstalk is small; when the positions of the viewer's eyes do not fall within the optimal viewing area 7 (for example, when the binocular center point is at position N in the figure), the visual crosstalk is large and the viewing effect is poor.
Therefore, when the binocular positions do not fall into the optimal viewing area 7, the viewing effect is poor. The lit sub-pixels corresponding to the grating units 121 and/or the positions of the grating units 121 are then adjusted, changing the relative positions of the grating units 121 and the lit sub-pixels so that the light emitted by the lit sub-pixels enters the human eyes after refraction by the corresponding grating units 121. With the binocular positions unchanged, this is equivalent to adjusting the spatial position of the optimal viewing area 7 so that the binocular positions fall into the adjusted optimal viewing area 7: the best viewing effect is obtained, and the display crosstalk caused by the positional deviation between the grating units 121 and the lit sub-pixels is reduced. Thus, no matter what area the viewer's eyes are located in, the best viewing effect can be obtained by adjusting the sub-pixel lighting area and/or the position of the grating unit 121, improving the viewing experience of the user.
Specifically, adjusting the sub-pixel lighting area corresponding to the grating unit 121 and/or the position of the grating unit 121 may mean adjusting only the sub-pixel lighting area, adjusting only the position of the grating unit 121, or adjusting both, simultaneously or sequentially.
However, before adjusting the position of the grating unit 121, it is necessary to determine, from the type of the grating structure 12, whether the position of the grating unit 121 is adjustable at all. The position of the grating unit 121 can be flexibly adjusted only when the grating structure 12 is an electrically driven liquid crystal lens, in which the grating unit 121 is a lens area formed by a plurality of adjacent electrodes 8. If the grating structure 12 is a lenticular liquid crystal lens or a switchable polarizing liquid crystal lens, the position of the grating unit 121 is fixed and cannot be adjusted; in that case the spatial position of the optimal viewing area 7 can only be adjusted via the sub-pixel lighting area corresponding to the grating unit 121.
Referring to fig. 7A, which shows a schematic structure of an electrically driven liquid crystal lens: the lens includes a plurality of electrodes 8 arranged in parallel, and specific voltages are applied to a group of adjacent electrodes 8. For example, applying specific voltages V1 to V10 to the right 10 electrodes 8 in fig. 7A makes these 10 electrodes 8 form an equivalent lens 9 (i.e., a grating unit 121). When the voltages applied to the electrodes 8 change, the position of the equivalent lens 9 changes as well. For example, referring to fig. 7B, applying the specific voltages V1 to V10 to the middle 10 electrodes 8 makes those 10 electrodes 8 form a new equivalent lens 9. That is, from fig. 7A to fig. 7B, changing the voltages applied to the electrodes 8 changes the electrically driven lens region: the lens region is shifted, and thus the position of the grating unit 121 is changed.
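The equivalent-lens shift of figs. 7A-7B amounts to re-addressing the same voltage profile V1…V10 onto a different run of electrodes. A sketch of that re-mapping (the electrode count and voltage values are illustrative placeholders, not values from the application):

```python
def apply_lens_profile(n_electrodes: int, profile: list, start: int) -> list:
    """Place the lens voltage profile V1..Vk onto the electrodes beginning at
    index `start`; all other electrodes stay at 0 V (no lens effect there)."""
    voltages = [0.0] * n_electrodes
    for i, v in enumerate(profile):
        voltages[start + i] = v
    return voltages


# Illustrative V1..V10 forming one equivalent lens (grating unit 121):
profile = [1.0, 2.5, 3.5, 4.0, 4.2, 4.2, 4.0, 3.5, 2.5, 1.0]
before = apply_lens_profile(20, profile, 8)    # lens region as in fig. 7A
after = apply_lens_profile(20, profile, 10)    # lens region translated by 2 electrodes
```

Shifting `start` by the actual electrode adjustment number, along the deviation direction, translates the lens region without changing its shape, which is exactly the position adjustment of the grating unit 121 described above.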
Due to process limitations, the sub-pixel lighting area can be adjusted with higher precision, while the position of the grating unit 121 can only be adjusted more coarsely.
Based on this, in some embodiments, referring to fig. 8, when the binocular positions do not fall into the optimal viewing area 7, indicating that the viewing effect is poor, the sub-pixel lighting area corresponding to the grating unit 121 may be adjusted preferentially. After this adjustment, the binocular positions of the viewer facing the display screen 11 continue to be acquired, and it is judged whether they fall into the optimal viewing area 7 of the naked eye 3D visual area. If they still do not, the display effect remains poor, and it is then determined whether the position of the grating unit 121 is adjustable. If it is, the spatial position of the optimal viewing area 7 can be adjusted by adjusting the position of the grating unit 121 to achieve the best display effect; if it is not, the sub-pixel lighting area corresponding to the grating unit 121 is adjusted again. Through multiple adjustments the spatial position of the optimal viewing area 7 can thus be brought to the most suitable position for the best viewing effect. Preferentially adjusting the sub-pixel lighting area also improves the adjustment precision, reduces the number of adjustments, and is convenient in actual operation.
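The prioritized two-stage strategy of this embodiment — trim the finer sub-pixel lighting area first, re-check the binocular positions, and fall back to moving the grating unit only if the eyes still miss the best zone and the grating is electrically adjustable — can be sketched as follows (the callables are hypothetical stand-ins for the tracker and drivers):

```python
def staged_adjust(eyes_in_zone, shift_subpixels, grating_adjustable,
                  shift_grating, max_rounds: int = 5) -> bool:
    """Return True once the viewer's eyes fall inside the optimal viewing area."""
    for _ in range(max_rounds):
        if eyes_in_zone():
            return True
        shift_subpixels()            # finer-precision adjustment first
        if eyes_in_zone():
            return True
        if grating_adjustable():
            shift_grating()          # coarser fallback: translate the lens region
        # otherwise loop and re-trim the sub-pixel lighting area
    return eyes_in_zone()
```

The bounded loop mirrors the "multiple adjustments" of the embodiment; a real implementation would re-acquire the eye positions on every check rather than cache them.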
As described above, when designing a 3D display device, in order to obtain a better display effect, the grating unit pitch P needs to be precisely designed and prepared according to the display pixel width W, with P differing only slightly from W, which places very strict demands on the manufacturing process and is not conducive to mass production. For example, for a 15.6-inch 4K resolution display 11, the width of one sub-pixel is 0.03mm. If a naked-eye 3D optical scheme is designed in which the pitch of one grating unit 121 corresponds to 4 sub-pixels, that is, the display pixel width W corresponding to one pitch is 0.12mm, the optimal viewing distance L is set to 500mm, a two-viewpoint design is adopted, and the pupil distance of the human eye is taken as 65mm, then a pitch of 0.121989mm can be calculated, with a difference of 0.000121mm between the theoretical and actual pixel widths. That is, based on such a design, the pitch of the grating unit 121 would have to be manufactured at 0.121989mm, which is very demanding for the manufacturing process and disadvantageous for mass production.
Based on this, in some embodiments, each grating unit 121 corresponds to a plurality of sub-pixels, and the orthographic projection of the grating unit 121 on the display screen 11 completely overlaps the corresponding plurality of sub-pixels. Thus, no precise pitch design is required, nor must the pitch of the grating unit 121 be slightly smaller than the width of the corresponding plurality of sub-pixels, which simplifies the design and manufacture of the grating unit 121 and facilitates mass production. For example, taking the foregoing 15.6-inch 4K resolution display 11 as an example, the width of one sub-pixel is 0.03mm; assuming that the pitch of one grating unit 121 corresponds to 4 sub-pixels, the pitch of one grating unit 121 can simply be designed as 0.12mm, which, compared with the existing design with a pitch of 0.121989mm, greatly reduces the difficulty of the manufacturing process and reduces the manufacturing cost.
In this application, since the sub-pixel lighting area and/or the position of the grating unit 121 can be flexibly adjusted according to the binocular positions, the pitch of the grating unit 121 need not be strictly limited, and the optimal viewing effect can still be achieved through adjustment.
In some embodiments, referring to fig. 9, if the binocular position does not fall into the optimal viewing area 7, step S300 of adjusting the sub-pixel lighting area corresponding to the grating unit 121 and/or the position of the grating unit 121 specifically includes:

step S310, determining a central grating unit and lateral grating units based on the binocular positions, and determining the target pixel boundary position of each lateral grating unit;

step S320, if the orthographic projection of the target pixel boundary position of a lateral grating unit on the display screen 11 is located outside the current sub-pixel lighting area of that lateral grating unit, determining the deviation amount W and the deviation direction R of each lateral grating unit based on the target pixel boundary position and the current pixel boundary position;

step S330, adjusting the sub-pixel lighting area corresponding to the grating unit 121 and/or the position of the grating unit 121 based on the deviation amount W and the deviation direction R.
Specifically, step S310 includes:

step S311, determining a central grating unit and lateral grating units based on the binocular positions, specifically including: determining the binocular central point position of the viewer based on the binocular positions; if the orthographic projection of a grating unit 121 on the display screen 11 covers the orthographic projection of the binocular central point position on the display screen 11, determining that grating unit 121 as the central grating unit, and determining the grating units 121 other than the central grating unit among all the grating units 121 as lateral grating units.
Step S312, determining the target pixel boundary position of each lateral grating unit, specifically includes:
step S3121, establishing a coordinate system with a first extending direction perpendicular to the plane of the display screen 11 and a second extending direction parallel to the plane of the display screen 11;
step S3122, determining the target pixel boundary position of each lateral grating unit based on the binocular central point position, the coordinate system and a preset maximum human eye viewing angle.
Specifically, as shown in fig. 9, it is assumed that the binocular central point of the viewer determined based on the binocular positions is located at position O. Based on position O, the orthographic projection on the display screen 11 of the second grating unit 121 from left to right in fig. 9 covers the orthographic projection of the binocular central point position on the display screen 11, so that this grating unit 121 is determined as the central grating unit, and the grating units 121 other than the central grating unit among all grating units 121 are determined as lateral grating units.
Then, with continued reference to fig. 9, a coordinate system is established with a first extending direction (Y direction as shown in fig. 9) perpendicular to the plane of the display screen 11 and a second extending direction (X direction as shown in fig. 9) parallel to the plane of the display screen 11.
Based on the binocular central point position O and a preset maximum human eye viewing angle (illustratively, the angle ZOE shown in fig. 9), a right triangle is established in the coordinate system, shown as ΔOZE in fig. 9, and position E is determined as the target pixel boundary position of the lateral grating unit, that is, the farthest position on the display screen 11 that the human eye can view at the preset maximum viewing angle.
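In the geometry of fig. 9 this is one line of trigonometry: ZE = OZ·tan(∠ZOE). A minimal sketch, with the function name and degree-based interface assumed:

```python
import math

def target_boundary_offset(oz_mm, max_view_angle_deg):
    """Length ZE in the right triangle OZE: how far along the display,
    measured from the projection point Z of the binocular centre O,
    the eye can see at the preset maximum viewing angle (angle ZOE)."""
    return oz_mm * math.tan(math.radians(max_view_angle_deg))
```

For example, at OZ = 500mm and a hypothetical 45° maximum angle, ZE comes out as 500mm.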
When the orthographic projection of the target pixel boundary position of a lateral grating unit on the display screen 11 is located outside the current sub-pixel lighting area of that lateral grating unit, the deviation amount W and the deviation direction R of each lateral grating unit are determined based on the target pixel boundary position and the current pixel boundary position.

The current sub-pixel lighting area is the sub-pixel lighting area before adjustment. In the present application, in the initial state, the width of the current lighting area 3 of the sub-pixels is the same as the pitch width of the corresponding grating unit 121.

When determining the deviation amount W and the deviation direction R of each lateral grating unit based on the target pixel boundary position and the current pixel boundary position, referring to fig. 9, the point in the X direction where the line of sight at the maximum human eye viewing angle meets the bottom of the lateral grating unit is determined in the coordinate system as position F. Since the width of the current lighting area 3 of the sub-pixels is the same as the pitch width of the corresponding grating unit 121, position F is the current pixel boundary position.

The deviation amount W and the deviation direction R of each lateral grating unit are then determined based on the target pixel boundary position E, the current pixel boundary position F, the boundary point position S of the bottom of the central grating unit in the Y direction, the orthographic projection position Z of the binocular central point position on the display screen 11, the distance D between the bottom of the grating unit 121 and the top of the display screen 11, and the binocular central point position O.
Illustratively, referring to fig. 9, ΔOSF and ΔOZE are similar triangles, and SF/ZE = OS/OZ follows from the law of similar triangles. Therefore, based on the positions of the points already determined, the difference between SF and ZE, that is, the deviation amount W of the lateral grating unit, can be calculated.

The deviation direction R of the lateral grating unit is then determined to be the R direction shown in fig. 9 based on the coordinate system and the relative positions of the target pixel boundary position and the current pixel boundary position.
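Under SF/ZE = OS/OZ, and assuming the viewing distance L is measured to the grating plane so that OS = L and OZ = L + D (matching the later worked relation M1/M2 = L/(L+D)), the deviation amount reduces to W = ZE·D/(L+D). A sketch; the sign convention for the direction R is an assumption:

```python
def lateral_deviation(ze_mm, viewing_distance_mm, gap_mm):
    """Deviation of one lateral grating unit from similar triangles
    ΔOSF ~ ΔOZE: S and F lie on the grating plane, Z and E on the
    display, separated by the grating-to-display gap D (gap_mm)."""
    sf = ze_mm * viewing_distance_mm / (viewing_distance_mm + gap_mm)
    w = abs(ze_mm - sf)                    # W = |ZE| * D / (L + D)
    r = "right" if ze_mm >= 0 else "left"  # hypothetical direction convention
    return w, r
```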
Then, based on the deviation amount W and the deviation direction R, the sub-pixel lighting area corresponding to the grating unit 121 and/or the position of the grating unit 121 is adjusted.
Specifically, referring to fig. 9 and 10, the sub-pixel lighting area corresponding to the grating unit 121 may be adjusted based on the deviation amount W and the deviation direction R. Illustratively, as shown in fig. 9, when the deviation direction R is rightward, the sub-pixel lighting area is shifted rightward from the current lighting area 3 by the deviation amount W, and the shifted sub-pixel lighting area is the adjusted lighting area 4. Illustratively, as shown in fig. 10, when the deviation direction R is leftward, the sub-pixel lighting area is shifted leftward from the current lighting area 3 by the deviation amount W, and the shifted sub-pixel lighting area is the adjusted lighting area 4.

Referring to fig. 11 and 12, the position of the grating unit 121 may likewise be adjusted based on the deviation amount W and the deviation direction R. Illustratively, as shown in fig. 11, when the deviation direction R is rightward, the lens area of the grating unit 121 is shifted rightward from the current lens area 5 by the deviation amount W, and the shifted lens area is the adjusted lens area 6, thereby adjusting the position of the grating unit 121. As shown in fig. 12, when the deviation direction R is leftward, the lens area of the grating unit 121 is shifted leftward from the current lens area 5 by the deviation amount W, and the shifted lens area is the adjusted lens area 6.
In this application, by establishing an appropriate coordinate system and accurately calculating the deviation amount W and the deviation direction R of each lateral grating unit, and then adjusting the sub-pixel lighting area corresponding to the grating unit 121 and/or the position of the grating unit 121 based on these values, the accuracy and precision of adjustment can be improved, and the spatial position of the optimal viewing area 7 can be adjusted more accurately, so as to obtain the optimal viewing effect.
When the adjustment is performed based on the deviation amount W and the deviation direction R, the translation of the sub-pixel lighting area and/or of the position of the grating unit 121 may not exactly equal the deviation amount W, owing to limitations of the process, the sub-pixel arrangement and the grating structure 12 itself. For example, if the calculated deviation amount W is 0.1094mm but the minimum adjustment width of the sub-pixel lighting area is 0.1000mm, the lighting area cannot be shifted by exactly the deviation amount W of 0.1094mm, which increases the process difficulty.
Based on this, in some embodiments, adjusting the sub-pixel lighting area corresponding to the grating unit 121 based on the deviation amount W and the deviation direction R includes:

determining the actual adjustment number of sub-pixels based on the deviation amount W and a preset minimum pixel adjustment width;

and adjusting the sub-pixel lighting area corresponding to the grating unit 121 based on the actual adjustment number of sub-pixels and the deviation direction R. Specifically, the sub-pixel lighting area corresponding to the grating unit 121 is translated from the current lighting area 3 along the deviation direction R by the actual adjustment number of sub-pixels.

The preset minimum pixel adjustment width is the minimum width by which the sub-pixels can be adjusted. Illustratively, it may be 0.006mm or 0.005mm, etc.

Based on the ratio of the deviation amount W to the preset minimum pixel adjustment width, the number of sub-pixel adjustments can be determined. When the calculated number is an integer, it is taken directly as the actual adjustment number of sub-pixels, and the sub-pixel lighting area corresponding to the grating unit 121 is translated from the current lighting area 3 along the deviation direction R by that number of sub-pixels.

For example, assuming that the deviation amount W is 0.0300mm and the preset minimum pixel adjustment width is 0.006mm, the ratio of the two gives 5 sub-pixel adjustments, so 5 is taken as the actual adjustment number of sub-pixels, and the sub-pixel lighting area corresponding to the grating unit 121 is translated from the current lighting area 3 by 5 sub-pixels along the deviation direction R.
When the calculated number of sub-pixel adjustments is a decimal (for example, m.nb, where m, n and b are all natural numbers), the actual adjustment number of sub-pixels is determined as follows:
when 0.nb is greater than or equal to 0.5, determining m+1 as the actual adjustment number of the sub-pixels;
when 0.nb is smaller than 0.5, m is determined as the actual adjustment number of the sub-pixels.
Then, the sub-pixel lighting area corresponding to the grating unit 121 is translated from the current lighting area 3 along the deviation direction R by the actual adjustment number of sub-pixels. In this way, on the basis of what the existing process can implement, the actually applied deviation is kept as close as possible to the theoretically calculated deviation amount W, improving the accuracy of adjustment.

For example, assuming that the deviation amount W is 0.0320mm and the preset minimum pixel adjustment width is 0.006mm, the ratio of the two gives 5.33 sub-pixel adjustments, so 5 is determined as the actual adjustment number of sub-pixels, and the sub-pixel lighting area corresponding to the grating unit 121 is translated from the current lighting area 3 by 5 sub-pixels along the deviation direction R.

For example, assuming that the deviation amount W is 0.0350mm and the preset minimum pixel adjustment width is 0.006mm, the ratio of the two gives 5.83 sub-pixel adjustments, so 6 is determined as the actual adjustment number of sub-pixels, and the sub-pixel lighting area corresponding to the grating unit 121 is translated from the current lighting area 3 by 6 sub-pixels along the deviation direction R.

In this application, the actual adjustment number of sub-pixels is determined first, and the sub-pixel lighting area is adjusted based on it. On the one hand, no difficulty is added to the existing process, and the adjustment can be realized under existing process conditions; on the other hand, the actually applied deviation is kept as close as possible to the theoretically calculated deviation amount W, improving the accuracy of adjustment.
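The round-to-nearest rule (m.nb becomes m+1 when 0.nb ≥ 0.5, otherwise m) can be sketched as below; it applies unchanged to the electrode case that follows. Python's built-in round() is avoided because it rounds exact halves to the nearest even integer, which is not what the text specifies:

```python
def actual_adjust_count(deviation_w_mm, min_step_mm):
    """Number of whole adjustment steps closest to W / step, rounding
    halves upward as the text specifies (m.nb -> m+1 when 0.nb >= 0.5)."""
    ratio = deviation_w_mm / min_step_mm
    m = int(ratio)                  # integer part m of m.nb
    return m + 1 if ratio - m >= 0.5 else m
```

The worked figures from the text serve as checks: W = 0.0320mm at a 0.006mm step gives 5, W = 0.0350mm gives 6, and the electrode cases 0.0282mm and 0.0312mm at a 0.004mm step give 7 and 8 respectively.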
In some embodiments, adjusting the position of the grating unit 121 based on the deviation amount W and the deviation direction R includes:

determining the actual electrode adjustment number based on the deviation amount W and a preset minimum electrode adjustment width;

and adjusting the position of the grating unit 121 based on the actual electrode adjustment number and the deviation direction R. Specifically, the lens area of the grating unit 121 is translated from the current lens area 5 along the deviation direction R by the actual adjustment number of electrodes.

The preset minimum electrode adjustment width is the minimum width by which the electrodes 8 can be adjusted. Illustratively, it may be 0.004mm or 0.005mm, etc.

Based on the ratio of the deviation amount W to the preset minimum electrode adjustment width, the number of electrode adjustments can be determined. When the calculated number is an integer, it is taken directly as the actual electrode adjustment number, and the lens area of the grating unit 121 is translated from the current lens area 5 along the deviation direction R by that number of electrodes.
For example, assuming that the deviation amount W is 0.0280mm and the preset minimum electrode adjustment width is 0.004mm, the ratio of the two gives 7 electrode adjustments, so 7 is determined as the actual electrode adjustment number, and the lens area of the grating unit 121 is translated from the current lens area 5 by 7 electrodes along the deviation direction R.

When the calculated electrode adjustment number is a decimal (for example, r.tu, where r, t and u are all natural numbers), the actual electrode adjustment number is determined as follows:
when 0.tu is greater than or equal to 0.5, determining r+1 as the actual adjustment number of the electrodes;
when 0.tu is less than 0.5, r is determined as the actual number of electrode adjustments.
Then, the lens area of the grating unit 121 is translated from the current lens area 5 along the deviation direction R by the actual adjustment number of electrodes, so as to adjust the position of the grating unit 121. In this way, on the basis of what the existing process can implement, the actually applied deviation is kept as close as possible to the theoretically calculated deviation amount W, improving the accuracy of adjustment.
For example, assuming that the deviation amount W is 0.0282mm and the preset electrode minimum adjustment width may be 0.004mm, the electrode adjustment number is calculated to be 7.05 based on the ratio of the deviation amount W and the preset electrode minimum adjustment width, and then 7 is determined as the electrode actual adjustment number. Along the deviation direction R, the lens area of the grating unit 121 is translated from the current lens area 5 by 7 electrodes to adjust the position of the grating unit 121.
For example, assuming that the deviation amount W is 0.0312mm and the preset electrode minimum adjustment width may be 0.004mm, the electrode adjustment number is calculated to be 7.8 based on the ratio of the deviation amount W and the preset electrode minimum adjustment width, and then 8 is determined as the electrode actual adjustment number. Along the deviation direction R, the lens area of the grating unit 121 is translated from the current lens area 5 by 8 electrodes to adjust the position of the grating unit 121.
In this application, the actual electrode adjustment number is determined first, and the position of the grating unit 121 is adjusted based on it. On the one hand, no difficulty is added to the existing process, and the adjustment can be realized under existing process conditions; on the other hand, the actually applied deviation is kept as close as possible to the theoretically calculated deviation amount W, improving the accuracy of adjustment.
In specific implementation, the display method of the 3D display device described in the present application may include the following steps:
Taking the 15.6-inch 4K display 11 as an example, the 3D design parameters of the display 11 include: red, green and blue sub-pixels each 0.03mm wide; a color film glass thickness of 0.25mm; a color film polarizer thickness of 0.1mm; an optimal naked-eye 3D viewing distance L = 500mm; one resin prism (i.e., one grating unit 121) whose horizontal width corresponds to 4 sub-pixels; a resin prism inclination angle, i.e., the included angle between the prism and the short-side direction of the display screen 11, of 7.5946°; and a 20-viewpoint, 2-view design. The 2 views mean that the continuous window corresponding to one prism is t = 10mm wide, i.e., single-person viewing; the 20 viewpoints mean that 20 viewpoints can be displayed within the window width corresponding to one prism, where viewpoints 1-10 display a picture of one parallax and viewpoints 11-20 display a picture of the other parallax, as shown in fig. 13. With 20 viewpoints, the pixel width, and hence the viewpoint width, can be adjusted on a per-viewpoint basis: one resin prism unit is 0.12mm wide, so the adjustment unit of the sub-pixel width is 0.12/20 = 0.006mm (i.e., the preset minimum pixel adjustment width).
Based on the conventional optical structure design, the pitch of the resin prism would be 0.121989mm; in this scheme, however, since the pitch need not be so strict, it is 0.12mm.
If the grating structure 12 is a non-switchable structure (i.e., the position of the grating unit 121 is not adjustable), such as a resin prism structure, there are two ways to adjust the sub-pixel lighting area:

The first way is: the pitch of the actual resin prism is 0.12mm, while the theoretically required width of the sub-pixel lighting area is 0.120121mm; since the actual sub-pixel lighting width is 0.12mm, there is a 0.000121mm deviation between the theoretical and actual widths at the resin prism in the center position. Continuing the calculation from the center position toward both sides, by the 29th resin prism the accumulated deviation is 0.00319mm, greater than 0.003mm (half of the 0.006mm adjustment unit of the sub-pixel width), so the sub-pixel lighting area corresponding to that resin prism is shifted by one sub-pixel viewpoint. By the 55th resin prism the accumulated deviation is 0.00594mm, and at the next resin prism it exceeds 0.006mm, so the count is restarted from that prism: the previous 55 resin prisms are cleared, another viewpoint is adjusted when the 29th prism of the new count is reached, and the actual sub-pixel offset number of the sub-pixel lighting area corresponding to each resin prism is calculated in sequence in this way.

When the accumulated deviation is smaller than 0.003mm, the pixel width (i.e., the sub-pixel lighting area) is not adjusted; when it is greater than 0.003mm, the pixel width is adjusted by one viewpoint; then, once the accumulated deviation reaches 0.006mm or exceeds it by a minimum increment, the accumulation is restarted, and the above actions are repeated. In other words, using half of the adjustment unit of the sub-pixel width (i.e., half of the preset minimum pixel adjustment width) as the judgment basis improves the judgment accuracy; of course, any workable value may be used as the judgment basis.
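The first-way accumulation can be sketched as a walk outward from the center prism. The parameters are illustrative, and treating "restart the accumulation" as subtracting one full 0.006mm step after each one-viewpoint shift is an interpretation of the text:

```python
def viewpoint_shifts(num_prisms, per_prism_dev_mm, subpixel_step_mm=0.006):
    """Accumulate the per-prism pitch deviation outward from the centre;
    shift the lighting area of a prism by one viewpoint whenever the
    running total reaches half a step (0.003mm), then subtract a full
    step so accumulation effectively restarts from the excess."""
    half = subpixel_step_mm / 2.0
    shifts, acc = [], 0.0
    for _ in range(num_prisms):
        acc += per_prism_dev_mm
        if acc >= half:
            shifts.append(1)             # this prism: shift one viewpoint
            acc -= subpixel_step_mm      # carry only the excess forward
        else:
            shifts.append(0)
    return shifts
```

With a hypothetical per-prism deviation of 0.0011mm, shifts land on every sixth prism after the third, matching the periodic restart the text describes.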
The second way is: as shown in fig. 14, the position of the center point of the human eyes is detected as (x, y) by the camera, where the y coordinate of the human eyes is taken as the viewing distance L. Because the resin prisms are spaced from the sub-pixels by a certain height D, the spacing distance M1 between the central grating unit N1 and a lateral grating unit Nt located n grating units 121 away from N1 is not equal to the spacing distance M2 between the sub-pixel lighting areas corresponding to N1 and Nt; the difference between M1 and M2 can be calculated from similar triangles, where M1/M2 = L/(L+D).

Assuming M1 spans 200 resin prisms, i.e., M1 = 24mm, with L = 500mm and D = 0.7mm, M2 = 24.0332mm is calculated. The deviation between the target sub-pixel lighting area corresponding to the resin prism and the current sub-pixel lighting area is then 0.0332mm (i.e., the deviation amount W); dividing by 0.006mm (i.e., the preset minimum pixel adjustment width) gives 5.53 (i.e., the number of sub-pixel adjustments). The sub-pixel lighting area corresponding to the resin prism therefore needs to be translated from the current lighting area 3 by 6 sub-pixels (i.e., the actual adjustment number of sub-pixels) along the deviation direction R, so as to adjust the sub-pixel lighting area corresponding to the resin prism.
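A sketch of the second way as a single pipeline (the function name is hypothetical). Note that computing M2 directly from M1·(L+D)/L gives ≈24.0336mm rather than the quoted 24.0332mm, presumably an intermediate-rounding difference; either value yields the same 6-sub-pixel shift:

```python
def second_way_shift(m1_mm, viewing_distance_mm, gap_mm,
                     subpixel_step_mm=0.006):
    """Map a span M1 on the grating plane to M2 on the display via
    M1/M2 = L/(L+D), take W = M2 - M1, and round W/step to the nearest
    whole number of sub-pixels (halves round up)."""
    m2 = m1_mm * (viewing_distance_mm + gap_mm) / viewing_distance_mm
    w = m2 - m1_mm                       # deviation amount W
    ratio = w / subpixel_step_mm
    n = int(ratio)
    return m2, w, n + 1 if ratio - n >= 0.5 else n
```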
Further, when one resin prism corresponds to more viewpoints, the minimum unit of pixel width adjustment is smaller, and the adjustment is more precise.
If the grating structure 12 is a switchable structure, such as an electrically driven liquid crystal lens structure, the minimum unit of pitch width adjustment of the liquid crystal lens is related to the width and spacing of the electrodes 8. Assuming an electrode width of 0.002mm and a spacing of 0.002mm, the minimum unit of pitch width adjustment is 0.004mm (i.e., the preset minimum electrode adjustment width).

For adjusting the position of the grating unit 121 in the electrically driven liquid crystal lens structure, assume that one equivalent lens 9 includes 30 electrodes 8 whose corresponding voltage values are V1, V2, V3, ..., V28, V29, V30, which may be the same or different.

When adjustment is required, the sub-pixel lighting area is adjusted preferentially, and the position of the grating unit 121 is adjusted afterwards.

First, when the deviation amount W between a prism's target and current pixel lighting areas is 0.0332mm, the sub-pixel lighting area is adjusted preferentially by shifting 6 sub-pixel viewpoints, and the voltages on the electrodes 8 of the liquid crystal lens are left unchanged.
After the sub-pixel lighting area has been shifted by 6 sub-pixel viewpoints, the deviation amount W continues to be calculated according to the above steps.

When the remainder of dividing the deviation amount W by 0.006 is still greater than or equal to 0.006/2, the sub-pixel lighting area continues to be shifted by 1 sub-pixel viewpoint.

When the remainder of dividing the deviation amount W by 0.006 is less than 0.006/2 but greater than or equal to 0.004/2, the sub-pixel lighting area is not adjusted; instead, the lens area of the grating unit 121 is shifted by 1 electrode from the current lens area 5 to adjust the position of the grating unit 121.

When the remainder of dividing the deviation amount W by 0.006 is less than 0.004/2, neither the sub-pixel lighting area nor the position of the grating unit 121 is adjusted.
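The combined remainder strategy for a switchable grating can be sketched as follows; the thresholds 0.006/2 = 0.003mm and 0.004/2 = 0.002mm come from the text, while the function shape and return values are assumptions:

```python
def combined_adjust(w_mm, subpixel_step_mm=0.006, electrode_step_mm=0.004):
    """Split W into whole sub-pixel viewpoints plus a residue r, then:
    r >= 0.003mm -> one extra viewpoint; 0.002mm <= r < 0.003mm -> one
    electrode shift of the lens instead; r < 0.002mm -> no adjustment."""
    viewpoints, r = divmod(w_mm, subpixel_step_mm)
    viewpoints = int(viewpoints)
    electrodes = 0
    if r >= subpixel_step_mm / 2:        # residue rounds up to a viewpoint
        viewpoints += 1
    elif r >= electrode_step_mm / 2:     # too small for a viewpoint, big
        electrodes = 1                   # enough for one electrode shift
    return viewpoints, electrodes
```

For the worked W = 0.0332mm, the residue 0.0032mm rounds up, giving 6 viewpoints and no electrode shift, as in the text.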
In this application, the position of the sub-pixel lighting area and/or the position of the grating unit 121 is adjusted in real time by detecting the position information of the viewer, so that the viewer can see the best full-screen effect at any position, which improves the 3D viewing effect and comfort while relaxing the manufacturing precision required of the grating structure 12 and the grating units 121, thereby facilitating actual manufacture.
It should be noted that, the method of the embodiments of the present application may be performed by a single device, for example, a computer or a server. The method of the embodiment can also be applied to a distributed scene, and is completed by mutually matching a plurality of devices. In the case of such a distributed scenario, one of the devices may perform only one or more steps of the methods of embodiments of the present application, and the devices may interact with each other to complete the methods.
It should be noted that some embodiments of the present application are described above. Other embodiments are within the scope of the following claims. In some cases, the actions or steps recited in the claims may be performed in a different order than in the embodiments described above and still achieve desirable results. In addition, the processes depicted in the accompanying figures do not necessarily require the particular order shown, or sequential order, to achieve desirable results. In some embodiments, multitasking and parallel processing are also possible or may be advantageous.
Based on the same inventive concept, the application also provides a 3D display device corresponding to the method of any embodiment.
Referring to fig. 15, the 3D display device 1 includes:
a display screen 11 including a plurality of sub-pixels;
a grating structure 12 including a plurality of grating units 121 arranged in sequence;
an eye tracker 13, configured to obtain the positions of eyes of a viewer when the eyes face the display screen 11, where the positions of eyes are spatial positions of eyes relative to the display screen 11; judging whether the two eye positions fall into an optimal viewing area 7 of the naked eye 3D visual area or not;
and an image processor 14, configured to adjust a sub-pixel lighting area corresponding to the grating unit 121 and/or a position of the grating unit 121 when the eye tracker 13 determines that the binocular position does not fall into the optimal viewing area 7, so as to adjust a spatial position of the optimal viewing area 7.
In some embodiments, the image processor 14 includes a target pixel boundary determination module, a decision module, a bias determination module, and an adjustment module;
the target pixel boundary determining module is used for determining a central grating unit and lateral grating units based on the binocular positions and determining the target pixel boundary position of each lateral grating unit;
the judging module is used for judging whether the orthographic projection of the target pixel boundary position of the lateral grating unit on the display screen 11 is positioned outside the current sub-pixel lighting area of the lateral grating unit;
if the orthographic projection of the target pixel boundary position of the lateral grating unit on the display screen 11 is located outside the current sub-pixel lighting area of the lateral grating unit, the deviation determining module is configured to determine a deviation amount W and a deviation direction R of each lateral grating unit based on the target pixel boundary position and the current pixel boundary position;
the adjusting module is configured to adjust a sub-pixel lighting area corresponding to the raster unit 121 and/or a position of the raster unit 121.
In some embodiments, the target pixel boundary determining module is further configured to determine a binocular central point position of the viewer based on the binocular positions;
if the orthographic projection of a grating unit 121 on the display screen 11 covers the orthographic projection of the binocular central point position on the display screen 11, that grating unit 121 is determined as the central grating unit, and the grating units 121 other than the central grating unit among all the grating units 121 are determined as lateral grating units.
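Assuming each grating unit's orthographic projection on the screen can be represented as a horizontal interval (a hypothetical data model, not the patent's), the classification above amounts to a covering test against the projected binocular midpoint:

```python
def classify_grating_units(units, left, right):
    # units: list of (x_start, x_end) horizontal extents of the
    # orthographic projection of each grating unit on the display screen.
    # left/right: (x, z) eye positions; only x matters for the midpoint.
    cx = (left[0] + right[0]) / 2.0  # binocular central point, projected
    central, lateral = [], []
    for u in units:
        # the unit whose projection covers the midpoint is central;
        # every other unit is lateral
        (central if u[0] <= cx <= u[1] else lateral).append(u)
    return central, lateral
```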
In some embodiments, the target pixel boundary determining module is further configured to establish a coordinate system with a first extending direction perpendicular to the plane of the display screen 11 and a second extending direction parallel to the plane of the display screen 11;
and to determine the target pixel boundary position of each lateral grating unit based on the binocular central point position, the coordinate system, and a preset maximum human eye viewing angle.
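One plausible geometry for this step, assumed here because the text gives no explicit formula: in the coordinate system above, continue the ray from the binocular central point through a grating-unit edge onto the display plane by similar triangles, and use the preset maximum viewing angle to reject edges the viewer cannot see:

```python
import math

def target_pixel_boundary(center, edge_x, gap):
    # center: (x_c, z_c) binocular central point, z measured along the
    # screen normal from the grating plane; edge_x: x coordinate of the
    # grating-unit edge; gap: grating-to-screen spacing. (Assumed model.)
    x_c, z_c = center
    # similar triangles: extend the eye-to-edge ray by `gap`
    return edge_x + (edge_x - x_c) * gap / z_c

def within_max_viewing_angle(center, edge_x, max_angle_deg):
    # angle between the ray to the edge and the screen normal
    angle = math.atan2(abs(edge_x - center[0]), center[1])
    return angle <= math.radians(max_angle_deg)
```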
In some embodiments, the adjusting module includes a pixel adjusting unit;
the pixel adjusting unit is used for determining an actual adjustment number of the sub-pixels based on the deviation amount W and a preset minimum pixel adjustment width, and for translating the sub-pixel lighting area corresponding to the grating unit 121 from the current lighting area 3 along the deviation direction R according to the actual adjustment number of the sub-pixels, so as to adjust the sub-pixel lighting area corresponding to the grating unit 121.
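A minimal sketch of the sub-pixel adjustment, assuming the deviation amount W is quantized by simple rounding (the quantization rule is not stated in the text, and the function names are illustrative):

```python
def actual_subpixel_count(deviation_w, min_pixel_width):
    # number of sub-pixels the lighting area is actually shifted:
    # deviation amount W quantized to the preset minimum pixel
    # adjustment width (rounding rule is an assumption)
    return round(deviation_w / min_pixel_width)

def translate_lighting_area(start_index, count, direction_r):
    # shift the starting sub-pixel index of the lighting area along the
    # deviation direction R ("left" or "right")
    return start_index + count if direction_r == "right" else start_index - count
```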
In some embodiments, the grating structure 12 is an electrically driven liquid crystal lens including a plurality of electrodes 8 arranged side by side, and the grating unit 121 is a lens region formed by a plurality of adjacent electrodes;
the adjusting module further includes an electrode adjusting unit;
the electrode adjusting unit is used for determining an actual electrode adjustment number based on the deviation amount W and a preset minimum electrode adjustment width, and for translating the lens region of the grating unit 121 from the current lens region 5 along the deviation direction R according to the actual electrode adjustment number, so as to adjust the position of the grating unit 121.
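The electrode case mirrors the sub-pixel case. Combined with the two-stage strategy of claim 9 (shift the sub-pixel lighting areas first, and move the lens regions only if the viewer is still outside the optimal viewing area), one possible sketch is the following; the names and the exact fallback order are assumptions, not taken from the patent text:

```python
def actual_electrode_count(deviation_w, min_electrode_width):
    # number of electrodes to shift the lens region by
    # (rounding rule is assumed, not specified in the text)
    return round(deviation_w / min_electrode_width)

def two_stage_adjust(still_outside_after_pixels, deviation_w,
                     min_pixel_width, min_electrode_width):
    # Stage 1: always translate the sub-pixel lighting areas.
    pixel_shift = round(deviation_w / min_pixel_width)
    # Stage 2: only if the binocular positions are still outside the
    # optimal viewing area after stage 1, also translate the lens
    # regions of the electrically driven liquid crystal lens.
    electrode_shift = (actual_electrode_count(deviation_w, min_electrode_width)
                       if still_outside_after_pixels else 0)
    return pixel_shift, electrode_shift
```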
For convenience of description, the above device is described as being divided functionally into various modules. Of course, when implementing the present application, the functions of the modules may be implemented in the same one or more pieces of software and/or hardware.
The apparatus of the foregoing embodiment is configured to implement the display method of the corresponding 3D display device in any of the foregoing embodiments, and has the beneficial effects of the corresponding method embodiment, which are not repeated herein.
Based on the same inventive concept, the application also provides an electronic device corresponding to the method of any embodiment, which comprises a memory, a processor and a computer program stored on the memory and capable of running on the processor, wherein the processor realizes the display method of the 3D display device of any embodiment when executing the program.
Fig. 16 shows a more specific hardware architecture of an electronic device according to this embodiment. The device may include a processor 1010, a memory 1020, an input/output interface 1030, a communication interface 1040, and a bus 1050, where the processor 1010, the memory 1020, the input/output interface 1030, and the communication interface 1040 are communicatively connected to one another within the device via the bus 1050.
The processor 1010 may be implemented by a general-purpose CPU (Central Processing Unit), a microprocessor, an application-specific integrated circuit (ASIC), or one or more integrated circuits, and is configured to execute relevant programs to implement the technical solutions provided in the embodiments of the present disclosure.
The memory 1020 may be implemented in the form of ROM (Read-Only Memory), RAM (Random Access Memory), a static storage device, a dynamic storage device, or the like. The memory 1020 may store an operating system and other application programs; when the embodiments of the present specification are implemented in software or firmware, the associated program code is stored in the memory 1020 and executed by the processor 1010.
The input/output interface 1030 is used to connect an input/output module for inputting and outputting information. The input/output module may be configured as a component within the device (not shown) or may be external to the device to provide the corresponding functionality. Input devices may include a keyboard, a mouse, a touch screen, a microphone, and various types of sensors; output devices may include a display, a speaker, a vibrator, and indicator lights.
The communication interface 1040 is used to connect a communication module (not shown) to enable communication interaction between the present device and other devices. The communication module may communicate in a wired manner (e.g., USB, network cable) or in a wireless manner (e.g., mobile network, Wi-Fi, Bluetooth).
Bus 1050 includes a path for transferring information between components of the device (e.g., processor 1010, memory 1020, input/output interface 1030, and communication interface 1040).
It should be noted that although the above-described device only shows processor 1010, memory 1020, input/output interface 1030, communication interface 1040, and bus 1050, in an implementation, the device may include other components necessary to achieve proper operation. Furthermore, it will be understood by those skilled in the art that the above-described apparatus may include only the components necessary to implement the embodiments of the present description, and not all the components shown in the drawings.
The electronic device of the foregoing embodiment is configured to implement the display method of the corresponding 3D display device in any of the foregoing embodiments, and has the beneficial effects of the corresponding method embodiment, which are not repeated herein.
Based on the same inventive concept, corresponding to the method of any of the above embodiments, the present application further provides a computer-readable storage medium storing computer instructions for causing the computer to perform the display method of the 3D display device according to any of the above embodiments.
The computer-readable media of the present embodiments include permanent and non-permanent, removable and non-removable media, and may implement information storage by any method or technology. The information may be computer-readable instructions, data structures, program modules, or other data. Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape/magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device.
The storage medium of the foregoing embodiments stores computer instructions for causing the computer to execute the display method of the 3D display device according to any one of the foregoing embodiments, and has the beneficial effects of the corresponding method embodiments, which are not repeated herein.
Those of ordinary skill in the art will appreciate that the discussion of any of the embodiments above is merely exemplary and is not intended to suggest that the scope of the application (including the claims) is limited to these examples. Within the idea of the present application, the technical features of the above embodiments or of different embodiments may be combined, the steps may be implemented in any order, and there are many other variations of the different aspects of the embodiments of the present application as described above, which are not provided in detail for the sake of brevity.
Additionally, well-known power/ground connections to Integrated Circuit (IC) chips and other components may or may not be shown within the provided figures, in order to simplify the illustration and discussion, and so as not to obscure the embodiments of the present application. Furthermore, the devices may be shown in block diagram form in order to avoid obscuring the embodiments of the present application, and this also takes into account the fact that specifics with respect to implementation of such block diagram devices are highly dependent upon the platform on which the embodiments of the present application are to be implemented (i.e., such specifics should be well within purview of one skilled in the art). Where specific details (e.g., circuits) are set forth in order to describe example embodiments of the application, it should be apparent to one skilled in the art that embodiments of the application can be practiced without, or with variation of, these specific details. Accordingly, the description is to be regarded as illustrative in nature and not as restrictive.
While the present application has been described in conjunction with specific embodiments thereof, many alternatives, modifications, and variations of those embodiments will be apparent to those skilled in the art in light of the foregoing description. For example, other memory architectures (e.g., dynamic RAM (DRAM)) may use the embodiments discussed.
The present embodiments are intended to embrace all such alternatives, modifications, and variations as fall within the broad scope of the appended claims. Accordingly, any omissions, modifications, equivalents, and improvements within the spirit and principles of the embodiments are intended to be included within the scope of the present application.

Claims (15)

1. A display method of a 3D display device, the 3D display device comprising a display screen and a grating structure, the display screen comprising a plurality of sub-pixels, the grating structure comprising a plurality of grating units arranged in sequence, the method comprising:
acquiring the binocular positions of a viewer facing the display screen, wherein the binocular positions are the spatial positions of the viewer's two eyes relative to the display screen;
judging whether the binocular positions fall into an optimal viewing area of the naked eye 3D visual area;
and if the binocular positions do not fall into the optimal viewing area, adjusting the sub-pixel lighting area corresponding to the grating unit and/or the position of the grating unit, so as to adjust the spatial position of the optimal viewing area.
2. The display method according to claim 1, wherein, if the binocular positions do not fall into the optimal viewing area, adjusting the sub-pixel lighting area corresponding to the grating unit and/or the position of the grating unit comprises:
determining a central grating unit and lateral grating units based on the binocular positions, and determining a target pixel boundary position of each lateral grating unit;
if the orthographic projection of the target pixel boundary position of a lateral grating unit on the display screen is located outside the current sub-pixel lighting area of that lateral grating unit, determining a deviation amount and a deviation direction of the lateral grating unit based on the target pixel boundary position and the current pixel boundary position;
and adjusting the sub-pixel lighting area corresponding to the grating unit and/or the position of the grating unit based on the deviation amount and the deviation direction.
3. The display method according to claim 2, wherein the determining a central grating unit and lateral grating units based on the binocular positions comprises:
determining a binocular central point position of the viewer based on the binocular positions;
and if the orthographic projection of a grating unit on the display screen covers the orthographic projection of the binocular central point position on the display screen, determining that grating unit as the central grating unit, and determining the grating units other than the central grating unit among all the grating units as lateral grating units.
4. The display method according to claim 3, wherein the determining the target pixel boundary position of each of the lateral grating units comprises:
establishing a coordinate system in a first extending direction perpendicular to a plane of the display screen and a second extending direction parallel to the plane of the display screen;
and determining the target pixel boundary position of each lateral grating unit based on the binocular central point position, the coordinate system, and a preset maximum human eye viewing angle.
5. The display method according to claim 2, wherein adjusting the sub-pixel lighting area corresponding to the grating unit based on the deviation amount and the deviation direction comprises:
determining an actual adjustment number of the sub-pixels based on the deviation amount and a preset minimum pixel adjustment width;
and adjusting the sub-pixel lighting area corresponding to the grating unit based on the actual adjustment number of the sub-pixels and the deviation direction.
6. The display method according to claim 5, wherein adjusting the sub-pixel lighting area corresponding to the grating unit based on the actual adjustment number of the sub-pixels and the deviation direction comprises:
and translating the sub-pixel lighting areas corresponding to the grating units from the current lighting areas along the deviation direction according to the actual adjustment number of the sub-pixels so as to adjust the sub-pixel lighting areas corresponding to the grating units.
7. The display method according to claim 2, wherein the grating structure is an electrically driven liquid crystal lens including a plurality of electrodes arranged in parallel, and the grating unit is a lens region formed by a plurality of adjacent electrodes;
the adjusting the position of the grating unit based on the deviation amount and the deviation direction includes:
determining the actual adjustment number of the electrodes based on the deviation amount and the preset minimum electrode adjustment width;
and adjusting the position of the grating unit based on the actual adjustment number of the electrodes and the deviation direction.
8. The display method according to claim 7, wherein the adjusting the position of the grating unit based on the actual adjustment number of the electrodes and the deviation direction comprises:
and translating the lens area of the grating unit from the current lens area according to the actual adjustment number of the electrodes along the deviation direction so as to adjust the position of the grating unit.
9. The display method according to claim 1, wherein the adjusting the sub-pixel lighting area corresponding to the grating unit and the position of the grating unit comprises:
after adjusting the sub-pixel lighting areas corresponding to the grating units, continuing to acquire the binocular positions of the viewer facing the display screen;
judging whether the binocular positions fall into the optimal viewing area of the naked eye 3D visual area;
and if the binocular positions do not fall into the optimal viewing area, adjusting the positions of the grating units.
10. A 3D display device, comprising:
a display screen including a plurality of subpixels;
the grating structure comprises a plurality of grating units which are sequentially arranged;
a human eye tracker, used for acquiring the binocular positions of a viewer facing the display screen, wherein the binocular positions are the spatial positions of the viewer's two eyes relative to the display screen, and for judging whether the binocular positions fall into an optimal viewing area of the naked eye 3D visual area;
and an image processor, used for adjusting the sub-pixel lighting area corresponding to the grating unit and/or the position of the grating unit when the human eye tracker determines that the binocular positions do not fall into the optimal viewing area, so as to adjust the spatial position of the optimal viewing area.
11. The 3D display device of claim 10, wherein the image processor comprises a target pixel boundary determining module, a judging module, a deviation determining module, and an adjusting module;
the target pixel boundary determining module is used for determining a central grating unit and lateral grating units based on the binocular positions and determining the target pixel boundary position of each lateral grating unit;
the judging module is used for judging whether the orthographic projection of the target pixel boundary position of the lateral grating unit on the display screen is positioned outside the current sub-pixel lighting area of the lateral grating unit;
if the orthographic projection of the target pixel boundary position of the lateral grating unit on the display screen is located outside the current sub-pixel lighting area of the lateral grating unit, the deviation determining module is used for determining the deviation amount and the deviation direction of each lateral grating unit based on the target pixel boundary position and the current pixel boundary position;
The adjusting module is used for adjusting the sub-pixel lighting area corresponding to the grating unit and/or the position of the grating unit.
12. The 3D display device of claim 11, wherein the adjustment module comprises a pixel adjustment unit;
the pixel adjusting unit is used for determining the actual adjustment number of the sub-pixels based on the deviation amount and a preset minimum pixel adjustment width; and for translating the sub-pixel lighting areas corresponding to the grating units from the current lighting areas along the deviation direction according to the actual adjustment number of the sub-pixels, so as to adjust the sub-pixel lighting areas corresponding to the grating units.
13. The 3D display device of claim 11, wherein the grating structure is an electrically driven liquid crystal lens comprising a plurality of electrodes arranged side by side, the grating unit being a lens region formed by a plurality of adjacent electrodes;
the adjusting module further comprises an electrode adjusting unit;
the electrode adjusting unit is used for determining the actual electrode adjusting number based on the deviation and the preset minimum electrode adjusting width; and translating the lens area of the grating unit from the current lens area according to the actual adjustment number of the electrodes along the deviation direction so as to adjust the position of the grating unit.
14. An electronic device comprising a memory, a processor, and a computer program stored on the memory and executable on the processor, wherein the processor implements the method of any one of claims 1 to 9 when executing the program.
15. A computer readable storage medium storing computer instructions for causing a computer to perform the method of any one of claims 1 to 9.
CN202410101544.0A 2024-01-24 2024-01-24 3D display device, display method thereof, electronic device and storage medium Pending CN117880484A (en)

Publications (1)

Publication Number Publication Date
CN117880484A (en) 2024-04-12


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination