Detailed Description
The invention is described in further detail below with reference to the drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the invention and are not limiting thereof. It should be further noted that, for convenience of description, only some, but not all of the structures related to the present invention are shown in the drawings.
Fig. 1a is a flowchart of a naked eye 3D display method based on an OLED display screen. The method may be performed by a naked eye 3D display device based on an OLED display screen; the device may be implemented in software and/or hardware and may be configured in an electronic device for naked eye 3D display on the OLED display screen. The method applies to scenarios in which an OLED display screen is used for naked eye 3D display. As shown in fig. 1a, the technical solution provided in the embodiment of the present invention specifically includes:
S110, determining the pixel structure type of a pixel, and determining the sub-coordinates of each sub-pixel in the pixel based on the pixel structure type.
The OLED display screen is provided with at least two types of pixel structures, and each type of pixel structure is provided with corresponding sub-pixel arrangement characteristics.
Illustratively, as shown in fig. 1b, the pixel area of each pixel in the display screen is a square of side length 1. Each pixel is composed of the three primary colors red (R), green (G), and blue (B), and each color component of a pixel is a sub-pixel (each dot in fig. 1b). The OLED display screen comprises two types of pixel structures, and each type of pixel structure has corresponding sub-pixel arrangement characteristics. In the present scheme, a plane coordinate system can be built by taking the top left vertex of the OLED display screen as the origin, with the row direction of the screen as the x axis and the column direction of the screen as the y axis; the row serial number and the column serial number of each pixel in the screen serve respectively as the pixel abscissa and the pixel ordinate, and the pixel structure type of each pixel is then determined from the parity characteristics of its pixel abscissa and pixel ordinate. For example, if the sum of the row serial number and the column serial number is even, the sub-pixel arrangement characteristics corresponding to the pixel structure type are as shown in fig. 1c; if the sum is odd, they are as shown in fig. 1d. The scheme can further determine the sub-coordinates of each sub-pixel in the pixel according to the pixel structure type, where each sub-coordinate likewise includes an abscissa and an ordinate. It should be noted that, in the embodiment of the present invention, the coordinates of all pixels and the sub-coordinates of all sub-pixels are pixel serial number coordinates.
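As a hedged illustration of the parity rule described above (the function name and the string labels are assumptions for illustration, not terms from the specification), the pixel structure type can be classified from the row and column serial numbers:

```python
def pixel_structure_type(row: int, col: int) -> str:
    """Classify a pixel by the parity of (row serial number + column serial number).

    'even' corresponds to the arrangement of fig. 1c, 'odd' to that of fig. 1d.
    """
    return "even" if (row + col) % 2 == 0 else "odd"

# A pixel in row 2, column 3 has an odd serial-number sum, so it follows fig. 1d.
print(pixel_structure_type(2, 3))  # prints "odd"
```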
In this embodiment, optionally, the at least two types of pixel structures are disposed at periodic intervals in the display screen.
For example, as shown in fig. 1b, two types of pixel structures in the OLED display screen are arranged at periodic intervals.
Therefore, the pixel structures of at least two types are arranged at periodic intervals in the display screen, so that the pixel structure types can be determined according to the distribution characteristics of the pixel structure types in the screen, the sub-coordinates of each sub-pixel in the pixel can be determined, and reliable data support is provided for subsequent steps.
And S120, respectively determining a view number matched with each sub-pixel and an offset of each sub-pixel relative to the screen according to each sub-coordinate, the width of the lenticular lens, and the relative position information of the lenticular lens relative to the screen.
Specifically, one lenticular lens corresponds to one group of view images, and the view images in each group are numbered sequentially starting from 1. The number of view groups and the number of views in each group can be set according to the actual requirements of the display screen. The relative position information of the lenticular lens with respect to the screen may be the offset angle between the main axis direction of the lenticular lens element and the pixel column direction of the display screen. The offset of a sub-pixel relative to the screen may be the specific location of the sub-pixel on the screen; the offset of each sub-pixel relative to the screen is unique. The present scheme can respectively determine the view number matched with each sub-pixel and the offset of each sub-pixel relative to the screen according to each sub-coordinate, the width of the lenticular lens, and the relative position information of the lenticular lens relative to the screen. By determining the view number matched with each sub-pixel, it can be determined which view the sub-pixel is taken from; by determining the offset of each sub-pixel relative to the screen, it can be determined which group of views the sub-pixel is taken from and its specific position within that group.
And S130, displaying each sub-pixel correspondingly according to the view number and the offset.
In the present scheme, the pixel value at the corresponding position in the matched view image can be determined according to the view number matched with each sub-pixel and the offset of each sub-pixel relative to the screen; displaying each sub-pixel with that pixel value yields the naked eye 3D display effect on the OLED display screen.
According to the technical scheme provided by the embodiment of the invention, the pixel structure type of the pixel is determined, and the sub-coordinates of each sub-pixel in the pixel are determined based on the pixel structure type; a view number matched with each sub-pixel and an offset of each sub-pixel relative to the screen are respectively determined according to each sub-coordinate, the width of the lenticular lens, and the relative position information of the lenticular lens relative to the screen; and each sub-pixel is displayed correspondingly according to the view number and the offset. By executing the scheme, a normal naked eye 3D display interface and display effect can be presented on the OLED display screen, and the advantages of the OLED display screen can be fully exploited.
Fig. 2 is a flowchart of a naked eye 3D display method based on an OLED display screen according to an embodiment of the present invention, where the embodiment is optimized based on the foregoing embodiment. As shown in fig. 2, the naked eye 3D display method based on the OLED display screen in the embodiment of the present invention may include:
And S210, determining the pixel structure type of the pixel based on the periodic position of the pixel in the display screen, and determining the sub-coordinates of each sub-pixel in the pixel based on the pixel structure type.
The periodic position may be a row number or a column number where the pixel is located within the display screen. For example, as shown in FIG. 1b, a pixel is located in the second row and the third column of the display screen. Because the pixel structure types of the display screen are arranged at periodic intervals, the scheme can determine the pixel structure types of the pixels based on the periodic positions of the pixels in the display screen. For example, the present solution may determine the type of pixel structure of the pixels located in the third column of the second row of the display screen as shown in fig. 1 d.
In a possible implementation manner, optionally, the determining the pixel structure type of the pixel based on the periodic position of the pixel in the display screen includes: and determining the pixel structure type of the pixel based on the row period position and the column period position of the pixel in the display screen.
Specifically, assuming that the screen resolution of the OLED display screen is m×n, it is considered that the screen may be divided into m rows and n columns, and a position where each pixel in the screen is located may be marked as (r, s). Wherein r is more than or equal to 1 and less than or equal to m, s is more than or equal to 1 and less than or equal to n, r is the row period position of the pixel in the display screen, and s is the column period position of the pixel in the display screen. Because the row period position and the column period position of the pixel in the display screen are associated with the pixel structure type of the pixel, the scheme can determine the pixel structure type of the pixel based on the row period position and the column period position of the pixel in the display screen.
Therefore, the pixel structure type of the pixel is determined based on the row period position and the column period position of the pixel in the display screen, so that the distribution characteristics of the pixel structure type of the OLED display screen can be fully utilized, the pixel structure type can be accurately determined, and reliable data support is provided for subsequent steps.
In another possible implementation manner, optionally, determining the pixel structure type of the pixel based on the row period position and the column period position of the pixel on the display screen includes: and determining the pixel structure type of the pixel based on whether the sum of the row serial number and the column serial number of the pixel in the display screen is an odd number.
For example, as shown in fig. 1b and fig. 1d, if the sum of the row serial number and the column serial number of the pixel in the display screen is odd, the pixel structure type of the pixel may be determined as follows: the sub-pixel B is located at the upper right vertex (p2) of the square pixel, the sub-pixel R is located at the lower right vertex (p0) of the square pixel, and the sub-pixel G is located at the center point (p1) of the square pixel.
Thus, the pixel structure type of the pixel is determined by whether the sum of the row number and the column number where the pixel is located within the display screen is odd. The pixel structure type distribution characteristics of the OLED display screen can be fully utilized, the pixel structure type can be accurately determined, and reliable data support is provided for subsequent steps.
In yet another possible implementation manner, optionally, determining the pixel structure type of the pixel based on the row period position and the column period position of the pixel on the display screen includes: and determining the pixel structure type of the pixel based on whether the sum of the row serial number and the column serial number of the pixel in the display screen is even.
For example, as shown in fig. 1b and fig. 1c, if the sum of the row serial number and the column serial number of the pixel in the display screen is even, the pixel structure type of the pixel may be determined as follows: the sub-pixel R is located at the upper right vertex (p0) of the square pixel, the sub-pixel B is located at the lower right vertex (p2) of the square pixel, and the sub-pixel G is again located at the center point (p1) of the square pixel.
Thus, the pixel structure type of the pixel is determined by whether the sum of the row number and the column number where the pixel is located within the display screen is an even number. The pixel structure type distribution characteristics of the OLED display screen can be fully utilized, the pixel structure type can be accurately determined, and reliable data support is provided for subsequent steps.
In this embodiment, optionally, determining the sub-coordinates of each sub-pixel in the pixel based on the pixel structure type includes: determining a subpixel arrangement characteristic of the pixel based on the pixel structure type; determining coordinates of the pixel according to the periodic position of the pixel in the display screen; determining a pixel region of the pixel according to the coordinates of the pixel; and determining the sub-coordinates of each sub-pixel in the pixel according to the pixel area of the pixel and the sub-pixel arrangement characteristics of the pixel.
The position of the coordinate point of a pixel within its square pixel area may be preset; for example, the coordinate point may be located at the upper left vertex of the corresponding square pixel, at its center point, or at another position within the square pixel. In the present scheme, the sub-pixel arrangement characteristics of the pixel are determined according to the pixel structure type, the specific coordinates of the pixel are determined according to its row serial number and column serial number in the display screen, and the pixel area of the pixel is then determined according to those coordinates. The sub-coordinates of each sub-pixel in the pixel are determined according to the pixel area, the sub-pixel arrangement characteristics of the pixel, and the position of the coordinate point within the pixel area.
Thereby, the sub-pixel arrangement characteristics of the pixel are determined based on the pixel structure type; the coordinates of the pixel are determined according to the periodic position of the pixel in the display screen; the pixel area of the pixel is determined according to the coordinates of the pixel; and the sub-coordinates of each sub-pixel in the pixel are determined according to the pixel area and the sub-pixel arrangement characteristics of the pixel. The position of each sub-pixel in the pixel can thus be determined, providing a reliable and accurate data source for realizing naked eye 3D display on the OLED display screen.
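The sub-coordinate derivation above can be sketched as follows, assuming (as in the later worked examples) that the pixel coordinate sits at the upper left vertex of its 1×1 square area; the function name and dictionary layout are illustrative assumptions:

```python
def sub_coordinates(x: int, y: int) -> dict:
    """Return the R/G/B sub-coordinates of the pixel whose top-left vertex is (x, y)."""
    if (x + y) % 2 == 0:
        # fig. 1c arrangement: R upper right, G center, B lower right
        return {"R": (x + 1, y), "G": (x + 0.5, y + 0.5), "B": (x + 1, y + 1)}
    # fig. 1d arrangement: R lower right, G center, B upper right
    return {"R": (x + 1, y + 1), "G": (x + 0.5, y + 0.5), "B": (x + 1, y)}
```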
And S220, respectively determining the view number matched with each sub-pixel and the offset of each sub-pixel relative to the screen according to each sub-coordinate, the width of the lenticular lens, and the relative position information of the lenticular lens relative to the screen.
And S230, displaying the sub-pixels correspondingly according to the view numbers and the offset.
In a possible implementation manner, optionally, displaying each sub-pixel correspondingly according to the view number and the offset includes: determining a target view matched with the sub-pixel according to the view number and the number of views matched with the lenticular lens; determining a target sub-pixel according to the target view and the offset; and displaying the sub-pixel correspondingly by adopting the value of the target sub-pixel.
The number of views may be 8 or 28, and can be set according to actual needs. In the present scheme, the specific view matched with the lenticular lens, namely the target view, can be determined according to the view number and the number of views matched with the lenticular lens. For example, if the view number is 5 and the number of views in the view group corresponding to each lenticular lens is 28, the target view is the 5th view. The pixel corresponding to the sub-pixel within the target view, and its pixel value, can then be determined according to the offset of the sub-pixel relative to the screen, and that pixel value is assigned to the sub-pixel so that the sub-pixel can be displayed.
Thereby, the target view matched with the sub-pixel is determined according to the view number and the number of views matched with the lenticular lens; the target sub-pixel is determined according to the target view and the offset; and the sub-pixel is displayed correspondingly by adopting the value of the target sub-pixel. A normal naked eye 3D display interface and display effect can thus be presented on the OLED display screen, and the advantages of the OLED display screen can be fully exploited.
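A minimal sketch of this display step, assuming the views are stored as row-major 2-D arrays the same size as the screen; the helper name and storage layout are assumptions for illustration:

```python
def sample_subpixel(views, view_no, offset_x, offset_y):
    """Pick the target view by view number, then read the value at the
    sub-pixel's relative screen position (offsets are fractions of the screen)."""
    target = views[int(view_no) % len(views)]   # target view by number
    h, w = len(target), len(target[0])
    row = min(int(offset_y * h), h - 1)         # map relative offsets to indices
    col = min(int(offset_x * w), w - 1)
    return target[row][col]
```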
According to the technical scheme provided by the embodiment of the invention, the pixel structure type of the pixel is determined based on the periodic position of the pixel in the display screen, the sub-coordinates of each sub-pixel in the pixel are determined based on the pixel structure type, the view number matched with each sub-pixel and the offset of each sub-pixel relative to the screen are respectively determined according to each sub-coordinate, the width of the lenticular lens, and the relative position information of the lenticular lens relative to the screen, and each sub-pixel is displayed correspondingly according to the view number and the offset. By executing the scheme, a normal naked eye 3D display interface and display effect can be presented on the OLED display screen, and the advantages of the OLED display screen can be fully exploited.
Fig. 3a is a flowchart of a naked eye 3D display method based on an OLED display screen according to an embodiment of the present invention, where the embodiment is optimized based on the foregoing embodiment. As shown in fig. 3a, the naked eye 3D display method based on the OLED display screen in the embodiment of the present invention may include:
S310, determining the pixel structure type of the pixel, and determining the sub-coordinates of each sub-pixel in the pixel based on the pixel structure type.
And S320, respectively determining the view number matched with each sub-pixel according to each sub-coordinate, the center point of the screen, the width of the lenticular lens, and the offset angle of the main axis direction of the lenticular lens relative to the column direction of the screen pixels.
The lenticular lens of the grating is obliquely arranged on the display screen, so that a certain included angle is formed between the lenticular lens and the display screen, namely the offset angle of the main axis direction of the lenticular lens relative to the direction of the screen pixel columns. The offset angle may be 20 degrees or 30 degrees, and can be set according to actual needs. The screen center point is the center position of the display screen.
In this embodiment, optionally, the view number matching the sub-pixel is determined based on the following formula:
no=([(x1-x0)+(y0-y1)×tan(a)]/pitch)×number%number;
Where no represents the view number matched with the sub-pixel, x1 represents the abscissa of the sub-coordinate, y1 represents the ordinate of the sub-coordinate, x0 represents the abscissa of the screen center point pixel, y0 represents the ordinate of the screen center point pixel, pitch represents the width of the lenticular lens, a represents the offset angle of the main axis direction of the lenticular lens with respect to the column direction of the screen pixel, and number represents the number of views matched with the lenticular lens.
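A direct transcription of this formula as a hedged sketch (parameter names follow the text; treating the angle a as given in degrees is an assumption for illustration):

```python
import math

def view_number(x1, y1, x0, y0, pitch, a, number):
    """no = ([(x1-x0)+(y0-y1)*tan(a)]/pitch)*number % number, with a in degrees."""
    shift = (x1 - x0) + (y0 - y1) * math.tan(math.radians(a))
    return (shift / pitch * number) % number
```

In Python, the `%` operator on floats already returns a value in [0, number), so sub-pixels left of the reference point (negative shift) still map to a valid view number.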
As shown in fig. 3b, assume that the number of views in a group of view images matched with a lenticular lens is 28, the pixel serial number coordinates of the pixel O at the center point of the screen are (960, 540), the coordinates of a pixel p in the display screen are (x, y), and the lenticular lens width is the number of pixels covered in the width direction by one lenticular lens period, determined from the physical width of the lenticular lens and the pixel pitch. Then, in connection with figs. 1b, 1c, and 1d, the view numbers matched with the sub-coordinates of the respective sub-pixels can be determined under each sub-pixel arrangement characteristic. As shown in figs. 1c and 3c, when the abscissa of the pixel plus the ordinate of the pixel is even, the sub-coordinates of the three sub-pixels R, G, and B are respectively: p0(p.x+1, p.y), p1(p.x+1/2, p.y+1/2), p2(p.x+1, p.y+1); the view numbers matched with the R, G, and B sub-pixels are respectively:
p0_no=([PA+AB]/pitch)×28%28=([(x+1-960)+(540-y)×tan(a)]/pitch)×28%28;
p1_no=([PA+AB]/pitch)×28%28=([(x+1/2-960)+(540-(y+1/2))×tan(a)]/pitch)×28%28;
p2_no=([PA+AB]/pitch)×28%28=([(x+1-960)+(540-(y+1))×tan(a)]/pitch)×28%28。
As shown in fig. 1d, when the abscissa of the pixel plus the ordinate of the pixel is odd, the sub-coordinates of the three sub-pixels R, G, and B are respectively: p0(p.x+1, p.y+1), p1(p.x+1/2, p.y+1/2), p2(p.x+1, p.y); the view numbers matched with the R, G, and B sub-pixels are respectively:
p0_no=([PA+AB]/pitch)×28%28=([(x+1-960)+(540-(y+1))×tan(a)]/pitch)×28%28;
p1_no=([PA+AB]/pitch)×28%28=([(x+1/2-960)+(540-(y+1/2))×tan(a)]/pitch)×28%28;
p2_no=([PA+AB]/pitch)×28%28=([(x+1-960)+(540-y)×tan(a)]/pitch)×28%28。
Therefore, the view number matched with the sub-pixel is determined, so that the view matched with the sub-pixel can be determined, and an accurate pixel source can be provided for realizing naked eye 3D display of the OLED display screen.
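Under the 28-view example with reference point (960, 540), the six expressions above can be sketched in one helper covering both parities (all names are illustrative; the angle a is assumed to be in degrees):

```python
import math

def pixel_view_numbers(x, y, pitch, a, number=28):
    """Return (p0_no, p1_no, p2_no) for the pixel with top-left vertex (x, y)."""
    even = (x + y) % 2 == 0
    subs = [(x + 1, y if even else y + 1),      # p0
            (x + 0.5, y + 0.5),                 # p1 (center, same in both cases)
            (x + 1, y + 1 if even else y)]      # p2
    t = math.tan(math.radians(a))
    return tuple((((sx - 960) + (540 - sy) * t) / pitch * number) % number
                 for sx, sy in subs)
```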
S330, respectively determining the offset of each sub-pixel relative to the screen according to each sub-coordinate and the screen parameter.
Wherein the screen parameters include: screen height and screen length.
Specifically, as shown in fig. 3b, the screen height is the screen height in the column (i.e., y-axis) direction of the OLED display screen, and the screen length is the screen length in the row (i.e., x-axis) direction of the OLED display screen. The unit of the screen length and the screen height may be a length unit or a pixel unit. After determining the sub-coordinates of each sub-pixel under the arrangement characteristics of different sub-pixels, the scheme can determine the offset of the sub-pixel relative to the screen, namely the specific position of the sub-pixel on the OLED display screen according to the sub-coordinates of the sub-pixel, the screen height and the screen length of the OLED display screen.
In this embodiment, optionally, the determining the offset of the sub-pixel with respect to the screen includes: determining the lateral offset of the sub-pixel according to the abscissa of the sub-coordinate and the screen length; determining the longitudinal offset of the sub-pixel according to the ordinate of the sub-coordinate and the screen height; and determining the offset of the sub-pixel relative to a screen according to the transverse offset and the longitudinal offset.
Specifically, if the OLED display screen has a screen height of 1080 and a screen length of 1920, and the sub-coordinates of a sub-pixel are (192, 108), then the lateral offset of the abscissa relative to the screen length is 192/1920=0.1 and the longitudinal offset of the ordinate relative to the screen height is 108/1080=0.1. From the determined lateral offset and longitudinal offset, the offset of the sub-pixel with sub-coordinates (192, 108) relative to the screen is 10% of the screen length and 10% of the screen height.
Thereby, the lateral offset of the sub-pixel is determined according to the abscissa of the sub-coordinate and the screen length; the longitudinal offset of the sub-pixel is determined according to the ordinate of the sub-coordinate and the screen height; and the offset of the sub-pixel relative to the screen is determined according to the lateral offset and the longitudinal offset. The specific pixel matched with the sub-pixel in the view image can then be determined according to the interlacing rule of the display screen and the offset relative to the screen, and a normal naked eye 3D display interface and display effect can be presented on the OLED display screen.
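The worked example above, as a minimal sketch (the 1920×1080 defaults match the example and are assumptions for any other screen):

```python
def subpixel_offset(sub_x, sub_y, screen_length=1920, screen_height=1080):
    """Lateral and longitudinal offsets as fractions of the screen size."""
    return sub_x / screen_length, sub_y / screen_height

# Sub-coordinates (192, 108) sit 10% along the length and 10% down the height.
print(subpixel_offset(192, 108))  # prints (0.1, 0.1)
```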
And S340, displaying the sub-pixels correspondingly according to the view numbers and the offset.
According to the technical scheme provided by the embodiment of the invention, the pixel structure type of the pixel is determined, the sub-coordinates of each sub-pixel in the pixel are determined based on the pixel structure type, the view number matched with each sub-pixel is respectively determined according to each sub-coordinate, the center point of the screen, the width of the lenticular lens, and the offset angle of the main axis direction of the lenticular lens relative to the column direction of the screen pixels, the offset of each sub-pixel relative to the screen is respectively determined according to each sub-coordinate and the screen parameters, and each sub-pixel is displayed correspondingly according to the view number and the offset. By executing the scheme, a normal naked eye 3D display interface and display effect can be presented on the OLED display screen, and the advantages of the OLED display screen can be fully exploited.
In order to clearly describe the technical scheme of the invention, the technical scheme provided by the embodiment of the invention can comprise the following steps:
Step 1, obtaining the lenticular lens parameters, the number of views N matched with the lenticular lens, and the relative position information of the lenticular lens element relative to the display screen pixels.
Wherein the number of views N may be 28. The relative position of the lenticular lens element with respect to the display screen pixels may be the included angle a between the main axis direction of the lenticular lens element and the pixel column direction of the display screen.
And 2, determining sub-coordinates of sub-pixels in each pixel based on arrangement characteristics of pixels of the OLED screen, and determining a view number corresponding to each sub-pixel based on a predetermined algorithm.
The pixel arrangement of the OLED screen exhibits several characteristic cases; for example, the cases in which the sum of the pixel abscissa and the pixel ordinate is odd or even each correspond to a distinct spatial distribution of the R, G, and B sub-pixels. The sub-coordinates of the R, G, and B sub-pixels under each arrangement characteristic can be determined accordingly.
And 3, extracting pixel data at corresponding positions from view data of the view numbers corresponding to each sub-pixel for display.
For example, as shown in fig. 3b, the number of views N is taken as 28 (N can be set according to actual needs), and the OLED display screen has two types of pixel structures, corresponding to the cases in which the sum of the pixel abscissa and the pixel ordinate is odd or even. There may also be more pixel structure types, such as 3 or 6.
Fig. 1b illustrates a block of an OLED display screen of 6 columns × 4 rows = 24 pixels. In any two adjacent pixels, the R, G, and B sub-pixels are distributed non-uniformly in space. There are two common cases: the sum of the pixel abscissa and the pixel ordinate is even (as shown in fig. 1c), and the sum is odd (as shown in fig. 1d). The block contains only 24 + 15 + 15 = 54 physical sub-pixels, because the red and blue sub-pixels are shared between adjacent rows.
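The 24 + 15 + 15 = 54 count can be checked by enumerating the distinct physical sub-pixel positions over the 6 × 4 block; the coordinate convention below follows the later assumption that a pixel's coordinate is its upper left vertex, and the function name is illustrative:

```python
def count_physical_subpixels(cols=6, rows=4):
    """Count distinct R, G, B sub-pixel positions; R and B vertices are shared
    between vertically adjacent pixels, so they number fewer than the pixels."""
    R, G, B = set(), set(), set()
    for y in range(rows):
        for x in range(cols):
            even = (x + y) % 2 == 0
            R.add((x + 1, y if even else y + 1))
            G.add((x + 0.5, y + 0.5))   # every pixel has its own green center
            B.add((x + 1, y + 1 if even else y))
    return len(R), len(G), len(B)

print(count_physical_subpixels())  # prints (15, 24, 15): 24 + 15 + 15 = 54
```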
As shown in fig. 3c, the pixel coordinate p(x, y) is assumed to be located at the upper left vertex of the square pixel corresponding to the pixel (note: the coordinate point can in theory be assumed to be at any position within the 1×1 square pixel). When the abscissa of the pixel plus the ordinate of the pixel is even, the structure is as shown in fig. 1c, and the sub-coordinates of the R, G, and B sub-pixels are respectively: p0(p.x+1, p.y), p1(p.x+1/2, p.y+1/2), p2(p.x+1, p.y+1). When the abscissa of the pixel plus the ordinate of the pixel is odd, the structure is as shown in fig. 1d, and the sub-coordinates of the R, G, and B sub-pixels are respectively: p0(p.x+1, p.y+1), p1(p.x+1/2, p.y+1/2), p2(p.x+1, p.y). With the lenticular strip divided equally into 28 views, the view number of each sub-coordinate is calculated as follows when the abscissa of the pixel plus the ordinate of the pixel is even:
p0_no=([PA+AB]/pitch)×28%28=([(x+1-960)+(540-y)×tan(a)]/pitch)×28%28;
p1_no=([PA+AB]/pitch)×28%28=([(x+1/2-960)+(540-(y+1/2))×tan(a)]/pitch)×28%28;
p2_no=([PA+AB]/pitch)×28%28=([(x+1-960)+(540-(y+1))×tan(a)]/pitch)×28%28。
when the abscissa of the pixel plus the ordinate of the pixel is odd:
p0_no=([PA+AB]/pitch)×28%28=([(x+1-960)+(540-(y+1))×tan(a)]/pitch)×28%28;
p1_no=([PA+AB]/pitch)×28%28=([(x+1/2-960)+(540-(y+1/2))×tan(a)]/pitch)×28%28;
p2_no=([PA+AB]/pitch)×28%28=([(x+1-960)+(540-y)×tan(a)]/pitch)×28%28。
Note that % denotes the remainder operation, and the lenticular lens width pitch is the number of pixels covered in the width direction by one lens period, determined from the ratio of the physical width of the lenticular lens to the pixel pitch. The pixel coordinate O(960, 540) of the screen center point is an example of an interlacing reference point; in practice, other points at the boundary of the interlacing period may be selected as the reference point, although selecting the screen center point is the preferred manner. Once the view number of a sub-coordinate and its offset within the screen (relative coordinates: the abscissa divided by the screen length and the ordinate divided by the screen height) are determined, the value at the corresponding position in the corresponding view can be computed proportionally and filled in, completing the map arranging (interlacing) work.
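Putting the steps together, a hedged end-to-end sketch of the interlacing loop (the screen size, view storage as row-major arrays of [R, G, B] triples, and all names are assumptions; the reference point is taken as the screen center, as preferred above):

```python
import math

def interlace(views, pitch, a, width, height):
    """Fill an output frame by assigning, per sub-pixel, the value sampled from
    its matched view at its position on the screen. a is in degrees."""
    number = len(views)
    x0, y0 = width / 2, height / 2               # interlacing reference point
    t = math.tan(math.radians(a))
    out = [[[0, 0, 0] for _ in range(width)] for _ in range(height)]
    for y in range(height):
        for x in range(width):
            even = (x + y) % 2 == 0
            subs = [(x + 1, y if even else y + 1),   # p0 (R)
                    (x + 0.5, y + 0.5),              # p1 (G)
                    (x + 1, y + 1 if even else y)]   # p2 (B)
            for i, (sx, sy) in enumerate(subs):
                no = (((sx - x0) + (y0 - sy) * t) / pitch * number) % number
                view = views[int(no)]                # matched view
                r = min(int(sy), height - 1)         # sub-coordinate -> indices
                c = min(int(sx), width - 1)
                out[y][x][i] = view[r][c][i]
    return out
```

For a real 1920×1080 panel the double loop would typically be vectorized or run on the GPU; the pure-Python form is only meant to make the per-sub-pixel logic explicit.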
According to the technical scheme provided by the embodiment of the invention, for an OLED screen, the sub-coordinates of each sub-pixel are calculated according to the pixel structure characteristics of the OLED screen, and the map arranging function is then realized for each sub-pixel based on the calculated sub-coordinates, thereby realizing naked eye 3D map arranging on an unconventional OLED screen.
Fig. 4 is a schematic structural diagram of an OLED display screen-based naked eye 3D display device provided by an embodiment of the present invention, where the device may be configured in an electronic apparatus for naked eye 3D display of an OLED display screen, as shown in fig. 4, and the device includes:
A sub-coordinate determining module 410, configured to determine a pixel structure type of a pixel, and determine sub-coordinates of each sub-pixel in the pixel based on the pixel structure type;
An offset determining module 420, configured to determine, according to each of the sub-coordinates, the width of the lenticular lens, and the relative position information of the lenticular lens with respect to the screen, a view number matched with each of the sub-pixels and an offset of each of the sub-pixels with respect to the screen, respectively;
And a display module 430, configured to display each sub-pixel correspondingly according to the view number and the offset.
Optionally, the at least two types of pixel structures are arranged at periodic intervals in the display screen.
Optionally, the sub-coordinate determining module 410 is specifically configured to: the pixel structure type of the pixel is determined based on the periodic location of the pixel within the display screen.
Optionally, the sub-coordinate determining module 410 is specifically configured to: determine the pixel structure type of the pixel based on the row period position and the column period position of the pixel in the display screen.
Optionally, the sub-coordinate determining module 410 is specifically configured to: determine the pixel structure type of the pixel based on whether the sum of the row serial number and the column serial number of the pixel in the display screen is odd.
Optionally, the sub-coordinate determining module 410 is specifically configured to: determine the pixel structure type of the pixel based on whether the sum of the row serial number and the column serial number of the pixel in the display screen is even.
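The parity-based determination described above can be sketched as follows; a minimal Python illustration (the function name and the 0/1 type labels are hypothetical, not part of the embodiment):

```python
def pixel_structure_type(row: int, col: int) -> int:
    """Return the pixel structure type (0 or 1) of a pixel.

    Pixels whose row and column serial numbers sum to an even number
    use one sub-pixel arrangement; an odd sum selects the other, so
    the two structure types alternate like a checkerboard.
    """
    return (row + col) % 2
```

For example, the pixel at row 2, column 3 has an odd sum and therefore the second structure type, while its horizontal neighbors have the first.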
Optionally, the sub-coordinate determining module 410 includes: a sub-pixel arrangement feature determining unit, configured to determine the sub-pixel arrangement features of the pixel based on the pixel structure type; a pixel coordinate determining unit, configured to determine the coordinates of the pixel according to the periodic position of the pixel in the display screen; a pixel region determining unit, configured to determine the pixel region of the pixel according to the coordinates of the pixel; and a sub-coordinate determining unit, configured to determine the sub-coordinates of each sub-pixel in the pixel according to the pixel region of the pixel and the sub-pixel arrangement features of the pixel.
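The chain of units above (structure type, then pixel region, then sub-coordinates) can be sketched as follows. The concrete sub-pixel offsets within the unit pixel square are placeholders for the arrangements of figs. 1c and 1d, which are not specified numerically in the text; treat them as assumptions.

```python
# Hypothetical sub-pixel offsets, as fractions of the unit pixel square,
# for the two structure types. The real offsets come from the panel's
# arrangements shown in figs. 1c and 1d of the embodiment.
SUB_OFFSETS = {
    0: {"R": (0.25, 0.25), "G": (0.75, 0.25), "B": (0.50, 0.75)},
    1: {"R": (0.75, 0.75), "G": (0.25, 0.75), "B": (0.50, 0.25)},
}

def sub_coordinates(row: int, col: int) -> dict:
    """Sub-coordinates of each sub-pixel in the pixel at (row, col).

    The pixel region is a unit square whose top-left corner sits at
    (col, row) in the pixel serial-number coordinate system; each
    sub-pixel's sub-coordinates are that corner plus the arrangement
    offsets selected by the pixel structure type.
    """
    t = (row + col) % 2  # pixel structure type from parity
    return {color: (col + dx, row + dy)
            for color, (dx, dy) in SUB_OFFSETS[t].items()}
```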
Optionally, the offset determining module 420 includes: a view number determining unit, configured to determine the view number matched with each sub-pixel according to each sub-coordinate, the center point of the screen, the width of the lenticular lens, and the offset angle of the main axis direction of the lenticular lens relative to the column direction of the screen; and an offset determining unit, configured to determine the offset of each sub-pixel relative to the screen according to each sub-coordinate and screen parameters, where the screen parameters include the screen height and the screen length.
Optionally, the offset determining unit includes: a lateral offset determining subunit, configured to determine the lateral offset of the sub-pixel according to the abscissa of the sub-coordinate and the screen length; a longitudinal offset determining subunit, configured to determine the longitudinal offset of the sub-pixel according to the ordinate of the sub-coordinate and the screen height; and an offset determining subunit, configured to determine the offset of the sub-pixel relative to the screen according to the lateral offset and the longitudinal offset.
Optionally, the view number determining unit is specifically configured to determine the view number matched with the sub-pixel based on the following formula: no = ([(x1 - x0) + (y0 - y1) × tan(a)] / pitch) × number % number, where no represents the view number matched with the sub-pixel, x1 represents the abscissa of the sub-coordinate, y1 represents the ordinate of the sub-coordinate, x0 represents the abscissa of the screen center point pixel, y0 represents the ordinate of the screen center point pixel, pitch represents the width of the lenticular lens, a represents the offset angle of the main axis direction of the lenticular lens relative to the column direction of the screen pixels, and number represents the number of views matched with the lenticular lens.
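The view-number formula above can be written directly in code; a short Python transcription (the angle is taken in degrees here for convenience, which is an assumption):

```python
import math

def view_number(x1: float, y1: float, x0: float, y0: float,
                pitch: float, a_deg: float, number: int) -> float:
    """View number matched with a sub-pixel, per
    no = ([(x1 - x0) + (y0 - y1) * tan(a)] / pitch) * number % number.
    """
    a = math.radians(a_deg)  # offset angle of the lens axis vs. column direction
    no = ((x1 - x0) + (y0 - y1) * math.tan(a)) / pitch * number
    return no % number
```

With a zero offset angle, a sub-pixel half a lens pitch to the right of the screen center in an 8-view system lands on view 4.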
Optionally, the display module 430 includes: a target view determining unit, configured to determine the target view matched with the sub-pixel according to the view number and the number of views matched with the lenticular lens; a target sub-pixel determining unit, configured to determine a target sub-pixel according to the target view and the offset; and a display unit, configured to display the sub-pixel correspondingly using the value of the target sub-pixel.
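The display step can be sketched as follows. How the fractional view number is mapped to a target view, and how the offset selects a sample inside that view, are not spelled out in the embodiment, so the rounding and nearest-sample lookup below are illustrative assumptions:

```python
def display_value(views, no: float, offset, number: int):
    """Value shown at a sub-pixel.

    Assumed mapping: truncate the view number to pick the target view,
    then sample that view (a 2D list of values) at the sub-pixel's
    normalized (lateral, longitudinal) offset by nearest index.
    """
    target = int(no) % number          # target view index (assumption: truncation)
    view = views[target]
    u, v = offset                      # lateral, longitudinal in [0, 1]
    h, w = len(view), len(view[0])
    row = min(int(v * h), h - 1)       # nearest-sample lookup
    col = min(int(u * w), w - 1)
    return view[row][col]
```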
The device provided by the embodiment can execute the naked eye 3D display method based on the OLED display screen provided by any embodiment of the invention, and has the corresponding functional modules and beneficial effects of the execution method.
Fig. 5 is a schematic structural diagram of an electronic device according to an embodiment of the present invention, as shown in fig. 5, where the device includes:
One or more processors 510, one processor 510 being illustrated in fig. 5;
a memory 520;
the apparatus may further include: an input device 530 and an output device 540.
The processor 510, the memory 520, the input device 530, and the output device 540 in the device may be connected by a bus or in other ways; connection by a bus is taken as an example in fig. 5.
The memory 520, as a non-transitory computer readable storage medium, is used to store software programs, computer executable programs, and modules, such as the program instructions/modules corresponding to the OLED display screen-based naked eye 3D display method in the embodiments of the present invention. The processor 510 executes the software programs, instructions, and modules stored in the memory 520 to perform various functional applications and data processing of the computer device, thereby implementing the naked eye 3D display method based on an OLED display screen of the above method embodiment, namely:
determining a pixel structure type of a pixel, and determining sub-coordinates of each sub-pixel in the pixel based on the pixel structure type;
respectively determining a view number matched with each sub-pixel and an offset of each sub-pixel relative to the screen according to each sub-coordinate, the width of the lenticular lens, and the relative position information of the lenticular lens with respect to the screen;
and displaying each sub-pixel correspondingly according to the view number and the offset.
Memory 520 may include a storage program area that may store an operating system, at least one application program required for functionality, and a storage data area; the storage data area may store data created according to the use of the computer device, etc. In addition, memory 520 may include high-speed random access memory, and may also include non-transitory memory, such as at least one magnetic disk storage device, flash memory device, or other non-transitory solid state storage device. In some embodiments, memory 520 may optionally include memory located remotely from processor 510, which may be connected to the terminal device via a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The input device 530 may be used to receive entered numeric or character information and to generate key signal inputs related to user settings and function control of the computer device. The output device 540 may include a display device such as a display screen.
The embodiment of the invention provides a computer readable storage medium, on which a computer program is stored, which when executed by a processor, implements a naked eye 3D display method based on an OLED display screen, namely:
determining a pixel structure type of a pixel, and determining sub-coordinates of each sub-pixel in the pixel based on the pixel structure type;
respectively determining a view number matched with each sub-pixel and an offset of each sub-pixel relative to the screen according to each sub-coordinate, the width of the lenticular lens, and the relative position information of the lenticular lens with respect to the screen;
and displaying each sub-pixel correspondingly according to the view number and the offset.
Any combination of one or more computer readable media may be employed. The computer readable medium may be a computer readable signal medium or a computer readable storage medium. The computer readable storage medium can be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a combination of any of the foregoing. More specific examples (a non-exhaustive list) of the computer-readable storage medium would include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In this document, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
The computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, either in baseband or as part of a carrier wave. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination of the foregoing. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Computer program code for carrying out operations of the present invention may be written in any combination of one or more programming languages, including object oriented programming languages such as Java, Smalltalk, and C++, as well as conventional procedural programming languages such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any kind of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or may be connected to an external computer (for example, through the Internet using an Internet service provider).
Note that the above is only a preferred embodiment of the present invention and the technical principle applied. It will be understood by those skilled in the art that the present invention is not limited to the particular embodiments described herein, but is capable of various obvious changes, rearrangements and substitutions as will now become apparent to those skilled in the art without departing from the scope of the invention. Therefore, while the invention has been described in connection with the above embodiments, the invention is not limited to the embodiments, but may be embodied in many other equivalent forms without departing from the spirit or scope of the invention, which is set forth in the following claims.