CN211528831U - Multi-view naked eye 3D display screen and naked eye 3D display terminal - Google Patents


Info

Publication number
CN211528831U
CN211528831U (application CN201922162206.5U)
Authority
CN
China
Prior art keywords
display screen
naked eye
sub-pixels
display
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201922162206.5U
Other languages
Chinese (zh)
Inventor
刁鸿浩
黄玲溪
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Ivisual 3D Technology Co Ltd
Original Assignee
Vision Technology Venture Capital Pte Ltd
Beijing Ivisual 3D Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Vision Technology Venture Capital Pte Ltd, Beijing Ivisual 3D Technology Co Ltd filed Critical Vision Technology Venture Capital Pte Ltd
Priority to CN201922162206.5U priority Critical patent/CN211528831U/en
Application granted granted Critical
Publication of CN211528831U publication Critical patent/CN211528831U/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)

Abstract

The utility model relates to the field of 3D images and discloses a multi-viewpoint naked-eye 3D display screen, including: a display panel including a plurality of composite pixels, each of the plurality of composite pixels including a plurality of composite sub-pixels, each of the plurality of composite sub-pixels including a plurality of sub-pixels corresponding to a plurality of viewpoints of the multi-viewpoint naked-eye 3D display screen; and a grating bonded directly onto the display panel. The multi-viewpoint naked-eye 3D display screen provided by the utility model eliminates the cushion layer under the grating while preserving the naked-eye 3D viewing effect at the set distance, reducing overall thickness and weight and making installation and transport easier. The utility model also discloses a naked-eye 3D display terminal.

Description

Multi-view naked eye 3D display screen and naked eye 3D display terminal
Technical Field
The utility model relates to the field of 3D images, and for example to a multi-viewpoint naked-eye 3D display screen and a naked-eye 3D display terminal.
Background
3D imaging is one of the hottest technologies in the video industry and is driving the technological shift from flat display to 3D display. 3D display technology is a key part of the 3D image industry and is mainly classified into two types: glasses-type 3D display and naked-eye 3D display. Naked-eye 3D display is a technology in which the user can view a 3D display screen without wearing glasses. Compared with glasses-type 3D display, naked-eye 3D display is a free 3D display technology that reduces the constraints on the user.
Naked-eye 3D display is viewpoint-based. Recently, multi-viewpoint naked-eye 3D display has been proposed, in which a sequence of parallax images (frames) is formed at different positions in space so that a pair of 3D images having a parallax relationship can enter the left and right eyes of a person respectively, giving the user a 3D impression. For a conventional multi-view naked-eye three-dimensional (3D) display screen having, for example, N views, the N views in space must be projected with multiple independent pixels on the display panel.
However, in the conventional naked-eye 3D display screen configuration, a grating is simply disposed on one side or both sides of a 2D display panel to provide the 3D display effect, and the transmission and display of 3D images or video remain based on the 2D display panel. This creates the dilemma of reduced resolution and a surge in rendering computation. Meanwhile, the grating increases the thickness of the display panel, particularly for large-size display panels, so the overall weight of the panel increases, causing problems in installation and transportation.
This background is only for convenience in understanding the relevant art in this field and is not to be taken as an admission of prior art.
SUMMARY OF THE UTILITY MODEL
The following presents a simplified summary of some embodiments in order to provide a basic understanding of some aspects of the disclosed embodiments. It is not intended to identify key or critical elements or to delineate the scope of protection, but serves as a prelude to the more detailed description presented later.
The embodiments of the utility model provide a multi-viewpoint naked-eye 3D display screen and a naked-eye 3D display terminal, intended to overcome or alleviate at least some of the problems mentioned above.
In some embodiments, there is provided a multi-view naked eye 3D display screen, comprising: a display panel including a plurality of composite pixels, each of the plurality of composite pixels including a plurality of composite sub-pixels, each of the plurality of composite sub-pixels including a plurality of sub-pixels corresponding to a plurality of viewpoints of the multi-viewpoint naked-eye 3D display screen; and a grating directly bonded on the display panel.
In some embodiments, the width p of each of the plurality of sub-pixels is configured such that p ≤ (d × q)/(n × D), where d is the sum of the thicknesses of the display panel and the grating, q is the interpupillary reference distance, D is the preset viewing distance of the multi-view naked-eye 3D display screen, and n is the refractive index of the grating.
In some embodiments, 1.3 ≦ n ≦ 1.6.
In some embodiments, n is 1.46.
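As a quick numeric sanity check of the claimed bound, the following Python sketch evaluates p ≤ (d × q)/(n × D) with the example values that appear in the detailed description below (d = 0.9 mm panel-plus-grating stack, q = 62 mm interpupillary distance, D = 5 m viewing distance, n = 1.46); the function name is ours, not from the patent.

```python
# Sketch of the bound p <= (d*q)/(n*D) from the claim above.
# All numeric values are the worked-example values from the description.

def max_subpixel_width_mm(d_mm: float, q_mm: float, D_mm: float, n: float) -> float:
    """Largest sub-pixel width (mm) that preserves the 3D effect at viewing
    distance D with the grating bonded directly to the panel (no cushion layer)."""
    return (d_mm * q_mm) / (n * D_mm)

p_max = max_subpixel_width_mm(d_mm=0.9, q_mm=62.0, D_mm=5000.0, n=1.46)
print(round(p_max, 4))  # ≈ 0.0076 mm, matching the sub-pixel width cited below
```

This reproduces the roughly 0.0076 mm sub-pixel width that the later embodiments cite for a cushion-free 55-inch screen.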
In some embodiments, each composite subpixel comprises a plurality of subpixels in a single row or column; or each composite sub-pixel comprises a plurality of sub-pixels in an array.
In some embodiments, the plurality of composite subpixels includes at least one of a red composite subpixel, a green composite subpixel, and a blue composite subpixel.
In some embodiments, the size of the multi-view naked-eye 3D display screen is greater than or equal to 43 inches.
In some embodiments, the size of the multi-view naked eye 3D display screen is 55 inches, 60 inches, 80 inches, or 100 inches; or the multi-view naked eye 3D display screen is a cinema screen.
In some embodiments, the width of each sub-pixel of the plurality of sub-pixels is less than 0.008 mm.
In some embodiments, each of the plurality of subpixels has a width of less than 0.0076 mm.
In some embodiments, the display panel includes: a first substrate; a second substrate spaced apart from the first substrate; a color filter attached to the surface of the first substrate facing the second substrate; a thin film transistor attached to the surface of the second substrate facing the first substrate; a polarizer attached to the surface of the second substrate facing away from the first substrate; and a liquid crystal layer disposed between the first substrate and the second substrate; wherein the grating is bonded directly to the surface of the first substrate facing away from the second substrate.
In some embodiments, the grating is obliquely bonded to the display panel.
In some embodiments, the grating comprises a plurality of cylindrical prism gratings.
In some embodiments, a naked eye 3D display terminal is provided, comprising a multi-view naked eye 3D display screen as described above.
In some embodiments, the naked eye 3D display terminal further comprises a 3D processing device configured to render respective sub-pixels of a plurality of composite sub-pixels in the multi-view naked eye 3D display screen based on the 3D signal.
In some embodiments, the 3D processing apparatus is further configured to perform shift rendering on the sub-pixels in the composite sub-pixel according to the viewpoint position corresponding to the currently rendered sub-pixel and the viewpoint position corresponding to the sub-pixel to be rendered next.
In some embodiments, the naked eye 3D display terminal further comprises a memory configured to store a correspondence of the sub-pixels to the viewpoints; wherein the 3D processing device is configured to obtain the correspondence.
In some embodiments, the naked eye 3D display terminal further comprises an eyeball tracking data acquisition device configured to acquire eyeball tracking data of the user.
The multi-viewpoint naked-eye 3D display screen and naked-eye 3D display terminal provided by the utility model allow the grating to be bonded directly onto the display panel, effectively reducing the thickness and weight of the multi-viewpoint naked-eye 3D display screen and the naked-eye 3D display terminal.
The foregoing general description and the following description are exemplary and explanatory only and are not restrictive of the utility model.
Drawings
One or more embodiments are illustrated by way of example in the accompanying drawings, which are not limiting; in the drawings, elements having the same reference numerals are shown as like elements, wherein:
fig. 1A to fig. 1D are schematic structural diagrams of a multi-view naked eye 3D display screen and a naked eye 3D display terminal according to an embodiment of the present invention;
fig. 2 is a schematic diagram of a hardware structure of a naked eye 3D display terminal according to an embodiment of the present invention;
fig. 3 is a schematic diagram of a software structure of a naked eye 3D display terminal according to an embodiment of the present invention;
fig. 4A to 4C are schematic diagrams of a composite pixel according to an embodiment of the present invention;
fig. 5A to 5E are schematic diagrams illustrating formats and contents of images included in video frames of a 3D video signal according to an embodiment of the present invention;
fig. 6 is a schematic diagram of at least two 3D processing devices provided in an embodiment of the present invention;
fig. 7A is a schematic 3D optical imaging diagram of a multi-view naked eye 3D display screen according to an embodiment of the present invention;
fig. 7B is a schematic diagram of an optical path of a lens region of a multi-view naked-eye 3D display screen according to an embodiment of the present invention;
fig. 8 is a schematic structural diagram of a multi-view naked eye 3D display screen according to an embodiment of the present invention;
fig. 9A and 9B are schematic diagrams of splitting a pixel of a multi-view naked eye 3D display screen according to an embodiment of the present invention;
fig. 10 is a schematic structural diagram of a multi-view naked eye 3D display screen according to an embodiment of the present invention.
Reference numerals:
100: multi-view naked-eye 3D display screen; CP: composite pixel; CSP: composite sub-pixel; P: sub-pixel; 1000: naked-eye 3D display terminal; 101: processor; 122: register; 130: 3D processing device; 131: buffer; 140: video signal interface; 150: eyeball tracking device; 160: eye tracking data interface; 200: naked-eye 3D display terminal; 201: processor; 202: external memory interface; 203: memory; 204: USB interface; 205: charging management module; 206: power management module; 207: battery; 208: mobile communication module; 209: antenna; 210: wireless communication module; 211: antenna; 212: audio module; 213: speaker; 214: telephone receiver; 215: microphone; 216: earphone interface; 217: key; 218: motor; 219: indicator; 220: SIM card interface; 221: image pickup unit; 222: register; 223: GPU; 224: codec; 230: sensor module; 2301: proximity light sensor; 2302: ambient light sensor; 2303: pressure sensor; 2304: air pressure sensor; 2305: magnetic sensor; 2306: gravity sensor; 2307: gyroscope sensor; 2308: acceleration sensor; 2309: distance sensor; 2310: temperature sensor; 2311: fingerprint sensor; 2312: touch sensor; 2313: bone conduction sensor; 310: application layer; 320: framework layer; 330: core class library and runtime (Runtime); 340: kernel layer; 400: composite pixel; 410, 420, 430, 440, 450, 460, 470, 480, 490: composite sub-pixels; 411, 421, 431, 441, 451, 461, 471, 481, 491: sub-pixels; 501, 502: two images in side-by-side format; 503, 504: two images in top-bottom format; 505: composite image in left-right interleaved format; 506: composite image in top-bottom interleaved format; 507: composite image in checkerboard format; D: distance between the display screen surface and the human eyes; d: thickness of the display screen; q: interpupillary distance; p: pitch between adjacent pixels; n′: refractive index of air; n: refractive index of the grating; θ1: angle between the ray exiting the lens and the normal; θ2: angle between the ray incident on the lens and the normal; 800: multi-view naked-eye 3D display screen; 810: display panel; 820: grating; 811: display TFT layer; 812: polarizer.
Detailed Description
To understand the features and technical content of the embodiments of the utility model in more detail, the implementation of the embodiments is described below in conjunction with the accompanying drawings; the accompanying drawings are for reference and illustration only and are not intended to limit the embodiments of the utility model.
Herein, "naked-eye three-dimensional (or 3D) display" relates to a technique in which a user can observe a 3D image on a display without wearing glasses for 3D display.
In this context, "multi-view" has its conventional meaning in the art, meaning that different images displayed by different pixels or sub-pixels of the display screen can be viewed at different positions (viewpoints) in space. In this context, multi-view shall mean at least 3 views.
In this context, "grating" has a broad interpretation in the art, including but not limited to "parallax barrier" gratings and lens-based gratings, such as "lenticular" gratings.
Herein, "lens" or "lenticular" has the conventional meaning in the art, and includes, for example, cylindrical lenses and spherical lenses.
A conventional "pixel" means the smallest display unit of a 2D display in terms of its displayed resolution.
However, in some embodiments herein, the term "composite pixel" when applied to multi-view technology in the field of naked eye 3D display refers to the smallest unit of display when a naked eye 3D display device provides multi-view display, but does not exclude that a single composite pixel for multi-view technology may comprise or appear as a plurality of 2D display pixels. Herein, unless specifically stated as a composite pixel or 3D pixel for "3D display" or "multi-view" applications, a pixel will refer to the smallest unit of display in 2D display. Also, when describing a "composite subpixel" for a multi-view naked eye 3D display, it will refer to a composite subpixel of a single color present in the composite pixel when the naked eye 3D display device provides multi-view display. Herein, a sub-pixel in a "composite sub-pixel" will refer to the smallest display unit of a single color, which tends to correspond to a viewpoint.
Fig. 7A is a schematic diagram of the optical path for imaging by a multi-view naked-eye 3D display screen. If a viewer watches 3D content at a distance D from the screen, the thickness d required by the multi-view naked-eye 3D display screen can be derived from the geometric relationship of the light. A ray emitted from any point of the display screen exits into the air at an angle θ1 to the normal (θ1 in fig. 7B) and finally enters the viewer's eyes. The parameters in the figure are: D, the distance between the display surface and the human eyes; d, the thickness of the display screen; q, the interpupillary distance; p, the pitch between adjacent pixels; n′, the refractive index of air; and n, the refractive index of the grating.
Fig. 7B is an enlarged schematic diagram of the light at the lens. Light emitted from the multi-view naked-eye 3D display screen passes through a point of the convex lens, enters the air and finally reaches the user's eyes. The angle between the incident ray and the normal of the lens is θ2, and the angle between the exiting ray and the normal of the lens is θ1. From the definition of refractive index:
n · sin θ2 = n′ · sin θ1  ①
Because the angles between the incident and exiting rays and the normal are small, sin θ ≈ tan θ, so:
n · tan θ2 ≈ n′ · tan θ1  ②
From formulae ① and ②:
tan θ1 / tan θ2 = n / n′  ③
Therefore, taking the refractive index of air as n′ = 1:
tan θ1 = n · tan θ2
In the embodiment shown in fig. 7A, the geometry gives:
tan θ1 = q / D,  tan θ2 = p / d
According to ③, the following relationship holds: q / D = (n · p) / d, that is,
d = (n · D · p) / q  ④
Generally, a 55-inch display screen has a 55-inch (1397 mm) diagonal. With the common 16:9 aspect ratio, this gives a length of 1218 mm and a width of 685 mm. A 4K display screen has 3840 × 2160 pixels, so each pixel is 1218/3840 ≈ 0.317 mm long and 685/2160 ≈ 0.317 mm wide; the width of a display screen pixel is about 0.317 mm. Splitting further into three colors (standard RGB arrangement), the sub-pixel width is 0.106 mm. Thus p = 0.106 mm (sub-pixel pitch of a 55-inch 4K display screen), D = 5 m (a comfortable distance for viewing the 3D display effect of a 55-inch screen), and q = 0.062 m (average interpupillary distance of an Asian adult).
Substituting these values into formula ④ gives:
d = (1.46 × 5000 mm × 0.106 mm) / 62 mm ≈ 12.4 mm
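The arithmetic above can be reproduced in a short Python sketch; the 16:9 split of the diagonal and the thickness formula d = n·D·p/q follow the description, while the function and variable names are ours.

```python
import math

# Reproduce the 55-inch 4K worked example: diagonal -> panel dimensions
# -> pixel pitch -> RGB sub-pixel width -> required grating distance d.
diag_mm = 55 * 25.4                      # 55-inch diagonal ≈ 1397 mm
w_mm = diag_mm * 16 / math.hypot(16, 9)  # panel length ≈ 1218 mm
h_mm = diag_mm * 9 / math.hypot(16, 9)   # panel width ≈ 685 mm
pixel_mm = w_mm / 3840                   # 4K pixel pitch ≈ 0.317 mm
sub_mm = pixel_mm / 3                    # RGB sub-pixel width ≈ 0.106 mm

def required_thickness_mm(p_mm: float, q_mm: float, D_mm: float, n: float) -> float:
    """Grating-to-pixel distance d = n*D*p/q needed for the 3D effect."""
    return n * D_mm * p_mm / q_mm

# D = 5 m viewing distance, q = 62 mm interpupillary distance, n = 1.46
d_req = required_thickness_mm(sub_mm, q_mm=62.0, D_mm=5000.0, n=1.46)
print(round(pixel_mm, 3), round(sub_mm, 3), round(d_req, 1))  # 0.317 0.106 12.4
```

The result matches the roughly 12.4 mm requirement discussed next, against an actual stack of only 0.9 mm.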
Fig. 8 shows the structure of a multi-view naked-eye 3D display screen 800, which includes a display panel 810 and a grating 820. The display panel 810 includes a display TFT layer 811 with a thickness of 0.5 mm and a polarizer 812 with a thickness of 0.1 mm; the grating 820 is 0.3 mm thick, giving a total thickness of 0.9 mm. The thickness required for a comfortable display effect, however, is 12.4 mm, leaving a gap of 11.5 mm. Conventionally, a cushion layer (spacer glass) of corresponding thickness must therefore be added between the display panel and the grating. But mainstream television screens today are 50 to 55 inches or larger, the process of bonding large-size glass onto a display screen is very difficult, and the resulting display screen is very heavy. The cushion layer appreciably increases the thickness of the display screen and occupies installation space, and the added weight requires the mounting structure of the display screen to be thickened correspondingly or designed separately, causing inconvenience in use and a proliferation of production specifications.
The embodiments of the utility model provide a new pixel structure design, equivalent to splitting each existing TFT pixel into a plurality of TFT pixels, so that the same 3D display effect can be achieved at the set distance without increasing the thickness of the display panel or the display screen.
Rearranging formula ④ gives:
p = (d × q) / (n × D)
Without changing the thickness of the display screen, d = 0.9 mm. With D = 5 m, q = 0.062 m and n = 1.46, substitution into the rearranged formula yields the changed value of p, denoted p′:
p′ = (0.9 mm × 62 mm) / (1.46 × 5000 mm) ≈ 0.0076 mm
According to this calculated result, when the sub-pixel pitch of the multi-view naked-eye 3D display screen is split from the original 0.106 mm down to pixels of about 0.0076 mm, the same 3D display effect can be achieved without artificially increasing the thickness of the display screen.
After the pixels are split, the set distance between the display panel and the grating in the multi-view naked-eye 3D display screen achieves the same 3D display effect without an additional cushion layer.
The way of splitting TFT pixels is as shown in fig. 9A and 9B, and a multi-view 3D display effect can be achieved by independently controlling left and right views.
The first pixel splitting scheme is shown in fig. 9A. Every two adjacent pixels are a group, each group containing left and right views. The pixels labeled L show the left view and the pixels labeled R show the right view.
The second way of splitting the pixels is shown in fig. 9B, where every N pixels are a group, for example, N is 5. Pixels marked with the same number in the figure display a left view or a right view at the same time or in a time-sharing manner, and 3D display is achieved.
The embodiments above illustrate how the sub-pixel pitch affects the required thickness of the display screen. In practical application of a naked-eye 3D display screen, controlling the sub-pixel pitch controls the thickness of the display terminal and the display screen, with no extra cushion layer added. Meanwhile, because the split multiplies the number of sub-pixels, the 3D display screen can offer a multiplied display resolution relative to a 2D display screen of the same size. For example, if the display resolution of a 2D display screen of the same size is M1 × N1, the display resolution of the 3D display screen is INT(T/i) × M1 × N1, where INT is the rounding-down function, T is the split factor, for example 14, and i is the number of viewpoints, for example 2, 5 or 7.
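The resolution relationship can be sketched as follows; this is a minimal illustration of INT(T/i) × M1 × N1 with the example values from the text, and the helper name is ours.

```python
# Sketch of the resolution gain described above: a 2D panel of M1 x N1
# pixels, each sub-pixel split T ways and spent i per viewpoint, yields
# a 3D resolution of INT(T/i) * M1 x N1 (gain applied along one axis here).

def display_resolution(m1: int, n1: int, split_factor: int, viewpoints: int):
    gain = split_factor // viewpoints  # INT(T/i)
    return gain * m1, n1

# The text's example values: T = 14, i in {2, 5, 7}, on a 4K (3840 x 2160) base
for i in (2, 5, 7):
    print(i, display_resolution(3840, 2160, 14, i))
# 2 -> (26880, 2160); 5 -> (7680, 2160); 7 -> (7680, 2160)
```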
In some embodiments of the utility model, as shown in fig. 10 and figs. 1A to 1C, a multi-view naked-eye 3D display screen 100 is provided, including a display panel 110 and a grating 120 disposed on the display panel 110. The display panel 110 carries m × n composite pixels CP and thus defines a display resolution of m × n. Each composite pixel CP includes a plurality of rows of composite sub-pixels CSP; each row of composite sub-pixels CSP is composed of i sub-pixels P corresponding to i viewpoints, with i ≥ 3, and the sub-pixels in each row of composite sub-pixels CSP may be set to the same color. The multi-view naked-eye 3D display screen in this embodiment has the same physical size as a 2D display screen with the same display resolution, so the pitch between the sub-pixels P in a composite pixel CP is smaller than the sub-pixel pitch of a 2D display screen with the same display resolution. For example, in the 55-inch pixel-splitting embodiment above, which achieves 4K resolution on a 55-inch 2D screen, the pitch between the sub-pixels P is 0.0076 mm, a reduction of roughly 14 times, thereby achieving the cushion-free effect. The display screen provided by the utility model is particularly useful for large-size display screens, for example screens of 50 inches and above, where it can effectively reduce weight.
In the embodiments of the utility model, the grating can be bonded directly onto the display panel.
As can be seen from the above embodiments, adjusting the sub-pixel pitch adjusts the required thickness of the cushion layer, down to removing the cushion layer entirely; accordingly, the pitch between the sub-pixels P is configured such that the grating 120 is bonded directly to the display panel 110.
In some embodiments, the pitch p between the sub-pixels P satisfies the relationship p ≤ (d × q)/(n × D), where d is the sum of the thicknesses of the display panel and the grating, q is the average interpupillary distance, D is the predetermined viewing distance of the multi-view naked-eye 3D display screen, and n is the refractive index of the grating, for example 1.3 ≤ n ≤ 1.6; in some display screen structures and material choices, n = 1.46. In the embodiments of the utility model, setting the sub-pixel width as above allows the grating to be bonded directly to the display panel, so that the height of the cushion layer is zero and a cushion-free construction is achieved.
In some embodiments, the sub-pixel width in each composite sub-pixel is less than 0.008mm, or less than 0.0076 mm.
In some embodiments, each composite subpixel comprises a single row or column of multiple subpixels.
In some embodiments, each composite subpixel comprises a plurality of subpixels in an array.
In some embodiments, the plurality of composite subpixels includes a red composite subpixel, a green composite subpixel, and a blue composite subpixel.
In some embodiments, the size of the multi-view naked-eye 3D display screen is equal to or greater than 43 inches, such as 50, 55 inches, 60 inches, 80 inches, 100 inches, 110 inches, and the like.
In some embodiments, the multi-view naked-eye 3D display screen is a Micro-LED display screen, and the TFT layer includes the driving circuit and the light-emitting circuit of the Micro-LED.
In some embodiments of the utility model, the width of the grating grid is set according to the overall pixel width of the composite pixel CP. Generally, for the cushion-free requirement, once the sub-pixel pitch has been calculated, the width of the grating grid can be determined from the number of sub-pixels in a composite sub-pixel: if a composite sub-pixel has i sub-pixels, the width of the grating grid is i × p.
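The grating-grid width rule can be sketched in one line of Python; the i = 6 viewpoint count and the p = 0.0076 mm pitch below reuse example values from this description, and the function name is ours.

```python
# One grating grid spans the i sub-pixels of a composite sub-pixel,
# so its width is i * p (values illustrative, from the description).

def grating_grid_width_mm(i_viewpoints: int, p_mm: float) -> float:
    return i_viewpoints * p_mm

print(round(grating_grid_width_mm(6, 0.0076), 4))  # 0.0456 mm for i = 6
```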
In some embodiments of the utility model, the display panel 110 may be a liquid crystal panel. Specifically, the display panel 110 includes: a pair of spaced-apart substrates; a color filter attached to the surface of the first substrate of the pair facing the second substrate; a thin film transistor attached to the surface of the second substrate facing the first substrate; a polarizer attached to the surface of the second substrate facing away from the first substrate; and a liquid crystal layer disposed between the pair of substrates; wherein the grating 120 is bonded directly to the surface of the first substrate facing away from the second substrate.
In some embodiments of the utility model, the plurality of prism gratings in the grating 120 are parallel to one another and bonded obliquely to the display panel, thereby preventing moiré.
According to the above embodiments, the utility model also relates to a naked-eye 3D display screen of the same size as a 2D display screen, in which the region of one sub-pixel on the original 2D display screen is split into one, two or more composite pixels, achieving a cushion-free construction. For example, in the 55-inch pixel-splitting embodiment above, each original sub-pixel is split into 14 sub-pixels. For the case of 2 viewpoints, 7 composite pixels can then be used to display one pixel of the original 2D display screen: if the same resolution is to be preserved, the sub-pixels corresponding to viewpoint i in the 7 composite pixels display the same color brightness; if a multiplied resolution is desired, the sub-pixels corresponding to viewpoint i in the 7 composite pixels can be set to display different color brightnesses, with the specific color brightness computed from the color brightness of the surrounding pixels. For the case of 5 viewpoints, 14 is not evenly divisible, so the 14 sub-pixels are allocated to 2 composite pixels, each with 7 viewpoint positions, that is, each composite sub-pixel has 7 sub-pixels; the redundant sub-pixels can be controlled not to display, or to display a non-interfering color brightness, and correspondingly the grating grid need not cover the undisplayed sub-pixels. In this case, for display at the same resolution, the sub-pixels corresponding to viewpoint i in the 2 composite pixels display the same color brightness; if a multiplied resolution is required, the color brightness of the two sub-pixels can be set according to the color brightness of the surrounding sub-pixels.
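The allocation logic in this paragraph can be sketched as follows. This is a hedged illustration: the grouping of 14 split sub-pixels into 2 composite pixels of 7 for 5 viewpoints follows the text, while the function name and return shape are ours.

```python
# Sketch of the sub-pixel allocation above: each original sub-pixel is
# split into T sub-pixels; with i viewpoints the split sub-pixels are
# grouped into composite pixels, and leftovers stay dark (or show a
# non-interfering brightness).

def allocate(split_factor: int, viewpoints: int):
    """Return (composite_pixels, subpixels_per_composite, unused_subpixels)."""
    if split_factor % viewpoints == 0:
        return split_factor // viewpoints, viewpoints, 0
    n_cp = 2                    # the text's choice for T = 14, i = 5
    per = split_factor // n_cp  # 7 sub-pixels per composite pixel
    return n_cp, per, n_cp * (per - viewpoints)

print(allocate(14, 2))  # (7, 2, 0): 7 composite pixels show one 2D pixel
print(allocate(14, 5))  # (2, 7, 4): 2 composite pixels, 4 sub-pixels unused
```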
In some embodiments of the utility model, a naked-eye 3D display terminal 1000 is provided, which includes: a multi-view naked-eye 3D display screen 100 including m × n composite pixels CP and thus defining an m × n display resolution; a video signal interface 140 for receiving video frames of a 3D video signal, wherein a video frame of the 3D video signal contains two images of m × n resolution, or one composite image of 2m × n or m × 2n resolution; and at least one 3D processing device 130. The naked-eye 3D display terminal 1000 provided by the utility model can be a naked-eye 3D display terminal or a naked-eye 3D display device.
In some embodiments, each composite pixel CP comprises a plurality of composite subpixels CSP, each composed of i same-color subpixels corresponding to i viewpoints, where i ≧ 3.
In some embodiments, the at least one 3D processing device 130 is configured to render at least one sub-pixel in each composite sub-pixel based on one of the two images and at least another sub-pixel in each composite sub-pixel based on the other of the two images.
In some further embodiments, the at least one 3D processing device 130 is configured to render at least two of each of the composite sub-pixels based on the composite image.
In some embodiments, the 3D processing device 130 is configured to render the sub-pixels corresponding to the viewpoints in the composite pixel based on the 3D image signal.
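The two-image rendering rule in the embodiments above can be sketched in Python. This is a hedged illustration: each composite sub-pixel lights some of its i sub-pixels from one image of the frame and the rest from the other, but the alternating even/odd viewpoint assignment below is our illustrative choice, not specified by the patent.

```python
# Sketch of rendering one composite sub-pixel from the two images of a
# 3D video frame; the even/odd split across viewpoints is illustrative.

def render_composite_subpixel(left_value: int, right_value: int, i: int) -> list:
    """Brightness values for the i sub-pixels of one composite sub-pixel."""
    return [left_value if v % 2 == 0 else right_value for v in range(i)]

print(render_composite_subpixel(10, 200, 6))  # [10, 200, 10, 200, 10, 200]
```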
Fig. 1A shows a schematic diagram of a multi-view naked-eye 3D display screen according to an embodiment of the utility model, and fig. 1B shows a schematic structural diagram of a naked-eye 3D display terminal 1000 according to an embodiment of the utility model. Referring to figs. 1A and 1B, in an embodiment of the utility model, a naked-eye 3D display terminal 1000 is provided, which may include a multi-view naked-eye 3D display screen 100, at least one 3D processing device 130, and a video signal interface 140 for receiving video frames of a 3D video signal.
In the embodiment shown in fig. 1A and 1B, the multi-view naked-eye 3D display screen 100 may include m × n composite pixels and thus define an m × n display resolution. As shown in fig. 1A and 1B, the multi-view naked-eye 3D display screen 100 includes m columns and n rows of composite pixels CP and thus defines a display resolution of m × n.
In some embodiments, each composite pixel CP comprises a plurality of composite sub-pixels, each composite sub-pixel being made up of i same-color sub-pixels corresponding to i viewpoints, i ≧ 2. In the embodiment shown in fig. 1A, i is 6, but other values for i are contemplated. In the illustrated embodiment, the multi-view naked-eye 3D display screen may accordingly have i (i ═ 6) views (V1-V6), but it is contemplated that more or fewer views may be provided accordingly.
Referring to fig. 1A and 4A in combination, in the illustrated embodiment, each composite pixel includes three composite sub-pixels, and each composite sub-pixel is composed of 6 same-color sub-pixels corresponding to the 6 viewpoints (i = 6). The three composite sub-pixels correspond to the three colors red (R), green (G), and blue (B), respectively. That is, the three composite sub-pixels of each composite pixel have 6 red, 6 green, or 6 blue sub-pixels, respectively.
In the embodiment shown in fig. 1A and 4A, the composite sub-pixels 410, 420, 430 in the composite pixel 400 are arranged in parallel. Each composite sub-pixel 410, 420, 430 includes sub-pixels 411, 421, 431 arranged in a single row. It is conceivable, however, for the composite sub-pixels to be arranged differently, or for the sub-pixels within the composite sub-pixels to be arranged differently.
As shown in fig. 4B, each composite sub-pixel 440, 450, 460 includes sub-pixels 441, 451, 461 arranged in a single column.
As shown in fig. 4C, the three composite sub-pixels 470, 480, 490 in the composite pixel 400 are arranged, for example, in a triangular layout resembling the Chinese character "品". In the embodiment shown in fig. 4C, the sub-pixels 471, 481, 491 in each composite sub-pixel 470, 480, 490 may be arranged in a 3 × 2 array.
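The composite-pixel structure described above lends itself to a simple data model. The following sketch is purely illustrative (the class and field names are our own, not from the patent), assuming the illustrated case of i = 6 viewpoints and three same-color composite sub-pixels per composite pixel:

```python
from dataclasses import dataclass, field
from typing import List

NUM_VIEWPOINTS = 6  # i = 6 in the illustrated embodiment; any i >= 2 is possible

@dataclass
class CompositeSubPixel:
    """i same-color sub-pixels, one per viewpoint V1..Vi."""
    color: str  # "R", "G", or "B"
    levels: List[int] = field(default_factory=lambda: [0] * NUM_VIEWPOINTS)

@dataclass
class CompositePixel:
    """Three same-color composite sub-pixels: red, green, and blue."""
    red: CompositeSubPixel = field(default_factory=lambda: CompositeSubPixel("R"))
    green: CompositeSubPixel = field(default_factory=lambda: CompositeSubPixel("G"))
    blue: CompositeSubPixel = field(default_factory=lambda: CompositeSubPixel("B"))

# An n-row x m-column array of composite pixels gives an m x n 3D display
# resolution, while the panel physically holds m * n * 3 * i sub-pixels.
m, n = 4, 3  # toy size for illustration
screen = [[CompositePixel() for _ in range(m)] for _ in range(n)]
assert len(screen) * len(screen[0]) == m * n
```

The point of the model is that resolution is counted in composite pixels, not in individual sub-pixels, which is why the screen's 3D resolution stays m × n regardless of the number of viewpoints i.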
In some embodiments, such as shown in fig. 1A-1C, a naked eye 3D display terminal 1000 may be provided with a single 3D processing device 130. The single 3D processing device 130 simultaneously processes the rendering of each composite sub-pixel of the composite pixels of the large-sized naked eye 3D display screen 100.
In other embodiments, such as shown in fig. 6, the naked eye 3D display terminal 1000 may be provided with at least two 3D processing devices 130 that process in parallel, in series, or in a combination of series and parallel, the rendering of each composite sub-pixel of the composite pixels of the large-size naked eye 3D display screen 100.
Those skilled in the art will appreciate that the above-mentioned at least two 3D processing devices may be distributed in other ways and process rows and columns of composite pixels or composite sub-pixels of the large-sized naked eye 3D display screen 100 in parallel, which falls within the scope of the present invention.
In some embodiments, the at least one 3D processing device 130 may also optionally include a buffer 131 to buffer the received video frames.
In some embodiments, the at least one 3D processing device is an FPGA or ASIC chip or an FPGA or ASIC chipset.
With continued reference to fig. 1A, the naked eye 3D display terminal 1000 may further comprise a processor 101 communicatively connected to the at least one 3D processing device 130 via a video signal interface 140. In some embodiments illustrated herein, the processor 101 is included in or as a processor unit of a computer or smart terminal, such as a mobile terminal. It is contemplated that in some embodiments, the processor 101 may be disposed outside the naked-eye 3D display terminal, for example, the naked-eye 3D display terminal may be a multi-view naked-eye 3D display screen with a 3D processing device, such as a non-intelligent naked-eye 3D television.
For simplicity, the exemplary embodiments of the naked-eye 3D display terminal hereinafter include a processor internally. Further, the video signal interface 140 is configured as an internal interface connecting the processor 101 and the 3D processing device 130; this configuration can be understood more clearly with reference to the naked-eye 3D display terminal 200 implemented in a mobile terminal as shown in fig. 2 and 3. In some embodiments of the present invention, the video signal interface 140, as an internal interface of the naked-eye 3D display terminal 200, may be a MIPI, mini-MIPI, LVDS, mini-LVDS, or DisplayPort interface. In some embodiments, as shown in fig. 1A, the processor 101 of the naked-eye 3D display terminal 1000 may further include a register 122. The register 122 may be used to temporarily store instructions, data, and addresses.
In some embodiments, the naked eye 3D display terminal 1000 may further comprise an eye tracking device or an eye tracking data interface for acquiring real-time eye tracking data, so that the 3D processing device 130 can render the respective sub-pixels of the composite pixels (composite sub-pixels) based on the eye tracking data. For example, in the embodiment shown in fig. 1B, the naked eye 3D display terminal 1000 further includes an eye tracking device 150 communicatively connected to the 3D processing device 130, so that the 3D processing device 130 can receive the eye tracking data directly. In the embodiment shown in fig. 1C, the eye tracking device (not shown) may, for example, be connected directly to the processor 101, while the 3D processing device 130 obtains the eye tracking data from the processor 101 via the eye tracking data interface 151. In other embodiments, the eye tracking device may be connected to both the processor and the 3D processing device, which enables the 3D processing device 130 to obtain the eye tracking data directly from the eye tracking device on the one hand, and enables other information acquired by the eye tracking device to be processed by the processor on the other hand.
With combined reference to fig. 1A-C and 5A-E, the transmission and display of a 3D video signal within the naked eye 3D display terminal of some embodiments of the present invention is described. In the illustrated embodiment, the display screen 110 may define 6 viewpoints V1-V6, at each of which the user's eye can see the display of the corresponding sub-pixel in the composite sub-pixels of each composite pixel in the display panel of the multi-viewpoint naked-eye 3D display screen 110. The two different images seen by the user's two eyes at different viewpoints form a parallax, from which the brain synthesizes a 3D image.
In some embodiments of the present invention, the 3D processing device 130 receives video frames, for example of a decompressed 3D video signal, from the processor 101 through the video signal interface 140, for example an internal interface. Each video frame may comprise or consist of two images with a resolution of m × n, or of a composite image with a resolution of 2m × n or m × 2n.
In some embodiments, the two images or composite image may include different types of images and may be in various arrangements.
As shown in fig. 5A, the video frame of the 3D video signal comprises or consists of two images 501, 502 with m × n resolution in a side-by-side format. In some embodiments, the two images may be a left eye parallax image and a right eye parallax image, respectively. In some embodiments, the two images may be a rendered color image and a depth image, respectively.
As shown in fig. 5B, a video frame of the 3D video signal contains or consists of two images 503, 504 with m × n resolution in a top-bottom format. In some embodiments, the two images may be a left eye parallax image and a right eye parallax image, respectively. In some embodiments, the two images may be a rendered color image and a depth image, respectively.
As shown in fig. 5C, the video frame of the 3D video signal contains a composite image 505 having a resolution of 2m × n in a left-right interleaved format. In some embodiments, the composite image may be a left-right interleaved left eye and right eye parallax composite image. In some embodiments, the composite image may be a left-right interleaved composite image of rendered color and depth.
As shown in fig. 5D, the video frame of the 3D video signal contains a composite image 506 having a resolution of m × 2n in a top-bottom interleaved format. In some embodiments, the composite image may be a left-eye and right-eye parallax composite image interleaved up and down. In some embodiments, the composite image may be an up-down interleaved composite image that renders color and depth.
As shown in fig. 5E, the video frame of the 3D video signal contains a composite image 507 having a 2m × n resolution in a checkerboard format. In some embodiments, the composite image may be a checkerboard format of left and right eye parallax composite images. In some embodiments, the composite image may be a checkerboard format of the rendered color image and the depth image.
It will be appreciated by those skilled in the art that the embodiments shown in the figures are merely schematic, and that the two images or composite image comprised by the video frames of the 3D video signal may comprise other types of images and may be in other arrangements, which fall within the scope of the invention.
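As a hedged illustration (not part of the patent; the function and format names are our own), the frame layouts of figs. 5A-5D all reduce to two m × n images via simple array slicing. Here frames are NumPy arrays indexed [row, column], so an image with m × n resolution has shape (n, m):

```python
import numpy as np

def split_frame(frame: np.ndarray, fmt: str):
    """Split a composite 3D video frame into two images of shape (n, m).

    Formats (after figs. 5A-5D):
      'side_by_side'   : shape (n, 2m), left half / right half
      'top_bottom'     : shape (2n, m), top half / bottom half
      'col_interleaved': shape (n, 2m), even columns / odd columns
      'row_interleaved': shape (2n, m), even rows / odd rows
    """
    if fmt == "side_by_side":
        half = frame.shape[1] // 2
        return frame[:, :half], frame[:, half:]
    if fmt == "top_bottom":
        half = frame.shape[0] // 2
        return frame[:half, :], frame[half:, :]
    if fmt == "col_interleaved":
        return frame[:, 0::2], frame[:, 1::2]
    if fmt == "row_interleaved":
        return frame[0::2, :], frame[1::2, :]
    raise ValueError(f"unknown format: {fmt}")

# toy example: n = 2 rows, m = 4 columns, side-by-side composite of shape (n, 2m)
n, m = 2, 4
composite = np.arange(n * 2 * m).reshape(n, 2 * m)
left, right = split_frame(composite, "side_by_side")
assert left.shape == right.shape == (n, m)
```

The checkerboard format of fig. 5E requires a slightly more involved de-interleave and is omitted from this sketch.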
In some embodiments, the resolution of m × n may be a resolution above Full High Definition (FHD), including, but not limited to, 1920 × 1080, 1920 × 1200, 2048 × 1280, 2560 × 1440, 3840 × 2160, and the like.
In some embodiments, the at least one 3D processing device 130, upon receiving a video frame comprising two images, renders at least one sub-pixel in each composite sub-pixel based on one of the two images and at least another sub-pixel in each composite sub-pixel based on the other of the two images. Similarly, in some embodiments, the at least one 3D processing device, upon receiving a video frame comprising a composite image, renders at least two sub-pixels in each composite sub-pixel based on the composite image. For example, at least one sub-pixel is rendered from a first image (portion) of the composite image, and at least another sub-pixel is rendered from a second image (portion).
In some embodiments, this rendering is performed dynamically, for example based on eye tracking data.
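A minimal sketch of such eye-tracking-based rendering, under our own simplifying assumptions (one intensity value per sub-pixel, 0-based viewpoint indices, unused sub-pixels left dark; the patent does not prescribe this algorithm):

```python
def render_composite_subpixel(left_val, right_val, left_view, right_view, num_views=6):
    """Render the i sub-pixels of one composite sub-pixel: the sub-pixel at the
    tracked left-eye viewpoint gets the left-image value, the sub-pixel at the
    tracked right-eye viewpoint gets the right-image value, and the remaining
    sub-pixels stay dark.

    left_view / right_view are 0-based viewpoint indices from eye tracking.
    """
    subpixels = [0] * num_views
    subpixels[left_view] = left_val
    subpixels[right_view] = right_val
    return subpixels

# eyes tracked at viewpoints V2 and V4 (indices 1 and 3) of a 6-viewpoint screen
row = render_composite_subpixel(left_val=200, right_val=180, left_view=1, right_view=3)
assert row == [0, 200, 0, 180, 0, 0]
```

Because each composite sub-pixel is addressed per viewpoint, only the sub-pixels at the tracked viewpoints need updating, which is the point-to-point rendering the embodiments describe.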
By way of explanation and not limitation, since the 3D processing device 130 in the embodiment of the present invention receives the two images contained in the video frame data through the video signal interface 140 configured as an internal interface, the resolution of each image (or half the resolution of the composite image) corresponds to the composite pixels divided by viewpoint (including the composite sub-pixels divided by viewpoint). On the one hand, since the viewpoint information is decoupled from the transmission process, naked-eye 3D display with a small amount of processing computation and no loss of resolution can be realized; on the other hand, since the composite pixels (composite sub-pixels) are arranged corresponding to the viewpoints, rendering of the display screen can be realized point-to-point, which greatly reduces the amount of computation. In contrast, the image or video transmission and display of a conventional naked-eye 3D display screen is still based on a 2D display panel, which not only entails reduced resolution and a greatly increased amount of rendering computation, but may also require repeated format adjustments and image or video display adaptation.
In some embodiments, the register 122 of the processor 101 may be configured to receive information about the display requirements of the multi-view naked eye 3D display screen 110, typically information that is independent of the i viewpoints and related to the m × n resolution of the multi-view naked eye 3D display screen 110, so that the processor 101 sends video frames of the 3D video signal that meet those display requirements to the multi-view naked eye 3D display screen 110. This information may be, for example, a data packet used to initially establish the video transmission.
Therefore, when transmitting video frames of the 3D video signal, the processor 101 does not need to consider information related to the i viewpoints of the multi-view naked eye 3D display screen 110 (i ≥ 3). Instead, by means of the information related to the m × n resolution of the multi-view naked eye 3D display screen 100 received by the register 122, the processor 101 can transmit video frames of the 3D video signal meeting the display requirements to the multi-view naked eye 3D display screen 110.
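This division of labor might be illustrated as follows (a hypothetical sketch with our own names; the patent only describes the information flow): the processor validates frames solely against the m × n resolution reported by the register, and the viewpoint count i never appears on the transmission side.

```python
def frame_matches_display(register_info, image_shape):
    """The processor only validates a parallax image against the m x n
    resolution reported by the display's register; the viewpoint count i is
    handled downstream by the 3D processing device and never enters this check."""
    return image_shape == (register_info["m"], register_info["n"])

register_info = {"m": 1920, "n": 1080}  # e.g. an FHD-resolution multi-view screen
assert frame_matches_display(register_info, (1920, 1080))
assert not frame_matches_display(register_info, (960, 540))
```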
In some embodiments, the 3D processing device 130 is further configured to perform shift rendering of the sub-pixels in the composite pixels according to the viewpoint position corresponding to the currently rendered sub-pixel and the viewpoint position corresponding to the sub-pixel to be rendered in the next frame.
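As a hypothetical sketch of shift rendering (the patent does not give a concrete algorithm, and the function name is our own), the rendered value simply moves from the sub-pixel of the current viewpoint to the sub-pixel of the viewpoint expected in the next frame:

```python
def shift_render(subpixels, current_view, next_view):
    """Shift the rendered value from the sub-pixel of the currently tracked
    viewpoint to the sub-pixel of the viewpoint expected in the next frame."""
    shifted = list(subpixels)
    shifted[next_view] = shifted[current_view]
    if next_view != current_view:
        shifted[current_view] = 0  # the old viewpoint's sub-pixel goes dark
    return shifted

frame = [0, 255, 0, 0, 0, 0]  # composite sub-pixel lit at viewpoint index 1
assert shift_render(frame, current_view=1, next_view=2) == [0, 0, 255, 0, 0, 0]
```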
In some embodiments, the naked eye 3D display terminal 1000 may further include a codec configured to decompress the compressed 3D video signal and transmit the decompressed 3D video signal to the at least one 3D processing device 130 via the video signal interface 140.
In some embodiments, the processor 101 of the naked eye 3D display terminal 1000 reads a video frame of the 3D video signal from the memory or from outside the naked eye 3D display terminal 1000, for example, through an external interface, and then transmits the read or received video frame of the 3D video signal to the at least one 3D processing device 130 via the video signal interface 140.
In some embodiments, the naked eye 3D display terminal 1000 further comprises a formatter (not shown), for example integrated in the processor 101, configured as a codec or as part of a GPU, for pre-processing the video frames of the 3D video signal such that they contain two images with a resolution of m × n or such that they contain a composite image with a resolution of 2m × n or m × 2 n.
As described above, the naked eye 3D display terminal provided by some embodiments of the present invention may be a naked eye 3D display terminal that contains a processor. In some embodiments, the naked eye 3D display terminal may be configured as a smart cellular phone, a tablet computer, a smart television, a wearable device, an in-vehicle device, a notebook computer, an ultra-mobile personal computer (UMPC), a netbook, a personal digital assistant (PDA), or the like.
In another scheme, the naked eye 3D display system comprises a processor unit and the naked eye 3D display terminal, wherein the processor unit is in communication connection with the naked eye 3D display terminal.
In some embodiments, the naked eye 3D display system is configured as a smart television with a processor unit; or the naked eye 3D display system is an intelligent cellular phone, a tablet computer, a personal computer or wearable equipment; or the naked eye 3D display system comprises a set top box or a screen-projectable cellular phone or a tablet computer serving as a processor unit and a digital television serving as a naked eye 3D display terminal, wherein the digital television is in wired or wireless connection with the set top box or the cellular phone or the tablet computer; or, the naked eye 3D display system is configured as an intelligent home system or a part thereof, wherein the processor unit includes an intelligent gateway or a central controller of the intelligent home system, and the intelligent home system further includes an eyeball tracking device for acquiring eyeball tracking data; alternatively, the naked eye 3D display system is configured as an entertainment interaction system or a part thereof.
Exemplarily, fig. 2 shows a hardware configuration diagram of a naked eye 3D display terminal 200 implemented as a large-sized mobile terminal. The naked-eye 3D display terminal 200 may include a processor 201, an external storage interface 202, an (internal) memory 203, a Universal Serial Bus (USB) interface 204, a charging management module 205, a power management module 206, a battery 207, a mobile communication module 208, a wireless communication module 210, antennas 209 and 211, an audio module 212, a speaker 213, a receiver 214, a microphone 215, an earphone interface 216, a button 217, a motor 218, an indicator 219, a Subscriber Identity Module (SIM) card interface 220, a multi-view naked-eye 3D display screen 110, a 3D processing device 130, a video signal interface 140, a camera unit 221, an eye tracking device 150, a sensor module 230, and the like. Among other things, the sensor module 230 may include a proximity light sensor 2301, an ambient light sensor 2302, a pressure sensor 2303, a barometric pressure sensor 2304, a magnetic sensor 2305, a gravity sensor 2306, a gyroscope sensor 2307, an acceleration sensor 2308, a distance sensor 2309, a temperature sensor 2310, a fingerprint sensor 2311, a touch sensor 2312, a bone conduction sensor 2313, and the like.
It is to be understood that the illustrated structure of the embodiment of the present invention does not constitute a specific limitation to the naked-eye 3D display terminal 200. In other embodiments of the present invention, the naked eye 3D display terminal 200 may include more or fewer components than those shown, or some components may be combined, some components may be separated, or a different arrangement of components may be used. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor 201 may include one or more processing units. For example, the processor 201 may include an application processor (AP), a modem processor, a baseband processor, a graphics processor (GPU) 223, an image signal processor (ISP), a controller, a memory, a video codec 224, a digital signal processor (DSP), a neural network processor (NPU), etc., or combinations thereof. The different processing units may be separate devices or may be integrated into one or more processors.
A cache memory may also be provided in the processor 201 to hold instructions or data that the processor 201 has just used or uses cyclically. If the processor 201 needs to use the instructions or data again, they can be fetched directly from the cache.
In some embodiments, the processor 201 may include one or more interfaces. The interfaces may include an integrated circuit (I2C) interface, an integrated circuit built-in audio (I2S) interface, a Pulse Code Modulation (PCM) interface, a Universal Asynchronous Receiver Transmitter (UART) interface, a Mobile Industry Processor Interface (MIPI), a General Purpose Input Output (GPIO) interface, a Subscriber Identity Module (SIM) interface, a Universal Serial Bus (USB) interface, and so forth.
The I2C interface is a bi-directional synchronous serial bus that includes a serial data line (SDA) and a Serial Clock Line (SCL). In some embodiments, the processor 201 may include multiple sets of I2C buses. The processor 201 may be communicatively connected to the touch sensor 2312, the charger, the flash, the camera unit 221, the eye tracking device 150, etc. through different I2C bus interfaces.
Both the I2S interface and the PCM interface may be used for audio communication.
The UART interface is a universal serial data bus used for asynchronous communications. The bus may be a bidirectional communication bus. It converts the data to be transmitted between serial communication and parallel communication. In some embodiments, a UART interface is used to connect the processor 201 with the wireless communication module 210.
In the embodiment shown in fig. 2, a MIPI interface may be used to connect the processor 201 with the multi-view naked-eye 3D display screen 110. In addition, the MIPI interface may also be used to connect peripheral devices such as the camera unit 221, the eye tracking device 150, and the like.
The GPIO interface may be configured by software. The GPIO interface may be configured as a control signal and may also be configured as a data signal. In some embodiments, the GPIO interface may be used to connect the processor 201 with the camera unit 221, the multi-view naked-eye 3D display screen 110, the wireless communication module 210, the audio module 212, the sensor module 230, and the like.
The USB interface 204 is an interface conforming to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type-C interface, or the like. The USB interface 204 may be used to connect a charger to charge the naked-eye 3D display terminal 200, to transmit data between the naked-eye 3D display terminal 200 and peripheral devices, and also to connect earphones and play audio through them.
It should be understood that the interface connection relationship between the modules according to the embodiment of the present invention is only schematically illustrated, and does not form a structural limitation on the naked eye 3D display terminal 200.
The wireless communication function of the naked eye 3D display terminal 200 may be implemented by the antennas 209 and 211, the mobile communication module 208, the wireless communication module 210, a modem processor, a baseband processor, or the like.
The antennas 209, 211 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the naked eye 3D display terminal 200 may be used to cover a single or multiple communication bands. Different antennas can also be multiplexed to improve the utilization of the antennas.
The mobile communication module 208 may provide a solution including wireless communication of 2G/3G/4G/5G and the like applied to the naked-eye 3D display terminal 200. The mobile communication module 208 may include at least one filter, switch, power amplifier, Low Noise Amplifier (LNA), and the like. The mobile communication module 208 may receive electromagnetic waves from the antenna 209, filter, amplify, etc. the received electromagnetic waves, and transmit the electromagnetic waves to the modem processor for demodulation. The mobile communication module 208 can also amplify the signal modulated by the modem processor, and convert the signal into electromagnetic wave through the antenna 209 for radiation. In some embodiments, at least some of the functional modules of the mobile communication module 208 may be disposed in the processor 201. In some embodiments, at least some of the functional modules of the mobile communication module 208 may be disposed in the same device as at least some of the modules of the processor 201.
The wireless communication module 210 may provide a solution for wireless communication including a Wireless Local Area Network (WLAN), Bluetooth (BT), Global Navigation Satellite System (GNSS), Frequency Modulation (FM), Near Field Communication (NFC), Infrared (IR), etc. applied to the naked eye 3D display terminal 200. The wireless communication module 210 may be one or more devices integrating at least one communication processing module. The wireless communication module 210 receives electromagnetic waves via the antenna 211, performs frequency modulation and filtering processing on electromagnetic wave signals, and transmits the processed signals to the processor 201. The wireless communication module 210 can also receive a signal to be transmitted from the processor 201, perform frequency modulation and amplification on the signal, and convert the signal into electromagnetic waves through the antenna 211 to radiate the electromagnetic waves.
In some embodiments, the antenna 209 and the mobile communication module 208 of the naked-eye 3D display terminal 200 are coupled, and the antenna 211 and the wireless communication module 210 are coupled, so that the naked-eye 3D display terminal 200 can communicate with networks and other devices through wireless communication technologies. The wireless communication technologies may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division synchronous code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technologies, among others. The GNSS may include the global positioning system (GPS), the global navigation satellite system (GLONASS), the BeiDou navigation satellite system (BDS), the quasi-zenith satellite system (QZSS), and/or satellite-based augmentation systems (SBAS).
In some embodiments, the external interface for receiving the 3D video signal may include the USB interface 204, the mobile communication module 208, the wireless communication module 210, or a combination thereof. Furthermore, other possible interfaces for receiving the 3D video signal are conceivable, such as the interfaces described above.
The memory 203 may be used to store computer-executable program code, which includes instructions. The processor 201 executes the various functional applications and data processing of the naked eye 3D display terminal 200 by executing the instructions stored in the memory 203. The memory 203 may include a program storage area and a data storage area. The program storage area may store an operating system, an application program required by at least one function (such as a sound playing function or an image playing function), and the like. The data storage area may store data (such as audio data and a phone book) created during the use of the naked eye 3D display terminal 200, and the like. In addition, the memory 203 may include a high-speed random access memory, and may further include a nonvolatile memory, such as at least one magnetic disk storage device, a flash memory device, or a universal flash storage (UFS) device.
The external memory interface 202 may be used to connect an external memory card, such as a Micro SD card, to implement the extended storage capability of the naked-eye 3D display terminal 200. The external memory card communicates with the processor 201 through the external memory interface 202, implementing a data storage function.
In some embodiments, the memory of the naked eye 3D display terminal may comprise an (internal) memory 203, an external memory card to which the external memory interface 202 is connected, or a combination thereof. In other embodiments of the present invention, the video signal interface may also adopt different internal interface connection modes or combinations thereof in the above embodiments.
In the embodiment of the present invention, the camera unit 221 may capture an image or a video.
In some embodiments, the naked-eye 3D display terminal 200 implements a display function through the video signal interface 140, the 3D processing device 130, the multi-view naked-eye 3D display screen 110, and the application processor.
In some embodiments, the naked eye 3D display terminal 200 may include a GPU, for example, within the processor 201 for processing 3D video images, as well as for processing 2D video images.
In some embodiments, the naked eye 3D display terminal 200 further includes a video codec 224 for compressing or decompressing digital video.
In some embodiments, the video signal interface 140 is used to output 3D video signals, e.g., video frames of decompressed 3D video signals, processed by the GPU or the codec 224, or both, to the 3D processing device 130.
In some embodiments, the GPU or codec 224 is integrated with a formatter.
The multi-view naked eye 3D display screen 110 is used to display three-dimensional (3D) images or videos, etc. The multi-view naked eye 3D display screen 110 includes a display panel. The display panel may be a liquid crystal display (LCD), an organic light-emitting diode (OLED) display, an active-matrix organic light-emitting diode (AMOLED) display, a flexible light-emitting diode (FLED) display, a Mini-LED display, a Micro-OLED display, a quantum dot light-emitting diode (QLED) display, or the like.
In some embodiments, the eye tracking device 150 is communicatively connected to the 3D processing device 130, so that the 3D processing device 130 can render the corresponding sub-pixels of the composite pixels (composite sub-pixels) based on the eye tracking data. In some embodiments, the eye tracking device 150 may also be connected to the processor 201, for example in a bypass connection.
The naked-eye 3D display terminal 200 may implement audio functions, such as music playback and recording, through the audio module 212, the speaker 213, the receiver 214, the microphone 215, the earphone interface 216, and the application processor. The audio module 212 is used to convert digital audio information into an analog audio signal output and to convert an analog audio input into a digital audio signal. The audio module 212 may also be used to encode and decode audio signals. In some embodiments, the audio module 212 may be disposed in the processor 201, or some functional modules of the audio module 212 may be disposed in the processor 201. The speaker 213 is used to convert an audio electrical signal into a sound signal. The naked-eye 3D display terminal 200 may play music or take a hands-free call through the speaker 213. The receiver 214, also called an "earpiece", is used to convert an audio electrical signal into a sound signal. When the naked-eye 3D display terminal 200 receives a call or voice information, the voice can be heard by bringing the receiver 214 close to the ear. The microphone 215 is used to convert a sound signal into an electrical signal. The earphone interface 216 is used to connect wired earphones. The earphone interface 216 may be the USB interface 204, a 3.5 mm open mobile terminal platform (OMTP) standard interface, or a Cellular Telecommunications Industry Association (CTIA) standard interface.
The keys 217 include a power key, volume keys, and the like. The keys 217 may be mechanical keys or touch keys. The naked-eye 3D display terminal 200 may receive key inputs and generate key signal inputs related to user settings and function control of the naked-eye 3D display terminal 200.
The motor 218 may generate a vibration indication. The motor 218 may be used for both an electrical vibration alert and for touch vibration feedback.
The SIM card interface 220 is used to connect a SIM card. In some embodiments, the naked-eye 3D display terminal 200 employs an eSIM, i.e., an embedded SIM card.
The pressure sensor 2303 is used for sensing a pressure signal, and converting the pressure signal into an electrical signal. In some embodiments, the pressure sensor 2303 may be disposed on the multi-view naked eye 3D display screen 110, which falls within the scope of the present invention.
The air pressure sensor 2304 is used to measure air pressure. In some embodiments, the naked eye 3D display terminal 200 calculates the altitude through the barometric pressure value measured by the barometric pressure sensor 2304, to assist in positioning and navigation.
The magnetic sensor 2305 includes a hall sensor.
The gravity sensor 2306 is a sensor that converts motion or gravity into an electrical signal, and is mainly used for measuring parameters such as an inclination angle, an inertial force, an impact, and vibration.
The gyro sensor 2307 may be used to determine the motion gesture of the naked eye 3D display terminal 200.
The acceleration sensor 2308 may detect the magnitude of acceleration of the naked eye 3D display terminal 200 in various directions (generally, three axes).
The distance sensor 2309 may be used to measure distance.
A temperature sensor 2310 may be used to detect temperature.
The fingerprint sensor 2311 is used for collecting fingerprints. The naked eye 3D display terminal 200 may utilize the collected fingerprint characteristics to implement fingerprint unlocking, access an application lock, fingerprint photographing, fingerprint incoming call answering, and the like.
The touch sensor 2312 may be disposed in the multi-view naked eye 3D display screen 110, and the touch sensor 2312 and the multi-view naked eye 3D display screen 110 together form a touch display, also referred to as a "touch screen".
The bone conduction sensor 2313 may acquire a vibration signal.
The charging management module 205 is configured to receive charging input from a charger. The charger may be a wireless charger or a wired charger. In some wired charging embodiments, the charging management module 205 may receive charging input from a wired charger via the USB interface 204. In some wireless charging embodiments, the charging management module 205 may receive a wireless charging input through a wireless charging coil of the naked eye 3D display terminal 200.
The power management module 206 is used to connect the battery 207, the charging management module 205 and the processor 201. The power management module 206 receives the input of the battery 207 and/or the charging management module 205, and supplies power to the processor 201, the memory 203, the external memory, the multi-view naked-eye 3D display screen 110, the camera unit 221, the wireless communication module 210, and the like. In other embodiments, the power management module 206 and the charging management module 205 may be disposed in the same device.
The software system of the naked eye 3D display terminal 200 may adopt a layered architecture, an event-driven architecture, a micro-kernel architecture, a micro-service architecture, or a cloud architecture. The embodiments of the present invention take an Android system with a layered architecture as an example to explain the software structure of the naked eye 3D display terminal 200. It is contemplated that embodiments of the present invention may be implemented in different software systems, such as other operating systems.
Fig. 3 is a schematic diagram of a software structure of the naked eye 3D display terminal 200 according to an embodiment of the present invention. The layered architecture divides the software into several layers. The layers communicate with each other through a software interface. In some embodiments, the android system is divided into four layers, from top to bottom, an application layer 310, a framework layer 320, a core class library and Runtime (Runtime)330, and a kernel layer 340.
The application layer 310 may include a series of application packages. As shown in fig. 3, the application packages may include bluetooth, WLAN, navigation, music, camera, calendar, telephony, video, gallery, map, short message, etc. applications. The 3D video display method according to embodiments of the present invention may be implemented, for example, in the video application.
Framework layer 320 provides an Application Programming Interface (API) and programming framework for applications at the application layer. The framework layer includes some predefined functions. For example, in some embodiments of the present invention, functions or algorithms for identifying the captured 3D video image, algorithms for processing the image, and the like may be included in the framework layer.
As shown in FIG. 3, the framework layer 320 may include an explorer, a phone manager, a content manager, a notification manager, a window manager, a view system, an installation package manager, and the like.
The android Runtime includes a core library and a virtual machine. The android Runtime is responsible for scheduling and managing the android system.
The core library comprises two parts: one part comprises functions to be called by the Java language, and the other part is the core library of android.
The application layer and the framework layer run in the virtual machine. The virtual machine executes the java files of the application layer and the framework layer as binary files. The virtual machine performs functions such as object life cycle management, stack management, thread management, security and exception management, and garbage collection.
The core class library may include a plurality of functional modules. For example: a 3D graphics processing library (e.g., OpenGL ES), a surface manager, an image processing library, a media library, a graphics engine (e.g., SGL), and the like.
The kernel layer 340 is a layer between hardware and software. The kernel layer at least comprises a camera driver, an audio and video interface, a communication interface, a Wi-Fi interface, sensor drivers, power management, and a GPS interface.
Here, an embodiment of 3D video transmission and display in a naked eye 3D display terminal, as a mobile terminal, having the structure shown in figs. 2 and 3 is described as an example. It is contemplated, however, that alternative embodiments may include additional or fewer features, or that the features may be changed.
In some embodiments, the naked eye 3D display terminal 200, for example a mobile terminal such as a smart cellular phone or a tablet computer, receives a compressed 3D video signal from a network, such as a cellular network, a WLAN network or Bluetooth, for example by means of the mobile communication module 208 and the antenna 209 or the wireless communication module 210 and the antenna 211 serving as external interfaces. The compressed 3D video signal undergoes image processing, codec processing and decompression, for example by the GPU 223. The decompressed 3D video signal is then transmitted to the at least one 3D processing device 130, for example via the video signal interface 140 serving as an internal interface, such as a MIPI interface or a mini-MIPI interface. According to embodiments of the present invention, a video frame of the decompressed 3D video signal includes two images or a composite image. Further, the 3D processing device 130 correspondingly renders the sub-pixels of the composite sub-pixels of the display screen, thereby implementing 3D video playback.
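The playback path just described (decompress, hand off over the internal interface, render) can be sketched in simplified form as follows. This is a hypothetical illustration only: the names `decode`, `ThreeDProcessor` and `play` are illustrative, the decode step is an identity placeholder for the GPU codec stage, and no real codec or MIPI API is modeled.

```python
def decode(compressed_frames):
    # Placeholder for GPU-side decompression of the received 3D video
    # signal; a real implementation would invoke a hardware codec.
    return list(compressed_frames)

class ThreeDProcessor:
    """Illustrative stand-in for the 3D processing device 130."""
    def __init__(self):
        self.frames_rendered = 0

    def render(self, frame):
        # A real device would write per-viewpoint values into the
        # sub-pixels of the composite sub-pixels for this frame.
        self.frames_rendered += 1

def play(compressed_frames, processor):
    # Decompress, then hand each frame to the 3D processing stage over
    # the internal video signal interface (e.g. the MIPI interface).
    for frame in decode(compressed_frames):
        processor.render(frame)
```

In this sketch each decoded frame is pushed to the processor in order, mirroring the external-interface to internal-interface flow described above.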
In other embodiments, the naked eye 3D display terminal 200 reads a compressed 3D video signal stored in the (internal) memory 203, or reads, through the external memory interface 202, a compressed 3D video signal stored in an external memory card, and implements 3D video playing through corresponding processing, transmission and rendering.
In some embodiments, the playing of the 3D video is implemented in a video application in the android system application layer 310.
In some embodiments, the video frame of the 3D video signal includes a composite image having a resolution of 2m×n or m×2n, such that after the video frame of the 3D video signal is transmitted, at least two sub-pixels of each composite sub-pixel of each composite pixel of the multi-view naked eye 3D display screen 110 are rendered based on the composite image.
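As a sketch of this composite-image layout, the hypothetical helper below splits a composite frame into its two constituent m×n images, assuming a 2m×n composite stacks the two images side by side and an m×2n composite stacks them top and bottom; the actual packing used by the terminal is not specified in this text, and the function name is illustrative.

```python
def split_composite(frame, side_by_side=True):
    """Split a composite frame (a list of pixel rows) into two views.

    side_by_side=True  -> 2m x n layout: each row holds both images.
    side_by_side=False -> m x 2n layout: top half and bottom half.
    """
    if side_by_side:
        half = len(frame[0]) // 2
        first = [row[:half] for row in frame]
        second = [row[half:] for row in frame]
    else:
        half = len(frame) // 2
        first, second = frame[:half], frame[half:]
    return first, second
```

Each recovered image would then supply the values for one of the at least two sub-pixels rendered per composite sub-pixel.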
In some embodiments, the naked eye 3D display terminal 200 may include an eye tracking device, or may read eye tracking data, to obtain real-time eye tracking data of the user, thereby implementing dynamic rendering of the multi-view naked eye 3D display screen 110.
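Such eye-tracking-driven dynamic rendering can be pictured as selecting, from the screen's fixed set of viewpoints, those nearest the tracked eye positions and refreshing only the corresponding sub-pixels. The function names and the one-dimensional viewpoint model below are assumptions made purely for illustration.

```python
def nearest_viewpoint(eye_x, viewpoint_positions):
    # Index of the viewpoint whose nominal lateral position is closest
    # to the tracked eye coordinate.
    return min(range(len(viewpoint_positions)),
               key=lambda i: abs(viewpoint_positions[i] - eye_x))

def viewpoints_to_render(left_eye_x, right_eye_x, viewpoint_positions):
    # Only the viewpoints covering the two tracked eyes need their
    # corresponding sub-pixels re-rendered for the current frame.
    return {nearest_viewpoint(left_eye_x, viewpoint_positions),
            nearest_viewpoint(right_eye_x, viewpoint_positions)}
```

With viewpoints nominally spaced along the viewing plane, only two of the many per-viewpoint sub-pixels in each composite sub-pixel need fresh data per frame, which is the saving dynamic rendering aims at.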
The devices, apparatuses, modules or units set forth in the above embodiments may be implemented by various possible entities. A typical implementation entity is a computer or a processor or other component thereof. In particular, the computer may be, for example, a personal computer, a laptop computer, a vehicle-mounted human-computer interaction device, a cellular phone, a camera phone, a smart phone, a personal digital assistant, a media player, a navigation device, an email device, a game console, a tablet computer, a wearable device, a smart television, an internet of things system, a smart home, an industrial computer, a single chip microcomputer system, or a combination of these devices. In a typical configuration, a computer may include one or more processors (CPUs), input/output interfaces, network interfaces, and memory. The memory may include forms of volatile memory in a computer readable medium, such as random access memory (RAM), and/or non-volatile memory, such as read-only memory (ROM) or flash memory (flash RAM).
The methods, programs, devices, apparatuses, etc., in embodiments of the present invention may be performed or implemented in a single computer or in multiple networked computers, or may be practiced in distributed computing environments. In these distributed computing environments, tasks are performed by remote processing devices that are linked through a communications network.
As will be appreciated by one skilled in the art, embodiments of the present description may be provided as a method, apparatus or computer program product. Accordingly, embodiments of the present description may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects.
Those skilled in the art will appreciate that the functional blocks/units or controllers and the associated method steps set forth in the above embodiments may be implemented in software, hardware, or a combination of software and hardware. For example, they may be implemented purely by computer readable program code, or the method steps may be logically programmed so that a controller performs the same functions partly or wholly in hardware, including but not limited to logic gates, switches, application specific integrated circuits, programmable logic controllers (e.g., FPGAs), and embedded microcontrollers.
In some embodiments of the invention, the components of the apparatus are described in the form of functional modules/units. It is contemplated that the various functional modules/units may be implemented in one or more "combined" functional modules/units and/or one or more software and/or hardware components. It is also conceivable that a single functional module/unit is implemented by a plurality of sub-functional modules or combinations of sub-units and/or by a plurality of software and/or hardware. The division of functional modules/units may be only one logical division of functions, and in particular implementations, multiple modules/units may be combined or may be integrated into another system. Further, the connection of modules, units, devices, systems and components thereof herein includes direct or indirect connections, encompassing possible electrical, mechanical, communication connections, including in particular wired or wireless connections between various interfaces, including but not limited to HDMI, thunderbolt, USB, WiFi, cellular networks.
In the embodiments of the present invention, the technical features, the flowcharts and/or the block diagrams of the methods, the programs may be applied to corresponding apparatuses, devices, systems and modules, units and components thereof. Conversely, various embodiments and features of apparatuses, devices, systems and modules, units, components thereof may be applied to methods, programs according to embodiments of the present invention. For example, the computer program instructions may be loaded onto a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, having corresponding functions or features, which implement one or more of the procedures of the flowcharts and/or one or more blocks of the block diagrams.
Methods, programs, and computer program instructions according to embodiments of the present invention may be stored in a computer-readable memory or medium that can direct a computer or other programmable data processing apparatus to function in a particular manner. Embodiments of the invention also relate to a readable memory or medium having stored thereon methods, programs, instructions that may implement embodiments of the invention.
Storage media include articles of manufacture that are permanent and non-permanent, removable and non-removable, and that may implement any method or technology for storage of information. The information may be computer readable instructions, data structures, modules of a program, or other data. Examples of storage media include, but are not limited to, phase change memory (PRAM), Static Random Access Memory (SRAM), Dynamic Random Access Memory (DRAM), other types of Random Access Memory (RAM), Read Only Memory (ROM), Electrically Erasable Programmable Read Only Memory (EEPROM), flash memory or other memory technology, compact disc read only memory (CD-ROM), Digital Versatile Discs (DVD) or other optical storage, magnetic cassettes, magnetic tape storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information that can be accessed by a computing device.
Unless specifically stated otherwise, the actions or steps of a method, program or process described in accordance with an embodiment of the present invention need not be performed in a particular order and still achieve desirable results. In some embodiments, multitasking and parallel processing may also be possible or may be advantageous.
While various embodiments of the invention have been described herein, the description of the various embodiments is not intended to be exhaustive or to limit the invention to the precise forms disclosed, and features and components that are the same or similar to one another may be omitted for clarity and conciseness. As used herein, "one embodiment," "some embodiments," "examples," "specific examples," or "some examples" are intended to apply to at least one embodiment or example, but not to all embodiments, in accordance with the present invention. And the above terms are not necessarily meant to refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics of the various embodiments may be combined in any suitable manner in any one or more embodiments or examples. Furthermore, various embodiments or examples and features of different embodiments or examples described in this specification can be combined and combined by one skilled in the art without contradiction.
As used herein, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes those elements but does not exclude the presence of other elements not expressly listed. For purposes of this disclosure and unless specifically stated otherwise, "a" means "one or more". To the extent that the term "includes" or "including" is used in this specification and the claims, it is intended to be inclusive in a manner similar to the term "comprising" as that term is interpreted when employed as a transitional word. Furthermore, to the extent that the term "or" is used (e.g., A or B), it will mean "A or B or both". When applicants intend to indicate "only A or B but not both," only "A or B but not both" will be used. Thus, use of the term "or" is inclusive and not exclusive.
Exemplary systems and methods of the present invention have been particularly shown and described with reference to the foregoing embodiments, which are merely illustrative of the best modes for carrying out the systems and methods. It will be appreciated by those skilled in the art that various changes in the embodiments of the systems and methods described herein may be made in practicing the systems and/or methods without departing from the spirit and scope of the invention as defined in the appended claims. It is intended that the following claims define the scope of the system and method and that the system and method within the scope of these claims and their equivalents be covered thereby. The above description of the present system and method should be understood to include all novel and non-obvious combinations of elements described herein, and claims may be presented in this or a later application to any novel and non-obvious combination of elements. Moreover, the foregoing embodiments are illustrative, and no single feature or element is essential to all possible combinations that may be claimed in this or a later application.

Claims (18)

1. A multi-view naked eye 3D display screen is characterized by comprising:
a display panel comprising a plurality of composite pixels, each of the plurality of composite pixels comprising a plurality of composite sub-pixels, each of the plurality of composite sub-pixels comprising a plurality of sub-pixels corresponding to a plurality of viewpoints of the multi-viewpoint naked-eye 3D display screen; and
a grating directly bonded on the display panel.
2. The multi-view naked eye 3D display screen according to claim 1, wherein the width p of each of the plurality of sub-pixels is configured to:
p≤(d×q)/(n×D),
wherein d is the sum of the thicknesses of the display panel and the grating, q is an interpupillary distance reference value, D is a preset viewing distance of the multi-view naked eye 3D display screen, and n is the refractive index of the grating.
3. The multi-view naked eye 3D display screen according to claim 2, wherein 1.3 ≤ n ≤ 1.6.
4. A multi-view naked eye 3D display screen according to claim 3, wherein n is 1.46.
5. A multi-view naked eye 3D display screen according to any one of claims 1 to 4, wherein each composite sub-pixel comprises a plurality of sub-pixels in a single row or column; or
each composite sub-pixel comprises a plurality of sub-pixels in an array.
6. A multi-view naked eye 3D display screen according to any one of claims 1 to 4, wherein the plurality of composite sub-pixels comprises at least one of a red composite sub-pixel, a green composite sub-pixel and a blue composite sub-pixel.
7. A multi-view naked eye 3D display screen according to any one of claims 1 to 4, wherein the size of the multi-view naked eye 3D display screen is greater than or equal to 43 inches.
8. The multi-view naked eye 3D display screen according to claim 7, wherein the size of the multi-view naked eye 3D display screen is 55 inches, 60 inches, 80 inches or 100 inches; or
the multi-view naked eye 3D display screen is a cinema screen.
9. The multi-view naked eye 3D display screen according to claim 8, wherein the width of each of the plurality of sub-pixels is less than 0.008 mm.
10. The multi-view naked eye 3D display screen according to claim 9, wherein the width of each of the plurality of sub-pixels is less than 0.0076 mm.
11. A multi-view naked eye 3D display screen according to any one of claims 1 to 4, wherein the display panel comprises:
a first substrate;
a second substrate spaced apart from the first substrate;
a color filter attached to a surface of the first substrate facing the second substrate;
a thin film transistor attached to a surface of the second substrate facing the first substrate;
the polaroid is attached to the surface, opposite to the first substrate, of the second substrate; and
a liquid crystal layer disposed between the first substrate and the second substrate;
wherein the grating is directly bonded to a surface of the first substrate facing away from the second substrate.
12. A multi-view naked eye 3D display screen according to claim 11, wherein the grating is obliquely bonded to the display panel.
13. The multi-view naked eye 3D display screen according to claim 12, wherein the grating comprises a plurality of cylindrical prism gratings.
14. A naked eye 3D display terminal, characterized by comprising a multi-view naked eye 3D display screen according to any one of claims 1 to 13.
15. The naked eye 3D display terminal according to claim 14, further comprising a 3D processing device configured to render respective sub-pixels of the plurality of composite sub-pixels in the multi-view naked eye 3D display screen based on a 3D signal.
16. The naked-eye 3D display terminal of claim 15, wherein the 3D processing device is further configured to perform a shift rendering on the sub-pixels in the composite sub-pixel according to a viewpoint position corresponding to a currently rendered sub-pixel and a viewpoint position corresponding to a next rendered sub-pixel.
17. The naked eye 3D display terminal according to claim 15 or 16, further comprising a memory configured to store a correspondence of sub-pixels to viewpoints;
wherein the 3D processing apparatus is configured to acquire the correspondence.
18. The naked eye 3D display terminal according to any one of claims 14 to 16, further comprising eyeball tracking data acquisition means configured to acquire eyeball tracking data of a user.
CN201922162206.5U 2019-12-05 2019-12-05 Multi-view naked eye 3D display screen and naked eye 3D display terminal Active CN211528831U (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201922162206.5U CN211528831U (en) 2019-12-05 2019-12-05 Multi-view naked eye 3D display screen and naked eye 3D display terminal

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201922162206.5U CN211528831U (en) 2019-12-05 2019-12-05 Multi-view naked eye 3D display screen and naked eye 3D display terminal

Publications (1)

Publication Number Publication Date
CN211528831U true CN211528831U (en) 2020-09-18

Family

ID=72444727

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201922162206.5U Active CN211528831U (en) 2019-12-05 2019-12-05 Multi-view naked eye 3D display screen and naked eye 3D display terminal

Country Status (1)

Country Link
CN (1) CN211528831U (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2021110040A1 (en) * 2019-12-05 2021-06-10 北京芯海视界三维科技有限公司 Multi-viewpoint 3d display screen and 3d display terminal


Similar Documents

Publication Publication Date Title
TWI818211B (en) Eye positioning device and method and 3D display device and method
TWI746302B (en) Multi-viewpoint 3D display, multi-viewpoint 3D display terminal
AU2020250124B2 (en) Image processing method and head mounted display device
CN112584125A (en) Three-dimensional image display apparatus and display method thereof
CN211791829U (en) 3D display device
CN104272732B (en) Image processing apparatus, method and shooting device
CN211128026U (en) Multi-view naked eye 3D display screen and multi-view naked eye 3D display terminal
EP4261675A1 (en) Screen display method and related apparatus
WO2021110033A1 (en) 3d display device, method and terminal
CN211528831U (en) Multi-view naked eye 3D display screen and naked eye 3D display terminal
US11924398B2 (en) Method for implementing 3D image display and 3D display device
CN105227828B (en) Filming apparatus and method
CN211791828U (en) 3D display device
WO2021110040A1 (en) Multi-viewpoint 3d display screen and 3d display terminal
WO2021110026A1 (en) Method for realizing 3d image display, and 3d display device
CN211930763U (en) 3D display device
CN112929645A (en) 3D display device, system and method, and 3D video data communication method
US20230007233A1 (en) Method for realizing 3d image display, and 3d display device
WO2022160795A1 (en) Display mode conversion method and apparatus based on light field display
WO2022127612A1 (en) Image calibration method and device
US20240233303A1 (en) Method for Eliminating Interference Pattern in Image, and Apparatus

Legal Events

Date Code Title Description
GR01 Patent grant
GR01 Patent grant
TR01 Transfer of patent right
TR01 Transfer of patent right

Effective date of registration: 20220815

Address after: 100055 1-1808c, 15th floor, building 1, 168 Guang'anmenwai street, Xicheng District, Beijing

Patentee after: Beijing Xinhai vision 3D Technology Co.,Ltd.

Address before: 100055 1-1808c, 15th floor, building 1, 168 Guang'anmenwai street, Xicheng District, Beijing

Patentee before: Beijing Xinhai vision 3D Technology Co.,Ltd.

Patentee before: Vision technology venture capital Pte. Ltd.

Patentee before: Diao Honghao