CN112929647A - 3D display device, method and terminal - Google Patents

3D display device, method and terminal

Info

Publication number
CN112929647A
Authority
CN
China
Prior art keywords
display
pixels
composite
sub
display device
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201911231396.XA
Other languages
Chinese (zh)
Inventor
刁鸿浩
黄玲溪
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Vision Technology Venture Capital Pte Ltd
Beijing Ivisual 3D Technology Co Ltd
Original Assignee
Vision Technology Venture Capital Pte Ltd
Beijing Ivisual 3D Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Vision Technology Venture Capital Pte Ltd, Beijing Ivisual 3D Technology Co Ltd filed Critical Vision Technology Venture Capital Pte Ltd
Priority to CN201911231396.XA
Publication of CN112929647A
Legal status: Pending

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 13/30 Image reproducers
    • H04N 13/302 Image reproducers for viewing without the aid of special glasses, i.e. using autostereoscopic displays
    • H04N 13/10 Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N 13/106 Processing image signals
    • H04N 13/122 Improving the 3D impression of stereoscopic images by modifying image signal contents, e.g. by filtering or adding monoscopic depth cues
    • H04N 13/167 Synchronising or controlling image signals
    • H04N 13/366 Image reproducers using viewer tracking
    • H04N 13/398 Synchronisation thereof; Control thereof

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

The application relates to naked-eye stereoscopic display technology and discloses a 3D display device comprising: a multi-view naked-eye 3D display screen including a plurality of composite pixels, each composite pixel including a plurality of composite sub-pixels, each composite sub-pixel composed of a plurality of sub-pixels corresponding to a plurality of viewpoints, wherein the multi-view naked-eye 3D display screen is divided into at least two independently driven display areas; and at least two 3D processing devices, each corresponding to a different one of the at least two display areas and configured to render sub-pixels within the corresponding display area based on a 3D signal. The 3D display device can achieve smooth 3D display. The application also discloses a 3D display method and a 3D display terminal.

Description

3D display device, method and terminal
Technical Field
The present application relates to naked-eye 3D display technology, for example to a 3D display device, a 3D display method, and a 3D display terminal.
Background
3D (stereoscopic) imaging is one of the hot technologies in the video industry and is driving the shift from flat display to 3D display. 3D display technology is a key part of the 3D imaging industry and is mainly divided into two types: glasses-type 3D display and naked-eye 3D display. Naked-eye 3D display is a technology in which a user can view a 3D image on a display screen without wearing glasses. Compared with glasses-type 3D display, naked-eye 3D display is an autostereoscopic display technology and places fewer constraints on the user.
Naked-eye 3D display is viewpoint-based. Recently, multi-viewpoint naked-eye 3D display has been proposed, in which a sequence of parallax images (frames) is formed at different positions in space, so that a pair of 3D images having a parallax relationship can enter the left and right eyes of a person respectively, giving the user a 3D sensation. For a conventional multi-view naked-eye 3D display with, for example, N viewpoints, the multiple viewpoints in space are projected with multiple independent pixels on the display panel.
However, in a conventional naked-eye 3D display, the 3D display effect is produced by placing a grating on one side or both sides of a 2D display panel, and the transmission and display of 3D images or video are based on that 2D display panel. This creates a dilemma of reduced resolution and a surge in rendering computation.
The surge in rendering computation increases the computational burden on the display, which is likely to cause image display delays, stuttering, or other display-smoothness problems.
This background is only for convenience in understanding the relevant art in this field and is not to be taken as an admission of prior art.
Disclosure of Invention
The following presents a general summary of the embodiments in order to provide a basic understanding of some aspects of the disclosed embodiments. It is not intended to identify key or critical elements or to delineate the scope of protection, but serves as a prelude to the more detailed description that follows.
Embodiments of the present application are intended to provide a 3D display device, a 3D display method, and a 3D display terminal.
In one aspect, a 3D display device is provided, comprising: a multi-view naked-eye 3D display screen including a plurality of composite pixels, each composite pixel including a plurality of composite sub-pixels, each composite sub-pixel composed of a plurality of sub-pixels corresponding to a plurality of viewpoints, wherein the multi-view naked-eye 3D display screen is divided into at least two independently driven display areas; and at least two 3D processing devices, each corresponding to a different one of the at least two display areas and configured to render sub-pixels within the corresponding display area based on a 3D signal.
In embodiments of the present disclosure, the rendering for the at least two independently driven display areas of the multi-view naked-eye 3D display screen is processed in parallel by the at least two 3D processing devices, which can effectively reduce the rendering computation load.
In some embodiments, each of the at least two 3D processing devices is communicatively connected to a driving device of the corresponding display area.
In some embodiments, the driving means of the display area includes a row driver, a column driver, and a timing controller connecting the row driver and the column driver; each of the at least two 3D processing devices is communicatively connected to the timing controller of the driving device of the corresponding display region.
In some embodiments, further comprising: a synchronizer configured to synchronize renderings of the at least two 3D processing devices for the corresponding display regions.
In some embodiments, the 3D display device further comprises: an image splitter configured to split an image of the 3D signal according to the at least two independently driven display areas; wherein each of the at least two 3D processing devices is configured to render sub-pixels of the composite sub-pixels within the corresponding display area based on the split image.
In some embodiments, the multi-view naked-eye 3D display screen comprises m columns and n rows of composite pixels, and each of the at least two display areas comprises an integer number of rows or columns of composite pixels or composite sub-pixels.
In some embodiments, each composite pixel includes a plurality of composite sub-pixels arranged in columns, and each of the plurality of composite sub-pixels includes a plurality of sub-pixels arranged in rows.
In some embodiments, the multi-view naked-eye 3D display screen comprises at least two display areas arranged vertically side by side, such that each of the at least two display areas comprises p × n composite pixels, where p = m/a, a is the number of the at least two display areas, and p and a are natural numbers.
In some embodiments, each composite pixel includes a plurality of composite subpixels arranged in rows, and each of the plurality of composite subpixels includes a plurality of subpixels arranged in columns.
In some embodiments, the multi-view naked-eye 3D display screen comprises at least two display areas arranged laterally side by side, such that each of the at least two display areas comprises m × q composite pixels, where q = n/b, b is the number of the at least two display areas, and q and b are natural numbers.
In some embodiments, at least one of the at least two 3D processing devices is an FPGA or ASIC chip, or an FPGA or ASIC chipset.
In some embodiments, the 3D display device further comprises: an eye tracking data acquisition device configured to acquire eye tracking data, so that at least one of the at least two 3D processing devices can determine a viewpoint from the acquired eye tracking data and render, based on the 3D signal, the sub-pixels corresponding to that viewpoint among the plurality of composite sub-pixels.
In another aspect, a 3D display method is provided, comprising: acquiring a 3D signal; and rendering, region by region based on the 3D signal, the sub-pixels within at least two independently driven display areas into which the multi-view naked-eye 3D display screen is divided.
In embodiments of the present disclosure, since the at least two independently driven display areas of the multi-view naked-eye 3D display screen are driven in parallel and the sub-pixels of the composite sub-pixels in each area are rendered in parallel, region by region, rendering is faster, making the 3D display smoother.
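By way of illustration only, the following minimal Python sketch (with hypothetical names, not part of the original disclosure) models the region-by-region parallel rendering described above, with one worker standing in for one 3D processing device per display area:

    # A minimal sketch, assuming the frame buffer and the image of the 3D
    # signal have already been sliced per independently driven display area.
    from concurrent.futures import ThreadPoolExecutor

    def render_area(framebuffer_slice, image_slice):
        # One 3D processing device renders the sub-pixels of its own area.
        framebuffer_slice[:] = image_slice

    def render_frame_by_areas(framebuffer_slices, image_slices):
        # One worker per display area; the context manager waits for all
        # areas, so the frame is complete before the next frame starts.
        with ThreadPoolExecutor(max_workers=len(framebuffer_slices)) as pool:
            for fb, img in zip(framebuffer_slices, image_slices):
                pool.submit(render_area, fb, img)

Because each worker touches only its own slice, the per-area rendering loads are independent, which is the source of the reduced computation per device.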
In some embodiments, the region-by-region rendering of sub-pixels within the corresponding display areas is performed synchronously in time.
In some embodiments, the method further comprises: splitting an image of the 3D signal; wherein rendering region by region comprises: rendering the sub-pixels within the corresponding display area region by region based on the split image.
In some embodiments, the method further comprises: acquiring eye tracking data and rendering the sub-pixels within the corresponding display area region by region according to the eye tracking data.
In another aspect, a 3D display terminal is provided, comprising a processor and a memory storing program instructions, wherein the processor is configured to perform the method described above when executing the program instructions.
The foregoing general description and the following description are exemplary and explanatory only and are not restrictive of the application.
Drawings
One or more embodiments are illustrated by way of example in the accompanying drawings and are not limiting thereof; throughout the drawings, elements having the same reference numerals denote like elements.
fig. 1A and 1B are schematic structural views of a 3D display device according to an embodiment of the present disclosure;
fig. 2 is a hardware configuration diagram of a 3D display device according to an embodiment of the present disclosure;
fig. 3 is a software configuration diagram of the 3D display device shown in fig. 2;
fig. 4A-4C are schematic diagrams of a composite pixel according to an embodiment of the disclosure;
fig. 5A and 5B are schematic diagrams of a multi-view naked-eye 3D display screen and at least two 3D processing devices equipped according to an embodiment of the present disclosure;
fig. 6A and 6B are schematic diagrams of images of segmented 3D signals according to embodiments of the present disclosure;
fig. 7A and 7B are schematic diagrams of at least two display regions of an independent drive of a multi-view naked-eye 3D display screen according to an embodiment of the present disclosure;
fig. 8 is a schematic step diagram of a 3D display method according to an embodiment of the present disclosure;
fig. 9 is a schematic step diagram of a 3D display method according to an embodiment of the present disclosure;
fig. 10 is a schematic step diagram of a 3D display method according to an embodiment of the present disclosure;
fig. 11 is a schematic step diagram of a 3D display method according to an embodiment of the present disclosure;
fig. 12 is a schematic structural diagram of a 3D display terminal according to an embodiment of the present disclosure.
Reference numerals:
100: a 3D display device; 101: a processor; 122: a register; 140: a signal interface; 130: a 3D processing device; 131: a buffer; 132: a driving device; 1321: a timing controller; 1322: a column driver; 1323: a row driver; 133: a synchronizer; 110: a multi-view naked-eye 3D display screen; CP: a composite pixel; CSP: a composite sub-pixel; 200: a 3D display device; 201: a processor; 202: an external memory interface; 203: a memory; 204: a USB interface; 205: a charging management module; 206: a power management module; 207: a battery; 208: a mobile communication module; 209: an antenna; 210: a wireless communication module; 211: an antenna; 212: an audio module; 213: a speaker; 214: a telephone receiver; 215: a microphone; 216: an earphone interface; 217: a key; 218: a motor; 219: an indicator; 220: a SIM card interface; 221: a camera device; 222: a register; 223: a GPU; 224: a codec; 230: a sensor module; 2301: a proximity light sensor; 2302: an ambient light sensor; 2303: a pressure sensor; 2304: an air pressure sensor; 2305: a magnetic sensor; 2306: a gravity sensor; 2307: a gyroscope sensor; 2308: an acceleration sensor; 2309: a distance sensor; 2310: a temperature sensor; 2311: a fingerprint sensor; 2312: a touch sensor; 2313: a bone conduction sensor; 310: an application layer; 320: a framework layer; 330: core class library and Runtime; 340: a kernel layer; 400: a composite pixel; 410, 420, 430: composite sub-pixels arranged in a single column; 411, 421, 431: sub-pixels arranged in a single row; 440, 450, 460: composite sub-pixels arranged in a single row; 441, 451, 461: sub-pixels arranged in a single column; 470, 480, 490: composite sub-pixels arranged in a '品' (delta) shape; 471, 481, 491: sub-pixels in a 3 × 2 array; 601, 602: images each having m × n (signal) resolution; 603: a composite image having 2m × n (signal) resolution; 1200: a 3D display terminal; 1210: a processor; 1211: a memory; 1212: a communication interface; 1213: a bus.
Detailed Description
So that the manner in which the features and elements of the disclosed embodiments can be understood in detail, a more particular description of the disclosed embodiments, briefly summarized above, may be had by reference to the embodiments, some of which are illustrated in the appended drawings.
Definitions
Herein, "autostereoscopic (3D) display" refers to a technology in which a user (viewer) can observe a stereoscopic display image on a flat display without wearing glasses for stereoscopic display, and includes, but is not limited to, "parallax barrier", "lenticular lens", and "directional backlight" technologies.
In this context, "multi-view" has its conventional meaning in the art, meaning that different images displayed by different pixels or sub-pixels of the display screen can be viewed at different positions (viewpoints) in space. In this context, multi-view shall mean at least 3 views.
In this context, "grating" has a broad interpretation in the art, including but not limited to "parallax barrier" gratings and "lenticular" gratings, such as "lenticular" gratings.
Herein, "lens" or "lenticular" has the conventional meaning in the art, and includes, for example, cylindrical lenses and spherical lenses.
A conventional "pixel" means a 2D display or the smallest display unit in terms of its resolution when displayed as a 2D display.
However, in some embodiments herein, the term "composite pixel", as applied to multi-view technology in the field of autostereoscopic display, refers to the smallest display unit when an autostereoscopic display provides a multi-view display; this does not preclude a single composite pixel for multi-view technology from comprising, or appearing as, a plurality of 2D display pixels. Herein, unless specifically described as a composite pixel or 3D pixel for "3D display" or "multi-view" applications, "pixel" refers to the smallest display unit in 2D display. Likewise, a "composite sub-pixel" for multi-view autostereoscopic 3D display refers to a single-color composite sub-pixel within a composite pixel when the autostereoscopic display provides a multi-view display. Herein, a sub-pixel within a "composite sub-pixel" refers to the smallest display unit of a single color, which typically corresponds to a viewpoint.
In one aspect, a 3D display device is provided, comprising: a multi-view naked-eye 3D display screen comprising m × n composite pixels; a video signal interface configured to receive images of a 3D signal; and at least two 3D processing devices. Each composite pixel comprises a plurality of composite sub-pixels, and each composite sub-pixel is composed of i same-color sub-pixels corresponding to i viewpoints, where i ≥ 3. The multi-view naked-eye 3D display screen comprises at least two independently driven display areas, each 3D processing device is allocated one or more areas, and each 3D processing device is configured to render sub-pixels of the composite sub-pixels within its allocated areas based on the images of the 3D signal.
In embodiments of the present disclosure, the rendering for the at least two independently driven display areas of the multi-view naked-eye 3D display screen is processed in parallel by the at least two 3D processing devices, which can effectively reduce the rendering computation load.
In some embodiments, each 3D processing device is connected to the driving device of its respectively allocated area.
In some embodiments, the driving device of each area of the multi-view naked-eye 3D display screen comprises its own row driver, column driver, and a timing controller connecting the row driver and the column driver, wherein each 3D processing device is communicatively connected to the timing controller of its respectively allocated area. This configuration offers a particular advantage: display screen areas with separate row and column drivers are used to display the 3D video collectively as a whole.
In some embodiments, the 3D display device further comprises a synchronizer configured to synchronize the at least two 3D processing apparatuses.
In some embodiments, the 3D display device further includes an image splitter configured to split an image of the 3D signal according to the at least two independently driven display areas, wherein each 3D processing device is configured to render sub-pixels of the composite sub-pixels within its respectively allocated area based on the split image.
In some embodiments, each of the at least two independently driven display regions comprises an integer number of rows or columns of composite pixels or composite sub-pixels.
In some embodiments, each composite pixel comprises a single column of a plurality of composite sub-pixels, each composite sub-pixel comprising a single row of a plurality of sub-pixels.
In some embodiments, the multi-view naked-eye 3D display screen comprises at least two independently driven display areas arranged vertically side by side, such that each area comprises p × n composite pixels, where p = m/a, a is the number of the at least two independently driven display areas, and p and a are natural numbers.
In some embodiments, each composite pixel comprises a single row of a plurality of composite sub-pixels, each composite sub-pixel comprising a single column of the plurality of sub-pixels.
In some embodiments, the multi-view naked-eye 3D display screen comprises at least two independently driven display areas arranged laterally side by side, such that each area comprises m × q composite pixels, where q = n/b, b is the number of the at least two independently driven display areas, and q and b are natural numbers.
In some embodiments, the at least two 3D processing devices are FPGA or ASIC chips, or FPGA or ASIC chipsets.
In some embodiments, the 3D display device further comprises an eye tracking apparatus or an eye tracking data interface configured to acquire real-time eye tracking data.
In another aspect, a 3D display method for a multi-view naked-eye 3D display screen is provided, wherein the multi-view naked-eye 3D display screen comprises m × n composite pixels, each composite pixel comprises a plurality of composite sub-pixels, and each composite sub-pixel is composed of i same-color sub-pixels corresponding to i viewpoints, where i ≥ 3. The 3D display method comprises: transmitting images of the 3D signal; and driving the multi-view naked-eye 3D display screen region by region, rendering the sub-pixels of each composite sub-pixel region by region based on the images of the 3D signal.
In embodiments of the present disclosure, since the at least two independently driven display areas of the multi-view naked-eye 3D display screen are driven in parallel and the sub-pixels of the composite sub-pixels in each area are rendered in parallel, region by region, rendering is faster, making the 3D display smoother.
In some embodiments, the 3D display method further comprises: performing the region-by-region driving of the multi-view naked-eye 3D display screen and the region-by-region rendering of the sub-pixels of each composite sub-pixel synchronously in time.
In some embodiments, the 3D display method further comprises: splitting an image of the 3D signal; wherein rendering region by region comprises: rendering the sub-pixels of each composite sub-pixel region by region based on the split image.
In some embodiments, the 3D display method further comprises: acquiring real-time eye tracking data and rendering the sub-pixels of each composite sub-pixel region by region based on the real-time eye tracking data.
In another aspect, a 3D display device is provided, comprising a processor and a memory storing program instructions, and further comprising a multi-view naked-eye 3D display screen, wherein the multi-view naked-eye 3D display screen comprises m × n composite pixels, each composite pixel comprises a plurality of composite sub-pixels, and each composite sub-pixel is composed of i same-color sub-pixels corresponding to i viewpoints, where i ≥ 3; the processor is configured to perform the 3D display method described above when executing the program instructions.
Fig. 1A shows a schematic structural diagram of a 3D display device 100 according to an embodiment of the present disclosure. Referring to fig. 1A, an embodiment of the present disclosure provides a 3D display device 100 comprising: a multi-view naked-eye 3D display screen 110 including m columns and n rows of composite pixels CP, thus defining a display resolution of m × n; a signal interface 140 configured to receive images of a 3D signal, where the 3D signal may contain two images each with m × n (signal) resolution or a composite image with 2m × n or m × 2n (signal) resolution; and at least two 3D processing devices 130. Each composite pixel comprises a plurality of composite sub-pixels, and each composite sub-pixel is composed of i same-color sub-pixels corresponding to i viewpoints, where i ≥ 3. The multi-view naked-eye 3D display screen 110 is divided into at least two independently driven display areas; each 3D processing device 130 is allocated one or more areas and is configured to render sub-pixels of the composite sub-pixels within its allocated areas based on the images of the 3D signal. The multi-view naked-eye 3D display screen 110 may include a display panel and a grating (not labeled) overlaid on the display panel.
In the embodiment shown in fig. 1A, i = 6, but other values of i are conceivable. In the illustrated embodiment, the multi-view autostereoscopic display accordingly has i (i = 6) viewpoints (V1-V6), but more or fewer viewpoints are conceivable.
With combined reference to fig. 1A and 4A-4C, in the illustrated embodiment each composite pixel includes three composite sub-pixels, and each composite sub-pixel is composed of 6 same-color sub-pixels corresponding to the 6 viewpoints (i = 6). The three composite sub-pixels correspond to the three colors red (R), green (G), and blue (B), respectively.
In the embodiment shown in fig. 1A and 4A, the three composite sub-pixels 410, 420, 430 in the composite pixel 400 are arranged in columns, e.g. in a single column arrangement. Each composite subpixel 410, 420, 430 includes subpixels 411, 421, 431 arranged in a row, e.g., in a single row, respectively. It is conceivable that the composite sub-pixels in the composite pixel are arranged differently or that the sub-pixels in the composite sub-pixel are arranged differently.
As shown in fig. 4B, the three composite sub-pixels 440, 450, 460 in the composite pixel 400 are arranged in a row, for example, in a single row. Each composite sub-pixel 440, 450, 460 comprises sub-pixels 441, 451, 461, respectively, arranged in columns, e.g. in a single column.
As shown in fig. 4C, the three composite sub-pixels 470, 480, 490 in composite pixel 400 are illustratively arranged in a triangular layout shaped like the Chinese character '品'. In the embodiment shown in fig. 4C, the sub-pixels 471, 481, 491 in each composite sub-pixel 470, 480, 490 form a 3 × 2 array.
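For illustration, a minimal Python sketch (assumed names, not the patented structure) of the pixel hierarchy described above: one composite pixel holds one composite sub-pixel per color, and each composite sub-pixel holds i same-color sub-pixels, one per viewpoint:

    from dataclasses import dataclass, field
    from typing import List, Tuple

    I_VIEWPOINTS = 6  # i >= 3; i = 6 in the illustrated embodiment

    @dataclass
    class CompositeSubPixel:
        color: str  # "R", "G" or "B"
        # One intensity per viewpoint, i.e. i same-color sub-pixels.
        sub_pixels: List[float] = field(
            default_factory=lambda: [0.0] * I_VIEWPOINTS)

    @dataclass
    class CompositePixel:
        r: CompositeSubPixel = field(default_factory=lambda: CompositeSubPixel("R"))
        g: CompositeSubPixel = field(default_factory=lambda: CompositeSubPixel("G"))
        b: CompositeSubPixel = field(default_factory=lambda: CompositeSubPixel("B"))

        def set_view(self, view: int, rgb: Tuple[float, float, float]):
            # Render the three sub-pixels that correspond to one viewpoint.
            self.r.sub_pixels[view], self.g.sub_pixels[view], \
                self.b.sub_pixels[view] = rgb

Under this model, a screen of m × n composite pixels exposes m × n × 3 × i addressable sub-pixels while keeping a display resolution of m × n.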
In some embodiments, the 3D display device 100 is provided with at least two 3D processing means 130 that process the rendering of each composite sub-pixel of each composite pixel of the autostereoscopic display screen 110 in parallel, in series or in a combination of series and parallel. In the embodiment shown in fig. 1A, the 3D display device 100 is provided with two 3D processing means 130.
Those skilled in the art will appreciate that the at least two 3D processing devices may be distributed in other ways and process multiple rows and multiple columns of composite pixels or composite sub-pixels of the autostereoscopic display screen 110 in parallel, which falls within the scope of the embodiments of the present disclosure.
In some embodiments, the at least two 3D processing devices 130 may further optionally include a buffer 131 to buffer the image of the received 3D signal.
In some embodiments, the at least two 3D processing devices are FPGA or ASIC chips, or FPGA or ASIC chipsets.
With continued reference to fig. 1A, the 3D display device 100 may further include a processor 101 communicatively connected to the at least two 3D processing devices 130 through the signal interface 140. In some embodiments illustrated herein, the processor 101 is included in, or serves as a processor unit of, a computer or an intelligent terminal such as a mobile terminal. It is conceivable that in some embodiments the processor 101 may be disposed outside the 3D display device; for example, the 3D display device may be a multi-view autostereoscopic display with 3D processing devices, such as a non-intelligent autostereoscopic television or a mobile television installed in a public transportation facility.
For simplicity, the exemplary embodiments of the 3D display device hereinafter include a processor internally. The signal interface 140 is then configured as an internal interface connecting the processor 101 and the 3D processing devices 130; this structure can be understood more clearly with reference to the 3D display device 200 implemented in a mobile terminal shown in fig. 2 and 3. In some embodiments of the present disclosure, the signal interface 140, as an internal interface of the 3D display device 200, may be a MIPI, mini-MIPI, LVDS, mini-LVDS, or DisplayPort interface. In some embodiments, as shown in fig. 1A, the processor 101 of the 3D display device 100 may further include a register 122, which may be used to temporarily store instructions, data, and addresses.
In some embodiments, at least one of the at least two 3D processing devices is communicatively connected to a multi-view naked eye 3D display screen.
In some embodiments, the multi-view naked-eye 3D display screen includes at least two independently driven display regions, each 3D processing device being configured to be each allocated with one or more independently driven display regions. In the embodiment shown in fig. 5A, the 3D display device 100 is provided with 6 3D processing means 130, the multi-view naked eye 3D display screen 110 includes 6 independently driven display regions, and each of the 3D processing means 130 is configured to be allocated with one independently driven display region. In other embodiments, not shown, each 3D processing device is configured to be assigned more than one independently driven display area, for example, is configured to be assigned two independently driven display areas.
In some embodiments, each 3D processing device is configured to render sub-pixels of the composite sub-pixels within the respective assigned region based on an image of the 3D signal. In the embodiment shown in fig. 5A, 6 3D processing devices 130 are configured to render sub-pixels of the composite sub-pixels within each assigned one of the independently driven display regions based on the image of the 3D signal.
In some embodiments, each 3D processing device is connected to the driving device of its respectively allocated area. In the embodiment shown in fig. 5A, each of the 6 3D processing devices 130 is connected to the driving device 132 of its respectively allocated independently driven display area.
In some embodiments, the driving means of each region of the multi-view naked eye 3D display screen comprises a respective row driver, a column driver and a timing controller connecting the row driver and the column driver, wherein each 3D processing means is communicatively connected to the timing controller of the respective assigned region. In the embodiment shown in fig. 5B, the 3D display device 100 is provided with at least two 3D processing means 130, and the multi-view naked-eye 3D display screen 110 includes at least two display regions that are independently driven. Each 3D processing device 130 is configured to be assigned with an independently driven display area. The driving means 132 of each independently driven display region comprises a respective row driver 1323, column driver 1322 and timing controller 1321 connected to the row driver 1323 and column driver 1322.
Illustratively, the composite sub-pixels of the m columns and n rows of composite pixels of the multi-view naked-eye 3D display screen 110 are written using an addressing scheme: for each independently driven display area, an entire row of composite sub-pixels in the area is updated simultaneously by the column driver 1322, proceeding sequentially from the first row to the last row, and so on. In one line period, the timing controller 1321 fetches the 3D video data for an entire row of composite sub-pixels; the fetched data includes, for example, viewpoint-related address information for the entire row of composite sub-pixels and intensity information for the sub-pixels contained in each composite sub-pixel of that row. The timing controller 1321 obtains the 3D video data from the 3D processing device 130, distributes the intensity information to the column driver 1322, and simultaneously sends the address information to the row driver 1323 to address the entire row of composite sub-pixels and their sub-pixels. In the embodiment shown in fig. 5B, the timing controller 1321 is illustratively connected to the column driver 1322 via a mini-LVDS interface.
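The following Python pseudocode sketches this row-by-row addressing scheme; the driver interfaces are hypothetical and merely stand in for the hardware described above:

    # A minimal sketch, assuming hypothetical fetch_line/select/write
    # interfaces on the timing controller and the row/column drivers.
    def refresh_area(timing_controller, row_driver, column_driver, n_rows):
        for row in range(n_rows):  # from the first row to the last row
            # 3D video data for one whole row of composite sub-pixels:
            # viewpoint-related address data plus per-sub-pixel intensities.
            line = timing_controller.fetch_line(row)
            row_driver.select(line.address)        # address the whole row
            column_driver.write(line.intensities)  # update it simultaneously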
In some embodiments, as shown in fig. 5B, the 3D display device 100 further includes a synchronizer 133 configured to synchronize the at least two 3D processing devices 130, so that each independently driven display area is driven synchronously and the sub-pixels in the same row of composite sub-pixels in each area are rendered synchronously; equivalently, the clock signals of the independently driven display areas are synchronized.
Illustratively, the synchronization may be achieved by an external synchronization method: before the 3D video data is sent to each timing controller, a dedicated synchronization clock signal is sent from outside to each timing controller, and each timing controller locks its clock pulse frequency to the received synchronization clock signal, thereby achieving synchronous driving and synchronous rendering.
Illustratively, the synchronization may be achieved by a self-synchronization method: the 3D video data sent by the at least two 3D processing devices to their respective timing controllers itself carries a synchronization clock signal, and each timing controller extracts the synchronization clock signal from the received 3D video data, thereby achieving synchronous driving and synchronous rendering.
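As an illustration of the common effect of both schemes, the following sketch (hypothetical, Python threading) models synchronized zoned rendering: a threading.Barrier stands in for the shared synchronization clock, so no area begins a new frame until every area has finished the current one:

    import threading

    def synced_area_worker(area_id, frames, barrier, render_area):
        for frame in frames:
            render_area(area_id, frame)  # zoned rendering for this area
            barrier.wait()  # all areas align here, like a shared clock edge

    def run_synchronized(num_areas, frames, render_area):
        barrier = threading.Barrier(num_areas)
        workers = [threading.Thread(target=synced_area_worker,
                                    args=(i, frames, barrier, render_area))
                   for i in range(num_areas)]
        for w in workers:
            w.start()
        for w in workers:
            w.join()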
In some embodiments, the 3D display device further comprises an image splitter configured to split an image of the 3D signal according to the at least two independently driven display areas of the multi-view naked-eye 3D display screen, wherein each 3D processing device is configured to render sub-pixels of the composite sub-pixels within its respectively allocated area based on the split image.
Illustratively, referring to fig. 5A and 6A in combination, the 3D display device 100 has 6 3D processing devices, the multi-view naked-eye 3D display screen 110 has a display resolution of m × n and is divided into 6 independently driven display areas, and the 3D signal contains two images 601, 602 with m × n (signal) resolution in a side-by-side format.
In some embodiments, the two images 601, 602 may be a left-eye parallax image and a right-eye parallax image, respectively. In the embodiment shown in fig. 6A, the left-eye parallax image and the right-eye parallax image are each divided into 6 parts, and the 6 3D processing devices render the sub-pixels of the composite sub-pixels within their respectively allocated independently driven display areas based on the divided images.
In some embodiments, the two images may be a rendered color image and a depth image, respectively. In this case, the rendered color image is decomposed, as an intermediate color image, into left and right color images; the left and right color images are refined using the depth information contained in the depth image to form the left-eye and right-eye parallax images, which are then split and rendered.
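By way of example only, a simplified depth-image-based rendering (DIBR) sketch in Python with numpy; this is a generic technique, not the patented algorithm, and omits hole filling for brevity. Each pixel of the color image is shifted horizontally by a disparity derived from its depth to synthesize the two eye views:

    import numpy as np

    def synthesize_views(color, depth, max_disparity=8):
        # color: (h, w, 3) array; depth: (h, w) array (larger = nearer).
        h, w = depth.shape
        left = np.zeros_like(color)
        right = np.zeros_like(color)
        disparity = (depth / max(depth.max(), 1e-6) * max_disparity).astype(int)
        for y in range(h):
            for x in range(w):
                d = disparity[y, x]
                if x + d < w:
                    left[y, x + d] = color[y, x]   # shift right for left eye
                if x - d >= 0:
                    right[y, x - d] = color[y, x]  # shift left for right eye
        return left, right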
In other embodiments, not shown, the 3D signal contains two images with m × n resolution in a top-bottom format. The two images may be a left-eye parallax image and a right-eye parallax image, or a rendering color image and a depth image.
Exemplarily, referring to fig. 5A and 6B in combination, the 3D display device 100 has 6 3D processing means 130, the multi-view naked eye 3D display screen 110 has a display resolution of m × n and is divided into 6 independently driven display areas, and the 3D signal contains a composite image 603 having a resolution of 2m × n. In other embodiments, not shown, the 3D signal contains a composite image having a resolution of m x 2 n.
In some embodiments, the composite image 603 may be interleaved left-eye and right-eye parallax images with 2m × n resolution. In this case, the composite image 603 is first split into two images, a left-eye parallax image and a right-eye parallax image each with m × n resolution; the two images are then each divided into 6 parts, and the 6 3D processing devices render the sub-pixels of the composite sub-pixels within their respectively allocated independently driven display areas based on the divided left-eye and right-eye parallax images.
In some embodiments, the composite image may be an interleaved rendered color image and depth image with 2m × n resolution. In this case, the composite image is first split into two images, a rendered color image and a depth image each with m × n resolution; the rendered color image is then decomposed, as an intermediate color image, into left and right color images, which are refined using the depth information contained in the depth image to form the left-eye and right-eye parallax images; the images are then split and rendered.
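For illustration, a minimal numpy sketch (assumed left-right layout) of the splitting just described: the 2m × n composite image is first cut into the two m × n parallax images, and each is then cut into a vertical slices, one per independently driven display area:

    import numpy as np

    def split_composite(composite, a):
        # composite: (n, 2m, 3) array in left-right interleaved format.
        n_rows, two_m = composite.shape[:2]
        m = two_m // 2
        left, right = composite[:, :m], composite[:, m:]
        left_parts = np.array_split(left, a, axis=1)    # a column slices
        right_parts = np.array_split(right, a, axis=1)
        return list(zip(left_parts, right_parts))  # one (L, R) pair per area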
In some embodiments, the interleaving of the composite image may be left-right interleaving, top-bottom interleaving, or checkerboard interleaving.
In some embodiments, two or more channels of 3D signals are transmitted to the at least two 3D processing devices, where each 3D signal may contain two images with m × n resolution or a composite image with 2m × n or m × 2n resolution. In this case, image splitting is performed as in the embodiments described above, and the at least two 3D processing devices render sub-pixels of the composite sub-pixels within their respectively allocated areas based on the split images of the two or more channels of 3D signals; that is, the images contained in the two or more channels of 3D signals are rendered simultaneously.
In some embodiments, the image splitter may be integrated in each of the at least two 3D processing devices, or in a 3D processing device group formed by the at least two 3D processing devices, or in a processor of the 3D display device, or in another part of the 3D display device having the image splitting function.
In some embodiments, the image splitter may be provided as a separate component.
In the embodiments described above, each composite sub-pixel is composed of 6 same-color sub-pixels corresponding to 6 viewpoints, the 3D display device 100 includes 6 3D processing devices, and the multi-view naked-eye 3D display screen 110 accordingly includes 6 independently driven display areas. These numbers of viewpoints, independently driven display areas, and 3D processing devices are given by way of example, not limitation, and need not be equal.
In some embodiments, each of the independently driven at least two display regions of the multi-view naked eye 3D display screen 110 of the 3D display device 100 comprises an integer number of rows or columns of composite pixels or composite sub-pixels.
In some embodiments, each composite pixel comprises a single column of a plurality of composite sub-pixels, each composite sub-pixel comprising a single row of a plurality of sub-pixels. Referring collectively to fig. 4A and 7A, each composite pixel includes a single column of 3 composite sub-pixels, each including a single row of 6 sub-pixels, corresponding to 6 viewpoints.
With continued reference to fig. 7A, the multi-view naked-eye 3D display screen 110 includes 6 independently driven display areas arranged vertically side by side; the areas are not drawn to actual scale, in order to illustrate the arrangement of the composite pixels and their composite sub-pixels within a single area. In the embodiment shown in fig. 7A, the multi-view naked-eye 3D display screen 110 includes m × n composite pixels, so that each area includes p × n composite pixels, where p = m/a, a is the number of independently driven display areas (here a = 6), and p and a are natural numbers.
In some embodiments, each composite pixel comprises a single row of a plurality of composite sub-pixels, each composite sub-pixel comprising a single column of the plurality of sub-pixels. Referring collectively to fig. 4B and 7B, each composite pixel includes a single row of 3 composite sub-pixels, each including a single column of 6 sub-pixels, corresponding to 6 viewpoints.
With continued reference to fig. 7B, the multi-view naked-eye 3D display screen 110 includes 6 independently driven display areas arranged laterally side by side; the areas are not drawn to actual scale, in order to illustrate the arrangement of the composite pixels and their composite sub-pixels within a single area. In the embodiment shown in fig. 7B, the multi-view naked-eye 3D display screen 110 includes m × n composite pixels, so that each area includes m × q composite pixels, where q = n/b, b is the number of independently driven display areas (here b = 6), and q and b are natural numbers.
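The two partitions can be summarized by a short sketch (illustrative only), treating the screen as an m × n grid of composite pixels:

    def partition_vertical(m, n, a):
        # Vertical side-by-side areas of p x n composite pixels, p = m / a.
        assert m % a == 0, "m must be divisible by the number of areas a"
        p = m // a
        return [(i * p, 0, p, n) for i in range(a)]  # (col0, row0, w, h)

    def partition_lateral(m, n, b):
        # Lateral side-by-side areas of m x q composite pixels, q = n / b.
        assert n % b == 0, "n must be divisible by the number of areas b"
        q = n // b
        return [(0, j * q, m, q) for j in range(b)]

For example, a hypothetical grid of 1920 × 1080 composite pixels split into a = 6 vertical areas yields areas of 320 × 1080 composite pixels each.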
In some embodiments, not shown, each composite pixel comprises a plurality of composite sub-pixels, each composite sub-pixel comprising a plurality of sub-pixels in an array. Such a composite pixel and composite sub-pixel can refer to the composite pixel and composite sub-pixel shown in fig. 4C, for example. In this case, the multi-view naked-eye 3D display screen includes at least two independently driven display regions arranged side by side in an array, each region including an integer number of rows or columns of composite pixels or composite sub-pixels.
In some embodiments, each of the plurality of composite subpixels includes a red composite subpixel, a green composite subpixel, and a blue composite subpixel.
In some embodiments, the 3D display device further comprises an eye tracking apparatus or an eye tracking data interface configured to acquire real-time eye tracking data.
As previously mentioned, the 3D display device provided by some embodiments of the present disclosure may be a 3D display device comprising a processor. In some embodiments, the 3D display device may be configured as a smart cellular phone, a tablet, a smart television, a wearable device, an in-vehicle device, a notebook, an Ultra Mobile Personal Computer (UMPC), a netbook, a Personal Digital Assistant (PDA), or the like.
Exemplarily, fig. 2 shows a hardware configuration diagram of a 3D display device 200 implemented as a mobile terminal, such as a smart cellular phone or a tablet computer. The 3D display device 200 may include a processor 201, an external storage interface 202, an (internal) memory 203, a Universal Serial Bus (USB) interface 204, a charging management module 205, a power management module 206, a battery 207, a mobile communication module 208, a wireless communication module 210, antennas 209, 211, an audio module 212, a speaker 213, a receiver 214, a microphone 215, an earphone interface 216, a button 217, a motor 218, an indicator 219, a Subscriber Identity Module (SIM) card interface 220, a multi-view naked eye 3D display screen 110, at least two 3D processing apparatuses 130 (two are schematically shown in fig. 2), a signal interface 140, a camera 221, an eye tracking apparatus 150, a sensor module 230, and the like. Among other things, the sensor module 230 may include a proximity light sensor 2301, an ambient light sensor 2302, a pressure sensor 2303, a barometric pressure sensor 2304, a magnetic sensor 2305, a gravity sensor 2306, a gyroscope sensor 2307, an acceleration sensor 2308, a distance sensor 2309, a temperature sensor 2310, a fingerprint sensor 2311, a touch sensor 2312, a bone conduction sensor 2313, and the like.
It is to be understood that the illustrated structure of the embodiment of the present disclosure does not constitute a specific limitation to the 3D display device 200. In other embodiments, the 3D display device 200 may include more or fewer components than shown, or combine certain components, or split certain components, or a different arrangement of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor 201 may include one or more processing units, such as: the processor 201 may include an Application Processor (AP), a modem processor, a baseband processor, registers 222, a Graphics Processor (GPU)223, an Image Signal Processor (ISP), a controller, a memory, a codec 224, a Digital Signal Processor (DSP), a baseband processor, a neural Network Processor (NPU), etc., or combinations thereof. The different processing units may be separate devices or may be integrated into one or more processors.
A cache memory may also be provided in the processor 201 and configured to hold instructions or data that have just been used or recycled by the processor 201. When the processor 201 is to use the instructions or data again, it may be called directly from memory.
In some embodiments, the processor 201 may include one or more interfaces. The interfaces may include an integrated circuit (I2C) interface, an integrated circuit built-in audio (I2S) interface, a Pulse Code Modulation (PCM) interface, a Universal Asynchronous Receiver Transmitter (UART) interface, a Mobile Industry Processor Interface (MIPI), a General Purpose Input Output (GPIO) interface, a Subscriber Identity Module (SIM) interface, a Universal Serial Bus (USB) interface, and so forth.
The I2C interface is a bi-directional synchronous serial bus that includes a serial data line (SDA) and a Serial Clock Line (SCL). In some embodiments, the processor 201 may include multiple sets of I2C buses. The processor 201 may be communicatively connected to the touch sensor 2312, the charger, the flash, the camera 221, the eye tracking device 150, etc. through different I2C bus interfaces.
In the embodiment shown in fig. 2, the MIPI interface may be configured to connect the processor 201 with the multi-view naked-eye 3D display screen 110. In addition, the MIPI interface may also be configured to connect peripheral devices such as the camera 221, the eye tracking device 150, and the like.
It is understood that the interfacing relationship between the modules illustrated in the embodiments of the present disclosure is only an exemplary illustration and does not constitute a structural limitation of the 3D display device 200.
The wireless communication function of the 3D display device 200 may be implemented by the antennas 209 and 211, the mobile communication module 208, the wireless communication module 210, a modem processor or a baseband processor, and the like.
The antennas 209, 211 are configured to transmit and receive electromagnetic wave signals. Each antenna in the 3D display device 200 may be configured to cover a single or multiple communication bands. Different antennas can also be multiplexed to improve the utilization of the antennas.
The mobile communication module 208 may provide a solution including 2G/3G/4G/5G wireless communication applied on the 3D display device 200. In some embodiments, at least some of the functional modules of the mobile communication module 208 may be disposed in the processor 201. In some embodiments, at least some of the functional modules of the mobile communication module 208 may be disposed in the same device as at least some of the modules of the processor 201.
The wireless communication module 210 may provide a solution for wireless communication applied to the 3D display device 200, including Wireless Local Area Network (WLAN), Bluetooth (BT), Global Navigation Satellite System (GNSS), Frequency Modulation (FM), Near Field Communication (NFC), Infrared (IR), and the like. The wireless communication module 210 may be one or more devices integrating at least one communication processing module.
In some embodiments, the antenna 209 and the mobile communication module 208 of the 3D display device 200 are coupled and the antenna 211 and the wireless communication module 210 are coupled so that the 3D display device 200 can communicate with a network and other devices through a wireless communication technology. The wireless communication technology may include at least one of global system for mobile communications (GSM), General Packet Radio Service (GPRS), Code Division Multiple Access (CDMA), Wideband Code Division Multiple Access (WCDMA), time division code division multiple access (TD-SCDMA), Long Term Evolution (LTE), BT, GNSS, WLAN, NFC, FM, or IR technology, among others.
In some embodiments, the external interface configured to receive the 3D signal may include the USB interface 204, the mobile communication module 208, the wireless communication module 210, or a combination thereof. Other possible interfaces configured to receive 3D signals, such as the interfaces described above, are also conceivable.
The memory 203 may be configured to store computer-executable program code, the executable program code comprising instructions. The processor 201 executes various functional applications of the 3D display device 200 and data processing by executing instructions stored in the memory 203.
The external memory interface 202 may be configured to connect an external memory card, such as a Micro SD card, to extend the storage capability of the 3D display device 200. The external memory card communicates with the processor 201 through the external memory interface 202, implementing a data storage function.
In some embodiments, the memory of the 3D display device may comprise an (internal) memory 203, an external memory card to which the external memory interface 202 is connected, or a combination thereof. In other embodiments of the present disclosure, the signal interface may also adopt different internal interface connection manners or a combination thereof in the above embodiments.
In an embodiment of the present disclosure, the camera 221 may capture an image or video.
In some embodiments, the 3D display apparatus 200 implements a display function through the signal interface 140, the at least two 3D processing devices 130, the multi-view naked eye 3D display screen 110, and the application processor, etc.
In some embodiments, the 3D display device 200 may include a GPU, for example, configured to process 3D video images within the processor 201, as well as 2D video images.
In some embodiments, the 3D display device 200 further includes a codec 224 configured to compress or decompress digital video.
In some embodiments, the signal interface 140 is configured to output images of the 3D signals, e.g., decompressed 3D signals, processed by the GPU or the codec 224, or both, to the 3D processing device 130.
In some embodiments, the GPU or codec 224 is integrated with a formatter.
The multi-view naked-eye 3D display screen 110 is configured to display 3D images, video, and the like, and includes a display panel. The display panel may be a liquid crystal display (LCD), an organic light-emitting diode (OLED) display, an active-matrix organic light-emitting diode (AMOLED) display, a flexible light-emitting diode (FLED) display, a Mini-LED, a Micro-OLED, a quantum-dot light-emitting diode (QLED) display, or the like.
In some embodiments, the 3D display device 200 may further include an eye tracking data acquisition means, such as an eye tracking apparatus 150 configured to acquire eye tracking data in real time, or an eye tracking data interface, so that the at least two 3D processing devices 130 can render the respective sub-pixels of the composite pixels (composite sub-pixels) based on the eye tracking data. In other embodiments, the eye tracking apparatus 150 is communicatively connected to the at least two 3D processing devices 130. Illustratively, the eye tracking apparatus 150 may also be connected to the processor 201, for example through a bypass connection. Illustratively, the eye tracking apparatus 150 may be connected to both the processor 201 and the at least two 3D processing devices 130.
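By way of illustration, the following hedged sketch (all helper names are hypothetical) shows how eye tracking data can reduce the rendering load: only the sub-pixels of the viewpoints nearest the tracked eyes are rendered, rather than all i viewpoints:

    def render_with_eye_tracking(areas, eye_tracker, left_image, right_image):
        # Map each tracked eye to its nearest viewpoint (e.g. 2 and 4 of 6).
        left_view = eye_tracker.nearest_viewpoint("left")
        right_view = eye_tracker.nearest_viewpoint("right")
        for area in areas:  # each area handled by its own 3D processing device
            for cp in area.composite_pixels():
                cp.set_view(left_view, left_image.rgb_at(cp.position))
                cp.set_view(right_view, right_image.rgb_at(cp.position))

Here set_view follows the composite-pixel sketch given earlier; with only two of the i viewpoints rendered per frame, the per-device rendering load drops accordingly.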
The 3D display device 200 may implement an audio function through the audio module 212, the speaker 213, the receiver 214, the microphone 215, the earphone interface 216, and the application processor, etc.
The keys 217 include a power-on key, a volume key, and the like. The keys 217 may be mechanical keys. Or may be touch keys. The 3D display device 200 may receive a key input, and generate a key signal input related to user setting and function control of the 3D display device 200.
The motor 218 may generate a vibration indication. The motor 218 may be configured to provide an electrical vibration alert, and may also be configured to provide a touch vibration feedback.
The SIM card interface 220 is configured to connect a SIM card. In some embodiments, the 3D display device 200 employs eSIM, namely: an embedded SIM card.
The pressure sensor 2303 is configured to sense a pressure signal, which may be converted into an electrical signal. In some embodiments, the pressure sensor 2303 may be disposed on the multi-view naked eye 3D display screen 110, which falls within the scope of the disclosed embodiments.
The air pressure sensor 2304 is configured to measure air pressure.
The magnetic sensor 2305 includes a hall sensor.
The gravity sensor 2306 is a sensor that converts motion or gravity into an electrical signal, and is mainly configured to measure parameters such as a tilt angle, an inertial force, an impact, and vibration.
The gyro sensor 2307 may be configured to determine a motion gesture of the 3D display device 200.
The acceleration sensor 2308 may detect the magnitude of acceleration of the 3D display device 200 in various directions (typically three axes).
The distance sensor 2309 may be configured to measure distance.
The temperature sensor 2310 may be configured to detect temperature.
The fingerprint sensor 2311 is configured to acquire a fingerprint.
The touch sensor 2312 may be disposed in the multi-view naked-eye 3D display screen 110; the touch sensor 2312 and the multi-view naked-eye 3D display screen 110 together form a touch screen, also referred to as a "touch panel".
The bone conduction sensor 2313 may acquire a vibration signal.
The charging management module 205 is configured to receive charging input from a charger. The charger may be a wireless charger or a wired charger.
The power management module 206 is configured to connect the battery 207, the charging management module 205, and the processor 201.
The software system of the 3D display device 200 may employ a layered architecture, an event-driven architecture, a micro-kernel architecture, a micro-service architecture, or a cloud architecture. The embodiments of the present disclosure exemplify the software structure of the 3D display device 200 by taking an Android system with a layered architecture as an example. It is contemplated that embodiments of the present disclosure may be implemented in different software systems, such as other operating systems.
Fig. 3 is a software configuration diagram of the 3D display device 200 according to an embodiment of the present disclosure. The layered architecture divides the software into several layers, which communicate with each other through software interfaces. In some embodiments, the Android system is divided into four layers, from top to bottom: an application layer 310, a framework layer 320, a core class library and runtime (Runtime) 330, and a kernel layer 340.
The application layer 310 may include a series of application packages. As shown in fig. 3, the application packages may include bluetooth, WLAN, navigation, music, camera, calendar, telephony, video, gallery, map, short message, etc. applications. The 3D video display method according to the embodiments of the present disclosure may be implemented in, for example, a video application.
Framework layer 320 provides an Application Programming Interface (API) and programming framework for applications at the application layer. The framework layer includes some predefined functions. For example, in some embodiments of the present disclosure, functions or algorithms that identify captured 3D video images, algorithms that process images, and the like may be included at the framework layer.
As shown in Fig. 3, the framework layer 320 may include a resource manager, a phone manager, a content manager, a notification manager, a window manager, a view system, an installation package manager, and the like.
The android Runtime includes a core library and a virtual machine. The android Runtime is responsible for scheduling and managing the android system.
The core library comprises two parts: one part comprises the functions to be called by the java language, and the other part is the core library of Android.
The application layer and the framework layer run in the virtual machine. The virtual machine executes the java files of the application layer and the framework layer as binary files. The virtual machine is configured to perform functions such as object lifecycle management, stack management, thread management, security and exception management, and garbage collection.
The core class library may include a plurality of functional modules. For example: three-dimensional graphics processing libraries (e.g., OpenGL ES), surface managers, image processing libraries, media libraries, graphics engines (e.g., SGL), and the like.
The kernel layer 340 is a layer between hardware and software. The kernel layer at least comprises a camera driver, an audio and video interface, a communication interface, a WiFi interface, sensor drivers, power management, and a GPS interface.
Here, an embodiment of 3D video transmission and display in a 3D display device is described, taking as an example a 3D display device that is a mobile terminal having the structure shown in Figs. 2 and 3; it is contemplated, however, that alternative embodiments may include additional or fewer features, or may change the features described here.
In some embodiments, the 3D display apparatus 200, for example a mobile terminal such as a smart cellular phone or a tablet computer, receives a compressed 3D signal from a network, such as a cellular network, a WLAN network, or Bluetooth, for example by means of the mobile communication module 208 and the antenna 209, or the wireless communication module 210 and the antenna 211, acting as an external interface. The compressed 3D signal is subjected to image processing, codec processing, and decompression, for example by the GPU 223. The decompressed 3D signal is then sent, for example via the signal interface 140 acting as an internal interface, such as a MIPI interface or a mini-MIPI interface, to the at least one 3D processing device 130, and the images of the decompressed 3D signal include the two images or the composite image of the embodiments of the present disclosure. Further, the 3D processing device 130 renders the sub-pixels of the composite sub-pixels of the display screen accordingly, thereby implementing 3D video playback.
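Illustratively, the receive-decompress-render flow described above can be sketched as follows. This is a minimal, hedged sketch: the class and method names (PipelineSketch, Decompressor, RegionProcessor, and so on) are illustrative assumptions, not interfaces defined by the present disclosure.

```java
import java.util.List;

public class PipelineSketch {

    // Stand-in for the decompressed 3D signal: two images, or one composite image
    // with the second slot null. A placeholder, not the disclosure's data format.
    record Frame(int[][] first, int[][] secondOrNull) {}

    interface Decompressor {
        Frame decompress(byte[] compressed3dSignal); // e.g., codec/decompression on the GPU
    }

    interface RegionProcessor {
        void renderRegion(Frame frame); // renders sub-pixels inside one display area
    }

    // One delivery step: decompress once, then hand the frame to each display
    // area's 3D processing device over the internal interface.
    static void deliver(byte[] compressed, Decompressor codec, List<RegionProcessor> processors) {
        Frame frame = codec.decompress(compressed);
        for (RegionProcessor p : processors) {
            p.renderRegion(frame);
        }
    }

    public static void main(String[] args) {
        Decompressor codec = bytes -> new Frame(new int[2][2], null); // dummy codec
        List<RegionProcessor> processors = List.of(
                frame -> { /* render area 1 sub-pixels */ },
                frame -> { /* render area 2 sub-pixels */ });
        deliver(new byte[16], codec, processors);
    }
}
```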
In other embodiments, the 3D display device 200 reads the compressed 3D signal stored in the (internal) memory 203, or stored in an external memory card accessed through the external memory interface 202, and implements 3D video playback through corresponding processing, transmission, and rendering.
In some embodiments, the playing of the 3D video is implemented in a video application in the android system application layer 310.
Embodiments of the present disclosure may further provide a 3D display method for a multi-view naked eye 3D display screen, where the multi-view naked eye 3D display screen includes at least two independently driven display areas, and at least two 3D processing devices are provided, each corresponding to a different display area of the at least two display areas.
Referring to fig. 8, in some embodiments, a 3D display method includes:
S801: acquiring a 3D signal;
S802: rendering, based on the 3D signal and on a region-by-region basis, sub-pixels in at least two independently driven display areas partitioned in the multi-view naked eye 3D display screen.
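Illustratively, steps S801 and S802 may be sketched as follows, assuming the layout of claim 8 in which a screen of m columns and n rows of composite pixels is divided into a display areas of p = m/a composite-pixel columns each; the screen model and all names below are hypothetical placeholders, not the disclosure's implementation.

```java
public class RegionRenderSketch {

    // Render all sub-pixels whose composite pixels fall in columns [colStart, colEnd).
    // The int[m][n][views] screen model is a placeholder for the composite-pixel grid.
    static void renderRegion(int[][][] screen, int colStart, int colEnd) {
        for (int col = colStart; col < colEnd; col++) {
            for (int row = 0; row < screen[col].length; row++) {
                for (int view = 0; view < screen[col][row].length; view++) {
                    screen[col][row][view] = 0; // write viewpoint-dependent values here
                }
            }
        }
    }

    public static void main(String[] args) {
        int m = 1920, n = 1080, a = 2;       // m columns, n rows, a display areas
        int p = m / a;                        // composite-pixel columns per area
        int[][][] screen = new int[m][n][6];  // assume 6 viewpoints per composite sub-pixel
        for (int area = 0; area < a; area++) {
            // in the device, each area's rendering runs on its own 3D processing device
            renderRegion(screen, area * p, (area + 1) * p);
        }
    }
}
```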
Referring to fig. 9, in some embodiments, a 3D display method includes:
S901: acquiring a 3D signal;
S902: rendering sub-pixels within the corresponding display areas in a time-synchronized manner.
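Illustratively, the time synchronization of step S902 may be modeled with a standard JDK primitive: each 3D processing device is represented by a thread, and a CyclicBarrier plays the role of the synchronizer so that per-area rendering of each frame starts together. This mapping is an assumption made for illustration only.

```java
import java.util.concurrent.CyclicBarrier;

public class SyncRenderSketch {

    static void renderRegionFrame(int area, int frame) {
        // region-local sub-pixel rendering for this frame would go here
        System.out.println("area " + area + " rendered frame " + frame);
    }

    public static void main(String[] args) {
        int areas = 2;
        CyclicBarrier frameBarrier = new CyclicBarrier(areas);
        for (int a = 0; a < areas; a++) {
            final int area = a;
            new Thread(() -> {
                try {
                    for (int frame = 0; frame < 3; frame++) {
                        frameBarrier.await();           // no area starts until all are ready
                        renderRegionFrame(area, frame); // per-area rendering, time-synchronized
                    }
                } catch (Exception e) {
                    Thread.currentThread().interrupt();
                }
            }).start();
        }
    }
}
```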
Referring to fig. 10, in some embodiments, a 3D display method includes:
S1001: acquiring an image of the 3D signal;
S1002: segmenting the image of the 3D signal;
S1003: rendering sub-pixels within the corresponding display areas, region by region, based on the segmented image.
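Illustratively, the segmentation of step S1002 may be sketched as a split of the frame into equal-width vertical strips, one per display area, each strip then being rendered by the 3D processing device of the corresponding area in step S1003. Strip-wise splitting is an assumption consistent with the column-wise area layout, not a mandated implementation.

```java
import java.util.ArrayList;
import java.util.List;

public class ImageSplitSketch {

    // Split a height x width image (row-major int pixels) into `a` equal-width
    // vertical strips, one per display area.
    static List<int[][]> splitIntoStrips(int[][] image, int a) {
        int height = image.length, width = image[0].length, strip = width / a;
        List<int[][]> parts = new ArrayList<>();
        for (int s = 0; s < a; s++) {
            int[][] part = new int[height][strip];
            for (int y = 0; y < height; y++) {
                System.arraycopy(image[y], s * strip, part[y], 0, strip);
            }
            parts.add(part);
        }
        return parts;
    }

    public static void main(String[] args) {
        int[][] image = new int[1080][1920]; // a placeholder frame of the 3D signal
        List<int[][]> strips = splitIntoStrips(image, 2);
        System.out.println(strips.size() + " strips of width " + strips.get(0)[0].length);
    }
}
```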
Referring to fig. 11, in some embodiments, a 3D display method includes:
S1101: acquiring a 3D signal;
S1102: acquiring human eye tracking data;
S1103: rendering, region by region, sub-pixels in the corresponding display areas based on the 3D signal and the acquired human eye tracking data.
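Illustratively, steps S1102 and S1103 may be sketched as follows: the eye tracking data is reduced to the viewpoints at which the viewer's eyes sit, and only the sub-pixels of each composite sub-pixel that correspond to those viewpoints are rendered. The data model below is a hypothetical placeholder; the disclosure does not fix a particular representation or viewpoint mapping.

```java
public class EyeTrackRenderSketch {

    // Eye tracking data reduced to the viewpoint indices at which the eyes sit.
    record EyeViewpoints(int leftEyeView, int rightEyeView) {}

    // For each composite sub-pixel (an array of one value per viewpoint), render only
    // the sub-pixels of the tracked viewpoints; the rest can be skipped, saving work.
    static void renderForViews(int[][] compositeSubPixels, EyeViewpoints eyes,
                               int leftValue, int rightValue) {
        for (int[] compositeSubPixel : compositeSubPixels) {
            compositeSubPixel[eyes.leftEyeView()] = leftValue;
            compositeSubPixel[eyes.rightEyeView()] = rightValue;
        }
    }

    public static void main(String[] args) {
        int[][] region = new int[4][6]; // 4 composite sub-pixels, 6 viewpoints each
        renderForViews(region, new EyeViewpoints(2, 3), 255, 128);
    }
}
```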
An embodiment of the present disclosure provides a 3D display terminal 1200, and referring to fig. 12, the 3D display terminal includes:
a processor 1210 and a memory 1211, and may further include a communication interface 1212 and a bus 1213. The processor 1210, the communication interface 1212, and the memory 1211 communicate with each other via the bus 1213. The communication interface 1212 may be configured to transmit information. The processor 1210 may call logic instructions in the memory 1211 to perform the 3D display method of the above-described embodiments.
In addition, the logic instructions in the memory 1211 may be implemented in the form of software functional units and, when sold or used as an independent product, may be stored in a computer-readable storage medium.
The memory 1211, as a computer-readable storage medium, may be configured to store software programs and computer-executable programs, such as the program instructions/modules corresponding to the methods in the embodiments of the present disclosure. The processor 1210 executes functional applications and performs data processing by running the program instructions/modules stored in the memory 1211, that is, implements the 3D display method in the above-described method embodiments.
The memory 1211 may include a program storage area and a data storage area, where the program storage area may store an operating system and an application program required by at least one function, and the data storage area may store data created according to the use of the terminal device, and the like. In addition, the memory 1211 may include a high-speed random access memory, and may further include a non-volatile memory.
The systems, devices, modules or units illustrated in the above embodiments may be implemented by various possible entities. A typical implementation entity is a computer or a processor or other components thereof. The computer may be, for example, a personal computer, a laptop computer, a vehicle-mounted human-computer interaction device, a cellular telephone, a camera phone, a smart phone, a personal digital assistant, a media player, a navigation device, an email device, a gaming console, a tablet computer, a wearable device, a smart television, an internet of things system, a smart home device, an industrial computer, a single-chip system, or a combination of these devices. In a typical configuration, a computer may include one or more processors (CPUs), input/output interfaces, network interfaces, and memory. The memory may include volatile memory in a computer-readable medium, such as random access memory (RAM), and/or non-volatile memory, such as read-only memory (ROM) or flash memory (flash RAM).
The methods, programs, systems, apparatuses, and the like in the embodiments of the present application may be performed or implemented on a single computer or on multiple networked computers, or may be practiced in distributed computing environments. In such distributed computing environments, tasks are performed by remote processing devices that are linked through a communications network.
As will be appreciated by one skilled in the art, embodiments of the present description may be provided as a method, system, or computer program product. Accordingly, embodiments of the present description may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects.
Those skilled in the art will appreciate that the functional modules/units or controllers and the associated method steps set forth in the above embodiments may be implemented in software, hardware, or a combination of software and hardware. For example, they may be implemented purely by computer-readable program code, or the method steps may be logically programmed so that a controller performs the same functions in hardware, in part or in whole, using hardware including but not limited to logic gates, switches, application-specific integrated circuits, programmable logic controllers (e.g., FPGAs), and embedded microcontrollers.
In some embodiments of the present application, the components of the apparatus are described in the form of functional modules/units. It is contemplated that multiple functional modules/units may be implemented as one or more "combined" functional modules/units and/or one or more software and/or hardware components. It is also conceivable that a single functional module/unit may be implemented by a plurality of sub-functional modules or combinations of sub-units and/or by a plurality of software and/or hardware components. The division into functional modules/units may be only a logical division of functions; in particular implementations, multiple modules/units may be combined or integrated into another system. Furthermore, references herein to the connection of modules, units, devices, systems and components thereof include direct or indirect connections, encompassing possible electrical, mechanical and communicative connections, in particular wired or wireless connections between various interfaces, including but not limited to HDMI, Thunderbolt, USB, WiFi, and cellular networks.
In the embodiments of the present application, the technical features, flowcharts, and/or block diagrams of the methods and programs may be applied to the corresponding apparatuses, devices, and systems, and to the modules, units, and components thereof. Conversely, the various embodiments and features of the apparatuses, devices, systems, and the modules, units, and components thereof may be applied to the methods and programs according to the embodiments of the present application. For example, computer program instructions may be loaded onto a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine having corresponding functions or features, which implements the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
Methods, programs, and computer program instructions according to embodiments of the present application may be stored in a computer-readable memory or medium that can direct a computer or other programmable data processing apparatus to function in a particular manner. The embodiments of the present application also relate to a readable memory or medium storing a method, a program, and instructions that can implement the embodiments of the present application.
Storage media include articles of manufacture that are permanent and non-permanent, removable and non-removable, and that may implement any method or technology for storage of information. The information may be computer readable instructions, data structures, modules of a program, or other data. Examples of storage media include, but are not limited to, phase change memory (PRAM), Static Random Access Memory (SRAM), Dynamic Random Access Memory (DRAM), other types of Random Access Memory (RAM), Read Only Memory (ROM), Electrically Erasable Programmable Read Only Memory (EEPROM), flash memory or other memory technology, compact disc read only memory (CD-ROM), Digital Versatile Discs (DVD) or other optical storage, magnetic cassettes, magnetic tape storage or other magnetic storage devices, or any other non-transmission medium that can be configured to store information that can be accessed by a computing device.
Unless specifically stated otherwise, the actions or steps of a method, program or process described in accordance with the embodiments of the present application do not have to be performed in a particular order and still achieve desirable results. In some embodiments, multitasking and parallel processing may also be possible or may be advantageous.
The exemplary systems and methods of the present application have been particularly shown and described with reference to the foregoing embodiments, which are merely illustrative of the best modes for carrying out the systems and methods. It will be appreciated by those skilled in the art that various changes in the embodiments of the systems and methods described herein may be made in practicing the systems and/or methods without departing from the spirit and scope of the application as defined in the appended claims. It is intended that the following claims define the scope of the system and method and that the system and method within the scope of these claims and their equivalents be covered thereby. The above description of the present system and method should be understood to include all novel and non-obvious combinations of elements described herein.

Claims (17)

1. A 3D display device, comprising:
a multi-view naked-eye 3D display screen including a plurality of composite pixels, each of the plurality of composite pixels including a plurality of composite sub-pixels, each of the plurality of composite sub-pixels being composed of a plurality of sub-pixels corresponding to a plurality of views; the multi-view naked eye 3D display screen is divided into at least two display areas which are driven independently;
at least two 3D processing devices respectively corresponding to different display areas of the at least two display areas and configured to render sub-pixels within the corresponding display areas based on the 3D signal.
2. The 3D display device according to claim 1, wherein each of the at least two 3D processing devices is communicatively connected to a driving device of the corresponding display area.
3. The 3D display device according to claim 2, wherein
the driving device of the display area comprises a row driver, a column driver, and a timing controller connected with the row driver and the column driver; and
each of the at least two 3D processing devices is communicatively connected to the timing controller of the driving device of the corresponding display area.
4. The 3D display device according to claim 1, further comprising:
a synchronizer configured to synchronize the rendering of the at least two 3D processing devices for the corresponding display areas.
5. The 3D display device according to claim 1, further comprising:
an image divider configured to segment an image of the 3D signal based on the at least two independently driven display areas;
wherein each of the at least two 3D processing devices is configured to render sub-pixels of the composite sub-pixels within the corresponding display area based on the segmented image.
6. The 3D display device according to any one of claims 1 to 5, wherein the multi-view naked eye 3D display screen comprises m columns and n rows of composite pixels, and each of the at least two display areas comprises an integer number of rows or an integer number of columns of composite pixels or composite sub-pixels.
7. The 3D display device according to claim 6,
each composite pixel includes a plurality of composite sub-pixels arranged in columns, and each of the plurality of composite sub-pixels includes a plurality of sub-pixels arranged in rows.
8. The 3D display device according to claim 7, wherein the multi-view naked-eye 3D display screen comprises the at least two display areas arranged vertically side by side such that each of the at least two display areas comprises p x n composite pixels;
wherein p is m/a, a is the number of the at least two display regions, and p and a are natural numbers.
9. The 3D display device according to claim 6, wherein each composite pixel comprises a plurality of composite sub-pixels arranged in rows, and each composite sub-pixel in the plurality of composite sub-pixels comprises a plurality of sub-pixels arranged in columns.
10. The 3D display device according to claim 9, wherein the multi-view naked-eye 3D display screen comprises the at least two display areas arranged laterally side by side such that each of the at least two display areas comprises m x q composite pixels;
wherein q is n/b, b is the number of the at least two display regions, and q, b are natural numbers.
11. The 3D display device according to any one of claims 1 to 5, wherein at least one of the at least two 3D processing devices is an FPGA or ASIC chip, or an FPGA or ASIC chipset.
12. The 3D display device according to any one of claims 1 to 5, further comprising:
a human eye tracking data acquisition device configured to acquire human eye tracking data, so that at least one of the at least two 3D processing devices determines a viewpoint according to the acquired human eye tracking data and renders, based on the 3D signal, the sub-pixels corresponding to the viewpoint among the plurality of composite sub-pixels.
13. A 3D display method, comprising:
acquiring a 3D signal; and
rendering, based on the 3D signal and on a region-by-region basis, sub-pixels in at least two independently driven display areas partitioned in the multi-view naked eye 3D display screen.
14. The method of claim 13, wherein the region-by-region rendering of sub-pixels within the corresponding display areas is performed in a time-synchronized manner.
15. The method of claim 13 or 14, further comprising: segmenting an image of the 3D signal;
wherein the region-by-region rendering comprises: rendering sub-pixels within the corresponding display areas, region by region, based on the segmented image.
16. The method of claim 13 or 14, further comprising:
acquiring human eye tracking data, and rendering sub-pixels in the corresponding display areas, region by region, according to the human eye tracking data.
17. A 3D display terminal comprising a processor and a memory storing program instructions, characterized in that the processor is configured to perform the method according to any of claims 13 to 16 when executing the program instructions.
CN201911231396.XA 2019-12-05 2019-12-05 3D display device, method and terminal Pending CN112929647A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911231396.XA CN112929647A (en) 2019-12-05 2019-12-05 3D display device, method and terminal

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911231396.XA CN112929647A (en) 2019-12-05 2019-12-05 3D display device, method and terminal

Publications (1)

Publication Number Publication Date
CN112929647A true CN112929647A (en) 2021-06-08

Family

ID=76160766

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911231396.XA Pending CN112929647A (en) 2019-12-05 2019-12-05 3D display device, method and terminal

Country Status (1)

Country Link
CN (1) CN112929647A (en)

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113315964A (en) * 2021-06-21 2021-08-27 北京京东方光电科技有限公司 Display method and device of 3D image and electronic equipment
CN113315964B (en) * 2021-06-21 2023-04-14 北京京东方光电科技有限公司 Display method and device of 3D image and electronic equipment
WO2023024112A1 (en) * 2021-08-27 2023-03-02 京东方科技集团股份有限公司 Display panel, display apparatus and driving method therefor, and image rendering method
EP4276806A1 (en) * 2021-08-27 2023-11-15 BOE Technology Group Co., Ltd. Display panel, display apparatus and driving method therefor, and image rendering method
EP4276806A4 (en) * 2021-08-27 2024-05-01 Boe Technology Group Co Ltd Display panel, display apparatus and driving method therefor, and image rendering method
US20240169948A1 (en) * 2021-08-27 2024-05-23 Boe Technology Group Co., Ltd. Display panel, display apparatus and driving method thereof
CN114079765A (en) * 2021-11-17 2022-02-22 京东方科技集团股份有限公司 Image display method, device and system
CN114079765B (en) * 2021-11-17 2024-05-28 京东方科技集团股份有限公司 Image display method, device and system
CN114040184A (en) * 2021-11-26 2022-02-11 京东方科技集团股份有限公司 Image display method, system, storage medium and computer program product

Similar Documents

Publication Publication Date Title
CN112929647A (en) 3D display device, method and terminal
TWI746302B (en) Multi-viewpoint 3D display, multi-viewpoint 3D display terminal
CN1892808B (en) Stereoscopic image display device
EP4068769A1 (en) Eye positioning device and method, and 3d display device and method
CN112584125A (en) Three-dimensional image display apparatus and display method thereof
JP4629838B2 (en) Stereoscopic image generation apparatus and stereoscopic image generation method
CN110557626B (en) Image display method and electronic equipment
CN112933599A (en) Three-dimensional model rendering method, device, equipment and storage medium
CN211791828U (en) 3D display device
CN101237480B (en) 3d display multi-screen mobile phone and multi-screen display control method
TWI782361B (en) 3D display device, method and terminal
CN110192391B (en) Processing method and equipment
WO2021110027A1 (en) Method for implementing 3d image display and 3d display device
CN211128026U (en) Multi-view naked eye 3D display screen and multi-view naked eye 3D display terminal
KR20120053548A (en) Display driver circuit, operating method thereof, and user device including that
CN211528831U (en) Multi-view naked eye 3D display screen and naked eye 3D display terminal
CN112929645A (en) 3D display device, system and method, and 3D video data communication method
US20120133744A1 (en) Stereoscopic image generation apparatus and method
WO2021110040A1 (en) Multi-viewpoint 3d display screen and 3d display terminal
TWI840636B (en) Method for realizing 3D image display, and 3D display device
CN112929637A (en) Method for realizing 3D image display and 3D display equipment
CN112929641B (en) 3D image display method and 3D display device
JP2001331169A (en) Stereoscopic video display device and information storage medium
EP4068773A1 (en) Method for realizing 3d image display, and 3d display device
CN112911268A (en) Image display method and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination