CN112929645A - 3D display device, system and method, and 3D video data communication method - Google Patents


Info

Publication number
CN112929645A
CN112929645A
Authority
CN
China
Prior art keywords
display
composite
images
pixels
sub
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201911231387.0A
Other languages
Chinese (zh)
Inventor
刁鸿浩
黄玲溪
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Vision Technology Venture Capital Pte Ltd
Beijing Ivisual 3D Technology Co Ltd
Original Assignee
Vision Technology Venture Capital Pte Ltd
Beijing Ivisual 3D Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Vision Technology Venture Capital Pte Ltd, Beijing Ivisual 3D Technology Co Ltd filed Critical Vision Technology Venture Capital Pte Ltd
Priority to CN201911231387.0A priority Critical patent/CN112929645A/en
Publication of CN112929645A publication Critical patent/CN112929645A/en
Pending legal-status Critical Current

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00: Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30: Image reproducers
    • H04N13/302: Image reproducers for viewing without the aid of special glasses, i.e. using autostereoscopic displays
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00: Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/10: Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N13/106: Processing image signals

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

The application discloses a 3D display device, including: a multi-view naked-eye 3D display screen comprising a plurality of composite pixels, each composite pixel comprising a plurality of composite sub-pixels, and each composite sub-pixel being composed of a plurality of sub-pixels corresponding to a plurality of viewpoints; a signal interface configured to receive a video frame of a 3D signal, wherein the video frame contains at least three images or a composite image composed of at least three images; and a 3D processing device configured to render at least one sub-pixel among the composite sub-pixels based on at least one of the at least three images or on the composite image. With this device, the sub-pixels corresponding to multiple viewpoints within each composite sub-pixel can be rendered simultaneously from the images corresponding to those viewpoints. The application also discloses a 3D display system, a 3D display method, and a 3D video data communication method.

Description

3D display device, system and method, and 3D video data communication method
Technical Field
The present application relates to naked-eye 3D display technology, for example to 3D display devices, systems and methods and 3D video data communication methods.
Background
3D imaging is one of the hottest technologies in the video industry, and the shift from flat (2D) display to 3D display is under way. 3D display technology is a key part of the 3D imaging industry and falls mainly into two categories: glasses-based 3D display and naked-eye (glasses-free) 3D display. Naked-eye 3D display lets a user view a 3D display screen without wearing glasses. Compared with glasses-based 3D display, naked-eye 3D display is a free-viewing technology and places fewer constraints on the user.
Naked-eye 3D display is viewpoint-based. Multi-viewpoint naked-eye 3D display has recently been proposed: a sequence of parallax images (frames) is formed at different positions in space, so that a pair of 3D images with a parallax relationship can enter a person's left and right eyes respectively, giving the user a 3D impression. For a conventional multi-view naked-eye 3D display with, for example, N views, the transmission and display of 3D images or video is based on a 2D display panel, which projects the multiple spatial views with multiple independent pixels on the panel. This creates a dilemma: reduced display resolution and a surge in rendering computation.
Since the total resolution of the 2D display panel is fixed, the effective display resolution drops sharply; for example, the column resolution falls to 1/N of the original. Because of the pixel arrangement of a multi-view display, the horizontal and vertical resolutions are also reduced by different factors.
For an N-viewpoint 3D display device offering high definition, for example N times that of a 2D display device, maintaining high-definition display multiplies the occupied transmission bandwidth from the terminal to the display by N, so the amount of signal to transmit becomes too large. Moreover, pixel-level rendering of such N-fold high-resolution images can severely occupy the computing resources of the terminal or of the display itself, causing a significant performance degradation.
Furthermore, because the transmission and display of 3D images or video is based on a 2D display panel, problems of repeated format adjustment and of image or video display adaptation may also arise. This can further increase the amount of rendering computation on the one hand and degrade the display effect of the 3D image or video on the other.
In addition, in conventional naked-eye 3D display technology, the displayed content is usually generated as a single pair of two-dimensional images for the left and right eyes, which are then adjusted into the viewpoint images finally displayed, according to the viewpoint positions. The multiple computation steps in this process may occupy a large share of the computing resources of the terminal or of the display itself, causing a significant performance degradation.
This background is only for convenience in understanding the relevant art in this field and is not to be taken as an admission of prior art.
Disclosure of Invention
This summary is provided to give a basic understanding of some aspects of the disclosed embodiments. It is not intended to identify key or critical elements or to delineate the scope of protection, but serves as a prelude to the more detailed description that follows.
Embodiments of the present application aim to provide 3D display devices, systems and methods and 3D video data communication methods, which aim to overcome or alleviate at least some of the problems mentioned above.
In one aspect, a 3D display device is provided, including: a multi-view naked-eye 3D display screen comprising a plurality of composite pixels, each composite pixel comprising a plurality of composite sub-pixels, and each composite sub-pixel being composed of a plurality of sub-pixels corresponding to a plurality of viewpoints; a signal interface configured to receive a video frame of the 3D signal, wherein the video frame contains at least three images or a composite image composed of at least three images; and a 3D processing device configured to render at least one sub-pixel among the composite sub-pixels based on at least one of the at least three images or on the composite image.
With such a 3D display device, the sub-pixels corresponding to multiple viewpoints within each composite sub-pixel can be rendered simultaneously from the images corresponding to those viewpoints. This omits the steps, required by conventional naked-eye 3D display technology, of generating left- and right-eye images and then adjusting them into viewpoint images according to viewpoint positions, saving a large amount of computing resources and improving rendering efficiency. In some embodiments, the sub-pixels corresponding to the viewpoints are same-color sub-pixels; the multi-view naked-eye 3D display screen comprises m × n composite pixels and thus defines an m × n display resolution; and each of the at least three images has an image resolution of m × n.
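The composite-pixel hierarchy just described can be sketched as a simple data model. This is a non-normative illustration: the names `CompositeSubPixel`, `CompositePixel`, and `make_composite_pixel` are invented here, and the screen is reduced to plain nested lists.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class CompositeSubPixel:
    """One color component of a composite pixel: i same-color
    sub-pixels, one per viewpoint (multi-view means i >= 3)."""
    color: str                       # "R", "G" or "B"
    i_viewpoints: int                # number of viewpoints served
    values: List[int] = field(default_factory=list)  # one value per viewpoint

    def __post_init__(self):
        assert self.i_viewpoints >= 3, "multi-view means at least 3 viewpoints"
        if not self.values:
            self.values = [0] * self.i_viewpoints

@dataclass
class CompositePixel:
    """Smallest display unit of the multi-view naked-eye 3D screen."""
    subpixels: List[CompositeSubPixel]

def make_composite_pixel(i: int = 6) -> CompositePixel:
    # three composite sub-pixels, one per color, each serving i viewpoints
    return CompositePixel([CompositeSubPixel(c, i) for c in ("R", "G", "B")])

# An m x n grid of composite pixels defines an m x n display resolution,
# even though it physically contains m * n * 3 * i sub-pixels.
m, n, i = 4, 3, 6
screen = [[make_composite_pixel(i) for _ in range(m)] for _ in range(n)]
```

The point of the model is that resolution is counted in composite pixels, not in the (much larger) number of physical sub-pixels.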
In the disclosed embodiments, the display resolution of the multi-view naked-eye 3D display screen is defined in terms of composite pixels, so that both transmission and display take this composite-pixel resolution into account; this effectively reduces the amount of transmission and rendering computation while retaining an excellent display effect. By contrast, conventional 3D display still treats the transmission and display of 3D images or video as based on a 2D display panel, which leads to reduced resolution and increased computation for multi-view naked-eye 3D display, as well as to repeated format adjustment and image or video display adaptation.
In some embodiments, the signal interface is further configured to transmit identification data indicating a correspondence between the image and the viewpoint.
In some embodiments, the signal interface is a High-Definition Multimedia Interface (HDMI).
In some embodiments, the HDMI is configured to transmit Extended Display Identification Data (EDID) that includes identification data indicating the correspondence between images and viewpoints.
In some embodiments, the at least three images are arranged side by side (left-right) or stacked (top-bottom); or the composite image is a left-right interleaved composite image or a top-bottom interleaved composite image.
In some embodiments, in at least one of the plurality of composite sub-pixels, the sub-pixels are arranged in a single row, a single column, or an array.
In some embodiments, the 3D processing device comprises at least one of a field-programmable gate array (FPGA) chip and an application-specific integrated circuit (ASIC) chip.
In some embodiments, the 3D display device further comprises: an eye position acquisition means configured to acquire a user's eye position.
In another aspect, a 3D display system is provided, including: a terminal; and a 3D display device as described above; wherein the terminal is communicatively connected to the signal interface.
In some embodiments, the terminal includes a processor, a memory configured to store the 3D signal, and an image processing device; the image processing device is configured to process the 3D signal to obtain its video frames, wherein each video frame contains at least three images or a composite image composed of at least three images.
In some embodiments, the 3D display system further comprises: a formatter configured to process a video frame in the 3D signal so that each of at least three images has an image resolution consistent with a display resolution of the multi-view naked eye 3D display screen; wherein the display resolution of the multi-view naked eye 3D display screen is defined by the composite pixels.
In another aspect, there is provided a 3D display method including:
transmitting a video frame of the 3D signal, wherein the video frame contains at least three images or a composite image composed of at least three images; and rendering at least one sub-pixel among the composite sub-pixels of the multi-view naked-eye 3D display screen based on at least one of the at least three images or on the composite image; wherein the multi-view naked-eye 3D display screen comprises a plurality of composite pixels, each composite pixel comprises a plurality of composite sub-pixels, and each composite sub-pixel is composed of a plurality of sub-pixels corresponding to a plurality of viewpoints.
In some embodiments, the sub-pixels corresponding to the viewpoints are same-color sub-pixels; the multi-view naked-eye 3D display screen comprises m × n composite pixels and thus defines an m × n display resolution; and each of the at least three images has an image resolution of m × n.
In the disclosed embodiments, because the display resolution of the multi-view naked-eye 3D display screen matches the resolution of the video frames of the 3D signal, transmitting those video frames occupies no additional transmission bandwidth; and because the display resolution matches the image resolution of the images contained in the video frames, the generated images need no format conversion.
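Because the image resolution equals the composite-pixel display resolution, rendering can be point-to-point: sub-pixel j of a composite sub-pixel at position (x, y) simply takes the (x, y) sample of view image j. The sketch below illustrates this under assumptions: `blank_screen` and `render_point_to_point` are hypothetical names, and a composite pixel is reduced here to three per-viewpoint value lists.

```python
def blank_screen(m, n, i):
    """A composite pixel is modeled as three lists (R, G, B), each holding
    one value per viewpoint; the screen is an n-row x m-column grid."""
    return [[{"R": [0] * i, "G": [0] * i, "B": [0] * i} for _ in range(m)]
            for _ in range(n)]

def render_point_to_point(view_images, screen):
    """view_images: one m x n image per viewpoint, each a [n][m] grid of
    (r, g, b) tuples. Sub-pixel j at (x, y) takes the (x, y) sample of
    view image j -- no scaling or reformatting step is needed, because
    image resolution equals the composite-pixel display resolution."""
    for y, row in enumerate(screen):
        for x, cpixel in enumerate(row):
            for j, image in enumerate(view_images):   # viewpoint j
                r, g, b = image[y][x]
                cpixel["R"][j] = r
                cpixel["G"][j] = g
                cpixel["B"][j] = b
    return screen
```

Each viewpoint image is consumed exactly once per composite pixel, which is why no left/right-eye intermediate images or per-viewpoint adjustment passes are required.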
In another aspect, there is provided a 3D video data communication method including: acquiring identification data indicating the correspondence between images and viewpoints; and converting the initial 3D video data based on the identification data to obtain converted 3D video data containing video frames, wherein each video frame contains at least three images or a composite image composed of at least three images.
In some embodiments, the 3D video data communication method further comprises: the converted 3D video data is transmitted.
In some embodiments, acquiring the identification data indicating the correspondence between images and viewpoints and transmitting the converted 3D video data are both implemented via a High-Definition Multimedia Interface (HDMI).
In some embodiments, acquiring the identification data indicating the correspondence between images and viewpoints includes: receiving, via the HDMI, Extended Display Identification Data (EDID) that includes the identification data.
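The summary does not fix the byte layout of the identification data carried in the EDID. Purely as a hypothetical illustration, such data could map each image slot of the video frame to a viewpoint index; the layout and the name `parse_image_view_mapping` below are assumptions, not the patent's format.

```python
def parse_image_view_mapping(payload: bytes) -> dict:
    """Hypothetical decoder for identification data carried in an EDID
    vendor-specific data block: byte 0 gives the number of entries,
    followed by one byte per entry giving the viewpoint index assigned
    to image slot k of the video frame. This layout is assumed here
    for illustration only."""
    count = payload[0]
    return {slot: payload[1 + slot] for slot in range(count)}

# e.g. three images in the frame, mapped to viewpoints 0, 2 and 4
mapping = parse_image_view_mapping(bytes([3, 0, 2, 4]))
```

The terminal would read such a block once when communication is established and use the mapping for every subsequent frame.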
In some embodiments, transmitting the converted 3D video data comprises: transmitting the converted 3D video data with synchronized timing via the HDMI.
In some embodiments, the 3D video data communication method further comprises: initial 3D video data for multi-view naked eye 3D display is prepared.
In some embodiments, preparing initial 3D video data for multi-view naked eye 3D display comprises: reading the stored compressed 3D video data or receiving the compressed 3D video data.
The foregoing general description and the following description are exemplary and explanatory only and are not restrictive of the application.
Drawings
One or more embodiments are illustrated by way of example in the accompanying drawings; these exemplary illustrations do not constitute a limitation on the embodiments. In the drawings:
fig. 1A to 1C are schematic structural views of a 3D display device according to an embodiment of the present disclosure;
fig. 2 is a hardware configuration diagram of a 3D display device according to an embodiment of the present disclosure;
fig. 3 is a software structure diagram of a 3D display device according to an embodiment of the present disclosure;
fig. 4A-4C are schematic diagrams of a composite pixel according to an embodiment of the disclosure;
fig. 5A to 5D are schematic diagrams of formats of images included in video frames of a 3D signal according to an embodiment of the present disclosure;
fig. 6A is a schematic diagram of 6 images corresponding to 6 viewpoints in a top-bottom format included in a video frame of a 3D signal according to an embodiment of the present disclosure;
FIG. 6B is a content structure diagram of a Vendor Specific Data Block (VSDB) of Extended Display Identification Data (EDID) or enhanced extended display identification data (E-EDID) according to an embodiment of the present disclosure;
FIG. 6C is a content structure diagram of a package of vendor specific information frames, in accordance with an embodiment of the present disclosure;
fig. 7 is a schematic step diagram of a 3D display method according to an embodiment of the present disclosure;
fig. 8 is a schematic step diagram of a 3D video data communication method according to an embodiment of the present disclosure;
fig. 9 is a schematic structural diagram of a 3D display system according to an embodiment of the present disclosure.
Reference numerals:
100: a 3D display device; 101: a processor; 122: a register; 110: a multi-view naked-eye 3D display screen; 130: a 3D processing device; 131: a buffer; 140: a signal interface; 150: an eye tracking device; 151: an eye tracking data interface; CP: a composite pixel; CSP: a composite sub-pixel; 200: a 3D display device; 201: a processor; 202: an external memory interface; 203: a memory; 204: a USB interface; 205: a charging management module; 206: a power management module; 207: a battery; 208: a mobile communication module; 209: an antenna; 210: a wireless communication module; 211: an antenna; 212: an audio module; 213: a speaker; 214: a telephone receiver; 215: a microphone; 216: an earphone interface; 217: a key; 218: a motor; 219: an indicator; 220: a SIM card interface; 221: an image pickup unit; 222: a register; 223: a GPU; 224: a codec; 230: a sensor module; 2301: a proximity light sensor; 2302: an ambient light sensor; 2303: a pressure sensor; 2304: an air pressure sensor; 2305: a magnetic sensor; 2306: a gravity sensor; 2307: a gyroscope sensor; 2308: an acceleration sensor; 2309: a distance sensor; 2310: a temperature sensor; 2311: a fingerprint sensor; 2312: a touch sensor; 2313: a bone conduction sensor; 310: an application layer; 320: a framework layer; 330: core class library and runtime (Runtime); 340: a kernel layer; 400: a composite pixel; 410, 420, 430: composite sub-pixels arranged in a single column; 411, 421, 431: sub-pixels arranged in a single row; 440, 450, 460: composite sub-pixels arranged in a single row; 441, 451, 461: sub-pixels arranged in a single column; 470, 480, 490: composite sub-pixels arranged in the shape of the Chinese character '品' (pin); 471, 481, 491: sub-pixels arranged in an array; 501, 502, 503: three images arranged left and right; 504, 505, 506: three images arranged one above another; 507: a left-right interleaved composite image; 508: a top-bottom interleaved composite image; 601, 602, 603, 604, 605, 606: 6 images corresponding to 6 viewpoints; AS: an effective space; 900: a 3D display system; 910: a terminal; 911: a processor; 912: a memory; 913: an external interface; 914: an image processing device; 920: a 3D display device; 930: a bus; 940: a communication interface.
Detailed Description
So that the manner in which the features and elements of the disclosed embodiments can be understood in detail, a more particular description of the disclosed embodiments, briefly summarized above, may be had by reference to the embodiments, some of which are illustrated in the appended drawings.
Herein, "naked-eye 3D (stereoscopic) display" refers to a technology in which a user (e.g., a viewer) can observe a 3D display image on a flat display without wearing glasses for 3D display, and includes, but is not limited to, "parallax barrier", "lenticular lens", "directional backlight" technology.
In this context, "multi-view" has its conventional meaning in the art, meaning that different images displayed by different pixels or sub-pixels of the display screen can be viewed at different positions (viewpoints) in space. In this context, multi-view shall mean at least 3 views.
In this context, "grating" has a broad interpretation in the art, including but not limited to "parallax barrier" gratings and lens-based gratings such as "lenticular" gratings.
Herein, "lens" or "lenticular" has the conventional meaning in the art, and includes, for example, cylindrical lenses and spherical lenses.
Herein, a conventional "pixel" means the smallest display unit of a 2D display, counted in terms of the resolution at which it displays.
However, in some embodiments herein, when applied to multi-view naked-eye 3D display technology, the term "composite pixel" refers to the smallest display unit when the naked-eye 3D display provides a multi-view display; this does not exclude that a single composite pixel used for multi-view display may comprise, or appear as, multiple 2D display pixels. Herein, unless specifically described as a composite pixel or a 3D pixel for "3D display" or "multi-view" use, "pixel" refers to the smallest display unit of 2D display. Likewise, a "composite sub-pixel" for multi-view naked-eye 3D display refers to a single-color composite sub-pixel within a composite pixel when the naked-eye 3D display provides a multi-view display. Herein, a sub-pixel within a "composite sub-pixel" refers to the smallest single-color display unit, which typically corresponds to a viewpoint.
In an embodiment of the present disclosure, there is provided a 3D display device including: a multi-view naked-eye 3D display screen comprising a plurality of composite pixels, wherein each of the plurality of composite pixels comprises a plurality of composite sub-pixels, and each of the plurality of composite sub-pixels is composed of i sub-pixels corresponding to i viewpoints, wherein i ≧ 3; a signal interface configured to receive a video frame of the 3D signal, wherein the video frame contains at least three images or a composite image composed of at least three images; and a 3D processing device configured to render at least one sub-pixel among the plurality of composite sub-pixels based on at least one of the at least three images or on the composite image.
In an embodiment of the present disclosure, there is provided a 3D display system including: a terminal; and a 3D display device as described above; wherein the terminal is communicatively connected to the signal interface.
In an embodiment of the present disclosure, a 3D display method for a multi-view naked eye 3D display screen is provided, where the multi-view naked eye 3D display screen includes a plurality of composite pixels, each of the plurality of composite pixels includes a plurality of composite sub-pixels, each of the plurality of composite sub-pixels is composed of i sub-pixels corresponding to i viewpoints, where i is greater than or equal to 3, and the 3D display method includes: transmitting a video frame of the 3D signal, wherein the video frame of the 3D signal comprises at least three images or a composite image consisting of at least three images; and rendering at least one of the composite sub-pixels based on at least one of the at least three images or the composite image.
In an embodiment of the present disclosure, a 3D video data communication method for multi-view naked-eye 3D display is provided, including: preparing initial 3D video data for multi-view naked-eye 3D display; acquiring, when communication is established, identification data indicating the correspondence between images and viewpoints; converting the initial 3D video data into converted 3D video data based on the received identification data, the converted 3D video data including video frames having a plurality of images corresponding to multiple viewpoints; and transmitting the converted 3D video data.
Fig. 1A illustrates a schematic structure of a 3D display device 100 according to an embodiment of the present disclosure. Referring to fig. 1A, the 3D display device 100 includes a multi-view naked eye 3D display screen 110, a 3D processing apparatus 130, and a signal interface 140 for receiving video frames of a 3D signal.
The multi-view naked-eye 3D display screen 110 may include a display panel and a grating (not shown) covering the display panel. In the embodiment shown in fig. 1A, the multi-view naked-eye 3D display screen 110 may comprise m columns and n rows of composite pixels, i.e. m × n composite pixels, thus defining an m × n display resolution.
In some embodiments, the 3D processing device is in communication with a multi-view naked eye 3D display screen.
In some embodiments, the 3D processing means is communicatively connected with the driving means of the multi-view naked eye 3D display screen.
In some embodiments, each composite pixel includes a plurality of composite sub-pixels, each composite sub-pixel being made up of i same-color sub-pixels corresponding to i viewpoints, i ≧ 3. In the embodiment shown in fig. 1A, i is 6, but other values for i are conceivable, i.e. correspondingly more or fewer viewpoints.
Referring to fig. 1A and 4A in combination, in the illustrated embodiment, each composite pixel includes three composite sub-pixels, and each composite sub-pixel is composed of 6 same-color sub-pixels corresponding to 6 viewpoints (i = 6). The three composite sub-pixels correspond to three colors, i.e., red (R), green (G), and blue (B), respectively.
As shown in fig. 4A, three composite subpixels 410, 420, 430 in composite pixel 400 are arranged in a single column. Each composite subpixel 410, 420, 430 includes subpixels 411, 421, 431 arranged in a single row.
As shown in fig. 4B, the three composite sub-pixels 440, 450, 460 in the composite pixel 400 are arranged in a single row. Each composite subpixel 440, 450, 460 comprises subpixels 441, 451, 461 in a single column.
As shown in fig. 4C, the three composite sub-pixels 470, 480, 490 in composite pixel 400 are arranged, for example, in the shape of the Chinese character '品' (pin). The sub-pixels 471, 481, 491 in each composite sub-pixel 470, 480, 490 may be arranged in an array (3 × 2).
It is conceivable, however, for the composite sub-pixels to be arranged differently in the composite pixel or for the sub-pixels to be arranged differently in the composite sub-pixel.
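For the arrangement of fig. 4A (three composite sub-pixels stacked in a single column, each holding a single row of i sub-pixels), the physical panel coordinate of any sub-pixel can be computed directly. The function below is an illustrative sketch under that layout assumption; `subpixel_panel_coord` is not a name from the patent, and other arrangements would need different index math.

```python
def subpixel_panel_coord(cx, cy, color_index, view, i=6):
    """Map (composite-pixel column cx, row cy, color 0..2 for R/G/B,
    viewpoint 0..i-1) to a physical (panel_x, panel_y) for the fig. 4A
    layout: each composite pixel occupies i columns and 3 rows of the
    panel, one row per color and one column per viewpoint."""
    panel_x = cx * i + view
    panel_y = cy * 3 + color_index
    return panel_x, panel_y
```

Such a closed-form mapping is what makes point-to-point rendering cheap: no lookup tables are needed for a regular layout.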
In some embodiments, such as shown in fig. 1A-1C, the 3D display device 100 may be provided with a single 3D processing apparatus 130. The single 3D processing device 130 simultaneously processes the rendering of each composite sub-pixel of each composite pixel of the naked eye 3D display screen 110.
In further embodiments, the 3D display device 100 may be provided with at least two 3D processing means 130 which process the rendering of each composite sub-pixel of each composite pixel of the naked eye 3D display screen 110 in parallel, in series, in a combination of series and parallel, or in another way.
In some embodiments, the 3D processing device 130 may also optionally include a buffer 131 to buffer the received video frames.
In some embodiments, the 3D processing device is an FPGA or ASIC chip or an FPGA or ASIC chipset.
With continued reference to fig. 1A, the 3D display device 100 may further include a processor 101 communicatively connected to the 3D processing apparatus 130 through a signal interface 140. In some embodiments illustrated herein, the processor 101 is included in or as a processor unit of a computer or smart terminal, such as a mobile terminal. It is contemplated that in some embodiments the processor 101 may be provided external to the 3D display device, for example the 3D display device may be a multi-view naked eye 3D display with 3D processing means, for example a non-intelligent naked eye 3D television.
For simplicity, the exemplary embodiments below assume that the 3D display device internally comprises a processor. Further, the signal interface 140 is configured as an internal interface connecting the processor 101 and the 3D processing device 130; this structure can be understood more clearly with reference to the 3D display apparatus 200 implemented in a mobile terminal, shown in fig. 2 and 3. In some embodiments, the signal interface 140, as an internal interface of the 3D display device 200, may be a MIPI, mini-MIPI, LVDS, mini-LVDS, or DisplayPort interface. In some embodiments, as shown in fig. 1A, the processor 101 of the 3D display device 100 may further include a register 122. The register 122 may be used to temporarily store instructions, data, and addresses.
In some embodiments, the 3D display apparatus 100 may further include an eye position obtaining device, such as an eye tracking device or an eye tracking data interface, for obtaining the eye tracking data in real time, so that the 3D processing device 130 may render respective sub-pixels of the composite pixel based on the eye tracking data. For example, in the embodiment shown in fig. 1B, the 3D display device 100 includes an eye tracking device 150 communicatively connected to the 3D processing device 130, whereby the 3D processing device 130 can directly receive eye tracking data. In the embodiment shown in fig. 1C, an eye tracking data interface 151 is provided, an eye tracking device (not shown) may be directly connected to the processor 101, and the 3D processing device obtains the eye tracking data from the processor via the eye tracking data interface 151. In other embodiments, the eye tracking device may be connected to the processor and the 3D processing device simultaneously, which may enable the 3D processing device to obtain the eye tracking data directly from the eye tracking device on the one hand, and enable other information obtained by the eye tracking device to be processed by the processor on the other hand.
3D signal transmission and display within a 3D display device of an embodiment of the present disclosure is described with reference to FIGS. 1A-1C and FIGS. 5A-5D in combination. In the illustrated embodiment, the multi-view naked-eye 3D display screen 110 may define 6 views V1-V6, and the user's eyes may see the display of corresponding sub-pixels in the composite sub-pixels of each composite pixel in the display panel of the multi-view naked-eye 3D display screen 110 at each view (spatial location). Two different images seen by the two eyes of the user at different viewpoints form parallax, and a 3D image is synthesized in the brain.
In some embodiments, the 3D processing device 130 receives video frames, e.g., as a decompressed 3D signal, from the processor 101 through the signal interface 140, e.g., as an internal interface. Each video frame may contain at least three images with an m x n image resolution or a composite image composed of at least three images with an m x n image resolution.
In some embodiments, the at least three images or the composite image may be in various arrangements.
As shown in fig. 5A, a video frame of a 3D signal includes three images 501, 502, 503 having m × n image resolution arranged left and right.
As shown in fig. 5B, the video frame of the 3D signal includes three images 504, 505, 506 arranged one above the other with m × n image resolution.
As shown in fig. 5C, the video frame of the 3D signal contains a composite image 507 having a 3m × n image resolution in a left-right interleaved format.
As shown in fig. 5D, the video frame of the 3D signal contains a composite image 508 having a 3m × n image resolution in a top-bottom interleaved format.
It will be appreciated by those skilled in the art that the embodiments shown in the figures are merely illustrative and that the at least three images or composite images comprised by the video frames of the 3D signal may comprise other numbers of images and may be in other arrangements, which fall within the scope of the embodiments of the present disclosure.
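The side-by-side and top-bottom arrangements of figs. 5A and 5B can be unpacked into per-viewpoint images with simple slicing. A sketch assuming row-major frames stored as nested lists; the function names are illustrative, not from the patent:

```python
def split_side_by_side(frame, k):
    """frame: [rows][k*m] grid holding k view images of width m placed
    left to right (fig. 5A). Returns the k separate m-wide images."""
    m = len(frame[0]) // k
    return [[row[j * m:(j + 1) * m] for row in frame] for j in range(k)]

def split_top_bottom(frame, k):
    """frame: [k*n][cols] grid holding k view images of height n stacked
    vertically (fig. 5B). Returns the k separate n-tall images."""
    n = len(frame) // k
    return [frame[j * n:(j + 1) * n] for j in range(k)]
```

The interleaved composite formats of figs. 5C and 5D would instead take every k-th column or row, but the principle is the same: the frame already carries one full-resolution image per viewpoint.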
In some embodiments, the m × n display resolution may be at or above Full High Definition (FHD), including, but not limited to, 1920 × 1080, 1920 × 1200, 2048 × 1280, 2560 × 1440, 3840 × 2160, and the like.
In some embodiments, 3D processing device 130 renders at least one of the composite sub-pixels of the composite pixel based on at least one of the at least three images after receiving the video frame including the at least three images. Similarly, in some embodiments, the 3D processing device, upon receiving a video frame including the composite image, renders at least one of the composite sub-pixels of the composite pixel based on the composite image.
In some embodiments, the rendering is performed based on eye-tracking data, such as the spatial position of the user's eyes or the user's viewing angle.
In some embodiments, the display panel may define a plurality of correction regions, and correspondence data between the sub-pixels in each composite sub-pixel and the viewpoints is associated with each correction region.
By way of explanation and not limitation, because a video frame received by the 3D processing device 130 through the signal interface 140 (configured, for example, as an internal interface) contains at least three images, the resolution of each image (or the resolution of the composite image divided by the number of images it is composed of) corresponds to the composite pixels, and hence the composite sub-pixels, that are divided by viewpoint. On the one hand, because viewpoint information is decoupled from the transmission process, naked-eye 3D display can be realized with a small amount of processing computation and without loss of resolution. On the other hand, because the sub-pixels within each composite sub-pixel correspond to the viewpoints, the display screen can be rendered point-to-point, greatly reducing the amount of computation. By contrast, conventional naked-eye 3D displays still use a 2D display panel as the basis for image or video transmission and display; this not only reduces resolution and sharply increases the rendering computation, but also entails repeated format adjustments and image or video display adaptations.
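The point-to-point rendering described above can be sketched as follows. This is an illustrative model, not the disclosed implementation: the sub-pixel assigned to viewpoint k in each composite sub-pixel simply copies the co-located pixel of view image k, so no resampling takes place and no resolution is lost:

```python
def render_point_to_point(view_images, i):
    """Fill a display buffer for m x n composite pixels, each with 3
    composite sub-pixels (R, G, B) of i sub-pixels. view_images is a
    list of i images, each [n rows][m cols] of (r, g, b) tuples.
    Returns buffer[y][x][channel][viewpoint]."""
    n = len(view_images[0])
    m = len(view_images[0][0])
    buffer = [[[[0] * i for _ in range(3)] for _ in range(m)] for _ in range(n)]
    for k in range(i):
        for y in range(n):
            for x in range(m):
                for c in range(3):
                    # point-to-point: direct copy, no interpolation
                    buffer[y][x][c][k] = view_images[k][y][x][c]
    return buffer
```

Because every sub-pixel value is a direct copy, the per-frame cost is a single pass over the i view images, which is the reduction in computation the text refers to.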
In some embodiments, the register 122 of the processor 101 may be configured to receive information about the display requirements of the multi-view naked eye 3D display screen 110, typically information that is independent of the i viewpoints and related to the m × n display resolution of the screen, so that the processor 101 sends the multi-view naked eye 3D display screen 110 video frames of the 3D signal that meet those display requirements. The information may be, for example, a data packet sent when a video transmission is initially established.
Therefore, the processor 101 does not need to consider information related to i viewpoints of the multi-viewpoint naked eye 3D display screen 110 (i ≧ 3) when transmitting the video frame of the 3D signal. Instead, the processor 101 is able to transmit a video frame of the 3D signal meeting its requirements to the multi-view naked-eye 3D display screen 110 by means of the information related to the m × n display resolution of the multi-view naked-eye 3D display screen 110 received by the register 122.
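As an illustration (the field and function names below are assumptions, not part of the disclosure), the only display-requirement information the processor needs is the m × n resolution; the viewpoint count i never enters the transmission path:

```python
from dataclasses import dataclass

@dataclass
class DisplayRequirement:
    """Hypothetical display-requirement packet held in the register:
    just the display resolution in composite pixels."""
    width_m: int
    height_n: int

def frame_ok(req, images):
    """The processor checks each of the at least three images against the
    declared m x n resolution before sending; images is a list of
    (width, height) pairs."""
    return all(w == req.width_m and h == req.height_n for (w, h) in images)
```

Note that nothing here depends on i; a frame carrying 3, 6, or more view images passes the same check as long as every image matches the declared resolution.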
In some embodiments, the 3D display device 100 may further include a codec configured to decompress and decode the compressed 3D signal and transmit the decompressed 3D signal to the 3D processing apparatus 130 via the signal interface 140.
In some embodiments, the processor 101 of the 3D display device 100 reads the video frame of the 3D signal from the memory or from outside the 3D display device 100, for example, through an external interface, and then transmits the read or received video frame of the 3D signal to the 3D processing apparatus 130 via the signal interface 140.
In some embodiments, the 3D display device 100 further comprises a formatter (not shown), e.g. integrated in the processor 101, configured as a codec or as part of a GPU, for pre-processing the video frames of the 3D signal such that they contain at least three images with an m × n image resolution or such that they contain a composite image with an image resolution of e.g. 3m × n or m × 3n or other suitable image resolution.
As described previously, the 3D display device provided by the embodiments of the present disclosure may be a 3D display device including a processor. In some embodiments, the 3D display device may be configured as a smart cellular phone, a tablet, a smart television, a wearable device, an in-vehicle device, a notebook, an Ultra Mobile Personal Computer (UMPC), a netbook, a Personal Digital Assistant (PDA), or the like.
Exemplarily, fig. 2 shows a hardware configuration diagram of a 3D display device 200 implemented as a mobile terminal, such as a smart cellular phone or a tablet computer. The 3D display device 200 may include a processor 201, an external storage interface 202, an (internal) memory 203, a Universal Serial Bus (USB) interface 204, a charging management module 205, a power management module 206, a battery 207, a mobile communication module 208, a wireless communication module 210, antennas 209, 211, an audio module 212, a speaker 213, a receiver 214, a microphone 215, an earphone interface 216, a button 217, a motor 218, an indicator 219, a Subscriber Identity Module (SIM) card interface 220, a multi-view naked eye 3D display screen 110, a 3D processing apparatus 130, a signal interface 140, an eye tracking apparatus 150, a camera unit 221, a sensor module 230, and the like. Among other things, the sensor module 230 may include a proximity light sensor 2301, an ambient light sensor 2302, a pressure sensor 2303, a barometric pressure sensor 2304, a magnetic sensor 2305, a gravity sensor 2306, a gyroscope sensor 2307, an acceleration sensor 2308, a distance sensor 2309, a temperature sensor 2310, a fingerprint sensor 2311, a touch sensor 2312, a bone conduction sensor 2313, and the like.
It is to be understood that the structure illustrated in fig. 2 does not constitute a specific limitation of the 3D display device 200. In other embodiments, the 3D display device 200 may include more or fewer components than shown, or combine certain components, or split certain components, or a different arrangement of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor 201 may include one or more processing units. For example, the processor 201 may include an Application Processor (AP), a modem processor, a baseband processor, a Graphics Processor (GPU) 223, an Image Signal Processor (ISP), a controller, a memory, a video codec 224, a Digital Signal Processor (DSP), a Neural Network Processor (NPU), etc., or combinations thereof.
In some embodiments, the processor 201 may include one or more interfaces. The interfaces may include an integrated circuit (I2C) interface, an integrated circuit built-in audio (I2S) interface, a Pulse Code Modulation (PCM) interface, a Universal Asynchronous Receiver Transmitter (UART) interface, a Mobile Industry Processor Interface (MIPI), a General Purpose Input Output (GPIO) interface, a Subscriber Identity Module (SIM) interface, a Universal Serial Bus (USB) interface, and so forth.
The wireless communication function of the 3D display device 200 may be implemented by the antennas 209 and 211, the mobile communication module 208, the wireless communication module 210, a modem processor or a baseband processor, and the like.
The antennas 209, 211 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the 3D display device 200 may be used to cover a single or multiple communication bands. Different antennas can also be multiplexed to improve the utilization of the antennas.
The mobile communication module 208 may provide a solution including 2G/3G/4G/5G wireless communication applied on the 3D display device 200.
The wireless communication module 210 may provide a solution for wireless communication applied to the 3D display device 200, including Wireless Local Area Network (WLAN), Bluetooth (BT), Global Navigation Satellite System (GNSS), Frequency Modulation (FM), Near Field Communication (NFC), Infrared (IR), and the like.
In some embodiments, the antenna 209 and the mobile communication module 208 of the 3D display device 200 are coupled and the antenna 211 and the wireless communication module 210 are coupled so that the 3D display device 200 can communicate with a network and other devices through a wireless communication technology.
In some embodiments, the external interface for receiving the 3D signal may include the USB interface 204, the mobile communication module 208, the wireless communication module 210, or a combination thereof. Other possible interfaces for receiving 3D signals, such as the interfaces described above, are also conceivable.
In some embodiments, the memory of the 3D display device may comprise the (internal) memory 203, an external memory card connected to the external memory interface 202, or a combination thereof. In other embodiments, the signal interface may also adopt any of the internal interface connection modes of the above embodiments, or a combination thereof.
In some embodiments, the camera unit 221 may capture images or video.
In some embodiments, the 3D display device 200 implements a display function through the signal interface 140, the 3D processing apparatus 130, the multi-view naked eye 3D display screen 110, and the application processor.
In some embodiments, the 3D display device 200 may include a GPU, for example, within the processor 201 for processing 3D video images, as well as for processing 2D video images.
In some embodiments, the 3D display device 200 further includes a video codec 224 for compressing or decompressing digital video.
In some embodiments, the signal interface 140 is used to output 3D signals, e.g., video frames of decompressed 3D signals, processed by the GPU or the codec 224, or both, to the 3D processing device 130.
In some embodiments, the GPU or codec 224 is integrated with a formatter.
The multi-view naked eye 3D display screen 110 is used to display 3D (stereoscopic) images, video, and the like. The multi-view naked eye 3D display screen 110 includes a display panel. The display panel may be a Liquid Crystal Display (LCD), an Organic Light Emitting Diode (OLED) display, an Active Matrix Organic Light Emitting Diode (AMOLED) display, a Flexible Light Emitting Diode (FLED) display, a Mini-LED, a Micro-OLED, a quantum dot light emitting diode (QLED) display, or the like.
In some embodiments, the eye tracking device 150 is communicatively connected to the 3D processing device 130. In some embodiments, the eye tracking device 150 may also be connected to the processor 201, or may bypass the processor 201.
The 3D display device 200 may implement an audio function through the audio module 212, the speaker 213, the receiver 214, the microphone 215, the earphone interface 216, and the application processor, etc.
The software system of the 3D display device 200 may employ a layered architecture, an event-driven architecture, a micro-kernel architecture, a micro-service architecture, or a cloud architecture. The following takes the Android system with a layered architecture as an example to illustrate the software structure of the 3D display device 200. It is contemplated that embodiments of the present disclosure may be implemented in other software or operating systems.
Fig. 3 is a software configuration diagram of the 3D display device 200 according to the embodiment of the present disclosure. The layered architecture divides the software into several layers. The layers communicate with each other through a software interface. In some embodiments, the android system is divided into four layers, from top to bottom, an application layer 310, a framework layer 320, a core class library and Runtime (Runtime)330, and a kernel layer 340.
The application layer 310 may include a series of application packages. As shown in fig. 3, the application packages may include bluetooth, WLAN, navigation, music, camera, calendar, telephony, video, gallery, map, short message, etc. applications. The 3D display method according to the embodiments of the present disclosure may be implemented in, for example, a video application.
Framework layer 320 provides an Application Programming Interface (API) and programming framework for applications at the application layer. The framework layer includes some predefined functions. For example, in some embodiments of the present disclosure, functions or algorithms that identify captured 3D video images, algorithms that process images, and the like may be included at the framework layer.
As shown in FIG. 3, the framework layer 320 may include an explorer, a phone manager, a content manager, a notification manager, a window manager, a view system, an installation package manager, and the like.
The Android Runtime includes a core library and a virtual machine, and is responsible for scheduling and managing the Android system.
The core library comprises two parts: the functions called by the Java language, and the Android core library.
The application layer and the framework layer run in the virtual machine. The virtual machine executes the Java files of the application layer and the framework layer as binary files, and performs functions such as object lifecycle management, stack management, thread management, security and exception management, and garbage collection.
The core class library may include a plurality of functional modules. For example: a 3D graphics processing library (e.g., OpenGL ES), a surface manager, an image processing library, a media library, a graphics engine (e.g., SGL), and the like.
The kernel layer 340 is the layer between hardware and software. The kernel layer contains at least a camera driver, audio and video interfaces, a communication interface, a Wi-Fi interface, sensor drivers, power management, and a GPS interface.
Here, an embodiment of 3D video transmission and display in a 3D display device is described taking as an example a 3D display device as a mobile terminal having the structure shown in fig. 2 and 3; it is contemplated, however, that additional or fewer features may be included or changes may be made in the features of alternative embodiments.
In some embodiments, the 3D display device 200, for example a mobile terminal such as a smart cellular phone or tablet computer, receives a compressed 3D signal from a network (e.g., a cellular network, a WLAN, or Bluetooth) by means of the mobile communication module 208 and antenna 209, or the wireless communication module 210 and antenna 211, acting as an external interface. The compressed 3D signal undergoes image processing, codec processing, and decompression, for example by the GPU 223. The decompressed 3D signal is then sent to the 3D processing device 130, for example via the signal interface 140 acting as an internal interface, such as a MIPI or mini-MIPI interface. According to the embodiments of the present disclosure, a video frame of the decompressed 3D signal contains at least three images or a composite image. The 3D processing device 130 then renders the sub-pixels in each composite sub-pixel of the composite pixels of the display screen accordingly, thereby implementing 3D video playback.
In other embodiments, the 3D display device 200 reads the (internal) memory 203 or reads the compressed 3D signal stored in the external memory card through the external memory interface 202, and implements 3D video playback through corresponding processing, transmission, and rendering.
In some embodiments, the playing of the 3D video is implemented in a video application in the android system application layer 310.
In some embodiments, the signal interface 140 is configured to receive a video frame of the 3D signal, wherein the video frame contains i images (i ≥ 3), and the 3D processing device 130 is configured to render, based on the i images, the i sub-pixels corresponding one-to-one to the i viewpoints in each of the m × n composite sub-pixels of the multi-viewpoint naked eye 3D display screen 110.
In some embodiments, the 3D display device 100 is configured to transmit identification data related to the i viewpoints, that is, identification data indicating the correspondence between images and viewpoints. The one-to-one correspondence between the i images and the i viewpoints is determined from this identification data.
As shown in fig. 6A, a video frame of a 3D signal illustratively contains 6 images 601, 602, 603, 604, 605, and 606 (i = 6) in a top-bottom format, each with an image resolution of m × n. There is an effective space AS between every two images.
With reference to figs. 1A, 4A and 6A, the 6 images contained in a video frame of the 3D signal correspond one-to-one to the 6 viewpoints, that is, to the 6 sub-pixels of each composite sub-pixel.
With continued reference to fig. 1A, the 3D display device 100 includes the signal interface 140, receives video frames of the 3D signal through the signal interface 140, and transmits identification data related to the above 6 viewpoints (i = 6) through the signal interface 140.
In some embodiments, the signal interface 140 is configured as a High Definition Multimedia Interface (HDMI). In this case, Extended Display Identification Data (EDID), or Enhanced Extended Display Identification Data (E-EDID), may be transmitted through the Display Data Channel (DDC) of HDMI, and the identification data related to the above 6 viewpoints (i = 6) may be included in the Vendor Specific Data Block (VSDB) of the EDID or E-EDID.
Referring to fig. 6B, the content structure of the VSDB of the EDID or E-EDID is shown by way of example. The identification data related to the above 6 viewpoints (i = 6) may be defined in the reserved bits.
In some embodiments, a Vendor Specific InfoFrame may be transmitted over the transition-minimized differential signaling (TMDS) channel of HDMI. The identification data related to the above 6 viewpoints (i = 6) may be included in the Packet Contents of the Vendor Specific InfoFrame.
Referring to fig. 6C, the content structure of the Packet Contents of the Vendor Specific InfoFrame is shown by way of example. The identification data related to the above 6 viewpoints (i = 6) may be defined in the reserved bits. In connection with fig. 6A, the identification data may be located in the effective space AS.
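A hypothetical sketch of such identification data follows. The byte layout here is invented purely for illustration; the disclosure only states that the data may occupy reserved bits of the VSDB or of the Vendor Specific InfoFrame, without fixing a format:

```python
def pack_view_id(view_count, image_order):
    """Pack the viewpoint count i and the image-to-viewpoint mapping
    (image index -> viewpoint index) into a byte payload suitable for
    a reserved field. Layout (assumed): 1 byte count, then one byte
    per image."""
    if not (3 <= view_count <= 255):
        raise ValueError("i >= 3 viewpoints expected")
    if len(image_order) != view_count:
        raise ValueError("one mapping entry per image expected")
    return bytes([view_count]) + bytes(image_order)

def unpack_view_id(payload):
    """Recover (view_count, image_order) from the packed payload."""
    view_count = payload[0]
    image_order = list(payload[1:1 + view_count])
    return view_count, image_order
```

The sink and source would agree on this layout out of band (e.g., via the VSDB); the round trip below checks that the mapping survives packing.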
By way of example and not limitation, in other embodiments j images corresponding to j viewpoints are transmitted, where j ≤ i. For example, with i = 6, when eye-tracking data acquired through an eye tracking device or an eye tracking data interface shows the 4 eyes of two users at 4 viewpoints (j = 4), only the 4 images corresponding to those 4 viewpoints are transmitted, and the 3D processing device renders, based on these 4 images, the 4 sub-pixels corresponding one-to-one to the 4 viewpoints in each of the m × n composite sub-pixels of the multi-viewpoint naked eye 3D display screen.
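The viewpoint selection in this embodiment can be sketched as follows; the function name and data model are illustrative assumptions, not the disclosed implementation:

```python
def views_to_transmit(tracked_viewpoints, i):
    """Given the viewpoint indices at which tracked eyes are located
    (possibly for several users), return the sorted unique viewpoints
    whose images need to be sent; images for the remaining i - j views
    can be skipped to save bandwidth."""
    return sorted({v for v in tracked_viewpoints if 0 <= v < i})
```

With i = 6 and two users whose four eyes sit at viewpoints 1, 2, 4, and 5, only j = 4 of the 6 view images would be transmitted.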
The embodiment of the present disclosure further provides a 3D display method for a multi-view naked eye 3D display screen, where the multi-view naked eye 3D display screen includes a plurality of composite pixels, each of the plurality of composite pixels includes a plurality of composite sub-pixels, and each of the plurality of composite sub-pixels is composed of a plurality of sub-pixels corresponding to a plurality of views.
Referring to fig. 7, in some embodiments, a 3D display method includes:
S701: transmitting a video frame of the 3D signal, wherein the video frame comprises at least three images or a composite image consisting of at least three images;
S702: rendering at least one sub-pixel of a plurality of composite sub-pixels in a multi-view naked eye 3D display screen based on at least one of the at least three images or the composite image.
In some embodiments, the i subpixels of each composite subpixel corresponding to the i viewpoints are constructed as i same-color subpixels, where i ≧ 3.
In some embodiments, the multi-view naked-eye 3D display screen comprises m × n composite pixels and thus defines an m × n display resolution.
In some embodiments, the at least three images comprised by the video frame of the 3D signal each have an image resolution of m × n.
The embodiment of the disclosure also provides a 3D video data communication method.
Referring to fig. 8, in some embodiments, a 3D video data communication method includes:
S801: acquiring identification data indicating a corresponding relationship between the image and the viewpoint;
S802: converting the initial 3D video data based on the identification data to obtain converted 3D video data containing video frames; wherein the video frame comprises at least three images or a composite image consisting of at least three images.
In some embodiments, the 3D video data communication method further comprises: the converted 3D video data is transmitted.
In some embodiments, the obtaining of identification data indicating the correspondence between images and viewpoints and the transmitting of the converted 3D video data are performed via a High Definition Multimedia Interface (HDMI).
In some embodiments, obtaining identification data indicating the correspondence between images and viewpoints includes: receiving extended display identification data (EDID) via HDMI, the EDID including the identification data indicating the correspondence between images and viewpoints.
In some embodiments, transmitting the converted 3D video data comprises: the converted 3D video data is transmitted at synchronized timing via the HDMI.
In some embodiments, the 3D video data communication method further comprises preparing initial 3D video data for multi-view naked eye 3D display, comprising: reading the stored compressed 3D video data or receiving the compressed 3D video data.
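Steps S801-S802 can be sketched as follows, under the illustrative assumption (not stated in the disclosure) that the initial 3D video data already holds one image per viewpoint and that the identification data supplies an image-to-viewpoint ordering:

```python
def convert_3d_video(initial_frames, view_count, order):
    """S801/S802 sketch: order[k] is the index of the source image to
    assign to viewpoint k, as dictated by the acquired identification
    data. Each converted frame then contains at least three images,
    arranged to match the sink's declared correspondence."""
    if view_count < 3:
        raise ValueError("a multi-view frame needs at least three images")
    if len(order) != view_count:
        raise ValueError("identification data must cover all viewpoints")
    return [[frame[src] for src in order] for frame in initial_frames]
```

Transmission of the converted data (claim 14) would then send these reordered frames over the same HDMI link that delivered the EDID.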
An embodiment of the present disclosure provides a 3D display system 900, and referring to fig. 9, the 3D display system 900 includes a terminal 910, and a 3D display device 920 as described above, wherein the terminal 910 is communicatively connected to a signal interface of the 3D display device 920.
In some embodiments, the terminal 910 includes a processor 911, a memory 912, an external interface 913, and an image processing apparatus 914 including a codec. The memory 912 is configured to store the compressed 3D signal, or the external interface 913 is configured to receive the compressed 3D signal. The image processing apparatus 914 is configured to decompress and decode the compressed 3D signal and process the decompressed 3D signal into a 3D signal containing at least three images.
In some embodiments, the 3D display system 900 further comprises a formatter configured to pre-process the video frames of the 3D signal such that each of the at least three images has an image resolution of m × n.
The 3D display system 900 may also include a communication interface 940 and a bus 930. The processor 911, the communication interface 940 and the memory 912 can communicate with each other via the bus 930. Communication interface 940 may be used for information transfer. The processor 911 may call logic instructions in the memory 912 to perform the 3D display method and the 3D video data communication method of the above-described embodiment.
In addition, when sold or used as an independent product, the logic instructions in the memory 912 may be implemented as software functional units and stored in a computer-readable storage medium.
The memory 912 is a computer-readable storage medium that can be used for storing software programs, computer-executable programs, such as program instructions/modules corresponding to the methods in the embodiments of the present disclosure. The processor 911 performs functional applications and data processing, i.e., implementing the 3D display method and the 3D video data communication method in the above-described method embodiments, by executing program instructions/modules stored in the memory 912.
The memory 912 may include a program storage area and a data storage area, wherein the program storage area may store an operating system, an application program required for at least one function; the storage data area may store data created according to the use of the terminal device, and the like. Further, the memory 912 may include high speed random access memory, and may also include non-volatile memory.
Embodiments of the present disclosure provide an article, such as a smart television, smart cellular phone, tablet computer, personal computer, or wearable device, configured as or incorporating a 3D display device or system as described above.
The methods, programs, systems, apparatuses, devices, etc. in the embodiments of the present application may be performed or implemented on a single computer or on multiple networked computers, or may be practiced in distributed computing environments, where tasks are performed by remote processing devices linked through a communications network.
As will be appreciated by one skilled in the art, embodiments of the present description may be provided as a method, system, or computer program product. Accordingly, embodiments of the present description may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects.
Unless specifically stated otherwise, the actions or steps of a method, program or process described in accordance with the embodiments of the present application do not have to be performed in a particular order and still achieve desirable results. In some embodiments, multitasking and parallel processing may also be possible or may be advantageous.
Exemplary devices, systems and methods of the present application have been particularly shown and described with reference to the foregoing embodiments, which are merely illustrative of the best modes of carrying out the devices, systems and methods. It will be appreciated by those skilled in the art that various changes in the embodiments of the apparatus, system, and method described herein may be made in practicing the apparatus, system, and method without departing from the spirit and scope of the application as defined in the appended claims. It is intended that the following claims define the scope of the present apparatus, system, and method and that the system and method within the scope of these claims and their equivalents be covered thereby.

Claims (19)

1. A 3D display device, comprising:
a multi-view naked-eye 3D display screen including a plurality of composite pixels, each of the plurality of composite pixels including a plurality of composite sub-pixels, each of the plurality of composite sub-pixels being composed of a plurality of sub-pixels corresponding to a plurality of views;
a signal interface configured to receive a video frame in a 3D signal, wherein the video frame contains at least three images or a composite image composed of at least three images;
a 3D processing device configured to render at least one of the plurality of composite sub-pixels based on at least one of the at least three images in the video frame or the composite image.
2. 3D display device according to claim 1,
the signal interface is further configured to transmit identification data indicating a correspondence between the image and the viewpoint.
3. 3D display device according to claim 2,
the signal interface is a high-definition multimedia interface (HDMI).
4. 3D display device according to claim 3,
the HDMI is configured to transmit extended display identification data EDID including the identification data indicating the correspondence between the images and the viewpoints.
5. The 3D display device according to any of claims 1 to 4,
the arrangement mode of the at least three images is left-right arrangement or up-down arrangement; or
The composite image is a left-right interwoven composite image or a top-bottom interwoven composite image.
6. The 3D display device according to any of claims 1 to 4, wherein at least one of the plurality of composite sub-pixels comprises a single row or column or array of sub-pixels.
7. The 3D display device according to any of claims 1 to 4, wherein the 3D processing means comprises at least one of a field programmable gate array, FPGA, chip, an application specific integrated circuit, ASIC, chip.
8. The 3D display device according to any of claims 1 to 4, further comprising:
an eye position acquisition means configured to acquire a user's eye position.
9. A 3D display system, comprising:
a terminal; and
the 3D display device of any of claims 1 to 8;
wherein the terminal is in communication connection with the signal interface.
10. A 3D display system according to claim 9, wherein the terminal comprises a processor, a memory and an image processing device, the memory being configured to store the 3D signal; the image processing device is configured to process the 3D signal to obtain a video frame in the 3D signal, wherein the video frame comprises at least three images or a composite image formed by at least three images.
11. A 3D display system according to claim 9 or 10, further comprising:
a formatter configured to process video frames in the 3D signal such that each of the at least three images has an image resolution consistent with a display resolution of the multi-view naked eye 3D display screen;
wherein the display resolution of the multi-view naked-eye 3D display screen is defined by composite pixels.
12. A 3D display method, comprising:
transmitting a video frame in the 3D signal, wherein the video frame comprises at least three images or a composite image consisting of at least three images; and
rendering at least one sub-pixel of a plurality of composite sub-pixels in a multi-view naked eye 3D display screen based on at least one of the at least three images or the composite image;
wherein the multi-view naked-eye 3D display screen comprises a plurality of composite pixels, each of the plurality of composite pixels comprises a plurality of composite sub-pixels, and each of the plurality of composite sub-pixels is composed of a plurality of sub-pixels corresponding to a plurality of views.
13. A method of 3D video data communication, comprising:
acquiring identification data indicating a corresponding relationship between the image and the viewpoint;
converting the initial 3D video data based on the identification data to obtain converted 3D video data containing video frames; wherein the video frame comprises at least three images or a composite image composed of at least three images.
14. The method of claim 13, further comprising: transmitting the converted 3D video data.
15. The method according to claim 14, wherein the acquiring of the identification data indicating the correspondence between images and viewpoints and the transmitting of the converted 3D video data are performed via a high-definition multimedia interface (HDMI).
16. The method of claim 15, wherein acquiring the identification data indicating the correspondence between images and viewpoints comprises: receiving extended display identification data (EDID) via the HDMI, the EDID including the identification data indicating the correspondence between images and viewpoints.
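Claim 16's EDID exchange can be illustrated with a parsing sketch. The byte layout below is entirely hypothetical: the patent does not specify how the correspondence is encoded inside the EDID, so this assumes a vendor-specific data block whose first byte gives the view count N, followed by N bytes mapping image slot i to a viewpoint.

```python
# Illustrative only: a hypothetical vendor-specific EDID data block
# carrying the image-to-viewpoint correspondence of claim 16.
# Layout assumption: block[0] = N views; block[1 + i] = viewpoint for
# image slot i. Real EDID extension formats are defined by VESA/CTA.

def parse_view_mapping(block: bytes) -> dict:
    """Parse {image_slot: viewpoint} from the hypothetical data block."""
    n = block[0]
    return {i: block[1 + i] for i in range(n)}
```

The source device would read this block over the HDMI DDC channel during EDID exchange and then drive the conversion step of claim 13 with the resulting mapping.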
17. The method according to claim 15 or 16, wherein transmitting the converted 3D video data comprises: transmitting the converted 3D video data at synchronized timing via the HDMI.
18. The method of any one of claims 13 to 16, further comprising: preparing initial 3D video data for multi-view naked-eye 3D display.
19. The method according to claim 18, wherein preparing initial 3D video data for multi-view naked-eye 3D display comprises: reading stored compressed 3D video data or receiving compressed 3D video data.
CN201911231387.0A 2019-12-05 2019-12-05 3D display device, system and method, and 3D video data communication method Pending CN112929645A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911231387.0A CN112929645A (en) 2019-12-05 2019-12-05 3D display device, system and method, and 3D video data communication method

Publications (1)

Publication Number Publication Date
CN112929645A 2021-06-08

Family

ID=76160759

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911231387.0A Pending CN112929645A (en) 2019-12-05 2019-12-05 3D display device, system and method, and 3D video data communication method

Country Status (1)

Country Link
CN (1) CN112929645A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114040184A (en) * 2021-11-26 2022-02-11 京东方科技集团股份有限公司 Image display method, system, storage medium and computer program product

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104185014A (en) * 2013-05-24 2014-12-03 三星电子株式会社 Display apparatus and method of displaying multi-view images
CN105282539A (en) * 2014-07-18 2016-01-27 三星电子株式会社 Curved multi-view image display apparatus and control method thereof
CN106559662A (en) * 2015-09-24 2017-04-05 三星电子株式会社 Multi-view image display device and its control method
CN110072099A (en) * 2019-03-21 2019-07-30 朱晨乐 A kind of naked eye 3D video pixel arrangement architecture and aligning method
CN110113596A (en) * 2018-02-01 2019-08-09 上海济丽信息技术有限公司 A kind of switchable grating formula naked eye 3D display system and display methods

Similar Documents

Publication Publication Date Title
TWI746302B (en) Multi-viewpoint 3D display, multi-viewpoint 3D display terminal
CN112584125A (en) Three-dimensional image display apparatus and display method thereof
EP4068769A1 (en) Eye positioning device and method, and 3d display device and method
CN112929647A (en) 3D display device, method and terminal
CN211791829U (en) 3D display device
CN211791828U (en) 3D display device
WO2022166624A1 (en) Screen display method and related apparatus
US11924398B2 (en) Method for implementing 3D image display and 3D display device
WO2022166712A1 (en) Image display method, apparatus, readable medium, and electronic device
CN211128026U (en) Multi-view naked eye 3D display screen and multi-view naked eye 3D display terminal
US20150077575A1 (en) Virtual camera module for hybrid depth vision controls
CN112929645A (en) 3D display device, system and method, and 3D video data communication method
CN112929643B (en) 3D display device, method and terminal
CN211930763U (en) 3D display device
WO2021110040A1 (en) Multi-viewpoint 3d display screen and 3d display terminal
CN112929641B (en) 3D image display method and 3D display device
EP4068780A1 (en) Method for realizing 3d image display, and 3d display device
WO2021110037A1 (en) Method for realizing 3d image display, and 3d display device
TWI840636B (en) Method for realizing 3D image display, and 3D display device
CN112911268A (en) Image display method and electronic equipment
CN114827440A (en) Display mode conversion method and conversion device based on light field display

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20210608