CN114079764A - 3D display method, 3D display apparatus and 3D display device - Google Patents

3D display method, 3D display apparatus and 3D display device

Info

Publication number
CN114079764A
CN114079764A (application CN202010799315.2A)
Authority
CN
China
Prior art keywords
display
display device
parallax
displayed
interface
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010799315.2A
Other languages
Chinese (zh)
Inventor
刁鸿浩
黄玲溪
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Ivisual 3D Technology Co Ltd
Original Assignee
Vision Technology Venture Capital Pte Ltd
Beijing Ivisual 3D Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Vision Technology Venture Capital Pte Ltd and Beijing Ivisual 3D Technology Co Ltd
Priority to CN202010799315.2A
Priority to PCT/CN2021/108968 (WO2022033314A1)
Publication of CN114079764A
Legal status: Pending (current)

Classifications

    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 - Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30 - Image reproducers
    • H04N13/349 - Multi-view displays for displaying three or more geometrical viewpoints without viewer tracking

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

The present application relates to the technical field of display and discloses a 3D display method, comprising: displaying, in a 3D display device, at least one 3D display interface having a 3D effect with respect to the 3D display device; and performing 3D display in the at least one 3D display interface. By performing 3D display in a 3D display interface that has a 3D effect with respect to the 3D display device, the method improves the flexibility of 3D display, which is conducive to improving the 3D display effect. The present application also discloses a 3D display apparatus and a 3D display device.

Description

3D display method, 3D display apparatus and 3D display device
Technical Field
The present application relates to the field of display technologies, and for example, to a 3D display method, a 3D display apparatus, and a 3D display device.
Background
Currently, when 3D display is performed, 3D display is generally performed on the entire display interface.
In the process of implementing the embodiments of the present disclosure, it is found that at least the following problems exist in the related art:
The solidified 3D display mode reduces the flexibility of 3D display and is not conducive to improving the 3D display effect.
Disclosure of Invention
The following presents a simplified summary in order to provide a basic understanding of some aspects of the disclosed embodiments. This summary is not an extensive overview, nor is it intended to identify key or critical elements or to delineate the scope of the embodiments; rather, it serves as a prelude to the more detailed description that is presented later.
The embodiments of the present disclosure provide a 3D display method, a 3D display apparatus, and a 3D display device, so as to solve the technical problem that a solidified 3D display mode reduces the flexibility of 3D display and is not conducive to improving the 3D display effect.
The 3D display method provided by the embodiment of the disclosure comprises the following steps:
displaying, in a 3D display device, at least one 3D display interface having a 3D effect with respect to the 3D display device;
and performing 3D display in at least one 3D display interface.
In some embodiments, displaying at least one 3D display interface may include:
and generating and displaying an interface parallax image capable of forming at least one 3D display interface.
In some embodiments, generating the interface parallax image may include:
rendering at least one 3D display interface based on the parallax texture of the interface parallax image to obtain an image to be displayed with parallax;
and distributing the image to be displayed with parallax to the pixels.
In some embodiments, rendering the at least one 3D display interface based on the parallax texture to obtain an image to be displayed with parallax may include:
separating a left-eye parallax texture and a right-eye parallax texture from the parallax texture, and rendering a rendering scene of the at least one 3D display interface through a left-eye logic camera and a right-eye logic camera respectively, to obtain a left-eye image to be displayed and a right-eye image to be displayed.
In some embodiments, rendering a rendered scene of at least one 3D display interface may include:
performing left-eye parallax texture mapping on an interface of a rendered scene through a left-eye logic camera to obtain a left-eye image to be displayed; and performing right eye parallax texture mapping in an interface of a rendered scene through a right eye logic camera to obtain a right eye image to be displayed.
In some embodiments, the 3D display method may further include: establishing a 3D model of the at least one 3D display interface to obtain a rendering scene.
In some embodiments, assigning the image to be displayed with parallax to the pixels may include:
and distributing the image to be displayed for the left eye and the image to be displayed for the right eye to the pixels.
In some embodiments, the 3D display device may include at least two viewpoints;
distributing the left-eye image to be displayed and the right-eye image to be displayed to pixels, comprising the following steps:
and distributing the left-eye image to be displayed and the right-eye image to be displayed to pixels corresponding to at least two viewpoints.
In some embodiments, the 3D display method may further include: a parallax texture is obtained.
In some embodiments, obtaining the parallax texture may include at least one of:
obtaining the parallax texture from a 3D picture;
obtaining the parallax texture from a 3D video file;
obtaining the parallax texture from a 3D video stream;
obtaining the parallax texture directly from a 3D photographing apparatus.
In some embodiments, the 3D effect of the at least one 3D display interface with respect to the 3D display device may include:
At least one 3D display interface is displayed out of the screen with respect to the 3D display device; or
At least one 3D display interface is displayed in the screen with respect to the 3D display device; or
One part of the at least one 3D display interface is displayed out of the screen with respect to the 3D display device, and the other part is displayed in the screen with respect to the 3D display device.
In some embodiments, the at least one 3D display interface may be arranged in at least one of the following ways:
the position of at least one 3D display interface in the 3D display device is fixed or variable;
the 3D effect of the at least one 3D display interface with respect to the 3D display device is fixed or variable.
In some embodiments, the at least one 3D display interface having a 3D effect with respect to the 3D display device may include at least one of:
at least one display frame having a 3D effect with respect to a 3D display device;
at least one display background having a 3D effect with respect to the 3D display device.
In some embodiments, the 3D display in the at least one 3D display interface may include:
and performing 3D display in a display area defined by a 3D display interface including at least one of a display frame and a display background.
The 3D display apparatus provided by the embodiments of the present disclosure includes a processor and a memory storing program instructions; the processor is configured to perform the above 3D display method when executing the program instructions.
The 3D display device provided by the embodiments of the present disclosure includes the above 3D display apparatus.
In some embodiments, the 3D display device may be a 3D display module; or
the 3D display device may be a 3D display screen comprising a 3D display module; or
the 3D display device may be a 3D display comprising a 3D display screen.
The 3D display method, the 3D display apparatus, and the 3D display device provided by the embodiments of the present disclosure can achieve the following technical effects:
by performing 3D display in a 3D display interface having a 3D effect with respect to the 3D display device, the flexibility of 3D display is improved, which is conducive to improving the 3D display effect.
The foregoing general description and the following description are exemplary and explanatory only and are not restrictive of the application.
Drawings
At least one embodiment is illustrated by the accompanying drawings, which do not limit the embodiments; elements having the same reference numerals are shown as similar elements, and the drawings are not drawn to scale, wherein:
fig. 1A is a schematic flow chart of a 3D display method provided by an embodiment of the present disclosure;
fig. 1B, 1C, and 1D are schematic diagrams illustrating a 3D display method according to an embodiment of the disclosure;
FIG. 2 is a schematic flow chart illustrating a 3D display interface provided by an embodiment of the present disclosure;
fig. 3 is a schematic flowchart of generating an interface parallax image according to an embodiment of the present disclosure;
fig. 4 is a schematic flowchart of obtaining an image to be displayed according to an embodiment of the present disclosure;
fig. 5 is another schematic flow chart of a 3D display method provided by an embodiment of the present disclosure;
fig. 6A, 6B, and 6C are schematic diagrams of 3D effects of a 3D display interface provided by an embodiment of the disclosure;
fig. 7A and 7B are schematic composition diagrams of a 3D display interface provided by an embodiment of the disclosure;
fig. 8 is a schematic structural diagram of a 3D display apparatus provided in an embodiment of the present disclosure;
fig. 9 is a schematic structural diagram of a 3D display device provided by an embodiment of the present disclosure;
fig. 10A, 10B, and 10C are schematic structural diagrams of other 3D display devices provided in an embodiment of the present disclosure.
Reference numerals:
200: a 3D display device; 210: a 3D display interface; 220: a display frame; 230: displaying a background; 300: a 3D display device; 310: a 3D display module; 320: a 3D display screen; 330: a 3D display; 810: a processor; 820: a memory; 830: a communication interface; 840: a bus.
Detailed Description
So that the manner in which the features and elements of the disclosed embodiments can be understood in detail, a more particular description of the disclosed embodiments, briefly summarized above, may be had by reference to the embodiments, some of which are illustrated in the appended drawings. In the following description of the technology, for purposes of explanation, numerous details are set forth in order to provide a thorough understanding of the disclosed embodiments. However, at least one embodiment may be practiced without these specific details. In other instances, well-known structures and devices may be shown in simplified form in order to simplify the drawing.
Referring to fig. 1A, an embodiment of the present disclosure provides a 3D display method, including:
S110: displaying, in a 3D display device, at least one 3D display interface having a 3D effect with respect to the 3D display device;
S120: performing 3D display in the at least one 3D display interface.
In this way, by performing 3D display in a 3D display interface having a 3D effect with respect to the 3D display device, the flexibility of the 3D display mode is improved, which is conducive to improving the 3D display effect.
Referring to fig. 1B, 1C, and 1D, based on the 3D display method provided by the embodiment of the present disclosure, in the 3D display device 200, at least one 3D display interface 210 having a 3D effect with respect to the 3D display device 200 may be displayed, and 3D display is performed in the at least one 3D display interface 210.
In some embodiments, a 3D display interface 210 is illustrated in fig. 1B, represented by a four-sided frame with diagonal fill.
In some embodiments, two 3D display interfaces 210 are exemplarily shown in fig. 1C, each represented by a four-sided frame with diagonal fill. Optionally, two 3D display interfaces 210 are adjacent to each other.
In some embodiments, two 3D display interfaces 210 are exemplarily shown in fig. 1D, each represented by a four-sided box with diagonal fill. Optionally, the two 3D display interfaces 210 are spaced apart from each other.
In some embodiments, the number of 3D display interfaces 210 may be one or more, for example one, two, three, or more. Alternatively, when the number of the 3D display interfaces 210 is two or more, at least two of the 3D display interfaces 210 may be adjacent to each other or spaced apart from each other. Alternatively, the shape of the 3D display interface 210 may differ from the four-sided frame and take other shapes, for example circular, oval, triangular, polygonal, or irregular shapes.
In some embodiments, when the number of the 3D display interfaces 210 is multiple, the multiple 3D display interfaces 210 may be arranged in an array, for example in rows and columns. Alternatively, the array arrangement of the plurality of 3D display interfaces 210 may differ from the above row-and-column arrangement and take other array shapes, for example circular, oval, or triangular array arrangements.
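The row-and-column arrangement mentioned above is not detailed further in the patent; the following is a minimal Python sketch, under assumptions made only for this illustration (equally sized interfaces, a uniform grid, pixel coordinates), of how top-left positions for such an array could be computed. The function name and parameters are illustrative, not taken from the patent.

    def grid_positions(display_w, display_h, rows, cols, iface_w, iface_h):
        """Top-left (x, y) positions laying out rows x cols equally sized
        3D display interfaces evenly over a display_w x display_h area."""
        cell_w, cell_h = display_w // cols, display_h // rows
        return [
            (c * cell_w + (cell_w - iface_w) // 2, r * cell_h + (cell_h - iface_h) // 2)
            for r in range(rows)
            for c in range(cols)
        ]

    # Usage sketch: a 2 x 3 array of 300 x 200 interfaces on a 1920 x 1080 display.
    positions = grid_positions(1920, 1080, rows=2, cols=3, iface_w=300, iface_h=200)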
Referring to fig. 2, in some embodiments, displaying at least one 3D display interface may include:
S210: generating an interface parallax image capable of forming the at least one 3D display interface;
S220: displaying the generated interface parallax image.
Referring to fig. 3, in some embodiments, generating an interface parallax image may include:
S310: rendering the at least one 3D display interface based on the parallax texture of the interface parallax image to obtain an image to be displayed with parallax;
S320: distributing the image to be displayed with parallax to pixels.
Referring to fig. 4, in some embodiments, rendering the at least one 3D display interface based on the parallax texture to obtain an image to be displayed with parallax may include:
S410: separating a left-eye parallax texture and a right-eye parallax texture from the parallax texture of the interface parallax image;
S420: rendering a rendering scene of the at least one 3D display interface based on the left-eye parallax texture and the right-eye parallax texture through a left-eye logic camera and a right-eye logic camera respectively, to obtain a left-eye image to be displayed and a right-eye image to be displayed.
In some embodiments, rendering a rendered scene of at least one 3D display interface may include:
performing left-eye parallax texture mapping based on the left-eye parallax texture in an interface of the rendered scene through the left-eye logic camera to obtain the left-eye image to be displayed; and performing right-eye parallax texture mapping based on the right-eye parallax texture in an interface of the rendered scene through the right-eye logic camera to obtain the right-eye image to be displayed.
In some embodiments, the above-mentioned parallax texture may be embodied as a computer-memory storage format of an image to be displayed. Alternatively, besides the parallax-texture-based approach described above, the interface parallax image may be generated in other feasible ways to display the at least one 3D display interface.
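As an illustration of steps S410/S420 and the texture mapping described above, the following Python sketch assumes a side-by-side packing of the parallax texture (left half = left-eye texture, right half = right-eye texture) and reduces each "logic camera" to a horizontal offset applied when the textured interface quad is placed into an eye-specific framebuffer. A real implementation would render the 3D model of the interface with two offset virtual cameras in a rendering engine; all names and the packing format here are assumptions of this sketch, not requirements of the patent.

    import numpy as np

    def split_parallax_texture(parallax_texture):
        """Separate a side-by-side parallax texture into the left-eye and
        right-eye parallax textures (assumed packing: left half / right half)."""
        h, w, _ = parallax_texture.shape
        return parallax_texture[:, : w // 2], parallax_texture[:, w // 2 :]

    def render_eye(eye_texture, canvas_hw, top_left, eye_offset_px):
        """Stand-in for one logic camera: paste the eye-specific texture of the
        interface quad into an eye-specific framebuffer, shifted horizontally.
        The shift plays the role of the camera viewpoint offset."""
        canvas = np.zeros((canvas_hw[0], canvas_hw[1], 3), dtype=eye_texture.dtype)
        y, x = top_left
        x += eye_offset_px
        h, w, _ = eye_texture.shape
        canvas[y:y + h, x:x + w] = eye_texture
        return canvas

    # Usage sketch: a 200 x 400 side-by-side texture yields two 200 x 200 eye
    # textures, rendered with opposite horizontal offsets into 1080 x 1920 frames.
    sbs = np.zeros((200, 400, 3), dtype=np.uint8)
    left_tex, right_tex = split_parallax_texture(sbs)
    left_image = render_eye(left_tex, (1080, 1920), (100, 300), eye_offset_px=+8)
    right_image = render_eye(right_tex, (1080, 1920), (100, 300), eye_offset_px=-8)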
Referring to fig. 5, in some embodiments, the 3D display method may further include:
S510: establishing a 3D model of the at least one 3D display interface;
S520: obtaining a rendering scene based on the established 3D model.
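Steps S510/S520 only state that a 3D model of the interface is built and that a rendering scene is obtained from it. A minimal data-structure sketch is given below: each interface is modelled as a rectangular quad with a depth offset relative to the screen plane, and the rendering scene is simply the collection of such models handed to the left-eye and right-eye logic cameras. All class names, fields, and the sign convention are assumptions of this sketch.

    from dataclasses import dataclass, field
    from typing import List, Tuple

    @dataclass
    class InterfaceModel:
        """3D model of one 3D display interface: a rectangular quad placed at a
        depth offset relative to the screen plane (assumed convention:
        z_offset > 0 means out of the screen, z_offset < 0 means into the screen)."""
        top_left: Tuple[float, float]   # (x, y) on the screen plane
        size: Tuple[float, float]       # (width, height)
        z_offset: float                 # depth relative to the screen plane
        kind: str = "frame"             # "frame" or "background"

    @dataclass
    class RenderScene:
        """Rendering scene fed to the left-eye and right-eye logic cameras."""
        models: List[InterfaceModel] = field(default_factory=list)

        def add(self, model: InterfaceModel) -> None:
            self.models.append(model)

    # Usage sketch: one out-of-screen display frame and one in-screen background.
    scene = RenderScene()
    scene.add(InterfaceModel((100, 100), (400, 300), z_offset=+20.0, kind="frame"))
    scene.add(InterfaceModel((0, 0), (1920, 1080), z_offset=-40.0, kind="background"))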
In some embodiments, assigning the image to be displayed with parallax to the pixels may include:
and distributing the image to be displayed for the left eye and the image to be displayed for the right eye to the pixels.
In some embodiments, the 3D display device may include at least two viewpoints;
the allocating the left-eye image to be displayed and the right-eye image to be displayed to the pixels may include:
and distributing the left-eye image to be displayed and the right-eye image to be displayed to pixels corresponding to at least two viewpoints.
In some embodiments, the 3D display device may include more than two viewpoints, for example three or more. Alternatively, the left-eye image to be displayed and the right-eye image to be displayed may be allocated to pixels corresponding to the multiple viewpoints. In this way, the pixels to which the left-eye image to be displayed is allocated transmit that image to the left eye of the user, and the pixels to which the right-eye image to be displayed is allocated transmit that image to the right eye of the user, so as to achieve a 3D effect.
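As an illustration of the pixel-assignment step, the sketch below column-interleaves the left-eye and right-eye images to be displayed into one composite frame for a two-viewpoint panel, assuming that even pixel columns belong to the viewpoint reaching the left eye and odd columns to the viewpoint reaching the right eye. The actual mapping between viewpoints and (sub)pixels depends on the optical layout of the 3D display device and is not specified in the patent.

    import numpy as np

    def assign_to_viewpoints(left_image, right_image):
        """Compose the frame driven to the panel by assigning the left-eye and
        right-eye images to the pixel columns of two viewpoints (assumed
        mapping: even columns -> left-eye viewpoint, odd -> right-eye)."""
        assert left_image.shape == right_image.shape
        composite = np.empty_like(left_image)
        composite[:, 0::2] = left_image[:, 0::2]   # viewpoint reaching the left eye
        composite[:, 1::2] = right_image[:, 1::2]  # viewpoint reaching the right eye
        return composite

    # Usage sketch with the left_image / right_image frames from the earlier sketch:
    # composite = assign_to_viewpoints(left_image, right_image)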
In some embodiments, the at least two viewpoints described above may correspond to at least one user, for example: one, two, three or more users. Alternatively, the same or different images to be displayed may be sent to different users based on the viewpoints corresponding to the users, so that the different users can see the same or different 3D contents.
In some embodiments, the 3D display method may further include: a parallax texture is obtained.
In some embodiments, obtaining the parallax texture may include at least one of:
obtaining the parallax texture from a 3D picture;
obtaining the parallax texture from a 3D video file;
obtaining the parallax texture from a 3D video stream;
obtaining the parallax texture directly from a 3D photographing apparatus.
In some embodiments, other feasible approaches besides the above-mentioned 3D picture, 3D video file, 3D video stream, and 3D photographing apparatus may also be considered, as long as the parallax texture can be obtained successfully.
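To make the list of sources concrete, the sketch below obtains a side-by-side parallax texture from a 3D picture, a 3D video file, a 3D video stream, or a 3D photographing apparatus. It uses OpenCV purely as an implementation choice of this sketch; the patent does not prescribe any library, and the side-by-side layout, file names, URL, and device index are illustrative placeholders.

    import cv2

    def parallax_texture_from_picture(path):
        """Obtain the parallax texture from a 3D picture stored side by side."""
        texture = cv2.imread(path)
        if texture is None:
            raise FileNotFoundError(path)
        return texture

    def parallax_texture_from_capture(source):
        """Obtain one parallax texture frame from a 3D video file, a 3D video
        stream URL, or a 3D photographing apparatus exposed as a capture index."""
        cap = cv2.VideoCapture(source)
        ok, frame = cap.read()
        cap.release()
        if not ok:
            raise RuntimeError("could not read a frame from %r" % (source,))
        return frame

    # Usage sketch (paths, URL, and device index are placeholders):
    # tex = parallax_texture_from_picture("interface_sbs.png")   # 3D picture
    # tex = parallax_texture_from_capture("movie_sbs.mp4")       # 3D video file
    # tex = parallax_texture_from_capture("rtsp://host/3d")      # 3D video stream
    # tex = parallax_texture_from_capture(0)                     # 3D photographing apparatus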
Referring to fig. 6A, 6B, and 6C, in some embodiments, the 3D effect of the at least one 3D display interface 210 with respect to the 3D display device 200 may include:
at least one 3D display interface 210 is out-of-screen with respect to the 3D display device 200; or
At least one 3D display interface 210 is in-screen with respect to the 3D display device 200; or
One portion of the at least one 3D display interface 210 is out-of-screen with respect to the 3D display device 200 and another portion is in-screen with respect to the 3D display device 200.
Referring to fig. 6A, in some embodiments, at least one 3D display interface 210 may be out-of-screen with respect to the 3D display device 200.
Referring to fig. 6B, in some embodiments, at least one 3D display interface 210 may be in-screen with respect to the 3D display device 200.
Referring to fig. 6C, in some embodiments, a portion of the at least one 3D display interface 210 may be out-of-screen with respect to the 3D display device 200 and another portion may be in-screen with respect to the 3D display device 200.
In some embodiments, fig. 6C illustrates that the upper half of the at least one 3D display interface 210 may be out-of-screen with respect to the 3D display device 200 and the lower half may be in-screen with respect to the 3D display device 200. Alternatively, the positions of the out-of-screen and in-screen portions of the at least one 3D display interface 210 may differ from those shown in fig. 6C, as long as both an out-of-screen portion and an in-screen portion exist in the at least one 3D display interface 210.
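For reference, the out-of-screen and in-screen effects above follow the standard stereoscopy relation between on-screen parallax and perceived depth; this is general background rather than anything specific to the patent. The sketch below evaluates that relation for illustrative viewing parameters.

    def perceived_depth_mm(parallax_mm, viewing_distance_mm=600.0, interocular_mm=63.0):
        """Perceived depth of a fused point relative to the screen plane,
        z = d * p / (e - p), with viewing distance d, interocular distance e,
        and on-screen parallax p (all in millimetres).

        Positive (uncrossed) parallax: the point appears behind the screen (in-screen);
        negative (crossed) parallax: in front of the screen (out-of-screen);
        zero parallax: on the screen plane."""
        return viewing_distance_mm * parallax_mm / (interocular_mm - parallax_mm)

    print(perceived_depth_mm(+5.0))  # about +51.7 mm behind the screen (in-screen)
    print(perceived_depth_mm(-5.0))  # about -44.1 mm in front of the screen (out-of-screen)
    print(perceived_depth_mm(0.0))   # 0.0, on the screen plane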
In some embodiments, the at least one 3D display interface 210 may be arranged in at least one of the following ways:
the position of the at least one 3D display interface 210 in the 3D display device 200 may be fixed or variable;
the 3D effect of the at least one 3D display interface 210 with respect to the 3D display device 200 may be fixed or variable.
In some embodiments, at least one 3D display interface 210 of the plurality of 3D display interfaces 210 may have a fixed position, or may have a position that changes, for example, position changes occurring based on specific logic, conditions, and the like. Alternatively, the specific logic or condition described above may be a user instruction, a display time, a posture of the 3D display device 200, a display area setting of a display interface of the 3D display device 200, or the like.
In some embodiments, at least one 3D display interface 210 of the plurality of 3D display interfaces 210 may have a fixed 3D effect (for example, a constant out-of-screen or in-screen effect), or may have a 3D effect that changes, for example, 3D effect changes occurring based on specific logic, conditions, and the like. Alternatively, the specific logic or condition described above may be a user instruction, a display time, a posture of the 3D display device 200, a light-sensing setting of a display interface of the 3D display device 200, or the like.
Referring to fig. 7A, 7B, in some embodiments, the at least one 3D display interface 210 having a 3D effect with respect to the 3D display device 200 may include at least one of:
at least one display frame 220 having a 3D effect with respect to the 3D display device 200;
at least one display background 230 having a 3D effect with respect to the 3D display apparatus 200.
Referring to fig. 7A, in some embodiments, the at least one 3D display interface 210 may be at least one display frame 220 having a 3D effect with respect to the 3D display apparatus 200. Alternatively, when the number of the 3D display interfaces 210 is two or more, the two or more 3D display interfaces 210 may be two or more display frames 220.
Referring to fig. 7B, in some embodiments, the at least one 3D display interface 210 may be at least one display background 230 having a 3D effect with respect to the 3D display apparatus 200. Alternatively, when the number of the 3D display interfaces 210 is two or more, the two or more 3D display interfaces 210 may be two or more display backgrounds 230, respectively. Optionally, at least one of the display frames 220 may be included in at least one of the display backgrounds 230 as the 3D display interface 210.
In some embodiments, in addition to the display frame 220 and the display background 230, the at least one 3D display interface 210 may be embodied in other forms as long as it has a 3D effect with respect to the 3D display apparatus 200.
In some embodiments, the 3D display in the at least one 3D display interface 210 may include:
the 3D display is performed in a display area defined by a 3D display interface including at least one of the display frame 220 and the display background 230.
In some embodiments, whether the at least one 3D display interface 210 is represented as a display frame 220, a display background 230, or otherwise, 3D display may be performed in the display area defined by the at least one 3D display interface 210, for example, a 3D image may be displayed there. Alternatively, the display area for 3D display is exemplarily represented by a quadrangle filled with grid lines in figs. 1B, 1C, 1D, 6A, 6B, 6C, 7A, and 7B.
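A minimal sketch of "performing 3D display in the display area defined by the interface" could simply copy per-eye 3D content into the rectangle enclosed by the display frame or background of each eye image before the per-eye images are assigned to viewpoint pixels. The region coordinates and names below are illustrative assumptions, not taken from the patent.

    import numpy as np

    def paste_content(eye_image, content, region_top_left):
        """Copy per-eye 3D content into the display area defined by the 3D
        display interface (the grid-filled quadrangle of the figures)."""
        y, x = region_top_left
        h, w, _ = content.shape
        out = eye_image.copy()
        out[y:y + h, x:x + w] = content
        return out

    # Usage sketch: paste left/right content into the left/right images to be
    # displayed, then hand both to the viewpoint/pixel assignment step above.
    # left_image  = paste_content(left_image,  left_content,  (120, 320))
    # right_image = paste_content(right_image, right_content, (120, 320))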
Referring to fig. 8, the present disclosure discloses a 3D display apparatus 200 including a processor and a memory storing program instructions, the processor being configured to execute the above-mentioned 3D display method when executing the program instructions.
In some embodiments, the structure of the 3D display apparatus 200 shown in fig. 8 includes:
a processor (processor) 810 and a memory (memory) 820, and may further include a communication interface (Communication Interface) 830 and a bus 840. The processor 810, the communication interface 830, and the memory 820 can communicate with each other via the bus 840. The communication interface 830 may be used for information transfer. The processor 810 may call logic instructions in the memory 820 to perform the 3D display method of the above-described embodiments.
Furthermore, when sold or used as a stand-alone product, the logic instructions in the memory 820 may be implemented as software functional units and stored in a computer-readable storage medium.
The memory 820 is a computer-readable storage medium for storing software programs, computer-executable programs, such as program instructions/modules corresponding to the methods in the embodiments of the present disclosure. The processor 810 performs functional applications and data processing, i.e., implementing the 3D display method in the above-described method embodiments, by executing program instructions/modules stored in the memory 820.
The memory 820 may include a program storage area and a data storage area, wherein the program storage area may store an operating system, an application program required for at least one function; the storage data area may store data created according to the use of the terminal device, and the like. Further, the memory 820 may include a high speed random access memory and may also include a non-volatile memory.
Referring to fig. 9, the present disclosure discloses a 3D display device 300 including the 3D display apparatus 200 described above.
Referring to fig. 10A, in some embodiments, the 3D display device 300 may be a 3D display module 310.
Referring to fig. 10B, in some embodiments, the 3D display device 300 may be a 3D display screen 320 including a 3D display module 310.
Referring to fig. 10C, in some embodiments, the 3D display device 300 may be a 3D display 330 including a 3D display screen 320.
In some embodiments, the 3D display device 300 may further include other means for supporting the normal operation of the 3D display device 300, such as: at least one of a communication interface, a frame, a control circuit, and the like.
According to the 3D display method, the 3D display apparatus, and the 3D display device provided by the embodiments of the present disclosure, 3D display is performed in a 3D display interface having a 3D effect with respect to the 3D display device, so that the flexibility of 3D display is improved, which is conducive to improving the 3D display effect.
The disclosed embodiments provide a computer-readable storage medium storing computer-executable instructions configured to perform the above-described 3D display method.
An embodiment of the present disclosure provides a computer program product including a computer program stored on a computer-readable storage medium, the computer program including program instructions that, when executed by a computer, cause the computer to perform the 3D display method.
The computer-readable storage medium described above may be a transitory computer-readable storage medium or a non-transitory computer-readable storage medium.
The computer-readable storage medium and the computer program product provided by the embodiments of the present disclosure improve the flexibility of 3D display by performing 3D display in a 3D display interface having a 3D effect with respect to a 3D display device, and are advantageous to improving the 3D display effect.
The technical solution of the embodiments of the present disclosure may be embodied in the form of a software product, where the computer software product is stored in a storage medium and includes at least one instruction to enable a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method of the embodiments of the present disclosure. And the aforementioned storage medium may be a non-transitory storage medium comprising: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and other various media capable of storing program codes, and may also be a transient storage medium.
The above description and drawings sufficiently illustrate embodiments of the disclosure to enable those skilled in the art to practice them. Other embodiments may incorporate structural, logical, electrical, process, and other changes. The examples merely typify possible variations. Individual components and functions are optional unless explicitly required, and the sequence of operations may vary. Portions and features of some embodiments may be included in or substituted for those of others. The scope of the disclosed embodiments includes the full ambit of the claims, as well as all available equivalents of the claims.

Although the terms "first," "second," etc. may be used in this application to describe various elements, these elements should not be limited by these terms. These terms are only used to distinguish one element from another. For example, a first element could be termed a second element, and similarly, a second element could be termed a first element, without changing the meaning of the description, so long as all occurrences of the "first element" are renamed consistently and all occurrences of the "second element" are renamed consistently. The first and second elements are both elements, but may not be the same element.

The words used in this application are words of description only and are not intended to limit the claims. As used in the description of the embodiments and the claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. Similarly, the term "and/or" as used in this application is meant to encompass any and all possible combinations of one or more of the associated listed items. Furthermore, the terms "comprises" and/or "comprising," when used in this application, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof. Without further limitation, an element defined by the phrase "comprising an …" does not exclude the presence of other like elements in a process, method, or apparatus that comprises the element.

In this document, each embodiment may be described with emphasis on its differences from other embodiments, and the same or similar parts of the respective embodiments may be referred to one another. For the methods, products, and the like disclosed in the embodiments, where they correspond to the method sections disclosed herein, reference may be made to the description of the method sections.
Those of skill in the art would appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software may depend upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the disclosed embodiments. It is clear to those skilled in the art that, for convenience and brevity of description, the working processes of the above-described systems, apparatuses and units may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the embodiments disclosed herein, the disclosed methods, products (including but not limited to devices, apparatuses, etc.) may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, a division of a unit may be merely a division of a logical function, and an actual implementation may have another division, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form. Units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to implement the present embodiment. In addition, functional units in the embodiments of the present disclosure may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit.
In the drawings, the width, length, thickness, etc. of structures such as elements or layers may be exaggerated for clarity and descriptive purposes. When an element or layer is referred to as being "disposed on" (or "mounted on," "laid on," "attached to," "coated on," or the like) another element or layer, the element or layer may be directly "disposed on" or "over" the other element or layer, or intervening elements or layers may be present, or even partially embedded in the other element or layer.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises at least one executable instruction for implementing the specified logical function(s). In some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. In the description corresponding to the flowcharts and block diagrams in the figures, operations or steps corresponding to different blocks may also occur in different orders than disclosed in the description, and sometimes there is no specific order between the different operations or steps. For example, two sequential operations or steps may in fact be executed substantially concurrently, or they may sometimes be executed in the reverse order, depending upon the functionality involved. Each block of the block diagrams and/or flowchart illustrations, and combinations of blocks in the block diagrams and/or flowchart illustrations, can be implemented by special purpose hardware-based systems that perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.

Claims (17)

1. A 3D display method, comprising:
displaying, in a 3D display device, at least one 3D display interface having a 3D effect with respect to the 3D display device;
and performing 3D display in the at least one 3D display interface.
2. The method of claim 1, wherein displaying the at least one 3D display interface comprises:
and generating and displaying an interface parallax image capable of forming the at least one 3D display interface.
3. The method of claim 2, wherein generating the interfacial parallax image comprises:
rendering the at least one 3D display interface based on the parallax texture of the interface parallax image to obtain an image to be displayed with parallax;
and distributing the image to be displayed with parallax to pixels.
4. The method according to claim 3, wherein rendering the at least one 3D display interface based on the parallax texture to obtain the image to be displayed with parallax comprises:
separating a left-eye parallax texture and a right-eye parallax texture from the parallax texture, and rendering the rendered scene of the at least one 3D display interface through a left-eye logic camera and a right-eye logic camera respectively, based on the left-eye parallax texture and the right-eye parallax texture, to obtain a left-eye image to be displayed and a right-eye image to be displayed.
5. The method of claim 4, wherein rendering the rendered scene of the at least one 3D display interface comprises:
performing left-eye parallax texture mapping based on the left-eye parallax texture in the interface of the rendered scene through a left-eye logic camera to obtain a left-eye image to be displayed; and performing right-eye parallax texture mapping based on the right-eye parallax texture in the interface of the rendered scene through a right-eye logic camera to obtain a right-eye image to be displayed.
6. The method of claim 4 or 5, further comprising: establishing a 3D model of the at least one 3D display interface to obtain the rendered scene.
7. The method according to any one of claims 4 to 6, wherein assigning the image to be displayed with parallax to pixels comprises:
and distributing the left-eye image to be displayed and the right-eye image to be displayed to pixels.
8. The method of claim 7, wherein the 3D display device comprises at least two viewpoints;
distributing the left eye image to be displayed and the right eye image to be displayed to pixels, comprising:
and distributing the left-eye image to be displayed and the right-eye image to be displayed to pixels corresponding to the at least two viewpoints.
9. The method of any of claims 3 to 8, further comprising: obtaining the parallax texture.
10. The method of claim 9, wherein obtaining the parallax texture comprises at least one of:
obtaining the parallax texture from a 3D picture;
obtaining the parallax texture from a 3D video file;
obtaining the parallax texture from a 3D video stream;
obtaining the parallax texture directly from a 3D photographing apparatus.
11. The method of claim 1, wherein the 3D effect of the at least one 3D display interface with respect to the 3D display device comprises:
The at least one 3D display interface is displayed out of the screen with respect to the 3D display device; or
The at least one 3D display interface is displayed in the screen with respect to the 3D display device; or
One part of the at least one 3D display interface is displayed out of the screen with respect to the 3D display device, and the other part is displayed in the screen with respect to the 3D display device.
12. The method according to claim 1, wherein the at least one 3D display interface is configured in at least one of the following ways:
the position of the at least one 3D display interface in the 3D display device is fixed or variable;
the 3D effect of the at least one 3D display interface with respect to the 3D display device is fixed or variable.
13. The method according to any of claims 1 to 12, wherein the at least one 3D display interface having a 3D effect with respect to the 3D display device comprises at least one of:
at least one display frame having a 3D effect with respect to the 3D display device;
at least one display background having a 3D effect with respect to the 3D display device.
14. The method of claim 13, wherein performing 3D display in the at least one 3D display interface comprises:
and performing 3D display in a display area defined by the 3D display interface including at least one of the display frame and the display background.
15. A 3D display apparatus comprising a processor and a memory storing program instructions, characterized in that the processor is configured to perform the method according to any one of claims 1 to 14 when executing the program instructions.
16. A 3D display device comprising the apparatus of claim 15.
17. The 3D display device according to claim 16,
the 3D display device is a 3D display module; or
The 3D display device is a 3D display screen comprising the 3D display module; or
The 3D display device is a 3D display comprising the 3D display screen.
CN202010799315.2A 2020-08-11 2020-08-11 3D display method, 3D display apparatus and 3D display device Pending CN114079764A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202010799315.2A CN114079764A (en) 2020-08-11 2020-08-11 3D display method, 3D display apparatus and 3D display device
PCT/CN2021/108968 WO2022033314A1 (en) 2020-08-11 2021-07-28 3d display method, 3d display apparatus and 3d display device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010799315.2A CN114079764A (en) 2020-08-11 2020-08-11 3D display method, 3D display apparatus and 3D display device

Publications (1)

Publication Number Publication Date
CN114079764A true CN114079764A (en) 2022-02-22

Family

ID=80247665

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010799315.2A Pending CN114079764A (en) 2020-08-11 2020-08-11 3D display method, 3D display apparatus and 3D display device

Country Status (2)

Country Link
CN (1) CN114079764A (en)
WO (1) WO2022033314A1 (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101266546A (en) * 2008-05-12 2008-09-17 深圳华为通信技术有限公司 Method for accomplishing operating system three-dimensional display and three-dimensional operating system
CN103081002A (en) * 2010-08-10 2013-05-01 索尼公司 2D to 3D user interface content data conversion
US20170192738A1 (en) * 2014-06-16 2017-07-06 Zte Corporation View display processing method, apparatus, and projection device
CN107870702A (en) * 2016-09-23 2018-04-03 成都理想境界科技有限公司 User based on head-mounted display apparatus manipulates reminding method and device
CN109471603A (en) * 2017-09-07 2019-03-15 华为终端(东莞)有限公司 A kind of interface display method and device
CN109495641A (en) * 2018-10-24 2019-03-19 维沃移动通信有限公司 A kind of based reminding method and mobile terminal
JP2019161320A (en) * 2018-03-08 2019-09-19 キヤノン株式会社 Information processing apparatus, control method therefor, and program
TW202018460A (en) * 2018-10-31 2020-05-16 華碩電腦股份有限公司 Electronic device and control method thereof

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9021399B2 (en) * 2009-06-24 2015-04-28 Lg Electronics Inc. Stereoscopic image reproduction device and method for providing 3D user interface
EP2472878A1 (en) * 2010-12-31 2012-07-04 Advanced Digital Broadcast S.A. Method and apparatus for combining images of a graphic user interface with a stereoscopic video
US20120254791A1 (en) * 2011-03-31 2012-10-04 Apple Inc. Interactive menu elements in a virtual three-dimensional space
CN104809137B (en) * 2014-01-28 2018-07-13 上海尚恩华科网络科技股份有限公司 A kind of the three-dimensional web page production method and system of the two dimension page


Also Published As

Publication number Publication date
WO2022033314A1 (en) 2022-02-17

Similar Documents

Publication Publication Date Title
JP6021541B2 (en) Image processing apparatus and method
KR100405060B1 (en) Enlarged Digital Image Providing Method and Apparatus Using Data Communication Networks
EP2347597B1 (en) Method and system for encoding a 3d image signal, encoded 3d image signal, method and system for decoding a 3d image signal
JP2006107213A (en) Stereoscopic image printing system
US20090244066A1 (en) Multi parallax image generation apparatus and method
JP2003111101A (en) Method, apparatus and system for processing stereoscopic image
KR20110074775A (en) Method and device for providing a layered depth model of a scene
JP2002077940A (en) Stereoscopic image generating device and game device
JP2009258726A (en) Method and apparatus for forming multiple viewpoint image, computer readable medium and video apparatus
CN109643462B (en) Real-time image processing method based on rendering engine and display device
JP2013223008A (en) Image processing device and method
CN110248147B (en) Image display method and device
WO2018037976A1 (en) Data processing device, data processing method, and computer program
CN106559662B (en) Multi-view image display apparatus and control method thereof
CN114079764A (en) 3D display method, 3D display device and 3D display device
CN112584124A (en) Method and device for realizing 3D display and 3D display terminal
CN114556433A (en) Information processing device, 3D data generation method, and program
CN114466174B (en) Multi-view 3D image coding method, device, system and storage medium
JP2006140553A (en) Solid image generation program, generator and generation method
CN114979614A (en) Display mode determining method and display mode determining device
JP2008167310A (en) Naked eye stereoscopic vision image processing method, device, and recording medium recording operation program
CN112584130A (en) Method and device for realizing 3D display and 3D display terminal
JP2011082698A (en) Image generation device, image generation method, and program
CN112584128A (en) Method and device for realizing 3D display and 3D display terminal
CN112929641B (en) 3D image display method and 3D display device

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
TA01 Transfer of patent application right

Effective date of registration: 20220401

Address after: Room 1808, block a, Langqin international, Xicheng District, Beijing 100055

Applicant after: Beijing Xinhai vision 3D Technology Co.,Ltd.

Address before: 1808, block a, LongQin international, 168 Guang'anmenwai street, Xicheng District, Beijing 100055

Applicant before: Beijing Xinhai vision 3D Technology Co.,Ltd.

Applicant before: Vision technology venture capital Pte. Ltd.

TA01 Transfer of patent application right
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20220222

WD01 Invention patent application deemed withdrawn after publication