WO2017028498A1 - 3D scene display method and apparatus - Google Patents
3D scene display method and apparatus
- Publication number
- WO2017028498A1 WO2017028498A1 PCT/CN2016/071616 CN2016071616W WO2017028498A1 WO 2017028498 A1 WO2017028498 A1 WO 2017028498A1 CN 2016071616 W CN2016071616 W CN 2016071616W WO 2017028498 A1 WO2017028498 A1 WO 2017028498A1
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- information
- scene
- image
- position information
- viewer
- Prior art date
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/275—Image signal generators from 3D object models, e.g. computer-generated stereoscopic image signals
- H04N13/279—Image signal generators from 3D object models, e.g. computer-generated stereoscopic image signals the virtual viewpoint locations being selected by the viewers or determined by tracking
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/013—Eye tracking input arrangements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/04815—Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04845—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T15/00—3D [Three Dimensional] image rendering
- G06T15/10—Geometric effects
- G06T15/20—Perspective computation
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/10—Processing, recording or transmission of stereoscopic or multi-view image signals
- H04N13/106—Processing image signals
- H04N13/111—Transformation of image signals corresponding to virtual viewpoints, e.g. spatial image interpolation
- H04N13/117—Transformation of image signals corresponding to virtual viewpoints, e.g. spatial image interpolation the virtual viewpoint locations being selected by the viewers or determined by viewer tracking
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/204—Image signal generators using stereoscopic image cameras
- H04N13/207—Image signal generators using stereoscopic image cameras using a single 2D image sensor
- H04N13/221—Image signal generators using stereoscopic image cameras using a single 2D image sensor using the relative movement between cameras and objects
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/282—Image signal generators for generating image signals corresponding to three or more geometrical viewpoints, e.g. multi-view systems
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/30—Image reproducers
- H04N13/366—Image reproducers using viewer tracking
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/30—Image reproducers
- H04N13/398—Synchronisation thereof; Control thereof
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/048—Indexing scheme relating to G06F3/048
- G06F2203/04806—Zoom, i.e. interaction techniques or interactors for controlling the zooming operation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2215/00—Indexing scheme for image rendering
- G06T2215/16—Using real world measurements to influence rendering
Definitions
- Embodiments of the present invention relate to a 3D scene display method and apparatus.
- 3D display technology makes the displayed picture appear stereoscopic.
- Its most basic principle is that the left and right human eyes each receive a different picture; the brain then superimposes and reconstructs the two sets of image information into an image with a stereoscopic effect.
- As shown in FIG. 1, N is a display device. Taking the hexahedron M in the 3D scene as an example, a viewer standing at observation position A can in general view only one 3D picture of the hexahedron, and moving to observation position B still shows that same 3D picture.
- An embodiment of the present invention provides a 3D scene display method and apparatus, which are used to solve the problem in the prior art that a single viewer cannot view different 3D pictures of a 3D scene as the observation position changes.
- An embodiment of the present invention provides a 3D scene display method, applied to a 3D scene display apparatus, including: loading 3D scene information, wherein the 3D scene information stores a plurality of pieces of 3D image information, and each piece of 3D image information carries corresponding shooting position information; when a viewer is detected within the viewing angle range of the 3D scene display apparatus, determining the current viewer's observation position information in real time, wherein the observation position information is position information of the current viewer relative to the 3D scene display apparatus; determining the 3D image information to be displayed according to the observation position information and the plurality of pieces of 3D scene information; and displaying the 3D content corresponding to the 3D image information to be displayed.
- An embodiment of the present invention provides a 3D scene display apparatus, including: a loading unit configured to load 3D scene information, wherein the 3D scene information stores a plurality of pieces of 3D image information, and each piece of 3D image information carries corresponding shooting position information; a determining unit configured to determine the current viewer's observation position information in real time when a viewer is detected within the viewing angle range of the 3D scene display apparatus, wherein the observation position information is position information of the current viewer relative to the 3D scene display apparatus; a processing unit configured to determine the 3D image information to be displayed according to the observation position information and the 3D scene information; and a display unit configured to display the 3D content corresponding to the 3D image information to be displayed.
- FIG. 1 is a schematic diagram of a viewer viewing the same 3D picture at positions A and B within a range of viewing angles provided by the prior art
- FIG. 2 is a schematic flowchart of a 3D scene display method according to Embodiment 1 of the present invention.
- FIG. 3 is a simplified schematic diagram of a 3D scene M described in Embodiment 1 of the present invention;
- FIG. 4(a) is a schematic diagram of the camera S capturing the 3D scene M;
- FIG. 4(b) is a schematic diagram of the viewer R viewing within the viewing angle range of the 3D scene display device N;
- FIG. 5 is a schematic diagram of the different 3D pictures viewed by the viewer at positions A and B within the viewing angle range;
- FIG. 6 is a schematic diagram of a viewer viewing different 3D pictures at position A;
- FIG. 7 is a schematic structural diagram of a 3D scene display apparatus according to Embodiment 2 of the present invention.
- Embodiment 1:
- FIG. 2 is a schematic flowchart of a 3D scene display method according to Embodiment 1 of the present invention. The method is applied to a display scene with a single viewer who moves within the viewing angle range of the 3D scene display device, and mainly includes the following steps:
- Step 11 Load 3D scene information.
- the 3D scene information stores a plurality of 3D image information, and each 3D image information carries corresponding shooting position information.
- The 3D scene information is obtained as follows: determining the 3D scene to be photographed; placing the image capturing device at a preset starting shooting point, and correspondingly recording the shooting position information of the current starting shooting point; moving the image capturing device within the viewing angle range of the 3D scene to capture images, and correspondingly recording all the shooting position information along the path of movement; and performing image reconstruction on each pair of captured left and right viewpoint image information to form the plurality of pieces of 3D image information.
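- As an illustration only, and not part of the original disclosure, the acquisition procedure above can be sketched in Python roughly as follows; the `move_to`, `shoot_stereo`, and `reconstruct` callables are hypothetical stand-ins for device-specific motion control, stereo capture, and left/right-viewpoint reconstruction:

```python
def capture_scene(path_positions, move_to, shoot_stereo, reconstruct):
    """Acquisition sketch: move the image capture device along a path within the
    scene's viewing-angle range, shoot a left/right viewpoint pair at each position,
    reconstruct a 3D image from the pair, and record the shooting position with it.
    The three callables are device-specific and passed in by the caller."""
    scene_info = []
    for position in path_positions:       # pre-planned shooting positions, one per step
        move_to(position)                 # move the capture device to this shooting point
        left, right = shoot_stereo()      # capture one left/right viewpoint pair
        scene_info.append({
            "shooting_position": position,        # recorded shooting position information
            "image": reconstruct(left, right),    # reconstructed 3D image information
        })
    return scene_info
```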
- The shooting position information is correlated with the subsequent viewer's observation position information. The observation position information is limited to the viewing angle range on the display screen side of the 3D scene display device (it cannot lie on the other side of the display screen), so the shooting position information should correspond to the observation position information: the two at least partially overlap, and the shooting positions and observation positions should be arranged relative to the reference target in as consistent a manner as possible.
- the image capturing device used for shooting may be a camera having an acquisition and image processing function.
- Each piece of 3D image information also carries corresponding shooting position information, which may be expressed in any physical coordinates capable of representing a position in the 3D scene and which indicates the position relative to the 3D scene M, for example latitude and longitude, the straight-line distance from the 3D scene M, and the like.
- Alternatively, the 3D scene M may be used as the coordinate origin, and the shooting position information may be X, Y, and Z coordinates relative to that origin.
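- As a minimal sketch (not from the disclosure) of how each piece of 3D image information and its (X, Y, Z) shooting position relative to the origin M might be represented, the hypothetical `Image3D` and `Scene3D` structures below simply formalize the records produced by the capture sketch above:

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class Image3D:
    """One piece of 3D image information reconstructed from a left/right viewpoint pair."""
    left_view: bytes                                # encoded left-viewpoint image data
    right_view: bytes                               # encoded right-viewpoint image data
    shooting_position: Tuple[float, float, float]   # (X, Y, Z) relative to the 3D scene M as origin

@dataclass
class Scene3D:
    """3D scene information: a collection of 3D images, each tagged with its shooting position."""
    images: List[Image3D] = field(default_factory=list)

    def add(self, left: bytes, right: bytes, position: Tuple[float, float, float]) -> None:
        self.images.append(Image3D(left, right, position))
```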
- The viewer R can only view the 3D picture displayed on the display screen while within the viewing angle range of the 3D scene display device N; therefore, the observation position information of viewer R is limited to position information within that viewing angle range.
- The embodiment of the present invention may also acquire the 3D scene information by system modeling or similar means.
- Step 12 When the viewer is detected within the current viewing angle range, the current viewer's observation position is determined in real time to obtain the observed position information.
- the observation position information is position information of the current viewer relative to the 3D scene display device.
- To obtain the current viewer's observation position information in step 12, the viewer's eye information may be captured in real time, and the viewer's observation position information determined from the captured eye information.
- The manner of obtaining the observation position information in the embodiment of the present invention includes, but is not limited to, the above operation; the viewer's observation position may also be determined by tracking other body parts of the viewer, for example the head.
- The viewer's eye information may be captured by human eye detection based on a single camera or multiple cameras.
- the human eye detection may be performed by an infrared detecting device disposed on a display screen of the 3D scene display device.
- An infrared detection device may also be integrated with the camera, with the two jointly assisting human eye detection to improve accuracy.
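- One possible, but by no means the only, way to realize the camera-based human eye detection described above is sketched below using OpenCV's bundled Haar eye cascade; mapping the detected eye position to a viewing angle through an assumed camera field of view is a choice of this sketch, not a requirement of the disclosure:

```python
import cv2  # OpenCV: one possible way to detect the viewer's eyes, not mandated by the disclosure

eye_cascade = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_eye.xml")

def observation_angle(frame, horizontal_fov_deg: float = 60.0):
    """Estimate the viewer's horizontal angle relative to the screen centre from the
    eye positions detected in one camera frame; return None if no viewer is found."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    eyes = eye_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(eyes) == 0:
        return None                                   # no viewer within the camera's field of view
    centers_x = [x + w / 2.0 for (x, y, w, h) in eyes]
    mean_x = sum(centers_x) / len(centers_x)          # mean eye centre as a proxy for head position
    width = gray.shape[1]
    # Map the pixel offset from the image centre to an angle within the assumed camera FOV.
    return (mean_x / width - 0.5) * horizontal_fov_deg
```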
- Step 13 Determine 3D image information to be displayed according to the observed position information and the 3D scene information.
- Step 13 may include: searching the 3D scene information according to the observation position information, and using the 3D image information found to correspond to the observation position information as the 3D image information to be displayed.
- Specifically, an information parameter correspondence table, associating each piece of 3D image information with the shooting position information it carries, may be established in advance. The observation position information is then compared one by one with each piece of shooting position information in the correspondence table, and, according to the preset correlation between observation positions and shooting positions, the 3D image information corresponding to the observation position information is used as the 3D image information to be displayed.
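- A minimal sketch of the correspondence-table lookup described above, reusing the hypothetical `Scene3D`/`Image3D` structures from the earlier sketch; for illustration, the "correlation" between an observation position and a shooting position is taken to be nearest Euclidean distance, which the disclosure does not specify:

```python
import math
from typing import Dict, Optional, Tuple

Position = Tuple[float, float, float]

def build_correspondence_table(scene: "Scene3D") -> Dict[Position, "Image3D"]:
    """Information parameter correspondence table: shooting position -> 3D image information."""
    return {img.shooting_position: img for img in scene.images}

def select_image(table: Dict[Position, "Image3D"], observation: Position) -> Optional["Image3D"]:
    """Compare the observation position with each shooting position one by one and
    return the 3D image information whose shooting position is closest to it."""
    if not table:
        return None
    return min(table.values(), key=lambda img: math.dist(img.shooting_position, observation))
```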
- Step 14 Display the 3D content corresponding to the 3D image information to be displayed.
- the above method may further include the following steps:
- the first step is to receive a 3D image switching instruction.
- The 3D image switching instruction includes at least: a rotation switching instruction, a zoom-out switching instruction, a zoom-in switching instruction, and a movement switching instruction.
- The 3D image switching instruction may be key-value information sent under the viewer's control from a remote controller or the like, or may be gesture information of the viewer (for a 3D scene display device having a touch screen).
- The 3D image switching instruction is then parsed to determine the switching type and the post-switch shooting position information;
- the 3D image information to be switched to is searched for in the 3D scene information according to the determined post-switch shooting position information;
- the display is switched to that 3D image information according to the switching type, and its corresponding 3D content is displayed.
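- The receive-parse-search-switch sequence above could be sketched as follows; the instruction format (a dictionary carrying a switching type and a post-switch shooting position), the injected `render` callable, and the reuse of the earlier `select_image` helper are all assumptions of this sketch:

```python
from enum import Enum, auto
from typing import Dict, Optional, Tuple

Position = Tuple[float, float, float]

class SwitchType(Enum):
    ROTATE = auto()
    ZOOM_OUT = auto()
    ZOOM_IN = auto()
    MOVE = auto()

def handle_switch_instruction(instruction: dict,
                              table: Dict[Position, "Image3D"],
                              render) -> Optional["Image3D"]:
    """Parse a 3D image switching instruction (e.g. decoded from a remote-control key
    value or a touch gesture), determine the switching type and the post-switch
    shooting position, look up the 3D image to switch to, and hand it to the display."""
    switch_type = SwitchType[instruction["type"]]              # e.g. "ROTATE" or "ZOOM_IN"
    target_position = tuple(instruction["target_position"])    # post-switch shooting position
    image = select_image(table, target_position)               # lookup helper from the sketch above
    if image is not None:
        render(image, switch_type)                             # device-specific display callable
    return image
```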
- The first through fourth steps above are mainly directed to the case where the viewer does not move. For example, when the viewer has already moved to the boundary of the viewing angle range and wants to view other 3D pictures, continuing to move away from the 3D scene display device would take the viewer outside the viewing angle range; the viewer's limited observation position information therefore restricts the device to displaying only the 3D image information corresponding to that observation position information.
- That is, when the viewer moves and only the automatic recognition mode is relied upon, only a limited number of different 3D pictures can be viewed, rather than all of them.
- In practice, the 3D scene information to be displayed is loaded first. Before display, the device checks whether a viewer is present within the current viewing angle range. If no viewer is detected, the picture is displayed according to a preset playing rule, for example playing only the 3D image information corresponding to the shooting position information of the starting shooting point. If a viewer is detected, it is further determined whether the viewer's observation position has moved; if it has, the human eye information is detected in real time using the device's own infrared camera, and the viewer's actual observation position information is finally determined from that eye information.
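- The detect-and-fall-back behaviour just described is sketched below; `read_frame`, `show`, and `angle_to_position` are hypothetical placeholders for the device's camera interface, display pipeline, and calibration between viewing angle and observation position, and the loop reuses helpers from the earlier sketches:

```python
import time

def display_loop(scene: "Scene3D", read_frame, show, angle_to_position):
    """Display behaviour sketched from the description: with no viewer detected, fall
    back to a preset playing rule (here: the image shot at the starting shooting point);
    with a viewer detected, follow the viewer's observation position."""
    table = build_correspondence_table(scene)      # helper from the earlier sketch
    start_image = scene.images[0]                  # image recorded at the preset starting shooting point
    while True:
        angle = observation_angle(read_frame())    # eye-detection helper from the earlier sketch
        if angle is None:
            show(start_image)                      # preset playing rule when no viewer is detected
        else:
            show(select_image(table, angle_to_position(angle)))
        time.sleep(1 / 30)                         # re-evaluate the observation position each frame
```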
- The following two methods are mainly adopted.
- Method 1: various touch operations can be performed on the 3D scene display device, which are converted into a 3D image switching instruction that the 3D scene display device can parse; according to the parsing result, the switching type and the post-switch shooting position information are determined, the 3D image information to be switched to is then searched for in the 3D scene information, and the 3D content corresponding to that 3D image information is displayed.
- For example, the 3D scene display device receives the 3D image switching instruction corresponding to the touch operation, parses it, and determines that the switching type is rotation; the post-switch shooting position information may be determined from the extent of the touch operation on the display screen.
- Method 2: not all 3D scene display devices have a touch function. For a 3D scene display device without a touch function, a control device matched to it, such as a remote controller, can be used; clicking, long-pressing, or similar key operations send a 3D image switching instruction to the 3D scene display device, which receives and parses the instruction to determine the switching type and the post-switch shooting position information. The subsequent operations are the same as in method 1 above and are not repeated here.
- The 3D image switching instruction includes, but is not limited to, a rotation switching instruction, a zoom-out switching instruction, a zoom-in switching instruction, and a movement switching instruction.
- The 3D scene display device parses out the corresponding switching type and post-switch shooting position information according to a preset instruction parsing scheme.
- The image capturing device moves by a certain step size each time; this step size can be set by a person skilled in the art as needed, for example in consideration of the calculation speed and the amount of data to be processed. The moving step of the camera should not be smaller than the human interpupillary distance, and the step size may simply be set to the interpupillary distance.
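- As a worked illustration of the step-size guidance above (assuming a typical adult interpupillary distance of about 65 mm, a figure not stated in the disclosure):

```python
def num_capture_positions(path_length_m: float, pupil_distance_m: float = 0.065) -> int:
    """Number of shooting positions along the capture path when the camera is moved in
    steps of one interpupillary distance (about 65 mm is a typical adult value; the
    description only requires the step not to be smaller than it)."""
    return int(path_length_m // pupil_distance_m) + 1

# Example: a 2 m capture path sampled at 65 mm steps yields 31 shooting positions.
print(num_capture_positions(2.0))  # -> 31
```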
- Based on the same technical concept as the 3D scene display method provided by Embodiment 1 above, an embodiment of the present invention further provides a 3D scene display apparatus, which is explained in detail through Embodiment 2 below.
- Embodiment 2:
- The 3D scene display device may be a liquid crystal display device, an OLED display device, or a plasma display device.
- The embodiment of the present invention is not limited thereto, as long as the device can achieve a 3D display effect.
- The 3D scene display device includes structural components such as a display screen, a display module, and a driving module; most relevant to the embodiment of the present invention, however, are the structural units that realize its functional purpose. As shown in FIG. 7, the 3D scene display device mainly includes:
- the loading unit 21 is configured to load 3D scene information.
- the 3D scene information stores a plurality of 3D image information, and each 3D image information carries corresponding shooting position information.
- The 3D scene information loaded by the loading unit 21 can be obtained as follows: determining the 3D scene to be photographed; placing the image capturing device at a preset starting shooting point, and correspondingly recording the shooting position information of the current starting shooting point; moving the image capturing device within the viewing angle range of the 3D scene to capture images, and correspondingly recording all the shooting position information along the path of movement; and performing image reconstruction on each pair of captured left and right viewpoint image information to form a plurality of pieces of 3D image information.
- The determining unit 22 is configured to determine the current viewer's observation position information in real time when a viewer is detected within the current viewing angle range.
- the observation position information is position information of the current viewer relative to the 3D scene display device.
- the determining unit 22 is configured to capture the viewer's eye information when the viewer is detected within the current viewing angle range, and determine the viewer's observed position information based on the captured eye information.
- the processing unit 23 is configured to determine 3D image information to be displayed according to the observed location information and the 3D scene information.
- The processing unit 23 is configured to: search the 3D scene information according to the observation position information, and use the 3D image information found to correspond to the observation position information as the 3D image information to be displayed.
- The display unit 24 is configured to display the 3D content corresponding to the 3D image information to be displayed.
- The 3D scene display apparatus may further include: a receiving unit, configured to receive a 3D image switching instruction when the viewer does not move within the viewing angle range of the 3D scene display device.
- A parsing unit is configured to parse the 3D image switching instruction received by the receiving unit and determine the switching type and the post-switch shooting position information. Accordingly, the processing unit is further configured to search the 3D scene information for the 3D image information to be switched to, according to the post-switch shooting position information determined by the parsing unit.
- The display unit is further configured to switch to the 3D image information to be switched to according to the switching type determined by the parsing unit, and to display its corresponding 3D content; the 3D image switching instruction includes at least: a rotation switching instruction, a zoom-out switching instruction, a zoom-in switching instruction, and a movement switching instruction.
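- The unit structure above (loading unit 21, determining unit 22, processing unit 23, display unit 24) could be composed as in the following sketch; class and method names are illustrative only, and the helpers reused here come from the earlier, equally hypothetical method sketches:

```python
class SceneDisplayDevice:
    """Sketch of the unit structure of FIG. 7: loading, determining, processing and
    display responsibilities composed into one device object."""

    def __init__(self, read_frame, screen_render, angle_to_position):
        self.read_frame = read_frame                  # camera frames for the determining unit
        self.screen_render = screen_render            # display pipeline for the display unit
        self.angle_to_position = angle_to_position    # device-specific calibration
        self.scene = None
        self.table = None

    def load(self, scene: "Scene3D") -> None:                     # loading unit 21
        self.scene = scene
        self.table = build_correspondence_table(scene)

    def determine_observation_position(self):                     # determining unit 22
        angle = observation_angle(self.read_frame())
        return None if angle is None else self.angle_to_position(angle)

    def process(self, observation_position) -> "Image3D":         # processing unit 23
        return select_image(self.table, observation_position)

    def display(self, image: "Image3D") -> None:                  # display unit 24
        self.screen_render(image)
```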
- As described above, the image capturing device performs shooting while correspondingly recording all the shooting position information along the path of its movement. It moves by a certain step size each time, which those skilled in the art can set as needed, for example in consideration of the calculation speed and the amount of data to be processed; the moving step of the camera should not be smaller than the human interpupillary distance, and the step size may be set to the interpupillary distance.
- In the embodiment of the present invention, 3D scene information can be loaded in the 3D scene display device, where the 3D scene information stores a plurality of pieces of 3D image information each carrying corresponding shooting position information, and the 3D image information to be displayed is determined according to the current viewer's determined observation position information. This ensures that the viewer can see different 3D pictures at any observation position within the viewing angle range and, as far as possible, experiences the effect of watching the 3D scene as if actually present in it.
- In summary, the 3D scene information stores a plurality of pieces of 3D image information, each carrying corresponding shooting position information. When a viewer is detected within the current viewing angle range, the current viewer's observation position information, i.e., the position information of the current viewer relative to the 3D scene display device, is determined in real time; the 3D image information to be displayed is then determined according to the observation position information and the 3D scene information, and the 3D content corresponding to it is displayed. This ensures that the viewer can view different 3D picture content at different observation positions within the viewing angle range, effectively improving the viewing experience.
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- General Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- Computing Systems (AREA)
- Geometry (AREA)
- Computer Graphics (AREA)
- Controls And Circuits For Display Device (AREA)
- Processing Or Creating Images (AREA)
Abstract
Description
Claims (16)
- 1. A 3D scene display method, applied to a 3D scene display apparatus, comprising: loading 3D scene information, wherein the 3D scene information stores a plurality of pieces of 3D image information, and each piece of 3D image information carries corresponding shooting position information; when a viewer is detected within the viewing angle range of the 3D scene display apparatus, determining the current viewer's observation position information in real time, wherein the observation position information is position information of the current viewer relative to the 3D scene display apparatus; determining the 3D image information to be displayed according to the observation position information and the plurality of pieces of 3D scene information; and displaying the 3D content corresponding to the 3D image information to be displayed.
- 2. The 3D scene display method according to claim 1, wherein the plurality of pieces of 3D scene information are obtained by: determining a 3D scene to be photographed; placing an image capturing device at a preset starting shooting point, and correspondingly recording shooting position information of the current starting shooting point; moving the image capturing device within the viewing angle range of the 3D scene to capture images, and correspondingly recording all shooting position information along the path of movement; and performing image reconstruction on each pair of captured left and right viewpoint image information to form the plurality of pieces of 3D image information.
- 3. The 3D scene display method according to claim 1, wherein determining the current viewer's observation position information in real time comprises: capturing the viewer's eye information in real time, and determining the viewer's observation position information according to the captured eye information.
- 4. The 3D scene display method according to claim 1, wherein determining the 3D image information to be displayed according to the observation position information and the 3D scene information comprises: searching the 3D scene information according to the observation position information, and using the found 3D image information corresponding to the observation position information as the 3D image information to be displayed.
- 5. The 3D scene display method according to any one of claims 1-4, wherein if the viewer does not move within the viewing angle range of the 3D scene display apparatus, the method further comprises: receiving a 3D image switching instruction; parsing the 3D image switching instruction to determine a switching type and post-switch shooting position information; searching the 3D scene information for the 3D image information to be switched to according to the determined post-switch shooting position information; and switching to the 3D image information to be switched to according to the switching type, and displaying the 3D content corresponding to that 3D image information; wherein the 3D image switching instruction includes at least: a rotation switching instruction, a zoom-out switching instruction, a zoom-in switching instruction, and a movement switching instruction.
- 6. The 3D scene display method according to claim 1, wherein determining the 3D image information to be displayed according to the observation position information and the 3D scene information comprises: according to the correlation between a preset observation position and the shooting position information carried by the plurality of pieces of 3D scene information, using the 3D image information corresponding to the observation position as the 3D image information to be displayed.
- 7. The 3D scene display method according to claim 6, wherein before using the 3D image information corresponding to the observation position as the 3D image information to be displayed according to the correlation between the preset observation position and the shooting position information carried by the plurality of pieces of 3D scene information, the method further comprises: establishing an information parameter correspondence table between each of the plurality of pieces of 3D image information and the shooting position information it carries.
- 8. The 3D scene display method according to claim 5, wherein the 3D image switching instruction is manually input by the viewer.
- 9. A 3D scene display apparatus, comprising: a loading unit configured to load 3D scene information, wherein the 3D scene information stores a plurality of pieces of 3D image information, and each piece of 3D image information carries corresponding shooting position information; a determining unit configured to determine the current viewer's observation position information in real time when a viewer is detected within the viewing angle range of the 3D scene display apparatus, wherein the observation position information is position information of the current viewer relative to the 3D scene display apparatus; a processing unit configured to determine the 3D image information to be displayed according to the observation position information and the 3D scene information; and a display unit configured to display the 3D content corresponding to the 3D image information to be displayed.
- 10. The 3D scene display apparatus according to claim 9, wherein the 3D scene information is obtained by: determining a 3D scene to be photographed; placing an image capturing device at a preset starting shooting point, and correspondingly recording shooting position information of the current starting shooting point; moving the image capturing device within the viewing angle range of the 3D scene to capture images, and correspondingly recording all shooting position information along the path of movement; and performing image reconstruction on each pair of captured left and right viewpoint image information to form a plurality of pieces of 3D image information.
- 11. The 3D scene display apparatus according to claim 9, wherein the determining unit is configured to: capture the viewer's eye information, and determine the viewer's observation position information according to the captured eye information.
- 12. The 3D scene display apparatus according to claim 9, wherein the processing unit is configured to: search the 3D scene information according to the observation position information, and use the found 3D image information corresponding to the observation position information as the 3D image information to be displayed.
- 13. The 3D scene display apparatus according to any one of claims 9-12, further comprising: a receiving unit configured to receive a 3D image switching instruction when the viewer does not move within the viewing angle range of the 3D scene display apparatus; and a parsing unit configured to parse the 3D image switching instruction received by the receiving unit and determine a switching type and post-switch shooting position information; wherein the processing unit is further configured to search the 3D scene information for the 3D image information to be switched to according to the post-switch shooting position information determined by the parsing unit; and the display unit is further configured to switch to the 3D image information to be switched to according to the switching type determined by the parsing unit, and to display the corresponding 3D content; wherein the 3D image switching instruction includes at least: a rotation switching instruction, a zoom-out switching instruction, a zoom-in switching instruction, and a movement switching instruction.
- 14. The 3D scene display apparatus according to claim 9, wherein the processing unit is configured to: according to the correlation between a preset observation position and the shooting position information carried by the plurality of pieces of 3D scene information, use the 3D image information corresponding to the observation position as the 3D image information to be displayed.
- 15. The 3D scene display apparatus according to claim 14, wherein the processing unit is further configured to: establish an information parameter correspondence table between each of the plurality of pieces of 3D image information and the shooting position information it carries.
- 16. The 3D scene display apparatus according to claim 13, wherein the 3D image switching instruction is manually input by the viewer.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/037,511 US10045007B2 (en) | 2015-08-19 | 2016-01-21 | Method and apparatus for presenting 3D scene |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201510512281.3 | 2015-08-19 | ||
CN201510512281.3A CN105120251A (zh) | 2015-08-19 | 2015-08-19 | 一种3d场景展示方法及装置 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2017028498A1 true WO2017028498A1 (zh) | 2017-02-23 |
Family
ID=54668121
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2016/071616 WO2017028498A1 (zh) | 2015-08-19 | 2016-01-21 | 3d场景展示方法及装置 |
Country Status (3)
Country | Link |
---|---|
US (1) | US10045007B2 (zh) |
CN (1) | CN105120251A (zh) |
WO (1) | WO2017028498A1 (zh) |
Families Citing this family (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105120251A (zh) | 2015-08-19 | 2015-12-02 | 京东方科技集团股份有限公司 | 一种3d场景展示方法及装置 |
CN106896732B (zh) * | 2015-12-18 | 2020-02-04 | 美的集团股份有限公司 | 家用电器的展示方法和装置 |
CN105574914B (zh) * | 2015-12-18 | 2018-11-30 | 深圳市沃优文化有限公司 | 3d动态场景的制作装置及其制作方法 |
CN106162204A (zh) * | 2016-07-06 | 2016-11-23 | 传线网络科技(上海)有限公司 | 全景视频生成、播放方法、装置及系统 |
CN107623812A (zh) * | 2016-07-14 | 2018-01-23 | 幸福在线(北京)网络技术有限公司 | 一种实现实景展示的方法、相关装置及系统 |
EP3511764B1 (en) * | 2016-09-30 | 2021-07-21 | Huawei Technologies Co., Ltd. | 3d display method and user terminal |
CN107957772B (zh) * | 2016-10-17 | 2021-09-21 | 阿里巴巴集团控股有限公司 | 现实场景中采集vr图像的处理方法以及实现vr体验的方法 |
CN107146278B (zh) * | 2017-04-18 | 2020-05-26 | 深圳市智能现实科技有限公司 | 场景建模方法及装置 |
CN109254660B (zh) | 2018-08-31 | 2020-11-17 | 歌尔光学科技有限公司 | 内容显示方法、装置及设备 |
CN110908499A (zh) * | 2018-09-18 | 2020-03-24 | 西安中兴新软件有限责任公司 | 一种3d图像显示方法和装置、及终端 |
CN109451296B (zh) * | 2018-09-26 | 2020-09-08 | 深圳市新致维科技有限公司 | 一种可供多人观看的裸眼3d系统和3d画面的显示方法 |
CN111416949A (zh) * | 2020-03-26 | 2020-07-14 | 上海擎天电子科技有限公司 | 一种实景展示装置 |
CN115695771A (zh) * | 2021-07-28 | 2023-02-03 | 京东方科技集团股份有限公司 | 一种显示装置及其显示方法 |
CN114422819A (zh) * | 2022-01-25 | 2022-04-29 | 纵深视觉科技(南京)有限责任公司 | 一种视频显示方法、装置、设备、系统及介质 |
CN114500846B (zh) * | 2022-02-12 | 2024-04-02 | 北京蜂巢世纪科技有限公司 | 现场活动观看视角切换方法、装置、设备及可读存储介质 |
Citations (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2008041313A1 (fr) * | 2006-10-02 | 2008-04-10 | Pioneer Corporation | Dispositif d'affichage d'images |
CN101729920A (zh) * | 2009-11-23 | 2010-06-09 | 南京大学 | 一种自由视角立体视频显示方法 |
CN103517060A (zh) * | 2013-09-03 | 2014-01-15 | 展讯通信(上海)有限公司 | 一种终端设备的显示控制方法及装置 |
CN104349155A (zh) * | 2014-11-25 | 2015-02-11 | 深圳超多维光电子有限公司 | 模拟立体图像显示方法及显示设备 |
CN104506841A (zh) * | 2014-12-31 | 2015-04-08 | 宇龙计算机通信科技(深圳)有限公司 | 一种多摄像头终端的3d文件制作、播放方法及装置 |
CN104618706A (zh) * | 2015-01-12 | 2015-05-13 | 深圳市亿思达科技集团有限公司 | 一种分时实现多人多角度全息立体显示的移动终端及方法 |
CN104679227A (zh) * | 2013-12-02 | 2015-06-03 | 创世界科技有限公司 | 一种产品展示实现方法 |
CN104820497A (zh) * | 2015-05-08 | 2015-08-05 | 东华大学 | 一种基于增强现实的3d交互显示系统 |
CN105120251A (zh) * | 2015-08-19 | 2015-12-02 | 京东方科技集团股份有限公司 | 一种3d场景展示方法及装置 |
Family Cites Families (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3992629B2 (ja) * | 2003-02-17 | 2007-10-17 | 株式会社ソニー・コンピュータエンタテインメント | 画像生成システム、画像生成装置、画像生成方法 |
US8189035B2 (en) * | 2008-03-28 | 2012-05-29 | Sharp Laboratories Of America, Inc. | Method and apparatus for rendering virtual see-through scenes on single or tiled displays |
US9432661B2 (en) * | 2011-02-24 | 2016-08-30 | Kyocera Corporation | Electronic device, image display method, and image display program |
JP6240963B2 (ja) * | 2011-09-12 | 2017-12-06 | インテル・コーポレーション | 運動視差を用いた、2d画像からの3d知覚の生成 |
WO2014035204A1 (ko) * | 2012-08-30 | 2014-03-06 | (주)지에스엠솔루션 | 뷰 포인트에 따른 입체화 방법 |
CN103488413B (zh) * | 2013-04-26 | 2016-12-28 | 展讯通信(上海)有限公司 | 触控设备及在触控设备上显示3d界面的控制方法和装置 |
US9838672B2 (en) * | 2013-05-16 | 2017-12-05 | Mediatek Inc. | Apparatus and method for referring to motion status of image capture device to generate stereo image pair to auto-stereoscopic display for stereo preview |
-
2015
- 2015-08-19 CN CN201510512281.3A patent/CN105120251A/zh active Pending
-
2016
- 2016-01-21 US US15/037,511 patent/US10045007B2/en not_active Expired - Fee Related
- 2016-01-21 WO PCT/CN2016/071616 patent/WO2017028498A1/zh active Application Filing
Patent Citations (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2008041313A1 (fr) * | 2006-10-02 | 2008-04-10 | Pioneer Corporation | Dispositif d'affichage d'images |
CN101729920A (zh) * | 2009-11-23 | 2010-06-09 | 南京大学 | 一种自由视角立体视频显示方法 |
CN103517060A (zh) * | 2013-09-03 | 2014-01-15 | 展讯通信(上海)有限公司 | 一种终端设备的显示控制方法及装置 |
CN104679227A (zh) * | 2013-12-02 | 2015-06-03 | 创世界科技有限公司 | 一种产品展示实现方法 |
CN104349155A (zh) * | 2014-11-25 | 2015-02-11 | 深圳超多维光电子有限公司 | 模拟立体图像显示方法及显示设备 |
CN104506841A (zh) * | 2014-12-31 | 2015-04-08 | 宇龙计算机通信科技(深圳)有限公司 | 一种多摄像头终端的3d文件制作、播放方法及装置 |
CN104618706A (zh) * | 2015-01-12 | 2015-05-13 | 深圳市亿思达科技集团有限公司 | 一种分时实现多人多角度全息立体显示的移动终端及方法 |
CN104820497A (zh) * | 2015-05-08 | 2015-08-05 | 东华大学 | 一种基于增强现实的3d交互显示系统 |
CN105120251A (zh) * | 2015-08-19 | 2015-12-02 | 京东方科技集团股份有限公司 | 一种3d场景展示方法及装置 |
Also Published As
Publication number | Publication date |
---|---|
CN105120251A (zh) | 2015-12-02 |
US20170054972A1 (en) | 2017-02-23 |
US10045007B2 (en) | 2018-08-07 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2017028498A1 (zh) | 3d场景展示方法及装置 | |
US9807342B2 (en) | Collaborative presentation system | |
US10171792B2 (en) | Device and method for three-dimensional video communication | |
CN109615703B (zh) | 增强现实的图像展示方法、装置及设备 | |
EP3334173A1 (en) | Method and device for playing video content at any position and time | |
CN106066701B (zh) | 一种ar和vr数据处理设备与方法 | |
JP2019092170A (ja) | 3dプレノプティックビデオ画像を作成するためのシステムおよび方法 | |
US8655163B2 (en) | Consolidated 2D/3D camera | |
WO2022199260A1 (zh) | 静态对象的立体显示方法、装置、介质及电子设备 | |
WO2016021034A1 (ja) | 3次元上の注視点の位置特定アルゴリズム | |
US20190130193A1 (en) | Virtual Reality Causal Summary Content | |
JPWO2016199731A1 (ja) | ヘッドマウントディスプレイ、表示制御方法及びプログラム | |
KR20130039522A (ko) | 입체 파노라마 영상을 생성하는 장치 및 방법 | |
US20150326847A1 (en) | Method and system for capturing a 3d image using single camera | |
US20210400234A1 (en) | Information processing apparatus, information processing method, and program | |
JP2018033107A (ja) | 動画の配信装置及び配信方法 | |
US20190028690A1 (en) | Detection system | |
US11128836B2 (en) | Multi-camera display | |
TWI502271B (zh) | 控制方法及電子裝置 | |
TW201603557A (zh) | 立體影像處理系統、裝置與方法 | |
JP4249187B2 (ja) | 立体映像処理装置並びにそのプログラム | |
CN105630170B (zh) | 一种信息处理方法及电子设备 | |
JP2012120194A (ja) | 画像表示装置、画像表示方法、および画像補正方法 | |
JP2006042280A (ja) | 画像処理装置 | |
WO2018165906A1 (zh) | 一种头戴式显示装置及其显示方法 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 15037511 Country of ref document: US |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 16836363 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 16836363 Country of ref document: EP Kind code of ref document: A1 |
|
32PN | Ep: public notification in the ep bulletin as address of the adressee cannot be established |
Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 24/08/2018) |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 16836363 Country of ref document: EP Kind code of ref document: A1 |