WO2017113731A1 - 360度全景显示方法、显示模块及移动终端 (Method for 360-degree panoramic display, display module and mobile terminal) - Google Patents

360度全景显示方法、显示模块及移动终端 (Method for 360-degree panoramic display, display module and mobile terminal) Download PDF

Info

Publication number
WO2017113731A1
WO2017113731A1 (application PCT/CN2016/089569, CN2016089569W)
Authority
WO
WIPO (PCT)
Prior art keywords
current
viewing angle
viewpoint
degree panoramic
sphere model
Prior art date
Application number
PCT/CN2016/089569
Other languages
English (en)
French (fr)
Inventor
许小飞
Original Assignee
乐视控股(北京)有限公司
乐视致新电子科技(天津)有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 乐视控股(北京)有限公司 and 乐视致新电子科技(天津)有限公司
Priority to US15/240,024 (published as US20170186219A1)
Publication of WO2017113731A1

Links

Images

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T17/00 - Three dimensional [3D] modelling, e.g. data description of 3D objects
    • G06T17/10 - Constructive solid geometry [CSG] using solid primitives, e.g. cylinders, cubes
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00 - 3D [Three Dimensional] image rendering
    • G06T15/04 - Texture mapping

Definitions

  • The present patent application relates to the field of image display technologies, and in particular to a 360-degree panoramic display method, a display module, and a mobile terminal.
  • A 360-degree panorama is a virtual reality technology that can be realized on a microcomputer platform on the basis of static images. It allows people to view a 360-degree panorama on a computer and, through interactive operations, to browse freely and experience a three-dimensional virtual-reality visual world.
  • In mobile-phone-based virtual reality solutions, a developer usually displays a 360-degree panoramic video or image by constructing a sphere model; through the screen, the user sees the three-dimensional image within the viewing angle range of his or her current orientation.
  • The rest of the image outside that viewing angle range is nevertheless drawn and rendered all the time (and is not visible to the user), which causes an unnecessary waste of resources.
  • An embodiment of the present invention provides a 360-degree panoramic display method, a display module, and a mobile terminal, so that the mobile terminal can reduce the amount of program computation and improve rendering efficiency in the course of 360-degree panoramic display.
  • An embodiment of the present invention provides a 360-degree panoramic display method, including the steps of: acquiring a current viewpoint; establishing a sphere model within the current viewing angle range according to the current viewpoint; rendering the sphere model within the viewing angle range to produce a three-dimensional image within the viewing angle range; and displaying the three-dimensional image within the viewing angle range.
  • An embodiment of the present invention further provides a 360-degree panoramic display module, including: a viewpoint acquisition unit, a modeling unit, a rendering unit, and a display unit; the viewpoint acquisition unit is configured to acquire a current viewpoint; the modeling unit is configured to establish a sphere model within the current viewing angle range according to the current viewpoint; the rendering unit is configured to render the sphere model within the viewing angle range to generate a three-dimensional image within the viewing angle range; and the display unit is configured to display the three-dimensional image within the viewing angle range.
  • Embodiments of the present invention also provide a mobile terminal including the 360 degree panoramic display module.
  • One embodiment of the present invention provides a computer readable storage medium comprising computer executable instructions that, when executed by at least one processor, cause the processor to perform the above method.
  • Compared with the prior art, the embodiment of the present invention establishes a sphere model within the current viewing angle range according to the acquired current viewpoint, and renders the sphere model within that range to generate a three-dimensional image within the viewing angle range. That is, in the method for realizing the 360-degree panoramic display, only the image within the current viewing angle is drawn and rendered, which reduces the number of vertices of the drawn model, thereby reducing the amount of program calculation and improving rendering efficiency.
  • FIG. 1 is a flow chart of a 360 degree panoramic display method according to a first embodiment of the present invention
  • FIG. 2 is a block diagram of a 360 degree panoramic display module in accordance with a second embodiment of the present invention.
  • A first embodiment of the present invention relates to a 360-degree panoramic display method applied to a mobile terminal.
  • The specific flow is shown in FIG. 1.
  • Step 10: Acquire the current viewpoint. Step 10 includes the following sub-steps.
  • Sub-step 101: Detect the current posture of the mobile terminal.
  • the spatial orientation of the mobile terminal may be changed; the current posture reflects the spatial orientation of the mobile terminal.
  • the current posture in the present embodiment is characterized by the angular velocity of the mobile terminal.
  • the angular velocity of the mobile terminal includes three angular velocities of the mobile terminal in the X, Y, and Z axis directions.
  • the specific parameters for characterizing the current posture are not limited as long as the spatial orientation of the mobile terminal can be reflected.
  • Sub-step 102: Calculate the current viewpoint according to the current posture.
  • Specifically, the three Euler angles are first calculated from the three angular velocities of the mobile terminal in the X, Y, and Z axis directions; the three angles are: yaw, the angle by which the viewpoint rotates about the Y axis; pitch, the angle by which the viewpoint rotates about the X axis; and roll, the angle by which the viewpoint rotates about the Z axis.
  • Then, three rotation matrices are calculated from the Euler angles: matrix_yaw = matrix::rotateY(yaw); matrix_pitch = matrix::rotateX(pitch); matrix_roll = matrix::rotateZ(roll). That is, the current viewpoint is in substance represented by three rotation matrices.
  • The manner of acquiring the current viewpoint is not limited in any way; in other embodiments, the current viewpoint may also be a recommended viewpoint pre-stored in the mobile terminal (representing a preferred viewing angle), or a plurality of continuously varying viewpoints pre-stored in the mobile terminal.
  • Step 11: Establish a sphere model within the current viewing angle range according to the acquired current viewpoint. Step 11 comprises the following sub-steps.
  • Sub-step 111: Establish a sphere model within a reference viewing angle range according to a preset reference viewpoint and reference viewing angle.
  • The reference viewpoint and the reference viewing angle are pre-stored in the mobile terminal; in general, the default observation point of the reference viewpoint faces straight ahead; the reference viewing angle can be set to 120 degrees (it can be set arbitrarily, as long as it covers the screen). The present embodiment does not impose any limitation on the reference viewpoint and the reference viewing angle.
  • Basic parameters for the sphere model are pre-established in the mobile terminal.
  • The basic parameters include the number of grid cells in the vertical direction (vertical), the number of grid cells in the horizontal direction (horizontal), and the radius of the sphere (radius).
  • The specific values of these basic parameters are set by the designer according to the quality requirements for the three-dimensional image; the more grid cells, the higher the definition of the three-dimensional image; the radius of the sphere only needs to be larger than the distance from the viewpoint to the projection plane (i.e., the near plane).
  • The sphere model established from the basic parameters is a complete sphere model.
  • The reference viewpoint and the reference viewing angle determine the portion of the complete sphere model that lies within the reference viewing angle range.
  • In the first step, the basic parameters, the reference viewpoint, and the reference viewing angle are set; the basis of the setting is as described above.
  • In the fifth step, the vertex coordinates (x, y, z) of each point on the grid are calculated from the above data.
  • The specific formulas are: x = radius * cosf(lat_horizontal) * coslat; y = radius * sinf(lat_horizontal) * coslat; z = radius * sinf(lat_vertical).
  • Sub-step 112: Update the sphere model within the reference viewing angle range using the current viewpoint to produce the sphere model within the current viewing angle range.
  • Specifically, the three rotation matrices matrix_yaw, matrix_pitch, and matrix_roll calculated in sub-step 102 (i.e., the current viewpoint) are multiplied with the X, Y, and Z coordinate values of the vertex coordinates (x, y, z) calculated in sub-step 111; the new vertex coordinates thus calculated are the vertex coordinates of the sphere model within the current viewing angle range.
  • The above calculation process is what is meant by updating the sphere model within the reference viewing angle range with the current viewpoint to generate the sphere model within the current viewing angle range.
  • Step 12 Render the sphere model in the current perspective range to generate a three-dimensional image within the current range of perspective.
  • step 12 includes the following sub-steps.
  • Sub-step 121 Calculate texture coordinates corresponding to the current angle of view range according to the sphere model in the current perspective range.
  • The texture coordinates (s, t) corresponding to the current viewing angle range are calculated; the specific formulas are: s = xf - 0.5; t = (1.0 - yf) - 0.5.
  • Sub-step 122 Perform texture mapping on the sphere model in the current viewing angle range according to the texture coordinates corresponding to the current viewing angle range to generate a three-dimensional image in the current viewing angle range.
  • Specifically, a two-dimensional panoramic image pre-stored in the mobile terminal is first acquired; second, the two-dimensional image corresponding to the current viewing angle range is obtained from the two-dimensional panoramic image according to the texture coordinates corresponding to the current viewing angle range;
  • then this two-dimensional image is texture-mapped onto the sphere model within the current viewing angle range, thereby generating the three-dimensional image within the current viewing angle range.
  • Preferably, after texture mapping, the generated three-dimensional image can be refined in terms of lighting, transparency, and the like, so that the finally rendered three-dimensional image is more realistic.
  • Step 13 Display a three-dimensional image within the current viewing angle range.
  • the three-dimensional image generated in the sub-step 122 in the current viewing angle range is rendered into the frame buffer for display by the display.
  • The 360-degree panoramic display method provided by this embodiment constructs, from the detected current viewpoint, only the sphere model within the current viewing angle range, and draws and renders only that sphere model; that is, the sphere model outside the current viewing angle range does not need to be drawn and rendered. This reduces the amount of program computation and improves rendering efficiency.
  • a second embodiment of the present invention relates to a 360-degree panoramic display module.
  • As shown in FIG. 2, the module includes a viewpoint acquiring unit 10, a modeling unit 11, a rendering unit 12, and a display unit 13.
  • the viewpoint obtaining unit 10 is configured to acquire a current viewpoint.
  • the viewpoint acquiring unit 10 includes a posture detecting subunit and a viewpoint calculating subunit; the posture detecting subunit is configured to detect a current posture of the mobile terminal, and the viewpoint calculating subunit is configured to calculate a current viewpoint according to the current posture.
  • the posture detecting subunit includes, for example, a gyroscope.
  • the modeling unit 11 is configured to establish a sphere model within a current perspective range according to the acquired current viewpoint.
  • the rendering unit 12 is configured to render a sphere model within a current range of perspectives to generate a three-dimensional image within a current range of perspectives.
  • The rendering unit 12 includes a texture calculation subunit and a texture mapping subunit; the texture calculation subunit is configured to calculate the texture coordinates corresponding to the current viewing angle range according to the sphere model within the current viewing angle range; the texture mapping subunit is configured to texture-map the sphere model within the current viewing angle range according to the texture coordinates corresponding to the current viewing angle range, to generate the three-dimensional image within the current viewing angle range.
  • the display unit 13 is for displaying a three-dimensional image within the current viewing angle range.
  • the present embodiment is a system embodiment corresponding to the first embodiment, and the present embodiment can be implemented in cooperation with the first embodiment.
  • the related technical details mentioned in the first embodiment are still effective in the present embodiment, and are not described herein again in order to reduce repetition. Accordingly, the related art details mentioned in the present embodiment can also be applied to the first embodiment.
  • Each module involved in this embodiment is a logical module.
  • In practical applications, a logical unit may be one physical unit, a part of a physical unit, or implemented as a combination of multiple physical units.
  • This embodiment does not introduce units that are not closely related to solving the technical problem proposed by the present invention, but this does not mean that no other units exist in this embodiment.
  • a third embodiment of the present invention relates to a mobile terminal including the 360-degree panoramic display module according to the second embodiment.
  • the mobile terminal in this embodiment is a smart phone, but is not limited thereto.
  • the steps of a method or algorithm described in connection with the embodiments disclosed herein may be embodied directly in hardware, in a software module executed by a processor, or in a combination of the two.
  • Software modules can reside in random access memory (RAM), flash memory, read-only memory (ROM), programmable read-only memory (PROM), erasable programmable read-only memory (EPROM), electrically erasable programmable read-only memory (EEPROM), registers, a hard disk, a removable disk, a compact disc read-only memory (CD-ROM), or any other form of storage medium known in the art.
  • the storage medium may be integral to the processor.
  • the processor and the storage medium can reside in an application specific integrated circuit (ASIC).
  • the ASIC can reside in a computing device or user terminal, or the processor and storage medium can reside as discrete components in a computing device or user terminal.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Graphics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Geometry (AREA)
  • Software Systems (AREA)
  • Processing Or Creating Images (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

A 360-degree panoramic display method, a display module, and a mobile terminal, relating to the field of image display technologies. The 360-degree panoramic display method comprises the following steps: acquiring a current viewpoint (10); establishing a sphere model within the current viewing angle range according to the current viewpoint (11); rendering the sphere model within the viewing angle range to generate a three-dimensional image within the viewing angle range (12); and displaying the three-dimensional image within the viewing angle range (13). In the course of 360-degree panoramic display, the mobile terminal is thus able to reduce the amount of program computation and improve rendering efficiency.

Description

Method for 360-degree panoramic display, display module and mobile terminal
Cross Reference
This application cites Chinese Patent Application No. 201511014470.4, filed on December 28, 2015, which is incorporated herein by reference in its entirety.
Technical Field
This patent application relates to the field of image display technologies, and in particular to a 360-degree panoramic display method, a display module, and a mobile terminal.
Background
A 360-degree panorama is a virtual reality technology that can be realized on a microcomputer platform on the basis of static images. It allows people to view a 360-degree panorama on a computer and, through interactive operations, to browse freely and thereby experience a three-dimensional virtual-reality visual world.
At present, the inventor has found, in the course of making the present invention, that in mobile-phone-based virtual reality solutions a developer usually displays a 360-degree panoramic video or image by constructing a sphere model. Through the screen, the user sees the three-dimensional image within the viewing angle range of his or her current orientation; when the user changes orientation, the three-dimensional image within the viewing angle range after the change of orientation is shown. That is, the user only ever sees, on the screen, the three-dimensional image within the viewing angle range of his or her own orientation. In substance, however, inside the computer the rest of the image outside that viewing angle range is being drawn and rendered all the time (even though the user cannot see it), which causes an unnecessary waste of resources.
Summary
An object of some embodiments of the present invention is to provide a 360-degree panoramic display method, a display module, and a mobile terminal, so that the mobile terminal can reduce the amount of program computation and improve rendering efficiency in the course of 360-degree panoramic display.
To solve the above technical problem, an embodiment of the present invention provides a 360-degree panoramic display method, comprising the following steps: acquiring a current viewpoint; establishing a sphere model within the current viewing angle range according to the current viewpoint; rendering the sphere model within the viewing angle range to generate a three-dimensional image within the viewing angle range; and displaying the three-dimensional image within the viewing angle range.
An embodiment of the present invention further provides a 360-degree panoramic display module, comprising: a viewpoint acquiring unit, a modeling unit, a rendering unit, and a display unit. The viewpoint acquiring unit is configured to acquire a current viewpoint; the modeling unit is configured to establish a sphere model within the current viewing angle range according to the current viewpoint; the rendering unit is configured to render the sphere model within the viewing angle range to generate a three-dimensional image within the viewing angle range; and the display unit is configured to display the three-dimensional image within the viewing angle range.
An embodiment of the present invention further provides a mobile terminal comprising the above 360-degree panoramic display module.
One embodiment of the present invention provides a computer-readable storage medium comprising computer-executable instructions that, when executed by at least one processor, cause the processor to perform the above method.
Compared with the prior art, the embodiments of the present invention establish a sphere model within the current viewing angle range according to the acquired current viewpoint, and render the sphere model within that viewing angle range to generate a three-dimensional image within the viewing angle range. That is, in realizing 360-degree panoramic display, the present invention draws and renders only the image within the current viewing angle, which reduces the number of vertices of the drawn model, thereby reducing the amount of program computation and improving rendering efficiency.
Brief Description of the Drawings
FIG. 1 is a flowchart of a 360-degree panoramic display method according to a first embodiment of the present invention;
FIG. 2 is a block diagram of a 360-degree panoramic display module according to a second embodiment of the present invention.
Detailed Description
To make the objects, technical solutions, and advantages of some embodiments of the present invention clearer, the embodiments of the present invention are described in detail below with reference to the accompanying drawings. However, a person of ordinary skill in the art can understand that many technical details are set forth in the embodiments of the present invention so that the reader may better understand the present application; the technical solutions claimed in the present application can nevertheless be realized even without these technical details and with various changes and modifications based on the following embodiments.
A first embodiment of the present invention relates to a 360-degree panoramic display method applied to a mobile terminal. The specific flow is shown in FIG. 1.
Step 10: Acquire the current viewpoint. Step 10 comprises the following sub-steps.
Sub-step 101: Detect the current posture of the mobile terminal.
Specifically, when using the mobile terminal, the user may change the spatial orientation of the mobile terminal, and the current posture reflects that spatial orientation. In the present embodiment, the current posture is characterized by the angular velocity of the mobile terminal, which comprises the three angular velocities of the mobile terminal about the X, Y, and Z axes; a minimal sketch of turning these readings into Euler angles is given below. However, the present embodiment places no limitation on the specific parameters used to characterize the current posture, as long as they can reflect the spatial orientation of the mobile terminal.
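The application states only that the three Euler angles are computed from the three angular velocities; one simple reading (an assumption of this sketch, not spelled out in the description) is to accumulate the gyroscope readings over each frame interval:

```cpp
// Hypothetical sketch: accumulating gyroscope angular velocities (rad/s) over
// the frame interval dt to obtain the Euler angles used in sub-step 102.
// The integration scheme is an assumption; the application does not specify it.
struct Euler { float yaw = 0.0f, pitch = 0.0f, roll = 0.0f; };

void accumulatePosture(Euler& e, float wx, float wy, float wz, float dt) {
    e.yaw   += wy * dt;   // rotation about the Y axis
    e.pitch += wx * dt;   // rotation about the X axis
    e.roll  += wz * dt;   // rotation about the Z axis
}
```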
Sub-step 102: Calculate the current viewpoint according to the current posture.
Specifically, first, the three Euler angles are calculated from the three angular velocities of the mobile terminal about the X, Y, and Z axes. The three angles are: yaw, the angle by which the viewpoint rotates about the Y axis; pitch, the angle by which the viewpoint rotates about the X axis; and roll, the angle by which the viewpoint rotates about the Z axis. Then, from the three Euler angles, three rotation matrices are calculated: matrix_yaw = matrix::rotateY(yaw); matrix_pitch = matrix::rotateX(pitch); matrix_roll = matrix::rotateZ(roll). That is, the current viewpoint is in substance represented by three rotation matrices.
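The description names matrix::rotateY, matrix::rotateX, and matrix::rotateZ but does not give their bodies; the following is a minimal C++ sketch of one consistent implementation. The Mat3 type and the standard axis-rotation matrices are assumptions of this sketch, not the applicant's code:

```cpp
#include <array>
#include <cmath>

using Mat3 = std::array<std::array<float, 3>, 3>;  // row-major 3x3 matrix

namespace matrix {
// pitch: rotation about the X axis
Mat3 rotateX(float a) {
    return {{{1.0f, 0.0f, 0.0f},
             {0.0f, std::cos(a), -std::sin(a)},
             {0.0f, std::sin(a),  std::cos(a)}}};
}
// yaw: rotation about the Y axis
Mat3 rotateY(float a) {
    return {{{ std::cos(a), 0.0f, std::sin(a)},
             { 0.0f,        1.0f, 0.0f},
             {-std::sin(a), 0.0f, std::cos(a)}}};
}
// roll: rotation about the Z axis
Mat3 rotateZ(float a) {
    return {{{std::cos(a), -std::sin(a), 0.0f},
             {std::sin(a),  std::cos(a), 0.0f},
             {0.0f, 0.0f, 1.0f}}};
}
}  // namespace matrix

// The current viewpoint is then the triple of matrices named in the text:
//   Mat3 matrix_yaw   = matrix::rotateY(yaw);
//   Mat3 matrix_pitch = matrix::rotateX(pitch);
//   Mat3 matrix_roll  = matrix::rotateZ(roll);
```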
It should be noted that the present embodiment places no limitation on how the current viewpoint is acquired; in other embodiments, the current viewpoint may also be a recommended viewpoint pre-stored in the mobile terminal (representing a preferred viewing angle), or a plurality of continuously varying viewpoints pre-stored in the mobile terminal.
Step 11: Establish a sphere model within the current viewing angle range according to the acquired current viewpoint. Step 11 comprises the following sub-steps.
Sub-step 111: Establish a sphere model within the reference viewing angle range according to a preset reference viewpoint and reference viewing angle.
A reference viewpoint and a reference viewing angle are pre-stored in the mobile terminal. In general, the default observation point of the reference viewpoint faces straight ahead; the reference viewing angle may, for example, be set to 120 degrees (it can be set arbitrarily, as long as it covers the screen). The present embodiment places no limitation on the reference viewpoint and the reference viewing angle.
In addition, basic parameters for establishing the sphere model are in substance also preset in the mobile terminal. The basic parameters comprise the number of grid cells of the sphere surface in the vertical direction (vertical), the number of grid cells of the sphere surface in the horizontal direction (horizontal), and the radius of the sphere (radius). The specific values of these basic parameters are set by the designer according to the quality requirements for the three-dimensional image: the more grid cells, the higher the definition of the three-dimensional image; the radius of the sphere only needs to be larger than the distance from the viewpoint to the projection plane (i.e., the near plane).
That is, the sphere model established from the basic parameters is a complete sphere model, and the reference viewpoint and the reference viewing angle determine the portion of the complete sphere model that lies within the reference viewing angle range.
In the present embodiment, the sphere model within the reference viewing angle range is established as follows:
Step 1: Set the basic parameters, the reference viewpoint, and the reference viewing angle on the basis described above. In the present embodiment, the number of grid cells in the vertical direction is vertical = 64; the number of grid cells in the horizontal direction is horizontal = 64; the radius of the sphere is radius = 100; the reference viewing angle is fov = 120°; and the reference viewpoint faces straight ahead.
Step 2: Calculate the component occupied by each cell in the vertical direction, i.e. yf = y / vertical, where y takes values in [0, vertical].
Step 3: Map the component yf of Step 2 into the interval [-0.5, 0.5] and calculate the component of the reference viewing angle on yf, i.e. lat_vertical = (yf - 0.5) * fov.
Step 4: Calculate the cosine of lat in the vertical direction, coslat = cosf(lat).
Likewise, calculate the component occupied by each cell in the horizontal direction of the grid, xf = x / horizontal, where x takes values in [0, horizontal]; calculate the component of the reference viewing angle on xf, lat_horizontal = (xf - 0.5) * fov; and calculate the cosine of lat in the horizontal direction, coslat = cosf(lat).
Step 5: From the above data, calculate the vertex coordinates (x, y, z) of each point on the grid, using the following formulas (a vertex-generation sketch follows):
x = radius * cosf(lat_horizontal) * coslat
y = radius * sinf(lat_horizontal) * coslat
z = radius * sinf(lat_vertical)
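A minimal C++ sketch of Steps 1 to 5 taken together is given below. The loop structure is an assumption, and "coslat" in the vertex formulas is read here as the cosine of lat_vertical; the description computes a cosine for both directions without stating which one the formulas use, so this reading is one interpretation rather than the applicant's stated implementation:

```cpp
#include <cmath>
#include <vector>

struct Vertex { float x, y, z; };

// Build the vertices of the partial sphere within the reference viewing angle
// (Steps 1-5). Parameter names mirror the description: vertical, horizontal,
// radius, fov; the reference viewpoint is taken as facing straight ahead.
std::vector<Vertex> buildReferenceSphere(int vertical = 64, int horizontal = 64,
                                         float radius = 100.0f,
                                         float fovDegrees = 120.0f) {
    const float fov = fovDegrees * 3.14159265f / 180.0f;  // work in radians
    std::vector<Vertex> vertices;
    vertices.reserve((vertical + 1) * (horizontal + 1));
    for (int y = 0; y <= vertical; ++y) {
        float yf = float(y) / vertical;                    // Step 2
        float lat_vertical = (yf - 0.5f) * fov;            // Step 3
        float coslat = std::cos(lat_vertical);             // Step 4 (vertical reading)
        for (int x = 0; x <= horizontal; ++x) {
            float xf = float(x) / horizontal;
            float lat_horizontal = (xf - 0.5f) * fov;
            vertices.push_back({                           // Step 5
                radius * std::cos(lat_horizontal) * coslat,
                radius * std::sin(lat_horizontal) * coslat,
                radius * std::sin(lat_vertical)});
        }
    }
    return vertices;
}
```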
Sub-step 112: Update the sphere model within the reference viewing angle range using the current viewpoint, to produce the sphere model within the current viewing angle range.
Specifically, the three rotation matrices matrix_yaw, matrix_pitch, and matrix_roll calculated in sub-step 102 (i.e., the current viewpoint) are multiplied with the X, Y, and Z coordinate values of the vertex coordinates (x, y, z) calculated in sub-step 111; the new vertex coordinates thus calculated are the vertex coordinates of the sphere model within the current viewing angle range. This calculation process is what is meant by updating the sphere model within the reference viewing angle range with the current viewpoint to produce the sphere model within the current viewing angle range; a sketch follows.
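A minimal sketch of this update, reusing the Mat3 and Vertex types from the sketches above. The description does not fix the order in which the three matrices are applied, so the yaw-pitch-roll sequence here is illustrative rather than the applicant's stated choice:

```cpp
// Rotate one vertex by a 3x3 matrix (row-major).
Vertex rotate(const Mat3& m, const Vertex& v) {
    return {m[0][0] * v.x + m[0][1] * v.y + m[0][2] * v.z,
            m[1][0] * v.x + m[1][1] * v.y + m[1][2] * v.z,
            m[2][0] * v.x + m[2][1] * v.y + m[2][2] * v.z};
}

// Sub-step 112: turn the reference-range sphere into the current-range sphere
// by applying the three viewpoint matrices from sub-step 102 to every vertex.
void updateToCurrentView(std::vector<Vertex>& vertices,
                         const Mat3& matrix_yaw,
                         const Mat3& matrix_pitch,
                         const Mat3& matrix_roll) {
    for (Vertex& v : vertices) {
        v = rotate(matrix_roll, rotate(matrix_pitch, rotate(matrix_yaw, v)));
    }
}
```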
Step 12: Render the sphere model within the current viewing angle range to generate a three-dimensional image within the current viewing angle range. Step 12 comprises the following sub-steps.
Sub-step 121: Calculate the texture coordinates corresponding to the current viewing angle range according to the sphere model within the current viewing angle range.
That is, from the vertex coordinates of the sphere model within the current viewing angle range calculated in sub-step 112, calculate the texture coordinates (s, t) corresponding to the current viewing angle range; the specific formulas are as follows:
s = xf - 0.5
t = (1.0 - yf) - 0.5
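A minimal sketch of sub-step 121, reproducing the (s, t) formulas exactly as given; the per-grid-cell loop, and the idea that the texture coordinates are generated alongside the vertex grid, are assumptions of this sketch:

```cpp
struct TexCoord { float s, t; };

// Texture coordinates for each point of the (vertical+1) x (horizontal+1)
// grid, in the same traversal order as buildReferenceSphere() above.
std::vector<TexCoord> buildTexCoords(int vertical = 64, int horizontal = 64) {
    std::vector<TexCoord> coords;
    coords.reserve((vertical + 1) * (horizontal + 1));
    for (int y = 0; y <= vertical; ++y) {
        float yf = float(y) / vertical;
        for (int x = 0; x <= horizontal; ++x) {
            float xf = float(x) / horizontal;
            coords.push_back({xf - 0.5f, (1.0f - yf) - 0.5f});  // s, t
        }
    }
    return coords;
}
```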
Sub-step 122: Perform texture mapping on the sphere model within the current viewing angle range according to the texture coordinates corresponding to the current viewing angle range, to generate the three-dimensional image within the current viewing angle range.
Specifically, a two-dimensional panoramic image pre-stored in the mobile terminal is first acquired; second, according to the texture coordinates corresponding to the current viewing angle range, the two-dimensional image corresponding to the current viewing angle range is obtained from the two-dimensional panoramic image; then, this two-dimensional image is texture-mapped onto the sphere model within the current viewing angle range, thereby generating the three-dimensional image within the current viewing angle range.
Preferably, after texture mapping, the generated three-dimensional image may further be refined in terms of lighting, transparency, and the like, so that the finally presented three-dimensional image is more realistic.
Step 13: Display the three-dimensional image within the current viewing angle range.
That is, the three-dimensional image within the current viewing angle range generated in sub-step 122 is rendered into the frame buffer and is then shown by the display; a combined sketch of sub-step 122 and step 13 follows.
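The application does not name a graphics API. The following is one plausible realization of sub-step 122 and step 13 using OpenGL ES 2.0, reusing the Vertex and TexCoord types from the sketches above; the shader program, its a_position / a_texcoord attribute names, and the RGBA pixel buffer of the pre-stored panorama are all assumptions of this sketch:

```cpp
#include <GLES2/gl2.h>
#include <vector>

void drawCurrentRange(GLuint program,
                      const std::vector<Vertex>& vertices,
                      const std::vector<TexCoord>& texCoords,
                      const unsigned char* panoramaRgba, int width, int height,
                      int vertical = 64, int horizontal = 64) {
    // Upload the pre-stored two-dimensional panoramic image as a texture.
    GLuint tex = 0;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, panoramaRgba);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

    // Stitch the (vertical+1) x (horizontal+1) vertex grid into triangles.
    std::vector<GLushort> indices;
    for (int y = 0; y < vertical; ++y) {
        for (int x = 0; x < horizontal; ++x) {
            GLushort i0 = GLushort(y * (horizontal + 1) + x);
            GLushort i1 = GLushort(i0 + 1);
            GLushort i2 = GLushort(i0 + horizontal + 1);
            GLushort i3 = GLushort(i2 + 1);
            GLushort quad[6] = {i0, i2, i1, i1, i2, i3};
            indices.insert(indices.end(), quad, quad + 6);
        }
    }

    // Draw only the current-range vertices with their texture coordinates; the
    // result lands in the frame buffer and is presented by the display (step 13).
    glUseProgram(program);
    GLuint posLoc = GLuint(glGetAttribLocation(program, "a_position"));
    GLuint texLoc = GLuint(glGetAttribLocation(program, "a_texcoord"));
    glEnableVertexAttribArray(posLoc);
    glEnableVertexAttribArray(texLoc);
    glVertexAttribPointer(posLoc, 3, GL_FLOAT, GL_FALSE, sizeof(Vertex), vertices.data());
    glVertexAttribPointer(texLoc, 2, GL_FLOAT, GL_FALSE, sizeof(TexCoord), texCoords.data());
    glDrawElements(GL_TRIANGLES, GLsizei(indices.size()), GL_UNSIGNED_SHORT, indices.data());
    glDeleteTextures(1, &tex);
}
```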
The 360-degree panoramic display method provided by this embodiment is able, from the detected current viewpoint, to construct only the sphere model within the current viewing angle range and to draw and render only that sphere model; that is, the sphere model outside the current viewing angle range does not need to be drawn and rendered, which reduces the amount of program computation and improves rendering efficiency.
The division of the above methods into steps is made only for clarity of description; in implementation, steps may be combined into one step or split into multiple steps, and as long as the same logical relationship is preserved, such variations fall within the protection scope of this patent; adding insignificant modifications to, or introducing insignificant designs into, an algorithm or flow without changing its core design also falls within the protection scope of this patent.
A second embodiment of the present invention relates to a 360-degree panoramic display module which, as shown in FIG. 2, comprises a viewpoint acquiring unit 10, a modeling unit 11, a rendering unit 12, and a display unit 13.
The viewpoint acquiring unit 10 is configured to acquire the current viewpoint. Specifically, the viewpoint acquiring unit 10 comprises a posture detecting subunit and a viewpoint calculating subunit; the posture detecting subunit is configured to detect the current posture of the mobile terminal, and the viewpoint calculating subunit is configured to calculate the current viewpoint according to the current posture. The posture detecting subunit comprises, for example, a gyroscope.
The modeling unit 11 is configured to establish the sphere model within the current viewing angle range according to the acquired current viewpoint.
The rendering unit 12 is configured to render the sphere model within the current viewing angle range to generate the three-dimensional image within the current viewing angle range. Specifically, the rendering unit 12 comprises a texture calculation subunit and a texture mapping subunit; the texture calculation subunit is configured to calculate the texture coordinates corresponding to the current viewing angle range according to the sphere model within the current viewing angle range, and the texture mapping subunit is configured to texture-map the sphere model within the current viewing angle range according to those texture coordinates, to generate the three-dimensional image within the current viewing angle range.
The display unit 13 is configured to display the three-dimensional image within the current viewing angle range.
It is not difficult to see that this embodiment is a system embodiment corresponding to the first embodiment, and this embodiment can be implemented in cooperation with the first embodiment. The related technical details mentioned in the first embodiment remain valid in this embodiment and, to reduce repetition, are not repeated here; correspondingly, the related technical details mentioned in this embodiment can also be applied to the first embodiment.
It is worth mentioning that each module involved in this embodiment is a logical module; in practical applications, a logical unit may be one physical unit, a part of a physical unit, or implemented as a combination of multiple physical units. In addition, in order to highlight the innovative part of the present invention, units that are not closely related to solving the technical problem proposed by the present invention are not introduced in this embodiment, but this does not mean that no other units exist in this embodiment.
A third embodiment of the present invention relates to a mobile terminal comprising the 360-degree panoramic display module of the second embodiment. The mobile terminal in this embodiment is a smartphone, but is not limited thereto.
The related technical details mentioned in the second embodiment remain valid in this embodiment, and the technical effects achievable in the second embodiment can likewise be achieved in this embodiment; to reduce repetition, they are not repeated here. Correspondingly, the related technical details mentioned in this embodiment can also be applied to the second embodiment.
The steps of a method or algorithm described in connection with the embodiments disclosed herein may be embodied directly in hardware, in a software module executed by a processor, or in a combination of the two. A software module may reside in random access memory (RAM), flash memory, read-only memory (ROM), programmable read-only memory (PROM), erasable programmable read-only memory (EPROM), electrically erasable programmable read-only memory (EEPROM), registers, a hard disk, a removable disk, a compact disc read-only memory (CD-ROM), or any other form of storage medium known in the art. In the alternative, the storage medium may be integral to the processor. The processor and the storage medium may reside in an application-specific integrated circuit (ASIC). The ASIC may reside in a computing device or a user terminal, or the processor and the storage medium may reside as discrete components in a computing device or a user terminal.
A person of ordinary skill in the art can understand that the above embodiments are specific examples for realizing the present invention, and that, in practical applications, various changes in form and detail may be made to them without departing from the spirit and scope of the present invention.

Claims (11)

  1. A 360-degree panoramic display method, comprising:
    acquiring a current viewpoint;
    establishing a sphere model within a current viewing angle range according to the current viewpoint;
    rendering the sphere model within the current viewing angle range to generate a three-dimensional image within the current viewing angle range; and
    displaying the three-dimensional image within the current viewing angle range.
  2. The 360-degree panoramic display method according to claim 1, wherein the step of establishing a sphere model within the current viewing angle range according to the current viewpoint comprises:
    establishing a sphere model within a reference viewing angle range according to a preset reference viewpoint and a preset reference viewing angle; and
    updating the sphere model within the reference viewing angle range using the current viewpoint, to produce the sphere model within the current viewing angle range.
  3. The 360-degree panoramic display method according to claim 1 or 2, wherein the step of acquiring the current viewpoint comprises:
    detecting a current posture of a mobile terminal; and
    calculating the current viewpoint according to the current posture.
  4. The 360-degree panoramic display method according to claim 3, wherein the current posture is characterized at least by a current angular velocity of the mobile terminal.
  5. The 360-degree panoramic display method according to any one of claims 1 to 4, wherein the step of rendering the sphere model within the current viewing angle range to generate a three-dimensional image within the current viewing angle range comprises:
    calculating texture coordinates corresponding to the current viewing angle range according to the sphere model within the current viewing angle range; and
    performing texture mapping on the sphere model within the current viewing angle range according to the texture coordinates corresponding to the current viewing angle range, to generate the three-dimensional image within the current viewing angle range.
  6. A 360-degree panoramic display module, applied to the 360-degree panoramic display method according to claim 1 or 2, the 360-degree panoramic display module comprising: a viewpoint acquiring unit, a modeling unit, a rendering unit, and a display unit;
    the viewpoint acquiring unit being configured to acquire a current viewpoint;
    the modeling unit being configured to establish a sphere model within a current viewing angle range according to the current viewpoint;
    the rendering unit being configured to render the sphere model within the current viewing angle range to generate a three-dimensional image within the current viewing angle range; and
    the display unit being configured to display the three-dimensional image within the current viewing angle range.
  7. The 360-degree panoramic display module according to claim 6, wherein the viewpoint acquiring unit comprises a posture detecting subunit and a viewpoint calculating subunit;
    the posture detecting subunit being configured to detect a current posture of the mobile terminal; and
    the viewpoint calculating subunit being configured to calculate the current viewpoint according to the current posture.
  8. The 360-degree panoramic display module according to claim 7, wherein the posture detecting subunit comprises a gyroscope.
  9. The 360-degree panoramic display module according to any one of claims 6 to 8, wherein the rendering unit comprises a texture calculation subunit and a texture mapping subunit;
    the texture calculation subunit being configured to calculate texture coordinates corresponding to the current viewing angle range according to the sphere model within the current viewing angle range; and
    the texture mapping subunit being configured to perform texture mapping on the sphere model within the current viewing angle range according to the texture coordinates corresponding to the current viewing angle range, to generate a three-dimensional image within the current viewing angle range.
  10. A mobile terminal, comprising the 360-degree panoramic display module according to any one of claims 6 to 9.
  11. A computer-readable storage medium, comprising computer-executable instructions that, when executed by at least one processor, cause the processor to perform the method according to any one of claims 1 to 5.
PCT/CN2016/089569 2015-12-28 2016-07-10 360度全景显示方法、显示模块及移动终端 WO2017113731A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US15/240,024 US20170186219A1 (en) 2015-12-28 2016-08-18 Method for 360-degree panoramic display, display module and mobile terminal

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201511014470.4 2015-12-28
CN201511014470.4A CN105913478A (zh) 2015-12-28 2015-12-28 360度全景显示方法、显示模块及移动终端

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US15/240,024 Continuation US20170186219A1 (en) 2015-12-28 2016-08-18 Method for 360-degree panoramic display, display module and mobile terminal

Publications (1)

Publication Number Publication Date
WO2017113731A1 true WO2017113731A1 (zh) 2017-07-06

Family

ID=56744257

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2016/089569 WO2017113731A1 (zh) 2015-12-28 2016-07-10 360度全景显示方法、显示模块及移动终端

Country Status (2)

Country Link
CN (1) CN105913478A (zh)
WO (1) WO2017113731A1 (zh)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111275803A (zh) * 2020-02-25 2020-06-12 北京百度网讯科技有限公司 3d模型渲染方法、装置、设备和存储介质
CN113064809A (zh) * 2020-01-02 2021-07-02 北京沃东天骏信息技术有限公司 一种跨设备页面调试方法和装置

Families Citing this family (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106447788B (zh) * 2016-09-26 2020-06-16 北京疯景科技有限公司 观看视角的指示方法及装置
CN106570923A (zh) * 2016-09-27 2017-04-19 乐视控股(北京)有限公司 一种画面渲染方法及装置
CN107979763B (zh) * 2016-10-21 2021-07-06 阿里巴巴集团控股有限公司 一种虚拟现实设备生成视频、播放方法、装置及系统
EP3319307B1 (en) * 2016-10-27 2023-09-06 Samsung Electronics Co., Ltd. Spherical image display apparatus and method of displaying a spherical image on a planar display
KR102589853B1 (ko) * 2016-10-27 2023-10-16 삼성전자주식회사 영상 표시 장치 및 영상 표시 방법
KR102598082B1 (ko) 2016-10-28 2023-11-03 삼성전자주식회사 영상 표시 장치, 모바일 장치 및 그 동작방법
CN109890472A (zh) 2016-11-14 2019-06-14 华为技术有限公司 一种图像渲染的方法、装置及vr设备
CN106604087B (zh) * 2016-12-13 2019-09-10 杭州映墨科技有限公司 一种全景直播的渲染实现方法
CN106815870B (zh) * 2016-12-16 2020-05-19 珠海研果科技有限公司 一种机内标定全景摄像装置的方法和系统
CN108513096B (zh) * 2017-02-27 2021-09-14 中国移动通信有限公司研究院 信息传输方法、代理服务器、终端设备以及内容服务器
CN106961592B (zh) * 2017-03-01 2020-02-14 深圳市魔眼科技有限公司 3d视频的vr显示方法及系统
CN107659851B (zh) * 2017-03-28 2019-09-17 腾讯科技(北京)有限公司 全景图像的展示控制方法及装置
CN108668108B (zh) * 2017-03-31 2021-02-19 杭州海康威视数字技术股份有限公司 一种视频监控的方法、装置及电子设备
CN108961371B (zh) * 2017-05-19 2023-06-02 阿里巴巴(中国)有限公司 全景启动页及app显示方法、处理装置以及移动终端
CN107248193A (zh) * 2017-05-22 2017-10-13 北京红马传媒文化发展有限公司 二维平面与虚拟现实场景进行切换的方法、系统及装置
CN109547766B (zh) 2017-08-03 2020-08-14 杭州海康威视数字技术股份有限公司 一种全景图像生成方法及装置
CN107526566B (zh) * 2017-09-13 2021-05-28 歌尔科技有限公司 一种移动终端的显示控制方法及装置
CN108154548B (zh) * 2017-12-06 2022-02-22 北京像素软件科技股份有限公司 图像渲染方法及装置
CN108921778B (zh) * 2018-07-06 2022-12-30 成都品果科技有限公司 一种星球效果图生成方法
CN112465939B (zh) * 2020-11-25 2023-01-24 上海哔哩哔哩科技有限公司 全景视频渲染方法及系统
CN112396683B (zh) * 2020-11-30 2024-06-04 腾讯科技(深圳)有限公司 虚拟场景的阴影渲染方法、装置、设备及存储介质
CN115018967B (zh) * 2022-06-30 2024-05-03 联通智网科技股份有限公司 一种图像生成方法、装置、设备和存储介质

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006211105A (ja) * 2005-01-26 2006-08-10 Konica Minolta Holdings Inc 画像生成装置およびシステム
CN102111672A (zh) * 2009-12-29 2011-06-29 康佳集团股份有限公司 一种在数字电视上浏览全景图像的方法、系统及终端
CN103905761A (zh) * 2012-12-26 2014-07-02 株式会社理光 图像处理系统和图像处理方法
CN104239431A (zh) * 2014-08-27 2014-12-24 广东威创视讯科技股份有限公司 三维gis模型显示方法及装置

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101458824B (zh) * 2009-01-08 2011-06-15 浙江大学 一种基于web的全景图的光照渲染方法
KR101694969B1 (ko) * 2012-10-29 2017-01-10 한국전자통신연구원 카메라 캘리브레이션 방법 및 장치

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006211105A (ja) * 2005-01-26 2006-08-10 Konica Minolta Holdings Inc 画像生成装置およびシステム
CN102111672A (zh) * 2009-12-29 2011-06-29 康佳集团股份有限公司 一种在数字电视上浏览全景图像的方法、系统及终端
CN103905761A (zh) * 2012-12-26 2014-07-02 株式会社理光 图像处理系统和图像处理方法
CN104239431A (zh) * 2014-08-27 2014-12-24 广东威创视讯科技股份有限公司 三维gis模型显示方法及装置

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113064809A (zh) * 2020-01-02 2021-07-02 北京沃东天骏信息技术有限公司 一种跨设备页面调试方法和装置
CN111275803A (zh) * 2020-02-25 2020-06-12 北京百度网讯科技有限公司 3d模型渲染方法、装置、设备和存储介质
CN111275803B (zh) * 2020-02-25 2023-06-02 北京百度网讯科技有限公司 3d模型渲染方法、装置、设备和存储介质

Also Published As

Publication number Publication date
CN105913478A (zh) 2016-08-31

Similar Documents

Publication Publication Date Title
WO2017113731A1 (zh) 360度全景显示方法、显示模块及移动终端
US20170186219A1 (en) Method for 360-degree panoramic display, display module and mobile terminal
CN107564089B (zh) 三维图像处理方法、装置、存储介质和计算机设备
US10643300B2 (en) Image display method, custom method of shaped cambered curtain, and head-mounted display device
EP3534336B1 (en) Panoramic image generating method and apparatus
US11282264B2 (en) Virtual reality content display method and apparatus
US8390617B1 (en) Visualizing oblique images
WO2017092303A1 (zh) 虚拟现实场景模型建立方法及装置
US9135750B2 (en) Technique for filling holes in a three-dimensional model
WO2017088361A1 (zh) 一种基于虚拟现实设备的视锥体裁剪方法及装置
US9165397B2 (en) Texture blending between view-dependent texture and base texture in a geographic information system
CN111161336B (zh) 三维重建方法、三维重建装置和计算机可读存储介质
Jia et al. 3D image reconstruction and human body tracking using stereo vision and Kinect technology
CN112399158A (zh) 投影图像校准方法、装置及投影设备
WO2017113729A1 (zh) 360度图像加载方法、加载模块及移动终端
WO2017113733A1 (zh) 一种三维视频的自由观影方法及设备
US11302023B2 (en) Planar surface detection
CN114419226A (zh) 全景渲染方法、装置、计算机设备和存储介质
US20180213215A1 (en) Method and device for displaying a three-dimensional scene on display surface having an arbitrary non-planar shape
CN113470112A (zh) 图像处理方法、装置、存储介质以及终端
US20230260218A1 (en) Method and apparatus for presenting object annotation information, electronic device, and storage medium
CN109816765B (zh) 面向动态场景的纹理实时确定方法、装置、设备和介质
WO2019042028A1 (zh) 全视向的球体光场渲染方法
CN115619986B (zh) 场景漫游方法、装置、设备和介质
CN114820980A (zh) 三维重建方法、装置、电子设备和可读存储介质

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16880529

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16880529

Country of ref document: EP

Kind code of ref document: A1