WO2018045759A1 - Method and device for lighting rendering in augmented reality, and mobile terminal - Google Patents

Method and device for lighting rendering in augmented reality, and mobile terminal

Info

Publication number
WO2018045759A1
WO2018045759A1 (PCT/CN2017/081402)
Authority
WO
WIPO (PCT)
Prior art keywords
light source
real
scene
real scene
augmented reality
Prior art date
Application number
PCT/CN2017/081402
Other languages
French (fr)
Chinese (zh)
Inventor
邵红胜 (Shao Hongsheng)
Original Assignee
中兴通讯股份有限公司 (ZTE Corporation)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 中兴通讯股份有限公司 (ZTE Corporation)
Publication of WO2018045759A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 15/00 3D [Three Dimensional] image rendering
    • G06T 15/50 Lighting effects
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 15/00 3D [Three Dimensional] image rendering
    • G06T 15/50 Lighting effects
    • G06T 15/506 Illumination models
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 15/00 3D [Three Dimensional] image rendering
    • G06T 15/06 Ray-tracing
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 15/00 3D [Three Dimensional] image rendering
    • G06T 15/50 Lighting effects
    • G06T 15/60 Shadow generation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 19/00 Manipulating 3D models or images for computer graphics
    • G06T 19/006 Mixed reality
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2200/00 Indexing scheme for image data processing or generation, in general
    • G06T 2200/04 Indexing scheme for image data processing or generation, in general involving 3D image data
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2215/00 Indexing scheme for image rendering
    • G06T 2215/12 Shadow map, environment map
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2215/00 Indexing scheme for image rendering
    • G06T 2215/16 Using real world measurements to influence rendering

Definitions

  • the present disclosure relates to the field of augmented reality technologies, and in particular, to a method, an apparatus, and a mobile terminal for performing illumination rendering in an augmented reality.
  • Augmented Reality (AR) is a technique for calculating the position and angle of a camera image in real time and superimposing a corresponding virtual model.
  • The goal of this technology is to load the virtual model into the real world on the screen and interact with it.
  • However, the rendering of virtual models in augmented reality has always been key to the visual experience.
  • In particular, the lighting rendering of virtual models has the greatest impact on the visual experience.
  • Current augmented reality technology simply renders the virtual model with a fixed light source, so the realism of the virtual model cannot meet requirements.
  • the technical problem to be solved by the present disclosure is to provide a method, a device and a mobile terminal for performing illumination rendering in an augmented reality, which enhances the fidelity of a virtual object and improves the visual effect of the augmented reality.
  • the determining the position and brightness of the light source in the real scene includes:
  • The brightness values of all position points in the real scene are calculated, and the maximum brightness value in the real scene is taken as the brightness of the light source in the real scene.
  • Determining the three-dimensional coordinate position of the light source in the real scene according to the positional relationship between any real object in the real scene and its shadow includes:
  • The position of the intersection of the extension lines or reverse extension lines of all the feature vectors is determined as the three-dimensional coordinate position of the light source in the real scene.
  • the feature points include: an inflection point and an extreme point.
  • The feature vector is a vector pointing from a feature point on the physical boundary line to the corresponding feature point on the shadow boundary line.
  • Performing illumination rendering on the virtual object in the virtual scene according to the position and brightness of the light source in the real scene includes:
  • Converting the position of the light source in the real scene to the position of the light source in the virtual scene comprises: obtaining a real-scene model transformation matrix M according to a parallel tracking and mapping algorithm, and using M to convert the position (x, y, z) of the light source in the real scene into the position (u, v, w) of the light source in the virtual scene, where (u, v, w) = M * (x, y, z).
  • the present disclosure also provides an apparatus for performing illumination rendering in augmented reality, including:
  • a light source determining module for determining a position and brightness of the light source in a real scene
  • the illumination rendering module is configured to perform illumination rendering on the virtual object in the virtual scene according to the position and brightness of the light source in the real scene.
  • the light source determining module includes:
  • a position determining unit configured to determine a three-dimensional coordinate position of the light source in the real scene according to a positional relationship between any real object and a shadow thereof in the real scene;
  • the brightness determining unit is configured to calculate a brightness value of all the position points in the real scene, and the maximum brightness value in the real scene is the brightness of the light source in the real scene.
  • the location determining unit is configured to:
  • The position of the intersection of the extension lines or reverse extension lines of all the feature vectors is determined as the three-dimensional coordinate position of the light source in the real scene.
  • the feature points include: an inflection point and an extreme point.
  • The feature vector is a vector pointing from a feature point on the physical boundary line to the corresponding feature point on the shadow boundary line.
  • the illumination rendering module includes:
  • a conversion unit configured to convert a position of the light source in the real scene to a position of the light source in the virtual scene
  • a rendering unit configured to perform illumination rendering on the virtual object in the virtual scene according to the position of the light source in the virtual scene and the brightness of the light source in the real scene.
  • The conversion unit is configured to: obtain a real-scene model transformation matrix M according to a parallel tracking and mapping algorithm, and use M to convert the position (x, y, z) of the light source in the real scene into the position (u, v, w) of the light source in the virtual scene, where (u, v, w) = M * (x, y, z).
  • the present disclosure also provides a mobile terminal, including the apparatus for performing illumination rendering in the augmented reality described above.
  • The present disclosure also provides a non-transitory computer readable storage medium having computer program instructions stored therein; when one or more processors of a terminal device execute the computer program instructions, the terminal device performs the method of lighting rendering in augmented reality described above.
  • the present disclosure has at least the following advantages:
  • The method, device and mobile terminal for performing illumination rendering in augmented reality of the present disclosure can acquire the light source information in the real scene of the augmented reality in real time, and dynamically perform illumination rendering on the virtual objects in the augmented-reality virtual scene according to that light source information, so that the virtual objects in the virtual scene blend better with the real scene, bringing the user a visual experience closer to reality.
  • FIG. 1 is a flowchart of a method for performing illumination rendering in an augmented reality according to a first embodiment of the present disclosure
  • FIG. 2 is a flowchart of a method for performing illumination rendering in augmented reality according to a second embodiment of the present disclosure
  • FIG. 3 is a schematic structural diagram of a device for performing illumination rendering in an augmented reality according to a third embodiment of the present disclosure
  • FIG. 4 is a schematic structural diagram of an apparatus for performing illumination rendering in an augmented reality according to a fourth embodiment of the present disclosure.
  • A first embodiment of the present disclosure, a method for performing illumination rendering in augmented reality, as shown in FIG. 1, includes the following steps:
  • Step S101 Determine the position and brightness of the light source in the real scene.
  • step S101 includes:
  • Step A1 Determine the three-dimensional coordinate position of the light source in the real scene according to the positional relationship between any real object and its shadow in the real scene.
  • step A1 includes:
  • Step A11 Determine the physical boundary line and the shadow boundary line of any real object in the real scene.
  • Step A12 Select feature points respectively on the physical boundary line and the shadow boundary line of any of the real objects.
  • the feature points include: an inflection point and an extreme point, for example, a highest point, a lowest point, and a vertex on the boundary line.
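As one illustration of selecting extreme points, the sketch below picks the highest, lowest, leftmost and rightmost points of a 2D object contour. The contour data and the function name are hypothetical, and a real implementation would also detect inflection points (e.g. via curvature changes along the boundary), which this sketch omits.

```python
def extreme_feature_points(contour):
    """Pick simple extreme points (highest, lowest, leftmost, rightmost)
    from a closed 2D contour given as (x, y) pairs in image coordinates."""
    xs = [p[0] for p in contour]
    ys = [p[1] for p in contour]
    return {
        "top":    contour[ys.index(min(ys))],   # smallest y = highest in image coords
        "bottom": contour[ys.index(max(ys))],
        "left":   contour[xs.index(min(xs))],
        "right":  contour[xs.index(max(xs))],
    }

# Toy triangle-like silhouette of an object
object_contour = [(2, 0), (0, 4), (4, 4)]
pts = extreme_feature_points(object_contour)
```

The same selection would be applied to the shadow's boundary line, and points are then paired by role (top with top, left with left, and so on) to form the correspondences used in the next steps.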
  • Step A13 Determine a feature vector according to a correspondence relationship between a feature point on the physical boundary line and a feature point on the shadow boundary line.
  • The feature vector is a vector pointing from a feature point on the physical boundary line to the corresponding feature point on the shadow boundary line.
  • Step A14 Determine the position of the intersection of the extension lines or reverse extension lines of all the feature vectors as the three-dimensional coordinate position of the light source in the real scene.
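Steps A12 to A14 can be sketched as a least-squares line intersection: each feature vector defines a line through an object feature point and its corresponding shadow feature point, and the light source is the point closest to all such lines (their common intersection when the data is exact). The numbers below are a made-up example with a light at (0, 0, 5) casting shadows onto the ground plane z = 0; this is a sketch of the geometric idea, not the patent's exact procedure.

```python
def estimate_light_position(entity_pts, shadow_pts):
    """Least-squares point closest to all lines through corresponding
    entity/shadow feature points; the light lies on these lines extended.
    Solves sum_k (I - d_k d_k^T) x = sum_k (I - d_k d_k^T) e_k."""
    A = [[0.0] * 3 for _ in range(3)]
    b = [0.0] * 3
    for e, s in zip(entity_pts, shadow_pts):
        d = [s[i] - e[i] for i in range(3)]
        n = sum(x * x for x in d) ** 0.5
        d = [x / n for x in d]  # unit direction of the feature vector
        for i in range(3):
            for j in range(3):
                P = (1.0 if i == j else 0.0) - d[i] * d[j]  # (I - d d^T)[i][j]
                A[i][j] += P
                b[i] += P * e[j]
    return _solve3(A, b)

def _solve3(A, b):
    """Solve a 3x3 linear system by Cramer's rule."""
    def det(m):
        return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
              - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
              + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))
    D = det(A)
    out = []
    for k in range(3):
        Ak = [row[:] for row in A]
        for i in range(3):
            Ak[i][k] = b[i]
        out.append(det(Ak) / D)
    return out

# Made-up example: a light at (0, 0, 5) casting shadows onto the plane z = 0
entity = [(1, 0, 2), (0, 1, 2), (1, 1, 3)]
shadow = [(5 / 3, 0, 0), (0, 5 / 3, 0), (2.5, 2.5, 0)]
light = estimate_light_position(entity, shadow)
```

Because every line passes through the true light position, the least-squares solution recovers it exactly here; with noisy boundaries the same formulation returns the nearest consistent point.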
  • Step A2 Calculate the brightness value of all the position points in the real scene, and the maximum brightness value in the real scene is the brightness of the light source in the real scene.
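Step A2 can be illustrated as a per-pixel luminance scan over a frame. The Rec.601 luma weights used below are one common brightness measure, chosen here as an assumption since the patent does not specify a formula, and the frame data is a toy example.

```python
def light_source_brightness(pixels):
    """Return the maximum luminance over all pixels of a frame.
    pixels: iterable of (R, G, B) tuples in 0..255.
    Uses Rec.601 luma weights as one common luminance measure (assumption)."""
    def luma(rgb):
        r, g, b = rgb
        return 0.299 * r + 0.587 * g + 0.114 * b
    return max(luma(p) for p in pixels)

# Toy frame: the white pixel dominates, so the light brightness is ~255
frame = [(10, 10, 10), (200, 180, 90), (255, 255, 255), (0, 0, 128)]
brightness = light_source_brightness(frame)
```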
  • Step S102 Perform light rendering on the virtual object in the virtual scene according to the position and brightness of the light source in the real scene.
  • step S102 includes:
  • Step B1 Converting the position of the light source in the real scene to the position of the light source in the virtual scene.
  • Step B2 Perform light rendering on the virtual object in the virtual scene according to the position of the light source in the virtual scene and the brightness of the light source in the real scene.
  • Step B1 includes: obtaining a real-scene model transformation matrix M according to a parallel tracking and mapping algorithm, and using M to convert the position (x, y, z) of the light source in the real scene into the position (u, v, w) of the light source in the virtual scene, where (u, v, w) = M * (x, y, z).
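The disclosure describes this conversion elsewhere as applying a model transformation matrix M, obtained from a parallel tracking and mapping algorithm, with (u, v, w) = M * (x, y, z). A minimal sketch follows, assuming M is a 4x4 homogeneous transform as is common for tracked poses (the 3-vector form in the text would simply drop the homogeneous step):

```python
def real_to_virtual(M, p):
    """Map the light position p = (x, y, z) in the real scene into the
    virtual scene via a 4x4 homogeneous model transformation matrix M
    (assumed obtained from a tracking-and-mapping system such as PTAM)."""
    x, y, z = p
    ph = (x, y, z, 1.0)  # homogeneous coordinates
    uvw_h = [sum(M[i][j] * ph[j] for j in range(4)) for i in range(4)]
    return tuple(c / uvw_h[3] for c in uvw_h[:3])

# Example M: identity rotation plus a translation of (1, 2, 3)
M = [[1, 0, 0, 1],
     [0, 1, 0, 2],
     [0, 0, 1, 3],
     [0, 0, 0, 1]]
u, v, w = real_to_virtual(M, (0.5, 0.0, 2.0))
```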
  • A second embodiment of the present disclosure, a method for performing illumination rendering in augmented reality, as shown in FIG. 2, includes the following steps:
  • Step S201 Acquire a real-world video image by using a camera.
  • Step S202 Decomposing the video image of the real world to obtain a series of image frames.
  • Step S203 Calculate the position of the light source in each of the image frames.
  • step S203 includes:
  • Step C1 Determine a physical boundary line and a shadow boundary line of any real object in the image frame.
  • Step C2 Select feature points on the physical boundary line and the shadow boundary line, respectively.
  • the feature points include: an inflection point and an extreme point, for example, a highest point, a lowest point, and a vertex on the boundary line.
  • Step C3 Determine a feature vector according to a correspondence relationship between a feature point on the solid boundary line and a feature point on the shadow boundary line.
  • The feature vector is a vector pointing from a feature point on the physical boundary line to the corresponding feature point on the shadow boundary line.
  • Step C4 Determine the position of the light source in the image frame according to the feature vector.
  • The position of the light source in the image frame is the intersection of the extension lines or reverse extension lines of all of the feature vectors.
  • Step S204 Calculate the brightness of the light source in each of the image frames.
  • step S204 includes:
  • The luminance value of each pixel in an image frame is calculated, and the maximum luminance value in that frame is taken as the luminance of the light source in that frame.
  • Step S205 Converting the position of the light source in the image frame to the position of the light source in the three-dimensional model of the augmented reality.
  • Step S205 includes: converting, by means of the model transformation matrix M obtained from the parallel tracking and mapping algorithm, the position (x, y, z) of the light source in the image frame into the position (u, v, w) of the light source in the three-dimensional model of the augmented reality, where (u, v, w) = M * (x, y, z).
  • Step S206 Perform illumination rendering on the virtual object in the three-dimensional model of the augmented reality according to the position of the light source in the three-dimensional model of the augmented reality and the brightness of the light source in the image frame.
  • Step S207 Place the virtual object rendered by the illumination into the real world video image acquired by the camera.
  • the light source information in the real-world video image is dynamically extracted, and the virtual object is dynamically rendered according to the light source information.
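The second embodiment's per-frame loop (steps S202 through S207) can be sketched as follows. Every helper function below is a hypothetical stand-in, not the patent's implementation: a real system would implement them with the shadow-geometry, max-luminance, and matrix-conversion techniques described above.

```python
# Minimal stand-ins so the sketch runs; each is a placeholder for the
# corresponding technique described in the embodiment.
def estimate_light_position(frame):   return frame["light_pos"]
def estimate_light_brightness(frame): return max(frame["pixels"])
def transform(M, p):                  return tuple(p[i] + M[i] for i in range(3))  # toy translation-only "matrix"
def render_with_light(obj, pos, b):   return {"object": obj, "light": pos, "brightness": b}
def composite(frame, lit):            return {"frame": frame["id"], **lit}

def render_ar_stream(frames, model_matrix, virtual_object):
    """Per frame: estimate the light, map it into the virtual scene,
    light the virtual object, and put it back into the frame."""
    out = []
    for frame in frames:
        pos = estimate_light_position(frame)           # steps C1-C4 (shadow geometry)
        brightness = estimate_light_brightness(frame)  # step S204 (max luminance)
        vpos = transform(model_matrix, pos)            # step S205: (u, v, w) = M * (x, y, z)
        lit = render_with_light(virtual_object, vpos, brightness)  # step S206
        out.append(composite(frame, lit))              # step S207
    return out

frames = [{"id": 0, "light_pos": (1, 2, 3), "pixels": [10, 250, 30]}]
result = render_ar_stream(frames, model_matrix=(0, 0, 1), virtual_object="teapot")
```

The loop structure is the point here: light estimation is redone for every frame, which is what makes the rendering adapt dynamically to changing real-world lighting.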
  • A third embodiment of the present disclosure, an apparatus for performing illumination rendering in augmented reality, as shown in FIG. 3, includes the following components:
  • a light source determining module 301 for determining the position and brightness of the light source in a real scene.
  • the light source determining module 301 includes:
  • the position determining unit is configured to determine a three-dimensional coordinate position of the light source in the real scene according to a positional relationship between any real object and its shadow in the real scene.
  • the location determining unit is configured to:
  • Feature points are respectively selected on the physical boundary line and the shadow boundary line of any of the real objects.
  • the feature points include: an inflection point and an extreme point, for example, a highest point, a lowest point, and a vertex on the boundary line.
  • the feature vector is determined according to a correspondence relationship between a feature point on the solid boundary line and a feature point on the shadow boundary line.
  • The feature vector is a vector pointing from a feature point on the physical boundary line to the corresponding feature point on the shadow boundary line.
  • The position of the intersection of the extension lines or reverse extension lines of all the feature vectors is determined as the three-dimensional coordinate position of the light source in the real scene.
  • the brightness determining unit is configured to calculate a brightness value of all the position points in the real scene, and the maximum brightness value in the real scene is the brightness of the light source in the real scene.
  • the illumination rendering module 302 is configured to perform illumination rendering on the virtual object in the virtual scene according to the position and brightness of the light source in the real scene.
  • the illumination rendering module 302 is configured to:
  • a conversion unit for converting a position of the light source in the real scene to a position of the light source in the virtual scene.
  • a rendering unit configured to perform illumination rendering on the virtual object in the virtual scene according to the position of the light source in the virtual scene and the brightness of the light source in the real scene.
  • The conversion unit is configured to: obtain a real-scene model transformation matrix M according to a parallel tracking and mapping algorithm, and use M to convert the position (x, y, z) of the light source in the real scene into the position (u, v, w) of the light source in the virtual scene, where (u, v, w) = M * (x, y, z).
  • A fourth embodiment of the present disclosure, an apparatus for performing illumination rendering in augmented reality, as shown in FIG. 4, includes the following components:
  • a video acquisition module 401 configured to acquire a real-world video image by using a camera.
  • the video decomposition module 402 is configured to decompose the real-world video image to obtain a series of image frames.
  • a position calculation module 403 for calculating the position of the light source in each of the image frames.
  • the calculating module 403 includes:
  • a boundary line determining unit is configured to determine a physical boundary line and a shadow boundary line of any real object in the image frame.
  • a feature point determining unit is configured to respectively select feature points on the physical boundary line and the shadow boundary line.
  • the feature points include: an inflection point and an extreme point, for example, a highest point, a lowest point, and a vertex on the boundary line.
  • a feature vector determining unit configured to determine a feature vector according to a correspondence between a feature point on the solid boundary line and a feature point on the shadow boundary line.
  • The feature vector is a vector pointing from a feature point on the physical boundary line to the corresponding feature point on the shadow boundary line.
  • a light source determining unit configured to determine a position of the light source in the image frame according to the feature vector.
  • The position of the light source in the image frame is the intersection of the extension lines or reverse extension lines of all of the feature vectors.
  • a brightness calculation module 404 for calculating the brightness of the light source in each of the image frames.
  • the brightness calculation module 404 is configured to:
  • A luminance value is calculated for each pixel in an image frame, and the maximum luminance value in that frame is taken as the luminance of the light source in that frame.
  • a position conversion module 405 for converting the position of the light source in the image frame to the position of the light source in the three-dimensional model of the augmented reality.
  • The position conversion module 405 is configured to: convert, by means of the model transformation matrix M obtained from the parallel tracking and mapping algorithm, the position (x, y, z) of the light source in the image frame into the position (u, v, w) of the light source in the three-dimensional model of the augmented reality, where (u, v, w) = M * (x, y, z).
  • the illumination rendering module 406 is configured to perform illumination rendering on the virtual object in the augmented reality three-dimensional model according to the position of the light source in the three-dimensional model of the augmented reality and the brightness of the light source in the image frame.
  • the video restoration module 407 is configured to place the virtual object rendered by the illumination into the real world video image acquired by the camera.
  • the light source information in the real-world video image is dynamically extracted, and the virtual object is dynamically rendered according to the light source information.
  • a fifth embodiment of the present disclosure is a mobile terminal provided with an apparatus for performing illumination rendering in an augmented reality according to a third embodiment of the present disclosure.
  • The present disclosure also provides a non-transitory computer readable storage medium having computer program instructions stored therein; when one or more processors of a terminal device execute the computer program instructions, the terminal device performs the method of performing illumination rendering in augmented reality as described in the first or second embodiment.
  • The method, device and mobile terminal for performing illumination rendering in augmented reality introduced in the embodiments of the present disclosure can acquire the light source information in the real scene of the augmented reality in real time, and dynamically perform illumination rendering on the virtual objects in the augmented-reality virtual scene according to that light source information, so that the virtual objects in the virtual scene blend better with the real scene, bringing the user a visual experience closer to reality.
  • The method for performing illumination rendering in augmented reality may be applied by a terminal device that acquires the light source information in the real scene of the augmented reality in real time and dynamically performs illumination rendering on the virtual objects in the augmented-reality virtual scene according to that information, so that the virtual objects in the virtual scene blend better with the real scene, bringing the user a visual experience closer to reality.

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Graphics (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Processing Or Creating Images (AREA)

Abstract

A method and device for lighting rendering in augmented reality, and a mobile terminal. The method comprises: determining position and brightness of a light source in a real scene; and performing lighting rendering on a virtual object in a virtual scene according to the position and brightness of the light source in the real scene. The present invention can obtain information of a light source in a real scene in augmented reality in real time, and dynamically perform lighting rendering on a virtual object in a virtual scene in augmented reality according to the information of the light source in the real scene, so that the virtual object in the virtual scene can be better combined with the real scene, thereby providing visual experience closer to reality for a user.

Description

Method, device and mobile terminal for performing illumination rendering in augmented reality
Technical Field
The present disclosure relates to the field of augmented reality technologies, and in particular to a method, an apparatus, and a mobile terminal for performing illumination rendering in augmented reality.
Background
Augmented Reality (AR) is a technique for calculating the position and angle of a camera image in real time and superimposing a corresponding virtual model; the goal of this technology is to load the virtual model into the real world on the screen and interact with it. However, the rendering of virtual models in augmented reality has always been key to the visual experience; in particular, the lighting rendering of virtual models has the greatest impact on it. Current augmented reality technology simply renders the virtual model with a fixed light source, so the realism of the virtual model cannot meet requirements.
Summary of the Invention
The technical problem to be solved by the present disclosure is to provide a method, a device and a mobile terminal for performing illumination rendering in augmented reality, which enhance the fidelity of virtual objects and improve the visual effect of augmented reality.
The technical solution adopted by the present disclosure is a method for performing illumination rendering in augmented reality, including:
determining the position and brightness of a light source in a real scene; and
performing illumination rendering on a virtual object in a virtual scene according to the position and brightness of the light source in the real scene.
Optionally, determining the position and brightness of the light source in the real scene includes:
determining the three-dimensional coordinate position of the light source in the real scene according to the positional relationship between any real object in the real scene and its shadow; and
calculating the brightness values of all position points in the real scene, the maximum brightness value in the real scene being the brightness of the light source in the real scene.
Optionally, determining the three-dimensional coordinate position of the light source in the real scene according to the positional relationship between any real object in the real scene and its shadow includes:
determining the physical boundary line and the shadow boundary line of any real object in the real scene;
selecting feature points on the physical boundary line and the shadow boundary line of the real object respectively;
determining feature vectors according to the correspondence between the feature points on the physical boundary line and the feature points on the shadow boundary line; and
determining the position of the intersection of the extension lines or reverse extension lines of all the feature vectors as the three-dimensional coordinate position of the light source in the real scene.
Optionally, the feature points include inflection points and extreme points.
Optionally, the feature vector is a vector pointing from a feature point on the physical boundary line to the corresponding feature point on the shadow boundary line.
Optionally, performing illumination rendering on the virtual object in the virtual scene according to the position and brightness of the light source in the real scene includes:
converting the position of the light source in the real scene into the position of the light source in the virtual scene; and
performing illumination rendering on the virtual object in the virtual scene according to the position of the light source in the virtual scene and the brightness of the light source in the real scene.
Optionally, converting the position of the light source in the real scene into the position of the light source in the virtual scene includes:
obtaining a real-scene model transformation matrix M according to a parallel tracking and mapping algorithm, and using M to convert the position (x, y, z) of the light source in the real scene into the position (u, v, w) of the light source in the virtual scene, where (u, v, w) = M * (x, y, z).
The present disclosure also provides an apparatus for performing illumination rendering in augmented reality, including:
a light source determining module, configured to determine the position and brightness of a light source in a real scene; and
an illumination rendering module, configured to perform illumination rendering on a virtual object in a virtual scene according to the position and brightness of the light source in the real scene.
Optionally, the light source determining module includes:
a position determining unit, configured to determine the three-dimensional coordinate position of the light source in the real scene according to the positional relationship between any real object in the real scene and its shadow; and
a brightness determining unit, configured to calculate the brightness values of all position points in the real scene, the maximum brightness value in the real scene being the brightness of the light source in the real scene.
Optionally, the position determining unit is configured to:
determine the physical boundary line and the shadow boundary line of any real object in the real scene;
select feature points on the physical boundary line and the shadow boundary line of the real object respectively;
determine feature vectors according to the correspondence between the feature points on the physical boundary line and the feature points on the shadow boundary line; and
determine the position of the intersection of the extension lines or reverse extension lines of all the feature vectors as the three-dimensional coordinate position of the light source in the real scene.
Optionally, the feature points include inflection points and extreme points.
Optionally, the feature vector is a vector pointing from a feature point on the physical boundary line to the corresponding feature point on the shadow boundary line.
Optionally, the illumination rendering module includes:
a conversion unit, configured to convert the position of the light source in the real scene into the position of the light source in the virtual scene; and
a rendering unit, configured to perform illumination rendering on the virtual object in the virtual scene according to the position of the light source in the virtual scene and the brightness of the light source in the real scene.
Optionally, the conversion unit is configured to:
obtain a real-scene model transformation matrix M according to a parallel tracking and mapping algorithm, and use M to convert the position (x, y, z) of the light source in the real scene into the position (u, v, w) of the light source in the virtual scene, where (u, v, w) = M * (x, y, z).
The present disclosure also provides a mobile terminal, including the apparatus for performing illumination rendering in augmented reality described above.
The present disclosure also provides a non-transitory computer readable storage medium having computer program instructions stored therein; when one or more processors of a terminal device execute the computer program instructions, the terminal device performs the method of performing illumination rendering in augmented reality described above.
With the above technical solutions, the present disclosure has at least the following advantages:
The method, device and mobile terminal for performing illumination rendering in augmented reality described in the present disclosure can acquire the light source information in the real scene of the augmented reality in real time, and dynamically perform illumination rendering on the virtual objects in the augmented-reality virtual scene according to that light source information, so that the virtual objects in the virtual scene blend better with the real scene, bringing the user a visual experience closer to reality.
BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1 is a flowchart of a method for illumination rendering in augmented reality according to a first embodiment of the present disclosure;
FIG. 2 is a flowchart of a method for illumination rendering in augmented reality according to a second embodiment of the present disclosure;
FIG. 3 is a schematic structural diagram of an apparatus for illumination rendering in augmented reality according to a third embodiment of the present disclosure;
FIG. 4 is a schematic structural diagram of an apparatus for illumination rendering in augmented reality according to a fourth embodiment of the present disclosure.
DETAILED DESCRIPTION
To further explain the technical means adopted by the present disclosure to achieve its intended objectives, and the effects thereof, the present disclosure is described in detail below with reference to the accompanying drawings and preferred embodiments.
According to a first embodiment of the present disclosure, a method for illumination rendering in augmented reality, as shown in FIG. 1, includes the following steps:
Step S101: determine the position and brightness of a light source in the real scene.
Optionally, step S101 includes:
Step A1: determine the three-dimensional coordinate position of the light source in the real scene according to the positional relationship between any real object in the real scene and its shadow.
Optionally, step A1 includes:
Step A11: determine the entity boundary line and the shadow boundary line of any real object in the real scene.
Step A12: select feature points on the entity boundary line and on the shadow boundary line of the real object, respectively. The feature points include inflection points and extreme points, for example the highest point, the lowest point, and the vertices of a boundary line.
Step A13: determine feature vectors according to the correspondence between the feature points on the entity boundary line and the feature points on the shadow boundary line. A feature vector is the vector from a feature point on the entity boundary line to the corresponding feature point on the shadow boundary line.
Step A14: determine the intersection point of the extension lines or reverse extension lines of all the feature vectors as the three-dimensional coordinate position of the light source in the real scene.
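Steps A11–A14 amount to finding the point nearest to all of the object-to-shadow lines: the light source lies on the reverse extension of every feature vector, so with noisy measurements the intersection is best taken in the least-squares sense. The sketch below is illustrative only; the function name and the NumPy formulation are not part of the disclosure.

```python
import numpy as np

def estimate_light_position(entity_pts, shadow_pts):
    # Each feature vector points from an object feature point to its
    # corresponding shadow feature point; the light source lies on the
    # (reverse) extension of every such line.  Find the point nearest
    # to all of the lines in the least-squares sense.
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for p, s in zip(entity_pts, shadow_pts):
        d = (s - p) / np.linalg.norm(s - p)  # unit direction of the line
        P = np.eye(3) - np.outer(d, d)       # projector onto the plane normal to d
        A += P
        b += P @ p
    return np.linalg.solve(A, b)             # point minimizing distance to all lines
```

With exact (noise-free) feature pairs, every line passes through the light source and the solver recovers it exactly; with noisy boundary detections it returns the nearest point to the bundle of lines.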
Step A2: compute the brightness value of every point in the real scene; the maximum brightness value in the real scene is the brightness of the light source in the real scene.
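Step A2 is a per-pixel luminance scan followed by a maximum. The Rec. 601 luma weighting below is an assumed choice, since the disclosure only states that a brightness value is computed for every point:

```python
import numpy as np

def scene_brightness(rgb):
    # Brightness of every pixel; the maximum is taken as the
    # light source brightness.  Rec. 601 weights are an assumption.
    luma = 0.299 * rgb[..., 0] + 0.587 * rgb[..., 1] + 0.114 * rgb[..., 2]
    return float(luma.max())
```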
Step S102: perform illumination rendering on a virtual object in the virtual scene according to the position and brightness of the light source in the real scene.
Optionally, step S102 includes:
Step B1: convert the position of the light source in the real scene into the position of the light source in the virtual scene.
Step B2: perform illumination rendering on the virtual object in the virtual scene according to the position of the light source in the virtual scene and the brightness of the light source in the real scene.
Optionally, step B1 includes:
obtaining a real-scene model transformation matrix M from a parallel tracking and mapping algorithm, and using the matrix M to convert the position (x, y, z) of the light source in the real scene into the position (u, v, w) of the light source in the virtual scene, where (u, v, w) = M * (x, y, z).
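The coordinate conversion of step B1 is a single matrix product. A minimal sketch, assuming M has already been estimated by the tracking algorithm; the 4x4 homogeneous branch is an added convenience for matrices as tracking libraries often return them, not something stated in the disclosure:

```python
import numpy as np

def to_virtual(M, pos):
    # (u, v, w) = M * (x, y, z).  Accepts the 3x3 matrix of the text,
    # or a 4x4 homogeneous transform (an assumed extension).
    p = np.asarray(pos, dtype=float)
    M = np.asarray(M, dtype=float)
    if M.shape == (4, 4):
        u, v, w, s = M @ np.append(p, 1.0)
        return np.array([u, v, w]) / s       # de-homogenize
    return M @ p
```

For example, `to_virtual(2 * np.eye(3), (1, 2, 3))` scales the real-scene position into `[2, 4, 6]` in virtual coordinates.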
According to a second embodiment of the present disclosure, a method for illumination rendering in augmented reality, as shown in FIG. 2, includes the following steps:
Step S201: capture a real-world video image with a camera.
Step S202: decompose the real-world video image into a series of image frames.
Step S203: compute the position of the light source in each image frame.
Optionally, step S203 includes:
Step C1: determine the entity boundary line and the shadow boundary line of any real object in the image frame.
Step C2: select feature points on the entity boundary line and on the shadow boundary line, respectively.
The feature points include inflection points and extreme points, for example the highest point, the lowest point, and the vertices of a boundary line.
Step C3: determine feature vectors according to the correspondence between the feature points on the entity boundary line and the feature points on the shadow boundary line. A feature vector is the vector from a feature point on the entity boundary line to the corresponding feature point on the shadow boundary line.
Step C4: determine the position of the light source in the image frame according to the feature vectors. The position of the light source in the image frame is the intersection point of the extension lines or reverse extension lines of all the feature vectors.
Step S204: compute the brightness of the light source in each image frame.
Optionally, step S204 includes:
computing the brightness value of every pixel in an image frame; the maximum brightness value in the frame is the brightness of the light source in that frame.
Step S205: convert the position of the light source in the image frame into the position of the light source in the three-dimensional model of the augmented reality.
Optionally, step S205 includes:
obtaining a real-scene model transformation matrix M from the PTAM (Parallel Tracking and Mapping) algorithm, and using the matrix M to convert the position (x, y, z) of the light source in the image frame into the position (u, v, w) of the light source in the three-dimensional model of the augmented reality, where (u, v, w) = M * (x, y, z).
Step S206: perform illumination rendering on a virtual object in the three-dimensional model of the augmented reality according to the position of the light source in that model and the brightness of the light source in the image frame.
Step S207: place the illumination-rendered virtual object into the real-world video image captured by the camera.
As the camera keeps capturing real-world video images, the light source information in those images is extracted dynamically, and the virtual object is dynamically relit according to that information.
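The per-frame loop of steps S201–S207 can be sketched as follows. `estimate_light`, `relight`, and `composite` are hypothetical stand-ins for the steps described above, not names from the disclosure, and the brightest-pixel light estimate here is a deliberately crude placeholder for the shadow-based method of steps C1–C4:

```python
import numpy as np

def estimate_light(frame):
    # Placeholder for steps S203/C1-C4: use the brightest pixel's
    # image coordinates as a crude light position proxy (x, y, z=1).
    row, col = np.unravel_index(np.argmax(frame.sum(axis=-1)), frame.shape[:2])
    return np.array([col, row, 1.0])

def max_luminance(frame):
    # Step S204: maximum per-pixel brightness in the frame.
    return float(frame.mean(axis=-1).max())

def render_ar_stream(frames, M, relight, composite):
    # Steps S203-S207 applied to every decomposed frame (step S202).
    for frame in frames:
        pos = M @ estimate_light(frame)            # S205: map into the 3D model
        lum = max_luminance(frame)                 # S204
        yield composite(frame, relight(pos, lum))  # S206 render, S207 composite
```

Because the loop re-estimates the light per frame, a lamp moving through the real scene is reflected in the virtual object's shading on the next frame, which is the dynamic behavior the paragraph above describes.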
According to a third embodiment of the present disclosure, an apparatus for illumination rendering in augmented reality, as shown in FIG. 3, includes the following components:
1) a light source determining module 301, configured to determine the position and brightness of a light source in the real scene.
Optionally, the light source determining module 301 includes:
a position determining unit, configured to determine the three-dimensional coordinate position of the light source in the real scene according to the positional relationship between any real object in the real scene and its shadow.
Optionally, the position determining unit is configured to:
determine the entity boundary line and the shadow boundary line of any real object in the real scene;
select feature points on the entity boundary line and on the shadow boundary line of the real object, respectively, the feature points including inflection points and extreme points, for example the highest point, the lowest point, and the vertices of a boundary line;
determine feature vectors according to the correspondence between the feature points on the entity boundary line and the feature points on the shadow boundary line, each feature vector being the vector from a feature point on the entity boundary line to the corresponding feature point on the shadow boundary line; and
determine the intersection point of the extension lines or reverse extension lines of all the feature vectors as the three-dimensional coordinate position of the light source in the real scene.
a brightness determining unit, configured to compute the brightness value of every point in the real scene, the maximum brightness value in the real scene being the brightness of the light source in the real scene.
2) an illumination rendering module 302, configured to perform illumination rendering on a virtual object in the virtual scene according to the position and brightness of the light source in the real scene.
Optionally, the illumination rendering module 302 includes:
a conversion unit, configured to convert the position of the light source in the real scene into the position of the light source in the virtual scene;
a rendering unit, configured to perform illumination rendering on the virtual object in the virtual scene according to the position of the light source in the virtual scene and the brightness of the light source in the real scene.
Optionally, the conversion unit is configured to:
obtain a real-scene model transformation matrix M from a parallel tracking and mapping algorithm, and use the matrix M to convert the position (x, y, z) of the light source in the real scene into the position (u, v, w) of the light source in the virtual scene, where (u, v, w) = M * (x, y, z).
According to a fourth embodiment of the present disclosure, an apparatus for illumination rendering in augmented reality, as shown in FIG. 4, includes the following components:
1) a video acquisition module 401, configured to capture a real-world video image with a camera.
2) a video decomposition module 402, configured to decompose the real-world video image into a series of image frames.
3) a position calculation module 403, configured to compute the position of the light source in each image frame.
Optionally, the position calculation module 403 includes:
a boundary line determining unit, configured to determine the entity boundary line and the shadow boundary line of any real object in the image frame;
a feature point determining unit, configured to select feature points on the entity boundary line and on the shadow boundary line, respectively, the feature points including inflection points and extreme points, for example the highest point, the lowest point, and the vertices of a boundary line;
a feature vector determining unit, configured to determine feature vectors according to the correspondence between the feature points on the entity boundary line and the feature points on the shadow boundary line, each feature vector being the vector from a feature point on the entity boundary line to the corresponding feature point on the shadow boundary line;
a light source determining unit, configured to determine the position of the light source in the image frame according to the feature vectors, the position of the light source in the image frame being the intersection point of the extension lines or reverse extension lines of all the feature vectors.
4) a brightness calculation module 404, configured to compute the brightness of the light source in each image frame.
Optionally, the brightness calculation module 404 is configured to:
compute the brightness value of every point in an image frame, the maximum brightness value in the frame being the brightness of the light source in that frame.
5) a position conversion module 405, configured to convert the position of the light source in the image frame into the position of the light source in the three-dimensional model of the augmented reality.
Optionally, the position conversion module 405 is configured to:
obtain a real-scene model transformation matrix M from the PTAM algorithm, and use the matrix M to convert the position (x, y, z) of the light source in the image frame into the position (u, v, w) of the light source in the three-dimensional model of the augmented reality, where (u, v, w) = M * (x, y, z).
6) an illumination rendering module 406, configured to perform illumination rendering on a virtual object in the three-dimensional model of the augmented reality according to the position of the light source in that model and the brightness of the light source in the image frame.
7) a video restoration module 407, configured to place the illumination-rendered virtual object into the real-world video image captured by the camera.
As the camera keeps capturing real-world video images, the light source information in those images is extracted dynamically, and the virtual object is dynamically relit according to that information.
According to a fifth embodiment of the present disclosure, a mobile terminal is provided with the apparatus for illumination rendering in augmented reality described in the third embodiment of the present disclosure.
The present disclosure further provides a non-transitory computer-readable storage medium storing computer program instructions that, when executed by one or more processors of a terminal device, cause the terminal device to perform the method for illumination rendering in augmented reality described in the first or second embodiment.
The method, apparatus, and mobile terminal for illumination rendering in augmented reality introduced in the embodiments of the present disclosure acquire light source information from the real scene of the augmented reality in real time and dynamically perform illumination rendering on virtual objects in the virtual scene of the augmented reality according to that information, so that the virtual objects blend better with the real scene, giving the user a visual experience closer to reality.
The foregoing description of the specific embodiments should provide a deeper understanding of the technical means adopted by the present disclosure to achieve its intended objectives, and of the effects thereof; the accompanying drawings are provided for reference and illustration only and are not intended to limit the present disclosure.
INDUSTRIAL APPLICABILITY
The method for illumination rendering in augmented reality provided by the embodiments of the present disclosure can be applied in a terminal device: light source information in the real scene of the augmented reality is acquired in real time, and illumination rendering is applied dynamically to virtual objects in the virtual scene of the augmented reality according to that information, so that the virtual objects blend better with the real scene, giving the user a visual experience closer to reality.

Claims (15)

  1. A method for illumination rendering in augmented reality, comprising:
    determining a position and a brightness of a light source in a real scene; and
    performing illumination rendering on a virtual object in a virtual scene according to the position and the brightness of the light source in the real scene.
  2. The method for illumination rendering in augmented reality according to claim 1, wherein determining the position and the brightness of the light source in the real scene comprises:
    determining a three-dimensional coordinate position of the light source in the real scene according to a positional relationship between any real object in the real scene and its shadow; and
    computing brightness values of all points in the real scene, wherein the maximum brightness value in the real scene is the brightness of the light source in the real scene.
  3. The method for illumination rendering in augmented reality according to claim 2, wherein determining the three-dimensional coordinate position of the light source in the real scene according to the positional relationship between any real object in the real scene and its shadow comprises:
    determining an entity boundary line and a shadow boundary line of any real object in the real scene;
    selecting feature points on the entity boundary line and on the shadow boundary line of the real object, respectively;
    determining feature vectors according to a correspondence between the feature points on the entity boundary line and the feature points on the shadow boundary line; and
    determining an intersection point of extension lines or reverse extension lines of all the feature vectors as the three-dimensional coordinate position of the light source in the real scene.
  4. The method for illumination rendering in augmented reality according to claim 3, wherein the feature points comprise inflection points and extreme points.
  5. The method for illumination rendering in augmented reality according to claim 3, wherein each feature vector is a vector from a feature point on the entity boundary line to a feature point on the shadow boundary line corresponding to that feature point.
  6. The method for illumination rendering in augmented reality according to claim 1, wherein performing illumination rendering on the virtual object in the virtual scene according to the position and the brightness of the light source in the real scene comprises:
    converting the position of the light source in the real scene into a position of the light source in the virtual scene; and
    performing illumination rendering on the virtual object in the virtual scene according to the position of the light source in the virtual scene and the brightness of the light source in the real scene.
  7. The method for illumination rendering in augmented reality according to claim 6, wherein converting the position of the light source in the real scene into the position of the light source in the virtual scene comprises:
    obtaining a real-scene model transformation matrix M from a parallel tracking and mapping algorithm, and using the matrix M to convert the position (x, y, z) of the light source in the real scene into the position (u, v, w) of the light source in the virtual scene, wherein (u, v, w) = M * (x, y, z).
  8. An apparatus for illumination rendering in augmented reality, comprising:
    a light source determining module, configured to determine a position and a brightness of a light source in a real scene; and
    an illumination rendering module, configured to perform illumination rendering on a virtual object in a virtual scene according to the position and the brightness of the light source in the real scene.
  9. The apparatus for illumination rendering in augmented reality according to claim 8, wherein the light source determining module comprises:
    a position determining unit, configured to determine a three-dimensional coordinate position of the light source in the real scene according to a positional relationship between any real object in the real scene and its shadow; and
    a brightness determining unit, configured to compute brightness values of all points in the real scene, wherein the maximum brightness value in the real scene is the brightness of the light source in the real scene.
  10. The apparatus for illumination rendering in augmented reality according to claim 9, wherein the position determining unit is configured to:
    determine an entity boundary line and a shadow boundary line of any real object in the real scene;
    select feature points on the entity boundary line and on the shadow boundary line of the real object, respectively;
    determine feature vectors according to a correspondence between the feature points on the entity boundary line and the feature points on the shadow boundary line; and
    determine an intersection point of extension lines or reverse extension lines of all the feature vectors as the three-dimensional coordinate position of the light source in the real scene.
  11. The apparatus for illumination rendering in augmented reality according to claim 10, wherein the feature points comprise inflection points and extreme points.
  12. The apparatus for illumination rendering in augmented reality according to claim 10, wherein each feature vector is a vector from a feature point on the entity boundary line to a feature point on the shadow boundary line corresponding to that feature point.
  13. The apparatus for illumination rendering in augmented reality according to claim 8, wherein the illumination rendering module comprises:
    a conversion unit, configured to convert the position of the light source in the real scene into a position of the light source in the virtual scene; and
    a rendering unit, configured to perform illumination rendering on the virtual object in the virtual scene according to the position of the light source in the virtual scene and the brightness of the light source in the real scene.
  14. The apparatus for illumination rendering in augmented reality according to claim 13, wherein the conversion unit is configured to:
    obtain a real-scene model transformation matrix M from a parallel tracking and mapping algorithm, and use the matrix M to convert the position (x, y, z) of the light source in the real scene into the position (u, v, w) of the light source in the virtual scene, wherein (u, v, w) = M * (x, y, z).
  15. A mobile terminal, comprising the apparatus for illumination rendering in augmented reality according to any one of claims 8 to 14.
PCT/CN2017/081402 2016-09-07 2017-04-21 Method and device for lighting rendering in augmented reality, and mobile terminal WO2018045759A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201610810809.X 2016-09-07
CN201610810809.XA CN107808409B (en) 2016-09-07 2016-09-07 Method and device for performing illumination rendering in augmented reality and mobile terminal

Publications (1)

Publication Number Publication Date
WO2018045759A1 true WO2018045759A1 (en) 2018-03-15

Family

ID=61561344

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2017/081402 WO2018045759A1 (en) 2016-09-07 2017-04-21 Method and device for lighting rendering in augmented reality, and mobile terminal

Country Status (2)

Country Link
CN (1) CN107808409B (en)
WO (1) WO2018045759A1 (en)

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108986199A (en) * 2018-06-14 2018-12-11 北京小米移动软件有限公司 Dummy model processing method, device, electronic equipment and storage medium
GB2569267A (en) * 2017-10-13 2019-06-19 Mo Sys Engineering Ltd Lighting integration
CN111145341A (en) * 2019-12-27 2020-05-12 陕西职业技术学院 Single light source-based virtual-real fusion illumination consistency drawing method
CN112367750A (en) * 2020-11-02 2021-02-12 北京德火科技有限责任公司 Linkage structure of AR immersion type panoramic simulation system and lighting system and control method thereof
CN112837425A (en) * 2021-03-10 2021-05-25 西南交通大学 Mixed reality illumination consistency adjusting method
CN113302665A (en) * 2019-10-16 2021-08-24 谷歌有限责任公司 Illumination estimation for augmented reality

Families Citing this family (7)

Publication number Priority date Publication date Assignee Title
CN108877340A (en) * 2018-07-13 2018-11-23 李冬兰 A kind of intelligent English assistant learning system based on augmented reality
CN110021071B (en) * 2018-12-25 2024-03-12 创新先进技术有限公司 Rendering method, device and equipment in augmented reality application
CN110033423B (en) * 2019-04-16 2020-08-28 北京字节跳动网络技术有限公司 Method and apparatus for processing image
WO2021109885A1 (en) * 2019-12-06 2021-06-10 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Light source detection for extended reality technologies
CN111402409B (en) * 2020-04-03 2021-03-05 湖北工业大学 Exhibition hall design illumination condition model system
CN111862344B (en) * 2020-07-17 2024-03-08 抖音视界有限公司 Image processing method, apparatus and storage medium
CN112040596B (en) * 2020-08-18 2022-11-08 张雪媛 Virtual space light control method, computer readable storage medium and system

Citations (5)

Publication number Priority date Publication date Assignee Title
CN102696057A (en) * 2010-03-25 2012-09-26 比兹摩德莱恩有限公司 Augmented reality systems
US20130141434A1 (en) * 2011-12-01 2013-06-06 Ben Sugden Virtual light in augmented reality
US20140267412A1 (en) * 2013-03-15 2014-09-18 Disney Enterprises, Inc. Optical illumination mapping
CN104766270A (en) * 2015-03-20 2015-07-08 北京理工大学 Virtual and real lighting fusion method based on fish-eye lens
CN105844714A (en) * 2016-04-12 2016-08-10 广州凡拓数字创意科技股份有限公司 Augmented reality based scenario display method and system

Family Cites Families (3)

Publication number Priority date Publication date Assignee Title
US8144978B2 (en) * 2007-08-01 2012-03-27 Tandent Vision Science, Inc. System and method for identifying complex tokens in an image
CN101710429B (en) * 2009-10-12 2012-09-05 湖南大学 Illumination algorithm of augmented reality system based on dynamic light map
US9578226B2 (en) * 2012-04-12 2017-02-21 Qualcomm Incorporated Photometric registration from arbitrary geometry for augmented reality

Also Published As

Publication number Publication date
CN107808409B (en) 2022-04-12
CN107808409A (en) 2018-03-16

Similar Documents

Publication Publication Date Title
WO2018045759A1 (en) Method and device for lighting rendering in augmented reality, and mobile terminal
JP7395577B2 (en) Motion smoothing of reprojected frames
CN110196746B (en) Interactive interface rendering method and device, electronic equipment and storage medium
US10026230B2 (en) Augmented point cloud for a visualization system and method
US8817046B2 (en) Color channels and optical markers
US10719920B2 (en) Environment map generation and hole filling
US11276150B2 (en) Environment map generation and hole filling
US20160148335A1 (en) Data-processing apparatus and operation method thereof
JP7244810B2 (en) Face Texture Map Generation Using Monochromatic Image and Depth Information
US20150009216A1 (en) Storage medium, image processing apparatus, image processing system and image processing method
CN112529097B (en) Sample image generation method and device and electronic equipment
RU2422902C2 (en) Two-dimensional/three-dimensional combined display
WO2017113729A1 (en) 360-degree image loading method and loading module, and mobile terminal
JP6521352B2 (en) Information presentation system and terminal
KR101680672B1 (en) Method for providing texture effect and display apparatus applying the same
Leal-Meléndrez et al. Occlusion handling in video-based augmented reality using the kinect sensor for indoor registration
US10748331B2 (en) 3D lighting
Seo et al. 3-D visual tracking for mobile augmented reality applications
JP2020177619A (en) Interactive image processing system using infrared cameras
CN109949396A (en) Rendering method, device, equipment and medium
CN114998504A (en) Two-dimensional image illumination rendering method, device and system and electronic device
JP5865092B2 (en) Image processing apparatus, image processing method, and program
JP2020177618A (en) Method, apparatus, and medium for interactive image processing using depth engine and digital signal processor
CN116684561B (en) Startup image adjusting method and device, electronic equipment and storage medium
JP7190780B1 (en) Image processing program, image processing apparatus, and image processing method

Legal Events

Date Code Title Description

121 Ep: the EPO has been informed by WIPO that EP was designated in this application
Ref document number: 17847933
Country of ref document: EP
Kind code of ref document: A1

NENP Non-entry into the national phase
Ref country code: DE

122 Ep: PCT application non-entry in European phase
Ref document number: 17847933
Country of ref document: EP
Kind code of ref document: A1