CN110136239B - Method for enhancing illumination and reflection reality degree of virtual reality scene - Google Patents

Method for enhancing illumination and reflection reality degree of virtual reality scene

Info

Publication number
CN110136239B
CN110136239B
Authority
CN
China
Prior art keywords
reflection
light supplement
supplement lamp
scene
light
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910283324.3A
Other languages
Chinese (zh)
Other versions
CN110136239A (en
Inventor
李源
黄首志
朱海天
牛泽平
韩峰
刘景明
白路
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nanjing Five Horizons Network Technology Co ltd
Original Assignee
Nanjing Five Horizons Network Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nanjing Five Horizons Network Technology Co ltd
Priority to CN201910283324.3A priority Critical patent/CN110136239B/en
Publication of CN110136239A publication Critical patent/CN110136239A/en
Application granted granted Critical
Publication of CN110136239B publication Critical patent/CN110136239B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00 - Geometric image transformation in the plane of the image
    • G06T3/40 - Scaling the whole image or part thereof
    • G06T3/4038 - Scaling the whole image or part thereof for image mosaicing, i.e. plane images composed of plane sub-images
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00 - 3D [Three Dimensional] image rendering
    • G06T15/50 - Lighting effects
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00 - Manipulating 3D models or images for computer graphics
    • G06T19/006 - Mixed reality
    • Y - GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 - TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02B - CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO BUILDINGS, e.g. HOUSING, HOUSE APPLIANCES OR RELATED END-USER APPLICATIONS
    • Y02B20/00 - Energy efficient lighting technologies, e.g. halogen lamps or gas discharge lamps
    • Y02B20/40 - Control techniques providing energy savings, e.g. smart controller or presence detection

Abstract

The invention discloses a method for enhancing the illumination and reflection realism of a virtual reality scene, comprising the following steps: inputting the floor plan drawn by the user and preprocessing its data; dividing the floor plan into functional modules based on the module characteristics; calculating and generating a light supplement lamp object for each module based on those characteristics; adjusting the intensity and color of the light supplement lamps based on environmental parameters; extracting the reflective materials in the environment and classifying them by opacity and reflection coefficient; and generating reflection maps based on the environmental parameters and surrounding objects, then rendering the light supplement lamps in real time. By computing the light supplement lamp information and the reflection maps of each region in parallel, the invention greatly improves the real-time rendering experience; together, these two measures effectively improve rendering efficiency and greatly improve the interactive experience.

Description

Method for enhancing the illumination and reflection realism of a virtual reality scene
Technical Field
The invention relates to a method for enhancing the illumination and reflection realism of a virtual reality scene, and belongs to the technical field of real-time rendering of three-dimensional scenes.
Background
In the real-time rendering of three-dimensional scenes, enhancing the virtual reality experience and improving scene realism are increasingly important for attracting users and strengthening human-computer interaction. During real-time rendering, light attenuates considerably; to simulate more realistic ambient light, light supplement lamps (fill lights) need to be generated according to the environment, which also assist in generating reflection maps and improve shadow effects.
Existing lighting rendering algorithms are mainly based on ambient, diffuse, and specular illumination. Light strikes the object surface and is shaded per pixel to simulate realistic material effects such as translucency, reflection, and jade-like appearances. Surface color in three-dimensional space is usually described by a three-primary-color vector, and real-time rendering generally uses one of three approaches: single-pass multi-light rendering, multi-pass multi-light rendering, and deferred shading. The first two waste resources because every object, including hidden ones, is computed in full; deferred shading works on the attributes of the final pixels, so its time complexity is much lower. In real life, however, the surface of an object in a lit room is never completely dark: some light always reaches it.
Therefore, light supplement lamps are added to simulate the lost ambient light, and reflection maps are generated from the reflective materials for real-time rendering; these two measures effectively improve rendering efficiency and greatly improve the interactive experience.
Disclosure of Invention
The purpose of the invention is as follows: in order to overcome the defects of the prior art, the invention provides a method for enhancing the illumination and reflection realism of a virtual reality scene that automatically generates light supplement lamps and reflection maps.
The technical scheme is as follows: to achieve this purpose, the invention adopts the following technical scheme:
A method for enhancing the illumination and reflection realism of a virtual reality scene, comprising the following steps:
1) According to the floor plan drawn by the user, building the input data structure and completing preprocessing;
2) Dividing the indoor functional modules, such as bedrooms and living rooms, according to the module characteristics; this provides environmental information for the light supplement lamps and, at the same time, effectively increases the parallelism of the computation and enhances the real-time experience;
3) Calculating and generating the light supplement lamp information of each module, including the size, position and number of the light supplement lamps, based on the module characteristics, including the size, position and depth of its windows;
4) Adjusting the intensity and color of the light supplement lamps based on the environmental parameters, further improving their realism;
5) Extracting the reflective materials in the environment and classifying them by opacity and reflection coefficient;
6) Performing several superimposed rendering passes based on the environmental parameters and surrounding objects, superimposing reflections a set number of times, and generating the corresponding panoramic reflection map;
7) Applying the map and rendering the light supplement lamps in real time.
Further, the preprocessing comprises: triangulating the point-and-face information and calculating the normal vector of each patch.
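As an illustration of this preprocessing step, the following is a minimal C++ sketch that fan-triangulates a convex polygonal face and computes each resulting patch normal with a cross product; the Vec3 and Triangle types and the fan triangulation itself are assumptions for illustration, not the data structures of the invention.

```cpp
// Minimal sketch of the patch-normal part of the preprocessing step.
#include <cmath>
#include <vector>

struct Vec3 { double x, y, z; };

static Vec3 sub(const Vec3& a, const Vec3& b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static Vec3 cross(const Vec3& a, const Vec3& b) {
    return {a.y * b.z - a.z * b.y, a.z * b.x - a.x * b.z, a.x * b.y - a.y * b.x};
}
static Vec3 normalize(const Vec3& v) {
    double len = std::sqrt(v.x * v.x + v.y * v.y + v.z * v.z);
    return len > 0.0 ? Vec3{v.x / len, v.y / len, v.z / len} : v;
}

struct Triangle { Vec3 a, b, c; Vec3 normal; };

// Fan-triangulate a convex polygonal face and compute each patch normal.
std::vector<Triangle> triangulateFace(const std::vector<Vec3>& polygon) {
    std::vector<Triangle> patches;
    for (size_t i = 1; i + 1 < polygon.size(); ++i) {
        Triangle t{polygon[0], polygon[i], polygon[i + 1], {}};
        t.normal = normalize(cross(sub(t.b, t.a), sub(t.c, t.a)));
        patches.push_back(t);
    }
    return patches;
}
```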
Further, the light supplement lamp information of each module is generated as follows: if the window has height h and width w, the light supplement lamp patch has height k1·h and width k2·w, and the distance of the light supplement lamp from the window is L = k3·min(h, w); the area of the irradiated range is S, and the light intensity of the light supplement lamp is positively correlated with the irradiated area S. Here k1 and k2 are multiplying coefficients and, with the illumination intensity of the whole scene denoted N, k1 ∝ N and k2 ∝ N; k3 is also a multiplying coefficient, with k3 ∝ (k1^2 + k2^2)^(1/2).
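The sizing rule above can be illustrated with a short sketch. The proportionality constants c1, c2, c3 and cI below are assumptions chosen only for illustration; the text only states that k1 ∝ N, k2 ∝ N, k3 ∝ (k1^2 + k2^2)^(1/2), and that the lamp intensity grows with the irradiated area S.

```cpp
// Sketch of the light supplement lamp sizing rule described above.
#include <algorithm>
#include <cmath>

struct FillLight {
    double patchHeight;   // k1 * h
    double patchWidth;    // k2 * w
    double distance;      // L = k3 * min(h, w), distance from the window
    double intensity;     // positively correlated with the irradiated area S
};

FillLight computeFillLight(double h, double w, double N /* overall scene illumination */) {
    const double c1 = 1.2, c2 = 1.2, c3 = 0.5, cI = 1.0;  // assumed constants
    double k1 = c1 * N;
    double k2 = c2 * N;
    double k3 = c3 * std::sqrt(k1 * k1 + k2 * k2);

    FillLight lamp;
    lamp.patchHeight = k1 * h;
    lamp.patchWidth  = k2 * w;
    lamp.distance    = k3 * std::min(h, w);

    double S = lamp.patchHeight * lamp.patchWidth;  // proxy for the irradiated area
    lamp.intensity = cI * S;                        // intensity grows with S
    return lamp;
}
```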
Further, the environmental parameters include the window depth and the indoor illumination intensity and color.
Furthermore, the panoramic reflection map is generated as follows: obtain the maximum inscribed sphere of the scene, generate a general, robust spherical panorama based on OpenCV, crop out, by panorama stitching and according to the scene viewpoint position, the portion of the panorama covering the material plane within the viewing angle, and paste it onto the reflective material. The panorama synthesis pipeline comprises: feature point matching, picture mode matching, panorama correction, image illumination and tone balancing, and image frequency band fusion.
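A minimal sketch of this OpenCV-based panorama step is given below. The input file names and the crop rectangle are placeholders; OpenCV's high-level Stitcher internally performs feature matching, warping, exposure compensation and multi-band blending, which roughly corresponds to the synthesis pipeline listed above, but it is not necessarily the exact pipeline of the invention.

```cpp
// Sketch of panorama synthesis and per-material cropping with OpenCV.
#include <opencv2/opencv.hpp>
#include <vector>

int main() {
    std::vector<cv::Mat> views;
    for (const auto& name : {"view0.jpg", "view1.jpg", "view2.jpg"}) {  // placeholder inputs
        cv::Mat img = cv::imread(name);
        if (!img.empty()) views.push_back(img);
    }

    cv::Mat pano;
    auto stitcher = cv::Stitcher::create(cv::Stitcher::PANORAMA);
    if (stitcher->stitch(views, pano) != cv::Stitcher::OK) return 1;

    // Crop the portion of the panorama covering the material plane within
    // the current viewing angle (the rectangle below is a placeholder).
    cv::Rect materialRegion(pano.cols / 4, pano.rows / 4, pano.cols / 2, pano.rows / 2);
    cv::Mat reflectionMap = pano(materialRegion).clone();
    cv::imwrite("reflection_map.jpg", reflectionMap);
    return 0;
}
```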
Beneficial effects: compared with the prior art, the method for enhancing the illumination and reflection realism of a virtual reality scene according to the invention has the following advantages: 1. model partitioning is combined with a per-region light supplement lamp calculation algorithm, increasing the parallelism of the algorithm; 2. the light supplement lamp information and the reflection maps of each region are computed in parallel, greatly improving the real-time rendering experience; 3. partition preprocessing is added to the model partitioning, making it faster and more efficient; 4. reflection maps that closely fit the specific environment materials effectively improve real-time realism.
Drawings
FIG. 1 is a flow chart of the method for enhancing the illumination and reflection realism of a virtual reality scene according to the invention;
FIG. 2 is a schematic diagram of the generation of light supplement lamp patch information in the invention.
Detailed Description
The present invention will be further described with reference to the accompanying drawings and examples.
Fig. 1 shows a method for enhancing the illumination and reflection realism of a virtual reality scene, comprising the following steps:
1) According to the floor plan drawn by the user, building the input data structure and completing preprocessing, namely triangulating the point-and-face information and calculating the normal vector of each patch;
2) Dividing the floor plan into functional modules according to the module characteristics;
3) Calculating and generating the light supplement lamp information of each module, including the size, position and number of the light supplement lamps, based on the module characteristics, including the size, position and depth of its windows;
4) Adjusting the intensity and color of the light supplement lamps based on parameters such as the window depth and the indoor illumination intensity and color;
5) Extracting the reflective materials in the environment and classifying them by opacity and reflection coefficient (a classification sketch follows this list);
6) Performing several superimposed rendering passes based on the environmental parameters and surrounding objects, superimposing reflections a set number of times, and generating the corresponding panoramic reflection map;
7) Applying the map and rendering the light supplement lamps in real time.
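The following sketch illustrates step 5, grouping the extracted materials by opacity and reflection coefficient; the category names and thresholds are assumptions, since the invention does not specify them.

```cpp
// Illustrative grouping of scene materials by opacity and reflection coefficient.
#include <string>
#include <vector>

enum class ReflectionType { Opaque, Glossy, Mirror, Transparent };

struct Material {
    std::string name;
    double opacity;                // 0 = fully transparent, 1 = fully opaque
    double reflectionCoefficient;  // 0 = no reflection, 1 = perfect mirror
};

ReflectionType classify(const Material& m) {
    if (m.opacity < 0.5) return ReflectionType::Transparent;      // glass-like
    if (m.reflectionCoefficient > 0.8) return ReflectionType::Mirror;
    if (m.reflectionCoefficient > 0.3) return ReflectionType::Glossy;
    return ReflectionType::Opaque;
}

// Partition the extracted materials so each group can use its own reflection map.
std::vector<std::vector<Material>> groupByType(const std::vector<Material>& materials) {
    std::vector<std::vector<Material>> groups(4);
    for (const auto& m : materials)
        groups[static_cast<int>(classify(m))].push_back(m);
    return groups;
}
```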
Conventionally, light supplement lamps are created by manually setting the lamp positions or by manually adding light supplement lamps to the scene through a program entry point; the workflow is complicated, adjusting the lighting parameters is troublesome, and it often costs the designer considerable time. The invention provides a reliable compromise, in which the light supplement lamp information of each module is generated as follows: as shown in Fig. 2, if the window has height h and width w, the light supplement lamp patch has height k1·h and width k2·w, and the distance of the light supplement lamp from the window is L = k3·min(h, w); the area of the irradiated range is S, and the light intensity of the light supplement lamp is positively correlated with the irradiated area S. Here k1 and k2 are multiplying coefficients and, with the illumination intensity of the whole scene denoted N, k1 ∝ N and k2 ∝ N; k3 is also a multiplying coefficient, with k3 ∝ (k1^2 + k2^2)^(1/2).
In this embodiment, the panoramic reflection map is generated by obtaining the maximum inscribed sphere of the scene, generating a general, robust spherical panorama based on OpenCV, cropping out, by panorama stitching and according to the scene viewpoint position, the portion of the panorama covering the material plane within the viewing angle, and pasting it onto the reflective material.
Because the scene contains a large number of maps, a large number of light supplement lamps are required; the scheme therefore accelerates this calculation process using OpenMP.
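A minimal sketch of such an OpenMP acceleration is shown below: since the light supplement lamp computation of each module is independent of the others, the per-module loop can be parallelized with a parallel for. The Module and Window types, and the sizing constants, reuse the assumptions of the earlier sizing sketch.

```cpp
// Sketch of OpenMP-parallel per-module light supplement lamp computation.
#include <algorithm>
#include <cmath>
#include <omp.h>
#include <vector>

struct Window { double h, w; };
struct Module { std::vector<Window> windows; };
struct FillLight { double patchHeight, patchWidth, distance, intensity; };

// Same sizing rule as the earlier sketch (constants are assumptions).
static FillLight computeFillLight(double h, double w, double N) {
    double k1 = 1.2 * N, k2 = 1.2 * N, k3 = 0.5 * std::sqrt(k1 * k1 + k2 * k2);
    FillLight lamp{k1 * h, k2 * w, k3 * std::min(h, w), 0.0};
    lamp.intensity = lamp.patchHeight * lamp.patchWidth;  // grows with irradiated area S
    return lamp;
}

// Each module is processed independently, so the loop can run in parallel.
std::vector<std::vector<FillLight>> computeAllFillLights(const std::vector<Module>& modules,
                                                         double N) {
    std::vector<std::vector<FillLight>> result(modules.size());
    #pragma omp parallel for
    for (int i = 0; i < static_cast<int>(modules.size()); ++i) {
        for (const Window& win : modules[i].windows)
            result[i].push_back(computeFillLight(win.h, win.w, N));
    }
    return result;
}
```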
The above description covers only the preferred embodiments of the invention. It should be noted that those skilled in the art can make various improvements and modifications without departing from the principle of the invention, and such improvements and modifications shall also fall within the protection scope of the invention.

Claims (3)

1. A method for enhancing the illumination and reflection realism of a virtual reality scene, characterized by comprising the following steps:
1) According to the floor plan drawn by the user, building the input data structure and completing preprocessing;
2) Dividing the floor plan into functional modules according to the module characteristics;
3) Calculating and generating the light supplement lamp information of each module, including the size, position and number of the light supplement lamps, based on the module characteristics, including the size, position and depth of its windows;
4) Adjusting the intensity and color of the light supplement lamps based on the environmental parameters;
5) Extracting the reflective materials in the environment and classifying them by opacity and reflection coefficient;
6) Performing several superimposed rendering passes based on the environmental parameters and surrounding objects to generate the corresponding panoramic reflection map;
7) Applying the map and rendering the light supplement lamps in real time;
wherein the light supplement lamp information of each module is generated as follows: if the window has height h and width w, the light supplement lamp patch has height k1·h and width k2·w, and the distance of the light supplement lamp from the window is L = k3·min(h, w); the area of the irradiated range is S, and the light intensity of the light supplement lamp is positively correlated with the irradiated area S; k1 and k2 are multiplying coefficients and, with the illumination intensity of the whole scene denoted N, k1 ∝ N and k2 ∝ N; k3 is a multiplying coefficient with k3 ∝ (k1^2 + k2^2)^(1/2);
and the environmental parameters comprise the window depth and the indoor illumination intensity and color.
2. The method according to claim 1, characterized in that the preprocessing comprises: triangulating the point-and-face information and calculating the normal vector of each patch.
3. The method according to claim 1, characterized in that the panoramic reflection map is generated by obtaining the maximum inscribed sphere of the scene, generating a general, robust spherical panorama based on OpenCV, cropping out, by panorama stitching and according to the scene viewpoint position, the portion of the panorama covering the material plane within the viewing angle, and pasting it onto the reflective material.
CN201910283324.3A 2019-04-10 2019-04-10 Method for enhancing illumination and reflection reality degree of virtual reality scene Active CN110136239B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910283324.3A CN110136239B (en) 2019-04-10 2019-04-10 Method for enhancing illumination and reflection reality degree of virtual reality scene

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910283324.3A CN110136239B (en) 2019-04-10 2019-04-10 Method for enhancing illumination and reflection reality degree of virtual reality scene

Publications (2)

Publication Number Publication Date
CN110136239A CN110136239A (en) 2019-08-16
CN110136239B true CN110136239B (en) 2023-03-10

Family

ID=67569493

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910283324.3A Active CN110136239B (en) 2019-04-10 2019-04-10 Method for enhancing illumination and reflection reality degree of virtual reality scene

Country Status (1)

Country Link
CN (1) CN110136239B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111145332B (en) * 2019-11-21 2022-08-12 江苏艾佳家居用品有限公司 General method for designing photometry for home decoration


Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080310707A1 (en) * 2007-06-15 2008-12-18 Microsoft Corporation Virtual reality enhancement using real world data
US9753288B2 (en) * 2014-01-21 2017-09-05 Osterhout Group, Inc. See-through computer display systems

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104090742A (en) * 2014-07-17 2014-10-08 北京邮电大学 Parallelization type progressive photon mapping method and device based on OpenCL
CN107248195A (en) * 2017-05-31 2017-10-13 珠海金山网络游戏科技有限公司 A kind of main broadcaster methods, devices and systems of augmented reality
CN108460841A (en) * 2018-01-23 2018-08-28 电子科技大学 A kind of indoor scene light environment method of estimation based on single image

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Real-time soft shadow rendering algorithm based on mobile augmented reality; Guo Jinming et al.; Computer Applications and Software; 2016-09-15; Vol. 33, No. 09; full text *

Also Published As

Publication number Publication date
CN110136239A (en) 2019-08-16


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant