CN102096941B - Consistent lighting method under a virtual-real fused environment


Info

Publication number: CN102096941B (application CN 201110033282 A)
Authority: CN (China)
Prior art keywords: BRDF, environment, coordinate system, illumination
Legal status: Expired - Fee Related (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to its accuracy)
Other versions: CN102096941A (Chinese)
Inventors: 齐越 (Qi Yue), 王维 (Wang Wei), 贾军 (Jia Jun)
Assignee (original and current): Beihang University (the listed assignees may be inaccurate; Google has not performed a legal analysis)
Events: application filed by Beihang University with priority to CN 201110033282; published as CN102096941A; granted and published as CN102096941B.


Abstract

The invention discloses a consistent lighting method for a virtual-real fused environment, comprising the following steps: photographing a small mirror sphere placed in the real scene to obtain a high-dynamic-range image that reflects the real environment, and acquiring an environment lighting map of the real scene; preprocessing the environment lighting map of the real scene and the surface reflectance attributes of the virtual object; in the lighting preprocessing, generating the environment lighting on a unit sphere in the local coordinate frame of every sampled normal vector to obtain the Haar wavelet coefficients of the environment lighting; in the reflectance preprocessing, generating slices of the bidirectional reflectance distribution function (BRDF) on the unit hemisphere for every view direction to obtain the Haar wavelet coefficients of the slices; and rendering the virtual-real fused environment, in which the wavelet coefficients of the environment lighting and of the BRDF slices are indexed by the normal vector of each vertex of the virtual object and by the view direction, the inner product of the coefficients yields the rendered vertex color, and the computation is accelerated on the GPU (Graphics Processing Unit). The method efficiently achieves all-frequency environment lighting rendering effects in a virtual-real fused environment.

Description

Method for illumination consistency in a virtual-real fused environment
Technical field
The invention belongs to the field of computer virtual reality. It specifically achieves illumination consistency in a virtual-real fused environment and renders the fused scene realistically; the method can be used for photorealistic rendering of scenes in virtual-real fused environments.
Background art
A virtual-real fused environment is an environment formed by combining virtual objects with real objects. Such fusion is now applied ever more widely, in fields such as medicine, maintenance, architecture, cultural heritage, and entertainment. In a high-fidelity virtual environment, the user's activities proceed in a manner closer to nature and user behavior becomes more efficient. To make computer-generated virtual objects look realistic, so that the user perceives them as genuine parts of the surrounding environment, the problems of geometric consistency, illumination consistency, and real-time performance must be solved.
Researchers have studied geometric consistency in virtual-real fused environments, that is, virtual-real registration, extensively and with good results, whereas illumination consistency has received comparatively little attention. Illumination consistency means that the virtual objects and the image of the real scene should exhibit consistent lighting: the illumination model of the real scene must be recovered, and its effect on the virtual objects (shading, shadows, reflections, and so on) must then be computed. Kanbara (see M. Kanbara and N. Yokoya. Real-time Estimation of Light Source Environment for Photorealistic Augmented Reality [C]. Proceedings of the 17th International Conference on Pattern Recognition, Washington, DC, USA, 2004: 911-914) proposed photographing a black sphere placed on a marker and identifying the real light sources reflected in it, obtaining virtual-real consistent diffuse shading that includes shadows. Supan (see P. Supan, I. Stuppacher, and M. Haller. Image Based Shadowing in Real-time Augmented Reality [C]. IJVR, 5(3): 1-7, 2006) obtained similar effects with two different capture devices: photographing an ideal mirror sphere placed in the scene, or capturing the environment lighting directly with a fisheye lens. Because the incident environment light is captured at very high resolution, that method can reproduce specular lighting effects. Madsen (see C. B. Madsen and R. E. Laursen. A Scalable GPU-based Approach to Shading and Shadowing for Photorealistic Real-time Augmented Reality [C]. In GRAPP, 2007: 252-261) used a high-dynamic-range environment map acquired in advance to obtain virtual-real consistent illumination; Grosch (see T. Grosch, T. Eble, and S. Mueller. Consistent Interactive Augmentation of Live Camera Images with Correct Near-field Illumination [C]. Proceedings of the 2007 ACM Symposium on Virtual Reality Software and Technology, New York: 56-65) instead captured the environment in real time with a high-dynamic-range camera; although the results are good, the approach lacks an accurate simulation of the surface reflectance attributes. Pessoa (see S. Pessoa, G. Moura, J. Lima, et al. Photorealistic Rendering for Augmented Reality: A Global Illumination and BRDF Solution [C]. Proceedings of IEEE Virtual Reality, 2010) captured environment lighting with a mirror sphere and lit scene objects with multi-level environment maps, simulating effects consistent with the real environment such as color bleeding, specular highlights, and shadows, while representing the materials of virtual objects with the Lafortune model. That method approximates the low-frequency environment map with spherical harmonic (SH) basis functions and obtains specular lighting from the glossier levels of the multi-level environment map, so its simulation of high-frequency lighting is not accurate enough.
In rendering for illumination consistency, environment lighting captured from the real world provides a markedly stronger sense of realism than traditional virtual light sources such as point lights and directional lights. Real-time rendering under environment lighting, however, must account for light incident from all directions, an integral over the hemisphere with considerable computational cost. Some methods introduce precomputation to reduce the runtime cost and make real-time rendering feasible. Recent environment lighting rendering methods can be divided into rendering under low-frequency environment light and rendering under all-frequency environment light. Low-frequency methods mainly express the environment light with spherical harmonic basis functions. Sloan (see P. Sloan, J. Kautz. Precomputed Radiance Transfer for Real-time Rendering in Dynamic, Low-frequency Lighting Environments [J]. ACM Transactions on Graphics, 2002, 21(3): 527-536), exploiting the linearity of light transport, proposed precomputed radiance transfer (PRT) and a transfer matrix compression algorithm based on clustered principal component analysis, but neither supports object motion. The most characteristic effect of all-frequency rendering algorithms is the sharp depiction of shadow boundaries and highly glossy materials. Ng (see R. Ng, R. Ramamoorthi, P. Hanrahan. Triple Product Wavelet Integrals for All-frequency Relighting [J]. ACM Transactions on Graphics, 2004, 23(3): 477-487) built on PRT by bringing wavelet representations of the light source, visibility, and BRDF into the illumination formula, so that the outgoing radiance can be treated as a wavelet triple product. Thanks to the local support of the wavelet basis functions, the wavelet triple product can be evaluated efficiently. The precomputation must produce the wavelet coefficients of the BRDF and the wavelet coefficients of the visibility function at every vertex; in essence the algorithm decomposes a 6-dimensional transfer function into a product of two 4-dimensional functions, making its precomputation and storage feasible. The algorithm still leaves open how to handle moving objects, and it ignores interreflection at runtime.
Summary of the invention
Problem addressed by the invention: to overcome the deficiencies of the prior art by providing an illumination consistency method for virtual-real fused environments, a method that achieves consistent lighting between virtual and real content and renders the fused environment realistically and quickly.
Technical solution of the invention: an illumination consistency method for a virtual-real fused environment, whose steps are as follows:
(1) Build a global world coordinate system of the real world with some position in the real scene as the reference, and unify the real-world and virtual-world coordinate systems according to the transformation between the two global coordinate systems;
(2) solve for the camera extrinsics in the unified coordinate system of step (1), photograph a small mirror sphere placed in the real scene so that it reflects the real environment, obtain high-dynamic-range images whose color values exceed the ordinary range, and acquire the environment lighting map of the real scene;
(3) assign surface reflectance attributes to the virtual object to be rendered, represented by a bidirectional reflectance distribution function (BRDF);
(4) generate normal-direction samples over the unit-sphere domain and view-direction samples over the unit-hemisphere domain;
(5) using the lighting map of step (2) and the normal samples of step (4), precompute the environment lighting: obtain the local environment lighting for every sampled normal direction and apply the Haar wavelet transform to obtain the corresponding wavelet coefficients;
(6) using the BRDF of step (3) and the view samples of step (4), precompute the BRDF: obtain the BRDF slice for every sampled view direction and apply the Haar wavelet transform to obtain the slice's wavelet coefficients;
(7) with the wavelet coefficients stored above, compute for each vertex of the virtual object the integral of the environment lighting against the BRDF over the vertex's upper hemisphere to obtain the rendered vertex color; when the model rotates or the viewpoint changes, repeat the process with the transformed normal and view direction to complete the shading of the virtual object, and optionally accelerate the rendering on the GPU.
In step (2), the environment lighting map of the real scene is acquired as follows. To ensure that the captured lighting map is consistent with the real-world coordinate system, the extrinsics of the capture camera are obtained first. With the camera fixed, the mirror sphere reflecting the real environment is photographed at different exposures, yielding a set of low-dynamic-range sphere images; the camera response curve is fitted, and the images are merged through it into a high-dynamic-range sphere image. The sphere silhouette is located in the captured image; every pixel inside the silhouette corresponds to a position on the sphere, and the surface normal at that position is interpolated from the pixel's distance to the center pixel. The environment light reaching that position is reflected into the camera; since the camera viewing direction is known, the incident direction of the environment light for that pixel follows from the law of reflection, giving the lighting map of the real environment.
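The reflection step above can be sketched numerically. The snippet below is a minimal illustration, not the patent's implementation: it assumes an orthographic camera looking straight down the -z axis at the sphere (a simplification of the calibrated extrinsics the text describes) and recovers, for one pixel inside the sphere silhouette, the world direction from which the reflected environment light arrives.

```python
import numpy as np

def probe_light_direction(px, py, cx, cy, radius):
    """For a pixel inside the mirror-sphere silhouette, return the direction
    the reflected environment light arrives from. Assumes an orthographic
    camera along -z (a hypothetical simplification of the calibrated setup)."""
    x = (px - cx) / radius            # normalized offset from sphere centre
    y = (py - cy) / radius
    r2 = x * x + y * y
    if r2 > 1.0:
        return None                   # pixel lies outside the sphere
    n = np.array([x, y, np.sqrt(1.0 - r2)])   # sphere normal at this pixel
    to_cam = np.array([0.0, 0.0, 1.0])        # direction sphere -> camera
    # law of reflection: light direction L mirrors about n into to_cam
    L = 2.0 * np.dot(n, to_cam) * n - to_cam
    return L / np.linalg.norm(L)
```

At the silhouette centre the normal faces the camera and the recovered light comes from directly behind the viewer; at the silhouette rim the grazing reflection maps to light arriving from behind the sphere, which is why a single probe image covers (nearly) the full sphere of directions.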
In step (5), the captured environment lighting is precomputed as follows: for each sampled normal direction, solve for the bitangent and tangent vectors to fix the local coordinate system defined by that normal. Sample the upper hemisphere of this local frame, each sample corresponding to a direction vector in local coordinates; transform all sample directions into the global frame and sample the environment lighting there to obtain the local environment lighting for that normal. Then apply the Haar wavelet transform to the local lighting of every normal sample and store the resulting wavelet coefficients.
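The local-frame construction in this step can be sketched as below. The patent does not specify which tangent convention it uses, so the frame here is one arbitrary but internally consistent choice; any frame used identically throughout the precomputation would serve.

```python
import numpy as np

def local_frame(n):
    """Orthonormal tangent/bitangent/normal frame for a normal n
    (the tangent convention is an illustrative assumption)."""
    n = n / np.linalg.norm(n)
    a = np.array([1.0, 0.0, 0.0]) if abs(n[0]) < 0.9 else np.array([0.0, 1.0, 0.0])
    t = np.cross(a, n)
    t /= np.linalg.norm(t)
    b = np.cross(n, t)
    return t, b, n

def hemisphere_to_world(local_dir, n):
    """Map a direction sampled on the local upper hemisphere (z up)
    into the global frame, as in the lighting precomputation step."""
    t, b, nn = local_frame(n)
    return local_dir[0] * t + local_dir[1] * b + local_dir[2] * nn
```

A local sample of (0, 0, 1), the local zenith, maps exactly onto the normal itself, which is a quick sanity check on the transform.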
In step (6), the BRDF is precomputed as follows: sample the view direction over the hemispherical domain, then sample the incident direction for each view sample and evaluate the BRDF from the incident and view directions, yielding the BRDF slice for each sampled view direction; finally, apply the Haar wavelet transform to the slice of every view sample and store the resulting wavelet coefficients.
In step (7), the virtual object is rendered as follows: for each vertex of the model, rotate the normal from model coordinates into world coordinates by the current transformation matrix and index the wavelet coefficients of the local lighting by the rotated normal; because the normal sampling is discrete, the indexed coefficients must be linearly interpolated. Compute the view direction from the viewpoint and vertex positions, transform it into the local frame defined by the vertex normal, and index the wavelet coefficients of the BRDF slice by that direction. Then take the inner product of the lighting coefficients and the BRDF-slice coefficients to obtain the vertex color, and finally send the vertex colors to the graphics hardware to render the model. When the model rotates or the viewpoint changes, repeat the process with the transformed normal and view direction to complete the shading of the virtual object; the rendering can be implemented on the GPU to raise efficiency.
Compared with the prior art, the advantages of the invention are:
(1) The invention approximates the environment lighting and the bidirectional reflectance distribution function with wavelet basis functions, achieving all-frequency environment lighting rendering effects in the virtual-real fused environment.
(2) The invention characterizes real lighting with a high-dynamic-range environment map and combines it with modeled surface reflectance to compute vertex colors, improving the fidelity of the rendering; lighting and reflectance attributes can also be modified easily, improving usability.
(3) The invention preprocesses the environment lighting and the BRDF with the PRT algorithm and approximates them with wavelet basis functions, turning vertex color computation into a wavelet triple product; when taking the inner product of the lighting and BRDF coefficients, only a small number of coefficients are needed, greatly reducing the runtime cost and increasing rendering speed, while GPU rendering raises the efficiency further.
Description of drawings
Fig. 1 is the flowchart of the method of the invention;
Fig. 2 is a schematic diagram of the checkerboard that defines the real-world coordinate system in the invention;
Fig. 3 is the flowchart of environment map acquisition in the invention;
Fig. 4 is an overview of the environment lighting rendering algorithm;
Fig. 5 is the flowchart of the environment lighting precomputation;
Fig. 6 is the basic block diagram of the GPU environment lighting rendering algorithm;
Fig. 7 shows illumination-consistent renderings of virtual models produced by the method of the invention, where (a) and (c) are results under virtual light sources, and (b) and (d) are illumination-consistent results.
Embodiment
The flowchart of the invention is shown in Fig. 1. Before describing the implementation of the steps, the illumination model used by the invention is introduced. The radiance I leaving a scene point toward the viewpoint describes the radiant power per unit solid angle per unit projected area, and it is proportional to the brightness of the pixel's RGB channels on screen. To compute the color that a surface vertex shows under illumination, the following information is needed: the vertex position p, the normal n, and the view direction v. The color I of the vertex is the integral, over the upper hemisphere Ω(n) determined by the normal n, of the incident light L, the surface reflectance (BRDF) f, and the visibility function V, as follows:
I_{n,v,p} = ∫_{Ω(n)} L_n(ω) f_v(ω) V_p(ω) (n·ω) dω
where L_n is the lighting in the vertex's local coordinate system, f_v is the BRDF slice corresponding to view direction v in local coordinates, and V_p is the light visibility function at p in local coordinates; the subscripts n, v, p express the dependence of the vertex color on the normal n, the view direction v, and the position p. Owing to the strength of wavelets in approximation theory, transforming a signal and storing its wavelet basis coefficients brings many benefits, the most notable being that most wavelet coefficients have very small values; after truncating or discarding them, the remaining coefficients reconstruct the original signal with very little distortion. Wavelets fit high-frequency signals such as HDR images and specular BRDFs well, and can therefore be used in the environment lighting rendering algorithm.
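Because the Haar basis used later is orthonormal, the hemispherical integral above, once visibility and the cosine term are folded into one of the factors, reduces to a plain dot product of the two wavelet coefficient vectors (Parseval's relation). A toy numerical check of that identity, with made-up 4-sample "lighting" and "BRDF" signals (the values are illustrative, not from the patent):

```python
import numpy as np

# Orthonormal Haar basis on 4 samples (rows are the basis functions).
H = np.array([
    [0.5, 0.5, 0.5, 0.5],
    [0.5, 0.5, -0.5, -0.5],
    [1 / np.sqrt(2), -1 / np.sqrt(2), 0.0, 0.0],
    [0.0, 0.0, 1 / np.sqrt(2), -1 / np.sqrt(2)],
])

lighting = np.array([1.0, 3.0, 2.0, 0.5])  # toy discretized L_n
brdf = np.array([0.2, 0.9, 0.4, 0.1])      # toy discretized f_v, cosine folded in

direct = np.dot(lighting, brdf)            # brute-force discrete integral
coeff = np.dot(H @ lighting, H @ brdf)     # same value from wavelet coefficients
```

The two quantities agree exactly; truncating small entries of `H @ brdf` before the dot product is precisely the runtime saving the method exploits.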
The steps of the invention are implemented as follows:
(1) Build a global world coordinate system of the real world with some position in the real scene as the reference, and unify the real-world and virtual-world coordinate systems according to the transformation between their global frames. First, the real-world coordinate system is defined by a checkerboard as shown in Fig. 2, whose red, green, and blue axes represent the x, y, and z axes respectively; the checkerboard is then recognized with computer vision methods, so that the virtual-world coordinate system can be unified with the real-world one. Alternatively, ARToolKitPlus can locate the virtual object in real-world coordinates, i.e. unify the virtual and real frames. With the coordinate system fixed by the checkerboard or the ARToolKitPlus marker serving as the real-world frame, and after the position and pose of the virtual object to be fused are determined, the transformation between the local coordinate system of the virtual object model and the real-world system is solved, unifying the real-world and virtual-world coordinate systems.
(2) In the unified coordinate system of step (1), first determine the camera extrinsics, then photograph high-dynamic-range images of the mirror sphere reflecting the real environment to acquire the environment lighting map of the real scene; the flowchart is shown in Fig. 3. The mirror sphere maps the surrounding environment onto its surface, and the reflection off the sphere reveals the environment for arbitrary viewpoints. First fix the mirror sphere and the camera, adjust the camera focus and aperture so that the sphere is imaged sharply and large enough, then take several LDR images with different exposure times; fit the camera response curve from these images, merge the aligned LDR images of the scene into one HDR image through the curve, and finally produce a latitude-longitude environment map by latitude-longitude projection. To ensure that the captured lighting map is consistent with the real-world coordinate system, the extrinsics of the capture camera must be obtained. The sphere silhouette is located in the captured image; each pixel inside it corresponds to a position on the sphere, whose surface normal is interpolated from the pixel's distance to the center pixel. The environment light arriving at that position is reflected into the camera; with the camera viewing direction known, the incident direction of the environment light for that pixel follows from the law of reflection, giving the lighting map of the real environment.
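The multi-exposure merge can be sketched as follows. This is a minimal, hedged version: it assumes the camera response has already been linearised (the text fits the response curve first), and the hat-shaped confidence weight is an illustrative choice rather than the patent's.

```python
import numpy as np

def merge_hdr(ldr_stack, exposures):
    """Merge linearised LDR exposures into one HDR radiance map.
    ldr_stack: list of float arrays with values in [0, 1];
    exposures: the corresponding shutter times."""
    num = np.zeros_like(ldr_stack[0])
    den = np.zeros_like(ldr_stack[0])
    for img, t in zip(ldr_stack, exposures):
        w = 1.0 - np.abs(2.0 * img - 1.0)   # hat weight: trust mid-tones most
        num += w * (img / t)                # per-exposure radiance estimate
        den += w
    return num / np.maximum(den, 1e-8)      # weighted average of estimates
```

Each exposure contributes `pixel / shutter_time` as a radiance estimate; well-exposed pixels dominate the average, so highlights clipped in long exposures and noise-dominated shadows in short ones are down-weighted.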
(3) Assign surface reflectance attributes, a bidirectional reflectance distribution function (BRDF), to the virtual object to be rendered. The BRDF expresses the ratio of the light energy reflected into any outgoing direction to the light energy arriving from any incident direction. A BRDF can be assigned by reading it directly from a BRDF database such as the MERL BRDF Database, or by modeling the surface reflectance, chiefly with analytic models or data-driven models. Common analytic BRDF models include Phong, Ward, and Cook-Torrance; data-driven BRDF models are obtained by densely sampling the object surface.
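As an illustration of one of the analytic models named above, a minimal normalised Phong BRDF evaluation follows; the kd/ks/shininess values are illustrative assumptions, not parameters from the patent.

```python
import numpy as np

def phong_brdf(wi, wo, n, kd=0.3, ks=0.6, shininess=32):
    """Modified Phong BRDF f(wi, wo) for unit vectors wi (to light),
    wo (to viewer), n (surface normal); parameter values are illustrative."""
    wi, wo, n = (v / np.linalg.norm(v) for v in (wi, wo, n))
    r = 2.0 * np.dot(wi, n) * n - wi       # mirror direction of the light
    diffuse = kd / np.pi
    specular = (ks * (shininess + 2) / (2 * np.pi)
                * max(np.dot(r, wo), 0.0) ** shininess)
    return diffuse + specular
```

Fixing the light and normal and sweeping `wo` over the hemisphere produces exactly the kind of BRDF "slice" that step (6) samples and wavelet-transforms.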
(4) Generate normal-direction samples over the unit-sphere domain and view-direction samples over the unit-hemisphere domain. The vertex color results from the joint action of lighting, material reflectance, and view direction, and a computer can only represent continuous functions discretely; environment lighting is therefore processed by sampling the normal direction over the unit sphere, while, because the BRDF is defined on the upper hemisphere of the local coordinate system, the view direction is sampled over the unit hemisphere. For sampling directions on the (hemi)sphere, the invention adopts the octahedral mapping method: the sphere is projected onto the surface of an octahedron and finally onto a rectangular plane, giving a one-to-one mapping between plane pixels and directions; for details see T. Engelhardt. Octahedron Environment Maps. Proceedings of Vision, Modeling and Visualization, 2008.
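A sketch of the octahedral mapping referenced above, as an encode/decode pair. The [0,1]² output range is an assumption about texture addressing; the fold of the lower pyramid follows the standard formulation of the cited technique.

```python
import numpy as np

def oct_encode(d):
    """Map a unit direction onto the unit square via octahedral projection."""
    d = d / np.sum(np.abs(d))                 # project onto the octahedron |x|+|y|+|z|=1
    if d[2] < 0.0:                            # fold the lower pyramid outward
        x = (1.0 - abs(d[1])) * np.sign(d[0])
        y = (1.0 - abs(d[0])) * np.sign(d[1])
        d = np.array([x, y, d[2]])
    return 0.5 * (d[0] + 1.0), 0.5 * (d[1] + 1.0)   # [-1,1]^2 -> [0,1]^2

def oct_decode(u, v):
    """Inverse mapping: a texel coordinate back to a unit direction."""
    x, y = 2.0 * u - 1.0, 2.0 * v - 1.0
    z = 1.0 - abs(x) - abs(y)
    if z < 0.0:                               # undo the lower-pyramid fold
        x, y = (1.0 - abs(y)) * np.sign(x), (1.0 - abs(x)) * np.sign(y)
    d = np.array([x, y, z])
    return d / np.linalg.norm(d)
```

Round-tripping any direction through encode then decode recovers it, which is the one-to-one pixel/direction correspondence the sampling step relies on.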
(5) Using the environment lighting acquired in step (2) and the normal samples of step (4), precompute the environment lighting: obtain the local environment lighting under every sampled normal direction and apply the Haar wavelet transform to obtain the corresponding wavelet coefficients. In the environment lighting algorithm, every vertex of the model must sample the global environment lighting according to its normal to obtain the local lighting, sample the BRDF according to the view direction to obtain the BRDF slice, and then take the inner product of the large numbers of wavelet coefficients produced by the wavelet transforms; the algorithm overview is shown in Fig. 4. These operations are very expensive, so the sampling is precomputed to reduce the runtime cost. The precomputation flowchart is shown in Fig. 5. First, for each sampled normal direction, solve for the bitangent and tangent vectors to fix the local coordinate system defined by that normal. Since only the lighting of the upper hemisphere can contribute to the vertex color, the local lighting samples only the upper hemisphere of the local frame; each sample corresponds to a direction vector in local coordinates. All sample directions are then transformed into the global frame, where sampling the environment lighting yields the local environment lighting for that normal. Finally, the Haar wavelet transform is applied to the local lighting of every normal sample and the resulting wavelet coefficients are stored; the two-dimensional Haar wavelet transform is given in Algorithm 1. For any global environment lighting map the computation is needed only once; scenes that later use the same lighting map need no further precomputation. On the experimental machine (Intel Core2 Duo E7200 2.53 GHz, 4 GB RAM, NVIDIA 9800GTX+ 512 MB), the precomputation time and storage of the local lighting at different sampling rates are listed below:
[Table: local-lighting precomputation time and storage at different sampling rates; reproduced only as an image in the original patent]
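The two-dimensional Haar wavelet transform referred to as "Algorithm 1" can be sketched as the standard Mallat-style decomposition below. The 1/√2 normalisation is an assumption that makes the basis orthonormal, so that truncating small coefficients and taking inner products behave as the method requires; input sizes must be powers of two.

```python
import numpy as np

def haar2d(img):
    """Mallat-style 2D Haar decomposition (orthonormal normalisation).
    At each level, pair-average/difference the rows then the columns of
    the top-left block, and recurse on the low-pass quadrant."""
    out = img.astype(float).copy()
    n = out.shape[0]
    while n > 1:
        h = n // 2
        # rows: average (low-pass) and difference (detail) of column pairs
        a = (out[:n, 0:n:2] + out[:n, 1:n:2]) / np.sqrt(2)
        d = (out[:n, 0:n:2] - out[:n, 1:n:2]) / np.sqrt(2)
        out[:n, :h], out[:n, h:n] = a, d
        # columns: same on row pairs
        a = (out[0:n:2, :n] + out[1:n:2, :n]) / np.sqrt(2)
        d = (out[0:n:2, :n] - out[1:n:2, :n]) / np.sqrt(2)
        out[:h, :n], out[h:n, :n] = a, d
        n = h
    return out
```

Because every pair step is an orthonormal rotation, the transform preserves the signal's energy; that is the property that makes discarding the many near-zero detail coefficients a low-distortion approximation.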
(6) Using the BRDF given in step (3) and the view-direction samples obtained in step (4), precompute the BRDF: obtain the BRDF slice under every sampled view direction and apply the Haar wavelet transform to obtain the slice's wavelet coefficients. The BRDF of a material is a 4D function defined on the upper hemisphere of the local coordinate system fixed by the normal. For a given point, the view direction is the vector from the point to the viewpoint, so for a fixed viewpoint the view direction of every point in space is determined. Hence the view direction is first sampled over the hemispherical domain; the incident direction is then sampled for each view sample, and the BRDF value is evaluated from the incident and view directions, giving the BRDF slice for each sampled view direction. Next, the Haar wavelet transform is applied to the slice of every view sample and the resulting wavelet coefficients are stored. On the experimental machine (Intel Core2 Duo E7200 2.53 GHz, 4 GB RAM, NVIDIA 9800GTX+ 512 MB), the precomputation time and storage of the BRDF at different sampling rates are listed below:
[Table: BRDF precomputation time and storage at different sampling rates; reproduced only as an image in the original patent]
(7) For every vertex of the virtual object to be rendered, integrate the environment lighting against the BRDF over the vertex's upper hemisphere to obtain the rendered color. For each vertex of the model, rotate the normal from model coordinates into world coordinates by the current transformation matrix and index the local-lighting wavelet coefficients by the rotated normal; because the normal sampling is discrete, the indexed coefficients must be linearly interpolated. Compute the view direction from the viewpoint and vertex positions, transform it into the local frame defined by the vertex normal, and index the wavelet coefficients of the BRDF slice by that direction. Then take the inner product of the lighting coefficients and the BRDF-slice coefficients to obtain the vertex color, and finally send the vertex colors to the graphics hardware to render the model. When the model rotates or the viewpoint changes, repeat the process with the transformed normal and view direction to complete the shading of the virtual object. Since only a small fraction of the full set of wavelet coefficients suffices to approximate the original signal, the inner product can be taken over very few coefficients, greatly reducing the runtime cost and raising rendering speed.
The rendering algorithm can be implemented on the GPU for efficiency. Although the GPU raises raw arithmetic throughput, it supports sparse-matrix operations poorly (their parallelism is low), and its storage is limited, so a GPU implementation is comparatively constrained. Because the GPU is ill-suited to sparse-matrix arithmetic, storing both the environment-lighting coefficients and the BRDF coefficients sparsely would make the dot product of the two very difficult to compute. Instead, one of the two (the environment-lighting coefficients or the BRDF coefficients) can be stored as a dense matrix and the other sparsely; the index of each sparse entry is then used to look up the corresponding value in the dense matrix before multiplying. In practice the environment lighting varies more strongly than the BRDF, so the BRDF can be reconstructed from fewer wavelet coefficients. The BRDF is therefore approximated nonlinearly, the wavelet coefficients and indices of the BRDF slices for all view directions are stored as a sparse matrix, and the local environment lighting for all normal directions is stored as a dense matrix. The full rendering process is shown in Figure 6: for each pixel, the view direction is used to look up the wavelet coefficients and indices of the BRDF slice for that direction (① in the figure); for each coefficient index of the BRDF slice, the corresponding local-lighting wavelet coefficient is looked up from the index value and the normal vector (②) and multiplied by the BRDF-slice coefficient value, which completes the dot product of the lighting and BRDF coefficients and yields the color of the pixel (③).
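The sparse-BRDF, dense-lighting gather-and-multiply can be sketched in a GPU-friendly vectorized form. The layout below is an assumption for illustration: the sparse slices are zero-padded to a fixed number K of retained coefficients per view direction so that every pixel does the same amount of uniform work, as a shader would.

```python
import numpy as np

# Illustrative layout (assumed, not taken verbatim from the patent):
# the dense lighting coefficients sit in one 2D array ("texture"), one row
# per sampled normal; the sparse BRDF slices are padded to K retained
# coefficients per view direction so the GPU can address them uniformly.

def render_pixels(normal_ids, view_ids, light_tex, brdf_idx_tex, brdf_val_tex):
    """Gather-and-multiply: the GPU-friendly form of the coefficient dot product.

    light_tex:    (N_normals, M)  dense Haar coefficients of local lighting
    brdf_idx_tex: (N_views, K)    indices of retained BRDF coefficients
    brdf_val_tex: (N_views, K)    their values (zero-padded)
    """
    idx = brdf_idx_tex[view_ids]                 # (P, K) per-pixel sparse indices
    val = brdf_val_tex[view_ids]                 # (P, K) sparse values
    light = light_tex[normal_ids[:, None], idx]  # (P, K) dense lookup by index
    return (light * val).sum(axis=1)             # (P,)   per-pixel colors
```

Only the sparse side carries explicit indices; the dense side is addressed directly by them, which is exactly why mixing one sparse and one dense operand sidesteps the GPU's weakness at sparse-sparse products.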
As shown in Figure 7, the Buddha model is rendered with a silver material in a real environment (an aero-engine laboratory): on the left under global virtual lighting (three point light sources in the scene), and on the right with the illumination-consistent rendering of the present method, using the captured environment lighting as the light source. The rendering on the right is clearly more realistic than that on the left.
In summary, the present invention can render a virtual-real fused environment efficiently and realistically, with illumination that is consistent between the virtual and real parts.
Details not described in this specification belong to the prior art known to those skilled in the art.

Claims (2)

1. A consistent illumination method for a virtual-real fused environment, characterized in that it is implemented as follows:
(1) a global world coordinate system of the real world is constructed using a chosen position in the real scene as a reference, and the real-world and virtual-world coordinate systems are unified according to the transformation between the real-world and virtual-world global coordinate systems;
(2) the extrinsic parameters of the camera are solved in the unified coordinate system of step (1); a mirrored sphere placed in the real scene and reflecting the real environment is photographed to obtain high-dynamic-range images whose color values exceed the ordinary range, and the environment-lighting map of the real scene is acquired;
(3) surface reflectance attributes are assigned to the virtual object to be rendered, the attributes being represented by a bidirectional reflectance distribution function (BRDF);
(4) normal-direction samples over the unit-sphere domain and view-direction samples over the unit-hemisphere domain are generated;
(5) using the environment-lighting map of step (2) and the normal samples of step (4), the environment lighting is precomputed to obtain the local environment lighting for each sampled normal direction, and a Haar wavelet transform is applied to obtain the wavelet coefficients of the local environment lighting;
(6) using the BRDF of step (3) and the view-direction samples of step (4), the BRDF is precomputed to obtain the BRDF slice for each sampled view direction, and a Haar wavelet transform is applied to obtain the wavelet coefficients of the BRDF slices;
(7) with the wavelet coefficients of the local environment lighting from step (5) and of the BRDF slices from step (6), each vertex of the virtual object to be rendered is shaded by integrating the environment lighting against the BRDF over the upper hemisphere at the vertex to obtain the rendered vertex color; when the model is rotated or the viewpoint changes, the above process is repeated with the transformed normals and view directions to complete the shading of the virtual object; GPU acceleration is used for the rendering.
2. The consistent illumination method for a virtual-real fused environment according to claim 1, characterized in that the environment-lighting map of the real scene in step (2) is acquired as follows: first the extrinsic parameters of the capturing camera are obtained; then, with the camera fixed, the mirrored sphere reflecting the real environment is photographed at different exposures to obtain several low-dynamic-range sphere images, the camera response curve is fitted, and the high-dynamic-range sphere image is synthesized from the camera response curve; the sphere silhouette is located in the captured image, each pixel enclosed by the silhouette corresponding to a position on the sphere; the sphere normal at each pixel's position is interpolated from that pixel's distance to the center pixel; the environment lighting reaches the camera by reflection at that position, and since the camera's viewing direction is known, the incident direction of the environment lighting for that pixel is obtained from the law of reflection, yielding the lighting map of the real environment.
3. The consistent illumination method for a virtual-real fused environment according to claim 1, characterized in that the precomputation of the captured environment lighting in step (5) is performed as follows: for each sampled normal direction, the binormal and tangent vectors are solved to determine the local coordinate system defined by that normal; the upper hemisphere of this local coordinate system is sampled, each sample corresponding to a direction vector in the local frame; all sampled directions are transformed into the global coordinate system and the environment lighting is sampled along them, yielding the local environment lighting for that normal; the Haar wavelet transform is then applied to the local environment lighting of each sampled normal, and the resulting wavelet coefficients are stored.
4. The consistent illumination method for a virtual-real fused environment according to claim 1, characterized in that the BRDF precomputation of step (6) is performed as follows: the view directions are sampled over the hemispherical domain, the incident directions are then sampled for each view-direction sample, and the BRDF value is evaluated from the incident and view directions, yielding the BRDF slice for each sampled view direction; finally the Haar wavelet transform is applied to the BRDF slice of each sampled view direction, and the resulting wavelet coefficients are stored.
5. The consistent illumination method for a virtual-real fused environment according to claim 1, characterized in that the rendering of the virtual object in step (7) is performed as follows: for each vertex on the model, the normal vector in the model coordinate system is rotated into the world coordinate system by the current transformation matrix, and the rotated normal is used to index the wavelet coefficients of the local lighting; because the normal directions are sampled discretely, the indexed wavelet coefficients must be linearly interpolated; the view direction is solved from the viewpoint position and the vertex position, transformed into the local coordinate system determined by the vertex normal, and used to index the wavelet coefficients of the BRDF slice; the dot product of the lighting coefficients and the BRDF-slice coefficients then gives the color of the vertex; finally the vertex colors are sent to the graphics hardware and the model is rendered; when the model is rotated or the viewpoint changes, the above process is repeated with the transformed normals and view directions to complete the shading of the virtual object, the rendering being implemented on the GPU for efficiency.
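The pixel-to-incident-direction mapping of the mirror-sphere capture (claim 2) can be sketched as follows. This is a minimal sketch under simplifying assumptions not stated in the patent: an orthographic camera looking down the -z axis at a unit-radius reflective sphere, with the function name and parameters chosen for illustration.

```python
import numpy as np

# Sketch of the mirror-sphere direction mapping: the sphere normal at a
# pixel follows from the pixel's offset to the sphere-center pixel, and
# the lighting's incident direction follows from the law of reflection
#   d = v - 2 (v . n) n,
# with v the (known) camera viewing direction. Assumes an orthographic
# camera looking along -z (an assumption, not from the patent).

def incident_direction(px, py, cx, cy, radius):
    """Map a pixel inside the sphere silhouette to the light's incident direction."""
    x = (px - cx) / radius
    y = (py - cy) / radius
    r2 = x * x + y * y
    if r2 > 1.0:
        raise ValueError("pixel outside the sphere silhouette")
    n = np.array([x, y, np.sqrt(1.0 - r2)])   # outward sphere normal at the pixel
    v = np.array([0.0, 0.0, -1.0])            # camera viewing direction
    d = v - 2.0 * np.dot(v, n) * n            # reflected direction toward the light
    return d / np.linalg.norm(d)
```

At the center pixel the sphere reflects the camera itself (the direction points straight back at it), while pixels near the silhouette reflect grazing directions behind the sphere, which is why a single sphere image samples the full surrounding environment.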
CN 201110033282 2011-01-30 2011-01-30 Consistent lighting method under falsehood-reality fused environment Expired - Fee Related CN102096941B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN 201110033282 CN102096941B (en) 2011-01-30 2011-01-30 Consistent lighting method under falsehood-reality fused environment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN 201110033282 CN102096941B (en) 2011-01-30 2011-01-30 Consistent lighting method under falsehood-reality fused environment

Publications (2)

Publication Number Publication Date
CN102096941A CN102096941A (en) 2011-06-15
CN102096941B true CN102096941B (en) 2013-03-20

Family

ID=44130017

Family Applications (1)

Application Number Title Priority Date Filing Date
CN 201110033282 Expired - Fee Related CN102096941B (en) 2011-01-30 2011-01-30 Consistent lighting method under falsehood-reality fused environment

Country Status (1)

Country Link
CN (1) CN102096941B (en)

Families Citing this family (25)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102254340B (en) * 2011-07-29 2013-01-09 北京麒麟网文化股份有限公司 Method and system for drawing ambient occlusion images based on GPU (graphic processing unit) acceleration
CN103679649B (en) * 2013-11-18 2016-10-05 联想(北京)有限公司 A kind of information processing method and electronic equipment
US10417824B2 (en) * 2014-03-25 2019-09-17 Apple Inc. Method and system for representing a virtual object in a view of a real environment
CN104392481B (en) * 2014-11-25 2017-12-05 无锡梵天信息技术股份有限公司 A kind of method and device that high light reflectivity definition is controlled using textures
CN104766270B (en) * 2015-03-20 2017-10-03 北京理工大学 One kind is based on fish-eye actual situation illumination fusion method
CN107204035B (en) * 2016-12-31 2021-05-14 桂林大容文化科技有限公司 Real-time rendering method of multilayer non-uniform material reflective object
CN107316344B (en) * 2017-05-18 2020-08-14 深圳市佳创视讯技术股份有限公司 Method for planning roaming path in virtual-real fusion scene
CN107608077B (en) * 2017-09-08 2020-01-03 长春理工大学 Multi-light-source position estimation method
CN108470079B (en) * 2017-10-26 2023-04-07 北京特种工程设计研究院 Simulation method for radiation safety evaluation of nuclear operation of space launching field
CN109934903B (en) * 2017-12-19 2020-10-16 北大方正集团有限公司 Highlight information extraction method, system, computer equipment and storage medium
CN108460841A (en) * 2018-01-23 2018-08-28 电子科技大学 A kind of indoor scene light environment method of estimation based on single image
CN109509246B (en) * 2018-03-25 2022-08-02 哈尔滨工程大学 Photon map clustering method based on self-adaptive sight division
CN108509887A (en) * 2018-03-26 2018-09-07 深圳超多维科技有限公司 A kind of acquisition ambient lighting information approach, device and electronic equipment
CN108830804B (en) * 2018-05-23 2023-03-10 长春理工大学 Virtual-real fusion fuzzy consistency processing method based on line spread function standard deviation
CN109003273B (en) * 2018-07-27 2021-05-18 郑州工程技术学院 Method for detecting light guide consistency of car lamp
CN110412828A (en) * 2018-09-07 2019-11-05 广东优世联合控股集团股份有限公司 A kind of Method of printing and system of three-dimensional optical track image
CN109242800B (en) * 2018-09-26 2022-03-29 北京邮电大学 Method for realizing illumination consistency of virtual model by estimating environmental illumination through image
CN111833374B (en) * 2019-04-22 2023-12-05 曜科智能科技(上海)有限公司 Path planning method, system, storage medium and terminal based on video fusion
CN110493540B (en) * 2019-08-21 2020-12-04 四川大学 Scene dynamic illumination real-time acquisition method and device
CN110458964B (en) * 2019-08-21 2021-07-27 四川大学 Real-time calculation method for dynamic illumination of real environment
CN110570503B (en) * 2019-09-03 2021-04-16 浙江大学 Method for acquiring normal vector, geometry and material of three-dimensional object based on neural network
CN110728748B (en) * 2019-09-30 2023-05-05 中国科学院国家天文台南京天文光学技术研究所 Rendering method based on hemispherical orthogonal function
CN111145341B (en) * 2019-12-27 2023-04-28 陕西职业技术学院 Virtual-real fusion illumination consistency drawing method based on single light source
CN111652963B (en) * 2020-05-07 2023-05-02 浙江大学 Augmented reality drawing method based on neural network
CN112837425B (en) * 2021-03-10 2022-02-11 西南交通大学 Mixed reality illumination consistency adjusting method

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101710429A (en) * 2009-10-12 2010-05-19 湖南大学 Illumination algorithm of augmented reality system based on dynamic light map
CN101794459A (en) * 2010-02-09 2010-08-04 北京邮电大学 Seamless integration method of stereoscopic vision image and three-dimensional virtual object

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100609145B1 (en) * 2004-12-20 2006-08-08 한국전자통신연구원 Rendering Apparatus and Method for real-time global illumination in real light environment

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101710429A (en) * 2009-10-12 2010-05-19 湖南大学 Illumination algorithm of augmented reality system based on dynamic light map
CN101794459A (en) * 2010-02-09 2010-08-04 北京邮电大学 Seamless integration method of stereoscopic vision image and three-dimensional virtual object

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Research on Consistent Illumination for Augmented Reality; Ge Xuedong; Proceedings of the 13th National Conference on Image and Graphics; Nov. 2006; 664-667 *

Also Published As

Publication number Publication date
CN102096941A (en) 2011-06-15

Similar Documents

Publication Publication Date Title
CN102096941B (en) Consistent lighting method under falsehood-reality fused environment
WO2022121645A1 (en) Method for generating sense of reality of virtual object in teaching scene
CN108648269B (en) Method and system for singulating three-dimensional building models
JP6246757B2 (en) Method and system for representing virtual objects in field of view of real environment
CN107330964B (en) Display method and system of complex three-dimensional object
US20050041024A1 (en) Method and apparatus for real-time global illumination incorporating stream processor based hybrid ray tracing
CN104766270A (en) Virtual and real lighting fusion method based on fish-eye lens
CN104392481B (en) A kind of method and device that high light reflectivity definition is controlled using textures
CN102915559A (en) Real-time transparent object GPU (graphic processing unit) parallel generating method based on three-dimensional point cloud
CN107644453A (en) A kind of rendering intent and system based on physical colored
WO2023185262A1 (en) Illumination rendering method and apparatus, computer device, and storage medium
KR20240001021A (en) Image rendering method and apparatus, electronic device, and storage medium
Grosch et al. Consistent interactive augmentation of live camera images with correct near-field illumination
Kolivand et al. Covering photo-realistic properties of outdoor components with the effects of sky color in mixed reality
CA3199390A1 (en) Systems and methods for rendering virtual objects using editable light-source parameter estimation
CN103679818B (en) A kind of real-time scene method for drafting based on virtual surface light source
US11804008B2 (en) Systems and methods of texture super sampling for low-rate shading
CN104517313A (en) AO (ambient occlusion) method based on screen space
Wang et al. Bidirectional shadow rendering for interactive mixed 360° videos
CN116385619A (en) Object model rendering method, device, computer equipment and storage medium
Noh et al. Soft shadow rendering based on real light source estimation in augmented reality
Iwasaki et al. Efficient rendering of optical effects within water using graphics hardware
Dai et al. Interactive mixed reality rendering on holographic pyramid
Heymann et al. Illumination reconstruction from real-time video for interactive augmented reality
KR102237382B1 (en) Method of harmonic rendering on augmented reality environment, and augmented reality system and recoding medium for performing thereof

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20130320

Termination date: 20200130