CN106504315B - Method and apparatus for simulating global illumination - Google Patents

Method and apparatus for simulating global illumination

Info

Publication number
CN106504315B
CN106504315B (application CN201611027685.4A)
Authority
CN
China
Prior art keywords
scene
pixel
polyhedron
building block
towards
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201611027685.4A
Other languages
Chinese (zh)
Other versions
CN106504315A (en)
Inventor
刘皓
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd filed Critical Tencent Technology Shenzhen Co Ltd
Priority to CN201611027685.4A priority Critical patent/CN106504315B/en
Publication of CN106504315A publication Critical patent/CN106504315A/en
Application granted granted Critical
Publication of CN106504315B publication Critical patent/CN106504315B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00 3D [Three Dimensional] image rendering
    • G06T15/50 Lighting effects
    • G06T15/506 Illumination models
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T13/00 Animation
    • G06T13/20 3D [Three Dimensional] animation

Abstract

This application discloses a method and apparatus for simulating global illumination. The method includes: splitting, by simulation, a target object that requires illumination rendering in a scene to be output into multiple building blocks; for any one building block, simulating a polyhedron that encloses the building block, and, on multiple target directions toward the polyhedron, simulating a camera that images the scene from each direction, to obtain several scene images simulated from the multiple target directions; for any one scene image, determining, according to the pixel color of each pixel in the scene image, the ambient light color information that the polyhedron can produce on the corresponding target direction; and, based on the ambient light color information that the polyhedron can produce on the multiple target directions, performing illumination rendering on the building block enclosed by the polyhedron, to obtain the lighting effect of the target object. The scheme of this application can render the lighting effect of objects in the scene in real time and can reflect the global illumination effect of the objects more faithfully.

Description

Method and apparatus for simulating global illumination
Technical field
This application relates to the technical field of image processing, and more particularly to a method and apparatus for simulating global illumination.
Background technology
Global illumination is an important research field in computer graphics. By simulating the lighting conditions found in nature, it captures indirect illumination effects such as soft shadows and refraction produced by the multiple propagation (e.g., reflection and refraction) of light in a real environment; these effects greatly reinforce the realism of rendering results. Global illumination is now widely used in fields such as animation, virtual reality, and games.
In fields such as animation, virtual reality, and games, a scene may contain not only static objects (objects or characters fixed in the scene) but also a large number of dynamic objects (objects or characters that can move within the scene). For static objects, global illumination can be achieved by precomputing light maps. For dynamic objects, however, their positions in the scene change continuously, so the lighting conditions they receive also change from moment to moment. In particular, in fields with high real-time requirements such as games or virtual reality, dynamic objects move in ways that are unpredictable because they depend on user input, so the lighting conditions at the different positions of a dynamic object cannot be determined by generating light maps in advance.
At present, in order to determine the lighting effect of dynamic objects, the ambient light color at discrete spatial positions in the scene is usually analyzed and stored in advance, and the pre-stored ambient light information is retrieved to perform illumination rendering on a dynamic object when its lighting effect needs to be rendered. However, pre-storing the ambient light information of different spatial points in the scene necessarily occupies a large amount of storage space, wasting storage resources. Moreover, besides static light sources, a scene may also contain dynamic light sources: some dynamic objects may become emitters, or a dynamic object may itself be a moving light source. A dynamic object is therefore illuminated not only by static light sources but possibly also by other dynamic objects. Because the positions of dynamic light sources are uncertain, the precomputed ambient light color information at discrete spatial points cannot take the illumination contribution of dynamic light sources into account at all, so the true lighting effect of the dynamic object cannot be rendered.
Summary of the invention
In view of this, this application provides a method and apparatus for simulating global illumination, so as to render the lighting effect of an object in a scene in real time and to reflect the global illumination of the object more faithfully.
To achieve the above object, in one aspect, an embodiment of the present application provides a method of simulating global illumination, including:
splitting, by simulation, a target object that requires illumination rendering in a scene to be output into multiple building blocks;
for any one building block, simulating a polyhedron that encloses the building block;
on multiple target directions toward the polyhedron, simulating a camera that images the scene from each direction, to obtain several scene images simulated from the multiple target directions;
for a scene image simulated on any one target direction, determining, according to the pixel color of each pixel in the scene image, one group of ambient light color information that the polyhedron applies to the building block, thereby obtaining multiple groups of ambient light color information corresponding to the multiple target directions;
performing illumination rendering on the building block enclosed by the polyhedron based on the multiple groups of ambient light color information of the polyhedron, to obtain the lighting effect of the target object.
In another aspect, an embodiment of the present application further provides an apparatus for simulating global illumination, including:
an object splitting unit, configured to split, by simulation, a target object that requires illumination rendering in a scene to be output into multiple building blocks;
a space construction unit, configured to, for any one building block, simulate a polyhedron that encloses the building block;
a camera simulation unit, configured to, on multiple target directions toward the polyhedron, simulate a camera that images the scene from each direction, to obtain several scene images simulated from the multiple target directions;
a light color determination unit, configured to, for a scene image simulated on any one target direction, determine, according to the pixel color of each pixel in the scene image, one group of ambient light color information that the polyhedron applies to the building block, thereby obtaining multiple groups of ambient light color information corresponding to the multiple target directions;
an illumination rendering unit, configured to perform illumination rendering on the building block enclosed by the polyhedron based on the multiple groups of ambient light color information of the polyhedron, to obtain the lighting effect of the target object.
It can be seen from the above technical solution that, in the embodiments of this application, before the scene is output, the target object to be rendered in the scene is modeled as consisting of multiple building blocks, and a polyhedron enclosing each building block is simulated. By simulating a camera that images the scene on multiple target directions toward the polyhedron, several discrete scene images are obtained, and the ambient light color information that the polyhedron can apply to the building block can be analyzed from the pixel colors of the pixels in these scene images. Since each group of ambient light color information is determined from the pixels of discrete scene images, and the sampling rate at which the discrete scene images are sampled can reach a rather high level, the precision of the determined ambient light colors is improved. Moreover, in the embodiments of this application every scene image is captured by real-time simulation, so the contributions of both the dynamic and the static light sources in the current scene to the simulated polyhedron are combined; the ambient light color information applied to the building blocks of the target object, determined from these scene images, is therefore more accurate, and the lighting effect of the target object can be rendered more faithfully.
In addition, because this application simulates scene images and then determines from them the ambient light color that the polyhedron applies to the building block, the work is completed by GPU-based rendering. Compared with determining the ambient light color information by CPU-based rendering, GPU-based rendering greatly improves computational efficiency and reduces computation time.
Description of the drawings
In order to illustrate the technical solutions in the embodiments of this application or in the prior art more clearly, the accompanying drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below are only embodiments of this application, and those of ordinary skill in the art can obtain other drawings from them without creative effort.
Fig. 1 is a schematic structural diagram of a computer device disclosed in an embodiment of this application;
Fig. 2 is a schematic structural diagram of a possible system for simulating global illumination in a scene disclosed in an embodiment of this application;
Fig. 3 is a schematic flowchart of one embodiment of a method for simulating global illumination in a scene disclosed in an embodiment of this application;
Fig. 4 is a schematic flowchart of another embodiment of the method for simulating global illumination in a scene disclosed in an embodiment of this application;
Fig. 5a is a schematic diagram of a human object included in a scene;
Fig. 5b is a schematic diagram of multiple cubes enclosing multiple building blocks of the human object in the scene;
Fig. 6 is a schematic structural diagram of one embodiment of an apparatus for simulating global illumination according to an embodiment of this application.
Detailed description of the embodiments
The technical solutions in the embodiments of this application are described clearly and completely below with reference to the accompanying drawings in the embodiments of this application. Obviously, the described embodiments are only some of the embodiments of this application, not all of them. All other embodiments obtained by those of ordinary skill in the art based on the embodiments of this application without creative effort fall within the protection scope of this application.
The embodiments of this application provide a method and apparatus for simulating global illumination. The method and apparatus can be applied to fields such as games and virtual reality to render the lighting effect of objects in a scene in real time.
The method and apparatus of this embodiment are applicable to any computer device, e.g., a server that externally provides game services or virtual reality services, or other equipment with graphics data processing capability.
Fig. 1 shows a schematic structural diagram of a computer device to which the method and apparatus for simulating global illumination in a scene of the embodiments of this application are applicable. In Fig. 1, the computer device may include: a processor 101, a memory 102, a communication interface 103, a display 104, an input unit 105, and a communication bus 106.
The processor 101, the memory 102, the communication interface 103, the display 104, and the input unit 105 communicate with one another through the communication bus 106.
In the embodiments of this application, the processor 101 includes at least a graphics processing unit (GPU) 1012. The GPU may be used to simulate the camera that images the scene in the embodiments of this application and to generate scene images in different directions, or to carry out related graphics data processing such as image rendering and computation of ambient light information.
Optionally, the processor 101 may also include a central processing unit (CPU) 1011, which assists the graphics processing unit in completing some related data processing and carries out the main data processing operations of the computer device. Of course, the CPU may also be replaced by an application-specific integrated circuit (ASIC), a digital signal processor (DSP), a field-programmable gate array (FPGA), or another programmable logic device.
The memory 102 is configured to store one or more programs; a program may include program code, and the program code includes computer operation instructions. The memory may include a high-speed RAM memory and may also include a non-volatile memory, for example at least one magnetic disk memory.
The communication interface 103 may be an interface of a communication module, such as an interface of a GSM module.
The display 104 may be used to display the objects involved in the scene and other image information; it may also display information input by the user or provided to the user, as well as the various graphical user interfaces of the computer device, which may be composed of any combination of graphics, text, pictures, and the like. The display may include a display panel, e.g., a display panel configured in the form of a liquid crystal display or organic light-emitting diodes. Further, the display may include a touch display panel capable of capturing touch events.
The input unit 105 may be used to receive information input by the user, such as characters and numbers, and to generate signal input related to user settings and function control. The input unit may include, but is not limited to, one or more of a physical keyboard, a mouse, and a joystick.
Of course, the structure of the computer device shown in Fig. 1 does not constitute a limitation on the computer device; in practical applications the computer device may include more or fewer components than shown in Fig. 1, or combine certain components.
In one application scenario, the scheme of the embodiments of this application may render objects (preferably dynamic objects) in a scene in real time before a computer device running an application such as a game or virtual reality outputs the scene. In this kind of scenario, the computer device determines the scene to be output, and the rendering of the dynamic objects in the scene can be completed independently by the computer device. For example, when a game player plays a single-player game using a computer device such as a mobile phone or tablet computer, before outputting each frame of the game picture the computer device can perform real-time rendering according to the objects involved in that frame, and output the game picture containing the rendered objects.
In another application scenario, the scheme of the embodiments of this application is also applicable to a terminal interacting with a server, with the picture corresponding to the scene output on the terminal side. In this kind of scenario, the scene to be output on the terminal side can be obtained from the server side, and the terminal can perform illumination rendering on the objects in the scene before outputting it. To facilitate understanding of the scheme of this application, this kind of applicable scenario is briefly introduced below. Referring to Fig. 2, it shows a schematic structural diagram of a system to which the method of simulating global illumination of this application is applicable. As shown in Fig. 2, the system may include a service system composed of at least one server 201, and multiple terminals 202.
The server 201 in the service system may store scene data for realizing functions such as games or virtual reality, and transmits the scene data to a terminal when the terminal requests it.
The terminal 202 is used to display the scene corresponding to the scene data returned by the server and, depending on the user's operation, to send to the server an update request for updating the position of a dynamic object in the scene, so as to obtain the scene in which the server has updated the position of the dynamic object.
Further, before outputting the scene returned by the server, the terminal 202 also needs to determine the target object to be rendered in the scene, use the GPU in the terminal to simulate splitting the target object into multiple building blocks, simulate a polyhedron enclosing each building block, and, on multiple target directions toward the polyhedron, simulate a camera that images the scene from each direction, to obtain several scene images simulated from the multiple target directions. Then, according to the pixel color of each pixel in a scene image, the terminal determines the ambient light color information that the polyhedron can produce on the corresponding target direction, and finally performs illumination rendering on the building block enclosed by the polyhedron based on the ambient light color information that the polyhedron can produce on the multiple target directions, to obtain the lighting effect of the target object.
Of course, in practical applications, the operation of rendering the target object to be rendered in the scene can also be completed on the server side, and the detailed process is similar to the execution process on the terminal side.
Based on the above common features, the method of simulating global illumination of the embodiments of this application is described in detail below.
Referring to Fig. 3, it shows a schematic flowchart of one embodiment of a method of simulating global illumination of this application. The method of this embodiment can be applied to the computer device mentioned above, and the related operations can specifically be completed by the GPU in the computer device. The method of this embodiment may include:
301: splitting, by simulation, a target object that requires illumination rendering in a scene to be output into multiple building blocks.
The scene to be output can be understood as the scene corresponding to the picture that is to be output to the display. Depending on the field of application, the scene may be a game scene, a virtual reality scene, or the like.
The scene may include multiple objects, e.g., stationary objects, dynamic objects, characters, and so on. In this embodiment, an object in the scene that requires illumination rendering is called a target object; the target object may be a static object or a dynamic object. Optionally, considering that static objects can be rendered using precomputed light maps, in this embodiment the target object may specifically refer to a dynamic object that requires illumination rendering. Of course, there may be one or more target objects determined in the embodiments of this application, but the illumination rendering of any one target object may be handled using the processing steps of this embodiment.
It is understood that each target object can be split into multiple parts, and each part can be considered a building block; the specific splitting manner can be set as needed. It should be noted that the splitting described in this embodiment does not disassemble or separate the target object into multiple components; it only simulates, on the target object, the multiple building blocks needed to constitute the target object.
For example, when the target object is a car, the splitting can be simulated on the car itself to determine the specific positions of the multiple building blocks, such as the doors, body, and wheels, that constitute the car. For another example, when the target object is a person, the body parts constituting the human body can be simulated, e.g., simulating the splitting of the human body into multiple building blocks such as the head, neck, limbs, and trunk.
In the embodiments of this application, the purpose of simulating the splitting of the target object is to allow the lighting effect of each building block to be determined separately, and then to reflect the overall lighting effect through the lighting effects of all the building blocks of the target object, thereby obtaining a more accurate lighting effect of the target object.
302: for any one building block, simulating a polyhedron that encloses the building block.
A polyhedron is actually an enclosed space composed of multiple faces. The number of faces of the polyhedron can be set as needed; for example, the polyhedron may be a tetrahedron, a cube, or an octahedron.
It is understood that a scene is formed by a large number of continuous points, and the lighting effect of a dynamic object in the scene is determined by the ambient light colors of these continuous points. Since the number of points in the space of the scene is enormous, it is impossible to determine in real time, by any algorithm, the lighting effect that the ambient light colors of all these continuous points would produce on the dynamic object. Therefore, in fields with high real-time requirements such as virtual reality and games, the scene can be abstracted into small enclosed spaces, and the lighting effect of a dynamic object can be determined from the ambient light color in the small enclosed space where the object is located. Accordingly, in order to determine the lighting effect of a building block, a polyhedron enclosing the building block can be simulated, so as to determine the lighting effect that the enclosed space of this polyhedron can apply to the building block.
303: on multiple target directions toward the polyhedron, simulating a camera that images the scene from each direction, to obtain several scene images simulated from the multiple target directions.
It is understood that, for any one polyhedron, the ambient light information in the enclosed space corresponding to the polyhedron is related to the light signals that each pixel in the scene applies to the polyhedron. Therefore, on multiple target directions toward the polyhedron, a camera must be simulated to shoot the scene toward the polyhedron, so that the simulated camera captures the light signals within its imaging direction and imaging range in the scene, and the captured light signals are presented in the simulated image. In the embodiments of this application, the image generated by the simulated camera is called a scene image.
The target directions can be preset as needed, or multiple directions can be randomly selected from the directions toward the polyhedron as the target directions.
Further, the ambient light color information in the enclosed space of the polyhedron can be reflected by the ambient light colors at discrete points or faces sampled from the polyhedron. Thus, when simulating the camera that images the scene toward the polyhedron, the simulated camera can be set to image the scene on multiple target directions toward the individual faces of the polyhedron, or the camera can be simulated to image the scene on multiple directions toward multiple different points of the polyhedron.
In one implementation, on multiple directions toward different faces of the polyhedron, the camera can be simulated to face each of the different faces and image the scene. For any one face of the polyhedron, the face can radiate light in two opposite directions, toward the outside and toward the inside; what needs to be determined in the embodiments of this application is the ambient light color that the face applies to the building block enclosed by the polyhedron. For ease of distinction, the direction in which the ambient light color of the face is applied to the building block enclosed by the polyhedron is called the illumination direction of the face acting on the building block; this illumination direction points from the face toward the inside of the polyhedron. It is understood that, in order to obtain the light signals that the face can produce along the illumination direction, the camera needs to be simulated to image the face in the direction opposite to the illumination direction. That is, the target direction toward the face is the direction pointing at the face from the inside of the polyhedron; specifically, it can be the direction pointing perpendicularly at the face from the inside of the polyhedron.
Optionally, in order to comprehensively reflect the ambient light that all pixel points in the scene can apply to the face, the simulated camera needs to be able to capture the entire scene region corresponding to the face; the target direction toward the face can then be the direction pointing perpendicularly at the face from the center of the polyhedron.
For example, taking the polyhedron being a cube as an example, the cube has six faces, which can correspond to six target directions; each target direction can be the direction pointing perpendicularly at a face from the center of the cube. In this way, for each face of the cube, a camera is simulated to image the scene toward that face, and the pixel colors of the pixel points in the scene region corresponding to the face are simulated into the pixels of the image captured by the camera.
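Purely as an illustrative sketch, and not part of the original disclosure, the six target directions of such a cube and the corresponding simulated camera placements could be derived as follows. The names cube_camera_setups, cube_center, half_extent, and camera_distance are assumptions introduced here, and placing the camera on the inner side of each face looking outward is an assumed detail consistent with imaging the scene toward the face.

```python
import numpy as np

# Outward unit normals of the six faces of an axis-aligned cube.
FACE_NORMALS = [
    np.array([ 1.0,  0.0,  0.0]), np.array([-1.0,  0.0,  0.0]),
    np.array([ 0.0,  1.0,  0.0]), np.array([ 0.0, -1.0,  0.0]),
    np.array([ 0.0,  0.0,  1.0]), np.array([ 0.0,  0.0, -1.0]),
]

def cube_camera_setups(cube_center, half_extent, camera_distance):
    """Return (camera_position, target_direction) for each of the six faces.

    The target direction points perpendicularly at the face from the cube
    center; the camera sits on that perpendicular axis at a preset distance
    from the face center, looking along the target direction.
    """
    setups = []
    center = np.asarray(cube_center, float)
    for normal in FACE_NORMALS:
        target_direction = normal                      # from the center toward the face
        face_center = center + half_extent * normal
        camera_position = face_center - camera_distance * normal
        setups.append((camera_position, target_direction))
    return setups
```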
In another implementation, on multiple different target directions, the camera can be simulated toward different vertices of the polyhedron to image the scene. Considering the particularity of the polyhedron, multiple target directions toward different vertices of the polyhedron can be chosen; on the target direction corresponding to any one vertex, a camera can be simulated to image the scene toward that vertex. Specifically, considering that the illumination direction in which a vertex of the polyhedron acts on the enclosed building block is the direction pointing from the vertex toward the inside of the polyhedron, the target direction toward a vertex of the polyhedron can be the direction pointing from the inside of the polyhedron toward that vertex.
304: for a scene image simulated on any one target direction, determining, according to the pixel color of each pixel in the scene image, one group of ambient light color information that the polyhedron applies to the building block it encloses, thereby obtaining multiple groups of ambient light color information corresponding to the multiple target directions.
It should be noted that, since a scene image is simulated on each of the multiple target directions toward the polyhedron, the operation of step 304 needs to be performed for the scene image simulated on each target direction of the polyhedron.
For the scene image simulated on one target direction, the ambient light color that the polyhedron applies to the building block it encloses can be determined based on the pixel colors of the pixels in that scene image, and this group of ambient light color information is actually the ambient light color that the polyhedron can produce on the illumination direction opposite to that target direction.
There may be many ways of determining one group of ambient light color information according to the pixel colors of the pixels in the scene image. Optionally, for the scene image simulated on one target direction, the pixel colors of the pixels in the scene image may be averaged, and the averaged pixel color is taken as one group of ambient light color.
Further, averaging the pixel colors of the pixels in the scene image may be: gradually reducing the scene image and using a Gaussian filter to average the pixel colors of the pixels in the reduced scene image, until the pixel color of a single unit pixel is obtained.
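A minimal sketch of this progressive reduction, assuming the scene image is held as an H x W x 3 array and using SciPy's gaussian_filter as a stand-in for the Gaussian filtering step; the function name average_color_by_reduction is illustrative only.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def average_color_by_reduction(scene_image, sigma=1.0):
    """Blur and halve the image repeatedly until a single pixel remains.

    scene_image: float array of shape (H, W, 3) holding the simulated
    scene image. The color of the final unit pixel is returned and used
    as one group of ambient light color information.
    """
    img = np.asarray(scene_image, float)
    while img.shape[0] > 1 or img.shape[1] > 1:
        # Gaussian filtering equalizes (smooths) the colors before each reduction.
        img = gaussian_filter(img, sigma=(sigma, sigma, 0.0))
        # Keep every second row and column, reducing each side to ceil(n/2).
        img = img[::2, ::2, :]
    return img[0, 0]
```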
305: performing illumination rendering on the building block enclosed by the polyhedron based on the multiple groups of ambient light color information of the polyhedron, to obtain the lighting effect of the target object.
For any one polyhedron, step 304 yields the multiple groups of ambient light color information that the polyhedron applies to the building block it encloses, and each group of ambient light colors corresponds to one illumination direction in which the polyhedron acts on the building block. In this way, from the multiple groups of ambient light color information applied by the polyhedron to the building block on the multiple illumination directions, the lighting effect acting on the building block can be determined.
Since in the embodiments of this application each group of ambient light color information is determined from the pixels of discrete scene images, and the sampling rate at which the discrete scene images are sampled can reach a rather high level, the precision of the determined ambient light colors is improved. In addition, because this application simulates scene images and then determines from them the ambient light color that the polyhedron applies to the building block, the work is completed by GPU-based rendering; compared with determining the ambient light color information by CPU-based rendering, GPU-based rendering greatly improves computational efficiency and reduces computation time.
Meanwhile, in the embodiments of this application every scene image is captured in real time, so the contributions of both the dynamic and the static light sources in the scene to the simulated polyhedron are combined. The ambient light color information applied to the building blocks of the target object, determined from these scene images, is therefore more accurate, and the lighting effect of the target object can be rendered more faithfully.
It is understood that, in the embodiments of this application, the pixel color of each pixel in the simulated scene image is related to the position of the simulated camera in the scene, the pixel colors of the original spatial points within the camera's imaging range in the scene, and the positions of those original pixel points relative to the imaging position of the simulated camera. Therefore, when simulating the generation of a scene image, the pixel color of each pixel in the simulated scene image can be determined based on the above information, so as to finally generate the scene image.
Taking one target direction as an example, optionally, in one implementation, the space camera position of the simulated camera on the target direction in the scene may first be determined based on a preset distance between the camera and the polyhedron (e.g., the center, a vertex, or a face of the polyhedron). Then, the original pixel points within the camera's imaging range in the scene are determined, where, for ease of distinction, a spatial pixel point in the three-dimensional scene space within the camera's imaging range is called an original pixel point. Finally, according to the pixel color of an original pixel point, the first direction vector of the pixel position of the original pixel point in the scene relative to the space camera position, and the second direction vector corresponding to the imaging direction in which the camera images toward the polyhedron, the pixel color of each pixel in the scene image obtained by the simulated camera imaging the scene toward the polyhedron is simulated. The imaging direction in which the camera images toward the polyhedron is the same as the target direction; the difference is that the imaging direction of the camera here can be the vector direction from the camera to the imaging focus in the polyhedron.
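The per-pixel simulation described above (and spelled out in step 407 and claim 3 below) amounts to scaling the original point's color by the dot product of the two unit direction vectors. A sketch under that reading, with illustrative function and parameter names, follows.

```python
import numpy as np

def simulated_pixel_color(original_color, original_position,
                          camera_position, imaging_direction):
    """Simulate the color contributed to the scene image by one original point.

    original_color:     RGB color of the original pixel point in the scene.
    original_position:  3D pixel position of that point.
    camera_position:    space camera position of the simulated camera.
    imaging_direction:  direction in which the camera images toward the polyhedron.
    """
    # First direction vector: (pixel position - space camera position), as a unit vector.
    diff = np.asarray(original_position, float) - np.asarray(camera_position, float)
    first_dir = diff / np.linalg.norm(diff)
    # Second direction vector: the camera's imaging direction, as a unit vector.
    d = np.asarray(imaging_direction, float)
    second_dir = d / np.linalg.norm(d)
    # Pixel color = original color multiplied by dot(first, second).
    return np.asarray(original_color, float) * np.dot(first_dir, second_dir)
```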
It is understood that, in the embodiments of this application, in order to further improve the precision of the global illumination effect subsequently determined for the dynamic object, the number of target directions should not be too small.
In particular, when imaging toward different faces or different vertices of the polyhedron, the number of faces or vertices of the polyhedron should not be too small. Optionally, the polyhedron simulated in the embodiments of this application to enclose the building block may be a cube. To facilitate understanding of the scheme of the embodiments of this application, the scheme of this application is introduced below taking the polyhedron enclosing the building block being a cube as an example.
Referring to Fig. 4, it shows a schematic flowchart of another embodiment of the image processing method of this application. The method of this embodiment can be applied to the aforementioned computer device, and may include:
401: obtaining the information of the scene to be output.
It is understood that, in fields such as games or virtual reality, each frame of picture displayed by the computer device corresponds to a scene; the scene to be output can be understood as the scene corresponding to the picture to be output.
402: determining, from the scene, the target object on which illumination rendering is to be performed.
In the embodiments of this application, an object in the scene that requires illumination rendering is called a target object; the target object may be a static object or a dynamic object. As a preferred implementation, the target object of the embodiments of this application may be a dynamic object.
In particular, when the scene in the embodiments of this application is a game scene, the target object may be a game object in the game scene.
403: splitting, by simulation, the target object into multiple building blocks according to the composed structure of the target object.
It is understood that the composed structure of the target object can characterize the connection relationships among the building blocks that constitute the target object. For example, taking the human body as an example, the parts of the human body are all connected through joints. Therefore, in this embodiment the target object can be divided by simulation based on its composed structure, to obtain the multiple building blocks of the target object.
404: for any one building block, simulating a cube that encloses the building block.
In this embodiment, the description takes the polyhedron enclosing the human body as an enclosed space being a regular cube as an example.
Since each building block lies within the enclosed space of a cube, the lighting effect of each building block can be considered to be influenced by the ambient light color of that cube.
To facilitate understanding of the spatial relationship between the simulated cubes and the building blocks, refer to Fig. 5a and Fig. 5b. The object included in the scene shown in Fig. 5a is a human body 501. In order to determine the lighting effect of the human body in the scene, the lighting effects of the body parts of the human body can be analyzed separately, so as to finally obtain the global illumination effect of the human body. For example, referring to Fig. 5b, the human body 501 can be modeled as composed of multiple body parts; as shown in Fig. 5b, the head, the upper and lower joint parts of the arms, the upper and lower joint parts of the legs, the hips, and the chest and trunk of the human body are simulated as multiple building blocks, and each building block is enclosed by a cube 502, so that the human body is covered by multiple cubes 502.
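As an illustrative sketch of how such a decomposition and its enclosing cubes might be represented in code (the class names, fields, and the hypothetical body-part coordinates are assumptions for illustration, not part of the patent):

```python
from dataclasses import dataclass, field
import numpy as np

@dataclass
class BuildingBlock:
    name: str                      # e.g. "head", "left_upper_arm"
    center: np.ndarray             # center of the enclosing cube in scene space
    half_extent: float             # half of the cube's edge length
    # One ambient color per cube face, filled in by steps 405-408 below.
    face_ambient_colors: dict = field(default_factory=dict)

@dataclass
class TargetObject:
    name: str
    blocks: list                   # the simulated building blocks

# Hypothetical decomposition of the human object 501 of Fig. 5a/5b.
human = TargetObject("human_501", blocks=[
    BuildingBlock("head",           np.array([0.0, 1.7, 0.0]), 0.15),
    BuildingBlock("chest_trunk",    np.array([0.0, 1.3, 0.0]), 0.25),
    BuildingBlock("hips",           np.array([0.0, 0.9, 0.0]), 0.20),
    BuildingBlock("left_upper_arm", np.array([-0.3, 1.4, 0.0]), 0.12),
    # ... remaining limbs omitted for brevity
])
```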
405: for any one face of the cube, on the target direction pointing perpendicularly at the face from the inside of the cube, simulating the space camera position required for the camera to image the scene toward that face.
In this embodiment, the spatial position point where the simulated camera is located is called the space camera position. Specifically, the space camera position at which the simulated camera images on the target direction toward the face can be determined based on a preset vertical distance between the simulated camera and the face.
When the camera is simulated toward some face of the cube, in order for the camera to capture the scene information of the spatial range corresponding to that face, the target direction can be the target direction pointing perpendicularly at the face from the center of the cube. Accordingly, the simulated camera can lie on the line that is perpendicular to the face and passes through the center point of the face, and the vertical distance between the simulated camera and the center of the face is a predetermined distance.
406: determining the original pixel points within the imaging range of the camera in the scene.
In order to distinguish them from the pixels in the scene image generated by the simulated camera imaging the scene, the pixel points in the scene that lie within the imaging range of the camera are called original pixel points.
407: according to the pixel color of an original pixel point, the first direction vector obtained by converting into a unit vector the difference of the pixel position of the original pixel point minus the space camera position, and the second direction vector obtained by converting the imaging direction of the camera into a unit vector, simulating the pixel color of each pixel in the scene image captured by the camera.
Since the original pixel points are pixel points originally present in the scene, the pixel color of an original pixel point is known and can be obtained directly from the scene.
For ease of distinction, in the embodiments of this application, the position (in other words, the coordinate position) of an original pixel point in the scene is called the pixel position of that original pixel point.
The pixel color of each pixel in the scene image is: the pixel color of the original pixel point having a mapping relationship with that pixel, multiplied by the vector dot product of the first direction vector and the second direction vector.
In the embodiments of this application, the first direction vector of the pixel position of the original pixel point in the scene relative to the space camera position is specifically: the direction vector obtained by converting into a unit vector the difference obtained by subtracting the space camera position from the pixel position of the original pixel point. Correspondingly, the second direction vector corresponding to the imaging direction in which the camera images toward the face is specifically: the direction vector obtained by converting the imaging direction in which the camera images toward the face into a unit vector.
408: gradually reducing the scene image and using a Gaussian filter to average the pixel colors of the pixels in the reduced scene image, until the pixel color of a single unit pixel is obtained, and taking the pixel color of that unit pixel as the ambient light color on the face of the cube.
409: for any one building block of the target object, applying the ambient light color information on the six faces of the cube enclosing the building block to the building block, so as to render the lighting effect of the target object.
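Step 409 does not prescribe how the six per-face colors are combined over the surface of the building block. A commonly used rule for such an "ambient cube" is to weight each face color by the squared component of the surface normal pointing toward that face; the sketch below shows that assumed weighting only, not a rule mandated by the patent.

```python
import numpy as np

def ambient_cube_lighting(normal, face_colors):
    """Combine six per-face ambient colors at a surface point of the block.

    normal:      unit surface normal at a point on the building block.
    face_colors: dict with keys '+x','-x','+y','-y','+z','-z' mapping to
                 the RGB colors produced by steps 405-408 for each cube face.
    Weighting by squared normal components is an assumed, commonly used
    rule; the patent only requires that all six colors act on the block.
    """
    n = np.asarray(normal, float)
    n2 = n * n                      # squared components sum to 1 for a unit normal
    color = np.zeros(3)
    for axis, (pos_key, neg_key) in enumerate((('+x', '-x'), ('+y', '-y'), ('+z', '-z'))):
        key = pos_key if n[axis] >= 0.0 else neg_key
        color += n2[axis] * np.asarray(face_colors[key], float)
    return color
```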
410: outputting to the display the scene picture that includes the lighting effect of the target object.
It should be noted that Fig. 4 is introduced taking the polyhedron enclosing each building block being a cube as an example. It is understood, however, that the cases in which other polyhedrons are used to enclose the building blocks are similar to this embodiment; the only difference is that when the polyhedron differs, the number of faces of the polyhedron differs, while the process for determining the ambient light color corresponding to each face is the same as in this embodiment and is not repeated here.
It is understood that Fig. 4 is introduced taking the camera being simulated, on multiple target directions toward the individual faces of the polyhedron, to image the scene as an example. Correspondingly, the process of simulating, on multiple target directions toward the individual vertices of the polyhedron, the camera that images the scene can be similar to the process shown in the embodiment of Fig. 4; the only difference lies in setting the orientation of the camera on the directions pointing from the inside of the polyhedron toward its vertices, while the process of generating the simulated scene images and processing the scene images to determine the multiple groups of ambient light colors corresponding to the polyhedron is the same and is not repeated here.
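Tying the earlier sketches together, a hypothetical end-to-end routine for one building block (steps 404 to 408) might look as follows; it reuses the assumed helpers cube_camera_setups, simulated_pixel_color, and average_color_by_reduction defined above, and the crude visibility test and the scene_points representation are illustrative simplifications rather than the GPU rasterization the patent envisions.

```python
import numpy as np

FACE_KEYS = ('+x', '-x', '+y', '-y', '+z', '-z')

def compute_face_ambient_colors(block, scene_points, camera_distance=0.05):
    """Compute one ambient color per cube face of a BuildingBlock.

    scene_points: iterable of (position, color) pairs for the original
    pixel points of the scene (positions and colors as 3-vectors).
    """
    colors = {}
    setups = cube_camera_setups(block.center, block.half_extent, camera_distance)
    for key, (cam_pos, view_dir) in zip(FACE_KEYS, setups):
        # Simulate a tiny scene image: one simulated pixel per point in front of the camera.
        pixels = [simulated_pixel_color(c, p, cam_pos, view_dir)
                  for p, c in scene_points
                  if np.dot(np.asarray(p, float) - cam_pos, view_dir) > 0.0]
        image = (np.asarray(pixels, float).reshape(-1, 1, 3)
                 if pixels else np.zeros((1, 1, 3)))
        colors[key] = average_color_by_reduction(image)
    block.face_ambient_colors = colors
    return colors
```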
An apparatus for simulating global illumination provided by an embodiment of the present invention is introduced below. The apparatus for simulating global illumination described below and the method for simulating global illumination described above may be cross-referenced with each other.
Referring to Fig. 6, it shows a schematic structural diagram of one embodiment of an apparatus for simulating global illumination of this application. The apparatus of this embodiment may include:
an object splitting unit 601, configured to split, by simulation, a target object that requires illumination rendering in a scene to be output into multiple building blocks;
a space construction unit 602, configured to, for any one building block, simulate a polyhedron that encloses the building block;
a camera simulation unit 603, configured to, on multiple target directions toward the polyhedron, simulate a camera that images the scene from each direction, to obtain several scene images simulated from the multiple target directions;
a light color determination unit 604, configured to, for a scene image simulated on any one target direction, determine, according to the pixel color of each pixel in the scene image, one group of ambient light color information that the polyhedron applies to the building block, thereby obtaining multiple groups of ambient light color information corresponding to the multiple target directions;
an illumination rendering unit 605, configured to perform illumination rendering on the building block enclosed by the polyhedron based on the multiple groups of ambient light color information of the polyhedron, to obtain the lighting effect of the target object.
Optionally, the camera simulation unit includes:
a direction determination unit, configured to determine the multiple target directions toward the polyhedron;
a position determination unit, configured to, on any one target direction toward the polyhedron, simulate the space camera position required for the camera to image toward the polyhedron, and to trigger the operation of a pixel matching unit;
the pixel matching unit, configured to determine the original pixel points within the imaging range of the camera in the scene;
an image simulation unit, configured to simulate, according to the pixel colors of the original pixel points determined by the pixel matching unit, the first direction vectors of the pixel positions of the original pixel points in the scene relative to the space camera position, and the second direction vector corresponding to the imaging direction in which the camera images toward the polyhedron, the pixel color of each pixel in the scene image obtained by the camera imaging the scene.
Optionally, the first direction vector, determined by the image simulation unit, of the pixel position of an original pixel point in the scene relative to the space camera position is: the direction vector obtained by converting a target difference into a unit vector, wherein the target difference is the difference obtained by subtracting the space camera position from the pixel position of the original pixel point;
the second direction vector, determined by the image simulation unit, corresponding to the imaging direction in which the camera images toward the polyhedron is: the direction vector obtained by converting the imaging direction in which the camera images toward the polyhedron into a unit vector;
the pixel color of each pixel in the scene image determined by the image simulation unit is: the pixel color of the original pixel point having a mapping relationship with that pixel, multiplied by the vector dot product of the first direction vector and the second direction vector.
Optionally, the light color determination unit includes:
a light color determination subunit, configured to, for a scene image simulated on any one target direction, average the pixel colors of the pixels in the scene image, and take the averaged pixel color as one group of ambient light color information that the polyhedron can produce.
Optionally, when averaging the pixel colors of the pixels in the scene image, the light color determination subunit is specifically configured to gradually reduce the scene image and use a Gaussian filter to average the pixel colors of the pixels in the reduced scene image, until the pixel color of a single unit pixel is obtained.
Optionally, the camera simulation unit is configured to, on multiple target directions toward different faces of the polyhedron, simulate the camera that images the scene from each direction, to obtain several scene images simulated from the multiple target directions.
Optionally, the space construction unit is specifically configured to, for any one building block, simulate a cube that encloses the building block.
It should be noted that the embodiments in this specification are described in a progressive manner; each embodiment focuses on its differences from the other embodiments, and the same or similar parts among the embodiments may refer to each other. For the apparatus embodiments, since they are substantially similar to the method embodiments, the description is relatively simple, and for relevant parts reference may be made to the description of the method embodiments.
Finally, it should also be noted that, in this document, relational terms such as first and second are used only to distinguish one entity or operation from another, and do not necessarily require or imply any such actual relationship or order between these entities or operations. Moreover, the terms "include", "comprise", or any other variant thereof are intended to cover non-exclusive inclusion, so that a process, method, article, or device that includes a series of elements includes not only those elements but also other elements not expressly listed, or elements inherent to such a process, method, article, or device. Without further limitation, an element defined by the phrase "including a ..." does not exclude the existence of other identical elements in the process, method, article, or device that includes that element.
The foregoing description of the disclosed embodiments enables those skilled in the art to implement or use the present invention. Various modifications to these embodiments will be apparent to those skilled in the art, and the general principles defined herein may be implemented in other embodiments without departing from the spirit or scope of the present invention. Therefore, the present invention is not limited to the embodiments shown herein, but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.
The above are only preferred embodiments of the present invention. It should be noted that those skilled in the art can make various improvements and modifications without departing from the principle of the present invention, and these improvements and modifications should also be regarded as falling within the protection scope of the present invention.

Claims (12)

1. A method of simulating global illumination, characterized by comprising:
splitting, by simulation, a target object that requires illumination rendering in a scene to be output into multiple building blocks;
for any one building block, simulating a polyhedron that encloses the building block;
on multiple target directions toward the polyhedron, simulating a camera that images the scene from each direction, to obtain several scene images simulated from the multiple target directions, wherein the multiple target directions are directions pointing from the inside of the polyhedron toward faces or vertices of the polyhedron;
for a scene image simulated on any one target direction, determining, according to the pixel color of each pixel in the scene image, one group of ambient light color information that the polyhedron applies to the building block, thereby obtaining multiple groups of ambient light color information corresponding to the multiple target directions;
performing illumination rendering on the building block enclosed by the polyhedron based on the multiple groups of ambient light color information of the polyhedron, to obtain the lighting effect of the target object.
2. The method according to claim 1, characterized in that, on the multiple target directions toward the polyhedron, simulating the camera that images the scene from each direction comprises:
determining the multiple target directions toward the polyhedron;
on any one target direction toward the polyhedron, simulating the space camera position required for the camera to image toward the polyhedron;
determining original pixel points within the imaging range of the camera in the scene;
simulating, according to the pixel colors of the original pixel points, the first direction vectors of the pixel positions of the original pixel points in the scene relative to the space camera position, and the second direction vector corresponding to the imaging direction in which the camera images toward the polyhedron, the pixel color of each pixel in the scene image obtained by the camera imaging the scene.
3. The method according to claim 2, characterized in that the first direction vector of the pixel position of the original pixel point in the scene relative to the space camera position is: a direction vector obtained by converting a target difference into a unit vector, wherein the target difference is the difference obtained by subtracting the space camera position from the pixel position of the original pixel point;
the second direction vector corresponding to the imaging direction in which the camera images toward the polyhedron is: a direction vector obtained by converting the imaging direction in which the camera images toward the polyhedron into a unit vector; and
the pixel color of each pixel in the scene image is: the pixel color of the original pixel point having a mapping relationship with that pixel, multiplied by the vector dot product of the first direction vector and the second direction vector.
4. The method according to any one of claims 1 to 3, characterized in that determining, according to the pixel color of each pixel in the scene image, one group of ambient light color information that the polyhedron applies to the building block comprises:
averaging the pixel colors of the pixels in the scene image, and taking the averaged pixel color as one group of ambient light color information that the polyhedron can produce.
5. The method according to claim 4, characterized in that averaging the pixel colors of the pixels in the scene image comprises:
gradually reducing the scene image and using a Gaussian filter to average the pixel colors of the pixels in the reduced scene image, until the pixel color of a single unit pixel is obtained.
6. The method according to any one of claims 1 to 3, characterized in that the multiple target directions toward the polyhedron comprise:
multiple target directions pointing from the inside of the polyhedron toward different faces of the polyhedron.
7. The method according to any one of claims 1 to 3, characterized in that, for any one building block, simulating the polyhedron that encloses the building block comprises:
for any one building block, simulating a cube that encloses the building block.
8. A device for simulating global illumination, comprising:
an object splitting unit, configured to split, by simulation, a target object that needs illumination rendering in a scene to be output into multiple building blocks;
a space construction unit, configured to simulate, for any one building block, a polyhedron surrounding the building block;
a camera simulation unit, configured to simulate a camera imaging the scene toward each of multiple target directions of the polyhedron, and obtain scene images simulated in the multiple target directions, wherein the multiple target directions are directions pointing from the interior of the polyhedron toward faces or vertices of the polyhedron;
a light color determination unit, configured to determine, for the scene image simulated in any one target direction and according to the pixel color of each pixel in that scene image, a group of ambient light color information that the polyhedron exerts on the building block, so as to obtain multiple groups of ambient light color information corresponding to the multiple target directions; and
an illumination rendering unit, configured to perform, based on the multiple groups of ambient light color information of the polyhedron, illumination rendering on the building block surrounded by the polyhedron, to obtain the lighting effect of the target object.
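To make the relationship between the units of claim 8 easier to follow, the skeleton below chains them in order: split the object into building blocks, enclose each block, image the scene toward each cube face, reduce each image to one ambient color, and light the block with the six resulting colors. Every type and helper here (Scene, BuildingBlock, splitIntoBlocks, renderFace, averageImage, applyAmbientLight) is a hypothetical stub standing in for the corresponding unit, not the patent's implementation.

#include <array>
#include <vector>

struct Color { float r, g, b; };
struct Image { int width = 1; int height = 1; std::vector<Color> pixels{ { 0.f, 0.f, 0.f } }; };
struct Scene {};                                           // scene to be output (stand-in)
struct BuildingBlock { std::array<Color, 6> ambient{}; };  // one split piece of the target object (stand-in)

// Object splitting unit: split the target object into building blocks (stubbed to one block).
std::vector<BuildingBlock> splitIntoBlocks(const Scene&) { return std::vector<BuildingBlock>(1); }

// Camera simulation unit: image the scene toward one face of the block's enclosing cube (stubbed).
Image renderFace(const Scene&, const BuildingBlock&, int /*face*/) { return Image{}; }

// Light color determination unit: reduce a simulated scene image to a single ambient color.
Color averageImage(const Image& img) {
    Color sum{ 0.f, 0.f, 0.f };
    for (const Color& c : img.pixels) { sum.r += c.r; sum.g += c.g; sum.b += c.b; }
    float n = static_cast<float>(img.pixels.size());
    return { sum.r / n, sum.g / n, sum.b / n };
}

// Illumination rendering unit: apply the six ambient colors to the block (stubbed to storing them).
void applyAmbientLight(BuildingBlock& block, const std::array<Color, 6>& ambient) { block.ambient = ambient; }

void simulateGlobalIllumination(const Scene& scene) {
    std::vector<BuildingBlock> blocks = splitIntoBlocks(scene);
    for (BuildingBlock& block : blocks) {
        std::array<Color, 6> ambient{};                  // one color per cube face / target direction
        for (int face = 0; face < 6; ++face) {
            Image view = renderFace(scene, block, face); // simulate the camera toward this face
            ambient[face] = averageImage(view);          // reduce the image to one ambient color
        }
        applyAmbientLight(block, ambient);               // light the block with the six colors
    }
}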
9. The device according to claim 8, wherein the camera simulation unit comprises:
a direction determination unit, configured to determine the multiple target directions toward the polyhedron;
a position determination unit, configured to simulate, for any one target direction toward the polyhedron, the spatial camera position required for the camera to image toward the polyhedron, and trigger the operation of a pixel matching unit;
the pixel matching unit, configured to determine the original pixels in the scene within the imaging range of the camera; and
an image simulation unit, configured to simulate, according to the pixel colors of the original pixels determined by the pixel matching unit, the first direction vectors of the pixel positions of those original pixels in the scene relative to the spatial camera position, and the second direction vector corresponding to the imaging direction in which the camera images toward the polyhedron, the pixel color of each pixel in the scene image obtained by the camera capturing the scene.
10. The device according to claim 8 or 9, wherein the light color determination unit comprises:
a light color determination subunit, configured to average, for the scene image simulated in any one target direction, the pixel colors of the pixels in that scene image, and use the averaged pixel color as the group of ambient light color information generated by the polyhedron.
11. The device according to claim 10, wherein, when averaging the pixel colors of the pixels in the scene image, the light color determination subunit is specifically configured to progressively downscale the scene image and average, with a Gaussian filter, the pixel colors of the pixels in each downscaled scene image, until the pixel color of a single unit pixel is obtained.
12. The device according to claim 8 or 9, wherein the camera simulation unit is configured to simulate a camera imaging the scene toward each of multiple target directions of different faces of the polyhedron, and obtain scene images simulated in the multiple target directions.
CN201611027685.4A 2016-11-17 2016-11-17 The method and apparatus for simulating global illumination Active CN106504315B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201611027685.4A CN106504315B (en) 2016-11-17 2016-11-17 The method and apparatus for simulating global illumination

Publications (2)

Publication Number Publication Date
CN106504315A CN106504315A (en) 2017-03-15
CN106504315B (en) 2018-09-07

Family

ID=58327426

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201611027685.4A Active CN106504315B (en) 2016-11-17 2016-11-17 The method and apparatus for simulating global illumination

Country Status (1)

Country Link
CN (1) CN106504315B (en)

Families Citing this family (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108236783B (en) * 2018-01-09 2020-10-23 网易(杭州)网络有限公司 Method and device for simulating illumination in game scene, terminal equipment and storage medium
CN108520551B (en) * 2018-03-30 2022-02-22 苏州蜗牛数字科技股份有限公司 Method for realizing dynamic illumination of light map, storage medium and computing equipment
CN109191398B (en) * 2018-08-29 2021-08-03 Oppo广东移动通信有限公司 Image processing method, image processing device, computer-readable storage medium and electronic equipment
CN110288692B (en) 2019-05-17 2021-05-11 腾讯科技(深圳)有限公司 Illumination rendering method and device, storage medium and electronic device
CN110675479B (en) * 2019-10-14 2023-05-16 北京代码乾坤科技有限公司 Dynamic illumination processing method and device, storage medium and electronic device
CN112562051B (en) * 2020-11-30 2023-06-27 腾讯科技(深圳)有限公司 Virtual object display method, device, equipment and storage medium
CN113873156A (en) * 2021-09-27 2021-12-31 北京有竹居网络技术有限公司 Image processing method and device and electronic equipment
CN115100339B (en) * 2022-06-15 2023-06-20 北京百度网讯科技有限公司 Image generation method, device, electronic equipment and storage medium
CN115423906B (en) * 2022-09-02 2023-07-07 北京城市网邻信息技术有限公司 Solar simulation method and device, electronic equipment and storage medium

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103077552A (en) * 2012-12-27 2013-05-01 浙江大学 Three-dimensional displaying method based on multi-view video
CN104361624A (en) * 2014-11-20 2015-02-18 南京大学 Method for rendering global illumination in computer three-dimensional model
CN105989624A (en) * 2015-02-11 2016-10-05 华为技术有限公司 Method used for drawing global illumination scene and apparatus thereof

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7227548B2 (en) * 2004-05-07 2007-06-05 Valve Corporation Method and system for determining illumination of models using an ambient cube
KR100609145B1 (en) * 2004-12-20 2006-08-08 한국전자통신연구원 Rendering Apparatus and Method for real-time global illumination in real light environment

Also Published As

Publication number Publication date
CN106504315A (en) 2017-03-15

Similar Documents

Publication Publication Date Title
CN106504315B (en) The method and apparatus for simulating global illumination
CN106780709B (en) A kind of method and device of determining global illumination information
CN104732585B (en) A kind of method and device of human somatotype reconstruct
CN106780707B (en) The method and apparatus of global illumination in simulated scenario
US20170287196A1 (en) Generating photorealistic sky in computer generated animation
CN111161422A (en) Model display method for enhancing virtual scene implementation
CN112712582B (en) Dynamic global illumination method, electronic device and computer readable storage medium
CN116091676B (en) Face rendering method of virtual object and training method of point cloud feature extraction model
CN116977522A (en) Rendering method and device of three-dimensional model, computer equipment and storage medium
CN110147737A (en) For generating the method, apparatus, equipment and storage medium of video
CN112308977A (en) Video processing method, video processing apparatus, and storage medium
Chae et al. Introduction of physics simulation in augmented reality
US10909752B2 (en) All-around spherical light field rendering method
CN115965727A (en) Image rendering method, device, equipment and medium
US20140285513A1 (en) Animation of a virtual object
CN116342782A (en) Method and apparatus for generating avatar rendering model
Zeng et al. Motion capture and reconstruction based on depth information using Kinect
CN114862997A (en) Image rendering method and apparatus, medium, and computer device
CA3172140A1 (en) Full skeletal 3d pose recovery from monocular camera
Jarabo et al. Rendering relativistic effects in transient imaging
CN108140256A (en) Orientation information based on display shows method, equipment and the program that the 3D of object is represented
CN110148202A (en) For generating the method, apparatus, equipment and storage medium of image
Hempe et al. A semantics-based, active render framework to realize complex eRobotics applications with realistic virtual testing environments
CN117611727B (en) Rendering processing method, device, equipment and medium
TWI797761B (en) Display method of virtual reality

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant