CN108038897A - Shadow map generation method and device - Google Patents

Shadow map generation method and device

Info

Publication number
CN108038897A
CN108038897A
Authority
CN
China
Prior art keywords
texture
view
image
matrix
shadow
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201711277706.2A
Other languages
Chinese (zh)
Other versions
CN108038897B (en)
Inventor
吕天胜
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Pixel Software Technology Co Ltd
Original Assignee
Beijing Pixel Software Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Pixel Software Technology Co Ltd filed Critical Beijing Pixel Software Technology Co Ltd
Priority to CN201711277706.2A priority Critical patent/CN108038897B/en
Publication of CN108038897A publication Critical patent/CN108038897A/en
Application granted granted Critical
Publication of CN108038897B publication Critical patent/CN108038897B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00 3D [Three Dimensional] image rendering
    • G06T15/10 Geometric effects
    • G06T15/20 Perspective computation
    • G06T15/205 Image-based rendering
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00 3D [Three Dimensional] image rendering
    • G06T15/50 Lighting effects
    • G06T15/60 Shadow generation

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Graphics (AREA)
  • General Physics & Mathematics (AREA)
  • Computing Systems (AREA)
  • Geometry (AREA)
  • Image Generation (AREA)

Abstract

The present invention provides a shadow map generation method and device. The method includes: creating a texture map according to a preset number of shadow images and a preset image size, and dividing the texture map into multiple texture regions of the same size; splitting the camera space at the current viewing angle in parallel according to the preset number of shadow images and, based on the light direction in the current game scene, generating a view-projection matrix under the current light direction for each view space obtained by the splitting; rendering, in the corresponding texture region and in turn according to the view-projection matrix of each view space, a shadow depth image of the objects projected into that view space under the current light direction; and performing a matrix transformation on each view-projection matrix to obtain the conversion matrix and the texture coordinate range corresponding to each texture region. Shadow images of different levels in the shadow map generated by the method can be sampled at the same time, which improves rendering efficiency and the transition effect between levels.

Description

Shadow map generation method and device
Technical field
The present invention relates to the technical field of game scene processing, and in particular to a shadow map generation method and device.
Background technology
To simulate changing environmental conditions as realistically as possible in a 3D game, real-time shadow rendering is often required to improve image quality.
In the prior art, a common real-time shadow rendering scheme generates multiple different shadow maps for the objects in a game scene and, when rendering shadows, selects different shadow maps for sampling according to the depth values of the objects. However, the shadow maps generated by this scheme make shadow rendering inefficient: shadow maps of different levels cannot be sampled at the same time, and for pixels located between shadow maps of adjacent levels the transition between levels is poor.
Summary of the invention
To overcome the above deficiencies in the prior art, an object of the present invention is to provide a shadow map generation method and device. Shadow images of different levels in the shadow map generated by the method can be sampled at the same time, which improves the rendering efficiency and level transition effect of real-time shadow rendering.
Regarding the method, a preferred embodiment of the present invention provides a shadow map generation method, the method including:
creating a texture map according to a preset number of shadow images and a preset image size, and dividing the texture map into multiple texture regions of the same size, wherein the total number of texture regions is equal to the preset number of shadow images;
splitting the camera space at the current viewing angle in parallel according to the preset number of shadow images, and, based on the light direction in the current game scene, generating a view-projection matrix under the current light direction for each view space obtained by the splitting;
rendering, in the corresponding texture region and in turn according to the view-projection matrix of each view space, a shadow depth image of the objects projected into that view space under the current light direction;
performing a matrix transformation on the view-projection matrix of each view space to obtain the conversion matrix corresponding to each texture region and the texture coordinate range of each shadow depth image in the texture map, so as to generate the corresponding shadow texture map.
Regarding the device, a preferred embodiment of the present invention provides a shadow map generating device, the device including:
a texture creation module, configured to create a texture map according to a preset number of shadow images and a preset image size, and divide the texture map into multiple texture regions of the same size, wherein the total number of texture regions is equal to the preset number of shadow images;
a matrix generation module, configured to split the camera space at the current viewing angle in parallel according to the preset number of shadow images and, based on the light direction in the current game scene, generate a view-projection matrix under the current light direction for each view space obtained by the splitting;
a shadow rendering module, configured to render, in the corresponding texture region and in turn according to the view-projection matrix of each view space, a shadow depth image of the objects projected into that view space under the current light direction;
a matrix transformation module, configured to perform a matrix transformation on the view-projection matrix of each view space to obtain the conversion matrix corresponding to each texture region and the texture coordinate range of each shadow depth image in the texture map, so as to generate the corresponding shadow texture map.
Compared with the prior art, the shadow map generation method and device provided by the preferred embodiments of the present invention have the following beneficial effects: shadow images of different levels in the generated shadow map can be sampled at the same time, which improves the rendering efficiency and level transition effect of real-time shadow rendering. First, a texture map is created according to a preset number of shadow images and a preset image size and divided into multiple texture regions of the same size, the total number of texture regions being equal to the preset number of shadow images. Then the camera space at the current viewing angle is split in parallel according to the preset number of shadow images, and, based on the light direction in the current game scene, a view-projection matrix is generated under the current light direction for each view space obtained by the splitting. Next, according to the view-projection matrix of each view space in turn, a shadow depth image of the objects projected into that view space under the current light direction is rendered into the corresponding texture region. Finally, a matrix transformation is applied to the view-projection matrix of each view space to obtain the conversion matrix corresponding to each texture region and the texture coordinate range of each shadow depth image in the texture map, and the corresponding shadow texture map is generated. A single texture map thus contains shadow images of different levels, ensuring that shadow images of different levels can be sampled at the same time and improving the rendering efficiency and level transition effect of real-time shadow rendering.
To make the above objects, features and advantages of the present invention clearer and easier to understand, preferred embodiments are described in detail below with reference to the accompanying drawings.
Brief description of the drawings
To describe the technical solutions of the embodiments of the present invention more clearly, the drawings required in the embodiments are briefly introduced below. It should be understood that the following drawings show only certain embodiments of the present invention and therefore should not be regarded as limiting the scope of the claims of the present invention; those of ordinary skill in the art may derive other related drawings from these drawings without creative effort.
Fig. 1 is a block diagram of the computing device provided by a preferred embodiment of the present invention.
Fig. 2 is a flow diagram of the shadow map generation method provided by a preferred embodiment of the present invention.
Fig. 3 is a block diagram of the shadow map generating device shown in Fig. 1 provided by a preferred embodiment of the present invention.
Fig. 4 is a block diagram of the shadow rendering module shown in Fig. 3.
Reference numerals: 10 - computing device; 11 - memory; 12 - processor; 13 - communication unit; 14 - graphics card unit; 100 - shadow map generating device; 110 - texture creation module; 120 - matrix generation module; 130 - shadow rendering module; 140 - matrix transformation module; 131 - classification submodule; 132 - rendering submodule.
Detailed description of the embodiments
To make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention are described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention. The components of the embodiments of the present invention, as generally described and illustrated in the accompanying drawings, may be arranged and designed in a variety of configurations.
Therefore, the following detailed description of the embodiments of the present invention provided in the accompanying drawings is not intended to limit the claimed scope of the present invention, but merely represents selected embodiments of the present invention. All other embodiments obtained by those of ordinary skill in the art based on the embodiments of the present invention without creative effort fall within the scope of protection of the present invention.
It should be noted that similar reference numerals and letters denote similar items in the following drawings; therefore, once an item is defined in one drawing, it does not need to be further defined and explained in subsequent drawings.
Some embodiments of the present invention are described in detail below with reference to the accompanying drawings. The following embodiments and the features in the embodiments may be combined with each other as long as they do not conflict.
Referring to Fig. 1, which is a block diagram of the computing device 10 provided by a preferred embodiment of the present invention. In the embodiment of the present invention, the computing device 10 can generate, for an object in the current game scene that requires real-time shadow rendering, a corresponding shadow map containing shadow images of different levels, so that when the object is rendered in real time, shadow images of different levels in the shadow map can be sampled at the same time, improving the rendering efficiency and level transition effect of real-time shadow rendering. In this embodiment, the computing device 10 may be, but is not limited to, a personal computer (PC), a tablet computer, a personal digital assistant (PDA), or a server with image processing capability.
In this embodiment, the computing device 10 may include a shadow map generating device 100, a memory 11, a processor 12, a communication unit 13 and a graphics card unit 14. The memory 11, the processor 12, the communication unit 13 and the graphics card unit 14 are electrically connected to one another, directly or indirectly, to enable data transmission or interaction; for example, these elements may be electrically connected through one or more communication buses or signal lines. The shadow map generating device 100 includes at least one software functional module that can be stored in the memory 11 in the form of software or firmware. The processor 12 runs the software programs and modules stored in the memory 11 to perform various functional applications and data processing.
In this embodiment, the memory 11 may be, but is not limited to, a random access memory (RAM), a read-only memory (ROM), a programmable read-only memory (PROM), an erasable programmable read-only memory (EPROM), an electrically erasable programmable read-only memory (EEPROM), etc. The memory 11 is used to store a program, and the processor 12 executes the program after receiving an execution instruction. Further, the software programs and modules in the memory 11 may also include an operating system, which may include various software components and/or drivers for managing system tasks (such as memory management, storage device control and power management) and which can communicate with various hardware or software components to provide a running environment for other software components.
In this embodiment, the processor 12 may be an integrated circuit chip with signal processing capability. The processor 12 may be a general-purpose processor, including a central processing unit (CPU), a network processor (NP), etc., and can implement or execute the methods, steps and logic diagrams disclosed in the embodiments of the present invention. The general-purpose processor may be a microprocessor, or the processor may be any conventional processor or the like.
In this embodiment, the communication unit 13 is used to establish a communication connection between the computing device 10 and other external devices through a network and to transmit data through the network.
In this embodiment, the graphics card unit 14 is used to perform computation on graphics data so as to relieve the computing load of the processor 12. The core component of the graphics card unit 14 is a GPU (Graphics Processing Unit), which converts and drives the graphics data required by the computing device 10 and controls the display output.
In this embodiment, through the shadow map generating device 100 stored in the memory 11, the computing device 10 generates, for an object in the game scene, a shadow map containing shadow images of different levels, so that objects requiring real-time rendering in the game scene can be shadow-rendered through the shadow map with high rendering efficiency and a good level transition effect.
It can be understood that the structure shown in Fig. 1 is only a schematic structure of the computing device 10; the computing device 10 may include more or fewer components than shown in Fig. 1, or have a configuration different from that shown in Fig. 1. The components shown in Fig. 1 may be implemented in hardware, software or a combination thereof.
Referring to Fig. 2, which is a flow diagram of the shadow map generation method provided by a preferred embodiment of the present invention. In the embodiment of the present invention, the shadow map generation method is applied to the computing device 10 described above; the specific flow and steps of the shadow map generation method shown in Fig. 2 are described in detail below.
In the embodiment of the present invention, the shadow map generation method includes the following steps:
Step S210: a texture map is created according to a preset number of shadow images and a preset image size, and the texture map is divided into multiple texture regions of the same size.
In this embodiment, the preset number of shadow images indicates how many shadow images of different levels the computing device 10 needs to generate, and the preset image size indicates the length and width of each shadow image. The computing device 10 may obtain the preset number of shadow images and the preset image size over the network from other external devices in communication with the computing device 10, or may receive the preset number of shadow images and the preset image size entered by a game developer through an external input device. In this embodiment, the preset number of shadow images and the preset image size may be stored in the memory 11.
In this embodiment, the total number of texture regions into which the texture map is divided is equal to the preset number of shadow images, and the preset image size includes an image width and an image length. The step of creating a texture map according to the preset number of shadow images and the preset image size and dividing the texture map into multiple texture regions of the same size includes:
calculating the length of the texture map to be created according to the preset number of shadow images and the image length in the preset image size;
creating a texture map whose size matches the calculated length and the image width in the preset image size, and dividing the texture map into multiple texture regions along its length direction at intervals of the image length.
The texture map to be created is the texture map described above; each texture region is used to draw the corresponding shadow image, so that shadow images of different levels can be contained in a single texture map.
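For illustration only, the following C++ sketch shows one way the layout just described could be computed: the texture map is as long as the preset number of shadow images times the per-image length, and the regions sit side by side along that length. The Rect struct and the LayoutShadowAtlas function are hypothetical names introduced here, not part of the patent.

```cpp
#include <vector>

// Hypothetical helper type; the patent does not prescribe any particular API.
struct Rect { int x, y, width, height; };

// Lay out numShadowImages equally sized regions side by side along the
// length (x) direction of a single texture map, as described in step S210.
std::vector<Rect> LayoutShadowAtlas(int numShadowImages,
                                    int imageLength, int imageWidth) {
    // Total texture length = number of shadow images * per-image length;
    // this value would be used when allocating the GPU texture.
    const int textureLength = numShadowImages * imageLength;
    (void)textureLength;

    std::vector<Rect> regions;
    regions.reserve(numShadowImages);
    for (int i = 0; i < numShadowImages; ++i) {
        // The i-th region starts at an offset of i * imageLength.
        regions.push_back(Rect{i * imageLength, 0, imageLength, imageWidth});
    }
    return regions;
}
```

With three 1024x1024 shadow images, for example, this yields a 3072x1024 texture map with regions starting at x = 0, 1024 and 2048.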
Step S220: the camera space at the current viewing angle is split in parallel according to the preset number of shadow images, and, based on the light direction in the current game scene, a view-projection matrix is generated under the current light direction for each view space obtained by the splitting.
In this embodiment, the camera space is the space that the virtual camera can observe at the current viewing angle, the light direction is the direction in which the light source shines in the game scene, and the view spaces are the different range sections into which the computing device 10 divides the shooting range of the virtual camera. The shadow depth image of the objects in one view space differs from the shadow depth images of the other view spaces, and their levels also differ. The view-projection matrix of a view space is the matrix used to convert the world coordinates of an object into the corresponding view coordinates; different view spaces correspond to different view-projection matrices.
In this embodiment, the step of splitting the camera space at the current viewing angle in parallel according to the preset number of shadow images includes:
splitting the camera space in parallel, from near to far, into the preset number of sections according to the distance information between each position in the camera space and the viewpoint corresponding to the camera space, so that the view spaces obtained by the splitting increase successively in size from near to far.
The viewpoint corresponding to the camera space is the virtual camera itself, and the number of view spaces obtained is the same as the preset number of shadow images. The view space nearest to the virtual camera is the first-level view space, the view space adjacent to the first-level view space is the second-level view space, and so on until the last view space. Each view space corresponds to a shadow depth image of one level; for example, the first-level view space corresponds to the first-level shadow image, and the second-level view space corresponds to the second-level shadow image.
In this embodiment, the view-projection matrix of each view space is obtained based on the same light direction; adjacent view spaces have different view-projection matrices because their split positions differ. The view-projection matrix of a given view space changes as the light direction changes.
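As a rough illustration of this step, the sketch below (C++ with GLM, continuing the previous example) first splits the camera's depth range from near to far and then builds one light-space view-projection matrix per split. The blend of uniform and logarithmic split spacing and the bounding-sphere orthographic fit are common simplifications assumed here, not choices prescribed by the patent; ComputeSplitDistances and ComputeSplitViewProj are hypothetical names.

```cpp
#include <cmath>
#include <vector>
#include <glm/glm.hpp>
#include <glm/gtc/matrix_transform.hpp>

// Split the camera's [nearZ, farZ] range into numSplits sections from near
// to far. A blend of uniform and logarithmic spacing makes nearer view
// spaces smaller; the patent only requires that the view spaces grow from
// near to far, not this particular formula.
std::vector<float> ComputeSplitDistances(int numSplits, float nearZ, float farZ,
                                         float lambda = 0.75f) {
    std::vector<float> splits(numSplits + 1);
    for (int i = 0; i <= numSplits; ++i) {
        float p = static_cast<float>(i) / numSplits;
        float logSplit = nearZ * std::pow(farZ / nearZ, p);
        float uniSplit = nearZ + (farZ - nearZ) * p;
        splits[i] = lambda * logSplit + (1.0f - lambda) * uniSplit;
    }
    return splits;
}

// Build a view-projection matrix for one view space under the given light
// direction. For brevity the split is bounded by a sphere (center, radius)
// instead of a tight box around its frustum corners, and lightDir is assumed
// not to be parallel to the world up axis (0, 1, 0).
glm::mat4 ComputeSplitViewProj(const glm::vec3& splitCenter, float splitRadius,
                               const glm::vec3& lightDir) {
    glm::vec3 eye = splitCenter - glm::normalize(lightDir) * splitRadius;
    glm::mat4 lightView = glm::lookAt(eye, splitCenter, glm::vec3(0, 1, 0));
    glm::mat4 lightProj = glm::ortho(-splitRadius, splitRadius,
                                     -splitRadius, splitRadius,
                                     0.0f, 2.0f * splitRadius);
    return lightProj * lightView;
}
```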
Step S230: according to the view-projection matrix of each view space in turn, a shadow depth image of the objects projected into that view space under the current light direction is rendered in the corresponding texture region.
In this embodiment, the computing device 10 may finish rendering the shadow depth image of one view space before rendering the shadow depth image of the next view space. The step of rendering, in the corresponding texture region and in turn according to the view-projection matrix of each view space, a shadow depth image of the objects projected into that view space under the current light direction includes:
classifying the objects in the game scene that project into each view space in order from near to far;
according to the correspondence between texture regions and view spaces, drawing, in each texture region in turn and based on the corresponding view-projection matrix, the shadow images of the objects classified into the corresponding view space, thereby obtaining the corresponding shadow depth images.
The computing device 10 classifies the objects in the game scene by identifying, from near to far with the virtual camera as the starting point, the objects projecting into each view space. The texture regions correspond one to one, from left to right in the texture map, with the view spaces; in each texture region the shadow depth image of the objects projected into the matching view space is drawn according to the corresponding view-projection matrix, so that shadow depth images of different levels are drawn in the same texture map.
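Continuing the sketch (and reusing the Rect struct and the matrices from the earlier blocks), the loop below illustrates rendering each level's depth image into its own region of the shared texture by restricting the viewport and scissor rectangle per region. It assumes a current OpenGL context with the texture map bound as the depth render target and a loader such as glad; drawObjectsInSplit is a hypothetical callback that stands in for whatever draws the objects classified into split i.

```cpp
#include <functional>
#include <vector>
#include <glad/glad.h>   // assumes an OpenGL loader and a current GL context
#include <glm/glm.hpp>

// Render each view space's shadow depth image into its own region of the
// shared texture map. Rect comes from the LayoutShadowAtlas sketch and
// splitViewProj from the ComputeSplitViewProj sketch above.
void RenderShadowAtlas(
    const std::vector<Rect>& regions,
    const std::vector<glm::mat4>& splitViewProj,
    const std::function<void(int, const glm::mat4&)>& drawObjectsInSplit) {
    glEnable(GL_SCISSOR_TEST);
    for (size_t i = 0; i < regions.size(); ++i) {
        const Rect& r = regions[i];
        // Restrict rasterization and clearing to this split's texture region.
        glViewport(r.x, r.y, r.width, r.height);
        glScissor(r.x, r.y, r.width, r.height);
        glClear(GL_DEPTH_BUFFER_BIT);
        drawObjectsInSplit(static_cast<int>(i), splitViewProj[i]);
    }
    glDisable(GL_SCISSOR_TEST);
}
```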
Step S240: a matrix transformation is performed on the view-projection matrix of each view space to obtain the conversion matrix corresponding to each texture region and the texture coordinate range of each shadow depth image in the texture map, so as to generate the corresponding shadow texture map.
In this embodiment, once the shadow depth images of all levels have been drawn, the shadow texture map comprises the texture map together with the conversion matrix corresponding to each texture region and the texture coordinate range of each shadow depth image in the texture map. After the computing device 10 has drawn the shadow depth images of all levels in the texture map, it applies a matrix transformation to the view-projection matrix of each view space so that each resulting conversion matrix matches a texture region of the texture map; for a pixel of an object in the game scene, each conversion matrix can then be used to determine whether the texture coordinate of that pixel falls within the corresponding texture region of the texture map. For example, after the world coordinates of a pixel are transformed by the conversion matrix of the texture region corresponding to the first-level view space, a texture coordinate is obtained. If this texture coordinate falls within the texture coordinate range of the texture region corresponding to the first-level view space, the computing device 10 can perform shadow rendering with the shadow depth image in that texture region. If it falls outside that range, the computing device 10 transforms the world coordinates of the pixel with the conversion matrix of the texture region corresponding to the next-level view space, and again checks whether the resulting texture coordinate falls within the texture coordinate range of the texture region corresponding to the conversion matrix used.
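The per-pixel level selection just described can be pictured with the following sketch: each region carries a conversion matrix and a texture coordinate range, and the levels are tried from nearest to farthest until the transformed coordinate falls inside a region's range. The RegionLookup struct and SelectShadowRegion function are illustrative names; in practice this test would typically run in a pixel shader.

```cpp
#include <vector>
#include <glm/glm.hpp>

// One entry per texture region of the shadow texture map.
struct RegionLookup {
    glm::mat4 convMatrix;    // world coordinates -> texture map coordinates
    glm::vec2 uvMin, uvMax;  // texture coordinate range of this region
};

// Try the conversion matrix of each level from near to far and return the
// index of the first region whose texture coordinate range contains the
// transformed coordinate, or -1 if the point lies outside every level.
// With an orthographic light projection w stays 1, so no perspective divide
// is needed before the comparison.
int SelectShadowRegion(const glm::vec3& worldPos,
                       const std::vector<RegionLookup>& lookups) {
    for (size_t i = 0; i < lookups.size(); ++i) {
        glm::vec4 uv = lookups[i].convMatrix * glm::vec4(worldPos, 1.0f);
        if (uv.x >= lookups[i].uvMin.x && uv.x <= lookups[i].uvMax.x &&
            uv.y >= lookups[i].uvMin.y && uv.y <= lookups[i].uvMax.y) {
            return static_cast<int>(i);
        }
    }
    return -1;
}
```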
In this embodiment, the step of performing a matrix transformation on the view-projection matrix of each view space to obtain the conversion matrix corresponding to each texture region and the texture coordinate range of each shadow depth image in the texture map includes:
scaling and translating each view-projection matrix according to the preset number of shadow images and a coordinate adjustment strategy to obtain the conversion matrix corresponding to each texture region;
obtaining the texture coordinate range of the shadow depth image of each texture region according to the conversion matrix corresponding to each texture region and the distribution of the texture regions in the texture map.
In this embodiment, the coordinate adjustment strategy is used to adjust the coordinate range corresponding to each view-projection matrix. Through the coordinate adjustment strategy, the computing device 10 can map the coordinate range corresponding to each view-projection matrix to the coordinate range of the corresponding texture region in the texture map, thereby obtaining the texture coordinate range of the shadow depth image of each texture region. For example, if a texture map has three texture regions, the texture coordinate range of the first-level shadow depth image is the rectangle enclosed by the four texture coordinates (0, 0), (0.3333, 0), (0.3333, 1) and (0, 1); the texture coordinate range of the second-level shadow depth image is the rectangle enclosed by (0.3333, 0), (0.6666, 0), (0.6666, 1) and (0.3333, 1); and the texture coordinate range of the third-level shadow depth image is the rectangle enclosed by (0.6666, 0), (1, 0), (1, 1) and (0.6666, 1).
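One possible construction of the conversion matrices and ranges is sketched below, filling the RegionLookup entries from the previous sketch. It assumes OpenGL-style clip space (orthographic light projection, normalized device coordinates in [-1, 1]) and the left-to-right region layout used above; the exact scale and translation values are therefore an assumption, since the patent only states that the conversion matrices are obtained by scaling and translating the view-projection matrices.

```cpp
#include <vector>
#include <glm/glm.hpp>
#include <glm/gtc/matrix_transform.hpp>

// Turn each split's view-projection matrix into a conversion matrix that maps
// world coordinates directly into the shared texture map, plus the texture
// coordinate range of that split's region. RegionLookup is the struct from
// the previous sketch.
std::vector<RegionLookup> BuildRegionLookups(
    const std::vector<glm::mat4>& splitViewProj) {
    const int n = static_cast<int>(splitViewProj.size());
    std::vector<RegionLookup> lookups(n);
    for (int i = 0; i < n; ++i) {
        // Scale NDC [-1, 1] to [0, 1], then squeeze x into the i-th region.
        glm::mat4 bias =
            glm::translate(glm::mat4(1.0f),
                           glm::vec3((i + 0.5f) / n, 0.5f, 0.5f)) *
            glm::scale(glm::mat4(1.0f), glm::vec3(0.5f / n, 0.5f, 0.5f));
        lookups[i].convMatrix = bias * splitViewProj[i];
        lookups[i].uvMin = glm::vec2(float(i) / n, 0.0f);
        lookups[i].uvMax = glm::vec2(float(i + 1) / n, 1.0f);
    }
    return lookups;
}
```

With three regions this reproduces the ranges given in the example above: the first level maps to u in [0, 0.3333], the second to [0.3333, 0.6666] and the third to [0.6666, 1].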
Referring to Fig. 3, which is a block diagram of the shadow map generating device 100 shown in Fig. 1 provided by a preferred embodiment of the present invention. In the embodiment of the present invention, the shadow map generating device 100 includes a texture creation module 110, a matrix generation module 120, a shadow rendering module 130 and a matrix transformation module 140.
The texture creation module 110 is configured to create a texture map according to a preset number of shadow images and a preset image size, and divide the texture map into multiple texture regions of the same size.
In this embodiment, the total number of texture regions is equal to the preset number of shadow images, and the preset image size includes an image width and an image length. The way the texture creation module 110 creates a texture map according to the preset number of shadow images and the preset image size and divides the texture map into multiple texture regions of the same size includes:
calculating the length of the texture map to be created according to the preset number of shadow images and the image length in the preset image size;
creating a texture map whose size matches the calculated length and the image width in the preset image size, and dividing the texture map into multiple texture regions along its length direction at intervals of the image length.
The texture creation module 110 may perform step S210 shown in Fig. 2; for a specific description, refer to the detailed description of step S210 above.
The matrix generation module 120 is configured to split the camera space at the current viewing angle in parallel according to the preset number of shadow images and, based on the light direction in the current game scene, generate a view-projection matrix under the current light direction for each view space obtained by the splitting.
In this embodiment, the way the matrix generation module 120 splits the camera space at the current viewing angle in parallel according to the preset number of shadow images includes:
splitting the camera space in parallel, from near to far, into the preset number of sections according to the distance information between each position in the camera space and the viewpoint corresponding to the camera space, so that the view spaces obtained by the splitting increase successively in size from near to far.
The matrix generation module 120 may perform step S220 shown in Fig. 2; for a specific description, refer to the detailed description of step S220 above.
The shadow rendering module 130 is configured to render, in the corresponding texture region and in turn according to the view-projection matrix of each view space, a shadow depth image of the objects projected into that view space under the current light direction.
Optionally, referring to Fig. 4, which is a block diagram of the shadow rendering module 130 shown in Fig. 3. In this embodiment, the shadow rendering module 130 may include a classification submodule 131 and a rendering submodule 132.
The classification submodule 131 is configured to classify the objects in the game scene that project into each view space in order from near to far.
The rendering submodule 132 is configured to draw, according to the correspondence between texture regions and view spaces and in each texture region in turn, the shadow images of the objects classified into the corresponding view space based on the corresponding view-projection matrix, thereby obtaining the corresponding shadow depth images.
Referring again to Fig. 3, the matrix transformation module 140 is configured to perform a matrix transformation on the view-projection matrix of each view space to obtain the conversion matrix corresponding to each texture region and the texture coordinate range of each shadow depth image in the texture map, so as to generate the corresponding shadow map.
In this embodiment, the way the matrix transformation module 140 performs a matrix transformation on the view-projection matrix of each view space to obtain the conversion matrix corresponding to each texture region and the texture coordinate range of each shadow depth image in the texture map includes:
scaling and translating each view-projection matrix according to the preset number of shadow images and a coordinate adjustment strategy to obtain the conversion matrix corresponding to each texture region;
obtaining the texture coordinate range of the shadow depth image of each texture region according to the conversion matrix corresponding to each texture region and the distribution of the texture regions in the texture map.
The matrix transformation module 140 may perform step S240 in Fig. 2; for a specific description, refer to the detailed description of step S240 above.
In conclusion in the shadow map generation method and device that preferred embodiments of the present invention provide, the shade The shadow map of textures generation method generation can be sampled at the same time in the shadow image of different stage, improve real-time shadow wash with watercolours The rendering efficiency and level transition effect of dye.First, a line is created according to default shadow image number and preset image sizes Reason is schemed, and the texture maps are divided into the texture region of multiple formed objects, wherein the sum of the texture region is equal to institute State default shadow image number;Then, the method according to default shadow image number to the camera space under current visual angle into Row parallel divisional, and based on the radiation direction in current game scene, each view spaces that generation segmentation obtains are in current light Corresponding view projections matrix under direction;Then, the method is successively according to the view projections matrix of each view spaces, in correspondence The shade depth image of the object projected under generation current light direction in respective view space is rendered in texture region;Most Afterwards, the method carries out matrixing according to the view projections matrix to each view spaces, obtains corresponding turn of each texture region Matrix, and texture coordinate scope of each shade depth image in the texture maps are changed, to generate corresponding shadow-texture textures, So that include the shadow image of different stage in a texture mapping, it is ensured that the shadow image of different stage can carry out Sample at the same time, improve the rendering efficiency and level transition effect of real-time rendering.
The foregoing is only a preferred embodiment of the present invention, is not intended to limit the invention, for the skill of this area For art personnel, the invention may be variously modified and varied.Within the spirit and principles of the invention, that is made any repaiies Change, equivalent substitution, improvement etc., should all be included in the protection scope of the present invention.

Claims (10)

  1. A shadow map generation method, characterized in that the method comprises:
    creating a texture map according to a preset number of shadow images and a preset image size, and dividing the texture map into multiple texture regions of the same size, wherein the total number of texture regions is equal to the preset number of shadow images;
    splitting the camera space at the current viewing angle in parallel according to the preset number of shadow images, and, based on the light direction in the current game scene, generating a view-projection matrix under the current light direction for each view space obtained by the splitting;
    rendering, in the corresponding texture region and in turn according to the view-projection matrix of each view space, a shadow depth image of the objects projected into that view space under the current light direction; and
    performing a matrix transformation on the view-projection matrix of each view space to obtain the conversion matrix corresponding to each texture region and the texture coordinate range of each shadow depth image in the texture map, so as to generate the corresponding shadow texture map.
  2. The method according to claim 1, characterized in that the preset image size includes an image width and an image length, and the step of creating a texture map according to the preset number of shadow images and the preset image size and dividing the texture map into multiple texture regions of the same size comprises:
    calculating the length of the texture map to be created according to the preset number of shadow images and the image length in the preset image size;
    creating a texture map whose size matches the calculated length and the image width in the preset image size, and dividing the texture map into multiple texture regions along its length direction at intervals of the image length.
  3. The method according to claim 1, characterized in that the step of splitting the camera space at the current viewing angle in parallel according to the preset number of shadow images comprises:
    splitting the camera space in parallel, from near to far, into the preset number of sections according to the distance information between each position in the camera space and the viewpoint corresponding to the camera space, so that the view spaces obtained by the splitting increase successively in size from near to far.
  4. The method according to claim 3, characterized in that the step of rendering, in the corresponding texture region and in turn according to the view-projection matrix of each view space, a shadow depth image of the objects projected into that view space under the current light direction comprises:
    classifying the objects in the game scene that project into each view space in order from near to far;
    according to the correspondence between texture regions and view spaces, drawing, in each texture region in turn and based on the corresponding view-projection matrix, the shadow images of the objects classified into the corresponding view space, thereby obtaining the corresponding shadow depth images.
  5. The method according to claim 1, characterized in that the step of performing a matrix transformation on the view-projection matrix of each view space to obtain the conversion matrix corresponding to each texture region and the texture coordinate range of each shadow depth image in the texture map comprises:
    scaling and translating each view-projection matrix according to the preset number of shadow images and a coordinate adjustment strategy to obtain the conversion matrix corresponding to each texture region;
    obtaining the texture coordinate range of the shadow depth image of each texture region according to the conversion matrix corresponding to each texture region and the distribution of the texture regions in the texture map.
  6. A shadow map generating device, characterized in that the device comprises:
    a texture creation module, configured to create a texture map according to a preset number of shadow images and a preset image size, and divide the texture map into multiple texture regions of the same size, wherein the total number of texture regions is equal to the preset number of shadow images;
    a matrix generation module, configured to split the camera space at the current viewing angle in parallel according to the preset number of shadow images and, based on the light direction in the current game scene, generate a view-projection matrix under the current light direction for each view space obtained by the splitting;
    a shadow rendering module, configured to render, in the corresponding texture region and in turn according to the view-projection matrix of each view space, a shadow depth image of the objects projected into that view space under the current light direction; and
    a matrix transformation module, configured to perform a matrix transformation on the view-projection matrix of each view space to obtain the conversion matrix corresponding to each texture region and the texture coordinate range of each shadow depth image in the texture map, so as to generate the corresponding shadow texture map.
  7. The device according to claim 6, characterized in that the preset image size includes an image width and an image length, and the way the texture creation module creates a texture map according to the preset number of shadow images and the preset image size and divides the texture map into multiple texture regions of the same size comprises:
    calculating the length of the texture map to be created according to the preset number of shadow images and the image length in the preset image size;
    creating a texture map whose size matches the calculated length and the image width in the preset image size, and dividing the texture map into multiple texture regions along its length direction at intervals of the image length.
  8. The device according to claim 6, characterized in that the way the matrix generation module splits the camera space at the current viewing angle in parallel according to the preset number of shadow images comprises:
    splitting the camera space in parallel, from near to far, into the preset number of sections according to the distance information between each position in the camera space and the viewpoint corresponding to the camera space, so that the view spaces obtained by the splitting increase successively in size from near to far.
  9. The device according to claim 8, characterized in that the shadow rendering module comprises:
    a classification submodule, configured to classify the objects in the game scene that project into each view space in order from near to far;
    a rendering submodule, configured to draw, according to the correspondence between texture regions and view spaces and in each texture region in turn, the shadow images of the objects classified into the corresponding view space based on the corresponding view-projection matrix, thereby obtaining the corresponding shadow depth images.
  10. The device according to claim 6, characterized in that the way the matrix transformation module performs a matrix transformation on the view-projection matrix of each view space to obtain the conversion matrix corresponding to each texture region and the texture coordinate range of each shadow depth image in the texture map comprises:
    scaling and translating each view-projection matrix according to the preset number of shadow images and a coordinate adjustment strategy to obtain the conversion matrix corresponding to each texture region;
    obtaining the texture coordinate range of the shadow depth image of each texture region according to the conversion matrix corresponding to each texture region and the distribution of the texture regions in the texture map.
CN201711277706.2A 2017-12-06 2017-12-06 Shadow map generation method and device Active CN108038897B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201711277706.2A CN108038897B (en) 2017-12-06 2017-12-06 Shadow map generation method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201711277706.2A CN108038897B (en) 2017-12-06 2017-12-06 Shadow map generation method and device

Publications (2)

Publication Number Publication Date
CN108038897A true CN108038897A (en) 2018-05-15
CN108038897B CN108038897B (en) 2021-06-04

Family

ID=62095780

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201711277706.2A Active CN108038897B (en) 2017-12-06 2017-12-06 Shadow map generation method and device

Country Status (1)

Country Link
CN (1) CN108038897B (en)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104299257A (en) * 2014-07-18 2015-01-21 无锡梵天信息技术股份有限公司 Outdoor-sunlight-based method for realizing real-time dynamic shadow
CN104103092A (en) * 2014-07-24 2014-10-15 无锡梵天信息技术股份有限公司 Real-time dynamic shadowing realization method based on projector lamp
CN107274476A (en) * 2017-08-16 2017-10-20 城市生活(北京)资讯有限公司 The generation method and device of a kind of echo

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Guo Zhao et al.: "Parallel-Split Shadow Map Algorithm Based on Light-Space Perspective", Geography and Geo-Information Science *
Ma Shang et al.: "GPU-Based Light-Space Parallel-Split Shadow Map Algorithm", Journal of Computer-Aided Design & Computer Graphics *

Cited By (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109448099B (en) * 2018-09-21 2023-09-22 腾讯科技(深圳)有限公司 Picture rendering method and device, storage medium and electronic device
CN109448099A (en) * 2018-09-21 2019-03-08 腾讯科技(深圳)有限公司 Rendering method, device, storage medium and the electronic device of picture
CN109949401A (en) * 2019-03-14 2019-06-28 成都风际网络科技股份有限公司 A kind of method of the non real-time Shading Rendering of non-static object of mobile platform
CN109993823A (en) * 2019-04-11 2019-07-09 腾讯科技(深圳)有限公司 Shading Rendering method, apparatus, terminal and storage medium
CN109993823B (en) * 2019-04-11 2022-11-25 腾讯科技(深圳)有限公司 Shadow rendering method, device, terminal and storage medium
CN110473296A (en) * 2019-08-15 2019-11-19 浙江中国轻纺城网络有限公司 Chart pasting method and device
CN110473296B (en) * 2019-08-15 2023-09-26 浙江中国轻纺城网络有限公司 Mapping method and device
CN110585713B (en) * 2019-09-06 2021-10-15 腾讯科技(深圳)有限公司 Method and device for realizing shadow of game scene, electronic equipment and readable medium
CN110585713A (en) * 2019-09-06 2019-12-20 腾讯科技(深圳)有限公司 Method and device for realizing shadow of game scene, electronic equipment and readable medium
CN111131807B (en) * 2019-12-30 2021-11-23 华人运通(上海)云计算科技有限公司 Method and system for simulating and displaying vehicle light projection
CN111131807A (en) * 2019-12-30 2020-05-08 华人运通(上海)云计算科技有限公司 Method and system for simulating and displaying vehicle light projection
CN111243077A (en) * 2020-01-17 2020-06-05 江苏艾佳家居用品有限公司 Real transition shadow implementation method based on spatial pre-exploration
CN113362392A (en) * 2020-03-05 2021-09-07 杭州海康威视数字技术股份有限公司 Visual field generation method and device, computing equipment and storage medium
CN113362392B (en) * 2020-03-05 2024-04-23 杭州海康威视数字技术股份有限公司 Visual field generation method, device, computing equipment and storage medium
CN111862295A (en) * 2020-07-17 2020-10-30 完美世界(重庆)互动科技有限公司 Virtual object display method, device, equipment and storage medium
CN113269863B (en) * 2021-07-19 2021-09-28 成都索贝视频云计算有限公司 Video image-based foreground object shadow real-time generation method
CN113269863A (en) * 2021-07-19 2021-08-17 成都索贝视频云计算有限公司 Video image-based foreground object shadow real-time generation method
CN113893533A (en) * 2021-09-30 2022-01-07 网易(杭州)网络有限公司 Display control method and device in game
CN114494384A (en) * 2021-12-27 2022-05-13 北京吉威空间信息股份有限公司 Building shadow analysis method, device, equipment and storage medium

Also Published As

Publication number Publication date
CN108038897B (en) 2021-06-04

Similar Documents

Publication Publication Date Title
CN108038897A (en) Shadow map generation method and device
CN107358643B (en) Image processing method, image processing device, electronic equipment and storage medium
CN107680042B (en) Rendering method, device, engine and storage medium combining texture and convolution network
CN104361556A (en) Image synthesis method, image chip and image equipment
US6741243B2 (en) Method and system for reducing overflows in a computer graphics system
GB2406252A (en) Generation of texture maps for use in 3D computer graphics
JP2612221B2 (en) Apparatus and method for generating graphic image
US20230125255A1 (en) Image-based lighting effect processing method and apparatus, and device, and storage medium
US5719598A (en) Graphics processor for parallel processing a plurality of fields of view for multiple video displays
US20220375186A1 (en) Method and apparatus for generating bounding box, device and storage medium
CN112734900A (en) Baking method, baking device, baking equipment and computer-readable storage medium of shadow map
CN112686939A (en) Depth image rendering method, device and equipment and computer readable storage medium
CN113808246B (en) Method and device for generating map, computer equipment and computer readable storage medium
CN111223105B (en) Image processing method and device
CN113256484A (en) Method and device for stylizing image
US6222548B1 (en) Three-dimensional image processing apparatus
CN114820908B (en) Virtual image generation method and device, electronic equipment and storage medium
CN116630516B (en) 3D characteristic-based 2D rendering ordering method, device, equipment and medium
CN117032617B (en) Multi-screen-based grid pickup method, device, equipment and medium
CN111028357B (en) Soft shadow processing method and device of augmented reality equipment
CN116206088A (en) Panorama local processing method, device and storage medium
CN116228936A (en) Image processing method, device, terminal device, storage medium and program product
CN114359013A (en) Generation method and device of war fog, electronic equipment and storage medium
US8970614B2 (en) Apparatus and a method for obtaining a blur image
CN114722465A (en) Method, device and equipment for structure conversion in model

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant