CN115239869B - Shadow processing method, shadow rendering method and device


Info

Publication number
CN115239869B
CN115239869B (application CN202211156153.6A)
Authority
CN
China
Prior art keywords
shadow, information, target object, light, target
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202211156153.6A
Other languages
Chinese (zh)
Other versions
CN115239869A (en)
Inventor
毛春华
Current Assignee
Guangzhou Jianyue Information Technology Co ltd
Original Assignee
Guangzhou Jianyue Information Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Guangzhou Jianyue Information Technology Co ltd filed Critical Guangzhou Jianyue Information Technology Co ltd
Priority to CN202211156153.6A priority Critical patent/CN115239869B/en
Publication of CN115239869A publication Critical patent/CN115239869A/en
Application granted granted Critical
Publication of CN115239869B publication Critical patent/CN115239869B/en

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00: 3D [Three Dimensional] image rendering
    • G06T15/50: Lighting effects
    • G06T15/60: Shadow generation
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00: 3D [Three Dimensional] image rendering
    • G06T15/04: Texture mapping

Abstract

An embodiment of the application provides a shadow processing method, a shadow rendering method, and a device. Shadow information for a plurality of light angles is obtained from the light shielding information of a target object at those light angles, and the shadow information for the plurality of light angles is converted into texture information. The texture information is used to obtain, according to a shadow rendering instruction for the target object, target shadow information corresponding to the current light angle; the target shadow information is used to draw the shadow of the target object at the current light angle. The technical solution provided by the embodiments of the application reduces performance overhead.

Description

Shadow processing method, shadow rendering method and device
Technical Field
The embodiments of the application relate to the technical field of image processing, and in particular to a shadow processing method, a shadow rendering method, and a device.
Background
Shadows are very important for improving the realism of virtual images such as game frames, so shadow rendering has become an essential operation in the virtual-image rendering process.
Current shadow rendering is usually implemented with real-time shadow techniques, such as shadow depth maps (shadow maps), but the performance overhead of real-time shadow techniques is large.
Disclosure of Invention
The embodiments of the application provide a shadow processing method, a shadow rendering method, and a device, which are used to address the technical problem of high shadow rendering performance overhead in the prior art.
In a first aspect, an embodiment of the present application provides a shadow processing method, including:
obtaining shadow information for a plurality of light angles according to light shielding information of a target object at the plurality of light angles;
converting the shadow information for the plurality of light angles into texture information;
the texture information is used to obtain target shadow information corresponding to the current light angle according to a shadow rendering instruction for the target object; the target shadow information is used to draw the shadow of the target object at the current light angle.
In a second aspect, an embodiment of the present application provides a shadow rendering method, including:
in response to a shadow rendering instruction for a target object, acquiring target shadow information corresponding to the current light angle from texture information corresponding to the target object; the texture information is obtained by converting shadow information for a plurality of light angles; the shadow information for the plurality of light angles is determined according to light shielding information of the target object at the plurality of light angles;
rendering and generating a shadow of the target object based on the target shadow information.
In a third aspect, a computing device is provided in an embodiment of the present application, comprising a processing component and a storage component;
the storage component stores one or more computer instructions; the one or more computer instructions are for execution by the processing component to implement the shadow processing method as described in the first aspect above or the shadow rendering method as described in the second aspect above.
In a fourth aspect, an embodiment of the present application provides a computer storage medium storing a computer program, which when executed by a computer implements the shadow processing method according to the first aspect or the shadow rendering method according to the second aspect.
In the embodiments of the application, shadow information for a plurality of light angles is obtained according to the light shielding conditions of a target object at the plurality of light angles, and the shadow information for the plurality of light angles is converted into texture information. When a client performs shadow rendering of the target object, it can obtain the target shadow information corresponding to the current light angle from the texture information corresponding to the target object, and then render and generate the shadow of the target object based on that target shadow information. Because the texture information of the target object at a plurality of light angles is stored in advance, shadow rendering can directly use the target shadow information obtained for the current light angle, with no real-time computation needed, which reduces the consumption of computing resources and lowers the performance overhead.
These and other aspects of the present application will be more readily apparent from the following description of the embodiments.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings needed to be used in the description of the embodiments or the prior art will be briefly described below, and it is obvious that the drawings in the following description are some embodiments of the present application, and other drawings can be obtained by those skilled in the art without creative efforts.
FIG. 1 is a schematic diagram of a system architecture to which the technical solution of the embodiments of the present application is applicable;
FIG. 2 is a flow diagram illustrating one embodiment of a shadow processing method provided herein;
FIG. 3a is a schematic diagram of ray tracing in one practical application of the embodiments of the present application;
FIG. 3b is a schematic diagram illustrating light occlusion in one practical application of the embodiments of the present application;
FIG. 4 is a flow diagram of one embodiment of a shadow rendering method provided herein;
FIG. 5 is a schematic diagram illustrating scene interaction in one practical application of the embodiments of the present application;
FIG. 6 is a schematic structural diagram of one embodiment of a shadow processing apparatus provided herein;
FIG. 7 is a schematic structural diagram of one embodiment of a computing device provided herein;
FIG. 8 is a schematic structural diagram of one embodiment of a shadow rendering apparatus provided herein;
FIG. 9 is a schematic structural diagram of a further embodiment of a computing device provided herein.
Detailed Description
In order to make the technical solutions better understood by those skilled in the art, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application.
Some of the flows described in the specification, claims, and figures of this application include operations that occur in a particular order, but it should be clearly understood that these operations may be performed out of the order in which they appear herein, or in parallel. Operation numbers such as 101 and 102 are merely used to distinguish different operations; the numbers themselves do not imply any order of execution. Additionally, the flows may include more or fewer operations, and those operations may be performed sequentially or in parallel. The descriptions "first", "second", and so on in this document are used to distinguish different messages, devices, modules, and the like; they do not represent a sequential order, nor do they limit the types of "first" and "second".
The technical scheme of the embodiment of the application can be applied to an application scene for performing shadow rendering display on virtual objects in a virtual scene, the virtual scene can refer to a game scene, a virtual reality scene and the like, the virtual scene can be rendered, displayed and presented through corresponding rendering data, and one virtual scene usually comprises a plurality of virtual objects.
Shadow rendering with a real-time shadow technique, such as shadow depth mapping (shadow map), is computationally complex, requires a large number of draw calls, and has a high performance overhead. In particular, the farther the virtual scene extends from the viewpoint, the more virtual objects fall within the view frustum and the more shadows need to be drawn, so the performance overhead of a real-time shadow technique grows accordingly.
In order to reduce the amount of computation and the performance overhead, the inventor arrived at the technical solution of this application through a series of studies. In the embodiments of the application, the texture information of the target object at a plurality of light angles can be stored in advance. When shadow rendering is performed, the target shadow information obtained for the current light angle can be used directly for rendering, with no real-time computation needed. This reduces the consumption of computing resources, lowers the performance overhead, and still achieves dynamic shadow effects at different times, satisfying the requirement of 24-hour dynamic shadows.
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
As shown in fig. 1, which is a schematic structural diagram of a system architecture to which the technical solution of the embodiment of the present application may be applied, the system architecture may include a client 101 and a server 102. And the client and the server establish connection through a network. The network provides a medium for communication links between clients and servers. The network may include various connection types, such as wired, wireless communication links, or fiber optic cables, to name a few.
A client may interact with a server over a network to receive or send data, etc.
The client may be a browser, an APP (Application), a web application such as an H5 (HTML5, HyperText Markup Language version 5) application, a light application (also referred to as an applet), or a cloud application. The client may be deployed in an electronic device, running on the device itself or within some APP on the device. The electronic device may have a display screen and support information browsing and the like; for example, it may be a personal mobile terminal such as a mobile phone, a tablet computer, a personal computer, or a television.
The server may include a server providing various services, such as a server performing interactive processing with the client, a server providing a rendering data generation service, and the like. It should be noted that the server may be implemented as a distributed server cluster formed by multiple servers, or may be implemented as a single server. The server may also be a server of a distributed system, or a server incorporating a blockchain. The server can also be a cloud server, or an intelligent cloud computing server or an intelligent cloud host with artificial intelligence technology.
Taking a game scene as an example, the client may be a game client, and the electronic device may run the game client and may be capable of presenting a game screen on the display screen. The electronic device may interact with the player user through a graphical user interface, that is, a game client may be downloaded and installed and run through the electronic device, and the manner in which the game client provides the graphical user interface to the player may include various manners, for example, the graphical user interface may be rendered and displayed on a display screen of the electronic device, or provided to the user through holographic projection, and the like. The server may include a server that provides various services for the game client.
The server side can send rendering data of the game scene to the client side, and the client side performs rendering display to generate and display a game picture corresponding to the game scene.
In addition, the technical scheme of the embodiment of the application can also be suitable for cloud game scenes, the system architecture can be realized as a cloud interaction system, and the cloud game can refer to a game mode based on cloud computing. In a cloud game scenario, a game screen may be generated by a server based on rendering data, a game client is used only to present the game screen, and so on.
Certainly, the technical solution of the embodiment of the present application may also be applicable to a local game scene, that is, the game client may render the data for local storage without interacting with the server, and may present the game picture of the game scene through the game client, and may present different game pictures based on user control, and the like.
When the game client generates the game screen based on the rendering data, shadow rendering requirements generally exist for different virtual objects, particularly real-time shadow is needed for many current game scenes, and performance overhead can be reduced by adopting the technical scheme of the embodiment of the application.
It should be noted that the shadow processing method provided by the embodiment of the present application is generally executed by a server, and the shadow rendering method is generally executed by a client. However, in other embodiments of the present application, the server may also execute the shadow rendering method provided in the embodiments of the present application. In other embodiments of the present application, the shadow rendering method provided by the embodiments of the present application may also be executed by the client and the server together.
The details of implementation of the technical solution of the embodiments of the present application are set forth in the following.
Fig. 2 is a flowchart of an embodiment of a shadow processing method provided by an embodiment of the present application, where the method may include the following steps:
201: and obtaining shadow information of a plurality of light angles according to the light shielding information of the target object at the plurality of light angles.
The target object may be any virtual object in the virtual scene for which dynamic shadow rendering requirements exist. In practical applications, the virtual scene may be, for example, a game scene or a virtual reality scene, and the target object may be, for example, a building, a tree, or other character objects such as a character in the virtual scene.
The plurality of light angles are different from one another. Optionally, they may be obtained by splitting the omnidirectional illumination angle of the target object, for example by splitting the 360-degree illumination angle in the horizontal direction. As shown in FIG. 3a, the plurality of light angles may be obtained by splitting the horizontal illumination angle of the target object 30, where the number of splits may be chosen according to the actual situation, for example 16 or 100 equal portions. The elevation angle of the illumination in the vertical direction may be fixed to a target elevation angle, which may also be adjusted according to the actual situation.
The plurality of light angles may be determined according to the light source positions at different times, for example according to the light source positions over 24 hours.
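As an illustrative sketch of the angle splitting described above, the following enumerates equally spaced horizontal light angles with a fixed vertical elevation. The split count, default elevation, and all names are hypothetical and not taken from the patent itself.

```python
def make_light_angles(num_splits=16, elevation_deg=45.0):
    """Split the 360-degree horizontal illumination angle into equal
    azimuth steps; the vertical elevation is held fixed (hypothetical
    default values)."""
    step = 360.0 / num_splits
    return [(i * step, elevation_deg) for i in range(num_splits)]

# e.g. 16 baked angles, one per 22.5 degrees of azimuth
angles = make_light_angles(16)
```

A finer split (e.g. 100 portions) trades more pre-baked texture columns for smoother angular transitions.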
Light rays emitted at these light angles collide with the target object and are shielded by it, generating light shielding information, from which shadow information can be generated.
Wherein, at each ray angle, the light source may emit a plurality of rays, and the shadow information for each ray angle may include a plurality of intersections of the target object with the plurality of rays emitted from the ray angle. The plurality of intersection points may be represented by coordinate values in a world coordinate system in which the target object is located. The light ray blocking information is composed of a plurality of intersection points.
The light shielding information may be used directly as the shadow information. In addition, to further reduce performance overhead, the light shielding information may be sampled to obtain the shadow information. Optionally, the shadow information for each light angle may include the boundary intersection points between the target object and that light angle. Accordingly, obtaining shadow information for a plurality of light angles according to the light shielding information of the target object at the plurality of light angles may include:
respectively emitting light rays to a target object at the plurality of light ray angles to obtain light ray shielding information of the target object at the plurality of light ray angles;
sampling light shielding information of a target object at a plurality of light angles respectively, and determining boundary intersection points of the target object at different light angles;
shadow information including boundary intersections is obtained.
Wherein the boundary intersection points may include a highest intersection point and a lowest intersection point. Therefore, the light shielding information of the target object at a plurality of light angles can be sampled, and the highest intersection point and the lowest intersection point of the target object at different light angles can be determined.
For ease of understanding, FIG. 3b shows a plurality of rays emitted at a light angle colliding with the target object. The rays are shielded by the target object, generating a plurality of intersection points, from which a highest intersection point A and a lowest intersection point B can be determined; the shadow information corresponding to that light angle may then include the highest intersection point A and the lowest intersection point B.
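The boundary-intersection sampling described above can be sketched as follows, assuming each intersection is a world-space (x, y, z) tuple and "highest"/"lowest" refer to the vertical (y) coordinate; names and data are illustrative only.

```python
def boundary_intersections(hits):
    """Reduce the full list of ray/object intersection points (the light
    shielding information for one light angle) to the boundary pair:
    the highest and the lowest intersection by vertical coordinate."""
    if not hits:
        return None
    highest = max(hits, key=lambda p: p[1])
    lowest = min(hits, key=lambda p: p[1])
    return highest, lowest

# hypothetical hits of several rays against one object at one angle
hits = [(1.0, 0.2, 0.0), (1.0, 3.5, 0.0), (1.0, 1.8, 0.0)]
hi, lo = boundary_intersections(hits)
```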
202: and converting the shadow information of a plurality of ray angles into texture information.
The shadow information may be converted into texture information for storage, and as described above, the shadow information is composed of the intersection points of the target object and the light, and the texture information may be in the form of texture maps, but only the intersection points are recorded.
Optionally, the shadow information may include only the boundary intersection points, so that only the conversion information corresponding to the boundary intersection points needs to be recorded in the texture information. This significantly reduces the data volume of the texture information and achieves data compression: only a few KB (kilobytes) of texture information are needed to store a large amount of shadow information, enabling high-performance rendering.
Optionally, shadow information of a plurality of light angles may be first projected to a cylindrical coordinate space to obtain projection information; and then converting the projection information into texture information.
Projecting the shadow information into a cylindrical coordinate space may mean converting the coordinates of the intersection points in the shadow information into cylindrical coordinates. The coordinates in the cylindrical coordinate space may then be recorded in a texture map: each pixel in the texture map represents one intersection point, with the cylindrical coordinates encoded as pixel values. In the texture map, the intersection points of the target object with each light angle are mapped to one column of pixels; when the intersection points include only the boundary intersection points, a column of pixels may record only the pixel values corresponding to the highest and lowest intersection points. The texture map is composed of multiple columns of pixels, and the pixel values of those columns form the texture information. A pixel value may refer to the color channel values of a pixel, and each pixel also corresponds to a UV coordinate (a texture map coordinate, where U represents the horizontal direction and V the vertical direction).
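One possible realization of the cylindrical projection and per-angle pixel columns described above is sketched below. Packing coordinates into color channels is simplified to plain tuples, and the object axis is assumed to pass through a given center; all names are hypothetical.

```python
import math

def to_cylindrical(p, center=(0.0, 0.0, 0.0)):
    """Convert a world-space point to cylindrical coordinates
    (radius, azimuth in radians, height) around the object's axis."""
    x, y, z = (p[0] - center[0], p[1] - center[1], p[2] - center[2])
    return (math.hypot(x, z), math.atan2(z, x), y)

def bake_texture(boundaries_per_angle):
    """Build a two-row 'texture': row 0 holds the highest intersection,
    row 1 the lowest, with one column per light angle. Each texel is a
    (radius, azimuth, height) tuple standing in for color channels."""
    texture = [[], []]
    for hi, lo in boundaries_per_angle:
        texture[0].append(to_cylindrical(hi))
        texture[1].append(to_cylindrical(lo))
    return texture

# one light angle whose boundary pair is at heights 2.0 and 0.5
tex = bake_texture([((1.0, 2.0, 0.0), (1.0, 0.5, 0.0))])
```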
The generated texture information may be stored in rendering data of the virtual scene, and is used for performing shadow rendering of the target object during rendering of the virtual scene.
The texture information is used to obtain target shadow information corresponding to the current light angle according to a shadow rendering instruction for the target object; the target shadow information is used to draw the shadow of the target object at the current light angle. The specific way shadow rendering is performed based on the texture information is described in detail in the following embodiments.
In this embodiment, shadow information for each light angle is generated based on the light shielding information of the target object at a plurality of light angles, and the shadow information for the plurality of light angles is converted into texture information. When shadow rendering of the target object is performed, the shadow of the target object at the current light angle can be drawn according to the target shadow information corresponding to the current light angle. Because the texture information for different light angles is pre-baked, dynamic shadow rendering can be achieved while reducing the demand on computing and storage resources, lowering the performance overhead.
Fig. 4 is a flowchart of an embodiment of a shadow rendering method provided in the embodiment of the present application, where the embodiment introduces a technical solution of the present application from a shadow rendering perspective, and the method may include the following steps:
401: and responding to a shadow rendering instruction aiming at the target object, and acquiring target shadow information corresponding to the current light angle from texture information corresponding to the target object.
When virtual scene rendering is performed, a shadow rendering instruction can be generated for a target object needing shadow rendering in a virtual scene.
The current light angle may be determined according to a scene time corresponding to the virtual scene. Accordingly, shadow rendering instructions for the target object may be generated based on the current scene time. The light source positions corresponding to different times are different, so the light ray angles are different.
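The correspondence between scene time and a pre-baked light angle could be sketched as below, assuming the baked angles were generated at equal time steps over a 24-hour cycle; the function name and parameters are assumptions for illustration.

```python
def angle_index_for_time(hour, num_angles=16):
    """Map a 24-hour scene time (possibly fractional) to the nearest
    pre-baked light-angle column index."""
    return int(round(hour / 24.0 * num_angles)) % num_angles
```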
Wherein, the texture information is obtained by the shadow information conversion of a plurality of light angles; determining the shadow information of the plurality of light angles according to the light shielding information of the target object at the plurality of light angles; the specific determination manner of the texture information may be described in detail in the embodiment shown in fig. 2, and will not be described repeatedly here.
The technical solution of this embodiment can be executed by a client, and the client can call a GPU (graphics processing unit) to execute the operations of steps 401 to 402.
402: based on the target shadow information, the rendering generates shadows of the target object.
Since the texture information is obtained by converting the shadow information for a plurality of light angles, the target shadow information can be reconstructed from the texture information based on the current light angle, and the shadow of the target object can then be rendered and generated based on that target shadow information.
Wherein, based on the target shadow information, a shader (shader) can be called to render the shadow of the target object.
In this embodiment, texture information of the target object at a plurality of light angles is generated by pre-baking, and when performing shadow rendering, the target shadow information corresponding to the current light angle may be reconstructed and obtained, and rendering may be performed based on the target shadow information, which may reduce consumption of computing resources and storage resources, reduce computation workload, and thus reduce performance overhead.
In some embodiments, obtaining the target shadow information corresponding to the current ray angle from the texture information corresponding to the target object includes:
acquiring target texture information corresponding to the current light angle from texture information corresponding to a target object;
and converting the target texture information into target shadow information.
Because the multiple columns of pixels in the texture information correspond to the intersection points of the target object at different light angles, the column of pixels corresponding to the current light angle can be located in the texture information using the texture coordinates. From the pixel values of the intersection points recorded in that column, i.e. the target texture information, the coordinates in the world coordinate system of the target object can be recovered through coordinate conversion and the like, yielding the target shadow information.
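Reconstructing the target shadow information from one texture column, as described above, is essentially the inverse of the cylindrical projection. A sketch under the same simplified tuple-based texture layout (hypothetical names) follows:

```python
import math

def decode_column(texture, angle_index, center=(0.0, 0.0, 0.0)):
    """Read the texel pair for one light angle and convert the stored
    cylindrical coordinates (radius, azimuth, height) back to the
    world coordinate system of the target object."""
    def to_world(texel):
        r, theta, h = texel
        return (center[0] + r * math.cos(theta),
                center[1] + h,
                center[2] + r * math.sin(theta))
    return to_world(texture[0][angle_index]), to_world(texture[1][angle_index])

# a one-column texture: highest point at height 3.0, lowest at 0.5
texture = [[(2.0, 0.0, 3.0)], [(2.0, 0.0, 0.5)]]
hi, lo = decode_column(texture, 0)
```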
In some embodiments, rendering the shadow of the generated target object based on the target shadow information may include:
determining a shadow shape according to the target shadow information;
and drawing the shadow corresponding to the shadow shape according to the drawing parameters.
The target shadow information at least comprises the highest intersection point and the lowest intersection point of the target object and different rays, so that the shadow shape can be determined according to the highest intersection point and the lowest intersection point.
The drawing parameters may include a shadow color, and may further include transparency, definition, resolution, and the like, so that the shadow rendering of the target object may be completed according to the drawing parameters, that is, the shadow corresponding to the drawing shadow shape.
In some embodiments, determining the shadow shape from the target shadow information may comprise:
acquiring a shadow map of a target object;
and determining the shadow shape according to whether the pixel point in the shadow map is positioned in the shadow range defined by the target shadow information.
Then the shadow corresponding to the drawing shadow shape according to the drawing parameters may be: and drawing the shadow corresponding to the shadow shape in the shadow map according to the drawing parameters.
That is, the shadow map of the target object can be obtained according to the current light angle. The shadow map may be a decal or a patch; optionally, the shadow map may be embodied as a decal in order to avoid being affected by other objects entering the shadow. Decal technology draws a picture onto the surface of another object without being influenced by that object.
The target shadow information defines a shadow range, so it can be judged whether each pixel point in the shadow map lies within that shadow range. The shadow range is determined by the highest and lowest intersection points of the target object at the current light angle in the target shadow information. Specifically, the highest and lowest intersection points corresponding to the current light angle can be projected onto the shadow rendering surface and converted into the highest and lowest shadow points on that surface according to the current light elevation angle and the trigonometric function relationship. Then, in the shadow map, the shadow corresponding to the shadow shape can be drawn according to the drawing parameters.
Determining the shadow shape according to whether each pixel point in the shadow map lies within the shadow range defined by the target shadow information amounts to determining the intersection of the shadow map and the shadow range; that intersection is the shadow shape.
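The trigonometric projection of a boundary intersection onto the shadow-receiving surface mentioned above can be illustrated as follows, assuming a horizontal ground plane and the relation shadow length = height / tan(elevation), applied along the light azimuth; all parameters are hypothetical.

```python
import math

def project_to_ground(point, azimuth_deg, elevation_deg, ground_y=0.0):
    """Slide a world-space intersection point down the light direction
    onto a horizontal shadow-receiving plane: the horizontal offset is
    height / tan(elevation), applied along the light azimuth."""
    x, y, z = point
    length = (y - ground_y) / math.tan(math.radians(elevation_deg))
    a = math.radians(azimuth_deg)
    return (x + length * math.cos(a), ground_y, z + length * math.sin(a))
```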
The shadow map of the target object may be rendered according to the target size and the target shape.
Wherein the target size and target shape may be preconfigured.
In addition, the current light angle and/or the specification parameters of the target object can be determined. The sizes and/or shapes corresponding to different light angles can be configured in advance, and the sizes and/or shapes corresponding to different specification parameters can be configured in advance. The specification parameters may include an object size, an object shape, and the like.
Thus, in some embodiments, the shadow map of the target object may be drawn based on the current light angle, based on the specification parameters of the target object, or based on the current light angle and the specification parameters in combination.
Because the shadow map is drawn based on the target object, dynamic shadow rendering can be achieved even if the target object changes position or is damaged; and when the target object is removed, its shadow map, and therefore the corresponding shadow, can be removed as well.
In addition, a shadow rendering area can be determined according to the current light angle, the shadow map can be rendered in that area, and the area can change with the position of the target object.
In some embodiments, in response to the shadow rendering instruction for the target object, obtaining target shadow information corresponding to the current ray angle from texture information corresponding to the target object may include:
in response to a shadow rendering instruction for a target object, determining whether a line-of-sight distance of the target object is greater than a predetermined distance;
and under the condition that the sight line distance is greater than the preset distance, acquiring target shadow information corresponding to the current light angle from the texture information corresponding to the target object.
The line-of-sight distance may be the distance between the target object and the camera. For a target object whose line-of-sight distance is greater than the predetermined distance, the requirement on image fineness is not high; shadow rendering can then be achieved by storing only the boundary intersection points of the target object and the light, as in the technical solution of the embodiments of the present application, without drawing a complete shadow picture. This reduces the consumption of storage and computing resources and lowers the performance cost. The camera refers to a camera in the virtual scene, corresponding to a perspective in the virtual scene.
If the line-of-sight distance of the target object is less than the predetermined distance, the shadow of the target object can be rendered by adopting a real-time shadow technology so as to ensure the shadow rendering effect.
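The distance-based selection between the baked-texture technique and real-time shadows can be sketched as follows; this is an illustrative stand-in (the labels and function names are ours, not the patent's):

```python
def choose_shadow_technique(object_pos, camera_pos, predetermined_distance):
    """Pick a shadow technique by line-of-sight distance: the baked
    texture-based shadow for distant objects, a real-time shadow
    technique for near ones."""
    dx = object_pos[0] - camera_pos[0]
    dy = object_pos[1] - camera_pos[1]
    dz = object_pos[2] - camera_pos[2]
    distance = (dx * dx + dy * dy + dz * dz) ** 0.5
    return ("baked_texture_shadow" if distance > predetermined_distance
            else "realtime_shadow")

near = choose_shadow_technique((1, 0, 0), (0, 0, 0), 10.0)
far = choose_shadow_technique((30, 0, 0), (0, 0, 0), 10.0)
```

This mirrors a conventional shadow level-of-detail switch: spend the expensive technique only where the viewer can see the difference.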
In practical application, the technical solution of the embodiments of the present application may be applied to a game scene, where the target object is a game object in the game scene, such as a game character or another object. Taking a game scene as an example, the technical solution of the present application is described in detail below with reference to the scene interaction diagram shown in fig. 5.
The server 501 may perform ray tracing on the game object in advance: it emits light toward the game object at a plurality of light angles, samples the light shielding information of the game object at those angles, determines the highest and lowest intersection points of the game object with the light at each angle, and forms shadow information from these intersection points. The shadow information can then be projected into a cylindrical coordinate space and converted into texture coordinates to obtain texture information. Because the texture information records only the highest and lowest intersection points for each light angle, the data volume and the consumption of storage resources are reduced. The server 501 may save the texture information into rendering data and send the rendering data to the game client 502.
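The server-side bake step, keeping only the two boundary intersections per light angle, can be sketched as follows. This is an illustrative reduction, not the patented implementation: `height_profile` is a hypothetical callback standing in for a real ray tracer against the object's mesh:

```python
def bake_shadow_info(height_profile, light_angles_deg):
    """For each light angle, keep only the highest and lowest
    intersection heights of the object hit by light at that angle.
    `height_profile(angle)` returns the sampled intersection heights;
    a real implementation would ray-trace the mesh instead."""
    shadow_info = {}
    for angle in light_angles_deg:
        hits = height_profile(angle)
        if hits:  # store only the two boundary intersections per angle
            shadow_info[angle] = (max(hits), min(hits))
    return shadow_info

# A toy profile: the object is lit only for elevation angles up to 90.
profile = lambda angle: [0.5, 1.2, 2.0] if angle <= 90 else []
info = bake_shadow_info(profile, [30, 60, 120])
```

However many samples are taken per angle, only two values per angle are stored, which is the source of the storage saving the description claims.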
The game client 502 runs and renders the game scene based on the rendering data, and a shadow rendering instruction may be generated for any game object with a shadow rendering requirement. In response to the instruction, if the game object is far from the viewpoint, the game client 502 may obtain the target shadow information corresponding to the current light angle from the texture information corresponding to the game object, draw a shadow map of the game object based on the target shadow information and the specification parameters of the game object, determine the shadow shape according to whether the pixel points in the shadow map lie within the shadow range defined by the target shadow information, and draw the shadow corresponding to the shadow shape in the shadow map according to the drawing parameters, thereby achieving shadow rendering.
In the embodiments of the present application, the texture information of the game object under a plurality of light angles can be baked in advance, and only the highest and lowest intersection points of the game object with each light angle are stored to support shadow rendering, which reduces the amount of stored data and the consumption of storage resources. When a shadow rendering requirement exists, whether the pixel points in the shadow map of the game object are located within the shadow range can be judged according to the target shadow information corresponding to the current light angle, and based on the judgment result the shadow of the game object under the current light angle can be drawn.
Fig. 6 is a schematic structural diagram of an embodiment of a shadow processing apparatus according to an embodiment of the present application, where the apparatus may include:
the shadow baking module 601 is configured to obtain shadow information of a plurality of light angles according to light shielding information of the target object at the plurality of light angles;
an information conversion module 602, configured to convert shadow information of a plurality of ray angles into texture information;
the texture information is used for obtaining target shadow information corresponding to the current light angle according to a shadow rendering instruction aiming at a target object; the target shadow information is used for drawing the shadow of the target object under the current ray angle.
In some embodiments, the shadow baking module may specifically determine a plurality of light angles corresponding to light source positions at different times; emit light toward the target object at the plurality of light angles to obtain the light shielding information of the target object at each angle; sample the light shielding information at the plurality of light angles to determine the boundary intersection points of the target object with different lights; and obtain shadow information including the boundary intersection points.
In some embodiments, the shadow baking module may sample the light shielding information of the target object at the plurality of light angles and determine the boundary intersection points of the target object at different light angles by determining the highest intersection point and the lowest intersection point of the target object at each light angle.
In some embodiments, the information conversion module may specifically project the shadow information of a plurality of light angles to a cylindrical coordinate space to obtain projection information; and converting the projection information into texture coordinates to obtain texture information.
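The projection-and-conversion step performed by the information conversion module can be sketched as follows. This is a simplified illustration under our own assumptions (the light angle maps to the cylindrical azimuth and then to u in [0, 1); heights are normalized to v in [0, 1]); the normalization scheme and names are not taken from the patent:

```python
def to_texture_coords(shadow_info, max_height):
    """Project per-angle shadow info into a cylindrical coordinate
    space and normalize it to texture coordinates: the light angle
    becomes the u coordinate, the boundary intersection heights
    become v values."""
    texture = {}
    for angle_deg, (highest, lowest) in shadow_info.items():
        u = (angle_deg % 360) / 360.0
        texture[u] = (highest / max_height, lowest / max_height)
    return texture

tex = to_texture_coords({90: (2.0, 0.5)}, max_height=4.0)
```

In practice the result would be written into a small 1D texture whose texels hold the two normalized heights per angle.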
The shadow processing apparatus shown in fig. 6 can execute the shadow processing method of the embodiment shown in fig. 2; its implementation principle and technical effects are not repeated here. The specific manner in which each module and unit of the shadow processing apparatus performs operations has been described in detail in the method embodiments and will not be elaborated.
An embodiment of the present application further provides a computing device, as shown in fig. 7, the computing device may include a storage component 701 and a processing component 702;
the storage component 701 stores one or more computer instructions for execution by the processing component 702 to implement the shadow processing method of the embodiment shown in fig. 2.
Of course, a computing device may also include other components as necessary, such as input/output interfaces, display components, communication components, and so forth.
The input/output interface provides an interface between the processing components and peripheral interface modules, which may be output devices, input devices, etc. The communication component is configured to facilitate wired or wireless communication between the computing device and other devices, and the like.
Wherein the processing components may include one or more processors executing computer instructions to perform all or part of the steps of the above-described method. Of course, the processing components may also be implemented as one or more Application-Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field-Programmable Gate Arrays (FPGAs), controllers, microcontrollers, microprocessors, or other electronic components configured to perform the above-described methods.
The storage component is configured to store various types of data to support operations at the terminal. It may be implemented by any type or combination of volatile and non-volatile memory devices, such as Static Random Access Memory (SRAM), Electrically Erasable Programmable Read-Only Memory (EEPROM), Erasable Programmable Read-Only Memory (EPROM), Programmable Read-Only Memory (PROM), Read-Only Memory (ROM), magnetic memory, flash memory, magnetic disks, or optical disks.
It should be noted that the computing device may be a physical device or a flexible computing host provided by a cloud computing platform. It can be implemented as a distributed cluster consisting of a plurality of servers or terminal devices, or as a single server or a single terminal device.
An embodiment of the present application further provides a computer-readable storage medium, which stores a computer program, and when the computer program is executed by a computer, the shadow processing method according to the embodiment shown in fig. 2 can be implemented. The computer-readable medium may be included in the electronic device described in the above embodiment; or may exist separately without being assembled into the electronic device.
Embodiments of the present application further provide a computer program product, which includes a computer program carried on a computer-readable storage medium, and when the computer program is executed by a computer, the shadow processing method as described in the embodiment shown in fig. 2 can be implemented. In such embodiments, the computer program may be downloaded and installed from a network, and/or installed from a removable medium. The computer program, when executed by a processor, performs various functions defined in the system of the present application.
Fig. 8 is a schematic structural diagram of an embodiment of a shadow rendering apparatus according to an embodiment of the present application, where the apparatus may include:
an information determining module 801, configured to, in response to a shadow rendering instruction for a target object, obtain target shadow information corresponding to the current light angle from texture information corresponding to the target object; wherein the texture information is obtained by converting shadow information of a plurality of light angles, and the shadow information of the plurality of light angles is determined according to the light shielding conditions of the target object at the plurality of light angles;
a rendering module 802, configured to render and generate the shadow of the target object based on the target shadow information.
In some embodiments, the information determining module may specifically obtain target texture information corresponding to the current light angle from texture information corresponding to the target object; and converting the target texture information into target shadow information.
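The lookup-and-convert step performed by the information determining module can be sketched as follows. This is an illustrative inverse of a bake-time normalization we assume (angle to u, heights normalized by a maximum height); the nearest-texel sampling and all names are ours, not the patent's:

```python
def lookup_target_shadow_info(texture, current_angle_deg, max_height):
    """Fetch the texel nearest the current light angle and convert it
    back into target shadow information, i.e. the highest and lowest
    shadow heights, by undoing the bake-time normalization."""
    u = (current_angle_deg % 360) / 360.0
    nearest_u = min(texture, key=lambda k: abs(k - u))
    v_hi, v_lo = texture[nearest_u]
    return (v_hi * max_height, v_lo * max_height)

# Two baked texels; a light angle of 95 degrees samples the nearer one.
texture = {0.25: (0.5, 0.125), 0.5: (0.9, 0.2)}
hi, lo = lookup_target_shadow_info(texture, 95, max_height=4.0)
```

A GPU implementation would express the same idea as a texture fetch with nearest (or linear) filtering rather than a dictionary search.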
In some embodiments, the rendering module may specifically determine the shadow shape according to the target shadow information; and drawing the shadow corresponding to the shadow shape according to the drawing parameters.
In some embodiments, the rendering module may specifically obtain a shadow map of the target object; determining a shadow shape according to whether pixel points in the shadow map are positioned in a shadow range defined by the target shadow information; and drawing the shadow corresponding to the shadow shape in the shadow map according to the drawing parameters.
In some embodiments, the rendering module may draw the shadow map of the target object according to the current light angle; specifically, it may draw the shadow map according to the current light angle and/or the specification parameters of the target object.
In some embodiments, the information determination module may specifically determine, in response to a shadow rendering instruction for the target object, whether a line-of-sight distance of the target object is greater than a predetermined distance;
and under the condition that the line-of-sight distance is greater than the preset distance, acquiring target shadow information corresponding to the current light angle from the texture information corresponding to the target object.
The shadow rendering apparatus shown in fig. 8 may perform the shadow rendering method of the embodiment shown in fig. 4; its implementation principles and technical effects are not repeated here. The specific manner in which each module and unit of the shadow rendering apparatus performs operations has been described in detail in the method embodiments and will not be described again.
An embodiment of the present application further provides a computing device, as shown in fig. 9, the computing device may include a storage component 901 and a processing component 902;
storage component 901 stores one or more computer instructions for execution by processing component 902 to implement the shadow rendering method of the embodiment shown in FIG. 4.
Of course, a computing device may also include other components as necessary, such as input/output interfaces, display components, communication components, and so forth.
The input/output interface provides an interface between the processing components and peripheral interface modules, which may be output devices, input devices, etc. The communication component is configured to facilitate wired or wireless communication between the computing device and other devices, and the like.
In practical application, the computing device may be a user terminal, for example a mobile phone, a tablet computer, or a personal computer. As shown in fig. 9, the computing device may further include a display component 903 to perform display operations, such as displaying the shadow generated by rendering.
Wherein the processing components may include one or more processors executing computer instructions to perform all or part of the steps of the above-described method. The one or more processors may include a CPU and a GPU; of course, the processing components may also be implemented as one or more Application-Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field-Programmable Gate Arrays (FPGAs), controllers, microcontrollers, microprocessors, or other electronic components configured to perform the above-described methods.
The storage component is configured to store various types of data to support operations at the terminal. It may be implemented by any type or combination of volatile and non-volatile memory devices, such as Static Random Access Memory (SRAM), Electrically Erasable Programmable Read-Only Memory (EEPROM), Erasable Programmable Read-Only Memory (EPROM), Programmable Read-Only Memory (PROM), Read-Only Memory (ROM), magnetic memory, flash memory, magnetic disks, or optical disks.
The display element may be an electroluminescent (EL) element, a liquid crystal display or a micro-display of similar construction, or a laser-scanned display that projects directly onto the retina, or the like.
An embodiment of the present application further provides a computer-readable storage medium, which stores a computer program, and when the computer program is executed by a computer, the shadow rendering method according to the embodiment shown in fig. 4 can be implemented. The computer-readable medium may be included in the electronic device described in the above embodiment; or may exist separately without being assembled into the electronic device.
Embodiments of the present application further provide a computer program product, which includes a computer program carried on a computer-readable storage medium, and when the computer program is executed by a computer, the shadow rendering method as described in the embodiment shown in fig. 4 can be implemented. In such embodiments, the computer program may be downloaded and installed from a network, and/or installed from a removable medium. The computer program, when executed by a processor, performs various functions defined in the system of the present application.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described systems, apparatuses and units may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
The above-described embodiments of the apparatus are merely illustrative, and the units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of the present embodiment. One of ordinary skill in the art can understand and implement it without inventive effort.
Through the above description of the embodiments, those skilled in the art will clearly understand that each embodiment can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware. With this understanding in mind, the above-described technical solutions may be embodied in the form of a software product, which can be stored in a computer-readable storage medium such as ROM/RAM, magnetic disk, optical disk, etc., and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) to execute the methods described in the embodiments or some parts of the embodiments.
Finally, it should be noted that: the above embodiments are only used to illustrate the technical solutions of the present application, and not to limit the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; and such modifications or substitutions do not depart from the spirit and scope of the corresponding technical solutions in the embodiments of the present application.

Claims (11)

1. A shadow processing method, comprising:
obtaining shadow information of a plurality of light angles according to light shielding information of a target object at the plurality of light angles;
converting the shadow information of the plurality of ray angles into texture information;
the texture information is used for obtaining target shadow information corresponding to the current light angle according to a shadow rendering instruction aiming at the target object; the target shadow information is used for drawing the shadow of the target object under the current light angle;
wherein, the obtaining shadow information of a plurality of light angles according to the light shielding condition of the target object at the plurality of light angles comprises:
determining a plurality of light angles corresponding to light source positions at different times;
respectively emitting light rays to a target object at the plurality of light ray angles to obtain light ray shielding information of the target object at the plurality of light ray angles;
sampling light shielding information of the target object at the plurality of light angles respectively, and determining boundary intersection points of the target object at different light angles;
shadow information is obtained including the boundary intersection points.
2. The method of claim 1, wherein the sampling the ray occlusion information of the target object at the plurality of ray angles, respectively, and the determining boundary intersection points of the target object at different ray angles comprises:
and sampling the light shielding information of the target object at the plurality of light angles respectively, and determining the highest intersection point and the lowest intersection point of the target object at different light angles as boundary intersection points.
3. The method of claim 1, wherein converting the shadow information of the plurality of ray angles into texture information comprises:
projecting the shadow information of the light angles to a cylindrical coordinate space to obtain projection information;
and converting the projection information into texture information.
4. A method of shadow rendering, comprising:
responding to a shadow rendering instruction aiming at a target object, and acquiring target shadow information corresponding to a current ray angle from texture information corresponding to the target object; the texture information is obtained by converting shadow information of a plurality of light angles; the shadow information of the plurality of light angles is determined according to the light shielding information of the target object at the plurality of light angles; the light shielding information is obtained by respectively emitting light to the target object at the plurality of light angles; the shadow information comprises boundary intersection points of the target object on different ray angles, which are obtained by sampling through the ray occlusion information; the light angles correspond to light source positions at different times;
rendering a shadow that generates the target object based on the target shadow information.
5. The method according to claim 4, wherein the obtaining target shadow information corresponding to a current ray angle from texture information corresponding to the target object comprises:
acquiring target texture information corresponding to the current light angle from the texture information corresponding to the target object;
and converting the target texture information into target shadow information.
6. The method of claim 4, wherein rendering shadows that generate the target object based on the target shadow information comprises:
determining a shadow shape according to the target shadow information;
and drawing the shadow corresponding to the shadow shape according to the drawing parameters.
7. The method of claim 4, wherein rendering the shadow of the target object based on the target shadow information comprises:
acquiring a shadow map of the target object;
determining a shadow shape according to whether a pixel point in the shadow map is located in a shadow range defined by the target shadow information;
and drawing the shadow corresponding to the shadow shape in the shadow map according to the drawing parameters.
8. The method of claim 7, wherein the acquiring a shadow map of the target object comprises:
and drawing the shadow map of the target object according to the current light angle and/or the specification parameters of the target object.
9. The method of claim 4, wherein the obtaining target shadow information corresponding to the current ray angle from texture information corresponding to a target object in response to a shadow rendering instruction for the target object comprises:
in response to a shadow rendering instruction for a target object, determining whether a line-of-sight distance of the target object is greater than a predetermined distance;
and under the condition that the line-of-sight distance is greater than the preset distance, acquiring target shadow information corresponding to the current light angle from the texture information corresponding to the target object.
10. A computing device comprising a processing component and a storage component;
the storage component stores one or more computer instructions; the one or more computer instructions to be invoked for execution by the processing component to implement the shadow processing method of any of claims 1~3 or the shadow rendering method of any of claims 4~9.
11. A computer storage medium, characterized in that a computer program is stored thereon, which, when executed by a computer, implements the shadow processing method of any one of claims 1 to 3 or the shadow rendering method of any one of claims 4 to 9.
CN202211156153.6A 2022-09-22 2022-09-22 Shadow processing method, shadow rendering method and device Active CN115239869B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211156153.6A CN115239869B (en) 2022-09-22 2022-09-22 Shadow processing method, shadow rendering method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211156153.6A CN115239869B (en) 2022-09-22 2022-09-22 Shadow processing method, shadow rendering method and device

Publications (2)

Publication Number Publication Date
CN115239869A CN115239869A (en) 2022-10-25
CN115239869B true CN115239869B (en) 2023-03-24

Family

ID=83667061

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211156153.6A Active CN115239869B (en) 2022-09-22 2022-09-22 Shadow processing method, shadow rendering method and device

Country Status (1)

Country Link
CN (1) CN115239869B (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103903296A (en) * 2014-04-23 2014-07-02 东南大学 Method for shadow rendering in virtual home decoration indoor scene design
CN109993823A (en) * 2019-04-11 2019-07-09 腾讯科技(深圳)有限公司 Shading Rendering method, apparatus, terminal and storage medium
CN111862295A (en) * 2020-07-17 2020-10-30 完美世界(重庆)互动科技有限公司 Virtual object display method, device, equipment and storage medium
CN113012274A (en) * 2021-03-24 2021-06-22 北京壳木软件有限责任公司 Shadow rendering method and device and electronic equipment
CN114067043A (en) * 2021-11-09 2022-02-18 网易(杭州)网络有限公司 Rendering method and device of virtual model
CN114596401A (en) * 2020-11-20 2022-06-07 华为云计算技术有限公司 Rendering method, device and system
WO2022126145A1 (en) * 2022-02-01 2022-06-16 Innopeak Technology, Inc. Hybrid shadow rendering

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102016103891A1 (en) * 2015-03-03 2016-09-08 Imagination Technologies Limited Systems and methods for soft shading in 3D rendering


Also Published As

Publication number Publication date
CN115239869A (en) 2022-10-25

Similar Documents

Publication Publication Date Title
US20230053462A1 (en) Image rendering method and apparatus, device, medium, and computer program product
CN108154548B (en) Image rendering method and device
CN111260766B (en) Virtual light source processing method, device, medium and electronic equipment
CN111739142A (en) Scene rendering method and device, electronic equipment and computer readable storage medium
US20240005592A1 (en) Image rendering method and apparatus, device, and storage medium
GB2591354A (en) Video frame processing method and apparatus
US20240087219A1 (en) Method and apparatus for generating lighting image, device, and medium
CN115512025A (en) Method and device for detecting model rendering performance, electronic device and storage medium
CN115984449A (en) Illumination rendering method and device, electronic equipment and storage medium
WO2022063260A1 (en) Rendering method and apparatus, and device
CN113313622A (en) Illumination simulation method and device in virtual scene and electronic equipment
CN115239869B (en) Shadow processing method, shadow rendering method and device
CN109448123B (en) Model control method and device, storage medium and electronic equipment
CN114428573B (en) Special effect image processing method and device, electronic equipment and storage medium
CN115131531A (en) Virtual object display method, device, equipment and storage medium
CN112967369A (en) Light ray display method and device
KR102235679B1 (en) Device and method to display object with visual effect
KR20230013099A (en) Geometry-aware augmented reality effects using real-time depth maps
KR20230022153A (en) Single-image 3D photo with soft layering and depth-aware restoration
US10235798B2 (en) System and method for rendering shadows for a virtual environment
CN114245907A (en) Auto-exposure ray tracing
CN112184873B (en) Fractal graph creation method, fractal graph creation device, electronic equipment and storage medium
CN116912387A (en) Texture map processing method and device, electronic equipment and storage medium
CN115170715A (en) Image rendering method and device, electronic equipment and medium
CN112184873A (en) Fractal graph creating method and device, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
REG Reference to a national code

Ref country code: HK

Ref legal event code: DE

Ref document number: 40081904

Country of ref document: HK