CN115970275A - Projection processing method and device for virtual object, storage medium and electronic equipment

Info

Publication number
CN115970275A
Authority
CN
China
Prior art keywords
projection
virtual object
scene
game scene
light source
Prior art date
Legal status
Pending
Application number
CN202211533161.8A
Other languages
Chinese (zh)
Inventor
陈晨
Current Assignee
Netease Hangzhou Network Co Ltd
Original Assignee
Netease Hangzhou Network Co Ltd
Priority date
Filing date
Publication date
Application filed by Netease Hangzhou Network Co Ltd
Priority to CN202211533161.8A
Publication of CN115970275A

Classifications

    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02B: CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO BUILDINGS, e.g. HOUSING, HOUSE APPLIANCES OR RELATED END-USER APPLICATIONS
    • Y02B 20/00: Energy efficient lighting technologies, e.g. halogen lamps or gas discharge lamps
    • Y02B 20/40: Control techniques providing energy savings, e.g. smart controller or presence detection

Abstract

The disclosure provides a projection processing method and apparatus for a virtual object, a computer-readable storage medium, and an electronic device, and relates to the field of computer technology. The projection processing method of the virtual object includes the following steps: in response to a controlled virtual object entering an indoor game scene, acquiring first position information of the controlled virtual object in the indoor game scene and second position information of a pseudo light source specified in the indoor game scene; determining a projection position and a projection angle of the controlled virtual object in the indoor game scene based on the first position information and the second position information; and displaying a projection decal in the indoor game scene at the projection position according to the projection angle, so as to simulate the projection of the controlled virtual object in the indoor game scene, where the projection decal is rendered from scene information of the indoor game scene and a specified projection material. The method improves the realism of the virtual object's projection and thereby the user experience.

Description

Projection processing method and device for virtual object, storage medium and electronic equipment
Technical Field
The present disclosure relates to the field of computer technologies, and in particular, to a projection processing method for a virtual object, a projection processing apparatus for a virtual object, a computer-readable storage medium, and an electronic device.
Background
In game production, a ground projection is generated for a virtual object according to the light sources in the game scene, so that the virtual object appears realistically grounded.
In the related art, after lighting for a game scene has been built, only one or two main light sources are retained, or the light sources are even deleted entirely, in order to reduce the performance cost of shadow rendering. However, this approach can make the ground projection of a virtual object very faint, or remove it altogether, which reduces the realism of the virtual object's movement through the scene and thereby degrades the user experience.
Disclosure of Invention
The present disclosure provides a projection processing method for a virtual object, a projection processing apparatus for a virtual object, a computer-readable storage medium, and an electronic device, thereby alleviating, at least to some extent, the problem of low realism in virtual object projection.
Additional features and advantages of the disclosure will be set forth in the detailed description which follows, or in part will be obvious from the description, or may be learned by practice of the disclosure.
According to a first aspect of the present disclosure, a projection processing method for a virtual object is provided, applied to an indoor game scene, where the indoor game scene includes a static virtual object whose projection is generated by a one-time calculation in a scene baking manner. The method includes: in response to a controlled virtual object entering the indoor game scene, acquiring first position information of the controlled virtual object in the indoor game scene and second position information of a pseudo light source specified in the indoor game scene, where the pseudo light source is a light source that simulates the position of a real light source in the indoor game scene but produces no illumination effect; determining a projection position and a projection angle of the controlled virtual object in the indoor game scene based on the first position information and the second position information; and displaying a projection decal in the indoor game scene at the projection position according to the projection angle, so as to simulate the projection of the controlled virtual object in the indoor game scene, where the projection decal is rendered from scene information of the indoor game scene and a specified projection material.
According to a second aspect of the present disclosure, a projection processing apparatus for a virtual object is provided, applied to an indoor game scene, where the indoor game scene includes a static virtual object whose projection is generated by a one-time calculation in a scene baking manner. The apparatus includes: a position information acquisition module configured to acquire, in response to a controlled virtual object entering the indoor game scene, first position information of the controlled virtual object in the indoor game scene and second position information of a pseudo light source specified in the indoor game scene, where the pseudo light source is a light source that simulates the position of a real light source in the indoor game scene but produces no illumination effect; a projection position and projection angle determination module configured to determine a projection position and a projection angle of the controlled virtual object in the indoor game scene based on the first position information and the second position information; and a projection generation module configured to display a projection decal in the indoor game scene at the projection position according to the projection angle, so as to simulate the projection of the controlled virtual object in the indoor game scene, where the projection decal is rendered from scene information of the indoor game scene and a specified projection material.
According to a third aspect of the present disclosure, there is provided a computer-readable storage medium, on which a computer program is stored, which, when executed by a processor, implements the projection processing method of a virtual object of the first aspect described above and possible implementations thereof.
According to a fourth aspect of the present disclosure, there is provided an electronic device comprising: a processor; and a memory for storing executable instructions of the processor; wherein the processor is configured to execute, via the executable instructions, the projection processing method of the virtual object of the first aspect and possible implementations thereof.
The technical solution of the present disclosure has the following beneficial effects:
Based on the above method, on the one hand, the projection position and projection angle of the controlled virtual object are determined from the position information of the controlled virtual object and of the pseudo light source in the indoor game scene, and the projection decal is displayed at that position according to that angle, which avoids the weak projection that results when the projection is derived from the main light source alone. The projection decal is rendered from the scene information of the game scene and a specified projection material, and the projection of the virtual object is then generated from the decal, so lighting does not have to be rebuilt continuously as the controlled virtual object moves; the projection of the controlled virtual object therefore stays realistic while the performance overhead of projection processing is reduced, improving the overall efficiency of the projection processing of the virtual object. On the other hand, the controlled virtual object still casts a ground projection, derived from the pseudo light source, even when the illumination is weak, which improves the realism of the controlled virtual object's movement in the scene, raises the quality of the virtual object's projection, and improves the user experience.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and together with the description, serve to explain the principles of the disclosure. It is to be understood that the drawings in the following description are merely exemplary of the disclosure, and that other drawings may be derived from those drawings by one of ordinary skill in the art without the exercise of inventive faculty.
FIG. 1 shows a system architecture of the runtime environment of this exemplary embodiment;
fig. 2 is a schematic diagram showing a flow of a projection processing method of a virtual object in the present exemplary embodiment;
FIG. 3 is a schematic diagram of a process for acquiring projection positions and projection angles in the exemplary embodiment;
FIG. 4 is a schematic diagram of a light source illuminating a controlled virtual object to produce a ground projection;
fig. 5 shows a schematic view of a projection decal in this exemplary embodiment;
fig. 6 shows a schematic flow chart of one method of generating a projected decal in this exemplary embodiment;
fig. 7 is a flowchart illustrating a process of acquiring scene information according to the present exemplary embodiment;
FIG. 8 shows a schematic of the sampling process in a Kawase Blur process;
fig. 9 (a) shows a schematic diagram of a variation in transparency of a projection decal in this exemplary embodiment;
fig. 9 (b) is a schematic diagram showing a variation in blurring effect of one kind of projection decal in the present exemplary embodiment;
fig. 10 shows a flowchart for generating a projection decal in an exemplary embodiment;
fig. 11 shows another flowchart for generating a projection decal in this exemplary embodiment;
fig. 12 is a flowchart showing a projection processing method of a virtual object in the present exemplary embodiment;
fig. 13 shows a schematic view of the virtual illumination range of a pseudo light source in this exemplary embodiment;
fig. 14 is a schematic diagram of a loaded projection blueprint in accordance with an exemplary embodiment;
fig. 15 is a schematic diagram showing a projection processing apparatus of a virtual object in the present exemplary embodiment;
fig. 16 shows a schematic configuration diagram of an electronic device in the present exemplary embodiment.
Detailed Description
Example embodiments will now be described more fully with reference to the accompanying drawings. Example embodiments may, however, be embodied in many different forms and should not be construed as limited to the examples set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of example embodiments to those skilled in the art. The described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. In the following description, numerous specific details are provided to give a thorough understanding of embodiments of the disclosure. One skilled in the relevant art will recognize, however, that the embodiments of the disclosure can be practiced without one or more of the specific details, or with other methods, components, devices, steps, etc. In other instances, well-known technical solutions have not been shown or described in detail to avoid obscuring aspects of the present disclosure.
Furthermore, the drawings are merely schematic illustrations of the present disclosure and are not necessarily drawn to scale. The same reference numerals in the drawings denote the same or similar parts, and thus their repetitive description will be omitted. Some of the block diagrams shown in the figures are functional entities and do not necessarily correspond to physically or logically separate entities. These functional entities may be implemented in the form of software, or in one or more hardware modules or integrated circuits, or in different networks and/or processor devices and/or microcontroller devices.
In the related art, only one or two main light sources are retained after lighting for a game scene has been built, in order to reduce the performance cost of shadow rendering. However, in parts of the scene far from the main light source, this tends to make the ground projection of a movable virtual object very faint or make it disappear entirely, so the player visually perceives that the virtual object is not standing on the ground. The realism of the virtual object's movement through the scene suffers, which in turn reduces the player's sense of immersion.
In view of one or more of the above problems, the exemplary embodiments of the present disclosure first provide a projection processing method for a virtual object. The system architecture of its runtime environment is described below with reference to fig. 1.
Referring to fig. 1, the system architecture 100 may include a terminal device 110 and a server 120. The terminal device 110 may be an electronic device such as a desktop computer, a notebook computer, or a smartphone, and may be configured to obtain scene information of the area where the controlled virtual object is located in the game scene. The server 120 generally refers to a background system that provides services related to the projection processing of virtual objects in this exemplary embodiment, for example a server implementing the projection processing method of a virtual object. The server 120 may be a single server or a server cluster, which the present disclosure does not limit. The terminal device 110 and the server 120 may be connected through a wired or wireless communication link for data interaction.
In one embodiment, the projection processing method of the virtual object in this exemplary embodiment may be performed by the terminal device 110. For example, in a game scene where the controlled virtual object stands on the ground, the terminal device 110 may be the device running the current game, with the game scene displayed on its screen. When the controlled virtual object enters an indoor game scene, the terminal device 110 may determine the projection position and projection angle of the controlled virtual object from the position information of the controlled virtual object and the position information of the pseudo light source in the indoor game scene, render the projection decal from the scene information of the game scene and a specified projection material, and finally display the projection decal at the projection position according to the projection angle to form the projection of the controlled virtual object.
In one embodiment, the terminal device 110 may run the game and, after the controlled virtual object enters the indoor game scene, obtain the position information of the controlled virtual object in real time and send it, together with the position information of the pseudo light source in the indoor game scene, to the server 120. After receiving the two pieces of position information, the server 120 may determine the projection position and projection angle of the controlled virtual object, render the projection decal from the scene information of the indoor game scene and the specified projection material, and then send the projection position, the projection angle, and the projection decal to the terminal device 110. After receiving them, the terminal device 110 displays the projection decal at the projection position according to the projection angle to form the projection of the controlled virtual object.
The projection processing method of the virtual object will be described with reference to fig. 2. Fig. 2 shows an exemplary flow of a projection processing method of a virtual object, including the following steps S210 to S230:
step S210, responding to the controlled virtual object entering the indoor game scene, and acquiring first position information of the controlled virtual object in the indoor game scene and second position information of a pseudo light source appointed in the indoor game scene; the pseudo light source is a light source which is used for simulating a real light source position in an indoor game scene and does not generate an illumination effect;
step S220, determining the projection position and the projection angle of the controlled virtual object in the indoor game scene based on the first position information and the second position information;
step S230, displaying a projection applique in the indoor game scene at the projection position according to the projection angle so as to simulate the projection of the controlled virtual object in the indoor game scene; the projection applique is generated by rendering according to scene information in an indoor game scene and a specified projection material.
Based on the above method, on the one hand, the projection position and projection angle of the controlled virtual object are determined from the position information of the controlled virtual object and of the pseudo light source in the indoor game scene, and the projection decal is displayed at that position according to that angle, which avoids the weak projection that results when the projection is derived from the main light source alone. The projection decal is rendered from the scene information of the game scene and a specified projection material, and the projection of the virtual object is then generated from the decal, so lighting does not have to be rebuilt continuously as the controlled virtual object moves; the projection of the controlled virtual object therefore stays realistic while the performance overhead of projection processing is reduced, improving the overall efficiency of the projection processing of the virtual object. On the other hand, the controlled virtual object still casts a ground projection, derived from the pseudo light source, even when the illumination is weak, which improves the realism of the controlled virtual object's movement in the scene, raises the quality of the virtual object's projection, and improves the user experience.
Each step in fig. 2 is explained in detail below.
Step S210, in response to the controlled virtual object entering the indoor game scene, acquiring first position information of the controlled virtual object in the indoor game scene and second position information of a pseudo light source specified in the indoor game scene; the pseudo light source is a light source that simulates the position of a real light source in the indoor game scene and produces no illumination effect.
The virtual object may be any virtual object in a game; the present disclosure does not limit its specific content. For example, it may be a character in the game or a virtual item in the game. The controlled virtual object is a virtual object controlled by the user or by the terminal device 110, for example a game character controlled by the player. The first position information may describe the position of the controlled virtual object in the indoor game scene, for example the coordinates of the controlled virtual object in a three-dimensional coordinate system constructed for the indoor game scene. The second position information may describe the position of the pseudo light source in the indoor game scene, for example its coordinates and its illumination direction. The pseudo light source produces no illumination effect in the game scene but has a virtual illumination range; when a virtual object enters this range, the projection processing method is triggered and the projection of the controlled virtual object is generated through the projection decal. The shape of this range is not limited by the present disclosure; for example, considering that the controlled virtual object may fly as well as walk in the game, the range may be spherical, so that entry into the range is detected in all directions.
For example, the virtual illumination range of the pseudo light source may cover the entire indoor game scene; in that case, as soon as the controlled virtual object enters the indoor game scene, the first position information of the controlled virtual object and the second position information of the pseudo light source specified in the indoor game scene may be acquired immediately, and the projection of the controlled virtual object generated. A minimal sketch of the range test follows.
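As an illustration of the trigger condition only (the patent does not provide code), the spherical range test can be sketched in plain C++ as follows; the Vec3 type and all names here are assumptions, not the patent's implementation.

```cpp
// Sketch: test whether the controlled virtual object has entered the
// spherical virtual illumination range of the pseudo light source.
struct Vec3 {
    float X, Y, Z;
};

static float DistSquared(const Vec3& A, const Vec3& B) {
    const float DX = A.X - B.X, DY = A.Y - B.Y, DZ = A.Z - B.Z;
    return DX * DX + DY * DY + DZ * DZ;
}

// Returns true when the object is inside the pseudo light source's
// spherical range, i.e. when projection processing should be triggered.
bool IsInsideVirtualRange(const Vec3& ObjectPos,   // first position information
                          const Vec3& LightPos,    // second position information
                          float RangeRadius) {
    return DistSquared(ObjectPos, LightPos) <= RangeRadius * RangeRadius;
}
```

Comparing squared distances avoids a square root per test, which matters when the check runs every frame for each movable object.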
After the first position information and the second position information are acquired, with continuing reference to fig. 2, a projection position and a projection angle of the controlled virtual object in the indoor game scene may be determined based on the first position information and the second position information in step S220.
Here, the projection position is the position at which the projection of the controlled virtual object is displayed, and the projection angle is the display angle of that projection.
In one embodiment, the determining the projection position and the projection angle of the controlled virtual object in the indoor game scene based on the first position information and the second position information may include steps S310 to S320:
step S310, determining a projection position of the controlled virtual object based on the first position information;
step S320, determining a projection angle of the controlled virtual object according to the position relationship between the first position information and the second position information.
In step S310, the projection position may be determined from the first position information; in step S320, the projection angle is adjusted in real time according to the positional relationship between the first position information and the second position information, so that the displayed projection fits the pose of the controlled virtual object. This improves the realism of the projection of the controlled virtual object and the visual impression for the user.
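The following is a minimal C++ sketch of how steps S310 and S320 might be realized, assuming the projection lies on a flat ground plane and the projection angle reduces to a yaw around the vertical axis; all names and the ground-plane assumption are illustrative, not taken from the patent.

```cpp
#include <cmath>

struct Vec3 { float X, Y, Z; };

struct ProjectionTransform {
    Vec3  Position;  // where the projection decal is displayed
    float YawRad;    // display angle of the decal around the up axis
};

ProjectionTransform ComputeProjection(const Vec3& ObjectPos,  // first position
                                      const Vec3& LightPos,   // second position
                                      float GroundZ) {
    ProjectionTransform Out;
    // S310: take the object's position projected onto the ground plane.
    Out.Position = { ObjectPos.X, ObjectPos.Y, GroundZ };
    // S320: orient the decal along the light-to-object direction, so the
    // shadow points away from the pseudo light source.
    Out.YawRad = std::atan2(ObjectPos.Y - LightPos.Y,
                            ObjectPos.X - LightPos.X);
    return Out;
}
```

Because the angle depends only on the two positions, re-running this computation each frame keeps the decal aligned as either the object or the (virtual) light moves.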
After the projection position and projection angle are determined, and with continued reference to fig. 2, a projection decal may be displayed in the indoor game scene at the projection position according to the projection angle in step S230, to simulate the projection of the controlled virtual object in the indoor game scene; the projection decal is rendered from scene information of the indoor game scene and a specified projection material.
The scene information may be image information of one or more frames of the game scene; the present disclosure does not limit its specific content or how it is obtained. For example, the scene information may be captured in real time through a SceneCaptureComponent2D, whose principle is similar to receiving images from a video camera: it captures the scene from a virtual camera and stores it as an image from which a material for the virtual object can be generated. The scene information may include RGB information and Alpha information.
Rendering is the process of visually presenting a game effect in game production; the present disclosure does not limit the specific rendering process. For example, as shown in fig. 4, a game scene in which a light source illuminates a virtual object to produce a ground projection may be rendered by the UE4 engine. Stylized rendering generally needs to achieve a particular art effect that the engine's default rendering path cannot provide, so the rendering logic must be written by hand to match the requirement: the rendering pipeline can be modified at the lower layers of the UE4 engine, or the rendering logic can be written directly at the material level.
A decal is a material that can be projected onto mesh bodies, including static meshes and skeletal meshes, whether those meshes are static or movable. Many decals can be rendered simultaneously to enrich the game's visuals without a significant performance cost. In game content creation, decals are mostly used to display stains on object surfaces, such as rust on metal, moss on a damp wall, or bloodstains on the walls of a horror game. The projection decal here serves as the material of the controlled virtual object's projection; as shown in fig. 5, it corresponds to the shadow on the ground in the figure. Generating the projection of the virtual character through a decal gives the controlled virtual object a clear projection on the ground, improving the realism of its movement through the scene while reducing performance overhead.
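As a hedged illustration of the decal mechanism in UE4 (the engine the embodiment discusses), a decal can be spawned at a computed projection position and angle roughly as follows; the material pointer, size values, and function name are assumptions, while UGameplayStatics::SpawnDecalAtLocation is an existing engine API.

```cpp
#include "Kismet/GameplayStatics.h"
#include "Components/DecalComponent.h"

UDecalComponent* SpawnProjectionDecal(UWorld* World,
                                      UMaterialInterface* ProjectionMaterial,
                                      const FVector& ProjectionPos,
                                      float YawDegrees) {
    // A pitch of -90 projects the decal downward onto the floor; the yaw
    // carries the projection angle derived from the pseudo light source.
    const FRotator Rotation(-90.0f, YawDegrees, 0.0f);
    const FVector DecalSize(128.0f, 256.0f, 256.0f); // depth, height, width
    return UGameplayStatics::SpawnDecalAtLocation(
        World, ProjectionMaterial, DecalSize, ProjectionPos, Rotation,
        /*LifeSpan=*/0.0f); // 0 = persists until destroyed manually
}
```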
A specific process of generating a projection decal by rendering according to scene information in an indoor game scene and a specified projection material will be described below.
In one embodiment, rendering the projection decal from scene information of the indoor game scene and the specified projection material may include steps S610 to S620:
Step S610, acquiring the scene information;
Step S620, importing the scene information into a rendering component, and rendering the projection material through the rendering component to generate the projection decal.
The rendering component may be a game production component that renders the projection material according to the scene information; the present disclosure does not limit its specific form. For example, the rendering component may be a Render Target, a special texture that stores color information and, unlike a conventional texture, can be both written to and read from.
In step S610, scene information may be first acquired, and in one embodiment, the acquiring of the scene information may include the following steps S710 to S720:
step S710, acquiring one or more frames of scene images of the area where the controlled virtual object is located in the game scene captured by the virtual camera view port;
step S720, extracting color information and transparency information in one or more frames of scene images to obtain scene information.
The color information and transparency information describe the colors and transparency in the one or more frames of scene images; the present disclosure does not limit their specific content. For example, the color information may be the RGB information of the scene images, and the transparency information may be their Alpha information.
In step S710, one or more frames of scene images of an area where a controlled virtual object is located in a game scene captured by a viewport of a virtual camera may be obtained; in step S720, color information and transparency information in the scene image may be extracted to obtain scene information.
For example, the CaptureSource option of the SceneCaptureComponent2D component may be set to "SceneColor (HDR) in RGB, Inv Opacity in A" to capture the RGB and Alpha information of one or more frames of the virtual camera viewport as the scene information. In addition, staff can configure which information types are captured from the virtual camera viewport to obtain richer scene information.
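A hedged UE4 C++ sketch of this capture setup might look as follows; the render target resolution and variable names are assumptions, while CaptureSource, TextureTarget, and CaptureScene are existing USceneCaptureComponent2D members.

```cpp
#include "Components/SceneCaptureComponent2D.h"
#include "Engine/TextureRenderTarget2D.h"
#include "Kismet/KismetRenderingLibrary.h"

void CaptureSceneInfo(USceneCaptureComponent2D* Capture, UObject* WorldContext) {
    // Create a render target to temporarily store the captured RGB + Alpha.
    UTextureRenderTarget2D* Target =
        UKismetRenderingLibrary::CreateRenderTarget2D(WorldContext, 512, 512);
    Capture->TextureTarget = Target;
    // Scene color goes to RGB; inverse opacity goes to the alpha channel.
    Capture->CaptureSource = ESceneCaptureSource::SCS_SceneColorHDR;
    Capture->CaptureScene(); // capture one frame of the viewport area
}
```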
After the scene information is obtained, in step S620 the scene information may be imported into the rendering component, and the projection material rendered through the rendering component to generate the projection decal.
The scene information is imported into the rendering component, which temporarily stores it and applies image processing to obtain the projection decal. Illustratively, the scene information may include the RGB and Alpha information, and the rendering component may include a Render Target of the UE4 engine; the RGB and Alpha information may be written into the Render Target for temporary storage.
In an embodiment, rendering the projection material through the rendering component to generate the projection decal may include the following step:
performing one or more of blur processing, color adjustment, and transparency adjustment on the projection material through the rendering component to generate the projection decal.
Blur processing can be applied to the projection material so that its appearance is closer to a real projection; the present disclosure does not limit the specific blur operation. For example, the projection material may be blurred with Kawase Blur or Gaussian Blur, whose visual results are very close to each other.
As shown in fig. 8, the principle of Kawase Blur is to sample four diagonal corner points at positions progressively farther from the current pixel, ping-ponging blits between two equally sized textures; instead of the fixed blur kernel of a Gaussian blur or a fast mean blur (Box Blur), it uses a blur kernel whose offset grows with the iteration count.
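A CPU-side sketch of a single Kawase pass, written for clarity rather than performance, is shown below; real implementations run this in a shader with half-texel offsets and bilinear filtering, so this single-channel, integer-offset version is only an approximation and all names are illustrative.

```cpp
#include <algorithm>
#include <vector>

struct Image {
    int Width = 0, Height = 0;
    std::vector<float> Pixels; // single channel for brevity

    float At(int X, int Y) const {
        X = std::clamp(X, 0, Width - 1);   // clamp at the borders
        Y = std::clamp(Y, 0, Height - 1);
        return Pixels[static_cast<size_t>(Y) * Width + X];
    }
};

// One Kawase iteration: average four diagonal samples whose distance grows
// with the iteration index. Caller ping-pongs Src and Dst between passes.
void KawasePass(const Image& Src, Image& Dst, int Iteration) {
    const int O = Iteration + 1; // sample offset grows each iteration
    Dst = Src;                   // copy dimensions and storage
    for (int Y = 0; Y < Src.Height; ++Y)
        for (int X = 0; X < Src.Width; ++X)
            Dst.Pixels[static_cast<size_t>(Y) * Src.Width + X] =
                0.25f * (Src.At(X - O, Y - O) + Src.At(X + O, Y - O) +
                         Src.At(X - O, Y + O) + Src.At(X + O, Y + O));
}
```

Each pass touches only four samples per pixel regardless of the blur radius, which is why Kawase Blur is cheaper than a wide Gaussian kernel of similar appearance.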
Gaussian blur, also called Gaussian smoothing, is a classic blur algorithm. In image processing it is commonly used to reduce image noise and soften detail; the visual effect resembles viewing the image through a translucent screen. Blurring an image is essentially filtering out its high-frequency signal while keeping the low-frequency signal, and the usual way to filter the high frequencies is convolution. Gaussian blur therefore amounts to convolving the image with a normal distribution, also called a Gaussian distribution, which is where the technique gets its name; and since the Fourier transform of a Gaussian function is another Gaussian function, Gaussian-blurring an image behaves like passing it through a low-pass filter.
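For comparison, a small helper that builds the normalized 1D kernel used by a separable Gaussian blur might look like this (illustrative, not from the patent); applying it first horizontally and then vertically performs the full 2D blur.

```cpp
#include <cmath>
#include <vector>

std::vector<float> GaussianKernel1D(int Radius, float Sigma) {
    std::vector<float> Kernel(2 * Radius + 1);
    float Sum = 0.0f;
    for (int I = -Radius; I <= Radius; ++I) {
        // Unnormalized Gaussian weight exp(-x^2 / (2*sigma^2)).
        const float W = std::exp(-(I * I) / (2.0f * Sigma * Sigma));
        Kernel[I + Radius] = W;
        Sum += W;
    }
    for (float& W : Kernel) W /= Sum; // normalize so the weights sum to 1
    return Kernel;
}
```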
Color adjustment and transparency adjustment further tune the color and transparency of the projection material within the rendering component; the present disclosure does not limit their specific steps. For example, the color and transparency of the projection material may be matched to the color and transparency of the ground projection produced when the controlled virtual object stands within the range of a direct light source, yielding a more convincing projection decal.
In one embodiment, the Render Target rendering component temporarily stores the scene's RGB and Alpha information; a new material is created that reads the scene information from the Render Target, Kawase Blur is applied to obtain blurred scene information, and the Alpha information of the blurred result is extracted and displayed as a decal, producing the projection decal of the controlled virtual object.
In one embodiment, after receiving the player's "modify projection color", "modify projection blurriness", or "modify projection transparency" command, the blurriness, transparency, and color of the projection decal can be adjusted dynamically, as shown in fig. 9(a) and fig. 9(b), allowing the player to personalize the projection of the controlled virtual object and improving the user experience.
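If the decal material exposes its color, blur strength, and opacity as material parameters, such runtime adjustment could be wired up in UE4 roughly as follows; the parameter names are assumptions, while the UMaterialInstanceDynamic setters are existing engine APIs.

```cpp
#include "Materials/MaterialInstanceDynamic.h"

// Push the player's projection settings into the decal's dynamic material.
void ApplyPlayerProjectionSettings(UMaterialInstanceDynamic* DecalMID,
                                   const FLinearColor& Color,
                                   float Blurriness, float Opacity) {
    DecalMID->SetVectorParameterValue(TEXT("ShadowColor"), Color);
    DecalMID->SetScalarParameterValue(TEXT("BlurStrength"), Blurriness);
    DecalMID->SetScalarParameterValue(TEXT("Opacity"), Opacity);
}
```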
Based on the method of fig. 6, the projection decal of the controlled virtual object is generated from the scene information, addressing the weak projection of the controlled virtual object when the scene's light sources are limited; faking the projection through the decal technique makes the character's movement through the scene more realistic.
In addition, a projection decal of the controlled virtual object can also be generated by means of shadow maps.
In one embodiment, as shown in fig. 10, generating a projected decal for the controlled virtual object may further include steps S1010-S1020:
step S1010, acquiring a first depth value of a characteristic point of a controlled virtual object in a light source coordinate system and a second depth value of the characteristic point of the controlled virtual object in an eye coordinate system; the light source coordinate system is a coordinate system constructed by taking a pseudo light source as a viewpoint;
step S1020, a projection applique of the controlled virtual object is generated according to the comparison result of the first depth value and the second depth value.
A feature point of the controlled virtual object is one or more points on the controlled virtual object, for example its pixels or voxels. A depth value represents the distance between the virtual camera and the virtual object; in this embodiment the viewpoint, the position of the virtual camera, and the origin of the light source coordinate system may coincide, so the depth value of a feature point is its distance from the origin of the light source coordinate system. The light source coordinate system is constructed with the position of the pseudo light source as the viewpoint and the illumination direction of the pseudo light source as the viewing direction. The eye coordinate system is constructed from the direction in which the user looks at the screen of the terminal device; in a game scene, for example, the direction in which the player looks at the computer screen running the game serves as the viewing direction, and the position of the player's eye serves as the origin of the coordinate system.
For example, the first depth value may first be obtained from the distance between a feature point of the controlled virtual object and the origin of the light source coordinate system; the feature point's second depth value in the eye coordinate system is then compared with the first depth value. If the first depth value of the feature point is greater than the second depth value, the feature point is determined to belong to the projection decal, and setting all such feature points to a preset shadow color yields the projection decal of the controlled virtual object.
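Reduced to code, the per-feature-point test is a one-line comparison; the bias term below is an assumption (a standard shadow-mapping detail, not stated in the patent) added to suppress self-shadowing artifacts.

```cpp
// A feature point is marked as part of the projection decal when its depth
// from the pseudo light source exceeds its depth in the eye coordinate
// system (steps S1010-S1020).
bool BelongsToProjectionDecal(float LightSpaceDepth, // first depth value
                              float EyeSpaceDepth,   // second depth value
                              float Bias = 0.005f) {
    return LightSpaceDepth > EyeSpaceDepth + Bias;
}
```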
Based on the method of fig. 10, the projection decal can be generated by comparing the depth values of the controlled virtual object's feature points in different spaces; the calculation is simple, which helps improve overall running efficiency.
In one embodiment, as shown in fig. 11, a projection decal for the controlled virtual object may also be generated through steps S1110 to S1140:
step S1110, acquiring a first depth map of an indoor game scene in a light source coordinate system and a second depth map in an eye coordinate system; the light source coordinate system is a coordinate system constructed by taking a pseudo light source as a viewpoint;
step S1120, mapping the second depth map to a light source coordinate system to obtain a third depth map;
step S1130, acquiring a middle shadow map according to a comparison result of the depth values of the first depth map and the third depth map;
step S1140, the intermediate shadow map is sampled based on the position information of the controlled virtual object in the light source coordinate system to generate a projection decal of the controlled virtual object.
Wherein the depth map may include a depth value for each point in the indoor game scene.
For example, the LightMode tag may be set to "ShadowCaster" in the Unity game engine to obtain a shadow-mapping texture of the indoor game scene in the light source coordinate system, from which the first depth map is obtained. A second depth map of the indoor game scene in the eye coordinate system is then obtained, and the points of the second depth map are mapped into the light source coordinate system to obtain the third depth map. The intermediate shadow map is built from the points whose depth values in the first depth map are greater than those in the third depth map, and sampling the intermediate shadow map at the coordinates of the controlled virtual object in the light source coordinate system generates the projection decal of the controlled virtual object.
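A hedged CPU sketch of steps S1120-S1130 follows; the embodiment performs this on the GPU, and the matrix type, the NDC-to-UV mapping, and all names here are assumptions for illustration.

```cpp
#include <algorithm>
#include <vector>

struct Vec3 { float X, Y, Z; };
struct Mat4 { float M[4][4]; }; // row-major eye-space -> light-clip transform

static Vec3 TransformPoint(const Mat4& T, const Vec3& P) {
    const float X = T.M[0][0]*P.X + T.M[0][1]*P.Y + T.M[0][2]*P.Z + T.M[0][3];
    const float Y = T.M[1][0]*P.X + T.M[1][1]*P.Y + T.M[1][2]*P.Z + T.M[1][3];
    const float Z = T.M[2][0]*P.X + T.M[2][1]*P.Y + T.M[2][2]*P.Z + T.M[2][3];
    const float W = T.M[3][0]*P.X + T.M[3][1]*P.Y + T.M[3][2]*P.Z + T.M[3][3];
    return { X / W, Y / W, Z / W }; // perspective divide into light-space NDC
}

static float SampleDepth(const std::vector<float>& Map, int W, int H,
                         float U, float V) {
    // Nearest-neighbor lookup with clamping; U, V in [0, 1].
    const int X = std::clamp(static_cast<int>(U * W), 0, W - 1);
    const int Y = std::clamp(static_cast<int>(V * H), 0, H - 1);
    return Map[static_cast<size_t>(Y) * W + X];
}

// S1120-S1130: remap each eye-space point into light space and compare its
// depth against the light's depth map, collecting the intermediate mask.
std::vector<bool> BuildShadowMask(const std::vector<Vec3>& EyeSpacePoints,
                                  const Mat4& EyeToLight,
                                  const std::vector<float>& LightDepthMap,
                                  int W, int H) {
    std::vector<bool> Mask;
    Mask.reserve(EyeSpacePoints.size());
    for (const Vec3& P : EyeSpacePoints) {
        const Vec3 L = TransformPoint(EyeToLight, P);  // third depth map
        const float U = 0.5f * (L.X + 1.0f);           // NDC -> UV
        const float V = 0.5f * (L.Y + 1.0f);
        Mask.push_back(L.Z > SampleDepth(LightDepthMap, W, H, U, V));
    }
    return Mask; // sampled at the object's light-space position in S1140
}
```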
Based on the method of fig. 11, the projection decal of the controlled virtual object is generated from depth maps of the indoor game scene in which it stands, which simplifies and speeds up decal generation and improves the overall running efficiency of the method.
A specific process of displaying the projection decal in the indoor game scene at the projection position according to the projection angle is described below.
In one embodiment, if the controlled virtual object enters the virtual illumination range of the pseudo light source, a projection blueprint is created, and the projection decal is displayed in the projection blueprint at the projection position according to the projection angle; if the controlled virtual object leaves the virtual illumination range of the pseudo light source, the projection blueprint is destroyed to cancel the display of the projection decal.
The projection blueprint may contain the generation logic of the projection decal; the present disclosure does not limit its specific contents. Illustratively, it may include scene information acquisition logic that updates the color, transparency, and blur of the projection material according to the scene information, and a binding event that binds the projection blueprint to the corresponding controlled virtual object so that the two stay synchronized in position.
In an embodiment, the determining the projection position and the projection angle of the controlled virtual object in the indoor game scene based on the first position information and the second position information may include:
and binding the projection blueprint with the controlled virtual object so as to determine the projection position and the projection angle in real time based on the first position information and the second position information.
In this exemplary embodiment, binding the projection blueprint to the controlled virtual object keeps their positions synchronized in real time, and determining the projection angle from the positions of the controlled virtual object and the pseudo light source allows the angle to be updated in real time, so the projection follows the motion of the controlled virtual object more closely and the player's sense of immersion in the game improves.
For example, if the virtual illumination range of the pseudo light source is spherical, a sphere collision component may be added to the pseudo light source and sphere collision logic written for it. When the controlled virtual object enters the virtual illumination range, the collision logic is triggered to create the projection blueprint; within the blueprint, the projection blueprint is bound to the controlled virtual object, the projection material is rendered from the scene information to generate the projection decal, the decal's position is synchronized to the position of the controlled virtual object, and the projection angle is adjusted in real time according to the positional relationship between the controlled virtual object and the pseudo light source. If the controlled virtual object leaves the virtual illumination range, the collision logic is triggered to destroy the projection blueprint, canceling the display of the controlled virtual object's projection decal.
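A hedged UE4 sketch of this trigger logic follows, assuming a pseudo-light-source actor class APseudoLightSource with a USphereComponent* Sphere member and UFUNCTION-declared overlap handlers; the class, member, and helper names are illustrative, while the overlap delegates and their signatures are the engine's own.

```cpp
#include "Components/SphereComponent.h"
#include "GameFramework/Actor.h"

// Inside the pseudo-light-source actor's constructor or BeginPlay:
void APseudoLightSource::SetupTrigger() {
    Sphere->SetSphereRadius(VirtualRangeRadius);
    Sphere->OnComponentBeginOverlap.AddDynamic(
        this, &APseudoLightSource::OnEnterRange);
    Sphere->OnComponentEndOverlap.AddDynamic(
        this, &APseudoLightSource::OnLeaveRange);
}

// Entering the spherical range creates the projection blueprint.
void APseudoLightSource::OnEnterRange(UPrimitiveComponent* OverlappedComp,
                                      AActor* OtherActor,
                                      UPrimitiveComponent* OtherComp,
                                      int32 OtherBodyIndex, bool bFromSweep,
                                      const FHitResult& SweepResult) {
    CreateProjectionBlueprint(OtherActor); // bind the decal to this object
}

// Leaving the range destroys the blueprint and cancels the decal display.
void APseudoLightSource::OnLeaveRange(UPrimitiveComponent* OverlappedComp,
                                      AActor* OtherActor,
                                      UPrimitiveComponent* OtherComp,
                                      int32 OtherBodyIndex) {
    DestroyProjectionBlueprint(OtherActor);
}
```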
Based on this method, when a limited number of scene light sources would leave the controlled virtual object with a weak projection, the decal technique fakes the projection from the scene information of the game scene, improving the realism of the controlled virtual object's movement through the game scene while reducing performance overhead and improving the efficiency of its projection processing.
In one embodiment, an exemplary flow of the projection processing method of the virtual object of the present disclosure is shown in fig. 12, and the projection processing of the virtual object may be performed according to steps S1201 to S1210.
Step S1201, when the controlled virtual object enters the virtual illumination range of the pseudo light source, creating a projection blueprint;
Step S1202, in the projection blueprint, acquiring one or more frames of scene images of the area where the controlled virtual object is located in the game scene, captured through the virtual camera viewport;
Step S1203, extracting color information and transparency information from the one or more frames of scene images to obtain the scene information;
Step S1204, importing the scene information into the rendering component;
Step S1205, performing one or more of blur processing, color adjustment, and transparency adjustment on the projection material through the rendering component to generate the projection decal;
Step S1206, binding the projection blueprint to the controlled virtual object;
Step S1207, determining the projection position based on the position of the controlled virtual object;
Step S1208, determining the projection angle according to the positional relationship between the controlled virtual object and the pseudo light source;
Step S1209, displaying the projection decal at the projection position according to the projection angle to form the projection of the controlled virtual object;
Step S1210, when the controlled virtual object leaves the virtual illumination range of the pseudo light source, destroying the projection blueprint.
In one embodiment, static light sources are deleted after staff perform the lighting bake; the game scene may be a lighting-baked scene, and the pseudo light source may correspond to a deleted static light source. The virtual illumination range of the pseudo light source may be as shown in fig. 13. If the controlled virtual object enters that range, the projection blueprint creation method is triggered and the projection blueprint shown in fig. 14 is loaded. In the projection blueprint, one or more frames of scene images of the area where the controlled virtual object is located, captured through the virtual camera viewport, are acquired, and color information and transparency information are extracted from them to obtain the scene information. After the scene information is imported into the rendering component, one or more of blur processing, color adjustment, and transparency adjustment may be performed on the projection material through the rendering component to generate the projection decal. The projection blueprint is bound to the controlled virtual object, so that the projection position is determined from the position of the controlled virtual object and the projection angle from the positional relationship between the controlled virtual object and the pseudo light source; the projection decal is then displayed at the projection position according to the projection angle to form the projection of the controlled virtual object. When the controlled virtual object leaves the virtual illumination range of the pseudo light source, the projection blueprint is destroyed to cancel the display of the projection decal.
Exemplary embodiments of the present disclosure also provide a projection processing apparatus of a virtual object. As shown in fig. 15, the projection processing apparatus 1500 of the virtual object may include:
a position information acquiring module 1510 configured to acquire first position information of the controlled virtual object in the indoor game scene and second position information of the pseudo light source specified in the indoor game scene in response to the controlled virtual object entering the indoor game scene; the pseudo light source is a light source which is used for simulating a real light source position in an indoor game scene and does not generate an illumination effect;
a projection position and projection angle determination module 1520 configured to determine a projection position and a projection angle of a controlled virtual object in an indoor game scene based on the first position information and the second position information;
a projection generation module 1530 configured to display a projection decal in the indoor game scene at a projection position at a projection angle to simulate a projection of the controlled virtual object in the indoor game scene; the projection applique is generated by rendering according to scene information in an indoor game scene and a specified projection material.
In one embodiment, the determining the projection position and the projection angle of the controlled virtual object in the indoor game scene based on the first position information and the second position information may include:
determining a projection position of the controlled virtual object based on the first position information;
and determining the projection angle of the controlled virtual object according to the position relation between the first position information and the second position information.
In one embodiment, the apparatus may further include:
acquiring scene information;
and importing the scene information into a rendering component, rendering the projection material through the rendering component, and generating the projection applique.
In an embodiment, the acquiring the scene information may include:
acquiring one or more frames of scene images of an area where a controlled virtual object is located in a game scene captured by a viewport of a virtual camera;
and extracting color information and transparency information in one or more frames of scene images to obtain scene information.
In an embodiment, the rendering the projection material by the rendering component to generate the projection decal may include:
and carrying out one or more of fuzzy processing, color adjustment and transparency adjustment on the projection material through the rendering component to generate the projection applique.
In one embodiment, the apparatus may further include:
acquiring a first depth value of a feature point of the controlled virtual object in a light source coordinate system and a second depth value of the same feature point in an eye coordinate system; the light source coordinate system is a coordinate system constructed with the pseudo light source as the viewpoint;
generating a projection decal of the controlled virtual object based on the comparison result of the first depth value and the second depth value.
In one embodiment, the apparatus may further include:
acquiring a first depth map of the indoor game scene in a light source coordinate system and a second depth map in an eye coordinate system; the light source coordinate system is a coordinate system constructed with the pseudo light source as the viewpoint;
mapping the second depth map into the light source coordinate system to obtain a third depth map;
acquiring an intermediate shadow map according to the comparison result of the depth values of the first depth map and the third depth map;
sampling the intermediate shadow map based on the position information of the controlled virtual object in the light source coordinate system to generate a projection decal of the controlled virtual object.
In one embodiment, the projection processing apparatus of the virtual object may include a projection blueprint creation module, configured to create a projection blueprint if the controlled virtual object enters the virtual illumination range of the pseudo light source, and to display the projection decal in the projection blueprint at the projection position according to the projection angle, where the projection decal is rendered from scene information of the indoor game scene and a specified projection material;
the projection processing apparatus of the virtual object may further include a projection blueprint destruction module, configured to destroy the projection blueprint to cancel the display of the projection decal if the controlled virtual object leaves the virtual illumination range of the pseudo light source.
In one embodiment, the determining the projection position and the projection angle of the controlled virtual object in the indoor game scene based on the first position information and the second position information may include:
and binding the projection blueprint with the controlled virtual object so as to determine the projection position and the projection angle in real time based on the first position information and the second position information.
The specific details of each part in the above device have been described in detail in the method part embodiments, and thus are not described again.
Exemplary embodiments of the present disclosure also provide a computer-readable storage medium, which may be implemented in the form of a program product, including program code for causing an electronic device to perform the steps according to various exemplary embodiments of the present disclosure described in the above-mentioned "exemplary method" section of this specification, when the program product is run on the electronic device. In an alternative embodiment, the program product may be embodied as a portable compact disc read only memory (CD-ROM) and include program code, and may be run on an electronic device, such as a personal computer. However, the program product of the present disclosure is not limited thereto, and in this document, a readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
The program product may employ any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. A readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium include: an electrical connection having one or more wires, a portable disk, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
A computer readable signal medium may include a propagated data signal with readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A readable signal medium may also be any readable medium that is not a readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Program code for carrying out operations of the present disclosure may be written in any combination of one or more programming languages, including object-oriented programming languages such as Java or C++ as well as conventional procedural programming languages such as the "C" language or similar programming languages. The program code may execute entirely on the user's computing device, partly on the user's device, as a stand-alone software package, partly on the user's computing device and partly on a remote computing device, or entirely on the remote computing device or server. In the case of a remote computing device, the remote computing device may be connected to the user's computing device through any kind of network, including a local area network (LAN) or a wide area network (WAN), or may be connected to an external computing device (e.g., through the Internet using an Internet service provider).
Exemplary embodiments of the present disclosure also provide an electronic device. The electronic device may include a processor and a memory. The memory stores instructions executable by the processor, for example, program code. The processor executes the executable instructions to perform the method in the exemplary embodiments.
Referring now to FIG. 16, an electronic device in the form of a general-purpose computing device is illustrated. It should be understood that the electronic device 1600 shown in FIG. 16 is only one example and should not impose any limitation on the functionality or scope of use of embodiments of the present disclosure.
As shown in fig. 16, an electronic device 1600 may include: processor 1610, memory 1620, bus 1630, I/O (input/output) interface 1640, and network adapter 1650.
Processor 1610 may include one or more processing units, for example, a Central Processing Unit (CPU), an Application Processor (AP), a modem processor, a Display Processing Unit (DPU), a Graphics Processing Unit (GPU), an Image Signal Processor (ISP), a controller, an encoder, a decoder, a Digital Signal Processor (DSP), a baseband processor, an artificial intelligence processor, and the like. In one embodiment, in response to the controlled virtual object entering the indoor game scene, the CPU may acquire first position information of the controlled virtual object in the indoor game scene and second position information of a pseudo light source specified in the indoor game scene, determine a projection position and a projection angle of the controlled virtual object in the indoor game scene based on the first position information and the second position information, and finally display a projection applique in the indoor game scene at the projection position according to the projection angle to simulate the projection of the controlled virtual object in the indoor game scene; the projection applique is generated by rendering according to scene information in the indoor game scene and a specified projection material.
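For illustration only, the flow just described can be condensed into a minimal, self-contained C++ sketch; every type and function name below is an assumption of the sketch, not part of the disclosure, and the geometry (floor at z = 0, yaw-only angle) is a simplifying assumption:

#include <cmath>
#include <cstdio>

struct Vec3 { float x, y, z; };

// Illustrative rule: the projection position is the point on the floor
// directly beneath the controlled virtual object (its first position
// information); the floor is assumed to lie at z = 0.
Vec3 ProjectionPosition(const Vec3& character) {
    return Vec3{character.x, character.y, 0.0f};
}

// Illustrative rule: the projection angle (yaw, in radians) follows the
// horizontal direction from the pseudo light source (second position
// information) to the character, so the applique points away from the
// simulated light.
float ProjectionYaw(const Vec3& character, const Vec3& pseudoLight) {
    return std::atan2(character.y - pseudoLight.y, character.x - pseudoLight.x);
}

int main() {
    Vec3 character{3.0f, 2.0f, 0.0f};
    Vec3 pseudoLight{0.0f, 0.0f, 4.0f};
    Vec3 pos = ProjectionPosition(character);
    float yaw = ProjectionYaw(character, pseudoLight);
    std::printf("applique at (%.1f, %.1f, %.1f), yaw %.2f rad\n",
                pos.x, pos.y, pos.z, yaw);
    return 0;
}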
The memory 1620 may include a volatile memory such as a RAM 1621, a cache unit 1622, and a non-volatile memory such as a ROM 1623. Memory 1620 may also include one or more program modules 1624, such program modules 1624 including, but not limited to: an operating system, one or more application programs, other program modules, and program data, each of which, or some combination thereof, may comprise an implementation of a network environment. For example, program modules 1624 may include modules within apparatus 1500 described above.
Bus 1630 is used to enable connections between various components of electronic device 1600, and may include a data bus, an address bus, and a control bus.
Electronic device 1600 may communicate with one or more external devices 1700 (e.g., keyboard, mouse, external controller, etc.) via I/O interface 1640.
The electronic device 1600 may communicate with one or more networks through the network adapter 1650, for example, the network adapter 1650 may provide a mobile communication solution such as 3G/4G/5G or a wireless communication solution such as wireless local area network, bluetooth, near field communication, etc. Network adapter 1650 may communicate with other modules of electronic device 1600 via bus 1630.
Although not shown in fig. 16, other hardware and/or software modules may also be provided in electronic device 1600, including but not limited to: displays, microcode, device drivers, redundant processors, external disk drive arrays, RAID systems, tape drives, and data backup storage systems, among others.
It should be noted that although several modules or units of the device for action execution are mentioned in the above detailed description, such a division is not mandatory. Indeed, according to exemplary embodiments of the present disclosure, the features and functions of two or more modules or units described above may be embodied in one module or unit. Conversely, the features and functions of one module or unit described above may be further divided so as to be embodied by a plurality of modules or units.
As will be appreciated by one skilled in the art, aspects of the present disclosure may be embodied as a system, method, or program product. Accordingly, various aspects of the present disclosure may take the form of an entirely hardware embodiment, an entirely software embodiment (including firmware, microcode, etc.), or an embodiment combining hardware and software aspects, which may all generally be referred to herein as a "circuit," "module," or "system." Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This disclosure is intended to cover any variations, uses, or adaptations of the disclosure that follow its general principles and include such departures from the present disclosure as come within known or customary practice in the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with the true scope and spirit of the disclosure being indicated by the following claims.
It will be understood that the present disclosure is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the present disclosure is to be limited only by the terms of the appended claims.

Claims (12)

1. A projection processing method for a virtual object, applied to an indoor game scene, wherein the indoor game scene comprises a static virtual object, and a projection of the static virtual object is generated by a one-time calculation in a scene baking manner; the method comprises the following steps:
in response to a controlled virtual object entering the indoor game scene, acquiring first position information of the controlled virtual object in the indoor game scene and second position information of a pseudo light source specified in the indoor game scene; the pseudo light source is a light source that simulates a real light source position in the indoor game scene without generating an illumination effect;
determining a projection position and a projection angle of the controlled virtual object in the indoor game scene based on the first position information and the second position information;
displaying a projection applique in the indoor game scene at the projection position according to the projection angle so as to simulate the projection of the controlled virtual object in the indoor game scene;
and the projection applique is generated by rendering according to scene information in the indoor game scene and a specified projection material.
2. The method of claim 1, wherein determining a projection position and a projection angle of the controlled virtual object in the indoor gaming scene based on the first location information and the second location information comprises:
determining a projection position of the controlled virtual object based on the first position information;
and determining the projection angle of the controlled virtual object according to the position relation between the first position information and the second position information.
3. The method of claim 1, further comprising:
acquiring the scene information;
and importing the scene information into a rendering component, and rendering the projection material through the rendering component to generate the projection applique.
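For illustration only, the rendering step of claim 3 can be sketched in C++ with the rendering component reduced to a plain function that modulates a projection material mask by extracted scene color and transparency; all identifiers are assumptions of this sketch, not the disclosed component:

#include <cstddef>
#include <vector>

// Per-pixel applique output: the specified projection material shapes the
// transparency, the extracted scene information supplies the color. The
// composition rule is purely illustrative.
struct Rgba { float r, g, b, a; };

std::vector<Rgba> RenderApplique(const std::vector<float>& materialMask, // specified projection material
                                 const std::vector<Rgba>& sceneInfo) {   // extracted scene information
    std::vector<Rgba> applique(materialMask.size());
    for (std::size_t i = 0; i < applique.size() && i < sceneInfo.size(); ++i) {
        applique[i].r = sceneInfo[i].r;                   // scene color
        applique[i].g = sceneInfo[i].g;
        applique[i].b = sceneInfo[i].b;
        applique[i].a = sceneInfo[i].a * materialMask[i]; // scene transparency shaped by the material
    }
    return applique;
}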
4. The method of claim 3, wherein the obtaining the scene information comprises:
acquiring one or more frames of scene images, captured through a virtual camera viewport, of the area where the controlled virtual object is located in the indoor game scene;
and extracting color information and transparency information from the one or more frames of scene images to obtain the scene information.
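For illustration only, a minimal C++ sketch of the extraction in claim 4, assuming the captured frame arrives as an 8-bit RGBA buffer (a format the claim does not specify); identifiers are illustrative:

#include <cstddef>
#include <cstdint>
#include <vector>

// Illustrative scene-information record: per-pixel color plus transparency,
// split out of one captured RGBA frame.
struct SceneInfo {
    std::vector<std::uint8_t> color; // 3 bytes per pixel (R, G, B)
    std::vector<std::uint8_t> alpha; // 1 byte per pixel (transparency)
};

SceneInfo ExtractSceneInfo(const std::vector<std::uint8_t>& rgbaFrame) {
    SceneInfo info;
    info.color.reserve(rgbaFrame.size() / 4 * 3);
    info.alpha.reserve(rgbaFrame.size() / 4);
    for (std::size_t i = 0; i + 3 < rgbaFrame.size(); i += 4) {
        info.color.push_back(rgbaFrame[i]);     // R
        info.color.push_back(rgbaFrame[i + 1]); // G
        info.color.push_back(rgbaFrame[i + 2]); // B
        info.alpha.push_back(rgbaFrame[i + 3]); // A
    }
    return info;
}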
5. The method of claim 3, wherein the rendering of the projected material by the rendering component to generate the projected decal comprises:
and performing one or more of blur processing, color adjustment, and transparency adjustment on the projection material through the rendering component to generate the projection applique.
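For illustration only, two of the adjustments named in claim 5 sketched in C++ over a single-channel applique mask: a naive 3-tap horizontal box blur and a uniform transparency scale (the disclosed renderer is not specified; these are generic stand-ins):

#include <cstddef>
#include <vector>

// Naive 3-tap horizontal box blur over a single-channel applique mask row.
std::vector<float> BlurRow(const std::vector<float>& mask) {
    std::vector<float> out(mask.size(), 0.0f);
    for (std::size_t i = 0; i < mask.size(); ++i) {
        float sum = mask[i];
        int taps = 1;
        if (i > 0)               { sum += mask[i - 1]; ++taps; }
        if (i + 1 < mask.size()) { sum += mask[i + 1]; ++taps; }
        out[i] = sum / taps;
    }
    return out;
}

// Uniform transparency adjustment: scale every mask value by a factor in [0, 1].
void ScaleAlpha(std::vector<float>& mask, float factor) {
    for (float& v : mask) v *= factor;
}

Softening the mask edge and lowering its opacity in this way is what makes a decal read as an ambient indoor shadow rather than a hard-edged cutout.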
6. The method of claim 1, further comprising:
acquiring a first depth value of a feature point of the controlled virtual object in a light source coordinate system and a second depth value of the feature point of the controlled virtual object in an eye coordinate system; the light source coordinate system is a coordinate system constructed by taking the pseudo light source as a viewpoint;
generating a projection applique of the controlled virtual object according to a comparison result of the first depth value and the second depth value.
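Claim 6 reads like a per-feature-point variant of the classic shadow-map depth test; for illustration only, one possible comparison rule in C++ (the claim does not fix the rule, the bias, or the mapping between the two coordinate systems, so all of that is assumed here):

#include <vector>

struct FeaturePoint {
    float depthFromLight; // first depth value, in the pseudo-light-source coordinate system
    float depthFromEye;   // second depth value, mapped into the same space for comparison
};

// Illustrative test: a feature point contributes to the applique only when
// the depth seen from the pseudo light source does not exceed the mapped
// eye-space depth (within a small bias), i.e. nothing occludes it.
std::vector<bool> AppliqueCoverage(const std::vector<FeaturePoint>& points,
                                   float bias = 1e-3f) {
    std::vector<bool> covered;
    covered.reserve(points.size());
    for (const FeaturePoint& p : points)
        covered.push_back(p.depthFromLight <= p.depthFromEye + bias);
    return covered;
}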
7. The method of claim 1, further comprising:
acquiring a first depth map of the indoor game scene in a light source coordinate system and a second depth map of the indoor game scene in an eye coordinate system; the light source coordinate system is a coordinate system constructed by taking the pseudo light source as a viewpoint;
mapping the second depth map to the light source coordinate system to obtain a third depth map;
acquiring an intermediate shadow map according to a comparison result of depth values of the first depth map and the third depth map;
and sampling the intermediate shadow map based on the position information of the controlled virtual object in the light source coordinate system to generate a projection applique of the controlled virtual object.
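For illustration only, the depth-map route of claim 7 on a flat array, with the reprojection of the eye-space depth map into the light source coordinate system abstracted into a precomputed thirdDepth input (the claim gives no transform details); identifiers are assumptions of the sketch:

#include <cstddef>
#include <vector>

// Intermediate shadow map: 1 where the reprojected eye-space depth agrees
// with (or lies behind) the light-space depth, 0 where something nearer to
// the pseudo light source occludes the surface. The rule and bias are
// illustrative.
std::vector<float> IntermediateShadowMap(const std::vector<float>& firstDepth, // light-space depth map
                                         const std::vector<float>& thirdDepth, // eye depth mapped to light space
                                         float bias = 1e-3f) {
    std::vector<float> shadow(firstDepth.size(), 0.0f);
    for (std::size_t i = 0; i < firstDepth.size() && i < thirdDepth.size(); ++i)
        shadow[i] = (thirdDepth[i] <= firstDepth[i] + bias) ? 1.0f : 0.0f;
    return shadow;
}

// Sampling step: nearest-neighbour read at the texel covering the controlled
// object's light-space position; u and v are assumed to be in [0, 1].
float SampleShadow(const std::vector<float>& shadow, int width, int height,
                   float u, float v) {
    int x = static_cast<int>(u * (width - 1));
    int y = static_cast<int>(v * (height - 1));
    return shadow[static_cast<std::size_t>(y) * width + x];
}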
8. The method of claim 1, wherein said displaying a projected decal in said indoor gaming scene at said projection location at said projection angle comprises:
if the controlled virtual object enters the virtual irradiation range of the pseudo light source, creating a projection blueprint, and displaying the projection applique at the projection position in the projection blueprint according to the projection angle, wherein the projection applique is generated by rendering according to scene information in the indoor game scene and a specified projection material;
the method further comprises the following steps:
and if the controlled virtual object leaves the virtual irradiation range of the pseudo light source, destroying the projection blueprint to cancel the display of the projection applique.
9. The method of claim 8, wherein determining a projection position and a projection angle of the controlled virtual object in the indoor gaming scene based on the first location information and the second location information comprises:
and binding the projection blueprint with the controlled virtual object so as to determine the projection position and the projection angle in real time based on the first position information and the second position information.
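Claims 8 and 9 together describe a create/update/destroy lifecycle bound to the controlled virtual object; for illustration only, that control flow in C++, with the "projection blueprint" reduced to a plain heap object rather than an engine asset (all names and the circular irradiation range are assumptions of the sketch):

#include <cmath>
#include <memory>

struct Vec2 { float x, y; };

struct ProjectionBlueprint {
    Vec2 appliquePosition{0.0f, 0.0f};
    float appliqueYaw = 0.0f;
};

class PseudoLightRegion {
public:
    PseudoLightRegion(Vec2 lightPos, float radius) : light_(lightPos), radius_(radius) {}

    // Per-frame update bound to the controlled object's position: create the
    // blueprint on entering the virtual irradiation range, refresh position
    // and angle while inside (claim 9), destroy it on leaving (claim 8).
    void Update(Vec2 character) {
        float dx = character.x - light_.x;
        float dy = character.y - light_.y;
        bool inside = std::sqrt(dx * dx + dy * dy) <= radius_;
        if (inside && !blueprint_)
            blueprint_ = std::make_unique<ProjectionBlueprint>(); // enter: create
        if (!inside)
            blueprint_.reset(); // leave: destroy, cancelling the applique display
        if (blueprint_) {
            blueprint_->appliquePosition = character;
            blueprint_->appliqueYaw = std::atan2(dy, dx);
        }
    }

private:
    Vec2 light_;
    float radius_ = 0.0f;
    std::unique_ptr<ProjectionBlueprint> blueprint_;
};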
10. A projection processing device for a virtual object, applied to an indoor game scene, wherein the indoor game scene comprises a static virtual object, and a projection of the static virtual object is generated by a one-time calculation in a scene baking manner; the device comprises:
a position information acquisition module configured to acquire first position information of a controlled virtual object in an indoor game scene and second position information of a pseudo light source specified in the indoor game scene in response to the controlled virtual object entering the indoor game scene; the pseudo light source is a light source which is used for simulating a real light source position in the indoor game scene and does not generate an illumination effect;
a projection position and projection angle determination module configured to determine a projection position and a projection angle of the controlled virtual object in the indoor game scene based on the first position information and the second position information;
a projection generation module configured to display a projection decal in the indoor game scene at the projection position according to the projection angle to simulate projection of the controlled virtual object in the indoor game scene;
and the projection applique is generated by rendering according to scene information in the indoor game scene and a specified projection material.
11. A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the method of any one of claims 1 to 9.
12. An electronic device, comprising:
a processor;
a memory for storing executable instructions of the processor;
wherein the processor is configured to perform the method of any of claims 1 to 9 via execution of the executable instructions.
CN202211533161.8A 2022-12-01 2022-12-01 Projection processing method and device for virtual object, storage medium and electronic equipment Pending CN115970275A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211533161.8A CN115970275A (en) 2022-12-01 2022-12-01 Projection processing method and device for virtual object, storage medium and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211533161.8A CN115970275A (en) 2022-12-01 2022-12-01 Projection processing method and device for virtual object, storage medium and electronic equipment

Publications (1)

Publication Number Publication Date
CN115970275A (en) 2023-04-18

Family

ID=85967102

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211533161.8A Pending CN115970275A (en) 2022-12-01 2022-12-01 Projection processing method and device for virtual object, storage medium and electronic equipment

Country Status (1)

Country Link
CN (1) CN115970275A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116974417A (en) * 2023-07-25 2023-10-31 江苏泽景汽车电子股份有限公司 Display control method and device, electronic equipment and storage medium
CN116974417B (en) * 2023-07-25 2024-03-29 江苏泽景汽车电子股份有限公司 Display control method and device, electronic equipment and storage medium


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination