CN118045353A - Game picture display control method and device, storage medium and electronic device - Google Patents


Info

Publication number
CN118045353A
CN118045353A (application CN202410276546.3A)
Authority
CN
China
Prior art keywords
depth information
pixels
depth
game
display control
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202410276546.3A
Other languages
Chinese (zh)
Inventor
Zhao Haibo (赵海波)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Netease Hangzhou Network Co Ltd
Original Assignee
Netease Hangzhou Network Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Netease Hangzhou Network Co Ltd filed Critical Netease Hangzhou Network Co Ltd
Priority to CN202410276546.3A priority Critical patent/CN118045353A/en
Publication of CN118045353A publication Critical patent/CN118045353A/en
Pending legal-status Critical Current

Landscapes

  • Processing Or Creating Images (AREA)

Abstract

The application discloses a game picture display control method and device, a storage medium and an electronic device. The method comprises the following steps: acquiring first depth information, second depth information and a focusing parameter, wherein the first depth information is used for determining the scene depth of a game scene resource model built in a virtual game scene, the second depth information is used for determining the role depth of a virtual role model added to the virtual game scene, and the focusing parameter is used for determining the focusing position of a virtual camera in the virtual game scene; and performing display control on a two-dimensional game picture based on the first depth information, the second depth information and the focusing parameter. The application solves the technical problems in the related art of high production cost and poor picture display quality when realizing a depth-of-field effect in a two-dimensional game picture.

Description

Game picture display control method and device, storage medium and electronic device
Technical Field
The present application relates to the field of game technologies, and in particular, to a game screen display control method and apparatus, a storage medium, and an electronic apparatus.
Background
Simulating the real-world depth-of-field effect in a game makes the game picture more realistic and vivid: such an effect lets a player perceive changes of distance and depth in the game, enhancing its stereoscopic impression and fidelity. In the related art, a three-dimensional (3D) game often uses a depth camera to capture the distance between a three-dimensional object and the camera to simulate the depth-of-field effect and obtain a better sense of spatial depth. In a 2D card-type game, 2D components such as character portraits and background pictures are generally arranged in layers in three-dimensional space to represent the game scene; this approach can borrow the depth camera of a 3D game to capture depth and achieve a simple depth-of-field effect, but it can neither express a lens depth-of-field effect in a game scene with large spatial depth nor express transitions, similar to those in video works, achieved through lens focusing. Therefore, the related art merely distinguishes the foreground from the background and cannot present a fine depth-of-field effect. For a game with delicate scene objects and a realistic style, it is difficult to achieve a vivid and fine depth-of-field effect by splitting the three-dimensional scene into layers; for the scene concept-art production flow, additional manpower is required to layer the scene and build the three-dimensional scene, further increasing the production cost.
In view of the above problems, no effective solution has been proposed at present.
Disclosure of Invention
At least some embodiments of the present application provide a game screen display control method and apparatus, a storage medium, and an electronic device, so as to at least solve the technical problems in the related art of high production cost and poor picture display quality when realizing a depth-of-field effect in a two-dimensional game picture.
According to one embodiment of the present application, there is provided a game screen display control method including: acquiring first depth information, second depth information and focusing parameters, wherein the first depth information is used for determining the scene depth of a game scene resource model built in a virtual game scene, the second depth information is used for determining the role depth of a virtual role model added in the virtual game scene, and the focusing parameters are used for determining the focusing position of a virtual camera in the virtual game scene; and performing display control on the two-dimensional game picture based on the first depth information, the second depth information and the focusing parameter.
According to one embodiment of the present application, there is also provided a game screen display control apparatus including: the system comprises an acquisition module, a focusing module and a focusing module, wherein the acquisition module is used for acquiring first depth information, second depth information and focusing parameters, the first depth information is used for determining the scene depth of a game scene resource model built in a virtual game scene, the second depth information is used for determining the role depth of a virtual role model added in the virtual game scene, and the focusing parameters are used for determining the focusing position of a virtual camera in the virtual game scene; and the control module is used for performing display control on the two-dimensional game picture based on the first depth information, the second depth information and the focusing parameter.
According to one embodiment of the present application, there is also provided a computer-readable storage medium having a computer program stored therein, wherein the computer program is configured to execute the game screen display control method in any one of the above-described aspects when executed.
According to an embodiment of the present application, there is also provided an electronic device including a memory in which a computer program is stored, and a processor configured to run the computer program to execute the game screen display control method in any one of the above.
According to another aspect of embodiments of the present application, there is also provided a computer program product including a computer program which, when executed by a processor, implements the above-described game screen display control method in the respective embodiments of the present application.
According to another aspect of the embodiments of the present application, there is also provided a computer program product including a non-volatile computer-readable storage medium storing a computer program which, when executed by a processor, implements the above-described game screen display control method in the respective embodiments of the present application.
According to another aspect of the embodiments of the present application, there is also provided a computer program which, when executed by a processor, implements the above game screen display control method in the respective embodiments of the present application.
In at least some embodiments of the present application, the first depth information, the second depth information and the focusing parameter are acquired, and display control is then performed on the two-dimensional game picture based on them, achieving the purpose of rapidly performing display control on the two-dimensional game picture. This realizes the technical effects of reducing the production cost and improving the picture display quality when realizing a depth-of-field effect in a two-dimensional game picture, thereby solving the technical problems in the related art of high production cost and poor picture display quality when realizing such an effect.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this specification, illustrate embodiments of the application and together with the description serve to explain the application and do not constitute a limitation on the application. In the drawings:
Fig. 1 is a block diagram of a hardware structure of a mobile terminal of a game screen display control method according to one embodiment of the present application;
FIG. 2 is a flow chart of a game screen display control method according to one embodiment of the present application;
FIG. 3 is a schematic diagram of first depth information according to one embodiment of the present application;
FIG. 4 is a diagram of second depth information according to one embodiment of the present application;
FIG. 5 is a schematic diagram showing blur level according to one embodiment of the present application;
FIG. 6 is a schematic diagram of a game screen display control method according to one embodiment of the present application;
FIG. 7 is a schematic diagram of a further game screen display control method according to one embodiment of the present application;
FIG. 8 is a schematic diagram of a further game screen display control method according to one embodiment of the present application;
fig. 9 is a block diagram showing a structure of a game screen display control device according to one embodiment of the present application;
Fig. 10 is a schematic diagram of an electronic device according to an embodiment of the application.
Detailed Description
In order that those skilled in the art will better understand the present application, a technical solution in the embodiments of the present application will be clearly and completely described below with reference to the accompanying drawings in which it is apparent that the described embodiments are only some embodiments of the present application, not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the present application without making any inventive effort, shall fall within the scope of the present application.
It should be noted that the terms "first," "second," and the like in the description and the claims of the present application and the above figures are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged where appropriate such that the embodiments of the application described herein may be implemented in sequences other than those illustrated or otherwise described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
First, partial terms or terminology appearing in the course of describing embodiments of the application are applicable to the following explanation:
Depth of Field (DOF): the distance between the nearest and farthest elements in a scene that appear "acceptably sharp" in the image. In optics, especially in videography and photography, depth of field describes the range of distances in space within which objects can be imaged clearly. In three-dimensional games, the real-world depth-of-field effect is often simulated by screen post-processing to achieve better lens performance.
Frame buffer: used for storing rendering data processed or to be processed by the graphics card. The game saves information such as geometric rendering results and depth in the three-dimensional scene into the frame buffer, which typically comprises a number of individual textures such as a color buffer, a depth buffer and a stencil buffer. The display hardware fetches rendering results from the frame buffer to refresh the screen, but modern games often perform a post-processing operation to adjust the final output picture before the rendered image is drawn on the screen.
Screen post-processing: generally, after a screen image is obtained by rendering a complete scene, a series of operations are performed on the image to realize various screen special effects. More artistic effects such as depth of field, motion blur, etc. can be added to the game picture at the time of screen post-processing. The post-processing operation needs to obtain various data information from the frame buffer as input, for example, achieving the depth of field effect needs to obtain the current rendering result and depth from the frame buffer, and perform secondary processing on the original screen image.
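As a concrete illustration of the flow just described, a minimal screen post-processing pass can be sketched as follows. The flat buffer layout and the per-pixel callback are illustrative assumptions, not taken from the patent:

```python
def post_process(color_buf, depth_buf, effect):
    # Apply a per-pixel screen effect: the output colour of each pixel is
    # recomputed from the frame-buffer colour and depth sampled at that pixel,
    # mirroring how a post-process shader reads the colour and depth textures.
    return [effect(c, d) for c, d in zip(color_buf, depth_buf)]


# Example effect (hypothetical): darken pixels in proportion to their depth.
faded = post_process([1.0, 1.0], [0.0, 1.0], lambda c, d: c * (1.0 - d))
```

A depth-of-field effect would plug in a callback that blurs rather than darkens, using the same colour-plus-depth inputs.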
Rendering Texture (RT): a texture created at run-time and refreshed in real-time may store information such as results captured by a target camera in the texture for caching certain information during rendering. For example, saving depth information to the rendering texture to replace depth information of the depth buffer.
Shader (Shader): a computer program for changing rendering results by dynamically adjusting contents of vertices, textures, pixels, etc. of an image is generally developed for a graphics processor. In the fields of post-processing of movies, computer imaging, video games, etc., shaders are often used to make various special effects. Besides the common illumination model, the shader can also adjust the hue, saturation, brightness and contrast of the image to generate the effects of blurring, highlighting, volume light source, defocus, cartoon rendering, tone separation, distortion, concave-convex mapping, color keys, edge detection and the like.
The human eye and a camera lens are both structures similar to a convex lens. Light is refracted through the lens onto the retina or a photosensitive element to form an image; the human eye can change the shape of its lens, and a camera can adjust the spacing of its lens group, so that light refracted into the eye or camera from the object of interest forms a sharp image on the retina or photosensitive element.
The vertical plane in front of the lens on which objects form a sharp image is called the focal plane. When the focal plane falls exactly on the object being photographed, objects in front of and behind the focal plane appear blurred because they are out of focus, and the farther an object is from the focal plane, the more blurred its image.
The camera model used in computer graphics rendering is similar to pinhole imaging and does not naturally produce the depth-of-field effects characteristic of human eyes and video cameras. Therefore, to make the rendered image better match human visual experience, the depth-of-field effect is often simulated by various means.
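The optics described above can be made concrete with the standard thin-lens circle-of-confusion formula (general optics background, not part of the claimed method): the farther a point is from the focus distance, the larger the circle over which it is smeared, and hence the blurrier it appears.

```python
def coc_diameter(subject_dist, focus_dist, focal_len, aperture_diam):
    # Circle-of-confusion diameter for a thin lens; all inputs share one unit.
    # c = A * |S2 - S1| / S2 * f / (S1 - f), where A is the aperture diameter,
    # f the focal length, S1 the focus distance and S2 the subject distance.
    if subject_dist <= focal_len or focus_dist <= focal_len:
        raise ValueError("distances must exceed the focal length")
    return (aperture_diam * abs(subject_dist - focus_dist) / subject_dist
            * focal_len / (focus_dist - focal_len))
```

A subject exactly at the focus distance yields a zero-diameter circle (perfectly sharp); moving it away from the focal plane grows the circle monotonically, which is the behaviour a depth-of-field simulation reproduces.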
In the related art, a 3D game often uses a depth camera to capture the distance between a three-dimensional object and the camera to simulate the depth-of-field effect and obtain a better sense of spatial depth. In a 2D card-type game, 2D components such as character portraits and background pictures are generally arranged in layers in three-dimensional space to represent the game scene; this approach can borrow the depth camera of a 3D game to capture depth and achieve a simple depth-of-field effect, but it can neither express a lens depth-of-field effect in a game scene with large spatial depth nor express transitions, similar to those in video works, achieved through lens focusing. Therefore, the related art merely distinguishes the foreground from the background and cannot present a fine depth-of-field effect. For a game with delicate scene objects and a realistic style, it is difficult to achieve a vivid and fine depth-of-field effect by splitting the three-dimensional scene into layers; for the scene concept-art production flow, additional manpower is required to layer the scene and build the three-dimensional scene, further increasing the production cost.
The above-described method embodiments to which the present disclosure relates may be performed in a mobile terminal, a computer terminal or a similar computing device. Taking a mobile terminal as an example, the mobile terminal may be a terminal device such as a smart phone, a tablet computer, a palmtop computer, a mobile internet device or a game machine. Fig. 1 is a block diagram showing a hardware configuration of a mobile terminal of a game screen display control method according to an embodiment of the present application. As shown in fig. 1, the mobile terminal may include one or more (only one is shown in fig. 1) processors 102 (the processor 102 may include, but is not limited to, a central processing unit (CPU), a graphics processing unit (GPU), a digital signal processor (DSP), a microcontroller unit (MCU), a field-programmable gate array (FPGA), a neural network processor (NPU), a tensor processing unit (TPU), an artificial intelligence (AI) processor, etc.) and a memory 104 for storing data, and in one embodiment of the present application may further include: an input/output device 108 and a display device 110.
In some optional embodiments based on game scenes, the device may further provide a human-machine interaction interface with a touch-sensitive surface, where the human-machine interaction interface can sense finger contacts and/or gestures to interact with a graphical user interface (GUI). The human-machine interaction functions may include interactions such as creating web pages, drawing, word processing, making electronic documents, games, video conferencing, instant messaging, sending and receiving electronic mail, call interfaces, playing digital video, playing digital music and/or web browsing; executable instructions for performing the above human-machine interaction functions are configured/stored in a computer program product or a readable storage medium executable by one or more processors.
It will be appreciated by those skilled in the art that the structure shown in fig. 1 is merely illustrative and not limiting of the structure of the mobile terminal described above. For example, the mobile terminal may also include more or fewer components than shown in fig. 1, or have a different configuration than shown in fig. 1.
According to one embodiment of the present application, there is provided an embodiment of a game screen display control method, it being noted that the steps shown in the flowcharts of the drawings may be performed in a computer system such as a set of computer executable instructions, and that although a logical order is shown in the flowcharts, in some cases the steps shown or described may be performed in an order different from that herein.
Fig. 2 is a flowchart of a game screen display control method according to one embodiment of the present application, as shown in fig. 2, the method includes the steps of:
Step S21, acquiring first depth information, second depth information and focusing parameters, wherein the first depth information is used for determining the scene depth of a game scene resource model built in a virtual game scene, the second depth information is used for determining the role depth of a virtual role model added in the virtual game scene, and the focusing parameters are used for determining the focusing position of a virtual camera in the virtual game scene;
Step S22, display control is performed on the two-dimensional game picture based on the first depth information, the second depth information and the focusing parameter.
The game scene resource model refers to the various game elements built in the virtual game scene, including buildings, props and the like. The scene depth of the game scene resource model represents its position and distance relations in the virtual game scene, and is used to determine the three-dimensional position and distance of each game element so that players can perceive and interact with the scene correctly.
The virtual character model refers to a character played in the virtual game scene, which may be a player character controlled by a human player or a character controlled by a computer program. The virtual character model typically has its own appearance, characteristics, capabilities and behaviors, and players can participate in various activities and interactions in the virtual game scene by manipulating the virtual character model. The character depth of the virtual character model represents its degree of detail and complexity, and may specifically cover aspects such as the appearance, actions, behaviors, emotions and interactions of the virtual character model, as well as its relationships with the environment and other characters. In a virtual game scene, the setting of the character depth may affect the player's sense of immersion and game experience.
The focusing parameters are used for determining the focusing position of the virtual camera in the virtual game scene, and the focusing position refers to the focal position selected by the virtual camera when shooting or recording video. By adjusting the focusing parameters, the virtual camera can be aimed at a specific object or scene, so that the virtual camera can keep a clear and clear focus. In a virtual game scene, the in-focus position may affect the viewing angle and focus of the player in the game, thereby affecting the player's game experience.
Based on the steps S21 to S22, the first depth information, the second depth information and the focusing parameter are obtained, and then the two-dimensional game picture is displayed and controlled based on the first depth information, the second depth information and the focusing parameter, so that the purpose of rapidly performing display control on the two-dimensional game picture is achieved, the technical effects of reducing the manufacturing cost and improving the picture display effect when the depth effect in the two-dimensional game picture is achieved are achieved, and the technical problems of high manufacturing cost and poor picture display effect when the depth effect in the two-dimensional game picture is achieved in the related art are solved.
The game screen display control method in the embodiment of the application is further described below.
Optionally, in step S22, performing display control on the two-dimensional game screen based on the first depth information, the second depth information, and the focusing parameter includes:
Step S221, combining the first depth information and the second depth information to obtain target depth information;
step S222, determining depth values corresponding to a plurality of pixels contained in the two-dimensional game picture based on the target depth information;
in step S223, the two-dimensional game screen is controlled by using the depth values and the focusing parameters corresponding to the plurality of pixels.
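Steps S221 to S223 can be sketched end-to-end on flat per-pixel depth lists as follows. The "nearer depth wins" merging rule, the normalized 0–1 depth range and the linear blur falloff are illustrative assumptions; the embodiments leave these choices to the implementation:

```python
def display_control(scene_depth, char_depth, focus, blur_range=0.3):
    # S221: merge scene depth (first depth information) and character depth
    # (second depth information) into one target depth map; here the sample
    # nearer to the camera (smaller value) wins at each pixel.
    target = [min(s, c) for s, c in zip(scene_depth, char_depth)]
    # S222 + S223: read the per-pixel depth value and compare it with the
    # focusing parameter; blur grows linearly with distance from the focus
    # depth, clamped to 1.0 (fully blurred).
    return [min(abs(d - focus) / blur_range, 1.0) for d in target]
```

For example, a pixel whose merged depth equals the focus depth receives blur 0 (fully sharp), while a pixel one full `blur_range` away saturates at blur 1.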
The first depth information and the second depth information may be obtained through a three-dimensional auxiliary process, i.e., a scene concept-art production process, which generally includes the following links: drawing a sketch; building model assets and refining scene details; drawing maps; performing replacement and integration of scenes; rendering; and importing layers into image processing software (Photoshop) for final painting. For a large, relatively realistic scene, the three-dimensional auxiliary process can achieve accurate modeling simply and quickly, is convenient to modify and iterate, and has the advantages of higher efficiency, solid materials and intuitive results.
When the three-dimensional auxiliary process is adopted to produce the scene concept art, modeling of the scene objects is first performed in digital content creation (DCC) software to assist in drawing the scene art, and the first depth information is then saved by using the DCC software's built-in functionality.
In the scenario editing stage, a virtual character model is added to the virtual game scene, and the second depth information can be set by adjusting the depth attribute in the attribute panel. Fig. 3 is a schematic diagram of first depth information according to an embodiment of the present application; as shown in fig. 3, after the display-depth mode is turned on, the first depth information can be displayed in the form of a grayscale image. Fig. 4 is a schematic diagram of second depth information according to an embodiment of the present application; as shown in fig. 4, the grayscale image is used to check whether the character depth of the virtual character model is reasonable.
Based on the above-mentioned alternative embodiment, the first depth information and the second depth information are combined to obtain the target depth information, and further, depth values corresponding to a plurality of pixels included in the two-dimensional game picture are determined based on the target depth information, and finally, display control is performed on the two-dimensional game picture by using the depth values corresponding to the plurality of pixels and the focusing parameters, so that a more real depth feeling can be presented in the two-dimensional game picture. By processing the depth information, background blurring, depth of field effect and the like can be realized, and expressive force and immersion sense of a game picture are enhanced. Meanwhile, the focusing parameters are utilized to control the picture, so that a player can concentrate more attention, and the playability and visual experience of the game are improved.
Optionally, in step S221, merging the first depth information and the second depth information to obtain the target depth information includes: and in the game rendering process, combining and storing the first depth information and the second depth information to the first rendering texture to obtain target depth information.
Specifically, in the game rendering process, the scene depth and the role depth are combined and stored in a render texture to obtain target depth information, and the depth RT is obtained.
Based on the above-mentioned alternative embodiment, the target depth information can be obtained quickly by combining and storing the first depth information and the second depth information to the first rendering texture in the game rendering process.
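One possible per-pixel merging rule for the depth RT is sketched below. The coverage-mask scheme (the character's depth replaces the scene depth only where the character actually covers the pixel) is an assumption; the embodiment only states that the two depths are merged and stored into one render texture:

```python
def merge_to_depth_rt(scene_rt, char_rt, char_mask):
    # Build the target depth render texture row by row: where the character
    # covers a pixel (mask truthy) its depth overwrites the scene depth;
    # elsewhere the scene depth is kept unchanged.
    return [[c if m else s for s, c, m in zip(srow, crow, mrow)]
            for srow, crow, mrow in zip(scene_rt, char_rt, char_mask)]
```

In a real engine this merge would typically run in a shader writing to the depth RT rather than on CPU-side lists; the rule per pixel is the same.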
Optionally, in step S222, determining depth values corresponding to a plurality of pixels included in the two-dimensional game screen based on the target depth information includes: and sampling the target depth information stored in the first rendering texture to obtain depth values corresponding to a plurality of pixels contained in the two-dimensional game picture.
Based on the above embodiment, by sampling the target depth information stored in the first rendering texture, depth values corresponding to a plurality of pixels included in the two-dimensional game picture are obtained, and sampling and presenting of the depth information are realized, so that a more real visual effect is realized in the two-dimensional game picture, and the stereoscopic impression and the realism of the game picture are enhanced.
Optionally, in step S223, performing display control on the two-dimensional game screen using depth values and focusing parameters corresponding to the plurality of pixels includes:
Step S2231, comparing depth values corresponding to a plurality of pixels with focusing parameters in the screen post-processing process to obtain fuzzy values corresponding to the plurality of pixels, wherein the fuzzy values corresponding to the plurality of pixels are used for determining the display fuzzy degree of the plurality of pixels in the two-dimensional game picture;
In step S2232, display control is performed on the two-dimensional game screen using the blur values corresponding to the plurality of pixels.
Based on the above-mentioned alternative embodiment, in the screen post-processing process, the depth values corresponding to the plurality of pixels are compared with the focusing parameters to obtain the blur values corresponding to the plurality of pixels, and then the two-dimensional game picture is displayed and controlled by using the blur values corresponding to the plurality of pixels, so that the two-dimensional game picture can simulate the depth effect of the real world according to the depth values and the focusing parameters when being displayed, thereby enhancing the realism and the immersion of the game picture.
Optionally, in step S2231, comparing the depth values corresponding to the plurality of pixels with the focusing parameter to obtain blur values corresponding to the plurality of pixels includes: calculating distance values between depth values corresponding to the pixels and focusing parameters; blur values corresponding to the plurality of pixels are determined based on the distance values.
Specifically, fig. 5 is a schematic diagram showing the blur degree according to one embodiment of the present application. As shown in fig. 5, the blur value of the current pixel is obtained by calculating the distance between the focusing area and the depth value corresponding to the current pixel: the farther a pixel's depth value is from the focusing area, the higher its blur degree.
Based on the above-mentioned alternative embodiment, by calculating the distance values between the depth values corresponding to the plurality of pixels and the focusing parameters, and further determining the blur values corresponding to the plurality of pixels based on the distance values, intelligent blur processing of the image can be achieved, so that objects with different depths in the image can exhibit a realistic depth effect.
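The distance-to-blur mapping can be sketched as follows, consistent with the Fig. 5 behaviour (farther from the focus area, stronger blur). Modelling the focusing parameter as an in-focus depth band with a linear falloff outside it is an illustrative assumption:

```python
def blur_value(depth, focus_near, focus_far, falloff):
    # Pixels whose depth lies inside the focus band stay perfectly sharp.
    if focus_near <= depth <= focus_far:
        return 0.0
    # Otherwise blur grows linearly with the distance to the nearest edge
    # of the focus band, clamped to 1.0 (maximum blur).
    dist = focus_near - depth if depth < focus_near else depth - focus_far
    return min(dist / falloff, 1.0)
```

A pixel at depth 0.5 inside a 0.4–0.6 focus band gets blur 0; one at depth 1.0 with a 0.2 falloff saturates at full blur.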
Optionally, in step S2232, performing display control on the two-dimensional game screen using blur values corresponding to the plurality of pixels includes: sampling pixel color information stored in the second rendering texture to obtain initial color values corresponding to a plurality of pixels, wherein the pixel color information is used for determining pixel colors to be rendered in a two-dimensional game picture; calculating to obtain target color values corresponding to the pixels based on the initial color values corresponding to the pixels and the fuzzy values corresponding to the pixels; and performing display control on the two-dimensional game picture by using the target color values corresponding to the pixels.
The second rendering texture is a screen RT, which contains the rendering results in the frame buffer before post-processing. The pixel color information stored in the second rendering texture is sampled to obtain the initial color values (color) corresponding to the plurality of pixels, and the target color values corresponding to the plurality of pixels are then calculated based on the initial color values and the blur values corresponding to the plurality of pixels, so that the target color values can be used to perform display control on the two-dimensional game picture.
Based on the above-mentioned alternative embodiment, the initial color values corresponding to the plurality of pixels are obtained by sampling the pixel color information stored in the second rendering texture, the target color values corresponding to the plurality of pixels are then calculated based on the initial color values and the blur values corresponding to the plurality of pixels, and finally display control is performed on the two-dimensional game picture using the target color values. This enhances the visual effect of the game picture and improves the realism and immersion of the game; at the same time, by controlling the target color values, more personalized and refined image processing effects can be achieved.
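One plausible way to derive the target color value — a sketch under the assumption that the blur value in [0, 1] weights a per-channel blend between the sharp color sampled from the screen RT and a blurred color for the same pixel; the embodiment does not fix this exact formula — is a linear interpolation:

```python
def target_color(initial_rgb, blurred_rgb, blur_value):
    """Blend the sharp color sampled from the screen RT with a blurred
    color for the same pixel, weighted by the per-pixel blur value:
    blur 0.0 keeps the original color, blur 1.0 is fully blurred."""
    return tuple(
        sharp * (1.0 - blur_value) + soft * blur_value
        for sharp, soft in zip(initial_rgb, blurred_rgb)
    )

# A pixel on the focus plane (blur 0.0) keeps its sampled color.
in_focus = target_color((0.8, 0.4, 0.2), (0.5, 0.5, 0.5), 0.0)
# A half-blurred pixel moves halfway toward the blurred color.
halfway = target_color((0.8, 0.4, 0.2), (0.4, 0.4, 0.4), 0.5)
```
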
Optionally, calculating the target color values corresponding to the plurality of pixels based on the initial color values corresponding to the plurality of pixels and the blur values corresponding to the plurality of pixels includes: performing a step-by-step downsampling blur calculation on the initial color values corresponding to the plurality of pixels based on the blur values corresponding to the plurality of pixels to obtain the target color values corresponding to the plurality of pixels, wherein, for every two adjacent blur calculations in the step-by-step downsampling, the resolution of the previous calculation result is greater than the resolution of the next calculation result, and the two resolutions are in a preset proportion.
Specifically, a box blur algorithm, a Gaussian blur algorithm or a foreground blur algorithm may be adopted to achieve a specific blur effect. To obtain a fine blur effect, a larger number of computations is usually required; that is, after one blur calculation is performed on the original image, the result is blurred again as the input of the next calculation, and this process is repeated several times. The performance consumption of this cyclic calculation is large, so for performance reasons a step-by-step downsampling optimization can be adopted: the resolution of the result of each blur calculation is halved and then used as the input of the next blur, so that a fine blur effect can be achieved with fewer iterations.
Based on the above-mentioned alternative embodiment, the step-by-step downsampling blur calculation is performed on the initial color values corresponding to the plurality of pixels based on the blur values corresponding to the plurality of pixels to obtain the target color values corresponding to the plurality of pixels, thereby enhancing the visual effect of the game picture and improving the realism and immersion of the game.
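The halving scheme above can be sketched as follows. A grayscale image as nested lists and a 2x2 box average stand in for the engine's render-texture passes, and the preset proportion is assumed to be one half, matching the description of halving the resolution:

```python
def downsample_half(img):
    """One blur pass: average each 2x2 block, halving the resolution.
    This is a cheap box blur whose output feeds the next pass."""
    h, w = len(img) // 2 * 2, len(img[0]) // 2 * 2
    return [
        [(img[y][x] + img[y + 1][x] + img[y][x + 1] + img[y + 1][x + 1]) / 4.0
         for x in range(0, w, 2)]
        for y in range(0, h, 2)
    ]

def progressive_blur(img, passes=3):
    """Step-by-step downsampling: each pass works at half the previous
    resolution, so a fine blur is reached with far fewer iterations than
    repeated full-resolution passes."""
    for _ in range(passes):
        img = downsample_half(img)
    return img

# A checkerboard averages to a uniform 0.5 once fully blurred.
source = [[float((x + y) % 2) for x in range(8)] for y in range(8)]
result = progressive_blur(source, passes=3)  # 8x8 -> 4x4 -> 2x2 -> 1x1
```

Because each pass touches a quarter of the previous pass's pixels, the total work of n passes is bounded by a constant factor of the first pass, which is the performance benefit the paragraph describes.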
Optionally, a graphical user interface is provided through a terminal device, and the content displayed by the graphical user interface at least partially includes a scenario editor interface, where the scenario editor interface includes a two-dimensional game picture preview window and a timeline editing window. The game picture display control method further includes: in response to a touch operation acting on the timeline editing window, acquiring an adjustment mode of the focusing parameter; and determining, based on the adjustment mode, the focusing position of the two-dimensional game picture previewed in the two-dimensional game picture preview window.
Fig. 6 is a schematic diagram of a game screen display control method according to an embodiment of the present application, as shown in fig. 6, in response to a touch operation applied to a timeline editing window, an adjustment manner of focusing parameters is acquired, and a focusing position of a two-dimensional game screen previewed in a two-dimensional game screen preview window is determined as a first position based on the adjustment manner, where the first position is a face position of a foreground person in the game screen, so that the game screen located at the first position is the clearest.
Fig. 7 is a schematic diagram of a game screen display control method according to another embodiment of the present application, as shown in fig. 7, in response to a touch operation applied to a timeline editing window, an adjustment manner of focusing parameters is acquired, and a focusing position of a two-dimensional game screen previewed in a two-dimensional game screen preview window is determined as a second position based on the adjustment manner, where the second position is a face position of a person in a game screen, so that the game screen located at the second position is the clearest.
Fig. 8 is a schematic diagram of a game screen display control method according to another embodiment of the present application, as shown in fig. 8, in response to a touch operation applied to a timeline editing window, an adjustment manner of focusing parameters is acquired, and a focusing position of a two-dimensional game screen previewed in a two-dimensional game screen preview window is determined as a third position based on the adjustment manner, where the third position is a position of a background character in the game screen, so that the game screen located at the third position is the clearest.
In particular, a touch operation in the embodiment of the present application refers to an operation performed by a user touching the display screen of the terminal device with a finger to control the terminal device, and may also be an operation performed by the user through an external device connected to the terminal device, such as a mouse, a keyboard or a game controller. A touch operation may include single-point touch and multi-point touch, and the touch operation at each touch point may include clicking, long-pressing, hard-pressing, swiping, and the like.
Based on the above-mentioned alternative embodiment, accurate focusing adjustment is performed on the two-dimensional game picture through touch operations, so that the user can adjust the focusing position of the game picture more intuitively, improving the user experience. At the same time, responding to touch operations on the timeline editing window allows the user to edit and preview the game picture more conveniently, which can improve the efficiency of game development and editing and enhance the user's control over the game picture.
Optionally, the adjustment means includes at least one of: adjusting focusing parameters based on the game scenario progress; and adjusting focusing parameters based on the preset game interaction event.
Specifically, when the focusing parameter is adjusted based on the game scenario progress, as in deduction games, the focus can follow the latest clue that appears in real time; in the scenario editor, planners and editors can manually adjust the focus position according to the scenario dialogue, or the focus can automatically follow the current speaker.
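A minimal sketch of the scenario-progress adjustment mode — the `(timestamp, focus_depth)` event format and the "latest event wins" rule are illustrative assumptions, not details of the embodiment:

```python
def focus_for_progress(focus_events, current_time, default_focus=0.5):
    """Return the focusing parameter implied by the scenario progress:
    the focus depth of the most recent event at or before current_time,
    falling back to a default when no event has occurred yet."""
    focus = default_focus
    for timestamp, focus_depth in sorted(focus_events):
        if timestamp <= current_time:
            focus = focus_depth
        else:
            break
    return focus

# A clue appears at t=5 deeper in the scene; before that, focus rests
# on the foreground character.
events = [(0.0, 0.2), (5.0, 0.8)]
```

The same lookup could equally be keyed to dialogue lines or preset interaction events rather than timestamps; only the driving key changes, not the focusing logic.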
Based on the above-mentioned alternative embodiments, the adjustment efficiency can be further improved by adjusting the focusing parameter based on the progress of the game scenario or adjusting the focusing parameter based on the preset game interaction event.
According to the embodiment of the present application, the first depth information, the second depth information and the focusing parameter are acquired, and display control is performed on the two-dimensional game picture based on the first depth information, the second depth information and the focusing parameter. In this way, the depth-of-field effect produced by lens focusing in three-dimensional photography can be simulated in a 2D game, and the focus of the picture can be changed in real time by adjusting the focusing parameter, so that the game picture better matches the visual characteristics of the human eye; at the same time, the lens language of film and television can be borrowed to enrich the plot expression and the player's interactive experience. The embodiment of the present application relies on the existing scene art production flow and adds almost no labor cost in asset production, so the additional labor cost required to improve picture expression is low. The depth buffer information used to realize the depth-of-field effect in the embodiment of the present application can also be used to realize other effects that simulate three-dimensional space, such as scene scanning and fog, further enriching the expressive effect of the game picture.
From the description of the above embodiments, it will be clear to those skilled in the art that the method according to the above embodiments may be implemented by means of software plus the necessary general-purpose hardware platform, or by means of hardware, although in many cases the former is the preferred implementation. Based on such understanding, the technical solution of the present application, or the part contributing to the prior art, may be embodied in the form of a software product stored in a storage medium (e.g. ROM/RAM, magnetic disk, optical disk) comprising instructions for causing a terminal device (which may be a mobile phone, a computer, a server, a network device, etc.) to perform the methods according to the embodiments of the present application.
In this embodiment, a game screen display control device is further provided, and the device is used to implement the foregoing embodiments and preferred embodiments, and will not be described in detail. As used below, the term "module" may be a combination of software and/or hardware that implements a predetermined function. While the means described in the following embodiments are preferably implemented in software, implementation in hardware, or a combination of software and hardware, is also possible and contemplated.
Fig. 9 is a block diagram showing a game screen display control apparatus according to an embodiment of the present application, as shown in fig. 9, comprising:
The acquisition module 901 is configured to acquire first depth information, second depth information and focusing parameters, where the first depth information is used to determine a scene depth of a game scene resource model built in a virtual game scene, the second depth information is used to determine a role depth of a virtual role model added in the virtual game scene, and the focusing parameters are used to determine a focusing position of a virtual camera in the virtual game scene;
The control module 902 is configured to perform display control on the two-dimensional game screen based on the first depth information, the second depth information, and the focusing parameter.
Optionally, the control module 902 is further configured to: combine the first depth information and the second depth information to obtain target depth information; determine, based on the target depth information, depth values corresponding to a plurality of pixels contained in the two-dimensional game picture; and perform display control on the two-dimensional game picture using the depth values corresponding to the plurality of pixels and the focusing parameter.
Optionally, the control module 902 is further configured to: and in the game rendering process, combining and storing the first depth information and the second depth information to the first rendering texture to obtain target depth information.
Optionally, the control module 902 is further configured to: and sampling the target depth information stored in the first rendering texture to obtain depth values corresponding to a plurality of pixels contained in the two-dimensional game picture.
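A toy sketch of the merge-and-sample steps above, representing the first rendering texture as a plain 2D list. The nearest-depth merge rule (keep the smaller value, as a depth test would) and the texture format are assumptions for illustration; the embodiment does not specify them:

```python
def merge_depth(scene_depth, character_depth):
    """Combine the scene depth (first depth information) and the
    character depth (second depth information) into one target depth
    texture by keeping the nearer (smaller) depth value per pixel."""
    return [
        [min(s, c) for s, c in zip(scene_row, char_row)]
        for scene_row, char_row in zip(scene_depth, character_depth)
    ]

def sample_depth(depth_texture, x, y):
    """Nearest sampling of the merged texture: the depth value used for
    the pixel at (x, y) during screen post-processing."""
    return depth_texture[y][x]

scene = [[0.9, 0.9], [0.7, 0.8]]       # background geometry
character = [[1.0, 0.4], [1.0, 1.0]]   # character covers one pixel
target = merge_depth(scene, character)  # the "target depth information"
```

In the engine, the merge would happen when both models are rendered into the first rendering texture during the game rendering process, and the sampling in the post-processing shader.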
Optionally, the control module 902 is further configured to: in the screen post-processing process, compare the depth values corresponding to the plurality of pixels with the focusing parameter to obtain blur values corresponding to the plurality of pixels, wherein the blur values corresponding to the plurality of pixels are used for determining the display blur degree of the plurality of pixels in the two-dimensional game picture; and perform display control on the two-dimensional game picture using the blur values corresponding to the plurality of pixels.
Optionally, the control module 902 is further configured to: calculating distance values between depth values corresponding to the pixels and focusing parameters; blur values corresponding to the plurality of pixels are determined based on the distance values.
Optionally, the control module 902 is further configured to: sample pixel color information stored in the second rendering texture to obtain initial color values corresponding to the plurality of pixels, wherein the pixel color information is used for determining the pixel colors to be rendered in the two-dimensional game picture; calculate target color values corresponding to the plurality of pixels based on the initial color values corresponding to the plurality of pixels and the blur values corresponding to the plurality of pixels; and perform display control on the two-dimensional game picture using the target color values corresponding to the plurality of pixels.
Optionally, the control module 902 is further configured to: perform a step-by-step downsampling blur calculation on the initial color values corresponding to the plurality of pixels based on the blur values corresponding to the plurality of pixels to obtain the target color values corresponding to the plurality of pixels, wherein, for every two adjacent blur calculations in the step-by-step downsampling, the resolution of the previous calculation result is greater than the resolution of the next calculation result, and the two resolutions are in a preset proportion.
Optionally, the acquiring module 901 is further configured to respond to a touch operation applied to the timeline editing window, and acquire an adjustment manner of the focusing parameter; the game screen display control device further includes: a determining module 903, configured to determine a focus position of the two-dimensional game screen previewed in the two-dimensional game screen preview window based on the adjustment mode.
Optionally, the adjustment means includes at least one of: adjusting focusing parameters based on the game scenario progress; and adjusting focusing parameters based on the preset game interaction event.
It should be noted that each of the above modules may be implemented by software or hardware, and for the latter, it may be implemented by, but not limited to: the modules are all located in the same processor; or the above modules may be located in different processors in any combination.
Embodiments of the present application also provide a computer readable storage medium having a computer program stored therein, wherein the computer program is arranged to perform the steps of any of the method embodiments described above when run.
Alternatively, in the present embodiment, the above-described computer-readable storage medium may include, but is not limited to: a USB flash disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a removable hard disk, a magnetic disk, an optical disk, or other media capable of storing a computer program.
Alternatively, in this embodiment, the above-mentioned computer-readable storage medium may be located in any one of the computer terminals in the computer terminal group in the computer network, or in any one of the mobile terminals in the mobile terminal group.
Alternatively, in the present embodiment, the above-described computer-readable storage medium may be configured to store a computer program for performing the steps of:
s1, acquiring first depth information, second depth information and focusing parameters, wherein the first depth information is used for determining scene depth of a game scene resource model built in a virtual game scene, the second depth information is used for determining role depth of a virtual role model added in the virtual game scene, and the focusing parameters are used for determining focusing positions of virtual cameras in the virtual game scene;
And S2, performing display control on the two-dimensional game picture based on the first depth information, the second depth information and the focusing parameter.
Optionally, the above computer readable storage medium is further configured to store program code for performing the steps of: combining the first depth information and the second depth information to obtain target depth information; determining, based on the target depth information, depth values corresponding to a plurality of pixels contained in the two-dimensional game picture; and performing display control on the two-dimensional game picture using the depth values corresponding to the plurality of pixels and the focusing parameter.
Optionally, the above computer readable storage medium is further configured to store program code for performing the steps of: and in the game rendering process, combining and storing the first depth information and the second depth information to the first rendering texture to obtain target depth information.
Optionally, the above computer readable storage medium is further configured to store program code for performing the steps of: and sampling the target depth information stored in the first rendering texture to obtain depth values corresponding to a plurality of pixels contained in the two-dimensional game picture.
Optionally, the above computer readable storage medium is further configured to store program code for performing the steps of: in the screen post-processing process, comparing the depth values corresponding to the plurality of pixels with the focusing parameter to obtain blur values corresponding to the plurality of pixels, wherein the blur values corresponding to the plurality of pixels are used for determining the display blur degree of the plurality of pixels in the two-dimensional game picture; and performing display control on the two-dimensional game picture using the blur values corresponding to the plurality of pixels.
Optionally, the above computer readable storage medium is further configured to store program code for performing the steps of: calculating distance values between depth values corresponding to the pixels and focusing parameters; blur values corresponding to the plurality of pixels are determined based on the distance values.
Optionally, the above computer readable storage medium is further configured to store program code for performing the steps of: sampling pixel color information stored in the second rendering texture to obtain initial color values corresponding to the plurality of pixels, wherein the pixel color information is used for determining the pixel colors to be rendered in the two-dimensional game picture; calculating target color values corresponding to the plurality of pixels based on the initial color values corresponding to the plurality of pixels and the blur values corresponding to the plurality of pixels; and performing display control on the two-dimensional game picture using the target color values corresponding to the plurality of pixels.
Optionally, the above computer readable storage medium is further configured to store program code for performing the steps of: performing a step-by-step downsampling blur calculation on the initial color values corresponding to the plurality of pixels based on the blur values corresponding to the plurality of pixels to obtain the target color values corresponding to the plurality of pixels, wherein, for every two adjacent blur calculations in the step-by-step downsampling, the resolution of the previous calculation result is greater than the resolution of the next calculation result, and the two resolutions are in a preset proportion.
Optionally, the above computer readable storage medium is further configured to store program code for performing the steps of: responding to touch operation acted on the time line editing window, and acquiring an adjustment mode of focusing parameters; and determining the focusing position of the two-dimensional game picture previewed in the two-dimensional game picture preview window based on the adjustment mode.
Optionally, the adjustment means includes at least one of: adjusting focusing parameters based on the game scenario progress; and adjusting focusing parameters based on the preset game interaction event.
In the computer readable storage medium of this embodiment, the first depth information, the second depth information and the focusing parameter are acquired, and display control is performed on the two-dimensional game picture based on the first depth information, the second depth information and the focusing parameter. This achieves the purpose of rapidly performing display control on the two-dimensional game picture, attains the technical effects of reducing the production cost and improving the display effect when realizing the depth-of-field effect in the two-dimensional game picture, and thereby solves the technical problems in the related art of high production cost and poor display effect when realizing the depth-of-field effect in the two-dimensional game picture.
From the above description of embodiments, those skilled in the art will readily appreciate that the example embodiments described herein may be implemented in software, or may be implemented in software in combination with the necessary hardware. Thus, the technical solution according to the embodiments of the present application may be embodied in the form of a software product, which may be stored in a computer readable storage medium (may be a CD-ROM, a U-disk, a mobile hard disk, etc.) or on a network, and includes several instructions to cause a computing device (may be a personal computer, a server, a terminal device, or a network device, etc.) to perform the method according to the embodiments of the present application.
In an exemplary embodiment of the present application, a computer-readable storage medium stores thereon a program product capable of implementing the method described above in this embodiment. In some possible implementations, the various aspects of the embodiments of the application may also be implemented in the form of a program product comprising program code for causing a terminal device to carry out the steps according to the various exemplary embodiments of the application as described in the "exemplary methods" section of this embodiment, when the program product is run on the terminal device.
A program product for implementing the above-described method according to an embodiment of the present application may employ a portable compact disc read-only memory (CD-ROM) and include program code, and may be run on a terminal device, such as a personal computer. However, the program product of the embodiments of the present application is not limited thereto, and in the embodiments of the present application, the computer-readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
Any combination of one or more computer readable media may be employed by the program product described above. The computer readable storage medium can be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a combination of any of the foregoing. More specific examples (a non-exhaustive list) of the computer-readable storage medium would include the following: an electrical connection having one or more wires, a portable disk, a hard disk, random Access Memory (RAM), read-only memory (ROM), erasable programmable read-only memory (EPROM or flash memory), optical fiber, portable compact disk read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
It should be noted that the program code embodied on the computer readable storage medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
An embodiment of the application also provides an electronic device comprising a memory having stored therein a computer program and a processor arranged to run the computer program to perform the steps of any of the method embodiments described above.
Optionally, the electronic apparatus may further include a transmission device and an input/output device, where the transmission device is connected to the processor, and the input/output device is connected to the processor.
Alternatively, in the present embodiment, the above-described processor may be configured to execute the following steps by a computer program:
s1, acquiring first depth information, second depth information and focusing parameters, wherein the first depth information is used for determining scene depth of a game scene resource model built in a virtual game scene, the second depth information is used for determining role depth of a virtual role model added in the virtual game scene, and the focusing parameters are used for determining focusing positions of virtual cameras in the virtual game scene;
And S2, performing display control on the two-dimensional game picture based on the first depth information, the second depth information and the focusing parameter.
Optionally, the above processor may be further configured to perform the following steps by a computer program: combining the first depth information and the second depth information to obtain target depth information; determining, based on the target depth information, depth values corresponding to a plurality of pixels contained in the two-dimensional game picture; and performing display control on the two-dimensional game picture using the depth values corresponding to the plurality of pixels and the focusing parameter.
Optionally, the above processor may be further configured to perform the following steps by a computer program: and in the game rendering process, combining and storing the first depth information and the second depth information to the first rendering texture to obtain target depth information.
Optionally, the above processor may be further configured to perform the following steps by a computer program: and sampling the target depth information stored in the first rendering texture to obtain depth values corresponding to a plurality of pixels contained in the two-dimensional game picture.
Optionally, the above processor may be further configured to perform the following steps by a computer program: in the screen post-processing process, comparing the depth values corresponding to the plurality of pixels with the focusing parameter to obtain blur values corresponding to the plurality of pixels, wherein the blur values corresponding to the plurality of pixels are used for determining the display blur degree of the plurality of pixels in the two-dimensional game picture; and performing display control on the two-dimensional game picture using the blur values corresponding to the plurality of pixels.
Optionally, the above processor may be further configured to perform the following steps by a computer program: calculating distance values between depth values corresponding to the pixels and focusing parameters; blur values corresponding to the plurality of pixels are determined based on the distance values.
Optionally, the above processor may be further configured to perform the following steps by a computer program: sampling pixel color information stored in the second rendering texture to obtain initial color values corresponding to the plurality of pixels, wherein the pixel color information is used for determining the pixel colors to be rendered in the two-dimensional game picture; calculating target color values corresponding to the plurality of pixels based on the initial color values corresponding to the plurality of pixels and the blur values corresponding to the plurality of pixels; and performing display control on the two-dimensional game picture using the target color values corresponding to the plurality of pixels.
Optionally, the above processor may be further configured to perform the following steps by a computer program: performing a step-by-step downsampling blur calculation on the initial color values corresponding to the plurality of pixels based on the blur values corresponding to the plurality of pixels to obtain the target color values corresponding to the plurality of pixels, wherein, for every two adjacent blur calculations in the step-by-step downsampling, the resolution of the previous calculation result is greater than the resolution of the next calculation result, and the two resolutions are in a preset proportion.
Optionally, the above processor may be further configured to perform the following steps by a computer program: responding to touch operation acted on the time line editing window, and acquiring an adjustment mode of focusing parameters; and determining the focusing position of the two-dimensional game picture previewed in the two-dimensional game picture preview window based on the adjustment mode.
Optionally, the adjustment means includes at least one of: adjusting focusing parameters based on the game scenario progress; and adjusting focusing parameters based on the preset game interaction event.
In the electronic device of this embodiment, the first depth information, the second depth information and the focusing parameter are acquired, and display control is performed on the two-dimensional game picture based on the first depth information, the second depth information and the focusing parameter. This achieves the purpose of rapidly performing display control on the two-dimensional game picture, attains the technical effects of reducing the production cost and improving the display effect when realizing the depth-of-field effect in the two-dimensional game picture, and thereby solves the technical problems in the related art of high production cost and poor display effect when realizing the depth-of-field effect in the two-dimensional game picture.
Fig. 10 is a schematic diagram of an electronic device according to an embodiment of the application. As shown in fig. 10, the electronic device 1000 is merely an example, and should not be construed as limiting the functionality and scope of use of the embodiments of the present application.
As shown in fig. 10, the electronic apparatus 1000 is embodied in the form of a general purpose computing device. Components of electronic device 1000 may include, but are not limited to: the at least one processor 1010, the at least one memory 1020, a bus 1030 connecting the various system components including the memory 1020 and the processor 1010, and a display 1040.
Wherein the memory 1020 stores program code that can be executed by the processor 1010 to cause the processor 1010 to perform the steps according to various exemplary embodiments of the present application described in the above method section of the embodiment of the present application.
Memory 1020 may include readable media in the form of volatile memory units such as Random Access Memory (RAM) 10201 and/or cache memory 10202, and may further include Read Only Memory (ROM) 10203, and may also include non-volatile memory such as one or more magnetic storage devices, flash memory, or other non-volatile solid state memory.
In some examples, memory 1020 may also include a program/utility 10204 having a set (at least one) of program modules 10205, such program modules 10205 including, but not limited to: an operating system, one or more application programs, other program modules, and program data, each or some combination of which may include an implementation of a network environment. Memory 1020 may further include memory located remotely from processor 1010, which may be connected to electronic device 1000 via a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
Bus 1030 may represent one or more of several types of bus structures, including a memory unit bus or memory unit controller, a peripheral bus, an accelerated graphics port, a processor bus, or a local bus using any of a variety of bus architectures.
Display 1040 may be, for example, a touch-screen type Liquid Crystal Display (LCD) that may enable a user to interact with a user interface of electronic device 1000.
Optionally, the electronic apparatus 1000 may also be in communication with one or more external devices 1100 (e.g., keyboard, pointing device, bluetooth device, etc.), with one or more devices that enable a user to interact with the electronic apparatus 1000, and/or with any device (e.g., router, modem, etc.) that enables the electronic apparatus 1000 to communicate with one or more other computing devices. Such communication may occur through an input/output (I/O) interface 1050. Also, electronic device 1000 can communicate with one or more networks such as a Local Area Network (LAN), a Wide Area Network (WAN), and/or a public network, such as the Internet, through network adapter 1060. As shown in fig. 10, the network adapter 1060 communicates with other modules of the electronic device 1000 over a bus 1030. It should be appreciated that although not shown in fig. 10, other hardware and/or software modules may be used in connection with the electronic device 1000, which may include, but are not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, data backup storage systems, and the like.
The electronic device 1000 may further include: a keyboard, a cursor control device (e.g., a mouse), an input/output interface (I/O interface), a network interface, a power supply, and/or a camera.
It will be appreciated by those of ordinary skill in the art that the configuration shown in fig. 10 is merely illustrative and is not intended to limit the configuration of the electronic device described above. For example, the electronic device 1000 may also include more or fewer components than shown in fig. 10, or have a different configuration than that shown in fig. 10. The memory 1020 may be used for storing a computer program and corresponding data, such as the computer program and corresponding data for the game screen display control method in an embodiment of the present application. The processor 1010 executes the computer program stored in the memory 1020 to perform various functional applications and data processing, that is, to realize the above-described game screen display control method.
Embodiments of the present application also provide a computer program product. Optionally, in this embodiment, the computer program product may comprise a computer program which, when executed by a processor, implements the method provided by the above embodiments.
Optionally, the computer program product comprises a computer program for execution by a processor to:
S1, acquiring first depth information, second depth information and focusing parameters, wherein the first depth information is used for determining the scene depth of a game scene resource model built in a virtual game scene, the second depth information is used for determining the role depth of a virtual role model added in the virtual game scene, and the focusing parameters are used for determining the focusing position of a virtual camera in the virtual game scene;
S2, performing display control on the two-dimensional game picture based on the first depth information, the second depth information and the focusing parameter.
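Purely as a minimal illustrative sketch of steps S1 and S2 (the function name, the min-based depth merge, and the linear distance-to-blur falloff are assumptions for illustration, not the disclosed implementation), the per-pixel logic might be outlined as:

```python
def display_control(scene_depth, character_depth, focus, max_blur=1.0):
    # S1: merge the scene depth and character depth per pixel; the nearer
    # surface wins, mirroring a depth-test merge into one render texture.
    target_depth = [min(s, c) for s, c in zip(scene_depth, character_depth)]
    # S2: the farther a pixel's depth is from the focusing position,
    # the larger the blur amount used during display control.
    blur = [min(abs(d - focus), max_blur) for d in target_depth]
    return target_depth, blur
```

Here each input is a flat list of per-pixel depth values; a real implementation would operate on 2D render textures on the GPU.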
Embodiments of the present application also provide a computer program product. Optionally, the computer program product may comprise a non-volatile computer-readable storage medium, which may be used for storing a computer program which, when executed by a processor, implements the method provided by the above embodiments.
Optionally, the computer program stored in the above non-volatile computer readable storage medium is executed by the processor to:
S1, acquiring first depth information, second depth information and focusing parameters, wherein the first depth information is used for determining the scene depth of a game scene resource model built in a virtual game scene, the second depth information is used for determining the role depth of a virtual role model added in the virtual game scene, and the focusing parameters are used for determining the focusing position of a virtual camera in the virtual game scene;
S2, performing display control on the two-dimensional game picture based on the first depth information, the second depth information and the focusing parameter.
Embodiments of the present application also provide a computer program. Optionally, in this embodiment, the above-mentioned computer program, when executed by a processor, implements the method provided in the above-mentioned embodiment.
Optionally, the computer program is executed by the processor to:
S1, acquiring first depth information, second depth information and focusing parameters, wherein the first depth information is used for determining the scene depth of a game scene resource model built in a virtual game scene, the second depth information is used for determining the role depth of a virtual role model added in the virtual game scene, and the focusing parameters are used for determining the focusing position of a virtual camera in the virtual game scene;
S2, performing display control on the two-dimensional game picture based on the first depth information, the second depth information and the focusing parameter.
The foregoing embodiment numbers of the present application are merely for the purpose of description, and do not represent the advantages or disadvantages of the embodiments.
In the foregoing embodiments of the present application, the descriptions of the embodiments are emphasized, and for a portion of this disclosure that is not described in detail in this embodiment, reference is made to the related descriptions of other embodiments.
In the several embodiments provided in the present application, it should be understood that the disclosed technology may be implemented in other manners. The above-described apparatus embodiments are merely exemplary; for example, the division of the units is merely a logical function division, and there may be other division manners in actual implementation. For example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted or not performed. In addition, the coupling, direct coupling, or communication connection shown or discussed between the parts may be through some interfaces, units, or modules, and may be in electrical or other forms.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in the embodiments of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
The integrated units, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a computer-readable storage medium. Based on such understanding, the technical solution of the present application, in essence, or the part contributing to the related art, or all or part of the technical solution, may be embodied in the form of a software product stored in a storage medium, including several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to perform all or part of the steps of the methods according to the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a removable hard disk, a magnetic disk, or an optical disk.
The foregoing is merely a preferred embodiment of the present application and it should be noted that modifications and adaptations to those skilled in the art may be made without departing from the principles of the present application, which are intended to be comprehended within the scope of the present application.

Claims (14)

1. A game screen display control method, characterized by comprising:
Acquiring first depth information, second depth information and focusing parameters, wherein the first depth information is used for determining scene depth of a game scene resource model built in a virtual game scene, the second depth information is used for determining role depth of a virtual role model added in the virtual game scene, and the focusing parameters are used for determining focusing positions of virtual cameras in the virtual game scene;
and performing display control on the two-dimensional game picture based on the first depth information, the second depth information and the focusing parameter.
2. The game screen display control method according to claim 1, wherein performing display control on the two-dimensional game screen based on the first depth information, the second depth information, and the focusing parameter comprises:
Combining the first depth information and the second depth information to obtain target depth information;
Determining depth values corresponding to a plurality of pixels contained in the two-dimensional game picture based on the target depth information;
and performing display control on the two-dimensional game picture by utilizing the depth values corresponding to the pixels and the focusing parameters.
3. The game screen display control method according to claim 2, wherein combining the first depth information and the second depth information to obtain the target depth information includes:
and in the game rendering process, combining and storing the first depth information and the second depth information to a first rendering texture to obtain the target depth information.
4. A game screen display control method according to claim 3, wherein determining depth values corresponding to the plurality of pixels included in the two-dimensional game screen based on the target depth information comprises:
sampling the target depth information stored in the first rendering texture to obtain the depth values corresponding to the plurality of pixels contained in the two-dimensional game picture.
5. The game screen display control method according to claim 2, wherein performing display control on the two-dimensional game screen using the depth values corresponding to the plurality of pixels and the focusing parameter comprises:
In the screen post-processing process, comparing the depth values corresponding to the plurality of pixels with the focusing parameter to obtain blur values corresponding to the plurality of pixels, wherein the blur values corresponding to the plurality of pixels are used for determining the display blur degree of the plurality of pixels in the two-dimensional game picture;
and performing display control on the two-dimensional game picture by using the blur values corresponding to the plurality of pixels.
6. The game screen display control method according to claim 5, wherein comparing the depth values corresponding to the plurality of pixels with the focusing parameter to obtain the blur values corresponding to the plurality of pixels comprises:
calculating distance values between depth values corresponding to the pixels and the focusing parameters;
and determining the blur values corresponding to the plurality of pixels based on the distance values.
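Purely as an illustration of claim 6, not part of the claims (the in-focus range, the linear mapping, and the clamp to a maximum blur are assumptions), the distance-to-blur conversion could look like:

```python
def blur_from_distance(depth_values, focus, focus_range=0.1, max_blur=1.0):
    blurs = []
    for d in depth_values:
        dist = abs(d - focus)                  # distance value to the focus depth
        excess = max(dist - focus_range, 0.0)  # pixels within the focus range stay sharp
        blurs.append(min(excess, max_blur))    # clamp to the maximum blur value
    return blurs
```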
7. The game screen display control method according to claim 5, wherein performing display control on the two-dimensional game screen using blur values corresponding to the plurality of pixels comprises:
Sampling pixel color information stored in a second rendering texture to obtain initial color values corresponding to the pixels, wherein the pixel color information is used for determining pixel colors to be rendered in the two-dimensional game picture;
calculating target color values corresponding to the plurality of pixels based on the initial color values corresponding to the plurality of pixels and the blur values corresponding to the plurality of pixels;
and performing display control on the two-dimensional game picture by utilizing the target color values corresponding to the pixels.
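As a hypothetical sketch of the color computation in claim 7, not part of the claims (the pre-blurred color input and the per-channel linear blend are assumptions; the claim only states that target colors are computed from initial colors and blur values):

```python
def target_colors(initial_colors, blurred_colors, blur_values):
    out = []
    for sharp, soft, b in zip(initial_colors, blurred_colors, blur_values):
        # Linear blend per RGB channel: a blur value of 0 keeps the sharp
        # sampled color, while 1 uses the fully blurred color.
        out.append(tuple(s * (1.0 - b) + p * b for s, p in zip(sharp, soft)))
    return out
```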
8. The game screen display control method according to claim 7, wherein calculating the target color values corresponding to the plurality of pixels based on the initial color values corresponding to the plurality of pixels and the blur values corresponding to the plurality of pixels comprises:
performing progressive down-sampling blur calculation on the initial color values corresponding to the plurality of pixels based on the blur values corresponding to the plurality of pixels to obtain the target color values corresponding to the plurality of pixels, wherein in the progressive down-sampling, the resolution of the preceding calculation result of every two adjacent blur calculations is greater than the resolution of the succeeding calculation result, and the resolution of the succeeding calculation result is in a preset ratio to the resolution of the preceding calculation result.
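The progressive down-sampling of claim 8 can be illustrated on a 1-D list of samples, as a stand-in for a 2D image (not part of the claims; the averaging kernel and the 2:1 preset ratio are assumptions):

```python
def progressive_downsample(values, levels=2, ratio=2):
    # Each pass averages `ratio` adjacent samples, so every preceding
    # calculation result has a higher resolution than the succeeding one,
    # and adjacent results stay in the preset resolution ratio (here 2:1).
    results = [list(values)]
    for _ in range(levels):
        v = results[-1]
        v = v[:len(v) - len(v) % ratio]  # drop samples that do not fill a full group
        results.append([sum(v[i:i + ratio]) / ratio
                        for i in range(0, len(v), ratio)])
    return results
```

In a real pipeline each level would be a lower-resolution render texture, and the per-pixel blur value would select how far down the chain to sample.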
9. The game screen display control method according to claim 1, wherein a graphical user interface is provided through a terminal device, the content displayed by the graphical user interface at least partially comprising a scenario editor interface, the scenario editor interface comprising a two-dimensional game picture preview window and a timeline editing window, and the game screen display control method further comprises:
in response to a touch operation acting on the timeline editing window, acquiring an adjustment mode of the focusing parameter;
and determining the focusing position of the two-dimensional game picture previewed in the two-dimensional game picture preview window based on the adjustment mode.
10. The game screen display control method according to claim 9, wherein the adjustment mode includes at least one of:
Adjusting the focusing parameters based on game scenario progress;
and adjusting the focusing parameter based on a preset game interaction event.
11. A game screen display control device, comprising:
the device comprises an acquisition module, a focusing module and a focusing module, wherein the acquisition module is used for acquiring first depth information, second depth information and focusing parameters, the first depth information is used for determining the scene depth of a game scene resource model built in a virtual game scene, the second depth information is used for determining the role depth of a virtual role model added in the virtual game scene, and the focusing parameters are used for determining the focusing position of a virtual camera in the virtual game scene;
And the control module is used for performing display control on the two-dimensional game picture based on the first depth information, the second depth information and the focusing parameter.
12. A computer-readable storage medium, wherein a computer program is stored in the computer-readable storage medium, wherein the computer program is configured to perform the game screen display control method according to any one of claims 1 to 10 when executed by a processor.
13. An electronic device comprising a memory and a processor, wherein the memory has stored therein a computer program, the processor being arranged to run the computer program to perform the game screen display control method of any one of claims 1 to 10.
14. A computer program product comprising a computer program which, when executed by a processor, implements the game screen display control method as claimed in any one of claims 1 to 10.
CN202410276546.3A 2024-03-11 2024-03-11 Game picture display control method and device, storage medium and electronic device Pending CN118045353A (en)

Publications (1)

Publication Number Publication Date
CN118045353A true CN118045353A (en) 2024-05-17


Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination