CN113577770A - Game rendering method - Google Patents

Game rendering method

Info

Publication number
CN113577770A
CN113577770A (application CN202110839968.3A)
Authority
CN
China
Prior art keywords
interface
scene
background image
main scene
main
Prior art date
Legal status
Pending
Application number
CN202110839968.3A
Other languages
Chinese (zh)
Inventor
陈炯栩
Current Assignee
Guangzhou Yuanyou Information Technology Co ltd
Original Assignee
Guangzhou Yuanyou Information Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Guangzhou Yuanyou Information Technology Co., Ltd.

Classifications

    • A — HUMAN NECESSITIES
    • A63 — SPORTS; GAMES; AMUSEMENTS
    • A63F — CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 — Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50 — Controlling the output signals based on the game progress
    • A63F13/52 — Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • A63F13/55 — Controlling game characters or game objects based on the game progress
    • A63F13/56 — Computing the motion of game characters with respect to other game characters, game objects or elements of the game scene, e.g. for simulating the behaviour of a group of virtual soldiers or for path finding
    • A63F2300/00 — Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/30 — Features of such games characterized by output arrangements for receiving control signals generated by the game device
    • A63F2300/308 — Details of the user interface

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention provides a game rendering method comprising the following steps: in a game main scene interface, in response to a triggering operation on a sub-operation interface, acquiring the scene elements in the main scene interface; obtaining a relatively fixed background image from those scene elements; and displaying the background image on the main scene interface while displaying the sub-operation interface in front of the main scene interface. With this game rendering method, when the user opens the sub-operation interface the main scene interface shows a background image instead of being rendered in real time from the server's live game data. This reduces the phone's rendering workload for the main scene interface, reduces stuttering and frame drops, and improves how smoothly the game runs on the phone.

Description

Game rendering method
Technical Field
The invention relates to the technical field of game picture rendering, and in particular to a game rendering method.
Background
During mobile-phone gameplay, rendering the game's combat scene (the main scene) often consumes a large share of GPU resources. When a sub-operation interface is open, it partially occludes the main scene interface, and at that moment the player no longer pays attention to how the main scene is rendered; nevertheless, continuously rendering the main scene interface from real-time data places a heavy workload on the phone, which easily leads to severe heating, stuttering, and similar problems.
Disclosure of Invention
The invention aims to overcome the above defects and shortcomings of the prior art by providing a game rendering method that reduces the phone's rendering workload for the main scene interface while the user has a sub-operation interface open and is operating it.
One embodiment of the present invention provides a game rendering method, including the steps of:
in a game main scene interface, in response to a triggering operation on a sub-operation interface, acquiring scene elements in the main scene interface;
obtaining a relatively fixed background image from the scene elements; and
displaying the background image on the main scene interface while displaying the sub-operation interface in front of the main scene interface.
Compared with the prior art, in the game rendering method provided by the invention, when the user opens the sub-operation interface the scene elements in the main scene interface are captured and a relatively fixed background image is produced from them. Displaying this background image on the main scene interface removes the need to render the main scene in real time, which reduces the phone's rendering workload for the main scene interface, reduces stuttering and frame drops, and improves how smoothly the game runs on the phone.
Further, while the background image is displayed on the main scene interface, real-time dynamic rendering of the scene elements of the main scene interface is switched off, further reducing the phone's rendering workload for the main scene interface.
Further, the scene elements within the main scene interface include the scene elements corresponding to the current time in the main scene interface;
the step of obtaining a relatively fixed background image from the scene elements includes:
rendering the scene elements into a preset canvas to obtain a still picture corresponding to the scene elements; and
determining the still picture as the background image.
In this embodiment, a still picture is generated from the scene elements captured at the current time, and displaying that still picture replaces the phone's real-time rendering of the main scene interface, greatly reducing the phone's workload.
Further, the size of the canvas is smaller than the size of the main scene interface;
the step of rendering the scene elements into a preset canvas to obtain a still picture corresponding to the scene elements includes:
determining a rendering ratio from the size of the canvas and the size of the main scene interface; and
rendering the scene elements into the canvas at the rendering ratio to obtain a still picture corresponding to the scene elements;
and the step of displaying the background image on the main scene interface includes:
enlarging the still picture to the same size as the main scene interface;
obtaining the background image from the enlarged still picture; and
displaying the background image on the main scene interface.
In this embodiment, a still picture smaller than the main scene interface is generated and then enlarged, yielding a background image that occupies little memory yet matches the size of the main scene interface, which reduces the phone's workload when displaying the background image.
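As a rough illustration of the memory saving (a sketch with assumed sizes; the resolutions are example values, not figures from the patent), compare an uncompressed RGBA buffer at full main-scene resolution with one at a half-resolution canvas:

```python
def rgba_bytes(width, height):
    """Uncompressed RGBA8 buffer size in bytes: 4 bytes per pixel."""
    return width * height * 4

# Assumed sizes: a 1080x1920 main scene interface, and a canvas at half
# the linear resolution (so a quarter of the pixel count).
full = rgba_bytes(1080, 1920)   # full-size main scene buffer
half = rgba_bytes(540, 960)     # half-resolution canvas buffer

print(full, half, full // half)  # the smaller canvas needs 1/4 the memory
```

The quadratic saving is why even a modest reduction in canvas size noticeably lowers the memory the background image occupies.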
Further, the step of obtaining the background image from the enlarged still picture includes:
blurring the enlarged still picture to obtain a blurred still picture; and
determining the blurred still picture as the background image. Blurring makes the enlarged background image look more natural and avoids degrading the user's visual experience.
Further, the scene elements within the main scene interface include the scene elements corresponding to the current time and the scene elements corresponding to a preset time period before the current time;
the step of obtaining a relatively fixed background image from the scene elements includes:
generating a dynamic picture from the scene elements of the current time and of the preset time period before it, and determining the dynamic picture as the background image.
In this embodiment, a dynamic picture is generated from the scene elements captured over the preset time period, and displaying that dynamic picture replaces the phone's real-time rendering of the main scene interface, greatly reducing the phone's workload.
Further, after the step of displaying the background image on the main scene interface and displaying the sub-operation interface in front of it, the method further includes:
acquiring an update instruction triggered by a preset timer, the update instruction being triggered when the timer reaches a preset time threshold;
acquiring the scene elements that the main scene interface would render in real time at the moment corresponding to the update instruction; and
updating the background image of the main scene interface according to those scene elements.
Periodically updating the content displayed on the main scene interface helps the user adjust to the changed main scene when the sub-operation interface is closed, and also prevents the screen ageing (burn-in) that can result from displaying the same content over a large area of the phone screen for a long time.
Further, the scene elements include virtual buildings, virtual objects, virtual characters, and/or player characters, so that the user can tell from the scene elements where the character they control is located.
Further, the size of the sub-operation interface is smaller than that of the main scene interface, preventing the sub-operation interface from completely blocking the display content of the main scene interface.
Further, the sub-operation interface is displayed in front of the main scene interface at a preset transparency, so that the user can see the background image on the main scene interface through the sub-operation interface.
In order that the invention may be more clearly understood, specific embodiments thereof will be described hereinafter with reference to the accompanying drawings.
Drawings
FIG. 1 is a flow chart of a game rendering method according to an embodiment of the present invention.
Fig. 2 is a display diagram of the main scene interface and the sub-operation interface according to an embodiment of the present invention.
FIG. 3 is a flowchart of steps S21-S33 of a game rendering method according to an embodiment of the present invention.
Fig. 4 shows a still picture obtained in steps S211-S212 of a game rendering method according to an embodiment of the present invention.
Fig. 5 shows a background image obtained in steps S31-S32 of a game rendering method according to an embodiment of the present invention.
FIG. 6 is a flowchart of steps S321-S322 of a game rendering method according to an embodiment of the present invention.
Fig. 7 is a blurred still picture obtained from the steps S321 to S322 of the game rendering method according to an embodiment of the present invention.
Fig. 8 is a flowchart of updating a background image of a game rendering method according to an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more clear, embodiments of the present application will be described in further detail below with reference to the accompanying drawings.
It should be understood that the embodiments described are only some embodiments of the present application, and not all embodiments. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments in the present application without any creative effort belong to the protection scope of the embodiments in the present application.
When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. In the description of the present application, it is to be understood that the terms "first", "second", "third", and the like are used solely to distinguish one item from another and neither describe a particular order or sequence nor indicate or imply relative importance. The specific meaning of these terms in the present application can be understood by those of ordinary skill in the art as appropriate. As used in this application and the appended claims, the singular forms "a", "an", and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. The word "if" as used herein may be interpreted as "when", "upon", or "in response to a determination".
Further, in the description of the present application, "a plurality" means two or more unless otherwise specified. "and/or" describes the association relationship of the associated objects, meaning that there may be three relationships, e.g., a and/or B, which may mean: a exists alone, A and B exist simultaneously, and B exists alone. The character "/" generally indicates that the former and latter associated objects are in an "or" relationship.
Referring to fig. 1, a game rendering method according to an embodiment of the present invention includes the following steps:
S1: in the game main scene interface, in response to a triggering operation on a sub-operation interface, acquire the scene elements in the main scene interface.
Referring to fig. 2, the main scene interface is the interface in which the user controls a character's activities in the mobile game; in the main scene interface, those activities include, but are not limited to, moving, fighting monsters, interacting with virtual characters or other player characters, and the changes of character posture these activities cause. The sub-operation interface is an operation interface, independent of the main scene interface, that pops up within the main scene interface of the mobile game and is positioned in front of it; the content it displays may be videos, event content, character attributes, character equipment, mounts, pets, skill pages, or setting options.
The triggering operation is an operation that issues the instruction to open the corresponding sub-operation interface, for example by tapping an icon on the main scene interface or pressing a shortcut key of the phone.
The scene elements include virtual buildings, virtual objects, virtual characters, and/or player characters; preferably, the scene elements may also include lighting. When responding to the triggering operation, one or more of the virtual buildings, virtual objects, virtual characters, and player characters in the main scene interface are acquired, for example: (1) the virtual buildings alone; (2) the virtual buildings and virtual objects; (3) the virtual buildings, virtual objects, and virtual characters; or (4) the virtual buildings, virtual objects, virtual characters, and player characters combined, and so on.
A 3D engine is a set of algorithms that abstracts real materials into polygons or various curves, performs the related computation, and outputs the final image; in effect, the 3D engine builds a "real world" inside the computer. Mainstream 3D engines generally render the scene elements within the virtual camera's field of view into the main scene interface in real time, producing a continuously changing main scene interface. In one embodiment of the present application, when the triggering operation on the sub-operation interface is responded to, the scene elements are the elements within the field of view of the virtual camera corresponding to the main scene interface at the current time, from which a fixed, unchanging still picture is generated. In an optional embodiment, the scene elements are the elements within that field of view during a preset time period before the current time, from which a relatively fixed dynamic picture is generated; while the sub-operation interface is displayed, only this dynamic picture is shown and the main scene need not be rendered in real time.
S2: obtain a relatively fixed background image from the scene elements.
Here, when only one kind of scene element within the main scene interface is acquired, the composition of that scene element is relatively fixed; for example, when the virtual buildings are acquired, a relatively fixed background image means that the positions between the virtual buildings, and between the parts of each virtual building, are relatively fixed. When more than one kind of scene element is acquired, the various acquired scene elements are likewise relatively fixed with respect to one another; for example, when the virtual buildings and virtual objects are acquired, the virtual buildings and virtual objects are fixed relative to each other.
The background image may be a still picture or a dynamic picture.
S3: display the background image on the main scene interface, and display the sub-operation interface in front of the main scene interface.
Compared with the prior art, in the game rendering method disclosed by the invention, when the user opens the sub-operation interface a background image is displayed on the main scene interface and the main scene no longer needs to be rendered in real time. This reduces the phone's rendering workload for the main scene interface, reduces stuttering and frame drops, and improves how smoothly the game runs on the phone.
Preferably, while the background image is displayed on the main scene interface, real-time dynamic rendering of the scene elements of the main scene interface is switched off. This reduces the phone's rendering workload because, when the scene elements are dynamically rendered in real time, the phone must continuously exchange appearance data for those scene elements in the background and render the main scene interface from the exchanged data; this imposes a heavy workload, increases the phone's heat output and power consumption, and can even make the game stutter. Displaying the background image and switching off real-time dynamic rendering reduces the volume of game data exchanged, lowers the phone's workload and power consumption, makes the game run more smoothly, and improves the user's entertainment experience.
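The pause/resume behaviour described above can be sketched as a render loop that skips per-frame scene rendering while a cached background is on screen. This is a hypothetical structure (class and method names are illustrative, not from the patent):

```python
class MainSceneRenderer:
    """Sketch of the described behaviour: while a sub-operation interface
    is open, per-frame scene rendering is skipped and a cached background
    image stays on screen instead."""

    def __init__(self):
        self.live_rendering = True
        self.background_image = None
        self.frames_rendered = 0

    def open_sub_interface(self, background_image):
        # Show the cached background and switch off live scene rendering.
        self.background_image = background_image
        self.live_rendering = False

    def close_sub_interface(self):
        # Drop the background and resume real-time rendering.
        self.background_image = None
        self.live_rendering = True

    def tick(self):
        # Called once per frame by the game loop.
        if self.live_rendering:
            self.frames_rendered += 1  # stands in for an expensive render
        # else: nothing to do; the static background remains displayed

r = MainSceneRenderer()
r.tick(); r.tick()                      # two live frames
r.open_sub_interface("cached_bg.png")
r.tick(); r.tick(); r.tick()            # panel open: no scene renders
print(r.frames_rendered)                # 2
```

The saving comes from the `tick` branch: every frame spent with the panel open costs nothing beyond keeping the cached image on screen.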
In addition, at least one of the following relationships may hold between the sub-operation interface and the main scene interface:
Optionally, the size of the sub-operation interface is smaller than the size of the main scene interface, preventing the sub-operation interface from completely blocking the display content of the main scene interface.
Optionally, the sub-operation interface is displayed in front of the main scene interface at a preset transparency, so that the user can see the background image on the main scene interface through the sub-operation interface. Preferably, the sub-operation interface comprises a window and options; the window is displayed in front of the main scene interface at the preset transparency, so that when operating the sub-operation interface the user can both see the options and see the background image of the main scene interface through the window.
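The preset transparency amounts to ordinary alpha blending of the panel over the background. A minimal sketch (the blend formula is the standard "over" operator; the 0.7 opacity and the pixel values are assumed examples, not values from the patent):

```python
def blend(panel_rgb, background_rgb, opacity):
    """Blend one semi-transparent panel pixel onto a background pixel:
    out = panel * opacity + background * (1 - opacity)."""
    return tuple(
        round(p * opacity + b * (1.0 - opacity))
        for p, b in zip(panel_rgb, background_rgb)
    )

# A mid-grey panel pixel at 70% opacity over a blue-ish background pixel:
# the background image remains partially visible through the panel.
print(blend((128, 128, 128), (30, 60, 200), 0.7))
```

At opacity 1.0 the panel fully hides the background; lowering the preset value lets the cached background image show through the sub-operation interface's window.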
In one possible embodiment, when the background image is a still picture, the scene elements within the main scene interface include the scene elements corresponding to the current time. Referring to fig. 3, in this case step S2 includes the following steps:
S21: rendering the scene elements into a preset canvas to obtain a still picture corresponding to the scene elements;
S22: determining the still picture as the background image.
In this embodiment, a still picture is generated from the scene elements captured at the current time, and displaying that still picture replaces the phone's real-time rendering of the main scene interface, greatly reducing the phone's workload.
Preferably, to further reduce the memory and processing resources occupied by the background image and lighten the phone's workload, the size of the canvas is smaller than that of the main scene interface; in this case step S21 includes the following steps:
S211: determining a rendering ratio from the size of the canvas and the size of the main scene interface;
the sizes of the canvas can be various, and the aspect ratios of the sizes of the canvas are different, for example, the aspect ratio of the sizes of the canvas can be 9:16, 9:18, 9:18.7, 9:19 and the like, which are display ratios of screens of mainstream mobile phones. When the display ratio of the screen of the mobile phone for the user to operate the game is 9:16, the display ratio of the main scene interface is also 9:16 because the game end of the mobile phone is generally displayed in a full screen. At this time, a canvas with the width ratio of 9:16 in size is selected, and then the rendering ratio is determined according to the size of the selected canvas and the size of the main scene interface.
S212: rendering the scene elements into the canvas at the rendering ratio to obtain a still picture corresponding to the scene elements.
Because the canvas is smaller than the main scene interface, rendering at the rendering ratio allows all of the acquired scene elements to fit into the canvas. Step S3 then includes the following steps:
S31: enlarging the still picture to the same size as the main scene interface;
S32: obtaining the background image from the enlarged still picture;
S33: displaying the background image on the main scene interface.
The still picture obtained in steps S211-S212 is shown in fig. 4, and the background image obtained in steps S31-S32 is shown in fig. 5. Because the still picture of steps S211-S212 is smaller than the main scene interface, step S31 enlarges it to produce a background image of the same size as the main scene interface, which is then displayed there. The result is a background image that occupies little memory yet matches the size of the main scene interface, reducing the phone's workload when displaying it.
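The enlargement of step S31 can be sketched with nearest-neighbour upscaling on a tiny grid of pixel values (a dependency-free illustration; a real game would scale on the GPU, usually with filtering):

```python
def enlarge_nearest(picture, target_w, target_h):
    """Nearest-neighbour upscale of a 2-D grid of pixel values to the
    target size. Each output pixel copies the nearest source pixel."""
    src_h, src_w = len(picture), len(picture[0])
    return [
        [picture[y * src_h // target_h][x * src_w // target_w]
         for x in range(target_w)]
        for y in range(target_h)
    ]

# A 2x2 "still picture" enlarged to 4x4, i.e. back to "interface" size.
small = [[1, 2],
         [3, 4]]
big = enlarge_nearest(small, 4, 4)
for row in big:
    print(row)
# [1, 1, 2, 2]
# [1, 1, 2, 2]
# [3, 3, 4, 4]
# [3, 3, 4, 4]
```

Nearest-neighbour enlargement produces visible blockiness, which is one practical reason the following embodiment blurs the enlarged picture before using it as the background image.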
Referring to fig. 6, in one possible embodiment, step S32 includes the following steps:
S321: blurring the enlarged still picture to obtain a blurred still picture;
The blurring may be performed on the enlarged still picture with a Gaussian blur algorithm or a radial blur algorithm, or with a pre-built blur model.
S322: determining the blurred still picture as the background image.
The blurred still picture obtained in steps S321-S322 is shown in fig. 7; blurring makes the background image look more natural and avoids degrading the user's visual experience.
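To keep the sketch dependency-free, a simple 3x3 box blur stands in below for the Gaussian or radial blur named in step S321 (the blur choice and the grey values are assumptions for illustration):

```python
def box_blur(picture):
    """3x3 box blur over a 2-D grid of grey values, clamping at the
    edges: each pixel becomes the mean of its in-bounds neighbourhood."""
    h, w = len(picture), len(picture[0])
    out = []
    for y in range(h):
        row = []
        for x in range(w):
            total = count = 0
            for dy in (-1, 0, 1):
                for dx in (-1, 0, 1):
                    ny, nx = y + dy, x + dx
                    if 0 <= ny < h and 0 <= nx < w:
                        total += picture[ny][nx]
                        count += 1
            row.append(total // count)
        out.append(row)
    return out

# A single bright pixel is spread over its neighbours, softening the
# hard edges left by the nearest-neighbour enlargement.
sharp = [[0, 0, 0],
         [0, 90, 0],
         [0, 0, 0]]
print(box_blur(sharp))
```

A Gaussian kernel would weight nearer pixels more heavily, but the effect on the upscaled background image, smoothing blocky edges, is the same in kind.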
In one possible embodiment, when the background image is a dynamic picture, the scene elements within the main scene interface include the scene elements corresponding to the current time and the scene elements corresponding to a preset time period before the current time;
step S2 then includes: generating a dynamic picture from the scene elements of the current time and of the preset time period before it, and determining the dynamic picture as the background image.
The preset time period is measured in seconds and is preferably shorter than 10 seconds.
Although a dynamic picture generally occupies more memory than a still picture, the workload of displaying a dynamic picture on the phone is still far less than that of rendering the main scene interface in real time.
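A "relatively fixed" dynamic picture can be sketched as a short pre-captured frame sequence replayed in a loop; nothing new is rendered while it plays (a hypothetical structure, not the patent's implementation):

```python
from itertools import cycle

class DynamicPicture:
    """A looping sequence of frames captured over the preset time period
    before the sub-operation interface opened. Display merely replays
    these frames; no live scene rendering takes place."""

    def __init__(self, frames):
        self.frames = list(frames)
        self._loop = cycle(self.frames)

    def next_frame(self):
        return next(self._loop)

# Frames captured over, say, the 3 seconds before the trigger.
dp = DynamicPicture(["frame_t-2s", "frame_t-1s", "frame_t0"])
print([dp.next_frame() for _ in range(5)])
# ['frame_t-2s', 'frame_t-1s', 'frame_t0', 'frame_t-2s', 'frame_t-1s']
```

Memory grows linearly with the number of captured frames, which is one reason the preset time period is kept short (preferably under 10 seconds, as stated above).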
Referring to fig. 8, in one possible embodiment, after step S3 the method further includes:
S4: acquiring an update instruction triggered by a preset timer, the update instruction being triggered when the timer reaches a preset time threshold.
The preset time threshold may be a single fixed value: when the timer reaches this value it triggers the generation of an update instruction, is reset to zero, and starts counting again. For example, with a threshold of 30 seconds, the timer generates an update instruction when it reaches 30 seconds and is then reset from 30 seconds to 0. The preset time threshold may also be a sequence of fixed but not necessarily equal values, such as the regularly increasing values 30 seconds, 1 minute, and 1.5 minutes, or the irregularly increasing values 25 seconds, 1 minute, 3 minutes, 10 minutes, and 20 minutes.
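The reset-and-advance behaviour of step S4 can be sketched as follows (a hypothetical helper; the 30/60/90-second schedule is the example given above, and the last threshold is simply reused once the list is exhausted):

```python
class UpdateTimer:
    """Fires an update instruction each time the elapsed time reaches the
    current threshold, then resets to zero and advances to the next
    threshold in the schedule (holding the last one thereafter)."""

    def __init__(self, thresholds):
        self.thresholds = thresholds
        self.index = 0
        self.elapsed = 0

    def tick(self, seconds):
        """Advance the clock; return True when an update instruction fires."""
        self.elapsed += seconds
        limit = self.thresholds[min(self.index, len(self.thresholds) - 1)]
        if self.elapsed >= limit:
            self.elapsed = 0   # reset to zero and start counting again
            self.index += 1    # move on to the next threshold
            return True
        return False

timer = UpdateTimer([30, 60, 90])  # 30 s, 1 min, 1.5 min schedule
fired = [t for t in range(1, 200) if timer.tick(1)]
print(fired)  # updates fire at t = 30, 90, 180 seconds
```

An increasing schedule updates the background often right after the panel opens, when the scene is most likely to have drifted, and less often later.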
In other embodiments, a person skilled in the art may use a countdown timer in place of the timer of this embodiment; based on the same timing concept, it achieves the same effect.
S5: acquiring the scene elements that the main scene interface would render in real time at the moment corresponding to the update instruction.
Although the main scene interface is displaying the background image, the scene elements within the field of view of the virtual camera corresponding to the main scene interface may still change. For example, even when the player character stands still and the virtual camera's field of view does not move, the virtual objects, virtual characters, other player characters, and even the lighting within that field of view may change over time.
S6: updating the background image of the main scene interface according to those scene elements.
In step S6, there are a first background image and a second background image. The image currently displayed on the main scene interface is the first background image; the second background image is generated from the scene elements that the main scene interface would render in real time at the moment corresponding to the update instruction. Once generated, the second background image is displayed on the main scene interface in place of the first, thereby updating the displayed background; at that point the old background image may be deleted, and the second background image becomes the new first background image until the next update.
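The first/second image replacement described above amounts to a buffer swap (a hypothetical sketch; the file names are illustrative placeholders):

```python
class BackgroundBuffer:
    """Holds the background image currently shown on the main scene
    interface and swaps in a freshly generated one on each update
    instruction, discarding the old image."""

    def __init__(self, first_image):
        self.current = first_image

    def update(self, second_image):
        # The newly generated second image replaces the displayed one;
        # dropping the old reference lets its memory be released.
        self.current = second_image
        return self.current

buf = BackgroundBuffer("snapshot_t0.png")
buf.update("snapshot_t30.png")   # on the first update instruction
print(buf.current)               # snapshot_t30.png
```

Keeping only one image resident at a time preserves the memory advantage of the background-image approach across updates.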
Periodically updating the content displayed on the main scene interface helps the user adjust to the changed main scene when the sub-operation interface is closed, and also prevents the screen ageing (burn-in) that can result from displaying the same content over a large area of the phone screen for a long time.
In the description of the present invention, it is to be understood that the terms "center", "longitudinal", "lateral", "up", "down", "front", "back", "left", "right", "vertical", "horizontal", "top", "bottom", "inner", "outer", and the like, indicate orientations or positional relationships based on those shown in the drawings, and are used only for convenience in describing the present invention and for simplicity in description, and do not indicate or imply that the referenced devices or elements must have a particular orientation, be constructed and operated in a particular orientation, and thus, are not to be construed as limiting the present invention. In the description of the present invention, "a plurality" means two or more unless otherwise specified.
The above embodiments express only several embodiments of the present invention, and their description is relatively specific and detailed, but this is not to be construed as limiting the scope of the invention. It should be noted that a person skilled in the art can make several variations and modifications without departing from the inventive concept, and these fall within the scope of the present invention.

Claims (10)

1. A game rendering method, comprising the steps of:
in a game main scene interface, in response to a triggering operation on a sub-operation interface, acquiring scene elements in the main scene interface;
obtaining a relatively fixed background image from the scene elements; and
displaying the background image on the main scene interface, and displaying the sub-operation interface in front of the main scene interface.
2. A game rendering method according to claim 1, wherein: and when the background image is displayed on the main scene interface, closing the real-time dynamic rendering of the scene elements of the main scene interface.
3. A game rendering method according to claim 1, wherein
the scene elements within the main scene interface include: scene elements corresponding to the current time in the main scene interface;
the step of obtaining a relatively fixed background image from the scene element includes:
rendering the scene elements into a preset canvas to obtain a static picture corresponding to the scene elements;
determining the static picture as the background image.
4. A game rendering method according to claim 3, wherein:
the size of the canvas is smaller than that of the main scene interface;
the step of rendering the scene element into a preset canvas to obtain a static picture corresponding to the scene element includes:
determining a rendering ratio according to the size of the canvas and the size of the main scene interface;
rendering the scene elements into the canvas according to the rendering proportion to obtain a static picture corresponding to the scene elements;
the step of displaying the background image on the main scene interface comprises:
enlarging the static picture to the same size as the main scene interface;
obtaining the background image from the enlarged static picture;
and displaying the background image on the main scene interface.
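Claim 4's scaled-canvas approach can be illustrated with a small sketch. The patent does not name an interpolation method, so nearest-neighbour enlargement is assumed here, and the pixel grid and function names are hypothetical: the scene is rendered into a canvas smaller than the main interface (cheaper to rasterise), and the result is then enlarged back to the interface size.

```python
# Illustrative sketch of claim 4 (assumed nearest-neighbour scaling; names hypothetical).

def rendering_ratio(canvas_size, interface_size):
    """Per-axis ratio between the smaller canvas and the main scene interface."""
    cw, ch = canvas_size
    iw, ih = interface_size
    return cw / iw, ch / ih

def upscale(picture, factor):
    """Nearest-neighbour enlarge a 2-D pixel grid by an integer factor."""
    out = []
    for row in picture:
        wide = [px for px in row for _ in range(factor)]     # widen each row
        out.extend([list(wide) for _ in range(factor)])      # repeat each widened row
    return out

ratio = rendering_ratio((640, 360), (1280, 720))   # render at half resolution
small = [[1, 2],
         [3, 4]]                                   # picture rendered into the small canvas
big = upscale(small, 2)                            # enlarged to the interface size
```

Rendering at a reduced resolution and enlarging afterwards trades sharpness for speed; since claim 5 blurs the result anyway, the loss of detail is largely hidden.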
5. A game rendering method according to claim 4, wherein:
the step of obtaining the background image from the enlarged static picture comprises:
performing blurring processing on the enlarged static picture to obtain a blurred static picture;
and determining the blurred static picture as the background image.
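The patent names no specific blur algorithm for claim 5, so a minimal 3×3 box blur over a grayscale pixel grid is shown here as one possible reading of the "blurring processing" step; the grid values and function name are illustrative only.

```python
# One possible implementation of claim 5's blurring step: a 3x3 box blur
# (the patent does not specify an algorithm; Gaussian blur would work equally).

def box_blur(img):
    h, w = len(img), len(img[0])
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            total, count = 0, 0
            # average the pixel with its in-bounds 8-neighbourhood
            for dy in (-1, 0, 1):
                for dx in (-1, 0, 1):
                    ny, nx = y + dy, x + dx
                    if 0 <= ny < h and 0 <= nx < w:
                        total += img[ny][nx]
                        count += 1
            out[y][x] = total // count
    return out

sharp = [[0, 0, 0],
         [0, 90, 0],
         [0, 0, 0]]
soft = box_blur(sharp)   # the bright centre pixel is spread into its neighbours
```

Blurring both softens the upscaling artifacts from claim 4 and visually de-emphasises the frozen background, keeping the user's attention on the sub-operation interface in front of it.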
6. A game rendering method according to claim 1, wherein
the scene elements within the main scene interface include: scene elements corresponding to the current time and scene elements corresponding to a preset time period before the current time in the main scene interface;
the step of obtaining a relatively fixed background image from the scene element includes:
and generating a dynamic picture from the scene elements corresponding to the current time and to the preset time period before the current time in the main scene interface, and determining the dynamic picture as the background image.
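The frame-selection part of claim 6 can be sketched as filtering a history of captured frames down to those within the preset window before the current time; assembling them into an actual animated image (e.g. a GIF) is omitted, and the names and timestamps here are hypothetical.

```python
# Illustrative sketch of claim 6's frame window (names and data hypothetical).

def frames_in_window(history, now, window):
    """history: list of (timestamp, frame) pairs; keep frames with now-window <= t <= now."""
    return [frame for t, frame in history if now - window <= t <= now]

history = [(1, "frame_a"), (3, "frame_b"), (5, "frame_c"), (7, "frame_d")]
animated_background = frames_in_window(history, now=7, window=4)   # frames from t=3..7
```

Playing these few frames in a loop gives the frozen background a slight sense of motion at a fraction of the cost of full real-time rendering.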
7. A game rendering method according to any one of claims 1 to 6, further comprising, after the step of displaying the background image on the main scene interface and the sub-operation interface in front of the main scene interface:
acquiring an updating instruction triggered by a preset timer; the updating instruction is triggered when the timer time reaches a preset time threshold;
acquiring scene elements dynamically rendered in real time by the main scene interface at the time corresponding to the updating instruction;
and updating the background image of the main scene interface according to the scene elements.
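Claim 7's timed refresh can be sketched as follows. The patent does not describe a concrete timer mechanism, so this sketch uses a hypothetical `BackgroundUpdater` driven by explicit `tick(dt, ...)` calls; real code would hook a platform timer. When accumulated time reaches the preset threshold, the "update instruction" fires and the background snapshot is rebuilt from the scene's current state.

```python
# Hypothetical sketch of claim 7's timer-driven background refresh.

class BackgroundUpdater:
    def __init__(self, threshold):
        self.threshold = threshold   # preset time threshold that triggers the update
        self.elapsed = 0
        self.background = None

    def tick(self, dt, current_scene_elements):
        """Advance the timer by dt; refresh the background when the threshold is reached."""
        self.elapsed += dt
        if self.elapsed >= self.threshold:              # update instruction fires
            self.elapsed = 0
            self.background = tuple(current_scene_elements)  # re-rendered snapshot
            return True
        return False

updater = BackgroundUpdater(threshold=30)
fired_early = updater.tick(10, ["day_sky"])    # below threshold: no refresh
fired_late = updater.tick(20, ["dusk_sky"])    # reaches threshold: background refreshed
```

Refreshing only on a coarse timer preserves most of the savings from freezing the scene, while keeping the background close enough to the live scene that closing the sub-interface is not jarring (and avoiding the burn-in risk noted in the description).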
8. A game rendering method according to claim 1, wherein: the scene elements include virtual buildings, virtual objects, virtual characters, and/or player characters.
9. A game rendering method according to claim 1, wherein: and the size of the sub operation interface is smaller than that of the main scene interface.
10. A game rendering method according to claim 1, wherein: and the sub-operation interface is displayed in front of the main scene interface in a preset transparent proportion.
CN202110839968.3A 2021-07-23 2021-07-23 Game rendering method Pending CN113577770A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110839968.3A CN113577770A (en) 2021-07-23 2021-07-23 Game rendering method


Publications (1)

Publication Number Publication Date
CN113577770A 2021-11-02

Family

ID=78249418

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110839968.3A Pending CN113577770A (en) 2021-07-23 2021-07-23 Game rendering method

Country Status (1)

Country Link
CN (1) CN113577770A (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102750450A (en) * 2012-06-21 2012-10-24 北京像素软件科技股份有限公司 Scene management method and device in network game
US20160139773A1 (en) * 2014-11-17 2016-05-19 Supercell Oy Electronic device for facilitating user interactions
CN107168616A (en) * 2017-06-08 2017-09-15 网易(杭州)网络有限公司 Game interaction interface display method, device, electronic equipment and storage medium
CN110060325A (en) * 2019-04-19 2019-07-26 成都四方伟业软件股份有限公司 Screen space rendering method and device
CN110152291A (en) * 2018-12-13 2019-08-23 腾讯科技(深圳)有限公司 Rendering method, device, terminal and the storage medium of game picture
CN110559659A (en) * 2019-07-09 2019-12-13 深圳市瑞立视多媒体科技有限公司 game rendering optimization method, device, equipment and storage medium
CN112426711A (en) * 2020-10-23 2021-03-02 杭州电魂网络科技股份有限公司 Bloom effect processing method, system, electronic device and storage medium

Similar Documents

Publication Publication Date Title
US9671942B2 (en) Dynamic user interface for inheritance based avatar generation
US20170132845A1 (en) System and Method for Reducing Virtual Reality Simulation Sickness
JP4809922B2 (en) Motion desktop
CN112619167A (en) Information processing method and device, computer equipment and medium
CN105389090B (en) Method and device, mobile terminal and the computer terminal of game interaction interface display
Brackeen et al. Developing games in Java
CN110689604A (en) Personalized face model display method, device, equipment and storage medium
JP2018147002A (en) Image processing program, image processing system, image processing apparatus and image processing method
CN110471731B (en) Display interface drawing method and device, electronic equipment and computer readable medium
WO2023160054A1 (en) Image rendering method and apparatus, electronic device, computer-readable storage medium, and computer program product
CN111790150B (en) Shadow data determination method, device, equipment and readable medium
CN111467803A (en) In-game display control method and device, storage medium, and electronic device
CN113577770A (en) Game rendering method
US7277583B2 (en) Game software and game machine
CN112843693A (en) Method and device for shooting image, electronic equipment and storage medium
WO2023202254A1 (en) Image rendering method and apparatus, electronic device, computer-readable storage medium, and computer program product
US20170031583A1 (en) Adaptive user interface
CN117205553A (en) Information processing method and device and electronic equipment
CN112138377A (en) Method and system for adjusting game APP rendering effect
CN115708956A (en) Game picture updating method and device, computer equipment and medium
CN113282290B (en) Object rendering method, device, equipment and storage medium
GB2595445A (en) Digital sandtray
CN116943160A (en) Visual effect display method and device and electronic equipment
CN118092670B (en) Expression preview method, system and storage medium for adaptive roles in virtual scene
CN115314754A (en) Display control method and device of interactive control and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20211102