CN108363569B - Image frame generation method, device, equipment and storage medium in application

Info

Publication number: CN108363569B
Application number: CN201810186248.XA
Authority: CN (China)
Prior art keywords: rendering, hierarchy, call, image frame, hierarchical
Legal status: Active (granted)
Other languages: Chinese (zh)
Other versions: CN108363569A (en)
Inventor: 余煜
Current assignee: Tencent Technology Shenzhen Co Ltd
Original assignee: Tencent Technology Shenzhen Co Ltd
Application filed by Tencent Technology Shenzhen Co Ltd; priority to CN201810186248.XA
Publication of CN108363569A (application); application granted; publication of CN108363569B (grant)
(The legal status and assignee information are assumptions made by Google, not a legal conclusion; Google has not performed a legal analysis.)

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 8/00: Arrangements for software engineering
    • G06F 8/30: Creation or generation of source code
    • G06F 8/34: Graphical or visual programming
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 9/00: Arrangements for program control, e.g. control units
    • G06F 9/06: Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/44: Arrangements for executing specific programs
    • G06F 9/451: Execution arrangements for user interfaces
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 15/00: 3D [Three Dimensional] image rendering
    • G06T 15/005: General purpose rendering architectures

Abstract

The application discloses an image frame generation method, device, equipment and storage medium in an application, and belongs to the field of computer technology. The method comprises the following steps: creating at least two panels in an image frame to be generated and the controls in each panel; creating a first rendering call for each control and generating the hierarchy of the first rendering call according to the hierarchy of the control, wherein the hierarchy of the first rendering call is independent of the hierarchies of the at least two panels; for each first rendering call, obtaining the material of the corresponding control, searching a material index library for a material instance matching the hierarchy and the material, and filling the material instance into the first rendering call; and merging, through second rendering calls, the first rendering calls that contain the same material instance, and rendering the merged first rendering calls to obtain the image frame, wherein at least two of the merged first rendering calls correspond to controls in different panels. The method and device can reduce the time consumed in generating an image frame.

Description

Image frame generation method, device, equipment and storage medium in application
Technical Field
The embodiments of the present application relate to the field of computer technology, and in particular to an image frame generation method, device, equipment and storage medium in an application.
Background
An image frame in an application may include a plurality of panels, each panel may include a plurality of controls, and the terminal obtains the image frame by rendering all of the controls. The controls here may also be referred to as UI (User Interface) components.
Rendering involves a first rendering call and a second rendering call. The first rendering call is a rendering call of the user interface (UIDrawcall), and each control may correspond to its own first rendering call; the second rendering call is a global rendering call (Drawcall), and each image frame may correspond to multiple second rendering calls. During rendering, all of the first rendering calls are rendered through the second rendering calls.
Because every first rendering call needs to be rendered separately, generating the image frame is time-consuming.
Disclosure of Invention
The embodiments of the present application provide an image frame generation method, device, equipment and storage medium in an application, which solve the problem that generating an image frame takes a long time because every first rendering call needs to be rendered. The technical solution is as follows:
in one aspect, a method for generating an image frame in an application is provided, the method including:
creating at least two panels in an image frame to be generated and the controls in each of the at least two panels;
creating a first rendering call for the control, and generating a hierarchy of the first rendering call according to the hierarchy of the control;
for each first rendering call, obtaining the material of the control corresponding to the first rendering call, searching a material index library for a material instance matching the hierarchy and the material, and filling the material instance into the first rendering call;
and merging, through the second rendering call, the first rendering calls containing the same material instance, and rendering the merged first rendering calls to obtain the image frame.
In one aspect, an image frame generation apparatus in an application is provided, the apparatus including:
the apparatus comprises a creating module, a searching module and a generating module, wherein the creating module is configured to create at least two panels in an image frame to be generated and the controls in each panel of the at least two panels;
the creating module is further configured to create a first rendering call for the control, and generate a hierarchy of the first rendering call according to the hierarchy of the control;
the searching module is configured to obtain, for each first rendering call created by the creating module, the material of the control corresponding to the first rendering call, search a material index library for a material instance matching the hierarchy and the material, and fill the material instance into the first rendering call;
and the generating module is configured to merge, through the second rendering call, the first rendering calls containing the same material instance, and render the merged first rendering calls to obtain the image frame.
In one aspect, there is provided a computer-readable storage medium having stored therein at least one instruction, at least one program, a code set, or an instruction set, which is loaded and executed by a processor to implement the image frame generation method in an application as described above.
In one aspect, there is provided an image frame generation device in an application, the image frame generation device in an application comprising a processor and a memory, the memory having stored therein at least one instruction, the instruction being loaded and executed by the processor to implement the image frame generation method in an application as described above.
The technical scheme provided by the embodiment of the application has the beneficial effects that:
Because the material index library contains materials, hierarchies and material instances, a plurality of first rendering calls having the same material and hierarchy can be set to share the same material instance in the material index library, that is, the material instances filled into these first rendering calls are identical. The second rendering call can therefore merge these first rendering calls when they are rendered, which reduces the number of first rendering calls to be rendered and reduces the time consumed in generating the image frame.
In addition, since the number of first rendering calls after merging is smaller, the power consumption of the terminal is also reduced.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed in the description of the embodiments are briefly introduced below. The drawings in the following description are only some embodiments of the present application; those skilled in the art can obtain other drawings based on these drawings without creative effort.
FIG. 1 is a schematic diagram of the relationship between image frames, controls, and hierarchies provided by one embodiment of the present application;
FIG. 2 is a flowchart of a method for generating an image frame in an application according to an embodiment of the present application;
FIG. 3 is a diagram of a material index library according to one embodiment of the present application;
fig. 4 is a flowchart of a method of generating an image frame in an application according to another embodiment of the present application;
FIG. 5 is a schematic diagram of creating a user interface rendering call as provided by another embodiment of the present application;
FIG. 6 is a schematic diagram of a configuration page provided by another embodiment of the present application;
FIGS. 7A to 7D are schematic views of an interface of a game according to another embodiment;
fig. 8 is a block diagram of an image frame generation apparatus in an application according to an embodiment of the present application;
fig. 9 is a block diagram of a terminal according to still another embodiment of the present application.
Detailed Description
To make the objects, technical solutions and advantages of the present application more apparent, embodiments of the present application will be described in detail below with reference to the accompanying drawings.
The following explains the terms used below:
1. Control: a control is a component that can be displayed in an image frame.
In one application scenario, a control is a UI (User Interface) component in an image frame, which may also be referred to as a Widget.
A control contains all of the raw data required for rendering, including at least one of shape, material and position information. The shape of the control may be at least one of a triangle, a square and a circle, which is not limited in this embodiment. Referring to fig. 1, each triangle in fig. 1 represents a control.
2. Material: a material is the combination of all visual attributes of the surface of a control, where the visual attributes refer to at least one of the texture, color, smoothness, transparency, reflectivity, refractive index and luminosity of the surface.
In one application scenario, a material is used to generate material instances.
3. Material instance: a material instance is a set of parameters obtained after instantiating a material. After each set of parameters is rendered by a GPU (Graphics Processing Unit), an object displayed in the user interface is obtained; for example, the object may be a white snow mountain or a green tree.
4. Hierarchy of controls: the hierarchy of a control is used to indicate the rendering order of the control; the larger the hierarchy of a control, the earlier its rendering order, and the smaller its hierarchy, the later its rendering order.
Referring to fig. 1, if the leftmost rectangle represents the plane of the image frame, the rendering order is from right to left, and the level of the control 1 is 4; the level of the control 2 is 3; controls 3 and 4 are located on the same plane (indicated by a dashed box in the figure), and the levels are both 2; the level of control 5 is 1.
5. Panel: a panel is a collection of controls in an image frame that have the same properties, so that the controls can be operated on as a whole.
In one application scenario, a panel is a collection of controls packaged by NGUI (a UI plugin for the Unity engine), which may also be referred to as a Panel.
In fig. 1, assuming that controls 1 and 3 are packaged into one set and controls 2, 4, and 5 are packaged into one set, the set of controls 1 and 3 may be referred to as panel 1 and the set of controls 2, 4, and 5 may be referred to as panel 2.
6. Hierarchy of panels: the hierarchy of panels is used to indicate the overall rendering order of the control sets that it includes.
In fig. 1, assume that the level of panel 1 is 1 and the level of panel 2 is 2, panel 1 includes controls 1 and 3 with the level of control 1 greater than that of control 3, and panel 2 includes controls 2, 4 and 5 with the level of control 2 greater than that of control 4 and the level of control 4 greater than that of control 5. The set composed of controls 2, 4 and 5 is then rendered before the set composed of controls 1 and 3. That is, the rendering order of the controls is: 2, 4, 5, 1 and 3.
7. A first render call: the first render call is a render call of the user interface.
In one application scenario, the first render call is a render unit packaged by NGUI, and the first render call in this case may also be referred to as a UIDrawcall.
The terminal creates a first rendering call for each control, so the number of first rendering calls is equal to the number of controls. For example, the terminal creates a first rendering call for each of the 5 controls in fig. 1, so the number of first rendering calls is 5.
8. Hierarchy of the first render call: the hierarchy of a first render call is used to indicate its rendering order; the larger the hierarchy of a first render call, the earlier its rendering order, and the smaller its hierarchy, the later its rendering order.
In this embodiment, the hierarchy of the first rendering call is related to the hierarchy of the corresponding control, and is not related to the hierarchy of the panel to which the control belongs. The generation flow of the hierarchy of the first rendering call is described in detail in step 402, and is not described herein again.
9. A second render call: the second render call is a global render call.
In one application scenario, the process by which the Unity engine prepares the data and notifies the GPU is referred to as a second render call, which may also be referred to as Drawcall.
The terminal can generate a second rendering call for each first rendering call, so the number of second rendering calls is equal to the number of first rendering calls. For example, if the terminal creates a first rendering call for each of the 5 controls in fig. 1, the number of second rendering calls is also 5.
In the related art, each first rendering call contains one material instance, which is generated according to the material of the control corresponding to that first rendering call. Across different panels, the hierarchy of a first rendering call is determined by the hierarchy of the panel to which its control belongs; within the same panel, it is determined by the hierarchy of the control. If the material instances in the first rendering calls of two controls with the same hierarchy in the same panel are identical, the first rendering calls of those two controls can be merged. However, the related art cannot merge first rendering calls that have the same material instance but belong to different panels, which makes generating an image frame time-consuming; the embodiments below address this problem.
Referring to fig. 2, a flowchart of an image frame generating method in an application, which may be applied in a terminal, according to an embodiment of the present application is shown. The image frame generation method in the application comprises the following steps:
step 201, at least two panels in an image frame to be generated and controls in each panel of the at least two panels are created.
The terminal may generate an image frame in the application according to the frame rate, or may receive an operation signal triggered by a user and used for instructing to generate the image frame in the application, and generate one image frame according to the operation signal.
In this embodiment, the terminal may perform steps 202 and 203 after each panel and the controls in that panel are created, and perform step 204 after all panels and controls in the image frame are created. Alternatively, the terminal may perform steps 203 and 204 after all panels and controls are created.
Step 202, creating a first rendering call for the control, and generating a hierarchy of the first rendering call according to the hierarchy of the control.
Optionally, the level of the first render call is independent of the level of the at least two panels.
The terminal creates a first rendering call for each control created in step 201, and then generates a hierarchy of the first rendering call according to the hierarchy of the control. The generation flow of the first rendering call level is described in detail in step 402, and is not described herein again.
In the embodiment, the level of the first rendering call is related to the level of the control and is unrelated to the level of the panel to which the control belongs; in the related art, the level of the first rendering call is related to both the level of the control and the level of the panel to which the control belongs, so the level of the first rendering call in the embodiment is different from the level of the first rendering call in the related art.
Step 203, for each first rendering call, obtaining the material of the control corresponding to the first rendering call, searching a material instance matched with the hierarchy and the material in a material index library, and filling the material instance into the first rendering call.
The material index library comprises at least one material, a hierarchical list of each material and a material example corresponding to each hierarchy in the hierarchical list. Referring to fig. 3, the hierarchical list of material 1 includes 4 levels of first rendering calls, the hierarchical list of material 2 includes 3 levels of first rendering calls, the hierarchical list of material 3 includes 1 level of first rendering calls, and the hierarchical list of material 4 includes 2 levels of first rendering calls, each level in fig. 3 corresponding to a material instance (the material instance is not shown in fig. 3).
Initially, the material index library is empty. When the terminal searches for a material instance matching the material and hierarchy of a first rendering call and does not find one, it creates the material, the hierarchy and the corresponding material instance according to that material and hierarchy. The material index library is therefore built up in the process of creating first rendering calls; the specific generation process is described in steps 408 to 410 and is not repeated here.
When searching for a material instance matched with the material and the level of the first rendering call, the terminal may first search for a material identical with the material of the first rendering call in the material index library, then search for a level identical with the level of the first rendering call in the level list of the material, and fill the material instance corresponding to the level into the first rendering call, where the first rendering call at this time is equivalent to a container accommodating the material instance.
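As an illustration of this lookup, the sketch below models the material index library as a nested mapping (material, then hierarchy, then material instance) in Python. All names here (MaterialIndexLibrary, find_instance, fill_first_render_call, the dict-shaped first rendering call) are illustrative assumptions rather than the patent's implementation, and materials are assumed to be usable as dictionary keys:

```python
class MaterialIndexLibrary:
    """Material index library: material -> {hierarchy: material instance}."""

    def __init__(self):
        self._index = {}

    def find_instance(self, material, hierarchy):
        """Return the material instance matching both the material and the
        hierarchy, or None if the library contains no such entry."""
        return self._index.get(material, {}).get(hierarchy)


def fill_first_render_call(first_call, library):
    """Look up the shared material instance for a first rendering call and
    fill it in; the first rendering call then acts as a container for it."""
    instance = library.find_instance(first_call["material"], first_call["hierarchy"])
    if instance is not None:
        first_call["material_instance"] = instance
    return instance
```

Because every first rendering call with the same material and hierarchy receives the same instance object from the library, calls filled in this way can later be recognized as mergeable.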
And step 204, merging the first rendering calls containing the same material example through the second rendering call, and rendering the merged first rendering call to obtain an image frame.
If at least two of the merged first rendering calls correspond to different controls in the same panel, first rendering calls of different controls in the same panel have been merged; if at least two of the merged first rendering calls correspond to controls in different panels, first rendering calls of controls in different panels have been merged. In this way, merging reduces the number of first rendering calls to be rendered, so not every first rendering call has to be rendered separately, which reduces the time consumed in generating the image frame.
Still taking fig. 1 as an example, assume that panel 1 includes controls 1 and 3 and panel 2 includes controls 2, 4 and 5, so the terminal needs to create 5 first rendering calls. If the first rendering call corresponding to control 3 and the first rendering call corresponding to control 4 contain the same material instance, these two first rendering calls can be merged to generate one second rendering call, and one second rendering call is generated for each of the remaining 3 first rendering calls. A total of 4 second rendering calls are generated, and the 4 second rendering calls are rendered to obtain the image frame.
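A minimal sketch of this merge step, in Python for illustration; grouping by the shared material instance (here via id()) and the name merge_first_render_calls are assumptions rather than the patent's actual code:

```python
from collections import defaultdict


def merge_first_render_calls(first_calls):
    """Group first rendering calls that were filled with the same material
    instance, even when their controls belong to different panels, and emit
    one second rendering call (one group) per distinct instance."""
    groups = defaultdict(list)
    for call in first_calls:
        # Calls filled from the same entry of the material index library
        # share one instance object, so its identity is the grouping key.
        groups[id(call["material_instance"])].append(call)

    # One second rendering call per group; larger hierarchies render earlier,
    # mirroring the ordering described above.
    return sorted(groups.values(), key=lambda group: -group[0]["hierarchy"])
```

With the fig. 1 example above, the calls of controls 3 and 4 fall into one group, so 5 first rendering calls yield 4 second rendering calls.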
To sum up, in the image frame generation method provided by this embodiment of the present application, because the material index library contains materials, hierarchies and material instances, a plurality of first rendering calls having the same material and hierarchy can be set to share the same material instance in the material index library, that is, the material instances filled into these first rendering calls are identical. The second rendering call can then merge the first rendering calls containing the same material instance when rendering them, which reduces the number of first rendering calls to be rendered and reduces the time consumed in generating an image frame.
When the first rendering calls filled with the same material instance correspond to controls in different panels, the first rendering calls corresponding to the different panels may also be merged to reduce the number of first rendering calls rendered by the second rendering call, thereby reducing the time consumed to generate the image frame.
In addition, since the number of first rendering calls after merging becomes small, the power consumption of the terminal can also be reduced.
Referring to fig. 4, a flowchart of an image frame generating method in an application, which may be applied in a terminal, according to another embodiment of the present application is shown. The image frame generation method in the application comprises the following steps:
step 401, at least two panels in an image frame to be generated and a control in each panel of the at least two panels are created.
The trigger for generating an image frame and the way panels and controls are created are described in step 201 and are not repeated here.
When an image frame is generated for the first time, the terminal needs to create all panels in the image frame and all controls in each panel, so the controls created by the terminal are all of the controls in the panels. When the image frame to be generated is not the first one, its content may be only partially changed compared with the previous image frame, in which case only the changed parts of the image frame need to be updated, which improves the generation efficiency of the image frame. In this case, the controls created by the terminal are the controls in the panels that have changed compared with the previously generated image frame.
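A hedged sketch of this incremental path, in Python; the dirty-tracking shown here (an is_first_frame flag plus a set of changed control ids) is one plausible way to realize the behavior described above, not the patent's stated implementation:

```python
def controls_to_create(panels, is_first_frame, changed_control_ids):
    """Select the controls to (re)create for the image frame to be generated.

    On the first frame every control of every panel is created; afterwards
    only the controls that changed since the previous frame are re-created.
    """
    if is_first_frame:
        return [control for panel in panels for control in panel["controls"]]
    return [control
            for panel in panels
            for control in panel["controls"]
            if control["id"] in changed_control_ids]
```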
Step 402, creating a first rendering call for the control, and generating a hierarchy of the first rendering call according to the hierarchy of the control, wherein the hierarchy of the first rendering call is independent of the hierarchies of the at least two panels.
The terminal creates a first rendering call for each control created in step 401, and then generates the hierarchy of the first rendering call according to the hierarchy of the control. Creating a first rendering call for the control and generating the hierarchy of the first rendering call according to the hierarchy of the control may include the following substeps:
and substep 1, for each panel in at least two panels, obtaining the material of each control in the panel.
Since the controls already have the attributes of the material when being created, the terminal can directly read the corresponding material from each control in the panel.
Substep 2: for each material, obtain the hierarchy of each control having that material, and obtain the hierarchy interval corresponding to the material.
In one implementation, the terminal may first number each control in the panel, create a group for each material, and classify control numbers having the same material into the same group. Finally, for each group, the terminal reads the hierarchy of the control represented by each number in the group to obtain the hierarchy interval corresponding to the group, which is the hierarchy interval corresponding to the material.
Assuming that the panel includes controls 1-5, the controls 1 and 2 are made of material 1, the controls 3 and 4 are made of material 2, and the control 5 is made of material 3, the terminal may generate a group 1 corresponding to the material 1, and add the numbers 1 and 2 to the group 1; generating a group 2 corresponding to the material 2, and adding numbers 3 and 4 into the group 2; a packet 3 corresponding to the material 3 is generated, and the number 5 is added to the packet 3.
Assuming that the levels of the controls 1-5 are 1-5 respectively, the level interval corresponding to the group 1 is [1, 2], the level interval corresponding to the group 2 is [3, 4], and the level interval corresponding to the group 3 is [5 ].
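Substeps 1 and 2 can be sketched as follows (Python for illustration; the function name, the dict-shaped controls and the list representation of a hierarchy interval are assumptions, not the patent's implementation):

```python
from collections import defaultdict


def group_controls_by_material(panel_controls):
    """Number the controls of one panel, put numbers of controls with the
    same material into the same group, and record the hierarchy interval
    (here: the sorted list of hierarchies) of each group."""
    groups = defaultdict(list)        # material -> control numbers
    for number, control in enumerate(panel_controls, start=1):
        groups[control["material"]].append(number)

    intervals = {}                    # material -> sorted hierarchies
    for material, numbers in groups.items():
        intervals[material] = sorted(panel_controls[n - 1]["hierarchy"]
                                     for n in numbers)
    return groups, intervals
```

With the five controls of the example above (materials 1, 1, 2, 2, 3 and hierarchies 1 to 5), this yields the groups [1, 2], [3, 4] and [5] and the hierarchy intervals [1, 2], [3, 4] and [5].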
Substep 3: when the values in the hierarchy interval are continuous, create a corresponding first rendering call for the controls having the material, and determine the hierarchy of the first rendering call according to the hierarchy interval.
If the values within the hierarchy interval are continuous, the terminal may create one first rendering call for the controls in each group. When determining the hierarchy of the first rendering call, the terminal may sort the hierarchy intervals by the magnitude of their values and determine the hierarchy of the corresponding first rendering call from the sequence number of its hierarchy interval, where the sequence number is positively correlated with the hierarchy of the first rendering call. That is, the smaller the sequence number, the smaller the hierarchy of the first rendering call; the larger the sequence number, the larger the hierarchy.
Assume that, after the hierarchy intervals are sorted by the magnitude of their values, the sequence numbers of groups 1 to 3 are determined to be 1 to 3 respectively. The terminal then creates a first rendering call for controls 1 and 2 and sets its hierarchy to 1, creates a first rendering call for controls 3 and 4 and sets its hierarchy to 2, and creates a first rendering call for control 5 and sets its hierarchy to 3.
It should be noted that the terminal creates the same first rendering call for controls whose numbers belong to the same group, that is, controls that are adjacent in hierarchy within the same panel and have the same material correspond to the same first rendering call. Compared with the related art, in which a first rendering call is created for every control, the number of first rendering calls to be rendered is already reduced here.
Substep 4: when the values in the hierarchy interval are discontinuous, split the hierarchy interval into at least two hierarchy subintervals, create a corresponding first rendering call for the controls having the material whose hierarchies belong to the same hierarchy subinterval, and determine the hierarchy of the first rendering call according to the hierarchy subinterval, where the values within each hierarchy subinterval are continuous.
If the value in the hierarchical interval is not continuous, the terminal can divide the hierarchical interval into at least two hierarchical sub-intervals, and create a first rendering call for the control corresponding to the same hierarchical sub-interval in the group. Assuming that the hierarchical interval corresponding to the group 1 is [1, 2], the hierarchical interval corresponding to the group 2 is [3, 5], and the hierarchical interval corresponding to the group 3 is [4], the terminal may split the hierarchical interval of the group 2 into hierarchical subintervals [3] and [5], create a first rendering call for the controls 1 and 2, create a first rendering call for the control 3, create a first rendering call for the control 4, and create a first rendering call for the control 5.
When determining the hierarchy of the first rendering call, the terminal may sort the hierarchy intervals and the hierarchy subintervals according to the numerical values, and determine the corresponding hierarchy of the first rendering call according to the sequence number, where the sequence number is in a positive correlation with the hierarchy of the first rendering call. That is, the smaller the sequence number, the smaller the hierarchy of the first render call; the larger the sequence number, the larger the hierarchy of the first render call.
Assuming that the sequence number of the group 1 is 1, the sequence number of the level subinterval [3] in the group 2 is 2, the sequence number of the group 3 is 3, and the sequence number of the level subinterval [5] in the group 2 is 4, the terminal sets the hierarchy of the first rendering call corresponding to the controls 1 and 2 to 1, the hierarchy of the first rendering call corresponding to the control 3 to 2, the hierarchy of the first rendering call corresponding to the control 4 to 3, and the hierarchy of the first rendering call corresponding to the control 5 to 4.
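Substeps 3 and 4 (splitting a discontinuous hierarchy interval into contiguous subintervals and deriving the hierarchy of each first rendering call from the sorted intervals) can be sketched like this in Python; the function names and the list shapes are illustrative assumptions:

```python
def split_into_contiguous_runs(hierarchies):
    """Split a sorted list of control hierarchies into runs of consecutive
    values; a continuous interval comes back as a single run."""
    if not hierarchies:
        return []
    runs, current = [], [hierarchies[0]]
    for value in hierarchies[1:]:
        if value == current[-1] + 1:
            current.append(value)
        else:
            runs.append(current)
            current = [value]
    runs.append(current)
    return runs


def assign_first_call_hierarchies(intervals_by_material):
    """Sort all (sub)intervals by their values; the sequence number of each
    (sub)interval becomes the hierarchy of the first rendering call created
    for the controls whose hierarchies fall inside it."""
    pieces = []
    for material, hierarchies in intervals_by_material.items():
        for run in split_into_contiguous_runs(sorted(hierarchies)):
            pieces.append((material, run))
    pieces.sort(key=lambda piece: piece[1][0])
    return [(material, run, sequence_number)
            for sequence_number, (material, run) in enumerate(pieces, start=1)]
```

With the intervals [1, 2], [3, 5] and [4] from the example above, the interval of group 2 is split into [3] and [5], and the four resulting first rendering calls receive the hierarchies 1 to 4, matching the assignment described in the text.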
According to the algorithm, the hierarchy of the first rendering call is related to the hierarchy of the control and is unrelated to the hierarchy of the panel to which the control belongs; in the related art, the level of the first rendering call is related to both the level of the control and the level of the panel to which the control belongs, so the level of the first rendering call in the embodiment is different from the level of the first rendering call in the related art.
Step 403, for each first rendering call, obtaining the material of the control corresponding to the first rendering call.
Since the control already has the attribute of the material when being created, the terminal can directly read the corresponding material from the control corresponding to the first rendering call.
Step 404: detect whether the material index library includes the material; when the material index library includes the material, perform step 405; when it does not, perform step 408.
Step 405: detect whether the hierarchy list corresponding to the material includes the hierarchy; when the hierarchy list includes the hierarchy, perform step 406; when it does not, perform step 409.
Step 406: detect whether the material index library includes a material instance corresponding to the hierarchy; when it does, perform step 407; when it does not, perform step 410.
In step 407, the material instance corresponding to the level is determined as the material instance matching the level and the material.
Step 408, create the material and the hierarchical list of the material in the material index library, and execute step 406.
The terminal creates the material in a material index base, creates a hierarchical list of the material, and adds the hierarchy of the first rendering call to the hierarchical list.
In one possible implementation, the material may be represented by a parameter V, the hierarchy may be represented by a parameter K, and the hierarchical list may be a combination of a plurality of parameters K. The terminal may create a parameter value for indicating the parameter V of the material, create a parameter value for indicating the parameter K of the level, and create a corresponding relationship between the parameter V and the parameter K to obtain a corresponding relationship between the material and the level list. Wherein the parameter V may include at least one of texture, color, smoothness, transparency, reflectivity, refractive index, and luminosity, and the parameter K may include a level of the first render call.
Step 409: create the hierarchy in the hierarchy list, and perform step 406.
The terminal adds the hierarchy of the first render call to the hierarchy list.
Step 410: create, according to the material, a material instance corresponding to the hierarchy in the material index library, and determine that material instance as the material instance matching the hierarchy and the material.
The terminal can create a material instance according to the material and store the material instance into the material index library corresponding to the hierarchy.
It should be noted that, to simplify the implementation, step 406 may be executed after both step 408 and step 409. Alternatively, the terminal may perform step 410 directly after step 408, since there is certainly no corresponding material instance when the material index library contains no corresponding material and hierarchy list for the first rendering call. Similarly, when the material index library contains no corresponding hierarchy for the first rendering call, there is no corresponding material instance, so the terminal may also perform step 410 directly after step 409.
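Steps 404 to 410 amount to a find-or-create walk over the material index library. The sketch below condenses them in Python; the V/K naming follows the parameter representation suggested above, and create_instance_from_material is a hypothetical hook standing in for however the engine actually instantiates a material:

```python
def find_or_create_instance(index, material_v, hierarchy_k,
                            create_instance_from_material):
    """Return the material instance matching (material V, hierarchy K),
    creating the missing material entry, hierarchy entry or instance on the
    way. `index` maps a material V to a dict of {hierarchy K: instance}."""
    hierarchy_list = index.get(material_v)       # step 404: material known?
    if hierarchy_list is None:
        hierarchy_list = {}                      # step 408: create material + hierarchy list
        index[material_v] = hierarchy_list

    instance = hierarchy_list.get(hierarchy_k)   # steps 405/406: hierarchy and instance known?
    if instance is None:
        # steps 409/410: create the hierarchy entry and its material instance
        instance = create_instance_from_material(material_v)
        hierarchy_list[hierarchy_k] = instance

    return instance                              # step 407: the matching instance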
Step 411, fill the material instance into the first render call.
Step 412, merging the first rendering calls containing the same material example through the second rendering call, and rendering the merged first rendering call to obtain an image frame.
If at least two of the merged first rendering calls correspond to different controls in the same panel, first rendering calls of different controls in the same panel have been merged; if at least two of the merged first rendering calls correspond to controls in different panels, first rendering calls of controls in different panels have been merged. In this way, merging reduces the number of first rendering calls to be rendered, so not every first rendering call has to be rendered separately, which reduces the time consumed in generating the image frame.
Referring to fig. 5, the first rendering call 1 corresponding to panel 1 is the same as the first rendering call 1 in panel 2, and the two first rendering calls may be merged to generate a second rendering call; the first rendering call 3 corresponding to the panel 1 is the same as the first rendering call 2 corresponding to the panel 2, the two first rendering calls can be combined to generate a second rendering call, each of the other first rendering calls generates a second rendering call, and finally the terminal can reduce the original 7 second rendering calls to 5 second rendering calls.
The method provided by this embodiment can be applied to any project built with Unity + NGUI. By modifying the underlying code while keeping the production workflow of the logic layer and the resource production layer unchanged, the numbers of first rendering calls and second rendering calls can be reduced, which improves the generation efficiency of image frames and reduces the power consumption of the terminal. When applied to a game, this embodiment can improve the performance of the game.
Referring to FIG. 6, a configuration page of this embodiment is shown, which is used to configure, on a prefab (pre-fabricated part) of a user interface resource in a game, the group of panels that share the material index, as shown in the dashed box of FIG. 6.
To sum up, in the image frame generation method provided by this embodiment of the present application, because the material index library contains materials, hierarchies and material instances, a plurality of first rendering calls having the same material and hierarchy can be set to share the same material instance in the material index library, that is, the material instances filled into these first rendering calls are identical. The second rendering call can then merge the first rendering calls containing the same material instance when rendering them, which reduces the number of first rendering calls to be rendered and reduces the time consumed in generating an image frame.
When the first rendering calls filled with the same material instance correspond to controls in different panels, the first rendering calls corresponding to the different panels may also be merged to reduce the number of first rendering calls rendered by the second rendering call, thereby reducing the time consumed to generate the image frame.
In addition, since the number of first rendering calls after merging becomes small, the power consumption of the terminal can also be reduced.
When the image frame is not generated for the first time, only the controls which change in the image frame need to be created and rendered, and all the controls in the image frame do not need to be created and rendered, so that the generation efficiency of the image frame can be improved.
Take the Soul Fighting Compass application as an example. When the user starts the application, the terminal generates the home page of the application according to the above method. The user can then initiate a match from the home page; during the match, the terminal generates image frames according to the frame rate and to operation signals triggered by the user, where an operation signal instructs a game character to attack or defend in the match.
Taking the generation of an image frame during a match as an example, refer to fig. 7A and 7B: fig. 7A is a line drawing of the image frame when the related art is used, and fig. 7B is a grayscale view of the image frame when the related art is used.
1) In creating the image frame, the terminal creates an operation panel 710 (indicated by an unshaded broken-line frame in fig. 7A) and respective operation controls in the operation panel 710, and a property panel 720 (indicated by an unshaded broken-line frame in fig. 7A) and respective property controls in the property panel 720. The operation controls are controls for a user to manipulate the game character, such as a forward/backward control 711, a jump control 712, and an attack control 713, and the property controls are controls for showing properties of the game character, such as a character name 721 and a blood bar 722.
2) The terminal creates a first rendering call for each operation control and each attribute control.
3) For each first rendering call, a material instance is generated according to the material of the operation control or attribute control corresponding to the first rendering call, and the material instance is filled into the first rendering call.
4) A second render call is generated for each first render call.
5) And rendering the first rendering call through the second rendering call to obtain an image frame.
As can be seen from fig. 7A and 7B, the number of second render calls at this time is 426.
Referring to fig. 7C and 7D, fig. 7C is a schematic diagram of lines of an image frame when the method provided by the present embodiment is used, and fig. 7D is a schematic diagram of gray scales of the image frame when the method provided by the present embodiment is used.
1) In creating the image frame, the terminal creates an operation panel 710 (indicated by an unshaded broken-line frame in fig. 7C) and respective operation controls in the operation panel 710, and a property panel 720 (indicated by an unshaded broken-line frame in fig. 7C) and respective property controls in the property panel 720. The operation controls are controls for a user to manipulate the game character, such as a forward/backward control 711, a jump control 712, and an attack control 713, and the property controls are controls for showing properties of the game character, such as a character name 721 and a blood bar 722.
2) The terminal creates a first rendering call for each operation control and each attribute control, and determines the hierarchy of the first rendering call according to the hierarchy of the corresponding control. The hierarchy of the first rendering call is positively correlated with the rendering order. For example, in fig. 7C, the hierarchy of the first rendering call corresponding to attribute control 8 is greater than that of the first rendering call corresponding to attribute control 4, so if attribute control 8 intersects the display area of attribute control 4, attribute control 4 covers attribute control 8 after rendering.
3) For each first rendering call, the material of the corresponding control is obtained, a material instance matching the hierarchy and the material is searched for in the material index library, and the material instance is filled into the first rendering call.
4) The first rendering calls containing the same material instance are merged, and a second rendering call is generated for each first rendering call obtained after merging.
5) And rendering the first rendering call through the second rendering call to obtain an image frame.
As can be seen from fig. 7C and 7D, the number of second render calls at this time is 365. The number of overall second render calls is reduced by 61, i.e. 61 first render calls are merged, compared to generating the image frame of fig. 7A and 7B.
Referring to fig. 8, a block diagram of an image frame generating apparatus in an application, which may be applied to a terminal, according to an embodiment of the present application is shown. The image frame generation apparatus in this application includes:
a creating module 810, configured to implement the functions related to creation implied in the above steps 201 and 202 and each step, or implement the functions related to creation implied in the above steps 401 and 402 and each step.
The search module 820 is configured to implement the functions related to search implied in the above step 203 and each step, or implement the functions related to search implied in the above step 403 and 411 and each step.
A generating module 830, configured to implement the functions related to the generation implied in the above step 204 and each step, or implement the functions related to the generation implied in the above step 412 and each step.
To sum up, in the image frame generation apparatus provided by this embodiment of the present application, because the material index library contains materials, hierarchies and material instances, a plurality of first rendering calls having the same material and hierarchy can be set to share the same material instance in the material index library, that is, the material instances filled into these first rendering calls are identical. The second rendering call can then merge the first rendering calls containing the same material instance when rendering them, which reduces the number of first rendering calls to be rendered and reduces the time consumed in generating the image frame.
When the first rendering calls filled with the same material instance correspond to controls in different panels, the first rendering calls corresponding to the different panels may also be merged to reduce the number of first rendering calls rendered by the second rendering call, thereby reducing the time consumed to generate the image frame.
In addition, since the number of first rendering calls after merging becomes small, the power consumption of the terminal can also be reduced.
When the image frame is not generated for the first time, only the controls which change in the image frame need to be created and rendered, and all the controls in the image frame do not need to be created and rendered, so that the generation efficiency of the image frame can be improved.
Fig. 9 shows a block diagram of a terminal 900 according to an exemplary embodiment of the present application. The terminal 900 may be a portable mobile terminal such as a smart phone, a tablet computer, an MP3 player (Moving Picture Experts Group Audio Layer III), an MP4 player (Moving Picture Experts Group Audio Layer IV), a notebook computer, or a desktop computer. Terminal 900 may also be referred to by other names such as user equipment, portable terminal, laptop terminal, desktop terminal, and the like.
In general, terminal 900 includes: a processor 901 and a memory 902.
Processor 901 may include one or more processing cores, such as a 4-core processor, an 8-core processor, and so forth. The processor 901 may be implemented in at least one hardware form of a DSP (Digital Signal Processing), an FPGA (Field-Programmable Gate Array), and a PLA (Programmable Logic Array). The processor 901 may also include a main processor and a coprocessor, where the main processor is a processor for Processing data in an awake state, and is also called a Central Processing Unit (CPU); a coprocessor is a low power processor for processing data in a standby state. In some embodiments, the processor 901 may be integrated with a GPU (Graphics Processing Unit), which is responsible for rendering and drawing the content required to be displayed on the display screen. In some embodiments, the processor 901 may further include an AI (Artificial Intelligence) processor for processing computing operations related to machine learning.
Memory 902 may include one or more computer-readable storage media, which may be non-transitory. The memory 902 may also include high-speed random access memory, as well as non-volatile memory, such as one or more magnetic disk storage devices, flash memory storage devices. In some embodiments, a non-transitory computer readable storage medium in memory 902 is used to store at least one instruction for execution by processor 901 to implement an image frame generation method in an application provided by method embodiments herein.
In some embodiments, terminal 900 can also optionally include: a peripheral interface 903 and at least one peripheral. The processor 901, memory 902, and peripheral interface 903 may be connected by buses or signal lines. Various peripheral devices may be connected to the peripheral interface 903 via a bus, signal line, or circuit board. Specifically, the peripheral device includes: at least one of a radio frequency circuit 904, a touch display screen 905, a camera 906, an audio circuit 907, a positioning component 908, and a power supply 909.
The peripheral interface 903 may be used to connect at least one peripheral related to I/O (Input/Output) to the processor 901 and the memory 902. In some embodiments, the processor 901, memory 902, and peripheral interface 903 are integrated on the same chip or circuit board; in some other embodiments, any one or two of the processor 901, the memory 902 and the peripheral interface 903 may be implemented on a separate chip or circuit board, which is not limited by this embodiment.
The Radio Frequency circuit 904 is used for receiving and transmitting RF (Radio Frequency) signals, also called electromagnetic signals. The radio frequency circuitry 904 communicates with communication networks and other communication devices via electromagnetic signals. The radio frequency circuit 904 converts an electrical signal into an electromagnetic signal to transmit, or converts a received electromagnetic signal into an electrical signal. Optionally, the radio frequency circuit 904 comprises: an antenna system, an RF transceiver, one or more amplifiers, a tuner, an oscillator, a digital signal processor, a codec chipset, a subscriber identity module card, and so forth. The radio frequency circuit 904 may communicate with other terminals via at least one wireless communication protocol. The wireless communication protocols include, but are not limited to: the world wide web, metropolitan area networks, intranets, generations of mobile communication networks (2G, 3G, 4G, and 5G), Wireless local area networks, and/or WiFi (Wireless Fidelity) networks. In some embodiments, the radio frequency circuit 904 may also include NFC (Near Field Communication) related circuits, which are not limited in this application.
The display screen 905 is used to display a UI (User Interface). The UI may include graphics, text, icons, video, and any combination thereof. When the display screen 905 is a touch display screen, the display screen 905 also has the ability to capture touch signals on or over the surface of the display screen 905. The touch signal may be input to the processor 901 as a control signal for processing. At this point, the display 905 may also be used to provide virtual buttons and/or a virtual keyboard, also referred to as soft buttons and/or a soft keyboard. In some embodiments, the display 905 may be one, providing the front panel of the terminal 900; in other embodiments, the number of the display panels 905 may be at least two, and each of the display panels is disposed on a different surface of the terminal 900 or is in a foldable design; in still other embodiments, the display 905 may be a flexible display disposed on a curved surface or a folded surface of the terminal 900. Even more, the display screen 905 may be arranged in a non-rectangular irregular figure, i.e. a shaped screen. The Display panel 905 can be made of LCD (Liquid Crystal Display), OLED (Organic Light-Emitting Diode), and other materials.
The camera assembly 906 is used to capture images or video. Optionally, camera assembly 906 includes a front camera and a rear camera. Generally, a front camera is disposed at a front panel of the terminal, and a rear camera is disposed at a rear surface of the terminal. In some embodiments, the number of the rear cameras is at least two, and each rear camera is any one of a main camera, a depth-of-field camera, a wide-angle camera and a telephoto camera, so that the main camera and the depth-of-field camera are fused to realize a background blurring function, and the main camera and the wide-angle camera are fused to realize panoramic shooting and VR (Virtual Reality) shooting functions or other fusion shooting functions. In some embodiments, camera assembly 906 may also include a flash. The flash lamp can be a monochrome temperature flash lamp or a bicolor temperature flash lamp. The double-color-temperature flash lamp is a combination of a warm-light flash lamp and a cold-light flash lamp, and can be used for light compensation at different color temperatures.
Audio circuit 907 may include a microphone and a speaker. The microphone is used for collecting sound waves of a user and the environment, converting the sound waves into electric signals, and inputting the electric signals to the processor 901 for processing, or inputting the electric signals to the radio frequency circuit 904 for realizing voice communication. For stereo sound acquisition or noise reduction purposes, the microphones may be multiple and disposed at different locations of the terminal 900. The microphone may also be an array microphone or an omni-directional pick-up microphone. The speaker is used to convert electrical signals from the processor 901 or the radio frequency circuit 904 into sound waves. The loudspeaker can be a traditional film loudspeaker or a piezoelectric ceramic loudspeaker. When the speaker is a piezoelectric ceramic speaker, the speaker can be used for purposes such as converting an electric signal into a sound wave audible to a human being, or converting an electric signal into a sound wave inaudible to a human being to measure a distance. In some embodiments, audio circuit 907 may also include a headphone jack.
The positioning component 908 is used to locate the current geographic location of the terminal 900 for navigation or LBS (Location Based Service). The positioning component 908 may be a positioning component based on the Global Positioning System (GPS) of the United States, the BeiDou system of China, or the Galileo system of Europe.
Power supply 909 is used to provide power to the various components in terminal 900. The power source 909 may be alternating current, direct current, disposable or rechargeable. When the power source 909 includes a rechargeable battery, the rechargeable battery may be a wired rechargeable battery or a wireless rechargeable battery. The wired rechargeable battery is a battery charged through a wired line, and the wireless rechargeable battery is a battery charged through a wireless coil. The rechargeable battery may also be used to support fast charge technology.
In some embodiments, terminal 900 can also include one or more sensors 910. The one or more sensors 910 include, but are not limited to: acceleration sensor 911, gyro sensor 912, pressure sensor 913, fingerprint sensor 914, optical sensor 915, and proximity sensor 916.
The acceleration sensor 911 can detect the magnitude of acceleration in three coordinate axes of the coordinate system established with the terminal 900. For example, the acceleration sensor 911 may be used to detect the components of the gravitational acceleration in three coordinate axes. The processor 901 can control the touch display 905 to display the user interface in a landscape view or a portrait view according to the gravitational acceleration signal collected by the acceleration sensor 911. The acceleration sensor 911 may also be used for acquisition of motion data of a game or a user.
The gyro sensor 912 may detect a body direction and a rotation angle of the terminal 900, and the gyro sensor 912 may cooperate with the acceleration sensor 911 to acquire a 3D motion of the user on the terminal 900. The processor 901 can implement the following functions according to the data collected by the gyro sensor 912: motion sensing (such as changing the UI according to a user's tilting operation), image stabilization at the time of photographing, game control, and inertial navigation.
Pressure sensors 913 may be disposed on the side bezel of terminal 900 and/or underneath touch display 905. When the pressure sensor 913 is disposed on the side frame of the terminal 900, the user's holding signal of the terminal 900 may be detected, and the processor 901 performs left-right hand recognition or shortcut operation according to the holding signal collected by the pressure sensor 913. When the pressure sensor 913 is disposed at a lower layer of the touch display 905, the processor 901 controls the operability control on the UI interface according to the pressure operation of the user on the touch display 905. The operability control comprises at least one of a button control, a scroll bar control, an icon control and a menu control.
The fingerprint sensor 914 collects the user's fingerprint, and either the processor 901 or the fingerprint sensor 914 identifies the user from the collected fingerprint. When the user's identity is recognized as trusted, the processor 901 authorizes the user to perform sensitive operations, including unlocking the screen, viewing encrypted information, downloading software, making payments, and changing settings. The fingerprint sensor 914 may be disposed on the front, back, or side of the terminal 900. When a physical button or a vendor logo is provided on the terminal 900, the fingerprint sensor 914 may be integrated with the physical button or the vendor logo.
The optical sensor 915 collects the ambient light intensity. In one embodiment, the processor 901 may control the display brightness of the touch display 905 according to the ambient light intensity collected by the optical sensor 915: when the ambient light intensity is high, the display brightness of the touch display 905 is increased; when the ambient light intensity is low, the display brightness is decreased. In another embodiment, the processor 901 may also dynamically adjust the shooting parameters of the camera assembly 906 according to the ambient light intensity collected by the optical sensor 915.
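A minimal sketch of the brightness adjustment described above, assuming a simple linear mapping from ambient lux to a normalized brightness level (the mapping, range, and limits are illustrative only):

```python
# Hypothetical sketch: map ambient light intensity (lux) to a display
# brightness level in [min_b, max_b]. The 0-1000 lux working range and the
# linear mapping are illustrative assumptions.

def brightness_from_lux(lux: float, min_b: float = 0.1, max_b: float = 1.0) -> float:
    clamped = min(max(lux, 0.0), 1000.0)           # clamp to the working range
    return min_b + (max_b - min_b) * clamped / 1000.0

print(brightness_from_lux(50))    # dim environment  -> low brightness (0.145)
print(brightness_from_lux(800))   # bright environment -> high brightness (0.82)
```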
The proximity sensor 916, also known as a distance sensor, is typically disposed on the front panel of the terminal 900 and collects the distance between the user and the front face of the terminal 900. In one embodiment, when the proximity sensor 916 detects that this distance is gradually decreasing, the processor 901 controls the touch display 905 to switch from the bright-screen state to the dark-screen state; when the proximity sensor 916 detects that the distance is gradually increasing, the processor 901 controls the touch display 905 to switch from the dark-screen state back to the bright-screen state.
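A minimal sketch of the screen-state switching described above, assuming the processor only sees successive distance readings (state names and the trend test are illustrative assumptions):

```python
# Hypothetical sketch: toggle the screen state from the trend of the
# user-to-screen distance reported by a proximity sensor.

def next_screen_state(prev_distance: float, distance: float, state: str) -> str:
    if distance < prev_distance and state == "bright":
        return "dark"    # user is approaching the screen (e.g. during a call)
    if distance > prev_distance and state == "dark":
        return "bright"  # user is moving away from the screen
    return state

state = "bright"
state = next_screen_state(10.0, 3.0, state)   # -> "dark"
state = next_screen_state(3.0, 12.0, state)   # -> "bright"
```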
Those skilled in the art will appreciate that the configuration shown in fig. 9 does not constitute a limitation of terminal 900, and may include more or fewer components than those shown, or may combine certain components, or may employ a different arrangement of components.
An embodiment of the present application provides a computer-readable storage medium having stored therein at least one instruction, at least one program, a code set, or an instruction set, which is loaded and executed by a processor to implement the image frame generation method in an application as described above.
One embodiment of the present application provides an image frame generation device in an application, which includes a processor and a memory, where the memory stores at least one instruction, and the instruction is loaded and executed by the processor to implement the image frame generation method in an application as described above.
It should be noted that the division into the functional modules described above is merely an example used when the image frame generation device in the application generates an image frame. In practical applications, the functions may be distributed among different functional modules as needed; that is, the internal structure of the image frame generation device may be divided into different functional modules to complete all or part of the functions described above. In addition, the image frame generation device and the image frame generation method provided by the above embodiments belong to the same concept; the specific implementation process of the device is described in detail in the method embodiments and is not repeated here.
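For illustration only, one possible arrangement of the functional modules described above (class and method names are hypothetical, and other splits are equally valid):

```python
# Hypothetical sketch of the functional-module split described above.
# Names are illustrative, not the actual implementation of the device.

class CreatingModule:
    def create_panels_and_controls(self, frame_spec):
        ...  # create panels and the controls in each panel

    def create_first_render_calls(self, panels):
        ...  # group controls by material and contiguous hierarchy interval

class SearchingModule:
    def fill_material_instances(self, render_calls, material_index):
        ...  # look up (or create) a material instance per rendering call

class GenerationModule:
    def merge_and_render(self, render_calls):
        ...  # merge calls sharing a material instance, then render the frame

class ImageFrameGenerator:
    """Composes the three modules; the split itself is a design choice."""
    def __init__(self):
        self.creator = CreatingModule()
        self.searcher = SearchingModule()
        self.generator = GenerationModule()
```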
Those skilled in the art will understand that all or part of the steps of the above embodiments may be implemented by hardware, or by a program instructing the relevant hardware; the program may be stored in a computer-readable storage medium, such as a read-only memory, a magnetic disk, or an optical disc.
The above description is not intended to limit the present application, and any modifications, equivalents, improvements, etc. made within the spirit and principle of the present application should be included in the scope of the present application.

Claims (11)

1. A method of image frame generation in an application, the method comprising:
creating at least two panels in an image frame to be generated and a control in each panel of the at least two panels;
for each panel in the at least two panels, obtaining the material of each control in the panel;
for each material, obtaining the hierarchy of each control having the material, to obtain a hierarchy interval corresponding to the material;
when the values in the hierarchy interval are contiguous, creating a corresponding first rendering call for all of the controls having the material, and determining the hierarchy of the first rendering call according to the hierarchy interval, wherein the hierarchy of the first rendering call is independent of the hierarchies of the at least two panels;
when the values in the hierarchy interval are not contiguous, dividing the hierarchy interval into at least two hierarchy sub-intervals, creating a corresponding first rendering call for the controls that have the material and whose hierarchies belong to the same hierarchy sub-interval, and determining the hierarchy of each first rendering call according to its hierarchy sub-interval, wherein the values in each hierarchy sub-interval are contiguous;
for each first rendering call, obtaining the material of the control corresponding to the first rendering call, searching a material index library for a material instance matching the hierarchy and the material, and filling the material instance into the first rendering call; and
merging, by a second rendering call, first rendering calls that contain the same material instance, and rendering the merged first rendering calls to obtain the image frame.
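For illustration only (not part of the claims): a minimal Python sketch of the grouping in claim 1, under the assumptions that a control carries a material name and an integer hierarchy, and that the hierarchy of a rendering call is taken as the lowest value of its contiguous sub-interval. All names and data structures are hypothetical.

```python
# Illustrative sketch of the grouping in claim 1 (hypothetical names; the
# patent does not prescribe this code). One "first rendering call" is
# created per contiguous run of hierarchy values that share a material.

from collections import defaultdict
from dataclasses import dataclass, field

@dataclass
class Control:
    name: str
    material: str
    hierarchy: int

@dataclass
class FirstRenderCall:
    material: str
    hierarchy: int                      # assumed: lowest value of the run
    controls: list = field(default_factory=list)

def contiguous_subintervals(levels):
    """Split a set of hierarchy values into sorted contiguous runs."""
    levels = sorted(set(levels))
    runs, run = [], [levels[0]]
    for v in levels[1:]:
        if v == run[-1] + 1:
            run.append(v)
        else:
            runs.append(run)
            run = [v]
    runs.append(run)
    return runs

def build_first_render_calls(panel_controls):
    by_material = defaultdict(list)
    for c in panel_controls:
        by_material[c.material].append(c)
    calls = []
    for material, ctrls in by_material.items():
        for run in contiguous_subintervals([c.hierarchy for c in ctrls]):
            call = FirstRenderCall(material=material, hierarchy=min(run))
            call.controls = [c for c in ctrls if c.hierarchy in run]
            calls.append(call)
    return calls

# Example: hierarchies 1 and 2 are contiguous, 5 is not -> two rendering calls.
panel = [Control("a", "ui_atlas", 1), Control("b", "ui_atlas", 2),
         Control("c", "ui_atlas", 5)]
for call in build_first_render_calls(panel):
    print(call.material, call.hierarchy, [c.name for c in call.controls])
```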
2. The method of claim 1, wherein searching the material index library for a material instance matching the hierarchy and the material comprises:
detecting whether the material index library includes the material;
when the material index library includes the material, detecting whether a hierarchy list corresponding to the material includes the hierarchy;
when the hierarchy list includes the hierarchy, detecting whether the material index library includes a material instance corresponding to the hierarchy; and
when the material index library includes the material instance corresponding to the hierarchy, determining the material instance corresponding to the hierarchy as the material instance matching the hierarchy and the material.
3. The method of claim 2, further comprising, after detecting whether the material index library includes the material:
when the material index library does not include the material, creating the material and a hierarchy list of the material in the material index library, and creating the hierarchy in the hierarchy list; and
triggering execution of the step of detecting whether the material index library includes the material instance corresponding to the hierarchy.
4. The method of claim 2, further comprising, after detecting whether the hierarchy list corresponding to the material includes the hierarchy:
when the hierarchy list does not include the hierarchy, creating the hierarchy in the hierarchy list; and
triggering execution of the step of detecting whether the material index library includes the material instance corresponding to the hierarchy.
5. The method of claim 3 or 4, further comprising, after detecting whether the material index library includes the material instance corresponding to the hierarchy:
when the material index library does not include the material instance corresponding to the hierarchy, creating, according to the material, a material instance corresponding to the hierarchy in the material index library, and determining the created material instance as the material instance matching the hierarchy and the material.
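For illustration only (not part of the claims): claims 2 to 5 together describe a lookup-or-create flow over the material index library. The sketch below assumes a nested mapping from material to hierarchy to material instance; the layout and names are illustrative, not the claimed data structure.

```python
# Illustrative lookup-or-create flow for the material index library in
# claims 2-5. The nested-dict layout and the names are assumptions.

class MaterialIndexLibrary:
    def __init__(self):
        # material -> {hierarchy -> material instance}
        self._index = {}

    def find_or_create(self, material, hierarchy, create_instance):
        # Claim 3: create the material and its hierarchy list if absent.
        hierarchy_list = self._index.setdefault(material, {})
        # Claim 4: create the hierarchy entry if absent.
        if hierarchy not in hierarchy_list:
            hierarchy_list[hierarchy] = None
        # Claim 5: create the material instance if absent, otherwise reuse it.
        if hierarchy_list[hierarchy] is None:
            hierarchy_list[hierarchy] = create_instance(material, hierarchy)
        return hierarchy_list[hierarchy]

library = MaterialIndexLibrary()
make = lambda m, h: f"{m}@{h}"          # stand-in for real instance creation
assert library.find_or_create("ui_atlas", 2, make) == "ui_atlas@2"
assert library.find_or_create("ui_atlas", 2, make) == "ui_atlas@2"  # reused
```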
6. The method of claim 1, wherein:
when the image frame is generated for the first time, the created controls are all of the controls in the panel; and
when the image frame is not generated for the first time, the created controls are the controls in the panel that have changed relative to the previously generated image frame.
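For illustration only (not part of the claims): claim 6 amounts to building every control for the first frame and only the changed controls afterwards. The change test below (comparing per-control state between frames) is an assumed criterion for the sketch.

```python
# Illustrative sketch of claim 6: create all controls for the first image
# frame, and only the controls that changed relative to the previous frame
# thereafter. The per-control state comparison is an assumption.

def controls_to_create(panel_state, previous_state=None):
    if previous_state is None:                      # first image frame
        return list(panel_state)
    return [name for name, state in panel_state.items()
            if previous_state.get(name) != state]   # changed controls only

frame1 = {"hp_bar": 100, "score": 0}
frame2 = {"hp_bar": 90, "score": 0}
print(controls_to_create(frame1))            # ['hp_bar', 'score']
print(controls_to_create(frame2, frame1))    # ['hp_bar']
```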
7. The method of claim 1, wherein the controls are components that can be displayed in the image frame, and wherein the panel is a collection of controls in the image frame that have the same property.
8. The method of claim 1, wherein the first rendering call is a rendering call for a user interface, and the second rendering call is a global rendering call.
9. An apparatus for generating image frames in an application, the apparatus comprising:
a creating module, configured to create at least two panels in an image frame to be generated and controls in each panel of the at least two panels;
the creating module being further configured to: obtain, for each panel of the at least two panels, the material of each control in the panel; for each material, obtain the hierarchy of each control having the material, to obtain a hierarchy interval corresponding to the material; when the values in the hierarchy interval are contiguous, create a corresponding first rendering call for all of the controls having the material, and determine the hierarchy of the first rendering call according to the hierarchy interval, wherein the hierarchy of the first rendering call is independent of the hierarchies of the at least two panels; and when the values in the hierarchy interval are not contiguous, divide the hierarchy interval into at least two hierarchy sub-intervals, create a corresponding first rendering call for the controls that have the material and whose hierarchies belong to the same hierarchy sub-interval, and determine the hierarchy of each first rendering call according to its hierarchy sub-interval, wherein the values in each hierarchy sub-interval are contiguous;
a searching module, configured to obtain, for each first rendering call created by the creating module, the material of the control corresponding to the first rendering call, search a material index library for a material instance matching the hierarchy and the material, and fill the material instance into the first rendering call; and
a generation module, configured to merge, by a second rendering call, first rendering calls that contain the same material instance, and render the merged first rendering calls to obtain the image frame.
10. A computer-readable storage medium having stored therein at least one instruction, at least one program, a code set, or an instruction set, which is loaded and executed by a processor to implement the image frame generation method in an application according to any one of claims 1 to 8.
11. An image frame generation device in an application, comprising a processor and a memory, wherein the memory stores at least one instruction, and the instruction is loaded and executed by the processor to implement the image frame generation method in an application according to any one of claims 1 to 8.
CN201810186248.XA 2018-03-07 2018-03-07 Image frame generation method, device, equipment and storage medium in application Active CN108363569B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810186248.XA CN108363569B (en) 2018-03-07 2018-03-07 Image frame generation method, device, equipment and storage medium in application

Publications (2)

Publication Number Publication Date
CN108363569A CN108363569A (en) 2018-08-03
CN108363569B true CN108363569B (en) 2021-06-11

Family

ID=63003822

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810186248.XA Active CN108363569B (en) 2018-03-07 2018-03-07 Image frame generation method, device, equipment and storage medium in application

Country Status (1)

Country Link
CN (1) CN108363569B (en)

Families Citing this family (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109509057A (en) * 2018-10-22 2019-03-22 美宅科技(北京)有限公司 A kind of indoor design method and device
CN109961498B (en) * 2019-03-28 2022-12-13 腾讯科技(深圳)有限公司 Image rendering method, device, terminal and storage medium
CN110109667A (en) * 2019-04-30 2019-08-09 广东趣炫网络股份有限公司 A kind of interface UI draw method of calibration, device, terminal and computer storage medium
CN110633121A (en) * 2019-09-05 2019-12-31 北京无限光场科技有限公司 Interface rendering method and device, terminal equipment and medium
CN111061480B (en) * 2019-12-27 2023-08-25 珠海金山数字网络科技有限公司 Method and device for rendering multi-layer material based on NGUI
CN111240674B (en) * 2020-01-09 2023-03-28 上海米哈游天命科技有限公司 Parameter modification method, device, terminal and storage medium
CN112364496B (en) * 2020-11-03 2024-01-30 中国航空无线电电子研究所 Avionics simulation panel generation system based on HTML5 and VUE technologies
CN114494546A (en) * 2020-11-13 2022-05-13 华为技术有限公司 Data processing method and device and electronic equipment
CN114064039A (en) * 2020-12-22 2022-02-18 完美世界(北京)软件科技发展有限公司 Rendering pipeline creating method and device, storage medium and computing equipment

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9779533B2 (en) * 2014-01-27 2017-10-03 Nvidia Corporation Hierarchical tiled caching
CN106227513B (en) * 2016-07-12 2019-04-30 华自科技股份有限公司 Method and system based on graphic configuration under mobile platform

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6583787B1 (en) * 2000-02-28 2003-06-24 Mitsubishi Electric Research Laboratories, Inc. Rendering pipeline for surface elements
CN102708537A (en) * 2011-03-03 2012-10-03 Arm有限公司 Graphics processing
CN104751507A (en) * 2013-12-31 2015-07-01 北界创想(北京)软件有限公司 Method and device for rendering pattern contents
CN106296785A (en) * 2016-08-09 2017-01-04 腾讯科技(深圳)有限公司 A kind of picture rendering intent and picture rendering apparatus
CN107292960A (en) * 2017-06-30 2017-10-24 浙江科澜信息技术有限公司 A kind of Local hydrodynamic unit method that large-scale terrain is rendered in three-dimensional scenic

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
Yu Yu, "Unity Mobile Game Performance Optimization: Starting from Contra" (video), Tencent Game Academy lecture series "伴梦前行-论道", http://gad.qq.com/lundao/detail/10000136, 2018-02-03, minutes 20-31 of the video *
"Unity Mobile Game Performance Optimization: Starting from Contra" (notes), StormLikeMe, CSDN Blog, 2018-02-03, full text *
Yu Yu, "Unity Mobile Game Performance Optimization: Starting from Contra", Tencent Game Academy "伴梦前行-论道", http://gad.qq.com/lundao/detail/10000136, 2018, minutes 20-31 *

Also Published As

Publication number Publication date
CN108363569A (en) 2018-08-03

Similar Documents

Publication Publication Date Title
CN108363569B (en) Image frame generation method, device, equipment and storage medium in application
CN108304265B (en) Memory management method, device and storage medium
CN110841285B (en) Interface element display method and device, computer equipment and storage medium
CN109815150B (en) Application testing method and device, electronic equipment and storage medium
CN108132790B (en) Method, apparatus and computer storage medium for detecting a garbage code
CN111569435B (en) Ranking list generation method, system, server and storage medium
CN111694834A (en) Method, device and equipment for putting picture data into storage and readable storage medium
CN110673944B (en) Method and device for executing task
CN108492339B (en) Method and device for acquiring resource compression packet, electronic equipment and storage medium
CN111782950A (en) Sample data set acquisition method, device, equipment and storage medium
CN111275607A (en) Interface display method and device, computer equipment and storage medium
CN112717393B (en) Virtual object display method, device, equipment and storage medium in virtual scene
CN111641853B (en) Multimedia resource loading method and device, computer equipment and storage medium
CN111258673A (en) Fast application display method and terminal equipment
CN108881715B (en) Starting method and device of shooting mode, terminal and storage medium
CN113076452A (en) Application classification method, device, equipment and computer readable storage medium
CN112230781A (en) Character recommendation method and device and storage medium
CN110458289B (en) Multimedia classification model construction method, multimedia classification method and device
CN115379274B (en) Picture-based interaction method and device, electronic equipment and storage medium
CN111866047B (en) Data decoding method, device, computer equipment and storage medium
CN113590669B (en) Method and device for generating cost report forms and computer storage medium
CN110543305B (en) Method and device for replacing easy UI component
CN111526221B (en) Domain name quality determining method, device and storage medium
CN112131340B (en) Character string detection method, device and storage medium
CN111580892B (en) Method, device, terminal and storage medium for calling service components

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant