CN114073858A - Data processing method, device and equipment and readable storage medium


Info

Publication number
CN114073858A
CN114073858A
Authority
CN
China
Prior art keywords
layer
rendering
quality
target
block
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010827039.6A
Other languages
Chinese (zh)
Inventor
叶敏华
杨宇
闫龙阁
Current Assignee
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd filed Critical Tencent Technology Shenzhen Co Ltd
Priority to CN202010827039.6A
Publication of CN114073858A

Classifications

    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/60: Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F8/00: Arrangements for software engineering
    • G06F8/30: Creation or generation of source code
    • G06F8/38: Creation or generation of source code for implementing user interfaces


Abstract

The embodiments of the present application disclose a data processing method, apparatus, and device, and a readable storage medium, belonging to the field of cloud computing. The method includes: in response to a trigger operation on a target control, generating a layer rendering instruction and sending the layer rendering instruction to a terminal rendering component and a service server, the service server being a cloud server; calling a first cache component and a second cache component; when a first layer exists in the first cache component and a second layer exists in the second cache component, determining, according to the first layer and the second layer, a target layer for responding to the trigger operation, where the first layer is obtained by the terminal rendering component rendering target service data according to the layer rendering instruction, and the second layer is obtained by the service server rendering the target service data according to the layer rendering instruction; and outputting the target layer in a display interface. With the method and apparatus, picture quality can be improved while smooth operation of the application client is ensured.

Description

Data processing method, device and equipment and readable storage medium
Technical Field
The present application relates to the field of cloud computing, and in particular, to a data processing method, apparatus, device, and readable storage medium.
Background
The rapid development of electronic device technology and the popularity of the Internet have created a dramatic growth opportunity for the gaming industry, which exists and operates by means of electronic devices. In particular, since the emergence of intelligent terminals represented by smartphones and tablet computers, the development potential of the gaming industry has become even more prominent.
In the prior art, in-game rendering generally offers several quality levels, such as Ultra HD, HD, SD, and Smooth, and rendering mainly depends on the local intelligent terminal. Because rendering depends only on the local terminal, when the terminal's computing capability cannot support the higher levels (such as Ultra HD or HD), the picture may stutter while the game is running, and smooth operation cannot be guaranteed. To guarantee fluency during operation, the terminal can instead render locally at a lower level (such as SD or Smooth), but this approach cannot bring the user picture quality with higher definition.
Disclosure of Invention
The embodiment of the application provides a data processing method, a data processing device, data processing equipment and a readable storage medium, and the image quality can be improved while the smooth operation of an application client is ensured.
An embodiment of the present application provides a data processing method, including:
in response to a trigger operation on a target control, generating a layer rendering instruction, and sending the layer rendering instruction to a terminal rendering component and a service server;
calling a first cache component and a second cache component; the first cache component is used for storing layers rendered by the terminal rendering component; the second cache component is used for storing layers rendered by the service server;
when a first layer exists in the first cache component and a second layer exists in the second cache component, determining a target layer for responding to the trigger operation according to the first layer and the second layer; the first layer is obtained by the terminal rendering component rendering target service data according to the layer rendering instruction; the second layer is obtained by the service server rendering the target service data according to the layer rendering instruction; the target service data is the service data triggered by the target control;
and outputting the target layer in the display interface.
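The four steps above can be illustrated with a minimal Python sketch. This is not from the patent itself: the class name `DualRenderPipeline`, the dict-based layer representation, and the numeric `quality` field are all assumptions made for illustration.

```python
# Illustrative sketch of the claimed dual-rendering flow; all names are hypothetical.

class DualRenderPipeline:
    def __init__(self, terminal_renderer, service_server):
        self.terminal_renderer = terminal_renderer  # local terminal rendering component
        self.service_server = service_server        # cloud-side service server renderer
        self.first_cache = {}   # stores layers rendered by the terminal component
        self.second_cache = {}  # stores layers rendered by the service server

    def on_trigger(self, control_id):
        # Step 1: generate a layer rendering instruction and send it to both renderers.
        instruction = {"control": control_id}
        self.first_cache[control_id] = self.terminal_renderer(instruction)
        self.second_cache[control_id] = self.service_server(instruction)
        # Steps 2-3: call both caches and, when both layers exist,
        # determine the target layer from the first and second layers.
        first = self.first_cache.get(control_id)
        second = self.second_cache.get(control_id)
        if first is not None and second is not None:
            # Step 4: output the higher-quality layer as the target layer.
            return second if second["quality"] > first["quality"] else first
        return first or second
```

In a real client the two render calls would be asynchronous; they are shown synchronously here only to keep the control flow visible.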
An embodiment of the present application provides a data processing apparatus, including:
the instruction generating module is used for generating a layer rendering instruction in response to the trigger operation on the target control;
the instruction sending module is used for sending the layer rendering instruction to the terminal rendering component and the service server;
the component calling module is used for calling the first cache component and the second cache component; the first cache component is used for storing layers rendered by the terminal rendering component; the second cache component is used for storing layers rendered by the service server;
the target layer generation module is used for determining, when a first layer exists in the first cache component and a second layer exists in the second cache component, a target layer for responding to the trigger operation according to the first layer and the second layer; the first layer is obtained by the terminal rendering component rendering the target service data according to the layer rendering instruction; the second layer is obtained by the service server rendering the target service data according to the layer rendering instruction; the target service data is the service data triggered by the target control;
And the layer output module is used for outputting the target layer in the display interface.
The layer rendering instruction comprises a first rendering instruction and a second rendering instruction;
the instruction generation module comprises:
the parameter detection unit is used for responding to the trigger operation aiming at the target control and detecting the component performance parameters of the terminal rendering component;
the instruction generating unit is used for determining a terminal rendering quality grade according to the component performance parameters and generating a first rendering instruction containing the terminal rendering quality grade;
the instruction generating unit is also used for generating a second rendering instruction aiming at the business server;
an instruction sending module, comprising:
the first instruction sending unit is used for sending the first rendering instruction to the terminal rendering component, and rendering the target service data through the terminal rendering component to obtain a first layer matched with the rendering quality grade of the terminal;
and the second instruction sending unit is used for sending the second rendering instruction to the service server so that the service server renders the target service data according to the second rendering instruction to obtain a second layer.
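The split between the two rendering instructions can be sketched as follows. The quality tiers, the normalized performance score, and the function names are assumptions for illustration; the patent only specifies that the first instruction carries a terminal rendering quality grade while the second does not.

```python
# Hypothetical sketch of the instruction-generation path.

QUALITY_TIERS = ["270P", "480P", "720P"]  # low -> high; tier names are illustrative

def pick_terminal_quality(component_performance: float) -> str:
    """Map a normalized component performance parameter (0.0-1.0) to a tier."""
    index = min(int(component_performance * len(QUALITY_TIERS)), len(QUALITY_TIERS) - 1)
    return QUALITY_TIERS[index]

def generate_instructions(control_id: str, component_performance: float):
    # The first rendering instruction carries the terminal rendering quality grade;
    # the second instruction, sent to the service server, carries no grade,
    # since the server renders at its default (high) quality each time.
    first = {"control": control_id, "quality": pick_terminal_quality(component_performance)}
    second = {"control": control_id}
    return first, second
```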
The target layer generation module includes:
the layer quality acquiring unit is used for acquiring, when the first layer exists in the first cache component and the second layer exists in the second cache component, a first layer quality corresponding to the first layer and a second layer quality corresponding to the second layer; the first layer quality corresponds to the terminal rendering quality grade, and the second layer quality corresponds to a second rendering quality grade;
the first target layer determining unit is used for determining the first layer as the target layer if the first layer quality is greater than the second layer quality;
the first target layer determining unit is further configured to determine the second layer as the target layer if the first layer quality is less than the second layer quality.
The first layer is composed of at least two first layer blocks; the second layer is composed of at least one second layer block;
the target layer generation module comprises:
the layer block acquiring unit is used for acquiring at least one second layer block when a first layer exists in the first cache component, a second layer exists in the second cache component, and the second layer is an incomplete layer;
the associated layer block acquiring unit is configured to acquire, as an associated layer block, a first layer block that has the same pixel position as a second layer block among the at least two first layer blocks;
the layer block replacing unit is used for replacing the associated layer block in the first layer with the second layer block;
and the second target layer determining unit is used for determining the replaced first layer as the target layer.
Wherein, the associated layer block acquiring unit includes:
the number acquiring subunit is configured to acquire a first tag number corresponding to each first layer block and a second tag number corresponding to each second layer block; the first tag number is used for representing the pixel position of the first layer block in the first layer, and the second tag number is used for representing the pixel position of the second layer block in the second layer;
the associated number acquiring subunit is configured to determine, among the first tag numbers corresponding to the at least two first layer blocks, a first tag number that is the same as a second tag number, and determine the first tag number that is the same as the second tag number as an associated tag number;
and the associated layer block determining subunit is used for determining the first layer block corresponding to the associated tag number as the associated layer block.
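The tag-number matching above can be sketched in a few lines of Python. The dict-based block representation and the `tag` field name are assumptions; the patent only requires that blocks carrying the same tag number occupy the same pixel position.

```python
# Illustrative sketch of finding associated layer blocks by tag number.

def find_associated_blocks(first_blocks, second_blocks):
    """Return {tag: first_block} for every first-layer block whose tag number
    (i.e. pixel position) also appears among the second-layer blocks."""
    second_tags = {block["tag"] for block in second_blocks}
    return {
        block["tag"]: block
        for block in first_blocks
        if block["tag"] in second_tags  # same tag number => same pixel position
    }
```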
Wherein, the layer block replacement unit includes:
the first quality determination subunit is used for acquiring the layer quality corresponding to the associated layer block and the layer quality corresponding to the second layer block;
and the layer block replacing subunit is used for replacing the associated layer block in the first layer with the second layer block if the layer quality corresponding to the second layer block is greater than the layer quality corresponding to the associated layer block.
Wherein, the layer block replacing unit includes:
the second quality determination subunit is used for acquiring the layer quality corresponding to the associated layer block and the layer quality corresponding to the second layer block;
the visual parameter adjusting subunit is configured to, if the layer quality corresponding to the second layer block is greater than the layer quality corresponding to the associated layer block, adjust the visual parameter corresponding to the associated layer block to a default parameter to obtain an adjusted associated layer block, that is, to adjust the associated layer block to a layer block in a transparent state;
and the layer block covering unit is used for overlaying the second layer block on the adjusted associated layer block in the first layer.
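The quality-conditioned block fusion described above can be sketched as follows. This is a simplified model, not the patent's implementation: making the associated block transparent and overlaying the second block is collapsed here into a direct replacement, and the `quality` field is an assumed representation.

```python
# Hypothetical sketch of fusing an incomplete, server-rendered second layer
# into the terminal-rendered first layer, block by block.

def fuse_layers(first_blocks, second_blocks):
    """first_blocks: {tag: {"quality": int, ...}}; second_blocks likewise.
    Returns the fused (replaced) first layer as {tag: block}."""
    fused = dict(first_blocks)
    for tag, second_block in second_blocks.items():
        associated = fused.get(tag)  # block at the same pixel position, if any
        if associated is not None and second_block["quality"] > associated["quality"]:
            # Equivalent to setting the associated block's visual parameters to
            # a default (transparent) state and overlaying the second block.
            fused[tag] = second_block
    return fused
```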
Wherein, the apparatus further includes:
the first target layer determining module is used for determining the first layer as the target layer when the first layer exists in the first cache component and no second layer exists in the second cache component.
Wherein, the apparatus further includes:
the second target layer determining module is used for determining the second layer as the target layer when no first layer exists in the first cache component and the second layer exists in the second cache component.
Wherein, the apparatus further includes:
the third target layer determining module is used for generating a lag prompt message and outputting the lag prompt message in the display interface when no first layer exists in the first cache component and no second layer exists in the second cache component.
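The four cache states covered by the modules above reduce to a small decision function. The prompt text and the dict representation are assumptions for illustration.

```python
# Illustrative sketch of the four cache states: both layers, first only,
# second only, or neither (which yields a lag prompt).

LAG_PROMPT = "Rendering is delayed, please wait"  # hypothetical message text

def select_target_layer(first_layer, second_layer):
    if first_layer is not None and second_layer is not None:
        # Both layers exist: keep the one with the higher layer quality.
        return max(first_layer, second_layer, key=lambda layer: layer["quality"])
    if first_layer is not None:
        return first_layer          # only the terminal-rendered layer exists
    if second_layer is not None:
        return second_layer         # only the server-rendered layer exists
    return {"prompt": LAG_PROMPT}   # neither exists: output a lag prompt
```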
Wherein, the apparatus further includes:
the operation parameter detection module is used for detecting, in the display process of the target layer, a network quality parameter and a terminal operation parameter of the user terminal; the user terminal includes the terminal rendering component;
the performance parameter acquisition module is used for acquiring, if the network quality parameter and the terminal operation parameter do not meet the operation condition, a terminal performance parameter corresponding to the user terminal, and determining a recommended rendering quality grade according to the network quality parameter and the terminal performance parameter; the layer quality corresponding to the recommended rendering quality grade is less than the layer quality corresponding to the target layer;
the recommendation information generation module is used for generating quality recommendation information according to the recommended rendering quality grade and outputting the quality recommendation information in a display interface;
the updating instruction generating module is used for responding to quality conversion confirmation operation aiming at the quality recommendation information, generating an updating layer rendering instruction containing a recommended rendering quality grade, and sending the updating layer rendering instruction to the terminal rendering component;
the updated layer output module is used for receiving an updated layer returned by the terminal rendering component according to the updated layer rendering instruction and outputting the updated layer in the display interface; and the layer quality of the updated layer is matched with the recommended rendering quality grade.
An aspect of an embodiment of the present application provides a computer device, including: a processor and a memory;
the memory stores a computer program that, when executed by the processor, causes the processor to perform the method in the embodiments of the present application.
An aspect of the embodiments of the present application provides a computer-readable storage medium, in which a computer program is stored, where the computer program includes program instructions, and the program instructions, when executed by a processor, perform the method in the embodiments of the present application.
In one aspect of the application, a computer program product or computer program is provided, the computer program product or computer program comprising computer instructions stored in a computer readable storage medium. The processor of the computer device reads the computer instructions from the computer-readable storage medium, and the processor executes the computer instructions to cause the computer device to perform the method provided by one aspect of the embodiments of the present application.
In the embodiments of the present application, the target service data is rendered simultaneously by the service server and by the terminal rendering component in the terminal. From the first layer rendered by the terminal rendering component and the second layer rendered by the service server, the terminal may determine a target layer and output the target layer in a display interface. It should be understood that, because the service server and the terminal rendering component render simultaneously, when determining the target layer the terminal does not depend only on its own terminal rendering component but also on the service server, and can therefore determine the target layer with the higher picture quality according to the first layer and the second layer. In view of this, by having the service server and the terminal rendering component render simultaneously and letting the terminal determine the target layer, a target layer with higher picture quality can be obtained while smooth operation is ensured, so that picture quality can be improved while the application client runs smoothly.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art are briefly described below. It is obvious that the drawings in the following description are only some embodiments of the present application, and that those skilled in the art can obtain other drawings from these drawings without creative effort.
Fig. 1 is a diagram of a network architecture provided by an embodiment of the present application;
FIG. 2 is a schematic view of a scenario provided by an embodiment of the present application;
fig. 3 is a schematic flowchart of a data processing method according to an embodiment of the present application;
fig. 4 is a schematic diagram illustrating a target layer generated by fusing a first layer and a second layer according to an embodiment of the present application;
FIG. 5 is a schematic view of a scenario provided by an embodiment of the present application;
FIG. 6 is a diagram of a system architecture provided by an embodiment of the present application;
fig. 7 is a schematic structural diagram of a data processing apparatus according to an embodiment of the present application;
fig. 8 is a schematic structural diagram of a computer device according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application are described below clearly and completely with reference to the drawings in the embodiments of the present application. It is obvious that the described embodiments are only some, not all, of the embodiments of the present application. All other embodiments obtained by a person skilled in the art from the embodiments given herein without creative effort shall fall within the protection scope of the present application.
Referring to fig. 1, fig. 1 is a diagram of a network architecture according to an embodiment of the present application. As shown in fig. 1, the network architecture may include a service server 1000 and a user terminal cluster, and the user terminal cluster may include one or more user terminals, where the number of user terminals is not limited. As shown in fig. 1, the plurality of user terminals may include a user terminal 100a, a user terminal 100b, a user terminal 100c, …, and a user terminal 100n; each of these user terminals may establish a network connection with the service server 1000, so that each user terminal may exchange data with the service server 1000 through the network connection.
It is understood that each user terminal shown in fig. 1 may be installed with a target application, and when the target application runs in each user terminal, the target application may exchange data with the service server 1000 shown in fig. 1, so that the service server 1000 may receive service data from each user terminal. The target application may include an application having a function of displaying data information such as text, images, audio, and video. For example, the application may be an entertainment application (e.g., a game application) that may be used for game entertainment by a user. The service server 1000 in the present application may obtain service data through these applications; for example, the service data may be service data triggered when a target user clicks a target control (for example, a parachute-opening control, a firing control, or a punch control) in the game application;
subsequently, the service server 1000 may render the service data triggered by the target control (e.g., the parachute-opening control) to obtain a rendered layer (e.g., an image of an opened parachute), and may then transmit the rendered layer to the user terminal.
In the embodiment of the present application, one user terminal may be selected from the plurality of user terminals as a target user terminal, and the user terminal may include a smart terminal carrying multimedia data processing functions (e.g., a video data playing function or a music data playing function), such as a smartphone, a tablet computer, a notebook computer, a desktop computer, a smart television, a smart speaker, or a smart watch, but is not limited thereto. For example, the user terminal 100a shown in fig. 1 may serve as the target user terminal in the embodiment of the present application; the target application may be integrated in the target user terminal, and the target user terminal may then exchange data with the service server 1000 through the target application.
For example, when a user uses a target application (e.g., a game application) in a user terminal and the control clicked by the user in the game application is a firing control, the user terminal may generate a layer rendering instruction in response to the trigger operation on the firing control, and send the layer rendering instruction to the service server 1000 and to the terminal rendering component. According to the layer rendering instruction, the service server 1000 may render the service data triggered by the firing control at a high rendering level to obtain a rendered layer with high picture quality (for example, a muzzle flash layer), and the service server 1000 may send the muzzle flash layer to the user terminal;
similarly, the terminal rendering component may also render the service data triggered by the firing control according to the layer rendering instruction, at a low rendering level, to obtain a rendered layer with low picture quality. The user terminal may then determine a target layer from the rendered layer sent by the service server 1000 and the rendered layer rendered by the terminal rendering component, and output the target layer on the terminal display interface, so that after clicking the firing control the user can view a picture of the muzzle flash.
Optionally, it may be understood that the network architecture may include a plurality of service servers, one user terminal may be connected to one service server, and each service server may obtain service data (e.g., service data triggered after a user clicks a target control) in the user terminal connected to the service server, and render the service data (e.g., service data triggered after the user clicks the target control).
It is understood that the method provided by the embodiment of the present application can be executed by a computer device, including but not limited to a user terminal or a service server. The service server may be an independent physical server, a server cluster or a distributed system formed by a plurality of physical servers, or a cloud server providing basic cloud computing services such as cloud service, a cloud database, cloud computing, a cloud function, cloud storage, network service, cloud communication, middleware service, domain name service, security service, CDN, big data and an artificial intelligence platform.
The user terminal and the service server may be directly or indirectly connected through wired or wireless communication, and the application is not limited herein.
For ease of understanding, please refer to fig. 2, which is a schematic view of a scenario provided by an embodiment of the present application. The service server shown in fig. 2 may be the service server 1000, and the user terminal M shown in fig. 2 may be any user terminal selected from the user terminal cluster in the embodiment corresponding to fig. 1, for example, the user terminal 100b.
As shown in fig. 2, the player character of user M in a game application is in a parachuting, descending state, and the parachute is not yet open; user M may click a parachute-opening control to open the parachute. When user M clicks the parachute-opening control, the user terminal M may generate a layer rendering instruction in response to the click trigger operation of user M, and send the layer rendering instruction to the terminal rendering component in the user terminal M and to the service server. After receiving the layer rendering instruction, the terminal rendering component may render the service data triggered by the parachute-opening control to obtain a rendered image (a first layer); similarly, after receiving the layer rendering instruction, the service server may also render the service data triggered by the parachute-opening control to obtain a rendered image (a second layer);
subsequently, the service server may return the second layer to the user terminal M, and the user terminal M may determine the target layer according to the first layer rendered by the terminal rendering component and the second layer rendered by the service server, and output the target layer on the display interface of the user terminal M. For example, as shown in fig. 2, the user terminal M compares the layer quality of the first layer with the layer quality of the second layer and determines that the layer quality of the second layer is higher; the user terminal M may therefore output the second layer as the target layer. After clicking the parachute-opening control, the user can view, on the display interface of the user terminal M, the second layer rendered by the service server, in which, as shown in fig. 2, the parachute of the player character is already open.
Further, please refer to fig. 3, where fig. 3 is a schematic flow chart of a data processing method according to an embodiment of the present application. The method may be executed by a user terminal (e.g., the user terminal shown in fig. 1 and fig. 2) or a service server (e.g., the service server 1000 shown in fig. 1), or may be executed by both the user terminal and the service server (e.g., the service server 1000 in the embodiment corresponding to fig. 1). For ease of understanding, the present embodiment is described as an example in which the method is executed by the user terminal described above. The data processing method at least comprises the following steps S101-S104:
and S101, responding to the trigger operation aiming at the target control, generating a layer rendering instruction, and sending the layer rendering instruction to the terminal rendering component and the service server.
In the application, a target application may be deployed in the user terminal, and the target application may be a video client, a game client, or the like. When the user uses the user terminal, the user may launch the target application in the user terminal, for example, the user may click on the target application and click on a launch control to launch the target application. When a user clicks the start control, the user terminal may respond to a click trigger operation of the user for the start control to generate a layer rendering instruction, and send the layer rendering instruction to a terminal rendering component and a service server in the user terminal, so that the terminal rendering component and the service server may render service data (e.g., application home page data) corresponding to the start control according to the layer rendering instruction to obtain a rendered layer (home page image).
Because the component performance parameters (e.g., computing capabilities) corresponding to the terminal rendering components in different user terminals are different, the rendering quality levels that the terminal rendering components in different terminals can render are also different, and the user terminal may first perform detection on the component performance parameters of the local terminal rendering component to determine the rendering quality level at which one terminal rendering component performs rendering.
Specifically, after a user clicks a target control, a user terminal may respond to a trigger operation for the target control, and detect a component performance parameter of a terminal rendering component in the user terminal; according to the component performance parameters, determining a terminal rendering quality grade, generating a first rendering instruction containing the terminal rendering quality grade, and sending the first rendering instruction containing the terminal rendering quality grade to a terminal rendering component, so as to render target service data triggered by the target control through the terminal rendering component, and obtain a first layer matched with the terminal rendering quality grade; and when the service server performs rendering each time, it can be understood that rendering is performed according to a default rendering quality level, the user terminal may directly generate a second rendering instruction (the rendering instruction does not include the rendering quality level) according to the trigger operation of the target control, and send the second rendering instruction to the service server, so that the service server may render the target service data according to the second rendering instruction, and obtain a second layer matched with the default rendering quality level.
It can be understood that, for example, if a user clicks a start control of a target application to run the target application, the user terminal may respond to the trigger operation on the start control, detect the actual computing capability of the terminal rendering component in the user terminal, determine a lower rendering quality level (e.g., a resolution of 270P) according to the actual computing capability, and have the terminal rendering component render at that lower rendering quality level, thereby obtaining an application home page layer of low image quality; the service server may render at a default, higher rendering quality level (e.g., a resolution of 720P), thereby obtaining an application home page layer of high image quality.
Step S102, calling a first cache component and a second cache component; the first cache assembly is used for storing layers rendered by the terminal rendering assembly; the second cache component is used for storing the layer rendered by the service server.
In the application, the first cache component and the second cache component are both components in the user terminal, the first cache component can be used for storing a layer rendered by the terminal rendering component, and the second cache component can be used for storing a layer rendered by the service server; the user terminal may obtain the layer for output from the first cache component and the second cache component.
Step S103, when a first layer exists in the first cache assembly and a second layer exists in the second cache assembly, determining a target layer for responding to the trigger operation according to the first layer and the second layer; the first layer is obtained by rendering the target service data by the terminal rendering component according to the layer rendering instruction; the second layer is obtained by rendering the target service data by the service server according to the layer rendering instruction; the target business data is the business data triggered by the target control.
In the application, when a first layer exists in the first cache component and a second layer exists in the second cache component, a first layer quality corresponding to the first layer and a second layer quality corresponding to the second layer can be obtained. It should be understood that, because the first layer is obtained by the terminal rendering component rendering the target service data according to the first rendering instruction containing the terminal rendering quality level, the layer quality of the first layer is the quality corresponding to the terminal rendering quality level; similarly, because the second layer is obtained by the service server rendering the target service data according to the second rendering instruction at the default rendering quality level, the layer quality of the second layer is the quality corresponding to the default rendering quality level.
Further, the first layer quality and the second layer quality may be compared, and if the first layer quality is greater than the second layer quality, the first layer may be determined as the target layer; if the first layer quality is less than the second layer quality, the second layer may be determined as the target layer.
It should be understood that the purpose of comparing the first layer quality with the second layer quality here is to determine, between the first layer and the second layer, the layer with the higher layer quality, so that the layer quality of the target layer is higher.
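The quality comparison above amounts to a simple selection. A minimal sketch (on a tie this sketch prefers the second, server-rendered layer, in line with the following paragraph's observation that the server's default quality is typically higher):

```python
def select_target_layer(first_layer, first_quality, second_layer, second_quality):
    """Return the layer with the higher layer quality as the target layer.

    first_layer / second_layer: opaque layer objects (any type).
    first_quality / second_quality: comparable quality scores.
    """
    if first_quality > second_quality:
        return first_layer
    # second quality is greater than or equal: prefer the server layer
    return second_layer
```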
It can be understood that the computing power or data processing power of the terminal rendering component local to the user terminal is generally far lower than that of the service server, so the terminal rendering quality level can be understood as lower than the default rendering quality level of the service server, and the layer quality of the first layer rendered by the terminal rendering component is accordingly lower than the layer quality of the second layer rendered by the service server. Therefore, when the first layer exists in the first cache component and the second layer also exists in the second cache component (that is, by the output time of the target layer, the terminal rendering component has finished rendering and the rendered layer sent by the service server has been successfully received), the received second layer rendered by the service server may be used directly as the target layer without performing a quality comparison.
Optionally, it may be understood that, when a first layer exists in the first cache component and a second layer exists in the second cache component but the second layer is an incomplete layer (that is, the second layer consists of only one or more layer blocks, because some layer blocks were not delivered to the user terminal in time before the output time of the target layer due to network delay or network jitter during transmission), the first layer rendered by the terminal rendering component may be fused with the incomplete second layer (the one or more second layer blocks) to generate the target layer.
Specifically, in the rendering process of the terminal rendering component, the target service data may be divided into a plurality of data blocks, and each data block is numbered so that each data block has a unique first tag number; one first tag number corresponds to the pixel position of one data block in the first layer. The process of rendering the target service data by the terminal rendering component can then be understood as rendering each data block, thereby obtaining a plurality of (at least two) first layer blocks (one data block corresponds to one first layer block). Similarly, in the rendering process of the service server, the target service data may also be divided into a plurality of (at least two) data blocks, and each data block is numbered so that each data block has a unique second tag number; one second tag number likewise corresponds to the pixel position of one data block in the second layer. The process of rendering the target service data by the service server can then also be understood as rendering each data block, thereby obtaining a plurality of second layer blocks (one data block corresponds to one second layer block).
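The division-and-numbering scheme above can be sketched as follows. The grid shape and the encoding of tag numbers as sequential indices over a row-major grid are illustrative assumptions; the text only requires that each tag number map to a unique pixel position:

```python
def divide_into_blocks(width, height, cols, rows):
    """Split a frame into numbered blocks.

    Returns {tag_number: (x, y, block_width, block_height)}, so each tag
    number identifies the pixel position of one data block in the layer.
    """
    blocks = {}
    bw, bh = width // cols, height // rows
    tag = 0
    for r in range(rows):
        for c in range(cols):
            blocks[tag] = (c * bw, r * bh, bw, bh)
            tag += 1
    return blocks
```

Because both sides divide the data the same way, running this on the terminal and on the server yields identical tag-to-position maps, which is what makes the later tag matching possible.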
The service server divides the target service data in the same way as the terminal rendering component, so the data blocks obtained by the terminal rendering component correspond one-to-one with the data blocks obtained by the service server (one first tag number corresponds to one second tag number). In view of this, a specific method for fusing the first layer and the incomplete second layer (one or more second layer blocks) to generate the target layer may be as follows: the first tag number corresponding to each first layer block and the second tag number corresponding to each second layer block stored in the second cache component are obtained, and the first tag numbers are matched against the second tag numbers; each first tag number identical to a second tag number is determined as an associated tag number, and the first layer block corresponding to an associated tag number is determined as an associated layer block. It should be understood that an associated layer block has the same pixel position as the corresponding second layer block;
subsequently, the layer quality corresponding to the associated layer block and the layer quality corresponding to the second layer block may be obtained, and if the layer quality corresponding to the second layer block is greater than the layer quality corresponding to the associated layer block, the associated layer block in the first layer may be replaced by the second layer block, and the replaced first layer may be used as the target layer.
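A minimal sketch of this tag-matching fusion, treating each layer as a dictionary keyed by tag number (the dictionary representation and the per-layer, rather than per-block, quality scores are simplifying assumptions):

```python
def fuse_layers(first_blocks, second_blocks, first_quality, second_quality):
    """Fuse a complete terminal layer with an incomplete server layer.

    first_blocks:  {tag: block} rendered locally (complete).
    second_blocks: {tag: block} received from the server (possibly partial).
    A matching tag is an "associated" block; it is replaced when the
    server's quality is higher.
    """
    target = dict(first_blocks)  # start from the terminal-rendered layer
    for tag, block in second_blocks.items():
        if tag in target and second_quality > first_quality:
            target[tag] = block  # same pixel position, higher quality
    return target
```

With the fig. 4 scenario (only blocks A1' and A4' arrived, server quality higher), this yields a target layer mixing server and terminal blocks.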
For convenience of understanding, please refer to fig. 4 together; fig. 4 is a schematic diagram of fusing a first layer and a second layer to generate a target layer according to an embodiment of the present application. As shown in fig. 4, the terminal rendering component divides the target service data into four data blocks, namely data block A1, data block A2, data block A3, and data block A4; similarly, the service server divides the target service data into four data blocks, namely data block A1', data block A2', data block A3', and data block A4', where data block A1 and data block A1' have the same pixel position, data block A2 and data block A2' have the same pixel position, data block A3 and data block A3' have the same pixel position, and data block A4 and data block A4' have the same pixel position;
the terminal rendering component may render these data blocks, obtaining first layer block A1, first layer block A2, first layer block A3, and first layer block A4, which together constitute the first layer; similarly, the service server may render its data blocks and send the rendered second layer blocks to the user terminal.
As shown in fig. 4, when the display time of the target layer is reached, due to network delay the user terminal has received only part of the second layer blocks sent by the service server (second layer block A1' and second layer block A4'). By comparing the layer quality of the first layer with the layer qualities of second layer block A1' and second layer block A4', it can be determined that the layer qualities of second layer block A1' and second layer block A4' are higher than the layer quality of the first layer. The associated layer block having the same pixel position as second layer block A1' (first layer block A1) can then be obtained in the first layer, and first layer block A1 is replaced with second layer block A1'; likewise, the associated layer block having the same pixel position as second layer block A4' (first layer block A4) can be obtained in the first layer, and first layer block A4 is replaced with second layer block A4'. This yields a target layer comprising second layer block A1', first layer block A2, first layer block A3, and second layer block A4'.
Optionally, it may be understood that, as a specific method for replacing an associated layer block in the first layer with a second layer block, the layer quality corresponding to the associated layer block and the layer quality corresponding to the second layer block may first be obtained; if the layer quality corresponding to the second layer block is greater than the layer quality corresponding to the associated layer block, the visual parameter corresponding to the associated layer block is adjusted to a default parameter, obtaining an adjusted associated layer block, that is, the associated layer block is adjusted to a layer block in a transparent state; in the first layer, the second layer block may then cover the adjusted associated layer block. The visual parameter may be a transparency parameter.
For example, as shown in fig. 4, after determining that the layer quality of second layer block A1' and second layer block A4' is higher than that of the first layer, the transparency parameter of the associated layer blocks in the first layer (first layer block A1 and first layer block A4) may be adjusted to 0, so that first layer block A1 and first layer block A4 are both in a transparent state; subsequently, second layer block A1' may be overlaid onto first layer block A1, and second layer block A4' onto first layer block A4. The covered first layer may then serve as the target layer.
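The transparency-based variant can be sketched as follows, modeling each terminal block as a `(pixels, alpha)` pair (this pair representation and the 0/1 alpha values are illustrative assumptions; a real compositor would blend pixel data):

```python
def overlay_transparent(first_layer, second_blocks, assoc_quality, server_quality):
    """Cover associated blocks instead of swapping them.

    first_layer:   {tag: (pixels, alpha)} from the terminal.
    second_blocks: {tag: pixels} received from the server.
    When the server block's quality is higher, the associated block's
    transparency is set to 0 (fully transparent) and the server block is
    composited over it at full opacity.
    """
    result = {}
    for tag, (pixels, alpha) in first_layer.items():
        if tag in second_blocks and server_quality > assoc_quality:
            # associated block made transparent, server block covers it
            result[tag] = (second_blocks[tag], 1.0)
        else:
            result[tag] = (pixels, alpha)
    return result
```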
Optionally, it may be understood that when the first cache component and the second cache component are called, if a first layer exists in the first cache component and a second layer does not exist in the second cache component, the first layer may be determined as a target layer. It should be understood that, in the transmission process in which the service server sends the second layer to the user terminal, there is a case that transmission delay is caused by network delay or the like, and when the first cache component and the second cache component are called, the second layer sent by the service server may not be received yet, in this case, the first layer rendered by the terminal rendering component may be output as a target layer.
Optionally, it may be understood that when the first cache component and the second cache component are called, if the first layer does not exist in the first cache component and the second layer exists in the second cache component, the second layer may be determined as the target layer.
Optionally, it may be understood that, when the first cache component and the second cache component are called, if the first layer does not exist in the first cache component and the second layer does not exist in the second cache component, a pause prompt message may be generated and output on the terminal display interface of the user terminal. It should be understood that, when the first cache component and the second cache component are invoked, if the first layer does not exist in the first cache component (the terminal rendering component has not finished rendering the first layer) and the second layer does not exist in the second cache component (the service server may have rendered the second layer, but the user terminal has not received it), the user terminal cannot output a layer and a picture pause occurs; the user terminal may therefore generate the pause prompt message to notify the user of the user terminal.
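The four cache states discussed above (both layers present, only the first, only the second, neither) reduce to a small dispatch. A sketch, using `None` for an absent layer (the tuple return shape is an illustrative assumption):

```python
def decide_output(first_layer, second_layer):
    """Decide what to output when the two cache components are polled."""
    if first_layer is not None and second_layer is not None:
        # Both present: the server layer (default, higher quality) wins.
        return ("layer", second_layer)
    if first_layer is not None:
        return ("layer", first_layer)    # server layer delayed in transit
    if second_layer is not None:
        return ("layer", second_layer)   # local render not finished
    return ("stall_prompt", None)        # neither ready: prompt the user
```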
And step S104, outputting the target layer in the display interface.
In the application, after the user terminal obtains the target layer, the target layer can be output in the display interface, and then the user using the user terminal can view the picture presented by the target control after clicking the target control.
Optionally, it may be understood that, in the display process of the target layer, the user terminal may detect a network quality parameter and a terminal operation parameter of the user terminal; if the network quality parameter and the terminal operation parameter do not meet the operation condition, a terminal performance parameter corresponding to the user terminal is acquired, and a recommended rendering quality level is determined according to the network quality parameter and the terminal performance parameter, where the layer quality corresponding to the recommended rendering quality level is lower than the layer quality corresponding to the target layer. Then, quality recommendation information is generated according to the recommended rendering quality level and output in the display interface. The user terminal can then respond to a quality conversion confirmation operation for the quality recommendation information, generate an updated layer rendering instruction containing the recommended rendering quality level, and send the updated layer rendering instruction to the terminal rendering component; after receiving the updated layer returned by the terminal rendering component according to the updated layer rendering instruction, the user terminal outputs the updated layer in the display interface, where the layer quality of the updated layer matches the recommended rendering quality level.
It should be understood that, in the display process of the target layer, that is, during the running of the target application, since the local terminal rendering component and the service server render simultaneously, the user terminal prioritizes layer quality when determining the target layer, so the obtained target layer has higher layer quality. Because different user terminals have different heat dissipation capacities and battery endurance, some user terminals may generate too much heat and consume too much power while outputting high-image-quality (high layer quality) layers, which indicates that the terminal operation parameters of those user terminals do not meet the operation condition, that is, they lack the operation conditions for the target application. If such a user terminal continues to run while outputting high-image-quality layers, operation stutter is likely to occur, and continuing to run in this mode can also cause considerable damage to the user terminal.
In view of this, in the display process of the target layer, that is, during the running of the target application, the user terminal may periodically detect its terminal operation parameters (for example, every 1 minute), such as the current network quality parameter, heating value, and power consumption. If the current network quality is poor, the user terminal may conclude that the current network is insufficient to smoothly display the high-quality layer; if the current heating value or power consumption is too large, the user terminal may likewise conclude that it does not currently meet the operation condition for smooth running. To ensure that the target application runs smoothly without picture stutter, the user terminal may determine a layer quality (picture quality) it can bear according to its terminal performance parameters and take that bearable layer quality as the recommended layer quality;
further, the user terminal may generate quality recommendation information according to the recommended rendering quality level corresponding to the recommended layer quality and output it in the display interface of the user terminal, to remind the user that the currently displayed image quality is too high and that continuing at this quality may cause picture stutter, and to ask whether the user is willing to reduce the image quality, thereby lowering the layer quality to the lower recommended layer quality. If the user clicks to confirm the quality reduction, in the subsequent rendering process the user terminal may generate an updated layer rendering instruction according to the lower recommended rendering quality level, send it to the terminal rendering component, and output in the display interface the updated layer rendered by the terminal rendering component at the lower recommended rendering quality level. In this way, although the quality of the picture the user views in the user terminal is lower, no picture stutter occurs and fluency is guaranteed.
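The periodic check above can be sketched as a single decision function. The thresholds, units, and the "step down one level" policy are illustrative assumptions; the text only requires that the recommended level's quality be lower than the current one:

```python
def recommend_quality(network_ok, heat, power_draw, ladder, current_level,
                      heat_limit=45.0, power_limit=5.0):
    """Return a lower recommended level, or None if conditions are met.

    network_ok:  whether the network quality parameter meets the condition.
    heat:        current heating value (assumed degrees Celsius).
    power_draw:  current power consumption (assumed watts).
    ladder:      quality levels ordered low -> high.
    """
    if network_ok and heat <= heat_limit and power_draw <= power_limit:
        return None  # operation conditions met; keep the current level
    idx = ladder.index(current_level)
    # recommend the next lower level; already at the bottom means no change
    return ladder[idx - 1] if idx > 0 else current_level
```

A `None` result means no quality recommendation information needs to be generated; any other result would be wrapped into the recommendation prompt shown to the user.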
For ease of understanding, please refer to fig. 5; fig. 5 is a schematic view of a scenario provided by an embodiment of the present application. The user terminal E shown in fig. 5 may be any user terminal selected from the user terminal cluster in the embodiment shown in fig. 1, for example, the user terminal 100a.
As shown in fig. 5, when the game application is running, the picture quality (layer quality) of the currently displayed target layer is at an extreme-definition level. By detecting its terminal operation parameters, the user terminal E determines that its current heating value is too large; if the game application continues to run at the extreme-definition level, picture stutter will occur, which may affect the user's quality of experience with the game application. The user terminal E can determine a lower picture quality level it can bear (e.g., a standard-definition level) according to its own terminal performance parameters; when running at this lower picture quality level, the picture does not stutter.
As shown in fig. 5, after determining the lower picture quality level, the user terminal E may generate quality recommendation information and output it in the display interface to remind the user that the currently running picture quality is too high, the mobile phone load is too heavy, and the picture may stutter, and to ask the user whether to reduce the picture quality to ensure smooth operation. As shown in fig. 5, if the user E clicks the confirmation button in the display interface, indicating that the user E agrees to reduce the picture quality, the user terminal E may respond to the confirmation trigger operation of the user E to generate an updated layer rendering instruction and send it to the terminal rendering component; the terminal rendering component may render subsequent service data according to the updated layer rendering instruction at the lower picture quality level (such as standard definition) to obtain an updated rendered layer. Subsequently, the user terminal E may output the updated rendered layer in the display interface. Although the picture quality the user E currently views is lower (standard definition), the game application runs smoothly.
In the embodiment of the application, a mode in which the service server and the terminal rendering component in the terminal render the target service data simultaneously is adopted. The terminal may determine a target layer from the first layer rendered by the terminal rendering component and the second layer rendered by the service server, and output the target layer in the display interface. It should be understood that, because the service server and the terminal rendering component render simultaneously, the terminal does not rely only on its terminal rendering component when determining the target layer but may also rely on the service server: the terminal rendering component renders at a lower layer quality level while the service server renders at a higher layer quality level. When outputting the target layer, the second layer transmitted by the service server is preferentially output as the target layer, so the presented picture quality can be improved; if the second layer transmitted by the service server has not been received when the target layer is to be output, the first layer with the lower layer quality can be output as the target layer, guaranteeing fluency and timeliness. Therefore, when the terminal rendering component and the service server render simultaneously, the method can improve picture quality while ensuring smooth operation of the application client.
Further, please refer to fig. 6; fig. 6 is a system architecture diagram according to an embodiment of the present application. As shown in fig. 6, if a touch event occurs in the user terminal (for example, the user clicks any control), the user terminal generates a layer rendering instruction and sends it to the service server through a network sending component; after receiving the layer rendering instruction, the service server may render the service data triggered by the touch event and return the rendered data to the user terminal, and the user terminal may hard-decode the returned data and store the layer obtained after hard decoding in the second cache component. It should be understood that the user terminal may locally render the service data triggered by the touch event at a lower image quality (the terminal rendering component renders it) and store the rendered layer in the first cache component. Subsequently, the layer determining component may obtain the locally rendered layer from the first cache component and the layer rendered by the service server from the second cache component, determine the target layer, and output it on the screen (display interface).
Optionally, it may be understood that, after the user starts the target application, the user terminal may determine a layer rendering quality level according to the user's member level, generate a layer rendering instruction containing that layer rendering quality level, and send the layer rendering instruction to the service server, so that the service server renders the service data according to that layer rendering quality level. The member level may be determined by the user's virtual assets in the target application; a virtual asset may be an amount of money the user has put into the target application, or an asset in virtual form, such as game chips or experience points acquired by the user when performing tasks in the target application.
For example, if the user's virtual asset in the target application is 0 and the user's member level is 0, the user terminal may determine that the layer rendering quality level for the user is at most an ultra-definition level, and the user does not have permission to view higher layer rendering quality levels (e.g., a blue-ray level); that is, the ultra-definition level is the highest layer rendering quality level that a user with a virtual asset of 0 can view. If the user has a virtual asset in the target application but the asset amount does not reach a threshold, the user terminal may determine that the layer rendering quality level for the user is at most an extreme-definition level, and the user does not have permission to view higher levels (e.g., an HDR level). If the user's virtual asset in the target application exceeds the threshold, the layer rendering quality level for the user may be determined as the highest of all layer rendering quality levels (e.g., the HDR level).
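The member-level gating just described can be sketched as a small mapping. The level names, the single threshold, and treating assets as one number are illustrative assumptions drawn from the example above:

```python
def max_viewable_level(virtual_assets, threshold=100):
    """Return the highest layer rendering quality level a user may view,
    based on their virtual assets in the target application."""
    if virtual_assets <= 0:
        return "ultra_definition"    # higher levels (e.g. blue-ray) gated
    if virtual_assets < threshold:
        return "extreme_definition"  # HDR still gated
    return "HDR"                     # highest level unlocked
```

The returned level would then be placed into the layer rendering instruction sent to the service server.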
Further, please refer to fig. 7, where fig. 7 is a schematic structural diagram of a data processing apparatus according to an embodiment of the present application. The data processing means may be a computer program (comprising program code) running on a computer device, for example the data processing means being an application software; the data processing apparatus may be adapted to perform the method illustrated in fig. 3. As shown in fig. 7, the data processing apparatus 1 may include: the device comprises an instruction generating module 11, an instruction sending module 12, a component calling module 13, a target layer generating module 14 and a layer output module 15.
The instruction generating module 11 is configured to generate an image layer rendering instruction in response to a trigger operation for the target control;
the instruction sending module 12 is configured to send the layer rendering instruction to the terminal rendering component and the service server;
the component calling module 13 is used for calling the first cache component and the second cache component; the first cache assembly is used for storing layers rendered by the terminal rendering assembly; the second cache component is used for storing the layer rendered by the service server;
a target layer generation module 14, configured to determine, when a first layer exists in the first cache component and a second layer exists in the second cache component, a target layer for responding to the trigger operation according to the first layer and the second layer; the first layer is obtained by rendering the target service data by the terminal rendering component according to the layer rendering instruction; the second layer is obtained by rendering the target service data by the service server according to the layer rendering instruction; the target business data is the business data triggered by the target control.
And the layer output module 15 is configured to output the target layer in the display interface.
For specific implementation manners of the instruction generating module 11, the instruction sending module 12, the component calling module 13, the target layer generating module 14, and the layer outputting module 15, reference may be made to the descriptions in step S101 to step S104 in the embodiment corresponding to fig. 3, and details will not be described here.
The layer rendering instruction comprises a first rendering instruction and a second rendering instruction;
referring to fig. 7, the instruction generating module 11 may include: a parameter detection unit 111 and an instruction generation unit 112.
The parameter detection unit 111 is used for responding to the trigger operation aiming at the target control and detecting the component performance parameters of the terminal rendering component;
an instruction generating unit 112, configured to determine a terminal rendering quality level according to the component performance parameter, and generate a first rendering instruction including the terminal rendering quality level;
an instruction generating unit 112, further configured to generate a second rendering instruction for the business server.
for a specific implementation manner of the parameter detection unit 111 and the instruction generation unit 112, reference may be made to the description of generating the layer rendering instruction in step S101 in the embodiment corresponding to fig. 3, which will not be described herein again.
Referring to fig. 7, the instruction sending module 12 may include: a first instruction transmitting unit 121 and a second instruction transmitting unit 122.
A first instruction sending unit 121, configured to send a first rendering instruction to a terminal rendering component, and render target service data through the terminal rendering component to obtain a first layer matching a terminal rendering quality level;
and a second instruction sending unit 122, configured to send the second rendering instruction to the service server, so that the service server renders the target service data according to the second rendering instruction, and obtains a second layer.
For specific implementation of the first instruction sending unit 121 and the second instruction sending unit 122, reference may be made to the description of instruction sending in step S101 in the embodiment corresponding to fig. 3, which will not be described again here.
Referring to fig. 7, target layer generation module 14 may include: layer quality acquisition unit 141 and first target layer determination unit 142.
The layer quality obtaining unit 141 is configured to obtain a first layer quality corresponding to the first layer and a second layer quality corresponding to the second layer when the first layer exists in the first cache component and the second layer exists in the second cache component; the first layer quality corresponds to the terminal rendering quality level, and the second layer quality corresponds to a second rendering quality level;
a first target layer determining unit 142, configured to determine the first layer as a target layer if the first layer quality is greater than the second layer quality;
the first target layer determining unit 142 is further configured to determine the second layer as the target layer if the first layer quality is less than the second layer quality.
For specific implementation manners of the layer quality obtaining unit 141 and the first target layer determining unit 142, reference may be made to the description of generating the target layer in step S103 in the embodiment corresponding to fig. 3, which will not be described herein again.
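The selection logic of the layer quality obtaining unit 141 and the first target layer determining unit 142 can be sketched as a simple quality comparison. The following Python sketch is purely illustrative and not part of the disclosed embodiments; the `Layer` class and the numeric quality values are hypothetical stand-ins for the layer qualities derived from the rendering quality grades:

```python
# Illustrative sketch of units 141/142: each rendered layer carries a
# quality value derived from its rendering quality grade; the layer with
# the greater layer quality is determined as the target layer.

class Layer:
    def __init__(self, source, quality):
        self.source = source    # "terminal" or "server" (hypothetical labels)
        self.quality = quality  # numeric layer quality (hypothetical scale)

def determine_target_layer(first_layer, second_layer):
    """Return the layer whose layer quality is greater (units 141/142)."""
    if first_layer.quality > second_layer.quality:
        return first_layer
    return second_layer

first = Layer("terminal", quality=2)   # lower terminal rendering quality
second = Layer("server", quality=5)    # higher server rendering quality
target = determine_target_layer(first, second)
print(target.source)  # → server
```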
The first image layer is composed of at least two first image layer blocks; the second layer is composed of at least one second layer block;
referring to fig. 7, target layer generation module 14 may include: layer block acquisition unit 143, associated layer block acquisition unit 144, layer block replacement unit 145, and second target layer determination unit 146.
The layer block obtaining unit 143 is configured to obtain at least one second layer block when a first layer exists in the first cache component, a second layer exists in the second cache component, and the second layer is an incomplete layer;
an associated layer block acquisition unit 144 configured to acquire, as an associated layer block, a first layer block having the same pixel position as the second layer block, among the at least two first layer blocks;
a layer block replacing unit 145, configured to replace the associated layer block in the first layer with the second layer block;
a second target layer determining unit 146, configured to determine the replaced first layer as a target layer.
For a specific implementation manner of the layer block obtaining unit 143, the associated layer block obtaining unit 144, the layer block replacing unit 145, and the second target layer determining unit 146, reference may be made to the description of generating the target layer when the second layer is an incomplete layer in step S103 in the embodiment corresponding to fig. 3, which will not be described herein again.
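The merging behavior of units 143 to 146 can be sketched as patching the complete first layer with whichever second layer blocks have arrived. The Python sketch below is an illustration only and not part of the disclosed embodiments; modeling a layer as a dict from a block's mark number (pixel position) to its content is an assumption made for brevity:

```python
# Illustrative sketch of units 143-146: when the second layer is an
# incomplete layer, each received second layer block replaces the first
# layer block at the same pixel position (the "associated layer block"),
# and the replaced first layer is determined as the target layer.

def merge_incomplete_layer(first_blocks, second_blocks):
    """Replace associated blocks of the first layer with second layer blocks."""
    merged = dict(first_blocks)          # start from the complete first layer
    for mark_number, block in second_blocks.items():
        merged[mark_number] = block      # unit 145: replace associated block
    return merged                        # unit 146: replaced layer = target

first_layer = {0: "lo-q0", 1: "lo-q1", 2: "lo-q2", 3: "lo-q3"}
second_layer = {1: "hi-q1", 3: "hi-q3"}  # incomplete: only blocks 1 and 3
print(merge_incomplete_layer(first_layer, second_layer))
# → {0: 'lo-q0', 1: 'hi-q1', 2: 'lo-q2', 3: 'hi-q3'}
```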
Referring to fig. 7, the associated layer block obtaining unit 144 may include: a number obtaining subunit 1441, an associated number obtaining subunit 1442, and an associated layer block determining subunit 1443.
A number obtaining subunit 1441, configured to obtain a first mark number corresponding to each first layer block, and obtain a second mark number corresponding to each second layer block; the first mark number is used for representing the pixel position of the first layer block in the first layer, and the second mark number is used for representing the pixel position of the second layer block in the second layer;
an associated number obtaining subunit 1442, configured to determine, in the first mark numbers corresponding to the at least two first layer blocks, a first mark number that is the same as the second mark number, and determine the first mark number that is the same as the second mark number as an associated mark number;
an associated layer block determining subunit 1443, configured to determine the first layer block corresponding to the associated mark number as the associated layer block.
For specific implementation manners of the number obtaining subunit 1441, the associated number obtaining subunit 1442, and the associated layer block determining subunit 1443, reference may be made to the description of determining the associated layer block in step S103 in the embodiment corresponding to fig. 3, which will not be described herein again.
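The matching performed by subunits 1441 to 1443 amounts to finding which mark numbers appear in both layers. The following Python sketch is illustrative and not part of the disclosed embodiments; representing each layer as a dict keyed by mark number is an assumption:

```python
# Illustrative sketch of subunits 1441-1443: each layer block carries a
# mark number encoding its pixel position; a first layer block whose mark
# number also appears among the second layer blocks is an associated
# layer block.

def find_associated_mark_numbers(first_blocks, second_blocks):
    """Return the mark numbers shared by both layers (associated mark numbers)."""
    first_numbers = set(first_blocks)            # subunit 1441: first mark numbers
    second_numbers = set(second_blocks)          # subunit 1441: second mark numbers
    associated = first_numbers & second_numbers  # subunit 1442: same mark numbers
    return sorted(associated)                    # subunit 1443: associated blocks

first_layer = {0: "a", 1: "b", 2: "c"}
second_layer = {1: "B", 2: "C"}
print(find_associated_mark_numbers(first_layer, second_layer))  # → [1, 2]
```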
Referring to fig. 7, the layer block replacing unit 145 may include: a first quality determination subunit 1451, and a layer block replacement subunit 1452.
A first quality determination subunit 1451, configured to obtain layer quality corresponding to the associated layer block and layer quality corresponding to the second layer block;
a layer block replacing subunit 1452, configured to replace the associated layer block in the first layer with the second layer block if the layer quality corresponding to the second layer block is greater than the layer quality corresponding to the associated layer block.
For specific implementation manners of the first quality determination subunit 1451 and the layer block replacement subunit 1452, reference may be made to the description of layer block replacement in step S103 in the embodiment corresponding to fig. 3, which will not be described herein again.
Referring to fig. 7, the layer block replacing unit 145 may include: a second quality determination subunit 1453, a visual parameter adjustment subunit 1454, and a layer block covering unit 1455.
A second quality determination subunit 1453, configured to obtain layer quality corresponding to the associated layer block and layer quality corresponding to the second layer block;
a visual parameter adjusting subunit 1454, configured to adjust the visual parameter corresponding to the associated map layer block to a default parameter to obtain an adjusted associated map layer block if the layer quality corresponding to the second map layer block is greater than the layer quality corresponding to the associated map layer block; adjusting the relevant graph layer block to be a graph layer block in a transparent state;
a layer block covering unit 1455, configured to cover, in the first layer, the adjusted associated layer block with the second layer block.
For a specific implementation manner of the second quality determining subunit 1453, the visual parameter adjusting subunit 1454, and the layer block covering unit 1455, reference may be made to the description of the layer block covering in step S103 in the embodiment corresponding to fig. 3, which will not be described again here.
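The adjust-then-cover variant of subunits 1453 to 1455 keeps the associated layer block in place but makes it fully transparent before drawing the second layer block over it. The Python sketch below is illustrative only and not part of the disclosed embodiments; the `opacity` field, the dict layout, and the quality values are assumptions:

```python
# Illustrative sketch of subunits 1453-1455: instead of deleting the
# associated layer block, its visual parameter (here an assumed 'opacity')
# is adjusted to a default value that makes it transparent, and the
# second layer block is then covered over the adjusted block.

def cover_block(associated_block, second_block):
    """Cover the associated block with the second block if its quality is higher."""
    if second_block["quality"] <= associated_block["quality"]:
        return associated_block              # keep the existing block unchanged
    adjusted = dict(associated_block, opacity=0.0)  # subunit 1454: transparent
    return {"base": adjusted, "overlay": second_block}  # subunit 1455: cover

assoc = {"content": "lo", "quality": 2, "opacity": 1.0}
second = {"content": "hi", "quality": 5, "opacity": 1.0}
result = cover_block(assoc, second)
print(result["base"]["opacity"])  # → 0.0
```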
Referring to fig. 7, the data processing apparatus 1 may include an instruction generating module 11, an instruction sending module 12, a component calling module 13, a target layer generating module 14, and a layer output module 15, and may further include: a first target layer determining module 16.
The first target layer determining module 16 is configured to determine, when a first layer exists in the first cache component and a second layer does not exist in the second cache component, the first layer as a target layer.
For a specific implementation manner of the first target layer determining module 16, reference may be made to the description in step S103 in the embodiment corresponding to fig. 3, which will not be described herein again.
Referring to fig. 7, the data processing apparatus 1 may include an instruction generating module 11, an instruction sending module 12, a component calling module 13, a target layer generating module 14, a layer output module 15, and a first target layer determining module 16, and may further include: and a second target layer determining module 17.
A second target layer determining module 17, configured to determine, when the first layer does not exist in the first cache component and a second layer exists in the second cache component, the second layer as the target layer.
The specific implementation manner of the second target layer determining module 17 may refer to the description in step S103 in the embodiment corresponding to fig. 3, and details are not repeated here.
Referring to fig. 7, the data processing apparatus 1 may include an instruction generating module 11, an instruction sending module 12, a component calling module 13, a target layer generating module 14, a layer output module 15, a first target layer determining module 16, and a second target layer determining module 17, and may further include: third target layer determination module 18.
The third target layer determining module 18 is configured to generate lag prompt information and output the lag prompt information in the display interface when the first layer does not exist in the first cache component and the second layer does not exist in the second cache component.
For a specific implementation manner of the third target layer determining module 18, reference may be made to the description in step S103 in the embodiment corresponding to fig. 3, which will not be described herein again.
Referring to fig. 7, the data processing apparatus 1 may include an instruction generating module 11, an instruction sending module 12, a component calling module 13, a target layer generating module 14, a layer output module 15, a first target layer determining module 16, a second target layer determining module 17, and a third target layer determining module 18, and may further include: an operation parameter detection module 19, a performance parameter acquisition module 20, a recommendation information generation module 21, an update instruction generation module 22, and an updated layer output module 23.
An operation parameter detection module 19, configured to detect a network quality parameter and a terminal operation parameter of the user terminal in a display process of the target layer; the user terminal comprises a terminal rendering component;
a performance parameter obtaining module 20, configured to obtain a terminal performance parameter corresponding to the user terminal if the network quality parameter and the terminal operation parameter do not meet the operation condition, and determine a recommended rendering quality level according to the network quality parameter and the terminal performance parameter; the layer quality corresponding to the recommended rendering quality grade is smaller than the layer quality corresponding to the target layer;
a recommendation information generation module 21, configured to generate quality recommendation information according to the recommended rendering quality level, and output the quality recommendation information in a display interface;
the update instruction generation module 22 is configured to generate an update layer rendering instruction including a recommended rendering quality level in response to a quality conversion confirmation operation for the quality recommendation information, and send the update layer rendering instruction to the terminal rendering component;
the updated layer output module 23 is configured to receive an updated layer returned by the terminal rendering component according to the updated layer rendering instruction, and output the updated layer in the display interface; and the layer quality of the updated layer is matched with the recommended rendering quality grade.
For specific implementation manners of the operation parameter detecting module 19, the performance parameter obtaining module 20, the recommendation information generating module 21, the update instruction generating module 22, and the update layer outputting module 23, reference may be made to the description in step S104 in the embodiment corresponding to fig. 3, which will not be described herein again.
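The degradation behavior of modules 19 to 21 can be sketched as monitoring two parameters and, when the operation condition is not met, choosing a recommended rendering quality grade below the current one. The Python sketch below is purely illustrative and not part of the disclosed embodiments; the thresholds, the 0-to-1 parameter scales, and the integer quality grades are all assumptions:

```python
# Illustrative sketch of modules 19-21: during display of the target layer,
# the network quality parameter and terminal operation parameter are
# monitored; when they do not meet the (assumed) operation condition, a
# recommended rendering quality grade lower than the current grade is
# determined from the network quality and terminal performance parameters.

def recommend_quality_grade(network_quality, terminal_load, current_grade):
    """Pick a rendering quality grade below the current one when degraded."""
    meets_condition = network_quality >= 0.7 and terminal_load <= 0.8
    if meets_condition:
        return current_grade                 # no recommendation needed
    # degrade more aggressively the worse the network quality is
    penalty = 1 if network_quality >= 0.4 else 2
    return max(1, current_grade - penalty)   # recommended grade < current grade

print(recommend_quality_grade(0.9, 0.5, current_grade=3))  # → 3 (condition met)
print(recommend_quality_grade(0.3, 0.9, current_grade=3))  # → 1
```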
In the embodiment of the present application, the target service data is rendered simultaneously by the service server and by the terminal rendering component in the terminal. The terminal may determine a target layer from the first layer rendered by the terminal rendering component and the second layer rendered by the service server, and output the target layer in a display interface. It should be understood that, because the service server and the terminal rendering component render simultaneously, the terminal does not rely solely on the terminal rendering component when determining the target layer, but may also rely on the service server. The terminal rendering component renders at a lower layer quality grade, while the service server renders at a higher layer quality grade. When the target layer is output, the second layer transmitted by the service server is preferentially output as the target layer, so that the presented picture quality can be improved; if the second layer transmitted by the service server has not been received when the target layer is to be output, the first layer with the lower layer quality can be output as the target layer, so that fluency and timeliness are guaranteed. Therefore, by rendering with the terminal rendering component and the service server simultaneously, the method can improve the picture quality while ensuring smooth operation of the application client.
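The overall selection flow summarized above can be sketched as a simple fallback chain. The following Python sketch is illustrative only and not part of the disclosed embodiments; the string placeholders for layers and the tuple return convention are assumptions:

```python
# Illustrative sketch of the overall flow: prefer the server-rendered
# second layer for picture quality; fall back to the terminal-rendered
# first layer for fluency and timeliness; output lag prompt information
# when neither cached layer is available yet.

def select_target_layer(first_layer, second_layer):
    """Choose what to output when the display deadline arrives."""
    if second_layer is not None:
        return ("layer", second_layer)   # higher quality, output preferentially
    if first_layer is not None:
        return ("layer", first_layer)    # lower quality, keeps display fluent
    return ("lag_prompt", None)          # neither layer rendered in time

print(select_target_layer("lo-res", "hi-res"))  # → ('layer', 'hi-res')
print(select_target_layer("lo-res", None))      # → ('layer', 'lo-res')
print(select_target_layer(None, None))          # → ('lag_prompt', None)
```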
Further, please refer to fig. 8, where fig. 8 is a schematic structural diagram of a computer device according to an embodiment of the present application. As shown in fig. 8, the apparatus 1 in the embodiment corresponding to fig. 7 may be applied to the computer device 1000. The computer device 1000 may include: a processor 1001, a network interface 1004, and a memory 1005; furthermore, the computer device 1000 may also include: a user interface 1003 and at least one communication bus 1002. The communication bus 1002 is used to implement connection and communication between these components. The user interface 1003 may include a display (Display) and a keyboard (Keyboard); optionally, the user interface 1003 may also include a standard wired interface and a standard wireless interface. The network interface 1004 may optionally include a standard wired interface and a wireless interface (e.g., a Wi-Fi interface). The memory 1005 may be a high-speed RAM memory or a non-volatile memory, for example, at least one disk memory. Optionally, the memory 1005 may also be at least one storage device located remotely from the processor 1001. As shown in fig. 8, the memory 1005, which is a computer-readable storage medium, may include an operating system, a network communication module, a user interface module, and a device control application program.
In the computer device 1000 shown in fig. 8, the network interface 1004 may provide a network communication function; the user interface 1003 is an interface for providing a user with input; and the processor 1001 may be used to invoke a device control application stored in the memory 1005 to implement:
responding to the trigger operation aiming at the target control, generating a layer rendering instruction, and sending the layer rendering instruction to a terminal rendering component and a service server;
calling a first cache component and a second cache component; the first cache assembly is used for storing layers rendered by the terminal rendering assembly; the second cache component is used for storing the layer rendered by the service server;
when a first layer exists in the first cache assembly and a second layer exists in the second cache assembly, determining a target layer for responding to the trigger operation according to the first layer and the second layer; the first layer is obtained by rendering the target service data by the terminal rendering component according to the layer rendering instruction; the second layer is obtained by rendering the target service data by the service server according to the layer rendering instruction; the target business data is the business data triggered by the target control;
And outputting the target layer in the display interface.
It should be understood that the computer device 1000 described in this embodiment of the present application may perform the description of the data processing method in the embodiment corresponding to fig. 3 to fig. 6, and may also perform the description of the data processing apparatus 1 in the embodiment corresponding to fig. 7, which is not described herein again. In addition, the beneficial effects of the same method are not described in detail.
It should further be noted here that: an embodiment of the present application further provides a computer-readable storage medium, in which the computer program executed by the aforementioned data processing computer device 1000 is stored, and the computer program includes program instructions. When the processor executes the program instructions, the description of the data processing method in the embodiments corresponding to fig. 3 to fig. 6 can be performed, and details are therefore not repeated here. In addition, beneficial effects of using the same method are not described in detail again. For technical details not disclosed in the embodiments of the computer-readable storage medium referred to in the present application, reference is made to the description of the method embodiments of the present application.
The computer readable storage medium may be the data processing apparatus provided in any of the foregoing embodiments or an internal storage unit of the computer device, such as a hard disk or a memory of the computer device. The computer readable storage medium may also be an external storage device of the computer device, such as a plug-in hard disk, a Smart Memory Card (SMC), a Secure Digital (SD) card, a flash card (flash card), and the like, provided on the computer device. Further, the computer-readable storage medium may also include both an internal storage unit and an external storage device of the computer device. The computer-readable storage medium is used for storing the computer program and other programs and data required by the computer device. The computer readable storage medium may also be used to temporarily store data that has been output or is to be output.
In one aspect of the application, a computer program product or computer program is provided, the computer program product or computer program comprising computer instructions stored in a computer readable storage medium. The processor of the computer device reads the computer instructions from the computer-readable storage medium, and the processor executes the computer instructions to cause the computer device to perform the method provided by one aspect of the embodiments of the present application.
The terms "first", "second", and the like in the description, claims, and drawings of the embodiments of the present application are used for distinguishing between different objects, not for describing a particular order. Furthermore, the term "comprises" and any variations thereof are intended to cover a non-exclusive inclusion. For example, a process, method, apparatus, product, or device that comprises a list of steps or units is not limited to the listed steps or modules, but may optionally also include steps or modules that are not listed, or that are inherent to such a process, method, apparatus, product, or device.
Those of ordinary skill in the art will appreciate that the units and algorithm steps of the examples described in connection with the embodiments disclosed herein can be implemented in electronic hardware, computer software, or a combination of the two. In the foregoing description, the components and steps of the examples have been described generally in terms of their functionality in order to clearly illustrate the interchangeability of hardware and software. Whether such functionality is implemented as hardware or software depends on the particular application and the design constraints of the technical solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementations should not be considered beyond the scope of the present application.
The method and the related apparatus provided by the embodiments of the present application are described with reference to the flowchart and/or the structural diagram of the method provided by the embodiments of the present application, and each flow and/or block of the flowchart and/or the structural diagram of the method, and the combination of the flow and/or block in the flowchart and/or the block diagram can be specifically implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block or blocks of the block diagram.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block or blocks of the block diagram. These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block or blocks.
The above disclosure is only for the purpose of illustrating the preferred embodiments of the present application and is not to be construed as limiting the scope of the present application, so that the present application is not limited thereto, and all equivalent variations and modifications can be made to the present application.

Claims (14)

1. A data processing method, comprising:
responding to trigger operation aiming at a target control, generating a layer rendering instruction, and sending the layer rendering instruction to a terminal rendering component and a service server;
calling a first cache component and a second cache component; the first cache component is used for storing the layer rendered by the terminal rendering component; the second cache component is used for storing the layer rendered by the service server;
when a first layer exists in the first cache assembly and a second layer exists in the second cache assembly, determining a target layer for responding to the trigger operation according to the first layer and the second layer; the first layer is obtained by rendering target service data by the terminal rendering component according to the layer rendering instruction; the second layer is obtained by rendering the target service data by the service server according to the layer rendering instruction; the target business data is the business data triggered by the target control;
and outputting the target layer in a display interface.
2. The method of claim 1, wherein the layer rendering instructions comprise first rendering instructions and second rendering instructions;
the response is to the trigger operation of the target control, and the generation of the layer rendering instruction comprises the following steps:
responding to the trigger operation aiming at the target control, and detecting the component performance parameters of the terminal rendering component;
determining a terminal rendering quality grade according to the component performance parameters, and generating the first rendering instruction containing the terminal rendering quality grade;
generating the second rendering instructions for the business server;
the sending the layer rendering instruction to a terminal rendering component and a service server includes:
sending the first rendering instruction to the terminal rendering component, and rendering the target service data through the terminal rendering component to obtain a first layer matched with the terminal rendering quality grade;
and sending the second rendering instruction to the service server so that the service server renders the target service data according to the second rendering instruction to obtain a second layer.
3. The method according to claim 2, wherein when a first layer exists in the first cache component and a second layer exists in the second cache component, determining a target layer for responding to the trigger operation according to the first layer and the second layer includes:
when a first layer exists in the first cache assembly and a second layer exists in the second cache assembly, acquiring a first layer quality corresponding to the first layer and a second layer quality corresponding to the second layer; the first layer quality corresponds to the terminal rendering quality grade, and the second layer quality corresponds to the second rendering quality grade;
if the first layer quality is greater than the second layer quality, determining the first layer as the target layer;
and if the first layer quality is smaller than the second layer quality, determining the second layer as the target layer.
4. The method according to claim 1, wherein the first layer is composed of at least two first layer blocks; the second layer is composed of at least one second layer block;
when a first layer exists in the first cache component and a second layer exists in the second cache component, determining a target layer for responding to the trigger operation according to the first layer and the second layer includes:
when a first layer exists in the first cache component, a second layer exists in the second cache component, and the second layer is an incomplete layer, acquiring at least one second layer block;
obtaining, from the at least two first layer blocks, a first layer block with the same pixel position as the second layer block, taking the first layer block as an associated layer block, and replacing the associated layer block in the first layer with the second layer block;
and determining the replaced first layer as the target layer.
5. The method according to claim 4, wherein the obtaining, from the at least two first layer blocks, a first layer block with the same pixel position as the second layer block as the associated layer block comprises:
acquiring a first mark number corresponding to each first layer block, and acquiring a second mark number corresponding to each second layer block; the first mark number is used for representing the pixel position of the first layer block in the first layer, and the second mark number is used for representing the pixel position of the second layer block in the second layer;
determining, in the first mark numbers corresponding to the at least two first layer blocks, a first mark number which is the same as the second mark number, and determining the first mark number which is the same as the second mark number as an associated mark number;
and determining the first layer block corresponding to the associated mark number as the associated layer block.
6. The method of claim 4, wherein the replacing the associated layer block in the first layer with the second layer block comprises:
obtaining the layer quality corresponding to the associated layer block and the layer quality corresponding to the second layer block;
and if the layer quality corresponding to the second layer block is greater than the layer quality corresponding to the associated layer block, replacing the associated layer block in the first layer with the second layer block.
7. The method of claim 4, wherein the replacing the associated layer block in the first layer with the second layer block comprises:
obtaining the layer quality corresponding to the associated layer block and the layer quality corresponding to the second layer block;
if the layer quality corresponding to the second layer block is greater than the layer quality corresponding to the associated layer block, adjusting the visual parameter corresponding to the associated layer block to a default parameter to obtain an adjusted associated layer block; the adjusted associated layer block is a layer block in a transparent state;
and in the first layer, covering the adjusted associated layer block with the second layer block.
8. The method according to any one of claims 1 to 7, further comprising:
and when a first layer exists in the first cache assembly and a second layer does not exist in the second cache assembly, determining the first layer as the target layer.
9. The method according to any one of claims 1 to 7, further comprising:
and when the first layer does not exist in the first cache assembly and the second layer exists in the second cache assembly, determining the second layer as the target layer.
10. The method according to any one of claims 1 to 7, further comprising:
and when the first layer does not exist in the first cache assembly and the second layer does not exist in the second cache assembly, generating lag prompt information and outputting the lag prompt information in the display interface.
11. The method according to any one of claims 1 to 7, further comprising:
in the display process of the target layer, detecting network quality parameters and terminal operation parameters of a user terminal; the user terminal comprises the terminal rendering component;
if the network quality parameter and the terminal operation parameter do not meet the operation condition, acquiring a terminal performance parameter corresponding to the user terminal, and determining a recommended rendering quality grade according to the network quality parameter and the terminal performance parameter; the layer quality corresponding to the recommended rendering quality grade is smaller than the layer quality corresponding to the target layer;
generating quality recommendation information according to the recommended rendering quality grade, and outputting the quality recommendation information in the display interface;
responding to quality conversion confirmation operation aiming at the quality recommendation information, generating an updated layer rendering instruction containing the recommended rendering quality grade, and sending the updated layer rendering instruction to the terminal rendering component;
receiving an updated layer returned by the terminal rendering component according to the updated layer rendering instruction, and outputting the updated layer in the display interface; and the layer quality of the updated layer is matched with the recommended rendering quality grade.
12. A data processing apparatus, comprising:
the instruction generating module is used for responding to the triggering operation aiming at the target control and generating an image layer rendering instruction;
the instruction sending module is used for sending the layer rendering instruction to a terminal rendering component and a service server;
the component calling module is used for calling the first cache component and the second cache component; the first cache component is used for storing the layer rendered by the terminal rendering component; the second cache component is used for storing the layer rendered by the service server;
a target layer generation module, configured to determine, when a first layer exists in the first cache component and a second layer exists in the second cache component, a target layer for responding to the trigger operation according to the first layer and the second layer; the first layer is obtained by rendering target service data by the terminal rendering component according to the layer rendering instruction; the second layer is obtained by rendering the target service data by the service server according to the layer rendering instruction; the target business data is the business data triggered by the target control;
and the layer output module is used for outputting the target layer in a display interface.
13. A computer device, comprising: a processor, a memory, and a network interface;
the processor is connected to the memory and the network interface, wherein the network interface is configured to provide a network communication function, the memory is configured to store program code, and the processor is configured to call the program code to perform the method of any one of claims 1-11.
14. A computer-readable storage medium, characterized in that the computer-readable storage medium stores a computer program comprising program instructions which, when executed by a processor, cause the processor to perform the method of any one of claims 1-11.
CN202010827039.6A 2020-08-17 2020-08-17 Data processing method, device and equipment and readable storage medium Pending CN114073858A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010827039.6A CN114073858A (en) 2020-08-17 2020-08-17 Data processing method, device and equipment and readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010827039.6A CN114073858A (en) 2020-08-17 2020-08-17 Data processing method, device and equipment and readable storage medium

Publications (1)

Publication Number Publication Date
CN114073858A true CN114073858A (en) 2022-02-22

Family

ID=80280933

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010827039.6A Pending CN114073858A (en) 2020-08-17 2020-08-17 Data processing method, device and equipment and readable storage medium

Country Status (1)

Country Link
CN (1) CN114073858A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114816193A (en) * 2022-04-29 2022-07-29 阿里巴巴(中国)有限公司 Drawing display method and device and terminal equipment
WO2024061180A1 (en) * 2022-09-19 2024-03-28 杭州阿里云飞天信息技术有限公司 Cloud desktop system, cloud desktop display method, terminal device and storage medium

Similar Documents

Publication Publication Date Title
CN110636353B (en) Display device
US10810686B2 (en) Identification of rule violations in a network community
CN111770366A (en) Message reissue method, server and display device
CN110675872B (en) Voice interaction method based on multi-system display equipment and multi-system display equipment
CN111752518A (en) Screen projection method of display equipment and display equipment
CN113784220B (en) Method for playing media resources, display device and mobile device
CN112698905B (en) Screen saver display method, display device, terminal device and server
CN112073664B (en) Video call method and display device
CN111836115B (en) Screen saver display method, screen saver skipping method and display device
CN112073762B (en) Information acquisition method based on multi-system display equipment and multi-system display equipment
CN114339332B (en) Mobile terminal, display device and cross-network screen projection method
US11425466B2 (en) Data transmission method and device
CN114073858A (en) Data processing method, device and equipment and readable storage medium
CN112399232A (en) Display equipment, camera priority use control method and device
CN112399263A (en) Interaction method, display device and mobile terminal
CN112165641A (en) Display device
CN111984167B (en) Quick naming method and display device
CN111897641B (en) Micro-service monitoring and scheduling method and display device
CN111935510B (en) Double-browser application loading method and display equipment
CN114079819A (en) Content display method and display equipment
CN113784186B (en) Terminal device, server, and communication control method
CN108429925B (en) View display method and device
CN106383705B (en) Method and device for setting mouse display state in application thin client
CN114390190B (en) Display equipment and method for monitoring application to start camera
CN112073812B (en) Application management method on smart television and display device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination