CN110795074B - Application data processing method and device, computer equipment and storage medium

Info

Publication number
CN110795074B
CN110795074B (application CN201810865998.XA)
Authority
CN
China
Prior art keywords
data, virtual object, dynamic analysis, image, target
Legal status
Active
Application number
CN201810865998.XA
Other languages
Chinese (zh)
Other versions
CN110795074A (en)
Inventor
雷志强 (Lei Zhiqiang)
刘畅 (Liu Chang)
Current Assignee
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Application filed by Tencent Technology Shenzhen Co Ltd
Priority to CN201810865998.XA
Publication of CN110795074A (application publication)
Application granted
Publication of CN110795074B (granted publication)
Legal status: Active

Classifications

    • G06F 8/37: Compiler construction; Parser generation (under G06F 8/30 Creation or generation of source code, G06F 8/00 Arrangements for software engineering, G06F Electric digital data processing, G06 Computing; Calculating or counting, G Physics)
    • G06F 8/315: Object-oriented languages (under G06F 8/31 Programming languages or programming paradigms, G06F 8/30 Creation or generation of source code)
    • G06F 8/447: Target code generation (under G06F 8/44 Encoding, G06F 8/41 Compilation, G06F 8/40 Transformation of program code)

Abstract

The application relates to an application data processing method and apparatus, a computer-readable storage medium, and a computer device. The method comprises: loading dynamic analysis code data; acquiring feature data corresponding to interactive application data; parsing the dynamic analysis code data, and parsing a first drawing command in the dynamic analysis code data into a second drawing command corresponding to an image drawing engine; and executing the second drawing command according to the feature data to obtain a corresponding drawing result. In this scheme, the dynamic analysis code data is completely decoupled from the code of the native application, which improves the development efficiency of the dynamic analysis code data.

Description

Application data processing method and device, computer equipment and storage medium
Technical Field
The present application relates to the field of computer technologies, and in particular, to an application data processing method and apparatus, a computer device, and a storage medium.
Background
With the rapid development of computer technology, great convenience has been brought to people's lives. Through a terminal, people can not only acquire information and browse videos, but also control various virtual objects displayed in the terminal interface, interacting with the terminal and even with remote users of other terminals.
However, at present, interaction between a native application on a terminal and an interactive application attached to it usually requires the code of the interactive application to be compiled and packaged into the native code of the native application. When the interactive application needs to be updated, the native application must be updated along with it, which leads to long development cycles for both applications and low push efficiency for the interactive application.
Disclosure of Invention
Therefore, it is necessary to provide an application data processing method, apparatus, computer device, and storage medium that address the above technical problems. The code of the interactive application associated with a native application is completely decoupled from the code of the native application, so when the interactive application needs to be updated, only the interactive application itself needs to be updated, which improves both the development efficiency and the push efficiency of the interactive application.
A method of application data processing, the method comprising:
loading dynamic analysis code data;
acquiring feature data corresponding to interactive application data;
parsing the dynamic analysis code data, and parsing a first drawing command in the dynamic analysis code data into a second drawing command corresponding to an image drawing engine; and
executing the second drawing command according to the feature data to obtain a corresponding drawing result.
An application data processing apparatus, the apparatus comprising:
a dynamic analysis code data loading module, configured to load dynamic analysis code data;
a feature data acquisition module, configured to acquire feature data corresponding to interactive application data;
a dynamic analysis code data parsing module, configured to parse the dynamic analysis code data and parse a first drawing command in the dynamic analysis code data into a second drawing command corresponding to an image drawing engine; and
a feature data execution module, configured to execute the second drawing command according to the feature data to obtain a corresponding drawing result.
A computer device comprising a memory, a processor, and a computer program stored on the memory and executable on the processor, wherein the processor, when executing the program, implements the following steps:
loading dynamic analysis code data;
acquiring feature data corresponding to interactive application data;
parsing the dynamic analysis code data, and parsing a first drawing command in the dynamic analysis code data into a second drawing command corresponding to an image drawing engine; and
executing the second drawing command according to the feature data to obtain a corresponding drawing result.
A computer-readable storage medium storing a computer program which, when executed by a processor, causes the processor to perform the following steps:
loading dynamic analysis code data;
acquiring feature data corresponding to interactive application data;
parsing the dynamic analysis code data, and parsing a first drawing command in the dynamic analysis code data into a second drawing command corresponding to an image drawing engine; and
executing the second drawing command according to the feature data to obtain a corresponding drawing result.
An application data processing method, comprising:
loading dynamic analysis code data;
acquiring feature data corresponding to interactive application data, wherein the interactive application data comprises a current video frame;
parsing the dynamic analysis code data, and parsing a first drawing command in the dynamic analysis code data into a second drawing command corresponding to an image drawing engine;
executing the second drawing command according to the feature data to obtain drawing state information of a target virtual object corresponding to the current video frame;
drawing the target virtual object in the current video frame according to the drawing state information; and
playing the current video frame including the target virtual object.
A computer device comprising a memory, a processor, and a computer program stored on the memory and executable on the processor, wherein the processor, when executing the program, implements the following steps:
loading dynamic analysis code data;
acquiring feature data corresponding to interactive application data, wherein the interactive application data comprises a current video frame;
parsing the dynamic analysis code data, and parsing a first drawing command in the dynamic analysis code data into a second drawing command corresponding to an image drawing engine;
executing the second drawing command according to the feature data to obtain drawing state information of a target virtual object corresponding to the current video frame;
drawing the target virtual object in the current video frame according to the drawing state information; and
playing the current video frame including the target virtual object.
A computer-readable storage medium storing a computer program which, when executed by a processor, causes the processor to perform the following steps:
loading dynamic analysis code data;
acquiring feature data corresponding to interactive application data, wherein the interactive application data comprises a current video frame;
parsing the dynamic analysis code data, and parsing a first drawing command in the dynamic analysis code data into a second drawing command corresponding to an image drawing engine;
executing the second drawing command according to the feature data to obtain drawing state information of a target virtual object corresponding to the current video frame;
drawing the target virtual object in the current video frame according to the drawing state information; and
playing the current video frame including the target virtual object.
According to the application data processing method and apparatus, the computer-readable storage medium, and the computer device, the terminal loads the dynamic analysis code data, obtains the feature data corresponding to the interactive application data, parses the dynamic analysis code data, parses a first drawing command in the dynamic analysis code data into a second drawing command corresponding to the image drawing engine, and executes the second drawing command according to the feature data to obtain a corresponding drawing result. The dynamic analysis code data is the code data of an interactive application, an application associated with the native application that can be entered from an entry provided by the native application. Because the code of the interactive application is completely decoupled from the code of the native application, when the interactive application needs to be updated, only its dynamic analysis code data needs to be updated; the native application does not need to be updated with it. The updated dynamic analysis code data can therefore be pushed in time, which improves the push efficiency of the interactive application and reduces development difficulty. Furthermore, the dynamic analysis code data is unaffected by the terminal's operating system: terminals running different systems can all load the same dynamic analysis code data, so it does not need to be developed separately for each system, which further reduces development difficulty and shortens development time.
Drawings
FIG. 1 is a diagram of the application environment of the application data processing method in one embodiment;
FIG. 2 is a flowchart illustrating the application data processing method in one embodiment;
FIG. 3 is a flowchart illustrating the step of obtaining feature data corresponding to interactive application data in one embodiment;
FIG. 4 is a diagram illustrating identification of a target interaction subject region in a current video frame in one embodiment;
FIG. 5 is a flowchart illustrating the step of parsing the dynamic analysis code data in one embodiment;
FIG. 6 is a diagram illustrating parsing of a first drawing command in the dynamic analysis code data into a second drawing command corresponding to the image drawing engine in one embodiment;
FIG. 7 is a flowchart illustrating the step of executing the second drawing command according to the feature data to obtain a corresponding drawing result in one embodiment;
FIG. 7A is a diagram illustrating obtaining a target virtual object according to the second drawing command in one embodiment;
FIG. 8 is a flowchart illustrating an application data processing method in another embodiment;
FIG. 9 is a diagram illustrating combining a first image corresponding to a drawing result with a second image corresponding to interactive application data to obtain a composite image in one embodiment;
FIG. 10 is a flowchart illustrating an application data processing method in still another embodiment;
FIG. 11 is a flowchart illustrating an application data processing method in still another embodiment;
FIG. 12 is a diagram illustrating an application scenario of the application data processing method in one embodiment;
FIG. 12A is a diagram illustrating an application scenario of the application data processing method in another embodiment;
FIG. 12B is a diagram illustrating an application scenario of the application data processing method in another embodiment;
FIG. 13 is a block diagram of an application data processing apparatus in one embodiment;
FIG. 14 is a block diagram of the feature data acquisition module in one embodiment;
FIG. 15 is a block diagram of the dynamic analysis code data parsing module in one embodiment;
FIG. 16 is a block diagram of the feature data execution module in one embodiment;
FIG. 17 is a block diagram of a computer device in one embodiment.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application.
FIG. 1 is a diagram of the application environment of the application data processing method in one embodiment. Referring to fig. 1, the application data processing method is applied to an application data processing system comprising a terminal 110 and a server 120 connected through a network. The terminal 110 may specifically be a desktop terminal or a mobile terminal, and the mobile terminal may specifically be at least one of a mobile phone, a tablet computer, a notebook computer, and the like. The server 120 may be implemented as a stand-alone server or as a server cluster composed of a plurality of servers.
Specifically, the terminal 110 requests the server 120 to issue the native application code data, receives the native application code data sent by the server 120, and installs the corresponding native application; the native application code data does not include the dynamic analysis code data. When downloading of the dynamic analysis code data is triggered from the entry provided by the native application, the terminal requests the server 120 to issue the dynamic analysis code data. Different interactive applications correspond to different dynamic analysis code data, and the request may carry an interactive application identifier so that the corresponding dynamic analysis code data is downloaded. When detecting that the code data corresponding to the native application does not include the dynamic analysis code data, the server 120 may also actively push the dynamic analysis code data to the terminal 110, without the terminal 110 triggering the download through the entry provided by the native application.
The server 120 returns the dynamic analysis code data to the terminal 110. The terminal 110 loads the dynamic analysis code data, obtains the feature data corresponding to the interactive application data, parses the dynamic analysis code data, parses a first drawing command in the dynamic analysis code data into a second drawing command corresponding to the image drawing engine, and finally executes the second drawing command according to the feature data to obtain a corresponding drawing result. When the dynamic analysis code data on the terminal needs to be updated, the server 120 actively pushes the updated dynamic analysis code data to the terminal, so the terminal does not need to trigger a download request for it through the entry provided by the native application.
As shown in FIG. 2, in one embodiment, an application data processing method is provided. The embodiment is mainly illustrated by applying the method to the terminal 110 in fig. 1. Referring to fig. 2, the application data processing method specifically includes the following steps:
Step 202, loading dynamic analysis code data.
The dynamic analysis code data refers to source code written in a programming language that does not need to be compiled and is instead parsed and executed dynamically and efficiently at runtime. Such programming languages include, but are not limited to, JavaScript, Python, PHP, Lua, and the like. In one embodiment, an interactive application is embedded in the native application on the terminal: the interactive application is an application associated with the native application and can be entered through an entry provided by the native application. The dynamic analysis code data is the code data of the interactive application rather than that of the native application; that is, the code data of the native application and the dynamic analysis code data of the interactive application are completely decoupled. For example, if a game application is embedded in the native application, the game application is an interactive application associated with the native application and can be entered through an entry provided by the native application. The code data of the native application contains no game-related code data, the two are kept separate, and the game-related code data may be the dynamic analysis code data.
Specifically, the dynamic analysis code data may be stored in the server in advance. The terminal may actively request the server to issue the dynamic analysis code data according to service or user requirements, and the server returns the dynamic analysis code data in response. After receiving the dynamic analysis code data for the first time, the terminal loads it and saves the successfully loaded copy locally. The next time the dynamic analysis code data is needed, it can be obtained directly from local storage without requesting it from the server again.
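This load-once, then serve-from-cache flow can be sketched as follows in C++. This is a minimal illustration rather than the patent's implementation; DownloadFromServer and the cache path are assumed names.

```cpp
#include <filesystem>
#include <fstream>
#include <sstream>
#include <string>

// Hypothetical helper: fetch the dynamic analysis code data from the server.
std::string DownloadFromServer(const std::string& interactiveAppId);

// Load the dynamic analysis code data: prefer the local copy saved after the
// first successful load; otherwise request it from the server and cache it.
std::string LoadDynamicCode(const std::string& interactiveAppId) {
    const std::filesystem::path cached = "cache/" + interactiveAppId + ".js";
    if (std::filesystem::exists(cached)) {
        std::ifstream in(cached);
        std::ostringstream buf;
        buf << in.rdbuf();
        return buf.str();                 // no server round trip needed
    }
    std::string code = DownloadFromServer(interactiveAppId);
    std::ofstream out(cached);
    out << code;                          // persist for the next launch
    return code;
}
```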
In one embodiment, when detecting that the code data corresponding to the native application does not include the dynamic analysis code data, the server may actively push the dynamic analysis code data to the terminal, without the terminal triggering the download through the entry provided by the native application.
In one embodiment, when the dynamic analysis code data on the terminal needs to be updated, the server actively pushes the updated dynamic analysis code data to the terminal, so the terminal does not need to trigger a download request for it through the entry provided by the native application.
Step 204, acquiring feature data corresponding to the interactive application data.
An interactive application is an application associated with the native application and can be entered through an entry provided by the native application. The interactive application may be a game application, a short-video application, a shooting application, or the like, associated with the native application. The interactive application data refers to data related to the interactive application and can be collected in real time through the interactive application; it may be video data, audio data, or both.
The feature data refers to characteristic data within the interactive application data and can be identified from the interactive application data according to preset conditions. The preset conditions can be customized, for example obtaining the feature data from the position of a target feature point corresponding to a target interaction subject region in the interactive application data, or extracting audio features from the interactive application data.
In one embodiment, if the customized preset condition is to determine the feature data from the position of a target feature point corresponding to a target interaction subject region, the interactive application data is first obtained, the target interaction subject region meeting the preset condition is identified, and a target feature point corresponding to that region, such as a face or a gesture, is then identified. The feature data corresponding to the interactive application data is finally determined from the position of the target feature point: for example, face coordinate data, gesture coordinate data, or both may serve as the feature data. If no target interaction subject exists in the interactive application data, an audio frame of the interactive application data can be obtained, audio features such as frequency can be extracted from it, and the feature data can be determined from those audio features.
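For illustration, the feature data of this step might be represented as below, assuming hypothetical VideoFrame/AudioFrame types and detector functions; the patent specifies only that face or gesture coordinates are used when a target interaction subject exists, with audio features as the fallback.

```cpp
#include <optional>
#include <utility>

struct VideoFrame;   // opaque frame types, assumed for illustration
struct AudioFrame;

// Hypothetical detectors and extractors.
bool DetectSubject(const VideoFrame&);
std::pair<float, float> LocateFace(const VideoFrame&);
std::pair<float, float> LocateGesture(const VideoFrame&);
float DominantFrequency(const AudioFrame&);

// Feature data: target feature point positions when a target interaction
// subject exists; otherwise an audio feature extracted from the audio frame.
struct FeatureData {
    std::optional<std::pair<float, float>> facePos;
    std::optional<std::pair<float, float>> gesturePos;
    std::optional<float> audioFrequency;
};

FeatureData ExtractFeatures(const VideoFrame* frame, const AudioFrame* audio) {
    FeatureData f;
    if (frame != nullptr && DetectSubject(*frame)) {
        f.facePos = LocateFace(*frame);        // face coordinates as feature data
        f.gesturePos = LocateGesture(*frame);  // gesture coordinates as feature data
    } else if (audio != nullptr) {
        f.audioFrequency = DominantFrequency(*audio);  // fallback audio feature
    }
    return f;
}
```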
Step 206, parsing the dynamic analysis code data, and parsing a first drawing command in the dynamic analysis code data into a second drawing command corresponding to the image drawing engine.
The first drawing command is the command in the dynamic analysis code data that instructs the terminal to draw; it is obtained by parsing the dynamic analysis code data, specifically according to the application logic in it. After the dynamic analysis code data is loaded, it is parsed, and the first drawing command is derived from its application logic. Since the first drawing command is not a command the image drawing engine can execute, it is parsed into a second drawing command that the image drawing engine can execute; the image drawing engine then draws according to the second drawing command. The image drawing engine is an engine for image rendering and may be, but is not limited to, OpenGL (Open Graphics Library, a GPU-based graphics rendering interface) or DirectX (a multimedia programming interface).
In one embodiment, parsing the first drawing command in the dynamic analysis code data into the second drawing command corresponding to the image drawing engine may involve first obtaining the drawing parameters carried by the first drawing command, determining the drawing data corresponding to the second drawing command from those parameters, then obtaining the image drawing engine data corresponding to the image drawing engine, and finally binding the drawing data to the image drawing engine data.
Step 208, executing the second drawing command according to the feature data to obtain a corresponding drawing result.
Specifically, after the feature data corresponding to the interactive application data is obtained, the second drawing command can be executed according to the feature data to obtain a corresponding drawing result. In one embodiment, the second drawing command may be a preset function executable by the image drawing engine; the obtained feature data serves as the input variables of the preset function, and the image drawing engine draws according to the function to produce the drawing result. The drawing result may be an audio-video picture including a plurality of virtual objects, where some virtual objects are displayed at designated positions determined by the feature data and the rest are displayed at random positions. Alternatively, the drawing result may be an audio-video picture including at least one virtual object displayed at a designated position determined by the feature data.
According to the above application data processing method, the terminal loads the dynamic analysis code data, obtains the feature data corresponding to the interactive application data, parses the dynamic analysis code data, parses a first drawing command in it into a second drawing command corresponding to the image drawing engine, and executes the second drawing command according to the feature data to obtain a corresponding drawing result. The dynamic analysis code data is the code data of an interactive application, an application associated with the native application that can be entered from an entry provided by the native application, and its code is completely decoupled from the code of the native application. When the interactive application needs to be updated, only its dynamic analysis code data needs to be updated, without updating the native application, so the updated dynamic analysis code data can be pushed in time; this improves the push efficiency of the interactive application and reduces development difficulty. Furthermore, the dynamic analysis code data is unaffected by the terminal's operating system: terminals running different systems can all load it, so it does not need to be developed separately for each system, which reduces development difficulty and shortens development time.
In one embodiment, as shown in fig. 3, the interactive application data includes video data and/or audio data, and obtaining the feature data corresponding to the interactive application data includes:
Step 302, acquiring a current video frame, and identifying a target interaction subject region in the current video frame.
Step 304, identifying a target feature point corresponding to the target interaction subject region, and determining first feature data corresponding to the interactive application data according to the position of the target feature point.
The interactive application data may include only video data, only audio data, or both. In one embodiment, if the interactive application data includes only video data, a current video frame, i.e., a video picture collected by the terminal in real time, is obtained, and the target interaction subject region in it is identified according to a preset recognition condition. The preset recognition condition can be customized, for example by taking a certain region of the target interaction subject as the target interaction subject region: the region where a gesture is located, where the mouth is located, where the chin is located, and so on. The target interaction subject may be a person, an animal, or the like in the current video frame.
Further, after the target interaction subject region in the current video frame is identified, the target feature point corresponding to that region is identified, and the first feature data corresponding to the interactive application data is determined from the position of the target feature point, for example by taking the coordinate position of the target feature point as the first feature data.
FIG. 4 is a schematic diagram of identifying the target interaction subject region in a current video frame in one embodiment. The current video frame shown may be a video frame collected by the terminal in real time. When a target interaction subject exists in the current video frame, the target interaction subject region is identified according to the preset recognition condition; for example, the region where the person's nose is located in fig. 4 may be determined as the target interaction subject region. Further, the target feature point corresponding to that region is identified, and the first feature data is determined from its position; for example, the coordinate position of the person's nose in fig. 4 is taken as the first feature data corresponding to the interactive application data.
Step 306, obtaining the current audio frame, and extracting the audio features corresponding to the current audio frame to obtain second feature data corresponding to the interactive application data.
In one embodiment, if the interactive application data includes only audio data, a current audio frame is first obtained, collected in real time by the audio collection device of the terminal. The audio features corresponding to the current audio frame are then extracted and used as the second feature data corresponding to the interactive application data. The audio features can be extracted from the current audio frame by a suitable audio feature algorithm, determined from the audio signal of the frame, extracted according to the specific content of the frame, or extracted by dedicated hardware. The audio features may be, but are not limited to, audio frequency magnitudes and audio attributes, where audio attributes include, but are not limited to, male voice, female voice, animal sounds, and the like.
In one embodiment, if the interactive application data includes both video data and audio data, steps 302 and 304 may be performed, then step 306, and the feature data corresponding to the interactive application data is finally determined from the first feature data and the second feature data. How the two are combined can be customized: for example, the first and second feature data can be weighted by preset proportions and combined into comprehensive feature data used as the feature data, or their average can be used as the feature data, and so on.
It should be noted that step 306 and steps 302-304 have no fixed execution order: they may be executed simultaneously, steps 302 and 304 may be executed before step 306, or only steps 302 and 304, or only step 306, may be executed.
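A one-line sketch of the weighted combination mentioned above; the 0.7/0.3 proportions and the scalar representation of the feature data are arbitrary illustrative choices, not values from the patent.

```cpp
// Combine first (video) and second (audio) feature data by preset proportions.
float CombineFeatures(float firstFeature, float secondFeature,
                      float firstWeight = 0.7f, float secondWeight = 0.3f) {
    return firstFeature * firstWeight + secondFeature * secondWeight;
}
```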
In one embodiment, as shown in fig. 5, parsing the dynamic analysis code data and parsing a first drawing command in the dynamic analysis code data into a second drawing command corresponding to the image drawing engine includes:
Step 502, obtaining a first drawing command and a first drawing parameter corresponding to the first drawing command, where the first drawing command is determined by the application logic corresponding to the dynamic analysis code data.
Specifically, the first drawing command is determined according to the application logic of the successfully loaded dynamic analysis code data, and the first drawing parameter corresponding to it is then obtained. The first drawing command may be a custom preset function called by that application logic, for example a rectangle-drawing preset function or a circle-drawing preset function; a fillRect function that draws a rectangle could be the first drawing command. The first drawing parameter is the set of input variables of the first drawing command; if the first drawing command is a custom preset function, the first drawing parameter comprises all the variables of that function.
FIG. 6 is a schematic diagram of parsing a first drawing command in the dynamic analysis code data into a second drawing command corresponding to the image drawing engine in one embodiment. Here the dynamic analysis code data is JavaScript code, and the first drawing command determined by the application logic in that code is the rectangle function fillRect(x, y, w, h, color); the corresponding first drawing parameter consists of x, y, w, h, and color.
Step 504, determining second drawing data corresponding to the second drawing command according to the first drawing parameter.
Specifically, the second drawing command is a drawing command executable by the image drawing engine. After the first drawing parameter is acquired, the second drawing data corresponding to the second drawing command is determined from it. As shown in fig. 6, the first drawing parameter consists of the variables x, y, w, h, and color of the rectangle function fillRect(x, y, w, h, color): x and y carry the feature data; w and h represent the screen size of the terminal and may be parsed from the dynamic analysis code data or customized according to service or user requirements; and color represents the color the image drawing engine should draw with, likewise parsed from the dynamic analysis code data or customized. The specific values of these variables are then used to compute the second drawing data corresponding to the second drawing command. For example, the feature data is substituted into x and y, the terminal screen size and drawing color into w, h, and color, and the second drawing data, the OpenGL vertex data in fig. 6, is calculated. OpenGL is an image drawing engine.
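To make this concrete, here is a hedged C++ sketch of turning a fillRect-style first drawing command into second drawing data: four vertices in OpenGL normalized device coordinates. The helper names are assumptions; the patent states only that x and y carry the feature data, w and h the screen size, and that the output is OpenGL vertex data.

```cpp
#include <array>

struct Vertex { float ndcX, ndcY; };

// fillRect(x, y, rectW, rectH) in pixel coordinates -> 4 vertices in
// OpenGL normalized device coordinates ([-1, 1], y axis flipped).
std::array<Vertex, 4> FillRectToVertices(float x, float y,
                                         float rectW, float rectH,
                                         float screenW, float screenH) {
    auto toNdc = [&](float px, float py) -> Vertex {
        return { 2.0f * px / screenW - 1.0f,
                 1.0f - 2.0f * py / screenH };
    };
    return { toNdc(x,         y),
             toNdc(x + rectW, y),
             toNdc(x,         y + rectH),
             toNdc(x + rectW, y + rectH) };   // triangle-strip order
}
```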
Step 506, obtaining image drawing engine data corresponding to the second drawing command, and binding the second drawing data with the image drawing engine data.
Specifically, the image drawing engine data refers to data related to the image drawing engine. After the second drawing data is determined from the first drawing parameter, the image drawing engine data corresponding to the second drawing command, such as the OpenGL frame buffer in fig. 6, is obtained and bound to the second drawing data. As shown in fig. 6, the vertex count, the vertex coordinates, and the texture ID are bound to a designated OpenGL frame buffer: the vertex count is the number of vertices of the rectangle function, which is 4 for a rectangle; the vertex coordinates are the coordinate positions derived from the feature data; and the texture ID identifies the color the image drawing engine should draw with. Binding the second drawing data to the image drawing engine data therefore means binding the vertex count, the vertex coordinates, and the texture ID to the designated OpenGL frame buffer.
Further, as shown in fig. 6, after the second drawing data is bound to the designated OpenGL frame buffer, the image drawing engine executes the corresponding shader and renders into the bound OpenGL frame buffer. Finally, the image drawing engine outputs the current buffered frame, which is the drawing result rendered by the image drawing engine.
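The binding described above might look roughly like this OpenGL ES 2.0 sketch: the texture ID and the vertex data are bound to a designated frame buffer, and the shader then draws the four rectangle vertices into it. Program and texture creation are assumed to happen elsewhere, and the attribute name aPos is an assumption.

```cpp
#include <GLES2/gl2.h>

// Draw the rectangle's 4 vertices into the designated frame buffer.
void DrawRectToFramebuffer(GLuint fbo, GLuint program, GLuint colorTex,
                           const float* verts /* 4 vertices * 2 floats */) {
    glBindFramebuffer(GL_FRAMEBUFFER, fbo);             // designated frame buffer
    glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                           GL_TEXTURE_2D, colorTex, 0); // texture ID binding

    GLuint vbo = 0;
    glGenBuffers(1, &vbo);
    glBindBuffer(GL_ARRAY_BUFFER, vbo);
    glBufferData(GL_ARRAY_BUFFER, 4 * 2 * sizeof(float), verts,
                 GL_STREAM_DRAW);                        // vertex data binding

    glUseProgram(program);                               // execute the shader
    GLint pos = glGetAttribLocation(program, "aPos");
    glEnableVertexAttribArray(static_cast<GLuint>(pos));
    glVertexAttribPointer(static_cast<GLuint>(pos), 2, GL_FLOAT, GL_FALSE,
                          0, nullptr);
    glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);               // 4 rectangle vertices

    glDeleteBuffers(1, &vbo);
}
```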
In one embodiment, as shown in fig. 7, executing the second drawing command according to the feature data to obtain a corresponding drawing result includes:
Step 702, obtaining a target virtual object according to the second drawing command.
Step 704, determining drawing state information of the target virtual object according to the characteristic data.
Step 706, drawing the target virtual object according to the drawing state information to obtain a corresponding drawing result.
The target virtual object may be a virtual character or target object to be controlled, or a virtual character or target object being attacked. In a game application scenario, the target virtual object may be a virtual object that attacks other virtual characters or targets and can release skills, or it may be the attacked virtual character or target object.
Specifically, an attacking virtual object and an attacked virtual object are obtained according to the second drawing command. The attacking virtual object is a virtual character or target used to attack the attacked virtual object and can release corresponding attack skills; the attacked virtual object is attacked by the attacking virtual object and can release corresponding counterattack skills. Both are target virtual objects. FIG. 7A is a schematic diagram of obtaining the target virtual object according to the second drawing command in one embodiment: each rectangle in fig. 7A is an attacked virtual object, and the airplane is the attacking virtual object.
After the target virtual object is obtained according to the second drawing command, the drawing state information of the target virtual object is determined according to the feature data corresponding to the interactive application data, and the target virtual object is finally drawn according to that drawing state information to obtain the corresponding drawing result. The drawing state information includes, but is not limited to, the position, state, orientation, and color of the target virtual object. In other words, the attacking virtual object is presented at a designated position determined by the feature data, while each attacked virtual object can be presented at a random position. In fig. 7A, the drawing state information of the target virtual objects is determined from the feature data and the objects are drawn accordingly: the attacking airplane is shown at the designated position, and each attacked rectangle is shown at a random position.
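A small sketch, under assumed names, of how the drawing state information might be derived: the attacking virtual object takes its designated position from the feature data, while each attacked virtual object is placed randomly.

```cpp
#include <cstdlib>
#include <vector>

struct DrawState { float x, y; };   // minimal drawing state: position only

struct Scene {
    DrawState attacker;                // e.g. the airplane of FIG. 7A
    std::vector<DrawState> attacked;   // e.g. the rectangles of FIG. 7A
};

Scene BuildScene(float featureX, float featureY, int attackedCount,
                 float screenW, float screenH) {
    Scene s;
    s.attacker = { featureX, featureY };   // designated position from feature data
    for (int i = 0; i < attackedCount; ++i) {
        s.attacked.push_back({
            screenW * (std::rand() / static_cast<float>(RAND_MAX)),  // random x
            screenH * (std::rand() / static_cast<float>(RAND_MAX))   // random y
        });
    }
    return s;
}
```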
In one embodiment, as shown in fig. 8, the application data processing method further includes:
Step 802, combining a first image corresponding to the drawing result with a second image corresponding to the interactive application data to obtain a composite image.
Step 804, displaying the composite image.
Specifically, after the second drawing command is executed according to the feature data to obtain the drawing result, the drawing result can be put to full use: the first image corresponding to the drawing result is combined with the second image corresponding to the interactive application data to obtain a composite image, which is then displayed. Combining the two images may specifically mean overlaying the first image corresponding to the drawing result onto the second image corresponding to the interactive application data, so that the composite image contains both the first image corresponding to the drawing result and the second image corresponding to the interactive application data.
FIG. 9 is a schematic diagram of combining a first image corresponding to the drawing result with a second image corresponding to the interactive application data to obtain a composite image in one embodiment. The first image may contain an attacking virtual object and attacked virtual objects: the attacking virtual object, the airplane in fig. 9, is used to attack other virtual characters or targets and can release corresponding attack skills; the attacked virtual objects, the rectangles in fig. 9, are attacked by the attacking virtual object and can release corresponding counterattack skills. In the first image, the position of the attacking virtual object is derived from the feature data corresponding to the interactive application data and shown at the designated position, while the attacked virtual objects are drawn according to the dynamic analysis code data and shown at random positions. The interactive application data is data related to the interactive application, such as audio-video data, and the second image may be the audio-video picture collected by the terminal in real time; the first image is overlaid on that picture to obtain the final composite image. In fig. 9, the person is the second image corresponding to the interactive application data, the rectangles and airplane above it constitute the first image corresponding to the drawing result, and the image shown is the composite image finally displayed.
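As one possible reading of the overlay step, here is a minimal CPU-side alpha blend of the first image over the second; in practice the compositing would more likely run on the GPU, and the same-size RGBA8 layout is an assumption.

```cpp
#include <cstdint>
#include <vector>

// Overlay the first image (drawing result, with alpha) onto the second image
// (live audio-video picture). Both are RGBA8 buffers of identical size.
void Composite(std::vector<uint8_t>& base,           // second image, modified in place
               const std::vector<uint8_t>& overlay)  // first image
{
    for (size_t i = 0; i + 3 < base.size(); i += 4) {
        const float a = overlay[i + 3] / 255.0f;     // overlay alpha
        for (int c = 0; c < 3; ++c)
            base[i + c] = static_cast<uint8_t>(
                overlay[i + c] * a + base[i + c] * (1.0f - a));
    }
}
```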
In one embodiment, before loading the dynamic analysis code data, the method further comprises: sending a download request to the server through the native application so that the server returns the dynamic analysis code data according to the download request, the dynamic analysis code data being used to run the interactive application.
The dynamic analysis code data is code data that does not need to be compiled and packaged into the native application in advance; the terminal can obtain it from the server according to service or user requirements to realize the corresponding functions. The interactive application is an application associated with the native application and can be entered from an entry provided by the native application, and its code is completely decoupled from the code of the native application: the native code data contains no code data related to the interactive application, and the dynamic analysis code data is used only to run the interactive application. Specifically, before the dynamic analysis code data is loaded, a download request can be sent to the server through the native application; the server returns the dynamic analysis code data according to the request, and the terminal saves it locally after receiving it, so that it can be obtained directly from local storage the next time it is needed, without requesting it from the server again. Because the dynamic analysis code data and the native code data are completely decoupled, the pre-compiled native installation package contains no dynamic analysis code data, and downloading the dynamic analysis code data from the server and storing it locally does not affect the size of that installation package.
In one embodiment, as shown in fig. 10, the application data processing method further includes:
Step 1002, starting a first thread, obtaining the interactive application data through the first thread, and analyzing the interactive application data to obtain the feature data corresponding to it.
Step 1004, the first thread sends the feature data to a second thread. The second thread is used to load the dynamic analysis code data and start the image drawing engine; the image drawing engine parses the dynamic analysis code data, parses a first drawing command in it into a second drawing command corresponding to the image drawing engine, and executes the second drawing command according to the feature data to obtain a corresponding drawing result.
Step 1006, the first thread reads the drawing result and composites it with the interactive application data for display.
The first thread and the second thread are basic units of program execution and run their corresponding programs; the image drawing engine runs on the second thread. Specifically, the terminal starts the first thread and the second thread at the same time. The interactive application data is collected in real time by the first thread, which analyzes it to obtain the corresponding feature data and actively sends that feature data to the second thread.
If the dynamic analysis code data is not stored locally on the terminal, the second thread sends a download request to the server, the server returns the dynamic analysis code data according to the request, and the second thread saves it locally. If the dynamic analysis code data is already local, the second thread obtains it directly from the terminal. After obtaining the dynamic analysis code data, the second thread loads it, determines the first drawing command according to the application logic in it, and starts the image drawing engine. Since the image drawing engine cannot draw according to the first drawing command determined by the application logic in the dynamic analysis code data, it parses the first drawing command into its own second drawing command and then executes that command according to the feature data to obtain the corresponding drawing result.
Further, the image drawing engine may write the finished drawing result into a double-buffered frame region, which avoids read-write conflicts between the first thread and the image drawing engine. The double-buffered frame region actually consists of two buffers, say buffer A and buffer B: the image drawing engine can draw in buffer A and publish the drawing result to buffer B. When the first thread needs the drawing result, it reads it from buffer B and composites it with the interactive application data for display. Placing the drawing result in the double-buffered frame region resolves the read-write conflict between the first thread and the image drawing engine: the first thread does not have to wait for the image drawing engine to finish all drawing before taking the result, which shortens the time the first thread spends reading the drawing result and improves the efficiency of both reading and composite display.
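The double-buffered frame region might be sketched as below, assuming a single producer (the image drawing engine) and a single consumer (the first thread); the synchronization shown is deliberately simplified.

```cpp
#include <array>
#include <atomic>
#include <cstdint>
#include <vector>

class DoubleBufferedFrame {
public:
    // Called by the image drawing engine (second thread) after a frame is drawn.
    void Publish(std::vector<uint8_t> frame) {
        int back = 1 - front_.load(std::memory_order_acquire);
        buffers_[back] = std::move(frame);              // write the back buffer
        front_.store(back, std::memory_order_release);  // flip: it becomes readable
    }
    // Called by the first thread when it needs the latest drawing result.
    const std::vector<uint8_t>& Read() const {
        return buffers_[front_.load(std::memory_order_acquire)];
    }
private:
    std::array<std::vector<uint8_t>, 2> buffers_;
    std::atomic<int> front_{0};
};
```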
In one embodiment, the first thread corresponds to a first image drawing engine execution context and the second thread corresponds to a second image drawing engine execution context, where the first image drawing engine execution context is a shared context of the second image drawing engine execution context.
The drawing engine execution context is the content executed by the image drawing engine and includes, but is not limited to, drawing commands, drawing state, and texture resources. The first thread and the second thread correspond to the first and second image drawing engine execution contexts, respectively. To allow data interchange and texture sharing between them, the first image drawing engine execution context corresponding to the first thread is set as a shared context of the second image drawing engine execution context, so the second context can directly read the content of the first. For example, if a texture completed by the first thread exists in the first image drawing engine execution context, the second image drawing engine execution context, being a shared context of the first, can read that texture directly through its texture identifier.
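On platforms that expose OpenGL ES through EGL, the shared context could be created as in this sketch, passing the first thread's context as the share_context argument of eglCreateContext. EGL is only one possible binding; the patent does not name one.

```cpp
#include <EGL/egl.h>

// Create the second thread's context so that it shares texture resources
// with the first thread's context.
EGLContext CreateSharedContext(EGLDisplay display, EGLConfig config,
                               EGLContext firstThreadCtx) {
    const EGLint attribs[] = { EGL_CONTEXT_CLIENT_VERSION, 2, EGL_NONE };
    return eglCreateContext(display, config,
                            firstThreadCtx,   // share textures with this context
                            attribs);
}
```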
In one embodiment, as shown in fig. 11, the application data processing method further includes:
Step 1102, converting the feature data into target feature data corresponding to the browser engine.
Step 1104, the browser engine parses the first drawing command in the dynamic analysis code data to obtain a corresponding third drawing command.
Step 1106, the browser engine executes the third drawing command according to the target feature data to obtain a corresponding drawing result.
The browser engine is a powerful rendering engine that can render web page code into the corresponding web page, so the feature data can be drawn directly with the browser engine's rendering capability instead of the image drawing engine. Specifically, because the browser engine is usually accessed through a browser engine interface that cannot directly recognize the feature data, the feature data is first converted into target feature data the browser engine can recognize, for example a corresponding character string. After the conversion, the browser engine obtains the dynamic analysis code data and determines the first drawing command according to the application logic in it. Since the browser engine cannot draw according to the first drawing command, the first drawing command is parsed into a third drawing command corresponding to the browser engine. Finally, the browser engine executes the third drawing command according to the target feature data to draw the corresponding drawing result.
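For example, the feature data might be serialized into a character string such as JSON before being handed to the browser engine interface; the field names and format here are illustrative assumptions.

```cpp
#include <sstream>
#include <string>

// Convert feature data into a character string the browser engine can accept.
std::string FeatureToString(float faceX, float faceY, float audioFreq) {
    std::ostringstream out;
    out << "{\"faceX\":" << faceX
        << ",\"faceY\":" << faceY
        << ",\"audioFreq\":" << audioFreq << "}";
    return out.str();
}
```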
In one embodiment, after the feature data corresponding to the interactive application data is obtained, an entry for obtaining the feature data can be provided to third-party developers. A third-party developer can obtain the feature data corresponding to the interactive application data through this entry and then generate the corresponding dynamic analysis code data according to the obtained feature data and actual requirements. Different third-party developers therefore produce different dynamic analysis code data with different drawing commands, and the drawing results obtained by executing those commands according to the feature data differ as well. In this way, different third-party developers can participate in developing the dynamic analysis code data, which increases the diversity of the dynamic analysis code data, and thus of the interactive applications.
In a specific embodiment, an application data processing method is provided, which specifically includes the following steps:
Step 1202: send a download request to a server through the local application, so that the server returns dynamic analysis code data according to the download request; the dynamic analysis code data is used for running an interactive application.
Step 1204: load the dynamic analysis code data.
Step 1206: obtain feature data corresponding to the interactive application data.
Step 1206a: when the interactive application data includes video data, acquire the current video frame and identify the target interaction subject area in it; identify the target feature points corresponding to the target interaction subject area and determine first feature data corresponding to the interactive application data according to the positions of the target feature points.
Step 1206b: when the interactive application data includes audio data, acquire the current audio frame and extract the audio features corresponding to it to obtain second feature data corresponding to the interactive application data.
Step 1206c: when the interactive application data includes both video data and audio data, step 1206a may be executed first and then step 1206b, after which the feature data corresponding to the interactive application data is determined from the first feature data and the second feature data.
Step 1208: parse the dynamic analysis code data, and parse a first drawing command in it into a second drawing command corresponding to the image drawing engine.
Step 1208a: obtain the first drawing command and a corresponding first drawing parameter, where the first drawing command is determined by the application logic corresponding to the dynamic analysis code data.
Step 1208b: determine second drawing data corresponding to the second drawing command according to the first drawing parameter.
Step 1208c: acquire the image drawing engine data corresponding to the second drawing command, and bind the second drawing data to the image drawing engine data.
Step 1210: execute the second drawing command according to the feature data to obtain a corresponding drawing result.
Step 1210a: obtain a target virtual object according to the second drawing command.
Step 1210b: determine drawing state information of the target virtual object according to the feature data.
Step 1210c: draw the target virtual object according to the drawing state information to obtain the corresponding drawing result.
Step 1212: combine the first image corresponding to the drawing result with the second image corresponding to the interactive application data to obtain a composite image, and display the composite image.
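To make steps 1208a to 1208c concrete, here is a hedged sketch of translating an abstract first drawing command into engine-level drawing data and binding it. The command shapes, the quad layout, and the use of WebGL as the image drawing engine are assumptions; the patent leaves both the command formats and the engine API abstract.

```typescript
// Illustrative command shapes; the patent leaves both formats abstract.
type FirstDrawCommand = { op: "drawObject"; params: { sprite: string; size: number } };
type SecondDrawData = { vertices: Float32Array; textureId: string };

// Steps 1208a-1208b: read the first drawing parameter and derive the second
// drawing data (here, a quad sized by the first drawing parameter).
function toSecondDrawData(cmd: FirstDrawCommand): SecondDrawData {
  const s = cmd.params.size / 2;
  return {
    vertices: new Float32Array([-s, -s, s, -s, s, s, -s, s]),
    textureId: cmd.params.sprite,
  };
}

// Step 1208c: bind the second drawing data to the image drawing engine's data
// (a WebGL buffer object), ready for the second drawing command to execute.
function bindToEngine(gl: WebGLRenderingContext, data: SecondDrawData): WebGLBuffer {
  const buf = gl.createBuffer()!;
  gl.bindBuffer(gl.ARRAY_BUFFER, buf);
  gl.bufferData(gl.ARRAY_BUFFER, data.vertices, gl.STATIC_DRAW);
  return buf;
}
```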
In one embodiment, an application data processing method is provided. The embodiment is mainly illustrated by applying the method to the terminal 110 in fig. 1. The application data processing method specifically comprises the following steps: loading dynamic analysis code data; acquiring feature data corresponding to interactive application data, wherein the interactive application data comprises a current video frame; analyzing the dynamic analysis code data, and analyzing a first drawing command in the dynamic analysis code data into a second drawing command corresponding to the image drawing engine; executing a second drawing command according to the characteristic data to obtain drawing state information of a target virtual object corresponding to the current video frame; drawing a target virtual object in the current video frame according to the drawing state information; the current video frame including the target virtual object is played.
Specifically, after acquiring the dynamic analysis code data, the terminal loads it. The terminal may actively request the server to issue the dynamic analysis code data, receive the dynamic analysis code data the server returns in response to that request, and store it locally for the next load. Alternatively, after detecting that the code data corresponding to the local application does not include the dynamic analysis code data, the server may actively push the dynamic analysis code data to the terminal.
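A minimal sketch of the pull-style delivery path just described, assuming the dynamic analysis code data is JavaScript served from a hypothetical URL and that localStorage stands in for the terminal's local store:

```typescript
// CODE_URL is a placeholder endpoint, not anything the patent specifies.
const CODE_URL = "https://example.com/dynamic-code.js";

async function loadDynamicCode(): Promise<string> {
  const cached = localStorage.getItem("dynamicCode");
  if (cached !== null) return cached;        // reuse the locally stored copy
  const resp = await fetch(CODE_URL);        // actively request the server
  const code = await resp.text();
  localStorage.setItem("dynamicCode", code); // store locally for the next load
  return code;
}

// Loading: hand the code to the script engine. new Function is only a stand-in
// for whatever JavaScript engine the local application actually embeds.
loadDynamicCode().then((code) => new Function(code)());
```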
Feature data corresponding to the interactive application data is acquired while the dynamic analysis code data is loaded. The interactive application is an application associated with the local application and is entered through an entry the local application provides; it may be a game application, a short-video application, a shooting application, or the like, associated with the local application. The interactive application data is data related to the interactive application and can be acquired in real time through it; it may be video data, audio data, or both. Here the interactive application data includes the current video frame of the video data. The feature data is data with distinguishing characteristics in the interactive application data, and it can be identified from the interactive application data according to preset conditions. The preset conditions can be customized, for example obtaining the feature data from the positions of the target feature points corresponding to the target interaction subject area in the interactive application data, or extracting audio features from the interactive application data.
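As an illustration of the customizable video-side rule, the sketch below stubs out subject detection; the detector and the centroid convention are placeholders, not anything the patent prescribes:

```typescript
interface FeaturePoint { x: number; y: number; }

// Placeholder detector: a real implementation would run face or object
// detection to locate the target interaction subject area in the frame.
function detectTargetFeaturePoints(frame: ImageData): FeaturePoint[] {
  return [{ x: frame.width / 2, y: frame.height / 2 }];
}

// First feature data derived from the positions of the target feature points;
// taking their centroid is an assumed convention, not fixed by the patent.
function firstFeatureData(points: FeaturePoint[]): FeaturePoint {
  const n = points.length;
  return {
    x: points.reduce((sum, p) => sum + p.x, 0) / n,
    y: points.reduce((sum, p) => sum + p.y, 0) / n,
  };
}
```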
After the dynamic analysis code data is loaded and the feature data corresponding to the interactive application data is acquired, the dynamic analysis code data is parsed, the first drawing command in it is parsed into the second drawing command corresponding to the image drawing engine, and the second drawing command is executed according to the feature data to obtain the drawing state information of the target virtual object corresponding to the current video frame. The drawing state information is the state of the target virtual object in the current video frame and may include, but is not limited to, position information and direction information: the position information describes where the target virtual object is in the current video frame, and the direction information describes which way it points. The target virtual object may include an attacking virtual object and an attacked virtual object. The attacking virtual object is a virtual role or virtual target used to attack other virtual objects, which may be attacked virtual objects; the attacked virtual object is the virtual role or virtual target under attack, that is, the one attacked by the attacking virtual object.
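One possible encoding of these notions, with field names that are assumptions rather than the patent's terms:

```typescript
// Assumed field names; the patent only names the concepts.
interface DrawState {
  position: { x: number; y: number }; // position information in the current frame
  angle: number;                      // direction information, in radians
}

interface VirtualObject {
  kind: "attacking" | "attacked"; // attacking objects target attacked ones
  state: DrawState;
}
```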
Finally, the target virtual object is drawn in the current video frame according to the drawing state information, and the current video frame including the target virtual object is played. Specifically, after the drawing state information of the target virtual object corresponding to the current video frame is obtained by executing the second drawing command according to the feature data, the image drawing engine draws the target virtual object in the current video frame according to that drawing state information. When the target virtual object is an attacking virtual object, it is drawn at the specified position of the current video frame according to its drawing state information; the specified position is determined by the feature data, so in the current video frame the attacking virtual object has both position information and direction information. As shown in fig. 7A, the attacking virtual object may be the airplane, which has a designated position and a designated direction in the current video frame. When the target virtual object is an attacked virtual object, it is drawn at a random position of the current video frame according to its drawing state information; because only the position of the attacking virtual object is fixed by the feature data, the attacked virtual object can be drawn and displayed at any position of the current video frame. As shown in fig. 7A, the attacked virtual object may be the rectangle, shown at an arbitrary position in the current video frame. Further, after the target virtual object is drawn, the current video frame including it is played and the target virtual object is displayed in it: as in fig. 7A, the airplane may be displayed at the designated position and the rectangle at a random position of the current video frame.
In one embodiment, the drawing state information includes position information and direction information, and drawing the target virtual object in the current video frame according to the drawing state information includes: determining the target position of the virtual object in the current video frame according to the position information; determining the target direction of the virtual object in the current video frame according to the direction information; and drawing the target virtual object in the current video frame according to the target position and the target direction.
The drawing state information comprises position information and direction information: the position information describes where the target virtual object is in the current video frame, and the direction information describes which way it points. Specifically, the target position of the virtual object in the current video frame is determined according to the position information. As shown in fig. 12A, a schematic view of a scene to which the data processing method is applied in one embodiment, the target position of the virtual object airplane in the current video frame is a designated position, while the target position of the virtual object rectangle is a random position. The target orientation in the current video frame is then determined according to the direction information, as fig. 12A shows for both the airplane and the rectangle. After the target position and the target orientation have been determined, the target virtual object is drawn in the current video frame accordingly: the airplane is drawn at its target position with its target orientation, and likewise the rectangle. That is, as shown in fig. 12A, the virtual object airplane is shown at a designated position of the current video frame with a designated direction and can then move from that position in that direction, while the virtual object rectangle can be shown at any position of the current video frame, also with a specified direction.
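A hedged sketch of drawing at the target position and orientation, using Canvas 2D as a stand-in for the image drawing engine and re-declaring the assumed DrawState shape so the snippet is self-contained:

```typescript
// DrawState re-declared so the snippet stands alone.
interface DrawState {
  position: { x: number; y: number };
  angle: number;
}

// Draw the target virtual object at its target position with its target
// orientation; Canvas 2D stands in for the image drawing engine.
function drawVirtualObject(
  ctx: CanvasRenderingContext2D,
  sprite: CanvasImageSource,
  state: DrawState,
): void {
  ctx.save();
  ctx.translate(state.position.x, state.position.y); // target position
  ctx.rotate(state.angle);                           // target orientation
  ctx.drawImage(sprite, -16, -16, 32, 32);           // sprite centred on origin
  ctx.restore();
}
```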
In one embodiment, the feature data comprises virtual object associated feature data, the feature data corresponding to a backward video frame of the current video frame; the step of executing the second drawing command according to the feature data to obtain the drawing state information of the target virtual object corresponding to the current video frame includes: acquiring the position and the direction of a target virtual object in a backward video frame; determining the position and the direction of a related virtual object corresponding to the target virtual object according to the position and the direction of the target virtual object in the backward video frame; and drawing the associated virtual object according to the position and the direction of the associated virtual object corresponding to the target virtual object.
The virtual object associated feature data is the feature data used to determine the drawing state information of the associated virtual object corresponding to the target virtual object. The associated virtual object is a sub-virtual object that has an association relationship with the target virtual object; in fig. 12B it may be the bullet fired by the target virtual object airplane. The feature data corresponds to a backward video frame of the current video frame: the current video frame is the video frame in which the initial position of the target virtual object lies, the backward video frame is the next video frame after it, and the position and direction of the target virtual object in the backward video frame can change as the feature data changes. As shown in fig. 12B, a schematic view of a scene to which the data processing method is applied in one embodiment, the video frame in which the target virtual object airplane drawn with a dotted line at position A lies may be the current video frame; in the backward video frame the airplane's position changes, for example moving from position A to position B, and the video frame in which the airplane drawn with a solid line at position B lies is the backward video frame.
Specifically, the position and direction of the target virtual object in the backward video frame are acquired; as shown in fig. 12B, the target virtual object airplane may move from position A in the current video frame to position B in the backward video frame. The position and direction of the associated virtual object corresponding to the target virtual object are then determined according to the position and direction of the target virtual object in the backward video frame. As fig. 12B shows, the position and direction of the fired bullet are determined from those of the airplane in the backward video frame; they are likewise specified, because they are derived from the virtual object associated feature data. Finally, the associated virtual object is drawn according to its position and direction: with the airplane as the target virtual object and the fired bullet as its associated virtual object, the drawing result can be as shown in fig. 12B.
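A small sketch of deriving the associated virtual object's state from the target virtual object's position and direction in the backward video frame; the muzzle offset and the bullet inheriting the plane's angle are illustrative choices, not fixed by the patent:

```typescript
// DrawState re-declared so the snippet stands alone.
interface DrawState {
  position: { x: number; y: number };
  angle: number;
}

// Derive the associated virtual object (the fired bullet) from the target
// virtual object's state in the backward video frame.
function spawnBullet(plane: DrawState): DrawState {
  const muzzle = 20; // assumed distance from the plane's centre to its muzzle
  return {
    position: {
      x: plane.position.x + Math.cos(plane.angle) * muzzle,
      y: plane.position.y + Math.sin(plane.angle) * muzzle,
    },
    angle: plane.angle, // the bullet travels in the plane's direction
  };
}
```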
It should be understood that although the steps in the above flowcharts are shown in the order indicated by the arrows, they are not necessarily performed in that order. Unless explicitly stated otherwise, the execution order of these steps is not strictly limited, and they may be performed in other orders. Moreover, at least some of the steps in the above flowcharts may include multiple sub-steps or stages, which are not necessarily completed at the same moment but may be performed at different times, and which need not be executed sequentially: they may be performed in turn or in alternation with other steps or with at least part of the sub-steps or stages of other steps.
In an application scenario of video game interaction, as shown in fig. 12, a schematic view of the application data processing method in one embodiment, the first thread may be a shooting component, the second thread may be a game thread, and the image drawing engine may be a rendering engine running on the second thread. Specifically, the terminal starts the shooting component and the game thread at the same time; the first image rendering engine execution context corresponding to the shooting component is the shared context of the second image rendering engine execution context corresponding to the game thread, and texture sharing between the shooting component and the game thread is achieved through these execution contexts. In the shooting component's thread, the current frame information is obtained; the frame information is the interactive application data, such as audio and video data. The game thread loads the dynamic analysis code data, such as the game's JavaScript code, and starts the rendering engine at the same time. Because the rendering engine runs on the game thread, it parses the dynamic analysis code data and determines the first drawing command according to the application logic in it. Meanwhile, the shooting component extracts the feature data corresponding to the frame information and sends it to the rendering engine, which receives the feature data, parses the first drawing command into the corresponding second drawing command, and executes the second drawing command according to the feature data to obtain the drawing result.
As shown in fig. 12, the rendering engine then sends the drawing result to a double-buffered frame area; using a double-buffered frame area avoids read-write conflicts between the first thread and the image drawing engine. The double-buffered frame area is in fact two buffer areas, say buffer A and buffer B: the image drawing engine may draw in buffer A and publish the drawing result to buffer B. When the first thread needs the drawing result produced by the engine, it reads it from buffer B of the double-buffered frame area and composites it with the currently acquired interactive application data, specifically by overlaying the image corresponding to the drawing result on the image corresponding to the interactive application data. The final composite image may be as shown in fig. 9.
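A compact sketch of the double-buffered handoff and the final composition, using OffscreenCanvas as a stand-in for buffer areas A and B; the sizes and the overlay composition rule are assumptions:

```typescript
// OffscreenCanvas stands in for buffer areas A and B; sizes are arbitrary.
class DoubleBuffer {
  private buffers = [new OffscreenCanvas(640, 360), new OffscreenCanvas(640, 360)];
  private front = 0;

  // The image drawing engine draws into the back buffer...
  back(): OffscreenCanvas { return this.buffers[1 - this.front]; }

  // ...and publishes the finished frame by swapping, so reads never clash
  // with in-progress writes.
  swap(): void { this.front = 1 - this.front; }

  // First thread: overlay the drawing result on the interactive application
  // data (the current video frame) to produce the composite image.
  compositeOnto(ctx: CanvasRenderingContext2D, videoFrame: CanvasImageSource): void {
    ctx.drawImage(videoFrame, 0, 0);
    ctx.drawImage(this.buffers[this.front], 0, 0);
  }
}
```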
As shown in fig. 13, in one embodiment, there is provided an application data processing apparatus 1300, comprising:
A dynamic analysis code data loading module 1302, configured to load dynamic analysis code data.
A feature data obtaining module 1304, configured to obtain feature data corresponding to the interactive application data.
A dynamic analysis code data parsing module 1306, configured to parse the dynamic analysis code data and parse a first drawing command in it into a second drawing command corresponding to the image drawing engine.
A feature data execution module 1308, configured to execute the second drawing command according to the feature data to obtain a corresponding drawing result.
As shown in fig. 14, in one embodiment, the interactive application data includes video data and/or audio data, and the feature data acquisition module 1304 includes:
the current video frame identifying unit 1304a is configured to obtain a current video frame and identify a target interaction subject area in the current video frame.
The feature data determining unit 1304b is configured to identify a target feature point corresponding to the target interaction subject area, and determine first feature data corresponding to the interactive application data according to a position of the target feature point. And/or
A current audio frame extracting unit 1304c, configured to obtain a current audio frame and extract the audio features corresponding to the current audio frame to obtain second feature data corresponding to the interactive application data.
As shown in FIG. 15, in one embodiment, dynamic resolution code data resolution module 1306 includes:
the first drawing command obtaining unit 1306a is configured to obtain a first drawing command, where the first drawing command is determined by application logic corresponding to the dynamic analysis code data, and obtain a first drawing parameter corresponding to the first drawing command.
The second drawing data determining unit 1306b is configured to determine second drawing data corresponding to the second drawing command according to the first drawing parameter.
The image drawing engine data obtaining unit 1306c is configured to obtain image drawing engine data corresponding to the second drawing command, and bind the second drawing data with the image drawing engine data.
As shown in FIG. 16, in one embodiment, the feature data execution module 1308 includes:
A target virtual object obtaining unit 1308a, configured to obtain the target virtual object according to the second drawing command.
A drawing state information determination unit 1308b for determining the drawing state information of the target virtual object from the feature data.
The target virtual object drawing unit 1308c is configured to draw the target virtual object according to the drawing state information to obtain a corresponding drawing result.
In one embodiment, the application data processing apparatus 1300 is further configured to combine the first image corresponding to the rendering result and the second image corresponding to the interactive application data to obtain a composite image, and display the composite image.
In one embodiment, the application data processing apparatus 1300 is further configured to send a download request to the server through the local application, so that the server returns dynamic parsing code data according to the download request, and the dynamic parsing code data is used for running the interactive application.
In an embodiment, the application data processing apparatus 1300 is further configured to start a first thread, obtain interactive application data through the first thread, and analyze the interactive application data to obtain feature data corresponding to the interactive application data; the first thread sends the characteristic data to a second thread, the second thread is used for loading dynamic analysis code data and starting an image drawing engine, the image drawing engine is used for analyzing the dynamic analysis code data, a first drawing command in the dynamic analysis code data is analyzed into a second drawing command corresponding to the image drawing engine, and the second drawing command is executed according to the characteristic data to obtain a corresponding drawing result; and the first thread reads the drawing result and performs synthesis display on the drawing result and the interactive application data.
In one embodiment, the first thread corresponds to a first image rendering engine execution context and the second thread corresponds to a second image rendering engine execution context, the first image rendering engine execution context being a shared context of the second image rendering engine execution context.
In one embodiment, the application data processing apparatus 1300 is further configured to convert the feature data into target feature data corresponding to a browser engine; the browser engine analyzes the first drawing command in the dynamic analysis code data to obtain a corresponding third drawing command; and the browser engine executes the third drawing command according to the target characteristic data to obtain a corresponding drawing result.
FIG. 17 is a diagram illustrating an internal structure of a computer device in one embodiment. The computer device may specifically be the terminal 110 in fig. 1. As shown in fig. 17, the computer apparatus includes a processor, a memory, a network interface, an input device, and a display screen connected through a system bus. Wherein the memory includes a non-volatile storage medium and an internal memory. The non-volatile storage medium of the computer device stores an operating system and may also store a computer program that, when executed by the processor, causes the processor to implement the application data processing method. The internal memory may also have stored therein a computer program that, when executed by the processor, causes the processor to perform the application data processing method. The display screen of the computer equipment can be a liquid crystal display screen or an electronic ink display screen, and the input device of the computer equipment can be a touch layer covered on the display screen, a key, a track ball or a touch pad arranged on the shell of the computer equipment, an external keyboard, a touch pad or a mouse and the like.
Those skilled in the art will appreciate that the architecture shown in fig. 17 is merely a block diagram of part of the structure associated with the disclosed solution and does not limit the computer devices to which the solution applies; a particular computer device may include more or fewer components than shown, combine certain components, or arrange the components differently.
In one embodiment, the application data processing apparatus provided in the present application may be implemented in the form of a computer program, and the computer program may be run on a computer device as shown in fig. 17. The memory of the computer device may store various program modules constituting the application data processing apparatus, such as a dynamic analysis code data loading module, a feature data acquisition module, a dynamic analysis code data analysis module, and a feature data execution module shown in fig. 13. The computer program constituted by the respective program modules causes the processor to execute the steps in the application data processing method of the respective embodiments of the present application described in the present specification.
For example, the computer device shown in fig. 17 may perform loading of dynamic resolution code data by a dynamic resolution code data loading module in the application data processing apparatus shown in fig. 13. The computer equipment can execute the step of acquiring the characteristic data corresponding to the interactive application data through the characteristic data acquisition module. The computer equipment can analyze the dynamic analysis code data through the dynamic analysis code data analysis module, and analyze a first drawing command in the dynamic analysis code data into a second drawing command corresponding to the image drawing engine. The computer equipment can execute the second drawing command according to the characteristic data through the characteristic data execution module to obtain a corresponding drawing result.
In one embodiment, a computer device is proposed, the computer device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, the processor implementing the following steps when executing the computer program: loading dynamic analysis code data; acquiring feature data corresponding to the interactive application data; analyzing the dynamic analysis code data, and analyzing a first drawing command in the dynamic analysis code data into a second drawing command corresponding to the image drawing engine; and executing the second drawing command according to the characteristic data to obtain a corresponding drawing result.
In one embodiment, the interactive application data includes video data and/or audio data, and the obtaining feature data corresponding to the interactive application data includes: acquiring a current video frame, and identifying a target interaction subject area in the current video frame; identifying target feature points corresponding to a target interaction subject area, and determining first feature data corresponding to the interaction application data according to the positions of the target feature points; and/or acquiring a current audio frame, and extracting audio features corresponding to the current audio frame to obtain second feature data corresponding to the interactive application data.
In one embodiment, parsing the dynamic parsing code data to parse a first drawing command in the dynamic parsing code data into a second drawing command corresponding to the image drawing engine includes: acquiring a first drawing command, and acquiring a first drawing parameter corresponding to the first drawing command, wherein the first drawing command is determined by application logic corresponding to dynamic analysis code data; determining second drawing data corresponding to the second drawing command according to the first drawing parameter; and acquiring image drawing engine data corresponding to the second drawing command, and binding the second drawing data with the image drawing engine data.
In one embodiment, executing the second drawing command according to the feature data to obtain a corresponding drawing result includes: acquiring a target virtual object according to the second drawing command; determining drawing state information of the target virtual object according to the characteristic data; and drawing the target virtual object according to the drawing state information to obtain a corresponding drawing result.
In one embodiment, the computer program further causes the processor to perform the steps of: combining a first image corresponding to the drawing result with a second image corresponding to the interactive application data to obtain a composite image; and displaying the composite image.
In one embodiment, the computer program further causes the processor to perform the steps of: and sending a downloading request to the server through the local application so that the server returns dynamic analysis code data according to the downloading request, wherein the dynamic analysis code data is used for running the interactive application.
In one embodiment, the computer program further causes the processor to perform the steps of: starting a first thread, acquiring interactive application data through the first thread, and analyzing the interactive application data to obtain characteristic data corresponding to the interactive application data; the first thread sends the characteristic data to a second thread, the second thread is used for loading dynamic analysis code data and starting an image drawing engine, the image drawing engine is used for analyzing the dynamic analysis code data, a first drawing command in the dynamic analysis code data is analyzed into a second drawing command corresponding to the image drawing engine, and the second drawing command is executed according to the characteristic data to obtain a corresponding drawing result; and the first thread reads the drawing result and performs synthesis display on the drawing result and the interactive application data.
In one embodiment, the first thread corresponds to a first image rendering engine execution context and the second thread corresponds to a second image rendering engine execution context, the first image rendering engine execution context being a shared context of the second image rendering engine execution context.
In one embodiment, the computer program further causes the processor to perform the steps of: converting the characteristic data into target characteristic data corresponding to a browser engine; the browser engine analyzes the first drawing command in the dynamic analysis code data to obtain a corresponding third drawing command; and the browser engine executes the third drawing command according to the target characteristic data to obtain a corresponding drawing result.
In one embodiment, a computer readable storage medium is provided, having a computer program stored thereon, which, when executed by a processor, causes the processor to perform the steps of: loading dynamic analysis code data; acquiring feature data corresponding to the interactive application data; analyzing the dynamic analysis code data, and analyzing a first drawing command in the dynamic analysis code data into a second drawing command corresponding to the image drawing engine; and executing the second drawing command according to the characteristic data to obtain a corresponding drawing result.
In one embodiment, the interactive application data includes video data and/or audio data, and the obtaining feature data corresponding to the interactive application data includes: acquiring a current video frame, and identifying a target interaction subject area in the current video frame; identifying target feature points corresponding to a target interaction subject area, and determining first feature data corresponding to the interaction application data according to the positions of the target feature points; and/or acquiring a current audio frame, and extracting audio features corresponding to the current audio frame to obtain second feature data corresponding to the interactive application data.
In one embodiment, parsing the dynamic parsing code data to parse a first drawing command in the dynamic parsing code data into a second drawing command corresponding to the image drawing engine includes: acquiring a first drawing command, and acquiring a first drawing parameter corresponding to the first drawing command, wherein the first drawing command is determined by application logic corresponding to dynamic analysis code data; determining second drawing data corresponding to the second drawing command according to the first drawing parameter; and acquiring image drawing engine data corresponding to the second drawing command, and binding the second drawing data with the image drawing engine data.
In one embodiment, executing the second drawing command according to the feature data to obtain a corresponding drawing result includes: acquiring a target virtual object according to the second drawing command; determining drawing state information of the target virtual object according to the characteristic data; and drawing the target virtual object according to the drawing state information to obtain a corresponding drawing result.
In one embodiment, the computer program further causes the processor to perform the steps of: combining a first image corresponding to the drawing result with a second image corresponding to the interactive application data to obtain a composite image; and displaying the composite image.
In one embodiment, the computer program further causes the processor to perform the steps of: and sending a downloading request to the server through the local application so that the server returns dynamic analysis code data according to the downloading request, wherein the dynamic analysis code data is used for running the interactive application.
In one embodiment, the computer program further causes the processor to perform the steps of: starting a first thread, acquiring interactive application data through the first thread, and analyzing the interactive application data to obtain characteristic data corresponding to the interactive application data; the first thread sends the characteristic data to a second thread, the second thread is used for loading dynamic analysis code data and starting an image drawing engine, the image drawing engine is used for analyzing the dynamic analysis code data, a first drawing command in the dynamic analysis code data is analyzed into a second drawing command corresponding to the image drawing engine, and the second drawing command is executed according to the characteristic data to obtain a corresponding drawing result; and the first thread reads the drawing result and performs synthesis display on the drawing result and the interactive application data.
In one embodiment, the first thread corresponds to a first image rendering engine execution context and the second thread corresponds to a second image rendering engine execution context, the first image rendering engine execution context being a shared context of the second image rendering engine execution context.
In one embodiment, the computer program further causes the processor to perform the steps of: converting the characteristic data into target characteristic data corresponding to a browser engine; the browser engine analyzes the first drawing command in the dynamic analysis code data to obtain a corresponding third drawing command; and the browser engine executes the third drawing command according to the target characteristic data to obtain a corresponding drawing result.
In one embodiment, a computer device is proposed, the computer device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, the processor implementing the following steps when executing the computer program: loading dynamic analysis code data; acquiring feature data corresponding to interactive application data, wherein the interactive application data comprises a current video frame; analyzing the dynamic analysis code data, and analyzing a first drawing command in the dynamic analysis code data into a second drawing command corresponding to the image drawing engine; executing a second drawing command according to the characteristic data to obtain drawing state information of a target virtual object corresponding to the current video frame; drawing a target virtual object in the current video frame according to the drawing state information; the current video frame including the target virtual object is played.
In one embodiment, the drawing state information includes position information and direction information, and drawing the target virtual object in the current video frame according to the drawing state information includes: determining the target position of the virtual object in the current video frame according to the position information; determining the target direction of the virtual object in the current video frame according to the direction information; and drawing the target virtual object in the current video frame according to the target position and the target direction.
In one embodiment, the feature data comprises virtual object associated feature data, the feature data corresponding to a backward video frame of the current video frame; the step of executing the second drawing command according to the feature data to obtain the drawing state information of the target virtual object corresponding to the current video frame includes: acquiring the position and the direction of a target virtual object in a backward video frame; determining the position and the direction of a related virtual object corresponding to the target virtual object according to the position and the direction of the target virtual object in the backward video frame; and drawing the associated virtual object according to the position and the direction of the associated virtual object corresponding to the target virtual object.
In one embodiment, a computer readable storage medium is provided, having a computer program stored thereon, which, when executed by a processor, causes the processor to perform the steps of: loading dynamic analysis code data; acquiring feature data corresponding to interactive application data, wherein the interactive application data comprises a current video frame; analyzing the dynamic analysis code data, and analyzing a first drawing command in the dynamic analysis code data into a second drawing command corresponding to the image drawing engine; executing a second drawing command according to the characteristic data to obtain drawing state information of a target virtual object corresponding to the current video frame; drawing a target virtual object in the current video frame according to the drawing state information; the current video frame including the target virtual object is played.
In one embodiment, the drawing state information includes position information and direction information, and drawing the target virtual object in the current video frame according to the drawing state information includes: determining the target position of the virtual object in the current video frame according to the position information; determining the target direction of the virtual object in the current video frame according to the direction information; and drawing the target virtual object in the current video frame according to the target position and the target direction.
In one embodiment, the feature data comprises virtual object associated feature data, the feature data corresponding to a backward video frame of the current video frame; the step of executing the second drawing command according to the feature data to obtain the drawing state information of the target virtual object corresponding to the current video frame includes: acquiring the position and the direction of a target virtual object in a backward video frame; determining the position and the direction of a related virtual object corresponding to the target virtual object according to the position and the direction of the target virtual object in the backward video frame; and drawing the associated virtual object according to the position and the direction of the associated virtual object corresponding to the target virtual object.
It will be understood by those skilled in the art that all or part of the processes of the methods in the above embodiments can be implemented by a computer program; the program can be stored in a non-volatile computer-readable storage medium, and when executed it can include the processes of the embodiments of the methods described above. Any reference to memory, storage, database, or other medium used in the embodiments provided herein may include non-volatile and/or volatile memory. Non-volatile memory can include read-only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), or flash memory. Volatile memory can include random access memory (RAM) or external cache memory. By way of illustration and not limitation, RAM is available in a variety of forms such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchronous link DRAM (SLDRAM), Rambus direct RAM (RDRAM), direct Rambus dynamic RAM (DRDRAM), and Rambus dynamic RAM (RDRAM).
The technical features of the above embodiments can be combined arbitrarily. For brevity, not all possible combinations of these technical features are described; however, as long as a combination contains no contradiction, it should be considered within the scope of this specification.
The above embodiments express only several implementations of the present application, and their description is specific and detailed, but they should not therefore be construed as limiting the scope of the patent. It should be noted that a person skilled in the art can make several variations and improvements without departing from the concept of the present application, and these all fall within the protection scope of the present application. Therefore, the protection scope of this patent shall be subject to the appended claims.

Claims (21)

1. An application data processing method, comprising:
loading dynamic analysis code data;
acquiring feature data corresponding to the interactive application data;
analyzing the dynamic analysis code data, and analyzing a first drawing command in the dynamic analysis code data into a second drawing command corresponding to an image drawing engine;
acquiring a target virtual object according to the second drawing command;
determining drawing state information of the target virtual object according to the characteristic data;
and drawing the target virtual object according to the drawing state information to obtain a corresponding drawing result.
2. The method of claim 1, wherein the interactive application data comprises video data and/or audio data, and the obtaining feature data corresponding to the interactive application data comprises:
acquiring a current video frame, and identifying a target interaction subject area in the current video frame;
identifying a target feature point corresponding to the target interaction subject area, and determining first feature data corresponding to the interaction application data according to the position of the target feature point; and/or
And acquiring a current audio frame, and extracting audio features corresponding to the current audio frame to obtain second feature data corresponding to the interactive application data.
3. The method of claim 1, wherein parsing the dynamic parsing code data to parse a first drawing command in the dynamic parsing code data into a second drawing command corresponding to an image drawing engine comprises:
acquiring the first drawing command, and acquiring a first drawing parameter corresponding to the first drawing command, wherein the first drawing command is determined by application logic corresponding to the dynamic analysis code data;
determining second drawing data corresponding to the second drawing command according to the first drawing parameter;
and acquiring image drawing engine data corresponding to the second drawing command, and binding the second drawing data with the image drawing engine data.
4. The method of claim 1, further comprising:
combining a first image corresponding to the drawing result with a second image corresponding to the interactive application data to obtain a composite image;
and displaying the composite image.
5. The method of claim 1, wherein prior to loading the dynamic parsing code data, further comprising:
and sending a downloading request to a server through a local application so that the server returns the dynamic analysis code data according to the downloading request, wherein the dynamic analysis code data is used for running an interactive application.
6. The method of claim 1, further comprising:
starting a first thread, acquiring the interactive application data through the first thread, and analyzing the interactive application data to obtain characteristic data corresponding to the interactive application data;
the first thread sends the feature data to a second thread, the second thread is used for loading the dynamic analysis code data and starting the image drawing engine, the image drawing engine is used for analyzing the dynamic analysis code data, a first drawing command in the dynamic analysis code data is analyzed into a second drawing command corresponding to the image drawing engine, and the second drawing command is executed according to the feature data to obtain a corresponding drawing result;
and the first thread reads the drawing result and performs synthesis display on the drawing result and the interactive application data.
7. The method of claim 6, wherein the first thread corresponds to a first image rendering engine execution context and the second thread corresponds to a second image rendering engine execution context, the first image rendering engine execution context being a shared context of the second image rendering engine execution context.
8. The method of claim 1, further comprising:
converting the characteristic data into target characteristic data corresponding to a browser engine;
the browser engine analyzes the first drawing command in the dynamic analysis code data to obtain a corresponding third drawing command;
and the browser engine executes the third drawing command according to the target characteristic data to obtain a corresponding drawing result.
9. An application data processing method, comprising:
loading dynamic analysis code data;
acquiring feature data corresponding to interactive application data, wherein the interactive application data comprises a current video frame;
analyzing the dynamic analysis code data, and analyzing a first drawing command in the dynamic analysis code data into a second drawing command corresponding to an image drawing engine;
executing the second drawing command according to the characteristic data to obtain drawing state information of a target virtual object corresponding to the current video frame;
drawing the target virtual object in the current video frame according to the drawing state information;
playing the current video frame including the target virtual object.
10. The method of claim 9, wherein the drawing state information includes position information and direction information, and wherein drawing the target virtual object in the current video frame according to the drawing state information includes:
determining the target position of the virtual object in the current video frame according to the position information;
determining the target direction of the virtual object in the current video frame according to the direction information;
drawing the target virtual object in the current video frame according to the target position and the target orientation.
11. The method of claim 10, wherein the feature data comprises virtual object associated feature data corresponding to a backward video frame of the current video frame;
the obtaining of the drawing state information of the target virtual object corresponding to the current video frame by executing the second drawing command according to the feature data includes:
acquiring the position and the direction of the target virtual object in the backward video frame;
determining the position and the direction of a related virtual object corresponding to the target virtual object according to the position and the direction of the target virtual object in the backward video frame;
and drawing the associated virtual object according to the position and the direction of the associated virtual object corresponding to the target virtual object.
12. An application data processing apparatus, characterized in that the apparatus comprises:
the dynamic analysis code data loading module is used for loading dynamic analysis code data;
the characteristic data acquisition module is used for acquiring characteristic data corresponding to the interactive application data;
the dynamic analysis code data analysis module is used for analyzing the dynamic analysis code data and analyzing a first drawing command in the dynamic analysis code data into a second drawing command corresponding to the image drawing engine;
the characteristic data execution module is used for acquiring a target virtual object according to the second drawing command; determining drawing state information of the target virtual object according to the characteristic data; and drawing the target virtual object according to the drawing state information to obtain a corresponding drawing result.
13. The apparatus of claim 12, wherein the interactive application data comprises video data and/or audio data, and wherein the feature data obtaining module comprises:
the current video frame identification unit is used for acquiring a current video frame and identifying a target interaction main body area in the current video frame;
the characteristic data determining unit is used for identifying a target characteristic point corresponding to the target interaction subject area and determining first characteristic data corresponding to the interaction application data according to the position of the target characteristic point; and/or
And the current audio frame extraction unit is used for acquiring a current audio frame and extracting audio features corresponding to the current audio frame to obtain second feature data corresponding to the interactive application data.
14. The apparatus of claim 12, wherein the dynamic parsing code data parsing module comprises:
a first drawing command obtaining unit, configured to obtain the first drawing command, and obtain a first drawing parameter corresponding to the first drawing command, where the first drawing command is determined by application logic corresponding to the dynamic analysis code data;
a second drawing data determining unit, configured to determine, according to the first drawing parameter, second drawing data corresponding to the second drawing command;
and the image drawing engine data acquisition unit is used for acquiring the image drawing engine data corresponding to the second drawing command and binding the second drawing data with the image drawing engine data.
15. The apparatus according to claim 12, wherein the application data processing apparatus is further configured to combine a first image corresponding to the rendering result with a second image corresponding to the interactive application data to obtain a composite image; and displaying the composite image.
16. The apparatus according to claim 12, wherein the application data processing apparatus is further configured to send a download request to a server through a local application, so that the server returns the dynamic resolution code data according to the download request, and the dynamic resolution code data is used for running an interactive application.
17. The apparatus according to claim 12, wherein the application data processing apparatus is further configured to start a first thread, obtain the interactive application data through the first thread, and analyze the interactive application data to obtain feature data corresponding to the interactive application data; the first thread sends the feature data to a second thread, the second thread is used for loading the dynamic analysis code data and starting the image drawing engine, the image drawing engine is used for analyzing the dynamic analysis code data, a first drawing command in the dynamic analysis code data is analyzed into a second drawing command corresponding to the image drawing engine, and the second drawing command is executed according to the feature data to obtain a corresponding drawing result; and the first thread reads the drawing result and performs synthesis display on the drawing result and the interactive application data.
18. The apparatus of claim 17, wherein the first thread corresponds to a first image rendering engine execution context, wherein the second thread corresponds to a second image rendering engine execution context, and wherein the first image rendering engine execution context is a shared context for the second image rendering engine execution context.
19. The apparatus according to claim 12, wherein the application data processing apparatus is further configured to convert the feature data into target feature data corresponding to a browser engine; the browser engine analyzes the first drawing command in the dynamic analysis code data to obtain a corresponding third drawing command; and the browser engine executes the third drawing command according to the target characteristic data to obtain a corresponding drawing result.
20. A computer-readable storage medium, storing a computer program which, when executed by a processor, causes the processor to carry out the steps of the method according to any one of claims 1 to 11.
21. A computer device comprising a memory and a processor, the memory storing a computer program that, when executed by the processor, causes the processor to perform the steps of the method according to any one of claims 1 to 11.
CN201810865998.XA 2018-08-01 2018-08-01 Application data processing method and device, computer equipment and storage medium Active CN110795074B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810865998.XA CN110795074B (en) 2018-08-01 2018-08-01 Application data processing method and device, computer equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810865998.XA CN110795074B (en) 2018-08-01 2018-08-01 Application data processing method and device, computer equipment and storage medium

Publications (2)

Publication Number Publication Date
CN110795074A CN110795074A (en) 2020-02-14
CN110795074B CN110795074B (en) 2022-03-01

Family

ID=69425375

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810865998.XA Active CN110795074B (en) 2018-08-01 2018-08-01 Application data processing method and device, computer equipment and storage medium

Country Status (1)

Country Link
CN (1) CN110795074B (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106843937A * 2016-12-29 2017-06-13 Beijing Qihoo Technology Co., Ltd. Method and device for invoking an App corresponding to a notification
CN107645521A * 2016-07-21 2018-01-30 Ping An Technology (Shenzhen) Co., Ltd. Functional unit installation method, terminal and server
CN108139952A * 2017-06-14 2018-06-08 Beijing Xiaomi Mobile Software Co., Ltd. Application interaction method, interaction method and device

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8250521B2 (en) * 2007-12-14 2012-08-21 International Business Machines Corporation Method and apparatus for the design and development of service-oriented architecture (SOA) solutions

Also Published As

Publication number Publication date
CN110795074A (en) 2020-02-14

Similar Documents

Publication Publication Date Title
EP3964270A1 (en) Virtual object display method and apparatus, electronic device, and storage medium
CN107890671B (en) Three-dimensional model rendering method and device for WEB side, computer equipment and storage medium
EP3910599A1 (en) Rendering method and apparatus
US9164798B2 (en) Method, apparatus and computer for loading resource file for game engine
CN110780789B (en) Game application starting method and device, storage medium and electronic device
US20220249948A1 (en) Image processing method and apparatus, server, and medium
US10127626B1 (en) Method and apparatus improving the execution of instructions by execution threads in data processing systems
CN111346378A (en) Game picture transmission method, device, storage medium and equipment
CN112114808A (en) Page rendering method and device and electronic equipment
CN112035198A (en) Home page loading method, television and storage medium
CN111399938A (en) Method and device for cold starting of small program, computer equipment and storage medium
CN110795074B (en) Application data processing method and device, computer equipment and storage medium
KR102256314B1 (en) Method and system for providing dynamic content of face recognition camera
CN113727039A (en) Video generation method and device, electronic equipment and storage medium
CN112569591A (en) Data processing method, device and equipment and readable storage medium
CN113448641A (en) Method and device for starting small program, computer equipment and storage medium
CN111324340B (en) Interaction method and device based on webpage copy, storage medium and computer equipment
US9539514B2 (en) Method and system for generating signatures and locating/executing associations for a game program
US20140289656A1 (en) Systems and Methods for Creating and Using Electronic Content with Displayed Objects Having Enhanced Features
CN113360199A (en) Method, device and computer readable storage medium for preloading script in game
US20230267710A1 (en) Method, system and apparatus for training object recognition model
WO2014024255A1 (en) Terminal and video playback program
CN111275782B (en) Graph drawing method and device, terminal equipment and storage medium
CN112347397A (en) Data visualization method and device based on browser and readable storage medium
CN114065076A (en) Unity-based visualization method, system, device and storage medium

Legal Events

Date Code Title Description
PB01 Publication
REG Reference to a national code
Ref country code: HK; Ref legal event code: DE; Ref document number: 40021672; Country of ref document: HK

SE01 Entry into force of request for substantive examination
GR01 Patent grant