CN113938750A - Video processing method and device, electronic equipment and storage medium - Google Patents

Video processing method and device, electronic equipment and storage medium

Info

Publication number
CN113938750A
Authority
CN
China
Prior art keywords
rendering
display style
configuration file
data
configuration
Prior art date
Legal status
Pending
Application number
CN202010605418.0A
Other languages
Chinese (zh)
Inventor
张武星
赖守波
王同岩
Current Assignee
Alibaba Group Holding Ltd
Original Assignee
Alibaba Group Holding Ltd
Priority date
Filing date
Publication date
Application filed by Alibaba Group Holding Ltd filed Critical Alibaba Group Holding Ltd
Priority to CN202010605418.0A
Publication of CN113938750A

Classifications

All classifications fall under H04N 21/00, Selective content distribution, e.g. interactive television or video on demand [VOD] (H: Electricity; H04: Electric communication technique; H04N: Pictorial communication, e.g. television):
    • H04N 21/44: Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream, rendering scenes according to MPEG-4 scene graphs
    • H04N 21/2187: Live feed
    • H04N 21/234: Processing of video elementary streams, e.g. splicing of video streams, manipulating MPEG-4 scene graphs
    • H04N 21/23412: Generating or manipulating the scene composition of objects, e.g. MPEG-4 objects
    • H04N 21/44012: Rendering scenes according to scene graphs, e.g. MPEG-4 scene graphs
    • H04N 21/4586: Content update operation triggered locally, e.g. by comparing the version of software modules in a DVB carousel to the version stored locally
    • H04N 21/47202: End-user interface for requesting content on demand, e.g. video on demand

Abstract

An embodiment of the present application provides a video processing method, a video processing apparatus, an electronic device, and a storage medium. The method includes: loading a configuration file of a target display style; rendering target data in the target display style according to the configuration file; and outputting a corresponding rendering result. This can improve the efficiency of changing processing functions.

Description

Video processing method and device, electronic equipment and storage medium
Technical Field
The present application relates to the field of computer technologies, and in particular, to a video processing method, a video processing apparatus, an electronic device, and a storage medium.
Background
With the development of computer technology, more and more applications are based on image processing, such as short-video applications and photography applications, which can process images and videos.
At present, however, processing functions for video and image processing are usually integrated into an application's software package, so adding or modifying a processing function requires modifying the software package, generally through an application update.
This approach is constrained by the application's update cycle, and each processing function adds to the size of the software package, so updating processing functions is inconvenient and inefficient.
Disclosure of Invention
An embodiment of the present application provides a video processing method to improve processing efficiency.
Correspondingly, embodiments of the present application also provide a video processing apparatus, an electronic device, and a storage medium to ensure the implementation and application of the above methods.
In order to solve the above problem, an embodiment of the present application discloses a video processing method, including: loading a configuration file of a target display style; rendering target data in the target display style according to the configuration file; and outputting a corresponding rendering result.
In order to solve the above problem, an embodiment of the present application discloses a video processing method, where the method includes: providing a configuration template of a display style; acquiring parameter information of the display style; and generating a configuration file of the display style according to the parameter information and the configuration template.
In order to solve the above problem, an embodiment of the present application discloses a video processing apparatus, including: a configuration file loading module, configured to load a configuration file of a target display style; a data rendering module, configured to render target data in the target display style according to the configuration file; and a result output module, configured to output a corresponding rendering result.
In order to solve the above problem, an embodiment of the present application discloses a video processing apparatus, including: the configuration template providing module is used for providing a configuration template of a display style; the parameter information acquisition module is used for acquiring the parameter information of the display style; and the configuration file generation module is used for generating the configuration file of the display style according to the parameter information and the configuration template.
In order to solve the above problem, an embodiment of the present application discloses an electronic device, including: a processor; and a memory having executable code stored thereon, which when executed, causes the processor to perform the method as described in one or more of the above embodiments.
To address the above issues, embodiments of the present application disclose one or more machine-readable media having executable code stored thereon that, when executed, cause a processor to perform a method as described in one or more of the above embodiments.
Compared with the prior art, the embodiment of the application has the following advantages:
in the embodiments of the present application, target data is rendered using a loaded configuration file of a target display style, so that a rendering effect of the target display style can be obtained. Because the configuration file is separated from the software package, a processing function can be changed by updating the corresponding configuration file without updating the whole software package, which makes changing processing functions more convenient and improves the efficiency of such changes.
Drawings
FIG. 1 is a schematic diagram of video processing according to an embodiment of the present application;
FIG. 2 is a flow chart of the steps of one embodiment of a video processing method of the present application;
FIG. 3 is a schematic diagram of a configuration page of an embodiment of the present application;
FIG. 4 is a flow chart of steps in another video processing method embodiment of the present application;
FIG. 5 is a diagram illustrating an effect processing page according to an embodiment of the present application;
FIG. 6 is a flow chart of steps in yet another video processing method embodiment of the present application;
FIG. 7 is a block diagram of a video processing apparatus according to an embodiment of the present application;
FIG. 8 is a block diagram of another embodiment of a video processing apparatus of the present application;
FIG. 9 is a schematic structural diagram of an apparatus according to an embodiment of the present application.
Detailed Description
In order to make the aforementioned objects, features and advantages of the present application more comprehensible, the present application is described in further detail with reference to the accompanying drawings and the detailed description.
The embodiments of the present application can be applied to the field of image processing and to various scenes based on image processing, such as video applications, short-video applications, and shooting applications. A short video generally refers to a video whose duration is within a set range, such as 1 minute or 30 seconds. Such applications may provide one or more functions among capturing and shooting images and videos, editing and processing them, and playing them. To make it easier to update the video and image processing functions of an application, the embodiments of the present application provide configuration files for processing functions: processing functions are updated in the application through configuration files, so that the application's processing functions can be adjusted conveniently and quickly, improving efficiency.
The embodiment of the application can be applied to various scenes based on image processing, for example, video media processing scenes, such as video on demand, video live broadcast, video media production and the like, and can also be applied to scenes such as game picture production and the like. The video processing method can be executed through a client side, and also can be executed through a server side, a cloud side or an edge node.
For example, in video-on-demand and live-video scenes, a user can request a video from a server through a client and select an effect to be added, and the video is sent from the server to the user's client through an edge node. The client's effect-adding request can be sent to the server, which loads the corresponding configuration file, processes the video, and adds the corresponding effect on the server side; the request can instead be sent to an edge node, which loads the corresponding configuration file and adds the effect on the edge node; or the corresponding configuration file can be loaded in the client itself, which adds the effect to the video and outputs it.
In the following, the video processing method according to the embodiment of the present application is applied to the client as an example, and the processing at the server and the edge node is similar to that at the client. In the embodiment of the application, a rendering tree may be determined by a configuration file, and the rendering tree may also be referred to as a node tree (nodeTree), a rendering flow tree, and the like, where the rendering tree is used for rendering video data, and the video data may include video stream data, image data, and the like. The rendering tree includes: a node, an input node and an output node. A render tree may include at least one node for rendering video data, which may also be referred to as a processing node, a rendering node, and the like. The input node can be understood as an input for data and the output node as an output for data. For example, data collected by a camera, data played by a playing component, and a folder for storing the data may be used as input nodes; screens, video editors and folders that store data may be used as output nodes.
The nodes may include shaders, and each shader has data input nodes for feeding data into the shader and data output nodes for outputting the data processed by the shader. A data input node may be an input node of the tree or the data output node of another shader; likewise, a data output node may be an output node of the tree or the data input node of another shader. A shader is used to render video data and can be understood as a program that converts input into output; by executing the program, the video data is rendered. Shaders may include vertex shaders and fragment shaders. An image contains vertices and pixels, and one pixel can contain multiple sampling points; the vertex shader renders the vertices, and the fragment shader renders the sampling points.
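To make the structure concrete, the following is a minimal Python sketch of such a render tree. All class and field names here are illustrative assumptions for this description, not identifiers taken from the patent.

```python
from dataclasses import dataclass, field
from typing import List, Union


@dataclass
class Shader:
    """A program that converts input data into rendered output."""
    vertex_source: str    # vertex shader: renders vertices
    fragment_source: str  # fragment shader: renders sampling points


@dataclass
class InputNode:
    """A data input, e.g. camera capture, a playback component, or a folder."""
    source: str


@dataclass
class OutputNode:
    """A data output, e.g. a screen, a video editor, or a folder."""
    sink: str


@dataclass
class Node:
    """A processing (rendering) node in the render tree."""
    shader: Shader
    # A node's data input is either an input node of the tree or the data
    # output of another shader node; its output feeds an output node or
    # another node's input.
    inputs: List[Union[InputNode, "Node"]] = field(default_factory=list)


@dataclass
class RenderTree:
    nodes: List[Node]
    output: OutputNode
```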
The configuration file in the embodiments of the present application can be customized according to user operations or issued by the application's provider. Specifically, in one example, configuration files may be changed (added, deleted, modified, and so on) according to a user-defined configuration file, thereby changing the special-effect processing functions of the client. Because a special-effect processing function is changed by updating a configuration file, the whole software package does not need to be updated; updating a processing function is simple and convenient, and the update efficiency can be improved. As shown in FIG. 1, the user may customize configuration file 3 and add it to the client, thereby adding the processing function corresponding to configuration file 3. In another example, configuration files may be changed (added, deleted, modified, and so on) according to a configuration file delivered by the application's provider. As shown in FIG. 1, the provider may send configuration file 3 to the client to add it to the client, adding the processing function corresponding to configuration file 3.
The configuration file may include basic parameters and rendering parameters. The basic parameters can be understood as basic information about the configuration file, such as its name, file description, version number, and file type. In a scene where video data is given special-effect rendering, the file type can be understood as a special-effect type, such as a filter type or a transition type. One special-effect type may correspond to multiple special effects; for example, the filter type may include special effects such as black and white, bright, moonlight, morning light, and soft light.
The rendering parameters may include the nodes in the rendering tree and node parameters, and the node parameters may include shader and texture parameters. The shader in the node parameters can render the video data into data of the target display style using the texture parameters. Texture parameters can be understood as parameters used during shader rendering and can be defined according to user operations. They may include information such as parameter name, parameter type, parameter value, parameter maximum, and parameter minimum. For example, the texture parameters of a shader may be uniform variables, which represent information such as transformation matrices, textures, lighting parameters, and colors. The user may edit the basic parameters and the rendering parameters to generate a configuration file.
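As an illustration, a configuration file combining such basic parameters and rendering parameters might look like the sketch below, parsed here with Python. The JSON key names and values are assumptions based on the parameters just described, not the patent's actual schema (the JSON format itself is mentioned later in this description).

```python
import json

config_text = """
{
  "basic": {
    "name": "soft_light",
    "description": "soft light filter effect",
    "version": "1.0.0",
    "fileType": "filter"
  },
  "rendering": {
    "nodes": [{
      "name": "node1",
      "shader": {"shaderPath": "shaders/soft_light.frag"},
      "textureParams": [{
        "paramName": "intensity",
        "paramType": "float",
        "value": 0.6,
        "min": 0.0,
        "max": 1.0
      }]
    }]
  }
}
"""

config = json.loads(config_text)                # load the configuration file
assert config["basic"]["fileType"] == "filter"  # basic parameter: effect type
```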
In an alternative embodiment, the client may provide a configuration template corresponding to the special-effect processing function; the configuration template is used to determine a configuration page, and the configuration page is used to interact with the user to generate a configuration file. In one example, the configuration template is generated from a definition file of a display style; for a given display style, a corresponding definition file can be set in advance to generate the configuration template. Specifically, the client provides a configuration template of the display style in step 202. The client can then provide a configuration page of the display style according to the configuration template, and the configuration page can include a parameter configuration control. The user can enter the corresponding parameters on the configuration page using the parameter configuration control to determine the configuration file. It can be understood that the configuration page may be a single page or composed of multiple pages; the embodiments of the present application are not limited in this respect. FIG. 3 is a schematic diagram of a configuration page according to an embodiment of the present application; as shown in FIG. 3, the user may enter basic parameters and rendering parameters on the configuration page to generate a configuration file.
After determining the configuration page, the client obtains the parameter information of the display style in step 204. Specifically, the parameter information of the display style can be received via the parameter configuration control: the user enters the parameter information on the configuration page using the control to determine the configuration file. As shown in FIG. 3, the user enters the name, description, version number, and file type of the configuration file as basic parameters on the configuration page, and enters the data and texture parameters corresponding to the shader to generate the configuration file.
After determining the parameter information, the client generates a configuration file of the display style according to the parameter information and the configuration template in step 206. Based on the basic parameters, shader parameters, and texture parameters entered by the user on the configuration page, the configuration file of the display style is generated in combination with the configuration template and stored. The client can then use the configuration file to provide the corresponding special-effect processing function.
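A minimal sketch of step 206, under the assumption that the configuration template supplies a default structure that the user's parameter information fills in; the function and key names are hypothetical.

```python
def generate_config_file(template: dict, basic_params: dict,
                         rendering_params: dict) -> dict:
    """Combine user-entered parameters with a display-style template."""
    config = dict(template)                 # start from the template defaults
    config["basic"] = basic_params          # name, description, version, type
    config["rendering"] = rendering_params  # nodes with shader/texture params
    return config
```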
In an alternative embodiment, as shown in FIG. 3, the node parameters may include shader information, which is used to determine the shader. The shader information may include one of shader data and a shader storage location. Shader data can be understood as the instructions of a shader; the shader can be determined by parsing the shader data.
In an alternative embodiment, the client may generate the node parameters based on the shader data and texture parameters in the configuration page. Specifically, a user may input shader data and corresponding texture parameters in a configuration page to generate user-defined node parameters, so as to determine a corresponding configuration file. In the process of rendering video data, shader data can be analyzed, and a corresponding shader is determined for rendering.
In an alternative embodiment, the client may generate the node parameters according to the shader storage path and texture parameters in the configuration page. Specifically, a user may customize and store one shader, or may obtain and store a shader from another device, and in the configuration page, the user may input a shader storage path and texture parameters corresponding to the shader storage path to determine node parameters. In the process of rendering the video data, the corresponding shader can be called by using the shader storage path to perform rendering.
In an optional embodiment, the node parameters may further include data source information corresponding to the shader, which is used to determine the shader's input data. The client may determine the shader's data source information on the configuration page and add it to the rendering node information; the user can define the data source information on the configuration page to determine the node parameters. In one example, the data source information may correspond to a data input node, or to a folder storing the data to be rendered; accordingly, the shader's data source information may include one of a node identifier and a data storage path. The client may obtain a data source type (srcType). In one example, the data source type may include a first type and a second type: data of the first type corresponds to a node, and data of the second type is preset in a folder, such as picture data. When the data source type is the first type, the input node identifier is obtained as the data source information; when the data source type is the second type, the input data storage path is obtained as the data source information. The client can obtain data from the data input node corresponding to the node identifier, or from the folder according to the data storage path.
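The following sketch shows how a shader's input might be resolved from this data source information; the srcType values and field names are assumptions for illustration.

```python
def load_shader_input(src: dict, node_outputs: dict) -> bytes:
    """Return a shader's input data from its data source information."""
    if src["srcType"] == "node":
        # First type: data produced by the input node with this identifier.
        return node_outputs[src["nodeId"]]
    if src["srcType"] == "path":
        # Second type: preset data (e.g. a picture) stored in a folder.
        with open(src["dataPath"], "rb") as f:
            return f.read()
    raise ValueError(f"unknown srcType: {src['srcType']}")
```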
In an existing video processing method, some special-effect processing functions are built into the software package of an application; for example, for an Android application, special-effect processing is built into a Software Development Kit (SDK). With this approach, changing a special-effect processing function requires updating the whole SDK, which is very inconvenient. In the embodiments of the present application, the user can define the rendering parameters and basic parameters on the configuration page to determine a configuration file, and the client can complete the rendering of video data according to that configuration file. Because the configuration file is separated from the SDK, a special-effect processing function is changed by updating the corresponding configuration file without updating the whole SDK, which makes adding, modifying, and deleting special-effect processing functions more flexible and convenient. Moreover, since configuration files are defined based on a configuration template, they can be given a uniform format, improving their universality. For example, the file format may be based on JavaScript Object Notation (JSON).
On the basis of the foregoing embodiments, the present application further provides a video processing method, as shown in fig. 4, the method includes:
Step 402: providing a configuration page of a display style according to a configuration template of the display style, where the configuration page includes a parameter configuration control. Step 404: receiving parameter information of the display style via the parameter configuration control, where the parameter information includes basic parameters and rendering parameters. Step 406: generating a configuration file of the display style according to the parameter information and the configuration template.
In this embodiment of the present application, the client may provide a configuration page of the display style according to the configuration template; the user may define the shader and its texture parameters on the configuration page to generate node parameters, and rendering parameters are then determined from the nodes and node parameters. The parameter information is determined by combining the rendering parameters and the basic parameters, and a configuration file of the display style is then generated according to the parameter information and the configuration template. The user can enter information on the configuration page to define a configuration file and thus freely design special effects; in addition, because configuration files are defined based on a configuration template, they can be given a uniform format, improving their universality.
The foregoing embodiment specifically illustrates a configuration process of a configuration file, and on the basis of the foregoing embodiment, the present application further provides a video processing method, which can be executed by a client, a server, a cloud, and/or an edge node, and can render video data by using the configured configuration file to determine a rendering result of a target display style.
Specifically, as shown in fig. 1, taking the video processing method executed by the client as an example, the client loads a configuration file of a target display style in step 102. The client can acquire the configuration file according to the operation of the user and load the configuration file. In an alternative embodiment, the client provides an effect processing page, as shown in fig. 5, fig. 5 is a schematic diagram of the effect processing page according to an embodiment of the present application, where the effect processing page includes at least one effect processing control (e.g., effect 1, effect 2, and effect 3) in a display style; and the client receives the trigger of the special effect processing control and loads a configuration file of a target display style corresponding to the special effect processing control. The special effect processing control corresponds to the configuration file, the special effect processing control is used for determining the configuration file, and a user triggers the special effect processing control in the special effect processing page to determine the corresponding configuration file.
In an alternative embodiment, the configuration file may be obtained from a provider of the application upon request of the user. Specifically, after receiving an acquisition request of a target display style of a user, a client sends the acquisition request of the target display style; and receiving a configuration file corresponding to the target display style, and storing the configuration file into a specified storage address so as to be called and loaded. The client sends the acquisition request to a server (an application provider), the server returns a corresponding configuration file to the client according to the acquisition request, and the client receives and stores the configuration file. The stored configuration file may be loaded for rendering of the target data.
In an alternative embodiment, the special effect processing control may correspond to a file storage address of the storage configuration file. The client can determine a specified storage address based on the trigger of the special effect processing control; and acquiring a corresponding configuration file based on the specified storage address.
After loading the configuration file, the client renders the target data in the target display style according to the configuration file in step 104. Specifically, the configuration file includes basic parameters and rendering parameters, and the rendering parameters include the nodes in a rendering tree and node parameters. The corresponding rendering tree is determined from the configuration file, and the target data is rendered using the rendering tree. In an optional embodiment, the client may render the target data using the rendering tree in the configuration file and determine a rendering result with the target display style added. Specifically, the rendering tree includes multiple nodes, which may render the target data in order. In an alternative embodiment, the target data is added to the rendering tree as an input node, and the target data is rendered on the nodes in the node order of the rendering tree. The rendering order of the nodes can be determined from each node's data source information, and the target data is then rendered on the nodes in that order.
In particular, during the processing of video data, the data output node of one node may be the data input node of another node, so the order of the nodes can be determined from the nodes' data sources. As an optional embodiment, the client obtains the data source information of the nodes and determines the rendering order of the nodes based on that information. Specifically, a node whose data input node is an input node of the tree can be taken as a first-order node, and the subsequent nodes can be determined in sequence.
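This ordering rule amounts to a topological sort over the nodes' data sources, as in the sketch below; the node names follow the FIG. 1 example, and the function is an illustrative assumption rather than the patent's algorithm.

```python
def rendering_order(sources: dict) -> list:
    """sources maps each node id to the node ids it reads data from."""
    ordered, done = [], set()
    while len(ordered) < len(sources):
        progressed = False
        for node_id, deps in sources.items():
            if node_id not in done and all(d in done for d in deps):
                ordered.append(node_id)   # all inputs available: render next
                done.add(node_id)
                progressed = True
        if not progressed:
            raise ValueError("cycle in render tree")
    return ordered


# The tree of FIG. 1: nodes 1 and 3 read from input nodes (first order),
# node 2 reads from node 1, and node 4 reads from nodes 2 and 3.
order = rendering_order({
    "node1": [],
    "node2": ["node1"],
    "node3": [],
    "node4": ["node2", "node3"],
})
assert order == ["node1", "node2", "node3", "node4"]
```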
The data source information may include a node identifier or a data storage path, and the shader may obtain data using either during rendering. Specifically, when the data source information includes a node identifier, the shader obtains data from the node corresponding to that identifier; when the data source information includes a data storage path, the shader obtains data according to that path. The corresponding acquisition mode can be determined from the type of data source information contained in the rendering node information, so as to determine the shader's input data.
The node parameters in the configuration file include shader and texture parameters. When a node renders data, the node's shader is determined from the node parameters, and the shader renders the target data according to the texture parameters to determine the rendering result.
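A sketch of this per-node step, assuming the texture parameters are passed to the shader as uniform values; shader_runtime is a hypothetical execution environment, and resolve_shader is sketched after the next paragraph.

```python
def render_node(node_params: dict, input_data, shader_runtime):
    """Render one node: determine its shader, then apply texture parameters."""
    shader = resolve_shader(node_params)   # see the sketch below
    uniforms = {p["paramName"]: p["value"]
                for p in node_params.get("textureParams", [])}
    return shader_runtime.run(shader, input_data, uniforms)
```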
In an alternative embodiment, the node parameters include shader data, and the client parses the shader data to determine the corresponding shader. In another alternative embodiment, the rendering node information includes a shader storage path, and the client calls the corresponding shader based on that path. That is, for node parameters containing shader data, the shader data can be parsed to determine the shader; for node parameters containing a shader storage path, the corresponding shader can be called from the corresponding folder according to the path.
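A sketch of these two cases, reusing the Shader type from the render-tree sketch above; compile_shader is a hypothetical helper standing in for the client's parsing, and the key names are assumptions.

```python
def compile_shader(source: str) -> "Shader":
    ...  # hypothetical: parse shader instructions into a Shader


def resolve_shader(node_params: dict) -> "Shader":
    info = node_params["shader"]
    if "shaderData" in info:
        # Inline shader data: parse it to determine the shader.
        return compile_shader(info["shaderData"])
    if "shaderPath" in info:
        # Shader storage path: call the stored shader from its folder.
        with open(info["shaderPath"]) as f:
            return compile_shader(f.read())
    raise ValueError("node parameters contain no shader information")
```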
The target data can be fed into the nodes as an input node of the rendering tree. The client can obtain the target data through an image acquisition device, an image display device, or a folder storing the target data. Specifically, as an optional embodiment, the client may obtain captured video data from a camera as the target data, obtain played video data from a playing component as the target data, or obtain at least one piece of video data from at least one storage address as the target data. The target data may include one or more pieces of video data, and the client may obtain video data from one or more of the camera, the playing component, and the folder. The storage address may be determined according to a video encoder.
After the image data has been rendered by the rendering nodes, the client outputs the corresponding rendering result in step 106. Specifically, the rendering result may be output to a display device for display, or to a video encoder for encoding. In an alternative embodiment, the client may output the rendering result of the target data on a screen, or output it in a video editor. The configuration file of the embodiments of the present application may render one piece of video data, or two or more pieces. In an optional embodiment, when one piece of video data is rendered, the rendering result is output to the screen for display; when at least two pieces of video data are rendered, the rendering result is output to a video encoder. For example, after one piece of video data is rendered (say, a filter special effect is added to an image), the result can be displayed on the screen; after two or more pieces of video data are rendered (say, a transition special effect across two or more images), the rendered data can be output to a video encoder for encoding, so as to output the rendering result.
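A sketch of this output rule; the screen and encoder objects and their methods are hypothetical stand-ins for the display device and video encoder.

```python
def output_result(result, num_videos: int, screen, encoder) -> None:
    if num_videos == 1:
        # One rendered video (e.g. a filter effect): display on screen.
        screen.display(result)
    else:
        # Two or more videos (e.g. a transition): encode for output.
        encoder.encode(result)
```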
For example, as shown in FIG. 1, node 1 receives data from input node 1 and renders it; node 2 further renders the data rendered by node 1; node 3 obtains data from input node 2 and renders it; and node 4 receives the data rendered by nodes 2 and 3, renders the two together, and outputs the rendering result to a video encoder for encoding.
In the embodiments of the present application, the rendering tree can be determined from a configured configuration file, and the video data is rendered by the nodes in the rendering tree to obtain the rendering effect of the target display style. Because the configuration file is separated from the software package, a special-effect processing function is changed by updating the corresponding configuration file without updating the whole software package, which makes changing special-effect processing functions more convenient.
On the basis of the foregoing embodiments, the present application further provides a video processing method, as shown in fig. 6, the method includes:
step 602, providing an effect processing page, where the effect processing page includes at least one effect processing control of a display style.
And step 604, receiving the trigger of the special effect processing control, and loading a configuration file of a target display style corresponding to the special effect processing control.
Step 606, adding the target data as an input node into a rendering tree; the rendering tree comprises a plurality of nodes; the node parameters of the node include shader and texture parameters.
And 608, rendering the target data according to the texture parameters through the shaders of the nodes according to the node sequence of the rendering tree. The target data comprises one or more video data; the target data may be obtained in at least one of the following ways: acquiring collected video data as target data according to the camera; acquiring played video data from the playing component as target data; at least one video data is acquired as target data from at least one memory address.
And step 610, outputting a corresponding rendering result. Outputting the corresponding rendering result may include at least one of the following ways: outputting a rendering result of the target data in a screen; in the video editor, a rendering result of the target data is output.
In this embodiment of the present application, the user can trigger a special-effect processing control on the special-effect processing page to determine and load the corresponding configuration file. A rendering tree is then determined according to the configured configuration file, and the video data is rendered by the nodes in the rendering tree to obtain the rendering effect of the target display style.
It should be noted that, for simplicity of description, the method embodiments are described as a series of action combinations, but those skilled in the art will recognize that the embodiments of the present application are not limited by the order of actions described, as some steps may be performed in other orders or concurrently. Further, those skilled in the art will also appreciate that the embodiments described in the specification are preferred embodiments, and the actions involved are not necessarily required by the embodiments of the present application.
On the basis of the foregoing embodiment, this embodiment further provides a video processing apparatus, and with reference to fig. 7, the video processing apparatus may specifically include the following modules:
and a configuration file loading module 702, configured to load a configuration file of the target display style.
And a data rendering module 704, configured to render the target data in the target display style according to the configuration file.
And a result output module 706, configured to output a corresponding rendering result.
In summary, in the embodiment of the present application, the rendering tree can be determined according to the configured configuration file, and the video data is rendered by using the nodes in the rendering tree to obtain the rendering effect of the target display style.
On the basis of the foregoing embodiment, this embodiment further provides a video processing apparatus, which may specifically include the following modules:
and the processing page providing module is used for providing a special effect processing page, and the special effect processing page comprises at least one special effect processing control of a display style.
And the configuration file selection module is used for receiving the trigger of the special effect processing control and loading the configuration file of the target display style corresponding to the special effect processing control.
The target data input module is used for adding target data serving as an input node into a rendering tree; the rendering tree comprises a plurality of nodes; the node parameters of the node include shader and texture parameters.
And the target data rendering module is used for rendering the target data according to the texture parameters through the shaders of the nodes according to the node sequence of the rendering tree. The target data comprises one or more video data; the target data may be obtained in at least one of the following ways: acquiring collected video data as target data according to the camera; acquiring played video data from the playing component as target data; at least one video data is acquired as target data from at least one memory address.
And the rendering result output module is used for outputting the corresponding rendering result. The rendering result output module may include at least one of the following modules: the display processing submodule is used for outputting a rendering result of the target data in a screen; and the coding processing sub-module is used for outputting a rendering result of the target data in the video editor.
In this embodiment of the present application, the user can trigger a special-effect processing control on the special-effect processing page to determine and load the corresponding configuration file. A rendering tree is then determined according to the configured configuration file, and the video data is rendered by the nodes in the rendering tree to obtain the rendering effect of the target display style.
In an optional embodiment, the apparatus further comprises:
and the request processing module is used for sending an acquisition request of the target display style.
And the configuration file storage module is used for receiving the configuration file corresponding to the target display style and storing the configuration file into a specified storage address.
On the basis of the foregoing embodiment, this embodiment further provides a video processing apparatus, and with reference to fig. 8, the video processing apparatus may specifically include the following modules:
a configuration template providing module 802 for providing a configuration template of the display style; a parameter information obtaining module 804, configured to obtain parameter information of the display style; a configuration file generating module 806, configured to generate a configuration file of the display style according to the parameter information and the configuration template.
In summary, the embodiments of the present application can provide a configuration template to present a configuration page of a display style, and the user can edit parameter information and generate a configuration file in combination with the template. The user can freely design special effects; in addition, because configuration files are defined based on a configuration template, they can be given a uniform format, improving their universality.
On the basis of the foregoing embodiment, this embodiment further provides a video processing apparatus, which may specifically include the following modules:
the configuration page display module is used for providing a configuration page of a display style according to a configuration template of the display style, and the configuration page comprises: and (5) controlling parameter configuration.
A parameter information receiving module, configured to receive parameter information of a display style according to a parameter configuration control, where the parameter information includes: basic parameters and rendering parameters.
And the configuration file determining module is used for generating a configuration file of the display style according to the parameter information and the configuration template.
In this embodiment of the present application, a configuration page of the display style can be provided according to the configuration template; the user can define the shader and its texture parameters on the configuration page to generate node parameters, and rendering parameters are then determined from the nodes and node parameters. The parameter information is determined by combining the rendering parameters and the basic parameters, and a configuration file of the display style is then generated according to the parameter information and the configuration template. The user can enter information on the configuration page to define a configuration file and thus freely design special effects; in addition, because configuration files are defined based on a configuration template, they can be given a uniform format, improving their universality.
The present application further provides a non-transitory, readable storage medium, where one or more modules (programs) are stored, and when the one or more modules are applied to a device, the device may execute instructions (instructions) of method steps in this application.
Embodiments of the present application provide one or more machine-readable media having instructions stored thereon, which when executed by one or more processors, cause an electronic device to perform the methods as described in one or more of the above embodiments. In the embodiment of the application, the electronic device includes a terminal device, an edge computing device, a cloud device, a server device and other devices.
Embodiments of the present disclosure may be implemented as an apparatus, which may include electronic devices such as a server (cluster), a terminal device, an edge computing device, a cloud device, a server device, and the like, using any suitable hardware, firmware, software, or any combination thereof, to perform a desired configuration. Fig. 9 schematically illustrates an example apparatus 900 that may be used to implement various embodiments described herein.
For one embodiment, fig. 9 illustrates an example apparatus 900 having one or more processors 902, a control module (chipset) 904 coupled to at least one of the processor(s) 902, a memory 906 coupled to the control module 904, a non-volatile memory (NVM)/storage 908 coupled to the control module 904, one or more input/output devices 910 coupled to the control module 904, and a network interface 912 coupled to the control module 904.
The processor 902 may include one or more single-core or multi-core processors, and the processor 902 may include any combination of general-purpose or special-purpose processors (e.g., graphics processors, application processors, baseband processors, etc.). In some embodiments, the apparatus 900 can be used as a terminal device, an edge computing device, a cloud device, a server device, and the like in this embodiment.
In some embodiments, apparatus 900 may include one or more computer-readable media (e.g., memory 906 or NVM/storage 908) having instructions 914 and one or more processors 902 in combination with the one or more computer-readable media and configured to execute instructions 914 to implement modules to perform the actions described in this disclosure.
For one embodiment, control module 904 may include any suitable interface controllers to provide any suitable interface to at least one of the processor(s) 902 and/or any suitable device or component in communication with control module 904.
The control module 904 may include a memory controller module to provide an interface to the memory 906. The memory controller module may be a hardware module, a software module, and/or a firmware module.
The memory 906 may be used, for example, to load and store data and/or instructions 914 for the device 900. For one embodiment, memory 906 may comprise any suitable volatile memory, such as suitable DRAM. In some embodiments, the memory 906 may comprise a double data rate type four synchronous dynamic random access memory (DDR4 SDRAM).
For one embodiment, the control module 904 may include one or more input/output controllers to provide an interface to the NVM/storage 908 and input/output device(s) 910.
For example, NVM/storage 908 may be used to store data and/or instructions 914. NVM/storage 908 may include any suitable non-volatile memory (e.g., flash memory) and/or may include any suitable non-volatile storage device(s) (e.g., one or more Hard Disk Drives (HDDs), one or more Compact Disc (CD) drives, and/or one or more Digital Versatile Disc (DVD) drives).
The NVM/storage 908 may include storage resources that are part of the device on which the apparatus 900 is installed or may be accessible by the device and may not necessarily be part of the device. For example, NVM/storage 908 may be accessible over a network via input/output device(s) 910.
Input/output device(s) 910 may provide an interface for apparatus 900 to communicate with any other suitable device; input/output devices 910 may include communication components, audio components, sensor components, and so on. Network interface 912 may provide an interface for apparatus 900 to communicate over one or more networks; apparatus 900 may communicate wirelessly with one or more components of a wireless network according to any of one or more wireless network standards and/or protocols, for example accessing a wireless network based on a communication standard such as WiFi, 2G, 3G, 4G, or 5G, or a combination thereof.
For one embodiment, at least one of the processor(s) 902 may be packaged together with logic for one or more controller(s) (e.g., memory controller module) of the control module 904. For one embodiment, at least one of the processor(s) 902 may be packaged together with logic for one or more controller(s) of the control module 904 to form a System In Package (SiP). For one embodiment, at least one of the processor(s) 902 may be integrated on the same die with logic for one or more controller(s) of the control module 904. For one embodiment, at least one of the processor(s) 902 may be integrated on the same die with logic of one or more controllers of the control module 904 to form a system on a chip (SoC).
In various embodiments, the apparatus 900 may be, but is not limited to being: a server, a desktop computing device, or a mobile computing device (e.g., a laptop computing device, a handheld computing device, a tablet, a netbook, etc.), among other terminal devices. In various embodiments, apparatus 900 may have more or fewer components and/or different architectures. For example, in some embodiments, device 900 includes one or more cameras, keyboards, Liquid Crystal Display (LCD) screens (including touch screen displays), non-volatile memory ports, multiple antennas, graphics chips, Application Specific Integrated Circuits (ASICs), and speakers.
The detection device can adopt a main control chip as a processor or a control module, sensor data, position information and the like are stored in a memory or an NVM/storage device, a sensor group can be used as an input/output device, and a communication interface can comprise a network interface.
An embodiment of the present application further provides an electronic device, including: a processor; and a memory having executable code stored thereon that, when executed, causes the processor to perform a method as described in one or more of the embodiments of the application.
Embodiments of the present application also provide one or more machine-readable media having executable code stored thereon that, when executed, cause a processor to perform a method as described in one or more of the embodiments of the present application.
For the device embodiment, since it is basically similar to the method embodiment, the description is simple, and for the relevant points, refer to the partial description of the method embodiment.
The embodiments in the present specification are described in a progressive manner, each embodiment focuses on differences from other embodiments, and the same and similar parts among the embodiments are referred to each other.
Embodiments of the present application are described with reference to flowchart illustrations and/or block diagrams of methods, terminal devices (systems), and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing terminal to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing terminal, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing terminal to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing terminal to cause a series of operational steps to be performed on the computer or other programmable terminal to produce a computer implemented process such that the instructions which execute on the computer or other programmable terminal provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
While preferred embodiments of the present application have been described, additional variations and modifications of these embodiments may occur to those skilled in the art once they learn of the basic inventive concepts. Therefore, it is intended that the appended claims be interpreted as including the preferred embodiment and all such alterations and modifications as fall within the true scope of the embodiments of the application.
Finally, it should also be noted that, herein, relational terms such as first and second are used solely to distinguish one entity or action from another, without necessarily requiring or implying any actual such relationship or order between such entities or actions. Moreover, the terms "comprises," "comprising," and any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or terminal that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or terminal. Without further limitation, an element preceded by the phrase "comprising a ..." does not exclude the presence of other identical elements in the process, method, article, or terminal that comprises the element.
The foregoing detailed description has presented a video processing method, a video processing apparatus, an electronic device, and a storage medium. Specific examples are used herein to explain the principles and embodiments of the present application, and these descriptions serve only to aid understanding of the method and its core ideas. A person skilled in the art may, following the ideas of the present application, vary the specific embodiments and the scope of application; accordingly, nothing in this specification should be construed as limiting the present application.

Claims (18)

1. A method of video processing, the method comprising:
loading a configuration file of a target display style;
rendering target data in the target display style according to the configuration file;
and outputting a corresponding rendering result.
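As a rough, non-authoritative illustration of claim 1 only (the patent specifies no code), the three steps might map onto a minimal Python sketch like the following; every identifier here (load_style_config, render_with_style, process) is hypothetical.

```python
import json

def load_style_config(path: str) -> dict:
    """Step 1: load the configuration file of the target display style."""
    with open(path, "r", encoding="utf-8") as f:
        return json.load(f)

def render_with_style(frame, config: dict):
    """Step 2: render the target data in the target display style described
    by the configuration file. Placeholder: a real backend would walk the
    rendering tree of the file (see the sketch after claim 6)."""
    return frame

def process(frame, config_path: str):
    """Claim 1 end to end: load the configuration, render, and return the
    rendering result for output (step 3)."""
    config = load_style_config(config_path)
    return render_with_style(frame, config)
```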
2. The method of claim 1, further comprising:
sending an acquisition request of a target display style;
and receiving a configuration file corresponding to the target display style, and storing the configuration file at a specified storage address.
3. The method of claim 1, wherein loading the configuration file of the target display style comprises:
providing a special-effect processing page, wherein the special-effect processing page comprises a special-effect processing control of at least one display style;
and receiving a trigger of the special-effect processing control, and loading a configuration file of the target display style corresponding to the special-effect processing control.
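Claims 2-3 read as a fetch-and-cache step plus a UI trigger. A minimal sketch, assuming a hypothetical style server URL and cache directory, and reusing load_style_config from the sketch after claim 1:

```python
import os
import urllib.request

CONFIG_DIR = "style_configs"                 # hypothetical local storage address
STYLE_SERVER = "https://example.com/styles"  # hypothetical style server

def fetch_style_config(style_id: str) -> str:
    """Claim 2: send an acquisition request for the target display style and
    store the received configuration file at a specified storage address."""
    os.makedirs(CONFIG_DIR, exist_ok=True)
    local_path = os.path.join(CONFIG_DIR, f"{style_id}.json")
    if not os.path.exists(local_path):
        urllib.request.urlretrieve(f"{STYLE_SERVER}/{style_id}.json", local_path)
    return local_path

def on_effect_control_triggered(style_id: str) -> dict:
    """Claim 3: a special-effect processing control was triggered; load the
    configuration file of the corresponding target display style."""
    return load_style_config(fetch_style_config(style_id))
```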
4. The method of claim 1, wherein rendering the target data in the target display style according to the configuration file comprises:
rendering the target data using a rendering tree in the configuration file, and determining a rendering result to which the target display style has been added.
5. The method of claim 4, wherein the rendering tree comprises a plurality of nodes, and rendering the target data using the rendering tree in the configuration file comprises:
adding the target data as an input node to the rendering tree;
and rendering the target data at each of the nodes in turn, according to the node order of the rendering tree.
6. The method of claim 5, wherein node parameters of a node include a shader and texture parameters, and rendering the target data at the node comprises:
rendering the target data according to the texture parameters using the shader of the node.
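To picture the rendering tree of claims 4-6, here is a minimal sketch that assumes, for simplicity, a linear node order; the RenderNode layout and the CPU-side run_shader stand-in are illustrative only, not the patent's implementation:

```python
from dataclasses import dataclass, field

@dataclass
class RenderNode:
    shader: str                                    # shader identifier (claim 6)
    texture_params: dict = field(default_factory=dict)

def run_shader(shader_id: str, data, texture_params: dict):
    """CPU stand-in for a GPU shader pass (e.g. a fragment shader)."""
    return data  # a real pass would transform the pixels per the parameters

def apply_render_tree(frame, nodes: list):
    """Claim 5: the target data enters as the input node and is rendered at
    each node in turn, following the node order of the rendering tree."""
    data = frame
    for node in nodes:
        # Claim 6: each node renders via its shader, parameterized by its
        # texture parameters.
        data = run_shader(node.shader, data, node.texture_params)
    return data
```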
7. The method of any of claims 1-6, wherein the target data comprises one or more pieces of video data, and the method further comprises obtaining the target data by at least one of:
acquiring video data captured by a camera as the target data;
acquiring video data being played from a playback component as the target data;
acquiring at least one piece of video data from at least one storage address as the target data.
8. The method of claim 7, wherein outputting the corresponding rendering result comprises at least one of:
outputting a rendering result of the target data on a screen;
outputting a rendering result of the target data in a video editor.
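For claims 7-8, the data sources and output sinks could be wired up as below; OpenCV (cv2) is used here only as one plausible capture-and-display backend and is an assumption, not something the patent names:

```python
import cv2  # assumption: OpenCV as one possible capture/display backend

def frames_from_camera():
    """Claim 7 (first option): acquire video data captured by a camera as
    the target data, frame by frame."""
    cap = cv2.VideoCapture(0)
    try:
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            yield frame
    finally:
        cap.release()

def output_to_screen(frame) -> None:
    """Claim 8 (first option): output the rendering result of the target
    data on a screen."""
    cv2.imshow("rendered", frame)
    cv2.waitKey(1)
```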
9. A method of video processing, the method comprising:
providing a configuration template of a display style;
acquiring parameter information of the display style;
and generating a configuration file of the display style according to the parameter information and the configuration template.
10. The method of claim 9, wherein providing the configuration template of the display style comprises:
providing a configuration page of the display style according to a configuration template of the display style, wherein the configuration page comprises a parameter configuration control.
11. The method of claim 10, wherein obtaining the parameter information of the display style comprises:
receiving the parameter information of the display style via the parameter configuration control, wherein the parameter information comprises basic parameters and rendering parameters.
12. The method of claim 11, wherein the rendering parameters comprise nodes of a rendering tree and node parameters, and the node parameters include a shader and texture parameters.
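Claims 9-12 describe generating a configuration file from a template plus basic and rendering parameters. A sketch of one plausible shape for such a file, with all field names invented for illustration:

```python
import json

# Hypothetical configuration template (claim 9); field names are invented.
CONFIG_TEMPLATE = {
    "style_name": "",   # basic parameters (claim 11)
    "version": 1,
    "render_tree": [],  # rendering parameters: nodes and node parameters (claim 12)
}

def generate_style_config(params: dict, out_path: str) -> None:
    """Claims 9-11: merge the parameter information received via the
    configuration page into the template and write the configuration file."""
    config = {**CONFIG_TEMPLATE, **params}
    with open(out_path, "w", encoding="utf-8") as f:
        json.dump(config, f, indent=2)

generate_style_config(
    {
        "style_name": "vintage",
        "render_tree": [
            {"shader": "sepia",    "texture_params": {"intensity": 0.8}},
            {"shader": "vignette", "texture_params": {"radius": 0.6}},
        ],
    },
    "vintage.json",
)
```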
13. A video processing apparatus, comprising:
the configuration file loading module is used for loading a configuration file of the target display style;
the data rendering module is used for rendering target data in the target display style according to the configuration file;
and the result output module is used for outputting the corresponding rendering result.
14. A video processing apparatus, comprising:
the configuration template providing module is used for providing a configuration template of a display style;
the parameter information acquisition module is used for acquiring the parameter information of the display style;
and the configuration file generation module is used for generating the configuration file of the display style according to the parameter information and the configuration template.
15. An electronic device, comprising: a processor; and
memory having stored thereon executable code which, when executed, causes the processor to perform the method of one or more of claims 1-8.
16. One or more machine-readable media having executable code stored thereon that, when executed, causes a processor to perform the method of one or more of claims 1-8.
17. An electronic device, comprising: a processor; and
memory having stored thereon executable code which, when executed, causes the processor to perform the method of one or more of claims 9-12.
18. One or more machine-readable media having executable code stored thereon that, when executed, causes a processor to perform the method of one or more of claims 9-12.
CN202010605418.0A 2020-06-29 2020-06-29 Video processing method and device, electronic equipment and storage medium Pending CN113938750A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010605418.0A CN113938750A (en) 2020-06-29 2020-06-29 Video processing method and device, electronic equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010605418.0A CN113938750A (en) 2020-06-29 2020-06-29 Video processing method and device, electronic equipment and storage medium

Publications (1)

Publication Number Publication Date
CN113938750A true CN113938750A (en) 2022-01-14

Family

ID=79272964

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010605418.0A Pending CN113938750A (en) 2020-06-29 2020-06-29 Video processing method and device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN113938750A (en)

Citations (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130097520A1 (en) * 2011-10-18 2013-04-18 Research In Motion Limited Method of rendering a user interface
CN104751507A (en) * 2013-12-31 2015-07-01 北界创想(北京)软件有限公司 Method and device for rendering pattern contents
CN106296785A (en) * 2016-08-09 2017-01-04 腾讯科技(深圳)有限公司 A kind of picture rendering intent and picture rendering apparatus
US10067950B1 (en) * 2014-06-25 2018-09-04 Google Llc Systems and methods for efficiently organizing map styling information
CN109035373A (en) * 2018-06-28 2018-12-18 北京市商汤科技开发有限公司 The generation of three-dimensional special efficacy program file packet and three-dimensional special efficacy generation method and device
CN109358936A (en) * 2018-09-29 2019-02-19 Oppo广东移动通信有限公司 Information processing method, device, storage medium, electronic equipment and system
CN109792564A (en) * 2016-07-21 2019-05-21 索尼互动娱乐美国有限责任公司 The method and system of previously stored game episode is accessed for the videograph by executing on game cloud system
CN110152291A (en) * 2018-12-13 2019-08-23 腾讯科技(深圳)有限公司 Rendering method, device, terminal and the storage medium of game picture
CN110458930A (en) * 2019-08-13 2019-11-15 网易(杭州)网络有限公司 Rendering method, device and the storage medium of three-dimensional map
CN110599396A (en) * 2019-09-19 2019-12-20 网易(杭州)网络有限公司 Information processing method and device
CN110750958A (en) * 2019-10-17 2020-02-04 北京奇艺世纪科技有限公司 Text display method and device, electronic equipment and medium
CN110807761A (en) * 2019-09-18 2020-02-18 腾讯科技(深圳)有限公司 Method and device for generating label panel, storage medium and computer equipment
CN111031393A (en) * 2019-12-26 2020-04-17 广州酷狗计算机科技有限公司 Video playing method, device, terminal and storage medium
CN111142989A (en) * 2019-12-05 2020-05-12 苏州睿威博科技有限公司 Object management method, device and storage medium

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023197793A1 (en) * 2022-04-15 2023-10-19 北京字跳网络技术有限公司 Post-processing special effect manufacturing system and method and ar special effect rendering method and apparatus
CN116684704A (en) * 2023-07-21 2023-09-01 北京美摄网络科技有限公司 Video processing method and device
CN116684704B (en) * 2023-07-21 2023-11-03 北京美摄网络科技有限公司 Video processing method and device

Similar Documents

Publication Publication Date Title
CN112184856B (en) Multimedia processing device supporting multi-layer special effect and animation mixing
CN109672884B (en) Image hardware coding processing method and device
WO2017024964A1 (en) Object-associated image quick preview method and device
CN111899322A (en) Video processing method, animation rendering SDK, device and computer storage medium
US20180053531A1 (en) Real time video performance instrument
CN113938750A (en) Video processing method and device, electronic equipment and storage medium
US11893770B2 (en) Method for converting a picture into a video, device, and storage medium
CN112073753B (en) Method, device, equipment and medium for publishing multimedia data
CN110674624A (en) Method and system for editing image and text
CN112637623A (en) Live broadcast processing method and device and electronic equipment
KR20220143442A (en) Method and apparatus for timed and event triggered updates in a scene
CN109643528B (en) Information processing apparatus, information processing method, and program
CN105589667B (en) Method and device for capturing display image of display equipment
KR101720635B1 (en) Method for web-based producing 3d video contents and server implementing the same
CN110782387A (en) Image processing method and device, image processor and electronic equipment
CN111432142A (en) Video synthesis method, device, equipment and storage medium
CN108010095B (en) Texture synthesis method, device and equipment
CN108876866B (en) Media data processing method, device and storage medium
US20210382931A1 (en) Information processing apparatus, control method of information processing apparatus, and non-transitory computer-readable storage medium
CN114721728A (en) Processing method based on cloud application, electronic equipment and storage medium
CN112348928A (en) Animation synthesis method, animation synthesis device, electronic device, and medium
CN111242688A (en) Animation resource manufacturing method and device, mobile terminal and storage medium
RU2690888C2 (en) Method, apparatus and computing device for receiving broadcast content
CN110659372A (en) Picture input and access method, device and equipment
CN110868637A (en) Video, data processing method, device, electronic equipment and storage medium

Legal Events

Code Description
PB01 Publication
SE01 Entry into force of request for substantive examination
REG Reference to a national code (ref country code: HK; ref legal event code: DE; ref document number: 40065673)