CN110853121B - Cross-platform data processing method and device based on AE - Google Patents


Info

Publication number: CN110853121B
Application number: CN201911022115.XA
Authority: CN (China)
Prior art keywords: data, video, video template, data structure, platform
Legal status: Active (the legal status is an assumption and is not a legal conclusion)
Other languages: Chinese (zh)
Other versions: CN110853121A
Inventor: 陈竞郴
Current assignee: Gaoding Xiamen Technology Co Ltd
Original assignee: Gaoding Xiamen Technology Co Ltd
Application filed by Gaoding Xiamen Technology Co Ltd
Priority to: CN201911022115.XA
Publication of application: CN110853121A
Publication of granted patent: CN110853121B

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 13/00 Animation

Abstract

The application discloses a cross-platform data processing method and device based on AE. In the method, a mobile terminal obtains a video template designed and completed in AE; performs data parsing on the video template based on a single shared code library, where mobile terminals of different platform systems correspond to the same code library; assembles the parsed data into a unified data structure; and displays the video template according to the unified data structure. The method and device solve the problem of inconsistent final animation display caused by the conventional way of using AE-designed animation on mobile terminals.

Description

Cross-platform data processing method and device based on AE
Technical Field
The application relates to the technical field of video processing, in particular to a method and a device for cross-platform data processing based on AE.
Background
Adobe After Effects, abbreviated "AE", is a graphics and video processing software product from Adobe, commonly used by individual designers and studios engaged in designing animations and videos. When an animation designed in AE is to be used on a mobile terminal, it must be converted by corresponding software, and the converted animation is then displayed on the mobile terminal.
Currently, the way an AE-designed animation is used on a mobile terminal is: the animation production file is exported from AE through an open-source code library, then imported and displayed on the mobile terminal using the code library for each platform. The code libraries for different platforms are different; for example, with Airbnb's Lottie, the three mobile targets iOS, Android and web use the code libraries lottie-ios, lottie-android and lottie-web respectively. Multiple platforms thus require multiple sets of code, which can cause the final animation display effect to differ between platforms.
Disclosure of Invention
The application mainly aims to provide a cross-platform data processing method and device based on AE, so as to solve the problem of inconsistent final animation display caused by the conventional way of using AE-designed animation on mobile terminals.
To achieve the above object, according to a first aspect of the present application, there is provided a method of AE-based cross-platform data processing. The method comprises:
the mobile terminal acquires a video template designed in AE;
performing data analysis on the video template based on the same code library, wherein mobile terminals of different platform systems correspond to the same code library;
assembling the parsed data into a unified data structure;
and displaying the video template according to the unified data structure.
Further, the data parsing of the video template based on the same code library includes:
and analyzing layer information and layer association information in a json file corresponding to the video template based on the universal underlying language.
Further, the associated information of the image layer at least includes mask information, transformation information, and a special effect filter.
Further, the universal underlying language is C++, and assembling the parsed data into a unified data structure includes:
encapsulating, via C++ class objects, the parsed data, the audio/video track logic, the interface, and the filter chain parameter array.
Further, after assembling the parsed data into a unified data structure, the method further includes:
receiving replacement data, wherein the replacement data is used for replacing original data in a preset layer of a video template;
synthesizing the replacement data and the unified data structure to obtain a target video;
the displaying the video template according to the unified data structure comprises:
and displaying the target video.
Further, before receiving the replacement data, the method further comprises:
and modifying the setting of a preset layer in the video template to obtain an editable video template.
Further, the synthesizing the replacement data with the unified data structure includes:
and calling an open graphics library OpenGL interface to synthesize the replacement data and a unified data structure corresponding to the editable video template.
Further, the synthesizing the replacement data with the unified data structure further includes:
and superposing the special effects in the editable video template according to the filter chains of different filter combinations.
Further, the synthesizing the replacement data and the unified data structure further includes:
carrying out coding and decoding processing on a unified data structure and replacement data corresponding to the editable video template;
and importing the data after the coding and decoding processing into a GPU (graphics processing Unit) for multi-texture synthesis to obtain a target video.
To achieve the above object, according to a second aspect of the present application, there is provided an AE-based cross-platform data processing apparatus.
The device for AE-based cross-platform data processing comprises:
the acquisition unit is used for acquiring the video template designed in the AE by the mobile terminal;
the analysis unit is used for carrying out data analysis on the video template based on the same code library, and mobile terminals of different platform systems correspond to the same code library;
the assembling unit is used for assembling the analyzed data into a unified data structure;
and the display unit is used for displaying the video template according to the unified data structure.
Further, the parsing unit is configured to:
and analyzing layer information and layer association information in a json file corresponding to the video template based on the universal underlying language.
Further, the associated information of the image layer at least includes mask information, transformation information, and a special effect filter.
Further, the universal underlying language is C++, and the assembly unit is configured to:
encapsulate, via C++ class objects, the parsed data, the audio/video track logic, the interface, and the filter chain parameter array.
Further, the apparatus further comprises:
a receiving unit, configured to receive replacement data after the parsed data is assembled into a unified data structure, where the replacement data is used to replace original data in a preset layer of the video template;
the synthesis unit is used for synthesizing the replacement data and the unified data structure to obtain a target video;
the display unit is used for:
and displaying the target video.
Further, the apparatus further comprises:
and the modifying unit is used for modifying the setting of a preset layer in the video template before receiving the replacement data to obtain an editable video template.
Further, the synthesis unit includes:
and the calling module is used for calling an open graphics library OpenGL interface to synthesize the replacement data and a unified data structure corresponding to the editable video template.
Further, the synthesis unit further includes:
and the superposition module is used for superposing the special effects in the editable video template according to the filter chains of different filter combinations.
Further, the synthesis unit further includes:
the analysis module is used for carrying out coding and decoding processing on the unified data structure and the replacement data corresponding to the editable video template;
and the texture synthesis module is used for importing the data subjected to coding and decoding into a Graphics Processing Unit (GPU) for multi-texture synthesis to obtain a target video.
To achieve the above object, according to a third aspect of the present application, there is provided a non-transitory computer-readable storage medium storing computer instructions for causing the computer to execute the method for AE-based cross-platform data processing according to any one of the first aspect.
In the embodiments of the present application, in the method and device for AE-based cross-platform data processing, the mobile terminal first acquires a video template designed in AE; performs data parsing on the video template based on the same code library, where mobile terminals of different platform systems correspond to the same code library; assembles the parsed data into a unified data structure; and displays the video template according to the unified data structure. It can be seen that, in the present application, parsing of the video template designed in AE is implemented on a single shared code library, without developing different code libraries for different platforms, so the parsed data forms a unified data structure, and consistency of the display effect is ensured when mobile terminals of different platform systems display the video template according to that unified data structure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this application, serve to provide a further understanding of the application and to enable other features, objects, and advantages of the application to be more apparent. The drawings and their description illustrate the embodiments of the invention and do not limit it. In the drawings:
FIG. 1 is a flowchart of a method for AE-based cross-platform data processing according to an embodiment of the present application;
FIG. 2 is a flow diagram of another method for AE-based cross-platform data processing provided in accordance with an embodiment of the present application;
FIG. 3 is a schematic diagram of a parsing process of a video template according to an embodiment of the present application;
FIG. 4 is a schematic diagram of a process for synthesizing replacement data with a unified data structure corresponding to an editable video template according to an embodiment of the application;
FIG. 5 is a block diagram of an apparatus for AE-based cross-platform data processing according to an embodiment of the present application;
fig. 6 is a block diagram of another AE-based cross-platform data processing apparatus provided in an embodiment of the present application.
Detailed Description
In order to make the technical solutions better understood by those skilled in the art, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only partial embodiments of the present application, but not all embodiments. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments in the present application without making any creative effort shall fall within the protection scope of the present application.
It should be noted that the terms "first," "second," and the like in the description and claims of this application and in the accompanying drawings are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It should be understood that the data so used may be interchanged under appropriate circumstances, such that the embodiments of the application described herein may be practiced in sequences other than those illustrated or described here. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
It should be noted that, in the present application, the embodiments and features of the embodiments may be combined with each other without conflict. The present application will be described in detail below with reference to the embodiments with reference to the attached drawings.
According to an embodiment of the present application, there is provided a method for AE-based cross-platform data processing, as shown in fig. 1, the method includes the following steps:
and S101, the mobile terminal acquires a video template designed in the AE.
After a designer completes a video template in the AE software, two items can be exported: data and images, where data is a data file and images is a folder containing materials such as pictures. Through a third-party plug-in, the pictures, text, audio and video files contained in data and images can be exported for the mobile terminal.
The mobile terminal acquires a video template designed in AE, namely acquires files such as pictures, texts, audios and videos corresponding to the video template.
And S102, carrying out data analysis on the video template based on the same code library.
The same code library means that mobile terminals of different platform systems correspond to one shared code library; multiple sets of code libraries need not be developed for different platform systems. The design language of the shared code library is an underlying language common to the different platforms, such as C or C++.
The data analysis of the video template based on the same code library is to ensure the consistency of the data analysis process, so as to ensure the consistency of the final display effect in the mobile terminals of different platform systems.
And S103, assembling the analyzed data into a unified data structure.
Specifically, the analyzed data is assembled into a unified data structure, that is, the analyzed data is encapsulated in a unified object encapsulation manner, so as to obtain a unified data structure. The unified object packaging manner corresponds to the underlying language involved in step S102.
And S104, displaying the video template according to the unified data structure.
The obtained unified data structure is loaded and displayed on the corresponding interface of the mobile terminal; the displayed video has the same effect as the video template designed in AE. Display of the video from AE on the mobile terminal is thereby realized.
From the above description it can be seen that, in the AE-based cross-platform data processing method of the embodiment of the present application, the mobile terminal first obtains a video template designed in AE; performs data parsing on the video template based on the same code library, where mobile terminals of different platform systems correspond to the same code library; assembles the parsed data into a unified data structure; and displays the video template according to the unified data structure. Parsing of the video template designed in AE is thus implemented on a single shared code library, without developing different code libraries for different platforms, so the parsed data forms a unified data structure, and consistency of the display effect is ensured when mobile terminals of different platform systems display the video template according to that unified data structure.
Fig. 1 shows a scheme in which the video obtained from AE is displayed directly on the mobile terminal; under that scheme, a user cannot make personalized modifications to the video template obtained from AE according to individual requirements. To further refine and supplement the scheme of fig. 1, this embodiment provides another AE-based cross-platform data processing method, as shown in fig. 2. The embodiment corresponding to fig. 2 can also satisfy the user's need for personalized modification while still ensuring consistent display effects across mobile terminals of different platform systems.
And S201, the mobile terminal acquires the video template designed in the AE.
The implementation of this step is the same as that of step S101 above, and is not described here again.
S202, analyzing layer information and layer association information in a json file corresponding to the video template based on the universal underlying language.
In this embodiment, taking C++ as the universal underlying language, the parsing process of a video template is as follows: a json file is decompressed from the data package corresponding to the AE video template; the json file is then parsed uniformly in C++, first parsing the layers and then parsing each layer's associated information, such as masks, transformations (translation, rotation, scaling, etc.) and special effect filters. Fig. 3 is a schematic diagram of this parsing process, where layer 1, layer 2, …, layer N are all the layers contained in the video template, and special effect 1, special effect 2, …, special effect N are the masks, transformations (translation, rotation, scaling, etc.) and special effect filters corresponding to the layers.
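The layer-first pass described above can be sketched in C++. This is a minimal illustration, not the patent's implementation: the `Layer`/`Transform` types and `extractLayerNames` function are hypothetical, and the toy scan for the `"nm"` (name) key stands in for a real JSON library.

```cpp
#include <cassert>
#include <string>
#include <vector>

// Hypothetical parsed representation of one AE layer: its name plus the
// associated information (mask, transform, effect filters) that the
// description says is read from the exported json file.
struct Transform { double tx = 0, ty = 0, rotation = 0, scale = 1; };
struct Layer {
    std::string name;
    bool hasMask = false;
    Transform transform;
    std::vector<std::string> effectFilters;
};

// Toy extraction of layer names from a Lottie-style json string by
// scanning for the "nm" (name) key. A production parser would use a
// real JSON library; this only illustrates the layer-first pass.
std::vector<std::string> extractLayerNames(const std::string& json) {
    std::vector<std::string> names;
    const std::string key = "\"nm\":\"";
    for (size_t pos = json.find(key); pos != std::string::npos;
         pos = json.find(key, pos)) {
        pos += key.size();
        size_t end = json.find('"', pos);
        if (end == std::string::npos) break;
        names.push_back(json.substr(pos, end - pos));
        pos = end;
    }
    return names;
}
```

Because this pass runs in one shared C++ library, every platform obtains the same list of layers from the same json input.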
And S203, assembling the analyzed data into a unified data structure.
Taking C++ as the universal underlying language, the principle of assembling the parsed data into a unified data structure is: the unified data structure is obtained by encapsulating, via C++ class objects, the parsed data, the audio/video track logic, the interface, and the filter chain parameter array.
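A minimal sketch of such an encapsulating class follows. All names (`UnifiedTemplate`, `TrackSegment`, etc.) are illustrative assumptions, not taken from the patent; the point is that layers, track logic and filter chain parameters live behind one C++ object that every platform consumes identically.

```cpp
#include <algorithm>
#include <cassert>
#include <string>
#include <vector>

struct LayerData { std::string name; };
struct TrackSegment { double startSec = 0, durationSec = 0; bool isAudio = false; };

// One class encapsulating the parsed data, audio/video track logic and
// the filter chain parameter array: the "unified data structure".
class UnifiedTemplate {
public:
    void addLayer(LayerData l) { layers_.push_back(std::move(l)); }
    void addTrackSegment(TrackSegment t) { tracks_.push_back(t); }
    void addFilterParams(std::vector<float> params) {
        filterChainParams_.push_back(std::move(params));
    }
    // The uniform interface every platform renders from.
    size_t layerCount() const { return layers_.size(); }
    size_t filterCount() const { return filterChainParams_.size(); }
    double totalDuration() const {
        double end = 0;
        for (const auto& t : tracks_)
            end = std::max(end, t.startSec + t.durationSec);
        return end;
    }
private:
    std::vector<LayerData> layers_;
    std::vector<TrackSegment> tracks_;
    std::vector<std::vector<float>> filterChainParams_;  // one entry per filter
};
```

Because the object layout and accessors are identical on every platform, the display step in S104 cannot diverge between iOS, Android and web.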
And S204, receiving the replacement data.
The replacement data is data selected by a user to replace original data in a preset layer of the video template. The replacement data may be images and/or video and/or audio and/or text.
Before the user selects replacement data, the mobile terminal needs to load and display the unified data structure corresponding to the obtained video template, so that the user can identify the data to be replaced according to the video template and then choose replacement data (selected locally, downloaded online, etc.) to replace it.
In addition, it should be noted that before the replacement data is received, the setting of the preset layer in the video template needs to be modified to editable; replacement data can only be received when the preset layer of the video template is editable. Modifying the preset layer setting turns the video template into an editable video template, and the unified data structure correspondingly becomes the unified data structure of the editable video template.
And S205, synthesizing the replacement data with a unified data structure corresponding to the editable video template to obtain the target video.
Specifically, synthesizing the replacement data with the unified data structure corresponding to the editable video template means obtaining the target video from the parsed unified data structure and the replacement data received in S204 through decoding, encoding, texture synthesis and the like.
A specific process of synthesizing the replacement data and the unified data structure corresponding to the editable video template is shown in fig. 4:
The video data (mp4, flv, etc.) represents the unified data structure and replacement data corresponding to the editable video template. Audio sample data, video pixel data and pictures are obtained through decapsulation, audio/video decompression and audio/video decoding; the video pixel data and pictures are then imported into the GPU for multi-texture synthesis, followed by video encoding and packaging to obtain the target video.
In the above synthesis process, the mobile terminal calls the OpenGL interface to synthesize the replacement data with the unified data structure corresponding to the editable video template. Platform graphics APIs such as Metal and Vulkan may also readily be adopted in the future.
In the synthesis process, the special effects in the editable video template can be superposed according to filter chains of different filter combinations, so that different special effects are superposed efficiently and the effect of the video template designed in AE is closely approximated.
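The filter chain idea can be sketched as function composition: each filter is a pixel-wise function, and a chain applies them in order, so a different special effect combination is just a different vector of filters. The `Filter`, `gain` and `invert` names are illustrative, not from the patent.

```cpp
#include <cassert>
#include <functional>
#include <vector>

// A filter maps one pixel value to another; a chain applies filters in
// order, so superposing effects means appending to the vector.
using Filter = std::function<float(float)>;

float applyChain(const std::vector<Filter>& chain, float pixel) {
    for (const auto& f : chain) pixel = f(pixel);
    return pixel;
}

// Two toy filters: brightness gain and inversion.
Filter gain(float k) { return [k](float p) { return p * k; }; }
Filter invert() { return [](float p) { return 1.0f - p; }; }
```

In a real renderer each `Filter` would correspond to a shader pass with its parameters drawn from the filter chain parameter array, but the composition order works the same way.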
In the synthesis process, FFmpeg is used to batch-process audio and video simultaneously, and hardware acceleration is used to speed up synthesis. FFmpeg is an open-source suite of computer programs that can record and convert digital audio and video, and convert them into streams.
And S206, displaying the target video.
And loading and displaying the obtained target video in a display interface of the mobile terminal.
In addition, the AE-based video synthesis method in the embodiment of fig. 2 is designed and developed in a common underlying language (C++, etc.) and can be applied to mobile terminals of different platform systems (iOS, Android, Linux and web). This avoids implementing a separate set of synthesis processing logic on each platform, and thereby avoids divergent implementations, inconsistent effects, non-uniform and non-standard workflows, and mismatched versions. New features can be developed efficiently, and product requirements realized efficiently. Moreover, mobile terminals of different platforms all maintain the same interfaces, callbacks and so on. For technology that must interact with the underlying platform, a unified interface or protocol is agreed upon, and the mobile terminals of different platforms perform their own platform-specific processing according to that interface, such as unified gesture-processing data, a unified synthesis interface and unified template data.
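The "unified interface, platform-specific processing" idea above can be sketched as a shared abstract interface over the C++ core, with only a thin backend differing per platform. All class and method names here are illustrative assumptions, not from the patent.

```cpp
#include <cassert>
#include <memory>
#include <string>

// Every platform implements the same abstract renderer interface; the
// shared cross-platform logic lives in the base class and only the
// thin backend layer differs.
class TemplateRenderer {
public:
    virtual ~TemplateRenderer() = default;
    // Platform-specific part: which graphics backend does the work.
    virtual std::string backendName() const = 0;
    // Shared entry point with an identical signature on every platform.
    std::string render(const std::string& templateId) const {
        return "rendered " + templateId + " via " + backendName();
    }
};

class OpenGLRenderer : public TemplateRenderer {
public:
    std::string backendName() const override { return "OpenGL"; }
};

class MetalRenderer : public TemplateRenderer {  // hypothetical future backend
public:
    std::string backendName() const override { return "Metal"; }
};
```

Callers hold a `TemplateRenderer` pointer, so swapping OpenGL for Metal or Vulkan later changes no calling code, which is the version-consistency benefit the description claims.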
It should be noted that the steps illustrated in the flowcharts of the figures may be performed in a computer system such as a set of computer-executable instructions and that, although a logical order is illustrated in the flowcharts, in some cases, the steps illustrated or described may be performed in an order different than here.
According to an embodiment of the present application, there is further provided an apparatus for AE-based cross-platform data processing for implementing the method described in fig. 1 to fig. 2, as shown in fig. 5, the apparatus includes:
an obtaining unit 31, configured to obtain a video template designed in AE at a mobile terminal;
the analysis unit 32 is configured to perform data analysis on the video template based on the same code library, where mobile terminals of different platform systems correspond to the same code library;
an assembling unit 33, configured to assemble the parsed data into a unified data structure;
and the display unit 34 is used for displaying the video template according to the unified data structure.
From the above description it can be seen that, in the AE-based cross-platform data processing apparatus of the embodiment of the present application, the mobile terminal first acquires a video template designed in AE; performs data parsing on the video template based on the same code library, where mobile terminals of different platform systems correspond to the same code library; assembles the parsed data into a unified data structure; and displays the video template according to the unified data structure. Parsing is thus implemented on a single shared code library, without developing different code libraries for different platforms, so the parsed data forms a unified data structure, and consistency of the display effect is ensured when mobile terminals of different platform systems display the video template according to that unified data structure.
Further, as shown in fig. 6, the parsing unit 32 is configured to:
and analyzing the layer information and the associated information of the layer in the json file corresponding to the video template based on the universal underlying language.
Further, the associated information of the layer at least includes mask information, transformation information, and a special effect filter.
Further, the universal underlying language is C++; as shown in fig. 6, the assembly unit 33 is configured to:
encapsulate, via C++ class objects, the parsed data, the audio/video track logic, the interface, and the filter chain parameter array.
Further, as shown in fig. 6, the apparatus further includes:
a receiving unit 35, configured to receive replacement data after the parsed data are assembled into a unified data structure, where the replacement data is used to replace original data in a preset layer of a video template;
a synthesizing unit 36, configured to synthesize the replacement data and the unified data structure to obtain a target video;
the presentation unit 34 is configured to:
and displaying the target video.
Further, as shown in fig. 6, the apparatus further includes:
and a modifying unit 37, configured to modify a setting of a preset layer in the video template before receiving the replacement data, so as to obtain an editable video template.
Further, as shown in fig. 6, the synthesizing unit 36 includes:
the invoking module 361 is configured to invoke an OpenGL interface of an open graphics library to perform synthesis of the replacement data and a unified data structure corresponding to the editable video template.
Further, as shown in fig. 6, the synthesizing unit 36 further includes:
and the superposition module 362 is used for superposing the special effects in the editable video template according to the filter chains of different filter combinations.
Further, as shown in fig. 6, the synthesizing unit 36 further includes:
the parsing module 363 is configured to perform encoding and decoding processing on a unified data structure and replacement data corresponding to the editable video template;
and a texture synthesis module 364, configured to import the data subjected to the encoding and decoding processing into a GPU for performing multi-texture synthesis, so as to obtain a target video.
Specifically, the specific process of implementing the functions of each unit and module in the device in the embodiment of the present application may refer to the related description in the method embodiment, and is not described herein again.
There is also provided, in accordance with an embodiment of the present application, a non-transitory computer-readable storage medium storing computer instructions that cause the computer to perform the method of AE-based cross-platform data processing described in any of fig. 1-2.
It will be apparent to those skilled in the art that the modules or steps of the present application described above may be implemented by a general purpose computing device, they may be centralized on a single computing device or distributed across a network of multiple computing devices, and they may alternatively be implemented by program code executable by a computing device, such that they may be stored in a storage device and executed by a computing device, or fabricated separately as individual integrated circuit modules, or fabricated as a single integrated circuit module from multiple modules or steps. Thus, the present application is not limited to any specific combination of hardware and software.
The above description is only a preferred embodiment of the present application and is not intended to limit the present application, and various modifications and changes may be made by those skilled in the art. Any modification, equivalent replacement, or improvement made within the spirit and principle of the present application shall be included in the protection scope of the present application.

Claims (8)

1. A method of AE-based cross-platform data processing, the method comprising:
the mobile terminal acquires a video template designed in the AE;
analyzing layer information and layer association information in json files corresponding to video templates based on a universal underlying language, wherein mobile terminals of different platform systems correspond to the same code library;
assembling the parsed data into a unified data structure;
receiving replacement data, wherein the replacement data is used for replacing original data in a preset image layer of a video template;
synthesizing the replacement data and the unified data structure to obtain a target video;
and displaying the target video.
2. The AE based cross-platform data processing method according to claim 1, wherein the associated information of the image layer at least comprises mask information, transformation information, and a special effect filter.
3. The AE-based cross-platform data processing method according to claim 1 or 2, wherein the universal underlying language is C++, and the assembling the parsed data into a unified data structure comprises:
encapsulating, via C++ class objects, the parsed data, the audio/video track logic, the interface, and the filter chain parameter array.
4. The AE-based cross-platform data processing method of claim 1, wherein prior to receiving replacement data, the method further comprises:
and modifying the setting of a preset layer in the video template to obtain an editable video template.
5. The AE-based cross-platform data processing method according to claim 1 or 4, wherein compositing the replacement data with the unified data structure comprises:
calling an Open Graphics Library (OpenGL) interface to composite the replacement data with the unified data structure corresponding to the editable video template.
6. The AE-based cross-platform data processing method according to claim 1 or 4, wherein compositing the replacement data with the unified data structure further comprises:
superimposing special effects in the editable video template according to filter chains formed from different filter combinations.
7. An apparatus for AE-based cross-platform data processing, the apparatus comprising:
an acquisition unit, configured to acquire, by the mobile terminal, a video template designed in AE;
a parsing unit, configured to parse, based on a common underlying language, layer information and layer-association information in a JSON file corresponding to the video template, wherein mobile terminals of different platform systems share the same code library;
an assembling unit, configured to assemble the parsed data into a unified data structure;
a receiving unit, configured to receive replacement data, wherein the replacement data is used to replace original data in a preset layer of the video template;
a compositing unit, configured to composite the replacement data with the unified data structure to obtain a target video; and
a display unit, configured to display the target video.
8. A non-transitory computer-readable storage medium storing computer instructions that, when executed, cause a computer to perform the AE-based cross-platform data processing method of any one of claims 1-6.
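Claim 1's flow — parse the AE-exported JSON in a shared native core, assemble a unified structure, then splice replacement data into a preset layer — can be sketched in C++ (the common underlying language named in claim 3). All type and function names below are illustrative assumptions, not taken from the patent.

```cpp
#include <cassert>
#include <string>
#include <vector>

// Hypothetical record for one layer assembled from the template's JSON.
struct Layer {
    int id;
    std::string content;   // asset path or text originally set in AE
    bool editable;         // preset layers accept replacement data
};

// Unified data structure shared by the iOS and Android front ends, so both
// platforms render from identical state and the display stays consistent.
struct UnifiedTemplate {
    std::vector<Layer> layers;
};

// Replace the original data of a preset (editable) layer.
// Returns false if the layer does not exist or is not editable.
bool ReplaceLayerData(UnifiedTemplate& tpl, int layerId, const std::string& data) {
    for (auto& layer : tpl.layers) {
        if (layer.id == layerId && layer.editable) {
            layer.content = data;
            return true;
        }
    }
    return false;
}
```

Because the structure is assembled once in shared code, a replacement applied on either platform yields the same target video downstream.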
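Claim 3's encapsulation — one C++ class object wrapping the parsed data, the audio/video track logic, the interface, and a filter-chain parameter array — might look roughly like this; the class and member names are assumptions for illustration only.

```cpp
#include <algorithm>
#include <cassert>
#include <string>
#include <vector>

// Illustrative single point of encapsulation: both mobile platforms link this
// one class instead of reimplementing track and filter logic per platform.
class VideoComposition {
public:
    // Audio/video track logic: append a timed segment to the timeline.
    void AddTrackSegment(const std::string& asset, double startSec, double durSec) {
        segments_.push_back({asset, startSec, durSec});
    }
    // Filter-chain parameter array: one parameter vector per filter pass.
    void PushFilterParams(std::vector<float> params) {
        filterChain_.push_back(std::move(params));
    }
    // Part of the encapsulated interface exposed to the platform shells.
    double DurationSec() const {
        double end = 0.0;
        for (const auto& s : segments_) end = std::max(end, s.startSec + s.durSec);
        return end;
    }
    std::size_t FilterCount() const { return filterChain_.size(); }

private:
    struct Segment { std::string asset; double startSec; double durSec; };
    std::vector<Segment> segments_;
    std::vector<std::vector<float>> filterChain_;
};
```

Keeping the members private and exposing only the interface methods is what lets the two platform shells stay thin: they pass data in and render what comes out.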
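Claim 6's superposition of effects via filter chains reduces to folding a frame through an ordered list of filters. In the patent these passes would run through the OpenGL interface of claim 5; the chaining itself can be shown on the CPU, with all names here being illustrative.

```cpp
#include <cassert>
#include <functional>
#include <utility>
#include <vector>

// Stand-in for a pixel buffer; a real pass would operate on a GL texture.
using Frame = std::vector<float>;

// A filter is a frame-to-frame transform (conceptually, one shader pass).
using Filter = std::function<Frame(Frame)>;

// Superpose effects by applying each filter of the chain in order,
// feeding the output of one pass into the input of the next.
Frame ApplyFilterChain(Frame frame, const std::vector<Filter>& chain) {
    for (const auto& f : chain) frame = f(std::move(frame));
    return frame;
}
```

Different filter combinations are then just different `std::vector<Filter>` values, which matches the claim's idea of superimposing effects "according to filter chains of different filter combinations".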
CN201911022115.XA 2019-10-25 2019-10-25 Cross-platform data processing method and device based on AE Active CN110853121B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911022115.XA CN110853121B (en) 2019-10-25 2019-10-25 Cross-platform data processing method and device based on AE

Publications (2)

Publication Number Publication Date
CN110853121A CN110853121A (en) 2020-02-28
CN110853121B (en) 2023-02-10

Family

ID=69597843

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911022115.XA Active CN110853121B (en) 2019-10-25 2019-10-25 Cross-platform data processing method and device based on AE

Country Status (1)

Country Link
CN (1) CN110853121B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111951356B (en) * 2020-08-11 2022-12-09 深圳市前海手绘科技文化有限公司 Animation rendering method based on JSON data format
CN111932660A (en) * 2020-08-11 2020-11-13 深圳市前海手绘科技文化有限公司 Hand-drawn video production method based on AE (Enterprise edition) file
CN116456165B (en) * 2023-06-20 2023-09-26 北京美摄网络科技有限公司 Scheduling method and device for AE engineering, electronic equipment and readable storage medium

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080072261A1 (en) * 2006-06-16 2008-03-20 Ralston John D System, method and apparatus of video processing and applications
US8136100B1 (en) * 2006-08-29 2012-03-13 Adobe Systems Incorporated Software installation and icon management support
US20080069475A1 (en) * 2006-09-18 2008-03-20 Simon Ekstrand Video Pattern Thumbnails and Method
US9858050B2 (en) * 2013-07-02 2018-01-02 Youi Labs Inc. System and method for streamlining user interface development
CN107450897B (en) * 2016-06-01 2021-03-02 阿里巴巴集团控股有限公司 Cross-platform migration method and device for graphic engine
US11816459B2 (en) * 2016-11-16 2023-11-14 Native Ui, Inc. Graphical user interface programming system


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant