CN113870394A - Animation generation method, device, equipment and storage medium - Google Patents

Animation generation method, device, equipment and storage medium

Info

Publication number
CN113870394A
CN113870394A
Authority
CN
China
Prior art keywords
animation
target
client
map
displayed
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111145304.3A
Other languages
Chinese (zh)
Inventor
郝华栋
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Dajia Internet Information Technology Co Ltd
Original Assignee
Beijing Dajia Internet Information Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Dajia Internet Information Technology Co Ltd filed Critical Beijing Dajia Internet Information Technology Co Ltd
Priority to CN202111145304.3A priority Critical patent/CN113870394A/en
Publication of CN113870394A publication Critical patent/CN113870394A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T13/00 Animation

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The disclosure relates to an animation generation method, apparatus, device, and storage medium in the technical field of internet applications, which can improve the design fidelity of an animation. The animation generation method comprises the following steps: acquiring the display position and the display size of a target map in an animation to be displayed; generating an interactive control with the same display size as the target map; and adding the interactive control at the display position of the target map in the animation to be displayed to obtain the target animation.

Description

Animation generation method, device, equipment and storage medium
Technical Field
The present disclosure relates to the field of internet application technologies, and in particular, to an animation generation method, apparatus, device, and storage medium.
Background
Lottie is an open-source tool developed by Airbnb that can add animation effects to native applications. A designer can build an animation based on Lottie and display it directly on a client.
However, Lottie does not support adding clickable interactive controls to an animation. In the prior art, an interactive control is usually created separately, and its size and position are debugged repeatedly so that the control matches, as closely as possible, the area of the animation that should respond to the interaction.
However, because different mobile phones have different resolutions, the existing method may leave the interactive control misaligned with the area of the animation it was added for, which reduces the fidelity of the displayed animation to its design and in turn degrades the user experience.
Disclosure of Invention
The present disclosure provides an animation generation method, apparatus, device, and storage medium, which can improve the design fidelity of an animation and thereby improve the user experience.
The technical scheme of the embodiment of the disclosure is as follows:
according to a first aspect of the embodiments of the present disclosure, there is provided an animation generation method, which may be applied to a client, including: acquiring the display position and the display size of a target map in the animation to be displayed; generating an interactive control with the same display size as the target map; and adding the interactive control to the display position of the target map of the animation to be displayed to obtain the target animation.
Optionally, the animation generation method further includes: receiving the animation to be displayed, the animation to be displayed comprising a plurality of maps; determining, among the plurality of maps, the map associated with an interactive operation, and taking that map as the target map.
Optionally, the obtaining of the display position and the display size of the target map in the animation to be displayed includes: extracting the map information of the target map from an animation file of the animation to be displayed; the map information includes a display position and a display size.
Optionally, generating an interactive control with the same display size as the target map includes: acquiring, from a database in which a plurality of template controls are stored, the template control with the same display size as the target map, and determining that template control as the interactive control; the display size of each template control is different.
Optionally, the animation generation method further includes: in response to a trigger operation performed by the account logged in to the client on the interactive control in the target animation, displaying the target content associated with the target map.
According to a second aspect of the embodiments of the present disclosure, there is provided an animation generation apparatus, which may be applied to a client, including: the device comprises an acquisition unit, a generation unit and a processing unit; the obtaining unit is used for obtaining the display position and the display size of the target map in the animation to be displayed; the generating unit is used for generating an interactive control with the same display size as the target map; and the processing unit is used for adding the interactive control to the display position of the target map of the animation to be displayed so as to obtain the target animation.
Optionally, the animation generation apparatus further includes: a receiving unit; the receiving unit is used for receiving the animation to be displayed; the animation to be displayed comprises a plurality of maps; each map comprises a map mark for indicating whether the interactive operation is triggered; the obtaining unit is further used for obtaining a map comprising a target map mark from the plurality of maps and determining the map as a target map; the target map label is used for indicating the trigger interactive operation.
Optionally, the obtaining unit is specifically configured to: extracting the map information of the target map from an animation file of the animation to be displayed; the map information includes a display position and a display size.
Optionally, the generating unit is specifically configured to: acquire, from a database in which a plurality of template controls are stored, the template control with the same display size as the target map, and determine that template control as the interactive control; the display size of each template control is different.
Optionally, the animation generation apparatus further includes: a display unit; the display unit is configured to display, in response to a trigger operation performed by the account logged in to the client on the interactive control in the target animation, the target content associated with the target map.
According to a third aspect of the embodiments of the present disclosure, there is provided a client, which may include: a processor and a memory for storing processor-executable instructions; wherein the processor is configured to execute the instructions to implement any of the above-described optional animation generation methods of the first aspect.
According to a fourth aspect of embodiments of the present disclosure, there is provided a computer-readable storage medium having instructions stored thereon, which, when executed by a processor of a client, enable the client to perform any one of the above-mentioned optional animation generation methods of the first aspect.
According to a fifth aspect of embodiments of the present disclosure, there is provided a computer program product including computer instructions which, when executed on a client, cause the client to execute the animation generation method according to any one of the optional implementations of the first aspect.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
The technical scheme provided by the embodiment of the disclosure at least brings the following beneficial effects:
based on any one of the above aspects, in the present disclosure, the client may obtain the display position and display size of the target map in the animation to be displayed and generate an interactive control with the same display size as the target map. The client then adds the interactive control at the display position of the target map in the animation to be displayed to obtain the target animation. Because the interactive control is generated by the client itself, it can be generated to match the client's own resolution. Furthermore, because the control's display size and display position are the same as the target map's, the control aligns exactly with the map, which improves the design fidelity of the animation and thereby the user experience.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and, together with the description, serve to explain the principles of the disclosure and are not to be construed as limiting the disclosure.
FIG. 1 is a flow chart diagram illustrating an animation generation method provided by an embodiment of the present disclosure;
FIG. 2 is a flow chart diagram illustrating a further animation generation method provided by an embodiment of the present disclosure;
FIG. 3 is a flow chart diagram illustrating a further animation generation method provided by an embodiment of the present disclosure;
FIG. 4 is a flow chart diagram illustrating a further animation generation method provided by an embodiment of the present disclosure;
FIG. 5 is a flow chart diagram illustrating a further animation generation method provided by an embodiment of the present disclosure;
FIG. 6 is a schematic structural diagram of an animation generation apparatus provided by an embodiment of the present disclosure;
FIG. 7 is a schematic structural diagram of a client provided by an embodiment of the present disclosure.
Detailed Description
In order to make the technical solutions of the present disclosure better understood by those of ordinary skill in the art, the technical solutions in the embodiments of the present disclosure will be clearly and completely described below with reference to the accompanying drawings.
It should be noted that the terms "first," "second," and the like in the description and claims of the present disclosure and in the above-described drawings are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the disclosure described herein are capable of operation in sequences other than those illustrated or otherwise described herein. The implementations described in the exemplary embodiments below are not intended to represent all implementations consistent with the present disclosure. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the present disclosure, as detailed in the appended claims.
It will be further understood that the terms "comprises" and/or "comprising," when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, and/or components.
The data to which the present disclosure relates may be data authorized by the user or fully authorized by all parties.
As described in the background, when different mobile phones have different resolutions, the existing method may leave the interactive control misaligned with the area of the animation it was added for, thereby reducing the design fidelity of the animation and in turn degrading the user experience.
Based on this, an embodiment of the present disclosure provides an animation generation method in which the client obtains the display position and display size of the target map in the animation to be displayed and generates an interactive control with the same display size as the target map. The client then adds the interactive control at the display position of the target map in the animation to be displayed to obtain the target animation. Because the interactive control is generated by the client itself, it can be generated to match the client's own resolution. Furthermore, because the control's display size and display position are the same as the target map's, the control aligns exactly with the map, which improves the design fidelity of the animation and thereby the user experience.
The client may be a mobile phone, a tablet computer, a desktop computer, a laptop, a handheld computer, a notebook, an ultra-mobile personal computer (UMPC), a netbook, a cellular phone, a personal digital assistant (PDA), an augmented reality (AR) device, a virtual reality (VR) device, or another device that can install and use a content community application (e.g., Kuaishou); the specific form of the client is not particularly limited in this disclosure. The client can interact with a user through one or more of a keyboard, a touch pad, a touch screen, a remote control, voice interaction, a handwriting device, or the like.
As shown in fig. 1, the animation generation method may include: S101-S103.
S101, the client side obtains the display position and the display size of the target map in the animation to be displayed.
Animation creators can use AE (Adobe After Effects) software to design, on an electronic device, the animation to be displayed. The electronic device can then export the designed animation to JSON format using the Bodymovin plug-in and send the animation to be displayed to the client. After receiving the animation to be displayed, the client can extract the display position and display size of the target map in it.
The target map is the map to which an interactive control is to be added.
Optionally, when the display position of the target map is obtained, a coordinate system may be established on the display page of the client. In this case, the display position of the target map may be represented by coordinates.
Illustratively, the coordinate system is established with the center of the client's display page as the origin. The display position of the target map may then be a triangular area formed by coordinates A, B, and C.
Optionally, when the display size of the target map is obtained, a coordinate system may also be established on the display page of the client. In this case, the display size of the target map may be determined by the coordinates of the display position.
In connection with the above example, where the display position of the target map is the triangular area formed by coordinates A, B, and C, the display size of the target map can be computed from the specific values of A, B, and C.
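For illustration, the display region and size implied by corner coordinates such as A, B, and C can be computed as an axis-aligned bounding box. A minimal Python sketch follows; the coordinate values are hypothetical, not from the patent:

```python
def bounding_box(points):
    """Axis-aligned bounding box of a map's corner coordinates.

    Returns (x, y, width, height), where (x, y) is the minimum corner
    in the page coordinate system described above.
    """
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    return min(xs), min(ys), max(xs) - min(xs), max(ys) - min(ys)

# Hypothetical corners A, B, C of a triangular map area.
A, B, C = (10, 20), (110, 20), (60, 100)
x, y, w, h = bounding_box([A, B, C])
```

The resulting width and height give the display size used to build a matching interactive control in S102.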
Optionally, the animation creation user may also add the display position and the display size of the target map to the map information of the animation file of the animation to be displayed. After receiving the animation to be displayed, the client can also extract the map information of the target map from the animation file of the animation to be displayed.
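A sketch of this extraction, assuming the usual Bodymovin JSON layout in which an image layer references an asset carrying the width `w` and height `h`, and the layer transform `ks.p.k` carries the position. The file content below is a hypothetical minimal example, not taken from the patent:

```python
import json

# Hypothetical minimal Bodymovin/Lottie export with one image layer.
LOTTIE_JSON = """
{
  "assets": [{"id": "image_0", "w": 120, "h": 120, "p": "map.png"}],
  "layers": [
    {"ty": 2, "refId": "image_0",
     "ks": {"p": {"a": 0, "k": [40, 60, 0]}}}
  ]
}
"""

def extract_map_info(lottie_json, ref_id):
    """Read one image layer's ("map's") display position and size from a
    Bodymovin/Lottie export. A static (non-animated) transform is assumed."""
    doc = json.loads(lottie_json)
    asset = next(a for a in doc["assets"] if a["id"] == ref_id)
    layer = next(l for l in doc["layers"] if l.get("refId") == ref_id)
    x, y = layer["ks"]["p"]["k"][:2]
    return {"position": (x, y), "size": (asset["w"], asset["h"])}

info = extract_map_info(LOTTIE_JSON, "image_0")
```

In a real export the transform may be animated or nested; the sketch only shows where position and size live in the file.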
S102, the client generates an interactive control with the same display size as the target map.
After the display position and the display size of the target map in the animation to be displayed are obtained, the client can generate an interactive control with the same display size as the target map.
Optionally, a database of the client may store a plurality of template controls with different sizes. When the client generates the interactive control with the same display size as the target map, the template control with the same display size as the target map can be obtained from a database in which a plurality of template controls are stored and determined as the interactive control.
Optionally, when the client generates an interactive control with the same display size as the target map, a default control may also be generated. The default control may be sized by the user. After the display position and the display size of the target map in the animation to be displayed are obtained, the client can automatically adjust the display size of the default control based on the display size of the target map so as to obtain the interactive control with the display size being the same as that of the target map.
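The two options above — fetching a template of the exact size, or resizing a default control — can be sketched together as follows. The template store and its sizes are hypothetical:

```python
# Hypothetical template store keyed by display size (width, height).
TEMPLATE_CONTROLS = {
    (100, 40): "template_small",
    (200, 80): "template_medium",
}

def make_interactive_control(map_size):
    """Build a control whose display size equals the target map's.

    Prefer a stored template of the exact size; otherwise fall back to
    the default control, resized to the map's size.
    """
    template_id = TEMPLATE_CONTROLS.get(map_size)
    if template_id is not None:
        return {"id": template_id, "size": map_size}
    # No template matches: resize the default control instead.
    return {"id": "default_control", "size": map_size}
```

Either branch yields a control whose display size is the same as the target map's, which is what S103 requires.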
Optionally, the interaction control may be a click control, a long-press control, a voice control, or another type of interaction control for performing an interaction operation between the user and the client, which is not limited in this disclosure.
S103, the client adds the interactive control to the display position of the target map of the animation to be displayed to obtain the target animation.
Specifically, after generating an interactive control with the same display size as the target map and acquiring the display position of the target map in the animation to be displayed, the client may add the interactive control to the display position of the target map of the animation to be displayed to obtain the target animation.
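Steps S101-S103 can be sketched end to end with plain dictionaries; all field names are illustrative, not from the patent:

```python
def generate_target_animation(animation):
    """S101-S103 in miniature: read the target map's display position and
    size, build a control of the same size, and attach it at that position."""
    target_map = animation["target_map"]            # S101: position and size
    control = {
        "type": "click",
        "size": target_map["size"],                 # S102: same display size
        "position": target_map["position"],         # S103: same position
    }
    result = dict(animation)
    result["controls"] = list(animation.get("controls", [])) + [control]
    return result

anim = {"target_map": {"position": (40, 60), "size": (120, 120)}}
target = generate_target_animation(anim)
```

Because the control inherits both size and position from the map, it stays aligned regardless of the client's resolution.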
The technical scheme provided by this embodiment at least has the following beneficial effects. As shown by S101-S103, the client can obtain the display position and display size of the target map in the animation to be displayed and generate an interactive control with the same display size as the target map. The client then adds the interactive control at the display position of the target map to obtain the target animation. Because the interactive control is generated by the client itself, it can be generated to match the client's own resolution. Furthermore, because the control's display size and display position are the same as the target map's, the control aligns exactly with the map, which improves the design fidelity of the animation and thereby the user experience.
In one embodiment, as shown in fig. 2 in conjunction with fig. 1, before S101, the animation generation method further includes S201-S202.
S201, the client receives the animation to be displayed.
The animation to be displayed comprises a plurality of maps.
When designing the animation to be displayed, the animation creation user can add an appropriate map to the animation to be displayed. In this case, the animation to be displayed includes a plurality of maps.
Optionally, to ensure that the client adds the appropriate interactive control to the right map, the animation creator may attach a map label to each map that requires an associated interactive operation, so that the client can identify the target map to which an interactive control must be added.
Illustratively, a label such as "0" or "1" may be attached to each map item in the JSON file of the animation to be displayed, where "0" indicates that no interactive control needs to be added to the map and "1" indicates that the map requires an interactive control.
S202, the client determines, among the plurality of maps, the map associated with an interactive operation and takes it as the target map.
Specifically, because the animation creator has attached map labels to the maps that require an associated interactive operation, the client can, after receiving the animation to be displayed, determine which of the plurality of maps is associated with an interactive operation and take that map as the target map.
In connection with the above example, after receiving the animation to be displayed, the client parses it and finds that it contains 3 maps. The first and second maps are labeled "0", and the third map is labeled "1". In this case, the client determines the third map as the target map.
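A sketch of this selection step; the `label` field name is illustrative — the patent only specifies "0"/"1" markers attached to each map item:

```python
def find_target_maps(maps):
    """Return the maps labeled "1", i.e. those requiring an interactive
    control, per S202."""
    return [m for m in maps if m.get("label") == "1"]

# Hypothetical parsed map list: two plain maps and one interactive one.
maps = [
    {"name": "background", "label": "0"},
    {"name": "title", "label": "0"},
    {"name": "open_button", "label": "1"},
]
targets = find_target_maps(maps)
```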
The technical scheme provided by this embodiment at least has the following beneficial effects. As shown by S201-S202, when designing the animation to be displayed, the animation creator can attach a map label to any map that requires an associated interactive operation, so that the client can identify the target map that needs an interactive control. The client can therefore determine, among the plurality of maps, the map associated with the interactive operation and take it as the target map. Because the target map is determined accurately, the client can subsequently build a control with the same display size and display position as that labeled map, which further improves the design fidelity of the animation and thereby the user experience.
In an embodiment, with reference to fig. 2 and as shown in fig. 3, in the above S101, the method for the client to obtain the display position and the display size of the target map in the animation to be displayed specifically includes:
s301, the client extracts the map information of the target map from the animation file of the animation to be displayed.
The mapping information comprises a display position and a display size.
Specifically, when creating the animation to be displayed, the animation creation user may add the display position and the display size of the target map to the map information of the animation file of the animation to be displayed. After receiving the animation to be displayed, the client can extract the map information of the target map from the animation file of the animation to be displayed, so as to obtain the display position and the display size of the target map.
The technical scheme provided by this embodiment at least has the following beneficial effects. As shown by S301, when creating the animation to be displayed, the animation creator can add the display position and display size of the target map to the map information in the animation file. After receiving the animation to be displayed, the client can extract the map information of the target map from the animation file and thereby obtain the target map's display position and display size. Because these values are determined accurately, the client can subsequently build a control with the same display size and display position, which improves the design fidelity of the animation and thereby the user experience.
In an embodiment, with reference to fig. 2 and as shown in fig. 4, the method for the client to generate the interactive control with the same display size as the target map specifically includes: s401.
S401, the client side obtains the template control with the same display size as the target map from a database in which a plurality of template controls are stored, and determines the template control as the interactive control.
Specifically, a database of the client may store a plurality of template controls of different sizes. When the client generates the interactive control with the same display size as the target map, the template control with the same display size as the target map can be obtained from a database in which a plurality of template controls are stored and determined as the interactive control.
Wherein the display size of each template control is different.
Optionally, a database of the client may store a one-to-one correspondence between a plurality of control sizes and a plurality of control identifiers. After obtaining the display size of the target map, the client can look up this correspondence based on that size to obtain the control whose display size is the same as the target map's.
The technical scheme provided by this embodiment at least has the following beneficial effects. As shown by S401, a database of the client may store a plurality of template controls of different sizes. When generating an interactive control with the same display size as the target map, the client can fetch, from that database, the template control with the same display size as the target map and determine it as the interactive control. The client can therefore accurately determine a control with the same display size as the target map, which improves the design fidelity of the animation and thereby the user experience.
In an embodiment, referring to fig. 2, as shown in fig. 5, after S103, the animation generation method further includes: and S501.
S501, in response to a trigger operation performed by the account logged in to the client on the interactive control in the target animation, the client displays the target content associated with the target map.
The target content is content preset by the animation creator and associated with the target map.
Specifically, after the target animation is obtained, the target map in the target animation carries an interactive control for executing an interactive operation, so the account logged in to the client can perform a trigger operation on that control. In this case, the client displays the target content associated with the target map in response to the trigger operation performed on the interactive control in the target animation.
Illustratively, the target animation is a red-envelope animation in an application, which contains an "open the red envelope" target map. When generating the target animation, the client adds, at the display position of that map, an interactive control with the same display size as the map. The account logged in to the client performs a trigger operation on the control, and in response the client displays the red-envelope-opening animation associated with the "open the red envelope" target map.
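The red-envelope flow can be sketched as a simple lookup from the triggered control to its associated content; all names here are illustrative:

```python
def on_trigger(target_animation, control_index, content_for_map):
    """S501: when the logged-in account triggers a control, return the
    target content associated with that control's map."""
    control = target_animation["controls"][control_index]
    return content_for_map[control["map"]]

# Hypothetical target animation with one control bound to a map.
target_animation = {"controls": [{"map": "open_red_envelope"}]}
content_for_map = {"open_red_envelope": "red_envelope_opening_animation"}
shown = on_trigger(target_animation, 0, content_for_map)
```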
The technical scheme provided by this embodiment at least has the following beneficial effects. As shown by S501, after the target animation is obtained, the target map in the target animation carries an interactive control for executing an interactive operation, so the account logged in to the client can perform a trigger operation on that control. In this case, the client responds by displaying the target content associated with the target map, allowing the user to view the target content quickly and improving the user experience.
It is understood that, in practical implementation, the client according to the embodiments of the present disclosure may include one or more hardware structures and/or software modules for implementing the corresponding animation generation method, and these hardware structures and/or software modules may constitute an electronic device. Those of skill in the art will readily appreciate that the present disclosure can be implemented in hardware, or in a combination of hardware and computer software, for the exemplary algorithm steps described in connection with the embodiments disclosed herein. Whether a function is performed by hardware or by computer software driving hardware depends upon the particular application and the design constraints of the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present disclosure.
Based on such understanding, the embodiment of the present disclosure also correspondingly provides an animation generation device, which can be applied to a client. Fig. 6 shows a schematic structural diagram of an animation generation apparatus provided in an embodiment of the present disclosure. As shown in fig. 6, the animation generation apparatus may include: an acquisition unit 601, a generation unit 602, and a processing unit 603.
The obtaining unit 601 is configured to obtain a display position and a display size of a target map in an animation to be displayed. For example, in conjunction with fig. 1, the acquisition unit 601 is configured to execute S101.
A generating unit 602, configured to generate an interaction control with the same display size as the target map. For example, in conjunction with fig. 1, the generating unit 602 is configured to execute S102.
The processing unit 603 is configured to add the interactive control to the display position of the target map of the animation to be displayed, so as to obtain the target animation. For example, in conjunction with fig. 1, the processing unit 603 is configured to execute S103.
Optionally, the animation generation apparatus further includes: a receiving unit 604.
A receiving unit 604, configured to receive an animation to be displayed; the animation to be displayed comprises a plurality of maps. For example, in conjunction with fig. 2, the receiving unit 604 is configured to perform S201.
The processing unit 603 is further configured to determine, among the plurality of maps, the map associated with the interactive operation and take it as the target map. For example, in conjunction with fig. 2, the processing unit 603 is configured to execute S202.
Optionally, the acquisition unit 601 is specifically configured to:
extracting the map information of the target map from an animation file of the animation to be displayed; the map information includes a display position and a display size. For example, in conjunction with fig. 3, the acquisition unit 601 is configured to execute S301.
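Assuming the animation file carries its map information in a JSON-like structure (the disclosure does not fix a file format), the extraction step might look like the following sketch; the `maps` array and its field names are assumptions:

```python
import json

def extract_map_info(animation_file_text, map_name):
    """Parse the animation file (assumed JSON here) and return the
    display position and display size of the named map."""
    doc = json.loads(animation_file_text)
    for m in doc["maps"]:
        if m["name"] == map_name:
            return {"x": m["x"], "y": m["y"],
                    "width": m["width"], "height": m["height"]}
    raise KeyError(f"map {map_name!r} not found in animation file")

file_text = (
    '{"maps": [{"name": "gift_box", '
    '"x": 40, "y": 80, "width": 120, "height": 120}]}'
)
info = extract_map_info(file_text, "gift_box")
```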
Optionally, the generating unit 602 is specifically configured to:
acquiring, from a database in which a plurality of template controls are stored, a template control with the same display size as the target map, and determining that template control as the interactive control; the stored template controls have mutually different display sizes. For example, in conjunction with fig. 4, the generating unit 602 is configured to perform S401.
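Because every stored template has a distinct display size, the database lookup can be keyed directly on size. The sketch below models the database as an in-memory dict; the template fields are hypothetical:

```python
# Database of template controls, keyed by display size. In the scheme
# described above, each stored template has a distinct size.
TEMPLATE_DB = {
    (64, 64): {"kind": "button", "width": 64, "height": 64},
    (120, 120): {"kind": "button", "width": 120, "height": 120},
}

def get_interaction_control(width, height):
    """Return the template control whose display size matches the target
    map, or None when no template of that size is stored."""
    return TEMPLATE_DB.get((width, height))

control = get_interaction_control(120, 120)
```

Reusing pre-built templates this way trades a small amount of storage for skipping control construction at animation-load time.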
Optionally, the animation generation apparatus further includes: a display unit 605.
The display unit 605 is configured to display, in response to a trigger operation performed by an account logged in to the client on the interaction control in the target animation, the target content associated with the target map. For example, in conjunction with fig. 5, the display unit 605 is configured to perform S501.
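A sketch of this trigger handling follows, with the association between a target map and its target content modeled as a plain dict; both the data layout and the function names are hypothetical:

```python
def on_control_triggered(control, content_by_map):
    """Look up and display the target content associated with the
    triggered control's target map (hypothetical data layout)."""
    content = content_by_map.get(control["map_name"])
    if content is not None:
        # Stand-in for handing the content to the client's renderer.
        print(f"showing: {content}")
    return content

content_by_map = {"gift_box": "gift detail page"}
shown = on_control_triggered({"map_name": "gift_box"}, content_by_map)
```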
As described above, the embodiments of the present disclosure may divide the client into functional modules according to the above method examples. An integrated module can be implemented in the form of hardware, or in the form of a software functional module. It should further be noted that the division of modules in the embodiments of the present disclosure is schematic and is merely a division by logical function; other division manners are possible in actual implementation. For example, a functional module may be provided for each function, or two or more functions may be integrated into one processing module.
With regard to the animation generating apparatus in the foregoing embodiment, the specific manner in which each module executes operations and the beneficial effects thereof have been described in detail in the foregoing method embodiment, and are not described herein again.
The embodiment of the disclosure also provides a client, which can be a user device such as a mobile phone or a computer. Fig. 7 shows a schematic structural diagram of a client provided by an embodiment of the present disclosure. The client, which may be an animation generation device, may include at least one processor 61, a communication bus 62, a memory 63, and at least one communication interface 64.
The processor 61 may be a central processing unit (CPU), a microprocessor, an application-specific integrated circuit (ASIC), or one or more integrated circuits for controlling the execution of programs according to the present disclosure. As an example, in conjunction with fig. 6, the functions implemented by the generating unit 602 and the processing unit 603 in the client are the same as those implemented by the processor 61 in fig. 7.
The communication bus 62 may include a path that carries information between the aforementioned components.
The communication interface 64 may be any device, such as a transceiver, for communicating with other devices or communication networks, such as a server, Ethernet, a radio access network (RAN), or a wireless local area network (WLAN). As an example, in conjunction with fig. 6, the functions implemented by the acquisition unit 601 and the receiving unit 604 in the client are the same as those implemented by the communication interface 64 in fig. 7.
The memory 63 may be, but is not limited to, a read-only memory (ROM) or other type of static storage device that can store static information and instructions, a random access memory (RAM) or other type of dynamic storage device that can store information and instructions, an electrically erasable programmable read-only memory (EEPROM), optical disc storage (including compact disc, laser disc, digital versatile disc, Blu-ray disc, etc.), magnetic disk storage media or other magnetic storage devices, or any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer. The memory may be self-contained and connected to the processor by a bus, or may be integrated with the processor.
The memory 63 is used for storing application program codes for executing the disclosed solution, and is controlled by the processor 61. The processor 61 is configured to execute application program code stored in the memory 63 to implement the functions in the disclosed method.
In particular implementations, the processor 61 may include one or more CPUs, such as CPU0 and CPU1 in fig. 7.
In particular implementations, as one embodiment, the animation generation client may include multiple processors, such as processor 61 and processor 65 in FIG. 7. Each of these processors may be a single-core (single-CPU) processor or a multi-core (multi-CPU) processor. A processor herein may refer to one or more devices, circuits, and/or processing cores for processing data (e.g., computer program instructions).
In particular implementations, as one embodiment, the animation generation client may also include an input device 66 and an output device 67. The input device 66 communicates with the processor 61 and may accept user input in a variety of ways. For example, the input device 66 may be a mouse, a keyboard, a touch screen device, or a sensing device. The output device 67 communicates with the processor 61 and may display information in a variety of ways. For example, the output device 67 may be a liquid crystal display (LCD), a light emitting diode (LED) display device, or the like.
Those skilled in the art will appreciate that the architecture shown in fig. 7 does not constitute a limitation on the client, and may include more or fewer components than those shown, or combine certain components, or employ a different arrangement of components.
The present disclosure also provides a computer-readable storage medium having instructions stored thereon which, when executed by a processor of a computer device, enable the computer device to perform the animation generation method provided by the above-described embodiments. For example, the computer-readable storage medium may be the memory 63 comprising instructions executable by the processor 61 of the client to perform the above-described method. Alternatively, the computer-readable storage medium may be a non-transitory computer-readable storage medium, for example a ROM, a RAM, a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, or the like.
The present disclosure also provides a computer program product comprising computer instructions which, when run on a client, cause the client to perform the animation generation method as described in any of the above figures 1-5.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This disclosure is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It will be understood that the present disclosure is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the present disclosure is limited only by the appended claims.

Claims (10)

1. An animation generation method, applied to a client, the method comprising:
acquiring the display position and the display size of a target map in the animation to be displayed;
generating an interactive control with the same display size as the target map;
and adding the interactive control to the display position of the target map of the animation to be displayed to obtain the target animation.
2. The animation generation method according to claim 1, further comprising:
receiving the animation to be displayed; the animation to be displayed comprises a plurality of maps;
determining, among the plurality of maps, a map associated with an interactive operation, and taking the map associated with the interactive operation as the target map.
3. The animation generation method according to claim 1, wherein the obtaining of the display position and the display size of the target map in the animation to be displayed comprises:
extracting the map information of the target map from the animation file of the animation to be displayed; the map information includes the display position and the display size.
4. The animation generation method as claimed in claim 1, wherein the generating of the interactive control with the same display size as the target map comprises:
obtaining, from a database in which a plurality of template controls are stored, a template control with the same display size as the target map, and determining the template control as the interactive control; wherein the display sizes of the template controls are different from one another.
5. The animation generation method according to any one of claims 1 to 4, further comprising:
and, in response to an interactive operation performed, by an account logged in to the client, on the interactive control in the target animation, displaying target content associated with the target map.
6. An animation generation apparatus, applied to a client, the apparatus comprising: an acquisition unit, a generation unit, and a processing unit;
the acquisition unit is configured to acquire a display position and a display size of a target map in an animation to be displayed;
the generation unit is configured to generate an interactive control with the same display size as the target map;
and the processing unit is configured to add the interactive control to the display position of the target map in the animation to be displayed, to obtain a target animation.
7. The animation generation apparatus according to claim 6, further comprising: a receiving unit;
the receiving unit is used for receiving the animation to be displayed; the animation to be displayed comprises a plurality of maps;
the processing unit is further configured to determine, among the plurality of maps, a map associated with an interactive operation, and to take the map associated with the interactive operation as the target map.
8. A client, the client comprising:
a processor;
a memory for storing the processor-executable instructions;
wherein the processor is configured to execute the instructions to implement the animation generation method of any of claims 1-5.
9. A computer-readable storage medium having instructions stored thereon, wherein the instructions in the computer-readable storage medium, when executed by a processor of a client, enable the client to perform the animation generation method of any of claims 1-5.
10. A computer program product comprising instructions that, when run on a client, cause the client to perform the animation generation method of any of claims 1-5.
CN202111145304.3A 2021-09-28 2021-09-28 Animation generation method, device, equipment and storage medium Pending CN113870394A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111145304.3A CN113870394A (en) 2021-09-28 2021-09-28 Animation generation method, device, equipment and storage medium


Publications (1)

Publication Number Publication Date
CN113870394A true CN113870394A (en) 2021-12-31

Family

ID=78992086

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111145304.3A Pending CN113870394A (en) 2021-09-28 2021-09-28 Animation generation method, device, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN113870394A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination