CN112954423B - Animation playing method, device and equipment - Google Patents


Info

Publication number
CN112954423B
CN112954423B (application CN202010277684.5A)
Authority
CN
China
Prior art keywords
animation
sequence frame
playing
linked list
frame group
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010277684.5A
Other languages
Chinese (zh)
Other versions
CN112954423A (en
Inventor
杨建培
李超
蔡鸿华
肖力
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Mingyuan Yunke E Commerce Co ltd
Original Assignee
Shenzhen Mingyuan Yunke E Commerce Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Mingyuan Yunke E Commerce Co ltd filed Critical Shenzhen Mingyuan Yunke E Commerce Co ltd
Priority to CN202010277684.5A priority Critical patent/CN112954423B/en
Publication of CN112954423A publication Critical patent/CN112954423A/en
Application granted granted Critical
Publication of CN112954423B publication Critical patent/CN112954423B/en

Classifications

    • H04N 21/42204: User interfaces specially adapted for controlling a client device through a remote control device
    • G06F 3/0482: Interaction with lists of selectable items, e.g. menus
    • H04N 21/44012: Processing of video elementary streams involving rendering scenes according to scene graphs, e.g. MPEG-4 scene graphs
    • H04N 21/4725: End-user interface for requesting additional data associated with the content, using interactive regions of the image, e.g. hot spots
    • H04N 21/4882: Data services for displaying messages, e.g. warnings, reminders
    • Y02D 10/00: Energy efficient computing, e.g. low power processors, power management or thermal management

Abstract

An animation playing method comprises the following steps: acquiring an interaction instruction input by a user on an animation playing interface, and determining an interaction area corresponding to the interaction instruction; searching, among a plurality of pre-superimposed sequence frame groups, for the sequence frame group whose response hot zone corresponds to the interaction area; and sequentially rendering and playing the images in the sequence frame group according to a preset linked list in the sequence frame group. A plurality of response hot zones can be set on the animation playing interface, so that the user can trigger interaction instructions at different positions as needed, richer animation playing is realized, and the user experience is improved.

Description

Animation playing method, device and equipment
Technical Field
The application belongs to the field of image display, and particularly relates to an animation playing method, device and equipment.
Background
In order to enhance the user's experience of an application program, a scene animation is often adopted when a scene picture is played in the application program. For example, when a presenter introduces the features of a house through the application program, playing a scene animation can greatly improve the vividness of the scene picture compared with playing a static image.
However, at present, a scene animation is usually set at a fixed position and is played only when the user triggers the link corresponding to the animation, so the animation content is relatively single, which is not conducive to further improving the user experience.
Disclosure of Invention
In view of this, the embodiments of the present application provide an animation playing method, apparatus, and device, so as to solve the problem that in the prior art, the content of the scene animation played is relatively single, which is not beneficial to further improving the user experience.
A first aspect of an embodiment of the present application provides an animation playing method, where the animation playing method includes:
acquiring an interaction instruction input by a user on an animation playing interface, and determining an interaction area corresponding to the interaction instruction;
searching, among a plurality of pre-superimposed sequence frame groups, for the sequence frame group whose response hot zone corresponds to the interaction area;
and sequentially rendering and playing the images in the sequence frame group according to a preset linked list in the sequence frame group.
With reference to the first aspect, in a first possible implementation manner of the first aspect, the linked list is a bidirectional circular linked list, and the method further includes:
acquiring the moving direction of the interaction instruction;
and determining the link direction of the bidirectional circular linked list according to the moving direction of the interaction instruction.
With reference to the first possible implementation manner of the first aspect, in a second possible implementation manner of the first aspect, the method further includes:
acquiring the content of the playing interface currently played;
determining nodes in a sequence frame group corresponding to the currently played content according to the currently played content of a playing interface;
and determining a starting frame of the animation playing corresponding to the interaction instruction according to the determined nodes in the sequence frame group.
With reference to the second possible implementation manner of the first aspect, in a third possible implementation manner of the first aspect, the step of determining, according to content currently played by a playing interface, a node in a sequence frame group corresponding to the content currently played includes:
acquiring the position of a playing interface corresponding to the currently played content;
and searching the node corresponding to the position in the searched sequence frame group.
With reference to the first possible implementation manner of the first aspect, in a fourth possible implementation manner of the first aspect, the step of determining, according to a moving direction of the interaction instruction, a link direction of the bidirectional circular linked list includes:
when the moving direction of the interaction instruction is the left direction, the linking direction of the bidirectional circular linked list is the forward direction;
and when the moving direction of the interaction instruction is the rightward direction, the linking direction of the bidirectional circular linked list is the backward direction.
With reference to the first aspect, in a fifth possible implementation manner of the first aspect, the method further includes:
acquiring an image frame library to be played;
acquiring an image frame corresponding to the scene animation from the image frame library;
and connecting the acquired image frames through a linked list according to the playing order of the animation to obtain a sequence frame group.
With reference to the first aspect, in a sixth possible implementation manner of the first aspect, the method further includes:
and generating, when the interaction area does not correspond to any response hot zone, prompt information indicating that the interaction instruction is invalid.
A second aspect of the embodiments of the present application provides an animation playing device, including:
the interactive region determining unit is used for obtaining an interactive instruction input by a user on the animation playing interface and determining an interactive region corresponding to the interactive instruction;
the sequence frame group searching unit is used for searching, among a plurality of pre-superimposed sequence frame groups, for the sequence frame group whose response hot zone corresponds to the interaction area;
and the playing unit is used for sequentially rendering and playing the images in the sequence frame group according to a preset linked list in the sequence frame group.
A third aspect of the embodiments of the present application provides an animation playing device, including a memory, a processor, and a computer program stored in the memory and executable on the processor, where the processor implements the steps of the animation playing method according to any one of the first aspects when executing the computer program.
A fourth aspect of the embodiments of the present application provides a computer-readable storage medium storing a computer program which, when executed by a processor, implements the steps of the animation playback method according to any one of the first aspects.
Compared with the prior art, the embodiments of the present application have the following beneficial effects. The correspondence between the response hot zones of a plurality of sequence frame groups and interaction areas is preset. When the animation playing interface receives an interaction instruction input by a user, whether the response hot zone of some sequence frame group corresponds to the interaction area is looked up according to the interaction area corresponding to the interaction instruction. If such a sequence frame group is found, the images in it are sequentially rendered and played according to the preset linked list in that group. In this way, a plurality of response hot zones can be set on the animation playing interface, which makes it convenient for the user to trigger interaction instructions at different positions as needed, realizes richer animation playing, and helps improve the user experience.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are required for the embodiments or the description of the prior art will be briefly described below, it being obvious that the drawings in the following description are only some embodiments of the present application, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
Fig. 1 is a schematic implementation flow chart of an animation playing method according to an embodiment of the present application;
fig. 2 is a schematic implementation flow chart of a method for generating a sequence frame group according to an embodiment of the present application;
fig. 3 is a schematic implementation flow chart of a method for determining a start frame according to an embodiment of the present application;
fig. 4 is a schematic diagram of an animation playing device according to an embodiment of the present application;
fig. 5 is a schematic diagram of an animation playing device provided in an embodiment of the present application.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system configurations, techniques, etc. in order to provide a thorough understanding of the embodiments of the present application. It will be apparent, however, to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present application with unnecessary detail.
In order to illustrate the technical solutions described in the present application, the following description is made by specific examples.
Fig. 1 is a schematic implementation flow chart of an animation playing method according to an embodiment of the present application, which is described in detail below:
in step S101, an interaction instruction input by a user on an animation playing interface is obtained, and an interaction area corresponding to the interaction instruction is determined.
Specifically, the animation playing interface may be any interface for playing an image in an application program. For example, the animation playback interface may be a house presentation interface in an application used by a property sales user, or the like. By playing scene animation in the house introduction interface, the content presented in the picture can be more real. In addition, the animation playing interface can further comprise a plurality of response hot areas, and the response hot areas can respectively correspond to different scene animations. When the interaction area corresponding to the interaction instruction input by the user corresponds to the response hot zone, the corresponding scene animation can be triggered to be played.
The response hot zone can be a fixed-position area in the animation playing interface, and can be a visible graphical button or an invisible area. For example, the graphical button may be a corner button in the animation playing interface, labeled with a prompt for the scene animation content corresponding to it. For example, when a scene animation introducing a house is played, the prompt content may be determined according to the content of the scene animation: a scene animation introducing a bedroom may be labeled "master bedroom", "second bedroom", etc., and a scene animation introducing the kitchen may be labeled "kitchen".
Of course, the graphical button corresponding to a response hot zone may be displayed according to the current playing state. For example, when an animation is triggered to play by an interaction instruction, the graphical button corresponding to the response hot zone can be hidden during playback, and displayed again when the animation finishes playing, or when the scene animation pauses or stops.
In one implementation, the response hot zones may correspond to content in the scene animation. For example, a response hot zone is arranged at a fixed position in the played scene animation, and the position of the response hot zone in the animation playing interface changes as the animation plays, so that the user can obtain a more immersive browsing experience. For example, when the content of the scene animation is position A, which includes three browsing directions, a corresponding response hot zone may be set for each direction in the scene animation, and when an interaction instruction input by the user corresponds to one of these response hot zones, the corresponding scene animation is triggered to play.
In this embodiment of the present application, the interaction instruction input by the user may be a touch instruction or a sliding instruction input by the user, or may be a click or drag instruction of an input device such as a mouse, or a confirmation instruction generated by combining the mouse and a keyboard.
The animation playing interface can be a full-screen animation playing interface, or an interface occupying part of a window in an application program. When the animation playing interface is full-screen, the interaction instruction input by the user can be received in the full-screen area. When the animation playing interface is a window in the application program, the interaction instruction input by the user can be received in the area where the window is located.
After receiving an interaction instruction input by a user, determining an interaction area corresponding to the interaction instruction according to the type of the instruction. For example, for a click command of a touch or a mouse, a center point of a click, or a predetermined range centered on the center point is determined as an interaction area. And for a sliding instruction or a dragging instruction, determining the interaction area according to the starting point position of the sliding or dragging.
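The mapping from an interaction instruction to a response hot zone can be sketched as follows. This is a minimal illustration under assumptions not stated in the patent: hot zones are modeled as axis-aligned rectangles, and the dictionary field names and zone ids are hypothetical.

```python
# Hypothetical hit-test: map an interaction instruction to a response hot zone.
# Hot zone rectangles are (x0, y0, x1, y1); all names are illustrative.

def interaction_point(instruction):
    """For a tap/click, the interaction area centres on the click point;
    for a slide/drag, it is the start point of the gesture."""
    if instruction["type"] in ("slide", "drag"):
        return instruction["path"][0]
    return instruction["point"]

def find_hot_zone(point, hot_zones):
    """Return the id of the first hot zone containing the point, else None
    (None would trigger the 'interaction instruction invalid' prompt)."""
    x, y = point
    for zone in hot_zones:
        x0, y0, x1, y1 = zone["rect"]
        if x0 <= x <= x1 and y0 <= y <= y1:
            return zone["id"]
    return None

zones = [
    {"id": "kitchen", "rect": (0, 0, 100, 100)},
    {"id": "master_bedroom", "rect": (100, 0, 200, 100)},
]
print(find_hot_zone((50, 50), zones))  # kitchen
drag = {"type": "drag", "path": [(150, 20), (10, 20)]}
print(find_hot_zone(interaction_point(drag), zones))  # master_bedroom
```

A real implementation would also handle overlapping zones and zones whose position moves with the animation, as described above.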
In step S102, according to the plurality of pre-superimposed sequence frame groups, the sequence frame group whose response hot zone corresponds to the interaction area is searched for.
In the present application, a scene animation composed of a plurality of sequence frame groups is set in the animation playing interface in advance. And each frame of image in the sequence frame group forms the sequence frame group in a linked list mode. The linked list can be a unidirectional linked list, a bidirectional linked list, a unidirectional circulation list or a bidirectional circulation linked list.
When the linked list is a unidirectional linked list, the scene animation may be a sequence frame group with a fixed length and a fixed starting image frame. When an interaction instruction triggering the unidirectional linked list is received, the fixed-length sequence frame group is played. Of course, the starting image frame of the unidirectional linked list's playback can also be determined according to different trigger positions; for example, it can be determined according to the position of the currently played scene.
When the linked list is a bidirectional linked list, the play direction of the linked list of the sequence frame group can be determined according to the moving direction (including the sliding direction and the dragging direction) of the interaction instruction. For example, when the moving direction of the interaction instruction is the left direction, the linking direction of the bidirectional circular linked list is the forward direction; and when the moving direction of the interaction instruction is the rightward direction, the linking direction of the bidirectional circular linked list is the backward direction.
When the linked list is a bidirectional cyclic list, the linking direction of the bidirectional cyclic linked list can be determined according to the moving direction of the interactive instruction.
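A minimal sketch of such a bidirectional circular linked list of frames, with the swipe-direction mapping described above, might look like this (class and function names are illustrative assumptions, not taken from the patent):

```python
# Bidirectional circular linked list of frames (illustrative sketch).

class FrameNode:
    def __init__(self, image_id):
        self.image_id = image_id  # identifier of the image frame
        self.prev = None
        self.next = None

class SequenceFrameGroup:
    """Frames linked into a bidirectional circular list."""
    def __init__(self, image_ids):
        nodes = [FrameNode(i) for i in image_ids]
        # Link each node to the next, wrapping the last back to the first.
        for a, b in zip(nodes, nodes[1:] + nodes[:1]):
            a.next, b.prev = b, a
        self.head = nodes[0]

    def traverse(self, start, count, direction):
        """Yield `count` frame ids from `start`, following next pointers
        for 'forward' and prev pointers for 'backward'."""
        node = start
        for _ in range(count):
            yield node.image_id
            node = node.next if direction == "forward" else node.prev

def link_direction(move_direction):
    # Per the method: a leftward movement plays the list in the forward
    # direction, a rightward movement in the backward direction.
    return "forward" if move_direction == "left" else "backward"
```

Because the list is circular, traversal past the last frame wraps around to the first, which suits looping scene animations.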
When the linked list is a bidirectional circular linked list, considering that the same image frame may appear in several scene animations of the same scene, in order to reduce the number of repeatedly stored image frames, the method for constructing a sequence frame group shown in fig. 2 may be adopted, which specifically includes:
in step S201, a library of image frames to be played is acquired.
The image frame library to be played can be an image frame included in scene animation played by the current animation playing interface. When two or more scene animations include repeated image frames, only one corresponding image frame needs to be stored in the image frame library.
In step S202, an image frame corresponding to the scene animation is acquired from the image frame library.
According to the animation playing content, the image frames corresponding to the scene animation can be obtained from an image frame library. Wherein the same image frame may be extracted by multiple scene animations.
In step S203, the acquired image frames are connected through a linked list according to the animation playing order, so as to obtain a sequence frame group.
According to the playing requirement of the animation, the playing sequence of each image frame in the scene animation is determined, and according to the playing sequence, a plurality of image frames can be connected through a linked list, so as to obtain a sequence frame group comprising the playing sequence.
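The sharing of frames between sequence frame groups can be sketched as follows: each group stores references into one frame library, so a frame used by several animations is kept only once. The library contents and names are hypothetical placeholders.

```python
# Illustrative sketch: several animations share frames from one library.

frame_library = {  # frame id -> decoded image (placeholders here)
    "hall_01": "<pixels A>",
    "hall_02": "<pixels B>",
    "door_01": "<pixels C>",
}

def build_sequence(frame_ids, library):
    """Resolve the play-order frame ids against the shared library."""
    return [library[fid] for fid in frame_ids]

walk_in = build_sequence(["door_01", "hall_01", "hall_02"], frame_library)
walk_out = build_sequence(["hall_02", "hall_01", "door_01"], frame_library)
# Both sequences reference the same stored frame objects; nothing is duplicated.
assert walk_in[1] is walk_out[1]
```

In a full implementation the resolved frames would then be threaded into the linked-list structure rather than a plain list.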
In one implementation, the linked list may be a circular linked list, and in one animation playing process, the starting image and the ending image may be the same image frame, and the corresponding identifier of the response hot zone is set in the image frame, so that a user can execute triggering operations of different scene animations according to the identifier of the response hot zone in the image frame.
In step S103, the images in the sequence frame group are sequentially rendered and played according to the preset linked list in the sequence frame group.
And sequentially reading and rendering and playing images in the sequence frame group according to a preset linked list in the sequence frame group, wherein the linked list comprises a unidirectional linked list, a bidirectional linked list, a unidirectional circular linked list or a bidirectional circular linked list.
When the linked list is a single linked list or a double linked list, the image frames in the sequence frame group can be sequentially played according to a preset fixed initial image frame (such as a first frame image), and play is ended when the last frame image is played or when an abort instruction is received.
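A minimal playback loop for such a linearly linked sequence frame group might look like this; the node structure, the injected `render` callback, and the `should_stop` abort check are illustrative assumptions.

```python
# Sketch of sequential playback over a singly linked sequence frame group.

class Node:
    def __init__(self, frame, nxt=None):
        self.frame, self.next = frame, nxt

def play(head, render, should_stop=lambda: False):
    """Render each frame in list order; stop at the last frame
    (node.next is None) or when an abort is signalled."""
    node = head
    while node is not None and not should_stop():
        render(node.frame)
        node = node.next

frames = Node("f1", Node("f2", Node("f3")))
played = []
play(frames, played.append)
print(played)  # ['f1', 'f2', 'f3']
```

A real player would pace the loop to the animation's frame rate instead of rendering as fast as possible.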
When the linked list is a unidirectional circular linked list or a bidirectional circular linked list, the initial playing position of the scene animation can be determined according to the playing content, which can effectively adapt to on-site introduction requirements when introducing a spatial scene. The implementation flow of determining the initial playing position of the scene animation according to the playing content may be as shown in fig. 3, and includes:
in step S301, the content currently played by the playback interface is acquired.
For a scene-introduction animation, the content currently played by the playing interface is acquired, so that the position in the scene corresponding to the currently played content, that is, the position the user is currently immersed in, can be obtained. Alternatively, according to the playing content in the current playing interface, the target object the user is currently paying attention to can be obtained, and the corresponding image frames of other sequence frame groups are determined according to that target object.
In step S302, according to the content currently played by the playing interface, a node in the sequence frame group corresponding to the content currently played is determined.
The corresponding relation between the scene position or the target object in the image and the nodes in each sequence frame group can be preset, the scene position or the target object in the content currently played by the playing interface is detected, the corresponding image frame in the new scene animation to be triggered and played can be found according to the preset corresponding relation, and the corresponding image frame is used as the initial frame of the scene animation to be played.
In step S303, a start frame of the animation playing corresponding to the interactive instruction is determined according to the determined nodes in the sequence frame group.
According to the start frame determined by the node in the determined sequence frame group, and in combination with the linked list in that group, the image frames in the sequence frame group are sequentially rendered and played until an abort instruction from the user is received or an instruction for another scene animation is triggered. Determining the start frame of the scene animation corresponding to the interaction instruction from the currently played content can more effectively improve the user's immersion in the scene.
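The start-frame selection in steps S301 to S303 can be sketched as a lookup in a preset mapping from scene positions to nodes in the target sequence frame group. The mapping contents and names below are illustrative assumptions.

```python
# Sketch of start-frame selection from the currently played content.

position_to_start = {  # scene position -> index of the start node in the group
    "hallway_end": 0,
    "kitchen_door": 4,
}

def start_index(current_position, mapping, default=0):
    """Start the new animation at the node matching what is on screen,
    falling back to the first frame when no mapping exists."""
    return mapping.get(current_position, default)

print(start_index("kitchen_door", position_to_start))  # 4
print(start_index("unknown_spot", position_to_start))  # 0
```

The returned index would then be resolved to the corresponding node in the circular linked list before playback begins.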
In addition, when the interaction area does not correspond to any response hot zone, prompt information indicating that the interaction instruction is invalid is generated, such as a prompt that the current instruction is invalid or that the position is not within an animation playing response area.
It should be understood that the sequence numbers of the steps in the foregoing embodiments do not imply an order of execution; the execution order of each process should be determined by its function and internal logic, and should not constitute any limitation on the implementation process of the embodiments of the present application.
Fig. 4 is a schematic structural diagram of an animation playing device according to an embodiment of the present application, where the animation playing device includes:
an interaction region determining unit 401, configured to obtain an interaction instruction input by a user on an animation playing interface, and determine an interaction region corresponding to the interaction instruction;
a sequence frame group searching unit 402, configured to search, among a plurality of pre-superimposed sequence frame groups, for the sequence frame group whose response hot zone corresponds to the interaction area;
and a playing unit 403, configured to sequentially render and play images in the sequence frame group according to a preset linked list in the sequence frame group.
The animation playing device shown in fig. 4 corresponds to the animation playing method shown in fig. 1, and will not be described here again.
Fig. 5 is a schematic diagram of an animation playing device according to an embodiment of the present application. As shown in fig. 5, the animation playing device 5 of this embodiment includes: a processor 50, a memory 51 and a computer program 52, such as an animation playing program, stored in the memory 51 and executable on the processor 50. The processor 50, when executing the computer program 52, implements the steps of the animation playing method embodiments described above, such as steps 101 to 103 shown in fig. 1. Alternatively, the processor 50, when executing the computer program 52, performs the functions of the modules/units of the apparatus embodiments described above, such as the functions of the modules 401 to 403 shown in fig. 4.
By way of example, the computer program 52 may be partitioned into one or more modules/units that are stored in the memory 51 and executed by the processor 50 to complete the present application. The one or more modules/units may be a series of computer program instruction segments capable of performing specific functions for describing the execution of the computer program 52 in the animation playback device 5. For example, the computer program 52 may be partitioned into units with the following specific functions:
the interactive region determining unit is used for obtaining an interactive instruction input by a user on the animation playing interface and determining an interactive region corresponding to the interactive instruction;
the sequence frame group searching unit is used for searching, among a plurality of pre-superimposed sequence frame groups, for the sequence frame group whose response hot zone corresponds to the interaction area;
and the playing unit is used for sequentially rendering and playing the images in the sequence frame group according to a preset linked list in the sequence frame group.
The animation playing device 5 may be a computing device such as a desktop computer, a notebook computer, a palm computer, a cloud server, etc. The animation playback device may include, but is not limited to, a processor 50, a memory 51. It will be appreciated by those skilled in the art that fig. 5 is merely an example of the animation playing device 5 and is not meant to be limiting of the animation playing device 5, and may include more or less components than illustrated, or may combine some components, or different components, e.g., the animation playing device may further include an input/output device, a network access device, a bus, etc.
The processor 50 may be a central processing unit (Central Processing Unit, CPU), another general-purpose processor, a digital signal processor (Digital Signal Processor, DSP), an application-specific integrated circuit (Application Specific Integrated Circuit, ASIC), a field-programmable gate array (Field-Programmable Gate Array, FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor.
The memory 51 may be an internal storage unit of the animation playing device 5, for example, a hard disk or a memory of the animation playing device 5. The memory 51 may also be an external storage device of the animation playing device 5, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) card, or a flash card equipped on the animation playing device 5. Further, the memory 51 may include both an internal storage unit and an external storage device of the animation playing device 5. The memory 51 is used for storing the computer program and other programs and data required by the animation playback apparatus, and may also be used to temporarily store data that has been output or is to be output.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-described division of the functional units and modules is illustrated, and in practical application, the above-described functional distribution may be performed by different functional units and modules according to needs, i.e. the internal structure of the apparatus is divided into different functional units or modules to perform all or part of the above-described functions. The functional units and modules in the embodiment may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit, where the integrated units may be implemented in a form of hardware or a form of a software functional unit. In addition, specific names of the functional units and modules are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present application. The specific working process of the units and modules in the above system may refer to the corresponding process in the foregoing method embodiment, which is not described herein again.
In the foregoing embodiments, each embodiment is described with its own emphasis; for parts that are not detailed or illustrated in a particular embodiment, reference may be made to the related descriptions of other embodiments.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus/terminal device and method may be implemented in other manners. For example, the apparatus/terminal device embodiments described above are merely illustrative, e.g., the division of the modules or units is merely a logical function division, and there may be additional divisions in actual implementation, e.g., multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed may be an indirect coupling or communication connection via interfaces, devices or units, which may be in electrical, mechanical or other forms.
The units described as separate components may or may not be physically separate, and the components shown as units may or may not be physical units; they may be located in one place or distributed over a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in each embodiment of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
The integrated modules/units, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a computer-readable storage medium. Based on this understanding, the present application may implement all or part of the flow of the methods of the above embodiments by instructing related hardware through a computer program; the computer program may be stored in a computer-readable storage medium and, when executed by a processor, implements the steps of each method embodiment described above. The computer program comprises computer program code, which may be in source code form, object code form, an executable file, some intermediate form, etc. The computer-readable medium may include: any entity or device capable of carrying the computer program code, a recording medium, a USB flash drive, a removable hard disk, a magnetic disk, an optical disc, a computer memory, a read-only memory (ROM, Read-Only Memory), a random access memory (RAM, Random Access Memory), an electrical carrier signal, a telecommunications signal, a software distribution medium, and so forth. It should be noted that the content contained in the computer-readable medium may be appropriately increased or decreased as required by legislation and patent practice in a given jurisdiction; for example, in some jurisdictions, the computer-readable medium does not include electrical carrier signals and telecommunications signals.
The above embodiments are only for illustrating the technical solution of the present application, and are not limiting; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical scheme described in the foregoing embodiments can be modified or some technical features thereof can be replaced by equivalents; such modifications and substitutions do not depart from the spirit and scope of the technical solutions of the embodiments of the present application, and are intended to be included in the scope of the present application.

Claims (9)

1. An animation playing method, characterized in that the animation playing method comprises the following steps:
acquiring an interaction instruction input by a user on an animation playing interface, and determining an interaction area corresponding to the interaction instruction;
searching, among a plurality of pre-overlaid sequence frame groups, for the sequence frame group whose response hot zone corresponds to the interaction region, wherein each sequence frame group is provided with a corresponding response hot zone identifier;
sequentially rendering and playing images in the sequence frame group according to a preset linked list in the sequence frame group;
wherein the linked list is a bidirectional circular linked list, the method further comprises:
acquiring the moving direction of the interaction instruction;
and determining the link direction of the bidirectional circular linked list according to the moving direction of the interaction instruction.
2. The animation playback method as claimed in claim 1, further comprising:
acquiring the content of the playing interface currently played;
determining nodes in a sequence frame group corresponding to the currently played content according to the currently played content of a playing interface;
and determining a starting frame of the animation playing corresponding to the interaction instruction according to the determined nodes in the sequence frame group.
3. The animation playback method as claimed in claim 2, wherein the step of determining a node in the sequence frame group corresponding to the currently played content according to the content currently played by the playback interface comprises:
acquiring the position of a playing interface corresponding to the currently played content;
and searching the node corresponding to the position in the searched sequence frame group.
4. The animation playback method as claimed in claim 1, wherein the step of determining the link direction of the bidirectional circular linked list according to the moving direction of the interactive command comprises:
when the moving direction of the interaction instruction is the left direction, the linking direction of the bidirectional circular linked list is the forward direction;
and when the moving direction of the interaction instruction is the rightward direction, the linking direction of the bidirectional circular linked list is the backward direction.
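The mapping in claim 4 can be written as a small sketch. The function name and the string encodings of directions are assumptions for illustration, not the patent's actual representation: a leftward swipe selects forward traversal along `next` pointers, a rightward swipe backward traversal along `prev` pointers.

```python
# Hypothetical sketch of claim 4's direction mapping; the string constants
# are illustrative assumptions, not the patent's encoding.

def link_direction(move_direction: str) -> str:
    """Map the move direction of the interaction instruction to the
    traversal direction of the bidirectional circular linked list."""
    if move_direction == "left":
        return "forward"   # advance along next pointers
    if move_direction == "right":
        return "backward"  # advance along prev pointers
    raise ValueError(f"unsupported move direction: {move_direction!r}")
```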
5. The animation playback method as claimed in claim 1, further comprising:
acquiring an image frame library to be played;
acquiring an image frame corresponding to the animation from the image frame library;
and connecting the acquired image frames through a linked list according to the playing order of the animation to obtain a sequence frame group.
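The construction steps of claim 5 can be sketched as follows; the identifiers (`frame_library` as a dict of frame id to image data, `build_sequence_frame_group`) are illustrative assumptions. Frames are fetched from the library and chained, in playback order, into a bidirectional circular linked list by inserting each new node before the head.

```python
# Hedged sketch of claim 5: fetch frames from an image frame library and
# link them in playback order into a bidirectional circular linked list.
# All names are assumptions chosen for illustration.

class Node:
    def __init__(self, image):
        self.image = image
        self.prev = self.next = self  # a lone node points to itself

def build_sequence_frame_group(frame_library, frame_ids):
    """frame_library: dict mapping frame id -> image data;
    frame_ids: frame ids in the animation's playback order.
    Returns the head node of the circular list, or None if empty."""
    head = None
    for fid in frame_ids:
        node = Node(frame_library[fid])
        if head is None:
            head = node
        else:
            tail = head.prev              # current last node in the ring
            tail.next, node.prev = node, tail
            node.next, head.prev = head, node
    return head
```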
6. The animation playback method as claimed in claim 1, further comprising:
and generating prompt information indicating that the interaction instruction is invalid when the interaction region does not correspond to any response hot zone.
7. An animation playback apparatus, characterized in that the animation playback apparatus comprises:
the interactive region determining unit is used for obtaining an interactive instruction input by a user on the animation playing interface and determining an interactive region corresponding to the interactive instruction;
the sequence frame group searching unit is used for searching, among a plurality of pre-overlaid sequence frame groups, for the sequence frame group whose response hot zone corresponds to the interaction region, each sequence frame group being provided with a corresponding response hot zone identifier;
the playing unit is used for sequentially rendering and playing images in the sequence frame group according to a preset linked list in the sequence frame group;
the linked list is a bidirectional circular linked list, and the device is further configured to: acquire the moving direction of the interaction instruction; and determine the link direction of the bidirectional circular linked list according to the moving direction of the interaction instruction.
8. An animation playback apparatus comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor, when executing the computer program, implements the steps of the animation playback method as claimed in any one of claims 1 to 6.
9. A computer readable storage medium storing a computer program, wherein the computer program when executed by a processor implements the steps of the animation playback method of any one of claims 1 to 6.
CN202010277684.5A 2020-04-08 2020-04-08 Animation playing method, device and equipment Active CN112954423B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010277684.5A CN112954423B (en) 2020-04-08 2020-04-08 Animation playing method, device and equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010277684.5A CN112954423B (en) 2020-04-08 2020-04-08 Animation playing method, device and equipment

Publications (2)

Publication Number Publication Date
CN112954423A CN112954423A (en) 2021-06-11
CN112954423B true CN112954423B (en) 2023-05-26

Family

ID=76234468

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010277684.5A Active CN112954423B (en) 2020-04-08 2020-04-08 Animation playing method, device and equipment

Country Status (1)

Country Link
CN (1) CN112954423B (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109165052A (en) * 2018-08-08 2019-01-08 腾讯科技(深圳)有限公司 Interaction processing method, device and the terminal of application scenarios, system, storage medium
WO2019052395A1 (en) * 2017-09-12 2019-03-21 腾讯科技(深圳)有限公司 Multimedia data presentation method, storage medium and computer device

Family Cites Families (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6968973B2 (en) * 2003-05-31 2005-11-29 Microsoft Corporation System and process for viewing and navigating through an interactive video tour
CN102999244A (en) * 2012-03-23 2013-03-27 广州市凡拓数码科技有限公司 Realization method, realization system and manufacturing method for exhibition of electronic house type
CN104423814A (en) * 2013-08-20 2015-03-18 腾讯科技(深圳)有限公司 Method for controlling network media information interaction and browser
CN104182125B (en) * 2014-08-25 2016-03-02 腾讯科技(深圳)有限公司 A kind of triggering operation method of suspended window and device
CN105303600A (en) * 2015-07-02 2016-02-03 北京美房云谷网络科技有限公司 Method of viewing 3D digital building by using virtual reality goggles
CN104966225A (en) * 2015-07-08 2015-10-07 深圳爱布丁梦想科技有限公司 Housing rental method and system based on mobile terminal and 3D panoramic image browsing
CN107330945A (en) * 2017-07-05 2017-11-07 合肥工业大学 A kind of examing heartbeat fastly method based on video
CN109669673A (en) * 2017-10-12 2019-04-23 世熠网络科技(上海)有限公司 Game engine device based on HTML5
WO2019082050A1 (en) * 2017-10-23 2019-05-02 ГИОРГАДЗЕ, Анико Тенгизовна User interaction in communication systems, using an augmented reality story message
CN108830938A (en) * 2018-05-30 2018-11-16 链家网(北京)科技有限公司 A kind of virtual three-dimensional space picture balance method and device
CN108830692B (en) * 2018-06-20 2020-04-14 厦门市超游网络科技股份有限公司 Remote panoramic house-viewing method and device, user terminal, server and storage medium
CN110933515A (en) * 2018-12-13 2020-03-27 湖南汉坤建筑安保器材有限公司 Panoramic video playing control method based on VR technology
CN109976527B (en) * 2019-03-28 2022-08-12 重庆工程职业技术学院 Interactive VR display system
CN110134478B (en) * 2019-04-28 2022-04-05 深圳市思为软件技术有限公司 Scene conversion method and device of panoramic scene and terminal equipment

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2019052395A1 (en) * 2017-09-12 2019-03-21 腾讯科技(深圳)有限公司 Multimedia data presentation method, storage medium and computer device
CN109165052A (en) * 2018-08-08 2019-01-08 腾讯科技(深圳)有限公司 Interaction processing method, device and the terminal of application scenarios, system, storage medium

Also Published As

Publication number Publication date
CN112954423A (en) 2021-06-11

Similar Documents

Publication Publication Date Title
CN108010112B (en) Animation processing method, device and storage medium
CN109640188B (en) Video preview method and device, electronic equipment and computer readable storage medium
US20190158934A1 (en) Video frame capturing method and device
EP2924593A1 (en) Method and apparatus for constructing documents
WO2020220773A1 (en) Method and apparatus for displaying picture preview information, electronic device and computer-readable storage medium
CN110225246B (en) Event script generation method and device, electronic equipment and computer readable storage medium
EP4171006A1 (en) Previewing method and apparatus for effect application, and device and storage medium
CN105549847B (en) A kind of image display method and user terminal at playback of songs interface
US20230306694A1 (en) Ranking list information display method and apparatus, and electronic device and storage medium
CN112073301B (en) Method, device and computer readable medium for deleting chat group members
CN115190366B (en) Information display method, device, electronic equipment and computer readable medium
CN111597009B (en) Application program display method and device and terminal equipment
CN111652675A (en) Display method and device and electronic equipment
CN113050861B (en) Display interface control method, electronic device and storage medium
CN110290058A (en) A kind of method and apparatus that conversation message being presented in the application
CN111338549B (en) Information sharing method and device, storage medium and electronic equipment
CN112954423B (en) Animation playing method, device and equipment
US20230388266A1 (en) Method, apparatus, device and storage medium for reposting
CN112492399A (en) Information display method and device and electronic equipment
JP2023070068A (en) Video stitching method, apparatus, electronic device, and storage medium
CN113559503B (en) Video generation method, device and computer readable medium
CN115460448A (en) Media resource editing method and device, electronic equipment and storage medium
CN112652039A (en) Animation segmentation data acquisition method, segmentation method, device, equipment and medium
CN108415656B (en) Display control method, device, medium and electronic equipment in virtual scene
CN110990095A (en) Hosted application presentation method, device and computer readable medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant