CN112954423A - Animation playing method, device and equipment

Animation playing method, device and equipment

Info

Publication number
CN112954423A
CN112954423A
Authority
CN
China
Prior art keywords
animation
playing
sequence frame
frame group
linked list
Legal status
Granted
Application number
CN202010277684.5A
Other languages
Chinese (zh)
Other versions
CN112954423B
Inventor
杨建培
李超
蔡鸿华
肖力
Current Assignee
Shenzhen Mingyuan Yunke E Commerce Co ltd
Original Assignee
Shenzhen Mingyuan Yunke E Commerce Co ltd
Application filed by Shenzhen Mingyuan Yunke E Commerce Co ltd filed Critical Shenzhen Mingyuan Yunke E Commerce Co ltd
Priority to CN202010277684.5A
Publication of CN112954423A
Application granted
Publication of CN112954423B
Status: Active

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41Structure of client; Structure of client peripherals
    • H04N21/422Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
    • H04N21/42204User interfaces specially adapted for controlling a client device through a remote control device; Remote control devices therefor
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0482Interaction with lists of selectable items, e.g. menus
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/44Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream, rendering scenes according to MPEG-4 scene graphs
    • H04N21/44012Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream, rendering scenes according to MPEG-4 scene graphs involving rendering scenes according to scene graphs, e.g. MPEG-4 scene graphs
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47End-user applications
    • H04N21/472End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content
    • H04N21/4722End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content for requesting additional data associated with the content
    • H04N21/4725End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content for requesting additional data associated with the content using interactive regions of the image, e.g. hot spots
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47End-user applications
    • H04N21/488Data services, e.g. news ticker
    • H04N21/4882Data services, e.g. news ticker for displaying messages, e.g. warnings, reminders
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02DCLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D10/00Energy efficient computing, e.g. low power processors, power management or thermal management

Abstract

The animation playing method comprises the following steps: acquiring an interactive instruction input by a user on an animation playing interface, and determining the interaction area corresponding to the interactive instruction; searching, among a plurality of pre-superimposed sequence frame groups, for the sequence frame group whose response hot area corresponds to the interaction area; and rendering and playing the images in the found sequence frame group in order, according to a preset linked list in that group. Because the animation playing interface can define multiple response hot areas, the user can trigger interactive instructions at different positions as needed, which enables richer animation playback and improves the user experience.

Description

Animation playing method, device and equipment
Technical Field
The present application relates to the field of image display, and in particular, to a method, an apparatus, and a device for playing an animation.
Background
In order to improve the user's experience with an application program, a scene animation is often played when a scene picture is shown in the application. For example, when a sales agent explains the features of a house to prospective buyers through the scene pictures played by an application, playing a scene animation makes the presentation far more vivid than displaying static images.
However, a scene animation is usually set at a fixed position: when the user activates the link corresponding to the animation, that one animation is played. The animation content is therefore monotonous, which is not conducive to further improving the user experience.
Disclosure of Invention
In view of this, embodiments of the present application provide an animation playing method, apparatus, and device to solve the problem that the scene animations played in the prior art are monotonous, which is not conducive to further improving the user experience.
A first aspect of an embodiment of the present application provides an animation playing method, where the animation playing method includes:
acquiring an interactive instruction input by a user on an animation playing interface, and determining an interactive area corresponding to the interactive instruction;
searching, among a plurality of pre-superimposed sequence frame groups, for the sequence frame group whose response hot area corresponds to the interaction area;
and rendering and playing the images in the sequence frame group in sequence according to a preset linked list in the sequence frame group.
With reference to the first aspect, in a first possible implementation manner of the first aspect, the linked list is a bidirectional circular linked list, and the method further includes:
acquiring the moving direction of the interactive instruction;
and determining the link direction of the bidirectional circular linked list according to the moving direction of the interactive instruction.
With reference to the first possible implementation manner of the first aspect, in a second possible implementation manner of the first aspect, the method further includes:
acquiring the content currently played on the playing interface;
determining, according to the content currently played on the playing interface, the node in the sequence frame group corresponding to the currently played content;
and determining the starting frame of the animation playing corresponding to the interactive instruction according to the determined node in the sequence frame group.
With reference to the second possible implementation manner of the first aspect, in a third possible implementation manner of the first aspect, the determining, according to the content currently played by the play interface, a node in a sequence frame group corresponding to the currently played content includes:
acquiring the position corresponding to the content currently played on the playing interface;
and searching, in the found sequence frame group, for the node corresponding to the position.
With reference to the first possible implementation manner of the first aspect, in a fourth possible implementation manner of the first aspect, the determining, according to the moving direction of the interactive instruction, a link direction of the bidirectional circular linked list includes:
when the moving direction of the interactive instruction is the left direction, the link direction of the bidirectional circular linked list is the forward direction;
and when the moving direction of the interactive instruction is the right direction, the link direction of the bidirectional circular linked list is the backward direction.
With reference to the first aspect, in a fifth possible implementation manner of the first aspect, the method further includes:
acquiring an image frame library to be played;
acquiring an image frame corresponding to the scene animation from the image frame library;
and connecting the acquired image frames through a linked list according to the animation playing sequence to obtain a sequence frame group.
With reference to the first aspect, in a sixth possible implementation manner of the first aspect, the method further includes:
and when the interactive area does not correspond to the response hot area, generating prompt information that the interactive instruction is invalid.
A second aspect of an embodiment of the present application provides an animation playback device, including:
the interactive area determining unit is used for acquiring an interactive instruction input by a user on an animation playing interface and determining an interactive area corresponding to the interactive instruction;
the sequence frame group searching unit is used for searching, among a plurality of pre-superimposed sequence frame groups, for the sequence frame group whose response hot area corresponds to the interaction area;
and the playing unit is used for rendering and playing the images in the sequence frame group in sequence according to a preset linked list in the sequence frame group.
A third aspect of the embodiments of the present application provides an animation playback device comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, where the processor implements the steps of the animation playing method according to any one of the first aspect when executing the computer program.
A fourth aspect of the embodiments of the present application provides a computer-readable storage medium, which stores a computer program, and the computer program, when executed by a processor, implements the steps of the animation playing method according to any one of the first aspects.
Compared with the prior art, the embodiments of the present application have the following advantages: the correspondence between the response hot areas of the sequence frame groups and interaction areas is preset. When the animation playing interface receives an interactive instruction input by a user, it checks whether any response hot area of a sequence frame group corresponds to the interaction area of the instruction. If one does, the images in the matching sequence frame group are rendered and played in order according to the preset linked list in that group. The animation playing interface can thus define multiple response hot areas, so that the user can trigger interactive instructions at different positions as needed, which enables richer animation playback and improves the user experience.
Drawings
In order to illustrate the technical solutions in the embodiments of the present application more clearly, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description are only some embodiments of the present application, and those skilled in the art can obtain other drawings from them without creative effort.
Fig. 1 is a schematic flow chart illustrating an implementation process of an animation playing method according to an embodiment of the present application;
fig. 2 is a schematic flow chart of an implementation of a method for generating a sequence frame group according to an embodiment of the present application;
fig. 3 is a schematic flowchart of an implementation of a method for determining a start frame according to an embodiment of the present application;
fig. 4 is a schematic diagram of an animation playback device according to an embodiment of the present application;
fig. 5 is a schematic diagram of an animation playback device provided in an embodiment of the present application.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system structures, techniques, etc. in order to provide a thorough understanding of the embodiments of the present application. It will be apparent, however, to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present application with unnecessary detail.
In order to explain the technical solution described in the present application, the following description will be given by way of specific examples.
Fig. 1 is a schematic view of an implementation flow of an animation playing method provided in an embodiment of the present application, which is detailed as follows:
in step S101, an interactive instruction input by a user on an animation playing interface is obtained, and an interactive area corresponding to the interactive instruction is determined.
Specifically, the animation playing interface may be any interface for playing an image in an application program. For example, the animation playing interface may be a house introduction interface in an application used by a house sales user, and the like. By playing the scene animation in the house introduction interface, the content presented in the picture can be more real. In addition, the animation playing interface may further include a plurality of response hot areas, and the plurality of response hot areas may respectively correspond to different scene animations. And when the interactive area corresponding to the interactive instruction input by the user corresponds to the response hot area, triggering to play the corresponding scene animation.
The response hot area may be a fixed region of the animation playing interface, and may be either a visible graphic button or an invisible region. For example, the graphic button may sit at a corner of the animation playing interface and carry a label hinting at the scene animation it triggers. When playing scene animations that introduce a house, the labels can be derived from the animation content: the scene animation introducing a bedroom can be labeled "master bedroom" or "second bedroom", and the scene animation introducing the kitchen can be labeled "kitchen".
Of course, the graphic button corresponding to the response hot zone may also be displayed according to the current playing status. For example, when a certain animation is triggered to play through the interactive instruction, the graphic button corresponding to the response hot area may be hidden during the playing process. When the animation is played completely, or the scene animation is paused or stopped, the graphic button corresponding to the response hot area can be displayed.
In one implementation, the response hot area may correspond to content in the scene animation. For example, a response hot area is anchored to a fixed position within the played scene; as the animation plays, the position of the hot area within the animation playing interface changes accordingly, giving the user a more immersive browsing experience. For example, when the scene animation shows position A and three browsing directions are available from A, a response hot area can be placed in each corresponding direction of the scene animation; when an interactive instruction input by the user falls on one of these hot areas, the corresponding scene animation is triggered.
The interactive instruction input by the user in the embodiment of the application may be a touch instruction or a slide instruction input by the user, a click or drag instruction of an input device such as a mouse, or a confirmation instruction generated by combining the mouse and the keyboard.
The animation playing interface can be a full-screen animation playing interface and can also be an interface occupied by a part of windows in an application program part. When the animation playing interface is a full screen interface, an interactive instruction input by a user can be received in a full screen area. And when the animation playing interface is a window in the application program, receiving an interactive instruction input by a user in the area where the window is located.
After receiving an interactive instruction input by the user, the interaction area corresponding to the instruction is determined according to the instruction type. For a touch or mouse-click instruction, the center point of the contact, or a predetermined range centered on that point, is taken as the interaction area; for a slide or drag instruction, the interaction area may be determined from the starting position of the slide or drag, as sketched below.
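By way of illustration only, the following TypeScript sketch shows one possible way to derive an interaction area from a pointer event; the type names and the hit radius are assumptions made for this example, not part of the embodiments.

```typescript
// Illustrative sketch of step S101. `InteractionArea` and HIT_RADIUS are
// hypothetical names chosen for this example.
interface Point { x: number; y: number; }
interface InteractionArea { center: Point; radius: number; }

const HIT_RADIUS = 24; // assumed size of the region around a tap or click

function interactionAreaFor(event: PointerEvent): InteractionArea {
  // A tap or click yields a small region centered on the contact point; a
  // slide or drag would instead use the gesture's starting position.
  return { center: { x: event.clientX, y: event.clientY }, radius: HIT_RADIUS };
}
```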
In step S102, the sequence frame group whose response hot area corresponds to the interaction area is searched for among the plurality of pre-superimposed sequence frame groups.
In the present application, scene animations composed of a plurality of sequence frame groups are arranged in the animation playing interface in advance. The image frames of each sequence frame group are connected into the group through a linked list, which may be a unidirectional linked list, a bidirectional linked list, a unidirectional circular linked list, or a bidirectional circular linked list; one possible node layout is sketched below.
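For illustration, one way to model such a linked frame node in TypeScript (the names are assumptions for this sketch, not defined by the embodiments) is:

```typescript
// Hypothetical node of a sequence frame group. `prev` is populated only for
// bidirectional variants; a circular variant links the tail back to the head.
interface FrameNode {
  image: ImageBitmap;       // decoded image carried by this frame
  next: FrameNode | null;   // following frame in play order
  prev: FrameNode | null;   // preceding frame (bidirectional lists only)
}
```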
When the linked list is a unidirectional linked list, the scene animation may be a sequence frame group with a fixed length and a fixed starting image frame; when an interactive instruction triggering this list is received, the fixed-length sequence frame group is played. Of course, the starting image frame of a unidirectional linked list may also be chosen according to the trigger position, for example according to the position of the currently played scene.
When the linked list is a bidirectional linked list, the playing direction through the list of the sequence frame group can be determined by the moving direction (sliding or dragging direction) of the interactive instruction. For example, when the interactive instruction moves leftward, the link direction of the bidirectional circular linked list is forward; when it moves rightward, the link direction is backward.
When the linked list is a bidirectional circular linked list, the link direction of the list can likewise be determined by the moving direction of the interactive instruction, as in the sketch below.
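As a minimal sketch of this mapping (the sign convention for horizontal movement is an assumption for the example):

```typescript
type LinkDirection = 'forward' | 'backward';

// Leftward movement of the interactive instruction traverses via `next`
// (forward); rightward movement traverses via `prev` (backward).
function linkDirectionFor(deltaX: number): LinkDirection {
  return deltaX < 0 ? 'forward' : 'backward';
}
```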
When the linked list is a bidirectional circular linked list, scene animations of the same scene may share identical image frames. To reduce the number of repeatedly stored image frames, the sequence frame group construction method shown in fig. 2 may be adopted, which specifically includes:
in step S201, an image frame library to be played is acquired.
The image frame library to be played contains the image frames of the scene animations played by the current animation playing interface. When two or more scene animations include the same image frame, only one copy of that frame needs to be stored in the library.
In step S202, an image frame corresponding to the scene animation is acquired from the image frame library.
According to the animation content to be played, the image frames corresponding to the scene animation are obtained from the image frame library. The same image frame may be referenced by multiple scene animations.
In step S203, the acquired image frames are linked by a linked list according to the animation playing order, so as to obtain a sequence frame group.
The playing order of the image frames in the scene animation is determined by the playing requirement of the animation, and the frames are connected through a linked list in that order, yielding a sequence frame group that encodes the playing order (a sketch follows).
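A possible sketch of steps S201 to S203, assuming the `FrameNode` type above, a non-empty play order, and a bidirectional circular variant; the frame library and its key scheme are illustrative assumptions:

```typescript
// Frames shared between scene animations are stored once in the frame
// library and referenced from each sequence frame group that needs them.
function buildSequenceFrameGroup(
  frameLibrary: Map<string, ImageBitmap>,
  playOrder: string[], // frame ids of one scene animation, assumed non-empty
): FrameNode {
  const nodes: FrameNode[] = playOrder.map((id) => ({
    image: frameLibrary.get(id)!, // the same ImageBitmap may back several scenes
    next: null,
    prev: null,
  }));
  nodes.forEach((node, i) => {
    node.next = nodes[(i + 1) % nodes.length];                // tail wraps to head
    node.prev = nodes[(i - 1 + nodes.length) % nodes.length]; // head wraps to tail
  });
  return nodes[0]; // head of the circular sequence frame group
}
```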
In one implementation, the linked list may be a circular linked list. During playback, the starting image and the ending image may then be the same image frame, and identifiers of the corresponding response hot areas can be set in that frame, so that the user can trigger different scene animations according to the hot area identifiers in the image frame.
In step S103, rendering and playing the images in the sequence frame group in sequence according to a preset linked list in the sequence frame group.
The images in the sequence frame group are read and played in order according to the preset linked list in the group, where the preset linked list is a unidirectional linked list, a bidirectional linked list, a unidirectional circular linked list, a bidirectional circular linked list, or the like.
When the linked list is a unidirectional or bidirectional (non-circular) linked list, the image frames in the sequence frame group can be played in order from a preset fixed starting frame (such as the first image frame), and playback ends when the last image frame has been played or a stop instruction is received; a minimal playback sketch follows.
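By way of example only, a minimal playback loop for step S103, assuming a canvas rendering target and a fixed frame rate (neither is prescribed by the embodiments):

```typescript
// Renders frames in linked-list order; for a non-circular list, playback
// ends when `next` is null. The returned handle implements stop/pause.
function play(
  start: FrameNode,
  ctx: CanvasRenderingContext2D,
  fps = 24, // assumed frame rate
): () => void {
  let current: FrameNode | null = start;
  const timer = setInterval(() => {
    if (current === null) { clearInterval(timer); return; } // list exhausted
    ctx.drawImage(current.image, 0, 0);
    current = current.next;
  }, 1000 / fps);
  return () => clearInterval(timer); // call to stop or pause playback
}
```

For a circular list this loop runs until the stop handle is called, matching the circular semantics described above.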
When the linked list is a unidirectional or bidirectional circular linked list, the starting position of the scene animation can be determined from the currently played content, which suits the need of introducing a scene space on occasions where spatial scenes are presented. One implementation of determining the starting playing position of the scene animation from the played content is shown in fig. 3 and includes:
in step S301, the content currently played by the playing interface is obtained.
For a scene-introduction animation, the content currently played on the playing interface is obtained so that the position corresponding to that content, i.e. the user's position within the scene, is known. Alternatively, the target object the user is currently focused on can be derived from the content in the current playing interface, and the corresponding image frames of other sequence frame groups determined from that target object.
In step S302, according to the currently played content on the playing interface, a node in the sequence frame group corresponding to the currently played content is determined.
The correspondence between scene positions (or target objects in the image) and the nodes of each sequence frame group can be preset. By detecting the scene position or target object in the currently played content, the corresponding image frame in the new scene animation about to be triggered can be found through this preset correspondence and used as the starting frame of the scene animation to be played.
In step S303, a starting frame of the animation playing corresponding to the interactive instruction is determined according to the node in the determined sequence frame group.
Starting from the frame determined by the node in the sequence frame group, the image frames in the group are rendered and played in order along the linked list, until a pause instruction from the user is received or another scene animation is triggered. Determining the starting frame of the scene animation corresponding to the interactive instruction from the currently played content effectively strengthens the user's immersion in the scene; an illustrative lookup is sketched below.
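One illustrative way to realize steps S301 to S303 is a pre-built table from scene positions (or target objects) to start nodes; the key scheme below is an assumption made for the example:

```typescript
// Hypothetical mapping from "sceneId:position" to the node at which the
// corresponding sequence frame group should start playing.
const startNodeIndex = new Map<string, FrameNode>();

function startFrameFor(sceneId: string, position: string): FrameNode | undefined {
  return startNodeIndex.get(`${sceneId}:${position}`); // undefined if unmapped
}
```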
In addition, when the interaction area does not correspond to any response hot area, prompt information indicating that the interactive instruction is invalid is generated, for example a prompt that the current instruction is invalid or cannot trigger an animation playing response.
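Combining step S102 with this invalid-instruction prompt, a sketch reusing the illustrative `InteractionArea` and `FrameNode` types above (`ResponseHotZone` is likewise an assumed name):

```typescript
interface ResponseHotZone {
  bounds: DOMRect;   // screen region of the hot zone
  frames: FrameNode; // head of the sequence frame group it triggers
}

// Returns the matching sequence frame group, or null after emitting the
// invalid-instruction prompt when no response hot zone contains the area.
function findSequenceFrameGroup(
  area: InteractionArea,
  hotZones: ResponseHotZone[],
): FrameNode | null {
  const hit = hotZones.find((z) =>
    area.center.x >= z.bounds.left && area.center.x <= z.bounds.right &&
    area.center.y >= z.bounds.top && area.center.y <= z.bounds.bottom);
  if (hit === undefined) {
    console.warn('Interactive instruction is invalid: no response hot area here.');
    return null;
  }
  return hit.frames;
}
```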
It should be understood that, the sequence numbers of the steps in the foregoing embodiments do not imply an execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not constitute any limitation to the implementation process of the embodiments of the present application.
Fig. 4 is a schematic structural diagram of an animation playback device according to an embodiment of the present application, where the animation playback device includes:
the interactive region determining unit 401 is configured to obtain an interactive instruction input by a user on an animation playing interface, and determine an interactive region corresponding to the interactive instruction;
a sequence frame group searching unit 402, configured to search, among a plurality of pre-superimposed sequence frame groups, for the sequence frame group whose response hot area corresponds to the interaction area;
a playing unit 403, configured to render and play the images in the sequence frame group in sequence according to a preset linked list in the sequence frame group.
The animation playback apparatus shown in fig. 4 corresponds to the animation playback method shown in fig. 1, and will not be described repeatedly here.
Fig. 5 is a schematic diagram of an animation playback device according to an embodiment of the present application. As shown in fig. 5, the animation playback device 5 of this embodiment includes: a processor 50, a memory 51, and a computer program 52, such as an animation playback program, stored in the memory 51 and executable on the processor 50. The processor 50 executes the computer program 52 to implement the steps in the above animation playing method embodiments, such as steps S101 to S103 shown in fig. 1. Alternatively, when executing the computer program 52, the processor 50 implements the functions of each module/unit in the above device embodiments, for example the functions of the units 401 to 403 shown in fig. 4.
Illustratively, the computer program 52 may be partitioned into one or more modules/units, which are stored in the memory 51 and executed by the processor 50 to accomplish the present application. The one or more modules/units may be a series of computer program instruction segments capable of performing specific functions, which are used to describe the execution process of the computer program 52 in the animation playback device 5. For example, the computer program 52 may be divided into units with specific functions as follows:
the interactive area determining unit is used for acquiring an interactive instruction input by a user on an animation playing interface and determining an interactive area corresponding to the interactive instruction;
the sequence frame group searching unit is used for searching, among a plurality of pre-superimposed sequence frame groups, for the sequence frame group whose response hot area corresponds to the interaction area;
and the playing unit is used for rendering and playing the images in the sequence frame group in sequence according to a preset linked list in the sequence frame group.
The animation playing device 5 may be a desktop computer, a notebook, a palm computer, a cloud server, or other computing devices. The animation playback device may include, but is not limited to, a processor 50 and a memory 51. Those skilled in the art will appreciate that fig. 5 is merely an example of the animation playback device 5, and does not constitute a limitation of the animation playback device 5, and may include more or less components than those shown, or combine some components, or different components, for example, the animation playback device may further include an input-output device, a network access device, a bus, etc.
The processor 50 may be a Central Processing Unit (CPU), another general-purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, discrete hardware components, etc. A general-purpose processor may be a microprocessor, or any conventional processor.
The memory 51 may be an internal storage unit of the animation playing device 5, such as a hard disk or memory of the device. The memory 51 may also be an external storage device of the animation playing device 5, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) card, or a Flash Card provided on the animation playing device 5. Further, the memory 51 may include both an internal storage unit and an external storage device of the animation playing device 5. The memory 51 is used for storing the computer program and the other programs and data required by the animation playing device, and may also be used to temporarily store data that has been output or is to be output.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-mentioned division of the functional units and modules is illustrated, and in practical applications, the above-mentioned function distribution may be performed by different functional units and modules according to needs, that is, the internal structure of the apparatus is divided into different functional units or modules to perform all or part of the above-mentioned functions. Each functional unit and module in the embodiments may be integrated in one processing unit, or each unit may exist alone physically, or two or more units are integrated in one unit, and the integrated unit may be implemented in a form of hardware, or in a form of software functional unit. In addition, specific names of the functional units and modules are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present application. The specific working processes of the units and modules in the system may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or illustrated in a certain embodiment.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus/terminal device and method may be implemented in other ways. For example, the above-described embodiments of the apparatus/terminal device are merely illustrative, and for example, the division of the modules or units is only one logical division, and there may be other divisions when actually implemented, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated modules/units, if implemented in the form of software functional units and sold or used as separate products, may be stored in a computer-readable storage medium. Based on such understanding, all or part of the flow in the methods of the embodiments described above can be realized by a computer program, which can be stored in a computer-readable storage medium and, when executed by a processor, realizes the steps of the method embodiments described above. The computer program comprises computer program code, which may be in the form of source code, object code, an executable file, some intermediate form, etc. The computer-readable medium may include: any entity or device capable of carrying the computer program code, a recording medium, a USB disk, a removable hard disk, a magnetic disk, an optical disk, a computer memory, a Read-Only Memory (ROM), a Random Access Memory (RAM), an electrical carrier signal, a telecommunications signal, a software distribution medium, and the like. It should be noted that the content of the computer-readable medium may be increased or decreased as required by legislation and patent practice in a jurisdiction; for example, in some jurisdictions, computer-readable media do not include electrical carrier signals and telecommunications signals.
The above-mentioned embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not substantially depart from the spirit and scope of the embodiments of the present application and are intended to be included within the scope of the present application.

Claims (10)

1. An animation playing method, characterized in that the animation playing method comprises:
acquiring an interactive instruction input by a user on an animation playing interface, and determining an interactive area corresponding to the interactive instruction;
searching, among a plurality of pre-superimposed sequence frame groups, for the sequence frame group whose response hot area corresponds to the interaction area;
and rendering and playing the images in the sequence frame group in sequence according to a preset linked list in the sequence frame group.
2. The animation playback method as claimed in claim 1, wherein the linked list is a bidirectional circular linked list, the method further comprising:
acquiring the moving direction of the interactive instruction;
and determining the link direction of the bidirectional circular linked list according to the moving direction of the interactive instruction.
3. The animation playback method as claimed in claim 2, further comprising:
acquiring the content currently played on the playing interface;
determining, according to the content currently played on the playing interface, the node in the sequence frame group corresponding to the currently played content;
and determining the starting frame of the animation playing corresponding to the interactive instruction according to the determined node in the sequence frame group.
4. The animation playback method according to claim 3, wherein the step of determining the node in the sequence frame group corresponding to the currently played content according to the currently played content in the playback interface comprises:
acquiring the position corresponding to the content currently played on the playing interface;
and searching, in the found sequence frame group, for the node corresponding to the position.
5. The animation playing method as claimed in claim 2, wherein the step of determining the link direction of the bidirectional circular linked list according to the moving direction of the interactive command comprises:
when the moving direction of the interactive instruction is the left direction, the link direction of the bidirectional circular linked list is the forward direction;
and when the moving direction of the interactive instruction is the right direction, the link direction of the bidirectional circular linked list is the backward direction.
6. The animation playback method as claimed in claim 1, further comprising:
acquiring an image frame library to be played;
acquiring an image frame corresponding to the scene animation from the image frame library;
and connecting the acquired image frames through a linked list according to the animation playing sequence to obtain a sequence frame group.
7. The animation playback method as claimed in claim 1, further comprising:
and when the interactive area does not correspond to the response hot area, generating prompt information that the interactive instruction is invalid.
8. An animation playback apparatus, comprising:
the interactive area determining unit is used for acquiring an interactive instruction input by a user on an animation playing interface and determining an interactive area corresponding to the interactive instruction;
the sequence frame group searching unit is used for searching, among a plurality of pre-superimposed sequence frame groups, for the sequence frame group whose response hot area corresponds to the interaction area;
and the playing unit is used for rendering and playing the images in the sequence frame group in sequence according to a preset linked list in the sequence frame group.
9. An animation playback device comprising a memory, a processor and a computer program stored in the memory and executable on the processor, characterized in that the processor implements the steps of the animation playback method as claimed in any one of claims 1 to 7 when executing the computer program.
10. A computer-readable storage medium, in which a computer program is stored, which, when being executed by a processor, carries out the steps of the animation playback method as claimed in any one of claims 1 to 7.
CN202010277684.5A 2020-04-08 2020-04-08 Animation playing method, device and equipment Active CN112954423B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010277684.5A CN112954423B (en) 2020-04-08 2020-04-08 Animation playing method, device and equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010277684.5A CN112954423B (en) 2020-04-08 2020-04-08 Animation playing method, device and equipment

Publications (2)

Publication Number Publication Date
CN112954423A (en) 2021-06-11
CN112954423B CN112954423B (en) 2023-05-26

Family

ID=76234468

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010277684.5A Active CN112954423B (en) 2020-04-08 2020-04-08 Animation playing method, device and equipment

Country Status (1)

Country Link
CN (1) CN112954423B (en)

Citations (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040239699A1 (en) * 2003-05-31 2004-12-02 Uyttendaele Matthew T. System and process for viewing and navigating through an interactive video tour
CN102999244A (en) * 2012-03-23 2013-03-27 广州市凡拓数码科技有限公司 Realization method, realization system and manufacturing method for exhibition of electronic house type
CN104182125A (en) * 2014-08-25 2014-12-03 腾讯科技(深圳)有限公司 Method and device for triggering and operating suspension windows
CN104423814A (en) * 2013-08-20 2015-03-18 腾讯科技(深圳)有限公司 Method for controlling network media information interaction and browser
CN104966225A (en) * 2015-07-08 2015-10-07 深圳爱布丁梦想科技有限公司 Housing rental method and system based on mobile terminal and 3D panoramic image browsing
CN105303600A (en) * 2015-07-02 2016-02-03 北京美房云谷网络科技有限公司 Method of viewing 3D digital building by using virtual reality goggles
CN107330945A (en) * 2017-07-05 2017-11-07 合肥工业大学 A kind of examing heartbeat fastly method based on video
CN108830692A (en) * 2018-06-20 2018-11-16 厦门市超游网络科技股份有限公司 Long-range panorama sees room method, apparatus, user terminal, server and storage medium
CN108830938A (en) * 2018-05-30 2018-11-16 链家网(北京)科技有限公司 A kind of virtual three-dimensional space picture balance method and device
CN109669673A (en) * 2017-10-12 2019-04-23 世熠网络科技(上海)有限公司 Game engine device based on HTML5
WO2019082050A1 (en) * 2017-10-23 2019-05-02 ГИОРГАДЗЕ, Анико Тенгизовна User interaction in communication systems, using an augmented reality story message
CN109976527A (en) * 2019-03-28 2019-07-05 重庆工程职业技术学院 Interactive VR display systems
CN110134478A (en) * 2019-04-28 2019-08-16 深圳市思为软件技术有限公司 The scene conversion method, apparatus and terminal device of panoramic scene
CN110933515A (en) * 2018-12-13 2020-03-27 湖南汉坤建筑安保器材有限公司 Panoramic video playing control method based on VR technology

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109495427B (en) * 2017-09-12 2021-05-07 腾讯科技(深圳)有限公司 Multimedia data display method and device, storage medium and computer equipment
CN109165052B (en) * 2018-08-08 2021-10-26 腾讯科技(深圳)有限公司 Interactive processing method and device of application scene, terminal, system and storage medium

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
王钢: "Building a Virtual Modern Living Room Using VRML Technology", 《科技信息》 (Science & Technology Information) *

Also Published As

Publication number Publication date
CN112954423B (en) 2023-05-26

Similar Documents

Publication Publication Date Title
US20220292590A1 (en) Two-dimensional code identification method and device, and mobile terminal
US20190158934A1 (en) Video frame capturing method and device
CN109725803B (en) Comment information processing method and device, storage medium and electronic equipment
CN104571877A (en) Display processing method and device for pages
CN111432264A (en) Content display method, device and equipment based on media information stream and storage medium
CN110928626A (en) Interface switching method and device and electronic equipment
CN112667118A (en) Method, apparatus and computer readable medium for displaying historical chat messages
CN105335036A (en) Input interaction method and input method system
CN111760272B (en) Game information display method and device, computer storage medium and electronic equipment
CN111597009B (en) Application program display method and device and terminal equipment
CN113050861B (en) Display interface control method, electronic device and storage medium
CN112506503A (en) Programming method, device, terminal equipment and storage medium
CN109558203B (en) Recent content display method, device, terminal and storage medium
CN112954423A (en) Animation playing method, device and equipment
CN110597432B (en) Interface control method, device, computer readable medium and electronic equipment
CN113542889A (en) List video playing method and device, computer equipment and storage medium thereof
CN114092608A (en) Expression processing method and device, computer readable storage medium and electronic equipment
CN108415656B (en) Display control method, device, medium and electronic equipment in virtual scene
CN110955473A (en) Method and device for displaying loading prompt information
CN115562496B (en) XR equipment, character input method based on XR equipment and character modification method
CN109450993B (en) Method and apparatus for presenting information
CN113112613B (en) Model display method and device, electronic equipment and storage medium
CN115760110A (en) Information identification method, information identification device, electronic equipment and medium
CN115729412A (en) Interface display method and device
CN113485629A (en) Touch event processing method and device, storage medium and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant