CN113676765B - Interactive animation display method and device - Google Patents


Info

Publication number
CN113676765B
Authority
CN
China
Prior art keywords
frame image
interactive animation
animation display
frame
identifier
Prior art date
Legal status
Active
Application number
CN202110963316.0A
Other languages
Chinese (zh)
Other versions
CN113676765A
Inventor
申鹏飞
Current Assignee
Shanghai Bilibili Technology Co Ltd
Original Assignee
Shanghai Bilibili Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Shanghai Bilibili Technology Co Ltd
Priority to CN202110963316.0A
Publication of CN113676765A
Application granted
Publication of CN113676765B


Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/431Generation of visual interfaces for content selection or interaction; Content or additional data rendering
    • H04N21/4312Generation of visual interfaces for content selection or interaction; Content or additional data rendering involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04842Selection of displayed objects or displayed text elements
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L67/00Network arrangements or protocols for supporting network services or applications
    • H04L67/01Protocols
    • H04L67/02Protocols based on web technology, e.g. hypertext transfer protocol [HTTP]
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L67/00Network arrangements or protocols for supporting network services or applications
    • H04L67/01Protocols
    • H04L67/06Protocols specially adapted for file transfer, e.g. file transfer protocol [FTP]
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47End-user applications
    • H04N21/472End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/80Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N21/81Monomedia components thereof
    • H04N21/8146Monomedia components thereof involving graphical data, e.g. 3D object, 2D graphics
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/80Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N21/83Generation or processing of protective or descriptive data associated with content; Content structuring
    • H04N21/845Structuring of content, e.g. decomposing content into time segments
    • H04N21/8455Structuring of content, e.g. decomposing content into time segments involving pointers to the content, e.g. pointers to the I-frames of the video stream

Abstract

The application provides an interactive animation display method and device. The interactive animation display method includes: receiving an interactive animation identifier and obtaining the video file corresponding to the interactive animation identifier; acquiring frame images from the video file at a preset acquisition frequency and storing the frame images locally in memory; receiving an interactive animation display request and determining the identifier of the frame image to be displayed according to the request; and acquiring the target frame image corresponding to the frame image identifier from local storage and displaying it on an interactive animation display interface. This ensures that the correct video, and therefore the correct interactive animation, is obtained, avoids acquiring an excessive number of frame images from an overly large video file, and improves network transmission efficiency to a certain extent. In addition, the requirements on browser or application version and on computer performance are reduced while the interactive animation experience and animation effect are preserved.

Description

Interactive animation display method and device
Technical Field
The application relates to the technical field of computers, in particular to an interactive animation display method. The present application also relates to an interactive animation display device, a computing device, and a computer-readable storage medium.
Background
With the continuous development of computer technology, new kinds of animation keep emerging, and interactive animation has become particularly popular. An interactive animation displays different animation effects according to different user inputs, giving the user a fresh experience and greatly increasing the user's interest in the browser or application that displays it.
However, conventional methods for displaying interactive animation suffer from performance problems such as excessively large image files and low network transmission efficiency. An effective solution to these problems is therefore needed.
Disclosure of Invention
In view of this, embodiments of the application provide an interactive animation display method. The application also relates to an interactive animation display device, a computing device, and a computer-readable storage medium, so as to address the technical defects of excessively large image files and low network transmission efficiency in the prior art.
According to a first aspect of an embodiment of the present application, there is provided an interactive animation display method, including:
receiving an interactive animation identifier, and obtaining a video file corresponding to the interactive animation identifier;
acquiring frame images in the video file according to a preset acquisition frequency, and locally storing the frame images;
receiving an interactive animation display request, and determining a frame image identifier to be displayed according to the interactive animation display request;
and acquiring a target frame image corresponding to the frame image identifier from the local storage, and displaying the target frame image on an interactive animation display interface.
According to a second aspect of embodiments of the present application, there is provided an interactive animation display device, including:
the receiving module is configured to receive the interactive animation identification and acquire a video file corresponding to the interactive animation identification;
the storage module is configured to acquire frame images in the video file according to a preset acquisition frequency and locally store the frame images;
the determining module is configured to receive an interactive animation display request and determine a frame image identifier to be displayed according to the interactive animation display request;
and the display module is configured to acquire a target frame image corresponding to the frame image identifier from the local storage, and display the target frame image on an interactive animation display interface.
According to a third aspect of embodiments of the present application, there is provided a computing device comprising a memory, a processor and computer instructions stored on the memory and executable on the processor, the processor implementing the steps of the interactive animation presentation method when executing the computer instructions.
According to a fourth aspect of embodiments of the present application, there is provided a computer readable storage medium storing computer instructions which, when executed by a processor, implement the steps of the interactive animation presentation method.
According to the interactive animation display method, an interactive animation identifier is received and the video file corresponding to that identifier is obtained; frame images are acquired from the video file at a preset acquisition frequency and stored locally; an interactive animation display request is received and the identifier of the frame image to be displayed is determined according to that request; and the target frame image corresponding to the frame image identifier is acquired from local storage and displayed on an interactive animation display interface. Acquiring the video file according to the interactive animation identifier ensures that the correct video, and therefore the correct interactive animation, is obtained. Acquiring the frame images from the video file at the preset acquisition frequency and storing them locally avoids acquiring an excessive number of frame images from an overly large video file and improves network transmission efficiency to a certain extent. In addition, because each frame image is retrieved through its frame image identifier, the frame images can be played back frame by frame in reverse or in random order, which preserves the interactive animation experience and supports animation effects with higher frame counts while reducing the requirements on the browser or application version and on computer performance.
Drawings
FIG. 1 is a flowchart of an interactive animation display method according to an embodiment of the present application;
FIG. 2A is a schematic diagram of an interactive animation display interface according to an embodiment of the present application;
FIG. 2B is a schematic diagram of an interactive animation display effect according to an embodiment of the present application;
FIG. 2C is a schematic diagram illustrating a cover animation display effect according to an embodiment of the present application;
fig. 2D is a schematic diagram illustrating an image display effect of a preset frame according to an embodiment of the present application;
FIG. 2E is a process flow diagram of an interactive animation display method applied to a browser according to an embodiment of the present application;
FIG. 3 is a schematic structural diagram of an interactive animation display device according to an embodiment of the present disclosure;
FIG. 4 is a block diagram of a computing device according to one embodiment of the present application.
Detailed Description
In the following description, numerous specific details are set forth in order to provide a thorough understanding of the present application. The application can, however, be embodied in many ways other than those described herein, and those skilled in the art can make similar generalizations without departing from the spirit of the application; the application is therefore not limited to the specific embodiments disclosed below.
The terminology used in one or more embodiments of the application is for the purpose of describing particular embodiments only and is not intended to be limiting of one or more embodiments of the application. As used in this application in one or more embodiments and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used in one or more embodiments of the present application refers to and encompasses any or all possible combinations of one or more of the associated listed items.
It should be understood that, although the terms first, second, etc. may be used in one or more embodiments of the present application to describe various information, the information should not be limited by these terms. These terms are only used to distinguish one type of information from another. For example, a first item may also be referred to as a second item, and similarly, a second item may also be referred to as a first item, without departing from the scope of one or more embodiments of the present application. The word "if" as used herein may be interpreted as "when", "upon", or "in response to determining", depending on the context.
First, terms related to one or more embodiments of the present application will be explained.
Interactive animation: an animation effect in which the content displayed on the screen of an electronic device supports event response and interaction, that is, an animation effect in which the displayed content changes in real time according to user input. In other words, while the interactive animation plays, the content displayed on the screen can respond to some form of control by the viewer or to an operation prepared in advance.
Video cover preview: an effect that allows video content to be previewed quickly, for example when the mouse hovers over a video card on the home page of a video website.
CSS animation: a module of CSS (Cascading Style Sheets) that defines how keyframes are used to animate the values of CSS properties over time. The behavior of keyframe animations can be controlled by specifying their duration, their number of repetitions, and how they repeat.
Canvas: the canvas element is used for drawing graphics on a web page in the HTML5 standard. The power of this element tag is that graphics operations can be performed directly in HTML, which gives the canvas great practical value. Graphics (2D or 3D) are drawn on the element with JavaScript. Canvas elements have two attributes, "width" and "height", which set the width and height of the canvas.
WebGL: a 3D drawing protocol that allows JavaScript (a lightweight, interpreted or just-in-time compiled programming language with first-class functions) to be combined with OpenGL ES 2.0. By adding a JavaScript binding to OpenGL ES 2.0, WebGL can provide hardware-accelerated 3D rendering for the HTML5 canvas.
Video coding format: to save data, common video coding formats generally use key frames to store image data, and between one key frame and the next only the change information of the picture is stored, that is, only the image data of the picture areas that differ between the two key frames is stored rather than all the image data of the next frame. During playback, the previous key frame must first be decoded and read, and the current frame data is then obtained through further decoding.
Next, the basic idea of the interactive animation display method provided in the present specification will be briefly described.
In the prior art, two methods are commonly used to realize interactive animation effects. The first is typically implemented with CSS animation, Canvas 2D, or WebGL techniques. The second uses the video cover preview technique: several frame images are read on the server side and spliced into one large image, and when the user points at the large image with a mouse or other device, the terminal determines the target frame image corresponding to the pointer position according to where that position falls within the large image, and displays that target frame image from the large image.
However, the first method is implemented on top of a video coding format, in which the previous key frame must be decoded and read before the current frame data can be obtained through further decoding. When frames are played back in reverse or in random order, frame data cannot be read ahead of time: for every frame, the previous key frame must first be read and the current frame computed, which causes serious performance problems at runtime. In other words, the more complex the interactive animation effect, such as frame-by-frame reverse playback or random-frame playback, the higher the requirements that an implementation based on a video coding format places on the browser or application version and on the computer used to display the interactive animation; and when the user's computer and browser or application cannot handle frame-by-frame reverse playback or random-frame playback, the user cannot obtain that interactive experience.
The second method, in turn, is feasible only when the number of frame images used for the video cover preview effect is low, that is, when few frame images make up the large image. Once the number of frame images constituting the large image grows, performance problems such as oversized image files and low network transmission efficiency arise.
To avoid placing excessively high requirements on the browser or application version and computer performance available to the user, to preserve the interactive animation experience and support animation effects with higher frame counts, and to avoid low network transmission efficiency, this specification provides an interactive animation display method that receives an interactive animation identifier and acquires the video file corresponding to it; acquires frame images from the video file at a preset acquisition frequency and stores them locally; receives an interactive animation display request and determines the identifier of the frame image to be displayed according to the request; and acquires the target frame image corresponding to that frame image identifier from local storage and displays it on an interactive animation display interface.
In the present application, an interactive animation display method is provided, and the present application relates to an interactive animation display device, a computing device, and a computer-readable storage medium, which are described in detail in the following embodiments one by one.
Fig. 1 shows a flowchart of an interactive animation display method according to an embodiment of the present application, which specifically includes the following steps:
Step 102: and receiving the interactive animation identification, and obtaining a video file corresponding to the interactive animation identification.
Specifically, an interactive animation identifier is the identifier of the interactive animation to be displayed. Interactive animation identifiers are in one-to-one correspondence with video files, that is, one interactive animation identifier uniquely corresponds to one video file; for example, interactive animation identifier A1 corresponds to video file A1 and interactive animation identifier A2 corresponds to video file A2. A video file is a file that provides the interactive animation resource.
In practical applications, an application program or a browser page may be provided with several video files. Different video files can be selected according to the version of the application or browser to produce different animation effects; in that case the version is used as the interactive animation identifier. For example, if the application version is B1, the interactive animation identifier is B1 and the application corresponds to video file B1; if the browser version is B2, the interactive animation identifier is B2 and the browser corresponds to video file B2. Different video files can also be selected as time goes on to produce different animation effects; in that case the current time is used as the interactive animation identifier. For example, from 0:00 to 12:00 the interactive animation identifier is C1 and corresponds to video file C1, and from 12:00 to 24:00 the interactive animation identifier is C2 and corresponds to video file C2. When the interactive animation identifier is received, the video file corresponding to it is obtained.
It should be noted that, in the present application, the size and the video duration of the video file are not limited. Preferably, the size of the video file and the duration of the video need to ensure that the video can be played smoothly without taking up excessive network resources.
In one or more implementations of this embodiment, in order to reduce the amount of video data stored locally and improve local processing efficiency, the video file is provided by a server, so the video file only needs to be obtained from the server according to the interactive animation identifier. The specific process of obtaining the video file corresponding to the interactive animation identifier may therefore be as follows:
generating a video file downloading request according to the interactive animation identifier;
the video file downloading request is sent to a server;
and receiving the video file which is sent by the server and corresponds to the interactive animation identifier.
Specifically, the video file downloading request refers to a request for downloading a video file corresponding to the interactive animation identifier; the server refers to a server corresponding to the application program or the browser page.
In practical applications, after the interactive animation identifier is received, the local side generates a download request for the video file corresponding to that identifier, that is, a video file download request carrying the interactive animation identifier. The video file download request is sent to the server; if downloading is allowed, the server sends the video file corresponding to the interactive animation identifier back, and the video file is received locally. The video file corresponding to the interactive animation identifier may, for example, be downloaded from the server through an asynchronous HTTP (Hypertext Transfer Protocol) request.
For example, interactive animation identifier D1 is received locally, and video file download request D2, carrying identifier D1, is generated and sent to the server. After receiving download request D2 and allowing it, the server looks up video file D3 according to identifier D1 and sends video file D3 back to the local side.
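A minimal sketch of this download step in a browser context, assuming the server exposes a hypothetical endpoint that maps an animation identifier to its video file (the URL scheme and function names are illustrative, not part of the patent):

```typescript
// Hypothetical endpoint; the real URL scheme is not specified in the patent.
const VIDEO_ENDPOINT = "/api/interactive-animation";

// Download the video file corresponding to an interactive animation identifier
// via an asynchronous HTTP request, and return it as a Blob for local use.
async function fetchAnimationVideo(animationId: string): Promise<Blob> {
  const response = await fetch(`${VIDEO_ENDPOINT}/${encodeURIComponent(animationId)}`);
  if (!response.ok) {
    throw new Error(`Video download failed for ${animationId}: HTTP ${response.status}`);
  }
  return response.blob();
}

// Usage: obtain the video file for identifier "D1".
// fetchAnimationVideo("D1").then((file) => console.log("received", file.size, "bytes"));
```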
Step 104: and acquiring frame images in the video file according to a preset acquisition frequency, and locally storing the frame images.
After the interactive animation identifier has been received and the corresponding video file acquired, the frame images of the video file are extracted and stored.
Specifically, the acquisition frequency refers to a frequency of acquiring frame images, for example, one frame image is acquired every 20 milliseconds. The frame image is also referred to as an image frame.
In practical applications, the frame images of the video corresponding to the video file are acquired at the preset acquisition frequency, that is, one frame image of the video file is acquired at each time interval and stored locally until all frame images have been stored. Preferably the frame images are stored in memory, which speeds up storage and makes frame images faster to retrieve when the interactive animation is displayed.
For example, the preset acquisition frequency is that one frame image is acquired every 30 milliseconds, then one frame image is acquired every 30 milliseconds from the start of the video file, until all frame images are acquired, and then the frame images are stored in the memory.
It should be noted that the preset acquisition frequency is set for a video preview environment in which the video file is previewed, that is, the frame images of the video file can be acquired at the preset acquisition frequency inside the video preview environment. This improves the efficiency of acquiring the frame images. In other words, before the frame images are acquired from the video file at the preset acquisition frequency, a video preview environment must be created for the video file, a frame rate set for that environment, and the acquisition frequency of the video preview environment determined from the frame rate.
Specifically, the frame rate is the frequency or rate at which bitmap images, in units of frames, successively appear on a display. The preset frame rate here is independent of both the original frame rate of the video file and the frame rate of the terminal display.
In practical applications, the video preview environment is the environment in which the video file is previewed, i.e. an HTML <video> element. An HTML <video> element is created for the video file, a preset frame rate is set for it, and the acquisition frequency of the video preview environment is then derived from that preset frame rate.
For example, if an HTML <video> element is created with a preset frame rate of 25 fps, i.e. 25 frames per second, the acquisition frequency of the <video> element is one frame every 1000/25 = 40 milliseconds.
In this application, after the video preview environment has been created and the acquisition frequency set, the video file can be previewed in the video preview environment at that acquisition frequency to obtain all frame data of the video file at that frequency, and each piece of frame data is then converted into a frame image and stored one by one. The specific process of acquiring the frame images from the video file at the preset acquisition frequency and storing them locally may therefore be as follows:
previewing the video file in the video preview environment according to a preset acquisition frequency to obtain each frame of data;
generating a current frame image according to each frame data;
and locally storing the current frame image.
In practical applications, the video file is previewed in the preset environment and its frame data is acquired at the preset acquisition frequency. After every piece of frame data of the video file has been acquired, the frame images are generated one by one from the frame data: the current frame data is drawn into a canvas element with the canvas putImageData method to obtain the current frame image, and the current frame image is stored into memory with the canvas getImageData method. A frame image is then generated from the next piece of frame data and stored in memory, and so on, until the last piece of frame data has been turned into an image frame and stored.
Alternatively, after the video preview environment has been created and the acquisition frequency set, the video file can be previewed in the video preview environment at that frequency and each piece of frame data processed as it arrives: the current frame data is converted into a frame image and stored, then the next piece of frame data is acquired, converted into a frame image, and stored, and so on until the preview of the video file finishes. That is: previewing the video file in the video preview environment at the preset acquisition frequency to obtain the current frame data; generating the current frame image from that frame data and storing it locally; and obtaining the next piece of frame data and repeating the step of generating the current frame image from the frame data.
That is, the video file is previewed in the preset environment and its frame data acquired at the preset acquisition frequency: the first piece of frame data is obtained and a first frame image is generated from it, i.e. the first frame data is drawn into a canvas element with a canvas method to obtain the first frame image, which is stored in memory with a canvas method. The second piece of frame data is then acquired, a second frame image generated, and that frame image stored in memory, and so on, until the last piece of frame data has been turned into an image frame and stored.
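A minimal sketch of this frame-extraction loop, assuming a hidden <video> element serves as the video preview environment and the standard Canvas 2D APIs (drawImage/getImageData) are used to turn each frame into pixel data held in memory; the 40 ms interval reflects the 25 fps example above and all names are illustrative:

```typescript
// Extract frame images from a video file at a fixed acquisition interval
// and keep them in memory as ImageData objects.
async function extractFrames(videoFile: Blob, intervalMs = 40): Promise<ImageData[]> {
  const video = document.createElement("video");      // video preview environment
  video.src = URL.createObjectURL(videoFile);
  video.muted = true;
  await new Promise<void>((ok) => (video.onloadedmetadata = () => ok()));

  const canvas = document.createElement("canvas");
  canvas.width = video.videoWidth;
  canvas.height = video.videoHeight;
  const ctx = canvas.getContext("2d")!;

  const frames: ImageData[] = [];
  // Step through the video one acquisition interval at a time.
  for (let t = 0; t < video.duration; t += intervalMs / 1000) {
    video.currentTime = t;
    await new Promise<void>((ok) => (video.onseeked = () => ok()));
    ctx.drawImage(video, 0, 0);                                        // draw the current frame into the canvas
    frames.push(ctx.getImageData(0, 0, canvas.width, canvas.height)); // store the frame image in memory
  }
  URL.revokeObjectURL(video.src);
  return frames;
}
```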
Step 106: and receiving an interactive animation display request, and determining a frame image identifier to be displayed according to the interactive animation display request.
After the frame images of the video file have been extracted and stored locally, the identifier of the frame image to be displayed is determined according to the interactive animation display request whenever such a request is received.
Specifically, an interactive animation display request is a request issued when the user wants a certain application or browser page to display the interactive animation. The frame image identifier is the identifier of the frame image corresponding to the interactive animation effect to be displayed.
In practical applications, when a user browses a page that can display the interactive animation in a browser, or uses an application that can display it, an interactive animation display request can be generated and received locally, and the identifier of the frame image to be displayed is then determined locally according to that request.
In one or more implementations of this embodiment, the interactive animation display request may be a cover animation display request, in which case the target frame image is a cover frame image. That is, as soon as the user opens the browser page or application, the page or application is triggered to present the cover of the interactive animation: the cover animation display request is received locally and the cover frame image identifier is then determined. In other words, receiving an interactive animation display request and determining the frame image identifier to be displayed according to it may consist of receiving a cover animation display request and parsing it to obtain the identifier of the cover frame image to be displayed.
Specifically, a cover animation display request is a request, triggered when the user opens a browser page or application, for the page or application to display the cover of the interactive animation. The cover frame image is the frame of the interactive animation that the browser page or application displays when the user opens it.
In practical applications, when the interactive animation display request is a cover animation display request, it must be parsed to obtain the identifier of the cover frame image to be displayed, so that the system knows which cover frame image to display. This improves the efficiency of displaying the cover frame image when the user opens a browser page or application, and thus improves the user experience.
To further improve interactivity and the user experience, the interactive animation display request may also be a request to change the displayed frame image through a user operation. In that case, receiving the interactive animation display request and determining the frame image identifier to be displayed according to it may proceed as follows:
receiving an interactive animation display request, wherein the interactive animation display request carries position information;
and determining the frame image identifier to be displayed corresponding to the position information.
Specifically, the position information is the information about the position pointed to by the mouse, keyboard, or touch point when the user drives the interactive animation display with the mouse, the keyboard, or touch input.
In practical applications, the user triggers an interactive animation display request, received locally, with the mouse, the keyboard, or a touch gesture. The request carries the position information of the mouse, keyboard, or touch point, and each piece of position information corresponds to a frame image identifier, so the identifier of the image frame to be displayed can be determined from the position information.
To determine the frame image identifier more efficiently, the identifier of the image frame to be displayed can be determined from the offset of the position information relative to a preset reference position. The specific process of determining the frame image identifier corresponding to the position information may be as follows:
determining offset information of the position information relative to a reference position of the interactive animation display interface;
and determining a frame image identifier to be displayed, which corresponds to the offset information, according to a preset corresponding relation table of the offset information and the frame image.
Specifically, the interactive animation display interface is the interface on a browser page or in an application that displays the interactive animation, as shown in Fig. 2A, which is a schematic diagram of an interactive animation display interface provided in an embodiment of this application. The interface may sit at the top, at the bottom, or in the middle of the browser page or application; its position is not limited in this application. The correspondence table between offset information and frame images is a preset table establishing a one-to-one correspondence between offset information and frame image identifiers; as shown in Table 1, each piece of offset information corresponds to one frame image and each frame image corresponds to one frame image identifier, so each piece of offset information corresponds to one frame image identifier. The reference position is the position of a reference point preset on the interactive animation display interface for measuring position information; the reference point can be any point on the interface, preferably its center point or a boundary point.
Table 1 Correspondence table between offset information and frame images
Offset information     | X1             | X2             | X3             | X4             | X5
Frame image            | Frame image x1 | Frame image x2 | Frame image x3 | Frame image x4 | Frame image x5
Frame image identifier | x1             | x2             | x3             | x4             | x5
In practical applications, the offset of the position information relative to the reference position is computed from the position information and the reference position; once the offset information is obtained, the frame image identifier corresponding to it, i.e. the identifier of the frame image to be displayed, is looked up in the correspondence table between offset information and frame images.
For example, if the reference position is the center of the interactive animation display interface, the position information is a boundary point of the interface, and the offset between that boundary point and the center is X3, then according to Table 1 the frame image identifier to be displayed is x3.
Further, in practice the offset information in the correspondence table may be a range, as shown in Table 2. If the offset of the position information relative to the reference position of the interactive animation display interface is X1.2, then, since X1.2 falls within the range [X1, X2), the frame image identifier corresponding to offset X1.2 is x2.
Table 2 Another correspondence table between offset information and frame images
Offset information     | [X0, X1)       | [X1, X2)       | [X2, X3)       | [X3, X4)       | [X4, X5]
Frame image            | Frame image x1 | Frame image x2 | Frame image x3 | Frame image x4 | Frame image x5
Frame image identifier | x1             | x2             | x3             | x4             | x5
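A minimal sketch of this range lookup, assuming the correspondence table is held as sorted offset ranges with their frame image identifiers (all names and boundary values are illustrative):

```typescript
// One row of the offset-to-frame-image correspondence table (Table 2):
// offsets in [lower, upper) map to the given frame image identifier.
interface OffsetRange {
  lower: number;
  upper: number;
  frameId: string;
}

// Illustrative table corresponding to [X0-X1) -> x1, [X1-X2) -> x2, ...
const offsetTable: OffsetRange[] = [
  { lower: 0, upper: 100, frameId: "x1" },
  { lower: 100, upper: 200, frameId: "x2" },
  { lower: 200, upper: 300, frameId: "x3" },
];

// Find the frame image identifier whose offset range contains the given offset.
function frameIdForOffset(offset: number): string | undefined {
  return offsetTable.find((r) => offset >= r.lower && offset < r.upper)?.frameId;
}

// Example: an offset of 120 falls in [100, 200) and therefore maps to "x2".
```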
To determine the frame image identifier more efficiently, the position offset information can also be computed from the start and end positions contained in the position information, and the identifier of the image frame to be displayed determined from that position offset information. That is, when the position information includes start position information and end position information, the specific process of determining the frame image identifier corresponding to the position information may be as follows:
determining position offset information according to the starting position information and the ending position information;
and determining the frame image identification to be displayed corresponding to the position offset information.
Specifically, the start position information is the information about the position at which the user's mouse, keyboard, or touch point enters the interactive animation display interface; the end position information is the information about the position at which the user's mouse, keyboard, or touch operation on the interactive animation display interface ends.
In practical applications, when a user interacts with the interactive animation display interface there are a start position and an end position on the interface, and the position offset of the mouse, keyboard, or touch point during the user's operation can be determined locally from the start and end position information, after which the frame image identifier to be displayed is determined from that position offset information.
The position offset information determined from the start and end position information may be the offset of the end position relative to the start position, that is, the position offset information includes a position offset amount. To determine the frame image identifier even more efficiently, the identifier to be displayed can be determined from the position offset and the width of the interactive animation display interface. The specific process of determining the frame image identifier corresponding to the position offset information may therefore be as follows:
calculating the ratio of the position offset to the width of the interactive animation display interface;
and determining the frame image identifier to be displayed corresponding to the ratio according to a preset correspondence table between ratios and frame images.
Specifically, the interactive animation display interface is the interface on a browser page or in an application that displays the interactive animation, as shown in Fig. 2A; it may sit at the top, at the bottom, or in the middle of the page or application. The correspondence table between ratios and frame images is a preset table establishing a one-to-one correspondence between ratios and frame image identifiers; as shown in Table 3, each ratio corresponds to one frame image and each frame image corresponds to one frame image identifier, so each ratio corresponds to one frame image identifier.
Table 3 Correspondence table between ratios and frame images
Ratio                  | Y1             | Y2             | Y3             | Y4             | Y5
Frame image            | Frame image y1 | Frame image y2 | Frame image y3 | Frame image y4 | Frame image y5
Frame image identifier | y1             | y2             | y3             | y4             | y5
In practical applications, after the position offset has been determined, the width of the interactive animation display interface is obtained, the ratio of the position offset to that width is calculated, and the frame image identifier corresponding to the ratio, i.e. the frame image identifier to be displayed, is looked up in the correspondence table between ratios and frame images.
For example, if the position offset is M1, the width of the interactive animation display interface is M2, and the ratio of the position offset to the width is Y5, then according to Table 3 the frame image identifier to be displayed is y5.
Further, in practice the ratio in the correspondence table may be a range, as shown in Table 4. For example, if the ratio of the position offset to the width of the interactive animation display interface is Y4.5, then, since Y4.5 falls within the range [Y4, Y5], the frame image identifier corresponding to ratio Y4.5 is y5.
Table 4 Another correspondence table between ratios and frame images
Ratio                  | [Y0, Y1)       | [Y1, Y2)       | [Y2, Y3)       | [Y3, Y4)       | [Y4, Y5]
Frame image            | Frame image y1 | Frame image y2 | Frame image y3 | Frame image y4 | Frame image y5
Frame image identifier | y1             | y2             | y3             | y4             | y5
It should be noted that in this application the frame image identifier may also be determined by mapping the position offset to a percentage of the browser or application width. This mapping can be freely configured by the designer: it may be linear, or it may be nonlinear, for example expressed with a Bezier curve. When the mapping is nonlinear, the position offset is not proportional to the playback-time distance, within the video file, between the current frame image and the frame image to be displayed; for example, moving the mouse forward by 100 pixels may correspond to a playback-time distance of 1 second between the current frame image and the frame image to be displayed, while moving the mouse backward by 100 pixels may correspond to a playback-time distance of only 0.5 second.
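A minimal sketch of mapping a pointer offset to a frame index through a configurable percentage mapping; the easing curve stands in for the Bezier-style nonlinear mapping mentioned above, and all names are illustrative:

```typescript
// Map a position offset (in pixels) to the index of the frame image to display.
// `mapping` turns the linear percentage (0..1) into an animation progress value,
// e.g. the identity for a linear mapping or an easing curve for a nonlinear one.
function frameIndexForOffset(
  offsetPx: number,
  interfaceWidthPx: number,
  frameCount: number,
  mapping: (t: number) => number = (t) => t,   // linear by default
): number {
  const ratio = Math.min(Math.max(offsetPx / interfaceWidthPx, 0), 1); // clamp to [0, 1]
  const progress = mapping(ratio);
  return Math.min(Math.round(progress * (frameCount - 1)), frameCount - 1);
}

// Illustrative nonlinear mapping (smoothstep-style easing standing in for a Bezier curve).
const easeInOut = (t: number) => t * t * (3 - 2 * t);

// Usage: with 250 stored frames and a 500 px wide interface, a 100 px offset
// maps to different frame indices under the two mappings.
// frameIndexForOffset(100, 500, 250);             // linear
// frameIndexForOffset(100, 500, 250, easeInOut);  // nonlinear
```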
Step 108: and acquiring a target frame image corresponding to the frame image identifier from the local storage, and displaying the target frame image on an interactive animation display interface.
Once the frame image identifier to be displayed has been determined according to the interactive animation display request, the target frame image corresponding to that identifier is retrieved from local storage and displayed on the interactive animation display interface.
Specifically, the target frame image is the frame image that currently needs to be displayed on the interactive animation display interface. The interactive animation display interface is the interface on a browser page or in an application that displays the interactive animation, as shown in Fig. 2A; it may sit at the top, at the bottom, or in the middle of the page or application.
In practical applications, after the frame image identifier has been determined, the corresponding frame image is looked up in local storage according to that identifier, and the matched target frame image is rendered on the interactive animation display interface so that it can be shown to the user. In other words, the current frame image displayed on the interface is replaced with the target frame image. As shown in Fig. 2B, a schematic diagram of an interactive animation display effect provided in an embodiment of this application: the current frame image shows a lotus bud, the target frame image shows the lotus in full bloom, and replacing the current frame image with the target frame image looks just like an animation of the lotus going from bud to bloom.
In one or more implementations of this embodiment, the interactive animation display request may be a cover animation display request, and the corresponding target frame image is a cover frame image. After the received cover animation display request has been parsed and the identifier of the cover frame image to be displayed obtained, acquiring the target frame image corresponding to the frame image identifier from local storage and displaying it on the interactive animation display interface may consist of: acquiring the cover frame image corresponding to the cover frame image identifier from local storage and displaying it on the interactive animation display interface.
In practical applications, when the interactive animation display request is a cover animation display request, the request is parsed to obtain the identifier of the cover frame image to be displayed; the cover frame image corresponding to that identifier is then obtained from local storage and displayed on the interactive animation display interface. As shown in Fig. 2C, a schematic diagram of a cover animation display effect according to an embodiment of this application: when a user opens a browser page or application, the cover frame image is displayed on its interactive animation display interface. This improves the efficiency of displaying the cover frame image when the user opens the page or application, avoids the interactive animation display interface being blank (i.e. showing no frame image) at that moment, and thereby improves the user experience.
To improve the fluency of the interactive animation effect, the frame image following the target frame image can be fetched after the target frame image has been displayed, so that it can be shown quickly when interactive animation display requests keep arriving. That is, after the interactive animation display interface has displayed the target frame image, the next frame image after the target frame image is fetched from local storage; and, when consecutive interactive animation display requests are received, that next frame image is displayed on the interface in response to the request that follows the current one.
In practice, the user may issue a plurality of consecutive interactive animation display requests, for example, the mouse slides leftwards on the interactive animation display interface and then continues to slide leftwards. In order to improve the processing efficiency, the next frame image of the target frame image can be acquired in advance, and when the next interactive animation display request occurs, the next frame image is displayed on the interactive animation display interface.
In addition, the browser can use the requestAnimationFrame method to schedule the next frame of the animation before each target frame image is displayed, so that the animation remains smooth. It should be noted that if displaying a certain frame image takes longer than the browser's frame interval, that display can be skipped once, which protects the interactive experience and avoids a stuttering feel. For example, if the browser's refresh rate is 60 fps, the maximum time available to display one frame image is about 16 milliseconds; if displaying a certain frame image takes more than 17 milliseconds, it cannot be shown in time on the browser's interactive animation display interface and is skipped the next time the interface would use it.
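A minimal sketch of this rendering step, assuming the frames extracted earlier are held in memory as ImageData, drawn onto a visible canvas with putImageData, and scheduled with requestAnimationFrame; the frame-skipping check and all names are illustrative:

```typescript
// Render the requested frame on the interactive animation display interface,
// scheduling the draw with requestAnimationFrame and skipping one display if
// the previous draw exceeded the browser's frame budget (~16 ms at 60 fps).
function createRenderer(frames: ImageData[], ctx: CanvasRenderingContext2D) {
  let pendingIndex = 0;
  let skipNext = false;

  function draw() {
    if (skipNext) {            // drop this display once to stay responsive
      skipNext = false;
      return;
    }
    const start = performance.now();
    ctx.putImageData(frames[pendingIndex], 0, 0);   // show the target frame image
    if (performance.now() - start > 16) {
      skipNext = true;                               // this draw was too slow; skip the next one
    }
  }

  return {
    // Called for each interactive animation display request with the frame index to show.
    showFrame(index: number) {
      pendingIndex = Math.min(Math.max(index, 0), frames.length - 1);
      requestAnimationFrame(draw);
    },
  };
}
```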
It should be noted that receiving interactive animation display requests through the interactive animation display interface covers only the case in which the user is interacting with the application. When the interactive animation display request ends, the interface displays a preset frame image; that is, when the end of the interactive animation display request is detected, the preset frame image is displayed on the interactive animation display interface.
Specifically, the preset frame image may be the cover frame image, the target frame image, or any other one of the stored frame images; the designer configures which of the frame images serves as the preset frame image.
In practical applications, when the user moves the mouse or keyboard focus off the interactive animation display interface, or lifts the touch point off it, the interactive animation display request ends and the interface displays the preset frame image.
Fig. 2D is a schematic diagram illustrating a preset frame image display effect according to an embodiment of the present application. As shown in fig. 2D, when the user operates the mouse to leave the interactive animation display interface in the browser page or the application program, the interactive animation display interface will display a preset frame image.
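A minimal sketch of wiring the interface events to the pieces above, assuming the hypothetical helpers from the earlier sketches (frameIndexForOffset, createRenderer) and a configurable preset frame index; the event names are standard DOM mouse events and everything else is illustrative:

```typescript
// Wire mouse interaction on the interactive animation display interface:
// moving the mouse selects a frame by horizontal offset, and leaving the
// interface falls back to a designer-configured preset frame image.
function attachInteraction(
  surface: HTMLElement,
  renderer: { showFrame(index: number): void },
  frameCount: number,
  presetFrameIndex = 0,           // e.g. the cover frame image
) {
  surface.addEventListener("mousemove", (e) => {
    const rect = surface.getBoundingClientRect();
    const index = frameIndexForOffset(e.clientX - rect.left, rect.width, frameCount);
    renderer.showFrame(index);    // interactive animation display request
  });

  surface.addEventListener("mouseleave", () => {
    renderer.showFrame(presetFrameIndex);   // request ended: show the preset frame image
  });
}
```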
According to the interactive animation display method, an interactive animation identifier is received and the video file corresponding to that identifier is obtained; frame images are acquired from the video file at a preset acquisition frequency and stored locally; an interactive animation display request is received and the identifier of the frame image to be displayed is determined according to that request; and the target frame image corresponding to the frame image identifier is acquired from local storage and displayed on an interactive animation display interface. Acquiring the video file according to the interactive animation identifier ensures that the correct video, and therefore the correct interactive animation, is obtained. Acquiring the frame images from the video file at the preset acquisition frequency and storing them locally avoids acquiring an excessive number of frame images from an overly large video file and improves network transmission efficiency to a certain extent. In addition, because each frame image is retrieved through its frame image identifier, the frame images can be played back frame by frame in reverse or in random order, which preserves the interactive animation experience and supports animation effects with higher frame counts while reducing the requirements on the browser or application version and on computer performance.
The following further describes the interactive animation display method with reference to Fig. 2E, taking its application in a browser as an example. Fig. 2E shows a process flow chart of the interactive animation display method applied to a browser according to an embodiment of this application, which includes the following steps:
step 202: an interactive animation identification is received.
Step 204: and generating a video file downloading request according to the interactive animation identifier.
Step 206: and sending the video file downloading request to a server.
Step 208: and receiving the video file which is sent by the server and corresponds to the interactive animation identifier.
Step 210: and creating a video preview environment according to the video file.
Step 212: and setting an acquisition frequency for the video preview environment.
The frame rate may be set for the video preview environment first, and then the acquisition frequency of the video preview environment may be determined according to the frame rate.
Step 214: The video file is previewed in the video preview environment according to the preset acquisition frequency to obtain the data of each frame.
Step 216: Starting from the first frame of data, the current frame image is generated from the frame data.
Step 218: The current frame image is stored locally.
Step 220: It is determined whether there is a next frame of data.
If so, return to step 216; otherwise, proceed to step 222.
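A minimal sketch of steps 210 to 220, assuming that a hidden <video> element acts as the video preview environment, that frames are sampled by seeking at an interval derived from the configured frame rate, and that each sampled frame is kept in memory as a data URL; the function name, the canvas-based capture and the in-memory storage are illustrative choices rather than requirements of the present application:

```typescript
// Sketch: create a video preview environment from the downloaded video file,
// sample one frame every 1/frameRate seconds of video time, and store the
// frame images locally (here, an in-memory array; IndexedDB would also work).
async function extractFrames(videoBlob: Blob, frameRate: number): Promise<string[]> {
  const video = document.createElement("video"); // the "video preview environment"
  video.src = URL.createObjectURL(videoBlob);
  video.muted = true;
  await new Promise<void>((resolve) =>
    video.addEventListener("loadedmetadata", () => resolve(), { once: true })
  );

  const canvas = document.createElement("canvas");
  canvas.width = video.videoWidth;
  canvas.height = video.videoHeight;
  const ctx = canvas.getContext("2d")!;

  const step = 1 / frameRate; // acquisition frequency derived from the frame rate
  const frames: string[] = [];

  for (let t = 0; t < video.duration; t += step) {
    // Seek to the sampling point and wait for the frame data to become available.
    video.currentTime = t;
    await new Promise<void>((resolve) =>
      video.addEventListener("seeked", () => resolve(), { once: true })
    );
    // Generate the current frame image from the frame data and store it locally.
    ctx.drawImage(video, 0, 0);
    frames.push(canvas.toDataURL("image/jpeg"));
  }

  URL.revokeObjectURL(video.src);
  return frames;
}
```

Keeping the frames as data URLs is the simplest option for a sketch; for larger videos or higher frame rates, storing Blob objects in IndexedDB would reduce memory pressure.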
Step 222: A cover animation display request is received.
Step 224: The cover animation display request is parsed to obtain the cover frame image identifier to be displayed.
Step 226: The cover frame image corresponding to the cover frame image identifier is obtained from local storage.
Step 228: The cover frame image is displayed on the interactive animation display interface.
Step 230: An interactive animation display request is received, where the interactive animation display request carries position information.
Step 232: The frame image identifier to be displayed corresponding to the position information is determined.
In one or more implementations of this embodiment, offset information of the position information relative to a reference position of the interactive animation display interface may be determined, and the frame image identifier to be displayed that corresponds to the offset information may then be determined according to a preset correspondence table of offset information and frame images.
In one or more implementations of this embodiment, the position information includes start position information and end position information. On this basis, position offset information is determined according to the start position information and the end position information, and the frame image identifier to be displayed corresponding to the position offset information is then determined.
Further, the position offset information includes a position offset amount. The frame image identifier to be displayed corresponding to the position offset information may be determined as follows: calculate the ratio of the position offset to the width of the interactive animation display interface, and then determine the frame image identifier corresponding to that ratio according to a preset correspondence table of ratios and frame images.
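A small sketch of this ratio-based lookup, under the assumption that the preset ratio and frame image correspondence table is simply a uniform linear mapping from the ratio onto the indices of the locally stored frames; the function name, the linear mapping and the clamping to the valid index range are illustrative choices:

```typescript
// Map the position offset carried by an interactive animation display request
// to the identifier (here, the index) of the frame image to be displayed.
function frameIndexFromOffset(
  startX: number,         // start position information
  endX: number,           // end position information
  interfaceWidth: number, // width of the interactive animation display interface
  frameCount: number      // number of locally stored frame images
): number {
  const offset = endX - startX;           // position offset
  const ratio = offset / interfaceWidth;  // ratio of offset to interface width
  // The "correspondence table" is assumed to be the linear mapping
  // ratio -> round(ratio * (frameCount - 1)), clamped to the valid range.
  const index = Math.round(ratio * (frameCount - 1));
  return Math.min(Math.max(index, 0), frameCount - 1);
}
```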
Step 234: The target frame image corresponding to the frame image identifier is obtained from local storage.
Step 236: The target frame image is displayed on the interactive animation display interface.
Step 238: The frame image following the target frame image is obtained from local storage.
Step 240: When continuous interactive animation display requests are received, the next frame image is displayed on the interactive animation display interface in response to the interactive animation display request that follows the current one.
Step 242: When the end of the interactive animation display request is detected, a preset frame image is displayed on the interactive animation display interface.
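As an end-to-end illustration of steps 230 to 242 in a browser, the following sketch treats mousemove events over the interface as continuous interactive animation display requests and mouseleave as the end of the request; the choice of the first stored frame as the preset frame and the uniform ratio-to-frame mapping (the same assumption as in the previous sketch) are made purely for illustration:

```typescript
// Wire pointer events to the locally stored frame images: every mousemove is
// treated as the next display request, and mouseleave restores a preset frame.
function bindInteraction(
  container: HTMLElement,  // the interactive animation display interface
  imgEl: HTMLImageElement, // element used to display the current frame
  frames: string[]         // locally stored frame images (data URLs)
): void {
  const presetFrame = frames[0]; // assumed preset frame shown when interaction ends
  let startX: number | null = null;

  container.addEventListener("mouseenter", (e: MouseEvent) => {
    startX = e.clientX; // record the start position information
  });

  container.addEventListener("mousemove", (e: MouseEvent) => {
    if (startX === null) return;
    // The ratio of the position offset to the interface width selects the frame,
    // mirroring the mapping assumed in the previous sketch.
    const ratio = (e.clientX - startX) / container.clientWidth;
    const index = Math.min(
      Math.max(Math.round(ratio * (frames.length - 1)), 0),
      frames.length - 1
    );
    imgEl.src = frames[index];
  });

  container.addEventListener("mouseleave", () => {
    // The interactive animation display request has ended: show the preset frame.
    startX = null;
    imgEl.src = presetFrame;
  });
}
```

Because every frame is already held locally, moving the pointer backwards simply selects an earlier index, which is how frame-by-frame reverse playback and arbitrary-frame playback fall out of this scheme.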
According to the interactive animation display method applied to the browser, an interactive animation identifier is received and the video file corresponding to the interactive animation identifier is obtained; frame images in the video file are then acquired according to a preset acquisition frequency and stored locally; an interactive animation display request is then received, and the frame image identifier to be displayed is determined according to the request; finally, the target frame image corresponding to the frame image identifier is obtained from local storage and displayed on the interactive animation display interface. Because the video file is obtained according to the interactive animation identifier, the correct video is acquired and the accuracy of the interactive animation is ensured. Because the frame images are acquired at a preset frequency and stored locally, an oversized video file does not produce an excessive number of frame images, which improves network transmission efficiency to a certain extent. In addition, since each frame image is retrieved by its frame image identifier, the frame images can be played in reverse frame by frame or in arbitrary frame order, which guarantees the interactive animation experience and supports animation effects with higher frame counts, while lowering the requirements on the browser version and the performance of the computer.
Corresponding to the method embodiment, the present application further provides an embodiment of an interactive animation display device, and fig. 3 shows a schematic structural diagram of an interactive animation display device according to an embodiment of the present application. As shown in fig. 3, the apparatus includes:
a receiving module 302, configured to receive an interactive animation identifier, and obtain a video file corresponding to the interactive animation identifier;
the storage module 304 is configured to acquire frame images in the video file according to a preset acquisition frequency and locally store the frame images;
a determining module 306 configured to receive an interactive animation display request, and determine a frame image identifier to be displayed according to the interactive animation display request;
and the display module 308 is configured to acquire a target frame image corresponding to the frame image identifier from the local storage, and display the target frame image on an interactive animation display interface.
In one or more implementations of this embodiment, the interactive animation display request is a cover animation display request, and the target frame image is a cover frame image;
the determining module 306 is further configured to:
receiving a cover animation display request, and analyzing the cover animation display request to obtain a cover frame image identifier to be displayed;
Further, the display module 308 is further configured to:
and acquiring a cover frame image corresponding to the cover frame image identifier from the local storage, and displaying the cover frame image on the interactive animation display interface.
In one or more implementations of the present embodiment, the determining module 306 is further configured to:
receiving an interactive animation display request, wherein the interactive animation display request carries position information;
and determining the frame image identification to be displayed corresponding to the position information.
In one or more implementations of the present embodiment, the determining module 306 is further configured to:
determining offset information of the position information relative to a reference position of the interactive animation display interface;
and determining a frame image identifier to be displayed, which corresponds to the offset information, according to a preset corresponding relation table of the offset information and the frame image.
In one or more implementations of the present embodiment, the location information includes start location information and end location information;
the determining module 306 is further configured to:
determining position offset information according to the starting position information and the ending position information;
and determining the frame image identification to be displayed corresponding to the position offset information.
In one or more implementations of the present embodiment, the positional offset information includes a positional offset amount;
the determining module 306 is further configured to:
calculating the ratio of the position offset to the width of the interactive animation display interface;
and determining the frame image identification to be displayed corresponding to the offset information according to a preset ratio and frame image corresponding relation table.
In one or more implementations of the present embodiment, the apparatus further includes a creation module configured to:
creating a video preview environment according to the video file, and setting a frame rate for the video preview environment;
and determining the acquisition frequency of the video preview environment according to the frame rate.
In one or more implementations of the present embodiment, the storage module 304 is further configured to:
previewing the video file in the video preview environment according to a preset acquisition frequency to obtain each frame of data;
generating a current frame image according to each frame data;
and locally storing the current frame image.
In one or more implementations of the present embodiment, the receiving module 302 is further configured to:
generating a video file downloading request according to the interactive animation identifier;
sending the video file downloading request to a server;
and receiving the video file which is sent by the server and corresponds to the interactive animation identifier.
In one or more implementations of the present embodiment, the apparatus further includes an acquisition module configured to:
acquiring a next frame image of the target frame image in the local storage;
and under the condition that continuous interactive animation display requests are received, responding to a next interactive animation display request of the current interactive animation display request, and displaying the next frame image on the interactive animation display interface.
In one or more implementations of the present embodiment, the apparatus further includes a detection module configured to:
and under the condition that the end of the interactive animation display request is detected, displaying a preset frame image on the interactive animation display interface.
The interactive animation display device receives an interactive animation identifier and obtains the video file corresponding to the interactive animation identifier; acquires frame images in the video file according to a preset acquisition frequency and stores them locally; receives an interactive animation display request and determines the frame image identifier to be displayed according to the request; and obtains the target frame image corresponding to the frame image identifier from local storage and displays it on the interactive animation display interface. Because the video file is obtained according to the interactive animation identifier, the correct video is acquired and the accuracy of the interactive animation is ensured. Because the frame images are acquired at a preset frequency and stored locally, an oversized video file does not produce an excessive number of frame images, which improves network transmission efficiency to a certain extent. In addition, since each frame image is retrieved by its frame image identifier, the frame images can be played in reverse frame by frame or in arbitrary frame order, which guarantees the interactive animation experience and supports animation effects with higher frame counts, while lowering the requirements on the browser or application program version and on the performance of the computer.
The above is a schematic solution of the interactive animation display device of this embodiment. It should be noted that the technical solution of the interactive animation display device and the technical solution of the interactive animation display method belong to the same concept; for details of the technical solution of the interactive animation display device that are not described in detail, reference may be made to the description of the technical solution of the interactive animation display method.
Fig. 4 illustrates a block diagram of a computing device 400 provided according to an embodiment of the present application. The components of the computing device 400 include, but are not limited to, a memory 410 and a processor 420. The processor 420 is coupled to the memory 410 via a bus 430, and a database 450 is used to store data.
Computing device 400 also includes an access device 440 that enables computing device 400 to communicate via one or more networks 460. Examples of such networks include the Public Switched Telephone Network (PSTN), a Local Area Network (LAN), a Wide Area Network (WAN), a Personal Area Network (PAN), or a combination of communication networks such as the Internet. The access device 440 may include one or more of any type of wired or wireless network interface (e.g., a Network Interface Card (NIC)), such as an IEEE 802.11 Wireless Local Area Network (WLAN) interface, a Worldwide Interoperability for Microwave Access (WiMAX) interface, an Ethernet interface, a Universal Serial Bus (USB) interface, a cellular network interface, a Bluetooth interface, a Near Field Communication (NFC) interface, and so forth.
In one embodiment of the present application, the above-described components of computing device 400, as well as other components not shown in FIG. 4, may also be connected to each other, such as by a bus. It should be understood that the block diagram of the computing device illustrated in FIG. 4 is for exemplary purposes only and is not intended to limit the scope of the present application. Those skilled in the art may add or replace other components as desired.
Computing device 400 may be any type of stationary or mobile computing device, including a mobile computer or mobile computing device (e.g., tablet, personal digital assistant, laptop, notebook, netbook, etc.), mobile phone (e.g., smart phone), wearable computing device (e.g., smart watch, smart glasses, etc.), or other type of mobile device, or a stationary computing device such as a desktop computer or PC. Computing device 400 may also be a mobile or stationary server.
The processor 420 implements the steps of the interactive animation display method when executing computer instructions.
The foregoing is a schematic illustration of a computing device of this embodiment. It should be noted that the technical solution of the computing device and the technical solution of the interactive animation display method belong to the same concept; for details of the technical solution of the computing device that are not described in detail, reference may be made to the description of the technical solution of the interactive animation display method.
An embodiment of the present application also provides a computer-readable storage medium storing computer instructions that, when executed by a processor, implement the steps of the interactive animation presentation method as described above.
The above is an exemplary scheme of a computer-readable storage medium of this embodiment. It should be noted that the technical solution of the storage medium and the technical solution of the interactive animation display method belong to the same concept; for details of the technical solution of the storage medium that are not described in detail, reference may be made to the description of the technical solution of the interactive animation display method.
The foregoing describes specific embodiments of the present application. Other embodiments are within the scope of the following claims. In some cases, the actions or steps recited in the claims can be performed in a different order than in the embodiments and still achieve desirable results. In addition, the processes depicted in the accompanying figures do not necessarily require the particular order shown, or sequential order, to achieve desirable results. In some embodiments, multitasking and parallel processing are also possible or may be advantageous.
The computer instructions include computer program code, which may be in source code form, object code form, an executable file, some intermediate form, or the like. The computer readable medium may include: any entity or device capable of carrying the computer program code, a recording medium, a USB flash drive, a removable hard disk, a magnetic disk, an optical disc, a computer memory, a Read-Only Memory (ROM), a Random Access Memory (RAM), an electrical carrier signal, a telecommunication signal, a software distribution medium, and so forth. It should be noted that the content contained in the computer readable medium may be adjusted as appropriate according to the requirements of legislation and patent practice in each jurisdiction; for example, in certain jurisdictions, in accordance with legislation and patent practice, the computer readable medium does not include electrical carrier signals and telecommunication signals.
It should be noted that, for simplicity of description, the foregoing method embodiments are all expressed as a series of combinations of actions, but those skilled in the art should understand that the present application is not limited by the order of actions described, as some steps may be performed in another order or simultaneously in accordance with the present application. Further, those skilled in the art will also appreciate that the embodiments described in the specification are all preferred embodiments, and that the actions and modules involved are not necessarily required by the present application.
In the foregoing embodiments, the descriptions of the embodiments are emphasized, and for parts of one embodiment that are not described in detail, reference may be made to the related descriptions of other embodiments.
The above-disclosed preferred embodiments of the present application are provided only as an aid to the elucidation of the present application. Alternative embodiments are not intended to be exhaustive or to limit the invention to the precise form disclosed. Obviously, many modifications and variations are possible in light of the teaching of this application. The embodiments were chosen and described in order to best explain the principles of the invention and the practical application, to thereby enable others skilled in the art to best understand and utilize the invention. This application is to be limited only by the claims and the full scope and equivalents thereof.

Claims (12)

1. An interactive animation display method, comprising:
receiving an interactive animation identifier, and obtaining a video file corresponding to the interactive animation identifier;
acquiring frame images in the video file according to a preset acquisition frequency, and locally storing the frame images, wherein the acquisition frequency refers to the frequency of acquiring the frame images;
receiving an interactive animation display request, wherein the interactive animation display request carries position information, and the position information refers to information of a position pointed by a mouse, a keyboard or a touch point when a user performs interactive animation display by using the mouse, the keyboard or touch;
determining offset information of the position information relative to a reference position of the interactive animation display interface;
determining a frame image identifier to be displayed corresponding to the offset information according to a preset offset information and frame image corresponding relation table, wherein the offset information and frame image corresponding relation table is a table which is preset and is used for determining one-to-one correspondence between the offset information and the frame image identifier;
and acquiring a target frame image corresponding to the frame image identifier from the local storage, and displaying the target frame image on an interactive animation display interface.
2. The method of claim 1, wherein the interactive animation display request is a cover animation display request and the target frame image is a cover frame image;
the receiving the interactive animation display request, determining the frame image identification to be displayed according to the interactive animation display request, comprises the following steps:
receiving a cover animation display request, and analyzing the cover animation display request to obtain a cover frame image identifier to be displayed;
the step of obtaining the target frame image corresponding to the frame image identification from the local storage and displaying the target frame image on an interactive animation display interface comprises the following steps:
and acquiring a cover frame image corresponding to the cover frame image identifier from the local storage, and displaying the cover frame image on the interactive animation display interface.
3. The method of claim 1, wherein the location information comprises start location information and end location information;
the determining the frame image identifier to be displayed corresponding to the position information comprises the following steps:
determining position offset information according to the starting position information and the ending position information;
and determining the frame image identification to be displayed corresponding to the position offset information.
4. A method according to claim 3, wherein the positional offset information comprises a positional offset;
the determining the frame image identifier to be displayed corresponding to the position offset information comprises the following steps:
calculating the ratio of the position offset to the width of the interactive animation display interface;
and determining the frame image identification to be displayed corresponding to the offset information according to a preset ratio and frame image corresponding relation table.
5. The method according to claim 1 or 2, wherein before the frame images in the video file are acquired according to a preset acquisition frequency, the method comprises:
creating a video preview environment according to the video file, and setting a frame rate for the video preview environment;
and determining the acquisition frequency of the video preview environment according to the frame rate.
6. The method of claim 5, wherein the acquiring the frame images in the video file according to the preset acquisition frequency and locally storing the frame images comprises:
previewing the video file in the video preview environment according to a preset acquisition frequency to obtain each frame of data;
generating a current frame image according to each frame data;
and locally storing the current frame image.
7. The method according to claim 1 or 2, wherein the obtaining a video file corresponding to the interactive animation identifier comprises:
generating a video file downloading request according to the interactive animation identifier;
sending the video file downloading request to a server;
and receiving the video file which is sent by the server and corresponds to the interactive animation identifier.
8. The method according to claim 1 or 2, wherein after the displaying of the target frame image by the interactive animation display interface, further comprising:
acquiring a next frame image of the target frame image in the local storage;
and under the condition that continuous interactive animation display requests are received, responding to a next interactive animation display request of the current interactive animation display request, and displaying the next frame image on the interactive animation display interface.
9. The method according to claim 1 or 2, wherein after the displaying of the target frame image by the interactive animation display interface, further comprising:
and under the condition that the end of the interactive animation display request is detected, displaying a preset frame image on the interactive animation display interface.
10. An interactive animation display device, comprising:
the receiving module is configured to receive the interactive animation identification and acquire a video file corresponding to the interactive animation identification;
the storage module is configured to acquire frame images in the video file according to a preset acquisition frequency and locally store the frame images, wherein the acquisition frequency refers to the frequency of acquiring the frame images;
the system comprises a determining module, a display module and a display module, wherein the determining module is configured to receive an interactive animation display request, the interactive animation display request carries position information, the position information refers to information of a position pointed by a mouse, a keyboard or a touch point when a user uses the mouse, the keyboard or the touch to perform interactive animation display, offset information of the position information relative to a reference position of an interactive animation display interface is determined, a frame image identifier to be displayed corresponding to the offset information is determined according to a preset offset information and frame image corresponding relation table, and the offset information and frame image corresponding relation table is a preset table for determining one-to-one correspondence of the offset information and the frame image identifier;
and the display module is configured to acquire a target frame image corresponding to the frame image identifier from the local storage, and display the target frame image on an interactive animation display interface.
11. A computing device comprising a memory, a processor, and computer instructions stored on the memory and executable on the processor, wherein the processor, when executing the computer instructions, performs the steps of the method of any one of claims 1-9.
12. A computer readable storage medium storing computer instructions which, when executed by a processor, implement the steps of the method of any one of claims 1-9.
CN202110963316.0A 2021-08-20 2021-08-20 Interactive animation display method and device Active CN113676765B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110963316.0A CN113676765B (en) 2021-08-20 2021-08-20 Interactive animation display method and device

Publications (2)

Publication Number Publication Date
CN113676765A CN113676765A (en) 2021-11-19
CN113676765B true CN113676765B (en) 2024-03-01

Family

ID=78544812

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110963316.0A Active CN113676765B (en) 2021-08-20 2021-08-20 Interactive animation display method and device

Country Status (1)

Country Link
CN (1) CN113676765B (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103957324A (en) * 2014-05-17 2014-07-30 高伟 Method and system for interaction with television programs through mobile communication terminal
WO2017189985A1 (en) * 2016-04-29 2017-11-02 Grasscrown, Inc. Methods and apparatus for providing interactive images
CN110062269A (en) * 2018-01-18 2019-07-26 腾讯科技(深圳)有限公司 Extra objects display methods, device and computer equipment
CN110679154A (en) * 2017-10-27 2020-01-10 谷歌有限责任公司 Previewing videos in response to computing device interactions
CN111417028A (en) * 2020-03-13 2020-07-14 腾讯科技(深圳)有限公司 Information processing method, information processing apparatus, storage medium, and electronic device
CN112882637A (en) * 2021-02-23 2021-06-01 上海哔哩哔哩科技有限公司 Interaction method for multi-layer animation display and browser

Also Published As

Publication number Publication date
CN113676765A (en) 2021-11-19

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant