CN112004134B - Multimedia data display method, device, equipment and storage medium - Google Patents


Info

Publication number
CN112004134B
CN112004134B (application number CN201910447817.6A)
Authority
CN
China
Prior art keywords
interactive
multimedia data
interface
target multimedia
icon
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910447817.6A
Other languages
Chinese (zh)
Other versions
CN112004134A (en)
Inventor
杜蒲敏
梁艺东
刘小龙
黄志刚
林剑城
陆旭彬
何文东
肖秉日
李莹
林承纬
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd filed Critical Tencent Technology Shenzhen Co Ltd
Priority to CN201910447817.6A priority Critical patent/CN112004134B/en
Publication of CN112004134A publication Critical patent/CN112004134A/en
Application granted granted Critical
Publication of CN112004134B publication Critical patent/CN112004134B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/431Generation of visual interfaces for content selection or interaction; Content or additional data rendering
    • H04N21/4312Generation of visual interfaces for content selection or interaction; Content or additional data rendering involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/443OS processes, e.g. booting an STB, implementing a Java virtual machine in an STB or power management in an STB
    • H04N21/4438Window management, e.g. event handling following interaction with the user interface
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47End-user applications
    • H04N21/478Supplemental services, e.g. displaying phone caller identification, shopping application
    • H04N21/4788Supplemental services, e.g. displaying phone caller identification, shopping application communicating with other users, e.g. chatting

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Human Computer Interaction (AREA)
  • Software Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Embodiments of this application disclose a method, apparatus, device, and storage medium for displaying multimedia data, belonging to the field of multimedia technology. The method includes: displaying target multimedia data based on a selection instruction for the target multimedia data, and displaying a first interactive icon on a first interface that displays the target multimedia data, the first interactive icon indicating that interaction has not been triggered; based on a trigger instruction for the first interactive icon, acquiring an interactive resource of the target multimedia data, jumping from the first interface to a second interface, and displaying a second interactive icon on the second interface, the second interactive icon indicating that interaction has been triggered, the interactive resource including one or more of an interactive video, an interactive picture, and an interactive audio; and displaying the interactive resource of the target multimedia data on the second interface. Because the interactive resource of the target multimedia data is displayed in addition to changing the first interactive icon into the second interactive icon, the interactive feedback takes diverse forms, which raises the interaction rate and improves the user's interactive experience.

Description

Multimedia data display method, device, equipment and storage medium
Technical Field
Embodiments of this application relate to the field of multimedia technology, and in particular to a method, apparatus, device, and storage medium for displaying multimedia data.
Background
With the development of the internet, ever more types of multimedia data display scenarios have appeared, such as video display scenarios, audio display scenarios, and picture display scenarios. To enhance the interactivity of multimedia data, various interaction modes are provided when the data is displayed, for example liking, forwarding, or commenting. Among these, liking is the most basic interaction mode.
In the related art, while multimedia data is being displayed, an interactive icon is shown on the display interface, and the terminal switches the icon from a first (pre-interaction) state to a second (post-interaction) state based on a trigger instruction for the icon. For example, taking the displayed multimedia data as a video and the interactive icon as a like icon: during video playback, after the terminal receives the user's trigger instruction for the like icon, it changes the like icon from transparent to red, thereby switching the icon from the first state to the second state.
In implementing the embodiments of this application, the inventors found that the related art has at least the following problem:
in the related art, the terminal merely switches the interactive icon from the first state to the second state on the display interface based on the trigger instruction, and no other interactive feedback exists. The feedback is therefore weak and takes only a single form, which leads to a low interaction rate and degrades the user's interactive experience.
Disclosure of Invention
Embodiments of this application provide a multimedia data display method, apparatus, device, and storage medium, which can solve the problems in the related art. The technical solution is as follows:
in one aspect, an embodiment of the present application provides a method for displaying multimedia data, where the method includes:
displaying target multimedia data based on a selection instruction for the target multimedia data, and displaying a first interactive icon on a first interface that displays the target multimedia data, the first interactive icon indicating that interaction has not been triggered;
based on a trigger instruction for the first interactive icon, acquiring an interactive resource of the target multimedia data, jumping from the first interface to a second interface, and displaying a second interactive icon on the second interface, the second interactive icon indicating that interaction has been triggered, the interactive resource including one or more of an interactive video, an interactive picture, and an interactive audio;
and displaying the interactive resource of the target multimedia data on the second interface.
In another aspect, a method for displaying multimedia data is also provided, the method including:
displaying target multimedia data, and displaying a first interactive icon on a first interface that displays the target multimedia data, the first interactive icon indicating that interaction has not been triggered;
when the first interactive icon is triggered, acquiring an interactive resource of the target multimedia data, jumping from the first interface to a second interface, and displaying a second interactive icon on the second interface, the second interactive icon indicating that interaction has been triggered, the interactive resource including one or more of an interactive video, an interactive picture, and an interactive audio;
and displaying the interactive resource of the target multimedia data on the second interface.
In another aspect, there is provided a presentation apparatus of multimedia data, the apparatus including:
the first display module is used for displaying target multimedia data based on a selection instruction for the target multimedia data;
the second display module is used for displaying a first interactive icon on a first interface that displays the target multimedia data, where the first interactive icon indicates that interaction has not been triggered;
the first acquisition module is used for acquiring the interactive resources of the target multimedia data based on the trigger instruction of the first interactive icon;
the jump module is used for jumping from the first interface to a second interface;
the third display module is used for displaying a second interactive icon on the second interface, the second interactive icon is used for indicating that interaction is triggered, and the interactive resources comprise one or more of interactive videos, interactive pictures and interactive audios;
and the fourth display module is used for displaying the interactive resources of the target multimedia data on the second interface.
In another aspect, an apparatus for presenting multimedia data is provided, the apparatus including:
the first display module is used for displaying the target multimedia data;
the second display module is used for displaying a first interactive icon on a first interface that displays the target multimedia data, where the first interactive icon indicates that interaction has not been triggered;
the acquisition module is used for acquiring the interactive resources of the target multimedia data after the first interactive icon is triggered;
the jump module is used for jumping from the first interface to a second interface;
the third display module is used for displaying a second interactive icon on the second interface, the second interactive icon is used for indicating that interaction is triggered, and the interactive resources comprise one or more of interactive videos, interactive pictures and interactive audios;
and the fourth display module is used for displaying the interactive resources of the target multimedia data on the second interface.
In another aspect, a computer device is provided, which includes a processor and a memory, where the memory stores at least one instruction, and the at least one instruction, when executed by the processor, implements any one of the above methods for presenting multimedia data.
In another aspect, a computer-readable storage medium is provided, where at least one instruction is stored in the computer-readable storage medium, and when executed, the at least one instruction implements any one of the above methods for presenting multimedia data.
The beneficial effects of the technical solution provided by the embodiments of this application include at least the following:
an interactive resource of the target multimedia data is acquired and displayed based on the trigger instruction for the first interactive icon, and the first interactive icon is changed into the second interactive icon. After the trigger instruction is received, in addition to changing the first interactive icon into the second interactive icon, interactive resources of the target multimedia data, such as interactive videos, interactive pictures, and interactive audio, are displayed. The interactive feedback is thus stronger and more varied in form, which raises the interaction rate and improves the user's interactive experience.
Drawings
To illustrate the technical solutions in the embodiments of this application more clearly, the drawings needed for describing the embodiments are briefly introduced below. Obviously, the drawings described below show only some embodiments of this application, and those skilled in the art can derive other drawings from them without creative effort.
FIG. 1 is a schematic diagram of an implementation environment provided by an embodiment of the present application;
FIG. 2 is a flowchart of a method for displaying multimedia data provided by an embodiment of the present application;
FIG. 3 is a schematic interface diagram showing a first interactive icon and a guide icon of the first interactive icon provided by an embodiment of the present application;
FIG. 4 is a schematic diagram of a first interactive icon provided by an embodiment of the present application;
FIG. 5 is a schematic diagram of a first interactive icon, a second interactive icon, and a third interactive icon provided by an embodiment of the present application;
FIG. 6 is a schematic diagram of a composite file provided by an embodiment of the present application;
FIG. 7 is a diagram illustrating a data structure provided by an embodiment of the present application;
FIG. 8 is a schematic interface diagram of a jump from a first interface to a second interface provided by an embodiment of the present application;
FIG. 9 is a schematic interface diagram of a return from the second interface to the first interface provided by an embodiment of the present application;
FIG. 10 is a schematic diagram illustrating a presentation process of multimedia data provided by an embodiment of the present application;
FIG. 11 is a flowchart of a method for displaying multimedia data provided by an embodiment of the present application;
FIG. 12 is a schematic structural diagram of an apparatus for displaying multimedia data provided by an embodiment of the present application;
FIG. 13 is a schematic structural diagram of an apparatus for displaying multimedia data provided by an embodiment of the present application;
FIG. 14 is a schematic structural diagram of an apparatus for displaying multimedia data provided by an embodiment of the present application;
FIG. 15 is a schematic structural diagram of an apparatus for displaying multimedia data provided by an embodiment of the present application;
FIG. 16 is a schematic structural diagram of an apparatus for displaying multimedia data provided by an embodiment of the present application;
FIG. 17 is a schematic structural diagram of a device for presenting multimedia data provided by an embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present application clearer, embodiments of the present application will be described in further detail below with reference to the accompanying drawings.
With the development of the internet, ever more types of multimedia data display scenarios have appeared, such as video display scenarios, audio display scenarios, and picture display scenarios. To enhance the interactivity of multimedia data, multiple interaction modes are provided when the data is displayed, for example liking, forwarding, or commenting. Among these, liking is the most basic interaction mode.
In view of the above, an embodiment of the present application provides a method for displaying multimedia data. Please refer to fig. 1, which illustrates a schematic diagram of an implementation environment of the method provided in the embodiments of this application. The implementation environment may include: a terminal 11 and a server 12.
The terminal 11 has installed an application program, or can access a web page, capable of displaying multimedia data, and the method provided in the embodiments of this application can be applied when a user interacts with multimedia data displayed in that application or web page of the terminal 11. The terminal 11 can acquire multimedia data and upload it to the server 12 for storage; of course, the terminal 11 can also obtain multimedia data from the server 12 and store it.
Optionally, the terminal 11 may be a smart device such as a mobile phone, a tablet computer, or a personal computer. The server 12 may be a single server, a server cluster composed of multiple servers, or a cloud computing service center. The terminal 11 establishes a communication connection with the server 12 through a wired or wireless network.
Those skilled in the art should understand that the above terminal 11 and server 12 are only examples; other existing or future terminals or servers that are applicable to the embodiments of this application should also fall within the scope of the embodiments of this application and are hereby incorporated by reference.
Based on the implementation environment shown in fig. 1, an embodiment of the present application provides a method for displaying multimedia data, described here by taking its application to a terminal as an example. As shown in fig. 2, the method provided by the embodiment of the present application may include the following steps:
in step 201, the target multimedia data is displayed based on the selected instruction of the target multimedia data, and a first interactive icon is displayed on a first interface for displaying the target multimedia data, where the first interactive icon is used to indicate that the interaction is not triggered.
Multimedia data includes, but is not limited to, video, audio, and pictures. Generally, the terminal screen may display multimedia data uploaded by other users whom the current user follows, or display recommended multimedia data in a recommendation area, where the recommendation may be determined by the server according to the display count and/or forwarding count of the multimedia data. Among the multimedia data displayed on the terminal, there may be both ordinary multimedia data without interactive resources and special multimedia data with interactive resources. In the embodiments of this application, the target multimedia data refers to any item of the special multimedia data that has interactive resources. When the user selects an item of special multimedia data with interactive resources, the terminal obtains a selection instruction for the target multimedia data.
The way the selection instruction for the target multimedia data is obtained depends on how the terminal displays multimedia data. Optionally, when the terminal displays multimedia data as a list, the user's trigger instruction for the target multimedia data serves as the selection instruction; when the terminal displays multimedia data as a multimedia data stream, the user can, through a sliding operation, place the data to be watched or listened to in the main area of the terminal screen, and the terminal treats the data placed in the main area as the target multimedia data, thereby obtaining the selection instruction. A multimedia data stream is a presentation form in which multiple items of multimedia data are presented in sequence, and the target multimedia data can be determined in the stream through the user's sliding operation.
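The stream-mode selection described above can be sketched as follows. This is a hypothetical illustration (the function and parameter names are not from the patent), assuming the stream is a vertical list of items and that the item covering the screen's vertical center counts as occupying the main area:

```python
# Hypothetical sketch: after a sliding operation, pick the feed item that
# occupies the main screen area, as in the "multimedia data stream" mode.

def pick_target(item_heights, scroll_offset, screen_height):
    """Return the index of the item covering the vertical center of the screen."""
    center = scroll_offset + screen_height / 2
    top = 0
    for i, height in enumerate(item_heights):
        if top <= center < top + height:
            return i
        top += height
    return len(item_heights) - 1  # clamp to the last item when over-scrolled

# Three full-screen items; after scrolling past the first one,
# the second item occupies the main area and becomes the target.
```

The terminal would then issue the selection instruction for the item at the returned index.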
After obtaining the selection instruction for the target multimedia data, the terminal acquires the target multimedia data from the server based on that instruction and then displays it. The interface that displays the target multimedia data serves as the first interface, which may occupy the whole terminal screen or most of it; that is, the target multimedia data may be displayed in the entire screen area or in a partial area of the screen. Optionally, the first interface is an interface in a first player, and the first player is configured to display the target multimedia data. The first player may differ according to the form of the target multimedia data: when the target multimedia data is a target video, the first player may be a video player; when it is target audio, the first player may be an audio player; when it is a target picture, the first player may be a picture player. The terminal starts the first player in advance, before displaying multimedia data, and then displays the target multimedia data on the player's interface based on the selection instruction.
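The choice of first player by media form can be sketched roughly as below; the class and function names are illustrative assumptions, not identifiers from the patent:

```python
# Illustrative sketch: select the first player according to the form of the
# target multimedia data (video, audio, or picture), as described above.

class VideoPlayer:   kind = "video"
class AudioPlayer:   kind = "audio"
class PicturePlayer: kind = "picture"

PLAYERS = {"video": VideoPlayer, "audio": AudioPlayer, "picture": PicturePlayer}

def make_first_player(media_type):
    """Instantiate the player matching the media form; reject unknown forms."""
    try:
        return PLAYERS[media_type]()
    except KeyError:
        raise ValueError(f"unsupported multimedia type: {media_type}")
```

The terminal would create the appropriate player ahead of time and render the first interface inside it.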
A first interactive icon is also displayed on the first interface that displays the target multimedia data, and the first interactive icon indicates that interaction has not been triggered. Optionally, displaying the first interactive icon on the first interface includes: displaying the first interactive icon and a guide icon of the first interactive icon on the first interface, where the guide icon guides the user toward the interactive operation on the first interactive icon.
The guide icon of the first interactive icon generally appears only the first time special multimedia data with interactive resources is shown. The guide icon can take the form of a label, a bubble, a floating layer, and so on, and can display text that guides the user to operate on the first interactive icon. For example, a guide icon is displayed above the first interactive icon as a bubble whose text reads "click for a surprise", as shown in fig. 3 (1), reminding the user that triggering the first interactive icon yields an additional surprise and heightening the user's sense of anticipation. When special multimedia data with interactive resources is shown again, the guide icon of the first interactive icon is no longer displayed, as shown in fig. 3 (2).
Optionally, a numerical value is displayed below the first interactive icon on the first interface, prompting the user with the number of interactions the target multimedia data has received. For example, the value 100 displayed below the first interactive icon in fig. 3 (1) and fig. 3 (2) means that other users interacted with the target multimedia data 100 times before the current user triggers the interactive icon. Optionally, a heart-shaped icon with a heartbeat curve is dynamically displayed in a first color on the first interface; in this case the first interactive icon is that heart-shaped icon with a heartbeat curve in the first color.
In an embodiment of this application, the first interactive icon may be a like icon. In the related art, like icons are usually ordinary heart-shaped icons; the first interactive icon here ingeniously combines a heartbeat curve with the ordinary heart shape to obtain a heart-shaped icon with a heartbeat curve, as shown in fig. 4. In addition, while like icons in the related art are generally static, the first interactive icon in this embodiment is dynamic. In effect, dynamically presenting a heart-shaped icon with a heartbeat curve conveys the notion of a heartbeat and gives the user a feeling of an accelerating heartbeat. Dynamically displaying this icon emphasizes that the multimedia data carries interactive resources, which attracts more users and raises the interaction rate between users and multimedia data. Optionally, the first color may be white, and the first interactive icon may be as shown in fig. 5 (1).
In step 202, based on a trigger instruction for the first interactive icon, an interactive resource of the target multimedia data is acquired, the first interface jumps to a second interface, and a second interactive icon is displayed on the second interface. The second interactive icon indicates that interaction has been triggered, and the interactive resource includes one or more of an interactive video, an interactive picture, and an interactive audio.
The first interactive icon displayed on the first interface can be triggered by the user. Optionally, when the terminal is a mobile terminal such as a mobile phone or tablet computer, the user may press the first interactive icon on the first interface, with this press serving as the trigger operation for the first interactive icon; when the terminal is a desktop or laptop computer, the user can complete the trigger operation through an external input device, for example by clicking the first interactive icon with a mouse or by entering a shortcut key on a keyboard. Based on the user's trigger operation on the first interactive icon, the terminal obtains the trigger instruction for the first interactive icon.
Based on the trigger instruction for the first interactive icon, the interactive resource of the target multimedia data is acquired, the interactive resource including one or more of an interactive video, an interactive picture, and an interactive audio. Optionally, upon the trigger instruction, the display time of the target multimedia data at the moment the trigger instruction is obtained is detected, and the interactive resource is acquired only when that display time does not exceed a first threshold, where the first threshold indicates the valid time period for acquiring the interactive resource of the target multimedia data.
Specifically, before the trigger instruction for the first interactive icon is obtained, a first threshold indicating the valid period for acquiring the interactive resource of the target multimedia data is set. When the trigger instruction is obtained, the display time of the target multimedia data at that moment is detected. If that display time exceeds the first threshold, the operation of acquiring the interactive resource is not executed even though the trigger instruction has been obtained; if it does not exceed the first threshold, the operation of acquiring the interactive resource is executed.
Setting the first threshold prompts the user to interact with interesting multimedia data as soon as possible, raising the interaction rate between the user and multimedia data. Optionally, the first threshold may be determined from the total duration of the target multimedia data. For example, the first threshold is set to 4/5 of the total duration: if the display time of the target multimedia data exceeds 4/5 of the total duration when the trigger instruction is obtained, the interactive resource is not acquired; if it does not exceed 4/5 of the total duration, the interactive resource is acquired.
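The optional validity check can be sketched as follows, using the 4/5 ratio from the example above; the function and parameter names are illustrative, not from the patent:

```python
# Sketch of the optional first-threshold check: fetch the interactive
# resource only while the display time is within the valid period
# (first threshold = ratio * total duration of the target multimedia data).

def should_fetch_interactive_resource(display_time, total_duration, ratio=4/5):
    """True if the trigger arrived within the valid period for fetching."""
    first_threshold = ratio * total_duration
    return display_time <= first_threshold

# For a 60 s video the threshold is 48 s: a like at 30 s still fetches the
# resource, while a like at 50 s does not.
```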
The forms in which the target multimedia data and its interactive resource exist include, but are not limited to, the following two:
The first way: the target multimedia data and its interactive resource reside in one composite file.
In this way, the process of displaying the target multimedia data is as follows: acquiring the composite file based on the selection instruction for the target multimedia data; determining the start time and the end time of the target multimedia data in the composite file; and displaying the target multimedia data based on those start and end times. The process of acquiring the interactive resource of the target multimedia data is as follows: determining the start time and the end time of the interactive resource in the composite file; and acquiring the interactive resource based on those start and end times.
Since the composite file consists of the target multimedia data and its interactive resource, as shown in fig. 6, at least four time nodes exist in it: the start time of the target multimedia data, the end time of the target multimedia data, the start time of the interactive resource, and the end time of the interactive resource. Blank periods may also exist in the composite file. Within the composite file, these four times are determined. The position of the target multimedia data in the composite file is derived from its start and end times, so that the target multimedia data is displayed on the first interface from its start time; the position of the interactive resource is derived from its start and end times, so that the interactive resource of the target multimedia data can be acquired.
Optionally, when the target multimedia data has been displayed to its end time in the composite file, it is displayed again from its start time in the composite file. Specifically, when the target multimedia data has been displayed to its end time in the composite file, if the instruction for stopping the display of the target multimedia data has not been obtained, the target multimedia data is displayed again from its start time in the composite file, and this process is repeated until the instruction for stopping the display is obtained. The instruction for stopping the display of the target multimedia data includes, but is not limited to, an instruction for closing the target multimedia data, an instruction for pausing the target multimedia data, a trigger instruction of the first interactive icon, and the like.
It should be noted that, when the target multimedia data or the interactive resource of the target multimedia data is a picture, the time period between the start time of the picture and the end time of the picture is used for displaying the picture, and the length of that time period may be set based on the scene or on experience, which is not limited in the embodiments of the present application.
Optionally, the parameters of the composite file in which the target multimedia data and the interactive resources of the target multimedia data are located are stored in a data structure used for managing all data and business logic related to the target multimedia data. The start time and the end time of the target multimedia data in the composite file are determined based on a first parameter in the data structure; the start time and the end time of the interactive resource of the target multimedia data in the composite file are determined based on a second parameter in the data structure.
For example, fig. 7 shows part of the data structure storing the parameters of the composite file in which the target multimedia data and the interactive resource of the target multimedia data are located. In the sttpsigntimeline shown in fig. 7, there are both start-time and end-time parameters representing the start time and the end time of the target multimedia data in the composite file, and start-time and end-time parameters representing the start time and the end time of the interactive resource of the target multimedia data in the composite file. The start-time and end-time parameters of the interactive resource of the target multimedia data in the composite file are taken as the second parameter, and they are determined according to the args variable parameters in the action array.
In addition, the sttpsignerttimeline also contains start-time and end-time parameters indicating the effective time period for acquiring the interactive resource of the target multimedia data, and the first threshold indicating that effective time period may be determined based on these start-time and end-time parameters.
Optionally, a finish field is added to the data structure. When the finish field indicates that the target multimedia data has been displayed to its end time, if the instruction to stop displaying the target multimedia data has not been obtained, the target multimedia data is automatically displayed again from its start time. Alternatively, the finish field is set at the end time point of the multimedia data; as shown in fig. 7, since the start time and the end time of the target multimedia data may be stored in the start_time and end_time of the sttpsettimeline, the position of the finish field may be determined based on the start_time and end_time of the sttpsettimeline.
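A sketch of reading the first and second parameters, and a finish field, from such a data structure; the dictionary layout below is loosely modelled on fig. 7, but every field name here is an assumption rather than the actual structure.

```python
# Hypothetical parameter layout; all field names are assumptions.
composition_params = {
    "timeline": {"start_time": 0.0, "end_time": 15.0, "finish": "replay"},
    "actions": [
        {"type": "interactive_resource",
         "args": {"start_time": 20.0, "end_time": 30.0}},
    ],
}

def first_parameter(params):
    """Start/end time of the target multimedia data in the composite file."""
    t = params["timeline"]
    return t["start_time"], t["end_time"]

def second_parameter(params):
    """Start/end time of the interactive resource, read from the args of
    the matching entry in the action array."""
    for action in params["actions"]:
        if action["type"] == "interactive_resource":
            return action["args"]["start_time"], action["args"]["end_time"]
    raise KeyError("no interactive resource action")

def should_replay(params, position):
    """finish field: replay from the start when the end time is reached
    and no stop instruction has been obtained."""
    t = params["timeline"]
    return position >= t["end_time"] and t.get("finish") == "replay"
```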
The second method: the target multimedia data and the interactive resources of the target multimedia data are in two mutually independent files.
In this way, the process of displaying the target multimedia data is as follows: based on the selection instruction for the target multimedia data, the target multimedia data is acquired from the file in which it is stored, and the target multimedia data is then displayed. The process of obtaining the interactive resource of the target multimedia data includes acquiring the interactive resource from the file in which the interactive resource of the target multimedia data is stored.
Whether the target multimedia data and the interactive resources of the target multimedia data are in one composite file or in two mutually independent files, the manner of obtaining the composite file, or either of the two independent files, may include but is not limited to the following three manners, described below by taking the composite file as an example:
Mode 1: the composite file is cached before it needs to be acquired; when the composite file needs to be acquired, it is obtained from the cache.
Specifically, before the composite file needs to be acquired, the terminal sends a request for acquiring the composite file to the server, and the server sends the composite file to the terminal based on the request. The terminal caches the composite file, and later extracts it directly from the cache when it needs to be acquired.
Optionally, the specific process by which the server sends the composite file to the terminal based on the request is as follows: the request carries the identifier of the composite file; the server looks up the storage address of the composite file based on the identifier, obtains the composite file from that storage address, and then sends the composite file to the terminal.
In this manner, the terminal caches the composite file in advance and then acquires it directly from the cache, which shortens the time for acquiring the composite file and improves the response speed of the interaction between the user and the multimedia data.
Mode 2: the storage address of the composite file is cached before the composite file needs to be acquired; when the composite file needs to be acquired, its storage address is obtained from the cache and the composite file is pulled based on that storage address.
Specifically, before the composite file needs to be acquired, the terminal sends a request for acquiring the storage address of the composite file to the server, and the server sends the storage address to the terminal based on the request. The terminal caches the storage address. When the composite file needs to be acquired, the terminal obtains the storage address from the cache and then pulls the composite file based on it.
In this manner, only the storage address of the composite file needs to be cached in advance rather than the composite file itself; when the composite file needs to be acquired, it is pulled based on the cached storage address. Compared with caching the composite file directly, this saves storage space on the terminal.
Mode 3: when the composite file needs to be acquired, it is obtained directly from the server.
Specifically, the terminal caches no information about the composite file in advance. When the composite file needs to be acquired, the terminal sends a request for acquiring the composite file to the server, and the server sends the composite file to the terminal based on the request, so that the terminal obtains it.
In this manner, neither the composite file nor its storage address needs to be cached in advance, which further saves storage space on the terminal.
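The three acquisition modes can be contrasted with a toy sketch in which the server, the object store, and the caches are plain in-memory dictionaries; every identifier, address, and byte string below is made up for illustration.

```python
# Toy stand-ins for the server side; contents are invented.
SERVER_FILES = {"file-1": b"<composite file bytes>"}
SERVER_ADDRESSES = {"file-1": "store://bucket/file-1"}
STORE = {"store://bucket/file-1": b"<composite file bytes>"}

def acquire_mode1(file_cache, file_id):
    """Mode 1: the composite file itself was cached in advance."""
    return file_cache[file_id]

def acquire_mode2(address_cache, file_id):
    """Mode 2: only the storage address was cached; pull the file by it."""
    return STORE[address_cache[file_id]]

def acquire_mode3(file_id):
    """Mode 3: nothing cached; request the file from the server directly."""
    return SERVER_FILES[file_id]

# Caches pre-populated in advance for modes 1 and 2.
file_cache = {"file-1": SERVER_FILES["file-1"]}
address_cache = {"file-1": SERVER_ADDRESSES["file-1"]}
```

The trade-off the text describes is visible in the sketch: mode 1 avoids any network round trip at acquisition time, mode 2 trades one pull for a much smaller cache entry, and mode 3 caches nothing at all.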
After the interactive resource of the target multimedia data is acquired based on the trigger instruction of the first interactive icon, the first interface jumps to the second interface. Optionally, the process of jumping from the first interface to the second interface includes: jumping from the first interface to a first transit interface, and displaying, on the first transit interface, first prompt content for prompting the interface jump, where the first prompt content includes one or more of sound and animation; and jumping from the first transit interface to the second interface when a closing instruction of the first transit interface is acquired or when the prompt time of the first prompt content reaches a second threshold.
In the process of jumping from the first interface to the second interface, the first interface first jumps to the first transit interface, on which first prompt content for prompting the interface jump is displayed; the first prompt content includes one or more of sound and animation. For example, a doorbell sound and/or a picture fade-out animation is presented on the first transit interface. The first prompt content enables the user to clearly perceive the impending interface jump and attracts the user's attention. Optionally, a heart-shaped icon with a heartbeat curve is dynamically displayed on the first transit interface, and a plurality of heart-shaped patterns are displayed in sequence above it. This third interactive icon may be regarded as an interactive icon with a heart-shaped pattern, and may be as shown in fig. 5 (2); the interactive icon in the intermediate state is not limited in the embodiments of the present application. Optionally, on the first transit interface, the content of the target multimedia data gradually fades out and the content of the interactive resource gradually appears, with only the close entry and the third interactive icon retained and all other information hidden.
When a closing instruction of the first transit interface is acquired, or when the prompt time of the first prompt content reaches a second threshold, the first transit interface jumps to the second interface. The second threshold is typically a short time and may be set based on the application scenario or on experience, for example 2 seconds. The closing instruction of the first transit interface may refer to a trigger instruction on the close entry. After the jump to the second interface, a second interactive icon is displayed on the second interface, the second interactive icon being used to indicate that the interaction has been triggered. It should be noted that the first interactive icon and the second interactive icon may be two interactive icons with the same shape but different states, or with different shapes and different states, which is not limited in the embodiments of the present application, as long as whether the interaction has been triggered can be distinguished.
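The jump sequence just described (first interface, transit interface, second interface) can be sketched as a small state machine; the event names and the 2-second threshold below are illustrative assumptions only.

```python
SECOND_THRESHOLD = 2.0  # seconds; example value from the text, not fixed

def next_interface(current, event, prompt_elapsed=0.0):
    """One step of the interface-jump sequence described above."""
    if current == "first" and event == "trigger_first_icon":
        return "transit"          # first prompt content is shown here
    if current == "transit" and (event == "close_transit"
                                 or prompt_elapsed >= SECOND_THRESHOLD):
        return "second"           # second interactive icon is shown here
    return current                # other events leave the state unchanged
```

Either the user's close instruction or the expiry of the prompt time moves the state to the second interface, matching the two conditions in the text.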
Optionally, displaying an interactive icon on a second interface, including: and dynamically showing the heart-shaped icon with the heartbeat curve in a second color on the second interface. At this time, the second interactive icon is a heart-shaped icon having a heartbeat curve with a second color. Alternatively, the second color may be black, and the second interactive icon may be as shown in fig. 5 (3).
Optionally, the second interface is an interface in a second player, and the second player is configured to display an interactive resource of the target multimedia data. And before jumping the first interface to the second interface, starting the second player, and then displaying a second interactive icon and interactive resources of the target multimedia data on the second interface in the second player. Optionally, the second player may be pre-started before the interactive resource of the target multimedia data is acquired, that is, the second player may be pre-started immediately after the type of the target multimedia data is determined to be the special multimedia data with the interactive resource, so that efficiency of displaying the interactive resource of the target multimedia data on the second interface can be improved. Optionally, when the parameters of the target multimedia data and the composite file in which the interactive resources of the target multimedia data are located are stored in one data structure, the second player is started based on a third parameter in the data structure. For example, the third parameter is the eptpactiontype parameter in the action array shown in fig. 7, which is used to indicate the type of the multimedia data, and if the parameter indicates that the target multimedia data is special multimedia data with interactive resources, the second player is started.
The process of jumping from the first interface to the second interface may be as shown in fig. 8. After the trigger instruction of the first interactive icon is acquired on the first interface shown in fig. 8 (1), the first interface jumps to the first transit interface shown in fig. 8 (2); on the first transit interface, the content of the target multimedia data gradually fades out and the content of the interactive resource gradually appears, with only the close entry and the third interactive icon retained and all other information hidden. On the first transit interface, the third interactive icon may be as shown in fig. 5 (2), and a sound effect may also be present. When a closing instruction of the first transit interface is acquired or when the prompt time of the first prompt content reaches the second threshold, the first transit interface shown in fig. 8 (2) jumps to the second interface shown in fig. 8 (3). Only the close entry and the second interactive icon, which may be as shown in fig. 5 (3), remain on the second interface.
In step 203, the interactive resources of the target multimedia data are displayed on the second interface.
The interactive resource of the target multimedia data may refer to a resource uploaded by the uploader of the target multimedia data together with the target multimedia data, and may include one or more of an interactive video, an interactive picture, and an interactive audio. The interactive video may be a thank-you video, the interactive picture may be a poster or a photo, and the interactive audio may be a thank-you audio or a piece of music. The uploader of the target multimedia data can freely set the form and content of the interactive resource, and the user does not know its specific form and content when triggering the first interactive icon, which heightens the user's sense of surprise during the interaction with the target multimedia data and further improves the interaction rate between the user and the target multimedia data.
And displaying the interactive resources of the target multimedia data on a second interface. If the number of the interactive resources is multiple, when the multiple interactive resources are in one file, the interactive resources of the target multimedia data can be sequentially displayed according to the storage sequence of each interactive resource in the file; when a plurality of interactive resources are in a plurality of mutually independent files, the interactive resources can be sequentially displayed according to the time sequence uploaded by the uploader of the target multimedia data, also can be sequentially displayed according to the sequence set by the uploader of the target multimedia data, and also can be displayed according to a random sequence. According to different forms and contents of the interactive resources, the display modes and the display times of the interactive resources may also be different, which is not limited in the embodiments of the present application.
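The display orders for multiple interactive resources described above can be sketched as follows; the resource fields (`uploaded_at`, `rank`) and mode names are hypothetical.

```python
import random

def display_order(resources, mode, seed=0):
    """Order multiple interactive resources for sequential display."""
    if mode == "storage":       # order as stored in the single file
        return list(resources)
    if mode == "upload_time":   # chronological order of upload
        return sorted(resources, key=lambda r: r["uploaded_at"])
    if mode == "custom":        # order set by the uploader
        return sorted(resources, key=lambda r: r["rank"])
    if mode == "random":        # random order
        shuffled = list(resources)
        random.Random(seed).shuffle(shuffled)
        return shuffled
    raise ValueError(f"unknown mode: {mode}")

# Two hypothetical resources from the same uploader.
resources = [
    {"id": "thanks-video", "uploaded_at": 2, "rank": 1},
    {"id": "poster", "uploaded_at": 1, "rank": 0},
]
```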
Optionally, after the interactive resource of the target multimedia data is displayed on the second interface, when a closing instruction of the second interface is acquired or when the time for displaying the interactive resource of the target multimedia data on the second interface exceeds a third threshold, the second interface is closed, the first interface is returned, and the second interactive icon is displayed on the first interface. At this time, the updated numerical value is displayed below the second interactive icon of the first interface.
The conditions for closing the second interface include: a closing instruction of the second interface is acquired, or the time for displaying the interactive resource of the target multimedia data on the second interface exceeds a third threshold. The third threshold can be freely adjusted according to the form and content of the interactive resource of the target multimedia data. The closing instruction of the second interface may refer to a trigger operation on the close entry of the second interface, or to a gesture operation representing closing. For example, if the gesture operation representing closing is a left-to-right sliding operation, the second interface is closed and the first interface is returned to when that sliding operation is acquired. As shown in fig. 9, when either of the two operations shown in fig. 9 (1) is acquired, the second interface is closed and the first interface shown in fig. 9 (3) is returned to. At this time, the second interactive icon on the first interface may be as shown in fig. 5 (3), and the updated numerical value displayed below it is 101, indicating that, after the current user triggers the first interactive icon, the number of interactions between all users and the target multimedia data is 101.
Optionally, the process of closing the second interface and returning to the first interface includes: closing the second interface, jumping to the second transfer interface, and displaying second prompt content for prompting the interface to jump on the second transfer interface, wherein the second prompt content comprises one or more of sound and animation; and when a closing instruction of the second transit interface is acquired or when the prompting time of the second prompting content reaches a fourth threshold value, closing the second transit interface and returning to the first interface.
After the second interface is closed, the interface jumps to a second transit interface, on which second prompt content for prompting the interface jump is displayed; the second prompt content includes one or more of sound and animation. The second prompt content enables the user to clearly perceive the impending interface jump and attracts the user's attention. On the second transit interface, the content of the interactive resource gradually fades out and the content of the target multimedia data gradually appears, with only the close entry and the second interactive icon retained and all other information hidden. On the second transit interface, the second interactive icon may be as shown in fig. 5 (3), and a sound effect may also be present. When a closing instruction of the second transit interface is acquired or when the prompt time of the second prompt content reaches a fourth threshold, the second transit interface is closed and the first interface is returned to. The fourth threshold is generally a short time, and may be set based on the application scenario or on experience, or may be the same as the second threshold, which is not limited in this embodiment; for example, the fourth threshold may be 2 seconds. The closing instruction of the second transit interface may refer to a trigger instruction on the close entry.
The process of closing the second interface and returning to the first interface may be as shown in fig. 9. When either of the two operations shown in fig. 9 (1) is acquired, the second interface is closed and the interface jumps to the second transit interface shown in fig. 9 (2), where the second interactive icon may be as shown in fig. 5 (3). When a closing instruction of the second transit interface is acquired or when the prompt time of the second prompt content reaches the fourth threshold, the second transit interface shown in fig. 9 (2) is closed and the first interface shown in fig. 9 (3) is returned to; at this time, the second interactive icon on the first interface may be as shown in fig. 5 (3).
For ease of understanding, the presentation process of the multimedia data is illustrated in fig. 10. As shown in fig. 10 (1), a heart-shaped icon with a heartbeat curve is shown on the first interface, together with a guide icon in the form of a bubble, which informs the user that a like operation can be triggered. The bubble may disappear after a reference time period, such as 6 seconds, as shown in fig. 10 (2). Under the guidance of the bubble, the user may click the heart-shaped icon with the heartbeat curve and, after clicking, discover a surprise, as shown in fig. 10 (3). At this time, the first interface jumps to the first transit interface, which has a transition animation effect, as shown in fig. 10 (4). For example, a heart-shaped icon with a heartbeat curve is dynamically shown on the first transit interface, and a plurality of heart-shaped patterns are displayed in sequence above it. When a closing instruction of the first transit interface is acquired or when the prompt time of the first prompt content reaches the second threshold, the first transit interface jumps to the second interface, as shown in fig. 10 (5). After the like is triggered, a video is displayed on the second interface as shown in fig. 10 (5), and the interactive effect is improved through the video. When the user triggers the close entry or slides to the right as shown in fig. 10 (6), the second interface returns to the first interface, which is shown in fig. 10 (7).
In the embodiment of the application, the interactive resources of the target multimedia data are obtained and displayed based on the trigger instruction of the first interactive icon, and the first interactive icon is adjusted to be the second interactive icon. After the trigger instruction of the first interactive icon is acquired, the first interactive icon is adjusted to be the second interactive icon, interactive resources such as interactive videos, interactive pictures and interactive audios of the target multimedia data are displayed, the interactive feedback strength is high, and the interactive feedback is diverse in form, so that the interactive rate is improved, and the interactive experience of a user is improved.
Based on the implementation environment shown in fig. 1, the embodiment of the present application provides a method for displaying multimedia data, which is applied to a terminal as an example. Referring to fig. 11, a method provided by an embodiment of the present application may include the following steps:
in step 1101, target multimedia data is displayed, and a first interaction icon is displayed on a first interface for displaying the target multimedia data, where the first interaction icon is used to indicate that an interaction is not triggered.
When the user enters an application program or a web page capable of displaying multimedia data, the multimedia data is displayed. The multimedia data currently in the main area of the screen of the application program or web page may be taken as the target multimedia data, or the multimedia data triggered in the application program or web page may be taken as the target multimedia data. Depending on the form in which the multimedia data is displayed, the target multimedia data can be triggered in the following two ways. When the multimedia data is displayed in the form of a list, a selection operation of the user on the target multimedia data is detected, such as clicking or double-clicking the target multimedia data in the list, and the target multimedia data is thereby triggered. When the multimedia data is displayed in the form of a multimedia data stream, the user can, through a sliding operation, place the multimedia data to be watched or listened to in the main area of the terminal screen; the terminal takes the multimedia data placed in the main area of the screen as the target multimedia data, thereby acquiring and triggering it. A multimedia data stream refers to a presentation form in which a plurality of multimedia data are presented in sequence, and the triggered target multimedia data can be determined in the stream through the user's sliding operation.
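Selecting the target multimedia data from a stream by screen position can be sketched as below; the pixel coordinates and item layout are invented for illustration.

```python
def target_in_stream(items, main_area_center):
    """Take as target the stream item whose on-screen span covers the
    main (central) area of the terminal screen after a sliding operation."""
    for item in items:
        if item["top"] <= main_area_center <= item["bottom"]:
            return item["id"]
    return None  # no item currently covers the main area

# A stream of two items laid out vertically (pixel coordinates).
stream = [
    {"id": "video-1", "top": 0, "bottom": 800},
    {"id": "video-2", "top": 800, "bottom": 1600},
]
```

After each sliding operation, the terminal would re-evaluate which item covers the main area and treat that item as the triggered target multimedia data.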
In step 1102, after the first interactive icon is triggered, acquiring an interactive resource of the target multimedia data, jumping from the first interface to a second interface, and displaying a second interactive icon on the second interface, where the second interactive icon is used to indicate that the interaction is triggered.
Optionally, the user may perform pressure touch on the first interactive icon in the first interface, and the pressure touch is used as a trigger operation for the first interactive icon, so that the first interactive icon is triggered; when the terminal is a desktop computer or a portable laptop, the user can complete the triggering operation of the first interactive icon through the input of the external equipment. For example, the user may click the first interactive icon through a mouse to complete the triggering operation on the first interactive icon, or may complete the triggering operation on the first interactive icon through a keyboard in a manner of inputting a shortcut key, so that the first interactive icon is triggered.
Optionally, the interactive resources include one or more of interactive video, interactive pictures, interactive audio.
In step 1103, the interactive resources of the target multimedia data are displayed on the second interface.
The implementation of each step can refer to the related steps in the embodiment shown in fig. 2, and details are not repeated here.
In the embodiment of the application, after the first interactive icon is triggered, interactive resources such as interactive videos, interactive pictures and interactive audios of the target multimedia data are displayed besides the first interactive icon is adjusted to be the second interactive icon, the interactive feedback strength is high, and the interactive feedback forms are various, so that the interactive rate is improved, and the interactive experience of a user is improved.
Based on the same technical concept, referring to fig. 12, an embodiment of the present application provides a multimedia data presentation apparatus, including:
a first display module 1201, configured to display the target multimedia data based on the selected instruction of the target multimedia data;
the second display module 1202 is configured to display a first interaction icon on a first interface for displaying target multimedia data, where the first interaction icon is used to indicate that an interaction is not triggered;
a first obtaining module 1203, configured to obtain an interactive resource of the target multimedia data based on the trigger instruction of the first interactive icon;
a skipping module 1204, configured to skip the first interface to the second interface;
a third display module 1205, configured to display a second interactive icon on the second interface, where the second interactive icon is used to indicate that an interaction is triggered, and the interactive resources include one or more of an interactive video, an interactive picture, and an interactive audio;
the fourth display module 1206 is configured to display the interactive resource of the target multimedia data on the second interface.
Optionally, the target multimedia data and the interactive resources of the target multimedia data are in one composite file;
referring to fig. 13, the apparatus further includes:
a second obtaining module 1207, configured to obtain a composite file based on the selected instruction of the target multimedia data;
a first determining module 1208, configured to determine a start time and an end time of the target multimedia data in the composite file;
a first presentation module 1201, configured to present the target multimedia data based on the start time and the end time of the target multimedia data in the composite file.
Referring to fig. 13, the apparatus further includes:
a second determining module 1209, configured to determine a start time and an end time of an interactive resource of the target multimedia data in the composition file;
a first obtaining module 1203, configured to obtain an interactive resource of the target multimedia data based on a start time and an end time of the interactive resource of the target multimedia data in the composition file.
Optionally, the first presentation module 1201 is configured to, when the target multimedia data is presented to the end time of the target multimedia data in the composition file, re-present the target multimedia data from the start time of the target multimedia data in the composition file.
Optionally, parameters of a composite file in which the target multimedia data and the interactive resources of the target multimedia data are located are stored in a data structure;
a first determining module 1208, configured to determine a start time and an end time of the target multimedia data in the composite file based on the first parameter in the data structure;
a second determining module 1209, configured to determine a start time and an end time of the interactive resource of the target multimedia data in the composition file based on the second parameter in the data structure.
Optionally, the second presentation module 1202 is configured to dynamically present, in a first color, a heart icon with a heartbeat curve on a first interface for presenting the target multimedia data;
a third displaying module 1205 for dynamically displaying the heart icon with heartbeat curve in the second color on the second interface.
Optionally, the second displaying module 1202 is configured to display the first interactive icon and a guide icon of the first interactive icon on the first interface for displaying the target multimedia data, where the guide icon of the first interactive icon is used to guide an interactive operation on the first interactive icon.
Optionally, the first obtaining module 1203 is configured to detect, based on the trigger instruction of the first interactive icon, a display time of the target multimedia data when the trigger instruction of the first interactive icon is obtained, and obtain an interactive resource of the target multimedia data when the display time does not exceed a first threshold, where the first threshold is used to indicate an effective time period for obtaining the interactive resource of the target multimedia data.
Optionally, the skipping module 1204 is further configured to skip the first interface to the first transit interface;
referring to fig. 14, the apparatus further includes:
a fifth presentation module 1210, configured to present, at the first transit interface, first prompt content for prompting the interface to jump, where the first prompt content includes one or more of sound and animation;
the jump module 1204 is further configured to jump the first transfer interface to the second interface when the closing instruction of the first transfer interface is obtained or when the prompt time of the first prompt content reaches a second threshold.
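The dual exit condition of the first transit interface — an explicit close instruction, or the prompt time reaching the second threshold — reduces to one boolean test. The 2-second threshold below is an assumed value for illustration.

```python
# Sketch of when module 1204 jumps from the first transit interface to the
# second interface. SECOND_THRESHOLD and all names are illustrative.

SECOND_THRESHOLD = 2.0  # seconds; assumed value of the second threshold

def should_jump_to_second_interface(close_received: bool,
                                    prompt_time: float,
                                    threshold: float = SECOND_THRESHOLD) -> bool:
    """Jump when the user closes the transit interface, or when the first
    prompt content has been shown for the threshold duration."""
    return close_received or prompt_time >= threshold

print(should_jump_to_second_interface(True, 0.5))   # True: closed early
print(should_jump_to_second_interface(False, 2.0))  # True: prompt time reached
print(should_jump_to_second_interface(False, 1.0))  # False: keep prompting
```

The same pattern governs the later return path: the second transit interface is closed on a close instruction or when the second prompt content reaches the fourth threshold.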
Optionally, the fifth displaying module 1210 is configured to dynamically display a heart icon with a heartbeat curve on the first transition interface, and sequentially display a plurality of heart patterns above the heart icon with the heartbeat curve.
Optionally, the first interface is an interface in a first player, and the second interface is an interface in a second player;
referring to fig. 14, the apparatus further includes:
the starting module 1211 is configured to start a second player, where the second player is used for displaying an interactive resource of the target multimedia data.
Optionally, the starting module 1211 is configured to, when the parameters of the composite file in which the target multimedia data and the interactive resources of the target multimedia data are located are stored in a data structure, start the second player based on a third parameter in the data structure.
Optionally, the target multimedia data and the interactive resources of the target multimedia data are in two files independent of each other;
optionally, referring to fig. 15, the apparatus further comprises:
a closing module 1212, configured to close the second interface when a closing instruction of the second interface is obtained or when a time for displaying the interactive resource of the target multimedia data on the second interface exceeds a third threshold;
a returning module 1213 is configured to return to the first interface, and display the second interactive icon on the first interface.
Optionally, the skipping module 1204 is further configured to close the second interface, skip to the second transit interface, and display second prompt content for prompting the interface to skip on the second transit interface, where the second prompt content includes one or more of sound and animation;
the returning module 1213 is configured to close the second relay interface and return to the first interface when the closing instruction of the second relay interface is obtained or when the time for prompting the second prompt content reaches a fourth threshold.
In the embodiment of the application, the interactive resource of the target multimedia data is acquired and displayed based on the trigger instruction of the first interactive icon, and the first interactive icon is adjusted to the second interactive icon. That is, after the trigger instruction of the first interactive icon is acquired, in addition to adjusting the first interactive icon to the second interactive icon, interactive resources of the target multimedia data such as an interactive video, an interactive picture, and an interactive audio are displayed. The interactive feedback is therefore strong and takes various forms, which improves the interaction rate and the interactive experience of the user.
Based on the same technical concept, referring to fig. 16, an embodiment of the present application provides a multimedia data presentation apparatus, including:
a first display module 1601, configured to display target multimedia data;
a second display module 1602, configured to display a first interaction icon on a first interface for displaying target multimedia data, where the first interaction icon is used to indicate that an interaction is not triggered;
an obtaining module 1603, configured to obtain an interactive resource of the target multimedia data after the first interactive icon is triggered;
a skipping module 1604, configured to skip the first interface to the second interface;
a third display module 1605, configured to display a second interactive icon on the second interface, where the second interactive icon is used to indicate that an interaction is triggered, and the interactive resources include one or more of an interactive video, an interactive picture, and an interactive audio;
a fourth display module 1606, configured to display the interactive resource of the target multimedia data on the second interface.
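The module division of fig. 16 can be sketched as one small class, with each method standing in for a functional module. This is a minimal illustration under assumed names; interface handling is reduced to tracking the current interface and icon, and the resource fetch is a placeholder.

```python
# Illustrative sketch of the apparatus of fig. 16. The class and method
# names, and the placeholder resource, are assumptions for demonstration.

class MultimediaPresenter:
    def __init__(self):
        self.interface = "first"
        self.icon = "first"   # "first" = interaction not yet triggered

    def show_target_multimedia(self):
        """First/second display modules 1601-1602: show the data and the
        first interactive icon on the first interface."""
        self.interface, self.icon = "first", "first"

    def on_first_icon_triggered(self):
        """Obtaining module 1603, skipping module 1604, and third/fourth
        display modules 1605-1606, run in sequence after the trigger."""
        resource = {"type": "interactive_video"}  # placeholder fetch (1603)
        self.interface = "second"                 # jump to second interface (1604)
        self.icon = "second"                      # show second icon (1605)
        return resource                           # displayed on second interface (1606)

p = MultimediaPresenter()
p.show_target_multimedia()
res = p.on_first_icon_triggered()
print(p.interface, p.icon, res["type"])  # second second interactive_video
```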
In the embodiment of the application, after the first interactive icon is triggered, in addition to adjusting the first interactive icon to the second interactive icon, interactive resources of the target multimedia data such as an interactive video, an interactive picture, and an interactive audio are displayed. The interactive feedback is therefore strong and takes various forms, which improves the interaction rate and the interactive experience of the user.
It should be noted that, when the apparatus provided in the foregoing embodiment implements the functions thereof, only the division of the functional modules is illustrated, and in practical applications, the functions may be distributed by different functional modules according to needs, that is, the internal structure of the apparatus may be divided into different functional modules to implement all or part of the functions described above. In addition, the apparatus and method embodiments provided by the above embodiments belong to the same concept, and specific implementation processes thereof are described in the method embodiments for details, which are not described herein again.
Fig. 17 is a schematic structural diagram of a device for displaying multimedia data according to an embodiment of the present disclosure. The device may be a terminal, for example: a smart phone, a tablet computer, an MP3 player (Moving Picture Experts Group Audio Layer III), an MP4 player (Moving Picture Experts Group Audio Layer IV), a notebook computer, or a desktop computer. A terminal may also be referred to by other names such as user equipment, portable terminal, laptop terminal, or desktop terminal.
Generally, a terminal includes: a processor 1701 and a memory 1702.
The processor 1701 may include one or more processing cores, such as a 4-core processor or an 8-core processor. The processor 1701 may be implemented in at least one hardware form of a DSP (Digital Signal Processor), an FPGA (Field-Programmable Gate Array), and a PLA (Programmable Logic Array). The processor 1701 may also include a main processor and a coprocessor. The main processor, also called a CPU (Central Processing Unit), is a processor for processing data in an awake state; the coprocessor is a low-power processor for processing data in a standby state. In some embodiments, the processor 1701 may be integrated with a GPU (Graphics Processing Unit), which is responsible for rendering and drawing the content that the display screen needs to display. In some embodiments, the processor 1701 may further include an AI (Artificial Intelligence) processor for processing computing operations related to machine learning.
The memory 1702 may include one or more computer-readable storage media, which may be non-transitory. Memory 1702 may also include high speed random access memory, as well as non-volatile memory, such as one or more magnetic disk storage devices, flash memory storage devices. In some embodiments, a non-transitory computer readable storage medium in the memory 1702 is used to store at least one instruction for execution by the processor 1701 to implement the method for presenting multimedia data provided by the method embodiments of the present application.
In some embodiments, the terminal may further include: a peripheral interface 1703 and at least one peripheral. The processor 1701, the memory 1702 and the peripheral interface 1703 may be connected by buses or signal lines. Various peripheral devices may be connected to the peripheral interface 1703 by buses, signal lines, or circuit boards. Specifically, the peripheral device includes: at least one of a radio frequency circuit 1704, a touch display screen 1705, a camera assembly 1706, an audio circuit 1707, a positioning assembly 1708, and a power supply 1709.
The peripheral interface 1703 may be used to connect at least one peripheral associated with I/O (Input/Output) to the processor 1701 and the memory 1702. In some embodiments, the processor 1701, memory 1702, and peripheral interface 1703 are integrated on the same chip or circuit board; in some other embodiments, any one or both of the processor 1701, the memory 1702, and the peripheral interface 1703 may be implemented on separate chips or circuit boards, which are not limited by the present embodiment.
The Radio Frequency circuit 1704 is used to receive and transmit RF (Radio Frequency) signals, also known as electromagnetic signals. The radio frequency circuit 1704 communicates with a communication network and other communication devices via electromagnetic signals. The rf circuit 1704 converts the electrical signal into an electromagnetic signal for transmission, or converts the received electromagnetic signal into an electrical signal. Optionally, the radio frequency circuit 1704 includes: an antenna system, an RF transceiver, one or more amplifiers, a tuner, an oscillator, a digital signal processor, a codec chipset, a subscriber identity module card, and so forth. The radio frequency circuit 1704 may communicate with other terminals via at least one wireless communication protocol. The wireless communication protocols include, but are not limited to: metropolitan area networks, various generation mobile communication networks (2G, 3G, 4G, and 5G), wireless local area networks, and/or WiFi (Wireless Fidelity) networks. In some embodiments, the radio frequency circuit 1704 may further include NFC (Near Field Communication) related circuits, which are not limited in this application.
The display screen 1705 is used to display a UI (User Interface). The UI may include graphics, text, icons, video, and any combination thereof. When the display screen 1705 is a touch display screen, the display screen 1705 also has the ability to capture touch signals on or above the surface of the display screen 1705. The touch signal may be input as a control signal to the processor 1701 for processing. At this point, the display 1705 may also be used to provide virtual buttons and/or a virtual keyboard, also referred to as soft buttons and/or a soft keyboard. In some embodiments, the display 1705 may be one, disposed on a front panel of the terminal; in other embodiments, the display 1705 may be at least two, respectively disposed on different surfaces of the terminal or in a folded design; in still other embodiments, the display 1705 may be a flexible display, disposed on a curved surface or a folded surface of the terminal. Even more, the display screen 1705 may be arranged in a non-rectangular irregular figure, i.e., a shaped screen. The Display screen 1705 may be made of LCD (Liquid Crystal Display), OLED (Organic Light-Emitting Diode), and other materials.
The camera assembly 1706 is used to capture images or video. Optionally, camera assembly 1706 includes a front camera and a rear camera. Generally, a front camera is disposed at a front panel of the terminal, and a rear camera is disposed at a rear surface of the terminal. In some embodiments, the number of the rear cameras is at least two, and each rear camera is any one of a main camera, a depth-of-field camera, a wide-angle camera and a telephoto camera, so that the main camera and the depth-of-field camera are fused to realize a background blurring function, the main camera and the wide-angle camera are fused to realize panoramic shooting and a VR (Virtual Reality) shooting function or other fusion shooting functions. In some embodiments, camera assembly 1706 may also include a flash. The flash lamp can be a single-color temperature flash lamp or a double-color temperature flash lamp. The double-color-temperature flash lamp is a combination of a warm-light flash lamp and a cold-light flash lamp, and can be used for light compensation at different color temperatures.
The audio circuit 1707 may include a microphone and a speaker. The microphone is used for collecting sound waves of a user and the environment, converting the sound waves into electric signals, inputting the electric signals to the processor 1701 for processing, or inputting the electric signals to the radio frequency circuit 1704 for voice communication. For the purpose of stereo sound collection or noise reduction, a plurality of microphones may be respectively disposed at different portions of the terminal. The microphone may also be an array microphone or an omni-directional acquisition microphone. The speaker is used to convert electrical signals from the processor 1701 or the radio frequency circuit 1704 into sound waves. The loudspeaker can be a traditional film loudspeaker or a piezoelectric ceramic loudspeaker. When the speaker is a piezoelectric ceramic speaker, the speaker can be used for purposes such as converting an electric signal into a sound wave audible to a human being, or converting an electric signal into a sound wave inaudible to a human being to measure a distance. In some embodiments, the audio circuitry 1707 may also include a headphone jack.
The positioning component 1708 is used for positioning the current geographic location of the terminal to implement navigation or LBS (Location Based Service). The positioning component 1708 may be a positioning component based on the GPS (Global Positioning System) of the United States, the BeiDou system of China, the GLONASS system of Russia, or the Galileo system of the European Union.
A power supply 1709 is used to supply power to the various components in the terminal. The power supply 1709 may be ac, dc, disposable or rechargeable. When power supply 1709 comprises a rechargeable battery, the rechargeable battery may support wired or wireless charging. The rechargeable battery can also be used to support fast charge technology.
In some embodiments, the terminal also includes one or more sensors 1710. The one or more sensors 1710 include, but are not limited to: acceleration sensor 1711, gyro sensor 1712, pressure sensor 1713, fingerprint sensor 1714, optical sensor 1715, and proximity sensor 1716.
The acceleration sensor 1711 can detect the magnitude of acceleration on three coordinate axes of a coordinate system established with the terminal. For example, the acceleration sensor 1711 may be used to detect components of gravitational acceleration in three coordinate axes. The processor 1701 may control the touch screen 1705 to display the user interface in a landscape view or a portrait view according to the gravitational acceleration signal collected by the acceleration sensor 1711. The acceleration sensor 1711 may also be used for acquisition of motion data of a game or a user.
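Choosing a landscape or portrait UI from the gravitational acceleration components can be sketched with a simple comparison of the x- and y-axis readings. The thresholding scheme below is an assumption, not a scheme stated in the embodiment.

```python
# Illustrative orientation choice from gravity components, as the
# acceleration sensor 1711 might report them (units: m/s^2).

def ui_orientation(gx: float, gy: float) -> str:
    """Portrait when gravity lies mostly along the y axis of the device,
    landscape when it lies mostly along the x axis."""
    return "portrait" if abs(gy) >= abs(gx) else "landscape"

print(ui_orientation(0.5, 9.7))  # portrait: device held upright
print(ui_orientation(9.7, 0.5))  # landscape: device on its side
```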
The gyroscope sensor 1712 can detect the body direction and the rotation angle of the terminal, and the gyroscope sensor 1712 can cooperate with the acceleration sensor 1711 to acquire the 3D action of the user on the terminal. The processor 1701 may perform the following functions based on the data collected by the gyro sensor 1712: motion sensing (such as changing the UI according to a user's tilting operation), image stabilization at the time of photographing, game control, and inertial navigation.
The pressure sensors 1713 may be disposed on the side frames of the terminal and/or underlying the touch display screen 1705. When the pressure sensor 1713 is disposed on the side frame of the terminal, the user's holding signal to the terminal can be detected, and the processor 1701 performs left-right hand recognition or shortcut operation according to the holding signal collected by the pressure sensor 1713. When the pressure sensor 1713 is disposed at the lower layer of the touch display screen 1705, the processor 1701 controls the operability control on the UI interface according to the pressure operation of the user on the touch display screen 1705. The operability control comprises at least one of a button control, a scroll bar control, an icon control, and a menu control.
The fingerprint sensor 1714 is configured to collect a fingerprint of the user, and the processor 1701 identifies the identity of the user based on the fingerprint collected by the fingerprint sensor 1714, or the fingerprint sensor 1714 identifies the identity of the user based on the collected fingerprint. Upon identifying that the identity of the user is a trusted identity, the processor 1701 authorizes the user to perform relevant sensitive operations, including unlocking the screen, viewing encrypted information, downloading software, making payments, changing settings, and the like. The fingerprint sensor 1714 may be disposed on the front, back, or side of the terminal. When a physical key or vendor logo is provided on the terminal, the fingerprint sensor 1714 may be integrated with the physical key or vendor logo.
The optical sensor 1715 is used to collect the ambient light intensity. In one embodiment, the processor 1701 may control the display brightness of the touch display screen 1705 based on the ambient light intensity collected by the optical sensor 1715. Specifically, when the ambient light intensity is high, the display brightness of the touch display screen 1705 is increased; when the ambient light intensity is low, the display brightness of the touch display screen 1705 is turned down. In another embodiment, the processor 1701 may also dynamically adjust the shooting parameters of the camera assembly 1706 according to the ambient light intensity collected by the optical sensor 1715.
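The brightness control described above — raise the backlight in bright surroundings, lower it in dim ones — can be sketched as a small mapping from ambient light to a brightness level. The lux breakpoints and brightness values are illustrative assumptions.

```python
# Illustrative mapping from ambient light intensity (collected by the
# optical sensor 1715) to a display brightness level for the touch
# display screen 1705. Breakpoints and levels are assumed values.

def display_brightness(ambient_lux: float) -> float:
    """Return a brightness level in [0.0, 1.0]: brighter surroundings
    turn the display brightness up, darker surroundings turn it down."""
    if ambient_lux >= 1000:   # direct daylight
        return 1.0
    if ambient_lux >= 200:    # typical indoor lighting
        return 0.6
    return 0.3                # dim environment

print(display_brightness(1500))  # 1.0
print(display_brightness(500))   # 0.6
print(display_brightness(50))    # 0.3
```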
The proximity sensor 1716, also known as a distance sensor, is typically provided on the front panel of the terminal. The proximity sensor 1716 is used to collect the distance between the user and the front face of the terminal. In one embodiment, when the proximity sensor 1716 detects that the distance between the user and the front face of the terminal gradually decreases, the processor 1701 controls the touch display screen 1705 to switch from the bright screen state to the dark screen state; when the proximity sensor 1716 detects that the distance between the user and the front face of the terminal gradually increases, the processor 1701 controls the touch display screen 1705 to switch from the dark screen state to the bright screen state.
Those skilled in the art will appreciate that the configuration shown in fig. 17 is not limiting of the terminal, and may include more or fewer components than shown, or some components may be combined, or a different arrangement of components may be used.
In an exemplary embodiment, a computer device is also provided that includes a processor and a memory having at least one instruction, at least one program, set of codes, or set of instructions stored therein. The at least one instruction, the at least one program, the set of codes, or the set of instructions is configured to be executed by one or more processors to implement any of the methods of presentation of multimedia data described above.
In an exemplary embodiment, there is also provided a computer readable storage medium having stored therein at least one instruction, at least one program, a set of codes, or a set of instructions which, when executed by a processor of a computer device, implements a presentation method of any one of the above multimedia data.
Alternatively, the computer-readable storage medium may be a ROM, a Random Access Memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, and the like.
It should be understood that reference to "a plurality" herein means two or more. "and/or" describes the association relationship of the associated objects, meaning that there may be three relationships, e.g., a and/or B, which may mean: a exists alone, A and B exist simultaneously, and B exists alone. The character "/" generally indicates that the former and latter associated objects are in an "or" relationship.
The above-mentioned serial numbers of the embodiments of the present application are merely for description and do not represent the merits of the embodiments.
The above description is only exemplary of the embodiments of the present application and should not be construed as limiting the embodiments of the present application, and any modifications, equivalents, improvements, etc. made within the spirit and principle of the embodiments of the present application should be included in the scope of the embodiments of the present application.

Claims (12)

1. A method for presenting multimedia data, the method comprising:
the method comprises the steps that target multimedia data are displayed based on a selected instruction of the target multimedia data, a first interactive icon and a guide icon of the first interactive icon are dynamically displayed on a first interface for displaying the target multimedia data, the first interactive icon is used for indicating that interaction is not triggered, the first interactive icon is a heart-shaped icon with a heartbeat curve and a first color, the heart-shaped icon with the heartbeat curve is used for prompting that the target multimedia data are multimedia data with interactive resources, and the guide icon of the first interactive icon is used for guiding interactive operation on the first interactive icon;
detecting the display time of the target multimedia data when the trigger instruction of the first interactive icon is acquired based on the trigger instruction of the first interactive icon, and acquiring the interactive resource of the target multimedia data when the display time does not exceed a first threshold value, wherein the first threshold value is used for indicating an effective time period for acquiring the interactive resource of the target multimedia data, and the first threshold value is determined according to the total time length of the target multimedia data;
skipping the first interface to a first transit interface, displaying first prompt content for prompting interface skipping on the first transit interface, wherein the first prompt content comprises a heart-shaped icon dynamically displaying a heartbeat curve, and a plurality of heart-shaped patterns are sequentially displayed above the heart-shaped icon with the heartbeat curve;
when a closing instruction of the first transfer interface is acquired or when the prompting time of the first prompting content reaches a second threshold value, jumping the first transfer interface to a second interface;
dynamically displaying a second interactive icon on the second interface, wherein the second interactive icon is used for indicating that interaction is triggered, the second interactive icon is a heart-shaped icon with a heartbeat curve and in a second color, and the interactive resources comprise one or more of interactive videos, interactive pictures and interactive audios;
displaying the interactive resources of the target multimedia data on the second interface, wherein the second interface does not comprise information other than the interactive resources, the second interactive icon, and a close entry;
when a closing instruction of the second interface is acquired or when the time for displaying the interactive resources of the target multimedia data on the second interface exceeds a third threshold value, closing the second interface, returning to the first interface, and displaying the second interactive icon on the first interface.
2. The method of claim 1, wherein the target multimedia data and the interactive resources of the target multimedia data are in a composite file;
the displaying the target multimedia data based on the selected instruction of the target multimedia data comprises:
acquiring the composite file based on the selected instruction of the target multimedia data;
determining a start time and an end time of the target multimedia data in the composite file;
displaying the target multimedia data based on the start time and the end time of the target multimedia data in the composite file;
the acquiring the interactive resources of the target multimedia data comprises:
determining a start time and an end time of the interactive resources of the target multimedia data in the composite file;
and acquiring the interactive resources of the target multimedia data based on the start time and the end time of the interactive resources of the target multimedia data in the composite file.
3. The method of claim 2, wherein the displaying the target multimedia data based on the start time and the end time of the target multimedia data in the composite file comprises:
when the target multimedia data is displayed to the end time of the target multimedia data in the composite file, displaying the target multimedia data again from the start time of the target multimedia data in the composite file.
4. The method of claim 2, wherein the parameters of the composite file in which the target multimedia data and the interactive resources of the target multimedia data are located are stored in a data structure;
the determining the start time and the end time of the target multimedia data in the composite file comprises:
determining the start time and the end time of the target multimedia data in the composite file based on a first parameter in the data structure;
the determining the start time and the end time of the interactive resources of the target multimedia data in the composite file comprises:
determining the start time and the end time of the interactive resources of the target multimedia data in the composite file based on a second parameter in the data structure.
5. The method of any of claims 1-4, wherein the first prompting content further comprises one or more of sound and animation.
6. The method of any of claims 1-4, wherein the first interface is an interface in a first player and the second interface is an interface in a second player;
before skipping the first transfer interface to the second interface, the method further includes:
and starting a second player, wherein the second player is used for displaying the interactive resources of the target multimedia data.
7. The method of claim 6, wherein the launching the second player comprises:
and when the parameters of the composite file in which the target multimedia data and the interactive resources of the target multimedia data are located are stored in a data structure, starting the second player based on a third parameter in the data structure.
8. The method of any of claims 1-4, 7, wherein said closing the second interface, returning to the first interface, comprises:
closing the second interface, jumping to a second transit interface, and displaying second prompt content for prompting interface jumping on the second transit interface, wherein the second prompt content comprises one or more of sound and animation;
and when a closing instruction of the second transfer interface is acquired or when the prompting time of the second prompting content exceeds a fourth threshold value, closing the second transfer interface and returning to the first interface.
9. A method for presenting multimedia data, the method comprising:
displaying target multimedia data, dynamically displaying a first interactive icon and a guide icon for displaying the first interactive icon on a first interface for displaying the target multimedia data, wherein the first interactive icon is used for indicating that interaction is not triggered, the first interactive icon is a heart-shaped icon with a first color and a heartbeat curve, the heart-shaped icon with the heartbeat curve is used for prompting that the target multimedia data is multimedia data with interactive resources, and the guide icon of the first interactive icon is used for guiding interactive operation on the first interactive icon;
after the first interactive icon is triggered, detecting the display time of the target multimedia data when a trigger instruction of the first interactive icon is acquired, and acquiring the interactive resources of the target multimedia data when the display time does not exceed a first threshold value, wherein the first threshold value is used for indicating an effective time period for acquiring the interactive resources of the target multimedia data, and the first threshold value is determined according to the total duration of the target multimedia data;
skipping the first interface to a first transit interface, displaying first prompt content for prompting interface skipping on the first transit interface, wherein the first prompt content comprises a heart-shaped icon dynamically displaying a heartbeat curve, and a plurality of heart-shaped patterns are sequentially displayed above the heart-shaped icon with the heartbeat curve;
when a closing instruction of the first transfer interface is acquired or when the prompting time of the first prompting content reaches a second threshold value, jumping the first transfer interface to a second interface;
dynamically displaying a second interactive icon on the second interface, wherein the second interactive icon is used for indicating that interaction is triggered, the second interactive icon is a heart-shaped icon with a second color and a heartbeat curve, and the interactive resources comprise one or more of interactive videos, interactive pictures and interactive audios;
displaying the interactive resources of the target multimedia data on the second interface, wherein the second interface does not comprise information other than the interactive resources, the second interactive icon, and a close entry;
when a closing instruction of the second interface is acquired or when the time for displaying the interactive resources of the target multimedia data on the second interface exceeds a third threshold value, closing the second interface, returning to the first interface, and displaying the second interactive icon on the first interface.
10. An apparatus for presenting multimedia data, the apparatus comprising:
a first display module, configured to display the target multimedia data based on a selection instruction for the target multimedia data;
a second display module, configured to dynamically display, on a first interface displaying the target multimedia data, a first interactive icon and a guide icon for the first interactive icon, wherein the first interactive icon indicates that interaction has not been triggered, the first interactive icon is a heart-shaped icon in a first color with a heartbeat curve, the heart-shaped icon with the heartbeat curve prompts that the target multimedia data is multimedia data having interactive resources, and the guide icon guides an interactive operation on the first interactive icon;
a first obtaining module, configured to detect the display time of the target multimedia data when a trigger instruction for the first interactive icon is obtained, and to obtain the interactive resources of the target multimedia data when the display time does not exceed a first threshold, wherein the first threshold indicates the valid time period for obtaining the interactive resources of the target multimedia data and is determined according to the total duration of the target multimedia data;
a jump module, configured to jump from the first interface to a first transit interface;
a fifth display module, configured to display, on the first transit interface, first prompt content for prompting an interface jump, wherein the first prompt content includes a dynamically displayed heart-shaped icon with a heartbeat curve, above which multiple heart-shaped patterns are displayed in sequence;
the jump module being further configured to jump from the first transit interface to a second interface when a closing instruction for the first transit interface is obtained, or when the prompt time of the first prompt content reaches a second threshold;
a third display module, configured to dynamically display a second interactive icon on the second interface, wherein the second interactive icon indicates that interaction has been triggered, the second interactive icon is a heart-shaped icon in a second color with a heartbeat curve, and the interactive resources comprise one or more of interactive videos, interactive pictures, and interactive audios;
a fourth display module, configured to display the interactive resources of the target multimedia data on the second interface, wherein the second interface does not comprise any information other than the interactive resources, the second interactive icon, and a close entry;
a closing module, configured to close the second interface when a closing instruction for the second interface is acquired, or when the time for which the interactive resources of the target multimedia data are displayed on the second interface exceeds a third threshold; and
a return module, configured to return to the first interface and display the second interactive icon on the first interface.
11. A computer device, comprising a processor and a memory, wherein the memory stores at least one instruction, and the at least one instruction, when executed by the processor, implements the method for presenting multimedia data according to any one of claims 1 to 9.
12. A computer-readable storage medium, wherein at least one instruction is stored in the computer-readable storage medium, and when executed, the at least one instruction implements the method for presenting multimedia data according to any one of claims 1 to 9.
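The obtaining module's validity check in claim 10, in which interactive resources are obtainable only while the display time is within a first threshold determined from the total duration of the target multimedia data, might be read as follows. This is an illustration only, not the patented implementation; the function names and the 0.5 ratio are assumptions, since the claim does not specify how the threshold is derived from the total duration.

```python
def first_threshold(total_duration_s, ratio=0.5):
    # Claim 10: the first threshold is determined according to the total
    # duration of the target multimedia data. The 0.5 ratio here is a
    # made-up example value, not taken from the patent.
    return total_duration_s * ratio


def can_obtain_interactive_resources(display_time_s, total_duration_s):
    # Resources are obtainable only while the display time has not
    # exceeded the first threshold (the valid time period for obtaining
    # the interactive resources).
    return display_time_s <= first_threshold(total_duration_s)
```

For a 60-second item with the example ratio, the interactive resources would be obtainable during the first 30 seconds of display and unavailable afterward.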
CN201910447817.6A 2019-05-27 2019-05-27 Multimedia data display method, device, equipment and storage medium Active CN112004134B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910447817.6A CN112004134B (en) 2019-05-27 2019-05-27 Multimedia data display method, device, equipment and storage medium

Publications (2)

Publication Number Publication Date
CN112004134A CN112004134A (en) 2020-11-27
CN112004134B true CN112004134B (en) 2022-12-09

Family

ID=73461697

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910447817.6A Active CN112004134B (en) 2019-05-27 2019-05-27 Multimedia data display method, device, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN112004134B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114116053B (en) * 2021-11-18 2024-03-12 北京达佳互联信息技术有限公司 Resource display method, device, computer equipment and medium
CN114579030A (en) * 2022-03-11 2022-06-03 北京字跳网络技术有限公司 Information stream display method, device, apparatus, storage medium, and program

Citations (6)

Publication number Priority date Publication date Assignee Title
CN106210808A (en) * 2016-08-08 2016-12-07 腾讯科技(深圳)有限公司 Media information put-on method, terminal, server and system
CN106649446A (en) * 2016-09-19 2017-05-10 腾讯科技(深圳)有限公司 Information pushing method and device
CN107071574A (en) * 2017-05-24 2017-08-18 环球智达科技(北京)有限公司 Intelligent television method for page jump
CN109032738A (en) * 2018-07-17 2018-12-18 腾讯科技(深圳)有限公司 Control method for playing multimedia, device, terminal and storage medium
CN109032688A (en) * 2018-06-11 2018-12-18 新疆玖富万卡信息技术有限公司 Page jump and control method and device
CN109766082A (en) * 2017-11-09 2019-05-17 北京京东尚科信息技术有限公司 The method and apparatus that the application program page jumps

Family Cites Families (8)

Publication number Priority date Publication date Assignee Title
CN106484738B (en) * 2015-09-01 2020-08-28 腾讯科技(深圳)有限公司 Page processing method and device
CN105868397B (en) * 2016-04-19 2020-12-01 腾讯科技(深圳)有限公司 Song determination method and device
CN105871921B (en) * 2016-06-12 2019-07-05 腾讯科技(深圳)有限公司 A kind of data processing method and device
CN106303735B (en) * 2016-09-07 2019-04-02 腾讯科技(深圳)有限公司 A kind of barrage display system, method, apparatus and service customer end
CN106383895B (en) * 2016-09-27 2019-08-27 北京金山安全软件有限公司 Information recommendation method and device and terminal equipment
CN108549567B (en) * 2018-04-17 2021-07-27 腾讯科技(深圳)有限公司 Animation display method, device, terminal, server and storage medium
CN108845749B (en) * 2018-06-01 2021-03-09 阿里巴巴(中国)有限公司 Page display method and device
CN109064180A (en) * 2018-07-13 2018-12-21 广州神马移动信息科技有限公司 Comment on method, apparatus and terminal device

Similar Documents

Publication Publication Date Title
CN109246466B (en) Video playing method and device and electronic equipment
CN108737897B (en) Video playing method, device, equipment and storage medium
CN110061900B (en) Message display method, device, terminal and computer readable storage medium
CN110149332B (en) Live broadcast method, device, equipment and storage medium
CN109327608B (en) Song sharing method, terminal, server and system
CN111901658B (en) Comment information display method and device, terminal and storage medium
CN110163066B (en) Multimedia data recommendation method, device and storage medium
CN110149557B (en) Video playing method, device, terminal and storage medium
CN113411680B (en) Multimedia resource playing method, device, terminal and storage medium
CN109275013B (en) Method, device and equipment for displaying virtual article and storage medium
CN110418152B (en) Method and device for carrying out live broadcast prompt
CN112181573A (en) Media resource display method, device, terminal, server and storage medium
CN110139143B (en) Virtual article display method, device, computer equipment and storage medium
CN109618192B (en) Method, device, system and storage medium for playing video
CN113395566B (en) Video playing method and device, electronic equipment and computer readable storage medium
CN111368114B (en) Information display method, device, equipment and storage medium
CN110750734A (en) Weather display method and device, computer equipment and computer-readable storage medium
CN111028566A (en) Live broadcast teaching method, device, terminal and storage medium
CN111083526B (en) Video transition method and device, computer equipment and storage medium
CN114116053A (en) Resource display method and device, computer equipment and medium
CN111459363A (en) Information display method, device, equipment and storage medium
CN113613028A (en) Live broadcast data processing method, device, terminal, server and storage medium
CN112004134B (en) Multimedia data display method, device, equipment and storage medium
CN113032590B (en) Special effect display method, device, computer equipment and computer readable storage medium
CN112616082A (en) Video preview method, device, terminal and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant