CN111031354A - Multimedia playing method, device and storage medium - Google Patents

Multimedia playing method, device and storage medium

Info

Publication number
CN111031354A
Authority
CN
China
Prior art keywords
multimedia
target
playing
data
time point
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201911251005.0A
Other languages
Chinese (zh)
Other versions
CN111031354B (en)
Inventor
韩存爱
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Yayue Technology Co ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd filed Critical Tencent Technology Shenzhen Co Ltd
Priority to CN201911251005.0A priority Critical patent/CN111031354B/en
Publication of CN111031354A publication Critical patent/CN111031354A/en
Application granted granted Critical
Publication of CN111031354B publication Critical patent/CN111031354B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20 Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23 Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/238 Interfacing the downstream path of the transmission network, e.g. adapting the transmission rate of a video stream to network bandwidth; Processing of multiplex streams
    • H04N21/2387 Stream processing in response to a playback request from an end-user, e.g. for trick-play
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47 End-user applications
    • H04N21/472 End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content
    • H04N21/47217 End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content for controlling playback functions for recorded or on-demand content, e.g. using progress bars, mode or play-point indicators or bookmarks
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/80 Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N21/83 Generation or processing of protective or descriptive data associated with content; Content structuring
    • H04N21/845 Structuring of content, e.g. decomposing content into time segments
    • H04N21/8456 Structuring of content, e.g. decomposing content into time segments by decomposing the content in the time domain, e.g. in time segments

Abstract

Embodiments of the present application disclose a multimedia playing method, apparatus, and storage medium. The multimedia is played according to an index file of the multimedia, where the index file includes a first mapping relation between a first multimedia fragment identifier and a playing time period and a second mapping relation between at least one second multimedia fragment identifier and a playing time period; a target multimedia identifier is determined based on the first mapping relation and a jump time point; a target time point of target multimedia data is determined according to the target multimedia identifier, where the target multimedia data corresponds to the target multimedia identifier; a candidate multimedia identifier is determined based on the second mapping relation and the target time point; and jump playing is performed according to the target time point, the target multimedia data, and the candidate multimedia data corresponding to the candidate multimedia identifier. The multimedia jump playing effect can thereby be improved.

Description

Multimedia playing method, device and storage medium
Technical Field
The present application relates to the field of internet technologies, and in particular, to a multimedia playing method, apparatus, and storage medium.
Background
With the rapid development of internet technology, people can acquire multimedia resources such as video or audio on the internet by live streaming or on demand. To facilitate network transmission and multimedia playing, the prior art delivers multimedia resources over HTTP Live Streaming (HLS), an adaptive-bitrate streaming protocol. HLS splits a large multimedia file into many small, independent multimedia segments, each carrying its corresponding playing time period, and the segments are further grouped into several segment sets, such as an audio segment set and a video segment set, according to the media types contained in the multimedia.
During research and practice of the prior art, the inventor of the present application found that when multimedia jump playing is performed in the prior art, the target segment of each segment set of the multimedia is determined directly from the jump time point. The playing time periods of the target segments chosen from different segment sets are therefore not completely consistent, so after the jump the playing time points of the data to be played in each media form do not match, which degrades the multimedia jump playing effect.
Disclosure of Invention
The embodiment of the application provides a multimedia playing method, a multimedia playing device and a storage medium, which can improve the multimedia skip playing effect.
In order to solve the above technical problem, an embodiment of the present application provides the following technical solutions:
The multimedia playing method provided by the embodiments of the present application includes the following steps:
playing multimedia according to an index file of the multimedia, wherein the index file includes a first mapping relation between a first multimedia fragment identifier and a playing time period and a second mapping relation between at least one second multimedia fragment identifier and a playing time period, and, when a jump playing instruction is received, acquiring a jump time point of the multimedia;
determining a target multimedia identifier based on the first mapping relation and the jump time point;
determining a target time point of target multimedia data according to the target multimedia identifier, wherein the target multimedia data corresponds to the target multimedia identifier;
determining a candidate multimedia identifier based on the second mapping relation and the target time point; and
performing jump playing according to the target time point, the target multimedia data, and the candidate multimedia data corresponding to the candidate multimedia identifier.
Correspondingly, an embodiment of the present application further provides a multimedia playing apparatus, including:
the playing module is used for playing the multimedia according to an index file of the multimedia, wherein the index file comprises a first mapping relation between a first multimedia fragment identifier and a playing time period and a second mapping relation between at least one second multimedia fragment identifier and the playing time period;
the acquisition module is used for acquiring the multimedia jump time point when a jump playing instruction is received;
the target identification determining module is used for determining a target multimedia identification based on the first mapping relation and the jumping time point;
the time point determining module is used for determining a target time point of target multimedia data according to the target multimedia identifier, wherein the target multimedia data corresponds to the target multimedia identifier;
a candidate identifier determining module, configured to determine a candidate multimedia identifier based on the second mapping relationship and the target time point;
and a skip playing module, configured to perform skip playing according to the target time point, the target multimedia data, and the candidate multimedia data corresponding to the candidate multimedia identifier.
In some embodiments of the present application, the target identifier determining module is specifically configured to:
determining a target playing time period to which the jumping time point belongs;
and determining a target multimedia identifier corresponding to the target playing time period based on the first mapping relation.
In some embodiments of the present application, the time point determining module is specifically configured to:
according to a predetermined data transmission protocol, carrying out address expansion on the target multimedia identifier to obtain a target multimedia address;
acquiring target multimedia data based on the target multimedia address;
and acquiring a target time point of the target multimedia data from the target multimedia data.
In some embodiments of the present application, the candidate identity determination module is specifically configured to:
determining a candidate playing time period to which the target time point belongs;
and determining candidate multimedia identifications corresponding to the candidate playing time periods based on the second mapping relation.
In some embodiments of the present application, the skip play module includes an acquisition sub-module, a determination sub-module, and a skip play sub-module, wherein,
the obtaining submodule is used for obtaining a third mapping relation between the candidate data packet of the candidate multimedia data and the candidate playing time point;
a determining submodule, configured to determine a data packet based on the third mapping relationship and the candidate time point;
and the skip playing sub-module is used for performing skip playing according to the data packet and a target data packet obtained based on the target multimedia data.
In some embodiments of the present application, the determining submodule is specifically configured to:
confirming a playing time point matched with the target time point according to the candidate playing time point;
and confirming the data packet corresponding to the playing time point matched with the target time point based on the third mapping relation.
In some embodiments of the present application, the skip play sub-module is specifically configured to:
parsing the data packet and a target data packet obtained based on the target multimedia data, respectively, to obtain data and target data;
and performing skip playing according to the data and the target data.
Correspondingly, the embodiment of the present application further provides a terminal, which includes a memory and a processor, where the memory stores an application program, and the processor is configured to run the application program in the memory to execute the multimedia playing method provided in the embodiment of the present application.
Correspondingly, an embodiment of the present application further provides a storage medium, where the storage medium stores a computer program, and the computer program is suitable for being loaded by a processor to execute any one of the multimedia playing methods provided in the embodiment of the present application.
In the embodiments of the present application, multimedia is first played according to an index file of the multimedia, where the index file includes a first mapping relation between a first multimedia fragment identifier and a playing time period and a second mapping relation between at least one second multimedia fragment identifier and a playing time period. When a jump playing instruction is received, a jump time point of the multimedia is acquired; a target multimedia identifier is then determined based on the first mapping relation and the jump time point; a target time point of target multimedia data is determined according to the target multimedia identifier, the target multimedia data corresponding to the target multimedia identifier; a candidate multimedia identifier is determined based on the second mapping relation and the target time point; and finally jump playing is performed according to the target time point, the target multimedia data, and the candidate multimedia data corresponding to the candidate multimedia identifier. Because the target multimedia identifier of the first multimedia fragments is determined from the jump time point, and the candidate multimedia identifier of the second multimedia fragments is determined from the target time point corresponding to the target multimedia identifier, the playing time points of the data to be played in the different media forms remain consistent after the jump, which improves the multimedia jump playing effect.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed to be used in the description of the embodiments are briefly introduced below, and it is obvious that the drawings in the following description are only some embodiments of the present application, and it is obvious for those skilled in the art to obtain other drawings based on these drawings without creative efforts.
Fig. 1 is a schematic scene diagram of a multimedia playing system provided in an embodiment of the present application;
fig. 2 is a schematic flowchart of a multimedia playing method according to an embodiment of the present application;
fig. 3 is an exemplary diagram of an interactive interface of a video playing method provided in an embodiment of the present application;
fig. 4 is an exemplary diagram of an interactive interface of a video playing method provided in an embodiment of the present application;
fig. 5 is a schematic flowchart of a video playing method provided in an embodiment of the present application;
fig. 6 is an exemplary diagram for determining candidate multimedia identifiers in a video playing method provided by an embodiment of the present application;
fig. 7 is a diagram illustrating a logic flow of a video playing method according to an embodiment of the present application;
fig. 8 is a schematic structural diagram of a multimedia playing apparatus according to an embodiment of the present application;
fig. 9 is a schematic structural diagram of a multimedia playing apparatus according to an embodiment of the present application;
fig. 10 is a schematic structural diagram of a terminal according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The embodiments of the present application provide a multimedia playing method, a multimedia playing apparatus, and a storage medium.
As shown in fig. 1, fig. 1 is a schematic diagram of a multimedia playing system provided in an embodiment of the present application. The multimedia playing system may include a multimedia playing apparatus, which may be integrated in a terminal that has a storage unit, is equipped with a microprocessor, has computing capability, and is capable of playing multimedia, such as a tablet computer, a mobile phone, a notebook computer, or a desktop computer. The terminal in fig. 1 acquires an index file of the multimedia, where the index file includes a first mapping relation between a first multimedia fragment identifier and a playing time period and a second mapping relation between at least one second multimedia fragment identifier and a playing time period, and then plays the multimedia according to the index file. When the terminal receives a jump playing instruction triggered by an external input, it acquires a jump time point of the multimedia, determines a target multimedia identifier based on the first mapping relation and the jump time point, determines a target time point of target multimedia data according to the target multimedia identifier (the target multimedia data corresponding to the target multimedia identifier), determines a candidate multimedia identifier based on the second mapping relation and the target time point, and performs jump playing according to the target time point, the target multimedia data, and the candidate multimedia data corresponding to the candidate multimedia identifier.
The multimedia playing system may further include a server, namely the server in fig. 1, which is mainly configured to receive request messages sent by the terminal and return the corresponding content to the terminal. For example, the server may receive an index file acquisition request sent by the terminal and return the index file of the multimedia to the terminal; it may receive a request message sent by the terminal based on the target multimedia address and return the target multimedia data; and it may receive a request message sent by the terminal based on a candidate multimedia address and return the candidate multimedia data, and so on.
It should be noted that the scene schematic diagram of the multimedia playing system shown in fig. 1 is only an example, and the multimedia playing system and the scene described in the embodiment of the present application are for more clearly illustrating the technical solution of the embodiment of the present application, and do not form a limitation on the technical solution provided in the embodiment of the present application.
Detailed descriptions are given below.
In this embodiment, the description is given from the perspective of the multimedia playing apparatus, which may be integrated in a terminal that has a storage unit, is equipped with a microprocessor, has computing capability, and is capable of playing multimedia, such as a server, a tablet computer, a mobile phone, a notebook computer, or a wearable smart device.
As shown in fig. 2, fig. 2 is a schematic flowchart of a multimedia playing method according to an embodiment of the present application.
The multimedia playing method can comprise the following steps:
101. Playing the multimedia according to an index file of the multimedia, wherein the index file includes a first mapping relation between a first multimedia fragment identifier and a playing time period and a second mapping relation between at least one second multimedia fragment identifier and a playing time period.
The multimedia may combine various media forms, including text, images, and sound, and may be displayed on the terminal and used to exchange information through more than one media form. For example, the multimedia may include movies, advertisement videos, variety shows, songs, radio broadcasts, or audiobooks, and may also include electronic invitations, podcasts, radio dramas, and the like.
The index file of the multimedia can be determined based on the way the multimedia is played and the number of bytes it occupies; the index file usually needs to be acquired when the multimedia is obtained over a network in a live or on-demand manner. The number, format, and form of the index files differ with the transmission mode of the multimedia.
For example, if multimedia is played on demand based on the MPEG-4 standard (MP4 format, Moving Picture Experts Group 4), the index file may be composed of a number of boxes, each consisting of a header and data. Different boxes carry different data and serve different functions; for example, one box contains metadata (data that describes other data) and another contains the actual data (the multimedia data used for playing).
For another example, if multimedia is played on demand or live based on a streaming protocol built on a request-response protocol, such as HLS, the index file may take the form of a web address, and the index file may contain a number of overall attributes (such as an identifier of the index file) and a number of fragment identifiers with their attributes (such as the actual duration of each fragment). Index files can be nested: one index file may contain several overall attributes together with several lower-level index files (second-level index files, third-level index files, and so on). The number of index files required for one multimedia item is not fixed and can be set flexibly according to actual requirements and the application scenario. For example, for multimedia containing two media forms (such as audio 1 and video 1), a first-level index file may contain two second-level index files, each of which holds the playing data identifiers and attributes of one media form; alternatively, there may be two first-level index files, each holding the playing data identifiers and attributes of one media form. The index file does not contain the actual playing data; it only contains content such as configuration information and playing data identifiers. To obtain the playing data, the content of the index file must be further parsed and processed, for example by processing the playing data identifiers contained in the index file to acquire the playing data.
The first mapping relation may be the relation between the first multimedia fragment identifiers and their playing time periods, and the second mapping relation may be the relation between the second multimedia fragment identifiers and their playing time periods. A fragment identifier may be represented as numbers, symbols, or the like, and a group of fragment identifiers may carry certain ordering information. Each fragment identifier represents a different piece of playing data and carries the actual duration of that data. Based on the ordering information of the group a fragment identifier belongs to and the actual duration of every identifier in that group, the playing time period of the playing data represented by each identifier can be determined; in other words, one fragment identifier corresponds to one playing time period, which yields the mapping relation between the fragment identifiers in the group and the playing time periods. If a multimedia item involves multiple media forms (such as image, audio, and text) or multiple streams of media (such as audio 1, audio 2, audio 3, and text 1), a mapping relation between the fragment identifiers of the identifier group of each media form and the playing time periods is needed.
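A minimal sketch (an illustration, not taken from the patent text) of how such a mapping between fragment identifiers and playing time periods can be built from the ordering of the identifiers and their actual durations; the identifier names and durations below are hypothetical.

```python
def build_mapping(fragments):
    """fragments: ordered list of (fragment_id, duration_in_seconds)."""
    mapping = {}
    start = 0.0
    for fragment_id, duration in fragments:
        # Each identifier maps to the playing time period [start, start + duration).
        mapping[fragment_id] = (start, start + duration)
        start += duration
    return mapping

# One mapping per media form, e.g. a video fragment group and an audio fragment group.
first_mapping = build_mapping([("v0.ts", 9), ("v1.ts", 10), ("v2.ts", 7)])
second_mapping = build_mapping([("r0.ts", 10), ("r1.ts", 10), ("r2.ts", 6)])
print(first_mapping)  # {'v0.ts': (0.0, 9.0), 'v1.ts': (9.0, 19.0), 'v2.ts': (19.0, 26.0)}
```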
Specifically, the index file of the multimedia may be acquired by sending an index file acquisition request to a server on the network and receiving the index file returned by the server for that request; because servers generally have huge storage, the range of index files, and therefore of playable multimedia data, obtainable in this way is large (for example, online videos and songs may obtain their index files in this way). The index file may also be acquired from a database stored on the terminal; this requires no network connection, can be used offline, saves network resources, and suits special situations, for example playing songs when there is no network or the network is not permitted. The index file may also be acquired within an intranet, which suits special groups and special multimedia files with confidentiality requirements; for example, video or audio related to trade secrets can have its index file acquired within a restricted scope, and so on.
The multimedia index file and the mapping relations it contains are necessary for multimedia playing and skip playing: the multimedia can be played through the index file, and skip playing can be performed based on the mapping relations, fragment identifiers, and other information in the index file.
For example, audio a contains audio 1, audio 2, and text 1, for a total of three media forms, and the three media forms are configured with three index files respectively. The terminal sends request information to the server and receives the audio 1 index file, the audio 2 index file, and the text 1 index file sent by the server. The audio 1 index file contains mapping relation 1 between the first audio fragment identifiers and playing time periods, the audio 2 index file contains mapping relation 2 between the second audio fragment identifiers and playing time periods, and the text 1 index file contains mapping relation 3 between the first text fragment identifiers and playing time periods.
Specifically, the multimedia can be played through a player, and the playing manner is related to the content of the index file: the multimedia may be played through a local player or through a player of an application on the internet. Based on the fragment identifiers, mapping relations, and other content contained in the index file, playing data can be requested from a server or another database, and the multimedia is then played according to the acquired playing data.
For example, audio a is played according to the audio 1 index file, the audio 2 index file, and the text 1 index file, and audio a is played through an online application S (an online application being an application that requires a network connection to use).
102. When a jump playing instruction is received, acquiring a jump time point of the multimedia.
The jump time point may be the time point to which the multimedia is about to jump: when multimedia data that is being played needs to change its current playing state, because of the user's wish or the requirements of the actual application scenario, playing starts from another time point, and that other time point is the jump time point. The jump playing instruction is the trigger for performing the jump playing operation on the multimedia data being played; it may be input externally, or triggered automatically by the terminal according to a predetermined program, and so on. The jump playing instruction may carry the jump time point, or the two may be independent of each other.
Specifically, the jump time point may be obtained from an external input, for example received from the user, or generated and passed to the playing multimedia by a third-party application; it may be retrieved from a locally stored database, for example through a preset script for special requirements such as testing; or it may be sent by a server, for example based on the needs of the application that plays the multimedia or the characteristics of the multimedia, in which case the application receives the jump time point sent by the server, and so on.
The jump time point of the multimedia is the basis on which jump playing is realized; this scheme relies on the jump time point together with the index file to improve the multimedia playing effect.
For example, the current playing time point of audio a is 1 minute 20 seconds, and the profile of the user currently playing audio a, user P, shows that user P is 13 years old. Because the portion of the audio from 1 minute 21 seconds to 1 minute 29 seconds contains content restricted to users aged 18 or older, the server, based on its analysis of user P and audio a, sends a jump time point to the application OO used by user P, and upon receiving the server's jump playing instruction the application OO obtains the jump time point of 1 minute 31 seconds sent by the server.
103. Determining the target multimedia identifier based on the first mapping relation and the jump time point.
The target multimedia identifier may belong to the first multimedia fragment identifiers and represents the multimedia data to be played after the jump. Depending on the multimedia, the media form of the playing data represented by the target identifier may differ: for example, if the multimedia is a movie containing audio 1, video 1, and subtitle 1, the media form represented by the target multimedia identifier may be video 1; if the multimedia is a radio drama containing audio 1 and audio 2, the target multimedia identifier may represent audio 1, and so on.
The target multimedia identifier is the key to completing jump playing for the media form corresponding to the first multimedia fragments: the playing data corresponding to the jump time point is a subset of the playing data represented by the target multimedia identifier, so determining the target multimedia identifier determines the jump playing of that media form.
For example, the audio 1 index file includes 7 fragment identifiers, namely fragment 11, fragment 12, fragment 13, fragment 14, fragment 15, fragment 16, and fragment 17. Through the audio 1 mapping relation (the mapping relation between the audio 1 fragment identifiers and the playing time periods is called the audio 1 mapping relation) and the jump time point (1 minute 31 seconds), the target multimedia identifier in the audio 1 index file is determined to be fragment 13.
In an embodiment, determining the target multimedia identifier based on the first mapping relationship and the jumping time point may include the steps of: and determining a target playing time period to which the jumping time point belongs, and determining a target multimedia identifier corresponding to the target playing time period based on the first mapping relation.
The playing time period may be a set containing multiple time points, and depending on the measurement precision it can be expressed as a set with a different number of time points: with 1 second as the precision unit, 1 minute is a set of 60 time points; with 5 seconds as the unit, 1 minute is a set of 20 time points. In an actual application scenario, the precision unit can be chosen flexibly according to the precision of the jump time point and the actual requirements, as long as the precision of the time point set of the playing time periods covers the jump time point; confirming the target playing time period to which the jump time point belongs then means confirming the playing time period whose time point set contains a time point equal to the jump time point. Alternatively, a playing time period may consist of just two time points, a start time point and an end time point; in that case, confirming the target playing time period to which the jump time point belongs means confirming the playing time period for which the jump time point is greater than or equal to the start time point and less than or equal to the end time point.
The first mapping relation is the relation between the first multimedia fragment identifiers and the playing time periods: each of the first multimedia fragment identifiers corresponds to a playing time period, and within the mapping relation the fragment identifier can be confirmed from its playing time period and vice versa. In this embodiment, the target multimedia identifier can therefore be confirmed from the first mapping relation and the target playing time period.
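A hedged sketch of the lookup just described: given a mapping relation from fragment identifiers to playing time periods (modelled here as half-open [start, end) intervals, an assumption of this sketch) and a jump time point, it returns the identifier whose playing time period contains that point. The mapping values are hypothetical.

```python
def find_fragment(mapping, time_point):
    # Return the fragment identifier whose playing time period contains time_point.
    for fragment_id, (start, end) in mapping.items():
        if start <= time_point < end:
            return fragment_id
    return None

# Hypothetical first mapping relation (video fragments and their playing time periods).
first_mapping = {"v0.ts": (0, 9), "v1.ts": (9, 19), "v2.ts": (19, 26)}
target_id = find_fragment(first_mapping, 13)  # -> "v1.ts"
```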
104. Determining a target time point of the target multimedia data according to the target multimedia identifier, wherein the target multimedia data corresponds to the target multimedia identifier.
The target multimedia data corresponds to the target multimedia identifier, i.e. it is the data actually played, and its media form matches that of the target multimedia identifier. For example, if the target multimedia identifier is fragment 5 under the video 1 index file, the target multimedia data may be video segment 55; if the target multimedia identifier is fragment 2 under the audio 1 index file, the target multimedia data may be audio segment 22, and so on. The target time point is attribute information of the target multimedia data, so once the target multimedia data is acquired, the target time point can be determined; for example, video segment 55 may carry its attribute information, i.e. a target time point of 1 minute 25 seconds.
Jump playing in one media form is obtained through the jump time point; once jump playing in that media form is confirmed, the target time point of the data to be played in that media form is used as the reference to ensure jump playing in the other media forms of the multimedia.
For example, the target multimedia data, audio segment 133, is obtained according to fragment 13, and the target time point carried by audio segment 133 is confirmed to be 1 minute 30 seconds.
In one embodiment, determining a target time point of target multimedia data according to the target multimedia identifier may include: according to a predetermined data transmission protocol, carrying out address expansion on the target multimedia identifier to obtain a target multimedia address; acquiring target multimedia data based on the target multimedia address; and acquiring a target time point of the target multimedia data from the target multimedia data.
The data transmission protocol is the set of rules that must be followed during data transmission; selecting a data transmission protocol and preparing the data according to it ensures that the data can be transmitted effectively. For example, the Hypertext Transfer Protocol (HTTP) is a data transmission protocol that specifies the rules for communication between a terminal and an internet server and transfers data over the internet. The target multimedia address may be a network address that can be used directly to request data. Address expansion of the target multimedia identifier according to the predetermined data transmission protocol yields the target multimedia address; how the identifier is expanded into an address may vary with the predetermined data transmission protocol, and the protocol can be selected flexibly based on the application scenario and actual requirements, which is not limited here.
Obtaining the target multimedia data based on the target multimedia address may mean requesting the address from a specific object (the specific object, such as a server, can be determined from the predetermined data transmission protocol and the target multimedia address), receiving the target multimedia data located at that address, and then obtaining the target time point of the target multimedia data from the target multimedia data itself.
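A sketch of this address expansion and retrieval, under the assumptions that the segments are served over HTTP, that "address expansion" means joining the fragment identifier onto the base URL of the index file, and that the base URL below is purely illustrative. How the target time point is then read out of the returned segment (e.g. from the timestamp carried by its first data packet, as in step 205 of the embodiment) depends on the container format and is left out here.

```python
from urllib.parse import urljoin
from urllib.request import urlopen

INDEX_BASE_URL = "https://example.com/media/"  # hypothetical base address of the index file

def expand_address(fragment_id: str) -> str:
    # Address expansion according to the predetermined data transmission protocol (HTTP here).
    return urljoin(INDEX_BASE_URL, fragment_id)

def fetch_target_multimedia_data(fragment_id: str) -> bytes:
    # Request the expanded address from the server and receive the segment bytes.
    with urlopen(expand_address(fragment_id)) as response:
        return response.read()
```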
105. Determining the candidate multimedia identifier based on the second mapping relation and the target time point.
The candidate multimedia identifier may belong to the second multimedia fragment identifiers and represents multimedia data to be played after the jump. There may be one or more second mapping relations; for example, if the media forms of the multimedia are audio 1, audio 2, and text 1, and audio 1 is used to determine the target data, the second mapping relations may include an audio 2 mapping relation and a text 1 mapping relation.
The candidate multimedia identifier is the key information for completing jump playing in the media forms corresponding to the second multimedia fragments, and determining the candidate multimedia identifier from the target time point plays a decisive role in eliminating problems such as stalling and audio-video desynchronization during jump playing.
For example, the audio 2 index file includes 5 fragment identifiers, namely fragment 21, fragment 22, fragment 23, fragment 24, and fragment 25. Through the audio 2 mapping relation (the mapping relation between the audio 2 fragment identifiers and the playing time periods is called the audio 2 mapping relation) and the target time point (1 minute 30 seconds), the candidate multimedia identifier in the audio 2 index file is determined to be fragment 22;
the text 1 index file includes 8 fragment identifiers, namely fragment 31, fragment 32, fragment 33, fragment 34, fragment 35, fragment 36, fragment 37, and fragment 38. Through the text 1 mapping relation (the mapping relation between the text 1 fragment identifiers and the playing time periods is called the text 1 mapping relation) and the target time point (1 minute 30 seconds), the candidate multimedia identifier in the text 1 index file is determined to be fragment 35.
In an embodiment, determining the candidate multimedia identifier based on the second mapping relation and the target time point may include the steps of: determining the candidate playing time period to which the target time point belongs, and determining the candidate multimedia identifier corresponding to that candidate playing time period based on the second mapping relation. For example, the candidate playing time period X to which the target time point of 1 minute 30 seconds belongs is determined to be 1 minute 28 seconds to 1 minute 35 seconds, and according to the second mapping relation the multimedia identifier corresponding to X is determined to be x, which is the candidate multimedia identifier.
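This is the same containment lookup as for the target identifier, only applied to the second mapping relation and the target time point instead of the jump time point. An illustrative sketch with hypothetical values (the interval convention is again an assumption):

```python
second_mapping = {"r0.ts": (0, 10), "r1.ts": (10, 20), "r2.ts": (20, 26)}

def find_fragment(mapping, time_point):
    for fragment_id, (start, end) in mapping.items():
        if start <= time_point < end:
            return fragment_id
    return None

candidate_id = find_fragment(second_mapping, 9)  # target time point of 9 s -> "r0.ts"
```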
106. Performing skip playing according to the target time point, the target multimedia data, and the candidate multimedia data corresponding to the candidate multimedia identifier.
For example, jump playing of audio a can be implemented based on the target time point of 1 minute 30 seconds, the target multimedia data (audio segment 133), the candidate multimedia data 222 corresponding to the candidate multimedia identifier fragment 22, and the candidate multimedia data 355 corresponding to the candidate multimedia identifier fragment 35.
In an embodiment, before performing skip playing according to the target time point, the target multimedia data, and the candidate multimedia data corresponding to the candidate multimedia identifier, the method may further include the steps of: according to a predetermined data transmission protocol, address expansion is carried out on the candidate multimedia identification to obtain a candidate multimedia address; candidate multimedia data is obtained based on the candidate multimedia addresses.
Specifically, according to the predetermined data transmission protocol, address expansion is performed on the candidate multimedia identifier to obtain a candidate multimedia address; how the identifier is expanded into an address may vary with the predetermined data transmission protocol, which can be selected flexibly based on the application scenario and actual requirements and is not limited here. Obtaining the candidate multimedia data based on the candidate multimedia address may mean requesting the address from a specific object (the specific object, such as a server, can be determined from the predetermined data transmission protocol and the candidate multimedia address) and receiving the candidate multimedia data located at that address.
In an embodiment, performing skip playing according to the target time point, the target multimedia data, and the candidate multimedia data corresponding to the candidate multimedia identifier may include:
(1) Acquiring a third mapping relation between the candidate data packets of the candidate multimedia data and candidate playing time points.
For example, for candidate multimedia data whose media form is video, each candidate data packet may contain one frame image, and the packet carries the playing time point corresponding to that frame image; the playing time point may be the starting or the ending playing time point of the frame image, and which one to use can be chosen flexibly during application according to the actual situation.
(2) Determining the data packet based on the third mapping relation and the candidate time point.
In an embodiment, determining the data packet based on the third mapping relationship and the candidate time point may include the steps of: confirming a playing time point matched with the target time point according to the candidate playing time point; and confirming the data packet corresponding to the playing time point matched with the target time point based on the third mapping relation.
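A minimal sketch of the packet selection just described, assuming the third mapping relation is a list of (candidate_packet, candidate_playing_time_point) pairs and that a "matching" playing time point is the latest candidate point not exceeding the target time point (exact equality, as in the embodiment below, is a special case). The matching rule and the packet labels are assumptions.

```python
def select_packet(third_mapping, target_time_point):
    matched = None
    for packet, play_time in third_mapping:
        # Keep the candidate playing time point closest to (and not after) the target.
        if play_time <= target_time_point and (matched is None or play_time > matched[1]):
            matched = (packet, play_time)
    return matched[0] if matched else None

# Hypothetical third mapping relation: ten packets, one second apart.
third_mapping = [(f"packet {i + 1}", i) for i in range(10)]
print(select_packet(third_mapping, 9))  # -> "packet 10"
```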
(3) Performing skip playing according to the data packet and a target data packet obtained based on the target multimedia data.
In an embodiment, performing skip play according to the data packet and a target data packet obtained based on target multimedia data may include the steps of: analyzing the data packet and a target data packet obtained based on the target multimedia data respectively to obtain data and target data; and skipping playing is carried out according to the data and the target data.
Specifically, the data packet and the target data packet are the playing data required for jump playing; they may be two packet sets (one data packet set and one target data packet set) or more (one target data packet set and several data packet sets). Parsing the data packet and the target data packet obtained based on the target multimedia data means parsing the data they contain, for example decoding them with a suitable decoder or decompressing them, to obtain the data and the target data. Jump playing is then performed according to the data and the target data, which are handled according to their media forms: if the media form is video, the corresponding data or target data can be sent to a device for rendering and playing; if the media form is audio, the corresponding data or target data can be sent to a speaker for playback, and so on. It should be noted that the data and the target data need to be played at the same time point to ensure normal playback.
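A schematic sketch of this parse-then-play step. The decode and output functions below are placeholders standing in for a real decoder, a rendering device, and a speaker; only the control flow (decode both packets, then hand them to their outputs for the same playing time point) reflects the text above.

```python
def decode(packet: bytes) -> bytes:
    # Placeholder: a real implementation would decode or decompress the packet here.
    return packet

def send_to_speaker(pcm: bytes, time_point: float) -> None:
    print(f"audio @ {time_point}s: {len(pcm)} bytes")   # placeholder audio output

def send_to_renderer(frame: bytes, time_point: float) -> None:
    print(f"video @ {time_point}s: {len(frame)} bytes")  # placeholder video output

def play_jump(audio_packet: bytes, video_packet: bytes, time_point: float) -> None:
    data = decode(audio_packet)          # data obtained from the (audio) data packet
    target_data = decode(video_packet)   # target data obtained from the target data packet
    # Both outputs are driven for the same time point so playback stays in sync.
    send_to_speaker(data, time_point)
    send_to_renderer(target_data, time_point)
```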
In an embodiment, after the data packet and the target data packet obtained based on the target multimedia data are parsed to obtain the data and the target data, the method may further include the step of: storing the data and the target data into a blockchain.
A blockchain is a new application mode of computer technologies such as distributed data storage, peer-to-peer transmission, consensus mechanisms, and encryption algorithms. A blockchain is essentially a decentralized database: a chain of data blocks associated with one another by cryptographic methods, where each data block contains the information of a batch of network transactions and is used to verify the validity (anti-counterfeiting) of that information and to generate the next block. A blockchain may include a blockchain underlying platform, a platform product service layer, and an application service layer.
The blockchain underlying platform may include processing modules such as user management, basic services, smart contracts, and operation monitoring. The user management module is responsible for the identity information management of all blockchain participants, including the generation and maintenance of public and private keys (account management), key management, and maintenance of the correspondence between users' real identities and their blockchain addresses (authority management); where authorized, it supervises and audits the transactions of certain real identities and provides risk-control rule configuration (risk-control audit). The basic service module is deployed on all blockchain node devices and is used to verify the validity of service requests and, after consensus on a valid request is reached, record it to storage; for a new service request, the basic service first performs interface adaptation, parsing, and authentication (interface adaptation), then encrypts the service information through a consensus algorithm (consensus management), transmits it completely and consistently to the shared ledger after encryption (network communication), and records and stores it. The smart contract module is responsible for contract registration and issuance, contract triggering, and contract execution; developers can define contract logic through a programming language, publish it onto the blockchain (contract registration), and, according to the logic of the contract terms, invoke keys or other events to trigger execution and complete the contract logic, while the module also provides functions for upgrading and cancelling contracts. The operation monitoring module is mainly responsible for deployment during product release, configuration modification, contract setting, and cloud adaptation, as well as the visual output of real-time states during product operation, such as alarms, monitoring network conditions, and monitoring the health status of node devices.
The platform product service layer provides basic capability and an implementation framework of typical application, and developers can complete block chain implementation of business logic based on the basic capability and the characteristics of the superposed business. The application service layer provides the application service based on the block chain scheme for the business participants to use.
As can be seen from the above, in the embodiments of the present application multimedia is first played according to an index file of the multimedia, where the index file includes a first mapping relation between a first multimedia fragment identifier and a playing time period and a second mapping relation between at least one second multimedia fragment identifier and a playing time period. When a jump playing instruction is received, a jump time point of the multimedia is acquired; a target multimedia identifier is then determined based on the first mapping relation and the jump time point; a target time point of target multimedia data is determined according to the target multimedia identifier, the target multimedia data corresponding to the target multimedia identifier; a candidate multimedia identifier is determined based on the second mapping relation and the target time point; and finally jump playing is performed according to the target time point, the target multimedia data, and the candidate multimedia data corresponding to the candidate multimedia identifier. The target multimedia identifier of the first multimedia fragments is determined from the jump time point, and the candidate multimedia identifier of the second multimedia fragments is determined from the target time point corresponding to the target multimedia identifier, so the playing time points of the data to be played in the different media forms remain consistent after the jump, which improves the multimedia jump playing effect.
The method described in the above embodiments is further illustrated in detail by way of example.
This embodiment is described by taking the case where the multimedia playing apparatus is integrated in a terminal as an example.
In this embodiment, a video playing method is described in detail by taking as an example playing based on the HLS protocol, with the multimedia being a video containing two media forms. For example, referring to fig. 3 and 4, fig. 3 shows that the current playing time point of a video M with a length of 30 seconds is 5 seconds, i.e. the content at the 5th second of video M is being played, and the jump time point of video M is determined to be 13 seconds; fig. 4 shows jump playing performed based on the determined jump time point of 13 seconds, i.e. the content at the 13th second of video M is played.
As shown in fig. 5, fig. 5 is a schematic flow chart of a video playing method according to the present application. The video playing method can comprise the following steps:
201. The terminal acquires an index file of the video, wherein the index file includes a first mapping relation between a first multimedia fragment identifier and a playing time period and a second mapping relation between a second multimedia fragment identifier and a playing time period.
For example, this embodiment is based on the HLS protocol, and the index files may include video.m3u8 and radio.m3u8. video.m3u8 may contain the fragment identifiers v0.ts, v1.ts, and v2.ts, where v0.ts corresponds to a duration of 9 seconds, v1.ts to 10 seconds, and v2.ts to 7 seconds. In video.m3u8, the playing time period corresponding to each fragment identifier is determined from the identifiers and their durations, giving the first mapping relation between fragment identifiers and playing time periods: v0.ts corresponds to 0 to 8 seconds, v1.ts to 9 to 18 seconds, and v2.ts to 19 to 25 seconds.
radio.m3u8 may contain the fragment identifiers r0.ts, r1.ts, and r2.ts, where r0.ts corresponds to a duration of 10 seconds, r1.ts to 10 seconds, and r2.ts to 6 seconds. In radio.m3u8, the playing time period corresponding to each fragment identifier is determined from the identifiers and their durations, giving the second mapping relation between fragment identifiers and playing time periods: r0.ts corresponds to 0 to 9 seconds, r1.ts to 10 to 19 seconds, and r2.ts to 20 to 25 seconds.
The terminal may obtain video.m3u8 and radio.m3u8 by requesting their addresses from the server and obtaining, from the content at those addresses, the first mapping relation, the second mapping relation, and other information (for example, the #EXTM3U tag that identifies the index file).
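An illustrative sketch of how the terminal could derive the two mapping relations from such playlists. The playlist text below is a hypothetical reconstruction consistent with the durations given in this embodiment (9/10/7 seconds for video, 10/10/6 seconds for audio), not the actual files.

```python
VIDEO_M3U8 = """#EXTM3U
#EXTINF:9,
v0.ts
#EXTINF:10,
v1.ts
#EXTINF:7,
v2.ts
"""

RADIO_M3U8 = """#EXTM3U
#EXTINF:10,
r0.ts
#EXTINF:10,
r1.ts
#EXTINF:6,
r2.ts
"""

def mapping_from_m3u8(text):
    """Return {fragment_id: (start, end)} from #EXTINF durations and ordering."""
    mapping, start, duration = {}, 0.0, None
    for line in text.splitlines():
        line = line.strip()
        if line.startswith("#EXTINF:"):
            # Duration precedes the fragment identifier on the next line.
            duration = float(line[len("#EXTINF:"):].rstrip(",").split(",")[0])
        elif line and not line.startswith("#") and duration is not None:
            mapping[line] = (start, start + duration)
            start += duration
            duration = None
    return mapping

first_mapping = mapping_from_m3u8(VIDEO_M3U8)   # {'v0.ts': (0.0, 9.0), ...}
second_mapping = mapping_from_m3u8(RADIO_M3U8)  # {'r0.ts': (0.0, 10.0), ...}
```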
202. The terminal plays the video according to the index files.
For example, to play the video according to video.m3u8 and radio.m3u8, the video data and audio data corresponding to v0.ts and r0.ts may be obtained, the video data sent to the player for rendering and playing, and the audio data sent to the speaker for playing.
203. When a jump playing instruction is received, the terminal acquires the jump time point of the video.
For example, when a jump play instruction is received, the terminal acquires a jump time point 13 seconds input by the user.
204. The terminal determines the target multimedia identifier based on the first mapping relation and the jump time point.
For example, referring to fig. 6, the terminal finds, among the playing time periods in the first mapping relation (each playing time period having a start time point and an end time point), the playing time period whose start time point is not later than the jump time point and whose end time point is not earlier than the jump time point, namely the playing time period of 9 seconds to 18 seconds, and then determines from the first mapping relation the fragment identifier v1.ts corresponding to the playing time period of 9 seconds to 18 seconds; that is, the target multimedia identifier is v1.ts.
205. The terminal determines a target time point of the target multimedia data according to the target multimedia identifier, wherein the target multimedia data corresponds to the target multimedia identifier.
For example, referring to fig. 6, the terminal may perform address expansion on the target multimedia identifier v1.ts based on the HTTP protocol to obtain a target multimedia address, request the target multimedia address from the server to obtain the target multimedia data, and confirm that the play time point carried by the first data packet of the target multimedia data is 9 seconds, where the target time point is 9 seconds.
206. The terminal determines the candidate multimedia identifier based on the second mapping relation and the target time point.
For example, referring to fig. 6, the terminal finds, among the playing time periods in the second mapping relation (each playing time period having a start time point and an end time point), the playing time period to which the target time point of 9 seconds belongs, namely the playing time period of 0 to 9 seconds, and then determines from the second mapping relation the fragment identifier r0.ts corresponding to the playing time period of 0 to 9 seconds; that is, the candidate multimedia identifier is r0.ts.
207. The terminal determines candidate multimedia data according to the candidate multimedia identifier.
For example, the terminal may perform address expansion on the candidate multimedia identifier r0.ts based on the HTTP protocol to obtain a candidate multimedia address, and request the candidate multimedia address from the server to obtain candidate multimedia data.
208. The terminal acquires a third mapping relation between the candidate data packets of the candidate multimedia data and candidate playing time points.
For example, the candidate multimedia data consists of 10 candidate data packets (packet 1, packet 2, packet 3, packet 4, packet 5, packet 6, packet 7, packet 8, packet 9, and packet 10), and each candidate data packet carries a candidate playing time point (the candidate playing time point being the start time point of the packet), so the third mapping relation between candidate data packets and candidate playing time points is: packet 1 corresponds to 0 seconds, packet 2 to 1 second, packet 3 to 2 seconds, packet 4 to 3 seconds, packet 5 to 4 seconds, packet 6 to 5 seconds, packet 7 to 6 seconds, packet 8 to 7 seconds, packet 9 to 8 seconds, and packet 10 to 9 seconds.
209. And the terminal determines the data packet based on the third mapping relation and the candidate time point.
For example, the terminal confirms the candidate playing time point in the third mapping relationship that is identical to the candidate time point, namely 9 seconds, and then determines the corresponding data packet according to the third mapping relationship, namely packet 10.
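Steps 208 and 209 can be pictured as building a packet-to-time table and then selecting the entry that matches the candidate time point. A minimal sketch with the ten packets of this example; the packet names and one-second spacing are taken from the text above.

    # third mapping relation: candidate data packet -> candidate playing time point
    candidate_packets = ["packet %d" % i for i in range(1, 11)]
    third_mapping = {packet: float(i) for i, packet in enumerate(candidate_packets)}
    # {"packet 1": 0.0, "packet 2": 1.0, ..., "packet 10": 9.0}

    def select_packet(mapping, t):
        # return the candidate data packet whose candidate playing time point equals t
        for packet, start in mapping.items():
            if start == t:
                return packet
        return None

    print(select_packet(third_mapping, 9.0))   # -> packet 10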
210. And the terminal analyzes the data packet and a target data packet obtained based on the target multimedia data to obtain data and target data.
For example, the target data packet is obtained from the target multimedia data: the target multimedia data includes packet A, packet B, packet C, packet D, and packet E, and the target data packet may be packet A. Packet 10 and packet A are then decoded to obtain the data and the target data, respectively.
211. And the terminal skips and plays according to the data and the target data.
For example, the terminal sends the data to the speaker and the target data to the player at the same time point, so that the effect of jumping to 13 seconds of the video for playing can be achieved.
As shown in fig. 7, the user's drag time point is first fed into the video drag logic to obtain the target time point, which corresponds to step 203, step 204, and step 205 of this embodiment; the target time point is then fed into the audio drag logic, which corresponds to step 205, step 206, step 207, step 208, and step 209; the audio data packet and the video data packet are then decoded to obtain audio data and video data, which corresponds to step 210; finally, the audio data is sent to the speaker and the video data is sent to the player for rendering, with the time points at which the audio data and the video data are delivered kept consistent, which corresponds to step 211. These are the operation steps of jump playing.
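Putting the pieces together, the fig. 7 flow can be sketched as one seek routine that runs the video drag logic, feeds its result into the audio drag logic, and hands both decoded streams to their sinks with the same time point. The helpers and mappings come from the earlier sketches; decode and the speaker/player calls are hypothetical stand-ins for the terminal's real decoder and output components.

    def seek(jump_time, first_mapping, second_mapping):
        # video drag logic (steps 203-205)
        target_id = find_segment(first_mapping, jump_time)             # e.g. "v1.ts"
        target_time = first_mapping[target_id][0]                      # stand-in for the first
                                                                       # packet's play time point
        # audio drag logic (steps 206-209)
        candidate_id = find_audio_segment(second_mapping, target_time) # e.g. "r0.ts"
        return target_id, candidate_id, target_time

    # target_id, candidate_id, t = seek(13, first_mapping, second_mapping)
    # audio_data = decode(candidate_packet)      # step 210, hypothetical decoder
    # video_data = decode(target_packet)
    # speaker.play(audio_data, at=t)             # step 211: both sinks receive the
    # player.render(video_data, at=t)            # same time point, keeping A/V in sync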
In the embodiment of the application, the terminal first obtains an index file of the video, where the index file includes a first mapping relationship between a first multimedia fragment identifier and a playing time period and a second mapping relationship between a second multimedia fragment identifier and the playing time period, and then plays the video according to the index file. When a jump playing instruction is received, the terminal obtains the jump time point of the video, determines the target multimedia identifier based on the first mapping relationship and the jump time point, and determines the target time point of the target multimedia data according to the target multimedia identifier, where the target multimedia data corresponds to the target multimedia identifier. The terminal then determines the candidate multimedia identifier based on the second mapping relationship and the target time point, determines the candidate multimedia data according to the candidate multimedia identifier, obtains the third mapping relationship between the candidate data packets of the candidate multimedia data and the candidate playing time points, determines the data packet based on the third mapping relationship and the candidate time point, parses the data packet and the target data packet obtained based on the target multimedia data respectively to obtain the data and the target data, and performs jump playing according to the data and the target data.
The target multimedia identifier of the first multimedia fragment is determined through the skip time point, and then the candidate multimedia identifier of the second multimedia fragment is determined through the target time point corresponding to the target multimedia identifier.
In order to better implement the multimedia playing method provided by the embodiments of the present application, an embodiment of the present application further provides a device based on the multimedia playing method. The terms used below have the same meanings as in the multimedia playing method described above, and for specific implementation details, reference may be made to the description in the method embodiments.
As shown in fig. 8, fig. 8 is a schematic structural diagram of a multimedia playing apparatus according to an embodiment of the present application, where the multimedia playing apparatus may include a playing module 301, an obtaining module 302, a target identifier determining module 303, a time point determining module 304, a candidate identifier determining module 305, and a skip playing module 306, as follows:
the playing module 301 is configured to play the multimedia according to an index file of the multimedia, where the index file includes a first mapping relationship between a first multimedia fragment identifier and a playing time period, and a second mapping relationship between at least one second multimedia fragment identifier and the playing time period;
an obtaining module 302, configured to obtain a multimedia jump time point when a jump play instruction is received;
a target identifier determining module 303, configured to determine a target multimedia identifier based on the first mapping relationship and the skip time point;
a time point determining module 304, configured to determine a target time point of target multimedia data according to a target multimedia identifier, where the target multimedia data corresponds to the target multimedia identifier;
a candidate identifier determining module 305, configured to determine a candidate multimedia identifier based on the second mapping relationship and the target time point;
and a skip playing module 306, configured to skip playing according to the target time point, the target multimedia data, and the candidate multimedia data corresponding to the candidate multimedia identifier.
In some embodiments of the present application, the target identifier determining module 303 is specifically configured to:
determining a target playing time period to which the jumping time point belongs;
and determining a target multimedia identifier corresponding to the target playing time period based on the first mapping relation.
In some embodiments of the present application, the time point determining module 304 is specifically configured to:
according to a predetermined data transmission protocol, carrying out address expansion on the target multimedia identifier to obtain a target multimedia address;
acquiring target multimedia data based on the target multimedia address;
and acquiring a target time point of the target multimedia data from the target multimedia data.
In some embodiments of the present application, the candidate identity determining module 305 is specifically configured to:
determining a candidate playing time period to which the target time point belongs;
and determining candidate multimedia identifications corresponding to the candidate playing time periods based on the second mapping relation.
As shown in fig. 9, in some embodiments of the present application, the jump playing module 306 includes an obtaining sub-module 3061, a determining sub-module 3062, and a jump playing sub-module 3063, wherein,
the obtaining submodule 3061 is configured to obtain a third mapping relationship between the candidate data packet of the candidate multimedia data and the candidate playing time point;
a determining submodule 3062 for determining the data packet based on the third mapping relation and the candidate time point;
and a skip playing submodule 3063, configured to perform skip playing according to the data packet and a target data packet obtained based on the target multimedia data.
In some embodiments of the present application, the determination submodule 3062 is specifically configured to:
confirming a playing time point matched with the target time point according to the candidate playing time point;
and confirming the data packet corresponding to the playing time point matched with the target time point based on the third mapping relation.
In some embodiments of the present application, the jump playing submodule 3063 is specifically configured to:
analyzing the data packet and a target data packet obtained based on the target multimedia data respectively to obtain data and target data;
and skipping playing is carried out according to the data and the target data.
In this embodiment, the playing module 301 first plays the multimedia according to an index file of the multimedia, where the index file includes a first mapping relationship between a first multimedia fragment identifier and a playing time period and a second mapping relationship between at least one second multimedia fragment identifier and the playing time period. When a jump playing instruction is received, the obtaining module 302 obtains the jump time point of the multimedia, the target identifier determining module 303 determines the target multimedia identifier based on the first mapping relationship and the jump time point, the time point determining module 304 determines the target time point of the target multimedia data according to the target multimedia identifier, where the target multimedia data corresponds to the target multimedia identifier, the candidate identifier determining module 305 determines the candidate multimedia identifier based on the second mapping relationship and the target time point, and finally the skip playing module 306 performs jump playing according to the target time point, the target multimedia data, and the candidate multimedia data corresponding to the candidate multimedia identifier. The target multimedia identifier of the first multimedia fragment is determined through the jump time point, and the candidate multimedia identifier of the second multimedia fragment is determined through the target time point corresponding to the target multimedia identifier.
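For illustration only, the module layout of fig. 8 and fig. 9 could be mirrored in code by a class that owns one method per numbered module, with the skip playing module 306 holding the three submodules 3061 to 3063. All names and signatures below are assumptions for the sketch, not the patent's apparatus.

    class JumpPlayingModule:                                     # module 306
        def acquire_third_mapping(self, candidate_data): ...     # obtaining submodule 3061
        def determine_packet(self, third_mapping, t): ...        # determining submodule 3062
        def jump_play(self, packet, target_packet): ...          # jump playing submodule 3063

    class MultimediaPlayingApparatus:
        def __init__(self):
            self.jump_playing = JumpPlayingModule()               # module 306
        def play(self, index_file): ...                           # playing module 301
        def get_jump_time(self, instruction): ...                 # obtaining module 302
        def determine_target_id(self, first_mapping, t): ...      # target identifier module 303
        def determine_target_time(self, target_id): ...           # time point module 304
        def determine_candidate_id(self, second_mapping, t): ...  # candidate identifier module 305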
In the above embodiments, the descriptions of the embodiments have respective emphasis, and parts that are not described in detail in a certain embodiment may refer to the above detailed description of the multimedia playing method, which is not repeated here.
Accordingly, an embodiment of the present application also provides a terminal, as shown in fig. 10, which may include a Radio Frequency (RF) circuit 601, a memory 602 including one or more computer-readable storage media, an input unit 603, a display unit 604, a sensor 605, an audio circuit 606, a Wireless Fidelity (WiFi) module 607, a processor 608 including one or more processing cores, and a power supply 609. Those skilled in the art will appreciate that the terminal structure shown in fig. 10 is not intended to be limiting, and the terminal may include more or fewer components than those shown, combine some components, or use a different arrangement of components. Wherein:
the RF circuit 601 may be used for receiving and transmitting signals during a message transmission or communication process, and in particular, for receiving downlink messages from a base station and then processing the received downlink messages by one or more processors 608; in addition, data relating to uplink is transmitted to the base station. In general, the RF circuit 601 includes, but is not limited to, an antenna, at least one Amplifier, a tuner, one or more oscillators, a Subscriber Identity Module (SIM) card, a transceiver, a coupler, a Low Noise Amplifier (LNA), a duplexer, and the like. In addition, the RF circuit 601 may also communicate with networks and other devices via wireless communications. The wireless communication may use any communication standard or protocol, including but not limited to Global System for Mobile communications (GSM), General Packet Radio Service (GPRS), Code Division Multiple Access (CDMA), Wideband Code Division Multiple Access (WCDMA), Long Term Evolution (LTE), email, Short Message Service (SMS), and the like.
The memory 602 may be used to store software programs and modules, and the processor 608 executes various functional applications and data processing by operating the software programs and modules stored in the memory 602. The memory 602 may mainly include a program storage area and a data storage area, wherein the program storage area may store an operating system, an application program required by at least one function (such as a sound playing function, an image playing function, etc.), and the like; the data storage area may store data (such as audio data, a phonebook, etc.) created according to the use of the terminal, etc. Further, the memory 602 may include high speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid state storage device. Accordingly, the memory 602 may also include a memory controller to provide the processor 608 and the input unit 603 access to the memory 602.
The input unit 603 may be used to receive input numeric or character information and generate keyboard, mouse, joystick, optical or trackball signal inputs related to user settings and function control. In particular, in one specific embodiment, the input unit 603 may include a touch-sensitive surface as well as other input devices. The touch-sensitive surface, also referred to as a touch display screen or a touch pad, may collect touch operations by a user on or near it (such as operations by the user on or near the touch-sensitive surface using a finger, a stylus, or any other suitable object or attachment) and drive the corresponding connection device according to a preset program. Optionally, the touch-sensitive surface may include two parts: a touch detection device and a touch controller. The touch detection device detects the touch position of the user, detects the signal brought by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into touch point coordinates, sends the coordinates to the processor 608, and can receive and execute commands sent by the processor 608. In addition, the touch-sensitive surface may be implemented in various types such as resistive, capacitive, infrared, and surface acoustic wave types. The input unit 603 may include other input devices in addition to the touch-sensitive surface. In particular, other input devices may include, but are not limited to, one or more of a physical keyboard, function keys (such as volume control keys, switch keys, etc.), a trackball, a mouse, a joystick, and the like.
The display unit 604 may be used to display information input by or provided to the user and various graphical user interfaces of the terminal, which may be made up of graphics, text, icons, video, and any combination thereof. The Display unit 604 may include a Display panel, and optionally, the Display panel may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED), or the like. Further, the touch-sensitive surface may overlay the display panel, and when a touch operation is detected on or near the touch-sensitive surface, the touch operation is transmitted to the processor 608 to determine the type of touch event, and the processor 608 then provides a corresponding visual output on the display panel according to the type of touch event. Although in FIG. 10 the touch sensitive surface and the display panel are two separate components to implement input and output functions, in some embodiments the touch sensitive surface may be integrated with the display panel to implement input and output functions.
The terminal may also include at least one sensor 605, such as a light sensor, motion sensor, and other sensors. Specifically, the light sensor may include an ambient light sensor that may adjust the brightness of the display panel according to the brightness of ambient light, and a proximity sensor that may turn off the display panel and/or the backlight when the terminal is moved to the ear. As one of the motion sensors, the gravity acceleration sensor can detect the magnitude of acceleration in each direction (generally, three axes), detect the magnitude and direction of gravity when the terminal is stationary, and can be used for applications of recognizing terminal gestures (such as horizontal and vertical screen switching, related games, magnetometer gesture calibration), vibration recognition related functions (such as pedometer and tapping), and the like; as for other sensors such as a gyroscope, a barometer, a hygrometer, a thermometer, and an infrared sensor, which can be configured in the terminal, detailed description is omitted here.
The audio circuit 606, a speaker, and a microphone may provide an audio interface between the user and the terminal. The audio circuit 606 may transmit the electrical signal converted from the received audio data to the speaker, which converts it into a sound signal for output; on the other hand, the microphone converts the collected sound signal into an electrical signal, which is received by the audio circuit 606 and converted into audio data; the audio data is then output to the processor 608 for processing and transmitted to, for example, another terminal via the RF circuit 601, or output to the memory 602 for further processing. The audio circuit 606 may also include an earbud jack to provide communication between peripheral headphones and the terminal.
WiFi is a short-range wireless transmission technology. Through the WiFi module 607, the terminal can help the user to receive and send e-mails, browse web pages, access streaming media, and the like, providing the user with wireless broadband internet access. Although fig. 10 shows the WiFi module 607, it is understood that it is not an essential component of the terminal and may be omitted as needed without changing the essence of the invention.
The processor 608 is a control center of the terminal, connects various parts of the entire terminal using various interfaces and lines, and performs various functions of the terminal and processes data by operating or executing software programs and/or modules stored in the memory 602 and calling data stored in the memory 602, thereby performing overall monitoring of the terminal. Optionally, processor 608 may include one or more processing cores; preferably, the processor 608 may integrate an application processor, which primarily handles operating systems, user interfaces, applications, etc., and a modem processor, which primarily handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into the processor 608.
The terminal also includes a power supply 609 (such as a battery) for powering the various components. Preferably, the power supply is logically connected to the processor 608 via a power management system, so that charging, discharging, and power consumption are managed through the power management system. The power supply 609 may also include one or more of a DC or AC power source, a recharging system, a power failure detection circuit, a power converter or inverter, a power status indicator, and other components.
Although not shown, the terminal may further include a camera, a bluetooth module, and the like, which will not be described herein. Specifically, in this embodiment, the processor 608 in the terminal loads the executable file corresponding to the process of one or more application programs into the memory 602 according to the following instructions, and the processor 608 runs the application programs stored in the memory 602, thereby implementing various functions:
acquiring an index file of a multimedia, wherein the index file comprises a first mapping relation between a first multimedia fragment identifier and a playing time period and a second mapping relation between at least one second multimedia fragment identifier and the playing time period; playing the multimedia according to the index file; when a jump playing instruction is received, acquiring a jump time point of the multimedia; determining a target multimedia identifier based on the first mapping relation and the jumping time point; determining a target time point of target multimedia data according to the target multimedia identifier, wherein the target multimedia data corresponds to the target multimedia identifier; determining candidate multimedia identifications based on the second mapping relation and the target time point; and skipping playing is carried out according to the target time point, the target multimedia data and the candidate multimedia data corresponding to the candidate multimedia identifier.
The method includes the steps that firstly, multimedia is played according to an index file of the multimedia, wherein the index file comprises a first mapping relation between a first multimedia fragment identifier and a playing time period and a second mapping relation between at least one second multimedia fragment identifier and the playing time period, when a jump playing instruction is received, a jump time point of the multimedia is obtained, then a target multimedia identifier is determined based on the first mapping relation and the jump time point, a target time point of target multimedia data is determined according to the target multimedia identifier, the target multimedia data corresponds to the target multimedia identifier, candidate multimedia identifiers are determined based on the second mapping relation and the target time point, and finally jump playing is conducted according to the target time point, the target multimedia data and the candidate multimedia data corresponding to the candidate multimedia identifiers. The target multimedia identifier of the first multimedia fragment is determined through the jumping time point, and the candidate multimedia identifier of the second multimedia fragment is determined through the target time point corresponding to the target multimedia identifier.
In the above embodiments, the descriptions of the embodiments have respective emphasis, and parts that are not described in detail in a certain embodiment may refer to the above detailed description of the multimedia playing method, which is not repeated here.
It will be understood by those skilled in the art that all or part of the steps of the methods of the above embodiments may be performed by a computer program, or by related hardware controlled by the computer program; the computer program may be stored in a computer-readable storage medium and loaded and executed by a processor.
To this end, the present application provides a storage medium, in which a computer program is stored, where the computer program can be loaded by a processor to execute the steps in any one of the multimedia playing methods provided in the present application. For example, the computer program may perform the steps of:
acquiring an index file of a multimedia, wherein the index file comprises a first mapping relation between a first multimedia fragment identifier and a playing time period and a second mapping relation between at least one second multimedia fragment identifier and the playing time period; playing the multimedia according to the index file; when a jump playing instruction is received, acquiring a jump time point of the multimedia; determining a target multimedia identifier based on the first mapping relation and the jumping time point; determining a target time point of target multimedia data according to the target multimedia identifier, wherein the target multimedia data corresponds to the target multimedia identifier; determining candidate multimedia identifications based on the second mapping relation and the target time point; and skipping playing is carried out according to the target time point, the target multimedia data and the candidate multimedia data corresponding to the candidate multimedia identifier.
The above operations can be implemented in the foregoing embodiments, and are not described in detail herein.
Wherein the storage medium may include: read Only Memory (ROM), Random Access Memory (RAM), magnetic or optical disks, and the like.
Since the instructions stored in the storage medium can execute the steps in any multimedia playing method provided in the embodiments of the present application, beneficial effects that can be achieved by any multimedia playing method provided in the embodiments of the present application can be achieved, which are detailed in the foregoing embodiments and will not be described herein again.
The foregoing describes in detail a multimedia playing method, device and storage medium provided in the embodiments of the present application, and specific examples are applied in the present application to explain the principles and embodiments of the present application, and the description of the foregoing embodiments is only used to help understand the method and core ideas of the present application; meanwhile, for those skilled in the art, according to the idea of the present application, there may be variations in the specific embodiments and the application scope, and in summary, the content of the present specification should not be construed as a limitation to the present application.

Claims (10)

1. A multimedia playing method, comprising:
playing multimedia according to an index file of the multimedia, wherein the index file comprises a first mapping relation between a first multimedia fragment identifier and a playing time period and a second mapping relation between at least one second multimedia fragment identifier and the playing time period;
when a jump playing instruction is received, acquiring a jump time point of the multimedia;
determining a target multimedia identifier based on the first mapping relation and the jumping time point;
determining a target time point of target multimedia data according to the target multimedia identifier, wherein the target multimedia data corresponds to the target multimedia identifier;
determining candidate multimedia identifications based on the second mapping relation and the target time point;
and skipping playing is carried out according to the target time point, the target multimedia data and the candidate multimedia data corresponding to the candidate multimedia identifier.
2. The method of claim 1, wherein determining a target multimedia identifier based on the first mapping relationship and the jumping time point comprises:
determining a target playing time period to which the jumping time point belongs;
and determining a target multimedia identifier corresponding to the target playing time period based on the first mapping relation.
3. The method of claim 1, wherein determining the target time point of the target multimedia data according to the target multimedia identifier comprises:
according to a predetermined data transmission protocol, carrying out address expansion on the target multimedia identifier to obtain a target multimedia address;
acquiring target multimedia data based on the target multimedia address;
and acquiring a target time point of the target multimedia data from the target multimedia data.
4. The method of claim 1, wherein determining a candidate multimedia identifier based on the second mapping relationship and the target time point comprises:
determining a candidate playing time period to which the target time point belongs;
and determining candidate multimedia identifications corresponding to the candidate playing time periods based on the second mapping relation.
5. The method of claim 1, wherein the performing skip play according to the target time point, the target multimedia data, and candidate multimedia data corresponding to the candidate multimedia identifier comprises:
acquiring a third mapping relation between candidate data packets of the candidate multimedia data and candidate playing time points;
determining a data packet based on the third mapping relation and the candidate time point;
and skipping playing is carried out according to the data packet and a target data packet obtained based on the target multimedia data.
6. The method of claim 5, wherein the performing skip play according to the data packet and a target data packet obtained based on target multimedia data comprises:
analyzing the data packet and a target data packet obtained based on target multimedia data respectively to obtain data and target data;
and skipping playing is carried out according to the data and the target data.
7. The method of claim 5, wherein determining the data packet based on the third mapping relationship and the candidate time point comprises:
confirming a playing time point matched with the target time point according to the candidate playing time point;
and confirming the data packet corresponding to the playing time point matched with the target time point based on the third mapping relation.
8. The method of claim 6, further comprising:
storing the data and the target data into a blockchain.
9. A multimedia playback apparatus, comprising:
the playing module is used for playing the multimedia according to an index file of the multimedia, wherein the index file comprises a first mapping relation between a first multimedia fragment identifier and a playing time period and a second mapping relation between at least one second multimedia fragment identifier and the playing time period;
the acquisition module is used for acquiring the multimedia jump time point when a jump playing instruction is received;
the target identification determining module is used for determining a target multimedia identification based on the first mapping relation and the jumping time point;
the time point determining module is used for determining a target time point of target multimedia data according to the target multimedia identifier, wherein the target multimedia data corresponds to the target multimedia identifier;
a candidate identifier determining module, configured to determine a candidate multimedia identifier based on the second mapping relationship and the target time point;
and the skip playing module is used for skipping playing according to the target time point, the target multimedia data and the candidate multimedia data corresponding to the candidate multimedia identifier.
10. A storage medium storing instructions adapted to be loaded by a processor to perform the steps of the multimedia playing method according to any of claims 1 to 8.
CN201911251005.0A 2019-12-09 2019-12-09 Multimedia playing method, device and storage medium Active CN111031354B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911251005.0A CN111031354B (en) 2019-12-09 2019-12-09 Multimedia playing method, device and storage medium

Publications (2)

Publication Number Publication Date
CN111031354A true CN111031354A (en) 2020-04-17
CN111031354B CN111031354B (en) 2020-12-01

Family

ID=70208559

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911251005.0A Active CN111031354B (en) 2019-12-09 2019-12-09 Multimedia playing method, device and storage medium

Country Status (1)

Country Link
CN (1) CN111031354B (en)

Patent Citations (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160188577A1 (en) * 2005-11-09 2016-06-30 Cxense Asa User-directed navigation of multimedia search results
CN103581693A (en) * 2013-11-12 2014-02-12 北京清源新创科技有限公司 Internet-oriented large-scale live time shifting method and system based on fragment transmission
US20150296266A1 (en) * 2014-07-09 2015-10-15 David Allan Jones Methods and Apparatus for Indexing and/or Advertising in a User Selected Downloaded Digital Video Recording
US20160308934A1 (en) * 2015-04-20 2016-10-20 Qualcomm Incorporated Further Device Timing Adjustments and Methods for Supporting DASH Over Broadcast
US20170094341A1 (en) * 2015-09-30 2017-03-30 Tivo Inc. Synchronizing media content tag data
CN105979373A (en) * 2015-12-03 2016-09-28 乐视致新电子科技(天津)有限公司 Play method and device
CN105959310A (en) * 2016-07-01 2016-09-21 北京小米移动软件有限公司 Frame positioning method and device
CN106331840A (en) * 2016-08-31 2017-01-11 青岛海信宽带多媒体技术有限公司 Http live stream (HLS) protocol-based audio and video jump play method and apparatus
CN106572358A (en) * 2016-11-11 2017-04-19 青岛海信宽带多媒体技术有限公司 Live broadcast time shift method and client
CN107707937A (en) * 2017-09-22 2018-02-16 烽火通信科技股份有限公司 Time shift optimization method and system based on HLS protocol
CN110213616A (en) * 2018-05-15 2019-09-06 腾讯科技(深圳)有限公司 Video providing method, acquisition methods, device and equipment
CN110545482A (en) * 2018-05-29 2019-12-06 北京字节跳动网络技术有限公司 Continuous playing method and device during resolution switching and storage medium
US20190371354A1 (en) * 2018-05-31 2019-12-05 Shure Acquisition Holdings, Inc. Systems and methods for intelligent voice activation for auto-mixing
CN109274696A (en) * 2018-09-20 2019-01-25 青岛海信电器股份有限公司 Flow media playing method and device based on DASH agreement
CN110209910A (en) * 2019-05-20 2019-09-06 无线生活(杭州)信息科技有限公司 Index switching dispatching method and dispatching device
CN110267117A (en) * 2019-06-11 2019-09-20 网宿科技股份有限公司 A kind of processing method and Streaming Media processing server of stream medium data

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
徐煜烨: "《基于流媒体自适应策略的HLS播放器的研究与实现》", 《中国优秀硕士学位论文全文数据库》 *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115134672A (en) * 2022-05-26 2022-09-30 广州励丰文化科技股份有限公司 Rehearsal performance method, rehearsal performance device, terminal equipment and storage medium
CN115134672B (en) * 2022-05-26 2023-12-12 广州励丰文化科技股份有限公司 Sparring performance method, sparring performance device, terminal equipment and storage medium

Also Published As

Publication number Publication date
CN111031354B (en) 2020-12-01

Similar Documents

Publication Publication Date Title
CN107690078B (en) Bullet screen information display method, bullet screen information providing method and bullet screen information providing equipment
US9621950B2 (en) TV program identification method, apparatus, terminal, server and system
CN106791958B (en) Position mark information generation method and device
CN104113787B (en) Based on the comment method of program, terminal, server and system
KR102207208B1 (en) Method and apparatus for visualizing music information
CN106415501B (en) Mating application program for activity cooperation
US10579215B2 (en) Providing content via multiple display devices
CN105740263B (en) Page display method and device
JP2018505504A (en) Advertisement push system, apparatus and method
CN109756767B (en) Preview data playing method, device and storage medium
CN112040330B (en) Video file processing method and device, electronic equipment and computer storage medium
WO2015062224A1 (en) Tv program identification method, apparatus, terminal, server and system
CN108900855B (en) Live content recording method and device, computer readable storage medium and server
CN112995759A (en) Interactive service processing method, system, device, equipment and storage medium
CN112969093B (en) Interactive service processing method, device, equipment and storage medium
CN110825863B (en) Text pair fusion method and device
CN113038192B (en) Video processing method and device, electronic equipment and storage medium
CN111031354B (en) Multimedia playing method, device and storage medium
CN104038832A (en) Video playing method and device
CN110198452B (en) Live video previewing method, device and system
CN112328895A (en) User portrait generation method, device, server and storage medium
CN111641864B (en) Video information acquisition method, device and equipment
CN112203116A (en) Video generation method, video playing method and related equipment
KR102263977B1 (en) Methods, devices, and systems for performing information provision
CN114417201A (en) Message processing method and device, computer equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
REG Reference to a national code

Ref country code: HK

Ref legal event code: DE

Ref document number: 40022321

Country of ref document: HK

GR01 Patent grant
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20221115

Address after: 1402, Floor 14, Block A, Haina Baichuan Headquarters Building, No. 6, Baoxing Road, Haibin Community, Xin'an Street, Bao'an District, Shenzhen, Guangdong 518,101

Patentee after: Shenzhen Yayue Technology Co.,Ltd.

Address before: 518057 Tencent Building, No. 1 High-tech Zone, Nanshan District, Shenzhen City, Guangdong Province, 35 floors

Patentee before: TENCENT TECHNOLOGY (SHENZHEN) Co.,Ltd.

TR01 Transfer of patent right