CN104837033B - An information processing method and server
- Publication number: CN104837033B (application CN201510261003.5A)
- Authority
- CN
- China
- Prior art keywords
- data
- information
- offset
- server
- unit
- Prior art date
- Legal status: Active
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/20—Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
- H04N21/23—Processing of content or additional data; Elementary server operations; Server middleware
- H04N21/234—Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs
- H04N21/2343—Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs involving reformatting operations of video signals for distribution or compliance with end-user requests or end-user device requirements
- H04N21/234309—Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs involving reformatting operations of video signals for distribution or compliance with end-user requests or end-user device requirements by transcoding between formats or standards, e.g. from MPEG-2 to MPEG-4 or from Quicktime to Realvideo
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/20—Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
- H04N21/23—Processing of content or additional data; Elementary server operations; Server middleware
- H04N21/231—Content storage operation, e.g. caching movies for short term storage, replicating data over plural servers, prioritizing data for deletion
- H04N21/23109—Content storage operation, e.g. caching movies for short term storage, replicating data over plural servers, prioritizing data for deletion by placing content in organized collections, e.g. EPG data repository
Abstract
An embodiment of the present invention provides an information processing method and a server. The method includes: converting first data having the attribute of a multimedia data stream into N second data supporting a data fragmentation attribute according to a first coding strategy; segmenting the second data according to a slicing strategy to obtain third data, the third data being part of or all of the content of the second data; converting the third data into fourth data supporting a first media playing format according to a second coding strategy; and generating a third coding strategy according to the slicing strategy and transcoding the fourth data according to the third coding strategy, so that the fourth data is transcoded from the first media playing format into at least one media playing format other than the first media playing format.
Description
Technical Field
The present invention relates to the field of video technologies, and in particular, to an information processing method and a server.
Background
In the process of implementing the technical solution of the embodiment of the present application, the inventor of the present application finds at least the following technical problems in the related art:
Editing video information such as a live stream is one of the more widely used services in the multimedia industry. The current technical scheme for editing a live stream is as follows: an editor records the complete live stream; after the live broadcast ends, the recorded video is encoded into an MP4 file or a file in another video format; the file is then uploaded to a server and transcoded by a transcoding system into on-demand video files in various formats.
This scheme has the following problems: 1) recording, uploading, and transcoding the live stream consume a large amount of time; 2) the live video cannot be accurately edited online.
Disclosure of Invention
In view of this, embodiments of the present invention are expected to provide an information processing method and a server, which can implement online accurate editing of a live stream, improve editing efficiency of the live stream, and save time resources consumed by recording, uploading, and transcoding the live stream.
The embodiment of the invention provides an information processing method, which comprises the following steps: acquiring M first data, wherein the first data have the attribute of multimedia data stream, and M is an integer greater than or equal to 1; converting any one of the M pieces of first data into N pieces of second data supporting data fragmentation attributes according to a first coding strategy, wherein the coding format of the first data is different from that of the second data, and N is an integer greater than 1; segmenting the second data according to a slicing strategy to obtain third data, wherein the third data are part of or all of the second data; converting the third data into fourth data supporting a first media playing format according to a second coding strategy; and generating a third encoding strategy according to the slicing strategy, and transcoding the fourth data according to the third encoding strategy, so that the fourth data is transcoded into at least one media playing format other than the first media playing format from the first media playing format.
In the foregoing solution, when any one of the M pieces of first data is converted into second data supporting a data fragmentation attribute according to a first encoding policy, the method further includes: and generating first information, wherein the first information is used for representing the fragment information of the N pieces of second data.
In the foregoing scheme, the segmenting the second data according to the slicing policy to obtain third data includes: obtaining a first offset of the target data to be divided and the head data of the second data and a second offset of the target data to be divided and the tail data of the second data according to the slicing strategy; extracting the first information, and analyzing the first information to obtain fragment information representing the N pieces of second data; and segmenting the second data according to the first offset, the second offset and the fragmentation information of the N pieces of second data to obtain the third data.
In the foregoing scheme, the transcoding the fourth data according to the third encoding policy includes: obtaining a third offset of the target data to be converted and the head data of the fourth data, a fourth offset of the target data to be converted and the tail data of the fourth data, and an extension pattern of the converted target data according to the third coding strategy; and segmenting the fourth data according to the third offset and the fourth offset to obtain target data, and transcoding the target data according to the extended pattern to obtain fifth data.
In the foregoing solution, after the converting the third data into fourth data supporting a first media playing format according to a second encoding policy, the method further includes: and storing the fourth data to a unified storage database.
In the above scheme, the method further comprises: and transmitting the fifth data.
An embodiment of the present invention further provides a server, where the server includes: an acquisition unit, a first conversion unit, a segmentation unit, a second conversion unit, a generation unit and a transcoding unit; wherein,
the acquiring unit is used for acquiring M first data, the first data has the attribute of a multimedia data stream, and M is an integer greater than or equal to 1;
the first conversion unit is configured to convert any one of the M pieces of first data into N pieces of second data supporting a data fragmentation attribute according to a first encoding policy, where an encoding format of the first data is different from an encoding format of the second data, and N is an integer greater than 1;
the segmentation unit is used for segmenting the second data according to a slicing strategy to obtain third data, wherein the third data is part of or all of the second data;
the second conversion unit is used for converting the third data into fourth data supporting a first media playing format according to a second coding strategy;
the generating unit is used for generating a third coding strategy according to the slicing strategy;
the transcoding unit is configured to transcode the fourth data according to the third encoding policy, so that the fourth data is transcoded from the first media playing format into at least one media playing format other than the first media playing format.
In the foregoing solution, the first converting unit is further configured to generate first information when any one of the M pieces of first data is converted into second data supporting a data fragmentation attribute according to a first coding policy, where the first information is used to represent fragmentation information of the N pieces of second data.
In the above scheme, the segmentation unit is specifically configured to obtain, according to the slicing policy, a first offset of the target data to be segmented and the head data of the second data, and a second offset of the target data to be segmented and the tail data of the second data; extracting the first information, and analyzing the first information to obtain fragment information representing the N pieces of second data; and segmenting the second data according to the first offset, the second offset and the fragmentation information of the N pieces of second data to obtain the third data.
In the above scheme, the transcoding unit is specifically configured to obtain, according to the third coding policy, a third offset between the target data to be converted and the head data of the fourth data, a fourth offset between the target data to be converted and the tail data of the fourth data, and an extension pattern of the converted target data; and to segment the fourth data according to the third offset and the fourth offset to obtain target data, and transcode the target data according to the extension pattern to obtain fifth data.
In the above scheme, the server further includes a storage unit, configured to store the fourth data in a unified storage database.
In the foregoing solution, the server further includes a sending unit, configured to send the fifth data.
By adopting the embodiments of the invention, the server segments the second data according to the first offset, the second offset, and the fragmentation information of the second data in the slicing strategy to obtain the third data, where the second data are the TS fragment files converted from the first data, i.e., the live stream. The editing position can therefore be located according to the user's requirement, and the corresponding TS fragment files can be obtained while the live stream is still playing, without waiting for the live stream to finish, which improves editing efficiency and saves the time resources consumed by recording, uploading, and transcoding the live stream.
Drawings
FIG. 1 is a schematic processing flow chart of an information processing method according to an embodiment of the present invention;
FIG. 2 is a schematic processing flow chart of a second information processing method according to an embodiment of the present invention;
FIG. 3 is a schematic processing flow chart of a third information processing method according to an embodiment of the present invention;
FIG. 4 is a block diagram of a server according to an embodiment of the present invention;
FIG. 5 is a schematic diagram of a second server according to an embodiment of the present invention;
FIG. 6 is a schematic diagram of a structure of a third server according to an embodiment of the present invention;
FIG. 7 is a schematic processing flow diagram of an information processing method in a specific application scenario according to an embodiment of the present invention;
FIG. 8 is a schematic structural diagram of a server in a specific application scenario according to an embodiment of the present invention;
FIG. 9 is a schematic diagram of a hardware structure of the server according to the embodiment of the present invention.
Detailed Description
Method embodiment one
An information processing method provided in an embodiment of the present invention is applied to a server, and a processing flow diagram of the information processing method provided in the embodiment of the present invention is shown in fig. 1, and includes the following steps:
step 101, obtaining M first data;
here, the first data has an attribute of a multimedia data stream, such as a live stream, and M is an integer greater than or equal to 1;
specifically, the acquiring M first data includes: and loading and playing the live stream, storing the played playing information of the live stream into a memory, and acquiring the playing information of the live stream in the memory.
Step 102, converting any one of the M pieces of first data into N pieces of second data supporting data fragmentation attributes according to a first encoding policy;
here, the first encoding strategy may be the HTTP Live Streaming (HLS) protocol, which supports dynamic bit rate adaptation;
specifically, a live stream conversion server converts the acquired playing information of the live stream into N TS fragment files according to an HLS protocol, where N is an integer greater than 1; storing the TS fragment file into a cloud storage service; the live stream conversion server converts the playing information of the live stream into a TS fragment file and simultaneously generates first information; the first information may be included in an M3U8 file, and is used to characterize slicing information of the N second data, such as: the number of fragments of the second data, the duration of each fragment file, the start duration and the end duration of each fragment file in all fragment files and the like; the live stream conversion server also adds an extension style of data, such as configuration information such as watermark information, and stores the first information and the configuration information of the data to a task queue server;
the encoding format of the second data is different from that of the first data, and the second data is a TS (transport stream) fragment file; each TS slice file of the N TS slice files may have a duration of 3 seconds to 5 seconds.
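A minimal sketch of how step 102 might be realized follows, assuming ffmpeg's HLS muxer is used to produce the TS fragment files and the M3U8 playlist; the tool choice, command-line flags, URLs, and paths are illustrative assumptions and are not specified by the embodiment.
```python
import subprocess

def segment_live_stream(live_url: str, out_dir: str) -> str:
    """Convert a live stream into ~4-second TS fragments plus an M3U8 playlist.

    Hypothetical sketch of the "first coding strategy" step; the embodiment
    does not name a tool, so ffmpeg's HLS muxer is used only for illustration.
    """
    playlist = f"{out_dir}/index.m3u8"
    cmd = [
        "ffmpeg", "-i", live_url,
        "-c", "copy",                       # keep codecs, only re-wrap into TS
        "-f", "hls",
        "-hls_time", "4",                   # each TS fragment lasts 3-5 seconds
        "-hls_list_size", "0",              # keep every fragment in the playlist
        "-hls_segment_filename", f"{out_dir}/seg_%05d.ts",
        playlist,
    ]
    subprocess.run(cmd, check=True)
    return playlist

# Example with an assumed URL:
# segment_live_stream("rtmp://example.com/live/stream", "/data/hls")
```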
103, segmenting the second data according to a slicing strategy to obtain third data;
here, the slicing strategy, stored in the clipping system, includes a first offset between the target data to be sliced and the head data of the second data, and a second offset between the target data to be sliced and the tail data of the second data;
specifically, the clip merging server obtains a task from the task queue server, and segments the second data according to the first offset, the second offset, and the fragmentation information of the N pieces of second data in the task queue server to obtain the third data;
wherein the third data is part of or all of the second data; therefore, the third data are X fragmented files, and X is an integer less than or equal to N.
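A minimal sketch of how the slicing strategy in step 103 might be applied follows, assuming the first information is an M3U8 playlist whose #EXTINF entries carry the per-fragment durations; the function and variable names are hypothetical.
```python
def parse_m3u8(path: str):
    """Return (duration, uri) pairs from a simple M3U8 playlist (the 'first information')."""
    fragments, duration = [], None
    with open(path, encoding="utf-8") as f:
        for line in f:
            line = line.strip()
            if line.startswith("#EXTINF:"):
                duration = float(line[len("#EXTINF:"):].split(",")[0])
            elif line and not line.startswith("#") and duration is not None:
                fragments.append((duration, line))
                duration = None
    return fragments

def select_fragments(fragments, first_offset: float, second_offset: float):
    """Pick the fragments covering [first_offset, total - second_offset] (the 'third data')."""
    total = sum(d for d, _ in fragments)
    start, end = first_offset, total - second_offset
    selected, elapsed = [], 0.0
    for duration, uri in fragments:
        if elapsed + duration > start and elapsed < end:
            selected.append(uri)          # this fragment overlaps the requested clip
        elapsed += duration
    return selected
```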
Step 104, converting the third data into fourth data supporting a first media playing format according to a second coding strategy;
specifically, the clip merging server merges the third data, that is, X slicing files, into a complete fourth data supporting a first media playing format, and stores the fourth data in a unified storage database; by storing the fourth data to the unified storage database, the time consumed by downloading the fourth data can be saved when the fourth data is edited, and the editing efficiency is improved;
here, the fourth data may be an MP4 file.
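A minimal sketch of how step 104 might merge the selected TS fragment files into a single MP4 file follows, assuming ffmpeg's concat demuxer is used; the flags and file names are illustrative assumptions.
```python
import os
import subprocess
import tempfile

def merge_ts_to_mp4(ts_paths, mp4_path: str) -> None:
    """Merge selected TS fragments into one MP4 file (the 'fourth data')."""
    # Write a concat list file describing the fragments in order.
    with tempfile.NamedTemporaryFile("w", suffix=".txt", delete=False) as listing:
        for p in ts_paths:
            listing.write(f"file '{p}'\n")
        list_path = listing.name
    try:
        subprocess.run([
            "ffmpeg", "-f", "concat", "-safe", "0", "-i", list_path,
            "-c", "copy",
            "-bsf:a", "aac_adtstoasc",   # often needed when copying AAC from TS into MP4
            mp4_path,
        ], check=True)
    finally:
        os.unlink(list_path)
```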
Step 105, generating a third encoding strategy according to the slicing strategy;
specifically, the clip merging server calculates, according to the slicing strategy, the durations of the file head and file tail that need to be removed from the fourth data, generates a third encoding strategy, and stores the third encoding strategy in a task configuration server; the removed durations are accurate to the millisecond.
Step 106, transcoding the fourth data according to the third coding strategy;
specifically, the clip merging server sends a transcoding task to a transcoding cluster server; after receiving the transcoding task, the transcoding cluster server downloads the fourth data to be transcoded, acquires the third encoding strategy from the task configuration server, and transcodes the fourth data according to the third encoding strategy, so that the fourth data is transcoded from the first media playing format into at least one media playing format other than the first media playing format; meanwhile, the fourth data is accurately edited and extension styles such as watermark information are added;
here, the fourth data may be sliced with millisecond accuracy, and the non-first media playing formats include video formats that can be played on the Web, PC clients, Android phones, and iPhones.
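A minimal sketch of how the third encoding strategy of steps 105 and 106 might be applied follows, assuming ffmpeg is used to cut the head and tail durations with millisecond precision and to re-encode the result into several output formats; the codecs, format names, and millisecond values are illustrative assumptions.
```python
import subprocess

def trim_and_transcode(mp4_path: str, head_ms: int, tail_ms: int,
                       total_ms: int, outputs: dict) -> None:
    """Apply the third encoding strategy: cut head/tail (millisecond accuracy)
    and transcode the result into several playing formats.

    outputs maps an output path to an ffmpeg format name, e.g.
    {"clip_web.mp4": "mp4", "clip_hls.m3u8": "hls"} -- illustrative values only.
    """
    start = head_ms / 1000.0
    duration = (total_ms - head_ms - tail_ms) / 1000.0
    for out_path, fmt in outputs.items():
        subprocess.run([
            "ffmpeg", "-ss", f"{start:.3f}", "-i", mp4_path,
            "-t", f"{duration:.3f}",
            "-c:v", "libx264", "-c:a", "aac",   # re-encode so the cut is frame-accurate
            "-f", fmt, out_path,
        ], check=True)
```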
In step 101, the playing information of the played live stream is stored in the memory; in step 103, the second data is segmented according to the first offset, the second offset, and the fragmentation information of the second data in the slicing strategy to obtain the third data, where the second data are the TS fragment files converted from the first data, i.e., the live stream; and in step 104, the fourth data is stored in the unified storage database. As a result, the editing position can be located according to the user's requirement and the corresponding TS fragment files can be obtained while the live stream is still playing, without waiting for the live stream to finish, which improves editing efficiency and saves the time resources consumed by recording, uploading, and transcoding the live stream.
Method embodiment two
A second embodiment of the present invention provides an information processing method, where the method is applied to a server, and a processing flow diagram of the information processing method provided in the second embodiment of the present invention is shown in fig. 2, and the method includes the following steps:
step 201, acquiring first data;
here, the first data has an attribute of a multimedia data stream, such as a live stream;
specifically, the acquiring the first data includes: and loading and playing the live stream, storing the played playing information of the live stream into a memory, and acquiring the playing information of the live stream in the memory.
Step 202, converting the first data into N second data supporting data fragmentation attributes according to a first coding strategy;
here, the first encoding strategy may be the HTTP Live Streaming (HLS) protocol, which supports dynamic bit rate adaptation;
specifically, a live stream conversion server converts the acquired playing information of the live stream into N TS fragment files according to an HLS protocol, where N is an integer greater than 1; storing the TS fragment file into a cloud storage service; the live stream conversion server converts the playing information of the live stream into a TS fragment file and simultaneously generates first information; the first information may be included in an M3U8 file, and is used to characterize slicing information of the N second data, such as: the number of fragments of the second data, the duration of each fragment file, the start duration and the end duration of each fragment file in all fragment files and the like; the live stream conversion server also adds an extension style of data, such as configuration information such as watermark information, and stores the first information and the configuration information of the data to a task queue server;
the encoding format of the second data is different from that of the first data, and the second data is a TS (transport stream) fragment file; each TS slice file of the N TS slice files may have a duration of 3 seconds to 5 seconds.
Step 203, segmenting the second data according to a slicing strategy to obtain third data;
here, the slicing strategy, stored in the clipping system, includes a first offset between the target data to be sliced and the head data of the second data, and a second offset between the target data to be sliced and the tail data of the second data;
specifically, the clip merging server obtains a task from the task queue server, and segments the second data according to the first offset, the second offset, and the fragmentation information of the N pieces of second data in the task queue server to obtain the third data;
wherein the third data is part of or all of the second data; therefore, the third data are X fragmented files, and X is an integer less than or equal to N.
Step 204, converting the third data into fourth data supporting the first media playing format according to a second coding strategy;
specifically, the clip merging server merges the third data, that is, X slicing files, into a complete fourth data supporting a first media playing format, and stores the fourth data in a unified storage database; by storing the fourth data to the unified storage database, the time consumed by downloading the fourth data can be saved when the fourth data is edited, and the editing efficiency is improved;
here, the fourth data may be an MP4 file.
Step 205, generating a third encoding strategy according to the slicing strategy;
specifically, the clip merging server calculates, according to the slicing strategy, the durations of the file head and file tail that need to be removed from the fourth data, generates a third encoding strategy, and stores the third encoding strategy in a task configuration server; the removed durations are accurate to the millisecond.
Step 206, transcoding the fourth data according to the third encoding strategy;
specifically, the clip merging server sends a transcoding task to a transcoding cluster server; after receiving the transcoding task, the transcoding cluster server downloads the fourth data to be transcoded, acquires the third coding strategy from the task configuration server, and obtains, according to the third coding strategy, a third offset between the target data to be converted and the head data of the fourth data, a fourth offset between the target data to be converted and the tail data of the fourth data, and an extension style of the converted target data; the transcoding cluster server then segments the fourth data according to the third offset and the fourth offset to obtain the target data, and transcodes the target data according to the extension style to obtain fifth data, so that the fourth data is transcoded from the first media playing format into at least one media playing format other than the first media playing format; meanwhile, the fourth data is accurately edited and extension styles such as watermark information are added;
here, the fourth data may be sliced with millisecond accuracy, and the non-first media playing formats include video formats that can be played on the Web, PC clients, Android phones, and iPhones.
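A minimal sketch of how the extension style (for example, watermark information) might be added while the fifth data is generated in step 206 follows, assuming ffmpeg's overlay filter is used; the watermark position, codecs, and paths are illustrative assumptions.
```python
import subprocess

def transcode_with_watermark(src: str, watermark_png: str, dst: str,
                             start_s: float, duration_s: float) -> None:
    """Produce the 'fifth data': trim by the third/fourth offsets and add an
    extension style (here, a watermark) during transcoding.

    Hypothetical sketch; the overlay position and codecs are assumptions.
    """
    subprocess.run([
        "ffmpeg", "-ss", f"{start_s:.3f}", "-i", src,
        "-i", watermark_png,
        "-t", f"{duration_s:.3f}",
        "-filter_complex", "overlay=W-w-10:10",  # watermark in the top-right corner
        "-c:v", "libx264", "-c:a", "aac",
        dst,
    ], check=True)
```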
And step 207, storing the fifth data to a unified storage server, and sending the fifth data by the unified storage server.
In step 201, the playing information of the played live stream is stored in the memory; in step 203, the second data is segmented according to the first offset, the second offset, and the fragmentation information of the second data in the slicing strategy to obtain the third data, where the second data are the TS fragment files converted from the first data, i.e., the live stream; and in step 204, the fourth data is stored in the unified storage database. As a result, the editing position can be located according to the user's requirement and the corresponding TS fragment files can be obtained while the live stream is still playing, without waiting for the live stream to finish, which improves editing efficiency and saves the time resources consumed by recording, uploading, and transcoding the live stream.
Method embodiment three
A third embodiment of the present invention provides an information processing method, where the method is applied to a server, and a processing flow diagram of the information processing method provided in the third embodiment of the present invention is shown in fig. 3, and the method includes the following steps:
step 301, acquiring first data;
here, the first data has an attribute of a multimedia data stream, such as a live stream;
specifically, the acquiring the first data includes: and loading and playing the live stream, storing the played playing information of the live stream into a memory, and acquiring the playing information of the live stream in the memory.
Step 302, converting the first data into N second data supporting data fragmentation attributes according to a first coding strategy;
here, the first encoding strategy may be the HTTP Live Streaming (HLS) protocol, which supports dynamic bit rate adaptation;
specifically, a live stream conversion server converts the acquired playing information of the live stream into N TS fragment files according to an HLS protocol, where N is an integer greater than 1; storing the TS fragment file into a cloud storage service; the live stream conversion server converts the playing information of the live stream into a TS fragment file and simultaneously generates first information; the first information may be included in an M3U8 file, and is used to characterize slicing information of the N second data, such as: the number of fragments of the second data, the duration of each fragment file, the start duration and the end duration of each fragment file in all fragment files and the like; the live stream conversion server also adds an extension style of data, such as configuration information such as watermark information, and stores the first information and the configuration information of the data to a task queue server;
the encoding format of the second data is different from that of the first data, and the second data is a TS (transport stream) fragment file; each TS slice file of the N TS slice files may have a duration of 3 seconds to 5 seconds.
Step 303, segmenting the second data according to a slicing strategy to obtain third data;
here, the slicing strategy, stored in the clipping system, includes a first offset between the target data to be sliced and the head data of the second data, and a second offset between the target data to be sliced and the tail data of the second data;
specifically, the clip merging server obtains a task from the task queue server, and segments the second data according to the first offset, the second offset, and the fragmentation information of the N pieces of second data in the task queue server to obtain the third data;
wherein the third data is part of or all of the second data; therefore, the third data are X fragmented files, and X is an integer less than or equal to N.
Step 304, converting the third data into fourth data supporting the first media playing format according to a second coding strategy;
specifically, the clip merging server merges the third data, that is, X slicing files, into a complete fourth data supporting a first media playing format, and stores the fourth data in a unified storage database; by storing the fourth data to the unified storage database, the time consumed by downloading the fourth data can be saved when the fourth data is edited, and the editing efficiency is improved;
here, the fourth data may be an MP4 file.
Step 305, generating a third encoding strategy according to the slicing strategy;
specifically, the clip merging server calculates, according to the slicing strategy, the durations of the file head and file tail that need to be removed from the fourth data, generates a third encoding strategy, and stores the third encoding strategy in a task configuration server; the removed durations are accurate to the millisecond.
Step 306, transcoding the fourth data according to the third encoding strategy;
specifically, the clip merging server sends a transcoding task to a transcoding cluster server; after receiving the transcoding task, the transcoding cluster server downloads the fourth data to be transcoded, acquires the third coding strategy from the task configuration server, and obtains, according to the third coding strategy, a third offset between the target data to be converted and the head data of the fourth data, a fourth offset between the target data to be converted and the tail data of the fourth data, and an extension style of the converted target data; the transcoding cluster server then segments the fourth data according to the third offset and the fourth offset to obtain the target data, and transcodes the target data according to the extension style to obtain fifth data, so that the fourth data is transcoded from the first media playing format into at least one media playing format other than the first media playing format; meanwhile, the fourth data is accurately edited and extension styles such as watermark information are added;
here, the fourth data may be sliced with millisecond accuracy, and the non-first media playing formats include video formats that can be played on the Web, PC clients, Android phones, and iPhones.
And 307, storing the fifth data to a unified storage server.
And 308, the user acquires the fifth data through the unified storage server and plays the fifth data.
In step 301, the playing information of the played live stream is stored in the memory; in step 303, the second data is segmented according to the first offset, the second offset, and the fragmentation information of the second data in the slicing strategy to obtain the third data, where the second data are the TS fragment files converted from the first data, i.e., the live stream; and in step 304, the fourth data is stored in the unified storage database. As a result, the editing position can be located according to the user's requirement and the corresponding TS fragment files can be obtained while the live stream is still playing, without waiting for the live stream to finish, which improves editing efficiency and saves the time resources consumed by recording, uploading, and transcoding the live stream.
Embodiment of the Server
In order to implement the information processing method, an embodiment of the present invention provides a server. A schematic structural diagram of the server is shown in fig. 4; the server includes: an acquisition unit 11, a first conversion unit 12, a segmentation unit 13, a second conversion unit 14, a generation unit 15 and a transcoding unit 16; wherein,
the acquiring unit 11 is configured to acquire M first data, where the first data has an attribute of a multimedia data stream, and M is an integer greater than or equal to 1;
specifically, a live stream is loaded and played, playing information of the played live stream is stored in a memory, and the playing information of the live stream is acquired in the memory.
The first converting unit 12 is configured to convert any one of the M pieces of first data into N pieces of second data supporting a data fragmentation attribute according to a first encoding policy, where an encoding format of the first data is different from an encoding format of the second data, and N is an integer greater than 1;
the segmentation unit 13 is configured to segment the second data according to a slicing policy to obtain third data, where the third data is a part of or all of the content of the second data;
the second converting unit 14 is configured to convert the third data into fourth data supporting the first media playing format according to a second encoding policy;
the generating unit 15 is configured to generate a third encoding strategy according to the slicing strategy;
the transcoding unit 16 is configured to transcode the fourth data according to the third encoding policy, so that the fourth data is transcoded from the first media playing format into at least one media playing format that is not the first media playing format.
In a preferred embodiment of the present invention, the first converting unit 12 is further configured to generate first information when any one of the M first data is converted into second data supporting a data fragmentation attribute according to a first coding policy, where the first information is used to represent fragmentation information of the N second data.
In a preferred embodiment of the present invention, the segmentation unit 13 is specifically configured to obtain, according to the slicing policy, a first offset between target data to be segmented and head data of the second data, and a second offset between target data to be segmented and tail data of the second data; extracting the first information, and analyzing the first information to obtain fragment information representing the N pieces of second data; and segmenting the second data according to the first offset, the second offset and the fragmentation information of the N pieces of second data to obtain the third data.
In a preferred embodiment of the present invention, the transcoding unit 16 is specifically configured to obtain, according to the third coding policy, a third offset between the target data to be converted and the head data of the fourth data, a fourth offset between the target data to be converted and the tail data of the fourth data, and an extension pattern of the converted target data; and to segment the fourth data according to the third offset and the fourth offset to obtain target data, and transcode the target data according to the extension pattern to obtain fifth data.
In a preferred embodiment of the present invention, the first encoding policy may be an HLS protocol, and the first converting unit 12 converts the obtained playing information of the live stream into N TS fragment files according to the HLS protocol, where N is an integer greater than 1; storing the TS fragment file into a cloud storage service; the live stream conversion server converts the playing information of the live stream into a TS fragment file and simultaneously generates first information; the first information may be included in an M3U8 file, and is used to characterize slicing information of the N second data, such as: the number of fragments of the second data, the duration of each fragment file, the start and end durations of each fragment file in all fragment files, and the like.
In a preferred embodiment of the present invention, the first data is live streaming data, the second data and the third data are TS clip files, and the fourth data is an MP4 file.
In a preferred embodiment of the present invention, before transcoding the fourth data, the transcoding unit 16 needs to receive a transcoding task and download the fourth data to be transcoded and the third encoding policy.
Second embodiment of the Server
In order to implement the information processing method, a second embodiment of the present invention provides a server. A schematic structural diagram of the server is shown in fig. 5; the server includes: an acquisition unit 11, a first conversion unit 12, a segmentation unit 13, a second conversion unit 14, a generation unit 15, a transcoding unit 16 and a storage unit 17; wherein,
the acquiring unit 11 is configured to acquire M first data, where the first data has an attribute of a multimedia data stream, and M is an integer greater than or equal to 1;
specifically, a live stream is loaded and played, playing information of the played live stream is stored in a memory, and the playing information of the live stream is acquired in the memory.
The first converting unit 12 is configured to convert any one of the M pieces of first data into N pieces of second data supporting a data fragmentation attribute according to a first encoding policy, where an encoding format of the first data is different from an encoding format of the second data, and N is an integer greater than 1;
the segmentation unit 13 is configured to segment the second data according to a slicing policy to obtain third data, where the third data is a part of or all of the content of the second data;
the second converting unit 14 is configured to convert the third data into fourth data supporting the first media playing format according to a second encoding policy;
the generating unit 15 is configured to generate a third encoding strategy according to the slicing strategy;
the transcoding unit 16 is configured to transcode the fourth data according to the third encoding policy, so that the fourth data is transcoded from the first media playing format into at least one media playing format that is not the first media playing format;
the storage unit 17 is configured to store the fourth data in a unified storage database;
here, by storing the fourth data in the unified storage database, when the fourth data is edited, time consumed for downloading the fourth data can be omitted, and the editing efficiency can be improved.
In a preferred embodiment of the present invention, the first converting unit 12 is further configured to generate first information when any one of the M first data is converted into second data supporting a data fragmentation attribute according to a first coding policy, where the first information is used to represent fragmentation information of the N second data.
In a preferred embodiment of the present invention, the segmentation unit 13 is specifically configured to obtain, according to the slicing policy, a first offset between target data to be segmented and head data of the second data, and a second offset between target data to be segmented and tail data of the second data; extracting the first information, and analyzing the first information to obtain fragment information representing the N pieces of second data; and segmenting the second data according to the first offset, the second offset and the fragmentation information of the N pieces of second data to obtain the third data.
In a preferred embodiment of the present invention, the transcoding unit 16 is specifically configured to obtain, according to the third coding policy, a third offset between the target data to be converted and the head data of the fourth data, a fourth offset between the target data to be converted and the tail data of the fourth data, and an extension pattern of the converted target data; and to segment the fourth data according to the third offset and the fourth offset to obtain target data, and transcode the target data according to the extension pattern to obtain fifth data.
In a preferred embodiment of the present invention, the first encoding policy may be an HLS protocol, and the first converting unit 12 converts the obtained playing information of the live stream into N TS fragment files according to the HLS protocol, where N is an integer greater than 1; storing the TS fragment file into a cloud storage service; the live stream conversion server converts the playing information of the live stream into a TS fragment file and simultaneously generates first information; the first information may be included in an M3U8 file, and is used to characterize slicing information of the N second data, such as: the number of fragments of the second data, the duration of each fragment file, the start and end durations of each fragment file in all fragment files, and the like.
In a preferred embodiment of the present invention, the first data is live streaming data, the second data and the third data are TS clip files, and the fourth data is an MP4 file.
In a preferred embodiment of the present invention, before transcoding the fourth data, the transcoding unit 16 needs to receive a transcoding task and download the fourth data to be transcoded and the third encoding policy.
Third embodiment of the Server
In order to implement the information processing method, a third embodiment of the present invention provides a server. A schematic structural diagram of the server is shown in fig. 6; the server includes: an acquisition unit 11, a first conversion unit 12, a segmentation unit 13, a second conversion unit 14, a generation unit 15, a transcoding unit 16, a storage unit 17 and a sending unit 18; wherein,
the acquiring unit 11 is configured to acquire M first data, where the first data has an attribute of a multimedia data stream, and M is an integer greater than or equal to 1;
specifically, a live stream is loaded and played, playing information of the played live stream is stored in a memory, and the playing information of the live stream is acquired in the memory.
The first converting unit 12 is configured to convert any one of the M pieces of first data into N pieces of second data supporting a data fragmentation attribute according to a first encoding policy, where an encoding format of the first data is different from an encoding format of the second data, and N is an integer greater than 1;
the segmentation unit 13 is configured to segment the second data according to a slicing policy to obtain third data, where the third data is a part of or all of the content of the second data;
the second converting unit 14 is configured to convert the third data into fourth data supporting the first media playing format according to a second encoding policy;
the generating unit 15 is configured to generate a third encoding strategy according to the slicing strategy;
the transcoding unit 16 is configured to transcode the fourth data according to the third encoding policy, so that the fourth data is transcoded from the first media playing format into at least one media playing format that is not the first media playing format;
the storage unit 17 is configured to store the fourth data in a unified storage database;
here, by storing the fourth data in the unified storage database, when the fourth data is edited, time consumed for downloading the fourth data can be omitted, and the editing efficiency can be improved.
The sending unit 18 is configured to send the fifth data;
the fifth data is sent through the sending unit 18 so that the user can view the precisely clipped video.
In a preferred embodiment of the present invention, the first converting unit 12 is further configured to generate first information when any one of the M first data is converted into second data supporting a data fragmentation attribute according to a first coding policy, where the first information is used to represent fragmentation information of the N second data.
In a preferred embodiment of the present invention, the segmentation unit 13 is specifically configured to obtain, according to the slicing policy, a first offset between target data to be segmented and head data of the second data, and a second offset between target data to be segmented and tail data of the second data; extracting the first information, and analyzing the first information to obtain fragment information representing the N pieces of second data; and segmenting the second data according to the first offset, the second offset and the fragmentation information of the N pieces of second data to obtain the third data.
In a preferred embodiment of the present invention, the transcoding unit 16 is specifically configured to obtain, according to the third coding policy, a third offset between the target data to be converted and the head data of the fourth data, a fourth offset between the target data to be converted and the tail data of the fourth data, and an extension pattern of the converted target data; and to segment the fourth data according to the third offset and the fourth offset to obtain target data, and transcode the target data according to the extension pattern to obtain fifth data.
In a preferred embodiment of the present invention, the first encoding policy may be an HLS protocol, and the first converting unit 12 converts the obtained playing information of the live stream into N TS fragment files according to the HLS protocol, where N is an integer greater than 1; storing the TS fragment file into a cloud storage service; the live stream conversion server converts the playing information of the live stream into a TS fragment file and simultaneously generates first information; the first information may be included in an M3U8 file, and is used to characterize slicing information of the N second data, such as: the number of fragments of the second data, the duration of each fragment file, the start and end durations of each fragment file in all fragment files, and the like.
In a preferred embodiment of the present invention, the first data is live streaming data, the second data and the third data are TS clip files, and the fourth data is an MP4 file.
In a preferred embodiment of the present invention, before transcoding the fourth data, the transcoding unit 16 needs to receive a transcoding task and download the fourth data to be transcoded and the third encoding policy.
In the foregoing embodiment of the present invention, the functions executed by the obtaining unit and the first converting unit may be implemented by a live stream converting server, the functions executed by the splitting unit, the second converting unit, and the generating unit may be implemented by a clipping and merging server, and the functions executed by the transcoding unit may be implemented by a transcoding cluster server.
Specific scenarios applying embodiments of the invention
As shown in fig. 7, a specific processing flow diagram of the information processing method according to the embodiment of the present invention includes the following steps:
step 401, acquiring a live stream;
specifically, acquiring the live stream includes: and loading and playing the live stream, storing the played playing information of the live stream into a memory, and acquiring the playing information of the live stream in the memory.
Step 402, converting the live stream into a TS fragment file;
specifically, the live stream conversion server converts the live stream into TS fragment files of the HLS protocol and simultaneously generates an M3U8 file, where the M3U8 file contains the fragmentation information of the TS fragment files.
And step 403, storing the TS fragment file to a cloud storage server.
And step 404, locating the offsets of the start and stop fragments of the TS fragment files to be clipped relative to the head and tail fragments of the whole set of TS fragment files, adding configuration information such as extension styles, and storing the offsets and the configuration information in a task queue.
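For illustration only, a hypothetical clip task as it might be stored in the task queue in step 404 is sketched below; the field names and values are assumptions and are not taken from the embodiment.
```python
# Hypothetical clip task placed on the task queue in step 404;
# the field names and values are illustrative, not taken from the patent.
clip_task = {
    "playlist": "/data/hls/index.m3u8",   # M3U8 describing all TS fragments
    "first_offset_ms": 125_300,           # clip start, measured from the head of the stream
    "second_offset_ms": 48_750,           # clip end, measured from the tail of the stream
    "extension_style": {
        "watermark": "/assets/logo.png",  # extension style, e.g. watermark information
    },
}
```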
Step 405, the clip merging task server obtains a task queue, and obtains and analyzes an M3U8 file according to the start-stop time of the live stream clip in the task queue.
And step 406, the clip merging task server downloads the TS fragment files from the cloud storage server according to the TS fragment information in the M3U8 file, and merges the downloaded TS fragment files into a complete MP4 file.
Step 407, the clip merging server stores the MP4 file in the unified storage database, and calculates the header and trailer durations that need to be removed from the MP4 file according to the clip merging task information, as the transcoding configuration information of the MP4 file.
And step 408, storing the transcoding configuration information into a task configuration server, and sending a transcoding task to the transcoding cluster server.
Step 409, the transcoding cluster server receives the transcoding task, downloads the MP4 file to be transcoded from the unified storage database, and acquires transcoding configuration information from the task configuration server;
here, the transcoding configuration information includes clip durations of a header and a trailer of the MP4 file, an extended style of the MP4 file, such as watermark information, and the like.
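For illustration only, a hypothetical transcoding configuration as it might be stored in the task configuration server is sketched below; the field names, presets, and target formats are assumptions.
```python
# Hypothetical transcoding configuration stored in the task configuration server (step 408);
# field names and target presets are illustrative only.
transcode_config = {
    "head_clip_ms": 1_250,    # duration to cut from the file head
    "tail_clip_ms": 980,      # duration to cut from the file tail
    "watermark": "/assets/logo.png",
    "targets": {              # one entry per playable format (Web, PC client, Android, iPhone)
        "web": {"vcodec": "libx264", "height": 720},
        "pc": {"vcodec": "libx264", "height": 1080},
        "android": {"vcodec": "libx264", "height": 480},
        "iphone": {"vcodec": "libx264", "height": 480},
    },
}
```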
And step 410, transcoding the MP4 file into video format files that can be played on the Web, PC clients, Android phones, and iPhones.
And 411, storing the transcoded file into a unified storage database.
In step 412, the user downloads the video file through the unified storage database and plays the video file.
A schematic diagram of a system composition structure for implementing the application scenario, as shown in fig. 8, includes: a live stream conversion server 10, a cloud storage server 20, a task queue 30, a clip merging server 40, a task configuration server 50, a transcoding cluster server 60 and a unified storage database 70; wherein,
the live stream conversion server 10 is configured to convert the live stream into TS fragment files of the HLS protocol and to generate an M3U8 file at the same time, where the M3U8 file contains the fragmentation information of the TS fragment files;
the cloud storage server 20 is configured to store the TS fragment file;
the task queue 30 is used for storing configuration information such as the extension styles and the offsets of the start and stop fragments of the TS fragment files to be clipped relative to the head and tail fragments of the whole set of TS fragment files;
the clip merging server 40 is configured to obtain a task from the task queue, and to obtain and parse the M3U8 file according to the start and end times of the live stream clip in the task queue; to download the TS fragment files from the cloud storage server according to the TS fragment information in the M3U8 file, and merge the downloaded TS fragment files into a complete MP4 file; and to store the MP4 file in the unified storage database, and calculate the head and tail durations to be removed from the MP4 file according to the clip merging task information, as the transcoding configuration information of the MP4 file;
the task configuration server 50 is configured to store the transcoding configuration information and send a transcoding task to the transcoding cluster server;
the transcoding cluster server 60 is configured to receive a transcoding task, download an MP4 file to be transcoded in a unified storage database, obtain transcoding configuration information in a task configuration server, and transcode an MP4 file;
the unified storage database 70 is used for storing the transcoded files.
The server is an example of a hardware entity, and as shown in fig. 9, the server includes a processor 31, a storage medium 32, and at least one external communication interface 33; the processor 31, the storage medium 32, and the external communication interface 33 are all connected by a bus 34.
Here, it should be added that the server at least includes a database for storing data and a processor for data processing, or includes a storage medium provided in the server or a storage medium provided separately.
As for the processor for data processing, when executing processing, the processor may be implemented by a microprocessor, a Central Processing Unit (CPU), a Digital Signal Processor (DSP), or a Field Programmable Gate Array (FPGA); as for the storage medium, it contains operation instructions, which may be computer-executable code, and the operation instructions implement the steps in the flow of the information processing method according to the above-described embodiments of the present invention.
Here, it should be noted that: the above description related to the server item is similar to the above description of the method, and the description of the beneficial effects of the method is omitted for brevity. For technical details not disclosed in the server of the present invention, refer to the description of the method embodiment of the present invention.
In the several embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. The above-described device embodiments are merely illustrative, for example, the division of the unit is only a logical functional division, and there may be other division ways in actual implementation, such as: multiple units or components may be combined, or may be integrated into another system, or some features may be omitted, or not implemented. In addition, the coupling, direct coupling or communication connection between the components shown or discussed may be through some interfaces, and the indirect coupling or communication connection between the devices or units may be electrical, mechanical or other forms.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units; can be located in one place or distributed on a plurality of network units; some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, all the functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may be separately regarded as one unit, or two or more units may be integrated into one unit; the integrated unit can be realized in a form of hardware, or in a form of hardware plus a software functional unit.
Those of ordinary skill in the art will understand that: all or part of the steps for realizing the method embodiments can be completed by hardware related to program instructions, the program can be stored in a computer readable storage medium, and the program executes the steps comprising the method embodiments when executed; and the aforementioned storage medium includes: various media capable of storing program codes, such as a removable Memory device, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, and an optical disk.
Alternatively, the integrated unit of the present invention may be stored in a computer-readable storage medium if it is implemented in the form of a software functional module and sold or used as a separate product. Based on such understanding, the technical solutions of the embodiments of the present invention may be essentially implemented or a part contributing to the prior art may be embodied in the form of a software product, which is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the methods described in the embodiments of the present invention. And the aforementioned storage medium includes: a removable storage device, a ROM, a RAM, a magnetic or optical disk, or various other media that can store program code.
The above description covers only specific embodiments of the present invention, but the scope of the present invention is not limited thereto. Any changes or substitutions that a person skilled in the art can readily conceive of within the technical scope disclosed by the present invention shall be covered within the scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the appended claims.
Claims (12)
1. An information processing method, characterized in that the method comprises:
acquiring M first data, wherein the first data have the attribute of a multimedia data stream, and M is an integer greater than or equal to 1;
converting any one of the M pieces of first data into N pieces of second data supporting a data fragmentation attribute according to a first coding strategy, wherein the coding format of the first data is different from that of the second data, and N is an integer greater than 1;
segmenting the second data according to a slicing strategy to obtain third data, wherein the third data is part of or all of the second data;
converting the third data into fourth data supporting a first media playing format according to a second coding strategy;
and calculating, according to the slicing strategy, the durations of the file head and the file tail that need to be removed from the fourth data, generating a third coding strategy, and transcoding the fourth data according to the third coding strategy, so that the fourth data is transcoded from the first media playing format into at least one media playing format other than the first media playing format.
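For illustration only, the following minimal Python sketch mirrors the flow of claim 1 under simplifying assumptions: the multimedia stream is modeled by its duration in seconds, the second data as fixed-length fragments, and formats as plain string tags. None of the names below (Fragment, convert_to_fragments, slice_fragments, process) come from the patent.

```python
# Minimal sketch of the claim 1 flow. The data model (durations in seconds,
# fixed 10 s fragments, format tags as strings) and every name below are
# illustrative assumptions, not the patent's actual encoder or container logic.
from dataclasses import dataclass, replace
from typing import List

FRAGMENT_SECONDS = 10.0  # assumed length of each "second data" fragment


@dataclass
class Fragment:
    start: float     # seconds from the head of the stream
    duration: float
    fmt: str         # coding / playing format tag


def convert_to_fragments(total_seconds: float) -> List[Fragment]:
    """First coding strategy: one 'first data' stream -> N fragmentable 'second data'."""
    frags, t = [], 0.0
    while t < total_seconds:
        d = min(FRAGMENT_SECONDS, total_seconds - t)
        frags.append(Fragment(t, d, fmt="fragmentable"))
        t += d
    return frags


def slice_fragments(frags: List[Fragment], head_off: float, tail_off: float) -> List[Fragment]:
    """Slicing strategy: keep the fragments overlapping [head_off, total - tail_off] ('third data')."""
    total = frags[-1].start + frags[-1].duration
    lo, hi = head_off, total - tail_off
    return [f for f in frags if f.start < hi and f.start + f.duration > lo]


def process(stream_seconds: float, head_off: float, tail_off: float) -> List[Fragment]:
    second = convert_to_fragments(stream_seconds)
    third = slice_fragments(second, head_off, tail_off)
    # Second coding strategy: convert the slice into the first media playing format.
    fourth = [replace(f, fmt="first_media_format") for f in third]
    # Third coding strategy: durations of the file head and tail to remove, i.e. how far
    # the kept fragments overshoot the offsets requested by the slicing strategy.
    head_trim = head_off - fourth[0].start
    tail_trim = (fourth[-1].start + fourth[-1].duration) - (stream_seconds - tail_off)
    # Transcode into another playing format while dropping the overshoot.
    fifth = [replace(f, fmt="other_media_format") for f in fourth]
    fifth[0] = replace(fifth[0], start=fifth[0].start + head_trim,
                       duration=fifth[0].duration - head_trim)
    fifth[-1] = replace(fifth[-1], duration=fifth[-1].duration - tail_trim)
    return fifth


# Example: a 95 s stream, keep the span from 12 s to 95 - 7 = 88 s.
print(process(95.0, head_off=12.0, tail_off=7.0))
```

The trims computed in process() correspond to the head and tail durations that claim 1 derives from the slicing strategy before the final transcode.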
2. The method according to claim 1, wherein when converting any one of the M pieces of first data into the second data supporting the data fragmentation attribute according to the first coding strategy, the method further comprises:
generating first information, wherein the first information is used to represent the fragmentation information of the N pieces of second data.
3. The method of claim 2, wherein segmenting the second data according to the slicing strategy to obtain the third data comprises:
obtaining, according to the slicing strategy, a first offset between the target data to be segmented and the head data of the second data, and a second offset between the target data to be segmented and the tail data of the second data;
extracting the first information, and parsing the first information to obtain the fragmentation information of the N pieces of second data;
and segmenting the second data according to the first offset, the second offset and the fragmentation information of the N pieces of second data to obtain the third data.
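As a hedged illustration of claim 3, the first information can be read as a manifest of fragment durations (the patent does not fix its format); the hypothetical select_fragments helper below returns the indices of the fragments that cover the span bounded by the two offsets.

```python
# Illustrative only: the "first information" is modeled as a list of fragment
# durations, and select_fragments is a hypothetical helper returning the indices
# of the 'third data' fragments selected by the first and second offsets.
from itertools import accumulate
from typing import List


def select_fragments(manifest_durations: List[float],
                     first_offset: float, second_offset: float) -> List[int]:
    """Keep every fragment that overlaps [first_offset, total - second_offset]."""
    ends = list(accumulate(manifest_durations))             # cumulative end time of each fragment
    starts = [e - d for e, d in zip(ends, manifest_durations)]
    lo, hi = first_offset, ends[-1] - second_offset
    return [i for i, (s, e) in enumerate(zip(starts, ends)) if s < hi and e > lo]


# Example: ten 6 s fragments, span from 8 s to 60 - 5 = 55 s -> fragments 1..9.
print(select_fragments([6.0] * 10, first_offset=8.0, second_offset=5.0))
```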
4. The method of claim 1 or 2, wherein transcoding the fourth data according to the third coding strategy comprises:
obtaining, according to the third coding strategy, a third offset between the target data to be converted and the head data of the fourth data, a fourth offset between the target data to be converted and the tail data of the fourth data, and an extension pattern of the converted target data;
and segmenting the fourth data according to the third offset and the fourth offset to obtain the target data, and transcoding the target data according to the extension pattern to obtain fifth data.
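The sketch below assumes one possible reading of claim 4, namely that the extension pattern behaves like an output naming template for the converted target data; the function name and template placeholders are hypothetical.

```python
# Hypothetical sketch of claim 4: the third/fourth offsets bound the target span
# inside the fourth data, and the "extension pattern" is modeled as an output
# filename template (an assumption about what the pattern controls).
from typing import Tuple


def transcode_target(fourth_duration: float, third_offset: float, fourth_offset: float,
                     extension_pattern: str, name: str = "clip") -> Tuple[Tuple[float, float], str]:
    start = third_offset                      # offset from the head data of the fourth data
    end = fourth_duration - fourth_offset     # offset from the tail data of the fourth data
    if end <= start:
        raise ValueError("offsets leave no target data to transcode")
    out_name = extension_pattern.format(name=name, start=start, end=end)
    return (start, end), out_name             # the span becomes the 'fifth data' written to out_name


# Example: a 30 s fourth data, trim 2 s from the head and 3 s from the tail.
print(transcode_target(30.0, 2.0, 3.0, "{name}_{start:.0f}-{end:.0f}.mp4"))
# -> ((2.0, 27.0), 'clip_2-27.mp4')
```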
5. The method according to claim 1 or 2, wherein after converting the third data into the fourth data supporting the first media playing format according to the second coding strategy, the method further comprises:
storing the fourth data to a unified storage database.
6. The method of claim 4, further comprising: transmitting the fifth data.
7. A server, characterized in that the server comprises: an acquisition unit, a first conversion unit, a segmentation unit, a second conversion unit, a generation unit, and a transcoding unit; wherein,
the acquisition unit is configured to acquire M first data, wherein the first data have the attribute of a multimedia data stream, and M is an integer greater than or equal to 1;
the first conversion unit is configured to convert any one of the M pieces of first data into N pieces of second data supporting a data fragmentation attribute according to a first coding strategy, wherein the coding format of the first data is different from that of the second data, and N is an integer greater than 1;
the segmentation unit is configured to segment the second data according to a slicing strategy to obtain third data, wherein the third data is part of or all of the second data;
the second conversion unit is configured to convert the third data into fourth data supporting a first media playing format according to a second coding strategy;
the generation unit is configured to calculate, according to the slicing strategy, the durations of the file head and the file tail that need to be removed from the fourth data, and to generate a third coding strategy;
the transcoding unit is configured to transcode the fourth data according to the third coding strategy, so that the fourth data is transcoded from the first media playing format into at least one media playing format other than the first media playing format.
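A hypothetical sketch of how the six units named in claim 7 could be wired together; the unit names follow the claim, while the callable signatures, the Fragment and Strategy aliases, and the run() orchestration are assumptions made for illustration.

```python
# Hypothetical wiring of the six units of claim 7; signatures and type aliases are assumed.
from typing import Callable, List, Tuple

Fragment = Tuple[float, float]          # assumed (start, duration) fragment model
Strategy = Tuple[float, float]          # assumed (head_offset, tail_offset) slicing strategy


class Server:
    def __init__(self,
                 acquire: Callable[[], List[bytes]],                                # acquisition unit
                 first_convert: Callable[[bytes], List[Fragment]],                  # first conversion unit
                 segment: Callable[[List[Fragment], Strategy], List[Fragment]],     # segmentation unit
                 second_convert: Callable[[List[Fragment]], bytes],                 # second conversion unit
                 generate: Callable[[bytes, Strategy], Tuple[float, float]],        # generation unit
                 transcode: Callable[[bytes, Tuple[float, float]], bytes]):         # transcoding unit
        self.acquire = acquire
        self.first_convert = first_convert
        self.segment = segment
        self.second_convert = second_convert
        self.generate = generate
        self.transcode = transcode

    def run(self, strategy: Strategy) -> List[bytes]:
        out = []
        for first in self.acquire():                      # M first data
            second = self.first_convert(first)            # N second data
            third = self.segment(second, strategy)        # third data
            fourth = self.second_convert(third)           # fourth data
            trims = self.generate(fourth, strategy)       # third coding strategy: head/tail trims
            out.append(self.transcode(fourth, trims))     # fifth data in another playing format
        return out
```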
8. The server according to claim 7, wherein the first conversion unit is further configured to generate first information when any one of the M pieces of first data is converted into the second data supporting the data fragmentation attribute according to the first coding strategy, wherein the first information is used to represent the fragmentation information of the N pieces of second data.
9. The server according to claim 8, wherein the segmentation unit is specifically configured to obtain, according to the slicing strategy, a first offset between the target data to be segmented and the head data of the second data, and a second offset between the target data to be segmented and the tail data of the second data;
extract the first information, and parse the first information to obtain the fragmentation information of the N pieces of second data;
and segment the second data according to the first offset, the second offset, and the fragmentation information of the N pieces of second data to obtain the third data.
10. The server according to claim 7 or 8, wherein the transcoding unit is specifically configured to obtain, according to the third coding strategy, a third offset between the target data to be converted and the head data of the fourth data, a fourth offset between the target data to be converted and the tail data of the fourth data, and an extension pattern of the converted target data;
and segment the fourth data according to the third offset and the fourth offset to obtain the target data, and transcode the target data according to the extension pattern to obtain fifth data.
11. The server according to claim 7 or 8, wherein the server further comprises a storage unit, configured to store the fourth data to a unified storage database.
12. The server according to claim 10, wherein the server further comprises a transmission unit configured to transmit the fifth data.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201510261003.5A CN104837033B (en) | 2015-05-20 | 2015-05-20 | A kind of information processing method and server |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201510261003.5A CN104837033B (en) | 2015-05-20 | 2015-05-20 | A kind of information processing method and server |
Publications (2)
Publication Number | Publication Date |
---|---|
CN104837033A CN104837033A (en) | 2015-08-12 |
CN104837033B true CN104837033B (en) | 2018-09-25 |
Family
ID=53814630
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201510261003.5A Active CN104837033B (en) | 2015-05-20 | 2015-05-20 | A kind of information processing method and server |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN104837033B (en) |
Families Citing this family (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105893135B (en) * | 2016-04-25 | 2019-07-26 | 深圳市永兴元科技股份有限公司 | Distributed data processing method and data center |
CN106231440A (en) * | 2016-07-22 | 2016-12-14 | 华为技术有限公司 | A kind of files in stream media burst method for down loading, device and terminal |
CN106534863A (en) * | 2016-11-11 | 2017-03-22 | 协创数据技术股份有限公司 | Live video stream transcoding device |
CN107948669A (en) * | 2017-12-22 | 2018-04-20 | 成都华栖云科技有限公司 | Based on CDN fast video production methods |
CN109495505B (en) * | 2018-12-21 | 2021-10-08 | 北京金山云网络技术有限公司 | Streaming media protocol conversion method, device, system and computer readable medium |
CN112954396B (en) * | 2021-02-05 | 2023-02-28 | 建信金融科技有限责任公司 | Video playing method and device, electronic equipment and computer readable storage medium |
Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102780918A (en) * | 2012-08-15 | 2012-11-14 | 华数传媒网络有限公司 | Video distributed coding format converting method |
CN103024289A (en) * | 2012-12-28 | 2013-04-03 | 天脉聚源(北京)传媒科技有限公司 | Method and device for editing live stream on basis of M3U8 listing protocol |
CN103024605A (en) * | 2012-12-31 | 2013-04-03 | 传聚互动(北京)科技有限公司 | Cloud transcoding method and system for video files |
CN103036888A (en) * | 2012-12-19 | 2013-04-10 | 南京视海网络科技有限公司 | Self-adapting stream-media play method and self-adapting play unit |
CN103036889A (en) * | 2012-12-19 | 2013-04-10 | 常州中流电子科技有限公司 | Self-adapting streaming media displaying method and self-adapting streaming media displaying system |
CN103731678A (en) * | 2013-12-30 | 2014-04-16 | 世纪龙信息网络有限责任公司 | Video file parallel transcoding method and system |
CN104104971A (en) * | 2013-04-02 | 2014-10-15 | 腾讯科技(深圳)有限公司 | Video file processing method and system |
CN104333765A (en) * | 2014-10-23 | 2015-02-04 | 无锡天脉聚源传媒科技有限公司 | Processing method and device of video live streams |
CN104410799A (en) * | 2014-12-24 | 2015-03-11 | 北京中科大洋信息技术有限公司 | Distributed technical review method |
CN104506881A (en) * | 2014-12-31 | 2015-04-08 | 成都东方盛行电子有限责任公司 | Scheduling method for audio/video fragment transcoding |
Also Published As
Publication number | Publication date |
---|---|
CN104837033A (en) | 2015-08-12 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN104837033B (en) | A kind of information processing method and server | |
CN106572358B (en) | Live broadcast time shifting method and client | |
CN107277081A (en) | Section method for down loading and device, the stream media system of stream medium data | |
CN106685942B (en) | Video live broadcast playback system and video live broadcast playback method | |
CN105282627B (en) | A kind of method and server obtaining live video slice | |
US9438936B1 (en) | Producing video data | |
US10880353B2 (en) | Systems and methods for cloud storage direct streaming | |
CN104902343B (en) | A kind of method, server and the terminal of transmission and playing audio-video and message | |
CN102055717A (en) | Quick playing method, terminal and server | |
CN103763637A (en) | Stream media broadcasting method and system | |
US9313084B2 (en) | Systems and methods for client-side media chunking | |
WO2015192683A1 (en) | Content distribution method, device and system based on adaptive streaming technology | |
CN110213615A (en) | Video transcoding method, device, server and storage medium | |
CN109089174B (en) | Multimedia data stream processing method and device and computer storage medium | |
US9356981B2 (en) | Streaming content over a network | |
CN104066000A (en) | Monitoring method and device for playing quality of streaming media file | |
US8724691B2 (en) | Transcoding video data | |
CN113079386A (en) | Video online playing method and device, electronic equipment and storage medium | |
CN109587517B (en) | Multimedia file playing method and device, server and storage medium | |
CN115086714B (en) | Data processing method, device, equipment and storage medium | |
CN114025201A (en) | Video playing method, device, equipment and storage medium | |
CN112218118A (en) | Audio and video clipping method and device | |
CN116132751B (en) | Method and system for synchronous playback based on web window scene | |
CN105207976A (en) | Multimedia information inter-cutting control method and multimedia information inter-cutting control system of AVS-DASH system, and client | |
CN106941630A (en) | A kind of method and apparatus for the code check for obtaining video slicing |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
 | C06 | Publication | |
 | PB01 | Publication | |
 | EXSB | Decision made by SIPO to initiate substantive examination | |
 | SE01 | Entry into force of request for substantive examination | |
 | GR01 | Patent grant | |