CN106341698B - Video live broadcast processing method and device, storage medium and terminal equipment - Google Patents

Video live broadcast processing method and device, storage medium and terminal equipment

Info

Publication number
CN106341698B
CN106341698B (application CN201510393907.3A)
Authority
CN
China
Prior art keywords
video
video stream
live
meta
stream
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201510393907.3A
Other languages
Chinese (zh)
Other versions
CN106341698A (en)
Inventor
袁树健
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd filed Critical Tencent Technology Shenzhen Co Ltd
Priority to CN201510393907.3A priority Critical patent/CN106341698B/en
Publication of CN106341698A publication Critical patent/CN106341698A/en
Application granted granted Critical
Publication of CN106341698B publication Critical patent/CN106341698B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/234Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs
    • H04N21/23418Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs involving operations for analysing video streams, e.g. detecting features or characteristics
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/234Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs
    • H04N21/23424Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs involving splicing one content stream with another content stream, e.g. for inserting or substituting an advertisement
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/238Interfacing the downstream path of the transmission network, e.g. adapting the transmission rate of a video stream to network bandwidth; Processing of multiplex streams

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Business, Economics & Management (AREA)
  • Marketing (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)

Abstract

The invention relates to a live video processing method and apparatus. The method comprises: receiving a live video request sent by a user terminal; acquiring a meta video stream according to the live video request; and detecting whether a stream break exists in the meta video stream and, if so, supplementing a pre-stored video stream at the break position of the meta video stream. The invention keeps the live broadcast uninterrupted, effectively reduces how often the user terminal sends refresh, access, and other requests to the server, lightens the server's load, and improves live streaming performance.

Description

Video live broadcast processing method and device, storage medium and terminal equipment
Technical Field
The invention relates to the technical field of networks, in particular to a live video processing method and device.
Background
With the development of network technology, accessing all kinds of resources through a browser for entertainment, study, and work has become part of daily life. When live video is watched through a browser, the video and audio may be interrupted for various reasons such as network jitter. Once the video or audio is interrupted, the user has to keep refreshing the web page or waiting for the live data to load, and in some cases normal playback cannot be restored until the live video stream resumes and the live request is sent again. The background server therefore has to keep processing the refresh, access, and other requests sent by users, which increases its load and directly degrades live streaming performance.
Disclosure of Invention
In view of the foregoing, it is desirable to provide a live video processing method and apparatus that can reduce the load on a server and improve the live video performance.
A video live broadcast processing method, the method comprising:
receiving a video live broadcast request sent by a user terminal;
acquiring a meta video stream according to the video live broadcast request;
detecting whether the meta video stream has a stream break; and
if so, supplementing a pre-stored video stream at the break position of the meta video stream.
A live video processing apparatus, the apparatus comprising:
the video request receiving module is used for receiving a video live broadcast request sent by a user terminal;
the meta video stream acquisition module is used for acquiring a meta video stream according to the video live broadcast request;
the break detection module is used for detecting whether a stream break exists in the meta video stream;
and the stream supplementing module is used for supplementing a pre-stored video stream at the break position of the meta video stream.
According to the above live video processing method and apparatus, after receiving a live video request sent by the user terminal, the server acquires the meta video stream according to the request and detects whether the meta video stream has a stream break; if so, the pre-stored video stream is supplemented at the break position. Live playback is therefore kept uninterrupted, the frequency of refresh, access, and other requests sent by the user terminal to the server is effectively reduced, the server's load is lightened, and live streaming performance is improved.
Drawings
FIG. 1 is a diagram of an application environment in which a live video processing method is implemented in one embodiment;
fig. 2 is an application environment diagram of a video live broadcast processing method in another embodiment;
FIG. 3 is a diagram illustrating an internal architecture of a server according to an embodiment;
FIG. 4 is a flowchart illustrating a video live processing method according to an embodiment;
fig. 5 is a schematic flowchart of a video live broadcast processing method in another embodiment;
fig. 6 is a detailed flow diagram of detecting whether a meta video stream has a break in one embodiment;
FIG. 7 is a detailed flow diagram of supplementing a pre-stored video stream at the break position of the meta video stream in one embodiment;
FIG. 8 is a diagram of a user terminal interface for a video live broadcast process in a particular application scenario;
FIG. 9 is a second user terminal interface diagram illustrating the video live broadcast process in a specific application scenario;
FIG. 10 is a block diagram showing the structure of a video live processing apparatus according to an embodiment;
FIG. 11 is a block diagram of the structure of the break detection module in one embodiment;
FIG. 12 is a block diagram of the structure of the stream supplementing module in one embodiment;
fig. 13 is a block diagram of a video live broadcast processing apparatus in another embodiment.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
As shown in fig. 1, the application environment in which a live video processing method is implemented in one embodiment includes a user terminal 102, a server 104, and a live recording terminal 106. The user terminal 102 may be any terminal with a browser application installed, such as a mobile phone, laptop, tablet computer, or desktop computer; the server 104 may be one or more servers, and the live recording terminal 106 may be one or more camera devices. The user terminal 102 and the server 104, and the server 104 and the live recording terminal 106, communicate with each other over a network.
Fig. 2 is a diagram of an application environment implemented by a live video processing method in another embodiment. The application environment includes a user terminal 102, a video forwarding server 104a, a video detection server 104b, a video transcoding server 104c, a live video server 104d, and a live recording terminal 106. The user terminal 102 runs a live video application that at least provides the function of sending a live video request. The live video server 104d provides service support for the live video application running on the user terminal 102. The video forwarding server 104a is connected to the live recording terminal 106 over a wireless communication link and obtains the meta video stream sent from the live recording terminal 106. The video detection server 104b stores a pre-recorded video stream, detects whether the meta video stream has a stream break, and supplements the pre-stored video stream at the break position when a break occurs. The video transcoding server 104c transcodes the meta video stream and sends the transcoded stream to the user terminal 102 through the live video server 104d, thereby realizing the live broadcast.
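The division of responsibilities among these servers can be pictured as a simple configuration map. The following minimal Python sketch is illustrative only: the host names and the describe_pipeline helper are assumptions introduced here, and the role strings merely restate the responsibilities described above.

from collections import OrderedDict

# Illustrative pipeline map for the deployment of fig. 2; host names are assumed placeholders.
PIPELINE = OrderedDict([
    ("video_forwarding_server",  {"host": "forward.example.internal",
                                  "role": "receives the meta video stream from the live recording terminal"}),
    ("video_detection_server",   {"host": "detect.example.internal",
                                  "role": "detects stream breaks and splices in the pre-stored video stream"}),
    ("video_transcoding_server", {"host": "transcode.example.internal",
                                  "role": "transcodes the (possibly supplemented) meta video stream"}),
    ("video_live_server",        {"host": "live.example.internal",
                                  "role": "delivers the transcoded stream to the user terminal"}),
])

def describe_pipeline(pipeline):
    # Print the servers in processing order together with their roles.
    for name, info in pipeline.items():
        print(f"{name} ({info['host']}): {info['role']}")

if __name__ == "__main__":
    describe_pipeline(PIPELINE)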
In one embodiment, the internal structure of the server 104 in fig. 1 is shown schematically in fig. 3. The server 104 includes a processor, an internal memory, a non-volatile storage medium, and a network interface connected by a system bus. The storage medium of the server stores an operating system and a live video processing apparatus, and the apparatus is used to implement a live video processing method. The processor of the server 104 provides computing and control capabilities that support the operation of the entire server 104 and executes the live video processing method; the internal memory provides a runtime environment for the live video processing apparatus in the storage medium; and the network interface is used for network communication with the user terminal 102, such as receiving a live video request sent by the user terminal 102 and returning the transcoded video stream to the user terminal 102.
As shown in fig. 4, in an embodiment, a live video processing method is provided, and this embodiment is exemplified by applying the method to the server in fig. 1. The video live broadcast processing method specifically comprises the following steps:
step S402: and receiving a video live broadcast request sent by a user terminal.
When accessing video resources, a user terminal generally does so through a browser or a video application. The terminal receives a live video request trigger instruction entered by the user on the browser or video application interface, generates a live video request, and sends it to the server. Specifically, a browser application or video application for accessing video resources runs on the user terminal, and a live request trigger control (such as a button or hyperlink) is provided in the application interface. The user enters the trigger instruction through an input device of the terminal, such as a touch screen, a key, or a trackball, causing the terminal to send the live video request to the server.
Step S404: acquiring the meta video stream according to the video live broadcast request.
Specifically, the live recording terminal encodes the captured video data and adds address information to generate IP data packets. After establishing a wireless communication connection with the live recording terminal, the server receives the IP data packets transmitted by the live recording terminal in real time. After receiving the live video request sent by the user terminal, the server extracts the meta video stream from the IP data packets.
A video stream is a collection of video frames over a continuous period of time. The meta video stream is the video stream transmitted directly from the live recording terminal, with no pre-recorded video or animation inserted into it.
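To make the acquisition step concrete, the following minimal Python sketch receives the IP data packets and yields the encoded media payload that makes up the meta video stream. It assumes UDP transport, a fixed 12-byte header carrying the address information, and port 9000; the patent does not specify the actual packet layout, so these values and the function name are illustrative assumptions.

import socket

HEADER_SIZE = 12                    # assumed header length prepended by the live recording terminal
LISTEN_ADDR = ("0.0.0.0", 9000)     # assumed port on which the server receives the packets

def receive_meta_video_stream():
    # Yield raw encoded payload chunks of the meta video stream as IP data packets arrive.
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(LISTEN_ADDR)
    try:
        while True:
            packet, _peer = sock.recvfrom(65535)
            if len(packet) <= HEADER_SIZE:
                continue                    # ignore malformed or header-only packets
            yield packet[HEADER_SIZE:]      # strip the header, keep the encoded media payload
    finally:
        sock.close()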
Step S406: it is detected whether there is a break in the meta video stream.
Specifically, the meta video stream acquired by the server may break for various reasons such as network jitter. A stream break is a situation in which the video or audio is interrupted. When a break is detected in the meta video stream, step S408 is performed; otherwise, the process returns to step S406.
Step S408: the pre-stored video stream is supplemented at the break position of the meta video stream.
Specifically, the server stores a video stream in advance. The pre-stored video stream may be a pre-recorded video segment or a pre-made animation segment. In one embodiment, the pre-stored video stream is supplemented starting from the position where the meta video stream breaks.
For example, if the live content is a variety show, a pre-stored advertisement or a funny animation can be inserted at the break position of the meta video stream when a break occurs. If the live content is a basketball game, a pre-stored player profile video or a highlights review can be inserted at the break position. If the live content is the launch event of a film, pre-stored behind-the-scenes footage of the related film can be inserted at the break position of the meta video stream.
According to the above live video processing method, after receiving a live video request sent by the user terminal, the server acquires the meta video stream according to the request and detects whether the meta video stream has a stream break; if so, the pre-stored video stream is supplemented at the break position. Live playback is therefore kept uninterrupted, the frequency of refresh, access, and other requests sent by the user terminal to the server is effectively reduced, the server's load is lightened, and live streaming performance is improved.
As shown in fig. 5, in another embodiment, a video live broadcast processing method is provided, including:
step S502: and receiving a video live broadcast request sent by a user terminal.
The user terminal runs a browser application or video application for accessing video resources, and a live request trigger control (such as a button or hyperlink) is provided in the application interface. The user enters the trigger instruction through an input device of the terminal, such as a touch screen, a key, or a trackball, causing the terminal to send the live video request to the server.
Step S504: acquiring the meta video stream according to the video live broadcast request.
After receiving a video live broadcast request sent by a user terminal, the server can acquire a meta video stream according to the video live broadcast request.
Step S506: it is detected whether there is a break in the meta video stream.
Specifically, the meta video stream acquired by the server may break for various reasons such as network jitter. A stream break is a situation in which the video or audio is interrupted. When a break is detected in the meta video stream, step S508 is performed; otherwise, the process returns to step S506.
Step S508: the pre-stored video stream is supplemented at the break position of the meta video stream.
Specifically, the server stores a video stream in advance. The pre-stored video stream may be a pre-recorded video segment or a pre-made animation segment. In one embodiment, the pre-stored video stream is supplemented starting from the position where the meta video stream breaks.
Step S510: transcoding the supplemented meta video stream and sending it to the user terminal to realize the live broadcast.
Video transcoding converts one compressed and encoded video bit stream into another, so as to adapt to different network bandwidths, different terminal processing capabilities, and different user requirements. Transcoding is essentially a decode-then-encode process, so the streams before and after conversion may or may not conform to the same video coding standard.
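As an illustration of this decode-then-encode step, the sketch below shells out to the ffmpeg command-line tool, which is an assumed choice since the patent does not name a particular transcoder; the input and output URLs and the bitrate are placeholders. It decodes the supplemented stream and re-encodes it as H.264/AAC at a bitrate suited to the user terminal.

import subprocess

def transcode_for_terminal(input_url, output_url, video_bitrate="800k"):
    # Decode the (possibly supplemented) meta video stream and re-encode it
    # to match the terminal's bandwidth and processing capability.
    cmd = [
        "ffmpeg",
        "-i", input_url,        # source: the supplemented meta video stream
        "-c:v", "libx264",      # re-encode the video track
        "-b:v", video_bitrate,  # target video bitrate for the user terminal
        "-c:a", "aac",          # re-encode the audio track
        "-f", "flv",            # container commonly used for live delivery
        output_url,             # e.g. an address served to the user terminal
    ]
    subprocess.run(cmd, check=True)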
Step S512: detecting whether the meta video stream has returned to normal.
Specifically, after the server detects a break in the meta video stream, supplements it with the pre-stored video stream, transcodes the supplemented stream, and sends it to the user terminal, it must continue to check whether the meta video stream has returned to normal. If it has, step S514 is performed; otherwise, the process returns to step S510.
Step S514: transcoding the recovered meta video stream and sending it to the user terminal to realize the live broadcast.
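Steps S506 to S514 can be summarised as a single control loop: serve the meta video stream while it is healthy, switch to the pre-stored stream once a gap exceeds the threshold, and switch back as soon as meta frames arrive again. The Python sketch below is a simplification under stated assumptions: poll_meta_frame, filler_frames, and deliver are hypothetical callables standing in for the acquisition, storage, and transcoding/delivery components described above.

import itertools
import time

def run_live_loop(poll_meta_frame, filler_frames, deliver, gap_threshold_s=2.0):
    # Control loop for steps S506-S514.
    filler = itertools.cycle(filler_frames)        # loop the pre-stored clip while the break lasts
    last_meta_at = time.monotonic()
    while True:
        frame = poll_meta_frame()                  # returns a frame, or None if none has arrived yet
        now = time.monotonic()
        if frame is not None:                      # meta video stream is normal (or has recovered)
            last_meta_at = now
            deliver(frame)                         # S510/S514: transcode and send to the terminal
        elif now - last_meta_at > gap_threshold_s: # break detected (S506)
            deliver(next(filler))                  # S508: supplement the pre-stored video stream
        else:
            time.sleep(0.01)                       # gap still within tolerance, keep waiting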
As shown in fig. 6, in one embodiment, the step of detecting whether there is a break in the meta video stream comprises:
step S602: the meta video stream is decomposed into audio frames and video frames.
Specifically, a video consists of pictures and sound: the pictures form the video frames and the sound forms the audio frames.
Step S604: it is detected whether a time difference between adjacent video frames or between adjacent audio frames exceeds a time threshold.
Specifically, adjacent video frames and adjacent audio frames are normally continuous in time. If the gap between adjacent video frames or between adjacent audio frames exceeds a certain time threshold, step S606 is executed and the meta video stream is judged to have a break; otherwise, step S608 is executed and the meta video stream is judged to have no break.
For example, if the live picture or sound stops suddenly at exactly 10:00 and does not return to normal until 10:00:30, a stream break has occurred. When this happens, the user may habitually refresh the browser, or may have to resend the live video request before normal playback is restored.
Live playback on the user terminal generally lags the live recording by a certain delay, and the server can use this delay to examine the meta video stream. Suppose the time threshold is set to 2 seconds: if the server detects that the next video frame or audio frame has not appeared after 2 seconds, i.e. the time difference between adjacent video frames or between adjacent audio frames exceeds 2 seconds, it judges that the meta video stream has a break, so that corresponding measures can be taken in time.
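The gap test described in this embodiment amounts to comparing the timestamps of adjacent frames against the threshold. The following sketch assumes the timestamps are presentation times in seconds for a single sequence (audio or video) and uses the 2-second threshold from the example above; the function name is introduced here for illustration.

def has_stream_break(frame_timestamps, threshold_s=2.0):
    # Return True if any gap between adjacent frame timestamps exceeds the threshold.
    previous = None
    for ts in frame_timestamps:
        if previous is not None and ts - previous > threshold_s:
            return True
        previous = ts
    return False

# A 2.5-second gap between adjacent frames is flagged as a break.
assert has_stream_break([0.00, 0.04, 0.08, 2.58]) is True
assert has_stream_break([0.00, 0.04, 0.08, 0.12]) is False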
As shown in fig. 7, in a specific embodiment, the step of supplementing the pre-stored video stream at the break position of the meta video stream includes the following steps:
Step S702: acquiring the time point at which the break in the meta video stream occurs.
Specifically, a break in the meta video stream always corresponds to a time point. For example, if the live video suddenly loses sound or picture at 10:00, the time point 10:00 is obtained.
Step S704: supplementing the pre-stored video stream starting from that time point.
Specifically, after the time point of the break is obtained, for example 10:00, the pre-stored video stream is supplemented starting from 10:00.
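As a concrete illustration of steps S702 and S704, the sketch below splices a pre-stored filler clip into the meta video stream at the time point of the break. Frames are assumed to be (timestamp, payload) pairs and the filler clip's timestamps are assumed to start at zero; both are simplifying assumptions rather than the patent's actual data format.

def supplement_at_break(meta_frames, filler_frames, break_time_s):
    # Yield the meta video stream up to the break time point, then the re-stamped filler.
    for ts, payload in meta_frames:
        if ts >= break_time_s:
            break                                # stop at the break position
        yield ts, payload
    for offset, payload in filler_frames:        # filler timestamps start at 0
        yield break_time_s + offset, payload     # splice the filler in at the break point

# Example: the meta stream breaks at t = 1.0 s and a two-frame filler clip is spliced in.
meta = [(0.0, b"m0"), (0.5, b"m1"), (1.0, b"m2")]
filler = [(0.0, b"f0"), (0.5, b"f1")]
print(list(supplement_at_break(meta, filler, break_time_s=1.0)))
# -> [(0.0, b'm0'), (0.5, b'm1'), (1.0, b'f0'), (1.5, b'f1')]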
The principle of the above-mentioned live video processing method is described below by using a specific application scenario, which takes a mobile phone as a user terminal as an example.
After the user sends a live request for the 'news simulcast' program to the server through the touch screen of the mobile phone, the server acquires the meta video stream according to the request; if the meta video stream has no break, the user sees the video picture shown in fig. 8 on the phone. During the live broadcast, the server examines the acquired meta video stream in real time. When a break is detected, the server supplements the pre-stored video stream at the break position, transcodes the supplemented stream, and sends it to the user terminal to continue the live broadcast. At this point the user sees the supplemented video stream playing, for example the picture shown in fig. 9, and may also hear a prompt such as a friendly notice. In addition, the server checks in real time whether the meta video stream has returned to normal; if it has, the recovered meta video stream is transcoded and sent to the user terminal to continue the live broadcast.
As shown in fig. 10, in one embodiment, a live video processing apparatus 1000 is provided, which implements the live video processing method of each of the above embodiments. The live video processing apparatus 1000 includes a video request receiving module 1002, a meta video stream acquisition module 1004, a break detection module 1006, and a stream supplementing module 1008.
Specifically, the video request receiving module 1002 is configured to receive a live video request sent by a user terminal.
The meta video stream acquisition module 1004 is configured to acquire the meta video stream according to the live video request.
The break detection module 1006 is configured to detect whether the meta video stream has a stream break.
The stream supplementing module 1008 is configured to supplement the pre-stored video stream at the break position of the meta video stream.
As shown in fig. 11, in one embodiment, the break detection module 1100 includes a decomposition unit 1102 and a detection unit 1104.
The decomposition unit 1102 is configured to decompose the meta video stream into audio frames and video frames.
The detection unit 1104 is configured to detect whether the time difference between adjacent video frames or between adjacent audio frames exceeds the time threshold; if so, the meta video stream has a stream break, and if not, it has no stream break.
As shown in fig. 12, in one embodiment, the stream supplementing module 1200 includes a time point acquisition unit 1202 and a video stream supplementing unit 1204.
The time point acquisition unit 1202 is configured to acquire the time point at which the break in the meta video stream occurs.
The video stream supplementing unit 1204 is configured to supplement the pre-stored video stream starting from that time point.
As shown in fig. 13, in another embodiment, a live video processing apparatus 1300 is provided, which implements the live video processing method of each of the above embodiments. The live video processing apparatus 1300 includes a video request receiving module 1302, a meta video stream acquisition module 1304, a break detection module 1306, a stream supplementing module 1308, a video transcoding module 1310, and a live video sending module 1312.
The video request receiving module 1302 is configured to receive a live video request sent by a user terminal.
The meta video stream acquisition module 1304 is configured to acquire the meta video stream according to the live video request.
The break detection module 1306 is configured to detect whether the meta video stream has a stream break.
The stream supplementing module 1308 is configured to supplement the pre-stored video stream at the break position of the meta video stream.
The video transcoding module 1310 is configured to transcode the supplemented meta video stream.
The live video sending module 1312 is configured to send the transcoded meta video stream to the user terminal to realize the live broadcast.
The break detection module 1306 is further configured to detect whether the meta video stream has returned to normal; if so, the recovered meta video stream is transcoded by the video transcoding module 1310 and sent to the user terminal by the live video sending module 1312 to realize the live broadcast.
It will be understood by those skilled in the art that all or part of the processes of the methods of the embodiments described above can be implemented by a computer program, which can be stored in a computer-readable storage medium, and when executed, can include the processes of the embodiments of the methods described above. The storage medium may be a non-volatile storage medium such as a magnetic disk, an optical disk, a Read-Only Memory (ROM), or a Random Access Memory (RAM).
The technical features of the embodiments described above may be arbitrarily combined, and for the sake of brevity, all possible combinations of the technical features in the embodiments described above are not described, but should be considered as being within the scope of the present specification as long as there is no contradiction between the combinations of the technical features.
The above-mentioned embodiments only express several embodiments of the present invention, and the description thereof is more specific and detailed, but not construed as limiting the scope of the invention. It should be noted that, for a person skilled in the art, several variations and modifications can be made without departing from the inventive concept, which falls within the scope of the present invention. Therefore, the protection scope of the present patent shall be subject to the appended claims.

Claims (10)

1. A video live broadcast processing method, the method comprising:
receiving a video live broadcast request sent by a user terminal;
acquiring a meta video stream, according to the live video request, from IP data packets received in real time; the IP data packets are generated by the live recording terminal by encoding the captured video data and adding address information; the meta video stream is the video stream transmitted directly from the live recording terminal, with no pre-recorded video or animation inserted into it;
decomposing the meta-video stream into audio frames and video frames;
detecting whether the time gap between adjacent video frames or between adjacent audio frames exceeds a time threshold, and if so, judging that the meta video stream has a stream break; the stream break is a condition in which video data or audio data in the meta video stream is interrupted; when the video data is interrupted the live picture stops, and when the audio data is interrupted the live sound stops;
supplementing a pre-stored video stream at the break position of the meta video stream; the video stream is pre-stored in the server and comprises a pre-recorded video associated with the live content corresponding to the live video request or a pre-made animation associated with that live content;
transcoding the supplemented meta video stream and sending it to the user terminal to realize the live broadcast;
and when the meta video stream returns to normal, transcoding the recovered, break-free meta video stream and sending it to the user terminal to realize the live broadcast.
2. The method of claim 1, wherein if the time difference between adjacent video frames or between adjacent audio frames does not exceed the time threshold, it is determined that the meta video stream has no stream break.
3. The method of claim 1, further comprising:
and detecting whether the primary video stream is recovered to be normal or not, if so, transcoding the recovered primary video stream and sending the transcoded primary video stream to a user terminal to realize live video.
4. The method of claim 1, wherein the step of supplementing the pre-stored video stream to the location of the meta video stream break comprises:
acquiring a time point of the occurrence of the break of the meta video stream;
and supplementing the pre-stored video stream from the time point.
5. A live video processing apparatus, the apparatus comprising:
the video request receiving module is used for receiving a video live broadcast request sent by a user terminal;
the meta video stream acquisition module is used for acquiring a meta video stream, according to the live video request, from IP data packets received in real time; the IP data packets are generated by the live recording terminal by encoding the captured video data and adding address information; the meta video stream is the video stream transmitted directly from the live recording terminal, with no pre-recorded video or animation inserted into it;
the break detection module is used for decomposing the meta video stream into audio frames and video frames, detecting whether the time gap between adjacent video frames or between adjacent audio frames exceeds a time threshold, and if so, judging that the meta video stream has a stream break; the stream break is a condition in which video data or audio data in the meta video stream is interrupted; when the video data is interrupted the live picture stops, and when the audio data is interrupted the live sound stops;
the stream supplementing module is used for supplementing a pre-stored video stream at the break position of the meta video stream; the video stream is pre-stored in the server and comprises a pre-recorded video associated with the live content corresponding to the live video request or a pre-made animation associated with that live content;
the video transcoding module is used for transcoding the supplemented meta video stream;
the live video sending module is used for sending the transcoded meta video stream to the user terminal to realize the live broadcast;
and the video transcoding module is further used for transcoding the recovered, break-free meta video stream when the meta video stream returns to normal and sending it to the user terminal to realize the live broadcast.
6. The apparatus of claim 5, wherein the break detection module comprises:
a decomposition unit for decomposing the meta video stream into audio frames and video frames;
and the detection unit is used for detecting whether the time difference between adjacent video frames or between adjacent audio frames exceeds the time threshold; if so, the meta video stream has a stream break, and if not, the meta video stream has no stream break.
7. The apparatus of claim 5, wherein the break detection module is further configured to detect whether the meta video stream has returned to normal, and if so, the recovered meta video stream is transcoded by the video transcoding module and sent to the user terminal by the live video sending module to realize the live broadcast.
8. The apparatus of claim 5, wherein the stream supplementing module comprises:
a time point acquisition unit for acquiring the time point at which the break in the meta video stream occurs;
and a video stream supplementing unit for supplementing the pre-stored video stream starting from that time point.
9. A storage medium on which a computer program is stored, the program being executable by a processor to implement a live video processing method as claimed in any one of claims 1 to 4.
10. A terminal device comprising a storage medium, a processor and a computer program stored on the storage medium and executable on the processor, the processor implementing a live video processing method as claimed in any one of claims 1 to 4 when executing the program.
CN201510393907.3A 2015-07-07 2015-07-07 Video live broadcast processing method and device, storage medium and terminal equipment Active CN106341698B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201510393907.3A CN106341698B (en) 2015-07-07 2015-07-07 Video live broadcast processing method and device, storage medium and terminal equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201510393907.3A CN106341698B (en) 2015-07-07 2015-07-07 Video live broadcast processing method and device, storage medium and terminal equipment

Publications (2)

Publication Number Publication Date
CN106341698A CN106341698A (en) 2017-01-18
CN106341698B true CN106341698B (en) 2020-11-03

Family

ID=57826340

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201510393907.3A Active CN106341698B (en) 2015-07-07 2015-07-07 Video live broadcast processing method and device, storage medium and terminal equipment

Country Status (1)

Country Link
CN (1) CN106341698B (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107071503B (en) * 2017-02-09 2019-01-08 腾讯科技(深圳)有限公司 The method, apparatus of net cast and live streaming connect streaming server
CN109547810A (en) * 2018-12-28 2019-03-29 北京奇艺世纪科技有限公司 A kind of live broadcasting method and device
CN110401869A (en) * 2019-07-26 2019-11-01 歌尔股份有限公司 A kind of net cast method, system and electronic equipment and storage medium
CN111601061B (en) * 2020-06-01 2021-12-24 联想(北京)有限公司 Video recording information processing method and electronic equipment
CN112423012B (en) * 2020-11-18 2023-05-09 青岛华升联信智慧科技有限公司 Multi-stage load live broadcast method

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101894558A (en) * 2010-08-04 2010-11-24 华为技术有限公司 Lost frame recovering method and equipment as well as speech enhancing method, equipment and system
CN101990126A (en) * 2009-08-07 2011-03-23 未序网络科技(上海)有限公司 Method for spotting advertisement in dynamic switching of internet on-demand or live broadcast signals
CN102572409A (en) * 2011-12-19 2012-07-11 中山爱科数字科技股份有限公司 Method for preventing video interruption in sector switching process of mobile video monitoring
CN104935948A (en) * 2015-05-13 2015-09-23 深圳市中幼国际教育科技有限公司 Video direct broadcast image processing method and system

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101198022B (en) * 2007-12-26 2010-06-02 青岛海信移动通信技术股份有限公司 Method for inter cutting video information in stream media broadcasting or buffering course


Also Published As

Publication number Publication date
CN106341698A (en) 2017-01-18

Similar Documents

Publication Publication Date Title
CN106341698B (en) Video live broadcast processing method and device, storage medium and terminal equipment
US20200236408A1 (en) Reducing time to first encrypted frame in a content stream
US9832517B2 (en) Seamless playback of media content using digital watermarking
WO2020248909A1 (en) Video decoding method and apparatus, computer device, and storage medium
US20130042100A1 (en) Method and apparatus for forced playback in http streaming
US11228801B2 (en) Method and apparatus for providing multi-view streaming service
CN108989854B (en) Playlist error labeling for delivery and rendering of streaming media
CN106998485B (en) Video live broadcasting method and device
US20170353518A1 (en) Catching up to the live playhead in live streaming
US10277653B2 (en) Failure detection manager
CN106664449A (en) Device switching for a streaming service
US10091265B2 (en) Catching up to the live playhead in live streaming
EP3466081A1 (en) Catching up to the live playhead in live streaming
US11908481B2 (en) Method for encoding live-streaming data and encoding device
US9866459B1 (en) Origin failover for live streaming
US20170142498A1 (en) Crowdsourcing-enhanced audio
CN104053002A (en) Video decoding method and device
CN112333529B (en) Live streaming loading method and device, equipment and medium thereof
CN111641864A (en) Video information acquisition method, device and equipment
US11997369B2 (en) Method of processing an error during the rendering of a digital content
CN111050192A (en) Media processing method and device
KR102281217B1 (en) Method for encoding and decoding, and apparatus for the same
CN105959798B (en) The frame alignment method, apparatus and equipment of video flowing
CN112073727B (en) Transcoding method and device, electronic equipment and storage medium
WO2016032383A1 (en) Sharing of multimedia content

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant