CN116634188A - Live broadcast method and device and computer readable storage medium - Google Patents

Live broadcast method and device and computer readable storage medium Download PDF

Info

Publication number
CN116634188A
CN116634188A
Authority
CN
China
Prior art keywords
video
teaching
live
live broadcast
storage medium
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310682884.2A
Other languages
Chinese (zh)
Inventor
王廷虎
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Century TAL Education Technology Co Ltd
Original Assignee
Beijing Century TAL Education Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Century TAL Education Technology Co Ltd filed Critical Beijing Century TAL Education Technology Co Ltd
Priority to CN202310682884.2A priority Critical patent/CN116634188A/en
Publication of CN116634188A publication Critical patent/CN116634188A/en
Pending legal-status Critical Current

Links

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/21Server components or server architectures
    • H04N21/218Source of audio or video content, e.g. local disk arrays
    • H04N21/2187Live feed
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/433Content storage operation, e.g. storage operation in response to a pause request, caching operations
    • H04N21/4334Recording operations
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/44Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs
    • H04N21/44016Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs involving splicing one content stream with another content stream, e.g. for substituting a video clip
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/60Network structure or processes for video distribution between server and client or between remote clients; Control signalling between clients, server and network components; Transmission of management data between server and client, e.g. sending from server to client commands for recording incoming content stream; Communication details between server and client 
    • H04N21/63Control signaling related to video distribution between client, server and network components; Network processes for video distribution between server and clients or between remote clients, e.g. transmitting basic layer and enhancement layers over different transmission paths, setting up a peer-to-peer communication via Internet between remote STB's; Communication protocols; Addressing
    • H04N21/643Communication protocols
    • H04N21/6437Real-time Transport Protocol [RTP]

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Databases & Information Systems (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)

Abstract

The disclosure relates to a live broadcast method and device and a computer readable storage medium, in the technical field of the Internet. The live broadcast method of the disclosure comprises the following steps: acquiring a teaching file; acquiring, through a remote procedure call protocol, a teaching live video collected in real time by video acquisition equipment of a video production end; acquiring synchronous operation information of the video production end on the teaching file; storing the teaching live video in a storage medium; reading a video frame of the teaching live video at a designated time from the storage medium; generating a file operation picture for the designated time according to the teaching file and the synchronous operation information; and synthesizing the live video picture of the video consumption end according to the file operation picture and the video frame of the teaching live video at the designated time. The method and device of the disclosure improve the fluency of live pictures.

Description

Live broadcast method and device and computer readable storage medium
Technical Field
The disclosure relates to the technical field of internet, in particular to a live broadcast method and device and a computer readable storage medium.
Background
In recent years, with the vigorous development of live broadcast technology, public awareness of online education has gradually grown, and the number of online education users has increased rapidly. Online education can break through limitations of time and space, promote resource sharing and advance educational equity; with the development of education informatization, online classroom technology has entered a stage of integrated innovation.
Classroom live broadcast technology mainly comprises the steps of collecting, encoding, transmitting, distributing and playing streaming media. The mainstream streaming protocol is RTMP (Real-Time Messaging Protocol), and the image coding format is usually RGB (Red, Green, Blue). In addition, live video is usually played on the client side using a Flash player.
Disclosure of Invention
According to a first aspect of the present disclosure, there is provided a live broadcast method, including: acquiring a teaching file; acquiring a teaching live video acquired in real time by video acquisition equipment of a video production end through a remote procedure call protocol; acquiring synchronous operation information of a video production end on a teaching file; storing the teaching live video into a storage medium; reading a video frame of the teaching live video at a designated time from a storage medium; generating a file operation picture at a specified time according to the teaching file and the synchronous operation information; and synthesizing the video live broadcast picture of the video consumption end according to the file operation picture of the designated time and the video frame of the teaching live broadcast video at the designated time.
In some embodiments, reading video frames of a live teaching video at a specified time from a storage medium includes: and reading the video frames of the teaching live video at the appointed time according to the storage address information of the video frames of the teaching live video at the appointed time, wherein the storage address information of different video frames of the teaching live video is different.
In some embodiments, reading video frames of a live teaching video at a specified time from a storage medium includes: according to the identification of the teaching live video, video frames of the teaching live video, acquired by the designated streaming media software development kit, of the designated video production end are read from a storage medium, wherein the identification of the teaching live video comprises a first identification and a second identification of the teaching live video, the first identification represents the video production end for acquiring the teaching live video, and the second identification represents the streaming media software development kit for acquiring the teaching live video.
In some embodiments, storing the teaching live video in a storage medium includes: and storing the teaching live videos acquired by different streaming media software development kits of the video production end into different containers of the storage medium.
In some embodiments, synthesizing a live video picture of a video consumer according to a file operation picture of a specified time and a video frame of a live video for teaching at the specified time, including: performing image processing on video frames of the teaching live video at a designated time to obtain target video frames; and synthesizing a video live broadcast picture of the video consumption end according to the file operation picture and the target video frame at the appointed time, wherein the video live broadcast picture of the video consumption end comprises the file operation picture and the target video frame at the appointed time.
In some embodiments, the image processing includes at least one of special effects addition and feature extraction.
In some embodiments, performing image processing on a video frame of the live video for teaching at a specified time to obtain a target video frame, including: and performing image processing on the video frames of the teaching live video at the appointed time by using a browser to obtain target video frames.
In some embodiments, the format of the teaching live video collected in real time by the video collection device of the video production end is a binary format, and the live broadcast method further includes: converting the teaching live video from a binary format to a YUV format, wherein Y represents a luminance signal, U represents a chrominance signal of a blue channel, and V represents a chrominance signal of a red channel.
In some embodiments, the live method further comprises: for video frames of a teaching live video stored in a storage medium, in the case where time information satisfies a specified condition and is not taken as a video frame of a specified time, the video frame is deleted.
In some embodiments, the live method further comprises: and playing the video live broadcast picture of the video consumption terminal by using the browser.
According to a second aspect of the present disclosure, there is provided a live broadcast apparatus, comprising: the file acquisition module is configured to acquire a teaching file; the video acquisition module is configured to acquire the teaching live video acquired in real time by video acquisition equipment of the video production end through a remote procedure call protocol; the information acquisition module is configured to acquire synchronous operation information of the video production end on the teaching file; a storage module configured to store the teaching live video into a storage medium; a reading module configured to read video frames of the teaching live video at a specified time from the storage medium; the generating module is configured to generate a file operation picture at a specified time according to the teaching file and the synchronous operation information; and the synthesis module is configured to synthesize the video live broadcast picture of the video consumption end according to the file operation picture of the designated time and the video frame of the teaching live broadcast video at the designated time.
According to a third aspect of the present disclosure, there is provided a live broadcast apparatus, comprising: a memory; and a processor coupled to the memory, the processor configured to perform a live method according to any embodiment of the present disclosure based on instructions stored in the memory.
According to a fourth aspect of the present disclosure, there is provided a computer readable storage medium having stored thereon computer program instructions which, when executed by a processor, implement a live broadcast method according to any embodiment of the present disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments of the disclosure and together with the description, serve to explain the principles of the disclosure.
The disclosure may be more clearly understood from the following detailed description taken in conjunction with the accompanying drawings in which:
fig. 1 shows a schematic diagram of a live broadcast method of the related art;
fig. 2 illustrates a schematic diagram of a live method according to some embodiments of the present disclosure;
fig. 3 illustrates a flow chart of a live method according to some embodiments of the present disclosure;
fig. 4 illustrates a block diagram of a live device according to some embodiments of the present disclosure;
fig. 5 illustrates a block diagram of a live device according to further embodiments of the present disclosure;
FIG. 6 illustrates a block diagram of a computer system for implementing some embodiments of the present disclosure.
Detailed Description
Various exemplary embodiments of the present disclosure will now be described in detail with reference to the accompanying drawings. It should be noted that: the relative arrangement of the components and steps, numerical expressions and numerical values set forth in these embodiments do not limit the scope of the present disclosure unless it is specifically stated otherwise.
Meanwhile, it should be understood that the sizes of the respective parts shown in the drawings are not drawn in actual scale for convenience of description.
The following description of at least one exemplary embodiment is merely illustrative in nature and is in no way intended to limit the disclosure, its application, or uses.
Techniques, methods, and apparatus known to one of ordinary skill in the relevant art may not be discussed in detail, but are intended to be part of the specification where appropriate.
In all examples shown and discussed herein, any specific values should be construed as merely illustrative, and not a limitation. Thus, other examples of the exemplary embodiments may have different values.
It should be noted that: like reference numerals and letters denote like items in the following figures, and thus once an item is defined in one figure, no further discussion thereof is necessary in subsequent figures.
Fig. 1 shows a schematic diagram of a live broadcast method of the related art.
As shown in fig. 1, the local video source (i.e., the teaching live video) at the video production end (i.e., the presenter end) is mainly real-time video collected by a camera, and may also include the screen desktop or a separate window of the video production end.
For the video consumption end (i.e., the learner end), the teaching live video is video from the far end. In the related art, at the video production end, the display area of the teaching file and the area of the teaching live video collected by the camera are captured together and composited into one larger image. The captured complete image is then sent to the video consumption end, where the whole image is displayed on the device screen. The large data volume of the captured complete image leads to high transmission pressure, high transmission cost and low transmission speed, and can cause high network delay and stuttering of the live picture at the video consumption end.
In terms of transmission protocols, although RTC (Real-Time Communication) offers better real-time performance, it incurs excessive transmission cost in such single-frame large-data-volume scenarios. To meet the transmission requirement of larger images while limiting transmission cost, the related art adopts the RTMP protocol, which causes high network delay and affects the fluency of the live picture.
In addition, because the complete image is transmitted slowly, the video consumption end can only play each complete image as soon as it is received, and cannot buffer images in advance of playback.
Fig. 2 illustrates a schematic diagram of a live method according to some embodiments of the present disclosure.
As shown in fig. 2, the present disclosure adopts a push-pull stream manner to transmit a live video for teaching.
The video production end transmits the teaching live video (a local video source relative to the video production end) to the video consumption end independently through a streaming media SDK (Software Development Kit). The video consumption end stores the teaching live video (an external video source relative to the video consumption end) in a storage medium, reads it from the storage medium before playing, and composites its video frames with pictures of the teaching file (e.g., courseware). For example, the teaching live video and the courseware may be spliced side by side without overlap, as in fig. 2, or the teaching live video may be superimposed on the courseware to form the live picture.
Throughout the live broadcast, the video consumption end only needs to acquire the courseware at least once; it does not need to continuously acquire and update the courseware display picture of the video production end as the broadcast proceeds. Compared with transmitting the whole live picture, the amount of data transmitted for the teaching live video is small, which lowers transmission pressure, reduces transmission cost and can reduce video stuttering.
In addition, taking advantage of this small data transmission volume, the transmission protocol is changed to the RTC protocol, which further reduces delay at low transmission cost and improves the fluency of the teaching live picture.
The video production end can also display the video collected by the video consumption end, and the method is similar to that of the video consumption end, and is not repeated here.
Fig. 3 illustrates a flow chart of a live method according to some embodiments of the present disclosure.
As shown in fig. 3, the live broadcast method includes steps S1 to S7. In some embodiments, the live broadcast method is performed by the video consumption end.
In step S1, a teaching file is acquired. The teaching file is at least one of courseware, a document, an animation and music. The teaching file is obtained from the video production end, or is uploaded by the video production end to a third-party storage platform and then obtained from the storage platform by the video consumption end. In some embodiments, the teaching file is obtained before the live broadcast begins.
In step S2, a live video for teaching collected in real time by a video collection device at a video production end is obtained through a remote procedure call protocol. Wherein the video capture device is, for example, a camera. The user at the video production end is, for example, a teaching teacher, and the teaching live video is, for example, a teacher image shot by a camera or a real scene of the teaching scene of the teacher.
In step S3, synchronous operation information of the video production end to the teaching file is obtained. The synchronous operation information of the video production end on the teaching file is, for example, a signaling of operations such as page turning, graffiti and the like on courseware, and the operation information is synchronous with the teaching live video. Wherein S3 and S2 may be performed simultaneously, S3 may be performed before S2, or S2 may be performed before S3, which is not limited by the present disclosure.
In step S4, the teaching live video is stored in a storage medium. The teaching live video includes a plurality of video frames, and the storage medium is, for example, a memory. Once the transmission speed is increased, video frames of the teaching live video can arrive faster than the video consumption end plays them, so frames are stored in advance while waiting to be played, which can reduce stuttering.
In some embodiments, storing the teaching live video in a storage medium includes: and storing the teaching live videos acquired by different streaming media SDKs of the video production end into different containers of the storage medium.
The live broadcast process does not rely on a single streaming media SDK. For example, when it is detected during the live broadcast that the currently used streaming media SDK does not meet a specified condition, for example the streaming media SDK is unstable, the system switches to another streaming media SDK that meets the specified condition. At this point, if the teaching live video collected and transmitted by the previously used streaming media SDK is stored in container 1, a new container 2 is created, and the teaching live video collected and transmitted by the switched-to streaming media SDK is stored in container 2.
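As a rough illustration of the container scheme described above (all names here are hypothetical, not taken from the disclosure), a per-SDK container can be modeled as a dictionary of frame buffers:

```python
from collections import defaultdict


class FrameStore:
    """Sketch: one container per streaming media SDK, so that switching
    SDKs mid-broadcast simply starts writing frames into a new container."""

    def __init__(self):
        # sdk_id -> {timestamp_ms: frame_bytes}
        self.containers = defaultdict(dict)

    def store(self, sdk_id, timestamp_ms, frame):
        self.containers[sdk_id][timestamp_ms] = frame

    def read(self, sdk_id, timestamp_ms):
        return self.containers[sdk_id].get(timestamp_ms)


store = FrameStore()
store.store("sdk_1", 21000, b"frame-from-sdk-1")
# The current SDK is judged not to meet the specified condition; after
# switching, frames land in a separate container (container 2).
store.store("sdk_2", 21033, b"frame-from-sdk-2")
assert store.read("sdk_1", 21000) == b"frame-from-sdk-1"
assert store.read("sdk_2", 21033) == b"frame-from-sdk-2"
```

Because each SDK writes only into its own container, a switch never interleaves frames from two SDKs in the same buffer.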
In step S5, a video frame of the teaching live video at a designated time is read from the storage medium. For example, when playback is required, video frames are retrieved from the storage medium for consumption. The designated time is the time at which the video frame to be played was generated; the video frame at the designated time is, for example, the picture at the 21st second of the teaching live video.
In some embodiments, reading video frames of a live teaching video at a specified time from a storage medium includes: and reading the video frames of the teaching live video at the appointed time according to the storage address information of the video frames of the teaching live video at the appointed time, wherein the storage address information of different video frames of the teaching live video is different.
For example, the storage address information is a memory ID (Identity document, identification number). When in storage, the relation between the video frames and the storage medium is marked by using the storage address information, and different video frames correspond to different memory IDs, namely, the memory IDs are uniquely mapped to video frame data corresponding to a certain time point. When reading, the storage position of the video frame to be played can be quickly found according to the memory ID, and the video frame is read.
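A minimal sketch of this memory-ID mapping (class and method names are illustrative assumptions): each stored frame receives a unique ID, and the ID maps back to exactly one frame at one time point:

```python
import itertools


class FrameBuffer:
    """Sketch: unique memory IDs map one-to-one to stored video frames."""

    def __init__(self):
        self._ids = itertools.count(1)  # monotonically increasing memory IDs
        self._by_id = {}                # memory_id -> frame data
        self._id_by_time = {}           # timestamp_ms -> memory_id

    def store(self, timestamp_ms, frame):
        mem_id = next(self._ids)
        self._by_id[mem_id] = frame
        self._id_by_time[timestamp_ms] = mem_id
        return mem_id

    def read_at(self, timestamp_ms):
        # Locate the storage position via the memory ID, then read the frame.
        return self._by_id[self._id_by_time[timestamp_ms]]


buf = FrameBuffer()
buf.store(21000, b"frame-at-21s")
assert buf.read_at(21000) == b"frame-at-21s"
```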
In some embodiments, reading video frames of a live teaching video at a specified time from a storage medium includes: according to the identification of the teaching live video, video frames of the teaching live video, acquired by the designated streaming media software development kit, of the designated video production end are read from a storage medium, wherein the identification of the teaching live video comprises a first identification and a second identification of the teaching live video, the first identification represents the video production end for acquiring the teaching live video, and the second identification represents the streaming media software development kit for acquiring the teaching live video.
The video identification (i.e., video tag) is a tag field inside the browser used to identify the source of a video stream. Through the first identifier, the specified video production end can be found, and through the second identifier, the teaching live video collected by the specified streaming media software development kit can be found. By adding identifiers to videos from different sources, the rendering process (i.e., the video playing process) of videos from different sources can later be isolated, and it can be determined from which streaming media SDK and which video production end a video was transmitted, which facilitates tracking the consumption history of the video, tracing subsequent problems, switching among multi-source video streams, and so on.
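The two-part identification can be sketched as a composite key (a hypothetical illustration, assuming per the description that the first identifier names the video production end and the second names the streaming media SDK):

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class StreamTag:
    """Sketch of the two-part video identification."""
    producer_id: str  # first identifier: the video production end
    sdk_id: str       # second identifier: the streaming media SDK

# Frames from different sources are kept apart, so rendering can be
# isolated per source and consumption history can be traced.
streams = {
    StreamTag("producer-01", "sdk_1"): [b"f0", b"f1"],
    StreamTag("producer-01", "sdk_2"): [b"f2"],
}
assert streams[StreamTag("producer-01", "sdk_2")] == [b"f2"]
```

A frozen dataclass is hashable, so the tag can serve directly as a dictionary key, and switching among multi-source streams is a key lookup.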
In step S6, a file operation screen at a specified time is generated from the teaching file and the synchronization operation information. For example, the video production end generates a courseware display picture at each moment according to the synchronous operation information after receiving the synchronous operation information.
In step S7, according to the file operation picture of the designated time and the video frame of the live video for teaching at the designated time, the live video picture of the video consumer is synthesized. For example, the video consumer terminal splices the file operation picture and the video frame with the same time into a video live broadcast picture, and the live broadcast picture is displayed on a device screen of the video consumer terminal.
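The side-by-side splicing of step S7 can be sketched as follows, representing pictures as lists of pixel rows (a toy illustration; the disclosure does not specify an image representation):

```python
def compose_live_frame(file_picture, video_frame):
    """Splice the file operation picture and the teaching-video frame of the
    same designated time side by side, without overlap. Both inputs are
    lists of pixel rows and must have the same height."""
    if len(file_picture) != len(video_frame):
        raise ValueError("pictures must have the same height to splice")
    return [doc_row + vid_row
            for doc_row, vid_row in zip(file_picture, video_frame)]


courseware = [["D", "D"], ["D", "D"]]  # 2x2 file operation picture
camera = [["V"], ["V"]]                # 2x1 teaching video frame
assert compose_live_frame(courseware, camera) == [["D", "D", "V"],
                                                  ["D", "D", "V"]]
```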
In some embodiments, synthesizing a live video picture of a video consumer according to a file operation picture of a specified time and a video frame of a live video for teaching at the specified time, including: performing image processing on video frames of the teaching live video at a designated time to obtain target video frames; and synthesizing a video live broadcast picture of the video consumption end according to the teaching file operation picture and the target video frame at the appointed time, wherein the video live broadcast picture of the video consumption end comprises the file operation picture and the target video frame at the appointed time.
The teaching live video is independently transmitted, so that the teaching live video can be directly processed simply and efficiently without cutting out the region of the teaching live video from the whole image. And then, combining the processed video frames and the teaching file operation picture into a video live broadcast picture. According to the scheme, the teaching live video can be independently processed according to different requirements of video consumption ends, and the flexibility and interactivity of live broadcast can be improved.
In some embodiments, the image processing includes at least one of special effects addition and feature extraction. For example, special effects such as beauty are added to the targets in the video frames. Through feature extraction, information such as whether a teacher is in a lens can be detected.
In some embodiments, a browser is used to perform image processing on the video frames of the teaching live video at the designated time to obtain target video frames. For example, an upper-layer program modifies the browser kernel and enables the browser's extension capability, so that the browser can perform image processing on the frames of the teaching live video.
In some embodiments, the format of the teaching live video collected in real time by the video collection device of the video production end is a binary format, and the live broadcast method further includes: converting the teaching live video from a binary format to a YUV format, wherein Y represents a luminance signal, U represents a chrominance signal of a blue channel, and V represents a chrominance signal of a red channel.
Returning to fig. 1, it can be seen that in the related art the video production end transmits video in RGB format to the video consumption end, which plays it through a Flash player; the single-frame data volume of an RGB video stream is large, so the transmission load is high. In fig. 3, the video frame data acquired by the present disclosure through the streaming media SDK is binary data. By performing coding-format conversion on this video frame data, the far-end video stream is converted into YUV format for subsequent rendering, which reduces the data volume of the video stream and improves the local transmission efficiency of the data at the video consumption end while preserving picture quality.
In addition, most browsers currently do not support rendering video files in RGB format, and the present disclosure can implement rendering video files with the browser by converting teaching live video into YUV format.
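The colour-space step can be illustrated with the standard full-range BT.601 conversion (one common RGB-to-YUV convention; the disclosure does not specify which coefficients are used):

```python
def rgb_to_yuv(r, g, b):
    """Full-range BT.601 RGB -> YUV: Y is luminance, U the blue-difference
    chrominance, V the red-difference chrominance."""
    y = 0.299 * r + 0.587 * g + 0.114 * b
    u = -0.14713 * r - 0.28886 * g + 0.436 * b
    v = 0.615 * r - 0.51499 * g - 0.10001 * b
    return y, u, v


# White has full luminance and (almost) zero chrominance:
y, u, v = rgb_to_yuv(255, 255, 255)
assert abs(y - 255.0) < 1e-6 and abs(u) < 0.01 and abs(v) < 0.01
```

In practice the data-volume saving over RGB comes mainly from chroma subsampling: formats such as YUV 4:2:0 store U and V at a quarter of the luminance resolution.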
In some embodiments, the live broadcast method further comprises: deleting a video frame of the teaching live video stored in the storage medium when its time information satisfies a specified condition and it is not used as the video frame at a designated time.
For example, the specified condition is that a specified period of time has been exceeded: if a video frame stored in the storage medium times out without being consumed (i.e., without serving as the video frame at a designated time), it is cleared from the storage medium. Under poor network conditions, deleting outdated, unconsumed data allows new real-time data to be rendered and played promptly, prevents playback from falling further behind, and reduces picture stuttering.
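A minimal sketch of this eviction rule (the 2-second threshold is an arbitrary illustrative value, not from the disclosure):

```python
def evict_stale_frames(buffer, now_ms, max_age_ms=2000):
    """Drop any buffered frame that has aged past max_age_ms without being
    consumed, so playback can jump to fresher data instead of lagging."""
    for timestamp_ms in list(buffer):  # copy keys: we mutate while iterating
        if now_ms - timestamp_ms > max_age_ms:
            del buffer[timestamp_ms]
    return buffer


buf = {1000: b"stale, never consumed", 20500: b"fresh"}
evict_stale_frames(buf, now_ms=21000)
assert buf == {20500: b"fresh"}
```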
In some embodiments, the live broadcast method further comprises: playing the live picture of the video consumption end using a browser. For example, the browser kernel is modified to play video from an external source (i.e., the teaching live video) and to support multiple streaming media SDKs, thereby supporting web page (web) scenarios.
The browser kernel reads the appointed teaching live video from the storage medium according to the identification of the teaching live video, then the teaching live video is processed by an HTML5 (HyperText Markup Language ) layer of the browser, the processed teaching live video is sent to a GPU (Graphics Processing Unit, graphic processor) engine, and the teaching live video is rendered by the GPU engine. Compared with the method for playing the live video pictures by using the client, the method for playing the live video pictures by using the browser can improve the playing flexibility and is more suitable for scenes of education and training.
In some embodiments, the live method further comprises: and acquiring live video acquired by video acquisition equipment of other video consumption terminals. According to the file operation picture of the appointed time and the video frame of the teaching live video at the appointed time, synthesizing the video live picture of the video consumption terminal comprises the following steps: and synthesizing the video live broadcast picture of the video consumption end according to the file operation picture of the appointed time, the video frame of the teaching live broadcast video at the appointed time and the live broadcast video acquired by the video acquisition equipment of other video consumption ends. For example, not only the pictures collected by the cameras of the teacher but also the pictures collected by the cameras of other students are displayed on the screen of one student.
According to some embodiments of the present disclosure, a live broadcast method comprises: acquiring a teaching file; acquiring, through a remote procedure call protocol, the teaching live video collected in real time by the video acquisition equipment of the video production end; acquiring synchronous operation information of the video production end on the teaching file; storing the teaching live video into a storage medium; reading a video frame of the teaching live video at a specified time from the storage medium; generating a file operation picture at the specified time according to the teaching file and the synchronous operation information; and synthesizing the video live broadcast picture of the video consumption end according to the file operation picture at the specified time and the video frame of the teaching live video at the specified time. By transmitting the teaching live video and the teaching file separately, in combination with a remote procedure call protocol, transmission cost and network delay are reduced and the fluency of the live picture is improved.
Fig. 4 illustrates a block diagram of a live device according to some embodiments of the present disclosure.
As shown in fig. 4, the live broadcast apparatus 4 includes a file acquisition module 41, a video acquisition module 42, an information acquisition module 43, a storage module 44, a reading module 45, a generation module 46, and a synthesis module 47.
The file acquisition module 41 is configured to acquire a teaching file, for example, to perform step S1 shown in fig. 3.
The video acquisition module 42 is configured to acquire, through a remote procedure call protocol, the teaching live video collected in real time by the video acquisition equipment of the video production end, for example, to perform step S2 shown in fig. 3.
The information obtaining module 43 is configured to obtain synchronous operation information of the video production end on the teaching file, for example, perform step S3 shown in fig. 3.
The storage module 44 is configured to store the teaching live video in a storage medium, for example, to perform step S4 shown in fig. 3.
The reading module 45 is configured to read a video frame of the teaching live video at a specified time from the storage medium, for example, to perform step S5 shown in fig. 3.
The generating module 46 is configured to generate the file operation picture at the specified time according to the teaching file and the synchronous operation information, for example, to perform step S6 shown in fig. 3.
The synthesis module 47 is configured to synthesize the video live broadcast picture of the video consumption end according to the file operation picture at the specified time and the video frame of the teaching live video at the specified time, for example, to perform step S7 shown in fig. 3.
In some embodiments, the reading module is further configured to read the video frame of the teaching live video at the specified time according to storage address information of that video frame, wherein different video frames of the teaching live video have different storage address information.
In some embodiments, the reading module is further configured to read, from the storage medium according to the identification of the teaching live video, video frames of the teaching live video collected at the specified video production end by the specified streaming media software development kit, wherein the identification of the teaching live video comprises a first identification and a second identification, the first identification indicating the video production end that collects the teaching live video and the second identification indicating the streaming media software development kit that collects the teaching live video.
In some embodiments, the storage module is further configured to store the teaching live videos collected by different streaming media software development kits of the video production end into different containers of the storage medium.
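The per-SDK container storage above can be sketched as a store keyed by the two identifications; the `(producer_id, sdk_id)` key mirrors the first and second identifications, while the class, method names, and capacity are illustrative assumptions.

```python
from collections import defaultdict, deque

class FrameStore:
    """Hypothetical storage medium with one container per (producer, SDK) stream."""

    def __init__(self, capacity=64):
        # each stream gets its own bounded container
        self._containers = defaultdict(lambda: deque(maxlen=capacity))

    def store(self, producer_id, sdk_id, frame):
        self._containers[(producer_id, sdk_id)].append(frame)

    def read(self, producer_id, sdk_id):
        """Read the newest frame of the stream identified by both identifications."""
        container = self._containers.get((producer_id, sdk_id))
        return container[-1] if container else None
```

Keeping streams in separate containers means a reader can address exactly one production end and one SDK without scanning or filtering a shared buffer.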
In some embodiments, the synthesis module is further configured to perform image processing on the video frame of the teaching live video at the specified time to obtain a target video frame, and to synthesize the video live broadcast picture of the video consumption end according to the file operation picture at the specified time and the target video frame, wherein the video live broadcast picture of the video consumption end comprises the file operation picture at the specified time and the target video frame.
In some embodiments, the live broadcast apparatus further comprises: the conversion module is configured to convert the teaching live video from a binary format to a YUV format, wherein Y represents a luminance signal, U represents a chrominance signal of a blue channel, and V represents a chrominance signal of a red channel.
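The disclosure does not give the conversion equations, and converting a compressed binary stream to YUV is normally the job of a decoder. Purely as an illustration of what the Y, U, and V components are, the sketch below maps one RGB pixel with the BT.601 full-range equations; treating the input as RGB is an assumption, not the patent's method.

```python
def rgb_to_yuv(r, g, b):
    """BT.601 full-range RGB -> YUV for one pixel (illustrative only).
    Y is the luminance signal; U and V are the blue- and red-channel
    chrominance signals, offset so that gray maps to (U, V) = (128, 128)."""
    y = 0.299 * r + 0.587 * g + 0.114 * b
    u = -0.169 * r - 0.331 * g + 0.500 * b + 128
    v = 0.500 * r - 0.419 * g - 0.081 * b + 128
    return round(y), round(u), round(v)
```

A neutral pixel (equal R, G, B) yields U = V = 128, which is why the chroma planes of a grayscale video are flat in YUV storage.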
In some embodiments, the live broadcast apparatus further comprises: a deletion module configured to delete a video frame of the teaching live video stored in the storage medium when its time information satisfies a specified condition and the frame has not been used as the video frame of a specified time.
In some embodiments, the live broadcast apparatus further comprises: a playing module configured to play the video live broadcast picture of the video consumption end by using a browser.
A live broadcast apparatus according to some embodiments of the present disclosure comprises: a file acquisition module configured to acquire a teaching file; a video acquisition module configured to acquire, through a remote procedure call protocol, the teaching live video collected in real time by the video acquisition equipment of the video production end; an information acquisition module configured to acquire synchronous operation information of the video production end on the teaching file; a storage module configured to store the teaching live video into a storage medium; a reading module configured to read a video frame of the teaching live video at a specified time from the storage medium; a generating module configured to generate a file operation picture at the specified time according to the teaching file and the synchronous operation information; and a synthesis module configured to synthesize the video live broadcast picture of the video consumption end according to the file operation picture at the specified time and the video frame of the teaching live video at the specified time. By transmitting the teaching live video and the teaching file separately, in combination with a remote procedure call protocol, network delay is reduced and the fluency of the live picture is improved.
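The seven modules above can be wired together in a minimal consume-side sketch. Every interface here (module objects exposing `get`, a store with `put`/`read`, callables for rendering and synthesis) is an illustrative assumption about how the modules might interact.

```python
class LiveDevice:
    """Hypothetical assembly of the seven modules for one output frame."""

    def __init__(self, file_src, video_src, info_src, store, renderer, composer):
        self.file_src, self.video_src = file_src, video_src
        self.info_src, self.store = info_src, store
        self.renderer, self.composer = renderer, composer

    def step(self, t):
        teaching_file = self.file_src.get()       # file acquisition module
        frame = self.video_src.get()              # video acquisition module (RPC)
        ops = self.info_src.get()                 # information acquisition module
        self.store.put(t, frame)                  # storage module
        frame_at_t = self.store.read(t)           # reading module
        doc_picture = self.renderer(teaching_file, ops, t)  # generating module
        return self.composer(doc_picture, frame_at_t)       # synthesis module
```

The point of the sketch is the data flow: the video frame travels through the storage medium while the document picture is regenerated locally from the file plus operation info, and the two meet only at the final synthesis step.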
Fig. 5 illustrates a block diagram of a live device according to further embodiments of the present disclosure.
As shown in fig. 5, the live broadcast apparatus 5 includes a memory 51 and a processor 52 coupled to the memory 51. The memory 51 is configured to store instructions for performing the live broadcast method, and the processor 52 is configured to perform the live broadcast method of any of the embodiments of the present disclosure based on the instructions stored in the memory 51.
FIG. 6 illustrates a block diagram of a computer system for implementing some embodiments of the present disclosure.
As shown in FIG. 6, computer system 60 may be in the form of a general purpose computing device. Computer system 60 includes a memory 610, a processor 620, and a bus 600 that connects the various system components.
The memory 610 may include, for example, system memory, non-volatile storage media, and the like. The system memory stores, for example, an operating system, application programs, a boot loader (Boot Loader), and other programs. The system memory may include volatile storage media, such as Random Access Memory (RAM) and/or cache memory. The non-volatile storage medium stores, for example, instructions for performing the live broadcast method in any of the embodiments of the present disclosure. Non-volatile storage media include, but are not limited to, disk storage, optical storage, flash memory, and the like.
The processor 620 may be implemented as discrete hardware components such as a general purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other programmable logic device, discrete gates or transistors, or the like. Accordingly, each of the modules described above, such as the file acquisition module and the synthesis module, may be implemented by a Central Processing Unit (CPU) executing instructions of the corresponding steps in the memory, or may be implemented by a dedicated circuit that performs the corresponding steps.
Bus 600 may employ any of a variety of bus architectures. For example, bus structures include, but are not limited to, an Industry Standard Architecture (ISA) bus, a Micro Channel Architecture (MCA) bus, and a Peripheral Component Interconnect (PCI) bus.
Computer system 60 may also include input-output interface 630, network interface 640, storage interface 650, and the like. These interfaces 630, 640, 650 and the memory 610 and processor 620 may be connected by a bus 600. The input output interface 630 may provide a connection interface for input output devices such as a display, mouse, keyboard, etc. Network interface 640 provides a connection interface for various networking devices. The storage interface 650 provides a connection interface for external storage devices such as a floppy disk, a USB flash disk, an SD card, and the like.
Various aspects of the present disclosure are described herein with reference to flowchart illustrations and/or block diagrams of methods, apparatus and computer program products according to embodiments of the disclosure. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer readable program instructions.
These computer-readable program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable apparatus to produce a machine, such that the instructions, which execute via the processor, create means for implementing the functions specified in the flowchart and/or block diagram block or blocks.
These computer readable program instructions may also be stored in a computer readable memory that can direct a computer to function in a particular manner, such that the instructions stored in the computer readable memory produce an article of manufacture including instructions which implement the function specified in the flowchart and/or block diagram block or blocks.
The present disclosure may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware aspects.
By means of the live broadcast method and apparatus and the computer readable storage medium described above, the fluency of live broadcast pictures is improved.
Up to this point, the live broadcast method and apparatus and the computer readable storage medium according to the present disclosure have been described in detail. In order to avoid obscuring the concepts of the present disclosure, some details known in the art are not described. How to implement the solutions disclosed herein will be fully apparent to those skilled in the art from the above description.

Claims (13)

1. A live broadcast method, comprising:
acquiring a teaching file;
acquiring a teaching live video acquired in real time by video acquisition equipment of a video production end through a remote procedure call protocol;
acquiring synchronous operation information of a video production end on a teaching file;
storing the teaching live video into a storage medium;
reading a video frame of the teaching live video at a specified time from the storage medium;
generating a file operation picture at a specified time according to the teaching file and the synchronous operation information;
and synthesizing the video live broadcast picture of the video consumption end according to the file operation picture at the specified time and the video frame of the teaching live video at the specified time.
2. The live broadcast method of claim 1, wherein reading video frames of the teaching live video at a specified time from the storage medium comprises:
reading the video frame of the teaching live video at the specified time according to storage address information of that video frame, wherein different video frames of the teaching live video have different storage address information.
3. The live broadcast method according to claim 1 or 2, wherein reading video frames of the teaching live video at a specified time from the storage medium comprises:
reading, from the storage medium according to the identification of the teaching live video, video frames of the teaching live video acquired at the designated video production end by the designated streaming media software development kit, wherein the identification of the teaching live video comprises a first identification and a second identification, the first identification representing the video production end that acquires the teaching live video and the second identification representing the streaming media software development kit that acquires the teaching live video.
4. The live broadcast method according to claim 1 or 2, wherein storing the teaching live video in a storage medium comprises:
storing the teaching live videos acquired by different streaming media software development kits of the video production end into different containers of the storage medium.
5. The live broadcast method according to claim 1 or 2, wherein synthesizing the video live broadcast picture of the video consumer according to the file operation picture of the specified time and the video frame of the teaching live broadcast video at the specified time comprises:
performing image processing on the video frame of the teaching live video at the specified time to obtain a target video frame;
and synthesizing the video live broadcast picture of the video consumption end according to the file operation picture at the specified time and the target video frame, wherein the video live broadcast picture of the video consumption end comprises the file operation picture at the specified time and the target video frame.
6. The live broadcast method of claim 5, wherein the image processing of the video frames of the teaching live video at the specified time to obtain the target video frames comprises:
performing image processing on the video frame of the teaching live video at the specified time by using a browser to obtain the target video frame.
7. The live broadcast method of claim 5, wherein the image processing includes at least one of special effects addition and feature extraction.
8. The live broadcast method according to claim 1 or 2, wherein the format of the teaching live video acquired in real time by the video acquisition device of the video production end is a binary format, and the live broadcast method further comprises:
converting the teaching live video from a binary format to a YUV format, wherein Y represents a luminance signal, U represents a chrominance signal of a blue channel, and V represents a chrominance signal of a red channel.
9. The live broadcast method according to claim 1 or 2, further comprising:
deleting a video frame of the teaching live video stored in the storage medium when its time information satisfies a specified condition and the frame has not been used as the video frame of a specified time.
10. The live broadcast method according to claim 1 or 2, further comprising:
playing the video live broadcast picture of the video consumption end by using a browser.
11. A live broadcast apparatus, comprising:
the file acquisition module is configured to acquire a teaching file;
the video acquisition module is configured to acquire the teaching live video acquired in real time by video acquisition equipment of the video production end through a remote procedure call protocol;
the information acquisition module is configured to acquire synchronous operation information of the video production end on the teaching file;
a storage module configured to store the teaching live video into a storage medium;
a reading module configured to read video frames of the teaching live video at a specified time from the storage medium;
the generating module is configured to generate a file operation picture at a specified time according to the teaching file and the synchronous operation information;
and a synthesis module configured to synthesize the video live broadcast picture of the video consumption end according to the file operation picture at the specified time and the video frame of the teaching live video at the specified time.
12. A live broadcast apparatus, comprising:
a memory; and
a processor coupled to the memory, the processor configured to perform the live broadcast method of any of claims 1 to 10 based on instructions stored in the memory.
13. A computer readable storage medium having stored thereon computer program instructions which, when executed by a processor, implement the live broadcast method according to any of claims 1 to 10.
CN202310682884.2A 2023-06-09 2023-06-09 Live broadcast method and device and computer readable storage medium Pending CN116634188A (en)


Publications (1)

Publication number CN116634188A, published 2023-08-22



Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117793449A (en) * 2024-02-23 2024-03-29 北京都是科技有限公司 Video live broadcast and video processing method, device and storage medium
CN117793449B (en) * 2024-02-23 2024-04-30 北京都是科技有限公司 Video live broadcast and video processing method, device and storage medium


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination