CN117793449A - Video live broadcast and video processing method, device and storage medium

Info

Publication number
CN117793449A
Authority
CN
China
Prior art keywords
video
live
image data
operating system
device file
Prior art date
Legal status
Granted
Application number
CN202410200062.0A
Other languages
Chinese (zh)
Other versions
CN117793449B (en)
Inventor
张焱
张华宾
邸文华
李娟
Current Assignee
Beijing Dushi Technology Co ltd
Original Assignee
Beijing Dushi Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Beijing Dushi Technology Co ltd filed Critical Beijing Dushi Technology Co ltd
Priority to CN202410435100.0A (published as CN118118731A)
Priority to CN202410200062.0A (granted as CN117793449B)
Publication of CN117793449A
Application granted
Publication of CN117793449B
Status: Active

Classifications

    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 - Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20 - Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/21 - Server components or server architectures
    • H04N21/218 - Source of audio or video content, e.g. local disk arrays
    • H04N21/2187 - Live feed
    • H04N21/40 - Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43 - Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/437 - Interfacing the upstream path of the transmission network, e.g. for transmitting client requests to a VOD server
    • H04N21/443 - OS processes, e.g. booting an STB, implementing a Java virtual machine in an STB or power management in an STB


Abstract

The application discloses a video live broadcast method, a video processing method, a device, and a storage medium for a live broadcast device, including: generating a push video to be pushed to a live platform; writing first video image data of the push video into a first device file deployed in the operating system, where the first device file corresponds to a first camera associated with a live application and the live application corresponds to the live platform; and reading, with the live application via the operating system, the first video image data from the first device file and transmitting the first video image data to the live platform. This achieves the technical effect of ensuring that the anchor can go live on the desired live platform.

Description

Video live broadcast and video processing method, device and storage medium
Technical Field
The present disclosure relates to the field of live video technologies, and in particular, to a live video and video processing method, apparatus, and storage medium.
Background
Currently, with the progress of network communication technology, network live broadcast is increasingly developed and applied. Live broadcast devices have been proposed that generate a push video for a live platform based on video images from different video sources.
The published patent application No. 202310876829.7, entitled "Live control method, apparatus, system and storage medium", discloses a live broadcast device that can receive video images from a plurality of video sources (such as cameras, computers, and mobile phones), generate a push video based on the received video images, and send the push video to a live platform (for example, the Douyin live platform, a webcast platform, or the Huya live platform). Furthermore, although not disclosed in that application, the Android system can be used as the operating system of such a live broadcast device.
In practice, users of live broadcast devices, i.e., anchors, are typically deeply tied to particular live platforms. It is therefore inevitable that an anchor, while using a live broadcast device, wishes to go live with a desired live application (where the live application corresponds to a live platform) in order to push video to the desired live platform. However, because existing live applications (such as the Douyin application and the Kuaishou application) are developed for mobile devices such as mobile phones, a live application by default captures video images from the front-facing or rear-facing camera of the mobile device.
In this case, even if a live application (e.g., the Douyin application, the Kuaishou application, etc.) corresponding to a live platform is installed on the live broadcast device, the live application cannot push the push video generated by the live broadcast device to the corresponding live platform. As a result, the anchor cannot send the push video generated by the live broadcast device to the desired live platform through the desired live application, and thus cannot use the live broadcast device to go live on the desired live platform.
Publication No. CN117499695A, entitled "Video live broadcast method, apparatus, electronic device and medium", discloses a method comprising: receiving a text file; generating material text from the text file, where the material text includes at least one piece of speech content; generating a corresponding live video clip based on the text content corresponding to each piece of speech content; generating a video to be live broadcast from the live video clips; and playing the video to be live broadcast.
Publication No. CN117440175A, entitled "Method, apparatus, system, device and medium for video transmission", discloses a method performed by a mobile edge computing node. The method includes receiving a panoramic video stream acquired in real time, receiving pose information of a terminal device, generating a video stream for the terminal device based on the pose information and the panoramic video stream, the video stream including the portion of the panoramic video stream corresponding to the pose information, and transmitting the video stream to the terminal device.
For the technical problem in the prior art that an anchor cannot send the push video generated by a live broadcast device to the corresponding live platform through the desired live application, no effective solution has yet been proposed.
Disclosure of Invention
The embodiments of the present disclosure provide a video live broadcast method, a video processing method, a device, and a storage medium, which at least solve the technical problem in the prior art that an anchor cannot send the push video generated by a live broadcast device to the corresponding live platform through a desired live application.
According to an aspect of the embodiments of the present disclosure, there is provided a video live method for a live device, including: generating a push video for pushing to the live platform; writing first video image data of the push video into a first device file deployed in an operating system, wherein the first device file corresponds to a first camera associated with a live broadcast application program, and the live broadcast application program corresponds to a live broadcast platform; and reading the first video image data from the first device file via the operating system with the live application and transmitting the first video image data to the live platform.
According to another aspect of the embodiments of the present disclosure, there is also provided a video processing method for a live broadcast device, including: generating a push video for pushing to the live platform; and writing first video image data of the push video into a first device file configured by the operating system, wherein the first device file corresponds to a first camera associated with the live application and the live application corresponds to the live platform.
According to another aspect of the embodiments of the present disclosure, there is also provided a storage medium including a stored program, wherein the method of any one of the above is performed by a processor when the program is run.
According to another aspect of the embodiments of the present disclosure, there is also provided a live video apparatus, including: the first push video generation module is used for generating push videos for pushing to the live broadcast platform; the first data writing module is used for writing first video image data of the push video into a first device file deployed in the operating system, wherein the first device file corresponds to a first camera associated with a live broadcast application program, and the live broadcast application program corresponds to a live broadcast platform; and the first data reading module is used for reading the first video image data from the first device file by using the live broadcast application program through the operating system and transmitting the first video image data to the live broadcast platform.
According to another aspect of the embodiments of the present disclosure, there is also provided a video processing apparatus including: the second push video generation module is used for generating push videos for pushing to the live broadcast platform; and a second data writing module for writing first video image data of the push video into a first device file configured by the operating system, wherein the first device file corresponds to a first camera associated with the live application, and the live application corresponds to the live platform.
According to another aspect of the embodiments of the present disclosure, there is also provided a live video apparatus, including: a first processor; and a first memory, coupled to the first processor, for providing instructions to the first processor to process the steps of: generating a push video for pushing to the live platform; writing first video image data of the push video into a first device file deployed in an operating system, wherein the first device file corresponds to a first camera associated with a live broadcast application program, and the live broadcast application program corresponds to a live broadcast platform; and reading the first video image data from the first device file via the operating system with the live application and transmitting the first video image data to the live platform.
According to another aspect of the embodiments of the present disclosure, there is also provided a video processing apparatus including: a second processor; and a second memory, coupled to the second processor, for providing instructions to the second processor to process the steps of: generating a push video for pushing to the live platform; and writing first video image data of the push video into a first device file configured by the operating system, wherein the first device file corresponds to a first camera associated with the live application and the live application corresponds to the live platform.
The application provides a video live broadcast method which is applied to live broadcast equipment. First, the video processing application generates push video for pushing to the live platform. The video processing application then writes the first video image data of the push video to a first device file disposed on the operating system. Finally, the live application reads the first video image data from the first device file via the operating system and transmits the first video image data to the live platform.
Because the first device file corresponds to the first camera associated with the live application, and the live application corresponds to the live platform, after the operating system receives a request to acquire data from the first camera, the operating system directly reads the corresponding data from the first device file and sends the read data to the live application.
Further, since the first video image data of the push video is written into the first device file, the data read from the first device file by the operating system is the first video image data of the push video. Thus, after the operating system sends the first video image data to the live application, the live application transmits the first video image data to the live platform. Therefore, the audience user can watch the live video corresponding to the push video from the corresponding live platform through the terminal equipment.
Therefore, the technical effect of ensuring that the anchor can go live on the desired live platform is achieved, which in turn solves the technical problem in the prior art that an anchor cannot send the push video generated by a live broadcast device to the corresponding live platform through the desired live application.
Drawings
The accompanying drawings, which are included to provide a further understanding of the disclosure and are incorporated in and constitute a part of this application, illustrate embodiments of the disclosure and together with the description serve to explain the disclosure and do not constitute an undue limitation on the disclosure. In the drawings:
fig. 1 is a schematic diagram of a video live system and a plurality of live platforms according to embodiment 1 of the present application;
Fig. 2 is a schematic diagram of hardware modules of a video live system and a plurality of live platforms according to embodiment 1 of the present application;
fig. 3 is a schematic diagram of a hierarchical relationship structure of a live video system according to embodiment 1 of the present application;
fig. 4 is a flowchart of a live video method according to embodiment 1 of the present application;
fig. 5 is a flowchart of a video processing method according to embodiment 1 of the present application;
fig. 6 is a schematic diagram of a live video apparatus according to embodiment 2 of the present application;
fig. 7 is a schematic view of a video processing apparatus according to embodiment 2 of the present application;
fig. 8 is a schematic diagram of a live video apparatus according to embodiment 3 of the present application; and
fig. 9 is a schematic diagram of a video processing apparatus according to embodiment 3 of the present application.
Detailed Description
In order to better understand the technical solutions of the present disclosure, the following clearly and completely describes the technical solutions of the embodiments of the present disclosure with reference to the accompanying drawings. It is apparent that the described embodiments are merely some, rather than all, of the embodiments of the present disclosure. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present disclosure without creative effort shall fall within the protection scope of the present disclosure.
It should be noted that the terms "first," "second," and the like in the description and claims of the present disclosure and in the foregoing figures are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged where appropriate such that the embodiments of the disclosure described herein may be capable of operation in sequences other than those illustrated or described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
First, some of the terms or terminology appearing in the description of the embodiments of the present disclosure are explained as follows:
pushing video: the push video of the invention refers to a video which is generated by live broadcast equipment and sent to a live broadcast platform so as to be subjected to live broadcast.
Live video: the live video of the invention refers to a video corresponding to a push video, which is sent to terminal equipment of each audience user by a live platform.
Live broadcast device: the live broadcast device in the present invention is a device with a live broadcast function, where the live broadcast function is the function of sending a generated push video to a live platform. Thus, a live broadcast device also includes a computer, a mobile phone, a tablet computer, or the like on which a live application is installed.
Example 1
According to the present embodiment, a method embodiment of live video and video processing is provided, it being noted that the steps shown in the flowchart of the figures may be performed in a computer system such as a set of computer executable instructions, and although a logical order is shown in the flowchart, in some cases the steps shown or described may be performed in an order different from that shown or described herein.
Fig. 1 is a schematic diagram of a video live system and a plurality of live platforms according to embodiment 1 of the present application. Referring to fig. 1, a video live broadcast system includes: a live device 100 and a plurality of different types of video sources 201-206.
Referring to fig. 1, the video sources 201 to 204 are connected to the live broadcast device 100, for example, through corresponding video source input interfaces, and the video sources 205 to 206 may also be communicatively connected to the live broadcast device 100 (where the networks used to communicatively connect the video sources 205 to 206 to the live broadcast device 100 include, but are not limited to, mobile communication networks, wireless networks, and the like). The video sources 201 to 206 may include, for example, the camera 201, the camera 202, the computer 203, the computer 204, the mobile phone 205, and the mobile phone 206. It should be noted that, as will be apparent to those skilled in the art, the above is merely illustrative of the types of video sources and of how the video sources are connected to the live broadcast device 100, and the practical situation is not limited thereto.
In addition, referring to fig. 1, the system further includes a plurality of live broadcast platforms 301 to 30n. After the processor in the live broadcast device 100 generates the push video, the processor transmits the first video image data corresponding to the push video to the live broadcast platforms 301 to 30n. Therefore, audience users can watch the live video corresponding to the push video on the corresponding live platform 301 to 30n through their terminal devices. The live platform can be, for example, the Douyin live platform, the Kuaishou live platform, the Huya live platform, or the like.
It is noted that although not shown in fig. 1, the video source may be, for example, a video image stored locally in the live device 100 or a network video image accessed by a processor in the live device 100 through a link address, etc., and is not particularly limited herein.
Fig. 2 is a schematic diagram of hardware modules of the video live system and the plurality of live platforms according to embodiment 1 of the present application. Referring to fig. 2, the camera 201 is connected to the processor 110 in the live broadcast device 100 through the deserializer 111; the camera 202 is connected to the processor 110 in the live broadcast device 100 through the USB interface 112; the computer 203 is connected to the processor 110 in the live broadcast device 100 through the HDMI interface 113; the computer 204 is connected to the processor 110 in the live broadcast device 100 through the HDMI interface 114; the mobile phone 205 is communicatively connected to the processor 110 in the live broadcast device 100 via the mobile communication circuit 115; and the mobile phone 206 is communicatively connected to the processor 110 in the live broadcast device 100 via the network communication circuit 116.
Further, referring to fig. 2, the processor 110 in the live broadcast device 100 is communicatively connected to the live broadcast platforms 301 to 30n through the network communication circuit 116, so that the first video image data corresponding to the push video can be sent to the live broadcast platforms 301 to 30n. Although not shown in fig. 2, the processor 110 may be communicatively connected to the live broadcast platforms 301 to 30n through the mobile communication circuit 115, for example.
It should be noted that the processor 110 may send the first video image data to one live platform, or may send the first video image data to a plurality of different live platforms respectively. The present invention is not particularly limited herein.
Fig. 3 is a schematic diagram of the hierarchical relationship structure of the video live system according to embodiment 1 of the present application. Referring to fig. 3, the live broadcast device 100 is deployed with an operating system and a hardware layer below the operating system; in this embodiment, Android is taken as an example of the operating system. Referring to fig. 3, the operating system includes a driver layer, a HAL layer, a camera service layer, a camera framework layer, and an application layer.
A video processing application is deployed at the application layer. The video processing application is used to generate the push video during live broadcast. In addition, a plurality of live applications 1 to n are deployed at the application layer. The live applications 1 to n are used to transmit the push video to their corresponding live platforms. The live applications 1 to n may be, for example, live applications developed by third parties based on the Android system (e.g., the Douyin application, the Kuaishou application, etc.).
Further, the driver layer of the operating system is deployed with a first device file (for example, the first device file may be /dev/video0 under the /dev directory of the operating system). At the HAL layer, a corresponding function can be used to designate the first device file as the device file corresponding to the first camera associated with the live applications 1 to n. For example, if the first camera associated with the live applications 1 to n is the front-facing camera of a mobile device such as a mobile phone, the first device file can be designated at the HAL layer as the device file corresponding to that front-facing camera. Or, if the first camera associated with the live applications 1 to n is the rear-facing camera of a mobile device such as a mobile phone, the first device file can be designated at the HAL layer as the device file corresponding to that rear-facing camera. Thus, when the video processing application writes video image data into the first device file, the live applications 1 to n can receive the video image data read from the first device file by the operating system.
The operating system may also create corresponding second device files at the driver layer based on the types of the video source input interfaces at the hardware layer. For example, the hardware layer includes the deserializer 111, the USB interface 112, the network communication circuit 116, and the HDMI interface 113. For the deserializer 111, regardless of whether the camera 201 is connected to it, the operating system generates a corresponding second device file, which may be, for example, /dev/video1 under the /dev directory of the operating system. For the HDMI interface 113, regardless of whether the computer 203 is connected to it, the operating system generates a corresponding second device file, which may be, for example, /dev/video2. For the USB interface 112, the operating system generates a corresponding second device file only when the camera 202 is connected to it; this second device file may be, for example, /dev/video3. For the network communication circuit 116, the operating system does not generate a corresponding second device file.
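As a concrete illustration of the node layout just described, the short sketch below probes the example /dev/video* paths named above and reports which ones are present (for instance, /dev/video3 only appears once a USB camera is attached). The probing itself, and the use of the standard V4L2 VIDIOC_QUERYCAP ioctl, are illustrative assumptions and are not prescribed by the present application.

    // Sketch: check which of the example device nodes exist and what driver
    // they report. Paths follow the example above; this is only one possible
    // way of inspecting them.
    #include <fcntl.h>
    #include <unistd.h>
    #include <sys/ioctl.h>
    #include <linux/videodev2.h>
    #include <cstdio>

    int main() {
        const char* nodes[] = {"/dev/video0", "/dev/video1", "/dev/video2", "/dev/video3"};
        for (const char* node : nodes) {
            int fd = open(node, O_RDWR);
            if (fd < 0) {
                std::printf("%s: not present\n", node);   // e.g. no USB camera attached yet
                continue;
            }
            v4l2_capability cap{};
            if (ioctl(fd, VIDIOC_QUERYCAP, &cap) == 0)
                std::printf("%s: driver=%s card=%s\n", node, cap.driver, cap.card);
            close(fd);
        }
        return 0;
    }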
Furthermore, it should be noted that, although Android is described in this embodiment as an example of the operating system, it should be clear to those skilled in the art that other types of operating systems also fall within the scope of the present application.
In the above-described operating environment, according to a first aspect of the present embodiment, there is provided a live video method implemented by the processor 110 shown in fig. 2. Fig. 4 shows a schematic flow chart of the method, and referring to fig. 4, the method includes:
s402: generating a push video for pushing to the live platform;
s404: writing first video image data of the push video into a first device file deployed in an operating system, wherein the first device file corresponds to a first camera associated with a live broadcast application program, and the live broadcast application program corresponds to a live broadcast platform; and
s406: the first video image data is read from the first device file via the operating system with the live application and transmitted to the live platform.
Specifically, first, when the anchor goes live, the anchor uses the live broadcast device 100 to generate a push video for pushing to the live platform (S402).
Further, the live broadcast device 100 writes the generated first video image data of the push video into the first device file deployed in the operating system (S404). As described above, the first device file may be /dev/video0 under the /dev directory of the operating system, and the live broadcast device 100 writes the generated first video image data of the push video into this device file. In addition, as described above, in this embodiment the first device file may be designated in advance, using a corresponding function at the HAL layer, as the device file corresponding to the first camera associated with the live applications 1 to n. Specifically, in this embodiment a live application (for example, the Douyin application, the Kuaishou application, and the like) is a live application developed by a third party for mobile devices such as mobile phones, and by default acquires video data from the front-facing or rear-facing camera of the mobile device; therefore, in this embodiment the first camera corresponds to the camera used by the live application by default, i.e., the front-facing or rear-facing camera. Further, in this embodiment, the first device file may be designated in advance at the HAL layer as the device file corresponding to the front-facing or rear-facing camera, so that the live application can read the video data from the first device file through the operating system.
Finally, the live application reads the first video image data from the first device file via the operating system and transmits the first video image data to the live platform (S406). Referring to the above, since the first device file in the present embodiment corresponds to the first camera associated with the live application, the live application may acquire, through the operating system, first video image data corresponding to the push video from the first device file, and send the first video image data as the push video to the live platform.
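The present application does not spell out the mechanism by which the first video image data is written into /dev/video0. One common way to realise a writable camera node on a Linux/Android driver layer is a loopback-style V4L2 output device; the sketch below writes a single frame under that assumption, with the 1280x720 YUYV format and the empty frame buffer being illustrative placeholders for the composited push-video frame. In a running system the write would repeat at the stream's frame rate.

    // Sketch: push one generated frame of the push video into the first device
    // file (/dev/video0). Assumes a loopback-style V4L2 output device; the
    // resolution, pixel format and blank frame buffer are illustrative.
    #include <fcntl.h>
    #include <unistd.h>
    #include <sys/ioctl.h>
    #include <linux/videodev2.h>
    #include <vector>

    int main() {
        int fd = open("/dev/video0", O_WRONLY);
        if (fd < 0) return 1;

        v4l2_format fmt{};
        fmt.type = V4L2_BUF_TYPE_VIDEO_OUTPUT;
        fmt.fmt.pix.width = 1280;
        fmt.fmt.pix.height = 720;
        fmt.fmt.pix.pixelformat = V4L2_PIX_FMT_YUYV;   // 2 bytes per pixel
        fmt.fmt.pix.field = V4L2_FIELD_NONE;
        fmt.fmt.pix.sizeimage = 1280 * 720 * 2;
        if (ioctl(fd, VIDIOC_S_FMT, &fmt) < 0) { close(fd); return 1; }

        // In the real device the video processing application would fill this
        // buffer with the composited push-video frame; here it is left blank.
        std::vector<unsigned char> frame(fmt.fmt.pix.sizeimage, 0);
        write(fd, frame.data(), frame.size());          // one frame per write()

        close(fd);
        return 0;
    }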
Based on the foregoing background, existing live applications (e.g., the Douyin application, the Kuaishou application, etc.) are developed for mobile devices such as mobile phones, and a live application by default acquires video image data from the front-facing or rear-facing camera of the mobile device. For example, when the anchor needs to go live, the live application responds to a click operation by the anchor and sends a request to the operating system of the mobile device to acquire video images from the front-facing camera. The operating system then responds to the request and reads the video image data captured by the front-facing camera from the device file corresponding to that camera. The operating system then sends the read video image data to the live application, so that the live application can generate a push video based on the video image data captured by the front-facing camera and transmit it to the live platform.
Because existing live applications are developed for mobile devices, i.e., a live application by default acquires video image data from the front-facing or rear-facing camera of the mobile device, even if a live application (for example, the Douyin application, the Kuaishou application, etc.) corresponding to a live platform is installed on the live broadcast device, the live application cannot push the push video generated by the live broadcast device to the live platform desired by the anchor.
Thus, in order to solve the above-described problem, the present embodiment deploys the first device file in the operating system of the live broadcast device. The first device file corresponds to the first camera associated with the live application, where the first camera may be, for example, the camera used by default by the live application (e.g., the first camera corresponds to the front-facing or rear-facing camera of a mobile device). Thus, the live application can acquire the first video image data corresponding to the push video from the first device file through the operating system and send the first video image data to the live platform as the push video.
As can be seen from the above description, since the first device file corresponds to the first camera associated with the live application, in any case, as long as the operating system receives the request for acquiring video image data from the first camera sent by the live application, the operating system will read the corresponding video image data from the first device file deployed in advance.
Thus, in the live broadcast process, the live broadcast device may, for example, write the first video image data of the push video into the first device file. The live application can then acquire the first video image data corresponding to the push video from the first device file through the operating system and send it to the live platform as the push video. In this way, the anchor can send the push video generated by the live broadcast device to the corresponding live platform through the desired live application, and audience users can watch the live video corresponding to the push video on the corresponding live platform through their terminal devices.
Therefore, the technical effect of ensuring that the anchor can use the live broadcast device to go live on the desired live platform is achieved. This in turn solves the technical problem in the prior art that an anchor cannot send the push video generated by a live broadcast device to the corresponding live platform through the desired live application.
Optionally, the operation of reading the first video image data from the first device file via the operating system with the live application includes: transmitting, with the live application, a request to the operating system to acquire data from the first camera; and receiving, with the live application, the first video image data that the operating system has read from the first device file.
Specifically, referring to fig. 3, when the anchor uses the live broadcast device 100 to go live on the live platforms corresponding to the live applications 1 to n, the live applications 1 to n on the live broadcast device 100 each send to the operating system a request to acquire video image data from their associated first camera (for example, the front-facing or rear-facing camera of a mobile device). The operating system then responds to the data acquisition requests sent by the live applications 1 to n, reads the first video image data from the first device file, and sends the read first video image data to each of the live applications 1 to n.
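From the live application's side, the designated camera is just an ordinary device camera, so the application keeps using the standard camera API it was built on. As an illustration only (this is not code from the present application), the sketch below uses the Android NDK camera2 API to list the cameras the system exposes and read the lens-facing attribute that the HAL designation described herein makes the virtual camera report.

    // Sketch: how a camera-based application sees the designated "first camera".
    // Uses the Android NDK camera2 API; error handling is omitted for brevity.
    #include <camera/NdkCameraManager.h>
    #include <camera/NdkCameraMetadata.h>
    #include <cstdio>

    void listCameras() {
        ACameraManager* mgr = ACameraManager_create();
        ACameraIdList* ids = nullptr;
        ACameraManager_getCameraIdList(mgr, &ids);
        for (int i = 0; i < ids->numCameras; ++i) {
            ACameraMetadata* chars = nullptr;
            ACameraManager_getCameraCharacteristics(mgr, ids->cameraIds[i], &chars);
            ACameraMetadata_const_entry facing{};
            ACameraMetadata_getConstEntry(chars, ACAMERA_LENS_FACING, &facing);
            // A camera whose HAL entry is backed by the first device file reports
            // FRONT or BACK here, so the live application selects it by default.
            std::printf("camera %s facing=%d\n", ids->cameraIds[i],
                        facing.count ? facing.data.u8[0] : -1);
            ACameraMetadata_free(chars);
        }
        ACameraManager_deleteCameraIdList(ids);
        ACameraManager_delete(mgr);
    }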
Therefore, in the technical solution of the present application, the push video can be pushed to the corresponding live platform, via the operating system, by any live application supported by the live broadcast device. For the anchor, it is therefore only necessary to install the desired live application on the operating system of the live broadcast device in order to go live on the corresponding live platform, which is convenient for the anchor to use.
Optionally, the operation of generating the push video for pushing to the live platform includes: generating a push video using a video processing application; and writing first video image data of the push video into a first device file disposed on the operating system, comprising: first video image data of the push video is written to a first device file using a video processing application.
Specifically, referring to fig. 3, during live broadcast the video processing application generates the push video for pushing to the live platform. The video processing application then writes the first video image data of the push video into the first device file deployed at the driver layer of the operating system (for example, /dev/video0 under the /dev directory of the operating system). Thus, once the video processing application has written the first video image data of the push video into the first device file, the live applications 1 to n can read the first video image data from the first device file.
In the prior art, an anchor usually generates the push video using a live application installed on a mobile device. Since live applications and live platforms correspond one to one (for example, the Douyin application corresponds to the Douyin platform, and the Kuaishou application corresponds to the Kuaishou platform), an anchor who wants to go live on several live platforms at the same time must use several devices to broadcast to the different live platforms simultaneously.
In the technical solution of the present application, during live broadcast a push video can be generated by the video processing application installed on the live broadcast device, and the first video image data of the push video is written into the first device file. The live applications 1 to n can then each acquire the first video image data from the first device file through the operating system and transmit it, as the push video, to their respective corresponding live platforms.
Therefore, even if an anchor needs to go live on several live platforms at the same time, the anchor does not need several live applications installed on different devices broadcasting separately; it is enough to install on the live broadcast device the video processing application and the live applications corresponding to the desired live platforms in order to go live on the different live platforms simultaneously, which reduces cost and makes broadcasting more convenient for the anchor.
Optionally, the operation of generating the push video for pushing to the live platform includes: obtaining second video image data from at least one video source, wherein the at least one video source comprises a different video source than the first camera associated with the live application; and generating a push video based on the second video image data.
In particular, referring to fig. 3, first, a video processing application obtains second video image data from at least one video source. Wherein the at least one video source comprises a different video source than the first camera associated with the live application. And further, the second video image data may be, for example, video image data obtained by the live device 100 from the at least one video source and not yet further processed by the video processing application.
The at least one video source may be, for example, a camera, a computer or a mobile phone, which are externally connected to the live broadcast device 100, a video image stored locally in the live broadcast device 100, or a network video image accessed by the live broadcast device 100 through a link address. In summary, the at least one video source is a different video source relative to the first camera associated with the live application, and is not particularly limited herein.
Then, when the video processing application has obtained the second video image data from the at least one video source, it generates the push video for pushing to the live platform based on the second video image data.
Unlike the existing situation, in which an anchor can only generate push video using a live application installed on a mobile phone, tablet computer, or computer, in the technical solution of the present application at least one video source, including a mobile phone, a tablet computer, or a computer, can be connected to the live broadcast device, so that the video processing application can receive the second video image data transmitted by the at least one video source and generate the push video based on it.
Therefore, in the technical scheme of the application, compared with the push video generated by directly utilizing the live broadcast application program installed on the mobile phone, the tablet personal computer or the computer, the push video pushed to the live broadcast platform contains richer image information, and further the technical effect of attracting audience users to watch live broadcast can be achieved.
Optionally, the at least one video source comprises a second camera different from the first camera, and the operation of acquiring second video image data from the at least one video source comprises: and reading second video image data from a second device file corresponding to the second camera by using the operating system.
Specifically, referring to fig. 3, first, the operating system automatically creates a corresponding second device file at the driving layer according to the type of the video source input interface of the hardware layer, and writes the second video image data into the corresponding second device file.
When the video source input interface is a deserializer and/or an HDMI interface, the operating system generates the corresponding second device file in advance, regardless of whether a video source is connected to the deserializer and/or the HDMI interface. When a video source is connected to the deserializer and/or the HDMI interface, the operating system writes the received second video image data directly into the corresponding second device file.
For example, when the video source input interface is the deserializer 111 and the camera 201 is connected to the deserializer 111 of the live broadcast device 100, the operating system writes the received second video image data directly into the corresponding second device file 1.
For another example, in a case where the video source input interface is the HDMI interface 113 and the HDMI interface 113 of the live device 100 is connected to the computer 203, the operating system directly writes the received second video image data into the corresponding second device file 2.
When the video source input interface is the USB interface 112, the operating system generates the second device file 3 dynamically. That is, when the camera 202 is connected to the USB interface 112, the operating system generates the corresponding second device file 3; when the camera 202 is not connected to the USB interface 112, the operating system does not generate the corresponding second device file 3.
Further, it should be noted that, in the case where the live broadcast device 100 receives the second video image data corresponding to the mobile phone 205 through the mobile communication circuit 115 and/or receives the second video image data corresponding to the mobile phone 206 through the network communication circuit 116, the operating system directly writes the second video image data into the memory, and the generation of the second device file is not required.
If the live device 100 does not have an external video source, and the second video image data transmitted to the video processing application is a video image stored locally by the live device 100 or a network video image accessed by the live device 100 through the link address, the operating system does not need to deploy a second device file in the driver layer. The operating system directly sends the second video image data of the locally stored video image or the second video image data of the network video image accessed through the link address to the video processing application program of the application program layer.
Furthermore, it should be noted that there may be one or a plurality of second device files.
For example, in the case where there is one deserializer 111 or HDMI interface 113 in the live device 100, the operating system generates one corresponding second device file 1 or second device file 2 in advance. Then, the operating system writes the second video image data received through the deserializer 111 or the HDMI interface 113 to the corresponding second device file 1 or second device file 2.
For another example, in the case where the deserializer 111 and the HDMI interface 113 are present in the live device 100, the operating system generates the corresponding second device file 1 and second device file 2, respectively, in advance. Then, the operating system writes the second video image data received through the deserializer 111 and the HDMI interface 113 to the corresponding second device file 1 and second device file 2.
For another example, in a case where the deserializer 111 and/or the HDMI interface 113 and the USB interface 112 are present in the live device 100, the operating system generates the second device file 1 and/or the second device file 2 corresponding to the deserializer 111 and/or the HDMI interface 113 in advance. Then, in the case where the second video image data transmitted by the corresponding video source is received through the deserializer 111 and/or the HDMI interface 113, the second video image data is written into the second device file 1 and/or the second device file 2 which are generated in advance. In the case that the USB interface 112 is connected to the video source, the operating system dynamically generates the corresponding second device file 3, and writes the second video image data received through the USB interface 112 into the corresponding second device file 3.
The operating system then reads the second video image data from the second device file corresponding to the second camera and sends it to the video processing application, so that the video processing application performs operations such as image fusion and/or image processing on the second video image data to generate the push video for pushing to the live platform.
In the technical solution of the present application, the second device file is deployed at the driver layer, so that the operating system can write the second video image data of at least one video source externally connected to the live broadcast device into the second device file, read it back from the second device file, and send it to the video processing application. Therefore, when the anchor needs to go live using an external video source, the video processing application is guaranteed to obtain the second video image data of that external video source, and the anchor is guaranteed to be able to broadcast normally. A sketch of this read/compose/write pipeline is given below.
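The device paths in the sketch follow the examples above; the YUYV format, the fixed 1280x720 size, the use of plain read()/write() on the nodes (format negotiation, as in the earlier sketch, is omitted), and the crop-style "fusion" are all illustrative assumptions, since the present application leaves the actual image fusion and I/O details open.

    // Sketch: one iteration of the video processing application's loop. Read a
    // frame of second video image data from a second device file, fuse it into
    // the push-video frame, and write the result into the first device file.
    #include <fcntl.h>
    #include <unistd.h>
    #include <algorithm>
    #include <cstdint>
    #include <vector>

    int main() {
        constexpr int W = 1280, H = 720, BPP = 2;      // YUYV: 2 bytes per pixel
        int src = open("/dev/video1", O_RDONLY);        // e.g. camera 201 via the deserializer
        int dst = open("/dev/video0", O_WRONLY);        // the first device file
        if (src < 0 || dst < 0) return 1;

        std::vector<uint8_t> second(W * H * BPP);
        std::vector<uint8_t> push(W * H * BPP, 0);
        read(src, second.data(), second.size());        // one frame of second video image data

        // Stand-in for image fusion: copy the top-left quadrant of the source
        // frame into the otherwise blank push-video frame.
        for (int y = 0; y < H / 2; ++y) {
            const uint8_t* srcRow = second.data() + static_cast<size_t>(y) * W * BPP;
            std::copy(srcRow, srcRow + (W / 2) * BPP,
                      push.data() + static_cast<size_t>(y) * W * BPP);
        }

        write(dst, push.data(), push.size());           // one frame of the push video
        close(src);
        close(dst);
        return 0;
    }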
Optionally, before generating the push video for pushing to the live platform, the method further comprises: deploying a first device file in an operating system; and associating the first camera with the first device file in the operating system.
Specifically, referring to fig. 3, before the video processing application generates the push video for pushing to the live platform, a first device file is also deployed at the driver layer of the operating system. The first device file can then be designated at the HAL layer, using a corresponding function, as the device file corresponding to the first camera associated with the live application. For example, when the first camera associated with the live application is the rear-facing camera of a mobile phone, the first device file (e.g., /dev/video0) can be designated at the HAL layer by setting the lens-facing attribute to rear-facing, e.g. const uint8_t facing = ANDROID_LENS_FACING_BACK and UPDATE(ANDROID_LENS_FACING, &facing, 1), so that the live application can read the first video image data from the first device file once the video processing application has written it there.
Alternatively, when the first camera associated with the live application is the front-facing camera of a mobile phone, the first device file (e.g., /dev/video1) can be designated at the HAL layer by setting the lens-facing attribute to front-facing, e.g. const uint8_t facing = ANDROID_LENS_FACING_FRONT and UPDATE(ANDROID_LENS_FACING, &facing, 1), so that the live application can read the first video image data from the first device file once the video processing application has written it there. A fuller sketch of this designation is given below.
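The fragments quoted above can be read as the usual camera-metadata update performed when the HAL builds the static characteristics of the camera that is backed by the first device file. The sketch below is a reconstruction under that assumption; the include paths and the surrounding function are placeholders that vary between Android versions, and only the ANDROID_LENS_FACING update itself corresponds to the fragments in the description.

    // Sketch: in the camera HAL, report the camera backed by the first device
    // file as front- or rear-facing so that live applications select it by
    // default. Surrounding HAL scaffolding and include paths are assumptions.
    #include <camera/CameraMetadata.h>     // android::CameraMetadata (path may vary)
    #include <system/camera_metadata.h>    // ANDROID_LENS_FACING* tags
    #include <cstdint>

    android::CameraMetadata buildStaticInfo(bool reportAsFront) {
        android::CameraMetadata info;

        const uint8_t facing = reportAsFront ? ANDROID_LENS_FACING_FRONT
                                             : ANDROID_LENS_FACING_BACK;
        info.update(ANDROID_LENS_FACING, &facing, 1);   // the UPDATE(...) call in the text

        // The remaining mandatory static tags (sensor info, available stream
        // configurations, ...) would be populated here as in any camera HAL.
        return info;
    }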
Therefore, in the technical solution of the present application, when the live application sends the operating system a request to acquire data from the first camera, the operating system reads the first video image data from the first device file and sends it to the live application. For the anchor, going live therefore only requires sending a live broadcast instruction to the live broadcast device 100 through the live application, which achieves the technical effect of making it convenient for the anchor to broadcast with the live broadcast device.
According to the first aspect of this embodiment, the technical effect of ensuring that the anchor can go live on the desired live platform is achieved.
Thus, during live broadcast, the live broadcast device 100 obtains the second video image data from the external video camera and writes the second video image data to the second device file using the operating system. The operating system then reads the second video image data from the second device file and sends the second video image data to the video processing application. The video processing application then generates a push video based on the second video image data and writes first video image data of the push video to the first device file. Further, the live application sends a request to the operating system to acquire video image data from the respective associated first camera. Then, the operating system reads the first video image data from the first device file in response to the request, and transmits the first video image data to the live application. And finally, the live broadcast application program sends the first video image data of the push video to a corresponding live broadcast platform.
Further, according to a second aspect of the present embodiment, there is provided a video processing method implemented by the processor 110 shown in fig. 2. Fig. 5 shows a schematic flow chart of the method, and referring to fig. 5, the method includes:
s502: generating a push video for pushing to the live platform; and
s504: and writing first video image data of the push video into a first device file configured by the operating system, wherein the first device file corresponds to a first camera associated with a live application program, and the live application program corresponds to the live platform.
Notably, as described in relation to the first aspect of this embodiment, the live broadcast device 100 may have a live application pre-installed, so that the operating system can send the first video image data read from the first device file to the pre-installed live application, which then transmits it to the live platform. Alternatively, the live broadcast device 100 may not have a live application installed in advance and may download one during video live broadcast. For example, the live application may be downloaded after a video source is connected to the live broadcast device 100, or after the live broadcast device 100 invokes a locally stored video image or accesses a network video image through a link address. The present invention is not particularly limited herein.
According to the second aspect of this embodiment, the technical effect of ensuring that the anchor can use the live broadcast device to go live on the desired live platform is achieved.
Further, referring to fig. 1, according to a third aspect of the present embodiment, there is provided a storage medium. The storage medium includes a stored program, wherein the method of any one of the above is performed by a processor when the program is run.
According to this embodiment, the technical effect of ensuring that the anchor can use the live broadcast device to go live on the desired live platform is achieved.
It should be noted that, for simplicity of description, the foregoing method embodiments are all described as a series of acts, but it should be understood by those skilled in the art that the present invention is not limited by the order of acts described, as some steps may be performed in other orders or concurrently in accordance with the present invention. Further, those skilled in the art will also appreciate that the embodiments described in the specification are all preferred embodiments, and that the acts and modules referred to are not necessarily required for the present invention.
From the description of the above embodiments, it will be clear to a person skilled in the art that the method according to the above embodiments may be implemented by means of software plus the necessary general hardware platform, but of course also by means of hardware, but in many cases the former is a preferred embodiment. Based on such understanding, the technical solution of the present invention may be embodied essentially or in a part contributing to the prior art in the form of a software product stored in a storage medium (e.g. ROM/RAM, magnetic disk, optical disk) comprising instructions for causing a terminal device (which may be a mobile phone, a computer, a server, or a network device, etc.) to perform the method according to the embodiments of the present invention.
Example 2
Fig. 6 shows a live video apparatus 600 according to the first aspect of the present embodiment, the apparatus 600 corresponding to the method according to the first aspect of embodiment 1. Referring to fig. 6, the apparatus 600 includes: a first push video generation module 610, configured to generate a push video for pushing to a live platform; a first data writing module 620, configured to write first video image data of a push video into a first device file disposed in an operating system, where the first device file corresponds to a first camera associated with a live application program, and the live application program corresponds to a live platform; and a first data reading module 630 for reading the first video image data from the first device file via the operating system using the live application program and transmitting the first video image data to the live platform.
Optionally, the first data reading module 630 includes: a first data acquisition module for transmitting a request for acquiring data from the first camera to the operating system using the live application; and a data receiving module for receiving, from the operating system, the first video image data read by the operating system from the first device file using the live application.
Optionally, the first push video generation module 610 includes: the first push video generation sub-module is used for generating push videos by utilizing a video processing application program; and a first data writing module 620 including: and the first data writing sub-module is used for writing first video image data of the push video into the first device file by utilizing the video processing application program.
Optionally, the first push video generation module 610 includes: a second data acquisition module for acquiring second video image data from at least one video source, wherein the at least one video source includes a video source different from the first camera associated with the live application program; and a second push video generation sub-module for generating the push video based on the second video image data.
Optionally, the second data acquisition module includes a second data reading module for reading, by using the operating system, the second video image data from a second device file corresponding to a second camera.
Optionally, the apparatus 600 further includes: a device file deployment module for deploying the first device file in the operating system; and a device file association module for associating the first camera with the first device file in the operating system.
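A minimal sketch of what the deployment and association modules could do, under the same hypothetical Linux/v4l2loopback assumptions: loading the module creates the first device file, and the card_label option makes the operating system present that file as a named camera which the live application program can then select. The module name, its options, and the device number are not taken from the embodiment.

```python
# Illustrative deployment step: create /dev/video10 and expose it to applications as a
# camera named "FirstCamera". Requires root privileges on a typical Linux system.
import subprocess

def deploy_and_associate_first_device_file() -> None:
    subprocess.run(
        [
            "modprobe", "v4l2loopback",
            "video_nr=10",             # deploy the first device file as /dev/video10
            "card_label=FirstCamera",  # associate it with a camera name visible to applications
            "exclusive_caps=1",        # advertise capture capability so apps list it as a camera
        ],
        check=True,
    )

if __name__ == "__main__":
    deploy_and_associate_first_device_file()
```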
Further, fig. 7 shows a video processing apparatus 700 according to the second aspect of the present embodiment, the apparatus 700 corresponding to the method according to the second aspect of embodiment 1. Referring to fig. 7, the apparatus 700 includes: a second push video generation module 710, configured to generate a push video for pushing to the live platform; and a second data writing module 720, configured to write first video image data of the push video into a first device file deployed in the operating system, where the first device file corresponds to a first camera associated with the live application program, and the live application program corresponds to the live platform.
By means of this embodiment, the technical effect of ensuring that an anchor can use the live broadcast device to broadcast live on the intended live broadcast platform is achieved.
Example 3
Fig. 8 shows a live video apparatus 800 according to the first aspect of the present embodiment, the apparatus 800 corresponding to the method according to the first aspect of embodiment 1. Referring to fig. 8, the apparatus 800 includes: a first processor 810; and a first memory 820 coupled to the first processor 810 and configured to provide the first processor 810 with instructions for processing the following processing steps: generating a push video for pushing to the live platform; writing first video image data of the push video into a first device file deployed in an operating system, wherein the first device file corresponds to a first camera associated with a live application program, and the live application program corresponds to the live platform; and reading the first video image data from the first device file via the operating system by using the live application program and transmitting the first video image data to the live platform.
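The three processing steps can also be illustrated end to end. The following sketch again assumes a Linux device with v4l2loopback and ffmpeg available and a pre-generated push video file; none of these choices, nor the two-second wait, is specified by the embodiment.

```python
# End-to-end sketch: deploy the first device file, write the push video into it in the
# background, and let a stand-in for the live application read the data back via the OS.
import subprocess
import time

import cv2

FIRST_DEVICE_FILE = "/dev/video10"   # hypothetical first device file
PUSH_VIDEO = "push_video.mp4"        # hypothetical pre-generated push video

# Step 1: deploy the first device file and associate it with a named camera (root required).
subprocess.run(["modprobe", "v4l2loopback", "video_nr=10",
                "card_label=FirstCamera", "exclusive_caps=1"], check=True)

# Step 2: write first video image data of the push video into the first device file.
writer = subprocess.Popen(["ffmpeg", "-re", "-i", PUSH_VIDEO,
                           "-f", "v4l2", "-pix_fmt", "yuv420p", FIRST_DEVICE_FILE])
time.sleep(2.0)  # sketch-level wait for ffmpeg to negotiate the device format

# Step 3: the live application reads the first video image data back as ordinary camera frames.
cap = cv2.VideoCapture(10)           # index 10 corresponds to /dev/video10 in this setup
while writer.poll() is None:         # run while the push video is still being written
    ok, frame = cap.read()
    if not ok:
        break
    # ... here the live application would encode the frame and transmit it to the live platform ...
cap.release()
writer.wait()
```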
Optionally, the operation of reading the first video image data from the first device file via the operating system with the live application program includes: transmitting, by using the live application program, a request to acquire data from the first camera to the operating system; and receiving, by using the live application program and from the operating system, the first video image data read by the operating system from the first device file.
Optionally, the operation of generating the push video for pushing to the live platform includes: generating the push video by using a video processing application program; and the operation of writing the first video image data of the push video into the first device file deployed in the operating system includes: writing the first video image data of the push video into the first device file by using the video processing application program.
Optionally, the operation of generating the push video for pushing to the live platform includes: acquiring second video image data from at least one video source, wherein the at least one video source includes a video source different from the first camera associated with the live application program; and generating the push video based on the second video image data.
Optionally, the at least one video source includes a second camera different from the first camera, and the operation of acquiring second video image data from the at least one video source includes: reading, by using the operating system, second video image data from a second device file corresponding to the second camera.
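Under the same hypothetical Linux/v4l2loopback assumptions as the earlier sketches, this optional second-camera path can be illustrated in a single pipeline: ffmpeg reads second video image data from the second device file, applies a placeholder processing step standing in for push video generation, and writes the result into the first device file. The device paths and the scaling filter are assumptions, not part of the embodiment.

```python
# Illustrative sketch: generate the push video from a second camera and write it into the
# first device file in one ffmpeg pipeline.
import subprocess

SECOND_DEVICE_FILE = "/dev/video0"   # hypothetical second device file (a physical camera)
FIRST_DEVICE_FILE = "/dev/video10"   # hypothetical first device file (virtual first camera)

subprocess.run(
    [
        "ffmpeg",
        "-f", "v4l2", "-i", SECOND_DEVICE_FILE,  # read second video image data via the OS
        "-vf", "scale=1280:720",                 # placeholder for the push video generation step
        "-f", "v4l2", "-pix_fmt", "yuv420p",
        FIRST_DEVICE_FILE,                       # write first video image data of the push video
    ],
    check=True,
)
```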
Optionally, before the push video for pushing to the live platform is generated, the processing steps further include: deploying the first device file in the operating system; and associating the first camera with the first device file in the operating system.
Further, fig. 9 shows a video processing apparatus 900 according to the second aspect of the present embodiment, the apparatus 900 corresponding to the method according to the second aspect of embodiment 1. Referring to fig. 9, the apparatus 900 includes: a second processor 910; and a second memory 920 coupled to the second processor 910 and configured to provide the second processor 910 with instructions for processing the following processing steps: generating a push video for pushing to the live platform; and writing first video image data of the push video into a first device file deployed in the operating system, wherein the first device file corresponds to a first camera associated with the live application program, and the live application program corresponds to the live platform.
By means of this embodiment, the technical effect of ensuring that an anchor can use the live broadcast device to broadcast live on the intended live broadcast platform is achieved.
The foregoing embodiment numbers of the present invention are merely for the purpose of description, and do not represent the advantages or disadvantages of the embodiments.
In the foregoing embodiments of the present invention, each embodiment is described with its own emphasis; for a portion that is not described in detail in one embodiment, reference may be made to the related descriptions of the other embodiments.
In the several embodiments provided in the present application, it should be understood that the disclosed technical content may be implemented in other manners. The apparatus embodiments described above are merely exemplary; for example, the division of the units is merely a logical function division, and in actual implementation there may be another division manner; for example, multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. In addition, the coupling, direct coupling, or communication connection shown or discussed between the parts may be implemented through some interfaces, units, or modules, and may be electrical or in other forms.
The units described as separate components may or may not be physically separate, and the components shown as units may or may not be physical units; they may be located in one place or distributed over a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in the embodiments of the present invention may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
If the integrated units are implemented in the form of software functional units and sold or used as stand-alone products, they may be stored in a computer-readable storage medium. Based on such an understanding, the technical solution of the present invention, in essence or the part contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product stored in a storage medium and including instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to perform all or part of the steps of the method according to the embodiments of the present invention. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a Read-Only Memory (ROM), a Random Access Memory (RAM), a removable hard disk, a magnetic disk, or an optical disk.
The foregoing is merely a preferred embodiment of the present invention. It should be noted that several modifications and improvements may be made by those skilled in the art without departing from the principles of the present invention, and such modifications and improvements shall also fall within the protection scope of the present invention.

Claims (10)

1. A live video method for a live video device, comprising:
generating a push video for pushing to the live platform;
writing first video image data of the push video into a first device file deployed in an operating system, wherein the first device file corresponds to a first camera associated with a live application program, and the live application program corresponds to the live platform; and
reading the first video image data from the first device file via the operating system by using the live application program, and transmitting the first video image data to the live platform.
2. The method of claim 1, wherein the operation of reading the first video image data from the first device file via the operating system with the live application comprises:
transmitting, by using the live application program, a request to acquire data from the first camera to the operating system; and
receiving, by using the live application program and from the operating system, the first video image data read by the operating system from the first device file.
3. The method of claim 1, wherein the operation of generating the push video for pushing to the live platform comprises: generating the push video by using a video processing application program; and
the operation of writing the first video image data of the push video into a first device file deployed in an operating system comprises: writing the first video image data of the push video into the first device file by using the video processing application program.
4. A method according to claim 1 or 3, wherein the operation of generating push video for pushing to a live platform comprises:
obtaining second video image data from at least one video source, wherein the at least one video source comprises a video source different from the first camera associated with the live application program; and
and generating the push video based on the second video image data.
5. The method of claim 4, wherein the at least one video source comprises a second camera different from the first camera, and the operation of acquiring second video image data from the at least one video source comprises:
reading, by using the operating system, second video image data from a second device file corresponding to the second camera.
6. The method of claim 1, further comprising, prior to generating the push video for pushing to the live platform:
deploying the first device file in the operating system; and
associating the first camera with the first device file in the operating system.
7. A video processing method for a live broadcast device, comprising:
generating a push video for pushing to the live platform; and
writing first video image data of the push video into a first device file deployed in an operating system, wherein the first device file corresponds to a first camera associated with a live application program, and the live application program corresponds to the live platform.
8. A storage medium comprising a stored program, wherein the method of any one of claims 1 to 7 is performed by a processor when the program is run.
9. A live video device, comprising:
the first push video generation module is used for generating push videos for pushing to the live broadcast platform;
The first data writing module is used for writing first video image data of the push video into a first device file deployed in an operating system, wherein the first device file corresponds to a first camera associated with a live broadcast application program, and the live broadcast application program corresponds to the live broadcast platform; and
the first data reading module is used for reading the first video image data from the first device file via the operating system by using the live broadcast application program, and for transmitting the first video image data to the live broadcast platform.
10. A live video device, comprising:
a first processor; and
a first memory, coupled to the first processor, for providing instructions to the first processor to process the following processing steps:
generating a push video for pushing to the live platform;
writing first video image data of the push video into a first device file deployed in an operating system, wherein the first device file corresponds to a first camera associated with a live application program, and the live application program corresponds to the live platform; and
reading the first video image data from the first device file via the operating system by using the live application program, and transmitting the first video image data to the live platform.
CN202410200062.0A 2024-02-23 2024-02-23 Video live broadcast and video processing method, device and storage medium Active CN117793449B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202410435100.0A CN118118731A (en) 2024-02-23 2024-02-23 Video live broadcast and video processing method, device and storage medium
CN202410200062.0A CN117793449B (en) 2024-02-23 2024-02-23 Video live broadcast and video processing method, device and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202410200062.0A CN117793449B (en) 2024-02-23 2024-02-23 Video live broadcast and video processing method, device and storage medium

Related Child Applications (1)

Application Number Title Priority Date Filing Date
CN202410435100.0A Division CN118118731A (en) 2024-02-23 2024-02-23 Video live broadcast and video processing method, device and storage medium

Publications (2)

Publication Number Publication Date
CN117793449A true CN117793449A (en) 2024-03-29
CN117793449B CN117793449B (en) 2024-04-30

Family

ID=90389268

Family Applications (2)

Application Number Title Priority Date Filing Date
CN202410435100.0A Pending CN118118731A (en) 2024-02-23 2024-02-23 Video live broadcast and video processing method, device and storage medium
CN202410200062.0A Active CN117793449B (en) 2024-02-23 2024-02-23 Video live broadcast and video processing method, device and storage medium

Family Applications Before (1)

Application Number Title Priority Date Filing Date
CN202410435100.0A Pending CN118118731A (en) 2024-02-23 2024-02-23 Video live broadcast and video processing method, device and storage medium

Country Status (1)

Country Link
CN (2) CN118118731A (en)

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160198226A1 (en) * 2005-04-18 2016-07-07 Mark Sinclair Krebs Multimedia System For Mobile Client Platforms
CN109089129A (en) * 2018-09-05 2018-12-25 南京爱布谷网络科技有限公司 The steady more video binding live broadcast systems of one kind and its method
CN214959711U (en) * 2021-06-29 2021-11-30 沈阳爱视游网络科技有限公司 Lightweight multi-platform interactive video live broadcast cloud control system
CN215734555U (en) * 2021-05-08 2022-02-01 多彩贵州网有限责任公司 Internet video collecting and editing equipment
CN115225920A (en) * 2021-04-20 2022-10-21 苏州思萃人工智能研究所有限公司 Multi-camera multi-scene cloud support live broadcast system for mobile phone
CN115250356A (en) * 2021-04-26 2022-10-28 苏州思萃人工智能研究所有限公司 Multi-camera switchable virtual camera of mobile phone
CN115550678A (en) * 2022-09-26 2022-12-30 北京二六三企业通信有限公司 Live video processing method and device and storage medium
CN116634188A (en) * 2023-06-09 2023-08-22 北京世纪好未来教育科技有限公司 Live broadcast method and device and computer readable storage medium
CN117425052A (en) * 2023-10-25 2024-01-19 包头市金华科技有限公司 Intelligent live broadcast recording and broadcasting system supporting linkage switching of various cameras

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109889875B (en) * 2019-01-23 2021-07-16 北京奇艺世纪科技有限公司 Communication method, communication device, terminal equipment and computer readable medium
CN112788349B (en) * 2019-11-01 2022-10-04 上海哔哩哔哩科技有限公司 Data stream pushing method, system, computer equipment and readable storage medium
CN111107388A (en) * 2019-12-31 2020-05-05 广州华多网络科技有限公司 Method, device, system, equipment and storage medium for processing live broadcast content
WO2021179315A1 (en) * 2020-03-13 2021-09-16 深圳市大疆创新科技有限公司 Video live streaming method and system, and computer storage medium
CN116095397A (en) * 2023-01-18 2023-05-09 杭州星犀科技有限公司 Live broadcast method, live broadcast device, electronic equipment and storage medium

Also Published As

Publication number Publication date
CN117793449B (en) 2024-04-30
CN118118731A (en) 2024-05-31

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant