CN113727133A - Live broadcast control method and device, equipment and medium thereof - Google Patents


Info

Publication number
CN113727133A
Authority
CN
China
Prior art keywords
application
video data
live
video
live broadcast
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202111017139.3A
Other languages
Chinese (zh)
Other versions
CN113727133B (en)
Inventor
曾衍
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangzhou Cubesili Information Technology Co Ltd
Original Assignee
Guangzhou Cubesili Information Technology Co Ltd
Application filed by Guangzhou Cubesili Information Technology Co Ltd
Priority to CN202111017139.3A
Publication of CN113727133A
Application granted
Publication of CN113727133B
Legal status: Active
Anticipated expiration

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20 Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/21 Server components or server architectures
    • H04N21/218 Source of audio or video content, e.g. local disk arrays
    • H04N21/2187 Live feed
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/436 Interfacing a local distribution network, e.g. communicating with another STB or one or more peripheral devices inside the home
    • H04N21/4363 Adapting the video stream to a specific local network, e.g. a Bluetooth® network
    • H04N21/44 Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs
    • H04N21/44008 Processing of video elementary streams involving operations for analysing video streams, e.g. detecting features or characteristics in the video stream
    • H04N21/44016 Processing of video elementary streams involving splicing one content stream with another content stream, e.g. for substituting a video clip

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Databases & Information Systems (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Information Transfer Between Computers (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)

Abstract

The application discloses a live broadcast control method and a corresponding apparatus, device and medium. The method comprises the following steps: a first application establishes a screen-casting communication link with a second application in the same device; the first application receives video data information sent by the second application and loads the video data pointed to by the video data information; and the first application pushes a live stream containing the video data to a live broadcast room controlled by the first application. By constructing a video screen-casting function between different applications in the same device, the application realizes switching of the video playback control side, so that one application can use its own video playback functions to control the playback and pushing of video from other applications. When used in a live broadcast application, video from other applications can be composited into the live stream and pushed for broadcast, thereby enhancing the playback effect of the live stream and enlivening the atmosphere of the live broadcast room.

Description

Live broadcast control method and device, equipment and medium thereof
Technical Field
The present application relates to the field of network live broadcast, and in particular, to a live broadcast control method, and further, to an apparatus, a device, and a non-volatile storage medium corresponding to the method.
Background
Existing technology for transferring video playback from one interface to another relies on wireless screen-casting technology such as DLNA (Digital Living Network Alliance). Through wireless screen casting, a client on a smart mobile device can present its screen images and sound on another display device, projecting video data from one device onto the screen of another for playback. Typically, the display of a small-screen device is cast onto the screen of a large-screen device so that the user enjoys a better viewing experience. However, such screen-casting technology can only achieve cross-device sharing of video content and cannot be used for cross-application sharing within the same device.
Live broadcast sharing is a common channel for sharing video content, and it often requires a variety of video sources for content support. When the controlling side of a live broadcast room needs to share a video source that its own live broadcast application does not provide, especially a video source belonging to another application program on the local device, sharing can only be achieved indirectly through complicated operations. Sometimes, because the application program providing the video source does not support downloading, sharing cannot be achieved at all, or the shared image quality suffers.
In view of the fact that the prior art cannot acquire a live broadcast shared video source from another application on the same device, the applicant has made corresponding explorations to meet this live broadcast sharing requirement.
Disclosure of Invention
The present application is directed to a live broadcast control method and a corresponding apparatus, electronic device, and non-volatile storage medium, so as to satisfy the above requirement or overcome at least some of the disadvantages of the prior art.
In order to realize the purpose of the application, the following technical scheme is adopted:
a live broadcast control method adapted to one of the purposes of the present application includes the steps of:
the method comprises the steps that a screen projection communication link is established between a first application and a second application in the same-machine equipment;
the first application receives the video data information sent by the second application and loads the video data pointed by the video data information;
the first application pushes a live stream containing the video data to a live room controlled by the first application.
In a further embodiment, the step of establishing a screen-casting communication link between the first application and a second application in the peer device includes:
the first application receives a multicast message broadcasted by the second application according to a preset protocol specification and used for searching the first application suitable for establishing the screen-casting communication link;
the first application responds to the multicast message and feeds back equipment information preconfigured by the first application to the second application in a unicast mode, and the equipment information comprises communication configuration information;
and the first application and the second application establish a screen projection communication link according to the communication configuration information to realize binding.
In a further embodiment, the step of establishing a screen-casting communication link between the first application and a second application in the peer device includes:
the first application searches and displays an available program list belonging to the second application in the same-machine equipment;
the first application sends equipment information pre-configured by the first application to a second application selected from the available program list, wherein the equipment information comprises communication configuration information so as to activate the second application to enter a state of establishing a screen projection communication link with the first application;
and the first application and the second application establish a screen projection communication link according to the communication configuration information to realize binding.
In a further embodiment, the step of receiving, by the first application, the video data information sent by the second application, and loading the video data pointed by the video data information includes:
the first application receives video data information sent by the second application through the screen projection communication link, the video data information comprises a reference address of a video source, and the video source is selected from a video source list provided by a graphical user interface of the second application;
the first application analyzes the video data information and obtains a reference address of a video source from the video data information;
and the first application calls the built-in player to load and play the video data pointed by the reference address.
In a further embodiment, the step of receiving, by the first application, the video data information sent by the second application, and loading the video data pointed by the video data information includes:
the first application receives video data information sent by the second application through the screen projection communication link, the video data information comprises a local cache path of a video source, and the video source is selected from a video source list provided by a graphical user interface of the second application;
the first application analyzes video data information and obtains a local cache path of the video data sent by the second application;
and the first application calls a built-in player to load and play the video data pointed by the local cache path.
In a further embodiment, the step of the first application pushing a live stream containing said video data to a live room controlled by the first application comprises:
the first application transmits the reference address to a media server to drive the media server to acquire video data according to the reference address, so that the video data is pushed to the live broadcast room as live broadcast stream;
and the live broadcast room of the first application receives the live broadcast stream through any terminal equipment where the live broadcast room is located and plays the live broadcast stream in an internal player of the live broadcast room.
In a preferred embodiment, the step of the first application pushing a live stream containing said video data to a live room controlled by the first application comprises:
the first application transmits the video data pointed by the local cache path to a media server by streaming media so as to drive the media server to push the live stream containing the video data to the live broadcast room;
and the live broadcast room of the first application receives the live broadcast stream through any terminal equipment where the live broadcast room is located and plays the live broadcast stream in an internal player of the live broadcast room.
In a further embodiment, after the step of the first application pushing the live stream containing the video data to the live room controlled by the first application is executed, the following post-steps are executed:
the first application monitors a video source playing control event triggered by the second application, and correspondingly controls the playing of the video data in the live broadcast room in response to the playing control event.
In a further embodiment, in the step in which the first application and the second application in the same device are bound over the screen projection communication link, the first application and the second application are configured to be respectively located in virtual devices with communication addresses different from each other, so that the established screen projection communication link conforms to the communication specification of the DLNA protocol.
To serve the purposes of the present application, a live broadcast control apparatus is proposed, which includes:
the screen-casting communication link establishing module is used for establishing a screen-casting communication link between the first application and a second application in the same-machine equipment;
the video data information pushing module is used for receiving the video data information sent by the second application by the first application and loading video data pointed by the video data information;
and the live stream pushing module is used for pushing the live stream containing the video data to a live room controlled by the first application.
In a further embodiment, the screen-projection communication link establishing module includes:
the multicast message receiving submodule is used for receiving a multicast message which is broadcasted by the second application according to the preset protocol standard and is used for searching the first application suitable for establishing the screen-casting communication link by the first application;
the device information response submodule is used for the first application to respond to the multicast message and feed back the device information pre-configured by the first application to the second application in a unicast mode, and the device information comprises communication configuration information;
and the application binding submodule is used for establishing a screen projection communication link between the first application and the second application according to the communication configuration information to realize binding.
In a preferred embodiment, the screen-projection communication link establishing module further includes:
the list searching submodule searches and displays an available program list belonging to the second application in the same-machine equipment;
the device information selection submodule is used for sending device information pre-configured by the first application to a second application selected from the available program list, and the device information contains communication configuration information so as to activate the second application to enter a state of establishing a screen projection communication link with the first application;
and the application binding submodule is used for establishing a screen projection communication link between the first application and the second application according to the communication configuration information to realize binding.
In a further embodiment, the video data information pushing module includes:
the video data information receiving submodule is used for receiving the video data information sent by the second application through the screen projection communication link by the first application, the video data information comprises a reference address of a video source, and the video source is selected from a video source list provided by a graphical user interface of the second application;
the reference address acquisition submodule is used for analyzing the video data information by the first application and acquiring a reference address of a video source from the video data information;
and the video data loading submodule is used for calling the built-in player to load and play the video data pointed by the reference address by the first application.
In a preferred embodiment, the video data information pushing module includes:
the first application receives video data information sent by the second application through the screen projection communication link, the video data information comprises a local cache path of a video source, and the video source is selected from a video source list provided by a graphical user interface of the second application;
the local cache path acquisition module is used for the first application to analyze video data information and acquire a local cache path of the video data sent by the second application;
and the first application calls a built-in player to load and play the video data pointed by the local cache path.
In a further embodiment, the live stream push module includes:
the reference address pushing submodule is used for transmitting the reference address to the media server by the first application so as to drive the media server to acquire video data according to the reference address and push the video data to the live broadcast room as live broadcast stream;
and the live broadcast stream playing sub-module is used for receiving the live broadcast stream through any terminal equipment where the first application is located in the live broadcast room of the first application and playing the live broadcast stream in an internal player of the live broadcast room.
In a preferred embodiment, the live stream push module further includes:
the local cache path pushing submodule is used for the first application to transmit the video data pointed by the local cache path to the media server by using streaming media, so as to drive the media server to push the live broadcast stream containing the video data to the live broadcast room;
and the live broadcast stream playing sub-module is used for receiving the live broadcast stream through any terminal equipment where the first application is located in the live broadcast room of the first application and playing the live broadcast stream in an internal player of the live broadcast room.
An electronic device adapted for the purpose of the present application includes a central processing unit and a memory, wherein the central processing unit is configured to invoke and run a computer program stored in the memory to execute the steps of the live broadcast control method.
The non-volatile storage medium stores a computer program implemented according to the live broadcast control method, and when the computer program is called by a computer, the computer program executes the steps included in the corresponding method.
Compared with the prior art, the application has the following advantages:
according to the method and the device, the video screen projection function between different applications in the same device is constructed, the function allows different applications in the same device to acquire corresponding video data from other applications to be loaded and played in a screen projection mode through a screen projection mode in a mode of establishing a screen projection communication link, and the video data is loaded into a live stream to be broadcast to a live room to be played and displayed.
Firstly, the video screen-casting function between different applications in the same device achieves cross-application video data sharing by way of screen casting and thereby simplifies the sharing operation. An application in the device acquires the corresponding video data from another application over the screen-casting communication link for loading and playback, which avoids the situation where sharing fails because the application providing the video source does not support downloading. The application that acquires the video data can thus play video data that cannot be obtained from the network or from the device's storage space, enriching its video resources.
Secondly, an application in the same device can act as the playback controller for another application's video data. Going beyond the other application's built-in player, it can control how the video data is played using its own playback functions, optimizing the playback of the video data on the device and improving the overall viewing experience.
In addition, a live broadcast application on the same device can acquire video data from other applications and share the screen-cast video data as content of its live broadcast room. The video frames of the video data are composited into the live stream and pushed to the live broadcast room for playback, which enriches the video material of the live stream; the video compositing functions of the live broadcast application can also apply corresponding video special effects to the video data, improving the playback effect of the live stream.
Additional aspects and advantages of the present application will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of the present application.
Drawings
The foregoing and/or additional aspects and advantages of the present application will become apparent and readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings of which:
fig. 1 is a schematic diagram of a typical network deployment architecture related to implementing the technical solution of the present application;
fig. 2 is a schematic flowchart of an exemplary embodiment of a live broadcast control method according to the present application;
fig. 3 is a schematic product architecture diagram of a first application and a second application in the case that a screen-casting communication link of the present application conforms to a communication specification of a DLNA protocol;
FIG. 4 is a flowchart illustrating specific steps of another embodiment of step S11 in FIG. 2;
FIG. 5 is a schematic diagram of a graphical user interface associated with a first application and a second application when the second application is an initiator for establishing a screen-cast communication link according to the present application;
FIG. 6 is a flowchart illustrating specific steps of step S11 in FIG. 2;
FIG. 7 is a schematic diagram of a graphical user interface associated with a first application and a second application when the first application is an initiator for establishing a screen-cast communication link according to the present application;
FIG. 8 is a flowchart illustrating specific steps of step S12 in FIG. 2;
FIG. 9 is a schematic view of a video source list of the present application in a graphical user interface of a second application;
FIG. 10 is a flowchart illustrating specific steps of another embodiment of step S12 in FIG. 2;
FIG. 11 is a schematic view of a graphical user interface of a first application during playback of a live stream containing video data in a live room;
FIG. 12 is a flowchart illustrating specific steps of step S13 in FIG. 2;
FIG. 13 is a flowchart illustrating specific steps of another embodiment of step S13 in FIG. 2;
fig. 14 is a schematic flow chart of an embodiment of a live broadcast control method according to the present application, in which a post-step is added;
fig. 15 is a functional block diagram of an exemplary embodiment of a live control apparatus of the present application;
fig. 16 is a block diagram illustrating a basic structure of a computer device according to an embodiment of the present application.
Detailed Description
Reference will now be made in detail to embodiments of the present application, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to the same or similar elements or elements having the same or similar function throughout. The embodiments described below with reference to the drawings are exemplary only for the purpose of explaining the present application and are not to be construed as limiting the present application.
As used herein, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms "comprises" and/or "comprising," when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof. It will be understood that when an element is referred to as being "connected" or "coupled" to another element, it can be directly connected or coupled to the other element, or intervening elements may also be present. Further, "connected" or "coupled" as used herein may include wirelessly connected or wirelessly coupled. As used herein, the term "and/or" includes all or any element and all combinations of one or more of the associated listed items.
It will be understood by those within the art that, unless otherwise defined, all terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs. It will be further understood that terms, such as those defined in commonly used dictionaries, should be interpreted as having a meaning that is consistent with their meaning in the context of the prior art and will not be interpreted in an idealized or overly formal sense unless expressly so defined herein.
As will be appreciated by those skilled in the art, "client," "terminal," and "terminal device" as used herein include both devices that are wireless signal receivers, which are devices having only wireless signal receivers without transmit capability, and devices that are receive and transmit hardware, which have receive and transmit hardware capable of two-way communication over a two-way communication link. Such a device may include: cellular or other communication devices such as personal computers, tablets, etc. having single or multi-line displays or cellular or other communication devices without multi-line displays; PCS (Personal Communications Service), which may combine voice, data processing, facsimile and/or data communication capabilities; a PDA (Personal Digital Assistant), which may include a radio frequency receiver, a pager, internet/intranet access, a web browser, a notepad, a calendar and/or a GPS (Global Positioning System) receiver; a conventional laptop and/or palmtop computer or other device having and/or including a radio frequency receiver. As used herein, a "client," "terminal device" can be portable, transportable, installed in a vehicle (aeronautical, maritime, and/or land-based), or situated and/or configured to operate locally and/or in a distributed fashion at any other location(s) on earth and/or in space. The "client", "terminal Device" used herein may also be a communication terminal, a web terminal, a music/video playing terminal, such as a PDA, an MID (Mobile Internet Device) and/or a Mobile phone with music/video playing function, and may also be a smart tv, a set-top box, and the like.
The hardware referred to by the names "server", "client", "service node", etc. is essentially an electronic device with the performance of a personal computer: a hardware device having the necessary components disclosed by the von Neumann principle, such as a central processing unit (including an arithmetic unit and a controller), a memory, an input device and an output device. A computer program is stored in the memory, and the central processing unit calls the program stored in external memory into internal memory to run it, executes the instructions in the program, and interacts with the input and output devices, thereby completing a specific function.
It should be noted that the concept of "server" as referred to in this application can be extended to the case of a server cluster. According to the network deployment principle understood by those skilled in the art, the servers should be logically divided, and in physical space, the servers may be independent from each other but can be called through an interface, or may be integrated into one physical computer or a set of computer clusters. Those skilled in the art will appreciate this variation and should not be so limited as to restrict the implementation of the network deployment of the present application.
Referring to fig. 1, the hardware basis required for implementing the related art embodiments of the present application may be deployed according to the architecture shown in the figure. The server 80 is deployed at the cloud end, and serves as a business server, and is responsible for further connecting to a related data server and other servers providing related support, so as to form a logically associated server cluster to provide services for related terminal devices, such as a smart phone 81 and a personal computer 82 shown in the figure, or a third-party server (not shown in the figure). Both the smart phone and the personal computer can access the internet through a known network access mode, and establish a data communication link with the cloud server 80 so as to run a terminal application program related to the service provided by the server.
For the server, the application program is usually constructed as a service process, and a corresponding program interface is opened for remote call of the application program running on various terminal devices.
The application program refers to an application program running on a server or a terminal device, the application program implements the related technical scheme of the application in a programming mode, a program code of the application program can be saved in a nonvolatile storage medium which can be identified by a computer in a form of a computer executable instruction, and is called into a memory by a central processing unit to run, and the related device of the application is constructed by running the application program on the computer.
Most terminal devices that are popular at present, particularly mobile devices such as tablets and mobile phones, have a camera built in; a personal computer can also be connected to an external camera device.
The technical scheme suitable for being implemented in the terminal device in the application can also be programmed and built in an application program providing live webcasting, and the technical scheme is used as a part of extended functions. The live webcast refers to a live webcast room network service realized based on the network deployment architecture.
The live broadcast room is a video chat room realized by means of an internet technology, generally has an audio and video broadcast control function and comprises a main broadcast user and audience users, wherein the audience users can comprise registered users registered in a platform or unregistered tourist users; either registered users who are interested in the anchor user or registered or unregistered users who are not interested in the anchor user. The interaction between the anchor user and the audience user can be realized through known online interaction modes such as voice, video, characters and the like, generally, the anchor user performs programs for the audience user in the form of audio and video streams, and economic transaction behaviors can also be generated in the interaction process. Of course, the application form of the live broadcast room is not limited to online entertainment, and can be popularized to other relevant scenes, such as an educational training scene, a video conference scene, a product recommendation and sale scene, and any other scene needing similar interaction.
Those skilled in the art will appreciate that, although the various methods of the present application are described based on the same concept so that they share common features, they may be performed independently unless otherwise specified. Likewise, each embodiment disclosed in the present application is proposed based on the same inventive concept; therefore, concepts expressed identically, and concepts whose expressions differ only for convenience, should be understood in the same way.
Referring to fig. 2, a live broadcast control method according to the present application, in an exemplary embodiment, includes the following steps:
step S11, the first application establishes a screen-casting communication link with a second application in the peer device:
the first application establishes the screen-casting communication link with the second application running in the co-located device.
The first application and the second application are application programs running in the same device; that is, the second application runs on the same device as the first application.
The first application is generally a live broadcast application program used to carry out a live broadcast service. By receiving the video data information pushed by the second application with which it has established the screen-casting communication link, the first application can load the video data pointed to by that information into its own live stream for pushing.
The second application generally refers to an application program capable of independently playing video data and pushing video data information. It can act as the output side of the video data, casting the video data to the graphical user interface of the first application for playback, and it can also act as a playback controller that controls how the first application plays the video data. By establishing the screen-casting communication link with the first application, the second application pushes the video data information of the selected video data to the first application, so that the first application can push a live stream containing that video data.
Since the first application and the second application run on the same device, they exchange the video data information over the screen-casting communication link established within the device, so that the first application pushes a live stream containing the video data pointed to by that information to its live broadcast room and outputs the video data there for display. This differs from an ordinary application that shares its video playback page with other applications through a network link, where the other applications merely jump to that playback page through the link. Here, after the first application acquires the video data information, it can load the video data into its own built-in player for playback, and the second application can also take part in controlling that playback in the first application, for example pausing playback, starting playback, or adjusting the playback progress.
Specifically, referring to fig. 3, the screen-casting communication link generally conforms to the communication specification of the DLNA protocol, and the first application 302 and the second application 301 are configured as virtual devices with different communication addresses. The second application 301 has a DMS module (Digital Media Server), through which the video data information is pushed to the first application 302 over the screen-casting communication link conforming to the DLNA protocol, and a DMC module (Digital Media Controller), which can control the playback, in the first application, of the video data acquired according to the video data information. The first application 302 has a DMP module (Digital Media Player), which combines the functions of a DMC module and a DMR module (Digital Media Renderer); through the DMP module it renders and plays the video data acquired according to the video data information and controls its playback.
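For orientation only, the role split described for fig. 3 can be sketched as plain Kotlin interfaces; this is a simplified assumption made for illustration and not a real DLNA/UPnP stack or API.

    // DMS role held by the second application: pushes the video data information.
    interface DigitalMediaServer {
        fun pushVideoDataInfo(info: String) // a reference address or a local cache path
    }

    // DMC role: issues play-control commands for remotely rendered video data.
    interface DigitalMediaController {
        fun play()
        fun pause()
        fun seekTo(positionMs: Long)
    }

    // DMR role: loads and renders the received video data.
    interface DigitalMediaRenderer {
        fun load(info: String)
    }

    // DMP role held by the first application: a renderer that also carries its own
    // control capability, matching the description of fig. 3 above.
    interface DigitalMediaPlayer : DigitalMediaRenderer, DigitalMediaController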
Referring to fig. 4 and 5, regarding how the first application and the second application establish the screen-casting communication link, when the second application is the initiator of establishing the screen-casting communication link with the first application, the specific steps are as follows:
step S111, the first application receives a multicast packet broadcasted by the second application according to a preset protocol specification, and used for searching for the first application suitable for establishing the screen-casting communication link:
and the second application establishes the multicast message for searching the screen-casting communication link established with the second application according to the preset protocol standard broadcast, and pushes the multicast message to the first application.
The protocol specification for the broadcast generally refers to UDP (User Datagram Protocol), which gives applications a way to send encapsulated IP packets without establishing a connection, so that the second application can push the multicast packet to the first application before the screen-casting communication link is established.
The second application broadcasts according to the preset protocol specification to search for a first application that is open to establishing the screen-casting communication link, and pushes the multicast message to the address and port of the first application, so that the first application receives the multicast message and responds, allowing the two parties to establish the screen-casting communication link.
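As a minimal sketch of such a discovery multicast, the following Kotlin fragment assumes the SSDP multicast address and port (239.255.255.250:1900) conventionally used by DLNA/UPnP; the packet body is illustrative, since the method does not prescribe a concrete message format.

    import java.net.DatagramPacket
    import java.net.InetAddress
    import java.net.MulticastSocket

    // Second application: broadcast a search packet looking for a first application
    // that is willing to establish the screen-casting communication link.
    fun broadcastSearch() {
        val group = InetAddress.getByName("239.255.255.250")
        val message = "M-SEARCH * HTTP/1.1\r\nMAN: \"ssdp:discover\"\r\nST: ssdp:all\r\n\r\n"
            .toByteArray(Charsets.US_ASCII)
        MulticastSocket().use { socket ->
            socket.send(DatagramPacket(message, message.size, group, 1900))
        }
    }

    // First application: listen for the search multicast and hand the packet (which
    // carries the sender's address and port) to the unicast reply step described next.
    fun listenForSearch(onSearch: (DatagramPacket) -> Unit) {
        MulticastSocket(1900).use { socket ->
            socket.joinGroup(InetAddress.getByName("239.255.255.250"))
            val buffer = ByteArray(1024)
            val packet = DatagramPacket(buffer, buffer.size)
            socket.receive(packet) // blocks until a multicast search arrives
            onSearch(packet)
        }
    }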
Specifically, referring to fig. 5, a screen-casting application selection event is generated by touching the video data screen-casting control A-501 in the graphical user interface of the second application, in order to cast the video data being played in the video data playing window A-502 of fig. 5 to a corresponding application program in the same device for playback. The second application responds to the screen-casting application selection event by switching the current graphical user interface from interface A to interface B, which displays the screen-casting application list B-501; the list shows the application programs in the device that can establish the screen-casting communication link. Touching the control B-502 selects application program B in the list as the first application for establishing the screen-casting communication link; application program B (the first application) is shown in the list.
Step S112, the first application responds to the multicast packet and unicast-feeds back the device information preconfigured by the first application to the second application, where the device information includes communication configuration information:
and after receiving the multicast message, the first application responds to the multicast message to acquire an address and a port contained in the multicast message, and unicast-feeds back equipment information which is pre-configured by the first application and contains the communication configuration information to the second application through the address and the port.
The device information refers to the first application's own information used for establishing the screen-casting communication link with other applications; it contains the communication configuration information, which includes the IP address and the UPnP service port of the first application. After receiving the device information, the second application establishes the screen-casting communication link with the first application according to the communication configuration information it contains.
Specifically, referring to fig. 5, diagram C of fig. 5 shows a graphical user interface of the first application. By touching the accept-screen-casting control C-501, the first application responds to the multicast packet pushed by the second application and feeds back its preconfigured device information to the second application by unicast, so that the video data played in the video data playing window A-502 of fig. 5 can be composited into a live stream for broadcasting and loaded into the live stream playing control C-502 for playback.
Step S113, the first application and the second application establish a screen projection communication link according to the communication configuration information to realize binding:
and the second application receives the equipment information fed back by the first application unicast, and establishes the screen projection communication link with the first application according to the communication configuration information contained in the equipment information so as to realize the mutual binding of the screen projection communication links of the two parties.
Referring to fig. 6 and 7, regarding how the first application and the second application establish the screen-casting communication link, when the first application is the initiator of establishing the screen-casting communication link with the second application, the specific steps are as follows:
step S111', the first application searches and displays an available program list belonging to the second application in the peer device:
and the first application acquires the available program list which belongs to the second application in the same-machine equipment and is displayed by searching through triggering the establishment event of the screen-casting communication link.
The available program list contains the IP addresses and UPnP ports of the second applications in the same device that can be used to establish the screen-casting communication link. After acquiring the list, the first application outputs and displays it in its graphical user interface; the first application can then select a corresponding second application shown in the available program list, push its device information to it, and establish the screen-casting communication link with the selected second application.
Specifically, referring to fig. 7, diagram A of fig. 7 is the current graphical user interface of the first application. The video data output side control A-601 is selected by touch to trigger the establishment event, so that the corresponding video data can be acquired, composited into a live stream, and output to the live stream playing window A-602 for playback. At this time, the available program list B-601 shown in diagram B is output in the graphical user interface of the first application, and it displays the second applications capable of establishing the screen-casting communication link with the first application.
Step S112', the first application sends device information preconfigured by the first application to a second application selected from the available program list, where the device information includes communication configuration information to activate the second application to enter a state of establishing a screen-casting communication link with the first application:
the first application selects the corresponding second application from the available program list to push the device information pre-configured by the first application to the second application, so that the second application receives the device information and activates the state of establishing the screen-casting communication link with the first application.
The device information is the first application's own information used for establishing the screen-casting communication link with other applications; it contains the communication configuration information, which includes the IP address and the UPnP service port of the first application. After receiving the device information, the second application establishes the screen-casting communication link with the first application according to the communication configuration information it contains.
Specifically, referring to fig. 7, the device information preconfigured by the first application is sent to application program B by touching the connect control B-602 in diagram B of fig. 7, and application program B then acts as the second application for establishing the screen-casting communication link between the two parties.
For the implementation of the device information, please refer to the related embodiment in step S112, which is not repeated herein.
Step S113', the first application and the second application establish a screen-casting communication link according to the communication configuration information to realize binding:
and the second application receives the equipment information pushed by the first application, and establishes the screen projection communication link with the first application according to the communication configuration information contained in the equipment information so as to realize the mutual binding of the screen projection communication links of the two parties.
Step S12, the first application receives the video data information sent by the second application, and loads the video data pointed by the video data information:
after the screen-casting communication link is established between the first application and the second application, the first application can receive the video data information pushed by the second application through the screen-casting communication link, so that the first application can load the video data pointed by the video data information.
The video data information generally refers to a reference address or a local cache path for acquiring the video data; after obtaining the video data information, the first application acquires the video data it points to through the reference address or the local cache path included in it.
The video data generally refers to video data acquired by the second application from the network or locally. The video data can be loaded and played by both the first application and the second application, and both applications can control its playback, for example pausing playback, starting playback, adjusting the playback progress, adjusting the playback volume, and so on.
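As a hypothetical data model only, the video data information can be represented as either a reference address or a local cache path; the parsing rule in the Kotlin sketch below is an assumption made for illustration.

    // Illustrative model of the video data information exchanged over the link.
    sealed class VideoDataInfo {
        data class ReferenceAddress(val url: String) : VideoDataInfo()
        data class LocalCachePath(val path: String) : VideoDataInfo()

        companion object {
            // Assumed rule: payloads that look like URLs are reference addresses,
            // everything else is treated as a local cache path.
            fun parse(raw: String): VideoDataInfo =
                if (raw.startsWith("http://") || raw.startsWith("https://"))
                    ReferenceAddress(raw)
                else
                    LocalCachePath(raw)
        }
    }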
The first application may trigger an acquire-video event so that corresponding video data is selected in the second application with which it has established the link; the second application responds to the acquire-video event, determines the video data the event points to, and pushes the video data information pointing to that video data to the first application.
In one embodiment, the second application triggers a push video event, selects a corresponding video data from its own application, and pushes the video data information pointing to the video data to the first application, so that the first application loads the video data pointed by the video data information into a live stream.
After the first application obtains the video data through the video data information, it generally calls its own built-in player to load and play the video data. Besides controlling playback through the built-in player, the first application can composite corresponding special animation effects into the video data for display, for example a video filter or an overlaid picture, or change the playback sound effect of the video data, for example by tuning it. Those skilled in the art can design the video-editing functions of the built-in player according to the actual service scenario, which is not repeated here.
Referring to fig. 8 and 9, regarding the implementation of the first application acquiring the video data information to load the video data, when the video data information includes the reference address, the specific implementation steps are as follows:
step S121, the first application receives, via the screen-casting communication link, video data information sent by the second application, where the video data information includes a reference address of a video source, and the video source is selected from a video source list provided by a graphical user interface of the second application:
and the first application receives the video data information sent by the second application through the screen projection communication link, and loads the video source selected by the second application through the video source list provided in the graphical user interface of the second application according to the reference address of the video source included in the video data information.
After the screen projection communication link is established between the second application and the first application, a corresponding video source can be selected through the video source list provided in the graphical user interface of the second application to push video data information, a plurality of video sources acquired by the second application through the internet are displayed in the video source list, and the video sources are displayed in the video source list in a visual format of pictures, texts or plain texts.
The reference address generally refers to a URL, and after the first application acquires the reference address, the first application may acquire video data pointed by the URL from a media server pointed by the reference address (URL) to load and play the video data.
Specifically, referring to fig. 9, the video source list is shown as a video source list 901 in the figure, and is displayed in a graphical user interface of a second application that establishes the screen-casting communication link with a first application, a plurality of video sources capable of screen casting to the first application are displayed in the video source list 901, and when a screen-casting push 902 control is touched, the video data information of the video source C is generated and pushed to the first application through the screen-casting communication link.
Step S122, the first application analyzes the video data information, and obtains the reference address of the video source from the video data information:
after the first reference obtains the video data information, the video data information is analyzed, and the reference address in the video data information is obtained, so that the loading of the video data pointed by the reference address is carried out.
Step S123, the first application calls the built-in player to load and play the video data pointed by the reference address:
after the first application acquires the reference address, it calls its own built-in player to load and play the video data pointed by the reference address, specifically, the first reference establishes a connection with the media server pointed by the reference address through the reference address, so as to acquire the video data pointed by the reference address from the media server, and load the video data into the built-in player to play.
Referring to fig. 10, regarding the implementation of the first application acquiring the video data information to load the video data, when the video data information includes the local cache path, the specific implementation steps are as follows:
step S121', the first application receives, via the screen-casting communication link, video data information sent by the second application, where the video data information includes a local cache path of a video source, and the video source is selected from a video source list provided by a graphical user interface of the second application:
and the first application receives the video data information sent by the second application through the screen projection communication link, and loads the video source selected by the second application through the video source list provided in the graphical user interface of the second application according to the reference address local cache path of the video source included in the video data information.
After the screen-casting communication link is established between the second application and the first application, a corresponding video source can be selected from the video source list provided in the graphical user interface of the second application in order to push the video data information. The video source list shows a plurality of video sources stored by the second application in the storage space of the device, and each video source is presented in the list in a visual format of pictures and text, or of plain text.
Step S122', the first application analyzes the video data information, and obtains a local cache path of the video data sent by the second application:
after the first application obtains the video data information, it analyzes the video data information and obtains the local cache path contained therein, so that the video data pointed to by the local cache path can be loaded.
Step S123', the first application calls a built-in player to load and play the video data pointed by the local cache path:
after the first application acquires the local cache path, it calls its built-in player to load and play the video data pointed to by the local cache path. Specifically, the first application acquires the video data pointed to by the local cache path from the storage space of the device and loads the video data into the built-in player for playing.
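A corresponding sketch for steps S122' and S123', under the same illustrative assumptions as the earlier example, differs only in that the built-in player is given a local cache path instead of a URL.

```kotlin
import android.media.MediaPlayer
import org.json.JSONObject
import java.io.File

// Sketch of steps S122'/S123': the video data information carries a local cache path,
// so the built-in player reads the file from the device's storage space instead of a URL.
fun loadAndPlayFromLocalCache(videoDataInfo: String): MediaPlayer? {
    val localCachePath = JSONObject(videoDataInfo).getString("localCachePath")
    if (!File(localCachePath).exists()) return null    // cached file missing; nothing to load
    return MediaPlayer().apply {
        setDataSource(localCachePath)                   // absolute path of the cached video file
        setOnPreparedListener { it.start() }
        prepareAsync()
    }
}
```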
Step S13, the first application pushes the live stream containing the video data to the live room controlled by it:
after the first application acquires the video data and loads the video data, the first application can selectively push a live stream containing the video data to a live broadcast room controlled by the first application.
The first application is a live broadcast service application program, and the user identity logged into it is generally an anchor user, so the terminal device running the first application acts as an anchor client. Through the video data information, the anchor client obtains the video data from the second application for loading and playing in the running first application, and can selectively synthesize the video data into a live stream so as to broadcast the live stream to the live broadcast room of the anchor client for playing and display.
When the anchor client synthesizes the video data of the second application into the live stream through the first application for broadcasting, it can also apply video synthesis operations such as animation special effects or audio special effects to the video data, so as to optimize the playing effect of the video data and enliven the atmosphere of the live broadcast room.
Specifically, referring to fig. 11, fig. 11 shows the graphical user interface of the live broadcast room of the first application when a live stream containing the video data is pushed to it. The live stream playing window 1101 plays a live stream in which the live video frame recorded by the anchor client of the first application and the video data frame 1102 are synthesized, and the synthesis position of the video data frame 1102 in the live stream can be customized by the anchor client of the live broadcast room where the first application is located. The live video frame may be recorded by the anchor client through the camera of the device, or may be the graphical user interface currently displayed on the device of the anchor client.
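For illustration only, the customizable synthesis position of the video data frame 1102 inside the live stream frame 1101 could be described by a small layout structure such as the following Kotlin sketch; the field names and the normalized-coordinate convention are assumptions of this example.

```kotlin
// Illustrative descriptor of where the video data frame (1102) is composited inside
// the live stream picture (1101); all values are fractions of the stream dimensions.
data class CompositionLayout(
    val x: Float,        // left position of window 1102, as a fraction of the stream width
    val y: Float,        // top position of window 1102, as a fraction of the stream height
    val width: Float,    // width of window 1102, as a fraction of the stream width
    val height: Float    // height of window 1102, as a fraction of the stream height
)

// Example: place the casted video in the lower-right quarter of the live picture.
val defaultLayout = CompositionLayout(x = 0.5f, y = 0.5f, width = 0.5f, height = 0.5f)
```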
Referring to fig. 12, regarding the implementation of the first application synthesizing the video data into a live stream for broadcasting, when the video data information pointing to the video data includes a reference address, the specific implementation steps are as follows:
step S131, the first application transmits the reference address to the media server to drive the media server to obtain the video data according to the reference address, so as to push the video data to the live broadcast room as a live broadcast stream:
After acquiring the reference address from the video data information, the first application pushes the reference address to the media server serving the live broadcast service of the first application, so as to drive the media server to acquire the corresponding video data from the server indicated by the reference address, synthesize the video data into the live stream of the live broadcast room where the first application is located, and push the live stream into the live broadcast room for playing.
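A possible, simplified realization of step S131 is sketched below in Kotlin: the first application hands the reference address to the media server over HTTP so that the server pulls the video itself. The endpoint path, parameter name, and room identifier are assumptions for illustration; the actual interface between the first application and its media server is not limited to this sketch.

```kotlin
import java.net.HttpURLConnection
import java.net.URL

// Illustrative sketch of step S131: hand the reference address to the media server so that
// the server itself pulls the video and mixes it into the live stream of the given room.
fun notifyMediaServer(mediaServerBase: String, roomId: String, referenceAddress: String): Int {
    val url = URL("$mediaServerBase/compose?room=$roomId")      // hypothetical endpoint
    val conn = url.openConnection() as HttpURLConnection
    return try {
        conn.requestMethod = "POST"
        conn.doOutput = true
        conn.outputStream.use { it.write("referenceAddress=$referenceAddress".toByteArray()) }
        conn.responseCode                                       // e.g. 200 when the task was accepted
    } finally {
        conn.disconnect()
    }
}
```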
In one embodiment, the first application triggers a corresponding video composition event, generates a video composition instruction, and pushes the video composition instruction to the media server. In response to the instruction, the media server synthesizes the special effect indicated by the video composition event, such as an animation special effect composition event, a video filter composition event, or an audio special effect composition event, into the live stream according to the video composition event represented by the instruction, and broadcasts the live stream to the live broadcast room where the first application is located for playing and display.
The client triggering the video composition event can be the anchor client or a spectator client. A spectator client can trigger the generation of a corresponding video composition event by giving a corresponding electronic gift to the anchor user, whereupon the corresponding video composition instruction is generated and pushed to the server to perform video composition of the video data, thereby modifying the playing effect of the video data contained in the live stream.
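By way of example, a video composition instruction such as the one described above might be serialized as a small JSON message; the event names, field names, and effect identifier below are illustrative assumptions only.

```kotlin
import org.json.JSONObject

// Illustrative kinds of composition events triggered by the anchor or a spectator client.
enum class CompositionEvent { ANIMATION_EFFECT, VIDEO_FILTER, AUDIO_EFFECT }

// Builds the instruction pushed to the media server in response to a composition event.
fun buildCompositionInstruction(roomId: String, event: CompositionEvent, effectId: String): String =
    JSONObject()
        .put("roomId", roomId)
        .put("event", event.name)                  // which kind of composition the server performs
        .put("effectId", effectId)                 // identifies the concrete special effect asset
        .toString()
```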
Step S132, the live broadcast room of the first application receives the live broadcast stream through any terminal device where the live broadcast room is located and plays the live broadcast stream in its built-in player:
After any terminal device of the live broadcast room that runs the first application receives the live stream, it loads the live stream into the built-in player of the first application for playing.
The client receives the live stream, loads it for playing through the built-in player of the first application, and controls the playing of the live stream through the built-in player, for example by pausing playback, starting playback, or adjusting the playing progress.
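As a trivial sketch of these playback controls, assuming the built-in player is represented by Android's MediaPlayer as in the earlier examples:

```kotlin
import android.media.MediaPlayer

// Minimal sketch of the playback controls mentioned above (built-in player illustrated
// by Android MediaPlayer; any player offering equivalent controls would do).
fun pause(player: MediaPlayer) = player.pause()

fun resume(player: MediaPlayer) = player.start()

fun seekTo(player: MediaPlayer, positionMs: Int) = player.seekTo(positionMs)
```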
Referring to fig. 13, regarding the implementation of the first application synthesizing the video data into a live stream for broadcasting, when the video data information pointing to the video data includes a local cache path, the specific implementation steps are as follows:
step S131', the first application transmits the video data pointed by the local cache path to the media server by using streaming media, so as to drive the media server to push the live stream containing the video data to the live broadcast room:
after the first application acquires the video data information, it obtains the local cache path included therein, acquires the corresponding video data through the local cache path, and pushes the video data to the media server as streaming media, so as to drive the media server to synthesize the video data into the live stream of the live broadcast room of the first application and broadcast the live stream to the live broadcast room for playing and display.
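A deliberately simplified Kotlin sketch of step S131' is given below: the cached video file is read in chunks and forwarded to the media server. A real implementation would normally use a streaming-media protocol such as RTMP; the plain TCP socket and the host and port values used here are placeholders for illustration only.

```kotlin
import java.io.FileInputStream
import java.net.Socket

// Simplified sketch of step S131': read the cached video file and push its bytes
// to the media server as a byte stream.
fun streamLocalVideo(localCachePath: String, serverHost: String, serverPort: Int) {
    Socket(serverHost, serverPort).use { socket ->
        FileInputStream(localCachePath).use { input ->
            val buffer = ByteArray(64 * 1024)
            var read = input.read(buffer)
            while (read != -1) {
                socket.getOutputStream().write(buffer, 0, read)   // forward the chunk to the server
                read = input.read(buffer)
            }
            socket.getOutputStream().flush()
        }
    }
}
```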
Step S132', the live broadcast room of the first application receives the live broadcast stream through any terminal device where the live broadcast room is located and plays the live broadcast stream in its built-in player:
for the implementation of this step, please refer to the corresponding embodiment in standard S132, which is not repeated herein.
The above exemplary embodiments and their variations fully disclose the embodiments of the live broadcast control method of the present application, but many further variations of the method can be derived by transforming or extending some of the technical means; other embodiments are summarized as follows:
in one embodiment, referring to fig. 14, after the step in which the first application pushes the live stream containing the video data to the live broadcast room controlled by it, the following post-step is executed:
step S14, the first application monitors a video source play control event triggered by the second application, and controls the playing of the video data in the live broadcast room in response to the play control event:
the second application that has established the screen-casting communication link with the first application can control, through the video source playing control event that it triggers, the playing in the live broadcast room of the video data pushed to the first application.
After the first application detects that the second application has triggered the video source playing control event, it responds to the event, generates a corresponding video source playing control instruction, and pushes the instruction to the media server, so as to drive the media server to control, according to the instruction, the playing in the live stream of the video data pointed to by the instruction. The video source playing control event includes adjusting the playing progress of the video data, pausing the playing of the video data, starting the playing of the video data, stopping the synthesis of the video data into the live stream, and the like.
In one embodiment, after the first application detects that the video source playing control event has been triggered, it responds to the event and controls the playing of video data that has not been pushed to the media server. The video source playing control event includes adjusting the playing progress of the video data, pausing the playing of the video data, starting the playing of the video data, canceling the playing of the video data, and the like.
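For illustration, the video source playing control events of step S14 could be mapped to instructions for the media server roughly as in the following Kotlin sketch; the event and field names are assumptions of this example rather than elements defined by the present application.

```kotlin
import org.json.JSONObject

// Illustrative control events triggered by the second application (step S14).
enum class PlayControlEvent { SEEK, PAUSE, RESUME, STOP_COMPOSITION }

// Builds the video source playing control instruction pushed to the media server.
fun buildPlayControlInstruction(roomId: String, event: PlayControlEvent, positionMs: Long = 0): String =
    JSONObject()
        .put("roomId", roomId)
        .put("event", event.name)
        .put("positionMs", positionMs)             // only meaningful for SEEK
        .toString()
```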
Further, a live broadcast control apparatus of the present application can be constructed by functionalizing the steps in the methods disclosed in the above embodiments, and according to this idea, please refer to fig. 15, wherein in an exemplary embodiment, the apparatus includes: the screen-casting communication link establishing module 11 is used for establishing a screen-casting communication link between a first application and a second application in the same-machine equipment; the video data information pushing module 12 is configured to receive, by the first application, video data information sent by the second application, and load video data pointed by the video data information; and the live stream pushing module 13 is configured to push the live stream containing the video data to the live room controlled by the first application.
In one embodiment, the screen-casting communication link establishing module 11 includes: a multicast message receiving submodule, used for the first application to receive a multicast message broadcast by the second application according to a preset protocol specification for searching for a first application suitable for establishing the screen-casting communication link; a device information response submodule, used for the first application to respond to the multicast message and feed back the device information pre-configured by the first application to the second application in a unicast manner, wherein the device information contains communication configuration information; and an application binding submodule, used for establishing the screen-casting communication link between the first application and the second application according to the communication configuration information to realize binding.
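A rough Kotlin sketch of the discovery flow handled by these submodules is given below: the first application listens for the second application's multicast search message and answers in unicast with its pre-configured device information. The multicast group, port, and message contents are placeholders (the SSDP defaults commonly used with DLNA), not values fixed by the present application.

```kotlin
import java.net.DatagramPacket
import java.net.DatagramSocket
import java.net.InetAddress
import java.net.MulticastSocket

// The first application receives the multicast search message and replies in unicast
// with its pre-configured device information (which contains the communication configuration).
fun listenAndReply(deviceInfo: String) {
    val group = InetAddress.getByName("239.255.255.250")         // placeholder multicast group
    MulticastSocket(1900).use { socket ->                         // placeholder port
        socket.joinGroup(group)
        val buffer = ByteArray(2048)
        val packet = DatagramPacket(buffer, buffer.size)
        socket.receive(packet)                                    // multicast search from the second application
        val reply = deviceInfo.toByteArray()
        DatagramSocket().use { unicast ->
            unicast.send(DatagramPacket(reply, reply.size, packet.address, packet.port))
        }
    }
}
```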
In another embodiment, the screen-casting communication link establishing module 11 further includes: a list searching submodule, used for searching for and displaying a list of available programs belonging to the second application in the same-machine equipment; a device information selection submodule, used for sending the device information pre-configured by the first application to a second application selected from the available program list, wherein the device information contains communication configuration information, so as to activate the second application to enter a state of establishing the screen-casting communication link with the first application; and an application binding submodule, used for establishing the screen-casting communication link between the first application and the second application according to the communication configuration information to realize binding.
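For the list searching submodule, one illustrative way to enumerate candidate second applications on an Android device is to query the package manager for applications declaring a dedicated intent action, as sketched below; the action string is a hypothetical convention between the two applications, not an Android standard.

```kotlin
import android.content.Context
import android.content.Intent
import android.content.pm.ResolveInfo

// Sketch of the list searching submodule: find installed applications on the same device
// that declare themselves able to act as the second application (screen-casting source).
fun findAvailablePrograms(context: Context): List<ResolveInfo> {
    val intent = Intent("com.example.action.SCREEN_CAST_SOURCE")   // hypothetical action string
    return context.packageManager.queryIntentActivities(intent, 0)
}
```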
In one embodiment, the video data information pushing module 12 includes: a video data information receiving submodule, used for the first application to receive, through the screen projection communication link, the video data information sent by the second application, wherein the video data information includes a reference address of a video source selected from a video source list provided by the graphical user interface of the second application; a reference address acquisition submodule, used for the first application to analyze the video data information and acquire the reference address of the video source from it; and a video data loading submodule, used for the first application to call the built-in player to load and play the video data pointed to by the reference address.
In another embodiment, the video data information pushing module 12 includes: a video data information receiving submodule, used for the first application to receive, through the screen projection communication link, the video data information sent by the second application, wherein the video data information includes a local cache path of a video source selected from a video source list provided by the graphical user interface of the second application; a local cache path acquisition submodule, used for the first application to analyze the video data information and acquire the local cache path of the video data sent by the second application; and a video data loading submodule, used for the first application to call the built-in player to load and play the video data pointed to by the local cache path.
In one embodiment, the live stream pushing module 13 includes: a reference address pushing submodule, used for the first application to transmit the reference address to the media server so as to drive the media server to acquire the video data according to the reference address and push the video data to the live broadcast room as a live stream; and a live stream playing submodule, used for the live broadcast room of the first application to receive the live stream through any terminal device where the live broadcast room is located and play the live stream in its built-in player.
In another embodiment, the live stream pushing module 13 further includes: a local cache path pushing submodule, used for the first application to transmit the video data pointed to by the local cache path to the media server as streaming media, so as to drive the media server to push the live stream containing the video data to the live broadcast room; and a live stream playing submodule, used for the live broadcast room of the first application to receive the live stream through any terminal device where the live broadcast room is located and play the live stream in its built-in player.
In order to solve the above technical problem, an embodiment of the present application further provides a computer device, configured to run a computer program implemented according to the live broadcast control method. Referring to fig. 16, fig. 16 is a block diagram of a basic structure of a computer device according to the present embodiment.
As shown in fig. 16, the figure schematically illustrates the internal structure of the computer device. The computer device includes a processor, a non-volatile storage medium, a memory, and a network interface connected by a system bus. The non-volatile storage medium of the computer device stores an operating system, a database, and computer-readable instructions; the database can store control information sequences, and the computer-readable instructions, when executed by the processor, can cause the processor to implement a live broadcast control method. The processor of the computer device provides computing and control capability and supports the operation of the whole computer device. The memory of the computer device may store computer-readable instructions that, when executed by the processor, cause the processor to perform the live broadcast control method. The network interface of the computer device is used for connecting and communicating with a terminal. Those skilled in the art will appreciate that the architecture shown in fig. 16 is merely a block diagram of some of the structures associated with the disclosed aspects and does not limit the computer devices to which the disclosed aspects apply; a particular computer device may include more or fewer components than those shown, combine certain components, or have a different arrangement of components.
The processor in this embodiment is used to execute the specific functions of each module and submodule of the live broadcast control apparatus of the present application, and the memory stores the program code and the various data required for executing these modules. The network interface is used for data transmission with a user terminal or a server. The memory in this embodiment stores the program code and data required for executing all of the modules and submodules of the live broadcast control apparatus, so that the server can call them to execute the functions of each submodule.
The present application also provides a non-volatile storage medium, in which the live broadcast control method, written as a computer program, is stored in the form of computer-readable instructions. When the computer-readable instructions are executed by one or more processors, the program runs in a computer, thereby causing the one or more processors to perform the steps of the live broadcast control method of any of the above embodiments.
It will be understood by those skilled in the art that all or part of the processes of the methods of the embodiments described above can be implemented by a computer program, which can be stored in a computer-readable storage medium, and can include the processes of the embodiments of the methods described above when the computer program is executed. The storage medium may be a non-volatile storage medium such as a magnetic disk, an optical disk, a Read-Only Memory (ROM), or a Random Access Memory (RAM).
To sum up, the present application constructs a video screen-casting function between different applications in the same device. By establishing a screen-casting communication link, an application can acquire corresponding video data from another application in the same device for loading and playing, and can load the video data into a live stream that is broadcast to a live broadcast room for playing and display.
It should be understood that, although the steps in the flowcharts of the figures are shown in the order indicated by the arrows, these steps are not necessarily performed in that order. Unless explicitly stated herein, the execution of these steps is not strictly limited in order, and they may be performed in other orders. Moreover, at least some of the steps in the flowcharts may include multiple sub-steps or multiple stages, which are not necessarily completed at the same moment but may be executed at different moments, and whose execution order is not necessarily sequential; they may be executed in turn or alternately with other steps, or with at least some of the sub-steps or stages of other steps.
Those skilled in the art will appreciate that the various operations, methods, steps, measures, and schemes in the flows discussed in this application can be interchanged, modified, combined, or deleted. Further, other steps, measures, and schemes in the various operations, methods, and flows discussed in this application can also be interchanged, modified, rearranged, decomposed, combined, or deleted. Further, steps, measures, and schemes in the prior art corresponding to the various operations, methods, and flows disclosed in the present application can also be interchanged, modified, rearranged, decomposed, combined, or deleted.
The foregoing is only a partial embodiment of the present application. It should be noted that, for those skilled in the art, several improvements and modifications can be made without departing from the principle of the present application, and these improvements and modifications should also be regarded as falling within the protection scope of the present application.

Claims (12)

1. A live broadcast control method is characterized by comprising the following steps:
the method comprises the steps that a screen projection communication link is established between a first application and a second application in the same-machine equipment;
the first application receives the video data information sent by the second application and loads the video data pointed by the video data information;
the first application pushes a live stream containing the video data to a live room controlled by the first application.
2. The method of claim 1, wherein the step of the first application establishing a screen-cast communication link with a second application in the co-located device comprises:
the first application receives a multicast message broadcasted by the second application according to a preset protocol specification and used for searching the first application suitable for establishing the screen-casting communication link;
the first application responds to the multicast message and feeds back equipment information preconfigured by the first application to the second application in a unicast mode, and the equipment information comprises communication configuration information;
and the first application and the second application establish a screen projection communication link according to the communication configuration information to realize binding.
3. The method of claim 1, wherein the step of the first application establishing a screen-cast communication link with a second application in the co-located device comprises:
the first application searches and displays an available program list belonging to the second application in the same-machine equipment;
the first application sends equipment information pre-configured by the first application to a second application selected from the available program list, wherein the equipment information comprises communication configuration information so as to activate the second application to enter a state of establishing a screen projection communication link with the first application;
and the first application and the second application establish a screen projection communication link according to the communication configuration information to realize binding.
4. The method of claim 1, wherein the step of receiving, by the first application, the video data information sent by the second application, and loading the video data pointed to by the video data information comprises:
the first application receives video data information sent by the second application through the screen projection communication link, the video data information comprises a reference address of a video source, and the video source is selected from a video source list provided by a graphical user interface of the second application;
the first application analyzes the video data information and obtains a reference address of a video source from the video data information;
and the first application calls the built-in player to load and play the video data pointed by the reference address.
5. The method of claim 1, wherein the step of receiving, by the first application, the video data information sent by the second application, and loading the video data pointed to by the video data information comprises:
the first application receives video data information sent by the second application through the screen projection communication link, the video data information comprises a local cache path of a video source, and the video source is selected from a video source list provided by a graphical user interface of the second application;
the first application analyzes video data information and obtains a local cache path of the video data sent by the second application;
and the first application calls a built-in player to load and play the video data pointed by the local cache path.
6. The method of claim 4, wherein the step of the first application pushing the live stream containing the video data to the live room controlled by the first application comprises:
the first application transmits the reference address to a media server to drive the media server to acquire video data according to the reference address, so that the video data is pushed to the live broadcast room as live broadcast stream;
and the live broadcast room of the first application receives the live broadcast stream through any terminal equipment where the live broadcast room is located and plays the live broadcast stream in an internal player of the live broadcast room.
7. The method of claim 6, wherein the step of the first application pushing the live stream containing the video data to the live room controlled by the first application comprises:
the first application transmits the video data pointed by the local cache path to a media server by streaming media so as to drive the media server to push the live stream containing the video data to the live broadcast room;
and the live broadcast room of the first application receives the live broadcast stream through any terminal equipment where the live broadcast room is located and plays the live broadcast stream in an internal player of the live broadcast room.
8. The method according to any one of claims 1 to 7, characterized in that, after the step in which the first application pushes a live stream containing the video data to the live broadcast room controlled by the first application, the following post-step is performed:
the first application monitors a video source playing control event triggered by the second application, and correspondingly controls the playing of the video data in the live broadcast room in response to the playing control event.
9. The method according to any one of claims 1 to 8, wherein in the step of binding the screen-casting communication link between the first application and the second application in the co-operating device, the first application and the second application are configured to be respectively located in virtual devices with different communication addresses from each other, so that the established screen-casting communication link conforms to the communication specification of the DLNA protocol.
10. A live control apparatus, comprising:
the screen-casting communication link establishing module is used for establishing a screen-casting communication link between the first application and a second application in the same-machine equipment;
the video data information pushing module is used for receiving the video data information sent by the second application by the first application and loading video data pointed by the video data information;
and the live stream pushing module is used for pushing the live stream containing the video data to a live room controlled by the first application.
11. An electronic device comprising a central processor and a memory, wherein the central processor is configured to invoke execution of a computer program stored in the memory to perform the steps of the method according to any one of claims 1 to 9.
12. A non-volatile storage medium, characterized in that it stores, in the form of computer-readable instructions, a computer program implemented according to the method of any one of claims 1 to 9, which, when invoked by a computer, performs the steps comprised by the method.
CN202111017139.3A 2021-08-31 2021-08-31 Live broadcast control method and device, equipment and medium thereof Active CN113727133B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111017139.3A CN113727133B (en) 2021-08-31 2021-08-31 Live broadcast control method and device, equipment and medium thereof

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111017139.3A CN113727133B (en) 2021-08-31 2021-08-31 Live broadcast control method and device, equipment and medium thereof

Publications (2)

Publication Number Publication Date
CN113727133A true CN113727133A (en) 2021-11-30
CN113727133B CN113727133B (en) 2023-04-28

Family

ID=78680145

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111017139.3A Active CN113727133B (en) 2021-08-31 2021-08-31 Live broadcast control method and device, equipment and medium thereof

Country Status (1)

Country Link
CN (1) CN113727133B (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2018001218A1 (en) * 2016-06-27 2018-01-04 北京奇虎科技有限公司 Video playing method, device, program and medium
WO2018001201A1 (en) * 2016-06-27 2018-01-04 北京奇虎科技有限公司 Video push method, device, program and medium
CN108027706A (en) * 2016-08-31 2018-05-11 华为技术有限公司 A kind of application interface display methods and terminal device
CN108174256A (en) * 2017-12-29 2018-06-15 深圳Tcl数字技术有限公司 Video broadcasting method, device and computer readable storage medium
CN112312222A (en) * 2019-10-31 2021-02-02 北京字节跳动网络技术有限公司 Video sending method and device and electronic equipment
CN112788362A (en) * 2020-12-25 2021-05-11 北京小米移动软件有限公司 Video playing method, video playing device and storage medium

Also Published As

Publication number Publication date
CN113727133B (en) 2023-04-28

Similar Documents

Publication Publication Date Title
CN106658205B (en) Live broadcast room video stream synthesis control method and device and terminal equipment
JP4655190B2 (en) Information processing apparatus and method, recording medium, and program
EP2569937B1 (en) Systems and methods for real-time multimedia communication across multiple standards and proprietary devices
EP2335411B1 (en) Communication system and method
CN113727178B (en) Screen-throwing resource control method and device, equipment and medium thereof
US8473994B2 (en) Communication system and method
CN108093267B (en) Live broadcast method and device, storage medium and electronic equipment
US9654726B2 (en) Peripheral device for communication over a communications system
US20150121252A1 (en) Combined Data Streams for Group Calls
CN111970526B (en) Interface notification message processing method, device, equipment and storage medium
CN113457123B (en) Interaction method and device based on cloud game, electronic equipment and readable storage medium
US20070039025A1 (en) Method for application sharing
CN110910860B (en) Online KTV implementation method and device, electronic equipment and storage medium
WO2008125593A2 (en) Virtual reality-based teleconferencing
CN110012362B (en) Live broadcast voice processing method, device, equipment and storage medium
US8924477B2 (en) Real-time meeting object extensibility
CN113573083A (en) Live wheat-connecting interaction method and device and computer equipment
CN115314727A (en) Live broadcast interaction method and device based on virtual object and electronic equipment
CN113727177A (en) Screen-projecting resource playing method and device, equipment and medium thereof
CN117176999A (en) Multi-person wheat connecting method, device, computer equipment and storage medium
CN113727133B (en) Live broadcast control method and device, equipment and medium thereof
CN113727180A (en) Screen projection playing control method and device, equipment and medium thereof
US11778011B2 (en) Live streaming architecture with server-side stream mixing
KR102586186B1 (en) Hybrid server and hybrid server operation method for providing videotelephony service
CN114979756B (en) Method, device and equipment for realizing one-to-many screen-throwing independent display and interaction

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant