CN111919451A - Live broadcasting method, live broadcasting device and terminal - Google Patents
- Publication number
- CN111919451A (application number CN202080001125.XA / CN202080001125A)
- Authority
- CN
- China
- Prior art keywords
- terminal
- live broadcast
- video recording
- video
- real
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/20—Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
- H04N21/21—Server components or server architectures
- H04N21/218—Source of audio or video content, e.g. local disk arrays
- H04N21/2187—Live feed
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/431—Generation of visual interfaces for content selection or interaction; Content or additional data rendering
- H04N21/4312—Generation of visual interfaces for content selection or interaction; Content or additional data rendering involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/76—Television signal recording
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/76—Television signal recording
- H04N5/91—Television signal processing therefor
- H04N5/93—Regeneration of the television signal or of selected parts thereof
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Databases & Information Systems (AREA)
- Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
The application belongs to the field of video technology and provides a live broadcast method, a live broadcast device, a terminal, and a computer-readable storage medium. The method includes the following steps: when a live broadcast request is received, acquiring the real-time picture captured by each second terminal; displaying each real-time picture in a different live broadcast window on a display screen; and, if a window-adjustment request carrying size information and layout information is received, adjusting the size of a target live broadcast window according to the size information and the position of the target live broadcast window according to the layout information, where the target live broadcast window is the live broadcast window indicated by the window-adjustment request. With this method, the user can conveniently find content of interest among multiple real-time pictures.
Description
Technical Field
The application belongs to the field of video technology and in particular relates to a live broadcast method, a live broadcast device, a terminal, and a computer-readable storage medium.
Background
With the development of network communication technology, live broadcasting has become increasingly popular. In the prior art, one or more cameras can frame a scene from different angles, and the real-time picture captured by each camera is broadcast live on a display screen. However, when too many real-time pictures are displayed at once, the content on the screen becomes cluttered, giving the user a poor viewing experience.
Disclosure of Invention
In view of this, the present application provides a live broadcast method, a live broadcast device, a terminal, and a computer-readable storage medium that enable a user to conveniently find content of interest among multiple real-time pictures.
In a first aspect, the present application provides a live broadcast method applied to a first terminal having a display screen, where the first terminal and at least two second terminals are on the same network, and the at least two second terminals are configured to shoot the same scene from different camera positions. The live broadcast method includes:
when a live broadcast request is received, acquiring the real-time picture captured by each second terminal;
displaying each real-time picture in a different live broadcast window on the display screen;
and, if a window-adjustment request carrying size information and layout information is received, adjusting the size of a target live broadcast window according to the size information and the position of the target live broadcast window according to the layout information, where the target live broadcast window is the live broadcast window indicated by the window-adjustment request.
In a second aspect, the present application provides a live broadcast device applied to a first terminal having a display screen, where the first terminal and at least two second terminals are on the same network, and the at least two second terminals are configured to shoot the same scene from different camera positions. The device includes:
a real-time picture acquiring unit, configured to acquire the real-time picture captured by each second terminal when a live broadcast request is received;
a real-time picture display unit, configured to display each real-time picture in a different live broadcast window on the display screen;
and a live broadcast window adjusting unit, configured to, if a window-adjustment request carrying size information and layout information is received, adjust the size of a target live broadcast window according to the size information and the position of the target live broadcast window according to the layout information, where the target live broadcast window is the live broadcast window indicated by the window-adjustment request.
In a third aspect, the present application provides a terminal comprising a display screen, a memory, a processor, and a computer program stored in the memory and executable on the processor, where the processor, when executing the computer program, implements the method provided in the first aspect.
In a fourth aspect, the present application provides a computer-readable storage medium storing a computer program which, when executed by a processor, implements the method provided in the first aspect.
In a fifth aspect, the present application provides a computer program product which, when run on a terminal, causes the terminal to perform the method provided in the first aspect.
As can be seen from the above, in the present application, when a live broadcast request is received, the real-time picture captured by each second terminal is first acquired, and each real-time picture is then displayed in a different live broadcast window on a display screen. If a window-adjustment request carrying size information and layout information is received, the size of the target live broadcast window is adjusted according to the size information and its position according to the layout information, where the target live broadcast window is the live broadcast window indicated by the window-adjustment request. In this scheme, at least two second terminals shoot the same scene from different camera positions, and the real-time picture captured by each second terminal is broadcast live in a different live broadcast window on the display screen of the first terminal. While watching the live broadcast, the user can adjust the size and position of any live broadcast window at any time as needed, which keeps the display of the live broadcast windows orderly and makes it convenient for the user to find content of interest among the many real-time pictures.
Drawings
To illustrate the technical solutions in the embodiments of the present application more clearly, the drawings required by the embodiments or the prior art description are briefly introduced below. Obviously, the drawings in the following description show only some embodiments of the present application, and those skilled in the art can derive other drawings from them without creative effort.
Fig. 1 is a schematic diagram of a live scene provided in an embodiment of the present application;
fig. 2 is a topology diagram of a first terminal and a second terminal provided in an embodiment of the present application;
fig. 3 is a schematic flowchart of a live broadcasting method provided in an embodiment of the present application;
fig. 4 is a block diagram of a live broadcast device provided in an embodiment of the present application;
fig. 5 is a schematic structural diagram of a terminal according to an embodiment of the present application.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system structures, techniques, etc. in order to provide a thorough understanding of the embodiments of the present application. It will be apparent, however, to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present application with unnecessary detail.
It will be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It should also be understood that the term "and/or" as used in this specification and the appended claims refers to and includes any and all possible combinations of one or more of the associated listed items.
As used in this specification and the appended claims, the term "if" may be interpreted contextually as "when," "upon," "in response to determining," or "in response to detecting." Similarly, the phrase "if it is determined" or "if a [described condition or event] is detected" may be interpreted contextually to mean "upon determining," "in response to determining," "upon detecting [the described condition or event]," or "in response to detecting [the described condition or event]."
Furthermore, in the description of the present application and the appended claims, the terms "first," "second," "third," and the like are used for distinguishing between descriptions and not necessarily for describing or implying relative importance.
Reference throughout this specification to "one embodiment" or "some embodiments," or the like, means that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of the present application. Thus, appearances of the phrases "in one embodiment," "in some embodiments," "in other embodiments," or the like, in various places throughout this specification are not necessarily all referring to the same embodiment, but rather "one or more but not all embodiments" unless specifically stated otherwise. The terms "comprising," "including," "having," and variations thereof mean "including, but not limited to," unless expressly specified otherwise.
In order to better understand a live broadcast method provided by the embodiment of the present application, a live broadcast scene in the embodiment of the present application is described first below.
Fig. 1 shows a schematic diagram of a live scene provided in an embodiment of the present application. The live scene includes a first terminal with a display screen and six second terminals (i.e., the mobile phones in Fig. 1) distributed at different locations. The camera of each second terminal is aimed at the experience area, and the real-time pictures captured by the second terminals are displayed synchronously on the display screen. When a user plays golf in the experience area, the user's hitting posture can be captured from six viewing angles by the six second terminals, and the real-time pictures of the six second terminals are displayed on the display screen, helping the user adjust his or her hitting posture.
Fig. 2 shows a topology diagram of the first terminal and the second terminals in an embodiment of the present application, in which the first terminal includes an 86-inch 4K display screen and a highly configured computer host equipped with a built-in capture card. Each second terminal is a mobile phone connected to the computer host through a Type-C extension cable, a docking station, a Universal Serial Bus (USB) capture card, and a High-Definition Multimedia Interface (HDMI).
Fig. 3 shows a flowchart of a live broadcasting method provided in an embodiment of the present application, where the live broadcasting method is applied to a first terminal, and is detailed as follows:
In an embodiment of the present application, the first terminal and the at least two second terminals are on the same network, which may be a fifth-generation mobile communication network. Because a fifth-generation mobile communication network offers high transmission speed, low live broadcast latency can be ensured. Each second terminal shoots the same scene (such as the experience area in Fig. 1) from a different camera position, and the second terminal may be a mobile phone. Because a mobile phone is light, the user can quickly and conveniently adjust its placement and shooting angle. When the first terminal receives a live broadcast request, it acquires the real-time picture captured by each second terminal. The live broadcast request may be sent by a user through a control terminal (such as a tablet computer) that controls the first terminal and the second terminals, and each second terminal uploads its real-time picture through a picture transmission technology. Optionally, live broadcast parameters may be set in each second terminal in advance, where the live broadcast parameters include a terminal identifier, an Internet Protocol (IP) address, a camera resolution, and the like.
In this embodiment of the application, the first terminal has a display screen. To give the user a better viewing experience, the display screen may be as large as practical, for example 86 inches. After the first terminal acquires the real-time pictures captured by the second terminals, each real-time picture can be displayed on the display screen. Specifically, a plurality of live broadcast windows can be arranged on the display screen, the number of live broadcast windows being equal to the number of second terminals, and each real-time picture is then displayed in a different live broadcast window.
For example, assume there are 3 second terminals and, correspondingly, 3 real-time pictures: real-time picture 1, real-time picture 2, and real-time picture 3. To display each real-time picture in a different live broadcast window, 3 live broadcast windows can be arranged on the display screen: live broadcast window 1, live broadcast window 2, and live broadcast window 3. Real-time picture 1 is then displayed in live broadcast window 1, real-time picture 2 in live broadcast window 2, and real-time picture 3 in live broadcast window 3.
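The one-window-per-terminal mapping in the example above can be sketched as follows; window names such as `window_1` are illustrative only, not part of the patent:

```python
# Minimal sketch of the display step: given N real-time pictures, create N live
# broadcast windows and show picture i in window i (one window per second terminal).
def assign_windows(pictures):
    """Return a mapping from live broadcast window name to real-time picture."""
    return {f"window_{i}": pic for i, pic in enumerate(pictures, start=1)}

# Three second terminals -> three live broadcast windows.
layout = assign_windows(["picture_1", "picture_2", "picture_3"])
```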
In the embodiment of the application, the window-adjustment request may be sent by a user through a control terminal (e.g., a tablet computer), and the size information and layout information may be set by the user as needed. The window-adjustment request instructs the first terminal to adjust the size and position of the target live broadcast window on the display screen: the size information indicates the target size to which the target live broadcast window should be adjusted, and the layout information indicates the target position to which it should be moved. If the first terminal receives a window-adjustment request carrying size information and layout information, it adjusts the size of the target live broadcast window to the target size according to the size information and moves the target live broadcast window to the target position according to the layout information.
The target live broadcast window is the live broadcast window indicated by the window-adjustment request. Optionally, each second terminal uploads its terminal identifier while uploading its real-time picture, and the first terminal establishes a correspondence between each real-time picture and the corresponding terminal identifier. If the window-adjustment request also carries a terminal identifier, the live broadcast window displaying the real-time picture corresponding to that terminal identifier is determined to be the target live broadcast window.
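The adjustment step above can be sketched as a lookup followed by two updates. The data shapes (dicts, key names, tuple coordinates) are assumptions made for illustration:

```python
# Hedged sketch of handling a window-adjustment request that carries a terminal
# identifier: resolve the live window showing that terminal's picture, then apply
# the target size (from the size information) and target position (from the
# layout information).
def adjust_target_window(windows, terminal_to_window, request):
    """Resolve the target live broadcast window and apply size/layout information."""
    target = terminal_to_window[request["terminal_id"]]
    windows[target]["size"] = request["size_info"]        # target size
    windows[target]["position"] = request["layout_info"]  # target position
    return target

windows = {"window_1": {"size": (640, 360), "position": (0, 0)}}
mapping = {"phone-01": "window_1"}  # terminal identifier -> live broadcast window
req = {"terminal_id": "phone-01", "size_info": (1280, 720), "layout_info": (100, 50)}
```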
Optionally, in order to be able to extract a specific segment of a live broadcast, the live broadcast method may further include:
A1, when a video recording request carrying a video recording duration is received, controlling each second terminal to record its real-time picture based on the video recording duration;
A2, acquiring the video file recorded by each second terminal.
In the embodiment of the application, the video recording request may be sent by a user through a control terminal (such as a tablet computer), and the video recording duration may be set by the user as needed. The video recording request instructs the second terminals to record their real-time pictures, and the video recording duration specifies how long each second terminal records. When the first terminal receives a video recording request carrying a video recording duration, it controls each second terminal on the same network to record its real-time picture based on that duration. After recording its real-time picture, each second terminal obtains a video file and uploads it to the first terminal through the File Transfer Protocol (FTP); the first terminal receives and stores the video file recorded by each second terminal.
For example, assume that the display screen is displaying the real-time pictures during the period from 2 min 10 s to 100 min 10 s. If the first terminal receives, at 48 min 20 s, a video recording request carrying a video recording duration of 5 minutes, each second terminal records its real-time picture from 48 min 20 s to 53 min 20 s.
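The timing in this example is simple arithmetic on seconds: recording starts when the request arrives and ends after the video recording duration elapses. The helper below is hypothetical, not from the patent:

```python
# Sketch of the recording-window arithmetic: request time plus video recording
# duration gives the end of the recording interval.
def recording_interval(request_time_s, duration_s):
    """Return the (start, end) of the recording window in seconds."""
    return request_time_s, request_time_s + duration_s

# Request at 48 min 20 s with a 5-minute duration -> recording until 53 min 20 s.
start, end = recording_interval(48 * 60 + 20, 5 * 60)
```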
Optionally, the video recording duration may instead be set in each second terminal in advance. In that case the duration need not be carried in the video recording request, and when the first terminal receives a video recording request, it controls each second terminal to record its real-time picture based on the duration preset in that second terminal.
Based on this embodiment, if the user spots a highlight while watching the live broadcast, the user can record that highlight at any time without recording the whole live broadcast.
Optionally, after the first terminal acquires the video files recorded by the second terminals, the video files can be uploaded to a preset cloud server so that they can be recovered if lost locally; for example, the video files can be uploaded to the cloud server through the Hypertext Transfer Protocol. In this way, even if a video file on the first terminal is deleted, it can be retrieved from the cloud server.
Optionally, step A1 specifically includes:
when a video recording request carrying a video recording duration is received, sending a video recording batch identifier and the video recording duration to each second terminal, so as to instruct each second terminal to record its real-time picture based on the video recording duration and obtain a video file.
In the embodiment of the application, each time a video recording request is received, the first terminal generates a video recording batch identifier corresponding to that request and then sends the batch identifier and the video recording duration to each second terminal. On receiving the batch identifier and the duration, each second terminal records its real-time picture to obtain a video file, and each video file contains the batch identifier. The video recording batch identifier indicates the batch to which a video file belongs: video files containing the same batch identifier belong to the same batch; in other words, they were recorded by the second terminals during the same time period.
For example, when the first terminal receives a video recording request at time t1, it generates video recording batch identifier 1 and sends batch identifier 1 and the video recording duration to each second terminal; each second terminal then records its real-time picture based on the duration, and the resulting video file of each second terminal contains batch identifier 1. When the first terminal receives a video recording request at time t2, it generates video recording batch identifier 2 and sends batch identifier 2 and the video recording duration to each second terminal; each second terminal again records its real-time picture based on the duration, and the resulting video file of each second terminal contains batch identifier 2. The batch to which a video file belongs can therefore be determined by reading the batch identifier in the video file.
Illustratively, when a video recording request carrying a video recording duration is received, the first terminal first obtains the current timestamp and generates a random number, splices the current timestamp and the random number into a string of digits, uses that string as the video recording batch identifier, and finally broadcasts the batch identifier and the video recording duration to each second terminal. Because each video file contains the batch identifier, the first terminal can obtain the recording time corresponding to a video file by reading the batch identifier in it. Optionally, the first terminal may further add the terminal identifier of a second terminal to its video file; for example, after the first terminal acquires the video file recorded by a given second terminal, it adds that second terminal's terminal identifier to the file. The second terminal that produced a video file can then be determined by reading the terminal identifier in the file.
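The timestamp-plus-random-number splicing described above can be sketched as follows. The exact field widths are an assumption; the patent only says the two values are spliced into a string of digits:

```python
import random
import time

# Hedged sketch of the batch-identifier scheme: splice the current Unix timestamp
# with a random number into one digit string. The 4-digit zero-padded random
# field is an illustrative choice, not specified by the patent.
def make_batch_id(now=None, rnd=None):
    """Concatenate a timestamp and a 4-digit random number into a batch identifier."""
    ts = int(time.time()) if now is None else int(now)
    r = random.randint(0, 9999) if rnd is None else rnd
    return f"{ts}{r:04d}"
```

Because the timestamp is embedded at a fixed position, the first terminal can later recover the recording time from the identifier, as the paragraph above describes.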
Optionally, in order to enable the user to view the video files recorded by the second terminals at any time, the live broadcast method may further include:
B1, when a playback request is received, detecting whether the M video files pointed to by the playback request are stored in the first terminal;
B2, if the M video files are stored in the first terminal, playing the M video files through different playback windows on the display screen.
In this embodiment, the playback request may be sent by a user through a control terminal (e.g., a tablet computer). The M video files pointed to by the playback request are the video files selected by the user, where M is equal to the number of second terminals. When the first terminal receives the playback request, it first detects whether the M video files pointed to by the request are stored locally; only when all M video files are stored in the first terminal are they played through different playback windows on the display screen.
Optionally, if the M video files pointed to by the playback request are not yet stored in the first terminal, the first terminal may cyclically detect whether they are stored, jumping out of the loop once all M video files are detected and then playing them through different playback windows on the display screen. In this way, the first terminal can respond to a playback request and play the video files in the shortest possible time.
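The cyclic detection described above can be sketched as a bounded polling loop. The `get_stored` callable stands in for a local-storage lookup and, like `max_polls`, is an assumption of this sketch (the patent does not bound the loop):

```python
# Minimal sketch of the playback check-and-wait behaviour: poll local storage
# until all M requested video files are present, then jump out of the loop.
def wait_for_files(requested, get_stored, max_polls=100):
    """Loop until every requested video file is stored; return True on success."""
    for _ in range(max_polls):
        stored = get_stored()
        if all(name in stored for name in requested):
            return True  # all M files present: playback can start
    return False
```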
For example, the playback request may carry a video recording batch identifier. When the first terminal receives the playback request, it detects whether M video files containing that batch identifier are stored locally; those M video files are the M video files pointed to by the playback request.
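Resolving the M files from the batch identifier is a simple filter over stored file records; the dict shape of a file record here is illustrative only:

```python
# Sketch of resolving the M video files pointed to by a playback request via
# the video recording batch identifier carried in the request.
def files_for_batch(stored_files, batch_id):
    """Return the stored video files whose batch identifier matches the request."""
    return [f for f in stored_files if f["batch_id"] == batch_id]
```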
Optionally, before playing the M video files, the first terminal may read a preset playback configuration file containing layout information and size information for the playback windows; the first terminal can then set the position of each playback window on the display screen according to the layout information and its size according to the size information.
Optionally, after step B2, the method further includes:
generating a two-dimensional code storing the link addresses of the M video files;
and displaying the two-dimensional code on the display screen.
In the embodiment of the application, considering that a user may wish to share the video files with others, after the M video files are played, a two-dimensional code storing their link addresses may be generated. A link address may correspond to a video file stored in the first terminal or to a video file stored in the cloud server. After generating the two-dimensional code, the first terminal displays it on the display screen. When someone scans the two-dimensional code with a user terminal, the user terminal obtains the link addresses and uses them to acquire and play the M video files.
Optionally, if the link address corresponds to video files stored in the cloud server, the step of generating the two-dimensional code storing the link addresses of the M video files may specifically include: the first terminal sends the video recording batch identifier to the cloud server; the cloud server detects whether it stores M video files containing that batch identifier and, if so, generates a Hypertext Markup Language 5 (HTML5) page corresponding to the M video files and the link address of that HTML5 page, and sends the link address to the first terminal; after receiving the link address, the first terminal generates a two-dimensional code storing it. The HTML5 page is optimized for adaptability and compatibility, supports access from both computers and mobile phones, and also supports sharing on various social platforms.
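The sharing step can be sketched as building the HTML5 page link that the two-dimensional code will store. The URL pattern below is entirely an assumption, and rendering the code itself would need a QR library, which is omitted here:

```python
# Hedged sketch: after the cloud server returns the link address of the HTML5
# page for a batch of M video files, the first terminal encodes that address in
# a two-dimensional code. Only the (hypothetical) link construction is shown.
def build_share_link(server_base, batch_id):
    """Build the HTML5 page link address that the two-dimensional code stores."""
    return f"{server_base}/playback?batch={batch_id}"

link = build_share_link("https://cloud.example.com", "17000000000007")
# A real implementation might then render the code with a QR library and
# display the resulting image on the first terminal's display screen.
```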
As can be seen from the above, in the present application, when a live broadcast request is received, the real-time picture captured by each second terminal is first acquired, and each real-time picture is then displayed in a different live broadcast window on a display screen. If a window-adjustment request carrying size information and layout information is received, the size of the target live broadcast window is adjusted according to the size information and its position according to the layout information, where the target live broadcast window is the live broadcast window indicated by the window-adjustment request. In this scheme, at least two second terminals shoot the same scene from different camera positions, and the real-time picture captured by each second terminal is broadcast live in a different live broadcast window on the display screen of the first terminal. While watching the live broadcast, the user can adjust the size and position of any live broadcast window at any time as needed, which keeps the display of the live broadcast windows orderly and makes it convenient for the user to find content of interest among the many real-time pictures.
It should be understood that, the sequence numbers of the steps in the foregoing embodiments do not imply an execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not constitute any limitation to the implementation process of the embodiments of the present application.
Fig. 4 shows a block diagram of a live broadcast device according to an embodiment of the present application. The live broadcast device is applied to a first terminal having a display screen, where the first terminal and at least two second terminals are on the same network, and the at least two second terminals are used to shoot the same scene from different camera positions. For convenience of description, only the parts related to the embodiment of the present application are shown.
The live broadcasting apparatus 400 includes:
a real-time picture acquiring unit 401, configured to acquire a real-time picture captured by each second terminal when receiving a live broadcast request;
a real-time picture display unit 402, configured to display each real-time picture through different live broadcast windows on the display screen;
a live broadcast window adjusting unit 403, configured to, if a window adjustment request carrying size information and layout information is received, adjust a size of a target live broadcast window according to the size information, and adjust a position of the target live broadcast window according to the layout information, where the target live broadcast window is a live broadcast window indicated by the window adjustment request.
Optionally, the live broadcasting apparatus 400 further includes:
the video recording unit is used for controlling each second terminal to record a real-time picture based on the video recording duration when receiving a video recording request carrying the video recording duration;
and a video recording file acquiring unit, configured to acquire the video recording files recorded by each second terminal.
Optionally, the video recording unit further includes:
and a batch video recording subunit, configured to send a video recording batch identifier and the video recording duration to each second terminal when receiving the video recording request carrying the video recording duration, so as to instruct each second terminal to record a real-time picture based on the video recording duration to obtain video recording files, where each video recording file includes the video recording batch identifier.
Optionally, the batch video recording subunit is specifically configured to: acquire a current timestamp when the video recording request carrying the video recording duration is received; splice the current timestamp with a random number to obtain the video recording batch identifier; and broadcast the video recording batch identifier and the video recording duration to each second terminal.
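The batch-identifier scheme described above (a current timestamp spliced with a random number, then broadcast together with the recording duration) can be sketched as follows. This is only an illustrative sketch: the function names, the `-` separator, and the message fields are assumptions, not part of the patent.

```python
import random
import time

def make_recording_batch_id() -> str:
    """Splice the current timestamp with a random number to form a
    video recording batch identifier (separator is an assumption)."""
    timestamp = str(int(time.time() * 1000))     # current timestamp in milliseconds
    nonce = str(random.randint(100000, 999999))  # random number
    return timestamp + "-" + nonce

def make_broadcast_message(duration_seconds: int) -> dict:
    """Build the message broadcast to every second terminal: the batch
    identifier plus the requested video recording duration."""
    return {
        "batch_id": make_recording_batch_id(),
        "duration": duration_seconds,
    }

# Every second terminal records for `duration` seconds and tags its
# resulting file with `batch_id`, so files from one request share an ID.
msg = make_broadcast_message(30)
```

Because every file produced in response to one request carries the same identifier, the first terminal can later group the files of a single recording batch without relying on file timestamps.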
Optionally, the live broadcasting apparatus 400 further includes:
a file detection unit, configured to detect, when a playback request is received, whether the first terminal stores M video files pointed to by the playback request, where M is equal to the number of the second terminals;
and a video playback unit, configured to play the M video files through different playback windows on the display screen if the M video files are stored in the first terminal.
Optionally, the file detection unit is specifically configured to detect, when the playback request is received, whether the M video files containing the video recording batch identifier are stored in the first terminal.
Optionally, the live broadcasting apparatus 400 further includes:
a two-dimensional code generating unit for generating a two-dimensional code in which link addresses of the M video files are stored;
and a two-dimensional code display unit, configured to display the two-dimensional code on the display screen, so that a user terminal which scans the two-dimensional code acquires and plays the M video files according to the link addresses.
As can be seen from the above, in the present application, when a live broadcast request is received, the real-time picture captured by each second terminal is first acquired, and then each real-time picture is displayed through a different live broadcast window on the display screen. If a window adjustment request carrying size information and layout information is received, the size of a target live broadcast window is adjusted according to the size information, and the position of the target live broadcast window is adjusted according to the layout information, where the target live broadcast window is the live broadcast window indicated by the window adjustment request. In the solution of the present application, at least two second terminals shoot the same scene from different camera positions, and the real-time pictures captured by the second terminals are broadcast live through different live broadcast windows on the display screen of the first terminal. While watching the live broadcast, the user can adjust the size and position of any live broadcast window at any time as needed, so that the live broadcast windows are displayed in a more orderly manner and the user can conveniently find the content of interest among the numerous real-time pictures.
Fig. 5 is a schematic structural diagram of a terminal according to an embodiment of the present application. As shown in fig. 5, the terminal 5 of this embodiment includes: at least one processor 50 (only one is shown in fig. 5), a memory 51, a computer program 52 stored in the memory 51 and operable on the at least one processor 50, and a display 53, wherein the processor 50 implements the following steps when executing the computer program 52:
when a live broadcast request is received, acquiring real-time pictures shot by each second terminal;
displaying each real-time picture on the display screen through different live broadcast windows;
and if a window adjusting request carrying size information and layout information is received, adjusting the size of a target live broadcast window according to the size information, and adjusting the position of the target live broadcast window according to the layout information, wherein the target live broadcast window is a live broadcast window indicated by the window adjusting request.
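The window adjustment step above can be sketched as a small data structure: each live broadcast window records its position (layout information) and size (size information), and a window adjustment request names the target window and supplies new values. The class, field, and request-key names here are illustrative assumptions, not the patent's own interface.

```python
from dataclasses import dataclass

@dataclass
class LiveWindow:
    """One live broadcast window on the first terminal's display screen."""
    terminal_id: str
    x: int       # position: layout information
    y: int
    width: int   # size information
    height: int

def apply_window_adjustment(windows: dict, request: dict) -> LiveWindow:
    """Resize and reposition the target live broadcast window indicated
    by a window adjustment request carrying size and layout information."""
    target = windows[request["target"]]
    target.width, target.height = request["size"]   # adjust size
    target.x, target.y = request["layout"]          # adjust position
    return target

# Two live broadcast windows, one per second terminal (camera position).
windows = {
    "cam1": LiveWindow("cam1", 0, 0, 640, 360),
    "cam2": LiveWindow("cam2", 640, 0, 640, 360),
}
# Enlarge and move the window showing the second camera's real-time picture.
apply_window_adjustment(
    windows, {"target": "cam2", "size": (1280, 720), "layout": (0, 360)}
)
```

Only the target window named by the request changes; the other live broadcast windows keep their size and position, matching the per-window adjustment described above.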
Assuming that the above is the first possible implementation manner, in a second possible implementation manner provided on the basis of the first possible implementation manner, when the processor 50 executes the computer program 52, the following steps are further implemented:
when a video recording request carrying video recording duration is received, controlling each second terminal to record a real-time picture based on the video recording duration;
and acquiring the video recording files recorded by each second terminal.
In a third possible implementation manner provided on the basis of the second possible implementation manner, the controlling, when a video recording request carrying a video recording duration is received, each second terminal to record a real-time picture based on the video recording duration includes:
and when the video recording request carrying the video recording duration is received, sending a video recording batch identifier and the video recording duration to each second terminal so as to instruct each second terminal to record a real-time picture based on the video recording duration to obtain video recording files, wherein each video recording file comprises the video recording batch identifier.
In a fourth possible implementation manner provided on the basis of the third possible implementation manner, the sending, to each second terminal, a video recording batch identifier and the video recording duration when receiving the video recording request carrying the video recording duration includes:
when the video recording request carrying the video recording duration is received, acquiring a current timestamp;
splicing the current timestamp with a random number to obtain the video recording batch identifier;
and broadcasting the video recording batch identifier and the video recording duration to each second terminal.
In a fifth possible implementation manner provided on the basis of the first possible implementation manner, when the processor 50 executes the computer program 52, the following steps are further implemented:
when a playback request is received, detecting whether the first terminal stores M video files pointed to by the playback request, wherein M is equal to the number of the second terminals;
and if the first terminal stores the M video files, the M video files are played through different playback windows on the display screen.
In a sixth possible implementation manner provided on the basis of the fifth possible implementation manner, the playback request carries a video recording batch identifier, and the detecting, when the playback request is received, whether the first terminal stores M video files pointed to by the playback request includes:
when the playback request is received, detecting whether the M video files containing the video recording batch identifier are stored in the first terminal.
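The detection step above (check whether the first terminal stores all M files of one batch, where M equals the number of second terminals) can be sketched as a directory scan. The filename convention, the `.mp4` extension, and the function name are illustrative assumptions.

```python
from pathlib import Path

def has_all_batch_files(storage_dir: str, batch_id: str, m: int) -> bool:
    """Detect whether the first terminal stores the M video recording
    files containing the given video recording batch identifier.
    Assumes (for illustration) that each second terminal's file embeds
    the batch identifier in its filename."""
    matches = [p for p in Path(storage_dir).glob("*.mp4")
               if batch_id in p.name]
    return len(matches) == m
```

If the check succeeds, the M files can each be played through a different playback window; otherwise the playback request cannot yet be served locally.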
In a seventh possible implementation manner provided on the basis of the fifth possible implementation manner, after the M video files are played through different playback windows on the display screen, the following steps are further implemented when the processor 50 executes the computer program 52:
generating two-dimensional codes storing link addresses of the M video files;
and displaying the two-dimensional code on the display screen, so that the user terminal which scans the two-dimensional code acquires and plays the M video files according to the link address.
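The two-dimensional code step above stores the link addresses of the M video files so that a scanning user terminal can fetch and play them. A minimal sketch of the payload round-trip is below; the JSON layout, the function names, and the example URLs are assumptions, and rendering the string as an actual two-dimensional code image (e.g. with a QR library) is omitted here.

```python
import json

def make_playback_payload(link_addresses) -> str:
    """Pack the link addresses of the M video recording files into one
    string suitable for encoding into a two-dimensional code."""
    return json.dumps({"videos": list(link_addresses)})

def read_playback_payload(payload: str) -> list:
    """What the scanning user terminal does: recover the link addresses,
    then fetch each file according to its link address and play it."""
    return json.loads(payload)["videos"]

# Hypothetical link addresses for a two-camera batch.
links = [
    "http://example.com/v/cam1_42.mp4",
    "http://example.com/v/cam2_42.mp4",
]
payload = make_playback_payload(links)
```

Encoding all M link addresses in a single code means one scan is enough for the user terminal to retrieve every camera position's recording of the batch.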
Those skilled in the art will appreciate that fig. 5 is only an example of the terminal 5 and does not constitute a limitation to the terminal 5, which may include more or fewer components than those shown, a combination of some components, or different components, such as input and output devices, network access devices, etc.
The processor 50 may be a Central Processing Unit (CPU); the processor 50 may also be another general-purpose processor, a Digital Signal Processor (DSP), an Application-Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, etc. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor or the like.
The memory 51 may be an internal storage unit of the terminal 5 in some embodiments, such as a hard disk or a memory of the terminal 5. In other embodiments, the memory 51 may be an external storage device of the terminal 5, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) card, or a flash card provided on the terminal 5. Further, the memory 51 may include both an internal storage unit and an external storage device of the terminal 5. The memory 51 is used for storing an operating system, an application program, a boot loader (BootLoader), data, and other programs, such as the program code of the computer program. The memory 51 may also be used to temporarily store data that has been output or is to be output.
As can be seen from the above, in the present application, when a live broadcast request is received, the real-time picture captured by each second terminal is first acquired, and then each real-time picture is displayed through a different live broadcast window on the display screen. If a window adjustment request carrying size information and layout information is received, the size of a target live broadcast window is adjusted according to the size information, and the position of the target live broadcast window is adjusted according to the layout information, where the target live broadcast window is the live broadcast window indicated by the window adjustment request. In the solution of the present application, at least two second terminals shoot the same scene from different camera positions, and the real-time pictures captured by the second terminals are broadcast live through different live broadcast windows on the display screen of the first terminal. While watching the live broadcast, the user can adjust the size and position of any live broadcast window at any time as needed, so that the live broadcast windows are displayed in a more orderly manner and the user can conveniently find the content of interest among the numerous real-time pictures.
It should be noted that, for the information interaction, execution process, and other contents between the above-mentioned devices/units, the specific functions and technical effects thereof are based on the same concept as those of the embodiment of the method of the present application, and specific reference may be made to the part of the embodiment of the method, which is not described herein again.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-mentioned division of the functional units and modules is illustrated, and in practical applications, the above-mentioned functions may be distributed as different functional units and modules according to needs, that is, the internal structure of the apparatus may be divided into different functional units or modules to implement all or part of the above-mentioned functions. Each functional unit and module in the embodiments may be integrated in one processing unit, or each unit may exist alone physically, or two or more units are integrated in one unit, and the integrated unit may be implemented in a form of hardware, or in a form of software functional unit. In addition, specific names of the functional units and modules are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present application. The specific working processes of the units and modules in the system may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
The embodiments of the present application further provide a computer-readable storage medium, where a computer program is stored, and when the computer program is executed by a processor, the computer program implements the steps in the above method embodiments.
The embodiments of the present application further provide a computer program product, which, when run on a terminal, enables the terminal to implement the steps in the above method embodiments.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a separate product, may be stored in a computer-readable storage medium. Based on such understanding, all or part of the processes in the methods of the embodiments described above can be implemented by a computer program, which can be stored in a computer-readable storage medium and, when executed by a processor, implements the steps of the above method embodiments. The computer program includes computer program code, which may be in source code form, object code form, an executable file, or some intermediate form. The computer-readable medium may include at least: any entity or device capable of carrying the computer program code to the terminal, a recording medium, a computer memory, a Read-Only Memory (ROM), a Random-Access Memory (RAM), an electrical carrier signal, a telecommunications signal, and a software distribution medium, such as a USB flash drive, a removable hard disk, a magnetic disk, or an optical disk. In certain jurisdictions, in accordance with legislation and patent practice, the computer-readable medium may not be an electrical carrier signal or a telecommunications signal.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or illustrated in a certain embodiment.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus/network device and method may be implemented in other ways. For example, the above-described apparatus/network device embodiments are merely illustrative, and for example, the division of the above modules or units is only one logical function division, and there may be other divisions when actually implemented, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
The above embodiments are only used to illustrate the technical solutions of the present application, and not to limit the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not substantially depart from the spirit and scope of the embodiments of the present application and are intended to be included within the scope of the present application.
Claims (10)
1. A live broadcast method, characterized in that the live broadcast method is applied to a first terminal having a display screen, the first terminal and at least two second terminals are in the same network, the at least two second terminals are used for shooting the same scene at different camera positions respectively, and the live broadcast method comprises the following steps:
when a live broadcast request is received, acquiring real-time pictures shot by each second terminal;
displaying each real-time picture on the display screen through different live broadcast windows;
and if a window adjusting request carrying size information and layout information is received, adjusting the size of a target live broadcast window according to the size information, and adjusting the position of the target live broadcast window according to the layout information, wherein the target live broadcast window is a live broadcast window indicated by the window adjusting request.
2. A live method according to claim 1, characterized in that the live method further comprises:
when a video recording request carrying video recording duration is received, controlling each second terminal to record a real-time picture based on the video recording duration;
and acquiring the video recording files recorded by each second terminal.
3. The live broadcasting method according to claim 2, wherein when receiving a video recording request carrying a video recording duration, controlling each second terminal to record a real-time picture based on the video recording duration includes:
and when the video recording request carrying the video recording time length is received, sending a video recording batch identifier and the video recording time length to each second terminal so as to instruct each second terminal to record a real-time picture based on the video recording time length to obtain video recording files, wherein each video recording file comprises the video recording batch identifier.
4. The live broadcasting method according to claim 3, wherein the sending a video recording batch identifier and the video recording duration to each second terminal when receiving the video recording request carrying the video recording duration includes:
when the video recording request carrying the video recording duration is received, acquiring a current timestamp;
splicing the current timestamp with a random number to obtain the video recording batch identifier;
and broadcasting the video recording batch identifier and the video recording duration to each second terminal.
5. A live method according to claim 1, characterized in that the live method further comprises:
when a playback request is received, detecting whether M video files pointed to by the playback request are stored in the first terminal, wherein M is equal to the number of the second terminals;
and if the M video files are stored in the first terminal, the M video files are respectively played through different playback windows on the display screen.
6. The live broadcasting method according to claim 5, wherein the playback request carries a video recording batch identifier, and the detecting, when the playback request is received, whether the first terminal stores M video files pointed to by the playback request includes:
and when the playback request is received, detecting whether the first terminal stores the M video files containing the video recording batch identifier.
7. The live broadcasting method according to claim 5, wherein, after the M video files are respectively played through different playback windows on the display screen if the first terminal stores the M video files, the live broadcasting method further comprises:
generating two-dimensional codes storing link addresses of the M video files;
and displaying the two-dimensional code on the display screen, so that the user terminal which scans the two-dimensional code acquires and plays the M video files according to the link addresses.
8. A live broadcast apparatus, characterized in that the live broadcast apparatus is applied to a first terminal having a display screen, the first terminal and at least two second terminals are in the same network, the at least two second terminals are used for shooting the same scene at different camera positions respectively, and the live broadcast apparatus comprises:
the real-time picture acquisition unit is used for acquiring real-time pictures shot by each second terminal when receiving a live broadcast request;
the real-time picture display unit is used for displaying each real-time picture through different live broadcast windows on the display screen;
and the live broadcast window adjusting unit is used for adjusting the size of a target live broadcast window according to the size information and adjusting the position of the target live broadcast window according to the layout information if a window adjusting request carrying the size information and the layout information is received, wherein the target live broadcast window is a live broadcast window indicated by the window adjusting request.
9. A terminal comprising a display, a memory, a processor and a computer program stored in the memory and executable on the processor, characterized in that the processor implements the method according to any of claims 1 to 7 when executing the computer program.
10. A computer-readable storage medium, in which a computer program is stored which, when being executed by a processor, carries out the method according to any one of claims 1 to 7.
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/CN2020/099363 WO2022000290A1 (en) | 2020-06-30 | 2020-06-30 | Live streaming method, live streaming apparatus, and terminal |
Publications (1)
Publication Number | Publication Date |
---|---|
CN111919451A true CN111919451A (en) | 2020-11-10 |
Family
ID=73265254
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202080001125.XA Pending CN111919451A (en) | 2020-06-30 | 2020-06-30 | Live broadcasting method, live broadcasting device and terminal |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN111919451A (en) |
WO (1) | WO2022000290A1 (en) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112672174A (en) * | 2020-12-11 | 2021-04-16 | 咪咕文化科技有限公司 | Split-screen live broadcast method, acquisition equipment, playing equipment and storage medium |
CN113873311A (en) * | 2021-09-09 | 2021-12-31 | 北京都是科技有限公司 | Live broadcast control method and device and storage medium |
CN113873312A (en) * | 2021-09-22 | 2021-12-31 | 北京达佳互联信息技术有限公司 | Video editing method and device |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN115243067A (en) * | 2022-07-21 | 2022-10-25 | 天比高零售管理(深圳)有限公司 | Jewelry live broadcast platform file pushing method and management system |
CN116017145B (en) * | 2022-12-27 | 2023-08-01 | 深圳市快美妆科技有限公司 | Remote intelligent control system and method for live camera |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104410916A (en) * | 2014-12-03 | 2015-03-11 | 广州华多网络科技有限公司 | On-line living broadcast method and equipment |
CN105872570A (en) * | 2015-12-11 | 2016-08-17 | 乐视网信息技术(北京)股份有限公司 | Method and apparatus for implementing multi-camera video synchronous playing |
US20170195650A1 (en) * | 2015-12-30 | 2017-07-06 | Le Holdings (Beijing) Co., Ltd. | Method and system for multi point same screen broadcast of video |
CN107659825A (en) * | 2017-09-12 | 2018-02-02 | 武汉斗鱼网络科技有限公司 | Method, apparatus, server, main broadcaster end and the medium that a kind of live video is retained |
CN108024123A (en) * | 2017-11-08 | 2018-05-11 | 北京密境和风科技有限公司 | A kind of live video processing method, device, terminal device and server |
CN110636321A (en) * | 2019-09-30 | 2019-12-31 | 北京达佳互联信息技术有限公司 | Data processing method, device, system, mobile terminal and storage medium |
CN111147878A (en) * | 2019-12-30 | 2020-05-12 | 广州酷狗计算机科技有限公司 | Stream pushing method and device in live broadcast and computer storage medium |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105872705A (en) * | 2015-12-15 | 2016-08-17 | 乐视网信息技术(北京)股份有限公司 | Live display method and apparatus |
CN105847937A (en) * | 2016-04-19 | 2016-08-10 | 乐视控股(北京)有限公司 | Method and device for displaying video |
CN109600652B (en) * | 2017-09-30 | 2022-03-29 | 中兴通讯股份有限公司 | Method for playing multi-channel video by mobile terminal, mobile terminal and readable storage medium |
CN108156468A (en) * | 2017-09-30 | 2018-06-12 | 上海掌门科技有限公司 | A kind of method and apparatus for watching main broadcaster's live streaming |
CN110324693A (en) * | 2018-03-30 | 2019-10-11 | 武汉斗鱼网络科技有限公司 | The direct broadcasting room processing method and processing device of video is played for realizing how small window |
2020
- 2020-06-30 CN CN202080001125.XA patent/CN111919451A/en active Pending
- 2020-06-30 WO PCT/CN2020/099363 patent/WO2022000290A1/en active Application Filing
Patent Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104410916A (en) * | 2014-12-03 | 2015-03-11 | 广州华多网络科技有限公司 | On-line living broadcast method and equipment |
CN105872570A (en) * | 2015-12-11 | 2016-08-17 | 乐视网信息技术(北京)股份有限公司 | Method and apparatus for implementing multi-camera video synchronous playing |
US20170195650A1 (en) * | 2015-12-30 | 2017-07-06 | Le Holdings (Beijing) Co., Ltd. | Method and system for multi point same screen broadcast of video |
CN107659825A (en) * | 2017-09-12 | 2018-02-02 | 武汉斗鱼网络科技有限公司 | Method, apparatus, server, main broadcaster end and the medium that a kind of live video is retained |
CN108024123A (en) * | 2017-11-08 | 2018-05-11 | 北京密境和风科技有限公司 | A kind of live video processing method, device, terminal device and server |
CN110636321A (en) * | 2019-09-30 | 2019-12-31 | 北京达佳互联信息技术有限公司 | Data processing method, device, system, mobile terminal and storage medium |
CN111147878A (en) * | 2019-12-30 | 2020-05-12 | 广州酷狗计算机科技有限公司 | Stream pushing method and device in live broadcast and computer storage medium |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112672174A (en) * | 2020-12-11 | 2021-04-16 | 咪咕文化科技有限公司 | Split-screen live broadcast method, acquisition equipment, playing equipment and storage medium |
CN112672174B (en) * | 2020-12-11 | 2023-07-07 | 咪咕文化科技有限公司 | Split-screen live broadcast method, acquisition device, playing device and storage medium |
CN113873311A (en) * | 2021-09-09 | 2021-12-31 | 北京都是科技有限公司 | Live broadcast control method and device and storage medium |
CN113873311B (en) * | 2021-09-09 | 2024-03-12 | 北京都是科技有限公司 | Live broadcast control method, device and storage medium |
CN113873312A (en) * | 2021-09-22 | 2021-12-31 | 北京达佳互联信息技术有限公司 | Video editing method and device |
CN113873312B (en) * | 2021-09-22 | 2023-12-01 | 北京达佳互联信息技术有限公司 | Video editing method and device |
Also Published As
Publication number | Publication date |
---|---|
WO2022000290A1 (en) | 2022-01-06 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN111919451A (en) | Live broadcasting method, live broadcasting device and terminal | |
CN108737882B (en) | Image display method, image display device, storage medium and electronic device | |
US9910865B2 (en) | Method for capturing the moment of the photo capture | |
CN109981695B (en) | Content pushing method, device and equipment | |
US10929460B2 (en) | Method and apparatus for storing resource and electronic device | |
CN108632676B (en) | Image display method, image display device, storage medium and electronic device | |
US20150222815A1 (en) | Aligning videos representing different viewpoints | |
US20150035999A1 (en) | Method for sharing digital photos securely | |
US20110246909A1 (en) | Ancillary experience-based pairing | |
CN111937397A (en) | Media data processing method and device | |
US10009643B2 (en) | Apparatus and method for processing media content | |
CN104361075A (en) | Image website system and realizing method | |
CN105933757A (en) | Video playing method, device and system thereof | |
CN110913278B (en) | Video playing method, display terminal and storage medium | |
CN104185040A (en) | Application synchronization method, application server and terminal | |
CN104661101A (en) | System and method for providing augmented reality effect for multimedia data | |
CN110996157A (en) | Video playing method and device, electronic equipment and machine-readable storage medium | |
WO2022037484A1 (en) | Image processing method and apparatus, device and storage medium | |
WO2016200721A1 (en) | Contextual video content adaptation based on target device | |
CN112770151A (en) | Method, device and storage medium for supporting multi-person interception of screen-projected playing picture | |
CN108401163B (en) | Method and device for realizing VR live broadcast and OTT service system | |
WO2022088908A1 (en) | Video playback method and apparatus, electronic device, and storage medium | |
CN107155114A (en) | A kind of video pictures method of adjustment and system | |
US9491447B2 (en) | System for providing complex-dimensional content service using complex 2D-3D content file, method for providing said service, and complex-dimensional content file therefor | |
US20140003656A1 (en) | System of a data transmission and electrical apparatus |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
RJ01 | Rejection of invention patent application after publication | ||
RJ01 | Rejection of invention patent application after publication |
Application publication date: 20201110 |