CN112423139A - Multi-machine live broadcast method, system, equipment and storage medium based on mobile terminal - Google Patents
Multi-machine live broadcast method, system, equipment and storage medium based on mobile terminal
- Publication number: CN112423139A (application CN202011311206.8A)
- Authority: CN (China)
- Prior art keywords: mobile terminal, camera, video, real-time
- Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- H—ELECTRICITY; H04—ELECTRIC COMMUNICATION TECHNIQUE; H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/47205—End-user interface for manipulating displayed content, e.g. interacting with MPEG-4 objects, editing locally
- H04N21/4223—Input-only client peripherals: cameras
- H04N21/4312—Generation of visual interfaces for content selection or interaction involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations
- H04N21/436—Interfacing a local distribution network, e.g. communicating with another STB or one or more peripheral devices inside the home
- H04N21/44222—Analytics of user selections, e.g. selection of programs or purchase activity
- H04N23/611—Control of cameras or camera modules based on recognised objects where the recognised objects include parts of the human body
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Databases & Information Systems (AREA)
- Health & Medical Sciences (AREA)
- General Health & Medical Sciences (AREA)
- Social Psychology (AREA)
- Human Computer Interaction (AREA)
- Computer Networks & Wireless Communication (AREA)
- Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)
Abstract
The invention provides a mobile-terminal-based multi-camera live broadcast method, system, device, and storage medium. The method comprises the following steps: setting up a plurality of first-type mobile terminals with cameras and establishing communication between the first-type mobile terminals and a second-type mobile terminal; shooting video with the camera of each first-type mobile terminal and sending the captured real-time video to the second-type mobile terminal; displaying the real-time videos shot by the different first-type mobile terminals in separate regions of the second-type mobile terminal's screen; detecting the selected state of each region and taking the selected real-time video as the candidate sub-video; and editing the candidate sub-videos into a video stream in chronological order and sending the video stream to an output terminal or a server. Through this improved workflow, multi-camera coordinated shooting provides a single comprehensive video stream, which greatly reduces the difficulty and post-production cost of multi-camera shooting and aids both its popularization and the improvement of video quality.
Description
Technical Field
The invention relates to the field of video shooting, and in particular to a mobile-terminal-based multi-camera live broadcast method, system, device, and storage medium.
Background
With the development of internet technology and smart mobile terminal devices, internet products of all kinds have brought convenience and entertainment to people's work and life. In recent years, live video platforms have emerged one after another, bringing real-time social experiences to their users. Because live broadcast needs are diverse, multi-camera multi-angle live broadcast can fully exploit the advantages of multiple camera positions: it avoids the single, monotonous picture of single-camera live broadcast, improves the live atmosphere, enriches the content, and supports complex program broadcasts.
However, for an individual video blogger, multiple camera positions mean extra labor cost and post-editing cost, so multi-camera live broadcast has been difficult for individual video bloggers to adopt.
The invention therefore provides a mobile-terminal-based multi-camera live broadcast method, system, device, and storage medium.
Disclosure of Invention
In view of the problems in the prior art, the invention aims to provide a mobile-terminal-based multi-camera live broadcast method, system, device, and storage medium that overcome the difficulties of the prior art: through an improved workflow, multi-camera coordinated shooting provides a single comprehensive video stream, which greatly reduces the difficulty and post-production cost of multi-camera shooting, lowers its barrier to use, aids its popularization, and improves video quality through convenient multi-angle shooting.
An embodiment of the invention provides a mobile-terminal-based multi-camera live broadcast method, comprising the following steps:
S110, setting up a plurality of first-type mobile terminals with cameras and establishing communication between the first-type mobile terminals and a second-type mobile terminal;
S120, shooting video with the camera of each first-type mobile terminal and sending the captured real-time video to the second-type mobile terminal;
S130, displaying the real-time videos shot by the different first-type mobile terminals in separate regions of the second-type mobile terminal's screen;
S140, detecting the selected state of each region and taking the selected real-time video as the candidate sub-video; and
S150, editing the candidate sub-videos into a video stream in chronological order and sending the video stream to an output terminal or a server.
Preferably, step S120 includes: each first-type mobile terminal includes cameras facing two different shooting directions; each camera intermittently performs portrait recognition on its real-time image to judge whether a preset face is present; a camera whose image contains the preset face is taken as the working camera, and the real-time video shot by the working camera is sent to the second-type mobile terminal.
Preferably, the preset face is recognised according to personal facial features pre-recorded by the video blogger.
Preferably, when a first-type mobile terminal has multiple cameras facing the same shooting direction and all of them capture the preset face, face recognition is performed on each camera's real-time image to establish a face region, and the camera with the largest face region is taken as the working camera.
Preferably, in step S130 the touch screen of the second-type mobile terminal is divided into a plurality of regions, each of which plays the real-time video shot by a different first-type mobile terminal;
in step S140, the user selects one of the regions by tapping it, the border of that region is highlighted, and the real-time video played in that region is taken as the candidate sub-video.
Preferably, in step S140, when the user taps another region, the border of the newly selected region is highlighted and the real-time video played in that region is taken as the new candidate sub-video.
Preferably, in step S140, the second-type mobile terminal tracks the user's eye movement; when the user's gaze stays in one region of the second-type mobile terminal's screen longer than a preset threshold, the border of that region is highlighted and the real-time video played in that region is taken as the candidate sub-video.
Preferably, the first-type mobile terminal is any one of a mobile phone, a notebook computer, a tablet computer, or a camera-equipped unmanned aerial vehicle.
Preferably, the second-type mobile terminal is any one of a mobile phone, a notebook computer, a tablet computer, or a camera-equipped unmanned aerial vehicle.
An embodiment of the invention also provides a mobile-terminal-based multi-camera live broadcast system for implementing the above mobile-terminal-based multi-camera live broadcast method, the system comprising:
a terminal connection module, which sets up a plurality of first-type mobile terminals with cameras and establishes communication between the first-type mobile terminals and a second-type mobile terminal;
a video transmission module, which shoots video with the camera of each first-type mobile terminal and sends the captured real-time video to the second-type mobile terminal;
a video display module, which displays the real-time videos shot by the different first-type mobile terminals in separate regions of the second-type mobile terminal's screen;
a first editing module, which detects the selected state of each region and takes the selected real-time video as the candidate sub-video; and
a second editing module, which edits the candidate sub-videos into a video stream in chronological order and sends the video stream to an output terminal or a server.
An embodiment of the invention also provides a mobile-terminal-based multi-camera live broadcast device, comprising:
a processor;
a memory having stored therein executable instructions of the processor;
wherein the processor is configured to perform, via execution of the executable instructions, the steps of the mobile-terminal-based multi-camera live broadcast method described above.
An embodiment of the invention also provides a computer-readable storage medium storing a program which, when executed, implements the steps of the mobile-terminal-based multi-camera live broadcast method described above.
The invention thus provides a mobile-terminal-based multi-camera live broadcast method, system, device, and storage medium in which, through an improved workflow, multi-camera coordinated shooting provides a single comprehensive video stream, greatly reducing the difficulty and post-production cost of multi-camera shooting, lowering its barrier to use, aiding its popularization, and improving video quality through convenient multi-angle shooting.
Drawings
Other features, objects and advantages of the present invention will become more apparent upon reading of the following detailed description of non-limiting embodiments thereof, with reference to the accompanying drawings.
Fig. 1 is a flowchart of the mobile-terminal-based multi-camera live broadcast method of the present invention.
Fig. 2 is a process diagram of an implementation of the mobile-terminal-based multi-camera live broadcast method of the present invention.
Fig. 3 is a module schematic diagram of the mobile-terminal-based multi-camera live broadcast system of the present invention.
Fig. 4 is a schematic structural diagram of the mobile-terminal-based multi-camera live broadcast device of the present invention.
Fig. 5 is a schematic structural diagram of a computer-readable storage medium according to an embodiment of the present invention.
Detailed Description
Example embodiments will now be described more fully with reference to the accompanying drawings. Example embodiments may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of example embodiments to those skilled in the art. The same reference numerals in the drawings denote the same or similar structures, and thus their repetitive description will be omitted.
Fig. 1 is a flowchart of the mobile-terminal-based multi-camera live broadcast method of the present invention. As shown in fig. 1, an embodiment of the present invention provides a mobile-terminal-based multi-camera live broadcast method comprising the following steps:
s110, setting a plurality of first-class mobile terminals with cameras, and establishing communication between the first-class mobile terminals and a second-class mobile terminal;
s120, video shooting is carried out on the camera of each first-class mobile terminal, and the shot real-time video is sent to a second-class mobile terminal;
s130, displaying real-time videos shot by different first-type mobile terminals in a screen of a second-type mobile terminal in a subarea mode;
s140, detecting the selected states of different areas, and copying the selected real-time video as an alternative sub-video; and
s150, editing the alternative sub-videos into a video stream according to the time sequence, and sending the video stream to an output end or a server.
In a preferred embodiment, step S120 includes: each first-type mobile terminal includes cameras facing two different shooting directions; each camera intermittently performs portrait recognition on its real-time image to judge whether a preset face is present; a camera whose image contains the preset face is taken as the working camera, and the real-time video shot by the working camera is sent to the second-type mobile terminal.
In a preferred embodiment, the preset face is recognised according to personal facial features pre-recorded by the video blogger.
In a preferred embodiment, when a first-type mobile terminal has multiple cameras facing the same shooting direction and all of them capture the preset face, face recognition is performed on each camera's real-time image to establish a face region, and the camera with the largest face region is taken as the working camera.
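The largest-face-region rule above can be sketched as follows. This is a minimal illustration under the assumption that an on-device face detector supplies a bounding box (or `None`) per camera; `FaceBox` and `pick_working_camera` are hypothetical names, not part of the patent:

```python
from dataclasses import dataclass
from typing import Dict, Optional


@dataclass
class FaceBox:
    """Bounding box of the detected preset face in a camera's frame."""
    x: int
    y: int
    w: int
    h: int

    @property
    def area(self) -> int:
        return self.w * self.h


def pick_working_camera(detections: Dict[str, Optional[FaceBox]]) -> Optional[str]:
    """Return the id of the camera whose frame contains the largest
    preset-face region; None if no camera sees the preset face."""
    candidates = {cam: box for cam, box in detections.items() if box is not None}
    if not candidates:
        return None
    return max(candidates, key=lambda cam: candidates[cam].area)
```

Running the detector only intermittently, as the embodiment describes, would simply mean calling `pick_working_camera` every few frames rather than every frame.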
In a preferred embodiment, in step S130 the touch screen of the second-type mobile terminal is divided into a plurality of regions, each of which plays the real-time video shot by a different first-type mobile terminal;
in step S140, the user selects one of the regions by tapping it, the border of that region is highlighted, and the real-time video played in that region is taken as the candidate sub-video.
In a preferred embodiment, in step S140, when the user taps another region, the border of the newly selected region is highlighted and the real-time video played in that region is taken as the new candidate sub-video.
In a preferred embodiment, one of the mobile terminals serves as both a first-type and a second-type mobile terminal.
In a preferred embodiment, in step S140, the second-type mobile terminal tracks the user's eye movement; when the user's gaze stays in one region of the screen longer than a preset threshold, the border of that region is highlighted and the real-time video played in that region is taken as the candidate sub-video. In this way, when it is inconvenient for the user to operate by hand, or the user is at some distance from the second-type mobile terminal, the selection can be controlled remotely and a different camera switched in to shoot a new candidate sub-video.
In a preferred embodiment, the preset threshold ranges from 1 to 5 seconds.
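The gaze-dwell selection described above can be sketched as a small state machine. This is an illustrative sketch only; `DwellSelector` is a hypothetical name, and a real implementation would be fed region hits from an eye-tracking API:

```python
from typing import Optional


class DwellSelector:
    """Select a screen region once the gaze has stayed inside it longer
    than a preset threshold (1-5 seconds in the preferred embodiment)."""

    def __init__(self, threshold_s: float = 2.0):
        self.threshold_s = threshold_s
        self._region: Optional[int] = None
        self._entered_at = 0.0

    def update(self, region: int, now: float) -> Optional[int]:
        """Feed the region currently under the user's gaze at time `now`
        (seconds); return the region id once the dwell time exceeds the
        threshold, else None."""
        if region != self._region:
            # Gaze moved to a different region: restart the dwell timer.
            self._region = region
            self._entered_at = now
            return None
        if now - self._entered_at >= self.threshold_s:
            return region
        return None
```

When `update` returns a region id, the system would highlight that region's border and switch the candidate sub-video to its feed.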
Preferably, the first-type mobile terminal is any one of a mobile phone, a notebook computer, a tablet computer, or a camera-equipped unmanned aerial vehicle.
Preferably, the second-type mobile terminal is any one of a mobile phone, a notebook computer, a tablet computer, or a camera-equipped unmanned aerial vehicle.
Fig. 2 is a process diagram of an implementation of the mobile-terminal-based multi-camera live broadcast method of the present invention. Referring to fig. 2, an implementation of the invention proceeds as follows:
First, a plurality of mobile phones 2, 3, 4, 5 with cameras are set up to shoot a user 1, communication is established between mobile phones 3, 4, 5 and mobile phone 2, and video is transmitted wirelessly. (In this embodiment, mobile phone 2 serves as both a first-type and a second-type mobile terminal.)
Then the cameras of the mobile phones 2, 3, 4, 5 shoot video. Each camera on each phone intermittently performs portrait recognition on its real-time image to judge whether the preset face is present; a camera whose image contains the preset face becomes a working camera, and the real-time video it shoots is sent to mobile phone 2. The preset face is recognised according to facial features pre-recorded by the video blogger. In this embodiment, only the front selfie camera 22 of mobile phone 2, the rear main camera 31 of mobile phone 3, the rear main camera 41 of mobile phone 4, and the rear main camera 51 of mobile phone 5 capture the face of user 1; the remaining cameras do not, so only the real-time videos shot by the rear main cameras 31, 41, and 51 are transmitted to mobile phone 2.
Next, the screen of mobile phone 2 is divided into four regions (top-left, top-right, bottom-left, bottom-right), which respectively display the real-time videos shot by the front selfie camera 22 of mobile phone 2, the rear main camera 31 of mobile phone 3, the rear main camera 41 of mobile phone 4, and the rear main camera 51 of mobile phone 5.
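The four-region layout above can be expressed as a simple index-to-rectangle mapping. A hypothetical sketch, assuming feeds are numbered 0-3 in the top-left, top-right, bottom-left, bottom-right order of the embodiment:

```python
from typing import Tuple


def quadrant_rect(index: int, width: int, height: int) -> Tuple[int, int, int, int]:
    """Pixel rectangle (x, y, w, h) for feed index 0-3 in the
    top-left / top-right / bottom-left / bottom-right layout."""
    row, col = divmod(index, 2)
    w, h = width // 2, height // 2
    return (col * w, row * h, w, h)
```

Each incoming real-time video would then be rendered into its quadrant, and a tap or gaze hit test maps a screen coordinate back to a feed index.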
Finally, mobile phone 2 tracks the eye movement of user 1. When the gaze of user 1 stays in one region of the screen longer than a preset threshold (2 seconds in this embodiment), the border of that region is highlighted and the real-time video played in that region is taken as the candidate sub-video. For example, when user 1 stares at the video in the bottom-left region of the screen, mobile phone 2 recognises that the gaze has stayed in that region for more than 2 seconds and copies the real-time video shot by the rear main camera 41 of mobile phone 4 as the candidate sub-video; the candidate sub-videos are then edited into a video stream in chronological order and sent to an output terminal or a server. The output terminal can be connected to a computer, which in turn connects to the server; alternatively, the stream is sent directly to the server, which distributes it to other users. From this moment on, the content of the video stream is provided entirely by the real-time video shot by the main camera 41 of mobile phone 4.
One minute later, when user 1 needs to change cameras, for example by staring at the video in the bottom-right region, mobile phone 2 recognises that the gaze has stayed in that region for more than 2 seconds and copies the real-time video shot by the rear main camera 51 of mobile phone 5 as the new candidate sub-video, after which the content of the video stream is provided entirely by camera 51. In this way, when it is inconvenient for the user to operate by hand, or the user is at some distance from the phone, the camera feeding the video stream can be switched remotely.
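The chronological assembly of the stream in this walkthrough amounts to turning camera-switch events into contiguous segments. A minimal sketch (hypothetical names; times are seconds since the stream started):

```python
from typing import List, Optional, Tuple


def build_timeline(
    switch_events: List[Tuple[float, str]],  # (time, camera_id) selection events
    stream_end: float,
) -> List[Tuple[str, float, float]]:
    """Turn a list of (time, camera_id) switch events into contiguous
    (camera_id, start, end) segments making up the output stream."""
    events = sorted(switch_events)
    sentinel: Tuple[float, Optional[str]] = (stream_end, None)
    segments = []
    for (t, cam), nxt in zip(events, events[1:] + [sentinel]):
        segments.append((cam, t, nxt[0]))
    return segments
```

For the walkthrough above, selecting camera 41 at the start and camera 51 one minute in yields two segments covering the whole stream back to back.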
Fig. 3 is a module schematic diagram of the mobile-terminal-based multi-camera live broadcast system of the present invention. As shown in fig. 3, an embodiment of the present invention further provides a mobile-terminal-based multi-camera live broadcast system for implementing the above method. The system 500 includes:
the terminal connection module 501, which sets up a plurality of first-type mobile terminals with cameras and establishes communication between the first-type mobile terminals and a second-type mobile terminal;
the video transmission module 502, which shoots video with the camera of each first-type mobile terminal and sends the captured real-time video to the second-type mobile terminal;
the video display module 503, which displays the real-time videos shot by the different first-type mobile terminals in separate regions of the second-type mobile terminal's screen;
the first editing module 504, which detects the selected state of each region and copies the selected real-time video as the candidate sub-video; and
the second editing module 505, which edits the candidate sub-videos into a video stream in chronological order and sends the video stream to an output terminal or a server.
Through this improved workflow, the mobile-terminal-based multi-camera live broadcast system enables multi-camera coordinated shooting to provide a single comprehensive video stream, greatly reducing the difficulty and post-production cost of multi-camera shooting, lowering its barrier to use, aiding its popularization, and improving video quality through convenient multi-angle shooting.
An embodiment of the invention also provides a mobile-terminal-based multi-camera live broadcast device, which includes a processor and a memory storing executable instructions of the processor, wherein the processor is configured to perform, via execution of the executable instructions, the steps of the mobile-terminal-based multi-camera live broadcast method.
As shown above, through the improved workflow this embodiment of the invention provides a comprehensive video stream, greatly reducing the difficulty and post-production cost of multi-camera shooting, lowering its barrier to use, aiding its popularization, and improving video quality through convenient multi-angle shooting.
As will be appreciated by one skilled in the art, aspects of the present invention may be embodied as a system, method, or program product. Thus, various aspects of the invention may be embodied in the form of an entirely hardware embodiment, an entirely software embodiment (including firmware, microcode, etc.), or an embodiment combining hardware and software aspects that may all generally be referred to herein as a "circuit," "module," or "platform."
Fig. 4 is a schematic structural diagram of a multi-camera live broadcast device based on a mobile terminal according to the present invention. An electronic device 600 according to this embodiment of the invention is described below with reference to fig. 4. The electronic device 600 shown in fig. 4 is only an example, and should not bring any limitation to the functions and the scope of use of the embodiments of the present invention.
As shown in fig. 4, the electronic device 600 is embodied in the form of a general purpose computing device. The components of the electronic device 600 may include, but are not limited to: at least one processing unit 610, at least one memory unit 620, a bus 630 connecting the different platform components (including the memory unit 620 and the processing unit 610), a display unit 640, etc.
The storage unit stores program code executable by the processing unit 610, causing the processing unit 610 to perform the steps according to various exemplary embodiments of the invention described in the method section above. For example, the processing unit 610 may perform the steps shown in fig. 1.
The storage unit 620 may include readable media in the form of volatile memory units, such as a random access memory unit (RAM) 6201 and/or a cache memory unit 6202, and may further include a read-only memory unit (ROM) 6203.
The memory unit 620 may also include a program/utility 6204 having a set (at least one) of program modules 6205, such program modules 6205 including, but not limited to: an operating system, one or more application programs, other program modules, and program data, each of which, or some combination thereof, may comprise an implementation of a network environment.
The electronic device 600 may also communicate with one or more external devices 700 (e.g., keyboard, pointing device, bluetooth device, etc.), with one or more devices that enable a user to interact with the electronic device 600, and/or with any devices (e.g., router, modem, etc.) that enable the electronic device 600 to communicate with one or more other computing devices. Such communication may occur via an input/output (I/O) interface 650. Also, the electronic device 600 may communicate with one or more networks (e.g., a Local Area Network (LAN), a Wide Area Network (WAN), and/or a public network such as the Internet) via the network adapter 660. The network adapter 660 may communicate with other modules of the electronic device 600 via the bus 630. It should be appreciated that although not shown in the figures, other hardware and/or software modules may be used in conjunction with the electronic device 600, including but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, and data backup storage platforms, to name a few.
An embodiment of the invention also provides a computer-readable storage medium storing a program which, when executed, implements the steps of the mobile-terminal-based multi-camera live broadcast method. In some possible embodiments, aspects of the invention may also be implemented in the form of a program product comprising program code which, when run on a terminal device, causes the terminal device to perform the steps according to the various exemplary embodiments of the invention described in the method section above.
As shown above, through the improved workflow this embodiment of the invention provides a comprehensive video stream, greatly reducing the difficulty and post-production cost of multi-camera shooting, lowering its barrier to use, aiding its popularization, and improving video quality through convenient multi-angle shooting.
Fig. 5 is a schematic structural diagram of a computer-readable storage medium of the present invention. Referring to fig. 5, a program product 800 for implementing the above method according to an embodiment of the present invention is described, which may employ a portable compact disc read only memory (CD-ROM) and include program code, and may be run on a terminal device, such as a personal computer. However, the program product of the present invention is not limited in this regard and, in the present document, a readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
The program product may employ any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. A readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium include: an electrical connection having one or more wires, a portable disk, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
A computer-readable signal medium may include a propagated data signal with readable program code embodied therein, for example in baseband or as part of a carrier wave. Such a propagated data signal may take any of a variety of forms, including but not limited to electromagnetic, optical, or any suitable combination thereof. A readable signal medium may also be any readable medium, other than a readable storage medium, that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Program code for carrying out operations of the present invention may be written in any combination of one or more programming languages, including object-oriented programming languages such as Java and C++ as well as conventional procedural programming languages such as the "C" language. The program code may execute entirely on the user's computing device, partly on the user's device, as a stand-alone software package, partly on the user's computing device and partly on a remote computing device, or entirely on the remote computing device or server. In the latter case, the remote computing device may be connected to the user's computing device through any kind of network, including a local area network (LAN) or a wide area network (WAN), or may be connected to an external computing device (for example, through the Internet using an Internet service provider).
In summary, the present invention provides a multi-camera live broadcast method, system, device, and storage medium based on a mobile terminal. The system provides a composite video stream through an improved algorithm, which greatly simplifies multi-camera shooting and reduces its post-production cost, lowers the barrier to multi-camera shooting, facilitates its popularization, and improves video quality through convenient multi-angle shooting.
The foregoing is a detailed description of the invention in connection with specific preferred embodiments, and the specific implementation of the invention is not limited to these descriptions. Those of ordinary skill in the art to which the invention pertains may make several simple deductions or substitutions without departing from the spirit of the invention, all of which shall be considered to fall within the protection scope of the invention.
Claims (10)
1. A multi-camera live broadcast method based on a mobile terminal, characterized by comprising the following steps:
s110, setting a plurality of first-class mobile terminals with cameras, and establishing communication between the first-class mobile terminals and a second-class mobile terminal;
s120, video shooting is carried out on the camera of each first-class mobile terminal, and the shot real-time video is sent to the second-class mobile terminal;
s130, displaying real-time videos shot by different first-type mobile terminals in a split area on a screen of the second-type mobile terminal;
s140, detecting the selected states of different areas, and taking a selected real-time video as an alternative sub-video; and
s150, editing the alternative sub-videos into a video stream according to the time sequence, and sending the video stream to an output end or a server.
2. The multi-camera live broadcast method based on the mobile terminal according to claim 1, wherein the step S120 comprises: each first-type mobile terminal comprises two cameras with different shooting directions; each camera intermittently performs face recognition on its real-time image to determine whether a preset face is present; the camera whose image contains the preset face is taken as the working camera, and the real-time video shot by the working camera is sent to the second-type mobile terminal.
3. The multi-camera live broadcast method based on the mobile terminal according to claim 2, wherein the preset face is recognized according to personal facial features pre-recorded by the video blogger.
4. The multi-camera live broadcast method based on the mobile terminal according to claim 2, wherein, when the mobile terminal has a plurality of cameras in the same shooting direction and the plurality of cameras all capture the preset face, face recognition is performed on the real-time image of each camera to determine a face area, and at least one camera with the largest face area is taken as the working camera.
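The working-camera selection of claims 2 and 4 can be illustrated with a small sketch. This is an assumption-laden simplification: `detections` is a hypothetical mapping from a camera id to the bounding boxes `(x, y, w, h)` of faces in that camera's latest frame that matched the preset face (how the boxes are obtained, e.g. with a face detector, is outside this sketch):

```python
def pick_working_camera(detections):
    """Pick the working camera: cameras whose image contains no preset
    face are excluded (claim 2); among the rest, the camera whose
    matched face covers the largest area wins (claim 4)."""
    best_id, best_area = None, -1
    for cam_id, boxes in detections.items():
        if not boxes:
            continue  # claim 2: no preset face in this camera's image
        area = max(w * h for (_, _, w, h) in boxes)
        if area > best_area:
            best_id, best_area = cam_id, area
    return best_id  # None when no camera sees the preset face
```

For example, given a front camera with no match and a rear camera with one matched face, the rear camera is chosen; with two same-direction cameras both matching, the larger face area decides.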
5. The multi-camera live broadcasting method based on the mobile terminal as claimed in claim 1, wherein in the step S130, the touch screen of the second type of mobile terminal is divided into a plurality of areas, and each area respectively plays real-time videos shot by different first type of mobile terminals;
in step S140, the user selects one of the regions by clicking; the border of the selected region is highlighted, and the real-time video played in that region is taken as the candidate sub-video.
6. The multi-camera live broadcasting method based on the mobile terminal of claim 5, wherein in the step S140, when the user clicks another area, a border of the newly selected area is highlighted, and a real-time video played in the area is used as a new candidate sub-video.
7. The multi-camera live broadcasting method based on the mobile terminal of claim 1, wherein in the step S140, the second-type mobile terminal tracks an eye movement track of the user, and when a duration of the eye movement track of the user staying in an area of the second-type mobile terminal exceeds a preset threshold, a frame of the area is highlighted, and a real-time video played in the area is taken as an alternative sub-video.
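The gaze-dwell selection of claim 7 can also be sketched. This is a hedged illustration only: the `DwellSelector` class and its interface are invented for the example, and the gaze coordinates-to-region mapping and the eye tracker itself are assumed to exist elsewhere:

```python
class DwellSelector:
    """Sketch of claim 7: track which screen region the user's gaze is
    in; once the gaze stays in one region longer than `threshold`
    seconds, that region's video becomes the candidate sub-video."""

    def __init__(self, threshold=1.5):
        self.threshold = threshold    # preset dwell threshold, in seconds
        self.region = None            # region the gaze is currently in
        self.since = None             # time the gaze entered that region
        self.selected = None          # region currently selected

    def on_gaze(self, region, now):
        if region != self.region:
            # Gaze moved to a different region: restart the dwell timer.
            self.region, self.since = region, now
        elif now - self.since >= self.threshold and region != self.selected:
            # Dwelled long enough: highlight this region and take its
            # real-time video as the new candidate sub-video.
            self.selected = region
        return self.selected
```

Note that the previous selection persists until the gaze dwells long enough on a different region, matching the behavior of the click-based selection in claims 5 and 6.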
8. A multi-camera live broadcast system based on a mobile terminal, for implementing the multi-camera live broadcast method based on the mobile terminal as claimed in claim 1, comprising:
the terminal connection module is used for setting a plurality of first-class mobile terminals with cameras and establishing communication between the first-class mobile terminals and a second-class mobile terminal;
the video transmission module is used for shooting videos by the camera of each first type of mobile terminal and sending the shot real-time videos to a second type of mobile terminal;
the video display module is used for displaying real-time videos shot by different first-type mobile terminals in a split area on a screen of the second-type mobile terminal;
the first editing module is used for detecting the selected states of different areas and taking a selected real-time video as an alternative sub-video; and
the second editing module is used for editing the candidate sub-videos into a video stream in chronological order and sending the video stream to an output end or a server.
9. A multi-camera live broadcast device based on a mobile terminal, characterized by comprising:
a processor;
a memory having stored therein executable instructions of the processor;
wherein the processor is configured to perform, via execution of the executable instructions, the steps of the multi-camera live broadcast method based on a mobile terminal according to any one of claims 1 to 7.
10. A computer-readable storage medium storing a program, wherein the program, when executed, implements the steps of the multi-camera live broadcast method based on a mobile terminal according to any one of claims 1 to 7.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202011311206.8A CN112423139A (en) | 2020-11-20 | 2020-11-20 | Multi-machine live broadcast method, system, equipment and storage medium based on mobile terminal |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202011311206.8A CN112423139A (en) | 2020-11-20 | 2020-11-20 | Multi-machine live broadcast method, system, equipment and storage medium based on mobile terminal |
Publications (1)
Publication Number | Publication Date |
---|---|
CN112423139A true CN112423139A (en) | 2021-02-26 |
Family
ID=74777069
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202011311206.8A Pending CN112423139A (en) | 2020-11-20 | 2020-11-20 | Multi-machine live broadcast method, system, equipment and storage medium based on mobile terminal |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN112423139A (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113794844A (en) * | 2021-09-09 | 2021-12-14 | 北京字节跳动网络技术有限公司 | Free view video acquisition system, method, apparatus, server and medium |
CN115297338A (en) * | 2022-08-05 | 2022-11-04 | 深圳市野草声学有限公司 | Audio transmission method, video equipment, audio equipment and system during video shooting |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101093653A (en) * | 2007-06-26 | 2007-12-26 | 广东威创日新电子有限公司 | Device and method for multiuser interactive controlled jogged wall, and for retrieving information displayed on the jogged wall |
CN105338238A (en) * | 2014-08-08 | 2016-02-17 | 联想(北京)有限公司 | Photographing method and electronic device |
CN106231234A (en) * | 2016-08-05 | 2016-12-14 | 广州小百合信息技术有限公司 | The image pickup method of video conference and system |
CN106603912A (en) * | 2016-12-05 | 2017-04-26 | 科大讯飞股份有限公司 | Video live broadcast control method and device |
CN107431846A (en) * | 2015-12-04 | 2017-12-01 | 咖啡24株式会社 | Image transfer method, equipment and system based on multiple video cameras |
CN107948665A (en) * | 2017-11-27 | 2018-04-20 | 广州华多网络科技有限公司 | Switch prompting method, device and mobile terminal in net cast |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10645332B2 (en) | Subtitle displaying method and apparatus | |
US9791920B2 (en) | Apparatus and method for providing control service using head tracking technology in electronic device | |
CN106155311A (en) | AR helmet, AR interactive system and the exchange method of AR scene | |
WO2019134516A1 (en) | Method and device for generating panoramic image, storage medium, and electronic apparatus | |
CN105554430B (en) | A kind of video call method, system and device | |
CN104777991A (en) | Remote interactive projection system based on mobile phone | |
CN112312144B (en) | Live broadcast method, device, equipment and storage medium | |
US20240086043A1 (en) | Information display method and apparatus, electronic device, and storage medium | |
CN108156374A (en) | A kind of image processing method, terminal and readable storage medium storing program for executing | |
US11917329B2 (en) | Display device and video communication data processing method | |
CN112423139A (en) | Multi-machine live broadcast method, system, equipment and storage medium based on mobile terminal | |
US20150244984A1 (en) | Information processing method and device | |
CN111726561B (en) | Conference method, system, equipment and storage medium for different terminals and same account | |
WO2023143299A1 (en) | Message display method and apparatus, device, and storage medium | |
CN108108079A (en) | A kind of icon display processing method and mobile terminal | |
CN112799622A (en) | Application control method and device and electronic equipment | |
CN110086998B (en) | Shooting method and terminal | |
CN109739414B (en) | Picture processing method, mobile terminal and computer readable storage medium | |
US20230368338A1 (en) | Image display method and apparatus, and electronic device | |
CN112235510A (en) | Shooting method, shooting device, electronic equipment and medium | |
CN114125297B (en) | Video shooting method, device, electronic equipment and storage medium | |
WO2022199614A1 (en) | Interface input source switching method and apparatus, and electronic device | |
CN109495762A (en) | Data flow processing method, device and storage medium, terminal device | |
US20210377454A1 (en) | Capturing method and device | |
CN110941344B (en) | Method for obtaining gazing point data and related device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication ||
SE01 | Entry into force of request for substantive examination ||
WD01 | Invention patent application deemed withdrawn after publication | Application publication date: 20210226 |