CN117714730A - Multi-picture live broadcast method and device, electronic equipment and storage medium - Google Patents

Multi-picture live broadcast method and device, electronic equipment and storage medium

Info

Publication number
CN117714730A
CN117714730A
Authority
CN
China
Prior art keywords
picture
live broadcast
live
sub
main
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202311754420.4A
Other languages
Chinese (zh)
Inventor
程益君
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shuhang Technology Beijing Co ltd
Original Assignee
Shuhang Technology Beijing Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shuhang Technology Beijing Co ltd filed Critical Shuhang Technology Beijing Co ltd
Priority to CN202311754420.4A priority Critical patent/CN117714730A/en
Publication of CN117714730A publication Critical patent/CN117714730A/en
Pending legal-status Critical Current

Landscapes

  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)

Abstract

The embodiment of the application discloses a multi-picture live broadcast method and apparatus, an electronic device, and a storage medium. A live broadcast interface can be displayed, where the live broadcast interface includes a live main picture; in response to a multi-picture live event, a live picture formed by combining the live main picture and a live sub-picture is displayed on the live interface, where the live content in the live sub-picture is extracted from the live main picture. In this way, the audience can view the details of the live content while still watching the complete live picture, improving the viewing experience of the audience.

Description

Multi-picture live broadcast method and device, electronic equipment and storage medium
Technical Field
The application relates to the field of multi-picture live broadcast data processing, in particular to a multi-picture live broadcast method, a device, electronic equipment and a storage medium.
Background
In existing network live broadcast, the audience can only see the overall live content and cannot see local details of it, so the viewing experience of the audience is poor.
Disclosure of Invention
The embodiment of the application provides a multi-picture live broadcast method and apparatus, an electronic device, and a storage medium, which enable a viewer to view the details of the live content while watching the complete live picture, improving the viewing experience of the viewer.
In a first aspect, an embodiment of the present application provides a multi-screen live broadcast method, where the method is applied to a viewer end, and includes:
displaying a live broadcast interface, wherein the live broadcast interface comprises a live broadcast main picture;
responding to a multi-picture live event, and displaying a live picture formed by combining the live main picture and the live sub-picture on the live interface; the live broadcast content in the live broadcast sub-picture is extracted from the live broadcast main picture.
In a second aspect, an embodiment of the present application further provides a multi-picture live broadcast method, where the method is applied to a hosting end, and includes:
displaying a live broadcast interface of a target live broadcast room, wherein the live broadcast interface comprises a live broadcast main picture;
responding to a multi-picture live event, and displaying a live picture formed by combining the live main picture and the live sub-picture on the live interface; the live broadcast content in the live broadcast sub-picture is extracted from the live broadcast main picture.
In a third aspect, an embodiment of the present application further provides a multi-picture live broadcast apparatus, where the apparatus is applied to a viewer end, including:
the first display module is used for displaying a live broadcast interface, wherein the live broadcast interface comprises a live broadcast main picture;
the second display module is used for responding to a multi-picture live event and displaying live pictures after the combination of the live main picture and the live sub-picture on the live interface; the live broadcast content in the live broadcast sub-picture is extracted from the live broadcast main picture.
In a fourth aspect, an embodiment of the present application further provides a multi-picture live broadcast device, where the device is applied to a hosting end, and includes:
the third display module is used for displaying a live broadcast interface of the target live broadcast room, wherein the live broadcast interface comprises a live broadcast main picture;
the fourth display module is used for responding to a multi-picture live event and displaying the live picture after the combination of the live main picture and the live sub picture on the live interface; the live broadcast content in the live broadcast sub-picture is extracted from the live broadcast main picture.
In a fifth aspect, the embodiments of the present application further provide an electronic device, including a processor and a memory, where the memory stores a computer program which, when executed by the processor, causes the processor to perform the steps of any one of the multi-picture live broadcast methods provided in the embodiments of the present application.
In a sixth aspect, embodiments of the present application further provide a computer-readable storage medium including a computer program, where the computer program is configured to, when run on an electronic device, cause the electronic device to perform the steps of any of the multi-picture live broadcast methods provided in the embodiments of the present application.
In a seventh aspect, embodiments of the present application also provide a computer program product comprising a computer program stored in a computer readable storage medium; when the processor of the electronic device reads the computer program from the computer readable storage medium, the processor executes the computer program, so that the electronic device performs the steps of any of the multi-screen live broadcast methods provided in the embodiments of the present application.
By adopting the scheme of the embodiment of the application, a live broadcast interface can be displayed, where the live broadcast interface includes a live main picture; in response to a multi-picture live event, a live picture formed by combining the live main picture and a live sub-picture is displayed on the live interface, where the live content in the live sub-picture is extracted from the live main picture. In this way, the audience can view the details of the live content while still watching the complete live picture, improving the viewing experience of the audience.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are needed in the description of the embodiments will be briefly introduced below, it being obvious that the drawings in the following description are only some embodiments of the present application, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
Fig. 1 is a schematic diagram of a system architecture of a multi-screen live broadcast method application provided in an embodiment of the present application;
FIG. 2 is a schematic diagram of data interaction between a host and a server according to an embodiment of the present application;
FIG. 3 is a schematic diagram of data interaction between a viewer side and a server side according to an embodiment of the present application;
fig. 4 is a schematic diagram of data interaction between a hosting side, a live server side, a cloud server side and a viewer side provided in an embodiment of the present application;
FIG. 5 is a flowchart of an embodiment of a multi-screen live broadcast method applied to a viewer terminal according to an embodiment of the present application;
FIG. 6 is a schematic diagram of a live interface provided in an embodiment of the present application;
FIG. 7 is a schematic diagram of a live interface with a sprite region selection box according to an embodiment of the present disclosure;
fig. 8 is a schematic diagram of a region framed by a sprite region selection frame on a live main picture according to an embodiment of the present application;
FIG. 9 is another schematic illustration of a live interface provided in an embodiment of the present application;
FIG. 10 is a schematic diagram of a parameter setting page according to an embodiment of the present disclosure;
FIG. 11 is a flowchart of an embodiment of a multi-picture live broadcast method applied to an anchor end provided in an embodiment of the present application;
Fig. 12 is a schematic structural diagram of a multi-screen live broadcast device applied to a viewer end according to an embodiment of the present application;
fig. 13 is a schematic structural diagram of a multi-screen live broadcast device applied to a hosting end according to an embodiment of the present application;
fig. 14 is a schematic structural diagram of an electronic device provided in an embodiment of the present application.
Detailed Description
The following description of the embodiments of the present application will be made clearly and fully with reference to the accompanying drawings, in which it is evident that the embodiments described are only some, but not all, of the embodiments of the present application. All other embodiments, which can be made by those skilled in the art based on the embodiments herein without making any inventive effort, are intended to be within the scope of the present application.
Meanwhile, in the description of the embodiments of the present application, the terms "first", "second", and the like are used only to distinguish the description, and are not to be construed as indicating or implying relative importance. Thus, a feature defining "a first" or "a second" may explicitly or implicitly include one or more features. In the description of the embodiments of the present application, the meaning of "a plurality" is two or more, unless explicitly defined otherwise.
The embodiment of the application provides a multi-picture live broadcast method, a multi-picture live broadcast device, electronic equipment and a computer readable storage medium.
The multi-picture live broadcast method provided by the embodiment of the application can be applied to a system architecture 100. Referring to fig. 1, as shown in fig. 1, a system architecture 100 may include terminal devices 101, 102, 103, a network 104, and a server 105. The network 104 is used as a medium to provide communication links between the terminal devices 101, 102, 103 and the server 105. The network 104 may include various connection types, such as wired, wireless communication links, or fiber optic cables, among others.
The user may interact with the server 105 via the network 104 using the terminal devices 101, 102, 103 to receive or send messages or the like.
The terminal devices 101, 102, 103 may be various electronic devices having a display screen and supporting web browsing, including but not limited to smartphones, tablet computers, electronic book readers, MP3 (Moving Picture Experts Group Audio Layer III) players, MP4 (Moving Picture Experts Group Audio Layer IV) players, laptop and desktop computers, and the like. A terminal may be a device that includes both receive and transmit hardware, i.e., a device having receive and transmit hardware capable of performing bi-directional communications over a bi-directional communication link. The terminal and the server may communicate bi-directionally over a network.
The server 105 may be a server providing various services, such as a background server providing support for pages displayed on the terminal devices 101, 102, 103. The server may be a stand-alone server, or may be a server network or a server cluster of servers, including but not limited to a computer, a network host, a single network server, a set of multiple network servers, or a cloud server of multiple servers. Wherein the Cloud server is composed of a large number of computers or web servers based on Cloud Computing (Cloud Computing).
It should be noted that, the multi-picture live broadcast method provided by the embodiment of the present application is executed by a server, and accordingly, the multi-picture live broadcast device is disposed in the server.
It should be understood that the number of terminal devices, networks and servers in fig. 1 is merely illustrative. Any number of terminal devices, networks and servers may be provided according to implementation requirements, and the terminal devices 101, 102 and 103 in the embodiments of the present application may specifically correspond to application systems in actual production.
As shown in fig. 2, fig. 2 is a schematic diagram of data interaction between a host and a server. The main broadcasting end can display a live broadcasting interface of the target live broadcasting room, the live broadcasting interface comprises a live broadcasting main picture, and a sub-picture area selection frame is displayed in the main broadcasting end in response to sub-picture selection operation of the live broadcasting main picture of the main broadcasting end; responding to a sub-picture confirmation operation of a live broadcast main picture of a main broadcasting terminal, and determining a region selected by the sub-picture region selection frame on the live broadcast main picture as a sub-picture region by the main broadcasting terminal; responding to a multi-picture live broadcast event, and transmitting the currently shot live broadcast main picture, the position information of the sub-picture area and the target sub-picture parameter information to a server; triggering the server to acquire an initial sub-picture from the live broadcast main picture which is currently shot based on the position information of the sub-picture area; processing the initial sub-picture based on the target sub-picture parameter information to obtain a live sub-picture; and fusing the live broadcast main picture and the live broadcast sub-picture to obtain a fused live broadcast picture, and controlling the target live broadcast room to display the fused live broadcast picture on a live broadcast interface of the main broadcast end and the audience end.
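The anchor-to-server interaction described above can be sketched as a hypothetical request payload carrying the position information of the sub-picture area and the target sub-picture parameter information. All class and field names here are illustrative assumptions, not the patent's actual protocol.

```python
from dataclasses import dataclass

@dataclass
class SubPictureRegion:
    # position information of the sub-picture area on the live main picture, in pixels
    x: int
    y: int
    width: int
    height: int

@dataclass
class SubPictureParams:
    # target sub-picture parameter information (shape, scaling, effect)
    shape: str = "rectangle"   # shape parameter of the sub-picture region
    scale: float = 2.0         # scaling parameter for the region content
    effect: str = "none"       # effect information for the region content

@dataclass
class MultiPictureRequest:
    # what the anchor end might send to the server for one multi-picture live event
    room_id: str               # target live room
    region: SubPictureRegion   # where the sub-picture area sits on the main picture
    params: SubPictureParams   # how the server should process the initial sub-picture

req = MultiPictureRequest(
    room_id="room-1",
    region=SubPictureRegion(x=40, y=30, width=160, height=120),
    params=SubPictureParams(scale=2.0),
)
```

The server would use `region` to extract the initial sub-picture from the currently shot main picture and `params` to process it into the live sub-picture before fusing.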
It should be noted that fig. 2 is only a schematic diagram of data interaction between a host and a server, and other data interaction possibilities are not excluded, which is not limited in this embodiment.
As shown in fig. 3, fig. 3 is a schematic diagram of data interaction between the viewer end and the server end. The viewer end accesses the target live room on the server end and, in response to a trigger operation on the target live room, obtains the fused live picture from the server end, where the fused live picture is generated by the multi-picture live broadcast method applied to the anchor end; the fused live picture is then displayed on the live interface of the viewer end.
It can be appreciated that the number of live sub-pictures may be one or more. The live main picture and the live sub-picture of the target live room are merged on the server end, and the viewer end sees the merged, fused live picture.
As shown in fig. 4, fig. 4 is a schematic diagram of interaction between a main broadcasting end, a cloud server end, a live broadcasting server end and a viewer end, where the main broadcasting end sends a live broadcasting main picture and a live broadcasting sub-picture to the live broadcasting server end, and the live broadcasting main picture and the live broadcasting sub-picture are sent to the cloud server end through the live broadcasting server end, and the live broadcasting main picture and the live broadcasting sub-picture are rendered at the cloud server end to obtain a combined live broadcasting picture. And the cloud server side sends the combined live broadcast picture to the live broadcast server side so that the live broadcast server side returns the combined live broadcast picture to the anchor side. And the live broadcast server side receives an access request of the audience side to a target live broadcast room of the anchor side and sends the combined live broadcast picture to the audience side.
In the schematic diagram, the process of rendering the live broadcast main picture and the live broadcast sub-picture at the cloud server side to obtain the combined live broadcast picture may be: and receiving the live broadcast main picture and parameters corresponding to the sub-picture area selected by the main broadcasting terminal when the main broadcasting terminal is live broadcast in real time, wherein the parameters corresponding to the sub-picture area can comprise shape parameters of the sub-picture area, scaling parameters of the content of the sub-picture area and effect information of the content of the sub-picture area. And extracting the live sub-picture from the live main picture according to the parameters corresponding to the selected sub-picture area, and processing the live sub-picture so that the processed live sub-picture is overlapped on the live main picture to obtain the combined live picture. For example, an image frame of an area where a sub-picture area is located is extracted from a live-broadcast main picture, and the content in the image frame is enlarged, wherein the size of the image frame is unchanged, but the content of the area where the image frame is located is enlarged, so that a live-broadcast sub-picture is generated.
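The extraction-and-enlargement step described above (the image frame's size is unchanged while its content is enlarged) can be sketched in pure Python on a 2-D pixel grid. This is a minimal nearest-neighbor illustration under assumed coordinates, not the patent's actual rendering code.

```python
def crop(frame, x, y, w, h):
    """Extract the image frame of the sub-picture area from the main picture."""
    return [row[x:x + w] for row in frame[y:y + h]]

def enlarge_within_frame(region, scale):
    """Enlarge the content while keeping the frame size unchanged: sample a
    smaller central window of the region and stretch it back to the original
    frame size using nearest-neighbor interpolation."""
    h, w = len(region), len(region[0])
    win_h, win_w = max(1, int(h / scale)), max(1, int(w / scale))
    off_y, off_x = (h - win_h) // 2, (w - win_w) // 2
    return [
        [region[off_y + int(r * win_h / h)][off_x + int(c * win_w / w)]
         for c in range(w)]
        for r in range(h)
    ]
```

For example, cropping a 4x4 area and calling `enlarge_within_frame(region, 2.0)` yields a sub-picture that is still 4x4 but shows only the central 2x2 window of the cropped content, magnified.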
The following describes each of the above in detail with reference to the drawings. The execution subject in this embodiment is an electronic device capable of playing live broadcasts. The order of description of the following embodiments is not intended to limit the preferred embodiments. Although a logical order is depicted in the flowchart, in some cases the steps shown or described may be performed in an order different from that depicted in the figures.
Referring to fig. 5, the multi-picture live broadcast method is applied to a viewer, and the specific flow of the multi-picture live broadcast method may be as follows steps 201 to 202, where:
step 201, displaying a live broadcast interface, wherein the live broadcast interface comprises a live broadcast main picture.
In the embodiment of the application, the live interface comprises a live screen area and other areas. The live broadcast picture area at least comprises a live broadcast main picture. The live broadcast main picture is a live broadcast picture displayed on the audience side of the live broadcast main picture of the target live broadcast room. The other areas at least comprise an operation area which can be performed by the audience during live broadcast watching, for example, a control area for the audience to perform interactive operation with the anchor. Other areas may also include interactive display areas, such as display areas for chat interaction by spectators with the anchor.
In the embodiment of the application, the live interface is displayed in response to a viewing operation of the viewer end on the target live room. For example, the viewer clicks on the target live room at the viewer end and enters it, and the live interface of the target live room is displayed. For another example, keywords related to the target live room are input at the viewer end, the target live room is searched for according to the keywords, and the live interface of the target live room is displayed in response to the viewing operation of the viewer end on the target live room.
In the embodiment of the present application, displaying the live interface may further be displaying an initial live interface or a default live interface of the application in response to an opening operation of the application, for example, opening an application, and displaying the live interface on a live bar of the application.
It can be understood that the specific operation of displaying the live interface can be set according to the actual situation, and the embodiment of the application is not limited.
Referring to fig. 6, fig. 6 is a schematic diagram of a live interface provided in an embodiment of the present application, which may be a schematic diagram of the live interface at the viewer end. As shown in fig. 6, 301 in fig. 6 is the live main picture in the live picture area, and 302 in fig. 6 is the control area among the other areas, where the control area is used for the viewer to perform interactive operations.
And 202, responding to a multi-picture live event, and displaying a live picture formed by combining a live main picture and a live sub-picture on a live interface. The live broadcast content in the live broadcast sub-picture is extracted from the live broadcast main picture.
In the embodiment of the present application, the live sub-picture is a part of pictures in the live main picture of the target live room, and the part of pictures are pictures capable of displaying the detail content of the live main picture at the viewer end.
Referring to fig. 6, on the live view area of the live view interface, the live view area further includes a live sub-view, where 303 in fig. 6 is a live sub-view in the live view area.
In this embodiment of the present application, a process of displaying a live broadcast picture after a live broadcast main picture and a live broadcast sub picture are combined on a live broadcast interface may be: and acquiring the live broadcast picture after the fusion of the live broadcast main picture and the live broadcast sub picture. And displaying the fused live broadcast picture on the live broadcast interface.
The server end crops the content of the sub-picture area from the live main picture according to the sub-picture area selected by the anchor end and the target parameter information of the sub-picture area, processes the content of the sub-picture area based on the target parameter information to generate the live sub-picture, and merges the live main picture and the live sub-picture; the viewer end obtains the merged live picture and displays it on the live interface.
In this embodiment of the present application, a process of displaying a live broadcast picture after a live broadcast main picture and a live broadcast sub picture are combined on a live broadcast interface may further be: and acquiring the live broadcast main picture and the live broadcast sub-picture. And displaying a live broadcast picture on the live broadcast interface, wherein the live broadcast picture comprises a main picture layer and a sub picture layer which are displayed in a superimposed mode, the main picture layer is used for displaying a live broadcast main picture, and the sub picture layer is used for displaying a live broadcast sub picture.
The server end crops the content of the sub-picture area from the live main picture according to the sub-picture area selected by the anchor end and the target parameter information of the sub-picture area, and processes the content of the sub-picture area based on the target parameter information to generate the live sub-picture; the live main picture is displayed on the main picture layer of the live interface, the live sub-picture is displayed on the sub-picture layer of the live interface, and the main picture layer is arranged below the sub-picture layer.
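The layered display described above can be sketched as a simple composition in which the sub-picture layer sits on top of the main-picture layer and replaces main-picture pixels within its bounds. This is a minimal illustration, assuming opaque pixels and a top-left anchor position.

```python
def composite(main, sub, pos_x, pos_y):
    """Overlay the sub-picture layer on the main-picture layer: every pixel
    inside the sub-picture bounds replaces the main-picture pixel beneath it.
    Pixels falling outside the main picture are discarded."""
    out = [row[:] for row in main]  # the main layer sits below the sub layer
    for r, row in enumerate(sub):
        for c, px in enumerate(row):
            yy, xx = pos_y + r, pos_x + c
            if 0 <= yy < len(out) and 0 <= xx < len(out[0]):
                out[yy][xx] = px
    return out
```

The copy of `main` keeps the original main-picture layer intact, which matches the two-layer model: the layers are held separately and only the displayed result is fused.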
In this embodiment of the present application, a multi-picture live event refers to an event of extracting one or more live sub-pictures from a live main picture, and displaying a live picture after the live main picture and the one or more live sub-pictures are combined on a live interface.
In the embodiment of the application, the trigger of a multi-picture live event includes, but is not limited to: a live sub-picture selection event at the anchor end, a target object being identified in the live main picture, and a multi-picture live event initiated by the viewer end for the target live room.
In this embodiment of the present application, displaying, on a live interface, a live broadcast picture after a live broadcast main picture and a live broadcast sub-picture are combined, including:
and responding to the live sub-picture selection event of the main broadcasting terminal, and displaying the live picture after the combination of the live main picture and the live sub-picture on the live interface of the audience terminal.
In this embodiment, the live sprite selection event of the anchor refers to a live sprite selection event initiated by the anchor, and specific content may be seen in the following multi-picture live broadcast method applied to the anchor.
In this embodiment of the present application, displaying, on a live interface, a live broadcast picture after a live broadcast main picture and a live broadcast sub-picture are combined, including:
in response to identifying the target object in the live home screen, determining to trigger a multi-screen live event.
And displaying the live broadcast picture after the combination of the live broadcast main picture and the live broadcast sub-picture on the live broadcast interface.
In the embodiment of the present application, the target object refers to an object that triggers a multi-screen live event. For example, the target object is a live commodity, and when the live commodity corresponding to the current live link is identified, a multi-picture live event of the identified object is triggered.
In the embodiment of the application, the live sprite includes a target object. After the target object is identified, triggering a multi-picture live event, locking the identified target object in the live main picture, and constructing a live sub-picture for the target object.
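Constructing a live sub-picture region around an identified target object might look like the following sketch, assuming a detector supplies an (x, y, width, height) bounding box; the margin and clamping behavior are illustrative assumptions, not the patent's specified method.

```python
def region_from_detection(box, main_w, main_h, margin=10):
    """Turn a detected target-object bounding box into a sub-picture region:
    pad the box by a margin so the object is not cut off, then clamp the
    result to the bounds of the live main picture."""
    x, y, w, h = box
    x0, y0 = max(0, x - margin), max(0, y - margin)
    x1, y1 = min(main_w, x + w + margin), min(main_h, y + h + margin)
    return (x0, y0, x1 - x0, y1 - y0)
```

Locking onto the object would then amount to re-running this mapping on each new detection so the sub-picture region follows the object across frames.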
In this embodiment of the present application, displaying, on a live interface, a live broadcast picture after a live broadcast main picture and a live broadcast sub-picture are combined, including:
And in response to the sub-picture selection operation of the target live broadcasting room, displaying a sub-picture region selection frame on the live broadcasting interface.
And in response to a sub-picture confirmation operation on the live main picture, determining the region framed by the sub-picture region selection frame on the live main picture as a sub-picture region.
And responding to the confirmation operation of the main broadcasting end to the sub-picture area corresponding to the target live broadcasting room, and displaying the live broadcasting picture after the combination of the live broadcasting main picture and the live broadcasting sub-picture on the live broadcasting interface.
In the embodiment of the present application, the sprite refers to a partial picture in the live main picture. For example, when the live main picture includes a vase, a flower, and the sub-picture may be a picture about the flower in the live main picture. For another example, when the live main screen includes a live commodity, the sub-screen may be a local screen of the live commodity.
In the embodiment of the application, the sub-picture selection operation in response to the target live room may be a trigger operation in response to a sub-picture selection control of the live main picture, and the trigger operation is taken as the sub-picture selection operation. The sub-picture selection operation of the target live broadcasting room can be also the function triggering operation of the live broadcasting main picture, the live broadcasting function page is displayed, and the triggering operation of the sub-picture selection function in the live broadcasting function page is used as the sub-picture selection operation. In response to the sub-picture selection operation of the target live broadcasting room, the sub-picture selection operation can be adjusted according to actual conditions, and the embodiment of the application is not particularly limited.
In the embodiment of the present application, the sub-picture region selection frame refers to a selection frame for determining a sub-picture region from the live main picture. The sub-picture region selection frame may be an overlay displayed on the live interface, or an overlay displayed only on the live main picture; that is, it is a floating layer or window covering at least a portion of the picture content of the live main picture on the live interface, and is used to frame-select the sub-picture region on the live main picture.
In an embodiment of the present application, displaying the sub-picture region selection box may be: in response to a sub-picture selection operation on the live main picture, displaying an overlay layer on the live interface, where the overlay layer is a floating layer or a window covering at least a portion of the picture content of the live main picture on the live interface.
In this embodiment of the present application, the display sub-screen region selection box may further display a live-broadcast main-screen display page including only the live-broadcast main screen in response to a sub-screen selection operation on the live-broadcast main screen, where an overlay layer is displayed on the live-broadcast main-screen display page, where the overlay layer is a floating layer or a window that covers at least a portion of screen content of the live-broadcast main screen on the live-broadcast interface.
Referring to fig. 7, fig. 7 is a schematic diagram of a live broadcast interface with a sprite area selection frame displayed in the embodiment of the present application, and in the live broadcast interface shown in fig. 7, 304 in fig. 7 is a sprite area selection frame on the live broadcast interface. It can be understood that the sub-picture region selection frame is only an exemplary selection frame, and the shape, display form and display position of the specific sub-picture selection frame can be adjusted according to practical situations, which is not limited in the embodiments of the present application.
In the embodiment of the present application, the sub-picture confirmation operation in response to the live broadcast main picture may be a confirmation operation on a sub-picture region selection frame displayed on the live broadcast interface, or a confirmation operation on a sub-picture region selection frame displayed on a presentation page containing only the live broadcast main picture. The sub-picture confirmation operation can be adjusted according to the specific page; the embodiments of the present application are not limited.
In the embodiment of the present application, the sub-picture region selection frame is a closed selection frame, that is, it has a boundary. Determining the area framed by the sub-picture region selection frame on the live broadcast main picture as the sub-picture region refers to taking the area enclosed by the boundary of the selection frame as the sub-picture region.
Referring to fig. 8, fig. 8 is a schematic diagram of an area framed on the live broadcast main picture by the sub-picture region selection frame provided in an embodiment of the present application; as shown in fig. 8, 305 is the area framed on the live broadcast main picture by the sub-picture region selection frame.
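As a rough sketch of how the framed area could be turned into a sub-picture region (the function name and the top-left pixel coordinate convention are assumptions, not taken from the patent), the selection frame's rectangle can be clamped to the bounds of the live main picture:

```python
def clamp_selection(frame_x, frame_y, frame_w, frame_h, main_w, main_h):
    """Clamp a rectangular sub-picture region selection frame to the bounds
    of the live main picture; coordinates are pixels from the top-left."""
    x0 = max(0, min(frame_x, main_w))
    y0 = max(0, min(frame_y, main_h))
    x1 = max(x0, min(frame_x + frame_w, main_w))
    y1 = max(y0, min(frame_y + frame_h, main_h))
    return (x0, y0, x1 - x0, y1 - y0)   # sub-picture region as (x, y, w, h)

# A selection frame hanging off the right edge of a 1280x720 main picture
# is clipped to its visible part.
region = clamp_selection(1200, 100, 200, 150, 1280, 720)
```

Clamping keeps the sub-picture region a valid crop rectangle even when the user drags the selection frame partly outside the live main picture.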
In this embodiment of the present application, the viewer end may apply to capture a sub-picture: after the viewer end selects the sub-picture position, it may notify the anchor end by means of a network request, a persistent (long-lived) connection, or the like. After the anchor end signals its agreement, the picture at the sub-picture position selected by the viewer end is pushed to the requesting viewer end for display.
It can be understood that when a viewer end initiates a request to capture a sub-picture and the anchor end agrees, the fused picture is displayed only on the viewer end that initiated the request; it is not displayed on other viewer ends or on the anchor end.
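The request-and-approval handshake described above can be modeled minimally as follows; the class and field names are illustrative assumptions, since the patent only states that the viewer end notifies the anchor end via a network request or persistent connection:

```python
from dataclasses import dataclass, field

@dataclass
class CaptureRequest:
    viewer_id: str
    region: tuple            # (x, y, w, h) sub-picture position chosen by the viewer

@dataclass
class AnchorEnd:
    approved: list = field(default_factory=list)

    def handle(self, request: CaptureRequest, agree: bool) -> bool:
        # Only after the anchor end agrees is the fused picture pushed
        # back to the requesting viewer; other viewers are unaffected.
        if agree:
            self.approved.append(request.viewer_id)
        return agree

anchor = AnchorEnd()
granted = anchor.handle(CaptureRequest("viewer-42", (100, 80, 320, 240)), agree=True)
```

In a real system the request and the agreement signal would travel over the network; here the in-memory call stands in for that round trip.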
Referring to fig. 6, 303 in fig. 6 is a schematic diagram of displaying the live sub-picture in the live picture area, where the live sub-picture has settable position information, and the processed live sub-picture is displayed according to its display position information on the live broadcast main picture.
Referring to fig. 9, 306 in fig. 9 is another schematic diagram of displaying the live sub-picture in the live picture area: the live sub-picture covers the live broadcast main picture at the position corresponding to the position information of the sub-picture region in the live broadcast main picture.
In this embodiment of the present application, displaying, on the live broadcast interface, the live broadcast picture after the live broadcast main picture and the live broadcast sub-picture are combined includes:
and responding to the parameter setting operation of the sub-picture area selection frame, displaying a parameter setting page, wherein the parameter setting page comprises at least one sub-picture parameter setting control.
And responding to the setting operation of the sub-picture parameter setting control, and acquiring target sub-picture parameter information set for the sub-picture area.
And in response to a confirmation operation on the parameter setting page, updating the sub-picture parameter information of the sub-picture area based on the target sub-picture parameter information.
In the embodiment of the present application, the parameter setting page is a page for setting the sub-picture parameters of the sub-picture region. The sub-picture parameters include, but are not limited to, a shape parameter of the sub-picture region, a scaling parameter of the content of the sub-picture region, and effect information of the content of the sub-picture region. The specific content of the sub-picture parameters can be adjusted according to the actual situation; the embodiments of the present application are not limited.
In this embodiment of the present application, the parameter setting page may be displayed on the live broadcast interface in response to a parameter setting operation on the sub-picture region selection frame, or may be reached by jumping from the live broadcast interface in response to that operation. This can be adjusted according to the actual situation; the embodiments of the present application are not limited.
In this embodiment of the present application, the parameter setting page further displays, via the sub-picture parameter setting control, the initial sub-picture parameters set for the sub-picture region, where the sub-picture parameter setting control is configured to receive one or more sub-picture parameters, and the sub-picture parameters include a shape parameter of the sub-picture region, a scaling parameter of the content of the sub-picture region, and effect information of the content of the sub-picture region.
In an embodiment of the present application, the sub-picture parameter setting control corresponding to the shape parameter of the sub-picture region may provide at least two candidate shapes for the user to select, or may allow the user to define a custom shape for the sub-picture region. The specific content may be adjusted according to the actual situation; the embodiments of the present application are not limited.
In an embodiment of the present application, the sub-picture parameter setting control corresponding to the scaling parameter of the content of the sub-picture region may be a scale bar for adjusting that scaling parameter, or may include an edit box for receiving a scaling parameter entered by the user for the content of the sub-picture region. The specific content may be adjusted according to the actual situation; the embodiments of the present application are not limited.
In the embodiment of the present application, the sub-picture parameter setting control corresponding to the effect information of the content of the sub-picture region may provide at least two candidate effects for the user to select, or may allow the user to define a custom effect for the sub-picture region. The specific content may be adjusted according to the actual situation; the embodiments of the present application are not limited.
Referring to fig. 10, fig. 10 is a schematic diagram of a parameter setting page provided in an embodiment of the present application, where the parameter setting page includes a zoom parameter, a shape parameter, and a special-effect parameter. In response to a trigger operation on the sub-picture parameter setting control for the zoom parameter of the content of the sub-picture region, a scale bar (other shapes are also possible) as shown at 307 in fig. 10 is displayed, and the user can set the zoom parameter of the content of the sub-picture region by moving the dot at 307 in fig. 10.
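A scale bar like 307 in fig. 10 ultimately maps a slider position to a zoom factor; one possible linear mapping (the zoom range and function name here are assumptions for illustration) is:

```python
def slider_to_zoom(position, min_zoom=0.5, max_zoom=4.0):
    """Map a scale-bar position in [0, 1] to a zoom factor for the content
    of the sub-picture region (the zoom range is an illustrative assumption)."""
    position = max(0.0, min(1.0, position))   # clamp the dot to the bar
    return min_zoom + position * (max_zoom - min_zoom)

zoom = slider_to_zoom(0.5)   # the dot at mid-bar
```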
In the embodiment of the present application, the target sub-picture parameter information refers to the values set for the sub-picture parameters of the sub-picture region. For example, when the sub-picture parameter is a scaling ratio, the target sub-picture parameter information may be 80%, or a magnification of 10 units.
In the embodiment of the application, in response to the sub-picture selection operation on the live broadcast main picture, a multi-picture live broadcast confirmation control is displayed on the live broadcast interface.
It can be appreciated that a trigger operation on the multi-picture live broadcast confirmation control triggers the viewer end to select one or more live sub-pictures. The viewer end can also be triggered to select one or more live sub-pictures by one or more sub-picture selection operations on the live broadcast main picture, or by one or more confirmation operations on the parameter setting page.
In the embodiment of the application, according to the target sub-picture parameter information, the sub-picture intercepted from the sub-picture region can be subjected to parameter processing to obtain the content of the live sub-picture.
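A pure-Python sketch of this parameter processing (the function names and the nearest-neighbour scaling choice are assumptions): crop the sub-picture region out of a frame, then apply the scaling parameter:

```python
def crop(frame, x, y, w, h):
    """Cut the sub-picture region out of a frame given as rows of pixels."""
    return [row[x:x + w] for row in frame[y:y + h]]

def scale(picture, factor):
    """Nearest-neighbour scaling of a cropped sub-picture by `factor`."""
    src_h, src_w = len(picture), len(picture[0])
    dst_h, dst_w = max(1, int(src_h * factor)), max(1, int(src_w * factor))
    return [[picture[min(src_h - 1, int(j / factor))][min(src_w - 1, int(i / factor))]
             for i in range(dst_w)] for j in range(dst_h)]

frame = [[(x, y) for x in range(8)] for y in range(6)]  # toy 8x6 "frame" of (x, y) pixels
sub = scale(crop(frame, 2, 1, 4, 2), 2.0)               # 4x2 region at (2, 1), 2x zoom
```

A production implementation would use an image or video library rather than nested lists, but the two steps — intercept, then process by the target parameters — are the same.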
In an embodiment of the present application, the multi-picture live broadcast method further includes: scaling the live sub-picture in response to a scaling operation on the live sub-picture.
In an embodiment of the present application, the multi-picture live broadcast method further includes: and responding to the triggering operation of the live broadcast sub-picture, and displaying a purchase page of the displayed object in the live broadcast sub-picture. In response to a purchase operation on the purchase page, the display item is purchased.
Or, in response to the quick purchasing operation of the live sprite, displaying an order area of the item displayed by the live sprite in the live interface. In response to a purchase operation on the order area, the display item is purchased.
By adopting the scheme of the embodiment of the application, a live broadcast interface can be shown, wherein the live broadcast interface comprises a live broadcast main picture; responding to a multi-picture live event, and displaying a live picture formed by combining the live main picture and the live sub-picture on the live interface; the live broadcast content in the live broadcast sub-picture is extracted from the live broadcast main picture, so that the audience can watch the complete live broadcast picture while watching the details of the live broadcast content, and the watching experience of the audience is improved.
Referring to fig. 11, the multi-picture live broadcast method is applied to the anchor end, and the specific flow of the multi-picture live broadcast method may be as in the following steps 401 to 402, where:
step 401, displaying a live interface of a target live room, wherein the live interface comprises a live main picture.
In the embodiment of the application, the target live broadcast room is the live broadcast room where the anchor end is located. In the target live broadcast room, the anchor can live broadcast through a single live broadcast device or through a plurality of live broadcast devices, at least one of which is used for multi-picture live broadcast. That is, the anchor can realize multi-picture live broadcast through a single live broadcast device that supports it, or through a plurality of live broadcast devices of which at least one realizes multi-picture live broadcast. The live broadcast device can be an intelligent terminal such as a PC, a mobile phone or a tablet, or a video acquisition device (such as a camera) with a video sending function.
In the embodiment of the application, the live broadcast interface comprises a live picture area and other areas. The live picture area at least comprises the live broadcast main picture. The live broadcast main picture is the live picture that the anchor end presents to the viewer side. The other areas at least include an operation area in which the anchor can perform operations during the live broadcast, such as a control area for interaction between the anchor and the viewers. The other areas may also include an interactive display area, such as a display area for chat interaction between the anchor and the viewers.
Referring to fig. 6, fig. 6 is a schematic diagram of a live interface according to an embodiment of the present application. Fig. 6 may be a schematic diagram of a live interface in a target live room, as shown in fig. 6, taking the live view area in fig. 6 as an example, and 301 in fig. 6 is a live home view in the live view area. Taking the other areas in fig. 6 as an example, 302 in fig. 6 is a control area in the other areas, where the control area is used for the anchor to perform interactive operations.
Step 402, responding to the multi-picture live event, and displaying a live picture formed by combining the live main picture and the live sub-picture on the live interface. The live broadcast content in the live broadcast sub-picture is extracted from the live broadcast main picture.
In the embodiment of the present application, the live sub-picture is a part of pictures in the live main picture of the target live room, and the part of pictures are pictures capable of displaying the detail content of the live main picture at the main broadcasting end.
Referring to fig. 6, in the live picture area of the live broadcast interface of the anchor end, the live picture area further includes a live sub-picture; for example, 303 in fig. 6 is a live sub-picture in the live picture area.
In this embodiment of the present application, a multi-picture live event refers to an event of extracting one or more live sub-pictures from a live main picture, and displaying a live picture after the live main picture and the one or more live sub-pictures are combined on a live interface.
In an embodiment of the present application, in response to a multi-picture live event, displaying a live picture after a live main picture and a live sub picture are combined on a live interface, including:
in response to a sub-screen selection operation for the live main screen, a sub-screen region selection frame is displayed.
And in response to a sub-picture confirmation operation on the live main picture, determining the region framed by the sub-picture region selection frame on the live main picture as a sub-picture region.
And displaying the live broadcast picture after the combination of the live broadcast main picture and the live broadcast sub-picture on the live broadcast interface. The live broadcast content in the live broadcast sub-picture is extracted from the live broadcast main picture based on the sub-picture area.
In the embodiment of the present application, the sub-picture refers to a partial picture within the live broadcast main picture. For example, when the live broadcast main picture includes a vase and a flower, the sub-picture may be the picture of the flower in the live broadcast main picture. For another example, when the live broadcast main picture includes a live commodity, the sub-picture may be a local picture of the live commodity.
In this embodiment of the present application, the sub-picture selection operation in response to the live broadcast main picture may be a trigger operation on a sub-picture selection control of the live broadcast main picture, with that trigger operation taken as the sub-picture selection operation. Alternatively, in response to a function trigger operation on the live broadcast main picture, a live function page may be displayed, and a trigger operation on the sub-picture selection function of that page is taken as the sub-picture selection operation. The sub-picture selection operation can be adjusted according to the actual situation; the embodiments of the present application are not particularly limited.
In the embodiment of the present application, the sub-picture region selection frame refers to a selection frame for determining a sub-picture region from the live broadcast main picture. The sub-picture region selection frame may be a cover layer displayed on the live broadcast interface, or a cover layer displayed only on the live broadcast main picture; in either case, it is a floating layer or window covering at least a portion of the picture content of the live broadcast main picture, and is used to frame-select the sub-picture region on the live broadcast main picture.
In an embodiment of the present application, displaying the sub-picture region selection frame may be: in response to a sub-picture selection operation on the live broadcast main picture, displaying an overlay layer on the live broadcast interface, where the overlay layer is a floating layer or window covering at least a portion of the picture content of the live broadcast main picture on the live broadcast interface.
In this embodiment of the present application, displaying the sub-picture region selection frame may also be: in response to a sub-picture selection operation on the live broadcast main picture, displaying a live-main-picture presentation page containing only the live broadcast main picture, and displaying an overlay layer on that page, where the overlay layer is a floating layer or window covering at least a portion of the picture content of the live broadcast main picture.
Referring to fig. 7, fig. 7 is a schematic diagram of a live broadcast interface with a sub-picture region selection frame displayed in the embodiment of the present application; in the live broadcast interface shown in fig. 7, 304 is the sub-picture region selection frame on the live broadcast interface. It can be understood that this selection frame is only exemplary; the shape, display form and display position of the sub-picture region selection frame can be adjusted according to the actual situation, which is not limited in the embodiments of the present application.
In the embodiment of the present application, the sub-picture confirmation operation in response to the live broadcast main picture may be a confirmation operation on a sub-picture region selection frame displayed on the live broadcast interface, or a confirmation operation on a sub-picture region selection frame displayed on a presentation page containing only the live broadcast main picture. The sub-picture confirmation operation can be adjusted according to the specific page; the embodiments of the present application are not limited.
In the embodiment of the present application, the sub-picture region selection frame is a closed selection frame, that is, it has a boundary. Determining the area framed by the sub-picture region selection frame on the live broadcast main picture as the sub-picture region refers to taking the area enclosed by the boundary of the selection frame as the sub-picture region.
Referring to fig. 8, fig. 8 is a schematic diagram of an area framed on the live broadcast main picture by the sub-picture region selection frame provided in an embodiment of the present application; as shown in fig. 8, 305 is the area framed on the live broadcast main picture by the sub-picture region selection frame.
In this embodiment of the present application, a multi-picture live event refers to an event in which one or more live sub-pictures are extracted from the live broadcast main picture, and the fused live picture corresponding to the live broadcast main picture and the live sub-picture is broadcast live.
In the embodiment of the present application, the location information of the sub-picture area refers to the location information of the sub-picture area on the live main picture. For example, the bottom left corner of the live-broadcast main picture is taken as the origin of a coordinate system, the length and the width of a rectangle where the live-broadcast main picture is located are taken as two coordinate axes of the coordinate system, the unit length is set on the coordinate axes, and the coordinate information of the sub-picture area in the coordinate system is the position information of the sub-picture area in the live-broadcast main picture.
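Because image buffers are usually addressed from the top-left while this example places the origin at the bottom-left of the live main picture, a coordinate conversion is needed before cropping; a minimal sketch (the function name is an assumption):

```python
def bottom_left_to_pixel(x, y, main_height):
    """Convert a point from the bottom-left-origin coordinate system of the
    live main picture to top-left-origin pixel coordinates (y axis flipped)."""
    return (x, main_height - y)

# A point 100 units above the bottom edge of a 720-pixel-tall main picture
# lands on pixel row 620.
px = bottom_left_to_pixel(50, 100, 720)
```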
In this embodiment of the present application, the location information of the sub-picture area may be default location information, for example, a region selected by the sub-picture area selection frame on the live main picture is used as the location information of the sub-picture area. The location information of the sub-picture area may also be set location information, for example, the sub-picture area is set on the parameter setting page and displayed at a certain location of the fused live-broadcast picture, where the certain location is the location information of the sub-picture area.
In this embodiment of the present application, it can be understood that, in response to a multi-picture live event, the fused live picture (live broadcast main picture plus live sub-picture) may be displayed simultaneously on the live broadcast interface of the anchor end and that of the viewer end, or only on the live broadcast interface of the viewer end. This may be adjusted according to the actual situation; the embodiments of the present application are not limited.
In the embodiment of the application, the live sub-picture can be overlaid on the live broadcast main picture according to the display position of the live sub-picture on the live broadcast main picture, so as to obtain the fused live picture. Alternatively, the live sub-picture can be overlaid on the live broadcast main picture at the position corresponding to the position information of the sub-picture region in the live broadcast main picture, so as to obtain the fused live picture.
Referring to fig. 6, 303 in fig. 6 is a schematic diagram of displaying the live sub-picture in the live picture area, where the live sub-picture has settable position information, and the processed live sub-picture is displayed according to its display position information on the live broadcast main picture.
Referring to fig. 9, 306 in fig. 9 is another schematic diagram of displaying the live sub-picture in the live picture area: the live sub-picture covers the live broadcast main picture at the position corresponding to the position information of the sub-picture region in the live broadcast main picture.
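A minimal compositing sketch of this covering step (pure Python, with assumed names): the live sub-picture is written over a copy of the main picture at the position given by the position information:

```python
def overlay(main, sub, x, y):
    """Cover the live main picture with the live sub-picture at (x, y);
    parts of the sub-picture falling outside the main picture are dropped."""
    fused = [row[:] for row in main]   # copy so the original main picture is kept
    for j, sub_row in enumerate(sub):
        if 0 <= y + j < len(fused):
            for i, pixel in enumerate(sub_row):
                if 0 <= x + i < len(fused[0]):
                    fused[y + j][x + i] = pixel
    return fused

main = [[0] * 6 for _ in range(4)]     # 6x4 main picture of background pixels
sub = [[1, 1], [1, 1]]                 # 2x2 live sub-picture
fused = overlay(main, sub, 4, 2)       # cover at position (4, 2)
```

The same operation is what a video pipeline performs per frame when producing the fused live picture.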
In an embodiment of the present application, in response to a multi-picture live event, before the live interface displays a live picture in which a live main picture and a live sub-picture are combined, the method further includes:
and in response to a parameter setting operation of the sub-picture region selection frame, displaying a parameter setting page, wherein the parameter setting page comprises at least one sub-picture parameter setting control.
And responding to the setting operation of the sub-picture parameter setting control, and acquiring target sub-picture parameter information set for the sub-picture area.
In response to the confirmation operation on the parameter setting page, the sub-picture parameter information of the sub-picture region is updated based on the target sub-picture parameter information.

In the embodiment of the present application, the parameter setting page is a page for setting the sub-picture parameters of the sub-picture region. The sub-picture parameters include, but are not limited to, a shape parameter of the sub-picture region, a scaling parameter of the content of the sub-picture region, and effect information of the content of the sub-picture region. The specific content of the sub-picture parameters can be adjusted according to the actual situation; the embodiments of the present application are not limited.
In this embodiment of the present application, the parameter setting page may be displayed on the live broadcast interface in response to a parameter setting operation on the sub-picture region selection frame, or may be reached by jumping from the live broadcast interface in response to that operation. This can be adjusted according to the actual situation; the embodiments of the present application are not limited.
In this embodiment of the present application, the parameter setting page further displays, via the sub-picture parameter setting control, the initial sub-picture parameters set for the sub-picture region, where the sub-picture parameter setting control is configured to receive one or more sub-picture parameters, and the sub-picture parameters include a shape parameter of the sub-picture region, a scaling parameter of the content of the sub-picture region, and effect information of the content of the sub-picture region.
In an embodiment of the present application, the sub-picture parameter setting control corresponding to the shape parameter of the sub-picture region may provide at least two candidate shapes for the user to select, or may allow the user to define a custom shape for the sub-picture region. The specific content may be adjusted according to the actual situation; the embodiments of the present application are not limited.
In an embodiment of the present application, the sub-picture parameter setting control corresponding to the scaling parameter of the content of the sub-picture region may be a scale bar for adjusting that scaling parameter, or may include an edit box for receiving a scaling parameter entered by the user for the content of the sub-picture region. The specific content may be adjusted according to the actual situation; the embodiments of the present application are not limited.
In the embodiment of the present application, the sub-picture parameter setting control corresponding to the effect information of the content of the sub-picture region may provide at least two candidate effects for the user to select, or may allow the user to define a custom effect for the sub-picture region. The specific content may be adjusted according to the actual situation; the embodiments of the present application are not limited.
Referring to fig. 10, fig. 10 is a schematic diagram of a parameter setting page provided in an embodiment of the present application, where the parameter setting page includes a zoom parameter, a shape parameter, and a special-effect parameter. In response to a trigger operation on the sub-picture parameter setting control for the zoom parameter of the content of the sub-picture region, a scale bar (other shapes are also possible) as shown at 307 in fig. 10 is displayed, and the user can set the zoom parameter of the content of the sub-picture region by moving the dot at 307 in fig. 10.
In the embodiment of the present application, the target sub-picture parameter information refers to the values set for the sub-picture parameters of the sub-picture region. For example, when the sub-picture parameter is a scaling ratio, the target sub-picture parameter information may be 80%, or a magnification of 10 units.
In the embodiment of the application, in response to the sub-picture selection operation on the live broadcast main picture, a multi-picture live broadcast confirmation control is displayed on the live broadcast interface.
The multi-picture live event includes any one of: a trigger operation on the multi-picture live broadcast confirmation control, a sub-picture selection operation on the live broadcast main picture, and a confirmation operation on the parameter setting page.
It can be understood that the anchor end is triggered to select one or more live sub-pictures by a trigger operation on the multi-picture live broadcast confirmation control. The anchor end can also be triggered to select one or more live sub-pictures by one or more sub-picture selection operations on the live broadcast main picture, or by one or more confirmation operations on the parameter setting page.
In the embodiment of the application, according to the target sub-picture parameter information, the sub-picture intercepted from the sub-picture region can be subjected to parameter processing to obtain the content of the live sub-picture.
In this embodiment of the present application, a process of displaying a live broadcast picture after a live broadcast main picture and a live broadcast sub picture are combined on a live broadcast interface may be: and acquiring the live broadcast picture after the fusion of the live broadcast main picture and the live broadcast sub picture. And displaying the fused live broadcast picture on the live broadcast interface.
In this embodiment of the present application, a process of displaying a live broadcast picture after a live broadcast main picture and a live broadcast sub picture are combined on a live broadcast interface may further be: and acquiring the live broadcast main picture and the live broadcast sub-picture. And displaying a live broadcast picture on the live broadcast interface, wherein the live broadcast picture comprises a main picture layer and a sub picture layer which are displayed in a superimposed mode, the main picture layer is used for displaying a live broadcast main picture, and the sub picture layer is used for displaying a live broadcast sub picture.
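The layered strategy above keeps the main picture and the sub-picture on separate layers and superimposes them at display time; a small sketch of such a layer stack (class and field names are assumptions):

```python
from dataclasses import dataclass

@dataclass
class Layer:
    name: str
    z_index: int              # higher values are drawn on top

def display_order(layers):
    """Return the layer names bottom-to-top, the order the interface draws them."""
    return [layer.name for layer in sorted(layers, key=lambda l: l.z_index)]

stack = [Layer("sub_picture_layer", 1), Layer("main_picture_layer", 0)]
order = display_order(stack)  # the sub-picture layer is superimposed on the main layer
```

Compared with pushing a pre-fused frame, layered display lets the client move or hide the sub-picture without re-encoding the main picture.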
Illustratively, displaying a live view of a combination of a live main view and a live sub-view at a live interface in response to a multi-view live event, comprising:
and responding to the multi-picture live broadcast event, and acquiring an initial sub-picture from the live broadcast main picture which is currently shot based on the position information of the sub-picture area.
And processing the initial sub-picture based on the target sub-picture parameter information to obtain the live sub-picture.
And combining the live broadcast main picture and the live broadcast sub-picture to obtain a combined live broadcast picture, pushing the combined live broadcast picture to the server, and triggering the server to control the target live broadcast room to display the combined live broadcast picture on a live broadcast interface of the main broadcast side.
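The anchor-end steps above can be strung together as a per-frame pipeline; everything below (the function name, nearest-neighbour zoom, and list-of-rows frames) is an illustrative assumption rather than the patent's actual implementation:

```python
def make_fused_frame(frame, region, zoom, pos):
    """Anchor-end pipeline for one frame: intercept the sub-picture region,
    process it with the zoom parameter, then cover the main picture with it."""
    x, y, w, h = region
    sub = [row[x:x + w] for row in frame[y:y + h]]                 # 1) intercept
    zh, zw = max(1, int(h * zoom)), max(1, int(w * zoom))
    sub = [[sub[min(h - 1, int(j / zoom))][min(w - 1, int(i / zoom))]
            for i in range(zw)] for j in range(zh)]                # 2) process
    fused = [row[:] for row in frame]                              # 3) combine
    px, py = pos
    for j, row in enumerate(sub):
        for i, pixel in enumerate(row):
            if 0 <= py + j < len(fused) and 0 <= px + i < len(fused[0]):
                fused[py + j][px + i] = pixel
    return fused  # this merged frame is what would be pushed to the server

frame = [[(x, y) for x in range(8)] for y in range(6)]
fused = make_fused_frame(frame, region=(2, 1, 2, 2), zoom=2.0, pos=(0, 0))
```

In the alternative flow that follows, the same three steps run on the server instead, with the anchor end sending only the captured main picture, the region position information, and the target sub-picture parameters.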
Illustratively, in response to the multi-picture live event, displaying a live picture of the live main picture and the live sub picture combined at the live interface, further comprising:
And responding to the multi-picture live broadcast event, and transmitting the currently shot live broadcast main picture, the position information of the sub-picture area and the target sub-picture parameter information to the server.
And triggering the server side to acquire an initial sub-picture from the live broadcast main picture which is currently shot based on the position information of the sub-picture area.
And processing the initial sub-picture based on the target sub-picture parameter information to obtain the live sub-picture.
And combining the live broadcast main picture and the live broadcast sub-picture to obtain a combined live broadcast picture, and controlling the target live broadcast room to display the combined live broadcast picture on a live broadcast interface of the main broadcasting end.
And fusing the live broadcast main picture and the live broadcast sub-picture to obtain a fused live broadcast picture, and controlling the target live broadcast room to display the fused live broadcast picture on a live broadcast interface of the main broadcast end and the audience end.
In the embodiment of the application, the server performs merging processing on the live broadcast main picture and the live broadcast sub-picture based on a video coding technology. Merging processing refers to combining a plurality of video pictures into one video picture through the video coding technology.
Because the server merges the live broadcast pictures (the live broadcast sub-picture and the live broadcast main picture), the audience end only needs to pull one live video stream; this does not increase the bandwidth pressure of the audience end and can improve the viewing experience of the audience.
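As an illustrative sketch only (not the patent's actual encoder), the server-side merging described above can be modelled as a generator that combines a main-picture frame sequence and a sub-picture frame sequence into a single output sequence, so that the audience end pulls exactly one stream. The name `merge_streams` and the nested-list frame representation are hypothetical:

```python
def merge_streams(main_frames, sub_frames, position):
    """Merge a main-picture frame sequence and a sub-picture frame sequence
    into ONE combined sequence, so the viewer end pulls a single stream."""
    x, y = position
    for main, sub in zip(main_frames, sub_frames):
        combined = [row[:] for row in main]          # keep the main picture intact
        for dy, srow in enumerate(sub):
            combined[y + dy][x:x + len(srow)] = srow  # paste one sub-picture row
        yield combined                                # one frame of the merged stream
```

In a real deployment this per-frame compositing would happen inside the encoder pipeline before the merged stream is distributed to the live room.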
By adopting the scheme of the embodiment of the application, the live interface of the target live broadcasting room can be displayed, and the live broadcasting interface comprises a live broadcasting main picture; responding to the sub-picture selection operation of the live broadcast main picture, and displaying a sub-picture region selection frame; responding to a sub-picture confirming operation of the live broadcast main picture, and determining a region selected by the sub-picture region selection frame on the live broadcast main picture as a sub-picture region; and responding to a multi-picture live broadcast event, controlling a live broadcast interface of the target live broadcast room at the main broadcasting end and the audience end based on the position information of the sub-picture area, and displaying a fused live broadcast picture corresponding to the live broadcast main picture and the live broadcast sub-picture, wherein the live broadcast sub-picture is extracted from the live broadcast main picture based on the position information of the sub-picture area. The live broadcast sub-pictures can be freely selected on the single live broadcast equipment, the live broadcast interfaces of the main broadcasting end and the audience end are controlled, and the fused live broadcast pictures corresponding to the live broadcast main pictures and the live broadcast sub-pictures are displayed, so that the audience can watch the complete live broadcast pictures while watching the details of live broadcast content, and the watching experience of the audience is improved.
The embodiment also provides a multi-picture live broadcast device, which may be integrated in an electronic device, such as a computer device, where the computer device may be a terminal, a server, or other devices, and the embodiment is not limited thereto.
For example, as shown in fig. 12, the multi-picture live broadcast apparatus is applied to a viewer terminal, and the multi-picture live broadcast apparatus may include:
a first display module 501, configured to display a live interface, where the live interface includes a live home screen.
A second display module 502, configured to respond to a multi-picture live event, and display, on the live interface, a live picture obtained by combining the live main picture and the live sub-picture; the live broadcast content in the live broadcast sub-picture is extracted from the live broadcast main picture.
Optionally, in the apparatus of the embodiment of the present application, in the second display module 502, in response to a multi-picture live event, displaying, on the live interface, a live picture after the combination of the live main picture and the live sub-picture includes:
and the first sub-picture selection unit is used for responding to the sub-picture selection operation of the target live broadcasting room and displaying a sub-picture region selection frame on the live broadcasting interface.
And the first sub-picture confirming unit is used for responding to the sub-picture confirming operation of the live broadcast main picture and determining the area selected by the sub-picture area selection frame on the live broadcast main picture as a sub-picture area.
And the first display unit is used for responding to the confirmation operation of the main broadcasting end corresponding to the target live broadcast room on the sub-picture area and displaying the live broadcast picture after the combination of the live broadcast main picture and the live broadcast sub-picture on the live broadcast interface.
Optionally, in the apparatus of the embodiment of the present application, in the second display module 502, in response to a multi-picture live event, displaying, on the live interface, a live picture after the combination of the live main picture and the live sub-picture includes:
and the second display unit is used for responding to a live sub-picture selection event of the main broadcasting terminal and displaying the live picture formed by combining the live main picture and the live sub-picture on a live interface of the audience terminal.
Optionally, in the apparatus of the embodiment of the present application, in the second display module 502, in response to a multi-picture live event, displaying, on the live interface, a live picture after the combination of the live main picture and the live sub-picture includes:
And the triggering unit is used for determining to trigger the multi-picture live event in response to the identification of the target object in the live main picture.
And the third display unit is used for displaying the live broadcast picture after the combination of the live broadcast main picture and the live broadcast sub-picture on the live broadcast interface.
Optionally, in the apparatus of the embodiment of the present application, the multi-picture live broadcast apparatus includes:
and the scaling unit is used for scaling the live sub-picture in response to the scaling operation of the live sub-picture.
Optionally, in the apparatus of the embodiment of the present application, the multi-picture live broadcast apparatus includes:
and the purchase page display unit is used for responding to the triggering operation of the live broadcast sub-picture and displaying the purchase page of the object displayed in the live broadcast sub-picture.
And the first purchasing unit is used for purchasing the display article in response to the purchasing operation of the purchasing page.
Or,
and the order area display unit is used for responding to the quick purchasing operation of the live broadcast sub-picture and displaying the order area of the object displayed by the live broadcast sub-picture in the live broadcast interface.
And the second purchasing unit is used for purchasing the display article in response to the purchasing operation of the order area.
Optionally, in the apparatus of the embodiment of the present application, in the second display module 502, the displaying, on the live broadcast interface, a live broadcast picture after the combination of the live broadcast main picture and the live broadcast sub-picture includes:
and the first live broadcast picture acquisition unit is used for acquiring the live broadcast picture after the live broadcast main picture and the live broadcast sub-picture are fused.
And the fourth display unit is used for displaying the fused live broadcast picture on the live broadcast interface.
Optionally, in the apparatus of the embodiment of the present application, in the second display module 502, the displaying, on the live broadcast interface, a live broadcast picture after the combination of the live broadcast main picture and the live broadcast sub-picture includes:
and the second live broadcast picture acquisition unit is used for acquiring the live broadcast main picture and the live broadcast sub-picture.
And the fifth display unit is used for displaying a live broadcast picture on the live broadcast interface, wherein the live broadcast picture comprises a main picture layer and a sub-picture layer which are displayed in a superimposed mode, the main picture layer is used for displaying a live broadcast main picture, and the sub-picture layer is used for displaying a live broadcast sub-picture.
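Purely to illustrate the superimposed main-picture layer and sub-picture layer described above (the function name and the use of `None` for transparent pixels are hypothetical, not part of the disclosure), the layered rendering can be sketched as:

```python
def render_layers(main_layer, sub_layer):
    """Superimpose the sub-picture layer on the main-picture layer.

    None in the sub layer means 'transparent', so the main-picture layer
    shows through everywhere outside the live sub-picture.
    """
    return [
        [m if s is None else s for m, s in zip(mrow, srow)]
        for mrow, srow in zip(main_layer, sub_layer)
    ]
```

With this layered approach the audience end receives the main picture and sub-picture separately and composites them locally, in contrast to the server-side merged single stream described earlier.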
By adopting the scheme of the embodiment of the application, a live broadcast interface can be displayed, wherein the live broadcast interface comprises a live broadcast main picture; responding to a multi-picture live event, and displaying a live picture formed by combining the live main picture and the live sub-picture on the live interface; the live broadcast content in the live broadcast sub-picture is extracted from the live broadcast main picture. Therefore, the audience can watch the complete live broadcast picture while watching the details of the live broadcast content, and the watching experience of the audience is improved.
For example, as shown in fig. 13, the multi-picture live broadcast apparatus is applied to a main broadcasting end, and the multi-picture live broadcast apparatus may include:
and the third display module 601 is configured to display a live interface of the target live room, where the live interface includes a live main picture.
A fourth display module 602, configured to display, on a live interface, a live broadcast picture after a combination of a live broadcast main picture and a live broadcast sub picture in response to a multi-picture live broadcast event; the live broadcast content in the live broadcast sub-picture is extracted from the live broadcast main picture.
Optionally, in the apparatus of the embodiment of the present application, the fourth display module 602 is configured to display, in response to a multi-picture live event, a live picture after a combination of a live main picture and a live sub-picture on a live interface, where the live picture includes:
and the second sub-picture selection unit is used for responding to the sub-picture selection operation of the live main picture and displaying a sub-picture region selection frame.
And the second sub-picture confirming unit is used for responding to the sub-picture confirming operation of the live broadcast main picture and determining the area selected by the sub-picture area selection frame on the live broadcast main picture as a sub-picture area.
And the sixth display unit is used for displaying the live broadcast picture after the combination of the live broadcast main picture and the live broadcast sub-picture on the live broadcast interface. The live broadcast content in the live broadcast sub-picture is extracted from the live broadcast main picture based on the sub-picture area.
Optionally, in the apparatus of the embodiment of the present application, in response to the multi-picture live event, before the live interface displays the live picture after the combination of the live main picture and the live sub-picture, the fourth display module 602 further includes:
and the parameter setting page display unit is used for responding to the parameter setting operation of the sub-picture area selection frame and displaying a parameter setting page, wherein the parameter setting page comprises at least one sub-picture parameter setting control.
And the parameter setting unit is used for responding to the setting operation of the sub-picture parameter setting control and acquiring target sub-picture parameter information set for the sub-picture area.
And a parameter confirmation unit for updating the sub-picture parameter information of the sub-picture area based on the target sub-picture parameter information in response to the confirmation operation at the parameter setting page.
Optionally, in the apparatus of the embodiment of the present application, the fourth display module 602 is configured to display, in response to a multi-picture live event, a live picture after a combination of a live main picture and a live sub-picture on a live interface, where the live picture includes:
and the first initial sub-picture acquisition unit is used for responding to the multi-picture live broadcast event and acquiring an initial sub-picture from the live broadcast main picture which is currently shot based on the position information of the sub-picture area.
And the first processing unit is used for processing the initial sub-picture based on the target sub-picture parameter information to obtain the live sub-picture.
The first combination unit is used for combining the live broadcast main picture and the live broadcast sub-picture to obtain a combined live broadcast picture, pushing the combined live broadcast picture to the server, and triggering the server to control the target live broadcast room to display the combined live broadcast picture on the live broadcast interface of the main broadcasting end.
Optionally, in the apparatus of the embodiment of the present application, the fourth display module 602 is configured to display, in response to a multi-picture live event, a live picture after a combination of a live main picture and a live sub-picture on a live interface, where the live picture includes:
and the sending unit is used for responding to the multi-picture live broadcast event and sending the currently shot live broadcast main picture, the position information of the sub-picture area and the target sub-picture parameter information to the server.
The second initial sub-picture obtaining unit is used for triggering the server to obtain an initial sub-picture from the currently shot live broadcast main picture based on the position information of the sub-picture area.
And the second processing unit is used for processing the initial sub-picture based on the target sub-picture parameter information to obtain the live sub-picture.
And the second combination unit is used for combining the live broadcast main picture and the live broadcast sub-picture to obtain a combined live broadcast picture, and controlling the target live broadcast room to display the combined live broadcast picture on a live broadcast interface of the main broadcast end.
Optionally, in the device of the embodiment of the present application, the parameter setting page further displays an initial sub-picture parameter set by the sub-picture parameter setting control for the sub-picture area, where the sub-picture parameter setting control is configured to receive one or more sub-picture parameters, and the sub-picture parameters include a shape parameter of the sub-picture area, a scaling parameter of the content of the sub-picture area, and effect information of the content of the sub-picture area.
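As an illustrative data-structure sketch (the class and field names are hypothetical, chosen only to mirror the three sub-picture parameters named above), the target sub-picture parameter information could be carried in a small container like:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class SubPictureParams:
    shape: str = "rectangle"      # shape parameter of the sub-picture area
    zoom: float = 1.0             # scaling parameter of the area's content
    effect: Optional[str] = None  # effect information for the area's content
```

Such an object would be filled in from the parameter setting page and passed along with the position information of the sub-picture area when the sub-picture is processed.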
Optionally, in the apparatus of the embodiment of the present application, when responding to a sub-picture selection operation on a live main picture, a multi-picture live broadcast confirmation control is displayed on a live broadcast interface.
The multi-picture live event includes: a triggering operation on the multi-picture live broadcast confirmation control, a sub-picture selection operation on the live broadcast main picture, and a confirmation operation on the parameter setting page.
Optionally, in the apparatus of the embodiment of the present application, the fourth display module 602 displays, on a live interface, a live broadcast picture after a live broadcast main picture and a live broadcast sub picture are combined, including:
and the third live broadcast picture acquisition unit is used for acquiring the live broadcast picture after the fusion of the live broadcast main picture and the live broadcast sub-picture.
And the sixth display unit is used for displaying the fused live broadcast picture on the live broadcast interface.
Optionally, in the apparatus of the embodiment of the present application, the fourth display module 602 displays, on a live interface, a live broadcast picture after a live broadcast main picture and a live broadcast sub picture are combined, including:
and the fourth live broadcast picture acquisition unit is used for acquiring the live broadcast main picture and the live broadcast sub-picture.
And the seventh display unit is used for displaying the live broadcast picture on the live broadcast interface, wherein the live broadcast picture comprises a main picture layer and a sub-picture layer which are displayed in a superimposed mode, the main picture layer is used for displaying the live broadcast main picture, and the sub-picture layer is used for displaying the live broadcast sub-picture.

By adopting the scheme of the embodiment of the application, the live interface of the target live broadcasting room can be displayed, and the live broadcasting interface comprises a live broadcasting main picture. And displaying a sub-picture region selection frame in response to a sub-picture selection operation of the live main picture. And in response to a sub-picture confirming operation of the live main picture, determining the area selected by the sub-picture area selection frame on the live main picture as a sub-picture area. And responding to a multi-picture live broadcast event, controlling a live broadcast interface of the target live broadcast room at the main broadcasting end and the audience end based on the position information of the sub-picture area, and displaying a fused live broadcast picture corresponding to the live broadcast main picture and the live broadcast sub-picture, wherein the live broadcast sub-picture is extracted from the live broadcast main picture based on the position information of the sub-picture area. The live broadcast sub-pictures can be freely selected on the single live broadcast equipment, the live broadcast interfaces of the main broadcasting end and the audience end are controlled, and the fused live broadcast pictures corresponding to the live broadcast main pictures and the live broadcast sub-pictures are displayed, so that the audience can watch the complete live broadcast pictures while watching the details of live broadcast content, and the watching experience of the audience is improved.
Correspondingly, the embodiment of the application also provides an electronic device, which may be a terminal, and the terminal may be a terminal device such as a smart phone, a tablet computer, a notebook computer, a touch screen device, a game machine, a personal computer (PC) or a personal digital assistant (PDA). Alternatively, the electronic device may be a server.
Fig. 14 is a schematic structural diagram of an electronic device according to an embodiment of the present application. The electronic device 700 includes a processor 701 having one or more processing cores, a memory 702 having one or more computer-readable storage media, and a computer program stored on the memory 702 and executable on the processor. The processor 701 is electrically connected to the memory 702. It will be appreciated by those skilled in the art that the electronic device structure shown in the figure does not limit the electronic device, which may include more or fewer components than shown, or combine certain components, or adopt a different arrangement of components.
The processor 701 is a control center of the electronic device 700, connects various portions of the entire electronic device 700 using various interfaces and lines, and performs the various functions of the electronic device 700 and processes data by running or loading software programs and/or modules stored in the memory 702 and invoking data stored in the memory 702. The processor 701 may be a central processing unit (CPU), a graphics processing unit (GPU), a network processor (NP), or the like, and may implement or perform the methods, steps, and logic blocks disclosed in the embodiments of the present application.
In the embodiment of the present application, the processor 701 in the electronic device 700 loads instructions corresponding to the processes of one or more application programs into the memory 702, and the processor 701 runs the application programs stored in the memory 702 to implement various functions according to the following steps, for example:
displaying a live broadcast interface, wherein the live broadcast interface comprises a live broadcast main picture;
responding to a multi-picture live event, and displaying a live picture formed by combining the live main picture and the live sub-picture on the live interface; the live broadcast content in the live broadcast sub-picture is extracted from the live broadcast main picture.
Further, the various functions implemented by running the application program stored in the memory 702 may also be referred to as descriptions in the foregoing embodiments, and will not be described herein.
The specific implementation of each operation above may be referred to the previous embodiments, and will not be described herein.
Optionally, as shown in fig. 14, the electronic device 700 further includes: a touch display 703, a radio frequency circuit 704, an audio circuit 705, an input unit 706, and a power supply 707. The processor 701 is electrically connected to the touch display 703, the radio frequency circuit 704, the audio circuit 705, the input unit 706, and the power supply 707, respectively. It will be appreciated by those skilled in the art that the electronic device structure shown in fig. 14 is not limiting of the electronic device and may include more or fewer components than shown, or may combine certain components, or a different arrangement of components.
The touch display 703 may be used to display a graphical user interface and receive operation instructions generated by a user acting on the graphical user interface. The touch display screen 703 may include a display panel and a touch panel. The display panel may be used to display information entered by or provided to the user, as well as various graphical user interfaces of the electronic device, which may be composed of graphics, text, icons, video, and any combination thereof. Alternatively, the display panel may be configured in the form of a liquid crystal display (LCD), an organic light-emitting diode (OLED) display, or the like. The touch panel may be used to collect touch operations by the user on or near it (such as operations performed by the user on or near the touch panel using a finger, a stylus, or any other suitable object or accessory) and generate corresponding operation instructions, according to which corresponding programs are executed. Alternatively, the touch panel may include two parts: a touch detection device and a touch controller. The touch detection device detects the touch position of the user, detects the signal brought by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into touch point coordinates, and sends the coordinates to the processor 701, and can also receive and execute commands sent from the processor 701. The touch panel may overlay the display panel; when the touch panel detects a touch operation on or near it, the operation is transferred to the processor 701 to determine the type of the touch event, and the processor 701 then provides a corresponding visual output on the display panel based on the type of the touch event.
In the embodiment of the present application, the touch panel and the display panel may be integrated into the touch display screen 703 to implement the input and output functions. In some embodiments, however, the touch panel and the display panel may be implemented as two separate components to perform the input and output functions. That is, the touch display 703 may also implement an input function as part of the input unit 706.
The radio frequency circuit 704 may be configured to transmit and receive radio frequency signals, so as to establish wireless communication with a network device or another electronic device and exchange signals with it.
The audio circuit 705 may be used to provide an audio interface between the user and the electronic device through a speaker, a microphone, and the like. The audio circuit 705 may transmit an electrical signal converted from received audio data to the speaker, where it is converted into a sound signal for output; on the other hand, the microphone converts a collected sound signal into an electrical signal, which is received by the audio circuit 705 and converted into audio data; the audio data are then processed by the processor 701 and transmitted, for example, to another electronic device via the radio frequency circuit 704, or output to the memory 702 for further processing. The audio circuit 705 may also include an earphone jack to provide communication between peripheral headphones and the electronic device.
The input unit 706 may be used to receive various operations (e.g., a sub-screen selection operation, a sub-screen confirmation operation), and to generate keyboard, mouse, joystick, optical, or trackball signal inputs related to user settings and function control.
The power supply 707 is used to power the various components of the electronic device 700. Alternatively, the power supply 707 may be logically connected to the processor 701 through a power management system, so that functions such as charging management, discharging management, and power consumption management are performed through the power management system. The power supply 707 may further include one or more of a direct current or alternating current power supply, a recharging system, a power failure detection circuit, a power converter or inverter, a power status indicator, and the like.
Although not shown in fig. 14, the electronic device 700 may further include a camera, a sensor, a wireless fidelity module, a bluetooth module, etc., which will not be described herein.
In the foregoing embodiments, the descriptions of the embodiments are emphasized, and for parts of one embodiment that are not described in detail, reference may be made to related descriptions of other embodiments.
Those of ordinary skill in the art will appreciate that all or a portion of the steps of the various methods of the above embodiments may be performed by instructions, or by instructions controlling associated hardware, which may be stored in a computer-readable storage medium and loaded and executed by a processor.
To this end, embodiments of the present application provide a computer readable storage medium, which includes a computer program for causing an electronic device to execute any of the multi-screen live broadcast methods provided in the embodiments of the present application when the computer program is run on the electronic device. For example, the computer program may perform the steps of the multi-picture live method as follows:
Displaying a live broadcast interface, wherein the live broadcast interface comprises a live broadcast main picture;
responding to a multi-picture live event, and displaying a live picture formed by combining the live main picture and the live sub-picture on the live interface; the live broadcast content in the live broadcast sub-picture is extracted from the live broadcast main picture.
Further, the refinement steps of the above method steps may also be referred to the description in the foregoing embodiments, and are not repeated herein.
The specific implementation of each operation above may be referred to the previous embodiments, and will not be described herein.
The computer-readable storage medium may include: a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disk, and the like.
Because the computer program stored in the computer readable storage medium can execute any multi-picture live broadcast method provided by the embodiments of the present application, the beneficial effects that can be achieved by any multi-picture live broadcast method provided by the embodiments of the present application can likewise be achieved; details are given in the previous embodiments and are not repeated herein.
According to one aspect of the present application, there is also provided a computer program product comprising a computer program stored in a computer readable storage medium; when the processor of the electronic device reads the computer program from the computer-readable storage medium, the processor executes the computer program, causing the electronic device to perform the methods provided in the various alternative implementations of the embodiments described above.
In the embodiments of the multi-screen live broadcast apparatus, the computer readable storage medium, the electronic device, and the computer program product, each embodiment is described with its own emphasis; for parts of one embodiment that are not described in detail, reference may be made to the related descriptions of other embodiments. It will be clearly understood by those skilled in the art that, for convenience and brevity of description, for the specific working processes and beneficial effects of the multi-screen live broadcast apparatus, the computer readable storage medium, the computer program product, the electronic device, and the corresponding units described above, reference may be made to the description of the multi-screen live broadcast method in the above embodiments, which is not repeated herein.
The foregoing has described in detail the methods, apparatuses, electronic devices, computer readable storage media, and computer program products for multi-screen live broadcast provided by the embodiments of the present application, and specific examples have been applied herein to illustrate the principles and implementations of the present application; the above description of the embodiments is only intended to aid in understanding the methods and core ideas of the present application. Meanwhile, those skilled in the art may make changes to the specific embodiments and the application scope in light of the ideas of the present application. In summary, the contents of this description should not be construed as limiting the present application.

Claims (21)

1. A multi-picture live broadcast method, wherein the method is applied to a viewer terminal, the method comprising:
displaying a live broadcast interface, wherein the live broadcast interface comprises a live broadcast main picture;
responding to a multi-picture live event, and displaying a live picture formed by combining the live main picture and the live sub-picture on the live interface; the live broadcast content in the live broadcast sub-picture is extracted from the live broadcast main picture.
2. The multi-picture live broadcast method of claim 1, wherein displaying the live broadcast picture in which the live broadcast main picture and live broadcast sub-picture are combined on the live broadcast interface in response to a multi-picture live broadcast event comprises:
responding to the sub-picture selection operation of the target live broadcasting room, and displaying a sub-picture region selection frame on the live broadcasting interface;
responding to a sub-picture confirming operation of the live broadcast main picture, and determining a region selected by the sub-picture region selection frame on the live broadcast main picture as a sub-picture region;
and responding to the confirmation operation of the main broadcasting end corresponding to the target live broadcast room on the sub-picture area, and displaying the live broadcast picture formed by combining the live broadcast main picture and the live broadcast sub-picture on the live broadcast interface.
3. The multi-picture live broadcast method of claim 1, wherein displaying, on the live broadcast interface, the live broadcast picture in which the live broadcast main picture and the live broadcast sub-picture are combined in response to the multi-picture live broadcast event comprises:
in response to a live broadcast sub-picture selection event at a host terminal, displaying, on the live broadcast interface of the viewer terminal, the live broadcast picture in which the live broadcast main picture and the live broadcast sub-picture are combined.
4. The multi-picture live broadcast method of claim 1, wherein displaying, on the live broadcast interface, the live broadcast picture in which the live broadcast main picture and the live broadcast sub-picture are combined in response to the multi-picture live broadcast event comprises:
determining that the multi-picture live broadcast event is triggered in response to identifying a target object in the live broadcast main picture; and
displaying, on the live broadcast interface, the live broadcast picture in which the live broadcast main picture and the live broadcast sub-picture are combined.
5. The multi-picture live broadcast method of any one of claims 1 to 4, further comprising:
scaling the live broadcast sub-picture in response to a scaling operation on the live broadcast sub-picture.
6. The multi-picture live broadcast method of any one of claims 1 to 4, further comprising:
in response to a trigger operation on the live broadcast sub-picture, displaying a purchase page for an item displayed in the live broadcast sub-picture;
purchasing the displayed item in response to a purchase operation on the purchase page;
or,
in response to a quick-purchase operation on the live broadcast sub-picture, displaying, in the live broadcast interface, an order area for the item displayed in the live broadcast sub-picture; and
purchasing the displayed item in response to a purchase operation on the order area.
7. The multi-picture live broadcast method of any one of claims 1 to 4, wherein displaying, on the live broadcast interface, the live broadcast picture in which the live broadcast main picture and the live broadcast sub-picture are combined comprises:
acquiring a live broadcast picture in which the live broadcast main picture and the live broadcast sub-picture have been fused; and
displaying the fused live broadcast picture on the live broadcast interface.
8. The multi-picture live broadcast method of any one of claims 1 to 4, wherein displaying, on the live broadcast interface, the live broadcast picture in which the live broadcast main picture and the live broadcast sub-picture are combined comprises:
acquiring the live broadcast main picture and the live broadcast sub-picture; and
displaying a live broadcast picture on the live broadcast interface, wherein the live broadcast picture comprises a main picture layer and a sub-picture layer displayed in superposition, the main picture layer being used to display the live broadcast main picture and the sub-picture layer being used to display the live broadcast sub-picture.
9. A multi-picture live broadcast method, applied to a host terminal, the method comprising:
displaying a live broadcast interface of a target live broadcast room, wherein the live broadcast interface comprises a live broadcast main picture; and
in response to a multi-picture live broadcast event, displaying, on the live broadcast interface, a live broadcast picture in which the live broadcast main picture and a live broadcast sub-picture are combined, wherein live broadcast content in the live broadcast sub-picture is extracted from the live broadcast main picture.
10. The multi-picture live broadcast method of claim 9, wherein displaying, on the live broadcast interface, the live broadcast picture in which the live broadcast main picture and the live broadcast sub-picture are combined in response to the multi-picture live broadcast event comprises:
in response to a sub-picture selection operation on the live broadcast main picture, displaying a sub-picture region selection frame;
in response to a sub-picture confirmation operation on the live broadcast main picture, determining a region selected by the sub-picture region selection frame on the live broadcast main picture as a sub-picture region; and
displaying, on the live broadcast interface, the live broadcast picture in which the live broadcast main picture and the live broadcast sub-picture are combined, wherein the live broadcast content in the live broadcast sub-picture is extracted from the live broadcast main picture based on the sub-picture region.
11. The multi-picture live broadcast method of claim 10, wherein, before displaying, on the live broadcast interface, the live broadcast picture in which the live broadcast main picture and the live broadcast sub-picture are combined in response to the multi-picture live broadcast event, the method further comprises:
in response to a parameter setting operation on the sub-picture region selection frame, displaying a parameter setting page, wherein the parameter setting page comprises at least one sub-picture parameter setting control;
in response to a setting operation on the sub-picture parameter setting control, acquiring target sub-picture parameter information set for the sub-picture region; and
in response to a confirmation operation on the parameter setting page, updating sub-picture parameter information of the sub-picture region based on the target sub-picture parameter information.
12. The multi-picture live broadcast method of claim 10, wherein displaying, on the live broadcast interface, the live broadcast picture in which the live broadcast main picture and the live broadcast sub-picture are combined in response to the multi-picture live broadcast event comprises:
in response to the multi-picture live broadcast event, acquiring an initial sub-picture from the currently captured live broadcast main picture based on position information of the sub-picture region;
processing the initial sub-picture based on target sub-picture parameter information to obtain the live broadcast sub-picture; and
combining the live broadcast main picture and the live broadcast sub-picture to obtain a combined live broadcast picture, pushing the combined live broadcast picture to a server, and triggering the server to control the target live broadcast room to display the combined live broadcast picture on the live broadcast interface of the host terminal.
13. The multi-picture live broadcast method of claim 10, wherein displaying, on the live broadcast interface, the live broadcast picture in which the live broadcast main picture and the live broadcast sub-picture are combined in response to the multi-picture live broadcast event comprises:
in response to the multi-picture live broadcast event, transmitting the currently captured live broadcast main picture, position information of the sub-picture region, and target sub-picture parameter information to a server; and
triggering the server to acquire an initial sub-picture from the currently captured live broadcast main picture based on the position information of the sub-picture region;
process the initial sub-picture based on the target sub-picture parameter information to obtain the live broadcast sub-picture; and
combine the live broadcast main picture and the live broadcast sub-picture to obtain a combined live broadcast picture, and control the target live broadcast room to display the combined live broadcast picture on the live broadcast interface of the host terminal.
14. The multi-picture live broadcast method of claim 11, wherein the parameter setting page further displays initial sub-picture parameters set for the sub-picture region, and the sub-picture parameter setting control is configured to receive one or more sub-picture parameters, the sub-picture parameters comprising a shape parameter of the sub-picture region, a scaling parameter of content of the sub-picture region, and special-effect information of content of the sub-picture region.
15. The multi-picture live broadcast method of claim 14, wherein a multi-picture live broadcast confirmation control is displayed on the live broadcast interface upon the sub-picture selection operation on the live broadcast main picture; and
the multi-picture live broadcast event comprises any one of: a trigger operation on the multi-picture live broadcast confirmation control, the sub-picture selection operation on the live broadcast main picture, and the confirmation operation on the parameter setting page.
16. The multi-picture live broadcast method of any one of claims 9 to 14, wherein displaying, on the live broadcast interface, the live broadcast picture in which the live broadcast main picture and the live broadcast sub-picture are combined comprises:
acquiring a live broadcast picture in which the live broadcast main picture and the live broadcast sub-picture have been fused; and
displaying the fused live broadcast picture on the live broadcast interface.
17. The multi-picture live broadcast method of any one of claims 9 to 14, wherein displaying, on the live broadcast interface, the live broadcast picture in which the live broadcast main picture and the live broadcast sub-picture are combined comprises:
acquiring the live broadcast main picture and the live broadcast sub-picture; and
displaying a live broadcast picture on the live broadcast interface, wherein the live broadcast picture comprises a main picture layer and a sub-picture layer displayed in superposition, the main picture layer being used to display the live broadcast main picture and the sub-picture layer being used to display the live broadcast sub-picture.
18. A multi-picture live broadcast apparatus, applied to a viewer terminal, comprising:
a first display module, configured to display a live broadcast interface, wherein the live broadcast interface comprises a live broadcast main picture; and
a second display module, configured to display, on the live broadcast interface in response to a multi-picture live broadcast event, a live broadcast picture in which the live broadcast main picture and a live broadcast sub-picture are combined, wherein live broadcast content in the live broadcast sub-picture is extracted from the live broadcast main picture.
19. A multi-picture live broadcast apparatus, applied to a host terminal, comprising:
a third display module, configured to display a live broadcast interface of a target live broadcast room, wherein the live broadcast interface comprises a live broadcast main picture; and
a fourth display module, configured to display, on the live broadcast interface in response to a multi-picture live broadcast event, a live broadcast picture in which the live broadcast main picture and a live broadcast sub-picture are combined, wherein live broadcast content in the live broadcast sub-picture is extracted from the live broadcast main picture.
20. An electronic device, comprising a processor and a memory, wherein the memory stores a computer program that, when executed by the processor, causes the processor to perform the steps of the multi-picture live broadcast method of any one of claims 1 to 8, or causes the processor to perform the steps of the multi-picture live broadcast method of any one of claims 9 to 17.
21. A storage medium, comprising a computer program that, when run on an electronic device, causes the electronic device to perform the steps of the multi-picture live broadcast method of any one of claims 1 to 8, or causes the electronic device to perform the steps of the multi-picture live broadcast method of any one of claims 9 to 17.
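The crop/scale/composite pipeline recited in claims 10 to 14 (acquire an initial sub-picture from the main picture based on the region's position information, process it per the sub-picture parameters, then combine it with the main picture) can be illustrated with a minimal NumPy sketch. All function names, the `(x, y, w, h)` region convention, and the nearest-neighbour scaling are illustrative assumptions, not the application's implementation:

```python
import numpy as np

def extract_sub_picture(main_frame, region):
    """Crop the initial sub-picture from the main frame.
    region = (x, y, w, h) in main-frame pixel coordinates (assumed convention)."""
    x, y, w, h = region
    return main_frame[y:y + h, x:x + w].copy()

def scale_nearest(frame, factor):
    """Scale a frame by a factor with nearest-neighbour sampling
    (stand-in for the 'scaling parameter' of the sub-picture region)."""
    h, w = frame.shape[:2]
    new_h, new_w = int(h * factor), int(w * factor)
    rows = np.arange(new_h) * h // new_h   # source row for each output row
    cols = np.arange(new_w) * w // new_w   # source column for each output column
    return frame[rows][:, cols]

def composite(main_frame, sub_frame, top_left):
    """Overlay the sub-picture on the main picture to form the
    'combined live broadcast picture'; the sub-picture layer sits on top."""
    out = main_frame.copy()
    y, x = top_left
    h, w = sub_frame.shape[:2]
    out[y:y + h, x:x + w] = sub_frame
    return out

# Toy main frame standing in for a captured live broadcast picture.
main = np.zeros((108, 192, 3), dtype=np.uint8)
sub = extract_sub_picture(main, (50, 20, 40, 30))  # crop the sub-picture region
sub = scale_nearest(sub, 2.0)                      # magnify the detail
combined = composite(main, sub, (0, 0))            # picture-in-picture frame
print(combined.shape)  # (108, 192, 3)
```

In the host-side variant (claim 12) this compositing would run before pushing the stream to the server; in the server-side variant (claim 13) the same steps run on the server after receiving the main picture, region position, and parameters.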
CN202311754420.4A 2023-12-19 2023-12-19 Multi-picture live broadcast method and device, electronic equipment and storage medium Pending CN117714730A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311754420.4A CN117714730A (en) 2023-12-19 2023-12-19 Multi-picture live broadcast method and device, electronic equipment and storage medium

Publications (1)

Publication Number Publication Date
CN117714730A true CN117714730A (en) 2024-03-15

Family

ID=90151297

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination