CN112637624B - Live stream processing method, device, equipment and storage medium - Google Patents

Live stream processing method, device, equipment and storage medium

Info

Publication number
CN112637624B
CN112637624B (application CN202011472115.2A)
Authority
CN
China
Prior art keywords
display
live
live broadcast
area
display content
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202011472115.2A
Other languages
Chinese (zh)
Other versions
CN112637624A (en)
Inventor
潘佳志
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangzhou Fanxing Huyu IT Co Ltd
Original Assignee
Guangzhou Fanxing Huyu IT Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangzhou Fanxing Huyu IT Co Ltd filed Critical Guangzhou Fanxing Huyu IT Co Ltd
Priority to CN202011472115.2A
Publication of CN112637624A
Application granted
Publication of CN112637624B

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/20 Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N 21/21 Server components or server architectures
    • H04N 21/218 Source of audio or video content, e.g. local disk arrays
    • H04N 21/2187 Live feed
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N 21/431 Generation of visual interfaces for content selection or interaction; Content or additional data rendering
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N 21/431 Generation of visual interfaces for content selection or interaction; Content or additional data rendering
    • H04N 21/4312 Generation of visual interfaces for content selection or interaction; Content or additional data rendering involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N 21/44 Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream, rendering scenes according to MPEG-4 scene graphs
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/47 End-user applications
    • H04N 21/485 End-user interface for client configuration
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/47 End-user applications
    • H04N 21/485 End-user interface for client configuration
    • H04N 21/4858 End-user interface for client configuration for modifying screen layout parameters, e.g. fonts, size of the windows

Abstract

The application discloses a live stream processing method, apparatus, device and storage medium, belonging to the technical field of live broadcasting. The method comprises the following steps: displaying a live broadcast picture and a region acquisition control on the live broadcast interface of an anchor account; determining an interception area on the desktop through the region acquisition control; recording the display content of the interception area through a recording component provided by the anchor client; displaying the display content superimposed on the live broadcast interface; and compositing the live broadcast picture with the display content to obtain a live stream. Changing the recorded display content implements updates to the live broadcast function. In this process only the anchor client needs to be developed, which improves the update efficiency of the live platform.

Description

Live stream processing method, device, equipment and storage medium
Technical Field
The present disclosure relates to the field of live broadcast technologies, and in particular, to a live broadcast stream processing method, apparatus, device, and storage medium.
Background
With the rapid development of the live broadcast industry, vendors providing live broadcast services need to update their live platforms frequently to ensure user experience. A live platform comprises an anchor client and an audience client.
Currently, developers typically update a live platform by developing the same new function for both the anchor client and the audience client and then updating each client separately.
When the live platform is updated in this way, the anchor client and the audience client must be developed simultaneously and the consistency of the developed functions must be ensured, so the update efficiency is low.
Disclosure of Invention
The application provides a live stream processing method, apparatus, device and storage medium, which can improve the update efficiency of a live platform. The technical scheme is as follows:
according to an aspect of the present application, there is provided a live stream processing method, including:
displaying a live broadcast picture and a region acquisition control on a live broadcast interface of an anchor account;
determining an interception area on the desktop through the region acquisition control;
recording the display content of the interception area through a recording component provided by the anchor client;
displaying the display content superimposed on the live broadcast interface;
and compositing the live broadcast picture with the display content to obtain a live stream.
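The five steps listed above can be sketched end to end as a toy pipeline. This is a minimal illustration with small 2D grids standing in for frames; every name (`Rect`, `capture_region`, `compose`) is an assumption for the sketch, not an API from the patent:

```python
# Toy sketch of the claimed pipeline: capture an interception area from the
# desktop, then composite it over the live picture to form one stream frame.
from dataclasses import dataclass


@dataclass
class Rect:
    x: int
    y: int
    w: int
    h: int


def capture_region(desktop, rect):
    """Record the display content of the interception area (step 3)."""
    return [row[rect.x:rect.x + rect.w] for row in desktop[rect.y:rect.y + rect.h]]


def compose(live_frame, content, pos):
    """Composite the live picture (first layer) with the display content
    (second layer, drawn on top) into one output frame (step 5)."""
    out = [row[:] for row in live_frame]
    for dy, row in enumerate(content):
        for dx, px in enumerate(row):
            if 0 <= pos.y + dy < len(out) and 0 <= pos.x + dx < len(out[0]):
                out[pos.y + dy][pos.x + dx] = px
    return out


desktop = [["D"] * 8 for _ in range(8)]      # desktop showing some content
live_frame = [["L"] * 8 for _ in range(8)]   # anchor's live broadcast picture
region = Rect(2, 2, 3, 2)                    # interception area from box selection
content = capture_region(desktop, region)
stream_frame = compose(live_frame, content, Rect(1, 1, 0, 0))
```

A real client would of course operate on pixel buffers and video frames; the point is only the data flow between the five claimed steps.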
According to another aspect of the present application, there is provided a live stream processing apparatus, the apparatus including:
the display module is used for displaying a live broadcast picture and a region acquisition control on a live broadcast interface of the anchor account;
the determining module is used for determining an interception area on the desktop through the region acquisition control;
the recording module is used for recording the display content of the interception area through a recording component provided by the anchor client;
the display module is further used for displaying the display content superimposed on the live broadcast interface;
and the synthesis module is used for compositing the live broadcast picture with the display content to obtain a live stream.
In an alternative design, the region acquisition control comprises a box selection control; the determining module is used for:
responding to a triggering operation on the box selection control, and starting a box selection window on the desktop;
and determining the interception area in response to a box selection operation on the box selection window.
In an alternative design, the determining module is configured to:
determining frame selection position information in response to a frame selection operation on the frame selection window, wherein the frame selection position information indicates the region framed by the frame selection operation on the desktop;
and determining the interception area according to the frame selection position information.
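The frame selection position information could, for instance, be the start and end points of the user's drag. A minimal sketch of turning such a pair of points into an interception rectangle, normalizing drags in any direction (the coordinate convention and function name are assumptions of this sketch, not the patent's):

```python
def selection_to_rect(x0, y0, x1, y1):
    """Convert a drag from (x0, y0) to (x1, y1) into a rectangle
    (left, top, width, height), regardless of drag direction."""
    left, top = min(x0, x1), min(y0, y1)
    return (left, top, abs(x1 - x0), abs(y1 - y0))
```

For example, dragging up and to the left from (10, 20) to (4, 5) yields the same rectangle as the opposite drag.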
In an alternative design, the region acquisition control includes a web page acquisition control; the display module is used for:
responding to a triggering operation on the web page acquisition control, and displaying a list of candidate web pages in the live broadcast interface;
the determining module is used for:
responding to a selection operation on the candidate web page list, and acquiring a target web page;
and determining the area indicated by the embedded code of the target web page as the interception area.
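The patent does not specify the format of the embedded code, only that it indicates the interception area. As an illustration, one could imagine the developer marking the region with a hypothetical `data-region` attribute and the client parsing it out:

```python
import re


def region_from_embed(embed_code: str):
    """Parse the interception area (x, y, w, h) indicated by a target web
    page's embedded code. The `data-region` attribute is an invented
    convention for this sketch; the patent does not define a format."""
    m = re.search(r'data-region="(\d+),(\d+),(\d+),(\d+)"', embed_code)
    if m is None:
        raise ValueError("embed code does not indicate an interception area")
    return tuple(int(g) for g in m.groups())
```

Since the candidate pages and their embedded code are both authored by the platform developer, the format only needs to be agreed between the developer's pages and the anchor client.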
In an alternative design, the display module is configured to:
display, in response to an overlay display operation, the display content in an overlay display component provided by the anchor client, the overlay display component being displayed in the display area for displaying the live view and being overlaid on top of the live view.
In an alternative design, the apparatus further comprises:
a first processing module, used for changing the display position of the overlay display component in response to a drag operation on the overlay display component.
In an alternative design, the apparatus further comprises:
a second processing module, used for changing the display area of the overlay display component in response to a zoom operation on the overlay display component.
In an alternative design, the live view belongs to a first layer, the display content belongs to a second layer, and the second layer is above the first layer; the synthesis module is used for:
And synthesizing the first layer and the second layer.
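The patent does not specify a blending formula for synthesizing the two layers; a standard choice for drawing a second layer above a first is the Porter-Duff "over" operator. A minimal per-pixel sketch, shown here only as an illustration of what layer synthesis can mean:

```python
def over(top, bottom):
    """Composite a second-layer pixel (r, g, b, a) over an opaque
    first-layer pixel (r, g, b) using the Porter-Duff 'over' rule."""
    r, g, b, a = top
    return tuple(round(a * c + (1 - a) * d) for c, d in zip((r, g, b), bottom))
```

With alpha 1.0 the display content fully covers the live picture, matching the described behavior of the overlay display component obscuring the live view; lower alpha would let the live picture show through.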
In an alternative design, the apparatus further comprises:
and the sending module is used for sending the live stream to a server, wherein the live stream is used for displaying the audience live broadcast picture in the user interface of the audience client.
According to another aspect of the present application, there is provided a computer device comprising a processor and a memory having stored therein at least one instruction, at least one program, a set of codes or a set of instructions, the at least one instruction, the at least one program, the set of codes or the set of instructions being loaded and executed by the processor to implement the live stream processing method as described in the above aspect.
According to another aspect of the present application, there is provided a computer readable storage medium having stored therein at least one program code loaded and executed by a processor to implement the live stream processing method as described in the above aspect.
According to another aspect of the present application, there is provided a computer program product or computer program comprising computer instructions stored in a computer readable storage medium. The processor of the computer device reads the computer instructions from the computer-readable storage medium, and the processor executes the computer instructions to cause the computer device to perform the live stream processing method provided in various alternative implementations of the above aspects.
The technical solution provided by the application brings at least the following beneficial effects:
The display content superimposed on the live broadcast interface is composited with the live broadcast picture to obtain the live stream. Depending on the display content recorded, different live streams can be generated based on the same live broadcast picture, so that different audience live broadcast pictures can be displayed in the audience client. Changing the recorded display content thus implements updates to the live broadcast function. In this process only the anchor client needs to be developed, which improves the update efficiency of the live platform.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are needed in the description of the embodiments will be briefly introduced below, and it is obvious that the drawings in the following description are only some embodiments of the present application, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
Fig. 1 is a schematic diagram of a principle of processing a live stream provided in an embodiment of the present application;
fig. 2 is a flow schematic diagram of a live stream processing method provided in an embodiment of the present application;
fig. 3 is a flow chart of another live stream processing method according to an embodiment of the present application;
FIG. 4 is a schematic diagram of an implementation process for determining an interception area according to an embodiment of the present application;
FIG. 5 is a schematic diagram of a box selection window provided by an embodiment of the present application;
FIG. 6 is a schematic diagram of another implementation procedure for determining an interception area according to an embodiment of the present application;
FIG. 7 is a schematic diagram of determining an interception area through a frame selection window according to an embodiment of the present application;
FIG. 8 is a schematic diagram of a list of candidate web pages provided in an embodiment of the present application;
FIG. 9 is a schematic diagram of displaying display content through an overlay display assembly provided by an embodiment of the present application;
fig. 10 is a schematic structural diagram of a live stream processing device according to an embodiment of the present application;
fig. 11 is a schematic structural diagram of another live stream processing apparatus according to an embodiment of the present application;
fig. 12 is a schematic structural diagram of still another live stream processing apparatus according to an embodiment of the present application;
fig. 13 is a schematic structural diagram of still another live stream processing apparatus according to an embodiment of the present application;
fig. 14 is a schematic structural diagram of a terminal according to an embodiment of the present application.
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the application and together with the description, serve to explain the principles of the application.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the present application more apparent, the embodiments of the present application will be described in further detail below with reference to the accompanying drawings.
Fig. 1 is a schematic diagram of the principle of processing a live stream provided in an embodiment of the present application. As shown in fig. 1, the anchor client displays a live screen 102 of the anchor account in the live interface 101 of the anchor account. A region acquisition control 103 is also displayed in the live interface 101. Optionally, the region acquisition control is a box selection control. The anchor client starts a frame selection window on the desktop of the computer device where the anchor client is located in response to a triggering operation on the box selection control, and determines the interception area according to a frame selection operation on the frame selection window. For example, the desktop displays a target web page 104 that the anchor client acquired and displayed through the browser of the computer device in response to a web page acquisition operation. The anchor client can determine the web page area 105 in the displayed target web page as the interception area according to the frame selection operation on the frame selection window. The frame selection window remains displayed on the desktop even when the computer device switches the desktop to display different content. Optionally, the region acquisition control is a web page acquisition control. In response to a triggering operation on the web page acquisition control, the anchor client displays a list of candidate web pages. Through a web page acquisition component provided by the anchor client, the anchor client acquires the target web page that the anchor user selected from the candidate list, and determines the area in the target web page indicated by the embedded code of the target web page as the interception area. The candidate web pages in the list are provided by a developer of the live platform, and the embedded code is written by the developer.
Then, in response to an overlay display operation, the anchor client displays the display content of the interception area, recorded by the recording component provided by the anchor client, in the overlay display component 106. The overlay display component is displayed in the display area of the live view and is overlaid on top of the live view, thereby superimposing the display content in the display area of the live view 102. Optionally, the anchor client can also change the display position of the overlay display component 106 within the display area according to a drag operation, and change the display area of the overlay display component according to a zoom operation. The anchor client then composites the first layer, where the live view 102 is located, with the second layer, where the display content in the overlay display component 106 is located, thereby obtaining a live stream, and sends it to the server corresponding to the anchor client. The server sends the live stream to the audience accounts watching the live broadcast of the anchor account, so that the audience client where each audience account is logged in can display the viewer live view 107 of the anchor account in its user interface according to the live stream. The viewer live view includes the live view 102 and the display content, with both in the same layer.
The display content superimposed on the live view 102 is composited with the live view 102 to obtain a live stream, which is sent to the audience client via the server. Different viewer live views 107 can be displayed in the audience client according to the different display content recorded. Changing the recorded display content implements updates to the live broadcast function. In this process only the anchor client needs to be developed, which improves the update efficiency of the live platform.
Fig. 2 is a flow chart of a live stream processing method provided in an embodiment of the present application. The method may be performed by a computer device, or by an anchor client on a computer device. As shown in fig. 2, the method includes:
Step 201: Display a live broadcast picture and a region acquisition control on the live broadcast interface of the anchor account.
The anchor account is any account logged into the anchor client. The anchor client is a client with a live broadcast function, such as a music client, a short video client, a karaoke client, a shopping client, a live broadcast client, or a friend-making client. The live broadcast interface of the anchor account is the interface that provides the live broadcast for the anchor user logged into the anchor account, for example, the live room interface of the anchor account. Illustratively, the region acquisition control is a button: a button displayed on the live broadcast interface, a button displayed in a pop-up window of the live broadcast interface, or a button floating over the live broadcast interface.
The anchor client displays the live broadcast picture according to a video stream recorded by the computer device where the anchor client is located, or according to a video stream acquired from that computer device. Alternatively, the anchor client obtains the video stream from another computer device and displays the live view accordingly. The live view is obtained by playing the video stream.
Step 202: and determining an interception area through an area acquisition control.
Optionally, the intercepting region includes a region in the desktop and a region in the target webpage. The desktop refers to a desktop including any display content, which is displayed by a computer device where the anchor client is located. And the anchor client can determine the intercepting region on the desktop comprising different display contents according to the box selection control. The intercepting region includes any region on the desktop. The target web page is determined from the preselected web pages by the anchor client according to the anchor user's selection operation. The preselected web page is provided by a developer of the live platform. Illustratively, the intercepting region is the entire desktop, or is a partial region in the desktop. Optionally, the anchor client determines the interception area according to a box selection operation. The box selection operation can indicate where the truncated area occupies in the desktop. And the anchor client determines the area in the target webpage indicated by the embedded code of the target webpage as the intercepting area. The shape of the intercepting region is rectangular, circular, elliptic, triangular and the like.
Step 203: and recording the display content of the intercepting region by a recording component provided by the anchor client.
After determining the interception area, the anchor client starts a recording component of the anchor client so as to record the display content in the interception area. When the desktop currently displayed by the computer equipment where the anchor client is located does not comprise the interception area, the anchor client can still record the display content of the interception area through the recording component, namely, at the moment, the anchor client records the display content of the interception area through the background of the recording component. For example, the anchor client records the display content of the intercepting region through a recording component in a Virtual Machine (Virtual Machine) based manner. The recorded display content is static display content (picture) of the intercepting region or real-time display content (video) of the intercepting region. Optionally, the anchor client obtains display data of the interception area at different moments in the recording process through the recording component, so that display content of the interception area is recorded.
Step 204: and superposing and displaying the display content on the live broadcast interface.
The display content of the intercepting region is displayed by being overlapped on the live broadcast interface, and the display content can cover the displayed content in the live broadcast interface. The display content covers the entire live interface or a portion of the live interface. Alternatively, the display content can be displayed superimposed on a display area in the live view. The display area is used for displaying live broadcast pictures of the anchor account. The display content displayed covers the entire display area or a part of the display area.
Illustratively, the display area is an area in the displayed desktop according to data provided by a developer of the anchor client. The display content of the display area is a decorative element or a director list of a live broadcasting room of the host account. By superposing and displaying the display content on the live broadcast picture, the display of adding decorative elements for the face of the anchor in the live broadcast picture or adding a gift list in the live broadcast picture can be realized. And a developer can enable the anchor client to determine the display areas comprising different display contents by updating the provided data, so that different elements are additionally displayed in the live broadcast picture, and the live broadcast function is updated.
Optionally, the anchor client realizes the overlapping display of the display content through an overlapping display component provided by the anchor client. The overlay display component is displayed in a display area for displaying a live view and is overlaid on top of the live view. I.e. the overlay display component will obscure the live view. The display content is displayed in the overlay display assembly. Alternatively, the display area of the display content is the same as the display area of the superimposed display assembly, and the display ratio is uniform. The anchor client can also change the size and position of the displayed overlay display assembly according to the drag operation and the zoom operation of the anchor user on the overlay display assembly, thereby changing the size and position of the displayed display content.
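The drag and zoom behavior of the overlay display component amounts to updating a rectangle's position and size. A minimal geometry sketch; the class and method names are assumptions for illustration, not the patent's API:

```python
class OverlayComponent:
    """Illustrative geometry model of the overlay display component."""

    def __init__(self, x, y, w, h):
        self.x, self.y, self.w, self.h = x, y, w, h

    def drag(self, dx, dy):
        # Drag operation: change the display position.
        self.x += dx
        self.y += dy

    def zoom(self, factor):
        # Zoom operation: change the display area, keeping the top-left
        # corner fixed and never collapsing below one pixel.
        self.w = max(1, round(self.w * factor))
        self.h = max(1, round(self.h * factor))
```

Because the display content keeps a uniform ratio with the component, resizing the component implicitly rescales the content it shows.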
Step 205: and synthesizing the live broadcast picture and the display content to obtain a live broadcast stream.
By playing the live stream, a live view picture of the audience can be obtained, wherein the live view picture of the audience comprises the live view picture and the display content. The layer displaying the live broadcast picture is different from the layer displaying the display content. The live broadcast picture belongs to a first layer, the display content belongs to a second layer, and the second layer is above the first layer. Optionally, the anchor client merges the first layer with the second layer, thereby obtaining a merged layer, and according to the merged layer, the live stream can be obtained. The anchor client can send the live stream to the audience client where the audience account watching the live broadcast of the anchor account is located through the corresponding server, and the live stream is used for displaying the audience live broadcast picture in a user interface of the audience client.
Optionally, when the display area of the display content exceeds the display area of the live broadcast picture, the live broadcast client only synthesizes the live broadcast picture with the display content in the display area of the live broadcast picture when synthesizing the live broadcast picture and the display content.
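Keeping only the part of the display content inside the live picture's display area amounts to a rectangle intersection. A minimal sketch using (x, y, w, h) tuples (the representation is an assumption of this sketch):

```python
def clip_to_display(content, display):
    """Intersect the display content's rectangle with the live picture's
    display area; return None when they do not overlap at all."""
    x0 = max(content[0], display[0])
    y0 = max(content[1], display[1])
    x1 = min(content[0] + content[2], display[0] + display[2])
    y1 = min(content[1] + content[3], display[1] + display[3])
    if x1 <= x0 or y1 <= y0:
        return None
    return (x0, y0, x1 - x0, y1 - y0)
```

Content dragged halfway off the live picture is thus clipped to the overlapping portion before compositing, and content dragged entirely outside contributes nothing to the stream.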
In summary, in the live stream processing method provided by the embodiments of the application, the display content superimposed on the live broadcast interface is composited with the live broadcast picture to obtain the live stream. According to the different display content recorded, different live streams can be generated based on the live broadcast picture, so that different audience live broadcast pictures can be displayed in the audience client. Changing the recorded display content implements updates to the live broadcast function. In this process only the anchor client needs to be developed, which improves the update efficiency of the live platform.
Fig. 3 is a flow chart of another live stream processing method provided in an embodiment of the present application. The method may be performed by a computer device, or by an anchor client on a computer device. As shown in fig. 3, the method includes:
Step 301: Display a live broadcast picture and a region acquisition control on the live broadcast interface of the anchor account.
The live broadcast interface of the anchor account is the interface that provides the live broadcast for the anchor user logged into the anchor account, for example, the live room interface of the anchor account. Illustratively, the region acquisition control is a button: a button displayed on the live broadcast interface, a button displayed in a pop-up window of the live broadcast interface, or a button floating over the live broadcast interface. With continued reference to FIG. 1, a region acquisition control 103 is displayed in the live interface 101; the region acquisition control 103 is a button.
The anchor client displays the live broadcast picture according to a video stream recorded by the computer device where the anchor client is located, or according to a video stream acquired from that computer device. Alternatively, the anchor client obtains the video stream from another computer device and displays the live view accordingly. The live view is obtained by playing the video stream.
Step 302: and determining an interception area through an area acquisition control.
Optionally, the intercepting region includes a region in the desktop and a region in the target webpage. The desktop refers to a desktop including any display content, which is displayed by a computer device where the anchor client is located. And the anchor client can determine the intercepting region on the desktop comprising different display contents according to the box selection control. The intercepting region includes any region on the desktop. The target web page is determined from the preselected web pages by the anchor client according to the anchor user's selection operation. The preselected web page is provided by a developer of the live platform.
Optionally, the region acquisition control includes a box selection control; as shown in fig. 4, the implementation of step 302 includes the following steps 3021 and 3022:
in step 3021a, a box selection window is launched on the desktop in response to a triggering operation on the box selection control.
The triggering operation includes a touch operation on the box selection control and a voice instruction for the box selection control. The touch operation is triggered by the anchor user through a gesture on the touch screen of the computer device on which the anchor client runs, or through an external device such as a mouse or keyboard, and includes a single click, a double click, a long press, and the like. When the anchor client receives, through the computer device, a voice instruction such as "select the box selection control", "trigger the box selection control", or "start box selection", the anchor client determines that the triggering operation is received.
When the anchor client launches the box selection window, the box selection window is displayed floating on the desktop. The shape of the box selection window may be rectangular, circular, elliptical, triangular, and so on. The border of the box selection window may be a solid or dashed line, for example a red dashed line, used to highlight the position occupied by the box selection window on the desktop. The interior of the box selection window is transparent at a preset proportion, so that the display content at the position occupied by the box selection window remains visible. When the computer device on which the anchor client runs switches to a desktop that includes different display content, the anchor client keeps displaying the box selection window. The anchor client can change the position and size of the box selection window on the desktop according to drag and zoom operations on the box selection window, thereby selecting different areas on the desktop through the box selection window.
Illustratively, fig. 5 is a schematic diagram of a box selection window provided in an embodiment of the present application. As shown in fig. 5, a box selection button 402 is displayed in a live interface 401 of the anchor account. When the anchor client receives a click operation on the box selection button 402, a box selection window 403 is displayed on the desktop displayed by the computer device on which the anchor client runs. The border of the box selection window 403 is dashed, and the transparency inside the border is 100%. The anchor client can frame-select different interception areas on the desktop through the box selection window 403.
In step 3022a, an interception area is determined in response to a box selection operation on the box selection window.
The box selection operation is used to indicate that the area framed by the box selection window on the desktop is to be determined as the interception area. Optionally, when the anchor client receives a determination operation on the area framed by the box selection window, it determines that the box selection operation is received. The determination operation includes keeping the box selection window at the same position and size for a preset duration, or a touch operation on a confirmation control displayed together with the box selection window. When the box selection window is first launched, it is displayed on the desktop at a default size and position; only after the anchor client receives a drag operation or a zoom operation that changes the display position or display size of the box selection window can the box selection operation be received. Alternatively, the box selection operation is determined to be received as soon as the box selection window is launched.
In response to a box selection operation on the box selection window, the anchor client determines box selection position information. The box selection position information is used to indicate the area framed on the desktop by the box selection operation, namely the area inside the border of the box selection window. The anchor client then determines the interception area according to the box selection position information. When the desktop currently displayed by the computer device does not include the framed area, the interception area does not change.
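The mapping from box selection position information to an interception area can be sketched as follows. The `Rect` type, the border width, and the function name are illustrative assumptions, since the embodiment does not fix a data format:

```python
from dataclasses import dataclass

@dataclass
class Rect:
    x: int  # left edge on the desktop
    y: int  # top edge on the desktop
    w: int  # width
    h: int  # height

def interception_area(win: Rect, border: int = 2) -> Rect:
    """Interception area = the region inside the border of the box
    selection window; `win` describes the window itself, border included."""
    return Rect(win.x + border, win.y + border,
                win.w - 2 * border, win.h - 2 * border)
```

A window framed at (100, 50) with size 400 × 300 and a 2-pixel border would thus yield an interception area of 396 × 296 at (102, 52).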
Optionally, the region acquisition control includes a web page acquisition control. As shown in fig. 6, the implementation procedure of step 302 includes the following steps 3021b to 3023b:
In step 3021b, in response to a triggering operation on the web page acquisition control, a list of candidate web pages is displayed in the live interface.
The web page acquisition control is a button. Optionally, the list of candidate web pages is displayed in a pop-up window, and the pop-up window is displayed in the live interface. The triggering operation includes a single click, double click, long press, or slide operation on the web page acquisition control. The candidate web pages in the list are provided by a developer of the live platform and are used to implement updated functions of the live platform.
In step 3022b, a target web page is acquired in response to a selection operation on the list of candidate web pages.
The target web page is the web page selected from the list of candidate web pages by the selection operation. The selection operation includes a single click, double click, check, or long press on a candidate web page in the list. Optionally, the anchor client acquires the target web page through a web page acquisition component provided by the anchor client. Alternatively, the anchor client can acquire the target web page through a browser installed on the computer device on which the anchor client runs.
In step 3023b, the area indicated by the embedded code of the target web page is determined as the interception area.
The embedded code is written by a developer into the code corresponding to the preselected web page. Optionally, when the anchor client acquires the target web page through the web page acquisition component and parses the embedded code, the area in the target web page indicated by the embedded code is determined as the interception area. The embedded code can indicate vertex information as well as width and height information of an area in the target web page. Optionally, when the anchor client acquires the target web page, the target web page is displayed in the live interface through the web page acquisition component.
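The embodiment does not specify a concrete format for the embedded code. As a minimal sketch, assuming the vertex and width/height information is embedded as a JSON snippet inside an HTML comment (a purely hypothetical convention), parsing could look like:

```python
import json
import re

# Hypothetical convention: the developer embeds the region as JSON inside
# an HTML comment, e.g. <!-- live-region: {"x": 10, ...} -->.
_REGION_RE = re.compile(r'<!--\s*live-region:\s*(\{.*?\})\s*-->')

def parse_embedded_region(html: str):
    """Return (x, y, width, height) indicated by the embedded code,
    or None when the target web page carries no embedded code."""
    m = _REGION_RE.search(html)
    if m is None:
        return None
    spec = json.loads(m.group(1))
    return (spec["x"], spec["y"], spec["width"], spec["height"])
```

When no embedded code is found, the anchor client can fall back to the box selection path described in steps 3021a to 3022a.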
Fig. 7 is a schematic diagram illustrating determination of an interception area through a box selection window according to an embodiment of the present application. As shown in fig. 7, the anchor client launches a box selection window 403 on the desktop displaying the live interface 401 of the anchor account. Then, according to a web page acquisition operation of the anchor user, the anchor client displays a sports live web page 404 on the desktop through a browser of the computer device, with the box selection window 403 displayed over the live web page 404. After the anchor user adjusts the position and size of the box selection window 403, the anchor client determines the interception area according to a determination operation (a double click) on the box selection window 403.
Fig. 8 is a schematic diagram of a list of candidate web pages according to an embodiment of the present application. As shown in fig. 8, a web page acquisition button 405 is displayed in the live interface 401 of the anchor account. When the anchor client receives a triggering operation on the web page acquisition button 405, a list of candidate web pages 406 is displayed. The candidate web pages in the list 406 are provided by a developer of the live platform and are used to implement updated functions of the live platform, including, for example, a dress-up web page (for superimposed display of dress-up elements) and a leaderboard web page (for superimposed display of the live room's leaderboard).
Step 303: record the display content of the interception area through a recording component provided by the anchor client.
After determining the interception area, the anchor client starts its recording component to record the display content in the interception area. Even when the desktop currently displayed by the computer device does not include the interception area, the anchor client can still record the display content of the interception area through the recording component; that is, the recording component records the display content of the interception area in the background. For example, the anchor client records the display content of the interception area through the recording component in a virtual machine based manner. The recorded display content is either static display content of the interception area (a picture) or real-time display content of the interception area (a video). Optionally, the anchor client obtains display data of the interception area at different moments during recording through the recording component, thereby recording the display content of the interception area.
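Obtaining the display data of the interception area at different moments can be sketched as cropping the interception area out of each captured desktop frame. Frames are modeled here as 2-D lists of pixel values, an illustrative simplification of what a real recording component would produce:

```python
def record_region(desktop_frames, region):
    """Crop the interception area out of each captured desktop frame.
    `desktop_frames` is a sequence of 2-D pixel grids; `region` is
    (x, y, w, h) in desktop coordinates."""
    x, y, w, h = region
    recorded = []
    for frame in desktop_frames:
        # Keep rows y..y+h-1 and, within each, columns x..x+w-1.
        crop = [row[x:x + w] for row in frame[y:y + h]]
        recorded.append(crop)
    return recorded
```

A single cropped frame corresponds to the static display content (picture); the sequence of cropped frames corresponds to the real-time display content (video).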
Step 304: display the display content superimposed on the live interface.
The display content of the interception area is displayed superimposed on the live interface and may cover content already displayed in the live interface. Alternatively, the display content can be displayed superimposed on a display area in the live interface, where the display area is used for displaying the live view of the anchor account. The displayed content covers the entire display area or a part of it.
Optionally, in response to an overlay display operation, the anchor client displays the display content in an overlay display component provided by the anchor client. The overlay display component is displayed in the display area used for displaying the live view and is overlaid on top of the live view; that is, the overlay display component will obscure the live view. Optionally, the overlay display operation is determined to be received when the anchor client starts recording the display content of the interception area through the recording component. Alternatively, when the anchor client receives a triggering operation on an overlay display control, it determines that the overlay display operation is received; the overlay display control is displayed in the live interface while the anchor client records the display content of the interception area through the recording component. Alternatively, when the interception area is a web page area in the target web page and the anchor client detects that target code is embedded in the target web page, it determines that the overlay display operation is received. The target code is formulated by the developer of the anchor client.
The anchor client acquires, through the recording component, the display data corresponding to the display content of the interception area and renders the display data into the overlay display component, so that the display content is displayed in the overlay display component. Optionally, the display area of the display content is the same as the display area of the overlay display component, and the display scale is consistent. In response to a drag operation on the overlay display component, the anchor client can change the display position of the overlay display component. In response to a zoom operation on the overlay display component, the anchor client can change the display area of the overlay display component.
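A minimal sketch of an overlay display component whose display position changes on drag and whose display area changes on zoom; the class and method names are assumptions, not the anchor client's actual API:

```python
class OverlayDisplayComponent:
    """Sketch of an overlay display component: drag changes the display
    position, zoom changes the display area (top-left corner kept fixed)."""

    def __init__(self, x, y, w, h):
        self.x, self.y, self.w, self.h = x, y, w, h

    def drag(self, dx, dy):
        # A drag operation translates the component by (dx, dy).
        self.x += dx
        self.y += dy

    def zoom(self, factor, min_size=16):
        # A zoom operation scales the display area, never below min_size.
        self.w = max(min_size, int(self.w * factor))
        self.h = max(min_size, int(self.h * factor))
```

Keeping the top-left corner fixed during zoom is one design choice; a real client might instead scale around the component's centre or the cursor position.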
Fig. 9 is a schematic diagram illustrating display of display content through an overlay display component according to an embodiment of the present application. As shown in fig. 9 (a), when the anchor client determines the interception area shown in fig. 7 and records its display content through the recording component, the anchor client displays an overlay display component 407 on the display area used for displaying the live view of the anchor account in the live interface 401, and displays in the overlay display component 407 the display content of the interception area recorded in real time. The display content is a sports live view. As shown in fig. 9 (b), the anchor client changes the display position and display area of the overlay display component 407 according to drag and zoom operations on the overlay display component 407, thereby changing the display position and display area of the superimposed sports live view.
It should be noted that the anchor client can superimpose the display content of the interception area recorded in real time on the display area of the live view. As the display content in the interception area changes, the superimposed display content in the display area changes correspondingly. When the interception area itself changes, the display content also changes, and the superimposed display content in the display area changes correspondingly. The anchor client can also superimpose display content designated by the anchor account in the display area, for example, the display content of a web page area in a web page designated by the anchor account, or a designated video or picture. The anchor client is also capable of superimposing, in the display area, display content of the interception area recorded in the past.
The anchor client can launch a plurality of box selection windows to determine a plurality of interception areas. Optionally, the anchor client may also acquire a plurality of target web pages from the list of candidate web pages to determine a plurality of interception areas, and then display the display content of each interception area recorded by the recording component over the live view through a plurality of overlay display components. Moreover, the anchor client can adjust the display hierarchy order of each overlay display component according to a sorting operation, thereby adjusting the display hierarchy order of different display content, where display content at a higher display level covers display content at a lower display level.
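The display hierarchy order can be sketched as follows: components are visited from the lowest level to the highest, so that content at a higher display level covers content at a lower level wherever they overlap (the tuple layout is an illustrative assumption):

```python
def content_at(components, px, py):
    """Return the display content visible at desktop point (px, py).
    Each component is (level, x, y, w, h, content); a higher level
    covers a lower one, so components are painted from low to high."""
    visible = None
    for level, x, y, w, h, content in sorted(components, key=lambda c: c[0]):
        if x <= px < x + w and y <= py < y + h:
            visible = content  # a higher level overwrites what is below
    return visible
```

A sorting operation on the overlay display components then amounts to reassigning the `level` values before this painting pass.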
Step 305: synthesize the live view and the display content to obtain a live stream.
By playing the live stream, an audience live view can be obtained, where the audience live view includes the live view and the display content. The layer displaying the live view differs from the layer displaying the display content: the live view belongs to a first layer, the display content belongs to a second layer, and the second layer is above the first layer. Optionally, the anchor client merges the first layer with the second layer to obtain a merged layer, from which the live stream can be obtained.
Optionally, when the display area of the display content exceeds the display area of the live view, the anchor client, when synthesizing the live view and the display content, synthesizes only the portion of the display content that lies within the display area of the live view.
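Merging the first layer with the second layer while discarding display content that exceeds the live view's display area can be sketched as follows; pixel values and the `None`-as-transparent convention are illustrative assumptions:

```python
def merge_layers(live_frame, overlay, ox, oy):
    """Merge the second layer (overlay) onto the first layer (live
    frame). Overlay pixels outside the live frame's display area are
    discarded; None marks a transparent overlay pixel."""
    merged = [row[:] for row in live_frame]  # do not mutate the input
    height, width = len(live_frame), len(live_frame[0])
    for j, row in enumerate(overlay):
        for i, pixel in enumerate(row):
            x, y = ox + i, oy + j
            if pixel is not None and 0 <= x < width and 0 <= y < height:
                merged[y][x] = pixel
    return merged
```

Encoding the merged frames then yields the live stream; a production client would of course do this compositing on the GPU rather than pixel by pixel in Python.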
Step 306: send the live stream to a server.
The live stream is used to display an audience live view in a user interface of an audience client, where the user interface is the live room interface for watching the live broadcast of the anchor account. The audience live view includes the live view and the display content. For example, an audience user watching the live broadcast of the anchor account through the audience client downloads the live stream, and the audience live view is obtained by playing the live stream.
When the anchor client adjusts the content, display position, or display area of the display content superimposed on the live view, the display content in the audience live view of the audience client changes correspondingly.
In a specific example, the target web page acquired by the anchor client through the web page acquisition component displays a decorative pattern, such as a hat. The anchor client then determines the area displaying the decorative pattern as the interception area according to a box selection operation on the box selection window, and displays the decorative pattern through an overlay display component superimposed on the display area of the live view. During the live broadcast of the anchor account, the anchor client determines a target display position for the decorative pattern (such as the top of the anchor's head) according to the live view displayed in real time, and adjusts the display position of the decorative pattern in real time. In this way, a picture of the anchor wearing the decorative pattern can be displayed, improving the interest of the live broadcast.
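Computing a target display position that keeps the hat pattern centred above the anchor's head can be sketched as follows; the head box is assumed to come from some detector, which is outside the scope of this example:

```python
def hat_display_position(head_box, hat_w, hat_h):
    """Centre the hat pattern horizontally over the detected head box
    (x, y, w, h) and place it directly above the box's top edge."""
    x, y, w, h = head_box
    return (x + (w - hat_w) // 2, y - hat_h)
```

Calling this once per live view frame and feeding the result to the overlay display component's drag logic realizes the real-time position adjustment described above.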
In summary, according to the live stream processing method provided by the embodiments of the present application, the display content superimposed on the live interface is synthesized with the live view to obtain the live stream. Depending on the recorded display content, different live streams can be generated from the same live view, so that different audience live views can be displayed in the audience client. Changing the recorded display content thus realizes updated live broadcast functions. In this process, only the anchor client needs to be developed, which improves the update efficiency of the live platform.
In addition, during live data transmission, different functions can be provided by sending only the live stream, which reduces the consumption of network resources. Determining the interception area through the box selection window allows the display content in the interception area to be shown to the anchor user, improving the interaction experience. Determining a web page area according to the target web page allows the anchor user to freely adjust the interception area as needed. In this way, different display content is displayed in the live view, improving the interest of the live broadcast. Displaying the display content of the interception area through the overlay display component, which supports modification of the display position and display area, allows the size and position of the display content to be changed flexibly. Recording the display content of the interception area through the recording component allows it to be displayed in the live view in real time.
It should be noted that the sequence of the steps of the method provided in the embodiments of the present application may be appropriately adjusted, and steps may be added or removed as required. Any variation readily conceivable by those skilled in the art within the technical scope disclosed in the present application shall fall within the protection scope of the present application, and is therefore not described further.
Fig. 10 is a schematic structural diagram of a live stream processing apparatus provided in an embodiment of the present application. The apparatus may be applied to a computer device or to an anchor client on a computer device. As shown in fig. 10, the apparatus 100 includes:
The display module 1001 is configured to display a live view and an area acquisition control on a live interface of the anchor account.
The determining module 1002 is configured to determine an interception area on the desktop through the area acquisition control.
The recording module 1003 is configured to record the display content of the interception area through a recording component provided by the anchor client.
The display module 1001 is further configured to display the display content superimposed on the live interface.
The synthesizing module 1004 is configured to synthesize the live view with the display content to obtain a live stream.
In an alternative design, the area acquisition control includes a box selection control, and the determining module 1002 is configured to:
launch a box selection window on the desktop in response to a triggering operation on the box selection control; and determine an interception area in response to a box selection operation on the box selection window.
In an alternative design, the determining module 1002 is configured to:
determine box selection position information in response to the box selection operation on the box selection window, where the box selection position information is used to indicate the area framed on the desktop by the box selection operation; and determine the interception area according to the box selection position information.
In an alternative design, the area acquisition control includes a web page acquisition control. The display module 1001 is configured to display a list of candidate web pages in the live interface in response to a triggering operation on the web page acquisition control.
The determining module 1002 is configured to:
acquire a target web page in response to a selection operation on the list of candidate web pages; and determine the area indicated by the embedded code of the target web page as the interception area.
In an alternative design, the display module 1001 is configured to:
display, in response to an overlay display operation, the display content in an overlay display component provided by the anchor client, where the overlay display component is displayed in the display area used for displaying the live view and is overlaid on top of the live view.
In an alternative design, as shown in FIG. 11, the apparatus 100 further comprises:
the first processing module 1005 is configured to change a display position of the overlay display assembly in response to a drag operation on the overlay display assembly.
In an alternative design, as shown in FIG. 12, the apparatus 100 further comprises:
the second processing module 1006 is configured to change a display area of the overlay display assembly in response to a zoom operation on the overlay display assembly.
In an alternative design, the live view belongs to a first layer and the display content belongs to a second layer, the second layer being above the first layer. The synthesizing module 1004 is configured to:
synthesize the first layer and the second layer.
In an alternative design, as shown in FIG. 13, the apparatus 100 further comprises:
and the sending module 1007 is configured to send a live stream to the server, where the live stream is used to display a live image of the audience in a user interface of the audience client.
It should be noted that the live stream processing apparatus provided in the foregoing embodiments is described only with the above division of functional modules as an example. In practical applications, the above functions may be allocated to different functional modules as needed; that is, the internal structure of the device may be divided into different functional modules to perform all or part of the functions described above. In addition, the live stream processing apparatus provided in the foregoing embodiments belongs to the same concept as the live stream processing method embodiments; its specific implementation process is detailed in the method embodiments and is not repeated here.
Embodiments of the present application also provide a computer device, including a processor and a memory, where at least one instruction, at least one program, a code set, or an instruction set is stored in the memory and is loaded and executed by the processor to implement the live stream processing method provided by each of the method embodiments above.
Optionally, the computer device is a terminal. Fig. 14 is a schematic structural diagram of a terminal according to an embodiment of the present application.
In general, terminal 1400 includes: a processor 1401 and a memory 1402.
The processor 1401 may include one or more processing cores, such as a 4-core processor or an 8-core processor. The processor 1401 may be implemented in at least one hardware form of DSP (Digital Signal Processing), FPGA (Field-Programmable Gate Array), or PLA (Programmable Logic Array). The processor 1401 may also include a main processor and a coprocessor: the main processor is a processor for processing data in the awake state, also called a CPU (Central Processing Unit); the coprocessor is a low-power processor for processing data in the standby state. In some embodiments, the processor 1401 may be integrated with a GPU (Graphics Processing Unit) responsible for rendering and drawing the content that the display screen needs to display. In some embodiments, the processor 1401 may also include an AI (Artificial Intelligence) processor for handling computing operations related to machine learning.
The memory 1402 may include one or more computer-readable storage media, which may be non-transitory. The memory 1402 may also include high-speed random access memory and non-volatile memory, such as one or more magnetic disk storage devices or flash memory storage devices. In some embodiments, a non-transitory computer-readable storage medium in the memory 1402 is used to store at least one instruction, which is executed by the processor 1401 to implement the live stream processing method provided by the method embodiments of the present application.
In some embodiments, the terminal 1400 may optionally further include a peripheral interface 1403 and at least one peripheral device. The processor 1401, the memory 1402, and the peripheral interface 1403 may be connected by a bus or signal lines. Each peripheral device may be connected to the peripheral interface 1403 via a bus, signal line, or circuit board. Specifically, the peripheral devices include at least one of: a radio frequency circuit 1404, a display screen 1405, a camera assembly 1406, an audio circuit 1407, a positioning assembly 1408, and a power source 1409.
The peripheral interface 1403 may be used to connect at least one Input/Output (I/O) related peripheral device to the processor 1401 and the memory 1402. In some embodiments, the processor 1401, the memory 1402, and the peripheral interface 1403 are integrated on the same chip or circuit board; in some other embodiments, any one or two of the processor 1401, the memory 1402, and the peripheral interface 1403 may be implemented on a separate chip or circuit board, which is not limited in the embodiments of the present application.
The radio frequency circuit 1404 is configured to receive and transmit RF (Radio Frequency) signals, also known as electromagnetic signals. The radio frequency circuit 1404 communicates with communication networks and other communication devices via electromagnetic signals, converting electrical signals into electromagnetic signals for transmission and converting received electromagnetic signals into electrical signals. Optionally, the radio frequency circuit 1404 includes an antenna system, an RF transceiver, one or more amplifiers, a tuner, an oscillator, a digital signal processor, a codec chipset, a subscriber identity module card, and so forth. The radio frequency circuit 1404 may communicate with other terminals via at least one wireless communication protocol, including but not limited to: the World Wide Web, metropolitan area networks, intranets, mobile communication networks of various generations (2G, 3G, 4G, and 5G), wireless local area networks, and/or WiFi (Wireless Fidelity) networks. In some embodiments, the radio frequency circuit 1404 may also include NFC (Near Field Communication) related circuits, which is not limited in the present application.
The display screen 1405 is used to display a UI (User Interface), which may include graphics, text, icons, video, and any combination thereof. When the display screen 1405 is a touch display screen, it also has the ability to collect touch signals on or above its surface; a touch signal may be input to the processor 1401 as a control signal for processing. At this time, the display screen 1405 may also be used to provide virtual buttons and/or a virtual keyboard, also referred to as soft buttons and/or a soft keyboard. In some embodiments, there may be one display screen 1405, arranged on the front panel of the terminal 1400; in other embodiments, there may be at least two display screens 1405, arranged respectively on different surfaces of the terminal 1400 or in a folded design; in still other embodiments, the display screen 1405 may be a flexible display screen arranged on a curved or folded surface of the terminal 1400. The display screen 1405 may even be arranged in a non-rectangular irregular pattern, i.e. an irregularly-shaped screen. The display screen 1405 may be made of materials such as LCD (Liquid Crystal Display) or OLED (Organic Light-Emitting Diode).
The camera assembly 1406 is used to capture images or video. Optionally, the camera assembly 1406 includes a front camera and a rear camera. Typically, the front camera is arranged on the front panel of the terminal 1400 and the rear camera on the back of the terminal. In some embodiments, there are at least two rear cameras, each being any one of a main camera, a depth-of-field camera, a wide-angle camera, and a telephoto camera, so as to realize a background blurring function by fusing the main camera and the depth-of-field camera, panoramic shooting and VR (Virtual Reality) shooting by fusing the main camera and the wide-angle camera, or other fusion shooting functions. In some embodiments, the camera assembly 1406 may also include a flash. The flash may be a single-color-temperature flash or a dual-color-temperature flash; a dual-color-temperature flash is a combination of a warm-light flash and a cold-light flash, and can be used for light compensation at different color temperatures.
The audio circuitry 1407 may include a microphone and a speaker. The microphone is used for collecting sound waves of users and the environment, converting the sound waves into electric signals, and inputting the electric signals to the processor 1401 for processing, or inputting the electric signals to the radio frequency circuit 1404 for voice communication. For purposes of stereo acquisition or noise reduction, a plurality of microphones may be provided at different portions of the terminal 1400, respectively. The microphone may also be an array microphone or an omni-directional pickup microphone. The speaker is used to convert electrical signals from the processor 1401 or the radio frequency circuit 1404 into sound waves. The speaker may be a conventional thin film speaker or a piezoelectric ceramic speaker. When the speaker is a piezoelectric ceramic speaker, not only the electric signal can be converted into a sound wave audible to humans, but also the electric signal can be converted into a sound wave inaudible to humans for ranging and other purposes. In some embodiments, audio circuitry 1407 may also include a headphone jack.
The positioning component 1408 is used to locate the current geographic location of the terminal 1400 to enable navigation or LBS (Location Based Service). The positioning component 1408 may be based on the United States' GPS (Global Positioning System), the Chinese BeiDou system, the Russian GLONASS system, or the European Union's Galileo system.
The power supply 1409 is used to power the various components in the terminal 1400. The power supply 1409 may use alternating current, direct current, disposable batteries, or rechargeable batteries. When the power supply 1409 includes a rechargeable battery, the rechargeable battery may be a wired rechargeable battery or a wireless rechargeable battery. A wired rechargeable battery is charged through a wired line, and a wireless rechargeable battery is charged through a wireless coil. The rechargeable battery may also support fast-charging technology.
In some embodiments, terminal 1400 also includes one or more sensors 1410. The one or more sensors 1410 include, but are not limited to: acceleration sensor 1411, gyroscope sensor 1412, pressure sensor 1413, fingerprint sensor 1414, optical sensor 1415, and proximity sensor 1416.
The acceleration sensor 1411 may detect the magnitude of acceleration on the three coordinate axes of a coordinate system established with the terminal 1400. For example, the acceleration sensor 1411 may be used to detect the components of gravitational acceleration on the three coordinate axes. The processor 1401 may control the touch display 1405 to display the user interface in a landscape view or a portrait view according to the gravitational acceleration signal acquired by the acceleration sensor 1411. The acceleration sensor 1411 may also be used to collect game or user motion data.
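As a purely illustrative sketch (not part of the patent), the landscape/portrait decision described above can be derived from the gravity components the acceleration sensor reports on the x and y axes; the function name and decision rule are assumptions:

```python
def choose_orientation(ax: float, ay: float) -> str:
    """Pick a UI orientation from the gravity components (m/s^2) on the
    device's x (short) and y (long) axes.

    When gravity lies mostly along the long axis, the user is holding the
    terminal upright, so a portrait layout is chosen; otherwise landscape.
    """
    return "portrait" if abs(ay) >= abs(ax) else "landscape"

# Device held upright: gravity (~9.8 m/s^2) falls mostly on the y axis.
print(choose_orientation(0.5, 9.7))   # portrait
# Device turned on its side: gravity shifts to the x axis.
print(choose_orientation(9.7, 0.5))   # landscape
```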
The gyroscope sensor 1412 may detect the body direction and rotation angle of the terminal 1400, and may cooperate with the acceleration sensor 1411 to collect the user's 3D motion on the terminal 1400. Based on the data collected by the gyroscope sensor 1412, the processor 1401 may implement the following functions: motion sensing (e.g., changing the UI according to a tilting operation by the user), image stabilization during shooting, game control, and inertial navigation.
The pressure sensor 1413 may be disposed on a side frame of the terminal 1400 and/or a lower layer of the touch display 1405. When the pressure sensor 1413 is disposed on a side frame of the terminal 1400, it can detect the user's grip signal on the terminal 1400, and the processor 1401 performs left/right-hand recognition or quick operations according to the grip signal collected by the pressure sensor 1413. When the pressure sensor 1413 is disposed on the lower layer of the touch display 1405, the processor 1401 controls the operability controls on the UI interface according to the user's pressure operation on the touch display 1405. The operability controls include at least one of a button control, a scroll bar control, an icon control, and a menu control.
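A minimal, hypothetical sketch of the grip-signal classification step (the patent does not specify the recognition logic; the function name, threshold, and side-to-hand mapping are all assumptions):

```python
def dominant_grip_side(left_pressure: float, right_pressure: float,
                       threshold: float = 0.5) -> str:
    """Classify a grip signal from the two side-frame pressure readings.

    Returns which frame reports the stronger grip ('left' or 'right'),
    or 'none' when neither reading clears the threshold. A processor
    could map the dominant side to left/right-hand recognition, but the
    exact mapping is device-specific and not given in the text above.
    """
    if max(left_pressure, right_pressure) < threshold:
        return "none"
    return "left" if left_pressure >= right_pressure else "right"

print(dominant_grip_side(1.2, 0.3))  # left
print(dominant_grip_side(0.1, 0.2))  # none
```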
The fingerprint sensor 1414 is used to collect the user's fingerprint; either the processor 1401 identifies the user's identity based on the fingerprint collected by the fingerprint sensor 1414, or the fingerprint sensor 1414 itself identifies the user's identity based on the collected fingerprint. Upon recognizing the user's identity as trusted, the processor 1401 authorizes the user to perform relevant sensitive operations, including unlocking the screen, viewing encrypted information, downloading software, making payments, changing settings, and the like. The fingerprint sensor 1414 may be provided on the front, back, or side of the terminal 1400. When a physical key or vendor logo is provided on the terminal 1400, the fingerprint sensor 1414 may be integrated with the physical key or vendor logo.
The optical sensor 1415 is used to collect the ambient light intensity. In one embodiment, the processor 1401 may control the display brightness of the touch display 1405 based on the ambient light intensity collected by the optical sensor 1415: when the ambient light intensity is high, the display brightness of the touch display 1405 is turned up; when the ambient light intensity is low, it is turned down. In another embodiment, the processor 1401 may also dynamically adjust the shooting parameters of the camera component 1406 based on the ambient light intensity collected by the optical sensor 1415.
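The brightness adjustment above can be sketched as a simple clamped mapping from ambient light to a brightness level; this is an illustrative assumption, as the patent does not specify a curve, and all constants here are hypothetical:

```python
def display_brightness(ambient_lux: float,
                       min_level: int = 10,
                       max_level: int = 255,
                       full_scale_lux: float = 1000.0) -> int:
    """Map ambient light intensity (lux) onto a display brightness level.

    Brighter surroundings yield a higher brightness, clamped to the
    [min_level, max_level] range so extreme readings stay safe.
    """
    ratio = min(max(ambient_lux / full_scale_lux, 0.0), 1.0)
    return round(min_level + ratio * (max_level - min_level))

print(display_brightness(0))     # 10  (dark room: dimmest setting)
print(display_brightness(2000))  # 255 (direct light: clamped to maximum)
```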
The proximity sensor 1416, also referred to as a distance sensor, is typically provided on the front panel of the terminal 1400. The proximity sensor 1416 is used to collect the distance between the user and the front of the terminal 1400. In one embodiment, when the proximity sensor 1416 detects that the distance between the user and the front of the terminal 1400 gradually decreases, the processor 1401 controls the touch display 1405 to switch from the screen-on state to the screen-off state; when the proximity sensor 1416 detects that the distance gradually increases, the processor 1401 controls the touch display 1405 to switch from the screen-off state to the screen-on state.
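The distance-trend rule above reduces to a small state transition, sketched here under illustrative assumptions (the state names and function signature are not from the patent):

```python
def next_screen_state(prev_cm: float, curr_cm: float, state: str) -> str:
    """Decide the touch display's state from the proximity trend.

    A shrinking distance (e.g., the terminal raised to the ear) turns the
    screen off; a growing distance turns it back on; an unchanged
    distance keeps the current state.
    """
    if curr_cm < prev_cm:
        return "off"
    if curr_cm > prev_cm:
        return "on"
    return state

print(next_screen_state(10.0, 3.0, "on"))   # off (user approaching)
print(next_screen_state(3.0, 10.0, "off"))  # on  (user moving away)
```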
Those skilled in the art will appreciate that the structure shown in fig. 14 is not limiting, and that the terminal 1400 may include more or fewer components than those illustrated, combine certain components, or employ a different arrangement of components.
An embodiment of the present application also provides a computer-readable storage medium having at least one program code stored therein; when the program code is loaded and executed by a processor of a computer device, the live stream processing method provided by each of the above method embodiments is implemented.
The present application also provides a computer program product or computer program comprising computer instructions stored in a computer-readable storage medium. A processor of a computer device reads the computer instructions from the computer-readable storage medium and executes them, causing the computer device to perform the live stream processing method provided by each of the above method embodiments.
It will be understood by those skilled in the art that all or part of the steps of the above embodiments may be implemented by hardware, or by a program instructing the relevant hardware; the program may be stored in a computer-readable storage medium, which may be a read-only memory, a magnetic disk, an optical disc, or the like.
The foregoing is merely a description of preferred embodiments of the present application and is not intended to limit it; any modifications, equivalent substitutions, improvements, and the like made within the spirit and principles of the present application shall fall within its scope of protection.

Claims (11)

1. A live stream processing method, characterized in that the method is applied to an anchor client and comprises:
displaying a live broadcast picture and a region acquisition control on a live broadcast interface of an anchor account;
determining an interception region through the region acquisition control;
recording the display content of the interception region through a recording component provided by the anchor client;
displaying the display content superimposed on the live broadcast interface; and
synthesizing the live broadcast picture and the display content to obtain a live stream;
wherein the region acquisition control comprises a webpage acquisition control, and the determining the interception region through the region acquisition control comprises:
displaying a to-be-selected webpage list in the live broadcast interface in response to a triggering operation on the webpage acquisition control;
acquiring a target webpage in response to a selection operation on the to-be-selected webpage list, wherein a target code is embedded in the target webpage and is used for indicating the superimposed display operation; and
determining the region indicated by the embedded code of the target webpage as the interception region, wherein the embedded code is used for indicating fixed-point information and width-and-height information of the region in the target webpage.
2. The method according to claim 1, wherein the region acquisition control comprises a box selection control, and the determining the interception region through the region acquisition control comprises:
starting a box selection window on the desktop in response to a triggering operation on the box selection control; and
determining the interception region in response to a box selection operation on the box selection window.
3. The method according to claim 2, wherein the determining the interception region in response to a box selection operation on the box selection window comprises:
determining box selection position information in response to the box selection operation on the box selection window, wherein the box selection position information is used for indicating the region selected by the box selection operation on the desktop; and
determining the interception region according to the box selection position information.
4. The method according to any one of claims 1 to 3, wherein the displaying the display content superimposed on the live broadcast interface comprises:
displaying the display content in an overlay display component provided by the anchor client in response to a superimposed display operation, the overlay display component being displayed in a display area used for displaying the live broadcast picture and being superimposed on top of the live broadcast picture.
5. The method according to claim 4, further comprising:
changing a display position of the overlay display component in response to a drag operation on the overlay display component.
6. The method according to claim 4, further comprising:
changing a display area of the overlay display component in response to a zoom operation on the overlay display component.
7. The method according to any one of claims 1 to 3, wherein the live broadcast picture belongs to a first layer, the display content belongs to a second layer, and the second layer is above the first layer; and
the synthesizing the live broadcast picture with the display content comprises:
synthesizing the first layer and the second layer.
8. The method according to any one of claims 1 to 3, further comprising:
sending the live stream to a server, wherein the live stream is used for displaying a live audience picture in a user interface of an audience client.
9. A live stream processing apparatus, comprising:
a display module, configured to display a live broadcast picture and a region acquisition control on a live broadcast interface of an anchor account;
a determining module, configured to determine an interception region on the desktop through the region acquisition control;
a recording module, configured to record the display content of the interception region through a recording component provided by the anchor client;
the display module being further configured to display the display content superimposed on the live broadcast interface; and
a synthesis module, configured to synthesize the live broadcast picture and the display content to obtain a live stream;
wherein the region acquisition control comprises a webpage acquisition control, and the determining module is configured to: display a to-be-selected webpage list in the live broadcast interface in response to a triggering operation on the webpage acquisition control; acquire a target webpage in response to a selection operation on the to-be-selected webpage list, wherein a target code is embedded in the target webpage and is used for indicating the superimposed display operation; and determine the region indicated by the embedded code of the target webpage as the interception region, wherein the embedded code is used for indicating fixed-point information and width-and-height information of the region in the target webpage.
10. A computer device, comprising a processor and a memory, wherein the memory stores at least one instruction, at least one program, a code set, or an instruction set, which is loaded and executed by the processor to implement the live stream processing method according to any one of claims 1 to 8.
11. A computer-readable storage medium having at least one program code stored therein, the program code being loaded and executed by a processor to implement the live stream processing method according to any one of claims 1 to 8.
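The core of the claimed method — deriving an interception region from box-selection position information (claim 3) and synthesizing the second layer carrying the captured content over the first layer carrying the live picture (claim 7) — can be sketched in pure Python. All names and the row-major pixel representation are illustrative assumptions, not the patent's implementation:

```python
from dataclasses import dataclass

@dataclass
class Region:
    x: int       # fixed-point (top-left) position on the desktop
    y: int
    width: int   # width-and-height information of the region
    height: int

def region_from_box(x1: int, y1: int, x2: int, y2: int) -> Region:
    """Turn the two corners of a box-selection gesture into the
    interception region (i.e., normalize the box-selection position
    information regardless of drag direction)."""
    return Region(min(x1, x2), min(y1, y2), abs(x2 - x1), abs(y2 - y1))

def composite(first_layer, second_layer, region: Region):
    """Synthesize the layers: paste the recorded display content
    (second layer) over the live picture (first layer) at the region's
    fixed point. Frames are row-major lists of pixel values."""
    out = [row[:] for row in first_layer]   # leave the input untouched
    for dy, row in enumerate(second_layer):
        for dx, pixel in enumerate(row):
            out[region.y + dy][region.x + dx] = pixel
    return out

live = [[0] * 4 for _ in range(4)]   # 4x4 live picture, all black
overlay = [[9, 9], [9, 9]]           # 2x2 recorded display content
frame = composite(live, overlay, Region(1, 1, 2, 2))
print(frame[1])   # [0, 9, 9, 0]
```

The synthesized frame would then be encoded and pushed as the live stream; that encoding step is outside this sketch.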
CN202011472115.2A 2020-12-14 2020-12-14 Live stream processing method, device, equipment and storage medium Active CN112637624B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011472115.2A CN112637624B (en) 2020-12-14 2020-12-14 Live stream processing method, device, equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011472115.2A CN112637624B (en) 2020-12-14 2020-12-14 Live stream processing method, device, equipment and storage medium

Publications (2)

Publication Number Publication Date
CN112637624A CN112637624A (en) 2021-04-09
CN112637624B true CN112637624B (en) 2023-07-18

Family

ID=75312856

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011472115.2A Active CN112637624B (en) 2020-12-14 2020-12-14 Live stream processing method, device, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN112637624B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113255488A (en) * 2021-05-13 2021-08-13 广州繁星互娱信息科技有限公司 Anchor searching method and device, computer equipment and storage medium

Citations (1)

Publication number Priority date Publication date Assignee Title
CN110659092A (en) * 2019-08-13 2020-01-07 平安国际智慧城市科技股份有限公司 Webpage screenshot method and device, computer equipment and storage medium

Family Cites Families (15)

Publication number Priority date Publication date Assignee Title
CN106406710B (en) * 2016-09-30 2021-08-27 维沃移动通信有限公司 Screen recording method and mobile terminal
CN106791894B (en) * 2016-11-26 2018-08-31 广州华多网络科技有限公司 A kind of method and apparatus playing live video
CN106792092B (en) * 2016-12-19 2020-01-03 广州虎牙信息科技有限公司 Live video stream split-mirror display control method and corresponding device thereof
CN108073346A (en) * 2017-11-30 2018-05-25 深圳市金立通信设备有限公司 A kind of record screen method, terminal and computer readable storage medium
CN108449640B (en) * 2018-03-26 2021-05-07 广州虎牙信息科技有限公司 Live video output control method and device, storage medium and terminal
CN108989830A (en) * 2018-08-30 2018-12-11 广州虎牙信息科技有限公司 A kind of live broadcasting method, device, electronic equipment and storage medium
CN109361954B (en) * 2018-11-02 2021-03-26 腾讯科技(深圳)有限公司 Video resource recording method and device, storage medium and electronic device
CN109862385B (en) * 2019-03-18 2022-03-01 广州虎牙信息科技有限公司 Live broadcast method and device, computer readable storage medium and terminal equipment
CN110941383B (en) * 2019-10-11 2021-08-10 广州视源电子科技股份有限公司 Double-screen display method, device, equipment and storage medium
CN110740346B (en) * 2019-10-23 2022-04-22 北京达佳互联信息技术有限公司 Video data processing method, device, server, terminal and storage medium
CN110784735A (en) * 2019-11-12 2020-02-11 广州虎牙科技有限公司 Live broadcast method and device, mobile terminal, computer equipment and storage medium
CN111246126A (en) * 2020-03-11 2020-06-05 广州虎牙科技有限公司 Direct broadcasting switching method, system, device, equipment and medium based on live broadcasting platform
CN111541930B (en) * 2020-04-27 2023-04-25 广州酷狗计算机科技有限公司 Live broadcast picture display method and device, terminal and storage medium
CN111464830B (en) * 2020-05-19 2022-07-15 广州酷狗计算机科技有限公司 Method, device, system, equipment and storage medium for image display
CN111901654A (en) * 2020-08-04 2020-11-06 海信视像科技股份有限公司 Display device and screen recording method

Patent Citations (1)

Publication number Priority date Publication date Assignee Title
CN110659092A (en) * 2019-08-13 2020-01-07 平安国际智慧城市科技股份有限公司 Webpage screenshot method and device, computer equipment and storage medium

Also Published As

Publication number Publication date
CN112637624A (en) 2021-04-09

Similar Documents

Publication Publication Date Title
CN110971930B (en) Live virtual image broadcasting method, device, terminal and storage medium
CN108710525B (en) Map display method, device, equipment and storage medium in virtual scene
CN108595239B (en) Picture processing method, device, terminal and computer readable storage medium
CN110213638B (en) Animation display method, device, terminal and storage medium
CN111464749B (en) Method, device, equipment and storage medium for image synthesis
CN108965922B (en) Video cover generation method and device and storage medium
CN108694073B (en) Control method, device and equipment of virtual scene and storage medium
CN109859102B (en) Special effect display method, device, terminal and storage medium
CN109646944B (en) Control information processing method, control information processing device, electronic equipment and storage medium
CN108897597B (en) Method and device for guiding configuration of live broadcast template
CN111464830B (en) Method, device, system, equipment and storage medium for image display
CN110740340B (en) Video live broadcast method and device and storage medium
CN110868636B (en) Video material intercepting method and device, storage medium and terminal
CN110225390B (en) Video preview method, device, terminal and computer readable storage medium
CN112565806B (en) Virtual gift giving method, device, computer equipment and medium
CN113157172A (en) Barrage information display method, transmission method, device, terminal and storage medium
CN110839174A (en) Image processing method and device, computer equipment and storage medium
CN111083526B (en) Video transition method and device, computer equipment and storage medium
CN111565338A (en) Method, device, system, equipment and storage medium for playing video
CN113377270B (en) Information display method, device, equipment and storage medium
CN112822544B (en) Video material file generation method, video synthesis method, device and medium
CN113032590B (en) Special effect display method, device, computer equipment and computer readable storage medium
CN110992268B (en) Background setting method, device, terminal and storage medium
CN112637624B (en) Live stream processing method, device, equipment and storage medium
CN108228052B (en) Method and device for triggering operation of interface component, storage medium and terminal

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant