CN112637624A - Live stream processing method, device, equipment and storage medium - Google Patents


Info

Publication number
CN112637624A
CN112637624A (application CN202011472115.2A)
Authority
CN
China
Prior art keywords
display
live
area
live broadcast
display content
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202011472115.2A
Other languages
Chinese (zh)
Other versions
CN112637624B (en)
Inventor
潘佳志
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangzhou Fanxing Huyu IT Co Ltd
Original Assignee
Guangzhou Fanxing Huyu IT Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangzhou Fanxing Huyu IT Co Ltd
Priority to CN202011472115.2A
Publication of CN112637624A
Application granted
Publication of CN112637624B
Legal status: Active
Anticipated expiration

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20 Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/21 Server components or server architectures
    • H04N21/218 Source of audio or video content, e.g. local disk arrays
    • H04N21/2187 Live feed
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/431 Generation of visual interfaces for content selection or interaction; Content or additional data rendering
    • H04N21/4312 Generation of visual interfaces involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations
    • H04N21/44 Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream, rendering scenes according to MPEG-4 scene graphs
    • H04N21/47 End-user applications
    • H04N21/485 End-user interface for client configuration
    • H04N21/4858 End-user interface for client configuration for modifying screen layout parameters, e.g. fonts, size of the windows

Abstract

The application discloses a live stream processing method, apparatus, device, and storage medium, and belongs to the technical field of live broadcast. The method comprises the following steps: displaying a live broadcast picture and a regional acquisition control on a live broadcast interface of the anchor account; determining an intercepted area on the desktop through the area acquisition control; recording the display content of the intercepted area through a recording component provided by the anchor client; displaying the display content on the live interface in an overlapping manner; and synthesizing the live broadcast picture and the display content to obtain the live stream. The live broadcast function can be updated simply by changing the recorded display content. In this process, only the anchor client needs to be developed, which improves the updating efficiency of the live broadcast platform.

Description

Live stream processing method, device, equipment and storage medium
Technical Field
The present application relates to the field of live broadcast technologies, and in particular, to a live broadcast stream processing method, apparatus, device, and storage medium.
Background
With the rapid development of the live broadcast industry, manufacturers providing live broadcast services need to update the live broadcast platform frequently to ensure user experience. A live broadcast platform comprises an anchor client and audience clients.
At present, developers usually update the live broadcast platform by developing the same new functions for both the anchor client and the audience client and then updating the two clients separately.
When the live broadcast platform is updated in this manner, the anchor client and the audience client need to be developed at the same time, and the developed functions must be kept consistent, so the updating efficiency is low.
Disclosure of Invention
The application provides a live stream processing method, apparatus, device, and storage medium, which can improve the updating efficiency of a live broadcast platform. The technical scheme is as follows:
according to an aspect of the present application, there is provided a live stream processing method, including:
displaying a live broadcast picture and a regional acquisition control on a live broadcast interface of the anchor account;
determining an intercepted area on the desktop through the area acquisition control;
recording the display content of the intercepted area through a recording component provided by an anchor client;
displaying the display content on the live broadcast interface in an overlapping manner;
and synthesizing the live broadcast picture and the display content to obtain a live broadcast stream.
According to another aspect of the present application, there is provided a live stream processing apparatus, the apparatus including:
the display module is used for displaying a live broadcast picture and a regional acquisition control on a live broadcast interface of the anchor account;
the determining module is used for determining the intercepted area on the desktop through the area acquisition control;
the recording module is used for recording the display content of the intercepted area through a recording component provided by the anchor client;
the display module is used for displaying the display content on the live broadcast interface in an overlapping manner;
and the synthesis module is used for synthesizing the live broadcast picture and the display content to obtain a live broadcast stream.
In an alternative design, the region acquisition control comprises a frame selection control; the determining module is configured to:
responding to the triggering operation on the frame selection control, and starting a frame selection window on the desktop;
and responding to the frame selection operation on the frame selection window, and determining the intercepting area.
In an alternative design, the determining module is configured to:
responding to the frame selection operation on the frame selection window, and determining frame selection position information, wherein the frame selection position information is used for indicating the area framed by the frame selection operation on the desktop;
and determining the intercepting area according to the framing position information.
In an alternative design, the region acquisition control comprises a web page acquisition control; the display module is used for:
responding to the triggering operation on the webpage acquisition control, and displaying a list of webpages to be selected in the live broadcast interface;
the determining module is configured to:
responding to the selection operation of the to-be-selected webpage list, and acquiring a target webpage;
and determining the area indicated by the embedded code of the target webpage as the intercepting area.
In an alternative design, the display module is configured to:
and responding to the superposition display operation, displaying the display content in a superposition display component provided by the anchor client, wherein the superposition display component is displayed in a display area for displaying the live broadcast picture, and the superposition display component is superposed on the live broadcast picture.
In an alternative design, the apparatus further comprises:
and the first processing module is used for responding to the dragging operation of the superposed display assembly and changing the display position of the superposed display assembly.
In an alternative design, the apparatus further comprises:
and the second processing module is used for responding to the zooming operation of the superposed display assembly and changing the display area of the superposed display assembly.
In an optional design, the live broadcast picture belongs to a first layer, the display content belongs to a second layer, and the second layer is on the first layer; the synthesis module is configured to:
and synthesizing the first image layer and the second image layer.
In an alternative design, the apparatus further comprises:
and the sending module is used for sending the live stream to a server, and the live stream is used for displaying the audience live broadcast picture in a user interface of the audience client.
According to another aspect of the present application, there is provided a computer device comprising a processor and a memory, the memory having stored therein at least one instruction, at least one program, set of codes, or set of instructions, which is loaded and executed by the processor to implement a live stream processing method as described above.
According to another aspect of the present application, there is provided a computer-readable storage medium having at least one program code stored therein, the program code being loaded and executed by a processor to implement the live stream processing method as described above.
According to another aspect of the application, a computer program product or computer program is provided, comprising computer instructions stored in a computer readable storage medium. The processor of the computer device reads the computer instructions from the computer-readable storage medium, and the processor executes the computer instructions to cause the computer device to execute the live stream processing method provided in the various alternative implementations of the above aspects.
The beneficial effect that technical scheme that this application provided brought includes at least:
and synthesizing the display content which is superposed and displayed on the live broadcast interface with a live broadcast picture to obtain the live broadcast stream. According to the recorded different display contents, different live broadcast streams can be generated based on the live broadcast pictures, so that different live broadcast pictures of the audiences can be displayed in the audiences. And the live broadcast function can be updated by changing the recorded display content. In the process, only the anchor client needs to be developed, and the updating efficiency of the live broadcast platform is improved.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed in the description of the embodiments are briefly introduced below. It is obvious that the drawings in the following description are only some embodiments of the present application, and those of ordinary skill in the art can obtain other drawings from these drawings without creative effort.
Fig. 1 is a schematic diagram of a principle of processing a live stream provided by an embodiment of the present application;
fig. 2 is a schematic flowchart of a live stream processing method according to an embodiment of the present application;
fig. 3 is a schematic flowchart of another live stream processing method provided in an embodiment of the present application;
FIG. 4 is a schematic diagram of an implementation process for determining an intercepting region according to an embodiment of the present application;
FIG. 5 is a diagram illustrating a selection window provided in an embodiment of the present application;
FIG. 6 is a schematic diagram of another implementation process for determining an intercepting region according to an embodiment of the present application;
FIG. 7 is a schematic diagram illustrating a determination of a capture area by framing a selection window according to an embodiment of the present application;
FIG. 8 is a schematic diagram of a list of candidate web pages provided by an embodiment of the present application;
FIG. 9 is a schematic diagram of displaying display content through an overlay display component according to an embodiment of the present application;
fig. 10 is a schematic structural diagram of a live stream processing apparatus according to an embodiment of the present application;
fig. 11 is a schematic structural diagram of another live stream processing apparatus according to an embodiment of the present application;
fig. 12 is a schematic structural diagram of another live stream processing apparatus according to an embodiment of the present application;
fig. 13 is a schematic structural diagram of a further live stream processing apparatus according to an embodiment of the present application;
fig. 14 is a schematic structural diagram of a terminal according to an embodiment of the present application.
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present application and together with the description, serve to explain the principles of the application.
Detailed Description
To make the objects, technical solutions and advantages of the present application more clear, embodiments of the present application will be described in further detail below with reference to the accompanying drawings.
Fig. 1 is a schematic diagram of a principle of processing a live stream provided by an embodiment of the present application. As shown in fig. 1, in a live interface 101 of the anchor account, the anchor client displays a live screen 102 of the anchor account. An area acquisition control 103 is also displayed in the live interface 101.

Optionally, the area acquisition control is a frame selection control. The anchor client starts a frame selection window on the desktop of the computer device where the anchor client is located according to the triggering operation on the frame selection control, and determines an intercepting area according to the frame selection operation on the frame selection window. For example, the desktop displays a target web page 104 that the anchor client obtains and displays through a browser of the computer device according to the web page obtaining operation. The anchor client can determine a web page area 105 as the intercepting area from the displayed target web page according to the frame selection operation on the frame selection window. When the computer device switches to display different content on the desktop, the frame selection window remains displayed on the desktop.

Optionally, the area acquisition control is a web page acquisition control. The anchor client can display the list of web pages to be selected through the triggering operation on the web page acquisition control, acquire a target web page selected by the anchor user from the list through a web page acquisition component provided by the anchor client, and determine the area in the target web page indicated by the embedded code of the target web page as the intercepting area. The web pages in the list are provided by developers of the live broadcast platform, and the embedded codes are written by the developers.
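As a rough illustration of how an embedded code could indicate an intercepting area, the following Python sketch parses a region out of an embed snippet. The attribute names (`data-x`, `data-y`, `width`, `height`) and the snippet format are assumptions for illustration only; the patent does not specify the embed schema.

```python
import re

def region_from_embed_code(embed_code: str) -> dict:
    """Parse a capture region from a web page's embed code.

    The attribute names are a hypothetical schema, not taken from the
    patent; a real live platform would define its own format.
    """
    attrs = dict(re.findall(r'(data-x|data-y|width|height)="(\d+)"', embed_code))
    return {
        "x": int(attrs.get("data-x", 0)),
        "y": int(attrs.get("data-y", 0)),
        "w": int(attrs["width"]),
        "h": int(attrs["height"]),
    }

snippet = '<div class="live-widget" data-x="40" data-y="80" width="320" height="240"></div>'
print(region_from_embed_code(snippet))  # {'x': 40, 'y': 80, 'w': 320, 'h': 240}
```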
Then, the anchor client displays the display content of the intercepted area, recorded by the recording component provided by the anchor client, in the overlay display component 106 according to the overlay display operation. The overlay display component is displayed in the display area of the live broadcast picture and is superimposed on the live broadcast picture, thereby displaying the display content superimposed in the display area of the live screen 102. Optionally, the anchor client can also change the display position of the overlay display component 106 in the display area according to a drag operation, and change the display area of the overlay display component according to a zoom operation. The anchor client then synthesizes the first layer where the live broadcast picture 102 is located with the second layer where the display content in the overlay display component 106 is located, so as to obtain a live stream, and sends the live stream to a server corresponding to the anchor client. The server sends the live stream to a viewer account watching the live broadcast of the anchor account, so that the viewer client where the viewer account is located can display a viewer live broadcast picture 107 of the anchor account in its user interface according to the live stream. The viewer live picture includes the live frame 102 and the display content, and the live frame 102 and the display content are in the same layer.
The display content superimposed on the live view 102 is synthesized with the live view 102 to obtain a live stream, which is sent to the viewer client via the server. Different viewer live pictures 107 can be displayed on the viewer client according to the different recorded display contents, and the live broadcast function can be updated by changing the recorded display content. In this process, only the anchor client needs to be developed, which improves the updating efficiency of the live broadcast platform.
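The Fig. 1 flow (record the intercepted region, composite it over the live frame) can be modeled end to end in a few lines. This is a minimal sketch with illustrative data structures: frames are `{(x, y): pixel}` dictionaries, which is an assumption for clarity, not the patent's implementation.

```python
from dataclasses import dataclass

@dataclass
class CaptureRegion:
    # Intercepted area on the desktop, in desktop coordinates (illustrative).
    x: int
    y: int
    w: int
    h: int

def process_live_stream(live_frame, desktop, region):
    """Sketch of the Fig. 1 pipeline: record the region, then composite it
    over the live frame. Frames are {(x, y): pixel} dicts for illustration."""
    # Record the display content of the intercepted region (the second layer).
    overlay = {(x, y): desktop[(x, y)]
               for x in range(region.x, region.x + region.w)
               for y in range(region.y, region.y + region.h)}
    # Composite the second layer over the first layer (the live frame),
    # translated to the overlay component's origin.
    composed = dict(live_frame)
    composed.update({(x - region.x, y - region.y): p
                     for (x, y), p in overlay.items()})
    return composed  # would then be encoded and pushed to the server
```

A real client would operate on video buffers and push an encoded stream; the dictionary model only shows the ordering of the steps.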
Fig. 2 is a schematic flow chart of a live stream processing method according to an embodiment of the present application. The method may be used for a computer device or an anchor client on a computer device. As shown in fig. 2, the method includes:
step 201: and displaying a live broadcast picture and a regional acquisition control on a live broadcast interface of the anchor account.
The anchor account is any account that logs in to the anchor client. The anchor client is a client with a live broadcast function, such as a music client, a short video client, a karaoke client, a shopping client, a live broadcast client, or a friend-making client. The live interface of the anchor account is the live interface provided for the anchor user who logs in to the anchor account, such as the live room interface of the anchor account. Illustratively, the region acquisition control is a button, which may be displayed directly on the live interface, in a popup window of the live interface, or floating over the live interface.
The anchor client displays the live broadcast picture according to a video stream recorded by the computer device where the anchor client is located, or according to a video stream acquired from that computer device. Alternatively, the anchor client obtains the video stream from another computer device and displays the live view accordingly. The live picture is obtained by playing the video stream.
Step 202: and determining the intercepted area through the area acquisition control.
Optionally, the intercepted area includes an area in the desktop or an area in the target web page. The desktop refers to the desktop, including any display content, displayed by the computer device where the anchor client is located. The anchor client can determine the intercepting area on a desktop including different display contents according to the frame selection control; the intercepted area may be any area on the desktop, for example the entire desktop or a partial area of it. The target web page is determined by the anchor client from pre-selected web pages according to the selection operation of the anchor user; the pre-selected web pages are provided by a developer of the live platform. Optionally, the anchor client determines the intercepting area according to a frame selection operation, which indicates the position that the intercepted area occupies in the desktop. Alternatively, the anchor client determines the area in the target web page indicated by the embedded code of the target web page as the intercepting area. The shape of the intercepted area may be rectangular, circular, oval, triangular, or the like.
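For the rectangular case, turning a frame selection operation into an intercepted area amounts to normalizing the drag endpoints into an (x, y, w, h) rectangle. A minimal sketch, assuming the drag start and end points are available as desktop coordinates (the names are illustrative, not the patent's):

```python
def rect_from_frame_selection(start, end):
    """Normalize a frame-selection drag into an (x, y, w, h) rectangle.

    `start` and `end` are (x, y) desktop coordinates where the drag began
    and ended; the drag may go in any direction, so take mins and spans.
    """
    x0, y0 = start
    x1, y1 = end
    return (min(x0, x1), min(y0, y1), abs(x1 - x0), abs(y1 - y0))

# A drag up-and-left still yields a well-formed rectangle:
print(rect_from_frame_selection((300, 120), (100, 400)))  # (100, 120, 200, 280)
```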
Step 203: and recording the display content of the intercepted area through a recording component provided by the anchor client.
After the anchor client determines the intercepting area, the recording component of the anchor client is started to record the display content in the intercepting area. Even when the desktop currently displayed by the computer device where the anchor client is located does not include the intercepting area, the anchor client can still record the display content of the intercepting area through the recording component; that is, the recording component records in the background. For example, the anchor client records the display content of the intercepted area through a recording component running in a virtual-machine-based mode. The display content is recorded either as static display content of the intercepting region (a picture) or as real-time display content of the intercepting region (a video). Optionally, the anchor client obtains the display data of the intercepted area at different moments during recording through the recording component, thereby recording the display content of the intercepted area.
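Sampling the region's display data at different moments can be sketched as cropping the region out of each desktop snapshot. This is a pure-Python illustration with 2-D lists standing in for screen buffers; a real recording component reads the actual screen or a virtual framebuffer.

```python
def record_region(snapshots, region):
    """Crop each desktop snapshot to the intercepted region.

    `snapshots` is a list of 2-D pixel arrays (rows of columns) sampled at
    successive moments; `region` is (x, y, w, h). This models the patent's
    point that recording continues regardless of what is visible on screen.
    """
    x, y, w, h = region
    return [[row[x:x + w] for row in snap[y:y + h]] for snap in snapshots]

snap = [[1, 2, 3],
        [4, 5, 6],
        [7, 8, 9]]
print(record_region([snap], (1, 1, 2, 2)))  # [[[5, 6], [8, 9]]]
```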
Step 204: and displaying the display content on the live interface in an overlapping manner.
The display content of the intercepting area is overlapped on the live interface to be displayed, and the display content can cover the content displayed in the live interface. The display content covers the entire live interface or covers a portion of the live interface. Alternatively, the display content can be displayed superimposed on a display area in the live view. The display area is used for displaying a live broadcast picture of the anchor account. The display content is displayed covering the entire display area, or covering a portion of the display area.
Illustratively, the display area is an area in the desktop that is displayed according to data provided by a developer of the anchor client. The display content of the display area is, for example, a decoration element or a gift list of the live broadcast room of the anchor account. By displaying the display content superimposed on the live broadcast picture, the effect of adding decorative elements to the anchor's face in the live broadcast picture, or of adding a gift list display to the live broadcast picture, can be achieved. By updating the provided data, the developer can enable the anchor client to determine display areas containing different display contents, thereby adding and displaying different elements in the live broadcast picture and updating the live broadcast function.
Optionally, the anchor client implements the display of the display content in an overlay manner through an overlay display component provided by the anchor client. The overlay display component is displayed in a display area for displaying a live screen and is overlaid on the live screen. I.e. the overlay display component will obscure the live view. The display content is displayed in the overlay display component. Optionally, the display area of the display content is the same as the display area of the overlay display component, and the display scale is the same. The anchor client can also change the size and the position of the displayed overlapping display component according to the dragging operation and the zooming operation of the anchor user on the overlapping display component, so that the size and the position of the displayed display content are changed.
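The drag and zoom behavior of the overlay display component can be sketched as two small updates on an (x, y, w, h) rectangle, clamped to the live-frame display area. The clamping policy (keep the component fully inside, cap the size at the area size) is an assumption; the patent does not specify bounds handling.

```python
def drag(comp, dx, dy, area_w, area_h):
    """Move the overlay component by (dx, dy), clamped inside the display area."""
    x, y, w, h = comp
    return (max(0, min(x + dx, area_w - w)),
            max(0, min(y + dy, area_h - h)), w, h)

def zoom(comp, factor, area_w, area_h):
    """Scale the overlay component, capped at the display-area size."""
    x, y, w, h = comp
    return (x, y, min(int(w * factor), area_w), min(int(h * factor), area_h))

comp = (100, 100, 200, 150)
print(drag(comp, 1000, -500, 1280, 720))  # (1080, 0, 200, 150)
print(zoom((0, 0, 200, 150), 2.0, 1280, 720))  # (0, 0, 400, 300)
```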
Step 205: and synthesizing the live broadcast picture and the display content to obtain the live broadcast stream.
By playing the live stream, a live frame of the audience can be obtained, wherein the live frame of the audience comprises the live frame and the display content. And the layer displaying the live broadcast picture is different from the layer displaying the display content. The live broadcast picture belongs to a first layer, the display content belongs to a second layer, and the second layer is arranged on the first layer. Optionally, the anchor client merges the first layer and the second layer to obtain a merged layer, and the live stream can be obtained according to the merged layer. The anchor client can send the live stream to the audience client where the audience account watching the live broadcast of the anchor account is located through the corresponding server, and the live stream is used for displaying the live broadcast picture of the audience in the user interface of the audience client.
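Merging the first layer (live broadcast picture) with the second layer (display content) can be sketched as an over-composite in which the second layer's transparent pixels let the live frame show through. Layers are modeled as `{(x, y): pixel}` dicts with `None` marking transparency; this representation is an illustrative assumption.

```python
def composite_layers(first_layer, second_layer):
    """Merge the second layer (display content) over the first (live frame).

    Layers are {(x, y): pixel} dicts; None marks transparent pixels of the
    second layer, through which the live frame remains visible.
    """
    merged = dict(first_layer)
    for pos, pixel in second_layer.items():
        if pixel is not None:
            merged[pos] = pixel
    return merged

live = {(0, 0): 'live', (1, 0): 'live'}
gift = {(0, 0): 'gift', (1, 0): None}
print(composite_layers(live, gift))  # {(0, 0): 'gift', (1, 0): 'live'}
```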
Optionally, when the display area of the display content exceeds the display area of the live view, the anchor client, when synthesizing the live view and the display content, only synthesizes the live view with the part of the display content within the display area of the live view.
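Synthesizing only the in-area part of the display content is a rectangle intersection. A minimal sketch with (x, y, w, h) rectangles, returning `None` when the content lies entirely outside the live-frame display area:

```python
def clip_to_display_area(content, area):
    """Intersect the content rectangle with the live-frame display area.

    Rectangles are (x, y, w, h); returns None when they do not overlap,
    so only the overlapping part of the display content is composited.
    """
    cx, cy, cw, ch = content
    ax, ay, aw, ah = area
    x0, y0 = max(ax, cx), max(ay, cy)
    x1, y1 = min(ax + aw, cx + cw), min(ay + ah, cy + ch)
    if x0 >= x1 or y0 >= y1:
        return None
    return (x0, y0, x1 - x0, y1 - y0)

# Content sticking out past the right and bottom edges is clipped:
print(clip_to_display_area((600, 100, 300, 200), (0, 0, 800, 600)))  # (600, 100, 200, 200)
```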
In summary, the live stream processing method provided in the embodiment of the present application synthesizes the display content, which is displayed in a superimposed manner on the live interface, with the live frame to obtain a live stream. According to the different recorded display contents, different live streams can be generated based on the live frame, so that different audience live broadcast pictures can be displayed on the audience clients. The live broadcast function can be updated by changing the recorded display content. In this process, only the anchor client needs to be developed, which improves the updating efficiency of the live broadcast platform.
Fig. 3 is a flowchart illustrating another live stream processing method according to an embodiment of the present application. The method may be used for a computer device or an anchor client on a computer device. As shown in fig. 3, the method includes:
step 301: and displaying a live broadcast picture and a regional acquisition control on a live broadcast interface of the anchor account.
The live interface of the anchor account is the live interface provided for the anchor user who logs in to the anchor account, such as the live room interface of the anchor account. Illustratively, the region acquisition control is a button, which may be displayed directly on the live interface, in a popup window of the live interface, or floating over the live interface. Illustratively, with continued reference to fig. 1, a region acquisition control 103 is displayed in the live interface 101, and the region acquisition control 103 is a button.
The anchor client displays the live broadcast picture according to a video stream recorded by the computer device where the anchor client is located, or according to a video stream acquired from that computer device. Alternatively, the anchor client obtains the video stream from another computer device and displays the live view accordingly. The live picture is obtained by playing the video stream.
Step 302: and determining the intercepted area through the area acquisition control.
Optionally, the intercepted area includes an area in the desktop and an area in the target web page. The desktop refers to a desktop including any display content displayed by the computer equipment where the anchor client is located. The anchor client can determine the intercepting area on the desktop comprising different display contents according to the frame selection control. The cut-out area includes any area on the desktop. The target web page is determined from the pre-selected web pages by the anchor client according to the selection operation of the anchor user. The pre-selected web page is provided by a developer of the live platform.
Optionally, the area obtaining control includes a frame selection control, and as shown in fig. 4, the implementation process of step 302 includes the following steps 3021 and 3022:
In step 3021a, a frame selection window is started on the desktop in response to a trigger operation on the frame selection control.
The triggering operation includes a touch operation on the frame selection control or a voice instruction for the frame selection control. The touch operation may be triggered by an external device such as a mouse or keyboard, or by a gesture operation of the anchor user on the touch screen of the computer device where the anchor client is located, and includes single click, double click, long press, and the like. When the anchor client receives, through the computer device, a voice instruction such as "select the frame selection control", "trigger the frame selection control", or "start frame selection", it determines that the trigger operation has been received.
When the anchor client starts the frame selection window, the frame selection window is displayed, floating on the desktop. The shape of the frame selection window may be a rectangle, a circle, an ellipse, a triangle, or the like, and its border may be a solid line or a dotted line, for example a red dotted line that highlights the position the frame selection window occupies on the desktop. The inside of the border is transparent at a preset transparency, so that the display content at the position occupied by the frame selection window remains visible. When the computer device on which the anchor client is located switches between desktops containing different display content, the anchor client keeps displaying the frame selection window. The anchor client can change the position and size of the frame selection window on the desktop according to drag and zoom operations on the frame selection window, so that different areas of the desktop can be framed by the frame selection window.
Illustratively, fig. 5 is a schematic diagram of a frame selection window provided in an embodiment of the present application. As shown in fig. 5, a frame selection button 402 is displayed in a live interface 401 of the anchor account. When the anchor client receives a single-click operation on the frame selection button 402, a frame selection window 403 is displayed on the desktop displayed by the computer device on which the anchor client is located. The border of the frame selection window 403 is a dotted line, and the transparency inside the border is 100%. The anchor client can frame different capture areas on the desktop through the frame selection window 403.
In step 3022a, in response to a frame selection operation on the frame selection window, the capture area is determined.
The frame selection operation indicates that the area framed by the frame selection window on the desktop is to be determined as the capture area. Optionally, when the anchor client receives a determination operation for the area framed by the frame selection window, it determines that the frame selection operation has been received. The determination operation includes a touch operation on the area framed by the frame selection window, the frame selection window keeping the same position and size for a preset duration, or a touch operation on a confirmation control displayed in correspondence with the frame selection window. When the frame selection window is first started, it is displayed on the desktop at a default size and position; only after the anchor client receives a drag operation or a zoom operation that changes the display position or display size of the frame selection window is the frame selection operation determined to have been received. Alternatively, the frame selection operation is determined to have been received as soon as the frame selection window is started.
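Illustratively, the criterion that the frame selection window keeps the same position and size for a preset duration can be sketched as follows. This is a hypothetical Python sketch: the `selection_confirmed` name, the sample format, and the 2-second default are assumptions for illustration, not part of the anchor client.

```python
def selection_confirmed(history, hold_seconds=2.0):
    """history: chronological (timestamp, x, y, width, height) samples of
    the frame selection window. The frame selection operation is treated
    as received once the window has kept the same position and size for
    at least hold_seconds."""
    if not history:
        return False
    t_last, *geometry = history[-1]
    for t, *g in reversed(history):
        if g != geometry:
            return False  # the window moved or resized within the hold period
        if t_last - t >= hold_seconds:
            return True
    return False  # samples do not yet cover the preset duration
```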
In response to the frame selection operation on the frame selection window, the anchor client determines frame selection position information. The frame selection position information indicates the area framed by the frame selection operation on the desktop, that is, the area inside the border of the frame selection window. The anchor client can then determine the capture area according to the frame selection position information. When the desktop currently displayed by the computer device does not include the framed area, the capture area remains unchanged.
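Illustratively, deriving the capture area from the frame selection position information amounts to normalizing the two corner points of the framed area into a rectangle. A hypothetical Python sketch (the `Region` type and function name are assumptions for illustration):

```python
from dataclasses import dataclass

@dataclass
class Region:
    x: int       # left edge on the desktop, in pixels
    y: int       # top edge on the desktop, in pixels
    width: int
    height: int

def region_from_selection(x1, y1, x2, y2):
    """Build the capture area from the two corner points of a
    frame-selection drag, regardless of drag direction."""
    return Region(min(x1, x2), min(y1, y2), abs(x2 - x1), abs(y2 - y1))
```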
Optionally, the region acquisition control includes a web page acquisition control. As shown in fig. 6, the implementation of step 302 then includes the following steps 3021b to 3023b:
In step 3021b, in response to a trigger operation on the web page acquisition control, a candidate web page list is displayed in the live interface.
The web page acquisition control is, for example, a button. Optionally, the candidate web page list is displayed in a popup, and the popup is displayed in the live interface. The trigger operation includes a single click, a double click, a long press, or a slide operation on the web page acquisition control. The candidate web pages in the list are provided by the developer of the live platform and are used to implement update functions of the live platform.
In step 3022b, in response to a selection operation on the candidate web page list, a target web page is obtained.
The target web page is the web page selected from the candidate web page list by the selection operation. The selection operation includes a single click, a double click, a check, or a long press on a candidate web page in the list. Optionally, the anchor client obtains the target web page through a web page acquisition component provided by the anchor client. Alternatively, the anchor client can obtain the target web page through a browser installed on the computer device on which the anchor client is located.
In step 3023b, the area indicated by the embedded code of the target web page is determined as the capture area.
The embedded code is written by the developer into the code corresponding to the pre-selected web page. Optionally, when the anchor client obtains the target web page through the web page acquisition component and reads the embedded code, the area in the target web page indicated by the embedded code is determined as the capture area. The embedded code can indicate vertex information and width-and-height information of an area in the target web page. Optionally, when the live client obtains the target web page, the target web page is also displayed in the live interface through the web page acquisition component.
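Illustratively, reading the embedded code can be as simple as scanning the page source for a marker that carries the vertex and width-and-height information. The marker format below (an HTML comment named `capture-region`) is a purely hypothetical convention for illustration; this application does not specify the actual encoding.

```python
import re

# Hypothetical convention: the pre-selected page embeds a marker such as
# <!-- capture-region x=20 y=40 w=320 h=180 --> in its HTML source.
MARKER = re.compile(r"capture-region\s+x=(\d+)\s+y=(\d+)\s+w=(\d+)\s+h=(\d+)")

def region_from_embedded_code(html: str):
    """Return the capture area indicated by the embedded code,
    or None if the page carries no marker."""
    m = MARKER.search(html)
    if m is None:
        return None
    x, y, w, h = map(int, m.groups())
    return {"x": x, "y": y, "width": w, "height": h}
```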
Illustratively, fig. 7 is a schematic diagram of determining a capture area through a frame selection window according to an embodiment of the present application. As shown in fig. 7, the anchor client starts a frame selection window 403 on the desktop, which displays the live interface 401 of the anchor account. The anchor client then displays a sports live web page 404 on the desktop through the browser of the computer device according to a web page acquisition operation of the anchor user, at which point the frame selection window 403 is displayed over the live web page 404. After the anchor user adjusts the position and size of the frame selection window 403, the anchor client determines the capture area according to a determination operation (a double click) on the frame selection window 403.
Illustratively, fig. 8 is a schematic diagram of a candidate web page list provided in an embodiment of the present application. As shown in fig. 8, a web page acquisition button 405 is displayed in the live interface 401 of the anchor account. When the anchor client receives a trigger operation on the web page acquisition button 405, a candidate web page list 406 is displayed. The candidate web pages in the list 406 are provided by the developer of the live platform and are used to implement update functions of the live platform. Examples include a dress-up page (for displaying dress-up elements in a superimposed manner) and a leaderboard page (for displaying the leaderboard of the live room in a superimposed manner).
Step 303: recording the display content of the capture area through a recording component provided by the anchor client.
After the anchor client determines the capture area, it starts its recording component to record the display content in the capture area. Even when the desktop currently displayed by the computer device on which the anchor client is located does not include the capture area, the anchor client can still record the display content of the capture area through the recording component; in this case the recording component records in the background, for example in a virtual-machine-based manner. The recorded display content is either static display content of the capture area (a picture) or real-time display content of the capture area (a video). Optionally, the anchor client obtains, through the recording component, the display data of the capture area at different moments during recording, thereby recording the display content of the capture area.
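Illustratively, obtaining the display data of the capture area at one moment amounts to cropping the capture area out of a full desktop frame. A minimal Python sketch, assuming the frame is given as rows of pixels (the function name is an assumption for illustration):

```python
def crop_frame(frame, region):
    """frame: 2-D list of pixel rows covering the whole desktop;
    region: (x, y, width, height) of the capture area. Returns the
    sub-image that the recording component would store for this
    sampling moment."""
    x, y, w, h = region
    return [row[x:x + w] for row in frame[y:y + h]]
```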
Step 304: displaying the display content on the live interface in a superimposed manner.
The display content of the capture area is displayed superimposed on the live interface, so that it can cover content already displayed in the live interface. Alternatively, the display content can be displayed superimposed on a display area in the live interface that is used for displaying the live picture of the anchor account; the display content may cover the entire display area or only part of it.
Optionally, in response to a superimposed display operation, the anchor client displays the display content in a superimposed display component provided by the anchor client. The superimposed display component is displayed in the display area used for displaying the live picture and is superimposed on the live picture; that is, the superimposed display component occludes the live picture. Optionally, when the anchor client records the display content of the capture area through the recording component, it determines that the superimposed display operation has been received. Alternatively, when the anchor client receives a trigger operation on a superimposed display control, it determines that the superimposed display operation has been received; the superimposed display control is displayed in the live interface while the anchor client is recording the display content of the capture area through the recording component. Alternatively, when the capture area is a web page area in the target web page and the anchor client detects that target code is embedded in the target web page, it determines that the superimposed display operation has been received. The target code is specified by the developer of the anchor client.
The anchor client obtains, through the recording component, the display data corresponding to the display content of the capture area and renders the display data into the superimposed display component, so that the display content is displayed in the superimposed display component. Optionally, the display area and display scale of the display content are the same as those of the superimposed display component. In response to a drag operation on the superimposed display component, the anchor client can change the display position of the superimposed display component; in response to a zoom operation on the superimposed display component, the anchor client can change its display area.
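Illustratively, the drag and zoom operations only update the geometry of the superimposed display component; the rendered display data follows that geometry. A hypothetical Python sketch (class and method names are assumptions; zooming here scales about the top-left corner for simplicity):

```python
class OverlayComponent:
    """Geometry of a superimposed display component within the
    display area of the live picture."""

    def __init__(self, x, y, width, height):
        self.x, self.y = x, y
        self.width, self.height = width, height

    def drag(self, dx, dy):
        # a drag operation changes only the display position
        self.x += dx
        self.y += dy

    def zoom(self, factor):
        # a zoom operation changes only the display area
        self.width = int(self.width * factor)
        self.height = int(self.height * factor)
```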
Illustratively, fig. 9 is a schematic diagram of displaying display content through a superimposed display component according to an embodiment of the present application. As shown in fig. 9 (a), the anchor client determines the capture area as shown in fig. 7 and records the display content of the capture area through the recording component. The anchor client displays the superimposed display component 407 on the display area used for displaying the live picture of the anchor account in the live interface 401, and the display content of the capture area, recorded in real time, is displayed in the superimposed display component 407; here the display content is a sports live picture. As shown in fig. 9 (b), the anchor client changes the display position and display area of the superimposed display component 407 according to drag and zoom operations on it, thereby changing the display position and display area of the superimposed sports live picture.
It should be noted that the anchor client can display the display content recorded in the capture area in real time, superimposed on the display area of the live picture. As the display content in the capture area changes, the display content superimposed on the display area changes correspondingly; when the capture area itself changes, the superimposed display content changes accordingly as well. In addition, the anchor client can also superimpose on the display area display content specified by the anchor account, for example the display content of a web page area in a web page specified by the anchor account, or a specified video or picture. The anchor client can also superimpose previously recorded display content on the display area.
The anchor client can start multiple frame selection windows to determine multiple capture areas. Optionally, the anchor client can also obtain multiple target web pages from the candidate web page list, thereby determining multiple capture areas. The display content of each capture area recorded by the recording component is then displayed over the live picture through multiple superimposed display components. Moreover, the anchor client can adjust the display-level order of the superimposed display components according to a sorting operation, thereby adjusting the display-level order of the different display contents; display content at a higher display level covers display content at a lower display level.
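Illustratively, with multiple superimposed display components the display-level order decides which display content is visible at a given point: iterating the components from bottom to top and keeping the last hit implements "higher display level covers lower". A hypothetical Python sketch:

```python
def visible_at(components, px, py):
    """components: list of dicts with keys x, y, w, h, ordered from the
    lowest display level to the highest. Returns the topmost component
    covering desktop point (px, py), or None if no component covers it."""
    hit = None
    for c in components:
        if c["x"] <= px < c["x"] + c["w"] and c["y"] <= py < c["y"] + c["h"]:
            hit = c  # a later (higher-level) component overrides earlier hits
    return hit
```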
Step 305: synthesizing the live picture and the display content to obtain the live stream.
By playing the live stream, a viewer-side live picture can be obtained, which includes the live picture and the display content. The layer displaying the live picture is different from the layer displaying the display content: the live picture belongs to a first layer, the display content belongs to a second layer, and the second layer is above the first layer. Optionally, the anchor client merges the first layer and the second layer to obtain a merged layer, from which the live stream can be obtained.
Optionally, when the display area of the display content exceeds the display area of the live picture, the anchor client, when synthesizing the live picture and the display content, synthesizes only the display content that falls within the display area of the live picture.
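Illustratively, restricting the synthesis to the display area of the live picture is a rectangle intersection; display content outside the intersection is discarded. A hypothetical Python sketch:

```python
def clip_to_display(content, display):
    """Intersect the display content's rectangle with the live picture's
    display area; only the intersection is composited into the live
    stream. Rectangles are dicts with keys x, y, w, h."""
    x1 = max(content["x"], display["x"])
    y1 = max(content["y"], display["y"])
    x2 = min(content["x"] + content["w"], display["x"] + display["w"])
    y2 = min(content["y"] + content["h"], display["y"] + display["h"])
    if x2 <= x1 or y2 <= y1:
        return None  # no overlap: nothing to composite
    return {"x": x1, "y": y1, "w": x2 - x1, "h": y2 - y1}
```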
Step 306: sending the live stream to the server.
The live stream is used to display the viewer-side live picture in a user interface of a viewer client. The user interface is the live room interface for watching the live broadcast of the anchor account, and the viewer-side live picture includes the live picture and the display content. Illustratively, a viewer user may also download a live video of the anchor account through the viewer client for viewing; playing the live video likewise yields the viewer-side live picture.
It should be noted that when the anchor client adjusts the content, display position, or display area of the display content superimposed on the live picture, the display content in the viewer-side live picture displayed by the viewer client changes correspondingly.
In a specific example, the target web page obtained by the anchor client through the web page acquisition component displays a dress-up pattern, such as a hat. The anchor client then determines the area displaying the dress-up pattern as the capture area according to a frame selection operation on the frame selection window, and displays the dress-up pattern through a superimposed display component superimposed on the display area of the live picture. During the live broadcast of the anchor account, the anchor client determines a target display position for the dress-up pattern (for example, the top of the anchor's head) according to the live picture displayed in real time, and adjusts the display position of the dress-up pattern in real time. A picture of the anchor wearing the dress-up pattern can thus be displayed, increasing the interest of the live broadcast.
In summary, the live stream processing method provided in the embodiments of the present application synthesizes the display content superimposed on the live interface with the live picture to obtain the live stream. Different live streams can be generated from the same live picture according to the different display contents recorded, so that different viewer-side live pictures can be displayed at the viewer side, and the live broadcast function can be updated simply by changing the recorded display content. In this process, only the anchor client needs to be developed, which improves the update efficiency of the live platform.
In addition, during live data transmission, different functions can be provided by sending only the live stream, which reduces the consumption of network resources. Determining the capture area through the frame selection window allows the display content of the capture area to be shown to the anchor user, improving the interaction experience; determining a web page area from the target web page allows the anchor user to freely adjust the capture area as needed, so that different display content is shown in the live picture and the interest of the live broadcast is increased. Displaying the display content of the capture area through the superimposed display component, which supports modifying its display position and display area, allows the size and position of the display content to be changed flexibly; recording the display content of the capture area through the recording component allows it to be displayed in the live picture in real time.
It should be noted that the order of the steps of the method provided in the embodiments of the present application may be appropriately adjusted, and steps may be added or removed as circumstances require. Any variation readily conceivable by those skilled in the art within the technical scope disclosed in the present application shall be covered by the protection scope of the present application and is therefore not described in detail here.
Fig. 10 is a schematic structural diagram of a live stream processing apparatus according to an embodiment of the present application. The apparatus may be used in a computer device or in an anchor client on a computer device. As shown in fig. 10, the apparatus 100 includes:
a display module 1001, configured to display the live picture and a region acquisition control in a live interface of the anchor account;

a determining module 1002, configured to determine the capture area on the desktop through the region acquisition control;

a recording module 1003, configured to record the display content of the capture area through a recording component provided by the anchor client;

the display module 1001 being further configured to display the display content on the live interface in a superimposed manner; and

a synthesizing module 1004, configured to synthesize the live picture and the display content to obtain the live stream.
In an alternative design, the region acquisition control includes a frame selection control, and the determining module 1002 is configured to:

start a frame selection window on the desktop in response to a trigger operation on the frame selection control; and determine the capture area in response to a frame selection operation on the frame selection window.
In an alternative design, the determining module 1002 is configured to:

determine frame selection position information in response to the frame selection operation on the frame selection window, the frame selection position information indicating the area framed by the frame selection operation on the desktop; and determine the capture area according to the frame selection position information.
In an alternative design, the region acquisition control includes a web page acquisition control, and the display module 1001 is configured to display a candidate web page list in the live interface in response to a trigger operation on the web page acquisition control.
The determining module 1002 is further configured to:

obtain a target web page in response to a selection operation on the candidate web page list; and determine the area indicated by the embedded code of the target web page as the capture area.
In an alternative design, the display module 1001 is configured to:

display the display content, in response to a superimposed display operation, in a superimposed display component provided by the anchor client, the superimposed display component being displayed in the display area used for displaying the live picture and being superimposed on the live picture.
In an alternative design, as shown in fig. 11, the apparatus 100 further comprises:
a first processing module 1005, configured to change the display position of the superimposed display component in response to a drag operation on the superimposed display component.
In an alternative design, as shown in fig. 12, the apparatus 100 further comprises:
a second processing module 1006, configured to change the display area of the superimposed display component in response to a zoom operation on the superimposed display component.
In an alternative design, the live picture belongs to a first layer, the display content belongs to a second layer, and the second layer is above the first layer. The synthesizing module 1004 is configured to:

synthesize the first layer and the second layer.
In an alternative design, as shown in fig. 13, the apparatus 100 further comprises:
a sending module 1007, configured to send the live stream to the server, the live stream being used to display the viewer-side live picture in a user interface of the viewer client.
It should be noted that the live stream processing apparatus provided in the above embodiments is illustrated only by the division of the functional modules described; in practical applications, the functions may be allocated to different functional modules as needed, that is, the internal structure of the device may be divided into different functional modules to complete all or part of the functions described above. In addition, the live stream processing apparatus and the live stream processing method provided in the above embodiments belong to the same concept; their specific implementation processes are described in the method embodiments and are not repeated here.
Embodiments of the present application further provide a computer device, including: the device comprises a processor and a memory, wherein at least one instruction, at least one program, a code set or an instruction set is stored in the memory, and the at least one instruction, the at least one program, the code set or the instruction set is loaded and executed by the processor to realize the live stream processing method provided by the method embodiments.
Optionally, the computer device is a terminal. Illustratively, fig. 14 is a schematic structural diagram of a terminal provided in an embodiment of the present application.
In general, terminal 1400 includes: a processor 1401, and a memory 1402.
Processor 1401 may include one or more processing cores, such as a 4-core processor, an 8-core processor, and so forth. The processor 1401 may be implemented in at least one hardware form of DSP (Digital Signal Processing), FPGA (Field-Programmable Gate Array), and PLA (Programmable Logic Array). Processor 1401 may also include a main processor and a coprocessor, where the main processor is a processor for Processing data in an awake state, and is also referred to as a Central Processing Unit (CPU); a coprocessor is a low power processor for processing data in a standby state. In some embodiments, the processor 1401 may be integrated with a GPU (Graphics Processing Unit), which is responsible for rendering and drawing content that the display screen needs to display. In some embodiments, processor 1401 may further include an AI (Artificial Intelligence) processor for processing computing operations related to machine learning.
Memory 1402 may include one or more computer-readable storage media, which may be non-transitory. Memory 1402 may also include high speed random access memory, as well as non-volatile memory, such as one or more magnetic disk storage devices, flash memory storage devices. In some embodiments, a non-transitory computer readable storage medium in memory 1402 is used to store at least one instruction for execution by processor 1401 to implement the live stream processing method provided by method embodiments herein.
In some embodiments, terminal 1400 may further optionally include: a peripheral device interface 1403 and at least one peripheral device. The processor 1401, the memory 1402, and the peripheral device interface 1403 may be connected by buses or signal lines. Each peripheral device may be connected to the peripheral device interface 1403 via a bus, signal line, or circuit board. Specifically, the peripheral device includes: at least one of radio frequency circuitry 1404, a display 1405, a camera assembly 1406, audio circuitry 1407, a positioning assembly 1408, and a power supply 1409.
The peripheral device interface 1403 can be used to connect at least one peripheral device related to I/O (Input/Output) to the processor 1401 and the memory 1402. In some embodiments, the processor 1401, memory 1402, and peripheral interface 1403 are integrated on the same chip or circuit board; in some other embodiments, any one or both of the processor 1401, the memory 1402, and the peripheral device interface 1403 may be implemented on a separate chip or circuit board, which is not limited by the embodiments of the present application.
The Radio Frequency circuit 1404 is used for receiving and transmitting RF (Radio Frequency) signals, also called electromagnetic signals. The radio frequency circuitry 1404 communicates with communication networks and other communication devices via electromagnetic signals. The rf circuit 1404 converts an electrical signal into an electromagnetic signal to transmit, or converts a received electromagnetic signal into an electrical signal. Optionally, the radio frequency circuit 1404 includes: an antenna system, an RF transceiver, one or more amplifiers, a tuner, an oscillator, a digital signal processor, a codec chipset, a subscriber identity module card, and so forth. The radio frequency circuit 1404 may communicate with other terminals via at least one wireless communication protocol. The wireless communication protocols include, but are not limited to: the world wide web, metropolitan area networks, intranets, generations of mobile communication networks (2G, 3G, 4G, and 5G), Wireless local area networks, and/or WiFi (Wireless Fidelity) networks. In some embodiments, the radio frequency circuit 1404 may further include NFC (Near Field Communication) related circuits, which are not limited in this application.
The display screen 1405 is used to display a UI (User Interface). The UI may include graphics, text, icons, video, and any combination thereof. When the display screen 1405 is a touch display screen, the display screen 1405 also has the ability to capture touch signals at or above the surface of the display screen 1405. The touch signal may be input to the processor 1401 for processing as a control signal. At this point, the display 1405 may also be used to provide virtual buttons and/or virtual keyboards, also referred to as soft buttons and/or soft keyboards. In some embodiments, the display 1405 may be one, providing the front panel of the terminal 1400; in other embodiments, display 1405 may be at least two, respectively disposed on different surfaces of terminal 1400 or in a folded design; in still other embodiments, display 1405 may be a flexible display disposed on a curved surface or on a folded surface of terminal 1400. Even further, the display 1405 may be arranged in a non-rectangular irregular figure, i.e., a shaped screen. The Display 1405 can be made of LCD (Liquid Crystal Display), OLED (Organic Light-Emitting Diode), and the like.
The camera assembly 1406 is used to capture images or video. Optionally, camera assembly 1406 includes a front camera and a rear camera. Typically, the front camera is disposed on the front panel of the terminal 1400 and the rear camera is disposed on the rear side of the terminal. In some embodiments, the number of the rear cameras is at least two, and each rear camera is any one of a main camera, a depth-of-field camera, a wide-angle camera and a telephoto camera, so that the main camera and the depth-of-field camera are fused to realize a background blurring function, and the main camera and the wide-angle camera are fused to realize panoramic shooting and VR (Virtual Reality) shooting functions or other fusion shooting functions. In some embodiments, camera assembly 1406 may also include a flash. The flash lamp can be a monochrome temperature flash lamp or a bicolor temperature flash lamp. The double-color-temperature flash lamp is a combination of a warm-light flash lamp and a cold-light flash lamp, and can be used for light compensation at different color temperatures.
The audio circuit 1407 may include a microphone and a speaker. The microphone collects sound waves from the user and the environment, converts them into electrical signals, and inputs them to the processor 1401 for processing or to the radio frequency circuit 1404 for voice communication. For stereo capture or noise reduction, multiple microphones may be provided at different locations of the terminal 1400; the microphone may also be an array microphone or an omnidirectional pickup microphone. The speaker converts electrical signals from the processor 1401 or the radio frequency circuit 1404 into sound waves. The speaker can be a traditional diaphragm speaker or a piezoelectric ceramic speaker; a piezoelectric ceramic speaker can convert an electrical signal not only into sound waves audible to humans, but also into sound waves inaudible to humans for purposes such as distance measurement. In some embodiments, the audio circuit 1407 may also include a headphone jack.
The positioning component 1408 is used to locate the current geographic position of the terminal 1400 for navigation or LBS (Location Based Service). The positioning component 1408 may be a positioning component based on the GPS (Global Positioning System) of the United States, the BeiDou system of China, the GLONASS system of Russia, or the Galileo system of the European Union.
The power supply 1409 is used to power the various components of the terminal 1400. The power supply 1409 may be an alternating-current supply, a direct-current supply, a disposable battery, or a rechargeable battery. When the power supply 1409 includes a rechargeable battery, the battery may be a wired rechargeable battery or a wireless rechargeable battery: a wired rechargeable battery is charged through a wired line, and a wireless rechargeable battery is charged through a wireless coil. The rechargeable battery may also support fast-charge technology.
In some embodiments, terminal 1400 also includes one or more sensors 1410. The one or more sensors 1410 include, but are not limited to: acceleration sensor 1411, gyroscope sensor 1412, pressure sensor 1413, fingerprint sensor 1414, optical sensor 1415, and proximity sensor 1416.
The acceleration sensor 1411 may detect the magnitude of acceleration on the three axes of a coordinate system established with the terminal 1400. For example, the acceleration sensor 1411 may be used to detect the components of gravitational acceleration on the three axes. The processor 1401 may control the touch display 1405 to display the user interface in a landscape view or a portrait view according to the gravitational acceleration signal collected by the acceleration sensor 1411. The acceleration sensor 1411 may also be used to collect motion data for a game or of the user.
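The orientation decision described above can be illustrated with a minimal sketch. The function name and decision rule here are assumptions for illustration, not the terminal's actual logic: the UI is shown in portrait when gravity pulls mainly along the device's y axis, and in landscape when it pulls mainly along the x axis.

```python
# Illustrative sketch, not the terminal's actual logic: choosing a
# landscape or portrait UI from the gravity components (m/s^2) reported
# on the device's x and y axes by the acceleration sensor.
def choose_orientation(gx: float, gy: float) -> str:
    """Portrait when gravity pulls mainly along the y axis,
    landscape when it pulls mainly along the x axis."""
    return "portrait" if abs(gy) >= abs(gx) else "landscape"

print(choose_orientation(0.5, 9.7))  # device held upright -> portrait
print(choose_orientation(9.7, 0.3))  # device on its side  -> landscape
```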
The gyroscope sensor 1412 may detect the body direction and rotation angle of the terminal 1400, and may cooperate with the acceleration sensor 1411 to collect the user's 3D motion on the terminal 1400. Based on the data collected by the gyroscope sensor 1412, the processor 1401 can realize functions such as motion sensing (e.g., changing the UI according to a tilting operation by the user), image stabilization during shooting, game control, and inertial navigation.
The pressure sensor 1413 may be disposed on the side frame of the terminal 1400 and/or on a lower layer of the touch display 1405. When the pressure sensor 1413 is disposed on the side frame of the terminal 1400, it can detect the user's grip on the terminal 1400, and the processor 1401 performs left- or right-hand recognition or a shortcut operation according to the grip signal collected by the pressure sensor 1413. When the pressure sensor 1413 is disposed on the lower layer of the touch display 1405, the processor 1401 controls an operability control on the UI according to the user's pressure operation on the touch display 1405. The operability control includes at least one of a button control, a scroll-bar control, an icon control, and a menu control.
The fingerprint sensor 1414 is used to collect the user's fingerprint, and the user's identity is recognized either by the processor 1401 from the fingerprint collected by the fingerprint sensor 1414 or by the fingerprint sensor 1414 itself from the collected fingerprint. Upon recognizing the user's identity as trusted, the processor 1401 authorizes the user to perform relevant sensitive operations, including unlocking the screen, viewing encrypted information, downloading software, making payments, and changing settings. The fingerprint sensor 1414 may be disposed on the front, back, or side of the terminal 1400. When a physical button or a vendor logo is provided on the terminal 1400, the fingerprint sensor 1414 may be integrated with the physical button or vendor logo.
The optical sensor 1415 is used to collect the ambient light intensity. In one embodiment, the processor 1401 may control the display brightness of the touch display 1405 based on the ambient light intensity collected by the optical sensor 1415: when the ambient light intensity is high, the display brightness of the touch display 1405 is turned up; when the ambient light intensity is low, it is turned down. In another embodiment, the processor 1401 may also dynamically adjust the shooting parameters of the camera assembly 1406 according to the ambient light intensity collected by the optical sensor 1415.
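One way to realize the brightness adjustment above is a clamped linear mapping from illuminance to a brightness level. The function name, thresholds, and 0.2 floor below are assumptions for illustration, not values taken from the embodiment.

```python
# Hypothetical mapping (name and thresholds assumed) from the ambient
# illuminance collected by the optical sensor to a 0..1 display
# brightness level, clamped so the screen stays readable in the dark.
def display_brightness(lux: float, lo: float = 10.0, hi: float = 1000.0) -> float:
    """Linearly map [lo, hi] lux to [0.2, 1.0] brightness, clamped."""
    if lux <= lo:
        return 0.2
    if lux >= hi:
        return 1.0
    return 0.2 + 0.8 * (lux - lo) / (hi - lo)
```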
The proximity sensor 1416, also known as a distance sensor, is typically disposed on the front panel of the terminal 1400. The proximity sensor 1416 is used to collect the distance between the user and the front face of the terminal 1400. In one embodiment, when the proximity sensor 1416 detects that the distance between the user and the front face of the terminal 1400 is gradually decreasing, the processor 1401 controls the touch display 1405 to switch from the screen-on state to the screen-off state; when the proximity sensor 1416 detects that the distance is gradually increasing, the processor 1401 controls the touch display 1405 to switch from the screen-off state to the screen-on state.
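The decreasing/increasing distance logic above can be sketched as a small state machine with two thresholds, so the screen does not flicker when the reading hovers around a single cut-off. The function name and threshold values are assumptions for illustration only.

```python
# Hedged sketch (name and thresholds assumed) of the proximity-driven
# screen switching: two thresholds give hysteresis, so the screen does
# not flicker when the user's distance hovers around one cut-off.
def screen_state(distance_cm: float, current: str,
                 near: float = 3.0, far: float = 5.0) -> str:
    """Return the next screen state ('on' or 'off') for a proximity reading."""
    if current == "on" and distance_cm < near:
        return "off"   # face approaching: switch to the screen-off state
    if current == "off" and distance_cm > far:
        return "on"    # face moving away: switch back to the screen-on state
    return current     # in the dead band, keep the current state
```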
Those skilled in the art will appreciate that the configuration shown in fig. 14 does not limit the terminal 1400, which may include more or fewer components than shown, combine some components, or employ a different arrangement of components.
The embodiment of the present application further provides a computer-readable storage medium, where at least one program code is stored in the computer-readable storage medium, and when the program code is loaded and executed by a processor of a computer device, the live stream processing method provided in the foregoing method embodiments is implemented.
The present application also provides a computer program product or computer program comprising computer instructions stored in a computer readable storage medium. The processor of the computer device reads the computer instructions from the computer-readable storage medium, and the processor executes the computer instructions, so that the computer device executes the live stream processing method provided by the method embodiments.
It will be understood by those skilled in the art that all or part of the steps of the above embodiments may be implemented by hardware, or by a program instructing the relevant hardware; the program may be stored in a computer-readable storage medium, which may be a read-only memory, a magnetic disk, an optical disc, or the like.
The above description is only an example of the present application and is not intended to be limiting; any modifications, equivalent substitutions, improvements, and the like made within the spirit and principles of the present application shall fall within the protection scope of the present application.

Claims (12)

1. A live stream processing method, applied to an anchor client, the method comprising:
displaying a live broadcast picture and a regional acquisition control on a live broadcast interface of the anchor account;
determining an intercepted area through the area acquisition control;
recording the display content of the intercepted area through a recording component provided by the anchor client;
displaying the display content on the live broadcast interface in an overlapping manner;
and synthesizing the live broadcast picture and the display content to obtain a live broadcast stream.
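The steps of claim 1 can be sketched end to end: crop the intercepted area out of a desktop frame, then composite the recorded display content over the live picture. Frames are modeled as 2D lists and the area as an (x, y, w, h) rectangle; all names here are illustrative assumptions, not part of any real streaming SDK.

```python
# Minimal sketch of the claimed pipeline, assuming frames are 2D grids of
# pixels and the intercepted area is an (x, y, w, h) rectangle; all names
# are illustrative, not part of any real streaming SDK.
def crop(frame, area):
    """Extract the intercepted area from a desktop frame."""
    x, y, w, h = area
    return [row[x:x + w] for row in frame[y:y + h]]

def composite(live_frame, content, pos):
    """Overlay the recorded display content onto the live picture at pos=(x, y)."""
    out = [row[:] for row in live_frame]
    px, py = pos
    for dy, row in enumerate(content):
        for dx, pixel in enumerate(row):
            out[py + dy][px + dx] = pixel
    return out

desktop = [[f"d{r}{c}" for c in range(4)] for r in range(4)]
live = [["L"] * 4 for _ in range(4)]
content = crop(desktop, (1, 1, 2, 2))            # recorded display content
stream_frame = composite(live, content, (0, 0))  # frame pushed to the live stream
print(stream_frame[0][0], stream_frame[3][3])    # d11 L
```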
2. The method of claim 1, wherein the region acquisition control comprises a frame selection control; determining the intercepted area through the area acquisition control comprises:
responding to the triggering operation on the frame selection control, and starting a frame selection window on the desktop;
and responding to the frame selection operation on the frame selection window, and determining the intercepting area.
3. The method of claim 2, wherein the determining the intercepted area in response to the frame selection operation on the frame selection window comprises:
responding to the frame selection operation on the frame selection window, and determining frame selection position information, the frame selection position information being used to indicate the area framed by the frame selection operation on the desktop;
and determining the intercepting area according to the framing position information.
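The frame selection position information of claim 3 can be derived from the press and release points of a drag. The sketch below (function name assumed for illustration) normalises the two points into an (x, y, w, h) rectangle on the desktop, regardless of the drag direction.

```python
# Illustrative sketch (name assumed) of turning the press and release
# points of a frame selection drag into an (x, y, w, h) rectangle on
# the desktop, regardless of which direction the user dragged.
def selection_rect(x1, y1, x2, y2):
    """Normalise a drag into a rectangle regardless of drag direction."""
    return (min(x1, x2), min(y1, y2), abs(x2 - x1), abs(y2 - y1))

print(selection_rect(300, 200, 100, 50))  # (100, 50, 200, 150)
```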
4. The method of any of claims 1 to 3, wherein the region acquisition control comprises a web page acquisition control; determining the intercepted area through the area acquisition control comprises:
responding to the triggering operation on the webpage acquisition control, and displaying a list of webpages to be selected in the live broadcast interface;
responding to the selection operation of the to-be-selected webpage list, and acquiring a target webpage;
and determining the area indicated by the embedded code of the target webpage as the intercepted area.
5. The method of any of claims 1 to 3, wherein the displaying the display content on the live interface in an overlaid manner comprises:
and responding to the superposition display operation, displaying the display content in a superposition display component provided by the anchor client, wherein the superposition display component is displayed in a display area for displaying the live broadcast picture, and the superposition display component is superposed on the live broadcast picture.
6. The method of claim 5, further comprising:
and changing the display position of the superposed display assembly in response to the dragging operation of the superposed display assembly.
7. The method of claim 5, further comprising:
changing a display area of the overlay display component in response to a zoom operation on the overlay display component.
8. The method according to any one of claims 1 to 3, wherein the live picture belongs to a first layer, the display content belongs to a second layer, and the second layer is above the first layer;
the synthesizing the live broadcast picture and the display content includes:
and synthesizing the first image layer and the second image layer.
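The two-layer composition in claim 8 can be sketched as follows, assuming each layer is a 2D grid and `None` marks a transparent cell of the upper layer. This is an illustrative simplification; a real renderer would blend per-pixel alpha.

```python
# Sketch of claim 8's composition: the second layer sits above the
# first, and None marks a transparent cell (illustrative assumption;
# a real implementation would blend per-pixel alpha values).
def merge_layers(first, second):
    """Place the second layer above the first: where the upper cell is
    transparent (None), the lower layer shows through."""
    return [[top if top is not None else bottom
             for bottom, top in zip(brow, trow)]
            for brow, trow in zip(first, second)]

first = [["live", "live"], ["live", "live"]]   # first layer: live picture
second = [["content", None], [None, None]]     # second layer: display content
print(merge_layers(first, second))
```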
9. The method of any of claims 1 to 3, further comprising:
and sending the live stream to a server, the live stream being used to display the live picture in a user interface of an audience client.
10. A live stream processing apparatus, characterized in that the apparatus comprises:
the display module is used for displaying a live broadcast picture and a regional acquisition control on a live broadcast interface of the anchor account;
the determining module is used for determining the intercepted area on the desktop through the area acquisition control;
the recording module is used for recording the display content of the intercepted area through a recording component provided by the anchor client;
the display module is used for displaying the display content on the live broadcast interface in an overlapping manner;
and the synthesis module is used for synthesizing the live broadcast picture and the display content to obtain a live broadcast stream.
11. A computer device comprising a processor and a memory, the memory having stored therein at least one instruction, at least one program, a set of codes, or a set of instructions, the at least one instruction, the at least one program, the set of codes, or the set of instructions being loaded and executed by the processor to implement a live stream processing method as claimed in any one of claims 1 to 9.
12. A computer-readable storage medium having stored therein at least one program code, the program code being loaded and executed by a processor to implement the live stream processing method as claimed in any one of claims 1 to 9.
CN202011472115.2A 2020-12-14 2020-12-14 Live stream processing method, device, equipment and storage medium Active CN112637624B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011472115.2A CN112637624B (en) 2020-12-14 2020-12-14 Live stream processing method, device, equipment and storage medium

Publications (2)

Publication Number Publication Date
CN112637624A true CN112637624A (en) 2021-04-09
CN112637624B CN112637624B (en) 2023-07-18

Family

ID=75312856

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011472115.2A Active CN112637624B (en) 2020-12-14 2020-12-14 Live stream processing method, device, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN112637624B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113255488A (en) * 2021-05-13 2021-08-13 广州繁星互娱信息科技有限公司 Anchor searching method and device, computer equipment and storage medium

Citations (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106406710A (en) * 2016-09-30 2017-02-15 维沃移动通信有限公司 Screen recording method and mobile terminal
CN106791894A (en) * 2016-11-26 2017-05-31 广州华多网络科技有限公司 A kind of method and apparatus for playing live video
CN106792092A (en) * 2016-12-19 2017-05-31 广州虎牙信息科技有限公司 Live video flow point mirror display control method and its corresponding device
CN108073346A (en) * 2017-11-30 2018-05-25 深圳市金立通信设备有限公司 A kind of record screen method, terminal and computer readable storage medium
CN108449640A (en) * 2018-03-26 2018-08-24 广州虎牙信息科技有限公司 Live video output control method, device and storage medium, terminal
CN108989830A (en) * 2018-08-30 2018-12-11 广州虎牙信息科技有限公司 A kind of live broadcasting method, device, electronic equipment and storage medium
CN109361954A (en) * 2018-11-02 2019-02-19 腾讯科技(深圳)有限公司 Method for recording, device, storage medium and the electronic device of video resource
CN109862385A (en) * 2019-03-18 2019-06-07 广州虎牙信息科技有限公司 Method, apparatus, computer readable storage medium and the terminal device of live streaming
CN110659092A (en) * 2019-08-13 2020-01-07 平安国际智慧城市科技股份有限公司 Webpage screenshot method and device, computer equipment and storage medium
CN110740346A (en) * 2019-10-23 2020-01-31 北京达佳互联信息技术有限公司 Video data processing method, device, server, terminal and storage medium
CN110784735A (en) * 2019-11-12 2020-02-11 广州虎牙科技有限公司 Live broadcast method and device, mobile terminal, computer equipment and storage medium
CN110941383A (en) * 2019-10-11 2020-03-31 广州视源电子科技股份有限公司 Double-screen display method, device, equipment and storage medium
CN111246126A (en) * 2020-03-11 2020-06-05 广州虎牙科技有限公司 Direct broadcasting switching method, system, device, equipment and medium based on live broadcasting platform
CN111464830A (en) * 2020-05-19 2020-07-28 广州酷狗计算机科技有限公司 Method, device, system, equipment and storage medium for image display
CN111541930A (en) * 2020-04-27 2020-08-14 广州酷狗计算机科技有限公司 Live broadcast picture display method and device, terminal and storage medium
CN111901654A (en) * 2020-08-04 2020-11-06 海信视像科技股份有限公司 Display device and screen recording method

Also Published As

Publication number Publication date
CN112637624B (en) 2023-07-18

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant