CN115529498A - Live broadcast interaction method and related equipment - Google Patents

Live broadcast interaction method and related equipment

Info

Publication number
CN115529498A
CN115529498A (application CN202211146304.XA)
Authority
CN
China
Prior art keywords
information
drawing surface
displaying
handwriting
live
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211146304.XA
Other languages
Chinese (zh)
Inventor
袁敏 (Yuan Min)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Wuhan Douyu Network Technology Co Ltd
Original Assignee
Wuhan Douyu Network Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Wuhan Douyu Network Technology Co Ltd filed Critical Wuhan Douyu Network Technology Co Ltd
Priority to CN202211146304.XA
Publication of CN115529498A
Legal status: Pending

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/47 End-user applications
    • H04N 21/478 Supplemental services, e.g. displaying phone caller identification, shopping application
    • H04N 21/4788 Supplemental services, e.g. displaying phone caller identification, shopping application communicating with other users, e.g. chatting
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/04845 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 11/00 2D [Two Dimensional] image generation
    • G06T 11/20 Drawing from basic elements, e.g. lines or circles
    • G06T 11/206 Drawing of charts or graphs
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/20 Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N 21/21 Server components or server architectures
    • H04N 21/218 Source of audio or video content, e.g. local disk arrays
    • H04N 21/2187 Live feed
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N 21/431 Generation of visual interfaces for content selection or interaction; Content or additional data rendering
    • H04N 21/4312 Generation of visual interfaces for content selection or interaction; Content or additional data rendering involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/47 End-user applications
    • H04N 21/485 End-user interface for client configuration
    • H04N 21/4858 End-user interface for client configuration for modifying screen layout parameters, e.g. fonts, size of the windows

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Databases & Information Systems (AREA)
  • Information Transfer Between Computers (AREA)

Abstract

The application provides a live broadcast interaction method and related equipment. The method includes: displaying a first drawing surface, where the first drawing surface is the live video picture of the anchor client; displaying a second drawing surface on top of the first drawing surface in response to a drawing interaction instruction, the second drawing surface being transparent; generating drawing information based on contact (touch) interaction event information in the screen area of the second drawing surface; and displaying the drawing information on the live video picture through the second drawing surface. By combining a drawing function with live broadcast technology, a user can draw on the screen and send the drawing information to other users, so that the user's handwriting information is shared in real time and the efficiency and quality of live broadcast communication and interaction are improved.

Description

Live broadcast interaction method and related equipment
Technical Field
The invention relates to the technical field of live broadcasting, in particular to a live broadcasting interaction method and related equipment.
Background
In webcasting (network live broadcast), independent signal acquisition equipment is set up on site; the captured audio and video are fed into a directing terminal, uploaded to a server over the network, and published to a website for viewing. Webcasting inherits the advantages of the Internet: audio and video content can be published in real time, and the Internet's intuitiveness, speed, rich forms of expression, strong interactivity, lack of geographic limits, and ability to segment audiences enhance the promotional effect of an event site.
Existing webcast interaction modes include text comments, voice co-hosting (mic-connect), and video co-hosting. In practice, however, an anchor may run into an operational problem and need viewers to guide them, and guidance given only through text or voice can be ambiguous, hard to understand, and inefficient, so these modes have certain limitations.
Disclosure of Invention
The invention provides a live broadcast interaction method to solve the problem that, during a live broadcast, communication between the anchor and users through text and voice mic-connect alone is insufficient and hard to understand.
In a first aspect, the present invention provides a live broadcast interaction method, which is used for a client and includes:
displaying a first drawing surface, wherein the first drawing surface is a live video picture of the anchor client;
displaying a second drawing surface on an upper layer of the first drawing surface based on drawing interaction instructions, wherein the second drawing surface is a transparent surface;
generating drawing information based on the contact interaction event information of the screen area of the second drawing surface;
and displaying the drawing information on the live video picture through the second drawing surface.
Optionally, the live video frame is associated with at least two second drawing surfaces, and the drawing information is generated jointly based on the contact interaction event information of the screen areas to which the at least two second drawing surfaces belong.
Optionally, the second drawing surface includes an anchor drawing surface associated with the anchor client and a user drawing surface associated with the viewer client, and the drawing information is generated jointly from the contact interaction event information of the screen areas of the anchor drawing surface and at least one user drawing surface.
Optionally, the contact interaction event information includes handwriting coordinate information, and the generating of the drawing information based on the contact interaction event information of the screen area of the second drawing surface includes:
determining handwriting coordinate proportion information based on the handwriting coordinate information and the size information of the screen area of the second drawing surface;
and generating the drawing information according to the handwriting coordinate proportion information.
Optionally, the generating the drawing information according to the handwriting coordinate ratio information includes:
generating the drawing information according to the percentage data and the color value data;
the displaying the drawing information on the live video picture through the second drawing surface comprises:
and sending the drawing information to a server so that the server sends the drawing information to other clients to be displayed on the live video picture through the second drawing surface in other clients.
Optionally, the displaying the drawing information on the live video frame through the second drawing surface includes:
under the condition that the client is an audience client, acquiring first handwriting coordinate information corresponding to the drawing information existing in the received live screen interface;
acquiring second handwriting coordinate information which is recorded on the second drawing surface of the audience client and is associated with the first handwriting coordinate information;
and under the condition that the first handwriting coordinate information is not matched with the second handwriting coordinate information, modifying the first handwriting coordinate information and/or the second handwriting coordinate information so that a handwriting picture corresponding to the second handwriting coordinate information covers a handwriting picture corresponding to the first handwriting coordinate information.
Optionally, after the step of displaying the drawing information on the live video frame through the second drawing surface, the method further includes:
determining corresponding target content from the live video picture according to the handwriting coordinate information;
when it is detected that the current coordinate information of the target content in the first drawing surface is inconsistent with the handwriting coordinate information on the second drawing surface, adjusting the handwriting coordinate information to the coordinate position of the current coordinate information on the second drawing surface, so that the current handwriting coordinates of the drawing information stay synchronized with the current coordinate position of the target content.
Optionally, the generating of the drawing information based on the contact interaction event information of the screen area of the second drawing surface includes:
responding to a drawing starting instruction, and monitoring contact interaction event information of the screen area of the second drawing surface;
and all the contact interaction event information is used as drawing information so as to avoid triggering other application operations through the contact interaction event information.
Optionally, the drawing starting instruction is obtained based on a floating window control, and the floating window control and/or the second drawing surface are displayed on an upper layer of display content corresponding to all application programs in the intelligent terminal to which the client belongs.
Optionally, the live broadcast interaction method further includes:
generating prompt information under the condition of receiving drawing information generated by other clients sent by a server;
and displaying the drawing information on the live video picture through the second drawing surface under the condition of receiving a response instruction of the prompt information.
In a second aspect, the present invention further provides a live broadcast interaction apparatus, including:
the first imaging module is used for displaying a first drawing surface, the first drawing surface being a live video picture of the anchor client;
the drawing module is used for displaying a second drawing surface on the upper layer of the first drawing surface based on a drawing interaction instruction, and the second drawing surface is a transparent surface;
the interaction module is used for generating drawing information based on the contact interaction event information of the screen area of the second drawing surface;
and the second imaging module is used for displaying the drawing information on the live video picture through the second drawing surface.
In a third aspect, the present invention further provides a live broadcast interaction system, where the system includes a server and clients, and the clients include an anchor client and audience clients;
the anchor client is used for displaying a first drawing surface, the first drawing surface being used for displaying the live video picture of the anchor client; displaying a second drawing surface on top of the first drawing surface based on a drawing interaction instruction of the anchor user; generating first drawing information in response to first contact interaction event information of the anchor, where the first contact interaction event information is generated based on the screen area of the anchor client's second drawing surface; displaying the first drawing information on the live video picture through the anchor client's second drawing surface to obtain a first video stream; and transmitting the first video stream to the server;
the server is used for receiving the first video stream and forwarding it to the audience clients;
each audience client is used for receiving and playing the first video stream; displaying a second drawing surface on top of the first drawing surface that shows the first video stream, based on a drawing interaction instruction of the audience user; generating second drawing information in response to second contact interaction event information of the audience user, where the second contact interaction event information is generated based on the screen area of the audience client's second drawing surface; displaying the second drawing information on the video picture of the first video stream through the audience client's second drawing surface to obtain a second video stream; and feeding the second video stream back to the server;
and the server is further used for receiving the second video stream and forwarding it to the anchor client and the other audience clients.
In a fourth aspect, the present invention further provides an electronic device, including a memory and a processor, where the processor is configured to implement the steps of the live broadcast interaction method according to any one of the above first aspects when executing a computer program stored in the memory.
In a fifth aspect, the present invention also provides a computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, performs the steps of the live interaction method according to any one of the first aspect.
As can be seen from the foregoing technical solutions, an embodiment of the present application provides a live broadcast interaction method and related devices. The method includes: displaying a first drawing surface, where the first drawing surface is the live video picture of the anchor client; displaying a second drawing surface on top of the first drawing surface based on a drawing interaction instruction, the second drawing surface being transparent; generating drawing information based on contact interaction event information in the screen area of the second drawing surface; and displaying the drawing information on the live video picture through the second drawing surface. Current webcast interaction modes include text comments, voice co-hosting, and video co-hosting, but in practice an anchor may encounter an operational problem and need viewers to guide them, and guidance given through text or voice alone can be ambiguous, hard to understand, and inefficient, so these modes have certain limitations. In the embodiment of the application, the live video picture is displayed as the first drawing surface, and a transparent second drawing surface is displayed on top of it. The user draws through contact interaction in the screen area of the second drawing surface, and the drawing information is finally displayed on the live video picture through the second drawing surface, so handwriting information can be shared in real time and the efficiency and quality of the live interaction are improved.
Drawings
In order to explain the technical solution of the present application more clearly, the drawings needed in the embodiments are briefly described below; other drawings can be obtained from them by those skilled in the art without creative effort.
Fig. 1 is a schematic flow chart of a live broadcast interaction method according to an embodiment of the present application;
fig. 2 is a schematic diagram of an embodiment of a common drawing of an anchor client and an audience client of a live broadcast interaction method provided in an embodiment of the present application;
fig. 3 is a schematic structural diagram of a live interactive apparatus according to an embodiment of the present application;
fig. 4 is a schematic flowchart of a handwriting coordinate synchronization scheme of a live broadcast interaction method according to an embodiment of the present application;
fig. 5a is a schematic view of an embodiment of handwriting coordinate information generated when an anchor draws and determined target content in a live broadcast interaction method according to an embodiment of the present application;
fig. 5b is a schematic diagram of an embodiment of a live broadcast interaction method provided by the embodiment of the present application, where target content moves, so that current coordinate information is inconsistent with handwriting coordinate information;
fig. 5c is a schematic view of an embodiment of adjusting handwriting coordinate information to a coordinate position of current coordinate information in a live broadcast interaction method provided in the embodiment of the present application;
fig. 6 is a schematic diagram of an embodiment of an electronic device according to an embodiment of the present application;
fig. 7 is a schematic diagram illustrating an embodiment of a computer-readable storage medium according to an embodiment of the present application.
Detailed Description
Reference will now be made in detail to embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The embodiments described below do not represent all embodiments consistent with the present application; they are merely examples of systems and methods consistent with certain aspects of the application as recited in the claims. In the several embodiments provided herein, it should be understood that the disclosed apparatus and method may be implemented in other ways, and the apparatus embodiments described below are merely exemplary.
As shown in fig. 1, a first embodiment of a live broadcast interaction method provided in the embodiment of the present application includes:
step S110, displaying a first drawing surface, where the first drawing surface is a live video frame of a main broadcasting end.
For example, before displaying the first drawing surface, the anchor client may create and join a mic-connect channel. After joining successfully, the anchor client may request microphone permission from its intelligent terminal and start local audio capture, where the local audio may be captured directly by the terminal's microphone or collected from an external device connected to the terminal. The anchor client may transcode the captured audio in real time and transmit the transcoded audio signal to the server through a designated channel; the server receives the transcoded audio signal and forwards it to the other mic-connected users, so the audio content is shared in real time.
Before displaying the first drawing surface, the anchor client may also start screen sharing. After screen sharing is started, the anchor client may request screen-recording permission from its intelligent terminal and start capturing the screen picture, which may be the terminal's own screen directly captured or video content collected from an external device connected to the terminal. The anchor client transcodes the captured screen picture in real time and transmits the transcoded video signal to the server through a designated channel; the server receives the transcoded video signal and forwards it to the other mic-connected users, so the anchor's screen content is shared in real time.
The user side may apply to join the mic-connect channel created by the anchor client; the joining process is the same as for the anchor client, which enables real-time audio sharing for the user.
In a specific implementation, this embodiment may decode the anchor's screen-sharing content by adding a SurfaceView on the user side and drawing the screen-sharing content on the SurfaceView. It should be noted that the SurfaceView can control the format, size, and position of the rendering.
Step S120, displaying a second drawing surface on the upper layer of the first drawing surface based on the drawing interaction instruction, wherein the second drawing surface is a transparent surface.
The second drawing surface is a transparent surface for image drawing that is placed on top of the first drawing surface. The transparent surface can be drawn through the Canvas of the SurfaceView, for example by calling Canvas's drawColor method and passing a transparent color value.
Step S130, generating drawing information based on the contact interaction event information of the screen area of the second drawing surface.
Illustratively, the contact interaction event information is screen touch event information generated by the user in the screen area of the second drawing surface, including tap touches and slide touches. The handwriting information may be determined from the contact interaction event information of the screen area of the second drawing surface, and a storage module for the handwriting information may be established, including a Path module and a Paint module: the Path module may store the handwriting coordinate information determined from the contact interaction events, and the Paint module may store the handwriting color value and width information determined from them.
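The Path/Paint storage described above can be sketched as a plain data model, with one stroke per continuous touch gesture. This is a minimal illustration; the class and field names are hypothetical, not from the patent, and the Android Path/Paint classes are replaced with plain Java containers so the sketch runs anywhere:

```java
import java.util.ArrayList;
import java.util.List;

// Minimal sketch of a handwriting store: Path-like coordinate lists plus
// Paint-like color/width metadata, one Stroke per continuous touch gesture.
class StrokeStore {
    static class Stroke {
        final List<float[]> points = new ArrayList<>(); // [x, y] pairs (Path analog)
        final int colorArgb;  // Paint-analog color value
        final float width;    // Paint-analog stroke width
        Stroke(int colorArgb, float width) {
            this.colorArgb = colorArgb;
            this.width = width;
        }
    }

    private final List<Stroke> strokes = new ArrayList<>();
    private Stroke current;

    // Called on touch-down: start a new stroke with the chosen color/width.
    void beginStroke(int colorArgb, float width, float x, float y) {
        current = new Stroke(colorArgb, width);
        current.points.add(new float[] {x, y});
        strokes.add(current);
    }

    // Called on touch-move: extend the current stroke.
    void addPoint(float x, float y) {
        if (current != null) current.points.add(new float[] {x, y});
    }

    List<Stroke> strokes() { return strokes; }
}
```

In an Android implementation the `points` list would typically be replaced with `android.graphics.Path` and the color/width fields with an `android.graphics.Paint` object.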
According to some embodiments, the generating of the drawing information based on the contact interaction event information of the screen area of the second drawing surface includes:
responding to a drawing starting instruction, and monitoring contact interaction event information of a screen area of the second drawing surface;
and taking all the contact interaction event information as drawing information so as to avoid triggering other application operations through the contact interaction event information.
Illustratively, the drawing start instruction is an instruction to start receiving contact interaction event information from the screen area of the second drawing surface. A listener can be attached to the SurfaceView to monitor the contact interaction events generated by the client user in the screen area of the intelligent terminal. The listener can also notify the terminal's system that all contact interaction event information is to be treated as drawing information, so that the second drawing surface consumes the events and they are not delivered to other applications.
By monitoring the contact interaction events of the screen area of the second drawing surface and treating all of them as drawing information, other applications cannot be triggered accidentally during the user's operation, which would otherwise affect the smoothness and accuracy of the operation and degrade the user experience.
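The consume-all-events rule above can be sketched as a small dispatcher: while draw mode is on, every touch event is recorded as drawing input and marked consumed, so it never reaches the applications underneath. The class name and method signatures are hypothetical illustrations, not the patent's API:

```java
import java.util.ArrayList;
import java.util.List;

// Sketch of the "consume all touch events while drawing" rule: when draw
// mode is on, every event becomes drawing input and is reported as consumed
// so the system does not deliver it to underlying applications.
class TouchRouter {
    private boolean drawMode = false;
    private final List<float[]> drawingEvents = new ArrayList<>();
    private final List<float[]> passedThrough = new ArrayList<>();

    void setDrawMode(boolean on) { drawMode = on; }

    // Returns true if the event was consumed by the second drawing surface.
    boolean dispatch(float x, float y) {
        if (drawMode) {
            drawingEvents.add(new float[] {x, y}); // recorded as drawing info
            return true;  // consumed: never reaches other applications
        }
        passedThrough.add(new float[] {x, y});     // normal delivery
        return false;
    }

    int drawingCount() { return drawingEvents.size(); }
    int passedCount()  { return passedThrough.size(); }
}
```

On Android this corresponds to returning `true` from the view's touch listener while drawing is active.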
According to some embodiments, the drawing start instruction is obtained based on a floating window control, and the floating window control and/or the second drawing surface are displayed on an upper layer of corresponding display contents of all application programs in the intelligent terminal to which the client belongs.
For example, the floating window control may receive the drawing start instruction and then receive contact interaction event information according to that instruction. A button for opening graffiti may be provided as the floating window control; this button may be on the same layer as the second drawing surface or on a layer above it. If it is on the same layer, the button and the second drawing surface are displayed together above the display content of all application programs in the intelligent terminal to which the user side belongs; if it is on a layer above, the second drawing surface alone is displayed above the display content of all application programs. After the button is clicked, a color palette can be opened for the user to select a color. The second drawing surface may be displayed on the first drawing surface continuously after the anchor client starts screen sharing, with the button merely starting the monitoring of contact interaction events in its screen area; alternatively, the second drawing surface may not be displayed until the button opens it, after which the contact interaction events of its screen area are monitored.
Because the second drawing surface is transparent, displaying it on the first drawing surface does not affect the presentation of the first drawing surface's video content. The anchor may live-stream games from either a mobile terminal or a PC, but some game applications do not support PC operation and can only be run on mobile. When the anchor live-streams a game on a mobile terminal, the full-screen content is usually shared, so other applications must retreat to the background; the floating window control and/or the second drawing surface therefore need to be displayed above the display content of all application programs in the intelligent terminal to which the user side belongs.
According to some embodiments, the generating of the drawing information based on the contact interaction event information of the screen area of the second drawing surface includes:
determining handwriting coordinate proportion information based on the handwriting coordinate information and the size information of the screen area of the second drawing surface;
and generating the drawing information according to the handwriting coordinate proportion information.
For example, the coordinate ratio information [a, b] of the handwriting may be determined from the coordinate information [x, y] of the handwriting and the width and height of the second drawing surface by a = x/m and b = y/n, where m is the width of the second drawing surface and n is its height. The width ratio information c of the handwriting may be determined from the stroke width information w of the handwriting and the width of the second drawing surface by c = w/m. The drawing information may then be determined from the coordinate ratio information and the width ratio information of the handwriting.
Because users watch live broadcasts on many kinds of devices, whose screen sizes and resolutions differ, the same handwriting coordinate information may be displayed differently on different intelligent terminals, which can cause communication errors; normalizing the coordinates into ratios avoids this.
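The ratio arithmetic above (a = x/m, b = y/n, c = w/m) can be sketched as follows; this is a minimal illustration in Python, and the function names are ours rather than the patent's:

```python
def normalize_stroke(x, y, stroke_width, surface_w, surface_h):
    """Convert absolute touch coordinates on the second drawing surface
    into resolution-independent ratios: a = x/m, b = y/n, c = w/m."""
    a = x / surface_w              # horizontal position as a fraction of width
    b = y / surface_h              # vertical position as a fraction of height
    c = stroke_width / surface_w   # stroke width as a fraction of width
    return a, b, c

def denormalize_stroke(a, b, c, surface_w, surface_h):
    """Map the ratios back to pixel coordinates on a (possibly different) screen."""
    return a * surface_w, b * surface_h, c * surface_w

# A point drawn at (540, 960) with a 12-px stroke on a 1080x1920 surface...
a, b, c = normalize_stroke(540, 960, 12, 1080, 1920)
# ...lands at the same relative position on a 720x1280 viewer screen.
x2, y2, w2 = denormalize_stroke(a, b, c, 720, 1280)
```

Because only ratios travel between clients, each terminal can rescale the handwriting to its own screen size.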
According to some embodiments, the generating the drawing information according to the handwriting coordinate ratio information includes:
generating the drawing information according to the percentage data (the handwriting coordinate ratio information expressed as a percentage of the size information of the screen area) and the color value data;
the displaying the drawing information on the live video picture through the second drawing surface includes:
and sending the drawing information to a server so that the server sends the drawing information to other clients so as to display the drawing information on the live video picture through the second drawing surface in other clients.
For example, when the user opens the color panel, the color value data may be determined from the color the user selects. When the user side draws, the intelligent terminal to which the user belongs may transcode the drawing information into a byte array and send it to the server through a dedicated channel, for example once every 30 milliseconds. The server may forward the byte array to the anchor end, where the handwriting is restored on the anchor end's second drawing surface and shared with other users through the anchor end's screen-sharing function; alternatively, the server may send the byte array directly to the other clients, each of which decodes it and restores the handwriting on its own second drawing surface.
The drawing information is generated from the percentage data and the color value data and sent via the server to the other clients, which display it on the live video picture through their second drawing surfaces. In this way the handwriting information adapts to the different screen sizes and resolutions of different users, and displaying it according to the color value data ensures accurate color reproduction, avoids color differences, and preserves the accuracy of the interactive information.
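The patent does not specify the byte-array layout; as one possible sketch (the fixed-point encoding and field order here are assumptions), the percentage data and color value data could be packed like this:

```python
import struct

def encode_drawing_info(points, color_argb):
    """Pack normalized stroke points and a 32-bit ARGB color value into a
    compact byte array for the dedicated channel (sent e.g. every 30 ms)."""
    payload = struct.pack(">I", color_argb & 0xFFFFFFFF)   # color value data
    payload += struct.pack(">H", len(points))              # point count
    for a, b in points:
        # store each ratio as a 16-bit fixed-point fraction of 65535
        payload += struct.pack(">HH", int(a * 65535), int(b * 65535))
    return payload

def decode_drawing_info(data):
    """Inverse of encode_drawing_info: recover the color and ratio points."""
    color = struct.unpack_from(">I", data, 0)[0]
    n = struct.unpack_from(">H", data, 4)[0]
    points = [
        tuple(v / 65535 for v in struct.unpack_from(">HH", data, 6 + 4 * i))
        for i in range(n)
    ]
    return color, points
```

A receiving client would decode the array, rescale the ratios to its own screen, and redraw the stroke in the transmitted color.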
In step S140, the drawing information is displayed on the live video picture through the second drawing surface.
Illustratively, when a response instruction to the prompt information is received, the drawing information may be extracted from the Path module and the Paint module, the drawPath handwriting drawing method of the Canvas class may be called, and the drawing information may be passed to a Canvas so that it is presented on the second drawing surface.
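On Android the restoration step uses Path and Canvas.drawPath; the coordinate replay it implies can be shown in a platform-neutral sketch (SketchPath is our illustrative stand-in, not an Android class):

```python
class SketchPath:
    """Minimal stand-in for Android's Path: records moveTo/lineTo commands."""
    def __init__(self):
        self.commands = []

    def move_to(self, x, y):
        self.commands.append(("moveTo", x, y))

    def line_to(self, x, y):
        self.commands.append(("lineTo", x, y))

def build_path(ratio_points, screen_w, screen_h):
    """Scale normalized stroke points to the local screen and replay them as a
    path, analogous to what Canvas.drawPath would then render."""
    path = SketchPath()
    for i, (a, b) in enumerate(ratio_points):
        x, y = a * screen_w, b * screen_h
        (path.move_to if i == 0 else path.line_to)(x, y)
    return path
```

The first point becomes a moveTo and each subsequent point a lineTo, so the restored handwriting is a connected stroke at the correct local scale.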
According to some embodiments, when drawing information generated by other clients and sent by a server is received, prompt information is generated;
and displaying the drawing information on the live video picture through the second drawing surface when the response instruction of the prompt information is received.
For example, the user side may actively send a request for interaction to the anchor end, or the anchor end may actively authorize the user side to draw interactively. When the server receives a request for interaction, it may turn the request into prompt information to remind the anchor that another user wants to interact. The anchor end receives this prompt information from the server; if the anchor agrees to interact, the viewer client draws in its own screen area through its second drawing surface, and the server sends the user side's drawing information to the anchor end. The anchor end can then restore the drawing information on its own second drawing surface and display that surface above the live video picture.
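The request-prompt-agree sequence above can be sketched server-side as follows; the message shapes and names here are illustrative assumptions, not part of the patent:

```python
def handle_interaction_request(server_state, viewer_id):
    """Turn a viewer's request for interaction into prompt information
    queued for the anchor end."""
    prompt = {"type": "prompt", "from": viewer_id}
    server_state["pending"].append(prompt)
    return prompt

def handle_anchor_response(server_state, prompt, agreed):
    """Apply the anchor's response: on agreement, the requesting viewer
    becomes authorized to draw; either way the prompt is cleared."""
    server_state["pending"].remove(prompt)
    if agreed:
        server_state["authorized"].add(prompt["from"])
    return agreed
```

Only after the anchor's affirmative response does the server begin relaying that viewer's drawing information.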
According to some embodiments, the live video frame is associated with at least two second drawing surfaces, and the drawing information is generated together based on the contact interaction event information of the screen areas to which the at least two second drawing surfaces belong.
For example, the anchor end may assign the drawing authority to at least two viewer clients, and the viewer clients that have been assigned the drawing authority draw together.
In this way, multiple viewer clients can jointly mark up the live interface, so a problem the anchor end encounters during live broadcasting can be solved collectively, improving problem-solving efficiency.
According to some embodiments, the second drawing surface includes an anchor drawing surface associated with the anchor-side client and a user drawing surface associated with a viewer client, and the drawing information is generated from the contact interaction event information of the screen areas of the anchor drawing surface and at least one user drawing surface.
Illustratively, the anchor's live video picture may be displayed in a plurality of clients, including the anchor end and viewer clients. The anchor end may grant drawing authority to at least one viewer client while retaining its own drawing authority, so that the anchor end and the viewer client draw together. If the anchor end grants drawing authority to one viewer client A, then viewer client A and the anchor end may each perform drawing operations on their respective second drawing surfaces. For example, as shown in fig. 2, suppose the anchor does not know how to set a game character's skills, and drawing authority is held by both the anchor end and viewer client A: viewer client A may draw a circle, through its second drawing surface, around channel entry icon 1 in the live interface that leads to the skill-setting interface, while the anchor, believing the correct entry is icon 2, may draw a circle around icon 2. This enables real-time interaction between the anchor end and the user side and avoids the long communication time, low communication efficiency, and disruption of the live broadcast that arise when only the anchor end or only the viewer client is allowed to draw.
For example, the anchor end may grant drawing authority to two or more viewer clients while retaining its own, so that multiple viewer clients and the anchor end draw together. This saves the time of discussing the same problem separately with different users and further improves the degree of interaction and the communication efficiency between the anchor end and the user side.
Therefore, when the anchor end and viewer clients discuss a problem, the interaction is more varied; especially when the discussion must refer to content displayed in the live interface, communication is more efficient and problems are solved faster.
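The authority model described above, in which the anchor always keeps its own drawing right while granting and revoking viewers' rights, can be sketched as follows (class and method names are ours, for illustration only):

```python
class DrawingAuthority:
    """Tracks which clients may draw on their second drawing surfaces.
    The anchor's own authority can never be revoked."""
    def __init__(self, anchor_id):
        self.anchor_id = anchor_id
        self.authorized = {anchor_id}   # the anchor always has authority

    def grant(self, viewer_id):
        self.authorized.add(viewer_id)

    def revoke(self, viewer_id):
        if viewer_id != self.anchor_id:  # anchor authority is protected
            self.authorized.discard(viewer_id)

    def may_draw(self, client_id):
        return client_id in self.authorized
```

Touch events from a client would only be turned into drawing information when `may_draw` returns true for that client.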
According to some embodiments, the displaying the drawing information on the live video frame through the second drawing surface includes:
under the condition that the client is a viewer client, acquiring first handwriting coordinate information corresponding to the drawing information existing in the received live screen interface;
acquiring second handwriting coordinate information which is recorded on the second drawing surface of the audience client and is associated with the first handwriting coordinate information;
and under the condition that the first handwriting coordinate information is not matched with the second handwriting coordinate information, modifying the first handwriting coordinate information and/or the second handwriting coordinate information so that a handwriting picture corresponding to the second handwriting coordinate information covers the handwriting picture corresponding to the first handwriting coordinate information.
Illustratively, the first handwriting coordinate information corresponds to the viewer client's drawing information as received by the anchor end and sent to the server, which forwards it to the other clients so that the handwriting is displayed on the live video picture in the viewer clients. The second handwriting coordinate information is the handwriting coordinate information the user drew on the viewer client's own second drawing surface. The handwriting coordinate information corresponding to the drawing information in the live screen interface can be obtained, and the handwriting on the viewer client's second drawing surface associated with the first handwriting coordinate information can be obtained as the second handwriting coordinate information. When the first and second handwriting coordinate information do not match, the second may be modified according to the first, the first may be modified according to the second, or both may be modified together.
When a user draws on the second drawing surface, the user's drawing information is displayed on the anchor end's live screen, and the anchor end's live screen interface is streamed back to the user side in real time. As a result, the handwriting on the user side's second drawing surface and the handwriting shown in the live screen interface may not coincide exactly, which degrades the user experience; correcting the handwriting coordinate information avoids this incomplete overlap.
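One way this correction could work is to snap each locally drawn point onto the position echoed back in the live stream whenever the two drift apart; this is a sketch under that assumption, with an illustrative pixel tolerance:

```python
def reconcile_handwriting(first_coords, second_coords, tol=1.0):
    """first_coords: handwriting positions echoed back in the live stream.
    second_coords: the same stroke as drawn locally on the second surface.
    Where they diverge by more than tol pixels, adopt the echoed position
    so the local stroke exactly covers the one in the video picture."""
    corrected = []
    for (x1, y1), (x2, y2) in zip(first_coords, second_coords):
        if abs(x1 - x2) > tol or abs(y1 - y2) > tol:
            corrected.append((x1, y1))   # overwrite local point with echoed point
        else:
            corrected.append((x2, y2))   # close enough: keep the local point
    return corrected
```

After correction the stroke on the second drawing surface overlaps the stroke in the video picture instead of appearing as a double image.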
As shown in fig. 3, fig. 3 is a schematic structural diagram of a live broadcast interaction device according to an embodiment of the present application.
The embodiment of the present application provides a live broadcast interaction apparatus 300, including a first imaging module 301, a drawing module 302, an interaction module 303, and a second imaging module 304, wherein:
a first imaging module 301, configured to display a first drawing surface, where the first drawing surface is a live video picture of the anchor end;
a drawing module 302, configured to display a second drawing surface on an upper layer of the first drawing surface based on a drawing interaction instruction, where the second drawing surface is a transparent surface;
the interaction module 303 is configured to generate drawing information based on the contact interaction event information of the screen area of the second drawing surface;
and a second imaging module 304, configured to display the drawing information on the live video picture through the second drawing surface.
The live broadcast interaction apparatus 300 can implement each process of the method embodiment of fig. 1, which is not repeated here to avoid redundancy. By combining the drawing function with live broadcast technology, the apparatus 300 lets a user draw on the screen and transmits the drawing information to other user terminals, so that the user's handwriting information is shared in real time and the efficiency and quality of live interactive communication are improved.
Furthermore, the invention also provides a live broadcast interaction system, which comprises a server and clients, the clients including an anchor end and viewer clients;
in a specific implementation, both the anchor client and the viewer client may be loaded with the live interactive apparatus 300;
the anchor end is used for displaying a first drawing surface, the first drawing surface being used for displaying the anchor end's live video picture; displaying a second drawing surface on top of the first drawing surface based on a drawing interaction instruction of the anchor user; generating first drawing information in response to first contact interaction event information of the anchor, the first contact interaction event information being generated on the basis of the screen area of the anchor end's second drawing surface; displaying the first drawing information on the live video picture through the anchor end's second drawing surface to obtain a first video stream; and transmitting the first video stream to the server;
the server is used for receiving the first video stream and forwarding the first video stream to a viewer client;
the audience client is used for receiving and playing the first video stream; displaying a second drawing surface on top of a first drawing surface that presents the first video stream based on drawing interaction instructions of an audience user; generating second drawing information in response to second contact interaction event information of the audience user, wherein the second contact interaction event information is generated on the basis of the screen region of a second drawing surface of the audience client; displaying the second drawing information in a video picture of the first video stream through a second drawing surface of a viewer client to obtain a second video stream; feeding back the second video stream to the server;
the server is further configured to receive the second video stream and forward it to the anchor end and the other viewer clients.
Specifically, this embodiment can be applied to scenes where the anchor and users interact with each other: for example, the anchor streams a game and draws a track (i.e., generates drawing information) on the live video picture displayed at the anchor end; a user watching the anchor's live video picture in the live room draws a track on the live video picture played by the viewer client; and the tracks drawn by the anchor and the user on the same live video picture are transmitted over the channel and rendered synchronously.
Further, a second embodiment of a live broadcast interaction method provided in the embodiments of the present application, as shown in fig. 4, includes:
in this embodiment, after the step of displaying the drawing information on the live video frame through the second drawing surface in the step S140, the method further includes:
step S410: and determining corresponding target content from the live video picture according to the handwriting coordinate information.
It can be understood that, before the drawing information is generated for the first time, the live interaction apparatus 300 captures the handwriting coordinate information generated when the anchor-end or viewer-client user touches the phone screen; this handwriting coordinate information is coordinate A of the first-generated drawing information. As shown in fig. 5a, the anchor-end user draws a circle (the drawing information on the second drawing surface) around character 1 in the live video picture at coordinate A on the terminal screen, where character 1 is the target content on the first drawing surface (the live video picture);
step S420: when the current coordinate information of the target content in the first drawing surface is detected to be inconsistent with the handwriting coordinate information on the second drawing surface, adjusting the handwriting coordinate information to the coordinate position of the current coordinate information on the second drawing surface for displaying, so that the current note coordinate of the drawing information is synchronous with the current coordinate position of the target content.
It can be understood that the target content (character 1) moves continuously within the live video picture. The live interaction apparatus 300 captures the target content's current coordinate information on the screen in real time and compares it with the user's handwriting coordinate information on the second drawing surface (i.e., coordinate A of the circle drawn around character 1 in fig. 5a) to determine whether the target content has moved. If the comparison shows that character 1's current coordinate information differs from coordinate A, the target content is judged to have moved; as shown in fig. 5b, character 1's current coordinate information has moved to coordinate B. The apparatus 300 transfers the moving target content's current coordinates to the second drawing surface in real time and adjusts the note coordinates of the drawing information (the circle the user drew around character 1) from the original coordinate A to the position of coordinate B on the second drawing surface (as shown in fig. 5c). It then encodes the drawing information with the adjusted note coordinates, together with the live video picture, into a third video stream and sends it to the server, which broadcasts the third video stream to the other viewer clients in the live room. All viewers in the room therefore see the note coordinates of the drawing information stay synchronized with the target content's current position after it moves; in this way, once the anchor end or a viewer client marks target content in the live video picture, the mark follows the target content's movement track even as the target moves within the live picture.
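The adjustment from coordinate A to coordinate B amounts to shifting the annotation's stroke points by the target's displacement; a minimal sketch of that arithmetic (function name ours) is:

```python
def sync_annotation(annotation_points, old_target_xy, new_target_xy):
    """When the marked target content moves in the live picture (A -> B),
    shift every stroke coordinate of the annotation by the same offset so
    the mark follows the target."""
    dx = new_target_xy[0] - old_target_xy[0]
    dy = new_target_xy[1] - old_target_xy[1]
    return [(x + dx, y + dy) for x, y in annotation_points]
```

The shifted points would then be re-encoded with the live picture into the third video stream described above.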
As shown in fig. 6, fig. 6 is a schematic structural diagram of an electronic device provided in the embodiment of the present application.
An embodiment of the present application provides an electronic device 600, which includes a memory 610, a processor 620, and a computer program 611 stored in the memory 610 and operable on the processor 620, where the processor 620 implements the following steps when executing the computer program 611:
displaying a first drawing surface, wherein the first drawing surface is a live video picture of a main broadcasting end;
displaying a second drawing surface on the upper layer of the first drawing surface based on the drawing interaction instruction, wherein the second drawing surface is a transparent surface;
generating drawing information based on the contact interaction event information of the screen area of the second drawing surface;
and displaying the drawing information on the live video picture through the second drawing surface.
In a specific implementation, when the processor 620 executes the computer program 611, any of the embodiments corresponding to fig. 1 may be implemented.
Since the electronic device described in this embodiment is a device used to implement the apparatus of this embodiment, a person skilled in the art can, based on the method described herein, understand the specific implementation of this electronic device and its variations. How the electronic device implements the method is therefore not described in detail here; any device a person skilled in the art uses to implement the method of this embodiment falls within the scope of protection of this application.
As shown in fig. 7, fig. 7 is a schematic structural diagram of a computer-readable storage medium according to an embodiment of the present application.
The present embodiment provides a computer-readable storage medium 700, on which a computer program 711 is stored, which computer program 711, when executed by a processor, performs the steps of:
displaying a first drawing surface, wherein the first drawing surface is a live video picture of a main broadcasting end;
displaying a second drawing surface on the upper layer of the first drawing surface based on the drawing interaction instruction, wherein the second drawing surface is a transparent surface;
generating drawing information based on the contact interaction event information of the screen area of the second drawing surface;
and displaying the drawing information on the live video picture through the second drawing surface.
It should be noted that, in the foregoing embodiments, the description of each embodiment has an emphasis, and reference may be made to the related description of other embodiments for a part that is not described in detail in a certain embodiment.
As will be appreciated by one skilled in the art, embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
An embodiment of the present application further provides a computer program product, where the computer program product includes computer software instructions, and when the computer software instructions are run on a processing device, the processing device is enabled to execute a flow in the live broadcast interaction method in the embodiment corresponding to fig. 1.
The computer program product includes one or more computer instructions. The processes or functions described above in accordance with the embodiments of the present application occur wholly or in part upon loading and execution of the computer program instructions on a computer. The computer may be a general purpose computer, a special purpose computer, a computer network, or another programmable device. The computer instructions may be stored on a computer-readable storage medium or transmitted from one computer-readable storage medium to another; for example, the computer instructions may be transmitted from one website, computer, server, or data center to another via wire (e.g., coaxial cable, optical fiber, Digital Subscriber Line (DSL)) or wirelessly (e.g., infrared, radio, microwave). The computer-readable storage medium may be any available medium that a computer can access, or a data storage device such as a server or data center integrating one or more available media. The available medium may be a magnetic medium (e.g., a floppy disk, a hard disk, a magnetic tape), an optical medium (e.g., a DVD), or a semiconductor medium (e.g., a Solid State Disk (SSD)), etc.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described systems, apparatuses and units may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the several embodiments provided in the present application, it should be understood that the disclosed system, apparatus and method may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, the above-described division of units is only one type of division of logical functions, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one position, or may be distributed on multiple network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit may be implemented in the form of hardware, or may also be implemented in the form of a software functional unit.
The integrated unit may be stored in a computer-readable storage medium if it is implemented in the form of a software functional unit and sold or used as a separate product. Based on such understanding, the technical solution of the present application, in essence or in the part contributing over the prior art, may be embodied wholly or partly in a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method described in the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
In summary, the above embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; and such modifications or substitutions do not depart from the spirit and scope of the corresponding technical solutions in the embodiments of the present application.

Claims (14)

1. A live broadcast interaction method is used for a client and comprises the following steps:
displaying a first drawing surface, wherein the first drawing surface is used for showing a live video picture of a main broadcasting end;
displaying a second drawing surface on an upper layer of the first drawing surface based on drawing interaction instructions, the second drawing surface being a transparent surface;
generating drawing information based on the contact interaction event information of the screen area of the second drawing surface;
and displaying the drawing information on the live video picture through the second drawing surface.
2. The method as recited in claim 1, wherein the live video frame is associated with at least two second drawing surfaces, and the drawing information is generated jointly based on contact interaction event information of screen areas to which the at least two second drawing surfaces belong.
3. The method of claim 2, wherein the second drawing surface comprises an anchor drawing surface associated with an anchor-side client and a user drawing surface associated with a viewer client, and wherein the drawing information is generated in conjunction with contact interaction event information for screen areas to which the anchor drawing surface and at least one user drawing surface belong.
4. The method of claim 1, wherein the contact interaction event information includes handwriting coordinate information, and wherein generating drawing information based on the contact interaction event information for the screen area of the second drawing surface includes:
determining handwriting coordinate proportion information based on the handwriting coordinate information and the size information of the screen area of the second drawing surface;
and generating the drawing information according to the handwriting coordinate proportion information.
5. The method of claim 4, wherein
the handwriting coordinate proportion information is percentage data of the handwriting coordinate information relative to the size information of the screen area, and the generating the drawing information according to the handwriting coordinate proportion information comprises:
generating the drawing information according to the percentage data and color value data;
the displaying the drawing information on the live video picture through the second drawing surface comprises:
sending the drawing information to a server, so that the server sends the drawing information to other clients, and the other clients display the drawing information on the live video picture through their respective second drawing surfaces.
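Claim 5 combines the percentage data with color value data and relays the result through a server. A hypothetical payload round-trip (the JSON encoding and field names are illustrative assumptions, not specified by the patent):

```python
import json

def make_drawing_info(points_pct, color_rgb):
    """Bundle percentage coordinates and a color value for relay via the server."""
    return json.dumps({"points": points_pct, "color": color_rgb})

def render_on_peer(payload, width, height):
    """A receiving client maps the percentages back onto its own screen size."""
    info = json.loads(payload)
    return [(px * width / 100.0, py * height / 100.0) for px, py in info["points"]]
```

Because only percentages travel over the wire, the same payload renders at the matching position on any screen size.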
6. The method of claim 1, wherein the displaying the drawing information on the live video picture through the second drawing surface comprises:
when the client is a viewer client, acquiring first handwriting coordinate information corresponding to drawing information already present in the received live screen picture;
acquiring second handwriting coordinate information recorded on the second drawing surface of the viewer client and associated with the first handwriting coordinate information;
and when the first handwriting coordinate information does not match the second handwriting coordinate information, modifying the first handwriting coordinate information and/or the second handwriting coordinate information, so that the handwriting picture corresponding to the second handwriting coordinate information covers the handwriting picture corresponding to the first handwriting coordinate information.
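Claim 6's reconciliation — comparing the stroke baked into the received frame with the locally recorded stroke and redrawing so the local copy covers the received one — might look like this (the tolerance value and the point-wise strategy are assumptions, not from the patent):

```python
def reconcile(first_coords, second_coords, tolerance=1.0):
    """For each point, if the received (first) coordinate has drifted from the
    locally recorded (second) coordinate, redraw the local point at the received
    position so the local handwriting covers the one in the frame; otherwise
    keep the local coordinate unchanged."""
    out = []
    for (fx, fy), (sx, sy) in zip(first_coords, second_coords):
        if abs(fx - sx) > tolerance or abs(fy - sy) > tolerance:
            out.append((fx, fy))   # snap onto the received stroke to cover it
        else:
            out.append((sx, sy))   # already matching within tolerance
    return out
```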
7. The method of any one of claims 4 to 6, wherein after the step of displaying the drawing information on the live video picture through the second drawing surface, the method further comprises:
determining corresponding target content from the live video picture according to the handwriting coordinate information;
when it is detected that the current coordinate information of the target content in the first drawing surface is inconsistent with the handwriting coordinate information on the second drawing surface, adjusting the handwriting coordinate information to the coordinate position of the current coordinate information on the second drawing surface, so that the annotation coordinates of the drawing information stay synchronized with the current coordinate position of the target content.
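Claim 7 keeps an annotation attached to moving target content by shifting the handwriting coordinates along with the content. A minimal sketch (the patent does not specify how the target's displacement is detected; names are invented):

```python
def follow_target(stroke, old_anchor, new_anchor):
    """Shift every handwriting coordinate by the target content's displacement
    so the note stays over the content it annotates."""
    dx = new_anchor[0] - old_anchor[0]
    dy = new_anchor[1] - old_anchor[1]
    return [(x + dx, y + dy) for x, y in stroke]
```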
8. The method of claim 1, wherein generating drawing information based on the contact interaction event information of the screen region of the second drawing surface comprises:
in response to a drawing starting instruction, monitoring contact interaction event information of the screen area of the second drawing surface;
and taking all of the contact interaction event information as drawing information, so that the contact interaction event information is prevented from triggering operations of other applications.
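Claim 8's event capture — once drawing starts, every touch event becomes drawing information and is consumed so it cannot reach other applications — can be sketched as follows (class and method names are invented for illustration):

```python
class DrawingSurface:
    """Minimal sketch of claim 8's touch-event interception."""

    def __init__(self):
        self.drawing = False
        self.drawing_info = []

    def start_drawing(self):
        """Handle the drawing starting instruction."""
        self.drawing = True

    def on_touch(self, event):
        """Return True if the event was consumed as drawing information."""
        if self.drawing:
            self.drawing_info.append(event)
            return True   # swallowed: no other application sees the event
        return False      # drawing inactive: let the event pass through
```

Returning True from the touch handler is the conventional way to mark an event as consumed, which is what prevents it from triggering operations in the applications underneath.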
9. The method of claim 8, wherein the drawing starting instruction is obtained based on a floating window control, and the floating window control and/or the second drawing surface is displayed above the display content of all applications on the intelligent terminal to which the client belongs.
10. The method of claim 1, further comprising:
generating prompt information upon receiving, from a server, drawing information generated by another client;
and upon receiving a response instruction for the prompt information, displaying the drawing information on the live video picture through the second drawing surface.
11. A live interaction device, comprising:
the first imaging module is used for displaying a first drawing surface, and the first drawing surface is a live video picture of an anchor side;
the drawing module is used for displaying a second drawing surface on the upper layer of the first drawing surface based on a drawing interaction instruction, and the second drawing surface is a transparent surface;
the interaction module is used for generating drawing information based on the contact interaction event information of the screen area of the second drawing surface;
and the second imaging module is used for displaying the drawing information on the live video picture through the second drawing surface.
12. A live interaction system, comprising a server and clients, wherein the clients comprise an anchor client and viewer clients;
the anchor client is used for displaying a first drawing surface, the first drawing surface being used for displaying a live video picture of the anchor side; displaying a second drawing surface on an upper layer of the first drawing surface based on a drawing interaction instruction of an anchor user; generating first drawing information in response to first contact interaction event information of the anchor user, the first contact interaction event information being generated on the screen area of the second drawing surface of the anchor client; displaying the first drawing information on the live video picture through the second drawing surface of the anchor client to obtain a first video stream; and transmitting the first video stream to the server;
the server is used for receiving the first video stream and forwarding the first video stream to the viewer clients;
the viewer client is used for receiving and playing the first video stream; displaying a second drawing surface on an upper layer of the first drawing surface that presents the first video stream, based on a drawing interaction instruction of a viewer user; generating second drawing information in response to second contact interaction event information of the viewer user, the second contact interaction event information being generated on the screen area of the second drawing surface of the viewer client; displaying the second drawing information on the video picture of the first video stream through the second drawing surface of the viewer client to obtain a second video stream; and feeding back the second video stream to the server;
and the server is further used for receiving the second video stream and forwarding the second video stream to the anchor client and the other viewer clients.
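The server's relay role in claim 12 — forwarding each client's stream to every other connected client — reduces to a broadcast-except-sender loop. An in-memory sketch (a real deployment would use sockets or a streaming CDN; names are invented):

```python
class RelayServer:
    """Forward each message to every connected client except its sender."""

    def __init__(self):
        self.inboxes = {}            # client name -> list of received messages

    def connect(self, name):
        self.inboxes[name] = []

    def forward(self, sender, message):
        for name, inbox in self.inboxes.items():
            if name != sender:       # the originator already has the content
                inbox.append(message)
```

The same loop covers both directions of claim 12: the anchor's first video stream reaches all viewers, and a viewer's second video stream reaches the anchor and the other viewers.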
13. An electronic device, comprising a memory and a processor, wherein the processor is configured to implement the steps of the live interaction method of any one of claims 1 to 10 when executing a computer program stored in the memory.
14. A computer-readable storage medium having a computer program stored thereon, wherein the computer program, when executed by a processor, implements the steps of the live interaction method of any one of claims 1 to 10.
CN202211146304.XA 2022-09-20 2022-09-20 Live broadcast interaction method and related equipment Pending CN115529498A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211146304.XA CN115529498A (en) 2022-09-20 2022-09-20 Live broadcast interaction method and related equipment

Publications (1)

Publication Number Publication Date
CN115529498A true CN115529498A (en) 2022-12-27

Family

ID=84698107

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211146304.XA Pending CN115529498A (en) 2022-09-20 2022-09-20 Live broadcast interaction method and related equipment

Country Status (1)

Country Link
CN (1) CN115529498A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116248912A (en) * 2023-05-12 2023-06-09 南京维赛客网络科技有限公司 Method, system and storage medium for annotating live streaming picture in real time

Similar Documents

Publication Publication Date Title
CN111818359B (en) Processing method and device for live interactive video, electronic equipment and server
CN107483460B (en) Method and system for multi-platform parallel broadcasting and stream pushing
CN109327741B (en) Game live broadcast method, device and system
EP1472871B1 (en) Remote server switching of video streams
CN1976440B (en) Method and system for accurately positioning playing progress rate in IPTV
US9883244B2 (en) Multi-source video navigation
WO2015080856A1 (en) Manipulation of media content to overcome user impairments
CN111343476A (en) Video sharing method and device, electronic equipment and storage medium
CN112533037B (en) Method for generating Lian-Mai chorus works and display equipment
CN1980373A (en) Composition type interacting video-performance system
CN109195003B (en) Interaction method, system, terminal and device for playing game based on live broadcast
CN109391822A (en) Video guide's method, apparatus and terminal device on line
CN111147911A (en) Video clipping method and device, electronic equipment and storage medium
CN113596553A (en) Video playing method and device, computer equipment and storage medium
JP6024002B2 (en) Video distribution system
CN112399263A (en) Interaction method, display device and mobile terminal
CN113630614A (en) Game live broadcast method, device, system, electronic equipment and readable storage medium
CN115529498A (en) Live broadcast interaction method and related equipment
CN110324653B (en) Game interactive interaction method and system, electronic equipment and device with storage function
JP2005269607A (en) Instant interactive audio/video management system
CN113489938A (en) Virtual conference control method, intelligent device and terminal device
JP7290260B1 (en) Servers, terminals and computer programs
CN113661715B (en) Service management method, interaction method, display equipment and mobile terminal for projection hall
CN112399225B (en) Service management method for projection hall and display equipment
CN112533023B (en) Method for generating Lian-Mai chorus works and display equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination