CN112019906A - Live broadcast method, computer equipment and readable storage medium - Google Patents


Publication number
CN112019906A
CN112019906A (application CN201910463104.9A)
Authority
CN
China
Prior art keywords
live broadcast
elements
live
image
data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201910463104.9A
Other languages
Chinese (zh)
Inventor
姜军
秦永芳
王皓
Current Assignee
Shanghai Bilibili Technology Co Ltd
Original Assignee
Shanghai Bilibili Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Shanghai Bilibili Technology Co Ltd
Priority to CN201910463104.9A
Publication of CN112019906A
Legal status: Pending

Classifications

    All classes fall under H04N21/00, Selective content distribution, e.g. interactive television or video on demand [VOD] (Section H: ELECTRICITY; H04: ELECTRIC COMMUNICATION TECHNIQUE; H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION):
    • H04N21/4312: Generation of visual interfaces for content selection or interaction; content or additional data rendering involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations
    • H04N21/44016: Processing of video elementary streams involving splicing one content stream with another content stream, e.g. for substituting a video clip
    • H04N21/4402: Processing of video elementary streams involving reformatting operations of video signals for household redistribution, storage or real-time display
    • H04N21/4854: End-user interface for client configuration for modifying image parameters, e.g. image brightness, contrast
    • H04N21/4858: End-user interface for client configuration for modifying screen layout parameters, e.g. fonts, size of the windows

Abstract

The invention discloses a live broadcast method, computer equipment, and a readable storage medium, belonging to the field of internet technology. The live broadcast interface of the mobile terminal can support the display of a plurality of drawing elements (i.e., the scene elements of live interaction). During live broadcasting, the live data of each drawing element is obtained separately, and the live data of all drawing elements is synthesized to generate a live image, so that the live interface can display the plurality of drawing elements, improving the interaction effect and visual experience among users.

Description

Live broadcast method, computer equipment and readable storage medium
Technical Field
The present invention relates to the field of internet technologies, and in particular, to a live broadcast method, a computer device, and a readable storage medium.
Background
With the continuous development of the live broadcast industry, mobile phone live broadcasting has become increasingly popular: through mobile live broadcast software, a user can share fresh things around them in real time, so that other users can chat and interact at a closer distance. However, the content displayed on the anchor interface of current mobile live broadcast clients is limited (for example, it can present only the video data collected by the camera, and cannot present additional text or picture information), falling short of the anchor interface available when broadcasting from a computer, which results in a poor user experience.
Disclosure of Invention
To address the problem that existing mobile live broadcast interfaces display only a single kind of content, a live broadcast method, computer equipment, and a readable storage medium are provided that aim to improve the effect of the mobile-terminal live broadcast interface.
The invention provides a live broadcast method, which is applied to a mobile terminal, wherein a live broadcast interface of the mobile terminal comprises at least two drawing elements; the method comprises the following steps:
acquiring live broadcast data of the at least two drawing elements;
and synthesizing the live broadcast data of the at least two drawing elements to generate a live broadcast image.
Preferably, the step of synthesizing the live data of the at least two drawing elements to generate a live image includes:
adjusting at least one of the position, the size, and the texture of each drawing element according to the adaptation parameters of that drawing element, and drawing the adjusted at least two drawing elements in the live interface to generate a live image; or
synthesizing the live data of the plurality of drawing elements according to the adaptation parameters of each drawing element, and drawing the plurality of drawing elements in the live interface to generate a live image.
Preferably, the method further comprises the following steps:
and making the data of the live broadcast image into streaming media and pushing the streaming media to a server.
Preferably, each drawing element is selected from any one of the following: a camera element, a scene element, or a composite sub-element;
the scene element is a picture element or a text element.
Preferably, the composite sub-element is composed of picture elements, and/or text elements, and/or camera elements, and contains adaptation parameters;
the adaptation parameters include: a size variation parameter, and/or a position variation parameter, and/or a texture variation parameter.
Preferably, when the at least two drawing elements include a picture element whose layer covers the live broadcast interface and whose transparency is within a threshold range:
during live broadcasting, when a masking signal is acquired, the step of synthesizing the live data of the at least two drawing elements to generate a live image includes:
adjusting the transparency of the masking picture to a non-transparent state, synthesizing the live data of the at least two drawing elements, and generating a live image showing the masking picture.
The invention also provides a live broadcast method applied to a mobile terminal, wherein the live broadcast interface of the mobile terminal includes a plurality of drawing elements; the drawing elements include at least one presentation element, each presentation element corresponds to an index module, and the index module records the display time of each document image and its display state at each moment;
the method comprises the following steps:
acquiring live data of the drawing elements, and obtaining the display state and the document image of the presentation element according to the index module;
and synthesizing the live broadcast data of the at least two drawing elements to generate a live broadcast image.
Preferably, the method further comprises the following steps:
and making the data of the live broadcast image into streaming media and pushing the streaming media to a server.
Preferably, the step of synthesizing the live data of the at least two drawing elements to generate a live image includes:
adjusting at least one of the position, the size, and the texture of each drawing element according to the adaptation parameters of that drawing element, the display state of the current presentation element, and the document image, and drawing the adjusted at least two drawing elements in the live interface to generate a live image; or
synthesizing the live data of all drawing elements according to the adaptation parameters of each drawing element, the display state of the current presentation element, and the document image, and drawing the drawing elements in the live interface to generate a live image.
The invention also provides a live broadcast method applied to a mobile terminal, wherein the mobile terminal includes at least two camera modules with different viewing angles and storage modules corresponding to the camera modules, and each camera module corresponds to a camera element; the live broadcast interface of the mobile terminal includes a plurality of drawing elements, among which are at least two camera elements;
the method comprises the following steps:
acquiring live broadcast data of the plurality of drawing elements;
and synthesizing the live broadcast data of the plurality of drawing elements to generate a live broadcast image.
Preferably, the method further comprises the following steps:
and making the data of the live broadcast image into streaming media and pushing the streaming media to a server.
The invention also provides a computer device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, the processor implementing the steps of the method when executing the computer program.
The present invention also provides a computer-readable storage medium on which a computer program is stored, wherein the computer program, when executed by a processor, implements the steps of the above-described method.
The beneficial effects of the above technical scheme are that:
In this technical scheme, the live broadcast interface of the mobile terminal can support the display of a plurality of drawing elements (i.e., the scene elements of live interaction). During live broadcasting, the live data of each drawing element is obtained separately, and the live data of all drawing elements is synthesized to generate a live image, so that the live interface can display the plurality of drawing elements, improving the interaction effect and visual experience among users.
Drawings
Fig. 1 is a block diagram of one embodiment of a live broadcast system of the present invention;
FIG. 2 is a flow chart of a method of one embodiment of a live broadcast method of the present invention;
FIG. 3 is a diagram illustrating one embodiment of a live image according to the present invention;
FIG. 4 is a schematic diagram of a hierarchy of rendered elements in a live interface with privacy mode;
FIG. 5 is a schematic view of a live interface in a privacy mode;
FIG. 6 is a method flow diagram of another embodiment of a live broadcast method of the present invention;
FIG. 7 is a flow diagram of a method of one embodiment of a live broadcast method including presentation elements;
FIGS. 8c1 to 8c3 are diagrams of the process of switching between two document images;
FIG. 9 is a flow diagram of a method of an embodiment of a live method including two camera elements;
FIG. 10 is a schematic view of a live interface including two camera elements;
fig. 11 is a block diagram of an embodiment of a live system of the present invention;
fig. 12 is a schematic hardware structure diagram of a computer device for executing a live broadcast method according to an embodiment of the present invention;
FIG. 13 is a diagram of a live interface including various drawing elements.
Detailed Description
The advantages of the invention are further illustrated in the following description of specific embodiments in conjunction with the accompanying drawings.
Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The implementations described in the exemplary embodiments below are not intended to represent all implementations consistent with the present disclosure. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the present disclosure, as detailed in the appended claims.
The terminology used in the present disclosure is for the purpose of describing particular embodiments only and is not intended to be limiting of the disclosure. As used in this disclosure and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used herein refers to and encompasses any and all possible combinations of one or more of the associated listed items.
It is to be understood that although the terms first, second, third, etc. may be used herein to describe various information, such information should not be limited by these terms. These terms are only used to distinguish one type of information from another. For example, first information may also be referred to as second information, and similarly, second information may also be referred to as first information, without departing from the scope of the present disclosure. Depending on the context, the word "if" as used herein may be interpreted as "when" or "upon."
In the description of the present invention, it should be understood that the numerical references before the steps do not identify the order of performing the steps, but merely serve to facilitate the description of the present invention and to distinguish each step, and thus should not be construed as limiting the present invention.
The video of the embodiments of the application may be presented on clients such as large-scale video playing devices, game machines, desktop computers, smart phones, tablet computers, MP3 (Moving Picture Experts Group Audio Layer III) players, MP4 (Moving Picture Experts Group Audio Layer IV) players, laptop computers, e-book readers, and other display terminals.
The video in the embodiments of the application can be applied not only to competition-type video playing programs, but also to any application scenario capable of presenting video, for example, job-seeking programs, matchmaking programs, multi-party competitive entertainment programs, and the like. The embodiments of the present application take the application of video to game-type live video playing programs as an example, but are not limited to this.
In the embodiments of the application, a user at the live broadcast end (i.e., the stream-push end) sends live broadcast information, which is processed by the server and distributed to each watching end (i.e., the stream-pull end), and each watching end plays it. Referring to fig. 1, fig. 1 is a diagram illustrating a live system architecture according to an embodiment of the present disclosure. As shown in fig. 1, user A transmits live broadcast information to a server W through a wireless network; users B and C watch user A's live video through the wireless network, while users D and E watch it through a wired network and send their respective bullet-screen comments to the server W. Only one server W is shown here; the application scenario may also include multiple servers in communication with each other. The server W may be a cloud server or a local server; in the embodiment of the present application, the server W is placed on the cloud side. When user A sends live broadcast information, the server W processes it and forwards it to users B, C, D, and E.
The invention provides a live broadcast method to overcome the defects that the existing mobile live broadcast interface displays only a single kind of content and offers a poor user experience. It should be noted that the live broadcast method is applied to a mobile terminal whose live interface includes at least two drawing elements. Referring to fig. 2, which is a schematic flow chart of a live broadcast method according to a preferred embodiment of the present invention, the live broadcast method provided in this embodiment mainly includes the following steps:
S1, acquiring live data of the at least two drawing elements;
it should be noted that: the drawing element may be selected from any one of: camera elements, scene elements, composition sub-elements; layers between drawing elements may overlap or overlay.
The scene elements can adopt picture elements or character elements; the combined sub-element can comprise a combined sub-element which is composed of a picture element, and/or a text element, and/or a camera element and contains adaptive parameters; the adaptation parameters may include: a size variation parameter, and/or a position variation parameter, and/or a texture variation parameter. The size change parameter, the position change parameter and the texture change parameter correspond to corresponding transformation matrixes. The mobile terminal in this embodiment may be a smart phone or a tablet computer. The camera element in the embodiment corresponds to a camera module of the mobile terminal, and the image content displayed by the camera element is the content shot by the camera module; the picture elements are images stored in the mobile terminal in advance; a composite sub-element may be understood as a small live interface, which may be an element consisting of a picture element and a text element and including layout adaptation parameters for each element.
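As a minimal sketch of how a size or position variation parameter could correspond to a transformation matrix, the code below builds a simple affine matrix and maps a point through it. The function names and the row-major layout are illustrative assumptions, not taken from the patent.

```python
# Illustrative sketch: an adaptation parameter set expressed as a 3x3
# affine transform, since each variation parameter is said to correspond
# to a transformation matrix. Names here are hypothetical.

def adaptation_matrix(scale_x, scale_y, tx, ty):
    """Build a row-major 3x3 affine matrix: scale, then translate."""
    return [
        [scale_x, 0.0, tx],
        [0.0, scale_y, ty],
        [0.0, 0.0, 1.0],
    ]

def apply_transform(m, x, y):
    """Map a point (x, y) through the affine matrix."""
    nx = m[0][0] * x + m[0][1] * y + m[0][2]
    ny = m[1][0] * x + m[1][1] * y + m[1][2]
    return nx, ny
```

A drawing element's corner coordinates would be pushed through such a matrix before rendering, e.g. `apply_transform(adaptation_matrix(2.0, 2.0, 10.0, 20.0), 1.0, 1.0)` yields the scaled-and-shifted point.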
In step S1, the content of each drawing element at the current time is acquired so that the content of the live image is continuously updated. For example: for a camera element, the currently captured data is fetched again from the camera module; for a text element, the scrolled distance is calculated from the current time. The result computed for each drawing element (i.e., a texture map together with its width and height) is stored.
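The per-frame update just described, computing how far a text element has scrolled from the current time, might be sketched as follows; the parameter names and the wrap-around behavior are assumptions for illustration.

```python
def scroll_offset(elapsed_ms, speed_px_per_s, loop_width):
    """Distance (in pixels) a marquee text element has moved at the
    current time, wrapping around once it passes the loop width."""
    return (elapsed_ms / 1000.0 * speed_px_per_s) % loop_width
```

For instance, at 50 px/s a text element that loops every 300 px is at the same offset after 2 s and after 8 s of scrolling.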
S2, synthesizing the live data of the at least two drawing elements to generate a live image.
In practical application, the live image can be displayed in a screen of the mobile terminal so that a user can watch the live effect.
Further, in step S2, the step of synthesizing the live data of the at least two drawing elements to generate a live image includes:
and adjusting at least one of the position, the size and the texture of each drawing element according to the adaptive parameters of each drawing element, and drawing the adjusted at least two drawing elements in a live interface to generate a live image. Each rendering element corresponds to an adaptation parameter.
Because the pixel fill rate of existing mobile phone graphics hardware is much lower than that of a desktop graphics card, a live interface that must composite many drawing elements would normally require each drawing element to be rendered multiple times, with each rendering pass consuming fill rate; the low fill rate of a mobile phone GPU therefore cannot support displaying a live interface containing many drawing elements in that way. In step S2, to address this limitation, scene synthesis is performed directly on each drawing element using its adaptation parameters (i.e., the original picture corresponding to each drawing element is rendered based on the transformation matrix corresponding to that element to synthesize the live image), so as to stay within the pixel fill rate of the mobile phone GPU, achieve the goal of displaying a live interface composed of multiple drawing elements, and improve the user experience.
By way of example and not limitation, an Open Graphics Library (OpenGL) module may be used to render and synthesize the live data of all drawing elements: according to the adaptation parameter corresponding to each drawing element and the actual size of that element, the scaling ratio and the final drawing position coordinates are calculated on the premise of no deformation, and the updated texture map of each drawing element is combined into the live picture. Taking a picture element as an example: during rendering and composition, the size of the picture corresponding to the picture element is adjusted (for example, via display modes such as zoom-in, zoom-out, tiling, stretch, fill, centering, or cropping), so that the picture is displayed within a reasonable layout range of the live interface; the position of the picture in the live interface (for example, upper, lower, or middle) and its texture (for example, transparency) are also adjusted, so that the picture is displayed at a reasonable size and with a suitable effect. Text elements may be displayed with a scrolling effect.
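One plausible way to compute the scaling ratio and final drawing position "on the premise of no deformation" is aspect-preserving fit scaling (letterboxing). This sketch is an assumption about the layout math, not the patent's actual implementation.

```python
def fit_rect(src_w, src_h, dst_w, dst_h):
    """Scale a (src_w, src_h) element to fit inside a (dst_w, dst_h)
    region without distortion, centered; returns (x, y, w, h)."""
    scale = min(dst_w / src_w, dst_h / src_h)  # no-deformation scale ratio
    w, h = src_w * scale, src_h * scale
    x = (dst_w - w) / 2  # center horizontally
    y = (dst_h - h) / 2  # center vertically
    return x, y, w, h
```

For example, fitting a 1920x1080 camera frame into a 720x1280 portrait interface scales it to 720x405 and centers it vertically.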
In this embodiment, the live broadcast method is mainly applied at the live broadcast end. The live interface of the mobile terminal can support the display of a plurality of drawing elements (i.e., the scene elements of live interaction). During live broadcasting, the live data of each drawing element is obtained separately, and the live data of all drawing elements is synthesized to generate a live image, so that the live interface displays the plurality of drawing elements, improving the interaction effect and visual experience among users.
As shown in fig. 3, the live image includes: a camera element a1, a text element a2, a picture element a3 serving as the background, and a picture element a4 serving as an avatar. The layers of the camera element a1, the text element a2, and the picture element a4 are all located above the layer of the picture element a3.
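The layering in fig. 3 suggests back-to-front (painter's algorithm) compositing, with the background drawn first and the overlays after it. A minimal sketch of that ordering step, with illustrative element names and layer indices, might look like this.

```python
def composite_order(elements):
    """Return element names sorted back-to-front by layer index, so
    the background (lowest z) is drawn first. Ties keep input order."""
    return [name for name, z in sorted(elements, key=lambda e: e[1])]
```

With the fig. 3 layout, the background picture a3 (z = 0) is drawn before the camera, text, and avatar layers.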
During live broadcasting, when the live broadcast software captures the entire mobile phone screen for stream pushing, the user's privacy can be leaked if the user switches to QQ, WeChat, or other social software to temporarily perform some operation, or even if the user tops up during a live game and needs to enter a payment password. Therefore, based on the above live broadcast method, a picture element covering the live interface can be provided as a masking layer to protect the user's privacy. A specific example is as follows:
when the at least two drawing elements include a picture element whose layer covers the live broadcast interface and whose transparency is within a threshold range:
during live broadcasting, when the masking signal is acquired, the step in S2 of synthesizing the live data of the at least two drawing elements to generate a live image includes:
adjusting the transparency of the masking picture to a non-transparent state, synthesizing the live data of the at least two drawing elements, and generating a live image showing the masking picture.
In this embodiment, the user can start the privacy mode at any time as needed during live broadcasting. When the privacy mode is off, in the process of synthesizing the live data of all drawing elements, the transparency in the adaptation parameters of the picture element serving as the masking layer b1 covering the live interface b2 (see fig. 4) is set to fully or semi-transparent, so that viewers can see the content of the live interface b2 through the masking layer b1. When the user triggers the masking signal and turns the privacy mode on, that transparency is set to a non-transparent state (see fig. 5), so that viewers cannot see the content of the current live interface b2 through the masking layer, thereby protecting the personal privacy of the broadcasting user. While the privacy mode is on, a viewer can only see the image of the masking layer and cannot see the operations of the user at the live end; when the user at the live end turns the privacy mode off, viewers can continue to watch the anchor's screen content.
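The privacy-mode behavior above can be sketched as a per-pixel "over" blend whose mask alpha is forced opaque when the masking signal is active. The function names and the idle-alpha default are illustrative assumptions, not the patent's implementation.

```python
def blend_over(top_rgb, top_alpha, bottom_rgb):
    """Standard 'over' blend of the mask layer onto an interface pixel."""
    return tuple(
        round(t * top_alpha + b * (1.0 - top_alpha))
        for t, b in zip(top_rgb, bottom_rgb)
    )

def masked_pixel(mask_rgb, interface_rgb, privacy_on, idle_alpha=0.0):
    """When privacy mode is on, force the mask fully opaque; otherwise
    use the idle (transparent or semi-transparent) alpha."""
    alpha = 1.0 if privacy_on else idle_alpha
    return blend_over(mask_rgb, alpha, interface_rgb)
```

With privacy off and a fully transparent idle alpha, the interface pixel shows through unchanged; with privacy on, only the mask color is visible.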
Further, in step S2, the step of synthesizing the live data of the at least two drawing elements to generate a live image includes:
and synthesizing the live broadcast data of the plurality of drawing elements according to the adaptive parameters of each drawing element, and drawing the plurality of drawing elements in a live broadcast interface to generate a live broadcast image.
In this embodiment, when the adaptation parameters of a drawing element match the parameters that need to be adjusted, rendering and synthesis can be performed on that drawing element, and its updated texture map is combined into the live picture.
As shown in fig. 6, in a preferred embodiment, the live broadcasting method may further include the steps of:
and S3, making the data of the live broadcast image into streaming media and pushing the streaming media to a server.
Streaming media refers to the technology and process of compressing a series of media data, sending it in segments over a network, and transmitting video and audio in real time for viewing; the technique sends data packets as a continuous stream. Without it, the entire media file would have to be downloaded before use.
In practical application, the data of the live images and the synchronously collected audio data can be compressed to generate streaming media data with timestamps, which is pushed to the server so that the server can transmit the live stream to the watching ends.
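The packaging of live image data into timestamped streaming segments might, in its simplest form, look like the sketch below. The packet structure and millisecond presentation timestamps are assumptions for illustration, not the patent's actual encoder pipeline.

```python
def make_packets(frames, fps):
    """Attach presentation timestamps (in ms) to encoded frames before
    segmented push, mirroring the timestamped stream described above."""
    return [
        {"pts_ms": round(i * 1000 / fps), "data": f}
        for i, f in enumerate(frames)
    ]
```

At 30 fps, successive frames receive timestamps roughly 33 ms apart, which the server and watching ends can use to keep video and audio in sync.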
For a mobile terminal running the Android system, the process of making the data of the live images into streaming media and pushing it to the server is as follows: the live images rendered by OpenGL can be encoded and compressed by a video encoder connected to the open graphics library interface, producing streaming media to be pushed to the server.
For Apple's mobile operating system, the process is as follows: the live image rendered by OpenGL can be obtained through a copy operation between the graphics memory module and the main-board memory module of the mobile terminal, after which a video encoder is called to compress the image and produce streaming media to be pushed to the server.
As shown in fig. 7, a live broadcast method is applied to a mobile terminal, where the live broadcast interface of the mobile terminal includes a plurality of drawing elements; the drawing elements include at least one presentation element, each presentation element corresponds to an index module, and the index module records the display time of each document image and its display state at each moment;
the method comprises the following steps:
A1. Acquire live broadcast data of the drawing elements, and obtain the display state and document image of the presentation element according to the index module;
It should be noted that a drawing element can be any one of the following: a camera element, a scene element, or a combined sub-element.
A scene element can be a picture element or a text element. A combined sub-element is composed of picture elements, and/or text elements, and/or camera elements, and contains adaptation parameters. The adaptation parameters may include a size variation parameter, and/or a position variation parameter, and/or a texture variation parameter.
The mobile terminal in this embodiment may be a smartphone or a tablet computer. The camera element corresponds to a camera module of the mobile terminal, and the image content displayed by the camera element is the content shot by that module; a picture element is an image stored in the mobile terminal in advance; a combined sub-element may be understood as a small live interface, for example an element consisting of a picture element and a text element together with layout adaptation parameters for each.
In step A1, the content of each drawing element at the current time is obtained so that the content of the live image can be continuously updated. For example, the document image to be displayed by the current presentation element is determined from the current time and the display effect used when switching document images (such as fade-in, fade-out, push-in, wipe, etc.).
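The index module lookup described above, mapping the current time to a document image and its transition state, can be sketched as follows. This is a minimal illustration; the class, field names, and fixed transition duration are assumptions, not the patent's implementation:

```python
from bisect import bisect_right

class DocumentIndex:
    """Hypothetical index module: maps a moment in time to the document image
    to display and its transition state (e.g. fade-in progress)."""

    def __init__(self, entries, transition_ms=500):
        # entries: list of (start_ms, image_id, effect) tuples
        self.entries = sorted(entries)
        self.transition_ms = transition_ms

    def state_at(self, now_ms):
        """Return the document image active at `now_ms` and how far its
        switching effect has progressed (0.0 to 1.0)."""
        starts = [e[0] for e in self.entries]
        i = bisect_right(starts, now_ms) - 1
        if i < 0:
            return None  # before the first document image is shown
        start, image_id, effect = self.entries[i]
        progress = min(1.0, (now_ms - start) / self.transition_ms)
        return {"image": image_id, "effect": effect, "progress": progress}
```

During live broadcasting, step A1 would query such an index once per rendered frame to decide which document image to draw and with what effect.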
A2. Synthesize the live broadcast data of the at least two drawing elements to generate a live broadcast image.
In practical application, the live image can be displayed in a screen of the mobile terminal so that a user can watch the live effect.
Further, in step A2, the step of synthesizing the live broadcast data of the at least two drawing elements to generate a live broadcast image includes:
adjusting at least one of the position, size, and texture of each drawing element according to its adaptation parameters, the display state of the current presentation element, and the document image, and rendering the at least two adjusted drawing elements in the live interface to generate a live image; or
synthesizing the live broadcast data of all drawing elements according to the adaptation parameters of each drawing element, the display state of the current presentation element, and the document image, and rendering the drawing elements in the live interface to generate a live image. When the adaptation parameters of a drawing element match the parameters that need adjusting, the drawing element can be rendered and synthesized, and the updated texture maps of the drawing elements are combined into a live broadcast picture.
For a presentation element, the document image to be displayed and its display effect can be determined according to the current time; a picture loading controller is then directed to load the document image, the image is scaled proportionally to the largest size that fits via the open graphics library and placed in the display interface, and its transparency is calculated for blended rendering to obtain the final picture. Referring to figs. 8c1 to 8c3, the document image of fig. 8c1 transitions to the document image of fig. 8c3; fig. 8c2 shows the display state in which both document images are semi-transparent.
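The blended rendering of two semi-transparent document images during such a transition amounts to alpha blending. A minimal sketch, with hypothetical names and grayscale pixel lists standing in for textures:

```python
def crossfade(outgoing, incoming, progress):
    """Blend two document images during a transition (illustrative helper).

    `outgoing` and `incoming` are equal-length sequences of grayscale pixel
    values 0-255; `progress` runs from 0.0 (only the old image visible) to
    1.0 (only the new image visible). At progress 0.5 both images are
    semi-transparent, matching the mid-transition state of fig. 8c2.
    """
    if len(outgoing) != len(incoming):
        raise ValueError("images must have the same size")
    alpha = max(0.0, min(1.0, progress))
    return [round((1.0 - alpha) * o + alpha * n) for o, n in zip(outgoing, incoming)]
```

In the real pipeline this per-pixel mixing would be done by the open graphics library on the GPU rather than in a Python loop.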
In this embodiment, the live broadcast interface of the mobile terminal can support the display of multiple drawing elements (i.e., interactive live scene elements). During live broadcasting, the live data of each drawing element is obtained separately, and the live data of all drawing elements is synthesized to generate a live image, so that the live interface displays multiple drawing elements, improving the interaction effect and visual experience for users.
By way of example and not limitation, this live broadcast method can be applied when a user at the live broadcast end communicates with users at the viewing end by voice and comments, the live broadcast end displaying a presentation instead of a camera feed.
In a preferred embodiment, the live broadcasting method may further include the steps of:
A3. making the data of the live broadcast image into streaming media and pushing it to a server.
In practical application, the live broadcast image data and the synchronously captured audio data can be compressed to generate time-stamped streaming media data, which is pushed to a server so that the server can transmit the live stream to the viewing ends.
As shown in fig. 9, a live broadcast method is applied to a mobile terminal that includes camera modules with at least two viewing angles and storage modules corresponding to the camera modules; the image data captured by each camera module is stored through its storage module, and each camera module corresponds to a camera element. The live broadcast interface of the mobile terminal includes a plurality of drawing elements, among which are at least two camera elements;
the method comprises the following steps:
B1. Acquire live broadcast data of the plurality of drawing elements;
It should be noted that a drawing element may also be any one of the following: a scene element or a combined sub-element.
A scene element can be a picture element or a text element. A combined sub-element is composed of picture elements, and/or text elements, and/or camera elements, and contains adaptation parameters. The adaptation parameters may include a size variation parameter, and/or a position variation parameter, and/or a texture variation parameter.
The mobile terminal in this embodiment may be a smartphone or a tablet computer equipped with two cameras. The camera element corresponds to a camera module of the mobile terminal, and the image content displayed by the camera element is the content shot by that module; a picture element is an image stored in the mobile terminal in advance; a combined sub-element may be understood as a small live interface, for example an element consisting of a picture element and a text element together with layout adaptation parameters for each.
B2. Synthesize the live broadcast data of the plurality of drawing elements to generate a live broadcast image.
Referring to fig. 10, d1 is image data collected by the front camera module of the mobile terminal; d2 is the image data collected by the rear camera module of the mobile terminal.
In practical application, the live image can be displayed in a screen of the mobile terminal so that a user can watch the live effect.
Further, in step B2, the step of synthesizing the live broadcast data of the plurality of drawing elements to generate a live broadcast image includes:
adjusting at least one of the position, size, and texture of each drawing element according to its adaptation parameters, and drawing the at least two adjusted drawing elements in the live interface to generate a live image; or
synthesizing the live broadcast data of the plurality of drawing elements according to the adaptation parameters of each drawing element, and drawing the plurality of drawing elements in the live interface to generate a live image.
By way of example and not limitation, an open graphics library (OpenGL) module may be used to render and synthesize the live broadcast data of all drawing elements: according to the adaptation parameter of each drawing element and its actual size, the scaling ratio and the final drawing position coordinates are calculated so that no deformation occurs, and the updated texture maps of the drawing elements are combined into a live broadcast picture. Taking a picture element as an example: during rendering and composition, the size of the picture (for example, display modes such as zoom-in, zoom-out, tiling, stretching, filling, centering, spanning, etc.) is adjusted so that the picture falls within a reasonable layout range of the live interface, and the position of the picture in the live interface (for example, upper, lower, or middle) and its texture (for example, transparency) are adjusted so that the picture is displayed at a reasonable size with a suitable effect. Text elements can be displayed in a scrolling mode.
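The no-deformation scaling and positioning derived from the adaptation parameters can be sketched as an aspect-fit calculation. This is an illustrative helper; the function name and the centring policy are assumptions, not the patent's implementation:

```python
def aspect_fit(elem_w, elem_h, region_w, region_h):
    """Scale a drawing element to the largest size that fits a layout region
    without deformation, and centre it within the region.

    Returns (scale, x, y, draw_w, draw_h), where (x, y) is the top-left
    drawing coordinate inside the region.
    """
    if min(elem_w, elem_h, region_w, region_h) <= 0:
        raise ValueError("dimensions must be positive")
    # Uniform scale factor: the same ratio on both axes, so no stretching.
    scale = min(region_w / elem_w, region_h / elem_h)
    draw_w, draw_h = elem_w * scale, elem_h * scale
    x = (region_w - draw_w) / 2
    y = (region_h - draw_h) / 2
    return scale, x, y, draw_w, draw_h
```

For example, a 1920x1080 camera frame placed in a 1080x1920 portrait interface is scaled by 0.5625 to 1080x607.5 and centred vertically; other display modes (tiling, filling, spanning) would compute the scale and position differently.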
In this embodiment, the live broadcast interface of the mobile terminal can support the display of multiple drawing elements (i.e., interactive live scene elements). During live broadcasting, the live data of each drawing element is obtained separately, and the live data of all drawing elements is synthesized to generate a live image, so that the live interface displays multiple drawing elements, improving the interaction effect and visual experience for users.
In a preferred embodiment, the live broadcasting method may further include the steps of:
B3. making the data of the live broadcast image into streaming media and pushing it to a server.
In practical application, the live broadcast image data and the synchronously captured audio data can be compressed to generate time-stamped streaming media data, which is pushed to a server so that the server can transmit the live stream to the viewing ends.
As shown in fig. 11, a live broadcast system is applied to a mobile terminal, where a live broadcast interface of the mobile terminal includes at least two drawing elements; the system comprises: an acquisition unit 11 and a synthesis unit 12, wherein:
an obtaining unit 11, configured to obtain live data of the at least two drawing elements;
It should be noted that a drawing element may be any one of the following: a camera element, a scene element, or a combined sub-element.
A scene element can be a picture element or a text element. A combined sub-element is composed of picture elements, and/or text elements, and/or camera elements, and contains adaptation parameters. The adaptation parameters may include a size variation parameter, and/or a position variation parameter, and/or a texture variation parameter.
The mobile terminal in this embodiment may be a smartphone or a tablet computer. The camera element corresponds to a camera module of the mobile terminal, and the image content displayed by the camera element is the content shot by that module; a picture element is an image stored in the mobile terminal in advance; a combined sub-element may be understood as a small live interface, for example an element consisting of a picture element and a text element together with layout adaptation parameters for each.
The acquisition unit 11 continuously updates the content of the live image by acquiring the content of each drawing element at the current time. For example, for a camera element, the currently captured data is fetched again from the camera module; for a text element, the scrolled distance is calculated from the current time. The result computed for each drawing element (i.e., a texture map together with its width and height) is stored.
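The per-time scroll-distance calculation for a text element can be sketched as follows. The function name, the right-to-left direction, and the wrap-around behaviour are illustrative assumptions:

```python
def scroll_offset(now_ms, start_ms, speed_px_per_s, text_width, view_width):
    """Horizontal offset of a scrolling text element at the current time.

    The text enters from the right edge of the view, scrolls left at
    `speed_px_per_s`, and wraps around once it has fully crossed the view.
    Returns the x position of the text's left edge within the view.
    """
    if now_ms < start_ms:
        return view_width  # not started yet: parked off the right edge
    travelled = (now_ms - start_ms) / 1000.0 * speed_px_per_s
    cycle = text_width + view_width  # distance for one full pass across the view
    return view_width - (travelled % cycle)
```

Computing the offset purely from the current time, rather than accumulating per-frame deltas, keeps the scroll position correct even when frames are dropped.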
A synthesizing unit 12, configured to synthesize the live data of the at least two drawing elements and generate a live image.
In practical application, the live image can be displayed in a screen of the mobile terminal so that a user can watch the live effect.
Further, the synthesizing unit 12 is configured to adjust at least one of the position, size, and texture of each drawing element according to its adaptation parameters, and to render the at least two adjusted drawing elements in the live interface to generate a live image.
The synthesizing unit 12 may also synthesize the live broadcast data of the plurality of drawing elements according to the adaptation parameters of each drawing element, and draw the plurality of drawing elements in the live interface to generate a live image. When the adaptation parameters of a drawing element match the parameters that need adjusting, the drawing element can be rendered and synthesized, and the updated texture maps of the drawing elements are combined into a live broadcast picture.
In this embodiment, the live broadcast interface of the mobile terminal can support the display of multiple drawing elements (i.e., interactive live scene elements). During live broadcasting, the live data of each drawing element is obtained separately, and the live data of all drawing elements is synthesized to generate a live image, so that the live interface displays multiple drawing elements, improving the interaction effect and visual experience for users.
As shown in fig. 12, a computer apparatus 2, the computer apparatus 2 comprising:
a memory 21 for storing executable program code; and
a processor 22 for calling the executable program code in the memory 21 to execute steps including the live broadcast method described above.
Fig. 12 illustrates an example of one processor 22.
The memory 21, which is a non-volatile computer-readable storage medium, may be used to store non-volatile software programs, non-volatile computer-executable programs, and modules, such as program instructions/modules (for example, the obtaining unit 11 and the synthesizing unit 12 shown in fig. 11) corresponding to the live broadcast method in the embodiment of the present application. The processor 22 executes various functional applications and data processing of the computer device 2 by running the nonvolatile software programs, instructions and modules stored in the memory 21, namely, the live broadcast method of the above-mentioned method embodiment is realized.
The memory 21 may include a program storage area and a data storage area, wherein the program storage area may store the operating system and the application program required for at least one function, and the data storage area may store the playback information of the user on the computer device 2. Further, the memory 21 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid-state storage device. In some embodiments, the memory 21 optionally includes memory located remotely from the processor 22; such remote memory may be connected to the live broadcast system 1 via a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The one or more modules are stored in the memory 21 and, when executed by the one or more processors 22, perform the live broadcast method in any of the above method embodiments, for example, performing method steps S1 to S2 in fig. 2 and S1 to S3 in fig. 6, and implementing the functions of the acquisition unit 11 and the synthesis unit 12 shown in fig. 11.
The above product can execute the method provided by the embodiments of the present application and has the corresponding functional modules and beneficial effects. For technical details not described in detail in this embodiment, reference may be made to the methods provided in the embodiments of the present application.
The computer device 2 of the embodiment of the present application exists in various forms, including but not limited to:
(1) Mobile communication devices: such devices are characterized by mobile communication capability and are primarily aimed at providing voice and data communication. Such terminals include smartphones (e.g., iPhones), multimedia phones, feature phones, and low-end phones.
(2) Ultra-mobile personal computer devices: such devices belong to the category of personal computers, have computing and processing functions, and generally also support mobile internet access. Such terminals include PDA, MID, and UMPC devices, e.g., iPads.
(3) Portable entertainment devices: such devices can display and play multimedia content. They include audio and video players (e.g., iPods), handheld game consoles, electronic books, smart toys, and portable car navigation devices.
(4) Servers: devices providing computing services, comprising a processor, hard disk, memory, system bus, etc. A server is similar in architecture to a general-purpose computer, but has higher requirements for processing capacity, stability, reliability, security, scalability, and manageability, because it must provide highly reliable services.
(5) Other electronic devices with data interaction functions.
The present application provides a non-transitory computer-readable storage medium storing computer-executable instructions, which are executed by one or more processors (for example, the one processor 22 in fig. 12), so that the one or more processors 22 can perform the live broadcast method in any of the above method embodiments, for example, performing method steps S1 to S2 in fig. 2 and S1 to S3 in fig. 6, and implementing the functions of the obtaining unit 11 and the synthesizing unit 12 shown in fig. 11.
The above-described embodiments of the apparatus are merely illustrative, and the units described as separate parts may or may not be physically separate, and the parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on at least two network units. Some or all of the modules can be selected according to actual needs to achieve the purpose of the solution of the embodiments of the present application. One of ordinary skill in the art can understand and implement it without inventive effort.
Through the above description of the embodiments, those skilled in the art will clearly understand that each embodiment can be implemented by software plus a general hardware platform, and certainly can also be implemented by hardware. It will be understood by those skilled in the art that all or part of the processes of the methods of the embodiments described above can be implemented by hardware related to instructions of a computer program, which can be stored in a computer readable storage medium, and when executed, can include the processes of the embodiments of the methods described above. The storage medium may be a magnetic disk, an optical disk, a Read-only memory (ROM), a Random Access Memory (RAM), or the like.
Embodiment 1
Referring to fig. 13, the live broadcast method may be applied to a live broadcast interface of a mobile phone in which the live broadcast scene includes a camera element, a picture element, and a text element. For an anchor user broadcasting with the phone's camera at the live broadcast end, a topic box can be added to the live broadcast interface so that an identifier of the current live topic or content can be conveyed to viewing users in real time. This "topic box" (e.g., a picture labeled "What are we chatting about today?") is a picture element. The picture captured by the camera element corresponding to the phone's camera module and the topic box are both drawing elements, and the topic box layer lies above the camera element layer.
Embodiment 2
As shown in fig. 3, the live broadcast method can be applied to a live broadcast interface of a mobile phone in which the live broadcast scene includes a camera element, a picture element, and a text element. Besides adding a camera element to the live broadcast interface to shoot the live picture, the live broadcast end can add a text element (such as "Welcome to the live broadcast room! Live from 9:00 to 12:00") to tell viewing users the hours of the live broadcast room, and can use picture elements for the anchor's avatar and the overall live broadcast background, thereby improving the live interaction effect and enriching the live content.
Finally, it should be noted that the above embodiments are only intended to illustrate the technical solutions of the present application, not to limit them. Although the present application has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art should understand that the technical solutions described in the foregoing embodiments may still be modified, or some or all of the technical features may be equivalently replaced, and such modifications or substitutions do not depart from the scope of the technical solutions of the embodiments of the present application.

Claims (13)

1. A live broadcast method is characterized in that the method is applied to a mobile terminal, and a live broadcast interface of the mobile terminal comprises at least two drawing elements; the method comprises the following steps:
acquiring live broadcast data of the at least two drawing elements;
and synthesizing the live broadcast data of the at least two drawing elements to generate a live broadcast image.
2. A live broadcast method according to claim 1, wherein the step of synthesizing live broadcast data of the at least two rendering elements to generate a live broadcast image comprises:
adjusting at least one of the position, the size and the texture of each drawing element according to the adaptive parameters of each drawing element, and drawing the adjusted at least two drawing elements in a live interface to generate a live image, or
And synthesizing the live broadcast data of the plurality of drawing elements according to the adaptive parameters of each drawing element, and drawing the plurality of drawing elements in a live broadcast interface to generate a live broadcast image.
3. A live broadcast method according to claim 1, further comprising the step of:
and making the data of the live broadcast image into streaming media and pushing the streaming media to a server.
4. A live method according to claim 1, wherein the drawing element is selected from any one of: camera elements, scene elements, composition sub-elements;
the scene elements adopt picture elements or text elements.
5. A live broadcast method according to claim 4, wherein the combined sub-elements comprise combined sub-elements comprising picture elements, and/or text elements, and/or camera elements, and including adaptation parameters;
the adaptation parameters include: a size variation parameter, and/or a position variation parameter, and/or a texture variation parameter.
6. The live broadcasting method according to claim 4, wherein when the at least two drawing elements include a picture element, and a layer of the picture element is overlaid on the live broadcasting interface, the transparency of the picture element is within a threshold range;
in the live broadcasting process, when the shielding signal is acquired, synthesizing the live broadcasting data of the at least two drawing elements, and generating a live broadcasting image comprises the following steps:
and adjusting the transparency of the shielding picture to a non-transparent state, synthesizing the live broadcast data of the at least two drawing elements, and generating a live broadcast image showing the shielding picture.
7. A live broadcast method is characterized in that the live broadcast method is applied to a mobile terminal, and a live broadcast interface of the mobile terminal comprises a plurality of drawing elements; the rendering elements comprise at least one presentation element, each presentation element corresponds to an index module, and the index module is used for recording the display time of each document image and the display state at each moment;
the method comprises the following steps:
acquiring live broadcast data of the drawing element, and acquiring a display state and a manuscript image of the presentation manuscript element according to the index module;
and synthesizing the live broadcast data of the at least two drawing elements to generate a live broadcast image.
8. A live broadcast method according to claim 7, further comprising the step of:
and making the data of the live broadcast image into streaming media and pushing the streaming media to a server.
9. A live broadcast method according to claim 7, wherein the step of synthesizing live broadcast data of the at least two rendering elements to generate a live broadcast image comprises:
adjusting at least one of the position, the size and the texture of each rendering element according to the adaptation parameters of each rendering element, the display state of the current presentation element and the document image, rendering the at least two adjusted rendering elements in a live interface to generate a live image, or
And synthesizing the live broadcast data of all the rendering elements according to the adaptive parameters of each rendering element, the display state of the current presentation file element and the file image, and rendering the rendering elements in a live broadcast interface to generate a live broadcast image.
10. A live broadcast method is characterized in that the live broadcast method is applied to a mobile terminal, the mobile terminal comprises camera modules with at least two visual angles and storage modules corresponding to the camera modules, and each camera module corresponds to a camera element; the live broadcast interface of the mobile terminal comprises a plurality of drawing elements, wherein the plurality of drawing elements comprise at least two camera shooting elements;
the method comprises the following steps:
acquiring live broadcast data of the plurality of drawing elements;
and synthesizing the live broadcast data of the plurality of drawing elements to generate a live broadcast image.
11. A live broadcast method according to claim 10, further comprising the step of:
and making the data of the live broadcast image into streaming media and pushing the streaming media to a server.
12. A computer device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, the processor implementing the steps of the method of any one of claims 1 to 6 when executing the computer program.
13. A computer-readable storage medium having stored thereon a computer program, characterized in that: the computer program when executed by a processor implements the steps of the method of any one of claims 1 to 6.
CN201910463104.9A 2019-05-30 2019-05-30 Live broadcast method, computer equipment and readable storage medium Pending CN112019906A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910463104.9A CN112019906A (en) 2019-05-30 2019-05-30 Live broadcast method, computer equipment and readable storage medium

Publications (1)

Publication Number Publication Date
CN112019906A true CN112019906A (en) 2020-12-01

Family

ID=73502032

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910463104.9A Pending CN112019906A (en) 2019-05-30 2019-05-30 Live broadcast method, computer equipment and readable storage medium

Country Status (1)

Country Link
CN (1) CN112019906A (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105791889A (en) * 2016-05-04 2016-07-20 武汉斗鱼网络科技有限公司 Advertisement inter-cut method for video live broadcasting and advertisement inter-cut device for video live broadcasting
CN106165430A (en) * 2016-06-29 2016-11-23 北京小米移动软件有限公司 Net cast method and device
CN106454481A (en) * 2016-09-30 2017-02-22 广州华多网络科技有限公司 Live broadcast interaction method and apparatus of mobile terminal
CN106559696A (en) * 2016-12-01 2017-04-05 北京小米移动软件有限公司 Method for sending information and device
CN107566878A (en) * 2017-08-07 2018-01-09 北京小米移动软件有限公司 The method and device of live middle display picture
CN109672908A (en) * 2018-12-27 2019-04-23 北京潘达互娱科技有限公司 A kind of method for protecting privacy, device and mobile terminal

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113225581A (en) * 2021-05-17 2021-08-06 腾讯科技(深圳)有限公司 Live broadcast interaction method and device and electronic equipment
CN113949900A (en) * 2021-10-08 2022-01-18 上海哔哩哔哩科技有限公司 Live broadcast map processing method and system
CN113949900B (en) * 2021-10-08 2023-11-24 上海哔哩哔哩科技有限公司 Live broadcast mapping processing method, system, equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination