CN115037951B - Live broadcast processing method and device - Google Patents


Info

Publication number
CN115037951B
CN115037951B (application CN202110243553.XA)
Authority
CN
China
Prior art keywords: definition, user, target, live, data
Prior art date
Legal status: Active (the legal status is an assumption and is not a legal conclusion)
Application number
CN202110243553.XA
Other languages
Chinese (zh)
Other versions
CN115037951A
Inventor
朱翔
Current Assignee
Shanghai Bilibili Technology Co Ltd
Original Assignee
Shanghai Bilibili Technology Co Ltd
Application filed by Shanghai Bilibili Technology Co Ltd
Priority to CN202110243553.XA
Publication of CN115037951A
Application granted
Publication of CN115037951B
Legal status: Active
Anticipated expiration


Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00: Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/2187: Live feed
    • H04N 21/23418: Analysing video streams, e.g. detecting features or characteristics
    • H04N 21/234363: Reformatting video signals by altering the spatial resolution, e.g. for clients with a lower screen resolution
    • H04N 21/234381: Reformatting video signals by altering the temporal resolution, e.g. decreasing the frame rate by frame skipping
    • H04N 21/2402: Monitoring of the downstream path of the transmission network, e.g. bandwidth available

Abstract

The embodiments of the present application provide a live broadcast processing method and device. The live broadcast processing method comprises: receiving a live broadcast room entry instruction from a user, wherein the instruction carries identification information of a target live broadcast room; acquiring definition monitoring data of at least one data dimension, and determining the target definition of the live stream to be played for the user according to the definition monitoring data; acquiring live broadcast configuration information of the target live broadcast room according to the identification information, and determining the target live stream corresponding to the target definition according to the mapping relation between live streams and definitions in the live broadcast configuration information; and transmitting the target live stream to the user as the live stream to be played.

Description

Live broadcast processing method and device
Technical Field
The embodiments of the present application relate to the technical field of network live video broadcasting, and in particular to a live broadcast processing method. One or more embodiments of the present application also relate to a live broadcast processing apparatus, a computing device, and a computer-readable storage medium.
Background
With the development of network technology, network video applications have become increasingly popular, and live network video is correspondingly in wide use, for example for live network broadcasts of concerts, sports events, and the like.
Current live network broadcasting technology encodes the video SDI (Serial Digital Interface) signal received from a satellite into a live video stream and transmits that stream to terminals for display. Users' requirements for definition and stability keep rising, while the stability, bandwidth and other properties of the network often affect the live broadcast effect. When a terminal with low network bandwidth plays live video, problems frequently occur such as playback not being smooth, some video frames failing to be acquired and displayed normally, and video content information being lost, so that the fluency of the live video cannot be guaranteed.
Disclosure of Invention
In view of this, embodiments of the present application provide a live broadcast processing method. One or more embodiments of the present application also relate to a live broadcast processing apparatus, a computing device, and a computer-readable storage medium, so as to solve the technical defect in the prior art that the fluency of a live broadcast cannot be guaranteed when the user terminal plays live video.
According to a first aspect of an embodiment of the present application, there is provided a live broadcast processing method, including:
receiving a live broadcasting room entering instruction of a user, wherein the live broadcasting room entering instruction carries identification information of a target live broadcasting room;
acquiring definition monitoring data of at least one data dimension, and determining the target definition of the live stream to be played for the user according to the definition monitoring data;
acquiring live broadcast configuration information of the target live broadcast room according to the identification information, and determining a target live broadcast stream corresponding to the target definition according to a mapping relation between the live broadcast stream and the definition in the live broadcast configuration information;
and transmitting the target live stream to the user as the live stream to be played.
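The four claimed steps can be sketched end to end as follows. This is an illustrative outline only: every name and data shape here (`handle_enter_room`, the dictionaries, the definition labels) is hypothetical, not taken from the patent.

```python
def handle_enter_room(instruction, monitoring_data, room_configs, pick_definition):
    """Step 1: the entry instruction carries the target room's identification.
    Step 2: derive the target definition from the monitoring data.
    Step 3: look up the room's configuration and map definition -> stream.
    Step 4: return that stream as the stream to be played.
    """
    room_id = instruction["room_id"]                 # identification information
    target_definition = pick_definition(monitoring_data)
    config = room_configs[room_id]                   # live broadcast configuration
    return config["streams"][target_definition]      # mapping definition -> stream

# Minimal usage with made-up data:
configs = {"room-1": {"streams": {"high": "rtmp://example/high",
                                  "standard": "rtmp://example/std"}}}
stream = handle_enter_room(
    {"room_id": "room-1"},
    {"logged_in": False},
    configs,
    lambda m: "high" if not m.get("logged_in") else "standard",
)
```

The `pick_definition` callable stands in for the multi-dimension decision logic (login state, location, network, motion) detailed in the description below.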
According to a second aspect of an embodiment of the present application, there is provided a live broadcast processing apparatus, including:
the receiving module is configured to receive a live broadcasting room entering instruction of a user, wherein the live broadcasting room entering instruction carries identification information of a target live broadcasting room;
the acquisition module is configured to acquire definition monitoring data of at least one data dimension and determine target definition of the live stream to be played of the user according to the definition monitoring data;
the determining module is configured to acquire live broadcast configuration information of the target live broadcast room according to the identification information, and determine a target live broadcast stream corresponding to the target definition according to a mapping relation between the live broadcast stream and the definition in the live broadcast configuration information;
and the transmission module is configured to transmit the target live stream to the user as the live stream to be played.
According to a third aspect of embodiments of the present application, there is provided a computing device comprising:
a memory and a processor;
the memory is configured to store computer-executable instructions, and the processor is configured to execute the computer-executable instructions, wherein the processor implements the steps of the live broadcast processing method when executing the computer-executable instructions.
According to a fourth aspect of embodiments of the present application, there is provided a computer readable storage medium storing computer executable instructions which, when executed by a processor, implement the steps of the live processing method.
An embodiment of the present application realizes a live broadcast processing method and device. The live broadcast processing method comprises: receiving a live broadcast room entry instruction from a user, the instruction carrying identification information of a target live broadcast room; acquiring definition monitoring data of at least one data dimension, and determining the target definition of the live stream to be played for the user according to the definition monitoring data; acquiring live broadcast configuration information of the target live broadcast room according to the identification information, and determining the target live stream corresponding to the target definition according to the mapping relation between live streams and definitions in the live broadcast configuration information; and transmitting the target live stream to the user as the live stream to be played.
In the embodiment of the present application, the server can monitor and acquire definition monitoring data of at least one dimension for the user, and thereby provide the user with a live stream of suitable play definition according to that data. While making good use of the user's network resources, stalling of the live stream at the user terminal can be effectively avoided, which helps improve the user experience.
Drawings
Fig. 1 is a flowchart of a live broadcast processing method according to an embodiment of the present application;
Fig. 2 is a schematic diagram of a live broadcast processing method according to an embodiment of the present application;
Fig. 3a is a schematic diagram of a live broadcast processing result according to an embodiment of the present application;
Fig. 3b is a schematic diagram of a live broadcast processing result according to an embodiment of the present application;
Fig. 4 is a schematic flowchart of live broadcast processing in the live broadcast processing method according to an embodiment of the present application;
Fig. 5 is a schematic structural diagram of a live broadcast processing device according to an embodiment of the present application;
Fig. 6 is a block diagram of a computing device provided in an embodiment of the present application.
Detailed Description
In the following description, numerous specific details are set forth in order to provide a thorough understanding of the present application. However, this application can be embodied in many ways other than those described herein, and those skilled in the art can make similar generalizations without departing from its spirit; the application is therefore not limited to the specific embodiments disclosed below.
The terminology used in one or more embodiments of the application is for the purpose of describing particular embodiments only and is not intended to be limiting of one or more embodiments of the application. As used in this application in one or more embodiments and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used in one or more embodiments of the present application refers to and encompasses any or all possible combinations of one or more of the associated listed items.
It should be understood that, although the terms first, second, etc. may be used in one or more embodiments of the present application to describe various information, this information should not be limited by these terms. These terms are only used to distinguish one type of information from another. For example, a first may also be referred to as a second and, similarly, a second may also be referred to as a first, without departing from the scope of one or more embodiments of the present application. The word "if" as used herein may be interpreted as "when", "upon", or "in response to determining", depending on the context.
First, terms related to one or more embodiments of the present application will be explained.
Live broadcast room: the virtual network location at which an anchor presents live video content to viewers.
Live stream: live video data transmitted over a network as a steady, continuous stream so that viewers can watch it.
Code rate (bit rate): the number of bits of data transmitted per unit time during video data transmission. The higher the bit rate, the clearer the picture; the lower the bit rate, the rougher and more mosaic-prone the picture.
Frame rate: a measure of the number of frames displayed, usually expressed as the number of frames displayed per second.
Default definition: the live stream of a higher or lower code rate, i.e. of a particular definition, distributed by default when a user enters the live broadcast room.
Live transcoding: sampling and analyzing the original live stream, then outputting and compressing it into live streams of lower code rates.
Weak network: a characterization of network transmission and reception speed; networks below 2G/3G are generally classed as weak networks, and a wireless network with a weak signal is also commonly called a weak network.
Live broadcast stalling: the phenomenon in which playback is not smooth while the user is watching, such as a black screen or a frozen picture.
In the present application, a live broadcast processing method is provided. One or more embodiments of the present application relate to a live broadcast processing apparatus, a computing device, and a computer-readable storage medium, which are described in detail in the following embodiments.
Referring to fig. 1, fig. 1 shows a flowchart of a live broadcast processing method according to an embodiment of the present application, including the following steps:
step 102, receiving a live broadcasting room entering instruction of a user, wherein the live broadcasting room entering instruction carries identification information of a target live broadcasting room.
The live broadcast processing method is applied at a server, and the server can be communicatively connected with a plurality of clients (user terminals) through the Internet. Among the plurality of clients, some may act as anchor clients that provide online live broadcast rooms, while the remaining clients may act as viewer clients that enter the online live broadcast rooms provided by the anchor clients to watch live content. An anchor client uploads its live content to the server, and the server sends that content to the viewer clients that have entered the anchor's live broadcast room for viewing. A viewer client may also interact with the anchor within the live broadcast room, for example by sending the anchor a gift or sending comment information.
In addition, in the embodiment of the present application, the client may be a media terminal implementing functions such as group chat, live video broadcast, karaoke channels, online games, and online movies and television. The client may run on user devices including, but not limited to, mobile phones, mobile computers, tablets, personal digital assistants (PDAs), media players, smart televisions, smart watches, smart glasses, smart bracelets, and the like.
In practical application, a user can watch live broadcasts through a client such as a personal desktop, a mobile computer, or a smart mobile terminal (such as a smartphone). Specifically, the user can select a target live broadcast room from the live broadcast list of a live broadcast platform, or search for the target live broadcast room through the platform's search page, and then submit a room entry instruction for that target live broadcast room.
After receiving the instruction, the client sends it to the server; after receiving it, the server can transmit a live stream of suitable code rate to the user based on the identification information of the target live broadcast room carried in the instruction.
And 104, acquiring definition monitoring data of at least one data dimension, and determining the target definition of the live stream to be played of the user according to the definition monitoring data.
In this embodiment, when the server pushes a live stream to the user, the play definitions it may provide include, but are not limited to, smooth, standard definition, high definition, super definition, blu-ray, and the like.
At present, a preset definition is mostly adopted: either the definition cannot be switched by the user during the live broadcast, or it must be switched manually by the user. When the network of the user terminal is unstable, stalling, frame loss and similar phenomena occur, giving the user a poor viewing experience.
Therefore, in the embodiment of the present application, after receiving the user's room entry instruction, the server can acquire definition monitoring data of at least one dimension for the user, and determine from that data a target definition that will not cause video stalling when the user terminal plays the live stream.
In specific implementation, the definition monitoring data comprises login state data of a user;
correspondingly, determining the target definition of the live stream to be played for the user according to the definition monitoring data means: if the login state of the user is determined from the login state data to be not logged in, the target definition of the user's live stream to be played is determined to be a first definition.
Further, the definition monitoring data comprises the user's login state data and the terminal state data of the user terminal; correspondingly, if the user's login state is determined from the login state data to be logged in, the target definition of the live stream to be played for the user is determined according to the terminal state data.
Specifically, among the definition monitoring data of at least one data dimension in the embodiments of the present application, one data dimension is the user's login status data. After receiving the user's room entry instruction, the server may obtain the user's login status data on the live broadcast platform and determine from it whether the user is a logged-in user of the platform.
If the user is not logged in, the target definition of the live stream to be played for the user is determined to be the first definition, where the first definition may be high definition. If the user is logged in, portrait analysis is performed on the user and the terminal state data of the user terminal is taken into account, so that a suitable play definition is provided for the user and the fluency of live stream playback at the user terminal is guaranteed.
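The login-state branch described above can be expressed as a small guard function. This is an illustrative sketch only: the string label and the `None` sentinel for "continue with terminal-state analysis" are assumptions, not names from the patent.

```python
def definition_for_login_state(logged_in):
    """Users who are not logged in get the first definition (high definition).

    For logged-in users the decision is deferred to portrait analysis and
    terminal-state data, signalled here by returning None.
    """
    if not logged_in:
        return "high"   # first definition
    return None         # continue with terminal-state analysis
```

A real server would follow the `None` path with the location, network, and motion checks described in the rest of this section.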
In a specific implementation, as described above, if the user is a logged-in user, the target definition needs to be determined according to the terminal state data of the user terminal. In one embodiment provided in the present application, the terminal state data of the user terminal includes the network communication address of the user terminal;
correspondingly, the determining the target definition of the live stream to be played by the user according to the definition monitoring data includes:
determining a first target area to which the user belongs according to the network communication address;
and if the first target area is a non-preset area, determining that the target definition of the live stream to be played by the user is a second definition.
Further, the terminal state data includes positioning data of the user terminal; if the first target area is a predetermined area, a second target area to which the user belongs is determined according to the positioning data, and the area type of the second target area is determined;
if the second target area is of the first area type, determining that the target definition of the live stream to be played by the user is a third definition;
and under the condition that the second target area is of a second area type, determining that the target definition of the live stream to be played by the user is fourth definition.
Specifically, the network communication address may be the IP address of the user terminal; the non-predetermined area may be any area outside the mainland area, and the predetermined area the mainland area; the second definition may be standard definition. After the server obtains the IP address of the user terminal, it may determine from the IP address the first target area in which the user is located. If the IP address shows that the user is outside the mainland area, the data transmission link established between the server and the user terminal is longer, so to guarantee the data transmission rate the target definition may be determined to be standard definition.
In addition, the second target area is the specific location of the user, down to the street and house number; the third definition may be blu-ray and the fourth definition super definition. Therefore, after determining that the user's area is the mainland area, the server may further determine the area type of the user's location according to the positioning data of the user terminal, and determine the target definition according to that area type.
In this embodiment, when the server pushes a live stream to the user, the play definitions it can provide include, but are not limited to, smooth, standard definition, high definition, super definition, blu-ray, and the like. Accordingly, the first definition may be high definition, the second definition standard definition, the third definition blu-ray, and the fourth definition super definition. In practical application, if the set of available definitions changes (for example, higher play definitions become available), it is only required that the second definition be lower than the first definition, the first definition lower than the fourth definition, and the fourth definition lower than the third definition; the actual code rate corresponding to each definition can be determined according to actual conditions and is not limited here.
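The only hard requirement the preceding paragraph places on the four definitions is an ordering: second < first < fourth < third. A sketch with hypothetical numeric ranks (the labels and numbers are illustrative, only the relative order matters) makes the constraint checkable:

```python
# Hypothetical ranks; only the relative order is significant.
DEFINITION_RANK = {
    "standard": 0,   # second definition
    "high": 1,       # first definition
    "super": 2,      # fourth definition
    "blu-ray": 3,    # third definition
}

def ordering_holds(rank):
    # second < first < fourth < third, as required by the description
    return rank["standard"] < rank["high"] < rank["super"] < rank["blu-ray"]
```

Any concrete code-rate assignment satisfying this chain would be consistent with the text.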
In the embodiment of the present application, area types include, but are not limited to, schools, shopping malls, residences, and the like; the positioning data may be GPS (Global Positioning System) positioning data, LBS (Location Based Service, base-station positioning) positioning data, or the like.
If the second target area in which the user is located is determined from the positioning data to be a residential area (the first area type), the target definition is determined to be blu-ray; if it is determined to be a school or shopping-mall area (the second area type), the target definition may be determined to be super definition.
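The IP-region and area-type rules from the last few paragraphs can be combined into one decision function. This is a hedged sketch: the boolean flag and string labels are illustrative stand-ins for the patent's first through fourth definitions and area types.

```python
def definition_from_location(ip_in_mainland, area_type=None):
    """Location-based definition selection as described in the text.

    Outside the predetermined (mainland) area the transmission link is
    longer, so pick standard definition (second). Inside it, positioning
    data refines the choice: a residential area (first area type) gets
    blu-ray (third); a school or mall area (second area type) gets
    super definition (fourth).
    """
    if not ip_in_mainland:
        return "standard"          # second definition
    if area_type == "residential": # first area type
        return "blu-ray"           # third definition
    return "super"                 # second area type (school / mall)
```

In a real system the `ip_in_mainland` flag would come from an IP geolocation lookup and `area_type` from reverse-geocoding the GPS/LBS fix.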
Providing a suitable play definition for the user according to the area type of the user's location helps reduce the number of video stalls at the user terminal, thereby improving the user's viewing experience.
In addition, after the user's target definition has been determined from the user's login state data and the terminal state data of the user terminal, the target definition can be further adjusted in combination with the network state data of the user terminal to determine a new target definition. To this end, the definition monitoring data comprises network state data of the user terminal;
correspondingly, determining the target definition of the live stream to be played for the user according to the definition monitoring data includes: if the network type of the user terminal is determined from the network state data to be a first network type and the network signal strength of the user terminal is greater than a preset signal-strength threshold, determining the target definition of the user's live stream to be played to be the first definition.
Further, the definition monitoring data comprises mobile state data of the user terminal;
Correspondingly, if the network type of the user terminal is determined from the network state data to be a second network type, the target definition of the live stream to be played for the user is determined according to the movement state data.
Specifically, the network types may include wireless networks, mobile data networks, and the like; here the first network type is a wireless network and the second network type is a mobile data network.
Among the definition monitoring data of at least one data dimension, one data dimension is the network state data of the user terminal. After determining the target definition according to the user's login state data and the terminal state data of the user terminal, the server can also re-determine the target definition according to the network state data of the user terminal.
On the basis of the definition already determined, if the network type of the user terminal is determined from the user's network state data to be a wireless network, and the signal strength of that wireless network is greater than or equal to the preset signal-strength threshold, the target definition can be directly determined to be super definition. If instead the network type is determined from the network state data to be a mobile data network, the target definition is further determined according to the movement state data of the user terminal.
Determining the user's network type from the network state data on the basis of the existing definition, and providing a suitable play definition according to that network type, helps reduce the number of video stalls at the user terminal and improve the user's viewing experience.
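The network-type adjustment can be sketched as a single refinement step over the previously chosen definition, following the worked example above (strong Wi-Fi promotes directly to super definition; mobile data defers to motion-state analysis). All parameter names are hypothetical.

```python
def adjust_for_network(current_definition, network_type, signal_dbm, threshold_dbm):
    """Refine an already-determined definition using network state data.

    Wi-Fi at or above the signal threshold -> super definition directly.
    On a mobile data network (or weak Wi-Fi) keep the current choice; the
    text hands that case over to movement-state analysis.
    """
    if network_type == "wifi" and signal_dbm >= threshold_dbm:
        return "super"
    return current_definition  # to be refined via movement state data
```

Signal strength is expressed here in dBm (less negative means stronger), a common convention for Wi-Fi RSSI, though the patent does not specify a unit.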
In a specific implementation, the definition monitoring data comprises movement state data of the user terminal. Acquiring the user's definition monitoring data in at least one data dimension then means: acquiring angular velocity data of the user terminal along the first, second, and third coordinate axes of a spatial coordinate system, collected by a motion sensor in the user terminal; and/or acquiring acceleration data of the user terminal collected by an acceleration sensor in the user terminal; and taking the angular velocity data and/or the acceleration data as the definition monitoring data.
Specifically, when the user is judged to be using a mobile data network, the user's movement state (stationary or non-stationary) can be determined by analyzing the movement state data of the user terminal. The movement state data can be collected by the triaxial gyroscope and the acceleration sensor of the user terminal: the chief function of the triaxial gyroscope is to measure angular velocity so as to judge the motion state of an object, while the acceleration sensor measures the acceleration of the carrier's movement.
A triaxial gyroscope can measure the angular velocity of motion along one or more axes and complements the acceleration sensor; combining the two allows the complete three-dimensional motion of the user terminal to be tracked and captured more accurately.
Therefore, in the embodiment of the present application, the angular velocity data and acceleration data of the user terminal are acquired and combined to determine the terminal's motion state. Specifically, if the angular velocity data and the acceleration data are all zero, the user terminal is determined to be stationary; if any value in the angular velocity data or the acceleration data is non-zero, the user terminal, and hence the user, is determined to be non-stationary.
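The stationarity rule above (every sensor component exactly zero means stationary) can be written directly; the function name and tuple inputs are illustrative.

```python
def is_stationary(angular_velocity, acceleration):
    """Stationarity test as stated in the description.

    The terminal is stationary only when every gyroscope (angular velocity)
    and accelerometer reading is exactly zero; any non-zero component means
    the terminal, and hence the user, is moving.
    """
    return all(v == 0 for v in angular_velocity) and all(a == 0 for a in acceleration)
```

In practice a resting accelerometer still reports gravity and sensor noise, so a real implementation would compare magnitudes against small thresholds rather than exact zero; the exact-zero test simply mirrors the text.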
In implementation, if the user is determined to be in a non-stationary state, the motion type of the user is determined according to the movement state data; if the motion type is a first motion type, the target definition of the live stream to be played for the user is determined to be a third definition.
Specifically, when the user is determined to be in a non-stationary state from the acceleration data and the angular velocity data in the movement state data, the motion type of the user, such as slow walking, fast running, or going up or down stairs, can be further calculated from the same data. The first motion type can be slow walking or going up or down stairs; once the motion type of the user is determined to be the first motion type, the previously determined target definition is adjusted downward by one definition level.
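Assuming the five definition levels named later in this document, the one-level downgrade for a first-type motion might look like this (the ordering, the names, and the standard-definition floor are assumptions):

```python
# Definition levels from highest to lowest; illustrative stand-ins for
# original picture, blue-ray, super, high and standard definition.
LEVELS = ["original", "blue-ray", "super", "high", "standard"]

def downgrade_one_level(current):
    """Adjust the target definition down by one level, with standard
    definition as the floor (assumption: no level exists below standard)."""
    i = LEVELS.index(current)
    return LEVELS[min(i + 1, len(LEVELS) - 1)]

print(downgrade_one_level("blue-ray"))   # super
print(downgrade_one_level("standard"))   # standard (already the lowest level)
```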
The play definition of the live stream on the user terminal is thus continuously adjusted according to the movement state and/or motion type of the user, which guarantees the smoothness of live stream playback on the user terminal and improves the viewing experience of the user.
Further, after the definition level is adjusted downwards, the moving speed of the user can be determined according to the positioning data of the user terminal;
Determining the region type of the second target region to which the user belongs according to the positioning data under the condition that the moving speed is smaller than or equal to a preset threshold value;
if the second target area is of the first area type, determining that the target definition of the live stream to be played by the user is a third definition;
and under the condition that the second target area is of a second area type, determining that the target definition of the live stream to be played by the user is fourth definition.
Alternatively, if the user is determined to be in a non-stationary state according to the movement state data, and the moving speed of the user, determined from the positioning data of the user terminal, is greater than the preset threshold, the target definition of the live stream to be played for the user is determined to be the second definition.
Specifically, as previously mentioned, the zone types include, but are not limited to, school zones, mall zones, or residential zones; the positioning data includes, but is not limited to, GPS positioning data or LBS positioning data, etc.;
when the user is in a non-stationary state, the moving speed of the user, i.e. the distance moved per unit time, is calculated from the positioning data, and the type of the area where the user is located can be inferred from this speed. For example, if the user moves more than 100 meters per unit time, the user is in a high-speed environment, which may be a subway, a train, a highway, or another area with an unstable network environment, and the target definition is directly reduced to standard definition.
If the moving speed of the user is determined from the positioning data to be less than or equal to the preset threshold, the type of the second target area where the user is located can be further determined from the positioning data: if the second target area is a residential area, the target definition can be determined to be blue-ray; if the second target area is a school area or a mall area, the target definition can be determined to be super definition.
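The speed-and-area rule just described can be condensed into one function. The 100-metre figure comes from the example above; the parameter names and the area-type strings are hypothetical:

```python
def definition_by_mobility(distance_per_unit_time, area_type, speed_threshold=100):
    """Pick a target definition for a non-stationary user.

    distance_per_unit_time: movement computed from successive GPS/LBS fixes
    area_type: region type of the second target area
    speed_threshold: preset threshold; the example above uses 100 metres
    """
    if distance_per_unit_time > speed_threshold:
        # High-speed environment (subway, train, highway): unstable network,
        # so reduce directly to standard definition.
        return "standard"
    if area_type == "residential":
        return "blue-ray"
    if area_type in ("school", "mall"):
        return "super"
    raise ValueError(f"unknown area type: {area_type!r}")

print(definition_by_mobility(150, "residential"))  # standard
print(definition_by_mobility(20, "school"))        # super
```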
The target definition is thus determined comprehensively from data such as the moving speed of the user and the type of the area where the user is located, and the play definition of the live stream on the user terminal can be continuously adjusted based on the result, guaranteeing the smoothness of live stream playback on the user terminal and improving the viewing experience of the user.
And 106, acquiring live broadcast configuration information of the target live broadcast room according to the identification information, and determining a target live broadcast stream corresponding to the target definition according to the mapping relation between the live broadcast stream and the definition in the live broadcast configuration information.
Specifically, the server may establish a mapping relationship between live streams and different definitions for the target live broadcast room, and generate a transcoding list of the target live broadcast room according to the mapping relationship. Usually, a target live broadcast room provides live service to users at 5 definitions with different code rates: standard definition (code rate 800), high definition (code rate 1500), super definition (code rate 2500), blue-ray (code rate 4000) and original picture (original code rate). The server can therefore establish 5 live streams for the target live broadcast room and map each of them to one of the 5 definitions so as to generate the transcoding list.
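A minimal sketch of such a transcoding list, using the five code rates given above (the data structure and names are assumptions):

```python
# Transcoding list: definition -> code rate of the mapped live stream.
# None stands for the original picture, which keeps the ingest code rate.
TRANSCODE_LIST = {
    "standard": 800,
    "high": 1500,
    "super": 2500,
    "blue-ray": 4000,
    "original": None,
}

def stream_code_rate(definition):
    """Resolve the target live stream's code rate for a target definition."""
    if definition not in TRANSCODE_LIST:
        raise KeyError(f"no live stream mapped to definition {definition!r}")
    return TRANSCODE_LIST[definition]

print(stream_code_rate("super"))  # 2500
```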
After receiving the live broadcast room entry instruction, the server can query the live broadcast configuration information of the target live broadcast room according to the identification information of the target live broadcast room carried in the instruction. The live broadcast configuration information includes the transcoding list, so the server can determine the target live stream corresponding to the target definition according to the mapping relation between live streams and definitions in the transcoding list, transmit the target live stream to the user terminal, and play the live video for the user through the user terminal.
And step 108, transmitting the target live stream to the user as the live stream to be played.
Specifically, the server may determine the code rate associated with the target definition according to the mapping relation between definitions and live streams in the transcoding list, take that code rate as the target code rate, adjust the current live code rate of the target live broadcast room accordingly, and transmit the adjusted target live stream to the user as the live stream to be played.
In practical application, the server can determine, from the correspondence between the live streams output by different transcoding machines and the definition information, the target live stream output by the target transcoding machine corresponding to the target definition information, where the live streams output by the different transcoding machines are live video streams of different definitions obtained by the different transcoding machines respectively transcoding the target live video.
The target live stream is then acquired and transmitted to the user terminal so that the user terminal plays it. Because the target live streams output by different transcoding machines have different definitions, a live stream whose definition suits the current network state is transmitted to each user terminal according to that terminal's definition monitoring data, which reduces the number of video stalls, improves information acquisition efficiency, and improves user experience.
Finally, the definition monitoring data further include live stream stall data of the user terminal. After the live stream to be played is transmitted to the user terminal, the stall condition of live playback on the user terminal can be detected and the play definition adjusted accordingly, which can be implemented through the following steps:
polling, at a preset time interval, the number of live stream stalls of the user terminal within a preset time period;
judging whether the number of live stream stalls is greater than a preset stall-count threshold;
if the number of stalls is greater than the preset threshold, reducing the definition level of the live stream to be played for the user;
judging whether the current definition level of the user terminal is less than or equal to a preset definition level threshold;
if yes, taking the definition corresponding to the definition level as the target definition of the live stream to be played for the user;
if not, returning to polling the number of live stream stalls of the user terminal within the preset time period at the preset time interval.
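The polling loop above can be sketched as follows. The two callables and all default values are hypothetical stand-ins for the preset values the text leaves open (the 5-minute interval and 5-stall threshold come from the example in this document):

```python
import time

def monitor_stalls(get_stall_count, lower_level, level,
                   interval=300, stall_threshold=5, min_level=0):
    """Poll the terminal's stall count and step the definition level down.

    get_stall_count: returns the number of stalls seen in the last preset period
    lower_level: lowers the definition level by one step and returns the new level
    min_level: the preset definition level threshold (the floor)
    """
    while True:
        if get_stall_count() <= stall_threshold:
            return level            # playback is smooth enough: keep the level
        level = lower_level(level)  # too many stalls: drop one definition level
        if level <= min_level:
            return level            # reached the lowest definition: stop lowering
        time.sleep(interval)        # wait for the next polling window
```

For example, with levels encoded as integers and a stall feed reporting 6, 6 and then 2 stalls per window, the loop settles two levels below where it started.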
Specifically, because video stalls give users a poor viewing experience, after the live stream to be played is transmitted to the user, the definition monitoring data of the user can be obtained continuously, and the play definition of the live video on the user terminal can be adjusted in real time by continuously analyzing those data.
For example, the type of the area where the user is located may change, for example when the user enters a mall area or a school area from a residential area; or the motion state of the user may change, for example from slow walking to riding a high-speed subway, so that the stability of the network signal strength of the user terminal keeps decreasing and the user terminal stalls while the user watches the live broadcast.
The server can capture, in real time, the stall counts reported by the user terminal while it watches the live video, and reduce the definition level of the live stream to be played when the number of stalls within a preset duration is greater than a preset stall-count threshold. For example, if the server receives more than 5 stall reports within 1 minute of live playback at the current definition, it reduces the play definition of the user terminal; it can then continue to poll the stall feedback of the user terminal every 5 minutes, and stops lowering the play definition once it has been reduced to the lowest definition.
In addition, fig. 2 shows a schematic diagram of a live broadcast processing method provided in an embodiment of the present application: the server determines the target definition suitable for client A to play the live video by acquiring data such as the positioning data, network signal strength, moving distance and moving speed of client A of user U; the server can also perform stall monitoring and analysis on client A so as to adjust the play definition of the live video on client A in real time, thereby avoiding video stalls on client A.
Fig. 3a shows a schematic diagram of a live broadcast processing result provided by an embodiment of the application, and fig. 3b shows a schematic diagram of another live broadcast processing result. The result shown in fig. 3a is a live picture on a PC client, and the result shown in fig. 3b is a live picture on a mobile client; the current definition and the transmission line of the live stream can be displayed in the picture, and in fig. 3b the definition and line information is displayed through a second layer located above the first layer.
An embodiment of the application implements a live broadcast processing method that includes: receiving a live broadcast room entry instruction of a user, the instruction carrying identification information of a target live broadcast room; acquiring definition monitoring data of at least one data dimension, and determining the target definition of the live stream to be played for the user according to the definition monitoring data; acquiring live broadcast configuration information of the target live broadcast room according to the identification information, and determining the target live stream corresponding to the target definition according to the mapping relation between live streams and definitions in the live broadcast configuration information; and transmitting the target live stream to the user as the live stream to be played.
In the embodiment of the application, the server side can monitor and acquire the definition monitoring data of at least one dimension of the user, so that a live stream with proper playing definition is provided for the user according to the definition monitoring data, and the blocking of the live stream of the user terminal can be effectively avoided under the condition that the utilization rate of network resources of the user is ensured, thereby being beneficial to improving user experience.
Referring to fig. 4, an example of application of the live broadcast processing method provided in the embodiment of the present application in the live broadcast field is further described. Fig. 4 shows a process flow chart of a live broadcast processing method according to an embodiment of the present application, which specifically includes the following steps:
step 402, receiving a live room entry instruction of a user.
Step 404, obtaining definition monitoring data of at least one data dimension.
Step 406, if it is determined that the login status of the user is logged according to the login status data in the sharpness monitoring data, determining a first target area to which the user belongs according to the network communication address in the sharpness monitoring data.
In step 408, if the first target area is a non-predetermined area, it is determined that the target definition of the live stream to be played by the user is standard definition.
Step 410, if the first target area is a predetermined area, determining a second target area to which the user belongs according to the positioning data, and determining an area type of the second target area.
And step 412, if the second target area is a residential area, determining that the target definition of the live stream to be played by the user is blue light.
And step 414, determining that the target definition of the live stream to be played of the user is super-definition if the second target area is a business district.
Step 416, if it is determined according to the network state data in the definition monitoring data that the network type of the user terminal is a wireless network and the network signal strength of the user terminal is greater than a preset signal strength threshold, determining that the target definition of the live stream to be played by the user is super definition.
Step 418, if the network type of the user terminal is determined to be a data network type according to the network state data, determining the target definition of the live stream to be played by the user according to the movement state data.
Step 420, if the user is determined to be in a non-stationary state according to the movement state data in the definition monitoring data, and the motion type of the user is determined to be the first motion type according to the movement state data, determining that the target definition of the live stream to be played by the user is super definition.
Step 422, if the user terminal is determined to be in a non-stationary state according to the movement state data, determining the moving speed of the user according to the positioning data of the user terminal.
And step 424, if the moving speed is greater than a preset threshold, determining that the target definition of the live stream to be played by the user is standard definition.
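Steps 402 to 424 can be condensed into a single decision sketch. All parameter names and both threshold defaults are hypothetical; where the text leaves a branch unspecified (e.g. a weak wireless signal), the sketch falls back to the area-based rules as an assumption:

```python
def target_definition(logged_in, first_area_is_preset, second_area_type,
                      network_type, signal_strength_dbm, stationary,
                      motion_type, moving_speed,
                      signal_threshold=-70, speed_threshold=100):
    """Condensed decision flow for the target definition (steps 402-424)."""
    if not logged_in:
        return "first"                 # the earlier embodiment assigns "a first
                                       # definition" here; the label is a stand-in
    if network_type == "wifi" and signal_strength_dbm > signal_threshold:
        return "super"                 # step 416: strong wireless signal
    if network_type == "data":         # steps 418-424
        if not stationary:
            if motion_type == "first":
                return "super"         # step 420: slow walk / stairs
            if moving_speed > speed_threshold:
                return "standard"      # steps 422-424: fast movement
    # steps 406-414: fall back to the area-based rules
    if not first_area_is_preset:
        return "standard"              # step 408: non-predetermined area
    if second_area_type == "residential":
        return "blue-ray"              # step 412: residential area
    return "super"                     # step 414: business district
```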
In the embodiment of the application, the server side can monitor and acquire the definition monitoring data of at least one dimension of the user, so that a live stream with proper playing definition is provided for the user according to the definition monitoring data, and the blocking of the live stream of the user terminal can be effectively avoided under the condition that the utilization rate of network resources of the user is ensured, thereby being beneficial to improving user experience.
Corresponding to the method embodiment, the present application further provides a live broadcast processing device embodiment, and fig. 5 shows a schematic structural diagram of a live broadcast processing device according to one embodiment of the present application. As shown in fig. 5, the apparatus includes:
the receiving module 502 is configured to receive a live broadcast room entering instruction of a user, wherein the live broadcast room entering instruction carries identification information of a target live broadcast room;
an obtaining module 504, configured to obtain sharpness monitoring data of at least one data dimension, and determine a target sharpness of the live stream to be played by the user according to the sharpness monitoring data;
A determining module 506, configured to obtain, according to the identification information, live broadcast configuration information of the target live broadcast room, and determine, according to a mapping relationship between a live broadcast stream and sharpness in the live broadcast configuration information, a target live broadcast stream corresponding to the target sharpness;
and the transmission module 508 is configured to transmit the target live stream to the user as the live stream to be played.
Optionally, the sharpness monitoring data includes login status data of the user;
accordingly, the obtaining module 504 includes:
and the first determining submodule is configured to determine that the target definition of the live stream to be played of the user is the first definition if the login state of the user is determined to be not logged according to the login state data.
Optionally, the definition monitoring data includes login status data of a user and terminal status data of a user terminal;
accordingly, the obtaining module 504 includes:
and the second determining submodule is configured to determine the target definition of the live stream to be played of the user according to the terminal state data if the login state of the user is determined to be logged according to the login state data.
Optionally, the terminal status data includes a network communication address of the user terminal;
accordingly, the obtaining module 504 includes:
a third determining submodule configured to determine a first target area to which the user belongs according to the network communication address;
and the fourth determining submodule is configured to determine that the target definition of the live stream to be played by the user is the second definition if the first target area is a non-preset area.
Optionally, the terminal status data includes positioning data of the user terminal, and the acquiring module 504 further includes:
a fifth determining submodule configured to determine a second target area to which the user belongs according to the positioning data and determine an area type of the second target area if the first target area is a predetermined area;
a sixth determining submodule, configured to determine that a target definition of the live stream to be played by the user is a third definition if the second target area is of the first area type;
and a seventh determining submodule, configured to determine that the target definition of the live stream to be played by the user is a fourth definition if the second target area is of the second area type.
Optionally, the sharpness monitoring data comprises network status data of the user terminal;
accordingly, the obtaining module 504 includes:
and the eighth determining submodule is configured to determine that the target definition of the live stream to be played by the user is the first definition if the network type of the user terminal is determined to be the first network type according to the network state data and the network signal strength of the user terminal is greater than a preset signal strength threshold value.
Optionally, the sharpness monitoring data includes movement status data of the user terminal, and the obtaining module 504 further includes:
and a ninth determining submodule, configured to determine the target definition of the live stream to be played by the user according to the movement state data if the network type of the user terminal is determined to be the second network type according to the network state data.
Optionally, the obtaining module 504 includes:
and the tenth determination submodule is configured to determine that the target definition of the live stream to be played by the user is a third definition if the user is determined to be in a non-stationary state according to the moving state data and the motion type of the user is determined to be a first motion type according to the moving state data.
Optionally, the obtaining module 504 further includes:
a first movement speed determination sub-module configured to determine a movement speed of the user according to positioning data of the user terminal;
the first region type determining submodule is configured to determine the region type of the second target region to which the user belongs according to the positioning data under the condition that the moving speed is smaller than or equal to a preset threshold value;
the first definition determining submodule is configured to determine that the target definition of the live stream to be played of the user is third definition when the second target area is of the first area type;
and the second definition determining submodule is configured to determine that the target definition of the live stream to be played by the user is fourth definition when the second target area is of a second area type.
Optionally, the obtaining module 504 further includes:
and the third definition determining submodule is configured to determine that the target definition of the live stream to be played by the user is the second definition if the user is determined to be in a non-stationary state according to the moving state data and the moving speed of the user is determined to be greater than a preset threshold according to the positioning data of the user terminal.
Optionally, the obtaining module 504 includes:
the acquisition sub-module is configured to acquire angular velocity data of the user terminal about the first, second and third coordinate axes of a spatial coordinate system, acquired by a motion sensor in the user terminal, and/or acquire acceleration data of the user terminal acquired by an acceleration sensor in the user terminal;
the processing sub-module is configured to take the angular velocity data and/or the acceleration data as the definition monitoring data.
Optionally, the definition monitoring data include live stream stall data of the user terminal, and the live broadcast processing device further includes:
the detection module, configured to poll, at a preset time interval, the number of live stream stalls of the user terminal within a preset time period;
the first judging module, configured to judge whether the number of live stream stalls is greater than a preset stall-count threshold;
if the judgment result of the first judging module is yes, the level reducing module is run;
the level reducing module is configured to reduce the definition level of the live stream to be played by the user;
the second judging module is configured to judge whether the current definition level of the user terminal is less than or equal to a preset definition level threshold;
if the judgment result of the second judging module is yes, the definition corresponding to the current definition level is taken as the target definition of the live stream to be played by the user;
if not, the detection module is run again.
The foregoing is a schematic scheme of a live broadcast processing apparatus of this embodiment. It should be noted that, the technical solution of the live broadcast processing apparatus and the technical solution of the live broadcast processing method belong to the same concept, and details of the technical solution of the live broadcast processing apparatus, which are not described in detail, can be referred to the description of the technical solution of the live broadcast processing method.
Fig. 6 illustrates a block diagram of a computing device 600 provided in accordance with one embodiment of the present application. The components of computing device 600 include, but are not limited to, memory 610 and processor 620. The processor 620 is coupled to the memory 610 via a bus 630 and a database 650 is used to hold data.
Computing device 600 also includes access device 640, which enables computing device 600 to communicate via one or more networks 660. Examples of such networks include the Public Switched Telephone Network (PSTN), a Local Area Network (LAN), a Wide Area Network (WAN), a Personal Area Network (PAN), or a combination of communication networks such as the Internet. The access device 640 may include one or more of any type of network interface (e.g., a Network Interface Card (NIC)), whether wired or wireless, such as an IEEE 802.11 Wireless Local Area Network (WLAN) interface, a Worldwide Interoperability for Microwave Access (WiMAX) interface, an Ethernet interface, a Universal Serial Bus (USB) interface, a cellular network interface, a Bluetooth interface, a Near Field Communication (NFC) interface, and so forth.
In one embodiment of the present application, the above-described components of computing device 600, as well as other components not shown in FIG. 6, may also be connected to each other, such as by a bus. It should be understood that the block diagram of the computing device illustrated in FIG. 6 is for exemplary purposes only and is not intended to limit the scope of the present application. Those skilled in the art may add or replace other components as desired.
Computing device 600 may be any type of stationary or mobile computing device, including a mobile computer or mobile computing device (e.g., tablet, personal digital assistant, laptop, notebook, netbook, etc.), mobile phone (e.g., smart phone), wearable computing device (e.g., smart watch, smart glasses, etc.), or other type of mobile device, or a stationary computing device such as a desktop computer or PC. Computing device 600 may also be a mobile or stationary server.
Wherein the processor 620 is configured to execute computer-executable instructions for performing steps of the live processing method when the processor executes the computer-executable instructions.
The foregoing is a schematic illustration of a computing device of this embodiment. It should be noted that, the technical solution of the computing device and the technical solution of the live broadcast processing method belong to the same concept, and details of the technical solution of the computing device, which are not described in detail, can be referred to the description of the technical solution of the live broadcast processing method.
An embodiment of the present application also provides a computer-readable storage medium storing computer-executable instructions that, when executed by a processor, implement the steps of the live processing method.
The above is an exemplary version of a computer-readable storage medium of the present embodiment. It should be noted that, the technical solution of the storage medium and the technical solution of the live broadcast processing method belong to the same concept, and details of the technical solution of the storage medium which are not described in detail can be referred to the description of the technical solution of the live broadcast processing method.
The foregoing describes specific embodiments of the present application. Other embodiments are within the scope of the following claims. In some cases, the actions or steps recited in the claims can be performed in a different order than in the embodiments and still achieve desirable results. In addition, the processes depicted in the accompanying figures do not necessarily require the particular order shown, or sequential order, to achieve desirable results. In some embodiments, multitasking and parallel processing are also possible or may be advantageous.
The computer instructions include computer program code, which may be in source code form, object code form, an executable file, or some intermediate form, etc. The computer readable medium may include: any entity or device capable of carrying the computer program code, a recording medium, a USB flash drive, a removable hard disk, a magnetic disk, an optical disk, a computer memory, a read-only memory (ROM), a random access memory (RAM), an electrical carrier signal, a telecommunications signal, a software distribution medium, and so forth. It should be noted that the content contained in the computer readable medium can be adjusted according to the requirements of legislation and patent practice in a given jurisdiction; for example, in some jurisdictions, according to legislation and patent practice, the computer readable medium does not include electrical carrier signals and telecommunications signals.
It should be noted that, for simplicity of description, the foregoing method embodiments are all expressed as a series of combinations of actions, but it should be understood by those skilled in the art that the embodiments are not limited by the order of actions described, as some steps may take other order or occur simultaneously in accordance with the embodiments. Further, those skilled in the art will also appreciate that the embodiments described in the specification are all preferred embodiments, and that the acts and modules referred to are not necessarily all required for the embodiments of the present application.
In the foregoing embodiments, the descriptions of the embodiments are emphasized, and for parts of one embodiment that are not described in detail, reference may be made to the related descriptions of other embodiments.
The above-disclosed preferred embodiments of the present application are provided only as an aid to the elucidation of the present application. Alternative embodiments are not intended to be exhaustive or to limit the invention to the precise form disclosed. Obviously, many modifications and variations are possible in light of the teachings of the embodiments of the present application. These embodiments were chosen and described in order to best explain the principles of the embodiments and the practical application, to thereby enable others skilled in the art to best understand and utilize the invention. This application is to be limited only by the claims and the full scope and equivalents thereof.

Claims (12)

1. A live broadcast processing method, comprising:
receiving a live broadcast room entry instruction from a user, wherein the entry instruction carries identification information of a target live broadcast room;
acquiring definition monitoring data of at least two data dimensions, wherein the definition monitoring data comprises at least login state data of the user and terminal state data of a user terminal, and the terminal state data comprises a network communication address and positioning data of the user terminal;
if the login state of the user is determined to be logged in according to the login state data, determining a first target area to which the user belongs according to the network communication address, wherein the network communication address is an IP address of the user terminal;
if the first target area is a non-preset area, determining that the target definition of the live stream to be played for the user is a second definition;
if the first target area is a preset area, determining a second target area to which the user belongs according to the positioning data, determining an area type of the second target area, and determining the target definition according to the area type of the second target area, wherein the second target area is the specific position where the user is located;
acquiring live broadcast configuration information of the target live broadcast room according to the identification information, and determining a target live stream corresponding to the target definition according to a mapping relationship between live streams and definitions in the live broadcast configuration information;
and transmitting the target live stream to the user as the live stream to be played.
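The branching in claim 1 can be summarized as a minimal sketch. The region lookups, area names, and definition labels below are hypothetical stand-ins, not part of the patent; a real implementation would plug in its own geo-IP and positioning services.

```python
# Sketch of the region-based definition selection in claim 1.
# PRESET_AREAS, the lookup callables, and the definition labels
# are all assumptions made for illustration.

PRESET_AREAS = {"area_a", "area_b"}  # predefined preset areas (assumption)

def select_target_definition(logged_in, ip_address, positioning,
                             area_from_ip, area_type_from_position):
    """Return a definition label for the live stream to be played."""
    if not logged_in:
        return None  # claim 1 covers only the logged-in branch
    first_target_area = area_from_ip(ip_address)   # coarse, IP-based area
    if first_target_area not in PRESET_AREAS:
        return "second_definition"                 # non-preset area
    # Preset area: refine with the terminal's positioning data.
    area_type = area_type_from_position(positioning)
    if area_type == "residential":                 # first area type
        return "third_definition"
    return "fourth_definition"                     # school/market area type
```

With stub lookups (e.g. `lambda ip: "area_a"`), the function walks exactly the IP-first, positioning-second decision order recited in the claim.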
2. The live broadcast processing method according to claim 1, wherein the determining the target definition according to the area type of the second target area comprises:
if the second target area is of a first area type, determining that the target definition of the live stream to be played for the user is a third definition;
if the second target area is of a second area type, determining that the target definition of the live stream to be played for the user is a fourth definition,
wherein the second definition is lower than the fourth definition, the fourth definition is lower than the third definition, the preset area is a predefined area, the non-preset area is any area other than the preset area, the first area type is a residential area type, and the second area type is a school area type or a market area type.
3. The live broadcast processing method according to claim 1, wherein the definition monitoring data comprises network state data of the user terminal;
correspondingly, the determining the target definition of the live stream to be played for the user according to the definition monitoring data comprises:
after determining the target definition of the user according to the login state data of the user and the terminal state data of the user terminal, if the network type of the user terminal is determined to be a first network type according to the network state data and the network signal strength of the user terminal is greater than a preset signal strength threshold, determining that the target definition of the live stream to be played for the user is a first definition.
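The override in claim 3 can be sketched as a post-processing step. Treating the "first network type" as Wi-Fi and using a dBm threshold are assumptions for illustration only; the claim does not name either.

```python
# Sketch of the network-state override in claim 3: once a region-based
# definition has been chosen, a strong connection on a first-type network
# (assumed here to be Wi-Fi) upgrades the stream to the first definition.

SIGNAL_STRENGTH_THRESHOLD_DBM = -65  # hypothetical preset threshold

def apply_network_override(region_based_definition, network_type,
                           signal_strength_dbm):
    if (network_type == "wifi"
            and signal_strength_dbm > SIGNAL_STRENGTH_THRESHOLD_DBM):
        return "first_definition"   # strong first-type network
    return region_based_definition  # otherwise keep the earlier decision
```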
4. The live broadcast processing method according to claim 3, wherein the definition monitoring data comprises movement state data of the user terminal, and the method further comprises:
if the network type of the user terminal is determined to be a second network type according to the network state data, determining the target definition of the live stream to be played for the user according to the movement state data.
5. The live broadcast processing method according to claim 4, wherein the determining the target definition of the live stream to be played for the user according to the movement state data comprises:
if the user is determined to be in a non-static state according to the movement state data and the motion type of the user is determined to be a first motion type according to the movement state data, determining that the target definition of the live stream to be played for the user is the third definition.
6. The live broadcast processing method according to claim 5, further comprising:
determining the moving speed of the user according to the positioning data of the user terminal;
determining the area type of the second target area to which the user belongs according to the positioning data when the moving speed is less than or equal to a preset threshold;
if the second target area is of the first area type, determining that the target definition of the live stream to be played for the user is the third definition;
and if the second target area is of the second area type, determining that the target definition of the live stream to be played for the user is the fourth definition.
7. The live broadcast processing method according to claim 5 or 6, further comprising:
if the user is determined to be in a non-static state according to the movement state data and the moving speed of the user is determined to be greater than the preset threshold according to the positioning data of the user terminal, determining that the target definition of the live stream to be played for the user is the second definition.
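Claims 4 through 7 together describe a movement-driven choice on a second-type network. The sketch below combines them; the "cellular" reading of the second network type, the speed threshold, and the motion-type label are all hypothetical.

```python
# Sketch of claims 4-7: on a second-type network (assumed cellular),
# the terminal's movement state selects the definition.

SPEED_THRESHOLD_MPS = 2.0  # hypothetical preset moving-speed threshold

def definition_from_movement(is_static, motion_type, speed_mps, area_type):
    if not is_static and speed_mps > SPEED_THRESHOLD_MPS:
        return "second_definition"       # fast movement (claim 7)
    if not is_static and motion_type == "first_motion_type":
        return "third_definition"        # recognized motion type (claim 5)
    # Slow or static: fall back to the area type (claim 6).
    if area_type == "residential":
        return "third_definition"        # first area type
    return "fourth_definition"           # second area type
```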
8. The live broadcast processing method according to claim 4 or 5, wherein the acquiring definition monitoring data of the user in at least one data dimension comprises:
acquiring angular velocity data of the user terminal on a first coordinate axis, a second coordinate axis and a third coordinate axis of a spatial coordinate system, the angular velocity data being collected by a motion sensor in the user terminal; and/or acquiring acceleration data of the user terminal collected by an acceleration sensor in the user terminal;
and taking the angular velocity data and/or the acceleration data as the definition monitoring data.
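The sensor acquisition in claim 8 amounts to bundling gyroscope and/or accelerometer samples into the monitoring data. The field names and the three-axis tuple convention below are hypothetical; a real client would read these values from the platform's sensor APIs.

```python
# Sketch of claim 8: packaging motion-sensor readings as definition
# monitoring data. Either source may be absent, matching the "and/or".

def build_sensor_monitoring_data(angular_velocity_xyz=None,
                                 acceleration_xyz=None):
    """Collect gyroscope and/or accelerometer samples into one record."""
    data = {}
    if angular_velocity_xyz is not None:
        # angular velocity about the three spatial coordinate axes
        data["angular_velocity"] = tuple(angular_velocity_xyz)
    if acceleration_xyz is not None:
        data["acceleration"] = tuple(acceleration_xyz)
    return data
```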
9. The live broadcast processing method according to claim 1 or 2, wherein the definition monitoring data comprises live stream stutter data of the user terminal, and the method further comprises:
after the live stream to be played is transmitted to the user terminal, polling, at a preset time interval, the number of live stream stutters of the user terminal within a preset time period;
judging whether the number of live stream stutters is greater than a preset stutter count threshold;
if the number of stutters is greater than the preset threshold, lowering the definition level of the live stream to be played for the user;
judging whether the current definition level of the user terminal is less than or equal to a preset definition level threshold;
if so, taking the definition corresponding to the definition level as the target definition of the live stream to be played for the user;
if not, returning to the step of polling, at the preset time interval, the number of live stream stutters of the user terminal within the preset time period.
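The polling loop of claim 9 can be sketched as follows. The sleep between polls is elided, `count_stutters` stands in for client-reported playback statistics, both thresholds are hypothetical presets, and a `max_polls` bound is added purely so the sketch terminates.

```python
# Sketch of the stutter-driven downgrade loop in claim 9: each polling
# window with too many stutters drops the definition one level, until
# the level reaches the preset floor.

STUTTER_COUNT_THRESHOLD = 3  # stutters allowed per polling window
DEFINITION_LEVEL_FLOOR = 1   # preset definition level threshold

def downgrade_until_floor(current_level, count_stutters, max_polls=100):
    """Lower the definition level while stuttering persists; return the
    level whose definition becomes the target definition."""
    for _ in range(max_polls):   # bound the polling loop for the sketch
        if count_stutters() > STUTTER_COUNT_THRESHOLD:
            current_level -= 1   # drop one definition level
        if current_level <= DEFINITION_LEVEL_FLOOR:
            return current_level
    return current_level         # stop monitoring after max_polls windows
```

For example, starting at level 3 with two stutter-heavy windows in a row, the loop reaches the floor level 1 and returns it as the target.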
10. A live broadcast processing apparatus, comprising:
a receiving module configured to receive a live broadcast room entry instruction from a user, wherein the entry instruction carries identification information of a target live broadcast room;
an acquisition module configured to acquire definition monitoring data of at least two data dimensions, wherein the definition monitoring data comprises at least login state data of the user and terminal state data of a user terminal, and the terminal state data comprises a network communication address and positioning data of the user terminal,
wherein if the login state of the user is determined to be logged in according to the login state data, a first target area to which the user belongs is determined according to the network communication address, the network communication address being an IP address of the user terminal; if the first target area is a non-preset area, the target definition of the live stream to be played for the user is determined to be a second definition; and if the first target area is a preset area, a second target area to which the user belongs is determined according to the positioning data, an area type of the second target area is determined, and the target definition is determined according to the area type of the second target area, the second target area being the specific position where the user is located;
a determining module configured to acquire live broadcast configuration information of the target live broadcast room according to the identification information, and to determine a target live stream corresponding to the target definition according to a mapping relationship between live streams and definitions in the live broadcast configuration information;
and a transmission module configured to transmit the target live stream to the user as the live stream to be played.
11. A computing device, comprising:
a memory and a processor;
the memory is configured to store computer-executable instructions and the processor is configured to execute the computer-executable instructions, wherein the processor, when executing the computer-executable instructions, implements the steps of the live broadcast processing method according to any one of claims 1 to 9.
12. A computer-readable storage medium, characterized in that it stores computer instructions which, when executed by a processor, implement the steps of the live broadcast processing method according to any one of claims 1 to 9.
CN202110243553.XA 2021-03-05 2021-03-05 Live broadcast processing method and device Active CN115037951B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110243553.XA CN115037951B (en) 2021-03-05 2021-03-05 Live broadcast processing method and device


Publications (2)

Publication Number Publication Date
CN115037951A (en) 2022-09-09
CN115037951B (en) 2024-03-12

Family

ID=83118294

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110243553.XA Active CN115037951B (en) 2021-03-05 2021-03-05 Live broadcast processing method and device

Country Status (1)

Country Link
CN (1) CN115037951B (en)

Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102377730A (en) * 2010-08-11 2012-03-14 中国电信股份有限公司 Audio/video signal processing method and mobile terminal
CN105872604A (en) * 2016-06-15 2016-08-17 武汉斗鱼网络科技有限公司 Live broadcast video stream pushing method based on different user states and live broadcast video stream pushing system based on different user states
CN106162233A (en) * 2016-07-08 2016-11-23 合网络技术(北京)有限公司 Code check recommends method and device
CN106454404A (en) * 2016-09-29 2017-02-22 广州华多网络科技有限公司 Live video playing method, device and system
WO2017101311A1 (en) * 2015-12-15 2017-06-22 乐视控股(北京)有限公司 Method, client and server for realising switching from live broadcasting to on-demand broadcasting
CN107071536A (en) * 2017-03-29 2017-08-18 武汉斗鱼网络科技有限公司 User's switching definition loads the method and system of video flowing
CN108848414A (en) * 2018-06-26 2018-11-20 曜宇航空科技(上海)有限公司 The switching method and player of a kind of playback method of video, clarity
CN110213619A (en) * 2018-02-28 2019-09-06 优酷网络技术(北京)有限公司 Bandwidth allocation methods and device
CN110602425A (en) * 2018-06-13 2019-12-20 上海哔哩哔哩科技有限公司 Video definition adjusting method, system, computer readable storage medium and terminal
CN110971936A (en) * 2019-12-06 2020-04-07 中车青岛四方车辆研究所有限公司 Video data processing method, server and video receiving end
CN111586431A (en) * 2020-06-05 2020-08-25 广州酷狗计算机科技有限公司 Method, device and equipment for live broadcast processing and storage medium
CN111601118A (en) * 2020-05-13 2020-08-28 广州市百果园信息技术有限公司 Live video processing method, system, device and terminal

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3022921A4 (en) * 2013-07-14 2017-03-15 Sharp Kabushiki Kaisha Signaling indications and constraints


Also Published As

Publication number Publication date
CN115037951A (en) 2022-09-09

Similar Documents

Publication Publication Date Title
US10469820B2 (en) Streaming volumetric video for six degrees of freedom virtual reality
US10474717B2 (en) Live video streaming services with machine-learning based highlight replays
Bao et al. Motion-prediction-based multicast for 360-degree video transmissions
US9685195B2 (en) Geographical location information/signal quality-context based recording and playback of multimedia data from a conference session
US10305613B2 (en) Method and system for detecting image delay
CN102752667B (en) Multi-stream media live broadcast interaction system and live broadcast interaction method
CN113068052B (en) Method for determining brushing amount of live broadcast room, live broadcast method and data processing method
CN108093267B (en) Live broadcast method and device, storage medium and electronic equipment
US20130133000A1 (en) Video Interaction System
CN111683273A (en) Method and device for determining video blockage information
US11089073B2 (en) Method and device for sharing multimedia content
CN112995776B (en) Method, device, equipment and storage medium for determining screen capture frame rate of shared screen content
US11641498B2 (en) Method, systems and devices for providing adjusted video content according to viewing distance
WO2015085873A1 (en) Video code stream obtaining method and apparatus
CN115037951B (en) Live broadcast processing method and device
JP6530820B2 (en) Multimedia information reproducing method and system, collecting device, standard server
CN106604085A (en) Video sharing method and video sharing device
US20220239920A1 (en) Video processing method, related apparatus, storage medium, and program product
CN113840157B (en) Access detection method, system and device
CN116980392A (en) Media stream processing method, device, computer equipment and storage medium
JP7431514B2 (en) Method and system for measuring quality of video call service in real time
US20200250708A1 (en) Method and system for providing recommended digital content item to electronic device
CN113573004A (en) Video conference processing method and device, computer equipment and storage medium
CN114945097B (en) Video stream processing method and device
CN117014696A (en) RTP-based audio and video transmission method and device, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant