CN111970527A - Live broadcast data processing method and device - Google Patents


Info

Publication number
CN111970527A
CN111970527A (application CN202010833459.5A); granted publication CN111970527B
Authority
CN
China
Prior art keywords
live
live broadcast
target
image
texture
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202010833459.5A
Other languages
Chinese (zh)
Other versions
CN111970527B (en)
Inventor
李武军
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangzhou Huya Technology Co Ltd
Original Assignee
Guangzhou Huya Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangzhou Huya Technology Co Ltd filed Critical Guangzhou Huya Technology Co Ltd
Priority to CN202010833459.5A priority Critical patent/CN111970527B/en
Publication of CN111970527A publication Critical patent/CN111970527A/en
Application granted granted Critical
Publication of CN111970527B publication Critical patent/CN111970527B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/21Server components or server architectures
    • H04N21/218Source of audio or video content, e.g. local disk arrays
    • H04N21/2187Live feed
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/21Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/214Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • G06T5/50Image enhancement or restoration by the use of more than one image, e.g. averaging, subtraction
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/40Analysis of texture
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/431Generation of visual interfaces for content selection or interaction; Content or additional data rendering
    • H04N21/4312Generation of visual interfaces for content selection or interaction; Content or additional data rendering involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/44Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream, rendering scenes according to MPEG-4 scene graphs
    • H04N21/44016Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream, rendering scenes according to MPEG-4 scene graphs involving splicing one content stream with another content stream, e.g. for substituting a video clip
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/80Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N21/81Monomedia components thereof
    • H04N21/8146Monomedia components thereof involving graphical data, e.g. 3D object, 2D graphics
    • H04N21/8153Monomedia components thereof involving graphical data, e.g. 3D object, 2D graphics comprising still images, e.g. texture, background image

Landscapes

  • Engineering & Computer Science (AREA)
  • Signal Processing (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Evolutionary Computation (AREA)
  • General Engineering & Computer Science (AREA)
  • Evolutionary Biology (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Databases & Information Systems (AREA)
  • Artificial Intelligence (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Computer Graphics (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)

Abstract

The application provides a live broadcast data processing method and device, relating to the technical field of live broadcast. A live broadcast receiving end performs texture extraction on a live broadcast background image selected by a user to obtain a live broadcast background texture, extracts an object texture of a target object in a received live broadcast image, blends the object texture with the live broadcast background texture via a player SDK to generate a target live broadcast image, and displays the target live broadcast image. In this way, the user can configure the live broadcast background individually at the live broadcast receiving end, which improves the flexibility of live broadcast background switching.

Description

Live broadcast data processing method and device
Technical Field
The application relates to the technical field of live broadcast, in particular to a live broadcast data processing method and device.
Background
In scenes such as network live broadcast, the background of the live picture can be replaced, so that viewers can watch the anchor's live content against a number of different scenes, improving the viewing experience.
However, existing background replacement schemes generally cannot satisfy the viewer-side requirement of replacing the background autonomously, so replacement flexibility is poor.
Disclosure of Invention
The application aims to provide a live broadcast data processing method and device, which can improve the flexibility of live broadcast background switching.
To achieve the above purpose, the technical solutions adopted by the application are as follows:
in a first aspect, the present application provides a live data processing method, applied to a live receiving end, where the method includes:
carrying out texture extraction on a live background image selected by a user to obtain live background textures;
extracting object textures of target objects in the received live broadcast images;
and blending the object texture with the live background texture via a player SDK to generate a target live image, and displaying the target live image.
In a second aspect, the present application provides a live data processing apparatus, which is applied to a live receiving end, the apparatus includes:
the processing module is used for extracting textures of the live background image selected by the user to obtain live background textures;
the processing module is further used for extracting object textures of target objects in the received live broadcast images;
and the replacing module is used for blending the object texture with the live background texture via the player SDK to generate a target live image, and displaying the target live image.
In a third aspect, the present application provides a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the live data processing method described above.
In a fourth aspect, the present application provides a live broadcast receiving end, including a memory for storing one or more programs; a processor; the one or more programs, when executed by the processor, implement the live data processing method described above.
In a fifth aspect, the present application provides a live broadcast system, where the live broadcast system includes a server and the above-mentioned live broadcast receiving terminal, and the server establishes communication with the live broadcast receiving terminal.
According to the live broadcast data processing method and device, the live broadcast receiving end performs texture extraction on a live broadcast background image selected by a user to obtain a live broadcast background texture, and extracts an object texture of a target object in the received live broadcast image, so that the object texture is blended with the live broadcast background texture via the player SDK to generate a target live broadcast image, and the target live broadcast image is displayed. In this way, the user can configure the live background individually at the live receiving end, improving the flexibility of live background switching.
In order to make the aforementioned objects, features and advantages of the present application more comprehensible, preferred embodiments accompanied with figures are described in detail below.
Drawings
In order to more clearly explain the technical solutions of the present application, the drawings needed for the embodiments will be briefly described below, it should be understood that the following drawings only illustrate some embodiments of the present application and therefore should not be considered as limiting the scope, and that those skilled in the art can also derive other related drawings from these drawings without inventive effort.
Fig. 1 is a schematic view illustrating an interactive scene of a live broadcast system provided in the present application;
fig. 2 shows a schematic structural block diagram of a live broadcast receiving end provided in the present application;
fig. 3 shows a schematic flow chart of a live data processing method provided by the present application;
FIG. 4 shows an alternative process diagram of a live background;
fig. 5 shows a schematic structural diagram of a live data processing apparatus provided in the present application.
In the figure: 10-a live broadcast system; 100-a live broadcast receiving end; 101-a memory; 102-a processor; 103-a memory controller; 104-peripheral interfaces; 105-a radio frequency unit; 106-communication bus/signal line; 107-a display unit; 200-a server; 300-a live broadcast initiating terminal; 500-a live data processing device; 501-a processing module; 502-replacement module.
Detailed Description
To make the purpose, technical solutions and advantages of the present application clearer, the technical solutions in the present application will be clearly and completely described below with reference to the accompanying drawings in some embodiments of the present application, and it is obvious that the described embodiments are some, but not all embodiments of the present application. The components of the present application, as generally described and illustrated in the figures herein, may be arranged and designed in a wide variety of different configurations.
Thus, the following detailed description of the embodiments of the present application, as presented in the figures, is not intended to limit the scope of the claimed application, but is merely representative of selected embodiments of the application. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present application without creative effort fall within the protection scope of the present application.
It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, it need not be further defined and explained in subsequent figures. Meanwhile, in the description of the present application, the terms "first", "second", and the like are used only for distinguishing the description, and are not to be construed as indicating or implying relative importance.
It is noted that, herein, relational terms such as first and second may be used solely to distinguish one entity or action from another, without necessarily requiring or implying any actual relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may include other elements not expressly listed or inherent to it. Without further limitation, an element introduced by the phrase "comprising a …" does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.
Some embodiments of the present application will be described in detail below with reference to the accompanying drawings. The embodiments described below and the features of the embodiments can be combined with each other without conflict.
Referring to fig. 1, fig. 1 is a schematic view illustrating an interactive scene of a live broadcast system 10 provided in the present application. In some embodiments, the live system 10 may include a live initiator 300, a server 200, and a live receiver 100, where the live initiator 300 and the live receiver 100 both establish communication with the server 200, and the server 200 may provide live services for the live initiator 300 and the live receiver 100.
It is understood that the live system 10 shown in fig. 1 is only an example, and in some other possible embodiments of the present application, the live system 10 may also include only one of the components shown in fig. 1 or may also include other components; such as only the server 200 and the live initiator 300, or only the server 200 and the live receiver 100.
In some possible embodiments, the live initiator 300 and the live receiver 100 may be, but are not limited to, a smart phone, a personal digital assistant, a personal computer, a tablet computer, a notebook computer, a virtual reality terminal device, an augmented reality terminal device, and the like. The live broadcast initiator 300 and the live broadcast receiver 100 may have internet products installed therein for providing live internet services; for example, such internet products may be applications (APPs), Web pages, or applets related to internet live services and used on a computer or a smart phone.
In addition, the live broadcast initiator 300 and the live broadcast receiver 100 may be the same device, for example, both use smart phones, or may be different devices, for example, the live broadcast initiator 300 may be a personal computer, and the live broadcast receiver 100 may be a smart phone.
In some possible embodiments, the server 200 may be a single physical server or a server group consisting of a plurality of physical servers configured to perform different data processing functions. The set of servers can be centralized or distributed (e.g., the servers can be a distributed system). In some possible embodiments, for a single physical server, different logical servers may be assigned to the physical server based on different live service functions.
When network live broadcasting is performed based on the live broadcast system shown in fig. 1, one existing scheme for replacing the background in a live picture works as follows: when the anchor at the live broadcast initiating terminal goes live, the initiating terminal can, through personalized configuration, replace the live background of each frame in the generated video code stream with the configured background, and send the resulting video code stream to the server, which forwards it to the live broadcast receiving terminal; viewers at the receiving terminal then watch the webcast under the replaced live background.
However, in the above live background replacement scheme, a user at a live receiving end cannot configure the live background at will, but only can view the live background replaced by the live initiating end, which cannot meet different viewing requirements of different viewers, and the background replacement flexibility is poor.
Therefore, one possible implementation manner provided by the present application is: the live broadcast receiving end performs texture extraction on the live background image selected by the user to obtain a live background texture, extracts the object texture of the target object in the received live image, blends the object texture with the live background texture via the player SDK to generate a target live image, and displays the target live image. In this way, the flexibility of live background switching can be improved.
Referring to fig. 2, fig. 2 shows a schematic block diagram of a live broadcast receiving end 100 provided in the present application, and in some embodiments, the live broadcast receiving end 100 may include a memory 101, one or more processors (only one of which is shown in the figure) 102, a storage controller 103, a peripheral interface 104, a radio frequency unit 105, a display unit 107, and the like. These components communicate with each other via one or more communication buses/signal lines 106.
The memory 101 may be configured to store a software program and a module, such as a program instruction/module corresponding to the live data processing apparatus provided in the present application, and the processor 102 executes various functional applications, image processing, and the like by running the software program and the module stored in the memory 101, so as to implement the live data processing method provided in the present application.
The Memory 101 may be, but is not limited to, a Random Access Memory (RAM), a Read Only Memory (ROM), a Programmable Read-Only Memory (PROM), an Erasable Programmable Read-Only Memory (EPROM), an Electrically Erasable Programmable Read-Only Memory (EEPROM), and the like.
The processor 102 may be an integrated circuit chip having signal processing capabilities. The Processor 102 may be a general-purpose Processor, and includes a Central Processing Unit (CPU), a Network Processor (NP), a voice Processor, a video Processor, and the like; but may also be a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other Programmable logic device, discrete Gate or transistor logic, discrete hardware components, etc. The methods, steps, and logic blocks disclosed in some embodiments of the present application may be implemented or performed. A general purpose processor may be a microprocessor or the processor 102 may be any conventional processor or the like.
The peripheral interface 104 may couple various input/output devices to the processor 102 as well as to the memory 101. In some embodiments, the peripheral interface 104, the processor 102, and the memory controller 103 may be implemented in a single chip. In other embodiments of the present application, they may be implemented by separate chips.
The rf unit 105 may be configured to receive and transmit electromagnetic waves, and perform interconversion between the electromagnetic waves and electrical signals, so as to communicate with a communication network or other devices, such as a server of the live broadcast system shown in fig. 1, to receive a video code stream to obtain live broadcast images.
The display unit 107 may be used to provide a graphical output interface for a user, display image information, such as displaying live images for the user to view a live network, and the like.
It is to be understood that the structure shown in fig. 2 is merely illustrative, and the live receiver 100 may also include more or fewer components than shown in fig. 2, or have a different configuration than shown in fig. 2. The components shown in fig. 2 may be implemented in hardware, software, or a combination thereof.
The live broadcast receiving end shown in fig. 2 is taken as an exemplary execution main body, and the live broadcast data processing method provided by the present application is exemplarily described below.
Referring to fig. 3, fig. 3 shows a schematic flow chart of a live data processing method provided in the present application, and in some embodiments, the live data processing method may include the following steps:
step 401, extracting texture of a live background image selected by a user to obtain live background texture;
step 403, extracting object texture of a target object in the received live broadcast image;
step 405, blending the object texture with the live background texture via the player SDK to generate a target live image, and displaying the target live image.
In some embodiments, as shown in fig. 1, the live broadcast receiving end may continuously receive the live broadcast video code stream from the server, and continuously display a live broadcast image in the live broadcast video code stream on a display interface of the live broadcast receiving end, so as to provide a service for viewers at the live broadcast receiving end to watch live webcast.
With reference to fig. 4, when the display interface of the live broadcast receiving end displays the live broadcast image shown in fig. 4A, for example, the live broadcast receiving end may display the live broadcast background image selection interface shown in fig. 4B by receiving an input instruction of a user, so that the user may select a live broadcast background image to be replaced in the live broadcast background image selection interface.
After the user selects the live background image to be replaced in the live background image selection interface shown in fig. 4B, the live receiving end may perform texture extraction on the selected live background image using, for example, MediaPlayer together with OpenGL: the live receiving end may first decode the live background image with the player and output the decoded image to a SurfaceTexture, so that the SurfaceTexture outputs the live background texture (e.g., bgDynamicTexture) corresponding to the live background image.
Then, the live broadcast receiving end can extract the object texture of the target object in the received live image. For example, the portrait area in the live image may be taken as the target object, and the live image is input into a pre-trained deep learning model, which identifies the area where the target object is located in the live image; the live receiving end then generates the object texture (e.g., a maskId texture) of the target object.
Next, the live broadcast receiving end may alpha-blend the object texture with the live background texture via the player SDK using, for example, OpenGL, to generate the target live image, and display the target live image on the display interface of the live receiving end (for example, the live image shown in fig. 4C). In this way, the user of the live receiving end can watch the live broadcast with an individually configured live background image, which improves the flexibility of switching live backgrounds and the viewers' experience.
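The alpha-blending step described above can be sketched as follows. This is an illustrative Python sketch of per-pixel alpha compositing; the patent performs the equivalent operation on the GPU via OpenGL and the player SDK, and all function names here are assumptions for illustration only.

```python
def alpha_blend_pixel(fg, bg, alpha):
    """Composite one foreground pixel over a background pixel.
    fg, bg: (r, g, b) tuples; alpha: mask value in [0.0, 1.0]."""
    return tuple(round(f * alpha + b * (1.0 - alpha)) for f, b in zip(fg, bg))

def compose_target_frame(object_texture, mask, background_texture):
    """Blend the extracted object texture over the user-selected background
    texture, using the segmentation mask as the per-pixel alpha value."""
    return [
        [alpha_blend_pixel(object_texture[y][x], background_texture[y][x], mask[y][x])
         for x in range(len(mask[0]))]
        for y in range(len(mask))
    ]

# A 1x2 frame: the first pixel belongs to the target object (mask = 1.0),
# the second pixel is background-only (mask = 0.0).
obj = [[(255, 0, 0), (255, 0, 0)]]
bg = [[(0, 0, 255), (0, 0, 255)]]
mask = [[1.0, 0.0]]
frame = compose_target_frame(obj, mask, bg)
```

On the GPU, the same per-pixel formula is what standard OpenGL blending computes when the object texture carries the mask in its alpha channel.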
In some embodiments, in order to flexibly extract the object texture of the target object, when the live broadcast receiving end executes step 403, the trained deep learning model described above may be used to identify the area where the target object is located in the live image as a mask area. For example, in a scene with a portrait as the target object, the live receiving end may recognize the portrait area in the live image as the mask area; the live receiving end may then first obtain the target bitmap (Bitmap) information corresponding to the mask area in the received live image, and convert the target bitmap information into the object texture of the target object.
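The mask-to-bitmap step can be sketched as follows. This is an illustrative Python sketch under the assumption that the mask is a per-pixel 0/1 flag; in the patent, the bitmap is an Android Bitmap later uploaded as an OpenGL texture, and the function name here is hypothetical.

```python
def bitmap_from_mask(frame, mask):
    """Cut the target object out of a live frame: keep RGBA pixels inside
    the mask area, and make everything outside it fully transparent so the
    user-selected background shows through after blending.
    frame: rows of (r, g, b) pixels; mask: rows of 0/1 flags."""
    return [
        [(r, g, b, 255) if m else (0, 0, 0, 0)
         for (r, g, b), m in zip(frame_row, mask_row)]
        for frame_row, mask_row in zip(frame, mask)
    ]

# First pixel is inside the mask area (target object), second is outside.
frame = [[(10, 20, 30), (40, 50, 60)]]
mask = [[1, 0]]
object_bitmap = bitmap_from_mask(frame, mask)
```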
It should be noted that when a viewer watches the webcast at the live broadcast receiving end, a large amount of the receiving end's resources, such as memory and CPU, are consumed. In some possible application scenarios, when the live broadcast receiving end continuously replaces the live background of the live images in the video code stream, the bitmap information corresponding to the mask area of each frame also has to be acquired continuously, so the receiving end would have to frequently create different bitmap information in memory, causing memory jitter, which in turn makes the live broadcast stutter and degrades the viewers' experience.
Therefore, in some possible embodiments, a memory reuse technique may be applied: the live broadcast receiving end may construct a bitmap management pool (bitmap pool) by creating a memory pool in advance and loading a plurality of bitmap information items into it. That is, the bitmap management pool stores a plurality of bitmap information items in advance, and the target bitmap information is one of them.
Thus, when acquiring the target bitmap information corresponding to the mask area in a received live image, the live broadcast receiving end can search the pre-created bitmap management pool for it using a preset fuzzy matching algorithm combined with the PTS (Presentation Time Stamp) of the live image, thereby avoiding frequent creation of bitmap information and the live stutter caused by memory jitter.
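The bitmap pool and PTS-based fuzzy lookup can be sketched as follows. The patent does not specify the matching algorithm, so this Python sketch assumes a simple nearest-PTS match within a tolerance; the class and parameter names are illustrative, not the patent's own API.

```python
class BitmapPool:
    """Pre-allocated pool of reusable bitmap buffers, each tagged with the
    PTS of the frame it was last used for. Reusing buffers avoids allocating
    a new bitmap per frame, which causes memory jitter and stutter."""

    def __init__(self, size, width, height):
        # Pre-create all buffers once; each entry is (pts_tag, buffer).
        self.entries = [(None, bytearray(width * height * 4)) for _ in range(size)]

    def acquire(self, pts, tolerance=40):
        """Fuzzy match: return the buffer whose recorded PTS is closest to
        the requested one (within `tolerance` ms); otherwise recycle an
        untagged or stale buffer instead of allocating a new one."""
        best_i, best_d = None, None
        for i, (p, _) in enumerate(self.entries):
            if p is not None:
                d = abs(p - pts)
                if d <= tolerance and (best_d is None or d < best_d):
                    best_i, best_d = i, d
        if best_i is None:
            # No close match: reuse the untagged (or oldest-tagged) entry.
            best_i = min(range(len(self.entries)),
                         key=lambda i: (self.entries[i][0] is not None,
                                        self.entries[i][0] or 0))
        buf = self.entries[best_i][1]
        self.entries[best_i] = (pts, buf)
        return buf

pool = BitmapPool(size=2, width=4, height=4)
a = pool.acquire(pts=1000)
b = pool.acquire(pts=1033)  # within 40 ms of 1000: the same buffer is reused
```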
In addition, in some possible application scenarios, as shown in fig. 4, when the live background is replaced while the live broadcast receiving end is playing, the receiving end generally needs to keep playing the live images in the video stream without interrupting the live broadcast.
For this reason, when executing the scheme provided by the present application, the live broadcast receiving end may first release the currently used player SDK and create a new one, so as to execute step 405 with the new player SDK.
It should be noted that creating a new player SDK takes a certain amount of time, so this implementation may lengthen the time the live receiving end needs to replace the live background, introducing a certain delay.
Therefore, in some possible embodiments, instead of releasing the currently used player SDK when performing step 405, the live receiving end may reset it by resetting its blending state, and blend the object texture and the live background texture on the reset player SDK to generate the target live image. In this manner, there is no need to release the current player SDK and create a new one when replacing the live background, which reduces the delay incurred.
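The contrast between recreating the player SDK and resetting its blending state can be sketched as follows. This is a toy Python model, not the real player SDK: the class, method names, and cost constants are assumptions made purely to illustrate why the reset path removes the recreation delay.

```python
class PlayerSDK:
    """Toy stand-in for a player SDK instance; creation is modeled as expensive."""
    CREATE_COST_MS = 300  # assumed cost of tearing down and recreating the SDK
    RESET_COST_MS = 5     # assumed cost of just resetting the blend inputs

    def __init__(self):
        self.background_texture = None

    def reset_blending(self, background_texture):
        """Rebind the blend inputs on the existing instance."""
        self.background_texture = background_texture
        return self.RESET_COST_MS

def switch_background_by_recreate(_old_player, texture):
    player = PlayerSDK()  # slow path: release old SDK, create a new one
    cost = PlayerSDK.CREATE_COST_MS + player.reset_blending(texture)
    return player, cost

def switch_background_by_reset(player, texture):
    cost = player.reset_blending(texture)  # fast path: reuse the instance
    return player, cost

p = PlayerSDK()
p2, slow = switch_background_by_recreate(p, "beach.png")
p3, fast = switch_background_by_reset(p2, "forest.png")
```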
As described above, replacing the live background requires certain processing of the received live images and blending them with the live background image selected by the user, which often takes some time to complete.
That is, from the moment the user inputs an operation instruction instructing the live receiving terminal to replace the live background of the webcast, a certain time interval elapses before the live background is actually replaced.
Therefore, in some possible embodiments, the live broadcast receiving end may use attachToGLContext and detachFromGLContext of SurfaceTexture to reuse an off-screen cache: before the target live image is generated, the display interface of the receiving end continues to play the live images in the video code stream with the previously used live background image; once the receiving end generates the target live image, the content displayed on the display interface is switched from the live image generated with the previous background image to the target live image generated with the newly selected one. In this way, discontinuity of the displayed live images during background switching can be avoided.
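The off-screen-cache idea above amounts to a double-buffered switch, which can be sketched as follows. This is an illustrative Python model of the display policy only; the patent realizes it with SurfaceTexture GL-context attachment, and the class and field names here are assumptions.

```python
class DisplaySurface:
    """Keep showing frames rendered with the old background until the first
    frame with the new background is ready, then switch atomically."""

    def __init__(self):
        self.on_screen = None   # what the viewer currently sees
        self.off_screen = None  # frame being prepared with the new background

    def show(self, frame):
        self.on_screen = frame

    def prepare(self, frame, ready):
        # Render with the new background off-screen; only swap when complete.
        self.off_screen = frame
        if ready:
            self.on_screen, self.off_screen = self.off_screen, None

surface = DisplaySurface()
surface.show("frame@old_bg")
surface.prepare("frame@new_bg", ready=False)  # still rendering: no switch yet
visible_during_switch = surface.on_screen
surface.prepare("frame@new_bg", ready=True)   # target frame ready: swap now
visible_after_switch = surface.on_screen
```

The viewer therefore never sees a gap: the old-background frame stays on screen for exactly as long as the new-background frame takes to produce.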
In addition, based on the same inventive concept as the live data processing method provided in the present application, the present application further provides a live data processing apparatus 500 as shown in fig. 5, where the live data processing apparatus 500 includes a processing module 501 and a replacing module 502; wherein:
the processing module 501 is configured to perform texture extraction on a live background image selected by a user to obtain a live background texture;
the processing module 501 is further configured to extract an object texture of a target object in the received live broadcast image;
and the replacing module 502 is configured to blend the object texture with the live background texture via the player SDK to generate a target live image, and display the target live image.
Optionally, as a possible implementation manner, when the object texture is mixed with the live background texture and the player sdk to generate the target live image, the replacing module 502 is specifically configured to:
the currently used player sdk is reset and the object texture and the live background texture are blended on the reset player sdk to generate the target live image.
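The blending step can be pictured as alpha compositing: the extracted object texture carries an alpha channel (the mask), and each pixel is composited over the selected background texture. The sketch below models textures as flat RGBA pixel lists; this is an assumption for illustration — the patent does not specify the blend equation, and a real implementation would do this in a fragment shader on the player's GL surface.

```python
# Minimal alpha-compositing sketch (illustrative, not the patent's code):
# the object texture's alpha channel acts as the mask over the background.

def blend(object_px, background_px):
    """Alpha-composite one RGBA object pixel over one background pixel."""
    r, g, b, a = object_px
    br, bg_, bb, _ = background_px
    inv = 255 - a
    return (
        (r * a + br * inv) // 255,
        (g * a + bg_ * inv) // 255,
        (b * a + bb * inv) // 255,
        255,
    )

def composite(object_tex, background_tex):
    # Pixel-by-pixel blend; a shader would do this per-fragment on the GPU.
    return [blend(o, b) for o, b in zip(object_tex, background_tex)]

# Two pixels: one fully opaque (the anchor/object), one fully transparent.
object_tex = [(255, 0, 0, 255), (0, 0, 0, 0)]
background = [(0, 255, 0, 255), (0, 255, 0, 255)]
out = composite(object_tex, background)
assert out[0] == (255, 0, 0, 255)   # opaque object pixel wins
assert out[1] == (0, 255, 0, 255)   # transparent pixel shows background
```

Where the object's mask is opaque the anchor remains visible; where it is transparent the newly selected background shows through, which is exactly the replacement effect the embodiment describes.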
Optionally, as a possible implementation, before the replacing module 502 generates the target live image, the processing module 501 is further configured to:
the historical live broadcast image is displayed until the replacing module 502 generates the target live broadcast image, at which point the replacing module 502 switches the displayed content from the historical live broadcast image to the target live broadcast image.
Optionally, as a possible implementation manner, when extracting an object texture of a target object in a received live image, the processing module 501 is specifically configured to:
acquiring target bitmap information corresponding to a mask area in a received live broadcast image; the mask area is the area in which the target object in the live broadcast image is located;
the target bitmap information is converted into an object texture of the target object.
Optionally, as a possible implementation manner, when acquiring target bitmap information corresponding to a mask region in a received live broadcast image, the processing module 501 is specifically configured to:
searching target bitmap information corresponding to a mask area in the received live broadcast image in a pre-established bitmap management pool by using a preset fuzzy matching algorithm; the bitmap management pool stores a plurality of bitmap information, the target bitmap information is one of the bitmap information, and the mask area is an area where a target object in a live broadcast image is located.
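The fuzzy matching described above can be sketched as a nearest-size lookup: the pool holds reusable bitmap records, and a query for the mask region's dimensions returns the record whose size is closest, within a tolerance. The patent does not disclose the matching metric, so the distance function, field names, and tolerance below are illustrative assumptions.

```python
# Illustrative fuzzy lookup over a bitmap management pool:
# return the pooled bitmap whose dimensions best match the mask region.

def fuzzy_lookup(pool, width, height, tolerance=64):
    """Find the closest-sized bitmap record, or None if nothing is near enough."""
    best, best_dist = None, None
    for bmp in pool:
        # Manhattan distance on dimensions as a simple closeness measure.
        dist = abs(bmp["width"] - width) + abs(bmp["height"] - height)
        if dist <= tolerance and (best_dist is None or dist < best_dist):
            best, best_dist = bmp, dist
    return best

pool = [
    {"width": 320, "height": 240, "id": "a"},
    {"width": 640, "height": 480, "id": "b"},
    {"width": 1280, "height": 720, "id": "c"},
]
hit = fuzzy_lookup(pool, 650, 470)
assert hit["id"] == "b"                     # closest size within tolerance
assert fuzzy_lookup(pool, 10, 10) is None   # nothing close enough
```

The point of accepting an approximate match rather than an exact one is reuse: a pooled bitmap slightly larger or smaller than the mask region can still serve as the target buffer, avoiding a fresh allocation per frame.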
Optionally, as a possible implementation manner, before the processing module 501 searches, by using a preset fuzzy matching algorithm, target bitmap information corresponding to a mask region in the received live broadcast image in a pre-created bitmap management pool, the processing module 501 is further configured to:
and loading a plurality of bitmap information in a pre-created memory pool to construct a bitmap management pool.
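Building the bitmap management pool from a pre-created memory pool amounts to allocating all bitmap buffers once, up front, and cycling them between a free list and an in-use list. The sketch below is an assumption-laden illustration (RGBA buffers, the acquire/release names, and the size list are all hypothetical), but it shows the allocation-avoidance property the embodiment is after.

```python
# Illustrative bitmap management pool backed by preallocated buffers:
# buffers are created once and reused, so no per-frame allocation occurs.

class BitmapManagementPool:
    def __init__(self, sizes):
        # Preallocate one reusable RGBA buffer per expected bitmap size.
        self.free = [
            {"width": w, "height": h, "buf": bytearray(w * h * 4)}
            for (w, h) in sizes
        ]
        self.in_use = []

    def acquire(self, width, height):
        # Hand out a pooled buffer of the requested size, if one is free.
        for bmp in self.free:
            if bmp["width"] == width and bmp["height"] == height:
                self.free.remove(bmp)
                self.in_use.append(bmp)
                return bmp
        return None  # no pooled buffer of that size

    def release(self, bmp):
        # Return the buffer to the free list for reuse by a later frame.
        self.in_use.remove(bmp)
        self.free.append(bmp)


pool = BitmapManagementPool([(320, 240), (640, 480)])
bmp = pool.acquire(640, 480)
assert bmp is not None and len(bmp["buf"]) == 640 * 480 * 4
pool.release(bmp)
assert pool.acquire(640, 480) is bmp   # the same buffer object is reused
```

Because every buffer lives in the pool for the lifetime of the stream, steady-state frame processing touches no allocator at all, which is the usual motivation for pooling bitmaps in a per-frame video pipeline.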
In the embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. The apparatus embodiments described above are merely illustrative. For example, the flowcharts and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of apparatuses, methods, and computer program products according to some embodiments of the present application; in this regard, each block in the flowcharts or block diagrams may represent a module, segment, or portion of code that comprises one or more executable instructions for implementing the specified logical function(s).
It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved.
It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
In addition, functional modules in some embodiments of the present application may be integrated together to form an independent part, or each module may exist separately, or two or more modules may be integrated to form an independent part.
The functions, if implemented in the form of software functional modules and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on such understanding, the technical solution of the present application, or the portions thereof that substantially contribute over the prior art, may be embodied in the form of a software product stored in a storage medium and including instructions for causing a computer device (which may be a personal computer, a server, or a network device) to perform all or part of the steps of the method according to some embodiments of the present application. The aforementioned storage medium includes any medium that can store program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
The above description is only a few examples of the present application and is not intended to limit the present application, and those skilled in the art will appreciate that various modifications and variations can be made in the present application. Any modification, equivalent replacement, improvement and the like made within the spirit and principle of the present application shall be included in the protection scope of the present application.
It will be evident to those skilled in the art that the present application is not limited to the details of the foregoing illustrative embodiments, and that the present application may be embodied in other specific forms without departing from the spirit or essential attributes thereof. The present embodiments are therefore to be considered in all respects as illustrative and not restrictive, the scope of the application being indicated by the appended claims rather than by the foregoing description, and all changes which come within the meaning and range of equivalency of the claims are therefore intended to be embraced therein. Any reference sign in a claim should not be construed as limiting the claim concerned.

Claims (10)

1. A live broadcast data processing method is applied to a live broadcast receiving end, and comprises the following steps:
carrying out texture extraction on a live background image selected by a user to obtain live background textures;
extracting object textures of target objects in the received live broadcast images;
and mixing the object texture with the live background texture and a player sdk to generate a target live image, and displaying the target live image.
2. The method of claim 1, wherein said blending said object texture with said live background texture and player sdk to generate a target live image comprises:
the currently used player sdk is reset and the object texture and the live background texture are blended on the reset player sdk to generate the target live image.
3. The method of claim 1, wherein prior to the generating the target live image, the method further comprises:
and displaying the historical live broadcast image until the target live broadcast image is generated, and switching the historical live broadcast image into the target live broadcast image.
4. The method of claim 1, wherein extracting an object texture of a target object in the received live image comprises:
acquiring target bitmap information corresponding to a mask area in a received live broadcast image; the mask area is an area where a target object in the live broadcast image is located;
and converting the target bitmap information into the object texture of the target object.
5. The method of claim 4, wherein the obtaining target bitmap information corresponding to a mask region in the received live image comprises:
searching target bitmap information corresponding to a mask area in the received live broadcast image in a pre-established bitmap management pool by using a preset fuzzy matching algorithm; the bitmap management pool stores a plurality of bitmap information, the target bitmap information is one of the bitmap information, and the mask area is an area where the target object in the live broadcast image is located.
6. The method of claim 5, wherein prior to said finding target bitmap information corresponding to a mask region in the received live image in a pre-created bitmap management pool using a pre-defined fuzzy matching algorithm, the method further comprises:
and loading a plurality of bitmap information in a pre-created memory pool to construct the bitmap management pool.
7. A live broadcast data processing device is applied to a live broadcast receiving end, and the device comprises:
the processing module is used for extracting textures of the live background image selected by the user to obtain live background textures;
the processing module is further used for extracting object textures of target objects in the received live broadcast images;
and the replacing module is used for mixing the object texture, the live background texture and the player sdk to generate a target live image and displaying the target live image.
8. A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the method according to any one of claims 1-6.
9. A live broadcast receiving end, comprising:
a memory for storing one or more programs;
a processor;
the one or more programs, when executed by the processor, implement the method of any of claims 1-6.
10. A live system, characterized in that the live system comprises a server and a live receiving end as claimed in claim 9, the server establishing communication with the live receiving end.
CN202010833459.5A 2020-08-18 2020-08-18 Live broadcast data processing method and device Active CN111970527B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010833459.5A CN111970527B (en) 2020-08-18 2020-08-18 Live broadcast data processing method and device


Publications (2)

Publication Number Publication Date
CN111970527A true CN111970527A (en) 2020-11-20
CN111970527B CN111970527B (en) 2022-03-29

Family

ID=73388374

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010833459.5A Active CN111970527B (en) 2020-08-18 2020-08-18 Live broadcast data processing method and device

Country Status (1)

Country Link
CN (1) CN111970527B (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114257866A (en) * 2021-12-14 2022-03-29 上海掌门科技有限公司 Video display method, readable medium and electronic equipment
CN114765692A (en) * 2021-01-13 2022-07-19 北京字节跳动网络技术有限公司 Live broadcast data processing method, device, equipment and medium
CN114860370A (en) * 2022-05-17 2022-08-05 聚好看科技股份有限公司 Display device, server and software development kit switching method

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040194123A1 (en) * 2003-03-28 2004-09-30 Eastman Kodak Company Method for adapting digital cinema content to audience metrics
CN102541515A (en) * 2010-12-08 2012-07-04 腾讯科技(深圳)有限公司 Method and device for realizing special screen switching effect
CN106204426A (en) * 2016-06-30 2016-12-07 广州华多网络科技有限公司 A kind of method of video image processing and device
CN106254893A (en) * 2015-12-30 2016-12-21 深圳超多维科技有限公司 Main broadcaster's class interaction platform client method for changing scenes and device, client
CN106713988A (en) * 2016-12-09 2017-05-24 福建星网视易信息系统有限公司 Beautifying method and system for virtual scene live
CN106803966A (en) * 2016-12-31 2017-06-06 北京星辰美豆文化传播有限公司 A kind of many people's live network broadcast methods, device and its electronic equipment
US20180084292A1 (en) * 2016-09-18 2018-03-22 Shanghai Hode Information Technology Co.,Ltd. Web-based live broadcast
CN107920256A (en) * 2017-11-30 2018-04-17 广州酷狗计算机科技有限公司 Live data playback method, device and storage medium
US20180122114A1 (en) * 2016-08-19 2018-05-03 Beijing Sensetime Technology Development Co., Ltd. Method and apparatus for processing video image and electronic device



Also Published As

Publication number Publication date
CN111970527B (en) 2022-03-29

Similar Documents

Publication Publication Date Title
CN111970527B (en) Live broadcast data processing method and device
CN109525851B (en) Live broadcast method, device and storage medium
CN112929678B (en) Live broadcast method, live broadcast device, server side and computer readable storage medium
CN111078070B (en) PPT video barrage play control method, device, terminal and medium
CN110856008B (en) Live broadcast interaction method, device and system, electronic equipment and storage medium
EP2953055A1 (en) Two-dimensional code processing method and terminal
CN112218108B (en) Live broadcast rendering method and device, electronic equipment and storage medium
CN107295352B (en) Video compression method, device, equipment and storage medium
CN112689168A (en) Dynamic effect processing method, dynamic effect display method and dynamic effect processing device
CN110913237A (en) Live broadcast control method and device, live broadcast initiating device and storage medium
CN110856005A (en) Live stream display method and device, electronic equipment and readable storage medium
CN112153409B (en) Live broadcast method and device, live broadcast receiving end and storage medium
CN110582021B (en) Information processing method and device, electronic equipment and storage medium
US11750876B2 (en) Method and apparatus for determining object adding mode, electronic device and medium
CN108683900B (en) Image data processing method and device
US20160165315A1 (en) Display apparatus, method of displaying channel list performed by the same, server, and control method performed by the server
CN113691835B (en) Video implantation method, device, equipment and computer readable storage medium
CN114095772A (en) Virtual object display method and system under live microphone connection and computer equipment
TWI765230B (en) Information processing device, information processing method, and information processing program
US20200007945A1 (en) Video production system with dynamic character generator output
CN111246246A (en) Video playing method and device
CN111190518B (en) Interaction method and device between first screen and second screen, terminal and storage medium
CN113453032B (en) Gesture interaction method, device, system, server and storage medium
US11962545B2 (en) Method and device for providing chatbot participating chat service
CN111213133B (en) Command processing server, program, system, command execution program, and command processing method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant