CN106658145B - Live broadcast data processing method and device - Google Patents

Live broadcast data processing method and device

Info

Publication number
CN106658145B
CN106658145B (application CN201611229439.7A)
Authority
CN
China
Prior art keywords
application
interface
data
live broadcast
information
Prior art date
Legal status
Active
Application number
CN201611229439.7A
Other languages
Chinese (zh)
Other versions
CN106658145A (en)
Inventor
葛山
董晶阳
Current Assignee
Beijing Qihoo Technology Co Ltd
Original Assignee
Beijing Qihoo Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Qihoo Technology Co Ltd filed Critical Beijing Qihoo Technology Co Ltd
Priority to CN201611229439.7A priority Critical patent/CN106658145B/en
Publication of CN106658145A publication Critical patent/CN106658145A/en
Priority to PCT/CN2017/118839 priority patent/WO2018121556A1/en
Application granted granted Critical
Publication of CN106658145B publication Critical patent/CN106658145B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/431 Generation of visual interfaces for content selection or interaction; Content or additional data rendering
    • H04N21/4312 Generation of visual interfaces involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations
    • H04N21/44 Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs
    • H04N21/442 Monitoring of processes or resources, e.g. detecting the failure of a recording device, monitoring the downstream bandwidth, the number of times a movie has been viewed, the storage space available from the internal hard disk
    • H04N21/47 End-user applications
    • H04N21/472 End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content
    • H04N21/47205 End-user interface for manipulating displayed content, e.g. interacting with MPEG-4 objects, editing locally
    • H04N21/485 End-user interface for client configuration

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Human Computer Interaction (AREA)
  • Databases & Information Systems (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Stored Programmes (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

An embodiment of the invention provides a live data processing method and device. The method comprises: during live broadcast based on a target application, obtaining live return information, where the live return information feeds back information other than live video data and includes bullet-screen (danmaku) comment data; rendering a display image in which the application interface of the target application and the live return information are superimposed, and displaying a live interface according to the display image; and generating live video data carrying the bullet-screen data according to the live interface, and uploading the live video data. Live video data with bullet-screen and similar data is thus generated directly at the anchor (broadcaster) end; the server no longer needs to composite the live video with the bullet-screen and other live return information, and latency is reduced.

Description

Live broadcast data processing method and device
Technical Field
The present invention relates to the field of computer technologies, and in particular, to a live data processing method and a live data processing apparatus.
Background
With the development of network and terminal technologies, users can perform various network operations through terminals, such as browsing the internet, playing games, and watching videos, which has given rise to a new multimedia interaction mode: network live broadcasting.
Network live broadcasting is a multimedia interaction mode in which a signal acquisition device captures live video, the live video is uploaded to a server, and the server distributes the live video to each viewer's terminal for playback.
Interactive messages are displayed in the live picture, but during network live broadcast these messages usually arrive with a certain delay, which degrades the live broadcast experience.
Disclosure of Invention
In view of the above problems, the present invention has been made to provide a live data processing method and a corresponding live data processing apparatus that overcome or at least partially solve the above problems.
According to an aspect of the present invention, a live data processing method is provided. The method includes: during live broadcast based on a target application, obtaining live return information, where the live return information feeds back information other than live video data and includes bullet-screen data; rendering a display image in which the application interface of the target application and the live return information are superimposed, and displaying a live interface according to the display image; and generating live video data carrying the bullet-screen data according to the live interface, and uploading the live video data.
Optionally, rendering a display image in which the application interface of the target application and the live return information are superimposed includes: calling a first interface according to application mapping architecture data, drawing an information feedback image corresponding to the live return information, and obtaining an application interface drawn based on application source architecture data; and superimposing and rendering the information feedback image and the application interface to obtain a display image; where the application mapping architecture data is determined from the application source architecture data of the target application.
Optionally, rendering a display image in which the application interface of the target application and the live return information are superimposed includes: and calling a first interface according to application mapping architecture data, drawing a display image which is superposed with an information feedback image corresponding to the live broadcast return information on an application interface of the target application, and rendering the display image.
Optionally, displaying a live interface according to the display image includes: and sequentially displaying the display images on a terminal screen according to the time information of the display images to form a live broadcast interface of the target application.
Optionally, generating live video data with bullet-screen data according to the live interface includes: calling a second interface according to the application mapping architecture data to acquire the display images; and composing the display images into the live video data according to the time information of the display images.
Optionally, the live return information further includes: service data and/or non-public information.
Optionally, the information feedback image corresponding to the live return information includes a public image and a private image, where the public image is drawn according to the bullet-screen data and the service data, and the private image is drawn according to the non-public information.
Optionally, acquiring the display image comprises: and acquiring a display image generated by overlapping the application interface of the target application and the public image.
Optionally, the method further comprises: creating application mapping architecture data and application source architecture data when initializing the target application in advance through the suspended target object; determining at least one target interface according to the application mapping architecture data.
Optionally, the creating, in advance through the suspended target object, the application mapping architecture data and the application source architecture data when initializing the target application includes: monitoring a creating function in a system, and suspending a target object when the creating function creates the target object; when the target object initializes the target application, determining a mapping object and interface call according to the application information, building application mapping architecture data, determining a source object and interface call according to the application information, and building application source architecture data.
Optionally, the method further comprises: when an object is created, creating a mapping object according to the application information and pointing to a source memory address, and then creating a source object and pointing to the source memory address; and when the interfaces are called, a group of hook programs are set at the corresponding interfaces according to the application information, and the interface information of each interface is determined.
Optionally, determining at least one target interface according to the application mapping architecture data includes: determining at least one target interface for rendering a display image in dependence on the interface information based on the application mapping architecture data.
According to another aspect of the present invention, a live data processing apparatus is provided, including: an acquisition module, configured to obtain live return information during live broadcast based on a target application, where the live return information feeds back information other than video data and includes bullet-screen data; a rendering display module, configured to render a display image in which the application interface of the target application and the live return information are superimposed, and display a live interface according to the display image; and a live return module, configured to generate live video data with bullet-screen data according to the live interface and upload the live video data.
Optionally, the rendering display module includes: a rendering submodule, configured to call a first interface according to application mapping architecture data, draw an information feedback image corresponding to the live return information, and obtain an application interface drawn based on application source architecture data; and superimpose and render the information feedback image and the application interface to obtain a display image; where the application mapping architecture data is determined from the application source architecture data of the target application.
Optionally, the rendering display module includes: and the drawing submodule is used for calling a first interface according to application mapping architecture data, drawing a display image which is formed by superposing an information feedback image corresponding to the live broadcast return information on an application interface of the target application, and rendering the display image.
Optionally, the drawing and displaying module includes: and the display submodule is used for sequentially displaying the display images on a terminal screen according to the time information of the display images to form a live broadcast interface of the target application.
Optionally, the live return module includes: an obtaining submodule, configured to call a second interface according to the application mapping architecture data to acquire the display images; and a live generation submodule, configured to compose the display images into the live video data according to the time information of the display images.
Optionally, the live return information further includes: service data and/or non-public information.
Optionally, the information feedback image corresponding to the live return information includes a public image and a private image, where the public image is drawn according to the bullet-screen data and the service data, and the private image is drawn according to the non-public information.
Optionally, the obtaining sub-module is configured to obtain a display image generated by superimposing the application interface of the target application and the public image.
Optionally, the method further comprises: the initialization module is used for creating application mapping architecture data and application source architecture data when initializing the target application in advance through the suspended target object; determining at least one target interface according to the application mapping architecture data.
Optionally, the initialization module includes: a suspension submodule, configured to monitor a creating function in the system and suspend a target object when the creating function creates the target object; and an architecture building submodule, configured to, when the target object initializes the target application, determine a mapping object and interface calls according to the application information and build application mapping architecture data, and determine a source object and interface calls according to the application information and build application source architecture data.
Optionally, the framework building sub-module is further configured to, when creating an object, create a mapping object according to the application information and point to a source memory address, and then create a source object and point to the source memory address; and when the interfaces are called, a group of hook programs are set at the corresponding interfaces according to the application information, and the interface information of each interface is determined.
Optionally, the initialization module further includes: and the interface analysis submodule is used for determining at least one target interface for drawing a display image according to the interface information based on the application mapping architecture data.
During live broadcast based on the target application, live return information is obtained, which feeds back information other than live video data and includes bullet-screen data; a display image in which the application interface of the target application and the live return information are superimposed is rendered, and the live interface is displayed according to the display image; live video data carrying the bullet-screen data is generated according to the live interface and uploaded. Live video data with bullet-screen and similar data is thus generated directly at the anchor end, the server no longer needs to composite the live video with the bullet-screen and other live return information, and latency is reduced.
The foregoing is only an overview of the technical solutions of the present invention. To make the technical means of the present invention clearer, and to make the above and other objects, features, and advantages of the present invention more readily understandable, embodiments of the invention are described below.
Drawings
Various other advantages and benefits will become apparent to those of ordinary skill in the art upon reading the following detailed description of the preferred embodiments. The drawings are only for purposes of illustrating the preferred embodiments and are not to be construed as limiting the invention. Also, like reference numerals are used to refer to like parts throughout the drawings. In the drawings:
FIG. 1 is a flowchart of the steps of a live data processing method embodiment according to an embodiment of the present invention;
FIG. 2 is a flowchart of the steps of a live data processing method embodiment according to another embodiment of the present invention;
FIG. 3 is a flowchart of the steps of another live data processing method embodiment according to another embodiment of the present invention;
FIG. 4 is a block diagram of a live data processing apparatus embodiment according to an embodiment of the present invention;
FIG. 5 is a block diagram of a live data processing apparatus embodiment according to another embodiment of the present invention.
Detailed Description
Exemplary embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While exemplary embodiments of the present disclosure are shown in the drawings, it should be understood that the present disclosure may be embodied in various forms and should not be limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the disclosure to those skilled in the art.
Referring to fig. 1, a flowchart illustrating steps of an embodiment of a live data processing method according to an embodiment of the present invention is shown, which may specifically include the following steps:
Step 102: during live broadcast based on the target application, obtaining live return information, where the live return information feeds back information other than live video data and includes bullet-screen data.
Live broadcast in this embodiment is based on a target application; that is, data of the target application serves as the live data. In game live broadcast, for example, the pictures produced by the game application while a user plays are used as the live data. Users watching the broadcast can interact with the anchor, who is the user doing the broadcasting; in game live broadcast, the player is the anchor. Viewers can send bullet-screen comments, give the anchor gifts, and so on, where a bullet screen (danmaku) is a display mode in which large numbers of comments fly across the video like pop-up subtitles.
This embodiment applies to the anchor terminal during live broadcast, making it convenient for the anchor to interact with users: live return information can be obtained during the target-application-based broadcast. To reduce consumption of the anchor's bandwidth, the live return information in this embodiment feeds back information other than video data; that is, it consists of interaction messages carrying no live video data, and may include bullet-screen data as well as data such as gifts.
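The return-information channel described above can be sketched as follows. This is an illustrative model only, not code from the patent; the names (ReturnInfo, fetch_return_info, the kind tags) are invented:

```python
from dataclasses import dataclass

@dataclass
class ReturnInfo:
    kind: str        # e.g. "barrage", "gift", "private" -- never raw video
    payload: str     # comment text, gift id, message body, etc.
    timestamp: float

def fetch_return_info(feed):
    # Keep only interaction messages: the anchor end must not spend
    # bandwidth re-receiving its own video stream.
    return [m for m in feed if m.kind != "video"]
```

A feed mixing video frames with interaction messages would thus be reduced to the bullet-screen comments, gifts, and private messages alone.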
Step 104: rendering a display image in which the application interface of the target application and the live return information are superimposed, and displaying a live interface according to the display image.
Step 106: generating live video data with bullet-screen data according to the live interface, and uploading the live video data.
From the perspective of the system bottom layer and the screen display, every interface and window shown by the system can be regarded as a frame of image. While the target application runs, its application interface must therefore be drawn and rendered; during this process the live return information can be drawn superimposed on the application interface to obtain a corresponding display image. A display image is drawn for each frame, and the frames are combined in timestamp order to display the corresponding live interface, which directly shows the bullet-screen data and other return data such as gifts. Live video data carrying the bullet-screen data is then generated according to the live interface, i.e., the per-frame live interfaces are combined into the live video data, and the live video data carrying the bullet-screen data is uploaded.
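Steps 104 and 106 can be pictured as a tiny per-frame pipeline. The sketch below is a hedged illustration of the compositing-then-assembly idea, not the patent's actual rendering code; frames are modeled as 2-D lists, and None marks a transparent overlay pixel:

```python
def render_display_image(app_frame, overlay):
    # Superimpose the feedback overlay (barrage text, gift icons) on the
    # application frame: a non-None overlay pixel replaces the app pixel.
    return [[o if o is not None else a
             for a, o in zip(app_row, ov_row)]
            for app_row, ov_row in zip(app_frame, overlay)]

def build_live_video(stamped_frames):
    # Combine the per-frame display images in timestamp order; the result
    # already carries the bullet screen, so no server-side compositing.
    return [frame for _, frame in sorted(stamped_frames, key=lambda p: p[0])]
```

Because each uploaded frame is already the composited display image, the server only relays the stream.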
In this way, live video data carrying bullet-screen and similar data is generated directly at the anchor end; the server no longer needs to composite the live video with the bullet-screen and other live return information, and latency is reduced.
Referring to fig. 2, a flowchart illustrating steps of an embodiment of a live data processing method according to another embodiment of the present invention is shown, which may specifically include the following steps:
Step 202: creating application mapping architecture data and application source architecture data in advance, through the suspended target object, when the target application is initialized.
The system is monitored in advance: a creating function of the system, such as a create function, is watched, and when the creating function is called to create a target object, the target object is suspended. That is, a first hook program is set when the target object is initialized, so that the hook sits at the source of the target object. The target object is an important system component object that performs the operations required by various applications.
Because the creation and operation of some applications in the system require the target object's participation, setting a hook at the target object's source allows calls to the target object to be intercepted and the required information determined. In this embodiment the application must call the target object to execute, so the first hook program can intercept the application information at the time of the call; this application information indicates the application source architecture data for creating the target application. When the target application is created, it must call a series of corresponding interfaces to obtain data, create the function objects it needs, and so on; through these interface calls and function objects, the application source architecture data of the target application can be determined, allowing the target application to start and run.
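The "monitor the creating function, suspend the target object" idea can be modeled as wrapping a factory so that every object it produces passes through a hook at its source. All names here are illustrative stand-ins; the patent's actual target is object creation (e.g. COM components) inside the operating system:

```python
_hooks = []

def install_hook(hook):
    # Register a "first hook program" to run at the object's source.
    _hooks.append(hook)

def create_target_object(factory, *args, **kwargs):
    # Stand-in for the system's create function: every created object is
    # handed to the installed hooks before the caller ever sees it.
    obj = factory(*args, **kwargs)
    for hook in _hooks:
        hook(obj)
    return obj
```

In the real mechanism the hook would record the calling application's information rather than the object itself.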
Accordingly, in this embodiment the application mapping architecture data of the target application is built from the application information, and the application source architecture data is then built. That is, the required operations are determined from the application information; a piece of mapping information is built first and then the source information, and both point to the same content address. The mapping architecture data thus forms a shell identical to the application source architecture data, while the substantive content is still provided by the source data. This consumes only a very small amount of memory, yet makes the architecture of the target application known and allows the return of the data the application requires to be controlled.
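The shell-around-the-source idea reads like a lightweight proxy: the mapping object stores only a reference to the source object's content, so it adds almost no memory yet sees everything the source holds. A minimal sketch with invented names:

```python
class SourceObject:
    # Holds the substantive content (stands in for the source memory address).
    def __init__(self, content):
        self.content = content

class MappingObject:
    # The "shell": no copy of the content, just a pointer to the source.
    def __init__(self, source):
        self._source = source

    @property
    def content(self):
        return self._source.content   # same address, same data
```

Any change made through the source is visible through the shell, which is what lets the mapping architecture mediate the application's data without duplicating it.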
In embodiments of the present invention, hook programs may be set on various target objects of various systems. In this embodiment, the Component Object Model (COM) is taken as the example target object: when a COM component is initialized by the creating function, a first hook program can be set for it, placing a hook at the COM component's source and taking over the component. COM is Microsoft's software component technology, an object-oriented programming model that defines the behavior of objects within a single application or across applications; it is implemented on multiple platforms and is not limited to Windows operating systems. For example, a game client may use the 3D engine of DirectX (DX), a multimedia programming interface created by Microsoft, implemented in C++, and compliant with COM. When the game client starts and runs and 3D-engine operations are involved, the COM component must be called, and the application information of those calls can be captured by the first hook program set on the COM component; the source architecture data of the game client's 3D engine, i.e., the interfaces, function objects, and so on required for the engine to run, is thereby determined.
Step 204: determining at least one target interface according to the application mapping architecture data.
When the application mapping architecture data is built, the interfaces the target application must call, their functions, and other information become known, so at least one required target interface can be determined; these include the interfaces for drawing, rendering, and displaying images. The target interfaces include a first interface for rendering image data and a second interface for outputting image data.
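Selecting target interfaces from the mapping architecture data can be pictured as filtering an interface table by role. The interface names below are illustrative stand-ins (loosely DirectX-flavored), not taken from the patent:

```python
INTERFACE_TABLE = {
    "Present":   "render",   # first interface: draws/renders frame data
    "GetBuffer": "output",   # second interface: exposes rendered frames
    "PlaySound": "other",    # irrelevant to the live-broadcast path
}

def select_target_interfaces(table, wanted=("render", "output")):
    # Pick only the interfaces needed for drawing, rendering, and output.
    return sorted(name for name, role in table.items() if role in wanted)
```

In the real system the table would be populated from the interface information recorded by the hook programs, not written by hand.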
Step 206: obtaining live return information during live broadcast based on the target application.
Step 208: calling a first interface according to the application mapping architecture data, drawing an information feedback image corresponding to the live return information, and obtaining the application interface drawn based on the application source architecture data.
Step 210: superimposing and rendering the information feedback image and the application interface to obtain a display image.
After the target application runs, a full-screen mode may be used: in full-screen mode the display interface of the target application, i.e., its interface image, covers the entire window, and the focus is inside the target application. From the system bottom layer and screen display perspective, every interface and window shown by the system can be regarded as a frame of image. In this embodiment, the information feedback image corresponding to the live return information is drawn according to the application mapping architecture data, following the development principles of the target application.
Thus, after the application interface of the target application has been drawn from the application source architecture data and the live return information has been obtained, a pre-injected hook program can call the first interface to draw, render, and display the information feedback image corresponding to the live return information, following the image drawing, rendering, and display procedures corresponding to the application mapping architecture data. The application interface created from the application source architecture data is obtained, and the information feedback image and the application interface are superimposed and rendered into a display image: an image carrying live return information such as bullet-screen data and gifts. Drawing, superimposing, and rendering each frame in this way yields a dynamic picture on the screen.
For example, when a user plays a client game, the game client usually runs full screen while the game is broadcast live. Live data can then be generated from the game client's application interface created from the application source architecture data; the target interface is called according to the application mapping architecture data, the information feedback image corresponding to the live return information is drawn, and the feedback image is displayed superimposed on the application interface. To the user, data such as live bullet-screen comments, gifts, and private messages appear over the full-screen game, improving user experience and live-broadcast flexibility.
Step 212, calling a first interface according to the application mapping architecture data, drawing a display image in which an information feedback image corresponding to the live broadcast return information is superimposed on the application interface of the target application, and rendering the display image.
As an alternative to the above manner of drawing the application interface and the information feedback image separately and then displaying them in superposition, in actual processing the first interface may be called according to the application mapping architecture data to directly draw the superimposed display image, that is, display image data in which the application interface and the information feedback image are already combined. In the subsequent live broadcast, broadcasting is then based on these superimposed display images, so live data already carrying the barrage, gift, and similar return data can be returned directly, and the service side does not need to superimpose barrage or other data onto the live data.
And 214, sequentially displaying the display images on a terminal screen according to the time information of the display images to form a live broadcast interface of the target application.
Time information of each display image, namely its timestamp, is acquired; the display images are combined according to their timestamps, and each frame of display image is displayed in sequence on the terminal screen, forming the live broadcast interface of the target application.
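The timestamp ordering in step 214 can be sketched as follows. The frame records and field names here are assumptions for illustration, not the patent's actual data layout.

```python
# Illustrative sketch: display images carry time information (a timestamp)
# and are shown in ascending timestamp order to form the live interface.

def order_frames(frames):
    """Return display images sorted by their time information."""
    return [f["image"] for f in sorted(frames, key=lambda f: f["timestamp"])]

frames = [
    {"timestamp": 40, "image": "frame_c"},
    {"timestamp": 0, "image": "frame_a"},
    {"timestamp": 20, "image": "frame_b"},
]
sequence = order_frames(frames)
```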
Step 216, calling a second interface according to the application mapping architecture data to obtain the display image.
Step 218, according to the time information of the display images, the display images are formed into the live video data.
And step 220, uploading the live video data.
The live broadcast terminal directly generates live broadcast images carrying data such as barrage data and gifts, so there is no need to synthesize the live broadcast images at the server. The second interface can therefore be called according to the application mapping architecture data to obtain the display images, which are synthesized according to their time information to generate the corresponding live video data, and the live video data is uploaded.
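Steps 216 to 220 can be sketched as assembling the timestamped display images into an ordered payload for upload. The dict-based container below is purely illustrative; a real implementation would encode the frames into a video stream.

```python
# Minimal sketch: the second interface yields composited display images,
# which are assembled in timestamp order into live video data for upload.

def build_live_video(display_frames):
    """Assemble timestamped display images into an ordered video payload."""
    ordered = sorted(display_frames, key=lambda f: f["timestamp"])
    duration = ordered[-1]["timestamp"] - ordered[0]["timestamp"] if ordered else 0
    return {
        "frame_count": len(ordered),
        "duration_ms": duration,
        "frames": [f["image"] for f in ordered],
    }

video = build_live_video([
    {"timestamp": 33, "image": "f1"},
    {"timestamp": 0, "image": "f0"},
    {"timestamp": 66, "image": "f2"},
])
```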
In this embodiment of the application, the live broadcast return information includes: barrage data, service data, and/or non-public information. The service data is determined by the specific service; for a game service, for example, it is electronic article data such as gifts. The non-public information includes data such as private messages sent to the anchor and paid questions, where a paid question may be question data submitted after purchasing the right to ask. Accordingly, the information feedback image corresponding to the live broadcast return information comprises a public image and a private image: the public image is drawn from the barrage data and the service data, and the private image is drawn from the non-public information.
Therefore, when uploading live data built from the obtained display images, a display image generated by superimposing only the public image on the application interface of the target application can be obtained, producing live data that carries no non-public information. For the manner of drawing the images separately and then superimposing them, as in steps 208 and 210, the display image generated by superimposing the separately drawn application interface and public image can be obtained directly; for the manner of directly drawing the superimposed image, as in step 212, the superimposed image of the application interface and the public image can be drawn, while the private image is drawn separately and then superimposed locally, ensuring that the live data carries no non-public information.
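The public/private split above can be sketched as follows: the local display composites both overlays, while the uploaded frame composites only the public image, so non-public information never leaves the terminal. Overlay items are simplified to tagged strings, and all names are illustrative assumptions.

```python
# Hedged sketch: partition live broadcast return information into the public
# image (barrage, gifts) and the private image (private messages, paid
# questions), and build separate local-display and upload frames.

def split_overlays(return_info):
    """Partition return information into public and private overlay content."""
    public = [i["text"] for i in return_info if i["kind"] in ("barrage", "gift")]
    private = [i["text"] for i in return_info if i["kind"] in ("private", "paid_question")]
    return public, private

def make_frames(app_interface, return_info):
    public, private = split_overlays(return_info)
    local_display = (app_interface, public + private)  # shown on the anchor's screen
    upload_frame = (app_interface, public)             # carried in the live stream
    return local_display, upload_frame

info = [
    {"kind": "barrage", "text": "nice play!"},
    {"kind": "private", "text": "dm: hello"},
]
local, upload = make_frames("game_ui", info)
```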
In this embodiment, image drawing is performed based on the framework principles of the target application, so the framework of the target application, together with information such as the interfaces and functions under that framework, can be determined in advance, enabling image drawing in full-screen mode.
In an alternative embodiment, the architecture of the target application may be determined.
In this embodiment, a game client is taken as the target application. The game client may adopt a DirectX (DX) 3D engine; in the Windows operating system, the 3D engine is accelerated by the hardware Graphics Processing Unit (GPU) and reads and writes memory directly, so the message mechanism can be bypassed.
Referring to fig. 3, a flowchart illustrating the steps of another embodiment of a live data processing method according to the present invention is shown, which may specifically include the following steps:
step 302, monitoring a creating function in the system, and suspending a target object when the creating function creates the target object.
The system is monitored in advance: a creating function of the system, such as a create function, is monitored, and when the creating function is called to create a target object, the target object is suspended; that is, a first hook program is set when the target object is initialized, so that the hook program sits at the source of the target object. The target object is an important component object through which the system performs the operations required by various applications.
In the embodiment of the present invention, hook programs may be set in various target objects of various systems, and in this embodiment, taking a Component Object Model (COM) as an example of a target Object, when a COM Component is initialized by creating a function, a first hook program may be set for the COM Component, so as to set a hook at a source of the COM Component, and implement takeover of the COM Component.
Step 304, intercepting application information when the target object is called to create the target application.
The creation, operation, and so on of some applications in the system all require the participation of the target object, so by setting a hook at the source of the target object, calls to the target object can be intercepted and the required information determined. In this embodiment, the application needs to call the target object to execute, so the application information present when the target object is called can be intercepted by the first hook program; this application information indicates the application source architecture data used to create the target application. When the target application is created, a series of interfaces corresponding to the target object must be called to obtain data, create the required function objects, and so on. Through these interface calls and function objects, the application source architecture data of the target application can be determined, and the target application can be started and run.
For example, the game client employs the 3D engine of DirectX (DX), a multimedia programming interface created by Microsoft, implemented in the C++ programming language and compliant with COM components. When the game client starts and runs and 3D engine operations are involved, the COM component must be called; the application information for these COM calls can be acquired through the first hook program set on the COM component, thereby determining the source architecture data of the game client's 3D engine, that is, the various interfaces, function objects, and so on required for the 3D engine to run.
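The interception described above can be sketched, in highly simplified form, as wrapping the system's creating function so that every call the target application makes through it is recorded before being forwarded. This is not actual COM hooking; all names and the component string are hypothetical.

```python
# Illustrative sketch: a "first hook program" set at the source of the target
# object, modeled as a wrapper around the system's create function that logs
# the application information of each call and then forwards it.

intercepted = []

def create_component(name, *args):
    """Stand-in for the system's creating function (e.g. a COM factory)."""
    return {"component": name, "args": args}

def install_hook(create_fn):
    """Wrap the creating function so calls are logged before being forwarded."""
    def hooked(name, *args):
        intercepted.append({"component": name, "args": args})  # application info
        return create_fn(name, *args)  # still perform the real creation
    return hooked

create_component = install_hook(create_component)
obj = create_component("D3DDevice", "fullscreen")
```

The recorded call information plays the role of the "application information" from which the source architecture data is determined.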
Step 306, when creating an object, creating a mapping object according to the application information and pointing it to the source memory address, and then creating the source object and pointing it to the source memory address.
And 308, respectively setting a hook program on each interface corresponding to each source object.
And step 310, building application mapping architecture data according to the mapping object and the interface call.
And step 312, building application source architecture data according to the source object and the interface call.
In this embodiment, when the target object creates content related to the target application, the required objects can be created according to information such as the interfaces that need to be called. The mapping architecture data builds a shell identical to the application source architecture data, while the substantial content is still provided by the application source architecture data. Therefore, when the application information requires an object to be created, the relevant information of that object is determined, a mapping object is created and pointed to the corresponding source memory address, and the source object is then created and pointed to the same source memory address. That is, the content to be defined is determined from the application information, a mapping object with that definition is defined first, then the originally required source object is defined, and a hook program is set at each interface required by each source object; in this way, a group of hook programs can be set for the target application, realizing an entry-level hooking mechanism. The mapping objects and interface calls build the application mapping architecture data, and the source objects and interface calls build the application source architecture data, yielding a shell identical to the application source architecture data, namely the application mapping architecture data, whose actual definitions and call content correspond to, and can be mapped into, the application source architecture data for processing.
For example, when the 3D engine of the game client calls the COM component, the call information is intercepted. Suppose the call information indicates that a function A is created, that function A calls an interface B and a function C, and that function C calls interfaces D and E. Mapping functions A' and C' can then be created, hook programs set at interfaces B, D, and E, and the mapping-function correspondence established; afterwards, the source functions A and C, together with their correspondence to interfaces B, D, and E, are created. This yields a shell identical to the source architecture data of the game client, namely the mapping architecture data, through which calls can be mapped into the source architecture data.
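The "shell" idea of steps 306 to 312 can be sketched as a mapping object that exposes the same interface as the source object but forwards every call to it, with a hook run at each forwarded interface. Class and method names below are assumptions for illustration only.

```python
# Minimal sketch: the mapping architecture is a shell whose substance is
# provided by the source architecture; hooks at the mapped interfaces observe
# every call before it is forwarded.

calls = []  # records interface calls observed by the hooks

class SourceEngine:
    """Stands in for the application's real (source) architecture."""
    def draw(self, what):
        return f"drawn:{what}"

class MappingEngine:
    """Shell with the same interface; content comes from the source object."""
    def __init__(self, source):
        self._source = source  # the mapping points at the source's state

    def draw(self, what):
        calls.append(("draw", what))    # hook set at this interface
        return self._source.draw(what)  # map the call into the source

engine = MappingEngine(SourceEngine())
result = engine.draw("frame0")
```

The caller only ever sees the mapping object, yet every operation is actually performed by the source object, mirroring the shell/source relationship described above.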
Step 314, determining at least one target interface for drawing the display image according to the interface information based on the application mapping architecture data.
When the application mapping architecture data is built, information such as the various interfaces the target application needs to call and the functions of those interfaces can be determined, so at least one required target interface can be identified; the target interfaces include the various interfaces for drawing, rendering, and displaying images. A series of image-related target interfaces can be determined, and a hook program is injected into each, so that the target interfaces can subsequently be called directly through the hook programs. The target interfaces include the first interface and the second interface.
In this way, application information can be intercepted during initialization of the target application, and the application mapping architecture data and application source architecture data of the target application can be built; that is, the architecture of the target application is analyzed, so that the required operations can conveniently be executed based on that architecture.
And step 316, performing live broadcast based on the target application and acquiring live broadcast return information.
And step 318, calling a first interface according to the application mapping architecture data, drawing an information feedback image corresponding to the live broadcast return information, and acquiring an application interface drawn based on the application source framework data.
And step 320, superimposing and rendering the bullet screen image and the application interface to obtain a display image.
After the target application runs, it can adopt a full-screen mode, in which the display interface of the target application, namely the interface image, fills the entire window and the focus remains within the target application. From the perspective of the system's underlying layers and of the screen display, each interface and window displayed in the system can be regarded as a frame of image. In this embodiment, an information feedback image corresponding to the live broadcast return information is drawn, based on the development technical principles of the target application, according to the application mapping architecture data.
Therefore, after the application interface of the target application has been drawn based on the application source framework data and the live broadcast return information has been acquired, a pre-injected hook program can call the first interface to draw, render, and display an information feedback image corresponding to the live broadcast return information, following the image drawing, rendering, and displaying modes corresponding to the application mapping framework data. The application interface created based on the application source framework data is then acquired, and the information feedback image and the application interface are superimposed and rendered to obtain a display image, yielding an image that carries the live broadcast return information such as barrage data and gifts. Drawing, superimposing, and rendering each frame of image in this manner produces a dynamic picture on the screen.
For example, a user playing a client game usually adopts full-screen mode while live-streaming the game. Live data can then be generated based on the application interface of the game client created from the application source framework data; the target interface is called according to the application mapping architecture data to draw an information feedback image corresponding to the live broadcast return information, and the information feedback image and the application interface are displayed in superposition. To the user, data such as live barrages, gifts, and private messages appear to be displayed over the full-screen game, improving the user experience and the flexibility of the live broadcast.
And 322, calling a first interface according to the application mapping architecture data, drawing a display image in which an information feedback image corresponding to the live broadcast return information is superimposed on the application interface of the target application, and rendering the display image.
As an alternative to the above manner of drawing the application interface and the information feedback image separately and then displaying them in superposition, in actual processing the first interface may be called according to the application mapping architecture data to directly draw the superimposed display image, that is, display image data in which the application interface and the information feedback image are already combined. In the subsequent live broadcast, broadcasting is then based on these superimposed display images, so live data already carrying the barrage, gift, and similar return data can be returned directly, and the service side does not need to superimpose barrage or other data onto the live data.
And 324, sequentially displaying the display images on a terminal screen according to the time information of the display images to form a live interface of the target application.
Time information of each display image, namely its timestamp, is acquired; the display images are combined according to their timestamps, and each frame of display image is displayed in sequence on the terminal screen, forming the live broadcast interface of the target application.
Step 326, calling a second interface according to the application mapping architecture data to obtain the display image.
And 328, forming the live video data by the display images according to the time information of the display images.
And 330, uploading the live video data.
The live broadcast terminal directly generates live broadcast images carrying data such as barrage data and gifts, so there is no need to synthesize the live broadcast images at the server. The second interface can therefore be called according to the application mapping architecture data to obtain the display images, which are synthesized according to their time information to generate the corresponding live video data, and the live video data is uploaded.
For example, when the game client is initialized through the COM component, the corresponding source architecture data and mapping architecture data are determined, and the target interfaces required for drawing, rendering, and displaying images are determined for the game client. After initialization, the game client can run and enter full-screen mode. While the user plays the game during a live broadcast, for live data such as barrages a corresponding first display image can be drawn by calling the target interface based on the image-drawing principles of the 3D engine, and this first display image is superimposed on the game client's display interface image; alternatively, the superimposed image is drawn directly to replace the game client's display interface image.
In fact, during the running of the target application, the interface image of the target application, namely the second display image, is drawn and displayed in every frame, and the first display image corresponding to the live broadcast return information is likewise drawn in every frame and then superimposed on the second display image for display. Even if the picture appears static from the user's perspective, from the system's perspective a corresponding image is drawn in every frame. Taking the game client as an example, the displayed game interface image is drawn and rendered frame by frame, so the first display image shown on it is also superimposed at the corresponding position after each frame is drawn and rendered.
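The per-frame behavior above can be sketched as a render loop that redraws the application interface and re-superimposes the feedback overlay on every frame, even when the picture looks static. Rendering is reduced to string tagging; all names are illustrative assumptions.

```python
# Sketch: each frame redraws the second display image (the game interface)
# and superimposes the first display image (the feedback overlay) on it.

def render_frame(frame_no, overlay_text):
    app_interface = f"game_frame_{frame_no}"  # second display image, redrawn each frame
    if overlay_text:
        # first display image, re-superimposed at its position every frame
        return f"{app_interface}+overlay({overlay_text})"
    return app_interface

# Three consecutive frames with the same overlay text: the overlay is still
# drawn anew in each frame, matching the system-level view described above.
screen = [render_frame(n, "gg!") for n in range(3)]
```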
In this way, by directly drawing and displaying images at the underlying window level, the system's communication mechanism (such as the Windows message mechanism) can be bypassed, so the display and operation of the live broadcast return information can proceed normally without exiting full screen, improving the user experience while ensuring normal operation for the user.
For simplicity of explanation, the method embodiments are described as a series of acts or combinations, but those skilled in the art will appreciate that the embodiments are not limited by the order of acts described, as some steps may occur in other orders or concurrently with other steps in accordance with the embodiments of the invention. Further, those skilled in the art will appreciate that the embodiments described in the specification are presently preferred and that no particular act is required to implement the invention.
On the basis of the above embodiment, the present embodiment further provides a live data processing apparatus.
Referring to fig. 4, a block diagram of a live data processing apparatus according to an embodiment of the present invention is shown, which may specifically include the following modules:
an obtaining module 402, configured to obtain live broadcast return information in a live broadcast process based on a target application, where the live broadcast return information is used to feed back information other than video data, and the live broadcast return information includes bullet screen data.
And a rendering and displaying module 404, configured to render a display image in which the application interface of the target application and the live broadcast return information are superimposed, and display a live broadcast interface according to the display image.
And the live broadcast returning module 406 is used for generating live broadcast video data with barrage data according to the live broadcast interface and uploading the live broadcast video data.
In the process of live broadcasting based on the target application, live broadcast return information is acquired, where the live broadcast return information is used to feed back information other than the live video data and includes bullet screen data. A display image in which the application interface of the target application and the live broadcast return information are superimposed is rendered, and the live broadcast interface is displayed according to the display image. Live video data carrying the bullet screen data is then generated according to the live broadcast interface and uploaded. The live video data with data such as the bullet screen is thus generated directly at the live broadcast terminal, without synthesizing the live video and live broadcast return information such as the bullet screen at the server end, reducing time delay.
Referring to fig. 5, a block diagram illustrating a structure of an embodiment of a live data processing apparatus according to another embodiment of the present invention may specifically include the following modules:
an initialization module 408, configured to create application mapping architecture data and application source architecture data when initializing a target application in advance through a suspended target object; determining at least one target interface according to the application mapping architecture data.
An obtaining module 402, configured to obtain live broadcast return information in a live broadcast process based on a target application, where the live broadcast return information is used to feed back information other than video data, and the live broadcast return information includes bullet screen data.
And a rendering and displaying module 404, configured to render a display image in which the application interface of the target application and the live broadcast return information are superimposed, and display a live broadcast interface according to the display image.
And the live broadcast returning module 406 is used for generating live broadcast video data with barrage data according to the live broadcast interface and uploading the live broadcast video data.
In an alternative embodiment, the rendering and displaying module 404 includes: the rendering submodule 4042 is configured to call a first interface according to the application mapping architecture data, draw an information feedback image corresponding to the live broadcast return information, and acquire an application interface drawn based on the application source framework data; superposing and rendering the information feedback image and the application interface to obtain a display image; wherein the application mapping architecture data is determined from application source framework data of the target application.
In another alternative embodiment, the rendering and displaying module 404 includes: the drawing submodule 4044 is configured to invoke a first interface according to application mapping architecture data, draw a display image in which an information feedback image corresponding to the live broadcast return information is superimposed on an application interface of the target application, and render the display image.
The drawing display module 404 includes: and the display sub-module 4046 is configured to sequentially display the display images on the terminal screen according to the time information of the display images, so as to form a live broadcast interface of the target application.
The live broadcast returning module 406 includes:
the obtaining sub-module 4062 is configured to call a second interface according to the application mapping architecture data, and obtain the display image.
And the live broadcast generation submodule 4064 is configured to construct the live broadcast video data from the display images according to the time information of the display images.
Wherein the live broadcast return information further comprises: service data and/or non-public information. The information feedback image corresponding to the live broadcast return information comprises a public image and a private image, wherein the public image is drawn from the barrage data and the service data, and the private image is drawn from the non-public information.
The obtaining sub-module 4062 is configured to obtain a display image generated by superimposing the application interface of the target application and the public image.
The initialization module 408 includes:
the suspend submodule 4082 is configured to monitor a function created in the system, and suspend a target object when the creating function creates the target object.
The architecture building submodule 4084 is configured to intercept application information when the target object initializes the target application, determine mapping objects and interface calls according to the application information to build the application mapping architecture data, and determine source objects and interface calls according to the application information to build the application source architecture data.
The architecture building submodule 4084 is further configured to, when creating an object, create a mapping object according to the application information and point to a source memory address, and then create a source object and point to the source memory address; and when the interfaces are called, a group of hook programs are set at the corresponding interfaces according to the application information, and the interface information of each interface is determined.
The initialization module 408 further includes: the interface analysis sub-module 4086 is configured to determine, based on the application mapping architecture data, at least one target interface for drawing a display image according to the interface information.
In this way, by directly drawing and displaying images at the underlying window level, the system's communication mechanism (such as the Windows message mechanism) can be bypassed, so the display and operation of the live broadcast return information can proceed normally without exiting full screen, improving the user experience while ensuring normal operation for the user.
For the device embodiment, since it is basically similar to the method embodiment, the description is simple, and for the relevant points, refer to the partial description of the method embodiment.
The algorithms and displays presented herein are not inherently related to any particular computer, virtual machine, or other apparatus. Various general purpose systems may also be used with the teachings herein. The required structure for constructing such a system will be apparent from the description above. Moreover, the present invention is not directed to any particular programming language. It is appreciated that a variety of programming languages may be used to implement the teachings of the present invention as described herein, and any descriptions of specific languages are provided above to disclose the best mode of the invention.
In the description provided herein, numerous specific details are set forth. It is understood, however, that embodiments of the invention may be practiced without these specific details. In some instances, well-known methods, structures and techniques have not been shown in detail in order not to obscure an understanding of this description.
Similarly, it should be appreciated that in the foregoing description of exemplary embodiments of the invention, various features of the invention are sometimes grouped together in a single embodiment, figure, or description thereof for the purpose of streamlining the disclosure and aiding in the understanding of one or more of the various inventive aspects. However, the disclosed method should not be interpreted as reflecting an intention that: that the invention as claimed requires more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive aspects lie in less than all features of a single foregoing disclosed embodiment. Thus, the claims following the detailed description are hereby expressly incorporated into this detailed description, with each claim standing on its own as a separate embodiment of this invention.
Those skilled in the art will appreciate that the modules in the device in an embodiment may be adaptively changed and disposed in one or more devices different from the embodiment. The modules or units or components of the embodiments may be combined into one module or unit or component, and furthermore they may be divided into a plurality of sub-modules or sub-units or sub-components. All of the features disclosed in this specification (including any accompanying claims, abstract and drawings), and all of the processes or elements of any method or apparatus so disclosed, may be combined in any combination, except combinations where at least some of such features and/or processes or elements are mutually exclusive. Each feature disclosed in this specification (including any accompanying claims, abstract and drawings) may be replaced by alternative features serving the same, equivalent or similar purpose, unless expressly stated otherwise.
Furthermore, those skilled in the art will appreciate that while some embodiments described herein include some features included in other embodiments, rather than other features, combinations of features of different embodiments are meant to be within the scope of the invention and form different embodiments. For example, in the following claims, any of the claimed embodiments may be used in any combination.
The various component embodiments of the invention may be implemented in hardware, or in software modules running on one or more processors, or in a combination thereof. It will be appreciated by those skilled in the art that a microprocessor or Digital Signal Processor (DSP) may be used in practice to implement some or all of the functions of some or all of the components in a live data processing method and apparatus device according to embodiments of the present invention. The present invention may also be embodied as apparatus or device programs (e.g., computer programs and computer program products) for performing a portion or all of the methods described herein. Such programs implementing the present invention may be stored on computer-readable media or may be in the form of one or more signals. Such a signal may be downloaded from an internet website or provided on a carrier signal or in any other form.
It should be noted that the above-mentioned embodiments illustrate rather than limit the invention, and that those skilled in the art will be able to design alternative embodiments without departing from the scope of the appended claims. In the claims, any reference signs placed between parentheses shall not be construed as limiting the claim. The word "comprising" does not exclude the presence of elements or steps not listed in a claim. The word "a" or "an" preceding an element does not exclude the presence of a plurality of such elements. The invention may be implemented by means of hardware comprising several distinct elements, and by means of a suitably programmed computer. In the unit claims enumerating several means, several of these means may be embodied by one and the same item of hardware. The usage of the words first, second and third, etcetera do not indicate any ordering. These words may be interpreted as names.
The invention discloses A1, a live broadcast data processing method, which comprises the following steps:
in the live broadcast process based on the target application, live broadcast return information is obtained, wherein the live broadcast return information is used for feeding back information except live broadcast video data, and the live broadcast return information comprises bullet screen data;
rendering a display image obtained by superposing an application interface of the target application and the live broadcast return information, and displaying a live broadcast interface according to the display image;
and generating live broadcast video data with bullet screen data according to the live broadcast interface, and uploading the live broadcast video data.
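The three steps of method A1 — obtaining the live broadcast return information, superposing it on the application interface, and generating the uploadable video — can be sketched as follows. All function names and data shapes below are illustrative assumptions for clarity, not the patented implementation.

```python
# Illustrative sketch of method A1 (names and data shapes are assumptions,
# not the patent's actual implementation).

def obtain_live_return_info(server_feed):
    """Step 1: collect feedback other than the live video data itself,
    e.g. bullet screen (barrage) comments."""
    return [msg for msg in server_feed if msg["type"] == "bullet_screen"]

def render_display_image(app_frame, return_info):
    """Step 2: superpose the live return information on the application
    interface to obtain the display image."""
    return {"frame": app_frame, "overlay": [m["text"] for m in return_info]}

def generate_live_video(display_images):
    """Step 3: assemble the display images, ordered by time information,
    into live video data that already carries the bullet screen data."""
    return sorted(display_images, key=lambda img: img["frame"]["t"])

feed = [{"type": "bullet_screen", "text": "hello"},
        {"type": "heartbeat"}]
info = obtain_live_return_info(feed)
frame = {"t": 0, "pixels": "..."}
display = render_display_image(frame, info)
video = generate_live_video([display])
print(video[0]["overlay"])  # ['hello']
```

Because the bullet screen is composited into each display image before the video is generated, the uploaded stream carries the barrage without a separate overlay channel.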
A2, the method as in A1, wherein rendering a display image obtained by superposing the application interface of the target application and the live broadcast return information comprises:
calling a first interface according to application mapping architecture data, drawing an information feedback image corresponding to the live broadcast return information, and acquiring an application interface drawn based on application source framework data;
superposing and rendering the information feedback image and the application interface to obtain a display image;
wherein the application mapping architecture data is determined from application source framework data of the target application.
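In variant A2 the information feedback image is drawn separately and then composited with the application interface. A minimal per-pixel compositing sketch, assuming frames are flat lists of grayscale values (a deliberate simplification of the patent's interface-level rendering):

```python
def superpose(app_interface, feedback_image, alpha=0.5):
    """Blend the information-feedback image over the application interface
    pixel by pixel (simple alpha-compositing sketch; real implementations
    would operate on GPU surfaces via the drawing interface)."""
    return [round((1 - alpha) * a + alpha * f, 2)
            for a, f in zip(app_interface, feedback_image)]

app = [100, 100, 100]   # application interface pixels
fb = [200, 0, 200]      # feedback image (e.g. rendered bullet screen)
print(superpose(app, fb))  # [150.0, 50.0, 150.0]
```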
A3, the method as in A1, wherein rendering a display image obtained by superposing the application interface of the target application and the live broadcast return information comprises:
and calling a first interface according to application mapping architecture data, drawing a display image which is superposed with an information feedback image corresponding to the live broadcast return information on an application interface of the target application, and rendering the display image.
A4, the method of A1, showing a live interface according to the display image, comprising:
and sequentially displaying the display images on a terminal screen according to the time information of the display images to form a live broadcast interface of the target application.
A5, the method of A4, generating live video data with barrage data according to the live interface, including:
calling a second interface according to the application mapping architecture data to acquire the display image;
and according to the time information of the display images, forming the live video data by the display images.
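Step A5 assembles the captured display images into live video data ordered by their time information. A sketch of that assembly step, with a timestamp/payload pair standing in for actual frame encoding (the pairing is an assumption for illustration):

```python
def compose_live_video(display_images):
    """Order the captured display images by their time information and
    pair each with its timestamp, as a stand-in for video encoding."""
    frames = sorted(display_images, key=lambda img: img["time"])
    return [(f["time"], f["data"]) for f in frames]

images = [{"time": 2, "data": "frame-b"},
          {"time": 1, "data": "frame-a"}]
print(compose_live_video(images))  # [(1, 'frame-a'), (2, 'frame-b')]
```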
A6, the method as in any of A1-A5, the live return information further comprising: service data and/or non-public information.
A7, the method in A6, wherein the information feedback image corresponding to the live broadcast return information includes a public image and a private image, the public image is drawn according to bullet screen data and service data, and the private image is drawn according to the non-public information.
A8, the method of A7, obtaining the display image, comprising:
and acquiring a display image generated by overlapping the application interface of the target application and the public image.
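Per A7/A8, the feedback image is split into a public layer (bullet screen and service data) and a private layer (non-public information), and only the public layer is composited into the uploaded stream. A sketch of that filtering, with layer names invented for illustration:

```python
def compose_for_upload(app_interface, layers):
    """Superpose only the public layers (bullet screen + service data)
    on the application interface; private layers (non-public information)
    remain visible locally but are excluded from the uploaded video."""
    public = [layer["image"] for layer in layers if not layer["private"]]
    return {"base": app_interface, "overlays": public}

layers = [{"image": "barrage", "private": False},
          {"image": "earnings-panel", "private": True}]
out = compose_for_upload("game-frame", layers)
print(out["overlays"])  # ['barrage']
```

This is the mechanism by which the generated live video data "does not carry non-public information" while the anchor still sees it on the local live interface.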
A9, the method of any one of A2-A5, further comprising:
establishing application mapping architecture data and application source architecture data in the initialized target application through the suspended target object in advance;
determining at least one target interface according to the application mapping architecture data.
A10, the method as in A9, wherein creating application mapping architecture data and application source architecture data when initializing the target application in advance through the suspended target object comprises:
monitoring a creation function in the system, and suspending a target object when the creation function creates the target object;
when the target object initializes the target application, determining a mapping object and interface call according to the application information, building application mapping architecture data, determining a source object and interface call according to the application information, and building application source architecture data.
A11, the method of a10, further comprising:
when an object is created, creating a mapping object according to the application information and pointing it to a source memory address, and then creating a source object pointing to the same source memory address;
and, when the interfaces are called, setting a group of hook programs at the corresponding interfaces according to the application information, and determining the interface information of each interface.
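The patent hooks native interface calls and pairs each source object with a mapping object. As a language-level analogy only (the real mechanism operates on system creation functions and graphics interfaces, not Python functions), a "hook" can wrap a creation function, record interface information, and delegate to the original:

```python
# Analogy sketch: wrap a creation function with a hook that records
# interface information and keeps a mapping object alongside the
# source object. All names here are invented for illustration.

interface_info = {}

def make_surface(name):
    """Stands in for the system's creation function."""
    return {"name": name}

_original_make_surface = make_surface  # keep a handle to the source function

def hooked_make_surface(name):
    """Hook installed at the creation interface: record the call's
    interface information, then delegate to the original function."""
    interface_info["make_surface"] = {"last_arg": name}
    source = _original_make_surface(name)  # source object
    mapping = dict(source)                 # mapping object mirroring it
    return mapping

make_surface = hooked_make_surface         # install the hook

obj = make_surface("display-image")
print(interface_info["make_surface"]["last_arg"])  # display-image
```

The recorded `interface_info` corresponds to the per-interface information from which, per A12, the target interfaces used for drawing the display image are later selected.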
A12, the method of A11, determining at least one target interface from the application mapping architecture data, comprising:
determining at least one target interface for rendering a display image in dependence on the interface information based on the application mapping architecture data.
The invention also discloses B13, a live broadcast data processing device, comprising:
the system comprises an acquisition module, a display module and a display module, wherein the acquisition module is used for acquiring live broadcast return information in the process of live broadcast based on a target application, the live broadcast return information is used for feeding back information except video data, and the live broadcast return information comprises barrage data;
the rendering display module is used for rendering a display image formed by superposing the application interface of the target application and the live broadcast return information and displaying the live broadcast interface according to the display image;
and the live broadcast returning module is used for generating live broadcast video data with barrage data according to the live broadcast interface and uploading the live broadcast video data.
B14, the apparatus of B13, the rendering display module comprising:
the rendering submodule is used for calling a first interface according to application mapping architecture data, drawing an information feedback image corresponding to the live broadcast return information and acquiring an application interface drawn based on application source framework data; superposing and rendering the information feedback image and the application interface to obtain a display image; wherein the application mapping architecture data is determined from application source framework data of the target application.
B15, the apparatus of B13, the rendering display module comprising:
and the drawing submodule is used for calling a first interface according to application mapping architecture data, drawing a display image which is formed by superposing an information feedback image corresponding to the live broadcast return information on an application interface of the target application, and rendering the display image.
B16, the apparatus as described in B13, the rendering display module comprising:
and the display submodule is used for sequentially displaying the display images on a terminal screen according to the time information of the display images to form a live broadcast interface of the target application.
B17, the apparatus of B16, the live return module comprising:
the obtaining submodule is used for calling a second interface according to the application mapping architecture data to obtain the display image;
and the live broadcast generation submodule is used for forming the live broadcast video data by the display images according to the time information of the display images.
B18, the apparatus as in any of B13-B17, the live return information further comprising: service data and/or non-public information.
B19, the device as in B18, wherein the information feedback image corresponding to the live broadcast return information includes a public image and a private image, the public image is drawn according to bullet screen data and service data, and the private image is drawn according to the non-public information.
B20, the device as in B18,
and the acquisition submodule is used for acquiring a display image generated by overlapping the application interface of the target application and the public image.
B21, the apparatus of any of B14-B17, further comprising:
the initialization module is used for creating application mapping architecture data and application source architecture data when initializing the target application in advance through the suspended target object; determining at least one target interface according to the application mapping architecture data.
B22, the apparatus of B21, the initialization module comprising:
the suspension submodule is used for monitoring a creation function in the system, and suspending a target object when the creation function creates the target object;
and the architecture building submodule is used for, when the target object initializes the target application, determining a mapping object and interface call according to the application information, building application mapping architecture data, determining a source object and interface call according to the application information, and building application source architecture data.
B23, device according to B22,
the framework building submodule is also used for creating a mapping object according to the application information and pointing it to a source memory address when the object is created, and then creating a source object pointing to the same source memory address; and, when the interfaces are called, setting a group of hook programs at the corresponding interfaces according to the application information and determining the interface information of each interface.
B24, the apparatus as in B23, the initialization module further comprising:
and the interface analysis submodule is used for determining at least one target interface for drawing a display image according to the interface information based on the application mapping architecture data.

Claims (16)

1. A live data processing method includes:
in the live broadcast process based on the target application, live broadcast return information is obtained, wherein the live broadcast return information is used for feeding back information except live broadcast video data, and the live broadcast return information comprises bullet screen data; the live broadcast return information further includes: business data and/or non-public information; the method is applied to a main broadcasting end in a live broadcasting process;
rendering a display image obtained by superposing an application interface of the target application and the live broadcast return information, and displaying a live broadcast interface according to the display image; rendering a display image in which an application interface of the target application and live broadcast return information are superposed, including: calling a first interface according to application mapping architecture data, drawing a display image which is superposed with an information feedback image corresponding to the live broadcast return information on an application interface of the target application, and rendering the display image; the information feedback image corresponding to the live broadcast return information comprises a public image and a private image, wherein the public image is drawn according to bullet screen data and service data, and the private image is drawn according to the non-public information;
according to the live broadcast interface, live broadcast video data with barrage data are generated, and according to the live broadcast interface, the generation of the live broadcast video data with the barrage data comprises the following steps: and acquiring a display image generated by overlapping an application interface of the target application and the public image, generating live broadcast video data without carrying non-public information, and uploading the live broadcast video data.
2. The method of claim 1, wherein rendering the display image of the application interface and the live return information overlay of the target application comprises:
calling a first interface according to application mapping architecture data, drawing an information feedback image corresponding to the live broadcast return information, and acquiring an application interface drawn based on application source framework data;
superposing and rendering the information feedback image and the application interface to obtain a display image;
wherein the application mapping architecture data is determined from application source framework data of the target application.
3. The method of claim 1, wherein presenting a live interface from the display image comprises:
and sequentially displaying the display images on a terminal screen according to the time information of the display images to form a live broadcast interface of the target application.
4. The method of claim 3, wherein generating live video data with barrage data in accordance with the live interface comprises:
calling a second interface according to the application mapping architecture data to acquire the display image;
and according to the time information of the display images, forming the live video data by the display images.
5. The method of any of claims 1-4, further comprising:
creating application mapping architecture data and application source architecture data when initializing the target application in advance through the suspended target object;
determining at least one target interface according to the application mapping architecture data.
6. The method of claim 5, wherein creating application mapping architecture data and application source architecture data in advance through the suspended target object upon initializing the target application comprises:
monitoring a creation function in the system, and suspending a target object when the creation function creates the target object;
when the target object initializes the target application, determining a mapping object and interface call according to the application information, building application mapping architecture data, determining a source object and interface call according to the application information, and building application source architecture data.
7. The method of claim 6, further comprising:
when an object is created, creating a mapping object according to the application information and pointing it to a source memory address, and then creating a source object pointing to the same source memory address;
and, when the interfaces are called, setting a group of hook programs at the corresponding interfaces according to the application information, and determining the interface information of each interface.
8. The method of claim 7, wherein determining at least one target interface from the application mapping architecture data comprises:
determining at least one target interface for rendering a display image in dependence on the interface information based on the application mapping architecture data.
9. A live data processing apparatus comprising:
the system comprises an acquisition module, a display module and a display module, wherein the acquisition module is used for acquiring live broadcast return information in the process of live broadcast based on a target application, the live broadcast return information is used for feeding back information except video data, and the live broadcast return information comprises barrage data; the live broadcast return information further includes: business data and/or non-public information; the device is applied to a main broadcasting end in a live broadcasting process;
the rendering display module is used for rendering a display image formed by superposing the application interface of the target application and the live broadcast return information and displaying the live broadcast interface according to the display image; the rendering display module comprises: the rendering submodule is used for calling a first interface according to application mapping architecture data, drawing a display image obtained by superimposing an information feedback image corresponding to the live broadcast return information on an application interface of the target application, and rendering the display image; the information feedback image corresponding to the live broadcast return information comprises a public image and a private image, wherein the public image is drawn according to bullet screen data and service data, and the private image is drawn according to the non-public information;
the live broadcast return module is used for generating live broadcast video data with barrage data according to the live broadcast interface, and the acquisition submodule is used for acquiring a display image generated by overlapping an application interface of the target application and a public image; and generating live video data which does not carry non-public information, and uploading the live video data.
10. The apparatus of claim 9, wherein the rendering display module comprises:
the rendering submodule is used for calling a first interface according to application mapping architecture data, drawing an information feedback image corresponding to the live broadcast return information and acquiring an application interface drawn based on application source framework data; superposing and rendering the information feedback image and the application interface to obtain a display image; wherein the application mapping architecture data is determined from application source framework data of the target application.
11. The apparatus of claim 10, wherein the rendering display module comprises:
and the display submodule is used for sequentially displaying the display images on a terminal screen according to the time information of the display images to form a live broadcast interface of the target application.
12. The apparatus of claim 11, wherein the live return module comprises:
the obtaining submodule is used for calling a second interface according to the application mapping architecture data to obtain the display image;
and the live broadcast generation submodule is used for forming the live broadcast video data by the display images according to the time information of the display images.
13. The apparatus of any of claims 9-12, further comprising:
the initialization module is used for creating application mapping architecture data and application source architecture data when initializing the target application in advance through the suspended target object; determining at least one target interface according to the application mapping architecture data.
14. The apparatus of claim 13, wherein the initialization module comprises:
the suspension submodule is used for monitoring a creation function in the system, and suspending a target object when the creation function creates the target object;
and the architecture building submodule is used for, when the target object initializes the target application, determining a mapping object and interface call according to the application information, building application mapping architecture data, determining a source object and interface call according to the application information, and building application source architecture data.
15. The apparatus of claim 14,
the framework building submodule is also used for creating a mapping object according to the application information and pointing it to a source memory address when the object is created, and then creating a source object pointing to the same source memory address; and, when the interfaces are called, setting a group of hook programs at the corresponding interfaces according to the application information and determining the interface information of each interface.
16. The apparatus of claim 15, wherein the initialization module further comprises:
and the interface analysis submodule is used for determining at least one target interface for drawing a display image according to the interface information based on the application mapping architecture data.
CN201611229439.7A 2016-12-27 2016-12-27 Live broadcast data processing method and device Active CN106658145B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201611229439.7A CN106658145B (en) 2016-12-27 2016-12-27 Live broadcast data processing method and device
PCT/CN2017/118839 WO2018121556A1 (en) 2016-12-27 2017-12-27 Live broadcast data processing method, apparatus, program and medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201611229439.7A CN106658145B (en) 2016-12-27 2016-12-27 Live broadcast data processing method and device

Publications (2)

Publication Number Publication Date
CN106658145A CN106658145A (en) 2017-05-10
CN106658145B true CN106658145B (en) 2020-07-03

Family

ID=58831562

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201611229439.7A Active CN106658145B (en) 2016-12-27 2016-12-27 Live broadcast data processing method and device

Country Status (2)

Country Link
CN (1) CN106658145B (en)
WO (1) WO2018121556A1 (en)

Families Citing this family (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106658145B (en) * 2016-12-27 2020-07-03 北京奇虎科技有限公司 Live broadcast data processing method and device
CN107734353B (en) * 2017-10-09 2020-08-04 武汉斗鱼网络科技有限公司 Method and device for recording barrage video, readable storage medium and equipment
CN107911708B (en) * 2017-11-09 2022-04-05 腾讯数码(天津)有限公司 Barrage display method, live broadcast method and related devices
CN108289234B (en) * 2018-01-05 2021-03-16 武汉斗鱼网络科技有限公司 Virtual gift special effect animation display method, device and equipment
CN108737879A (en) * 2018-04-04 2018-11-02 北京潘达互娱科技有限公司 A kind of present column display methods, device, electronic equipment and storage medium
CN110060135A (en) * 2019-04-26 2019-07-26 广州虎牙信息科技有限公司 Commodity transaction processing method, device, server and medium based on live streaming platform
CN110740346B (en) * 2019-10-23 2022-04-22 北京达佳互联信息技术有限公司 Video data processing method, device, server, terminal and storage medium
CN113271502B (en) * 2020-02-17 2023-04-28 上海哔哩哔哩科技有限公司 Video barrage-based data display method and device and computer equipment
CN113542846B (en) * 2020-04-21 2022-12-23 上海哔哩哔哩科技有限公司 AR barrage display method and device
CN113840170B (en) * 2020-06-23 2023-06-16 武汉斗鱼网络科技有限公司 Method and device for live broadcast of wheat
CN111726687B (en) * 2020-06-30 2022-12-27 北京百度网讯科技有限公司 Method and apparatus for generating display data
CN111913708B (en) * 2020-08-07 2024-02-27 广州虎牙科技有限公司 Interface display method and device, storage medium and electronic equipment
CN114245148B (en) * 2020-09-09 2023-10-27 腾讯科技(深圳)有限公司 Live interaction method, device, terminal, server and storage medium
CN112423111A (en) * 2020-11-05 2021-02-26 上海哔哩哔哩科技有限公司 Graphic engine and graphic processing method suitable for player
CN112637670B (en) * 2020-12-15 2022-07-29 上海哔哩哔哩科技有限公司 Video generation method and device
CN114765692B (en) * 2021-01-13 2024-01-09 北京字节跳动网络技术有限公司 Live broadcast data processing method, device, equipment and medium
CN115134652A (en) * 2021-03-22 2022-09-30 阿里巴巴新加坡控股有限公司 Video dynamic subtitle generating method and device, electronic equipment and storage medium
CN113347453B (en) * 2021-04-09 2022-11-08 北京润信恒达科技有限公司 Multi-channel live broadcast system, method and equipment
CN113613062B (en) * 2021-07-08 2024-01-23 广州云智达创科技有限公司 Video data processing method, device, equipment and storage medium

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6961382B1 (en) * 1999-02-01 2005-11-01 Samsung Electronics Co., Ltd. Moving picture experts group decoding apparatus and method for caption display
CN1878069A (en) * 2005-06-08 2006-12-13 腾讯科技(深圳)有限公司 Image synthesis processing system and method therefor
CN1989543A (en) * 2003-10-23 2007-06-27 微软公司 Media integration layer
CN105763950A (en) * 2014-12-19 2016-07-13 中兴通讯股份有限公司 Bullet screen display method and system
CN105828160A (en) * 2016-04-01 2016-08-03 腾讯科技(深圳)有限公司 Video play method and apparatus

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN100451956C (en) * 2006-05-24 2009-01-14 北京壁虎科技有限公司 Screen display method based on drawing engine
CN101227421B (en) * 2007-01-16 2012-04-18 沃天醒石(北京)科技有限公司 Instantaneous communication method and apparatus under full screen graphics mode
CN101500125B (en) * 2008-02-03 2011-03-09 突触计算机系统(上海)有限公司 Method and apparatus for providing user interaction during displaying video on customer terminal
US20130198774A1 (en) * 2012-01-30 2013-08-01 Consultants Net Creation Inc. Live broadcasting of dynamically generated content
EP3272126A1 (en) * 2015-03-20 2018-01-24 Twitter, Inc. Live video stream sharing
CN105597321B (en) * 2015-12-18 2020-07-10 武汉斗鱼网络科技有限公司 Bullet screen display method and system in full-screen game state
CN106162230A (en) * 2016-07-28 2016-11-23 北京小米移动软件有限公司 The processing method of live information, device, Zhu Boduan, server and system
CN106658145B (en) * 2016-12-27 2020-07-03 北京奇虎科技有限公司 Live broadcast data processing method and device

Also Published As

Publication number Publication date
CN106658145A (en) 2017-05-10
WO2018121556A1 (en) 2018-07-05

Similar Documents

Publication Publication Date Title
CN106658145B (en) Live broadcast data processing method and device
CN106713968B (en) Live data display method and device
JP5196499B2 (en) Computer network based 3D rendering system
CN108289234B (en) Virtual gift special effect animation display method, device and equipment
WO2022048097A1 (en) Single-frame picture real-time rendering method based on multiple graphics cards
CN113840154B (en) Live broadcast interaction method and system based on virtual gift and computer equipment
CN109286824B (en) Live broadcast user side control method, device, equipment and medium
CN110968962B (en) Three-dimensional display method and system based on cloud rendering at mobile terminal or large screen
CN112243137A (en) Live broadcast interface updating method, device, server and system
CN107343206B (en) Video generation method, device, medium and electronic equipment supporting multi-view viewing
US20120140025A1 (en) Dynamic Modification of Video Content at a Set-Top Box Device
CN111476851B (en) Image processing method, device, electronic equipment and storage medium
CN113141537A (en) Video frame insertion method, device, storage medium and terminal
US20230290043A1 (en) Picture generation method and apparatus, device, and medium
CN114650434A (en) Cloud service-based rendering method and related equipment thereof
CN112689168A (en) Dynamic effect processing method, dynamic effect display method and dynamic effect processing device
CN114095744A (en) Video live broadcast method and device, electronic equipment and readable storage medium
CN110012336A (en) Picture configuration method, terminal and the device at interface is broadcast live
CN113824976A (en) Method and device for displaying approach show in live broadcast room and computer equipment
CN116485966A (en) Video picture rendering method, device, equipment and medium
CN111343485A (en) Method, device, equipment, system and storage medium for displaying virtual gift
CN113301425A (en) Video playing method, video playing device and electronic equipment
CN113840170B (en) Method and device for live broadcast of wheat
CN106445535B (en) Operation processing method and device
CN111327920A (en) Live broadcast-based information interaction method and device, electronic equipment and readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant