CN106713968B - Live data display method and device - Google Patents


Info

Publication number
CN106713968B
Authority
CN
China
Prior art keywords
application
display image
data
live broadcast
target
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201611229470.0A
Other languages
Chinese (zh)
Other versions
CN106713968A (en)
Inventor
葛山
董晶阳
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Qihoo Technology Co Ltd
Original Assignee
Beijing Qihoo Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Qihoo Technology Co Ltd
Priority to CN201611229470.0A
Publication of CN106713968A
Priority to PCT/CN2017/118840 (WO2018121557A1)
Application granted
Publication of CN106713968B
Legal status: Active
Anticipated expiration

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/431 Generation of visual interfaces for content selection or interaction; Content or additional data rendering
    • H04N21/4312 Generation of visual interfaces for content selection or interaction; Content or additional data rendering involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations
    • H04N21/4316 Generation of visual interfaces involving specific graphical features for displaying supplemental content in a region of the screen, e.g. an advertisement in a separate window
    • H04N21/44 Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream, rendering scenes according to MPEG-4 scene graphs
    • H04N21/4402 Processing of video elementary streams involving reformatting operations of video signals for household redistribution, storage or real-time display
    • H04N21/47 End-user applications
    • H04N21/475 End-user interface for inputting end-user data, e.g. personal identification number [PIN], preference data
    • H04N21/4756 End-user interface for inputting end-user data for rating content, e.g. scoring a recommended movie
    • H04N21/478 Supplemental services, e.g. displaying phone caller identification, shopping application
    • H04N21/4788 Supplemental services communicating with other users, e.g. chatting

Abstract

The embodiment of the invention provides a live data display method and device. The method comprises the following steps: in the process of live broadcasting based on a target application, obtaining live broadcast return information, wherein the live broadcast return information is used for feeding back information other than video data and comprises bullet-screen (barrage) data; and drawing a first display image corresponding to the live broadcast return information and superimposing the first display image on a second display image corresponding to the target application for display, wherein the second display image is used for generating live video data. The anchor can therefore conveniently obtain the returned interactive messages and interact with the viewing users, and because the live broadcast return information does not carry live video data, the occupation of bandwidth resources is effectively reduced and the stability of the live broadcast is ensured.

Description

Live data display method and device
Technical Field
The invention relates to the technical field of computers, in particular to a live data display method and a live data display device.
Background
With the development of network and terminal technologies, users can perform various network operations through terminals, such as surfing the Internet, playing games and watching videos, which has given rise to a new multimedia interaction mode: network live broadcasting.
In network live broadcasting, a signal acquisition device captures the live video, the live video is uploaded to a server, and the server distributes the live video to each viewer terminal for playback.
However, during the live broadcast the anchor not only uploads the live video but also downloads the live video with interactive messages added to it, which occupies network bandwidth and can affect the normal uploading of the live video.
Disclosure of Invention
In view of the above problems, the present invention has been made to provide a live data display method and a corresponding live data display apparatus that overcome or at least partially solve the above problems.
According to an aspect of the present invention, a live data display method is provided, which specifically includes: in the process of live broadcasting based on a target application, obtaining live broadcast return information, wherein the live broadcast return information is used for feeding back information other than video data and comprises bullet-screen data; and drawing a first display image corresponding to the live broadcast return information, and superimposing the first display image on a second display image corresponding to the target application for display, wherein the second display image is used for generating live video data.
Optionally, drawing a first display image corresponding to the live broadcast return information, and superimposing the first display image on a second display image corresponding to the target application for display, includes: generating a sub-window and drawing the live broadcast return information in sequence in the sub-window to obtain the first display image; and placing the sub-window on the second display image corresponding to the target application for display.
Optionally, drawing a first display image corresponding to the live broadcast return information, and superimposing the first display image on a second display image corresponding to the target application for display, includes: generating a transparent window and drawing, on the transparent window, a first display image that dynamically shows the live broadcast return information; and superimposing the transparent window on the second display image corresponding to the target application for display.
Optionally, drawing a first display image corresponding to the live broadcast return information includes: calling a target interface according to application mapping architecture data, and drawing the first display image corresponding to the live broadcast return information, wherein the application mapping architecture data is determined according to application source architecture data of the target application.
Optionally, superimposing the first display image on the second display image corresponding to the target application for display includes: acquiring a second display image created based on the application source architecture data, and displaying the first display image and the second display image in a superimposed manner.
Optionally, drawing a first display image corresponding to the live broadcast return information, and superimposing the first display image on a second display image corresponding to the target application for display, includes: calling a target interface according to application mapping architecture data, and drawing superimposed image data of the second display image corresponding to the target application and the first display image corresponding to the live broadcast return information.
Optionally, the method further comprises: creating, in advance through a hooked target object, application mapping architecture data and application source architecture data when the target application is initialized; and determining at least one target interface according to the application mapping architecture data.
Optionally, creating, in advance through the hooked target object, the application mapping architecture data and the application source architecture data when the target application is initialized includes: monitoring a creation function in the system, and hooking a target object when the creation function creates the target object; and when the target object initializes the target application, determining mapping objects and interface calls according to the application information to build the application mapping architecture data, and determining source objects and interface calls according to the application information to build the application source architecture data.
Optionally, the method further comprises: when an object is created, creating a mapping object according to the application information and pointing it to the source memory address, and then creating the source object and pointing it to the same source memory address; and when interfaces are called, setting a group of hook programs at the corresponding interfaces according to the application information, and determining the interface information of each interface.
Optionally, determining at least one target interface according to the application mapping architecture data includes: determining, based on the application mapping architecture data, at least one target interface for drawing a display image according to the interface information.
Optionally, the method further comprises: generating live video data according to the second display image, and uploading the live video data.
Optionally, the live broadcast return information further includes: service data and/or non-public information.
According to another aspect of the present invention, a live data display apparatus is provided, which specifically includes: an acquisition module, configured to acquire live broadcast return information in the process of live broadcasting based on a target application, wherein the live broadcast return information is used for feeding back information other than video data and comprises bullet-screen data; and a drawing and displaying module, configured to draw a first display image corresponding to the live broadcast return information and superimpose the first display image on a second display image corresponding to the target application for display, wherein the second display image is used for generating live video data.
Optionally, the drawing and displaying module includes: a drawing sub-module, configured to generate a sub-window and draw the live broadcast return information in sequence in the sub-window to obtain the first display image; and a display sub-module, configured to place the sub-window on the second display image corresponding to the target application for display.
Optionally, the drawing and displaying module includes: a drawing sub-module, configured to generate a transparent window and draw, on the transparent window, a first display image that dynamically shows the live broadcast return information; and a display sub-module, configured to superimpose the transparent window on the second display image corresponding to the target application for display.
Optionally, the drawing and displaying module includes: a drawing sub-module, configured to call a target interface according to application mapping architecture data and draw the first display image corresponding to the live broadcast return information, wherein the application mapping architecture data is determined according to application source architecture data of the target application.
Optionally, the drawing and displaying module further includes: a display sub-module, configured to acquire a second display image created based on the application source architecture data and display the first display image and the second display image in a superimposed manner.
Optionally, the apparatus further comprises: an initialization module, configured to create, in advance through a hooked target object, application mapping architecture data and application source architecture data when the target application is initialized, and to determine at least one target interface according to the application mapping architecture data.
Optionally, the initialization module includes: a hooking sub-module, configured to monitor a creation function in the system and hook a target object when the creation function creates the target object; and an architecture building sub-module, configured to, when the target object initializes the target application, determine mapping objects and interface calls according to the application information to build the application mapping architecture data, and determine source objects and interface calls according to the application information to build the application source architecture data.
Optionally, the architecture building sub-module is further configured to, when creating an object, create a mapping object according to the application information and point it to the source memory address, and then create the source object and point it to the same source memory address; and, when interfaces are called, set a group of hook programs at the corresponding interfaces according to the application information and determine the interface information of each interface.
Optionally, the initialization module further includes: an interface analysis sub-module, configured to determine, based on the application mapping architecture data, at least one target interface for drawing a display image according to the interface information.
Optionally, the apparatus further comprises: a live broadcast module, configured to generate live video data according to the second display image and upload the live video data.
Optionally, the live broadcast return information further includes: service data and/or non-public information.
When live video data is generated from the second display image of the target application for live broadcasting, live broadcast return information that does not carry video data can be obtained, the live broadcast return information including bullet-screen data; the first display image corresponding to the live broadcast return information is drawn and superimposed on the second display image corresponding to the target application for display. The anchor can thus conveniently obtain the returned interactive messages and interact with the viewing users, and because the live broadcast return information does not carry live video data, the occupation of bandwidth resources is effectively reduced and the stability of the live broadcast is ensured.
The foregoing description is only an overview of the technical solutions of the present invention, and the embodiments of the present invention are described below in order to make the technical means of the present invention more clearly understood and to make the above and other objects, features, and advantages of the present invention more clearly understandable.
Drawings
Various other advantages and benefits will become apparent to those of ordinary skill in the art upon reading the following detailed description of the preferred embodiments. The drawings are only for purposes of illustrating the preferred embodiments and are not to be construed as limiting the invention. Also, like reference numerals are used to refer to like parts throughout the drawings. In the drawings:
FIG. 1 is a flowchart of the steps of an embodiment of a live data display method according to an embodiment of the present invention;
FIG. 2 is a flowchart of the steps of a live data display method according to another embodiment of the present invention;
FIG. 3 is a flowchart of the steps of another live data display method according to another embodiment of the present invention;
FIG. 4 is a flowchart of the steps of yet another live data display method according to an embodiment of the present invention;
FIG. 5 is a flowchart of the steps of an embodiment of a drawing and display method based on an application mapping architecture according to another embodiment of the present invention;
FIG. 6 is a block diagram of an embodiment of a live data display apparatus according to an embodiment of the present invention;
FIG. 7 is a block diagram of an embodiment of a live data display apparatus according to another embodiment of the present invention.
Detailed Description
Exemplary embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While exemplary embodiments of the present disclosure are shown in the drawings, it should be understood that the present disclosure may be embodied in various forms and should not be limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the disclosure to those skilled in the art.
Referring to fig. 1, a flowchart illustrating steps of an embodiment of a live data display method according to an embodiment of the present invention is shown, which may specifically include the following steps:
Step 102, in the process of live broadcasting based on the target application, obtaining live broadcast return information, wherein the live broadcast return information is used for feeding back information other than video data and comprises bullet-screen data.
Step 104, drawing a first display image corresponding to the live broadcast return information, and superimposing the first display image on a second display image corresponding to the target application for display, wherein the second display image is used for generating live video data.
The live broadcast in this embodiment is based on the target application, that is, the data of the target application is used as the live broadcast data; for example, in game live broadcasting the picture data corresponding to the game application is used as the live broadcast data while the user plays the game. During the live broadcast the users watching it can interact with the anchor, and the anchor may be the user doing the broadcasting; for example, in game live broadcasting the game player can be the anchor. Viewing users can send bullet-screen data, give gifts to the anchor, and so on, where the bullet screen (barrage) refers to a display mode in which a large number of user comments fly across the video picture like pop-up captions.
This embodiment is applied to the anchor terminal in the live broadcast process, so that the anchor can conveniently interact with the users, and live broadcast return information can be acquired during the live broadcast based on the target application. In order to reduce the occupation of the anchor's bandwidth, the live broadcast return information of this embodiment is used to feed back information other than video data; that is, the live broadcast return information is an interactive message that does not carry live video data, and may include bullet-screen data as well as data such as gifts.
From the perspective of the system bottom layer and the screen display, each interface and window displayed in the system can be regarded as a frame of image. An interactive message image, namely the first display image, can therefore be drawn based on the live broadcast return information, and the first display image can display the interactive messages. The display interface and display content corresponding to the target application are referred to as the second display image; since the target application may be displayed in full-screen mode, the first display image can be superimposed on the second display image for display. That is, each frame of the image displayed on the screen is generated by superimposing the first display image and the second display image drawn for that frame, so the image dynamically displayed on the screen is obtained by drawing the first display image and the second display image frame by frame. Because the second display image is the display interface and display content corresponding to the target application, the second display image of each frame can be acquired, and the live video data can be generated according to information such as the timestamp of each frame.
In summary, when live video data is generated from the second display image of the target application for live broadcasting, live broadcast return information that does not carry video data can be obtained, the live broadcast return information including bullet-screen data; the first display image corresponding to the live broadcast return information is drawn and superimposed on the second display image corresponding to the target application for display. The anchor can thus obtain the returned interactive messages and interact with the viewing users, and because the live broadcast return information does not carry live video data, the occupation of bandwidth resources is effectively reduced and the stability of the live broadcast is ensured.
The live broadcast return information includes: bullet-screen data, service data and/or non-public information. The service data is determined according to the specific service; for example, in a game service the service data may be electronic item data such as gifts. The non-public information includes data such as private messages sent to the anchor and privileged questions, where a privileged question may be question data submitted after the corresponding privilege has been purchased.
In the live broadcast interaction process, only the interactive messages are returned to the anchor, without carrying video data, thereby reducing the occupation of bandwidth. The first display image generated according to the live broadcast return information can be displayed in a variety of ways, so that the anchor can conveniently view it.
One way is to use a separate sub-window for display.
Referring to fig. 2, a flowchart illustrating steps of an embodiment of a live data display method according to another embodiment of the present invention is shown, which may specifically include the following steps:
Step 202, generating live video data according to the second display image of the target application, and uploading the live video data.
Step 204, acquiring live broadcast return information in the process of live broadcasting based on the target application.
Step 206, generating a sub-window and drawing the live broadcast return information in sequence in the sub-window to obtain a first display image.
Step 208, placing the sub-window on the second display image corresponding to the target application for display.
The display data of the target application, including its display interface and interface content, is used as the live broadcast data: the second display image corresponding to each frame is drawn according to the data of the target application, and the display data of the target application is formed by these second display images. Each frame's second display image is acquired and combined according to its timestamp to form the corresponding live broadcast data, so that the display picture of the target application is broadcast live. In the process of live broadcasting based on the target application, users may send live broadcast return information such as bullet-screen data, private message data and service data, which the anchor's user terminal receives. The live broadcast return information is then processed and displayed: a sub-window is generated, that is, a sub-window is drawn at a preset position and the live broadcast return information is drawn in sequence inside it, thereby obtaining the first display image. The first display image of each frame is drawn in this way to form dynamically displayed live broadcast content, such as live bullet-screen data. In this process the second display image is correspondingly drawn based on the target application, and for each frame of the screen image the drawn sub-window is superimposed at the preset position of the second display image, so that the display of the screen image and the generation of the live broadcast data are both carried out through the drawing of each frame.
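A minimal sketch of this sub-window approach on Windows follows (an illustration added here, not part of the original disclosure): a child window is created over the target application's window and repainted periodically with the latest returned messages. The class name, the preset position and the GetLatestDanmaku helper are hypothetical placeholders.

    // Sketch only: a child window overlaid on the target application's window and
    // repainted periodically with the latest live broadcast return information.
    #include <windows.h>
    #include <string>

    // Hypothetical placeholder supplying the newest bullet-screen text to show.
    static std::wstring GetLatestDanmaku() { return L"example bullet-screen text"; }

    static LRESULT CALLBACK DanmakuWndProc(HWND hwnd, UINT msg, WPARAM wp, LPARAM lp) {
        switch (msg) {
        case WM_TIMER:                              // periodic refresh drives the drawing sequence
            InvalidateRect(hwnd, nullptr, TRUE);
            return 0;
        case WM_PAINT: {
            PAINTSTRUCT ps;
            HDC dc = BeginPaint(hwnd, &ps);
            SetBkMode(dc, TRANSPARENT);
            SetTextColor(dc, RGB(255, 255, 255));
            std::wstring text = GetLatestDanmaku();
            RECT rc; GetClientRect(hwnd, &rc);
            DrawTextW(dc, text.c_str(), -1, &rc, DT_LEFT | DT_TOP | DT_WORDBREAK);
            EndPaint(hwnd, &ps);
            return 0;
        }
        }
        return DefWindowProcW(hwnd, msg, wp, lp);
    }

    // Creates the sub-window at a preset position over the target application's
    // window (hGameWnd). "DanmakuSubWindow" is an arbitrary class name.
    HWND CreateDanmakuSubWindow(HWND hGameWnd) {
        WNDCLASSW wc = {};
        wc.lpfnWndProc   = DanmakuWndProc;
        wc.hInstance     = GetModuleHandleW(nullptr);
        wc.hbrBackground = (HBRUSH)GetStockObject(BLACK_BRUSH);
        wc.lpszClassName = L"DanmakuSubWindow";
        RegisterClassW(&wc);
        HWND hwnd = CreateWindowExW(0, L"DanmakuSubWindow", L"", WS_CHILD | WS_VISIBLE,
                                    20, 20, 400, 200, hGameWnd, nullptr, wc.hInstance, nullptr);
        SetTimer(hwnd, 1, 33, nullptr);             // redraw roughly 30 times per second
        return hwnd;
    }

Note that a GDI sub-window like this may not remain visible over an application running in exclusive full-screen mode, which is one motivation for the drawing approach based on the application mapping architecture described later.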
Another way is to use a transparent window for display.
Referring to fig. 3, a flowchart illustrating steps of another embodiment of a live data display method according to another embodiment of the present invention is shown, which may specifically include the following steps:
Step 302, generating live video data according to the second display image of the target application, and uploading the live video data.
Step 304, acquiring live broadcast return information in the process of live broadcasting based on the target application.
Step 306, generating a transparent window and drawing, on the transparent window, a first display image that dynamically shows the live broadcast return information.
Step 308, superimposing the transparent window on the second display image corresponding to the target application for display.
The display data of the target application, including its display interface and interface content, is used as the live broadcast data: the second display image corresponding to each frame is drawn according to the data of the target application, and the display data of the target application is formed by these second display images. Each frame's second display image is acquired and combined according to its timestamp to form the corresponding live broadcast data, so that the display picture of the target application is broadcast live. In the process of live broadcasting based on the target application, users may send live broadcast return information such as bullet-screen data, private message data and service data, which the anchor's user terminal receives. The live broadcast return information is then processed and displayed, and it can be shown in a manner similar to what the viewing users see, for example displaying the bullet-screen data as a bullet screen. The live broadcast return information can be displayed through a transparent window (or transparent mask): a transparent window is generated, that is, a transparent window is drawn at a preset position and the live broadcast return information is drawn in sequence on it, thereby obtaining the first display image. The first display image of each frame is drawn in this way to form dynamically displayed live broadcast content, such as live bullet-screen data. In this process the second display image is correspondingly drawn based on the target application, and for each frame of the screen image the drawn transparent window is superimposed at the preset position of the second display image, so that the display of the screen image and the generation of the live broadcast data are both carried out through the drawing of each frame.
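A minimal sketch of this transparent-window approach on Windows follows (again an added illustration, not part of the original disclosure): a topmost layered pop-up window covers the target application's window, a colour key makes its background invisible, and only the drawn messages appear on top of the game picture.

    // Sketch only: a topmost layered window whose black background is treated as
    // transparent, so only the bullet-screen text drawn into it is visible over
    // the target application. WS_EX_TRANSPARENT lets mouse input pass through.
    #include <windows.h>

    HWND CreateTransparentOverlay(HWND hGameWnd) {
        WNDCLASSW wc = {};
        wc.lpfnWndProc   = DefWindowProcW;          // painting is driven elsewhere, frame by frame
        wc.hInstance     = GetModuleHandleW(nullptr);
        wc.hbrBackground = (HBRUSH)GetStockObject(BLACK_BRUSH);
        wc.lpszClassName = L"DanmakuOverlay";       // arbitrary class name
        RegisterClassW(&wc);

        RECT rc;
        GetWindowRect(hGameWnd, &rc);               // cover the target application's window

        HWND hwnd = CreateWindowExW(
            WS_EX_LAYERED | WS_EX_TRANSPARENT | WS_EX_TOPMOST,
            L"DanmakuOverlay", L"", WS_POPUP,
            rc.left, rc.top, rc.right - rc.left, rc.bottom - rc.top,
            nullptr, nullptr, wc.hInstance, nullptr);

        // Black becomes the transparent colour key; any other colour drawn into the
        // window (the first display image) is shown opaquely over the game picture.
        SetLayeredWindowAttributes(hwnd, RGB(0, 0, 0), 0, LWA_COLORKEY);
        ShowWindow(hwnd, SW_SHOWNOACTIVATE);
        return hwnd;
    }

Each frame, the bullet-screen text is drawn into this window in a non-black colour (for example with the GDI text calls of the previous sketch), so the first display image appears superimposed on the second display image while the live video stream itself remains untouched.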
Another mode is a drawing and display mode based on the application mapping architecture.
Referring to fig. 4, a flowchart illustrating steps of another embodiment of a live data display method according to another embodiment of the present invention is shown, which may specifically include the following steps:
Step 402, creating, in advance through a hooked target object, application mapping architecture data and application source architecture data when the target application is initialized.
The system is monitored in advance: a creation function of the system, such as a create function, is monitored, and when the creation function is called to create a target object, the target object is hooked, that is, a first hook program is set when the target object is initialized, so that the hook program is placed at the source of the target object. The target object is an important component object through which the system performs the operations required by various applications.
The creation and operation of some applications in the system require the participation of the target object, so calls to the target object can be intercepted by setting a hook at the source of the target object, and the required information can be determined. In this embodiment the application needs to call the target object in order to execute, so the application information at the time the target object is called can be intercepted by the first hook program; the application information indicates the application source architecture data used to create the target application. When the target application is created, a corresponding series of interfaces needs to be called to obtain data, the required function objects need to be created, and so on; the application source architecture data of the target application can be determined through these interface calls and function objects, so that the target application is started and run.
Therefore, in this embodiment the application mapping architecture data of the target application is built from the application information, and the application source architecture data is then built; that is, the required operation is determined according to the application information, a piece of mapping information is built first and the source information is built afterwards, with the mapping information and the source information pointing to the same content address. In other words, a shell identical to the application source architecture data is built from the mapping architecture data, while the substantive content is still provided by the application source architecture data. Only a very small amount of memory is consumed, the architecture of the target application can be known, and the return of the data required by the application can be controlled.
In the embodiment of the present invention, hook programs may be set on various target objects of various systems. In this embodiment, taking the Component Object Model (COM) as an example of the target object, when a COM component is initialized by the creation function, a first hook program can be set for the COM component, so that a hook is placed at the source of the COM component and takeover of the COM component is implemented. COM is a software component technology introduced by Microsoft for interaction between a web server and a client; it is an object-oriented programming model that defines the behavior of objects within a single application or between multiple applications, and COM is implemented on multiple platforms and is not limited to Windows operating systems. For example, a game client may use the DirectX (DX) 3D engine; DirectX is a multimedia programming interface created by Microsoft, implemented in the C++ programming language and compliant with COM. Therefore, when the game client is started and run and 3D engine operations are involved, the COM component needs to be called, and the application information that calls the COM component can be acquired through the first hook program set on the COM component, so that the source architecture data of the game client's 3D engine is determined, namely the various interfaces, function objects and the like required for the 3D engine to run.
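As one hedged illustration of placing such a first hook program at the creation function (the patent does not name a specific API or hooking library), the sketch below intercepts the Direct3D 9 entry point Direct3DCreate9 using the open-source MinHook library, which is an assumption; the detour simply records the IDirect3D9 object returned by the real creation function so that its interfaces can later be examined and taken over.

    // Sketch only: hook the creation function of the DX COM object so that every
    // IDirect3D9 created by the game client first passes through our code.
    // MinHook (MinHook.h) is a third-party library and an assumption here.
    #include <windows.h>
    #include <d3d9.h>
    #include "MinHook.h"

    typedef IDirect3D9* (WINAPI *Direct3DCreate9_t)(UINT SDKVersion);
    static Direct3DCreate9_t g_realDirect3DCreate9 = nullptr;
    static IDirect3D9*       g_sourceD3D9          = nullptr;   // the "source object"

    static IDirect3D9* WINAPI Hooked_Direct3DCreate9(UINT SDKVersion) {
        // Call through to the real creation function, then remember the source
        // object; its interfaces are what the application mapping architecture
        // data will describe and what later hooks will be attached to.
        g_sourceD3D9 = g_realDirect3DCreate9(SDKVersion);
        return g_sourceD3D9;
    }

    bool InstallCreateHook() {
        if (MH_Initialize() != MH_OK) return false;
        if (MH_CreateHookApi(L"d3d9", "Direct3DCreate9",
                             reinterpret_cast<LPVOID>(&Hooked_Direct3DCreate9),
                             reinterpret_cast<LPVOID*>(&g_realDirect3DCreate9)) != MH_OK)
            return false;
        return MH_EnableHook(MH_ALL_HOOKS) == MH_OK;
    }

InstallCreateHook would typically run from a module loaded into the game process before the 3D engine starts, so that the hook is already in place when the creation function is first called.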
Step 404, determining at least one target interface according to the application mapping architecture data.
When the application mapping architecture data is built, information such as the various interfaces the target application needs to call and the functions of those interfaces can be determined, so that the at least one required target interface can be determined; the target interfaces include the interfaces used for drawing, rendering and displaying images.
Step 406, performing live broadcast based on the target application, and acquiring live broadcast return information.
Step 408, calling the target interface according to the application mapping architecture data, and drawing the first display image corresponding to the live broadcast return information.
Step 410, acquiring the second display image created based on the application source architecture data, and displaying the first display image and the second display image in a superimposed manner.
After the target application runs, it may adopt a full-screen mode, in which the display interface of the target application, namely the interface image, covers the whole window and the focus is within the target application. From the perspective of the system bottom layer and the screen display, each interface and window displayed in the system can be regarded as a frame of image. This embodiment draws the first display image according to the application mapping architecture data described above, based on the development principles of the target application.
Therefore, after the live broadcast data is generated based on the second display image drawn from the application source architecture data and the live broadcast return information is acquired, the pre-injected hook program can be used to call the target interface to draw, render and display the first display image corresponding to the live broadcast return information according to the image drawing, rendering and display mode corresponding to the application mapping architecture data; the second display image created based on the application source architecture data is acquired, and the first display image is superimposed on the second display image for display, that is, the first display image is superimposed for display on the second display image shown by the target application.
For example, for a game client, a user playing a client game usually adopts the full-screen mode and broadcasts the game live. The second display image of the game client created from the application source architecture data can then be used, and the live broadcast data can be generated based on the second display image; the target interface is called according to the application mapping architecture data, the first display image corresponding to the live broadcast return information is drawn, and the first display image and the second display image are displayed in a superimposed manner. To the user, data such as live bullet screens, gifts and private messages appear to be displayed over the full-screen game, which improves the user experience and the flexibility of the live broadcast.
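One possible concrete form of calling the target interface to draw the first display image is sketched here under the assumption that the target application is a Direct3D 9 game and that the legacy D3DX utility library is available (neither is prescribed by the original text): bullet-screen text is drawn from inside a hooked IDirect3DDevice9::EndScene call, so it is rendered on top of the game's own frame. The hook installation and the GetLatestDanmaku helper are placeholders.

    // Sketch only: draw the first display image (bullet-screen text) over the
    // second display image (the game frame) from inside a hooked EndScene.
    // Requires d3dx9.h / d3dx9.lib from the legacy DirectX SDK -- an assumption.
    #include <d3d9.h>
    #include <d3dx9.h>
    #include <string>

    typedef HRESULT (WINAPI *EndScene_t)(IDirect3DDevice9*);
    static EndScene_t g_realEndScene = nullptr;     // set elsewhere by the hook installer
    static ID3DXFont* g_font         = nullptr;

    // Hypothetical placeholder supplying the newest bullet-screen text to show.
    static std::wstring GetLatestDanmaku() { return L"example bullet-screen text"; }

    static HRESULT WINAPI Hooked_EndScene(IDirect3DDevice9* device) {
        if (!g_font) {
            // Lazily create a font object on the game's own device.
            D3DXCreateFontW(device, 22, 0, FW_BOLD, 1, FALSE, DEFAULT_CHARSET,
                            OUT_DEFAULT_PRECIS, DEFAULT_QUALITY,
                            DEFAULT_PITCH | FF_DONTCARE, L"Microsoft YaHei", &g_font);
        }
        if (g_font) {
            std::wstring text = GetLatestDanmaku();
            RECT rc = { 20, 20, 800, 200 };         // preset overlay position
            g_font->DrawTextW(nullptr, text.c_str(), -1, &rc,
                              DT_LEFT | DT_TOP | DT_NOCLIP,
                              D3DCOLOR_ARGB(255, 255, 255, 0));
        }
        // The game's own EndScene still runs, so the second display image is intact.
        return g_realEndScene(device);
    }

Because the text is drawn into the back buffer before the frame is presented, it becomes part of the picture shown on screen; whether the uploaded live video is generated from the second display image alone (step 410) or from the superimposed image data (step 412) then depends on where the frames for the stream are captured.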
Step 412, calling the target interface according to the application mapping architecture data, and drawing the superimposed image data of the second display image corresponding to the target application and the first display image corresponding to the live broadcast return information.
Instead of drawing the second display image and the first display image separately and then displaying them in a superimposed manner as described above, in actual processing the target interface may be called according to the application mapping architecture data to directly draw the superimposed image data, that is, the image data in which the second display image and the first display image are already superimposed. The subsequent live broadcast is then carried out based on the superimposed image data, so that live broadcast data with bullet screens, gifts and other data already added can be returned directly, and there is no need to superimpose bullet-screen data and the like onto the live broadcast data on the service side.
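To make the superimposed-image-data variant concrete under the same Direct3D 9 assumption (again only an illustrative sketch), the frame can be copied out of the back buffer after the overlay has been drawn, so that the captured image already contains both the game picture and the bullet-screen layer:

    // Sketch only: capture the already-superimposed frame (game picture plus
    // bullet-screen overlay) from the back buffer so it can be encoded and
    // uploaded as live video data. Assumes a non-multisampled back buffer;
    // error handling is reduced to early returns.
    #include <d3d9.h>

    bool CaptureSuperimposedFrame(IDirect3DDevice9* device,
                                  IDirect3DSurface9** outSystemMemCopy) {
        IDirect3DSurface9* backBuffer = nullptr;
        if (FAILED(device->GetBackBuffer(0, 0, D3DBACKBUFFER_TYPE_MONO, &backBuffer)))
            return false;

        D3DSURFACE_DESC desc;
        backBuffer->GetDesc(&desc);

        // A lockable system-memory surface of the same size and format.
        IDirect3DSurface9* sysMem = nullptr;
        if (FAILED(device->CreateOffscreenPlainSurface(desc.Width, desc.Height, desc.Format,
                                                       D3DPOOL_SYSTEMMEM, &sysMem, nullptr))) {
            backBuffer->Release();
            return false;
        }

        // Copies the superimposed image data into system memory.
        HRESULT hr = device->GetRenderTargetData(backBuffer, sysMem);
        backBuffer->Release();
        if (FAILED(hr)) { sysMem->Release(); return false; }

        *outSystemMemCopy = sysMem;
        return true;
    }

The caller can then lock the returned surface and hand the pixels to whatever encoder produces the live video stream, so the viewers receive frames with the bullet screen already composited and no additional superimposition is needed on the service side.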
In summary, the architecture of the target application is determined by creating the application mapping architecture data and the application source architecture data during initialization, so that at least one target interface can be determined; the target interface is called based on the application mapping architecture data to draw the image data, and the drawing is executed according to the architecture principles of the target application, which fits the application better, reduces the impact on the live broadcast and on the operation of the application, and improves the user experience.
In this embodiment, image drawing is performed based on the architecture principles of the target application, so the architecture of the target application and information such as the interfaces and functions under that architecture can be determined in advance, and image drawing can then be provided in the full-screen mode.
In an alternative embodiment, the architecture of the target application may be determined.
In this embodiment a game client is taken as an example of the target application. The game client may use the DX 3D engine, which under the Windows operating system is accelerated by the hardware Graphics Processing Unit (GPU) and reads and writes memory directly, so that the message mechanism can be bypassed.
Referring to fig. 5, a flowchart illustrating steps of an embodiment of a rendering and displaying method based on an application mapping architecture according to another embodiment of the present invention is shown, which may specifically include the following steps:
Step 502, monitoring a creation function in the system, and hooking a target object when the creation function creates the target object.
The system is monitored in advance: a creation function of the system, such as a create function, is monitored, and when the creation function is called to create a target object, the target object is hooked, that is, a first hook program is set when the target object is initialized, so that the hook program is placed at the source of the target object. The target object is an important component object through which the system performs the operations required by various applications.
In the embodiment of the present invention, hook programs may be set on various target objects of various systems. In this embodiment, taking the Component Object Model (COM) as an example of the target object, when a COM component is initialized by the creation function, a first hook program can be set for the COM component, so that a hook is placed at the source of the COM component and takeover of the COM component is implemented.
Step 504, intercepting the application information when the target object is called to create the target application.
The creation and operation of some applications in the system require the participation of the target object, so calls to the target object can be intercepted by setting a hook at the source of the target object, and the required information can be determined. In this embodiment the application needs to call the target object in order to execute, so the application information at the time the target object is called can be intercepted by the first hook program; the application information indicates the application source architecture data used to create the target application. When the target application is created, a corresponding series of interfaces needs to be called to obtain data, the required function objects need to be created, and so on; the application source architecture data of the target application can be determined through these interface calls and function objects, so that the target application is started and run.
For example, the game client may use the 3D engine of DirectX (DX), a multimedia programming interface created by Microsoft, implemented in the C++ programming language and compliant with COM. Therefore, when the game client is started and run and 3D engine operations are involved, the COM component needs to be called, and the application information that calls the COM component can be acquired through the first hook program set on the COM component, so that the source architecture data of the game client's 3D engine is determined, namely the various interfaces, function objects and the like required for the 3D engine to run.
Step 506, when creating an object, creating a mapping object according to the application information and pointing it to the source memory address, and then creating the source object and pointing it to the same source memory address.
Step 508, setting a hook program on each interface corresponding to each source object.
Step 510, building the application mapping architecture data from the mapping objects and the interface calls.
Step 512, building the application source architecture data from the source objects and the interface calls.
In this embodiment, when the target object creates the content related to the target application, the required objects can be created according to information such as the interfaces that need to be called; the mapping architecture data is used to build a shell identical to the application source architecture data, while the substantive content is still provided by the application source architecture data. Therefore, when the application information indicates that an object needs to be created, the relevant information of the object to be created is determined, a mapping object is created and pointed to the corresponding source memory address, and the source object is then created and pointed to the same source memory address. In other words, the content that needs to be defined is determined from the application information, a mapping object with that definition is defined first, the source object that originally needed to be defined is defined afterwards, and a hook program is set at each interface required by each source object; a group of hook programs can thus be set for the target application, implementing an interception mode in which hooks are placed at the entry level. The application mapping architecture data is then built from the mapping objects and interface calls, and the application source architecture data is built from the source objects and interface calls, yielding a shell identical to the application source architecture data, namely the application mapping architecture data, whose actual definitions and call content correspond to the application source architecture data and can also be mapped onto the application source architecture data for processing.
For example, when the 3D engine of the game client calls the COM component, the call information is intercepted. Suppose the call information indicates that a function A is to be created, that function A calls an interface B and a function C, and that function C calls interfaces D and E. Mapping functions A' and C' can then be created for the functions to be created, hook programs are set at interfaces B, D and E, and the correspondence of the mapping functions is established; the source functions A and C and their correspondence with interfaces B, D and E are then created. A shell identical to the source architecture data of the game client, namely the mapping architecture data, is thus obtained, and the source architecture data can be reached through the mapping architecture data.
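The A/A'/B/D/E example can be pictured with the following purely hypothetical sketch (all names below are invented for illustration and are not DirectX or COM types): the mapping object is created first, both objects point at the same source content, and the substance of every call is still provided by the source object.

    // Hypothetical sketch of "create the mapping object first, then the source
    // object, both pointing to the same source memory address". The names
    // (ISourceContent, SourceFunctionA, MappingFunctionA, InterfaceB/D/E) are
    // invented for illustration only.
    #include <cstdio>

    struct ISourceContent { int frameData = 0; };           // the shared source memory

    // The interfaces B, D and E that the source object calls; in the real scheme
    // a hook program would be set at each of these entry points.
    static void InterfaceB(ISourceContent* c) { std::printf("B(%d)\n", c->frameData); }
    static void InterfaceD(ISourceContent* c) { std::printf("D(%d)\n", c->frameData); }
    static void InterfaceE(ISourceContent* c) { std::printf("E(%d)\n", c->frameData); }

    struct SourceFunctionA {                                // source object A
        ISourceContent* content;
        explicit SourceFunctionA(ISourceContent* c) : content(c) {}
        void Run() { InterfaceB(content); InterfaceD(content); InterfaceE(content); }
    };

    struct MappingFunctionA {                               // mapping object A': same shape,
        ISourceContent* content;                            // same source memory address
        SourceFunctionA* source = nullptr;                  // substance still comes from A
        explicit MappingFunctionA(ISourceContent* c) : content(c) {}
        void Run() {
            std::printf("mapping layer: call observed\n");  // where drawing could be injected
            if (source) source->Run();
        }
    };

    int main() {
        ISourceContent content;
        MappingFunctionA mapped(&content);                  // mapping object created first ...
        SourceFunctionA  source(&content);                  // ... then the source object
        mapped.source = &source;
        mapped.Run();                                       // goes through the shell, then the source
        return 0;
    }

In the real scheme the interfaces would be COM interface methods, and the interception points would be the hook programs installed on them, as in the vtable sketch further below.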
Step 514, determining, based on the application mapping architecture data, at least one target interface for drawing a display image according to the interface information.
When the application mapping architecture data is built, information such as the various interfaces the target application needs to call and the functions of those interfaces can be determined, so that the at least one required target interface can be determined; the target interfaces include the interfaces used for drawing, rendering and displaying images. A series of image-related target interfaces can be determined, and a hook program is injected into each target interface, so that the target interfaces can subsequently be called directly through the hook programs.
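One common way to place such a hook program at a target interface of a COM object is to patch the corresponding entry in its virtual function table. The sketch below does this for IDirect3DDevice9::EndScene; the slot index 42 is the value commonly cited for the Direct3D 9 headers and is an assumption that should be verified against the SDK in use, as is the choice of EndScene itself.

    // Sketch only: install a hook at one target interface by overwriting its slot
    // in the device's vtable. ENDSCENE_INDEX = 42 is an assumed slot number for
    // IDirect3DDevice9::EndScene and must be confirmed against the SDK headers.
    #include <windows.h>
    #include <d3d9.h>

    typedef HRESULT (WINAPI *EndScene_t)(IDirect3DDevice9*);
    static EndScene_t g_realEndScene = nullptr;

    static HRESULT WINAPI Hooked_EndScene(IDirect3DDevice9* device) {
        // The first display image would be drawn here (see the earlier EndScene sketch).
        return g_realEndScene(device);
    }

    static const int ENDSCENE_INDEX = 42;                   // assumed vtable slot

    bool HookEndScene(IDirect3DDevice9* device) {
        // A COM object's first pointer-sized field is a pointer to its vtable.
        void** vtable = *reinterpret_cast<void***>(device);

        // Remember the original entry so the detour can call through to it.
        g_realEndScene = reinterpret_cast<EndScene_t>(vtable[ENDSCENE_INDEX]);

        // Make the vtable slot writable, swap in the detour, restore protection.
        DWORD oldProtect = 0;
        if (!VirtualProtect(&vtable[ENDSCENE_INDEX], sizeof(void*),
                            PAGE_EXECUTE_READWRITE, &oldProtect))
            return false;
        vtable[ENDSCENE_INDEX] = reinterpret_cast<void*>(&Hooked_EndScene);
        VirtualProtect(&vtable[ENDSCENE_INDEX], sizeof(void*), oldProtect, &oldProtect);
        return true;
    }

Once such hooks are installed on the selected target interfaces, every drawing-related call of the application passes through them, which is what allows the first display image to be rendered directly within the application's own drawing path.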
In this way, the application information can be intercepted during the initialization of the target application, and the application mapping architecture data and application source architecture data of the target application can be established; that is, the architecture of the target application is analyzed, so that the required operations can conveniently be executed on the basis of that architecture.
Step 516, performing live broadcast based on the target application, and acquiring live broadcast return information.
Step 518, calling the target interface according to the application mapping architecture data, and drawing the first display image corresponding to the live broadcast return information.
Step 520, acquiring the second display image created based on the application source architecture data, and displaying the first display image and the second display image in a superimposed manner.
Step 522, calling the target interface according to the application mapping architecture data, and drawing the superimposed image data of the second display image corresponding to the target application and the first display image corresponding to the live broadcast return information.
When the target application is initialized on the terminal, the application mapping architecture data and the application source architecture data are established, and at least one target interface is determined. The target application can then be launched once the initialization is completed, and it enters the full-screen mode when launched. In this mode, the application source architecture data can be called through the application mapping architecture data in response. Live broadcast data is generated based on the second display image of the target application, live broadcast return information is received during the live broadcast, and, based on the architecture principles determined by the application mapping architecture data, the target interface is called to draw and render the first display image from the live broadcast return information. The display position of the first display image can be determined, and the first display image is superimposed at the corresponding position of the second display image of the target application, so that the first display image is superimposed on the second display image. Alternatively, the drawing data is determined from the live broadcast return information and the data of the target application, and the target interface is called to draw and render, from that drawing data, the superimposed image data of the first display image and the second display image.
For example, when the game client is initialized through the COM component, the corresponding source architecture data and mapping architecture data are determined, and the target interfaces required for drawing, rendering and displaying images are determined for the game client. After the initialization is completed, the game client can run and enter the full-screen mode. When the user plays the game and broadcasts it live, for live broadcast data such as bullet screens the corresponding first display image can be drawn by calling the target interface based on the image drawing principles of the 3D engine, and the first display image is superimposed on the display interface image of the game client for display; alternatively, the superimposed image is drawn directly to replace the display interface image of the game client.
In fact, during the running of the target application the interface image of the target application, namely the second display image, is drawn and displayed every frame, and the first display image responding to the live broadcast return information is also drawn every frame and then superimposed on the second display image for display; even if the picture appears static from the user's point of view, the corresponding image is drawn every frame from the system's point of view. Taking the game client as an example, the displayed game interface image is drawn and rendered every frame, so the first display image shown over the game interface image is likewise superimposed, after each frame is drawn and rendered, at the corresponding position on the game interface image.
Therefore, by directly drawing and displaying the image in the underlying window, the communication mechanism of the system (such as the Windows message mechanism) can be bypassed, so that the display and operation of the live broadcast return information can be performed normally without exiting the full screen, and the user experience is improved on the basis of ensuring the normal operation of the user.
For simplicity of explanation, the method embodiments are described as a series of acts or combinations, but those skilled in the art will appreciate that the embodiments are not limited by the order of acts described, as some steps may occur in other orders or concurrently with other steps in accordance with the embodiments of the invention. Further, those skilled in the art will appreciate that the embodiments described in the specification are presently preferred and that no particular act is required to implement the invention.
On the basis of the above embodiment, the embodiment of the invention also provides a live data display device.
Referring to fig. 6, a block diagram of a live data display apparatus according to an embodiment of the present invention is shown, which may specifically include the following modules:
an obtaining module 602, configured to obtain live broadcast return information in a live broadcast process based on a target application, where the live broadcast return information is used to feed back information other than video data, and the live broadcast return information includes bullet screen data.
And a drawing and displaying module 604, configured to draw a first display image corresponding to the live broadcast return information, and superimpose the first display image on a second display image corresponding to the target application for display, where the second display image is used to generate live broadcast video data.
In summary, when the live video data is generated by using the second display image of the target application for live broadcasting, live broadcast return information that does not carry the video data can be obtained, where the live broadcast return information includes the barrage data, the first display image corresponding to the live broadcast return information is drawn, and the first display image is superimposed on the second display image corresponding to the target application for display, so that the anchor broadcast obtains the returned interactive message to interact with the viewer, and the live broadcast return information does not carry the live broadcast video data, so that occupation of bandwidth resources can be effectively reduced, and the stability of the live broadcast is ensured.
Referring to fig. 7, a block diagram of a live data display apparatus according to another embodiment of the present invention is shown, which may specifically include the following modules:
an initialization module 606, configured to create, in advance through a suspended target object, application mapping architecture data and application source architecture data when the target application is initialized, and to determine at least one target interface according to the application mapping architecture data.
An obtaining module 602, configured to obtain live broadcast return information during live broadcasting based on the target application, where the live broadcast return information is used to feed back information other than video data and includes barrage data.
And a drawing and displaying module 604, configured to draw a first display image corresponding to the live broadcast return information, and superimpose the first display image on a second display image corresponding to the target application for display, where the second display image is used to generate live broadcast video data.
And a live broadcast module 608, configured to generate live broadcast video data according to the second display image and upload the live broadcast video data. The live broadcast return information further includes: service data and/or non-public information.
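The live broadcast module needs the displayed second display image, with the superimposed first display image, as the source of the live video frames. As one possible capture path, and purely as a hedged sketch that the patent does not mandate, the following GDI snippet grabs a single frame of the target window's client area as raw 32-bit pixels; encoding and uploading the resulting frames (for example to a streaming ingest server) is outside the scope of this sketch.

```cpp
#include <windows.h>
#include <vector>

// Grab one frame of the target window's client area as raw 32-bit BGRA pixels.
// A complete live module would hand such frames to an encoder and uploader.
std::vector<BYTE> CaptureWindowFrame(HWND target, int& width, int& height)
{
    RECT rc;
    GetClientRect(target, &rc);
    width  = rc.right - rc.left;
    height = rc.bottom - rc.top;

    HDC windowDC = GetDC(target);
    HDC memDC    = CreateCompatibleDC(windowDC);
    HBITMAP bmp  = CreateCompatibleBitmap(windowDC, width, height);
    HGDIOBJ old  = SelectObject(memDC, bmp);

    // Copy the currently displayed frame (the second display image together with
    // the superimposed first display image) into the memory bitmap.
    BitBlt(memDC, 0, 0, width, height, windowDC, 0, 0, SRCCOPY);
    SelectObject(memDC, old);   // deselect the bitmap before GetDIBits, as the API requires

    BITMAPINFO bi = {};
    bi.bmiHeader.biSize        = sizeof(BITMAPINFOHEADER);
    bi.bmiHeader.biWidth       = width;
    bi.bmiHeader.biHeight      = -height;       // negative height: top-down rows
    bi.bmiHeader.biPlanes      = 1;
    bi.bmiHeader.biBitCount    = 32;
    bi.bmiHeader.biCompression = BI_RGB;

    std::vector<BYTE> pixels(static_cast<size_t>(width) * height * 4);
    GetDIBits(memDC, bmp, 0, height, pixels.data(), &bi, DIB_RGB_COLORS);

    DeleteObject(bmp);
    DeleteDC(memDC);
    ReleaseDC(target, windowDC);
    return pixels;
}
```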
The drawing and displaying module 604 comprises: a drawing sub-module 6042 and a display sub-module 6044.
In an optional embodiment, the drawing sub-module 6042 is configured to generate a sub-window and display a first display image of the live return information in a drawing order of the sub-window; and a display sub-module 6044, configured to set the sub-window on the second display image corresponding to the target application for display.
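One possible reading of this sub-window approach, sketched below under the assumption of a Win32 environment with a known parent HWND (the names BarrageSubWindow, CreateBarrageSubWindow and g_barrageText are illustrative, not taken from the patent): a child window is created on top of the target application's window, and the first display image is painted in the child window's own drawing sequence, i.e. its WM_PAINT handling.

```cpp
#include <windows.h>

// Latest barrage line; in a complete module this would be updated by the obtaining module.
static wchar_t g_barrageText[256] = L"Hello from the audience";

static LRESULT CALLBACK BarrageWndProc(HWND hwnd, UINT msg, WPARAM wParam, LPARAM lParam)
{
    if (msg == WM_PAINT) {
        PAINTSTRUCT ps;
        HDC hdc = BeginPaint(hwnd, &ps);
        SetBkMode(hdc, TRANSPARENT);
        SetTextColor(hdc, RGB(255, 255, 0));
        TextOutW(hdc, 8, 8, g_barrageText, lstrlenW(g_barrageText));
        EndPaint(hwnd, &ps);
        return 0;
    }
    return DefWindowProcW(hwnd, msg, wParam, lParam);
}

// Create a sub-window on top of the target application's window; the first
// display image is painted in the sub-window's own drawing sequence (WM_PAINT).
HWND CreateBarrageSubWindow(HWND parent)
{
    WNDCLASSW wc = {};
    wc.lpfnWndProc   = BarrageWndProc;
    wc.hInstance     = GetModuleHandleW(nullptr);
    wc.lpszClassName = L"BarrageSubWindow";
    wc.hbrBackground = static_cast<HBRUSH>(GetStockObject(BLACK_BRUSH));
    RegisterClassW(&wc);

    return CreateWindowExW(0, wc.lpszClassName, L"", WS_CHILD | WS_VISIBLE,
                           0, 0, 480, 32, parent, nullptr, wc.hInstance, nullptr);
}
```

Whenever the obtaining module delivers new barrage data, updating g_barrageText and calling InvalidateRect on the sub-window triggers a repaint over the second display image.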
In another optional embodiment, the drawing sub-module 6042 is configured to generate a transparent window and draw a first display image that dynamically shows live broadcast return information on the transparent window; and a display sub-module 6044, configured to superimpose the transparent window on the second display image corresponding to the target application for display.
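For the transparent-window variant, a hedged sketch (again Win32-specific and not necessarily the patent's implementation) creates a borderless, click-through, always-on-top layered window covering the target window; pixels painted in the key colour stay invisible, so only the drawn first display image appears over the second display image.

```cpp
#include <windows.h>

// Create a borderless, click-through, always-on-top layered window covering the
// target window; pixels painted in the key colour remain fully transparent.
HWND CreateTransparentOverlay(HWND target)
{
    const COLORREF keyColour = RGB(0, 0, 1);   // unlikely to clash with real content

    WNDCLASSW wc = {};
    wc.lpfnWndProc   = DefWindowProcW;
    wc.hInstance     = GetModuleHandleW(nullptr);
    wc.lpszClassName = L"BarrageOverlay";
    wc.hbrBackground = CreateSolidBrush(keyColour);
    RegisterClassW(&wc);

    RECT rc;
    GetWindowRect(target, &rc);

    HWND overlay = CreateWindowExW(
        WS_EX_LAYERED | WS_EX_TRANSPARENT | WS_EX_TOPMOST | WS_EX_TOOLWINDOW,
        wc.lpszClassName, L"", WS_POPUP | WS_VISIBLE,
        rc.left, rc.top, rc.right - rc.left, rc.bottom - rc.top,
        nullptr, nullptr, wc.hInstance, nullptr);

    // Everything drawn in the key colour becomes transparent; other colours show.
    SetLayeredWindowAttributes(overlay, keyColour, 0, LWA_COLORKEY);
    return overlay;
}
```

Because of WS_EX_TRANSPARENT, mouse input falls through to the underlying window, so the user can keep operating the target application while the live broadcast return information is shown on the overlay (for example with the GDI text helper from the earlier sketch, targeting the overlay HWND).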
In yet another optional embodiment, the drawing sub-module 6042 is configured to invoke the target interface according to application mapping architecture data, and draw the first display image corresponding to the live broadcast return information, where the application mapping architecture data is determined according to application source architecture data of the target application. And a display sub-module 6044, configured to acquire a second display image created based on the application source architecture data, and display the first display image and the second display image in an overlapping manner.
The drawing and displaying module 604 is configured to call the target interface according to application mapping architecture data, and draw superimposed image data of a second display image corresponding to the target application and a first display image corresponding to the live broadcast return information.
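To make this combined-drawing idea concrete, the following sketch assumes a Direct3D 11 target application and that IDXGISwapChain::Present has been selected as the target interface and hooked during initialization (see the sketch after the initialization module below). DrawBarrageOverlay is a hypothetical placeholder rather than an API of the patent or of Direct3D; the point is that the first display image is drawn into the back buffer before the frame is presented, so the superimposed image data is part of the very frame used to generate the live video data.

```cpp
#include <d3d11.h>

// Signature of IDXGISwapChain::Present.
using PresentFn = HRESULT(STDMETHODCALLTYPE*)(IDXGISwapChain*, UINT, UINT);

// Trampoline to the original Present; filled in when the hook is installed
// (see the initialization sketch after the initialization module below).
PresentFn g_originalPresent = nullptr;

// Hypothetical helper: renders the first display image (barrage text and the
// like) into the swap chain's current back buffer, e.g. via Direct2D/DirectWrite.
static void DrawBarrageOverlay(IDXGISwapChain* /*swapChain*/)
{
    // Implementation omitted in this sketch.
}

// Detour for the target interface: the overlay is drawn before the frame is
// presented, so the superimposed image data is contained in the frame that is
// both shown to the anchor and used to generate the live video data.
HRESULT STDMETHODCALLTYPE PresentDetour(IDXGISwapChain* swapChain,
                                        UINT syncInterval, UINT flags)
{
    DrawBarrageOverlay(swapChain);                              // first display image
    return g_originalPresent(swapChain, syncInterval, flags);   // second display image
}
```

Hooking at the presentation interface means the superimposition happens in the same drawing pass as the second display image, which is what allows the window message mechanism to be bypassed entirely.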
Wherein, the initialization module 606 includes:
the suspend submodule 6062 is configured to monitor a system creation function, and suspend a target object when the creation function creates the target object.
And the architecture building submodule 6064 is configured to, when the target object initializes the target application, determine a mapping object and interface calls according to the application information to build the application mapping architecture data, and determine a source object and interface calls according to the application information to build the application source architecture data.
The architecture building submodule 6064 is further configured to, when an object is created, create a mapping object according to the application information and point it to a source memory address, and then create the source object pointing to that source memory address; and, when interfaces are called, set a group of hook programs at the corresponding interfaces according to the application information and determine the interface information of each interface.
The initialization module 606, further includes: and an interface analysis sub-module 6066, configured to determine, based on the application mapping architecture data, at least one target interface for drawing a display image according to the interface information.
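The patent does not name a concrete creation function or graphics interface, so the following sketch is only one way the suspension-and-hooking flow could look: it assumes the creation function is D3D11CreateDeviceAndSwapChain, uses the third-party MinHook library for the hook programs, treats the created swap chain's vtable entries as the recorded source interface addresses, and installs the Present hook from the drawing sketch above (PresentDetour and g_originalPresent are defined there).

```cpp
#include <d3d11.h>
#include <MinHook.h>   // third-party hooking library, assumed to be available

// Types and symbols from the Present-hook sketch shown after the drawing
// and displaying module above.
using PresentFn = HRESULT(STDMETHODCALLTYPE*)(IDXGISwapChain*, UINT, UINT);
extern PresentFn g_originalPresent;
HRESULT STDMETHODCALLTYPE PresentDetour(IDXGISwapChain*, UINT, UINT);

using CreateFn = HRESULT(WINAPI*)(IDXGIAdapter*, D3D_DRIVER_TYPE, HMODULE, UINT,
                                  const D3D_FEATURE_LEVEL*, UINT, UINT,
                                  const DXGI_SWAP_CHAIN_DESC*, IDXGISwapChain**,
                                  ID3D11Device**, D3D_FEATURE_LEVEL*,
                                  ID3D11DeviceContext**);
static CreateFn g_originalCreate = nullptr;

// Detour for the creation function: let the source object be created first,
// then read the interface addresses from its vtable (the source memory
// addresses) and hook the drawing-related target interface (Present).
HRESULT WINAPI CreateDetour(IDXGIAdapter* adapter, D3D_DRIVER_TYPE type,
                            HMODULE software, UINT flags,
                            const D3D_FEATURE_LEVEL* levels, UINT numLevels,
                            UINT sdkVersion, const DXGI_SWAP_CHAIN_DESC* desc,
                            IDXGISwapChain** swapChain, ID3D11Device** device,
                            D3D_FEATURE_LEVEL* level, ID3D11DeviceContext** context)
{
    HRESULT hr = g_originalCreate(adapter, type, software, flags, levels, numLevels,
                                  sdkVersion, desc, swapChain, device, level, context);
    if (SUCCEEDED(hr) && swapChain && *swapChain) {
        void** vtable  = *reinterpret_cast<void***>(*swapChain);
        void*  present = vtable[8];                // IDXGISwapChain::Present slot
        MH_CreateHook(present, reinterpret_cast<void*>(&PresentDetour),
                      reinterpret_cast<void**>(&g_originalPresent));
        MH_EnableHook(present);
    }
    return hr;
}

// Installed before the target application initializes, e.g. from an injected DLL.
bool InstallCreationHook()
{
    if (MH_Initialize() != MH_OK) return false;
    if (MH_CreateHook(reinterpret_cast<void*>(&D3D11CreateDeviceAndSwapChain),
                      reinterpret_cast<void*>(&CreateDetour),
                      reinterpret_cast<void**>(&g_originalCreate)) != MH_OK)
        return false;
    return MH_EnableHook(MH_ALL_HOOKS) == MH_OK;
}
```

In this reading, recording the vtable addresses of the created source object loosely corresponds to the application source architecture data, while the table of hooked target interfaces corresponds to the application mapping architecture data.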
Therefore, by drawing and displaying the image directly in the underlying window, the communication mechanism of the system (such as the Windows message mechanism) can be bypassed, so that the live broadcast return information can be displayed and operated normally without exiting full-screen mode, improving the user experience while ensuring the user's normal operation.
For the device embodiment, since it is basically similar to the method embodiment, the description is simple, and for the relevant points, refer to the partial description of the method embodiment.
The algorithms and displays presented herein are not inherently related to any particular computer, virtual machine, or other apparatus. Various general purpose systems may also be used with the teachings herein. The required structure for constructing such a system will be apparent from the description above. Moreover, the present invention is not directed to any particular programming language. It is appreciated that a variety of programming languages may be used to implement the teachings of the present invention as described herein, and any descriptions of specific languages are provided above to disclose the best mode of the invention.
In the description provided herein, numerous specific details are set forth. It is understood, however, that embodiments of the invention may be practiced without these specific details. In some instances, well-known methods, structures and techniques have not been shown in detail in order not to obscure an understanding of this description.
Similarly, it should be appreciated that in the foregoing description of exemplary embodiments of the invention, various features of the invention are sometimes grouped together in a single embodiment, figure, or description thereof for the purpose of streamlining the disclosure and aiding in the understanding of one or more of the various inventive aspects. However, this method of disclosure should not be interpreted as reflecting an intention that the claimed invention requires more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive aspects lie in less than all features of a single foregoing disclosed embodiment. Thus, the claims following the detailed description are hereby expressly incorporated into this detailed description, with each claim standing on its own as a separate embodiment of this invention.
Those skilled in the art will appreciate that the modules in the device in an embodiment may be adaptively changed and disposed in one or more devices different from the embodiment. The modules or units or components of the embodiments may be combined into one module or unit or component, and furthermore they may be divided into a plurality of sub-modules or sub-units or sub-components. All of the features disclosed in this specification (including any accompanying claims, abstract and drawings), and all of the processes or elements of any method or apparatus so disclosed, may be combined in any combination, except combinations where at least some of such features and/or processes or elements are mutually exclusive. Each feature disclosed in this specification (including any accompanying claims, abstract and drawings) may be replaced by alternative features serving the same, equivalent or similar purpose, unless expressly stated otherwise.
Furthermore, those skilled in the art will appreciate that while some embodiments described herein include some features included in other embodiments, rather than other features, combinations of features of different embodiments are meant to be within the scope of the invention and form different embodiments. For example, in the following claims, any of the claimed embodiments may be used in any combination.
The various component embodiments of the invention may be implemented in hardware, or in software modules running on one or more processors, or in a combination thereof. It will be appreciated by those skilled in the art that a microprocessor or Digital Signal Processor (DSP) may be used in practice to implement some or all of the functions of some or all of the components in a live data display method and a live data display apparatus according to embodiments of the present invention. The present invention may also be embodied as apparatus or device programs (e.g., computer programs and computer program products) for performing a portion or all of the methods described herein. Such programs implementing the present invention may be stored on computer-readable media or may be in the form of one or more signals. Such a signal may be downloaded from an internet website or provided on a carrier signal or in any other form.
It should be noted that the above-mentioned embodiments illustrate rather than limit the invention, and that those skilled in the art will be able to design alternative embodiments without departing from the scope of the appended claims. In the claims, any reference signs placed between parentheses shall not be construed as limiting the claim. The word "comprising" does not exclude the presence of elements or steps not listed in a claim. The word "a" or "an" preceding an element does not exclude the presence of a plurality of such elements. The invention may be implemented by means of hardware comprising several distinct elements, and by means of a suitably programmed computer. In the unit claims enumerating several means, several of these means may be embodied by one and the same item of hardware. The usage of the words first, second and third, etcetera, does not indicate any ordering. These words may be interpreted as names.
The invention discloses a1 and a live data display method, which comprises the following steps: in the process of live broadcasting based on a target application, live broadcasting return information is obtained, wherein the live broadcasting return information is used for feeding back information except video data, and the live broadcasting return information comprises barrage data; and drawing a first display image corresponding to the live broadcast return information, and superposing the first display image to a second display image corresponding to the target application for displaying, wherein the second display image is used for generating live broadcast video data.
A2, the method as in A1, drawing a first display image corresponding to the live broadcast return information, and overlaying the first display image on a second display image corresponding to the target application for display, including: generating a sub-window and displaying a first display image of the live broadcast return information in a drawing sequence of the sub-window; and setting the sub-window on a second display image corresponding to the target application for display.
A3, the method as in A1, drawing a first display image corresponding to the live broadcast return information, and overlaying the first display image on a second display image corresponding to the target application for display, including: generating a transparent window and drawing a first display image for dynamically displaying the live broadcast return information on the transparent window; and overlaying the transparent window on a second display image corresponding to the target application for display.
A4, the method as in A1, drawing a first display image corresponding to the live return information, comprising: and calling the target interface according to application mapping architecture data, and drawing a first display image corresponding to the live broadcast return information, wherein the application mapping architecture data is determined according to application source architecture data of the target application.
A5, the method according to A4, superimposing onto the second display image corresponding to the target application for display, including: and acquiring a second display image created based on the application source architecture data, and displaying the first display image and the second display image in an overlapping manner.
A6, the method as in A1, drawing a first display image corresponding to the live broadcast return information, and overlaying the first display image on a second display image corresponding to the target application for display, including: and calling the target interface according to application mapping architecture data, and drawing superposed image data of a second display image corresponding to the target application and a first display image corresponding to the live broadcast return information.
A7, the method of any one of A4-A6, further comprising: creating application mapping architecture data and application source architecture data, in advance through the suspended target object, when the target application is initialized; determining at least one target interface according to the application mapping architecture data.
A8, the method as in A7, wherein creating application mapping architecture data and application source architecture data, in advance through a suspended target object, when the target application is initialized comprises: monitoring a creation function in the system, and suspending the target object when the creation function creates the target object; and, when the target object initializes the target application, determining a mapping object and interface calls according to the application information to build the application mapping architecture data, and determining a source object and interface calls according to the application information to build the application source architecture data.
A9, the method of A8, further comprising: when an object is created, creating a mapping object according to the application information and pointing to a source memory address, and then creating a source object and pointing to the source memory address; and when the interfaces are called, a group of hook programs are set at the corresponding interfaces according to the application information, and the interface information of each interface is determined.
A10, the method of A9, determining at least one target interface from the application mapping architecture data, comprising: determining at least one target interface for rendering a display image in dependence on the interface information based on the application mapping architecture data.
A11, the method of A1, further comprising: and generating live video data according to the second display image, and uploading the live video data.
A12, the method as in any of A1-A6, the live broadcast return information further comprising: service data and/or non-public information.
The invention also provides B13, a live data display device, comprising: the system comprises an acquisition module, a display module and a display module, wherein the acquisition module is used for acquiring live broadcast return information in the process of live broadcast based on a target application, the live broadcast return information is used for feeding back information except video data, and the live broadcast return information comprises barrage data; and the drawing and displaying module is used for drawing a first display image corresponding to the live broadcast return information and overlapping the first display image to a second display image corresponding to the target application for displaying, wherein the second display image is used for generating live broadcast video data.
B14, the apparatus as described in B13, the drawing display module comprising: the drawing submodule is used for generating a sub-window and displaying a first display image of the live broadcast return information in the drawing sequence of the sub-window; and the display sub-module is used for setting the sub-window on a second display image corresponding to the target application for displaying.
B15, the apparatus as described in B13, the drawing display module comprising: the drawing submodule is used for generating a transparent window and drawing a first display image which dynamically shows the live broadcast return information on the transparent window; and the display sub-module is used for overlaying the transparent window on a second display image corresponding to the target application for display.
B16, the apparatus as described in B13, the drawing display module comprising: and the drawing submodule is used for calling the target interface according to application mapping architecture data and drawing a first display image corresponding to the live broadcast return information, wherein the application mapping architecture data is determined according to application source architecture data of the target application.
B17, the apparatus as described in B16, the drawing display module further comprising: and the display sub-module is used for acquiring a second display image created based on the application source architecture data and displaying the first display image and the second display image in a superposition manner.
The device of B18, as defined in B13, the rendering and displaying module is configured to invoke the target interface according to application mapping architecture data, and render superimposed image data of a second display image corresponding to the target application and a first display image corresponding to the live broadcast return information.
B19, the apparatus of any of B16-B18, further comprising: the initialization module is used for creating application mapping architecture data and application source architecture data in an initialization target application in advance through a suspended target object; determining at least one target interface according to the application mapping architecture data.
B20, the apparatus of B19, the initialization module comprising: the suspension submodule is used for monitoring a creation function in the system and suspending the target object when the creation function creates the target object; and the architecture building submodule is used for, when the target object initializes the target application, determining a mapping object and interface calls according to the application information to build the application mapping architecture data, and determining a source object and interface calls according to the application information to build the application source architecture data.
B21, the device according to B20, wherein the framework building sub-module is further used for creating a mapping object according to the application information and pointing to a source memory address when creating an object, and then creating a source object and pointing to the source memory address; and when the interfaces are called, a group of hook programs are set at the corresponding interfaces according to the application information, and the interface information of each interface is determined.
B22, the apparatus as in B21, the initialization module further comprising: and the interface analysis submodule is used for determining at least one target interface for drawing a display image according to the interface information based on the application mapping architecture data.
B23, the apparatus of B13, further comprising: and the live broadcast module is used for generating live broadcast video data according to the second display image and uploading the live broadcast video data.
B24, the apparatus as in any of B13-B18, the live broadcast return information further comprising: service data and/or non-public information.

Claims (22)

1. A live data display method includes:
creating application mapping architecture data when initializing target application through a suspended target object in advance; determining at least one target interface according to the application mapping architecture data;
in the process of live broadcast based on a target application, live broadcast return information is acquired, wherein the live broadcast return information is used for feeding back information except video data, and the live broadcast return information comprises: barrage data, service data and/or non-public information;
drawing a first display image corresponding to the live broadcast return information, and superposing the first display image to a second display image corresponding to the target application for displaying, wherein the second display image is used for generating live broadcast video data; the drawing of the first display image corresponding to the live broadcast return information and the superposition of the first display image and the second display image corresponding to the target application for display comprises the following steps: and calling a target interface according to application mapping architecture data, and drawing superposed image data of a second display image corresponding to the target application and a first display image corresponding to the live broadcast return information.
2. The method of claim 1, wherein drawing a first display image corresponding to the live return information and superimposing the first display image on a second display image corresponding to the target application for display, further comprises:
generating a sub-window and displaying a first display image of the live broadcast return information in a drawing sequence of the sub-window;
and setting the sub-window on a second display image corresponding to the target application for display.
3. The method of claim 1, wherein drawing a first display image corresponding to the live return information and superimposing the first display image on a second display image corresponding to the target application for display comprises:
generating a transparent window and drawing a first display image for dynamically displaying the live broadcast return information on the transparent window;
and overlaying the transparent window on a second display image corresponding to the target application for display.
4. The method of claim 1, wherein rendering a first display image corresponding to the live return information comprises:
and calling the target interface according to application mapping architecture data, and drawing a first display image corresponding to the live broadcast return information, wherein the application mapping architecture data is determined according to application source architecture data of the target application.
5. The method of claim 1, wherein superimposing for display onto the target application-corresponding second display image comprises:
and acquiring a second display image created based on the application source architecture data, and overlapping and displaying the first display image and the second display image.
6. The method of any of claims 1, 4 and 5, further comprising:
and creating application source architecture data when the target application is initialized through the suspended target object in advance.
7. The method of claim 6, wherein creating application mapping schema data and application source schema data at an initialization target application in advance through a suspended target object comprises:
monitoring a creating function in a system, and suspending a target object when the creating function creates the target object;
when the target object initializes the target application, determining a mapping object and interface call according to the application information, building application mapping architecture data, determining a source object and interface call according to the application information, and building application source architecture data.
8. The method of claim 7, further comprising:
when an object is created, creating a mapping object according to the application information and pointing to a source memory address, and then creating a source object and pointing to the source memory address;
and when the interfaces are called, a group of hook programs are set at the corresponding interfaces according to the application information, and the interface information of each interface is determined.
9. The method of claim 8, wherein determining at least one target interface from the application mapping architecture data comprises:
determining at least one target interface for rendering a display image in dependence on the interface information based on the application mapping architecture data.
10. The method of claim 1, further comprising:
and generating live video data according to the second display image, and uploading the live video data.
11. The method of any of claims 1-5, wherein the live return information further comprises: service data and/or non-public information.
12. A live data display apparatus comprising:
the initialization module is used for creating application mapping architecture data when initializing the target application in advance through the suspended target object; determining at least one target interface according to the application mapping architecture data;
the system comprises an acquisition module, a display module and a display module, wherein the acquisition module is used for acquiring live broadcast return information in the process of live broadcast based on target application, the live broadcast return information is used for feeding back information except video data, and the live broadcast return information comprises barrage data, service data and/or non-public information;
the drawing and displaying module is used for drawing a first display image corresponding to the live broadcast return information and overlapping the first display image to a second display image corresponding to the target application for displaying, wherein the second display image is used for generating live broadcast video data; the drawing and displaying module is further used for: and calling a target interface according to application mapping architecture data, and drawing superposed image data of a second display image corresponding to the target application and a first display image corresponding to the live broadcast return information.
13. The apparatus of claim 12, wherein the drawing display module comprises:
the drawing submodule is used for generating a sub-window and displaying a first display image of the live broadcast return information in the drawing sequence of the sub-window;
and the display sub-module is used for setting the sub-window on a second display image corresponding to the target application for displaying.
14. The apparatus of claim 12, wherein the drawing display module comprises:
the drawing submodule is used for generating a transparent window and drawing a first display image which dynamically shows the live broadcast return information on the transparent window;
and the display sub-module is used for overlaying the transparent window on a second display image corresponding to the target application for display.
15. The apparatus of claim 12, wherein the drawing display module comprises:
and the drawing submodule is used for calling the target interface according to application mapping architecture data and drawing a first display image corresponding to the live broadcast return information, wherein the application mapping architecture data is determined according to application source architecture data of the target application.
16. The apparatus of claim 12, wherein the drawing display module further comprises:
and the display sub-module is used for acquiring a second display image created based on the application source architecture data and overlapping and displaying the first display image and the second display image.
17. The apparatus of claim 12 or 16, further comprising:
and the initialization module is used for creating application source architecture data in the initialization target application through the suspended target object in advance.
18. The apparatus of claim 17, wherein the initialization module comprises:
the suspension submodule is used for monitoring a function created in the system and suspending a target object when the created function creates the target object;
and the architecture building submodule is used for, when the target object initializes the target application, determining a mapping object and interface calls according to application information to build application mapping architecture data, and determining a source object and interface calls according to the application information to build application source architecture data.
19. The apparatus of claim 18,
the framework building submodule is also used for building a mapping object according to the application information and pointing to a source memory address when the object is built, and then building a source object and pointing to the source memory address; and when the interfaces are called, a group of hook programs are set at the corresponding interfaces according to the application information, and the interface information of each interface is determined.
20. The apparatus of claim 19, wherein the initialization module further comprises:
and the interface analysis submodule is used for determining at least one target interface for drawing a display image according to the interface information based on the application mapping architecture data.
21. The apparatus of claim 12, further comprising:
and the live broadcast module is used for generating live broadcast video data according to the second display image and uploading the live broadcast video data.
22. The apparatus of any of claims 12-16, wherein the live return information further comprises: service data and/or non-public information.
CN201611229470.0A 2016-12-27 2016-12-27 Live data display method and device Active CN106713968B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201611229470.0A CN106713968B (en) 2016-12-27 2016-12-27 Live data display method and device
PCT/CN2017/118840 WO2018121557A1 (en) 2016-12-27 2017-12-27 Live broadcast data display method, device, program and medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201611229470.0A CN106713968B (en) 2016-12-27 2016-12-27 Live data display method and device

Publications (2)

Publication Number Publication Date
CN106713968A CN106713968A (en) 2017-05-24
CN106713968B true CN106713968B (en) 2020-04-24

Family

ID=58895498

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201611229470.0A Active CN106713968B (en) 2016-12-27 2016-12-27 Live data display method and device

Country Status (2)

Country Link
CN (1) CN106713968B (en)
WO (1) WO2018121557A1 (en)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106713968B (en) * 2016-12-27 2020-04-24 北京奇虎科技有限公司 Live data display method and device
CN110147185B (en) * 2018-11-16 2021-02-26 腾讯科技(深圳)有限公司 Message prompting method, device, electronic device and storage medium
CN110536094A (en) * 2019-08-27 2019-12-03 上海盛付通电子支付服务有限公司 A kind of method and apparatus transmitting information in video call process
CN112911329B (en) * 2021-02-03 2023-08-25 广州虎牙科技有限公司 Window live broadcast method, device, electronic equipment and computer readable storage medium
CN113225606B (en) * 2021-04-30 2022-09-23 上海哔哩哔哩科技有限公司 Video barrage processing method and device
CN113938753B (en) * 2021-12-20 2022-04-26 北京搜狐新动力信息技术有限公司 Live broadcast data processing method and device, storage medium and equipment

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101500125A (en) * 2008-02-03 2009-08-05 突触计算机系统(上海)有限公司 Method and apparatus for providing user interaction during displaying video on customer terminal
CN104902318A (en) * 2015-04-29 2015-09-09 小米科技有限责任公司 Playing control method and terminal device
CN105597321A (en) * 2015-12-18 2016-05-25 武汉斗鱼网络科技有限公司 Barrage display method and system in full-screen game state
WO2016154149A1 (en) * 2015-03-20 2016-09-29 Twitter, Inc. Live video stream sharing
CN106162230A (en) * 2016-07-28 2016-11-23 北京小米移动软件有限公司 The processing method of live information, device, Zhu Boduan, server and system

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN100451956C (en) * 2006-05-24 2009-01-14 北京壁虎科技有限公司 Screen display method based on drawing engine
CN101227421B (en) * 2007-01-16 2012-04-18 沃天醒石(北京)科技有限公司 Instantaneous communication method and apparatus under full screen graphics mode
US20130198774A1 (en) * 2012-01-30 2013-08-01 Consultants Net Creation Inc. Live broadcasting of dynamically generated content
CN106713968B (en) * 2016-12-27 2020-04-24 北京奇虎科技有限公司 Live data display method and device

Also Published As

Publication number Publication date
CN106713968A (en) 2017-05-24
WO2018121557A1 (en) 2018-07-05

Similar Documents

Publication Publication Date Title
CN106658145B (en) Live broadcast data processing method and device
CN106713968B (en) Live data display method and device
CN108289234B (en) Virtual gift special effect animation display method, device and equipment
CN108939556B (en) Screenshot method and device based on game platform
WO2022048097A1 (en) Single-frame picture real-time rendering method based on multiple graphics cards
WO2018000609A1 (en) Method for sharing 3d image in virtual reality system, and electronic device
CN110968962B (en) Three-dimensional display method and system based on cloud rendering at mobile terminal or large screen
CN106791915B (en) Method and device for displaying video image
CN109845250B (en) Method and system for sharing effect of image
US20170171277A1 (en) Method and electronic device for multimedia recommendation based on android platform
CN114650434A (en) Cloud service-based rendering method and related equipment thereof
CN112689168A (en) Dynamic effect processing method, dynamic effect display method and dynamic effect processing device
WO2024051541A1 (en) Special-effect image generation method and apparatus, and electronic device and storage medium
WO2024051540A1 (en) Special effect processing method and apparatus, electronic device, and storage medium
WO2016066056A1 (en) Image remote projection method, server and client
US11095956B2 (en) Method and system for delivering an interactive video
CN106445535B (en) Operation processing method and device
CN115484489A (en) Resource processing method, resource processing device, electronic device, storage medium, and program product
CN115661011A (en) Rendering method, device, equipment and storage medium
CN111913761B (en) Plug-in processing method, device, equipment and storage medium for live channel
CN108271033B (en) Video live broadcast method and device
CN114398018A (en) Picture display method, device, storage medium and electronic equipment
CN114428573A (en) Special effect image processing method and device, electronic equipment and storage medium
US20170186218A1 (en) Method for loading 360 degree images, a loading module and mobile terminal
US20190370932A1 (en) Systems And Methods For Transforming Media Artifacts Into Virtual, Augmented and Mixed Reality Experiences

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant