WO2018121556A1 - Live data processing method, apparatus, program and medium - Google Patents

Live data processing method, apparatus, program and medium

Info

Publication number
WO2018121556A1
Authority
WO
WIPO (PCT)
Prior art keywords
application
interface
data
information
live broadcast
Prior art date
Application number
PCT/CN2017/118839
Other languages
English (en)
French (fr)
Inventor
葛山
董晶阳
Original Assignee
北京奇虎科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 北京奇虎科技有限公司
Publication of WO2018121556A1

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N 21/431 Generation of visual interfaces for content selection or interaction; Content or additional data rendering
    • H04N 21/4312 Generation of visual interfaces for content selection or interaction; Content or additional data rendering involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N 21/44 Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N 21/442 Monitoring of processes or resources, e.g. detecting the failure of a recording device, monitoring the downstream bandwidth, the number of times a movie has been viewed, the storage space available from the internal hard disk
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/47 End-user applications
    • H04N 21/472 End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content
    • H04N 21/47205 End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content for manipulating displayed content, e.g. interacting with MPEG-4 objects, editing locally
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/47 End-user applications
    • H04N 21/485 End-user interface for client configuration

Definitions

  • the present invention relates to the field of computer technology, and in particular, to a live data processing method, a live data processing device, a computer program, and a computer readable medium.
  • Webcasting is a multimedia interaction mode in which live video is captured by a signal-collection device, uploaded to a server, and then delivered by the server to user terminals; it also allows viewers to interact with the webcast anchor, that is, a viewer's message can be fed back to the anchor so that the anchor can respond to it.
  • Such interactive messages are displayed within the live broadcast picture.
  • However, interactive messages usually suffer a certain delay during webcasting, which degrades the live broadcast experience.
  • the present invention has been made in order to provide a live data processing method and corresponding live data processing apparatus, computer program and computer readable medium that overcome the above problems or at least partially solve the above problems.
  • According to one aspect, a live data processing method is provided, which includes: acquiring live broadcast return information in the process of performing a live broadcast based on a target application, where the live broadcast return information is used to feed back information other than live video data and includes barrage data; drawing and rendering a display image in which the live broadcast return information is superimposed on the application interface of the target application, and displaying the live broadcast interface according to the display image; generating live video data carrying the barrage data according to the live broadcast interface; and uploading the live video data.
  • Optionally, drawing and rendering the display image in which the live broadcast return information is superimposed on the application interface of the target application includes: calling the first interface according to the application mapping architecture data to draw an information feedback image corresponding to the live broadcast return information; acquiring the application interface drawn based on the application source framework data; and superimposing and rendering the information feedback image and the application interface to obtain the display image, where the application mapping architecture data is determined according to the application source framework data of the target application.
  • Optionally, drawing and rendering the display image includes: calling the first interface according to the application mapping architecture data, directly drawing a display image in which the information feedback image corresponding to the live broadcast return information is superimposed on the application interface of the target application, and rendering the display image.
  • displaying the live broadcast interface according to the display image comprises: displaying the display image sequentially on the terminal screen according to the time information of the display image, and forming a live broadcast interface of the target application.
  • generating the live video data with the barrage data according to the live broadcast interface including: calling the second interface according to the application mapping architecture data, acquiring the display image; and according to the time information of the display image, The display image constitutes the live video data.
  • the live broadcast return information further includes: service data and/or non-public information.
  • the information feedback image corresponding to the live broadcast return information includes a public image and a private image, wherein the public image is drawn according to the barrage data and the service data, and the private image is drawn according to the non-public information.
  • acquiring the display image comprises: acquiring a display image generated by an application interface of the target application and a public image overlay.
  • Optionally, the method further includes: creating, in advance by using a suspended target object, the application mapping architecture data and the application source architecture data when the target application is initialized; and determining at least one target interface according to the application mapping architecture data.
  • Optionally, creating the application mapping architecture data and the application source architecture data by using the suspended target object when initializing the target application includes: monitoring a creation function in the system, and suspending the target object when the creation function creates the target object; and, when the target object initializes the target application, determining the mapping objects and interface calls according to the application information to build the application mapping architecture data, and determining the source objects and interface calls according to the application information to build the application source architecture data.
  • Optionally, when an object is created, a mapping object is created according to the application information and pointed to the source memory address, and the source object is then created and pointed to the same source memory address; when an interface is invoked, a set of hook programs is set at the corresponding interfaces according to the application information, and the interface information of each interface is determined.
  • Optionally, determining the at least one target interface according to the application mapping architecture data includes: determining, according to the interface information in the application mapping architecture data, at least one target interface for drawing a display image.
  • According to another aspect, a live data processing apparatus is provided, which includes: an obtaining module, configured to acquire live broadcast return information during a live broadcast based on a target application, where the live broadcast return information is used to feed back information other than live video data and includes barrage data; a rendering display module, configured to draw and render a display image in which the live broadcast return information is superimposed on the application interface of the target application, and to display the live broadcast interface according to the display image; and a live broadcast return module, configured to generate live video data carrying the barrage data according to the live broadcast interface and to upload the live video data.
  • the rendering display module includes: a rendering module, configured to invoke the first interface according to the application mapping architecture data, draw an information feedback image corresponding to the live broadcast return information, and obtain an application interface that is drawn based on the application source frame data. And superimposing and rendering the information feedback image and the application interface to obtain a display image; wherein the application mapping architecture data is determined according to application source framework data of the target application.
  • the rendering display module includes: a drawing module, configured to invoke the first interface according to the application mapping architecture data, and draw a display image of the information feedback image corresponding to the live return information on the application interface of the target application , rendering the display image.
  • the drawing display module includes: a display module, configured to sequentially display the display image on the terminal screen according to the time information of the displayed image, and constitute a live broadcast interface of the target application.
  • the live broadcast returning module includes: an obtaining module, configured to invoke the second interface according to the application mapping architecture data to obtain the display image; and a live broadcast generating module, configured to: according to time information of the displayed image, Each of the display images constitutes the live video data.
  • the live broadcast return information further includes: service data and/or non-public information.
  • the information feedback image corresponding to the live broadcast return information includes a public image and a private image, wherein the public image is drawn according to the barrage data and the service data, and the private image is drawn according to the non-public information.
  • the acquiring module is configured to acquire a display image generated by an application interface of the target application and a public image overlay.
  • the method further includes: an initialization module, configured to create application mapping architecture data and application source architecture data in the initialization target application by using the suspended target object in advance; and determining at least one target interface according to the application mapping architecture data.
  • Optionally, the initialization module includes: a suspending module, configured to monitor a creation function in the system and suspend the target object when the creation function creates the target object; and an architecture building module, configured to, when the target object initializes the target application, determine the mapping objects and interface calls according to the application information to build the application mapping architecture data, and determine the source objects and interface calls according to the application information to build the application source architecture data.
  • Optionally, the architecture building module is further configured to: when an object is created, create a mapping object according to the application information and point it to the source memory address, and then create the source object and point it to the same source memory address; and when an interface is invoked, set a set of hook programs at the corresponding interfaces according to the application information and determine the interface information of each interface.
  • the initialization module further includes: an interface analysis module, configured to determine, according to the application mapping architecture data, at least one target interface for drawing a display image according to the interface information.
  • An embodiment of the present invention further provides a computer program comprising computer readable code which, when run on a computing device, causes the computing device to perform the live data processing method described in any embodiment of the present invention.
  • An embodiment of the present invention further provides a computer readable medium in which the above computer program is stored.
  • In the embodiments of the present invention, live broadcast return information is acquired in the process of performing a live broadcast based on the target application, where the live broadcast return information is used to feed back information other than live video data and includes barrage data; a display image in which the live broadcast return information is superimposed on the application interface of the target application is drawn and rendered, and the live broadcast interface is displayed according to the display image; and live video data carrying the barrage data is generated according to the live broadcast interface and uploaded. Live video data that already carries barrage and similar data is thus generated directly on the broadcasting end, so the live video and the barrage do not need to be synthesized on the server side, which reduces the delay.
  • FIG. 1 is a flow chart showing the steps of an embodiment of a live data processing method according to an embodiment of the present invention
  • FIG. 2 is a flow chart showing the steps of an embodiment of a live data processing method according to another embodiment of the present invention.
  • FIG. 3 is a flow chart showing the steps of another embodiment of a live data processing method according to another embodiment of the present invention.
  • FIG. 4 is a structural block diagram of an embodiment of a live data processing apparatus according to an embodiment of the present invention.
  • FIG. 5 is a structural block diagram of an embodiment of a live data processing apparatus according to another embodiment of the present invention.
  • FIG. 6 shows a block diagram of a computing device for performing a live data processing method in accordance with the present invention.
  • FIG. 7 shows a storage unit for holding or carrying program code implementing a live data processing method in accordance with the present invention.
  • Referring to FIG. 1, a flow chart of the steps of a live data processing method according to an embodiment of the present invention is shown.
  • Step 102 Acquire live broadcast return information in the process of performing live broadcast based on the target application, where the live broadcast return information is used to feed back information other than live video data, where the live broadcast return information includes barrage data.
  • the live broadcast of the present embodiment is based on the live broadcast of the target application, that is, the data of the target application is used as the live broadcast data.
  • For example, a game live broadcast is a process in which the user plays a game with the game application while the image data of the game application is used as the live broadcast data.
  • The anchor is the user who performs the live broadcast; in a game live broadcast, for example, the game player is the anchor.
  • During the live broadcast, viewers can send barrage data, send gifts to the anchor, and so on, where barrage (danmaku) refers to a display mode in which large numbers of viewer comments float across the video screen as subtitle-style pop-ups.
  • the live broadcast return information may be obtained during the live broadcast based on the target application.
  • The live broadcast return information of this embodiment is used to feed back information other than the live video data; that is, it consists of interactive messages that do not carry live video data, and may include barrage data as well as gifts and other data.
  • Step 104: Draw and render a display image in which the live broadcast return information is superimposed on the application interface of the target application, and display the live broadcast interface according to the display image.
  • Step 106 Generate live video data with barrage data according to the live broadcast interface, and upload the live video data.
  • In a system, each displayed interface and window can be regarded as a frame image. Therefore, while the target application is running, its application interface must be drawn and rendered, and in that process the live broadcast return information is drawn superimposed on the application interface to obtain the corresponding display image; a display image is drawn for each frame, and the frames are combined in the order of their timestamps to present the live broadcast interface, which shows the barrage data as well as other return data such as gifts. Live video data carrying the barrage data can then be generated from the live broadcast interface; that is, the display images of the individual frames are combined into the live video data, and the live video data carrying the barrage data is uploaded.
  • the live video data with the data such as the barrage is directly generated on the live broadcast terminal, and the live broadcast video and the live broadcast information such as the barrage are not required to be synthesized on the server side, thereby reducing the delay.
  • Referring to FIG. 2, a flow chart of the steps of a live data processing method according to another embodiment of the present invention is shown; the method may specifically include the following steps:
  • Step 202: Using the suspended target object, create in advance the application mapping architecture data and the application source architecture data when initializing the target application.
  • The system is monitored in advance for the system creation function, such as the create function. When the creation function is called to create the target object, the target object can be suspended; that is, a first hook program is set while the target object is being initialized, thereby placing the hook program at the source of the target object.
  • This target object is an important component object of the system and is used to perform the operations required by various applications.
  • the creation, operation, etc. of some applications in the system require the participation of the target object. Therefore, by setting a hook at the source of the target object, each call to the target object can be intercepted to determine the required information.
  • An application needs to invoke the target object to perform its operations, so the application information present when the target object is invoked can be intercepted by the first hook program; this application information indicates the application source architecture data of the target application. That is, when the target application is created, it calls a series of interfaces of the target object to acquire data and creates the function objects it requires, and the application source architecture data of the target application can be determined from these interface calls and function objects, allowing the target application to be started and run.
  • The application information is first used to construct the application mapping architecture data of the target application, and then the application source architecture data; that is, the required operations are determined from the application information, the mapping information is created first, and the source information is created afterwards, with both pointing to the same content address. The mapping architecture data thus forms a shell over the same data as the application source architecture, while the substantive content is still provided by the application source architecture data; this keeps additional memory consumption small while allowing the operations of the target application to be learned.
  • a hook program can be set in various target objects of various systems.
  • A Component Object Model (COM) component is taken as an example of the target object; when an initialization function runs, a COM component can be initialized.
  • The first hook program is set for the COM component, so that the hook is placed at the source of the COM component to take over the COM component.
  • COM is Microsoft's software component technology for interaction between servers and clients; it is an object-oriented programming model that defines how objects behave within a single application or between multiple applications.
  • COM is implemented on multiple platforms, not limited to the Windows operating system.
  • For example, a game client uses the DirectX (DX) 3D engine, a multimedia programming interface created by Microsoft, implemented in the C++ programming language and following the COM component model. When the game client starts running and a 3D engine operation is involved, the COM component must be called, and the application information of that call can be obtained through the first hook program set in the COM component, thereby determining the source architecture data of the game client's 3D engine, that is, the various interfaces and function objects required for the 3D engine to run.
  • Step 204 Determine at least one target interface according to the application mapping architecture data.
  • The target interfaces include the various interfaces for drawing, rendering, and displaying images.
  • the target interface includes a first interface for drawing image data and a second interface for outputting image data.
  • Step 206 Acquire live broadcast return information during the live broadcast based on the target application.
  • Step 208 Call the first interface according to the application mapping architecture data, draw an information feedback image corresponding to the live broadcast return information, and obtain an application interface that is drawn based on the application source framework data.
  • Step 210: Superimpose and render the information feedback image and the application interface to obtain a display image.
  • Full-screen mode can be adopted, in which the display interface of the target application, that is, the interface image, occupies the entire screen and the input focus is located in the target application.
  • the interface and window displayed in the system can be regarded as one frame of image.
  • the information feedback image corresponding to the live broadcast return information is drawn based on the development technical principle of the target application.
  • The pre-injected hook program may be used to call the first interface to draw the live image, following the image drawing, rendering, and display defined by the application mapping architecture data.
  • Games usually run in full-screen mode, so the game client's application interface drawn from the application source framework data can be used to generate the live data; the target interface is called to draw the information feedback image corresponding to the live broadcast return information, and the information feedback image is superimposed on the application interface, so that to the user the live barrage, gifts, private messages, and the like appear to be displayed over the full-screen game.
  • Step 212: Call the first interface according to the application mapping architecture data, draw a display image in which the information feedback image corresponding to the live broadcast return information is superimposed on the application interface of the target application, and render the display image.
  • Instead of drawing the application interface and the information feedback image separately and then displaying them superimposed, the first interface may be invoked according to the application mapping architecture data to draw the superimposed display image directly; that is, display image data in which the application interface and the information feedback image are already superimposed is drawn. A live broadcast based on such superimposed display images can therefore directly return live data that already carries the barrage, gifts, and the like, without the server side having to superimpose such data onto the live data.
  • Step 214 Display the display image sequentially on the terminal screen according to the time information of the displayed image to form a live broadcast interface of the target application.
  • the time information of each display image is obtained, that is, the time stamp of each display image is acquired, and the display images are combined according to the time stamp, and each frame display image is sequentially displayed on the terminal screen to form a live broadcast interface of the target application.
  • Step 216 Call the second interface according to the application mapping architecture data to obtain the display image.
  • Step 218 The display images are formed into the live video data according to time information of the displayed image.
  • Step 220 Upload the live video data.
  • A live image carrying data such as barrage data and gifts is generated directly on the broadcasting end and does not need to be synthesized on the server. The second interface may therefore be invoked according to the application mapping architecture data to acquire the display images; the display images are synthesized according to their time information to generate the corresponding live video data, and the live video data is uploaded.
  • the live broadcast return information includes: barrage data, service data, and/or non-public information.
  • the service data is determined according to a specific service.
  • For a game service, for example, the service data is electronic item data such as gifts.
  • The non-public information includes data such as private messages sent to the anchor and permission questions, where a permission question may be question data made available after the corresponding permission has been purchased. Accordingly, the information feedback image corresponding to the live broadcast return information includes a public image and a private image, where the public image is drawn according to the barrage data and the service data, and the private image is drawn according to the non-public information.
  • the display image generated by the application interface of the target application and the public image overlay may be acquired, and live data that does not carry the non-public information may be generated.
  • For the method of drawing the application interface and the information feedback image separately, the display image generated by superimposing the public image on the application interface may be obtained directly; for the method of directly drawing the superimposed image in step 212, a superimposed image of the application interface and the public image may be drawn, while the private image is drawn separately and then superimposed locally, ensuring that the uploaded live data does not carry non-public information.
  • The image drawing described above is performed based on the architectural principles of the target application. Therefore, the architecture of the target application, together with the interfaces, functions, and other information under that architecture, may be determined in advance, and image drawing may then be provided in full screen mode.
  • the architecture of the target application can be determined.
  • This embodiment takes a game client as the target application; the game client may adopt the DX 3D engine. In the Windows operating system, the 3D engine is accelerated by the hardware graphics processor (Graphics Processing Unit, GPU) and reads and writes memory directly, which can bypass the message mechanism.
  • Referring to FIG. 3, a flow chart of the steps of another live data processing method embodiment according to another embodiment of the present invention is shown, which may specifically include the following steps:
  • Step 302 Monitor a creation function in the system, and suspend the target object when the creation function creates a target object.
  • The system is monitored in advance; specifically, the system's creation function, such as the create function, is monitored.
  • When the creation function is called to create the target object, the target object can be suspended; that is, a first hook program is set when the target object is initialized, thereby setting the hook program at the source of the target object.
  • This target object is an important component object of the system and is used to perform the operations required by various applications.
  • In embodiments of the present invention, hook programs can be set on various target objects of various systems.
  • This embodiment takes the Component Object Model (COM) as an example target object: when the creation function initializes a COM component, the first hook program can be set for the COM component, so that the hook is set at the source of the COM component to take over the COM component.
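  A minimal sketch of hooking the creation function at the source of a target object. The names (`realCreate`, `hookedCreate`, the global function-pointer slot) are hypothetical stand-ins; real COM takeover would patch the component's actual creation routines, which are not modeled here.

```cpp
#include <string>
#include <vector>

// Hypothetical stand-in for a system "target object"; in the embodiment this
// would be a COM component.
struct TargetObject {
    std::string name;
};

using CreateFn = TargetObject* (*)(const std::string&);

// Original creation routine ("create function").
TargetObject* realCreate(const std::string& name) {
    return new TargetObject{name};
}

// Global slot the "system" calls through; hooking replaces this pointer.
CreateFn g_create = realCreate;

// Log of intercepted application information.
std::vector<std::string>& creationLog() {
    static std::vector<std::string> log;
    return log;
}

// First hook program: records which application requested the target object,
// then forwards to the original creation routine.
TargetObject* hookedCreate(const std::string& name) {
    creationLog().push_back(name);  // intercepted "application information"
    return realCreate(name);
}

void installCreationHook() { g_create = hookedCreate; }
```

Once the hook is installed, every creation goes through `hookedCreate` first, so each call to the target object can be intercepted at its source.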
  • Step 304 Intercept application information when the target object is called to create the target application.
  • The creation and operation of some applications in the system require the participation of the target object. Therefore, by setting a hook at the source of the target object, every call to the target object can be intercepted to determine the required information.
  • In this embodiment, an application needs to invoke the target object to perform operations when it is created, so the application information present when the target object is invoked may be intercepted by the first hook program; this application information is used to indicate the application source architecture data of the target application. That is, when the target application is created, a series of interfaces of the target object need to be called to acquire data, the required function objects created, and so on; through these interface calls and function objects, the application source architecture data of the target application can be determined, so that the target application can be started and run.
  • For example, a game client adopts the DirectX (Direct eXtension, DX) 3D engine; DirectX is a multimedia programming interface created by Microsoft Corporation, implemented in the C++ programming language, and follows the COM component model. Therefore, when the game client is started and a 3D engine operation is involved, the COM component needs to be called; through the first hook program set in the COM component, the application information of the call to the COM component can be obtained, thereby determining the source architecture data of the game client's 3D engine, that is, the various interfaces and function objects required for the 3D engine to run.
  • Step 306 When creating an object, create a mapping object according to the application information and point it to a source memory address, then create a source object and point it to the same source memory address.
  • Step 308 respectively setting a hook program on each interface corresponding to each source object.
  • Step 310 Build application mapping architecture data according to the mapping object and the interface call.
  • Step 312 Build the application source architecture data according to the source object and the interface call.
  • In this embodiment, when the target object creates content related to the target application, the objects to be created and the interfaces to be called can be determined according to the application information.
  • The mapping architecture data is used to construct a shell identical to the application source architecture data, while the substance is still provided by the application source architecture data. Therefore, when the application information indicates that an object needs to be created, the information about the object to be created can be determined; a mapping object is created and pointed to the corresponding source memory address, and the source object is then created and pointed to the same source memory address.
  • The mapping objects and interface calls are used to build the application mapping architecture data, and the source objects and interface calls are used to build the application source architecture data, thereby obtaining a shell identical to the application source architecture data, namely the application mapping architecture data. The content actually defined and called through the application mapping architecture data can correspond to the application source architecture data, and can also be mapped into the application source architecture data for processing.
  • For example, when the 3D engine of a game client calls the COM component, the call information is intercepted. Suppose the call information indicates that function A is created, function A calls interface B and function C, and function C calls interfaces D and E. Mapping functions A' and C' can then be created, hook programs set at interfaces B, D, and E respectively, and the correspondences of the mapping functions established; the source functions A and C and their correspondences with interfaces B, D, and E are then created. A shell identical to the game client's source architecture data, namely the mapping architecture data, is thereby built, through which calls can be mapped to the source architecture data.
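  The mapping-object "shell" can be sketched as a thin proxy whose state lives entirely in the source object at the shared memory address. `SourceFunctionA` and `MappingFunctionA` are illustrative names for the patent's function A and mapping function A', not actual types from the embodiment.

```cpp
// Source object: holds the real state and the real behavior.
struct SourceFunctionA {
    int value = 0;
    int call(int x) {
        value = x;       // substance lives here
        return x * 2;
    }
};

// Mapping function A': a shell with the same surface as the source object,
// holding no substance of its own -- every call is forwarded to the source
// object at the shared memory address.
struct MappingFunctionA {
    SourceFunctionA* source;  // points at the source memory address
    int call(int x) { return source->call(x); }
};
```

Because the shell only stores one pointer, the mapping architecture consumes very little extra memory while still exposing the same call surface.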
  • Step 314 Based on the application mapping architecture data, determine, according to the interface information, at least one target interface for drawing the display image.
  • When building the application mapping architecture data, the various interfaces called by the target application and the function of each interface can be determined, so that the at least one required target interface can be determined; the target interfaces include the various interfaces used for drawing, rendering, and displaying images.
  • That is, a series of image-related target interfaces can be determined, and each target interface is injected with a hook program, so that the target interfaces can subsequently be called directly through the hook programs.
  • the target interface includes a first interface and a second interface.
  • In this way, the application information can be intercepted when the target application is initialized, and the application mapping architecture data and application source architecture data of the target application can be built; that is, the architecture of the target application is analyzed, so that subsequent operations can be performed based on this architecture.
  • Step 316 Perform live broadcast based on the target application, and obtain live broadcast return information.
  • Step 318 Call the first interface according to the application mapping architecture data, draw the information feedback image corresponding to the live broadcast return information, and obtain the application interface drawn based on the application source framework data.
  • Step 320 Superimpose the barrage image and the application interface and render them to obtain the display image.
  • After the target application runs, it can adopt the full screen mode, in which the display interface of the target application, that is, the interface image, fills the entire window and the focus lies in the target application. From the perspective of the underlying system and screen display, each interface and window displayed in the system can be regarded as one frame of image.
  • This embodiment draws the information feedback image corresponding to the live broadcast return information based on the development technology principles of the target application, according to the application mapping architecture data.
  • Therefore, after the application interface of the target application is drawn based on the application source framework data and the live broadcast return information is obtained, the pre-injected hook program may be used to call the first interface and, following the image drawing, rendering, and display manner corresponding to the application mapping architecture data, draw the information feedback image corresponding to the live broadcast return information; the application interface created based on the application source framework data is obtained, and the information feedback image and the application interface are superimposed and rendered to obtain the display image, yielding an image that displays barrage data, gifts, and other live broadcast return information. Each frame of image is drawn, superimposed, and rendered in this manner, producing a dynamic picture on the screen.
  • For example, for a game client, a user playing a client game typically adopts full screen mode while live broadcasting the game. The application interface of the game client created from the application source framework data may then be used, and live data generated based on that application interface; the target interface is called according to the application mapping architecture data, the information feedback image corresponding to the live broadcast return information is drawn, and the information feedback image and the application interface are superimposed for display, so that to the user it appears that the live broadcast barrage, gifts, private messages, and similar data are displayed within the full-screen game, improving the user experience and the flexibility of the live broadcast.
  • Step 322 Call the first interface according to the application mapping architecture data, draw the display image in which the information feedback image corresponding to the live broadcast return information is superimposed on the application interface of the target application, and render the display image.
  • The above approach draws the application interface and the information feedback image separately and then displays them superimposed. In actual processing, the first interface may also be invoked according to the application mapping architecture data to directly draw the superimposed display image, that is, to directly draw display image data in which the application interface and the information feedback image are already superimposed. The subsequent live broadcast based on the superimposed display images can then directly return live data to which barrage, gifts, and other data have already been added, without superimposing barrage and similar data onto the live data on the service side.
  • Step 324 According to the time information of the display images, display the display images on the terminal screen in sequence to form the live broadcast interface of the target application.
  • Obtaining the time information of each display image means acquiring the timestamp of each display image; the display images are combined according to these timestamps, and each frame of display image is displayed on the terminal screen in sequence to form the live broadcast interface of the target application.
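  The timestamp-ordered combination of display images can be sketched as follows; `DisplayImage` and `composeLiveSequence` are hypothetical names for illustration, with an integer frame id standing in for the actual image data.

```cpp
#include <algorithm>
#include <vector>

struct DisplayImage {
    long long timestampMs;  // time information of the display image
    int frameId;            // stand-in for the frame's pixel data
};

// Combines display images into the live sequence by their timestamps, so the
// frames are shown (and later encoded) in chronological order.
std::vector<int> composeLiveSequence(std::vector<DisplayImage> frames) {
    std::sort(frames.begin(), frames.end(),
              [](const DisplayImage& a, const DisplayImage& b) {
                  return a.timestampMs < b.timestampMs;
              });
    std::vector<int> order;
    order.reserve(frames.size());
    for (const auto& f : frames) order.push_back(f.frameId);
    return order;
}
```

The same ordering is reused in steps 326-328 when the display images are composed into the uploaded live video data.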
  • Step 326 Call the second interface according to the application mapping architecture data to obtain the display images.
  • Step 328 Compose the display images into the live video data according to the time information of the display images.
  • Step 330 Upload the live video data.
  • In this embodiment, the live broadcast end directly generates live images carrying barrage data, gifts, and other data, which do not need to be synthesized on the server. Therefore, the second interface may be invoked according to the application mapping architecture data to acquire the display images; the display images are synthesized according to their time information to generate the corresponding live video data, and the live video data is uploaded.
  • For example, when the game client is initialized through the COM component, the corresponding source architecture data and mapping architecture data are determined, along with the target interfaces required by the game client to draw, render, and display images. After initialization is complete, the game client can be run and enter full screen mode.
  • The user broadcasts the game live while playing. For live broadcast barrage and similar data, the corresponding first display image can be drawn by calling the target interface based on the image drawing principles of the 3D engine; the first display image is superimposed on the game client's display interface image, or the superimposed image is drawn directly in place of the game client's display interface image.
  • In fact, while the target application is running, the interface image of the target application, that is, the second display image, may be drawn and displayed every frame, and the first display image responding to the live broadcast return information is likewise superimposed on the second display image after each frame is drawn. Therefore, even if the image is still from the user's perspective, from the system's perspective the corresponding image is drawn every frame.
  • Taking the game client as an example, the displayed game interface image is drawn and rendered every frame, so the first display image shown on the game interface image is likewise superimposed, after each frame is drawn and rendered, onto the game interface image at the corresponding position for display.
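  A minimal sketch of the per-frame behavior described above: the interface image is redrawn every frame, and the overlay is re-superimposed on each newly drawn frame even when the scene looks static to the user. The string "frames" stand in for real rendered surfaces.

```cpp
#include <string>
#include <vector>

// Per-frame rendering loop: each iteration draws a fresh interface image and
// re-applies the information feedback overlay to that new frame.
std::vector<std::string> renderFrames(int frameCount, const std::string& overlay) {
    std::vector<std::string> shown;
    shown.reserve(frameCount);
    for (int f = 0; f < frameCount; ++f) {
        std::string interfaceImage = "frame" + std::to_string(f);  // redrawn each frame
        shown.push_back(interfaceImage + "+" + overlay);           // overlay re-applied
    }
    return shown;
}
```

Because the overlay is drawn into each frame rather than placed in a separate window, no window message handling is required and the application can stay in full screen.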
  • Through the above approach, the communication mechanism of the system (such as the message mechanism of Windows) can be bypassed, so that the display and manipulation of the live broadcast return information can be performed normally without exiting full screen, ensuring the user's normal operation and improving the user experience.
  • the embodiment further provides a live data processing apparatus.
  • FIG. 4 a structural block diagram of an embodiment of a live data processing apparatus according to an embodiment of the present invention is shown, which may specifically include the following modules:
  • the obtaining module 402 is configured to obtain live broadcast return information during the live broadcast based on the target application, where the live broadcast return information is used to feed back information other than the video data, where the live broadcast return information includes the barrage data.
  • the rendering display module 404 is configured to render an application interface of the target application and a display image superimposed by the live broadcast information, and display a live broadcast interface according to the display image.
  • the live broadcast returning module 406 is configured to generate live video data with barrage data according to the live broadcast interface, and upload the live video data.
  • In this way, live broadcast return information is obtained during live broadcast based on the target application, where the live broadcast return information is used to feed back information other than live video data and includes barrage data; the display image in which the application interface of the target application and the live broadcast return information are superimposed is rendered, the live broadcast interface is displayed according to the display image, live video data with barrage data is generated according to the live broadcast interface, and the live video data is uploaded. Live video data carrying barrage and similar data is thereby generated directly on the broadcasting end, without synthesizing the live video and barrage or other live broadcast return information on the server side, reducing delay.
  • FIG. 5 is a structural block diagram of an embodiment of a live data processing apparatus according to another embodiment of the present invention, which may specifically include the following modules:
  • The initialization module 408 is configured to create the application mapping architecture data and the application source architecture data when initializing the target application, by means of the target object suspended in advance, and to determine at least one target interface according to the application mapping architecture data.
  • the obtaining module 402 is configured to obtain live broadcast return information during the live broadcast based on the target application, where the live broadcast return information is used to feed back information other than the video data, where the live broadcast return information includes the barrage data.
  • the rendering display module 404 is configured to render an application interface of the target application and a display image superimposed by the live broadcast information, and display a live broadcast interface according to the display image.
  • the live broadcast returning module 406 is configured to generate live video data with barrage data according to the live broadcast interface, and upload the live video data.
  • The rendering display module 404 includes: a rendering module 4042, configured to invoke the first interface according to the application mapping architecture data, draw the information feedback image corresponding to the live broadcast return information, obtain the application interface drawn based on the application source framework data, and superimpose and render the information feedback image and the application interface to obtain the display image; where the application mapping architecture data is determined according to the application source framework data of the target application.
  • The rendering display module 404 includes: a drawing module 4044, configured to invoke the first interface according to the application mapping architecture data, draw the display image in which the information feedback image corresponding to the live broadcast return information is superimposed on the application interface of the target application, and render the display image.
  • The drawing display module 404 includes a display module 4046, configured to display the display images on the terminal screen in sequence according to the time information of the display images, to form the live broadcast interface of the target application.
  • the live broadcast returning module 406 includes: an obtaining module 4062, configured to invoke the second interface according to the application mapping architecture data to acquire the display image.
  • a live broadcast generating module 4064, configured to compose the display images into the live video data according to the time information of the display images.
  • the live broadcast return information further includes: service data and/or non-public information.
  • the information feedback image corresponding to the live broadcast return information includes a public image and a private image, wherein the public image is drawn according to the barrage data and the service data, and the private image is drawn according to the non-public information.
  • the obtaining module 4062 is configured to acquire a display image generated by an application interface of the target application and a public image overlay.
  • the initialization module 408 includes a suspend module 4082 for monitoring a creation function in the system, and suspending the target object when the creation function creates a target object.
  • The architecture building module 4084 is configured to, when the target object initializes and creates the target application, determine the mapping objects and interface calls according to the application information and build the application mapping architecture data, then determine the source objects and interface calls according to the application information and build the application source architecture data.
  • The architecture building module 4084 is further configured to, when creating an object, create a mapping object according to the application information and point it to a source memory address, then create the source object and point it to the same source memory address; and, when an interface is invoked, set a group of hook programs at the corresponding interfaces according to the application information and determine the interface information of each interface.
  • The initialization module 408 further includes an interface analysis module 4068, configured to determine, based on the application mapping architecture data and according to the interface information, at least one target interface for drawing the display image.
  • Through the above approach, the communication mechanism of the system (such as the message mechanism of Windows) can be bypassed, so that the display and manipulation of the live broadcast return information can be performed normally without exiting full screen, ensuring the user's normal operation and improving the user experience.
  • the description is relatively simple, and the relevant parts can be referred to the description of the method embodiment.
  • modules in the devices of the embodiments can be adaptively changed and placed in one or more devices different from the embodiment.
  • the modules or units or components of the embodiments may be combined into one module or unit or component, and further they may be divided into a plurality of sub-modules or sub-units or sub-components.
  • All features disclosed in this specification (including the accompanying claims, abstract, and drawings) and all processes or units of any method or device so disclosed may be combined in any combination.
  • Each feature disclosed in this specification (including the accompanying claims, the abstract and the drawings) may be replaced by alternative features that provide the same, equivalent or similar purpose.
  • the various component embodiments of the present invention may be implemented in hardware, or in software modules running on one or more processors, or in a combination thereof.
  • a microprocessor or digital signal processor may be used in practice to implement some or all of the functionality of some or all of the components of the live data processing method and apparatus in accordance with embodiments of the present invention.
  • the invention can also be implemented as a device or device program (e.g., a computer program and a computer program product) for performing some or all of the methods described herein.
  • a program implementing the invention may be stored on a computer readable medium or may be in the form of one or more signals.
  • Such signals may be downloaded from an Internet website, provided on a carrier signal, or provided in any other form.
  • Figure 6 illustrates a computing device that can implement a live data processing method in accordance with the present invention.
  • the computing device conventionally includes a processor 610 and a program product or readable medium in the form of a memory 620.
  • Memory 620 can be an electronic memory such as flash memory, EEPROM (Electrically Erasable Programmable Read Only Memory), EPROM, or ROM.
  • Memory 620 has a memory space 630 for program code 631 for performing any of the method steps described above.
  • storage space 630 for program code may include various program code 631 for implementing various steps in the above methods, respectively.
  • These program codes can be read from or written to one or more program products.
  • These program products include program code carriers such as memory cards.
  • Such a program product is typically a portable or fixed storage unit as described with reference to FIG. 7.
  • the storage unit may have storage segments, storage spaces, and the like that are similarly arranged to memory 620 in the computing device of FIG.
  • the program code can be compressed, for example, in an appropriate form.
  • The storage unit includes readable code 631', that is, code that can be read by a processor such as 610, which, when run by a computing device, causes the computing device to perform the various steps of the methods described above.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Human Computer Interaction (AREA)
  • Databases & Information Systems (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)
  • Stored Programmes (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A live data processing method, apparatus, program, and medium. The method includes: during live broadcast based on a target application, obtaining live broadcast return information, where the live broadcast return information is used to feed back information other than live video data and includes barrage data; rendering a display image in which the application interface of the target application and the live broadcast return information are superimposed, and displaying a live broadcast interface according to the display image; generating live video data with barrage data according to the live broadcast interface, and uploading the live video data. Live video data carrying barrage and similar data is thereby generated directly on the anchor end of the live broadcast, without synthesizing the live video with barrage or other live broadcast return information on the server side, reducing delay.

Description

Live data processing method, apparatus, program, and medium

Technical Field

The present invention relates to the field of computer technology, and in particular to a live data processing method, a live data processing apparatus, a computer program, and a computer readable medium.

Background

With the development of network and terminal technology, users can perform various network operations through terminals, such as browsing the Internet, playing games, and watching videos; a new form of multimedia interaction has therefore arisen, namely network live broadcasting.

Network live broadcasting is a multimedia interaction mode in which live video is captured through a signal acquisition device, the live video is uploaded to a server, and the server then feeds it back to each user terminal for playback. This interaction mode also supports interaction between users and the anchor of the network live broadcast; that is, users' messages can be fed back to the anchor, and the anchor interacts based on those messages.

The above interactive messages are displayed in the live broadcast picture. However, during network live broadcasting there is usually a certain delay in the interactive messages, which affects the live broadcast effect.
Summary of the Invention

In view of the above problems, the present invention is proposed in order to provide a live data processing method and a corresponding live data processing apparatus, computer program, and computer readable medium that overcome, or at least partially solve, the above problems.

According to one aspect of the present invention, a live data processing method is provided, specifically including: during live broadcast based on a target application, obtaining live broadcast return information, where the live broadcast return information is used to feed back information other than live video data and includes barrage data; rendering a display image in which the application interface of the target application and the live broadcast return information are superimposed, and displaying a live broadcast interface according to the display image; generating live video data with barrage data according to the live broadcast interface, and uploading the live video data.

Optionally, rendering the display image in which the application interface of the target application and the live broadcast return information are superimposed includes: calling a first interface according to application mapping architecture data, drawing an information feedback image corresponding to the live broadcast return information, and obtaining an application interface drawn based on application source framework data; superimposing and rendering the information feedback image and the application interface to obtain the display image; where the application mapping architecture data is determined according to the application source framework data of the target application.

Optionally, rendering the display image in which the application interface of the target application and the live broadcast return information are superimposed includes: calling the first interface according to the application mapping architecture data, drawing a display image in which the information feedback image corresponding to the live broadcast return information is superimposed on the application interface of the target application, and rendering the display image.

Optionally, displaying the live broadcast interface according to the display image includes: displaying the display images on a terminal screen in sequence according to time information of the display images, to form the live broadcast interface of the target application.

Optionally, generating live video data with barrage data according to the live broadcast interface includes: calling a second interface according to the application mapping architecture data to obtain the display images; composing the display images into the live video data according to the time information of the display images.
Optionally, the live broadcast return information further includes: service data and/or non-public information.

Optionally, the information feedback image corresponding to the live broadcast return information includes a public image and a private image, where the public image is drawn according to the barrage data and the service data, and the private image is drawn according to the non-public information.

Optionally, obtaining the display image includes: obtaining a display image generated by superimposing the public image on the application interface of the target application.

Optionally, the method further includes: creating application mapping architecture data and application source architecture data when initializing the target application, by means of a target object suspended in advance; and determining at least one target interface according to the application mapping architecture data.

Optionally, creating the application mapping architecture data and the application source architecture data when initializing the target application by means of the target object suspended in advance includes: monitoring a creation function in the system, and suspending the target object when the creation function creates the target object; when the target object initializes the target application, determining mapping objects and interface calls according to the application information and building the application mapping architecture data, then determining source objects and interface calls according to the application information and building the application source architecture data.

Optionally, the method further includes: when creating an object, creating a mapping object according to the application information and pointing it to a source memory address, then creating a source object and pointing it to the same source memory address; when an interface is invoked, setting a group of hook programs at the corresponding interfaces according to the application information, and determining interface information of each interface.

Optionally, determining at least one target interface according to the application mapping architecture data includes: based on the application mapping architecture data, determining, according to the interface information, at least one target interface for drawing the display image.
According to another aspect of the present invention, a live data processing apparatus is provided, specifically including: an obtaining module, configured to obtain live broadcast return information during live broadcast based on a target application, where the live broadcast return information is used to feed back information other than video data and includes barrage data; a rendering display module, configured to render a display image in which the application interface of the target application and the live broadcast return information are superimposed, and display a live broadcast interface according to the display image; a live broadcast return module, configured to generate live video data with barrage data according to the live broadcast interface and upload the live video data.

Optionally, the rendering display module includes: a rendering module, configured to call the first interface according to the application mapping architecture data, draw the information feedback image corresponding to the live broadcast return information, and obtain the application interface drawn based on the application source framework data; and to superimpose and render the information feedback image and the application interface to obtain the display image; where the application mapping architecture data is determined according to the application source framework data of the target application.

Optionally, the rendering display module includes: a drawing module, configured to call the first interface according to the application mapping architecture data, draw the display image in which the information feedback image corresponding to the live broadcast return information is superimposed on the application interface of the target application, and render the display image.

Optionally, the drawing display module includes: a display module, configured to display the display images on the terminal screen in sequence according to the time information of the display images, to form the live broadcast interface of the target application.

Optionally, the live broadcast return module includes: an obtaining module, configured to call the second interface according to the application mapping architecture data to obtain the display images; a live broadcast generating module, configured to compose the display images into the live video data according to the time information of the display images.

Optionally, the live broadcast return information further includes: service data and/or non-public information.

Optionally, the information feedback image corresponding to the live broadcast return information includes a public image and a private image, where the public image is drawn according to the barrage data and the service data, and the private image is drawn according to the non-public information.
Optionally, the obtaining module is configured to obtain a display image generated by superimposing the public image on the application interface of the target application.

Optionally, the apparatus further includes: an initialization module, configured to create the application mapping architecture data and the application source architecture data when initializing the target application, by means of the target object suspended in advance, and to determine at least one target interface according to the application mapping architecture data.

Optionally, the initialization module includes: a suspend module, configured to monitor the creation function in the system and suspend the target object when the creation function creates the target object; an architecture building module, configured to, when the target object initializes and creates the target application, determine the mapping objects and interface calls according to the application information and build the application mapping architecture data, then determine the source objects and interface calls according to the application information and build the application source architecture data.

Optionally, the architecture building module is further configured to, when creating an object, create a mapping object according to the application information and point it to a source memory address, then create a source object and point it to the same source memory address; and, when an interface is invoked, set a group of hook programs at the corresponding interfaces according to the application information and determine the interface information of each interface.

Optionally, the initialization module further includes: an interface analysis module, configured to determine, based on the application mapping architecture data and according to the interface information, at least one target interface for drawing the display image.

According to yet another aspect of the present invention, a computer program is provided, including computer readable code which, when run on a computing device, causes the computing device to perform the live data processing method according to any one of the embodiments of the present invention.

An embodiment of the present invention further provides a computer readable medium storing the program according to the embodiments of the present invention.

During live broadcast based on the target application, live broadcast return information is obtained, where the live broadcast return information is used to feed back information other than live video data and includes barrage data; a display image in which the application interface of the target application and the live broadcast return information are superimposed is rendered, a live broadcast interface is displayed according to the display image, live video data with barrage data is generated according to the live broadcast interface, and the live video data is uploaded. Live video data carrying barrage and similar data is thereby generated directly on the anchor end of the live broadcast, without synthesizing the live video with barrage or other live broadcast return information on the server side, reducing delay.

The above description is merely an overview of the technical solution of the present invention. In order to understand the technical means of the present invention more clearly so that it can be implemented according to the contents of the specification, and to make the above and other objects, features, and advantages of the present invention more apparent, specific embodiments of the present invention are set forth below.
Brief Description of the Drawings

Various other advantages and benefits will become clear to those of ordinary skill in the art upon reading the following detailed description of the preferred embodiments. The drawings are only for the purpose of illustrating the preferred embodiments and are not to be considered as limiting the present invention. Throughout the drawings, the same reference symbols denote the same components. In the drawings:

FIG. 1 shows a flow chart of the steps of a live data processing method embodiment according to an embodiment of the present invention;

FIG. 2 shows a flow chart of the steps of a live data processing method embodiment according to another embodiment of the present invention;

FIG. 3 shows a flow chart of the steps of another live data processing method embodiment according to another embodiment of the present invention;

FIG. 4 shows a structural block diagram of a live data processing apparatus embodiment according to an embodiment of the present invention;

FIG. 5 shows a structural block diagram of a live data processing apparatus embodiment according to another embodiment of the present invention;

FIG. 6 shows a block diagram of a computing device for performing the live data processing method according to the present invention;

FIG. 7 shows a storage unit for holding or carrying program code that implements the live data processing method according to the present invention.

Detailed Description

Exemplary embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. Although exemplary embodiments of the present disclosure are shown in the drawings, it should be understood that the present disclosure may be implemented in various forms and should not be limited by the embodiments set forth herein. Rather, these embodiments are provided so that the present disclosure will be understood more thoroughly and its scope can be fully conveyed to those skilled in the art.
Referring to FIG. 1, a flow chart of the steps of a live data processing method embodiment according to an embodiment of the present invention is shown, which may specifically include the following steps:

Step 102: during live broadcast based on a target application, obtain live broadcast return information, where the live broadcast return information is used to feed back information other than live video data and includes barrage data.

The live broadcast of this embodiment is based on a target application; that is, the data of the target application serves as the live broadcast data. For example, in game live broadcasting, while a user plays a game with a game application, the data of the picture corresponding to the game application serves as the live broadcast data. During the live broadcast, users watching the broadcast can interact with the anchor, the anchor being the user performing the broadcast; in game live broadcasting, for instance, the game player is the anchor. Users can send barrage data or give gifts to the anchor, where "barrage" refers to the display manner in which large numbers of comments drift across the screen in the form of pop-up subtitles.

This embodiment is applied to the anchor end of the live broadcast process. To facilitate interaction between the anchor and users, live broadcast return information can be obtained during live broadcast based on the target application. To reduce the occupation of the anchor's bandwidth, the live broadcast return information of this embodiment is used to feed back information other than video data; that is, the live broadcast return information is an interactive message that does not carry live video data, and may include barrage data as well as data such as gifts.

Step 104: render a display image in which the application interface of the target application and the live broadcast return information are superimposed, and display a live broadcast interface according to the display image.

Step 106: generate live video data with barrage data according to the live broadcast interface, and upload the live video data.

From the perspective of the underlying system and screen display, each interface and window displayed in the system can be regarded as one frame of image. Therefore, while the target application is running, the application interface of the target application needs to be drawn and rendered; in this process, the superimposed live broadcast return information can be drawn on the application interface to obtain the corresponding display image. The display image is drawn separately for each frame, so that combining the frames of display images in the chronological order of their timestamps presents the corresponding live broadcast interface, which displays barrage data as well as other return data such as gifts. Live video data with barrage data can then be generated according to the live broadcast interface; that is, the frames of the live broadcast interface are combined into live video data, and the live video data with barrage data is uploaded.

Live video data carrying barrage and similar data is thereby generated directly on the anchor end of the live broadcast, without synthesizing the live video with barrage or other live broadcast return information on the server side, reducing delay.
Referring to FIG. 2, a flow chart of the steps of a live data processing method embodiment according to another embodiment of the present invention is shown, which may specifically include the following steps:

Step 202: create application mapping architecture data and application source architecture data when initializing the target application, by means of a target object suspended in advance.

The system is monitored in advance, specifically the system's creation function such as the create function. When the creation function is called to create the target object, the target object can be suspended; that is, a first hook program is set when the target object is initialized, thereby setting the hook program at the source of the target object. The target object is an important component object of the system and is used to perform operations required by various applications.

The creation and operation of some applications in the system require the participation of the target object; by setting a hook at the source of the target object, every call to the target object can be intercepted to determine the required information. In this embodiment, an application needs to call the target object to perform operations when it is created, so the application information present when the target object is called can be intercepted by the first hook program; this application information is used to indicate the application source architecture data for creating the target application. That is, when the target application is created, a series of interfaces of the target object need to be called to acquire data and the required function objects need to be created; through these interface calls and function objects, the application source architecture data of the target application can be determined, so that the target application can be started and run.

Therefore, this embodiment first uses the application information to build the application mapping architecture data of the target application and then builds the application source architecture data; that is, the required operations are determined according to the application information, a piece of mapping information is created first and then the source information, with the mapping information and source information pointing to the same content address. In other words, the mapping architecture data is used to build a shell identical to the application source architecture data, while the substantive content is still provided by the application source architecture data; this consumes only a very small amount of memory, makes it possible to learn the architecture of the target application, and allows control over the return of the data required by the application.

In embodiments of the present invention, hook programs can be set on various target objects of various systems. This embodiment takes the Component Object Model (COM) as the target object as an example: when the creation function initializes a COM component, the first hook program can be set for the COM component, so that the hook is set at the source of the COM component and the COM component is taken over. COM is a software component technology from Microsoft for interaction between web servers and clients; it is an object-oriented programming model that defines how objects behave within a single application or between multiple applications. Moreover, COM is implemented on multiple platforms and is not limited to the Windows operating system. For example, a game client adopts the DirectX (Direct eXtension, DX) 3D engine; DirectX is a multimedia programming interface created by Microsoft Corporation, implemented in the C++ programming language, and follows the COM component model. Therefore, when the game client is started and a 3D engine operation is involved, the COM component needs to be called; through the first hook program set in the COM component, the application information of the call to the COM component can be obtained, thereby determining the source architecture data of the game client's 3D engine, that is, learning the various interfaces, function objects, and the like required for the 3D engine to run.

Step 204: determine at least one target interface according to the application mapping architecture data.

When building the application mapping architecture data, the various interfaces the target application needs to call and the functions of each interface can be determined, so the at least one required target interface can be determined; the target interfaces include the various interfaces used for drawing, rendering, and displaying images. The target interfaces include a first interface and a second interface: the first interface is used for drawing image data, and the second interface is used for outputting image data.
Step 206: during live broadcast based on the target application, obtain live broadcast return information.

Step 208: call the first interface according to the application mapping architecture data, draw the information feedback image corresponding to the live broadcast return information, and obtain the application interface drawn based on the application source framework data.

Step 210: superimpose and render the barrage image and the application interface to obtain the display image.

After the target application runs, it can adopt full screen mode, in which the display interface of the target application, that is, the interface image, fills the entire window and the focus lies in the target application. From the perspective of the underlying system and screen display, each interface and window displayed in the system can be regarded as one frame of image. This embodiment draws the information feedback image corresponding to the live broadcast return information based on the development technology principles of the target application, according to the above application mapping architecture data.

Therefore, after the application interface of the target application is drawn based on the application source framework data and the live broadcast return information is obtained, the pre-injected hook program may be used to call the first interface and, in the image drawing, rendering, and display manner corresponding to the application mapping architecture data, draw the information feedback image corresponding to the live broadcast return information; the application interface created based on the application source framework data is obtained, and the information feedback image and the application interface are superimposed and rendered to obtain the display image, yielding an image that displays barrage data, gifts, and other live broadcast return information. Each frame of image is drawn, superimposed, and rendered in this manner, producing a dynamic picture on the screen.

For example, for a game client, a user playing a client game typically adopts full screen mode while live broadcasting the game. The application interface of the game client created from the application source framework data may then be used, and live data generated based on that application interface; the target interface is called according to the application mapping architecture data, the information feedback image corresponding to the live broadcast return information is drawn, and the information feedback image and the application interface are superimposed for display, so that to the user it appears that barrage, gifts, private messages, and other data are displayed within the full-screen game, improving the user experience and the flexibility of the live broadcast.

Step 212: call the first interface according to the application mapping architecture data, draw the display image in which the information feedback image corresponding to the live broadcast return information is superimposed on the application interface of the target application, and render the display image.

The above draws the application interface and the information feedback image separately and then displays them superimposed. In actual processing, the first interface may also be called according to the application mapping architecture data to directly draw the superimposed display image, that is, to directly draw display image data in which the application interface and the information feedback image are already superimposed. Subsequent live broadcast based on the superimposed display images can then directly return live data to which barrage, gifts, and other data have already been added, without superimposing barrage and similar data onto the live data on the service side.

Step 214: according to the time information of the display images, display the display images on the terminal screen in sequence to form the live broadcast interface of the target application.

Obtaining the time information of each display image means acquiring the timestamp of each display image; the display images are combined according to these timestamps, and each frame of display image is displayed on the terminal screen in sequence to form the live broadcast interface of the target application.
Step 216: call the second interface according to the application mapping architecture data to obtain the display images.

Step 218: compose the display images into the live video data according to the time information of the display images.

Step 220: upload the live video data.

In this embodiment, live images carrying barrage data, gifts, and other data are generated directly on the broadcasting end and do not need to be synthesized on the server. Therefore, the second interface may be called according to the application mapping architecture data to obtain the display images; the display images are synthesized according to their time information to generate the corresponding live video data, and the live video data is uploaded.

In an embodiment of the present application, the live broadcast return information includes: barrage data, service data, and/or non-public information. The service data is determined according to the specific service; for a game service, for example, the service data is gifts and other virtual item data. The non-public information includes data such as private messages sent to the anchor and privileged questions, where a privileged question may be question data submitted after purchasing the corresponding privilege. Therefore, the information feedback image corresponding to the live broadcast return information includes a public image and a private image, where the public image is drawn according to the barrage data and the service data, and the private image is drawn according to the non-public information.

Therefore, when obtaining the display image and uploading the live data, the display image generated by superimposing the public image on the application interface of the target application may be obtained, generating live data that does not carry the non-public information. For approaches such as steps 208-210 that draw the images separately and then superimpose them, the display image generated by superimposing the separately drawn application interface and public image may be obtained directly; for approaches such as step 212 that directly draw the superimposed image, the superimposed image of the application interface and the public image may be drawn, while the private image is drawn separately and then superimposed, thereby ensuring that the live data does not carry the non-public information.

In this embodiment, the above image drawing is performed based on the architectural principles of the target application; therefore, the architecture of the target application as well as the interfaces, functions, and other information under that architecture can be determined in advance, and image drawing can then be provided in full screen mode.

In an optional embodiment, the architecture of the target application can be determined.

This embodiment takes a game client as the target application as an example. The game client may adopt the DX 3D engine; in the Windows operating system, this 3D engine is accelerated by the hardware Graphics Processing Unit (GPU) and reads and writes memory directly, which can bypass the message mechanism.
Referring to FIG. 3, a flow chart of the steps of another live data processing method embodiment according to another embodiment of the present invention is shown, which may specifically include the following steps:

Step 302: monitor the creation function in the system, and suspend the target object when the creation function creates the target object.

The system is monitored in advance, specifically the system's creation function such as the create function. When the creation function is called to create the target object, the target object can be suspended; that is, a first hook program is set when the target object is initialized, thereby setting the hook program at the source of the target object. The target object is an important component object of the system and is used to perform operations required by various applications.

In embodiments of the present invention, hook programs can be set on various target objects of various systems. This embodiment takes the Component Object Model (COM) as the target object as an example: when the creation function initializes a COM component, the first hook program can be set for the COM component, so that the hook is set at the source of the COM component and the COM component is taken over.

Step 304: intercept application information when the target object is called to create the target application.

The creation and operation of some applications in the system require the participation of the target object; by setting a hook at the source of the target object, every call to the target object can be intercepted to determine the required information. In this embodiment, an application needs to call the target object to perform operations when it is created, so the application information present when the target object is called can be intercepted by the first hook program; this application information is used to indicate the application source architecture data for creating the target application. That is, when the target application is created, a series of interfaces corresponding to the target need to be called to acquire data and the required function objects created; through these interface calls and function objects, the application source architecture data of the target application can be determined, so that the target application can be started and run.

For example, a game client adopts the DirectX (Direct eXtension, DX) 3D engine; DirectX is a multimedia programming interface created by Microsoft Corporation, implemented in the C++ programming language, and follows the COM component model. Therefore, when the game client is started and a 3D engine operation is involved, the COM component needs to be called; through the first hook program set in the COM component, the application information of the call to the COM component can be obtained, thereby determining the source architecture data of the game client's 3D engine, that is, learning the various interfaces, function objects, and the like required for the 3D engine to run.
步骤306,在创建对象时,依据所述应用信息创建映射对象并指向源内存地址,再创建源对象并指向所述源内存地址。
步骤308,在各源对象对应的各接口分别设置钩子程序。
步骤310,依据映射对象和接口调用,搭建应用映射架构数据。
步骤312,依据源对象和接口调用,搭建应用源架构数据。
本实施例中,目标对象创建目标应用相关内容时,可以创建所需对象依据确定需要调用的接口等信息,本实施例中,采用映射架构数据搭建一个与应用源架构数据相同的外壳,但其实质内容仍然由应用源架构数据提供。因此,可以在该应用信息需要创建对象时,确定所需创建对象的相关信息,创建一个映射对象指向对应的源内存地址,然后创建源对象并指向所述源内存 地址。即依据该应用信息确定所需定义的内容,定义一个具有含义的映射对象,再定义原本需要定义的源对象,并且在各源对象所需的接口处分别设置钩子程序,从而针对该目标应用可以设置一组钩子程序,实现在入口级挂钩子的调用方式。从而采用映射对象和接口调用搭建应用映射架构数据,采用源对象和接口调用搭建应用源架构数据,从而得到与该应用源架构数据相同的外壳即应用映射架构数据,而该应用映射架构数据实际定义、调用的内容可以与应用源架构数据对应,还可以映射到应用源架构数据中处理。
For example, when the 3D engine of the game client calls the COM component, the call information is intercepted. Suppose the call information indicates creating function A, where function A calls interface B and function C, and function C calls interfaces D and E. Mapping functions A′ and C′ can be created accordingly, hook programs can be set at interfaces B, D and E, and the correspondence of the mapping functions can be established; then source functions A and C, and their correspondence with interfaces B, D and E, can be created, thereby building a shell identical to the source architecture data of the game client, namely the mapping architecture data, through which calls can be mapped to the source architecture data.
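The "shell" idea, a mapping object shaped like the source object, sharing its memory and forwarding interface calls through hooks, can be modeled as a simple proxy. This Python sketch is only an analogy for the native scheme (which works on shared memory addresses and patched interfaces); the class and interface names are invented for illustration.

```python
calls = []  # interface calls observed by the hook programs

class Source:
    """Source object: owns the actual state (the 'source memory address')."""
    def __init__(self):
        self.state = {"frames": 0}
    def interface_b(self):
        self.state["frames"] += 1
        return self.state["frames"]

class Mapping:
    """Mapping object A': same shape as the source, pointing at the same state.
    Calls pass through a hook and are then mapped to the source object."""
    def __init__(self, source):
        self._source = source
        self.state = source.state  # points to the same underlying state
    def interface_b(self):
        calls.append("B")          # hook program set at interface B
        return self._source.interface_b()

src = Source()
mapped = Mapping(src)
first = mapped.interface_b()
```

Callers use the mapping object exactly as they would the source; the hook observes every interface call, while the substance is still provided by the source object.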
Step 314: based on the application mapping architecture data, determine, from the interface information, at least one target interface used for drawing display images.
When the application mapping architecture data is built, the various interfaces the target application needs to call, and information such as the function of each interface, can be determined; therefore the required at least one target interface can be determined, the target interfaces including the various interfaces used for drawing, rendering and displaying images. That is, a series of image-related target interfaces can be determined, each of which has been injected with a hook program, so that the target interfaces can subsequently be called directly through the hook programs. The target interfaces include the first interface and the second interface.
In this way, the application information can be intercepted when the target application is initialized, and the application mapping architecture data and the application source architecture data of the target application can be built; that is, the architecture of the target application is analyzed, facilitating subsequent operations based on that architecture.
Step 316: perform live broadcasting based on the target application, and obtain live broadcast return information.
Step 318: call the first interface according to the application mapping architecture data, draw the information feedback image corresponding to the live broadcast return information, and obtain the application interface drawn based on the application source architecture data.
Step 320: superimpose the information feedback image and the application interface, and render the result to obtain the display image.
After the target application runs, it can adopt full-screen mode, in which the display interface of the target application, i.e., the interface image, fills the entire window and the focus is located in the target application. From the perspective of the underlying system and of screen display, every interface and window displayed in the system can be regarded as a frame of image. In this embodiment, the information feedback image corresponding to the live broadcast return information is drawn according to the above application mapping architecture data, based on the development principles of the target application.
Therefore, after the application interface of the target application is drawn based on the application source architecture data and the live broadcast return information is obtained, the pre-injected hook program can be used to call the first interface and, following the drawing, rendering and display approach corresponding to the application mapping architecture data, draw the information feedback image corresponding to the live broadcast return information, obtain the application interface created based on the application source architecture data, and superimpose the information feedback image and the application interface and render the result to obtain the display image, thereby obtaining an image on which live broadcast return information such as barrage data and gifts is displayed. Each frame of image is drawn, superimposed and rendered in this way, producing a dynamic picture on the screen.
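The per-frame pipeline just described, draw the interface, draw the feedback image, superimpose, render, repeat, can be sketched as a loop. This is a schematic Python illustration only; the helper names (`draw_feedback`, `draw_interface`, `render`) are assumptions standing in for the hooked native interfaces, and the "images" are strings recording what was composed.

```python
def draw_feedback(info):
    """Draw the information feedback image for the current return information."""
    return f"feedback({info})"

def draw_interface(frame_no):
    """Draw the application interface image for this frame (source architecture)."""
    return f"ui#{frame_no}"

def render(ui, feedback):
    """Superimpose the feedback image on the interface and render the result."""
    return f"{ui}+{feedback}"

def run_frames(return_infos):
    frames = []
    for n, info in enumerate(return_infos):
        # Every frame is redrawn, superimposed and rendered anew.
        frames.append(render(draw_interface(n), draw_feedback(info)))
    return frames

frames = run_frames(["barrage:hi", "gift:rose"])
```

Even a visually static picture is produced by this loop: each frame is a fresh composition of interface and feedback layers.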
For example, for a game client, the user usually plays the game in full-screen mode while live-streaming it. The application interface of the game client created from the application source architecture data can then be used, and live broadcast data can be generated based on the application interface; the target interface is called according to the application mapping architecture data, the information feedback image corresponding to the live broadcast return information is drawn, and the information feedback image and the application interface are superimposed for display. To the user, data such as live barrage, gifts and private messages thus appear to be displayed within the full-screen game, improving the user experience and the flexibility of the live broadcast.
Step 322: call the first interface according to the application mapping architecture data, draw the display image in which the information feedback image corresponding to the live broadcast return information is superimposed on the application interface of the target application, and render the display image.
The above describes drawing the application interface and the information feedback image separately and then superimposing them for display. In actual processing, the first interface can also be called according to the application mapping architecture data to draw the superimposed display image directly, that is, to draw display image data in which the application interface and the information feedback image are already superimposed. Subsequent broadcasting is then based on the superimposed display images, so live broadcast data to which barrage, gift and similar data have already been added can be returned directly, without superimposing such data onto the live broadcast data on the server side.
Step 324: display the display images in sequence on the terminal screen according to the time information of the display images, forming the live broadcast interface of the target application.
Obtaining the time information of each display image means obtaining the timestamp of each display image; the display images are combined according to these timestamps, and each frame of display image is shown in sequence on the terminal screen, forming the live broadcast interface of the target application.
Step 326: call the second interface according to the application mapping architecture data to obtain the display images.
Step 328: assemble the display images into the live video data according to the time information of the display images.
Step 330: upload the live video data.
In this embodiment, the live images carrying barrage data, gift data and the like are generated directly on the broadcaster side, without compositing on the server. Therefore, the second interface can be called according to the application mapping architecture data to obtain the display images, the display images can be composited according to their time information to generate the corresponding live video data, and the live video data can then be uploaded.
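Assembling the captured display images into a video stream by their time information amounts to ordering the frames by timestamp. A minimal sketch, assuming each capture from the second interface is a `(timestamp, image)` pair (a representation invented for this example):

```python
def compose_video(display_images):
    """Order captured display images by timestamp to form the upload sequence.

    display_images: list of (timestamp, image) pairs obtained via the
    second interface; capture order is not guaranteed to be time order.
    """
    ordered = sorted(display_images, key=lambda pair: pair[0])
    return [image for _, image in ordered]

video = compose_video([(3, "frame_c"), (1, "frame_a"), (2, "frame_b")])
```

Since the frames already carry the superimposed barrage and gift layers, the resulting sequence can be uploaded as-is, with no server-side compositing step.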
For example, when the game client is initialized through the COM component, the corresponding source architecture data and mapping architecture data are determined, along with the target interfaces the game client requires for drawing, rendering and displaying images. After initialization, the game client can run and enter full-screen mode. While the user plays and live-streams the game, for data such as the live barrage, the corresponding first display image can be drawn by calling the target interfaces based on the image drawing principles of the 3D engine; the first display image is then superimposed onto the display interface image of the game client, or the superimposed image is drawn directly to replace the display interface image of the game client.
In fact, while the target application is running, the interface image of the target application, i.e., the second display image, may be drawn and displayed every frame, and the first display image responding to the live broadcast return information is likewise drawn every frame and then superimposed on the second display image for display. Therefore, even if the image appears static from the user's perspective, from the system's perspective the corresponding image is drawn every frame. Taking the game client as an example, the displayed game interface image is drawn and rendered every frame, so the first display image shown on the game interface image is also drawn and rendered every frame and then superimposed at the corresponding position of the game interface image for display.
Thus, by calling the underlying window to draw and display images directly, the system's communication mechanism (such as the Windows message mechanism) can be bypassed, so that the display of, and interaction with, the live broadcast return information can proceed normally without exiting full-screen mode, improving the user experience while preserving normal user operation.
As for the method embodiments, for simplicity of description they are all expressed as combinations of a series of actions; however, those skilled in the art should know that the embodiments of the present invention are not limited by the described order of actions, because according to the embodiments of the present invention, certain steps may be performed in other orders or simultaneously. Secondly, those skilled in the art should also know that the embodiments described in the specification are all preferred embodiments, and the actions involved are not necessarily required by the embodiments of the present invention.
On the basis of the above embodiments, this embodiment further provides a live broadcast data processing apparatus.
Referring to FIG. 4, a structural block diagram of a live broadcast data processing apparatus according to an embodiment of the present invention is shown, which may specifically include the following modules:
An obtaining module 402, configured to obtain live broadcast return information during live broadcasting based on a target application, where the live broadcast return information is used to feed back information other than video data, and the live broadcast return information includes barrage data.
A rendering and display module 404, configured to render a display image in which the application interface of the target application and the live broadcast return information are superimposed, and present a live broadcast interface according to the display image.
A live broadcast return module 406, configured to generate live video data carrying the barrage data according to the live broadcast interface, and upload the live video data.
During live broadcasting based on the target application, live broadcast return information is obtained, where the live broadcast return information is used to feed back information other than the live video data and includes barrage data; a display image in which the application interface of the target application and the live broadcast return information are superimposed is rendered, and the live broadcast interface is presented according to the display image; live video data carrying the barrage data is generated according to the live broadcast interface, and the live video data is uploaded. In this way, live video data carrying barrage and similar data is generated directly on the broadcaster side, without compositing the live video with barrage and other live broadcast return information on the server side, reducing latency.
Referring to FIG. 5, a structural block diagram of a live broadcast data processing apparatus according to another embodiment of the present invention is shown, which may specifically include the following modules:
An initialization module 408, configured to create the application mapping architecture data and the application source architecture data in advance, through the suspended target object, when the target application is initialized; and to determine at least one target interface according to the application mapping architecture data.
An obtaining module 402, configured to obtain live broadcast return information during live broadcasting based on a target application, where the live broadcast return information is used to feed back information other than video data, and the live broadcast return information includes barrage data.
A rendering and display module 404, configured to render a display image in which the application interface of the target application and the live broadcast return information are superimposed, and present a live broadcast interface according to the display image.
A live broadcast return module 406, configured to generate live video data carrying the barrage data according to the live broadcast interface, and upload the live video data.
In an optional embodiment, the rendering and display module 404 includes: a rendering module 4042, configured to call the first interface according to the application mapping architecture data, draw the information feedback image corresponding to the live broadcast return information, and obtain the application interface drawn based on the application source architecture data; and to superimpose the information feedback image and the application interface and render the result to obtain the display image; where the application mapping architecture data is determined according to the application source architecture data of the target application.
In another optional embodiment, the rendering and display module 404 includes: a drawing module 4044, configured to call the first interface according to the application mapping architecture data, draw the display image in which the information feedback image corresponding to the live broadcast return information is superimposed on the application interface of the target application, and render the display image.
The rendering and display module 404 includes: a display module 4046, configured to display the display images in sequence on the terminal screen according to the time information of the display images, forming the live broadcast interface of the target application.
The live broadcast return module 406 includes: an obtaining module 4062, configured to call the second interface according to the application mapping architecture data to obtain the display images; and a live broadcast generation module 4064, configured to assemble the display images into the live video data according to the time information of the display images.
The live broadcast return information further includes: service data and/or non-public information. The information feedback image corresponding to the live broadcast return information includes a public image and a private image, where the public image is drawn according to the barrage data and the service data, and the private image is drawn according to the non-public information.
The obtaining module 4062 is configured to obtain the display image generated by superimposing the application interface of the target application and the public image.
The initialization module 408 includes: a suspension module 4082, configured to monitor the creation function in the system and suspend the target object when the creation function creates the target object; and an architecture building module 4084, configured to, when the target object initializes the target application, determine the mapping objects and interface calls according to the application information to build the application mapping architecture data, and then determine the source objects and interface calls according to the application information to build the application source architecture data.
The architecture building module 4084 is further configured to, when creating an object, create a mapping object according to the application information and point it to a source memory address, and then create a source object and point it to the source memory address; and, when an interface is called, set a group of hook programs at the corresponding interfaces according to the application information and determine the interface information of each interface.
The initialization module 408 further includes: an interface analysis module 4086, configured to determine, based on the application mapping architecture data and according to the interface information, at least one target interface used for drawing display images.
Thus, by calling the underlying window to draw and display images directly, the system's communication mechanism (such as the Windows message mechanism) can be bypassed, so that the display of, and interaction with, the live broadcast return information can proceed normally without exiting full-screen mode, improving the user experience while preserving normal user operation.
As for the apparatus embodiments, since they are substantially similar to the method embodiments, the description is relatively simple; for relevant points, reference may be made to the partial description of the method embodiments.
The algorithms and displays provided herein are not inherently related to any particular computer, virtual system or other device. Various general-purpose systems may also be used with the teachings herein. From the above description, the structure required to construct such systems is apparent. Moreover, the present invention is not directed to any particular programming language. It should be understood that the content of the invention described herein can be implemented using a variety of programming languages, and the above description of a specific language is intended to disclose the best mode of carrying out the invention.
In the specification provided here, numerous specific details are set forth. It will be understood, however, that the embodiments of the invention can be practiced without these specific details. In some instances, well-known methods, structures and techniques have not been shown in detail so as not to obscure an understanding of this description.
Similarly, it should be understood that, in order to streamline this disclosure and aid in the understanding of one or more of the various inventive aspects, in the above description of exemplary embodiments of the invention, various features of the invention are sometimes grouped together into a single embodiment, figure, or description thereof. However, this method of disclosure is not to be interpreted as reflecting an intention that the claimed invention requires more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive aspects lie in less than all features of a single foregoing disclosed embodiment. Thus, the claims following the detailed description are hereby expressly incorporated into this detailed description, with each claim standing on its own as a separate embodiment of this invention.
Those skilled in the art will appreciate that the modules in the device in an embodiment may be adaptively changed and disposed in one or more devices different from the embodiment. The modules or units or components in the embodiments may be combined into one module or unit or component, and furthermore they may be divided into a plurality of sub-modules or sub-units or sub-components. Except where at least some of such features and/or processes or units are mutually exclusive, all features disclosed in this specification (including the accompanying claims, abstract and drawings) and all processes or units of any method or device so disclosed may be combined in any combination. Unless expressly stated otherwise, each feature disclosed in this specification (including the accompanying claims, abstract and drawings) may be replaced by an alternative feature serving the same, equivalent or similar purpose.
Furthermore, those skilled in the art will understand that although some embodiments described herein include certain features included in other embodiments and not others, combinations of features of different embodiments are meant to be within the scope of the invention and form different embodiments. For example, in the following claims, any one of the claimed embodiments may be used in any combination.
The various component embodiments of the present invention may be implemented in hardware, or in software modules running on one or more processors, or in a combination thereof. Those skilled in the art should understand that a microprocessor or a digital signal processor (DSP) may be used in practice to implement some or all of the functions of some or all of the components of the live broadcast data processing method and apparatus according to embodiments of the present invention. The present invention may also be implemented as a device or apparatus program (for example, a computer program and a computer program product) for performing part or all of the methods described herein. Such a program implementing the present invention may be stored on a computer-readable medium, or may take the form of one or more signals. Such signals may be downloaded from an Internet website, provided on a carrier signal, or provided in any other form.
For example, FIG. 6 shows a computing device that can implement the live broadcast data processing method according to the present invention. The computing device conventionally includes a processor 610 and a program product or readable medium in the form of a memory 620. The memory 620 may be an electronic memory such as flash memory, EEPROM (Electrically Erasable Programmable Read-Only Memory), EPROM or ROM. The memory 620 has a storage space 630 for program code 631 for performing any of the method steps described above. For example, the storage space 630 for program code may include individual program codes 631 each for implementing a respective step of the above methods. These program codes may be read from, or written into, one or more program products. These program products comprise program code carriers such as memory cards. Such program products are usually portable or fixed storage units as described with reference to FIG. 7. The storage unit may have storage segments, storage spaces and the like arranged similarly to the memory 620 in the computing device of FIG. 6. The program code may, for example, be compressed in a suitable form. Typically, the storage unit comprises readable code 631′, i.e., code that can be read by a processor such as 610, which, when run by a computing device, causes the computing device to perform the steps of the methods described above.
It should be noted that the above embodiments illustrate rather than limit the invention, and that those skilled in the art may design alternative embodiments without departing from the scope of the appended claims. In the claims, any reference signs placed between parentheses shall not be construed as limiting the claim. The word "comprising" does not exclude the presence of elements or steps not listed in a claim. The word "a" or "an" preceding an element does not exclude the presence of a plurality of such elements. The invention can be implemented by means of hardware comprising several distinct elements, and by means of a suitably programmed computer. In a unit claim enumerating several means, several of these means can be embodied by one and the same item of hardware. The use of the words first, second, third, etc. does not indicate any ordering. These words may be interpreted as names.

Claims (26)

  1. A live broadcast data processing method, comprising:
    during live broadcasting based on a target application, obtaining live broadcast return information, wherein the live broadcast return information is used to feed back information other than live video data, and the live broadcast return information includes barrage data;
    rendering a display image in which an application interface of the target application and the live broadcast return information are superimposed, and presenting a live broadcast interface according to the display image;
    generating live video data carrying the barrage data according to the live broadcast interface, and uploading the live video data.
  2. The method according to claim 1, wherein rendering the display image in which the application interface of the target application and the live broadcast return information are superimposed comprises:
    calling a first interface according to application mapping architecture data, drawing an information feedback image corresponding to the live broadcast return information, and obtaining the application interface drawn based on application source architecture data;
    superimposing the information feedback image and the application interface, and rendering the result to obtain the display image;
    wherein the application mapping architecture data is determined according to the application source architecture data of the target application.
  3. The method according to claim 1, wherein rendering the display image in which the application interface of the target application and the live broadcast return information are superimposed comprises:
    calling a first interface according to application mapping architecture data, drawing a display image in which the information feedback image corresponding to the live broadcast return information is superimposed on the application interface of the target application, and rendering the display image.
  4. The method according to claim 1, wherein presenting the live broadcast interface according to the display image comprises:
    displaying the display images in sequence on a terminal screen according to time information of the display images, to form the live broadcast interface of the target application.
  5. The method according to claim 4, wherein generating the live video data carrying the barrage data according to the live broadcast interface comprises:
    calling a second interface according to the application mapping architecture data to obtain the display images;
    assembling the display images into the live video data according to the time information of the display images.
  6. The method according to any one of claims 1-5, wherein the live broadcast return information further includes: service data and/or non-public information.
  7. The method according to claim 6, wherein the information feedback image corresponding to the live broadcast return information includes a public image and a private image, wherein the public image is drawn according to the barrage data and the service data, and the private image is drawn according to the non-public information.
  8. The method according to claim 7, wherein obtaining the display image comprises:
    obtaining a display image generated by superimposing the application interface of the target application and the public image.
  9. The method according to any one of claims 2-5, further comprising:
    creating the application mapping architecture data and the application source architecture data in advance, through a suspended target object, when the target application is initialized;
    determining at least one target interface according to the application mapping architecture data.
  10. The method according to claim 9, wherein creating the application mapping architecture data and the application source architecture data in advance, through the suspended target object, when the target application is initialized comprises:
    monitoring a creation function in the system, and suspending the target object when the creation function creates the target object;
    when the target object initializes the target application, determining mapping objects and interface calls according to the application information to build the application mapping architecture data, and then determining source objects and interface calls according to the application information to build the application source architecture data.
  11. The method according to claim 10, further comprising:
    when creating an object, creating a mapping object according to the application information and pointing it to a source memory address, and then creating a source object and pointing it to the source memory address;
    when an interface is called, setting a group of hook programs at the corresponding interfaces according to the application information, and determining interface information of each interface.
  12. The method according to claim 11, wherein determining at least one target interface according to the application mapping architecture data comprises:
    based on the application mapping architecture data, determining, according to the interface information, at least one target interface used for drawing display images.
  13. A live broadcast data processing apparatus, comprising:
    an obtaining module, configured to obtain live broadcast return information during live broadcasting based on a target application, wherein the live broadcast return information is used to feed back information other than video data, and the live broadcast return information includes barrage data;
    a rendering and display module, configured to render a display image in which an application interface of the target application and the live broadcast return information are superimposed, and present a live broadcast interface according to the display image;
    a live broadcast return module, configured to generate live video data carrying the barrage data according to the live broadcast interface, and upload the live video data.
  14. The apparatus according to claim 13, wherein the rendering and display module comprises:
    a rendering module, configured to call a first interface according to application mapping architecture data, draw an information feedback image corresponding to the live broadcast return information, and obtain the application interface drawn based on application source architecture data; and to superimpose the information feedback image and the application interface and render the result to obtain the display image; wherein the application mapping architecture data is determined according to the application source architecture data of the target application.
  15. The apparatus according to claim 13, wherein the rendering and display module comprises:
    a drawing module, configured to call a first interface according to application mapping architecture data, draw a display image in which the information feedback image corresponding to the live broadcast return information is superimposed on the application interface of the target application, and render the display image.
  16. The apparatus according to claim 13, wherein the rendering and display module comprises:
    a display module, configured to display the display images in sequence on a terminal screen according to time information of the display images, to form the live broadcast interface of the target application.
  17. The apparatus according to claim 16, wherein the live broadcast return module comprises:
    an obtaining module, configured to call a second interface according to the application mapping architecture data to obtain the display images;
    a live broadcast generation module, configured to assemble the display images into the live video data according to the time information of the display images.
  18. The apparatus according to any one of claims 13-17, wherein the live broadcast return information further includes: service data and/or non-public information.
  19. The apparatus according to claim 18, wherein the information feedback image corresponding to the live broadcast return information includes a public image and a private image, wherein the public image is drawn according to the barrage data and the service data, and the private image is drawn according to the non-public information.
  20. The apparatus according to claim 18, wherein
    the obtaining module is configured to obtain a display image generated by superimposing the application interface of the target application and the public image.
  21. The apparatus according to any one of claims 14-17, further comprising:
    an initialization module, configured to create the application mapping architecture data and the application source architecture data in advance, through a suspended target object, when the target application is initialized; and to determine at least one target interface according to the application mapping architecture data.
  22. The apparatus according to claim 21, wherein the initialization module comprises:
    a suspension module, configured to monitor a creation function in the system, and suspend the target object when the creation function creates the target object;
    an architecture building module, configured to, when the target object initializes the target application, determine mapping objects and interface calls according to the application information to build the application mapping architecture data, and then determine source objects and interface calls according to the application information to build the application source architecture data.
  23. The apparatus according to claim 22, wherein
    the architecture building module is further configured to, when creating an object, create a mapping object according to the application information and point it to a source memory address, and then create a source object and point it to the source memory address; and, when an interface is called, set a group of hook programs at the corresponding interfaces according to the application information, and determine interface information of each interface.
  24. The apparatus according to claim 23, wherein the initialization module further comprises:
    an interface analysis module, configured to determine, based on the application mapping architecture data and according to the interface information, at least one target interface used for drawing display images.
  25. A computer program comprising computer-readable code which, when run on a computing device, causes the computing device to perform the live broadcast data processing method according to any one of claims 1-12.
  26. A computer-readable medium storing the program according to claim 25.
PCT/CN2017/118839 2016-12-27 2017-12-27 直播数据处理方法、装置、程序及介质 WO2018121556A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201611229439.7A CN106658145B (zh) 2016-12-27 2016-12-27 一种直播数据处理方法和装置
CN201611229439.7 2016-12-27

Publications (1)

Publication Number Publication Date
WO2018121556A1

Family

ID=58831562

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2017/118839 WO2018121556A1 (zh) 2016-12-27 2017-12-27 直播数据处理方法、装置、程序及介质

Country Status (2)

Country Link
CN (1) CN106658145B (zh)
WO (1) WO2018121556A1 (zh)

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111913708A (zh) * 2020-08-07 2020-11-10 广州虎牙科技有限公司 界面显示方法、装置、存储介质及电子设备
CN112423111A (zh) * 2020-11-05 2021-02-26 上海哔哩哔哩科技有限公司 图形引擎和适用于播放器的图形处理方法
CN112637670A (zh) * 2020-12-15 2021-04-09 上海哔哩哔哩科技有限公司 视频生成方法及装置
CN113271502A (zh) * 2020-02-17 2021-08-17 上海哔哩哔哩科技有限公司 基于视频弹幕的数据显示方法、装置以及计算机设备
CN113542846A (zh) * 2020-04-21 2021-10-22 上海哔哩哔哩科技有限公司 Ar弹幕显示方法及装置
CN113613062A (zh) * 2021-07-08 2021-11-05 广州云智达创科技有限公司 视频数据处理方法、装置、设备、存储介质和程序产品
CN113840170A (zh) * 2020-06-23 2021-12-24 武汉斗鱼网络科技有限公司 连麦直播的方法及装置
CN114245148A (zh) * 2020-09-09 2022-03-25 腾讯科技(深圳)有限公司 直播互动方法、装置、终端、服务器及存储介质
CN114765692A (zh) * 2021-01-13 2022-07-19 北京字节跳动网络技术有限公司 一种直播数据处理方法、装置、设备及介质
CN115134652A (zh) * 2021-03-22 2022-09-30 阿里巴巴新加坡控股有限公司 视频动态字幕生成方法、装置、电子设备及存储介质

Families Citing this family (9)

Publication number Priority date Publication date Assignee Title
CN106658145B (zh) * 2016-12-27 2020-07-03 北京奇虎科技有限公司 一种直播数据处理方法和装置
CN107734353B (zh) * 2017-10-09 2020-08-04 武汉斗鱼网络科技有限公司 录制弹幕视频的方法、装置、可读存储介质及设备
CN107911708B (zh) * 2017-11-09 2022-04-05 腾讯数码(天津)有限公司 弹幕显示方法、直播方法、及相关装置
CN108289234B (zh) * 2018-01-05 2021-03-16 武汉斗鱼网络科技有限公司 一种虚拟礼物特效动画展示方法、装置和设备
CN108737879A (zh) * 2018-04-04 2018-11-02 北京潘达互娱科技有限公司 一种礼物栏显示方法、装置、电子设备及存储介质
CN110060135A (zh) * 2019-04-26 2019-07-26 广州虎牙信息科技有限公司 基于直播平台的商品交易处理方法、装置、服务器及介质
CN110740346B (zh) * 2019-10-23 2022-04-22 北京达佳互联信息技术有限公司 视频数据处理方法、装置、服务器、终端和存储介质
CN111726687B (zh) * 2020-06-30 2022-12-27 北京百度网讯科技有限公司 用于生成显示数据的方法和装置
CN113347453B (zh) * 2021-04-09 2022-11-08 北京润信恒达科技有限公司 一种多路直播系统、方法和设备

Citations (8)

Publication number Priority date Publication date Assignee Title
CN101078982A (zh) * 2006-05-24 2007-11-28 北京壁虎科技有限公司 基于绘图引擎的屏幕显示方法
CN101227421A (zh) * 2007-01-16 2008-07-23 沃天醒石(北京)科技有限公司 全屏图形模式下的即时通讯方法和装置
CN101500125A (zh) * 2008-02-03 2009-08-05 突触计算机系统(上海)有限公司 用户终端上用于在视频显示时提供用户交互的方法及装置
US20130198774A1 (en) * 2012-01-30 2013-08-01 Consultants Net Creation Inc. Live broadcasting of dynamically generated content
CN105597321A (zh) * 2015-12-18 2016-05-25 武汉斗鱼网络科技有限公司 一种全屏游戏状态下的弹幕显示方法与系统
WO2016154149A1 (en) * 2015-03-20 2016-09-29 Twitter, Inc. Live video stream sharing
CN106162230A (zh) * 2016-07-28 2016-11-23 北京小米移动软件有限公司 直播信息的处理方法、装置、主播端、服务器及系统
CN106658145A (zh) * 2016-12-27 2017-05-10 北京奇虎科技有限公司 一种直播数据处理方法和装置

Family Cites Families (5)

Publication number Priority date Publication date Assignee Title
KR100585058B1 (ko) * 1999-02-01 2006-06-01 삼성전자주식회사 캡션 디스플레이를 위한 엠펙 디코더 및 디코딩 방법
US7511718B2 (en) * 2003-10-23 2009-03-31 Microsoft Corporation Media integration layer
CN100426727C (zh) * 2005-06-08 2008-10-15 腾讯科技(深圳)有限公司 一种图像合成处理系统及方法
CN105763950A (zh) * 2014-12-19 2016-07-13 中兴通讯股份有限公司 一种弹幕显示方法及系统
CN105828160B (zh) * 2016-04-01 2017-09-12 腾讯科技(深圳)有限公司 视频播放方法及装置


Cited By (15)

Publication number Priority date Publication date Assignee Title
CN113271502A (zh) * 2020-02-17 2021-08-17 上海哔哩哔哩科技有限公司 基于视频弹幕的数据显示方法、装置以及计算机设备
CN113542846A (zh) * 2020-04-21 2021-10-22 上海哔哩哔哩科技有限公司 Ar弹幕显示方法及装置
CN113840170A (zh) * 2020-06-23 2021-12-24 武汉斗鱼网络科技有限公司 连麦直播的方法及装置
CN113840170B (zh) * 2020-06-23 2023-06-16 武汉斗鱼网络科技有限公司 连麦直播的方法及装置
CN111913708A (zh) * 2020-08-07 2020-11-10 广州虎牙科技有限公司 界面显示方法、装置、存储介质及电子设备
CN111913708B (zh) * 2020-08-07 2024-02-27 广州虎牙科技有限公司 界面显示方法、装置、存储介质及电子设备
CN114245148A (zh) * 2020-09-09 2022-03-25 腾讯科技(深圳)有限公司 直播互动方法、装置、终端、服务器及存储介质
CN114245148B (zh) * 2020-09-09 2023-10-27 腾讯科技(深圳)有限公司 直播互动方法、装置、终端、服务器及存储介质
CN112423111A (zh) * 2020-11-05 2021-02-26 上海哔哩哔哩科技有限公司 图形引擎和适用于播放器的图形处理方法
CN112637670A (zh) * 2020-12-15 2021-04-09 上海哔哩哔哩科技有限公司 视频生成方法及装置
CN114765692A (zh) * 2021-01-13 2022-07-19 北京字节跳动网络技术有限公司 一种直播数据处理方法、装置、设备及介质
CN114765692B (zh) * 2021-01-13 2024-01-09 北京字节跳动网络技术有限公司 一种直播数据处理方法、装置、设备及介质
CN115134652A (zh) * 2021-03-22 2022-09-30 阿里巴巴新加坡控股有限公司 视频动态字幕生成方法、装置、电子设备及存储介质
CN113613062A (zh) * 2021-07-08 2021-11-05 广州云智达创科技有限公司 视频数据处理方法、装置、设备、存储介质和程序产品
CN113613062B (zh) * 2021-07-08 2024-01-23 广州云智达创科技有限公司 视频数据处理方法、装置、设备、存储介质

Also Published As

Publication number Publication date
CN106658145B (zh) 2020-07-03
CN106658145A (zh) 2017-05-10

Similar Documents

Publication Publication Date Title
WO2018121556A1 (zh) 直播数据处理方法、装置、程序及介质
WO2018121557A1 (zh) 直播数据显示方法、装置、程序及介质
US11943486B2 (en) Live video broadcast method, live broadcast device and storage medium
CN111729293B (zh) 一种数据处理方法、装置及存储介质
US10902663B2 (en) Method and apparatus for displaying 2D application interface in virtual reality device
WO2022048097A1 (zh) 一种基于多显卡的单帧画面实时渲染方法
US8968087B1 (en) Video game overlay
WO2017148410A1 (zh) 一种信息交互的方法、设备及系统
US8888592B1 (en) Voice overlay
JP7475610B2 (ja) クラウドネイティブによる3d場面のゲーム方法及びシステム
US10972511B2 (en) Streaming relay for digital signage
WO2018000609A1 (zh) 一种虚拟现实系统中分享3d影像的方法和电子设备
BR112015011245A2 (pt) sistemas e métodos para o processamento em nuvem e sobreposição do conteúdo nos frames de streaming de vídeo de aplicações remotamente processadas.
CN113840154B (zh) 基于虚拟礼物的直播互动方法、系统及计算机设备
WO2023071586A1 (zh) 画面生成方法、装置、设备及介质
JP6379107B2 (ja) 情報処理装置並びにその制御方法、及びプログラム
CN115065684A (zh) 数据处理方法、装置、设备以及介质
CN115220906A (zh) 音频/视频合成应用的云执行
US20200360816A1 (en) Capturing Subject Representation Within an Augmented Reality Environment
CN117370696A (zh) 小程序页面的加载方法、装置、电子设备及存储介质
WO2016066056A1 (zh) 图像远程投射方法、服务器和客户端
US11095956B2 (en) Method and system for delivering an interactive video
US10417989B2 (en) GPU and GPU computing system for providing a virtual machine and a method of manufacturing the same
CN109862385A (zh) 直播的方法、装置、计算机可读存储介质及终端设备
CN111913761B (zh) 直播频道的插件处理方法、装置、设备和存储介质

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17887750

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17887750

Country of ref document: EP

Kind code of ref document: A1