CN111726687B - Method and apparatus for generating display data - Google Patents


Info

Publication number
CN111726687B
Authority
CN
China
Prior art keywords
data
stream data
video stream
media stream
display
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010611784.7A
Other languages
Chinese (zh)
Other versions
CN111726687A (en)
Inventor
李明路
付盛
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Baidu Netcom Science and Technology Co Ltd
Original Assignee
Beijing Baidu Netcom Science and Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Baidu Netcom Science and Technology Co Ltd
Priority to CN202010611784.7A
Publication of CN111726687A
Application granted
Publication of CN111726687B
Legal status: Active (current)

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00: Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40: Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43: Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/44: Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream, rendering scenes according to MPEG-4 scene graphs
    • H04N21/4402: Processing of video elementary streams involving reformatting operations of video signals for household redistribution, storage or real-time display
    • H04N21/440227: Reformatting by decomposing into layers, e.g. base layer and one or more enhancement layers
    • H04N5/00: Details of television systems
    • H04N5/222: Studio circuitry; Studio devices; Studio equipment
    • H04N5/262: Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects; Cameras specially adapted for the electronic generation of special effects
    • H04N5/265: Mixing
    • H04N5/268: Signal distribution or switching

Abstract

The application discloses a method and an apparatus for generating display data, and relates to the technical fields of data processing and live video. The method comprises the following steps: acquiring video stream data and display configuration information of the video stream data, and media stream data and display configuration information of the media stream data, wherein the display configuration information comprises layout hierarchy information representing a display hierarchy; converting the video stream data and the media stream data into corresponding frame data; and generating mixed flow display data of the frame data corresponding to the video stream data and the frame data corresponding to the media stream according to the layout level information of the video stream data and the layout level information of the media stream data. By adopting the method, display delay can be reduced while the mixed flow display data maintains display stability.

Description

Method and apparatus for generating display data
Technical Field
Embodiments of the present disclosure relate to the field of computer technology, in particular to the technical fields of data processing and live video, and specifically to a method and an apparatus for generating display data.
Background
Video applications have evolved from single-camera applications to interactive audio and video applications, that is, video can be played while additional video, picture, or document content is displayed at the same time. Existing methods for displaying such multi-channel data adopt either a single-pipeline data source replacement technique or a multi-pipeline data source scheduling technique.
However, the single-pipeline data source replacement technique suffers from poor display stability caused by frequent switching, and the multi-pipeline data source scheduling technique suffers from display delay caused by transferring hardware or software resources.
Disclosure of Invention
The present disclosure provides a method, an apparatus, an electronic device, a computer-readable storage medium, and a computer program product for generating display data.
According to a first aspect of the present disclosure, there is provided a method for generating display data, comprising: acquiring video stream data and display configuration information of the video stream data, and media stream data and display configuration information of the media stream data, wherein the display configuration information comprises layout hierarchy information representing a display hierarchy; converting the video stream data and the media stream data into corresponding frame data; and generating mixed flow display data of the frame data corresponding to the video stream data and the frame data corresponding to the media stream according to the layout level information of the video stream data and the layout level information of the media stream data, wherein the frame data corresponding to the media stream are frame data containing media content generated based on a media content sharing instruction.
According to a second aspect of the present disclosure, there is provided an apparatus for generating display data, comprising: an acquisition unit configured to acquire video stream data and display configuration information of the video stream data, and media stream data and display configuration information of the media stream data, the display configuration information including layout hierarchy information representing a display hierarchy; and a generating unit configured to convert the video stream data and the media stream data into corresponding frame data, and to generate mixed flow display data of the frame data corresponding to the video stream data and the frame data corresponding to the media stream according to the layout level information of the video stream data and the layout level information of the media stream data, wherein the frame data corresponding to the media stream are frame data containing media content generated based on a media content sharing instruction.
According to a third aspect of the present disclosure, an embodiment of the present disclosure provides an electronic device, including: one or more processors; and a storage device for storing one or more programs which, when executed by the one or more processors, cause the one or more processors to implement the method for generating display data as provided in the first aspect.
According to a fourth aspect of the present disclosure, embodiments of the present disclosure provide a computer-readable storage medium having a computer program stored thereon, where the program when executed by a processor implements the method for generating display data provided by the first aspect.
According to a fifth aspect of the present disclosure, embodiments of the present disclosure provide a computer program product comprising a computer program which, when executed by a processor, implements the method for generating display data as provided by the first aspect. The method and the apparatus for generating display data first acquire the video stream data and its display configuration information and the media stream data and its display configuration information, then convert the video stream data and the media stream data into corresponding frame data, and generate mixed flow display data of the frame data corresponding to the video stream data and the frame data corresponding to the media stream according to the layout level information of the video stream data and the layout level information of the media stream data, so that display delay is reduced while the mixed flow display data maintains display stability.
It should be understood that the statements in this section are not intended to identify key or critical features of the embodiments of the present disclosure, nor are they intended to limit the scope of the present disclosure. Other features of the present disclosure will become apparent from the following description.
Drawings
The drawings are included to provide a better understanding of the present solution and are not to be considered limiting of the present application. Wherein:
FIG. 1 is an exemplary system architecture diagram in which embodiments of the present application may be applied;
FIG. 2 is a flow diagram of one embodiment of a method for generating display data according to the present application;
FIG. 3 is a flow diagram of another embodiment of a method for generating display data according to the present application;
FIG. 4 is a schematic diagram of a particular application scenario of a method for generating display data according to the present application;
FIG. 5 is a schematic block diagram of one embodiment of an apparatus for generating display data according to the present application;
FIG. 6 is a block diagram of an electronic device used to implement a method for generating display data according to an embodiment of the application.
Detailed Description
The following description of exemplary embodiments of the present application, taken in conjunction with the accompanying drawings, includes various details of the embodiments of the application to facilitate understanding, and these details are to be considered exemplary only. Accordingly, those of ordinary skill in the art will recognize that various changes and modifications of the embodiments described herein can be made without departing from the scope and spirit of the present application. Also, descriptions of well-known functions and constructions are omitted from the following description for clarity and conciseness.
Fig. 1 shows an exemplary system architecture 100 to which embodiments of the present method for generating display data or apparatus for generating display data may be applied.
As shown in fig. 1, the system architecture 100 may include terminal devices 101, 102, 103, a network 104, and a server 105. The network 104 serves as a medium for providing communication links between the terminal devices 101, 102, 103 and the server 105. Network 104 may include various connection types, such as wired, wireless communication links, or fiber optic cables, to name a few.
The user may use the terminal devices 101, 102, 103 to interact with the server 105 via the network 104 to receive or send messages or the like. The terminal devices 101, 102, 103 may have various client applications installed thereon for receiving push services, such as a video recording application, a video playing application, a player application, a map application, and the like.
The terminal devices 101, 102, 103 may be various electronic devices having a display screen and supporting the receipt of push services, including but not limited to a smart phone, a tablet computer, an e-book reader, an MP3 player (Moving Picture Experts Group Audio Layer III), an MP4 player (Moving Picture Experts Group Audio Layer IV), a laptop portable computer, a desktop computer, and the like.
The terminal devices 101, 102, and 103 may be hardware or software. When they are hardware, they may be any of the electronic devices described above; when they are software, they may be installed in those electronic devices and implemented as multiple pieces of software or software modules (e.g., multiple software modules used to provide distributed services) or as a single piece of software or software module, which is not specifically limited herein. The terminal 101 may serve as a video stream recording terminal, a media stream data storage terminal, and a display data generation terminal: when the terminal 101 receives a media stream data sharing instruction, it performs multi-data-source processing on the video stream data and the media stream data to generate display data, and then transmits the display data to the display device of the terminal 101 and to the server 105. The terminals 102 and 103 may be receiving terminals of the display data, and may receive the display data that was generated by the terminal 101 and forwarded by the server 105.
The server 105 may be a server that provides various services, and may be a server that provides a data storage service, for example. The server providing the data storage service may receive the display data generated and transmitted by the terminal apparatus 101 and store the display data on the server 105, and when the server 105 receives a display data acquisition request transmitted by the terminals 102, 103, may transmit the display data to the terminals 102, 103.
It should be noted that the method for generating display data provided by the embodiment of the present disclosure is generally performed by the terminal device 101, and accordingly, the apparatus for generating display data is generally disposed in the terminal device 101.
It should be understood that the number of terminal devices, networks, and servers in fig. 1 is merely illustrative. There may be any number of terminal devices, networks, and servers, as desired for an implementation.
With continued reference to FIG. 2, a flow 200 of one embodiment of a method for generating display data according to the present disclosure is shown, including the steps of:
step 201, obtaining display configuration information of video stream data and video stream data, and display configuration information of media stream data and media stream data, wherein the display configuration information includes layout hierarchy information representing a display hierarchy.
In the present embodiment, the execution subject of the method for generating display data (e.g., the terminal device 101 shown in fig. 1) can acquire the video stream data and its display configuration information, and the media stream data and its display configuration information, from the camera device of the terminal or from local storage. The video stream data refers to video content recorded by the terminal through its camera device, and the media stream data refers to files such as videos, pictures, and documents stored on the terminal. The display configuration information of the video stream data and of the media stream data includes information representing the hierarchy at which the corresponding data is displayed on the display device (for example, on the top layer or on the bottom layer), or layout information determined by the terminal according to a requirement input by the user (for example, when the user requires that the video stream data be displayed in front of the media stream data, the video stream data is set as primary display data and the media stream data as secondary display data).
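To make the notion of display configuration information concrete, the following minimal sketch (not part of the original disclosure) models it as a Python mapping; the stream names and the field names "layer" and "role" are illustrative assumptions rather than terms defined by the patent.

```python
# A minimal sketch of display configuration information for two data sources.
# The stream names and field names are assumptions chosen for readability.
display_config = {
    "video_stream": {"layer": 0, "role": "primary"},    # camera video, bottom layer
    "media_stream": {"layer": 1, "role": "secondary"},  # shared picture/document, top layer
}

# Higher "layer" values are drawn above lower ones when the mixed flow
# display data is composed.
for name, cfg in sorted(display_config.items(), key=lambda kv: kv[1]["layer"]):
    print(name, cfg)
```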
Step 202, converting the video stream data and the media stream data into corresponding frame data, and generating mixed flow display data of the frame data corresponding to the video stream data and the frame data corresponding to the media stream according to the layout level information of the video stream data and the layout level information of the media stream data, wherein the frame data corresponding to the media stream is the frame data containing the media content generated based on the media content sharing instruction.
In this embodiment, the video stream data and the media stream data may be converted into corresponding frame data by using a data extraction method or a video/image format conversion method, and a matrix transformation algorithm may be used to perform matrix calculations on the frame data storage matrices according to the preset display rotation angles of the video stream data and the media stream data. Frame data is the data corresponding to the minimum display unit of the video stream data and the media stream data.
In this embodiment, during the display of the video stream data, a media content sharing instruction triggered by the user or preset by the system is received, that is, an instruction indicating that data mixing starts and that mixed flow data is to be displayed. Corresponding frame data are then extracted from the frame data of the video stream data and the frame data of the media stream data according to the layout level information of each, and the mixed flow display data are generated, where mixed flow display data refers to display data that simultaneously contains both paths of data, i.e., the video stream data and the media stream data. Specifically, the frame data corresponding to the video stream and the frame data corresponding to the media stream may be stored in respective storage matrices, and when frame data are extracted, the storage matrices are traversed according to the layout level information to extract the corresponding frame data. Also specifically, according to the layout level information, the frame data corresponding to the higher layout level may cover the frame data corresponding to the lower layout level, or the frame data corresponding to the higher layout level may be stacked above the frame data corresponding to the lower layout level, so as to realize the mixed flow of the data.
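As a hedged illustration of the covering/stacking described above, the sketch below composites frames bottom-to-top by layout level using NumPy arrays as stand-ins for frame data; the RGBA representation and the alpha-based overlay are assumptions made for the example, not requirements of the method.

```python
import numpy as np

def mix_frames(layers):
    """Compose frames bottom-to-top according to layout level.

    `layers` is a list of (layout_level, frame) pairs, where each frame is an
    H x W x 4 RGBA uint8 array. A frame with a higher layout level covers the
    pixels of lower layers wherever its alpha channel is non-zero.
    """
    ordered = sorted(layers, key=lambda item: item[0])
    canvas = np.zeros_like(ordered[0][1])
    for _, frame in ordered:
        opaque = frame[..., 3:4] > 0              # opaque pixels of this layer
        canvas = np.where(opaque, frame, canvas)  # higher layer covers lower layer
    return canvas

# Example: a "video" frame on the bottom layer, a smaller opaque "media" patch on top.
video = np.full((4, 4, 4), 255, dtype=np.uint8)   # fully opaque white video frame
media = np.zeros((4, 4, 4), dtype=np.uint8)       # transparent media frame
media[:2, :2] = [0, 0, 255, 255]                  # opaque blue patch in the top-left corner
mixed = mix_frames([(0, video), (1, media)])
```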
In this embodiment, in the process of recording the video stream data, when the video stream data recording terminal receives the media content sharing instruction, the frame data of the video stream picture captured at the current time and the frame data in the media stream data specified in the media content sharing instruction are used as two paths of frame data to be synthesized. The frame data of the media stream data specified in the media content sharing instruction may be a frame of image specified in the media content sharing instruction, a specified page in a document, or a frame of data in a video.
For example, in an online education scenario, the video stream data is live broadcast data recorded and played by a teacher in real time, and the media stream data is an example image stored by the teacher on the live broadcast recording terminal. The layout level information corresponding to the live broadcast data specifies display on the bottom layer, and the layout level information corresponding to the example image specifies display on the top layer. When the teacher clicks a display-image option, a media content sharing instruction is triggered, and the terminal generates mixed flow display data in which the example image is on the top layer and the live broadcast data is on the bottom layer, according to the layout level information and the frame data of the live broadcast data and of the example image at the moment when the teacher triggers the media content sharing instruction.
The method and the apparatus for generating display data provided by this embodiment first acquire the video stream data and its display configuration information and the media stream data and its display configuration information, then convert the video stream data and the media stream data into corresponding frame data, and generate mixed flow display data of the frame data corresponding to the video stream data and the frame data corresponding to the media stream according to the layout level information of the video stream data and the layout level information of the media stream data, so that display delay is reduced while the mixed flow display data maintains display stability.
With further reference to FIG. 3, a flow 300 of yet another embodiment of a method for generating display data is shown, comprising the steps of:
step 301, obtaining display configuration information of video stream data and video stream data, and display configuration information of media stream data and media stream data, where the display configuration information includes layout hierarchy information representing a display hierarchy.
The description of step 301 in this embodiment is the same as that of step 201, and is not repeated here.
Step 302, converting video stream data and media stream data into corresponding frame data; mixed flow information of the video stream data and the media stream data is determined based on the layout level information of the video stream data and the layout level information of the media stream data.
Step 302 specifically includes: calculating the hash value of the frame data, using the hash value of the frame data as a key in a data source key value pair, using the frame data as a value in the data source key value pair, filling the data source key value pair, and associating the data source key value pair with a layout key value pair according to the hash value of the frame data, wherein the layout key value pair takes the layout level information of the video stream data or the media stream data corresponding to the frame data as a key, and takes the hash value of the corresponding frame data as a value.
In this embodiment, the video stream data and the media stream data are first converted into corresponding frame data, then the hash values of the frame data corresponding to the video stream data and of the frame data corresponding to the media stream data are calculated, and the layout level information, the frame data, and the hash values of the frame data are stored by means of key value pairs. Specifically, a data source key value pair and a layout key value pair are set: the data source key value pair is filled with the hash value of the frame data as its key and the frame data as its value (for example, the data source key value pair: "hash value of frame data - frame data"); the layout key value pair is filled with the layout level information as its key and the hash value of the frame data as its value (for example, the layout key value pair: "layout level information - hash value of frame data"). The data source key value pair and the layout key value pair corresponding to the video stream data are the mixed flow information of the video stream data, and the data source key value pair and the layout key value pair corresponding to the media stream data are the mixed flow information of the media stream data.
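The two key value pairs described above can be sketched in a few lines of Python; the helper names and the use of SHA-1 over raw frame bytes are assumptions made for this illustration, not choices prescribed by the patent.

```python
import hashlib

def frame_hash(frame_bytes: bytes) -> str:
    # Any stable hash works; SHA-1 is used here only as an illustrative choice.
    return hashlib.sha1(frame_bytes).hexdigest()

data_source_kv = {}  # data source key value pair: hash of frame data -> frame data
layout_kv = {}       # layout key value pair: layout level info -> hash of frame data

def register_frame(layout_level: int, frame_bytes: bytes) -> None:
    h = frame_hash(frame_bytes)
    data_source_kv[h] = frame_bytes  # fill the data source key value pair
    layout_kv[layout_level] = h      # associate it with the layout key value pair

register_frame(0, b"video-frame-bytes")  # video stream frame on the bottom layer
register_frame(1, b"media-frame-bytes")  # shared media frame on the top layer
```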
Step 303, rendering frame data corresponding to the video stream data and frame data corresponding to the media stream data based on the mixed flow information of the video stream data and the media stream data.
In this embodiment, the hash value of the frame data corresponding to the video stream data may be extracted in the layout key value pair corresponding to the video stream data according to the layout level information of the video stream data in the layout key value pair, and then the frame data of the video stream data corresponding to the hash value of the frame data may be extracted in the data source key value pair of the video stream data according to the hash value of the frame data; extracting a hash value of frame data corresponding to the media stream data in a layout key value pair corresponding to the media stream data according to the layout level information of the media stream data, and then extracting frame data of the media stream data corresponding to the hash value of the frame data in a data source key value pair of the media stream data according to the hash value of the frame data; and mixing the extracted frame data corresponding to the video stream data and the extracted frame data corresponding to the media stream data to generate display data.
In this embodiment, the hash value of the frame data corresponding to the video stream data and the hash value of the frame data corresponding to the media stream data are calculated from the respective frame data, and the corresponding frame data are extracted by reading the hash values, so that data reading time can be reduced, the delay in generating display data is reduced, and the delay of data display is further reduced.
Optionally, rendering frame data corresponding to the video stream data and frame data corresponding to the media stream data based on mixed flow information of the video stream data and the media stream data includes: traversing the layout key value pair, and reading frame data corresponding to each display level according to the association of the layout key value pair and the data source key value pair; and rendering frame data corresponding to each display level.
In this embodiment, the hash value of the corresponding frame data may first be obtained by traversing the layout key value pair according to the layout level information, and the corresponding frame data may then be read from the data source key value pair according to that hash value. The frame data corresponding to each display level are then fused by image rendering software or an image rendering engine according to preset rendering parameters to generate the display data. By using the association between the layout key value pair and the data source key value pair and extracting the corresponding frame data through traversal of the layout key value pair, this embodiment can reduce the data processing time, which in turn reduces the time for generating the display data and thus the data display delay.
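Continuing the key value pair sketch above, the traversal and per-level reading could look like the following; the `render` callback is a hypothetical placeholder for the image rendering software or rendering engine and is not named in the patent.

```python
def read_frames_by_level(layout_kv: dict, data_source_kv: dict):
    """Traverse layout levels in ascending order and yield (level, frame_data)."""
    for level in sorted(layout_kv):
        frame_hash_value = layout_kv[level]            # layout key value pair -> hash
        yield level, data_source_kv[frame_hash_value]  # hash -> frame data

def render_mixed_flow(layout_kv: dict, data_source_kv: dict, render) -> None:
    # `render` stands in for the rendering engine that fuses one layer into the output.
    for level, frame in read_frames_by_level(layout_kv, data_source_kv):
        render(level, frame)

# Usage with the dictionaries filled by register_frame() above.
render_mixed_flow(layout_kv, data_source_kv,
                  lambda level, frame: print(level, len(frame), "bytes"))
```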
Optionally, the display configuration information further includes rendering parameters characterizing display effects of the video stream data and the media stream data; generating mixed flow display data of frame data corresponding to the video stream data and frame data corresponding to the media stream according to the layout level information of the video stream data and the layout level information of the media stream data, comprising: and generating mixed flow display data of frame data corresponding to the video stream data and frame data corresponding to the media stream according to the layout level information and the rendering parameter of the video stream data and the layout level information and the rendering parameter of the media stream data.
In this embodiment, the frame data corresponding to the video stream data and the frame data corresponding to the media stream data are extracted according to the layout level information of each, and are rendered and fused according to preset rendering parameters, using image rendering software and effect map making software, to generate the mixed flow display data. The rendering parameters may include the display positions of the video stream data and the media stream data on the display interface/device, the length, width, or aspect ratio of the displayed picture, the relative positions of the video stream data and the media stream data when displayed, and the animation effects of the video stream data and the media stream data when displayed, such as the rotation angle and the scaling ratio of the displayed picture.
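A hedged sketch of how such rendering parameters might be represented and turned into a per-layer transform follows; the parameter names and the 2-D affine matrix are illustrative assumptions, not the patent's required representation.

```python
import math

# Illustrative rendering parameters for one layer (names are assumptions).
render_params = {
    "position": (640, 360),   # where the layer is placed on the display canvas
    "scale": 0.5,             # scaling ratio of the displayed picture
    "rotation_deg": 90,       # rotation angle of the displayed picture
}

def affine_matrix(params: dict) -> list:
    """Build a 3x3 affine matrix (scale, then rotate, then translate) for one layer."""
    s = params["scale"]
    theta = math.radians(params["rotation_deg"])
    tx, ty = params["position"]
    return [
        [s * math.cos(theta), -s * math.sin(theta), tx],
        [s * math.sin(theta),  s * math.cos(theta), ty],
        [0.0,                  0.0,                 1.0],
    ]

print(affine_matrix(render_params))
```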
Optionally, generating mixed flow display data of frame data corresponding to the video stream data and frame data corresponding to the media stream according to the layout level information of the video stream data and the layout level information of the media stream data, including: and extracting frame data corresponding to the video stream data and frame data corresponding to the media stream data according to the layout level information of the video stream data and the layout level information of the media stream data, and generating mixed flow display data of the frame data corresponding to the video stream data and the frame data corresponding to the media stream.
In this embodiment, corresponding frame data is extracted from frame data corresponding to video stream data and frame data corresponding to media stream data according to layout level information corresponding to the video stream data and layout level information corresponding to the media stream data, and is stored in a data matrix, and then display data corresponding to the data matrix is generated according to the data matrix. The embodiment can improve the calculation efficiency of the processor and further reduce the processing time delay by generating the display data according to the matrix operation.
In some alternative implementations of the embodiments described above in connection with fig. 2 and 3, the method for generating display data further comprises: and sending the mixed flow display data to a server so that the server sends the display result to a request terminal of the video stream data.
In this embodiment, the generated mixed flow display data is sent to a server, and the server may send the mixed flow display data to a request terminal of the video stream data, where the request terminal may be a terminal of a user requesting to view the video stream or a terminal designated by the video stream generation terminal to receive the video stream. By sending the generated display result to the server, this embodiment allows the server to back up, archive, or distribute the display result, which supports diversified video stream playing scenarios.
In some application scenarios, openGL (Open Graphics Library) may be used to perform data processing on video stream data and media stream data to obtain display data. Specifically, the OpenGL software environment is run, a frame data buffer CVPixelBuffer for storing frame data, a texture buffer CVOpenGLESTextureRef for storing frame data textures, a buffer OpenGL ES Framebuffer for storing rendering effects of the frame data are created, the frame data corresponding to the video stream data and the frame data corresponding to the media stream data are stored in the corresponding CVPixelBuffer, the textures of the frame data corresponding to the video stream data and the textures of the frame data corresponding to the media stream data are stored in the corresponding CVPixelBuffer, the rendering effects of the frame data corresponding to the video stream data and the rendering effects of the frame data corresponding to the media stream data are stored in the corresponding OpenGL ES Framebuffer, the rendering effects of the frame data corresponding to the video stream data and the rendering effects of the frame data corresponding to the media stream data are extracted from the video stream frame data buffer CVPixelBuffer and the media stream data buffer according to layout level information corresponding to the video stream data and the media stream data, the corresponding frame data in the frame data buffer CVPixelBuffer are extracted, and the rendering effects of the frame data are placed in the OpenGL buffer, the frame data buffer CVPixelBuffer, and the rendering effects of the frame data are displayed, and the rendering functions are displayed in the frame data.
In the online education scenario shown in fig. 4, locally stored courseware needs to be shared while the lecture video is playing. The camera collecting component shown in fig. 4 collects the video stream data; the local storage component provides the locally stored media stream data; the display configuration information of the video stream data and of the media stream data is preset by the user and input into the terminal; the mixing component mixes the video stream data and the media stream data according to their display configuration information to generate the display data; and the distribution component distributes the display data to the display device of the collecting terminal (i.e., the local display device) and to a remote server in communication with it, and the remote server sends the display data to request terminals of the teaching video that are in communication with the remote server. For example, the user first configures display configuration information such as the video display position and proportion, the courseware display position and proportion, the video layout level information, and the courseware layout level information. After acquiring the display configuration information, the terminal can check the camera permission and start the camera to collect the video stream data, and the user can select the locally stored media stream data to be shared, i.e., courseware in document form. The two data streams are then transmitted to the mixing component through an extended acquisition pipeline, the mixing component generates the mixed flow display data according to the method for generating display data described above, and finally the mixed flow display data is transmitted through the distribution component to the user's terminal for display and to the server, so that the server can forward it to request terminals of the online course, such as the terminals of students attending the course.
With further reference to fig. 5, as an implementation of the above method, the present disclosure provides an embodiment of an apparatus for generating display data, which corresponds to the method embodiment shown in fig. 2, and which is particularly applicable in various electronic devices.
As shown in fig. 5, the apparatus 500 for generating display data of the present embodiment includes: an acquisition unit 501 and a generating unit 502, wherein the acquisition unit 501 is configured to acquire video stream data and display configuration information of the video stream data, and media stream data and display configuration information of the media stream data, the display configuration information including layout level information representing a display level; and the generating unit 502 is configured to convert the video stream data and the media stream data into corresponding frame data, and to generate mixed flow display data of the frame data corresponding to the video stream data and the frame data corresponding to the media stream according to the layout level information of the video stream data and the layout level information of the media stream data, wherein the frame data corresponding to the media stream are frame data containing media content generated based on a media content sharing instruction.
In some embodiments, the generating unit 502 includes: a determination module configured to determine mixed flow information of video stream data and media stream data based on layout level information of the video stream data and layout level information of the media stream data, including: calculating the hash value of frame data, using the hash value of the frame data as a key in a data source key value pair, using the frame data as a value in the data source key value pair, filling the data source key value pair, and associating the data source key value pair with a layout key value pair according to the hash value of the frame data, wherein the layout key value pair takes the layout level information of video stream data or media stream data corresponding to the frame data as a key and the hash value of the corresponding frame data as a value; the rendering module is configured to render frame data corresponding to the video stream data and frame data corresponding to the media stream data based on mixed flow information of the video stream data and the media stream data.
In some embodiments, a rendering module, comprising: a traversing module configured to traverse the layout key-value pairs and read frame data corresponding to each display level according to the association of the layout key-value pairs with the data source key-value pairs; and the rendering submodule is configured to render the frame data corresponding to each display level.
In some embodiments, the display configuration information further includes rendering parameters characterizing display effects of the video stream data and the media stream data; a generation unit comprising: and the generating module is configured to generate mixed flow display data of frame data corresponding to the video stream data and frame data corresponding to the media stream according to the layout level information and the rendering parameter of the video stream data and the layout level information and the rendering parameter of the media stream data.
In some embodiments, the generating unit comprises: and the generating submodule is configured to extract frame data corresponding to the video stream data and frame data corresponding to the media stream data according to the layout level information of the video stream data and the layout level information of the media stream data, and generate mixed flow display data of the frame data corresponding to the video stream data and the frame data corresponding to the media stream.
In some embodiments, the apparatus comprises: and the distribution unit is configured to transmit the mixed flow display data to the server so as to enable the server to transmit the display result to a request terminal of the video stream data.
The units in the apparatus 500 described above correspond to the steps in the method described with reference to fig. 2 and 3. Thus, the operations, features and technical effects that can be achieved by the methods for generating display data described above are also applicable to the apparatus 500 and the units included therein, and are not described herein again.
According to an embodiment of the present application, an electronic device and a readable storage medium are also provided.
Fig. 6 is a block diagram of an electronic device 600 for generating display data according to an embodiment of the present application. Electronic devices are intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframes, and other appropriate computers. Electronic devices may also represent various forms of mobile devices, such as personal digital processors, cellular telephones, smart phones, wearable devices, and other similar computing devices. The components shown herein, their connections and relationships, and their functions are meant to be exemplary only, and are not meant to limit implementations of the application described and/or claimed herein.
As shown in fig. 6, the electronic apparatus includes: one or more processors 601, memory 602, and interfaces for connecting the various components, including a high-speed interface and a low-speed interface. The various components are interconnected using different buses and may be mounted on a common motherboard or in other manners as desired. The processor may process instructions for execution within the electronic device, including instructions stored in or on the memory to display graphical information of a GUI on an external input/output apparatus (such as a display device coupled to the interface). In other embodiments, multiple processors and/or multiple buses may be used, along with multiple memories, if desired. Also, multiple electronic devices may be connected, with each device providing portions of the necessary operations (e.g., as a server array, a group of blade servers, or a multi-processor system). In fig. 6, one processor 601 is taken as an example.
The memory 602 is a non-transitory computer readable storage medium as provided herein. Wherein the memory stores instructions executable by the at least one processor to cause the at least one processor to perform the method for generating display data provided herein. The non-transitory computer readable storage medium of the present application stores computer instructions for causing a computer to perform the method for generating display data provided herein.
The memory 602, as a non-transitory computer readable storage medium, may be used to store non-transitory software programs, non-transitory computer executable programs, and modules, such as program instructions/modules corresponding to the method for generating display data in the embodiments of the present application (e.g., the obtaining unit 501, the generating unit 502 shown in fig. 5). The processor 601 executes various functional applications of the server and data processing, i.e., implements the method for generating display data in the above-described method embodiments, by executing non-transitory software programs, instructions, and modules stored in the memory 602.
The memory 602 may include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required for at least one function; the storage data area may store data created according to use of the electronic device for generating display data, and the like. Further, the memory 602 may include high speed random access memory, and may also include non-transitory memory, such as at least one magnetic disk storage device, flash memory device, or other non-transitory solid state storage device. In some embodiments, the memory 602 optionally includes memory located remotely from the processor 601, which may be connected over a network to an electronic device for generating display data. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The electronic device for generating display data may further include: an input device 603, an output device 604, and a bus 605. The processor 601, the memory 602, the input device 603, and the output device 604 may be connected by a bus 605 or other means, and are exemplified by the bus 605 in fig. 6.
The input device 603 may receive input numeric or character information and generate key signal inputs related to user settings and function controls of the electronic apparatus used to generate the display data, such as a touch screen, keypad, mouse, track pad, touch pad, pointer stick, one or more mouse buttons, track ball, joystick or other input device. The output devices 604 may include a display device, auxiliary lighting devices (e.g., LEDs), and tactile feedback devices (e.g., vibrating motors), among others. The display device may include, but is not limited to, a Liquid Crystal Display (LCD), a Light Emitting Diode (LED) display, and a plasma display. In some implementations, the display device can be a touch screen.
Various implementations of the systems and techniques described here can be realized in digital electronic circuitry, integrated circuitry, ASICs (application-specific integrated circuits), computer hardware, firmware, software, and/or combinations thereof. These various embodiments may include: implementation in one or more computer programs that are executable and/or interpretable on a programmable system including at least one programmable processor, which may be special or general purpose, receiving data and instructions from, and transmitting data and instructions to, a storage system, at least one input device, and at least one output device.
A computer program product (also known as a program, software application, or code) for implementing the methods of the present disclosure includes machine instructions for a programmable processor, and may be implemented using a high-level procedural and/or object-oriented programming language, and/or assembly/machine language. As used herein, the terms "machine-readable medium" and "computer-readable medium" refer to any computer program product, apparatus, and/or device (e.g., magnetic discs, optical disks, memory, programmable Logic Devices (PLDs)) used to provide machine instructions and/or data to a programmable processor, including a machine-readable medium that receives machine instructions as a machine-readable signal. The term "machine-readable signal" refers to any signal used to provide machine instructions and/or data to a programmable processor.
To provide for interaction with a user, the systems and techniques described here can be implemented on a computer having: a display device (e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor) for displaying information to a user; and a keyboard and a pointing device (e.g., a mouse or a trackball) by which a user can provide input to the computer. Other kinds of devices may also be used to provide for interaction with a user; for example, feedback provided to the user can be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback); and input from the user can be received in any form, including acoustic, speech, or tactile input.
The systems and techniques described here can be implemented in a computing system that includes a back-end component (e.g., as a data server), or that includes a middleware component (e.g., an application server), or that includes a front-end component (e.g., a user computer having a graphical user interface or a web browser through which a user can interact with an implementation of the systems and techniques described here), or any combination of such back-end, middleware, or front-end components. The components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include: local Area Networks (LANs), wide Area Networks (WANs), and the Internet.
The computer system may include clients and servers. A client and server are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other. The server can be a cloud server, also called a cloud computing server or a cloud host, and is a host product in a cloud computing service system, so as to solve the defects of high management difficulty and weak service expansibility in the traditional physical host and Virtual Private Server (VPS) service.
It should be understood that various forms of the flows shown above may be used, with steps reordered, added, or deleted. For example, the steps described in the present application may be executed in parallel, sequentially, or in different orders, as long as the desired results of the technical solutions disclosed in the present application can be achieved, and the present invention is not limited herein.
The above-described embodiments are not intended to limit the scope of the present disclosure. It should be understood by those skilled in the art that various modifications, combinations, sub-combinations and substitutions may be made in accordance with design requirements and other factors. Any modification, equivalent replacement, and improvement made within the spirit and principle of the present application shall be included in the protection scope of the present application.

Claims (10)

1. A method for generating display data, comprising:
the method comprises the steps of obtaining video stream data and display configuration information of the video stream data, and media stream data and display configuration information of the media stream data, wherein the display configuration information comprises layout hierarchy information representing a display hierarchy;
converting the video stream data and the media stream data into corresponding frame data, and generating mixed flow display data of the frame data corresponding to the video stream data and the frame data corresponding to the media stream according to the layout level information of the video stream data and the layout level information of the media stream data, including: determining mixed flow information of the video stream data and the media stream data based on layout level information of the video stream data and layout level information of the media stream data, and rendering frame data corresponding to the video stream data and frame data corresponding to the media stream data based on the mixed flow information of the video stream data and the media stream data to generate display data in a fusion manner, wherein the determining mixed flow information of the video stream data and the media stream data based on the layout level information of the video stream data and the layout level information of the media stream data comprises: calculating a hash value of the frame data, using the hash value of the frame data as a key in a data source key value pair, using the frame data as a value in the data source key value pair, filling a data source key value pair, and associating the data source key value pair with a layout key value pair according to the hash value of the frame data, wherein the layout key value pair uses video stream data or layout level information of the media stream data corresponding to the frame data as a key and uses a hash value of the corresponding frame data as a value, wherein the frame data corresponding to the media stream is generated based on a media content sharing instruction and contains the media content, the data source key value pair and the layout key value pair corresponding to the video stream data are mixed flow information of the video stream data, and the data source key value pair and the layout key value pair corresponding to the media stream data are mixed flow information of the media stream data; rendering frame data corresponding to the video stream data and frame data corresponding to the media stream data based on mixed flow information of the video stream data and the media stream data to generate display data in a fusion mode, wherein the rendering mode comprises the following steps: traversing the layout key value pair, and reading frame data corresponding to each display level according to the association of the layout key value pair and a data source key value pair; and rendering the frame data corresponding to each display level to fuse and generate display data.
2. The method of claim 1, wherein the display configuration information further includes rendering parameters characterizing the video stream data and the media stream data display effects;
the generating mixed flow display data of frame data corresponding to the video stream data and frame data corresponding to the media stream according to the layout level information of the video stream data and the layout level information of the media stream data includes:
and generating mixed flow display data of frame data corresponding to the video stream data and frame data corresponding to the media stream according to the layout level information and the rendering parameter of the video stream data and the layout level information and the rendering parameter of the media stream data.
3. The method of claim 1, wherein the generating mixed flow display data of frame data corresponding to the video stream data and frame data corresponding to the media stream according to the layout level information of the video stream data and the layout level information of the media stream data comprises:
and extracting frame data corresponding to the video stream data and frame data corresponding to the media stream data according to the layout level information of the video stream data and the layout level information of the media stream data, and generating mixed flow display data of the frame data corresponding to the video stream data and the frame data corresponding to the media stream.
4. A method according to any one of claims 1-3, wherein the method comprises:
and sending the mixed flow display data to a server so that the server sends the mixed flow display data to a request terminal of the video stream data.
5. An apparatus for generating display data, comprising:
an acquisition unit configured to acquire display configuration information of video stream data and video stream data, and display configuration information of media stream data and media stream data, the display configuration information including layout hierarchy information representing a display hierarchy;
a generating unit configured to convert the video stream data and the media stream data into corresponding frame data, and generate mixed flow display data of the frame data corresponding to the video stream data and the frame data corresponding to the media stream according to the layout level information of the video stream data and the layout level information of the media stream data, including: a determination module and a rendering module, the determination module configured to determine mixed flow information of the video stream data and the media stream data based on layout level information of the video stream data and layout level information of the media stream data, including: calculating a hash value of the frame data, using the hash value of the frame data as a key in a data source key value pair, using the frame data as a value in the data source key value pair, filling a data source key value pair, and associating the data source key value pair with a layout key value pair according to the hash value of the frame data, where the layout key value pair uses video stream data or layout level information of the media stream data corresponding to the frame data as a key and uses a hash value of the corresponding frame data as a value, where the frame data corresponding to the media stream is the frame data containing the media content generated based on the media content sharing instruction, the data source key value pair and the layout key value pair corresponding to the video stream data are mixed flow information of the video stream data, and the data source key value pair and the layout key value pair corresponding to the media stream data are mixed flow information of the media stream data, and the rendering module is configured to render the frame data corresponding to the video stream data and the frame data corresponding to the media stream data based on the mixed flow information of the video stream data and the media stream data to generate display data by fusion, including: a traversal module and a rendering sub-module, the traversal module configured to traverse the layout key-value pairs and read frame data corresponding to each display level according to associations of the layout key-value pairs with data source key-value pairs; the rendering submodule is configured to render the frame data corresponding to each display level so as to generate display data in a fusion mode.
6. The apparatus of claim 5, wherein the display configuration information further includes rendering parameters characterizing display effects of the video stream data and the media stream data;
the generating unit comprises:
a generating module configured to generate mixed flow display data of the frame data corresponding to the video stream data and the frame data corresponding to the media stream data according to the layout level information and the rendering parameters of the video stream data and the layout level information and the rendering parameters of the media stream data.
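As a sketch of how the rendering parameters of claim 6 might shape the mix, the helper below alpha-blends one layer onto the canvas at a configured offset. The specific parameters (alpha, x, y) are assumptions; the patent only states that rendering parameters characterize the display effect of each stream.

```python
import numpy as np

def blend_layer(canvas: np.ndarray, frame: np.ndarray, alpha: float, x: int, y: int) -> np.ndarray:
    """Alpha-blend `frame` onto `canvas` at offset (x, y); assumes the frame fits inside the canvas."""
    h, w = frame.shape[:2]
    region = canvas[y:y + h, x:x + w].astype(np.float32)
    blended = alpha * frame.astype(np.float32) + (1.0 - alpha) * region
    canvas[y:y + h, x:x + w] = blended.astype(np.uint8)
    return canvas
```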
7. The apparatus of claim 5, wherein the generating unit comprises:
a generating submodule configured to extract the frame data corresponding to the video stream data and the frame data corresponding to the media stream data according to the layout level information of the video stream data and the layout level information of the media stream data, and to generate mixed flow display data of the frame data corresponding to the video stream data and the frame data corresponding to the media stream data.
8. The apparatus according to any one of claims 5-7, further comprising:
a distribution unit configured to transmit the mixed flow display data to a server to cause the server to transmit the mixed flow display data to a requesting terminal of the video stream data.
9. An electronic device, comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the method of any one of claims 1-4.
10. A non-transitory computer readable storage medium having computer instructions stored thereon, the computer instructions being configured to cause a computer to perform the method of any one of claims 1-4.
CN202010611784.7A 2020-06-30 2020-06-30 Method and apparatus for generating display data Active CN111726687B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010611784.7A CN111726687B (en) 2020-06-30 2020-06-30 Method and apparatus for generating display data

Publications (2)

Publication Number Publication Date
CN111726687A (en) 2020-09-29
CN111726687B (en) 2022-12-27

Family

ID=72570120

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010611784.7A Active CN111726687B (en) 2020-06-30 2020-06-30 Method and apparatus for generating display data

Country Status (1)

Country Link
CN (1) CN111726687B (en)

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102959947A (en) * 2010-07-06 2013-03-06 松下电器产业株式会社 Screen synthesising device and screen synthesising method
CN103338340A (en) * 2013-06-18 2013-10-02 北京汉博信息技术有限公司 Picture-in-picture module and method for realizing user-defined combination of frame pictures of multiple video streams
CN103702040A (en) * 2013-12-31 2014-04-02 广州华多网络科技有限公司 Real-time video graphic decoration superposing processing method and system
CN104469089A (en) * 2014-12-23 2015-03-25 山东建筑大学 Multimedia interaction teaching system and teaching method
CN105979357A (en) * 2015-12-03 2016-09-28 乐视致新电子科技(天津)有限公司 Teaching video processing method and device
CN106375296A (en) * 2016-08-30 2017-02-01 杭州施强教育科技有限公司 Multimedia courseware teaching live broadcast method
CN106572385A (en) * 2015-10-10 2017-04-19 北京佳讯飞鸿电气股份有限公司 Image overlaying method for remote training video presentation
CN107360160A (en) * 2017-07-12 2017-11-17 广州华多网络科技有限公司 live video and animation fusion method, device and terminal device
CN108614671A (en) * 2016-12-12 2018-10-02 北京忆恒创源科技有限公司 Key-data access method based on NameSpace and solid storage device

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2534136A (en) * 2015-01-12 2016-07-20 Nokia Technologies Oy An apparatus, a method and a computer program for video coding and decoding
GB2554877B (en) * 2016-10-10 2021-03-31 Canon Kk Methods, devices, and computer programs for improving rendering display during streaming of timed media data
CN106658145B (en) * 2016-12-27 2020-07-03 北京奇虎科技有限公司 Live broadcast data processing method and device
CN109218754A (en) * 2018-09-28 2019-01-15 武汉斗鱼网络科技有限公司 Information display method, device, equipment and medium in a kind of live streaming
CN109525852B (en) * 2018-11-22 2020-11-13 北京奇艺世纪科技有限公司 Live video stream processing method, device and system and computer readable storage medium
CN111246232A (en) * 2020-01-17 2020-06-05 广州华多网络科技有限公司 Live broadcast interaction method and device, electronic equipment and storage medium

Also Published As

Publication number Publication date
CN111726687A (en) 2020-09-29

Similar Documents

Publication Publication Date Title
US10061552B2 (en) Identifying the positioning in a multiple display grid
US20200029119A1 (en) Generating masks and displaying comments relative to video frames using masks
Jeong et al. Ultrascale collaborative visualization using a display-rich global cyberinfrastructure
EP3055761B1 (en) Framework for screen content sharing system with generalized screen descriptions
US20090172557A1 (en) Gui screen sharing between real pcs in the real world and virtual pcs in the virtual world
CN108846886B (en) AR expression generation method, client, terminal and storage medium
WO2015077259A1 (en) Image sharing for online collaborations
WO2021147414A1 (en) Video message generation method and apparatus, electronic device, and storage medium
US20230029698A1 (en) Video interaction method and apparatus, electronic device, and computer-readable storage medium
CN110825224A (en) Interaction method, interaction system and display device
US20150195320A1 (en) Method, System and Software Product for Improved Online Multimedia File Sharing
WO2021103366A1 (en) Bullet screen processing method and system based on wechat mini-program
CN103197908A (en) Information display platform based PDF (portable document format) file display method and information display platform based PDF file display system
US20220254114A1 (en) Shared mixed reality and platform-agnostic format
US11558440B1 (en) Simulate live video presentation in a recorded video
CN111033497B (en) Providing hyperlinks in remotely viewed presentations
CN110673886B (en) Method and device for generating thermodynamic diagrams
Lee et al. FLUID-XP: flexible user interface distribution for cross-platform experience
CN111726687B (en) Method and apparatus for generating display data
CN112153472A (en) Method and device for generating special picture effect, storage medium and electronic equipment
WO2024021353A1 (en) Live broadcast room presentation method and apparatus, and electronic device and storage medium
US20230055968A1 (en) Filtering group messages
US11711408B2 (en) Content appearance conversion for remote application sharing
CN113542802B (en) Video transition method and device
KR20160131827A (en) System for cloud streaming service, method of image cloud streaming service using alpha level of color bit and apparatus for the same

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant