CN112261434A - Interface layout control and processing method and corresponding device, equipment and medium - Google Patents

Interface layout control and processing method and corresponding device, equipment and medium

Info

Publication number
CN112261434A
CN112261434A (application number CN202011139741.XA)
Authority
CN
China
Prior art keywords
playing
layout
layout information
window
canvas
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202011139741.XA
Other languages
Chinese (zh)
Inventor
梁仕田
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangzhou Huaduo Network Technology Co Ltd
Original Assignee
Guangzhou Huaduo Network Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangzhou Huaduo Network Technology Co Ltd filed Critical Guangzhou Huaduo Network Technology Co Ltd
Priority to CN202011139741.XA priority Critical patent/CN112261434A/en
Publication of CN112261434A publication Critical patent/CN112261434A/en
Pending legal-status Critical Current

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00: Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20: Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/21: Server components or server architectures
    • H04N21/218: Source of audio or video content, e.g. local disk arrays
    • H04N21/2187: Live feed
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00: Arrangements for program control, e.g. control units
    • G06F9/06: Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44: Arrangements for executing specific programs
    • G06F9/451: Execution arrangements for user interfaces
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00: Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20: Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/25: Management operations performed by the server for facilitating the content distribution or administrating data related to end-users or client devices, e.g. end-user or client device authentication, learning user preferences for recommending movies
    • H04N21/262: Content or additional data distribution scheduling, e.g. sending additional data at off-peak times, updating software modules, calculating the carousel transmission frequency, delaying a video stream transmission, generating play-lists
    • H04N21/26208: Content or additional data distribution scheduling, the scheduling operation being performed under constraints
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00: Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40: Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43: Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/4302: Content synchronisation processes, e.g. decoder synchronisation
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00: Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40: Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47: End-user applications
    • H04N21/485: End-user interface for client configuration
    • H04N21/4858: End-user interface for client configuration for modifying screen layout parameters, e.g. fonts, size of the windows

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Software Systems (AREA)
  • Databases & Information Systems (AREA)
  • Human Computer Interaction (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application discloses an interface layout control and processing method, together with corresponding apparatus, device, and medium. The control method comprises the following steps: displaying a plurality of playing windows with adjustable layouts in a canvas provided by a broadcast control end, wherein each playing window presents one path of dynamic images; uploading the layout information of each playing window relative to the canvas, together with the corresponding dynamic images, to a server in real time, so as to control the server to mix the received dynamic images into a single-path video stream according to the layout information and push the single-path video stream to the viewer end; and, in response to a repositioning operation applied to any playing window, changing that window's layout information to the repositioned layout information, so as to control the server to adjust the layout of the playing window in the canvas. The application thereby gives the broadcast control end of a network live broadcast an efficient technical means to dynamically reposition playing windows, allowing it to adjust the display layout of the dynamic images received and played at the viewer end and achieving a what-you-see-is-what-you-get effect.

Description

Interface layout control and processing method and corresponding device, equipment and medium
Technical Field
The present application relates to the field of network video live broadcast control technologies, and in particular, to an interface layout control method and a corresponding apparatus, device, and medium thereof, and also to an interface layout processing method and a corresponding apparatus, device, and medium thereof.
Background
Network video live broadcast, as a basic service of the internet, is widely applied in real-world scenarios, providing services through different product forms such as education and training systems, video conference systems, and video live broadcast systems.
In some live scenes, in order to output multiple kinds of dynamic information simultaneously, several live streams often exist at the same time. For example, in a system applying live broadcast technology to presentations, there may simultaneously be a live stream carrying screenshot images of the broadcast control end's graphical user interface, a live stream carrying the interface images of a user's document presentation, and a live stream carrying video images acquired by the broadcast control end's local camera device. The multiple concurrent live streams are transmitted to the server, which stores them to a CDN (content delivery network) so that they can finally be pushed to audience terminals, thereby realizing network video live broadcast.
If the server pushed the multiple live streams to the audience unprocessed, it is easy to see that a large load would be borne in network bandwidth and in the computing power of audience devices. Therefore, in the prior art, the multiple live streams are usually mixed (composited) by the server according to a certain layout, so that the image content of every live stream is presented simultaneously within the layout space defined by a canvas. In this case, the server usually splices the live streams according to a pre-configured layout scheme, for example a side-by-side layout or a picture-in-picture layout, and finally outputs a single video stream, thereby saving bandwidth. The disadvantage of this approach is self-evident: the broadcast control end cannot effectively control the layout of the multiple live streams in the graphical user interface, and thus loses broadcast control flexibility.
In reality, there are many application scenarios in which the broadcast control end needs exactly this ability to adjust the layout. Particularly in complex applications, for example the above video live broadcast system or video conference system, it is generally necessary to give the broadcast control end greater autonomy, so that a more reasonable interface layout can achieve a more effective information display.
Disclosure of Invention
In view of the shortcomings of the prior art, an object of the present application is to provide an interface layout control method and a corresponding interface layout control apparatus, an electronic device, and a non-volatile storage medium.
As another object of the present application, an interface layout processing method and a corresponding interface layout processing apparatus, an electronic device, and a nonvolatile storage medium are also provided.
In order to meet various purposes of the application, the following technical scheme is adopted in the application:
an interface layout control method adapted to one of the objectives of the present application includes the following steps:
displaying a plurality of playing windows with adjustable layouts in a canvas provided by a graphical user interface of a live network playing control end, wherein each playing window correspondingly presents a path of dynamic images;
uploading the layout information of each playing window relative to the canvas, together with the corresponding dynamic images, to a server in real time, so as to control the server to mix the received dynamic images into a single-path video stream according to the layout information and push the single-path video stream to the viewer end;
and, in response to a repositioning operation acting on any playing window, changing the layout information that the playing window uploads to the server into the repositioned layout information, so as to control the server, when mixing, to adjust the layout of the playing window in the canvas according to the repositioned layout information.
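The three steps above can be sketched as a small broadcast-control-end model. This is an illustrative Python sketch only; the class, field, and function names (`PlayWindow`, `layout_info`, the payload shape) are assumptions, not part of the claimed implementation.

```python
# Minimal sketch of the broadcast-control-end flow described above.
# All names and the payload shape are illustrative assumptions.

class PlayWindow:
    def __init__(self, window_id, x, y, w, h, layer):
        self.window_id = window_id
        self.x, self.y = x, y          # top-left corner, canvas coordinates
        self.w, self.h = w, h          # horizontal / vertical size
        self.layer = layer             # stacking level along the Z axis

    def layout_info(self):
        """Layout information uploaded to the server alongside the stream."""
        return {"id": self.window_id, "x": self.x, "y": self.y,
                "w": self.w, "h": self.h, "layer": self.layer}

    def reposition(self, x, y):
        """Respond to a repositioning operation: update the layout in place,
        so the next real-time upload carries the repositioned layout."""
        self.x, self.y = x, y

# A canvas holding two adjustable playing windows (camera + screen capture):
windows = [PlayWindow("camera", 0, 0, 1280, 720, layer=0),
           PlayWindow("screen", 960, 0, 320, 180, layer=1)]

payload = [w.layout_info() for w in windows]   # sent to the server in real time
```

Because the layout information is only a handful of integers per window, re-sending it on every change adds negligible load next to the video data itself.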
In a preferred embodiment, said process of responding to the relocation operation applied to any of said playing windows comprises the following steps:
responding to a mouse pressing event acting on the playing window, and highlighting the playing window;
responding to a mouse drag and drop event acting on the playing window, and enabling the playing window to move in the canvas range along with a mouse;
and responding to a mouse release event acting on the playing window, stopping the movement of the playing window, and enabling the playing window to obtain the repositioned layout information, so that the layout information uploaded to the server by the playing window is changed into the repositioned layout information.
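The three mouse-event steps above can be sketched as a small controller. The class and method names (`DragController`, `on_mouse_down`, and so on) are hypothetical, as is the clamping of the window to the canvas range; the application describes the events, not a concrete API.

```python
# Illustrative sketch of the press / drag / release sequence above.
# Names and the clamping behaviour are assumptions for the example.

class DragController:
    def __init__(self, window, canvas_w, canvas_h):
        self.window = window                      # dict with x, y, w, h
        self.canvas_w, self.canvas_h = canvas_w, canvas_h
        self.dragging = False
        self.highlighted = False

    def on_mouse_down(self, mx, my):
        self.highlighted = True                   # step 1: highlight the window
        self.dragging = True
        self.grab_dx = mx - self.window["x"]      # grip offset inside window
        self.grab_dy = my - self.window["y"]

    def on_mouse_move(self, mx, my):
        if not self.dragging:
            return
        # step 2: the window follows the mouse, kept within the canvas range
        self.window["x"] = max(0, min(mx - self.grab_dx,
                                      self.canvas_w - self.window["w"]))
        self.window["y"] = max(0, min(my - self.grab_dy,
                                      self.canvas_h - self.window["h"]))

    def on_mouse_up(self):
        # step 3: stop moving; the now-current layout is the repositioned
        # layout information to be uploaded to the server
        self.dragging = False
        self.highlighted = False
        return dict(self.window)
```

A host UI toolkit would call these handlers from its own mouse events; only the returned repositioned layout needs to reach the upload path.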
In some embodiments, the method comprises the following steps:
according to a default setting, or in response to a user setting instruction, placing the playing window corresponding to that setting at the bottom layer of the other playing windows and displaying it maximized in the canvas, while the other playing windows are arranged linearly, on an upper layer, at one side of the maximized playing window.
In some embodiments, the method comprises the following steps:
and responding to an image exchange instruction acting between the two playing windows, so that the two playing windows exchange and play the original dynamic images of each other, and the corresponding relation between the playing windows and the dynamic images is updated, so that the server conforms to the updated corresponding relation when implementing the mixed drawing.
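The image-exchange step above swaps which dynamic-image stream each of the two windows presents while their layouts stay put; the updated correspondence is what the server must then follow when mixing. A minimal sketch, with an assumed mapping of window id to stream id:

```python
# Sketch of the image-exchange instruction: two playing windows keep their
# layout but swap the dynamic-image streams they present. The mapping
# structure is an illustrative assumption.

def swap_streams(correspondence, win_a, win_b):
    """correspondence: dict mapping window id -> stream id.
    After the swap, the updated mapping is re-sent to the server so that
    its mixing conforms to the new window/stream correspondence."""
    correspondence[win_a], correspondence[win_b] = (
        correspondence[win_b], correspondence[win_a])
    return correspondence
```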
In a more detailed embodiment, the layout information includes position information of a corresponding playing window relative to the canvas and layout space hierarchy information of other playing windows, and the position information includes positioning information of the playing window relative to the canvas and size information of the playing window.
In a further embodiment, the method comprises the following pre-steps:
and synchronously acquiring a video stream generated by the camera equipment of the local machine and a screenshot image stream of a graphical user interface of the local machine, and respectively playing the video stream and the screenshot image stream as dynamic images of different playing windows.
An interface layout processing method adapted to one of the objectives of the present application includes the following steps:
receiving multiple paths of dynamic images uploaded by a broadcast control end of live webcasting and layout information of each playing window corresponding to each path of dynamic images in the same canvas in real time;
mixing and drawing the received dynamic images into a single-path video stream according to the layout information so as to push the single-path video stream to a viewer;
when the received layout information of any playing window is changed, the layout of the playing window in the canvas is adjusted according to the changed layout information to perform the mixed drawing.
In a preferred embodiment, when the received dynamic images are mixed according to the layout information, each frame of the output video stream is organized over the canvas range; wherever the frames of several playing windows overlap, the image of the playing window on the upper layer of the layout space replaces, for the overlapped part, the images of all playing windows on lower layers, so that the images of all parts are spliced into one composite image.
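The per-frame mixing rule above can be sketched as a painter's-algorithm loop: windows are pasted onto the canvas in ascending layer order, so wherever windows overlap, the upper window's pixels replace those of every window below it. The sketch uses single-character labels in place of real video pixels; a real server would blit decoded frames instead.

```python
# Minimal sketch of the mix-draw rule: paste windows bottom layer first,
# so upper layers overwrite the overlapped region. Pixel values are
# stand-in labels, not real video data.

def composite(canvas_w, canvas_h, windows):
    """windows: list of dicts with x, y, w, h, layer, and a 'label'."""
    canvas = [["." for _ in range(canvas_w)] for _ in range(canvas_h)]
    for win in sorted(windows, key=lambda w: w["layer"]):   # bottom first
        for row in range(win["y"], min(win["y"] + win["h"], canvas_h)):
            for col in range(win["x"], min(win["x"] + win["w"], canvas_w)):
                canvas[row][col] = win["label"]
    return canvas

# A full-canvas window (layer 0) with a small top-right window (layer 1),
# mirroring the layout of fig. 2:
frame = composite(8, 4, [
    {"x": 0, "y": 0, "w": 8, "h": 4, "layer": 0, "label": "A"},
    {"x": 5, "y": 0, "w": 3, "h": 2, "layer": 1, "label": "B"},
])
```

The overlapped top-right region holds only "B" pixels, while the rest of the canvas remains "A", matching the replace-on-overlap behaviour described above.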
An interface layout control apparatus adapted to one of the objects of the present application includes:
the window display module is used for displaying a plurality of playing windows with adjustable layouts in a canvas provided by a graphical user interface of a live webcast playing control end, and each playing window correspondingly presents a path of dynamic images;
the real-time uploading module is used for uploading the layout information of each playing window relative to the canvas and each corresponding path of dynamic image to the server in real time so as to control the server to blend the received dynamic images into a single-path video stream according to the layout information and push the single-path video stream to a viewer end;
and the positioning adjustment module is used for responding to the repositioning operation acted on any playing window, changing the layout information uploaded to the server by the playing window into the repositioned layout information, and controlling the server to adjust the layout of the playing window in the canvas according to the repositioned layout information when mixed drawing is carried out.
An interface layout processing apparatus adapted to one of the objects of the present application includes:
the image receiving module is used for receiving a plurality of paths of dynamic images uploaded by a broadcast control end of live webcasting and layout information formed by all playing windows corresponding to all paths of dynamic images in the same canvas in real time;
the image mixing and drawing module is used for mixing and drawing the received dynamic images into a single-path video stream according to the layout information so as to push the single-path video stream to a viewer;
and the adjusting monitoring module is configured to adjust the layout of the playing window in the canvas according to the changed layout information to perform the mixed drawing when the received layout information of any playing window is changed.
An electronic device adapted to one of the objectives of the present application includes a central processing unit and a memory, wherein the central processing unit is used for invoking and running a computer program stored in the memory to execute the steps of the interface layout control/processing method described in the present application.
A non-volatile storage medium, provided for the purpose of the present application, stores a computer program implemented according to the interface layout control/processing method in the form of computer readable instructions, and when the computer program is called by a computer, executes the steps included in the method.
Compared with the prior art, the application has the following advantages:
Firstly, by opening up the authority to adjust the layout of the playing window corresponding to each path of dynamic images, the broadcast control end is allowed to perform repositioning operations on each playing window; meanwhile, the broadcast control end uploads the layout information of each playing window, together with each corresponding path of dynamic images, to the server in real time for mixing. The broadcast control end can therefore adjust the on-screen layout of each dynamic image at any time, after which the server mixes the dynamic images into a single video stream and sends it synchronously to the viewer end, solving the prior-art problem that the layout relation of multiple live streams cannot be adjusted by the broadcast control end.
Secondly, the mixing fully adopts the dynamic layout information of each playing window at the broadcast control end, without relying on the server to invoke a preset layout scheme. Under the unified effect of this layout information, the interface layout at the broadcast control end is consistent with the layout formed after the server's mixing, so the broadcast control end obtains a "what you see is what you get" effect when adjusting the interface layout of any playing window: it knows the playback effect of the single video stream as seen at the viewer end, and can make timely interface adjustments according to circumstances.
In addition, the layout adjustment function is realized at the broadcast control end and the mixing function at the corresponding server. Apart from transmitting the dynamic images, only the layout information of each playing window is added, and layout information is lightweight data; the dynamic images corresponding to each playing window can therefore be transmitted to the server synchronously in real time without consuming a large amount of computing power or network resources, ensuring the interaction efficiency between the broadcast control end and the server.
Drawings
The foregoing and/or additional aspects and advantages of the present application will become apparent and readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings of which:
fig. 1 is a schematic diagram of a typical network deployment architecture related to implementing the technical solution of the present application.
Fig. 2 is a schematic layout diagram of a graphical user interface obtained by implementing the interface layout control method of the present application.
Fig. 3 is a schematic diagram of a coordinate system of a canvas according to the present application.
Fig. 4 is a flowchart illustrating an exemplary embodiment of an interface layout control method according to the present application.
Fig. 5 is a schematic flowchart illustrating a specific step included in step S13 in fig. 4.
Fig. 6 is a flowchart illustrating an exemplary embodiment of an interface layout processing method according to the present application.
Fig. 7 is a schematic block diagram of an exemplary embodiment of an interface layout control apparatus according to the present application.
Fig. 8 is a schematic block diagram of an exemplary embodiment of an interface layout processing apparatus according to the present application.
Detailed Description
Reference will now be made in detail to embodiments of the present application, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to the same or similar elements or elements having the same or similar function throughout. The embodiments described below with reference to the drawings are exemplary only for the purpose of explaining the present application and are not to be construed as limiting the present application.
As used herein, the singular forms "a", "an", and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms "comprises" and/or "comprising," when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof. It will be understood that when an element is referred to as being "connected" or "coupled" to another element, it can be directly connected or coupled to the other element, or intervening elements may also be present. Further, "connected" or "coupled" as used herein may include wirelessly connected or wirelessly coupled. As used herein, the term "and/or" includes all or any element and all combinations of one or more of the associated listed items.
It will be understood by those within the art that, unless otherwise defined, all terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs. It will be further understood that terms, such as those defined in commonly used dictionaries, should be interpreted as having a meaning that is consistent with their meaning in the context of the prior art and will not be interpreted in an idealized or overly formal sense unless expressly so defined herein.
As will be appreciated by those skilled in the art, "client," "terminal," and "terminal device" as used herein include both devices that are wireless signal receivers, which are devices having only wireless signal receivers without transmit capability, and devices that are receive and transmit hardware, which have receive and transmit hardware capable of two-way communication over a two-way communication link. Such a device may include: cellular or other communication devices such as personal computers, tablets, etc. having single or multi-line displays or cellular or other communication devices without multi-line displays; PCS (Personal Communications Service), which may combine voice, data processing, facsimile and/or data communication capabilities; a PDA (Personal Digital Assistant), which may include a radio frequency receiver, a pager, internet/intranet access, a web browser, a notepad, a calendar and/or a GPS (Global Positioning System) receiver; a conventional laptop and/or palmtop computer or other device having and/or including a radio frequency receiver. As used herein, a "client," "terminal device" can be portable, transportable, installed in a vehicle (aeronautical, maritime, and/or land-based), or situated and/or configured to operate locally and/or in a distributed fashion at any other location(s) on earth and/or in space. The "client", "terminal Device" used herein may also be a communication terminal, a web terminal, a music/video playing terminal, such as a PDA, an MID (Mobile Internet Device) and/or a Mobile phone with music/video playing function, and may also be a smart tv, a set-top box, and the like.
The hardware referred to by the names "server", "client", "service node", etc. is essentially an electronic device with the performance of a personal computer, and is a hardware device having necessary components disclosed by the von neumann principle such as a central processing unit (including an arithmetic unit and a controller), a memory, an input device, an output device, etc., a computer program is stored in the memory, and the central processing unit calls a program stored in an external memory into the internal memory to run, executes instructions in the program, and interacts with the input and output devices, thereby completing a specific function.
It should be noted that the concept of "server" as referred to in this application can be extended to the case of a server cluster. According to the network deployment principle understood by those skilled in the art, the servers should be logically divided, and in physical space, the servers may be independent from each other but can be called through an interface, or may be integrated into one physical computer or a set of computer clusters. Those skilled in the art will appreciate this variation and should not be so limited as to restrict the implementation of the network deployment of the present application.
Referring to fig. 1, the hardware basis required for implementing the related art embodiments of the present application may be deployed according to the architecture shown in the figure. The server 80 is deployed at the cloud end, and mainly provides a network (video) live broadcast operation support service, and may be responsible for further connecting a related broadcasting server, a streaming media server, and other servers providing related support, so as to form a logically related server cluster to provide services for related terminal devices, such as a smart phone 81 and a personal computer 82 shown in the figure. Both the smart phone and the personal computer can access the internet through a known network access method, and establish a data communication link with the cloud server 80 so as to run application programs related to the service provided by the server, including application programs embedded in web pages.
It should be noted that in these server clusters, in some scenarios, the server supporting live network video operation, the streaming media server, and the server supporting broadcast service are merged into the same server or the same network address, and sometimes the same application server may be used to establish the correlation of the whole cluster, so that the server ultimately responsible for service can be pointed to by using the same network address. In this regard, those skilled in the art will appreciate.
In order to support the operation of the application programs, the terminal device is equipped with a relevant operating system, such as iOS, HMS (Hongmeng, i.e., HarmonyOS), Android, or other operating systems providing equivalent functions; with the support of such an operating system, an adaptively developed application program can operate normally, realizing human-computer interaction and remote interaction.
The interface layout processing method is suitable for being programmed and built in an application program providing network live broadcast operation service, and is used as a basic service function to operate in the server. The interface layout control method is suitable for a broadcast control end, is programmed and built in a webpage embedded or independent application program, and is installed and operated in the terminal equipment, so that the human-computer interaction is conveniently realized.
The network live broadcast, or network video live broadcast, refers to a video live broadcast network service implemented based on the network deployment architecture, and can be presented in different forms such as a video conference system, a network chat room system, an education training system, and the like, so as to meet various application scenarios. The participants of the same network live broadcast instance generally have more than two parties, but generally have at least one broadcast control end, and the rest can be used as a spectator end which receives the information transmitted by the broadcast control end. In the live broadcasting process, the identities of the broadcasting control end and the audience end can also be allowed to be switched. Such peripheral support techniques, however, do not affect the practice of the present application.
Referring to fig. 2, in the graphical user interface layout presented when the broadcast control end electronic device runs software implementing the related methods of the present application, the central portion of the figure is the main display area of the live broadcast, which corresponds to a canvas 5. As known to those skilled in the art, in computer image processing the canvas 5 mainly serves to standardize a plane space for graphic display and editing in a graphical user interface; it does not necessarily have to be explicitly displayed or visible in the actual display, nor to be backed by a memory instance in the computer background. In theory, a developer only needs to develop according to the plane space specification of the canvas. Those skilled in the art can also define a layout space for the canvas by adding a Z axis on top of its virtual plane coordinates, so that different objects on the canvas can be described not only by their plane layout but also by their stacking order.
Following this conception of the layout space, the upper layer of the canvas 5 shown in fig. 2 includes a first playing window 51, maximized to occupy the full size of the canvas, for playing a first dynamic image; at its upper right corner, a second playing window 52 for playing a second dynamic image is displayed on a layer above the first playing window 51. In the same way, more playing windows can be displayed on upper layers of the picture as required.
In this way, it can be understood that, as shown in fig. 3, the canvas establishes a coordinate system comprising an X axis (horizontal direction), a Y axis (vertical direction), and a Z axis (interface level direction); the coordinate data of this system locate both the canvas plane position and the level at which a playing window is placed. In a specific implementation, the upper left corner of the canvas can be taken as the plane origin (0, 0), and the level values along the Z axis can be given directly as independent sequence numbers. In addition, the size of a playing window can be expressed by its horizontal and vertical sizes. The description of such layout information is flexible, and those skilled in the art can implement it in conjunction with the disclosure herein.
In a more practical embodiment, the layout information of each playing window 51, 52 formed in the canvas 5 in the present application includes the position information of the corresponding playing window 51, 52 relative to the canvas 5 and its layout space level information relative to the other playing windows, where the position information includes the positioning information of the playing window 51, 52 relative to the canvas 5 and the size information of the playing window. The positioning information can be A_n(x, y), where x and y are coordinates relative to the canvas origin, i.e., the canvas position of the upper left corner of the playing window. The size information can be W_n and H_n, representing the horizontal and vertical sizes of the playing window respectively. The layout space level information Layer_n indicates the level number at which the playing window resides. Thus, for each playing window, its layout information is composed of the following attribute data: A_n(x, y), W_n, H_n, Layer_n. It will be appreciated that, given these attribute data, the position, size, and level of the playing window within the canvas can be determined. At run time, a playing window can be described by an instance object, and the layout information of the playing window is determined by that instance object carrying the attribute data; likewise, the layout information of the playing window is obtained by reading the attribute data.
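As an illustrative sketch only (the interface and function names below are hypothetical, and TypeScript is chosen simply because the broadcast control end may run as a web application), the attribute data A_n(x, y), W_n, H_n and Layer_n described above could be carried by an instance object roughly as follows:

```typescript
// Layout attributes of one playing window relative to the canvas.
interface WindowLayout {
  x: number;      // A_n.x: horizontal offset of the upper-left corner from the canvas origin
  y: number;      // A_n.y: vertical offset of the upper-left corner from the canvas origin
  width: number;  // W_n: horizontal size of the playing window
  height: number; // H_n: vertical size of the playing window
  layer: number;  // Layer_n: level in the Z-axis stacking order
}

// A playing window described by an instance object carrying the attribute data.
interface PlayingWindow {
  name: string;         // identifier for the window
  layout: WindowLayout; // layout information held as plain attributes
}

// Obtaining the layout information is simply reading the attribute data.
function readLayout(win: PlayingWindow): WindowLayout {
  return { ...win.layout };
}
```

Given these few attributes, the position, size, and level of the window within the canvas are fully determined, matching the description above.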
In fig. 2, an image source control area 6 is shown on the left side of the main display area. The image source control area 6 lists a plurality of different types of controls 60, such as a document control, a camera control, a desktop sharing control, and a video connection control, each of which may be used to configure one of the playing windows 51, 52; through this configuration, the opening and closing of the corresponding playing window 51, 52 may be determined, and various other common settings and controls may be applied to the corresponding playing window, which those skilled in the art may implement flexibly.
Below the main display area, a tool area 7 is provided, containing a plurality of editing controls (not shown) through which information such as characters and lines can be input within the range defined by the canvas 5. The dynamic image formed after these editing controls are operated can also be uploaded to the server for mixed drawing, as the dynamic image corresponding to a playing window with a background-transparency property that is maximally displayed in the canvas, so that viewers see the writing and editing effects synchronized with the broadcast control end.
The principles disclosed above with reference to fig. 2 and fig. 3 apply to the various embodiments of the present application and may be referred to in the descriptions of those embodiments; they are not repeated below where avoidable.
Based on the above application environment and its basic principles, various embodiments of the methods of the present application are described below. To avoid misinterpretation, those skilled in the art will know that, although the various methods of the present application are described on the basis of the same conception so as to be common to one another, they may be performed independently unless otherwise specified. Likewise, each embodiment disclosed in the present application is proposed on the basis of the same inventive conception; therefore, concepts expressed identically, and concepts whose expressions differ but have been changed only for convenience, should be understood equally.
Referring to fig. 2 and fig. 4, an interface layout control method of the present application is programmed for implementation in an electronic device serving as the broadcast control end, and can be executed as a standalone application or a web application. In an exemplary embodiment, the method includes the following steps:
step S11, displaying a plurality of playing windows with adjustable layouts in a canvas provided by a graphical user interface of a live webcast playing control end, where each playing window correspondingly presents a path of dynamic images:
in the graphical user interface shown in fig. 2, only two of the plurality of playing windows are displayed: the first playing window 51 occupies the full size of the canvas and resides at the bottom level of the layout space in the Z-axis direction of the picture, while the second playing window 52 resides at the top level, displayed entirely above the first playing window 51 at the upper right corner of the planar space defined by the canvas 5. The user at the broadcast control end can add more playing windows through the different controls provided by the image source control area 6, and can of course also close the displayed playing windows 51, 52.
When a control 60 of the image source control area 6 is invoked, it executes its configured dynamic image source, and the dynamic images are output and displayed in the corresponding playing windows 51, 52. Taking the functions of the various types of controls 60 shown in fig. 2 as an example, the first playing window 51 may be used to display a slide show played by the broadcast control end, for specific applications such as teaching or explanation; the second playing window 52 may be used to display a video image acquired by the local camera device, and other newly added playing windows behave similarly. It can be understood that the correspondence between each playing window 51, 52 and its dynamic image is also conveyed in the interaction between the broadcast control end and the server responsible for mixed drawing, so that the server lays out the various dynamic images according to the layout information of each playing window. It should be noted that, for each playing window 51, 52, the content displayed in the playing window is understood as the dynamic image in the present application, regardless of whether the specific object of the dynamic image is static text, a picture or a dynamic video. It is understood that each playing window 51, 52 corresponds to playing only one path of dynamic images.
During programming and development, the modification authority over the various layout information attributes in the instance objects of the playing windows 51, 52 is set to adjustable by default. Specifically, the present application allows the broadcast control end user, through one or more operations, to move each playing window 51, 52, to adjust the horizontal and vertical dimensions of each playing window, and even to maximize the display of any playing window such as 51; it also allows the user to exchange the dynamic images of the two playing windows 51, 52, to adjust the level of each playing window 51, 52 in the layout space, and so on. When the broadcast control end user performs any of these operations, the layout information of the corresponding playing window changes.
Step S12, uploading layout information of each playing window relative to the canvas and corresponding dynamic images to a server in real time, so as to control the server to blend the received dynamic images into a single-channel video stream according to the layout information, and push the single-channel video stream to a viewer:
A background service runs at the broadcast control end and can run concurrently with the other steps of this method. Through this background service, the various dynamic images and the layout information of the playing windows at the broadcast control end are uploaded in real time to the server of this method. The background service strives to upload the data in real time by shortening the communication delay as much as possible. Those skilled in the art should therefore understand that the term "real time" in the present application refers to uploading the various dynamic images and layout information generated by the broadcast control end to the server as quickly as possible within a range that allows a reasonable communication time difference; even if the data obtained by the server lags the corresponding data generated by the broadcast control end by a certain time, such a time difference is understood not to exceed what the concept of real time defines.
At the broadcast control end, on the one hand, the layout information of each playing window can be provided by reading, from the instance object of each playing window, the attribute data used to describe its layout information; on the other hand, the dynamic image corresponding to each playing window can be determined from the correspondence between the playing window and the dynamic image. The background service thus continuously and synchronously uploads the layout information of each playing window at the broadcast control end and the dynamic image corresponding to each playing window to the server. The mapping between a playing window's dynamic image and its layout information may be provided to the server through pre-association, for example by associating both with the same playing window object name, from which the server can recover the association; the implementation is not limited to this, and those skilled in the art may realize the mapping in many ways.
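Assuming, as one possible pre-association scheme, that the layout information and the image stream are linked through a shared playing window name, the payload assembled by the background service for each upload cycle might look like this sketch (all names hypothetical):

```typescript
interface WindowLayout { x: number; y: number; width: number; height: number; layer: number; }

interface PlayingWindow {
  name: string;         // shared key linking the layout and the image stream
  layout: WindowLayout; // attribute data read from the window's instance object
  streamId: string;     // handle of the dynamic image this window plays
}

// Build one upload cycle's payload: for each window, its layout and its
// stream handle travel together under the same window name, so the server
// can recover the image-to-layout mapping from the shared key.
function buildUploadPayload(windows: PlayingWindow[]) {
  return windows.map(w => ({
    window: w.name,
    layout: { ...w.layout },
    stream: w.streamId,
  }));
}
```

The actual transport (and whether stream data travels in-band or via a separate media channel) is left open here, as the application itself leaves it to the implementer.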
As for the size of the canvas used by the broadcast control end, the server may know it in advance because both belong to the same project and the same system; alternatively, the server may provide a new virtual canvas by itself and vary it according to the layout information from the broadcast control end, so that re-layout is carried out in the virtual canvas it provides. In addition, the broadcast control end may upload its canvas setting information to the server, so that the server implements the mixed-drawing layout according to that canvas setting information. All of these options may be readily implemented by those skilled in the art in light of the teachings herein.
After the information related to each playing window of the broadcast control end, including the layout information and its dynamic image, is uploaded to the server, the server can mix and draw the received dynamic images corresponding to the playing windows according to the layout information of each playing window, finally forming a single-channel video stream, and distribute it to a content delivery network (CDN) so that it is ultimately pushed to the viewer ends accessing the video stream, thereby realizing network video live broadcast from the broadcast control end to the viewer end. For the process by which the server performs the mixed-drawing output according to the various information uploaded by the broadcast control end, please refer to the disclosure of the various embodiments of the interface layout processing method of the present application, which is not expanded here.
It can be seen that the playing control end uploads the layout information and the dynamic images of each playing window to the server, which is also equivalent to driving and controlling the server to implement blending to generate a video stream pushed to the audience, and if the playing control end stops uploading, the server cannot generate the corresponding video stream.
Step S13, in response to the repositioning operation applied to any of the playing windows, changing the layout information uploaded to the server by the playing window to the repositioned layout information, so as to control the server to adjust the layout of the playing window in the canvas according to the repositioned layout information when blending:
in the process of network video live broadcast at the broadcast control end, various requirements sometimes arise that make it necessary to adjust individual or multiple playing windows, re-laying out the relevant playing windows in the layout space of the canvas. It can be understood that, since the broadcast control end uploads the relevant layout information and dynamic images to the server in real time through the background service, any layout change on the canvas of the broadcast control end is immediately uploaded and reflected to the server. The layout control applied at the broadcast control end therefore actually controls the mixed-drawing process of the server: the layout information the server relies on during mixed drawing changes, and the layout effect of the corresponding playing window in the dynamic image obtained after mixed drawing changes accordingly.
The broadcast control end can realize the repositioning operation of a playing window in various ways. Because a repositioning operation causes the attribute data related to the layout information in the playing window's instance object to change, the layout information of the corresponding playing window naturally changes each time: the original attribute data related to the layout information in the instance object is replaced by the new data of the repositioned layout, the background service reads the new data according to its inherent logic and uploads it to the server, and the repositioned layout information is thereby naturally provided to the server.
While the above exemplary embodiments disclose embodiments of the present application in sufficient detail, the possible embodiments of the present application are not limited thereto, and other embodiments of the present application will be described below with reference to various cases based on the foregoing basis:
in one embodiment, the broadcast control end can move a playing window with the mouse to realize the repositioning of that playing window, responding to the user's repositioning operation accordingly. Referring to fig. 5, step S13 in the exemplary embodiment of the present application may be implemented as the following specific steps:
step S131, responding to the mouse pressing event acting on the playing window, highlighting the playing window:
when a user at the broadcast control end needs to adjust the layout of a playing window, and in particular to move its position in the canvas, the user can apply the operation to the playing window with the mouse. When the mouse button is pressed, a mouse-down event is triggered; at this moment, in order to clearly show the operated object, the playing window can be highlighted by executing the relevant program code when responding to the mouse-down event.
Step S132, responding to the mouse drag and drop event acting on the playing window, and enabling the playing window to move in the canvas range along with the mouse:
when the user at the broadcast control end starts to move the mouse after pressing its button, a mouse drag-and-drop event is triggered. This step responds by executing the program code in the response method of the mouse drag-and-drop event, reading the coordinates of the mouse in the graphical user interface and controlling the playing window to follow the mouse as those coordinates change.
Step S133, responding to the mouse release event acting on the play window, stopping the movement of the play window, and making the play window obtain the repositioned layout information, thereby changing the layout information uploaded to the server by the play window to the repositioned layout information:
when the user at the broadcast control end has selected the repositioned location of the playing window in the canvas, the mouse button can be released, triggering a mouse-release event. Similarly, this step responds by executing the program code in the response method of the mouse-release event, fixing the playing window at the corresponding canvas position, so that in the instance object of the playing window the attributes corresponding to its layout information are assigned the new repositioned data, whereupon the playing window obtains the repositioned layout information. As described above, the background service subsequently uploads the repositioned layout information to the server, achieving the effect of changing the layout of the playing window in the canvas.
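The mouse-release handling of steps S131 to S133 ultimately reduces to reassigning the window's position attributes from the final mouse coordinates. A minimal sketch of that relocation logic, under the assumption that the window is kept within the planar space of the canvas (names hypothetical):

```typescript
interface WindowLayout { x: number; y: number; width: number; height: number; layer: number; }
interface CanvasSize { width: number; height: number; }

// Apply a drag displacement (dx, dy) to a window's layout, clamping the
// result so the window stays entirely inside the canvas. Returns the
// repositioned layout, with which the instance object's attributes would
// be reassigned on mouse release.
function relocate(layout: WindowLayout, dx: number, dy: number, canvas: CanvasSize): WindowLayout {
  const clamp = (v: number, lo: number, hi: number) => Math.min(Math.max(v, lo), hi);
  return {
    ...layout,
    x: clamp(layout.x + dx, 0, canvas.width - layout.width),
    y: clamp(layout.y + dy, 0, canvas.height - layout.height),
  };
}
```

Whether a window may be dragged partly off-canvas is a design choice the application leaves open; the clamping here is one reasonable option.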
In another embodiment, a preceding step is added to the present application: according to default settings, one playing window corresponding to the settings is placed at the bottom level beneath the other playing windows and maximally displayed in the canvas, while the other playing windows are linearly arranged along one side of the layer above the maximized playing window. The default settings may be pre-stored in a local configuration file. This step may be performed during initialization of the user interface shown in fig. 2. In this way the layout of multiple playing windows across the entire picture can be quickly standardized, sparing the user troublesome operations.
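A sketch of such a default arrangement, assuming the remaining windows form a vertical strip of 16:9 thumbnails one quarter of the canvas wide down the right-hand side (these sizing choices are illustrative, not taken from the application):

```typescript
interface WindowLayout { x: number; y: number; width: number; height: number; layer: number; }
interface CanvasSize { width: number; height: number; }

// Place the first-named window maximized at the bottom level, and arrange
// the rest linearly down the right side of the layer above it.
function arrangeDefault(names: string[], canvas: CanvasSize): Map<string, WindowLayout> {
  const out = new Map<string, WindowLayout>();
  const [main, ...rest] = names;
  // Bottom-level window fills the whole canvas.
  out.set(main, { x: 0, y: 0, width: canvas.width, height: canvas.height, layer: 0 });
  const w = Math.round(canvas.width / 4); // assumed strip width
  const h = Math.round((w * 9) / 16);     // assumed 16:9 thumbnails
  rest.forEach((name, i) => {
    out.set(name, { x: canvas.width - w, y: i * h, width: w, height: h, layer: 1 });
  });
  return out;
}
```

Running this during interface initialization would give every session the same starting layout, which is the point of the default-settings step.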
In another embodiment similar to the previous one, a step is added in which, in response to a user setting instruction, the playing window corresponding to the setting is placed at the bottom level beneath the other playing windows and maximally displayed in the canvas, while the other playing windows are linearly arranged along one side of the layer above the maximized playing window. The user setting instruction can be triggered by an event in which the broadcast control end user double-clicks a playing window, so that in response to that event the double-clicked playing window is set as the bottom-level maximized playing window filling the canvas.
In a further embodiment, the method further comprises the following step: in response to an image exchange instruction acting between two playing windows, the two playing windows exchange and play each other's original dynamic images, and the correspondence between the playing windows and the dynamic images is updated, so that the server follows the updated correspondence when implementing the mixed drawing.
For example, a double-click event applied by the broadcast control end user to an upper playing window can be understood as an image exchange instruction: the double-clicked playing window is re-associated to output and display the dynamic image originally displayed in the bottom playing window, and the bottom playing window is used to output and display the dynamic image originally displayed in the double-clicked playing window, so that the dynamic images are exchanged with each other.
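Since only the window-to-image correspondence changes while the layouts stay put, the exchange can be sketched as a swap of stream associations (names hypothetical):

```typescript
// Exchange the dynamic images of two playing windows by swapping their
// stream associations. The windows' layout information is untouched, so a
// server following the window-to-image correspondence will mix the
// exchanged images into the unchanged window positions.
function exchangeImages(streams: Map<string, string>, a: string, b: string): void {
  const sa = streams.get(a);
  const sb = streams.get(b);
  if (sa === undefined || sb === undefined) return; // unknown window: no-op
  streams.set(a, sb);
  streams.set(b, sa);
}
```

After such a swap, the updated correspondence uploaded to the server is what makes the mixed-drawn output reflect the exchange.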
It can be understood that, after this step is performed, the correspondence between the layout information uploaded to the server and the dynamic images changes relative to before. The server still performs the mixed drawing according to its inherent logic; as the correspondence is updated, the correspondence between the playing windows and the dynamic images in the mixed-drawn image changes accordingly.
To suit screen-sharing and presenter application scenarios, in another embodiment the method of the present application may further include the following preceding step: synchronously acquiring a video stream generated by the local camera device and a screenshot image stream of the local graphical user interface, and playing them as the dynamic images of two different playing windows. With this embodiment, the video of the presenter at the broadcast control end can be displayed synchronously in one playing window as a dynamic image while the screenshot image of the presenter's screen is displayed in real time in another playing window as a dynamic image, forming the effect of synchronously playing the two paths of dynamic images. The video stream obtained after mixed drawing by the server therefore exhibits the same synchronization as the broadcast control end when played, i.e., the temporal correspondence between the audio-video content generated by the presenter and the changes in the screenshot image is consistent both at the broadcast control end and at the viewer end.
It should be noted that the technical means adopted in the above embodiments can be combined with each other to form a richer embodiment, as long as there is no contradiction between the technical means of the embodiments, and those skilled in the art can flexibly implement the embodiments in combination with the disclosure herein.
Referring to fig. 6, an interface layout processing method of the present application is programmed and implemented in an electronic device serving as a server, and opens a corresponding service to a broadcast control end described in the present application through the operation of an application program, and in an exemplary embodiment of the method, the method includes the following steps:
step S21, receiving multiple paths of dynamic images uploaded by the broadcast control end of live webcasting and layout information formed in the same canvas by each of the playing windows corresponding to the multiple paths of dynamic images in real time:
as mentioned above, the multiple paths of dynamic images and the layout information uploaded in real time by the broadcast control end are received by the server for mixed drawing. This information mainly comprises, in units of playing windows, the layout information of each playing window in the canvas and the stream data of the dynamic image that the playing window is responsible for outputting. The one-to-one correspondence between the dynamic images and the layout information is established through the playing windows, so the server can recognize the correspondence between them.
In addition, adapting to the various embodiments described earlier, the server can build the canvas by itself or receive the canvas data uploaded by the broadcast control end, without affecting the implementation of the present application.
Step S22, mixing and drawing the received dynamic images into a single-channel video stream according to the layout information to be pushed to the viewer:
for the multiple paths of dynamic images and the plurality of layout information acquired from the broadcast control end, the server implements the mixed-drawing processing on the basis of the canvas according to their correspondence with the respective playing windows.
When the received dynamic images are mixed and drawn according to the layout information, the basic principle is to organize each frame of the video stream according to the canvas range, so that each frame simultaneously displays the dynamic image corresponding to each playing window, with the position of each playing window relative to the canvas and its level relative to the other playing windows arranged according to the layout information corresponding to that playing window, e.g., computed and set according to the attribute data A_n(x, y), W_n, H_n, Layer_n provided by the broadcast control end. In this process, when the images of several playing windows overlap within a frame, the image of the playing window at the upper level of the layout space replaces, for the overlapped part, the images of all playing windows at lower levels, and the images of all parts are stitched into the same frame. In particular, if the frame rates of the paths of dynamic images differ, the server may unify them to the same frame rate; taking 24 frames per second as an example of a video standard, the server may insert repeated image frames for a dynamic image with too few frames and delete intermediate image frames for a dynamic image with too many frames, thereby unifying the frame rates and avoiding image interference caused by differing frame rates.
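The stacking rule described above, where for an overlapped region the upper-level window's image replaces those of all lower-level windows, can be sketched as a per-pixel ownership test (a deliberate simplification for illustration; a real mixer would blit whole rectangles rather than test pixels one by one):

```typescript
interface WindowLayout { x: number; y: number; width: number; height: number; layer: number; }

// For a given pixel of the output frame, the window with the highest layer
// whose rectangle covers that pixel supplies the pixel; lower windows show
// only where no higher window overlaps. Returns the owning window's name,
// or null if no window covers the pixel.
function ownerAt(
  windows: { name: string; layout: WindowLayout }[],
  px: number,
  py: number,
): string | null {
  let owner: string | null = null;
  let best = -Infinity;
  for (const w of windows) {
    const { x, y, width, height, layer } = w.layout;
    const inside = px >= x && px < x + width && py >= y && py < y + height;
    if (inside && layer > best) {
      best = layer;
      owner = w.name;
    }
  }
  return owner;
}
```

Applying this rule over the whole canvas for every frame yields the stitched single-path image the server then encodes into the output video stream.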
The multiple paths of dynamic images processed by the server's mixed drawing form a unified single path of dynamic images, generating a corresponding single-channel video stream, so that this video stream can be stored in the content delivery network and pushed to viewer ends for access after being pulled by them. It can be understood that, in this video stream, multiple playing windows still appear in the image with the same layout relationships as shown by the graphical user interface of the broadcast control end, but these playing windows are no longer adjustable by the viewer.
Step S23, when the layout information of any playing window is changed, adjusting the layout of the playing window in the canvas according to the changed layout information to perform the drawing blending:
it can be understood that the server keeps running according to steps S21 and S22; therefore, when the layout information of any playing window in the subsequently received layout information changes as a result of an operation by the broadcast control end user, the layout of that playing window in the picture is naturally adjusted according to the changed layout information, and the mixed drawing of step S22 is performed.
Further, an interface layout control apparatus of the present application can be constructed by functionalizing the steps in the methods disclosed in the above embodiments, and according to this idea, please refer to fig. 7, wherein in an exemplary embodiment, the apparatus includes:
the window display module 11 is configured to display a plurality of playing windows with adjustable layouts in a canvas provided by a graphical user interface of a live webcast playing control end, where each playing window correspondingly presents a path of dynamic images;
the real-time uploading module 12 is configured to upload layout information of each playing window relative to the canvas and corresponding dynamic images to a server in real time, so as to control the server to blend the received dynamic images into a single-channel video stream according to the layout information and push the single-channel video stream to a viewer;
and the positioning adjustment module 13 is configured to change, in response to a relocation operation performed on any of the playing windows, the layout information uploaded to the server by the playing window to the relocated layout information, so as to control the server to adjust the layout of the playing window in the canvas according to the relocated layout information when performing blending.
Further, an interface layout processing apparatus of the present application can be constructed by functionalizing the steps in the methods disclosed in the above embodiments, and according to this idea, please refer to fig. 8, wherein in an exemplary embodiment, the apparatus includes:
the image receiving module 21 is configured to receive, in real time, multiple paths of dynamic images uploaded by a broadcast control end of live webcasting and layout information formed in the same canvas by each play window corresponding to each path of dynamic images;
the image mixing and drawing module 22 is used for mixing and drawing the received dynamic images into a single-channel video stream according to the layout information so as to push the single-channel video stream to the audience;
and the adjusting monitoring module 23 is configured to, when the received layout information of any playing window in the canvas changes, adjust the layout of the playing window in the canvas according to the changed layout information to perform the mixed drawing.
Further, to facilitate the implementation of the present application, the present application provides an electronic device, which includes a central processing unit and a memory, where the central processing unit is configured to invoke and run a computer program stored in the memory to perform the steps of the interface layout control/processing method in the foregoing embodiments.
It can be seen that the memory is suitably a nonvolatile storage medium: the aforementioned methods are implemented as computer programs installed in an electronic device such as a mobile phone or computer, the relevant program code and data are stored in the nonvolatile storage medium of the electronic device, and the program is called from the nonvolatile storage medium into the memory and executed by the central processing unit of the electronic device, thereby achieving the intended purpose of the present application. It is therefore understood that, in an embodiment of the present application, a nonvolatile storage medium may also be provided, in which a computer program implemented according to the various embodiments of the interface layout control/processing method is stored; when called by a computer, the computer program executes the steps included in the method.
In summary, the present application uses efficient technical means to enable the broadcast control end of a network live broadcast to dynamically reposition its playing windows, so that the broadcast control end can adjust the display layout of the dynamic images received and played at the viewer end, achieving a what-you-see-is-what-you-get effect.
As will be appreciated by one skilled in the art, the present application includes apparatus that are directed to performing one or more of the operations, methods described herein. These devices may be specially designed and manufactured for the required purposes, or they may comprise known devices in general-purpose computers. These devices have computer programs stored in their memories that are selectively activated or reconfigured. Such a computer program may be stored in a device (e.g., computer) readable medium, including, but not limited to, any type of disk including floppy disks, hard disks, optical disks, CD-ROMs, and magnetic-optical disks, ROMs (Read-Only memories), RAMs (Random Access memories), EPROMs (Erasable Programmable Read-Only memories), EEPROMs (Electrically Erasable Programmable Read-Only memories), flash memories, magnetic cards, or optical cards, or any type of media suitable for storing electronic instructions, and each coupled to a bus. That is, a readable medium includes any medium that stores or transmits information in a form readable by a device (e.g., a computer).
It will be understood by those within the art that each block of the block diagrams and/or flowchart illustrations, and combinations of blocks in the block diagrams and/or flowchart illustrations, can be implemented by computer program instructions. Those skilled in the art will appreciate that the computer program instructions may be implemented by a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, implement the aspects specified in the block or blocks of the block diagrams and/or flowchart illustrations disclosed herein.
Those of skill in the art will appreciate that the various operations, methods, steps, acts, or solutions discussed in this application can be interchanged, modified, combined, or eliminated. Further, other steps, measures, or schemes within the various operations, methods, or flows discussed in this application can likewise be alternated, altered, rearranged, decomposed, combined, or deleted, as can steps, measures, and schemes in the prior art that share the various operations, methods, and procedures disclosed in the present application.
The foregoing is only a partial embodiment of the present application, and it should be noted that, for those skilled in the art, several modifications and refinements can be made without departing from the principle of the present application, and such modifications and refinements shall also be regarded as falling within the protection scope of the present application.

Claims (12)

1. An interface layout control method, characterized by comprising the following steps:
displaying a plurality of playing windows with adjustable layouts in a canvas provided by a graphical user interface of a broadcast control end of a live webcast, wherein each playing window correspondingly presents one path of dynamic images;
uploading layout information of each playing window relative to the canvas, together with each corresponding path of dynamic images, to a server in real time, so as to control the server to mix the received dynamic images into a single-path video stream according to the layout information and push the single-path video stream to a viewer end;
and, in response to a repositioning operation acting on any playing window, changing the layout information uploaded to the server by that playing window into the repositioned layout information, so as to control the server to adjust the layout of the playing window in the canvas according to the repositioned layout information when mixing.
2. The method according to claim 1, wherein the process of responding to a repositioning operation acting on any of the playing windows comprises the following steps:
in response to a mouse-press event acting on the playing window, highlighting the playing window;
in response to a mouse-drag event acting on the playing window, moving the playing window within the canvas range along with the mouse;
and in response to a mouse-release event acting on the playing window, stopping the movement of the playing window and causing the playing window to obtain its repositioned layout information, so that the layout information uploaded to the server by the playing window is changed into the repositioned layout information.
3. The method according to claim 1, characterized by further comprising the following steps:
according to a default setting or in response to a user setting instruction, placing one correspondingly set playing window at the bottom layer beneath the other playing windows and displaying it maximized in the canvas, with the other playing windows arranged linearly on an upper layer at one side of the maximized playing window.
4. The method according to claim 1, characterized by further comprising the following steps:
in response to an image exchange instruction acting between two playing windows, causing the two playing windows to exchange and play each other's original dynamic images, and updating the correspondence between playing windows and dynamic images, so that the server follows the updated correspondence when performing the mixing.
5. The method according to any one of claims 1 to 4, characterized in that the layout information comprises position information of its corresponding playing window relative to the canvas and layout space hierarchy information relative to the other playing windows, the position information comprising positioning information of the playing window relative to the canvas and size information of the playing window.
6. The method according to any one of claims 1 to 4, characterized by further comprising the following preliminary step:
synchronously acquiring a video stream generated by the local machine's camera device and a screen-capture image stream of the local machine's graphical user interface, and playing them respectively as the dynamic images of different playing windows.
7. An interface layout processing method, characterized by comprising the following steps:
receiving, in real time, multiple paths of dynamic images uploaded by a broadcast control end of a live webcast, together with layout information of each playing window corresponding to each path of dynamic images within the same canvas;
mixing the received dynamic images into a single-path video stream according to the layout information, so as to push the single-path video stream to a viewer end;
and, when the received layout information of any playing window changes, adjusting the layout of that playing window in the canvas according to the changed layout information when performing the mixing.
8. The method according to claim 7, wherein, when the received dynamic images are mixed according to the layout information, each frame of the video stream is organized over the canvas range; and when the dynamic images of a plurality of playing windows overlap within a frame, for the overlapped portion the image of the playing window on the upper layer of the layout space replaces the images of all playing windows on the lower layers, so that the partial dynamic images are spliced into one and the same dynamic image.
9. An interface layout control apparatus, characterized by comprising:
a window display module, configured to display a plurality of playing windows with adjustable layouts in a canvas provided by a graphical user interface of a broadcast control end of a live webcast, each playing window correspondingly presenting one path of dynamic images;
a real-time uploading module, configured to upload layout information of each playing window relative to the canvas, together with each corresponding path of dynamic images, to a server in real time, so as to control the server to mix the received dynamic images into a single-path video stream according to the layout information and push the single-path video stream to a viewer end;
and a positioning adjustment module, configured to, in response to a repositioning operation acting on any playing window, change the layout information uploaded to the server by that playing window into the repositioned layout information, so as to control the server to adjust the layout of the playing window in the canvas according to the repositioned layout information when mixing.
10. An interface layout processing apparatus, characterized by comprising:
an image receiving module, configured to receive, in real time, multiple paths of dynamic images uploaded by a broadcast control end of a live webcast, together with layout information of each playing window corresponding to each path of dynamic images within the same canvas;
an image mixing module, configured to mix the received dynamic images into a single-path video stream according to the layout information, so as to push the single-path video stream to a viewer end;
and an adjustment monitoring module, configured to, when the received layout information of any playing window changes, adjust the layout of that playing window in the canvas according to the changed layout information when performing the mixing.
11. An electronic device, comprising a central processor and a memory, characterized in that the central processor is configured to invoke and execute a computer program stored in the memory to perform the steps of the method according to any one of claims 1 to 8.
12. A non-volatile storage medium, characterized in that it stores, in the form of computer-readable instructions, a computer program implementing the method according to any one of claims 1 to 8, which, when invoked by a computer, performs the steps comprised by the method.
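The press, drag, and release sequence of claim 2 can be sketched as a small state machine. This is a minimal illustration under assumed class and method names, not the patent's implementation; in particular the clamping of the window to the canvas range is an assumption consistent with "move in the canvas range along with a mouse":

```python
class PlayWindow:
    """Illustrative play window that follows claim 2's mouse events."""

    def __init__(self, x, y, w, h, canvas_w, canvas_h):
        self.x, self.y, self.w, self.h = x, y, w, h
        self.canvas_w, self.canvas_h = canvas_w, canvas_h
        self.highlighted = False
        self._drag_origin = None          # mouse offset inside the window

    def on_mouse_down(self, mx, my):
        self.highlighted = True           # highlight the pressed window
        self._drag_origin = (mx - self.x, my - self.y)

    def on_mouse_drag(self, mx, my):
        if self._drag_origin is None:
            return
        ox, oy = self._drag_origin
        # Follow the mouse, but keep the window inside the canvas range.
        self.x = max(0, min(mx - ox, self.canvas_w - self.w))
        self.y = max(0, min(my - oy, self.canvas_h - self.h))

    def on_mouse_up(self, mx, my):
        self.on_mouse_drag(mx, my)        # final position before release
        self.highlighted = False
        self._drag_origin = None
        # Repositioned layout information to upload to the server.
        return {"x": self.x, "y": self.y}
```

The dictionary returned on release stands in for the repositioned layout information whose upload replaces the window's previous layout information at the server.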
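The default arrangement of claim 3 (one maximized bottom-layer window with the others lined up on an upper layer at one side) and the image exchange of claim 4 can be illustrated together. The function names, the right-edge placement, and the thumbnail dimensions are all assumptions made for the sketch:

```python
def arrange(canvas_w, canvas_h, window_ids, maximized_id,
            thumb_w=160, thumb_h=90):
    """Maximize one window at the bottom layer; line the rest up on top."""
    layouts = {maximized_id:
               {"x": 0, "y": 0, "w": canvas_w, "h": canvas_h, "z": 0}}
    others = [wid for wid in window_ids if wid != maximized_id]
    for i, wid in enumerate(others):
        # Stack remaining windows top-to-bottom along the right edge,
        # on an upper layer (z=1) above the maximized window.
        layouts[wid] = {"x": canvas_w - thumb_w, "y": i * thumb_h,
                        "w": thumb_w, "h": thumb_h, "z": 1}
    return layouts

def exchange_images(mapping, win_a, win_b):
    """Swap which stream each of two windows presents (claim 4).

    mapping: dict of window id -> stream id; the updated mapping is what
    the server would follow when performing the mixing.
    """
    mapping = dict(mapping)
    mapping[win_a], mapping[win_b] = mapping[win_b], mapping[win_a]
    return mapping
```

Because the windows keep their layouts and only the window-to-stream correspondence changes, an exchange requires no layout re-upload, only an update of the correspondence the server consults when mixing.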
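The mixing of claims 7 and 8 amounts to a painter's-algorithm composite: each output frame is organized over the canvas range, windows are painted in ascending layout-space order, and wherever they overlap the upper layer replaces all lower layers. A simplified sketch over an integer pixel grid (the data layout is illustrative only):

```python
def composite_frame(canvas_w, canvas_h, windows):
    """Mix one frame from several play windows into a single canvas.

    windows: list of dicts with keys x, y, z_order, and frame
    (a 2D grid of pixel values, rows of equal length).
    """
    canvas = [[0] * canvas_w for _ in range(canvas_h)]   # blank canvas
    # Paint lowest z_order first, so that higher layout-space layers
    # overwrite the overlapped portion of every layer beneath them.
    for win in sorted(windows, key=lambda w: w["z_order"]):
        for row, line in enumerate(win["frame"]):
            for col, px in enumerate(line):
                y, x = win["y"] + row, win["x"] + col
                if 0 <= y < canvas_h and 0 <= x < canvas_w:
                    canvas[y][x] = px
    return canvas
```

Per-frame composition like this is what lets the server splice the partial dynamic images of all windows into one and the same dynamic image before encoding the single-path stream pushed to the viewer end.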
CN202011139741.XA 2020-10-22 2020-10-22 Interface layout control and processing method and corresponding device, equipment and medium Pending CN112261434A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011139741.XA CN112261434A (en) 2020-10-22 2020-10-22 Interface layout control and processing method and corresponding device, equipment and medium


Publications (1)

Publication Number Publication Date
CN112261434A (en) 2021-01-22

Family

ID=74264655

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011139741.XA Pending CN112261434A (en) 2020-10-22 2020-10-22 Interface layout control and processing method and corresponding device, equipment and medium

Country Status (1)

Country Link
CN (1) CN112261434A (en)


Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060290817A1 (en) * 2002-04-01 2006-12-28 Canon Kabushiki Kaisha Multi-screen synthesis apparatus, method of controlling the apparatus, and program for controlling the apparatus
CN105357542A (en) * 2015-11-20 2016-02-24 广州华多网络科技有限公司 Live broadcast method, device and system
CN106303663A (en) * 2016-09-27 2017-01-04 北京小米移动软件有限公司 Live treating method and apparatus, direct broadcast server


Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113596561A (en) * 2021-07-29 2021-11-02 北京达佳互联信息技术有限公司 Video stream playing method and device, electronic equipment and computer readable storage medium
CN113873311A (en) * 2021-09-09 2021-12-31 北京都是科技有限公司 Live broadcast control method and device and storage medium
CN113873311B (en) * 2021-09-09 2024-03-12 北京都是科技有限公司 Live broadcast control method, device and storage medium
CN114222149A (en) * 2021-11-17 2022-03-22 武汉斗鱼鱼乐网络科技有限公司 Stream pushing method, device, medium and computer equipment
CN113986446A (en) * 2021-12-23 2022-01-28 北京麟卓信息科技有限公司 Scaling optimization method for android application window in android running environment
CN114339401A (en) * 2021-12-30 2022-04-12 北京翼鸥教育科技有限公司 Video background processing method and device
WO2024099235A1 (en) * 2022-11-07 2024-05-16 北京字跳网络技术有限公司 Livestream picture processing method and apparatus, device and storage medium

Similar Documents

Publication Publication Date Title
CN112261434A (en) Interface layout control and processing method and corresponding device, equipment and medium
US10356363B2 (en) System and method for interactive video conferencing
US11120677B2 (en) Transcoding mixing and distribution system and method for a video security system
US10033967B2 (en) System and method for interactive video conferencing
CN106792092B (en) Live video stream split-mirror display control method and corresponding device thereof
US7185116B2 (en) Template-based customization of a user interface for a messaging application program
JP7157177B2 (en) Video Acquisition Method, Apparatus, Terminal and Medium
JP5361746B2 (en) Multi-division display content and method for creating the system
US20150012831A1 (en) Systems and methods for sharing graphical user interfaces between multiple computers
KR101770070B1 (en) Method and system for providing video stream of video conference
CN114710681A (en) Multi-channel live broadcast display control method and device, equipment and medium thereof
CN114422821A (en) Live broadcast home page interaction method, device, medium and equipment based on virtual gift
WO2019056001A1 (en) System and method for interactive video conferencing
CN113596571B (en) Screen sharing method, device, system, storage medium and computer equipment
CN110362375A (en) Display methods, device, equipment and the storage medium of desktop data
US20180247672A1 (en) Bundling Separate Video Files to Support a Controllable End-User Viewing Experience with Frame-Level Synchronization
US20130141308A1 (en) Electronic device and multi-panel interface displaying method
CN115794095B (en) JavaScript-based Unreal Engine UI development method and system
KR20160131830A (en) System for cloud streaming service, method of cloud streaming service of providing multi-view screen based on resize and apparatus for the same
CN114095772B (en) Virtual object display method, system and computer equipment for co-hosted live streaming
US11711408B2 (en) Content appearance conversion for remote application sharing
US20190200068A1 (en) Apparatuses, systems, and methods for adding functionalities to control buttons on a remote control device
CN103336649A (en) Feedback window image sharing method and device among terminals
CN114020375A (en) Display method and device
KR20220146801A (en) Method, computer device, and computer program for providing high-definition image of region of interest using single stream

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20210122