CN113596571B - Screen sharing method, device, system, storage medium and computer equipment - Google Patents


Info

Publication number
CN113596571B
CN113596571B (application CN202110849522.9A)
Authority
CN
China
Prior art keywords
frame image
screen frame
canvas
screen
sharing
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110849522.9A
Other languages
Chinese (zh)
Other versions
CN113596571A (en)
Inventor
闫理
方周
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangzhou Cubesili Information Technology Co Ltd
Original Assignee
Guangzhou Cubesili Information Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangzhou Cubesili Information Technology Co Ltd filed Critical Guangzhou Cubesili Information Technology Co Ltd
Priority to CN202110849522.9A priority Critical patent/CN113596571B/en
Publication of CN113596571A publication Critical patent/CN113596571A/en
Application granted granted Critical
Publication of CN113596571B publication Critical patent/CN113596571B/en

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N 21/44 Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs
    • H04N 21/4402 … involving reformatting operations of video signals for household redistribution, storage or real-time display
    • H04N 21/440263 … by altering the spatial resolution, e.g. for displaying on a connected PDA
    • H04N 7/00 Television systems
    • H04N 7/14 Systems for two-way working
    • H04N 7/15 Conference systems

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)

Abstract

Embodiments of the present application provide a screen sharing method, device, system, storage medium, and computer equipment. The method comprises the following steps: a sharing publishing terminal, in response to a triggering operation for screen sharing, acquires a screen frame image corresponding to a screen area to be shared; creates a canvas with a fixed resolution, pastes the screen frame image onto the canvas, and obtains identification information of the screen frame image on the canvas; encodes the canvas and the identification information in an encoding mode corresponding to the resolution of the canvas to obtain video encoding data to be shared; and transmits the video encoding data to a sharing receiving terminal. The sharing receiving terminal receives the video encoding data and decodes it in a decoding mode corresponding to the resolution of the canvas to obtain the canvas and the identification information; it then obtains the screen frame image from the canvas according to the identification information and displays it. The method and device can improve both the stability and the real-time performance of screen sharing.

Description

Screen sharing method, device, system, storage medium and computer equipment
Technical Field
The embodiment of the application relates to the field of screen sharing, in particular to a screen sharing method, device, system, storage medium and computer equipment.
Background
With the development of video conferencing, online education, and the like, screen sharing is increasingly widely used. During screen sharing, the shared screen area can be adjusted as needed. For example, when the shared area is a window (window sharing), the window size can be resized in real time, so the resolution of the window image also changes in real time. To share the window image, it must be encoded into video encoding data by an encoder, and the encoder's encoding resolution must match the resolution of the window image. However, because an encoder cannot modify its encoding resolution on the fly, it must be restarted repeatedly to adapt to an image whose resolution keeps changing, which results in poor stability of the screen sharing.
In some technologies, capture and encoding are suspended while the window image's resolution is changing, and the encoder is restarted to resume capture and encoding only once the resolution has stabilized; however, this approach results in poor real-time performance of screen sharing. Correspondingly, when the receiving end decodes the window image, the decoding resolution must also match the resolution of the window image, which again leads to poor stability and poor real-time performance of screen sharing.
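The restart-on-resize behavior described above can be illustrated with a minimal Python sketch. This models the conventional pipeline being criticized, not the method of this application; the class, its methods, and the frame representation are all hypothetical:

```python
class NaiveSharePublisher:
    """Sketch of the conventional approach (NOT this application's method):
    the encoder's resolution must match the window image, so every window
    resize forces a costly encoder restart."""

    def __init__(self):
        self.enc_size = None   # (w, h) the current encoder was opened with
        self.restarts = 0      # how many times the encoder had to restart

    def encode_frame(self, width: int, height: int) -> bytes:
        # A resolution change means the running encoder is unusable.
        if (width, height) != self.enc_size:
            self._restart_encoder(width, height)  # stalls the stream
        return b"frame@%dx%d" % (width, height)   # stand-in for codec output

    def _restart_encoder(self, width: int, height: int) -> None:
        # A real codec would be torn down and reopened here.
        self.enc_size = (width, height)
        self.restarts += 1
```

While the user drags a window edge, nearly every captured frame has a new resolution, so the restart count grows with every frame, which is the stability problem the fixed-resolution canvas avoids.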
Disclosure of Invention
In order to overcome the problems in the related art, the present application provides a screen sharing method, device, system, storage medium, and computer equipment that can improve the stability and real-time performance of screen sharing.
According to a first aspect of an embodiment of the present application, a screen sharing method is provided, including the following steps:
the sharing publishing terminal, in response to a triggering operation for screen sharing, acquires a screen frame image corresponding to a screen area to be shared; creates a canvas with a fixed resolution, pastes the screen frame image onto the canvas, and obtains identification information of the screen frame image on the canvas; encodes the canvas and the identification information in an encoding mode corresponding to the resolution of the canvas to obtain video encoding data to be shared; and transmits the video encoding data to at least one sharing receiving terminal; wherein the identification information includes position information of the screen frame image in the canvas; each frame of the screen frame image corresponds to one canvas; and the canvas is a blank canvas;
the sharing receiving terminal receives the video encoding data and decodes it in a decoding mode corresponding to the resolution of the canvas to obtain the canvas and the identification information; and obtains the screen frame image from the canvas according to the identification information and displays the screen frame image.
According to a second aspect of the embodiments of the present application, a screen sharing method is provided, which is applied to a sharing publishing terminal, and includes the following steps:
in response to a triggering operation for screen sharing, acquiring a screen frame image corresponding to a screen area to be shared;
creating a canvas with a fixed resolution, pasting the screen frame image onto the canvas, and obtaining identification information of the screen frame image on the canvas; wherein the identification information includes position information of the screen frame image in the canvas; each frame of the screen frame image corresponds to one canvas; and the canvas is a blank canvas;
encoding the canvas and the identification information in an encoding mode corresponding to the resolution of the canvas to obtain video encoding data to be shared;
and transmitting the video encoding data to at least one sharing receiving terminal, so that the sharing receiving terminal decodes the video encoding data in a decoding mode corresponding to the canvas resolution to obtain the screen frame image.
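As a hedged illustration of the publishing-side steps above, the following Python sketch pastes a captured frame onto a fixed-resolution blank canvas and records the identification information. The 1920×1080 canvas size, the top-left paste position, and the dictionary layout of the identification information are illustrative assumptions, not values specified by this application:

```python
import numpy as np

CANVAS_W, CANVAS_H = 1920, 1080  # fixed canvas resolution (assumed value)

def paste_on_canvas(frame: np.ndarray):
    """Paste a captured screen frame onto a blank, fixed-resolution canvas.

    Returns the canvas together with the identification information: the
    frame's position and size within the canvas, which a receiver can use
    to crop the frame back out after decoding.
    """
    h, w = frame.shape[:2]
    if w > CANVAS_W or h > CANVAS_H:
        raise ValueError("frame exceeds canvas; a real system must handle this")
    canvas = np.zeros((CANVAS_H, CANVAS_W, 3), dtype=np.uint8)  # blank canvas
    canvas[:h, :w] = frame                    # paste at top-left (assumed position)
    ident = {"x": 0, "y": 0, "w": w, "h": h}  # identification information
    return canvas, ident
```

Because every frame is emitted at the same canvas resolution regardless of the window's current size, the encoder can stay open at one resolution, which is the point of this aspect.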
According to a third aspect of the embodiments of the present application, a screen sharing method is provided, which is applied to a sharing receiving terminal, and includes the following steps:
receiving video encoding data; wherein the video encoding data includes a canvas and identification information of a screen frame image on the canvas; the canvas has a fixed resolution, and the identification information includes position information of the screen frame image in the canvas; each frame of the screen frame image corresponds to one canvas; and the canvas is a blank canvas;
decoding the video encoding data in a decoding mode corresponding to the resolution of the canvas to obtain the canvas and the identification information of the screen frame image on the canvas;
obtaining the screen frame image from the canvas according to the identification information;
and displaying the screen frame image.
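A minimal sketch of the receiving-side recovery step, assuming the same illustrative identification-information layout as on the publishing side (position and size stored as `x`, `y`, `w`, `h` — an assumption, not a format defined by this application):

```python
import numpy as np

def crop_from_canvas(canvas: np.ndarray, ident: dict) -> np.ndarray:
    """Recover the screen frame image from the decoded fixed-resolution
    canvas by cropping the region described by the identification info."""
    x, y = ident["x"], ident["y"]
    w, h = ident["w"], ident["h"]
    return canvas[y:y + h, x:x + w]
```

The decoder, like the encoder, only ever sees the fixed canvas resolution; the varying window size is handled entirely by this crop.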
According to a fourth aspect of the embodiments of the present application, a screen sharing system is provided, including a sharing publishing terminal and a sharing receiving terminal;
the sharing publishing terminal, in response to a triggering operation for screen sharing, acquires a screen frame image corresponding to a screen area to be shared; creates a canvas with a fixed resolution, pastes the screen frame image onto the canvas, and obtains identification information of the screen frame image on the canvas; encodes the canvas and the identification information in an encoding mode corresponding to the resolution of the canvas to obtain video encoding data to be shared; and transmits the video encoding data to at least one sharing receiving terminal; wherein the identification information includes position information of the screen frame image in the canvas; each frame of the screen frame image corresponds to one canvas; and the canvas is a blank canvas;
the sharing receiving terminal receives the video encoding data and decodes it in a decoding mode corresponding to the resolution of the canvas to obtain the canvas and the identification information; and obtains the screen frame image from the canvas according to the identification information and displays the screen frame image.
According to a fifth aspect of the embodiments of the present application, there is provided a screen sharing device, applied to a sharing publishing terminal, the device including:
the screen frame image acquisition module, which is used for acquiring, in response to a triggering operation for screen sharing, a screen frame image corresponding to a screen area to be shared;
the identification information acquisition module, which is used for creating a canvas with a fixed resolution, pasting the screen frame image onto the canvas, and obtaining identification information of the screen frame image on the canvas; wherein the identification information includes position information of the screen frame image in the canvas; each frame of the screen frame image corresponds to one canvas; and the canvas is a blank canvas;
the encoding data acquisition module, which is used for encoding the canvas and the identification information to obtain video encoding data to be shared;
and the data transmitting module, which is used for transmitting the video encoding data to at least one sharing receiving terminal.
According to a sixth aspect of the embodiments of the present application, there is provided a screen sharing device, applied to a sharing receiving terminal, the device including:
the data receiving module, which is used for receiving video encoding data; wherein the video encoding data includes a canvas and identification information of a screen frame image on the canvas; the canvas has a fixed resolution, and the identification information includes position information of the screen frame image in the canvas; each frame of the screen frame image corresponds to one canvas; and the canvas is a blank canvas;
the decoding module, which is used for decoding the video encoding data in a decoding mode corresponding to the resolution of the canvas to obtain the canvas and the identification information of the screen frame image on the canvas;
the screen frame acquisition module, which is used for obtaining the screen frame image from the canvas according to the identification information;
and the display module, which is used for displaying the screen frame image.
According to a seventh aspect of embodiments of the present application, there is provided a computer device comprising a processor and a memory; the memory stores a computer program adapted to be loaded by the processor and to perform the screen sharing method as described above.
According to an eighth aspect of embodiments of the present application, there is provided a computer readable storage medium having a computer program stored thereon, wherein the computer program, when executed by a processor, implements a screen sharing method as described above.
According to the embodiments of the present application, a canvas with a fixed resolution is created, the screen frame image corresponding to the screen area to be shared is pasted onto the canvas, and identification information of the screen frame image on the canvas is obtained. Video encoding data to be shared is generated from the fixed-resolution canvas and the identification information and sent to at least one sharing receiving terminal. As a result, neither the encoding resolution nor the decoding resolution needs to be adjusted in real time to follow the resolution of the screen frame image, so the instability caused by repeatedly restarting the encoder and decoder is avoided, improving both the stability and the real-time performance of screen sharing. Furthermore, no compression or enlargement of the screen frame image is needed during encoding and transmission, so the screen frame image captured by the sharing publishing terminal can be delivered intact to the sharing receiving terminal without complex programming logic, preserving the fidelity of the screen frame image at the sharing publishing terminal.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the application.
For a better understanding and implementation, the present invention is described in detail below with reference to the drawings.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings that are required in the embodiments or the description of the prior art will be briefly described below, it being obvious that the drawings in the following description are only some embodiments of the present application, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
Fig. 1 is a schematic block diagram of an application environment of a screen sharing method according to an embodiment of the present application;
fig. 2 is a flowchart of a screen sharing method according to a first embodiment of the present application;
FIG. 3 is a schematic diagram of a video conferencing interface shown in an embodiment of the present application;
fig. 4 is a schematic diagram of a video conference interface after a triggering operation in response to screen sharing according to an embodiment of the present application;
fig. 5 is a schematic diagram of a user after determining a screen area to be shared according to an embodiment of the present application;
FIG. 6 is a schematic diagram of a screen frame image in a canvas according to an embodiment of the present application;
fig. 7 is a flowchart of a method of video encoding data according to a first embodiment of the present application;
fig. 8 is a flowchart of a method of displaying a screen frame image according to a first embodiment of the present application;
FIG. 9 is a schematic diagram of a screen frame image overlaid on a rendering window for display of the screen frame image according to an embodiment of the present application;
FIG. 10 is a schematic diagram of a rendering window shown in an embodiment of the present application superimposed over a screen frame image to display the screen frame image;
FIG. 11 is a schematic illustration of the sliding of a screen frame image within a rendering window in the state of FIG. 10;
FIG. 12 is a schematic diagram of a configuration of a viewing mode control shown in an embodiment of the present application;
FIG. 13 is a schematic effect diagram of displaying a screen frame image at a resolution matched to the viewing window according to an embodiment of the present application;
FIG. 14 is another schematic effect diagram of displaying a screen frame image at a resolution matched to the viewing window according to an embodiment of the present application;
FIG. 15 is a further schematic effect diagram of displaying a screen frame image at a resolution matched to the viewing window according to an embodiment of the present application;
fig. 16 is a flowchart of a screen sharing method according to a second embodiment of the present application;
Fig. 17 is a flowchart of a screen sharing method according to a third embodiment of the present application;
FIG. 18 is a schematic block diagram of a screen sharing system shown in a fourth embodiment of the present application;
fig. 19 is a schematic block diagram of a screen sharing device according to a fifth embodiment of the present application;
fig. 20 is a schematic block diagram of a screen sharing device according to a sixth embodiment of the present application;
fig. 21 is a schematic structural diagram of a computer device according to a seventh embodiment of the present application.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the present application more apparent, the following detailed description of the embodiments of the present application will be given with reference to the accompanying drawings.
It should be understood that the described embodiments are merely some, but not all, of the embodiments of the present application. All other embodiments, which can be made by one of ordinary skill in the art without undue burden from the embodiments of the present application, are within the scope of the embodiments of the present application.
When the following description refers to the accompanying drawings, the same numbers in different drawings refer to the same or similar elements, unless otherwise indicated. In the description of this application, it should be understood that the terms "first," "second," "third," and the like are used merely to distinguish between similar objects; they do not describe a particular order or sequence, nor should they be construed to indicate or imply relative importance. The specific meaning of these terms in this application will be understood by those of ordinary skill in the art as the case may be. As used in this application and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. The word "if" as used herein may be interpreted as "when," "upon," or "in response to determining."
Furthermore, in the description of the present application, unless otherwise indicated, "a plurality" means two or more. "And/or" describes an association relationship between associated objects and indicates that three relationships are possible; for example, "A and/or B" may mean: A exists alone, A and B exist together, or B exists alone. The character "/" generally indicates that the associated objects before and after it are in an "or" relationship.
As will be appreciated by those skilled in the art, the terms "terminal" and "terminal device" as used herein cover devices having only a wireless signal transmitter, devices having only a wireless signal receiver, and devices having both receiving and transmitting hardware capable of two-way communication over a two-way communication link. Such a device may include: a cellular or other communication device, such as a personal computer or tablet, with or without a single-line or multi-line display; a PCS (Personal Communications Service) device that may combine voice, data processing, facsimile, and/or data communication capabilities; a PDA (Personal Digital Assistant) that can include a radio frequency receiver, pager, Internet/intranet access, web browser, notepad, calendar, and/or GPS (Global Positioning System) receiver; and a conventional laptop, palmtop computer, or other appliance that has and/or includes a radio frequency receiver. As used herein, a "terminal" or "terminal device" may be portable, transportable, installed in a vehicle (aeronautical, maritime, and/or land-based), or adapted and/or configured to operate locally and/or in a distributed fashion at any other location on earth and/or in space. A "terminal" or "terminal device" may also be a communication terminal, a network access terminal, or a music/video playing terminal, for example a PDA, a MID (Mobile Internet Device), and/or a mobile phone with music/video playback, or a smart TV, set-top box, or similar device.
The terms "terminal" and "terminal device" essentially refer to a computer device with the capabilities of a personal computer: a hardware device having the components of the von Neumann architecture, such as a central processing unit (including an arithmetic unit and a controller), memory, input devices, and output devices. A computer program is stored in the memory; the central processing unit loads and runs it, executes its instructions, and interacts with the input/output devices to perform specific functions.
Please refer to fig. 1, which is a schematic block diagram of an application environment of a screen sharing method according to an embodiment of the present application. As shown in fig. 1, the application environment includes a sharing publishing terminal 120 and at least one sharing receiving terminal 140; the two may be directly or indirectly connected through a network, which is not limited herein. The network may be a communication medium of any connection type capable of providing a communication link between the sharing publishing terminal 120 and the sharing receiving terminal 140, for example a wired communication link, a wireless communication link, or an optical fiber cable, which is not limited herein. It should be noted that the screen sharing method of the present application may be applied to education and conference scenarios, and also to any other scenario where it can be used, such as live streaming.
The sharing publishing terminal 120 is the end that shares its screen; it may be a personal or public electronic device such as a desktop, notebook, tablet, or smartphone, which is not limited herein. In some examples, when the screen sharing method is used for education, the sharing publishing terminal 120 may be a personal electronic device used by a teacher; when it is used for a conference, the sharing publishing terminal 120 may be a personal electronic device used by the conference participant sharing the screen, such as the conference host.
The sharing receiving terminal 140 is the end that views the shared screen; it may likewise be a personal or public electronic device such as a desktop, notebook, tablet computer, or smartphone, which is not limited herein. In some examples, when the screen sharing method is used for education, the sharing receiving terminal 140 may be a personal electronic device used by students; when it is used for a conference, the sharing receiving terminal 140 may be a personal electronic device used by conference participants who are not sharing the screen.
In this embodiment of the present application, the sharing publishing terminal 120 and the sharing receiving terminal 140 may both run the same screen sharing application, which relies on a network to implement interaction between them. The screen sharing application may run as an independent application or be embedded as part of another application; for example, in a conference scenario, a video conference application may embed the screen sharing application so that screen sharing can be started quickly as needed during the conference. Specifically, the sharing publishing terminal 120 and the sharing receiving terminal 140 may first join the same conference room through the video conference application, and then the screen of the sharing publishing terminal 120 is shared to all the sharing receiving terminals 140 in the conference room through the screen sharing application, so that the sharing receiving terminals 140 can see the screen content of the sharing publishing terminal 120 synchronously.
Optionally, the application environment of the screen sharing method may further include a server (not shown in the figure). The sharing publishing terminal 120 and the sharing receiving terminal 140 may access the Internet through a network and establish data communication links with the server, so that their interaction is implemented through the server. The server may act as a service server and may further connect to related audio streaming servers, video streaming servers, and other supporting servers to form a logically associated service cluster that provides services to the related terminal devices, such as the sharing publishing terminal 120 and the sharing receiving terminal 140.
For a better understanding of the aspects of the present application, some of the technical aspects will be described below.
During screen sharing, the shared screen area can be adjusted as needed. For example, when the shared area is a window (window sharing), the window size can be resized in real time, so the resolution of the window image also changes in real time. To share the window image, it must be encoded into video encoding data by an encoder, and the encoder's encoding resolution must match the resolution of the window image. However, because an encoder cannot modify its encoding resolution on the fly, it must be restarted repeatedly to adapt to an image whose resolution keeps changing, resulting in poor stability of the screen sharing. In some technologies, capture and encoding are suspended while the window image's resolution is changing, and the encoder is restarted to resume capture and encoding only once the resolution has stabilized; however, this approach also results in poor real-time performance of screen sharing.
Correspondingly, when the sharing receiving terminal 140 decodes the video encoding data sent by the sharing publishing terminal 120, the decoding resolution must also be kept consistent with the resolution of the window image, which likewise causes poor stability and poor real-time performance of screen sharing.
In addition, with the development of video conferencing and online education, screen sharing is in ever wider use, and with the popularization of smartphones and tablets, screen sharing between computer devices has gradually turned into mixed sharing among smartphones, tablets, and computers. Because different devices have different display resolutions or aspect ratios, the screen image shared by the sharing publishing terminal 120 may not display consistently at the sharing receiving terminal 140. For example, the sharing publishing terminal 120 may be a mobile phone with a vertical screen resolution of 720×1080 and an aspect ratio of 2:3, while the sharing receiving terminal 140 is a computer with a resolution of 1920×720 and an aspect ratio of 8:3; because the shared screen image does not match the display resolution of the receiving terminal's screen, only a partial image can be displayed at the sharing receiving terminal 140. If the screen image is instead scaled to bridge the different display resolutions, some details become blurred, especially when the shared content is text, a web page, or teaching material, where details are numerous and important; the scaled screen image becomes blurred at the sharing receiving terminal 140 and cannot be displayed clearly, failing to meet the requirement of displaying details.
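The mismatch in this example can be quantified with a small helper (an illustrative calculation, not part of the claimed method): uniformly scaling the 720×1080 portrait image to fit the 1920×720 window shrinks it to 480×720, leaving most of the window unused and reducing textual detail.

```python
def fit_scale(src_w: int, src_h: int, dst_w: int, dst_h: int) -> float:
    """Largest uniform scale at which the source image fits entirely
    inside the destination window (letterbox/pillarbox fitting)."""
    return min(dst_w / src_w, dst_h / src_h)

# 720x1080 portrait phone screen shared to a 1920x720 computer window:
s = fit_scale(720, 1080, 1920, 720)         # 2/3, limited by the height
scaled = (round(720 * s), round(1080 * s))  # (480, 720): only a quarter of
                                            # the window width is used, and
                                            # fine text is shrunk
```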
Based on the above, the application provides a screen sharing method, a device, a system, a storage medium and computer equipment.
Referring to fig. 2, fig. 2 is a flowchart of a screen sharing method according to a first embodiment of the present application. The screen sharing method of this embodiment includes the following steps:
Step S101: the sharing publishing terminal, in response to a screen-sharing trigger operation, acquires a screen frame image corresponding to the screen area to be shared; creates a canvas with a fixed resolution, pastes the screen frame image onto the canvas, and obtains identification information of the screen frame image on the canvas; encodes the canvas and the identification information in an encoding mode corresponding to the resolution of the canvas to obtain the video encoded data to be shared; and transmits the video encoded data to at least one sharing receiving terminal. The identification information includes the location information of the screen frame image in the canvas.
The screen-sharing trigger operation is a request to share the desktop of the sharing publishing terminal.
In one embodiment, in an education scenario, the screen-sharing trigger operation may be a request for desktop sharing made on the sharing publishing terminal at the teacher's end. For example, when a teacher is giving a live lesson, the teacher can trigger screen sharing by clicking the sharing control on the sharing publishing terminal, and the terminal where the teacher is located responds by sharing its desktop.
In another embodiment, in a conference scenario, the screen-sharing trigger operation may be a request for desktop sharing made on the host's sharing publishing terminal. For example, during a meeting, the presenter can trigger screen sharing by clicking a sharing control on the sharing publishing terminal, which responds by sharing its desktop. Specifically, as shown in fig. 3, the sharing publishing terminal embeds a screen sharing application in the conference application and places a screen sharing control in the toolbar 11 of the conference display interface 10; clicking this control triggers screen sharing. As also shown in fig. 3, the toolbar 11 further provides the conventional video-conference controls, such as a microphone control, a camera control, a recording control, an invite-member control, and a start-live control; the screen sharing control and these controls exist independently, each implementing its own function.
The screen area to be shared may be the full-screen desktop of the sharing publishing terminal, a partial area of the desktop, or a virtual desktop. A partial area of the desktop may be a window opened on the sharing publishing terminal, or the screen area of one of several split screens. Specifically, after the user clicks the screen sharing control to trigger screen sharing, the sharing publishing terminal responds by displaying, in list form, all windows currently open on it, any preset windows, and/or the screen areas of the split-screen desktop for the user to choose from. As shown in fig. 4, after responding to the trigger operation, the sharing publishing terminal displays the candidate windows for the user to select. When the user clicks one of the options in the list to determine the screen area to be shared, for example clicks a particular window, the sharing publishing terminal pops up a prompt box around that window to indicate the area being shared; in fig. 5, the bold black box below "screen area being shared" marks that area. Optionally, to better present the shared content, once the screen area to be shared is determined, the conference display interface 10 shows only the screen area to be shared and the toolbar 11 is hidden.
The screen frame image may be an image captured by the sharing publishing terminal at preset time intervals; optionally, it may be obtained by taking a screenshot of the screen area to be shared at each interval. It will be appreciated that when the size of the screen area to be shared changes, the size, and therefore the resolution, of the screen frame image changes as well. For example, when the screen area to be shared is a window, dragging the window to resize it also changes the resolution of the corresponding screen frame image.
The canvas is a drawing surface used for encoding. It may be created on the basis of a canvas element: define the canvas element, create a canvas object, and then draw graphics with that object. In this embodiment, the screen frame image is pasted onto the canvas through the canvas object. Optionally, the canvas is a blank canvas, to reduce the amount of data transmitted.
It should be noted that each frame of the screen frame image corresponds to one canvas, so every frame can be pasted onto a canvas and then encoded on the basis of that canvas. Because the resolution of the canvas is fixed, the encoder always works at that fixed resolution and never needs to be adjusted to the resolution of the screen frame image; this avoids the instability of screen sharing caused by repeatedly restarting the encoder and at the same time improves the real-time performance of screen sharing.
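The per-frame canvas-and-paste step can be illustrated with a toy pixel-grid model. This is an illustration only, not the patent's implementation; a real client would draw into an HTML5 canvas or a GPU surface rather than Python lists:

```python
# Toy model: a canvas is a 2-D list of pixels; pasting copies the frame's
# pixels into the canvas at a chosen offset without any scaling.
def make_canvas(width: int, height: int, fill: int = 0):
    """Create a blank fixed-resolution canvas (fill=0 models a blank canvas)."""
    return [[fill] * width for _ in range(height)]

def paste_frame(canvas, frame, x: int, y: int):
    """Copy the frame onto the canvas at (x, y); the frame keeps its resolution."""
    for r, row in enumerate(frame):
        for c, pixel in enumerate(row):
            canvas[y + r][x + c] = pixel
    return canvas
```

Because the canvas dimensions never change, the encoder fed with these canvases never has to be reconfigured, which is the core point of the embodiment.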
The location information is used to identify a location of the screen frame image in the canvas to identify the screen frame image in the canvas according to the location information.
In one embodiment, the location information includes the resolution of the screen frame image and the coordinates of a preset point of the screen frame image in the canvas. The resolution of the screen frame image comprises a first horizontal pixel count and a first vertical pixel count. A coordinate system can be established on the canvas; for example, a rectangular coordinate system may take the point at the lower right corner of the canvas as the origin, the horizontal direction of the canvas as the X axis, and the vertical direction as the Y axis. Optionally, the point at the upper left corner of the screen frame image may be used as the preset point, and its coordinates in the canvas are then recorded. As shown in fig. 6, the screen frame image is pasted on the canvas; the coordinates of its upper-left corner are (x, y), and the first horizontal and first vertical pixel counts are w and h, respectively.
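Under the convention just described (top-left preset point, width w, height h), the identification information could be modeled as below. The type and field names are illustrative assumptions, not taken from the patent:

```python
from dataclasses import dataclass

@dataclass
class FrameIdInfo:
    x: int  # canvas x-coordinate of the frame's top-left preset point
    y: int  # canvas y-coordinate of the same point
    w: int  # first horizontal pixel count (frame width)
    h: int  # first vertical pixel count (frame height)

def id_info_for(frame_w: int, frame_h: int,
                paste_x: int = 0, paste_y: int = 0) -> FrameIdInfo:
    """Identification info recorded when a frame is pasted at (paste_x, paste_y)."""
    return FrameIdInfo(paste_x, paste_y, frame_w, frame_h)
```

The four numbers uniquely pin down the frame's rectangle inside the fixed canvas, which is all the receiving terminal needs later to cut the frame back out.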
The sharing receiving terminal is a terminal that has established a screen sharing channel with the sharing publishing terminal. Specifically, in a conference scenario, both the sharing receiving terminal and the sharing publishing terminal can install a video conference application in which a screen sharing application is embedded. The sharing publishing terminal of the host can create a conference room in the video conference application, generating a conference ID; each participant joins the same conference room by opening the video conference application on their own terminal and entering that conference ID. At this point each sharing receiving terminal has established a screen sharing channel with the sharing publishing terminal, and the sharing publishing terminal can then send the video encoded data corresponding to the screen frame image to all sharing receiving terminals that have joined the conference room.
Step S102: the sharing receiving terminal receives the video coding data, decodes the video coding data in a decoding mode corresponding to the resolution of the canvas, and obtains the canvas and the identification information; and obtaining the screen frame image from the canvas according to the identification information, and displaying the screen frame image.
After the sharing receiving terminal receives the video encoded data, it decodes the data with a preset decoding algorithm to obtain the canvas and the identification information. Because the identification information includes the location information of the screen frame image, the position of the screen frame image in the canvas can be determined and the image obtained. Because the resolution of the canvas is fixed, the decoder also works at a fixed decoding resolution, which avoids the instability of screen sharing caused by repeatedly restarting the decoder. It can be understood that the sharing publishing terminal and the sharing receiving terminal can agree in advance on the specific encoding and decoding algorithms used for the screen frame image; any algorithm capable of implementing the method of the present application may be used, and the application does not limit this.
In the embodiment of the present application, a canvas with a fixed resolution is created, the screen frame image corresponding to the screen area to be shared is pasted onto the canvas, and the identification information of the screen frame image on the canvas is obtained. The video encoded data to be shared is generated from the fixed-resolution canvas and the identification information and sent to at least one sharing receiving terminal. As a result, neither the encoding resolution nor the decoding resolution needs to be adjusted in real time to match the resolution of the screen frame image, which avoids the instability caused by repeatedly restarting the encoder and decoder, improves the stability of screen sharing, and at the same time improves its real-time performance. Furthermore, no compression or enlargement of the screen frame image is needed during encoding and transmission, so the screen frame image captured by the sharing publishing terminal can be transmitted intact to the sharing receiving terminal without complex programming logic, preserving the fidelity of the screen frame image as it appears on the sharing publishing terminal.
In one embodiment, the resolution of the canvas is the resolution of the full-screen desktop of the sharing publishing terminal. Since the size of the screen area to be shared can change, and the largest possible screen area is the full-screen desktop, setting the canvas resolution to the resolution of the full-screen desktop guarantees that every screen frame image fits on the canvas.
Referring to fig. 7, in one embodiment, in step S101, the sharing publishing terminal encodes the canvas and the identification information according to an encoding mode corresponding to the resolution of the canvas, including:
Step S1011: encode the canvas at a fixed encoding resolution to obtain first encoded data, where the encoding resolution is the same as the resolution of the canvas.
Because the resolution of the canvas is fixed, the encoding resolution is also fixed, and every canvas is encoded at that same fixed resolution. This avoids the instability caused by repeatedly restarting the encoder and improves the real-time performance of encoding.
Step S1012: and coding the identification information in an SEI mode to obtain second coded data.
SEI (Supplemental Enhancement Information) is a bitstream-level coding mechanism that provides a way to add auxiliary information to video encoded data. Specifically, the identification information is encoded in SEI form to produce the second encoded data, which is then combined with the first encoded data corresponding to the canvas to form the video encoded data; this provides the basis for identifying and displaying the screen frame image within the canvas. It should be noted that a change in the resolution of the screen frame image does not affect the manner or complexity of encoding the identification information via SEI.
It will be appreciated that before the identification information is encoded in an SEI manner to obtain second encoded data, the identification information needs to be serialized into a string, so as to be converted into data that can be identified by an encoder and then encoded.
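The serialization step might look like the following JSON round-trip. JSON is an assumption for illustration; the patent only requires some string form that the encoder can carry in an SEI payload:

```python
import json

def serialize_id_info(x: int, y: int, w: int, h: int) -> bytes:
    """Turn the identification info into a byte string for an SEI payload."""
    return json.dumps({"x": x, "y": y, "w": w, "h": h}).encode("utf-8")

def deserialize_id_info(payload: bytes) -> dict:
    """Inverse operation, performed by the sharing receiving terminal."""
    return json.loads(payload.decode("utf-8"))
```

Because the payload is just four small integers, its size and encoding cost do not depend on the resolution of the screen frame image, matching the remark above.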
Step S1013: and packaging the first coded data and the second coded data to obtain the video coded data to be shared.
In the present embodiment, the first encoded data obtained by encoding the canvas and the second encoded data obtained by encoding the identification information in SEI form are packaged together and transmitted to the sharing receiving terminal. This keeps the canvas and its identification information synchronized in transit, so the sharing receiving terminal can obtain the screen frame image from the canvas according to the identification information that accompanies it.
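A minimal sketch of packaging the two encoded parts is given below. Real H.264/H.265 streams interleave SEI NAL units with picture data inside the bitstream itself, so this length-prefixed container is a deliberate simplification for illustration only:

```python
import struct

def package(first_coded: bytes, second_coded: bytes) -> bytes:
    """Length-prefix each part so the receiver can split the package again."""
    return (struct.pack(">I", len(first_coded)) + first_coded
            + struct.pack(">I", len(second_coded)) + second_coded)

def unpackage(blob: bytes) -> tuple:
    """Recover (first_coded, second_coded) from a packaged blob."""
    n = struct.unpack(">I", blob[:4])[0]
    first, rest = blob[4:4 + n], blob[4 + n:]
    m = struct.unpack(">I", rest[:4])[0]
    return first, rest[4:4 + m]
```

Whatever container is used, the essential property is that each canvas travels with its own identification information, so the two can never fall out of sync.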
It can be understood that the sharing receiving terminal receives the video encoded data and decodes it in the decoding mode corresponding to the resolution of the canvas to obtain the canvas and the identification information; the decoding process is the reverse of the encoding process and may specifically include: decompressing the video encoded data to obtain the first encoded data and the second encoded data; decoding the first encoded data at a fixed decoding resolution to obtain the canvas, where the decoding resolution is the same as the resolution of the canvas; and decoding the second encoded data in SEI form to obtain the identification information.
In one embodiment, when the location information includes the resolution of the screen frame image and coordinates of a preset point of the screen frame image in the canvas, the sharing receiving terminal obtains the screen frame image from the canvas according to the identification information in step S102, including: and intercepting and obtaining the screen frame image from the canvas according to the coordinates of the preset point of the screen frame image in the canvas and the resolution of the screen frame image. The resolution of the screen frame image and the coordinates of the preset points of the screen frame image in the canvas can uniquely determine the position of the screen frame image in the canvas, so that the screen frame image can be conveniently and accurately obtained from the canvas.
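In the toy pixel-grid model used earlier, intercepting (cropping) the frame back out of the canvas with the preset-point coordinates and the frame resolution reduces to a pair of slices; again a sketch, not the patent's code:

```python
def crop_frame(canvas, x: int, y: int, w: int, h: int):
    """Cut the w-by-h screen frame image out of the canvas at top-left (x, y)."""
    return [row[x:x + w] for row in canvas[y:y + h]]
```

The coordinates plus the resolution determine the rectangle uniquely, so the receiving terminal recovers exactly the pixels the publishing terminal pasted, with no scaling anywhere in the path.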
In one embodiment, when the location information includes the resolution of the screen frame image, displaying the screen frame image by the sharing receiving terminal in step S102 includes:
step S1021: and creating a rendering window, and adjusting the layering relation between the screen frame image and the rendering window according to the resolution of the screen frame image and the resolution of the rendering window, so that the screen frame image is displayed in the rendering window without scaling according to the resolution of the screen frame image.
Because the display resolutions of the sharing publishing terminal and the sharing receiving terminal differ, the resolution of the screen frame image and the resolution of the rendering window may differ as well, so the screen frame image shared by the publishing terminal and the rendering window of the receiving terminal may not match in size. By adjusting the stacking relationship between the screen frame image and the rendering window, the screen frame image is displayed in the rendering window at its own resolution, without scaling.
The resolution of the screen frame image comprises a first transverse pixel number and a first longitudinal pixel number; the resolution of the rendering window comprises a second horizontal pixel number and a second vertical pixel number; referring to fig. 8, in step S1021, the sharing receiving terminal adjusts a stacking relationship between the screen frame image and the rendering window according to the resolution of the screen frame image and the resolution of the rendering window, so that the screen frame image is displayed in the rendering window without scaling according to the resolution of the screen frame image, including:
Step S10211: and when the second horizontal pixel point number is greater than or equal to the first horizontal pixel point number and the second vertical pixel point number is greater than or equal to the first vertical pixel point number, stacking the screen frame image on the rendering window, so that the screen frame image is displayed in the rendering window.
Wherein, as shown in fig. 9, when the second horizontal pixel count is greater than or equal to the first horizontal pixel count and when the second vertical pixel count is greater than or equal to the first vertical pixel count, it is indicated that the rendering window is large enough to completely display the screen frame image, so that the screen frame image is stacked on the rendering window, and thus the screen frame image is completely displayed in the rendering window.
Optionally, when the screen frame image is superimposed on the rendering window, the image may be smaller than the window and thus not cover it completely. In that case the region of the rendering window not covered by the screen frame image may be filled with a preset color, such as black or white, to improve the viewing effect.
Step S10212: when the number of the second horizontal pixel points is smaller than the number of the first horizontal pixel points and/or the number of the second vertical pixel points is smaller than the number of the first vertical pixel points, the rendering window is overlapped on the screen frame image in a sliding mode, and the screen frame image is partially displayed in the rendering window; when responding to the sliding of the screen frame image in the rendering window, the corresponding position of the screen frame image after the sliding is shown in the rendering window.
When the number of the second horizontal pixels is smaller than the number of the first horizontal pixels and/or the number of the second vertical pixels is smaller than the number of the first vertical pixels, the rendering window is not large enough, and the screen frame image may have a part of images which cannot be displayed, and at this time, the rendering window is slidingly stacked on the screen frame image, so that the screen frame image is partially displayed in the rendering window. Specifically, as shown in fig. 10, only a part of the screen frame image is displayed in the rendering window, for example, a "XX" character is displayed, and a "YY" character located on the right side of the rendering window is not displayed because the rendering window is not overlapped with the screen frame image. When a user slides the screen frame image in the rendering window, the corresponding position of the screen frame image after sliding is displayed in the rendering window, so that images at other positions in the screen frame image can be seen in the rendering window, as shown in fig. 11, after the rendering window is slid to the right, the right YY character can be displayed in the rendering window, and further, the screen frame image can be completely and clearly seen in the rendering window, and detail loss caused by scaling of the screen frame image can be prevented.
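The branch between steps S10211 and S10212 reduces to one comparison per axis. The function name and the returned labels below are illustrative, not from the patent:

```python
def layering_mode(frame_w: int, frame_h: int, win_w: int, win_h: int) -> str:
    """Decide the stacking order of frame and rendering window (no scaling)."""
    if win_w >= frame_w and win_h >= frame_h:
        return "frame-on-window"   # window can show the whole frame (S10211)
    return "window-on-frame"       # window slides over the larger frame (S10212)
```

Note that a single undersized axis is enough to fall into the sliding case, matching the "and/or" condition of step S10212.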
In one embodiment, the sharing receiving terminal is provided with a viewing mode selection control for adjusting the display resolution of the screen frame image; optionally, when the sharing receiving terminal is started or just receives the video coding data, the viewing mode selection control may be popped up, so that a user may select whether to trigger the viewing mode selection control. Further, after displaying the screen frame image, the viewing mode selection control may also be displayed in the toolbar 11 of the rendering window, so that the user may select to adjust the display effect of the screen frame image by triggering the viewing mode control.
Optionally, the viewing mode selection control includes a first mode control; the first mode control is used for indicating that the screen frame image is displayed without scaling according to the resolution of the screen frame image; the sharing receiving terminal adjusts the layering relation between the screen frame image and the rendering window according to the resolution of the screen frame image and the resolution of the rendering window, so that before the screen frame image is displayed in the rendering window according to the resolution of the screen frame image in a non-scaling way, the sharing receiving terminal further comprises: and receiving triggering operation of the first mode control. Optionally, when the sharing receiving terminal is started or just receives the video coding data, the viewing mode selection control may be popped up, so that a user may select whether to trigger the first mode control. Optionally, the first mode control may be used as a default trigger, so that when the sharing receiving terminal receives the video coding data, the screen frame image is displayed in a default manner according to the resolution of the screen frame image without scaling. As shown in fig. 12, on the user side, the first mode control is a control corresponding to "100%".
Optionally, the viewing mode selection control further includes a second mode control, which is used to instruct enlargement of the screen frame image. The screen sharing method further comprises: the sharing receiving terminal receives the trigger operation of the second mode control, stacks the screen frame image on the rendering window, keeps the ratio of the first horizontal pixel count to the first vertical pixel count unchanged, enlarges the screen frame image by a first preset scaling factor, and displays the enlarged image in the rendering window. After the screen frame image is displayed, the viewing mode selection control may also be displayed in the toolbar 11 of the rendering window, so that the user can choose to enlarge the screen frame image by triggering the second mode control in order to see more detail. For example, when the second horizontal pixel count is greater than or equal to the first horizontal pixel count and the second vertical pixel count is greater than or equal to the first vertical pixel count, the screen frame image is stacked on the rendering window and displayed inside it; if the image is too small the viewing effect may be poor, and triggering the second mode control enlarges it to improve the effect. As shown in fig. 12, on the user side the second mode control is the control labeled "Zoom+". It can be understood that each time the second mode control is triggered, the screen frame image is enlarged once by the first preset scaling factor, i.e. enlarged proportionally by one step.
Optionally, the viewing mode selection control further includes a third mode control, which is used to instruct reduction of the screen frame image. The screen sharing method further comprises: the sharing receiving terminal receives the trigger operation of the third mode control, stacks the screen frame image on the rendering window, keeps the ratio of the first horizontal pixel count to the first vertical pixel count unchanged, and reduces the screen frame image by a second preset scaling factor so that it is displayed reduced in the rendering window. After the screen frame image is displayed, the viewing mode selection control can be shown on the toolbar of the rendering window, so that the user can choose to reduce the screen frame image by triggering the third mode control and thereby view it as a whole. For example, when the second horizontal pixel count is smaller than the first horizontal pixel count and/or the second vertical pixel count is smaller than the first vertical pixel count, the rendering window is slidably stacked on the screen frame image and only part of the image is shown; triggering the third mode control then reduces the screen frame image so that it can be viewed in its entirety. As shown in fig. 12, on the user side the third mode control is the control labeled "Zoom-". It can be understood that each time the third mode control is triggered, the screen frame image is reduced once by the second preset scaling factor, i.e. reduced proportionally by one step.
Optionally, the viewing mode selection control further includes a fourth mode control, which is used to instruct display of the screen frame image at a preset ratio. The screen sharing method further comprises: the sharing receiving terminal receives the trigger operation of the fourth mode control, stacks the screen frame image on the rendering window, keeps the ratio of the first horizontal pixel count to the first vertical pixel count unchanged, reduces or enlarges the screen frame image by a third preset scaling factor, and displays it in the rendering window at the preset ratio. Similarly, after the screen frame image is displayed, the viewing mode selection control may also be shown on the toolbar 11 of the rendering window, so that the user can adjust the display by triggering the fourth mode control to obtain the expected effect. When the preset ratio is 50% zoom, as shown in fig. 12, the fourth mode control on the user side is the control labeled "50%". It can be understood that each time the fourth mode control is triggered, the screen frame image is reduced or enlarged by the third preset scaling factor and displayed in the rendering window at the preset ratio.
Optionally, the viewing mode selection control further includes a fifth mode control, which is used to instruct display of the screen frame image at a resolution matching the viewing window. The screen sharing method further comprises: the sharing receiving terminal receives the trigger operation of the fifth mode control, stacks the screen frame image on the rendering window, keeps the ratio of the first horizontal pixel count to the first vertical pixel count unchanged, and enlarges or reduces the screen frame image so that it is displayed in the rendering window at a resolution matching the rendering window. Optionally, when the sharing receiving terminal starts up or first receives the video encoded data, the viewing mode selection control may pop up so that the user can choose whether to trigger the fifth mode control. Optionally, the fifth mode control may be triggered by default, so that when the sharing receiving terminal receives the video encoded data the screen frame image is displayed, by default, at the resolution matching the viewing window. As shown in fig. 12, on the user side the fifth mode control is the control labeled "Max".
Specifically, keeping the ratio of the first horizontal pixel count to the first vertical pixel count unchanged while enlarging or reducing the screen frame image so that it is displayed in the rendering window at a matching resolution includes: calculating the ratio of the first horizontal pixel count to the first vertical pixel count of the screen frame image to obtain first ratio data; and calculating the ratio of the second horizontal pixel count to the second vertical pixel count of the rendering window to obtain second ratio data. As shown in fig. 13, when the first ratio data is greater than the second ratio data, the screen frame image is enlarged or reduced until its first horizontal pixel count equals the second horizontal pixel count of the rendering window, and the scaled image is displayed in the window. As shown in fig. 14, when the first ratio data is smaller than the second ratio data, the screen frame image is enlarged or reduced until its first vertical pixel count equals the second vertical pixel count of the rendering window, and the scaled image is displayed in the window. As shown in fig. 15, when the first ratio data equals the second ratio data, the screen frame image is enlarged or reduced until its first horizontal pixel count equals the second horizontal pixel count of the rendering window and its first vertical pixel count equals the second vertical pixel count, and the scaled image is displayed in the window.
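The three-way ratio comparison above boils down to computing a single aspect-preserving scale factor. The sketch below uses cross-multiplication instead of floating-point ratio comparison; it is an illustration under the stated convention, not the patent's code:

```python
def fit_scale(frame_w: int, frame_h: int, win_w: int, win_h: int) -> float:
    """Aspect-preserving scale factor so the frame just fits the window."""
    # frame_w/frame_h >= win_w/win_h  <=>  frame_w*win_h >= win_w*frame_h
    if frame_w * win_h >= win_w * frame_h:
        return win_w / frame_w   # frame relatively wider: match the widths
    return win_h / frame_h       # frame relatively taller: match the heights
```

With the earlier example of a 720×1080 phone frame and a 1920×720 window, the frame is relatively taller, so the heights are matched and the scaled width (480) still fits inside the window.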
Referring to fig. 16, fig. 16 is a flowchart of a screen sharing method according to a second embodiment of the present application, where the screen sharing method is applied to a sharing publishing terminal, and includes the following steps:
step S201: responding to a triggering operation of screen sharing, and acquiring a screen frame image corresponding to a screen area to be shared;
step S202: creating a canvas with fixed resolution, pasting the screen frame image onto the canvas, and obtaining identification information of the screen frame image on the canvas; wherein the identification information includes location information of the screen frame image in the canvas;
step S203: coding the canvas and the identification information according to a coding mode corresponding to the resolution of the canvas to obtain video coding data to be shared;
step S204: and transmitting the video coding data to at least one sharing receiving terminal, so that the sharing receiving terminal decodes the video coding data according to a decoding mode corresponding to the canvas resolution, and a screen frame image is obtained.
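Steps S201-S204 on the publishing side can be illustrated with a minimal Python sketch (not from the patent; the pixel representation, default canvas dimensions, and helper name are assumptions). It pastes a captured frame onto a blank fixed-resolution canvas and records the identification information that is later encoded together with the canvas:

```python
def paste_to_canvas(frame, canvas_w=1920, canvas_h=1080):
    """Paste a captured screen frame (a list of pixel rows) onto a
    blank fixed-resolution canvas, anchored at the top-left corner,
    and return the canvas plus the identification information
    (position of the frame in the canvas)."""
    h, w = len(frame), len(frame[0])
    canvas = [[0] * canvas_w for _ in range(canvas_h)]  # blank canvas
    for y in range(h):
        for x in range(w):
            canvas[y][x] = frame[y][x]
    # identification information carried alongside the encoded canvas
    ident = {"x": 0, "y": 0, "width": w, "height": h}
    return canvas, ident
```

Because every frame is pasted onto a canvas of the same fixed resolution, the encoder's resolution never changes, which is what allows the encoder to keep running without restarts.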
According to this embodiment of the application, a canvas with a fixed resolution is created, the screen frame image corresponding to the screen area to be shared is pasted onto the canvas, and the identification information of the screen frame image on the canvas is obtained; the video encooding data to be shared is obtained based on the fixed-resolution canvas and the identification information and is sent to at least one sharing receiving terminal. The encoding resolution therefore does not need to be adjusted in real time according to the resolution of the screen frame image, which avoids the screen-sharing instability caused by repeatedly restarting the encoder, improves the stability of screen sharing, and improves its real-time performance. Furthermore, no compression or enlargement of the screen frame image is needed during encoding and transmission, and no complex programming logic is required, so that the screen frame image captured by the sharing publishing terminal is transmitted intact to the sharing receiving terminal, guaranteeing the fidelity of the screen frame image from the sharing publishing terminal.
The screen sharing method above is described from the side of the sharing publishing terminal. For a specific implementation, reference may be made to the description of the steps executed by the sharing publishing terminal in the first embodiment, which is not repeated here.
Referring to fig. 17, fig. 17 is a flowchart of a screen sharing method according to a third embodiment of the present application, where the screen sharing method is applied to a sharing receiving terminal, and includes the following steps:
step S301: receiving video encoding data; wherein the video coding data comprises canvas and identification information of screen frame images on the canvas; the canvas resolution is fixed and the identification information includes location information of the screen frame image in the canvas.
Step S302: and decoding the video coding data according to a decoding mode corresponding to the resolution of the canvas to obtain the identification information of the canvas and the screen frame image on the canvas.
Step S303: and obtaining a screen frame image from the canvas according to the identification information.
Step S304: and displaying the screen frame image.
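On the receiving side, steps S302-S303 amount to cropping the frame back out of the decoded fixed-resolution canvas using the position information; a minimal Python sketch (the names are illustrative, not from the patent):

```python
def crop_from_canvas(canvas, ident):
    """Recover the screen frame image from the decoded fixed-resolution
    canvas using the identification information: the coordinates of the
    frame's anchor point and its resolution."""
    x, y = ident["x"], ident["y"]
    w, h = ident["width"], ident["height"]
    # take h rows starting at y, and w pixels of each row starting at x
    return [row[x:x + w] for row in canvas[y:y + h]]
```

Since the canvas resolution never changes, the decoder likewise runs at a single fixed resolution; only this cheap crop varies from frame to frame.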
According to this embodiment of the application, the screen frame image is obtained by decoding the canvas with the fixed resolution, so the decoding resolution does not need to be adjusted in real time according to the resolution of the screen frame image. This avoids the screen-sharing instability caused by repeatedly restarting the decoder, improves both the stability and the real-time performance of screen sharing, and guarantees the fidelity of the screen frame image.
In one embodiment, the location information includes a resolution of the screen frame image;
in step S304, the displaying of the screen frame image by the sharing receiving terminal includes:
step S3021: creating a rendering window, and adjusting the stacking relationship between the screen frame image and the rendering window according to the resolution of the screen frame image and the resolution of the rendering window, so that the screen frame image is displayed in the rendering window at its own resolution, without scaling.
In one embodiment, the resolution of the screen frame image includes a first number of horizontal pixels and a first number of vertical pixels; the resolution of the rendering window comprises a second horizontal pixel number and a second vertical pixel number;
in step S3021, the sharing receiving terminal adjusts a stacking relationship between the screen frame image and the rendering window according to the resolution of the screen frame image and the resolution of the rendering window, so that the screen frame image is displayed in the rendering window without scaling according to the resolution of the screen frame image, and includes:
step S30211: when the second horizontal pixel point number is greater than or equal to the first horizontal pixel point number and the second vertical pixel point number is greater than or equal to the first vertical pixel point number, stacking the screen frame image on the rendering window, so that the screen frame image is displayed in the rendering window;
Step S30212: when the number of second horizontal pixel points is smaller than the number of first horizontal pixel points and/or the number of second vertical pixel points is smaller than the number of first vertical pixel points, the rendering window is slidably stacked on the screen frame image, so that the screen frame image is partially displayed in the rendering window; in response to the screen frame image being slid within the rendering window, the corresponding portion of the screen frame image after sliding is shown in the rendering window.
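The two stacking cases of steps S30211 and S30212 reduce to a simple per-dimension comparison; a Python sketch of the decision (the return labels are illustrative, not from the patent):

```python
def choose_display(img_w, img_h, win_w, win_h):
    """Decide how to stack the unscaled frame and the rendering window:
    show the whole frame when the window is at least as large in both
    dimensions (S30211), otherwise show a scrollable partial view
    (S30212)."""
    if win_w >= img_w and win_h >= img_h:
        return "frame-over-window"   # whole frame visible, no scaling
    return "window-over-frame"       # partial view; sliding reveals the rest
```

Note that a single dimension being too small (e.g. a tall frame in a wide window) is enough to trigger the scrollable case, which matches the "and/or" wording of step S30212.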
According to the embodiment of the application, the stacking relation between the screen frame image and the rendering window is adjusted, so that the screen frame image can be completely and clearly seen in the rendering window, and detail loss caused by scaling of the screen frame image is prevented.
The screen sharing method above is described from the side of the sharing receiving terminal. For a specific implementation, reference may be made to the description of the steps executed by the sharing receiving terminal in the first embodiment, which is not repeated here.
Referring to fig. 18, fig. 18 is a schematic block diagram of a screen sharing system according to a fourth embodiment of the present application, where the screen sharing system 400 includes a sharing publishing terminal 401 and a sharing receiving terminal 402.
The sharing publishing terminal 401 responds to a triggering operation of screen sharing to acquire a screen frame image corresponding to a screen area to be shared; creating a canvas with fixed resolution, pasting the screen frame image onto the canvas, and obtaining identification information of the screen frame image on the canvas; coding the canvas and the identification information according to a coding mode corresponding to the resolution of the canvas to obtain video coding data to be shared; transmitting the video coding data to at least one sharing receiving terminal; wherein the identification information includes location information of the screen frame image in the canvas;
the sharing receiving terminal 402 receives the video encoding data, and decodes the video encoding data in a decoding manner corresponding to the resolution of the canvas to obtain the canvas and the identification information; and obtaining the screen frame image from the canvas according to the identification information, and displaying the screen frame image.
The embodiment describes a screen sharing method from the perspective of a system. For a specific implementation manner, reference may be made to the related description of the execution steps of the screen sharing method in the first embodiment, which is not described herein.
Referring to fig. 19, fig. 19 is a schematic block diagram of a screen sharing device 500 according to a fifth embodiment of the present application, where the screen sharing device 500 is applied to a sharing distribution terminal, and the screen sharing device 500 includes:
the screen frame image obtaining module 501 is configured to obtain a screen frame image corresponding to a screen region to be shared in response to a trigger operation of screen sharing;
an identification information obtaining module 502, configured to create a canvas with a fixed resolution, paste the screen frame image onto the canvas, and obtain identification information of the screen frame image on the canvas; wherein the identification information includes location information of the screen frame image in the canvas;
the encoded data obtaining module 503 is configured to encode the canvas and the identification information according to an encoding mode corresponding to a resolution of the canvas, so as to obtain video encoded data to be shared;
and the data sending module 504 is configured to send the video encoded data to at least one sharing receiving terminal, so that the sharing receiving terminal decodes the video encoded data according to a decoding mode corresponding to the canvas resolution, and obtains a screen frame image.
It should be noted that, when the screen sharing device provided in the foregoing embodiment performs the screen sharing method, the division into the functional modules described above is merely illustrative; in practical applications, the functions may be allocated to different functional modules as needed, that is, the internal structure of the device may be divided into different functional modules to complete all or part of the functions described above. In addition, the screen sharing device provided in the foregoing embodiment belongs to the same concept as the method executed by the sharing publishing terminal in the screen sharing method of the first embodiment; its specific implementation process is detailed in the method embodiment and is not repeated here.
Specifically, referring to fig. 20, fig. 20 is a schematic block diagram of a screen sharing device according to a sixth embodiment of the present application, where the screen sharing device may be implemented as all or part of a computer device through software, hardware, or a combination of both. The screen sharing device 600 is applied to a sharing receiving terminal, and the screen sharing device 600 includes:
a data receiving module 601, configured to receive video encoded data; wherein the video coding data comprises canvas and identification information of screen frame images on the canvas; the canvas resolution is fixed, and the identification information comprises position information of the screen frame image in the canvas;
The decoding module 602 is configured to decode the video encoded data according to a decoding manner corresponding to the resolution of the canvas, to obtain the canvas and the identification information of the screen frame image on the canvas;
a screen frame acquisition module 603 for acquiring a screen frame image from the canvas according to the identification information;
and the display module 604 is used for displaying the screen frame image.
It should be noted that, when the screen sharing device provided in the foregoing embodiment performs the screen sharing method, the division into the functional modules described above is merely illustrative; in practical applications, the functions may be allocated to different functional modules as needed, that is, the internal structure of the device may be divided into different functional modules to complete all or part of the functions described above. In addition, the screen sharing device provided in the foregoing embodiment belongs to the same concept as the method executed by the sharing receiving terminal in the screen sharing method of the first embodiment; its specific implementation process is detailed in the method embodiment and is not repeated here.
The embodiments of the screen sharing device of the fifth embodiment and the sixth embodiment of the present application may be applied to a computer device, and the device embodiments may be implemented by software, by hardware, or by a combination of hardware and software. Taking software implementation as an example, a device in the logical sense is formed by the processor of the computer device where the device is located reading the corresponding computer program instructions from the non-volatile storage into the memory and running them. At the hardware level, the computer device where the devices reside may include a processor, a network interface, a memory, and a non-volatile storage, coupled to each other via a data bus or other well-known means.
Fig. 21 is a schematic structural diagram of a computer device according to a seventh embodiment of the present application. As shown in fig. 21, the computer device 700 may include: a processor 701, a network interface 702, a memory 703 and a non-volatile storage 704, mutually coupled via a data bus 705. In addition to the processor 701, the network interface 702, the memory 703 and the non-volatile storage 704 shown in fig. 21, the computer device described in the present application may, depending on its actual functions, further include other hardware, which is not described here. A computer program runs on the memory 703 or the non-volatile storage 704, for example: a screen sharing method; the processor 701 implements the steps of the first to third embodiments described above when executing the computer program. The computer device also serves as a carrier for the screen sharing devices of the fifth and sixth embodiments.
The processor 701 may include one or more processing cores. The processor 701 connects the various parts of the computer device 700 via various interfaces and lines, and performs the functions of the computer device 700 and processes data by running or executing the instructions, programs, code sets, or instruction sets stored in the memory 703 or the non-volatile storage 704 and invoking the data stored therein. Optionally, the processor 701 may be implemented in at least one hardware form among digital signal processing (Digital Signal Processing, DSP), field-programmable gate array (Field-Programmable Gate Array, FPGA), and programmable logic array (Programmable Logic Array, PLA). The processor 701 may integrate one or a combination of a central processing unit (Central Processing Unit, CPU), a graphics processing unit (Graphics Processing Unit, GPU), a modem, and the like. The CPU mainly handles the operating system, the user interface, applications, and the like; the GPU is used for rendering and drawing the content to be displayed on the touch display screen; and the modem is used to handle wireless communications. It will be appreciated that the modem may also not be integrated into the processor 701 and may instead be implemented by a separate chip.
The memory 703 may include a random access memory (Random Access Memory, RAM) or a read-only memory (Read-Only Memory, ROM). The memory 703 may be used to store instructions, programs, code, code sets, or instruction sets. The memory 703 may include a program storage area and a data storage area, where the program storage area may store instructions for implementing an operating system, instructions for at least one function (such as a touch function), instructions for implementing the various method embodiments described above, and the like; and the data storage area may store the data referred to in the above method embodiments. Optionally, the memory 703 may also be at least one storage device located remotely from the processor 701.
The eighth embodiment of the present application further provides a computer storage medium storing a plurality of instructions adapted to be loaded and executed by a processor; for the specific implementation procedure, refer to the detailed description of the foregoing embodiments, which is not repeated here.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-described division of the functional units and modules is illustrated, and in practical application, the above-described functional distribution may be performed by different functional units and modules according to needs, i.e. the internal structure of the apparatus is divided into different functional units or modules to perform all or part of the above-described functions. The functional units and modules in the embodiment may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit, where the integrated units may be implemented in a form of hardware or a form of a software functional unit. In addition, specific names of the functional units and modules are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present application. The specific working process of the units and modules in the above system may refer to the corresponding process in the foregoing method embodiment, which is not described herein again.
In the foregoing embodiments, each embodiment is described with its own emphasis; for parts not detailed or described in one embodiment, reference may be made to the related descriptions of the other embodiments.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
In the embodiments provided in the present invention, it should be understood that the disclosed apparatus/terminal device and method may be implemented in other manners. For example, the apparatus/terminal device embodiments described above are merely illustrative, e.g., the division of modules or units is merely a logical function division, and there may be additional divisions when actually implemented, e.g., multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed may be an indirect coupling or communication connection via interfaces, devices or units, which may be in electrical, mechanical or other forms.
The units described as separate units may or may not be physically separate, and units shown as units may or may not be physical units, may be located in one place, or may be distributed over a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in the embodiments of the present invention may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
The integrated modules/units, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a computer readable storage medium. Based on such understanding, the present invention may implement all or part of the flow of the method of the above embodiment, or may be implemented by a computer program to instruct related hardware, where the computer program may be stored in a computer readable storage medium, and when the computer program is executed by a processor, the steps of each method embodiment described above may be implemented. Wherein the computer program comprises computer program code, which may be in the form of source code, object code, executable files or in some intermediate form, etc.
The present invention is not limited to the above-described embodiments; any modifications or variations that do not depart from the spirit and scope of the present invention are intended to fall within the scope of the claims and their equivalents.

Claims (18)

1. A screen sharing method, characterized by comprising the following steps:
a sharing publishing terminal, in response to a triggering operation of screen sharing, acquires a screen frame image corresponding to a screen area to be shared; creates a canvas with a fixed resolution, pastes the screen frame image onto the canvas, and obtains identification information of the screen frame image on the canvas; encodes the canvas and the identification information in an encoding mode corresponding to the resolution of the canvas to obtain video encoding data to be shared; and transmits the video encoding data to at least one sharing receiving terminal; wherein the identification information includes position information of the screen frame image in the canvas; each frame of the screen frame image corresponds to one canvas; and the canvas is a blank canvas;
the sharing receiving terminal receives the video coding data, decodes the video coding data in a decoding mode corresponding to the resolution of the canvas, and obtains the canvas and the identification information; and obtaining the screen frame image from the canvas according to the identification information, and displaying the screen frame image.
2. The screen sharing method of claim 1, wherein:
and the resolution of the canvas is the resolution corresponding to the full-screen desktop of the sharing and publishing terminal.
3. The screen sharing method of claim 1, wherein,
the step of the sharing publishing terminal encoding the canvas and the identification information according to an encoding mode corresponding to the resolution of the canvas to obtain video encoding data to be shared comprises the following steps:
coding the canvas by adopting a fixed coding resolution to obtain first coding data; wherein the encoding resolution is the same size as the resolution of the canvas;
coding the identification information in an SEI mode to obtain second coded data;
and packaging the first coded data and the second coded data to obtain the video coded data to be shared.
4. The screen sharing method of claim 1, wherein:
the position information includes a resolution of the screen frame image;
the sharing receiving terminal displays the screen frame image, including:
and creating a rendering window, and adjusting the layering relation between the screen frame image and the rendering window according to the resolution of the screen frame image and the resolution of the rendering window, so that the screen frame image is displayed in the rendering window without scaling according to the resolution of the screen frame image.
5. The screen sharing method of claim 4, wherein:
the resolution of the screen frame image comprises a first horizontal pixel number and a first vertical pixel number;
the resolution of the rendering window comprises a second horizontal pixel number and a second vertical pixel number;
the sharing receiving terminal adjusts the layering relation between the screen frame image and the rendering window according to the resolution of the screen frame image and the resolution of the rendering window, so that the screen frame image is displayed in the rendering window without scaling according to the resolution of the screen frame image, and the sharing receiving terminal comprises:
when the second horizontal pixel point number is greater than or equal to the first horizontal pixel point number and the second vertical pixel point number is greater than or equal to the first vertical pixel point number, stacking the screen frame image on the rendering window, so that the screen frame image is displayed in the rendering window;
when the number of the second horizontal pixel points is smaller than the number of the first horizontal pixel points and/or the number of the second vertical pixel points is smaller than the number of the first vertical pixel points, the rendering window is overlapped on the screen frame image in a sliding mode, and the screen frame image is partially displayed in the rendering window; when responding to the sliding of the screen frame image in the rendering window, the corresponding position of the screen frame image after the sliding is shown in the rendering window.
6. The screen sharing method of claim 5, wherein:
the sharing receiving terminal is provided with a viewing mode selection control; the viewing mode selection control includes a first mode control; the first mode control is used for indicating that the screen frame image is displayed without scaling according to the resolution of the screen frame image;
the sharing receiving terminal adjusts the layering relation between the screen frame image and the rendering window according to the resolution of the screen frame image and the resolution of the rendering window, so that before the screen frame image is displayed in the rendering window according to the resolution of the screen frame image in a non-scaling way, the sharing receiving terminal further comprises: and receiving triggering operation of the first mode control.
7. The screen sharing method of claim 6, wherein:
the viewing mode selection control further comprises a second mode control, wherein the second mode control is used for indicating to enlarge and display the screen frame image;
the screen sharing method further comprises the following steps: the sharing receiving terminal receives the triggering operation of the second mode control, stacks the screen frame image on the rendering window, keeps the ratio of the first horizontal pixel points to the first vertical pixel points unchanged, expands the screen frame image according to a first preset scaling factor, and expands and displays the screen frame image in the rendering window;
And/or the viewing mode selection control further comprises a third mode control, wherein the third mode control is used for indicating to display the screen frame image in a shrinking mode;
the screen sharing method further comprises the following steps: the sharing receiving terminal receives triggering operation of the third mode control, stacks the screen frame image on the rendering window, keeps the ratio of the first horizontal pixel points to the first vertical pixel points unchanged, and reduces the screen frame image according to a second preset scaling factor to enable the screen frame image to be reduced and displayed in the rendering window;
and/or the viewing mode selection control further comprises a fourth mode control, wherein the fourth mode control is used for indicating to display the screen frame image in a preset proportion;
the screen sharing method further comprises the following steps: and the sharing receiving terminal receives the triggering operation of the fourth mode control, stacks the screen frame images on the rendering window, keeps the ratio of the first horizontal pixel points to the first vertical pixel points unchanged, reduces or amplifies the screen frame images according to a third preset scaling factor, and displays the screen frame images in the rendering window according to the preset ratio.
8. The screen sharing method of claim 6, wherein:
the viewing mode selection control further includes a fifth mode control; the fifth mode control is used for indicating to display the screen frame image with the resolution ratio matched with the rendering window;
the screen sharing method further comprises the following steps: and the sharing receiving terminal receives the triggering operation of the fifth mode control, stacks the screen frame image on the rendering window, keeps the ratio of the first horizontal pixel point number to the first vertical pixel point number unchanged, expands or reduces the screen frame image, and enables the screen frame image to be displayed in the rendering window with the resolution ratio matched with the rendering window.
9. The screen sharing method according to any one of claims 4 to 8, wherein,
the position information also comprises coordinates of preset points of the screen frame image in the canvas;
the sharing receiving terminal obtains the screen frame image from the canvas according to the identification information, and the method comprises the following steps:
and intercepting and obtaining the screen frame image from the canvas according to the coordinates of the preset point of the screen frame image in the canvas and the resolution of the screen frame image.
10. A screen sharing method, applied to a sharing publishing terminal, characterized by comprising the following steps:
responding to a triggering operation of screen sharing, and acquiring a screen frame image corresponding to a screen area to be shared;
creating a canvas with fixed resolution, pasting the screen frame image onto the canvas, and obtaining identification information of the screen frame image on the canvas; wherein the identification information includes location information of the screen frame image in the canvas; wherein each frame of the screen frame image corresponds to one canvas; the canvas is a blank canvas;
coding the canvas and the identification information according to a coding mode corresponding to the resolution of the canvas to obtain video coding data to be shared;
and transmitting the video coding data to at least one sharing receiving terminal, so that the sharing receiving terminal decodes the video coding data according to a decoding mode corresponding to the canvas resolution, and a screen frame image is obtained.
11. A screen sharing method, applied to a sharing receiving terminal, characterized by comprising the following steps:
receiving video encoding data; wherein the video coding data comprises canvas and identification information of screen frame images on the canvas; the canvas resolution is fixed, and the identification information comprises position information of the screen frame image in the canvas; wherein each frame of the screen frame image corresponds to one canvas; the canvas is a blank canvas;
Decoding the video coding data according to a decoding mode corresponding to the resolution of the canvas to obtain the identification information of the canvas and the screen frame image on the canvas;
obtaining a screen frame image from the canvas according to the identification information;
and displaying the screen frame image.
12. The screen sharing method of claim 11, wherein:
the position information includes a resolution of the screen frame image;
the displaying the screen frame image includes:
and creating a rendering window, and adjusting the layering relation between the screen frame image and the rendering window according to the resolution of the screen frame image and the resolution of the rendering window, so that the screen frame image is displayed in the rendering window without scaling according to the resolution of the screen frame image.
13. The screen sharing method of claim 12, wherein:
the resolution of the screen frame image comprises a first horizontal pixel number and a first vertical pixel number;
the resolution of the rendering window comprises a second horizontal pixel number and a second vertical pixel number;
the adjusting the layering relation between the screen frame image and the rendering window according to the resolution of the screen frame image and the resolution of the rendering window, so that the screen frame image is displayed in the rendering window without scaling according to the resolution of the screen frame image, comprises the following steps:
When the second horizontal pixel point number is greater than or equal to the first horizontal pixel point number and the second vertical pixel point number is greater than or equal to the first vertical pixel point number, stacking the screen frame image on the rendering window, so that the screen frame image is displayed in the rendering window;
when the number of the second horizontal pixel points is smaller than the number of the first horizontal pixel points and/or the number of the second vertical pixel points is smaller than the number of the first vertical pixel points, the rendering window is overlapped on the screen frame image in a sliding mode, and the screen frame image is partially displayed in the rendering window; when responding to the sliding of the screen frame image in the rendering window, the corresponding position of the screen frame image after the sliding is shown in the rendering window.
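The comparison in claim 13 amounts to a two-way decision on pixel counts. A minimal sketch, assuming nothing beyond the claim's own comparison (the mode names returned are hypothetical labels, not terms from the patent):

```python
def layering_mode(frame_w, frame_h, window_w, window_h):
    """Decide how to layer the screen frame image and the rendering
    window so the frame is shown without scaling."""
    if window_w >= frame_w and window_h >= frame_h:
        # Window can hold the whole frame: stack the frame on the window.
        return "frame_on_window"
    # Window is smaller in at least one dimension: slidably overlay the
    # window on the frame and show only the part under the window.
    return "window_on_frame_slidable"


print(layering_mode(1280, 720, 1920, 1080))   # fits fully
print(layering_mode(1920, 1080, 1280, 1080))  # too wide, needs sliding
```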
14. A screen sharing system, comprising a sharing release terminal and a sharing receiving terminal, characterized in that:
the sharing release terminal, in response to a triggering operation of screen sharing, acquires a screen frame image corresponding to a screen area to be shared; creates a canvas with a fixed resolution, pastes the screen frame image onto the canvas, and obtains identification information of the screen frame image on the canvas; codes the canvas and the identification information according to a coding mode corresponding to the resolution of the canvas to obtain video coding data to be shared; and transmits the video coding data to at least one sharing receiving terminal; wherein the identification information comprises position information of the screen frame image in the canvas; each frame of the screen frame image corresponds to one canvas; and the canvas is a blank canvas;
the sharing receiving terminal receives the video coding data, decodes the video coding data according to a decoding mode corresponding to the resolution of the canvas to obtain the canvas and the identification information, obtains the screen frame image from the canvas according to the identification information, and displays the screen frame image.
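The two-terminal round trip of claim 14 can be sketched end to end, with the video codec abstracted away. All names (`publish`, `receive`, the `x`/`y`/`width`/`height` metadata fields) are illustrative assumptions; the point shown is that the canvas size never changes while the frame size may.

```python
import json

# Fixed canvas resolution: one encoder/decoder configuration serves every frame.
CANVAS_W, CANVAS_H = 16, 9


def publish(frame):
    """Sharing release terminal: paste the frame onto a blank
    fixed-resolution canvas and record its position as metadata."""
    h, w = len(frame), len(frame[0])
    canvas = [[0] * CANVAS_W for _ in range(CANVAS_H)]
    for y in range(h):
        canvas[y][:w] = frame[y]  # paste at the top-left corner
    info = {"x": 0, "y": 0, "width": w, "height": h}
    return canvas, json.dumps(info)  # a real system would video-encode these


def receive(canvas, info_json):
    """Sharing receiving terminal: crop the frame back out of the canvas."""
    info = json.loads(info_json)
    x, y = info["x"], info["y"]
    w, h = info["width"], info["height"]
    return [row[x:x + w] for row in canvas[y:y + h]]


frame = [[1] * 12 for _ in range(7)]  # a 12x7 "screen frame image"
canvas, info = publish(frame)
assert receive(canvas, info) == frame  # round trip recovers the frame exactly
```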
15. A screen sharing device, applied to a sharing release terminal, characterized in that the device comprises:
a screen frame image acquisition module, used for acquiring, in response to a triggering operation of screen sharing, a screen frame image corresponding to a screen area to be shared;
an identification information acquisition module, used for creating a canvas with a fixed resolution, pasting the screen frame image onto the canvas, and obtaining identification information of the screen frame image on the canvas; wherein the identification information comprises position information of the screen frame image in the canvas; each frame of the screen frame image corresponds to one canvas; and the canvas is a blank canvas;
an encoding data acquisition module, used for coding the canvas and the identification information according to a coding mode corresponding to the resolution of the canvas to obtain video coding data to be shared; and
a data transmitting module, used for transmitting the video coding data to at least one sharing receiving terminal, so that the sharing receiving terminal decodes the video coding data according to a decoding mode corresponding to the resolution of the canvas to obtain the screen frame image.
16. A screen sharing device, applied to a sharing receiving terminal, characterized in that the device comprises:
a data receiving module, used for receiving video coding data; wherein the video coding data comprises a canvas and identification information of a screen frame image on the canvas; the resolution of the canvas is fixed, and the identification information comprises position information of the screen frame image in the canvas; each frame of the screen frame image corresponds to one canvas; and the canvas is a blank canvas;
a decoding module, used for decoding the video coding data according to a decoding mode corresponding to the resolution of the canvas to obtain the canvas and the identification information of the screen frame image on the canvas;
a screen frame acquisition module, used for obtaining the screen frame image from the canvas according to the identification information; and
a display module, used for displaying the screen frame image.
17. A computer device, comprising a processor and a memory, wherein a computer program is stored in the memory, and the computer program is adapted to be loaded by the processor to perform the screen sharing method according to any one of claims 1 to 13.
18. A computer-readable storage medium having a computer program stored thereon, wherein the computer program, when executed by a processor, implements the screen sharing method according to any one of claims 1 to 13.
CN202110849522.9A 2021-07-27 2021-07-27 Screen sharing method, device, system, storage medium and computer equipment Active CN113596571B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110849522.9A CN113596571B (en) 2021-07-27 2021-07-27 Screen sharing method, device, system, storage medium and computer equipment


Publications (2)

Publication Number Publication Date
CN113596571A CN113596571A (en) 2021-11-02
CN113596571B true CN113596571B (en) 2024-03-12

Family

ID=78250298

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110849522.9A Active CN113596571B (en) 2021-07-27 2021-07-27 Screen sharing method, device, system, storage medium and computer equipment

Country Status (1)

Country Link
CN (1) CN113596571B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114356263A (en) * 2021-12-29 2022-04-15 威创集团股份有限公司 Bar screen information display method, bar screen information display device, bar screen information display equipment and readable storage medium
CN115209117B (en) * 2022-07-20 2024-06-18 北京字跳网络技术有限公司 Screen projection method and device

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9451197B1 (en) * 2010-04-12 2016-09-20 UV Networks, Inc. Cloud-based system using video compression for interactive applications
CN110770785A (en) * 2017-06-29 2020-02-07 皇家Kpn公司 Screen sharing for display in VR
CN110806846A (en) * 2019-10-11 2020-02-18 北京字节跳动网络技术有限公司 Screen sharing method, screen sharing device, mobile terminal and storage medium
CN110852946A (en) * 2019-10-30 2020-02-28 北京字节跳动网络技术有限公司 Picture display method and device and electronic equipment
CN111694603A (en) * 2019-03-12 2020-09-22 腾讯科技(深圳)有限公司 Screen sharing method and device, computer equipment and storage medium
CN112367521A (en) * 2020-10-27 2021-02-12 广州华多网络科技有限公司 Display screen content sharing method and device, computer equipment and storage medium

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2493551A (en) * 2011-08-11 2013-02-13 Dexdyne Ltd Displaying plotted trend data at varying resolutions



Similar Documents

Publication Publication Date Title
US9723359B2 (en) Low latency wireless display for graphics
US11120677B2 (en) Transcoding mixing and distribution system and method for a video security system
CN106792092B (en) Live video stream split-mirror display control method and corresponding device thereof
US8115800B2 (en) Server apparatus and video delivery method
WO2018121014A1 (en) Video play control method and apparatus and terminal device
CN113596571B (en) Screen sharing method, device, system, storage medium and computer equipment
CN112235626A (en) Video rendering method and device, electronic equipment and storage medium
US9100543B2 (en) Method and system for controlling video structure of video conference system
CN113395477B (en) Sharing method and device based on video conference, electronic equipment and computer medium
US20040001091A1 (en) Method and apparatus for video conferencing system with 360 degree view
CN112203124B (en) Display device and control method thereof
CN102770827A (en) Method for displaying multimedia content on a screen of a terminal
CN111741343B (en) Video processing method and device and electronic equipment
WO2023193524A1 (en) Live streaming video processing method and apparatus, electronic device, computer-readable storage medium, and computer program product
CN114095772B (en) Virtual object display method, system and computer equipment under continuous wheat direct sowing
US11748915B2 (en) VR image compression transmission method and system
CN107872683B (en) Video data processing method, device, equipment and storage medium
CN116248889A (en) Image encoding and decoding method and device and electronic equipment
CN111385590A (en) Live broadcast data processing method and device and terminal
CN116170636A (en) Live video playing method and device, equipment and medium thereof
CN110990109B (en) Spliced screen back display method, terminal, system and storage medium
US20110181503A1 (en) Reproduction device, reproduction system and non-transitory computer-readable storage medium
US20170374368A1 (en) Video Processor, Method, Computer Program
CN116016972A (en) Live broadcasting room beautifying method, device and system, storage medium and electronic equipment
JP6431301B2 (en) Movie processing apparatus, method, and computer program

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant