CN113094011B - Screen sharing method, device, equipment and computer readable storage medium - Google Patents

Screen sharing method, device, equipment and computer readable storage medium

Info

Publication number
CN113094011B
CN113094011B (application CN202110326820.XA)
Authority
CN
China
Prior art keywords
shared
data
terminals
sharing
access terminal
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110326820.XA
Other languages
Chinese (zh)
Other versions
CN113094011A (en)
Inventor
刘金
马岩
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Lenovo Beijing Ltd
Original Assignee
Lenovo Beijing Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Lenovo Beijing Ltd filed Critical Lenovo Beijing Ltd
Priority to CN202110326820.XA priority Critical patent/CN113094011B/en
Publication of CN113094011A publication Critical patent/CN113094011A/en
Application granted granted Critical
Publication of CN113094011B publication Critical patent/CN113094011B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14 Digital output to display device; Cooperation and interconnection of the display device with other functional units
    • G06F3/1454 Digital output to display device; Cooperation and interconnection of the display device with other functional units involving copying of the display data of a local workstation or window to a remote workstation or window so that an actual copy of the data is displayed simultaneously on two or more displays, e.g. teledisplay
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/25 Fusion techniques

Abstract

The embodiments of the present application provide a screen sharing method, device and equipment, and a computer readable storage medium. The method includes: obtaining the data to be shared of at least two sharing terminals through a network; determining the number of access terminals; determining a fusion mode for the data to be shared according to the number of access terminals; fusing the data to be shared of the at least two sharing terminals in the determined fusion mode to obtain the shared data corresponding to each access terminal; and displaying the content corresponding to the shared data on each access terminal. Because the data to be shared are fused according to the number of access terminals and the fused shared data are displayed on every access terminal, the access terminals can see the content shared by several sharing terminals at once during screen sharing, which extends the screen sharing function and improves screen sharing efficiency.

Description

Screen sharing method, device, equipment and computer readable storage medium
Technical Field
The embodiments of the present application relate to the technical field of terminals, and in particular to a screen sharing method, device, equipment and computer readable storage medium.
Background
At present, only one user at a time can share content during conference screen sharing; if one user is already sharing and another user also wants to share, the latter must either wait for the current user to finish or directly replace the current user's sharing. In a conference room, even when the conference device is connected to two display screens, the shared conference content can only be displayed on one of them.
In the related art, conference software does not support multiple users sharing content simultaneously, the shared content can only be displayed on a single display screen, and the conference receiving terminals cannot view all the shared content at the same time, which reduces the efficiency of conference communication.
Disclosure of Invention
Based on the problems existing in the related art, embodiments of the present application provide a method, an apparatus, a device, and a computer readable storage medium for sharing a screen.
The technical scheme of the embodiment of the application is realized as follows:
in a first aspect, an embodiment of the present application provides a screen sharing method, including:
acquiring data to be shared of at least two sharing terminals through a network;
determining a number of access terminals;
determining a fusion mode of the data to be shared according to the number of the access terminals;
fusing the data to be shared of the at least two sharing terminals by adopting the determined fusion mode to obtain the sharing data corresponding to each access terminal;
and displaying the content corresponding to the shared data on each access terminal.
In a second aspect, an embodiment of the present application provides a screen sharing device, including:
the acquisition module is used for acquiring data to be shared of at least two sharing terminals through a network;
a first determining module configured to determine a number of access terminals;
the second determining module is used for determining a fusion mode of the data to be shared according to the number of the access terminals;
the fusion module is used for fusing the data to be shared of the at least two sharing terminals by adopting the determined fusion mode to obtain the sharing data corresponding to each access terminal;
and the display module is used for displaying the content corresponding to the sharing data on each access terminal.
In a third aspect, an embodiment of the present application provides a screen sharing device, including:
a memory for storing executable instructions; and the processor is used for realizing the screen sharing method when executing the executable instructions stored in the memory.
In a fourth aspect, an embodiment of the present application provides a computer readable storage medium storing executable instructions for implementing the above-mentioned screen sharing method when a processor executes the executable instructions.
According to the screen sharing method, device, equipment and computer readable storage medium of the embodiments of the present application, the data to be shared of a plurality of sharing terminals and the number of access terminals are obtained through a network, and the data to be shared are fused according to the number of access terminals to produce as many pieces of shared data as there are access terminals, so that each access terminal displays its corresponding shared data. Because the data to be shared are fused according to the number of access terminals and the fused shared data are displayed on every access terminal, the access terminals can see the content shared by several sharing terminals at once during screen sharing, which extends the screen sharing function and improves screen sharing efficiency.
Drawings
Fig. 1 is a schematic diagram of an application scenario of a screen sharing method provided in an embodiment of the present application;
Fig. 2 is a schematic flowchart of an alternative screen sharing method provided in an embodiment of the present application;
Fig. 3 is a schematic flowchart of an alternative screen sharing method provided in an embodiment of the present application;
Fig. 4 is a schematic flowchart of an alternative screen sharing method provided in an embodiment of the present application;
Fig. 5 is a schematic flowchart of an alternative screen sharing method provided in an embodiment of the present application;
Fig. 6 is a schematic flowchart of an alternative screen sharing method provided in an embodiment of the present application;
Fig. 7 is a schematic diagram of a screen sharing method provided in an embodiment of the present application;
Fig. 8 is a schematic diagram of a screen sharing method provided in an embodiment of the present application;
Fig. 9 is a schematic diagram of a screen sharing method provided in an embodiment of the present application;
Fig. 10 is a schematic diagram of the composition structure of a screen sharing device provided in an embodiment of the present application;
Fig. 11 is a schematic diagram of the composition structure of a screen sharing device provided in an embodiment of the present application.
Detailed Description
To describe the objects, technical solutions and advantages of the embodiments of the present application more clearly, the embodiments are described in detail below with reference to the accompanying drawings. It should be understood that the following description of the embodiments is intended to illustrate the general concepts of the embodiments of the present application and should not be construed as limiting them. In the description and drawings, the same or similar reference numerals refer to the same or similar parts or components. For clarity, the drawings are not necessarily drawn to scale, and some well-known components and structures may be omitted from them.
Based on the problems in the related art, the embodiments of the present application provide a screen sharing method which includes: acquiring data to be shared of at least two sharing terminals through a network; determining the number of access terminals; determining a fusion mode for the data to be shared according to that number; fusing the data to be shared of the at least two sharing terminals in the determined fusion mode to obtain the shared data corresponding to each access terminal; and displaying the content corresponding to the shared data on each access terminal. Because the data to be shared are fused according to the numbers of sharing terminals and access terminals, and the fused shared data are displayed on every access terminal, the content shared by multiple users can be seen during screen sharing and every access terminal can display shared content, which extends the conference function and improves conference efficiency.
An exemplary application of the screen sharing device provided in the embodiment of the present application is described below, where the screen sharing device provided in the embodiment of the present application may be implemented as various types of terminals such as a notebook computer, a tablet computer, a desktop computer, a mobile device, and the like, and may also be implemented as a server. Next, an exemplary application when the screen sharing apparatus is implemented as a server will be described.
Fig. 1 is a schematic diagram of an application scenario of the screen sharing method provided in an embodiment of the present application. As shown in Fig. 1, the screen sharing system 10 includes a sharing terminal 100 and a sharing terminal 101, an access terminal 102 and an access terminal 103, a network 200, and a server 300. Each sharing terminal (for example, the sharing terminal 100 and the sharing terminal 101) corresponds to one piece of data to be shared, which the server acquires by the method of the embodiments of the present application when data sharing is performed. During screen sharing, the server 300 determines, through the network 200, the number of pieces of data to be shared corresponding to the at least two sharing terminals and the number of access terminals (for example, the access terminal 102 and the access terminal 103), determines a fusion mode for the data to be shared according to these two numbers, fuses the data to be shared of the at least two sharing terminals in the determined fusion mode to obtain the shared data corresponding to each access terminal, and sends the shared data to each access terminal (for example, the access terminal 102 and the access terminal 103) through the network 200, thereby realizing screen sharing. After the access terminals (e.g., the access terminal 102 and the access terminal 103) receive the shared data, they can display the received shared data directly on their display interfaces 102-1 and 103-1.
Referring to fig. 2, fig. 2 is a schematic flowchart of an alternative screen sharing method according to an embodiment of the present application, and will be described with reference to the steps shown in fig. 2.
Step S201, obtaining data to be shared of at least two sharing terminals through a network.
Here, the sharing terminals are terminals that want to share content, and each sharing terminal corresponds to one piece of data to be shared. In some embodiments, the server may obtain the data to be shared from the sharing terminals through a network.
Step S202, determining the number of access terminals.
Here, an access terminal is a terminal that receives data to be shared and displays the content corresponding to the data to be shared. In some embodiments, there may be multiple access terminals; for example, when a network conference is held and the conference access side is equipped with three displays for showing the accessed data, the number of access terminals is three.
Step S203, determining a fusion mode of the data to be shared according to the number of access terminals.
In the embodiments of the present application, fusing the data to be shared addresses the case in which the number of pieces of data to be shared is greater than the number of access terminals, so that the access terminals cannot each display a separate piece of data to be shared. In this case, according to the number of access terminals, the data to be shared can be fused in a picture-in-picture or parallel-splicing manner, so that the access terminals can display all of the data to be shared.
Step S204, fusing the data to be shared of the at least two sharing terminals by adopting the determined fusion mode to obtain the shared data corresponding to each access terminal.
In the embodiments of the present application, after the data to be shared are fused, the number of pieces of shared data obtained is the same as the number of access terminals, so that each access terminal displays one piece of shared data. For example, in a conference scenario, five sharing terminals send five pieces of data to be shared but there are only two access terminals in the conference room; one of the pieces of data to be shared can then be displayed on one access terminal, while the other four pieces are spliced and fused and displayed on the other access terminal. Alternatively, two pieces of data to be shared can be displayed on one access terminal in a picture-in-picture manner, while the other three pieces are spliced and fused and displayed on the other access terminal.
In some embodiments, the fusion mode may take other forms. For example, in a conference scenario, two sharing terminals send two pieces of data to be shared, and the conference room also has two access terminals; the two pieces of data to be shared can be fused in a picture-in-picture manner and the fused shared data displayed on both displays at the same time, so that the two displays show the same content. The two pieces of data to be shared can also be fused in different ways, with the shared data fused by picture-in-picture displayed on one display and the shared data fused by splicing displayed on the other. The fusion mode and the display mode therefore have many possible variations, and the embodiments of the present application do not limit the fusion mode of the data to be shared or the display mode of the shared data.
Step S205, displaying the content corresponding to the shared data on each access terminal.
In some embodiments, when the access terminal receives the fused data, content corresponding to the shared data is displayed on the access terminal.
In the embodiments of the present application, the data to be shared of at least two sharing terminals are obtained through a network, the number of access terminals is determined, a fusion mode for the data to be shared is determined according to the number of access terminals, the data to be shared of the at least two sharing terminals are fused in the determined fusion mode to obtain the shared data corresponding to each access terminal, and the content corresponding to the shared data is displayed on each access terminal. Because the data to be shared are fused according to the number of access terminals and the fused shared data are displayed on every access terminal, the access terminals can see the content shared by several sharing terminals at once during screen sharing, which extends the screen sharing function and improves screen sharing efficiency.
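The flow of steps S201 to S205 can be outlined in the following Python sketch. It is only an illustrative outline under assumed interfaces, not the implementation of the embodiments of the present application; group_shared_data, fuse and display are hypothetical helper names, and the grouping and fusion helpers are sketched in later sections.

    # Illustrative sketch of steps S201-S205; all helper names are hypothetical.
    def share_screens(data_to_share, access_terminals):
        # S201/S202: data to be shared from at least two sharing terminals,
        # and the number of access terminals that will display content.
        n_terminals = len(access_terminals)

        # S203: group the data to be shared and pick a fusion mode per group,
        # according to how many pieces of data the corresponding terminal shows.
        groups = group_shared_data(data_to_share, n_terminals)

        shared_data = []
        for group in groups:
            mode = "splice" if len(group) > 2 else "picture_in_picture"
            # S204: fuse every piece of data in the group into one shared stream.
            shared_data.append(fuse(group, mode) if len(group) > 1 else group[0])

        # S205: each access terminal displays the content of its shared data.
        for terminal, data in zip(access_terminals, shared_data):
            terminal.display(data)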
In some embodiments, some of the access terminals may be unable to display content, in which case it is necessary to determine the number of effective terminals, i.e., access terminals that have been accessed and are capable of displaying content. Based on the foregoing embodiments, the embodiments of the present application further provide a screen sharing method. Fig. 3 is an optional flowchart of the screen sharing method provided in the embodiments of the present application; on the basis of Fig. 2, step S203 may be implemented by the following steps.
Step S301, based on the number of the sharing terminals, obtaining the number of the data to be shared.
In some embodiments, the number of data to be shared is equal to the number of sharing terminals.
Step S302, when the number of pieces of data to be shared is greater than or equal to the number of accessed effective terminals, grouping the data to be shared according to the number of effective terminals to form as many data groups to be shared as there are effective terminals.
Here, an effective terminal is an access terminal that has been accessed and is capable of displaying data.
In some embodiments, because the number of pieces of data to be shared is greater than the number of effective terminals, grouping the data to be shared means dividing them into as many data groups as there are effective terminals. For example, in a conference there are five pieces of data to be shared, and only two of the three displays connected on the access side can display content, so the five pieces of data to be shared are divided into two groups. One possible grouping puts one piece of data to be shared in the first group and the remaining four pieces in the second group; another puts two pieces in the first group and three pieces in the second group.
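As an illustration of this grouping, the following Python sketch divides a list of data to be shared into as many groups as there are effective terminals. It shows only one possible grouping rule (round-robin assignment), chosen as an assumption for the example; the embodiments of the present application do not prescribe a specific rule, and uneven splits such as one plus four or two plus three are equally valid.

    # Illustrative sketch only: round-robin grouping of the data to be shared.
    def group_shared_data(data_to_share, n_effective_terminals):
        groups = [[] for _ in range(n_effective_terminals)]
        for i, data in enumerate(data_to_share):
            groups[i % n_effective_terminals].append(data)
        return groups

    # Five pieces of data to be shared and two effective terminals give
    # groups of three and two pieces respectively.
    print(group_shared_data(["s1", "s2", "s3", "s4", "s5"], 2))
    # [['s1', 's3', 's5'], ['s2', 's4']]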
Step S303, when a single data group to be shared includes a plurality of pieces of data to be shared, determining that the fusion mode of the data to be shared is to display the video frames corresponding to the plurality of pieces of data to be shared in a spliced manner, or to display them in a nested manner.
In the embodiments of the present application, after the data groups to be shared are formed, the number of pieces of data to be shared in each group is determined, and the fusion mode for the data to be shared in that group is determined according to this number.
Here, the fusion mode may be splicing fusion: when a data group to be shared contains a larger number of pieces of data to be shared, they may be fused by splicing. For example, when a data group to be shared contains four pieces of data to be shared, the video frames of the four pieces can be spliced frame by frame in time order to obtain one piece of spliced shared data.
In some embodiments, the fusion mode may be nested fusion: when a data group to be shared contains a smaller number of pieces of data to be shared, they may be fused by nesting. For example, when a data group to be shared contains two pieces of data to be shared, the video frames of the two pieces can be nested frame by frame in time order, with one piece displayed as a small frame on top of the other and the small frame placed where it does not cover the displayed content, to obtain one piece of nested shared data; which of the two pieces is nested inside the other can be switched.
In the embodiments of the present application, the data to be shared are grouped and fused according to the number of effective terminals, so that the effective access terminals can display all of the videos to be shared, which improves the efficiency of screen sharing.
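A minimal sketch of the two fusion modes described above is given below, using NumPy arrays as stand-ins for video frames. The resizing and placement logic is an assumption made for the example; an actual implementation would normally use a proper video scaler and compositor.

    # Illustrative sketch only: splice (tile) or nest (picture-in-picture)
    # frames represented as H x W x 3 NumPy arrays.
    import numpy as np

    def splice_frames(frames):
        # Parallel splicing: place the frames side by side in one wider frame.
        height = min(f.shape[0] for f in frames)
        return np.hstack([f[:height] for f in frames])

    def nest_frames(main_frame, sub_frame, scale=0.25, margin=10):
        # Picture-in-picture: shrink the sub frame with a crude nearest-neighbour
        # step and overlay it near a corner of the main frame (assumes the main
        # frame is large enough to hold the shrunken sub frame).
        out = main_frame.copy()
        step = max(1, int(1 / scale))
        small = sub_frame[::step, ::step]
        out[margin:margin + small.shape[0], margin:margin + small.shape[1]] = small
        return out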
In some embodiments, the process of fusing the data to be shared may be performed by the access terminal or by a cloud server. Based on the foregoing embodiments, the embodiments of the present application further provide a screen sharing method. Fig. 4 is an optional flowchart of the screen sharing method provided in the embodiments of the present application; on the basis of Fig. 2, step S204 may be implemented by the following steps.
Step S401, fusing, by each access terminal, the data to be shared in the fusion mode determined for the data to be shared, to obtain the shared data corresponding to that access terminal.
In some embodiments, after the data to be shared are grouped, as many data groups to be shared as there are access terminals are obtained and each group is sent to its corresponding access terminal; the access terminal then fuses the data to be shared in its group in the determined fusion mode to obtain the shared data.
Step S402, fusing, by a cloud server, the data to be shared in the fusion mode determined for the data to be shared, to obtain the shared data corresponding to each access terminal.
In some embodiments, the execution body for fusing the data to be shared may also be a cloud server.
In the embodiments of the present application, the access terminal or the cloud server fuses the data to be shared to obtain the shared data corresponding to each access terminal, so that the display mode of the shared data can be adjusted automatically during screen sharing, which improves the sharing experience.
In some embodiments, each access terminal corresponds to a terminal resolution. The cloud server may encode the shared video according to the screen resolutions of the access terminals to obtain shared data at different resolutions, and each access terminal obtains the shared data whose resolution matches its own screen. Based on the foregoing embodiments, the embodiments of the present application further provide a screen sharing method. Fig. 5 is an optional flowchart of the screen sharing method provided in the embodiments of the present application; on the basis of Fig. 4, the following steps may further be included after step S402.
Step S501, obtaining, by the cloud server, the terminal resolution of each access terminal.
Step S502, performing layered encoding on the shared data according to the terminal resolutions of the plurality of access terminals, so that the first encoded video corresponding to the encoded shared data has multiple layers of video resolution.
Here, the terminal resolution may be the resolution of the access terminal display.
In some embodiments, layered encoding of the shared data may be achieved with Scalable Video Coding (SVC), which can divide a video stream into layers with multiple resolutions, qualities and frame rates.
In some embodiments, after the shared data are layer-encoded with SVC, a multi-layer video stream is obtained, consisting of a base layer and several enhancement layers that can raise the resolution, frame rate or quality, so that the encoded shared data have multiple layers of video resolution. Here, the base layer encodes the video stream at its lowest resolution, frame rate and quality, and each enhancement layer encodes additional information of the shared data on top of the base layer, so that a higher-quality or higher-resolution video stream can be reconstructed during decoding.
Step S503, according to the terminal resolution of each access terminal, matching target sharing data in the plurality of encoded sharing data with the multi-layer video resolution.
In some embodiments, matching the target shared data means decoding, from the encoded video stream, the part corresponding to the access terminal's resolution. For example, when the shared data have a resolution of 4K, layered coding produces a base-layer video stream at 720P and several enhancement layers; superimposing the enhancement layers on the base layer yields video streams at 1080P and 4K, and each access terminal decodes the base layer and the enhancement layers appropriate to its own resolution to obtain the target shared data matching that resolution.
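The layer selection described in this example can be illustrated with the following sketch. The layer names B0, S0 and S1 follow the example of Fig. 9 of this application; the data structure and the selection rule are assumptions made for illustration, not a real SVC decoder API.

    # Illustrative sketch only: choose which layers of a layered bitstream a
    # terminal decodes, given the vertical resolution of its display.
    LAYERS = [
        {"name": "B0", "height": 720},     # base layer    -> 720P
        {"name": "S0", "height": 1080},    # B0 + S0       -> 1080P
        {"name": "S1", "height": 2160},    # B0 + S0 + S1  -> 4K
    ]

    def layers_for_terminal(display_height):
        selected = [LAYERS[0]]             # the base layer is always decoded
        for layer in LAYERS[1:]:
            if layer["height"] <= display_height:
                selected.append(layer)
        return [layer["name"] for layer in selected]

    print(layers_for_terminal(768))    # ['B0']              -> 720P stream
    print(layers_for_terminal(1080))   # ['B0', 'S0']        -> 1080P stream
    print(layers_for_terminal(2160))   # ['B0', 'S0', 'S1']  -> 4K stream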
Step S504, sharing the target sharing data to the corresponding access terminal.
In the embodiments of the present application, the cloud server lets each access terminal receive the shared data at the resolution best suited to the capability of its hardware, which reduces bandwidth usage without degrading the screen sharing experience.
In some embodiments, the data to be shared of the sharing terminals may be encoded, the encoded data to be shared decoded according to the terminal resolution of each access terminal to obtain data to be shared at that terminal resolution, and the data to be shared at that resolution fused to obtain the shared data corresponding to the access terminal. Based on the foregoing embodiments, the embodiments of the present application further provide a screen sharing method. Fig. 6 is an optional flowchart of the screen sharing method provided in the embodiments of the present application; on the basis of Fig. 4, step S402 may also be implemented by the following steps.
Step S601, obtaining, by the cloud server, the terminal resolution of each access terminal.
Step S602, encoding the data to be shared according to the terminal resolution of each access terminal, so that the video resolution of the second encoded video frames corresponding to the encoded data to be shared matches the terminal resolution.
Step S603, fusing the encoded data to be shared by adopting the fusion mode to obtain the shared data corresponding to each access terminal.
In some embodiments, each piece of data to be shared may be layer-encoded; the encoded data to be shared are then divided into as many data groups to be shared as there are access terminals, and each group is decoded according to the terminal resolution of its access terminal to obtain a data group to be shared at that resolution; the cloud server fuses each of these groups in the determined fusion mode to obtain the shared data corresponding to the access terminals. For example, the sharing terminals send three pieces of data to be shared, two at 4K resolution and one at 1080P resolution, and the access side has a 720P display and a 1080P display. After layer-encoding the three pieces of data to be shared, the cloud server puts the two 4K pieces into one group, decodes them into 1080P video streams, and splices the two 1080P streams together to obtain the shared data for the 1080P access terminal; it decodes the 1080P piece of data to be shared into a 720P video stream and takes that stream as the shared data for the 720P access terminal.
In the embodiments of the present application, the cloud server lets each access terminal receive the shared data at the resolution best suited to the capability of its hardware, which reduces bandwidth usage without degrading the screen sharing experience.
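The flow of the example above can be outlined as follows; decode and fuse are hypothetical callables standing in for the layered decoder and the splicing fusion, so this is a sketch under those assumptions rather than the implementation of the embodiments.

    # Illustrative sketch only: decode each data group to be shared at the
    # resolution of its access terminal, then fuse the group if necessary.
    def prepare_shared_data(groups, terminal_resolutions, decode, fuse):
        shared = []
        for group, resolution in zip(groups, terminal_resolutions):
            decoded = [decode(data, resolution) for data in group]
            # A single stream needs no fusion; several streams are spliced.
            shared.append(decoded[0] if len(decoded) == 1 else fuse(decoded))
        return shared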
In some embodiments, the shared data include at least one rendered video frame, where rendering refers to integrating the fused frames of the shared data into a complete video stream; the rendered video frames of the shared data are displayed on each access terminal according to the playing order of the video frames in the video stream.
In the following, an exemplary application of the embodiments of the present application in a practical application scenario will be described.
Aiming at the problem in the related art that only one channel of shared content is allowed in a conference scenario and that the shared content can only be displayed on one display screen, the embodiments of the present application provide three solutions.
In some embodiments, Fig. 7 is a schematic diagram of a screen sharing method provided in the embodiments of the present application. As shown in Fig. 7, the conference software of the embodiments of the present application allows multiple channels of content to be shared and pushed in a conference, for example content share 1 (i.e., data to be shared) of user A (i.e., a sharing terminal) and content share 2 (i.e., data to be shared) of user B (i.e., a sharing terminal). All receiving ends (i.e., access terminals) can receive every user's shared content, and the receiving device can adapt or set the display layout of the received shared content according to the number of connected display screens. For example, when the receiving end has only one display screen, i.e., displays on a single screen, the received shared content can be shown picture-in-picture, as in diagrams (b) and (c) of Fig. 7, where the main picture and the sub-picture can be switched with each other, or the two shared contents can be shown side by side, as in diagram (a) of Fig. 7. If the receiving end is connected to two display screens, the shared contents can be displayed full-screen on the two screens, as in diagrams (d) and (e) of Fig. 7, or both shared contents can be displayed on one screen picture-in-picture or side by side.
In some embodiments, Fig. 8 is a schematic diagram of a screen sharing method provided in the embodiments of the present application. As shown in Fig. 8, the conference software again allows multiple channels of content to be shared and pushed in a conference, for example content share 1 (i.e., data to be shared) of user A (i.e., a sharing terminal) and content share 2 (i.e., data to be shared) of user B (i.e., a sharing terminal), and the multiple channels of video content can be fused by the cloud (i.e., the cloud server). For example, the cloud forms several single-channel content streams using different fusion modes, namely parallel splicing (diagram (a) of Fig. 8) and picture-in-picture superposition (diagrams (b) and (c) of Fig. 8), and also outputs a low-bitrate preview video stream. The display layout of the received content can be adapted or set according to the number of display screens connected to the receiving device (i.e., the access terminal). For example, when the receiving end is connected to one display screen, the user views the preview video stream and selects the fused video stream suited to it for display; if the receiving end is connected to two display screens, one original video stream can be displayed on one screen while the other contents are shown on the other screen side by side, in another layout, or full-screen.
In some embodiments, Fig. 9 is a schematic diagram of a screen sharing method provided in the embodiments of the present application. As shown in Fig. 9, the conference software allows multiple channels of content to be shared and pushed in a conference, for example content share 1 (i.e., data to be shared) of user A (i.e., a sharing terminal) and content share 2 (i.e., data to be shared) of user B (i.e., a sharing terminal), and the received content streams, or the video streams after fusion, can be layer-encoded according to resolution. As shown in Fig. 9, after the cloud receives content share 1 from user A, it encodes it by layered coding into video streams of different resolutions; the fused video stream in Fig. 9 is divided into three layers: the base layer B0 has a resolution of 720P, and superimposing different enhancement layers on B0 yields different resolutions, so that B0 plus enhancement layer S0 gives 1080P, and B0 plus enhancement layers S0 and S1 gives 4K. The conference receiving end (i.e., the access terminal) selects the best resolution to decode and play according to the resolution of its connected display screen: as shown in Fig. 9, a receiving end with a 1366x768 display decodes and shows the 720P stream, a receiving end with a 1920x1080 display decodes and shows the 1080P stream, and a receiving end with a 3840x2160 display decodes and shows the 4K stream.
In the embodiments of the present application, two or even more users are allowed to share simultaneously in a conference, and the conference terminal displays the shared content according to the number of connected display screens. When the receiving end has only one display screen, the multiple shared contents are spliced and displayed side by side by default; the user can switch the layout to picture-in-picture through the conference software, designate any one of the shared contents as the main picture with the other shared contents shown as sub-pictures in the corners of the screen, and switch the main picture at any time. When the receiving end has two display screens: if there is only one channel of shared content, it is displayed full-screen on one screen; if there are two channels, they are displayed on the two screens respectively and the user can swap the content shown on the two screens; if there are more channels, the user can designate any one of them as the main shared content, which is displayed alone on one screen while the other shared contents are displayed side by side on the other screen, and the main content and the side-by-side contents can be swapped between the two screens.
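The default display rules above can be summarized in the following sketch. It only illustrates the default behaviour; as described above, the user can still change the layout, designate the main content and swap screens through the conference software, and the stream and screen labels used here are assumptions made for the example.

    # Illustrative sketch only: default layout at the receiving end, based on
    # the number of shared content streams and connected display screens.
    def choose_default_layout(num_streams, num_screens):
        if num_screens == 1:
            # Single screen: splice all streams side by side by default; the
            # user may switch to picture-in-picture and pick a main picture.
            return {"screen_1": "splice_all_streams"}
        if num_streams == 1:
            return {"screen_1": "stream_1_fullscreen", "screen_2": "idle"}
        if num_streams == 2:
            return {"screen_1": "stream_1_fullscreen",
                    "screen_2": "stream_2_fullscreen"}
        # More than two streams: one designated main content alone on one
        # screen, the remaining streams spliced side by side on the other.
        return {"screen_1": "main_stream_fullscreen",
                "screen_2": "splice_remaining_streams"}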
With the screen sharing method provided in the embodiments of the present application, multiple channels of shared content can be seen in a conference, which extends the conference function, improves conference efficiency and improves the conference experience. The embodiments of the present application allow multiple channels of content to be received at the same time and displayed on one screen without additionally increasing the bandwidth of the receiving end, and, by selecting the most suitable code stream to receive according to the hardware capability, reduce the bandwidth occupied at the receiving end without affecting the experience.
Fig. 10 is a schematic diagram of the composition structure of a screen sharing device provided in an embodiment of the present application. As shown in Fig. 10, the screen sharing device 100 includes:
an obtaining module 1001, configured to obtain data to be shared of at least two sharing terminals through a network; a first determining module 1002 configured to determine a number of access terminals; a second determining module 1003, configured to determine, according to the number of access terminals, a fusion manner of the data to be shared; a fusion module 1004, configured to fuse the data to be shared of the at least two sharing terminals by adopting the determined fusion manner, so as to obtain shared data corresponding to each access terminal; and a display module 1005, configured to display content corresponding to the shared data on each access terminal.
In some embodiments, the number of access terminals is the number of effective terminals among the access terminals that have been accessed; the second determining module 1003 is further configured to obtain the number of pieces of data to be shared based on the number of sharing terminals; group the data to be shared according to the number of effective terminals, when the number of pieces of data to be shared is greater than or equal to the number of accessed effective terminals, to form as many data groups to be shared as there are effective terminals; and determine the fusion mode of the data to be shared according to the number of data groups to be shared.
In some embodiments, the second determining module 1003 is further configured to determine, when a single data group to be shared includes a plurality of pieces of data to be shared, that the fusion mode of the data to be shared is to display the video frames corresponding to the plurality of pieces of data to be shared in a spliced manner, or to display them in a nested manner.
In some embodiments, the fusing module 1004 is further configured to fuse the data to be shared by using a fusion manner of the data to be shared by each access terminal, so as to obtain shared data corresponding to the access terminal; or fusing the data to be shared by adopting a fusion mode of the data to be shared through a cloud server to obtain the shared data corresponding to each access terminal.
In some embodiments, each of the access terminals corresponds to a terminal resolution; the fusion module 1004 is further configured to obtain, by using the cloud server, the terminal resolution of each access terminal; according to terminal resolutions of a plurality of access terminals, carrying out layered coding on the shared data so that a first coded video corresponding to the coded shared data has multi-layer video resolution; matching target shared data among a plurality of encoded shared data having the multi-layer video resolution according to the terminal resolution of each of the access terminals; and sharing the target sharing data to the corresponding access terminal.
In some embodiments, each of the access terminals corresponds to a terminal resolution; the fusion module 1004 is further configured to obtain, by using the cloud server, the terminal resolution of each access terminal; encoding the data to be shared according to the terminal resolution of each access terminal, so that the video resolution of a second encoded video frame corresponding to the encoded data to be shared is matched with the terminal resolution; and fusing the coded data to be shared by adopting the fusion mode to obtain the shared data corresponding to each access terminal.
In some embodiments, the shared data comprises at least one rendered video frame; the display module 1005 is further configured to display the rendered at least one video frame on each access terminal according to the playing order of the video frames.
It should be noted that, the description of the apparatus in the embodiment of the present application is similar to the description of the embodiment of the method described above, and has similar beneficial effects as the embodiment of the method, so that a detailed description is omitted. For technical details not disclosed in the embodiments of the present apparatus, please refer to the description of the embodiments of the method of the present application for understanding.
It should be noted that, in the embodiments of the present application, if the screen sharing method is implemented in the form of a software functional module and sold or used as an independent product, it may also be stored in a computer readable storage medium. Based on this understanding, the technical solutions of the embodiments of the present application essentially, or the part contributing to the related art, may be embodied in the form of a computer software product stored in a storage medium and including several instructions for causing a terminal to execute all or part of the methods described in the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a magnetic disk or an optical disk. Thus, the embodiments of the present application are not limited to any specific combination of hardware and software.
An embodiment of the present application provides a screen sharing device. Fig. 11 is a schematic diagram of the composition structure of the screen sharing device provided in the embodiment of the present application. As shown in Fig. 11, the screen sharing device 110 at least includes: a processor 111 and a computer readable storage medium 112 configured to store executable instructions, where the processor 111 generally controls the overall operation of the screen sharing device. The computer readable storage medium 112 is configured to store instructions and applications executable by the processor 111, and may also cache data to be processed or already processed by the modules in the processor 111 and the screen sharing device 110; it may be implemented by flash memory or random access memory (RAM).
The embodiments of the present application provide a storage medium storing executable instructions, where the executable instructions are stored, which when executed by a processor, cause the processor to perform a screen sharing method provided by the embodiments of the present application, for example, a method as shown in fig. 2.
In some embodiments, the storage medium may be a computer readable storage medium such as a ferroelectric random access memory (FRAM), a read-only memory (ROM), a programmable read-only memory (PROM), an erasable programmable read-only memory (EPROM), an electrically erasable programmable read-only memory (EEPROM), a flash memory, a magnetic surface memory, an optical disc, or a compact disc read-only memory (CD-ROM); it may also be any device including one of the above memories or any combination thereof.
In some embodiments, the executable instructions may be in the form of programs, software modules, scripts, or code, written in any form of programming language (including compiled or interpreted languages, or declarative or procedural languages), and they may be deployed in any form, including as stand-alone programs or as modules, components, subroutines, or other units suitable for use in a computing environment.
As an example, the executable instructions may, but need not, correspond to files in a file system, may be stored as part of a file that holds other programs or data, for example, in one or more scripts in a hypertext markup language (HTML, hyper Text Markup Language) document, in a single file dedicated to the program in question, or in multiple coordinated files (e.g., files that store one or more modules, sub-programs, or portions of code). As an example, executable instructions may be deployed to be executed on one computing device or on multiple computing devices located at one site or, alternatively, distributed across multiple sites and interconnected by a communication network.
The technical features of the embodiments of the present application may be arbitrarily combined without any conflict between them.
It should be appreciated that reference throughout this specification to "one embodiment" or "an embodiment" means that a particular feature, structure or characteristic described in connection with the embodiment is included in at least one embodiment of the present application. Thus, the appearances of the phrases "in one embodiment" or "in an embodiment" in various places throughout this specification are not necessarily all referring to the same embodiment. Furthermore, the particular features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. It should be understood that, in various embodiments of the present application, the sequence numbers of the foregoing processes do not mean the order of execution, and the order of execution of the processes should be determined by the functions and internal logic thereof, and should not constitute any limitation on the implementation process of the embodiments of the present application. The foregoing embodiment numbers of the present application are merely for describing, and do not represent advantages or disadvantages of the embodiments.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element. In the several embodiments provided in this application, it should be understood that the disclosed apparatus and method may be implemented in other ways. The above described device embodiments are only illustrative, e.g. the division of the units is only one logical function division, and there may be other divisions in practice, such as: multiple units or components may be combined or may be integrated into another system, or some features may be omitted, or not performed.
The foregoing is merely an embodiment of the present application, but the protection scope of the present application is not limited thereto, and any person skilled in the art can easily think about changes or substitutions within the technical scope of the present application, and the changes and substitutions are intended to be covered in the protection scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (10)

1. A screen sharing method, comprising:
acquiring data to be shared of at least two sharing terminals through a network;
determining a number of access terminals; the number of the access terminals is the number of the effective terminals; the effective terminal is an access terminal which is accessed in the access terminals and can display content;
acquiring the quantity of the data to be shared based on the quantity of the sharing terminals; the number of the data to be shared is equal to the number of the sharing terminals;
when the number of the data to be shared is greater than or equal to the number of the effective terminals, grouping the data to be shared according to the number of the effective terminals to form a data group to be shared of the number of the effective terminals;
determining a fusion mode of the data to be shared according to the number of the data groups to be shared;
fusing the data to be shared of the at least two sharing terminals by adopting the determined fusion mode to obtain the sharing data corresponding to each access terminal;
and displaying the content corresponding to the shared data on each access terminal.
2. The method of claim 1, wherein the determining, according to the number of the data groups to be shared, a fusion manner of the data to be shared includes:
When a single data group to be shared comprises a plurality of data to be shared, determining that the fusion mode of the data to be shared is to splice and display video frames corresponding to the data to be shared, or to nest and display the video frames corresponding to the data to be shared.
3. The method of claim 1, wherein the fusing the data to be shared of the at least two sharing terminals by using the determined fusing manner to obtain the shared data corresponding to each access terminal comprises:
and fusing the data to be shared by each access terminal in a fusion mode of the data to be shared to obtain the shared data corresponding to the access terminal.
4. The method of claim 1, wherein the fusing the data to be shared of the at least two sharing terminals by using the determined fusing manner to obtain the shared data corresponding to each access terminal, further comprises:
and fusing the data to be shared by adopting a fusion mode of the data to be shared through a cloud server to obtain the shared data corresponding to each access terminal.
5. The method of claim 4, wherein each of the access terminals corresponds to a terminal resolution; the method further comprises the steps of:
acquiring the terminal resolution of each access terminal through the cloud server;
according to terminal resolutions of a plurality of access terminals, carrying out layered coding on the shared data so that a first coded video corresponding to the coded shared data has multi-layer video resolution;
matching target shared data among a plurality of encoded shared data having the multi-layer video resolution according to the terminal resolution of each of the access terminals;
and sharing the target sharing data to the corresponding access terminal.
6. The method of claim 4, wherein each of the access terminals corresponds to a terminal resolution;
the method for fusing the data to be shared by the cloud server by adopting the fusion mode of the data to be shared, to obtain the shared data corresponding to each access terminal, comprises the following steps:
acquiring the terminal resolution of each access terminal through the cloud server;
encoding the data to be shared according to the terminal resolution of each access terminal, so that the video resolution of a second encoded video frame corresponding to the encoded data to be shared is matched with the terminal resolution;
And fusing the coded data to be shared by adopting the fusion mode to obtain the shared data corresponding to each access terminal.
7. The method of claim 1, wherein the shared data comprises at least one rendered video frame; the displaying, on each access terminal, content corresponding to the shared data includes:
and displaying the at least one rendered video frame on each access terminal according to the playing sequence of the video frames.
8. A screen sharing apparatus, comprising:
the acquisition module is used for acquiring data to be shared of at least two sharing terminals through a network;
a first determining module configured to determine a number of access terminals; the number of the access terminals is the number of the effective terminals; the effective terminal is an access terminal which is accessed in the access terminals and can display content;
the second determining module is used for acquiring the quantity of the data to be shared based on the quantity of the sharing terminals; the number of the data to be shared is equal to the number of the sharing terminals; when the number of the data to be shared is greater than or equal to the number of the effective terminals, grouping the data to be shared according to the number of the effective terminals to form a data group to be shared of the number of the effective terminals; determining a fusion mode of the data to be shared according to the number of the data groups to be shared;
The fusion module is used for fusing the data to be shared of the at least two sharing terminals by adopting the determined fusion mode to obtain the sharing data corresponding to each access terminal;
and the display module is used for displaying the content corresponding to the sharing data on each access terminal.
9. A screen sharing apparatus, comprising:
a memory for storing executable instructions; a processor configured to implement the screen sharing method according to any one of claims 1 to 7 when executing the executable instructions stored in the memory.
10. A computer readable storage medium storing executable instructions for causing a processor to execute the executable instructions to implement the screen sharing method of any one of claims 1 to 7.
CN202110326820.XA 2021-03-26 2021-03-26 Screen sharing method, device, equipment and computer readable storage medium Active CN113094011B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110326820.XA CN113094011B (en) 2021-03-26 2021-03-26 Screen sharing method, device, equipment and computer readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110326820.XA CN113094011B (en) 2021-03-26 2021-03-26 Screen sharing method, device, equipment and computer readable storage medium

Publications (2)

Publication Number Publication Date
CN113094011A (en) 2021-07-09
CN113094011B (en) 2023-12-26

Family

ID=76670149

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110326820.XA Active CN113094011B (en) 2021-03-26 2021-03-26 Screen sharing method, device, equipment and computer readable storage medium

Country Status (1)

Country Link
CN (1) CN113094011B (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2014041353A2 (en) * 2012-09-13 2014-03-20 Tupac Martir Media content distribution
CN104346115A (en) * 2013-07-29 2015-02-11 中兴通讯股份有限公司 Method for distributing pictures to multiple terminals for display and terminal
CN105556495A (en) * 2013-09-17 2016-05-04 三星电子株式会社 Method for screen mirroring and source device thereof
CN107396034A (en) * 2017-08-11 2017-11-24 苏睿 Image transfer method and system
CN109413433A (en) * 2018-11-19 2019-03-01 上海赛连信息科技有限公司 Content share method, device and system
CN110069229A (en) * 2019-04-22 2019-07-30 努比亚技术有限公司 Screen sharing method, mobile terminal and computer readable storage medium

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106453542A (en) * 2016-09-29 2017-02-22 努比亚技术有限公司 Screen sharing apparatus and method


Also Published As

Publication number Publication date
CN113094011A (en) 2021-07-09


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant