WO2022135092A1 - Screen sharing display method, apparatus, device and storage medium - Google Patents

Screen sharing display method, apparatus, device and storage medium

Info

Publication number
WO2022135092A1
WO2022135092A1 (PCT/CN2021/134886)
Authority
WO
WIPO (PCT)
Prior art keywords
image
frame
images
terminal device
compressed image
Prior art date
Application number
PCT/CN2021/134886
Other languages
English (en)
French (fr)
Inventor
Xu Bin (徐斌)
Original Assignee
Beijing ByteDance Network Technology Co., Ltd. (北京字节跳动网络技术有限公司)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing ByteDance Network Technology Co., Ltd. (北京字节跳动网络技术有限公司)
Priority to US18/258,601 (published as US20240045641A1)
Priority to EP21909089.1 (published as EP4243408A1)
Publication of WO2022135092A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/14 Digital output to display device; Cooperation and interconnection of the display device with other functional units
    • G06F 3/1454 Digital output to display device; Cooperation and interconnection of the display device with other functional units involving copying of the display data of a local workstation or window to a remote workstation or window so that an actual copy of the data is displayed simultaneously on two or more displays, e.g. teledisplay
    • G06F 3/1462 Digital output to display device; Cooperation and interconnection of the display device with other functional units involving copying of the display data of a local workstation or window to a remote workstation or window so that an actual copy of the data is displayed simultaneously on two or more displays, e.g. teledisplay with means for detecting differences between the image stored in the host and the images displayed on the remote displays
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 7/00 Television systems
    • H04N 7/14 Systems for two-way working
    • H04N 7/15 Conference systems
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 7/00 Television systems
    • H04N 7/14 Systems for two-way working
    • H04N 7/141 Systems for two-way working between two video terminals, e.g. videophone
    • H04N 7/147 Communication arrangements, e.g. identifying the communication as a video-communication, intermediate storage of the signals
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 9/00 Image coding
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 65/00 Network arrangements, protocols or services for supporting real-time applications in data packet communication
    • H04L 65/1066 Session management
    • H04L 65/1083 In-session procedures
    • H04L 65/1089 In-session procedures by adding media; by removing media
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 65/00 Network arrangements, protocols or services for supporting real-time applications in data packet communication
    • H04L 65/40 Support for services or applications
    • H04L 65/403 Arrangements for multi-party communication, e.g. for conferences
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 65/00 Network arrangements, protocols or services for supporting real-time applications in data packet communication
    • H04L 65/60 Network streaming of media packets
    • H04L 65/75 Media network packet handling
    • H04L 65/765 Media network packet handling intermediate
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 67/00 Network arrangements or protocols for supporting network services or applications
    • H04L 67/01 Protocols
    • H04L 67/10 Protocols in which an application is distributed across nodes in the network
    • H04L 67/1095 Replication or mirroring of data, e.g. scheduling or transport for data synchronisation between network nodes
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 19/00 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N 19/50 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding
    • H04N 19/503 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding involving temporal prediction
    • H04N 19/507 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding involving temporal prediction using conditional replenishment
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N 21/4302 Content synchronisation processes, e.g. decoder synchronisation
    • H04N 21/4307 Synchronising the rendering of multiple content streams or additional data on devices, e.g. synchronisation of audio on a mobile phone with the video output on the TV screen
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N 21/4302 Content synchronisation processes, e.g. decoder synchronisation
    • H04N 21/4307 Synchronising the rendering of multiple content streams or additional data on devices, e.g. synchronisation of audio on a mobile phone with the video output on the TV screen
    • H04N 21/43076 Synchronising the rendering of multiple content streams or additional data on devices, e.g. synchronisation of audio on a mobile phone with the video output on the TV screen of the same content streams on multiple devices, e.g. when family members are watching the same movie on different devices
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N 21/44 Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs
    • H04N 21/44008 Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs involving operations for analysing video streams, e.g. detecting features or characteristics in the video stream
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N 21/44 Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs
    • H04N 21/4402 Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs involving reformatting operations of video signals for household redistribution, storage or real-time display
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N 21/44 Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs
    • H04N 21/4402 Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs involving reformatting operations of video signals for household redistribution, storage or real-time display
    • H04N 21/440263 Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs involving reformatting operations of video signals for household redistribution, storage or real-time display by altering the spatial resolution, e.g. for displaying on a connected PDA
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G 2320/00 Control of display operating conditions
    • G09G 2320/10 Special adaptations of display systems for operation with variable images
    • G09G 2320/103 Detection of image changes, e.g. determination of an index representative of the image change
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G 2340/00 Aspects of display data processing
    • G09G 2340/02 Handling of images in compressed format, e.g. JPEG, MPEG

Definitions

  • the embodiments of the present disclosure relate to the technical field of computer and network communication, and in particular, to a screen sharing display method, apparatus, device, and storage medium.
  • in screen sharing, the sending end needs to collect the image data of the screen frame by frame, encode it, and send it over the network to the user's receiving end for display.
  • Embodiments of the present disclosure provide a screen sharing display method, apparatus, device, and storage medium, so as to overcome the problems of the huge data traffic generated during screen sharing, the increased network bandwidth burden, and the resulting video freezes.
  • in a first aspect, an embodiment of the present disclosure provides a screen sharing display method, applied to a first terminal device;
  • in a second aspect, an embodiment of the present disclosure provides a screen sharing display method, applied to a second terminal device;
  • a screen sharing display device, including:
  • an acquisition unit configured to acquire two adjacent frames of images of the first terminal device;
  • a determining unit configured to determine image change information according to the two adjacent frames of images, wherein the image change information represents the degree of change of the next frame of image relative to the previous frame of image in the two adjacent frames of images;
  • a compression unit configured to compress the next frame of image to obtain compressed image data if it is determined that the degree of change represented by the image change information is greater than or equal to a first preset degree; and
  • a sending unit configured to send the compressed image data to a second terminal device for display.
  • a screen sharing display device, including:
  • a receiving unit configured to receive the compressed image data sent by the first terminal device; and
  • a display unit configured to display the compressed image frame in the compressed image data.
  • embodiments of the present disclosure provide an electronic device, including: at least one processor and a memory;
  • the memory stores computer-executable instructions;
  • the at least one processor executes the computer-executable instructions stored in the memory, so that the at least one processor performs the screen sharing display method described in the first aspect above and its various possible implementations.
  • embodiments of the present disclosure provide an electronic device, including: at least one processor and a memory;
  • the memory stores computer-executable instructions;
  • the at least one processor executes the computer-executable instructions stored in the memory, so that the at least one processor performs the screen sharing display method described in the second aspect above and its various possible implementations.
  • embodiments of the present disclosure provide a computer-readable storage medium in which computer-executable instructions are stored; when a processor executes the computer-executable instructions, the screen sharing display method described in the first aspect above and its various possible implementations is implemented, or the screen sharing display method described in the second aspect above and its various possible implementations is implemented.
  • embodiments of the present disclosure provide a computer program product, including a computer program which, when executed by a processor, implements the screen sharing display method described in the first aspect above and its various possible implementations, or implements the screen sharing display method described in the second aspect above and its various possible implementations.
  • an embodiment of the present disclosure provides a computer program that, when executed by a processor, implements the screen sharing display method described in the first aspect above and its various possible implementations, or implements the screen sharing display method described in the second aspect above and its various possible implementations.
  • the method acquires two adjacent frames of images of the first terminal device; determines image change information according to the two adjacent frames of images, wherein the image change information represents the degree of change of the next frame of image relative to the previous frame of image in the two adjacent frames of images; if it is determined that the degree of change represented by the image change information is greater than or equal to a first preset degree, compresses the next frame of image to obtain compressed image data; and sends the compressed image data to the second terminal device for display. Since the first terminal device compresses the image according to the degree of change of the image, the image data corresponding to the screen image is compressed during screen sharing, which reduces the amount of transmitted data without affecting effective data transmission and solves the problem of large data traffic in the screen sharing display process.
  • FIG. 1 is a schematic diagram of an application scenario provided by an embodiment of the present disclosure
  • FIG. 2 is a first schematic flowchart of a screen sharing display method provided by an embodiment of the present disclosure;
  • FIG. 3 is a schematic diagram of sending a compressed image to a second terminal device according to an embodiment of the present disclosure
  • FIG. 4 is a second schematic flowchart of a screen sharing display method provided by an embodiment of the present disclosure.
  • FIG. 5 is a schematic diagram of a continuous multi-frame image provided by an embodiment of the present disclosure.
  • FIG. 6 is a schematic flowchart of step S202 in the embodiment shown in FIG. 4;
  • FIG. 7 is a schematic diagram of processing multiple frames of continuous images according to an embodiment of the present disclosure.
  • FIG. 8 is a third schematic flowchart of a screen sharing display method provided by an embodiment of the present disclosure.
  • FIG. 9 is a signaling diagram of a screen sharing display method provided by an embodiment of the present disclosure.
  • FIG. 10 is a structural block diagram of a screen sharing display device according to an embodiment of the present disclosure.
  • FIG. 11 is a structural block diagram of another screen sharing display device provided by an embodiment of the present disclosure.
  • FIG. 12 is a schematic structural diagram of an electronic device according to an embodiment of the present disclosure.
  • FIG. 13 is a schematic structural diagram of another electronic device according to an embodiment of the present disclosure.
  • FIG. 14 is a schematic diagram of a hardware structure of an electronic device according to an embodiment of the present disclosure.
  • FIG. 1 is a schematic diagram of an application scenario provided by an embodiment of the present disclosure.
  • user A remotely displays content displayed in the terminal device to user B.
  • user A sends a screen image through terminal device a to the terminal device b operated by user B, so that user B can watch it online in real time through terminal device b.
  • the sharing of the screen contents of terminal device a may be realized through a cloud server as shown in FIG. 1, or through direct data exchange between the devices (not shown in the figure), which is not specifically limited here.
  • user A displays the content displayed on the screen of terminal device a on terminal device b through screen sharing, realizing real-time document display, information exchange and other purposes.
  • the sending end of the screen sharing usually captures the image data displayed on the screen frame by frame, encodes it, and sends it to the user receiving end through the network for display.
  • since screen sharing has high requirements on the definition of the captured screen images, capturing and encoding frame by frame results in huge data traffic during the screen sharing process and increases the burden on network bandwidth. If the network load is too high, video freezes will occur, which affects the effect of the screen sharing display.
  • the main purpose of the screen sharing function is to display documents, pictures and other files with high information density.
  • the display of such high-information-density files usually requires the screen to remain static.
  • for example, when the side sharing the screen displays the current page of a document, the screen remains static while the side viewing the screen reads the contents of the current page in detail; after the current page has been viewed, the document is slid to the next page.
  • while the document is sliding, the user who is actually watching the screen does not need to view the displayed content in detail. Subjecting the pictures displayed during the sliding period to the same high-frequency sampling, encoding, and transmission therefore wastes device resources and bandwidth resources, causing the network bandwidth burden described above. A method to solve these problems is therefore urgently needed.
  • Embodiments of the present disclosure provide a screen sharing display method to solve the above problems.
  • FIG. 2 is a first schematic flowchart of a screen sharing display method provided by an embodiment of the present disclosure. Referring to FIG. 2, the method of this embodiment may be applied to a first terminal device, and the screen sharing display method includes:
  • S101 Acquire two adjacent frames of images of the first terminal device.
  • the first terminal device is a terminal device that provides image data for screen sharing, that is, the first terminal device shares the local screen image as source information to other terminal devices for viewing.
  • the method for the first terminal device to acquire the image displayed on the screen may be by sampling the content displayed on the screen frame by frame, thereby obtaining multiple frames of screen images. Further, in order to determine whether the shared screen is currently in a static state or a sliding state, it is necessary to acquire the changes between two adjacent frames of images.
  • the two adjacent frames here may be a frame of image that is currently acquired recently and an image of the previous frame adjacent to the frame of image.
  • S102 Determine image change information according to two adjacent frames of images, where the image change information represents the degree of change of the next frame of images relative to the previous frame of images in the two adjacent frames of images.
  • the image change information represents the degree of change of the next frame of image relative to the previous frame of image in the two adjacent frames of images. Exemplarily, the image change information includes a similarity evaluation value, which represents the similarity between the next frame of image and the previous frame of image in the two adjacent frames of images.
  • determining the image change information according to two adjacent frames of images includes: determining the similarity of each group of two adjacent frames of images from multiple groups of two adjacent frames of images, and generating the image change information according to the similarity evaluation value of each group. Specifically, for example, the similarity corresponding to each group of two adjacent frames of images is calculated to obtain that group's similarity evaluation value, and the similarity evaluation values corresponding to the multiple groups of two adjacent frames of images form a similarity evaluation value sequence, which is used as the image change information.
  • the image change information is a similarity evaluation value sequence that includes a plurality of similarity evaluation values. If the degree of change represented by the image change information is greater than or equal to the first preset degree, the plurality of similarity evaluation values are all smaller than a preset threshold, that is, there have been multiple consecutive frames of images with great changes. At this time, it can be determined that the screen is in a sliding state, and the information displayed on the screen in this state is content that the user does not need to watch carefully.
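The sliding/static decision described above can be sketched in Python. This is an illustrative reading of the patent text, not its actual implementation; the function name and thresholds are hypothetical, and similarity values are assumed to be in [0, 1] with 1.0 meaning identical adjacent frames:

```python
def screen_state(similarity_sequence, low_threshold=0.60, high_threshold=0.99):
    """Classify the screen state from a sequence of similarity evaluation
    values, one per group of adjacent frames.

    - every value below low_threshold   -> consecutive large changes -> "sliding"
    - latest value above high_threshold -> almost no change          -> "static"
    - otherwise                         -> ordinary motion           -> "normal"
    """
    if all(s < low_threshold for s in similarity_sequence):
        return "sliding"
    if similarity_sequence[-1] > high_threshold:
        return "static"
    return "normal"
```

For example, a run of low similarity values such as `[0.3, 0.2, 0.4]` classifies the screen as sliding, while a final value above 0.99 classifies it as static.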
  • S103 If it is determined that the degree of change represented by the image change information is greater than or equal to the first preset degree, compress the next frame of image to obtain compressed image data.
  • compressing the screen data refers to compressing the next frame of image in the two adjacent frames of images collected, or the last frame of image in the multiple frames of consecutive images, that is, the latest currently collected frame of image.
  • compressing the next frame of image to obtain compressed image data includes: down-sampling the next frame of image, for example, down-sampling an image with a resolution of 1080P to 270P.
  • a compressed image frame is obtained after downsampling, and the compressed image frame has a lower resolution and a smaller image volume than the image before downsampling.
  • the compressed image frame is sent to the encoder for encoding processing to obtain compressed image data.
  • the process of encoding the image is well known in the art, and details are not repeated here.
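The down-sampling step can be sketched as a simple average-pooling pass in pure Python; the factor of 4 stands in for the 1080P-to-270P reduction mentioned above, and this is only an illustration — a real implementation would operate on actual pixel buffers and hand the result to a video encoder:

```python
def downsample(image, factor=4):
    """Average-pool a grayscale image (a list of rows of pixel values),
    reducing both dimensions by `factor` -- e.g. 1080 rows -> 270 rows.
    Trailing rows/columns that do not fill a block are dropped."""
    h, w = len(image), len(image[0])
    out = []
    for y in range(0, h - h % factor, factor):
        row = []
        for x in range(0, w - w % factor, factor):
            # mean of the factor x factor block anchored at (y, x)
            block = [image[y + dy][x + dx]
                     for dy in range(factor) for dx in range(factor)]
            row.append(sum(block) / len(block))
        out.append(row)
    return out
```

The pooled frame has 1/16 of the pixels at factor 4, which is what makes the subsequent encoding and transmission cheaper.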
  • S104 Send the compressed image data to the second terminal device for display.
  • the second terminal device is a terminal device that receives and displays the shared screen image.
  • sending the compressed image to the second terminal device may be implemented by sending the compressed image data to a cloud server, which forwards it to the second terminal device, or by the first terminal device sending the compressed image data directly to the second terminal device; this can be set as required and is not specifically limited here. After the compressed image data is generated, it is sent to the second terminal device, which decodes the compressed image data and then displays it, completing the screen sharing display process.
  • FIG. 3 is a schematic diagram of sending a compressed image to a second terminal device according to an embodiment of the present disclosure. As shown in FIG. 3, the first terminal device judges whether the screen is in a sliding state; if it is, the screen image is compressed, and the compressed image data is generated and sent to the second terminal device for display, thereby reducing the amount of data transmission and the network load.
  • in this embodiment, two adjacent frames of images of the first terminal device are acquired; image change information is determined according to the two adjacent frames of images, wherein the image change information represents the degree of change of the next frame of image relative to the previous frame of image in the two adjacent frames of images; if it is determined that the degree of change represented by the image change information is greater than or equal to the first preset degree, the next frame of image is compressed to obtain compressed image data; and the compressed image data is sent to the second terminal device for display. Since the first terminal device compresses the image according to the degree of change of the image, when the image changes greatly during screen sharing, that is, while the screen is sliding, the image data corresponding to the screen image is compressed. This reduces the amount of transmitted data without affecting effective data transmission, solves the problem of large data traffic during the screen sharing display, improves the fluency of the screen sharing display, and reduces display freezes.
  • FIG. 4 is a second schematic flowchart of a screen sharing display method provided by an embodiment of the present disclosure.
  • steps S101 to S103 are further refined, and a step of discarding corresponding screen image frames when the screen is in a static state is added.
  • the screen sharing display method includes:
  • S201 Acquire multiple frames of continuous images of the first terminal device, where the multiple frames of continuous images include at least three frames of images, and each adjacent two frames of the multiple frames of continuous images constitute a group of two adjacent frames of images.
  • acquiring multiple frames of continuous images of the first terminal device includes: the first terminal device captures the content displayed on its screen frame by frame; after each frame of image is captured, the image is buffered in a local storage medium, and then the next frame of image continues to be collected. After each acquisition of the current latest frame of image, the first terminal device reads several frames of images adjacent to the current frame from the buffer and combines them with the current frame to form multiple frames of continuous images. Specifically, for example, the two frames of images before the current frame are acquired from the buffer and, together with the current frame of image, form three frames of continuous images.
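The capture-and-buffer scheme above can be sketched with a bounded buffer. `FrameBuffer` is a hypothetical helper, not from the patent; `size=3` matches the three-frame example:

```python
from collections import deque

class FrameBuffer:
    """Keep the last `size` captured frames; each new capture yields the
    multi-frame continuous image window ending at the current frame."""

    def __init__(self, size=3):
        self.frames = deque(maxlen=size)  # oldest frames fall out automatically

    def push(self, frame):
        self.frames.append(frame)
        # current frame plus the adjacent earlier frames still in the buffer
        return list(self.frames)
```

Pushing frames one by one, the window returned after the fourth push contains only the current frame and the two frames before it, since older frames fall out of the bounded buffer.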
  • FIG. 5 is a schematic diagram of a continuous multi-frame image provided by an embodiment of the present disclosure.
  • the multiple frames of continuous images include a first frame to a fourth frame, wherein the first frame and the second frame, the second frame and the third frame, and the third frame and the fourth frame respectively form three groups of two adjacent frames of images.
  • S202 Determine image change information according to each group of two adjacent frames of images.
  • the image change information includes a sequence of similarity evaluation values
  • the sequence of similarity evaluation values includes a plurality of similarity evaluation values
  • each similarity evaluation value corresponds to a group of two adjacent frames of images respectively.
  • the similarity evaluation value is used to characterize the similarity between the next frame of images and the previous frame of images in two adjacent frames of images.
  • step S202 includes two specific implementation steps, S2021 and S2022:
  • S2021 Acquire pixel information of a previous frame of images and pixel information of a subsequent frame of images in each group of two adjacent frames of images.
  • S2022 According to a preset image comparison algorithm, perform feature comparison between the pixel information of the previous frame of image and the pixel information of the next frame of image in each group of two adjacent frames of images, to obtain the similarity evaluation value of each group of two adjacent frames of images.
  • feature comparison is performed on the pixel information of the previous frame image and the pixel information of the next frame image in each group of two adjacent frames of images, for example, by calculating the similarity between the pixel information of the two frames, thereby obtaining the corresponding similarity evaluation value. The implementation of calculating similarity from pixel information is prior art and is not repeated here.
  • it is also possible to obtain the similarity evaluation value by calculating the structural similarity (SSIM for short), which compares brightness, contrast, and structure, or through a content feature method, key point matching method, and the like, which will not be repeated here.
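As a minimal illustration of the per-pixel comparison above, the following sketch scores two frames by mean absolute pixel difference. The function name and the flat grayscale representation are assumptions for illustration; SSIM or keypoint matching could be substituted as the text notes:

```python
def pixel_similarity(prev_frame, next_frame):
    """Similarity evaluation value in [0, 1] from per-pixel comparison.
    Frames are flat lists of 8-bit grayscale values; 1.0 means identical.
    This is a simple mean-absolute-difference metric, one of several
    options the disclosure mentions."""
    assert len(prev_frame) == len(next_frame)
    total_diff = sum(abs(a - b) for a, b in zip(prev_frame, next_frame))
    return 1.0 - total_diff / (255.0 * len(prev_frame))
```

Identical frames score 1.0; a black frame against a white frame scores 0.0, and intermediate changes fall in between.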
  • if the similarity evaluation values of each group of two adjacent frames of images in the multi-frame continuous images are all smaller than the preset first similarity evaluation value threshold, it means that each frame in the continuous multi-frame images changes greatly relative to the previous frame, and it is determined that the screen has entered the sliding state. When the screen is in the sliding state, the displayed content is content that the user does not need to watch carefully.
  • the captured image will generate key frames in the subsequent encoding process. Compared with non-key frames, key frames have a larger data volume, and transmitting key frames occupies more bandwidth. Therefore, for the screen image collected in the sliding state, that is, the last frame of the multi-frame continuous images, the image is compressed and encoded to obtain compressed image data, and the compressed image data is transmitted, which can effectively reduce the amount of data transmission.
  • if the similarity evaluation value is greater than the preset second similarity evaluation value threshold, for example greater than 99%, it means that the current newly collected screen image has almost no change compared with the adjacent previous screen image, and it is determined that the screen is in a static state. When the screen is in a static state, the newly collected screen image would generate non-key frames in the subsequent encoding process, and omitting these non-key frames does not affect the normal playback of the video.
  • the content displayed on the screen is not updated.
  • there is no need to send the captured screen image to the second terminal device; that is, no non-key frames corresponding to the screen image are generated, which reduces data transmission and bandwidth resource consumption, and the second terminal device continues to display the current screen image without updating it.
  • by judging that the similarity evaluation value of the last group of two adjacent frames of images in the multi-frame continuous images is greater than the preset second similarity evaluation value threshold, the image data is not sent to the second terminal device; instead, the newly collected image is only buffered for judging the next frame of screen image. Thus, when the screen is in a static state, no image data is sent to the second terminal device, which further reduces the data transmission volume and network load during screen sharing and display.
  • FIG. 7 is a schematic diagram of processing multiple frames of continuous images according to an embodiment of the present disclosure.
  • the first terminal device collects the content displayed on the screen in real time, generates screen image frames, and buffers them.
  • the terminal device determines the screen state by judging the latest three frames of continuous images. In the latest three frames, if the similarity evaluation value of the two adjacent frames composed of the first frame image p1 and the second frame image p2 is greater than 99%, it is determined that the screen is currently in a static state; the newly collected first frame image p1 is buffered to the local storage medium without sending data to the second terminal device, so that the second terminal device continues to display the current content without updating it, and the first terminal device continues to collect screen images. If the similarity evaluation values of the two adjacent frames composed of p1 and p2 and of the two adjacent frames composed of p2 and the third frame image p3 are both less than 90%, it is determined that the screen is currently in a sliding state, and the newly collected first frame image p1 is compressed into compressed image data and sent to the second terminal device, so that the second terminal device updates the image. In other cases, it is determined that the screen is currently in a normal state, and the first terminal device sends a heartbeat signal to the second terminal device, where the heartbeat signal is used to indicate that the first terminal device and the second terminal device are in a connected state.
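Using the example thresholds above (static above 99%, sliding below 90%), the three-way state decision might be sketched as follows; the function and constant names are hypothetical, and the thresholds are the illustrative values from the FIG. 7 example, not fixed by the disclosure:

```python
STATIC_THRESHOLD = 0.99   # example "second preset similarity evaluation value threshold"
SLIDING_THRESHOLD = 0.90  # example "first preset similarity evaluation value threshold"

def classify_screen_state(similarities):
    """similarities: evaluation values for each adjacent pair of frames,
    with the pair containing the newest frame last.
    Returns "static"  -> buffer the new frame only, send nothing,
            "sliding" -> compress and send the latest frame,
            "normal"  -> send only a heartbeat signal."""
    if similarities[-1] > STATIC_THRESHOLD:
        return "static"
    if all(s < SLIDING_THRESHOLD for s in similarities):
        return "sliding"
    return "normal"
```

For three buffered frames there are two pair values: a nearly identical newest pair yields "static", two large changes yield "sliding", and a mixed result yields "normal" with a heartbeat.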
  • when the image change information indicates that the degree of change is less than or equal to the second preset degree, the first terminal device does not send data to the second terminal device. In order to enable the second terminal device to confirm its connection state with the first terminal device in real time and avoid a connection reset, the first terminal device sends a heartbeat signal to the second terminal device, thereby improving connection stability during the screen sharing process.
  • after step S204, the method further includes:
  • S205 Acquire mouse layer data; send the mouse layer data to the encoder for encoding processing to obtain mouse layer encoding data; and send the mouse layer encoding data to the second terminal device.
  • when the first terminal device determines that the screen is in a static state, the purpose of reducing the amount of data transmission can be achieved by not sending screen image data to the second terminal device.
  • however, the user on the side of the first terminal device may need to use the mouse to point at content displayed on the screen of the first terminal device, so that the voice explanation during the video conference is clearer. Therefore, the display position of the mouse needs to be sent to the second terminal device for display.
  • the mouse layer data is sent to the encoder for encoding processing to obtain the mouse layer encoding data. Since the data volume of the mouse layer data is small, it is encoded separately, and sending the generated mouse layer encoding data to the second terminal device has little impact on the data transmission volume. Therefore, the displayed mouse position can be updated without updating the screen image, realizing real-time display of the mouse.
  • S206 Send the compressed image data to the second terminal device for display.
  • step S206 is the same as step S104 in the above-mentioned embodiment; please refer to the discussion of step S104, which is not repeated here.
  • FIG. 8 is a third schematic flowchart of a screen sharing display method provided by an embodiment of the present disclosure. Referring to FIG. 8 , the method of this embodiment may be applied to a second terminal device, and the screen sharing display method includes:
  • S301 Receive the compressed image data sent by the first terminal device.
  • the compressed image data includes a compressed image frame, which is an image frame obtained by the first terminal device compressing a screen image collected in the screen sliding state; the image frame has a smaller data volume and a lower resolution.
  • displaying the compressed image frames in the compressed image data includes:
  • the compressed image data is decoded to obtain a compressed image frame; and the compressed image frame is enlarged and displayed according to a preset image size.
  • the first terminal device reduces the size of the image by performing scale transformation (scale) on the image; the second terminal device then enlarges the image, that is, scales it again, so that the scale of the image matches the scale displayed on the screen.
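A minimal sketch of this shrink-then-enlarge round trip, assuming nearest-neighbour scaling on a 2D pixel grid (the disclosure does not specify the interpolation method, so this choice is an illustration only):

```python
def downsample(img, factor):
    """Keep every `factor`-th pixel in each dimension (nearest-neighbour),
    shrinking the frame before encoding, as on the sending side."""
    return [row[::factor] for row in img[::factor]]

def upscale(img, factor):
    """Repeat each pixel `factor` times in both dimensions so the decoded
    frame matches the preset display size again, as on the receiving side."""
    out = []
    for row in img:
        wide = [p for p in row for _ in range(factor)]
        out.extend([wide[:] for _ in range(factor)])
    return out
```

Down-sampling a 2x2 frame by a factor of 2 leaves a single pixel; upscaling it by 2 restores the original dimensions (though not the discarded detail, which is why the displayed frame has lower resolution).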
  • the second terminal device continues to display the current image frame when it does not receive the compressed image data sent by the first terminal device. Specifically, if the heartbeat data sent by the first terminal device is received but the compressed image data is not, the current image frame is displayed; if neither the heartbeat data nor the compressed image data sent by the first terminal device is received, an alarm message is output.
  • the method further includes:
  • S303 Receive the mouse layer encoding data; decode the mouse layer encoding data to obtain the mouse layer data, and display the mouse layer data.
  • after receiving the mouse layer encoding data, the second terminal device decodes it to obtain the mouse layer data, where the mouse layer data represents the position of the displayed mouse. According to the mouse layer encoding data sent in real time by the first terminal device, the mouse is displayed on the second terminal device in real time. Since the data volume of the mouse layer encoding data is small, receiving, processing, and displaying it achieves the purpose of displaying the mouse without significantly affecting the amount of data transmission.
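Since the mouse layer carries only the cursor position, its payload stays tiny regardless of screen content. As a loose illustration of this send/receive pair (the disclosure actually routes the mouse layer through the video encoder; this JSON form and the field names are purely hypothetical):

```python
import json

def encode_mouse_layer(x, y, visible=True):
    """Hypothetical mouse-layer payload: only the cursor position is
    carried, so the encoded data is a few dozen bytes at most."""
    return json.dumps({"x": x, "y": y, "visible": visible}).encode("utf-8")

def decode_mouse_layer(data):
    """Recover the cursor position on the receiving side for display."""
    return json.loads(data.decode("utf-8"))
```

A round trip preserves the position, and the payload size makes clear why the mouse can be updated in real time even when no screen image is being sent.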
  • FIG. 9 is a signaling diagram of a screen sharing display method provided by an embodiment of the present disclosure.
  • the screen sharing display method provided by the embodiment of the present disclosure includes:
  • S401 The first terminal device collects a current screen image to obtain a current image frame.
  • S402 The first terminal device determines, according to the current image frame, the two adjacent frames of images before the current image frame, and uses the current image frame and these two frames as multi-frame continuous images.
  • S403 If the first terminal device determines that the similarity evaluation values of each group of two adjacent frames of images in the multi-frame continuous images are all smaller than the preset first similarity evaluation value threshold, it compresses the current image frame to obtain compressed image data.
  • S404 If the first terminal device determines that the similarity evaluation value between the current image frame and the previous adjacent image frame is greater than the preset second similarity evaluation value threshold, it buffers the current image frame locally.
  • S405 The first terminal device obtains the mouse layer data and sends it to the encoder for encoding processing to obtain the mouse layer encoding data.
  • S406 The first terminal device sends the mouse layer encoding data to the second terminal device.
  • S407 The second terminal device receives the mouse layer encoding data.
  • S408 The second terminal device decodes the mouse layer encoding data to obtain the mouse layer data, and displays the mouse layer data.
  • S409 The first terminal device sends the compressed image data to the second terminal device for display.
  • S410 The second terminal device receives the compressed image data sent by the first terminal device, and displays the compressed image frame in the compressed image data.
  • FIG. 10 is a structural block diagram of a screen sharing display apparatus provided by an embodiment of the present disclosure, which is applied to a first terminal device.
  • the screen sharing display device 5 includes:
  • the acquiring unit 51 is configured to acquire two adjacent frames of images of the first terminal device.
  • the determining unit 52 is configured to determine image change information according to two adjacent frames of images, wherein the image change information represents the degree of change of the next frame of images relative to the previous frame of images in the two adjacent frames of images.
  • the compression unit 53 is configured to compress the next frame of image to obtain compressed image data if it is determined that the degree of change represented by the image change information is greater than or equal to the first preset degree.
  • the sending unit 54 is configured to send the compressed image data to the second terminal device for display.
  • the determining unit 52 is further configured to: if it is determined that the image change information represents a degree of change less than or equal to the second preset degree, locally buffer the next frame of images in the two adjacent frames of images, wherein the second terminal device does not update the image.
  • the obtaining unit 51 is further configured to: obtain multiple frames of continuous images of the first terminal device, where the multiple frames of continuous images include at least three frames, and every two adjacent frames of the multiple frames of continuous images constitute a group of two adjacent frames of images; correspondingly, the compression unit 53 is specifically configured to: if the image change information corresponding to each group of two adjacent frames of images in the multi-frame continuous images represents a degree of change greater than or equal to the first preset degree, compress the last frame of the multi-frame continuous images to obtain compressed image data.
  • the image change information includes a similarity evaluation value, and the similarity evaluation value is used to represent the similarity between the next frame of images and the previous frame of images in two adjacent frames of images;
  • the compression unit 53 is specifically configured to: if the similarity evaluation value of each group of two adjacent frames of images in the multi-frame continuous images is smaller than the preset first similarity evaluation value threshold, compress the last frame of the multi-frame continuous images to obtain the compressed image data.
  • the determining unit 52 is further configured to: if the similarity evaluation value of the last group of two adjacent frames of images is greater than the preset second similarity evaluation value threshold in the consecutive images of multiple frames, then The next frame image in the last group of two adjacent frame images is locally buffered, wherein the second terminal device does not update the image.
  • when compressing the next frame of image to obtain compressed image data, the compression unit 53 is specifically configured to: down-sample the next frame of image to obtain a compressed image frame, and send the compressed image frame to the encoder for encoding processing to obtain the compressed image data.
  • the sending unit 54 is further configured to: if it is determined that the degree of change represented by the image change information is less than or equal to a second preset degree, send a heartbeat signal to the second terminal device; wherein the heartbeat signal is used to indicate The first terminal device and the second terminal device are in a connected state.
  • the obtaining unit 51 is further configured to: obtain the mouse layer data; send the mouse layer data to the encoder for encoding processing to obtain the mouse layer encoding data; the sending unit 54 is further configured to: Send the mouse layer encoding data to the second terminal device.
  • the determining unit 52 is specifically configured to: acquire the pixel information of the previous frame of image and the pixel information of the next frame of image; and, according to a preset image comparison algorithm, perform feature comparison between the pixel information of the previous frame of image and the pixel information of the next frame of image to determine the image change information.
  • the device provided in this embodiment can be used to implement the technical solutions of the foregoing method embodiments, and the implementation principles and technical effects thereof are similar, and details are not described herein again in this embodiment.
  • FIG. 11 is a structural block diagram of another screen sharing display apparatus provided by an embodiment of the present disclosure, which is applied to a second terminal device. For convenience of explanation, only the parts related to the embodiments of the present disclosure are shown. Referring to FIG. 11, the screen sharing display device 6 includes:
  • the receiving unit 61 is configured to receive the compressed image data sent by the first terminal device.
  • the display unit 62 is used for displaying compressed image frames in the compressed image data.
  • the display unit 62 is specifically configured to: decode the compressed image data to obtain a compressed image frame; and to enlarge and display the compressed image frame according to a preset image size.
  • the receiving unit 61 is further configured to: receive mouse layer encoding data.
  • the display unit is further used for: decoding the coded data of the mouse layer to obtain the mouse layer data, and displaying the mouse layer data.
  • the display unit 62 is further configured to: continue to display the current image frame when the compressed image data sent by the first terminal device is not received.
  • when continuing to display the current image frame while the compressed data sent by the first terminal device is not received, the display unit 62 is specifically configured to: if the heartbeat data sent by the first terminal device is received but the compressed image data is not, display the current image frame; if neither the heartbeat data nor the compressed image data sent by the first terminal device is received, output alarm information.
  • FIG. 12 is a schematic structural diagram of an electronic device according to an embodiment of the present disclosure. As shown in FIG. 12 , the electronic device 7 includes at least one processor 701 and a memory 702;
  • Memory 702 stores computer-executable instructions
  • At least one processor 701 executes the computer-executable instructions stored in the memory 702, so that the at least one processor 701 executes the method steps performed by the first terminal device in the screen sharing display method of the foregoing embodiments.
  • the processor 701 and the memory 702 are connected through a bus 703 .
  • FIG. 13 is a schematic structural diagram of another electronic device provided by an embodiment of the present disclosure. As shown in FIG. 13 , the electronic device 8 includes at least one processor 801 and a memory 802;
  • Memory 802 stores computer-executable instructions
  • At least one processor 801 executes the computer-executable instructions stored in the memory 802, so that the at least one processor 801 executes the method steps of the screen sharing display method in the embodiment shown in FIG. 8.
  • the processor 801 and the memory 802 are connected through a bus 803 .
  • the electronic device 900 may be a terminal device or a server.
  • the terminal devices may include, but are not limited to, mobile terminals such as mobile phones, notebook computers, digital broadcast receivers, personal digital assistants (Personal Digital Assistant, PDA for short), tablet computers (Portable Android Device, PAD for short), portable multimedia players (Portable Media Player, PMP for short), and in-vehicle terminals (such as in-vehicle navigation terminals), and fixed terminals such as digital TVs and desktop computers.
  • the electronic device shown in FIG. 14 is only an example, and should not impose any limitation on the function and scope of use of the embodiments of the present disclosure.
  • the electronic device 900 may include a processing device (such as a central processing unit, a graphics processor, etc.) 901, which may execute various appropriate actions and processes according to a program stored in a read-only memory (Read Only Memory, ROM for short) 902 or a program loaded from a storage device 908 into a random access memory (Random Access Memory, RAM for short) 903.
  • in the RAM 903, various programs and data necessary for the operation of the electronic device 900 are also stored.
  • the processing device 901, the ROM 902, and the RAM 903 are connected to each other through a bus 904.
  • An input/output (I/O) interface 905 is also connected to bus 904 .
  • the following devices can be connected to the I/O interface 905: input devices 906 including, for example, a touch screen, touch pad, keyboard, mouse, camera, microphone, accelerometer, gyroscope, etc.; output devices 907 including, for example, a liquid crystal display (Liquid Crystal Display, LCD for short), speaker, vibrator, etc.; storage devices 908 including, for example, magnetic tape, hard disk, etc.; and a communication device 909.
  • the communication means 909 may allow the electronic device 900 to communicate wirelessly or by wire with other devices to exchange data. While FIG. 14 shows an electronic device 900 having various means, it should be understood that not all of the illustrated means are required to be implemented or provided. More or fewer devices may alternatively be implemented or provided.
  • embodiments of the present disclosure include a computer program product comprising a computer program carried on a computer-readable medium, the computer program containing program code for performing the method illustrated in the flowchart.
  • the computer program may be downloaded and installed from the network via the communication device 909, or from the storage device 908, or from the ROM 902.
  • when the computer program is executed by the processing apparatus 901, the above-mentioned functions defined in the methods of the embodiments of the present disclosure are executed.
  • the computer-readable medium mentioned above in the present disclosure may be a computer-readable signal medium or a computer-readable storage medium, or any combination of the above two.
  • the computer-readable storage medium can be, for example, but not limited to, an electrical, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus or device, or a combination of any of the above.
  • Computer-readable storage media may include, but are not limited to: an electrical connection with one or more wires, a portable computer disk, a hard disk, random access memory (RAM), read-only memory (ROM), erasable programmable read-only memory (Erasable Programmable Read-Only Memory, EPROM for short, or flash memory), optical fiber, portable compact disc read-only memory (Compact Disc-Read Only Memory, CD-ROM for short), an optical storage device, a magnetic storage device, or any suitable combination of the above.
  • a computer-readable storage medium may be any tangible medium that contains or stores a program that can be used by or in conjunction with an instruction execution system, apparatus, or device.
  • a computer-readable signal medium may include a data signal propagated in baseband or as part of a carrier wave with computer-readable program code embodied thereon. Such propagated data signals may take a variety of forms, including but not limited to electromagnetic signals, optical signals, or any suitable combination of the foregoing.
  • a computer-readable signal medium can also be any computer-readable medium other than a computer-readable storage medium that can transmit, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device.
  • the program code contained on the computer-readable medium can be transmitted by any suitable medium, including but not limited to: electric wire, optical cable, RF (Radio Frequency, radio frequency for short), etc., or any suitable combination of the above.
  • the above-mentioned computer-readable medium may be included in the above-mentioned electronic device; or may exist alone without being assembled into the electronic device.
  • the aforementioned computer-readable medium carries one or more programs, and when the aforementioned one or more programs are executed by the electronic device, causes the electronic device to execute the methods shown in the foregoing embodiments.
  • Computer program code for carrying out operations of the present disclosure may be written in one or more programming languages, including object-oriented programming languages such as Java, Smalltalk, and C++, as well as conventional procedural programming languages such as the "C" language or similar programming languages.
  • the program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server.
  • the remote computer can be connected to the user's computer through any kind of network, including a local area network (Local Area Network, LAN for short) or a wide area network (Wide Area Network, WAN for short), or it can be connected to an external computer (for example, through the Internet using an Internet service provider).
  • each block in the flowchart or block diagrams may represent a module, segment, or portion of code that contains one or more executable instructions for implementing the specified logical functions.
  • the functions noted in the blocks may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved.
  • each block of the block diagrams and/or flowchart illustrations, and combinations of blocks in the block diagrams and/or flowchart illustrations, can be implemented by dedicated hardware-based systems that perform the specified functions or operations, or by a combination of dedicated hardware and computer instructions.
  • the units involved in the embodiments of the present disclosure may be implemented in a software manner, and may also be implemented in a hardware manner.
  • the name of the unit does not constitute a limitation of the unit itself under certain circumstances, for example, the first obtaining unit may also be described as "a unit that obtains at least two Internet Protocol addresses".
  • exemplary types of hardware logic components include: Field Programmable Gate Arrays (FPGA), Application Specific Integrated Circuits (ASIC), Application Specific Standard Products (ASSP), Systems on Chip (SOC), Complex Programmable Logic Devices (CPLD), etc.
  • a machine-readable medium may be a tangible medium that may contain or store a program for use by or in connection with the instruction execution system, apparatus or device.
  • the machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium.
  • Machine-readable media may include, but are not limited to, electronic, magnetic, optical, electromagnetic, infrared, or semiconductor systems, apparatuses, or devices, or any suitable combination of the foregoing.
  • more specific examples of machine-readable storage media would include an electrical connection based on one or more wires, a portable computer disk, a hard disk, random access memory (RAM), read-only memory (ROM), erasable programmable read-only memory (EPROM or flash memory), optical fiber, compact disk read-only memory (CD-ROM), optical storage, magnetic storage, or any suitable combination of the foregoing.
  • a screen sharing display method is provided, including: acquiring two adjacent frames of images of a first terminal device; determining image change information according to the two adjacent frames of images, where the image change information represents the degree of change of the next frame of image relative to the previous frame of image in the two adjacent frames of images; if it is determined that the degree of change represented by the image change information is greater than or equal to a first preset degree, compressing the next frame of image to obtain compressed image data; and sending the compressed image data to a second terminal device for display.
  • the method further includes: if it is determined that the degree of change represented by the image change information is less than or equal to a second preset degree, locally buffering the next frame of images in the two adjacent frames of images, wherein the second terminal device does not update the image.
  • acquiring two adjacent frames of images of the first terminal device includes: acquiring multiple frames of continuous images of the first terminal device, wherein the multiple frames of continuous images include at least three frames of images, and every two adjacent frames of the multiple frames of continuous images constitute a group of two adjacent frames of images; and compressing the next frame of image to obtain compressed image data if it is determined that the degree of change represented by the image change information is greater than or equal to the first preset degree includes: if the degree of change represented by the image change information corresponding to each group of two adjacent frames in the multiple frames of continuous images is greater than or equal to the first preset degree, compressing the last frame of the multiple frames of continuous images to obtain the compressed image data.
  • the image change information includes a similarity evaluation value, and the similarity evaluation value is used to represent the similarity between the next frame image and the previous frame image in the two adjacent frames of images; compressing the last frame of the multi-frame continuous images if the image change information corresponding to each group of two adjacent frames represents a degree of change greater than or equal to the first preset degree includes: if the similarity evaluation value of each group of two adjacent frames of images in the multi-frame continuous images is smaller than the preset first similarity evaluation value threshold, compressing the last frame of the multi-frame continuous images to obtain the compressed image data.
  • the method further includes: if the similarity evaluation value of the last group of two adjacent frames of images is greater than a preset second similarity evaluation value threshold in the multiple frames of consecutive images, buffering the image locally The image of the next frame in the last group of two adjacent frames of images, wherein the second terminal device does not update the image.
  • compressing a subsequent frame of image to obtain compressed image data includes: down-sampling the subsequent frame of image to obtain a compressed image frame; sending the compressed image frame to an encoder for encoding process to obtain compressed image data.
  • the method further includes: if it is determined that the degree of change represented by the image change information is less than or equal to a second preset degree, sending a heartbeat signal to the second terminal device; wherein the heartbeat signal is used to indicate the first The terminal device is in a connected state with the second terminal device.
  • the method further includes: acquiring mouse layer data; sending the mouse layer data to an encoder for encoding processing to obtain mouse layer encoding data; sending the mouse layer encoding data to the first Two terminal equipment.
  • determining image change information according to two adjacent frames of images includes: acquiring pixel information of the previous frame of image and pixel information of the next frame of image; according to a preset image comparison algorithm, The feature comparison is performed between the pixel information of the previous frame image and the pixel information of the next frame image to determine the image change information.
  • a screen sharing display method including: receiving compressed image data sent by a first terminal device; and displaying compressed image frames in the compressed image data.
  • displaying an image frame in the compressed image data includes: decoding the compressed image data to obtain a compressed image frame; and enlarging and displaying the compressed image frame according to a preset image size.
  • the method further includes: receiving mouse layer encoding data; decoding the mouse layer encoding data to obtain mouse layer data, and displaying the mouse layer data.
  • the method further includes: when the compressed image data sent by the first terminal device is not received, continuing to display the current image frame.
  • continuing to display the current image frame includes: if the heartbeat data sent by the first terminal device is received and the heartbeat data sent by the first terminal device is not received When the compressed image data sent by the first terminal device is sent, the current image frame is displayed; if the heartbeat data sent by the first terminal device is not received, and the compressed image data sent by the first terminal device is not received, an alarm message is output.
  • a screen sharing display apparatus, including:
  • an acquiring unit, configured to acquire two adjacent frames of images of a first terminal device;
  • a determining unit, configured to determine image change information according to the two adjacent frames of images, where the image change information represents the degree of change of the latter frame of image relative to the former frame of image in the two adjacent frames of images;
  • a compressing unit, configured to compress the latter frame of image to obtain compressed image data if it is determined that the degree of change represented by the image change information is greater than or equal to a first preset degree;
  • a sending unit, configured to send the compressed image data to a second terminal device for display.
  • the determining unit is further configured to: if it is determined that the degree of change represented by the image change information is less than or equal to a second preset degree, buffer locally the latter frame of image in the two adjacent frames of images, where the second terminal device does not update the image.
  • the acquiring unit is further configured to: acquire multiple frames of consecutive images of the first terminal device, where the multiple frames of consecutive images include at least three frames of images and every two adjacent frames of the multiple frames of consecutive images constitute one group of two adjacent frames of images; correspondingly, the compressing unit is specifically configured to: if the degree of change represented by the image change information corresponding to each group of two adjacent frames of images in the multiple frames of consecutive images is greater than or equal to the first preset degree, compress the last frame of the multiple frames of consecutive images to obtain the compressed image data.
  • the image change information includes a similarity evaluation value, and the similarity evaluation value is used to represent the similarity between the latter frame of image and the former frame of image in the two adjacent frames of images; the compressing unit is specifically configured to: if the similarity evaluation values of all groups of two adjacent frames of images in the multiple frames of consecutive images are smaller than a preset first similarity evaluation value threshold, compress the last frame of the multiple frames of consecutive images to obtain the compressed image data.
  • the determining unit is further configured to: if, in the multiple frames of consecutive images, the similarity evaluation value of the last group of two adjacent frames of images is greater than a preset second similarity evaluation value threshold, buffer locally the latter frame of image in the last group of two adjacent frames of images, where the second terminal device does not update the image.
  • when compressing the latter frame of image to obtain the compressed image data, the compressing unit is specifically configured to: down-sample the latter frame of image to obtain a compressed image frame; and send the compressed image frame to an encoder for encoding to obtain the compressed image data.
  • the sending unit is further configured to: if it is determined that the degree of change represented by the image change information is less than or equal to a second preset degree, send a heartbeat signal to the second terminal device, where the heartbeat signal is used to indicate that the first terminal device and the second terminal device are in a connected state.
  • the acquiring unit is further configured to: acquire mouse layer data; and send the mouse layer data to an encoder for encoding to obtain mouse layer encoded data; the sending unit is further configured to: send the mouse layer encoded data to the second terminal device.
  • the determining unit is specifically configured to: acquire the pixel information of the former frame of image and the pixel information of the latter frame of image; and perform feature comparison on the pixel information of the former frame of image and the pixel information of the latter frame of image according to a preset image comparison algorithm, to determine the image change information.
  • a screen sharing display apparatus, including:
  • a receiving unit, configured to receive compressed image data sent by a first terminal device;
  • a display unit, configured to display a compressed image frame in the compressed image data.
  • the display unit is specifically configured to: decode the compressed image data to obtain a compressed image frame; and enlarge and display the compressed image frame according to a preset image size.
  • the receiving unit is further configured to: receive mouse layer encoded data.
  • the display unit is further configured to: decode the mouse layer encoded data to obtain mouse layer data, and display the mouse layer data.
  • the display unit is further configured to: continue displaying the current image frame when compressed image data sent by the first terminal device is not received.
  • when continuing to display the current image frame while compressed data sent by the first terminal device is not received, the display unit is specifically configured to: if heartbeat data sent by the first terminal device is received and compressed image data sent by the first terminal device is not received, display the current image frame; if neither heartbeat data nor compressed image data sent by the first terminal device is received, output alarm information.
  • an electronic device, including: at least one processor and a memory;
  • the memory stores computer-executable instructions;
  • the at least one processor executes the computer-executable instructions stored in the memory, so that the at least one processor performs the screen sharing display method according to the above first aspect and the various possible designs of the first aspect.
  • an electronic device, including: at least one processor and a memory;
  • the memory stores computer-executable instructions;
  • the at least one processor executes the computer-executable instructions stored in the memory, so that the at least one processor performs the screen sharing display method according to the above second aspect and the various possible designs of the second aspect.
  • a computer-readable storage medium, where computer-executable instructions are stored in the computer-readable storage medium; when a processor executes the computer-executable instructions, the screen sharing display method according to the above first aspect and the various possible designs of the first aspect is implemented, or the screen sharing display method according to the above second aspect and the various possible designs of the second aspect is implemented.
  • a computer program product, including a computer program which, when executed by a processor, implements the screen sharing display method according to the above first aspect and the various possible designs of the first aspect, or implements the screen sharing display method according to the above second aspect and the various possible designs of the second aspect.
  • a computer program which, when executed by a processor, implements the screen sharing display method according to the above first aspect and the various possible designs of the first aspect, or implements the screen sharing display method according to the above second aspect and the various possible designs of the second aspect.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Human Computer Interaction (AREA)
  • Business, Economics & Management (AREA)
  • General Business, Economics & Management (AREA)
  • Compression Or Coding Systems Of Tv Signals (AREA)

Abstract

Embodiments of the present disclosure provide a screen sharing display method and apparatus, a device, and a storage medium. The method includes: acquiring two adjacent frames of images of a first terminal device; determining image change information according to the two adjacent frames of images, where the image change information represents the degree of change of the latter frame of image relative to the former frame of image in the two adjacent frames of images; if it is determined that the degree of change represented by the image change information is greater than or equal to a first preset degree, compressing the latter frame of image to obtain compressed image data; and sending the compressed image data to a second terminal device for display. Because the first terminal device compresses an image according to how much the image has changed, the image data corresponding to the screen image is compressed when the image changes substantially during screen sharing. This reduces the amount of transmitted data without affecting the delivery of useful data, solves the problem of heavy data traffic during screen sharing display, and reduces display stutter.

Description

Screen sharing display method and apparatus, device, and storage medium
CROSS-REFERENCE TO RELATED APPLICATIONS
This application claims priority to Chinese Patent Application No. 202011563917.4, filed on December 25, 2020 and entitled "Screen sharing display method and apparatus, device, and storage medium", the entire content of which is incorporated herein by reference.
TECHNICAL FIELD
Embodiments of the present disclosure relate to the field of computer and network communication technologies, and in particular, to a screen sharing display method and apparatus, a device, and a storage medium.
BACKGROUND
The popularization of video conferencing has satisfied users' needs for remote meetings and communication and improved office efficiency. In a video conference, screen sharing is a core function: by remotely presenting on-screen content to the receiving end of the screen share, information is delivered quickly and the efficiency of information exchange between users is effectively improved.
At present, in screen sharing implementations, the sending end of the screen share needs to capture the image data of the screen frame by frame, encode it, and send it over the network to the receiving end for display.
However, screen sharing places high demands on the definition of the captured screen images. As a result, enormous data traffic is generated during screen sharing, which increases the network bandwidth burden and causes video stutter.
SUMMARY
Embodiments of the present disclosure provide a screen sharing display method and apparatus, a device, and a storage medium, to overcome the problem that enormous data traffic generated during screen sharing increases the network bandwidth burden and causes video stutter.
In a first aspect, an embodiment of the present disclosure provides a screen sharing display method, including:
acquiring two adjacent frames of images of a first terminal device; determining image change information according to the two adjacent frames of images, where the image change information represents the degree of change of the latter frame of image relative to the former frame of image in the two adjacent frames of images; if it is determined that the degree of change represented by the image change information is greater than or equal to a first preset degree, compressing the latter frame of image to obtain compressed image data; and sending the compressed image data to a second terminal device for display.
In a second aspect, an embodiment of the present disclosure provides a screen sharing display method, including:
receiving compressed image data sent by a first terminal device; and
displaying a compressed image frame in the compressed image data.
In a third aspect, an embodiment of the present disclosure provides a screen sharing display apparatus, including:
an acquiring unit, configured to acquire two adjacent frames of images of a first terminal device;
a determining unit, configured to determine image change information according to the two adjacent frames of images, where the image change information represents the degree of change of the latter frame of image relative to the former frame of image in the two adjacent frames of images;
a compressing unit, configured to compress the latter frame of image to obtain compressed image data if it is determined that the degree of change represented by the image change information is greater than or equal to a first preset degree; and
a sending unit, configured to send the compressed image data to a second terminal device for display.
In a fourth aspect, an embodiment of the present disclosure provides a screen sharing display apparatus, including:
a receiving unit, configured to receive compressed image data sent by a first terminal device; and
a display unit, configured to display a compressed image frame in the compressed image data.
In a fifth aspect, an embodiment of the present disclosure provides an electronic device, including: at least one processor and a memory;
where the memory stores computer-executable instructions; and
the at least one processor executes the computer-executable instructions stored in the memory, so that the at least one processor performs the screen sharing display method according to the above first aspect and the various possible designs of the first aspect.
In a sixth aspect, an embodiment of the present disclosure provides an electronic device, including: at least one processor and a memory;
where the memory stores computer-executable instructions; and
the at least one processor executes the computer-executable instructions stored in the memory, so that the at least one processor performs the screen sharing display method according to the above second aspect and the various possible designs of the second aspect.
In a seventh aspect, an embodiment of the present disclosure provides a computer-readable storage medium storing computer-executable instructions; when a processor executes the computer-executable instructions, the screen sharing display method according to the above first aspect and the various possible designs of the first aspect is implemented, or, when the processor executes the computer-executable instructions, the screen sharing display method according to the above second aspect and the various possible designs of the second aspect is implemented.
In an eighth aspect, an embodiment of the present disclosure provides a computer program product including a computer program which, when executed by a processor, implements the screen sharing display method according to the above first aspect and the various possible designs of the first aspect, or implements the screen sharing display method according to the above second aspect and the various possible designs of the second aspect.
In a ninth aspect, an embodiment of the present disclosure provides a computer program which, when executed by a processor, implements the screen sharing display method according to the above first aspect and the various possible designs of the first aspect, or implements the screen sharing display method according to the above second aspect and the various possible designs of the second aspect.
According to the screen sharing display method and apparatus, device, and storage medium provided by the embodiments, two adjacent frames of images of the first terminal device are acquired; image change information is determined according to the two adjacent frames of images, where the image change information represents the degree of change of the latter frame of image relative to the former frame of image; if it is determined that the degree of change represented by the image change information is greater than or equal to a first preset degree, the latter frame of image is compressed to obtain compressed image data; and the compressed image data is sent to a second terminal device for display. Because the first terminal device compresses the image according to how much the image has changed, the image data corresponding to the screen image is compressed when the image changes substantially during screen sharing, that is, while the screen is sliding. This reduces the amount of transmitted data without affecting the delivery of useful data, solves the problem of heavy data traffic during screen sharing display, improves the smoothness of the shared display, and reduces display stutter.
BRIEF DESCRIPTION OF THE DRAWINGS
To describe the technical solutions in the embodiments of the present disclosure or in the prior art more clearly, the following briefly introduces the accompanying drawings needed for describing the embodiments or the prior art. Apparently, the accompanying drawings in the following description show some embodiments of the present disclosure, and a person of ordinary skill in the art may still derive other drawings from these accompanying drawings without creative efforts.
FIG. 1 is a schematic diagram of an application scenario according to an embodiment of the present disclosure;
FIG. 2 is a first schematic flowchart of a screen sharing display method according to an embodiment of the present disclosure;
FIG. 3 is a schematic diagram of sending a compressed image to a second terminal device according to an embodiment of the present disclosure;
FIG. 4 is a second schematic flowchart of a screen sharing display method according to an embodiment of the present disclosure;
FIG. 5 is a schematic diagram of multiple frames of consecutive images according to an embodiment of the present disclosure;
FIG. 6 is a schematic flowchart of step S202 in the embodiment shown in FIG. 4;
FIG. 7 is a schematic diagram of processing multiple frames of consecutive images according to an embodiment of the present disclosure;
FIG. 8 is a third schematic flowchart of a screen sharing display method according to an embodiment of the present disclosure;
FIG. 9 is a signaling diagram of a screen sharing display method according to an embodiment of the present disclosure;
FIG. 10 is a structural block diagram of a screen sharing display apparatus according to an embodiment of the present disclosure;
FIG. 11 is a structural block diagram of another screen sharing display apparatus according to an embodiment of the present disclosure;
FIG. 12 is a schematic structural diagram of an electronic device according to an embodiment of the present disclosure;
FIG. 13 is a schematic structural diagram of another electronic device according to an embodiment of the present disclosure;
FIG. 14 is a schematic diagram of a hardware structure of an electronic device according to an embodiment of the present disclosure.
DETAILED DESCRIPTION
To make the objectives, technical solutions, and advantages of the embodiments of the present disclosure clearer, the following clearly and completely describes the technical solutions in the embodiments of the present disclosure with reference to the accompanying drawings in the embodiments of the present disclosure. Apparently, the described embodiments are some rather than all of the embodiments of the present disclosure. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present disclosure without creative efforts shall fall within the protection scope of the present disclosure.
The application scenarios involved in the present disclosure are first introduced below.
FIG. 1 is a schematic diagram of an application scenario according to an embodiment of the present disclosure. As shown in FIG. 1, in a scenario where the screen sharing function is used, user A remotely presents the content displayed in a terminal device to user B. Specifically, after terminal device a operated by user A and terminal device b operated by user B establish a communication connection, user A sends screen images from terminal device a to terminal device b, so that user B can watch the content on the screen of terminal device a online in real time through terminal device b. Data exchange between terminal device a and terminal device b may be implemented through a cloud server as shown in FIG. 1, or directly between the two devices (not shown), which is not specifically limited here. In the application scenario shown in FIG. 1, user A displays, through screen sharing, the content shown on the screen of terminal device a on terminal device b, achieving real-time document presentation, information exchange, and the like.
In the prior art, when the screen sharing function is used, the sending end usually captures the image data displayed on the screen frame by frame, encodes it, and sends it over the network to the receiving end for display. However, screen sharing places high demands on the definition of the captured screen images; capturing and encoding frame by frame generates enormous data traffic during screen sharing and increases the network bandwidth burden, and if the network load is too high, video stutter occurs and the screen sharing display effect degrades.
In practical applications, unlike scenarios such as video conferencing, the main purpose of the screen sharing function is to present documents, pictures, and other files with high information density, and the presentation of such files usually requires the screen to remain static. Taking document presentation as an example: while the sharing party shows the current page of the document, the screen remains static and the viewing party reads the content of the current page in detail; after the current page has been presented, the document is scrolled to the next page, and during the scrolling the viewing user does not actually need to examine the displayed content in detail. Therefore, sampling, encoding, and transmitting the images displayed during document scrolling at the same high frequency without distinction wastes device resources and bandwidth resources, which creates the network bandwidth burden. Hence, a method is urgently needed to solve the above problems.
An embodiment of the present disclosure provides a screen sharing display method to solve the above problems.
FIG. 2 is a first schematic flowchart of a screen sharing display method according to an embodiment of the present disclosure. Referring to FIG. 2, the method of this embodiment may be applied to a first terminal device, and the screen sharing display method includes:
S101: Acquire two adjacent frames of images of the first terminal device.
Exemplarily, the first terminal device is the terminal device that provides the image data for screen sharing, that is, the first terminal device shares its own screen images as source information with other terminal devices for viewing. Exemplarily, the first terminal device may acquire the images displayed on the screen by sampling the displayed content frame by frame to obtain multiple frames of screen images. Further, to determine whether the shared screen is currently in a static state or a sliding state, the change between two adjacent frames of images needs to be acquired. Here, exemplarily, the two adjacent frames may be the latest captured frame of image and the frame of image immediately preceding it.
S102: Determine image change information according to the two adjacent frames of images, where the image change information represents the degree of change of the latter frame of image relative to the former frame of image in the two adjacent frames of images.
Exemplarily, after the two adjacent frames of images are acquired, the change between them is determined, that is, the image change information is determined; the image change information represents the degree of change of the latter frame of image relative to the former frame of image in the two adjacent frames of images. Exemplarily, the image change information includes a similarity evaluation value, and the similarity evaluation value is used to represent the similarity between the latter frame of image and the former frame of image in the two adjacent frames of images.
In a possible implementation, determining the image change information according to the two adjacent frames of images includes: determining the similarity of each group of two adjacent frames of images according to multiple groups of two adjacent frames of images, and generating the image change information according to the similarity evaluation value of each group. Specifically, for example, the similarity corresponding to each group of two adjacent frames of images is calculated to obtain the similarity evaluation value of each group, and the multiple similarity evaluation values corresponding to the multiple groups are assembled into a similarity evaluation value sequence, which serves as the image change information.
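As a minimal illustration of building such a similarity evaluation value sequence (not code from the patent; the function names and the mean-absolute-difference metric are illustrative choices, frames are flat lists of 0-255 pixel values):

```python
def similarity(prev, curr):
    """Similarity evaluation value in [0, 1] for two equal-size frames,
    given as flat lists of 0-255 pixel values; 1.0 means identical."""
    diff = sum(abs(a - b) for a, b in zip(prev, curr))
    return 1.0 - diff / (255.0 * len(prev))

def change_info(frames):
    """Similarity evaluation value sequence over each adjacent pair of frames."""
    return [similarity(a, b) for a, b in zip(frames, frames[1:])]
```

The sequence has one entry per group of two adjacent frames, so three consecutive frames yield two similarity evaluation values.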
S103: If it is determined that the degree of change represented by the image change information is greater than or equal to a first preset degree, compress the latter frame of image to obtain compressed image data.
Specifically, in one possible implementation, if the degree of change represented by the image change information is greater than or equal to the first preset degree, the latter of the two adjacent frames has changed substantially relative to the former frame; in this case it can be determined that the screen is in a sliding state, that is, the information displayed on the screen in this state is meaningless information. In another possible implementation, the image change information is a similarity evaluation value sequence including multiple similarity evaluation values; if the degree of change represented by the image change information is greater than or equal to the first preset degree, all of the similarity evaluation values are already below a preset threshold, that is, multiple consecutive frames of images have already changed substantially. In this case, too, it can be determined that the screen is in a sliding state and the displayed information is meaningless.
When the latter of two frames has changed substantially relative to the former, the latter frame will produce a key frame (an I-frame) in the subsequent encoding process. Compared with a non-key frame (a P-frame), a key frame carries more data, and transmitting key frames occupies more bandwidth. Therefore, compressing the screen images captured in this stage reduces the data volume of the key frames produced by subsequent encoding, thereby effectively reducing the amount of transmitted data. Exemplarily, compressing the screen data means compressing the latter of the two captured adjacent frames of images, or the last frame of multiple frames of consecutive images, that is, the latest captured frame of image. Further, compressing the latter frame of image to obtain compressed image data includes: down-sampling the latter frame of image, for example, down-sampling an image of 1080p resolution to 270p. The down-sampling yields a compressed image frame, which has a lower resolution and a smaller data size than the image before down-sampling.
Further, the compressed image frame is sent to an encoder for encoding to obtain the compressed image data. The process of encoding an image is prior art in the field and is not repeated here.
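The down-sampling step (e.g. 1080p to 270p, a factor of 4 per dimension) could be sketched as an average-pooling pass over the frame; this is an illustrative sketch, not the patent's prescribed resampler:

```python
def downsample(frame, factor):
    """Average-pool a 2-D frame (list of pixel rows) by `factor` in each
    dimension; e.g. a 1080x1920 frame with factor 4 becomes 270x480."""
    h = len(frame) // factor
    w = len(frame[0]) // factor
    out = []
    for i in range(h):
        row = []
        for j in range(w):
            # Integer average over the factor x factor block of source pixels.
            block = [frame[i * factor + di][j * factor + dj]
                     for di in range(factor) for dj in range(factor)]
            row.append(sum(block) // len(block))
        out.append(row)
    return out
```

A real implementation would hand the reduced frame to a hardware or software video encoder; a production scaler would also apply a low-pass filter of its choosing.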
S104: Send the compressed image data to a second terminal device for display.
Exemplarily, the second terminal device is the terminal device that receives and displays the shared screen images. Sending the compressed image to the second terminal device may be implemented by sending the compressed image data to a cloud server, which forwards it to the second terminal device, or by the first terminal device sending the compressed image data directly to the second terminal device; this may be configured as needed and is not specifically limited here. After the compressed image data is generated, it is sent to the second terminal device, and the second terminal device decodes and displays it, completing the shared-screen display process. FIG. 3 is a schematic diagram of sending a compressed image to a second terminal device according to an embodiment of the present disclosure. As shown in FIG. 3, the first terminal device determines, according to the degree of change between the currently captured screen image frame and the previous frame, whether the screen is in a sliding state; if so, it compresses the screen image, generates compressed image data, and sends it to the second terminal device for display, reducing the amount of transmitted data and the network load.
In this embodiment, two adjacent frames of images of the first terminal device are acquired; image change information is determined according to the two adjacent frames of images, where the image change information represents the degree of change of the latter frame of image relative to the former frame of image; if it is determined that the degree of change represented by the image change information is greater than or equal to a first preset degree, the latter frame of image is compressed to obtain compressed image data; and the compressed image data is sent to a second terminal device for display. Because the first terminal device compresses the image according to how much the image has changed, the image data corresponding to the screen image is compressed when the image changes substantially during screen sharing, that is, while the screen is sliding. This reduces the amount of transmitted data without affecting the delivery of useful data, solves the problem of heavy data traffic during screen sharing display, improves the smoothness of the shared display, and reduces display stutter.
FIG. 4 is a second schematic flowchart of a screen sharing display method according to an embodiment of the present disclosure. In this embodiment, steps S101 to S103 are further refined, and a step of discarding the corresponding screen image frame when the screen is static is added. The screen sharing display method includes:
S201: Acquire multiple frames of consecutive images of the first terminal device, where the multiple frames of consecutive images include at least three frames of images, and every two adjacent frames of the multiple frames of consecutive images constitute one group of two adjacent frames of images.
Specifically, acquiring the multiple frames of consecutive images of the first terminal device includes: the first terminal device captures the content displayed on its screen frame by frame; each time a frame of image is captured, it is buffered in a local storage medium, after which the next frame is captured. Each time the latest frame of image is captured, the first terminal device reads from the buffer the several frames of images adjacent to the current frame and combines those adjacent frames with the current frame into multiple frames of consecutive images. Specifically, for example, the two frames preceding the current frame are obtained from the buffer and, together with the current frame, form three consecutive frames of images.
Further, every two adjacent frames of the multiple frames of consecutive images constitute one group of two adjacent frames of images. FIG. 5 is a schematic diagram of multiple frames of consecutive images according to an embodiment of the present disclosure. As shown in FIG. 5, the multiple frames of consecutive images include a first frame to a fourth frame, where the first and second frames, the second and third frames, and the third and fourth frames respectively form three groups of two adjacent frames of images.
S202: Determine the image change information according to each group of two adjacent frames of images.
Exemplarily, the image change information includes a similarity evaluation value sequence containing multiple similarity evaluation values, each corresponding to one group of two adjacent frames of images. The similarity evaluation value is used to represent the similarity between the latter frame of image and the former frame of image in the two adjacent frames of images.
Optionally, as shown in FIG. 6, step S202 includes two specific implementation steps, S2021 and S2022:
S2021: Acquire the pixel information of the former frame of image and the pixel information of the latter frame of image in each group of two adjacent frames of images.
S2022: Perform feature comparison on the pixel information of the former frame of image and the pixel information of the latter frame of image in each group of two adjacent frames of images according to a preset image comparison algorithm, to determine the similarity evaluation value of each group of two adjacent frames of images.
Exemplarily, performing feature comparison on the pixel information of the former and latter frames in each group may be, for example, calculating the similarity of the pixel information of the former frame and the latter frame to obtain a corresponding similarity evaluation value; implementations of calculating similarity from pixel information are prior art and are not repeated here. As another example, the similarity evaluation value of each group of two adjacent frames of images may also be determined by calculating the structural similarity (SSIM), which compares the images in three aspects: luminance, contrast, and structure, or by methods such as content-feature methods and key-point matching, which are not enumerated here one by one.
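For the SSIM option mentioned above, a global (single-window) variant comparing luminance, contrast, and structure could be sketched as follows; the constants are the conventional defaults for 8-bit images, and this sketch omits the local-window averaging a full SSIM implementation performs:

```python
def ssim(x, y, c1=6.5025, c2=58.5225):
    """Global SSIM of two equal-size flat pixel lists, combining the
    luminance, contrast, and structure comparisons into one score in (-1, 1]."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n                       # mean luminance
    vx = sum((p - mx) ** 2 for p in x) / n                # variance (contrast)
    vy = sum((p - my) ** 2 for p in y) / n
    cov = sum((p - mx) * (q - my) for p, q in zip(x, y)) / n  # structure term
    return ((2 * mx * my + c1) * (2 * cov + c2)) / \
           ((mx * mx + my * my + c1) * (vx + vy + c2))
```

Identical frames score 1.0; the more the latter frame diverges from the former, the lower the score, so it can serve directly as the similarity evaluation value.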
S203: If the similarity evaluation values of all groups of two adjacent frames of images in the multiple frames of consecutive images are smaller than a preset first similarity evaluation value threshold, compress the last frame of the multiple frames of consecutive images to obtain the compressed image data.
Specifically, if the similarity evaluation values of all groups of two adjacent frames of images in the multiple frames of consecutive images are smaller than the preset first similarity evaluation value threshold, the frames in the multiple consecutive frames all differ substantially from one another; in this case it is determined that the screen has entered a sliding state. While the screen is sliding, the displayed content does not need to be examined closely by the user, and the captured images will produce key frames in the subsequent encoding process; compared with non-key frames, key frames carry more data and occupy more bandwidth during transmission. Therefore, the plane image captured after entering the sliding state, that is, the last frame of the multiple frames of consecutive images, is compressed and encoded to obtain compressed image data, and transmitting the compressed image data effectively reduces the amount of transmitted data.
S204: If, in the multiple frames of consecutive images, the similarity evaluation value of the last group of two adjacent frames of images is greater than a preset second similarity evaluation value threshold, buffer locally the latter frame of image in the last group of two adjacent frames of images, where the second terminal device does not update the image.
Specifically, while the first terminal device is capturing screen images, if in the obtained multiple frames of consecutive images the last group of two adjacent frames, that is, the latest captured screen image and the screen image adjacent to it, has a similarity evaluation value greater than the second preset similarity evaluation value threshold, for example greater than 99%, the latest captured screen image is almost unchanged relative to the adjacent preceding frame, and the screen is determined to be in a static state. When the screen is static, the latest captured screen image would produce non-key frames in the subsequent encoding process, and missing non-key frames does not affect normal video playback. Therefore, when the screen is static, the displayed content is not updated and there is no need to send the captured screen image to the second terminal device, that is, the non-key frames corresponding to the screen image are neither generated nor sent, which reduces the amount of data transmission and the consumption of bandwidth resources, while the second terminal device may continue displaying the current screen image without updating it.
In this embodiment, when the similarity evaluation value of the last group of two adjacent frames of images in the multiple frames of consecutive images is greater than the preset second similarity evaluation value threshold, no image data is sent to the second terminal device; the latest captured image is only buffered for the judgment of the next screen image frame. Thus, when the screen is static, image data is no longer sent to the second terminal device, further reducing the amount of data transmission during screen sharing display and lowering the network load.
The following describes the first terminal device's processing of multiple frames of consecutive images with a more specific embodiment. FIG. 7 is a schematic diagram of processing multiple frames of consecutive images according to an embodiment of the present disclosure. As shown in FIG. 7, the first terminal device captures the content displayed on the screen in real time, generates screen image frames, and buffers them. The first terminal device determines the screen state by judging the most recent three consecutive frames of images: in these three frames, if the similarity evaluation value of the pair formed by the first frame p1 and the second frame p2 is greater than 99%, the screen is determined to be static; the latest captured frame p1 is buffered in the local storage medium and no data is sent to the second terminal device, so that the second terminal device keeps displaying the current content without updating, while the first terminal device continues capturing screen images. If the similarity evaluation value of the pair formed by p1 and p2 is smaller than 90% and the similarity evaluation value of the pair formed by the second frame p2 and the third frame p3 is also smaller than 90%, the screen is determined to be sliding, and the latest captured frame p1 is compressed into compressed image data and sent to the second terminal device, so that the second terminal device updates the image. In other cases, the screen is determined to be in a normal state, and the first terminal device sends the latest captured frame p1 to the second terminal device for normal display.
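The three-frame decision just described can be condensed into a small classifier; the 99%/90% thresholds come from the example above, and the state names and function signature are illustrative:

```python
def screen_state(s12, s23, static_thr=0.99, slide_thr=0.90):
    """Classify the screen from the similarity evaluation values of the
    (p1, p2) pair and the (p2, p3) pair of the latest three frames."""
    if s12 > static_thr:
        return "static"    # buffer p1 locally, send nothing
    if s12 < slide_thr and s23 < slide_thr:
        return "sliding"   # compress p1 and send compressed image data
    return "normal"        # send p1 for normal display
```

Only two similarity values are needed because the three most recent frames form exactly two adjacent-frame groups.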
Optionally, if it is determined that the degree of change represented by the image change information is less than or equal to a second preset degree, a heartbeat signal is sent to the second terminal device, where the heartbeat signal is used to indicate that the first terminal device and the second terminal device are in a connected state.
In this step, when the degree of change represented by the image change information is less than or equal to the second preset degree, the first terminal device sends no image data to the second terminal device. To enable the second terminal device to confirm its connection state with the first terminal device in real time and avoid a connection reset, the first terminal device sends a heartbeat signal to the second terminal device, improving connection stability during screen sharing.
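A capture loop that substitutes a heartbeat for image data while the screen is static might be structured as below; every callback name here is hypothetical, and the patent does not prescribe this loop shape:

```python
import time

def share_loop(capture, send_image, send_heartbeat, is_static, interval=1.0):
    """While the screen is static, send only a connection-alive heartbeat;
    otherwise send the captured frame. `capture` returns None to stop."""
    while True:
        frame = capture()
        if frame is None:
            break
        if is_static(frame):
            send_heartbeat()   # no image payload, keeps the link from resetting
        else:
            send_image(frame)
        time.sleep(interval)
```

In a real sender, `is_static` would encapsulate the adjacent-frame similarity check and `send_image` the down-sampling and encoding path.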
Optionally, after step S204, the method further includes:
S205: Acquire mouse layer data; send the mouse layer data to an encoder for encoding to obtain mouse layer encoded data; and send the mouse layer encoded data to the second terminal device.
Exemplarily, when the first terminal device determines that the screen is static, not sending screen image data to the second terminal achieves the goal of reducing the amount of data transmission. However, in some scenarios the user on the first terminal device side needs to use the mouse to point at the content displayed on the screen of the first terminal device, so that the spoken explanation during a video conference is clearer. Therefore, the display position of the mouse needs to be sent to the second terminal device for display.
Specifically, after the mouse layer data is acquired, it is sent to the encoder for encoding to obtain the mouse layer encoded data. Since the mouse layer data is very small, encoding it separately and sending the generated mouse layer encoded data to the second terminal device has little impact on the amount of data transmission, so the mouse position can be updated and displayed in real time without updating the screen image.
S206: Send the compressed image data to the second terminal device for display.
In this embodiment, step S206 is consistent with step S104 in the above embodiment; for a detailed discussion, refer to the discussion of step S104, which is not repeated here.
FIG. 8 is a third schematic flowchart of a screen sharing display method according to an embodiment of the present disclosure. Referring to FIG. 8, the method of this embodiment may be applied to a second terminal device, and the screen sharing display method includes:
S301: Receive compressed image data sent by a first terminal device.
S302: Display a compressed image frame in the compressed image data.
The compressed image data contains a compressed image frame, which is the image frame obtained by the first terminal device compressing the plane image captured while the screen was sliding; this image frame has a smaller data size and also a lower resolution. Specifically, displaying the compressed image frame in the compressed image data includes:
decoding the compressed image data to obtain the compressed image frame; and enlarging and displaying the compressed image frame according to a preset image size. In the process of compressing the screen image, the first terminal device scaled the image down, reducing its size; for the picture to be displayed at its normal size on the second terminal device side, the image needs to be enlarged again, that is, scaled once more so that the image size matches the screen display size.
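The receiver-side enlargement back to the preset display size could be sketched as nearest-neighbour upscaling (illustrative only; a real client would typically use the decoder's or GPU's scaler with filtering):

```python
def upscale(frame, factor):
    """Nearest-neighbour enlargement of a 2-D frame (list of pixel rows)
    by `factor` in each dimension, e.g. 270p back toward the display size."""
    out = []
    for row in frame:
        wide = [p for p in row for _ in range(factor)]    # repeat each pixel
        out.extend([list(wide) for _ in range(factor)])   # repeat each row
    return out
```

Because the sharing side deliberately discarded detail during the sliding state, the enlarged picture is blurrier than the original, which is acceptable while the content is in motion.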
In a possible implementation, when the second terminal device does not receive compressed image data sent by the first terminal device, it continues displaying the current image frame.
Further, in a possible implementation, if heartbeat data sent by the first terminal device is received and no compressed image data sent by the first terminal device is received, the current image frame is displayed; if neither heartbeat data nor compressed image data sent by the first terminal device is received, alarm information is output.
Optionally, the method further includes:
S303: Receive mouse layer encoded data; decode the mouse layer encoded data to obtain mouse layer data, and display the mouse layer data.
After receiving the mouse layer encoded data, the second terminal device decodes it to obtain the mouse layer data, which represents the position at which the mouse is displayed. According to the mouse layer encoded data sent by the first terminal device in real time, the mouse is displayed on the second terminal device in real time. Since the mouse layer encoded data is small, receiving and processing it and displaying the mouse achieves the goal of displaying the mouse without affecting the amount of data transmission.
FIG. 9 is a signaling diagram of a screen sharing display method according to an embodiment of the present disclosure. Referring to FIG. 9, the screen sharing display method provided by this embodiment of the present disclosure includes:
S401: The first terminal device captures the current screen image to obtain a current image frame.
S402: The first terminal device determines, according to the current image frame, the two adjacent frames of images preceding the current image frame, and takes the current image frame together with those two preceding adjacent frames as multiple frames of consecutive images.
S403: If the first terminal device determines that the similarity evaluation values of all groups of two adjacent frames of images in the multiple frames of consecutive images are smaller than the preset first similarity evaluation value threshold, it compresses the current image frame to obtain compressed image data.
S404: If the first terminal device determines that, in the multiple frames of consecutive images, the similarity evaluation value of the current image frame and the image frame adjacent to and preceding it is greater than the preset second similarity evaluation value threshold, it buffers the current image frame locally.
S405: The first terminal device acquires mouse layer data and sends the mouse layer data to an encoder for encoding to obtain mouse layer encoded data.
S406: The first terminal device sends the mouse layer encoded data to the second terminal device.
S407: The second terminal device receives the mouse layer encoded data.
S408: The second terminal device decodes the mouse layer encoded data to obtain mouse layer data, and displays the mouse layer data.
S409: The first terminal device sends the compressed image data to the second terminal device for display.
S410: The second terminal device receives the compressed image data sent by the first terminal device and displays the compressed image frame in the compressed image data.
The implementations and beneficial effects of steps S401 to S410 of this embodiment are described in the embodiments shown in FIGS. 2 to 7 above and are not repeated here.
Corresponding to the screen sharing display methods of the above embodiments, FIG. 10 is a structural block diagram of a screen sharing display apparatus according to an embodiment of the present disclosure, applied to a first terminal device. For ease of description, only the parts related to the embodiments of the present disclosure are shown. Referring to FIG. 10, the screen sharing display apparatus 5 includes:
an acquiring unit 51, configured to acquire two adjacent frames of images of the first terminal device;
a determining unit 52, configured to determine image change information according to the two adjacent frames of images, where the image change information represents the degree of change of the latter frame of image relative to the former frame of image in the two adjacent frames of images;
a compressing unit 53, configured to compress the latter frame of image to obtain compressed image data if it is determined that the degree of change represented by the image change information is greater than or equal to a first preset degree; and
a sending unit 54, configured to send the compressed image data to a second terminal device for display.
In an embodiment of the present disclosure, the determining unit 52 is further configured to: if it is determined that the degree of change represented by the image change information is less than or equal to a second preset degree, buffer locally the latter frame of image in the two adjacent frames of images, where the second terminal device does not update the image.
In an embodiment of the present disclosure, the acquiring unit 51 is further configured to: acquire multiple frames of consecutive images of the first terminal device, where the multiple frames of consecutive images include at least three consecutive frames of images, and every two adjacent frames of the multiple frames of consecutive images constitute one group of two adjacent frames of images. Correspondingly, the compressing unit 53 is specifically configured to: if the degree of change represented by the image change information corresponding to each group of two adjacent frames of images in the multiple frames of consecutive images is greater than or equal to the first preset degree, compress the last frame of the multiple frames of consecutive images to obtain the compressed image data.
In an embodiment of the present disclosure, the image change information includes a similarity evaluation value, and the similarity evaluation value is used to represent the similarity between the latter frame of image and the former frame of image in the two adjacent frames of images; the compressing unit 53 is specifically configured to: if the similarity evaluation values of all groups of two adjacent frames of images in the multiple frames of consecutive images are smaller than a preset first similarity evaluation value threshold, compress the last frame of the multiple frames of consecutive images to obtain the compressed image data.
In an embodiment of the present disclosure, the determining unit 52 is further configured to: if, in the multiple frames of consecutive images, the similarity evaluation value of the last group of two adjacent frames of images is greater than a preset second similarity evaluation value threshold, buffer locally the latter frame of image in the last group of two adjacent frames of images, where the second terminal device does not update the image.
In an embodiment of the present disclosure, when compressing the latter frame of image to obtain the compressed image data, the compressing unit 53 is specifically configured to: down-sample the latter frame of image to obtain a compressed image frame; and send the compressed image frame to an encoder for encoding to obtain the compressed image data.
In an embodiment of the present disclosure, the sending unit 54 is further configured to: if it is determined that the degree of change represented by the image change information is less than or equal to the second preset degree, send a heartbeat signal to the second terminal device, where the heartbeat signal is used to indicate that the first terminal device and the second terminal device are in a connected state.
In an embodiment of the present disclosure, the acquiring unit 51 is further configured to: acquire mouse layer data; and send the mouse layer data to an encoder for encoding to obtain mouse layer encoded data. The sending unit 54 is further configured to: send the mouse layer encoded data to the second terminal device.
In an embodiment of the present disclosure, the determining unit 52 is specifically configured to: acquire the pixel information of the former frame of image and the pixel information of the latter frame of image; and perform feature comparison on the pixel information of the former frame of image and the pixel information of the latter frame of image according to a preset image comparison algorithm, to determine the image change information.
The device provided in this embodiment may be used to perform the technical solutions of the above method embodiments; the implementation principles and technical effects are similar and are not repeated here.
FIG. 11 is a structural block diagram of another screen sharing display apparatus according to an embodiment of the present disclosure, applied to a second terminal device. For ease of description, only the parts related to the embodiments of the present disclosure are shown. Referring to FIG. 11, the screen sharing display apparatus 6 includes:
a receiving unit 61, configured to receive compressed image data sent by a first terminal device; and
a display unit 62, configured to display a compressed image frame in the compressed image data.
In an embodiment of the present disclosure, the display unit 62 is specifically configured to: decode the compressed image data to obtain the compressed image frame; and enlarge and display the compressed image frame according to a preset image size.
In an embodiment of the present disclosure, the receiving unit 61 is further configured to: receive mouse layer encoded data. The display unit is further configured to: decode the mouse layer encoded data to obtain mouse layer data, and display the mouse layer data.
In an embodiment of the present disclosure, the display unit 62 is further configured to: continue displaying the current image frame when compressed image data sent by the first terminal device is not received.
In an embodiment of the present disclosure, when continuing to display the current image frame while compressed data sent by the first terminal device is not received, the display unit 62 is specifically configured to: if heartbeat data sent by the first terminal device is received and compressed image data sent by the first terminal device is not received, display the current image frame; if neither heartbeat data nor compressed image data sent by the first terminal device is received, output alarm information.
FIG. 12 is a schematic structural diagram of an electronic device according to an embodiment of the present disclosure. As shown in FIG. 12, the electronic device 7 includes at least one processor 701 and a memory 702;
the memory 702 stores computer-executable instructions;
the at least one processor 701 executes the computer-executable instructions stored in the memory 702, so that the at least one processor 701 performs the screen sharing method in the embodiments shown in FIGS. 2 to 7, or performs the method steps performed by the first terminal device in the embodiment shown in FIG. 9.
The processor 701 and the memory 702 are connected via a bus 703.
The relevant descriptions may be understood with reference to the descriptions and effects corresponding to the steps in the embodiments corresponding to FIGS. 2 to 7, and are not elaborated here.
FIG. 13 is a schematic structural diagram of another electronic device according to an embodiment of the present disclosure. As shown in FIG. 13, the electronic device 8 includes at least one processor 801 and a memory 802;
the memory 802 stores computer-executable instructions;
the at least one processor 801 executes the computer-executable instructions stored in the memory 802, so that the at least one processor 801 performs the screen sharing method in the embodiment shown in FIG. 8, or performs the method steps performed by the second terminal device in the embodiment shown in FIG. 9.
The processor 801 and the memory 802 are connected via a bus 803.
The relevant descriptions may be understood with reference to the descriptions and effects corresponding to the steps in the embodiment corresponding to FIG. 8, and are not elaborated here.
Referring to FIG. 14, which shows a schematic structural diagram of an electronic device 900 suitable for implementing the embodiments of the present disclosure, the electronic device 900 may be a terminal device or a server. The terminal device may include, but is not limited to, mobile terminals such as a mobile phone, a notebook computer, a digital broadcast receiver, a personal digital assistant (PDA), a portable Android device (PAD), a portable media player (PMP), and a vehicle-mounted terminal (for example, a vehicle-mounted navigation terminal), and fixed terminals such as a digital TV and a desktop computer. The electronic device shown in FIG. 14 is merely an example and should not impose any limitation on the functions and scope of use of the embodiments of the present disclosure.
As shown in FIG. 14, the electronic device 900 may include a processing apparatus (for example, a central processing unit or a graphics processing unit) 901, which may perform various appropriate actions and processing according to a program stored in a read-only memory (ROM) 902 or a program loaded from a storage apparatus 908 into a random access memory (RAM) 903. The RAM 903 also stores various programs and data required for the operation of the electronic device 900. The processing apparatus 901, the ROM 902, and the RAM 903 are connected to one another via a bus 904. An input/output (I/O) interface 905 is also connected to the bus 904.
Generally, the following apparatuses may be connected to the I/O interface 905: an input apparatus 906 including, for example, a touch screen, a touch pad, a keyboard, a mouse, a camera, a microphone, an accelerometer, and a gyroscope; an output apparatus 907 including, for example, a liquid crystal display (LCD), a speaker, and a vibrator; a storage apparatus 908 including, for example, a magnetic tape and a hard disk; and a communication apparatus 909. The communication apparatus 909 may allow the electronic device 900 to communicate wirelessly or by wire with other devices to exchange data. Although FIG. 14 shows the electronic device 900 having various apparatuses, it should be understood that it is not required to implement or have all of the shown apparatuses; more or fewer apparatuses may alternatively be implemented or provided.
In particular, according to the embodiments of the present disclosure, the processes described above with reference to the flowcharts may be implemented as computer software programs. For example, an embodiment of the present disclosure includes a computer program product, which includes a computer program carried on a computer-readable medium, where the computer program contains program code for performing the methods shown in the flowcharts. In such an embodiment, the computer program may be downloaded and installed from a network through the communication apparatus 909, or installed from the storage apparatus 908, or installed from the ROM 902. When the computer program is executed by the processing apparatus 901, the above functions defined in the methods of the embodiments of the present disclosure are performed.
It should be noted that the computer-readable medium of the present disclosure may be a computer-readable signal medium, a computer-readable storage medium, or any combination of the two. The computer-readable storage medium may be, for example, but is not limited to, an electrical, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the above. More specific examples of the computer-readable storage medium may include, but are not limited to: an electrical connection with one or more wires, a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the above. In the present disclosure, the computer-readable storage medium may be any tangible medium that contains or stores a program that can be used by or in combination with an instruction execution system, apparatus, or device. In the present disclosure, the computer-readable signal medium may include a data signal propagated in a baseband or as part of a carrier wave, which carries computer-readable program code. Such a propagated data signal may take many forms, including but not limited to an electromagnetic signal, an optical signal, or any suitable combination of the above. The computer-readable signal medium may also be any computer-readable medium other than the computer-readable storage medium, and it may send, propagate, or transmit a program used by or in combination with an instruction execution system, apparatus, or device. The program code contained on the computer-readable medium may be transmitted by any appropriate medium, including but not limited to an electrical wire, an optical cable, radio frequency (RF), or any suitable combination of the above.
The computer-readable medium may be included in the electronic device, or may exist alone without being assembled into the electronic device.
The computer-readable medium carries one or more programs, and when the one or more programs are executed by the electronic device, the electronic device is caused to perform the methods shown in the above embodiments.
Computer program code for performing the operations of the present disclosure may be written in one or more programming languages or a combination thereof. The programming languages include object-oriented programming languages such as Java, Smalltalk, and C++, as well as conventional procedural programming languages such as the "C" language or similar programming languages. The program code may be executed entirely on a user computer, partly on a user computer, as a stand-alone software package, partly on a user computer and partly on a remote computer, or entirely on a remote computer or server. In the case involving a remote computer, the remote computer may be connected to the user computer through any kind of network, including a local area network (LAN) or a wide area network (WAN), or may be connected to an external computer (for example, through the Internet using an Internet service provider).
The flowcharts and block diagrams in the accompanying drawings illustrate the possible architectures, functions, and operations of the systems, methods, and computer program products according to various embodiments of the present disclosure. In this regard, each block in a flowchart or block diagram may represent a module, a program segment, or a part of code that contains one or more executable instructions for implementing a specified logical function. It should also be noted that, in some alternative implementations, the functions marked in the blocks may occur in an order different from that marked in the drawings. For example, two consecutive blocks may actually be executed substantially in parallel, or sometimes in the reverse order, depending on the functions involved. It should also be noted that each block in the block diagrams and/or flowcharts, and combinations of blocks in the block diagrams and/or flowcharts, may be implemented by a dedicated hardware-based system that performs the specified functions or operations, or by a combination of dedicated hardware and computer instructions.
The units described in the embodiments of the present disclosure may be implemented in software or in hardware. The name of a unit does not, in some cases, constitute a limitation on the unit itself; for example, a first acquiring unit may also be described as "a unit for acquiring at least two Internet Protocol addresses".
The functions described herein above may be performed at least in part by one or more hardware logic components. For example, without limitation, exemplary types of hardware logic components that may be used include: a field programmable gate array (FPGA), an application-specific integrated circuit (ASIC), an application-specific standard product (ASSP), a system on chip (SOC), a complex programmable logic device (CPLD), and the like.
In the context of the present disclosure, a machine-readable medium may be a tangible medium that may contain or store a program for use by or in combination with an instruction execution system, apparatus, or device. The machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium. The machine-readable medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the above. More specific examples of the machine-readable storage medium include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the above.
In a first aspect, according to one or more embodiments of the present disclosure, a screen sharing display method is provided, including: acquiring two adjacent frames of images of a first terminal device; determining image change information according to the two adjacent frames of images, where the image change information represents the degree of change of the latter frame of image relative to the former frame of image in the two adjacent frames of images; if it is determined that the degree of change represented by the image change information is greater than or equal to a first preset degree, compressing the latter frame of image to obtain compressed image data; and sending the compressed image data to a second terminal device for display.
According to one or more embodiments of the present disclosure, the method further includes: if it is determined that the degree of change represented by the image change information is less than or equal to a second preset degree, buffering locally the latter frame of image in the two adjacent frames of images, where the second terminal device does not update the image.
According to one or more embodiments of the present disclosure, acquiring the two adjacent frames of images of the first terminal device includes: acquiring multiple frames of consecutive images of the first terminal device, where the multiple frames of consecutive images include at least three frames of images, and every two adjacent frames of the multiple frames of consecutive images constitute one group of two adjacent frames of images; and compressing the latter frame of image to obtain the compressed image data if it is determined that the degree of change represented by the image change information is greater than or equal to the first preset degree includes: if the degree of change represented by the image change information corresponding to each group of two adjacent frames of images in the multiple frames of consecutive images is greater than or equal to the first preset degree, compressing the last frame of the multiple frames of consecutive images to obtain the compressed image data.
According to one or more embodiments of the present disclosure, the image change information includes a similarity evaluation value, and the similarity evaluation value is used to represent the similarity between the latter frame of image and the former frame of image in the two adjacent frames of images; compressing the last frame of the multiple frames of consecutive images when the degree of change represented by the image change information corresponding to each group of two adjacent frames of images is greater than or equal to the first preset degree includes: if the similarity evaluation values of all groups of two adjacent frames of images in the multiple frames of consecutive images are smaller than a preset first similarity evaluation value threshold, compressing the last frame of the multiple frames of consecutive images to obtain the compressed image data.
According to one or more embodiments of the present disclosure, the method further includes: if, in the multiple frames of consecutive images, the similarity evaluation value of the last group of two adjacent frames of images is greater than a preset second similarity evaluation value threshold, buffering locally the latter frame of image in the last group of two adjacent frames of images, where the second terminal device does not update the image.
According to one or more embodiments of the present disclosure, compressing the latter frame of image to obtain the compressed image data includes: down-sampling the latter frame of image to obtain a compressed image frame; and sending the compressed image frame to an encoder for encoding to obtain the compressed image data.
According to one or more embodiments of the present disclosure, the method further includes: if it is determined that the degree of change represented by the image change information is less than or equal to a second preset degree, sending a heartbeat signal to the second terminal device, where the heartbeat signal is used to indicate that the first terminal device and the second terminal device are in a connected state.
According to one or more embodiments of the present disclosure, the method further includes: acquiring mouse layer data; sending the mouse layer data to an encoder for encoding to obtain mouse layer encoded data; and sending the mouse layer encoded data to the second terminal device.
According to one or more embodiments of the present disclosure, determining the image change information according to the two adjacent frames of images includes: acquiring the pixel information of the former frame of image and the pixel information of the latter frame of image; and performing feature comparison on the pixel information of the former frame of image and the pixel information of the latter frame of image according to a preset image comparison algorithm, to determine the image change information.
In a second aspect, according to one or more embodiments of the present disclosure, a screen sharing display method is provided, including: receiving compressed image data sent by a first terminal device; and displaying a compressed image frame in the compressed image data.
According to one or more embodiments of the present disclosure, displaying an image frame in the compressed image data includes: decoding the compressed image data to obtain a compressed image frame; and enlarging and displaying the compressed image frame according to a preset image size.
According to one or more embodiments of the present disclosure, the method further includes: receiving mouse layer encoded data; decoding the mouse layer encoded data to obtain mouse layer data; and displaying the mouse layer data.
According to one or more embodiments of the present disclosure, the method further includes: continuing to display the current image frame when compressed image data sent by the first terminal device is not received.
According to one or more embodiments of the present disclosure, continuing to display the current image frame when compressed data sent by the first terminal device is not received includes: if heartbeat data sent by the first terminal device is received and compressed image data sent by the first terminal device is not received, displaying the current image frame; if neither heartbeat data nor compressed image data sent by the first terminal device is received, outputting alarm information.
In a third aspect, according to one or more embodiments of the present disclosure, a screen sharing display apparatus is provided, including:
an acquiring unit, configured to acquire two adjacent frames of images of a first terminal device;
a determining unit, configured to determine image change information according to the two adjacent frames of images, where the image change information represents the degree of change of the latter frame of image relative to the former frame of image in the two adjacent frames of images;
a compressing unit, configured to compress the latter frame of image to obtain compressed image data if it is determined that the degree of change represented by the image change information is greater than or equal to a first preset degree; and
a sending unit, configured to send the compressed image data to a second terminal device for display.
In an embodiment of the present disclosure, the determining unit is further configured to: if it is determined that the degree of change represented by the image change information is less than or equal to a second preset degree, buffer locally the latter frame of image in the two adjacent frames of images, where the second terminal device does not update the image.
According to one or more embodiments of the present disclosure, the acquiring unit is further configured to: acquire multiple frames of consecutive images of the first terminal device, where the multiple frames of consecutive images include at least three frames of images, and every two adjacent frames of the multiple frames of consecutive images constitute one group of two adjacent frames of images. Correspondingly, the compressing unit is specifically configured to: if the degree of change represented by the image change information corresponding to each group of two adjacent frames of images in the multiple frames of consecutive images is greater than or equal to the first preset degree, compress the last frame of the multiple frames of consecutive images to obtain the compressed image data.
According to one or more embodiments of the present disclosure, the image change information includes a similarity evaluation value, and the similarity evaluation value is used to represent the similarity between the latter frame of image and the former frame of image in the two adjacent frames of images; the compressing unit is specifically configured to: if the similarity evaluation values of all groups of two adjacent frames of images in the multiple frames of consecutive images are smaller than a preset first similarity evaluation value threshold, compress the last frame of the multiple frames of consecutive images to obtain the compressed image data.
According to one or more embodiments of the present disclosure, the determining unit is further configured to: if, in the multiple frames of consecutive images, the similarity evaluation value of the last group of two adjacent frames of images is greater than a preset second similarity evaluation value threshold, buffer locally the latter frame of image in the last group of two adjacent frames of images, where the second terminal device does not update the image.
According to one or more embodiments of the present disclosure, when compressing the latter frame of image to obtain the compressed image data, the compressing unit is specifically configured to: down-sample the latter frame of image to obtain a compressed image frame; and send the compressed image frame to an encoder for encoding to obtain the compressed image data.
According to one or more embodiments of the present disclosure, the sending unit is further configured to: if it is determined that the degree of change represented by the image change information is less than or equal to a second preset degree, send a heartbeat signal to the second terminal device, where the heartbeat signal is used to indicate that the first terminal device and the second terminal device are in a connected state.
According to one or more embodiments of the present disclosure, the acquiring unit is further configured to: acquire mouse layer data; and send the mouse layer data to an encoder for encoding to obtain mouse layer encoded data. The sending unit is further configured to: send the mouse layer encoded data to the second terminal device.
According to one or more embodiments of the present disclosure, the determining unit is specifically configured to: acquire the pixel information of the former frame of image and the pixel information of the latter frame of image; and perform feature comparison on the pixel information of the former frame of image and the pixel information of the latter frame of image according to a preset image comparison algorithm, to determine the image change information.
In a fourth aspect, according to one or more embodiments of the present disclosure, a screen sharing display apparatus is provided, including:
a receiving unit, configured to receive compressed image data sent by a first terminal device; and
a display unit, configured to display a compressed image frame in the compressed image data.
According to one or more embodiments of the present disclosure, the display unit is specifically configured to: decode the compressed image data to obtain a compressed image frame; and enlarge and display the compressed image frame according to a preset image size.
According to one or more embodiments of the present disclosure, the receiving unit is further configured to: receive mouse layer encoded data. The display unit is further configured to: decode the mouse layer encoded data to obtain mouse layer data, and display the mouse layer data.
According to one or more embodiments of the present disclosure, the display unit is further configured to: continue displaying the current image frame when compressed image data sent by the first terminal device is not received.
According to one or more embodiments of the present disclosure, when continuing to display the current image frame while compressed data sent by the first terminal device is not received, the display unit is specifically configured to: if heartbeat data sent by the first terminal device is received and compressed image data sent by the first terminal device is not received, display the current image frame; if neither heartbeat data nor compressed image data sent by the first terminal device is received, output alarm information.
In a fifth aspect, according to one or more embodiments of the present disclosure, an electronic device is provided, including: at least one processor and a memory;
where the memory stores computer-executable instructions; and
the at least one processor executes the computer-executable instructions stored in the memory, so that the at least one processor performs the screen sharing display method according to the above first aspect and the various possible designs of the first aspect.
In a sixth aspect, according to one or more embodiments of the present disclosure, an electronic device is provided, including: at least one processor and a memory;
where the memory stores computer-executable instructions; and
the at least one processor executes the computer-executable instructions stored in the memory, so that the at least one processor performs the screen sharing display method according to the above second aspect and the various possible designs of the second aspect.
In a seventh aspect, according to one or more embodiments of the present disclosure, a computer-readable storage medium is provided, storing computer-executable instructions; when a processor executes the computer-executable instructions, the screen sharing display method according to the above first aspect and the various possible designs of the first aspect is implemented, or, when the processor executes the computer-executable instructions, the screen sharing display method according to the above second aspect and the various possible designs of the second aspect is implemented.
In an eighth aspect, according to one or more embodiments of the present disclosure, a computer program product is provided, including a computer program which, when executed by a processor, implements the screen sharing display method according to the above first aspect and the various possible designs of the first aspect, or implements the screen sharing display method according to the above second aspect and the various possible designs of the second aspect.
In a ninth aspect, according to one or more embodiments of the present disclosure, a computer program is provided which, when executed by a processor, implements the screen sharing display method according to the above first aspect and the various possible designs of the first aspect, or implements the screen sharing display method according to the above second aspect and the various possible designs of the second aspect.
The above descriptions are merely preferred embodiments of the present disclosure and illustrations of the applied technical principles. Those skilled in the art should understand that the disclosure scope involved in the present disclosure is not limited to technical solutions formed by specific combinations of the above technical features, and should also cover other technical solutions formed by any combination of the above technical features or their equivalent features without departing from the above disclosed concept, for example, technical solutions formed by replacing the above features with the technical features with similar functions disclosed in (but not limited to) the present disclosure.
In addition, although the operations are depicted in a specific order, this should not be understood as requiring that these operations be performed in the specific order shown or in sequential order. In certain circumstances, multitasking and parallel processing may be advantageous. Likewise, although several specific implementation details are included in the above discussion, these should not be construed as limiting the scope of the present disclosure. Certain features described in the context of separate embodiments may also be implemented in combination in a single embodiment. Conversely, various features described in the context of a single embodiment may also be implemented in multiple embodiments separately or in any suitable sub-combination.
Although the subject matter has been described in language specific to structural features and/or method logical actions, it should be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or actions described above. Rather, the specific features and actions described above are merely example forms of implementing the claims.

Claims (21)

  1. 一种屏幕共享显示方法,其特征在于,包括:
    获取第一终端设备的相邻两帧图像;
    根据所述相邻两帧图像,确定图像变化信息,其中,所述图像变化信息表征所述相邻两帧图像中后一帧图像相对于前一帧图像的变化程度;
    若确定所述图像变化信息所表征所述变化程度大于等于第一预设程度,则对所述后一帧图像进行压缩,得到压缩图像数据;
    将所述压缩图像数据发送给第二终端设备进行显示。
  2. 根据权利要求1所述的方法,其特征在于,所述方法还包括:
    若确定所述图像变化信息表征所述变化程度小于等于第二预设程度,则在本地缓冲所述相邻两帧图像中的后一帧图像,其中,所述第二终端设备不更新图像。
  3. 根据权利要求1所述的方法,其特征在于,获取第一终端设备的相邻两帧图像,包括:
    获取所述第一终端设备的多帧连续图像,其中,所述多帧连续图像至少包括三帧连续图像,所述多帧连续图像中每相邻两帧构成一组所述相邻两帧图像;
    若确定所述图像变化信息所表征所述变化程度大于等于所述第一预设程度,则对所述后一帧图像进行压缩,得到所述压缩图像数据,包括:
    若所述多帧连续图像中的各组所述相邻两帧图像,所对应的所述图像变化信息所表征所述变化程度均大于等于所述第一预设程度,则对所述多帧连续图像的最后一帧图像进行压缩,得到所述压缩图像数据。
  4. 根据权利要求3所述的方法,其特征在于,所述图像变化信息包括相似度评估值,所述相似度评估值用于表征所述相邻两帧图像中的所述后一帧图像与所述前一帧图像之间的相似度;若所述多帧连续图像中的各组所述相邻两帧图像,所对应的所述图像变化信息所表征所述变化程度均大于等于所述第一预设程度,则对所述多帧连续图像的所述最后一帧图像进行压缩,得到所述压缩图像数据,包括:
    若所述多帧连续图像中的各组所述相邻两帧图像的所述相似度评估值均小于等于预设第一相似度评估值阈值,则对所述多帧连续图像的所述最后一帧图像进行压缩,得到所述压缩图像数据。
  5. 根据权利要求4所述的方法,其特征在于,所述方法还包括:
    若所述多帧连续图像中,最后一组所述相邻两帧图像的相似度评估值大于等于预设第二相似度评估值阈值,则在本地缓冲所述最后一组所述相邻两帧图像中的所述后一帧图像,其中,所述第二终端设备不更新图像。
  6. 根据权利要求1所述的方法,其特征在于,对所述后一帧图像进行压缩,得到所 述压缩图像数据,包括:
    对所述后一帧图像进行降采样,得到压缩图像帧;
    将所述压缩图像帧发送至编码器进行编码处理,得到所述压缩图像数据。
  7. 根据权利要求2所述的方法,其特征在于,所述方法还包括:
    若确定所述图像变化信息表征所述变化程度小于等于所述第二预设程度,则向所述第二终端设备发送心跳信号;其中,所述心跳信号用于指示所述第一终端设备与所述第二终端设备处于连接状态。
  8. 根据权利要求2所述的方法,其特征在于,所述方法还包括:
    获取鼠标图层数据;
    将所述鼠标图层数据发送至编码器进行编码处理,得到鼠标图层编码数据;
    将所述鼠标图层编码数据发送至所述第二终端设备。
  9. 根据权利要求1-8任一项所述的方法,其特征在于,根据所述相邻两帧图像,确定所述图像变化信息,包括:
    获取所述前一帧图像的像素信息和所述后一帧图像的像素信息;
    根据预设的图像比较算法,对所述前一帧图像的像素信息和所述后一帧图像的像素信息进行特征对比,确定所述图像变化信息。
  10. 一种屏幕共享显示方法,其特征在于,包括:
    接收第一终端设备发送的压缩图像数据;
    显示所述压缩图像数据中的压缩图像帧。
  11. 根据权利要求10所述的方法,其特征在于,显示所述压缩图像数据中的图像帧,包括:
    对所述压缩图像数据进行解码,得到压缩图像帧;
    根据预设的图像尺寸,对所述压缩图像帧进行放大显示。
  12. 根据权利要求10或11所述的方法,其特征在于,所述方法还包括:
    接收鼠标图层编码数据;
    对所述鼠标图层编码数据进行解码,得到鼠标图层数据,并对所述鼠标图层数据进行显示。
  13. 根据权利要求10或11所述的方法,其特征在于,所述方法还包括:
    在未接收到所述第一终端设备发送的所述压缩图像数据时,继续显示当前的图像帧。
  14. 根据权利要求13所述的方法,其特征在于,在未接收到所述第一终端设备发送的所述压缩图像数据时,继续显示当前的图像帧,包括:
    若接收到所述第一终端设备发送的心跳数据,且未接收到所述第一终端设备发送的所述压缩图像数据时,显示当前的图像帧;
    若未接收到所述第一终端设备发送的心跳数据,且未接收到所述第一终端设备发送的所述压缩图像数据时,输出报警信息。
  15. A screen sharing display apparatus, comprising:
    an acquiring unit, configured to acquire two adjacent frames of images of a first terminal device;
    a determining unit, configured to determine image change information according to the two adjacent frames of images, wherein the image change information represents a degree of change of a latter frame of image relative to a former frame of image in the two adjacent frames of images;
    a compressing unit, configured to compress the latter frame of image to obtain compressed image data if it is determined that the degree of change represented by the image change information is greater than or equal to a first preset degree; and
    a sending unit, configured to send the compressed image data to a second terminal device for display.
  16. A screen sharing display apparatus, comprising:
    a receiving unit, configured to receive compressed image data sent by a first terminal device; and
    a displaying unit, configured to display a compressed image frame in the compressed image data.
  17. An electronic device, comprising: at least one processor and a memory;
    wherein the memory stores computer-executable instructions; and
    the at least one processor executes the computer-executable instructions stored in the memory, so that the at least one processor performs the screen sharing method according to any one of claims 1 to 9.
  18. An electronic device, comprising: at least one processor and a memory;
    wherein the memory stores computer-executable instructions; and
    the at least one processor executes the computer-executable instructions stored in the memory, so that the at least one processor performs the screen sharing method according to any one of claims 10 to 14.
  19. A computer-readable storage medium, wherein the computer-readable storage medium stores computer-executable instructions, and when a processor executes the computer-executable instructions, the screen sharing method according to any one of claims 1 to 14 is implemented.
  20. A computer program product, comprising computer program instructions, wherein the computer program instructions cause a computer to perform the screen sharing method according to any one of claims 1 to 14.
  21. A computer program, wherein the computer program causes a computer to perform the screen sharing method according to any one of claims 1 to 14.
PCT/CN2021/134886 2020-12-25 2021-12-01 Screen sharing display method and apparatus, device, and storage medium WO2022135092A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US18/258,601 US20240045641A1 (en) 2020-12-25 2021-12-01 Screen sharing display method and apparatus, device, and storage medium
EP21909089.1A EP4243408A1 (en) 2020-12-25 2021-12-01 Screen sharing display method and apparatus, device, and storage medium

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202011563917.4A CN112714273A (zh) 2020-12-25 2020-12-25 Screen sharing display method and apparatus, device, and storage medium
CN202011563917.4 2020-12-25

Publications (1)

Publication Number Publication Date
WO2022135092A1

Family

ID=75546599

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/134886 WO2022135092A1 (zh) 2020-12-25 2021-12-01 Screen sharing display method and apparatus, device, and storage medium

Country Status (4)

Country Link
US (1) US20240045641A1 (zh)
EP (1) EP4243408A1 (zh)
CN (1) CN112714273A (zh)
WO (1) WO2022135092A1 (zh)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112714273A (zh) * 2020-12-25 2021-04-27 Beijing Bytedance Network Technology Co., Ltd. Screen sharing display method and apparatus, device, and storage medium
CN113873295B (zh) * 2021-10-26 2024-05-28 Beijing Kingsoft Cloud Network Technology Co., Ltd. Multimedia information processing method, apparatus, device, and storage medium
CN117193685A (zh) * 2022-05-30 2023-12-08 Honor Device Co., Ltd. Screen projection data processing method, electronic device, and storage medium
CN117041468A (zh) * 2023-07-20 2023-11-10 Beijing Ansheng Xiangyuan Technology Development Co., Ltd. Network communication method, apparatus, device, and storage medium

Citations (5)

Publication number Priority date Publication date Assignee Title
CN106851282A (zh) * 2017-02-15 2017-06-13 Fujian Shixun Information Technology Co., Ltd. Method and system for reducing the amount of encoded video image data in a VDI protocol
CN106954004A (zh) * 2017-03-17 2017-07-14 Yulong Computer Telecommunication Scientific (Shenzhen) Co., Ltd. Screen sharing method and apparatus
CN108810610A (zh) * 2017-05-05 2018-11-13 Tencent Technology (Shenzhen) Co., Ltd. Screen sharing method and apparatus
US20200310739A1 (en) * 2017-06-20 2020-10-01 Microsoft Technology Licensing, Llc Real-time screen sharing
CN112714273A (zh) * 2020-12-25 2021-04-27 Beijing Bytedance Network Technology Co., Ltd. Screen sharing display method and apparatus, device, and storage medium

Family Cites Families (10)

Publication number Priority date Publication date Assignee Title
US8457194B2 (en) * 2008-09-29 2013-06-04 Microsoft Corporation Processing real-time video
US20130195198A1 (en) * 2012-01-23 2013-08-01 Splashtop Inc. Remote protocol
WO2016088244A1 (ja) * 2014-12-05 2016-06-09 Fujitsu Limited Image delivery method for a server
JP2016224766A (ja) 2015-06-01 2016-12-28 Fujitsu Limited Remote screen display system, remote screen display method, and remote screen display program
CN108600783B (zh) * 2018-04-23 2021-03-30 Shenzhen Qixin Haoshitong Cloud Computing Co., Ltd. Frame rate adjustment method and apparatus, and terminal device
CN110559651A (zh) * 2019-09-16 2019-12-13 NetEase (Hangzhou) Network Co., Ltd. Cloud game control method and apparatus, computer storage medium, and electronic device
US20210127125A1 (en) * 2019-10-23 2021-04-29 Facebook Technologies, Llc Reducing size and power consumption for frame buffers using lossy compression
CN111625211B (zh) * 2019-12-03 2023-11-28 Mogu Chelian Information Technology Co., Ltd. Screen projection method and apparatus, Android device, and display device
US11164539B2 (en) * 2019-12-18 2021-11-02 Ross Video Limited Systems and methods for bandwidth reduction in video signal transmission
CN111970518B (zh) * 2020-08-14 2022-07-22 Shandong Yunhai Guochuang Cloud Computing Equipment Industry Innovation Center Co., Ltd. Image frame loss processing method and system, device, and computer storage medium


Also Published As

Publication number Publication date
EP4243408A1 (en) 2023-09-13
CN112714273A (zh) 2021-04-27
US20240045641A1 (en) 2024-02-08


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 21909089; Country of ref document: EP; Kind code of ref document: A1)
ENP Entry into the national phase (Ref document number: 2021909089; Country of ref document: EP; Effective date: 20230607)
WWE Wipo information: entry into national phase (Ref document number: 18258601; Country of ref document: US)
NENP Non-entry into the national phase (Ref country code: DE)