CN109218731B - Screen projection method, device and system of mobile equipment - Google Patents

Screen projection method, device and system of mobile equipment

Info

Publication number
CN109218731B
CN109218731B (application CN201710524252.8A)
Authority
CN
China
Prior art keywords
terminal
screen
cache
image
screen image
Prior art date
Legal status
Active
Application number
CN201710524252.8A
Other languages
Chinese (zh)
Other versions
CN109218731A (en)
Inventor
王炳堪
崔精兵
周强
叶高艺
王俊豪
于涛
Current Assignee
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd filed Critical Tencent Technology Shenzhen Co Ltd
Priority to CN201710524252.8A
Priority to PCT/CN2018/092306 (WO2019001347A1)
Publication of CN109218731A
Application granted
Publication of CN109218731B
Legal status: Active
Anticipated expiration

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00: Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/42: Methods or arrangements for coding, decoding, compressing or decompressing digital video signals characterised by implementation details or hardware specially adapted for video compression or decompression, e.g. dedicated software implementation
    • H04N21/00: Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40: Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43: Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/436: Interfacing a local distribution network, e.g. communicating with another STB or one or more peripheral devices inside the home
    • H04N21/44: Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream, rendering scenes according to MPEG-4 scene graphs
    • H04N21/4402: Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream, rendering scenes according to MPEG-4 scene graphs involving reformatting operations of video signals for household redistribution, storage or real-time display

Abstract

The invention discloses a screen projection method, device and system for a mobile device, and belongs to the technical field of screen projection. The method comprises the following steps: obtaining a screen image according to content to be displayed; storing the screen image in a first cache and a second cache; displaying the screen image stored in the first cache on a screen of the first terminal; performing image differential encoding on the screen image stored in the second cache; and sending the encoded image data stream to a second terminal, wherein the image data stream is used for triggering the second terminal to decode the received image data stream and to render and display the decoded screen image. The invention solves the problems of a large data volume and a slow transmission speed that arise when every frame of image is compressed and transmitted in full, thereby reducing the data volume and improving the transmission speed.

Description

Screen projection method, device and system of mobile equipment
Technical Field
Embodiments of the present invention relate to the technical field of screen projection, and in particular to a screen projection method, device and system for a mobile device.
Background
Screen projection means that the content displayed on the screen of a first terminal is projected onto a second terminal for display. Screen projection is needed in scenarios such as mobile phone game live streaming, video conferencing, remote assistance, and application testing. Typically, the content displayed on a mobile phone screen is projected onto a computer so that it can be viewed on a large screen.
In the related art, when the content displayed on the screen of a mobile phone is projected onto a computer, the mobile phone generally acquires a screen image from a frame buffer and performs zlib compression on it. The compressed picture data is sent to the computer, which performs zlib decompression on the received data to recover the screen image before compression and calls the Graphics Device Interface (GDI) to render and display it. This operation is performed for every frame of screen image in the frame buffer, so that continuous screen images are displayed on the computer and video can be projected.
Because the related art compresses and transmits every frame of screen image in the frame buffer in full, the amount of data is large and the transmission speed is slow.
Disclosure of Invention
In order to solve the problem in the related art that each frame of screen image is compressed and transmitted in full, resulting in a large amount of data and a slow transmission speed, embodiments of the present invention provide a screen projection method, apparatus and system for a mobile device. The technical solution is as follows:
in a first aspect, a screen projection method of a mobile device is provided, the method including:
obtaining a screen image according to the content to be displayed;
storing the screen image in a first cache and a second cache;
displaying the screen image stored in the first cache on a screen of a first terminal;
performing image differential encoding on the screen image stored in the second cache; and sending the encoded image data stream to a second terminal, wherein the image data stream is used for triggering the second terminal to decode the received image data stream and to render and display the decoded screen image.
In a second aspect, a screen projection method of a mobile device is provided, the method comprising:
the method comprises the steps that a first terminal receives a projection program file sent by a second terminal and sends a response signal to the second terminal, wherein the response signal is used for informing the second terminal that the projection program file is successfully received, the projection program file is sent when a projection application program on the second terminal is started, and the projection program file corresponds to the projection application program;
the first terminal receives a control signal sent by the second terminal and operates the projection program file according to the control signal;
and the first terminal performs image differential coding on a screen image through the projection program file, sends a coded image data stream to the second terminal, and the image data stream is used for triggering the projection application program on the second terminal to decode the image data stream and render and display the decoded screen image in a projection window.
In a third aspect, a screen projection method of a mobile device is provided, the method including:
receiving an image data stream sent by a first terminal, wherein the image data stream is obtained by carrying out image differential coding on a screen image of the first terminal, the screen image is stored in a first cache and a second cache of the first terminal, the screen image stored in the first cache is used for being displayed on a screen of the first terminal, and the screen image stored in the second cache is used for being sent after carrying out image differential coding;
decoding the image data stream to obtain the screen image;
rendering the screen image, and displaying the screen image on a second terminal.
In a fourth aspect, a screen projection method of a mobile device is provided, the method comprising:
when a projection application program is started, a second terminal sends a projection program file to a first terminal and displays a projection window of the projection application program, wherein the projection application program is installed on the second terminal, and the projection program file corresponds to the projection application program;
when a response signal sent by the first terminal is received, the second terminal sends a control signal for operating the projection program file to the first terminal, and the control signal is used for triggering the projection program file to carry out image differential coding on the screen image and then sending the screen image;
the second terminal receives the image data stream sent by the first terminal and decodes the image data stream;
and the second terminal renders and displays the decoded screen image in the projection window.
In a fifth aspect, a screen projection apparatus of a mobile device is provided, the apparatus including:
the first generation module is used for obtaining a screen image according to the content to be displayed;
the storage module is used for storing the screen image obtained by the first generation module in a first cache and a second cache;
the display module is used for displaying the screen image stored in the first cache by the storage module on a screen of a first terminal;
the coding module is used for carrying out image differential coding on the screen image stored in the second cache by the storage module;
and the sending module is used for sending the image data stream coded by the coding module to a second terminal, and the image data stream is used for triggering the second terminal to decode the received image data stream and rendering and displaying the screen image obtained after decoding.
In a sixth aspect, a screen projection apparatus of a mobile device is provided, the apparatus including:
the receiving module is used for receiving an image data stream sent by a first terminal, wherein the image data stream is obtained by carrying out image differential coding on a screen image of the first terminal, the screen image is stored in a first cache and a second cache of the first terminal, the screen image stored in the first cache is used for being displayed on a screen of the first terminal, and the screen image stored in the second cache is used for being sent after carrying out image differential coding;
the decoding module is used for decoding the image data stream received by the receiving module to obtain the screen image;
and the display module is used for rendering the screen image obtained by the decoding module and displaying the screen image on a second terminal.
In a seventh aspect, a screen projection system of a mobile device is provided, where the screen projection system of the mobile device includes a first terminal and a second terminal;
the first terminal comprises the screen projecting device of the mobile device according to the fifth aspect, and the second terminal comprises the screen projecting device of the mobile device according to the sixth aspect.
In an eighth aspect, a terminal is provided, which includes a processor and a memory, where at least one instruction, at least one program, a set of codes, or a set of instructions is stored in the memory, and the at least one instruction, the at least one program, the set of codes, or the set of instructions is loaded and executed by the processor to implement the screen projection method of the mobile device according to the first, second, third, or fourth aspect.
In a ninth aspect, there is provided a computer readable storage medium having stored therein at least one instruction, at least one program, a set of codes, or a set of instructions, which is loaded and executed by a processor to implement the screen projection method of the mobile device according to the first, second, third, or fourth aspect.
The technical scheme provided by the embodiment of the invention has the following beneficial effects:
the screen image which needs to be displayed on the screen of the first terminal is subjected to image differential coding, the coded image data stream is sent to the second terminal, the second terminal decodes the image data stream to obtain the screen image, and the screen image is rendered and displayed on the second terminal.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present invention, the drawings needed in the description of the embodiments are briefly introduced below. It is obvious that the drawings in the following description are only some embodiments of the present invention, and other drawings can be obtained from them by those skilled in the art without creative effort.
FIG. 1 is a schematic illustration of an implementation environment provided by one embodiment of the invention;
FIG. 2 is a diagram of a guide window provided by one embodiment of the present invention;
FIG. 3 is a flowchart of a screen projection method of a mobile device according to an embodiment of the present invention;
FIG. 4 is a flowchart of a screen projection method of a mobile device according to another embodiment of the present invention;
FIG. 5 is a flowchart of a screen projection method of a mobile device according to another embodiment of the present invention;
FIG. 6 is a schematic view of a screen projection provided by one embodiment of the present invention;
FIG. 7 is a diagram illustrating a screen projection method of a mobile device according to an embodiment of the present invention;
fig. 8 is a block diagram illustrating a structure of a screen projecting apparatus of a mobile device according to an embodiment of the present invention;
fig. 9 is a block diagram illustrating a structure of a screen projection apparatus of a mobile device according to another embodiment of the present invention;
fig. 10 is a block diagram illustrating a screen projection system of a mobile device according to an embodiment of the present invention;
fig. 11 is a block diagram of a terminal according to an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, embodiments of the present invention will be described in detail with reference to the accompanying drawings. As used herein, a "mobile device" may include a smartphone, a tablet computer, and the like.
Fig. 1 is a schematic diagram of an implementation environment provided by an embodiment of the present invention, as shown in fig. 1, the implementation environment includes a first terminal 110 and a second terminal 120.
Illustratively, the first terminal 110 is a smart phone and the second terminal 120 is a desktop computer. Optionally, in other possible implementations, the first terminal 110 may also be a tablet computer, and the second terminal 120 may also be a portable computer. The embodiment does not limit the specific implementation forms of the first terminal 110 and the second terminal 120.
Optionally, the first terminal 110 and the second terminal 120 are connected in a wired manner or a wireless manner. For example, the wired manner includes a Universal Serial Bus (USB) connection mode, and the wireless manner includes a Wireless Fidelity (WiFi) connection mode. Optionally, the wireless manner further includes a Bluetooth connection mode, and the user may select the connection mode between the first terminal 110 and the second terminal 120 according to actual connection requirements.
For example, in this embodiment, the operating system of the first terminal 110 is the Android operating system, and the operating system of the second terminal 120 is the Windows operating system.
In practical applications, the first terminal 110 serves as a source terminal for screen projection, the second terminal 120 serves as a target terminal for screen projection, and the first terminal 110 projects contents displayed on a screen onto the second terminal 120 for display. The first terminal 110 is a mobile device, and the screen projection in the embodiments of the present application projects content displayed on a screen of the mobile device.
Optionally, the second terminal 120 is installed with a projection application, and when the second terminal 120 starts the projection application, a window is displayed on a desktop of the second terminal 120, where the window is used to display the content displayed on the screen of the first terminal 110.
Optionally, when the projection application is started, a guide window is displayed on the desktop of the second terminal 120. Referring to fig. 2, a screen projection mode selection control 131 is displayed in the guide window 130; the screen projection modes include, for example, a USB mode and a WiFi mode. When the network condition is good, the WiFi mode may be selected, and when the network condition is poor, the USB mode may be selected. In addition, the guide window 130 further includes a configuration parameter adjustment control 132. The configuration parameters illustratively include resolution and definition, and they affect both the size of the image data and the quality of the displayed image: the higher the resolution and definition, the larger the image data and the higher the image quality, and correspondingly, the slower the transmission of the image data.
The guide window 130 is also called a configuration window, and the screen projection mode selection control 131 is a configuration control for a predetermined connection manner, where the predetermined connection manner includes at least a USB connection mode and a WiFi connection mode.
When the projection application receives a start instruction, the second terminal 120 displays a configuration window of the projection application, where the configuration window includes a configuration control for a predetermined connection manner; the configuration control is, for example, the screen projection mode selection control 131 shown in fig. 2. When the projection application receives a trigger signal for the configuration control, the second terminal 120 establishes a connection with the first terminal 110 according to the predetermined connection manner; that is, after the user selects the USB connection mode or the WiFi connection mode, the second terminal 120 connects with the first terminal 110 according to the mode selected by the user. When the first terminal 110 establishes the connection with the second terminal 120, the projection application on the second terminal 120 sends a projection program file to the first terminal 110; the projection program file is equivalent to the projection application on the mobile device side. After receiving the projection program file, the first terminal 110 sends a response signal to the second terminal 120, the response signal notifying the second terminal 120 that the first terminal 110 has successfully received the projection program file. When receiving the response signal, the second terminal 120 sends a control signal to the first terminal 110, and the control signal is used for controlling the first terminal 110 to run the projection program file.
Optionally, the configuration window may include a configuration parameter adjustment control 132. Through this control, the user may adjust the resolution and definition used during screen projection; after selecting the resolution and definition, the user triggers the screen projection mode selection control 131, so that the control signal sent by the projection application to the first terminal 110 includes the selected resolution and definition requirements, and the first terminal 110 processes the screen image according to the required resolution and definition.
It should be noted that, after the second terminal 120 sends the projection program file to the first terminal 110, the projection application displays a projection window of the projection application on the second terminal 120, where the projection window is used to display a screen image that the first terminal 110 needs to perform screen projection.
Fig. 3 is a flowchart of a method for screen projection of a mobile device according to an embodiment of the present invention, which is illustrated in the implementation environment shown in fig. 1. As shown in fig. 3, the method may include:
step 201, the first terminal obtains a screen image according to the content to be displayed.
The content to be displayed is the content that needs to be displayed on the screen. In practical applications, there may be more than one content to be displayed, and there is an occlusion and/or mixing relationship between the contents to be displayed, such as: when the desktop needs to be displayed on the screen, the wallpaper is a content to be displayed, the icon of each application program is a content to be displayed, and the icon of the application program is displayed on the wallpaper, so that the icon of the application program shields a part of the area of the wallpaper. Mixing refers to transparent channel mixing, such as: a green picture is arranged below the red semitransparent picture, and the color of the finally displayed picture is a composite color obtained by mixing red and green.
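For concreteness, transparent-channel mixing of this kind can be expressed as standard alpha blending; the exact blending formula used by the compositor is not stated in this document, so the following is only an illustrative assumption:

    $C_{\mathrm{displayed}} = \alpha \, C_{\mathrm{upper}} + (1 - \alpha) \, C_{\mathrm{lower}}$

With a half-transparent red picture ($\alpha = 0.5$, $C_{\mathrm{upper}} = (255, 0, 0)$) placed over a green picture ($C_{\mathrm{lower}} = (0, 255, 0)$), the displayed composite color is approximately (128, 128, 0).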
The screen image is an image obtained by performing occlusion and/or mixing processing on content to be displayed, and corresponds to a picture displayed on the screen.
In step 202, the first terminal stores the screen image in a first buffer and a second buffer.
The first cache is a frame cache corresponding to a hardware screen, the image stored in the first cache is used for being displayed on a screen of the first terminal, the second cache is a virtual cache corresponding to a virtual screen, and the image stored in the second cache is used for being coded and compressed and then sent to the second terminal.
And after the first terminal synthesizes the contents to be displayed into the screen image, sending the synthesized screen image to the first cache and the second cache simultaneously.
In step 203, the first terminal displays the screen image stored in the first buffer on the screen of the first terminal.
When the screen image is written into the first buffer, the screen image is displayed on the screen of the first terminal.
And step 204, the first terminal performs image differential coding on the screen image stored in the second cache.
And when the first terminal writes the screen image into the second buffer, informing the encoder to encode the written screen image.
The screen image is encoded by image differential encoding, and the encoding rule can be preset by a technician. Optionally, the image differential encoding comprises H264 encoding.
When the content to be displayed on the screen changes, the first terminal generates a screen image according to the content to be displayed and then writes the screen image into the second cache. When image differential encoding is performed on screen images, the encoding rule is generally a time interval. For example, if a technician sets the time interval to 10 seconds, a key frame is generated every 10 seconds: the first screen image written in each 10-second interval is encoded as a key frame, and the other screen images written during that interval are encoded as non-key frames. A key frame encodes the complete content of one screen image, while a non-key frame encodes only the difference between a screen image and the screen image preceding it.
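As an illustration of such an encoding rule, the following minimal sketch configures an H.264 encoder whose input is a Surface and whose key-frame interval is 10 seconds. It assumes the Android MediaCodec encoder mentioned later in this description; the numeric parameter values are example values, not values specified by this document.

    import android.media.MediaCodec;
    import android.media.MediaCodecInfo;
    import android.media.MediaFormat;

    public final class EncoderSetup {
        // Illustrative sketch: create an H.264 encoder that takes its input from a Surface
        // and emits one key frame every 10 seconds; all numeric values are example values.
        public static MediaCodec createH264Encoder(int width, int height) throws Exception {
            MediaFormat format = MediaFormat.createVideoFormat(MediaFormat.MIMETYPE_VIDEO_AVC, width, height);
            format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
                    MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface); // input comes from a Surface
            format.setInteger(MediaFormat.KEY_BIT_RATE, 4_000_000);        // example bit rate
            format.setInteger(MediaFormat.KEY_FRAME_RATE, 30);             // nominal frame rate
            format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 10);       // one key frame every 10 s
            MediaCodec codec = MediaCodec.createEncoderByType(MediaFormat.MIMETYPE_VIDEO_AVC);
            codec.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
            return codec;
        }
    }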
In step 205, the first terminal sends the encoded image data stream to the second terminal.
The first terminal encodes a screen image to obtain a key frame and/or a non-key frame as an image data stream, and sends the encoded image data stream to a second terminal for screen projection.
In the first terminal, step 203 and step 204 are executed in parallel.
In step 206, the second terminal receives the image data stream sent by the first terminal.
And step 207, the second terminal decodes the image data stream to obtain a screen image.
Since the image data stream is obtained by encoding the screen image, the second terminal needs to decode the image data stream after receiving the image data stream, so as to obtain the original screen image before encoding.
And step 208, rendering the screen image by the second terminal, and displaying the screen image on the second terminal.
And after the original screen image is obtained by decoding, the second terminal renders the obtained screen image, so that the screen image is displayed on the second terminal.
In summary, in the screen projection method of the mobile device provided by the embodiment of the present invention, the screen image to be displayed on the screen of the first terminal is subjected to image differential encoding, the encoded image data stream is sent to the second terminal, and the second terminal decodes the image data stream to obtain the screen image and renders and displays it. Because the image data stream is obtained by image differential encoding, only the portion of the screen image that has changed is encoded rather than the entire screen image, so the amount of data after encoding is reduced. In addition, because the first cache and the second cache are two independent caches, the first terminal encodes the screen image in the second cache without affecting the image data in the first cache, thereby ensuring normal display on the screen of the first terminal.
Fig. 4 is a flowchart of a method for screen projection of a mobile device according to another embodiment of the present invention, which is illustrated in the implementation environment shown in fig. 1. As shown in fig. 4, the method may include:
step 301, the first terminal generates a second cache in a virtual form through the display system.
The first terminal is provided with a display system, and the display system is used for storing screen images.
The display system includes a first display and a second display, the first display corresponding to the hardware screen, the second display corresponding to the virtual screen, the screen image displayed in the first display being stored in a first cache, the screen image displayed in the second display being stored in a second cache.
The display system includes a hardware display corresponding to the hardware screen, and the display system allows at least one virtual display to be created. When a virtual display has been created in the display system, the screen image is stored by traversing the hardware display and the virtual display. The hardware display corresponds to the first cache and the virtual display corresponds to the second cache; that is, the screen image sent to the hardware display is actually stored in the first cache, and the screen image sent to the virtual display is actually stored in the second cache. When the virtual display is created, the operating system automatically creates the second cache corresponding to the virtual display.
The first display is a hardware display, and the second display is a virtual display.
In practical applications, the first cache is called a frame buffer, and the second cache is called a virtual buffer.
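A minimal sketch of how such a virtual display and its virtual buffer could be created on Android is given below. It assumes that the encoder's input Surface backs the virtual display; the display name and mirroring flag are illustrative choices, the document itself does not name this API, and in practice an appropriate screen-capture permission (for example via MediaProjection) would be required.

    import android.hardware.display.DisplayManager;
    import android.hardware.display.VirtualDisplay;
    import android.media.MediaCodec;
    import android.view.Surface;

    public final class VirtualScreenSetup {
        // Illustrative sketch: the encoder's input Surface backs the virtual display, so every
        // screen image composed by the display system is also written into the "second cache".
        public static VirtualDisplay createMirrorDisplay(DisplayManager displayManager,
                                                         MediaCodec configuredEncoder,
                                                         int width, int height, int densityDpi) {
            Surface inputSurface = configuredEncoder.createInputSurface(); // call after configure()
            return displayManager.createVirtualDisplay(
                    "projection-virtual-display",   // hypothetical display name
                    width, height, densityDpi,
                    inputSurface,
                    DisplayManager.VIRTUAL_DISPLAY_FLAG_AUTO_MIRROR); // mirror the default screen
        }
    }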
Step 302, the first terminal obtains a screen image according to the content to be displayed.
The content to be displayed is the content that needs to be displayed on the screen. In practical applications, there may be more than one content to be displayed, and there is an occlusion and/or mixing relationship between the contents to be displayed, such as: when the desktop needs to be displayed on the screen, the wallpaper is a content to be displayed, the icon of each application program is a content to be displayed, and the icon of the application program is displayed on the wallpaper, so that the icon of the application program shields a part of the area of the wallpaper. Mixing refers to transparent channel mixing, such as: a green picture is arranged below the red semitransparent picture, and the color of the finally displayed picture is a composite color obtained by mixing red and green.
The screen image is an image obtained by performing occlusion and/or mixing processing on content to be displayed, and corresponds to a picture displayed on the screen.
Alternatively, step 302 may be replaced with steps 302a to 302c shown in fig. 5.
Step 302a, the first terminal packages the views to be displayed as surfaces, one surface for each view.
The content to be displayed includes at least one view.
The operating system of the first terminal is provided with a window component; a window is a unit for managing display content and is a carrier of images. When the desktop is displayed on the screen, a desktop window is opened, and when a user opens a certain application program, a window corresponding to that application program is opened. For example, when the operating system is the Android operating system, the window component is an Activity.
A window contains a plurality of display contents. Taking the desktop window as an example, the desktop window contains a wallpaper and the icons of 3 application programs; the wallpaper corresponds to one view, and each application icon corresponds to one view. The contents to be displayed are the view corresponding to the wallpaper and the views corresponding to the icons of the application programs. The first terminal first encapsulates the views, each into a surface, so that a wallpaper surface and three application icon surfaces are obtained.
And step 302b, the first terminal determines the occlusion relation and/or the mixing relation of each surface through the image management framework.
The image management framework is responsible for managing occlusion and blending between multiple images.
When the operating system of the first terminal is the Android operating system, the image management framework is the SurfaceFlinger framework.
After the views to be displayed are encapsulated into surfaces, the occlusion and/or blending between the surfaces is managed through the SurfaceFlinger framework.
And step 302c, the first terminal synthesizes all the surfaces to obtain a screen image according to the shielding relation and/or the mixing relation.
The image management framework internally comprises a rendering engine, the rendering engine synthesizes all the surfaces into an image according to the shielding relation and/or the mixing relation among all the surfaces, and the synthesized image is a screen image.
When the image management framework is the SurfaceFlinger framework, the rendering engine is the Open Graphics Library (OpenGL).
In step 303, the first terminal stores the screen image in the first cache and the second cache through the image management framework as the producer.
The first terminal adopts a producer consumer mode, the producer is an image management framework, and the consumer is an encoder.
The image management framework is used as a producer and used for generating a screen image according to the content to be displayed, and after the image management framework generates the screen image, the generated screen image is sent to the display system.
Taking the Android system as the operating system of the first terminal as an example, after generating a screen image, the SurfaceFlinger framework traverses all displays in the display system (including the hardware display and the virtual display) and then sends the same screen image to each display; the screen image sent to the hardware display (the first display) is stored in the first cache (the frame buffer), and the screen image sent to the virtual display (the second display) is stored in the second cache (the virtual buffer).
And step 304, the first terminal displays the screen image stored in the first cache on a screen of the first terminal.
When the screen image is written into the first buffer, the screen image is displayed on the screen of the first terminal.
In step 305, the first terminal performs image differential encoding on the screen image stored in the second cache through the encoder serving as the consumer.
When the screen image is written into the second cache, the encoder serving as the consumer receives a notification from the operating system and encodes the screen image in the second cache.
Optionally, the operating system is an android operating system, and the encoder is MediaCodec.
The screen image is encoded by image differential encoding, and the encoding rule can be preset by a technician. Optionally, the image differential encoding comprises H264 encoding.
When the content to be displayed on the screen changes, the SurfaceFlinger framework generates a screen image according to the content to be displayed and then writes the screen image into the second cache. When image differential encoding is performed on screen images, the encoding rule is generally a time interval. For example, if a technician sets the time interval to 10 seconds, a key frame is generated every 10 seconds, and the screen images written during those 10 seconds are encoded as non-key frames. A key frame encodes the complete content of one screen image; a non-key frame encodes the difference between a screen image and the screen image before it, and that previous screen image is called a reference frame. Note that the first screen image written in every 10-second interval is encoded as a key frame, and the other screen images written during that interval are encoded as non-key frames.
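The consumer side of this producer-consumer arrangement can be sketched as a loop that drains encoded frames from the encoder's output buffers. The fragment below is an assumption-based illustration using MediaCodec and a generic output stream, not the exact implementation of this embodiment.

    import android.media.MediaCodec;
    import java.io.OutputStream;
    import java.nio.ByteBuffer;

    public final class EncoderDrainLoop {
        // Illustrative sketch: read each encoded H.264 frame (key frame or non-key frame)
        // from the encoder's output buffers and pass it on as the image data stream.
        public static void drain(MediaCodec encoder, OutputStream imageDataStream) throws Exception {
            MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
            while (!Thread.interrupted()) {
                int index = encoder.dequeueOutputBuffer(info, 10_000); // wait up to 10 ms
                if (index < 0) {
                    continue; // no encoded output available yet (or a format/buffer change)
                }
                ByteBuffer output = encoder.getOutputBuffer(index);
                byte[] frame = new byte[info.size];
                output.position(info.offset);
                output.get(frame);
                imageDataStream.write(frame); // key frames and non-key frames form the stream
                encoder.releaseOutputBuffer(index, false);
            }
        }
    }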
Step 306, the first terminal sends the encoded image data stream to the second terminal.
Since the first terminal and the second terminal are connected in a wired manner or a wireless manner, step 306 may optionally be implemented in either of the following two ways:
and S1, the first terminal generates a Transmission Control Protocol (TCP) data packet according to the image data stream, and sends the TCP data packet to the second terminal.
When the first terminal is connected with the second terminal in a wireless network mode, the TCP communication protocol is observed between the first terminal and the second terminal when the first terminal sends the image data stream to the second terminal.
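As a sketch of manner S1, the encoded frames could be written to a TCP socket as shown below. The length-prefixed framing is an assumption made for the example, since this description does not specify how frames are packed into TCP data packets.

    import java.io.DataOutputStream;
    import java.net.Socket;

    public final class TcpStreamSender {
        private final DataOutputStream out;

        // Illustrative sketch of manner S1: the first terminal sends the encoded image data
        // stream to the second terminal over TCP, one length-prefixed frame at a time.
        public TcpStreamSender(String host, int port) throws Exception {
            this.out = new DataOutputStream(new Socket(host, port).getOutputStream());
        }

        public void sendFrame(byte[] encodedFrame) throws Exception {
            out.writeInt(encodedFrame.length); // 4-byte length prefix (assumed framing)
            out.write(encodedFrame);           // H.264 payload
            out.flush();
        }
    }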
And S2, the first terminal generates a Universal Serial Bus (USB) data packet according to the image data stream and sends the USB data packet to the second terminal.
When the quality of network communication is not good, in order to enable the second terminal to receive the image data stream sent by the first terminal in time, the first terminal and the second terminal can communicate in a wired mode. Such as: after the USB connection is established between the first terminal and the second terminal, the first terminal sends the image data stream to the second terminal.
In step 307, the second terminal receives the image data stream sent by the first terminal.
Corresponding to the two ways in step 306, step 307 can be implemented by the following two ways:
and S3, the second terminal receives the TCP data packet sent by the first terminal, analyzes the TCP data packet and obtains the image data stream in the TCP data packet.
When the first terminal and the second terminal are connected in a wireless mode, the first terminal packs the image data stream into a TCP data packet for transmission, and the second terminal can analyze the image data stream from the TCP data packet after receiving the TCP data packet.
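The receiving side (manner S3) can then be sketched as the counterpart of the sender above, again under the assumed length-prefixed framing; the handler interface is a hypothetical name introduced only for this example.

    import java.io.DataInputStream;
    import java.net.ServerSocket;
    import java.net.Socket;

    public final class TcpStreamReceiver {
        // Illustrative sketch of manner S3: the second terminal accepts the connection and reads
        // length-prefixed frames of the image data stream, handing each one to the decoder.
        public interface FrameHandler {
            void onFrame(byte[] encodedFrame);
        }

        public static void receive(int port, FrameHandler decoderInput) throws Exception {
            try (ServerSocket server = new ServerSocket(port);
                 Socket socket = server.accept();
                 DataInputStream in = new DataInputStream(socket.getInputStream())) {
                while (true) {
                    int length = in.readInt();   // 4-byte length prefix (assumed framing)
                    byte[] frame = new byte[length];
                    in.readFully(frame);         // one encoded frame of the image data stream
                    decoderInput.onFrame(frame); // pass to the decoder for decoding
                }
            }
        }
    }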
And S4, the second terminal receives the USB data packet sent by the first terminal, analyzes the USB data packet and obtains the image data stream in the USB data packet.
When the first terminal and the second terminal are connected in a wired mode, the first terminal packs the image data stream into a USB data packet for transmission, and the second terminal can analyze the image data stream from the USB data packet after receiving the USB data packet.
In step 308, the second terminal decodes the image data stream, through the decoder serving as the producer, to obtain the screen image.
The second terminal is also in producer-consumer mode, the decoder as producer and the renderer as consumer.
Optionally, before the decoder decodes the image data stream, the image data stream needs to be converted into a format recognizable by the decoder. For example, if the operating system of the second terminal is the Windows operating system, the decoder is FFmpeg and the format recognizable by the decoder is AVPacket.
When decoding the image data stream, the decoder uses a decoding rule corresponding to the encoding rule. For example, when decoding a key frame, the decoder directly decodes it to obtain the corresponding screen image; when decoding a non-key frame, the decoder needs to combine it with the reference frame used during encoding to obtain the corresponding screen image.
The decoder stores the decoded screen image in a buffer corresponding to the frame data.
In step 309, the second terminal writes the screen image into the video memory through the renderer serving as the consumer, renders the screen image, and displays the screen image on the second terminal.
The projection application renders and displays the decoded screen image in the projection window.
The renderer reads the screen image from the buffer corresponding to the frame data, writes it into the video memory, and converts it into an image in video memory called a texture; rendering the texture displays the screen image on the second terminal.
When the operating system of the second terminal is the Windows operating system, the renderer is DirectX and the texture is a D3DTexture.
Because DirectX is a programming interface that operates the graphics card hardware directly for rendering, the amount of calculation performed by the Central Processing Unit (CPU) can be greatly reduced and the image processing speed can be improved.
When the projection application is installed on the second terminal, the renderer renders and displays the screen image in a window corresponding to the projection application. Referring now to fig. 6, a schematic diagram of the projection of the content displayed on the screen of the mobile phone into a window on the desktop of the computer is shown. The first terminal 110 is a mobile phone, the second terminal 120 is a computer, and the content displayed on the screen 10 of the first terminal 110 is projected to the window 20 on the desktop of the second terminal 120 for display. The window 20 can display the contents displayed in the screen 10 of the first terminal 110 in real time, and at the same time, the user can simulate a touch operation on the screen 10 by performing a click operation in the window 20 with the mouse, and simulate a slide operation on the screen 10 by performing a click-and-drag operation in the window 20 with the mouse.
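A small sketch of the coordinate mapping implied by this interaction is given below. The proportional scaling and the method names are assumptions made for illustration; how the resulting touch point is actually injected on the first terminal is not described in this document.

    public final class TouchPointMapper {
        // Illustrative sketch: map a mouse click at (windowX, windowY) inside window 20
        // to the corresponding touch point on screen 10 by proportional scaling.
        public static int[] windowToScreen(int windowX, int windowY,
                                           int windowWidth, int windowHeight,
                                           int screenWidth, int screenHeight) {
            int screenX = windowX * screenWidth / windowWidth;
            int screenY = windowY * screenHeight / windowHeight;
            return new int[] { screenX, screenY };
        }
    }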
The projection method in this embodiment can be represented by the schematic diagram shown in fig. 7. As shown in fig. 7, the Android terminal 400 includes a producer 410 and a consumer 420. In the producer 410, each window 411 includes a view 412, each view 412 is encapsulated to obtain a surface 413, and each surface 413 is sent to the image management framework 414 for composition, the composition being executed by the rendering engine inside the image management framework 414. The screen image composed by the image management framework 414 is sent to the hardware display 415 and the virtual display 417; the screen image sent to the hardware display 415 is stored in the frame buffer 416 for display, and the screen image sent to the virtual display 417 is stored in the virtual buffer 418. In the consumer 420, the screen image in the virtual buffer 418 is used as the input screen image 421, the input screen image 421 is encoded by the encoder 422, and the encoded image data stream (H264 stream) is output by the encoder 422 to the output buffer 423. The Windows terminal 500 includes a producer 510 and a consumer 520. The producer 510 receives the H264 stream from the output buffer 423, which is sent to the Windows terminal 500 through TCP or USB; the producer 510 converts the received H264 stream into the decoder format 511 and sends the converted stream to the decoder 512 for decoding, obtaining the decoded screen image 513, which may be in one of several image formats, such as RGB or YUV. The decoded screen image 513 is stored in the frame data buffer 514, the screen image in the frame data buffer 514 is written into the video memory to obtain the texture 521, the texture 521 is rendered by the renderer 522, and the screen image is displayed in the desktop window 523.
In this embodiment, the steps executed by the first terminal are controlled by the projection program file, and the steps executed by the second terminal are controlled by the projection application.
In summary, in the screen projection method of the mobile device provided by the embodiment of the present invention, the screen image to be displayed on the screen of the first terminal is subjected to image differential encoding, the encoded image data stream is sent to the second terminal, and the second terminal decodes the image data stream to obtain the screen image and renders and displays it. Because the image data stream is obtained by image differential encoding, only the portion of the screen image that has changed is encoded rather than the entire screen image, so the amount of data after encoding is reduced. In addition, because the first cache and the second cache are two independent caches, the first terminal encodes the screen image in the second cache without affecting the image data in the first cache, thereby ensuring normal display on the screen of the first terminal.
With respect to steps 303 to 305, the first terminal adopts the producer-consumer mode, so that the encoder encodes the screen image as soon as it is written into the second cache, and the screen image can be transmitted safely and quickly.
With respect to step 301, by adding a second cache in virtual form to the display system of the first terminal, when the SurfaceFlinger framework in the Android system sends a composed screen image to the display system, the screen image can be sent simultaneously to the first cache corresponding to the hardware screen and the second cache corresponding to the virtual screen. The screen image in the first cache thus meets the display requirements of the screen of the first terminal, while the screen image in the second cache is encoded independently, which improves the processing efficiency of the screen image and avoids the problem that, if the screen image were read from the first cache, the first cache would be used both for displaying on the hardware screen and for projection processing, resulting in low processing efficiency.
With respect to steps 306 to 307, data transmission between the first terminal and the second terminal can use either TCP or USB, so that data can be transmitted in a wired mode when the network condition is poor, thereby ensuring the transmission efficiency of the image data stream.
In steps 308 to 309, the second terminal uses the producer-consumer mode: the decoder decodes the image data stream, the resulting screen image is stored in the buffer, and the renderer reads the screen image from the buffer and writes it into the video memory for rendering and display, so that the second terminal can decode and display the screen image safely and quickly.
With respect to step 308, the second terminal converts the image data stream into a format recognizable by the decoder, so that the decoder can decode the received image data stream.
With respect to step 309, by using the DirectX renderer, the amount of calculation on the CPU side is reduced, so that the speed of processing the image is faster.
Fig. 8 is a block diagram illustrating a screen projecting apparatus of a mobile device according to an embodiment of the present invention, which is applied to the first terminal 110 shown in fig. 1. As shown in fig. 8, the screen projecting apparatus of the mobile device includes: a first generation module 610, a storage module 620, a display module 630, an encoding module 640, and a transmission module 650.
A first generating module 610, configured to implement the above step 201, step 302, and any other implicit or disclosed generating related functions.
A storage module 620, configured to implement the foregoing step 202, step 303, and any other implicit or public storage-related functions.
A display module 630, configured to implement the above-mentioned step 203, step 304, and any other implicit or public display-related functions.
And an encoding module 640, configured to implement the above step 204, step 305, and any other implicit or disclosed encoding related functions.
A sending module 650, configured to implement the above step 205, step 306, and any other implicit or explicit sending related functions.
Optionally, the screen projecting apparatus of the mobile device further includes a second generating module.
A second generation module, configured to implement step 301 and any other implicit or disclosed generation-related functions.
Optionally, the first generating module 610 includes: the device comprises an encapsulation unit, a determination unit and a synthesis unit.
And an encapsulation unit, configured to implement the foregoing step 302a and any other implicit or disclosed encapsulation-related functions.
A determination unit, configured to implement the above step 302b and any other implicit or disclosed determination related functions.
A synthesis unit for implementing the above step 302c and any other implicit or disclosed synthesis related functions.
Optionally, the sending module 650 includes: a first transmitting unit and a second transmitting unit.
A first sending unit, configured to implement step S1 and any other implicit or disclosed sending related functions.
A second sending unit, configured to implement step S2 and any other implicit or disclosed sending related functions.
In summary, in the screen projection apparatus of the mobile device provided by the embodiment of the present invention, the screen image to be displayed on the screen of the first terminal is subjected to image differential encoding, the encoded image data stream is sent to the second terminal, and the second terminal decodes the image data stream to obtain the screen image and renders and displays it. Because the image data stream is obtained by image differential encoding, only the portion of the screen image that has changed is encoded rather than the entire screen image, so the amount of data after encoding is reduced. In addition, because the first cache and the second cache are two independent caches, the first terminal encodes the screen image in the second cache without affecting the image data in the first cache, thereby ensuring normal display on the screen of the first terminal.
It should be noted that: in the screen projecting apparatus of the mobile device provided in the above embodiment, only the division of the above functional modules is used for illustration when projecting the screen, and in practical applications, the above function distribution may be completed by different functional modules according to needs, that is, the internal structure of the first terminal is divided into different functional modules to complete all or part of the above described functions. In addition, the screen projecting apparatus of the mobile device and the screen projecting method embodiment of the mobile device provided by the above embodiments belong to the same concept, and specific implementation processes thereof are detailed in the method embodiment and are not described herein again.
An embodiment of the present invention further provides a terminal, where the terminal includes a processor and a memory, where the memory stores at least one instruction, at least one program, a code set, or an instruction set, and the at least one instruction, the at least one program, the code set, or the instruction set is loaded and executed by the processor to implement the screen projection method of the mobile device described in steps 201 to 205 in fig. 3 and steps 301 to 306 in fig. 4.
In an exemplary embodiment, a computer readable storage medium is also provided, in which at least one instruction, at least one program, code set, or instruction set is stored, and the at least one instruction, the at least one program, code set, or instruction set is loaded and executed by a processor to implement the screen projection method of the mobile device as described in steps 201 to 205 in fig. 3, and steps 301 to 306 in fig. 4.
Fig. 9 is a block diagram of a screen projection apparatus of a mobile device according to another embodiment of the present invention, which is illustrated in the second terminal 120 shown in fig. 1. As shown in fig. 9, the screen projecting apparatus of the mobile device includes: a receiving module 710, a decoding module 720, and a display module 730.
A receiving module 710, configured to implement the foregoing step 206, step 307, and any other implicit or public receiving-related functions.
A decoding module 720, configured to implement the above step 207, step 308, and any other implicit or disclosed decoding related functions.
A display module 730, configured to implement the above step 208, step 309, and any other implicit or public display related functions.
Optionally, the receiving module 710 includes: a first receiving unit and a second receiving unit.
A first receiving unit, configured to implement the step S3 and any other implicit or disclosed receiving related functions.
A second receiving unit, configured to implement the step S4 and any other implicit or disclosed receiving related functions.
In summary, in the screen projection apparatus of the mobile device provided by the embodiment of the present invention, the screen image to be displayed on the screen of the first terminal is subjected to image differential encoding, the encoded image data stream is sent to the second terminal, and the second terminal decodes the image data stream to obtain the screen image and renders and displays it. Because the image data stream is obtained by image differential encoding, only the portion of the screen image that has changed is encoded rather than the entire screen image, so the amount of data after encoding is reduced. In addition, because the first cache and the second cache are two independent caches, the first terminal encodes the screen image in the second cache without affecting the image data in the first cache, thereby ensuring normal display on the screen of the first terminal.
It should be noted that: in the screen projecting apparatus of the mobile device provided in the above embodiment, only the division of the above functional modules is used for illustration when projecting the screen, and in practical applications, the above function distribution may be completed by different functional modules according to needs, that is, the internal structure of the second terminal is divided into different functional modules to complete all or part of the above described functions. In addition, the screen projecting apparatus of the mobile device and the screen projecting method embodiment of the mobile device provided by the above embodiments belong to the same concept, and specific implementation processes thereof are detailed in the method embodiment and are not described herein again.
An embodiment of the present invention further provides a terminal, where the terminal includes a processor and a memory, where the memory stores at least one instruction, at least one program, a code set, or an instruction set, and the at least one instruction, the at least one program, the code set, or the instruction set is loaded and executed by the processor to implement the screen projection method of the mobile device as described in steps 206 to 208 in fig. 3 and steps 307 to 309 in fig. 4.
In an exemplary embodiment, a computer readable storage medium is also provided, in which at least one instruction, at least one program, code set, or instruction set is stored, and the at least one instruction, the at least one program, code set, or instruction set is loaded and executed by a processor to implement the screen projection method of the mobile device as described in steps 206 to 208 in fig. 3, and steps 307 to 309 in fig. 4.
Fig. 10 is a block diagram illustrating a screen projection system of a mobile device according to an embodiment of the present invention, where the screen projection system 800 of the mobile device includes a first terminal 810 and a second terminal 820.
The first terminal 810 includes a screen projecting apparatus of a mobile device as shown in fig. 8, and the second terminal 820 includes a screen projecting apparatus of a mobile device as shown in fig. 9.
Referring to fig. 11, a block diagram of a terminal according to some embodiments of the present invention is shown. The terminal 900 is used for implementing the screen projection method of the mobile device provided by the above embodiments. The terminal 900 in the present invention may include one or more of the following components: a processor for executing computer program instructions to perform various processes and methods, random access memory (RAM) and read-only memory (ROM) for storing information and program instructions, a memory for storing data and information, I/O devices, interfaces, antennas, and the like. Specifically:
The terminal 900 may include a Radio Frequency (RF) circuit 910, a memory 920, an input unit 930, a display unit 940, a sensor 950, an audio circuit 960, a wireless fidelity (WiFi) module 970, a processor 980, a power supply 982, a camera 990, and the like. Those skilled in the art will appreciate that the terminal structure shown in fig. 11 is not limiting; the terminal may include more or fewer components than those shown, some components may be combined, or the components may be arranged differently.
The various components of terminal 900 are described in detail below with reference to fig. 11:
The RF circuit 910 may be used for receiving and transmitting signals during information transmission and reception or during a call. In particular, it receives downlink information from a base station and delivers it to the processor 980 for processing, and it transmits uplink data to the base station. In general, the RF circuit includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier (LNA), a duplexer, and the like. In addition, the RF circuit 910 may also communicate with networks and other devices via wireless communication. The wireless communication may use any communication standard or protocol, including but not limited to Global System for Mobile communications (GSM), General Packet Radio Service (GPRS), Code Division Multiple Access (CDMA), Wideband Code Division Multiple Access (WCDMA), Long Term Evolution (LTE), email, Short Message Service (SMS), etc.
The memory 920 may be used to store software programs and modules, and the processor 980 executes various functional applications and data processing of the terminal 900 by running the software programs and modules stored in the memory 920. The memory 920 may mainly include a program storage area and a data storage area: the program storage area may store an operating system, an application program required by at least one function (such as a sound playing function or an image playing function), and the like; the data storage area may store data (such as audio data or a phonebook) created according to the use of the terminal 900, and the like. Further, the memory 920 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device.
The input unit 930 may be used to receive input numeric or character information and to generate key signal inputs related to user settings and function control of the terminal 900. Specifically, the input unit 930 may include a touch panel 931 and other input devices 932. The touch panel 931, also referred to as a touch screen, may collect a touch operation performed by a user on or near it (for example, an operation performed with a finger, a stylus, or any other suitable object or accessory) and drive a corresponding connection device according to a preset program. Optionally, the touch panel 931 may include two parts: a touch detection device and a touch controller. The touch detection device detects the position of the user's touch, detects the signal produced by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into touch point coordinates, sends the coordinates to the processor 980, and can receive and execute commands sent by the processor 980. The touch panel 931 may be implemented as a resistive, capacitive, infrared, or surface acoustic wave panel, among other types. In addition to the touch panel 931, the input unit 930 may include other input devices 932, which may include, but are not limited to, one or more of a physical keyboard, function keys (such as volume control keys or switch keys), a trackball, a mouse, a joystick, and the like.
The display unit 940 may be used to display information input by the user or information provided to the user, as well as various menus of the terminal 900. The display unit 940 may include a display panel 941, which may optionally be configured in the form of a liquid crystal display (LCD), an organic light-emitting diode (OLED) display, or the like. Further, the touch panel 931 may cover the display panel 941; when the touch panel 931 detects a touch operation on or near it, the operation is transmitted to the processor 980 to determine the type of the touch event, and the processor 980 then provides a corresponding visual output on the display panel 941 according to that type. Although in fig. 11 the touch panel 931 and the display panel 941 are shown as two independent components implementing the input and output functions of the terminal 900, in some embodiments the touch panel 931 and the display panel 941 may be integrated to implement these functions.
The terminal 900 may also include at least one sensor 950, such as a gyroscope sensor, a magnetic induction sensor, an optical sensor, or a motion sensor. Specifically, the optical sensor may include an ambient light sensor, which can adjust the brightness of the display panel 941 according to the brightness of ambient light, and a proximity sensor, which can turn off the display panel 941 and/or the backlight when the terminal 900 is moved to the ear. As one type of motion sensor, an acceleration sensor can detect the magnitude of acceleration in each direction (generally three axes) and, when stationary, the magnitude and direction of gravity; it can be used for applications that recognize the attitude of the electronic device (such as switching between landscape and portrait modes, related games, and magnetometer attitude calibration) and for vibration-recognition functions (such as a pedometer or tap detection). Other sensors that may be configured in the terminal 900, such as a barometer, a hygrometer, a thermometer, or an infrared sensor, are not described here.
The audio circuit 960, a speaker 961, and a microphone 962 may provide an audio interface between a user and the terminal 900. The audio circuit 960 may convert received audio data into an electrical signal and transmit it to the speaker 961, which converts it into a sound signal for output; conversely, the microphone 962 converts a collected sound signal into an electrical signal, which the audio circuit 960 receives and converts into audio data. The audio data is output to the processor 980 for processing and then sent to another terminal via the RF circuit 910, or output to the memory 920 for further processing.
WiFi is a short-range wireless transmission technology, and the terminal 900 can help a user send and receive e-mails, browse webpages, access streaming media, and the like through the WiFi module 970, which provides the user with wireless broadband Internet access. Although fig. 11 shows the WiFi module 970, it is understood that it is not an essential component of the terminal 900 and may be omitted as needed without changing the essence of the disclosure.
The processor 980 is the control center of the terminal 900. It connects the various parts of the entire electronic device using various interfaces and lines, and performs the various functions of the terminal 900 and processes data by running or executing software programs and/or modules stored in the memory 920 and calling data stored in the memory 920, thereby monitoring the electronic device as a whole. Optionally, the processor 980 may include one or more processing units. Preferably, the processor 980 may integrate an application processor, which mainly handles the operating system, user interfaces, applications, and the like, and a modem processor, which mainly handles wireless communication. It will be appreciated that the modem processor may alternatively not be integrated into the processor 980.
The terminal 900 also includes a power supply 982 (such as a battery) for powering the various components. Preferably, the power supply is logically connected to the processor 980 via a power management system, which manages charging, discharging, and power consumption.
The camera 990 generally includes a lens, an image sensor, an interface, a digital signal processor, a central processing unit (CPU), a display screen, and the like. The lens is fixed above the image sensor, and the focus can be changed by manually adjusting the lens; the image sensor is equivalent to the 'film' of a traditional camera and is the heart of the camera for acquiring images; the interface connects the camera to the main board of the electronic device through a flat cable, a board-to-board connector, or a spring-type connection, and sends the acquired image to the memory 920; the digital signal processor processes the acquired image through mathematical operations, converting the acquired analog image into a digital image and transmitting it to the memory 920 through the interface.
Although not shown, the terminal 900 may further include a bluetooth module or the like, which is not described in detail herein.
The above-mentioned serial numbers of the embodiments of the present invention are merely for description and do not represent the merits of the embodiments.
It will be understood by those skilled in the art that all or part of the steps for implementing the above embodiments may be implemented by hardware, or may be implemented by a program instructing relevant hardware, where the program may be stored in a computer-readable storage medium, and the above-mentioned storage medium may be a read-only memory, a magnetic disk or an optical disk, etc.
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the invention, and any modifications, equivalents, improvements and the like that fall within the spirit and principle of the present invention are intended to be included therein.

Claims (16)

1. A screen projection method of a mobile device, the method comprising:
generating a second cache in a virtual form through a display system, wherein the display system comprises a hardware display and a virtual display, the hardware display corresponds to a hardware screen, the virtual display corresponds to a virtual screen, a screen image displayed in the hardware display is stored in a first cache, and a screen image displayed in the virtual display is stored in the second cache;
obtaining a screen image according to the content to be displayed;
sending the screen image to the first cache and the second cache simultaneously, wherein the first cache and the second cache are independent;
displaying the screen image stored in the first cache on a screen of a first terminal, and simultaneously performing image differential coding on the screen image stored in the second cache, wherein the first terminal is the mobile device;
and sending the encoded image data stream to a second terminal, wherein the image data stream is used for triggering the second terminal to decode the received image data stream and to render and display the decoded screen image, and the second terminal is a target terminal for screen projection.
2. The method of claim 1, wherein the first terminal comprises an image management framework and an encoder, and wherein the first terminal is in a producer-consumer mode;
the sending the screen image to the first cache and the second cache simultaneously includes:
sending the screen image to the first cache and the second cache simultaneously through the image management framework as the producer;
the image differential encoding of the screen image stored in the second buffer includes:
image differential encoding the screen image stored in the second buffer by the encoder as the consumer.
3. The method of claim 2, wherein the content to be displayed comprises at least one view;
the obtaining of the screen image according to the content to be displayed includes:
packaging the views to be displayed as surfaces, one for each view;
determining, by the image management framework, an occlusion relationship and/or a blending relationship for each of the surfaces;
and synthesizing the surfaces according to the shielding relation and/or the mixing relation to obtain the screen image.
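Claim 3 above describes composing the per-view surfaces into a single screen image according to their occlusion and blending relationships. The snippet below is a deliberately simplified, hypothetical illustration of that idea, drawing layers back to front with per-pixel alpha; in a real system this composition is performed by the platform compositor (for example SurfaceFlinger on Android), not by application code, and the class and method names here are assumptions.

    import android.graphics.Bitmap;
    import android.graphics.Canvas;
    import android.graphics.Paint;
    import java.util.List;

    final class LayerCompositor {
        static Bitmap compose(List<Bitmap> layersBackToFront, int width, int height) {
            Bitmap screenImage = Bitmap.createBitmap(width, height, Bitmap.Config.ARGB_8888);
            Canvas canvas = new Canvas(screenImage);
            Paint paint = new Paint(Paint.FILTER_BITMAP_FLAG);
            for (Bitmap layer : layersBackToFront) {
                // Drawing back to front realises the occlusion relationship;
                // per-pixel alpha in each layer realises the blending relationship.
                canvas.drawBitmap(layer, 0f, 0f, paint);
            }
            return screenImage;
        }
    }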
4. The method according to any of claims 1 to 3, wherein said transmitting the encoded image data stream to the second terminal comprises:
generating a Transmission Control Protocol (TCP) data packet according to the image data stream, and sending the TCP data packet to the second terminal;
or, alternatively,
generating a Universal Serial Bus (USB) data packet according to the image data stream, and sending the USB data packet to the second terminal.
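Claim 4 leaves the packetization of the encoded image data stream open apart from naming TCP and USB as carriers. The sketch below shows one possible TCP framing, assuming a simple 4-byte length prefix per encoded frame; the port, the framing, and the class name are illustrative assumptions rather than anything specified by the claims.

    import java.io.DataOutputStream;
    import java.net.Socket;

    final class StreamSender implements AutoCloseable {
        private final Socket socket;
        private final DataOutputStream out;

        StreamSender(String host, int port) throws Exception {
            socket = new Socket(host, port);
            out = new DataOutputStream(socket.getOutputStream());
        }

        // Sends one encoded frame with a length prefix so the receiver can re-frame the stream.
        void sendEncodedFrame(byte[] encoded, int offset, int length) throws Exception {
            out.writeInt(length);
            out.write(encoded, offset, length);
            out.flush();
        }

        @Override
        public void close() throws Exception {
            out.close();
            socket.close();
        }
    }

A USB transport would differ only in the channel used; the framing of the encoded stream could remain the same.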
5. A screen projection method of a mobile device, the method comprising:
a first terminal generates a second cache in a virtual form through a display system, wherein the display system comprises a hardware display and a virtual display, the hardware display corresponds to a hardware screen, the virtual display corresponds to a virtual screen, a screen image displayed in the hardware display is stored in the first cache, the screen image displayed in the virtual display is stored in the second cache, and the first terminal is the mobile device;
the first terminal receives a projection program file sent by a second terminal and sends a response signal to the second terminal, wherein the response signal is used for notifying the second terminal that the projection program file is successfully received, the projection program file is sent after a projection application program on the second terminal is started, the projection program file corresponds to the projection application program, and the second terminal is a target terminal for screen projection;
the first terminal receives a control signal sent by the second terminal and operates the projection program file according to the control signal;
the first terminal obtains a screen image according to the content to be displayed;
the first terminal sends the screen image to the first cache and the second cache simultaneously, and the first cache and the second cache are independent;
the first terminal displays the screen image stored in the first cache on a screen of the first terminal, and simultaneously performs image differential coding on the screen image stored in the second cache through the control of the projection program file;
and the first terminal sends the encoded image data stream to the second terminal, wherein the image data stream is used for triggering the second terminal to decode the image data stream under the control of the projection application program, and the decoded screen image is rendered and displayed in a projection window.
6. The method according to claim 5, wherein before the first terminal receives the projection program file sent by the second terminal, the method further comprises:
the first terminal establishes connection with the second terminal according to a preset connection mode, the preset connection mode is determined when the second terminal receives a trigger signal of a configuration control of the preset connection mode, and the configuration control of the preset connection mode is located in a configuration window of the projection application program.
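Claims 5 and 6 describe a hand-shake in which the second terminal pushes a projection program file, the first terminal acknowledges it with a response signal, and a control signal then starts the projection. The following sketch shows the first terminal's side of such an exchange over a plain socket; the message codes, framing, and port are hypothetical, since the claims do not define a wire format.

    import java.io.DataInputStream;
    import java.io.DataOutputStream;
    import java.net.ServerSocket;
    import java.net.Socket;

    final class ProjectionControlChannel {
        static final byte MSG_PROGRAM_FILE = 1;
        static final byte MSG_ACK          = 2;
        static final byte MSG_RUN          = 3;

        void serve(int port) throws Exception {
            try (ServerSocket server = new ServerSocket(port);
                 Socket peer = server.accept();
                 DataInputStream in = new DataInputStream(peer.getInputStream());
                 DataOutputStream out = new DataOutputStream(peer.getOutputStream())) {

                // 1. Receive the projection program file pushed by the second terminal.
                if (in.readByte() != MSG_PROGRAM_FILE) {
                    return;
                }
                byte[] programFile = new byte[in.readInt()];
                in.readFully(programFile);

                // 2. Response signal: tell the second terminal the file arrived intact.
                out.writeByte(MSG_ACK);
                out.flush();

                // 3. Control signal: run the projection program file when instructed.
                if (in.readByte() == MSG_RUN) {
                    runProjectionProgram(programFile);
                }
            }
        }

        private void runProjectionProgram(byte[] programFile) {
            // Loading and executing the received file is device-specific and omitted here.
        }
    }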
7. A screen projection method of a mobile device, the method comprising:
receiving an image data stream sent by a first terminal, wherein the image data stream is obtained by performing image differential encoding on a screen image stored in a second cache while displaying the screen image stored in a first cache on a screen of the first terminal, the screen image is simultaneously sent to the first cache and the second cache, the first cache and the second cache are independent of each other, the screen image stored in the first cache is used for displaying on the screen of the first terminal, the screen image stored in the second cache is used for sending after performing image differential encoding, the second cache is a cache in a virtual form generated by a display system of the first terminal, the display system comprises a hardware display and a virtual display, and the hardware display corresponds to the hardware screen, the virtual display corresponds to a virtual screen, a screen image displayed in the hardware display is stored in the first cache, a screen image displayed in the virtual display is stored in the second cache, and the first terminal is the mobile device;
decoding the image data stream to obtain the screen image;
rendering the screen image, and displaying the screen image on a second terminal, wherein the second terminal is a target terminal for screen projection.
8. The method of claim 7, wherein the second terminal comprises a decoder and a renderer, and the second terminal adopts a producer-consumer mode;
the decoding the image data stream to obtain the screen image includes:
decoding the image data stream by the decoder as the producer to obtain the screen image;
the rendering the screen image and displaying the screen image on a second terminal includes:
and writing the screen image into a video memory through the renderer as the consumer, rendering the screen image, and displaying the screen image on the second terminal.
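Claim 8 places a decoder (producer) and a renderer (consumer) on the second terminal. Assuming an Android-based second terminal and an H.264 stream, the sketch below decodes received frames straight onto the Surface of the projection window, so that releasing an output buffer with render set to true both writes the frame to video memory and displays it; the buffer handling is simplified and the timeouts are arbitrary assumptions.

    import android.media.MediaCodec;
    import android.media.MediaFormat;
    import android.view.Surface;
    import java.nio.ByteBuffer;

    final class MirrorDecoder {
        private final MediaCodec decoder;
        private final MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();

        MirrorDecoder(Surface projectionSurface, int width, int height) throws Exception {
            MediaFormat format =
                    MediaFormat.createVideoFormat(MediaFormat.MIMETYPE_VIDEO_AVC, width, height);
            decoder = MediaCodec.createDecoderByType(MediaFormat.MIMETYPE_VIDEO_AVC);
            // Decoded frames go directly to the Surface (video memory), so rendering
            // happens when an output buffer is released with render = true.
            decoder.configure(format, projectionSurface, null, 0);
            decoder.start();
        }

        // Called once per encoded frame received from the first terminal.
        void onFrameReceived(byte[] encodedFrame, long presentationTimeUs) {
            int inIndex = decoder.dequeueInputBuffer(10_000);
            if (inIndex >= 0) {
                ByteBuffer input = decoder.getInputBuffer(inIndex);
                input.clear();
                input.put(encodedFrame);
                decoder.queueInputBuffer(inIndex, 0, encodedFrame.length, presentationTimeUs, 0);
            }
            int outIndex = decoder.dequeueOutputBuffer(info, 10_000);
            if (outIndex >= 0) {
                decoder.releaseOutputBuffer(outIndex, true /* render */);
            }
        }
    }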
9. The method according to claim 7 or 8, wherein the receiving the image data stream transmitted by the first terminal comprises:
receiving a Transmission Control Protocol (TCP) data packet sent by the first terminal, and parsing the TCP data packet to obtain the image data stream in the TCP data packet;
or, alternatively,
receiving a Universal Serial Bus (USB) data packet sent by the first terminal, and parsing the USB data packet to obtain the image data stream in the USB data packet.
10. A screen projection method of a mobile device, the method comprising:
after a projection application program is started, a second terminal sends a projection program file to a first terminal and displays a projection window of the projection application program, wherein the projection application program is installed on the second terminal, the projection program file corresponds to the projection application program, the first terminal is the mobile device, and the second terminal is a target terminal for screen projection;
when a response signal sent by the first terminal is received, the second terminal sends a control signal for operating the projection program file to the first terminal, and the control signal is used for triggering the screen image to be subjected to image differential coding under the control of the projection program file and then sent; the screen image is obtained by the first terminal according to the content to be displayed and is simultaneously sent to a first cache and a second cache, the first cache and the second cache are independent of each other, the screen image stored in the first cache is used for being displayed on a screen of the first terminal, the screen image stored in the second cache is used for being sent after image differential coding is performed, the second cache is a cache in a virtual form generated by a display system of the first terminal, the display system comprises a hardware display and a virtual display, the hardware display corresponds to a hardware screen, the virtual display corresponds to a virtual screen, the screen image displayed in the hardware display is stored in the first cache, and the screen image displayed in the virtual display is stored in the second cache;
the second terminal receives an image data stream sent by the first terminal and decodes the image data stream, wherein the image data stream is obtained by performing image differential coding on the screen image stored in the second cache while the screen image stored in the first cache is displayed on the screen of the first terminal;
and the second terminal renders and displays the decoded screen image in the projection window.
11. The method according to claim 10, wherein before sending the projection program file to the first terminal, the method further comprises:
the second terminal displays a configuration window of the projection application program, wherein the configuration window comprises a configuration control of a preset connection mode;
and when receiving a trigger signal of the configuration control, the second terminal establishes connection with the first terminal according to the preset connection mode.
12. A screen projection apparatus of a mobile device, the apparatus comprising:
a second generation module, configured to generate a second cache in a virtual form through a display system, where the display system includes a hardware display and a virtual display, the hardware display corresponds to a hardware screen, the virtual display corresponds to a virtual screen, a screen image displayed in the hardware display is stored in the first cache, and a screen image displayed in the virtual display is stored in the second cache;
the first generation module is used for obtaining a screen image according to the content to be displayed;
the storage module is used for simultaneously sending the screen image obtained by the first generation module to the first cache and the second cache, and the first cache and the second cache are independent;
the display module is used for displaying the screen image stored in the first cache by the storage module on a screen of a first terminal, and the first terminal is the mobile device;
the coding module is used for carrying out image differential coding on the screen image stored in the second cache by the storage module;
and the sending module is used for sending the image data stream coded by the coding module to a second terminal, the image data stream is used for triggering the second terminal to decode the received image data stream and rendering and displaying the screen image obtained after decoding, and the second terminal is a screen projection target terminal.
13. A screen projection apparatus of a mobile device, the apparatus comprising:
a receiving module, configured to receive an image data stream sent by a first terminal, where the image data stream is obtained by performing image differential encoding on a screen image stored in a second cache while displaying the screen image stored in a first cache on a screen of the first terminal, the screen image is sent to the first cache and the second cache simultaneously, the first cache and the second cache are independent of each other, the screen image stored in the first cache is used for displaying on the screen of the first terminal, the screen image stored in the second cache is used for sending after performing image differential encoding, the second cache is a cache in a virtual form generated by a display system of the first terminal, the display system includes a hardware display and a virtual display, and the hardware display corresponds to a hardware screen, the virtual display corresponds to a virtual screen, a screen image displayed in the hardware display is stored in the first cache, a screen image displayed in the virtual display is stored in the second cache, and the first terminal is the mobile device;
the decoding module is used for decoding the image data stream received by the receiving module to obtain the screen image;
and the display module is used for rendering the screen image obtained by the decoding module and displaying the screen image on a second terminal, and the second terminal is a screen projection target terminal.
14. A screen projection system of a mobile device, characterized by comprising a first terminal and a second terminal, wherein the first terminal is the mobile device, and the second terminal is a target terminal for screen projection;
the first terminal comprises the screen projection apparatus of the mobile device according to claim 12, and the second terminal comprises the screen projection apparatus of the mobile device according to claim 13.
15. A terminal, characterized in that the terminal comprises a processor and a memory, in which at least one instruction, at least one program, a set of codes, or a set of instructions is stored, which is loaded and executed by the processor to implement the screen projection method of the mobile device according to any one of claims 1 to 4, claims 5 to 6, claims 7 to 9, or claims 10 to 11.
16. A computer readable storage medium having stored therein at least one instruction, at least one program, a set of codes, or a set of instructions, which is loaded and executed by a processor to implement a method of screen projection for a mobile device as claimed in any one of claims 1 to 4, 5 to 6, 7 to 9, or 10 to 11.
CN201710524252.8A 2017-06-30 2017-06-30 Screen projection method, device and system of mobile equipment Active CN109218731B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201710524252.8A CN109218731B (en) 2017-06-30 2017-06-30 Screen projection method, device and system of mobile equipment
PCT/CN2018/092306 WO2019001347A1 (en) 2017-06-30 2018-06-22 Screen projection method for mobile device, storage medium, terminal and screen projection system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710524252.8A CN109218731B (en) 2017-06-30 2017-06-30 Screen projection method, device and system of mobile equipment

Publications (2)

Publication Number Publication Date
CN109218731A CN109218731A (en) 2019-01-15
CN109218731B true CN109218731B (en) 2021-06-01

Family

ID=64740393

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710524252.8A Active CN109218731B (en) 2017-06-30 2017-06-30 Screen projection method, device and system of mobile equipment

Country Status (2)

Country Link
CN (1) CN109218731B (en)
WO (1) WO2019001347A1 (en)

Families Citing this family (28)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109889887A (en) * 2019-04-10 2019-06-14 北京硬壳科技有限公司 A kind of two-way interaction method and system
CN110035289B (en) * 2019-04-24 2022-04-01 润电能源科学技术有限公司 Layered compression method, system and related device for screen image
CN110381195A (en) * 2019-06-05 2019-10-25 华为技术有限公司 A kind of throwing screen display methods and electronic equipment
CN110232882B (en) * 2019-06-06 2022-09-06 深圳市福瑞达显示技术有限公司 Fan screen display control method and system based on Linux system
CN112055040B (en) * 2019-06-06 2023-07-04 百度在线网络技术(北京)有限公司 Screen projection processing method, device, equipment and storage medium
CN110321857B (en) * 2019-07-08 2021-08-17 苏州万店掌网络科技有限公司 Accurate passenger group analysis method based on edge calculation technology
CN110267073A (en) * 2019-07-24 2019-09-20 深圳市颍创科技有限公司 A kind of throwing screen picture, which show and throws, shields picture spinning solution
CN115793950A (en) * 2019-08-29 2023-03-14 荣耀终端有限公司 Control method applied to screen projection scene and related equipment
CN110572469B (en) * 2019-09-18 2022-04-12 江苏视博云信息技术有限公司 Data transmission method, input device, cloud server and cloud game system
CN110798708A (en) * 2019-10-12 2020-02-14 重庆爱奇艺智能科技有限公司 Screen projection method, device and system for display content of VR equipment
CN110850964A (en) * 2019-10-12 2020-02-28 重庆爱奇艺智能科技有限公司 Method, device and system for remotely inputting VR equipment
CN110753265B (en) * 2019-10-28 2022-04-19 北京奇艺世纪科技有限公司 Data processing method and device and electronic equipment
CN113129202B (en) * 2020-01-10 2023-05-09 华为技术有限公司 Data transmission method and device, data processing system and storage medium
CN111432070B (en) * 2020-03-17 2022-04-08 阿波罗智联(北京)科技有限公司 Application screen projection control method, device, equipment and medium
CN113687803A (en) * 2020-05-19 2021-11-23 华为技术有限公司 Screen projection method, screen projection source end, screen projection destination end, screen projection system and storage medium
CN111813302B (en) * 2020-06-08 2022-02-08 广州视源电子科技股份有限公司 Screen projection display method and device, terminal equipment and storage medium
CN112019914B (en) * 2020-08-27 2021-10-22 北京字节跳动网络技术有限公司 Screen projection method and device, electronic equipment and computer readable medium
CN114579068A (en) * 2020-11-30 2022-06-03 华为技术有限公司 Multi-screen cooperative display method and electronic equipment
CN114584816A (en) * 2020-11-30 2022-06-03 上海新微技术研发中心有限公司 Android screen projection definition setting method, computer-readable storage medium and equipment
CN114584828A (en) * 2020-11-30 2022-06-03 上海新微技术研发中心有限公司 Android screen projection method, computer-readable storage medium and device
CN114579217A (en) * 2020-11-30 2022-06-03 上海新微技术研发中心有限公司 Content definable screen projection equipment and method and computer readable storage medium
CN112866804B (en) * 2020-12-31 2023-08-11 努比亚技术有限公司 Screen projection self-adaption method, mobile terminal and readable storage medium
CN115145513A (en) * 2021-03-31 2022-10-04 华为技术有限公司 Screen projection method, system and related device
CN113660494A (en) * 2021-07-19 2021-11-16 惠州Tcl云创科技有限公司 Frame rate stable output method and system and intelligent terminal
CN113612858A (en) * 2021-09-08 2021-11-05 深圳市乐橙互联有限公司 Multi-terminal cooperation system and cooperation method
CN114173183B (en) * 2021-09-26 2023-01-24 荣耀终端有限公司 Screen projection method and electronic equipment
CN114095764B (en) * 2021-09-26 2023-01-06 荣耀终端有限公司 Screen projection method and electronic equipment
CN114567802B (en) * 2021-12-29 2024-02-09 沈阳中科创达软件有限公司 Data display method and device

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101447998B (en) * 2008-12-25 2012-07-11 广东威创视讯科技股份有限公司 Desktop sharing method and system
US9407724B2 (en) * 2010-05-04 2016-08-02 Microsoft Technology Licensing, Llc Using double buffering for screen sharing
US8963799B2 (en) * 2011-01-11 2015-02-24 Apple Inc. Mirroring graphics content to an external display
CN103248946B (en) * 2012-02-03 2018-01-30 海尔集团公司 The method and system that a kind of video image quickly transmits
CN102740155A (en) * 2012-06-15 2012-10-17 宇龙计算机通信科技(深圳)有限公司 Method for displaying images and electronic equipment
CN102883135B (en) * 2012-11-01 2015-08-26 成都飞视美视频技术有限公司 Screen sharing and control method
CN102883134B (en) * 2012-11-01 2015-04-22 成都飞视美视频技术有限公司 Screen sharing and controlling method for video conference system
CN203120046U (en) * 2013-03-08 2013-08-07 成都飞视美视频技术有限公司 Visualization control system based on video conference terminal
CN104407829B (en) * 2014-11-06 2018-01-23 北京凌阳益辉科技有限公司 A kind of image mirrors display methods and its device
CN105262974A (en) * 2015-08-12 2016-01-20 北京恒泰实达科技股份有限公司 Method for realizing wireless screen sharing of multiple users

Also Published As

Publication number Publication date
WO2019001347A1 (en) 2019-01-03
CN109218731A (en) 2019-01-15


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant