CN114697731A - Screen projection method, electronic device and storage medium
- Publication number: CN114697731A (application CN202011625074.6A)
- Authority: CN (China)
- Prior art keywords: screen projection, image data, application, parameters, display
- Legal status: Granted
Classifications
- H04N21/436—Interfacing a local distribution network, e.g. communicating with another STB or one or more peripheral devices inside the home
- H04N21/4402—Processing of video elementary streams involving reformatting operations of video signals for household redistribution, storage or real-time display
- H04N21/440281—Reformatting operations of video signals involving altering the temporal resolution, e.g. by frame skipping
Abstract
The application discloses a screen projection method, an electronic device, and a storage medium, belonging to the field of screen projection. In an embodiment of the application, first image data on a first electronic device is acquired; display parameters are acquired, where the display parameters represent the parameters according to which a second electronic device performs screen projection display; screen projection data is obtained according to the display parameters and the first image data; and the screen projection data is sent to the second electronic device to trigger the second electronic device to process the screen projection data according to the display parameters and display the images in the screen projection data. That is, screen projection between the first electronic device and the second electronic device can be optimized according to the embodiments of the application.
Description
Technical Field
The present disclosure relates to the field of screen projection technologies, and in particular, to a screen projection method, an electronic device, and a storage medium.
Background
With the development of computer technology, more and more computer devices have a screen projection function, which is widely used in users' daily lives. For example, a user may project the display interface of a first electronic device (e.g., a mobile phone) having a screen projection function onto a second electronic device (e.g., a television) that supports screen projection, so that the user can view the display content of the same device on different devices.
In the implementation process, the inventor found at least the following problem in the related art: in some scenarios, because the first electronic device and the second electronic device have different capabilities, for example, different display specifications such as resolution and video memory, unnecessary power consumption may occur on the first electronic device or the second electronic device during screen projection.
Disclosure of Invention
The embodiments of the application provide a screen projection method that reduces the power consumption of the first electronic device and the second electronic device during screen projection by optimizing the screen projection process.
In a first aspect, a screen projection method is provided, and is applied to a first electronic device, and the method includes: acquiring first image data on the first electronic equipment; acquiring display parameters, wherein the display parameters are used for representing parameters according to which the second electronic equipment is projected and displayed; screen projection data are obtained according to the display parameters and the first image data; and sending the screen projection data to second electronic equipment to trigger the second electronic equipment to process the screen projection data according to the display parameters and display images in the screen projection data.
In the implementation of the application, a parameter indicating how the second electronic device performs screen projection display is acquired, and screen projection is optimized according to this parameter. This reduces power consumption during screen projection and improves display fluency and user experience, as well as the efficiency and effect of the screen mirroring transmission.
Optionally, the first image data comprises image data of an application on the first electronic device, and the display parameter comprises a drawing parameter, wherein the drawing parameter is used for representing the number of image frames with changed content of the application in one period; sending the screen projection data to a second electronic device to trigger the second electronic device to process the screen projection data according to the display parameters, wherein displaying images in the screen projection data comprises: and sending the screen projection data to second electronic equipment to trigger the second electronic equipment to display the image in the screen projection data according to the drawing parameters.
In this application, during split-area screen projection, the image frame change situation of the application is obtained and sent to the second electronic device, so that the second electronic device refreshes the application's image in the first image data according to that change situation. The frame rate at which the application's image is refreshed is thus controlled precisely, which reduces the refresh power consumption of the second electronic device and improves display fluency and user experience during screen projection, as well as the efficiency and effect of the screen mirroring transmission.
Optionally, the method further comprises: determining N applications from M applications of the first electronic equipment in response to selection operation of a user, wherein M is an integer greater than or equal to 1, and N is a positive integer less than or equal to M; the display parameters comprise respective drawing parameters of the N applications, wherein the drawing parameters are used for representing the number of image frames with content changes corresponding to the N applications in one period; sending the screen projection data to a second electronic device to trigger the second electronic device to process the screen projection data according to the display parameters, wherein displaying images in the screen projection data comprises: and sending the screen projection data to second electronic equipment to trigger the second electronic equipment to correspondingly display the images of N applications in the screen projection data according to the drawing parameters of the N applications.
In this application, during split-area screen projection, the image frame change situation of each application is obtained and sent to the second electronic device, so that the second electronic device refreshes each application's image in the first image data according to that application's change situation. The frame rate at which each application's image is refreshed is thus controlled precisely, which reduces the refresh power consumption of the second electronic device and improves display fluency and user experience during screen projection, as well as the efficiency and effect of the screen mirroring transmission.
Optionally, the acquiring the display parameters includes: and obtaining corresponding drawing parameters according to the respective image data of the N applications.
Optionally, the acquiring the display parameters includes: and obtaining corresponding drawing parameters according to the respective layer buffer queues of the N applications.
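As a concrete illustration of deriving a drawing parameter from a layer buffer queue, the Java sketch below counts the content-changed frames queued within one period. The queue model (a timestamped frame entry with a dirty flag) and all names are assumptions made for illustration, not structures defined by the patent.

```java
// Minimal sketch, assuming a layer buffer queue can be viewed as a list of
// timestamped entries with a "content changed" flag. Not the patent's
// internal representation.
import java.util.List;

final class DrawingParamEstimator {
    /** One entry in an application's layer buffer queue (assumed model). */
    record BufferedFrame(long timestampMs, boolean contentChanged) {}

    /** Drawing parameter: content-changed frames queued within one period. */
    static int drawingParameter(List<BufferedFrame> queue,
                                long nowMs, long periodMs) {
        int changed = 0;
        for (BufferedFrame f : queue) {
            if (nowMs - f.timestampMs() <= periodMs && f.contentChanged()) {
                changed++;
            }
        }
        return changed; // e.g. 24 changed frames in a 1 s period
    }
}
```

Under this reading, a static application (a paused video, an idle document) yields a small drawing parameter, letting the destination device refresh that application's window far below the global frame rate.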
Optionally, the display parameters include a screen projection parameter, wherein the screen projection parameter is used for representing a capability parameter of the second electronic device; then the obtaining of the screen projection data according to the display parameter and the first image data includes: processing the first image data according to the screen projection parameters to obtain second image data; and obtaining screen projection data according to the second image data.
In this embodiment of the application, the screen projection parameter is obtained according to the capability parameter of the second electronic device, and the first image data is processed according to the screen projection parameter to obtain the second image data. Because the second image data is derived from the capability parameters of the second electronic device, the second electronic device can process it directly; frame-dropping operations on the second electronic device are avoided, along with the power consumption they cause. Moreover, the first image data is processed before encoding; compared with processing it during or after encoding, this reduces the workload of subsequent operations, reduces unnecessary work on the first electronic device, lowers the power consumption of both devices, and improves display fluency and user experience during screen projection, as well as the efficiency and effect of the screen mirroring transmission.
Optionally, the obtaining the screen projection data according to the second image data includes: and coding the second image data to obtain the screen projection data.
Optionally, the acquiring the display parameters includes: acquiring a capability parameter of the second electronic equipment; and obtaining the screen projection parameters according to the capability parameters.
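One plausible reading of this step, sketched below in Java, is to cap the source frame rate at whatever the destination device's capability parameters can sustain. The capability fields and the min-rule are illustrative assumptions; the patent does not prescribe a specific formula.

```java
// Sketch: derive the screen projection parameter (here, a target frame rate)
// from the destination device's capability parameters. The min-rule is an
// assumed policy, not mandated by the patent.
final class CastParamNegotiator {
    /** Capability parameters reported by the destination device (assumed). */
    record SinkCapability(int displayRefreshHz, int maxDecodeFps) {}

    static int negotiateTargetFps(int sourceFps, SinkCapability sink) {
        int sinkLimit = Math.min(sink.displayRefreshHz(), sink.maxDecodeFps());
        // e.g. a 120 fps source and a 60 Hz sink negotiate down to 60 fps
        return Math.min(sourceFps, sinkLimit);
    }
}
```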
Optionally, the processing the first image data according to the screen projection parameter to obtain second image data includes: and performing frame skipping processing on the first image data according to the screen projection parameters to obtain second image data.
Optionally, there are more image frames in the first image data than in the second image data.
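The frame-skipping step could look like the following sketch, which thins the first image data to a target frame rate using an even cadence (for 120 to 60 fps it keeps every second frame; for 90 to 60 fps it keeps two of every three). The cadence is an assumed strategy; the patent only requires that the second image data contain fewer frames than the first.

```java
// Sketch of frame skipping: reduce first image data to second image data at
// a frame rate the destination device supports. The accumulator cadence is
// one possible strategy, chosen here for even frame pacing.
import java.util.ArrayList;
import java.util.List;

final class FrameSkipper {
    static <F> List<F> skipFrames(List<F> firstImageData,
                                  int sourceFps, int targetFps) {
        if (targetFps >= sourceFps) return firstImageData; // nothing to drop
        List<F> secondImageData = new ArrayList<>();
        int acc = 0;
        for (F frame : firstImageData) {
            acc += targetFps;
            if (acc >= sourceFps) {   // keep this frame
                acc -= sourceFps;
                secondImageData.add(frame);
            }                         // otherwise skip it
        }
        return secondImageData;
    }
}
```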
In a second aspect, the present application provides a screen projection method, applied to a second electronic device, the method including: receiving screen projection data sent by first electronic equipment, wherein the screen projection data comprise data obtained according to display parameters and first image data on the first electronic equipment, and the display parameters are used for representing parameters according to which second electronic equipment performs screen projection display; and processing the screen projection data according to the display parameters, and displaying images in the screen projection data.
In a third aspect, the present application provides a screen projection method applied to a second electronic device, where the method includes: receiving screen projection data sent by first electronic equipment, wherein the screen projection data comprise data obtained according to first image data on the first electronic equipment; acquiring display parameters, wherein the display parameters are used for representing parameters according to which the second electronic equipment is projected and displayed; and processing the screen projection data according to the display parameters, and displaying images in the screen projection data.
Optionally, the first image data comprises image data of the application on the first electronic device, and the display parameter comprises a drawing parameter, wherein the drawing parameter is used for representing the number of image frames of content change of the application in one period; then, the processing the screen projection data according to the display parameters, and displaying the image in the screen projection data includes: and displaying the image in the screen projection data according to the drawing parameters.
Optionally, the first image data includes image data sent by the first electronic device after the first electronic device determines N applications from M applications of the first electronic device, where M is an integer greater than or equal to 1, and N is a positive integer less than or equal to M; the display parameters comprise respective drawing parameters of the N applications, wherein the drawing parameters are used for representing the number of image frames with content changes corresponding to the N applications in one period; then, the processing the screen projection data according to the display parameters, and displaying the image in the screen projection data includes: and correspondingly displaying the images of the N applications in the first image data according to the drawing parameters of the N applications.
Optionally, the acquiring the display parameters includes: and obtaining a drawing parameter according to the decoding of the first image data.
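Putting the second- and third-aspect behavior together, a destination device could refresh each application's projection window at that application's drawing parameter rather than at one global rate, as in the sketch below. The window abstraction, the map-based bookkeeping, and treating the drawing parameter as a frames-per-second value are all illustrative assumptions.

```java
// Sketch: per-application refresh on the destination device, driven by the
// drawing parameters (assumed here to be usable as per-app frame rates).
import java.util.Map;

final class PerAppRefresher {
    interface CastWindow { void refresh(byte[] frame); } // assumed abstraction

    static void refreshDueWindows(Map<String, Integer> drawingParams,
                                  Map<String, CastWindow> windows,
                                  Map<String, byte[]> latestFrames,
                                  Map<String, Long> lastRefreshMs,
                                  long nowMs) {
        for (var e : drawingParams.entrySet()) {
            String app = e.getKey();
            CastWindow window = windows.get(app);
            byte[] frame = latestFrames.get(app);
            if (window == null || frame == null) continue;
            long intervalMs = 1000L / Math.max(1, e.getValue());
            if (nowMs - lastRefreshMs.getOrDefault(app, 0L) >= intervalMs) {
                window.refresh(frame);         // refresh only when due,
                lastRefreshMs.put(app, nowMs); // so static apps cost little
            }
        }
    }
}
```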
In a fourth aspect, the present application provides a computing device comprising a processor and a memory, the memory being configured to store a set of computer instructions, which when executed by the processor, perform the method as provided by the first aspect and its optional implementation.
In a fifth aspect, the present application provides a computing device comprising a processor and a memory, the memory being configured to store a set of computer instructions, which when executed by the processor, perform the method provided by the second or third aspect and its optional implementation.
In a sixth aspect, the present application also provides a computer program product containing instructions which, when run on a computer, cause the computer to perform the method as provided by the first aspect and its optional implementation.
In a seventh aspect, the present application also provides a computer program product containing instructions which, when run on a computer, cause the computer to perform the method as provided by the second or third aspect and its optional implementation.
The technical effects obtained by the second, third, fourth, fifth, sixth, and seventh aspects are similar to those obtained by the corresponding technical means in the first aspect and are not repeated here.
The beneficial effects brought by the technical solutions provided in this application include at least the following:
in the embodiments of the application, the power consumption of the first electronic device and the second electronic device is reduced, and the display fluency and user experience during screen projection are improved, as is the efficiency and effect of the screen mirroring transmission.
Drawings
Fig. 1 is a schematic view of a screen projection scene provided in an embodiment of the present application;
fig. 2 is a schematic structural diagram of an electronic device according to an embodiment of the present disclosure;
fig. 3 is a schematic flowchart of a screen projection method according to an embodiment of the present application;
fig. 4 is a schematic flowchart of another screen projection method provided in the embodiment of the present application;
fig. 5 is a block diagram illustrating usage states of a source device and a destination device according to an embodiment of the present disclosure;
fig. 6 is a schematic flow chart illustrating a usage status of a source device and a destination device according to an embodiment of the present application;
fig. 7 is a schematic flowchart of another screen projection method provided in the embodiment of the present application;
fig. 8 is a schematic view of a screen projection application scenario of a source device and a destination device according to an embodiment of the present application;
fig. 9 is a block diagram illustrating another usage status of a source device and a destination device according to an embodiment of the present application;
fig. 10 is a block diagram illustrating another usage status of a source device and a destination device according to an embodiment of the present application;
fig. 11 is a block diagram illustrating another usage status of a source device and a destination device according to an embodiment of the present application;
fig. 12 is a schematic flow chart illustrating a usage status of another source device and a destination device according to an embodiment of the present application;
fig. 13 is a schematic flowchart of a screen projection method applied to a source device according to an embodiment of the present application;
fig. 14 is a schematic diagram of a layer buffer queue according to an embodiment of the present application;
fig. 15 is a flowchart illustrating a screen projection method applied to a first electronic device according to an embodiment of the present application;
fig. 16 is a flowchart illustrating another screen projection method applied to a first electronic device according to an embodiment of the present application;
fig. 17 is a flowchart illustrating another screen projection method applied to a first electronic device according to an embodiment of the present application;
fig. 18 is a schematic view of an application scenario of another screen projection method according to an embodiment of the present application;
fig. 19 is a schematic view of an application scenario of another screen projection method according to an embodiment of the present application;
fig. 20 is a flowchart illustrating a screen projection method applied to a second electronic device according to an embodiment of the present application;
fig. 21 is a flowchart illustrating another screen projection method applied to a second electronic device according to an embodiment of the present application;
fig. 22 is a schematic structural diagram of another electronic device according to an embodiment of the present application.
Description of the main elements
100-a first electronic device; 200-a second electronic device; 110-a processor; 120-external memory interface; 121-internal memory; 130-USB interface; 140-a charge management module; 141-power management module; 142-a battery; 1-an antenna; 2-an antenna; 150-a mobile communication module; 160-a wireless communication module; 170-an audio module; 170A-speaker; 170B-receiver; 170C-microphone; 170D-headset interface; 180-a sensor module; 193-camera; 194-a display screen; 101-a drawing module; 102-a synthesis module; 103-local screen; 104-a first display sending module; 105-a first display; 106-negotiation module; 107-frame skip module; 108-virtual screen; 109-an encoding module; 111-a sending module; 201-a detection module; 202-a receiving module; 203-a decoding module; 204-a second display sending module; 205-a second display; 112-parameter detection module; 206-window splitting module; 207-application a screen projection window; 208-application B screen projection window; 209-information extraction module; 211-refresh determination module; 2201-touch screen; 2202-a processor; 2203-memory; 2204-computer program; 2205-a communication bus; 2206-touch sensor; 2207-display screen; 2208-communication module.
Detailed Description
In the present application, "at least one" means one or more, "and" a plurality "means two or more. "and/or" describes an association relationship of associated objects, meaning that three relationships may exist, e.g., A and/or B may represent: a exists alone, A and B exist simultaneously, and B exists alone, wherein A and B can be singular or plural. The terms "first," "second," "third," "fourth," and the like in the description and in the claims and drawings of the present application, if any, are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order.
Referring to fig. 1, fig. 1 is a schematic view of a screen projection scene according to an embodiment of the present application.
The first electronic device 100 is connected to the second electronic device 200 through a communication network. After the first electronic device 100 establishes a communication connection with the second electronic device 200, the image data on the display interface of the first electronic device 100 is transmitted to the second electronic device 200, and the second electronic device 200 can display the image data transmitted by the first electronic device 100 on the display interface thereof.
In the embodiment of the present application, the communication network may be a wired network or a wireless network. For example, the communication network may be a local area network (LAN) or a wide area network (WAN), such as the Internet. The communication network may be implemented using any known network communication protocol, which may be any of various wired or wireless communication protocols, such as Ethernet, Universal Serial Bus (USB), FireWire, Global System for Mobile communications (GSM), General Packet Radio Service (GPRS), Code Division Multiple Access (CDMA), Wideband Code Division Multiple Access (WCDMA), Time Division-Synchronous Code Division Multiple Access (TD-SCDMA), Long Term Evolution (LTE), Bluetooth, Wi-Fi, NFC, voice over Internet protocol (VoIP), a communication protocol supporting a network slice architecture, or any other suitable communication protocol. Illustratively, the first electronic device 100 may establish a Wi-Fi connection with the second electronic device 200 via the Wi-Fi protocol.
In some embodiments, the specific structures of the first electronic device 100 and the second electronic device 200 may be the same or different. By way of example, each electronic device may be a mobile phone, a tablet computer, an electronic whiteboard, a wearable device, an augmented reality (AR)/virtual reality (VR) device, a notebook computer, an ultra-mobile personal computer (UMPC), a netbook, a personal digital assistant (PDA), a laptop computer, a large electronic screen, a television, a vehicle-mounted central control, an electronic billboard, or the like, which is not limited in this application.
Exemplarily, fig. 2 is a schematic structural diagram of an electronic device provided in an embodiment of the present application.
The electronic device may include a processor 110, an external memory interface 120, an internal memory 121, a Universal Serial Bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, a camera 193, a display screen 194, and the like.
It is to be understood that the structure illustrated in this embodiment does not constitute a specific limitation on the electronic device. In other embodiments of the present application, the electronic device may include more or fewer components than shown, some components may be combined or split, or a different arrangement of components may be used. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Processor 110 may include one or more processing units, such as: the processor 110 may include an Application Processor (AP), a modem processor, a Graphics Processing Unit (GPU), an Image Signal Processor (ISP), a controller, a video codec, a Digital Signal Processor (DSP), a baseband processor, and/or a neural-Network Processing Unit (NPU), etc. The different processing units may be separate devices or may be integrated into one or more processors.
A memory may also be provided in processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that have just been used or recycled by the processor 110. If the processor 110 needs to reuse the instruction or data, it can be called directly from the memory. Avoiding repeated accesses reduces the latency of the processor 110, thereby increasing the efficiency of the system.
In some embodiments, processor 110 may include one or more interfaces. The interface may include an integrated circuit (I2C) interface, an integrated circuit built-in audio (I2S) interface, a Pulse Code Modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a Mobile Industry Processor Interface (MIPI), a general-purpose input/output (GPIO) interface, a Subscriber Identity Module (SIM) interface, and/or a Universal Serial Bus (USB) interface, etc.
The charging management module 140 is configured to receive charging input from a charger. The charger may be a wireless charger or a wired charger. In some wired charging embodiments, the charging management module 140 may receive charging input from a wired charger via the USB interface 130. In some wireless charging embodiments, the charging management module 140 may receive a wireless charging input through a wireless charging coil of the electronic device. The charging management module 140 may also supply power to the electronic device through the power management module 141 while charging the battery 142.
The power management module 141 is used to connect the battery 142, the charging management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140, and supplies power to the processor 110, the internal memory 121, the display 194, the camera 193, the wireless communication module 160, and the like. The power management module 141 may also be used to monitor parameters such as battery capacity, battery cycle count, battery state of health (leakage, impedance), etc. In some other embodiments, the power management module 141 may also be disposed in the processor 110. In other embodiments, the power management module 141 and the charging management module 140 may be disposed in the same device.
The wireless communication function of the electronic device may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, the modem processor, the baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in an electronic device may be used to cover a single or multiple communication bands. Different antennas can also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed as a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 may provide a solution including 2G/3G/4G/5G wireless communication applied to the electronic device. The mobile communication module 150 may include one or more filters, switches, power amplifiers, Low Noise Amplifiers (LNAs), and the like. The mobile communication module 150 may receive the electromagnetic wave from the antenna 1, filter, amplify, etc. the received electromagnetic wave, and transmit the electromagnetic wave to the modem processor for demodulation. The mobile communication module 150 may also amplify the signal modulated by the modem processor, and convert the signal into electromagnetic wave through the antenna 1 to radiate the electromagnetic wave. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the same device as at least some of the modules of the processor 110.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating a low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then passes the demodulated low frequency baseband signal to a baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then transferred to the application processor. The application processor outputs a sound signal through an audio device (not limited to the speaker 170A, the receiver 170B, etc.) or displays an image or video through the display screen 194. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be provided in the same device as the mobile communication module 150 or other functional modules, independent of the processor 110.
The wireless communication module 160 may provide solutions for wireless communication applied to electronic devices, including Wireless Local Area Networks (WLANs) (e.g., wireless fidelity (Wi-Fi) networks), Bluetooth (BT), Global Navigation Satellite System (GNSS), Frequency Modulation (FM), Near Field Communication (NFC), Infrared (IR), and the like. The wireless communication module 160 may be one or more devices that integrate one or more communication processing modules. The wireless communication module 160 receives electromagnetic waves via the antenna 2, performs frequency modulation and filtering processing on electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, perform frequency modulation and amplification on the signal, and convert the signal into electromagnetic waves through the antenna 2 to radiate the electromagnetic waves.
In some embodiments, antenna 1 of the electronic device is coupled to the mobile communication module 150 and antenna 2 is coupled to the wireless communication module 160, so that the electronic device can communicate with networks and other devices through wireless communication technologies. The wireless communication technologies may include Global System for Mobile communications (GSM), General Packet Radio Service (GPRS), Code Division Multiple Access (CDMA), Wideband Code Division Multiple Access (WCDMA), Time Division-Synchronous Code Division Multiple Access (TD-SCDMA), Long Term Evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technologies, etc. The GNSS may include the Global Positioning System (GPS), the global navigation satellite system (GLONASS), the BeiDou navigation satellite system (BDS), the quasi-zenith satellite system (QZSS), and/or a satellite based augmentation system (SBAS).
The electronic device implements the display function through the GPU, the display screen 194, and the application processor, etc. The GPU is a microprocessor for image processing, and is connected to the display screen 194 and an application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. The processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
The display screen 194 is used to display images, video, and the like. The display screen 194 includes a display panel. The display panel may be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), or the like. In some embodiments, the electronic device may include 1 or N display screens 194, where N is a positive integer greater than 1.
The electronic device may implement a shooting function through the ISP, the camera 193, the video codec, the GPU, the display screen 194, the application processor, and the like.
The ISP is used to process the data fed back by the camera 193. For example, when a photo is taken, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, the optical signal is converted into an electrical signal, and the camera photosensitive element transmits the electrical signal to the ISP for processing and converting into an image visible to naked eyes. The ISP can also carry out algorithm optimization on the noise, brightness and skin color of the image. The ISP can also optimize parameters such as exposure, color temperature and the like of a shooting scene. In some embodiments, the ISP may be provided in camera 193.
The camera 193 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image to the photosensitive element. The photosensitive element may be a Charge Coupled Device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The photosensitive element converts the optical signal into an electrical signal, and then transmits the electrical signal to the ISP to be converted into a digital image signal. And the ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into image signal in standard RGB, YUV and other formats. In some embodiments, the electronic device may include 1 or N cameras 193, N being a positive integer greater than 1.
The digital signal processor is used for processing digital signals, and can process digital image signals and other digital signals. For example, when the electronic device selects a frequency point, the digital signal processor is used for performing fourier transform and the like on the frequency point energy.
Video codecs are used to compress or decompress digital video. The electronic device may support one or more video codecs. In this way, the electronic device can play or record video in a variety of encoding formats, such as Moving Picture Experts Group (MPEG)-1, MPEG-2, MPEG-3, and MPEG-4.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to extend the memory capability of the electronic device. The external memory card communicates with the processor 110 through the external memory interface 120 to implement a data storage function. For example, files such as music, video, etc. are saved in an external memory card.
Internal memory 121 may be used to store one or more computer programs, including instructions. The processor 110 may execute the above-mentioned instructions stored in the internal memory 121, so as to enable the electronic device to execute the screen projection display method provided in some embodiments of the present application, and various functional applications and data processing. The internal memory 121 may include a program storage area and a data storage area. Wherein, the storage program area can store an operating system; the storage area may also store one or more application programs (e.g., gallery, contacts, etc.), etc. The storage data area can store data (such as photos, contacts and the like) and the like created during the use of the electronic device. In addition, the internal memory 121 may include a high-speed random access memory, and may also include a nonvolatile memory, such as one or more magnetic disk storage devices, flash memory devices, Universal Flash Storage (UFS), and the like. In other embodiments, the processor 110 may cause the electronic device to execute the screen projection display method provided in the embodiments of the present application, and various functional applications and data processing by executing instructions stored in the internal memory 121 and/or instructions stored in a memory provided in the processor.
The electronic device may implement audio functions via the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the headphone interface 170D, and the application processor. Such as music playing, recording, etc.
The audio module 170 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be disposed in the processor 110, or some functional modules of the audio module 170 may be disposed in the processor 110.
The speaker 170A, also called a "horn", is used to convert the audio electrical signal into an acoustic signal. The electronic apparatus can listen to music through the speaker 170A or listen to a handsfree call.
The receiver 170B, also called "earpiece", is used to convert the electrical audio signal into an acoustic signal. When the electronic device answers a call or voice information, it can answer the voice by placing the receiver 170B close to the ear of the person.
The microphone 170C, also known as a "mic", is used to convert a sound signal into an electrical signal. When making a call or sending voice information, the user can input a sound signal into the microphone 170C by speaking with the mouth close to the microphone 170C. The electronic device may be provided with one or more microphones 170C. In other embodiments, the electronic device may be provided with two microphones 170C to achieve a noise reduction function in addition to collecting sound signals. In other embodiments, the electronic device may further include three, four, or more microphones 170C to collect sound signals, reduce noise, identify sound sources, perform directional recording, and the like.
The headphone interface 170D is used to connect wired headphones. The headphone interface 170D may be the USB interface 130, a 3.5 mm Open Mobile Terminal Platform (OMTP) standard interface, or a Cellular Telecommunications Industry Association of the USA (CTIA) standard interface.
The sensor module 180 may include a pressure sensor, a gyroscope sensor, an air pressure sensor, a magnetic sensor, an acceleration sensor, a distance sensor, a proximity light sensor, a fingerprint sensor, a temperature sensor, a touch sensor, an ambient light sensor, a bone conduction sensor, and the like. The touch sensor may be disposed on the display screen, and the touch sensor and the display screen together form the touchscreen, which is also referred to as a "touch control screen".
In addition, the electronic device may further include one or more components such as a key, a motor, an indicator, and a SIM card interface, which is not limited in this embodiment.
For convenience of description, in the embodiments of the present application, the above-described "first electronic device" is referred to as the "source device" and the above-described "second electronic device" is referred to as the "destination device". The source device projects the image data on its display interface to the display interface of the destination device through the screen projection function. The screen projection function may be provided by an application: for example, if both the source device and the destination device are installed with a screen projection application and the application is started, the screen projection operation can be performed. This is not specifically limited in this application.
In some embodiments, the application scene may be divided into a one-to-one screen projection scene, a one-to-many screen projection scene, and a many-to-one screen projection scene according to different numbers of the source device and the destination device in the application scene. The number of the source device and the number of the destination devices in the one-to-one screen projection scene are both one. The many-to-one screen projection scene comprises a plurality of source devices and a destination device. The one-to-many screen projection scene comprises a source device and a plurality of destination devices.
It should be noted that, in the embodiment of the present application, the screen projection scene may be applied to a one-to-one screen projection scene, and may also be applied to a one-to-many screen projection scene or a many-to-one screen projection scene. And the number of the screen projection objects can be one or more, which is not specifically limited in this application. The specific products and the specific quantities of the source device and the destination device are not particularly limited in the present application.
Referring to fig. 3, fig. 3 is a schematic flow chart of a screen projection method according to an embodiment of the present application. The screen projection method is applied to the source equipment. The order of steps in the screen projection method may be changed and some steps may be omitted.
Step S01: receiving a first screen projection instruction input by a user, and searching equipment capable of projecting a screen according to the first screen projection instruction.
Illustratively, when the source device and each electronic device capable of projecting screens access the same Wi-Fi network, the source device responds to the first screen projecting instruction and searches for the electronic devices capable of projecting screens in the same Wi-Fi network.
Step S02: and receiving a second screen projection instruction input by the user, and determining the target equipment according to the second screen projection instruction.
In the embodiment of the application, after the source device finds the electronic devices capable of receiving a screen projection, it receives a second screen projection instruction input by the user and determines, according to that instruction, the destination device for this screen projection from among those electronic devices. Illustratively, identifiers of the electronic devices capable of receiving a screen projection are displayed on the display interface of the source device so that the user can select among them. For example, a device list is displayed on the display interface of the source device, including identifiers such as a mobile phone identifier and a computer identifier; the user clicks the mobile phone identifier to input the second screen projection instruction to the source device; the source device determines the mobile phone identifier corresponding to the second screen projection instruction, determines the mobile phone corresponding to that identifier as the destination device, and projects its screen to the mobile phone.
Step S03: and sending a screen projection request to the destination equipment.
In the embodiment of the application, after the source device determines the destination device, a screen projection request is sent to the destination device to request to establish a communication connection between the source device and the destination device.
Step S04: and after the target equipment responds to the screen projection request, screen projection data are sent to the target equipment.
In this embodiment of the application, after the destination device receives a screen projection request sent by the source device, the destination device responds to the screen projection request, and then the source device may send screen projection data to the destination device based on the communication connection between the source device and the destination device. It can be understood that the screen projection request includes information such as a screen projection protocol between the source device and the destination device, so that data transmission can be performed between the source device and the destination device according to the screen projection protocol. The screen projection protocol between the source device and the destination device may be self-defined, and is not specifically limited in this application. The screen projection data comprises a video stream of interface data of the source device projected to the destination device.
In the embodiment of the application, the interface content presented on the display interface of the source device can be captured using a mirroring function to obtain a video stream; the video stream is encoded to obtain the screen projection data, which is transmitted to the destination device; and the destination device decodes and plays the screen projection data. Mirroring refers to creating, from a piece of content such as an image, an identical copy of that content.
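As a rough sketch of this capture-encode-transmit loop on the source side, consider the following Java outline. ScreenCapturer, VideoEncoder, and CastSession are hypothetical helpers introduced only to show the data flow; they are not APIs named by the patent.

```java
// Minimal sketch of mirror screen projection on the source device:
// capture the display, encode the frame, send it to the destination.
interface ScreenCapturer { byte[] captureFrame(); }    // hypothetical helper
interface VideoEncoder  { byte[] encode(byte[] raw); } // hypothetical helper
interface CastSession   { void send(byte[] packet); }  // hypothetical helper

final class MirrorCastLoop {
    private final ScreenCapturer capturer;
    private final VideoEncoder encoder;
    private final CastSession session;

    MirrorCastLoop(ScreenCapturer c, VideoEncoder e, CastSession s) {
        capturer = c; encoder = e; session = s;
    }

    /** Capture one frame of the display interface, encode it, and send it. */
    void castOneFrame() {
        byte[] mirrored = capturer.captureFrame(); // mirror of the interface
        byte[] encoded = encoder.encode(mirrored); // video-stream encoding
        session.send(encoded);                     // destination decodes/plays
    }
}
```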
In the embodiment of the present application, a display interface of the source device may include a plurality of objects, such as applications and controls. Before the source device sends screen projection data to the destination device, it needs to determine the screen projection object to be projected to the destination device and the screen projection content corresponding to that object. The screen projection object is an object on the display interface of the source device that the source device intends to project to the destination device. The screen projection content can be the interface content that the screen projection object displays on the display interface of the source device. Taking an application program (hereinafter referred to as an application) as the object of screen projection, the screen projection object is the projected application, and the screen projection content is the interface content that the application displays on the display interface of the source device while running, including the application's interface, playback window, and the like. The screen projection data is obtained according to the screen projection content and may include the screen projection content itself or content obtained after processing it. The processing may include, but is not limited to, adjusting the frame rate, adjusting the display layout, and the like.
It can be understood that, in the embodiment of the present application, a communication connection is established between the source device and the destination device, and the manner of implementing screen projection data transmission is not limited to the above steps.
In the implementation process, the inventor found that, during mirror screen projection between a source device and a destination device, the source device transmits screen projection data to the destination device; when the capability of the destination device cannot support processing and displaying the screen projection data at the frame rate of the transmitted video stream, the destination device may drop frames from the transmitted video stream, but in doing so it must additionally receive, parse, and discard the frames to be dropped. Alternatively, the destination device may not need to parse all the image frames in the received screen projection data or refresh with all of them when displaying, that is, it does not need to refresh at the frame rate of the screen projection data. In both cases, the power the source device and the destination device spend processing the transmitted video stream is partly wasted.
Therefore, the embodiment of the application provides a screen projection method, which optimizes screen projection by determining parameters according to which the second electronic device performs screen projection display. The parameter is a display parameter, and is used to indicate a parameter according to which the second electronic device displays, and the source device may process data transmitted to the second electronic device according to the display parameter, or the destination device may process the transmitted data according to the display parameter.
Referring to fig. 4, fig. 4 is a schematic flow chart of another screen projection display method provided in the embodiment of the present application, and the method is applied to a first electronic device.
Step S41: first image data on the first electronic device is acquired.
In this embodiment of the application, the first image data is image data to be projected to the second electronic device by the first electronic device, and the first image data may be image data to be displayed on the first electronic device. In one possible implementation manner, the first image data may be image data already displayed on the first electronic device, and the image data displayed on the first electronic device is recorded and then sent to the second electronic device.
Step S42: and acquiring display parameters, wherein the display parameters are used for representing parameters according to which the second electronic equipment is projected and displayed.
In the embodiment of the application, the display parameters comprise screen projection parameters and/or drawing parameters. A display parameter may be a frame rate, i.e. the video stream frame rate of the data received by the second electronic device, or the frame rate at which the second electronic device refreshes its display. The video stream frame rate of the data received by the second electronic device is the frame rate of the video stream obtained after the first electronic device processes the data transmitted to the second electronic device. When the display parameter is a screen projection parameter, the second electronic device receives and processes the first image data according to the screen projection parameter, and displays images according to it, for example refreshing the images in the first image data at the frame rate given by the screen projection parameter. When the display parameter is a drawing parameter, the second electronic device refreshes the image of each application according to the drawing parameter corresponding to that application, for example at the frame rate given by the drawing parameter.
Step S43: screen projection data is obtained according to the display parameters and the first image data.
In this embodiment of the present application, the first image data may be processed according to the display parameters to obtain the screen projection data, or the display parameters and the first image data may be packaged to obtain the screen projection data.
Step S44: the screen projection data is sent to the second electronic device to trigger the second electronic device to process the screen projection data according to the display parameters and display the images in the screen projection data.
In this embodiment of the application, the screen projection data is the data sent by the first electronic device to the second electronic device, according to which the second electronic device displays the images the first electronic device projects to it. The second electronic device may parse the screen projection data according to the display parameters, or display the images in the screen projection data according to the display parameters.
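To make the flow of steps S41 to S44 concrete, the following plain-Java sketch models the source-side pipeline. It is illustrative only: the class and record names (SourcePipeline, DisplayParameter, ScreenProjectionData, Sink) are assumptions, not names from this application, and the packaging variant shown corresponds to the second option of step S43 (packaging the display parameter together with the first image data).

```java
import java.util.List;

public class SourcePipeline {

    /** Display parameter: here a frame rate (fps) the sink displays at. */
    record DisplayParameter(int frameRateFps) {}

    /** Screen projection data: image frames packaged with the parameter. */
    record ScreenProjectionData(List<byte[]> frames, DisplayParameter parameter) {}

    /** Stand-in for the communication link to the second electronic device. */
    interface Sink { void receive(ScreenProjectionData data); }

    private final Sink sink;

    public SourcePipeline(Sink sink) { this.sink = sink; }

    /** S41/S42 results come in, S43 packages them, S44 sends. */
    public void project(List<byte[]> firstImageData, DisplayParameter displayParameter) {
        ScreenProjectionData data =
                new ScreenProjectionData(firstImageData, displayParameter); // S43
        sink.receive(data);                                                 // S44
    }
}
```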
In one scenario, during mirror screen projection between a source device and a destination device whose capabilities are not equal, the capability of the destination device cannot support processing the screen projection data. If the frame rate of the video stream in the screen projection data is 120 fps and the refresh rate of the display screen of the destination device is 60 Hz, the destination device performs frame-dropping processing on the image frames in the video stream. Because the screen projection data may be received with delay, frame skipping can occur during this frame-dropping processing, which in turn affects the display smoothness of the destination device.
In another scenario, during mirror screen projection between a source device and a destination device whose capabilities are not equal, the source device runs at a reduced specification. For example, although the refresh rate of the display of the source device is 120 Hz, the source device reduces its specification and outputs image frames at a set frame rate of 60 fps instead of 120 fps. Once the source device starts screen projection, its own display smoothness is affected, degrading the user experience; and if the capability of the destination device does not support the set frame rate, the destination device still incurs meaningless power consumption when processing the screen projection data.
The inventor finds that, in such screen projection scenes, the capabilities of the source device and the destination device are not equal, and the source device or the destination device generates meaningless power consumption during screen projection. When the capability of the destination device cannot support processing the video stream transmitted by the source device, the destination device may drop image frames from the video stream. The destination device performs additional processing on each discarded image frame, such as receiving, parsing, and dropping it, causing meaningless power consumption. The source device likewise performs additional processing, such as encoding and transmission, on the image frames the destination device will discard, also generating meaningless power consumption. Discarding image frames additionally wastes resources such as bandwidth.
Therefore, the embodiment of the present application provides a screen projection method, where capability detection is added to a destination device to detect a capability supported by the destination device, a source device obtains a detection result of the destination device, and performs corresponding processing on screen projection data according to the detection result, so as to adjust screen projection data transmitted to the destination device.
Referring to fig. 5, fig. 5 is a block diagram illustrating usage states of a source device and a destination device according to an embodiment of the present disclosure. The source device is communicatively connected with the destination device, and sends screen projection data to the destination device to trigger it to refresh and display the screen projection data on its display interface.
The source device at least comprises an application A, and further comprises a drawing module 101, a composition module 102, a local screen 103, a first display sending module 104, a first display 105, a negotiation module 106, a frame skipping module 107, a virtual screen 108, an encoding module 109, and a sending module 111. The destination device includes a detection module 201, a receiving module 202, a decoding module 203, a second display sending module 204, and a second display 205. It is to be appreciated that in some scenarios, a source device or a destination device may adaptively add or remove modules.
Application A runs on the source device, and the application interface corresponding to application A is displayed on the first display 105 of the source device. When application A refreshes the content of its application interface, it sends a drawing request to the drawing module 101; the drawing request may include the window information and the drawing content of the application interface. The window information may include the size, position, format, transparency, and so on of the application interface, and the drawing content may include the text and other content the application interface needs to display. Application A issues drawing requests according to its own application design (generally determined by the corresponding installation package, the APK) to perform the corresponding drawing operations.
The drawing module 101 is configured to receive the drawing request of application A and create a window and layers in response to it. The drawing module 101 creates the window according to the window information and creates the layers according to the drawing content. A layer is used for drawing the content of the application interface, including the graphic data of the application interface and the like. Each application comprises at least one window; application A draws the content of the application interface to be displayed into the window, and the image data of application A is displayed in that window. Each application may include multiple layers; for example, a video playing application generally includes three layers, one for displaying video content, one for displaying bullet-screen comments, and another for displaying user interface controls (e.g., a play progress bar, a volume control bar, etc.). Illustratively, when creating a layer, the drawing module 101 requests surface creation from the surface compositor (SurfaceFlinger); upon that request, a Layer object is created in the SurfaceFlinger process together with a corresponding layer cache queue for the Layer. The layer provides control information operations for drawing the application interface and content processing operations for the application interface. For example, layers may call the Open Graphics Library (OpenGL) or the 2D vector graphics function library (Skia) for rendering operations.
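The layer and its layer cache queue can be modeled in plain Java as below. This is a toy model under assumed names, not framework code; it only illustrates that each created layer owns a small queue into which the application's drawn buffers are placed (on Android, SurfaceFlinger's BufferQueue plays this role).

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

public class LayerFactory {

    /** Toy stand-in for a compositor-side Layer with its cache queue. */
    public static class Layer {
        final String name;
        // Per-layer cache queue of drawn buffers, typically a few deep.
        final BlockingQueue<int[]> cacheQueue = new ArrayBlockingQueue<>(3);
        Layer(String name) { this.name = name; }
    }

    /** Invoked when an application's drawing request asks for a new layer. */
    public static Layer createLayer(String windowName) {
        return new Layer(windowName + "#layer");
    }
}
```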
The composition module 102 is configured to receive the layers created by the drawing module 101 and composite them to obtain first image data. Illustratively, all layers (both visible and invisible) created by the drawing module 101 constitute a layer list; the composition module 102 selects the visible layers from the layer list, then finds a free frame buffer (FB) among the frame buffers the system recycles (for example, three of them), and performs the composition operation into that free frame buffer according to the application configuration information, such as which layer is at the bottom, which layer is on top, and which region is visible, overlaying the layers in the visible layer list to obtain the corresponding image frame. The composition module 102 is further configured to obtain parameters of the local hardware of the source device, such as the refresh rate of the first display 105, and to output the first image data according to that refresh rate. Illustratively, with a refresh rate of 120 Hz the composition module 102 outputs 120 image frames per second, and the output image frames form a first image frame set, i.e. the first image data. The composition module 102 may be implemented as a compositor.
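A minimal sketch of the composition operation just described, assuming ARGB pixel arrays of equal size for all layers: visible layers are selected from the layer list and overlaid bottom-to-top into a free frame buffer. The names and the opaque-pixel overwrite rule are simplifications of this sketch, not details from the application.

```java
import java.util.Comparator;
import java.util.List;

public class Compositor {

    public static class Layer {
        final boolean visible;
        final int z;              // stacking order: higher z is drawn on top
        final int[] argbPixels;   // layer content, width * height ARGB ints
        public Layer(boolean visible, int z, int[] argbPixels) {
            this.visible = visible; this.z = z; this.argbPixels = argbPixels;
        }
    }

    /** Overlays the visible layers bottom-to-top into a free frame buffer. */
    public static int[] compose(List<Layer> layerList, int width, int height) {
        int[] frameBuffer = new int[width * height];  // the "free" buffer
        layerList.stream()
                .filter(l -> l.visible)
                .sorted(Comparator.comparingInt((Layer l) -> l.z))
                .forEach(l -> {
                    for (int i = 0; i < frameBuffer.length; i++) {
                        int px = l.argbPixels[i];
                        if ((px >>> 24) != 0) {       // skip fully transparent pixels
                            frameBuffer[i] = px;
                        }
                    }
                });
        return frameBuffer;
    }
}
```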
The local screen 103 is used for buffering the first image data to be displayed on the first display 105, that is, for buffering the first image frame set output by the composition module 102 according to the refresh rate of the first display 105. The local screen 103 may be the frame buffer corresponding to the first display 105, buffering the image frames in the first image frame set. The image frames stored in the local screen 103 are displayed on the first display 105. The local screen 103 may be implemented as a memory, or as a buffer in memory.
The first display sending module 104 may be a hardware display driver on the source device, such as a video card driver. The first display 105 refreshes the image frames of the first image frame set stored in the local screen 103 according to its own refresh rate. The first display 105 may be a hardware display on the source device that includes a display screen for displaying images, such as the images in the first image data.
The negotiation module 106 is configured to receive the capability parameter transmitted by the destination device, or to request that the destination device report its capability parameter, so as to obtain the capability parameter of the destination device. The negotiation module may send a parameter request to the destination device and receive the capability parameter, such as 60 fps, after the destination device responds to the request. The negotiation module 106 is further configured to determine the screen projection parameter according to the received capability parameter; for example, if the capability parameter supported by the destination device is 60 fps, the screen projection parameter is determined to be 60 fps.
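A hedged sketch of the negotiation module's behavior follows. The link interface and method names are assumptions; the one constraint taken from the paragraph above is that the screen projection parameter follows from the capability parameter the destination device reports (60 fps in, 60 fps out). Capping additionally by the source's own output rate is an extra guard of this sketch, not a requirement of the application.

```java
public class NegotiationModule {

    /** Stand-in for the parameter request/response exchange. */
    public interface DestinationLink {
        /** Asks the destination device to report its capability parameter (fps). */
        int requestCapabilityFps();
    }

    private final DestinationLink link;

    public NegotiationModule(DestinationLink link) { this.link = link; }

    /** E.g. a reported capability of 60 fps yields a 60 fps projection parameter. */
    public int negotiateProjectionFps(int sourceOutputFps) {
        int capabilityFps = link.requestCapabilityFps();
        // Never project above what the sink reports or the source produces.
        return Math.min(capabilityFps, sourceOutputFps);
    }
}
```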
The frame skipping module 107 is configured to receive the first image data output by the composition module 102, process it according to the screen projection parameter determined by the negotiation module 106, and output the resulting second image data to the virtual screen 108. The frame skipping module may be implemented as a sub-module of the composition module 102 that outputs the first image data to the virtual screen 108 according to the screen projection parameter. For example, when the screen projection parameter is 60 fps and the composition module 102 outputs a first image frame set at 120 frames per second, the frame skipping module 107 receives that first image frame set and processes it at the 60 fps frame rate, e.g. by frame skipping, to obtain a second image frame set: it reads the image frames in the first image frame set and outputs 60 frames per second, i.e. outputs the frames remaining after skipping some of each second's image frames. The second image frame set therefore contains fewer image frames than the first image frame set, i.e. the second image data contains fewer image frames than the first image data.
The virtual screen 108 is configured to buffer the second image data to be displayed on the second display 205, that is, the second image data obtained after the frame skipping module 107 processes the first image data according to the screen projection parameter. The virtual screen 108 may be a virtual buffer corresponding to the second display 205, buffering the second image frame set corresponding to the second image data. The image frames stored in the virtual screen 108 are for display on the second display 205. The virtual screen 108 may be implemented as a memory, or as a buffer in memory. The encoding module 109 is configured to obtain the second image data cached by the virtual screen 108 and encode it into a video file in the corresponding video format; this video file is the screen projection data. The encoding module 109 may be an encoder of the source device. The sending module 111 is configured to send the screen projection data output by the encoding module 109 to the destination device.
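On Android, the virtual screen 108 plus encoding module 109 pairing can be realized with a MediaCodec input surface backing a virtual display, as in the sketch below. The application does not prescribe these APIs; this is one possible realization, with the codec and bit rate ("video/avc", 8 Mbps) picked arbitrarily for illustration.

```java
import android.hardware.display.DisplayManager;
import android.hardware.display.VirtualDisplay;
import android.media.MediaCodec;
import android.media.MediaCodecInfo;
import android.media.MediaFormat;
import android.view.Surface;

public class ProjectionEncoder {

    /** Wires a virtual display into an H.264 encoder. */
    public VirtualDisplay start(DisplayManager dm, int width, int height,
                                int dpi, int projectionFps) throws Exception {
        MediaFormat fmt = MediaFormat.createVideoFormat("video/avc", width, height);
        fmt.setInteger(MediaFormat.KEY_COLOR_FORMAT,
                MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);
        fmt.setInteger(MediaFormat.KEY_BIT_RATE, 8_000_000);       // arbitrary choice
        // Advisory in surface mode; the real rate is set by how fast
        // frames are composed into the virtual display.
        fmt.setInteger(MediaFormat.KEY_FRAME_RATE, projectionFps); // e.g. 60
        fmt.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1);

        MediaCodec encoder = MediaCodec.createEncoderByType("video/avc");
        encoder.configure(fmt, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
        Surface input = encoder.createInputSurface(); // the "virtual screen" buffer
        encoder.start();

        // Frames composed into this display feed the encoder, not the local panel.
        return dm.createVirtualDisplay("projection", width, height, dpi,
                input, DisplayManager.VIRTUAL_DISPLAY_FLAG_OWN_CONTENT_ONLY);
    }
}
```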
The detection module 201 is configured to detect the capability of the destination device, determine the capability parameter the destination device can support/process during screen projection, and thereby obtain the capability parameter of the destination device. The detection module 201 is further configured to transmit the detected capability parameter to the negotiation module 106 of the source device. The receiving module 202 is configured to receive the screen projection data sent by the source device and pass it to the decoding module 203. A communication circuit between the source device and the destination device establishes communication between them; for example, the communication circuit may connect to a network via wireless or wired communication to enable communication between the source device and the destination device. The negotiation module 106 and the detection module 201, as well as the sending module 111 and the receiving module 202, are communicatively connected through this communication circuit. The decoding module 203 is configured to process the screen projection data, decoding it to obtain the second image data, i.e. the second image frame set. The decoding module 203 may be a decoder on the destination device. The second display sending module 204 is configured to output the image frames of the second image frame set output by the decoding module 203 to the second display 205 for display. The second display sending module 204 may be a hardware display driver on the destination device, such as a video card driver. The second display 205 refreshes the image frames of the second image frame set according to the screen projection parameter, i.e. displays the images of the second image data. The second display 205 may be a display on the destination device that includes a display screen for displaying images, such as the images in the second image data.
In the embodiment of the present application, when the frame skipping module 107 is not provided, the first image data may be stored to both the local screen 103 and the virtual screen 108, that is, the first image data output by the composition module 102 is written to the local screen 103 and to the virtual screen 108. The virtual screen 108 then stores first image data output according to the hardware parameters of the source device, and when the capabilities of the destination device and the source device are not equal and the destination device cannot support processing the first image data, the destination device may drop image frames from it. By providing the frame skipping module 107, the second image data obtained by processing the first image data according to the screen projection parameter is stored in the virtual screen 108 instead, so that the first image data is processed before the encoding module 109 outputs the screen projection data, reducing the workload of every subsequent module. The virtual screen 108 buffers the processed second image frame set, which contains fewer image frames than the first image frame set, so the decoding module 203 processes fewer frames that would otherwise be discarded, and the screen projection data transmitted by the sending module 111 is correspondingly reduced. The meaningless power consumption of the destination device and the source device during screen projection is thereby reduced.
Referring to fig. 6, fig. 6 is a schematic view illustrating the usage state flow of a source device and a destination device according to an embodiment of the present disclosure. The source device runs an application A and further includes a drawing module 101, a composition module 102, a negotiation module 106, a frame skipping module 107, a virtual screen 108, an encoding module 109, and a sending module 111. The destination device comprises a detection module 201, a receiving module 202, a decoding module 203, and a second display sending module 204.
When application A is running, it issues a drawing request to the drawing module 101, and the drawing module 101 creates a window and layers in response. The drawing module 101 sends the created layers to the composition module 102, and the composition module 102 composites image frames from the layers and outputs first image data. The detection module 201 of the destination device reports the capability parameter of 48 fps to the negotiation module 106, and the negotiation module 106 determines the screen projection parameter to be 48 fps accordingly. The frame skipping module 107 processes the first image data output by the composition module 102 according to the 48 fps screen projection parameter and outputs second image data: it obtains the first image data from the composition module 102 and then outputs 48 image frames per second to the virtual screen 108, the output frames being the second image data. The encoding module 109 acquires the second image data from the virtual screen 108 and encodes it to obtain the screen projection data; the sending module 111 sends the screen projection data to the receiving module 202 of the destination device, the receiving module 202 passes it to the decoding module 203, and the decoding module 203 decodes it and sends the decoded second image data to the second display sending module 204 for display.
Referring to fig. 7, fig. 7 is a schematic flow chart of another screen projection method according to an embodiment of the present application. The screen projection method is applied to the source equipment. According to different requirements, the sequence of each step in the screen projection method can be changed, and some steps can be omitted.
Step S60: a communication connection is established with the destination device.
In this embodiment of the present application, after the source device establishes the communication connection with the destination device, screen projection may be performed between the source device and the destination device based on a screen projection protocol.
Step S61: a parameter request is sent to the destination device.
In the embodiment of the present application, after the source device establishes a communication connection with the destination device, the source device sends a parameter request to the destination device, requesting it to report its capability parameter, so as to obtain the capability parameter of the destination device. Through the capability parameter the source device learns what the destination device can support during screen projection, and then performs the corresponding adjustment processing accordingly, reducing the meaningless power consumption of the destination device and the source device during screen projection.
In one possible implementation manner, after the source device establishes the communication connection with the destination device, the destination device may directly detect the capability of the source device to obtain the capability parameter, and then report the capability parameter to the source device, without sending a parameter request to the destination device by the source device.
In one possible implementation manner, the destination device may store and record the capability parameter obtained by the last detection, so as to directly read the capability parameter after the source device establishes a communication connection with the destination device or when receiving a capability parameter reporting request, and then report the capability parameter to the source device without performing capability detection each time.
Step S62: after the destination device responds to the parameter request, the capability parameter of the destination device is acquired.
The destination device responds to the parameter request by detecting its own capability so as to compute its capability parameter, and reports the capability parameter to the source device, which thereby obtains it. The capability parameter may be a frame rate, i.e. the frame rate at which the destination device is determined to be able to process the video stream, such that the image frames in the video stream can be processed or displayed at that rate.
In the embodiment of the application, the destination device may detect the capability of a hardware device or the capability of software thereof to obtain the capability parameter of the destination device. The destination device may send the capability parameter to the source device through a network connection with the source device.
In one possible implementation, the destination device may detect the capability of its hardware devices, which may include the video card, the video memory, the panel, and the like. The frame rate each hardware device can process/support is obtained by reading its parameters. For example, the model and size of the video card and of the video memory are obtained, and the video stream frame rates they can process/support are computed respectively, the frame rate the video card can process/support being limited by the video stream capacity the video memory can hold; the refresh rate of the panel is obtained directly by reading parameters such as the panel model. The frame rate each hardware device can process/support is taken as the frame rate of that device: the frame rate the video card can process/support is the frame rate of the video card, the frame rate the video memory can process/support is the frame rate of the video memory, and the refresh rate of the panel is the frame rate of the panel. The frame rates of the hardware devices of the destination device are then processed together, and the capability parameter is obtained from the result.
In one possible implementation, the destination device selects the minimum frame rate among the frame rates of its hardware devices as the capability parameter. For example, if the frame rate the video card can process/support is 48 fps, the video stream capacity of the video memory limits its frame rate to 60 fps, and the refresh rate of the panel is 80 Hz, then combining the frame rates of the video card, the video memory, and the panel, the minimum, 48 fps, is determined as the capability parameter of the destination device.
In one possible implementation, the software capability of the destination device may also be detected: the frame rate the software can process/support is obtained by reading the video encoding and decoding capability, and that frame rate is determined as the capability parameter of the destination device.
In one possible implementation, the capabilities of the hardware devices and the software may be combined to compute the processing/supporting capability of the destination device, that is, the lowest frame rate is selected from the frame rates of the hardware devices and the software together and determined as the capability parameter of the destination device. For example, if reading the video encoding and decoding capability shows that the software can process/support 60 fps, and the minimum frame rate the hardware devices can process/support is 48 fps, then combining the frame rates of the hardware devices and the software, 48 fps is determined as the capability parameter.
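The capability computation in the last four paragraphs reduces to taking the minimum over the probed frame rates. A minimal sketch, with the component names as assumed map keys:

```java
import java.util.Map;

public class CapabilityDetector {

    /** E.g. {videoCard=48, videoMemory=60, panel=80, codec=60} -> 48 fps. */
    public static int capabilityFps(Map<String, Integer> supportedFpsByComponent) {
        return supportedFpsByComponent.values().stream()
                .mapToInt(Integer::intValue)
                .min()
                .orElseThrow(() -> new IllegalStateException("no components probed"));
    }
}
```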
Step S63: screen projection parameters are obtained according to the capability parameters.
In the embodiment of the application, after receiving the capability parameter sent by the destination device, the source device determines the parameter for compositing to the virtual screen, i.e. the screen projection parameter, according to the capability parameter. The capability parameter is the frame rate the destination device can process/support, and that frame rate is determined as the screen projection parameter, so that the first image data output by the composition module is processed according to the screen projection parameter and the second image data cached in the virtual screen is output, thereby adjusting the screen projection data projected to the destination device. Performing this processing before the encoding module outputs the screen projection data reduces the workload of the subsequent steps and further avoids meaningless power consumption on the source device. In this embodiment, the screen projection parameter may be a frame rate.
Step S64: first image data on the source device is acquired.
In the embodiment of the application, the display interface of the source device includes a plurality of objects, and the image data these objects display on that interface while running is acquired, i.e. the first image data is acquired. The first image data comprises a plurality of image frames, the image frames being the interface content presented by the screen projection object while it runs.
Specifically, while running, the screen projection application issues a drawing request according to its application design, requesting that the corresponding application interface be drawn. The drawing module creates layers according to the drawing request, the layers being used for drawing the content of the application interface. The composition module composites the visible layers drawn by the drawing module and outputs image frames. The composition module further outputs image frames frame by frame according to the hardware parameters of the source device display, resulting in a first image frame set, which may be the first image data to be displayed on the source device. It is to be understood that the first image data is the image data the composition module composites from the object, and includes a plurality of image frames.
It can be understood that the first image data is to be displayed on the first electronic device, and the second image data obtained according to the first image data is to be displayed on the second electronic device. The first image data may include image data of all objects on the display interface of the first electronic device, or may include only image data of objects projected to the second electronic device.
Step S65: the first image data is processed according to the screen projection parameters to obtain second image data.
In one possible implementation, processing the first image data according to the screen projection parameters to obtain the second image data includes: performing frame skipping processing on the first image data according to the screen projection parameters to obtain the second image data.
In this embodiment, processing the first image data according to the screen projection parameter may include selecting a corresponding frame skipping algorithm, selecting the corresponding image frames from the first image data, and obtaining the second image data from the selected frames. The selected image frames are cached to the virtual screen so that the frame rate of the frames cached to the virtual screen equals the screen projection parameter. When caching the image frames to the virtual screen, the image frames in the first image frame set are frame-skipped according to the determined screen projection parameter, and the frames remaining after the frame skipping are cached to the virtual screen, so that the second image frame set is cached on the virtual screen.
Specifically, the composition module outputs 60 image frames per second according to a display refresh rate of 60 Hz, outputting first image data, and the output first image data is mirrored into two copies: one to be sent to the local screen and the other to the virtual screen. The copy for the local screen is transmitted directly; the first image frame set is cached on the local screen, the first display sending module of the source device sends it to the first display, and the first display refreshes each image frame of the first image frame set at 60 frames per second. The copy for the virtual screen passes through the frame skipping module, which performs frame-skipping composition on the first image data according to the screen projection parameter and a preset frame skipping algorithm. If the screen projection parameter is 30 fps, then when processing the first image frame set output by the composition module, the frame skipping module may, for every two image frames, skip one and keep the other to cache to the virtual screen; that is, of the 60 frames per second, 30 image frames are selected and transmitted to the virtual screen for caching, so 30 image frames per second are cached to the virtual screen while 60 image frames per second are cached to the local screen. The virtual screen may be understood as the buffer area of the image frames projected to the second display, buffering the image frames output after processing by the frame skipping module.
In the embodiment of the present application, when the content of the virtual screen is composited with frame skipping according to the screen projection parameter, the selectable frame-skipping composition algorithms include, but are not limited to, uniform frame skipping and frame skipping according to the video content. Illustratively, with a uniform frame-skipping algorithm, after the source device receives a screen projection parameter of 60 fps it computes the frame-skipping ratio 60/120, and according to that ratio the composition module uniformly selects 60 of every 120 frames within one second to cache to the virtual screen. With frame skipping according to the video content, for one second of video whose first half is static and whose second half is dynamic, only one image frame is output for the first half second and 59 image frames for the second half second, giving 60 image frames in the second. The frame-skipping composition algorithm is not specifically limited in the present application. It can be understood that, depending on how the first image data is processed according to the screen projection parameter, the frame rate may be lowered or raised, that is, the frame rate of the first image data composited to the virtual screen may be dropped to the screen projection parameter or lifted to it, which is not specifically limited in this application.
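A sketch of the uniform frame-skipping selection described above: the accumulator test spreads the kept frames evenly across each second. The class name and stateful API are assumptions of this sketch.

```java
public class UniformFrameSkipper {

    private final int inFps;   // frames produced per second, e.g. 120
    private final int outFps;  // screen projection parameter, e.g. 60
    private long seen = 0, kept = 0;

    public UniformFrameSkipper(int inFps, int outFps) {
        this.inFps = inFps;
        this.outFps = outFps;
    }

    /** Returns true if the next incoming frame should reach the virtual screen. */
    public boolean keep() {
        seen++;
        // Keep a frame whenever the ideal output count crosses an integer;
        // this spreads the kept frames uniformly across each second.
        if (seen * outFps >= (kept + 1) * inFps) {
            kept++;
            return true;
        }
        return false;
    }
}
```

With inFps = 120 and outFps = 60, keep() returns false, true, false, true, and so on, keeping exactly every other frame in accordance with the 60/120 ratio in the example above.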
Step S66: the encoded second image data is sent to the destination device to trigger the destination device to display the images corresponding to the second image data. In the embodiment of the application, the second image data is encoded to obtain the screen projection data, and the screen projection data is sent to the destination device. Specifically, the second image data cached on the virtual screen is encoded, that is, the second image frame set is encoded into a file in the corresponding video format; the corresponding screen projection data is obtained from that file and sent to the destination device, the video stream being encoded into the video format file and transmitted to the destination device over the network. The frame rate of the video stream in the encoded screen projection data is less than or equal to the screen projection parameter.
In the embodiment of the application, the source device transmits the screen projection data based on its communication connection with the destination device, for example sending the corresponding video format file to the destination device. After receiving the screen projection data, the destination device processes it; for example, after receiving the video format file, the destination device decodes it to obtain the video stream and refreshes the display according to the screen projection parameter. The source device continues to refresh the images in the first image data at its original refresh rate, while the destination device refreshes the images in the second image data according to the screen projection parameter.
It is understood that the second image data contains fewer image frames than the first image data, and that the image frames in the screen projection data coincide with the image frames in the second image data.
In the embodiment of the application, the source device transmits screen projection data matched to the capability parameter of the destination device, which reduces the destination device's handling of meaningless image frames, for example its receiving of them, and avoids the power consumption that handling would cause. Further, processing the screen projection data reduces the destination device's meaningless processing operations, for example its parsing and processing of meaningless image frames.
Furthermore, caching image frames to the virtual screen according to the capability parameter of the destination device reduces the workload of the subsequent steps, compared with adjusting the frame rate of the screen projection data according to the capability parameter only when the frames cached to the virtual screen are encoded into a video stream. When the content of the virtual screen is composited according to the screen projection parameter, everything on the virtual screen is a meaningful image frame; if instead the interface content of the virtual screen were encoded into a video stream according to the capability parameter, meaningless image frames might be present on the virtual screen, and the subsequent processing would include encoding them.
Furthermore, if frame skipping were applied to the virtual screen content only at encoding time, there is a delay as the image data output by the composition module is transmitted to the encoding module, which may cause the starting point of the encoded recording on the virtual screen to differ from the starting point of the recording on the local screen, making the screen projection result unsatisfactory. When the interface content of the virtual screen is composited according to the screen projection parameter, the frame-skipping algorithm makes the composited content skip frames more optimally and smoothly, and compared with skipping frames at encoding time, directly compositing the virtual screen content according to the screen projection parameter loses less information.
Referring to fig. 8, fig. 8 is a schematic view of a screen projection application scenario of a source device and a destination device according to an embodiment of the present application.
After the user starts the memo application on the source device, the source device receives in real time the interface content of the application interface A11 corresponding to the running memo application, and refreshes the content of the application interface A11 at the source device display refresh rate of 120 Hz, that is, the source device displays the application interface at a frame rate of 120 fps. When the source device establishes a communication connection with the destination device, the destination device detects its own overall capability and obtains a capability parameter of 48 fps. The destination device sends the 48 fps capability parameter to the source device. The source device receives the 48 fps capability parameter and determines the screen projection parameter to be 48 fps. The source device acquires the first image data of the memo application, processes it according to the 48 fps screen projection parameter to obtain the second image data cached to the virtual screen, in which the video stream frame rate is 48 fps, and encodes the second image data cached to the virtual screen to obtain the screen projection data. The source device sends the screen projection data to the destination device. The destination device processes the screen projection data, for example decoding the video stream in it to obtain the second image data; since the screen projection parameter underlying the data is the 48 fps frame rate the destination device can support, the destination device refreshes the video stream of the second image data at 48 fps, thereby refreshing the displayed content of the application interface B11. Thus, during screen projection between the source device and the destination device, the source device still refreshes the application interface A11 at its original 120 fps frame rate, the destination device refreshes the application interface B11 at 48 fps, and the interface content of the application interface A11 is consistent with that of the application interface B11.
In the embodiment of the application, the destination device detects its own capability parameter, and the source device obtains the screen projection parameter from that capability parameter and outputs the corresponding screen projection data according to it. The video stream frame rate of the screen projection data the destination device receives is then a frame rate the destination device can support, which avoids frame loss when the destination device would otherwise process a high-frame-rate video stream, reduces unnecessary frame-dropping or frame-skipping operations, and reduces the meaningless power consumption generated by the source device and the destination device.
The inventor further finds that in some application scenarios the destination device may still incur meaningless power consumption when processing the transmitted video stream. For example, a user selectively projects one or more objects on the display interface of the source device to one or more destination devices; for those projected objects, a destination device does not need to refresh the video stream in the received screen projection data at the frame rate of the video stream, nor at the frame rate the destination device can support/process, but can refresh at a lower frame rate.
Referring again to FIG. 8, the display of the source device in FIG. 8 refreshes the content of the application interface A11 at a refresh rate of 120 Hz, and the destination device refreshes the application interface B11 at its supportable frame rate of 48 fps. However, during the screen projection of the memo application, the user can observe that the picture of the memo application in the display area of the source device is static. The destination device therefore does not need to display at 48 fps, i.e. it need not output 48 image frames per second on its display interface; it can display at a lower frame rate, for example refreshing at 1 fps and outputting one image frame per second. The displayed interface content remains consistent with the interface content displayed on the source device and the screen projection effect is the same, but the unnecessary power consumption can be reduced.
Therefore, the embodiment of the application provides a screen projection method in which parameter detection is added on the source device to detect how the image frames of a screen projection object change on the source device, and the detected result is sent to the destination device to trigger the destination device to perform the corresponding processing. Since the picture in the display area of the memo application in fig. 8 is still, the newly displayed image frames in that display area are all the same image frame; accordingly, the destination device may refresh the application interface of an application according to how the image frames in that application's display area change.
In one possible implementation manner, the drawing parameters are determined according to the change frequency of the cache data in the layer cache queue of the object.
In one possible implementation, the drawing parameters are determined from the first image data on the first electronic device. Specifically, the drawing parameters are defined according to how the object is displayed on the display interface of the source device. A drawing parameter is used to indicate the change of the object's image frames, i.e. it is derived from the content of the image frames in the first image frame set on the source device. A content change of an image frame is understood to mean that the image frame differs in image content from the previous image frame or from the next image frame, i.e. the image content of the frame has changed.
In one possible implementation, how many content-changed frames appear in the display area of the object is determined from how many frames per second in the first image frame set to be displayed by the object have content changes, which in turn triggers the destination device to refresh its display according to that per-second change count. The drawing parameter is used to represent the number of content-changed image frames of the application within one period; the period may be one second.
In one possible implementation, a drawing parameter is defined to measure how many image frames of the object have content changes within one second; that is, the drawing parameter is the number of the object's image frames whose content changes in one second.
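A sketch of how a drawing parameter in this sense could be computed: feed each frame of the object's display area into a counter and close the counter once per second. Comparing frames by a content hash is an assumption of this sketch; the application does not fix how frame changes are detected.

```java
import java.util.Arrays;

public class DrawingParameterCounter {

    private int lastHash;
    private boolean first = true;
    private int changedThisPeriod = 0;

    /** Feed one frame of the object's display area (ARGB pixels). */
    public void onFrame(int[] argbPixels) {
        int h = Arrays.hashCode(argbPixels);
        if (first || h != lastHash) {
            changedThisPeriod++;   // content differs from the previous frame
        }
        first = false;
        lastHash = h;
    }

    /** Called once per one-second period: the drawing parameter for that period. */
    public int closePeriod() {
        int drawingFps = changedThisPeriod;  // e.g. 1 for a static memo, 120 for video
        changedThisPeriod = 0;
        return drawingFps;
    }
}
```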
When there is one screen projection object, the drawing parameter corresponding to that object is detected, and the destination device is triggered to refresh the screen projection data of the object according to that drawing parameter. When there are multiple screen projection objects, the drawing parameter corresponding to each of them is detected; after the destination device obtains the drawing parameters corresponding to the objects, it is triggered to process the transmitted screen projection data and may refresh the image data of each screen projection object according to that object's drawing parameter. That is, the display area of each screen projection object is controlled independently: by controlling the window corresponding to each screen projection object, the display area of each object is controlled independently, and independent control of each object's display is achieved. This avoids the situation where the source device transmits screen projection data to the destination device and the destination device refreshes the interfaces of all screen projection objects at one uniform frame rate.
The following description takes an application as the object.
Referring to fig. 9, fig. 9 is a block diagram illustrating another usage status of a source device and a destination device according to an embodiment of the present disclosure.
Fig. 9 differs from fig. 5 in that two applications, application A and application B, run on the source device, which also includes a parameter detection module 112 but may omit the negotiation module 106 and the frame skipping module 107. The destination device further includes a window splitting module 206, an application A screen projection window 207, and an application B screen projection window 208, but may omit the detection module 201. It is to be appreciated that in some scenarios, a source device or a destination device may adaptively add or remove modules.
Both application A and application B send drawing requests to the drawing module 101. The drawing module 101 is configured to receive the drawing request of application A and create the window and layers of application A in response to it, and likewise to receive the drawing request of application B and create the window and layers of application B in response. The composition module 102 is configured to receive the layers of application A and application B created by the drawing module 101, composite the visible layers, and output image frames.
The parameter detection module 112 is configured to examine the data as the composition module 102 processes the layers of application A and application B, so as to obtain the drawing parameter of application A and the drawing parameter of application B; here the drawing parameter of application A is 120 fps and that of application B is 5 fps, i.e. the image content in the display area of application A changes dynamically while the image content in the display area of application B is relatively static. The parameter detection module 112 is configured to monitor the layer cache queues of the objects in the composition module 102 while the composition module 102 composites the image frames of application A and application B, read the change frequency of the cached data in each object's layer cache queue, and determine the drawing parameter of each screen projection object from that change frequency. The parameter detection module 112 transmits the obtained drawing parameters to the destination device via route (1) in fig. 9, i.e. sends the detected drawing parameters directly over the network to the receiving module 202 of the destination device, realizing real-time transmission of the drawing parameter information over the network connection. Alternatively, the drawing parameter information is transmitted to the encoding module 109 via route (2) in fig. 9, i.e. encapsulated into the screen projection data, which is then transmitted to the destination device over the network; in that case the screen projection data includes the encoded video file and the drawing parameter information. The parameter detection module 112 may also determine the drawing parameter of each screen projection object in combination with the window position information output by the drawing module 101, that is, identify each screen projection object from the window position information, determine which screen projection object each changed area in each image frame corresponds to, and then obtain that object's drawing parameter from the changes of its image frames.
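The last mechanism, combining window position information with the changed area of each frame, amounts to rectangle intersection. A sketch with assumed names:

```java
import java.util.List;
import java.util.Map;

public class RegionAttributor {

    public record Rect(int left, int top, int right, int bottom) {
        boolean intersects(Rect o) {
            return left < o.right && o.left < right
                    && top < o.bottom && o.top < bottom;
        }
    }

    /** One object's window, from the drawing module's position information. */
    public record Window(String objectName, Rect bounds) {}

    /** Bumps the change count of every object whose window the dirty rect touches. */
    public static void attribute(Rect dirty, List<Window> windows,
                                 Map<String, Integer> changeCounts) {
        for (Window w : windows) {
            if (w.bounds().intersects(dirty)) {
                changeCounts.merge(w.objectName(), 1, Integer::sum);
            }
        }
    }
}
```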
The composition module 102 outputs the image frames of application A and application B to the local screen 103 and the virtual screen 108, and the first display sending module 104 is configured to obtain the cached first image data from the local screen 103 and send it to the first display 105 for refresh display. The encoding module 109 is configured to obtain the first image data cached on the local screen 103, encode it into a video file in the corresponding video format, and obtain the screen projection data from that file. The drawing module 101 may encapsulate the window information of each application into the screen projection data, which is sent to the receiving module 202 of the destination device through the sending module 111.
The receiving module 202 is configured to receive the screen projection data sent by the source device, the screen projection data including the video stream and the drawing parameter information, and thereby to obtain the video stream and the drawing parameter corresponding to each screen projection object. The receiving module 202 is further configured to send the screen projection data to the decoding module 203. A communication circuit between the source device and the destination device establishes communication between them; for example, the communication circuit may connect to a network via wireless or wired communication to enable communication between the source device and the destination device. The parameter detection module 112 and the receiving module 202, as well as the sending module 111 and the receiving module 202, are communicatively connected through this communication circuit. The decoding module 203 is configured to process the screen projection data, obtain the video file in the corresponding video format from it, and decode the file into a video stream; the decoding module 203 may be a decoder on the destination device. The window splitting module 206 is configured to receive the video stream transmitted by the decoding module 203 and to create and split two windows according to the window information created by the drawing module 101: it splits two windows according to the respective window information of application A and application B, splits the video stream into the two video streams corresponding to application A and application B, and encapsulates the content of each stream into the corresponding window, obtaining the window of application A and the window of application B. The window splitting module 206 is further configured to receive the drawing parameters in the screen projection data received by the receiving module 202 and to drive each window according to its corresponding drawing parameter, i.e. to refresh the image frame data in each display window according to its drawing parameter. The application A screen projection window 207 is configured to refresh the image data of application A, i.e. the image content of the display area of application A's window, according to application A's drawing parameter of 120 fps. The application B screen projection window 208 is configured to refresh the image data of application B, i.e. the image content of the display area of application B's window, according to application B's drawing parameter of 5 fps. The application A screen projection window 207 and the application B screen projection window 208 may each be implemented as a player application. The second display sending module 204 is configured to output each image frame output by the application A screen projection window 207 and the application B screen projection window 208 to the second display 205. The second display 205 is configured to refresh each frame of image output by the application A screen projection window 207 according to the 120 fps drawing parameter.
The second display 205 is likewise configured to refresh each frame of image output by the application B screen projection window 208 according to the 5 fps drawing parameter. Region-by-region refresh is thus realized according to how each application's image frames change.
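On the destination side, independent refresh of the two screen projection windows can be sketched as one timer per window, each running at that window's drawing parameter. The scheduling mechanism below is an assumption; the point is only that application B's 5 fps window never forces application A's 120 fps cadence, or vice versa.

```java
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

public class PerWindowRefresher {

    private final ScheduledExecutorService timers = Executors.newScheduledThreadPool(2);

    /** Refreshes one screen projection window at its own drawing parameter. */
    public void schedule(int drawingFps, Runnable refreshOnce) {
        long periodMicros = 1_000_000L / drawingFps;
        timers.scheduleAtFixedRate(refreshOnce, 0, periodMicros, TimeUnit.MICROSECONDS);
    }
}
```

With this, schedule(120, refreshWindowA) and schedule(5, refreshWindowB) reproduce the region-by-region refresh described above.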
In the embodiment of the present application, the parameter detection module 112 detects how the image frames of each object change, so that the destination device can refresh each object according to the actual content changes of that object, precisely controlling the refresh of each object on the display interface of the destination device and reducing unnecessary refresh power consumption on the destination device.
Referring to fig. 10, fig. 10 is a block diagram illustrating another usage status of a source device and a destination device according to an embodiment of the present disclosure.
The difference between fig. 10 and fig. 9 is that the parameter detection module 112 detects data when the encoding module 109 encodes the first image data. During video encoding, the image data of each application's image frames can be acquired, the image frame change condition of each application can be detected from the acquired image data, and the drawing parameter of each application can then be determined from its image frame change condition. The drawing parameters obtained by the parameter detection module 112 are transmitted to the destination device through route (1) and/or route (2) in fig. 10. The parameter detection module 112 may also determine the drawing parameter of each screen projection object in combination with the window position information output by the drawing module 101: each screen projection object is identified from the window position information, the changed area in each image frame is mapped to the screen projection object whose display area it falls in, and the drawing parameter is then obtained from that object's image frame changes.
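As a minimal sketch of the attribution step just described — mapping the changed area of each encoded frame to the screen projection object whose window it overlaps — the following assumes rectangles given as (left, top, right, bottom) tuples; the function names and sample numbers are hypothetical.

```python
def rects_overlap(a, b):
    """Axis-aligned overlap test for (left, top, right, bottom) rectangles."""
    return a[0] < b[2] and b[0] < a[2] and a[1] < b[3] and b[1] < a[3]

def per_object_fps(window_rects, changed_regions_per_frame, stream_fps):
    """Count, per screen projection object, how many encoded frames touched
    its window, and scale the count to changed frames per second."""
    changed = {name: 0 for name in window_rects}
    for regions in changed_regions_per_frame:          # one region list per frame
        for name, rect in window_rects.items():
            if any(rects_overlap(rect, r) for r in regions):
                changed[name] += 1
    n = len(changed_regions_per_frame)
    return {name: round(c * stream_fps / n) for name, c in changed.items()}

windows = {"game": (0, 0, 960, 1080), "ppt": (960, 0, 1920, 1080)}
# hypothetical change info for 4 consecutive encoded frames
changes = [[(10, 10, 900, 900)], [(20, 20, 800, 700)], [], [(5, 5, 950, 1000)]]
print(per_object_fps(windows, changes, stream_fps=120))  # {'game': 90, 'ppt': 0}
```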
Referring to fig. 11, fig. 11 is a block diagram illustrating a usage status of another source device and a destination device according to an embodiment of the present disclosure.
Fig. 11 differs from fig. 10 in that the parameter detection module 112 is not provided on the source device but on the destination device. The parameter detection module 112 is configured to detect data when the decoding module 203 decodes the first image data: during video decoding, the image data of each application's image frames can be acquired, the image frame change condition of each application can be detected from the acquired image data, and the drawing parameter of each application can then be determined from its image frame change condition. On the destination device, the parameter detection module 112 may include an information extraction module 209 and a refresh determination module 211. The information extraction module 209 is configured to extract information during decoding of the video stream; it may extract the region change condition between two adjacent image frames from the decoding module 203, that is, obtain which region content changes and which region content does not change between the upper and lower frames when the video is decoded. The refresh determination module 211 is configured to analyze the information extracted by the information extraction module 209, that is, to obtain the image frame change condition of each application from the region change condition between adjacent image frames. The refresh determination module 211 may further determine the drawing parameters of each screen projection object in combination with the window position information output by the drawing module 101: each screen projection object is identified from the window position information, the changed area in each image frame is mapped to the screen projection object whose display area it falls in, and the drawing parameters are then obtained from that object's image frame changes.
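Where a decoder does not expose per-block change information directly, a plain pixel diff between adjacent decoded frames can stand in for the information extraction module's output. This is a simplified sketch under that assumption; a real implementation would more likely reuse the codec's macroblock or motion-vector data.

```python
import numpy as np

def changed_region(prev, cur):
    """Bounding box (left, top, right, bottom) of pixels that differ between
    two decoded frames, or None when nothing changed and no refresh is needed."""
    diff = np.any(prev != cur, axis=-1)          # True where any channel differs
    ys, xs = np.nonzero(diff)
    if ys.size == 0:
        return None
    return (int(xs.min()), int(ys.min()), int(xs.max()) + 1, int(ys.max()) + 1)

prev = np.zeros((1080, 1920, 3), dtype=np.uint8)
cur = prev.copy()
cur[100:200, 300:400] = 255              # simulate a change inside one window
print(changed_region(prev, cur))         # (300, 100, 400, 200)
```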
Referring to fig. 12, fig. 12 is a schematic view illustrating a usage status flow of another source device and a destination device according to an embodiment of the present disclosure. The source device includes a running application B, and the source device includes a rendering module 101, a composition module 102, a parameter detection module 112, an encoding module 109, and a transmission module 111. The destination device comprises a receiving module 202, a decoding module 203, a window splitting module 206 and a second rendering module 204.
When application B runs, it issues a drawing request to the drawing module 101, and the drawing module 101 creates a window and a layer in response. The drawing module 101 sends the created layer to the synthesis module 102, which synthesizes image frames from the layer and outputs the first image data. The parameter detection module 112 detects data while the synthesis module 102 composes the image frames of application B, obtains the 5 fps drawing parameter of application B, and writes it into the encoding module 109. The encoding module 109 encodes the first image data to obtain the screen projection data and passes it to the sending module 111, which sends it to the receiving module 202 of the destination device. The receiving module 202 passes the received screen projection data to the decoding module 203, which decodes it back into the first image data. The video stream decoded by the decoding module 203 is transmitted to the window splitting module 206, which splits a window according to the window information from the drawing module 101 and packages the content of the video stream into the corresponding window, obtaining the window displayed by application B on the destination device. The window splitting module 206 also receives the drawing parameters from the screen projection data received by the receiving module 202 and packages the application window according to the corresponding drawing parameters. Finally, the window splitting module 206 outputs the packaged image data to the second rendering module 204 for display.
Referring to fig. 13, fig. 13 is a flowchart illustrating a screen projection method applied to a source device according to an embodiment of the present application. The source device comprises a plurality of objects. The order of the steps in the flowchart may be changed and certain steps may be omitted according to different requirements.
Step S130: a communication connection is established with the destination device.
In this embodiment, after the source device establishes a communication connection with the destination device, the source device and the destination device may transmit screen projection data based on a screen projection protocol.
Step S131: and determining a screen projection object in response to the operation of the user.
In the embodiment of the application, the object projected to the destination device on the display interface of the source device can be determined according to the operation of the user.
It can be understood that the source device display interface may include a plurality of objects, each object has a corresponding window, each object has a corresponding display area on the source device display interface, the display area of each object is determined according to each window, and image data of the object in the running process is played and displayed in the display area of each object. The user can select to project one screen projection object to the target device, and can also select to project a plurality of screen projection objects to the target device.
Step S132: and acquiring first image data of the screen projection object.
In the embodiment of the application, after each screen projection object is determined, first image data to be displayed by each screen projection object in the operation process is acquired, wherein the first image data comprises a plurality of image frames, and the image frames are image data presented by the screen projection objects in the operation process.
Step S133: and obtaining drawing parameters corresponding to the screen projection object.
In the embodiment of the application, each screen projection object has corresponding drawing parameters. When detecting them, the source device may, for each screen projection object, detect the change frequency of the object's layer cache content and determine the object's drawing parameters from that change frequency, thereby acquiring the drawing parameters corresponding to each screen projection object.
Taking an application as the screen projection object, determining the drawing parameters of the screen projection object according to the content change frequency of its layers can be realized in the following ways:
the first mode is as follows: the source equipment monitors the layer cache queue of the screen projection object, reads the change frequency of cache data in the layer cache queue of the screen projection object, records the change frequency of the cache data in the layer cache queue of the screen projection object, and determines the drawing parameters of the screen projection object according to the change frequency of the cache data in the layer cache queue of the screen projection object.
Specifically, take the example of a game application and a PPT application running simultaneously on the source device, where the PPT presents slides while the game application is running and playing a game. The content of the game application interface is continuously refreshed and continuously changing, while the PPT application interface is static. The game application and the PPT application issue drawing requests according to their application design, creating layers and windows; a layer cache queue is created along with each layer. Referring to fig. 14, the source device monitors the applications' layer buffer queues, detects that buffer data is continuously transmitted in the layer buffer queue of the game application, records the change frequency of that buffer data, and calculates the drawing parameter from it. By counting how many image frames of the game application change content within one second, the drawing parameter of the game application is obtained, reflecting the actual picture content change of the game application. The source device detects that no buffer data has been transmitted in the layer buffer queue of the PPT application interface for a long time, records the change frequency of the buffer data in that queue, calculates from it how many image frames of the PPT application change content within one second, and thereby obtains the drawing parameter of the PPT application, reflecting the actual picture content change of the PPT application. The synthesis module reads the layers in the layer buffer queues, performs synthesis processing, and outputs the image frames.
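A minimal sketch of the first mode, assuming each queued layer buffer is reported to a monitor with a timestamp; the class name and the one-second sliding window are illustrative choices, not mandated by this disclosure.

```python
import collections
import time

class LayerBufferMonitor:
    """Records when new buffers land in one object's layer buffer queue and
    derives a drawing parameter (changed frames per second) from the record."""

    def __init__(self, window_s=1.0):
        self.window_s = window_s
        self.updates = collections.deque()

    def on_buffer_queued(self, now=None):
        self.updates.append(time.monotonic() if now is None else now)

    def drawing_param(self, now=None):
        now = time.monotonic() if now is None else now
        while self.updates and now - self.updates[0] > self.window_s:
            self.updates.popleft()      # drop updates older than the window
        return len(self.updates)        # changed frames in the last second

game = LayerBufferMonitor()
for i in range(120):                    # game queues a buffer every 1/120 s
    game.on_buffer_queued(now=i / 120)
print(game.drawing_param(now=1.0))      # -> 120; a static PPT would report 0
```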
In one possible implementation manner, the drawing parameters of all objects on the display interface of the source device are detected, the screen projection objects are determined according to their window information, and the drawing parameters corresponding to each screen projection object are then selected. Alternatively, the layer buffer queue corresponding to a screen projection object may be detected directly to obtain that object's drawing parameters; this is not specifically limited in the present application.
In the embodiment of the present application, the window information corresponding to each screen projection object may be acquired through a Window Manager Service (WMS), and after acquiring it, the source device may send the window information to the destination device. The window information comprises the window position, window size, window hierarchy and similar information of a screen projection object. For example, the window of a screen projection object may be positioned on the right side of the source device's display, as tall as the display, half the display's width, and at window level 22. The examples merely illustrate the present application and should not be construed as limiting.
In the embodiment of the application, the drawing request includes window information, and the window information of the screen projection object can be obtained according to the window information in the drawing request.
It can be understood that, when an application runs, its application interface has a plurality of layers, and within a window the visible layers are stacked to form the displayed image. When the content of an application's visible layer is updated, layer cache data is issued continuously; when the cache data of a layer changes, the corresponding image data of the application also changes, so that the current image frame differs from the previous one. Whenever the content of an application interface needs to be refreshed, cache data is transmitted in the layer cache queue corresponding to that application.
The second mode is as follows: during video encoding, the encoding module obtains the first image data of each application to obtain a first image frame set per application, detects the cache data change frequency in each application's layer cache queue according to that set, and determines the drawing parameters of each application accordingly. In other words, the change condition of the image frames in the first image frame set is obtained during video encoding, and the drawing parameters are then determined from that change condition.
It can be understood that, during video encoding, the image data of each application, including layer information, may be obtained. Transform coding or predictive coding may then be performed according to the correlation between pixels; inter-frame prediction or motion-compensated coding according to the correlation in the time direction; contour coding or region-division coding according to the structure of the image itself; knowledge-based coding according to knowledge common to the transmitting and receiving ends; or non-linear quantization coding according to human visual characteristics. The layer content change frequency of each application interface can thus be obtained during video encoding. That is, in the video encoding process, it is possible to determine which region content changes and which does not between two frames, and the layer content change frequency of each application can then be obtained by combining the window information of the application interfaces.
Specifically, video coding is compression coding based on the redundant information of video (temporal redundancy, spatial redundancy, structural redundancy, knowledge redundancy, and visual redundancy). During encoding, the coding algorithm searches for regularities to compress; for example, in a one-minute video where the picture is still for 30 seconds, compression coding exploits that regularity. Therefore, during video encoding, the image data of each image frame of each application interface can be acquired, and between adjacent frames it can be determined which region's content changed and which did not. The changed image frames can thus be identified, and the number of image frames whose content changed within one second can be counted.
When the first image data of all screen projection objects on the source device's display interface is encoded, the display area of each application can be determined in combination with each application's window information, and the layer content change frequency in the layer cache queue corresponding to each application's display area can then be obtained per display area. When the source device's display interface comprises a plurality of applications and the image data of only one application is captured and encoded, the layer content change frequency of that application can be obtained directly, the number of image frames whose content changes within one second of the application interface is calculated, and the drawing parameters of the application are thereby obtained.
In the embodiment of the application, the drawing parameters of an application can be determined from the image frames of the first image frame set during encoding, and the image frame change condition can be determined from the I frames, P frames and B frames in the set. An I frame is a key frame coded independently of other frames; a frame generated with reference to a preceding I frame and coding only the difference is called a P frame; and a frame coded with reference to both the preceding and the following frame is called a B frame. The drawing parameters of each application can thereby be derived from the encoding information.
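As a sketch of deriving a drawing parameter from encoding information, the following treats a near-empty inter frame as "no content change"; the 64-byte threshold and the function name are assumptions made for illustration, not values from this disclosure.

```python
def drawing_param_from_gop(frames, stream_fps, skip_bytes=64):
    """`frames` lists (frame_type, payload_size) pairs for one second of
    encoded video. An I frame always counts as a content change; a P or B
    frame counts only when it carries more than `skip_bytes` of residual
    data, since a near-empty inter frame simply repeats its reference."""
    changed = sum(1 for frame_type, size in frames
                  if frame_type == "I" or size > skip_bytes)
    return round(changed * stream_fps / len(frames))

# hypothetical one-second GOP of a mostly static PPT window in a 120 fps stream
gop = [("I", 40_000)] + [("P", 20)] * 118 + [("P", 5_000)]
print(drawing_param_from_gop(gop, stream_fps=120))   # -> 2
```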
Step S134: sending the drawing parameters and the first image data to the destination device, so as to trigger the destination device to refresh the images in the first image data according to the drawing parameters.
In the embodiment of the application, the synthesis module outputs the first image data to the virtual screen, the coding module codes the first image data cached on the virtual screen into a corresponding video format file, obtains corresponding screen projection data according to the video format file, and sends the screen projection data to the destination device.
In one possible implementation manner, the detected drawing parameters are packaged in the screen projection video file to obtain screen projection data, and the screen projection data includes first image data and corresponding drawing parameters.
In one possible implementation manner, the source device may process the first image data according to the drawing parameters of the screen projection object and encode the processed first image data into the screen projection data, so that the video stream frame rate of the screen projection object in the screen projection data equals the object's drawing parameter. For example, if the drawing parameter of the screen projection object is 5 fps, the first image data is processed at 5 fps to obtain the first image data cached to the virtual screen, and encoding is then performed on the processed data of the virtual screen. Alternatively, frame skipping may be performed directly on the first image data of the virtual screen according to the 5 fps drawing parameter, that is, frames are skipped during encoding and the result is encoded into a video format file with a corresponding frame rate of 5 fps to obtain the screen projection data; this is not specifically limited in the present application.
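A minimal sketch of the frame skipping just described, assuming frames arrive as an ordered sequence; the function name is hypothetical and evenly spaced selection is one possible policy.

```python
def skip_frames(frames, source_fps, target_fps):
    """Keep an evenly spaced subset of `frames` so that the output
    approximates `target_fps`, e.g. 5 of every 120 frames when the
    drawing parameter of the projected object is 5 fps."""
    if target_fps >= source_fps:
        return list(frames)
    step = source_fps / target_fps
    kept, next_keep = [], 0.0
    for i, frame in enumerate(frames):
        if i >= next_keep:
            kept.append(frame)
            next_keep += step
    return kept

one_second = list(range(120))                # stand-in for 120 captured frames
print(len(skip_frames(one_second, source_fps=120, target_fps=5)))   # -> 5
```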
In the embodiment of the application, each screen projection object has its corresponding drawing parameter, and the destination device is triggered to refresh and display the screen projection data of each screen projection object according to that object's drawing parameter. When the screen projection data includes the image data of a plurality of objects on the source device's display interface, the position window information of each object is sent to the destination device along with each object's video stream and drawing parameters. The destination device obtains the display area corresponding to each screen projection object from its position window information, obtains the first image data of each screen projection object, and refreshes and displays the corresponding first image data according to each object's drawing parameters. When a plurality of screen projection objects are included, the drawing parameters corresponding to each are obtained, and the destination device divides the display interface into a plurality of windows according to the position window information of each screen projection object, each window corresponding to one object's display area. For example, with a first screen projection object and a second screen projection object, the destination device's display interface is divided into two windows: the display area of each object is determined from its window, the first image data corresponding to the first screen projection object is refreshed and displayed in its display area according to the first object's drawing parameters, and the first image data corresponding to the second screen projection object is refreshed and displayed in its display area according to the second object's drawing parameters.
In one possible implementation manner, the screen projection data further includes position window information of a screen projection object on a display interface of the source device, a screen projection video stream of the screen projection object, and drawing parameters of the screen projection object.
In this embodiment of the application, the position window information of the screen projection object and/or the drawing parameters of the screen projection object may be encapsulated in the screen projection data. The position window information of the screen projection object and/or the drawing parameters of the screen projection object can also be transmitted through the communication connection between the source device and the destination device.
In the embodiment of the application, the drawing parameters of each object are obtained, and the target device is triggered to refresh and display the first image data of each object according to the drawing parameters of each object. The destination device does not have to refresh meaningless image frames at a high frame rate and can achieve a similar display effect.
Referring to fig. 15, fig. 15 is a flowchart illustrating a screen projection method applied to a first electronic device according to an embodiment of the present disclosure. The display interface of the first electronic device comprises an object, and an application is used as the object for explanation. The order of the steps in the flowchart may be changed and some steps may be omitted according to different needs.
Step S151: acquiring first image data of the application on the first electronic equipment.
In an implementation of the application, the obtaining of the first image data of the application on the first electronic device may include obtaining the first image data of a screen-projection application, and may also include obtaining the first image data of all applications on the first electronic device.
Step S152: acquiring a drawing parameter of the application, wherein the drawing parameter is used for representing the number of image frames with changed content of the application in one period.
In this embodiment of the application, the obtaining of the drawing parameters of the application on the first electronic device includes obtaining drawing parameters of a screen projection application, and may also include obtaining drawing parameters of all applications on the first electronic device.
Step S153: and sending the drawing parameters and the first image data to second electronic equipment so as to trigger the second electronic equipment to display the image in the first image data according to the drawing parameters.
In this embodiment of the application, the drawing parameter and the first image data are directly sent to the second electronic device, and the frame skipping module or the encoding module may also process the first image data according to the drawing parameter to obtain processed first image data, and send the processed first image data and the drawing parameter to the second electronic device.
In this embodiment of the application, the second electronic device may refresh an image in the first image data corresponding to the application according to the drawing parameter of the application. When the first image data and the drawing parameters comprise a plurality of objects, the window information of each object can be obtained from the drawing module, and the window information of each object is sent to the destination device, so that the destination device is triggered to display in a partition mode according to each object, and each object is refreshed and displayed according to the actual picture content change condition of each object.
Referring to fig. 16, fig. 16 is a schematic flowchart illustrating another screen projection method applied to a first electronic device according to an embodiment of the present disclosure. The screen projection method is applied to the first electronic equipment. The source device includes M applications, M being an integer greater than or equal to 1. The order of the steps in the flow chart may be changed and some steps may be omitted according to different needs.
Step S161: receiving selection operation of a user, and determining N applications according to the selection operation, wherein the selection operation is used for the first electronic device to determine the N applications from the M applications, and N is a positive integer smaller than or equal to M.
In the embodiment of the application, the screen projection in the partition mode can be performed, that is, a plurality of applications run on the source device, application interfaces of the plurality of applications are displayed on a display interface of the source device, and a user can determine application interfaces of the N applications from the application interfaces of the plurality of applications and project the determined application interfaces of the N applications to the destination device.
Step S162: and responding to the selection operation, and acquiring first image data on the first electronic equipment, wherein the first image data is used for displaying the image data of the N applications by the second electronic equipment.
In this embodiment of the application, the source device, in response to the selection operation, acquires first image data on the first electronic device, where the first image data may include image data of all applications on the first electronic device, and may also include image data of the N applications.
Step S163: obtaining respective drawing parameters of the N applications, wherein the drawing parameters are used for representing the number of the image frames with the content change corresponding to the N applications in one period.
In the implementation of the application, the drawing parameters of M applications may be directly obtained, and then the respective drawing parameters of the N applications are obtained from the drawing parameters of the M applications according to the N applications. Or directly acquiring the respective drawing parameters of the N applications.
In one possible implementation manner, the obtaining drawing parameters of each of the N applications includes: and obtaining corresponding drawing parameters according to the respective image data of the N applications. Namely, the respective drawing parameters of the N applications are determined according to the image frame change condition of the respective image data of the N applications.
In one possible implementation manner, the obtaining drawing parameters of each of the N applications includes: obtaining the corresponding drawing parameters according to the respective layer buffer queues of the N applications, that is, obtaining the change condition of the layer cache data in each of the N applications' layer cache queues and deriving each application's drawing parameters from that change condition.
Step S164: and obtaining screen projection data according to the first image data.
In one possible implementation manner, the obtaining of the projection data according to the first image data includes: acquiring screen projection parameters, wherein the screen projection parameters are used for representing capability parameters of second electronic equipment, processing the first image data according to the screen projection parameters to obtain second image data, and obtaining the screen projection data according to the second image data.
Specifically, the screen projection parameters are obtained according to the capability parameters of the second electronic device, and frame skipping is performed on the first image data according to the screen projection parameters to obtain second image data, which contains fewer image frames than the first image data. The second image data, that is, the second image frame set, is then encoded into a file in the corresponding video format, which constitutes the screen projection data.
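For illustration, a sketch of one way the screen projection parameter could be derived from the sink's capability parameters before the frame skipping above. Taking the minimum of the advertised limits is an assumed policy, and all names are hypothetical.

```python
def negotiate_projection_fps(source_fps, sink_display_fps, sink_decode_fps=None):
    """The screen projection parameter should not exceed what the second
    electronic device can display or decode; capping it on the source side
    avoids sending frames the sink would have to drop."""
    limits = [source_fps, sink_display_fps]
    if sink_decode_fps is not None:
        limits.append(sink_decode_fps)
    return min(limits)

# e.g. a 120 fps source projecting to a 60 Hz sink that decodes at most 48 fps
print(negotiate_projection_fps(120, 60, sink_decode_fps=48))   # -> 48
```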
Step S165: and sending the drawing parameters and the screen projection data to second electronic equipment so as to trigger the second electronic equipment to correspondingly display the images of N applications in the screen projection data according to the drawing parameters of the N applications.
In the embodiment of the application, the first electronic device processes the first image data according to the screen projection parameters to obtain the second image data with fewer image frames, so that the destination device can support processing the data and frame loss after the second electronic device receives the screen projection data is avoided, thereby reducing unnecessary power consumption. The drawing parameters of each application are sent to the second electronic device, so that the second electronic device refreshes and displays each application's image according to its drawing parameters. Screen projection thus combines the screen projection parameters with each application's drawing parameters: the first image data is processed according to the screen projection parameters, adjusting the second image data received by the destination device, and the destination device is triggered to refresh according to the detected actual image change of each application, reducing unnecessary refresh operations and hence power consumption.
Referring to fig. 17, fig. 17 is a schematic flowchart illustrating another screen projection method applied to a first electronic device according to an embodiment of the present disclosure. The screen projection method is applied to the first electronic equipment. The order of the steps in the flow chart may be changed and some steps may be omitted according to different needs.
Step S171: receiving selection operation of a user, and determining N applications according to the selection operation, wherein the selection operation is used for the first electronic device to determine the N applications from the M applications, and N is a positive integer smaller than or equal to M.
Step S172: and responding to the selection operation, and acquiring first image data on the first electronic equipment, wherein the first image data is used for displaying the image data of the N applications by the second electronic equipment.
Step S173: obtaining the respective drawing parameters of the N applications, wherein the drawing parameters are used for representing the number of image frames with the changed content of the applications in one period.
Step S174: and acquiring screen projection parameters.
Step S175: and processing the first image data according to the screen projection parameters to obtain second image data.
In one possible implementation manner, frame skipping processing is performed on the first image data according to the screen projection parameters to obtain second image data.
Step S176: and the second image data is sent to the second electronic equipment after being coded.
Step S177: and sending the drawing parameters to the second electronic equipment.
In the embodiment of the application, the first electronic device processes the first image data according to the screen projection parameters to obtain the second image data with fewer image frames, so that the destination device can support processing the data and frame loss after the second electronic device receives the screen projection data is avoided, thereby reducing unnecessary power consumption. The drawing parameters of each application are sent to the second electronic device, so that the second electronic device refreshes and displays each application's image according to its drawing parameters.
Referring to fig. 18, fig. 18 is a schematic view of an application scenario of another screen projection method according to an embodiment of the present application.
A user starts a PPT application and a video conference application on the source device, and both run on the source device; the interface of the PPT application corresponds to application interface A31 and the interface of the video conference application to application interface A32. The source device has a display refresh rate of 120 Hz and refreshes and displays application interface A31 and application interface A32 at a frame rate of 120 fps. From the user's perspective on the source device's display interface, application interface A31 is static, the interface content of the PPT application not changing, while application interface A32 changes continuously.
The user selects to project the application interface content of the PPT application to the destination device. The drawing parameter of application interface A31 may be calculated in any of three ways: the frame rate detection module of the source device detects the layer content change frequency of the application while the image is synthesized; or the frame rate detection module of the source device detects the layer content change frequency of each application interface while the encoding module encodes the video stream, in combination with the window information corresponding to application interface A31; or the source device sends the video stream to the destination device, which obtains the layer content change frequency of each application interface while decoding the video stream and calculates the drawing parameter in combination with the window information of application interface A31. In this example, the drawing parameter of application interface A31 is 1 fps.
After receiving the video stream and obtaining the 1 fps drawing parameter of application interface A31, the destination device processes the video stream and refreshes and displays application interface B31 on the destination device using the 1 fps drawing parameter of application interface A31, the interface content of application interface A31 being consistent with that of application interface B31. For example, the window corresponding to application interface B31 may be controlled so that, after the image of the first image frame is refreshed, the images of subsequent image frames are not processed.
Therefore, while the source device and the destination device perform screen projection, the source device still refreshes and displays application interface A31 and application interface A32 at the original 120 fps frame rate, and the frame rate of the video stream of the screen projection data transmitted by the source device to the destination device may be higher than 1 fps, but the destination device refreshes and displays the interface content of application interface B31 at the 1 fps drawing parameter of application interface A31, and the interface content of application interface A31 remains consistent with that of application interface B31.
In one possible implementation manner, the source device acquires the drawing parameters corresponding to each application interface in real time. Taking the PPT application as an example, if the picture of the PPT application interface is static from second 1 to second 60, the drawing parameter is determined to be 1 fps; if the PPT performs a page turning operation from second 61 to second 63, the drawing parameter for seconds 61 to 63 is recorded as 60 fps.
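A minimal sketch of such real-time acquisition, assuming the source records a timestamp for every content change and reports a per-second drawing parameter; the one-frame floor for static seconds is an illustrative choice, not a requirement of this disclosure.

```python
def per_second_drawing_params(change_times, duration_s, floor_fps=1):
    """Bucket content-change timestamps (in seconds) into one-second bins.
    A second with no change still reports `floor_fps`, so the destination
    repaints at least once and stays consistent with the source."""
    bins = [0] * duration_s
    for t in change_times:
        if 0 <= t < duration_s:
            bins[int(t)] += 1
    return [max(count, floor_fps) for count in bins]

# static slides for 60 s, then a page turn animated at 60 fps during second 61
changes = [60.0 + i / 60 for i in range(60)]
params = per_second_drawing_params(changes, duration_s=63)
print(params[0], params[60], params[62])   # -> 1 60 1
```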
In the embodiment of the application, when the source device projects a plurality of screen projection objects to the destination device, the source device detects the drawing parameters of each application interface and sends the corresponding drawing parameters on its display interface to the destination device, so that the destination device refreshes and displays each screen projection object at its corresponding drawing parameter. The destination device thus avoids refreshing interface content that does not need refreshing and reduces unnecessary power consumption while preserving picture quality.
It can be understood that the screen projection methods may each be applied alone to a one-to-many or many-to-one screen projection scene, and may also be applied in combination to one-to-one, one-to-many or many-to-one screen projection scenes, where one or more screen projection objects may be included; this is not specifically limited in the present application.
Referring to fig. 19, fig. 19 is a schematic view of an application scenario of another screen projection method according to an embodiment of the present application.
Source device A runs application M and application N; the refresh rate of source device A's display is 60 Hz, and it refreshes and displays application interface M1 and application interface N1 at a frame rate of 60 fps. The user projects the content of application M to destination device B and the content of application N to destination device C; the content of application M is displayed on destination device B's display interface as application interface M2, and the content of application N on destination device C's display interface as application interface N2.
And the source device A receives the capability parameter of the target device B and determines that the screen projection parameter projected to the target device B is 48 fps. And the source equipment A receives the capability parameter of the target equipment C and determines that the screen projection parameter projected to the target equipment C is 48 fps.
Source device A processes the first image data of application M according to the 48 fps screen projection parameter to obtain the content of virtual screen I corresponding to application M, and processes the first image data of application N according to the 48 fps screen projection parameter to obtain the content of virtual screen J corresponding to application N. Source device A encodes the interface content on virtual screen I into video stream O with a frame rate of 48 fps and sends it to destination device B, and encodes the interface content on virtual screen J into video stream H with a frame rate of 48 fps and sends it to destination device C.
The drawing parameter of application M obtained by destination device B is 48 fps, and the drawing parameter of application N obtained by destination device C is 5 fps. Destination device B decodes video stream O to obtain image frames and refreshes and displays them according to the 48 fps drawing parameter. Destination device C decodes video stream H to obtain image frames and refreshes and displays them according to the 5 fps drawing parameter.
In this embodiment of the application, the drawing parameters of each application obtained by destination devices B and C may be acquired in any of three ways. The frame rate detection module may acquire the image data of each image frame of each application interface while the source device synthesizes images, detect the layer content change frequency of each application interface from the acquired image data, and determine each interface's drawing parameters from that frequency. Alternatively, the frame rate detection module may acquire the image data of each image frame of each application interface while the source device performs video encoding, and determine the drawing parameters in the same way. Or, when the destination device decodes, the frame rate detection module of the destination device may acquire the image data of each image frame of each application interface and determine the drawing parameters in the same way.
Referring to fig. 20, fig. 20 is a flowchart illustrating a screen projection method applied to a second electronic device according to an embodiment of the present disclosure. The screen projection method is applied to the second electronic equipment. The order of the steps in the flow chart may be changed and some steps may be omitted according to different needs.
Step S201: receiving screen projection data sent by first electronic equipment, wherein the screen projection data comprise data obtained according to display parameters and first image data on the first electronic equipment, and the display parameters are used for representing parameters according to which second electronic equipment performs screen projection display.
In the implementation of the application, the display parameters can comprise drawing parameters and/or screen projection parameters. The first electronic equipment sends the first image data and the display parameters to the second electronic equipment.
Step S202: and processing the screen projection data according to the display parameters, and displaying images in the screen projection data.
In this embodiment of the application, the screen projection data is the data sent by the first electronic device to the second electronic device, and the second electronic device displays, according to the screen projection data, the images the first electronic device projects to it. The second electronic device may parse the screen projection data according to the display parameters, and may display the images in the screen projection data according to the display parameters.
Referring to fig. 21, fig. 21 is a schematic flowchart of another screen projection method applied to a second electronic device according to an embodiment of the present disclosure. The screen projection method is applied to the second electronic equipment. The order of the steps in the flow chart may be changed and some steps may be omitted according to different needs.
Step S211: receiving screen projection data sent by first electronic equipment, wherein the screen projection data comprise data obtained according to first image data on the first electronic equipment.
In the embodiment of the application, the first electronic device sends the screen projection data to the second electronic device, so that the second electronic device can refresh and display according to the screen projection data.
Step S212: and acquiring display parameters, wherein the display parameters are used for representing parameters according to which the second electronic equipment is projected and displayed.
In the embodiment of the present application, the display parameter may be obtained from a first electronic device, and the display parameter may also be obtained from a second electronic device.
Step S213: and processing the screen projection data according to the display parameters, and displaying images in the screen projection data.
In one possible implementation manner, the first electronic device comprises an application, the first image data comprises the image data of the application on the first electronic device, and the display parameter comprises a drawing parameter representing the image frame change of the application. Processing the screen projection data according to the display parameters and displaying the images in the screen projection data then comprises: displaying the images in the screen projection data according to the drawing parameter.
In one possible implementation manner, the second electronic device receives the first image data sent after the first electronic device determines N applications from M applications, where N is a positive integer less than or equal to M, and obtains the respective drawing parameters of the N applications, which represent the image frame change conditions corresponding to the N applications. The obtaining of the respective drawing parameters of the N applications comprises: obtaining them from the decoding of the first image data. The image data of the N applications is decoded, and the images of the N applications in the first image data are finally displayed correspondingly according to the N applications' drawing parameters.
In the embodiment of the application, during video decoding the decoding module obtains the first image data of each application to obtain a first image frame set per application, detects the cache data change frequency in each application's layer cache queue according to that set, and determines the drawing parameters of each application accordingly. That is, the change condition of the image frames in the first image frame set is obtained during video decoding, and the drawing parameters are then determined from that change condition.
Specifically, taking an application as the screen projection object, the image data of each image frame in each application's display area can be obtained during video decoding. Between adjacent frames it can be determined which region's content changed and which did not, so the changed image frames can be identified and the number of image frames whose content changed within one second can be counted.
When a plurality of screen projection objects are included, each application can be determined in combination with its window position information when the screen projection data is decoded, and the corresponding layer cache data change frequency can then be obtained per application. When the source device's display interface comprises a plurality of applications, the content of one application can be captured; when only that application's interface content is decoded, its layer content change frequency can be obtained directly, the number of image frames whose content changes within one second is calculated, and the drawing parameters of the screen projection object are thereby obtained.
In the embodiment of the application, the second electronic device may refresh each application according to its drawing parameters and precisely control each application's refresh behavior.
The embodiment of the application discloses an electronic device, which comprises a processor, and a memory, an input device, an output device and a communication module connected with the processor. The input device and the output device may be integrated into one device; for example, a touch sensor may serve as the input device and a display screen as the output device, with the touch sensor and the display screen integrated into a touch screen.
At this time, as shown in fig. 22, the electronic device may include: a touch screen 2201 comprising a touch sensor 2206 and a display 2207; one or more processors 2202; a memory 2203; a communication module 2208; one or more application programs (not shown); and one or more computer programs 2204, where these components may be connected via one or more communication buses 2205. The one or more computer programs 2204 are stored in the memory 2203 and configured to be executed by the one or more processors 2202, and they comprise instructions that may be used to perform the steps of the embodiments described above. For all relevant content of the steps of the above method embodiments, reference may be made to the functional description of the corresponding entity device, which is not repeated here.
For example, the processor 2202 may be specifically the processor 110 shown in fig. 2, the memory 2203 may be specifically the internal memory 121 and/or the external memory interface 120 shown in fig. 2, the display 2207 may be specifically the display 194 shown in fig. 2, the touch sensor 2206 may be specifically the touch sensor in the sensor module 180 shown in fig. 2, and the communication module 2208 may be specifically the mobile communication module 150 and/or the wireless communication module 160 shown in fig. 2, which is not limited in this embodiment of the present invention.
Through the above description of the embodiments, it is clear to those skilled in the art that, for convenience and simplicity of description, the foregoing division of the functional modules is merely used as an example, and in practical applications, the above function distribution may be completed by different functional modules according to needs, that is, the internal structure of the device may be divided into different functional modules to complete all or part of the above described functions. For the specific working processes of the system, the apparatus and the unit described above, reference may be made to the corresponding processes in the foregoing method embodiments, and details are not described here again. Each functional unit in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit. The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solutions of the embodiments of the present application may be essentially implemented or make a contribution to the prior art, or all or part of the technical solutions may be implemented in the form of a software product stored in a storage medium and including several instructions for causing a computer device (which may be a personal computer, a server, or a network device) or a processor to execute all or part of the steps of the methods described in the embodiments of the present application. And the aforementioned storage medium includes: flash memory, removable hard drive, read only memory, random access memory, magnetic or optical disk, and the like.
The above description is only a specific implementation of the embodiments of the present application, but the scope of the embodiments of the present application is not limited thereto, and any changes or substitutions within the technical scope disclosed in the embodiments of the present application should be covered by the scope of the embodiments of the present application. Therefore, the protection scope of the embodiments of the present application shall be subject to the protection scope of the claims.
Claims (19)
1. A screen projection method is applied to a first electronic device, and comprises the following steps:
acquiring first image data on the first electronic equipment;
acquiring display parameters, wherein the display parameters are used for representing parameters according to which the second electronic device performs screen projection display;
obtaining screen projection data according to the display parameters and the first image data;
and sending the screen projection data to second electronic equipment to trigger the second electronic equipment to process the screen projection data according to the display parameters and display images in the screen projection data.
2. The method of claim 1, wherein the first image data comprises image data of an application of the first electronic device, and the display parameter comprises a drawing parameter, wherein the drawing parameter is used to represent the number of image frames of the application whose content changes within one period;
sending the screen projection data to a second electronic device to trigger the second electronic device to process the screen projection data according to the display parameters, wherein displaying images in the screen projection data comprises:
and sending the screen projection data to second electronic equipment to trigger the second electronic equipment to display the image in the screen projection data according to the drawing parameters.
3. The method of claim 1 or 2, further comprising: determining N applications from M applications of the first electronic device in response to a selection operation of a user, wherein M is an integer greater than or equal to 1, and N is a positive integer less than or equal to M; and the display parameters comprise a rendering parameter for each of the N applications, wherein the rendering parameters represent the number of image frames with content changes for the corresponding applications within one period;
the sending the screen projection data to the second electronic device to trigger the second electronic device to process the screen projection data according to the display parameters and display images in the screen projection data comprises:
sending the screen projection data to the second electronic device to trigger the second electronic device to display the images of the N applications in the screen projection data according to the respective rendering parameters of the N applications.
4. The method of claim 2 or 3, wherein the acquiring display parameters comprises:
obtaining the corresponding rendering parameters according to the respective image data of the N applications.
5. The method of any one of claims 2-4, wherein the acquiring display parameters comprises:
obtaining the corresponding rendering parameters according to the respective layer buffer queues of the N applications.
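Claims 2-5 tie the rendering parameter to how many frames of an application actually change within one period, and claim 5 derives it from the application's layer buffer queue. Below is a minimal sketch of that derivation, assuming a hypothetical queue of `(frame_id, content_hash)` entries; the patent does not specify how content change is detected, so hashing is only one plausible mechanism.

```python
from collections import deque

def rendering_parameter(layer_buffer_queue):
    """Count the image frames whose content changed within one period.

    `layer_buffer_queue` is a hypothetical stand-in for one application's
    layer buffer queue: (frame_id, content_hash) pairs from one period.
    """
    changed = 0
    previous_hash = None
    for _, content_hash in layer_buffer_queue:
        if content_hash != previous_hash:
            changed += 1
        previous_hash = content_hash
    return changed

# A video app redraws every frame; a document app shows static content.
video_queue = deque((i, f"hash-{i}") for i in range(60))
doc_queue = deque((i, "hash-static") for i in range(60))
print(rendering_parameter(video_queue))  # 60: full-motion content
print(rendering_parameter(doc_queue))    # 1: essentially static content
```

A per-application parameter like this is what lets the sink refresh a mostly static layer less often than a video layer, which is the point of claims 3 and 14.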
6. The method of any one of claims 1-5, wherein the display parameters comprise a screen projection parameter, wherein the screen projection parameter represents a capability parameter of the second electronic device;
the obtaining screen projection data according to the display parameters and the first image data comprises:
processing the first image data according to the screen projection parameter to obtain second image data; and
obtaining the screen projection data according to the second image data.
7. The method of claim 6, wherein the obtaining the screen projection data according to the second image data comprises:
encoding the second image data to obtain the screen projection data.
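Claim 7 only states that the second image data is encoded to produce the screen projection data; in practice this would be a hardware video codec such as H.264/H.265. As a self-contained stand-in, the sketch below uses `zlib` general-purpose compression, which is explicitly not what a real screen projection stack would ship.

```python
import pickle
import zlib

def encode(second_image_data):
    # Stand-in encoder: serialize the frames and compress them.
    # A real implementation would feed frames to a video encoder
    # (e.g., H.264) rather than using general-purpose compression.
    return zlib.compress(pickle.dumps(second_image_data))

screen_projection_data = encode([b"frame-0" * 100, b"frame-1" * 100])
print(f"{len(screen_projection_data)} bytes of screen projection data")
```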
8. The method of claim 6 or 7, wherein the acquiring display parameters comprises:
acquiring a capability parameter of the second electronic device; and
obtaining the screen projection parameter according to the capability parameter.
9. The method of any one of claims 6-8, wherein the processing the first image data according to the screen projection parameter to obtain second image data comprises:
performing frame skipping processing on the first image data according to the screen projection parameter to obtain the second image data.
10. The method of any one of claims 6-9, wherein the first image data comprises more image frames than the second image data.
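Claims 9 and 10 describe frame skipping: the first image data is thinned according to the sink's capability so that the second image data carries fewer frames. The accumulator-based decimator below is one plausible realization, assuming the screen projection parameter reduces to a maximum sink frame rate; the claims do not fix the exact skipping algorithm.

```python
def frame_skip(first_image_data, source_fps, sink_max_fps):
    """Drop frames so the output does not exceed the sink's capability.

    An accumulator-based decimator, shown as one plausible realization
    only, since the claims do not specify the skipping algorithm.
    """
    if sink_max_fps >= source_fps:
        return list(first_image_data)
    second_image_data, credit = [], 0.0
    for frame in first_image_data:
        credit += sink_max_fps / source_fps
        if credit >= 1.0:  # emit a frame once enough credit accrues
            second_image_data.append(frame)
            credit -= 1.0
    return second_image_data

frames = list(range(120))  # two seconds of a 60 fps capture
second = frame_skip(frames, source_fps=60, sink_max_fps=24)
print(len(second))  # 48: fewer frames than the input, per claim 10
```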
11. A screen projection method, applied to a second electronic device, the method comprising:
receiving screen projection data sent by a first electronic device, wherein the screen projection data comprises data obtained according to display parameters and first image data on the first electronic device, and the display parameters represent parameters according to which the second electronic device performs screen projection display; and
processing the screen projection data according to the display parameters, and displaying images in the screen projection data.
12. A screen projection method, applied to a second electronic device, the method comprising:
receiving screen projection data sent by a first electronic device, wherein the screen projection data comprises data obtained according to first image data on the first electronic device;
acquiring display parameters, wherein the display parameters represent parameters according to which the second electronic device performs screen projection display; and
processing the screen projection data according to the display parameters, and displaying images in the screen projection data.
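On the sink side (claims 11 and 12), the second device receives the screen projection data, obtains the display parameters (carried with the data in the claim-11 variant, acquired separately in claim 12), and displays the images accordingly. The sketch below assumes a simple JSON envelope as the wire format; that envelope is purely illustrative, since the patent defines no wire format.

```python
import json

def handle_projection_packet(packet: bytes):
    """Sink-side handling, assuming a hypothetical JSON envelope.

    In this claim-11 style variant, the display parameters travel
    alongside the encoded frames inside the same packet.
    """
    envelope = json.loads(packet)
    display_params = envelope["display_parameters"]
    for frame in envelope["frames"]:
        display(frame, display_params)

def display(frame, display_params):
    # Stand-in for decoding the frame and presenting it at the pace
    # the display parameters imply.
    print(f"show {frame} at <= {display_params['max_frame_rate']} fps")

packet = json.dumps({
    "display_parameters": {"max_frame_rate": 30},
    "frames": ["frame-0", "frame-1"],
}).encode()
handle_projection_packet(packet)
```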
13. The method of claim 12, wherein the first image data comprises image data of an application on the first electronic device, and the display parameters comprise a rendering parameter, wherein the rendering parameter represents the number of image frames in which the content of the application changes within one period;
the processing the screen projection data according to the display parameters and displaying images in the screen projection data comprises:
displaying the images in the screen projection data according to the rendering parameter.
14. The method of claim 12 or 13, wherein the first image data comprises image data sent by the first electronic device after the first electronic device determines N applications from M applications of the first electronic device, wherein M is an integer greater than or equal to 1, and N is a positive integer less than or equal to M; and the display parameters comprise a rendering parameter for each of the N applications, wherein the rendering parameters represent the number of image frames with content changes for the corresponding applications within one period;
the processing the screen projection data according to the display parameters and displaying images in the screen projection data comprises:
displaying the images of the N applications in the first image data according to the respective rendering parameters of the N applications.
15. The method of claim 13 or 14, wherein the acquiring display parameters comprises:
obtaining the rendering parameter by decoding the first image data.
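Claims 13-15 have the sink recover the per-application rendering parameters from decoding the first image data and pace each application's layer accordingly: a layer whose content changes only once per period need not be refreshed at the full display rate. Below is a minimal pacing sketch in which both the decoded metadata layout and the timing model are assumptions, not details from the patent.

```python
import time

def present(app_frames, rendering_param, period_s=1.0):
    """Refresh one application's layer per its rendering parameter.

    Only `rendering_param` frames change per period, so the sink updates
    the layer at that pace instead of redrawing at the full display rate.
    """
    interval = period_s / max(1, rendering_param)
    for frame in app_frames[:rendering_param]:
        print(f"refresh layer with {frame}")
        time.sleep(interval)

# Hypothetical per-application data recovered by decoding the stream:
# app name -> (frames for one period, rendering parameter).
decoded = {"video_app": (["v0", "v1", "v2"], 3), "doc_app": (["d0"], 1)}
for app, (frames, param) in decoded.items():
    print(f"-- {app} --")
    present(frames, param)
```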
16. A computing device, comprising a processor and a memory, wherein the memory stores a set of computer instructions, and when the processor executes the set of computer instructions, the computing device performs the method of any one of claims 1 to 10.
17. A computing device, comprising a processor and a memory, wherein the memory stores a set of computer instructions, and when the processor executes the set of computer instructions, the computing device performs the method of any one of claims 11 to 14.
18. A computer-readable storage medium, wherein the computer-readable storage medium stores computer program code which, when executed by a computing device, causes the computing device to perform the method of any one of claims 1 to 10.
19. A computer-readable storage medium, wherein the computer-readable storage medium stores computer program code which, when executed by a computing device, causes the computing device to perform the method of any one of claims 11 to 14.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202011625074.6A CN114697731B (en) | 2020-12-31 | 2020-12-31 | Screen projection method, electronic equipment and storage medium |
Publications (2)
Publication Number | Publication Date |
---|---|
CN114697731A (en) | 2022-07-01 |
CN114697731B CN114697731B (en) | 2023-06-16 |
Family
ID=82134187
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202011625074.6A Active CN114697731B (en) | 2020-12-31 | 2020-12-31 | Screen projection method, electronic equipment and storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN114697731B (en) |
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111190558A (en) * | 2018-11-15 | 2020-05-22 | Tencent Technology (Shenzhen) Co., Ltd. | Screen projection control method and device, computer readable storage medium and computer equipment |
CN110083324A (en) * | 2019-04-30 | 2019-08-02 | Huawei Technologies Co., Ltd. | Method, apparatus, electronic equipment and the computer storage medium of Image Rendering |
CN110221798A (en) * | 2019-05-29 | 2019-09-10 | Huawei Technologies Co., Ltd. | Screen projection method, system and related apparatus |
CN112019897A (en) * | 2020-08-27 | 2020-12-01 | Beijing ByteDance Network Technology Co., Ltd. | Screen projection method and device, electronic equipment and computer readable medium |
CN112035081A (en) * | 2020-09-01 | 2020-12-04 | Ping An Pay Technology Service Co., Ltd. | Screen projection method and device, computer equipment and storage medium |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN115361569A (en) * | 2022-08-10 | 2022-11-18 | Shenzhen Lebo Technology Co., Ltd. | Dynamic frame screen projection method in cloud conference and related product |
CN115361569B (en) * | 2022-08-10 | 2023-10-20 | Shenzhen Lebo Technology Co., Ltd. | Dynamic frame screen projection method in cloud conference and related products |
Also Published As
Publication number | Publication date |
---|---|
CN114697731B (en) | 2023-06-16 |
Similar Documents
Publication | Title |
---|---|
CN110381345B (en) | Screen projection display method and electronic equipment |
CN113422903B (en) | Shooting mode switching method, equipment and storage medium |
US20230162324A1 (en) | Projection data processing method and apparatus |
CN111580765A (en) | Screen projection method, screen projection device, storage medium and screen projection equipment |
CN113556598A (en) | Multi-window screen projection method and electronic equipment |
JP7085014B2 (en) | Video coding methods and their devices, storage media, equipment, and computer programs |
CN116055786B (en) | Method for displaying multiple windows and electronic equipment |
CN112954251B (en) | Video processing method, video processing device, storage medium and electronic equipment |
WO2022007862A1 (en) | Image processing method, system, electronic device and computer readable storage medium |
CN111741303B (en) | Deep video processing method and device, storage medium and electronic equipment |
CN114470750B (en) | Display method of image frame stream, electronic device and storage medium |
CN113726815B (en) | Method for dynamically adjusting video, electronic equipment, chip system and storage medium |
CN113797530A (en) | Image prediction method, electronic device and storage medium |
CN113099233A (en) | Video encoding method, video encoding device, video encoding apparatus, and storage medium |
CN110996117A (en) | Video transcoding method and device, electronic equipment and storage medium |
CN114697731B (en) | Screen projection method, electronic equipment and storage medium |
CN116170629A (en) | Method for transmitting code stream, electronic equipment and computer readable storage medium |
CN113473216A (en) | Data transmission method, chip system and related device |
WO2022042281A1 (en) | Encoding and decoding method, device, and system |
CN117440194A (en) | Method and related device for processing screen projection picture |
CN111626931B (en) | Image processing method, image processing device, storage medium and electronic apparatus |
WO2024082713A1 (en) | Image rendering method and apparatus |
CN113934388B (en) | Synchronous display method, terminal and storage medium |
WO2024027718A1 (en) | Multi-window screen mirroring method and system, and electronic device |
WO2023109442A1 (en) | Dynamic range mapping method and apparatus for panoramic video |
Legal Events
Code | Title |
---|---|
PB01 | Publication |
SE01 | Entry into force of request for substantive examination |
GR01 | Patent grant |