WO2022052773A1 - Multi-window screen projection method and electronic device - Google Patents

Multi-window screen projection method and electronic device (多窗口投屏方法及电子设备)

Info

Publication number
WO2022052773A1
WO2022052773A1 · PCT/CN2021/113506 · CN2021113506W
Authority
WO
WIPO (PCT)
Prior art keywords
application
interface
window
video
interfaces
Prior art date
Application number
PCT/CN2021/113506
Other languages
English (en)
French (fr)
Inventor
陈晨
郭睿帅
胡迁乔
Original Assignee
华为技术有限公司 (Huawei Technologies Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co., Ltd. (华为技术有限公司)
Priority to US18/044,707 (published as US20240020074A1)
Priority to EP21865829.2A (published as EP4199523A4)
Publication of WO2022052773A1

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/4302Content synchronisation processes, e.g. decoder synchronisation
    • H04N21/4307Synchronising the rendering of multiple content streams or additional data on devices, e.g. synchronisation of audio on a mobile phone with the video output on the TV screen
    • H04N21/43076Synchronising the rendering of multiple content streams or additional data on devices, e.g. synchronisation of audio on a mobile phone with the video output on the TV screen of the same content streams on multiple devices, e.g. when family members are watching the same movie on different devices
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14Digital output to display device ; Cooperation and interconnection of the display device with other functional units
    • G06F3/1423Digital output to display device ; Cooperation and interconnection of the display device with other functional units controlling a plurality of local displays, e.g. CRT and flat panel display
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14Digital output to display device ; Cooperation and interconnection of the display device with other functional units
    • G06F3/1454Digital output to display device ; Cooperation and interconnection of the display device with other functional units involving copying of the display data of a local workstation or window to a remote workstation or window so that an actual copy of the data is displayed simultaneously on two or more displays, e.g. teledisplay
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T1/00General purpose image data processing
    • G06T1/20Processor architectures; Processor configuration, e.g. pipelining
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T1/00General purpose image data processing
    • G06T1/60Memory management
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T9/00Image coding
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G5/12Synchronisation between the display unit and other units, e.g. other display units, video-disc players
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G5/14Display of multiple viewports
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G5/22Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the display of characters or indicia using display control signals derived from coded signals representing the characters or indicia, e.g. with a character-code memory
    • G09G5/222Control of the character-code memory
    • G09G5/227Resolution modifying circuits, e.g. variable screen formats, resolution change between memory contents and display screen
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41Structure of client; Structure of client peripherals
    • H04N21/4104Peripherals receiving signals from specially adapted client devices
    • H04N21/4113PC
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41Structure of client; Structure of client peripherals
    • H04N21/4104Peripherals receiving signals from specially adapted client devices
    • H04N21/4126The peripheral being portable, e.g. PDAs or mobile phones
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/436Interfacing a local distribution network, e.g. communicating with another STB or one or more peripheral devices inside the home
    • H04N21/4363Adapting the video stream to a specific local network, e.g. a Bluetooth® network
    • H04N21/43637Adapting the video stream to a specific local network, e.g. a Bluetooth® network involving a wireless protocol, e.g. Bluetooth, RF or wireless LAN [IEEE 802.11]
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/44Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs
    • H04N21/4402Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs involving reformatting operations of video signals for household redistribution, storage or real-time display
    • H04N21/440263Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs involving reformatting operations of video signals for household redistribution, storage or real-time display by altering the spatial resolution, e.g. for displaying on a connected PDA
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/442Monitoring of processes or resources, e.g. detecting the failure of a recording device, monitoring the downstream bandwidth, the number of times a movie has been viewed, the storage space available from the internal hard disk
    • H04N21/4424Monitoring of the internal components or processes of the client device, e.g. CPU or memory load, processing speed, timer, counter or percentage of the hard disk space used
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47End-user applications
    • H04N21/478Supplemental services, e.g. displaying phone caller identification, shopping application
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/80Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N21/81Monomedia components thereof
    • H04N21/8166Monomedia components thereof involving executable data, e.g. software
    • H04N21/8173End-user applications, e.g. Web browser, game
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2340/00Aspects of display data processing
    • G09G2340/04Changes in size, position or resolution of an image
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2340/00Aspects of display data processing
    • G09G2340/04Changes in size, position or resolution of an image
    • G09G2340/0407Resolution change, inclusive of the use of different resolutions for different screen areas
    • G09G2340/0435Change or adaptation of the frame rate of the video stream

Definitions

  • the embodiments of the present application relate to the field of electronic technologies, and in particular, to a multi-window screen projection method and an electronic device.
  • the multi-window screen projection technology projects multiple application interfaces launched on one electronic device (such as a first device) to another electronic device (such as a second device), so as to realize mirror manipulation and input coordination between the first device and the second device.
  • multiple application interfaces are usually projected to the second device at a fixed frame rate (frames per second, FPS) and resolution.
  • during multi-window projection, the occupancy rate of the graphics processing unit (GPU) is usually very high (for example, often exceeding 80%), and communication resources (such as wireless fidelity (WiFi) resources) are also under high throughput pressure.
  • Embodiments of the present application provide a multi-window screen projection method and electronic device, which can reduce the image processing load of the electronic device during multi-window projection, so as to ensure the smoothness and clarity of the projected image.
  • In a first aspect, a multi-window screen projection method is provided.
  • the method is applied to a scenario where a first device projects a screen to a second device.
  • the method includes: when the second device displays the first interface synchronously with the first device, acquiring first information, wherein the first interface includes multiple application interfaces; and the second device adaptively adjusting, according to the acquired first information, one or more of the following: the frame rates corresponding to the multiple application interfaces, the sizes of the application display areas corresponding to the multiple application interfaces, the display resolution of the second device, or the video resolutions corresponding to the multiple application interfaces.
  • based on this solution, the second device acquires the first information while accepting the screen projection from the first device, and adaptively adjusts the frame rates corresponding to the projection interfaces, the sizes of the corresponding application display areas, the display resolution, and/or the video resolutions accordingly. The load on the second device is thereby reduced, ensuring the smoothness and clarity of the projected image even when the processing capability of the second device is limited.
  • the above-mentioned first information includes the window states corresponding to the multiple application interfaces; the first information is specifically used by the second device to adaptively adjust the frame rates corresponding to the multiple application interfaces, wherein the window states include focused window, non-minimized and non-focused window, and minimized window.
  • based on this solution, the second device adaptively adjusts the frame rates corresponding to the projection interfaces by acquiring the window states corresponding to the multiple application interfaces, so as to allocate the image processing resources and processing capability of the device on demand and ensure the fluency and clarity of the projected image.
  • the second device adaptively adjusting the frame rates corresponding to the multiple application interfaces according to the acquired window states includes: the second device adjusts the frame rates corresponding to the multiple application interfaces according to the following first preset strategy: the frame rate corresponding to the focused window > the frame rate corresponding to a non-minimized and non-focused window > the frame rate corresponding to a minimized window.
  • based on this solution, the second device adaptively adjusts the frame rate of each projection interface according to its window state and the preset strategy, ensuring the fluency and clarity of the projected image; a code sketch of this strategy follows below.
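  • to make the first preset strategy concrete, the following is a minimal Python sketch. The ordering (focused > non-minimized and non-focused > minimized) comes from the text above; the concrete FPS values are illustrative assumptions, since the patent does not specify numbers.

```python
from enum import Enum, auto

class WindowState(Enum):
    FOCUSED = auto()
    NON_MINIMIZED_NON_FOCUSED = auto()
    MINIMIZED = auto()

# Illustrative values only: the patent fixes the ordering
# (focused > non-minimized, non-focused > minimized), not the numbers.
FIRST_PRESET_STRATEGY = {
    WindowState.FOCUSED: 60,
    WindowState.NON_MINIMIZED_NON_FOCUSED: 30,
    WindowState.MINIMIZED: 5,
}

def frame_rate_for_window(state: WindowState) -> int:
    """Return the target frame rate (FPS) for a projected application interface."""
    return FIRST_PRESET_STRATEGY[state]
```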
  • the above-mentioned first information includes the application categories corresponding to the multiple application interfaces, and is specifically used by the second device to adaptively adjust the frame rates corresponding to the multiple application interfaces; wherein the application categories include one or more of game, video, instant messaging, office, social, lifestyle, shopping, and utility.
  • based on this solution, the second device adaptively adjusts the frame rates corresponding to the projection interfaces by acquiring the application categories corresponding to the multiple application interfaces, so as to allocate the image processing resources and processing capability of the device on demand and ensure the fluency and clarity of the projected image.
  • the above application categories include game, video, and instant messaging; the second device adaptively adjusting the frame rates corresponding to the multiple application interfaces according to the acquired application categories includes: the second device adjusts the frame rates corresponding to the multiple application interfaces according to the following second preset strategy: the frame rate corresponding to a game application interface > the frame rate corresponding to a video application interface > the frame rate corresponding to an instant messaging application interface.
  • based on this solution, the second device adaptively adjusts the frame rate of each projection interface according to its application category and the preset strategy, ensuring the fluency and clarity of the projected image; a sketch combining the window-state and category strategies follows below.
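  • the second preset strategy can be sketched the same way. The ordering (game > video > instant messaging) comes from the text above; the FPS values, the default for other categories, and the choice to combine the two strategies by taking the smaller cap are all assumptions for illustration.

```python
# Illustrative values only: the patent fixes the ordering
# (game > video > instant messaging), not the numbers.
SECOND_PRESET_STRATEGY = {
    "game": 60,
    "video": 30,
    "instant_messaging": 15,
}

def frame_rate_for_app(category: str, window_state_fps: int) -> int:
    """Combine the category cap with the window-state cap.

    Taking the minimum of the two caps is an assumption for illustration;
    the patent describes the two strategies separately.
    """
    category_fps = SECOND_PRESET_STRATEGY.get(category, 30)  # default is assumed
    return min(category_fps, window_state_fps)
```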
  • acquiring the first information when the second device displays the first interface synchronously with the first device includes: when the second device displays the first interface synchronously with the first device, acquiring the first information if the second device determines that its processing load is higher than a preset threshold.
  • the solution provided by the present application can be implemented based on the processing load of the second device being higher than the preset threshold. With this solution, the fluency and clarity of the projected screen can be ensured when the processing capability of the second device is limited.
  • the above-mentioned second device determines that its processing load is higher than the preset threshold according to one or more of the following: the decoding delay of the GPU of the second device is greater than a delay threshold, the load rate of the GPU is greater than a load threshold, or the number of the multiple application interfaces is greater than a number threshold.
  • based on this solution, whether the processing load of the second device is higher than the preset threshold can be determined by judging whether the decoding delay of the GPU is greater than the delay threshold, whether the load rate of the GPU is greater than the load threshold, and whether the number of application interfaces is greater than the number threshold; a sketch of this check follows below.
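  • a minimal sketch of this check, with assumed threshold values (the patent names the three conditions but not the thresholds):

```python
# Threshold values are illustrative assumptions; the patent only names the
# three conditions (GPU decoding delay, GPU load rate, interface count).
DELAY_THRESHOLD_MS = 50
LOAD_THRESHOLD = 0.80          # cf. the >80% GPU occupancy noted in the background
NUMBER_THRESHOLD = 3

def processing_load_above_threshold(gpu_decode_delay_ms: float,
                                    gpu_load_rate: float,
                                    interface_count: int) -> bool:
    """True if any of the conditions indicates an overloaded second device."""
    return (gpu_decode_delay_ms > DELAY_THRESHOLD_MS
            or gpu_load_rate > LOAD_THRESHOLD
            or interface_count > NUMBER_THRESHOLD)
```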
  • the above-mentioned first information includes the number of the multiple application interfaces, and is specifically used by the second device to adaptively adjust one or more of the following: the sizes of the application display areas corresponding to the multiple application interfaces, the display resolution of the second device, or the video resolutions corresponding to the multiple application interfaces.
  • based on this solution, the second device adaptively adjusts the above display parameters by acquiring the number of the multiple application interfaces, so as to allocate the image processing resources and processing capability of the device on demand and ensure the fluency and clarity of the projected image.
  • in one case, the second device determines that the size of the application display area corresponding to each application interface is 2a1 × b1, the display resolution of the second device is a2 × 2b2, and the video resolution corresponding to the multiple application interfaces is a3 × b3; wherein 2a1 is the length of the application display area and b1 is its width; a2 is the number of pixels that can be displayed in the horizontal dimension of the display screen of the second device, and 2b2 is the number of pixels that can be displayed in its vertical dimension; a3 is the number of pixels displayed per unit image area in the horizontal dimension, and b3 is the number of pixels displayed per unit image area in the vertical dimension.
  • in another case, the second device determines that the size of the application display area corresponding to each application interface is 3a1 × b1, the display resolution of the second device is a2 × 3b2, and the video resolution corresponding to the multiple application interfaces is a3 × b3; wherein 3a1 is the length of the application display area and b1 is its width; a2 is the number of pixels that can be displayed in the horizontal dimension of the display screen of the second device, and 3b2 is the number of pixels that can be displayed in its vertical dimension; a3 is the number of pixels displayed per unit image area in the horizontal dimension, and b3 is the number of pixels displayed per unit image area in the vertical dimension; a small helper below generalizes this pattern.
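  • the two cases above follow one pattern: for n application interfaces, the application display area is n·a1 × b1, the display resolution is a2 × n·b2, and the video resolution stays a3 × b3. The helper below encodes that pattern; treating it as valid for arbitrary n (and all numeric values in the example) is an assumption for illustration.

```python
def projection_geometry(n: int,
                        a1: int, b1: int,
                        a2: int, b2: int,
                        a3: int, b3: int):
    """Return (application display area, display resolution, video resolution)
    for n projected application interfaces, following the pattern above."""
    display_area = (n * a1, b1)        # length n*a1, width b1
    display_resolution = (a2, n * b2)  # a2 horizontal pixels, n*b2 vertical pixels
    video_resolution = (a3, b3)        # pixels per unit image area, per dimension
    return display_area, display_resolution, video_resolution

# The two cases from the text (all numeric values are made up for the example):
print(projection_geometry(2, 640, 480, 1920, 540, 96, 96))
print(projection_geometry(3, 640, 480, 1920, 360, 96, 96))
```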
  • In a second aspect, an electronic device is provided, comprising: a processing unit configured to acquire first information when the electronic device and the first device synchronously display a first interface, wherein the first interface includes a plurality of application interfaces; and configured to adaptively adjust one or more of the following according to the acquired first information: the frame rates corresponding to the multiple application interfaces, the sizes of the application display areas corresponding to the multiple application interfaces, the display resolution of the second device, or the video resolutions corresponding to the multiple application interfaces.
  • based on this solution, the second device acquires the first information while accepting the screen projection from the first device, and adaptively adjusts the frame rates corresponding to the projection interfaces, the sizes of the corresponding application display areas, the display resolution, and/or the video resolutions accordingly. The load on the second device is thereby reduced, ensuring the smoothness and clarity of the projected image even when the processing capability of the second device is limited.
  • the above-mentioned first information includes the window states corresponding to the multiple application interfaces; the first information is specifically used by the second device to adaptively adjust the frame rates corresponding to the multiple application interfaces, wherein the window states include focused window, non-minimized and non-focused window, and minimized window.
  • based on this solution, the second device adaptively adjusts the frame rates corresponding to the projection interfaces by acquiring the window states corresponding to the multiple application interfaces, so as to allocate the image processing resources and processing capability of the device on demand and ensure the fluency and clarity of the projected image.
  • the processing unit adaptively adjusting the frame rates corresponding to the multiple application interfaces according to the acquired window states includes: the processing unit adjusts the frame rates corresponding to the multiple application interfaces according to the following first preset strategy: the frame rate corresponding to the focused window > the frame rate corresponding to a non-minimized and non-focused window > the frame rate corresponding to a minimized window.
  • based on this solution, the second device adaptively adjusts the frame rate of each projection interface according to its window state and the preset strategy, ensuring the fluency and clarity of the projected image.
  • the above-mentioned first information includes the application categories corresponding to the multiple application interfaces, and is specifically used by the second device to adaptively adjust the frame rates corresponding to the multiple application interfaces; wherein the application categories include one or more of game, video, instant messaging, office, social, lifestyle, shopping, and utility.
  • based on this solution, the second device adaptively adjusts the frame rates corresponding to the projection interfaces by acquiring the application categories corresponding to the multiple application interfaces, so as to allocate the image processing resources and processing capability of the device on demand and ensure the fluency and clarity of the projected image.
  • the above-mentioned application categories include game, video, and instant messaging; the processing unit adaptively adjusting the frame rates corresponding to the multiple application interfaces according to the acquired application categories includes: the processing unit adjusts the frame rates corresponding to the multiple application interfaces according to the following second preset strategy: the frame rate corresponding to a game application interface > the frame rate corresponding to a video application interface > the frame rate corresponding to an instant messaging application interface.
  • based on this solution, the second device adaptively adjusts the frame rate of each projection interface according to its application category and the preset strategy, ensuring the fluency and clarity of the projected image.
  • the above-mentioned processing unit acquiring the first information when the electronic device and the first device synchronously display the first interface includes: when the electronic device and the first device synchronously display the first interface, acquiring the first information if it is determined that the processing load of the second device is higher than a preset threshold.
  • the solution provided by the present application can be implemented based on the processing load of the second device being higher than the preset threshold. With this solution, the fluency and clarity of the projected screen can be ensured when the processing capability of the second device is limited.
  • the processing unit determines that the processing load of the second device is higher than a preset threshold according to one or more of the following: the decoding delay of the GPU of the second device is greater than a delay threshold, the load rate of the GPU is greater than a load threshold, or the number of the multiple application interfaces is greater than a number threshold.
  • based on this solution, whether the processing load of the second device is higher than the preset threshold can be determined by judging whether the decoding delay of the GPU is greater than the delay threshold, whether the load rate of the GPU is greater than the load threshold, and whether the number of application interfaces is greater than the number threshold.
  • the above-mentioned first information includes the number of the multiple application interfaces, and is specifically used by the second device to adaptively adjust one or more of the following: the sizes of the application display areas corresponding to the multiple application interfaces, the display resolution of the second device, or the video resolutions corresponding to the multiple application interfaces.
  • based on this solution, the second device adaptively adjusts the above display parameters by acquiring the number of the multiple application interfaces, so as to allocate the image processing resources and processing capability of the device on demand and ensure the fluency and clarity of the projected image.
  • in one case, the second device determines that the size of the application display area corresponding to each application interface is 2a1 × b1, the display resolution of the second device is a2 × 2b2, and the video resolution corresponding to the multiple application interfaces is a3 × b3; wherein 2a1 is the length of the application display area and b1 is its width; a2 is the number of pixels that can be displayed in the horizontal dimension of the display screen of the second device, and 2b2 is the number of pixels that can be displayed in its vertical dimension; a3 is the number of pixels displayed per unit image area in the horizontal dimension, and b3 is the number of pixels displayed per unit image area in the vertical dimension.
  • in another case, the second device determines that the size of the application display area corresponding to each application interface is 3a1 × b1, the display resolution of the second device is a2 × 3b2, and the video resolution corresponding to the multiple application interfaces is a3 × b3; wherein 3a1 is the length of the application display area and b1 is its width; a2 is the number of pixels that can be displayed in the horizontal dimension of the display screen of the second device, and 3b2 is the number of pixels that can be displayed in its vertical dimension; a3 is the number of pixels displayed per unit image area in the horizontal dimension, and b3 is the number of pixels displayed per unit image area in the vertical dimension.
  • In a third aspect, an electronic device is provided, comprising: a memory for storing a computer program; a transceiver for receiving or transmitting radio signals; and a processor for executing the computer program, so that the electronic device acquires first information when the electronic device and the first device synchronously display a first interface, wherein the first interface includes multiple application interfaces, and adaptively adjusts one or more of the following according to the acquired first information: the frame rates corresponding to the multiple application interfaces, the sizes of the application display areas corresponding to the multiple application interfaces, the display resolution of the second device, or the video resolutions corresponding to the multiple application interfaces.
  • based on this solution, the second device acquires the first information while accepting the screen projection from the first device, and adaptively adjusts the frame rates corresponding to the projection interfaces, the sizes of the corresponding application display areas, the display resolution, and/or the video resolutions accordingly. The load on the second device is thereby reduced, ensuring the smoothness and clarity of the projected image even when the processing capability of the second device is limited.
  • the above-mentioned first information includes the window states corresponding to the multiple application interfaces; the first information is specifically used by the second device to adaptively adjust the frame rates corresponding to the multiple application interfaces, wherein the window states include focused window, non-minimized and non-focused window, and minimized window.
  • based on this solution, the second device adaptively adjusts the frame rates corresponding to the projection interfaces by acquiring the window states corresponding to the multiple application interfaces, so as to allocate the image processing resources and processing capability of the device on demand and ensure the fluency and clarity of the projected image.
  • the above-mentioned processor is configured to execute the computer program, so that the electronic device adaptively adjusts the frame rates corresponding to the multiple application interfaces according to the following first preset strategy: the frame rate corresponding to the focused window > the frame rate corresponding to a non-minimized and non-focused window > the frame rate corresponding to a minimized window.
  • based on this solution, the second device adaptively adjusts the frame rate of each projection interface according to its window state and the preset strategy, ensuring the fluency and clarity of the projected image.
  • the above-mentioned first information includes the application categories corresponding to the multiple application interfaces, and is specifically used by the second device to adaptively adjust the frame rates corresponding to the multiple application interfaces; wherein the application categories include one or more of game, video, instant messaging, office, social, lifestyle, shopping, and utility.
  • based on this solution, the second device adaptively adjusts the frame rates corresponding to the projection interfaces by acquiring the application categories corresponding to the multiple application interfaces, so as to allocate the image processing resources and processing capability of the device on demand and ensure the fluency and clarity of the projected image.
  • the above-mentioned application categories include game, video, and instant messaging; the processor is configured to execute the computer program, so that the electronic device adaptively adjusts the frame rates corresponding to the multiple application interfaces according to the following second preset strategy: the frame rate corresponding to a game application interface > the frame rate corresponding to a video application interface > the frame rate corresponding to an instant messaging application interface.
  • based on this solution, the second device adaptively adjusts the frame rate of each projection interface according to its application category and the preset strategy, ensuring the fluency and clarity of the projected image.
  • the above-mentioned processor is configured to execute the computer program, so that when the electronic device displays the first interface synchronously with the first device, the first information is acquired if it is determined that the processing load of the second device is higher than a preset threshold.
  • the solution provided by this application can be implemented based on the fact that the processing load of the second device is higher than the preset threshold. Through this solution, when the processing capability of the second device is limited, the fluency and clarity of the projected screen can be guaranteed.
  • the above-mentioned processor determines that the processing load of the second device is higher than a preset threshold according to one or more of the following: the decoding delay of the GPU of the second device is greater than a delay threshold, the load rate of the GPU is greater than a load threshold, or the number of the multiple application interfaces is greater than a number threshold.
  • the above-mentioned first information includes the number of the multiple application interfaces, and is specifically used by the second device to adaptively adjust one or more of the following: the sizes of the application display areas corresponding to the multiple application interfaces, the display resolution of the second device, or the video resolutions corresponding to the multiple application interfaces.
  • based on this solution, the second device adaptively adjusts the above display parameters by acquiring the number of the multiple application interfaces, so as to allocate the image processing resources and processing capability of the device on demand and ensure the fluency and clarity of the projected image.
  • in one case, the second device determines that the size of the application display area corresponding to each application interface is 2a1 × b1, the display resolution of the second device is a2 × 2b2, and the video resolution corresponding to the multiple application interfaces is a3 × b3; wherein 2a1 is the length of the application display area and b1 is its width; a2 is the number of pixels that can be displayed in the horizontal dimension of the display screen of the second device, and 2b2 is the number of pixels that can be displayed in its vertical dimension; a3 is the number of pixels displayed per unit image area in the horizontal dimension, and b3 is the number of pixels displayed per unit image area in the vertical dimension.
  • in another case, the second device determines that the size of the application display area corresponding to each application interface is 3a1 × b1, the display resolution of the second device is a2 × 3b2, and the video resolution corresponding to the multiple application interfaces is a3 × b3; wherein 3a1 is the length of the application display area and b1 is its width; a2 is the number of pixels that can be displayed in the horizontal dimension of the display screen of the second device, and 3b2 is the number of pixels that can be displayed in its vertical dimension; a3 is the number of pixels displayed per unit image area in the horizontal dimension, and b3 is the number of pixels displayed per unit image area in the vertical dimension.
  • In a fourth aspect, a computer-readable storage medium is provided, on which computer program code is stored; when the computer program code is executed by a processor, the method in any possible implementation of the first aspect is implemented.
  • In a fifth aspect, a chip system is provided, which includes a processor and a memory in which computer program code is stored; when the computer program code is executed by the processor, the method in any possible implementation of the first aspect is realized.
  • the chip system may consist of a chip, or may include a chip and other discrete devices.
  • In a sixth aspect, a computer program product is provided which, when run on a computer, causes the method in any of the possible implementations of the first aspect to be implemented.
  • FIG. 1 is an example diagram of a multi-window screen projection scenario provided by an embodiment of the present application.
  • FIG. 2 is an example diagram of another multi-window screen projection scenario provided by an embodiment of the present application.
  • FIG. 3 is a schematic diagram of a hardware structure of a first device according to an embodiment of the present application.
  • FIG. 4 is a schematic diagram of a hardware structure of a second device according to an embodiment of the present application.
  • FIG. 5 is a schematic diagram of software interaction when a first device projects a screen to a second device according to an embodiment of the present application.
  • FIG. 6 is a schematic diagram of a process of projecting a screen from a first device to a second device.
  • FIG. 7 is a flowchart 1 of a multi-window screen projection method provided by an embodiment of the present application.
  • FIG. 8 is a schematic diagram 1 of a multi-window collaborative screen projection process provided by an embodiment of the present application.
  • FIG. 9 is a flowchart 2 of a multi-window screen projection method provided by an embodiment of the present application.
  • FIG. 10 is a flowchart 3 of a multi-window screen projection method provided by an embodiment of the present application.
  • FIG. 11 is a schematic diagram 2 of a multi-window collaborative screen projection process provided by an embodiment of the present application.
  • FIG. 12 is a schematic diagram 3 of a multi-window collaborative screen projection process provided by an embodiment of the present application.
  • FIG. 13 is a flowchart 4 of a multi-window screen projection method provided by an embodiment of the present application.
  • FIG. 15 is a schematic diagram 4 of a multi-window collaborative screen projection process provided by an embodiment of the present application.
  • FIG. 16 is a structural block diagram of an electronic device provided by an embodiment of the present application.
  • FIG. 17 is a structural block diagram of another electronic device provided by an embodiment of the present application.
  • FIG. 18 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
  • the terms "first" and "second" are used for descriptive purposes only, and should not be construed as indicating or implying relative importance or implicitly indicating the number of indicated technical features.
  • a feature defined as "first" or "second" may expressly or implicitly include one or more of that feature.
  • in the embodiments of the present application, "plural" means two or more.
  • An embodiment of the present application provides a multi-window screen projection method, which is implemented based on a multi-window screen projection technology.
  • the multi-window screen projection technology refers to establishing a communication connection between devices (such as a first device and a second device) to realize the mirror display of multiple application interfaces on multiple devices. Based on the mirror display of multiple application interfaces on multiple devices, the function of multi-screen collaborative interaction across devices is realized through mirror manipulation and input collaboration.
  • the function of multi-screen collaborative interaction across devices and systems can also be implemented through mirror manipulation and input collaboration.
  • multiple application interfaces started on the first device may be displayed on the second device synchronously.
  • the user can operate on the above-mentioned application interface through hardware of the second device (such as a keyboard, a mouse, a microphone, a speaker, etc.).
  • the user can open a new application interface through the first device or the second device, which is then further synchronized to the second device.
  • the user can also complete functions such as fast data sharing with the first device on the second device.
  • FIG. 1 and FIG. 2 show two scenarios of multi-window screen projection.
  • a communication connection for multi-window screen projection is established between the smartphone 110 (ie, the first device) and the notebook computer 120 (ie, the second device).
  • for example, while the smartphone 110 is displaying the mobile phone desktop, it receives the user's operations to launch a short message application, a video application, and a game application on the smartphone 110. The smartphone 110 then starts the interfaces of these applications, i.e. the screen projection interfaces, in the form of free floating windows.
  • the smartphone 110 renders the above free-floating window application interfaces together with its own desktop: a part is rendered and sent to the main display of the smartphone 110, and a part is rendered on the virtual screen (virtual display) of the smartphone 110. The smartphone 110 then encodes the Surface corresponding to the interfaces rendered on the virtual screen into a standard video stream and transmits it to the notebook computer 120, thereby realizing the collaborative display of multiple windows (i.e., the windows of multiple application interfaces) on the smartphone 110 (i.e., the first device) and the notebook computer 120 (i.e., the second device).
  • the smartphone 110 may also encode the Surface corresponding to all the interfaces rendered on the home screen and the virtual screen into a standard video stream and transmit it to the notebook computer 120 . Based on the collaborative display of multiple windows on the first device and the second device, the user can cooperatively control multiple application interfaces launched by the first device through the first device and the second device.
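  • the projection flow just described (render to a virtual screen, encode the Surface into a standard video stream, transmit) can be sketched as follows. The virtual-display, encoder, and transport objects are hypothetical stand-ins for the platform's real components (e.g., an off-screen display plus a hardware video encoder over WiFi), not actual APIs.

```python
class ProjectionPipeline:
    """High-level sketch of the flow described above, under assumed interfaces."""

    def __init__(self, virtual_display, encoder, transport):
        self.virtual_display = virtual_display  # off-screen render target (the "virtual screen")
        self.encoder = encoder                  # encodes rendered Surfaces into a standard video stream
        self.transport = transport              # wireless channel to the second device

    def project_frame(self) -> None:
        surface = self.virtual_display.capture()  # Surface of the co-rendered interfaces
        packet = self.encoder.encode(surface)     # one unit of the standard video stream
        self.transport.send(packet)               # deliver to the second device for decoding/display
```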
  • the interface that the smartphone 110 (i.e., the first device) sends to its own home screen for display may be referred to as the default interface.
  • the default interface can be a preset interface, such as the mobile phone desktop (as shown in FIG. 1 and FIG. 2), a settings interface, or a tool interface.
  • the default interface can also be a user-defined interface, which is not limited in this application.
  • the multi-window screen projection methods may include homologous screen projection and heterogeneous screen projection.
  • homologous (same-source) screen projection refers to projecting the interfaces of multiple applications launched on the first device to the second device in the manner of an extended screen.
  • in homologous screen projection, the first device uses one channel of encoding to send to the second device the standard video stream encoded from the Surface corresponding to all the application interfaces rendered on the main screen and the virtual screen, so that all the application interfaces (including the default interface) rendered on the main screen and the virtual screen are displayed on the display screen of the second device.
  • the default interface can be understood as the interface that is sent to and displayed on the first device.
  • for example, when the mobile phone 110 displays its desktop, in response to the user clicking the short message application icon (the "Information" icon shown in FIG. 1), the video application icon (the "Huawei Video" icon shown in FIG. 1), and the game application icon (as shown in FIG. 1), the mobile phone 110 renders the desktop, the short message application interface, the video application interface, and the game application interface on the home screen and the virtual screen together.
  • based on homologous screen projection, as shown in FIG. 1, after the notebook computer 120 (i.e., the second device) receives from the smartphone 110 (i.e., the first device) the standard video stream corresponding to all the application interfaces co-rendered on the home screen and the virtual screen, the notebook computer 120 displays the smartphone 110 desktop, the short message application interface, the video application interface, and the game application interface according to the standard video stream.
  • in heterogeneous screen projection, the first device adopts two channels of encoding: one channel sends the default interface for local display (i.e., displays it on the display screen of the first device), and the other channel sends information such as the standard video stream corresponding to the application interfaces rendered on the virtual screen to the second device; a sketch contrasting the two modes follows below.
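  • the two modes can be contrasted in a short sketch; the mode names and the encode/send/show helpers are hypothetical stand-ins for the real rendering, encoding, and transport components.

```python
def route_projection(mode: str, main_screen, virtual_screen,
                     local_display, second_device, encoder) -> None:
    """Sketch of the two projection modes described above, under assumed interfaces."""
    if mode == "homologous":
        # one channel of encoding: everything rendered on the main screen and the
        # virtual screen is sent to the second device
        second_device.send(encoder.encode([main_screen, virtual_screen]))
    elif mode == "heterogeneous":
        # two channels of encoding: the default interface is displayed locally,
        # the virtual-screen interfaces are sent to the second device
        local_display.show(encoder.encode([main_screen]))
        second_device.send(encoder.encode([virtual_screen]))
    else:
        raise ValueError(f"unknown projection mode: {mode}")
```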
  • for example, when the mobile phone 110 displays its desktop, in response to the user clicking the short message application icon (the "Information" icon shown in FIG. 2), the video application icon (the "Huawei Video" icon shown in FIG. 2), and the game application icon (as shown in FIG. 2), the mobile phone 110 renders the desktop, the short message application interface, the video application interface, and the game application interface on the home screen and the virtual screen together.
  • based on heterogeneous screen projection, as shown in FIG. 2, after the notebook computer 120 (i.e., the second device) receives from the smartphone 110 (i.e., the first device) the standard video stream corresponding to the application interfaces rendered on the virtual screen of the smartphone 110 (e.g., the short message application interface, the video application interface, and the game application interface), the notebook computer 120 displays the short message application interface, the video application interface, and the game application interface according to the standard video stream.
  • the homologous screen projection method and the heterogeneous screen projection method have their own advantages and disadvantages.
  • the homologous screen projection method can ensure the continuity of the application; while the heterogeneous screen projection method requires restarting the application when switching between different screens.
  • the heterogeneous screen projection method has better isolation.
  • the heterogeneous screen projection method can provide the user with independent control screens (ie, the display screen of the first device and the display screen of the second device) to handle different interfaces.
  • the multi-window screen projection method provided by the embodiments of the present application is applicable to any screen projection method (including homologous screen projection and heterogeneous screen projection).
  • the multi-window screen projection technology can provide users with a convenient user experience.
  • the mouse of the notebook computer 120 can act as the user's finger to implement more precise touch operations on the short message application interface, video application interface and game application interface or on the desktop of the mobile phone 110 .
  • the large-sized physical keyboard of the notebook computer 120 can replace the small-sized virtual input method window on the display screen of the smart phone 110 to achieve a better text input experience.
  • the multi-channel stereo speakers of the notebook computer 120 can replace the speakers of the smartphone 110 to output audio from the smartphone 110 (such as audio from a video application interface or a game application interface, etc.) to improve volume and sound quality.
  • the first device and the second device can establish a wireless communication connection by means of "touch", "scan" (such as scanning a two-dimensional code or barcode), or "automatic proximity discovery" (such as via Bluetooth or wireless fidelity (WiFi)).
  • the first device and the second device may follow a wireless transmission protocol, and transmit information through a wireless connection transceiver.
  • the wireless transmission protocol may include, but is not limited to, a Bluetooth (bluetooth, BT) transmission protocol or a wireless fidelity (wireless fidelity, WiFi) transmission protocol, and the like.
  • the WiFi transport protocol may be the WiFi P2P transport protocol.
  • the wireless connection transceiver includes but is not limited to Bluetooth, WiFi and other transceivers. Through wireless pairing, information transmission between the first device and the second device is realized.
  • the information transmitted between the first device and the second device includes, but is not limited to, content data to be displayed (such as standard video streams) and control instructions.
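  • the two kinds of transmitted information can be modeled with a simple message type; this framing is a hypothetical illustration, not a wire format defined by the patent or any protocol.

```python
from dataclasses import dataclass
from enum import Enum, auto

class PayloadType(Enum):
    VIDEO_STREAM = auto()          # content data to be displayed (e.g., a standard video stream)
    CONTROL_INSTRUCTION = auto()   # e.g., input events relayed between the devices

@dataclass
class ProjectionMessage:
    """Hypothetical framing of the information exchanged between the two devices."""
    payload_type: PayloadType
    payload: bytes
```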
  • a wired communication connection may be established between the first device and the second device.
  • for example, the first device and the second device establish a wired communication connection through a video graphics array (VGA) interface, a digital visual interface (DVI), a high definition multimedia interface (HDMI), or a data transmission line, etc.
  • Information transmission is implemented between the first device and the second device through the established wired communication connection.
  • the present application does not limit the specific connection manner between the first device and the second device.
  • both the first device and the second device include a display screen.
  • the first device and the second device may include, but are not limited to, smartphones, netbooks, tablet computers, smart watches, smart bracelets, phone watches, smart cameras, personal computers (PCs), personal digital assistants (PDAs), portable multimedia players (PMPs), augmented reality (AR)/virtual reality (VR) devices, televisions, projection devices, or somatosensory game consoles in human-computer interaction scenarios, etc.
  • the first device and the second device may also be other types or structures of electronic devices, which are not limited in this application.
  • the multi-window screen projection technology is mostly used between a portable device (ie, the first device) and a large-screen device (ie, the second device).
  • a portable device is a smartphone
  • a large-screen device is a laptop
  • the portable device is a tablet computer
  • the large-screen device is a TV.
  • the first device and the second device can be smartphones, netbooks, tablet computers, smart watches, smart bracelets, phone watches, and smart cameras.
  • the first device may include a processor 310, a memory (including an external memory interface 320 and an internal memory 321), a universal serial bus (USB) interface 330, a charging management module 340, and a power management module 341, battery 342, antenna 1, antenna 2, mobile communication module 350, wireless communication module 360, audio module 370, speaker 370A, receiver 370B, microphone 370C, headphone jack 370D, sensor module 380, button 390, motor 391, indicator 392, a camera 393, a display screen 394, and a subscriber identification module (SIM) card interface 395 and the like.
  • the sensor module 380 may include a pressure sensor, a gyroscope sensor, an air pressure sensor, a magnetic sensor, an acceleration sensor, a distance sensor, a proximity light sensor, a fingerprint sensor, a temperature sensor, a touch sensor, an ambient light sensor, a bone conduction sensor, and the like.
  • the structures illustrated in the embodiments of the present application do not constitute a specific limitation on the first device.
  • in other embodiments, the first device may include more or fewer components than shown, or combine some components, or split some components, or arrange the components differently.
  • the illustrated components may be implemented in hardware, software, or a combination of software and hardware.
  • Processor 310 may include one or more processing units.
  • the processor 310 may include an application processor (application processor, AP), a modem processor, a graphics processor (graphics processing unit, GPU), an image signal processor (image signal processor, ISP), a controller, a video codec, a digital signal processor (digital signal processor, DSP), a baseband processor, and/or a neural-network processing unit (neural-network processing unit, NPU), etc.
  • a memory may also be provided in the processor 310 for storing instructions and data.
  • the memory in the processor 310 is a cache memory. This memory may hold instructions or data that the processor 310 has just used or uses cyclically. If the processor 310 needs to use the instructions or data again, it can call them directly from this memory, which avoids repeated accesses and reduces the waiting time of the processor 310, thereby improving the efficiency of the system.
  • processor 310 may include one or more interfaces.
  • the interface may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a universal serial bus (USB) interface, etc.
  • the charging management module 340 is used to receive charging input from the charger.
  • the charger may be a wireless charger or a wired charger.
  • the charging management module 340 may receive charging input from the wired charger through the USB interface 330 .
  • the charging management module 340 may receive wireless charging input through the wireless charging coil of the first device. While the charging management module 340 is charging the battery 342 , it can also supply power to the first device through the power management module 341 .
  • the power management module 341 is used to connect the battery 342 , the charging management module 340 and the processor 310 .
  • the power management module 341 receives input from the battery 342 and/or the charge management module 340, and supplies power to the processor 310, the internal memory 321, the display screen 394, the camera assembly 393, and the wireless communication module 360.
  • the power management module 341 can also be used to monitor battery capacity, battery cycle times, battery health status (leakage, impedance) and other parameters.
  • the power management module 341 may also be provided in the processor 310 .
  • the power management module 341 and the charging management module 340 may also be provided in the same device.
  • the wireless communication function of the first device may be implemented by the antenna 1, the antenna 2, the mobile communication module 350, the wireless communication module 360, the modulation and demodulation processor, the baseband processor, and the like.
  • Antenna 1 and Antenna 2 are used to transmit and receive electromagnetic wave signals.
  • Each antenna in the first device may be used to cover a single or multiple communication frequency bands. Different antennas can also be reused to improve antenna utilization.
  • the antenna 1 can be multiplexed as a diversity antenna of the wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
  • the mobile communication module 350 may provide a wireless communication solution including 2G/3G/4G/5G etc. applied on the first device.
  • the mobile communication module 350 may include at least one filter, switch, power amplifier, low noise amplifier (LNA), and the like.
  • the mobile communication module 350 can receive electromagnetic waves from the antenna 1, filter and amplify the received electromagnetic waves, and transmit them to the modulation and demodulation processor for demodulation.
  • the mobile communication module 350 can also amplify the signal modulated by the modulation and demodulation processor, and then convert it into electromagnetic waves for radiation through the antenna 1 .
  • at least part of the functional modules of the mobile communication module 350 may be provided in the processor 310 .
  • at least part of the functional modules of the mobile communication module 350 may be provided in the same device as at least part of the modules of the processor 310 .
  • the modem processor may include a modulator and a demodulator.
  • the modulator is used to modulate the low frequency baseband signal to be sent into a medium and high frequency signal.
  • the demodulator is used to demodulate the received electromagnetic wave signal into a low frequency baseband signal. Then the demodulator transmits the demodulated low-frequency baseband signal to the baseband processor for processing.
  • the low frequency baseband signal is processed by the baseband processor and passed to the application processor.
  • the application processor outputs sound signals through an audio device (not limited to the speaker 370A, the receiver 370B, etc.), or displays images or videos through the display screen 394 .
  • the modem processor may be a stand-alone device.
  • the modem processor may be independent of the processor 310, and may be provided in the same device as the mobile communication module 350 or other functional modules.
  • the wireless communication module 360 can provide wireless communication solutions applied on the first device, including wireless local area network (WLAN) (such as a WiFi network), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR), and the like.
  • the wireless communication module 360 may be one or more devices integrating at least one communication processing module.
  • the wireless communication module 360 receives electromagnetic waves via the antenna 2 , frequency modulates and filters the electromagnetic wave signals, and sends the processed signals to the processor 310 .
  • the wireless communication module 360 can also receive the signal to be sent from the processor 310 , perform frequency modulation on the signal, amplify the signal, and then convert it into electromagnetic waves for radiation through the antenna 2 .
  • the antenna 1 of the first device is coupled with the mobile communication module 350, and the antenna 2 is coupled with the wireless communication module 360, so that the first device can communicate with the network and other devices through wireless communication technology.
  • the wireless communication technology may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division synchronous code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technology, etc.
  • the GNSS may include a global positioning system (GPS), a global navigation satellite system (GLONASS), a BeiDou navigation satellite system (BDS), and/or a satellite-based augmentation system (SBAS).
  • the first device implements a display function through a GPU, a display screen 394, an application processor, and the like.
  • the GPU is a microprocessor for image processing, and is connected to the display screen 394 and the application processor.
  • the GPU is used to perform mathematical and geometric calculations for graphics rendering.
  • Processor 310 may include one or more GPUs that execute program instructions to generate or alter display information.
  • the GPU can be used to convert and drive the display information required by the computer system, and to provide line scan signals to the display screen so that it displays correctly.
  • Display screen 394 is used to display images, videos, and the like.
  • Display screen 394 includes a display panel.
  • the display panel can be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a MiniLED, a MicroLED, a Micro-OLED, a quantum dot light-emitting diode (QLED), and so on.
  • the first device may include 1 or N display screens 394 , where N is a positive integer greater than 1.
  • the first device may implement a shooting function through an ISP, a camera component 393, a video codec, a GPU, a display screen 394, an application processor, and the like.
  • the external memory interface 320 can be used to connect an external memory card, such as a Micro SD card, to expand the storage capacity of the first device.
  • the external memory card communicates with the processor 310 through the external memory interface 320 to implement the data storage function, for example, to save files such as music and videos in the external memory card.
  • Internal memory 321 may be used to store computer executable program code, which includes instructions.
  • the internal memory 321 may include a storage program area and a storage data area.
  • the storage program area can store an operating system, an application program required for at least one function (such as a sound playback function, an image playback function, etc.), and the like.
  • the storage data area may store data (such as audio data, phone book, etc.) created during the use of the first device and the like.
  • the internal memory 321 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, universal flash storage (UFS), and the like.
  • the processor 310 executes various functional applications and data processing of the first device by executing the instructions stored in the internal memory 321 and/or the instructions stored in the memory provided in the processor.
  • the first device may implement audio functions, such as music playback and recording, through the audio module 370, the speaker 370A, the receiver 370B, the microphone 370C, the application processor, and the like.
  • for the specific working principles and functions of the audio module 370, the speaker 370A, the receiver 370B, and the microphone 370C, reference may be made to the introduction in the conventional technology.
  • the keys 390 include a power key, a volume key, and the like. The keys 390 may be mechanical keys or touch keys.
  • the first device may receive key input and generate key signal input related to user settings and function control of the first device.
  • Motor 391 can generate vibrating cues.
  • the motor 391 can be used for incoming call vibration alerts, and can also be used for touch vibration feedback.
  • touch operations acting on different applications can correspond to different vibration feedback effects.
  • the motor 391 can also correspond to different vibration feedback effects for touch operations on different areas of the display screen 394 .
  • different application scenarios (for example, time reminders, receiving messages, alarm clocks, games, etc.) can also correspond to different vibration feedback effects.
  • the touch vibration feedback effect can also support customization.
  • the indicator 392 can be an indicator light, which can be used to indicate the charging status, the change of power, and can also be used to indicate messages, missed calls, notifications, and the like.
  • the SIM card interface 395 is used to connect a SIM card.
  • the SIM card can be inserted into the SIM card interface 395 or pulled out from the SIM card interface 395 to achieve contact with and separation from the first device.
  • the first device may support 1 or N SIM card interfaces, where N is a positive integer greater than 1.
  • the SIM card interface 395 can support Nano SIM card, Micro SIM card, SIM card and so on.
  • multiple cards can be inserted into the same SIM card interface 395 at the same time.
  • the types of the plurality of cards may be the same or different.
  • the SIM card interface 395 can also be compatible with different types of SIM cards.
  • the SIM card interface 395 is also compatible with external memory cards.
  • the first device interacts with the network through the SIM card to implement functions such as call and data communication.
  • the first device employs an eSIM, i.e., an embedded SIM card.
  • the eSIM card can be embedded in the first device and cannot be separated from the first device.
  • the hardware modules included in the first device shown in FIG. 3 are only illustratively described, and do not limit the specific structure of the first device.
  • the first device may also include other functional modules.
  • FIG. 4 shows a schematic diagram of a hardware structure of a second device by taking the second device as a notebook computer as an example.
  • the notebook computer may include: a processor 410, an external memory interface 420, an internal memory 421, a USB interface 430, a power management module 440, an antenna 450, a wireless communication module 460, an audio module 470, a speaker 470A, a microphone 470C, a speaker interface 470B, a mouse 480, a keyboard 490, an indicator 491, a camera 493, a display screen 492, and so on.
  • the structure illustrated in this embodiment does not constitute a specific limitation on the notebook computer.
  • the notebook computer may include more or fewer components than shown, or some components may be combined, or some components may be split, or a different arrangement of components.
  • the illustrated components may be implemented in hardware, software, or a combination of software and hardware.
  • the processor 410 may include one or more processing units, for example, the processor 410 may include an application processor AP, a modem processor, a graphics processor GPU, an ISP, a controller, a memory, a video codec, a DSP, a baseband processor, and/or NPU, etc. Wherein, different processing units may be independent devices, or may be integrated in one or more processors.
  • the controller can be the nerve center and command center of the laptop.
  • the controller can generate an operation control signal according to the instruction operation code and a timing signal, so as to complete the control of fetching and executing instructions.
  • a memory may also be provided in the processor 410 for storing instructions and data.
  • the memory in the processor 410 is a cache memory. This memory may hold instructions or data that the processor 410 has just used or used cyclically. If the processor 410 needs to use the instructions or data again, they can be called directly from this memory. This avoids repeated accesses and reduces the waiting time of the processor 410, thereby improving system efficiency.
  • processor 410 may include one or more interfaces. The interface may include integrated circuit I2C interface, integrated circuit built-in audio I2S interface, PCM interface, UART interface, MIPI, GPIO interface, and/or USB interface, etc.
  • the interface connection relationship between the modules shown in this embodiment is only a schematic illustration, and does not constitute a structural limitation of the notebook computer.
  • the notebook computer may also adopt different interface connection manners in the foregoing embodiments, or a combination of multiple interface connection manners.
  • the power management module 440 is used to connect to power.
  • the power management module 440 may also be connected with the processor 410, the internal memory 421, the display screen 492, the camera 493, the wireless communication module 460, and the like.
  • the power management module 440 receives power input and supplies power to the processor 410, the internal memory 421, the display screen 492, the camera 493, and the wireless communication module 460.
  • the power management module 440 may also be provided in the processor 410.
  • the wireless communication function of the notebook computer can be realized by the antenna and the wireless communication module 460 and the like.
  • the wireless communication module 460 can provide wireless communication solutions applied on the notebook computer, including wireless local area network (WLAN) (such as a WiFi network), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR), and the like.
  • the wireless communication module 460 may be one or more devices integrating at least one communication processing module.
  • the wireless communication module 460 receives electromagnetic waves via the antenna, performs frequency modulation and filtering on the electromagnetic wave signals, and sends the processed signals to the processor 410.
  • the wireless communication module 460 can also receive the signal to be sent from the processor 410, perform frequency modulation on it, amplify it, and then convert it into electromagnetic waves for radiation through the antenna.
  • the antenna of the notebook computer is coupled with the wireless communication module 460, so that the notebook computer can communicate with the network and other devices through wireless communication technology.
  • the notebook computer realizes the display function through the GPU, the display screen 492, and the application processor.
  • the GPU is a microprocessor for image processing, and is connected to the display screen 492 and the application processor.
  • the GPU is used to perform mathematical and geometric calculations for graphics rendering.
  • Processor 410 may include one or more GPUs that execute program instructions to generate or alter display information.
  • Display screen 492 is used to display images, videos, and the like.
  • the display screen 492 includes a display panel.
  • the GPU can be used to convert and drive the display information required by the computer system, and provide a line scan signal to the display to control the correct display of the display.
  • the notebook computer can realize the shooting function through ISP, camera 493, video codec, GPU, display screen 492, and application processor.
  • the ISP is used to process the data fed back by the camera 493 .
  • the ISP may be provided in the camera 493 .
  • the digital signal processor is used to process digital signals; in addition to digital image signals, it can also process other digital signals. For example, when the notebook computer selects a frequency point, the digital signal processor is used to perform a Fourier transform on the frequency point energy.
  • Video codecs are used to compress or decompress digital video.
  • a notebook computer can support one or more video codecs. In this way, the notebook computer can play videos in multiple encoding formats, such as Moving Picture Experts Group (MPEG)1, MPEG2, MPEG3, MPEG4, and so on.
  • the external memory interface 420 can be used to connect an external memory card, such as a Micro SD card, to expand the storage capacity of the notebook computer.
  • the external memory card communicates with the processor 410 through the external memory interface 420 to implement the data storage function, for example, to save files such as music and videos in the external memory card.
  • Internal memory 421 may be used to store computer executable program code, which includes instructions.
  • the processor 410 executes various functional applications and data processing of the notebook computer by executing the instructions stored in the internal memory 421 .
  • the processor 410 may execute instructions stored in the internal memory 421, and the internal memory 421 may include a program storage area and a storage data area.
  • the storage program area can store an operating system, an application program required for at least one function (such as a sound playback function, an image playback function, etc.), and the like.
  • the storage data area can store data (such as audio data, phone book, etc.) created during the use of the notebook computer.
  • the internal memory 421 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, universal flash storage (UFS), and the like.
  • the notebook computer can implement audio functions through an audio module 470, a speaker 470A, a microphone 470C, a speaker interface 470B, and an application processor. For example, music playback, recording, etc.
  • the indicator 491 may be an indicator light, which may be used to indicate that the notebook computer is in a power-on state or a power-off state, or the like. For example, if the indicator light is off, it can indicate that the notebook computer is in a power-off state; if the indicator light is on, it can indicate that the notebook computer is in a power-on state.
  • the structures illustrated in the embodiments of the present application do not constitute a specific limitation on the notebook computer. It may have more or fewer components than those shown in FIG. 4, may combine two or more components, or may have a different configuration of components.
  • the notebook computer may also include components such as speakers.
  • the various components shown in FIG. 4 may be implemented in hardware, software, or a combination of hardware and software, including one or more signal processing or application-specific integrated circuits.
  • the software systems of the first device and the second device provided by the embodiments of the present application may adopt a layered architecture, an event-driven architecture, a microkernel architecture, a microservice architecture, or a cloud architecture, or the like.
  • the software system may include, but is not limited to, operating systems such as Symbian, Android, Windows, iOS, BlackBerry, and Harmony, which is not limited in this application.
  • FIG. 5 takes an Android operating system with a layered architecture as an example to specifically introduce a schematic diagram of software interaction when a first device projects a screen to a second device in an embodiment of the present application.
  • the layered architecture can divide the software into several layers, and each layer has a clear role and division of labor. Layers communicate with each other through software interfaces.
  • the software structures of the first device and the second device can be divided from top to bottom into an application layer (referred to as the application layer), an application framework layer (referred to as the framework layer), system libraries and the Android runtime, and a kernel layer (also known as the driver layer).
  • the application layer can include a series of application packages, such as camera, gallery, calendar, call, map, navigation, bluetooth, music, video, SMS and other applications.
  • the application program is simply referred to as the application below.
  • the application on the first device can be a native application (such as an application installed on the first device when the operating system was installed before the first device left the factory) or a third-party application (such as an application downloaded and installed by the user through an application store), which is not limited in the embodiments of the present application.
  • the application framework layer provides an application programming interface (application programming interface, API) and a programming framework for applications in the application layer.
  • the application framework layer may include a window management server (window manager service, WMS), an activity management server (activity manager service, AMS), an input event management server (input manager service, IMS), and a screen casting management module.
  • the application framework layer may also include content providers, view systems, telephony managers, resource managers, notification managers, etc. (not shown in Figure 5).
  • WMS carries the data and attributes related to interfaces and is used to manage interface-related states, for example, to manage window procedures and event dispatch.
  • managing window procedures refers to outputting content to the physical screen or another display device in an orderly manner, with the assistance of the application server and the WMS, according to the display requests of application programs.
  • event dispatch refers to dispatching user events from the keyboard, physical keys, touch screen, mouse, trackball, etc. to the corresponding controls or windows.
  • the window management server can also obtain the size of the display screen, determine whether there is a status bar, lock the screen, take screenshots, etc.
  • AMS is used to manage activities, and is responsible for the startup, switching, scheduling of various components in the system, and the management and scheduling of applications.
  • AMS defines data classes for saving a process (Process), an activity (Activity) and a task (Task) respectively.
  • the data class corresponding to the process (Process) may include process file information, memory state information of the process, and Activity, Service, and the like contained in the process.
  • Activity information can be saved in ActivityStack.
  • ActivityStack is used for unified scheduling of application Activity.
  • ActivityStack can save information about all running Activities (that is, final ArrayList mHistory), such as interface configuration information.
  • the running Activities can be saved in a new ArrayList.
  • ActivityStack can also save information about historically run Activities, such as interface configuration information. It should be noted that an Activity does not correspond to an application, whereas an ActivityThread does; therefore, Android allowing multiple applications to run at the same time actually means allowing multiple ActivityThreads to run at the same time.
  • each application process reports to AMS when it wants to start a new Activity or stop the current Activity.
  • AMS internally records all application processes. When AMS receives a start or stop report, it first updates its internal records, and then notifies the corresponding client process to run or stop the specified Activity. Since AMS keeps records of all Activities, it can schedule them and automatically close background Activities according to the Activity states and system memory.
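  • as an illustration of this bookkeeping, the following minimal sketch (hypothetical class and method names; the real AMS implementation is far more involved) updates the internal record first and then notifies the client process:

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical sketch of AMS-style bookkeeping: on a start/stop report, the
// internal record is updated first, then the owning client process is notified.
class ActivityRecords {
    private final Map<String, String> stateByActivity = new HashMap<>();

    void onReport(String activityName, String newState, Runnable notifyClientProcess) {
        stateByActivity.put(activityName, newState); // 1. update internal record
        notifyClientProcess.run();                   // 2. notify client to run/stop
    }

    String stateOf(String activityName) {
        return stateByActivity.get(activityName);    // basis for scheduling decisions
    }
}
```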
  • the IMS can be used to translate and encapsulate the original input event, obtain an input event containing more information, and send it to the WMS.
  • the WMS stores information such as the clickable areas (for example, controls) of each application and the position of the focus window. Therefore, the WMS can correctly dispatch input events to the specified control or focus window.
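  • a minimal sketch of such dispatch logic (the WindowRecord type and its fields are hypothetical; real WMS hit-testing is more complex) might deliver the event to the window whose clickable area contains the touch point, otherwise to the focus window:

```java
import android.graphics.Rect;
import java.util.List;

// Hypothetical window record: clickable region plus focus flag.
class WindowRecord {
    Rect clickableArea;   // clickable region reported by the application
    boolean isFocused;    // whether this is the current focus window

    WindowRecord(Rect area, boolean focused) {
        this.clickableArea = area;
        this.isFocused = focused;
    }
}

class EventDispatcher {
    static WindowRecord dispatchTarget(List<WindowRecord> windows, int x, int y) {
        for (WindowRecord w : windows) {
            if (w.clickableArea.contains(x, y)) {
                return w;  // deliver the event to the hit window
            }
        }
        for (WindowRecord w : windows) {
            if (w.isFocused) {
                return w;  // otherwise deliver to the focus window
            }
        }
        return null;
    }
}
```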
  • Content providers are used to store and retrieve data and make these data accessible to applications.
  • Data can include videos, images, audio, calls made and received, browsing history and bookmarks, phone book, etc.
  • the view system includes visual controls, such as controls for displaying text, controls for displaying pictures, and so on. View systems can be used to build applications.
  • a display interface can consist of one or more views.
  • the display interface including the short message notification icon may include a view for displaying text and a view for displaying pictures.
  • the telephony manager is used to provide the communication function of the first device. For example, the management of call status (including connecting, hanging up, etc.).
  • the resource manager provides various resources for the application, such as localization strings, icons, pictures, layout files, video files and so on.
  • the notification manager enables applications to display notification information in the status bar. It can be used to convey notification-type messages, which can disappear automatically after a brief pause without user interaction. For example, the notification manager is used to notify download completion, give message reminders, etc.
  • the notification manager can also display notifications in the status bar at the top of the system in the form of graphs or scroll-bar text (such as notifications of applications running in the background), or display notifications on the screen in the form of dialog windows. For example, text information is prompted in the status bar, a prompt sound is issued, the electronic device vibrates, or the indicator light flashes.
  • the screencasting management module is used to manage affairs related to screencasting, for example, transmitting the video stream corresponding to an application interface and the interface configuration parameters; for another example, receiving and distributing a screen rotation request from a screen-casting destination device (such as the second device).
  • the screen projection management module may be Huawei's Assistant or Manager.
  • the Assistant may be a module for interacting with other electronic devices (such as the second device) to project screen-related information.
  • the Assistant may provide an API and programming framework for the first device to communicate with other electronic devices (such as the second device).
  • the Manager may be a computer housekeeper, a computer assistant, or the like.
  • the system library and Android runtime include the functions that FWK needs to call, the Android core library, and the Android virtual machine.
  • a system library can include multiple functional modules, for example: a browser kernel, three-dimensional (3D) graphics, a font library, etc.; or, for example: a surface manager (Surface Manager), media libraries (Media Libraries), a 3D graphics processing library (for example, OpenGL ES), and a 2D graphics engine (for example, SGL).
  • the Surface Manager is used to manage the display subsystem and provides a fusion of 2D and 3D layers for multiple applications.
  • the media library supports playback and recording of a variety of commonly used audio and video formats, as well as still image files.
  • the media library can support a variety of audio and video encoding formats, such as: MPEG4, H.264, MP3, AAC, AMR, JPG, PNG, etc.
  • the 3D graphics processing library is used to realize 3D graphics drawing, image rendering, compositing and layer processing, etc.
  • 2D graphics engine is a drawing engine for 2D drawing.
  • the kernel layer is the layer between hardware and software.
  • the kernel layer can include display drivers, input/output device drivers (eg, keyboard, touch screen, headset, speaker, microphone, etc.), device nodes, camera drivers, audio drivers, and sensor drivers.
  • the user performs input operations through the input device, and the kernel layer can generate corresponding original input events according to the input operations and store them in the device node.
  • input/output device drivers can detect user input events, for example, the action of the user launching an application.
  • the user can control the application interface that is projected from the first device to the second device through the second device.
  • the input/output device driver or sensor driver of the second device can detect the user's input event.
  • the input event may be an input event in which the user clicks a button on a certain interface to enter a next-level interface of the interface, or an input event in which the user rotates the display screen of the second device.
  • the input/output device driver or sensor driver of the second device reports the user's input event to the IMS.
  • the IMS of the second device synchronizes the input event, through the screencasting management module (for example, Assistant or Manager) of the second device, to the screencasting management module (for example, Assistant or Manager) of the first device.
  • the screen projection management module of the first device distributes the input event to the corresponding application.
  • the application calls the startActivity interface in AMS to start the Activity corresponding to the input event.
  • AMS calls the WMS interface according to the startup parameters.
  • WMS draws the window corresponding to the Activity according to the startup parameters, and refreshes the application interface configuration parameters.
  • the screen projection management module of the first device encodes the Surface corresponding to the refreshed application interface configuration parameters into a standard video stream and synchronizes it to the screen projection management module of the second device.
  • the screen projection management module of the second device calls the display driver through the WMS according to the received standard video stream, so as to realize synchronous display on the display screen of the second device.
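  • the patent does not name a specific encoder API, but on Android the standard video stream could, for example, be produced with the framework's MediaCodec; a minimal sketch follows, in which the resolution, bit rate, and frame rate values are illustrative assumptions:

```java
import android.media.MediaCodec;
import android.media.MediaCodecInfo;
import android.media.MediaFormat;
import android.view.Surface;

// Minimal sketch: configure an H.264 encoder whose input Surface receives the
// jointly rendered multi-window interface; encoded frames are then sent to the
// second device over the screencasting channel.
class ProjectionEncoder {
    static Surface createEncoderInputSurface() throws Exception {
        MediaFormat format = MediaFormat.createVideoFormat("video/avc", 1920, 1080);
        format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
                MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);
        format.setInteger(MediaFormat.KEY_BIT_RATE, 8_000_000);   // assumed 8 Mbps
        format.setInteger(MediaFormat.KEY_FRAME_RATE, 60);        // assumed 60 FPS
        format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1);   // 1 s GOP

        MediaCodec encoder = MediaCodec.createEncoderByType("video/avc");
        encoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
        Surface input = encoder.createInputSurface(); // interfaces render into this
        encoder.start();
        return input;
    }
}
```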
  • FIG. 5 only takes an Android system with a layered architecture as an example to introduce a schematic diagram of software interaction between devices during multi-window screen projection.
  • the present application does not limit the specific architectures of the software systems of the first device and the second device.
  • Frame rate: refers to the number of picture frames displayed in one second; it can also be understood as the number of times per second the graphics processor can refresh the picture.
  • frame rate usually affects the smoothness of the picture: the frame rate is proportional to the smoothness. Specifically, the larger the frame rate, the smoother the picture; the smaller the frame rate, the choppier the picture. Due to the special physiological structure of human eyes, if the frame rate of a picture is higher than 16 FPS, humans perceive the picture as coherent; this phenomenon is called persistence of vision.
  • Resolution: used to indicate how many pixels can be displayed per unit area, reflecting the precision of the display. Generally, the more pixels that can be displayed per unit area, the finer the picture; the fewer the pixels that can be displayed per unit area, the rougher the picture.
  • the display resolution and the image resolution may be specifically involved.
  • Display resolution: used to indicate the number of pixels that can be displayed per unit area of the device display screen, reflecting the precision of the screen. Because the dots, lines, and areas on a device's display are made up of pixels, the more pixels the display can show, the finer the picture, and the more information can be presented in a display area of the same size. Generally, at a given display resolution, the smaller the display screen, the clearer the image; at a given display size, the larger the display resolution, the clearer the image.
  • Image resolution: used to indicate the number of pixels that can be displayed per unit area of an image.
  • image resolution can be expressed in pixels per inch (ppi) and image dimensions (including image length and width).
  • image resolution is used to reflect the precision of the image (i.e., the picture).
  • the image resolution can be represented by the number of horizontal pixels and the number of vertical pixels. Generally, when the display resolution is fixed, the higher the image resolution, the more image pixels, and the larger the size and area of the image.
  • Bit rate (br): refers to the number of data bits transmitted per unit time, so the code rate is also called the bit rate. The unit of bit rate is usually bps (bits per second). The bit rate can be understood as a sampling rate: generally, the larger the sampling rate, the higher the precision, and the closer the processed file is to the original file. However, since the file size is proportional to the sampling rate, almost all encoding formats focus on how to achieve the least distortion with the lowest bit rate. Encoding formats such as variable bitrate (VBR), average bitrate (ABR), and constant bitrate (CBR) are derived around this core.
  • at a given bit rate, the resolution is inversely proportional to image clarity. Specifically, the higher the resolution, the less clear the image; the lower the resolution, the clearer the image.
  • at a given resolution, the bit rate is proportional to image clarity. Specifically, the higher the bit rate, the clearer the image; the lower the bit rate, the less clear the image.
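  • these two relationships can be made concrete with a bits-per-pixel budget (an illustrative calculation, not taken from the patent): at a fixed bit rate, more pixels per second leaves fewer bits for each pixel:

```java
// Illustrative arithmetic for the two relationships above: at a fixed bit rate,
// raising the resolution lowers the bits available per pixel (less clear); at a
// fixed resolution, raising the bit rate raises the bits per pixel (clearer).
class BitBudget {
    static double bitsPerPixel(long bitrateBps, int width, int height, int fps) {
        return (double) bitrateBps / ((long) width * height * fps);
    }

    public static void main(String[] args) {
        // The same assumed 8 Mbps stream at 60 FPS:
        System.out.println(bitsPerPixel(8_000_000, 1280, 720, 60));  // ~0.145
        System.out.println(bitsPerPixel(8_000_000, 1920, 1080, 60)); // ~0.064
    }
}
```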
  • the process of screen projection from the first device to the second device may mainly include: rendering instruction generation → interface rendering → color space conversion → video encoding → video decoding → color space conversion → screen cutting → display sending.
  • interface rendering and video encoding are completed by the first device; video decoding, picture cutting, and display sending are completed by the second device.
  • interface rendering means that the first device jointly renders the multiple application interfaces displayed in the multiple windows.
  • Color space conversion refers to the representation of colors in a color-coded form that can be recognized by machines.
  • the color coding may adopt coding methods such as YUV color coding or RGB color coding.
  • YUV color coding uses luminance and chrominance to define the color of a pixel: Y represents luminance (Luminance), and U and V represent chrominance (Chrominance). Chrominance defines two aspects of a color: hue and saturation.
  • RGB color coding adopts the principle of adding the three primary colors of red (Red), green (Green), and blue (Blue) in different proportions to generate a variety of color lights.
  • each pixel has the three primary-color components red, green, and blue, and each primary color occupies 8 bits (i.e., one byte), so one pixel occupies 24 bits (i.e., three bytes).
  • the encoding and decoding capability of the codec determines whether color space conversion is required. For example, if a device supports RGB color decoding but does not support YUV color decoding, the color encoding form needs to be converted from YUV to RGB.
  • the above-mentioned color space conversion step may not be performed.
  • the following description takes a device that needs to perform color space conversion as an example.
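  • as an illustration of one direction of such a conversion, the following sketch applies the common BT.601 full-range YUV-to-RGB formulas; the patent itself does not prescribe particular conversion coefficients:

```java
// A minimal sketch of one direction of color space conversion (YUV -> RGB)
// using common BT.601 full-range formulas; clamp keeps results in 0..255.
class ColorConvert {
    static int clamp(double v) {
        return (int) Math.max(0, Math.min(255, Math.round(v)));
    }

    static int[] yuvToRgb(int y, int u, int v) {
        int r = clamp(y + 1.402 * (v - 128));
        int g = clamp(y - 0.344136 * (u - 128) - 0.714136 * (v - 128));
        int b = clamp(y + 1.772 * (u - 128));
        return new int[] { r, g, b };
    }
}
```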
  • Video encoding refers to a method of converting a certain video format file into another video format file through a specific compression technology.
  • the video coding can adopt standards such as H.261, H.263, H.263+, H.263++, or H.264.
  • Video decoding is the reverse process of video encoding.
  • the specific process of video coding, and the specific process of video decoding reference may be made to explanations and descriptions in the conventional technology, which will not be repeated in this application.
  • the decoded video stream consists of picture frames.
  • a picture frame includes the interface configuration information of multiple screen projection interfaces, such as the application development attributes/application data configuration, the boundary information of the application interfaces, the orientations of the applications, the icons and text on the application interfaces, and the positions, sizes, and colors of those icons and text.
  • the application development attributes and application data configuration may be used to reflect one or more of interface attributes, application categories, or application functions.
  • screen cutting refers to cutting an image frame that includes the configuration information of the screen projection interfaces into multiple sub-interfaces, for example, into multiple application interfaces.
  • sending to display refers to calling the display driver to start multiple rendering tasks, rendering the multiple cut application interfaces in their corresponding windows, and displaying them on the display screen.
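  • a minimal sketch of the cutting step is shown below (hypothetical names; the Rect list stands in for the boundary information carried in the interface configuration), using Android's Bitmap region copy:

```java
import android.graphics.Bitmap;
import android.graphics.Rect;
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch of "screen cutting": each decoded picture frame is split
// into per-application sub-interfaces using the boundary information carried in
// the frame's interface configuration.
class FrameCutter {
    static List<Bitmap> cut(Bitmap decodedFrame, List<Rect> appBounds) {
        List<Bitmap> subInterfaces = new ArrayList<>();
        for (Rect r : appBounds) {
            subInterfaces.add(Bitmap.createBitmap(
                    decodedFrame, r.left, r.top, r.width(), r.height()));
        }
        return subInterfaces; // each bitmap is then rendered in its own window
    }
}
```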
  • FIG. 6 illustrates a conventional process of screen casting from the first device to the second device, taking the first device projecting a short message application, a video application, and a game application as an example.
  • when the first device projects a screen to the second device, the first device first renders the interfaces of the short message application, the video application, and the game application started on the first device. Color space conversion is then performed on the rendered interfaces (FIG. 6 takes conversion to YUV color coding as an example). Then, the interfaces after color space conversion are video encoded (FIG. 6 takes video encoding with the H.264 standard as an example).
  • the encoded standard video stream is sent (eg, sent through the screen projection management module of the first device) to the second device (eg, the screen projection management module of the second device).
  • after receiving the standard video stream, the second device first performs video decoding on it (FIG. 6 takes video decoding with the H.264 standard as an example). Color space conversion is then performed on each decoded picture frame (FIG. 6 takes conversion to YUV color decoding as an example). Then, each picture frame is cut according to the different interface attributes, for example, into a short message application interface, a video application interface, and a game application interface. Finally, the cut application interfaces are sent for display.
  • since the frame rates of the short message application and the game application are both 60 FPS, during encoding, in order to ensure that the application interface information required at the high frame rate is complete, the first device usually encodes all application interfaces at 60 FPS.
  • multiple application interfaces decoded and cut by the second device are also displayed at a fixed frame rate (eg, 60 FPS).
  • the occupancy rate of the GPU is often relatively high (for example, reaching more than 80%).
  • the resolutions of multiple application interfaces are also usually fixed.
  • the throughput pressure of communication resources (such as WiFi resources) is usually larger.
  • the above problems will cause the system to freeze, leading to problems such as stuck or unsmooth projected images, which degrades the user experience.
  • an embodiment of the present application provides a multi-window screen projection method, which is used to ensure the fluency and clarity of the projection screen when the first device projects the screen to the second device in multiple windows.
  • a multi-window screen projection method provided by an embodiment of the present application can ensure the smoothness and clarity of the projected screen by reducing the GPU pressure of the second device when the first device projects to the second device in multiple windows.
  • the second device may adaptively and dynamically adjust the frame rates corresponding to the multiple application interfaces according to the window states corresponding to the multiple application interfaces projected from the first device to the second device, so as to reduce the GPU pressure of the second device and thereby ensure the fluency and clarity of the projected screen.
  • the above-mentioned window state may include, but is not limited to, a focus window, a non-minimized and non-focus window, and a minimized window.
  • the second device may adaptively and dynamically adjust the frame rates corresponding to different application interfaces according to the application categories corresponding to the multiple application interfaces projected from the first device to the second device, so as to reduce the GPU pressure of the second device and thereby ensure the fluency and clarity of the projected screen.
  • application categories may include, but are not limited to, instant messaging, video, game, office, social, life, shopping, or functional.
  • the multi-window screen projection method provided by the embodiments of the present application can ensure the fluency and clarity of the projected screen by adaptively adjusting the size and/or resolution of the display area when the first device projects the screen to the second device in multiple windows.
  • the resolution may include, but is not limited to, display resolution and video resolution.
  • the second device may adaptively and dynamically adjust the application display area (Display) size, the display area (Display) resolution, the video resolution, and the like according to the number of application interfaces projected from the first device to the second device, so as to ensure the fluency and clarity of the projected screen. Since a video is composed of frame-by-frame images, the video resolution is also called the image resolution.
  • the size of the display area may be understood as the size of the display area used by the display screen of the device to display the application interface.
  • the display area (Display) resolution is used to characterize the number of pixels that can be displayed per unit area in the display area of the device display screen used to display the application interface.
  • the video resolution is used to represent the number of pixels that can be displayed in the unit image area of the image frame corresponding to the video stream.
  • the multi-window screen projection method provided by the following embodiments of the present application is applicable to both the same source screen projection method and the heterogeneous screen projection method.
  • the technical solutions provided by the embodiments of the present application will be described in detail by taking the wireless transmission protocol between the first device and the second device as an example in conjunction with specific embodiments.
  • the second device may adaptively and dynamically adjust the frame rates corresponding to different application interfaces according to the window states corresponding to the multiple application interfaces projected from the first device to the second device, so as to reduce the GPU pressure of the second device and ensure the smoothness and clarity of the projected screen.
  • FIG. 7 shows a flowchart of a multi-window screen projection method provided by an embodiment of the present application.
  • a multi-window screen projection method provided by an embodiment of the present application may include the following steps S701-S703:
  • the second device and the first device synchronously display a first interface, where the first interface includes multiple application interfaces.
  • the second device and the first device synchronously display the first interface means that the second device synchronously displays the first interface projected from the first device to the second device.
  • the first interface is a combination of multiple application interfaces.
  • the first device is a mobile phone 110
  • the mobile phone 110 uses the same source screen projection method to project the desktop, short message application interface, video application interface and game application interface of the mobile phone 110 to the laptop computer 120
  • the first interface is shown in FIG. 1, and includes the mobile phone desktop, the short message application interface, the video application interface, and the game application interface.
  • when the first device is a mobile phone 110 and the mobile phone 110 projects the short message application interface, the video application interface, and the game application interface to the notebook computer 120 using a heterogeneous screen projection method, the first interface is shown in FIG. 2 and includes the short message application interface, the video application interface, and the game application interface.
  • the second device acquires window states corresponding to multiple application interfaces.
  • the above-mentioned window state may include, but is not limited to, a focus window, a non-minimized and non-focus window, and a minimized window.
  • the focus window may be understood as the application window manipulated by the user most recently.
  • the focused window can also be referred to as the currently active window.
  • a non-minimized and non-focused window can be understood as an application window that is not currently minimized and that the user did not manipulate most recently.
  • a minimized window can be understood as the currently minimized application window.
  • the desktop of the mobile phone 110, the short message application interface, the video application interface, and the game application interface are all non-minimized windows. Assuming that the application window most recently manipulated by the user is the video application interface, the video application interface is the focus window, and the mobile phone 110 desktop, the short message application interface, and the game application interface are non-minimized and non-focused windows.
  • the short message application interface, the video application interface, and the game application interface are all non-minimized windows. Assuming that the application window most recently manipulated by the user is the video application interface, the video application interface is the focus window, and the short message application interface and the game application interface are non-minimized and non-focused windows.
  • the second device may periodically acquire window states corresponding to multiple application interfaces.
  • the second device may periodically acquire window states corresponding to multiple application interfaces according to a preset period.
  • the preset period may be preset in the second device.
  • the preset period may be, for example, 3 seconds (s).
  • the second device may acquire window states corresponding to multiple application interfaces in response to receiving a manipulation event from the user.
  • the second device may acquire window states corresponding to multiple application interfaces in response to the input/output device driver or the sensor driver receiving a manipulation event from the user.
  • the above-mentioned manipulation event may be a manipulation event of the user on any one of the above-mentioned multiple application interfaces.
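  • the two acquisition modes above (a preset polling period and an event-driven refresh) could be combined as in the following sketch; WindowStateMonitor and refreshWindowStates() are hypothetical names, not framework APIs:

```java
import android.os.Handler;
import android.os.Looper;

// Sketch of window-state acquisition: poll every preset period (3 s here, per
// the example above) and also refresh immediately on a user manipulation event.
class WindowStateMonitor {
    private static final long PRESET_PERIOD_MS = 3_000; // preset period: 3 s
    private final Handler handler = new Handler(Looper.getMainLooper());

    private final Runnable poll = new Runnable() {
        @Override
        public void run() {
            refreshWindowStates();
            handler.postDelayed(this, PRESET_PERIOD_MS); // reschedule the poll
        }
    };

    void start() {
        handler.post(poll);
    }

    void onUserManipulationEvent() {
        refreshWindowStates(); // event-driven refresh
    }

    private void refreshWindowStates() {
        // query focus / non-minimized / minimized state for each projected window
    }
}
```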
  • the second device adaptively adjusts the frame rates corresponding to the multiple application interfaces according to the obtained window states corresponding to the multiple application interfaces.
  • the second device may adaptively adjust the frame rates corresponding to the multiple application interfaces according to a preset policy (eg, the first preset policy).
  • the first preset policy is related to the window state.
  • the degree of user experience requirement from high to low is: focus window > non-minimized and non-focused window > minimized window. Therefore, in some embodiments, by adaptively adjusting the frame rates corresponding to the multiple application interfaces according to their window states, the device can tilt GPU resources and/or processing capability toward the application interfaces with higher user experience requirements (for example, the application interface in the focus window), and reduce the GPU resources and/or processing capability allocated to the application interfaces with lower user experience requirements (for example, the application interface in a minimized window).
  • the second device can adaptively adjust the frame rates corresponding to the multiple application interfaces according to the following first preset strategy: adjust the frame rates from large to small in the order of user experience requirements from high to low.
  • that is, the size of the frame rate can be set as: focus window > non-minimized and non-focused window > minimized window.
  • the frame rate of the application interface corresponding to the focus window can be adjusted to 60 FPS (that is, refreshed 60 times per second), the frame rate of the application interface corresponding to a non-minimized and non-focused window can be adjusted to 30 FPS (that is, refreshed 30 times per second), and the frame rate of the application interface corresponding to a minimized window can be adjusted to 0 FPS (that is, not refreshed).
  • since the focus window is the application window most recently manipulated by the user, its user experience requirement is the highest.
  • the minimized window has the lowest user experience requirement because it is currently minimized.
  • the current user experience requirement of a non-minimized and non-focused window is not high, but the user may manipulate the application interface in that window at any time, so its user experience requirement lies between that of the focus window and that of the minimized window.
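  • the first preset strategy then reduces to a simple mapping from window state to target frame rate, as in this sketch (the 60/30/0 FPS values follow the example above and are illustrative, not mandated by the method):

```java
// Sketch of the first preset policy: frame rate ordered
// focus window > non-minimized, non-focused window > minimized window.
enum WindowState { FOCUSED, NON_MINIMIZED_NON_FOCUSED, MINIMIZED }

class FrameRatePolicy {
    static int targetFps(WindowState state) {
        switch (state) {
            case FOCUSED:                   return 60; // refresh every frame
            case NON_MINIMIZED_NON_FOCUSED: return 30; // refresh every other frame
            case MINIMIZED:                 return 0;  // do not refresh
            default:                        return 30;
        }
    }
}
```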
  • FIG. 8 shows a schematic diagram of a multi-window collaborative screen projection process provided by an embodiment of the present application, taking the first device projecting a short message application, a video application, and a game application to the second device as an example.
  • the first device renders the common interface of the short message application, the video application, and the game application launched on the first device, performs color space conversion (FIG. 8 takes conversion to YUV color coding as an example), and performs video encoding (FIG. 8 takes video encoding with the H.264 standard as an example).
  • the encoded standard video stream is then sent to the second device.
  • the second device completes video decoding (FIG. 8 takes video decoding with the H.264 standard as an example), color space conversion (FIG. 8 takes conversion to YUV color decoding as an example), picture cutting, and display sending.
  • in the process of the user manipulating the interfaces, the second device obtains (for example, periodically, or in response to receiving a manipulation event) the window states corresponding to the short message application interface, the video application interface, and the game application interface.
  • the second device determines that the short message application window is currently minimized, the video application window is currently a non-minimized and non-focused window, and the game application window is currently the focused window.
  • the second device adaptively adjusts the frame rates corresponding to the short message application interface, the video application interface, and the game application interface according to the obtained window states and the policy that the frame rate follows: focus window > non-minimized and non-focused window > minimized window.
  • the second device may, according to the window states of the short message application window, the video application window, and the game application window, adjust the frame rate of the short message application interface to 0 FPS, adjust the frame rate of the video application interface to 30 FPS, and leave the frame rate of the game application interface unadjusted (that is, still 60 FPS).
  • the second device may not refresh the short message application interface, and instead send for display the interface configuration information of the short message application corresponding to the previous image frame.
  • the short message application interface sent for display may thus remain the same as the short message application interface in the previous frame.
  • the second device adjusting the frame rate of the video application interface from 60 FPS to 30 FPS may specifically include: the second device starts a rendering task every other frame, rendering the video application interface in the video application window and sending it for display.
  • the second device sending the game application interface for display may specifically include: the second device starts a rendering task for each frame of the game application interface after cutting, and renders and sends the display in the game application window.
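  • The frame pacing described above can be sketched as follows, assuming the incoming video stream arrives at 60 FPS: a 30 FPS target starts a rendering task every other frame, and a 0 FPS target starts none. The credit-accumulator logic is an assumed generalization (it also covers targets such as 24 FPS), not code from this application.

```java
// Decides, for each decoded frame of a 60 FPS stream, whether to start a
// rendering task so that roughly targetFps out of every 60 frames render.
final class FramePacer {
    private static final int INPUT_FPS = 60;
    private double credit = 0.0;

    boolean shouldRender(int targetFps) {
        if (targetFps <= 0) {
            return false;              // 0 FPS: reuse the previous frame's interface
        }
        credit += (double) targetFps / INPUT_FPS;
        if (credit >= 1.0) {
            credit -= 1.0;             // e.g. a 30 FPS target -> true every other frame
            return true;
        }
        return false;
    }
}
```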
  • In the above solution, the second device obtains the window states of the different projection windows so as to allocate the device's GPU resources and/or processing capability on demand: GPU resources and/or processing power are tilted toward application interfaces with higher user-experience requirements (such as the application interface in the focused window), while the GPU resources and/or processing power allocated to application interfaces with lower user-experience requirements (such as the application interface in a minimized window) are reduced. In this way, the load of the second device can be reduced while the fluency and clarity of the projected screen are ensured. Alternatively, through this on-demand allocation of resources, the fluency and clarity of the projected screen can be ensured when the processing capability of the second device is limited.
  • Embodiment 2: The second device can adaptively and dynamically adjust the frame rates corresponding to different application interfaces according to the application categories of the multiple application interfaces projected from the first device to the second device, so as to reduce the GPU pressure of the second device and ensure the smoothness and clarity of the projected screen.
  • FIG. 9 shows a flowchart of another multi-window screen projection method provided by an embodiment of the present application.
  • a multi-window screen projection method provided by this embodiment of the present application may include the following steps S701, S901 and S902:
  • S701: the second device and the first device synchronously display a first interface, where the first interface includes multiple application interfaces.
  • Here, that the second device and the first device synchronously display the first interface means that the second device synchronously displays the first interface projected by the first device to the second device.
  • the first interface is a combination of multiple application interfaces.
  • For example, if the first device is the mobile phone 110, and the mobile phone 110 projects the desktop, short message application interface, video application interface and game application interface of the mobile phone 110 to the notebook computer 120 using the same-source screen projection method, then the first interface is as shown in FIG. 1 and includes the mobile phone desktop, the short message application interface, the video application interface and the game application interface.
  • If the first device is the mobile phone 110, and the mobile phone 110 projects the short message application interface, video application interface and game application interface to the notebook computer 120 using the heterogeneous (different-source) screen projection method, then the first interface is as shown in FIG. 2 and includes the short message application interface, the video application interface and the game application interface.
  • S901: the second device acquires the application categories corresponding to the multiple application interfaces.
  • the above application categories may include, but are not limited to, instant messaging, video, game, office, social, life, shopping, or functional.
  • For example, a short message application can be understood as an instant message class application, a video application can be understood as a video class application, and a game application can be understood as a game class application.
  • the second device may acquire, from the first device, application categories corresponding to the above-mentioned multiple application interfaces projected by the first device to the second device.
  • For example, the application categories corresponding to the multiple application interfaces may be determined from attributes and/or functions of the corresponding applications carried in the video stream from the first device.
  • the second device may determine application categories corresponding to multiple application interfaces according to the application development attributes and/or application data configuration acquired from the first device.
  • the second device may periodically acquire application categories corresponding to multiple application interfaces from the first device.
  • For example, the second device may periodically acquire the application categories corresponding to the multiple application interfaces according to a preset period.
  • the preset period may be preset in the second device.
  • For example, the preset period may be 3 seconds (s).
  • the second device may acquire, from the first device, application categories corresponding to multiple application interfaces in response to receiving a manipulation event from the user.
  • the second device may acquire application categories corresponding to multiple application interfaces in response to the input/output device driver or the sensor driver receiving a manipulation event from the user.
  • The above-mentioned manipulation event may be a manipulation event of the user on any one of the above-mentioned multiple application interfaces. Both acquisition paths are sketched below.
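  • A brief sketch of the two acquisition paths, assuming a 3-second preset period as in the example above; the Runnable handed in stands for a hypothetical helper that queries the first device, not an API defined by this application.

```java
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

final class CategoryAcquirer {
    private final ScheduledExecutorService poller = Executors.newSingleThreadScheduledExecutor();

    // Periodic path: query the first device every 3 seconds (the preset period).
    void startPeriodicAcquisition(Runnable refreshCategoriesFromFirstDevice) {
        poller.scheduleAtFixedRate(refreshCategoriesFromFirstDevice, 0, 3, TimeUnit.SECONDS);
    }

    // Event path: invoked when the input/output device driver or sensor driver
    // reports a user manipulation event on any projected application interface.
    void onManipulationEvent(Runnable refreshCategoriesFromFirstDevice) {
        refreshCategoriesFromFirstDevice.run();
    }
}
```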
  • S902: the second device adaptively adjusts the frame rates corresponding to the multiple application interfaces according to the acquired application categories.
  • the second device may adaptively adjust frame rates corresponding to multiple application interfaces according to a preset policy (eg, a second preset policy).
  • the second preset policy is related to the application category.
  • Based on the second preset policy, device resources and/or processing capabilities are tilted toward the interface rendering of applications with higher requirements (such as game applications), and the resources and/or processing power allocated by the device to the interface rendering of applications with lower requirements (such as functional applications) are reduced.
  • For example, the second device may adaptively adjust the frame rates corresponding to the multiple application interfaces according to the following second preset strategy: in order of the applications' demand for resources and/or processing capability from high to low, the frame rates corresponding to the multiple application interfaces are adjusted from high to low.
  • the degree of demand for resources and processing power from high to low is: game application > video application > instant messaging application
  • correspondingly, the frame rates can be ordered as: game application interface > video application interface > instant message class application interface.
  • the frame rate of a game application interface can be adjusted to 60 FPS (that is, refreshed 60 times per second)
  • the frame rate of a video application interface can be adjusted to 24 FPS (that is, refreshed 24 times per second).
  • the frame rate of an instant message class application interface can likewise be adjusted to 24 FPS (that is, refreshed 24 times per second). A sketch of this second preset strategy follows.
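  • The second preset strategy can likewise be sketched as a mapping from application category to target frame rate. The FPS values are the example figures given above (the text assigns 24 FPS to both video and instant message class interfaces); the enum and class names are illustrative assumptions.

```java
// Illustrative mapping of the second preset strategy: frame rate follows the
// application's demand for resources/processing power, from high to low.
enum AppCategory { GAME, VIDEO, INSTANT_MESSAGE, OFFICE, SOCIAL, LIFE, SHOPPING, FUNCTIONAL }

final class AppCategoryFramePolicy {
    static int targetFps(AppCategory category) {
        switch (category) {
            case GAME:  return 60;  // highest demand: refreshed 60 times per second
            case VIDEO: return 24;  // the text's example rate for video playback
            default:    return 24;  // instant message and other lower-demand categories
        }
    }
}
```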
  • In the above solution, the second device obtains the application categories of the different screencast applications so as to allocate device resources and/or processing capability on demand: for example, device resources and/or processing power are tilted toward the interface rendering of applications with high demand (such as game applications), while the resources and/or processing power allocated to the interface rendering of applications with low demand (such as functional applications) are reduced. In this way, the load of the second device can be reduced while the fluency and clarity of the projected screen are ensured; alternatively, the fluency and clarity of the projected screen can be ensured when the processing capability of the second device is limited.
  • In addition, the solution provided by the second embodiment of the present application can be combined with the solution provided by the first embodiment to adaptively and dynamically adjust the frame rates corresponding to different application interfaces and reduce the GPU pressure of the second device, so as to ensure the fluency and clarity of the projected screen.
  • FIG. 10 shows a flowchart of another multi-window screen projection method provided by an embodiment of the present application.
  • In this implementation, a multi-window screen projection method provided by an embodiment of the present application may include steps S701, S702, S901 and S1001.
  • When the second device performs the above steps S702 and S901, it can adaptively adjust the frame rates corresponding to different application interfaces by comprehensively considering the window states and application categories corresponding to the different application interfaces through step S1001:
  • S1001: the second device adaptively adjusts the frame rates corresponding to one or more application interfaces according to the obtained window states and application categories corresponding to the multiple application interfaces.
  • the second device may adaptively adjust the frame rate corresponding to one or more application interfaces according to different weights corresponding to window states and application categories.
  • A weight is used to indicate the reference degree, or importance, of the corresponding factor. For example, even if an application has high requirements on resources and/or processing power, if its window state indicates that the application interface currently receives little attention from the user (e.g., the window state is a minimized window), the application hardly needs to occupy resources and/or processing power. Therefore, the window state usually carries more weight than the application category; that is, when comprehensively considering the window states and application categories corresponding to different application interfaces, the window states are considered first.
  • For example, the second device can adaptively adjust the frame rates corresponding to one or more application interfaces by adopting a strategy of preferring the minimum value given by the window state and the application category, as in the sketch below.
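  • Under these assumptions, the minimum-preferring combination of the two sketches above reproduces the adjusted rates of FIG. 11: the minimized short message window gives min(0, 24) = 0 FPS, the non-minimized and non-focused video window gives min(30, 24) = 24 FPS, and the focused game window gives min(60, 60) = 60 FPS. This composition is an assumption for illustration, not code from this application.

```java
// Window state is weighted more heavily in that a minimized window forces the
// rate to 0 regardless of category; across both factors the minimum wins.
final class CombinedFramePolicy {
    static int targetFps(WindowState state, AppCategory category) {
        int byState    = WindowStateFramePolicy.targetFps(state);    // e.g. MINIMIZED -> 0
        int byCategory = AppCategoryFramePolicy.targetFps(category); // e.g. VIDEO -> 24
        return Math.min(byState, byCategory);                        // prefer the minimum
    }
}
```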
  • Exemplarily, FIG. 11 shows a schematic diagram of a multi-window collaborative screen projection process provided by an embodiment of the present application, taking as an example a first device projecting a short message application, a video application and a game application to a second device.
  • As shown in FIG. 11, the first device jointly renders the interfaces of the short message application, video application and game application started on the first device, performs color space conversion (FIG. 11 takes conversion to YUV color coding as an example) and video encoding (FIG. 11 takes the H.264 standard as an example), and sends the encoded standard video stream to the second device. The second device completes video decoding (FIG. 11 takes the H.264 standard as an example), color space conversion (FIG. 11 takes conversion to YUV color decoding as an example), picture cropping and sending for display.
  • In this way, the user can manipulate the short message application interface, the video application interface and the game application interface on the second device.
  • The second device obtains (e.g., periodically, or in response to receiving a manipulation event) the window states and application categories corresponding to the short message application interface, the video application interface and the game application interface.
  • the second device determines that the short message application window is currently minimized, the video application window is currently a non-minimized and non-focused window, and the game application window is currently the focus window.
  • Moreover, the second device determines that the short message application is an instant message class application, the video application is a video class application, and the game application is a game class application. The second device then comprehensively considers the obtained window states and application categories and, following the strategy frame rate of the game application window > frame rate of the video application window > frame rate of the short message application window, adaptively adjusts the frame rates corresponding to the short message application interface, the video application interface and the game application interface. Exemplarily, as shown in FIG. 11, according to the window states and application categories of the short message application window, the video application window and the game application window, the second device may adjust the frame rate of the short message application interface to 0 FPS, adjust the frame rate of the video application interface to 24 FPS, and leave the frame rate of the game application interface unadjusted (that is, still 60 FPS).
  • Since the frame rate of the short message application interface is currently adjusted to 0 FPS, the second device may skip cropping the short message application interface during video stream processing; the short message application interface sent for display may then remain the same as that of the previous frame.
  • It should be noted that the embodiments shown in FIG. 7, FIG. 9 or FIG. 10 of the present application may be triggered by the resource occupation and/or processing capability of the second device. For example, when the second device executes the above step S701, if the GPU load of the second device is too high, the second device executes steps S702 and S703 shown in FIG. 7; or the second device executes steps S901 and S902 shown in FIG. 9; or the second device performs steps S702, S901 and S1001 shown in FIG. 10.
  • Conditions for determining that the GPU load is too high may include: the decoding delay of the GPU is greater than a delay threshold, the load rate of the GPU exceeds a load threshold (e.g., 80%), or the number of screencast application interfaces is greater than a number threshold (e.g., 2).
  • Taking determining the GPU load according to whether the decoding delay of the GPU is greater than a preset threshold as an example: in the process of the second device performing the above step S701, if the GPU decoding delay of the second device is greater than the preset threshold (e.g., 10 ms), the second device performs steps S702 and S703 shown in FIG. 7, reducing the frame rate of application interfaces with lower user-experience requirements and guaranteeing the frame rate of application interfaces with higher user-experience requirements. For example, the frame rates of the interfaces corresponding to the non-focused windows shown in FIG. 8 (such as a minimized window or a non-minimized and non-focused window) are reduced to guarantee the frame rate of the interface corresponding to the focused window.
  • Similarly, when the second device performs the above step S701, if the GPU decoding delay of the second device is greater than the preset threshold (e.g., 10 ms), the second device performs steps S901 and S902 shown in FIG. 9, reducing the frame rate of application interfaces with a low demand for device resources and/or processing capability and guaranteeing the frame rate of application interfaces with a high demand. For example, the frame rates of the short message application interface and the video application interface are reduced, and the frame rate of the game application interface is guaranteed.
  • Or, if the GPU decoding delay of the second device is greater than the preset threshold, the second device performs steps S702, S901 and S1001 shown in FIG. 10, reducing the frame rate of application interfaces with low user-experience requirements and guaranteeing the frame rate of application interfaces with high user-experience requirements and a high demand for device resources and/or processing capability. For example, the frame rates of the interfaces corresponding to the non-focused windows shown in FIG. 11 (such as a minimized window or a non-minimized and non-focused window) are reduced to guarantee the frame rate of the game application interface, whose window state is the focused window and whose demand for device resources and/or processing capability is high. A sketch of such a trigger check follows.
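  • The trigger just described can be sketched as a predicate over the three load signals named above; the thresholds are the example values from the text (10 ms decoding delay, 80% load rate, 2 projected interfaces), and the method signature is an assumption.

```java
// Returns true when the GPU load of the second device is considered too high,
// which triggers the adaptive adjustment steps of FIG. 7, FIG. 9 or FIG. 10.
final class LoadTrigger {
    static boolean gpuOverloaded(double decodeDelayMs, double gpuLoadRate, int projectedInterfaces) {
        return decodeDelayMs > 10.0       // decoding delay above the delay threshold (e.g. 10 ms)
            || gpuLoadRate > 0.80         // load rate above the load threshold (e.g. 80%)
            || projectedInterfaces > 2;   // more interfaces than the number threshold (e.g. 2)
    }
}
```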
  • Exemplarily, FIG. 12 shows a schematic diagram of a multi-window collaborative screen projection process provided by an embodiment of the present application, taking as an example the first device projecting a short message application, a video application and a game application to the second device.
  • As shown in FIG. 12, the first device jointly renders the interfaces of the short message application, video application and game application started on the first device, performs color space conversion (FIG. 12 takes conversion to YUV color coding as an example) and video encoding (FIG. 12 takes the H.264 standard as an example), and sends the encoded standard video stream to the second device. The second device completes video decoding (FIG. 12 takes the H.264 standard as an example), color space conversion (FIG. 12 takes conversion to YUV color decoding as an example), picture cropping and sending for display.
  • Assume that the second device is currently sending the projected screen (including the short message application interface, the video application interface and the game application interface) for display at a frame rate of 60 FPS.
  • If the GPU decoding delay of the second device is greater than the preset threshold, the second device obtains (e.g., periodically, or in response to receiving a manipulation event) the window states and application categories corresponding to the short message application interface, the video application interface and the game application interface.
  • the second device determines that the short message application window is currently minimized, the video application window is currently a non-minimized and non-focused window, and the game application window is currently the focused window.
  • Moreover, the second device determines that the short message application is an instant message class application, the video application is a video class application, and the game application is a game class application. The second device then comprehensively considers the acquired window states and application categories and, following the strategy frame rate of the game application window > frame rate of the video application window > frame rate of the short message application window, adaptively adjusts the frame rates corresponding to the short message application interface, the video application interface and the game application interface. As shown in FIG. 12, the second device may adjust the frame rate of the short message application interface to 0 FPS, adjust the frame rate of the video application interface to 24 FPS, and leave the frame rate of the game application interface unadjusted (that is, still 60 FPS).
  • Since the frame rate of the short message application interface is currently adjusted to 0 FPS, the second device may skip cropping the short message application interface during video stream processing; the short message application interface sent for display may then remain the same as that of the previous frame.
  • Embodiment 3: The second device can adaptively and dynamically adjust one or more of the application display area size (Display size), the display resolution (Display resolution), the video resolution, and the like according to the number of application interfaces projected from the first device to the second device, so as to ensure the fluency and clarity of the projected screen.
  • FIG. 13 shows a flowchart of another multi-window screen projection method provided by an embodiment of the present application.
  • a multi-window screen projection method provided by an embodiment of the present application may include the following steps S701, S1301-S1302:
  • S701: the second device and the first device synchronously display a first interface, where the first interface includes multiple application interfaces.
  • For the specific introduction of step S701, reference may be made to the introduction of S701 in the first embodiment above.
  • S1301: the second device acquires the current number of screencast application interfaces.
  • Here, the number of screencast application interfaces refers to the number of application interfaces projected by the first device to the second device.
  • For the example shown in FIG. 1, the desktop of the mobile phone 110, the short message application interface, the video application interface and the game application interface are all application interfaces projected by the first device to the second device, so the number of current screencast application interfaces obtained by the second device is 4.
  • For the example shown in FIG. 2, the short message application interface, the video application interface and the game application interface are the application interfaces projected by the first device to the second device, so the number of current screencast application interfaces obtained by the second device is 3.
  • the second device may acquire the current number of screencasting application interfaces from the first device, for example, from a standard video stream from the first device.
  • S1302: the second device adaptively adjusts one or more of the following according to the acquired number of current screencast application interfaces: the application display area size, the display resolution and the video resolution.
  • the application display area refers to a display area used by the second device to display the application interface.
  • the application display area size (Display size) is the size of the display area.
  • the display resolution is used to represent the number of pixels that can be displayed per unit area in the display area of the display screen of the second device used to display the application interface.
  • the video resolution is used to represent the number of pixels that can be displayed in the unit image area of the image frame corresponding to the video stream.
  • the second device may adaptively adjust one or more of the size of the application display area, the display resolution, and the video resolution according to the preset strategy and the number of current screencasting application interfaces.
  • For example, as the number of current screencast application interfaces increases, the second device may increase the length or width of the application display area (i.e., the application display area size) by a multiple, exponentially, or based on a preset calculation formula.
  • For example, when the number of current screencast application interfaces is 1, the Display size may be a1 × b1, where a1 is the length of the Display and b1 is the width of the Display. When the number is 2, the Display size may be 2a1 × b1, where 2a1 is the length of the Display and b1 is the width. When the number is 3, the Display size may be 3a1 × b1, where 3a1 is the length of the Display and b1 is the width.
  • Here, a1 may be the width of the display screen of the first device and b1 its height; or a1 may be the height of the display screen of the first device and b1 its width.
  • Similarly, as the number of current screencast application interfaces increases, the second device can increase the number of pixels displayable in the horizontal dimension or the vertical dimension of the display screen (i.e., the display resolution) by a multiple, exponentially, or based on a preset calculation formula.
  • For example, when the number of current screencast application interfaces is 1, the display resolution may be a2 × b2 pixels (P), where a2 is the number of pixels displayable in the horizontal dimension of the display screen of the second device and b2 is the number of pixels displayable in the vertical dimension. When the number is 2, the display resolution may be a2 × 2b2, where 2b2 is the number of pixels displayable in the vertical dimension. When the number is 3, the display resolution may be a2 × 3b2, where 3b2 is the number of pixels displayable in the vertical dimension.
  • Correspondingly, as the number of current screencast application interfaces increases, the second device can reduce, by a multiple, exponentially, or based on a preset calculation formula, the number of pixels displayable per unit area (e.g., per inch) of the image, starting from the display resolution of the second device. This quantity is the video resolution, also known as the image resolution.
  • The video resolution may be a3 × b3, where a3 is the number of pixels displayable per unit area (e.g., per inch) of the image in the horizontal dimension and b3 is the number displayable per unit area in the vertical dimension. When the number of current screencast application interfaces is 1, a3 = a2 and b3 = b2; when the number is 2, a3 = a2/2 and b3 = b2; when the number is 3, a3 = a2/2 and b3 = 3b2/2. These adjustments are collected in the sketch below.
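  • The per-count adjustments above can be collected into a single lookup. The record and method below are an illustrative sketch of the stated formulas; because the text only enumerates one to three interfaces, counts above three reuse the three-interface case here as an assumption.

```java
// Display size, display resolution and video resolution as functions of the
// number n of projected application interfaces (a1, b1, a2, b2 as defined above).
record ScreenAdjustment(int displayLen, int displayWid, int resW, int resH, int videoW, int videoH) {}

final class ScreenAdjustmentPolicy {
    static ScreenAdjustment adjustFor(int n, int a1, int b1, int a2, int b2) {
        switch (n) {
            case 1:  return new ScreenAdjustment(a1,     b1, a2, b2,     a2,     b2);         // a3 = a2,   b3 = b2
            case 2:  return new ScreenAdjustment(2 * a1, b1, a2, 2 * b2, a2 / 2, b2);         // a3 = a2/2, b3 = b2
            default: return new ScreenAdjustment(3 * a1, b1, a2, 3 * b2, a2 / 2, 3 * b2 / 2); // a3 = a2/2, b3 = 3b2/2
        }
    }
}
```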
  • In the above solution, the second device adaptively adjusts one or more of the application display area size, the display resolution and the video resolution according to the number of application interfaces projected from the first device to the second device.
  • Further, the display clarity can be adaptively adjusted according to the specific GPU load of the second device. For example, when the number of application interfaces is small and the GPU load is light, a higher display resolution and video resolution are maintained to ensure the clarity of the interfaces; when the number of application interfaces is large and the GPU processing capacity is limited (that is, the load is heavy), the display resolution and video resolution are reduced to ensure the smoothness of the interfaces.
  • In addition, the solution provided by the third embodiment of the present application can be combined with the solution provided by the first embodiment and/or the second embodiment, so as to adaptively and dynamically adjust the frame rates, display resolution and video resolution corresponding to different application interfaces and reduce the GPU pressure of the second device, thus ensuring the smoothness and clarity of the projected screen.
  • FIG. 14 shows a flowchart of another multi-window screen projection method provided by an embodiment of the present application.
  • For example, a multi-window screen projection method provided by this embodiment of the present application may include steps S701, S702, S901 and S1001, as well as steps S701, S1301 and S1302.
  • Exemplarily, FIG. 15 shows a schematic diagram of a multi-window collaborative screen projection process provided by an embodiment of the present application, taking as an example the first device projecting a short message application, a video application and a game application to the second device. As shown in FIG. 15, the first device jointly renders the interfaces of the short message application, video application and game application started on the first device, performs color space conversion (FIG. 15 takes conversion to YUV color coding as an example) and video encoding (FIG. 15 takes the H.264 standard as an example), and sends the encoded standard video stream to the second device. The second device completes video decoding (FIG. 15 takes the H.264 standard as an example), color space conversion (FIG. 15 takes conversion to YUV color decoding as an example), picture cropping and sending for display.
  • Assume that the second device is currently sending the projected screen (including the short message application interface, the video application interface and the game application interface) for display at a frame rate of 60 FPS. If the GPU decoding delay of the second device is greater than the preset threshold, the second device obtains (e.g., periodically, or in response to receiving a manipulation event) the window states and application categories corresponding to the short message application interface, the video application interface and the game application interface.
  • the second device determines that the short message application window is currently minimized, the video application window is currently a non-minimized and non-focused window, and the game application window is currently the focused window.
  • Moreover, the second device determines that the short message application is an instant message class application, the video application is a video class application, and the game application is a game class application. The second device then comprehensively considers the acquired window states and application categories and, following the strategy frame rate of the game application window > frame rate of the video application window > frame rate of the short message application window, adaptively adjusts the frame rates corresponding to the short message application interface, the video application interface and the game application interface. As shown in FIG. 15, the second device may adjust the frame rate of the short message application interface to 0 FPS, adjust the frame rate of the video application interface to 24 FPS, and leave the frame rate of the game application interface unadjusted (that is, still 60 FPS).
  • Since the frame rate of the short message application interface is currently adjusted to 0 FPS, the second device may skip cropping the short message application interface during video stream processing; the short message application interface sent for display may then remain the same as that of the previous frame.
  • In addition, the second device can obtain the current number of screencast application interfaces from the first device, so as to adaptively adjust the Display size, display resolution and video resolution of the short message application, video application and game application shown in FIG. 15 according to that number. Exemplarily, the second device may determine that the Display size for the short message application, video application and game application is 3a1 × b1, the display resolution is 2244 × 3240 P, and the video resolution is 1122 × 1620 P, as checked in the snippet below.
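  • As a consistency check, plugging the three-interface case into the adjustFor sketch above with a2 = 2244 and b2 = 1080 reproduces the figures in the text; a1 = 1080 and b1 = 2340 are hypothetical placeholders for the first device's display dimensions.

```java
// Usage check (assumes the ScreenAdjustment sketch above is in scope).
static void fig15ConsistencyCheck() {
    ScreenAdjustment adj = ScreenAdjustmentPolicy.adjustFor(3, 1080, 2340, 2244, 1080);
    assert adj.resW() == 2244 && adj.resH() == 3240;     // display resolution 2244 x 3240 P
    assert adj.videoW() == 1122 && adj.videoH() == 1620; // video resolution   1122 x 1620 P
    assert adj.displayLen() == 3 * 1080;                 // Display size 3a1 x b1
}
```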
  • It should be understood that the sequence numbers of the above processes do not imply an order of execution; the execution order of the processes should be determined by their functions and internal logic, and does not constitute any limitation on the implementation of the embodiments of the present application.
  • the electronic devices include corresponding hardware structures and/or software modules for performing each function.
  • It can be appreciated that, in combination with the units and algorithm steps of the examples described in the embodiments disclosed herein, the present application can be implemented in hardware or in a combination of hardware and computer software. Whether a function is performed by hardware or by computer software driving hardware depends on the specific application and the design constraints of the technical solution. Skilled artisans may implement the described functionality using different methods for each particular application, but such implementations should not be considered beyond the scope of this application.
  • In the embodiments of the present application, the electronic device may be divided into functional modules. For example, each functional module may correspond to one function, or two or more functions may be integrated into one processing module. The integrated module can be implemented in the form of hardware or in the form of a software functional module. It should be noted that the division of modules in the embodiments of the present application is schematic and is only a logical function division; there may be other division manners in actual implementation.
  • FIG. 16 shows a structural block diagram of an electronic device provided in an embodiment of the present application.
  • the electronic device may be the first device or the second device.
  • The electronic device may include a processing unit 1610 and a storage unit 1620.
  • When the electronic device is the second device, the processing unit 1610 is configured to acquire first information when the second device displays, synchronously with the first device, a first interface including multiple application interfaces; and to adaptively adjust, according to the acquired first information, one or more of the following: the frame rates corresponding to the multiple application interfaces, the application display area size corresponding to the multiple application interfaces, the display resolution of the second device, or the video resolutions corresponding to the multiple application interfaces.
  • the processing unit 1610 is used to support the electronic device to perform the above-mentioned steps S702, S703, S901, S902, S1001, S1301 or S1302, and/or other processes for the techniques described herein.
  • The storage unit 1620 is used to store computer programs, as well as the processed data and/or processing results of the methods provided in the embodiments of the present application.
  • the electronic device may further include a transceiver unit 1630 .
  • The transceiver unit 1630 is used to communicate with the peer device. For example, when the electronic device is the second device, it receives interface configuration information, control instructions and the like for the screencast interfaces from the first device; for another example, it sends the user's manipulation events to the first device.
  • the above-mentioned transceiver unit 1630 may include a radio frequency circuit.
  • the electronic device can receive and transmit wireless signals through a radio frequency circuit.
  • radio frequency circuits include, but are not limited to, antennas, at least one amplifier, transceivers, couplers, low noise amplifiers, duplexers, and the like.
  • radio frequency circuits can communicate with other devices through wireless communication.
  • the wireless communication may use any communication standard or protocol, including but not limited to Global System for Mobile Communications, General Packet Radio Service, Code Division Multiple Access, Wideband Code Division Multiple Access, Long Term Evolution, email, short message service, and the like.
  • each module in the electronic device may be implemented in the form of software and/or hardware, which is not specifically limited.
  • the electronic device is presented in the form of functional modules.
  • a “module” herein may refer to an application-specific integrated circuit ASIC, a circuit, a processor and memory executing one or more software or firmware programs, an integrated logic circuit, and/or other devices that may provide the above-described functions.
  • For example, the electronic device may take the form shown in FIG. 18.
  • the processing unit 1610 may be implemented by the processor 1810 shown in FIG. 18 .
  • the transceiver unit 1630 may be implemented by the transceiver 1830 shown in FIG. 18 .
  • For example, the functions of the processing unit may be implemented by the processor executing the computer program stored in the memory.
  • The memory may be a storage unit within the chip, such as a register or a cache; the storage unit may also be a storage unit located outside the chip in the computer device, such as the memory 1820 shown in FIG. 18.
  • When the data transfer is implemented using software, it can be implemented in whole or in part in the form of a computer program product.
  • the computer program product includes one or more computer instructions. When the computer program instructions are loaded and executed on a computer, all or part of the processes or functions described in the embodiments of the present application are realized.
  • the computer may be a general purpose computer, special purpose computer, computer network, or other programmable device.
  • The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another computer-readable storage medium; for example, the computer instructions may be transmitted from a website, computer, server or data center to another website, computer, server or data center by wire (e.g., coaxial cable, optical fiber, digital subscriber line (DSL)) or wirelessly (e.g., infrared, radio, microwave, etc.).
  • the computer-readable storage medium can be any available medium that can be accessed by a computer or a data storage device such as a server, data center, etc. that contains one or more available media integration.
  • The available medium can be a magnetic medium (e.g., floppy disk, hard disk, magnetic tape), an optical medium (such as a digital video disc (DVD)), or a semiconductor medium (such as a solid state disk (SSD)), etc.
  • the steps of the method or algorithm described in conjunction with the embodiments of the present application may be implemented in a hardware manner, or may be implemented in a manner in which a processor executes software instructions.
  • The software instructions can be composed of corresponding software modules, and the software modules can be stored in RAM memory, flash memory, ROM memory, EPROM memory, EEPROM memory, registers, a hard disk, a removable hard disk, a CD-ROM, or any other form of storage medium well known in the art.
  • An exemplary storage medium is coupled to the processor, such that the processor can read information from, and write information to, the storage medium.
  • the storage medium can also be an integral part of the processor.
  • the processor and storage medium may reside in an ASIC. Alternatively, the ASIC may be located in the detection device. Of course, the processor and storage medium may also be present in the detection device as discrete components.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Computer Hardware Design (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Software Systems (AREA)
  • Databases & Information Systems (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

This application discloses a multi-window screen projection method and an electronic device, relating to the field of electronic technology, and capable of ensuring the fluency and clarity of the projected screen during multi-window projection by relieving the image processing load of the electronic device. In this application, in the process of accepting projection from a first device, a second device acquires first information and thereby adaptively adjusts one or more of the following: the frame rate corresponding to the projected interfaces, the application display area size corresponding to the projected interfaces, the display resolution of the second device, or the video resolution corresponding to the projected interfaces. In this way, on-demand allocation of the device's image processing resources and processing capability is achieved, reducing the load of the second device while ensuring the fluency and clarity of the projected screen, or ensuring the fluency and clarity of the projected screen when the processing capability of the second device is limited.

Description

多窗口投屏方法及电子设备
本申请要求于2020年9月10日提交国家知识产权局、申请号为202010949156.X、申请名称为“多窗口投屏方法及电子设备”的中国专利申请的优先权,其全部内容通过引用结合在本申请中。
技术领域
本申请实施例涉及电子技术领域,尤其涉及多窗口投屏方法及电子设备。
背景技术
随着应用显示技术的发展,越来越多的电子设备支持多窗口投屏技术。多窗口投屏技术是通过将一个电子设备(如第一设备)上启动的多个应用界面投屏至另一个电子设备(如第二设备),以实现在第一设备和第二设备上的镜像操控和输入协同。
常规的多窗口投屏技术中,多个应用界面通常以固定帧率(frames per second,FPS)和分辨率投屏至第二设备,基于常规的多窗口投屏技术,在将多个应用界面同时渲染在第二设备上时,通常图形处理器(graphics processing unit,GPU)的占用率很高(例如占用率经常达到80%以上),通信资源(如无线保真(wireless fidelity,WiFi)资源)的吞吐压力也较大。上述问题会导致系统卡顿,从而导致投屏画面卡顿或者不流畅等问题,影响用户体验。
发明内容
本申请实施例提供一种多窗口投屏方法及电子设备,能够在多窗口投屏时,通过缓解电子设备的图像处理负荷,以保证投屏画面的流畅度和清晰度。
为达到上述目的,本申请实施例采用如下技术方案:
第一方面,提供一种多窗口投屏方法,该方法应用于第一设备向第二设备投屏的场景中,该方法包括:第二设备在与第一设备同步显示第一界面时,获取第一信息;其中第一界面包括多个应用界面;第二设备根据获取的第一信息,适应性调整以下中的一个或多个:多个应用界面对应的帧率、多个应用界面对应的应用显示区域尺寸、第二设备的显示分辨率或多个应用界面对应的视频分辨率。
上述第一方面提供的技术方案,第二设备在接受第一设备投屏的过程中,通过获取第一信息,实现对投屏界面对应的帧率、投屏界面对应的应用显示区域尺寸、第二设备的显示分辨率或投屏界面对应的视频分辨率中的一项或多项的适应性调整。以实现设备图像处理资源和处理能力的按需分配,在保证投屏画面的流畅度和清晰度的同时,降低第二设备的负荷。或者,在第二设备处理能力受限时,保证投屏画面的流畅度和清晰度。
在一种可能的实现方式中,上述第一信息包括所述多个应用界面对应的窗口状态;所上述第一信息具体用于第二设备适应性调整多个应用界面对应的帧率;其中,窗口状态包括焦点窗口,非最小化且非焦点窗口和最小化窗口。第二设备在接受第一设备投屏的过程中,通过获取多个应用界面对应的窗口状态,实现对投屏界面对应的帧率的适应性调整。以实现设备图像处理资源和处理能力的按需分配,保证投屏画面的流畅度和清晰度。
在一种可能的实现方式中,上述第二设备根据获取的多个应用界面对应的窗口状态,适应性调整多个应用界面对应的帧率,包括:第二设备按照以下第一预设策略适应性调整多个应用界面对应的帧率:焦点窗口对应的帧率>非最小化且非焦点窗口对应的帧率>最小化窗口对应的帧率。第二设备通过根据预设的策略,适应性根据不同窗口状态,调整投屏界面对应的帧率,以保证投屏画面的流畅度和清晰度。
在一种可能的实现方式中,上述第一信息包括多个应用界面对应的应用类别,第一信息具体用于第二设备适应性调整所述多个应用界面对应的帧率;其中,应用类别包括游戏类、视频类、即时消息类、办公类、社交类、生活类、购物类和功能类中的一个或多个。第二设备在接受第一设备投屏的过程中,通过获取多个应用界面对应的应用类别,实现对投屏界面对应的帧率的适应性调整。以实现设备图像处理资源和处理能力的按需分配,保证投屏画面的流畅度和清晰度。
在一种可能的实现方式中,上述应用类别包括游戏类、视频类和即时消息类;第二设备根据获取的多个应用界面对应的应用类别,适应性调整多个应用界面对应的帧率,包括:第二设备按照以下第二预设策略适应性调整多个应用界面对应的帧率:游戏类应用界面对应的帧率>视频类应用界面对应的帧率>即时消息类应用界面对应的帧率。第二设备通过根据预设的策略,适应性根据不同应用类别,调整投屏界面对应的帧率,以保证投屏画面的流畅度和清晰度。
在一种可能的实现方式中,上述第二设备在与第一设备同步显示第一界面时,获取第一信息,包括:第二设备在与第一设备同步显示第一界面时,若确定第二设备的处理负荷高于预设阈值,获取第一信息。本申请提供的方案可以基于第二设备的处理负荷高于预设阈值来实现,通过该方案,可以在第二设备处理能力受限时,保证投屏画面的流畅度和清晰度。
在一种可能的实现方式中,上述第二设备根据以下中的一个或多个,确定第二设备的处理负荷高于预设阈值:第二设备的GPU的解码时延大于时延阈值、GPU的负载率大于负载阈值、多个应用界面的数量大于数量阈值。本申请提供的方案中,可以通过判断GPU的解码时延是否大于时延阈值、GPU的负载率是否大于负载阈值、多个应用界面的数量是否大于数量阈值来判断第二设备的处理负荷是否高于预设阈值。
在一种可能的实现方式中,上述第一信息包括多个应用界面的数量,第一信息具体用于第二设备适应性调整以下中的一个或多个:多个应用界面对应的应用显示区域尺寸、第二设备的显示分辨率或多个应用界面对应的视频分辨率。第二设备在接受第一设备投屏的过程中,通过获取多个应用界面的数量,实现对投屏界面对应的帧率的适应性调整。以实现设备图像处理资源和处理能力的按需分配,保证投屏画面的流畅度和清晰度。
在一种可能的实现方式中,若多个应用界面的数量为1,则第二设备确定应用界面对应的应用显示区域尺寸为a 1×b 1,第二设备的显示分辨率为a 2×b 2,多个应用界面对应的视频分辨率为a 3×b 3;其中,a 1为应用显示区域的长度,b 1为应用显示区域的宽度;a 2为第二设备显示屏水平维度能够显示的像素数,b 2为第二设备显示屏垂直维度能够显示的像素数;a 3为水平维度单位面积图像内可显示的像素点的数量,b 3为垂直维度单位面积图像内可显示的像素点的数量,a 3=a 2,b 3=b 2
在一种可能的实现方式中,若多个应用界面的数量为2,则第二设备确定应用界面对应的应用显示区域尺寸为2a 1×b 1,第二设备的显示分辨率为a 2×2b 2,多个应用界面对应的视频分辨率为a 3×b 3;其中,2a 1为应用显示区域的长度,b 1为应用显示区域的宽度;a 2为第二设备显示屏水平维度能够显示的像素数,2b 2为第二设备显示屏垂直维度能够显示的像素数;a 3为水平维度单位面积图像内可显示的像素点的数量,b 3为垂直维度单位面积图像内可显示的像素点的数量,a 3=a 2/2,b 3=b 2
在一种可能的实现方式中,若多个应用界面的数量为3,则第二设备确定应用界面对应的应用显示区域尺寸为3a 1×b 1,第二设备的显示分辨率为a 2×3b 2,多个应用界面对应的视频分辨率为a 3×b 3;其中,3a 1为应用显示区域的长度,b 1为应用显示区域的宽度;a 2为第二设备显示屏水平维度能够显示的像素数,3b 2为第二设备显示屏垂直维度能够显示的像素数;a 3为水平维度单位面积图像内可显示的像素点的数量,b 3为垂直维度单位面积图像内可显示的像素点的数量,a 3=a 2/2,b 3=3b 2/2。
第二方面,提供一种电子设备,该电子设备包括:处理单元,用于在该电子设备与第一设备同步显示第一界面时,获取第一信息;其中第一界面包括多个应用界面;以及,用于根据获取的第一信息,适应性调整以下中的一个或多个:多个应用界面对应的帧率、多个应用界面对应的应用显示区域尺寸、第二设备的显示分辨率或多个应用界面对应的视频分辨率。
上述第二方面提供的技术方案,第二设备在接受第一设备投屏的过程中,通过获取第一信息,实现对投屏界面对应的帧率、投屏界面对应的应用显示区域尺寸、第二设备的显示分辨率或投屏界面对应的视频分辨率中的一项或多项的适应性调整。以实现设备图像处理资源和处理能力的按需分配,在保证投屏画面的流畅度和清晰度的同时,降低第二设备的负荷。或者,在第二设备处理能力受限时,保证投屏画面的流畅度和清晰度。
在一种可能的实现方式中,上述第一信息包括所述多个应用界面对应的窗口状态;所上述第一信息具体用于第二设备适应性调整多个应用界面对应的帧率;其中,窗口状态包括焦点窗口,非最小化且非焦点窗口和最小化窗口。第二设备在接受第一设备投屏的过程中,通过获取多个应用界面对应的窗口状态,实现对投屏界面对应的帧率的适应性调整。以实现设备图像处理资源和处理能力的按需分配,保证投屏画面的流畅度和清晰度。
在一种可能的实现方式中,上述处理单元根据获取的多个应用界面对应的窗口状态,适应性调整多个应用界面对应的帧率,包括:处理单元按照以下第一预设策略适应性调整多个应用界面对应的帧率:焦点窗口对应的帧率>非最小化且非焦点窗口对应的帧率>最小化窗口对应的帧率。第二设备通过根据预设的策略,适应性根据不同窗口状态,调整投屏界面对应的帧率,以保证投屏画面的流畅度和清晰度。
在一种可能的实现方式中,上述第一信息包括多个应用界面对应的应用类别,第一信息具体用于第二设备适应性调整所述多个应用界面对应的帧率;其中,应用类别包括游戏类、视频类、即时消息类、办公类、社交类、生活类、购物类和功能类中的一个或多个。第二设备在接受第一设备投屏的过程中,通过获取多个应用界面对应的应用类别,实现对投屏界面对应的帧率的适应性调整。以实现设备图像处理资源和处 理能力的按需分配,保证投屏画面的流畅度和清晰度。
在一种可能的实现方式中,上述应用类别包括游戏类、视频类和即时消息类;上述处理单元根据获取的多个应用界面对应的应用类别,适应性调整多个应用界面对应的帧率,包括:处理单元按照以下第二预设策略适应性调整多个应用界面对应的帧率:游戏类应用界面对应的帧率>视频类应用界面对应的帧率>即时消息类应用界面对应的帧率。第二设备通过根据预设的策略,适应性根据不同应用类别,调整投屏界面对应的帧率,以保证投屏画面的流畅度和清晰度。
在一种可能的实现方式中,上述处理单元在电子设备与第一设备同步显示第一界面时,获取第一信息,包括:处理单元在电子设备与第一设备同步显示第一界面时,若确定第二设备的处理负荷高于预设阈值,获取第一信息。本申请提供的方案可以基于第二设备的处理负荷高于预设阈值来实现,通过该方案,可以在第二设备处理能力受限时,保证投屏画面的流畅度和清晰度。
在一种可能的实现方式中,上述处理单元根据以下中的一个或多个,确定第二设备的处理负荷高于预设阈值:第二设备的GPU的解码时延大于时延阈值、GPU的负载率大于负载阈值、多个应用界面的数量大于数量阈值。本申请提供的方案中,可以通过判断GPU的解码时延是否大于时延阈值、GPU的负载率是否大于负载阈值、多个应用界面的数量是否大于数量阈值来判断第二设备的处理负荷是否高于预设阈值。
在一种可能的实现方式中,上述第一信息包括多个应用界面的数量,第一信息具体用于第二设备适应性调整以下中的一个或多个:多个应用界面对应的应用显示区域尺寸、第二设备的显示分辨率或多个应用界面对应的视频分辨率。第二设备在接受第一设备投屏的过程中,通过获取多个应用界面的数量,实现对投屏界面对应的帧率的适应性调整。以实现设备图像处理资源和处理能力的按需分配,保证投屏画面的流畅度和清晰度。
在一种可能的实现方式中,若多个应用界面的数量为1,则第二设备确定应用界面对应的应用显示区域尺寸为a 1×b 1,第二设备的显示分辨率为a 2×b 2,多个应用界面对应的视频分辨率为a 3×b 3;其中,a 1为应用显示区域的长度,b 1为应用显示区域的宽度;a 2为第二设备显示屏水平维度能够显示的像素数,b 2为第二设备显示屏垂直维度能够显示的像素数;a 3为水平维度单位面积图像内可显示的像素点的数量,b 3为垂直维度单位面积图像内可显示的像素点的数量,a 3=a 2,b 3=b 2
在一种可能的实现方式中,若多个应用界面的数量为2,则第二设备确定应用界面对应的应用显示区域尺寸为2a 1×b 1,第二设备的显示分辨率为a 2×2b 2,多个应用界面对应的视频分辨率为a 3×b 3;其中,2a 1为应用显示区域的长度,b 1为应用显示区域的宽度;a 2为第二设备显示屏水平维度能够显示的像素数,2b 2为第二设备显示屏垂直维度能够显示的像素数;a 3为水平维度单位面积图像内可显示的像素点的数量,b 3为垂直维度单位面积图像内可显示的像素点的数量,a 3=a 2/2,b 3=b 2
在一种可能的实现方式中,若多个应用界面的数量为3,则第二设备确定应用界面对应的应用显示区域尺寸为3a 1×b 1,第二设备的显示分辨率为a 2×3b 2,多个应用界面对应的视频分辨率为a 3×b 3;其中,3a 1为应用显示区域的长度,b 1为应用显示区域的宽度;a 2为第二设备显示屏水平维度能够显示的像素数,3b 2为第二设备显示屏垂直 维度能够显示的像素数;a 3为水平维度单位面积图像内可显示的像素点的数量,b 3为垂直维度单位面积图像内可显示的像素点的数量,a 3=a 2/2,b 3=3b 2/2。
第三方面,提供一种电子设备,该电子设备包括:存储器,用于存储计算机程序;收发器,用于接收或发送无线电信号;处理器,用于执行所述计算机程序,使得电子设备在该电子设备与第一设备同步显示第一界面时,获取第一信息;其中第一界面包括多个应用界面;以及根据获取的第一信息,适应性调整以下中的一个或多个:多个应用界面对应的帧率、多个应用界面对应的应用显示区域尺寸、第二设备的显示分辨率或多个应用界面对应的视频分辨率。
上述第三方面提供的技术方案,第二设备在接受第一设备投屏的过程中,通过获取第一信息,实现对投屏界面对应的帧率、投屏界面对应的应用显示区域尺寸、第二设备的显示分辨率或投屏界面对应的视频分辨率中的一项或多项的适应性调整。以实现设备图像处理资源和处理能力的按需分配,在保证投屏画面的流畅度和清晰度的同时,降低第二设备的负荷。或者,在第二设备处理能力受限时,保证投屏画面的流畅度和清晰度。
在一种可能的实现方式中,上述第一信息包括所述多个应用界面对应的窗口状态;所上述第一信息具体用于第二设备适应性调整多个应用界面对应的帧率;其中,窗口状态包括焦点窗口,非最小化且非焦点窗口和最小化窗口。第二设备在接受第一设备投屏的过程中,通过获取多个应用界面对应的窗口状态,实现对投屏界面对应的帧率的适应性调整。以实现设备图像处理资源和处理能力的按需分配,保证投屏画面的流畅度和清晰度。
在一种可能的实现方式中,上述处理器用于执行所述计算机程序,使得电子设备按照以下第一预设策略适应性调整多个应用界面对应的帧率:焦点窗口对应的帧率>非最小化且非焦点窗口对应的帧率>最小化窗口对应的帧率。第二设备通过根据预设的策略,适应性根据不同窗口状态,调整投屏界面对应的帧率,以保证投屏画面的流畅度和清晰度。
在一种可能的实现方式中,上述第一信息包括多个应用界面对应的应用类别,第一信息具体用于第二设备适应性调整所述多个应用界面对应的帧率;其中,应用类别包括游戏类、视频类、即时消息类、办公类、社交类、生活类、购物类和功能类中的一个或多个。第二设备在接受第一设备投屏的过程中,通过获取多个应用界面对应的应用类别,实现对投屏界面对应的帧率的适应性调整。以实现设备图像处理资源和处理能力的按需分配,保证投屏画面的流畅度和清晰度。
在一种可能的实现方式中,上述应用类别包括游戏类、视频类和即时消息类;上述处理器用于执行所述计算机程序,使得电子设备按照以下第二预设策略适应性调整多个应用界面对应的帧率:游戏类应用界面对应的帧率>视频类应用界面对应的帧率>即时消息类应用界面对应的帧率。第二设备通过根据预设的策略,适应性根据不同应用类别,调整投屏界面对应的帧率,以保证投屏画面的流畅度和清晰度。
在一种可能的实现方式中,上述处理器用于执行所述计算机程序,使得电子设备在电子设备与第一设备同步显示第一界面时,若确定第二设备的处理负荷高于预设阈值,获取第一信息。本申请提供的方案可以基于第二设备的处理负荷高于预设阈值来 实现,通过该方案,可以在第二设备处理能力受限时,保证投屏画面的流畅度和清晰度。
在一种可能的实现方式中,上述处理器根据以下中的一个或多个,确定第二设备的处理负荷高于预设阈值:第二设备的GPU的解码时延大于时延阈值、GPU的负载率大于负载阈值、多个应用界面的数量大于数量阈值。本申请提供的方案中,可以通过判断GPU的解码时延是否大于时延阈值、GPU的负载率是否大于负载阈值、多个应用界面的数量是否大于数量阈值来判断第二设备的处理负荷是否高于预设阈值。
在一种可能的实现方式中,上述第一信息包括多个应用界面的数量,第一信息具体用于第二设备适应性调整以下中的一个或多个:多个应用界面对应的应用显示区域尺寸、第二设备的显示分辨率或多个应用界面对应的视频分辨率。第二设备在接受第一设备投屏的过程中,通过获取多个应用界面的数量,实现对投屏界面对应的帧率的适应性调整。以实现设备图像处理资源和处理能力的按需分配,保证投屏画面的流畅度和清晰度。
在一种可能的实现方式中,若多个应用界面的数量为1,则第二设备确定应用界面对应的应用显示区域尺寸为a 1×b 1,第二设备的显示分辨率为a 2×b 2,多个应用界面对应的视频分辨率为a 3×b 3;其中,a 1为应用显示区域的长度,b 1为应用显示区域的宽度;a 2为第二设备显示屏水平维度能够显示的像素数,b 2为第二设备显示屏垂直维度能够显示的像素数;a 3为水平维度单位面积图像内可显示的像素点的数量,b 3为垂直维度单位面积图像内可显示的像素点的数量,a 3=a 2,b 3=b 2
在一种可能的实现方式中,若多个应用界面的数量为2,则第二设备确定应用界面对应的应用显示区域尺寸为2a 1×b 1,第二设备的显示分辨率为a 2×2b 2,多个应用界面对应的视频分辨率为a 3×b 3;其中,2a 1为应用显示区域的长度,b 1为应用显示区域的宽度;a 2为第二设备显示屏水平维度能够显示的像素数,2b 2为第二设备显示屏垂直维度能够显示的像素数;a 3为水平维度单位面积图像内可显示的像素点的数量,b 3为垂直维度单位面积图像内可显示的像素点的数量,a 3=a 2/2,b 3=b 2
在一种可能的实现方式中,若多个应用界面的数量为3,则第二设备确定应用界面对应的应用显示区域尺寸为3a 1×b 1,第二设备的显示分辨率为a 2×3b 2,多个应用界面对应的视频分辨率为a 3×b 3;其中,3a 1为应用显示区域的长度,b 1为应用显示区域的宽度;a 2为第二设备显示屏水平维度能够显示的像素数,3b 2为第二设备显示屏垂直维度能够显示的像素数;a 3为水平维度单位面积图像内可显示的像素点的数量,b 3为垂直维度单位面积图像内可显示的像素点的数量,a 3=a 2/2,b 3=3b 2/2。
第四方面,提供一种计算机可读存储介质,该计算机可读存储介质上存储有计算机程序代码,该计算机程序代码被处理器执行时实现如第一方面任一种可能的实现方式中的方法。
第五方面,提供一种芯片系统,该芯片系统包括处理器、存储器,存储器中存储有计算机程序代码;所述计算机程序代码被所述处理器执行时,实现如第一方面任一种可能的实现方式中的方法。该芯片系统可以由芯片构成,也可以包含芯片和其他分立器件。
第六方面,提供一种计算机程序产品,当其在计算机上运行时,使得实现如第一 方面任一种可能的实现方式中的方法。
附图说明
图1为本申请实施例提供的一种多窗口投屏的场景示例图;
图2为本申请实施例提供的另一种多窗口投屏的场景示例图;
图3为本申请实施例提供的一种第一设备的硬件结构示意图;
图4为本申请实施例提供的一种第二设备的硬件结构示意图;
图5为本申请实施例提供的一种第一设备向第二设备投屏时的软件交互示意图;
图6为一种第一设备向第二设备投屏的过程示意图;
图7为本申请实施例提供的一种多窗口投屏方法流程图一;
图8为本申请实施例提供的一种多窗口协同投屏过程示意图一;
图9为本申请实施例提供的一种多窗口投屏方法流程图二;
图10为本申请实施例提供的一种多窗口投屏方法流程图三;
图11为本申请实施例提供的一种多窗口协同投屏过程示意图二;
图12为本申请实施例提供的一种多窗口协同投屏过程示意图三;
图13为本申请实施例提供的一种多窗口投屏方法流程图四;
图14为本申请实施例提供的一种多窗口投屏方法流程图五;
图15为本申请实施例提供的一种多窗口协同投屏过程示意图四;
图16为本申请实施例提供的一种电子设备的结构框图;
图17为本申请实施例提供的另一种电子设备的结构框图;
图18为本申请实施例提供的一种电子设备的示意性结构图。
具体实施方式
下面将结合本申请实施例中的附图,对本申请实施例中的技术方案进行描述。其中,在本申请实施例的描述中,除非另有说明,“/”表示或的意思,例如,A/B可以表示A或B;本文中的“和/或”仅仅是一种描述关联对象的关联关系,表示可以存在三种关系,例如,A和/或B,可以表示:单独存在A,同时存在A和B,单独存在B这三种情况。另外,在本申请实施例的描述中,“多个”是指两个或多于两个。
以下,术语“第一”、“第二”仅用于描述目的,而不能理解为指示或暗示相对重要性或者隐含指明所指示的技术特征的数量。由此,限定有“第一”、“第二”的特征可以明示或者隐含地包括一个或者更多个该特征。在本实施例的描述中,除非另有说明,“多个”的含义是两个或两个以上。
本申请实施例提供一种多窗口投屏方法,该方法基于多窗口投屏技术实现。多窗口投屏技术是指设备与设备(如第一设备与第二设备)之间通过建立的通信连接,实现多个应用界面在多个设备上的镜像显示。基于多个应用界面在多个设备上的镜像显示,通过镜像操控和输入协同来实现跨设备的多屏协同交互的功能。
在一些实施例中,基于多个应用界面在多个设备上的镜像显示,通过镜像操控和输入协同还可以实现跨设备且跨系统的多屏协同交互的功能。
例如,在第一设备与第二设备之间建立了用于多窗口投屏的通信连接之后,第二设备上可以同步显示有第一设备上启动的多个应用界面。用户可以通过第二设备的硬件(如键盘、鼠标、麦克风、扬声器等)等在上述应用界面上操作。同时,用户可以 通过第一设备或第二设备打开新的应用界面,以进一步同步至第二设备。另外,用户还可以在第二设备上完成与第一设备的快速数据共享等功能。
请参考图1和图2,图1和图2示出了两种多窗口投屏的场景示例图。如图1或图2所示,假设智能手机110(即第一设备)与笔记本电脑120(即第二设备)之间建立了用于多窗口投屏的通信连接。在多窗口投屏场景中,假设在智能手机110显示手机桌面的同时,接收到用户对智能手机110上短消息应用、视频应用和游戏应用的启动操作。响应于用户对短消息应用、视频应用和游戏应用的启动操作,智能手机110会以自由悬浮窗的形式启动投屏界面,即短消息应用、视频应用和游戏应用。然后,智能手机110将上述自由悬浮窗应用界面与智能手机110的手机桌面共同渲染,其中一部分渲染并送显至智能手机110的主屏幕(main display),一部分渲染在手机110的虚拟屏幕(virtual display)上。以及,智能手机110将虚拟屏幕上渲染的界面对应的Surface编码成标准视频流传输到笔记本电脑120,从而实现多窗口(即多个应用界面的窗口)在智能手机110(即第一设备)和笔记本电脑120(即第二设备)上的协同显示。在一些实施例中,智能手机110还可以将主屏幕和虚拟屏幕上渲染的所有界面对应的Surface编码成标准视频流传输到笔记本电脑120。基于多窗口在第一设备和第二设备上的协同显示,用户可以通过第一设备和第二设备协同操控第一设备向启动的多个应用界面。
其中,智能手机110(即第一设备)送显至智能手机110的主屏幕的界面可以称为默认界面。示例性的,默认界面可以是预设界面,如手机桌面(如图1和图2所示)、设置界面或工具类界面等,默认界面也可以是用户自定义的界面等,本申请不限定。
在本申请实施例中,根据不同的投屏需求,多窗口投屏的方式可以包括同源投屏和异源投屏。
其中,同源投屏是指采取扩展屏幕的方式,将第一设备上启动的多个应用的界面投屏至第二设备。在同源投屏方式下,第一设备采用一路编码向第二设备发送在主屏幕和虚拟屏幕上渲染的所有应用界面对应的Surface编码成的标准视频流,以在第二设备显示屏上显示虚拟屏幕上渲染的所有应用界面(包括默认界面)。其中,默认界面可以理解为送显第一设备的界面。
假设第一设备为手机110,在手机110显示有手机110桌面时,响应于用户点击短消息应用图标(如图1所示“信息”图标)、视频应用图标(如图1所示“华为视频”图标)和游戏应用图标(如图1所示“游戏”图标)的操作,手机110将手机110桌面、短消息应用界面、视频应用界面和游戏应用界面共同渲染在主屏幕和虚拟屏幕上。基于同源投屏,如图1所示,在笔记本电脑120(即第二设备)接收到来自智能手机110(即第一设备)的,包括智能手机110的主屏幕和虚拟屏幕上共同渲染的所有应用界面对应的标准视频流后,笔记本电脑120根据该标准视频流显示智能手机110桌面、短消息应用界面、视频应用界面和游戏应用界面。
在异源投屏方式下,第一设备采用两路编码,其中一路编码将默认界面送显(即在第一设备显示屏上显示)。另一路编码将渲染在虚拟屏幕上的应用界面对应的标准视频流等信息发送至第二设备。
假设第一设备为手机110,在手机110显示有手机110桌面时,响应于用户点击 短消息应用图标(如图2所示“信息”图标)、视频应用图标(如图1所示“华为视频”图标)和游戏应用图标(如图2所示“游戏”图标)的操作,手机110将手机110桌面、短消息应用界面、视频应用界面和游戏应用界面共同渲染在主屏幕和虚拟屏幕上。基于异源投屏,如图2所示,在笔记本电脑120(即第二设备)接收到来自智能手机110(即第一设备)的,包括智能手机110的虚拟屏幕上渲染的应用界面(如短消息应用界面、视频应用界面和游戏应用界面的)对应的标准视频流后,笔记本电脑120根据该标准视频流显示短消息应用界面、视频应用界面和游戏应用界面。
可以理解,同源投屏方式与异源投屏方式各有优缺点。例如,同源投屏方式可以保证应用的连续性;而异源投屏方式,在不同屏幕间切换时,需要重新启动应用。例如,对于图2所示的示例,若需要在第一设备上处理短消息应用界面、视频应用界面和游戏应用界面,则需要将对应应用界面切换回第一设备。具体的,需要完成Display的切换,在Display切换的过程中,应用将不可避免的发生重启。但异源投屏方式具有更好的隔离性。例如,异源投屏方式可以为用户提供独立的操控屏(即第一设备的显示屏和第二设备的显示屏)处理不同的界面。
本申请实施例提供的多窗口投屏方法对于任何投屏方式(包括同源投屏和异源投屏)均适用。通过多窗口投屏技术,可以为用户提供便捷的使用体验。例如,由于笔记本电脑120的显示屏尺寸往往大于智能手机110的显示屏尺寸,因此可以强化和提升用户的观看感受。又如,笔记本电脑120的鼠标可以充当用户的手指,实现在短消息应用界面、视频应用界面和游戏应用界面或手机110桌面上更精准的触控操作。又如,笔记本电脑120的大尺寸物理键盘可以代替智能手机110显示屏上的小尺寸虚拟输入法窗口,实现更好的文字录入体验。又如,笔记本电脑120的多声道立体声扬声器可以代替智能手机110的扬声器,输出来自智能手机110的音频(如来自视频应用界面或游戏应用界面的音频等),实现音量和音质的提升。
其中,在本申请实施例中,第一设备与第二设备之间可以通过“碰一碰”、“扫一扫”(如扫描二维码或条形码)、“靠近自动发现”(如借助蓝牙或无线保真(wireless fidelity,WiFi))等方式建立无线通信连接。其中,第一设备与第二设备之间可以遵循无线传输协议,通过无线连接收发器传输信息。其中,该无线传输协议可以包含但不限于蓝牙(bluetooth,BT)传输协议或无线保真(wireless fidelity,WiFi)传输协议等。例如,WiFi传输协议可以是WiFi P2P传输协议。该无线连接收发器包含但不限于蓝牙,WiFi等收发器。通过无线配对,实现第一设备与第二设备之间的信息传输。其中,第一设备与第二设备之间传输的信息包括但不限于需要显示的内容数据(如标准视频流)和控制指令等。
或者,第一设备与第二设备之间可以建立有线通信连接。例如,第一设备与第二设备之间通过视频图像配接器(video graphics array,VGA)、数字视频接口(digital visual interface,DVI)、高清多媒体接口(high definition multimedia interface,HDMI)或数据传输线等建立有线通信连接。第一设备与第二设备之间通过建立的有线通信连接实现信息传输。本申请不限定第一设备与第二设备之间的具体连接方式。
在本申请实施例中,第一设备和第二设备均包括显示屏。第一设备和第二设备可 以包括但不限于智能手机、上网本、平板电脑、智能手表、智能手环、电话手表、智能相机、掌上电脑、个人计算机(personal computer,PC)、个人数字助理(personal digital assistant,PDA)、便携式多媒体播放器(portable multimedia player,PMP)、(augmented reality,AR)/虚拟现实(virtual reality,VR)设备、电视机、投影设备或人机交互场景中的体感游戏机等。或者,第一设备和第二设备还可以是其他类型或结构的电子设备,本申请不限定。
通常,为了发挥多窗口投屏技术的最大优势,多窗口投屏技术多用于便携设备(即第一设备)与大屏设备(即第二设备)之间。例如,便携设备是智能手机,大屏设备是笔记本电脑。又如,便携设备是平板电脑,大屏设备是电视机。当然,本申请不限定多窗口投屏场景中的具体设备,如上文所述,第一设备和第二设备可以为智能手机、上网本、平板电脑、智能手表、智能手环、电话手表、智能相机、掌上电脑、PDA、PMP、AR/VR设备或电视机等任意支持多窗口投屏的电子设备。
请参考图3,图3以智能手机为例,示出了本申请实施例提供的一种第一设备的硬件结构示意图。如图3所示,第一设备可以包括处理器310,存储器(包括外部存储器接口320和内部存储器321),通用串行总线(universal serial bus,USB)接口330,充电管理模块340,电源管理模块341,电池342,天线1,天线2,移动通信模块350,无线通信模块360,音频模块370,扬声器370A,受话器370B,麦克风370C,耳机接口370D,传感器模块380,按键390,马达391,指示器392,摄像头393,显示屏394,以及用户标识模块(subscriber identification module,SIM)卡接口395等。其中传感器模块380可以包括压力传感器,陀螺仪传感器,气压传感器,磁传感器,加速度传感器,距离传感器,接近光传感器,指纹传感器,温度传感器,触摸传感器,环境光传感器,骨传导传感器等。
可以理解的是,本发明实施例示意的结构并不构成对第一设备的具体限定。在本申请另一些实施例中,第一设备可以包括比图示更多或更少的部件,或者组合某些部件,或者拆分某些部件,或者不同的部件布置。图示的部件可以以硬件,软件或软件和硬件的组合实现。
处理器310可以包括一个或多个处理单元。例如:处理器310可以包括应用处理器(application processor,AP),调制解调处理器,图形处理器(graphics processing unit,GPU),图像信号处理器(image signal processor,ISP),飞行控制器,视频编解码器,数字信号处理器(digital signal processor,DSP),基带处理器,和/或神经网络处理器(neural-network processing unit,NPU)等。其中,不同的处理单元可以是独立的器件,也可以集成在一个或多个处理器中。
处理器310中还可以设置存储器,用于存储指令和数据。在一些实施例中,处理器310中的存储器为高速缓冲存储器。该存储器可以保存处理器310刚用过或循环使用的指令或数据。如果处理器310需要再次使用该指令或数据,可从所述存储器中直接调用。避免了重复存取,减少了处理器310的等待时间,因而提高了系统的效率。
在一些实施例中,处理器310可以包括一个或多个接口。接口可以包括集成电路(inter-integrated circuit,I2C)接口,集成电路内置音频(inter-integrated circuit sound,I2S)接口,脉冲编码调制(pulse code modulation,PCM)接口,通用异步收发传输器 (universal asynchronous receiver/transmitter,UART)接口,移动产业处理器接口(mobile industry processor interface,MIPI),通用输入输出(general-purpose input/output,GPIO)接口,用户标识模块(subscriber identity module,SIM)接口,和/或通用串行总线(universal serial bus,USB)接口等。
充电管理模块340用于从充电器接收充电输入。其中,充电器可以是无线充电器,也可以是有线充电器。在一些有线充电的实施例中,充电管理模块340可以通过USB接口330接收有线充电器的充电输入。在一些无线充电的实施例中,充电管理模块340可以通过第一设备的无线充电线圈接收无线充电输入。充电管理模块340为电池342充电的同时,还可以通过电源管理模块341为第一设备供电。
电源管理模块341用于连接电池342,充电管理模块340与处理器310。电源管理模块341接收电池342和/或充电管理模块340的输入,为处理器310,内部存储器321,显示屏394,摄像组件393,和无线通信模块360等供电。电源管理模块341还可以用于监测电池容量,电池循环次数,电池健康状态(漏电,阻抗)等参数。在其他一些实施例中,电源管理模块341也可以设置于处理器310中。在另一些实施例中,电源管理模块341和充电管理模块340也可以设置于同一个器件中。
第一设备的无线通信功能可以通过天线1,天线2,移动通信模块350,无线通信模块360,调制解调处理器以及基带处理器等实现。
天线1和天线2用于发射和接收电磁波信号。第一设备中的每个天线可用于覆盖单个或多个通信频段。不同的天线还可以复用,以提高天线的利用率。例如:可以将天线1复用为无线局域网的分集天线。在另外一些实施例中,天线可以和调谐开关结合使用。
移动通信模块350可以提供应用在第一设备上的包括2G/3G/4G/5G等无线通信的解决方案。移动通信模块350可以包括至少一个滤波器,开关,功率放大器,低噪声放大器(low noise amplifier,LNA)等。移动通信模块350可以由天线1接收电磁波,并对接收的电磁波进行滤波,放大等处理,传送至调制解调处理器进行解调。移动通信模块350还可以对经调制解调处理器调制后的信号放大,经天线1转为电磁波辐射出去。在一些实施例中,移动通信模块350的至少部分功能模块可以被设置于处理器310中。在一些实施例中,移动通信模块350的至少部分功能模块可以与处理器310的至少部分模块被设置在同一个器件中。
调制解调处理器可以包括调制器和解调器。其中,调制器用于将待发送的低频基带信号调制成中高频信号。解调器用于将接收的电磁波信号解调为低频基带信号。随后解调器将解调得到的低频基带信号传送至基带处理器处理。低频基带信号经基带处理器处理后,被传递给应用处理器。应用处理器通过音频设备(不限于扬声器370A、受话器370B等)输出声音信号,或通过显示屏394显示图像或视频。在一些实施例中,调制解调处理器可以是独立的器件。在另一些实施例中,调制解调处理器可以独立于处理器310,与移动通信模块350或其他功能模块设置在同一个器件中。
无线通信模块360可以提供应用在第一设备上的包括无线局域网(wireless local area networks,WLAN)(如WiFi网络),蓝牙BT,全球导航卫星系统(global navigation satellite system,GNSS),调频(frequency modulation,FM),近距离无线通信技术 (near field communication,NFC),红外技术(infrared,IR)等无线通信的解决方案。无线通信模块360可以是集成至少一个通信处理模块的一个或多个器件。无线通信模块360经由天线2接收电磁波,将电磁波信号调频以及滤波处理,将处理后的信号发送到处理器310。无线通信模块360还可以从处理器310接收待发送的信号,对其进行调频,放大,经天线2转为电磁波辐射出去。
在一些实施例中,第一设备的天线1和移动通信模块350耦合,天线2和无线通信模块360耦合,使得第一设备可以通过无线通信技术与网络以及其他设备通信。所述无线通信技术可以包括全球移动通讯系统(global system for mobile communications,GSM),通用分组无线服务(general packet radio service,GPRS),码分多址接入(code division multiple access,CDMA),宽带码分多址(wideband code division multiple access,WCDMA),时分码分多址(time-division code division multiple access,TD-SCDMA),长期演进(long term evolution,LTE),BT,GNSS,WLAN,NFC,FM,和/或IR技术等。所述GNSS可以包括全球卫星定位系统(global positioning system,GPS),全球导航卫星系统(global navigation satellite system,GLONASS),北斗卫星导航系统(beidou navigation satellite system,BDS)和/或星基增强系统(satellite based augmentation systems,SBAS)。
第一设备通过GPU,显示屏394,以及应用处理器等实现显示功能。GPU为图像处理的微处理器,连接显示屏394和应用处理器。GPU用于执行数学和几何计算,用于图形渲染。处理器310可包括一个或多个GPU,其执行程序指令以生成或改变显示信息。在本申请实施例中,GPU可以用于将计算机系统所需要的显示信息进行转换驱动,并向显示器提供行扫描信号,控制显示器的正确显示。
显示屏394用于显示图像,视频等。显示屏394包括显示面板。显示面板可以采用液晶显示屏(liquid crystal display,LCD),有机发光二极管(organic light-emitting diode,OLED),有源矩阵有机发光二极体或主动矩阵有机发光二极体(active-matrix organic light emitting diode的,AMOLED),柔性发光二极管(flex light-emitting diode,FLED),Miniled,MicroLed,Micro-oLed,量子点发光二极管(quantum dot light emitting diodes,QLED)等。在一些实施例中,第一设备可以包括1个或N个显示屏394,N为大于1的正整数。
The first device can implement a shooting function through the ISP, the camera assembly 393, the video codec, the GPU, the display 394, the application processor, and the like.
The external memory interface 320 may be used to connect an external memory card, such as a Micro SD card, to extend the storage capability of the first device. The external memory card communicates with the processor 310 through the external memory interface 320 to implement a data storage function, for example, saving music, video, and other files on the external memory card.
The internal memory 321 may be configured to store computer-executable program code, where the executable program code includes instructions. The internal memory 321 may include a program storage area and a data storage area. The program storage area may store an operating system, an application required by at least one function (such as a sound playing function or an image playing function), and the like. The data storage area may store data created during use of the first device (such as audio data and a phone book). In addition, the internal memory 321 may include a high-speed random access memory, and may further include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or a universal flash storage (UFS). The processor 310 performs various functional applications and data processing of the first device by running the instructions stored in the internal memory 321 and/or the instructions stored in the memory disposed in the processor.
The first device can implement audio functions, such as music playing and recording, through the audio module 370, the speaker 370A, the receiver 370B, the microphone 370C, the application processor, and the like. For the specific working principles and functions of the audio module 370, the speaker 370A, the receiver 370B, and the microphone 370C, refer to the descriptions in the conventional technology.
The keys 390 include a power key, volume keys, and the like. The keys 390 may be mechanical keys or touch keys. The first device may receive key inputs and generate key signal inputs related to user settings and function control of the first device.
The motor 391 may generate vibration prompts. The motor 391 may be used for incoming-call vibration prompts and for touch vibration feedback. For example, touch operations acting on different applications (such as photographing and audio playing) may correspond to different vibration feedback effects. Touch operations acting on different regions of the display 394 may also correspond to different vibration feedback effects of the motor 391. Different application scenarios (such as time reminders, message receiving, alarm clocks, and games) may likewise correspond to different vibration feedback effects. The touch vibration feedback effects may further be customized.
The indicator 392 may be an indicator light, and may be used to indicate a charging state and a battery-level change, or to indicate messages, missed calls, notifications, and the like.
The SIM card interface 395 is configured to connect a SIM card. A SIM card can be inserted into or removed from the SIM card interface 395 to come into contact with or be separated from the first device. The first device may support 1 or N SIM card interfaces, where N is a positive integer greater than 1. The SIM card interface 395 may support a Nano SIM card, a Micro SIM card, a SIM card, and the like. Multiple cards may be inserted into a same SIM card interface 395 at the same time; the multiple cards may be of the same type or of different types. The SIM card interface 395 may also be compatible with different types of SIM cards, and may further be compatible with an external memory card. The first device interacts with the network through the SIM card to implement functions such as calls and data communication. In some embodiments, the first device uses an eSIM, that is, an embedded SIM card. The eSIM card may be embedded in the first device and cannot be separated from it.
It should be noted that the hardware modules included in the first device shown in FIG. 3 are described only as examples and do not limit the specific structure of the first device. For example, the first device may further include other functional modules.
As an example, FIG. 4 is a schematic diagram of the hardware structure of a second device, taking a notebook computer as the second device. As shown in FIG. 4, the notebook computer may include: a processor 410, an external memory interface 420, an internal memory 421, a USB interface 430, a power management module 440, an antenna 450, a wireless communication module 460, an audio module 470, a speaker 470A, a microphone 470C, a speaker box interface 470B, a mouse 480, a keyboard 490, an indicator 491, a camera 493, and a display 492.
It can be understood that the structure illustrated in this embodiment does not constitute a specific limitation on the notebook computer. In some other embodiments, the notebook computer may include more or fewer components than shown, combine some components, split some components, or arrange the components differently. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor 410 may include one or more processing units. For example, the processor 410 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an ISP, a controller, a memory, a video codec, a DSP, a baseband processor, and/or an NPU. Different processing units may be independent components or may be integrated in one or more processors.
The controller may be the nerve center and command center of the notebook computer. The controller may complete instruction fetching according to instructions, generate operation control signals, and then execute the control of the instructions.
A memory may further be disposed in the processor 410 to store instructions and data. In some embodiments, the memory in the processor 410 is a cache, which may hold instructions or data that the processor 410 has just used or uses cyclically. If the processor 410 needs the instructions or data again, it can fetch them directly from the cache, which avoids repeated accesses, reduces the waiting time of the processor 410, and thus improves system efficiency. In some embodiments, the processor 410 may include one or more interfaces, for example an I2C interface, an I2S interface, a PCM interface, a UART interface, a MIPI, a GPIO interface, and/or a USB interface.
It can be understood that the interface connection relationships between the modules illustrated in this embodiment are only schematic and do not constitute a structural limitation on the notebook computer. In some other embodiments, the notebook computer may also adopt interface connection manners different from those in the foregoing embodiment, or a combination of multiple interface connection manners.
The power management module 440 is configured to connect to a power supply. The power management module 440 may further be connected to the processor 410, the internal memory 421, the display 492, the camera 493, the wireless communication module 460, and the like. The power management module 440 receives the input of the power supply and supplies power to the processor 410, the internal memory 421, the display 492, the camera 493, the wireless communication module 460, and the like. In some embodiments, the power management module 440 may alternatively be disposed in the processor 410.
The wireless communication function of the notebook computer may be implemented through the antenna 450, the wireless communication module 460, and the like. The wireless communication module 460 may provide solutions for wireless communication applied on the notebook computer, including wireless local area networks (WLAN) (such as a WiFi network), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR), and the like.
The wireless communication module 460 may be one or more components integrating at least one communication processing module. The wireless communication module 460 receives electromagnetic waves via the antenna, performs frequency modulation and filtering on the electromagnetic wave signals, and sends the processed signals to the processor 410. The wireless communication module 460 may also receive a to-be-sent signal from the processor 410, perform frequency modulation and amplification on it, and convert it into an electromagnetic wave for radiation through the antenna. In some embodiments, the antenna of the notebook computer is coupled to the wireless communication module 460, so that the notebook computer can communicate with networks and other devices through wireless communication technologies.
The notebook computer implements a display function through the GPU, the display 492, the application processor, and the like. The GPU is a microprocessor for image processing, and connects the display 492 and the application processor. The GPU is configured to perform mathematical and geometric computation for graphics rendering. The processor 410 may include one or more GPUs, which execute program instructions to generate or change display information. The display 492 is configured to display images, videos, and the like, and includes a display panel. In the embodiments of this application, the GPU may be configured to convert and drive the display information required by the computer system, and provide row scanning signals to the display to control its correct display.
The notebook computer can implement a shooting function through the ISP, the camera 493, the video codec, the GPU, the display 492, the application processor, and the like. The ISP is configured to process data fed back by the camera 493. In some embodiments, the ISP may be disposed in the camera 493.
The digital signal processor is configured to process digital signals; in addition to digital image signals, it can process other digital signals. For example, when the notebook computer performs frequency selection, the digital signal processor is configured to perform a Fourier transform and the like on the frequency energy. The video codec is configured to compress or decompress digital video. The notebook computer may support one or more video codecs, so that it can play videos in multiple coding formats, such as moving picture experts group (MPEG) 1, MPEG2, MPEG3, and MPEG4.
The external memory interface 420 may be used to connect an external memory card, such as a Micro SD card, to extend the storage capability of the notebook computer. The external memory card communicates with the processor 410 through the external memory interface 420 to implement a data storage function, for example, saving music, video, and other files on the external memory card.
The internal memory 421 may be configured to store computer-executable program code, where the executable program code includes instructions. The processor 410 performs various functional applications and data processing of the notebook computer, for example the method in the embodiments of this application, by running the instructions stored in the internal memory 421. The internal memory 421 may include a program storage area and a data storage area.
The program storage area may store an operating system, an application required by at least one function (such as a sound playing function or an image playing function), and the like. The data storage area may store data created during use of the notebook computer (such as audio data and a phone book). In addition, the internal memory 421 may include a high-speed random access memory, and may further include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or a universal flash storage (UFS).
The notebook computer can implement audio functions, such as music playing and recording, through the audio module 470, the speaker 470A, the microphone 470C, the speaker box interface 470B, the application processor, and the like.
The indicator 491 may be an indicator light and may be used to indicate whether the notebook computer is powered on or off. For example, when the indicator light is off, the notebook computer may be indicated as powered off; when the indicator light is on, the notebook computer may be indicated as powered on.
It can be understood that the structure illustrated in this embodiment of this application does not constitute a specific limitation on the notebook computer. It may have more or fewer components than shown in FIG. 4, may combine two or more components, or may have a different component configuration. For example, the notebook computer may further include components such as a speaker box. The various components shown in FIG. 4 may be implemented in hardware, software, or a combination of hardware and software, including one or more signal-processing or application-specific integrated circuits.
By way of example, the software systems of the first device and the second device provided in the embodiments of this application may adopt a layered architecture, an event-driven architecture, a microkernel architecture, a microservices architecture, a cloud architecture, or the like. For example, the software system may include, but is not limited to, operating systems such as Symbian, Android, Windows, iOS, Blackberry, and Harmony; this application imposes no limitation on this.
Referring to FIG. 5, FIG. 5 takes the layered-architecture Android operating system as an example to describe the software interaction when the first device projects its screen to the second device in the embodiments of this application. The layered architecture divides the software into several layers, each of which has a clear role and division of labor, and the layers communicate with each other through software interfaces. As shown in FIG. 5, the software structures of the first device and the second device can be divided, from top to bottom, into the application layer (application layer for short), the application framework layer (framework layer for short), the system libraries and Android runtime, and the kernel layer (also called the driver layer).
The application layer may include a series of application packages, such as camera, gallery, calendar, calls, maps, navigation, Bluetooth, music, video, and messaging applications. For ease of description, an application program is hereinafter referred to simply as an application. An application on the first device may be a native application (for example, an application installed in the first device when the operating system was installed before the first device left the factory) or a third-party application (for example, an application downloaded and installed by the user through an application store); this is not limited in the embodiments of this application.
The application framework layer provides an application programming interface (API) and a programming framework for the applications at the application layer. As shown in FIG. 5, the application framework layer may include a window manager service (WMS), an activity manager service (AMS), an input manager service (IMS), and a projection management module. In some embodiments, the application framework layer may further include a content provider, a view system, a telephony manager, a resource manager, a notification manager, and the like (not shown in FIG. 5).
The WMS carries data and attributes related to "interfaces" and is used to manage interface-related states, for example, to manage window programs and dispatch events. Managing window programs means outputting display content in an orderly manner to the physical screen or another display device, with the assistance of the application server side and the WMS, according to the display requests of applications. Event dispatching means dispatching user events from the keyboard, physical keys, touchscreen, mouse, trackball, and the like to the corresponding control or window. The window manager service may also obtain the display size, determine whether there is a status bar, lock the screen, capture the screen, and so on.
The AMS is responsible for managing activities, for starting, switching, and scheduling the components in the system, and for managing and scheduling applications. Specifically, the AMS defines data classes for saving processes, activities, and tasks respectively. The data class corresponding to a process may include process file information, memory state information of the process, and the activities, services, and the like contained in the process. Activity information may be saved in the ActivityStack, which schedules application activities in a unified manner. The ActivityStack may specifically save the information, such as interface configuration information, of all running activities (that is, final ArrayList mHistory); for example, running activities may be saved in a new ArrayList. The ActivityStack may also save the information, such as interface configuration information, of historically run activities. Note that an activity does not correspond to an application, whereas an ActivityThread does; therefore, Android's ability to run multiple applications at the same time is in fact its ability to run multiple ActivityThreads at the same time.
In Android, the basic idea of activity scheduling is as follows: each application process reports to the AMS when it wants to start a new activity or stop the current activity. The AMS keeps internal records for all application processes; when the AMS receives a start or stop report, it first updates the internal records and then notifies the corresponding client process to run or stop the specified activity. Because the AMS holds records of all activities, it can schedule these activities and automatically close background activities according to the states of the activities and of system memory.
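For illustration only, the report-update-notify sequence described above can be modeled by the following minimal sketch in Java; the class and method names are hypothetical simplifications and are not the actual AMS source code.
    import java.util.ArrayDeque;
    import java.util.Deque;
    // Hypothetical, simplified model of the AMS scheduling flow described above.
    class ActivityManagerServiceSketch {
        // Internal record of running activities, analogous to ActivityStack/mHistory.
        private final Deque<String> history = new ArrayDeque<>();
        // An application process reports that it wants to start a new activity.
        void reportStart(String activity, Runnable clientStart) {
            history.push(activity);   // 1. update the internal record first
            clientStart.run();        // 2. then notify the client process to run it
        }
        // An application process reports that the current activity should stop.
        void reportStop(Runnable clientStop) {
            if (!history.isEmpty()) {
                history.pop();        // 1. update the internal record
                clientStop.run();     // 2. then notify the client process to stop it
            }
        }
    }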
The IMS may be used to translate, encapsulate, and otherwise process raw input events to obtain input events containing more information, and send them to the WMS. The WMS stores the clickable regions (such as controls) of each application, the position information of the focus window, and the like, so the WMS can correctly dispatch input events to the specified control or focus window.
The content provider is used to store and retrieve data and make the data accessible to applications. The data may include videos, images, audio, calls made and received, browsing history and bookmarks, a phone book, and the like.
The view system includes visual controls, such as controls for displaying text and controls for displaying pictures. The view system can be used to build applications. A display interface may be composed of one or more views; for example, a display interface including an SMS notification icon may include a view for displaying text and a view for displaying pictures.
The telephony manager is used to provide the communication functions of the first device, for example, management of call states (including connected, hung up, and the like).
The resource manager provides applications with various resources, such as localized strings, icons, pictures, layout files, and video files.
The notification manager enables applications to display notification information in the status bar; it can be used to convey informational messages that may disappear automatically after a short stay without user interaction. For example, the notification manager is used to notify that a download is complete, to give message reminders, and so on. The notification manager may also present notifications that appear in the system status bar at the top in the form of charts or scrolling text, such as notifications of applications running in the background, or notifications that appear on the screen in the form of dialog windows. Examples include text prompts in the status bar, prompt tones, vibration of the electronic device, and blinking of the indicator light.
The projection management module is responsible for managing projection-related matters, for example, transmitting the video streams and interface configuration parameters corresponding to application interfaces, or receiving and distributing screen-rotation requests from the projection destination device (such as the second device). By way of example, the projection management module may be Huawei's Assistant or Manager. For example, Assistant may be a module for exchanging projection-related information with other electronic devices (such as the second device); for instance, Assistant may provide the API and programming framework for the first device to communicate with other electronic devices (such as the second device). By way of example, Manager may be PC Manager, PC Assistant, or the like.
The system libraries and Android runtime contain the functional functions that the framework (FWK) needs to call, the Android core libraries, and the Android virtual machine.
The system libraries may include multiple functional modules, for example a browser kernel, three-dimensional (3D) graphics, and font libraries, as well as a surface manager, media libraries, a three-dimensional graphics processing library (for example, OpenGL ES), and a 2D graphics engine (for example, SGL).
The surface manager is used to manage the display subsystem and provides the fusion of 2D and 3D layers for multiple applications.
The media libraries support playback and recording of a variety of common audio and video formats, as well as static image files. The media libraries can support multiple audio and video coding formats, such as MPEG4, H.264, MP3, AAC, AMR, JPG, and PNG.
The three-dimensional graphics processing library is used to implement three-dimensional graphics drawing, image rendering, composition, layer processing, and the like.
The 2D graphics engine is a drawing engine for 2D drawing.
The kernel layer is the layer between hardware and software. The kernel layer may contain the display driver, input/output device drivers (for example, for the keyboard, touchscreen, headset, speaker, and microphone), device nodes, the camera driver, the audio driver, the sensor driver, and the like. The user performs input operations through an input device, and the kernel layer can generate corresponding raw input events according to the input operations and store them in device nodes. The input/output device drivers can detect the user's input events, for example, an operation by the user to start an application.
In the embodiments of this application, during projection from the first device to the second device, the user can operate, on the second device, the application interfaces projected from the first device to the second device. The input/output device driver or sensor driver of the second device can detect the user's input event; for example, the input event may be the user tapping a button on an interface to enter the next-level interface of that interface, or the user rotating the display of the second device. The input/output device driver or sensor driver of the second device reports the user's input event to the IMS. The IMS synchronizes the input event through the projection management module (such as Assistant or Manager) to the projection management module (such as Assistant or Manager) of the first device. The projection management module of the first device distributes the input event to the corresponding application. The application calls the startActivity interface in the AMS to start the activity corresponding to the input event. The AMS calls the WMS interface according to the startup parameters. The WMS draws the window corresponding to the activity according to the startup parameters and refreshes the application interface configuration parameters. Then, the projection management module of the first device encodes the Surface corresponding to the refreshed application interface configuration parameters into a standard video stream and re-synchronizes it to the projection management module of the second device. According to the received standard video stream, the projection management module of the second device calls the display driver again through the WMS to implement synchronous display on the display of the second device.
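As a minimal sketch of the device-to-device event path described above (the class, interface, and serialization format below are hypothetical, standing in for whatever transport the projection management module actually uses):
    // Sketch: the second device's IMS hands a translated input event to its
    // projection management module, which forwards it to the first device,
    // where it is dispatched to the target application.
    class ProjectionEventChannel {
        interface Transport { void send(byte[] payload); }   // e.g. a Wi-Fi socket
        private final Transport toFirstDevice;
        ProjectionEventChannel(Transport toFirstDevice) {
            this.toFirstDevice = toFirstDevice;
        }
        // Called by the second device after translating the raw input event.
        void syncToFirstDevice(int windowId, int action, float x, float y) {
            // Serialize just enough for the first device to re-dispatch the event.
            String msg = windowId + ":" + action + ":" + x + ":" + y;
            toFirstDevice.send(msg.getBytes(java.nio.charset.StandardCharsets.UTF_8));
        }
    }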
It should be noted that FIG. 5 uses the layered-architecture Android system only as an example to describe the software interaction between devices during multi-window projection. This application does not limit the specific architectures of the software systems of the first device and the second device; for details about software systems of other architectures, refer to the conventional technology.
For ease of understanding, some technical terms involved in the embodiments of this application are explained below:
Frame rate (frames per second, FPS): the number of image frames in one second, which can also be understood as the number of times the graphics processor can refresh per second. The frame rate usually affects the smoothness of the picture, and the two are directly proportional: the higher the frame rate, the smoother the picture; the lower the frame rate, the jerkier the picture. Because of the special physiological structure of the human eye, a picture whose frame rate is higher than 16 FPS is usually perceived as continuous; this phenomenon is called persistence of vision.
Resolution: indicates how many pixels can be displayed per unit area, reflecting the fineness of the display. Generally, the more pixels that can be displayed per unit area, the finer the picture; the fewer pixels, the coarser the picture.
The embodiments of this application may specifically involve display resolution and image resolution.
Display resolution: indicates how many pixels can be displayed per unit area of the device display, reflecting the fineness of the screen. Since the points, lines, and surfaces on a device display are all composed of pixels, the more pixels the display can show, the finer the picture, and the more information can be displayed in a display region of the same size. Generally, for a given display resolution, the smaller the display, the clearer the image; for a fixed display size, the higher the display resolution, the clearer the image.
Image resolution: indicates how many pixels can be displayed per unit area of an image. For example, image resolution can be expressed in pixels per inch (ppi) together with the image dimensions (image length and width), or as the number of horizontal pixels by the number of vertical pixels. Image resolution reflects the fineness of the image (that is, the picture). Generally, for a fixed display resolution, the higher the image resolution, the more pixels the image has, and the larger the size and area of the image.
Bitrate (br): the number of data bits transmitted per unit time, for example the number of bits transmitted per unit time, which is why the code rate is also called the bit rate. The unit of bitrate is usually bps (bits per second). Bitrate can be understood as a sampling rate: generally, the higher the sampling rate, the higher the precision, and the closer the processed file is to the original. However, since file size is proportional to the sampling rate, almost all encoding formats focus on how to achieve the least distortion with the lowest bitrate. Around this core, encoding formats such as variable bitrate (VBR), average bitrate (ABR), and constant bitrate (CBR) have been derived.
Generally, at a given bitrate, resolution is inversely related to clarity: the higher the resolution, the less clear the image; the lower the resolution, the clearer the image. At a given resolution, bitrate is directly related to clarity: the higher the bitrate, the clearer the image; the lower the bitrate, the less clear the image.
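A quick arithmetic check of this inverse relationship (the 8 Mbps / 30 FPS figures below are arbitrary illustrative values, not values from this application):
    // At a fixed bitrate, the bits available per pixel shrink as resolution
    // grows, which is why clarity drops.
    class BitBudget {
        static double bitsPerPixel(double bitrateBps, double fps, int width, int height) {
            return bitrateBps / (fps * width * height);
        }
        public static void main(String[] args) {
            // Same 8 Mbps stream at 30 FPS: 1080p vs 720p.
            System.out.println(bitsPerPixel(8_000_000, 30, 1920, 1080)); // ~0.129 bit/pixel
            System.out.println(bitsPerPixel(8_000_000, 30, 1280, 720));  // ~0.289 bit/pixel
        }
    }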
Generally, the process of projecting from the first device to the second device mainly includes: rendering-instruction generation → interface rendering → color space conversion → video encoding → video decoding → color space conversion → frame splitting → sending for display. Interface rendering and video encoding are completed by the first device; video decoding, frame splitting, and sending for display are completed by the second device. Interface rendering here means that the first device jointly renders the multiple application interfaces displayed in multiple windows.
Color space conversion means representing colors in a color coding form that a machine can recognize. For example, the color coding may be YUV color coding, RGB color coding, or the like. YUV color coding defines a pixel's color by luminance and chrominance: Y in YUV denotes luminance, and U and V denote chrominance. Chrominance defines two aspects of color: hue and saturation. RGB color coding is based on the principle of adding the three primary colors of light, red, green, and blue, in different proportions to produce a wide variety of colors. In an RGB image, each pixel has red, green, and blue base colors, each primary color occupying 8 bits (one byte), so one pixel occupies 24 bits (three bytes). For the specific coding schemes and processes of color coding, refer to the explanations in the conventional technology, which are not repeated in this application.
Generally, the coding and decoding capabilities of the codec determine whether color space conversion is needed. For example, if a device supports RGB color decoding but not YUV color decoding, the color coding form needs to be converted from YUV to RGB.
Therefore, in some embodiments, if the codec of the device has the corresponding coding and decoding capability, the above color space conversion step may be omitted. The following embodiments of this application are described using a device that needs to perform color space conversion as an example.
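By way of illustration, one common definition of the RGB-to-YUV conversion (the full-range BT.601/JPEG coefficients; other standards and ranges use different coefficients, so this is one possible sketch rather than the conversion used by any particular device):
    // Full-range BT.601 RGB -> YCbCr for one pixel, 8-bit components.
    class ColorSpace {
        static int[] rgbToYuv(int r, int g, int b) {
            int y  = (int) Math.round( 0.299    * r + 0.587    * g + 0.114    * b);
            int cb = (int) Math.round(-0.168736 * r - 0.331264 * g + 0.5      * b + 128);
            int cr = (int) Math.round( 0.5      * r - 0.418688 * g - 0.081312 * b + 128);
            return new int[] {clamp(y), clamp(cb), clamp(cr)};
        }
        private static int clamp(int v) { return Math.max(0, Math.min(255, v)); }
    }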
Video encoding refers to converting a file in one video format into a file in another video format through a specific compression technique; for example, video encoding may use standards such as H.261, H.263, H.263+, H.263++, or H.264. Video decoding is the reverse process of video encoding. For the different video coding standards and the specific processes of video encoding and decoding, refer to the explanations in the conventional technology, which are not repeated in this application.
Generally, a decoded video stream consists of image frames. An image frame includes the interface configuration information of multiple projected interfaces, such as the application development attributes/application data configuration, the boundary information of the application interfaces, the orientation of the applications, the icons and text on the application interfaces, and the positions, sizes, and colors of the icons and of the displayed text. The application development attributes and application data configuration can be used to reflect one or more of the interface attributes, application category, or application functions. Frame splitting refers to splitting an image frame that includes the projected-interface configuration information into multiple sub-interfaces, for example into multiple application interfaces. Sending for display refers to calling the display driver to start multiple rendering tasks, rendering the split application interfaces in their corresponding windows, and displaying them on the display.
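As a sketch of the frame-splitting step, assuming the boundary information of each application interface has already been parsed into Android Rect objects (the class name FrameSplitter and this exact flow are illustrative, not the actual implementation):
    import android.graphics.Bitmap;
    import android.graphics.Rect;
    import java.util.ArrayList;
    import java.util.List;
    // Split one decoded, composed frame into per-application sub-images.
    class FrameSplitter {
        static List<Bitmap> split(Bitmap frame, List<Rect> appBounds) {
            List<Bitmap> subInterfaces = new ArrayList<>();
            for (Rect r : appBounds) {
                // Crop one application interface out of the composed frame.
                subInterfaces.add(Bitmap.createBitmap(frame, r.left, r.top,
                        r.width(), r.height()));
            }
            return subInterfaces; // each bitmap is then rendered in its own window
        }
    }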
Referring to FIG. 6, FIG. 6 takes the first device projecting an SMS application, a video application, and a game application to the second device as an example to describe a conventional projection process from the first device to the second device. As shown in FIG. 6, when the first device projects to the second device, the first device first jointly renders the SMS application interface, the video application interface, and the game application interface started on the first device. It then performs color space conversion on the rendered interfaces (FIG. 6 takes conversion to YUV color coding as an example), and then video-encodes the color-space-converted interfaces (FIG. 6 takes H.264 as the video coding standard). Finally, the encoded standard video stream is sent (for example, through the projection management module of the first device) to the second device (for example, to the projection management module of the second device). After receiving the standard video stream, the second device first video-decodes it (FIG. 6 takes H.264 as the decoding standard), then performs color space conversion on each decoded frame (FIG. 6 takes YUV color decoding as an example), then splits each frame according to the different interface attributes, for example into the SMS application interface, the video application interface, and the game application interface, and finally sends the split application interfaces for display.
However, in the multi-window projection shown in FIG. 6, because the frame rates of both the SMS application and the game application are 60 FPS, the first device usually encodes all application interfaces at a frame rate of 60 FPS to ensure the completeness of the interface information of the applications requiring a high frame rate. Correspondingly, the multiple application interfaces obtained by the second device through decoding and splitting are all sent for display at a fixed frame rate (such as 60 FPS). When multiple high-frame-rate application interfaces are decoded and rendered on the second device at the same time, the GPU occupancy rate is often high (for example, above 80%). In addition, in conventional multi-window projection technologies, the resolutions of the multiple application interfaces are usually also fixed; when multiple high-resolution application interfaces are decoded and rendered on the second device at the same time, the throughput pressure on communication resources (such as WiFi resources) is usually high. These problems can cause the system to stall, which in turn causes the projected picture to stutter or play unsmoothly, degrading user experience.
To solve the above problems, an embodiment of this application provides a multi-window projection method for ensuring the smoothness and clarity of the projected picture when the first device projects multiple windows to the second device.
For example, the multi-window projection method provided in the embodiments of this application can ensure the smoothness and clarity of the projected picture during multi-window projection from the first device to the second device by reducing the GPU pressure of the second device.
In some embodiments, the second device can adaptively and dynamically adjust the frame rates corresponding to the multiple application interfaces according to the window states corresponding to the multiple application interfaces projected from the first device to the second device, so as to reduce the GPU pressure of the second device and thereby ensure the smoothness and clarity of the projected picture. The window states may include, but are not limited to, a focus window, a non-minimized and non-focus window, and a minimized window.
In some other embodiments, the second device can adaptively and dynamically adjust the frame rates corresponding to different application interfaces according to the application categories corresponding to the multiple application interfaces projected from the first device to the second device, so as to reduce the GPU pressure of the second device and thereby ensure the smoothness and clarity of the projected picture. The application categories may include, but are not limited to, instant messaging, video, game, office, social, lifestyle, shopping, or utility categories.
As another example, the multi-window projection method provided in the embodiments of this application can ensure the smoothness and clarity of the projected picture during multi-window projection from the first device to the second device by adaptively adjusting the display-region size and/or the resolution, where the resolution may include, but is not limited to, the display resolution and the video resolution.
In some embodiments, the second device can adaptively and dynamically adjust one or more of the application display-region (Display) size, the application display-region (Display) resolution (also called the display resolution), and the video resolution according to the number of application interfaces projected from the first device to the second device, so as to ensure the smoothness and clarity of the projected picture. Because a video is composed of successive image frames, the video resolution is also called the image resolution.
In the embodiments of this application, the display-region (Display) size can be understood as the size of the display region of the device display that is used to display application interfaces. The display-region (Display) resolution characterizes how many pixels can be displayed per unit area in the display region of the device display that is used to display application interfaces. The video resolution characterizes how many pixels can be displayed per unit image area of the image frames corresponding to the video stream.
The multi-window projection methods provided in the following embodiments of this application are applicable to both same-source projection and different-source projection. The technical solutions provided in the embodiments of this application are described in detail below with reference to specific embodiments, taking a wireless transmission protocol between the first device and the second device as an example.
Embodiment 1:
In Embodiment 1 of this application, the second device can adaptively and dynamically adjust the frame rates corresponding to different application interfaces according to the window states corresponding to the multiple application interfaces projected from the first device to the second device, so as to reduce the GPU pressure of the second device and thereby ensure the smoothness and clarity of the projected picture.
Referring to FIG. 7, FIG. 7 is a flowchart of a multi-window projection method provided in an embodiment of this application. As shown in FIG. 7, the multi-window projection method provided in this embodiment may include the following steps S701-S703:
S701. The second device displays a first interface synchronously with the first device, where the first interface includes multiple application interfaces.
In the embodiments of this application, the second device displaying the first interface synchronously with the first device means that the second device synchronously displays the first interface projected from the first device to the second device. The first interface is a combination of multiple application interfaces.
For example, assume that the first device is the mobile phone 110, and the mobile phone 110 projects the desktop of the mobile phone 110, an SMS application interface, a video application interface, and a game application interface to the notebook computer 120 in the same-source projection manner. The first interface is then as shown in FIG. 1 and includes the mobile phone desktop, the SMS application interface, the video application interface, and the game application interface.
As another example, the first device is the mobile phone 110, and the mobile phone 110 projects an SMS application interface, a video application interface, and a game application interface to the notebook computer 120 in the different-source projection manner. The first interface is then as shown in FIG. 2 and includes the SMS application interface, the video application interface, and the game application interface.
S702. The second device obtains the window states corresponding to the multiple application interfaces.
The window states may include, but are not limited to, a focus window, a non-minimized and non-focus window, and a minimized window.
In the embodiments of this application, the focus window can be understood as the application window most recently operated by the user, also called the currently active window. A non-minimized and non-focus window can be understood as an application window that is not currently minimized and was not most recently operated by the user. A minimized window can be understood as an application window that is currently minimized.
Taking the first interface shown in FIG. 1 as an example, the desktop of the mobile phone 110, the SMS application interface, the video application interface, and the game application interface are all non-minimized windows. Assuming that the application window most recently operated by the user is the video application interface, the video application interface is the focus window, and the desktop of the mobile phone 110, the SMS application interface, and the game application interface are non-minimized and non-focus windows.
Taking the first interface shown in FIG. 2 as an example, the SMS application interface, the video application interface, and the game application interface are all non-minimized windows. Assuming that the application window most recently operated by the user is the video application interface, the video application interface is the focus window, and the SMS application interface and the game application interface are non-minimized and non-focus windows.
In some embodiments, the second device may obtain the window states corresponding to the multiple application interfaces periodically. For example, the second device may obtain the window states corresponding to the multiple application interfaces periodically according to a preset period. The preset period may be preconfigured in the second device, for example, 3 seconds (s). A minimal sketch of such periodic acquisition follows this paragraph.
In some other embodiments, the second device may obtain the window states corresponding to the multiple application interfaces in response to receiving an operation event from the user. For example, the second device may obtain the window states corresponding to the multiple application interfaces in response to the input/output device driver or sensor driver receiving the user's operation event. By way of example, the operation event may be an operation event by the user on any one of the multiple application interfaces.
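By way of illustration, periodic acquisition with the 3-second preset period mentioned above could be scheduled as follows (the class name and the callback are hypothetical; the sketch only shows the timing, not how window states are actually queried):
    import java.util.concurrent.Executors;
    import java.util.concurrent.ScheduledExecutorService;
    import java.util.concurrent.TimeUnit;
    // Collect the window state of each projected interface on a fixed period.
    class WindowStatePoller {
        private final ScheduledExecutorService scheduler =
                Executors.newSingleThreadScheduledExecutor();
        void start(Runnable collectWindowStates) {
            // Preset period of 3 s, as in the example above.
            scheduler.scheduleAtFixedRate(collectWindowStates, 0, 3, TimeUnit.SECONDS);
        }
        void stop() { scheduler.shutdownNow(); }
    }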
S703. The second device adaptively adjusts the frame rates corresponding to the multiple application interfaces according to the obtained window states corresponding to the multiple application interfaces.
The second device may adaptively adjust the frame rates corresponding to the multiple application interfaces according to a preset policy (such as a first preset policy). The first preset policy is related to the window states.
It can be understood that the user's experience requirements for the transmission and display of application interfaces in different window states differ. By way of example, from high to low, the user's experience requirement is: focus window > non-minimized and non-focus window > minimized window. Therefore, in some embodiments, adaptively adjusting the frame rates corresponding to the multiple application interfaces according to their window states can tilt GPU resources and/or processing capability toward the application interfaces with a higher user-experience requirement (such as the application interface in the focus window), and reduce the GPU resources and/or processing capability that the device allocates to the application interfaces with a lower user-experience requirement (such as the application interface in a minimized window).
For the above reasons, in some embodiments, the second device may adaptively adjust the frame rates corresponding to the multiple application interfaces according to the following first preset policy: adjust the frame rates from high to low according to the user's experience requirement from high to low.
For example, if the user's experience requirement from high to low is: focus window > non-minimized and non-focus window > minimized window, the frame rates can be adjusted so that: focus window > non-minimized and non-focus window > minimized window. By way of example, the frame rate of the application interface corresponding to the focus window may be adjusted to 60 FPS (that is, refreshed 60 times per second), that of the non-minimized and non-focus window to 30 FPS (that is, refreshed 30 times per second), and that of the minimized window to 0 FPS (that is, not refreshed). The focus window has the highest user-experience requirement because it is the application window most recently operated by the user. The minimized window has the lowest because it is currently minimized. The current user-experience requirement of the non-minimized and non-focus window is not high, but the user may operate the application interface in that window at any time, so its requirement lies between those of the focus window and the minimized window.
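The first preset policy with the example values above can be written as a simple lookup (the 60/30/0 FPS values are the illustrative figures from this example, not mandated values):
    // Map a projected window's state to a target frame rate.
    enum WindowState { FOCUS, NON_MINIMIZED_NON_FOCUS, MINIMIZED }
    class FrameRatePolicy {
        static int frameRateFor(WindowState state) {
            switch (state) {
                case FOCUS:                   return 60; // user is interacting here
                case NON_MINIMIZED_NON_FOCUS: return 30; // visible, may regain focus
                case MINIMIZED:               return 0;  // not refreshed at all
                default:                      return 30;
            }
        }
    }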
Referring to FIG. 8, FIG. 8 takes the first device projecting an SMS application, a video application, and a game application to the second device as an example to show a multi-window collaborative projection process provided in an embodiment of this application. As shown in FIG. 8, at the initial projection, the first device jointly renders the interfaces of the SMS application, the video application, and the game application started on the first device, performs color space conversion (FIG. 8 takes conversion to YUV color coding as an example) and video encoding (FIG. 8 takes H.264 as the video coding standard), and sends the encoded standard video stream to the second device. The second device completes video decoding (FIG. 8 takes H.264 as the decoding standard), color space conversion (FIG. 8 takes YUV color decoding as an example), frame splitting, and sending for display.
Assume that the second device currently sends the projected interfaces (including the SMS application interface, the video application interface, and the game application interface) for display at a frame rate of 60 FPS each. Afterwards, while the user operates one or more of the SMS application interface, the video application interface, and the game application interface, the second device can obtain (for example, periodically, or in response to receiving an operation event) the window states corresponding to the SMS application interface, the video application interface, and the game application interface. As shown in FIG. 8, assume that the second device determines that the SMS application window is currently minimized, the video application window is currently a non-minimized and non-focus window, and the game application window is currently the focus window. The second device then adaptively adjusts the frame rates corresponding to the SMS, video, and game application interfaces according to the obtained window states, with the policy that the frame rates satisfy: focus window > non-minimized and non-focus window > minimized window. By way of example, as shown in FIG. 8, the second device may, according to the window states of the SMS, video, and game application windows, adjust the frame rate of the SMS application interface to 0 FPS, adjust the frame rate of the video application interface to 30 FPS, and leave the frame rate of the game application interface unadjusted (that is, still 60 FPS). Because the frame rate of the SMS application interface is currently adjusted to 0 FPS, when processing the video stream, the second device may refrain from refreshing the SMS application interface and send for display the interface configuration information of the SMS application corresponding to a previous image frame; the SMS application interface sent for display may remain the same as the previous frame of the SMS application interface. Adjusting the frame rate of the video application interface from 60 FPS to 30 FPS may specifically include: the second device starts a rendering task every other frame, rendering the video application interface in the video application window and sending it for display. Sending the game application interface for display may specifically include: the second device starts a rendering task for each split frame of the game application interface, rendering it in the game application window and sending it for display.
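The every-other-frame behavior described above amounts to frame decimation, sketched below under the assumption that the decoded stream arrives at a fixed source rate (the class and method names are hypothetical):
    // Decide whether to start a rendering task for a given decoded frame.
    class FrameGate {
        static boolean shouldRender(long frameIndex, int targetFps, int sourceFps) {
            if (targetFps <= 0) return false;              // 0 FPS: reuse the last frame
            int step = Math.max(1, sourceFps / targetFps); // 60 -> 30: every 2nd frame
            return frameIndex % step == 0;
        }
    }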
Through the method provided in Embodiment 1 above, the second device obtains the window states of the different projection windows and allocates the device's GPU resources and/or processing capability on demand according to those window states, for example, tilting GPU resources and/or processing capability toward the application interfaces with a higher user-experience requirement (such as the application interface in the focus window) and reducing the GPU resources and/or processing capability allocated to the application interfaces with a lower user-experience requirement (such as the application interface in a minimized window). In this way, the load of the second device can be reduced while the smoothness and clarity of the projected picture are ensured. Alternatively, through such on-demand resource allocation, the smoothness and clarity of the projected picture can be ensured when the processing capability of the second device is limited.
Embodiment 2:
In Embodiment 2 of this application, the second device can adaptively and dynamically adjust the frame rates corresponding to different application interfaces according to the application categories corresponding to the multiple application interfaces projected from the first device to the second device, so as to reduce the GPU pressure of the second device and thereby ensure the smoothness and clarity of the projected picture.
Referring to FIG. 9, FIG. 9 is a flowchart of another multi-window projection method provided in an embodiment of this application. As shown in FIG. 9, the multi-window projection method provided in this embodiment may include the following steps S701, S901, and S902:
S701. The second device displays a first interface synchronously with the first device, where the first interface includes multiple application interfaces.
In the embodiments of this application, the second device displaying the first interface synchronously with the first device means that the second device synchronously displays the first interface projected from the first device to the second device. The first interface is a combination of multiple application interfaces.
For example, assume that the first device is the mobile phone 110, and the mobile phone 110 projects the desktop of the mobile phone 110, an SMS application interface, a video application interface, and a game application interface to the notebook computer 120 in the same-source projection manner. The first interface is then as shown in FIG. 1 and includes the mobile phone desktop, the SMS application interface, the video application interface, and the game application interface.
As another example, the first device is the mobile phone 110, and the mobile phone 110 projects an SMS application interface, a video application interface, and a game application interface to the notebook computer 120 in the different-source projection manner. The first interface is then as shown in FIG. 2 and includes the SMS application interface, the video application interface, and the game application interface.
S901. The second device obtains the application categories corresponding to the multiple application interfaces.
The application categories may include, but are not limited to, instant messaging, video, game, office, social, lifestyle, shopping, or utility categories.
Taking the first interface shown in FIG. 1 as an example, the desktop of the mobile phone 110 can be understood as a utility application, the SMS application as an instant messaging application, the video application as a video application, and the game application as a game application. Taking the first interface shown in FIG. 2 as an example, the SMS application can be understood as an instant messaging application, the video application as a video application, and the game application as a game application.
In the embodiments of this application, the second device can obtain, from the first device, the application categories corresponding to the multiple application interfaces projected from the first device to the second device. By way of example, the application categories corresponding to the multiple application interfaces may be determined by the attributes and/or functions of the corresponding applications in the video stream from the first device. For example, the second device may determine the application categories corresponding to the multiple application interfaces according to the application development attributes and/or application data configuration obtained from the first device.
In some embodiments, the second device may obtain the application categories corresponding to the multiple application interfaces from the first device periodically. For example, the second device may obtain the application categories corresponding to the multiple application interfaces periodically according to a preset period. The preset period may be preconfigured in the second device, for example, 3 seconds (s).
In some other embodiments, the second device may obtain the application categories corresponding to the multiple application interfaces from the first device in response to receiving an operation event from the user. For example, the second device may obtain the application categories corresponding to the multiple application interfaces in response to the input/output device driver or sensor driver receiving the user's operation event. By way of example, the operation event may be an operation event by the user on any one of the multiple application interfaces.
S902. The second device adaptively adjusts the frame rates corresponding to the multiple application interfaces according to the obtained application categories corresponding to the multiple application interfaces.
The second device may adaptively adjust the frame rates corresponding to the multiple application interfaces according to a preset policy (such as a second preset policy). The second preset policy is related to the application categories.
It can be understood that, when the interfaces of applications of different categories are rendered, their demands on the device's resources (such as GPU resources) and processing capability (such as GPU processing capability) differ if good application functionality and user experience are to be ensured. By way of example, a game application needs to present rich pictures to the user with low latency, so it has the highest requirements on picture smoothness, clarity, and latency; the interface of a game application therefore demands more of the device's resources and/or processing capability when rendered. The demands of a video application during rendering are second to those of a game application, and an instant messaging application, whose data transmission is usually intermittent, demands the least of the device's resources and/or processing capability when rendered.
Therefore, in some embodiments, adaptively adjusting the frame rates corresponding to the multiple application interfaces according to their application categories can tilt resources and/or processing capability toward the rendering of the interfaces of more demanding applications (such as game applications) and reduce the resources and/or processing capability allocated to the rendering of the interfaces of less demanding applications (such as utility applications).
For the above reasons, in some embodiments, the second device may adaptively adjust the frame rates corresponding to the multiple application interfaces according to the following second preset policy: adjust the frame rates from high to low according to the demand for resources and/or processing capability from high to low.
For example, if the demand for resources and processing capability from high to low is: game application > video application > instant messaging application, the frame rates can be adjusted so that: game application interface > video application interface > instant messaging application interface. By way of example, the frame rate of the game application interface may be adjusted to 60 FPS (that is, refreshed 60 times per second), that of the video application interface to 24 FPS (that is, refreshed 24 times per second), and that of the instant messaging application interface to 24 FPS (that is, refreshed 24 times per second).
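The second preset policy with the example values above can likewise be written as a lookup (the categories and the 60/24/24 FPS values are the illustrative figures from this example):
    // Map a projected application's category to a target frame rate.
    enum AppCategory { GAME, VIDEO, INSTANT_MESSAGING, OTHER }
    class CategoryPolicy {
        static int frameRateFor(AppCategory category) {
            switch (category) {
                case GAME:  return 60; // richest pictures, lowest-latency demand
                case VIDEO: return 24; // typical cinematic playback rate
                default:    return 24; // instant messaging and other categories
            }
        }
    }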
Through the method provided in Embodiment 2 above, the second device obtains the application categories of the different projected applications and allocates device resources and/or processing capability on demand according to those categories, for example, tilting device resources and/or processing capability toward the rendering of the interfaces of more demanding applications (such as game applications) and reducing the resources and/or processing capability allocated to the rendering of the interfaces of less demanding applications (such as utility applications). Through such on-demand resource allocation, the load of the second device can be reduced while the smoothness and clarity of the projected picture are ensured. Alternatively, the smoothness and clarity of the projected picture can be ensured when the processing capability of the second device is limited.
It should be noted that the solution provided in Embodiment 2 above can be combined with the solution provided in Embodiment 1 above to adaptively and dynamically adjust the frame rates corresponding to different application interfaces and reduce the GPU pressure of the second device, thereby ensuring the smoothness and clarity of the projected picture.
By way of example, referring to FIG. 10, FIG. 10 is a flowchart of another multi-window projection method provided in an embodiment of this application. As shown in FIG. 10, the multi-window projection method provided in this embodiment may include steps S701, S702, S901, and S1001. When the second device performs steps S702 and S901 above, the second device can, through step S1001, comprehensively consider the window states and application categories corresponding to different application interfaces and adaptively adjust the frame rates corresponding to the different application interfaces:
S1001. The second device adaptively adjusts the frame rates corresponding to one or more application interfaces according to the obtained window states and application categories corresponding to the multiple application interfaces.
In some embodiments, the second device may adaptively adjust the frame rates corresponding to one or more application interfaces according to different weights assigned to the window state and the application category. A weight indicates the reference value or importance of the corresponding factor. For example, even if an application demands much of the resources and/or processing capability, if its window state indicates that the application interface receives little attention from the user (for example, if the window state is a minimized window), the application hardly needs to occupy resources and/or processing capability. Therefore, the weight of the window state is usually greater than that of the application category; that is, when comprehensively considering the window states and application categories corresponding to different application interfaces, the window states are considered first.
In some embodiments, the second device may adaptively adjust the frame rates corresponding to one or more application interfaces according to the window states and application categories by adopting a prefer-the-minimum policy.
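The prefer-the-minimum combination can be sketched as follows, assuming the per-window-state rate and the per-category rate have been computed as in the sketches above; taking the minimum means the window state dominates in practice, since its values include 0:
    // Combine the two hypothetical per-factor rates; the smaller one wins.
    class CombinedPolicy {
        static int combinedFrameRate(int fpsByWindowState, int fpsByCategory) {
            return Math.min(fpsByWindowState, fpsByCategory);
        }
        public static void main(String[] args) {
            System.out.println(combinedFrameRate(60, 60)); // focus game window -> 60
            System.out.println(combinedFrameRate(30, 24)); // visible video window -> 24
            System.out.println(combinedFrameRate(0, 60));  // minimized game window -> 0
        }
    }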
Referring to FIG. 11, FIG. 11 takes the first device projecting an SMS application, a video application, and a game application to the second device as an example to show a multi-window collaborative projection process provided in an embodiment of this application. At the initial projection, the first device jointly renders the interfaces of the SMS application, the video application, and the game application started on the first device, performs color space conversion (FIG. 11 takes conversion to YUV color coding as an example) and video encoding (FIG. 11 takes H.264 as the video coding standard), and sends the encoded standard video stream to the second device. The second device completes video decoding (FIG. 11 takes H.264 as the decoding standard), color space conversion (FIG. 11 takes YUV color decoding as an example), frame splitting, and sending for display.
Assume that the second device currently sends the projected interfaces (including the SMS application interface, the video application interface, and the game application interface) for display at a frame rate of 60 FPS each. Afterwards, while the user operates one or more of the SMS application interface, the video application interface, and the game application interface, the second device can obtain (for example, periodically, or in response to receiving an operation event) the window states and application categories corresponding to the SMS application interface, the video application interface, and the game application interface. As shown in FIG. 11, assume that the second device determines that the SMS application window is currently minimized, the video application window is currently a non-minimized and non-focus window, and the game application window is currently the focus window; and that the second device determines that the SMS application is an instant messaging application, the video application is a video application, and the game application is a game application. The second device then comprehensively considers the obtained window states and application categories and adaptively adjusts the frame rates corresponding to the SMS, video, and game application interfaces, with the policy that the frame rates satisfy: game application window > video application window > SMS application window. By way of example, as shown in FIG. 11, the second device may, according to the window states and application categories of the SMS, video, and game application windows, adjust the frame rate of the SMS application interface to 0 FPS, adjust the frame rate of the video application interface to 24 FPS, and leave the frame rate of the game application interface unadjusted (that is, still 60 FPS). Because the frame rate of the SMS application interface is currently adjusted to 0 FPS, when processing the video stream, the second device may refrain from splitting out the SMS application interface; by way of example, the SMS application interface sent for display may remain the same as the previous frame of the SMS application interface.
In some embodiments, the embodiments shown in FIG. 7, FIG. 9, or FIG. 10 of this application may be triggered by the resource occupancy and/or processing capability of the second device. For example, while the second device performs step S701 above, if the GPU load of the second device is too high, the second device performs steps S702 and S703 shown in FIG. 7; or the second device performs steps S901 and S902 shown in FIG. 9; or the second device performs steps S702, S901, and S1001 shown in FIG. 10.
In some embodiments, the GPU load may be determined to be too high if one or more of the following are satisfied: the decoding latency of the GPU is greater than a latency threshold, the load rate of the GPU exceeds a load threshold (such as 80%), or the number of projected application interfaces is greater than a quantity threshold (such as 2).
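Sketched with the example thresholds just given (10 ms decode latency as in the paragraphs below, 80% load, more than 2 projected interfaces; the class name is hypothetical):
    // Overload test: any one criterion exceeding its threshold triggers the
    // adaptive adjustment of FIG. 7, FIG. 9, or FIG. 10.
    class GpuLoadMonitor {
        static boolean isOverloaded(double decodeLatencyMs, double loadRatio,
                                    int projectedWindowCount) {
            return decodeLatencyMs > 10.0
                    || loadRatio > 0.80
                    || projectedWindowCount > 2;
        }
    }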
Taking judging the GPU load according to whether the GPU decoding latency is greater than a preset threshold as an example: while the second device performs step S701 above, if the GPU decoding latency of the second device is greater than a preset threshold (such as 10 ms), the second device performs steps S702 and S703 shown in FIG. 7, lowering the frame rates of the application interfaces with a lower user-experience requirement and guaranteeing the frame rates of those with a higher user-experience requirement, for example, lowering the interface frame rates corresponding to the non-focus windows shown in FIG. 8 (such as the minimized window or the non-minimized and non-focus window) and guaranteeing the interface frame rate corresponding to the focus window.
As another example, while the second device performs step S701 above, if the GPU decoding latency of the second device is greater than a preset threshold (such as 10 ms), the second device performs steps S901 and S902 shown in FIG. 9, lowering the frame rates of the application interfaces with a lower demand on the device's resources and/or processing capability and guaranteeing the frame rates of those with a higher demand, for example, lowering the frame rates of the SMS application and video application interfaces and guaranteeing the frame rate of the game application interface.
As another example, while the second device performs step S701 above, if the GPU decoding latency of the second device is greater than a preset threshold (such as 10 ms), the second device performs steps S702, S901, and S1001 shown in FIG. 10, lowering the frame rates of the application interfaces with a lower user-experience requirement and guaranteeing the frame rates of the application interfaces with a higher user-experience requirement and a higher demand on the device's resources and/or processing capability, for example, lowering the interface frame rates corresponding to the non-focus windows shown in FIG. 11 (such as the minimized window or the non-minimized and non-focus window) and guaranteeing the frame rate of the game application interface, whose window state is the focus window and whose demand on the device's resources and/or processing capability is high.
Referring to FIG. 12, FIG. 12 takes the first device projecting an SMS application, a video application, and a game application to the second device as an example to show a multi-window collaborative projection process provided in an embodiment of this application. At the initial projection, the first device jointly renders the interfaces of the SMS application, the video application, and the game application started on the first device, performs color space conversion (FIG. 12 takes conversion to YUV color coding as an example) and video encoding (FIG. 12 takes H.264 as the video coding standard), and sends the encoded standard video stream to the second device. The second device completes video decoding (FIG. 12 takes H.264 as the decoding standard), color space conversion (FIG. 12 takes YUV color decoding as an example), frame splitting, and sending for display.
Assume that the second device currently sends the projected interfaces (including the SMS application interface, the video application interface, and the game application interface) for display at a frame rate of 60 FPS each. Afterwards, if the second device determines, while the user operates one or more of the SMS, video, and game application interfaces, that its GPU decoding latency is greater than the preset threshold, the second device obtains (for example, periodically, or in response to receiving an operation event) the window states and application categories corresponding to the SMS, video, and game application interfaces. As shown in FIG. 12, assume that the second device determines that the SMS application window is currently minimized, the video application window is currently a non-minimized and non-focus window, and the game application window is currently the focus window; and that the second device determines that the SMS application is an instant messaging application, the video application is a video application, and the game application is a game application. The second device then comprehensively considers the obtained window states and application categories and adaptively adjusts the frame rates corresponding to the SMS, video, and game application interfaces, with the policy that the frame rates satisfy: game application window > video application window > SMS application window. By way of example, as shown in FIG. 12, the second device may adjust the frame rate of the SMS application interface to 0 FPS, adjust the frame rate of the video application interface to 24 FPS, and leave the frame rate of the game application interface unadjusted (that is, still 60 FPS). Because the frame rate of the SMS application interface is currently adjusted to 0 FPS, when processing the video stream, the second device may refrain from splitting out the SMS application interface; by way of example, the SMS application interface sent for display may remain the same as the previous frame of the SMS application interface.
Embodiment 3:
In Embodiment 3 of this application, the second device can adaptively and dynamically adjust one or more of the application display-region size (Display size), the display resolution (Display resolution), and the video resolution according to the number of application interfaces projected from the first device to the second device, so as to ensure the smoothness and clarity of the projected picture.
Referring to FIG. 13, FIG. 13 is a flowchart of another multi-window projection method provided in an embodiment of this application. As shown in FIG. 13, the multi-window projection method provided in this embodiment may include the following steps S701 and S1301-S1302:
S701. The second device displays a first interface synchronously with the first device, where the first interface includes multiple application interfaces.
For the specific description of step S701, refer to the description of S701 in Embodiment 1 above.
S1301. The second device obtains the number of currently projected application interfaces.
The number of projected application interfaces refers to the number of application interfaces projected from the first device to the second device.
Taking the first interface shown in FIG. 1 as an example, the desktop of the mobile phone 110, the SMS application interface, the video application interface, and the game application interface are the application interfaces projected from the first device to the second device, so the number of currently projected application interfaces obtained by the second device is 4.
Taking the first interface shown in FIG. 2 as an example, the SMS application interface, the video application interface, and the game application interface are the application interfaces projected from the first device to the second device, so the number of currently projected application interfaces obtained by the second device is 3.
By way of example, the second device may obtain the number of currently projected application interfaces from the first device, for example, from the standard video stream from the first device.
S1302. The second device adaptively adjusts one or more of the following according to the obtained number of currently projected application interfaces: the application display-region size, the display resolution, and the video resolution.
The application display region is the display region of the second device used to display application interfaces. The application display-region size (Display size) is the size of that display region. The display resolution characterizes how many pixels can be displayed per unit area in the display region of the second device's display that is used to display application interfaces. The video resolution characterizes how many pixels can be displayed per unit image area of the image frames corresponding to the video stream.
In the embodiments of this application, the second device can adaptively adjust one or more of the application display-region size, the display resolution, and the video resolution according to a preset policy and the number of currently projected application interfaces.
For example, as the number of currently projected application interfaces increases, the second device may increase the length or width of the application display region (the application display-region size) multiplicatively, exponentially, or based on a preset formula. By way of example, if the number of currently projected application interfaces is 1, the Display size may be a1 × b1, where a1 is the length of the Display and b1 is the width of the Display. If the number is 2, the Display size may be 2a1 × b1, where 2a1 is the length of the Display and b1 is the width of the Display. If the number is 3, the Display size may be 3a1 × b1, where 3a1 is the length of the Display and b1 is the width of the Display. By way of example, a1 may be the width of the first device's display and b1 its height; or a1 may be the height of the first device's display and b1 its width.
For example, as the number of currently projected application interfaces increases, the second device may increase the number of pixels that the display can show in the horizontal dimension or in the vertical dimension (that is, the display resolution) multiplicatively, exponentially, or based on a preset formula. By way of example, if the number of currently projected application interfaces is 1, the display resolution may be a2 × b2 pixels (p), where a2 is the number of pixels the second device's display can show in the horizontal dimension and b2 is the number in the vertical dimension. If the number is 2, the display resolution may be a2 × 2b2, where a2 is the number of pixels in the horizontal dimension and 2b2 is the number in the vertical dimension. If the number is 3, the display resolution may be a2 × 3b2 pixels (p), where a2 is the number of pixels in the horizontal dimension and 3b2 is the number in the vertical dimension.
For example, as the number of currently projected application interfaces increases, the second device may, based on its display resolution, reduce the number of pixels displayable per unit image area (such as per inch), that is, the video resolution (also called the image resolution), multiplicatively, exponentially, or based on a preset formula. By way of example, if the number of currently projected application interfaces is 1 and the display resolution of the second device is a2 × b2, the video resolution may be a3 × b3, where a3 = a2 and b3 = b2; a3 is the number of pixels displayable per unit image area (such as per inch) in the horizontal dimension, and b3 is the number in the vertical dimension. If the number is 2 and the display resolution of the second device is a2 × 2b2, the video resolution may be a3 × b3, where a3 = a2/2 and b3 = b2. If the number is 3 and the display resolution of the second device is a2 × 3b2, the video resolution may be a3 × b3, where a3 = a2/2 and b3 = 3b2/2.
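The scaling rules above, for 1 to 3 projected interfaces, can be collected into one sketch (the class name is hypothetical; integer division assumes a2 and b2 are even, and values of n outside 1-3 are not covered by the rules given here):
    // Display-area size, display resolution, and video resolution as a
    // function of the number n of currently projected application interfaces.
    class ProjectionLayout {
        static int[] displayAreaSize(int n, int a1, int b1) {
            return new int[] {n * a1, b1};                 // n*a1 x b1
        }
        static int[] displayResolution(int n, int a2, int b2) {
            return new int[] {a2, n * b2};                 // a2 x n*b2
        }
        static int[] videoResolution(int n, int a2, int b2) {
            switch (n) {
                case 1:  return new int[] {a2, b2};
                case 2:  return new int[] {a2 / 2, b2};
                case 3:  return new int[] {a2 / 2, 3 * b2 / 2};
                default: throw new IllegalArgumentException("n must be 1..3");
            }
        }
    }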
Through the method provided in Embodiment 3 above, the second device adaptively adjusts one or more of the application display-region size, the display resolution, and the video resolution according to the number of application interfaces projected from the first device to the second device. With this solution, the display clarity can be adjusted adaptively according to the specific GPU load of the second device: for example, when the number of application interfaces is small and the GPU load is light, a higher display resolution and video resolution are guaranteed to ensure interface clarity; when the number of application interfaces is large and the GPU processing capability is limited (that is, the load is heavy), the display resolution and video resolution are reduced to ensure interface smoothness.
It should be noted that the solution provided in Embodiment 3 above can be combined with the solutions provided in Embodiment 1 and/or Embodiment 2 above to adaptively and dynamically adjust the frame rates, display resolution, and video resolution corresponding to different application interfaces and reduce the GPU pressure of the second device, thereby ensuring the smoothness and clarity of the projected picture.
By way of example, referring to FIG. 14, FIG. 14 is a flowchart of another multi-window projection method provided in an embodiment of this application. As shown in FIG. 14, the multi-window projection method provided in this embodiment may include steps S701, S702, S901, and S1001, as well as steps S701, S1301, and S1302.
Referring to FIG. 15, FIG. 15 takes the first device projecting an SMS application, a video application, and a game application to the second device as an example to show a multi-window collaborative projection process provided in an embodiment of this application. At the initial projection, the first device jointly renders the interfaces of the SMS application, the video application, and the game application started on the first device, performs color space conversion (FIG. 15 takes conversion to YUV color coding as an example) and video encoding (FIG. 15 takes H.264 as the video coding standard), and sends the encoded standard video stream to the second device. The second device completes video decoding (FIG. 15 takes H.264 as the decoding standard), color space conversion (FIG. 15 takes YUV color decoding as an example), frame splitting, and sending for display.
Assume that the second device currently sends the projected interfaces (including the SMS application interface, the video application interface, and the game application interface) for display at a frame rate of 60 FPS each. Afterwards, if the second device determines, while the user operates one or more of the SMS, video, and game application interfaces, that its GPU decoding latency is greater than the preset threshold, the second device obtains (for example, periodically, or in response to receiving an operation event) the window states and application categories corresponding to the SMS, video, and game application interfaces. As shown in FIG. 15, assume that the second device determines that the SMS application window is currently minimized, the video application window is currently a non-minimized and non-focus window, and the game application window is currently the focus window; and that the second device determines that the SMS application is an instant messaging application, the video application is a video application, and the game application is a game application. The second device then comprehensively considers the obtained window states and application categories and adaptively adjusts the frame rates corresponding to the SMS, video, and game application interfaces, with the policy that the frame rates satisfy: game application window > video application window > SMS application window. By way of example, as shown in FIG. 15, the second device may adjust the frame rate of the SMS application interface to 0 FPS, adjust the frame rate of the video application interface to 24 FPS, and leave the frame rate of the game application interface unadjusted (that is, still 60 FPS). Because the frame rate of the SMS application interface is currently adjusted to 0 FPS, when processing the video stream, the second device may refrain from splitting out the SMS application interface; by way of example, the SMS application interface sent for display may remain the same as the previous frame of the SMS application interface.
Further, as shown in FIG. 15, the second device can obtain the number of currently projected application interfaces from the first device, so as to adaptively adjust, according to that number, the application display-region size, display resolution, and video resolution of the SMS, video, and game applications shown in FIG. 15. By way of example, as shown in FIG. 15, the second device determines that the number of currently projected application interfaces is 3; the second device may then determine that the Display size of the SMS, video, and game applications is 3a1 × b1, the display resolution is 2244 × 3240 p, and the video resolution is 1122 × 1620 p.
It should be understood that the various solutions in the embodiments of this application can be reasonably combined for use, and the explanations or descriptions of the terms appearing in the embodiments can be cross-referenced or explained across the embodiments; this is not limited.
It should also be understood that, in the various embodiments of this application, the sequence numbers of the foregoing processes do not imply an execution order; the execution order of the processes should be determined by their functions and internal logic, and should not constitute any limitation on the implementation process of the embodiments of this application.
It can be understood that, to implement the functions of any of the foregoing embodiments, the electronic device (including the first device and the second device) contains the corresponding hardware structures and/or software modules for performing the functions. Those skilled in the art should easily realize that, in combination with the units and algorithm steps of the examples described in the embodiments disclosed herein, this application can be implemented in the form of hardware or a combination of hardware and computer software. Whether a function is performed by hardware or by computer software driving hardware depends on the specific application and design constraints of the technical solution. Skilled persons may use different methods for each specific application to implement the described functions, but such implementation should not be considered beyond the scope of this application.
The embodiments of this application may divide the electronic device (including the first device and the second device) into functional modules; for example, each functional module may be divided corresponding to each function, or two or more functions may be integrated in one processing module. The integrated module may be implemented in the form of hardware or in the form of a software functional module. It should be noted that the division of modules in the embodiments of this application is schematic and is only a logical functional division; there may be other division manners in actual implementation.
For example, in the case of dividing the functional modules in an integrated manner, FIG. 16 is a structural block diagram of an electronic device provided in an embodiment of this application. For example, the electronic device may be the first device or the second device. As shown in FIG. 16, the electronic device may include a processing unit 1610 and a storage unit 1620.
When the electronic device is the second device, the processing unit 1610 is configured to obtain first information while the second device displays, synchronously with the first device, a first interface including multiple application interfaces, and to adaptively adjust, according to the obtained first information, one or more of the following: the frame rates corresponding to the multiple application interfaces, the application display-region size corresponding to the multiple application interfaces, the display resolution of the second device, or the video resolution corresponding to the multiple application interfaces. For example, the processing unit 1610 is configured to support the electronic device in performing steps S702, S703, S901, S902, S1001, S1301, or S1302 above, and/or other processes of the techniques described herein. The storage unit 1620 is configured to store computer programs, and the processing data and/or processing results involved in implementing the methods provided in the embodiments of this application.
In a possible structure, as shown in FIG. 17, the electronic device may further include a transceiver unit 1630. The transceiver unit 1630 is configured to communicate with the peer device, for example, to receive the interface configuration information, control instructions, and the like of the projected interfaces from the first device, or to send the user's operation events and the like to the first device.
It should be noted that the transceiver unit 1630 may include a radio frequency circuit. Specifically, the electronic device can receive and send wireless signals through the radio frequency circuit. Generally, the radio frequency circuit includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like. In addition, the radio frequency circuit can also communicate with other devices through wireless communication. The wireless communication may use any communication standard or protocol, including but not limited to global system for mobile communications, general packet radio service, code division multiple access, wideband code division multiple access, long term evolution, email, and short message service.
It should be understood that the modules in the electronic device may be implemented in the form of software and/or hardware, which is not specifically limited. In other words, the electronic device is presented in the form of functional modules. A "module" here may refer to an application-specific integrated circuit (ASIC), a circuit, a processor and memory executing one or more software or firmware programs, an integrated logic circuit, and/or another component that can provide the above functions. Optionally, in a simple embodiment, those skilled in the art will recognize that the portable device may take the form shown in FIG. 18. The processing unit 1610 may be implemented by the processor 1810 shown in FIG. 18. The transceiver unit 1630 may be implemented by the transceiver 1830 shown in FIG. 18. Specifically, the processor is implemented by executing the computer programs stored in the memory. Optionally, the memory is a storage unit within the chip, such as a register or a cache; the storage unit may also be a storage unit within the computer device located outside the chip, such as the memory 1820 shown in FIG. 18.
In an optional manner, when software is used to implement data transmission, it may be implemented wholly or partially in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer program instructions are loaded and executed on a computer, the processes or functions described in the embodiments of this application are implemented wholly or partially. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable apparatus. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another; for example, the computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center in a wired manner (such as coaxial cable, optical fiber, or digital subscriber line (DSL)) or a wireless manner (such as infrared, radio, or microwave). The computer-readable storage medium may be any usable medium accessible to the computer, or a data storage device, such as a server or data center, integrating one or more usable media. The usable medium may be a magnetic medium (such as a floppy disk, hard disk, or magnetic tape), an optical medium (such as a digital video disk (DVD)), a semiconductor medium (such as a solid state disk (SSD)), or the like.
The steps of the methods or algorithms described in combination with the embodiments of this application may be implemented in hardware or by a processor executing software instructions. The software instructions may consist of corresponding software modules, and the software modules may be stored in RAM, flash memory, ROM, EPROM, EEPROM, registers, a hard disk, a removable hard disk, a CD-ROM, or any other form of storage medium well known in the art. An exemplary storage medium is coupled to the processor so that the processor can read information from, and write information to, the storage medium. Of course, the storage medium may also be a component of the processor. The processor and the storage medium may reside in an ASIC. In addition, the ASIC may reside in a detection apparatus. Of course, the processor and the storage medium may also exist as discrete components in a detection apparatus.
Through the description of the foregoing implementations, those skilled in the art can clearly understand that, for convenience and brevity of description, only the division of the above functional modules is used as an example for illustration. In practical applications, the above functions can be allocated to different functional modules as needed; that is, the internal structure of the apparatus is divided into different functional modules to complete all or part of the functions described above.

Claims (15)

  1. A multi-window projection method, applied to a scenario in which a first device projects a screen to a second device, wherein the method comprises:
    obtaining, by the second device, first information when displaying a first interface synchronously with the first device, wherein the first interface comprises multiple application interfaces; and
    adaptively adjusting, by the second device according to the obtained first information, one or more of the following: frame rates corresponding to the multiple application interfaces, an application display-region size corresponding to the multiple application interfaces, a display resolution of the second device, or a video resolution corresponding to the multiple application interfaces.
  2. The method according to claim 1, wherein the first information comprises window states corresponding to the multiple application interfaces, and the first information is specifically used by the second device to adaptively adjust the frame rates corresponding to the multiple application interfaces;
    wherein the window states comprise a focus window, a non-minimized and non-focus window, and a minimized window.
  3. The method according to claim 2, wherein the adaptively adjusting, by the second device according to the obtained window states corresponding to the multiple application interfaces, the frame rates corresponding to the multiple application interfaces comprises:
    adaptively adjusting, by the second device, the frame rates corresponding to the multiple application interfaces according to the following first preset policy: frame rate corresponding to the focus window > frame rate corresponding to the non-minimized and non-focus window > frame rate corresponding to the minimized window.
  4. The method according to any one of claims 1-3, wherein the first information comprises application categories corresponding to the multiple application interfaces, and the first information is specifically used by the second device to adaptively adjust the frame rates corresponding to the multiple application interfaces;
    wherein the application categories comprise one or more of a game category, a video category, an instant messaging category, an office category, a social category, a lifestyle category, a shopping category, and a utility category.
  5. The method according to claim 4, wherein the application categories comprise the game category, the video category, and the instant messaging category; and the adaptively adjusting, by the second device according to the obtained application categories corresponding to the multiple application interfaces, the frame rates corresponding to the multiple application interfaces comprises:
    adaptively adjusting, by the second device, the frame rates corresponding to the multiple application interfaces according to the following second preset policy: frame rate corresponding to a game application interface > frame rate corresponding to a video application interface > frame rate corresponding to an instant messaging application interface.
  6. The method according to any one of claims 1-5, wherein the obtaining, by the second device, first information when displaying a first interface synchronously with the first device comprises:
    obtaining, by the second device, the first information when displaying the first interface synchronously with the first device, if it is determined that a processing load of the second device is higher than a preset threshold.
  7. The method according to claim 6, wherein the second device determines that the processing load of the second device is higher than the preset threshold according to one or more of the following: a decoding latency of a graphics processing unit (GPU) of the second device is greater than a latency threshold, a load rate of the GPU is greater than a load threshold, or a quantity of the multiple application interfaces is greater than a quantity threshold.
  8. The method according to any one of claims 1-7, wherein the first information comprises the quantity of the multiple application interfaces, and the first information is specifically used by the second device to adaptively adjust one or more of the following: the application display-region size corresponding to the multiple application interfaces, the display resolution of the second device, or the video resolution corresponding to the multiple application interfaces.
  9. The method according to claim 8, wherein
    if the quantity of the multiple application interfaces is 1, the second device determines that the application display-region size corresponding to the application interface is a1 × b1, the display resolution of the second device is a2 × b2, and the video resolution corresponding to the multiple application interfaces is a3 × b3;
    wherein a1 is the length of the application display region and b1 is the width of the application display region; a2 is the number of pixels the display of the second device can show in the horizontal dimension and b2 is the number of pixels it can show in the vertical dimension; a3 is the number of pixels displayable per unit image area in the horizontal dimension and b3 is the number displayable per unit image area in the vertical dimension, with a3 = a2 and b3 = b2.
  10. The method according to claim 8, wherein
    if the quantity of the multiple application interfaces is 2, the second device determines that the application display-region size corresponding to the application interfaces is 2a1 × b1, the display resolution of the second device is a2 × 2b2, and the video resolution corresponding to the multiple application interfaces is a3 × b3;
    wherein 2a1 is the length of the application display region and b1 is the width of the application display region; a2 is the number of pixels the display of the second device can show in the horizontal dimension and 2b2 is the number of pixels it can show in the vertical dimension; a3 is the number of pixels displayable per unit image area in the horizontal dimension and b3 is the number displayable per unit image area in the vertical dimension, with a3 = a2/2 and b3 = b2.
  11. The method according to claim 8, wherein
    if the quantity of the multiple application interfaces is 3, the second device determines that the application display-region size corresponding to the application interfaces is 3a1 × b1, the display resolution of the second device is a2 × 3b2, and the video resolution corresponding to the multiple application interfaces is a3 × b3;
    wherein 3a1 is the length of the application display region and b1 is the width of the application display region; a2 is the number of pixels the display of the second device can show in the horizontal dimension and 3b2 is the number of pixels it can show in the vertical dimension; a3 is the number of pixels displayable per unit image area in the horizontal dimension and b3 is the number displayable per unit image area in the vertical dimension, with a3 = a2/2 and b3 = 3b2/2.
  12. An electronic device, wherein the electronic device comprises:
    a memory, configured to store a computer program;
    a transceiver, configured to receive or send radio signals; and
    a processor, configured to execute the computer program, so that the electronic device implements the method according to any one of claims 1-11.
  13. A computer-readable storage medium, wherein the computer-readable storage medium stores computer program code, and the computer program code, when executed by a processing circuit, implements the method according to any one of claims 1-11.
  14. A chip system, wherein the chip system comprises a processing circuit and a storage medium, the storage medium stores computer program code, and the computer program code, when executed by the processing circuit, implements the method according to any one of claims 1-11.
  15. A computer program product, wherein the computer program product is configured to run on a computer to implement the method according to any one of claims 1-11.
PCT/CN2021/113506 2020-09-10 2021-08-19 多窗口投屏方法及电子设备 WO2022052773A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US18/044,707 US20240020074A1 (en) 2020-09-10 2021-08-19 Multi-Window Projection Method and Electronic Device
EP21865829.2A EP4199523A4 (en) 2020-09-10 2021-08-19 MULTI-WINDOW SCREEN PROJECTION METHOD AND ELECTRONIC DEVICE

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202010949156.X 2020-09-10
CN202010949156.XA CN113556598A (zh) 2020-09-10 2020-09-10 多窗口投屏方法及电子设备

Publications (1)

Publication Number Publication Date
WO2022052773A1 true WO2022052773A1 (zh) 2022-03-17

Family

ID=78101632

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/113506 WO2022052773A1 (zh) 2020-09-10 2021-08-19 多窗口投屏方法及电子设备

Country Status (4)

Country Link
US (1) US20240020074A1 (zh)
EP (1) EP4199523A4 (zh)
CN (2) CN113691846A (zh)
WO (1) WO2022052773A1 (zh)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115002538A (zh) * 2022-05-13 2022-09-02 深圳康佳电子科技有限公司 多窗口视频录制控制方法、装置、终端设备及存储介质
CN116033158A (zh) * 2022-05-30 2023-04-28 荣耀终端有限公司 投屏方法和电子设备

Families Citing this family (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114416000B (zh) * 2021-12-29 2024-02-20 上海赫千电子科技有限公司 一种应用于智能汽车的多屏互动方法、多屏互动系统
CN116567336A (zh) * 2022-01-28 2023-08-08 博泰车联网(南京)有限公司 投屏方法、系统、设备和存储介质
CN114647468B (zh) * 2022-02-28 2023-04-07 深圳创维-Rgb电子有限公司 投屏图像显示方法、装置、电子设备及存储介质
CN114979755A (zh) * 2022-05-20 2022-08-30 Oppo广东移动通信有限公司 投屏方法、装置、终端设备及计算机可读存储介质
CN115273763B (zh) * 2022-06-16 2024-02-06 北京小米移动软件有限公司 画面合成帧率调整方法及装置、显示设备及存储介质
CN116055613B (zh) * 2022-08-26 2023-09-29 荣耀终端有限公司 一种投屏方法和设备
CN116033209B (zh) * 2022-08-29 2023-10-20 荣耀终端有限公司 投屏方法和电子设备
CN115484484B (zh) * 2022-08-30 2024-05-14 深圳市思为软件技术有限公司 一种智能设备投屏控制方法、装置、电子设备及存储介质
CN116737097B (zh) * 2022-09-30 2024-05-17 荣耀终端有限公司 一种投屏图像处理方法及电子设备
CN116055795B (zh) * 2023-03-30 2023-11-07 深圳市湘凡科技有限公司 实现多屏协同功能方法、系统、电子设备及存储介质

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103324457A (zh) * 2013-06-21 2013-09-25 东莞宇龙通信科技有限公司 终端和多任务数据显示方法
CN103685071A (zh) * 2012-09-20 2014-03-26 腾讯科技(深圳)有限公司 一种分配网络资源的方法和装置
CN105828158A (zh) * 2016-03-22 2016-08-03 乐视网信息技术(北京)股份有限公司 基于多窗口视频播放中的播放质量调整方法及装置
US20160343352A1 (en) * 2015-05-18 2016-11-24 Wipro Limited Control unit and method for dynamically controlling resolution of display
CN106816134A (zh) * 2017-01-24 2017-06-09 广东欧珀移动通信有限公司 显示帧率调整方法、装置和终端设备
CN107168513A (zh) * 2017-03-22 2017-09-15 联想(北京)有限公司 信息处理方法及电子设备
CN110381345A (zh) * 2019-07-05 2019-10-25 华为技术有限公司 一种投屏显示方法及电子设备

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8803896B2 (en) * 2008-06-17 2014-08-12 Apple Inc. Providing a coherent user interface across multiple output devices
WO2017163323A1 (ja) * 2016-03-23 2017-09-28 株式会社オプティム 画面共有システム、画面共有方法、および画面共有プログラム
JP2018084863A (ja) * 2016-11-21 2018-05-31 キヤノンマーケティングジャパン株式会社 情報処理システム、情報処理装置、その制御方法及びプログラム
CN109508162B (zh) * 2018-10-12 2021-08-13 福建星网视易信息系统有限公司 一种投屏显示方法、系统及存储介质
CN111192544B (zh) * 2018-11-14 2021-11-26 腾讯科技(深圳)有限公司 投屏控制方法、装置、计算机可读存储介质和计算机设备
CN110221798A (zh) * 2019-05-29 2019-09-10 华为技术有限公司 一种投屏方法、系统及相关装置
CN110659136B (zh) * 2019-09-19 2022-07-15 Oppo广东移动通信有限公司 限制帧率的方法、装置、终端及存储介质
CN111432261A (zh) * 2019-12-31 2020-07-17 杭州海康威视数字技术股份有限公司 一种视频窗口画面显示方法及装置
CN111182346A (zh) * 2020-01-16 2020-05-19 武汉卡比特信息有限公司 一种移动终端与计算机类终端的组合分屏投射方法
CN111290725B (zh) * 2020-03-13 2023-07-14 深圳市腾讯信息技术有限公司 一种投屏方法、设备及存储介质
CN111432070B (zh) * 2020-03-17 2022-04-08 阿波罗智联(北京)科技有限公司 应用投屏控制方法、装置、设备和介质

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103685071A (zh) * 2012-09-20 2014-03-26 腾讯科技(深圳)有限公司 一种分配网络资源的方法和装置
CN103324457A (zh) * 2013-06-21 2013-09-25 东莞宇龙通信科技有限公司 终端和多任务数据显示方法
US20160343352A1 (en) * 2015-05-18 2016-11-24 Wipro Limited Control unit and method for dynamically controlling resolution of display
CN105828158A (zh) * 2016-03-22 2016-08-03 乐视网信息技术(北京)股份有限公司 基于多窗口视频播放中的播放质量调整方法及装置
CN106816134A (zh) * 2017-01-24 2017-06-09 广东欧珀移动通信有限公司 显示帧率调整方法、装置和终端设备
CN107168513A (zh) * 2017-03-22 2017-09-15 联想(北京)有限公司 信息处理方法及电子设备
CN110381345A (zh) * 2019-07-05 2019-10-25 华为技术有限公司 一种投屏显示方法及电子设备

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP4199523A4

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115002538A (zh) * 2022-05-13 2022-09-02 深圳康佳电子科技有限公司 多窗口视频录制控制方法、装置、终端设备及存储介质
CN115002538B (zh) * 2022-05-13 2024-03-12 深圳康佳电子科技有限公司 多窗口视频录制控制方法、装置、终端设备及存储介质
CN116033158A (zh) * 2022-05-30 2023-04-28 荣耀终端有限公司 投屏方法和电子设备
CN116033158B (zh) * 2022-05-30 2024-04-16 荣耀终端有限公司 投屏方法和电子设备

Also Published As

Publication number Publication date
EP4199523A1 (en) 2023-06-21
EP4199523A4 (en) 2024-01-10
CN113556598A (zh) 2021-10-26
CN113691846A (zh) 2021-11-23
US20240020074A1 (en) 2024-01-18

Similar Documents

Publication Publication Date Title
WO2022052773A1 (zh) 多窗口投屏方法及电子设备
WO2022052772A1 (zh) 多窗口投屏场景下的应用界面显示方法及电子设备
WO2020221039A1 (zh) 投屏方法、电子设备以及系统
WO2021175213A1 (zh) 刷新率切换方法和电子设备
WO2020014880A1 (zh) 一种多屏互动方法及设备
WO2021129253A1 (zh) 显示多窗口的方法、电子设备和系统
WO2022258024A1 (zh) 一种图像处理方法和电子设备
US12019942B2 (en) Multi-screen collaboration method and system, and electronic device
WO2022105445A1 (zh) 基于浏览器的应用投屏方法及相关装置
WO2022017205A1 (zh) 一种显示多个窗口的方法及电子设备
WO2022121775A1 (zh) 一种投屏方法及设备
WO2022083465A1 (zh) 电子设备的投屏方法及其介质和电子设备
KR20170043324A (ko) 전자 장치 및 전자 장치의 영상 인코딩 방법
WO2022222924A1 (zh) 一种投屏显示参数调节方法
WO2023005900A1 (zh) 一种投屏方法、电子设备及系统
EP3891997B1 (en) Electronic device and method for playing high dynamic range video and method thereof
WO2021052488A1 (zh) 一种信息处理方法及电子设备
WO2022068882A1 (zh) 镜像投屏方法、装置及系统
WO2023185636A1 (zh) 图像显示方法及电子设备
WO2022206600A1 (zh) 一种投屏方法、系统及相关装置
WO2024027718A1 (zh) 一种多窗口投屏方法、电子设备及系统
US20240112298A1 (en) Image processing method, electronic device, and storage medium
WO2024139884A1 (zh) 投屏显示方法、电子设备及系统
CN117241086A (zh) 通信方法、芯片、电子设备及计算机可读存储介质
CN116684521A (zh) 音频处理方法、设备及存储介质

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21865829

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 18044707

Country of ref document: US

ENP Entry into the national phase

Ref document number: 2021865829

Country of ref document: EP

Effective date: 20230317

NENP Non-entry into the national phase

Ref country code: DE