WO2022242503A1 - Screen projection method and related apparatus - Google Patents

Screen projection method and related apparatus

Info

Publication number
WO2022242503A1
Authority
WO
WIPO (PCT)
Prior art keywords
desktop
electronic device
status bar
display
application
Prior art date
Application number
PCT/CN2022/091899
Other languages
English (en)
French (fr)
Inventor
刘小璐
蔡世宗
罗朴良
廖洪均
Original Assignee
Huawei Technologies Co., Ltd. (华为技术有限公司)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co., Ltd.
Priority to EP22803825.3A priority Critical patent/EP4343533A1/en
Publication of WO2022242503A1 publication Critical patent/WO2022242503A1/zh

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 9/00 Arrangements for program control, e.g. control units
    • G06F 9/06 Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/44 Arrangements for executing specific programs
    • G06F 9/451 Execution arrangements for user interfaces
    • G06F 9/452 Remote windowing, e.g. X-Window System, desktop virtualisation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/14 Digital output to display device; Cooperation and interconnection of the display device with other functional units
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/14 Digital output to display device; Cooperation and interconnection of the display device with other functional units
    • G06F 3/1454 Digital output to display device; Cooperation and interconnection of the display device with other functional units involving copying of the display data of a local workstation or window to a remote workstation or window so that an actual copy of the data is displayed simultaneously on two or more displays, e.g. teledisplay

Definitions

  • the present application relates to the field of electronic technology, and in particular to a screen projection method and related devices.
  • Smart terminals and multiple display devices form multi-screen linkage and synergistic complementarity, which is an important link in the establishment of a full-scene ecology.
  • Screen projection technology covers large-screen scenarios such as mobile office, in-vehicle HiCar, and smart screens. Take projecting a mobile phone's screen to a personal computer (PC) as an example: after the phone's screen is mirrored to the computer, the user can control the phone from the computer.
  • Mirror projection is usually used; that is, the display content on the mobile phone (the projection sending device) and on the computer (the projection receiving device) is exactly the same after projection, and the user cannot view different display contents on the sending device and the receiving device.
  • the present application provides a screen projection method and a related device, which can display different contents on a screen projection sending device and a screen projection receiving device after screen projection.
  • the present application provides a screen projection method, including: the first electronic device invokes the first module of the first application to run the first desktop, where the first desktop is associated with the first display area; the first electronic device displays the first display content, which includes the first desktop; in response to a first user operation, the first electronic device invokes the second module of the first application to run the second desktop, where the second desktop is associated with the second display area; the first electronic device sends the second display content corresponding to the second display area to the second electronic device, where the second display content includes the second desktop; in response to a second user operation acting on the first display content, the first electronic device displays the third display content based on the task stack running in the first display area; in response to a third user operation acting on the second display content displayed by the second electronic device, the first electronic device determines, based on the task stack running in the second display area, that the display content corresponding to the second display area is the fourth display content; and the first electronic device sends the fourth display content to the second electronic device.
  • The first electronic device (that is, the screen projection sending device) supports running multiple desktop instances in different display areas through the same application at the same time, for example, running the first desktop in the first display area through the first module of the first application, and running the second desktop in the second display area through the second module of the first application.
  • the first electronic device determines the display content of the main screen of the device based on the task stack running in the first display area, and determines the display content to be projected to the second electronic device (ie, the screen projection receiving device) based on the task stack running in the second display area. In this way, based on the two different display areas, the first electronic device and the second electronic device can display different desktops and other different contents.
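The mechanism above can be sketched as a minimal plain-Java model, assuming each display area owns its own task stack and the content shown for an area is resolved only against that area's stack. All class and method names here are hypothetical illustrations, not the patent's or Android's actual implementation.

```java
import java.util.ArrayDeque;
import java.util.Deque;
import java.util.HashMap;
import java.util.Map;

// Hypothetical model: each display area owns its own task stack, so the
// phone screen (display area 1) and the projected screen (display area 2)
// resolve user operations against different stacks.
public class DisplayAreaRouter {
    private final Map<Integer, Deque<String>> stacksByDisplay = new HashMap<>();

    public void launchTask(int displayAreaId, String task) {
        stacksByDisplay.computeIfAbsent(displayAreaId, id -> new ArrayDeque<>()).push(task);
    }

    // The content shown for a display area is whatever task is on top of
    // that area's own stack -- operations on one area never touch the other.
    public String topTask(int displayAreaId) {
        Deque<String> stack = stacksByDisplay.get(displayAreaId);
        return (stack == null || stack.isEmpty()) ? "empty" : stack.peek();
    }

    public static void main(String[] args) {
        DisplayAreaRouter router = new DisplayAreaRouter();
        router.launchTask(1, "FirstDesktop");   // phone home screen
        router.launchTask(2, "SecondDesktop");  // projected desktop
        router.launchTask(1, "Gallery");        // user opens an app on the phone
        System.out.println(router.topTask(1));  // Gallery
        System.out.println(router.topTask(2));  // SecondDesktop (unchanged)
    }
}
```

Because the stacks are keyed by display-area ID, an operation on the projected screen cannot disturb what the phone is showing, which is the data isolation the description refers to.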
  • In some embodiments, in response to the second user operation acting on the first desktop in the first display content, the first electronic device displays the third display content based on the task stack of the first application running in the first display area; and in response to the third user operation acting on the second desktop in the second display content displayed by the second electronic device, the first electronic device determines, based on the task stack of the first application running in the second display area, that the display content corresponding to the second display area is the fourth display content.
  • the first electronic device may execute a response event corresponding to the second user operation based on the task stack of the first application running in the display area associated with the first desktop;
  • the first electronic device may execute a response event corresponding to the third user operation based on the task stack of the first application running on the display area associated with the second desktop.
  • data isolation of events (input events and/or response events) of different desktops can be guaranteed.
  • the two desktop instances are both run by the modules of the first application, the two desktops can share specified data, and the second desktop can inherit some or all of the functions of the first desktop.
  • In some embodiments, the method further includes: the first electronic device calls the third module of the second application to run the first status bar, where the first status bar is associated with the first display area, and the first display content includes the first status bar; and, in response to the first user operation, the first electronic device calls the fourth module of the second application to run the second status bar, where the second status bar is associated with the second display area, and the second display content includes the second status bar.
  • The first electronic device supports running multiple instances of the status bar in different display areas through the same application, for example, running the first status bar in the first display area through the third module of the second application, and running the second status bar in the second display area through the fourth module of the second application.
  • In this way, the first electronic device and the second electronic device can display different status bars, ensuring data isolation of events (input events and/or response events) between the two status bars.
  • the two status bars can share specified data (such as notification messages), and the second status bar can inherit some or all of the functional characteristics of the first status bar.
  • In some embodiments, the method further includes: the first electronic device calls the fifth module of the third application to display the first display object of the first variable; the first variable is associated with the first display area, and the first display content includes the first display object; the first variable is also associated with the second display area, and the second display content includes the first display object.
  • the first electronic device supports displaying objects corresponding to the same variable in multiple different display areas at the same time.
  • the third application and the second application may be the same application or different applications, which are not specifically limited here.
  • In some embodiments, the method further includes: in response to a fourth user operation acting on the first display content, the first electronic device calls the fifth module of the third application to change the display object of the first variable to the second display object; the first electronic device updates the display content corresponding to the first display area to the fifth display content, where the fifth display content includes the second display object; and the first electronic device updates the display content corresponding to the second display area to the sixth display content and sends the sixth display content to the second electronic device, where the sixth display content includes the second display object.
  • The first electronic device supports displaying objects corresponding to the same variable in multiple different display areas at the same time. After the user changes the display object of the first variable in the first display area, the display object of the first variable in the second display area also changes accordingly.
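The shared-variable behaviour described above resembles an observer pattern: one variable, watched by every display area, so a change made in one area propagates to all of them. The following is a minimal plain-Java sketch under that assumption; `SharedVariable` and its members are hypothetical names, not the patent's actual classes.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Consumer;

// Hypothetical sketch of the shared-variable mechanism: one wallpaper
// variable observed by every display area, so changing it on the phone
// also updates the projected screen.
public class SharedVariable<T> {
    private T value;
    private final List<Consumer<T>> observers = new ArrayList<>();

    public SharedVariable(T initial) { this.value = initial; }

    // Each display area registers an observer and immediately renders the
    // current value, so a newly created projection starts in sync.
    public void attachDisplayArea(Consumer<T> render) {
        observers.add(render);
        render.accept(value);
    }

    // A change made in any display area is pushed to all display areas.
    public void set(T newValue) {
        value = newValue;
        for (Consumer<T> o : observers) o.accept(newValue);
    }

    public static void main(String[] args) {
        SharedVariable<String> wallpaper = new SharedVariable<>("default.png");
        StringBuilder phone = new StringBuilder(), projection = new StringBuilder();
        wallpaper.attachDisplayArea(w -> { phone.setLength(0); phone.append(w); });
        wallpaper.attachDisplayArea(w -> { projection.setLength(0); projection.append(w); });
        wallpaper.set("beach.png");  // user changes wallpaper on the phone
        System.out.println(phone + " / " + projection);  // beach.png / beach.png
    }
}
```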
  • In some embodiments, the first variable is used to indicate the display object of the wallpaper; the display object of the wallpaper is a static picture and/or a dynamic picture; the wallpaper includes a lock screen wallpaper when the screen is locked and/or a desktop wallpaper when the screen is not locked.
  • After the user changes the wallpaper displayed on the first electronic device, the wallpaper projected on the second electronic device also changes accordingly.
  • In some embodiments, the first electronic device presets multiple themes, where a theme is used to indicate the desktop layout style, icon display style and/or interface color, etc.; the first variable is used to indicate the display object of the theme, and the display object of the theme is the display content corresponding to one of the multiple themes.
  • After the user changes the theme displayed on the first electronic device, the theme projected on the second electronic device also changes accordingly.
  • The first module of the first application includes a first common class for creating and running the first desktop, a first user interface (UI) control class, and the desktop task stack of the first desktop; the second module of the first application includes a second common class for creating and running the second desktop, a second UI control class, and the desktop task stack of the second desktop. Part or all of the classes in the second common class inherit from the first common class, and part or all of the classes in the second UI control class inherit from the first UI control class.
  • The first electronic device adds a second common class for creating and running the second desktop, a second UI control class, and the desktop task stack of the second desktop; the newly added common class and UI control class are partly or wholly inherited from the first common class and the first UI control class corresponding to the original first desktop. Therefore, the second desktop can inherit part of the functional characteristics of the first desktop, and the two desktops can share specified data.
  • the second common class includes one or more of the following: desktop startup provider, database assistant, desktop startup setting class, desktop startup constant class, Pc layout configuration, Pc device file, Pc grid counter, Pc desktop launcher strategy, Pc desktop startup mode, Pc loading tasks, etc.;
  • the second UI control class includes one or more of the following: Pc drag layer, Pc desktop workspace, Pc unit layout, Pc dock view, Pc file folder, Pc folder icon, etc.
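The inheritance relationship described above can be sketched in a few lines of plain Java: the projected desktop's classes extend the original desktop's classes, inheriting shared behaviour (such as the data source) while overriding PC-specific presentation. `Launcher`, `PcLauncher`, and all members below are illustrative names only, not the actual classes listed.

```java
// Hypothetical sketch: the second (projected) desktop's classes extend the
// first desktop's classes, so they inherit shared behaviour while overriding
// PC-specific presentation.
public class LauncherInheritance {
    static class Launcher {                       // first desktop
        String databaseName() { return "launcher.db"; }  // shared data source
        int gridColumns()     { return 4; }              // phone-style grid
    }

    static class PcLauncher extends Launcher {    // second desktop
        @Override int gridColumns() { return 8; } // wider PC-style grid
        // databaseName() is inherited unchanged, so both desktops read the
        // same icon/layout database -- this is the "shared specified data".
    }

    public static void main(String[] args) {
        Launcher first = new Launcher();
        Launcher second = new PcLauncher();
        System.out.println(first.databaseName().equals(second.databaseName())); // true
        System.out.println(first.gridColumns() + " vs " + second.gridColumns()); // 4 vs 8
    }
}
```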
  • The third module of the second application includes a first component for creating and running the first status bar, a first dependent control class, and a third UI control class; the fourth module of the second application includes a second component for creating and running the second status bar, a second dependent control class, and a fourth UI control class. Part or all of the classes in the second dependent control class inherit from the first dependent control class, and part or all of the classes in the fourth UI control class inherit from the third UI control class.
  • The first electronic device adds a second component for creating and running the second status bar, a second dependent control class, and a fourth UI control class; the newly added component, dependent control class, and UI control class are partly or wholly inherited from the first component, the first dependent control class, and the third UI control class corresponding to the original first status bar. Therefore, the second status bar can inherit part of the functional characteristics of the first status bar, and the two status bars can share specified data.
  • the second component includes one or more of the following: Pc dependent class, Pc system provider, Pc system bar, second status bar;
  • the second dependent control class includes one or more of the following: Pc status bar window control class, screen control class, lock screen control class, remote control class;
  • the fourth UI control class includes one or more of the following: Pc status bar window view, Pc notification panel view, Pc quick setting fragments, Pc status bar fragments, Pc status bar view.
  • In some embodiments, the ID of the display area associated with the second module is the ID of the second display area. In response to the first user operation, the first electronic device invoking the second module of the first application to run the second desktop, where the second desktop is associated with the second display area, includes: in response to the first user operation, the Pc management service receives an instruction to switch modes, where the instruction is used to indicate switching the current non-screen-projection mode to the screen projection mode; in response to the instruction, the Pc management service calls the Pc desktop service, the Pc desktop service calls the activity management service, and the activity management service calls the activity task manager to start the second module of the first application; the ID of the display area associated with the second module is determined by calling the root activity container; when the ID of the display area associated with the second module is the ID of the second display area, the Activity of the second desktop is queried as the Activity of the desktop to be started; when it is the ID of the first display area, the Activity of the first desktop is queried as the Activity of the desktop to be started.
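The final decision step of that flow — choose which desktop Activity to start based on the display-area ID the module is bound to — can be sketched as below. The IDs and the fully qualified activity names are hypothetical placeholders, not real Android identifiers.

```java
// Hypothetical sketch of the startup decision: the launcher module asks
// which display area it is bound to and starts the matching desktop
// Activity.
public class DesktopSelector {
    static final int FIRST_DISPLAY_AREA_ID = 0;   // default (phone) display
    static final int SECOND_DISPLAY_AREA_ID = 2;  // projection display

    // Mirrors "query the Activity of the second desktop as the Activity of
    // the desktop to be started" when the module is associated with the
    // second display area, and the first desktop's Activity otherwise.
    public static String desktopActivityFor(int displayAreaId) {
        return displayAreaId == SECOND_DISPLAY_AREA_ID
                ? "com.example.launcher.PcDesktopActivity"
                : "com.example.launcher.DesktopActivity";
    }

    public static void main(String[] args) {
        System.out.println(desktopActivityFor(FIRST_DISPLAY_AREA_ID));
        System.out.println(desktopActivityFor(SECOND_DISPLAY_AREA_ID));
    }
}
```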
  • In some embodiments, the first electronic device invoking the fourth module of the second application to run the second status bar, where the second status bar is associated with the second display area, includes: in response to the first user operation, the Pc management service receives an instruction to switch modes, where the instruction is used to instruct switching the current non-screen-projection mode to the screen projection mode; in response to the instruction, the Pc management service starts the productivity service, and the productivity service invokes the system bar to start the second status bar; the system bar creates the second status bar based on the configuration file;
  • the second status bar calls the callback interface of the command queue to add a callback to the second status bar;
  • the second status bar initializes its layout and registers the IStatusBar object corresponding to the second status bar with the status bar management service;
  • the second status bar creates and adds the Pc status bar window view to the status bar window control class;
  • the status bar window control class calls the window management interface to add the second status bar to the window management service, and the second status bar is then added to the second display area.
  • In the non-screen-projection mode, the command queue supports the first status bar associated with the first display area; in the screen projection mode, the command queue simultaneously supports the first status bar associated with the first display area and the second status bar associated with the second display area.
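The command-queue behaviour can be modelled as a registry of per-display callbacks: only the first status bar is registered in non-projection mode, entering projection mode registers a second callback, and subsequent events (such as notifications) reach both bars. This is a plain-Java illustration with invented names, not SystemUI's actual `CommandQueue`.

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Hypothetical model of the command queue: callbacks keyed by display-area
// ID; posting a notification delivers it to every registered status bar,
// which is how the two bars share data such as notification messages.
public class CommandQueue {
    interface Callbacks { void onNotificationPosted(String msg); }

    private final Map<Integer, Callbacks> callbacksByDisplay = new LinkedHashMap<>();

    void addCallback(int displayAreaId, Callbacks cb) {
        callbacksByDisplay.put(displayAreaId, cb);
    }

    void postNotification(String msg) {
        for (Callbacks cb : callbacksByDisplay.values()) cb.onNotificationPosted(msg);
    }

    public static void main(String[] args) {
        CommandQueue queue = new CommandQueue();
        StringBuilder log = new StringBuilder();
        // Non-projection mode: only the first status bar is registered.
        queue.addCallback(0, m -> log.append("bar1:").append(m).append(' '));
        // Entering projection mode registers the second status bar's callback.
        queue.addCallback(2, m -> log.append("bar2:").append(m).append(' '));
        queue.postNotification("msg");
        System.out.println(log.toString().trim()); // bar1:msg bar2:msg
    }
}
```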
  • the present application provides an electronic device, including one or more processors and one or more memories.
  • the one or more memories are coupled with the one or more processors; the one or more memories are used to store computer program code, and the computer program code includes computer instructions; when the one or more processors execute the computer instructions, the electronic device performs the screen projection method in any possible implementation manner of any one of the above aspects.
  • an embodiment of the present application provides a computer storage medium, including computer instructions, which, when run on the electronic device, cause the electronic device to execute the screen projection method in any possible implementation manner of any one of the above aspects.
  • an embodiment of the present application provides a computer program product, which, when run on a computer, causes the computer to execute the screen projection method in any possible implementation manner of any one of the above aspects.
  • FIG. 1A is a schematic diagram of a communication system provided by an embodiment of the present application.
  • FIG. 1B is a schematic structural diagram of an electronic device provided in an embodiment of the present application.
  • FIG. 2 is a schematic diagram of screen projection between electronic devices provided by an embodiment of the present application.
  • Fig. 3A is a schematic diagram of the main interface provided by the embodiment of the present application.
  • FIG. 3B to FIG. 3G are schematic diagrams of user interfaces for screen projection through NFC provided by the embodiment of the present application.
  • FIGS. 4A to 4H are schematic diagrams of the second-level interface of the status bar of the extended screen provided by the embodiment of the present application.
  • FIG. 4I is a schematic diagram of the status bar of the extended screen provided by the embodiment of the present application.
  • FIG. 5 is a schematic diagram of the user interface of the search bar on the extension screen provided by the embodiment of the present application.
  • FIGS. 6A to 6B are schematic diagrams of the user interface of the application icon list provided by the embodiment of the present application.
  • FIG. 6C is a schematic diagram of the user interface of the gallery on the extended screen desktop provided by the embodiment of the present application.
  • FIG. 6D is a schematic diagram of a user interface of music on the extended screen desktop provided by the embodiment of the present application.
  • FIGS. 7A to 7B are schematic diagrams of the multitasking interface provided by the embodiment of the present application.
  • FIGS. 8A to 8B are schematic diagrams of relevant user interfaces for displaying desktop icons provided by the embodiment of the present application.
  • FIGS. 8C to 8E are schematic diagrams of the user interface of the application program on the extended screen desktop provided by the embodiment of the present application.
  • FIGS. 9A to 9B are schematic diagrams of user interfaces related to application icons in the Dock bar provided by the embodiment of the present application.
  • FIGS. 10A to 10B are schematic diagrams of the lock screen interface provided by the embodiment of the present application.
  • FIGS. 11 to 13 are schematic diagrams of the software system provided by the embodiment of the present application.
  • FIG. 14 is a schematic diagram of an activity stack provided by an embodiment of the present application.
  • FIGS. 15A to 15C are schematic diagrams of software implementation of the screen projection method provided by the embodiment of the present application.
  • The terms "first" and "second" are used for descriptive purposes only, and cannot be understood as indicating or implying relative importance or implicitly specifying the quantity of indicated technical features. Therefore, a feature defined as "first" or "second" may explicitly or implicitly include one or more of these features. In the description of the embodiments of the present application, unless otherwise specified, "multiple" means two or more.
  • FIG. 1A exemplarily shows a schematic structural diagram of a communication system 10 provided by an embodiment of the present application.
  • the communication system 10 includes an electronic device 100 and one or more electronic devices connected to the electronic device 100 , such as an electronic device 200 .
  • the electronic device 100 may be directly connected to the electronic device 200 through a short-range wireless communication connection or a local wired connection.
  • The electronic device 100 and the electronic device 200 may have one or more short-range communication modules among communication modules such as a near field communication (near field communication, NFC) communication module, a wireless fidelity (wireless fidelity, WiFi) communication module, an ultra wide band (ultra wide band, UWB) communication module, a Bluetooth communication module, and a ZigBee communication module.
  • The electronic device 100 can detect and scan electronic devices near the electronic device 100 by transmitting signals through a short-range communication module (such as an NFC communication module), so that the electronic device 100 can discover a nearby electronic device (such as the electronic device 200) through a short-range wireless communication protocol, establish a wireless communication connection with the nearby electronic device, and transmit data to it.
  • the electronic device 100 and the electronic device 200 may be connected to a local area network (local area network, LAN) through the electronic device 300 based on a wired connection or a WiFi connection.
  • the electronic device 300 may be a third-party device such as a router, a gateway, or a smart device controller.
  • the electronic device 100 and the electronic device 200 may also be indirectly connected through at least one electronic device 400 in a wide area network (such as a Huawei cloud network).
  • the electronic device 400 may be a hardware server, or a cloud server embedded in a virtualized environment. It can be understood that, through the electronic device 300 and/or the electronic device 400 , the electronic device 100 and the electronic device 200 can indirectly perform wireless communication connection and data transmission.
  • the structure shown in this embodiment does not constitute a specific limitation on the communication system 10 .
  • the communication system 10 may include more or fewer devices than those shown.
  • The electronic device 100 can send projected image data and/or audio data, etc. to the electronic device 200, and the electronic device 200 can perform interface display and/or audio output based on the received data.
  • the screen resolutions of the display screens of the electronic device 200 and the electronic device 100 may be different.
  • the electronic device 100 may be a mobile phone, a tablet computer, a personal digital assistant (personal digital assistant, PDA), a wearable device, a laptop computer (laptop) and other portable electronic devices.
  • the electronic device 100 may not be a portable electronic device, which is not limited in this embodiment of the present application.
  • The electronic device 200 may be any display device such as a smart screen, a TV, a tablet computer, a notebook computer, a vehicle-mounted device, or a projector. Exemplary embodiments of the electronic device 100 and the electronic device 200 include, but are not limited to, devices carrying various operating systems.
  • FIG. 1B exemplarily shows a schematic structural diagram of an electronic device 100 provided by an embodiment of the present application.
  • the hardware structure of the electronic device 200 reference may be made to the related embodiments of the hardware structure of the electronic device 100, which will not be repeated here.
  • The electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (universal serial bus, USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone jack 170D, a sensor module 180, a button 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a subscriber identification module (subscriber identification module, SIM) card interface 195, and the like.
  • the sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, bone conduction sensor 180M, etc.
  • the structure illustrated in the embodiment of the present invention does not constitute a specific limitation on the electronic device 100 .
  • the electronic device 100 may include more or fewer components than shown in the figure, or combine certain components, or separate certain components, or arrange different components.
  • the illustrated components can be realized in hardware, software or a combination of software and hardware.
  • The processor 110 may include one or more processing units. For example, the processor 110 may include an application processor (application processor, AP), a modem processor, a graphics processing unit (graphics processing unit, GPU), an image signal processor (image signal processor, ISP), a controller, a memory, a video codec, a digital signal processor (digital signal processor, DSP), a baseband processor, and/or a neural network processor (neural-network processing unit, NPU), etc. Different processing units may be independent devices, or may be integrated in one or more processors.
  • the controller may be the nerve center and command center of the electronic device 100 .
  • the controller can generate an operation control signal according to the instruction opcode and timing signal, and complete the control of fetching and executing the instruction.
  • a memory may also be provided in the processor 110 for storing instructions and data.
  • the memory in processor 110 is a cache memory.
  • The memory may hold instructions or data that the processor 110 has just used or will reuse. If the processor 110 needs to use the instructions or data again, they can be called directly from the memory. This avoids repeated access and reduces the waiting time of the processor 110, thereby improving system efficiency.
  • processor 110 may include one or more interfaces.
  • the interface may include an integrated circuit (inter-integrated circuit, I2C) interface, an integrated circuit built-in audio (inter-integrated circuit sound, I2S) interface, a pulse code modulation (pulse code modulation, PCM) interface, a universal asynchronous transmitter (universal asynchronous receiver/transmitter, UART) interface, mobile industry processor interface (mobile industry processor interface, MIPI), general-purpose input and output (general-purpose input/output, GPIO) interface, subscriber identity module (subscriber identity module, SIM) interface, and /or universal serial bus (universal serial bus, USB) interface, etc.
  • the charging management module 140 is configured to receive a charging input from a charger.
  • the charger may be a wireless charger or a wired charger.
  • the charging management module 140 can receive charging input from the wired charger through the USB interface 130 .
  • the charging management module 140 may receive a wireless charging input through a wireless charging coil of the electronic device 100. While charging the battery 142, the charging management module 140 can also supply power to the electronic device through the power management module 141.
  • the power management module 141 is used for connecting the battery 142 , the charging management module 140 and the processor 110 .
  • the power management module 141 receives the input of the battery 142 and/or the charging management module 140, and supplies power for the processor 110, the internal memory 121, the display screen 194, the camera 193, and the wireless communication module 160, etc.
  • the power management module 141 can also be used to monitor parameters such as battery capacity, battery cycle times, and battery health status (leakage, impedance).
  • the power management module 141 may also be disposed in the processor 110 .
  • the power management module 141 and the charging management module 140 may also be set in the same device.
  • the wireless communication function of the electronic device 100 can be realized by the antenna 1 , the antenna 2 , the mobile communication module 150 , the wireless communication module 160 , a modem processor, a baseband processor, and the like.
  • Antenna 1 and Antenna 2 are used to transmit and receive electromagnetic wave signals.
  • Each antenna in electronic device 100 may be used to cover single or multiple communication frequency bands. Different antennas can also be multiplexed to improve the utilization of the antennas.
  • Antenna 1 can be multiplexed as a diversity antenna of a wireless local area network.
  • the antenna may be used in conjunction with a tuning switch.
  • the mobile communication module 150 can provide wireless communication solutions including 2G/3G/4G/5G applied on the electronic device 100 .
  • the mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (LNA), and the like.
  • the mobile communication module 150 can receive electromagnetic waves through the antenna 1, filter and amplify the received electromagnetic waves, and send them to the modem processor for demodulation.
  • the mobile communication module 150 can also amplify the signals modulated by the modem processor, and convert them into electromagnetic waves through the antenna 1 for radiation.
  • at least part of the functional modules of the mobile communication module 150 may be set in the processor 110 .
  • at least part of the functional modules of the mobile communication module 150 and at least part of the modules of the processor 110 may be set in the same device.
  • a modem processor may include a modulator and a demodulator.
  • the modulator is used for modulating the low-frequency baseband signal to be transmitted into a medium-high frequency signal.
  • the demodulator is used to demodulate the received electromagnetic wave signal into a low frequency baseband signal. Then the demodulator sends the demodulated low-frequency baseband signal to the baseband processor for processing.
  • the low-frequency baseband signal is passed to the application processor after being processed by the baseband processor.
  • the application processor outputs sound signals through audio equipment (not limited to speaker 170A, receiver 170B, etc.), or displays images or videos through display screen 194 .
  • the modem processor may be a stand-alone device.
  • the modem processor may be independent from the processor 110, and be set in the same device as the mobile communication module 150 or other functional modules.
  • the wireless communication module 160 can provide wireless communication solutions applied on the electronic device 100, including wireless local area networks (WLAN) (such as wireless fidelity (Wi-Fi) networks), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR), and the like.
  • the wireless communication module 160 may be one or more devices integrating at least one communication processing module.
  • the wireless communication module 160 receives electromagnetic waves via the antenna 2 , frequency-modulates and filters the electromagnetic wave signals, and sends the processed signals to the processor 110 .
  • the wireless communication module 160 can also receive the signal to be sent from the processor 110 , frequency-modulate it, amplify it, and convert it into electromagnetic waves through the antenna 2 for radiation.
  • the antenna 1 of the electronic device 100 is coupled to the mobile communication module 150, and the antenna 2 is coupled to the wireless communication module 160, so that the electronic device 100 can communicate with the network and other devices through wireless communication technology.
  • the wireless communication technology may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technology, etc.
  • the GNSS may include a global positioning system (GPS), a global navigation satellite system (GLONASS), a BeiDou navigation satellite system (BDS), a quasi-zenith satellite system (QZSS), and/or a satellite-based augmentation system (SBAS).
  • the electronic device 100 realizes the display function through the GPU, the display screen 194 , and the application processor.
  • the GPU is a microprocessor for image processing, and is connected to the display screen 194 and the application processor. GPUs are used to perform mathematical and geometric calculations for graphics rendering.
  • Processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
  • the display screen 194 is used to display images, videos and the like.
  • the display screen 194 includes a display panel.
  • the display panel can be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a MiniLED, a MicroLED, a Micro-OLED, quantum dot light-emitting diodes (QLED), or the like.
  • the electronic device 100 may include 1 or N display screens 194 , where N is a positive integer greater than 1.
  • the electronic device 100 can realize the shooting function through the ISP, the camera 193 , the video codec, the GPU, the display screen 194 and the application processor.
  • the ISP is used for processing the data fed back by the camera 193 .
  • light is transmitted through the lens to the photosensitive element of the camera, which converts the optical signal into an electrical signal; the photosensitive element of the camera transmits the electrical signal to the ISP for processing, and the ISP converts it into an image visible to the naked eye.
  • ISP can also perform algorithm optimization on image noise, brightness, and skin color.
  • ISP can also optimize the exposure, color temperature and other parameters of the shooting scene.
  • the ISP may be located in the camera 193 .
  • Camera 193 is used to capture still images or video.
  • the object generates an optical image through the lens and projects it to the photosensitive element.
  • the photosensitive element may be a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor.
  • the photosensitive element converts the light signal into an electrical signal, and then transmits the electrical signal to the ISP to convert it into a digital image signal.
  • the ISP outputs the digital image signal to the DSP for processing.
  • DSP converts digital image signals into standard RGB, YUV and other image signals.
  • the electronic device 100 may include 1 or N cameras 193 , where N is a positive integer greater than 1.
  • Digital signal processors are used to process digital signals. In addition to digital image signals, they can also process other digital signals. For example, when the electronic device 100 selects a frequency point, the digital signal processor is used to perform Fourier transform on the energy of the frequency point.
  • Video codecs are used to compress or decompress digital video.
  • the electronic device 100 may support one or more video codecs.
  • the electronic device 100 can play or record videos in various encoding formats, for example: moving picture experts group (moving picture experts group, MPEG) 1, MPEG2, MPEG3, MPEG4 and so on.
  • the NPU is a neural-network (NN) computing processor.
  • Applications such as intelligent cognition of the electronic device 100 can be realized through the NPU, such as image recognition, face recognition, speech recognition, text understanding, and the like.
  • the internal memory 121 may include one or more random access memories (random access memory, RAM) and one or more non-volatile memories (non-volatile memory, NVM).
  • Random access memory can include static random-access memory (SRAM), dynamic random-access memory (DRAM), synchronous dynamic random-access memory (SDRAM), double data rate synchronous dynamic random-access memory (DDR SDRAM; for example, the fifth-generation DDR SDRAM is generally called DDR5 SDRAM), etc.
  • non-volatile memory can include disk storage devices and flash memory. According to the operating principle, flash memory can include NOR FLASH, NAND FLASH, 3D NAND FLASH, etc.
  • According to the potential level of the storage cells, flash memory can include single-level cell (SLC), multi-level cell (MLC), triple-level cell (TLC), quad-level cell (QLC), etc.; according to storage specifications, it can include universal flash storage (UFS), embedded multimedia card (eMMC), etc.
  • the random access memory can be directly read and written by the processor 110, can be used to store executable programs (such as machine instructions) of an operating system or other running programs, and can also be used to store user and application data, etc.
  • the non-volatile memory can also store executable programs and data of users and application programs, etc., and can be loaded into the random access memory in advance for the processor 110 to directly read and write.
  • the external memory interface 120 can be used to connect an external non-volatile memory, so as to expand the storage capacity of the electronic device 100 .
  • the external non-volatile memory communicates with the processor 110 through the external memory interface 120 to implement a data storage function. For example, files such as music and video are stored in an external non-volatile memory.
  • the electronic device 100 can implement audio functions, such as music playback and recording, through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the earphone interface 170D, and the application processor.
  • the audio module 170 is used to convert digital audio information into analog audio signal output, and is also used to convert analog audio input into digital audio signal.
  • the audio module 170 may also be used to encode and decode audio signals.
  • Speaker 170A, also referred to as a "horn", is used to convert audio electrical signals into sound signals.
  • Receiver 170B, also called an "earpiece", is used to convert audio electrical signals into sound signals.
  • The microphone 170C, also called a "mike" or "mic", is used to convert sound signals into electrical signals.
  • the pressure sensor 180A is used to sense the pressure signal and convert the pressure signal into an electrical signal.
  • the electronic device 100 detects the intensity of the touch operation according to the pressure sensor 180A.
  • the electronic device 100 may also calculate the touched position according to the detection signal of the pressure sensor 180A.
  • the gyro sensor 180B can be used to determine the motion posture of the electronic device 100 .
  • the air pressure sensor 180C is used to measure air pressure.
  • the magnetic sensor 180D includes a Hall sensor.
  • the acceleration sensor 180E can detect the magnitude of the acceleration of the electronic device 100 in various directions (for example, the directions pointed by the three axes in the x, y, z coordinate system of the electronic device 100).
  • the distance sensor 180F is used to measure the distance.
  • the electronic device 100 may measure the distance by infrared or laser.
  • Proximity light sensor 180G may include, for example, light emitting diodes (LEDs) and light detectors, such as photodiodes.
  • the ambient light sensor 180L is used for sensing ambient light brightness.
  • the electronic device 100 can adaptively adjust the brightness of the display screen 194 according to the perceived ambient light brightness.
  • the fingerprint sensor 180H is used to collect fingerprints.
  • the temperature sensor 180J is used to detect temperature.
  • the touch sensor 180K is also called “touch device”.
  • the touch sensor 180K can be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 form a touch screen, also called a "touchscreen".
  • the touch sensor 180K is used to detect a touch operation on or near it.
  • the touch sensor can pass the detected touch operation to the application processor to determine the type of touch event.
  • Visual output related to the touch operation can be provided through the display screen 194 .
  • the touch sensor 180K may also be disposed on the surface of the electronic device 100 , which is different from the position of the display screen 194 .
  • the bone conduction sensor 180M can acquire vibration signals.
  • the bone conduction sensor 180M can acquire the vibration signal of the vibrating bone mass of the human voice and the blood pressure beating signal.
  • the key 190 can be a mechanical key or a touch key.
  • the electronic device 100 can receive key input and generate key signal input related to user settings and function control of the electronic device 100 .
  • the electronic device 200 displays the extended screen desktop of the electronic device 100 in full screen.
  • the extended screen desktop usually adopts a custom Android application package (APK) different from that of the standard desktop of the electronic device 100; that is, the electronic device 100 simulates displaying the desktop and the status bar of the electronic device 100 on the electronic device 200 through the custom APK.
  • the extended screen desktop displayed by the electronic device 200 is different from the standard desktop of the electronic device 100, and the user cannot uniformly maintain the extended screen desktop and the standard desktop.
  • the status bar displayed with the extended screen desktop (i.e. the extended screen status bar) also differs in user experience (UX) from the standard status bar, lacks the functional characteristics of the standard status bar (such as the notification center, quick settings, folders, and FA functional characteristics), and lacks the necessary data synchronization (such as the management of notification messages).
  • the electronic device 100 can support the desktop launcher (Launcher) to simultaneously run multiple desktop instances in different display areas (Displays), support the system interface (SystemUI) to simultaneously run multiple status bar instances, and maintain the correspondence between status bar instances and desktop instances.
  • the desktop launcher may also be referred to as a desktop application.
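The per-display correspondence described above can be sketched in plain Java. The `DisplayInstanceRegistry` class and its method names below are illustrative assumptions, not the actual Launcher/SystemUI framework APIs; the sketch only models one desktop instance plus one status bar instance per display area, keyed by DisplayId:

```java
import java.util.HashMap;
import java.util.Map;

// Simplified model: one Launcher and one SystemUI each keep one instance
// per display area (Display), keyed by DisplayId. (Illustrative only.)
public class DisplayInstanceRegistry {
    public static final int DISPLAY0 = 0; // default screen
    public static final int DISPLAY1 = 1; // extension screen

    private final Map<Integer, String> desktopByDisplay = new HashMap<>();
    private final Map<Integer, String> statusBarByDisplay = new HashMap<>();

    // The Launcher creates a desktop instance for a display area.
    public void createDesktop(int displayId, String desktopName) {
        desktopByDisplay.put(displayId, desktopName);
    }

    // SystemUI creates the status bar instance for the same display area,
    // preserving the desktop-to-status-bar correspondence.
    public void createStatusBar(int displayId, String statusBarName) {
        statusBarByDisplay.put(displayId, statusBarName);
    }

    public String desktopOf(int displayId) { return desktopByDisplay.get(displayId); }
    public String statusBarOf(int displayId) { return statusBarByDisplay.get(displayId); }

    public static void main(String[] args) {
        DisplayInstanceRegistry r = new DisplayInstanceRegistry();
        r.createDesktop(DISPLAY0, "standard desktop");
        r.createStatusBar(DISPLAY0, "standard status bar");
        r.createDesktop(DISPLAY1, "extended screen desktop");
        r.createStatusBar(DISPLAY1, "extended screen status bar");
        System.out.println(r.desktopOf(DISPLAY1) + " / " + r.statusBarOf(DISPLAY1));
    }
}
```

Because both desktop instances come from the same registry (modeling the single Launcher APK), lookups by DisplayId stay isolated while the owning object remains shared.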
  • FIG. 2 shows a schematic diagram of screen projection communication between the electronic device 100 and the electronic device 200 provided in the embodiment of the present application.
  • the physical display screen configured by the electronic device 100 is the default screen of the electronic device 100
  • the electronic device 100 runs a standard desktop and a standard status bar in the default screen display area (i.e. Display0)
  • the display area identity (DisplayId) corresponding to the application window running on Display0 is the identity (ID) of Display0.
  • the electronic device 100 displays a standard desktop and a standard status bar on the default screen based on the application window run by Display0.
  • the display screen configured on the electronic device 200 is used as an extension screen of the electronic device 100, and an extension screen display area (Display1) is created for the extension screen.
  • the electronic device 100 runs the extended screen desktop and the extended screen status bar on Display1, and the DisplayId corresponding to the application window running on Display1 is the ID of Display1.
  • the extended screen desktop and the standard desktop are two desktop instances created and run by the same Launcher, and the extended screen status bar and the standard status bar are status bar instances created and run by the same SystemUI.
  • the electronic device 100 determines the display content of the extended screen based on the application window running on Display1, and can project the display content of the extended screen to the electronic device 200; the electronic device 200 can then display the extended screen desktop and the extended screen status bar.
  • the electronic device 100 may acquire device information such as the model of the electronic device 200 and the screen resolution.
  • the application window run by the electronic device 100 on Display1 is adapted based on the device information, such as the model and screen resolution, of the electronic device 200: the display size of the application window, and the display size and interface layout of each interface element in the application window, are changed accordingly, and some original functions of the electronic device 100 that are not applicable to the electronic device 200 are shielded.
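One way to picture the window-size adaptation described above is a uniform-fit scaling step driven by the receiving device's reported resolution. The `WindowAdapter` class and its fit policy below are illustrative assumptions, not the actual adaptation logic of the electronic device 100:

```java
// Hypothetical sketch of window adaptation: given the reported screen
// resolution of the receiving device, compute a display size for an
// application window so that it fits the extension screen while keeping
// its aspect ratio. (Uniform-fit scaling is an illustrative assumption.)
public class WindowAdapter {
    // Returns {width, height} of the window scaled uniformly to fit the target.
    public static int[] fitWindow(int winW, int winH, int targetW, int targetH) {
        double scale = Math.min((double) targetW / winW, (double) targetH / winH);
        return new int[] { (int) Math.round(winW * scale), (int) Math.round(winH * scale) };
    }

    public static void main(String[] args) {
        // e.g. a 1080x2340 phone window projected onto a 1920x1080 laptop screen
        int[] size = fitWindow(1080, 2340, 1920, 1080);
        System.out.println(size[0] + "x" + size[1]);
    }
}
```

A real adaptation would also reflow the interface elements inside the window, which this sketch deliberately leaves out.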
  • In this way, the data isolation of the two desktop instances can be realized; at the same time, since the two desktop instances originate from the same desktop launcher APK, synchronous operation of the two desktop instances can also be realized. For example, the theme, wallpaper, and lock screen interface of the extended screen of the electronic device 100 all follow the default screen of the electronic device 100.
  • the embodiment of the present application can run multiple status bar instances at the same time, and maintain the data channel of each status bar instance. Each status bar instance is associated with a different Display, so as to ensure that status bar events of different Displays can be isolated from each other without affecting each other.
  • the desktop and the status bar can run across Displays, and at the same time, data synchronization of different Displays can be maintained.
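The per-Display isolation of status bar events described above can be modeled in plain Java. The `StatusBarEventRouter` class below is an illustrative sketch, not the actual SystemUI implementation: each status bar instance owns an event queue keyed by its DisplayId, so an event targeted at one Display never reaches the status bar of another.

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Illustrative sketch: events are delivered only to the status bar
// instance bound to the event's DisplayId, keeping Displays isolated.
public class StatusBarEventRouter {
    private final Map<Integer, List<String>> eventsByDisplay = new HashMap<>();

    // Register a status bar instance for a display area.
    public void register(int displayId) {
        eventsByDisplay.put(displayId, new ArrayList<>());
    }

    // Deliver an event only to the status bar instance bound to displayId;
    // events for unregistered displays are dropped.
    public void dispatch(int displayId, String event) {
        List<String> queue = eventsByDisplay.get(displayId);
        if (queue != null) {
            queue.add(event);
        }
    }

    public List<String> eventsOf(int displayId) {
        return eventsByDisplay.getOrDefault(displayId, new ArrayList<>());
    }

    public static void main(String[] args) {
        StatusBarEventRouter router = new StatusBarEventRouter();
        router.register(0); // standard status bar on Display0
        router.register(1); // extended screen status bar on Display1
        router.dispatch(1, "expand notification center");
        System.out.println(router.eventsOf(0).size() + " / " + router.eventsOf(1).size());
    }
}
```

Data synchronization between the instances (for example, sharing the notification store) would sit beside this routing layer rather than inside it.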
  • the application window can be the Window object corresponding to an Activity in the Android system, an application window in the iOS system, or an application window in other operating systems.
  • An application program includes a plurality of application program windows, and an application program window generally corresponds to a user interface (User Interface, UI).
  • one application window may also correspond to multiple user interfaces.
  • the application window may also be referred to simply as a window in this embodiment of the present application.
  • the Activity in the Android system is an interface for interaction between the user and the application program, and each Activity component is associated with a Window object, which is used to describe a specific application window.
  • Activity is a highly abstract user interface component.
  • In Android, it represents the user interface and the corresponding business logic centered on the user interface.
  • the controls in the user interface can monitor and process events triggered by the user.
  • In an Android application, an Activity can be presented as a user interface, and an Android application can have multiple Activities.
  • An exemplary main interface 11 of the desktop application of the mobile phone 100 provided in the embodiment of the present application is introduced below.
  • FIG. 3A shows the main interface 11 on the mobile phone 100 for displaying the application programs installed on the mobile phone 100 .
  • the main interface 11 may include: a standard status bar 101, a calendar indicator 102, a weather indicator 103, a tray with frequently used application icons 104, and other application icons 105. Specifically:
  • the standard status bar 101 may include: one or more signal strength indicators 101A of a mobile communication signal (also referred to as a cellular signal), an operator name (such as "China Mobile") 101B, one or more signal strength indicators 101C of a wireless fidelity (Wi-Fi) signal, a battery status indicator 101D, and a time indicator 101E.
  • the tray 104 with commonly used application program icons can display: phone icon, contact icon, text message icon, camera icon.
  • Other application program icons 105 may display: file management icons, gallery icons, music icons, setting icons, and the like.
  • the main interface 11 may also include a page indicator 106. Icons of other application programs may be distributed on multiple pages, and the page indicator 106 may be used to indicate which page of applications the user is currently viewing. Users can swipe left and right in the area of other application icons to view application icons on other pages.
  • the main interface 11 may also include a desktop wallpaper 107 , and the desktop wallpaper 107 may be set by the user, or may be set by default on the mobile phone 100 .
  • the mobile phone 100 can use multiple themes, and the mobile phone 100 can change the desktop layout style, icon display style and desktop color by switching the theme. Usually each theme has a wallpaper configured by default, and the user can also modify the wallpaper of the mobile phone 100 under the current theme.
  • FIG. 3A only exemplarily shows the user interface on the mobile phone 100, and should not be construed as limiting the embodiment of the present application.
  • the mobile phone 100 can establish a connection with the computer 200 through short-range wireless communication technologies such as NFC, WiFi, and Bluetooth, and then can project the desktop of the mobile phone 100 to the computer 200 .
  • Screen projection through NFC is taken as an example below to illustrate the screen projection process.
  • the mobile phone 100 receives a user's downward sliding operation on the standard status bar 101.
  • the mobile phone 100 displays the control center interface 12 shown in FIG. 3B.
  • the control center interface 12 includes shortcut icons 201 of multiple commonly used functions (such as the WLAN icon, the Bluetooth icon, the NFC icon 201A, and the multi-screen collaboration icon shown in FIG. 3B), one or more nearby devices available for connection (such as the laptop MateBook, the tablet MatePad, the desktop computer Desktop, the smart speaker Sound X, etc.), and one or more control boxes of smart home devices (such as the control box 204 of the air purifier and the control box 205 of the smart lighting).
  • the NFC icon 201A has two states, namely a selected state and a non-selected state.
  • The NFC icon 201A shown in FIG. 3B is in the unselected state, and the mobile phone 100 has not enabled the NFC module. The NFC icon 201A can receive a user's input operation (such as a touch operation); in response to the input operation, the mobile phone 100 switches the NFC icon 201A to the selected state shown in FIG. 3C and turns on the NFC module.
  • the NFC module of the mobile phone 100 is located in the NFC area on the back of the mobile phone, and the NFC module of the computer 200 is located in the NFC area at the lower right corner of the computer 200 .
  • the user can realize the NFC connection between the mobile phone 100 and the computer 200 by bringing the NFC area of the mobile phone 100 close to the NFC area of the computer 200 , and then can realize the desktop projection of the mobile phone 100 to the computer 200 .
  • the NFC area of the mobile phone 100 can also be located in other parts of the mobile phone 100
  • the NFC area of the computer 200 can also be located in other parts of the computer 200, which are not specifically limited here.
  • the mobile phone 100 can detect the NFC signal of the computer 200, and the mobile phone 100 displays the prompt box 13 shown in FIG. 3E on the current display interface.
  • the prompt box 13 includes the model 301 of the computer 200 (for example, MateBook), prompt information 302, a connection control 303, and a cancel control 304.
  • the prompt information 302 is used to prompt the user to click the connection control 303 to establish the NFC connection; after the NFC connection, the user can control the mobile phone 100 on the computer 200.
  • the connection control 303 may receive a user's input operation (such as a touch operation), and in response to the input operation, the mobile phone 100 sends an NFC connection request to the computer 200 .
  • the cancel control 304 may receive a user's input operation (such as a touch operation), and in response to the input operation, the mobile phone 100 may close the prompt box 13 .
  • the computer 200 displays a prompt box 15 on the current display interface (such as the desktop 14 of the computer 200 ).
  • the prompt box 15 includes prompt information 305 , a connection control 306 and a cancel control 307 .
  • the prompt information 305 is used to prompt the user whether to allow the mobile phone 100 to connect to the device; clicking the connection control 306 establishes the NFC connection, and after the NFC connection, the user can operate the mobile phone 100 on the computer 200. That is, after the NFC connection, the mobile phone 100 can project its desktop to the computer 200, and the user can control the mobile phone 100 through the desktop displayed on the computer 200.
  • the connection control 306 may receive a user's input operation (such as a touch operation), and in response to the input operation, the computer 200 sends an NFC connection response to the mobile phone 100 .
  • the cancel control 307 can receive a user's input operation (such as a touch operation), and in response to the input operation, the computer 200 can close the prompt box 15 .
  • in response to the received NFC connection response of the computer 200, the mobile phone 100 creates an extended screen desktop through the Launcher and creates an extended screen status bar through the SystemUI; the DisplayId corresponding to the extended screen desktop and the extended screen status bar is the ID of the extended screen display area (Display1).
  • the mobile phone 100 sends the projection data of Display1 to the computer 200 .
  • the computer 200 displays the PC-based main interface 16 of the mobile phone 100 based on the projection data.
  • PC-ization means that, in order to adapt to the screen resolution of the computer 200 and the user's operating habits on the computer 200, interface optimization and function optimization are performed on the extended screen desktop and the extended screen status bar projected from the mobile phone 100 to the computer 200.
  • the main interface 16 may include: an extended screen status bar 401 , a search bar 402 , a Dock bar 403 and a desktop wallpaper 404 .
  • the main interface 16 of the screen projected from the mobile phone 100 to the extended screen is adapted to the screen resolution of the computer 200 .
  • the theme and wallpaper of the extended screen desktop displayed on the computer 200 are consistent with the standard desktop of the mobile phone 100.
  • the desktop wallpaper 107 and the desktop wallpaper 404 are from the same picture.
  • the desktop wallpaper 107 and the desktop wallpaper 404 are cropped from the same picture.
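As a rough illustration of how two wallpapers with different aspect ratios can be derived from the same picture, the sketch below computes a centered crop rectangle matching a target screen's aspect ratio. The `WallpaperCropper` class and its center-crop policy are assumptions for illustration, not the patented implementation:

```java
// Hedged sketch: center-crop a source picture to a target screen's aspect
// ratio, so the phone's and the extension screen's wallpapers both come
// from the same image. (The crop policy is an illustrative assumption.)
public class WallpaperCropper {
    // Returns {x, y, width, height} of a centered crop of (srcW x srcH)
    // matching the aspect ratio targetW:targetH.
    public static int[] centerCrop(int srcW, int srcH, int targetW, int targetH) {
        // Compare aspect ratios via cross-multiplication to avoid rounding.
        if ((long) srcW * targetH > (long) srcH * targetW) {
            int cropW = srcH * targetW / targetH;   // source is wider: trim the sides
            return new int[] { (srcW - cropW) / 2, 0, cropW, srcH };
        } else {
            int cropH = srcW * targetH / targetW;   // source is taller: trim top/bottom
            return new int[] { 0, (srcH - cropH) / 2, srcW, cropH };
        }
    }

    public static void main(String[] args) {
        // Crop a 4000x3000 picture for a 1920x1080 (16:9) extension screen.
        int[] crop = centerCrop(4000, 3000, 1920, 1080);
        System.out.println(crop[2] + "x" + crop[3] + " at (" + crop[0] + "," + crop[1] + ")");
    }
}
```

Running the same routine with the phone's portrait resolution as the target yields the second, differently shaped crop of the same picture.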
  • The extension screen status bar 401, the search bar 402, the Dock bar 403, and the lock screen interface of the main interface 16 that the mobile phone 100 projects to the extension screen are described in detail below.
  • the extended screen status bar 401 may include: a notification center icon 401A, an input method indicator 401B, one or more signal strength indicators 401C of WiFi signals, a battery status indicator 401D, a time indicator 401E, and a control center icon 401F.
  • the extended screen status bar 401 may also include other interface elements, such as a Bluetooth icon, a signal strength indicator of a cellular network, etc., which are not specifically limited here.
  • the mobile phone 100 runs two status bar instances, that is, the standard status bar 101 corresponding to the standard desktop displayed on the mobile phone 100 and the extended screen status bar 401 corresponding to the extended screen desktop displayed on the computer 200 .
  • the DisplayId associated with the standard status bar 101 is the ID of Display0
  • the DisplayId associated with the extended screen status bar 401 is the ID of Display1.
  • each interface element in the status bar 401 of the extended screen can open a corresponding secondary interface.
  • the secondary interface of each interface element in the status bar 401 of the extended screen will be introduced below.
  • the notification center icon 401A can receive a user's input operation (for example, a click operation of the left mouse button or a user's touch operation), and in response to the input operation, the computer 200 displays the notification center window 17 shown in FIG. 4A .
  • the input operations received by the extended screen desktop and the extended screen status bar involved in the embodiment of the present application may be operations performed by the user through the mouse; when the computer 200 is configured with a touch screen (or touch panel), the input operations received by the extended screen desktop and the extended screen status bar involved in the embodiment of the present application may be touch operations performed by the user through the touch screen (or touch panel), which are not specifically limited here.
  • the control of the extended screen status bar displayed on the computer 200 can receive the user's input operation (such as a click operation through the left button of the mouse), and in response to the above-mentioned input operation, the computer 200 may send relevant information of the above-mentioned input operation (such as the coordinates of the left-click and the DisplayId corresponding to the display screen on which the above-mentioned input operation acts) to the mobile phone 100.
  • based on the relevant information of the above-mentioned input operation, the mobile phone 100 recognizes the input operation as a left-click operation of the mouse on the notification center icon 401A in the extended screen status bar, and then determines that the response event triggered by the above-mentioned input operation is to display the notification center window 17,
  • the mobile phone 100 runs the notification center window 17 on Display1, and sends the updated display content of Display1 to the computer 200, and the computer 200 displays the notification center window 17 shown in FIG. 4A according to the projection data sent by the mobile phone 100.
  • for the processing of input operations received by the extended screen status bar displayed on the computer 200 in the subsequent embodiments, reference may be made to the above implementation process.
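The forwarding-and-hit-testing flow described above can be sketched as follows (an illustrative simplification with made-up icon bounds; the actual implementation hit-tests against the real interface layout of Display1):

```java
// Hypothetical sketch: the computer forwards (DisplayId, x, y) of a click to
// the phone, and the phone hit-tests the coordinates against the layout of
// that display to decide which response event to trigger.
class InputRouter {
    // Assumed bounds of the notification center icon 401A on Display1.
    static final int ICON_LEFT = 0, ICON_TOP = 0, ICON_RIGHT = 24, ICON_BOTTOM = 24;

    static String dispatch(int displayId, int x, int y) {
        if (displayId != 1) {
            // Events on Display0 are handled by the standard status bar.
            return "handle-on-default-display";
        }
        boolean onNotificationIcon =
                x >= ICON_LEFT && x < ICON_RIGHT && y >= ICON_TOP && y < ICON_BOTTOM;
        // The phone then renders the result on Display1 and re-projects it.
        return onNotificationIcon ? "show-notification-center-window" : "no-op";
    }
}
```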
  • the notification center window 17 includes one or more notification messages of the mobile phone 100 (such as the notification message 501 of music shown in FIG. 4A, the notification message 502 of a short message, and the notification message 503 of a bank card) , and delete control 504 .
  • the notification message (such as notification message 502) in the notification center window 17 can receive user input operations, and in response to different input operations of the user, operations such as removing, sharing, and viewing details of the notification message can be realized.
  • the notification message 502 may receive a user's input operation (for example, by clicking the left mouse button), and in response to the above input operation, the computer 200 may display the specific content of the notification message 502 in the user interface of the SMS application.
  • the notification message 502 may receive an input operation from the user (such as a right-click operation of the mouse), and in response to the above input operation, the computer 200 displays the menu bar 505 shown in FIG. , which includes a view control 505A, a remove control 505B, and a share control 505C; by receiving an input operation (for example, a click operation of the left mouse button) acting on the view control 505A, the remove control 505B, or the share control 505C, viewing, sharing, or removing the notification message 502 from the notification center window 17 can be implemented accordingly.
  • the display content of the notification center icon 401A includes icons of applications corresponding to the latest N notification messages.
  • the display content of the notification center icon 401A as shown in FIG. 4C includes the music icon corresponding to the latest notification message 501 , the SMS icon corresponding to the notification message 502 , and the bank card icon corresponding to the notification message 503 .
  • the notification message in the notification center window displayed on the default screen of the mobile phone 100 is also removed (or deleted).
  • the input method indicator 401B can receive the user's input operation (for example, a click operation through the left mouse button), and in response to the above input operation, the computer 200 displays the floating window 18 shown in FIG. , which includes one or a plurality of input method options (such as the option 511 of input method 1 and the option 512 of input method 2) and the input method setting control 513; the display content of the input method indicator 401B is the icon of the current input method of the mobile phone 100.
  • the current input method of the mobile phone 100 is input method 2
  • the display content of the input method indicator 401B is the icon of the input method 2
  • the option 511 of input method 1 can receive the user's input operation (for example, through a click operation of the left button of the mouse), and in response to the above input operation, the computer 200 can switch the input method of the mobile phone 100 on the extended screen desktop to input method 1.
  • the input method setting control 513 is used to display an input method setting interface.
  • the signal strength indicator 401C can receive the user's input operation (such as a single-click operation through the left mouse button), and in response to the above input operation, the computer 200 displays the floating window 19 shown in FIG. , which includes the WiFi network status 521 and the real-time network speed 522.
  • the network status of WiFi includes three network statuses: strong, medium and weak.
  • the current WiFi network status 521 of the mobile phone 100 is "medium”
  • the real-time network speed 522 is 263 kilobits per second (Kbps).
  • the battery status indicator 401D can receive the user's input operation (for example, by clicking the left button of the mouse), and in response to the above input operation, the computer 200 displays the floating window 20 shown in FIG. , which includes the current remaining power 531 and the setting control 532.
  • the current remaining power 531 of the mobile phone 100 is 40%.
  • the setting control 532 can receive user's input operation (for example, click operation by the left mouse button), and in response to the above input operation, the computer 200 can display the battery setting interface of the mobile phone 100 .
  • the time indicator 401E can receive the user's input operation (for example, click through the left mouse button), and in response to the above input operation, the computer 200 displays the floating window 21 shown in FIG. 4G.
  • the floating window 21 includes time, date and current month calendar.
  • the floating window 21 further includes a scroll-up control 541 and a scroll-down control 542 .
  • the scroll up control 541 can be used to view the calendar of the month before the current month
  • the scroll down control 542 can be used to view the calendar of the month after the current month.
  • the floating window 21 further includes a time and date setting control 543, and the setting control 543 is used to display a time and date setting interface.
  • the control center icon 401F can receive the user's input operation (for example, a click operation through the left mouse button), and in response to the above input operation, the computer 200 displays the control center window 22 shown in FIG. 4H; the control center window 22 includes shortcut icons 551 of a number of commonly used functions (such as the WLAN icon, Bluetooth icon, NFC icon, and multi-screen collaboration icon shown in FIG. 4H), an icon 552 of smart collaborative devices (such as a desktop computer or the smart speaker Sound X), and one or more control boxes of smart home devices (such as the control box 553 of the air purifier and the control box 554 of the smart lighting).
  • the icon 552 of the smart collaborative device shows that the mobile phone 100 and the computer (ie, the computer 200 ) of the model "MateBook" are currently in a coordinated state.
  • the mobile phone 100 may also adjust the color of the interface elements displayed in the status bar 401 of the extension screen according to the theme color and/or wallpaper color of the extension screen desktop.
  • when the theme color of the extended screen desktop and the color of the desktop wallpaper 404 are darker, the color of the interface elements of the extended screen status bar 401 is adjusted to white or another preset light color; when the theme color of the extended screen desktop and the color of the desktop wallpaper are lighter, the color of the interface elements of the extended screen status bar 401 is adjusted to black or another preset dark color.
  • when the color of the desktop wallpaper 404 of the main interface 16 is lighter, the main color of the interface elements of the extended screen status bar 401 is black; as shown in FIG. 4I, when the color of the desktop wallpaper 404 of the main interface 16 is darker, the main color of the interface elements of the extended screen status bar 401 is white.
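One plausible way to implement the light/dark adjustment is sketched below (the 0.5 luminance threshold and the averaging rule are assumptions, not taken from the embodiment):

```java
// Hypothetical sketch: average the perceived luminance of the wallpaper
// (values normalized to [0, 1]) and pick light icons over a dark wallpaper,
// dark icons over a light wallpaper.
class StatusBarTint {
    static String pickIconColor(double[] pixelLuminances) {
        double sum = 0;
        for (double l : pixelLuminances) {
            sum += l;
        }
        double avg = sum / pixelLuminances.length;
        // Assumed threshold: below 0.5 counts as a dark wallpaper.
        return (avg < 0.5) ? "white" : "black";
    }
}
```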
  • the search bar 402 may receive an input operation from the user (for example, a double-click operation through the right mouse button), and in response to the above input operation, the computer 200 displays the global search floating window 23 shown in FIG. 5 .
  • the floating window 23 includes a search bar 601 , frequently used application program icons 602 , search history 603 and hot news 604 .
  • the search bar 601 can receive the text information input by the user and perform a global search based on it; the frequently used application program icons 602 can include icons of one or more frequently used application programs; the search history 603 can include one or more recent search records of the search bar 601; and the hot news 604 may include real-time hot news titles and hot news titles in the hot search list.
  • global search can be realized through the search bar 402: it can not only search locally for applications installed on the mobile phone 100, stored files, and the like, but also search online for news, videos, music, and other resources.
  • the search bar 402 on the desktop of the extended screen can be set in the status bar 401 of the extended screen, which is not specifically limited here.
  • the user can use the mouse to move the cursor of the mouse on the main interface 16 of the mobile phone 100 displayed on the computer 200 .
  • the embodiment of the present application adds a corresponding cursor motion effect, thereby increasing the visual feedback for the user's input operation.
  • the initial display form of the cursor of the mouse can be a first shape (such as an arrow); when the cursor of the mouse hovers over a specified interface element displayed on the extended screen desktop or the extended screen status bar (such as the search bar 402 shown in FIG. 5), the display form of the cursor of the mouse can be changed to a second shape (for example, the small hand on the search bar 402 shown in FIG. 5).
  • the Dock bar 403 may also be called a program dock.
  • the Dock bar 403 may include a fixed application area 701 and a recently run area 702.
  • the fixed application area 701 includes an application list icon 701A, a multitasking icon 701B, a display desktop icon 701C, and icons of one or more application programs (for example, the file management icon 701D and the browser icon 701E).
  • Recently run area 702 includes icons of recently run applications.
  • when the fixed application area 701 already includes the icon of a recently run application program, it is not necessary to add the icon of the application program to the recently run area 702.
  • the user can set the display state of the Dock bar 403 .
  • the display status of the Dock bar 403 is set to be automatically hidden.
  • when the computer 200 displays the extended screen desktop of the mobile phone 100 or the user interface of other application programs of the mobile phone 100, the Dock bar 403 is automatically hidden, and the computer 200 displays the Dock bar 403 only when the user moves the cursor close to the area where the Dock bar 403 is located.
  • the display status of the Dock bar 403 is set to be displayed all the time.
  • when the computer 200 displays the extended screen desktop of the mobile phone 100, or displays the user interface of other application programs of the mobile phone 100 in the form of a floating window, the Dock bar 403 is always displayed.
  • when the computer 200 displays the user interface of other application programs in full screen, the Dock bar 403 is automatically hidden; when the user moves the cursor close to the area of the Dock bar 403 with the mouse, the computer 200 displays the Dock bar 403.
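The two display states of the Dock bar described above can be summarized as a small visibility rule (an illustrative sketch; names are hypothetical):

```java
// Hypothetical sketch of Dock bar visibility: in AUTO_HIDE mode the bar is
// shown only while the cursor is near the Dock area; in ALWAYS_SHOW mode it
// is hidden only when an application window is full screen and the cursor is
// away from the Dock area.
class DockBarPolicy {
    enum Mode { AUTO_HIDE, ALWAYS_SHOW }

    static boolean isVisible(Mode mode, boolean fullScreenApp, boolean cursorNearDock) {
        if (mode == Mode.AUTO_HIDE) {
            return cursorNearDock;
        }
        // ALWAYS_SHOW: visible on the desktop and over floating windows,
        // hidden behind full-screen windows unless the cursor approaches.
        return !fullScreenApp || cursorNearDock;
    }
}
```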
  • the application list icon 701A can receive the user's input operation (for example, a click operation of the left mouse button); in response to the above input operation, the computer 200 displays the user interface 24 shown in FIG. 6A, which includes the application program icon list 703; the application icon list 703 may display multiple application icons, such as the music icon 703A and the gallery icon 703B.
  • the user interface 24 may further include a page indicator 704 , other application program icons may be distributed on multiple pages, and the page indicator 704 may be used to indicate the application program in which page the user is currently viewing.
  • the user can slide left or right on the user interface 24 with a finger to view application program icons on other pages.
  • the user interface 24 may further include a scroll left control 705 and a scroll right control 706 , and the user may click the scroll left control 705 or the scroll right control 706 to view application icons on other pages.
  • the scroll left control 705 and the scroll right control 706 can be hidden; when the user moves the cursor close to the area where the scroll left control 705 (or the scroll right control 706) is located, the computer 200 displays the scroll left control 705 (or the scroll right control 706).
  • the user interface 24 may further include a search bar 707, which is used to quickly search for icons of application programs installed on the mobile phone 100.
  • the user can add the application icons in the application icon list 703 to the fixed application area 701 of the Dock bar.
  • the music icon 703A may receive a user's drag operation (for example, the user selects the icon with the left mouse button and then drags it to the fixed application area 701), and in response to the above drag operation, the computer 200 may move the A music icon 703A is added to the fixed application area 701 .
  • the application program icons in the application program icon list 703 and the Dock bar 403 can receive the user's input operation (for example, a double-click operation of the left mouse button or a touch operation of the user's finger); in response to the above input operation, the computer 200 may display the user interface of the application program corresponding to the icon.
  • the mobile phone 100 sets a white list of applications that can be displayed on the extended screen, and functions that cannot be implemented on the desktop of the extended screen can be blocked by setting the white list.
  • the application icon list 703 displays only the icons of the applications in the above-mentioned white list; the icon of an application program is added to the application icon list 703 only when the application program is in the white list.
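The white-list filtering might be sketched as below (illustrative only; the application names in the test are made up):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Set;

// Hypothetical sketch: only applications in the extended-screen white list
// are surfaced in the application icon list 703, blocking functions that
// cannot be implemented on the extended screen desktop.
class ExtendedScreenWhitelist {
    private final Set<String> allowed;

    ExtendedScreenWhitelist(Set<String> allowed) {
        this.allowed = allowed;
    }

    // Return the installed applications that may appear on the extended screen.
    List<String> filter(List<String> installedApps) {
        List<String> shown = new ArrayList<>();
        for (String app : installedApps) {
            if (allowed.contains(app)) {
                shown.add(app);
            }
        }
        return shown;
    }
}
```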
  • some application programs can adapt to the screen resolution of the computer 200 and display a PC-style user interface in full screen on the extended screen; other application programs cannot adapt to the screen resolution of the computer 200, and only the standard user interface of the application program on the mobile phone 100 can be displayed on the extended screen.
  • the gallery supports full-screen display of the PC-based user interface on the extended screen
  • the gallery icon 703B can receive the user's input operation (such as double-clicking through the left mouse button), and in response to the above-mentioned input operation, the computer 200 can display, in full screen, the PC-style user interface 25 of the gallery shown in FIG. 6C.
  • the music does not support the full-screen display of the PC-based user interface on the extended screen
  • the music icon 703A can receive the user's input operation (for example, through a double-click operation of the left mouse button), and in response to the above-mentioned input operation, the computer 200 displays the standard user interface of the music application on the extended screen.
  • when the computer 200 receives an input operation for starting application program 1 acting on application program icon 1 (such as double-clicking the left button on the gallery icon 703B through the mouse), the computer 200 sends to the mobile phone 100 the relevant information of the above-mentioned input operation (such as the coordinates of the double-click of the left button and the DisplayId corresponding to the extended screen on which the above-mentioned input operation acts); based on the relevant information of the above-mentioned input operation and the interface layout of the user interface (such as the user interface 24), the mobile phone 100 identifies the input operation as an input operation acting on application program icon 1 on the extended screen desktop, and determines that the input operation is used to start application program 1.
  • the standard desktop of the mobile phone 100 may have already started the application 1 .
  • when the mobile phone 100 determines that it is not currently running application program 1 on Display0, the mobile phone 100 starts user interface 1 of application program 1 (such as the user interface 25 of the gallery) on Display1, and projects the updated display content of Display1 to the computer 200; the computer 200 displays the above user interface 1 according to the screen projection data sent by the mobile phone 100.
  • when the mobile phone 100 determines that it is running application program 1 on Display0, the mobile phone 100 moves the activity stack (ActivityStack) of application program 1 from Display0 to Display1, runs application program 1 on Display1, clears the display content of application program 1 in Display0, and then projects the updated display content of Display1 to the computer 200;
  • the computer 200 displays the user interface of the application program 1 (such as the user interface 25 of the gallery) according to the projection data sent by the mobile phone 100.
  • alternatively, when the mobile phone 100 determines that it is running application program 1 on Display0, the mobile phone 100 sends prompt message 1 to the computer 200, and the computer 200 may display prompt message 2 based on the above prompt message 1 to remind the user that the mobile phone 100 is running application program 1 and that the user can manipulate application program 1 through the mobile phone 100.
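The launch decision described in the preceding bullets reduces to a simple branch (a sketch only; the returned strings stand in for the framework calls that actually start an activity on a display or move an ActivityStack between displays):

```java
// Hypothetical sketch: when an app icon on the extended screen desktop is
// activated, either start the app directly on Display1, or, if it is already
// running on Display0, migrate its ActivityStack to Display1 first.
class ExtendedScreenLauncher {
    static String launchOnExtendedScreen(String app, boolean runningOnDisplay0) {
        if (runningOnDisplay0) {
            // Move the existing stack so the app continues on the extended screen.
            return "move ActivityStack of " + app + " from Display0 to Display1";
        }
        return "start " + app + " on Display1";
    }
}
```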
  • the multitasking icon 701B can receive the user's input operation (for example, a click operation through the left mouse button), and in response to the above input operation, the computer 200 displays the multitasking interface 27 shown in FIG. 7A; the multitasking interface 27 includes thumbnail images of one or more applications recently opened by the user through the extended screen desktop (such as the thumbnail 711 of the gallery and the thumbnail 712 of the note), and the delete control 713.
  • the multitasking interface 27 may further include a scroll right control 714.
  • a thumbnail image (such as the thumbnail image 711 of the gallery) can receive the user's input operation (such as a click operation through the left button of the mouse), and in response to the detected input operation, the computer 200 can display the user interface 25 of the gallery corresponding to the thumbnail.
  • the thumbnail image (for example, the thumbnail image 711 of the gallery) includes a delete control 711A, and the delete control 711A can receive an input operation of the user (for example, a single-click operation by the left mouse button), and in response to the detected input operation, The computer 200 can clear the running memory occupied by the gallery corresponding to the thumbnail 711 in Display1 through the mobile phone 100 .
  • the delete control 713 can receive the user's input operation (such as a single-click operation through the left mouse button), and in response to the detected input operation, the computer 200 can clear, through the mobile phone 100, the running memory occupied on Display1 by all the applications corresponding to the thumbnail images in the multitasking interface.
  • the scroll right control 714 may receive the user's input operation (for example, a click operation of the left mouse button), and in response to the detected input operation, the computer 200 may display more thumbnail images of recently run applications.
  • the display desktop icon 701C can be used to minimize one or more currently displayed application program windows and display the extended screen desktop; when no other application windows are displayed on the extended screen desktop, the display desktop icon 701C may be used to restore the display of one or more recently minimized application windows.
  • the computer 200 displays the user interface 25 of the gallery; the computer 200 can receive the user's input operation (for example, the user moves the cursor close to the area where the Dock bar 403 is located on the display screen through the mouse), and in response to the above input operation, the computer 200 displays the Dock bar 403 shown in FIG. 8A on the user interface 25.
  • the display desktop icon 701C in the Dock bar 403 can receive the user's input operation (such as a single-click operation of the left mouse button), and in response to the above-mentioned input operation, the computer 200 minimizes the user interface 25 and displays the extended screen desktop; as shown in FIG. 8B, after the user interface 25 is minimized, the display desktop icon 701C can receive the user's input operation (for example, a click operation of the left mouse button), and in response to the above input operation, the computer 200 displays the most recently minimized user interface 25 again.
  • the user interface of the application program displayed in full screen may display a minimize control 721 , a zoom out control 722 , and a close control 723 .
  • the minimization control 721 can receive a user's input operation (for example, through a left mouse click operation), and in response to the above input operation, the computer 200 minimizes the user interface 25 .
  • the close control 723 can receive the user's input operation (for example, click operation by the left mouse button), in response to the above input operation, the computer 200 closes the user interface 25, and clears the running memory occupied by the user interface 25 in the Display1 of the mobile phone 100.
  • the zoom-out control 722 can receive the user's input operation (for example, through a click operation of the left mouse button); as shown in FIG. 8D, in response to the above-mentioned input operation, the computer 200 shrinks the user interface 25 and runs the user interface 25 in the form of the floating window 28.
  • the floating window 28 may include a magnification control 724, and the magnification control 724 may receive an input operation from the user (for example, a single-click operation of the left mouse button); in response to the above input operation, the computer 200 displays the user interface 25 in full screen again.
  • the computer 200 can tile and display multiple application program windows on the extended screen desktop of the mobile phone 100 .
  • the computer 200 simultaneously displays the floating window 28 corresponding to the gallery and the floating window 29 corresponding to the memo.
  • the application program icons in the Dock bar 403 and the application program icon list 703 can also receive the user's input operation (for example, through a click operation of the right mouse button), and in response to the above input operation, the computer 200 can display multiple operation options of the application program corresponding to the icon, so as to implement removal, sharing, uninstallation, or other shortcut functions of the above-mentioned application program.
  • the corresponding operation options for each application program may be different, for example, some system applications (such as gallery, file management, etc.) do not support being uninstalled.
  • the application program icon (for example, the browser icon 701E) in the Dock bar 403 can receive the user's input operation (for example, through a click operation of the right mouse button), and in response to the above-mentioned input operation, the computer 200 may display a floating window 30.
  • the floating window 30 may include a remove control 801 , a share control 802 , and an uninstall control 803 .
  • the removal control 801 may receive a user's input operation (for example, a mouse right click operation), and in response to the above input operation, the computer 200 may remove the browser icon 701E from the Dock bar 403 .
  • the sharing control 802 is used to share the browser application to the target object.
  • the uninstall control 803 is used to uninstall the browser application of the mobile phone 100 .
  • the application program icon (such as the gallery icon 701F) in the Dock bar 403 can also receive the user's hovering operation (such as hovering the mouse cursor over the application program icon); in response to the above hovering operation, if the application corresponding to the application icon is running in the background, the computer 200 may display a thumbnail of the application corresponding to the application icon, and the thumbnail is a thumbnail of a user interface of the application recently run.
  • when the user hovers the mouse cursor over the gallery icon 701F, the computer 200 displays a thumbnail 804 of the gallery.
  • the style of the lock screen interface of the extended screen desktop of the mobile phone 100 is consistent with that of the lock screen interface of the standard desktop.
  • when the extended screen desktop is locked, the mobile phone 100 also locks the standard desktop, and vice versa.
  • the computer 200 displays the lock screen interface 31 of the extended screen desktop shown in FIG. , which includes the lock screen wallpaper 901; in response to the above lock screen command, the mobile phone 100 displays the lock screen interface 32 shown in FIG. 10B, and the lock screen interface 32 includes the lock screen wallpaper 902.
  • the mobile phone 100 displays the lock screen interface 32 shown in FIG. 10B, and sends a lock screen command to the computer 200;
  • the lock screen wallpaper 901 and the lock screen wallpaper 902 are from the same picture; for example, they may be cropped from the same picture.
  • in other embodiments, the mobile phone 100 does not follow the extended screen desktop in locking the standard desktop, and vice versa.
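The two lock-synchronization behaviors (synchronized in some embodiments, independent in others) can be sketched as follows (illustrative; field and method names are hypothetical):

```java
// Hypothetical sketch: locking either desktop optionally propagates a lock
// command to the other, so lock screen interfaces 31 and 32 appear together;
// with synchronization disabled, each desktop locks independently.
class LockSync {
    boolean standardLocked = false;  // standard desktop on Display0
    boolean extendedLocked = false;  // extended screen desktop on Display1
    final boolean syncEnabled;

    LockSync(boolean syncEnabled) {
        this.syncEnabled = syncEnabled;
    }

    void lockExtended() {
        extendedLocked = true;
        if (syncEnabled) {
            standardLocked = true;  // the phone locks the standard desktop too
        }
    }

    void lockStandard() {
        standardLocked = true;
        if (syncEnabled) {
            extendedLocked = true;  // the phone sends a lock command to the computer
        }
    }
}
```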
  • the screen projection method provided by the embodiment of the present application involves modifications to multiple system applications (including the desktop launcher (Launcher), SystemUI, the file manager (FileManager), global search (Hisearch), wireless screen projection (AirSharing), etc.) and to the system framework. Among them:
  • the desktop launcher is used to manage the desktop layout, Dock bar, application list, multitasking, etc. of the desktop.
  • SystemUI is a UI component that provides users with system-level information display and interaction, and is used to manage status bar, navigation bar, notification center, lock screen interface and wallpaper, etc.
  • the file manager is used to provide external file boxes, manage desktop files, provide external file operation capabilities, and implement file drag and drop between applications.
  • Global search is used to realize local and online global search.
  • the wireless screen projection is used to realize the wireless screen projection between the mobile phone 100 and the target screen projection device (such as the computer 200).
  • the software system of the mobile phone 100 may adopt a layered architecture, an event-driven architecture, a micro-kernel architecture, a micro-service architecture, or a cloud architecture.
  • the embodiment of the present application takes the Android system with a layered architecture as an example to illustrate the software structure of the electronic device 100 .
  • FIG. 11 shows a block diagram of the software architecture of the mobile phone 100 provided by the embodiment of the present application.
  • the layered architecture divides the software into several layers, and each layer has a clear role and division of labor. Layers communicate through software interfaces.
  • the Android system can be divided into an application layer, an application framework layer (Framework) and a system library (Native) from top to bottom.
  • the Android Runtime includes core library and virtual machine. The Android runtime is responsible for the scheduling and management of the Android system.
  • the core library consists of two parts: one part is the function libraries that the Java language needs to call, and the other part is the core library of Android.
  • the application layer and the application framework layer run in virtual machines.
  • the virtual machine executes the java files of the application program layer and the application program framework layer as binary files.
  • the virtual machine is used to perform functions such as object life cycle management, stack management, thread management, security and exception management, and garbage collection.
  • the application layer includes a series of application packages (Android application package, APK), such as desktop launcher apk (HWLauncher6.apk), system interface apk (SystemUI.apk), file manager apk (FileManager.apk), global search apk (Hisearch.apk) and wireless sharing apk (AirSharing.apk), etc.
  • the desktop launcher apk includes a standard desktop launcher (UniHomelauncher) and an extended screen desktop launcher (PcHomelauncher).
  • the system interface apk (SystemUI.apk) includes the standard status bar (StatusBar) and the extended screen status bar (PcStatusBar).
  • the file manager apk (FileManager.apk) includes standard interface, column interface and file box.
  • the global search apk (Hisearch.apk) includes the standard search interface and the extended screen search interface.
  • the wireless sharing apk (AirSharing.apk) includes ordinary wireless screen projection and the self-developed high-definition display screen projection.
  • the extended screen desktop may also be called the Pc desktop, and the extended screen status bar may also be called the Pc status bar, which are not specifically limited here.
  • the desktop launcher supports running multiple desktop instances, such as UniHomelauncher and PcHomelauncher: UniHomelauncher is used to start the standard desktop displayed on the mobile phone 100, and PcHomelauncher is used to start the extended screen desktop projected to the computer 200; the extended screen desktop is a PC-style desktop adapted to the computer 200.
  • the standard desktop is displayed based on the application window running in the display area (Display0) of the default screen
  • the extended screen desktop is displayed based on the application window running in the display area (Display1) of the extended screen.
  • SystemUI supports running multiple status bar instances, such as the standard status bar and the extended screen status bar.
  • the DisplayId corresponding to the standard status bar is the ID of Display0
  • the DisplayId corresponding to the extended screen status bar is the ID of Display1.
  • the newly added column interface and file box in the file manager are responsible for extended screen desktop file operations (such as copying, pasting, moving, deleting, restoring, and dragging).
  • Global search also supports running multiple search interface instances, for example, the standard search interface corresponding to the standard desktop and the extended screen search interface corresponding to the extended screen desktop.
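The per-display multi-instance idea above can be sketched as follows. This is an illustrative, hedged sketch only: the class `MultiInstanceSketch` and its methods are invented for this example and are not part of the actual system; it merely shows one component type (here the launcher) being instantiated once per logical display and told apart by DisplayId.

```java
import java.util.HashMap;
import java.util.Map;

// Illustrative sketch: one UI component instance per logical display,
// keyed by DisplayId (0 = default screen, 1 = extended screen).
public class MultiInstanceSketch {

    public static class Instance {
        public final String name;
        public final int displayId;
        public Instance(String name, int displayId) {
            this.name = name;
            this.displayId = displayId;
        }
    }

    // DisplayId -> running instance of one component type (e.g. the launcher).
    private final Map<Integer, Instance> instances = new HashMap<>();

    /** Start (or replace) the instance bound to the given display. */
    public Instance start(String name, int displayId) {
        Instance d = new Instance(name, displayId);
        instances.put(displayId, d);
        return d;
    }

    /** Name of the instance running on the given display, or null if none. */
    public String instanceOn(int displayId) {
        Instance d = instances.get(displayId);
        return d == null ? null : d.name;
    }

    public static void main(String[] args) {
        MultiInstanceSketch launcher = new MultiInstanceSketch();
        launcher.start("UniHomelauncher", 0); // standard desktop on Display0
        launcher.start("PcHomelauncher", 1);  // extended screen desktop on Display1
        System.out.println(launcher.instanceOn(0) + " / " + launcher.instanceOn(1));
    }
}
```

The same keying applies, per the text, to the status bar instances (StatusBar/PcStatusBar) and the search interface instances.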
  • the application layer also includes a toolkit (Kit), which includes a software development kit (HwSDK), a user interface kit (Uikit) and a screen casting protocol kit (Cast+kit).
  • HwSDK is newly added to the Kit.
  • HwSDK is a collection of development tools for building application software for the software package, software framework, hardware platform, and operating system involved in this application.
  • the embodiment of this application modifies Uikit and Cast+kit, so that Uikit can enhance the native control capabilities of the extended screen desktop, optimize the text right-click menu, and dynamically optimize operations such as mouse click and mouse hover (Hover), and so that Cast+kit can realize the projection of the extended screen desktop to a high-definition display.
  • the application framework layer provides an application programming interface (application programming interface, API) and a programming framework for applications in the application layer.
  • the application framework layer includes some predefined functions.
  • the application framework layer adds a self-developed display projection service (HwProductiveMultiWindowManager) and a Dock bar management service (DockBarManagerService); the self-developed display projection service is used to run multiple application windows of the same application at the same time and to project a specified window among those application windows to a target screen projection device (such as the computer 200).
  • the Dock bar management service is used to manage the Dock bar.
  • the embodiment of the present application also modifies the display management service (DisplayManagerService, DMS), input and output service (InputManagerService, IMS) and parallel view service (HwPartsMagicWindow).
  • DMS is used to manage the life cycle of interface display, control the logical displays of the currently connected default screen display area (Display0) and extended screen display area (Display1), and send notifications to the system and application programs when the display status of Display0 or Display1 changes.
  • the IMS is used to manage the input and output of the mobile phone 100, and the input and output devices of the mobile phone 100 may include a printer, a keyboard, a mouse, a hard disk, a magnetic disk, and a writable optical disc.
  • the parallel view service is used to realize the application split-screen function, that is, two user interfaces corresponding to two different activities of one application can be displayed at the same time; the two user interfaces can be displayed on the same display screen or on different display screens, and the parallel view service provided in the embodiment of the present application supports displaying the user interfaces through a floating window.
  • the mobile phone 100 also modifies the package management service (PackageManagerService, PMS) and wallpaper service (WallpaperService) of the application framework layer, so that the selection strategy for the default Launcher (ie UniHomelauncher) is added in the package management service, WallpaperService supports displaying wallpapers on multiple Displays.
  • a system library can include multiple function modules.
  • the system library may include an event dispatcher (InputDispatcher), an audio system (AudioSystem), and a screencasting protocol (Huawei Cast+), etc.
  • FIG. 12 shows another software system framework provided by the embodiment of the present application.
  • SystemUI.apk includes a standard status bar and an extended screen status bar.
  • SystemUI.apk can specifically include: the system interface application (SystemUIApplication), and the base classes (Base Class), services (Service), components (Component), dependency class providers (Dependency Providers) and UI control classes (UI Control) used to implement the standard status bar.
  • SystemUIApplication is a subclass of Application and is responsible for the initialization of all SystemUI components.
  • the base class includes the system interface factory class (SystemUIFactory), the system interface root component (SystemUIRootComponent) and so on.
  • SystemUIFactory is used to create SystemUI components
  • SystemUIRootComponent is used to implement the initialization of the dependency injection framework (dagger).
  • Services include system interface services (SystemUIService).
  • SystemUIService is used to initialize a series of SystemUI components. When it starts, the service instantiates the sub-services defined in the service list of SystemUIService one by one, and runs each sub-service by calling its start() method.
  • the above-mentioned sub-services are all inherited from the SystemUI abstract class.
  • the status bar and the navigation bar are sub-services in the above-mentioned service list.
  • Components include CommandQueue, Dependency, KeyguardViewMediator, Notification, Systembars, Statusbar, etc. Among them:
  • CommandQueue is a Binder class used to handle requests related to the status bar and the notification center. It is registered by the status bar to the status bar management service (StatusBarManagerService) to receive messages from StatusBarManagerService. CommandQueue maintains an event queue internally, and the status bar service (StatusBarService) implements the Callbacks callback in CommandQueue.
  • the embodiment of the present application modifies the CommandQueue component so that it supports message distribution of multiple StatusBars (that is, the standard status bar and the status bar of the extended screen).
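The multi-StatusBar message distribution described above can be sketched as follows. This is a hedged, simplified sketch, not the actual AOSP CommandQueue: the class `CommandQueueSketch` and its `setIcon` request are invented for illustration; it shows the one-callback design being widened to a list of callbacks so that both the standard status bar and the extended screen status bar receive every request.

```java
import java.util.ArrayList;
import java.util.List;

// Illustrative sketch: a command queue that fans status-bar requests out to
// every registered Callbacks instance (one per status bar / per display).
public class CommandQueueSketch {

    /** Callback interface implemented by each status bar instance. */
    public interface Callbacks {
        void setIcon(String slot, String icon);
    }

    /** A status bar bound to one logical display. */
    public static class StatusBar implements Callbacks {
        public final int displayId;
        public final List<String> received = new ArrayList<>();
        public StatusBar(int displayId) { this.displayId = displayId; }
        @Override public void setIcon(String slot, String icon) {
            received.add(slot + "=" + icon);
        }
    }

    // Originally a single callback; widened here to a list of them.
    private final List<Callbacks> callbacks = new ArrayList<>();

    public void addCallbacks(Callbacks cb) { callbacks.add(cb); }

    /** Distribute one request to every registered status bar. */
    public void setIcon(String slot, String icon) {
        for (Callbacks cb : callbacks) cb.setIcon(slot, icon);
    }

    public static void main(String[] args) {
        CommandQueueSketch queue = new CommandQueueSketch();
        StatusBar standard = new StatusBar(0);   // standard status bar, Display0
        StatusBar extended = new StatusBar(1);   // extended screen status bar, Display1
        queue.addCallbacks(standard);
        queue.addCallbacks(extended);
        queue.setIcon("wifi", "connected");
        System.out.println(standard.received + " " + extended.received);
    }
}
```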
  • SystemBars inherits from the base class SystemUI and is the entry class for creating the entire SystemUI view.
  • the standard status bar is mainly used to display application notification icons (Icon) and system status icons (alarm clock icon, wifi icon, SIM card icon, system time icon, etc.) on the standard desktop, and to control and manage the above icons.
  • Notification is used to display notification information on the status bar, and to control and manage the above notification information.
  • KeyguardViewMediator is the core class of the lock screen; other lock screen objects interact with each other through KeyguardViewMediator, which is the management class of the state callback. All calls from the lock screen service (KeyguardService) are transferred to the UI thread by the KeyguardViewMediator.
  • Dependency class providers include the status bar window control class (StatusBarWindowController), the status bar icon controller implementation class (StatusBarIconControllerImpl), the status bar policy (StatusBarPolicy), etc. Among them:
  • StatusBarWindowController is used to manage the status bar window view (StatusBarWindowView), and can call the interface of WindowManager to display the standard status bar.
  • the StatusBarWindowController component is modified to support adding, deleting and managing multiple status bar objects.
  • StatusBarIconControllerImpl is used to manage application icons in the status bar, including icon size, position and color changes.
  • StatusBarPolicy is used to manage the display policy of the status bar (such as updating status bar icons, display time, and display position).
  • StatusBarPolicy is a policy management class, the actual function is realized by StatusBarService.
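The StatusBarWindowController modification mentioned above (supporting adding, deleting and managing multiple status bar objects) can be sketched as follows. This is an assumption-laden illustration, not framework code: `MultiBarControllerSketch` and `BarWindow` are invented names; the point shown is replacing a single status-bar window field with a map keyed by DisplayId.

```java
import java.util.HashMap;
import java.util.Map;

// Illustrative sketch: status-bar windows keyed by DisplayId so bars can be
// added, looked up and removed per display.
public class MultiBarControllerSketch {

    /** Stand-in for a status-bar window view attached to one display. */
    public static class BarWindow {
        public final int displayId;
        public BarWindow(int displayId) { this.displayId = displayId; }
    }

    // DisplayId -> status-bar window (was a single field before the change).
    private final Map<Integer, BarWindow> bars = new HashMap<>();

    /** Add a status bar window for the given display. */
    public BarWindow add(int displayId) {
        BarWindow w = new BarWindow(displayId);
        bars.put(displayId, w);
        return w;
    }

    /** Remove the status bar window of the given display. */
    public void remove(int displayId) { bars.remove(displayId); }

    public int count() { return bars.size(); }

    public boolean hasBarOn(int displayId) { return bars.containsKey(displayId); }

    public static void main(String[] args) {
        MultiBarControllerSketch c = new MultiBarControllerSketch();
        c.add(0);   // standard status bar
        c.add(1);   // extended screen status bar
        System.out.println(c.count());
    }
}
```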
  • UI control classes include status bar window view (StatusBarWindowView), notification panel view (NotificationPanelView), quick setting fragment (QSFragment), status bar fragment (StatusBarFragment), status bar view (StatusBarView) and so on.
  • StatusBarWindowView is used to determine the root layout when the status bar is not expanded, creating a status bar window view.
  • StatusBarView is responsible for creating and instantiating the entire SystemUI view (including status bar, notification center and lock screen interface, etc.), and StatusBarView defines the name, display order, and portable network graphics (Png) of the icons displayed in the status bar.
  • When the StatusBarService is initialized, it creates a StatusBarView for displaying the status bar; the StatusBarService initializes the status bar by calling the makeStatusBarView method.
  • NotificationPanelView is the control class of the notification center after the status bar is pulled down.
  • QSFragment is the control class of the control center after the status bar is pulled down.
  • StatusBarFragment manages the status bar in the collapsed state and is responsible for the life cycle management of the icons in the status bar.
  • the embodiment of the present application adds services, components, dependency class providers and UI control classes for realizing the extended screen status bar in the system interface apk. Among them:
  • the services used to implement the status bar of the extended screen include ProductiveService, and ProductiveService inherits from Service.
  • the components used to realize the status bar of the extended screen include Pc dependency class (PcDependency), Pc system provider (PcSystemProviders), Pc system bar (PcSystembars), and the extended screen status bar.
  • PcDependency, PcSystembars, and the status bar of the extended screen are all inherited from the corresponding components of the aforementioned standard status bar, and implement similar functions for the status bar of the extended screen, which will not be repeated here.
  • the dependency class providers used to realize the status bar of the extended screen include the Pc status bar window control class (PcStatusBarWindowController), the screen control class (ScreenController), the lock screen control class (KeyguardController), and the remote control class (RemoteController).
  • PcStatusBarWindowController inherits from the aforementioned StatusBarWindowController of the standard status bar, and implements similar functions for the status bar of the extended screen, which will not be repeated here.
  • the screen control class (ScreenController) is used to control the on/off of the screen on the projection side.
  • lock screen components such as the lock screen control class (KeyguardController) are modified to support lock screens on multiple Displays.
  • the remote control class (RemoteController) is used to communicate with the application framework layer (Framework).
  • the UI control classes used to realize the status bar of the extended screen include the Pc status bar window view (PcStatusBarWindowView), the Pc notification panel view (PcNotificationPanelView), the Pc quick setting fragment (PcQSFragment), the Pc status bar fragment (PcStatusBarFragment), the Pc status bar view (PcStatusBarView), etc. The newly added UI control classes are all inherited from the corresponding control classes of the aforementioned standard status bar, and realize similar functions for the status bar of the extended screen, which will not be repeated here.
  • the embodiment of the present application also makes corresponding modifications to the window management service (WMS), the status bar management service (StatusBarManagerService), the wallpaper management service (WallpaperManagerService), the notification management service (NotificationManagerService, NMS), and the PC module (HwPartsPowerOffice).
  • the window management service (WindowManagerService, WMS) includes a window management policy (WindowManagerPolicy) and a lock screen service agent (KeyguardServiceDelegate). WMS supports running multiple application windows for the same application at the same time.
  • StatusBarManagerService supports registration and management of multiple status bars (such as standard status bar and extended screen status bar).
  • StatusBarManagerService is the manager of StatusBarService.
  • StatusBarService is used to load, update and delete icons in the status bar, interact with applications in the status bar, and process notification information.
  • WallpaperManagerService supports displaying wallpapers on multiple Displays (for example, Display0 corresponding to the standard desktop and Display1 corresponding to the extended screen desktop).
  • NMS supports the UX style of PCs, and supports the management of notification centers with multiple pull-down status bars.
  • HwPartsPowerOffice is used to modify the screen projection entry.
  • the modification of the Pc management service (HwPCManagerService) is added in HwPartsPowerOffice to realize loading of the extended screen desktop and the extended screen status bar in the screen projection scenario.
  • the inheritance involved in this application means that a subclass inherits the characteristics and behaviors of its parent class, so that the subclass object (instance) has the instance fields and methods of the parent class, or the subclass inherits methods from the parent class so that the subclass has the same behavior as the parent class.
  • the system architecture diagram shown in FIG. 12 also includes an interface layer (Interface), and the Interface layer includes Android Interface Definition Language (Android Interface Definition Language, AIDL) and broadcast (Broadcast).
  • AIDL is a communication interface between different processes; Broadcast is used to send and receive broadcasts, which can realize the transmission of messages.
  • Figure 12 is only an illustration; this embodiment of the application is not limited to the SystemUI components and SystemUI subclasses shown in Figure 12, and may also include other SystemUI components and subclasses, which are not specifically limited here.
  • FIG. 13 shows another software system framework provided by the embodiment of the present application.
  • the desktop launcher apk includes UniHomelauncher and PcHomelauncher.
  • the desktop launcher apk specifically includes a common class (Common Class), a UI control class (UI Control), a service (Service), and an activity stack.
  • this application modifies the common classes (Common Class): the desktop launcher provider (LauncherProvider), the database helper (DatabaseHelper), the desktop launcher settings class (LauncherSettings), and the desktop launcher constants class (LauncherConstants).
  • LauncherProvider is the database of the desktop launcher. It is the database content provider for the application icons of the multiple desktop instances (such as the standard desktop and the extended screen desktop) running on the desktop launcher, and it realizes the access to and operation of the data in the desktop launcher by other applications.
  • DatabaseHelper is responsible for database creation and maintenance.
  • LauncherSettings is used to realize the string definitions of database items, and provides, through the internal class Favorites, Uris for operating the LauncherProvider and the field names of the corresponding fields in the database.
  • LauncherConstants maintains and manages constants in the application.
  • this application adds the Pc layout configuration (PclayoutConfig), the Pc device profile (PcDeviceProfile), the Pc grid (cell) counter (PcCellNumCalculator), the Pc desktop launcher policy (PcLauncherPolicy), the Pc desktop launcher model (PcLauncherModel), the Pc loading task (PcLoaderTask), etc.
  • PclayoutConfig is used to set the layout attributes (such as width and height) and parameters of the extended screen desktop, and the display effect of the components in the extended screen desktop can be constrained by specifying the layout attributes.
  • PcDeviceProfile is used to define the basic attributes of each module in the extended screen desktop, responsible for the initialization of attribute values, and setting the margin (padding) of each element layout, etc.
  • PcCellNumCalculator is a desktop icon layout strategy class.
  • PcLauncherPolicy manages the display policy of the extended screen desktop.
  • PcLauncherModel is a data processing class, which is used to save the desktop state of the extended screen desktop, provide APIs for reading and writing databases, and update the databases when deleting, replacing, and adding applications.
  • PcLoaderTask is used to load the extended screen desktop.
  • this application also adds, among the UI control classes, the Pc drag layer (PcDraglayer), the Pc desktop workspace (PcWorkSpace), the Pc cell layout (PcCelllayout), the Pc program dock view (PcDockview), the Pc folder (PcFolder), the Pc folder icon (PcFolderIcon), etc.
  • the newly added control classes are all inherited from the corresponding control classes of the standard desktop, and are used to realize similar functions on the extended screen desktop.
  • PcDraglayer is a view group (ViewGroup) responsible for distributing events, which is used to initially process the events of the extended screen desktop and distribute them according to the situation.
  • DragLayer contains desktop layout (Workspace), navigation point (QuickNavigationView), Dock area (Hotseat), recent task list (OverviewContainer). Hotseat is the container responsible for managing the Dock.
  • PcWorkSpace is a subclass of PagedView and consists of multiple CellLayouts, each of which represents one screen; PcWorkSpace is used to realize the screen-sliding function of the extended screen desktop.
  • PcCellLayout is used to manage the display and layout of icons on each screen of the extended screen desktop.
  • PcDockview is used to implement the Dock layout on the projection side.
  • PcFolder is used to realize the folder of the extended screen desktop (including the folder created by the user and the folder that comes with the system).
  • the activity stack in the embodiment of the present application supports management of the task stack corresponding to the extended screen desktop, and the Pc desktop service (PcHomeService) is added in the service.
  • PcHomeService is used to start the extended screen desktop.
  • the embodiment of the present application also makes corresponding modifications to the activity management service (ActivityManagerService, AMS) in the application framework layer, so that the AMS supports running multiple application windows of the same application at the same time, for example, supporting the simultaneous running of the above two desktop instances.
  • a Dock bar is added to the desktop of the extended screen, and correspondingly, an IdockBar.aidl is added to the desktop launcher to provide a Binder interface for the framework layer to operate the Dock bar.
  • the activity stack includes a task stack corresponding to the default screen and a task stack corresponding to the extended screen
  • the mobile phone 100 can independently maintain the task stack corresponding to the default screen and the task stack corresponding to the extended screen.
  • the mobile phone 100 distinguishes the two task stacks through DisplayId.
  • the DisplayId of the task stack corresponding to the default screen is the ID of Display0
  • the DisplayId of the task stack corresponding to the extended screen is the ID of Display1.
  • the task stack corresponding to Display1 runs a desktop task stack (HomeStack) and N application stacks (AppStack), namely Stack1 to StackN; the HomeStack includes one or more desktop tasks (Task), and each AppStack includes one or more application tasks. The task stack corresponding to Display0 runs a HomeStack and an AppStack, and the AppStack includes one or more application tasks (such as Task-A and Task-B). A Task includes one or more Activities.
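The per-display stack hierarchy described above can be modeled as follows. This is an illustrative model, not framework code: `ActivityStackModel` and its nested classes are invented names; it only shows the containment relationship (Display → stacks → Tasks → Activities) and the independent maintenance of the two task stacks keyed by DisplayId.

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Illustrative model: each Display owns a HomeStack plus application stacks,
// each stack holds Tasks, and each Task holds one or more Activities.
public class ActivityStackModel {

    public static class Task {
        public final List<String> activities = new ArrayList<>();
        public Task(String... names) { for (String n : names) activities.add(n); }
    }

    public static class Stack {
        public final String name;                 // e.g. "HomeStack", "Stack1"
        public final List<Task> tasks = new ArrayList<>();
        public Stack(String name) { this.name = name; }
    }

    // DisplayId -> stacks owned by that display; maintained independently.
    private final Map<Integer, List<Stack>> byDisplay = new HashMap<>();

    public Stack addStack(int displayId, String name) {
        Stack s = new Stack(name);
        byDisplay.computeIfAbsent(displayId, k -> new ArrayList<>()).add(s);
        return s;
    }

    public List<Stack> stacksOf(int displayId) {
        return byDisplay.getOrDefault(displayId, new ArrayList<>());
    }

    public static void main(String[] args) {
        ActivityStackModel m = new ActivityStackModel();
        // Display0: default screen (standard desktop).
        m.addStack(0, "HomeStack").tasks.add(new Task("UniHomeLauncher"));
        Stack app0 = m.addStack(0, "AppStack");
        app0.tasks.add(new Task("Task-A"));
        app0.tasks.add(new Task("Task-B"));
        // Display1: extended screen (Pc desktop).
        m.addStack(1, "HomeStack").tasks.add(new Task("PcHomeLauncher"));
        m.addStack(1, "Stack1").tasks.add(new Task("Task-1"));
        System.out.println(m.stacksOf(0).size() + "/" + m.stacksOf(1).size());
    }
}
```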
  • PcHomelauncher runs the HomeStack of the extended screen of the mobile phone 100 through Display1
  • UniHomeLauncher runs the HomeStack of the default screen of the mobile phone 100 through Display0.
  • after the WMS obtains an input event through the pointer event listener (TapPointerListener), it can distribute input events acting on the extended screen to Display1, and input events acting on the default screen to Display0.
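The routing rule above can be sketched as follows. This is a hedged sketch under simplifying assumptions: `InputDispatchSketch` is an invented stand-in for the window manager's dispatch path; each event simply carries the id of the display it acts on, and the dispatcher delivers it to that display's queue.

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Illustrative sketch: input events are routed to the display they act on
// (0 = default screen, 1 = extended screen).
public class InputDispatchSketch {

    public static class InputEvent {
        public final int displayId;
        public final String action;
        public InputEvent(int displayId, String action) {
            this.displayId = displayId;
            this.action = action;
        }
    }

    // Per-display event queues, as maintained by the (simulated) dispatcher.
    private final Map<Integer, List<String>> queues = new HashMap<>();

    /** Deliver the event to the queue of the display it acts on. */
    public void dispatch(InputEvent e) {
        queues.computeIfAbsent(e.displayId, k -> new ArrayList<>()).add(e.action);
    }

    /** Events delivered to one display, in order. */
    public List<String> deliveredTo(int displayId) {
        return queues.getOrDefault(displayId, new ArrayList<>());
    }
}
```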
  • the embodiment of the present application modifies the software system framework so that it supports running multiple desktop instances and multiple status bar instances on different Displays.
  • the screen projection method provided in the embodiment of the present application includes:
  • after the mobile phone 100 is connected to the display of the target screen projection device (for example, the computer 200), it receives a screen projection instruction for confirming the projection.
  • the mobile phone 100 adds an extended screen status bar and its related logic control classes through the SystemUI, and the DisplayId corresponding to the extended screen status bar is the ID of the display area Display1 of the extended screen.
  • in response to the above screen projection instruction, the mobile phone 100 also adds an extended screen desktop and its related logic control classes through the Launcher, and the DisplayId corresponding to the extended screen desktop is the ID of the display area Display1 of the extended screen.
  • in response to the above screen projection instruction, the mobile phone 100 starts the ProductiveService service through HwPCManagerService to load the extended screen status bar, and starts the PcHomeService service through HwPCManagerService to load the extended screen desktop.
  • FIG. 15A shows a software implementation process of the SystemUI involved in the embodiment of the present application, and the implementation process includes:
  • when the mobile phone 100 is turned on, in the non-screen projection mode, the mobile phone 100 starts a Zygote process, creates a virtual machine instance through the Zygote process, and executes the system service (SystemServer).
  • SystemServer starts a series of services required for system operation, including SystemUIService.
  • SystemUIService starts SystemUI and calls SystemUIApplication.
  • SystemServer starts the boot services, core services and other services respectively, and starts SystemUI and Launcher by calling the mActivityManagerService.systemReady() method in the startOtherServices method.
  • SystemUIApplication starts the SystemUI component through the startServicesIfNeeded function, and the SystemUI component includes Systembars.
  • the configuration item config_systemUIServiceComponents is read, and each component (including SystemBars) is loaded.
  • when the SystemBars component is loaded, it reads the configuration item (config_statusBarComponent), determines the control class of the status bar according to the configuration item, and then chooses to start the extended screen status bar (PcStatusBar) or the standard status bar (StatusBar) according to the determined control class.
  • for example, when the value of config_statusBarComponent is com.android.systemui.statusbar.phone.PhoneStatusBar, SystemBars determines that the control class of the status bar is StatusBar; when the value is com.android.systemui.statusbar.tablet.TabletStatusBar, SystemBars determines that the control class of the status bar is PcStatusBar.
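The configuration-driven choice above can be sketched as follows. This is an illustrative sketch only: the class `StatusBarSelector` and its registry map are invented, and the map stands in for the real Android resource/reflection machinery; only the config values and the class names StatusBar/PcStatusBar follow the text.

```java
import java.util.HashMap;
import java.util.Map;
import java.util.function.Supplier;

// Illustrative sketch: a config string (standing in for
// config_statusBarComponent) decides which status bar control class is started.
public class StatusBarSelector {

    public interface BaseStatusBar { String name(); }

    public static class PhoneStatusBar implements BaseStatusBar {
        public String name() { return "StatusBar"; }    // standard status bar
    }

    public static class TabletStatusBar implements BaseStatusBar {
        public String name() { return "PcStatusBar"; }  // extended screen status bar
    }

    private static final Map<String, Supplier<BaseStatusBar>> REGISTRY = new HashMap<>();
    static {
        REGISTRY.put("com.android.systemui.statusbar.phone.PhoneStatusBar",
                PhoneStatusBar::new);
        REGISTRY.put("com.android.systemui.statusbar.tablet.TabletStatusBar",
                TabletStatusBar::new);
    }

    /** Mimics SystemBars reading the config item and starting the matching bar. */
    public static BaseStatusBar create(String configValue) {
        Supplier<BaseStatusBar> s = REGISTRY.get(configValue);
        if (s == null) {
            throw new IllegalArgumentException("unknown status bar component: " + configValue);
        }
        return s.get();
    }
}
```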
  • the StatusBarWindowController calls the interface of the WindowManager, adds the standard status bar to the WMS, and then adds the standard status bar to Display0.
  • the CommandQueue object will be passed to the StatusBarManagerService and saved as mBar.
  • after the client obtains the interface of the StatusBarManagerService system service through the ServiceManager, it can call the methods of the CommandQueue object through mBar, and CommandQueue calls back messages to the StatusBar through the callback interface, so that the StatusBar can be updated.
  • the standard status bar calls the status bar prompt manager (StatusBarPromptManager) to register the screen projection broadcast (Broadcast).
  • when the mobile phone 100 accesses the display (DisplayDevice) of the target screen projection device (that is, the aforementioned extended screen), it calls the DMS.
  • the mobile phone 100 can access the target screen projection device through a wired connection, or access the target screen projection device through wireless communication technologies such as NFC, WIFI, or Bluetooth.
  • the DMS triggers a display event (OnDisplayevent), creates a logical display area Display1 corresponding to the extended screen, and calls a display management object (DisplayManagerGlobal) to obtain the DisplayId of Display1.
  • the DMS calls the handleDisplayDeviceAddedLocked function to generate a corresponding logical device (LogicalDevice) for the display (DisplayDevice), adds the LogicalDevice to the logical device list (mLogicalDevices) managed by the DMS, adds the DisplayDevice to the display list (mDisplayDevices) managed by the DMS, and generates a logical display area (LogicalDisplay) (that is, the aforementioned extended screen display area Display1) for the DisplayDevice.
  • DMS calls DisplayManagerGlobal to determine the DisplayId of the logical display area.
  • DisplayManagerGlobal sends the above-mentioned display event to a display listener delegate (DisplaylistenerDelegate), and adds a display listener delegate to Display1.
  • DisplayManagerGlobal is mainly responsible for managing the communication between Display Manager and DMS.
  • DisplaylistenerDelegate sends a notification of adding a display area (onDisplayAdded).
  • RootActivityContainer starts HwPCManagerService to load the status bar of the extended screen.
  • HwPartsPowerOffice adds a modification to HwPCManagerService, which is used to start the extended screen desktop and the extended screen status bar in the screen projection scenario.
  • HwPCManagerService sends the screen projection capsule prompt broadcast to the status bar prompt manager (StatusBarPromptManager).
  • HwPCManagerService sends a screen projection notification message to NotificationManagerService.
  • the HwPCManagerService receives an instruction to switch modes, and the instruction is used to instruct to switch from the current non-screen projection mode to the screen projection mode.
  • after receiving the capsule prompt broadcast and the screen projection notification message, the SystemUI displays the capsule prompt in the status bar, and displays the screen projection notification message in the notification center.
  • the capsule prompt is the prompt box 13 shown in FIG. 3E .
  • for example, when the mobile phone 100 receives the NFC connection response sent by the computer 200, the HwPCManagerService acquires an instruction to switch modes.
  • HwPCManagerService calls bindService to start ProductiveService.
  • ProductiveService calls PcSystembars to enable the extended screen status bar.
  • Systembars creates an extended screen status bar based on the configuration file.
  • Systembars reads the configuration config_PcstatusBarComponent, determines that the control class of the status bar is PcStatusBar, and then creates an extended screen status bar.
  • the status bar of the extended screen calls the callback interface of CommandQueue to add a callback to the status bar of the extended screen.
  • the StatusBarWindowController calls the interface of the WindowManager to add the extended screen status bar to the WMS, and then adds the extended screen status bar to Display1.
  • FIG. 15B exemplarily shows the software implementation of the StatusBar.
  • in the non-screen projection mode, the CommandQueue only needs to support one status bar instance, that is, either the standard status bar or the extended screen status bar. It can be understood that the mobile phone only needs to support the standard status bar in the non-screen projection mode, and the computer only needs to support the extended screen status bar in the non-screen projection mode. In the screen projection mode, the CommandQueue needs to support multiple status bar instances and save the callbacks of each of these status bar instances. Exemplarily, when the mobile phone 100 projects its screen to the computer 200, the CommandQueue needs to support both the standard status bar corresponding to the default screen display area (Display0) of the mobile phone 100 and the extended screen status bar corresponding to the extended screen display area (Display1) of the mobile phone 100.
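The multi-instance behaviour described above can be sketched in plain Java. This is an illustrative model only, assuming a simplified callback interface and made-up method names; it is not the actual Android CommandQueue API:

```java
import java.util.ArrayList;
import java.util.List;

// Simplified model of a CommandQueue that keeps one callback per status bar
// instance and routes events by display id. All names are illustrative.
class CommandQueue {
    interface Callbacks {
        void onSystemBarChanged(int displayId);
    }

    private final List<Callbacks> callbacks = new ArrayList<>();

    void addCallback(Callbacks cb) {
        callbacks.add(cb);
    }

    // Broadcast an event; each registered status bar decides whether the
    // event's display id concerns it.
    void dispatch(int displayId) {
        for (Callbacks cb : callbacks) {
            cb.onSystemBarChanged(displayId);
        }
    }
}

// A status bar instance bound to one display area (Display0 or Display1).
class StatusBarInstance implements CommandQueue.Callbacks {
    final int boundDisplayId;
    int eventsHandled = 0;

    StatusBarInstance(int boundDisplayId) {
        this.boundDisplayId = boundDisplayId;
    }

    @Override
    public void onSystemBarChanged(int displayId) {
        if (displayId == boundDisplayId) {
            eventsHandled++;   // only react to events for our own display
        }
    }
}
```

In this model, registering the standard status bar for Display0 and the extended screen status bar for Display1 keeps their events isolated while both share the same queue.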
  • Display0: the default screen display area
  • Display1: the extended screen display area
  • in screen projection mode, the Context of each status bar needs to be adjusted so that each status bar obtains its corresponding Display information (such as the DisplayId); the StatusBarWindowController also needs to support adding multiple status bars; and when the WindowManagerService adds a window through addWindow, the DisplayId of the window's WindowState needs to be modified so that the window can be correctly displayed on the corresponding display screen.
  • the standard status bar can be displayed on the default screen of the mobile phone 100
  • the extended screen status bar can be displayed on the extended screen of the mobile phone 100 (ie, the display screen of the computer 200).
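The addWindow fix-up mentioned above (modifying the DisplayId of the window's WindowState) can be modelled minimally as follows; the class and method names are stand-ins for the WMS types, not the real framework signatures:

```java
import java.util.HashMap;
import java.util.Map;

// Stand-in for the per-window state kept by the window manager.
class WindowState {
    final String tag;
    int displayId;          // which display the window is shown on

    WindowState(String tag, int displayId) {
        this.tag = tag;
        this.displayId = displayId;
    }
}

// Minimal window-manager model: addWindow records the window under the
// display it should appear on, setting the WindowState's displayId so the
// window lands on the right screen.
class WindowManagerModel {
    final Map<String, WindowState> windows = new HashMap<>();

    WindowState addWindow(String tag, int targetDisplayId) {
        WindowState w = new WindowState(tag, targetDisplayId);
        w.displayId = targetDisplayId;   // ensure the correct display
        windows.put(tag, w);
        return w;
    }
}
```

With this, a standard status bar window added for Display0 and an extended screen status bar window added for Display1 each carry the display id they must be shown on.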
  • the default status bar, i.e., the standard status bar of the mobile phone 100
  • the configuration item config_statusBarComponent
  • SystemBars calls the start() function of the standard status bar to initialize it, sets the Callback through the CommandQueue component, then calls the Binder interface, and registers the IStatusBar object corresponding to the standard status bar with the StatusBarManagerService for management through the registerStatusBar function.
  • when the mobile phone 100 is connected to the display (that is, the computer 200), the DMS receives the OnDisplayDeviceEvent through the input channel, and distributes the display event (DisplayEvent) based on the callback record (CallbackRecord). After the mobile phone 100 switches from the non-screen projection mode to the screen projection mode, the DMS creates the logical display area Display1 corresponding to the extended screen and calls DisplayManagerGlobal to obtain the DisplayId of Display1; DisplayManagerGlobal sends the above display event to the DisplayListenerDelegate to add a display listener delegate to Display1.
  • the SystemUI component reads the configuration item config_PcsystemUIServiceComponents of the configuration file Config.xml, and after SystemBars is started, the status bar of the extended screen is started according to the configuration item config_PcstatusBarComponent.
  • SystemBars calls the start() function of the extended screen status bar to initialize it, sets the Callback through the CommandQueue component, and then calls the Binder interface to register the IStatusBar object corresponding to the extended screen status bar with the StatusBarManagerService for management through the registerStatusBar function.
  • the extended screen status bar is created and a StatusBarWindowView is added to the StatusBarWindowController; the StatusBarWindowController calls the interface of the WindowManager to add the extended screen status bar to the WMS.
  • the WMS calls the ViewRootImpl and IWindowSession of the window management global object (WindowManagerGlobal), and modifies the DisplayId of the WindowState of the window corresponding to the extended screen status bar to the ID of Display1.
  • the StatusBarManagerService includes an ArrayMap list and a SparseArray list.
  • the ArrayMap list maintains the Binder objects of the multiple IStatusBar instances registered with the StatusBarManagerService.
  • the SparseArray list maintains the correspondence between multiple DisplayUiState objects and multiple status bars.
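The two lists described above can be sketched with plain java.util maps standing in for Android's ArrayMap and SparseArray; IStatusBar is reduced to a marker interface, and all names here are illustrative assumptions rather than the real service's fields:

```java
import java.util.HashMap;
import java.util.Map;

class StatusBarRegistry {
    interface IStatusBar { }                 // stand-in for the registered Binder object

    static class DisplayUiState {            // per-display UI state
        boolean barVisible = true;
    }

    // ArrayMap analogue: IStatusBar binders registered with the service,
    // keyed here by display id for simplicity.
    final Map<Integer, IStatusBar> registeredBars = new HashMap<>();

    // SparseArray analogue: display id -> UI state of the status bar on it.
    final Map<Integer, DisplayUiState> uiStates = new HashMap<>();

    void registerStatusBar(int displayId, IStatusBar bar) {
        registeredBars.put(displayId, bar);
        uiStates.computeIfAbsent(displayId, id -> new DisplayUiState());
    }
}
```

Registering both the standard status bar (Display0) and the extended screen status bar (Display1) then yields one Binder entry and one UI state per display.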
  • FIG. 15C shows a software implementation process of the Launcher involved in the embodiment of the present application.
  • the following is a detailed introduction to the software implementation process as shown in the figure.
  • SystemServer starts a series of services required for system operation, including Launcher.
  • the SystemServer process starts the PMS and the AMS during the startup process. After the PackageManagerService starts, it parses and installs the application APKs in the system. The AMS is mainly used for the startup and management of the four major components. The entry for starting the Launcher is the systemReady method of the AMS.
  • AMS starts the Launcher through the StartHomeOnAllDisplays method.
  • the activity task manager (ActivityTaskManagerInternal) calls StartHomeOnDisplays to start the Launcher.
  • RootActivityContainer determines whether the DisplayId is the ID of the default screen display area (i.e., Display0). If yes, it resolves the standard desktop Activity (resolveHomeActivity) and executes step (34); if not, it resolves the extended screen desktop Activity (resolveSecondaryHomeActivity) and executes step (37).
  • if it is the ID of Display0, the PMS queries the Activity of CATEGORY_HOME as the desktop; if it is the ID of Display1, the PMS queries the Activity of SECONDARY_HOME as the desktop.
  • the activity start controller (ActivityStartController) calls the activity starter (ActivityStarter).
  • ActivityStarter calls startHomeActivityLocked to start the standard desktop.
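The branch in the steps above (CATEGORY_HOME for Display0, SECONDARY_HOME for Display1) can be sketched as a one-function dispatch; the class and method names are made up for illustration, but the category strings match the Android intent categories referenced in the text:

```java
class HomeActivityResolver {
    static final int DEFAULT_DISPLAY_ID = 0;   // Display0

    // Pick which home Activity category to query from the PMS,
    // based on the display the Launcher is being started on.
    static String resolveHomeCategory(int displayId) {
        if (displayId == DEFAULT_DISPLAY_ID) {
            return "android.intent.category.HOME";          // standard desktop
        }
        return "android.intent.category.SECONDARY_HOME";    // extended screen desktop
    }
}
```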
  • for step (12) to step (16), reference may be made to the relevant description of FIG. 15A.
  • DisplayManagerService sends the onDisplayAdded notification when the display area Display is added, and after receiving the notification, HwPCManagerService in the PC module (hwPartsPowerOffice) starts the Launcher of the extended screen (i.e., PcHomeLauncher) through startHomeOnDisplay.
  • HwPCManagerService calls Pc desktop service (HwHomeService) through BindService.
  • HwHomeService calls AMS through StartHomeOnProductiveDisplay.
  • AMS calls the StartHomeOnAllDisplays method to start PcHomeLauncher.
  • the activity task manager calls StartHomeOnDisplays to start PcHomeLauncher.
  • step (33) is executed to resolve the extended screen desktop Activity (resolveSecondaryHomeActivity), and then step (40) is executed.
  • the activity start controller (ActivityStartController) calls the activity starter (ActivityStarter).
  • an embodiment of the present application provides a screen projection method, which includes but is not limited to steps S201 to S205.
  • the first electronic device invokes the first module of the first application to run the first desktop, and the first desktop is associated with the first display area; the first electronic device displays the first display content based on the first display area, and the first display content includes the first desktop.
  • the first electronic device invokes the second module of the first application to run the second desktop, and the second desktop is associated with the second display area; the first electronic device sends to the second electronic device the second display content corresponding to the second display area, and the second display content includes the second desktop.
  • the first electronic device may be the aforementioned electronic device 100, such as the mobile phone 100; the second electronic device may be the aforementioned electronic device 200, such as the computer 200.
  • the first application may be the aforementioned desktop launcher, such as HWLauncher6; the first module may be a standard desktop launcher, such as UniHomelauncher; the second module may be an extended screen desktop launcher, such as PcHomelauncher.
  • the first desktop may be the aforementioned standard desktop, and the second desktop may be the aforementioned extended screen desktop.
  • the first display area may be the aforementioned default screen display area, namely Display0, and the second display area may be the aforementioned extended screen display area, namely Display1.
  • the first display content may be a user interface displayed on the mobile phone 100 shown in FIG. 3A
  • the first desktop may be the desktop shown in FIG. 3A
  • the second display content may be the user interface displayed by the computer 200 shown in FIG. 3G based on the projection data sent by the mobile phone 100, and the second desktop may be the desktop shown in FIG. 3G.
  • the first electronic device displays the third display content based on the task stack running in the first display area.
  • the first display content may be the user interface 11 displayed on the mobile phone 100 shown in FIG.
  • the third display content may be the control center interface 12 shown in FIG. 3B.
  • the first electronic device determines that the display content corresponding to the second display area is the fourth display content based on the task stack running in the second display area .
  • the first electronic device sends the fourth display content to the second electronic device.
  • the second display content may be the user interface 16 displayed by the computer 200 shown in FIG.; the third user operation may be an input operation acting on an interface element of the user interface 16 (for example, an icon in the Dock column 403).
  • the above-mentioned interface element is the notification center icon 401A in the status bar 401
  • the fourth display content may include the user interface 16 and the notification center window 17 shown in FIG. 4A .
  • the third user operation is received by the first electronic device through the second electronic device.
  • the second electronic device determines the first input event, where the first input event is used to indicate the third input operation.
  • the first input event includes the coordinates of the third input operation on the display screen of the second electronic device, the operation type of the third input operation (such as touch operation, mouse click operation, etc.), and the like.
  • the second electronic device sends the first input event to the first electronic device; based on the first input event and the task stack run in the second display area, the first electronic device determines the third input operation indicated by the first input event and executes the response event corresponding to the third input operation. After the response event is executed, the display content corresponding to the second display area is updated to the fourth display content.
  • the first electronic device sends the updated fourth display content to the second electronic device, and the second electronic device displays the fourth display content.
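The event path above can be modelled compactly: the second electronic device packages coordinates and an operation type into an input event, and the first electronic device routes it to a handler standing in for the task stack of the corresponding display area. All names in this sketch are assumptions for illustration:

```java
import java.util.HashMap;
import java.util.Map;
import java.util.function.Consumer;

// Event forwarded from the projection receiver to the projection sender.
class RemoteInputEvent {
    final int displayId;      // display area the operation belongs to
    final float x, y;         // coordinates on the receiver's screen
    final String type;        // e.g. "touch", "mouse_click"

    RemoteInputEvent(int displayId, float x, float y, String type) {
        this.displayId = displayId;
        this.x = x;
        this.y = y;
        this.type = type;
    }
}

// On the sender, each display area's task stack is represented by a handler.
class InputRouter {
    private final Map<Integer, Consumer<RemoteInputEvent>> stacks = new HashMap<>();

    void bindStack(int displayId, Consumer<RemoteInputEvent> handler) {
        stacks.put(displayId, handler);
    }

    // Execute the response event on the task stack of the event's display area.
    void dispatch(RemoteInputEvent e) {
        stacks.get(e.displayId).accept(e);
    }
}
```

Routing by display id is what keeps the events of the projected desktop (Display1) isolated from those of the phone's own desktop (Display0).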
  • the first electronic device supports running multiple desktop instances in different display areas through the same application, for example, running the first desktop in the first display area through the first module of the first application, and running the second desktop in the second display area through the second module of the first application.
  • the first electronic device determines the display content of the main screen of the device based on the task stack running in the first display area, and determines the display content projected to the second electronic device based on the task stack running in the second display area. In this way, based on the two different display areas, the first electronic device and the second electronic device can display different desktops and other different contents.
  • in response to the second user operation acting on the first display content, the first electronic device displays the third display content based on the task stack running in the first display area, which includes: in response to the second user operation acting on the first desktop in the first display content, the first electronic device displays the third display content based on the task stack of the first application running in the first display area.
  • in response to the third user operation acting on the second display content displayed by the second electronic device, the first electronic device determines, based on the task stack run by the second display area, that the display content corresponding to the second display area is the fourth display content, which includes: in response to the third user operation acting on the second desktop displayed by the second electronic device, the first electronic device determines, based on the task stack of the first application running in the second display area, that the display content corresponding to the second display area is the fourth display content.
  • the first display content may include an application program icon (such as a gallery icon) on the desktop (i.e., the first desktop) shown in FIG.
  • the mobile phone 100 determines a response event corresponding to the second user operation based on the task stack of the desktop launcher running on Display0 (for example, the aforementioned desktop task stack Homestack).
  • the second display content may be the desktop displayed on the computer 200 shown in FIG.
  • the mobile phone 100 determines and executes the response event corresponding to the third user operation based on the task stack of the desktop launcher run by Display1 (such as the aforementioned desktop task stack Homestack), then updates the fourth display content corresponding to the second display area (that is, the user interface 24 shown in FIG. 6A), and sends it to the computer 200.
  • the computer 200 displays the user interface 24 shown in FIG. 6A.
  • the first electronic device may execute a response event corresponding to the second user operation based on the task stack of the first application running in the display area associated with the first desktop;
  • the first electronic device may execute a response event corresponding to the third user operation based on the task stack of the first application running on the display area associated with the second desktop.
  • data isolation of events (input events and/or response events) of different desktops can be guaranteed.
  • since the two desktop instances are both run by modules of the first application, the two desktops can share specified data, and the second desktop can inherit some or all of the functions of the first desktop.
  • before the above-mentioned first electronic device displays the first display content based on the first display area, the method further includes: the first electronic device invokes the third module of the second application to run the first status bar, the first status bar is associated with the first display area, and the first display content includes the first status bar; the method further includes: in response to the first user operation, the first electronic device invokes the fourth module of the second application to run the second status bar, the second status bar is associated with the second display area, and the second display content includes the second status bar.
  • the first user operation is an input operation for the user to determine that the first electronic device projects a screen to the second electronic device.
  • in response to the screen projection instruction, the first electronic device calls the second module of the first application to run the second desktop, and calls the fourth module of the second application to run the second status bar; the second desktop and the second status bar are both associated with the second display area.
  • the above-mentioned screen projection instruction may refer to the screen projection instruction or the instruction of switching modes in the foregoing embodiments.
  • the second application can be the aforementioned system interface, such as SystemUI;
  • the third module can be a standard status bar, such as StatusBar;
  • the fourth module can be an extended screen status bar, such as PcStatusBar.
  • the first status bar may be the aforementioned standard status bar, and the second status bar may be the aforementioned extended screen status bar.
  • the first status bar may be the status bar 101 shown in FIG. 3A
  • the second status bar may be the status bar 401 shown in FIG. 3G .
  • the first electronic device supports running multiple status bar instances in different display areas through the same application, for example, running the first status bar in the first display area through the third module of the second application, and running the second status bar in the second display area through the fourth module of the second application.
  • in this way, the first electronic device and the second electronic device can display different status bars, ensuring data isolation of the events (input events and/or response events) of the two status bars.
  • since both status bars are run by modules of the second application, the two status bars can share specified data (such as notification messages), and the second status bar can inherit some or all of the functional characteristics of the first status bar.
  • before the above-mentioned first electronic device displays the first display content based on the first display area, the method further includes: the first electronic device invokes the fifth module of the third application to run the first display object of the first variable;
  • the first variable is associated with the first display area, and the first display content includes the first display object;
  • the first variable is associated with the second display area, and the second display content includes the first display object.
  • the first electronic device supports displaying objects corresponding to the same variable in multiple different display areas at the same time.
  • the third application and the second application may be the same application or different applications, which are not specifically limited here.
  • the method further includes: in response to the fourth user operation acting on the first display content, calling the fifth module of the third application to modify the display object of the first variable to the second display object;
  • the first electronic device updates the display content corresponding to the first display area to the fifth display content, and the fifth display content includes the second display object;
  • the first electronic device updates the display content corresponding to the second display area to the sixth display content, and sends the sixth display content to the second electronic device, where the sixth display content includes the second display object.
  • the first electronic device supports simultaneously displaying objects corresponding to the same variable in multiple different display areas. After the user changes the display object of the first variable in the first display area, the display object of the first variable in the second display area also changes accordingly.
  • the first variable is used to indicate the display object of the wallpaper
  • the display object of the wallpaper is a static picture and/or a dynamic picture
  • the wallpaper includes a lock screen wallpaper when the screen is locked and/or a desktop wallpaper when the screen is not locked.
  • the third application may be a system interface (SystemUI) or a wallpaper application, for example, the fifth module may be a wallpaper management service, such as WallpaperManagerService.
  • the first display object of the wallpaper is displayed in the first display area as the desktop wallpaper 107 shown in FIG. 3A
  • the first display object of the wallpaper is displayed in the second display area as the desktop wallpaper 404 shown in FIG. 3G .
  • Desktop wallpaper 107 and desktop wallpaper 404 are derived from the same picture.
  • the first display object of the wallpaper is displayed in the first display area as the lock screen wallpaper 902 in the lock screen interface shown in FIG. 10B, and the first display object of the wallpaper is displayed in the second display area as the lock screen wallpaper 902 shown in FIG. 10A.
  • after the user changes the wallpaper displayed on the first electronic device, the wallpaper projected on the second electronic device also changes accordingly.
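The shared-variable behaviour described above can be sketched as a small observer: one wallpaper variable is attached to both display areas, so changing it refreshes both the local and the projected display. The class and method names here are hypothetical:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Consumer;

// One variable whose display object is shown in several display areas at once.
class SharedWallpaper {
    private String displayObject;                         // e.g. a picture path
    private final List<Consumer<String>> displayAreas = new ArrayList<>();

    SharedWallpaper(String initial) {
        displayObject = initial;
    }

    // Attach a display area; it immediately shows the current object.
    void attach(Consumer<String> area) {
        displayAreas.add(area);
        area.accept(displayObject);
    }

    // Changing the object updates every attached display area.
    void set(String newObject) {
        displayObject = newObject;
        for (Consumer<String> area : displayAreas) {
            area.accept(newObject);
        }
    }
}
```

Attaching Display0 and Display1 to the same variable is the essence of why desktop wallpaper 107 and desktop wallpaper 404 always derive from the same picture.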
  • the first electronic device presets a variety of themes, and a theme is used to indicate the desktop layout style, icon display style and/or interface color, etc.; the first variable is used to indicate the display object of the theme, and the display object of the theme is the display content corresponding to one of the preset themes.
  • after the user changes the theme displayed on the first electronic device, the theme projected on the second electronic device also changes accordingly.
  • the first module of the first application includes a first common class for creating and running the first desktop, a first user interface (UI) control class, and the desktop task stack of the first desktop; the second module of the first application includes a second common class for creating and running the second desktop, a second UI control class, and the desktop task stack of the second desktop. Some or all of the classes in the second common class inherit from the first common class, and some or all of the second UI control classes inherit from the first UI control class.
  • the first electronic device adds a second common class for creating and running the second desktop, a second UI control class, and the desktop task stack of the second desktop, and some or all of the newly added common classes and UI control classes inherit from the first common class and the first UI control class corresponding to the original first desktop. Therefore, the second desktop can inherit part of the functional characteristics of the first desktop, and the two desktops can implement sharing of specified data.
  • the second common class includes one or more of the following: desktop startup provider, database assistant, desktop startup setting class, desktop startup constant class, Pc layout configuration, Pc device file, Pc grid counter, Pc Desktop launcher strategy, Pc desktop startup mode, Pc loading tasks, etc.;
  • the second UI control class includes one or more of the following: Pc drag layer, Pc desktop workspace, Pc unit layout, Pc program dock view, Pc folder, Pc folder icon, etc.
  • the second common class may be the common class shown in FIG. 13
  • the second UI control class may be the UI control class shown in FIG. 13
  • the desktop task stack of the first desktop may be the task stack of the standard desktop shown in FIG. 13
  • the desktop task stack of the second desktop may be the task stack of the extended screen desktop shown in FIG. 13.
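The inheritance relationship described above can be illustrated with a toy pair of classes: the extended screen desktop class extends the standard desktop class, inheriting shared behaviour and overriding only what differs on the larger screen. The class names and grid sizes below are made up for illustration and do not correspond to the actual launcher classes:

```java
// Standard desktop workspace (first UI control class).
class Workspace {
    int columns() {
        return 4;                  // phone-sized grid
    }

    String wallpaper() {
        return "shared-wallpaper"; // behaviour shared by both desktops
    }
}

// Extended screen workspace (second UI control class) inherits from it.
class PcWorkspace extends Workspace {
    @Override
    int columns() {
        return 8;                  // wider grid on the extended screen
    }
    // wallpaper() is inherited unchanged: specified data stays shared
}
```

Because PcWorkspace extends Workspace, any behaviour not overridden (here, wallpaper()) is automatically shared, which mirrors how the second desktop inherits part of the first desktop's functional characteristics.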
  • the third module of the second application includes the first component for creating and running the first status bar, the first dependent control class and the third UI control class;
  • the fourth module of the second application includes a second component for creating and running the second status bar, a second dependent control class, and a fourth UI control class; some or all of the components in the second component inherit from the first component, some or all of the classes in the second dependent control class inherit from the first dependent control class, and some or all of the fourth UI control classes inherit from the third UI control class.
  • the first electronic device adds a second component for creating and running the second status bar, a second dependent control class and a fourth UI control class, and some or all of the newly added components, dependent control classes and UI control classes inherit from the first component, the first dependent control class, and the third UI control class corresponding to the original first status bar. Therefore, the second status bar can inherit part of the functional characteristics of the first status bar, and the two status bars can realize sharing of specified data.
  • the second component includes one or more of the following: Pc dependent class, Pc system provider, Pc system bar, second status bar;
  • the second dependent control class includes one or more of the following: Pc status bar window control class, screen control class, lock screen control class, remote control class;
  • the fourth UI control class includes one or more of the following: Pc status bar window view, Pc notification panel view, Pc quick setting fragments, Pc status bar fragments, Pc status bar view.
  • the first component, the first dependent control class and the third UI control class may be respectively the component, the dependent control class and the UI control class shown in FIG. 12 for realizing the standard status bar.
  • the second component, the second dependent control class, and the fourth UI control class may be the components, dependent control class, and UI control class shown in FIG. 12 for realizing the status bar of the extended screen, respectively.
  • the ID of the display area associated with the second module is the ID of the second display area. In response to the first user operation, the first electronic device invoking the second module of the first application to run the second desktop, where the second desktop is associated with the second display area, includes: in response to the first user operation, the Pc management service receives an instruction to switch modes, and the instruction is used to instruct switching from the current non-screen projection mode to the screen projection mode; in response to the instruction, the Pc management service calls the Pc desktop service, the Pc desktop service calls the activity management service, and the activity management service calls the activity task manager to start the second module of the first application; the root activity container is called to determine the ID of the display area associated with the second module; when the ID of the display area associated with the second module is the ID of the second display area, the Activity of the second desktop is queried as the Activity of the desktop to be started; when the ID of the display area associated with the second module is the ID of the first display area, the Activity of the first desktop is queried as the Activity of the desktop to be started.
  • the first electronic device invoking the fourth module of the second application to run the second status bar, where the second status bar is associated with the second display area, includes: in response to the first user operation, the Pc management service receives an instruction to switch modes, and the instruction is used to instruct switching from the current non-screen projection mode to the screen projection mode; in response to the instruction, the Pc management service starts the productivity service, the productivity service invokes the system bar to start the second status bar, and the system bar creates the second status bar based on the configuration file; the second status bar calls the callback interface of the command queue to add a callback for the second status bar; the second status bar initializes the layout, and registers the IStatusBar object corresponding to the second status bar with the status bar management service; the second status bar creates a Pc status bar window view and adds it to the status bar window control class; the status bar window control class calls the window management interface to add the second status bar to the window management service, and then adds the second status bar to the second display area.
  • in the non-screen projection mode, the command queue supports the first status bar associated with the first display area; in the screen projection mode, the command queue supports the first status bar associated with the first display area and the second status bar associated with the second display area.
  • all or part of the foregoing embodiments may be implemented by software, hardware, firmware or any combination thereof.
  • when implemented using software, they may be implemented in whole or in part in the form of a computer program product.
  • the computer program product includes one or more computer instructions. When the computer program instructions are loaded and executed on the computer, the processes or functions according to the present application will be generated in whole or in part.
  • the computer can be a general purpose computer, a special purpose computer, a computer network, or other programmable devices.
  • the computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another computer-readable storage medium; for example, the computer instructions may be transmitted from a website, computer, server or data center to another website, computer, server or data center by wired (e.g., coaxial cable, optical fiber, DSL) or wireless (e.g., infrared, radio, microwave) means.
  • the computer-readable storage medium may be any available medium that can be accessed by a computer, or a data storage device such as a server or a data center integrated with one or more available media.
  • the available medium may be a magnetic medium (such as a floppy disk, a hard disk, or a magnetic tape), an optical medium (such as a DVD), or a semiconductor medium (such as a solid state disk (solid state disk, SSD)), etc.
  • all or part of the processes in the foregoing method embodiments may be completed by a computer program instructing related hardware.
  • the program may be stored in a computer-readable storage medium.
  • when the program is executed, the processes of the foregoing method embodiments may be included.
  • the aforementioned storage medium includes: a ROM, a random access memory (RAM), a magnetic disk, an optical disc, or various other media that can store program code.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present application discloses a screen projection method and a related apparatus. In the screen projection method, a first electronic device invokes a first module of a first application to run a first desktop associated with a first display area, and displays, based on the first display area, first display content including the first desktop. In response to a first user operation, the first electronic device invokes a second module of the first application to run a second desktop associated with a second display area, and sends to a second electronic device second display content corresponding to the second display area and including the second desktop. In response to a second user operation acting on the first display content, the first electronic device displays third display content based on the first display area. In response to a third user operation acting on the second display content displayed by the second electronic device, the first electronic device determines fourth display content based on the second display area and sends the fourth display content to the second electronic device. In this way, after screen projection, the first electronic device and the second electronic device can display different content.

Description

投屏方法及相关装置
本申请要求于2021年5月19日提交中国专利局、申请号为202110547552.4、申请名称为“一种投屏的方法”的中国专利申请的优先权,其全部内容通过引用结合在本申请中。本申请要求于2021年6月30日提交中国专利局、申请号为202110745467.9、申请名称为“投屏方法及相关装置”的中国专利申请的优先权,其全部内容通过引用结合在本申请中。
技术领域
本申请涉及电子技术领域,尤其涉及投屏方法及相关装置。
背景技术
智能终端与多个显示设备形成多屏联动和协同互补,是全场景生态建立的重要环节。目前,投屏技术覆盖了移动办公场景、车机Hicar、智慧屏等大屏场景。以手机投屏至个人电脑(Personal Computer,PC)为例,手机投屏至电脑后,用户可以在电脑上操控手机。
然而,目前的大屏投屏技术中,通常采用的是镜像投屏,即投屏后手机(即投屏发送设备)和电脑(即投屏接收设备)上的显示内容完全相同,用户不能在投屏发送设备和投屏接收设备查看不同的显示内容。
发明内容
本申请提供了投屏方法及相关装置,能够实现投屏后投屏发送设备和投屏接收设备显示不同内容。
第一方面,本申请提供了投屏方法,包括:第一电子设备调用第一应用的第一模块运行第一桌面,第一桌面和第一显示区关联;第一电子设备基于第一显示区显示第一显示内容,第一显示内容包括第一桌面;响应于第一用户操作,第一电子设备调用第一应用的第二模块运行第二桌面,第二桌面和第二显示区关联;第一电子设备向第二电子设备发送第二显示区对应的第二显示内容,第二显示内容包括第二桌面;响应于作用于第一显示内容的第二用户操作,第一电子设备基于第一显示区运行的任务栈,显示第三显示内容;响应于作用于第二电子设备显示的第二显示内容的第三用户操作,第一电子设备基于第二显示区运行的任务栈,确定第二显示区对应的显示内容为第四显示内容;第一电子设备向第二电子设备发送第四显示内容。
实施本申请实施例,第一电子设备(即投屏发送设备)支持通过同一应用在不同显示区同时运行多个桌面实例,例如通过第一应用的第一模块在第一显示区运行第一桌面,通过第一应用的第二模块在第二显示区运行第二桌面。第一电子设备基于第一显示区运行的任务栈确定本设备主屏的显示内容,基于第二显示区运行的任务栈确定投屏到第二电子设备(即投屏接收设备)的显示内容。这样,基于两个不同的显示区,第一电子设备和第二电子设备可以显示不同桌面,以及其他的不同内容。
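示例性的,上述"各显示区基于各自运行的任务栈确定各自显示内容"的机制,可以用如下简化的Java模型示意(MultiDisplayRouter等类名以及"StandardDesktop"等窗口名均为便于说明而假设的,并非本申请的实际实现):

```java
// 简化模型:每个显示区维护各自的任务栈,栈顶决定该屏当前显示的内容,
// 因此本设备主屏(Display0)与投屏侧(Display1)可以显示不同内容。
import java.util.ArrayDeque;
import java.util.Deque;
import java.util.HashMap;
import java.util.Map;

public class MultiDisplayRouter {
    private final Map<Integer, Deque<String>> stacks = new HashMap<>();

    public MultiDisplayRouter() {
        stacks.put(0, new ArrayDeque<>());   // 默认屏显示区(Display0)
        stacks.put(1, new ArrayDeque<>());   // 扩展屏显示区(Display1)
        stacks.get(0).push("StandardDesktop");
        stacks.get(1).push("ExtendedDesktop");
    }

    // 作用于某一显示区的用户操作,只改变该显示区的任务栈
    public void launch(int displayId, String window) {
        stacks.get(displayId).push(window);
    }

    // 各显示区的栈顶任务决定该屏当前的显示内容
    public String contentFor(int displayId) {
        return stacks.get(displayId).peek();
    }
}
```

可见作用于Display1的操作只改变Display1的栈顶内容,Display0的显示内容不受影响,即两个显示区的事件数据相互隔离。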
在一种实现方式中,上述响应于作用于第一显示内容的第二用户操作,第一电子设备基于第一显示区运行的任务栈,显示第三显示内容,包括:响应于作用于第一显示内容中的第一桌面的第二用户操作,第一电子设备基于第一显示区运行的第一应用的任务栈,显示第三显示内容;上述响应于作用于第二电子设备显示的第二显示内容的第三用户操作,第一电子设备基于第二显示区运行的任务栈,确定第二显示区对应的显示内容为第四显示内容,包括:响应于作用于第二电子设备显示的第二桌面的第三用户操作,第一电子设备基于第二显示区运行的第一应用的任务栈,确定第二显示区对应的显示内容为第四显示内容。
实施本申请实施例,针对作用于第一桌面的第二用户操作,第一电子设备可以基于第一桌面关联的显示区所运行的第一应用的任务栈,执行第二用户操作对应响应事件;针对作用于第二桌面的第三用户操作,第一电子设备可以基于第二桌面关联的显示区所运行的第一应用的任务栈,执行第三用户操作对应响应事件。这样,可以保证不同桌面的事件(输入事件和/或响应事件)的数据隔离。此外,由于两个桌面实例均由第一应用的模块运行,因此两个桌面可以实现指定数据的共享,第二桌面可以继承第一桌面的部分或全部功能特性。
在一种实现方式中,上述第一电子设备基于第一显示区显示第一显示内容之前,所述方法还包括:第一电子设备调用第二应用的第三模块运行第一状态栏,第一状态栏与第一显示区关联,第一显示内容包括第一状态栏;所述方法还包括:响应于第一用户操作,第一电子设备调用第二应用的第四模块运行第二状态栏,第二状态栏与第二显示区关联,第二显示内容包括第二状态栏。
实施本申请实施例,第一电子设备支持通过同一应用在不同显示区同时运行多个状态栏实例,例如通过第二应用的第三模块在第一显示区运行第一状态栏,通过第二应用的第四模块在第二显示区运行第二状态栏。这样,通过将两个状态栏与不同的显示区相关联,第一电子设备和第二电子设备可以显示不同状态栏,保证了两个状态栏的事件(输入事件和/或响应事件)的数据隔离。此外,由于两个状态栏均由第二应用的模块运行,因此两个状态栏可以实现指定数据(例如通知消息)的共享,第二状态栏可以继承第一状态栏的部分或全部功能特性。
在一种实现方式中,上述第一电子设备基于第一显示区显示第一显示内容之前,所述方法还包括:第一电子设备调用第三应用的第五模块运行第一变量的第一显示对象;第一变量与第一显示区关联,第一显示内容包括第一显示对象;第一变量与第二显示区关联,第二显示内容包括第一显示对象。
实施本申请实施例,第一电子设备支持同一变量对应的显示对象同时在多个不同的显示区显示。本申请实施例,第三应用与第二应用可以为同一应用,也可以为不同应用,此处不做具体限定。
在一种实现方式中,所述方法还包括:响应于作用于第一显示内容的第四用户操作,调用第三应用的第五模块修改第一变量的显示对象为第二显示对象;第一电子设备更新第一显示区对应的显示内容为第五显示内容,第五显示内容包括第二显示对象;第一电子设备更新第二显示区对应的显示内容为第六显示内容,并向第二电子设备发送第六显示内容,第六显示内容包括第二显示对象。
实施本申请实施例,第一电子设备支持同一变量对应的显示对象同时在多个不同的显示区显示,用户改变第一变量在第一显示区的显示对象后,第一变量在第二显示区的显示对象也随之改变。
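上述"同一变量的显示对象同时关联多个显示区、修改后同步刷新"的行为,可以用如下简化模型示意(SharedWallpaper类名与壁纸文件名均为假设,仅用于说明机制):

```java
// 简化模型:第一变量(此处以壁纸为例)同时关联Display0和Display1,
// 修改变量的显示对象后,两个显示区的显示内容随之同步更新。
import java.util.HashMap;
import java.util.Map;

public class SharedWallpaper {
    private String object = "wallpaper_a.png";            // 第一变量当前的显示对象
    private final Map<Integer, String> shown = new HashMap<>();

    public SharedWallpaper() { render(); }

    // 用户修改第一变量的显示对象(第四用户操作),两个显示区同步刷新
    public void set(String newObject) {
        object = newObject;
        render();
    }

    private void render() {
        shown.put(0, object);   // Display0显示内容中的显示对象
        shown.put(1, object);   // Display1显示内容中的显示对象
    }

    public String shownOn(int displayId) { return shown.get(displayId); }
}
```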
在一种实现方式中,第一变量用于指示壁纸的显示对象,壁纸的显示对象为静态图片和/或动态图片,壁纸包括锁屏时的锁屏壁纸和/或非锁屏时的桌面壁纸。
实施本申请实施例,用户改变第一电子设备显示的壁纸后,投屏到第二电子设备上的壁纸也随之改变。
在一种实现方式中,第一电子设备预设了多种主题,主题用于指示桌面布局风格、图标显示风格和/或界面色彩等;第一变量用于指示主题的显示对象,主题的显示对象为多种主题中的一种主题对应的显示内容。
实施本申请实施例,用户改变第一电子设备显示的主题后,投屏到第二电子设备上的主题也随之改变。
在一种实现方式中,第一应用的第一模块包括用于创建和运行第一桌面的第一共同类、第一用户界面UI控制类和第一桌面的桌面任务栈;第一应用的第二模块包括用于创建和运行第二桌面的第二共同类、第二UI控制类和第二桌面的桌面任务栈,第二共同类中的部分或全部类继承自第一共同类,第二UI控制类中的部分或全部类继承自第一UI控制类。
实施本申请实施例,第一电子设备新增了用于创建和运行第二桌面的第二共同类、第二UI控制类和第二桌面的桌面任务栈,且新增的共同类和UI控制类中的部分或全部,继承自原有的第一桌面对应的第一共同类和第一UI控制类,因此,第二桌面可以继承第一桌面的部分功能特性,两个桌面可以实现指定数据的共享。
在一种实现方式中,第二共同类包括以下一项或多项:桌面启动提供者、数据库助手、桌面启动设置类、桌面启动常量类、Pc布局配置、Pc设备文件、Pc网格计数器、Pc桌面启动器策略、Pc桌面启动模式、Pc加载任务等;第二UI控制类包括以下一项或多项:Pc拖拽层、Pc桌面工作区、Pc单元布局、Pc程序坞视图、Pc文件夹、Pc文件夹图标等。
在一种实现方式中,第二应用的第三模块包括用于创建和运行第一状态栏的第一组件、第一依赖控制类和第三UI控制类;第一应用的第二模块包括用于创建和运行第二状态栏的第二组件、第二依赖控制类和第四UI控制类,第二组件中的部分或全部组件继承自第一组件,第二依赖控制类中的部分或全部类继承自第一依赖控制类,第四UI控制类中的部分或全部类继承自第三UI控制类。
实施本申请实施例,第一电子设备新增了用于创建和运行第二状态栏的第二组件、第二依赖控制类和第四UI控制类,且新增的组件、依赖控制类和UI控制类中的部分或全部,继承自原有的第一状态栏对应的第一组件、第一依赖控制类和第三UI控制类,因此,第二状态栏可以继承第一状态栏的部分功能特性,两个状态栏可以实现指定数据的共享。
在一种实现方式中,第二组件包括以下一项或多项:Pc依赖类、Pc系统提供者、Pc系统栏、第二状态栏;第二依赖控制类包括以下一项或多项:Pc状态栏窗口控制类、屏幕控制类、锁屏控制类、远程控制类;第四UI控制类包括以下一项或多项:Pc状态栏窗口视图、Pc通知面板视图、Pc快捷设置碎片、Pc状态栏碎片、Pc状态栏视图。
在一种实现方式中,第二模块关联的显示区的身份标识ID为第二显示区的ID;上述响应于第一用户操作,第一电子设备调用第一应用的第二模块运行第二桌面,第二桌面和第二显示区关联,包括:响应于第一用户操作,Pc管理服务接收到切换模式的指令,指令用于指示将当前的非投屏模式切换为投屏模式;响应于指令,Pc管理服务调用Pc桌面服务,Pc桌面服务调用活动管理服务,活动管理服务调用活动任务管理器启动第一应用的第二模块;调用根活动容器确定第二模块关联的显示区的ID;当第二模块关联的显示区的ID为第二显示区的ID时,查询第二桌面的Activity作为待启动桌面的Activity,当第二模块关联的显示区的ID为第一显示区的ID时,查询第一桌面的Activity作为待启动桌面的Activity;活动启动控制器调用活动启动器启动第二桌面。
在一种实现方式中,上述响应于第一用户操作,第一电子设备调用第二应用的第四模块运行第二状态栏,第二状态栏与第二显示区关联,包括:响应于第一用户操作,Pc管理服务接收到切换模式的指令,指令用于指示将当前的非投屏模式切换为投屏模式;响应于指令,Pc管理服务启动生产力服务,生产力服务调用系统栏启动第二状态栏,系统栏基于配置文件创建第二状态栏;第二状态栏调用命令队列的回调接口,以添加回调给第二状态栏;第二状态栏初始化布局,注册第二状态栏对应的IStatusBar对象至状态栏管理服务;第二状态栏创建并添加Pc状态栏窗口视图到状态栏窗口控制类;状态栏窗口控制类调用窗口管理的接口添加第二状态栏到窗口管理服务,进而将第二状态栏添加至第二显示区。
在一种实现方式中,非投屏模式下,命令队列支持与第一显示区关联的第一状态栏;投屏模式下,命令队列同时支持与第一显示区关联的第一状态栏,以及与第二显示区关联的第二状态栏。
第二方面,本申请提供了一种电子设备,包括一个或多个处理器和一个或多个存储器。该一个或多个存储器与一个或多个处理器耦合,一个或多个存储器用于存储计算机程序代码,计算机程序代码包括计算机指令,当一个或多个处理器执行计算机指令时,使得电子设备执行上述任一方面任一项可能的实现方式中的投屏方法。
第三方面,本申请实施例提供了一种计算机存储介质,包括计算机指令,当计算机指令在电子设备上运行时,使得电子设备执行上述任一方面任一项可能的实现方式中的投屏方法。
第四方面,本申请实施例提供了一种计算机程序产品,当计算机程序产品在计算机上运行时,使得计算机执行上述任一方面任一项可能的实现方式中的投屏方法。
附图说明
图1A为本申请实施例提供的通信系统示意图;
图1B为本申请实施例提供的电子设备的结构示意图;
图2为本申请实施例提供的电子设备间的投屏示意图;
图3A为本申请实施例提供的主界面示意图;
图3B至图3G为本申请实施例提供的通过NFC实现投屏的用户界面示意图;
图4A至图4H为本申请实施例提供的扩展屏状态栏的二级界面示意图;
图4I为本申请实施例提供的扩展屏状态栏的示意图;
图5为本申请实施例提供的扩展屏搜索栏的用户界面示意图;
图6A至图6B为本申请实施例提供的应用图标列表的用户界面示意图;
图6C为本申请实施例提供的扩展屏桌面上的图库的用户界面示意图;
图6D为本申请实施例提供的扩展屏桌面上的音乐的用户界面示意图;
图7A至图7B为本申请实施例提供的多任务界面示意图;
图8A至图8B为本申请实施例提供的显示桌面图标的相关用户界面示意图;
图8C至图8E为本申请实施例提供的扩展屏桌面上应用程序的用户界面示意图;
图9A至图9B为本申请实施例提供的Dock栏的应用程序图标的相关用户界面示意图;
图10A至图10B为本申请实施例提供的锁屏界面示意图;
图11至图13为本申请实施例提供的软件系统示意图;
图14为本申请实施例提供的活动栈示意图;
图15A至图15C为本申请实施例提供的投屏方法的软件实现示意图。
具体实施方式
下面将结合附图对本申请实施例中的技术方案进行清楚、详尽地描述。其中,在本申请实施例的描述中,除非另有说明,“/”表示或的意思,例如,A/B可以表示A或B;文本中的“和/或”仅仅是一种描述关联对象的关联关系,表示可以存在三种关系,例如,A和/或B,可以表示:单独存在A,同时存在A和B,单独存在B这三种情况,另外,在本申请实施例的描述中,“多个”是指两个或多于两个。
以下,术语“第一”、“第二”仅用于描述目的,而不能理解为暗示或暗示相对重要性或者隐含指明所指示的技术特征的数量。由此,限定有“第一”、“第二”的特征可以明示或者隐含地包括一个或者更多个该特征,在本申请实施例的描述中,除非另有说明,“多个”的含义是两个或两个以上。
下面介绍本申请实施例提供涉及的通信系统10。
图1A示例性地示出了本申请实施例提供的一种通信系统10的结构示意图。如图1A所示,该通信系统10包括电子设备100,以及与电子设备100建立连接的一或多个电子设备,例如电子设备200。
本申请的实施例中,电子设备100可以通过近距离无线通信连接或本地有线连接与电子设备200进行直接连接。示例性的,电子设备100和电子设备200可以具有近场通信(near field communication,NFC)通信模块、无线保真(wireless fidelity,WiFi)通信模块、超宽带(ultra wide band,UWB)通信模块、蓝牙(bluetooth)通信模块、ZigBee通信模块等通信模块中的一项或多项近距离通信模块。以电子设备100为例,电子设备100可以通过近距离通信模块(例如NFC通信模块)发射信号来探测、扫描电子设备100附近的电子设备,使得电子设备100可以通过近距离无线通信协议发现附近的电子设备(例如电子设备200),并与附近的电子设备建立无线通信连接,以及传输数据至附近的电子设备。
在一些实施例中,电子设备100和电子设备200可以基于有线连接或WiFi连接的连接方式,通过电子设备300连接至局域网(local area network,LAN)。例如,电子设备300可以是路由器、网关、智能设备控制器等第三方设备。在一些实施例中,电子设备100和电子设备200还可以通过广域网(例如华为云网络)中的至少一个电子设备400进行间接连接。例如,电子设备400可以是硬件服务器,也可以是植入虚拟化环境中的云端服务器。可以理解,通过电子设备300和/或电子设备400,电子设备100可以和电子设备200间接进行无线通信连接以及数据传输。
可以理解的,本实施例示出的结构并不构成对通信系统10的具体限定。在本申请另一些实施例中,通信系统10可以包括比图示更多或更少的设备。
本申请的实施例中,电子设备100和电子设备200建立连接后,电子设备100可以向电子设备200发送投屏的图像数据和/或音频数据等,电子设备200可以基于电子设备100发送的数据进行界面显示和/或音频输出。
本申请实施例中,电子设备200与电子设备100的显示屏的屏幕分辨率可以不同。其中,电子设备100可以为手机、平板电脑、个人数字助理(personal digital assistant,PDA)、可穿戴设备、膝上型计算机(laptop)等便携式电子设备。可选的,电子设备100也可以不是便携式电子设备,本申请实施例对此不作任何限制。电子设备200可以是智慧屏、电视机、平板电脑、笔记本电脑、车载设备或者投影仪等任一显示装置。电子设备100和电子设备200的示例性实施例包括但不限于搭载
(操作系统标识图,图片代号PCTCN2022091899-appb-000001)或者其它操作系统。
下面对本申请实施例涉及的电子设备的硬件结构进行介绍。
参见图1B,图1B示例性地示出了本申请实施例提供的电子设备100的结构示意图。本申请实施例中,电子设备200的硬件结构可以参考电子设备100的硬件结构的相关实施例,此处不再赘述。
电子设备100可以包括处理器110,外部存储器接口120,内部存储器121,通用串行总线(universal serial bus,USB)接口130,充电管理模块140,电源管理模块141,电池142,天线1,天线2,移动通信模块150,无线通信模块160,音频模块170,扬声器170A,受话器170B,麦克风170C,耳机接口170D,传感器模块180,按键190,马达191,指示器192,摄像头193,显示屏194,以及用户标识模块(subscriber identification module,SIM)卡接口195等。其中传感器模块180可以包括压力传感器180A,陀螺仪传感器180B,气压传感器180C,磁传感器180D,加速度传感器180E,距离传感器180F,接近光传感器180G,指纹传感器180H,温度传感器180J,触摸传感器180K,环境光传感器180L,骨传导传感器180M等。
可以理解的是,本发明实施例示意的结构并不构成对电子设备100的具体限定。在本申请另一些实施例中,电子设备100可以包括比图示更多或更少的部件,或者组合某些部件,或者拆分某些部件,或者不同的部件布置。图示的部件可以以硬件,软件或软件和硬件的组合实现。
处理器110可以包括一个或多个处理单元,例如:处理器110可以包括应用处理器(application processor,AP),调制解调处理器,图形处理器(graphics processing unit,GPU),图像信号处理器(image signal processor,ISP),控制器,存储器,视频编解码器,数字信号处理器(digital signal processor,DSP),基带处理器,和/或神经网络处理器(neural-network processing unit,NPU)等。其中,不同的处理单元可以是独立的器件,也可以集成在一个或多个处理器中。
其中,控制器可以是电子设备100的神经中枢和指挥中心。控制器可以根据指令操作码和时序信号,产生操作控制信号,完成取指令和执行指令的控制。
处理器110中还可以设置存储器,用于存储指令和数据。在一些实施例中,处理器110中的存储器为高速缓冲存储器。该存储器可以保存处理器110刚用过或循环使用的指令或数据。如果处理器110需要再次使用该指令或数据,可从所述存储器中直接调用。避免了重复存取,减少了处理器110的等待时间,因而提高了系统的效率。
在一些实施例中,处理器110可以包括一个或多个接口。接口可以包括集成电路(inter-integrated circuit,I2C)接口,集成电路内置音频(inter-integrated circuit sound,I2S)接口,脉冲编码调制(pulse code modulation,PCM)接口,通用异步收发传输器(universal asynchronous receiver/transmitter,UART)接口,移动产业处理器接口(mobile industry processor interface,MIPI),通用输入输出(general-purpose input/output,GPIO)接口,用户标识模块(subscriber identity module,SIM)接口,和/或通用串行总线(universal serial bus,USB)接口等。
充电管理模块140用于从充电器接收充电输入。其中,充电器可以是无线充电器,也可以是有线充电器。在一些有线充电的实施例中,充电管理模块140可以通过USB接口130接收有线充电器的充电输入。在一些无线充电的实施例中,充电管理模块140可以通过电子设备100的无线充电线圈接收无线充电输入。充电管理模块140为电池142充电的同时,还可以通过电源管理模块141为电子设备供电。
电源管理模块141用于连接电池142,充电管理模块140与处理器110。电源管理模块141接收电池142和/或充电管理模块140的输入,为处理器110,内部存储器121,显示屏 194,摄像头193,和无线通信模块160等供电。电源管理模块141还可以用于监测电池容量,电池循环次数,电池健康状态(漏电,阻抗)等参数。在其他一些实施例中,电源管理模块141也可以设置于处理器110中。在另一些实施例中,电源管理模块141和充电管理模块140也可以设置于同一个器件中。
电子设备100的无线通信功能可以通过天线1,天线2,移动通信模块150,无线通信模块160,调制解调处理器以及基带处理器等实现。
天线1和天线2用于发射和接收电磁波信号。电子设备100中的每个天线可用于覆盖单个或多个通信频带。不同的天线还可以复用,以提高天线的利用率。例如:可以将天线1复用为无线局域网的分集天线。在另外一些实施例中,天线可以和调谐开关结合使用。
移动通信模块150可以提供应用在电子设备100上的包括2G/3G/4G/5G等无线通信的解决方案。移动通信模块150可以包括至少一个滤波器,开关,功率放大器,低噪声放大器(low noise amplifier,LNA)等。移动通信模块150可以由天线1接收电磁波,并对接收的电磁波进行滤波,放大等处理,传送至调制解调处理器进行解调。移动通信模块150还可以对经调制解调处理器调制后的信号放大,经天线1转为电磁波辐射出去。在一些实施例中,移动通信模块150的至少部分功能模块可以被设置于处理器110中。在一些实施例中,移动通信模块150的至少部分功能模块可以与处理器110的至少部分模块被设置在同一个器件中。
调制解调处理器可以包括调制器和解调器。其中,调制器用于将待发送的低频基带信号调制成中高频信号。解调器用于将接收的电磁波信号解调为低频基带信号。随后解调器将解调得到的低频基带信号传送至基带处理器处理。低频基带信号经基带处理器处理后,被传递给应用处理器。应用处理器通过音频设备(不限于扬声器170A,受话器170B等)输出声音信号,或通过显示屏194显示图像或视频。在一些实施例中,调制解调处理器可以是独立的器件。在另一些实施例中,调制解调处理器可以独立于处理器110,与移动通信模块150或其他功能模块设置在同一个器件中。
无线通信模块160可以提供应用在电子设备100上的包括无线局域网(wireless local area networks,WLAN)(如无线保真(wireless fidelity,Wi-Fi)网络),蓝牙(bluetooth,BT),全球导航卫星系统(global navigation satellite system,GNSS),调频(frequency modulation,FM),近距离无线通信技术(near field communication,NFC),红外技术(infrared,IR)等无线通信的解决方案。无线通信模块160可以是集成至少一个通信处理模块的一个或多个器件。无线通信模块160经由天线2接收电磁波,将电磁波信号调频以及滤波处理,将处理后的信号发送到处理器110。无线通信模块160还可以从处理器110接收待发送的信号,对其进行调频,放大,经天线2转为电磁波辐射出去。
在一些实施例中,电子设备100的天线1和移动通信模块150耦合,天线2和无线通信模块160耦合,使得电子设备100可以通过无线通信技术与网络以及其他设备通信。所述无线通信技术可以包括全球移动通讯系统(global system for mobile communications,GSM),通用分组无线服务(general packet radio service,GPRS),码分多址接入(code division multiple access,CDMA),宽带码分多址(wideband code division multiple access,WCDMA),时分码分多址(time-division code division multiple access,TD-SCDMA),长期演进(long term evolution,LTE),BT,GNSS,WLAN,NFC,FM,和/或IR技术等。所述GNSS可以包括全球卫星定位系统(global positioning system,GPS),全球导航卫星系统(global navigation satellite system,GLONASS),北斗卫星导航系统(beidou navigation satellite system,BDS),准天顶卫星系统 (quasi-zenith satellite system,QZSS)和/或星基增强系统(satellite based augmentation systems,SBAS)。
电子设备100通过GPU,显示屏194,以及应用处理器等实现显示功能。GPU为图像处理的微处理器,连接显示屏194和应用处理器。GPU用于执行数学和几何计算,用于图形渲染。处理器110可包括一个或多个GPU,其执行程序指令以生成或改变显示信息。
显示屏194用于显示图像,视频等。显示屏194包括显示面板。显示面板可以采用液晶显示屏(liquid crystal display,LCD),有机发光二极管(organic light-emitting diode,OLED),有源矩阵有机发光二极体或主动矩阵有机发光二极体(active-matrix organic light emitting diode,AMOLED),柔性发光二极管(flex light-emitting diode,FLED),Miniled,MicroLed,Micro-oLed,量子点发光二极管(quantum dot light emitting diodes,QLED)等。在一些实施例中,电子设备100可以包括1个或N个显示屏194,N为大于1的正整数。
电子设备100可以通过ISP,摄像头193,视频编解码器,GPU,显示屏194以及应用处理器等实现拍摄功能。
ISP用于处理摄像头193反馈的数据。例如,拍照时,打开快门,光线通过镜头被传递到摄像头感光元件上,光信号转换为电信号,摄像头感光元件将所述电信号传递给ISP处理,转化为肉眼可见的图像。ISP还可以对图像的噪点,亮度,肤色进行算法优化。ISP还可以对拍摄场景的曝光,色温等参数优化。在一些实施例中,ISP可以设置在摄像头193中。
摄像头193用于捕获静态图像或视频。物体通过镜头生成光学图像投射到感光元件。感光元件可以是电荷耦合器件(charge coupled device,CCD)或互补金属氧化物半导体(complementary metal-oxide-semiconductor,CMOS)光电晶体管。感光元件把光信号转换成电信号,之后将电信号传递给ISP转换成数字图像信号。ISP将数字图像信号输出到DSP加工处理。DSP将数字图像信号转换成标准的RGB,YUV等格式的图像信号。在一些实施例中,电子设备100可以包括1个或N个摄像头193,N为大于1的正整数。
数字信号处理器用于处理数字信号,除了可以处理数字图像信号,还可以处理其他数字信号。例如,当电子设备100在频点选择时,数字信号处理器用于对频点能量进行傅里叶变换等。
视频编解码器用于对数字视频压缩或解压缩。电子设备100可以支持一种或多种视频编解码器。这样,电子设备100可以播放或录制多种编码格式的视频,例如:动态图像专家组(moving picture experts group,MPEG)1,MPEG2,MPEG3,MPEG4等。
NPU为神经网络(neural-network,NN)计算处理器,通过借鉴生物神经网络结构,例如借鉴人脑神经元之间传递模式,对输入信息快速处理,还可以不断的自学习。通过NPU可以实现电子设备100的智能认知等应用,例如:图像识别,人脸识别,语音识别,文本理解等。
内部存储器121可以包括一个或多个随机存取存储器(random access memory,RAM)和一个或多个非易失性存储器(non-volatile memory,NVM)。随机存取存储器可以包括静态随机存储器(static random-access memory,SRAM)、动态随机存储器(dynamic random access memory,DRAM)、同步动态随机存储器(synchronous dynamic random access memory,SDRAM)、双倍资料率同步动态随机存取存储器(double data rate synchronous dynamic random access memory,DDR SDRAM,例如第五代DDR SDRAM一般称为DDR5SDRAM)等;非易失性存储器可以包括磁盘存储器件、快闪存储器(flash memory)。快闪存储器按照运作原理划分可以包括NOR FLASH、NAND FLASH、3D NAND FLASH等,按照存储单元电位阶数 划分可以包括单阶存储单元(single-level cell,SLC)、多阶存储单元(multi-level cell,MLC)、三阶储存单元(triple-level cell,TLC)、四阶储存单元(quad-level cell,QLC)等,按照存储规范划分可以包括通用闪存存储(英文:universal flash storage,UFS)、嵌入式多媒体存储卡(embedded multi media Card,eMMC)等。在一些实施例中,随机存取存储器可以由处理器110直接进行读写,可以用于存储操作系统或其他正在运行中的程序的可执行程序(例如机器指令),还可以用于存储用户及应用程序的数据等。非易失性存储器也可以存储可执行程序和存储用户及应用程序的数据等,可以提前加载到随机存取存储器中,用于处理器110直接进行读写。
外部存储器接口120可以用于连接外部的非易失性存储器,实现扩展电子设备100的存储能力。外部的非易失性存储器通过外部存储器接口120与处理器110通信,实现数据存储功能。例如将音乐,视频等文件保存在外部的非易失性存储器中。
电子设备100可以通过音频模块170,扬声器170A,受话器170B,麦克风170C,耳机接口170D,以及应用处理器等实现音频功能。例如音乐播放,录音等。
音频模块170用于将数字音频信息转换成模拟音频信号输出,也用于将模拟音频输入转换为数字音频信号。音频模块170还可以用于对音频信号编码和解码。
扬声器170A,也称“喇叭”,用于将音频电信号转换为声音信号。
受话器170B,也称“听筒”,用于将音频电信号转换成声音信号。
麦克风170C,也称“话筒”,“传声器”,用于将声音信号转换为电信号。
压力传感器180A用于感受压力信号,可以将压力信号转换成电信号。当有触摸操作作用于显示屏194,电子设备100根据压力传感器180A检测所述触摸操作强度。电子设备100也可以根据压力传感器180A的检测信号计算触摸的位置。
陀螺仪传感器180B可以用于确定电子设备100的运动姿态。
气压传感器180C用于测量气压。磁传感器180D包括霍尔传感器。
加速度传感器180E可检测电子设备100在各个方向上(例如,电子设备100的x、y、z三轴坐标系中的三轴指向的方向)加速度的大小。
距离传感器180F,用于测量距离。电子设备100可以通过红外或激光测量距离。
接近光传感器180G可以包括例如发光二极管(LED)和光检测器,例如光电二极管。
环境光传感器180L用于感知环境光亮度。电子设备100可以根据感知的环境光亮度自适应调节显示屏194亮度。
指纹传感器180H用于采集指纹。
温度传感器180J用于检测温度。
触摸传感器180K,也称“触控器件”。触摸传感器180K可以设置于显示屏194,由触摸传感器180K与显示屏194组成触摸屏,也称“触控屏”。触摸传感器180K用于检测作用于其上或附近的触控操作。触摸传感器可以将检测到的触控操作传递给应用处理器,以确定触摸事件类型。可以通过显示屏194提供与触控操作相关的视觉输出。在另一些实施例中,触摸传感器180K也可以设置于电子设备100的表面,与显示屏194所处的位置不同。
骨传导传感器180M可以获取振动信号。在一些实施例中,骨传导传感器180M可以获取人体声部振动骨块的振动信号、血压跳动信号。
按键190可以是机械按键,也可以是触摸式按键。电子设备100可以接收按键输入,产生与电子设备100的用户设置以及功能控制有关的键信号输入。
下面对本申请实施例涉及的电子设备100和电子设备200间的投屏实现进行简要介绍。
在一种投屏实现方式中,电子设备100的桌面投屏至电子设备200后,电子设备200全屏显示电子设备100的扩展屏桌面。为使电子设备100投屏到电子设备200上的扩展屏桌面适应电子设备200的屏幕分辨率和用户对电子设备200的操作习惯,扩展屏桌面通常采用不同于电子设备100的标准桌面的安卓应用程序包(Android application package,APK)的自定义APK,即电子设备100通过自定义APK在电子设备200上模拟显示电子设备100的桌面和状态栏。这种实现方式下,电子设备200显示的扩展屏桌面与电子设备100的标准桌面不同源,用户不能统一维护扩展屏桌面和标准桌面,这导致投屏到电子设备200的桌面(即扩展屏桌面)和状态栏(即扩展屏状态栏),不能跟随电子设备100的标准桌面的用户体验(User Experience,UX)风格(例如主题、壁纸、锁屏界面),也不能继承电子设备100的标准桌面和标准状态栏的功能特性(例如通知中心、快捷设置、文件夹和FA的功能特性),也不能保持必要的数据同步(例如通知消息的管理)。
在本申请实施例提供的另一种投屏实现方式中,电子设备100可以支持桌面启动器(Launcher)在不同的显示区(Display)同时运行多个桌面实例,以及支持系统界面(SystemUI)同时运行多个状态栏实例,并可以维护状态栏实例和桌面实例之间的对应关系。本申请实施例中,桌面启动器也可以被称为桌面应用。
示例性的,图2示出了本申请实施例提供的电子设备100和电子设备200间的一种投屏通信示意图。如图2所示,电子设备100配置的物理显示屏为电子设备100的默认屏,电子设备100在默认屏显示区(即Display0)运行标准桌面和标准状态栏,Display0运行的应用程序窗口对应的显示区身份标识(Display Identity document,DisplayId)为Display0的身份标识(Identity document,ID)。电子设备100基于Display0运行的应用程序窗口在默认屏显示标准桌面和标准状态栏。
电子设备100基于投屏指令确定目标投屏设备为电子设备200后,将电子设备200配置的显示屏作为手机100的扩展屏,为扩展屏创建扩展屏显示区(Display1)。电子设备100在Display1运行扩展屏桌面和扩展屏状态栏,Display1运行的应用程序窗口对应的DisplayId为Display1的ID。其中,扩展屏桌面和标准桌面是由同一Launcher创建和运行的两个桌面实例,扩展屏状态栏和标准状态栏是由同一SystemUI创建和运行的状态栏实例。电子设备100基于Display1运行的应用程序窗口确定扩展屏的显示内容,并可以将扩展屏的显示内容投屏至电子设备200;电子设备200可以基于电子设备100发送的投屏数据显示扩展屏桌面和扩展屏状态栏。
在一些实施例中,与电子设备200建立连接后,电子设备100可以获取电子设备200的型号、屏幕分辨率等设备信息。电子设备100在Display1中运行的应用程序窗口,是基于电子设备200的型号、屏幕分辨率等设备信息,对电子设备100原有的应用程序窗口进行界面优化和功能优化后,获取到的适用于电子设备200的应用程序窗口。界面优化包括更改应用程序窗口的显示尺寸,应用程序窗口中各界面元素的显示尺寸和界面布局,功能优化包括为适应用户对电子设备200的操作习惯,增加电子设备100原有的应用程序窗口所没有的功能,屏蔽电子设备100原有的不适用于电子设备200的部分功能。
参见图2,通过Launcher在不同Display运行两个桌面实例,可以实现两个桌面实例的数据隔离;同时,由于两个桌面实例源自同一桌面启动器APK,因此可以实现两个桌面实例的数据的同步操作,例如,电子设备100的扩展屏的主题、壁纸和锁屏界面均跟随电子设备100的默认屏。本申请实施例可以同时运行多个状态栏实例,并维护每个状态栏实例的数据通道,每个状态栏实例关联不同的Display,从而保证不同Display的状态栏事件可以相互隔离,互不影响。综上所述,实施本申请实施例,可以实现桌面和状态栏跨Display运行,同时可以保持不同Display的数据同步。
需要说明的是,本申请实施例中,应用程序窗口可以是Android系统中的Activity对应的Window对象,还可是IOS系统中的应用程序窗口,还可以是其他操作系统中的应用程序窗口,此处不作具体限定。一个应用程序包括多个应用程序窗口,一个应用程序窗口通常对应一个用户界面(User Interface,UI)。可选的,一个应用程序窗口也可以对应多个用户界面。为便于描述,本申请实施例中也可以将应用程序窗口简称为窗口。
其中,Android系统中的Activity是用户和应用程序之间进行交互的接口,每一个Activity组件都关联有一个Window对象,用来描述一个具体的应用程序窗口。由此可知,Activity是一个高度抽象的用户界面组件,在Android中代表了用户界面和以用户界面为中心的相应的业务逻辑,通过用户界面中的控件可以监听并处理用户触发的事件。可以理解,android应用中,一个Activity可以表现为一个用户界面,一个Android应用可以拥有多个activty。
下面以电子设备100是手机、电子设备200是电脑为例,结合附图对本申请实施例涉及的投屏场景的示例性用户界面进行介绍。
下面对本申请实施例提供的示例性的手机100的桌面应用的主界面11进行介绍。
示例性的,图3A示出了手机100上的用于展示手机100安装的应用程序的主界面11。主界面11可以包括:标准状态栏101,日历指示符102,天气指示符103,具有常用应用程序图标的托盘104,以及其他应用程序图标105。其中:
标准状态栏101可以包括:移动通信信号(又可称为蜂窝信号)的一个或多个信号强度指示符101A、运营商名称(例如“中国移动”)101B、无线高保真(wireless fidelity,Wi-Fi)信号的一个或多个信号强度指示符101C,电池状态指示符101D、时间指示符101E。
具有常用应用程序图标的托盘104可展示:电话图标、联系人图标、短信图标、相机图标。其他应用程序图标105可展示:文件管理图标、图库图标、音乐图标、设置图标等。主界面11还可包括页面指示符106。其他应用程序图标可分布在多个页面,页面指示符106可用于指示用户当前查看的是哪一个页面中的应用程序。用户可以左右滑动其他应用程序图标的区域,来查看其他页面中的应用程序图标。
主界面11还可包括桌面壁纸107,桌面壁纸107可以是用户设置的,也可以是手机100默认设置的。此外,手机100可以使用多种主题,手机100可以通过切换主题来改变桌面布局风格、图标显示风格和桌面色彩。通常每个主题均有默认配置的壁纸,用户也可以修改手机100在当前主题下的壁纸。
可以理解,图3A仅仅示例性示出了手机100上的用户界面,不应构成对本申请实施例的限定。
本申请实施例中,手机100可以通过NFC、WiFi、蓝牙等近距离无线通信技术与电脑200建立连接,进而可以将手机100的桌面投屏至电脑200。下面以通过NFC实现投屏为例,对投屏过程进行示例性说明。
示例性的,如图3A所示,手机100接收用户作用于标准状态栏101的向下滑动操作,响应于上述滑动操作,手机100显示图3B所示的控制中心界面12,控制中心界面12包括多个常用功能的快捷图标201(例如图3B所示的WLAN图标、蓝牙图标、NFC图标201A和多屏协同图标等),亮度调节条202,一或多个智慧协同的设备的图标203(例如,笔记本电脑MateBook、平板MatePad、台式电脑Desktop、智能音响Sound X等),以及一或多个智能家居设备的控制框(例如空气净化器的控制框204、智能灯光的控制框205等)。
NFC图标201A有两种状态,即选中状态和非选中状态。示例性的,图3B所示的NFC图标201A为非选中状态,手机100未开启NFC模块,NFC图标201A可以接收用户的输入操作(例如触摸操作),响应于该输入操作,手机100切换NFC图标201A为图3C所示的选中状态,并开启NFC模块。
示例性的,如图3D所示,手机100的NFC模块位于手机背面的NFC区域内,电脑200的NFC模块位于电脑200的右下角的NFC区域。在一些实施例中,用户可以通过将手机100的NFC区域靠近电脑200的NFC区域,来实现手机100和电脑200间的NFC连接,进而可以实现将手机100的桌面投屏至电脑200。可以理解,手机100的NFC区域也可以位于手机100的其他部位,电脑200的NFC区域也可以位于电脑200的其他部位,此处均不做具体限定。
具体的,用户将手机100的NFC区域靠近电脑200的NFC区域后,手机100可以检测到电脑200的NFC信号,手机100在当前的显示界面上显示图3E所示的提示框13,提示框13包括电脑200的型号301(例如MateBook)、提示信息302、连接控件303和取消控件304。
其中,提示信息302用于提示用户点击连接控件303可以实现NFC连接,且NFC连接后用户可以在电脑200上操控手机100。连接控件303可以接收用户的输入操作(例如触摸操作),响应于该输入操作,手机100向电脑200发送NFC连接请求。取消控件304可以接收用户的输入操作(例如触摸操作),响应于该输入操作,手机100可以关闭提示框13。
如图3F所示,电脑200响应于接收到的手机100的NFC连接请求,在当前显示界面(例如电脑200的桌面14)上显示提示框15。提示框15包括提示信息305、连接控件306和取消控件307。
其中,提示信息305用于提示用户是否允许手机100连接到本设备,点击连接控件306可以实现NFC连接,且NFC连接后用户可以在电脑200上操作手机100,即NFC连接后手机100可以将手机100的桌面投屏至电脑200,用户可以通过电脑200显示的桌面操控手机100。
连接控件306可以接收用户的输入操作(例如触摸操作),响应于该输入操作,电脑200向手机100发送NFC连接响应。取消控件307可以接收用户的输入操作(例如触摸操作),响应于该输入操作,电脑200可以关闭提示框15。
手机100响应于接收到的电脑200的NFC连接响应,通过Launcher创建扩展屏桌面,通过SystemUI创建扩展屏状态栏,扩展屏桌面和扩展屏状态栏对应的DisplayId为扩展屏显示区(Display1)的ID;手机100向电脑200发送Display1的投屏数据。如图3G所示,电脑200基于上述投屏数据显示手机100的Pc化的主界面16。Pc化指为适应电脑200的屏幕分辨率和用户对电脑200的操作习惯,对手机100投屏到电脑200的扩展屏桌面和扩展屏状态栏进行了界面优化和功能优化。
示例性的,如图3G所示,主界面16可以包括:扩展屏状态栏401、搜索栏402、Dock栏403和桌面壁纸404。由图3G可知,手机100投屏至扩展屏的主界面16适应于电脑200的屏幕分辨率。此外,电脑200显示的扩展屏桌面与手机100的标准桌面的主题和壁纸均保持一致。其中,桌面壁纸107和桌面壁纸404源自同一张图片。可选的,桌面壁纸107和桌面壁纸404是由同一张图片裁剪所得的。
下面对手机100投屏至扩展屏的主界面16的扩展屏状态栏401、搜索栏402、Dock栏403,以及锁屏界面分别进行具体介绍。
如图3G所示,扩展屏状态栏401可以包括:通知中心图标401A、输入法指示符401B、WiFi信号的一个或多个信号强度指示符401C,电池状态指示符401D、时间指示符401E、控制中心图标401F。可选的,扩展屏状态栏401还可以包括其他界面元素,例如蓝牙图标、蜂窝网络的信号强度指示符等,此处不做具体限定。
可以理解,本申请实施例中,手机100运行了两个状态栏实例,即手机100显示的标准桌面对应的标准状态栏101和电脑200显示的扩展屏桌面对应的扩展屏状态栏401。其中,标准状态栏101关联的DisplayId为Display0的ID,扩展屏状态栏401关联的DisplayId为Display1的ID。
不同于手机100的标准桌面对应的标准状态栏101,扩展屏状态栏401中的各界面元素可打开对应的二级界面。下面对扩展屏状态栏401中的各界面元素的二级界面进行介绍。
通知中心图标401A可以接收用户的输入操作(例如通过鼠标左键的单击操作或用户的触摸操作),响应于该输入操作,电脑200显示图4A所示的通知中心窗口17。
需要说明的是,当电脑200配置有鼠标时,本申请实施例涉及的扩展屏桌面和扩展屏状态栏接收到的输入操作可以是用户通过鼠标实施的操作;当电脑200配置有触控屏(或触控面板)时,本申请实施例涉及的扩展屏桌面和扩展屏状态栏接收到的输入操作可以是用户通过触控屏(或触控面板)实施的触摸操作,此处不做具体限定。
具体的,在一些实施例中,电脑200显示的扩展屏状态栏的控件(例如主界面16显示的通知中心图标401A)可以接收用户的输入操作(例如通过鼠标左键的单击操作),响应于上述输入操作,电脑200可以将上述输入操作的相关信息(例如左键单击的坐标、上述输入操作所作用的显示屏对应的DisplayId)发送给手机100。基于上述输入操作的相关信息,手机100识别该输入操作为作用于扩展屏状态栏的通知中心图标401A的鼠标左键单击操作,进而确定上述输入操作触发的响应事件为显示通知中心窗口17,手机100在Display1运行通知中心窗口17,并将更新后的Display1的显示内容发送给电脑200,电脑200根据手机100发送的投屏数据显示图4A所示的通知中心窗口17。后续实施例涉及的有关电脑200显示的扩展屏状态栏接收到的输入操作的处理,均可以参考上述实现过程。
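上述"投屏侧回传输入操作的坐标与DisplayId、手机侧据此识别被点击控件并确定响应事件"的流程,可以用如下简化的命中测试示意(其中矩形区域、DisplayId取值与响应事件名均为假设值,实际由界面布局与事件策略决定):

```java
// 简化模型:根据投屏接收设备回传的(displayId, x, y),
// 在扩展屏界面布局中做命中测试,确定触发的响应事件。
public class InputForwarding {

    // 仅处理作用于扩展屏显示区(假设其DisplayId为1)的事件
    public static String dispatch(int displayId, int x, int y) {
        if (displayId != 1) {
            return "ignored";                      // 非扩展屏显示区的事件不在此处理
        }
        // 假设通知中心图标占据扩展屏状态栏左上角40x40像素的区域
        if (x >= 0 && x < 40 && y >= 0 && y < 40) {
            return "show_notification_center";     // 命中通知中心图标
        }
        return "no_op";                            // 未命中任何可交互控件
    }
}
```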
示例性的,如图4A所示,通知中心窗口17包括手机100的一或多条通知消息(例如图4A所示的音乐的通知消息501、短信的通知消息502、银行卡的通知消息503),以及删除控件504。
通知中心窗口17中的通知消息(例如通知消息502)可以接收用户输入操作,响应于用户的不同的输入操作,可以实现对该通知消息的移除、分享、查看详情等操作。可选的,通知消息502可以接收用户的输入操作(例如通过鼠标左键的单击操作),响应于上述输入操作,电脑200可以在短信应用的用户界面中显示通知消息502的具体内容。可选的,通知消息502可以接收用户的输入操作(例如通过鼠标右键的单击操作),响应于上述输入操作,电脑200显示图4B所示的菜单栏505,菜单栏505包括通知消息502的查看控件505A、移除控件505B和分享控件505C,通过接收作用于查看控件505A、移除控件505B或分享控件505C的输入操作(例如通过鼠标左键的单击操作),可以相应地实现针对通知消息502的查看、分享或从通知中心窗口17中移除。
在一些实施例中,通知中心图标401A的显示内容包括最近N条通知消息对应的应用程序的图标。示例性的,如图4C所示的通知中心图标401A的显示内容包括最近的通知消息501对应的音乐图标、通知消息502对应的短信图标、通知消息503对应的银行卡图标。
在一些实施例中,扩展屏显示的通知中心窗口17的指定通知消息(例如通知消息502)被移除(或删除)时,手机100默认屏显示的通知中心窗口中该通知消息也被移除(或删除)。
输入法指示符401B可以接收用户的输入操作(例如通过鼠标左键的单击操作),响应于上述输入操作,电脑200显示图4D所示的悬浮窗口18,悬浮窗口18可以包括手机100的一或多个输入法选项(例如输入法1的选项511、输入法2的选项512),以及输入法设置控件513,输入法指示符401B的显示内容为手机100的当前输入法的图标。
示例性的,如图4D所示,手机100的当前的输入法为输入法2,输入法指示符401B的显示内容为输入法2的图标,输入法1的选项511可以接收用户的输入操作(例如通过鼠标左键的单击操作),响应于上述输入操作,电脑200可以切换手机100在扩展屏桌面的输入法为输入法1。输入法设置控件513用于显示输入法的设置界面。
信号强度指示符401C可以接收用户的输入操作(例如通过鼠标左键的单击操作),响应于上述输入操作,电脑200显示图4E所示的悬浮窗口19,悬浮窗口19包括手机100的WiFi的网络状态521和实时网速522。示例性的,WiFi的网络状态包括强、中、弱这三种网络状态。如图4E所示,手机100的当前的WiFi网络状态521为“中”,实时网速522为263千比特每秒(Kbps)。
电池状态指示符401D可以接收用户的输入操作(例如通过鼠标左键的单击操作),响应于上述输入操作,电脑200显示图4F所示的悬浮窗口20,悬浮窗口20显示有手机100的当前剩余电量531以及设置控件532。示例性的,如图4F所示,手机100的当前的剩余电量531为40%。设置控件532可以接收用户的输入操作(例如通过鼠标左键的单击操作),响应于上述输入操作,电脑200可以显示手机100的电池的设置界面。
时间指示符401E可以接收用户的输入操作(例如通过鼠标左键的单击操作),响应于上述输入操作,电脑200显示图4G所示的悬浮窗口21,悬浮窗口21包括时间、日期以及当月的日历。可选的,悬浮窗口21还包括上翻控件541和下翻控件542。上翻控件541可以用于查看当前月份之前的月份的日历,下翻控件542可以用于查看当前月份之后的月份的日历。可选的,悬浮窗口21还包括时间和日期的设置控件543,设置控件543用于显示时间和日期的设置界面。
控制中心图标401F可以接收用户的输入操作(例如通过鼠标左键的单击操作),响应于上述输入操作,电脑200显示图4H所示的控制中心窗口22,控制中心窗口22包括多个常用功能的快捷图标551(例如图4H所示的WLAN图标、蓝牙图标、NFC图标和多屏协同图标等),一或多个智慧协同的设备的图标552(例如,笔记本电脑MateBook、平板MatePad、台式电脑Desktop、智能音响Sound X等),以及一或多个智能家居设备的控制框(例如空气净化器的控制框553、智能灯光的控制框554等)。如图4H所示,智慧协同的设备的图标552中显示手机100与型号为“MateBook”的电脑(即电脑200)当前处于已协同状态。
本申请实施例中,手机100还可以根据扩展屏桌面的主题色彩和/或壁纸色彩调整扩展屏状态栏401显示的界面元素的颜色。可选的,扩展屏桌面的主题色彩和桌面壁纸404颜色较深时,调整扩展屏状态栏401的界面元素的颜色为白色或其他预设的浅色;扩展屏桌面的主题色彩和桌面壁纸颜色较浅时,调整扩展屏状态栏401的界面元素的颜色为黑色或其他预设的深色。示例性的,如图3G所示,主界面16的桌面壁纸404的颜色较浅,扩展屏状态栏401的界面元素的主要颜色为黑色;如图4I所示,主界面16的桌面壁纸404颜色较深,扩展屏状态栏401的界面元素的主要颜色为白色。
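上述根据壁纸深浅选择状态栏元素颜色的策略,可以用如下简化示例示意(其中按BT.601加权计算亮度并以128为阈值仅是一种假设的判定方式,本申请并未限定具体算法):

```java
// 简化模型:根据壁纸主色的亮度决定扩展屏状态栏界面元素的颜色,
// 深色壁纸使用白色(浅色)图标,浅色壁纸使用黑色(深色)图标。
public class StatusBarColorPolicy {

    // 按ITU-R BT.601加权计算RGB颜色的亮度(0~255)
    public static String iconColor(int r, int g, int b) {
        double luma = 0.299 * r + 0.587 * g + 0.114 * b;
        return luma < 128 ? "white" : "black";
    }
}
```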
搜索栏402可以接收用户的输入操作(例如通过鼠标右键的双击操作),响应于上述输入操作,电脑200显示图5所示的全局搜索的悬浮窗口23。示例性的,悬浮窗口23包括搜索栏601、常用应用程序的图标602、搜索历史603和热点新闻604。搜索栏601可以接收用户输入的文本信息,并基于用户输入的文本信息进行全局搜索,常用应用程序的图标602可以包括一或多个常用应用程序的图标,搜索历史603可以包括搜索栏601最近的一或多条搜索记录,热点新闻604可以包括实时热点新闻标题和热搜排行榜中的热门新闻标题。
本申请实施例中,通过搜索栏402可以实现全局搜索,既可以线下搜索手机100本地安装的应用程序、存储的文件等,又可以线上搜索新闻、视频、音乐等资源。
在一些实施例中,可以将扩展屏桌面上的搜索栏402设置于扩展屏状态栏401内,此处不做具体限定。
需要说明的是,在一些实施例中,用户可以通过鼠标在电脑200显示的手机100的主界面16上移动鼠标的光标。可选的,针对用户通过鼠标实施的指定输入操作,本申请实施例增加了相应的光标动效,从而增加了对用户的输入操作的视觉反馈。示例性的,鼠标的光标的初始显示形态可以为第一形状(例如箭头),当鼠标的光标悬停在扩展屏桌面和扩展屏状态栏显示的指定界面元素(例如图5所示的搜索栏402)上时,可以改变鼠标的光标的显示形态为第二形状(例如图5所示的搜索栏402上的小手)。
Dock栏403也可以被称为程序坞。示例性的,如图6A所示,Dock栏403可以包括固定应用区701和最近运行区702,固定应用区701包括应用列表图标701A,多任务图标701B,显示桌面图标701C以及一或多个应用程序的图标(例如文件管理图标701D、浏览器图标701E)。最近运行区702包括最近运行的应用程序的图标。可选的,若固定应用区701包括最近运行的应用程序的图标,则无需将该应用程序的图标加入最近运行区702。
在一些实施例中,用户可以设置Dock栏403的显示状态。可选的,Dock栏403的显示状态被设置为自动隐藏,当电脑200显示手机100的扩展屏桌面或手机100的其他应用程序的用户界面时,Dock栏403自动隐藏,用户通过鼠标移动光标靠近Dock栏403的所在区域时,电脑200才显示Dock栏403。可选的,Dock栏403的显示状态被设置为一直显示,当电脑200显示手机100的扩展屏桌面,或以悬浮窗口的形态显示手机100的其他应用程序的用户界面时,Dock栏403一直显示;当电脑200全屏显示其他应用程序的用户界面时,Dock栏403自动隐藏,当用户通过鼠标移动光标靠近Dock栏403的所在区域时,电脑200才显示Dock栏403。
应用列表图标701A可以接收用户的输入操作(例如通过鼠标左键的单击操作),响应于上述输入操作,电脑200显示图6A所示的用户界面24,用户界面24包括应用程序图标列表703,应用程序图标列表703可展示多个应用程序图标,例如音乐图标703A和图库图标703B。
可选的,用户界面24还可以包括页面指示符704,其他应用程序图标可分布在多个页面,页面指示符704可用于指示用户当前查看的是哪一个页面中的应用程序。可选的,用户可以通过手指在用户界面24上左滑或右滑来查看其他页面的应用程序图标。可选的,如图6A所示,用户界面24还可以包括左翻控件705和右翻控件706,用户可以通过鼠标点击左翻控件705或右翻控件706查看其他页面的应用程序图标。可选的,左翻控件705和右翻控件706可以被隐藏,当鼠标移动光标靠近左翻控件705(或右翻控件706)的所在区域时,电脑200才显示左翻控件705(或右翻控件706)。可选的,用户界面24还可以包括搜索栏707,搜索栏707用于快捷搜索手机100已安装的应用程序的图标。
本申请实施例中,用户可以将应用程序图标列表703中的应用程序图标添加到Dock栏的固定应用区701。示例性的,如图6B所示,音乐图标703A可以接收用户的拖动操作(例如用户通过鼠标左键选中图标后向固定应用区701拖动),响应于上述拖动操作,电脑200可以将音乐图标703A添加到固定应用区701。
本申请实施例中,应用程序图标列表703和Dock栏403中的应用程序图标可以接收用户的输入操作(例如通过鼠标左键的双击操作、用户手指的触摸操作),响应于上述输入操作,电脑200可以显示该应用程序图标的对应的应用程序的用户界面。
需要说明的是,手机100安装的部分第三方应用不支持扩展屏显示。本申请实施例中,手机100设置了可在扩展屏显示的应用程序的白名单,通过设置白名单可以屏蔽不能在扩展屏桌面上实现的功能。在一些实施例中,应用程序图标列表703仅显示上述白名单中的应用程序的图标,同时在第三方应用能支持扩展屏显示后,动态将该应用程序加入到上述白名单,以及将该应用程序的图标加入到应用程序图标列表703。此外,支持扩展屏显示的应用程序中,部分应用程序可以适应于电脑200的屏幕分辨率,在扩展屏上全屏显示PC化后的用户界面;部分应用程序不能适应电脑200的屏幕分辨率,仅能在扩展屏上显示该应用程序在手机100上的标准用户界面。
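上述白名单机制可以用如下简化模型示意(ExtendedScreenWhitelist类名与示例包名均为假设):

```java
// 简化模型:应用程序图标列表仅展示白名单内的应用,
// 第三方应用适配扩展屏显示后可动态加入白名单。
import java.util.ArrayList;
import java.util.HashSet;
import java.util.List;
import java.util.Set;

public class ExtendedScreenWhitelist {
    private final Set<String> allowed = new HashSet<>();

    // 应用适配扩展屏显示后,动态加入白名单
    public void allow(String pkg) {
        allowed.add(pkg);
    }

    // 从已安装应用中过滤出可在扩展屏桌面展示图标的应用
    public List<String> visibleIcons(List<String> installed) {
        List<String> out = new ArrayList<>();
        for (String pkg : installed) {
            if (allowed.contains(pkg)) {
                out.add(pkg);
            }
        }
        return out;
    }
}
```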
示例性的,图库支持在扩展屏上全屏显示PC化后的用户界面,图库图标703B可以接收用户的输入操作(例如通过鼠标左键的双击操作),响应于上述输入操作,电脑200可以全屏显示图6C所示的图库的PC化的用户界面25。示例性的,音乐不支持在扩展屏上全屏显示PC化后的用户界面,音乐图标703A可以接收用户的输入操作(例如通过鼠标左键的双击操作),响应于上述输入操作,电脑200显示图6D所示的音乐在手机100上的标准用户界面26。
具体的,在一些实施例中,电脑200接收到作用于应用程序图标1的用于启动应用程序1的输入操作(例如通过鼠标作用于图库图标703B的左键双击操作)时,电脑200向手机100发送上述输入操作的相关信息(例如左键双击的坐标、上述输入操作所作用的扩展屏对应的DisplayId);手机100基于上述输入操作的相关信息和Display1当前前台运行的用户界面(例如用户界面24)的界面布局,识别上述输入操作为作用于扩展屏桌面的应用程序图标1的输入操作,确定该输入操作用于启动应用程序1。
需要说明的是,用户意图在扩展屏桌面上启动手机100的应用程序1时,手机100的标准桌面上可能已经启动应用程序1。
可选的,当手机100确定手机100当前未通过Display0运行应用程序1时,手机100在Display1启动应用程序1的用户界面1(例如图库的用户界面25),并将更新后的Display1的显示内容投屏至电脑200,电脑200根据手机100发送的投屏数据显示上述用户界面1。可选的,当手机100确定手机100正在通过Display0运行应用程序1时,手机100将Display0中应用程序1的活动栈(ActivityStack)挪动至Display1,在Display1运行应用程序1,并清除Display0中应用程序1的运行数据,然后将更新后的Display1的显示内容投屏至电脑200,电脑200根据手机100发送的投屏数据显示应用程序1的用户界面(例如图库的用户界面25)。可选的,当手机100确定手机100正在通过Display0运行应用程序1时,手机100向电脑200发送提示信息1,电脑200可以基于上述提示信息1显示提示信息2,以提示用户手机100正在运行应用程序1,用户可以通过手机100操控应用程序1。
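上述"将应用程序1的活动栈从Display0挪动至Display1,并清除Display0中应用程序1的运行数据"的过程,可以用如下简化模型示意(StackMover等名称均为假设,并非Android框架中ActivityStack的实际实现):

```java
// 简化模型:各显示区各自记录"应用 -> 活动栈"的映射,
// 挪动活动栈即从源显示区移除该映射并加入目标显示区。
import java.util.HashMap;
import java.util.Map;

public class StackMover {
    private final Map<Integer, Map<String, String>> displays = new HashMap<>();

    public StackMover() {
        displays.put(0, new HashMap<>());   // Display0
        displays.put(1, new HashMap<>());   // Display1
    }

    public void start(int displayId, String app, String stack) {
        displays.get(displayId).put(app, stack);
    }

    // 将应用的活动栈从Display0挪动至Display1,并清除Display0中的运行数据
    public void moveToDisplay1(String app) {
        String stack = displays.get(0).remove(app);
        if (stack != null) {
            displays.get(1).put(app, stack);
        }
    }

    public boolean runningOn(int displayId, String app) {
        return displays.get(displayId).containsKey(app);
    }
}
```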
多任务图标701B可以接收用户的输入操作(例如通过鼠标左键的单击操作),响应于上述输入操作,电脑200显示图7A所示的多任务界面27,多任务界面27包括用户通过扩展屏桌面最近打开的一或多个应用的缩略图像(例如图库的缩略图711、备忘录的缩略图712等),以及删除控件713。可选的,最近打开的应用较多,电脑200无法同时显示最近打开的所有应用的缩略图像时,多任务界面27还可以包括右翻控件714。
其中,如图7B所示,缩略图像(例如图库的缩略图711)可接收用户的输入操作(例如通过鼠标左键的单击操作),响应于检测到的上述输入操作,电脑200可以显示该缩略图对应的图库的用户界面25。可选的,缩略图像(例如图库的缩略图711)包括删除控件711A,删除控件711A可接收用户的输入操作(例如通过鼠标左键的单击操作),响应于检测到的上述输入操作,电脑200可以通过手机100清除缩略图711对应的图库在Display1中占用的运行内存。删除控件713可接收用户的输入操作(例如通过鼠标左键的单击操作),响应于检测到的上述输入操作,电脑200可以通过手机100清除多任务界面中所有缩略图像对应的应用在Display1中占用的运行内存。右翻控件714可接收用户的输入操作(例如通过鼠标左键的单击操作),响应于检测到的上述输入操作,电脑200可以显示更多最近运行的应用的缩略图像。
当扩展屏桌面上显示有其他应用程序的应用程序窗口时,显示桌面图标701C可以用于将当前显示的一或多个应用程序窗口最小化,显示扩展屏桌面;当扩展屏桌面上没有显示其他应用程序窗口时,显示桌面图标701C可以用于将最近最小化的一或多个应用程序窗口恢复显示。示例性的,如图7B所示,电脑200显示图库的用户界面25,电脑200可以接收用户的输入操作(例如用户通过鼠标移动光标靠近Dock栏403在显示屏上的所在区域),响应于上述输入操作,电脑200在用户界面25上显示图8A所示的Dock栏403。如图8A所示,Dock栏403中的显示桌面图标701C可以接收用户的输入操作(例如通过鼠标左键的单击操作),响应于上述输入操作,电脑200最小化用户界面25,显示图8A所示的扩展屏桌面;如图8B所示,最小化用户界面25后,显示桌面图标701C可以接收用户的输入操作(例如通过鼠标左键的单击操作),响应于上述输入操作,电脑200再次显示最近被最小化的用户界面25。
在一些实施例中,全屏显示的应用程序的用户界面(例如图8C所示的图库的用户界面25)可以显示有最小化控件721、缩小控件722、关闭控件723。最小化控件721可以接收用户的输入操作(例如通过鼠标左键的单击操作),响应于上述输入操作,电脑200最小化用户界面25。关闭控件723可以接收用户的输入操作(例如通过鼠标左键的单击操作),响应于上述输入操作,电脑200关闭用户界面25,清除用户界面25在手机100的Display1中占用的运行内存。
如图8C所示,缩小控件722可以接收用户的输入操作(例如通过鼠标左键的单击操作),如图8D所示,响应于上述输入操作,电脑200缩小用户界面25,以悬浮窗口28的形态运行用户界面25。如图8D所示,悬浮窗口28可以包括放大控件724,放大控件724可以接收用户的输入操作(例如通过鼠标左键的单击操作),响应于上述输入操作,电脑200可以全屏显示图8C所示的用户界面25。
在一些实施例中,电脑200可以在手机100的扩展屏桌面上平铺显示多个应用程序窗口。示例性的,如图8E所示,电脑200同时显示图库对应的悬浮窗口28和备忘录对应的悬浮窗口29。
本申请实施例中,Dock栏403和应用程序图标列表703中的应用程序图标还可以接收用户的输入操作(例如通过鼠标右键的单击操作),响应于上述输入操作,电脑200可以显示该图标对应的应用程序的多个操作选项,以实现对上述应用程序的移除、分享、卸载或其他快捷功能等。需要说明的是,每个应用程序可对应的操作选项可能不同,例如,部分系统应用(例如图库、文件管理等)不支持被卸载。
示例性的,如图9A所示,Dock栏403中的应用程序图标(例如浏览器图标701E)可以接收用户的输入操作(例如通过鼠标右键的单击操作),响应于上述输入操作,电脑200可以显示悬浮窗口30。悬浮窗口30可以包括移除控件801、分享控件802、卸载控件803。
移除控件801可以接收用户的输入操作(例如通过鼠标右键的单击操作),响应于上述输入操作,电脑200可以将浏览器图标701E从Dock栏403中移除。分享控件802用于将浏览器应用分享给目标对象。卸载控件803用于卸载手机100的浏览器应用。
在一些实施例中,Dock栏403中的应用程序图标(例如图库图标701F)还可以接收用户的悬停操作(例如将鼠标光标悬停在该应用程序图标上),响应于上述输入操作,若该应用程序图标对应的应用处于后台运行状态,电脑200可以显示该应用程序图标对应的应用的缩略图,上述缩略图为上述应用最近运行的用户界面的缩略图。示例性的,如图9B所示,用户将鼠标光标悬停在图库图标701F上,电脑200显示图库的缩略图804。
在一些实施例中,手机100的扩展屏桌面和标准桌面的锁屏界面的风格保持一致。
可选的,用户控制电脑200对手机100的扩展屏桌面锁屏后,手机100也对标准桌面进行锁屏,反之亦然。示例性的,用户在电脑200上控制手机100的扩展屏桌面锁屏后,电脑200显示图10A所示的扩展屏桌面的锁屏界面31,并向手机100发送锁屏指令,锁屏界面31包括锁屏壁纸901;响应于上述锁屏指令,手机100显示图10B所示的锁屏界面32,锁屏界面32包括锁屏壁纸902。示例性的,用户控制手机100的标准桌面锁屏后,手机100显示图10B所示的锁屏界面32,并向电脑200发送锁屏指令;响应于上述锁屏指令,电脑200显示图10A所示的扩展屏桌面的锁屏界面31。其中,锁屏壁纸901和锁屏壁纸902源自同一张图片。可选的,锁屏壁纸901和锁屏壁纸902是由同一张图片裁剪所得的。
可选的,用户控制电脑200对手机100的扩展屏桌面锁屏后,手机100不会跟随扩展屏桌面来对标准桌面进行锁屏,反之亦然。
下面结合附图对本申请实施例涉及的手机100的软件系统进行介绍。
首先,需要说明的是,本申请实施例提供的投屏方法涉及对多个系统应用(包括Launcher、SystemUI、文件管理器(FileManager)、全局搜索(Hisearch)和无线投屏(AirSharing)等)以及系统框架的修改。其中,
桌面启动器用于管理桌面的桌面布局、Dock栏、应用列表和多任务等。SystemUI是为用户提供系统级别的信息显示与交互的UI组件,用于管理状态栏、导航栏、通知中心、锁屏界面和壁纸等。文件管理器用于对外提供文件盒子、管理桌面文件、对外提供文件操作能力以及实现应用之间的文件拖拽。全局搜索用于实现本地和线上的全局搜索。无线投屏用于实现手机100到目标投屏设备(例如电脑200)间的无线投屏。
在本申请实施例中,手机100的软件系统可以采用分层架构,事件驱动架构,微核架构,微服务架构,或云架构。本申请实施例以分层架构的Android系统为例,示例性说明电子设备100的软件结构。
示例性的,图11示出了本申请实施例提供的手机100的软件架构框图。
如图11所示,分层架构将软件分成若干个层,每一层都有清晰的角色和分工。层与层之间通过软件接口通信。在一些实施例中,可以将Android系统从上至下分为应用程序层,应用程序框架层(Framework)以及系统库(Native)。
Android Runtime包括核心库和虚拟机。Android runtime负责安卓系统的调度和管理。
核心库包含两部分:一部分是java语言需要调用的功能函数,另一部分是安卓的核心库。
应用程序层和应用程序框架层运行在虚拟机中。虚拟机将应用程序层和应用程序框架层的java文件执行为二进制文件。虚拟机用于执行对象生命周期的管理,堆栈管理,线程管理,安全和异常的管理,以及垃圾回收等功能。
应用程序层包括一系列应用程序包(Android application package,APK),例如桌面启动器apk(HWLauncher6.apk),系统界面apk(SystemUI.apk),文件管理器apk(FileManager.apk),全局搜索apk(Hisearch.apk)以及无线共享apk(AirSharing.apk)等。
在一些实施例中,桌面启动器apk包括标准桌面启动器(UniHomelauncher)和扩展屏桌面启动器(PcHomelauncher)。系统界面apk(SystemUI.apk)包括标准状态栏(StatusBar)和扩展屏状态栏(PcStatusBar)。文件管理器apk(FileManager.apk)包括标准界面,分栏界面以及文件盒子。全局搜索apk(Hisearch.apk)包括标准搜索界面和扩展屏搜索界面。无线共享apk(AirSharing.apk)包括普通无线投屏和自研显示器高清投屏。本申请实施例中,扩展屏桌面也可以被称为Pc桌面,扩展屏状态栏也可以被称为Pc状态栏,此处不做具体限定。
如图11所示,本申请实施例中,桌面启动器支持运行多个桌面实例,例如UniHomelauncher以及PcHomelauncher,UniHomelauncher用于启动手机100显示的标准桌面,PcHomelauncher用于启动投屏到电脑200上的扩展屏桌面,扩展屏桌面为适应于电脑200的Pc化的桌面。其中,标准桌面基于默认屏显示区(Display0)运行的应用程序窗口进行显示,扩展屏桌面基于扩展屏显示区(Display1)运行的应用程序窗口进行显示。SystemUI支持运行多个状态栏实例,例如标准状态栏以及扩展屏状态栏,标准状态栏对应的DisplayId为Display0的ID,扩展屏状态栏对应的DisplayId为Display1的ID。文件管理器中新增的分栏界面以及文件盒子,负责扩展屏桌面文件操作(例如复制、粘贴、移动、删除、还原和拖拽)。全局搜索也支持运行多个搜索界面实例,例如,标准桌面对应的标准搜索界面和扩展屏桌面对应的扩展屏搜索界面。
应用程序层还包括工具包(Kit),Kit包括软件开发包(HwSDK)、用户界面工具包(Uikit)和投屏协议工具包(Cast+kit)。
本申请实施例中,为实现本申请实施例提供的投屏方法,Kit新增了HwSDK,HwSDK是为本申请涉及的软件包、软件框架、硬件平台和操作系统建立应用软件的开发工具的集合。为保证Kit的兼容性,本申请实施例对Uikit和Cast+kit进行了修改,使得Uikit可以实现扩展屏桌面中原生控件能力增强、文本右键菜单优化、鼠标点击和鼠标悬停(Hover)等操作的动效优化等,并使得Cast+kit能够实现扩展屏桌面在高清显示器上的投屏。
应用程序框架层为应用程序层的应用程序提供应用编程接口(application programming interface,API)和编程框架。应用程序框架层包括一些预先定义的函数。
本申请实施例中,为实现本申请实施例提供的投屏方法,应用程序框架层新增了自研显示器投屏服务(HwProductiveMultiWindowManager)和Dock栏管理服务(DockBarManagerService),自研显示器投屏服务用于实现同时运行同一应用程序的多个应用程序窗口,以及将上述多个应用程序窗口中的指定窗口投屏至目标投屏设备(例如电脑200)。Dock栏管理服务用于管理Dock栏。
为保证应用程序框架层的兼容性,本申请实施例还对显示管理服务(DisplayManagerService,DMS)、输入输出服务(InputManagerService,IMS)和平行视界服务(HwPartsMagicWindow)进行了修改。
其中,DMS用于管理界面显示的生命周期,控制当前连接的默认屏显示区(Display0)和扩展屏显示区(Display1)的逻辑显示,并且在Display0和Display1的显示状态更改时,向系统和应用程序发送通知等。IMS用于管理手机100的输入和输出,手机100的输入输出设备可以包括打印机、硬盘、键盘、鼠标、磁碟和可写性只读光盘。平行视界服务用于实现应用分屏功能,即一个应用的两个不同activity对应的两个用户界面可以同时显示,可以实现上述两个用户界面在同一显示屏上分屏显示,也可以实现上述两个用户界面在不同显示屏上显示,且本申请实施例提供的平行视界服务支持通过悬浮窗口显示用户界面。
本申请实施例中,手机100还对应用框架层的包管理服务(PackageManagerService,PMS)和壁纸服务(WallpaperService)进行了修改,使得包管理服务中新增对默认Launcher(即UniHomelauncher)的选取策略,WallpaperService支持壁纸在多个Display显示。
系统库可以包括多个功能模块。本申请实施例中,系统库可以包括事件分发器(InputDispatcher),音频系统(AudioSystem)以及投屏协议(Huawei Cast+)等。本申请实施例中,为实现将扩展屏桌面投屏至高清显示器,本申请实施例对投屏协议也做了相应的修改。
下面分别针对Launcher和SystemUI,对本申请实施例提供的投屏方法所应用的软件系统框架进行详细介绍。
示例性的,针对SystemUI,图12示出了本申请实施例提供的另一种软件系统框架。由图11可知,SystemUI.apk包括标准状态栏和扩展屏状态栏。如图12可知,SystemUI.apk具体可以包括:系统界面应用程序(SystemUIApplication),用于实现标准状态栏的基类(Base Class)、服务(Service)、组件(Component)、依赖类提供者(Dependency Providers)和UI控制类(UI Control)。其中:
SystemUIApplication是Application的子类,用于负责所有SystemUI组件的初始化。
基类包括系统界面工厂类(SystemUIFactory)、系统界面根组件(SystemUIRootComponent)等等。其中,SystemUIFactory用于创建SystemUI组件;SystemUIRootComponent用于实现依赖注入框架(dagger)的初始化。
服务包括系统界面服务(SystemUIService)。SystemUIService用于初始化SystemUI的一系列组件。在其启动时,该服务会逐个将定义在SystemUIService的服务列表的子服务实例化。通过调用各子服务的start()方法可以运行各个子服务,上述子服务均继承自SystemUI抽象类,状态栏和导航栏是上述服务列表中的子服务。
组件包括命令队列(CommandQueue)、依赖类(Dependency)、锁屏视图调解器(KeyguardViewMediator)、通知中心(Notification)、标准系统栏(Systembars)、标准状态栏等等。其中,
CommandQueue是一个Binder类,用于处理状态栏和通知中心相关的请求。它会被状态栏注册到状态栏管理服务(StatusBarManagerService)中,用于接收StatusBarManagerService的消息;CommandQueue内部维护了一个事件队列,状态栏服务(StatusBarService)用于实现CommandQueue中的Callbacks回调。本申请实施例修改了CommandQueue组件,使其支持多个StatusBar(即标准状态栏和扩展屏状态栏)的消息分发。
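修改后的CommandQueue支持向多个StatusBar实例分发消息,其核心思路可以用如下简化模型示意(MiniCommandQueue为假设的演示类,并非AOSP中CommandQueue的实际代码):

```java
// 简化模型:命令队列维护多个已注册的状态栏回调,
// 收到状态栏相关消息时,逐个分发给标准状态栏与扩展屏状态栏。
import java.util.ArrayList;
import java.util.List;

public class MiniCommandQueue {

    // 对应CommandQueue中由各状态栏实例实现的Callbacks回调
    public interface Callbacks {
        void onNotificationPosted(String msg);
    }

    private final List<Callbacks> callbacks = new ArrayList<>();

    // 每个状态栏实例(标准状态栏、扩展屏状态栏)注册各自的回调
    public void registerCallbacks(Callbacks cb) {
        callbacks.add(cb);
    }

    // 将消息分发给所有已注册的状态栏实例
    public void post(String msg) {
        for (Callbacks cb : callbacks) {
            cb.onNotificationPosted(msg);
        }
    }
}
```

这样,一条通知消息可以同时送达标准状态栏与扩展屏状态栏,对应前述"扩展屏通知中心移除通知后,默认屏通知中心同步移除"等数据同步行为。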
Dependency用于创建全局可用的依赖关系。
SystemBars继承于SystemUI这个基类,是创建整个SystemUI视图的入口类。标准状态栏主要用来在标准桌面上显示应用的通知图标(Icon)和系统的状态图标(闹钟图标、wifi图标、SIM卡图标、系统时间图标等),并对上述图标进行控制和管理。Notification用于在状态栏显示通知信息,并对上述通知信息进行控制和管理。
KeyguardViewMediator是锁屏的核心类,其它锁屏对象都通过KeyguardViewMediator相互交互,是状态回调的管理类。所有来自锁屏服务(KeyguardService)的调用都会被KeyguardViewMediator转为UI线程。
依赖类提供者包括状态栏窗口控制类(StatusBarWindowController)、状态栏图标控制器实现类(StatusBarIconControllerImpl)、状态栏策略(StatusBarPolicy)等等。其中,
StatusBarWindowController用于管理状态栏窗口视图(StatusBarWindowView),可以调用WindowManager的接口来显示标准状态栏。本申请实施例通过修改StatusBarWindowController组件,使其支持添加、删除和管理多个状态栏对象。
StatusBarIconControllerImpl用于管理状态栏中的应用图标,包括图标大小、位置和颜色变化。
StatusBarPolicy用于管理状态栏的显示策略(例如状态栏的图标的更新、显示时间、显示位置等)。StatusBarPolicy是一个策略管理类，实际功能是由StatusBarService来实现的。
UI控制类包括状态栏窗口视图(StatusBarWindowView)、通知面板视图(NotificationPanelView)、快捷设置碎片(QSFragment)、状态栏碎片(StatusBarFragment)、状态栏视图(StatusBarView)等等。
StatusBarWindowView用于确定状态栏未扩展时的根布局,创建状态栏窗口视图。
StatusBarView负责创建和实例化整个SystemUI视图(包括状态栏、通知中心和锁屏界面等),StatusBarView里面定义了状态栏显示的图标的名字、显示顺序和便携式网络图形(Png)等。StatusBarService初始化时初始化了一个用于显示状态栏的StatusBarView,StatusBarService通过调用makeStatusBarView方法实现状态栏的初始化。
NotificationPanelView是状态栏下拉后的通知中心的控制类。
QSFragment是状态栏下拉后控制中心的控制类。
StatusBarFragment管理收缩状态的状态栏,负责状态栏中的图标的生命周期管理。
为了实现扩展屏状态栏,本申请实施例在系统界面apk中增加了用于实现扩展屏状态栏的服务、组件、依赖类提供者以及UI控制类。其中:
用于实现扩展屏状态栏的服务包括生产力服务(ProductiveService),ProductiveService继承自Service。
用于实现扩展屏状态栏的组件包括Pc依赖类(PcDependency)、Pc系统提供者 (PcSystemProviders)、Pc系统栏(PcSystembars)、扩展屏状态栏。其中,PcDependency、PcSystembars和扩展屏状态栏均继承自前述标准状态栏的相应的组件,针对扩展屏状态栏实现类似的功能,此处不再赘述。
用于实现扩展屏状态栏的依赖类提供者包括Pc状态栏窗口控制类(PcStatusBarWindowController)、屏幕控制类(ScreenController)、锁屏控制类(KeyguardController)、远程控制类(RemoteController)。其中，PcStatusBarWindowController继承自前述标准状态栏的StatusBarWindowController，针对扩展屏状态栏实现类似的功能，此处不再赘述。
屏幕控制类(ScreenController)用于控制投屏侧的屏幕亮屏/熄屏。
本申请实施例中，通过修改keyguard组件，例如锁屏控制类(KeyguardController)，使其支持多个Display的锁屏。
远程控制类(RemoteController)用于和应用程序框架层(Framework)通信。
用于实现扩展屏状态栏的UI控制类包括Pc状态栏窗口视图(PcStatusBarWindowView)、Pc通知面板视图(PcNotificationPanelView)、Pc快捷设置碎片(PcQSFragment)、Pc状态栏碎片(PcStatusBarFragment)、Pc状态栏视图(PcStatusBarView)等等,新增的UI控制类均继承自前述标准状态栏的相应的控制类,针对扩展屏状态栏实现类似的功能,此处不再赘述。
此外,为保证兼容性,本申请实施例还对应用框架层中的窗口管理服务(WMS)、状态栏管理服务(StatusBarManagerService)、壁纸管理服务(WallpaperManagerService)、通知管理服务(NotificationManagerService,NMS)、Pc化模块(HwPartsPowerOffice)进行了相应的修改。其中,
窗口管理服务(WindowManagerService,WMS)包括窗口管理策略(WindowManagerPolicy)和锁屏服务代理(KeyguardServiceDelegate)。WMS支持同一应用程序同时运行多个应用程序窗口。
StatusBarManagerService支持注册和管理多个状态栏(例如标准状态栏和扩展屏状态栏)。StatusBarManagerService是StatusBarService的管理者,StatusBarService用于实现状态栏的图标的加载、更新和删除,状态栏与应用的交互,以及通知信息的处理等。
WallpaperManagerService支持壁纸在多个Display(例如标准桌面对应的Display0和扩展屏桌面对应的Display1)显示。
NMS支持Pc的UX风格,支持管理多个状态栏下拉的通知中心。
HwPartsPowerOffice用于修改投屏入口,本申请实施例在HwPartsPowerOffice中新增对Pc管理服务(HwPCManagerService)的修改,以实现在投屏场景下加载扩展屏桌面和扩展屏状态栏。
需要说明的是,本申请涉及的继承指子类继承父类的特征和行为,使得子类对象(实例)具有父类的实例域和方法,或子类从父类继承方法,使得子类具有父类相同的行为。
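上述继承关系可用如下简化的Java代码示意（类名与成员均为帮助理解的示意性假设，并非本申请或AOSP的实际代码）：

```java
// 示意：子类继承父类的实例域和方法，从而具有与父类相同的行为，并可针对扩展屏改写部分实例域
class StatusBarControllerSketch {
    protected int displayId = 0;                 // 父类的实例域（默认屏显示区）

    public String attach() {                     // 父类的方法
        return "statusbar-on-display" + displayId;
    }
}

class PcStatusBarControllerSketch extends StatusBarControllerSketch {
    PcStatusBarControllerSketch() {
        this.displayId = 1;                      // 子类继承实例域，仅改写为扩展屏显示区的ID
    }
    // attach() 未被重写：子类从父类继承该方法，具有与父类相同的行为
}

public class InheritanceSketch {
    public static String attachStdBar() {
        return new StatusBarControllerSketch().attach();
    }

    public static String attachPcBar() {
        return new PcStatusBarControllerSketch().attach();
    }
}
```

子类未重写attach()方法，调用时表现出与父类相同的行为，仅继承来的实例域displayId被改写，这正对应扩展屏各Pc类"继承标准类并实现类似功能"的方式。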
在一些实施例中,图12所示的系统架构图还包括接口层(Interface),Interface层包括安卓接口定义语言(Android Interface Definition Language,AIDL)和广播(Broadcast)。其中,AIDL是不同进程间的通信接口;Broadcast用于发送和接收广播,可实现消息的传递。
需要说明的是,SystemUI中包含的系统界面类型很多,图12仅仅是示例性说明,不限于图12所示的SystemUI的组件以及SystemUI的子类,本申请实施例中,还可以包括其他 SystemUI的组件以及SystemUI的子类,此处不做具体限定。
示例性的,针对Launcher,图13示出了本申请实施例提供的另一种软件系统框架。由图11可知,桌面启动器apk包括UniHomelauncher和PcHomelauncher。如图13所示,桌面启动器apk具体包括共同类(Common Class)、UI控制类(UI Control)、服务(Service)、活动栈。
为保证对扩展屏桌面的兼容性,本申请对共同类(Common Class)的桌面启动提供者(LauncherProvider)、数据库助手(DatabaseHelper)、桌面启动设置类(LauncherSettings)、桌面启动常量类(LauncherConstants)进行了修改。其中,
LauncherProvider是桌面启动器的数据库内容提供者，为桌面启动器运行的多个桌面实例(例如标准桌面和扩展屏桌面)提供应用图标的数据库内容，并实现其他应用对桌面启动器中数据的访问和操作。
DatabaseHelper用于负责数据库的创建和维护。
LauncherSettings用于实现数据库项的字符串定义,通过内部类Favorites提供一些Uri去操作LauncherProvider以及数据库中对应字段的字段名称。
LauncherConstants维护和管理应用程序中的常量。
为实现扩展屏桌面，本申请在共同类中新增了Pc布局配置(PcLayoutConfig)、Pc设备文件(PcDeviceProfile)、Pc网格(单元)计数器(PcCellNumCalculator)、Pc桌面启动器策略(PcLauncherPolicy)、Pc桌面启动模式(PcLauncherModel)、Pc加载任务(PcLoaderTask)等，新增的共同类继承自标准桌面的相应的共同类，用于在扩展屏桌面上实现类似的功能。其中，
PcLayoutConfig用于设定扩展屏桌面的布局属性(例如宽(width)和高(height))和参数，通过指定布局属性可以对扩展屏桌面中的组件的显示效果进行约束。
PcDeviceProfile用于定义扩展屏桌面中各个模块的基本属性、负责属性值的初始化、设置各元素布局的边距(padding)等。
PcCellNumCalculator为桌面图标布局策略类。
PcLauncherPolicy管理扩展屏桌面的显示策略。
PcLauncherModel是数据处理类,用于保存扩展屏桌面的桌面状态,提供读写数据库的API,在删除、替换、添加应用程序时更新数据库。
PcLoaderTask用于加载扩展屏桌面。
为实现扩展屏桌面，本申请还在控制类中新增了Pc拖拽层(PcDraglayer)、Pc桌面工作区(PcWorkSpace)、Pc单元布局(PcCellLayout)、Pc程序坞视图(PcDockview)、Pc文件夹(PcFolder)、Pc文件夹图标(PcFolderIcon)等，新增的控制类均继承自标准桌面的相应的控制类，用于在扩展屏桌面上实现类似的功能。其中，
PcDraglayer是负责分发事件的视图组(ViewGroup),用于对扩展屏桌面的事件进行初步处理,并按情况进行分发。DragLayer包含桌面布局(Workspace)、导航点(QuickNavigationView)、Dock区(Hotseat)、最近任务列表(OverviewContainer)。Hotseat是负责管理Dock栏的容器。
PcWorkSpace是PagedView的子类,由多个CellLayout组成,每个CellLayout代表一个分屏。PcWorkSpace用于实现扩展屏桌面上的分屏的滑动功能。
PcCellLayout用于管理扩展屏桌面的分屏的图标的显示和布局。
PcDockview用于实现投屏侧的Dock布局。
PcFolder用于实现扩展屏桌面的文件夹(包括用户创建的文件夹和系统自带的文件夹)。
为实现扩展屏桌面,本申请实施例的活动栈支持管理扩展屏桌面对应的任务栈,在服务中增加了Pc桌面服务(PcHomeService)。PcHomeService用于启动扩展屏桌面。
此外,为保证应用框架层对扩展屏桌面的投屏的兼容性,本申请实施例还对应用框架层中的活动管理服务(ActivityManagerService,AMS)进行了相应的修改,使得AMS支持同一应用程序同时运行多个应用程序窗口,例如支持上述两个桌面实例同时运行。
在一些实施例中,扩展屏桌面上新增了Dock栏,相应的,桌面启动器新增了IdockBar.aidl,以提供Binder接口给框架层操作Dock栏。
下面对本申请实施例涉及的任务栈进行介绍。
示例性的,如图14所示,活动栈包括默认屏对应的任务栈和扩展屏对应的任务栈,手机100可以分别独立维护默认屏对应的任务栈和扩展屏对应的任务栈。手机100通过DisplayId区分这两个任务栈,默认屏对应的任务栈的DisplayId为Display0的ID,扩展屏对应的任务栈的DisplayId为Display1的ID。如图14所示,Display1对应的任务栈运行一个桌面任务栈(HomeStack)和N个应用栈(AppStack),即Stack1至StackN,HomeStack包括一或多个桌面任务(Task),AppStack包括一或多个应用程序任务;Display0对应的任务栈运行一个HomeStack和一个AppStack,AppStack包括一或多个应用程序任务(例如Task-A和Task-B)。其中,一个Task包括一或多个Activity。
PcHomelauncher通过Display1运行手机100的扩展屏的HomeStack,UniHomeLauncher通过Display0运行手机100的默认屏的HomeStack,这样,可以实现扩展屏的HomeStack和默认屏的HomeStack的数据隔离。WMS通过指针事件监听器(TapPointerListener)获取输入事件后,可以将作用于扩展屏的输入事件分发给Display1,作用于默认屏的输入事件分发给Display0。
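上述按Display隔离任务栈并按显示区分发输入事件的机制，可用如下简化的Java代码示意（类名、方法名均为示意性假设，并非本申请的实际实现）：

```java
import java.util.ArrayDeque;
import java.util.Deque;
import java.util.HashMap;
import java.util.Map;

// 示意：按 DisplayId 独立维护任务栈，输入事件按其作用的显示区分发，实现两个桌面的数据隔离
public class DisplayStackSketch {
    // DisplayId -> 该显示区的任务栈（栈顶为前台任务）
    private final Map<Integer, Deque<String>> stacksByDisplay = new HashMap<>();

    /** 在指定显示区启动一个任务（入栈） */
    public void startTask(int displayId, String task) {
        stacksByDisplay.computeIfAbsent(displayId, k -> new ArrayDeque<>()).push(task);
    }

    /** 类似 TapPointerListener 的分发：根据事件作用的显示区取栈顶任务作为响应方 */
    public String dispatchInput(int displayId) {
        Deque<String> stack = stacksByDisplay.get(displayId);
        return (stack == null || stack.isEmpty()) ? null : stack.peek();
    }
}
```

作用于默认屏的事件只会命中Display0的任务栈，作用于扩展屏的事件只会命中Display1的任务栈，两侧互不干扰。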
基于前述应用场景和软件系统,下面对本申请实施例提供的投屏方法的具体实现进行详细介绍。
参见图11至图13,本申请实施例修改了软件系统框架,使其支持在不同Display运行多个桌面实例和多个状态栏实例。基于上述软件系统框架,本申请实施例提供的投屏方法,包括:
S101、手机100接入目标投屏设备(例如电脑200)的显示器后,接收确认投屏的投屏指令。
S102、响应于上述投屏指令，手机100通过SystemUI新增扩展屏状态栏及其相关的逻辑控制类，扩展屏状态栏对应的DisplayId为扩展屏显示区Display1的ID。
S103、响应于上述投屏指令，手机100还通过Launcher新增扩展屏桌面及其相关的逻辑控制类，扩展屏桌面对应的DisplayId为扩展屏显示区Display1的ID。
具体的,在一些实施例中,响应于上述投屏指令,手机100通过HwPCManagerService启动ProductiveService服务,加载扩展屏状态栏,并通过HwPCManagerService启动PcHomeService服务,加载扩展屏桌面。
为进一步说明本申请实施例提供的投屏方法的具体实现,下面对本申请实施例涉及的 SystemUI和Launcher的软件实现流程,分别进行详细介绍。
示例性的,图15A示出了本申请实施例涉及的SystemUI的软件实现流程,该实现流程包括:
(1)、手机100开机时，在非投屏模式下，手机100启动Zygote进程，通过Zygote进程创建虚拟机实例，并执行系统服务(SystemServer)。
(2)、SystemServer启动系统运行所需的一系列服务,包括SystemUIService。
(3)、SystemUIService启动SystemUI,调用SystemUIApplication。
具体的,SystemServer分别启动boot service、core service和other service,在startOtherService方法中通过调用mActivityManagerService.systemReady()方法分别启动SystemUI和Launcher。
(4)、SystemUIApplication通过startServicesIfNeeded函数来启动SystemUI组件,SystemUI组件包括Systembars。
(5)、SystemBars启动后根据配置项config_statusBarComponent,启动标准状态栏(StatusBar)。
本申请实施例中，SystemUIService启动后读取配置项config_systemUIServiceComponents，加载各个组件(包括SystemBars)。SystemBars组件加载后读取配置项config_statusBarComponent，根据上述配置项确定状态栏的操控类，进而根据状态栏的操控类选择启动扩展屏状态栏(PcStatusBar)还是标准状态栏(StatusBar)。
在一些实施例中，手机100在非投屏模式下，config_statusBarComponent的值为com.android.systemui.statusbar.phone.PhoneStatusBar，SystemBars确定状态栏的操控类为StatusBar；手机100在投屏模式下，config_statusBarComponent的值为com.android.systemui.statusbar.tablet.TabletStatusBar，SystemBars确定状态栏的操控类为PcStatusBar。
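上述根据配置项选择状态栏操控类的逻辑，可用如下简化的Java代码示意（配置项取值沿用正文示例，类名为示意性假设）：

```java
import java.util.HashMap;
import java.util.Map;

// 示意：SystemBars 按配置项确定状态栏的操控类，配置值沿用正文给出的示例
public class StatusBarConfigSketch {
    private static final Map<String, String> CONFIG = new HashMap<>();
    static {
        CONFIG.put("config_statusBarComponent",
                "com.android.systemui.statusbar.phone.PhoneStatusBar");
        CONFIG.put("config_PcstatusBarComponent",
                "com.android.systemui.statusbar.tablet.TabletStatusBar");
    }

    /** 投屏模式读取 Pc 配置项，非投屏模式读取标准配置项，返回待启动的操控类名 */
    public static String resolveStatusBarClass(boolean castMode) {
        return CONFIG.get(castMode ? "config_PcstatusBarComponent"
                                   : "config_statusBarComponent");
    }
}
```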
(6)、调用CommandQueue的callback接口,以添加回调给标准状态栏。
(7)、标准状态栏初始化布局,注册标准状态栏对应的IstatusBar对象至StatusBarManagerService。
(8)、标准状态栏创建并添加标准状态栏窗口视图(StatusBarWindowView)到StatusBarWindowController。
(9)、StatusBarWindowController调用WindowManager的接口，添加标准状态栏到WMS，进而将标准状态栏添加至Display0。
在一些实施例中,在StatusBar的启动过程中,会把CommandQueue对象传给StatusBarManagerService,保存为mBar。当客户端通过ServiceManager拿到StatusBarManagerService这个系统服务的接口时,可以通过mBar来调用CommandQueue对象的方法,而CommandQueue则通过callback接口回调消息给StatusBar,这样可以实现StatusBar的更新。
(10)、标准状态栏调用状态栏提示管理器(StatusBarPromptManager)注册投屏广播(Broadcast)。
(11)、创建标准状态栏后，还调用通知管理服务(NotificationManagerService)注册消息监听。
(12)、手机100接入目标投屏设备的显示器(DisplayDevice)(即前述扩展屏)后, 调用DMS。
本申请实施例中,手机100可以通过有线连接接入目标投屏设备,或者通过NFC、WIFI或蓝牙等无线通信技术接入目标投屏设备。
(13)、DMS触发显示事件(OnDisplayevent),创建扩展屏对应的逻辑显示区Display1,并调用显示管理目标(DisplayManagerGlobal)获取Display1的DisplayId。
在一些实施例中,DMS调用handleDisplayDeviceAddedLocked函数为显示器(DisplayDevice)生成一个对应的逻辑设备(LogicalDevice),同时会将LogicalDevice加入到DMS管理的逻辑设备列表(mLogicalDevices)中,同时也将DisplayDevice加入到DMS的显示器列表(mDisplayDevices)中,进而为DisplayDevice生成一个逻辑显示区(LogicalDisplay)(即前述扩展屏显示区Display1)。DMS调用DisplayManagerGlobal确定该逻辑显示区的DisplayId。
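上述为新接入的显示器生成逻辑显示区并分配DisplayId的过程，可用如下简化的Java代码示意（仅保留列表维护与ID分配的骨架，类名、成员名为示意性假设）：

```java
import java.util.ArrayList;
import java.util.List;

// 示意：DMS 为新接入的 DisplayDevice 生成 LogicalDisplay，并分配递增的 DisplayId
public class DisplayAddSketch {
    public static final class LogicalDisplay {
        public final int displayId;
        public final String deviceName;

        LogicalDisplay(int displayId, String deviceName) {
            this.displayId = displayId;
            this.deviceName = deviceName;
        }
    }

    private final List<String> displayDevices = new ArrayList<>();        // 对应 mDisplayDevices（示意）
    private final List<LogicalDisplay> logicalDisplays = new ArrayList<>(); // 对应 mLogicalDevices（示意）

    /** 对应 handleDisplayDeviceAddedLocked 的简化逻辑：入列表并生成逻辑显示区 */
    public LogicalDisplay onDisplayDeviceAdded(String deviceName) {
        displayDevices.add(deviceName);
        LogicalDisplay d = new LogicalDisplay(logicalDisplays.size(), deviceName);
        logicalDisplays.add(d);
        return d;
    }
}
```

第一个接入的设备得到DisplayId为0的逻辑显示区（默认屏），接入的外部显示器得到DisplayId为1的逻辑显示区（扩展屏）。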
(14)、DisplayManagerGlobal向显示监听代理(DisplaylistenerDelegate)发送上述显示事件,为Display1增添显示监听代理。
DisplayManagerGlobal主要负责管理显示管理器(Display Manager)与DMS之间的通信。
(15)、DisplaylistenerDelegate发送添加显示区(onDisplayAdded)的通知。
(16)、RootActivityContainer启动HwPCManagerService加载扩展屏状态栏。
本申请实施例中,hwPartsPowerOffice中新增了对HwPCManagerService的修改,用于在投屏场景下启动扩展屏桌面和扩展屏状态栏。
(17)、HwPCManagerService向状态栏提示管理器(StatusBarPromptManager)发送投屏胶囊提示广播。
(18)、HwPCManagerService向NotificationManagerService发送投屏通知消息。
(19)、HwPCManagerService接收切换模式的指令,该指令用于指示将当前的非投屏模式切换为投屏模式。
本申请实施例中，SystemUI接收到胶囊提示广播和投屏通知消息后，在状态栏显示胶囊提示，在通知中心显示投屏通知消息。示例性的，胶囊提示为图3E所示的提示框13，手机100接收用户点击提示框13的连接控件303的输入操作后，HwPCManagerService获取到切换模式的指令。示例性的，手机100接收到电脑200发送的NFC连接响应后，HwPCManagerService接收切换模式的指令。
(20)、响应于上述指令,HwPCManagerService启动ProductiveService。
在一些实施例中,HwPCManagerService调用bindService启动ProductiveService。
(21)、ProductiveService调用Systembars启动扩展屏状态栏。
在一些实施例中,ProductiveService调用PcSystembars启动扩展屏状态栏。
(22)、Systembars基于配置文件创建扩展屏状态栏。
在一些实施例中,Systembars读取配置config_PcstatusBarComponent,确定状态栏的操控类为PcStatusBar,进而创建扩展屏状态栏。
(23)、扩展屏状态栏调用CommandQueue的callback接口,以添加回调给扩展屏状态栏。
(24)、扩展屏状态栏初始化布局,注册扩展屏状态栏对应的IstatusBar对象至StatusBarManagerService。
(25)、扩展屏状态栏创建并添加Pc状态栏窗口视图(PcStatusBarWindowView)到StatusBarWindowController。
(26)、StatusBarWindowController调用WindowManager的接口添加扩展屏状态栏到WMS，进而将扩展屏状态栏添加至Display1。
示例性的，结合图15A所示的SystemUI的软件实现流程，图15B示例性示出了StatusBar的软件实现。
如图15B所示,非投屏模式下CommandQueue只需要支持一个状态栏实例,即标准状态栏或扩展屏状态栏。可以理解,手机在非投屏模式下只需要支持标准状态栏,电脑在非投屏模式下只需要支持扩展屏状态栏。投屏模式下CommandQueue需要支持多个状态栏实例,保存上述多个状态栏实例的Callback。示例性的,手机100投屏到电脑200,CommandQueue需要支持手机100的默认屏显示区(Display0)对应的标准状态栏,以及手机100的扩展屏显示区(Display1)对应的扩展屏状态栏。
投屏模式下,需要调整状态栏的Context,确保每个状态栏获取对应的Display信息(例如DisplayId);StatusBarWindowController也需要支持添加多个状态栏;WindowManagerService通过addWindow添加窗口时,需要修改该窗口的窗口状态(windowState)的DisplayId,以确保该窗口能正确地显示在相应的显示屏上。例如,标准状态栏能显示到手机100的默认屏上,扩展屏状态栏能显示到手机100的扩展屏(即电脑200的显示屏)上。
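CommandQueue支持多个状态栏实例并按DisplayId分发消息的思路，可用如下简化的Java代码示意（接口与类名均为示意性假设，并非AOSP的实际实现）：

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// 示意：CommandQueue 按 DisplayId 保存多个状态栏实例的 Callback，并定向分发消息
public class CommandQueueSketch {
    /** 状态栏实现的回调接口（示意） */
    public interface Callbacks {
        void onMessage(String msg);
    }

    // DisplayId -> 该显示区注册的状态栏回调列表
    private final Map<Integer, List<Callbacks>> callbacksByDisplay = new HashMap<>();

    /** 状态栏启动时注册回调：Display0 对应标准状态栏，Display1 对应扩展屏状态栏 */
    public void addCallback(int displayId, Callbacks cb) {
        callbacksByDisplay.computeIfAbsent(displayId, k -> new ArrayList<>()).add(cb);
    }

    /** 接收 StatusBarManagerService 的消息后，仅分发给目标 Display 的状态栏，返回接收者数量 */
    public int dispatch(int displayId, String msg) {
        List<Callbacks> list = callbacksByDisplay.getOrDefault(displayId, new ArrayList<>());
        for (Callbacks cb : list) {
            cb.onMessage(msg);
        }
        return list.size();
    }
}
```

这样，发往扩展屏状态栏的消息不会触达标准状态栏，反之亦然，与正文中"保存多个状态栏实例的Callback"的描述相对应。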
参见图15B,SystemUIService启动时,调用SystemUIApplication的startServicesIfNeeded来启动SystemUI组件(包括SystemBars)。
非投屏模式下,SystemBars启动后根据配置项config_statusBarComponent获取默认状态栏(即手机100的标准状态栏)。SystemBars调用标准状态栏的start()函数进行初始化,通过CommandQueue组件设置Callback,然后调用Binder接口,通过registerStatusBar函数将标准状态栏对应的IstatusBar对象注册至StatusBarManagerService进行管理。
手机100接入显示器(即电脑200)时，DMS通过输入通道接收到OnDisplayDeviceEvent，DMS基于回调记录(CallbackRecord)分发显示事件(DisplayEvent)。手机100切换非投屏模式至投屏模式后，DMS创建扩展屏对应的逻辑显示区Display1，并调用DisplayManagerGlobal获取Display1的DisplayId，DisplayManagerGlobal向DisplaylistenerDelegate发送上述显示事件，为Display1增添显示监听代理。SystemUI组件读取配置文件Config.xml的配置项config_PcsystemUIServiceComponents，SystemBars启动后根据配置项config_PcstatusBarComponent启动扩展屏状态栏。如图15B所示，投屏模式下，SystemBars调用扩展屏状态栏的start()函数进行初始化，通过CommandQueue组件设置Callback，然后调用Binder接口通过registerStatusBar函数将扩展屏状态栏对应的IstatusBar对象注册至StatusBarManagerService进行管理。然后，扩展屏状态栏创建并添加StatusBarWindowView到StatusBarWindowController；StatusBarWindowController调用WindowManager的接口添加扩展屏状态栏到WMS。WMS调用窗口管理目标(WindowManagerGlobal)的ViewRootImpl和IWindowSession，并修改扩展屏状态栏对应窗口的WindowState的DisplayId为Display1的ID。
如图15B所示，StatusBarManagerService包括ArrayMap列表和SparseArray列表。其中，ArrayMap列表维护注册到StatusBarManagerService的多个IstatusBar的Binder对象。SparseArray列表维护多个Display的UiState对象和多个标准状态栏的对应关系。
示例性的，图15C示出了本申请实施例涉及的Launcher的软件实现流程。下面对该软件实现流程进行详细介绍。
首先，需要说明的是，本申请实施例中，由于投屏模式下需要支持扩展屏显示，因此在应用的AndroidManifest.xml中PcHomeLauncher节点的配置intent-filter中新增category选项："android.intent.category.SECONDARY_HOME"，该选项用于表示扩展屏的桌面。
非投屏模式下,前述步骤(2)中,SystemServer启动系统运行所需的一系列服务,包括Launcher。
具体的，SystemServer进程在启动过程中会启动PMS和AMS，PackageManagerService启动后会将系统中的应用程序APK进行解析和安装，AMS主要用于四大组件的启动和管理，启动Launcher的入口为AMS的systemReady方法。
(31)AMS通过StartHomeOnAllDisplays方法,启动Launcher。
(32)活动任务管理器(ActivityTaskManagerInternal)调用StartHomeOnDisplays，启动Launcher。
(33)RootActivityContainer判断DisplayId是否为默认屏显示区(即Display0)的ID。若是,则解析标准桌面Activity(resolveHomeActivity),并执行步骤(34);若不是,则解析扩展屏桌面Activity(resolveSecondaryHomeActivity),并执行步骤(37)。
在一些实施例中,如果是Display0的ID,PMS查询CATEGORY_HOME的Activity作为桌面,如果是Display1的ID,PMS查询SECONDARY_HOME的Activity作为桌面。
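上述按DisplayId选取桌面Activity类别的判断，可用如下简化的Java代码示意（category字符串取自正文，类名为示意性假设）：

```java
// 示意：按 DisplayId 选取桌面 Activity 的 category：默认屏解析标准桌面，其他 Display 解析扩展屏桌面
public class HomeResolveSketch {
    public static final int DEFAULT_DISPLAY_ID = 0;

    /** Display0 返回 HOME（标准桌面），其余 Display 返回 SECONDARY_HOME（扩展屏桌面） */
    public static String resolveHomeCategory(int displayId) {
        return displayId == DEFAULT_DISPLAY_ID
                ? "android.intent.category.HOME"
                : "android.intent.category.SECONDARY_HOME";
    }
}
```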
(34)活动启动控制器(ActivityStartController)调用活动启动器(ActivityStarter)。
(35)ActivityStarter调用startHomeActivityLocked启动标准桌面。
投屏模式下,步骤(12)至步骤(16)可以参考图15A的相关描述。
在步骤(12)至步骤(16)执行过程中,DisplayManagerService在添加显示区Display时,会发送onDisplayAdded通知,PC化模块(hwPartsPowerOffice)中的HwPCManagerService在接收到通知后,通过startHomeOnDisplay启动扩展屏的Launcher(即PcHomeLauncher)。
(36)HwPCManagerService通过BindService调用Pc桌面服务(PcHomeService)。
(37)PcHomeService通过StartHomeOnProductiveDisplay调用AMS。
(38)AMS调用StartHomeOnAllDisplays方法,启动PcHomeLauncher。
(39)活动任务管理器调用StartHomeOnDisplays,启动PcHomeLauncher。
然后,执行步骤(33),在步骤(33)中获取扩展屏桌面Activity(resolveSecondaryHomeActivity),并执行步骤(40)。
(40)活动启动控制器(ActivityStartController)调用活动启动器(ActivityStarter)。
(41)ActivityStarter调用startHomeActivityLocked启动扩展屏桌面。
基于前述实施例,本申请实施例提供了一种投屏方法,该投屏方法包括但不限于步骤S201至步骤S205。
S201、第一电子设备调用第一应用的第一模块运行第一桌面,第一桌面和第一显示区关联;第一电子设备基于第一显示区显示第一显示内容,第一显示内容包括第一桌面。
S202、响应于第一用户操作,第一电子设备调用第一应用的第二模块运行第二桌面,第二桌面和第二显示区关联;第一电子设备向第二电子设备发送第二显示区对应的第二显示内容,第二显示内容包括第二桌面。
本申请实施例中，第一电子设备可以是前述电子设备100，例如手机100；第二电子设备可以是前述电子设备200，例如电脑200。第一应用可以是前述桌面启动器，例如HWLauncher6；第一模块可以是标准桌面启动器，例如UniHomelauncher；第二模块可以是扩展屏桌面启动器，例如PcHomelauncher。第一桌面可以是前述标准桌面，第二桌面可以是前述扩展屏桌面。第一显示区可以是前述默认屏显示区，即Display0，第二显示区可以是前述扩展屏显示区，即Display1。
示例性的,第一显示内容可以是图3A所示手机100显示的用户界面,第一桌面可以是图3A所示的桌面。第二显示内容可以是图3G所示的电脑200基于手机100发送的投屏数据显示的用户界面,第二桌面可以是图3G所示的桌面。
S203、响应于作用于第一显示内容的第二用户操作,第一电子设备基于第一显示区运行的任务栈,显示第三显示内容。
示例性的,第一显示内容可以是图3A所示手机100显示的用户界面11,第二用户操作可以是图3A所示的作用于用户界面11的状态栏的向下滑动操作,第三显示内容可以是图3B所示的控制中心界面12。
S204、响应于作用于第二电子设备显示的第二显示内容的第三用户操作,第一电子设备基于第二显示区运行的任务栈,确定第二显示区对应的显示内容为第四显示内容。
S205、第一电子设备向第二电子设备发送第四显示内容。
示例性的，第二显示内容可以是图3G所示的电脑200显示的用户界面16，第三用户操作可以是作用于用户界面16显示的界面元素(例如状态栏401中的图标、搜索栏402、Dock栏403中的图标)的输入操作。举例来说，上述界面元素为状态栏401中的通知中心图标401A，第四显示内容可以包括图4A所示的用户界面16和通知中心窗口17。
需要说明的是，第三用户操作是第一电子设备通过第二电子设备接收到的。第二电子设备接收到第三用户操作后，确定第一输入事件，第一输入事件用于指示第三用户操作。例如，第一输入事件包括第三用户操作作用在第二电子设备的显示屏上的坐标，以及第三用户操作的操作类型(例如触摸操作、鼠标点击操作等)等。第二电子设备向第一电子设备发送第一输入事件，第一电子设备基于第一输入事件和第二显示区运行的任务栈确定第一输入事件指示的第三用户操作，并执行第三用户操作对应的响应事件。执行上述响应事件后，第二显示区对应的显示内容更新为第四显示内容。第一电子设备向第二电子设备发送更新后的第四显示内容，第二电子设备显示第四显示内容。
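上述第一输入事件的封装与处理流程，可用如下简化的Java代码示意（字段划分与返回值均为示意性假设）：

```java
// 示意：第二电子设备把第三用户操作封装为第一输入事件回传，第一电子设备按 Display1 的任务栈确定响应
public class RemoteInputSketch {
    /** 第一输入事件：携带作用坐标与操作类型（字段划分为示意性假设） */
    public static final class InputEvent {
        public final int x, y;
        public final String type;    // 例如 "touch" 或 "mouse-click"

        public InputEvent(int x, int y, String type) {
            this.x = x; this.y = y; this.type = type;
        }
    }

    /** 第一电子设备侧：基于第二显示区栈顶任务执行响应事件，返回更新后的显示内容标识 */
    public static String handleOnDisplay1(InputEvent e, String topTask) {
        return topTask + ":" + e.type + "@(" + e.x + "," + e.y + ")";
    }
}
```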
实施本申请实施例,第一电子设备支持通过同一应用在不同显示区同时运行多个桌面实例,例如通过第一应用的第一模块在第一显示区运行第一桌面,通过第一应用的第二模块在第二显示区运行第二桌面。第一电子设备基于第一显示区运行的任务栈确定本设备主屏的显示内容,基于第二显示区运行的任务栈确定投屏到第二电子设备的显示内容。这样,基于两个不同的显示区,第一电子设备和第二电子设备可以显示不同桌面,以及其他的不同内容。
在一些实施例中,上述响应于作用于第一显示内容的第二用户操作,第一电子设备基于第一显示区运行的任务栈,显示第三显示内容,包括:响应于作用于第一显示内容中的第一桌面的第二用户操作,第一电子设备基于第一显示区运行的第一应用的任务栈,显示第三显示内容;上述响应于作用于第二电子设备显示的第二显示内容的第三用户操作,第一电子设备基于第二显示区运行的任务栈,确定第二显示区对应的显示内容为第四显示内容,包括:响应于作用于第二电子设备显示的第二桌面的第三用户操作,第一电子设备基于第二显示区运行的第一应用的任务栈,确定第二显示区对应的显示内容为第四显示内容。
示例性的,第一显示内容可以是图3A所示的桌面(即第一桌面)的应用程序图标(例如图库图标),第二用户操作可以是作用于上述应用程序图标的触摸操作,响应于第二用户操作,手机100基于Display0运行的桌面启动器的任务栈(例如前述桌面任务栈Homestack),确定第二用户操作对应的响应事件。示例性的,第二显示内容可以是图6A所示的电脑200显示的桌面(即第二桌面),第三用户操作可以是作用于用户界面16显示的Dock栏403中的应用列表图标701A的输入操作,响应于第三用户操作,手机100基于Display1运行的桌面启动器的任务栈(例如前述桌面任务栈Homestack),确定第三用户操作对应的响应事件并执行,然后将更新后的第二显示区对应的第四显示内容(即图6A所示用户界面24)发送给电脑200。电脑200显示图6A所示的用户界面24。
实施本申请实施例,针对作用于第一桌面的第二用户操作,第一电子设备可以基于第一桌面关联的显示区所运行的第一应用的任务栈,执行第二用户操作对应响应事件;针对作用于第二桌面的第三用户操作,第一电子设备可以基于第二桌面关联的显示区所运行的第一应用的任务栈,执行第三用户操作对应响应事件。这样,可以保证不同桌面的事件(输入事件和/或响应事件)的数据隔离。此外,由于两个桌面实例均由第一应用的模块运行,因此两个桌面可以实现指定数据的共享,第二桌面可以继承第一桌面的部分或全部功能特性。
在一些实施例中,上述第一电子设备基于第一显示区显示第一显示内容之前,所述方法还包括:第一电子设备调用第二应用的第三模块运行第一状态栏,第一状态栏与第一显示区关联,第一显示内容包括第一状态栏;所述方法还包括:响应于第一用户操作,第一电子设备调用第二应用的第四模块运行第二状态栏,第二状态栏与第二显示区关联,第二显示内容包括第二状态栏。
本申请实施例中，第一用户操作是用户确定第一电子设备向第二电子设备投屏的输入操作。在一些实施例中，步骤S202中，响应于投屏指令，第一电子设备调用第一应用的第二模块运行第二桌面，并调用第二应用的第四模块运行第二状态栏，第二桌面和第二状态栏均与第二显示区关联。示例性的，上述投屏指令可以参考前述实施例中的投屏指令或切换模式的指令。
本申请实施例中,第二应用可以是前述系统界面,例如SystemUI;第三模块可以是标准状态栏,例如StatusBar;第四模块可以是扩展屏状态栏,例如PcStatusBar。第一状态栏可以是前述标准状态栏,第二状态栏可以是前述扩展屏状态栏。
示例性的,第一状态栏可以是图3A所示的状态栏101,第二状态栏可以是图3G所示的状态栏401。
实施本申请实施例,第一电子设备支持通过同一应用在不同显示区同时运行多个状态栏实例,例如通过第二应用的第三模块在第一显示区运行第一状态栏,通过第二应用的第四模块在第二显示区运行第二状态栏。这样,通过将两个状态栏与不同的显示区相关联,第一电子设备和第二电子设备可以显示不同状态栏,保证了两个状态栏的事件(输入事件和/或响应事件)的数据隔离。此外,由于两个状态栏均由第二应用的模块运行,因此两个状态栏可以实现指定数据(例如通知消息)的共享,第二状态栏可以继承第一状态栏的部分或全部功能特性。
在一些实施例中，上述第一电子设备基于第一显示区显示第一显示内容之前，所述方法还包括：第一电子设备调用第三应用的第五模块运行第一变量的第一显示对象；第一变量与第一显示区关联，第一显示内容包括第一显示对象；第一变量与第二显示区关联，第二显示内容包括第一显示对象。
实施本申请实施例，第一电子设备支持同一变量对应的显示对象同时在多个不同的显示区显示。本申请实施例中，第三应用与第二应用可以为同一应用，也可以为不同应用，此处不做具体限定。
在一些实施例中,所述方法还包括:响应于作用于第一显示内容的第四用户操作,调用第三应用的第五模块修改第一变量的显示对象为第二显示对象;第一电子设备更新第一显示区对应的显示内容为第五显示内容,第五显示内容包括第二显示对象;第一电子设备更新第二显示区对应的显示内容为第六显示内容,并向第二电子设备发送第六显示内容,第六显示内容包括第二显示对象。
实施本申请实施例,第一电子设备支持同一变量对应的显示对象同时在多个不同的显示区显示,用户改变第一变量在第一显示区的显示对象后,第一变量在第二显示区的显示对象也随之改变。
在一些实施例中,第一变量用于指示壁纸的显示对象,壁纸的显示对象为静态图片和/或动态图片,壁纸包括锁屏时的锁屏壁纸和/或非锁屏时的桌面壁纸。
本申请实施例中，第三应用可以是系统界面(SystemUI)或者壁纸应用；第五模块可以是壁纸管理服务，例如WallpaperManagerService。
示例性的,壁纸的第一显示对象在第一显示区显示为图3A所示的桌面壁纸107,壁纸的第一显示对象在第二显示区显示为图3G所示的桌面壁纸404。桌面壁纸107和桌面壁纸404源自同一张图片。
示例性的，壁纸的第一显示对象在第一显示区显示为图10B所示的锁屏界面中的锁屏壁纸902，壁纸的第一显示对象在第二显示区显示为图10A所示的锁屏界面中的锁屏壁纸901。锁屏壁纸902和锁屏壁纸901源自同一张图片。
实施本申请实施例,用户改变第一电子设备显示的壁纸后,投屏到第二电子设备上的壁纸也随之改变。
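上述同一壁纸变量关联多个Display并同步刷新的机制，可用如下简化的Java代码示意（类名与方法均为示意性假设，并非WallpaperManagerService的实际实现）：

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// 示意：同一壁纸变量同时关联多个 Display，变量更新后所有关联显示区同步刷新
public class WallpaperSketch {
    private String wallpaper;                                            // 第一变量的显示对象
    private final Map<Integer, String> shownByDisplay = new HashMap<>(); // 各显示区当前显示的壁纸
    private final List<Integer> attachedDisplays = new ArrayList<>();

    public WallpaperSketch(String initial) {
        this.wallpaper = initial;
    }

    /** 将壁纸变量关联到一个显示区（例如 Display0 / Display1） */
    public void attachDisplay(int displayId) {
        attachedDisplays.add(displayId);
        shownByDisplay.put(displayId, wallpaper);
    }

    /** 用户更换壁纸：修改显示对象并通知所有关联的显示区刷新 */
    public void setWallpaper(String newWallpaper) {
        this.wallpaper = newWallpaper;
        for (int id : attachedDisplays) {
            shownByDisplay.put(id, newWallpaper);
        }
    }

    public String shownOn(int displayId) {
        return shownByDisplay.get(displayId);
    }
}
```

用户在第一电子设备侧更换壁纸后，Display0与Display1取到的都是更新后的显示对象，投屏侧壁纸随之改变。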
在一些实施例中,第一电子设备预设了多种主题,主题用于指示桌面布局风格、图标显示风格和/或界面色彩等;第一变量用于指示主题的显示对象,主题的显示对象为多种主题中的一种主题对应的显示内容。
实施本申请实施例,用户改变第一电子设备显示的主题后,投屏到第二电子设备上的主题也随之改变。
在一些实施例中,第一应用的第一模块包括用于创建和运行第一桌面的第一共同类、第一用户界面UI控制类和第一桌面的桌面任务栈;第一应用的第二模块包括用于创建和运行第二桌面的第二共同类、第二UI控制类和第二桌面的桌面任务栈,第二共同类中的部分或全部类继承自第一共同类,第二UI控制类中的部分或全部类继承自第一UI控制类。
实施本申请实施例,第一电子设备新增了用于创建和运行第二桌面的第二共同类、第二UI控制类和第二桌面的桌面任务栈,且新增的共同类和UI控制类中的部分或全部,继承自原有的第一桌面对应的第一共同类和第一UI控制类,因此,第二桌面可以继承第一桌面的部分功能特性,两个桌面可以实现指定数据的共享。
在一些实施例中,第二共同类包括以下一项或多项:桌面启动提供者、数据库助手、桌面启动设置类、桌面启动常量类、Pc布局配置、Pc设备文件、Pc网格计数器、Pc桌面启动器策略、Pc桌面启动模式、Pc加载任务等;第二UI控制类包括以下一项或多项:Pc拖拽层、 Pc桌面工作区、Pc单元布局、Pc程序坞视图、Pc文件夹、Pc文件夹图标等。
示例性的,第二共同类可以为图13所示的共同类,第二UI控制类可以为图13所示的UI控制类,第一桌面的桌面任务栈可以为图13所示的标准桌面的任务栈,第二桌面的桌面任务栈可以为图13所示的扩展屏桌面的任务栈。
在一些实施例中，第二应用的第三模块包括用于创建和运行第一状态栏的第一组件、第一依赖控制类和第三UI控制类；第二应用的第四模块包括用于创建和运行第二状态栏的第二组件、第二依赖控制类和第四UI控制类，第二组件中的部分或全部组件继承自第一组件，第二依赖控制类中的部分或全部类继承自第一依赖控制类，第四UI控制类中的部分或全部类继承自第三UI控制类。
实施本申请实施例,第一电子设备新增了用于创建和运行第二状态栏的第二组件、第二依赖控制类和第四UI控制类,且新增的组件、依赖控制类和UI控制类中的部分或全部,继承自原有的第一状态栏对应的第一组件、第一依赖控制类和第三UI控制类,因此,第二状态栏可以继承第一状态栏的部分功能特性,两个状态栏可以实现指定数据的共享。
在一些实施例中,第二组件包括以下一项或多项:Pc依赖类、Pc系统提供者、Pc系统栏、第二状态栏;第二依赖控制类包括以下一项或多项:Pc状态栏窗口控制类、屏幕控制类、锁屏控制类、远程控制类;第四UI控制类包括以下一项或多项:Pc状态栏窗口视图、Pc通知面板视图、Pc快捷设置碎片、Pc状态栏碎片、Pc状态栏视图。
示例性的,第一组件、第一依赖控制类和第三UI控制类可以分别为图12所示的用于实现标准状态栏的组件、依赖控制类和UI控制类。第二组件、第二依赖控制类和第四UI控制类可以分别为图12所示的用于实现扩展屏状态栏的组件、依赖控制类和UI控制类。
在一些实施例中,第二模块关联的显示区的身份标识ID为第二显示区的ID;上述响应于第一用户操作,第一电子设备调用第一应用的第二模块运行第二桌面,第二桌面和第二显示区关联,包括:响应于第一用户操作,Pc管理服务接收到切换模式的指令,指令用于指示将当前的非投屏模式切换为投屏模式;响应于指令,Pc管理服务调用Pc桌面服务,Pc桌面服务调用活动管理服务,活动管理服务调用活动任务管理器启动第一应用的第二模块;调用根活动容器确定第二模块关联的显示区的ID;当第二模块关联的显示区的ID为第二显示区的ID时,查询第二桌面的Activity作为待启动桌面的Activity,当第二模块关联的显示区的ID为第一显示区的ID时,查询第一桌面的Activity作为待启动桌面的Activity;活动启动控制器调用活动启动器启动第二桌面。
在一些实施例中,上述响应于第一用户操作,第一电子设备调用第二应用的第四模块运行第二状态栏,第二状态栏与第二显示区关联,包括:响应于第一用户操作,Pc管理服务接收到切换模式的指令,指令用于指示将当前的非投屏模式切换为投屏模式;响应于指令,Pc管理服务启动生产力服务,生产力服务调用系统栏启动第二状态栏,系统栏基于配置文件创建第二状态栏;第二状态栏调用命令队列的回调接口,以添加回调给第二状态栏;第二状态栏初始化布局,注册第二状态栏对应的IstatusBar对象至状态栏管理服务;第二状态栏创建并添加Pc状态栏窗口视图到状态栏窗口控制类;状态栏窗口控制类调用窗口管理的接口添加第二状态栏到窗口管理服务,进而将第二状态栏添加至第二显示区。
在一些实施例中,非投屏模式下,命令队列支持与第一显示区关联的第一状态栏;投屏模式下,命令队列同时支持与第一显示区关联的第一状态栏,以及与第二显示区关联的第二状态栏。
本申请的各实施方式可以任意进行组合,以实现不同的技术效果。
在上述实施例中,可以全部或部分地通过软件、硬件、固件或者其任意组合来实现。当使用软件实现时,可以全部或部分地以计算机程序产品的形式实现。所述计算机程序产品包括一个或多个计算机指令。在计算机上加载和执行所述计算机程序指令时,全部或部分地产生按照本申请所述的流程或功能。所述计算机可以是通用计算机、专用计算机、计算机网络、或者其他可编程装置。所述计算机指令可以存储在计算机可读存储介质中,或者从一个计算机可读存储介质向另一个计算机可读存储介质传输,例如,所述计算机指令可以从一个网站站点、计算机、服务器或数据中心通过有线(例如同轴电缆、光纤、数字用户线)或无线(例如红外、无线、微波等)方式向另一个网站站点、计算机、服务器或数据中心进行传输。所述计算机可读存储介质可以是计算机能够存取的任何可用介质或者是包含一个或多个可用介质集成的服务器、数据中心等数据存储设备。所述可用介质可以是磁性介质,(例如,软盘、硬盘、磁带)、光介质(例如,DVD)、或者半导体介质(例如固态硬盘(solid state disk,SSD))等。
本领域普通技术人员可以理解实现上述实施例方法中的全部或部分流程,该流程可以由计算机程序来指令相关的硬件完成,该程序可存储于计算机可读取存储介质中,该程序在执行时,可包括如上述各方法实施例的流程。而前述的存储介质包括:ROM或随机存储记忆体RAM、磁碟或者光盘等各种可存储程序代码的介质。
总之,以上所述仅为本发明技术方案的实施例而已,并非用于限定本发明的保护范围。凡根据本发明的揭露,所作的任何修改、等同替换、改进等,均应包含在本发明的保护范围之内。

Claims (17)

  1. 一种投屏方法,其特征在于,包括:
    第一电子设备调用第一应用的第一模块运行第一桌面，所述第一桌面和第一显示区关联；
    所述第一电子设备基于所述第一显示区显示第一显示内容,所述第一显示内容包括第一桌面;
    响应于第一用户操作,所述第一电子设备调用所述第一应用的第二模块运行第二桌面,所述第二桌面和第二显示区关联;
    所述第一电子设备向第二电子设备发送所述第二显示区对应的第二显示内容，所述第二显示内容包括所述第二桌面；
    响应于作用于所述第一显示内容的第二用户操作,所述第一电子设备基于所述第一显示区运行的任务栈,显示第三显示内容;
    响应于作用于所述第二电子设备显示的第二显示内容的第三用户操作,所述第一电子设备基于所述第二显示区运行的任务栈,确定所述第二显示区对应的显示内容为第四显示内容;
    所述第一电子设备向所述第二电子设备发送第四显示内容。
  2. 根据权利要求1所述的方法,其特征在于,所述响应于作用于所述第一显示内容的第二用户操作,所述第一电子设备基于所述第一显示区运行的任务栈,显示第三显示内容,包括:
    响应于作用于所述第一显示内容中的所述第一桌面的所述第二用户操作,所述第一电子设备基于所述第一显示区运行的所述第一应用的任务栈,显示第三显示内容;
    所述响应于作用于所述第二电子设备显示的第二显示内容的第三用户操作,所述第一电子设备基于所述第二显示区运行的任务栈,确定所述第二显示区对应的显示内容为第四显示内容,包括:
    响应于作用于所述第二电子设备显示的所述第二桌面的所述第三用户操作,所述第一电子设备基于所述第二显示区运行的所述第一应用的任务栈,确定所述第二显示区对应的显示内容为所述第四显示内容。
  3. 根据权利要求1所述的方法,其特征在于,所述第一电子设备基于所述第一显示区显示第一显示内容之前,所述方法还包括:
    所述第一电子设备调用第二应用的第三模块运行第一状态栏,所述第一状态栏与所述第一显示区关联,所述第一显示内容包括所述第一状态栏;
    所述方法还包括:
    响应于所述第一用户操作,所述第一电子设备调用所述第二应用的第四模块运行第二状态栏,所述第二状态栏与所述第二显示区关联,所述第二显示内容包括所述第二状态栏。
  4. 根据权利要求1所述的方法,其特征在于,所述第一电子设备基于所述第一显示区显示第一显示内容之前,所述方法还包括:
    所述第一电子设备调用第三应用的第五模块运行第一变量的第一显示对象；所述第一变量与所述第一显示区关联，所述第一显示内容包括所述第一显示对象；所述第一变量与所述第二显示区关联，所述第二显示内容包括所述第一显示对象。
  5. 根据权利要求4所述的方法,其特征在于,所述方法还包括:
    响应于作用于所述第一显示内容的第四用户操作,调用所述第三应用的所述第五模块修改所述第一变量的显示对象为第二显示对象;
    所述第一电子设备更新所述第一显示区对应的显示内容为第五显示内容,所述第五显示内容包括所述第二显示对象;
    所述第一电子设备更新所述第二显示区对应的显示内容为第六显示内容,并向所述第二电子设备发送所述第六显示内容,所述第六显示内容包括第二显示对象。
  6. 根据权利要求4或5所述的方法,其特征在于,所述第一变量用于指示壁纸的显示对象,所述壁纸的显示对象为静态图片和/或动态图片,所述壁纸包括锁屏时的锁屏壁纸和/或非锁屏时的桌面壁纸。
  7. 根据权利要求4或5所述的方法,其特征在于,所述第一电子设备预设了多种主题,所述主题用于指示桌面布局风格、图标显示风格和/或界面色彩等;
    所述第一变量用于指示所述主题的显示对象,所述主题的显示对象为所述多种主题中的一种主题对应的显示内容。
  8. 根据权利要求1所述的方法,其特征在于,所述第一应用的所述第一模块包括用于创建和运行第一桌面的第一共同类、第一用户界面UI控制类和第一桌面的桌面任务栈;所述第一应用的所述第二模块包括用于创建和运行第二桌面的第二共同类、第二UI控制类和第二桌面的桌面任务栈,所述第二共同类中的部分或全部类继承自所述第一共同类,所述第二UI控制类中的部分或全部类继承自所述第一UI控制类。
  9. 根据权利要求8所述的方法,其特征在于,所述第二共同类包括以下一项或多项:桌面启动提供者、数据库助手、桌面启动设置类、桌面启动常量类、Pc布局配置、Pc设备文件、Pc网格计数器、Pc桌面启动器策略、Pc桌面启动模式、Pc加载任务等;
    第二UI控制类包括以下一项或多项:Pc拖拽层、Pc桌面工作区、Pc单元布局、Pc程序坞视图、Pc文件夹、Pc文件夹图标等。
  10. 根据权利要求3所述的方法，其特征在于，所述第二应用的所述第三模块包括用于创建和运行第一状态栏的第一组件、第一依赖控制类和第三UI控制类；所述第二应用的所述第四模块包括用于创建和运行第二状态栏的第二组件、第二依赖控制类和第四UI控制类，所述第二组件中的部分或全部组件继承自所述第一组件，所述第二依赖控制类中的部分或全部类继承自所述第一依赖控制类，所述第四UI控制类中的部分或全部类继承自所述第三UI控制类。
  11. 根据权利要求10所述的方法,其特征在于,所述第二组件包括以下一项或多项:Pc 依赖类、Pc系统提供者、Pc系统栏、第二状态栏;
    第二依赖控制类包括以下一项或多项:Pc状态栏窗口控制类、屏幕控制类、锁屏控制类、远程控制类;
    第四UI控制类包括以下一项或多项:Pc状态栏窗口视图、Pc通知面板视图、Pc快捷设置碎片、Pc状态栏碎片、Pc状态栏视图。
  12. 根据权利要求1至11任一项所述的方法,其特征在于,所述第二模块关联的显示区的身份标识ID为所述第二显示区的ID;所述响应于第一用户操作,所述第一电子设备调用所述第一应用的第二模块运行第二桌面,所述第二桌面和第二显示区关联,包括:
    响应于所述第一用户操作,Pc管理服务接收到切换模式的指令,所述指令用于指示将当前的非投屏模式切换为投屏模式;
    响应于所述指令,所述Pc管理服务调用Pc桌面服务,所述Pc桌面服务调用活动管理服务,所述活动管理服务调用活动任务管理器启动所述第一应用的所述第二模块;
    调用根活动容器确定所述第二模块关联的显示区的ID;当所述第二模块关联的显示区的ID为所述第二显示区的ID时,查询所述第二桌面的Activity作为待启动桌面的Activity,当所述第二模块关联的显示区的ID为所述第一显示区的ID时,查询所述第一桌面的Activity作为待启动桌面的Activity;
    活动启动控制器调用活动启动器启动所述第二桌面。
  13. 根据权利要求1至11任一项所述的方法,其特征在于,所述响应于所述第一用户操作,所述第一电子设备调用所述第二应用的第四模块运行第二状态栏,所述第二状态栏与所述第二显示区关联,包括:
    响应于所述第一用户操作,所述Pc管理服务接收到切换模式的所述指令,所述指令用于指示将当前的非投屏模式切换为投屏模式;
    响应于所述指令,所述Pc管理服务启动生产力服务,所述生产力服务调用系统栏启动第二状态栏,所述系统栏基于配置文件创建所述第二状态栏;
    所述第二状态栏调用命令队列的回调接口,以添加回调给所述第二状态栏;所述第二状态栏初始化布局,注册所述第二状态栏对应的IstatusBar对象至状态栏管理服务;所述第二状态栏创建并添加Pc状态栏窗口视图到状态栏窗口控制类;
    所述状态栏窗口控制类调用窗口管理的接口添加所述第二状态栏到窗口管理服务,进而将所述第二状态栏添加至所述第二显示区。
  14. 根据权利要求13所述的方法,其特征在于,非投屏模式下,所述命令队列支持与所述第一显示区关联的所述第一状态栏;投屏模式下,所述命令队列同时支持与所述第一显示区关联的所述第一状态栏,以及与所述第二显示区关联的所述第二状态栏。
  15. 一种电子设备,包括触控屏,存储器,一个或多个处理器,多个应用程序,以及一个或多个程序;其中,所述一个或多个程序被存储在所述存储器中;其特征在于,所述一个或多个处理器在执行所述一个或多个程序时,使得所述电子设备实现如权利要求1至14任一项所述的方法。
  16. 一种计算机存储介质,其特征在于,包括计算机指令,当所述计算机指令在电子设备上运行时,使得所述电子设备执行如权利要求1至14任一项所述的方法。
  17. 一种计算机程序产品,其特征在于,当所述计算机程序产品在计算机上运行时,使得所述计算机执行如权利要求1至14任一项所述的方法。
PCT/CN2022/091899 2021-05-19 2022-05-10 投屏方法及相关装置 WO2022242503A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
EP22803825.3A EP4343533A1 (en) 2021-05-19 2022-05-10 Screen projection method and related apparatus

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
CN202110547552 2021-05-19
CN202110547552.4 2021-05-19
CN202110745467.9 2021-06-30
CN202110745467.9A CN115373778A (zh) 2021-05-19 2021-06-30 投屏方法及相关装置

Publications (1)

Publication Number Publication Date
WO2022242503A1 true WO2022242503A1 (zh) 2022-11-24

Family

ID=84060018

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/091899 WO2022242503A1 (zh) 2021-05-19 2022-05-10 投屏方法及相关装置

Country Status (3)

Country Link
EP (1) EP4343533A1 (zh)
CN (1) CN115373778A (zh)
WO (1) WO2022242503A1 (zh)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140281896A1 (en) * 2013-03-15 2014-09-18 Google Inc. Screencasting for multi-screen applications
CN107493375A (zh) * 2017-06-30 2017-12-19 北京超卓科技有限公司 移动终端扩展式投屏方法及投屏系统
US20190121682A1 (en) * 2015-04-26 2019-04-25 Intel Corporation Integrated android and windows device
CN112416200A (zh) * 2020-11-26 2021-02-26 维沃移动通信有限公司 显示方法、装置、电子设备和可读存储介质
CN112527221A (zh) * 2019-09-18 2021-03-19 华为技术有限公司 一种数据传输的方法及相关设备
CN112667183A (zh) * 2020-12-31 2021-04-16 努比亚技术有限公司 投屏方法、移动终端及计算机可读存储介质

Also Published As

Publication number Publication date
CN115373778A (zh) 2022-11-22
EP4343533A1 (en) 2024-03-27


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 22803825; Country of ref document: EP; Kind code of ref document: A1)
WWE Wipo information: entry into national phase (Ref document number: 2023571459; Country of ref document: JP. Ref document number: 18561899; Country of ref document: US)
WWE Wipo information: entry into national phase (Ref document number: 2022803825; Country of ref document: EP)
NENP Non-entry into the national phase (Ref country code: DE)
ENP Entry into the national phase (Ref document number: 2022803825; Country of ref document: EP; Effective date: 20231219)