WO2022242503A1 - Screen projection method and related apparatus - Google Patents
Screen projection method and related apparatus
- Publication number
- WO2022242503A1 (PCT application PCT/CN2022/091899)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- desktop
- electronic device
- status bar
- display
- application
- Prior art date
Classifications
- G06F9/451—Execution arrangements for user interfaces
- G06F9/452—Remote windowing, e.g. X-Window System, desktop virtualisation
- G06F3/1423—Digital output to display device; controlling a plurality of local displays, e.g. CRT and flat panel display
- G06F3/1454—Digital output to display device; involving copying of the display data of a local workstation or window to a remote workstation or window so that an actual copy of the data is displayed simultaneously on two or more displays, e.g. teledisplay
- G06F3/147—Digital output to display device; using display panels
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/0482—Interaction with lists of selectable items, e.g. menus
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G09G5/14—Display of multiple viewports
Definitions
- the present application relates to the field of electronic technology, and in particular to a screen projection method and related devices.
- smart terminals and multiple display devices form multi-screen linkage with complementary strengths, which is an important link in establishing a full-scenario ecosystem.
- screen projection technology covers large-screen scenarios such as mobile office, in-vehicle HiCar, and smart screens. Take projecting a mobile phone's screen to a personal computer (PC) as an example: after the phone's screen is mirrored to the computer, the user can control the phone from the computer.
- at present, mirror projection is usually used; that is, after projection, the display content on the mobile phone (that is, the projection sending device) and the computer (that is, the projection receiving device) is exactly the same, and the user cannot view different display content on the screen projection sending device and the screen projection receiving device.
- the present application provides a screen projection method and a related device, which can display different contents on a screen projection sending device and a screen projection receiving device after screen projection.
- the present application provides a screen projection method, including: a first electronic device invokes a first module of a first application to run a first desktop, the first desktop being associated with a first display area; the first electronic device displays first display content, the first display content including the first desktop; in response to a first user operation, the first electronic device invokes a second module of the first application to run a second desktop, the second desktop being associated with a second display area; the first electronic device sends second display content corresponding to the second display area to a second electronic device, the second display content including the second desktop; in response to a second user operation acting on the first display content, the first electronic device displays third display content based on the task stack running in the first display area; in response to a third user operation acting on the second display content displayed by the second electronic device, the first electronic device determines, based on the task stack running in the second display area, that the display content corresponding to the second display area is fourth display content; and the first electronic device sends the fourth display content to the second electronic device.
- the first electronic device (that is, the screen projection sending device) supports running multiple desktop instances in different display areas through the same application at the same time, for example, running the first desktop in the first display area through the first module of the first application, and running the second desktop in the second display area through the second module of the first application.
- the first electronic device determines the display content of the main screen of the device based on the task stack running in the first display area, and determines the display content to be projected to the second electronic device (ie, the screen projection receiving device) based on the task stack running in the second display area. In this way, based on the two different display areas, the first electronic device and the second electronic device can display different desktops and other different contents.
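The routing described above can be sketched as follows. This is an illustrative sketch only, not the patent's implementation: the names `ScreenProjectionSender`, `DisplayArea`, and `TaskStack` are our own. Each display area owns a task stack, and an operation is dispatched only to the stack of the area it acts on, so the sender's own screen and the projected screen can show different content.

```python
class TaskStack:
    def __init__(self):
        self.tasks = []

    def push(self, task):
        self.tasks.append(task)

    def top(self):
        # The top of the stack determines what the display area shows.
        return self.tasks[-1] if self.tasks else None


class DisplayArea:
    def __init__(self, display_id):
        self.display_id = display_id
        self.task_stack = TaskStack()

    def current_content(self):
        return self.task_stack.top()


class ScreenProjectionSender:
    MAIN_DISPLAY = 0       # first display area (the device's own screen)
    EXTENDED_DISPLAY = 1   # second display area (projected to the receiver)

    def __init__(self):
        self.areas = {
            self.MAIN_DISPLAY: DisplayArea(self.MAIN_DISPLAY),
            self.EXTENDED_DISPLAY: DisplayArea(self.EXTENDED_DISPLAY),
        }

    def handle_user_operation(self, display_id, task):
        # Events are routed only to the task stack of the display area they
        # act on, keeping the two desktops' events isolated from each other.
        area = self.areas[display_id]
        area.task_stack.push(task)
        return area.current_content()


sender = ScreenProjectionSender()
sender.handle_user_operation(sender.MAIN_DISPLAY, "first desktop")
sender.handle_user_operation(sender.EXTENDED_DISPLAY, "second desktop")
```

After these two operations the two display areas hold different content, which is exactly what lets the sending and receiving devices display different desktops.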
- in response to the second user operation acting on the first display content, the first electronic device displaying the third display content based on the task stack running in the first display area includes: in response to the second user operation acting on the first desktop in the first display content, the first electronic device displays the third display content based on the task stack of the first application running in the first display area. In response to the third user operation acting on the second display content displayed by the second electronic device, the first electronic device determining, based on the task stack run in the second display area, that the display content corresponding to the second display area is the fourth display content includes: in response to the third user operation acting on the second desktop in the second display content, the first electronic device determines, based on the task stack of the first application running in the second display area, that the display content corresponding to the second display area is the fourth display content.
- the first electronic device may execute a response event corresponding to the second user operation based on the task stack of the first application running in the display area associated with the first desktop;
- the first electronic device may execute a response event corresponding to the third user operation based on the task stack of the first application running on the display area associated with the second desktop.
- data isolation of events (input events and/or response events) of different desktops can be guaranteed.
- since the two desktop instances are both run by modules of the first application, the two desktops can share specified data, and the second desktop can inherit some or all of the functional characteristics of the first desktop.
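One way to picture this (an illustrative sketch; `Desktop` and `PcDesktop` are hypothetical names, not the patent's classes): both desktop instances live in the same application process, so class-level data is naturally shared, while the second desktop's class overrides only the presentation it needs to change.

```python
class Desktop:
    # Specified data shared by every desktop instance of the application
    # (for example, the installed-application list), held once per process.
    shared_apps = ["Gallery", "Music"]

    def layout(self):
        return "phone grid layout"


class PcDesktop(Desktop):
    # The second desktop inherits the first desktop's behavior and data,
    # overriding only the layout to a PC-style windowed presentation.
    def layout(self):
        return "pc windowed layout"


phone_desktop = Desktop()    # runs in the first display area
pc_desktop = PcDesktop()     # runs in the second display area

# Both instances see the very same shared data object...
same_data = phone_desktop.shared_apps is pc_desktop.shared_apps
# ...but present their content differently.
layouts = (phone_desktop.layout(), pc_desktop.layout())
```

Inheritance gives the second desktop the first desktop's functionality for free; sharing one class attribute stands in for the patent's "specified data" shared between the two desktops.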
- the method further includes: the first electronic device calls the third module of the second application to run the first status bar, and the first The status bar is associated with the first display area, and the first display content includes the first status bar; the method further includes: in response to the first user operation, the first electronic device calls the fourth module of the second application to run the second status bar, The second status bar is associated with the second display area, and the second display content includes the second status bar.
- the first electronic device supports running multiple status bar instances in different display areas through the same application, for example, running the first status bar in the first display area through the third module of the second application, and running the second status bar in the second display area through the fourth module of the second application.
- the first electronic device and the second electronic device can display different status bars, ensuring data isolation of events (input events and/or response events) between the two status bars.
- the two status bars can share specified data (such as notification messages), and the second status bar can inherit some or all of the functional characteristics of the first status bar.
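A minimal sketch of shared notifications between two status bar instances (illustrative only; `NotificationStore` and `StatusBar` are our names). Both bars observe one notification store, so a notification posted once appears in both, while each bar remains a separate instance bound to its own display area:

```python
class NotificationStore:
    """Single source of notification messages shared by all status bars."""

    def __init__(self):
        self.messages = []
        self.listeners = []

    def post(self, msg):
        self.messages.append(msg)
        for listener in self.listeners:
            listener.refresh()


class StatusBar:
    def __init__(self, display_id, store):
        self.display_id = display_id   # display area this instance belongs to
        self.store = store
        self.visible = []
        store.listeners.append(self)

    def refresh(self):
        # Each instance pulls from the shared store but keeps its own state,
        # so input/response events stay isolated per display area.
        self.visible = list(self.store.messages)


store = NotificationStore()
bar_main = StatusBar(0, store)   # first status bar, first display area
bar_ext = StatusBar(1, store)    # second status bar, second display area
store.post("new message")
```

After `post`, both bars show the shared notification even though they are independent instances, mirroring the "share specified data (such as notification messages)" behavior described above.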
- the method further includes: the first electronic device calls the fifth module of the third application to run the first display object of the first variable; the first variable is associated with the first display area, and the first display content includes the first display object; the first variable is also associated with the second display area, and the second display content includes the first display object.
- the first electronic device supports displaying objects corresponding to the same variable in multiple different display areas at the same time.
- the third application and the second application may be the same application or different applications, which are not specifically limited here.
- the method further includes: in response to a fourth user operation acting on the first display content, calling the fifth module of the third application to modify the display object of the first variable to the second display object; the first The electronic device updates the display content corresponding to the first display area to the fifth display content, and the fifth display content includes the second display object; the first electronic device updates the display content corresponding to the second display area to the sixth display content, and sends to the second The electronic device sends sixth display content, where the sixth display content includes the second display object.
- the first electronic device supports displaying objects corresponding to the same variable in multiple different display areas at the same time. After the user changes the display object of the first variable in the first display area, the display object of the first variable in the second display area also changes accordingly.
- the first variable is used to indicate the display object of the wallpaper; the display object of the wallpaper is a static picture and/or a dynamic picture; the wallpaper includes a lock screen wallpaper when the screen is locked and/or a desktop wallpaper when the screen is not locked.
- after the user changes the wallpaper displayed on the first electronic device, the wallpaper projected on the second electronic device also changes accordingly.
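The shared-variable mechanism can be sketched like this (illustrative only; `WallpaperVariable` and `WallpaperDisplayArea` are hypothetical names). One variable is bound to both display areas; changing its display object once propagates to both the device's own screen and the projected screen:

```python
class WallpaperDisplayArea:
    def __init__(self, display_id):
        self.display_id = display_id
        self.wallpaper = None


class WallpaperVariable:
    """The 'first variable': one source of truth for the wallpaper's
    display object, associated with both display areas."""

    def __init__(self, display_object):
        self._object = display_object
        self._areas = []

    def bind(self, area):
        self._areas.append(area)
        area.wallpaper = self._object

    def set_object(self, display_object):
        # Changing the variable once updates every bound display area,
        # so the projected wallpaper follows the device's wallpaper.
        self._object = display_object
        for area in self._areas:
            area.wallpaper = display_object


var = WallpaperVariable("static_picture.png")
main_area = WallpaperDisplayArea(0)   # first display area
ext_area = WallpaperDisplayArea(1)    # second display area (projected)
var.bind(main_area)
var.bind(ext_area)
var.set_object("dynamic_picture.gif")  # user changes the wallpaper
```

The same pattern covers the theme variable mentioned below: a single variable bound to both areas, so one change is reflected on both screens.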
- the first electronic device presets multiple themes, and the theme is used to indicate the desktop layout style, icon display style and/or interface color, etc.; the first variable is used to indicate the display object of the theme, and the theme display The object is the display content corresponding to one of the various themes.
- after the user changes the theme displayed on the first electronic device, the theme projected on the second electronic device also changes accordingly.
- the first module of the first application includes a first common class for creating and running the first desktop, a first user interface (UI) control class, and the desktop task stack of the first desktop; the second module of the first application includes a second common class for creating and running the second desktop, a second UI control class, and the desktop task stack of the second desktop. Part or all of the classes in the second common class inherit from the first common class, and part or all of the classes in the second UI control class inherit from the first UI control class.
- the first electronic device adds a second common class for creating and running the second desktop, a second UI control class, and the desktop task stack of the second desktop; part or all of the newly added common class and UI control class inherit from the first common class and the first UI control class corresponding to the original first desktop. Therefore, the second desktop can inherit part of the functional characteristics of the first desktop, and the two desktops can share specified data.
- the second common class includes one or more of the following: desktop startup provider, database assistant, desktop startup setting class, desktop startup constant class, Pc layout configuration, Pc device file, Pc grid counter, Pc desktop launcher strategy, Pc desktop startup mode, Pc loading tasks, etc.;
- the second UI control class includes one or more of the following: Pc drag layer, Pc desktop workspace, Pc unit layout, Pc dock view, Pc file folder, Pc folder icon, etc.
- the third module of the second application includes a first component for creating and running the first status bar, a first dependent control class, and a third UI control class; the fourth module of the second application includes a second component for creating and running the second status bar, a second dependent control class, and a fourth UI control class. Some or all classes in the second dependent control class inherit from the first dependent control class, and some or all classes in the fourth UI control class inherit from the third UI control class.
- the first electronic device adds a second component for creating and running the second status bar, a second dependent control class, and a fourth UI control class; part or all of the newly added component, dependent control class, and UI control class inherit from the first component, the first dependent control class, and the third UI control class corresponding to the original first status bar. Therefore, the second status bar can inherit part of the functional characteristics of the first status bar, and the two status bars can share specified data.
- the second component includes one or more of the following: Pc dependent class, Pc system provider, Pc system bar, second status bar;
- the second dependent control class includes one or more of the following: Pc Status bar window control class, screen control class, lock screen control class, remote control class;
- the fourth UI control class includes one or more of the following: Pc status bar window view, Pc notification panel view, Pc quick setting fragments, Pc status bar fragments, Pc status bar view.
- the ID of the display area associated with the second module is the ID of the second display area. In response to the first user operation, the first electronic device invoking the second module of the first application to run the second desktop, the second desktop being associated with the second display area, includes: in response to the first user operation, the Pc management service receives an instruction to switch modes, the instruction being used to indicate switching the current non-screen-projection mode to the screen projection mode; in response to the instruction, the Pc management service calls the Pc desktop service, the Pc desktop service calls the activity management service, and the activity management service calls the activity task manager to start the second module of the first application; the ID of the display area associated with the second module is determined by calling the root activity container; when the ID of the display area associated with the second module is the ID of the second display area, the Activity of the second desktop is queried as the Activity of the desktop to be started; when the ID of the display area associated with the second module is the ID of the first display area, the Activity of the first desktop is queried as the Activity of the desktop to be started.
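The final decision in that flow, which desktop Activity to start, turns only on the display area ID associated with the module. A sketch of that dispatch (illustrative; the constants and Activity names here are ours, standing in for the patent's first/second desktop Activities):

```python
FIRST_DISPLAY_ID = 0    # the device's own screen
SECOND_DISPLAY_ID = 1   # the display area projected to the receiving device


def resolve_desktop_activity(module_display_id):
    # Mirrors the query step above: the root activity container yields the
    # display area ID associated with the module, and that ID alone decides
    # which desktop's Activity is started.
    if module_display_id == SECOND_DISPLAY_ID:
        return "SecondDesktopActivity"
    return "FirstDesktopActivity"


to_start = resolve_desktop_activity(SECOND_DISPLAY_ID)
```

In the screen projection flow the module is associated with the second display area, so the second desktop's Activity is the one started.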
- in response to the first user operation, the first electronic device invokes the fourth module of the second application to run the second status bar, the second status bar being associated with the second display area, including: in response to the first user operation, the Pc management service receives an instruction to switch modes, the instruction being used to instruct switching the current non-screen-projection mode to the screen projection mode; in response to the instruction, the Pc management service starts the productivity service, and the productivity service invokes the system bar to start the second status bar.
- the system bar creates a second status bar based on the configuration file;
- the second status bar calls the callback interface of the command queue to add a callback to the second status bar;
- the second status bar initializes the layout and registers the IStatusBar object corresponding to the second status bar with the status bar management service;
- the second status bar creates and adds the Pc status bar window view to the status bar window control class;
- the status bar window control class calls the window management interface to add the second status bar to the window management service, which then adds the second status bar to the second display area.
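The startup sequence above can be condensed into a sketch (illustrative only; every class name here is ours, loosely modeled on the services the steps mention). The `start` method walks the same three stages: register a callback with the command queue, register with the status bar management service, then hand a window view to the window management service, which attaches it to the second display area:

```python
class WindowManagementService:
    def __init__(self):
        self.windows = {}   # display_id -> window views on that display area

    def add_window(self, view, display_id):
        self.windows.setdefault(display_id, []).append(view)


class SecondStatusBar:
    def __init__(self, config, display_id):
        self.display_id = display_id
        self.height = config["height"]   # created from the configuration file
        self.registered = False

    def start(self, command_queue, status_bar_service, window_service):
        # Step 1: add a callback for this status bar to the command queue.
        command_queue.append(self.on_command)
        # Step 2: initialize the layout and register this bar with the
        # status bar management service (stands in for IStatusBar).
        self.registered = True
        status_bar_service.append(self)
        # Step 3: create the status bar window view and add it through the
        # window management service, which places it on the display area.
        view = f"PcStatusBarWindowView(h={self.height})"
        window_service.add_window(view, self.display_id)
        return view

    def on_command(self, cmd):
        return f"status bar {self.display_id} handled {cmd}"


command_queue, status_bar_service = [], []
wms = WindowManagementService()
bar = SecondStatusBar({"height": 48}, display_id=1)
view = bar.start(command_queue, status_bar_service, wms)
```

The key point the sketch preserves is ordering: the bar is fully registered before its window view reaches the window management service and appears on the second display area.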
- in the non-screen-projection mode, the command queue supports the first status bar associated with the first display area; in the screen projection mode, the command queue simultaneously supports the first status bar associated with the first display area and the second status bar associated with the second display area.
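A sketch of that mode-dependent behavior (illustrative; `CommandQueue` here is our stand-in, not the platform class): the queue simply broadcasts to every registered status bar, so it serves one bar before projection and two bars after the second one registers.

```python
class CommandQueue:
    def __init__(self):
        self.bars = {}   # display_id -> status bar callback

    def register(self, display_id, callback):
        self.bars[display_id] = callback

    def broadcast(self, command):
        # Delivers the command to every registered status bar: only the
        # first bar in non-projection mode, both bars in projection mode.
        return {d: cb(command) for d, cb in self.bars.items()}


queue = CommandQueue()
queue.register(0, lambda c: f"first bar: {c}")        # non-projection mode
before = queue.broadcast("show notification")
queue.register(1, lambda c: f"second bar: {c}")       # entering projection mode
after = queue.broadcast("show notification")
```

Registration, not special-casing, is what switches the queue from single-bar to dual-bar operation.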
- the present application provides an electronic device, including one or more processors and one or more memories.
- the one or more memories are coupled with one or more processors, the one or more memories are used to store computer program codes, the computer program codes include computer instructions, and when the one or more processors execute the computer instructions, the electronic device performs A screen projection method in any possible implementation manner of any one of the above aspects.
- an embodiment of the present application provides a computer storage medium, including computer instructions, and when the computer instructions are run on the electronic device, the electronic device executes the screen projection method in any possible implementation of any one of the above aspects.
- an embodiment of the present application provides a computer program product, which, when the computer program product is run on a computer, causes the computer to execute the screen projection method in any possible implementation manner of any one of the above aspects.
- FIG. 1A is a schematic diagram of a communication system provided by an embodiment of the present application.
- FIG. 1B is a schematic structural diagram of an electronic device provided in an embodiment of the present application.
- FIG. 2 is a schematic diagram of screen projection between electronic devices provided by an embodiment of the present application.
- FIG. 3A is a schematic diagram of the main interface provided by an embodiment of the present application.
- FIG. 3B to FIG. 3G are schematic diagrams of user interfaces for screen projection through NFC provided by an embodiment of the present application.
- FIG. 4A to FIG. 4H are schematic diagrams of the second-level interface of the status bar of the extended screen provided by an embodiment of the present application.
- FIG. 4I is a schematic diagram of the status bar of the extended screen provided by an embodiment of the present application.
- FIG. 5 is a schematic diagram of the user interface of the search bar on the extended screen provided by an embodiment of the present application.
- FIG. 6A to FIG. 6B are schematic diagrams of the user interface of the application icon list provided by an embodiment of the present application.
- FIG. 6C is a schematic diagram of the user interface of the gallery on the extended screen desktop provided by an embodiment of the present application.
- FIG. 6D is a schematic diagram of the user interface of music on the extended screen desktop provided by an embodiment of the present application.
- FIG. 7A to FIG. 7B are schematic diagrams of the multitasking interface provided by an embodiment of the present application.
- FIG. 8A to FIG. 8B are schematic diagrams of user interfaces related to displaying desktop icons provided by an embodiment of the present application.
- FIG. 8C to FIG. 8E are schematic diagrams of the user interface of an application program on the extended screen desktop provided by an embodiment of the present application.
- FIG. 9A to FIG. 9B are schematic diagrams of user interfaces related to application icons in the Dock bar provided by an embodiment of the present application.
- FIG. 10A to FIG. 10B are schematic diagrams of the lock screen interface provided by an embodiment of the present application.
- FIG. 11 to FIG. 13 are schematic diagrams of the software system provided by an embodiment of the present application.
- FIG. 14 is a schematic diagram of an activity stack provided by an embodiment of the present application.
- FIG. 15A to FIG. 15C are schematic diagrams of software implementation of the screen projection method provided by an embodiment of the present application.
- the terms "first" and "second" are used for descriptive purposes only, and cannot be understood as indicating or implying relative importance or implicitly specifying the quantity of the indicated technical features. Therefore, a feature defined as "first" or "second" may explicitly or implicitly include one or more of that feature. In the description of the embodiments of the present application, unless otherwise specified, "multiple" means two or more.
- FIG. 1A exemplarily shows a schematic structural diagram of a communication system 10 provided by an embodiment of the present application.
- the communication system 10 includes an electronic device 100 and one or more electronic devices connected to the electronic device 100 , such as an electronic device 200 .
- the electronic device 100 may be directly connected to the electronic device 200 through a short-range wireless communication connection or a local wired connection.
- the electronic device 100 and the electronic device 200 may each have one or more short-distance communication modules among near field communication (NFC), wireless fidelity (WiFi), ultra wide band (UWB), Bluetooth, ZigBee, and other communication modules.
- the electronic device 100 can detect and scan for electronic devices near it by transmitting signals through a short-range communication module (such as an NFC communication module), so that the electronic device 100 can discover a nearby electronic device (such as the electronic device 200) through a short-range wireless communication protocol, establish a wireless communication connection with it, and transmit data to it.
- the electronic device 100 and the electronic device 200 may be connected to a local area network (local area network, LAN) through the electronic device 300 based on a wired connection or a WiFi connection.
- the electronic device 300 may be a third-party device such as a router, a gateway, or a smart device controller.
- the electronic device 100 and the electronic device 200 may also be indirectly connected through at least one electronic device 400 in a wide area network (such as a Huawei cloud network).
- the electronic device 400 may be a hardware server, or a cloud server embedded in a virtualized environment. It can be understood that, through the electronic device 300 and/or the electronic device 400 , the electronic device 100 and the electronic device 200 can indirectly perform wireless communication connection and data transmission.
- the structure shown in this embodiment does not constitute a specific limitation on the communication system 10 .
- the communication system 10 may include more or less devices than those shown.
- the electronic device 100 can send the projected image data and/or audio data, etc. to the electronic device 200, and the electronic device 200 can perform interface display and/or audio output based on the received data.
- the screen resolutions of the display screens of the electronic device 200 and the electronic device 100 may be different.
- the electronic device 100 may be a mobile phone, a tablet computer, a personal digital assistant (personal digital assistant, PDA), a wearable device, a laptop computer (laptop) and other portable electronic devices.
- the electronic device 100 may not be a portable electronic device, which is not limited in this embodiment of the present application.
- the electronic device 200 may be any display device such as a smart screen, a TV, a tablet computer, a notebook computer, a vehicle-mounted device, or a projector. Exemplary embodiments of the electronic device 100 and the electronic device 200 include, but are not limited to, devices carrying various operating systems.
- FIG. 1B exemplarily shows a schematic structural diagram of an electronic device 100 provided by an embodiment of the present application.
- the hardware structure of the electronic device 200 reference may be made to the related embodiments of the hardware structure of the electronic device 100, which will not be repeated here.
- the electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (universal serial bus, USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone jack 170D, a sensor module 180, a button 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a subscriber identification module (subscriber identification module, SIM) card interface 195, and the like.
- the sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, bone conduction sensor 180M, etc.
- the structure illustrated in the embodiment of the present invention does not constitute a specific limitation on the electronic device 100 .
- the electronic device 100 may include more or fewer components than shown in the figure, or combine certain components, or separate certain components, or arrange different components.
- the illustrated components can be realized in hardware, software or a combination of software and hardware.
- the processor 110 may include one or more processing units, for example: the processor 110 may include an application processor (application processor, AP), a modem processor, a graphics processing unit (graphics processing unit, GPU), an image signal processor (image signal processor, ISP), a controller, a memory, a video codec, a digital signal processor (digital signal processor, DSP), a baseband processor, and/or a neural network processing unit (neural-network processing unit, NPU), etc. Wherein, different processing units may be independent devices, or may be integrated in one or more processors.
- the controller may be the nerve center and command center of the electronic device 100 .
- the controller can generate an operation control signal according to the instruction opcode and timing signal, and complete the control of fetching and executing the instruction.
- a memory may also be provided in the processor 110 for storing instructions and data.
- the memory in processor 110 is a cache memory.
- the memory may hold instructions or data that the processor 110 has just used or used cyclically. If the processor 110 needs to use the instruction or data again, it can be called directly from the memory, which avoids repeated access, reduces the waiting time of the processor 110, and thus improves the efficiency of the system.
- processor 110 may include one or more interfaces.
- the interface may include an inter-integrated circuit (inter-integrated circuit, I2C) interface, an inter-integrated circuit sound (inter-integrated circuit sound, I2S) interface, a pulse code modulation (pulse code modulation, PCM) interface, a universal asynchronous receiver/transmitter (universal asynchronous receiver/transmitter, UART) interface, a mobile industry processor interface (mobile industry processor interface, MIPI), a general-purpose input/output (general-purpose input/output, GPIO) interface, a subscriber identity module (subscriber identity module, SIM) interface, and/or a universal serial bus (universal serial bus, USB) interface, etc.
- the charging management module 140 is configured to receive a charging input from a charger.
- the charger may be a wireless charger or a wired charger.
- the charging management module 140 can receive charging input from the wired charger through the USB interface 130 .
- the charging management module 140 may receive a wireless charging input through a wireless charging coil of the electronic device 100 . While the charging management module 140 is charging the battery 142 , it can also provide power for electronic devices through the power management module 141 .
- the power management module 141 is used for connecting the battery 142 , the charging management module 140 and the processor 110 .
- the power management module 141 receives the input of the battery 142 and/or the charging management module 140, and supplies power for the processor 110, the internal memory 121, the display screen 194, the camera 193, and the wireless communication module 160, etc.
- the power management module 141 can also be used to monitor parameters such as battery capacity, battery cycle times, and battery health status (leakage, impedance).
- the power management module 141 may also be disposed in the processor 110 .
- the power management module 141 and the charging management module 140 may also be set in the same device.
- the wireless communication function of the electronic device 100 can be realized by the antenna 1 , the antenna 2 , the mobile communication module 150 , the wireless communication module 160 , a modem processor, a baseband processor, and the like.
- Antenna 1 and Antenna 2 are used to transmit and receive electromagnetic wave signals.
- Each antenna in electronic device 100 may be used to cover single or multiple communication frequency bands. Different antennas can also be multiplexed to improve the utilization of the antennas.
- Antenna 1 can be multiplexed as a diversity antenna of a wireless local area network.
- the antenna may be used in conjunction with a tuning switch.
- the mobile communication module 150 can provide wireless communication solutions including 2G/3G/4G/5G applied on the electronic device 100 .
- the mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (low noise amplifier, LNA) and the like.
- the mobile communication module 150 can receive electromagnetic waves through the antenna 1, filter and amplify the received electromagnetic waves, and send them to the modem processor for demodulation.
- the mobile communication module 150 can also amplify the signals modulated by the modem processor, and convert them into electromagnetic waves through the antenna 1 for radiation.
- at least part of the functional modules of the mobile communication module 150 may be set in the processor 110 .
- at least part of the functional modules of the mobile communication module 150 and at least part of the modules of the processor 110 may be set in the same device.
- a modem processor may include a modulator and a demodulator.
- the modulator is used for modulating the low-frequency baseband signal to be transmitted into a medium-high frequency signal.
- the demodulator is used to demodulate the received electromagnetic wave signal into a low frequency baseband signal. Then the demodulator sends the demodulated low-frequency baseband signal to the baseband processor for processing.
- the low-frequency baseband signal is passed to the application processor after being processed by the baseband processor.
- the application processor outputs sound signals through audio equipment (not limited to speaker 170A, receiver 170B, etc.), or displays images or videos through display screen 194 .
- the modem processor may be a stand-alone device.
- the modem processor may be independent from the processor 110, and be set in the same device as the mobile communication module 150 or other functional modules.
- the wireless communication module 160 can provide solutions for wireless communication applied on the electronic device 100, including wireless local area networks (wireless local area networks, WLAN) (such as a wireless fidelity (Wireless Fidelity, Wi-Fi) network), bluetooth (bluetooth, BT), global navigation satellite system (global navigation satellite system, GNSS), frequency modulation (frequency modulation, FM), near field communication (near field communication, NFC) technology, infrared (infrared, IR) technology, and the like.
- the wireless communication module 160 may be one or more devices integrating at least one communication processing module.
- the wireless communication module 160 receives electromagnetic waves via the antenna 2 , frequency-modulates and filters the electromagnetic wave signals, and sends the processed signals to the processor 110 .
- the wireless communication module 160 can also receive the signal to be sent from the processor 110 , frequency-modulate it, amplify it, and convert it into electromagnetic waves through the antenna 2 for radiation.
- the antenna 1 of the electronic device 100 is coupled to the mobile communication module 150, and the antenna 2 is coupled to the wireless communication module 160, so that the electronic device 100 can communicate with the network and other devices through wireless communication technology.
- the wireless communication technology may include global system for mobile communications (global system for mobile communications, GSM), general packet radio service (general packet radio service, GPRS), code division multiple access (code division multiple access, CDMA), wideband code division multiple access (wideband code division multiple access, WCDMA), time-division code division multiple access (time-division code division multiple access, TD-SCDMA), long term evolution (long term evolution, LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technologies, etc.
- the GNSS may include a global positioning system (global positioning system, GPS), a global navigation satellite system (global navigation satellite system, GLONASS), a Beidou navigation satellite system (beidou navigation satellite system, BDS), a quasi-zenith satellite system (quasi-zenith satellite system, QZSS) and/or satellite based augmentation systems (satellite based augmentation systems, SBAS).
- the electronic device 100 realizes the display function through the GPU, the display screen 194 , and the application processor.
- the GPU is a microprocessor for image processing, and is connected to the display screen 194 and the application processor. GPUs are used to perform mathematical and geometric calculations for graphics rendering.
- Processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
- the display screen 194 is used to display images, videos and the like.
- the display screen 194 includes a display panel.
- the display panel can be a liquid crystal display (liquid crystal display, LCD), an organic light-emitting diode (organic light-emitting diode, OLED), an active-matrix organic light emitting diode (active-matrix organic light emitting diode, AMOLED), a flexible light-emitting diode (flex light-emitting diode, FLED), a MiniLED, a MicroLED, a Micro-OLED, quantum dot light emitting diodes (quantum dot light emitting diodes, QLED), etc.
- the electronic device 100 may include 1 or N display screens 194 , where N is a positive integer greater than 1.
- the electronic device 100 can realize the shooting function through the ISP, the camera 193 , the video codec, the GPU, the display screen 194 and the application processor.
- the ISP is used for processing the data fed back by the camera 193 .
- the light is transmitted to the photosensitive element of the camera through the lens, where the optical signal is converted into an electrical signal; the photosensitive element of the camera transmits the electrical signal to the ISP for processing, and the ISP converts it into an image visible to the naked eye.
- ISP can also perform algorithm optimization on image noise, brightness, and skin color.
- ISP can also optimize the exposure, color temperature and other parameters of the shooting scene.
- the ISP may be located in the camera 193 .
- Camera 193 is used to capture still images or video.
- the object generates an optical image through the lens and projects it to the photosensitive element.
- the photosensitive element may be a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor.
- the photosensitive element converts the light signal into an electrical signal, and then transmits the electrical signal to the ISP to convert it into a digital image signal.
- the ISP outputs the digital image signal to the DSP for processing.
- DSP converts digital image signals into standard RGB, YUV and other image signals.
- the electronic device 100 may include 1 or N cameras 193 , where N is a positive integer greater than 1.
- Digital signal processors are used to process digital signals. In addition to digital image signals, they can also process other digital signals. For example, when the electronic device 100 selects a frequency point, the digital signal processor is used to perform Fourier transform on the energy of the frequency point.
- Video codecs are used to compress or decompress digital video.
- the electronic device 100 may support one or more video codecs.
- the electronic device 100 can play or record videos in various encoding formats, for example: moving picture experts group (moving picture experts group, MPEG) 1, MPEG2, MPEG3, MPEG4 and so on.
- the NPU is a neural-network (NN) computing processor.
- Applications such as intelligent cognition of the electronic device 100 can be realized through the NPU, such as image recognition, face recognition, speech recognition, text understanding, and the like.
- the internal memory 121 may include one or more random access memories (random access memory, RAM) and one or more non-volatile memories (non-volatile memory, NVM).
- Random access memory can include static random-access memory (static random-access memory, SRAM), dynamic random access memory (dynamic random access memory, DRAM), synchronous dynamic random access memory (synchronous dynamic random access memory, SDRAM), double data rate synchronous dynamic random access memory (double data rate synchronous dynamic random access memory, DDR SDRAM; for example, the fifth generation of DDR SDRAM is generally called DDR5 SDRAM), etc.
- non-volatile memory can include disk storage devices and flash memory (flash memory). According to the operating principle, flash memory can include NOR FLASH, NAND FLASH, 3D NAND FLASH, etc.; according to the potential level of the storage cells, flash memory can include single-level cells (single-level cell, SLC), multi-level cells (multi-level cell, MLC), triple-level cells (triple-level cell, TLC), quad-level cells (quad-level cell, QLC), etc.; according to storage specifications, flash memory can include universal flash storage (universal flash storage, UFS), embedded multimedia card (embedded multi media card, eMMC), etc.
- the random access memory can be directly read and written by the processor 110, can be used to store executable programs (such as machine instructions) of an operating system or other running programs, and can also be used to store user and application data, etc.
- the non-volatile memory can also store executable programs and data of users and application programs, etc., and can be loaded into the random access memory in advance for the processor 110 to directly read and write.
- the external memory interface 120 can be used to connect an external non-volatile memory, so as to expand the storage capacity of the electronic device 100 .
- the external non-volatile memory communicates with the processor 110 through the external memory interface 120 to implement a data storage function. For example, files such as music and video are stored in an external non-volatile memory.
- the electronic device 100 can implement audio functions through the audio module 170 , the speaker 170A, the receiver 170B, the microphone 170C, the earphone interface 170D, and the application processor. Such as music playback, recording, etc.
- the audio module 170 is used to convert digital audio information into analog audio signal output, and is also used to convert analog audio input into digital audio signal.
- the audio module 170 may also be used to encode and decode audio signals.
- The speaker 170A, also referred to as a "horn", is used to convert audio electrical signals into sound signals.
- The receiver 170B, also called an "earpiece", is used to convert audio electrical signals into sound signals.
- The microphone 170C, also called a "mic", is used to convert sound signals into electrical signals.
- the pressure sensor 180A is used to sense the pressure signal and convert the pressure signal into an electrical signal.
- the electronic device 100 detects the intensity of the touch operation according to the pressure sensor 180A.
- the electronic device 100 may also calculate the touched position according to the detection signal of the pressure sensor 180A.
- the gyro sensor 180B can be used to determine the motion posture of the electronic device 100 .
- the air pressure sensor 180C is used to measure air pressure.
- the magnetic sensor 180D includes a Hall sensor.
- the acceleration sensor 180E can detect the magnitude of the acceleration of the electronic device 100 in various directions (for example, the directions pointed by the three axes in the x, y, z coordinate system of the electronic device 100).
- the distance sensor 180F is used to measure the distance.
- the electronic device 100 may measure the distance by infrared or laser.
- Proximity light sensor 180G may include, for example, light emitting diodes (LEDs) and light detectors, such as photodiodes.
- the ambient light sensor 180L is used for sensing ambient light brightness.
- the electronic device 100 can adaptively adjust the brightness of the display screen 194 according to the perceived ambient light brightness.
- the fingerprint sensor 180H is used to collect fingerprints.
- the temperature sensor 180J is used to detect temperature.
- the touch sensor 180K is also called “touch device”.
- the touch sensor 180K can be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 form a touch screen, also called a “touch screen”.
- the touch sensor 180K is used to detect a touch operation on or near it.
- the touch sensor can pass the detected touch operation to the application processor to determine the type of touch event.
- Visual output related to the touch operation can be provided through the display screen 194 .
- the touch sensor 180K may also be disposed on the surface of the electronic device 100 , which is different from the position of the display screen 194 .
- the bone conduction sensor 180M can acquire vibration signals.
- the bone conduction sensor 180M can acquire the vibration signal of the vibrating bone mass of the human voice and the blood pressure beating signal.
- the key 190 can be a mechanical key or a touch key.
- the electronic device 100 can receive key input and generate key signal input related to user settings and function control of the electronic device 100 .
- the electronic device 200 displays the extended screen desktop of the electronic device 100 in full screen.
- the extended screen desktop usually adopts a custom Android application package (Android application package, APK) different from that of the standard desktop of the electronic device 100; that is, the electronic device 100 simulates displaying the desktop and the status bar of the electronic device 100 on the electronic device 200 through the custom APK.
- the extended screen desktop displayed by the electronic device 200 is different from the standard desktop of the electronic device 100, and the user cannot uniformly maintain the extended screen desktop and the standard desktop.
- In addition, the status bar (i.e. the extended screen status bar) provided by the custom APK differs from the standard status bar in user experience (UX), lacks the functional characteristics of the standard status bar (such as the notification center, quick settings, folders and FA functional characteristics), and cannot maintain the necessary data synchronization (such as the management of notification messages).
- the electronic device 100 can support the desktop launcher (Launcher) to simultaneously run multiple desktop instances in different display areas (Display), support the system interface (SystemUI) to simultaneously run multiple status bar instances in different display areas, and maintain the corresponding relationship between status bar instances and desktop instances.
- the desktop launcher may also be referred to as a desktop application.
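The multi-instance design described above can be sketched in plain Java. The class and method names here are illustrative assumptions, not the actual Launcher/SystemUI APIs: one registry keeps a desktop instance and a status bar instance per display area, keyed by the same displayId, which is one way the correspondence between status bar instances and desktop instances could be maintained.

```java
import java.util.HashMap;
import java.util.Map;

// Illustrative sketch: one launcher process keeps a desktop instance per
// display area, and SystemUI keeps a matching status bar instance keyed
// by the same displayId.
class MultiDisplayRegistry {
    static class DesktopInstance {
        final int displayId;
        DesktopInstance(int displayId) { this.displayId = displayId; }
    }
    static class StatusBarInstance {
        final int displayId;
        StatusBarInstance(int displayId) { this.displayId = displayId; }
    }

    private final Map<Integer, DesktopInstance> desktops = new HashMap<>();
    private final Map<Integer, StatusBarInstance> statusBars = new HashMap<>();

    // Called when a display area (Display0, Display1, ...) is created:
    // the same Launcher / SystemUI creates one instance per display.
    void onDisplayAdded(int displayId) {
        desktops.put(displayId, new DesktopInstance(displayId));
        statusBars.put(displayId, new StatusBarInstance(displayId));
    }

    // The status bar instance corresponding to a given desktop instance
    // is looked up through the shared displayId.
    StatusBarInstance statusBarFor(DesktopInstance d) {
        return statusBars.get(d.displayId);
    }

    DesktopInstance desktop(int displayId) { return desktops.get(displayId); }
    int desktopCount() { return desktops.size(); }
}
```

Keying both maps by the same displayId is what keeps each status bar bound to the desktop of its own display area.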
- FIG. 2 shows a schematic diagram of screen projection communication between the electronic device 100 and the electronic device 200 provided in the embodiment of the present application.
- the physical display screen configured by the electronic device 100 is the default screen of the electronic device 100
- the electronic device 100 runs a standard desktop and a standard status bar in the default screen display area (i.e. Display0)
- the display area identifier (Display Identity document, DisplayId) corresponding to the application windows running on Display0 is the identity (Identity document, ID) of Display0.
- the electronic device 100 displays a standard desktop and a standard status bar on the default screen based on the application window run by Display0.
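This routing of application windows by display area can be illustrated with a small sketch (hypothetical classes, not framework APIs): each window carries the DisplayId of the Display it runs in, and the content of one screen is composed from exactly the windows tagged with that Display's ID.

```java
import java.util.ArrayList;
import java.util.List;

// Illustrative sketch of window-to-display routing.
class WindowRouter {
    static final int DISPLAY0 = 0; // default screen display area
    static final int DISPLAY1 = 1; // extension screen display area

    static class AppWindow {
        final String name;
        final int displayId; // the ID of the Display this window runs in
        AppWindow(String name, int displayId) {
            this.name = name;
            this.displayId = displayId;
        }
    }

    // Select the windows that make up the content of one display.
    static List<AppWindow> windowsOf(List<AppWindow> all, int displayId) {
        List<AppWindow> out = new ArrayList<>();
        for (AppWindow w : all) {
            if (w.displayId == displayId) out.add(w);
        }
        return out;
    }
}
```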
- the display screen configured by the electronic device 200 is used as an extension screen of the mobile phone 100, and an extension screen display area (Display1) is created for the extension screen.
- the electronic device 100 runs the extended screen desktop and the extended screen status bar on Display1, and the DisplayId corresponding to the application window running on Display1 is the ID of Display1.
- the extended screen desktop and the standard desktop are two desktop instances created and run by the same Launcher, and the extended screen status bar and the standard status bar are status bar instances created and run by the same SystemUI.
- the electronic device 100 determines the display content of the extended screen based on the application window running on Display1, and can project the display content of the extended screen to the electronic device 200; the electronic device 200 can display the extended screen desktop and Extended screen status bar.
- the electronic device 100 may acquire device information such as the model of the electronic device 200 and the screen resolution.
- the electronic device 100 can adapt the application window running on Display1 based on the device information, such as the model and screen resolution, of the electronic device 200: it changes the display size of the application window as well as the display size and interface layout of each interface element in the application window, and shields some original functions of the electronic device 100 that are not applicable to the electronic device 200.
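As a rough illustration of the adaptation step (the formula and names are assumptions, not the patented implementation), the window and its interface elements can be rescaled in proportion to the target device's resolution:

```java
// Illustrative sketch: keep a window's proportion of the screen constant
// when moving from the source screen resolution to the target resolution.
// Real adaptation would also relayout interface elements and shield
// functions not applicable to the target device.
class WindowAdapter {
    static int[] scaleWindow(int w, int h,
                             int srcW, int srcH,
                             int dstW, int dstH) {
        return new int[] { w * dstW / srcW, h * dstH / srcH };
    }
}
```

For example, a 540x960 window on a 1080x1920 phone screen would occupy the same fraction of a 2160x1440 extension screen after scaling.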
- Since the two desktop instances run in different display areas, the data isolation of the two desktop instances can be realized; at the same time, since the two desktop instances originate from the same desktop launcher APK, synchronized operation of the two desktop instances can also be realized. For example, the theme, wallpaper and lock screen interface of the extended screen of the electronic device 100 all follow the default screen of the electronic device 100.
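A minimal sketch of this isolation-plus-synchronization idea (hypothetical names; the real launcher is far more involved): settings shared at the APK level, such as the theme and wallpaper, follow the default screen for every instance, while per-display state stays private to each desktop instance.

```java
import java.util.HashMap;
import java.util.Map;

// Illustrative sketch: state shared across all desktop instances of the
// same launcher APK vs. state isolated per display.
class LauncherState {
    // Shared: the extension screen follows the default screen.
    String theme = "default";
    String wallpaper = "default";

    // Isolated: each desktop instance keeps its own icon layout.
    private final Map<Integer, String> layoutByDisplay = new HashMap<>();

    void setTheme(String t) { theme = t; }
    void setLayout(int displayId, String layout) {
        layoutByDisplay.put(displayId, layout);
    }
    String layout(int displayId) {
        return layoutByDisplay.getOrDefault(displayId, "grid");
    }
}
```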
- the embodiment of the present application can run multiple status bar instances at the same time, and maintain the data channel of each status bar instance. Each status bar instance is associated with a different Display, so as to ensure that status bar events of different Displays can be isolated from each other without affecting each other.
- the desktop and the status bar can run across Displays, and at the same time, data synchronization of different Displays can be maintained.
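The per-Display event isolation can be modeled as a dispatcher that delivers each status bar event only to the instance associated with the event's displayId (an illustrative sketch, not SystemUI code):

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Illustrative sketch: each status bar instance is associated with one
// Display, and an event tagged with a displayId is delivered only to
// that display's instance, so different Displays do not affect each other.
class StatusBarDispatcher {
    private final Map<Integer, List<String>> eventsByDisplay = new HashMap<>();

    void dispatch(int displayId, String event) {
        eventsByDisplay.computeIfAbsent(displayId, k -> new ArrayList<>())
                       .add(event);
    }

    List<String> eventsOf(int displayId) {
        return eventsByDisplay.getOrDefault(displayId, new ArrayList<>());
    }
}
```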
- the application window can be the Window object corresponding to an Activity in the Android system, an application window in the iOS system, or an application window in another operating system.
- An application program includes a plurality of application program windows, and an application program window generally corresponds to a user interface (User Interface, UI).
- one application window may also correspond to multiple user interfaces.
- the application window may also be referred to simply as a window in this embodiment of the present application.
- the Activity in the Android system is an interface for interaction between the user and the application program, and each Activity component is associated with a Window object, which is used to describe a specific application window.
- Activity is a highly abstract user interface component.
- In the Android system, it represents the user interface and the corresponding business logic centered on the user interface.
- the controls in the user interface can monitor and process events triggered by the user.
- In an Android application, an Activity can be represented as a user interface, and an Android application can have multiple Activities.
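The Activity-Window relationship described above can be modeled minimally (these classes only mirror the description; they are not the Android framework classes):

```java
import java.util.ArrayList;
import java.util.List;

// Minimal model: each Activity is associated with one Window object,
// and one application can hold multiple Activities.
class App {
    static class Window {
        final String title;
        Window(String title) { this.title = title; }
    }
    static class Activity {
        final Window window; // one Window per Activity
        Activity(String title) { window = new Window(title); }
    }

    final List<Activity> activities = new ArrayList<>();
    void launch(String title) { activities.add(new Activity(title)); }
}
```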
- The exemplary main interface 11 of the desktop of the mobile phone 100 provided in the embodiment of the present application is introduced below.
- FIG. 3A shows the main interface 11 on the mobile phone 100 for displaying the application programs installed on the mobile phone 100 .
- the main interface 11 may include: a standard status bar 101, a calendar indicator 102, a weather indicator 103, a tray 104 with frequently used application icons, and other application icons 105. Wherein:
- the standard status bar 101 may include: one or more signal strength indicators 101A of a mobile communication signal (also referred to as a cellular signal), an operator name (such as "China Mobile") 101B, one or more signal strength indicators 101C of a wireless fidelity (wireless fidelity, Wi-Fi) signal, a battery status indicator 101D, and a time indicator 101E.
- the tray 104 with commonly used application program icons can display: phone icon, contact icon, text message icon, camera icon.
- Other application program icons 105 may display: file management icons, gallery icons, music icons, setting icons, and the like.
- the main interface 11 may also include a page indicator 106. Icons of other application programs may be distributed across multiple pages, and the page indicator 106 may be used to indicate which page of applications the user is currently viewing. Users can swipe left and right in the area of other application icons to view the application icons on other pages.
- the main interface 11 may also include a desktop wallpaper 107 , and the desktop wallpaper 107 may be set by the user, or may be set by default on the mobile phone 100 .
- the mobile phone 100 can use multiple themes, and the mobile phone 100 can change the desktop layout style, icon display style and desktop color by switching the theme. Usually each theme has a wallpaper configured by default, and the user can also modify the wallpaper of the mobile phone 100 under the current theme.
- FIG. 3A only exemplarily shows the user interface on the mobile phone 100, and should not be construed as limiting the embodiment of the present application.
- the mobile phone 100 can establish a connection with the computer 200 through short-range wireless communication technologies such as NFC, WiFi, and Bluetooth, and then can project the desktop of the mobile phone 100 to the computer 200 .
- Screen projection through NFC is taken as an example below to illustrate the screen projection process.
- the mobile phone 100 receives a user's downward sliding operation on the standard status bar 101.
- the mobile phone 100 displays the control center interface 12 shown in FIG. 3B.
- the control center interface 12 includes shortcut icons 201 of multiple commonly used functions (such as the WLAN icon, Bluetooth icon, NFC icon 201A, and multi-screen collaboration icon shown in FIG. 3B), icons of one or more smart collaborative devices (such as the laptop MateBook, tablet MatePad, desktop computer Desktop, smart speaker Sound X, etc.), and one or more control boxes of smart home devices (such as the control box 204 of the air purifier and the control box 205 of the smart lighting).
- the NFC icon 201A has two states, namely a selected state and a non-selected state.
- the NFC icon 201A shown in FIG. 3B is in the unselected state, and the mobile phone 100 has not enabled the NFC module. The NFC icon 201A can receive the user's input operation (such as a touch operation), and in response to the input operation, the mobile phone 100 switches the NFC icon 201A to the selected state shown in FIG. 3C and turns on the NFC module.
- the NFC module of the mobile phone 100 is located in the NFC area on the back of the mobile phone, and the NFC module of the computer 200 is located in the NFC area at the lower right corner of the computer 200 .
- the user can realize the NFC connection between the mobile phone 100 and the computer 200 by bringing the NFC area of the mobile phone 100 close to the NFC area of the computer 200 , and then can realize the desktop projection of the mobile phone 100 to the computer 200 .
- the NFC area of the mobile phone 100 can also be located in other parts of the mobile phone 100
- the NFC area of the computer 200 can also be located in other parts of the computer 200, which are not specifically limited here.
- the mobile phone 100 can detect the NFC signal of the computer 200, and the mobile phone 100 displays the prompt box 13 shown in FIG. 3E on the current display interface.
- the prompt box 13 includes the model 301 of the computer 200 (for example, MateBook), prompt information 302, a connection control 303, and a cancel control 304.
- the prompt information 302 is used to prompt the user to click the connection control 303 to establish the NFC connection; after the NFC connection, the user can control the mobile phone 100 on the computer 200.
- the connection control 303 may receive a user's input operation (such as a touch operation), and in response to the input operation, the mobile phone 100 sends an NFC connection request to the computer 200 .
- the cancel control 304 may receive a user's input operation (such as a touch operation), and in response to the input operation, the mobile phone 100 may close the prompt box 13 .
- the computer 200 displays a prompt box 15 on the current display interface (such as the desktop 14 of the computer 200 ).
- the prompt box 15 includes prompt information 305 , a connection control 306 and a cancel control 307 .
- the prompt information 305 is used to prompt the user whether to allow the mobile phone 100 to connect to the device; clicking the connection control 306 establishes the NFC connection. After the NFC connection, the desktop of the mobile phone 100 is projected to the computer 200, and the user can control the mobile phone 100 through the desktop displayed on the computer 200.
- the connection control 306 may receive a user's input operation (such as a touch operation), and in response to the input operation, the computer 200 sends an NFC connection response to the mobile phone 100 .
- the cancel control 307 can receive a user's input operation (such as a touch operation), and in response to the input operation, the computer 200 can close the prompt box 15 .
- in response to the received NFC connection response of the computer 200, the mobile phone 100 creates an extended screen desktop through the Launcher and creates an extended screen status bar through the SystemUI; the DisplayId corresponding to the extended screen desktop and the extended screen status bar is the ID of the extended screen display area (Display1).
- the mobile phone 100 sends the projection data of Display1 to the computer 200 .
- the computer 200 displays the PC-based main interface 16 of the mobile phone 100 based on the projection data.
- PC-ization means that, in order to adapt to the screen resolution of the computer 200 and the user's operating habits on the computer 200, interface optimization and function optimization have been carried out on the extended screen desktop and the extended screen status bar projected from the mobile phone 100 to the computer 200.
- the main interface 16 may include: an extended screen status bar 401 , a search bar 402 , a Dock bar 403 and a desktop wallpaper 404 .
- the main interface 16 of the screen projected from the mobile phone 100 to the extended screen is adapted to the screen resolution of the computer 200 .
- the theme and wallpaper of the extended screen desktop displayed on the computer 200 are consistent with the standard desktop of the mobile phone 100.
- the desktop wallpaper 107 and the desktop wallpaper 404 come from the same picture; for example, they are cropped from the same picture.
- the extended screen status bar 401, the search bar 402, the Dock bar 403, and the lock screen interface of the main interface 16 that the mobile phone 100 projects to the extended screen are described in detail below.
- the extended screen status bar 401 may include: a notification center icon 401A, an input method indicator 401B, one or more signal strength indicators 401C of WiFi signals, a battery status indicator 401D, a time indicator 401E, and a control center icon 401F.
- the extended screen status bar 401 may also include other interface elements, such as a Bluetooth icon, a signal strength indicator of a cellular network, etc., which are not specifically limited here.
- the mobile phone 100 runs two status bar instances, that is, the standard status bar 101 corresponding to the standard desktop displayed on the mobile phone 100 and the extended screen status bar 401 corresponding to the extended screen desktop displayed on the computer 200 .
- the DisplayId associated with the standard status bar 101 is the ID of Display0
- the DisplayId associated with the extended screen status bar 401 is the ID of Display1.
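The two-instance arrangement described above, in which the phone keeps one status bar per display keyed by DisplayId, can be sketched as follows. This is a hypothetical illustration (class and method names are invented), not the patent's SystemUI implementation:

```python
# Hypothetical sketch: one independent status bar instance per display,
# keyed by DisplayId. Display0 is the phone's default screen, Display1
# is the extended (projected) screen.
DISPLAY_DEFAULT = 0
DISPLAY_EXTENDED = 1

class StatusBar:
    def __init__(self, display_id, style):
        self.display_id = display_id
        self.style = style  # e.g. "standard" on the phone, "pc" on the extended screen

class StatusBarRegistry:
    """SystemUI-like registry holding an independent status bar per display."""
    def __init__(self):
        self._bars = {}

    def create(self, display_id, style):
        self._bars[display_id] = StatusBar(display_id, style)
        return self._bars[display_id]

    def get(self, display_id):
        return self._bars[display_id]

registry = StatusBarRegistry()
registry.create(DISPLAY_DEFAULT, "standard")   # standard status bar 101
registry.create(DISPLAY_EXTENDED, "pc")        # extended screen status bar 401

assert registry.get(DISPLAY_EXTENDED).style == "pc"
```

Because each instance carries its own DisplayId, input events and rendering updates for the projected status bar never touch the phone's own status bar.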
- each interface element in the status bar 401 of the extended screen can open a corresponding secondary interface.
- the secondary interface of each interface element in the status bar 401 of the extended screen will be introduced below.
- the notification center icon 401A can receive a user's input operation (for example, a click operation of the left mouse button or a user's touch operation), and in response to the input operation, the computer 200 displays the notification center window 17 shown in FIG. 4A .
- the input operations received by the extended screen desktop and the extended screen status bar involved in the embodiments of the present application may be operations performed by the user through the mouse; when the computer 200 is configured with a touch screen (or touch panel), these input operations may be touch operations performed by the user through the touch screen (or touch panel), which is not specifically limited here.
- the controls of the extended screen status bar displayed on the computer 200 can receive the user's input operation (such as a click operation of the left mouse button), and in response to the above input operation, the computer 200 may send relevant information of the above input operation (such as the coordinates of the left click and the DisplayId corresponding to the display screen on which the above input operation acts) to the mobile phone 100.
- based on the relevant information of the above input operation, the mobile phone 100 recognizes the input operation as a left-click operation of the mouse on the notification center icon 401A in the extended screen status bar, and then determines that the response event triggered by the above input operation is to display the notification center window 17.
- the mobile phone 100 runs the notification center window 17 on Display1 and sends the updated display content of Display1 to the computer 200; the computer 200 displays the notification center window 17 shown in FIG. 4A according to the projection data sent by the mobile phone 100.
- for the processing of input operations received by the extended screen status bar displayed on the computer 200 in subsequent embodiments, reference may be made to the above implementation process.
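The forwarding flow above, in which the computer reports the raw event (coordinates plus DisplayId) and the phone resolves it against the layout of that display, can be sketched as follows. All names are hypothetical; the real implementation dispatches Android input events rather than plain tuples:

```python
# Hypothetical sketch of the input-forwarding flow: the computer sends the
# raw event (coordinates + DisplayId); the phone hit-tests it against the
# layout of that display and triggers the matching response event.

class Element:
    def __init__(self, name, x, y, w, h, on_click):
        self.name, self.rect, self.on_click = name, (x, y, w, h), on_click

    def contains(self, x, y):
        ex, ey, w, h = self.rect
        return ex <= x < ex + w and ey <= y < ey + h

class Phone:
    def __init__(self):
        self.layouts = {}    # DisplayId -> list of Element
        self.log = []

    def handle_remote_event(self, display_id, x, y):
        for element in self.layouts.get(display_id, []):
            if element.contains(x, y):
                self.log.append(element.on_click())  # run the response event
                return
        self.log.append("ignored")

phone = Phone()
# Extended screen status bar layout on Display1: a notification-center icon.
phone.layouts[1] = [Element("notification_center", 0, 0, 40, 24,
                            lambda: "show notification center window")]

# The computer forwards a left click at (10, 10) on Display1.
phone.handle_remote_event(display_id=1, x=10, y=10)
assert phone.log == ["show notification center window"]
```

The same dispatch path serves every later example in this section: only the hit element and its response event differ.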
- the notification center window 17 includes one or more notification messages of the mobile phone 100 (such as the notification message 501 of music, the notification message 502 of a short message, and the notification message 503 of a bank card shown in FIG. 4A), and a delete control 504.
- the notification message (such as notification message 502) in the notification center window 17 can receive user input operations, and in response to different input operations of the user, operations such as removing, sharing, and viewing details of the notification message can be realized.
- the notification message 502 may receive a user's input operation (for example, by clicking the left mouse button), and in response to the above input operation, the computer 200 may display the specific content of the notification message 502 in the user interface of the SMS application.
- the notification message 502 may receive an input operation from the user (such as a right-click operation of the mouse), and in response to the above input operation, the computer 200 displays the menu bar 505 shown in FIG. 4B. The menu bar 505 includes a view control 505A, a remove control 505B, and a share control 505C. By receiving an input operation (for example, a click operation of the left mouse button) acting on the view control 505A, the remove control 505B, or the share control 505C, the notification message 502 can accordingly be viewed, shared, or removed from the notification center window 17.
- the display content of the notification center icon 401A includes icons of applications corresponding to the latest N notification messages.
- the display content of the notification center icon 401A as shown in FIG. 4C includes the music icon corresponding to the latest notification message 501 , the SMS icon corresponding to the notification message 502 , and the bank card icon corresponding to the notification message 503 .
- when a notification message is removed (or deleted) from the notification center window 17 on the extended screen, the corresponding notification message in the notification center window displayed on the default screen of the mobile phone 100 is also removed (or deleted).
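The behavior described above, where both displays render views of one notification store so that removal on either side affects both, and the notification center icon shows the app icons of the latest N messages, can be sketched as follows (a hypothetical model, not the patent's notification service):

```python
# Hypothetical sketch: notifications live in one shared store, so removing
# a message from the extended screen's notification center also removes it
# from the phone's default-screen notification center; the notification
# center icon shows the app icons of the latest N messages.

class NotificationStore:
    def __init__(self, n_icons=3):
        self.messages = []      # shared by all displays
        self.n_icons = n_icons

    def post(self, app, text):
        self.messages.append((app, text))

    def remove(self, app):
        self.messages = [m for m in self.messages if m[0] != app]

    def icon_apps(self):
        """App icons shown on the notification-center icon (latest N)."""
        return [app for app, _ in self.messages[-self.n_icons:]]

store = NotificationStore()
store.post("music", "Now playing")
store.post("sms", "New message")
store.post("bank", "Payment received")
assert store.icon_apps() == ["music", "sms", "bank"]

store.remove("sms")                            # removed on the extended screen...
assert store.icon_apps() == ["music", "bank"]  # ...gone from both displays
```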
- the input method indicator 401B can receive the user's input operation (for example, a click operation of the left mouse button), and in response to the above input operation, the computer 200 displays the floating window 18 shown in FIG. 4D. The floating window 18 includes one or more input method options (such as the option 511 of input method 1 and the option 512 of input method 2) and an input method setting control 513. The display content of the input method indicator 401B is the icon of the current input method of the mobile phone 100.
- for example, when the current input method of the mobile phone 100 is input method 2, the display content of the input method indicator 401B is the icon of input method 2.
- the option 511 of input method 1 can receive the user's input operation (for example, a click operation of the left mouse button), and in response to the above input operation, the computer 200 can switch the input method of the mobile phone 100 on the extended screen desktop to input method 1.
- the input method setting control 513 is used to display an input method setting interface.
- the signal strength indicator 401C can receive the user's input operation (such as a click operation of the left mouse button), and in response to the above input operation, the computer 200 displays the floating window 19 shown in FIG. 4E. The floating window 19 includes the WiFi network status 521 and the real-time network speed 522.
- the network status of WiFi includes three network statuses: strong, medium and weak.
- the current WiFi network status 521 of the mobile phone 100 is "medium”
- the real-time network speed 522 is 263 kilobits per second (Kbps).
- the battery status indicator 401D can receive the user's input operation (for example, a click operation of the left mouse button), and in response to the above input operation, the computer 200 displays the floating window 20 shown in FIG. 4F. The floating window 20 includes the current remaining power 531 and a setting control 532.
- the current remaining power 531 of the mobile phone 100 is 40%.
- the setting control 532 can receive user's input operation (for example, click operation by the left mouse button), and in response to the above input operation, the computer 200 can display the battery setting interface of the mobile phone 100 .
- the time indicator 401E can receive the user's input operation (for example, click through the left mouse button), and in response to the above input operation, the computer 200 displays the floating window 21 shown in FIG. 4G.
- the floating window 21 includes time, date and current month calendar.
- the floating window 21 further includes a scroll-up control 541 and a scroll-down control 542 .
- the scroll up control 541 can be used to view the calendar of the month before the current month
- the scroll down control 542 can be used to view the calendar of the month after the current month.
- the floating window 21 further includes a time and date setting control 543, and the setting control 543 is used to display a time and date setting interface.
- the control center icon 401F can receive the user's input operation (for example, a click operation of the left mouse button), and in response to the above input operation, the computer 200 displays the control center window 22 shown in FIG. 4H. The control center window 22 includes shortcut icons 551 of multiple commonly used functions (such as the WLAN icon, Bluetooth icon, NFC icon, and multi-screen collaboration icon shown in FIG. 4H), icons 552 of one or more smart collaborative devices (such as the laptop MateBook, tablet MatePad, desktop computer Desktop, smart speaker Sound X, etc.), and one or more control boxes of smart home devices (such as the control box 553 of the air purifier and the control box 554 of the smart lighting). As shown in FIG. 4H, the icon 552 of the smart collaborative device shows that the mobile phone 100 and the computer of the model "MateBook" (i.e., the computer 200) are currently in a collaborative state.
- the mobile phone 100 may also adjust the color of the interface elements displayed in the status bar 401 of the extension screen according to the theme color and/or wallpaper color of the extension screen desktop.
- when the theme color of the extended screen desktop and the color of the desktop wallpaper 404 are darker, the color of the interface elements of the extended screen status bar 401 is adjusted to white or another preset light color; when the theme color of the extended screen desktop and the color of the desktop wallpaper are lighter, the color of the interface elements of the extended screen status bar 401 is adjusted to black or another preset dark color.
- for example, the color of the desktop wallpaper 404 of the main interface 16 is lighter, and the main color of the interface elements of the extended screen status bar 401 is black; as shown in FIG. 4I, when the color of the desktop wallpaper 404 of the main interface 16 is darker, the main color of the interface elements of the extended screen status bar 401 is white.
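One way to realize the light/dark choice described above is to threshold the wallpaper's average luminance. This is a minimal sketch assuming a simple mean-luma heuristic with ITU-R BT.601 weights; the patent does not specify the actual color-analysis algorithm:

```python
# Hypothetical sketch: pick black or white status bar elements from the
# average luminance of the wallpaper (ITU-R BT.601 luma weights).

def average_luminance(pixels):
    """pixels: iterable of (r, g, b) tuples, 0-255. Returns mean luma 0-255."""
    pixels = list(pixels)
    return sum(0.299 * r + 0.587 * g + 0.114 * b for r, g, b in pixels) / len(pixels)

def status_bar_color(pixels, threshold=128):
    # dark wallpaper -> light (white) elements; light wallpaper -> dark (black)
    return "white" if average_luminance(pixels) < threshold else "black"

assert status_bar_color([(20, 20, 30), (10, 15, 25)]) == "white"        # dark wallpaper
assert status_bar_color([(240, 240, 235), (220, 225, 230)]) == "black"  # light wallpaper
```

A production implementation would typically sample only the region behind the status bar rather than the whole wallpaper, but the decision rule is the same.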
- the search bar 402 may receive an input operation from the user (for example, a double-click operation through the right mouse button), and in response to the above input operation, the computer 200 displays the global search floating window 23 shown in FIG. 5 .
- the floating window 23 includes a search bar 601 , frequently used application program icons 602 , search history 603 and hot news 604 .
- the search bar 601 can receive text information input by the user and perform a global search based on it; the frequently used application program icons 602 may include icons of one or more frequently used application programs; the search history 603 may include one or more recent search records of the search bar 601; and the hot news 604 may include real-time hot news titles and hot news titles in the hot search list.
- global search can be realized through the search bar 402: it can not only search locally for applications installed on the mobile phone 100, stored files, and the like, but also search online for news, videos, music, and other resources.
- in some embodiments, the search bar 402 on the extended screen desktop may also be set in the extended screen status bar 401, which is not specifically limited here.
- the user can use the mouse to move the cursor of the mouse on the main interface 16 of the mobile phone 100 displayed on the computer 200 .
- the embodiment of the present application adds a corresponding cursor motion effect, thereby increasing the visual feedback for the user's input operation.
- the initial display form of the mouse cursor can be a first shape (such as an arrow); when the mouse cursor hovers over a specified interface element displayed on the extended screen desktop or the extended screen status bar (such as the search bar 402 shown in FIG. 5), the display form of the mouse cursor can be changed to a second shape (for example, the small hand over the search bar 402 shown in FIG. 5).
- the Dock bar 403 may also be called a program dock.
- the Dock bar 403 may include a fixed application area 701 and a recently run area 702.
- the fixed application area 701 includes an application list icon 701A, a multitasking icon 701B, a display desktop icon 701C, and icons of one or more application programs (for example, the file management icon 701D and the browser icon 701E).
- Recently run area 702 includes icons of recently run applications.
- when the fixed application area 701 already includes the icon of a recently run application program, the icon of that application program does not need to be added to the recently run area 702.
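The deduplication rule above can be sketched as a small filter over the recent-apps list. The function name and limit are hypothetical illustrations:

```python
# Hypothetical sketch: an app already pinned in the fixed area 701 is not
# added again to the recently-run area 702 of the Dock bar.

def recent_area(fixed_apps, recent_apps, limit=4):
    """Most recent apps, excluding those already pinned in the fixed area."""
    out = []
    for app in recent_apps:
        if app not in fixed_apps and app not in out:
            out.append(app)
    return out[:limit]

fixed = ["file_manager", "browser"]
recent = ["gallery", "browser", "notes", "gallery", "music"]
# "browser" is pinned and the duplicate "gallery" collapses to one entry.
assert recent_area(fixed, recent) == ["gallery", "notes", "music"]
```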
- the user can set the display state of the Dock bar 403 .
- the display status of the Dock bar 403 is set to be automatically hidden.
- when the computer 200 displays the extended screen desktop of the mobile phone 100 or the user interface of other application programs of the mobile phone 100, the Dock bar 403 is automatically hidden; the computer 200 displays the Dock bar 403 only when the user moves the cursor close to the area where the Dock bar 403 is located.
- the display status of the Dock bar 403 is set to be displayed all the time.
- the computer 200 displays the extended screen desktop of the mobile phone 100, or displays the user interface of other application programs of the mobile phone 100 in the form of a floating window
- the Dock bar 403 is always displayed.
- when the computer 200 displays the user interface of other application programs in full screen, the Dock bar 403 is automatically hidden; when the user moves the cursor close to the area of the Dock bar 403 with the mouse, the computer 200 displays the Dock bar 403.
- the application list icon 701A can receive the user's input operation (for example, a click operation of the left mouse button), and in response to the above input operation, the computer 200 displays the user interface 24 shown in FIG. 6A. The user interface 24 includes an application icon list 703, which may display multiple application icons, such as the music icon 703A and the gallery icon 703B.
- the user interface 24 may further include a page indicator 704; other application program icons may be distributed on multiple pages, and the page indicator 704 may be used to indicate which page of application programs the user is currently viewing.
- the user can slide left or right on the user interface 24 with a finger to view application program icons on other pages.
- the user interface 24 may further include a scroll left control 705 and a scroll right control 706 , and the user may click the scroll left control 705 or the scroll right control 706 to view application icons on other pages.
- the scroll left control 705 and the scroll right control 706 can be hidden; when the user moves the cursor close to the area where the scroll left control 705 (or the scroll right control 706) is located, the computer 200 displays the scroll left control 705 (or the scroll right control 706). The user interface 24 may further include a search bar 707, which is used to quickly search for icons of the application programs installed on the mobile phone 100.
- the user can add the application icons in the application icon list 703 to the fixed application area 701 of the Dock bar.
- the music icon 703A may receive a user's drag operation (for example, the user selects the icon with the left mouse button and then drags it to the fixed application area 701), and in response to the above drag operation, the computer 200 may add the music icon 703A to the fixed application area 701.
- the application program icons in the application program icon list 703 and the Dock bar 403 can receive the user's input operation (for example, a double-click operation of the left mouse button or a touch operation of the user's finger), and in response to the above input operation, the computer 200 may display the user interface of the application corresponding to the application icon.
- the mobile phone 100 sets a white list of the applications that can be displayed on the extended screen; functions that cannot be implemented on the extended screen desktop can be shielded by setting the white list.
- the application icon list 703 displays only the icons of the applications in the above-mentioned white list.
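The whitelist shielding described above amounts to filtering the installed-app list before building the extended screen's icon list. A minimal sketch, with an invented whitelist for illustration:

```python
# Hypothetical sketch: the application icon list on the extended screen
# contains only apps in the extended-screen white list, shielding functions
# that cannot run on the extended screen desktop.

EXTENDED_SCREEN_WHITELIST = {"gallery", "music", "notes", "file_manager"}

def extended_screen_icons(installed_apps):
    """Icons eligible for the extended screen's application icon list."""
    return [app for app in installed_apps if app in EXTENDED_SCREEN_WHITELIST]

installed = ["gallery", "phone", "music", "ar_measure"]
assert extended_screen_icons(installed) == ["gallery", "music"]
```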
- some application programs can adapt to the screen resolution of the computer 200 and display a PC-based user interface in full screen on the extended screen; some application programs cannot adapt to the screen resolution of the computer 200 and can only display, on the extended screen, the standard user interface of the application program as it appears on the mobile phone 100.
- the gallery supports full-screen display of the PC-based user interface on the extended screen
- the gallery icon 703B can receive the user's input operation (such as a double-click operation of the left mouse button), and in response to the above input operation, the computer 200 can display, in full screen, the PC-based user interface 25 of the gallery shown in FIG. 6C.
- the music does not support the full-screen display of the PC-based user interface on the extended screen
- the music icon 703A can receive the user's input operation (for example, a double-click operation of the left mouse button), and in response to the above input operation, the computer 200 displays the standard user interface of music.
- when the computer 200 receives an input operation acting on application program icon 1 for starting application program 1 (for example, double-clicking the left mouse button on the gallery icon 703B), the computer 200 sends the relevant information of the above input operation (such as the coordinates of the double-click and the DisplayId corresponding to the extended screen on which the above input operation acts) to the mobile phone 100; based on the relevant information of the above input operation and the interface layout of the user interface (such as the user interface 24), the mobile phone 100 identifies the input operation as an input operation acting on application program icon 1 on the extended screen desktop, and determines that the input operation is used to start application program 1.
- the standard desktop of the mobile phone 100 may have already started the application 1 .
- if the mobile phone 100 determines that the mobile phone 100 does not currently run application program 1 on Display0, the mobile phone 100 starts user interface 1 of application program 1 (such as the user interface 25 of the gallery) on Display1 and projects the updated display content of Display1 to the computer 200; the computer 200 displays the above user interface 1 according to the projection data sent by the mobile phone 100.
- if the mobile phone 100 determines that the mobile phone 100 is running application program 1 on Display0, the mobile phone 100 moves the activity stack (ActivityStack) of application program 1 from Display0 to Display1, runs application program 1 on Display1, clears the running memory occupied by application program 1 on Display0, and then projects the updated display content of Display1 to the computer 200; the computer 200 displays the user interface of application program 1 (such as the user interface 25 of the gallery) according to the projection data sent by the mobile phone 100.
- in other embodiments, when the mobile phone 100 determines that the mobile phone 100 is running application program 1 on Display0, the mobile phone 100 sends prompt message 1 to the computer 200, and the computer 200 may display prompt message 2 based on the above prompt message 1 to remind the user that the mobile phone 100 is running application program 1 and that the user can operate application program 1 through the mobile phone 100.
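The launch decision described above can be sketched as follows: if the app is already running on the default screen (Display0), either move its activity stack to Display1 or merely prompt the user, depending on the embodiment; otherwise start it fresh on Display1. This is a hypothetical model of the control flow, not Android's ActivityStack code:

```python
# Hypothetical sketch of the launch decision for the extended screen.
DISPLAY0, DISPLAY1 = 0, 1

class DisplayManager:
    def __init__(self, move_stack=True):
        self.stacks = {DISPLAY0: set(), DISPLAY1: set()}  # apps per display
        self.move_stack = move_stack   # embodiment choice: move vs. prompt
        self.prompts = []

    def launch_on_extended(self, app):
        if app in self.stacks[DISPLAY0]:
            if self.move_stack:
                # move the ActivityStack: clear it on Display0, run on Display1
                self.stacks[DISPLAY0].discard(app)
                self.stacks[DISPLAY1].add(app)
            else:
                # alternative embodiment: just remind the user
                self.prompts.append(f"{app} is running on the phone")
        else:
            self.stacks[DISPLAY1].add(app)

dm = DisplayManager(move_stack=True)
dm.stacks[DISPLAY0].add("gallery")
dm.launch_on_extended("gallery")
assert dm.stacks == {DISPLAY0: set(), DISPLAY1: {"gallery"}}

dm2 = DisplayManager(move_stack=False)
dm2.stacks[DISPLAY0].add("gallery")
dm2.launch_on_extended("gallery")
assert dm2.stacks[DISPLAY1] == set() and dm2.prompts == ["gallery is running on the phone"]
```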
- the multitasking icon 701B can receive the user's input operation (for example, a click operation of the left mouse button), and in response to the above input operation, the computer 200 displays the multitasking interface 27 shown in FIG. 7A. The multitasking interface 27 includes thumbnails of one or more applications recently opened by the user through the extended screen desktop (such as the thumbnail 711 of the gallery and the thumbnail 712 of the note), and a delete control 713.
- the multitasking interface 27 may further include a scroll right control 714.
- a thumbnail (such as the thumbnail 711 of the gallery) can receive the user's input operation (such as a click operation of the left mouse button), and in response to the detected input operation, the computer 200 can display the user interface 25 of the gallery corresponding to the thumbnail.
- a thumbnail (for example, the thumbnail 711 of the gallery) includes a delete control 711A; the delete control 711A can receive an input operation of the user (for example, a click operation of the left mouse button), and in response to the detected input operation, the computer 200 can, through the mobile phone 100, clear the running memory occupied on Display1 by the gallery corresponding to the thumbnail 711.
- the delete control 713 can receive the user's input operation (such as a click operation of the left mouse button), and in response to the detected input operation, the computer 200 can, through the mobile phone 100, clear the running memory occupied on Display1 by all the applications corresponding to the thumbnails in the multitasking interface.
- the scroll right control 714 may receive the user's input operation (for example, a click operation of the left mouse button), and in response to the detected input operation, the computer 200 may display more thumbnails of recently run applications.
- the display desktop icon 701C can be used to minimize one or more application program windows currently displayed and display the extended screen desktop; when no other application windows are displayed on the extended screen desktop, the display desktop icon 701C may be used to restore the display of the one or more application windows that were most recently minimized.
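The toggle behavior of the display desktop icon 701C described above can be sketched as two swapped window lists. The class and method names are hypothetical:

```python
# Hypothetical sketch of the show-desktop icon 701C: if any windows are
# open, minimize them all and show the desktop; if the desktop is already
# clear, restore the most recently minimized windows.

class ExtendedDesktop:
    def __init__(self):
        self.visible = []      # currently shown application windows
        self.minimized = []    # windows hidden by the last "show desktop"

    def click_show_desktop(self):
        if self.visible:
            self.minimized, self.visible = self.visible, []
        else:
            self.visible, self.minimized = self.minimized, []

d = ExtendedDesktop()
d.visible = ["gallery", "notes"]
d.click_show_desktop()                       # minimize everything
assert d.visible == [] and d.minimized == ["gallery", "notes"]
d.click_show_desktop()                       # nothing shown -> restore
assert d.visible == ["gallery", "notes"]
```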
- the computer 200 displays the user interface 25 of the gallery, and the computer 200 can receive the user's input operation (for example, the user moves the cursor close to the area where the Dock bar 403 is located on the display screen through the mouse); in response to the above input operation, the computer 200 displays the Dock bar 403 shown in FIG. 8A on the user interface 25.
- the display desktop icon 701C in the Dock bar 403 can receive the user's input operation (such as a click operation of the left mouse button), and in response to the above input operation, the computer 200 minimizes the user interface 25 and displays the extended screen desktop shown in FIG. 8B. As shown in FIG. 8B, after the user interface 25 is minimized, the display desktop icon 701C can receive the user's input operation (for example, a click operation of the left mouse button), and in response to the above input operation, the computer 200 displays the most recently minimized user interface 25 again.
- the user interface of the application program displayed in full screen may display a minimize control 721, a zoom-out control 722, and a close control 723.
- the minimize control 721 can receive a user's input operation (for example, a left mouse click), and in response to the above input operation, the computer 200 minimizes the user interface 25.
- the close control 723 can receive the user's input operation (for example, a left mouse click), and in response to the above input operation, the computer 200 closes the user interface 25 and clears the running memory occupied by the user interface 25 in Display1 of the mobile phone 100.
- the zoom-out control 722 can receive the user's input operation (for example, a left mouse click); as shown in FIG. 8D, in response to the above input operation, the computer 200 shrinks the user interface 25 and runs the user interface 25 in the form of a floating window 28.
- the floating window 28 may include a magnification control 724, and the magnification control 724 can receive an input operation from the user (for example, a single left mouse click); in response to the above input operation, the computer 200 displays the user interface 25 again.
- the computer 200 can tile and display multiple application program windows on the extended screen desktop of the mobile phone 100 .
- the computer 200 simultaneously displays the floating window 28 corresponding to the gallery and the floating window 29 corresponding to the memo.
- the application program icons in the Dock bar 403 and the application program icon list 703 can also receive the user's input operation (for example, a right mouse click), and in response to the above input operation, the computer 200 can display multiple operation options of the application program corresponding to the icon, so as to implement removal, sharing, uninstallation, or other shortcut functions of the above application program.
- the operation options corresponding to each application program may be different; for example, some system applications (such as gallery, file management, etc.) do not support being uninstalled.
- an application program icon (for example, the browser icon 701E) in the Dock bar 403 can receive the user's input operation (for example, a right mouse click), and in response to the above input operation, the computer 200 may display a floating window 30.
- the floating window 30 may include a remove control 801, a share control 802, and an uninstall control 803.
- the removal control 801 may receive a user's input operation (for example, a mouse right click operation), and in response to the above input operation, the computer 200 may remove the browser icon 701E from the Dock bar 403 .
- the sharing control 802 is used to share the browser application to the target object.
- the uninstall control 803 is used to uninstall the browser application of the mobile phone 100 .
- an application program icon (such as the gallery icon 701F) in the Dock bar 403 can also receive the user's hover operation (such as hovering the mouse cursor over the application program icon); in response to the above input operation, if the application corresponding to the application icon is running in the background, the computer 200 may display a thumbnail of that application, where the thumbnail is a thumbnail of a recently run user interface of the application.
- the user hovers the mouse cursor on the gallery icon 701F and the computer 200 displays a thumbnail 804 of the gallery.
- the lock screen interface of the extended screen desktop of the mobile phone 100 is consistent in style with the lock screen interface of the standard desktop.
- when the extended screen desktop is locked, the mobile phone 100 also locks the standard desktop, and vice versa.
- the computer 200 displays the lock screen interface 31 of the extended screen desktop shown in FIG. , and the lock screen interface 31 includes the lock screen wallpaper 901; in response to the above lock screen command, the mobile phone 100 displays the lock screen interface 32 shown in FIG. 10B, and the lock screen interface 32 includes the lock screen wallpaper 902.
- the mobile phone 100 displays the lock screen interface 32 shown in FIG. 10B, and sends a lock screen command to the computer 200;
- the lock screen wallpaper 901 and the lock screen wallpaper 902 are from the same picture.
- the lock screen wallpaper 901 and the lock screen wallpaper 902 are cropped from the same picture.
- in other embodiments, the mobile phone 100 does not lock the standard desktop following the extended screen desktop, and vice versa.
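The two locking behaviors above (synchronized locking versus independent locking of the two desktops) can be illustrated with a minimal sketch. All class and method names here are hypothetical and only model the behavior described in the text, not the actual system code:

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical model: when lock-screen sync is enabled, locking either
// desktop locks the other; when disabled, each Display locks independently.
public class LockSync {
    final Map<Integer, Boolean> locked = new HashMap<>();
    final boolean syncEnabled;

    LockSync(boolean syncEnabled) {
        this.syncEnabled = syncEnabled;
        locked.put(0, false); // Display0: standard desktop
        locked.put(1, false); // Display1: extended screen desktop
    }

    void lock(int displayId) {
        locked.put(displayId, true);
        if (syncEnabled) {
            // The other desktop follows the lock.
            for (Integer id : locked.keySet()) locked.put(id, true);
        }
    }

    // Locks the extended screen desktop and reports whether the
    // standard desktop (Display0) ended up locked as well.
    static boolean demo(boolean sync) {
        LockSync m = new LockSync(sync);
        m.lock(1);
        return m.locked.get(0);
    }

    public static void main(String[] args) {
        System.out.println(demo(true));  // synchronized locking
        System.out.println(demo(false)); // independent locking
    }
}
```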
- the screen projection method provided by the embodiment of the present application involves modifications to multiple system applications (including the desktop launcher (Launcher), SystemUI, the file manager (FileManager), global search (Hisearch), wireless screen projection (AirSharing), etc.) and to the system framework. Among them:
- the desktop launcher is used to manage the desktop layout, the Dock bar, the application list, multitasking, and so on.
- SystemUI is a UI component that provides users with system-level information display and interaction, and is used to manage status bar, navigation bar, notification center, lock screen interface and wallpaper, etc.
- the file manager is used to provide external file boxes, manage desktop files, provide external file operation capabilities, and implement file drag and drop between applications.
- Global search is used to realize local and online global search.
- the wireless screen projection is used to realize the wireless screen projection between the mobile phone 100 and the target screen projection device (such as the computer 200).
- the software system of the mobile phone 100 may adopt a layered architecture, an event-driven architecture, a micro-kernel architecture, a micro-service architecture, or a cloud architecture.
- the embodiment of the present application takes the Android system with a layered architecture as an example to illustrate the software structure of the electronic device 100 .
- FIG. 11 shows a block diagram of the software architecture of the mobile phone 100 provided by the embodiment of the present application.
- the layered architecture divides the software into several layers, and each layer has a clear role and division of labor. Layers communicate through software interfaces.
- the Android system can be divided into an application layer, an application framework layer (Framework) and a system library (Native) from top to bottom.
- the Android Runtime includes the core library and the virtual machine, and is responsible for the scheduling and management of the Android system.
- the core library consists of two parts: one part is the functions that the Java language needs to call, and the other part is the core library of Android.
- the application layer and the application framework layer run in virtual machines.
- the virtual machine executes the java files of the application program layer and the application program framework layer as binary files.
- the virtual machine is used to perform functions such as object life cycle management, stack management, thread management, security and exception management, and garbage collection.
- the application layer includes a series of application packages (Android application package, APK), such as desktop launcher apk (HWLauncher6.apk), system interface apk (SystemUI.apk), file manager apk (FileManager.apk), global search apk (Hisearch.apk) and wireless sharing apk (AirSharing.apk), etc.
- the desktop launcher apk includes a standard desktop launcher (UniHomelauncher) and an extended screen desktop launcher (PcHomelauncher).
- the system interface apk (SystemUI.apk) includes the standard status bar (StatusBar) and the extended screen status bar (PcStatusBar).
- the file manager apk (FileManager.apk) includes standard interface, column interface and file box.
- the global search apk (Hisearch.apk) includes the standard search interface and the extended screen search interface.
- the wireless sharing apk (AirSharing.apk) includes ordinary wireless screen projection and self-developed display high-definition screen projection.
- the extended screen desktop may also be called the Pc desktop
- the extended screen status bar may also be called the Pc status bar, which are not specifically limited here.
- the desktop launcher supports running multiple desktop instances, such as UniHomelauncher and PcHomelauncher; UniHomelauncher is used to start the standard desktop displayed on the mobile phone 100, and PcHomelauncher is used to start the extended screen desktop projected to the computer 200; the extended screen desktop is a PC-style desktop adapted to the computer 200.
- the standard desktop is displayed based on the application window running in the display area (Display0) of the default screen
- the extended screen desktop is displayed based on the application window running in the display area (Display1) of the extended screen.
- SystemUI supports running multiple status bar instances, such as the standard status bar and the extended screen status bar.
- the DisplayId corresponding to the standard status bar is the ID of Display0
- the DisplayId corresponding to the extended screen status bar is the ID of Display1.
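The multi-instance design above (one status bar per logical display, distinguished by DisplayId) can be sketched as follows. The registry class and its methods are invented for illustration and are not part of the actual SystemUI code:

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical sketch: one status bar instance per logical display,
// keyed by DisplayId (Display0 = standard desktop, Display1 = extended screen).
public class StatusBarRegistry {
    static final int DISPLAY0 = 0; // default screen
    static final int DISPLAY1 = 1; // extended screen

    static class StatusBar {
        final int displayId;
        final String name;
        StatusBar(int displayId, String name) {
            this.displayId = displayId;
            this.name = name;
        }
    }

    private final Map<Integer, StatusBar> bars = new HashMap<>();

    void addStatusBar(StatusBar bar) { bars.put(bar.displayId, bar); }
    StatusBar get(int displayId) { return bars.get(displayId); }

    static String demo() {
        StatusBarRegistry registry = new StatusBarRegistry();
        registry.addStatusBar(new StatusBar(DISPLAY0, "StatusBar"));   // standard
        registry.addStatusBar(new StatusBar(DISPLAY1, "PcStatusBar")); // extended
        return registry.get(DISPLAY1).name;
    }

    public static void main(String[] args) {
        System.out.println(demo()); // PcStatusBar
    }
}
```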
- the newly added column interface and file box in the file manager are responsible for extended screen desktop file operations (such as copying, pasting, moving, deleting, restoring, and dragging).
- Global search also supports running multiple search interface instances, for example, the standard search interface corresponding to the standard desktop and the extended screen search interface corresponding to the extended screen desktop.
- the application layer also includes a toolkit (Kit), which includes a software development kit (HwSDK), a user interface kit (Uikit) and a screen casting protocol kit (Cast+kit).
- the HwSDK is newly added to the Kit.
- HwSDK is a collection of development tools for building application software for the software packages, software frameworks, hardware platforms, and operating systems involved in this application.
- the embodiment of this application modifies Uikit and Cast+kit, so that Uikit can enhance the capabilities of native controls on the extended screen desktop, optimize the text right-click menu, and dynamically optimize operations such as mouse click and mouse hover (Hover), and so that Cast+kit can project the extended screen desktop to the high-definition display.
- the application framework layer provides an application programming interface (application programming interface, API) and a programming framework for applications in the application layer.
- the application framework layer includes some predefined functions.
- the application framework layer adds a self-developed display projection service (HwProductiveMultiWindowManager) and a Dock bar management service (DockBarManagerService); the self-developed display projection service is used to run multiple application windows of the same application at the same time and to project a specified window among the above multiple application windows to a target screen projection device (such as the computer 200).
- the Dock bar management service is used to manage the Dock bar.
- the embodiment of the present application also modifies the display management service (DisplayManagerService, DMS), the input and output service (InputManagerService, IMS), and the parallel view service (HwPartsMagicWindow).
- the DMS is used to manage the life cycle of interface display, control the logical display of the currently connected default screen display area (Display0) and extended screen display area (Display1), and send notifications to the system and applications when the display status of Display0 and Display1 changes.
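The notification behavior just described (DMS informing registered parties when a display's status changes) is a classic observer pattern. The sketch below models it with invented names; the real DisplayManagerService API differs:

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch: when the state of Display0 or Display1 changes,
// registered listeners (system components, applications) are notified.
public class DisplayStateNotifier {
    interface DisplayListener {
        void onDisplayChanged(int displayId, String newState);
    }

    private final List<DisplayListener> listeners = new ArrayList<>();

    void registerListener(DisplayListener l) { listeners.add(l); }

    // Called by the (modeled) DMS when a logical display changes state.
    void notifyDisplayChanged(int displayId, String newState) {
        for (DisplayListener l : listeners) {
            l.onDisplayChanged(displayId, newState);
        }
    }

    static List<String> demo() {
        DisplayStateNotifier dms = new DisplayStateNotifier();
        List<String> log = new ArrayList<>();
        dms.registerListener((id, state) -> log.add("listener1: Display" + id + " -> " + state));
        dms.registerListener((id, state) -> log.add("listener2: Display" + id + " -> " + state));
        dms.notifyDisplayChanged(1, "CONNECTED"); // extended screen comes online
        return log;
    }

    public static void main(String[] args) {
        System.out.println(demo());
    }
}
```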
- the IMS is used to manage the input and output of the mobile phone 100; the input and output devices of the mobile phone 100 may include a printer, a keyboard, a mouse, a hard disk, a magnetic disk, a writable optical disc, and the like.
- the parallel view service is used to implement the application split-screen function, that is, two user interfaces corresponding to two different activities of one application can be displayed at the same time; the above two user interfaces can be displayed on the same display screen or on different display screens, and the parallel view service provided in the embodiment of the present application supports displaying the user interfaces through a floating window.
- the mobile phone 100 also modifies the package management service (PackageManagerService, PMS) and the wallpaper service (WallpaperService) of the application framework layer, so that a selection strategy for the default Launcher (i.e., UniHomelauncher) is added to the package management service, and WallpaperService supports displaying wallpapers on multiple Displays.
- a system library can include multiple function modules.
- the system library may include an event dispatcher (InputDispatcher), an audio system (AudioSystem), and a screencasting protocol (Huawei Cast+), etc.
- FIG. 12 shows another software system framework provided by the embodiment of the present application.
- SystemUI.apk includes a standard status bar and an extended screen status bar.
- SystemUI.apk may specifically include: the system interface application (SystemUIApplication), and the base classes (Base Class), services (Service), components (Component), dependency class providers (Dependency Providers), and UI control classes (UI Control) used to implement the standard status bar.
- SystemUIApplication is a subclass of Application and is responsible for the initialization of all SystemUI components.
- the base class includes the system interface factory class (SystemUIFactory), the system interface root component (SystemUIRootComponent) and so on.
- SystemUIFactory is used to create SystemUI components
- SystemUIRootComponent is used to implement the initialization of the dependency injection framework (dagger).
- Services include system interface services (SystemUIService).
- SystemUIService is used to initialize a series of SystemUI components. When it starts, the service instantiates the sub-services defined in the service list of SystemUIService one by one, and runs each sub-service by calling its start() method.
- the above-mentioned sub-services are all inherited from the SystemUI abstract class.
- the status bar and the navigation bar are sub-services in the above-mentioned service list.
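The startup pattern described above (a service list of sub-services, all derived from a common SystemUI base class, each started via start()) can be sketched as follows. This is a simplified model, not the actual AOSP code:

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch: SystemUIService walks a service list, instantiates
// each sub-service, and calls start() on each in turn.
public class SystemUiStartup {
    static abstract class SystemUI {  // common abstract base of all sub-services
        abstract void start();
    }

    static final List<String> started = new ArrayList<>();

    static class StatusBar extends SystemUI {
        void start() { started.add("StatusBar"); }
    }
    static class NavigationBar extends SystemUI {
        void start() { started.add("NavigationBar"); }
    }

    static List<String> demo() {
        started.clear();
        List<SystemUI> serviceList = new ArrayList<>();
        serviceList.add(new StatusBar());
        serviceList.add(new NavigationBar());
        for (SystemUI s : serviceList) {
            s.start(); // run the sub-services one by one
        }
        return started;
    }

    public static void main(String[] args) {
        System.out.println(demo()); // [StatusBar, NavigationBar]
    }
}
```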
- the components include CommandQueue, Dependency, KeyguardViewMediator, Notification, Systembars, Statusbar, and more. Among them:
- CommandQueue is a Binder class used to handle requests related to the status bar and the notification center. It is registered with the status bar management service (StatusBarManagerService) by the status bar to receive messages from StatusBarManagerService; CommandQueue maintains an internal event queue, and the status bar service (StatusBarService) implements the Callbacks callback in CommandQueue.
- the embodiment of the present application modifies the CommandQueue component so that it supports message distribution of multiple StatusBars (that is, the standard status bar and the status bar of the extended screen).
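The modification just described can be illustrated with a sketch in which the queue keeps callbacks for several status bars and delivers each message to the callback whose DisplayId matches. The names are illustrative, not the actual SystemUI CommandQueue API:

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch of a CommandQueue that supports message distribution
// to multiple status bars (standard status bar and extended screen status bar).
public class MultiBarCommandQueue {
    interface Callbacks {
        int displayId();
        void onMessage(String msg);
    }

    private final List<Callbacks> callbacks = new ArrayList<>();

    void registerCallbacks(Callbacks cb) { callbacks.add(cb); }

    // Distribute a message only to status bars on the target display.
    void dispatch(int targetDisplayId, String msg) {
        for (Callbacks cb : callbacks) {
            if (cb.displayId() == targetDisplayId) {
                cb.onMessage(msg);
            }
        }
    }

    static List<String> demo() {
        List<String> log = new ArrayList<>();
        MultiBarCommandQueue queue = new MultiBarCommandQueue();
        queue.registerCallbacks(new Callbacks() { // standard status bar, Display0
            public int displayId() { return 0; }
            public void onMessage(String msg) { log.add("StatusBar: " + msg); }
        });
        queue.registerCallbacks(new Callbacks() { // extended screen status bar, Display1
            public int displayId() { return 1; }
            public void onMessage(String msg) { log.add("PcStatusBar: " + msg); }
        });
        queue.dispatch(1, "setIcon(wifi)"); // only the Display1 bar receives it
        return log;
    }

    public static void main(String[] args) {
        System.out.println(demo());
    }
}
```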
- SystemBars inherits from the base class SystemUI and is the entry class for creating the entire SystemUI view.
- the standard status bar is mainly used to display application notification icons (Icon) and system status icons (alarm clock icon, wifi icon, SIM card icon, system time icon, etc.) on the standard desktop, and to control and manage the above icons.
- Notification is used to display notification information on the status bar, and to control and manage the above notification information.
- KeyguardViewMediator is the core class of the lock screen; other lock screen objects interact with each other through KeyguardViewMediator, which is the management class of state callbacks. All calls from the lock screen service (KeyguardService) are transferred to the UI thread by KeyguardViewMediator.
- the dependency class providers include the status bar window control class (StatusBarWindowController), the status bar icon controller implementation class (StatusBarIconControllerImpl), the status bar policy (StatusBarPolicy), and so on. Among them:
- StatusBarWindowController is used to manage the status bar window view (StatusBarWindowView), and can call the WindowManager interface to display the standard status bar.
- the StatusBarWindowController component is modified to support adding, deleting and managing multiple status bar objects.
- StatusBarIconControllerImpl is used to manage application icons in the status bar, including icon size, position and color changes.
- StausBarPolicy is used to manage the display policy of the status bar (such as updating the icon of the status bar, display time, display position, etc.).
- StatusBarPolicy is a policy management class, the actual function is realized by StatusBarService.
- UI control classes include status bar window view (StatusBarWindowView), notification panel view (NotificationPanelView), quick setting fragment (QSFragment), status bar fragment (StatusBarFragment), status bar view (StatusBarView) and so on.
- StatusBarWindowView is used to determine the root layout when the status bar is not expanded, creating a status bar window view.
- StatusBarView is responsible for creating and instantiating the entire SystemUI view (including status bar, notification center and lock screen interface, etc.), and StatusBarView defines the name, display order, and portable network graphics (Png) of the icons displayed in the status bar.
- when the StatusBarService is initialized, a StatusBarView for displaying the status bar is initialized; the StatusBarService initializes the status bar by calling the makeStatusBarView method.
- NotificationPanelView is the control class of the notification center after the status bar is pulled down.
- QSFragment is the control class of the control center after the status bar is pulled down.
- StatusBarFragment manages the status bar in the contracted state and is responsible for the life cycle management of the icons in the status bar.
- the embodiment of the present application adds services, components, dependency class providers, and UI control classes for implementing the extended screen status bar to the system interface apk. Among them:
- the services used to implement the status bar of the extended screen include ProductiveService, and ProductiveService inherits from Service.
- the components used to realize the status bar of the extended screen include Pc dependency class (PcDependency), Pc system provider (PcSystemProviders), Pc system bar (PcSystembars), and the extended screen status bar.
- PcDependency, PcSystembars, and the status bar of the extended screen are all inherited from the corresponding components of the aforementioned standard status bar, and implement similar functions for the status bar of the extended screen, which will not be repeated here.
- the dependency class providers used to implement the status bar of the extended screen include the Pc status bar window control class (PcStatusBarWindowController), the screen control class (ScreenController), the lock screen control class (KeyguardController), and the remote control class (RemoteController).
- PcStatusBarWindowController inherits from the aforementioned StatusBarWindowController of the standard status bar, and implements similar functions for the status bar of the extended screen, which will not be repeated here.
- the screen control class (ScreenController) is used to control the on/off of the screen on the projection side.
- keyguard components such as the lock screen control class (KeyguardController) are modified to support lock screens on multiple Displays.
- the remote control class (RemoteController) is used to communicate with the application framework layer (Framework).
- the UI control classes used to implement the status bar of the extended screen include the Pc status bar window view (PcStatusBarWindowView), Pc notification panel view (PcNotificationPanelView), Pc quick settings fragment (PcQSFragment), Pc status bar fragment (PcStatusBarFragment), Pc status bar view (PcStatusBarView), and so on; the newly added UI control classes all inherit from the corresponding control classes of the aforementioned standard status bar, and implement similar functions for the status bar of the extended screen, which will not be repeated here.
- the embodiment of the present application also makes corresponding modifications to the window management service (WMS), the status bar management service (StatusBarManagerService), the wallpaper management service (WallpaperManagerService), the notification management service (NotificationManagerService, NMS), and the PC module (HwPartsPowerOffice).
- the window management service (WindowManagerService, WMS) includes a window management policy (WindowManagerPolicy) and a lock screen service agent (KeyguardServiceDelegate). WMS supports running multiple application windows for the same application at the same time.
- StatusBarManagerService supports registration and management of multiple status bars (such as standard status bar and extended screen status bar).
- StatusBarManagerService is the manager of StatusBarService.
- StatusBarService is used to load, update and delete icons in the status bar, interact with applications in the status bar, and process notification information.
- WallpaperManagerService supports displaying wallpapers on multiple Displays (for example, Display0 corresponding to the standard desktop and Display1 corresponding to the extended screen desktop).
- NMS supports the UX style of PCs, and supports the management of notification centers with multiple pull-down status bars.
- HwPartsPowerOffice is used to modify the screen projection entry.
- the Pc management service (HwPCManagerService) is added in HwPartsPowerOffice to load the extended screen desktop and the extended screen status bar in the screen projection scenario.
- the inheritance involved in this application means that a subclass inherits the characteristics and behaviors of its parent class, so that a subclass object (instance) has the instance fields and methods of the parent class; or the subclass inherits methods from the parent class, so that the subclass has the same behavior as the parent class.
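The inheritance relationship just defined, as used by the Pc* classes, can be shown minimally: the subclass reuses the parent's fields and methods and overrides only what differs on the extended screen desktop. The class names mirror the document, but the bodies are invented for illustration:

```java
// Minimal illustration of the inheritance described in the text.
public class InheritanceDemo {
    static class StatusBarView {
        final String iconSet = "phone-icons"; // instance field inherited by subclasses

        String layout() { return "phone-status-bar"; }
        String describe() { return layout() + "/" + iconSet; }
    }

    // PcStatusBarView reuses describe() and iconSet from the parent and
    // overrides only the layout for the PC-style extended screen desktop.
    static class PcStatusBarView extends StatusBarView {
        @Override
        String layout() { return "pc-status-bar"; }
    }

    static String demo() {
        StatusBarView pcBar = new PcStatusBarView();
        return pcBar.describe(); // parent method, subclass override in effect
    }

    public static void main(String[] args) {
        System.out.println(demo()); // pc-status-bar/phone-icons
    }
}
```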
- the system architecture diagram shown in FIG. 12 also includes an interface layer (Interface), and the Interface layer includes Android Interface Definition Language (Android Interface Definition Language, AIDL) and broadcast (Broadcast).
- AIDL is a communication interface between different processes; Broadcast is used to send and receive broadcasts, which can realize the transmission of messages.
- FIG. 12 is only an illustration; this embodiment of the application is not limited to the SystemUI components and SystemUI subclasses shown in FIG. 12, and may also include other SystemUI components and subclasses, which are not specifically limited here.
- FIG. 13 shows another software system framework provided by the embodiment of the present application.
- the desktop launcher apk includes UniHomelauncher and PcHomelauncher.
- the desktop launcher apk specifically includes common classes (Common Class), UI control classes (UI Control), services (Service), and an activity stack.
- this application modifies the following common classes (Common Class): the desktop launcher provider (LauncherProvider), the database helper (DatabaseHelper), the desktop launcher settings class (LauncherSettings), and the desktop launcher constants class (LauncherConstants).
- LauncherProvider is the database of the desktop launcher. It is the database content provider for the application icons of the multiple desktop instances (such as the standard desktop and the extended screen desktop) running on the desktop launcher, and enables other applications to access and operate on the data in the desktop launcher.
- DatabaseHelper is responsible for database creation and maintenance.
- LauncherSettings implements the string definitions of database items, and, through the internal class Favorites, provides URIs for operating LauncherProvider and the field names of the corresponding fields in the database.
- LauncherConstants maintains and manages constants in the application.
- this application adds the Pc layout configuration (PclayoutConfig), the Pc device profile (PcDeviceProfile), the Pc grid (cell) counter (PcCellNumCalculator), the Pc desktop launcher policy (PcLauncherPolicy), the Pc desktop launch model (PcLauncherModel), the Pc loading task (PcLoaderTask), and so on.
- PclayoutConfig is used to set the layout attributes (such as width and height) and parameters of the extended screen desktop, and the display effect of the components in the extended screen desktop can be constrained by specifying the layout attributes.
- PcDeviceProfile is used to define the basic attributes of each module in the extended screen desktop, responsible for the initialization of attribute values, and setting the margin (padding) of each element layout, etc.
- PcCellNumCalculator is a desktop icon layout strategy class.
- PcLauncherPolicy manages the display policy of the extended screen desktop.
- PcLauncherModel is a data processing class, which is used to save the desktop state of the extended screen desktop, provide APIs for reading and writing databases, and update the databases when deleting, replacing, and adding applications.
- PcLoaderTask is used to load the extended screen desktop.
- this application also adds, among the UI control classes, the Pc drag layer (PcDraglayer), the Pc desktop workspace (PcWorkSpace), the Pc cell layout (PcCelllayout), the Pc dock view (PcDockview), the Pc folder (PcFolder), the Pc folder icon (PcFolderIcon), and so on.
- the newly added control classes are all inherited from the corresponding control classes of the standard desktop, and are used to realize similar functions on the extended screen desktop.
- PcDraglayer is a view group (ViewGroup) responsible for distributing events, which is used to initially process the events of the extended screen desktop and distribute them according to the situation.
- DragLayer contains the desktop layout (Workspace), the navigation points (QuickNavigationView), the Dock area (Hotseat), and the recent task list (OverviewContainer); Hotseat is the container responsible for managing the Dock.
- PcWorkSpace is a subclass of PagedView, which consists of multiple CellLayouts, and each CellLayout represents a split screen. PcWorkSpace is used to realize the sliding function of the split screen on the extended screen desktop.
- PcCellLayout is used to manage the display and layout of the split-screen icons of the extended screen desktop.
- PcDockview is used to implement the Dock layout on the projection side.
- PcFolder is used to realize the folder of the extended screen desktop (including the folder created by the user and the folder that comes with the system).
- the activity stack in the embodiment of the present application supports managing the task stack corresponding to the extended screen desktop, and a Pc desktop service (PcHomeService) is added to the services.
- PcHomeService is used to start the extended screen desktop.
- the embodiment of the present application also makes corresponding modifications to the activity management service (ActivityManagerService, AMS) in the application framework layer, so that AMS supports running multiple application windows of the same application at the same time, for example, supporting the simultaneous running of the above two desktop instances.
- a Dock bar is added to the extended screen desktop; correspondingly, an IdockBar.aidl is added to the desktop launcher to provide a Binder interface for the framework layer to operate the Dock bar.
- the activity stack includes a task stack corresponding to the default screen and a task stack corresponding to the extended screen
- the mobile phone 100 can independently maintain the task stack corresponding to the default screen and the task stack corresponding to the extended screen.
- the mobile phone 100 distinguishes the two task stacks through DisplayId.
- the DisplayId of the task stack corresponding to the default screen is the ID of Display0
- the DisplayId of the task stack corresponding to the extended screen is the ID of Display1.
- the task stack corresponding to Display1 runs a desktop task stack (HomeStack) and N application stacks (AppStack), namely Stack1 to StackN; the HomeStack includes one or more desktop tasks (Task), and an AppStack includes one or more application tasks. The task stack corresponding to Display0 runs a HomeStack and an AppStack, and the AppStack includes one or more application tasks (such as Task-A and Task-B). A Task includes one or more Activities.
- PcHomelauncher runs the HomeStack of the extended screen of the mobile phone 100 through Display1
- UniHomeLauncher runs the HomeStack of the default screen of the mobile phone 100 through Display0.
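The stack layout described above (each Display independently owning a HomeStack plus application stacks, where a stack holds Tasks and a Task holds Activities) can be modeled with a small data structure. This is purely illustrative; the names do not come from the real AMS/WMS code:

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Hypothetical model of per-display task stacks distinguished by DisplayId.
public class TaskStackModel {
    static class Task {
        final List<String> activities = new ArrayList<>();
        Task(String... names) { for (String n : names) activities.add(n); }
    }
    static class Stack {
        final String name;
        final List<Task> tasks = new ArrayList<>();
        Stack(String name) { this.name = name; }
    }

    // Task stacks are maintained independently per DisplayId.
    final Map<Integer, List<Stack>> stacksByDisplay = new HashMap<>();

    Stack addStack(int displayId, String name) {
        Stack s = new Stack(name);
        stacksByDisplay.computeIfAbsent(displayId, k -> new ArrayList<>()).add(s);
        return s;
    }

    static TaskStackModel demo() {
        TaskStackModel m = new TaskStackModel();
        // Display0: standard desktop (HomeStack + one AppStack with two tasks)
        m.addStack(0, "HomeStack").tasks.add(new Task("UniHomeLauncherActivity"));
        Stack app0 = m.addStack(0, "AppStack");
        app0.tasks.add(new Task("ActivityA")); // Task-A
        app0.tasks.add(new Task("ActivityB")); // Task-B
        // Display1: extended screen desktop (HomeStack + Stack1)
        m.addStack(1, "HomeStack").tasks.add(new Task("PcHomeLauncherActivity"));
        m.addStack(1, "Stack1").tasks.add(new Task("GalleryActivity"));
        return m;
    }

    public static void main(String[] args) {
        TaskStackModel m = demo();
        System.out.println(m.stacksByDisplay.get(1).size()); // stacks on Display1
    }
}
```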
- after the WMS obtains an input event through the pointer event listener (TapPointerListener), it can dispatch input events acting on the extended screen to Display1, and input events acting on the default screen to Display0.
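The routing rule just stated can be condensed into a small sketch. Here the event carries a flag saying which screen it acts on; the class and field names are invented and do not reflect the real TapPointerListener API:

```java
// Hypothetical sketch: events acting on the extended screen are dispatched
// to Display1, events acting on the default screen to Display0.
public class InputRouter {
    static class InputEvent {
        final boolean onExtendedScreen;
        final String action;
        InputEvent(boolean onExtendedScreen, String action) {
            this.onExtendedScreen = onExtendedScreen;
            this.action = action;
        }
    }

    // Returns the DisplayId the event is dispatched to.
    static int dispatch(InputEvent e) {
        return e.onExtendedScreen ? 1 : 0; // Display1 vs Display0
    }

    public static void main(String[] args) {
        System.out.println(dispatch(new InputEvent(true, "click")));  // 1
        System.out.println(dispatch(new InputEvent(false, "click"))); // 0
    }
}
```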
- the embodiment of the present application modifies the software system framework so that it supports running multiple desktop instances and multiple status bar instances on different Displays.
- the screen projection method provided in the embodiment of the present application includes:
- after the mobile phone 100 is connected to the display of the target screen projection device (for example, the computer 200), it receives a screen projection instruction for confirming the projection.
- the mobile phone 100 adds an extended screen status bar and its related logic control classes through the SystemUI, and the DisplayId corresponding to the extended screen status bar is the ID of the display area Display1 of the extended screen.
- the mobile phone 100 In response to the above screen projection instruction, the mobile phone 100 also adds an extended screen desktop and its related logic control classes through the Launcher, and the DisplayId corresponding to the extended screen desktop is the ID of the display area Display1 of the extended screen.
- in response to the above screen projection instruction, the mobile phone 100 starts the ProductiveService service through HwPCManagerService to load the extended screen status bar, and starts the PcHomeService service through HwPCManagerService to load the extended screen desktop.
- FIG. 15A shows a software implementation process of the SystemUI involved in the embodiment of the present application, and the implementation process includes:
- when the mobile phone 100 is turned on, in the non-screen projection mode, it starts a Zygote process, creates a virtual machine instance through the Zygote process, and executes a system service (SystemServer).
- SystemServer starts a series of services required for system operation, including SystemUIService.
- SystemUIService starts SystemUI and calls SystemUIApplication.
- SystemServer starts the boot services, core services and other services respectively, and starts SystemUI and Launcher by calling the mActivityManagerService.systemReady() method in the startOtherServices method.
- SystemUIApplication starts the SystemUI component through the startServicesIfNeeded function, and the SystemUI component includes Systembars.
- the configuration item config_systemUIServiceComponents is read, and each component (including SystemBars) is loaded.
- when the SystemBars component is loaded, it reads the configuration item (config_statusBarComponent), determines the control class of the status bar according to this configuration item, and then starts either the extended screen status bar (PcStatusBar) or the standard status bar (StatusBar) according to the determined control class.
- if the value of config_statusBarComponent is com.android.systemui.statusbar.phone.PhoneStatusBar, SystemBars determines that the control class of the status bar is StatusBar; if the value is com.android.systemui.statusbar.tablet.TabletStatusBar, SystemBars determines that the control class of the status bar is PcStatusBar.
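The selection rule above can be sketched as a simple mapping from the configured component name to the control class that gets started. This is a minimal illustration only; the real SystemUI instantiates the configured class via reflection rather than returning its name.

```java
// Sketch of SystemBars' selection logic: the value of config_statusBarComponent
// decides which status bar control class is used.
public class StatusBarSelector {
    public static String controlClassFor(String configStatusBarComponent) {
        switch (configStatusBarComponent) {
            case "com.android.systemui.statusbar.phone.PhoneStatusBar":
                return "StatusBar";    // standard status bar
            case "com.android.systemui.statusbar.tablet.TabletStatusBar":
                return "PcStatusBar";  // extended screen status bar
            default:
                throw new IllegalArgumentException(
                        "unknown status bar component: " + configStatusBarComponent);
        }
    }
}
```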
- the StatusBarWindowController calls the interface of the WindowManager, adds the standard status bar to the WMS, and then adds the standard status bar to Display0.
- the CommandQueue object will be passed to the StatusBarManagerService and saved as mBar.
- after the client obtains the interface of the StatusBarManagerService system service through the ServiceManager, it can call methods of the CommandQueue object through mBar, and the CommandQueue calls messages back to the StatusBar through the callback interface, so that the StatusBar can be updated.
- the standard status bar calls the status bar prompt manager (StatusBarPromptManager) to register the screen projection broadcast (Broadcast).
- when the mobile phone 100 accesses the display (DisplayDevice) of the target screen projection device (that is, the aforementioned extended screen), it calls the DMS.
- the mobile phone 100 can access the target screen projection device through a wired connection, or access the target screen projection device through wireless communication technologies such as NFC, WIFI, or Bluetooth.
- the DMS triggers a display event (OnDisplayevent), creates a logical display area Display1 corresponding to the extended screen, and calls a display management object (DisplayManagerGlobal) to obtain the DisplayId of Display1.
- the DMS calls the handleDisplayDeviceAddedLocked function to generate a corresponding logical device (LogicalDevice) for the display (DisplayDevice), adds the LogicalDevice to the logical device list (mLogicalDevices) managed by the DMS, adds the DisplayDevice to the display list (mDisplayDevices) of the DMS, and generates a logical display area (LogicalDisplay) (i.e., the aforementioned extended screen display area Display1) for the DisplayDevice.
- DMS calls DisplayManagerGlobal to determine the DisplayId of the logical display area.
- DisplayManagerGlobal sends the above-mentioned display event to a display listener delegate (DisplaylistenerDelegate), and adds a display listener delegate to Display1.
- DisplayManagerGlobal is mainly responsible for managing the communication between Display Manager and DMS.
- DisplaylistenerDelegate sends a notification of adding a display area (onDisplayAdded).
- RootActivityContainer starts HwPCManagerService to load the status bar of the extended screen.
- hwPartsPowerOffice adds a modification to HwPCManagerService, which is used to start the extended screen desktop and the extended screen status bar in the screen projection scenario.
- HwPCManagerService sends the screen projection capsule prompt broadcast to the status bar prompt manager (StatusBarPromptManager).
- HwPCManagerService sends a screen projection notification message to NotificationManagerService.
- the HwPCManagerService receives an instruction to switch modes, and the instruction is used to instruct switching from the current non-screen projection mode to the screen projection mode.
- after receiving the capsule reminder broadcast and the screen projection notification message, the SystemUI displays the capsule reminder in the status bar, and displays the screen projection notification message in the notification center.
- the capsule prompt is the prompt box 13 shown in FIG. 3E .
- the HwPCManagerService acquires an instruction to switch modes.
- when the mobile phone 100 receives the NFC connection response sent by the computer 200, the HwPCManagerService receives an instruction to switch modes.
- HwPCManagerService calls bindService to start ProductiveService.
- ProductiveService calls PcSystembars to enable the extended screen status bar.
- Systembars creates an extended screen status bar based on the configuration file.
- Systembars reads the configuration config_PcstatusBarComponent, determines that the control class of the status bar is PcStatusBar, and then creates an extended screen status bar.
- the extended screen status bar calls the callback interface of CommandQueue to add a callback for the extended screen status bar.
- the StatusBarWindowController calls the interface of the WindowManager to add the extended screen status bar to the WMS, and then adds the extended screen status bar to Display1.
- FIG. 15B exemplarily shows the software implementation of StatusBar.
- the CommandQueue only needs to support one status bar instance in the non-screen projection mode, that is, the standard status bar or the extended screen status bar. It can be understood that the mobile phone only needs to support the standard status bar in the non-screen projection mode, and the computer only needs to support the extended screen status bar in the non-screen projection mode. CommandQueue needs to support multiple status bar instances in screen projection mode, and save the callbacks of the above multiple status bar instances. Exemplarily, when the mobile phone 100 screen is projected to the computer 200, CommandQueue needs to support the standard status bar corresponding to the default screen display area (Display0) of the mobile phone 100, and the extended screen status bar corresponding to the extended screen display area (Display1) of the mobile phone 100.
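The multi-instance CommandQueue described above can be modeled as a dispatcher that keeps one callback list per DisplayId and fans each message out only to the status bar instance of the right display. The interface and method names below are assumptions made for the sketch, not the actual SystemUI API.

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Model of a CommandQueue extended for projection mode: Display0 maps to the
// standard status bar, Display1 to the extended screen status bar, and each
// registered callback only receives messages for its own display.
public class MultiDisplayCommandQueue {
    public interface Callbacks {
        void onStatusBarMessage(String msg);
    }

    private final Map<Integer, List<Callbacks>> callbacksByDisplay = new HashMap<>();

    // Called by each status bar instance during initialization
    // (compare the "add a callback" steps in the text above).
    public void addCallback(int displayId, Callbacks cb) {
        callbacksByDisplay.computeIfAbsent(displayId, k -> new ArrayList<>()).add(cb);
    }

    // Delivers a message only to the status bar(s) of the given display.
    public void dispatch(int displayId, String msg) {
        for (Callbacks cb : callbacksByDisplay.getOrDefault(displayId, List.of())) {
            cb.onStatusBarMessage(msg);
        }
    }
}
```

Saving the callbacks per display, rather than keeping a single callback, is what lets the non-projection code path (one status bar) and the projection code path (two status bars) share the same queue.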
- in screen projection mode, the Context of each status bar needs to be adjusted to ensure that each status bar obtains the corresponding Display information (such as the DisplayId); StatusBarWindowController also needs to support adding multiple status bars; and when WindowManagerService adds a window through addWindow, the DisplayId of the window's WindowState needs to be modified to ensure that the window is correctly displayed on the corresponding display screen.
- the standard status bar can be displayed on the default screen of the mobile phone 100
- the extended screen status bar can be displayed on the extended screen of the mobile phone 100 (ie, the display screen of the computer 200).
- SystemBars calls the start() function of the standard status bar to initialize it, sets the Callback through the CommandQueue component, then calls the Binder interface, and registers the IStatusBar object corresponding to the standard status bar to the StatusBarManagerService for management through the registerStatusBar function.
- when the mobile phone 100 is connected to the display (that is, the computer 200), the DMS receives the OnDisplayDeviceEvent through the input channel and distributes the display event (DisplayEvent) based on the callback record (CallbackRecord). After the mobile phone 100 switches from the non-projection mode to the projection mode, the DMS creates the logical display area Display1 corresponding to the extended screen and calls DisplayManagerGlobal to obtain the DisplayId of Display1; DisplayManagerGlobal sends the above display event to DisplaylistenerDelegate to add a display monitoring agent to Display1.
- the SystemUI component reads the configuration item config_PcsystemUIServiceComponents of the configuration file Config.xml, and after SystemBars is started, the status bar of the extended screen is started according to the configuration item config_PcstatusBarComponent.
- SystemBars calls the start() function of the extended screen status bar to initialize it, sets the Callback through the CommandQueue component, and then calls the Binder interface to register the IStatusBar object corresponding to the extended screen status bar to the StatusBarManagerService for management through the registerStatusBar function.
- the extended screen status bar is created and a StatusBarWindowView is added to the StatusBarWindowController; the StatusBarWindowController calls the interface of WindowManager to add the extended screen status bar to the WMS.
- the WMS calls the ViewRootImpl and IWindowSession of the window management global object (WindowManagerGlobal), and modifies the DisplayId of the WindowState of the window corresponding to the extended screen status bar to the ID of Display1.
- the StatusBarManagerService includes an ArrayMap list and a SparseArray list.
- the ArrayMap list maintains the Binder objects of the multiple IStatusBar instances registered to the StatusBarManagerService.
- the SparseArray list maintains the correspondence between multiple per-Display UiState objects and the multiple status bars.
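The two bookkeeping structures described above can be modeled roughly as follows. The real service uses Android's ArrayMap and SparseArray; plain HashMaps stand in here, and all names are illustrative assumptions rather than the actual StatusBarManagerService fields.

```java
import java.util.HashMap;
import java.util.Map;

// Rough model: one map tracks registered IStatusBar binder objects, the other
// tracks the per-display UI state for each status bar instance.
public class StatusBarRegistry {
    public static class UiState {
        public String visibleIcons = "";
    }

    // ArrayMap-like: binder token -> registered status bar identity.
    private final Map<String, String> registeredBars = new HashMap<>();
    // SparseArray-like: displayId -> UI state of that display's status bar.
    private final Map<Integer, UiState> uiStateByDisplay = new HashMap<>();

    public void registerStatusBar(String binderToken, String name, int displayId) {
        registeredBars.put(binderToken, name);
        uiStateByDisplay.putIfAbsent(displayId, new UiState());
    }

    public String barFor(String binderToken) {
        return registeredBars.get(binderToken);
    }

    public UiState stateFor(int displayId) {
        return uiStateByDisplay.get(displayId);
    }
}
```

Keyed this way, registering the extended screen status bar for Display1 does not disturb the standard status bar's state for Display0, which matches the isolation described in the surrounding text.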
- FIG. 15C shows a software implementation process of the Launcher involved in the embodiment of the present application.
- the following is a detailed introduction to the software implementation process as shown in the figure.
- SystemServer starts a series of services required for system operation, including Launcher.
- the SystemServer process will start PMS and AMS during the startup process. After the PackageManagerService starts, it analyzes and installs the application APKs in the system. AMS is mainly used for the startup and management of the four major components. The entry to start the Launcher is the AMS systemReady method.
- AMS starts the Launcher through the StartHomeOnAllDisplays method.
- the activity task manager (ActivityTaskManagerInternal) calls StartHomeOnDisplays to start the Launcher.
- RootActivityContainer judges whether the DisplayId is the ID of the default screen display area (i.e., Display0). If yes, it resolves the standard desktop activity (resolveHomeActivity) and executes step (34); if not, it resolves the extended screen desktop activity (resolveSecondaryHomeActivity) and executes step (37).
- if it is the ID of Display0, the PMS queries the Activity with CATEGORY_HOME as the desktop; if it is the ID of Display1, the PMS queries the Activity with SECONDARY_HOME as the desktop.
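The resolution rule just described can be sketched as a one-line mapping from the display ID to the home-activity category that gets queried. The category strings follow the Android intent-category convention; the resolver class itself is a simplification of the RootActivityContainer/PMS interplay, not their real API.

```java
// Sketch: the default display resolves the CATEGORY_HOME activity, any other
// (extended) display resolves the SECONDARY_HOME activity.
public class HomeResolver {
    public static final int DEFAULT_DISPLAY = 0;

    public static String homeCategoryFor(int displayId) {
        return displayId == DEFAULT_DISPLAY
                ? "android.intent.category.HOME"            // standard desktop
                : "android.intent.category.SECONDARY_HOME"; // extended screen desktop
    }
}
```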
- the activity start controller (ActivityStartController) calls the activity starter (ActivityStarter).
- ActivityStarter calls startHomeActivityLocked to start the standard desktop.
- for step (12) to step (16), refer to the relevant description of FIG. 15A.
- DisplayManagerService sends the onDisplayAdded notification when the display area (Display) is added, and after receiving the notification, HwPCManagerService in the PC module (hwPartsPowerOffice) starts the Launcher of the extended screen (i.e., PcHomeLauncher) through startHomeOnDisplay.
- HwPCManagerService calls Pc desktop service (HwHomeService) through BindService.
- HwHomeService calls AMS through StartHomeOnProductiveDisplay.
- AMS calls the StartHomeOnAllDisplays method to start PcHomeLauncher.
- the activity task manager calls StartHomeOnDisplays to start PcHomeLauncher.
- step (33) is executed to resolve the extended screen desktop Activity (resolveSecondaryHomeActivity), and then step (40) is executed.
- the activity start controller (ActivityStartController) calls the activity starter (ActivityStarter).
- an embodiment of the present application provides a screen projection method, which includes but is not limited to steps S201 to S205.
- the first electronic device invokes the first module of the first application to run the first desktop, and the first desktop is associated with the first display area; the first electronic device displays the first display content based on the first display area, and the first display content includes the first desktop.
- the first electronic device invokes the second module of the first application to run the second desktop, and the second desktop is associated with the second display area; the first electronic device sends the second display content corresponding to the second display area to the second electronic device, and the second display content includes the second desktop.
- the first electronic device may be the aforementioned electronic device 100, such as the mobile phone 100; the second electronic device may be the aforementioned electronic device 200, such as the computer 200.
- the first application may be the aforementioned desktop launcher, such as HWLauncher6; the first module may be a standard desktop launcher, such as UniHomelauncher; the second module may be an extended screen desktop launcher, such as PcHomelauncher.
- the first desktop may be the aforementioned standard desktop, and the second desktop may be the aforementioned extended screen desktop.
- the first display area may be the aforementioned default screen display area, namely Display0, and the second display area may be the aforementioned extended screen display area, namely Display1.
- the first display content may be a user interface displayed on the mobile phone 100 shown in FIG. 3A
- the first desktop may be the desktop shown in FIG. 3A
- the second display content may be the user interface displayed by the computer 200 shown in FIG. 3G based on the projection data sent by the mobile phone 100, and the second desktop may be the desktop shown in FIG. 3G.
- the first electronic device displays the third display content based on the task stack running in the first display area.
- the first display content may be the user interface 11 displayed on the mobile phone 100 shown in FIG.
- the third display content may be the control center interface 12 shown in FIG. 3B.
- the first electronic device determines that the display content corresponding to the second display area is the fourth display content based on the task stack running in the second display area .
- the first electronic device sends the fourth display content to the second electronic device.
- the second display content may be the user interface 16 displayed by the computer 200 shown in FIG. , and the third user operation may be an input operation acting on an interface element of the second display content (for example, an icon in the Dock column 403).
- the above-mentioned interface element is the notification center icon 401A in the status bar 401
- the fourth display content may include the user interface 16 and the notification center window 17 shown in FIG. 4A .
- the third user operation is received by the first electronic device through the second electronic device.
- the second electronic device determines the first input event, where the first input event is used to indicate the third input operation.
- the first input event includes the coordinates of the third input operation on the display screen of the second electronic device, the operation type of the third input operation (such as touch operation, mouse click operation, etc.), and the like.
- the second electronic device sends the first input event to the first electronic device; the first electronic device determines the third input operation indicated by the first input event based on the first input event and the task stack run in the second display area, and executes the response event corresponding to the third input operation. After the response event is executed, the display content corresponding to the second display area is updated to the fourth display content.
- the first electronic device sends the updated fourth display content to the second electronic device, and the second electronic device displays the fourth display content.
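The first input event described above (coordinates plus operation type, sent from the second device back to the first) can be sketched as a small serializable record. The field names and the text wire format are assumptions for illustration; the patent does not specify an encoding.

```java
// Hypothetical encoding of the "first input event": the second electronic
// device captures an operation on the projected UI and sends the coordinates
// and operation type back to the first device, which replays it against the
// task stack of the extended screen display area (Display1).
public class RemoteInputEvent {
    public final int displayId;   // target logical display (Display1 here)
    public final float x, y;      // coordinates on the second device's screen
    public final String type;     // e.g. "touch", "mouse_click"

    public RemoteInputEvent(int displayId, float x, float y, String type) {
        this.displayId = displayId;
        this.x = x;
        this.y = y;
        this.type = type;
    }

    // Serialize for sending from the second device to the first device.
    public String encode() {
        return displayId + "," + x + "," + y + "," + type;
    }

    // Parse on the first device before executing the corresponding response event.
    public static RemoteInputEvent decode(String wire) {
        String[] p = wire.split(",");
        return new RemoteInputEvent(Integer.parseInt(p[0]),
                Float.parseFloat(p[1]), Float.parseFloat(p[2]), p[3]);
    }
}
```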
- the first electronic device supports running multiple desktop instances in different display areas through the same application, for example, running the first desktop in the first display area through the first module of the first application, and running the second desktop in the second display area through the second module of the first application.
- the first electronic device determines the display content of the main screen of the device based on the task stack running in the first display area, and determines the display content projected to the second electronic device based on the task stack running in the second display area. In this way, based on the two different display areas, the first electronic device and the second electronic device can display different desktops and other different contents.
- in response to the second user operation acting on the first display content, the first electronic device displays the third display content based on the task stack running in the first display area, including: in response to the second user operation acting on the first desktop in the first display content, the first electronic device displays the third display content based on the task stack of the first application running in the first display area.
- in response to the third user operation acting on the second display content displayed by the second electronic device, the first electronic device determines that the display content corresponding to the second display area is the fourth display content based on the task stack run by the second display area, including: in response to the third user operation acting on the second desktop in the second display content displayed by the second electronic device, the first electronic device determines that the display content corresponding to the second display area is the fourth display content based on the task stack of the first application running in the second display area.
- the first display content may be an application program icon (such as a gallery icon) on the desktop (ie, the first desktop) shown in FIG.
- the mobile phone 100 determines a response event corresponding to the second user operation based on the task stack of the desktop launcher running on Display0 (for example, the aforementioned desktop task stack Homestack).
- the second display content may be the desktop displayed on the computer 200 shown in FIG.
- the mobile phone 100 determines and executes the response event corresponding to the third user operation based on the task stack of the desktop launcher run by Display1 (such as the aforementioned desktop task stack HomeStack), then updates the display content corresponding to the second display area to the fourth display content (that is, the user interface 24 shown in FIG. 6A), and sends it to the computer 200.
- the computer 200 displays the user interface 24 shown in FIG. 6A.
- the first electronic device may execute a response event corresponding to the second user operation based on the task stack of the first application running in the display area associated with the first desktop;
- the first electronic device may execute a response event corresponding to the third user operation based on the task stack of the first application running on the display area associated with the second desktop.
- data isolation of events (input events and/or response events) of different desktops can be guaranteed.
- since the two desktop instances are both run by modules of the first application, the two desktops can share specified data, and the second desktop can inherit some or all of the functions of the first desktop.
- before the above-mentioned first electronic device displays the first display content based on the first display area, the method further includes: the first electronic device invokes the third module of the second application to run the first status bar, the first status bar is associated with the first display area, and the first display content includes the first status bar; the method further includes: in response to the first user operation, the first electronic device invokes the fourth module of the second application to run the second status bar, the second status bar is associated with the second display area, and the second display content includes the second status bar.
- the first user operation is an input operation for the user to determine that the first electronic device projects a screen to the second electronic device.
- in response to the screen projection instruction, the first electronic device calls the second module of the first application to run the second desktop, and calls the fourth module of the second application to run the second status bar; the second desktop and the second status bar are both associated with the second display area.
- the above-mentioned screen projection instruction may refer to the screen projection instruction or the instruction of switching modes in the foregoing embodiments.
- the second application can be the aforementioned system interface, such as SystemUI;
- the third module can be a standard status bar, such as StatusBar;
- the fourth module can be an extended screen status bar, such as PcStatusBar.
- the first status bar may be the aforementioned standard status bar, and the second status bar may be the aforementioned extended screen status bar.
- the first status bar may be the status bar 101 shown in FIG. 3A
- the second status bar may be the status bar 401 shown in FIG. 3G .
- the first electronic device supports running multiple instances of the status bar in different display areas through the same application, for example, running the first status bar in the first display area through the third module of the second application, and running the second status bar in the second display area through the fourth module of the second application.
- the first electronic device and the second electronic device can display different status bars, ensuring data isolation of events (input events and/or response events) between the two status bars.
- the two status bars can share specified data (such as notification messages), and the second status bar can inherit some or all of the functional characteristics of the first status bar.
- before the above-mentioned first electronic device displays the first display content based on the first display area, the method further includes: the first electronic device calls the fifth module of the third application to run the first display object of the first variable;
- the first variable is associated with the first display area, and the first display content includes the first display object;
- the first variable is associated with the second display area, and the second display content includes the first display object.
- the first electronic device supports displaying objects corresponding to the same variable in multiple different display areas at the same time.
- the third application and the second application may be the same application or different applications, which are not specifically limited here.
- the method further includes: in response to the fourth user operation acting on the first display content, the first electronic device calls the fifth module of the third application to modify the display object of the first variable to the second display object; the first electronic device updates the display content corresponding to the first display area to the fifth display content, and the fifth display content includes the second display object; the first electronic device updates the display content corresponding to the second display area to the sixth display content, and sends the sixth display content to the second electronic device, where the sixth display content includes the second display object.
- the first electronic device supports displaying the objects corresponding to the same variable in multiple different display areas at the same time. After the user changes the display object of the first variable in the first display area, the display object of the first variable in the second display area will also change accordingly.
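The shared-variable behaviour above (one variable, e.g. the current wallpaper, reflected on every display area) can be sketched with a simple observer pattern. The observer wiring is an assumption made for the sketch; the text only states that both display areas follow the same variable.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Consumer;

// Model of the "first variable": each display area registers as an observer,
// so changing the display object once updates both the first device's screen
// and the projected screen.
public class SharedDisplayVariable {
    private String displayObject;                      // e.g. a wallpaper image id
    private final List<Consumer<String>> observers = new ArrayList<>();

    public SharedDisplayVariable(String initial) {
        this.displayObject = initial;
    }

    // Each display area (Display0, Display1) registers to redraw on change;
    // the current object is delivered immediately.
    public void observe(Consumer<String> observer) {
        observers.add(observer);
        observer.accept(displayObject);
    }

    // A change made in either display area propagates to all display areas.
    public void set(String newObject) {
        displayObject = newObject;
        for (Consumer<String> o : observers) {
            o.accept(newObject);
        }
    }
}
```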
- the first variable is used to indicate the display object of the wallpaper
- the display object of the wallpaper is a static picture and/or a dynamic picture
- the wallpaper includes a lock screen wallpaper when the screen is locked and/or a desktop wallpaper when the screen is not locked.
- the third application may be a system interface (SystemUI) or a wallpaper application, for example, the fifth module may be a wallpaper management service, such as WallpaperManagerService.
- the first display object of the wallpaper is displayed in the first display area as the desktop wallpaper 107 shown in FIG. 3A
- the first display object of the wallpaper is displayed in the second display area as the desktop wallpaper 404 shown in FIG. 3G .
- Desktop wallpaper 107 and desktop wallpaper 404 are derived from the same picture.
- the first display object of the wallpaper is displayed in the first display area as the lock screen wallpaper 902 in the lock screen interface shown in FIG. 10B, and the first display object of the wallpaper is displayed in the second display area as the lock screen wallpaper 902 shown in FIG. 10A.
- after the user changes the wallpaper displayed on the first electronic device, the wallpaper projected on the second electronic device also changes accordingly.
- the first electronic device presets a variety of themes, and a theme is used to indicate the desktop layout style, icon display style and/or interface color, etc.; the first variable is used to indicate the display object of the theme, and the display object of the theme is the display content corresponding to one of the various themes.
- after the user changes the theme displayed on the first electronic device, the theme projected on the second electronic device also changes accordingly.
- the first module of the first application includes a first common class for creating and running the first desktop, a first user interface (UI) control class, and the desktop task stack of the first desktop; the second module of the first application includes a second common class for creating and running the second desktop, a second UI control class, and the desktop task stack of the second desktop; some or all of the classes in the second common class inherit from the first common class, and some or all of the second UI control classes inherit from the first UI control class.
- the first electronic device adds a second common class for creating and running the second desktop, a second UI control class, and the desktop task stack of the second desktop; some or all of the newly added common classes and UI control classes inherit from the first common class and the first UI control class corresponding to the original first desktop. Therefore, the second desktop can inherit some of the functional characteristics of the first desktop, and the two desktops can share specified data.
- the second common class includes one or more of the following: desktop startup provider, database assistant, desktop startup setting class, desktop startup constant class, Pc layout configuration, Pc device file, Pc grid counter, Pc Desktop launcher strategy, Pc desktop startup mode, Pc loading tasks, etc.;
- the second UI control class includes one or more of the following: Pc drag layer, Pc desktop workspace, Pc unit layout, Pc program dock view, Pc folder , Pc folder icon, etc.
- the second common class can be the common class shown in Figure 13
- the second UI control class can be the UI control class shown in Figure 13
- the desktop task stack of the first desktop can be the task stack of the standard desktop shown in Figure 13
- the task stack of the second desktop can be the task stack of the extended screen desktop shown in FIG. 13 .
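The inheritance relationship described above can be illustrated with a toy pair of classes: the extended screen desktop class derives from the standard desktop class, overrides only what differs (e.g. the layout), and inherits the rest, so shared behaviour and shared data come for free. Class and method names here are invented for the sketch; they do not match HWLauncher6 internals.

```java
// Toy illustration of the Pc* classes inheriting from the standard desktop classes.
public class DesktopClasses {
    public static class Workspace {                 // standard desktop UI control class
        public String layout() { return "phone-grid"; }
        public String sharedData() { return "app-list"; } // shared with subclasses
    }

    public static class PcWorkspace extends Workspace {  // extended screen variant
        @Override
        public String layout() { return "pc-grid-with-dock"; }
        // sharedData() is inherited unchanged: both desktops see the same data.
    }
}
```

Overriding only the presentation while inheriting the data-access methods is one way to realize the "inherit some functional characteristics, share specified data" property claimed for the two desktops.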
- the third module of the second application includes the first component for creating and running the first status bar, the first dependent control class and the third UI control class;
- the fourth module of the second application includes a second component for creating and running the second status bar, a second dependent control class and a fourth UI control class; some or all of the components in the second component inherit from the first component, some or all of the classes in the second dependent control class inherit from the first dependent control class, and some or all of the fourth UI control classes inherit from the third UI control class.
- the first electronic device adds a second component for creating and running the second status bar, a second dependent control class and a fourth UI control class; some or all of the newly added components, dependent control classes and UI control classes inherit from the first component, the first dependent control class, and the third UI control class corresponding to the original first status bar. Therefore, the second status bar can inherit some of the functional characteristics of the first status bar, and the two status bars can share specified data.
- the second component includes one or more of the following: Pc dependent class, Pc system provider, Pc system bar, second status bar;
- the second dependent control class includes one or more of the following: Pc status bar window control class, screen control class, lock screen control class, remote control class;
- the fourth UI control class includes one or more of the following: Pc status bar window view, Pc notification panel view, Pc quick setting fragments, Pc status bar Fragments, Pc status bar view.
- the first component, the first dependent control class and the third UI control class may be respectively the component, the dependent control class and the UI control class shown in FIG. 12 for realizing the standard status bar.
- the second component, the second dependent control class, and the fourth UI control class may be the components, dependent control class, and UI control class shown in FIG. 12 for realizing the status bar of the extended screen, respectively.
- the ID of the display area associated with the second module is the ID of the second display area; invoking, by the first electronic device in response to the first user operation, the second module of the first application to run the second desktop, where the second desktop is associated with the second display area, includes: in response to the first user operation, the Pc management service receives an instruction to switch modes, where the instruction indicates switching from the current non-screen-projection mode to the screen-projection mode; in response to the instruction, the Pc management service calls the Pc desktop service, the Pc desktop service calls the activity management service, and the activity management service calls the activity task manager to start the second module of the first application; the root activity container is called to determine the ID of the display area associated with the second module; when the ID of the display area associated with the second module is the ID of the second display area, the Activity of the second desktop is queried as the Activity of the desktop to be started; when the ID of the display area associated with the second module is the ID of the first display area, the Activity of the first desktop is queried as the Activity of the desktop to be started;
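The branch that picks the desktop Activity by display-area ID can be sketched as follows; the class, method, and Activity names are hypothetical placeholders for this illustration, not the services named in the description:

```java
// Hypothetical sketch of the selection step: the launch flow queries the display
// area associated with the module and resolves the desktop Activity accordingly.
public class DesktopLauncher {
    static final int FIRST_DISPLAY_ID = 0;   // default display area (assumed ID)
    static final int SECOND_DISPLAY_ID = 1;  // extended-screen display area (assumed ID)

    // Returns the name of the desktop Activity to be started for the given display area.
    static String resolveDesktopActivity(int displayIdOfModule) {
        if (displayIdOfModule == SECOND_DISPLAY_ID) {
            return "SecondDesktopActivity";   // second desktop on the extended screen
        }
        return "FirstDesktopActivity";        // first desktop on the default display
    }

    public static void main(String[] args) {
        System.out.println(resolveDesktopActivity(SECOND_DISPLAY_ID)); // SecondDesktopActivity
        System.out.println(resolveDesktopActivity(FIRST_DISPLAY_ID));  // FirstDesktopActivity
    }
}
```

Keying the decision on the display-area ID is what lets one application module host two different desktops without the caller knowing which screen it is targeting.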
- invoking, by the first electronic device, the fourth module of the second application to run the second status bar, where the second status bar is associated with the second display area, includes: in response to the first user operation, the Pc management service receives an instruction to switch modes, where the instruction indicates switching from the current non-screen-projection mode to the screen-projection mode; in response to the instruction, the Pc management service starts the productivity service, and the productivity service invokes the system bar to start the second status bar; the system bar creates the second status bar based on a configuration file; the second status bar calls the callback interface of the command queue to add a callback for the second status bar; the second status bar initializes its layout and registers the IStatusBar object corresponding to the second status bar with the status bar management service; the second status bar creates a Pc status bar window view and adds it to the status bar window control class; and the status bar window control class calls the window management interface to add the second status bar to the window management service, thereby adding the second status bar to the second display area.
- in the non-screen-projection mode, the command queue supports the first status bar associated with the first display area; in the screen-projection mode, the command queue simultaneously supports the first status bar associated with the first display area and the second status bar associated with the second display area.
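A command queue that dispatches to one status bar in non-projection mode and to two bars after the mode switch can be sketched as below. The `CommandQueue`/`Callbacks` names echo the description, but this is a simplified stand-in, not the real SystemUI implementation:

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Hypothetical sketch: bars register callbacks keyed by display-area ID; every
// command (e.g. an icon update) is dispatched to all currently registered bars.
public class CommandQueueSketch {
    interface Callbacks { void onIconUpdated(String icon); }

    static class CommandQueue {
        private final Map<Integer, Callbacks> barsByDisplay = new LinkedHashMap<>();
        void addCallback(int displayId, Callbacks bar) { barsByDisplay.put(displayId, bar); }
        void updateIcon(String icon) {                 // dispatched to every registered bar
            barsByDisplay.values().forEach(b -> b.onIconUpdated(icon));
        }
        int registeredBars() { return barsByDisplay.size(); }
    }

    public static void main(String[] args) {
        CommandQueue queue = new CommandQueue();
        queue.addCallback(0, icon -> System.out.println("first bar: " + icon));
        System.out.println(queue.registeredBars());    // 1 bar in non-projection mode
        // the first user operation switches to projection mode: the second bar registers
        queue.addCallback(1, icon -> System.out.println("second bar: " + icon));
        queue.updateIcon("wifi");                      // both bars receive the update
        System.out.println(queue.registeredBars());    // 2 bars in projection mode
    }
}
```

Registering the second bar as just another callback means the rest of the system keeps posting commands to one queue, unaware of how many display areas are attached.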
- all or part of the foregoing embodiments may be implemented by software, hardware, firmware, or any combination thereof.
- when software is used for implementation, the embodiments may be implemented in whole or in part in the form of a computer program product.
- the computer program product includes one or more computer instructions. When the computer program instructions are loaded and executed on a computer, the processes or functions according to this application are generated in whole or in part.
- the computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable device.
- the computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another; for example, the computer instructions may be transmitted from a website, computer, server, or data center to another website, computer, server, or data center in a wired (for example, coaxial cable, optical fiber, or DSL) or wireless (for example, infrared, radio, or microwave) manner.
- the computer-readable storage medium may be any available medium accessible to a computer, or a data storage device, such as a server or a data center, integrating one or more available media.
- the available medium may be a magnetic medium (for example, a floppy disk, a hard disk, or a magnetic tape), an optical medium (for example, a DVD), a semiconductor medium (for example, a solid-state drive (SSD)), or the like.
- all or some of the processes of the methods in the foregoing embodiments may be implemented by a computer program instructing relevant hardware. The program may be stored in a computer-readable storage medium. When the program is executed, the processes of the foregoing method embodiments may be performed.
- the foregoing storage medium includes any medium that can store program code, such as a ROM, a random access memory (RAM), a magnetic disk, or an optical disk.
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- Software Systems (AREA)
- Computer Hardware Design (AREA)
- User Interface Of Digital Computer (AREA)
- Digital Computer Display Output (AREA)
- Controls And Circuits For Display Device (AREA)
Claims (17)
- A screen projection method, characterized by comprising: a first electronic device invokes a first module of a first application to run a first desktop, the first desktop being associated with a first display area; the first electronic device displays first display content based on the first display area, the first display content comprising the first desktop; in response to a first user operation, the first electronic device invokes a second module of the first application to run a second desktop, the second desktop being associated with a second display area; the first electronic device sends, to a second electronic device, second display content corresponding to the second display area, the second display content comprising the second desktop; in response to a second user operation acting on the first display content, the first electronic device displays third display content based on a task stack running in the first display area; in response to a third user operation acting on the second display content displayed by the second electronic device, the first electronic device determines, based on a task stack running in the second display area, that the display content corresponding to the second display area is fourth display content; and the first electronic device sends the fourth display content to the second electronic device.
- The method according to claim 1, wherein the displaying, by the first electronic device in response to the second user operation acting on the first display content, third display content based on the task stack running in the first display area comprises: in response to the second user operation acting on the first desktop in the first display content, the first electronic device displays the third display content based on the task stack of the first application running in the first display area; and the determining, by the first electronic device in response to the third user operation acting on the second display content displayed by the second electronic device, based on the task stack running in the second display area, that the display content corresponding to the second display area is fourth display content comprises: in response to the third user operation acting on the second desktop displayed by the second electronic device, the first electronic device determines, based on the task stack of the first application running in the second display area, that the display content corresponding to the second display area is the fourth display content.
- The method according to claim 1, wherein before the first electronic device displays the first display content based on the first display area, the method further comprises: the first electronic device invokes a third module of a second application to run a first status bar, the first status bar being associated with the first display area, and the first display content comprising the first status bar; and the method further comprises: in response to the first user operation, the first electronic device invokes a fourth module of the second application to run a second status bar, the second status bar being associated with the second display area, and the second display content comprising the second status bar.
- The method according to claim 1, wherein before the first electronic device displays the first display content based on the first display area, the method further comprises: the first electronic device invokes a fifth module of a third application to run a first display object of a first variable; the first variable is associated with the first display area, and the first display content comprises the first display object; and the first variable is associated with the second display area, and the second display content comprises the first display object.
- The method according to claim 4, wherein the method further comprises: in response to a fourth user operation acting on the first display content, invoking the fifth module of the third application to modify the display object of the first variable to a second display object; the first electronic device updates the display content corresponding to the first display area to fifth display content, the fifth display content comprising the second display object; and the first electronic device updates the display content corresponding to the second display area to sixth display content and sends the sixth display content to the second electronic device, the sixth display content comprising the second display object.
- The method according to claim 4 or 5, wherein the first variable is used to indicate a display object of a wallpaper, the display object of the wallpaper is a static picture and/or a dynamic picture, and the wallpaper comprises a lock-screen wallpaper displayed when the screen is locked and/or a desktop wallpaper displayed when the screen is not locked.
- The method according to claim 4 or 5, wherein the first electronic device presets a plurality of themes, a theme being used to indicate a desktop layout style, an icon display style, and/or interface colors; the first variable is used to indicate a display object of the theme, and the display object of the theme is display content corresponding to one of the plurality of themes.
- The method according to claim 1, wherein the first module of the first application comprises a first common class, a first user interface (UI) control class, and a desktop task stack of the first desktop for creating and running the first desktop; the second module of the first application comprises a second common class, a second UI control class, and a desktop task stack of the second desktop for creating and running the second desktop; some or all classes in the second common class inherit from the first common class, and some or all classes in the second UI control class inherit from the first UI control class.
- The method according to claim 8, wherein the second common class comprises one or more of the following: a desktop launcher provider, a database helper, a desktop launcher settings class, a desktop launcher constants class, a Pc layout configuration, a Pc device file, a Pc grid counter, a Pc desktop launcher policy, a Pc desktop launch mode, and a Pc loading task; and the second UI control class comprises one or more of the following: a Pc drag layer, a Pc desktop workspace, a Pc cell layout, a Pc dock view, a Pc folder, and a Pc folder icon.
- The method according to claim 1, wherein the third module of the second application comprises a first component, a first dependent control class, and a third UI control class for creating and running the first status bar; the second module of the first application comprises a second component, a second dependent control class, and a fourth UI control class for creating and running the second status bar; some or all components in the second component inherit from the first component, some or all classes in the second dependent control class inherit from the first dependent control class, and some or all classes in the fourth UI control class inherit from the third UI control class.
- The method according to claim 10, wherein the second component comprises one or more of the following: a Pc dependent class, a Pc system provider, a Pc system bar, and the second status bar; the second dependent control class comprises one or more of the following: a Pc status bar window control class, a screen control class, a lock screen control class, and a remote control class; and the fourth UI control class comprises one or more of the following: a Pc status bar window view, a Pc notification panel view, Pc quick setting fragments, Pc status bar fragments, and a Pc status bar view.
- The method according to any one of claims 1 to 11, wherein the identity (ID) of the display area associated with the second module is the ID of the second display area; and the invoking, by the first electronic device in response to the first user operation, the second module of the first application to run the second desktop, the second desktop being associated with the second display area, comprises: in response to the first user operation, a Pc management service receives an instruction to switch modes, the instruction being used to indicate switching from the current non-screen-projection mode to the screen-projection mode; in response to the instruction, the Pc management service calls a Pc desktop service, the Pc desktop service calls an activity management service, and the activity management service calls an activity task manager to start the second module of the first application; a root activity container is called to determine the ID of the display area associated with the second module; when the ID of the display area associated with the second module is the ID of the second display area, the Activity of the second desktop is queried as the Activity of the desktop to be started; when the ID of the display area associated with the second module is the ID of the first display area, the Activity of the first desktop is queried as the Activity of the desktop to be started; and an activity start controller calls an activity starter to start the second desktop.
- The method according to any one of claims 1 to 11, wherein the invoking, by the first electronic device in response to the first user operation, the fourth module of the second application to run the second status bar, the second status bar being associated with the second display area, comprises: in response to the first user operation, the Pc management service receives the instruction to switch modes, the instruction being used to indicate switching from the current non-screen-projection mode to the screen-projection mode; in response to the instruction, the Pc management service starts a productivity service, the productivity service invokes a system bar to start the second status bar, and the system bar creates the second status bar based on a configuration file; the second status bar calls a callback interface of a command queue to add a callback for the second status bar; the second status bar initializes its layout and registers an IStatusBar object corresponding to the second status bar with a status bar management service; the second status bar creates a Pc status bar window view and adds it to a status bar window control class; and the status bar window control class calls a window management interface to add the second status bar to a window management service, thereby adding the second status bar to the second display area.
- The method according to claim 13, wherein in the non-screen-projection mode, the command queue supports the first status bar associated with the first display area; and in the screen-projection mode, the command queue simultaneously supports the first status bar associated with the first display area and the second status bar associated with the second display area.
- An electronic device, comprising a touchscreen, a memory, one or more processors, a plurality of applications, and one or more programs, wherein the one or more programs are stored in the memory, and wherein, when the one or more processors execute the one or more programs, the electronic device is caused to implement the method according to any one of claims 1 to 14.
- A computer storage medium, comprising computer instructions, wherein, when the computer instructions are run on an electronic device, the electronic device is caused to perform the method according to any one of claims 1 to 14.
- A computer program product, wherein, when the computer program product is run on a computer, the computer is caused to perform the method according to any one of claims 1 to 14.
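The shared-variable mechanism of claims 4 to 7 — one variable (for example, the wallpaper display object) associated with both display areas, so that modifying it updates the content of both screens — can be sketched as a minimal observer example. `SharedWallpaper` and the attached display consumers are hypothetical names for this sketch:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Consumer;

// Hypothetical sketch: a single variable is observed by both display areas;
// changing its display object pushes the new object to every attached display.
public class SharedWallpaper {
    private String displayObject;                       // the "first variable"
    private final List<Consumer<String>> displays = new ArrayList<>();

    void attachDisplay(Consumer<String> display) { displays.add(display); }
    String displayObject() { return displayObject; }

    void setDisplayObject(String object) {              // e.g. the fourth user operation
        this.displayObject = object;
        displays.forEach(d -> d.accept(object));        // both display areas update
    }

    public static void main(String[] args) {
        SharedWallpaper wallpaper = new SharedWallpaper();
        List<String> firstArea = new ArrayList<>();     // content of the first display area
        List<String> secondArea = new ArrayList<>();    // content of the second display area
        wallpaper.attachDisplay(firstArea::add);
        wallpaper.attachDisplay(secondArea::add);
        wallpaper.setDisplayObject("beach.png");        // hypothetical wallpaper file
        System.out.println(firstArea.get(0) + " " + secondArea.get(0));
    }
}
```

Because both display areas observe the same variable, a theme or wallpaper change made on the phone is reflected in the projected content without any per-screen bookkeeping.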
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2023571459A JP2024521564A (ja) | 2021-05-19 | 2022-05-10 | Screen projection method and related apparatus |
US18/561,899 US20240220184A1 (en) | 2021-05-19 | 2022-05-10 | Screen projection method and related apparatus |
EP22803825.3A EP4343533A4 (en) | 2021-05-19 | 2022-05-10 | SCREEN PROJECTION METHOD AND ASSOCIATED APPARATUS |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110547552 | 2021-05-19 | ||
CN202110547552.4 | 2021-05-19 | ||
CN202110745467.9 | 2021-06-30 | ||
CN202110745467.9A CN115373778A (zh) | 2021-05-19 | 2021-06-30 | Screen projection method and related apparatus |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022242503A1 (zh) | 2022-11-24 |
Family
ID=84060018
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2022/091899 WO2022242503A1 (zh) | 2021-05-19 | 2022-05-10 | 投屏方法及相关装置 |
Country Status (5)
Country | Link |
---|---|
US (1) | US20240220184A1 (zh) |
EP (1) | EP4343533A4 (zh) |
JP (1) | JP2024521564A (zh) |
CN (1) | CN115373778A (zh) |
WO (1) | WO2022242503A1 (zh) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN118626182A (zh) * | 2023-03-07 | 2024-09-10 | Huawei Technologies Co., Ltd. | Method for switching applications, and electronic device |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20140281896A1 (en) * | 2013-03-15 | 2014-09-18 | Google Inc. | Screencasting for multi-screen applications |
CN107493375A (zh) * | 2017-06-30 | 2017-12-19 | Beijing Chaozhuo Technology Co., Ltd. | Extended screen projection method and screen projection system for a mobile terminal |
US20190121682A1 (en) * | 2015-04-26 | 2019-04-25 | Intel Corporation | Integrated android and windows device |
CN112416200A (zh) * | 2020-11-26 | 2021-02-26 | Vivo Mobile Communication Co., Ltd. | Display method and apparatus, electronic device, and readable storage medium |
CN112527221A (zh) * | 2019-09-18 | 2021-03-19 | Huawei Technologies Co., Ltd. | Data transmission method and related device |
CN112667183A (zh) * | 2020-12-31 | 2021-04-16 | Nubia Technology Co., Ltd. | Screen projection method, mobile terminal, and computer-readable storage medium |
Non-Patent Citations (1)
Title |
---|
See also references of EP4343533A4 |
Also Published As
Publication number | Publication date |
---|---|
EP4343533A1 (en) | 2024-03-27 |
EP4343533A4 (en) | 2024-08-21 |
JP2024521564A (ja) | 2024-06-03 |
CN115373778A (zh) | 2022-11-22 |
US20240220184A1 (en) | 2024-07-04 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2021013158A1 (zh) | Display method and related apparatus | |
WO2021139768A1 (zh) | Interaction method for cross-device task processing, electronic device, and storage medium | |
WO2021057830A1 (zh) | Information processing method and electronic device | |
US11803451B2 (en) | Application exception recovery | |
CN112269527B (zh) | Application interface generation method and related apparatus | |
CN108845856B (zh) | Object-based synchronous update method and apparatus, storage medium, and device | |
US11861382B2 (en) | Application starting method and apparatus, and electronic device | |
US20240295945A1 (en) | Method, electronic device, and system for creating application shortcut | |
CN113553130B (zh) | Method for an application to perform drawing operations, and electronic device | |
CN116360725B (zh) | Display interaction system, display method, and device | |
WO2022127661A1 (zh) | Application sharing method, electronic device, and storage medium | |
WO2021190524A1 (zh) | Screenshot processing method, graphical user interface, and terminal | |
WO2023130921A1 (zh) | Method for adapting page layout to multiple devices, and electronic device | |
WO2023109764A1 (zh) | Wallpaper display method and electronic device | |
WO2023088459A1 (zh) | Device collaboration method and related apparatus | |
WO2022242503A1 (zh) | Screen projection method and related apparatus | |
EP4198709A1 (en) | Navigation bar display method, display method and first electronic device | |
US20230236714A1 (en) | Cross-Device Desktop Management Method, First Electronic Device, and Second Electronic Device | |
WO2021052488A1 (zh) | Information processing method and electronic device | |
WO2022161058A1 (zh) | Panoramic image shooting method and electronic device | |
WO2024169305A1 (zh) | Application management method and electronic device | |
WO2024140757A1 (zh) | Cross-device split-screen method and related apparatus | |
CN117009023B (zh) | Method for displaying notification information and related apparatus | |
WO2023142935A1 (zh) | Application component management method and related device | |
WO2024083114A1 (zh) | Software distribution method, electronic device, and system |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 22803825 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2023571459 Country of ref document: JP Ref document number: 18561899 Country of ref document: US |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2022803825 Country of ref document: EP |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
ENP | Entry into the national phase |
Ref document number: 2022803825 Country of ref document: EP Effective date: 20231219 |