WO2022017393A1 - Display interaction system, display method, and device - Google Patents

Display interaction system, display method, and device

Info

Publication number
WO2022017393A1
WO2022017393A1 · PCT/CN2021/107410 · CN2021107410W
Authority
WO
WIPO (PCT)
Prior art keywords
user interface
display
interface
application
window
Prior art date
Application number
PCT/CN2021/107410
Other languages
English (en)
French (fr)
Inventor
韩国辉
Original Assignee
华为技术有限公司 (Huawei Technologies Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 华为技术有限公司 (Huawei Technologies Co., Ltd.)
Priority to EP21845720.8A priority Critical patent/EP4174633A4/en
Priority to US18/006,082 priority patent/US20230350547A1/en
Publication of WO2022017393A1 publication Critical patent/WO2022017393A1/zh


Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 — Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 — Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 — Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 — Interaction techniques based on GUIs based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/0484 — Interaction techniques based on GUIs for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/0487 — Interaction techniques based on GUIs using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 — Interaction techniques based on GUIs using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04886 — Interaction techniques based on GUIs using a touch-screen or digitiser, by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
    • G06F 3/14 — Digital output to display device; Cooperation and interconnection of the display device with other functional units
    • G06F 3/1423 — Digital output to display device, controlling a plurality of local displays, e.g. CRT and flat panel display
    • G06F 3/1454 — Digital output to display device, involving copying of the display data of a local workstation or window to a remote workstation or window so that an actual copy of the data is displayed simultaneously on two or more displays, e.g. teledisplay
    • G06F 3/147 — Digital output to display device, using display panels
    • H — ELECTRICITY
    • H04 — ELECTRIC COMMUNICATION TECHNIQUE
    • H04M — TELEPHONIC COMMUNICATION
    • H04M 1/00 — Substation equipment, e.g. for use by subscribers
    • H04M 1/72 — Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M 1/724 — User interfaces specially adapted for cordless or mobile telephones
    • H04M 1/72403 — User interfaces with means for local support of applications that increase the functionality
    • H04M 1/72409 — User interfaces with means for local support of applications, by interfacing with external accessories
    • H04M 1/72412 — User interfaces with means for local support of applications, using two-way short-range wireless interfaces
    • G06F 2203/00 — Indexing scheme relating to G06F 3/00 - G06F 3/048
    • G06F 2203/048 — Indexing scheme relating to G06F 3/048
    • G06F 2203/04803 — Split screen, i.e. subdividing the display area or the window area into separate subareas
    • G09 — EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G — ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G 2354/00 — Aspects of interface with display user
    • G09G 5/00 — Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G 5/14 — Display of multiple viewports

Definitions

  • the present application relates to the technical field of terminals and communications, and in particular, to a display interaction system, display method, and device.
  • Screen projection refers to sharing the content displayed by an electronic device to be projected (usually a mobile terminal, such as a mobile phone or a tablet computer) to another device with a display screen (usually a TV, a smart interactive tablet, a projector, etc.).
  • the size of the window showing the electronic device's desktop on the second device does not change after screen projection, so the display area of the second device cannot be fully utilized.
  • the embodiments of the present application disclose a display method and a related device, which can make full use of the display area of the second device, facilitate user operations, and improve the user experience.
  • the present application provides a display interaction system, and the above system includes a first device and a second device; wherein,
  • the above-mentioned first device is used to send the interface display information of the first application to the above-mentioned second device, wherein the display screen of the above-mentioned first device displays the first user interface of the above-mentioned first application, and the above-mentioned interface display information includes data used by the above-mentioned first device to display the above-mentioned first user interface;
  • the above-mentioned second device is configured to display the second user interface of the above-mentioned first application according to the above-mentioned interface display information; the content displayed on the above-mentioned second user interface includes the content displayed on the above-mentioned first user interface; the layout of the above-mentioned second user interface is different from the layout of the above-mentioned first user interface, and the display area of the second user interface is larger than the display area of the first user interface.
  • the present application can map the application content displayed by the first device (such as a mobile phone) to the display screen of the second device (such as a tablet) for large-screen display, making full use of the screen area of the second device, giving users the possibility of large-screen operation, and improving the user experience.
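The core of this first aspect — send the data backing the first interface, then redraw it with a different layout in a larger display area — can be sketched as follows. This is a minimal illustration, not the patent's actual data structure: `InterfaceDisplayInfo` and its fields are assumed names.

```python
from dataclasses import dataclass

@dataclass
class InterfaceDisplayInfo:
    """What the first device sends: enough to redraw its current UI."""
    app_name: str
    ui_content: str   # placeholder for the rendered interface data
    width: int        # display area of the first user interface, px
    height: int

def second_ui_size(info: InterfaceDisplayInfo, screen_w: int, screen_h: int):
    """Scale the projected interface up toward the second device's screen
    while keeping the source aspect ratio, so the second user interface's
    display area ends up larger than the first's."""
    scale = min(screen_w / info.width, screen_h / info.height)
    return round(info.width * scale), round(info.height * scale)
```

For example, a 360x780 phone interface projected onto a 1280x800 tablet screen would be enlarged to 369x800 — a larger display area than the source window, as the claim requires.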
  • the method further includes:
  • the above-mentioned second device is used to display a third user interface according to the above-mentioned interface display information, the size of the above-mentioned third user interface does not match the size of the display screen of the above-mentioned second device, and the content displayed on the above-mentioned third user interface is consistent with the content displayed on the above-mentioned first user interface;
  • the second device is configured to receive a first operation input for the first control in the third user interface, where the first operation is used to trigger the operation of the second device to display the second user interface according to the interface display information.
  • the user interface mapped from the first device is first displayed at the size of the first device's display screen, so that the user can choose whether to display it in full screen, giving the user more options.
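This two-step display (mirror at the source size first, go full screen only on the user's first operation) amounts to a small state machine. A sketch — the control name `fullscreen_button` is purely hypothetical:

```python
class ProjectionWindow:
    """Third user interface (mirrored at source size) that switches to the
    enlarged second user interface when the first operation arrives."""

    def __init__(self):
        self.state = "third_ui"   # mirrored at the first device's size

    def on_first_operation(self, control: str) -> str:
        # only the (assumed) full-screen control triggers the transition
        if self.state == "third_ui" and control == "fullscreen_button":
            self.state = "second_ui"
        return self.state
```

Any other input leaves the window in the source-sized state, matching the description that full-screen display is the user's choice.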
  • the above-mentioned second device is configured to display a second user interface according to the above-mentioned interface display information, including:
  • the above-mentioned second device is configured to determine, according to the above-mentioned interface display information, that the above-mentioned first user interface has a landscape display attribute;
  • the above-mentioned second device is configured to display the above-mentioned second user interface according to the landscape display attribute.
  • the above-mentioned second device is configured to display a second user interface according to the above-mentioned interface display information, including:
  • the above-mentioned second device is configured to determine, according to the above-mentioned interface display information, that the above-mentioned first user interface has no landscape display attribute;
  • the above-mentioned second device is configured to display the above-mentioned second user interface according to the absence of the landscape display attribute, wherein the above-mentioned second user interface includes a plurality of small windows, the plurality of small windows include a window of the home page of the first application and a window whose content is consistent with the content displayed on the first user interface, and the plurality of small windows all belong to the first application.
  • if the above-mentioned first user interface has a landscape display attribute, it can be displayed in full screen on the display screen of the second device; this provides users with workable solutions in various scenarios and improves the user experience.
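The branching on the landscape-display attribute described above can be summarized as a single decision (the window names below are placeholders for illustration):

```python
def layout_second_ui(has_landscape: bool):
    """Choose the second user interface layout from the landscape-display
    attribute carried in the interface display information."""
    if has_landscape:
        # landscape-capable interface: show it alone, full screen
        return ["current_interface_fullscreen"]
    # no landscape attribute: several small windows of the same application,
    # e.g. the app's home page plus a window mirroring the first interface
    return ["app_home_page_window", "mirrored_interface_window"]
```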
  • the plurality of small windows include a window of the home page of the first application and a filling window
  • the filling window is a window that the second device customizes and displays in the user interface of the first application.
  • in this way, the first application's user interface is presented to the user as a full-screen display by means of the filling window, improving the user's visual experience.
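The filling-window variant simply pads the multi-window layout until the application occupies the full screen. A sketch — the three-slot layout is an assumption, not something the patent specifies:

```python
def fill_windows(windows, slots=3):
    """Pad the multi-window second user interface with custom 'filling
    windows' so the application still reads as one full-screen surface."""
    filled = list(windows)
    while len(filled) < slots:
        filled.append("filling_window")
    return filled
```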
  • the interface display information includes information in a task stack of the first application in the first device, and after the second device receives the first operation on the first control in the third user interface, the method further includes:
  • the second device is configured to determine that the first application is installed in the second device according to information in the task stack of the first application in the first device;
  • the above-mentioned second device is configured to display the above-mentioned second user interface based on the above-mentioned interface display information through the above-mentioned first application.
  • the above-mentioned second device is configured to display the above-mentioned second user interface based on the above-mentioned interface display information through the above-mentioned first application, including:
  • the second device is configured to display the second user interface through the first application based on the interface display information in response to a second operation, where the second operation is a touch operation on a selection button of the first application.
  • entering full-screen display through the corresponding locally installed application makes the displayed user interface smoother, reduces display delay, and improves the user experience.
  • the present application provides a display method, the method comprising:
  • the second device receives interface display information of the first application from the first device, wherein the display screen of the first device displays the first user interface of the first application, and the interface display information includes data used by the first device to display the first user interface;
  • the second device displays a second user interface according to the interface display information, and the content displayed on the second user interface includes the content displayed on the first user interface; the layout of the second user interface is different from the layout of the first user interface, and the display area of the second user interface is larger than the display area of the first user interface.
  • the method further includes:
  • the above-mentioned second device displays a third user interface according to the above-mentioned interface display information, the size of the above-mentioned third user interface does not match the size of the display screen of the above-mentioned second device, and the content displayed on the above-mentioned third user interface is consistent with the content displayed on the above-mentioned first user interface;
  • the above-mentioned second device receives a first operation input for the first control in the above-mentioned third user interface, and the above-mentioned first operation is used to trigger the operation of the above-mentioned second device displaying the above-mentioned second user interface according to the above-mentioned interface display information.
  • the above-mentioned second device displays a second user interface according to the above-mentioned interface display information, including:
  • the second device determines, according to the interface display information, that the first user interface has a landscape display attribute;
  • the above-mentioned second device displays the above-mentioned second user interface according to the landscape display attribute.
  • the above-mentioned second device displays a second user interface according to the above-mentioned interface display information, including:
  • the above-mentioned second device determines, according to the above-mentioned interface display information, that the above-mentioned first user interface has no landscape display attribute;
  • the above-mentioned second device displays the above-mentioned second user interface according to the absence of the landscape display attribute, wherein the above-mentioned second user interface includes a plurality of small windows, the plurality of small windows include a window of the main page of the above-mentioned first application and a window whose content is consistent with the content displayed on the above-mentioned first user interface, and the plurality of small windows all belong to the above-mentioned first application.
  • the plurality of small windows include a window of the home page of the first application and a filling window
  • the filling window is a window that the second device customizes and displays in the user interface of the first application.
  • the above-mentioned interface display information includes information in the task stack of the above-mentioned first application in the above-mentioned first device, and after the above-mentioned second device receives the first operation on the first control in the above-mentioned third user interface, the method further includes:
  • the second device determines that the first application is installed in the second device according to the information in the task stack of the first application in the first device;
  • the second device displays the second user interface based on the interface display information through the first application.
  • the above-mentioned second device displays the above-mentioned second user interface based on the above-mentioned interface display information through the above-mentioned first application, including:
  • the second device displays the second user interface through the first application based on the interface display information in response to a second operation, where the second operation is a touch operation on a selection button of the first application.
  • the present application provides a display device, the device comprising:
  • a first receiving unit configured to receive interface display information of a first application from a first device, wherein the display screen of the first device displays a first user interface of the first application, and the interface display information includes data used by the first device to display the first user interface;
  • a display unit configured to display a second user interface according to the interface display information, where the content displayed on the second user interface includes the content displayed on the first user interface; the layout of the second user interface is different from the layout of the first user interface, and the display area of the second user interface is larger than the display area of the first user interface.
  • the above-mentioned display unit is further configured to: after the above-mentioned first receiving unit receives the interface display information of the first application from the first device and before the above-mentioned display unit displays the second user interface according to the above-mentioned interface display information, display a third user interface according to the interface display information, where the size of the third user interface does not match the size of the display screen of the display device, and the content displayed on the third user interface is consistent with the content displayed on the first user interface;
  • the above-mentioned display device further includes a second receiving unit configured to receive a first operation input for a first control in the above-mentioned third user interface, where the above-mentioned first operation is used to trigger the operation of the above-mentioned display device displaying the above-mentioned second user interface according to the above-mentioned interface display information.
  • the above-mentioned display unit is specifically used for:
  • according to the above-mentioned interface display information, it is determined that the above-mentioned first user interface has a landscape display attribute;
  • the second user interface is displayed according to the landscape display attribute.
  • the above-mentioned display unit is specifically used for:
  • according to the above-mentioned interface display information, it is determined that the above-mentioned first user interface has no landscape display attribute;
  • the second user interface is displayed according to the absence of the landscape display attribute, wherein the second user interface includes a plurality of small windows, the plurality of small windows include a window of the home page of the first application and a window whose content is consistent with the content displayed on the first user interface, and the plurality of small windows all belong to the first application.
  • the plurality of small windows include a window of the home page of the first application and a filling window
  • the filling window is a window that the display device customizes and displays in the user interface of the first application.
  • the above-mentioned interface display information includes the information in the task stack of the above-mentioned first application in the above-mentioned first device, and the above-mentioned display device further includes a judgment unit configured to, after the above-mentioned second receiving unit receives the above-mentioned first operation, determine, according to the information in the task stack of the first application in the first device, that the first application is installed in the display device;
  • the above-mentioned display unit is further configured to display the above-mentioned second user interface based on the above-mentioned interface display information through the above-mentioned first application.
  • the above-mentioned display unit is specifically used for:
  • the second user interface is displayed through the first application based on the interface display information in response to a second operation, where the second operation is a touch operation on a selection button of the first application.
  • the present application provides a display device, the device includes a processor, a receiving interface, a transmitting interface, and a memory, wherein the memory is used to store computer programs and/or data, and the processor is used to execute the computer program stored in the memory, so that the above-mentioned device performs the method according to any one of the above-mentioned second aspect.
  • the present application provides a computer-readable storage medium, where the computer-readable storage medium stores a computer program, and when the computer program is executed by a processor, the method according to any one of the above-mentioned second aspect is implemented.
  • the present application provides a computer program product.
  • when the computer program product is read and executed by a computer, the method described in any one of the above-mentioned second aspect will be executed.
  • the present application provides a computer program, which, when executed on a computer, enables the computer to implement the method described in any one of the above-mentioned second aspect.
  • the present application can map the application content displayed by the first device (such as a mobile phone) to the display screen of the second device (such as a tablet) for large-screen display, making full use of the screen area of the second device, providing users with the possibility of operating on a large screen, and improving the user experience.
  • FIG. 1 is a schematic structural diagram of a display interaction system according to an embodiment of the present application
  • FIG. 2 is a schematic diagram of a hardware structure of an electronic device provided by an embodiment of the present application.
  • FIG. 3 is a schematic diagram of a software structure of an electronic device provided by an embodiment of the present application.
  • 4A is a schematic diagram of a user interface of an electronic device provided by an embodiment of the present application.
  • 4B is a schematic diagram of a pull-down menu of a status bar of a user interface of an electronic device according to an embodiment of the present application
  • 4C is a schematic diagram of a user interface of another electronic device provided by an embodiment of the present application.
  • 5A to 5C are schematic diagrams of user interfaces in the process of establishing a connection between two devices according to an embodiment of the present application
  • FIGS. 6A and 6B are schematic diagrams of user interfaces of an electronic device provided by an embodiment of the present application.
  • FIGS. 7 to 11 are schematic diagrams of user interfaces on an electronic device provided by an embodiment of the present application.
  • 12A and 12B are schematic diagrams of a user interface on an electronic device provided by an embodiment of the present application.
  • FIGS. 13 and 14 are schematic diagrams of a user interface on an electronic device provided by an embodiment of the present application.
  • FIG. 15 is a schematic diagram of an interaction flow of a display method provided by an embodiment of the present application.
  • FIG. 16 is a schematic diagram of a logical structure of a device provided by an embodiment of the present application.
  • FIG. 1 is a schematic diagram of a display interaction system provided by an embodiment of the present application.
  • the system may include one or more first devices 11 and one or more second devices 12 (only one of each is exemplarily drawn in FIG. 1), where:
  • the first device 11 can install and run one or more applications (application, APP), and the one or more applications can be, for example, applications such as WeChat, Meituan, email, etc.
  • the display content of the first device 11 is mapped to an application program of the second device (the application program is referred to as the "cooperative assistant" in subsequent embodiments of the present application).
  • An application program may also be referred to simply as an app.
  • the first device 11 may include, but is not limited to, any handheld electronic product based on an intelligent operating system that can perform human-computer interaction with the user through input devices such as a keyboard, a virtual keyboard, a touchpad, a touchscreen, and a voice-activated device, for example, a smartphone, a tablet computer, a handheld computer, a wearable electronic device, etc.
  • the smart operating system includes, but is not limited to, any operating system that enriches device functions by providing various applications to the device, such as Android, iOS, Windows, and macOS.
  • the second device 12 may include, but is not limited to, tablet computers, personal computers, desktop computers, televisions, in-vehicle displays, projector displays, and the like.
  • the second device 12 may provide a display service for the first device 11 .
  • the second device 12 needs to run corresponding programs to provide the display service, such as an application program that receives and saves the information sent by the first device 11 (hereinafter referred to as the "cooperative assistant") and an application program that displays the information sent by the first device 11 on the display screen of the second device 12 (hereinafter referred to as the "window manager").
  • the second device 12 may establish a connection with the first device 11 through a data cable, Bluetooth, or a wireless fidelity (Wireless Fidelity, WiFi) network, etc., to perform data interaction.
  • the first device 11 and the second device 12 can establish a communication connection through WiFi P2P technology. When the two devices are connected to the same network, the first device 11 can discover the second device 12 through searching and then, upon receiving the user's operation instruction, establish a communication connection with the second device 12. Alternatively, when the two devices access the same network at the same time, the first device 11 can search for the second device 12 and automatically establish a communication connection with the second device 12.
  • the process of establishing a communication connection between the two devices will be described in detail below, which will not be described in detail here.
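The two connection paths above — user-confirmed after discovery, or automatic when both devices are already on the same network — reduce to a simple check. A sketch with invented names:

```python
def connection_mode(first_dev_net, second_dev_net, user_confirmed=False):
    """Decide how first device 11 connects to second device 12.
    Returns None when the devices are not on a common network."""
    if first_dev_net is None or first_dev_net != second_dev_net:
        return None                 # not reachable over the same network
    if user_confirmed:
        return "user_initiated"     # user picked the device after search
    return "automatic"              # same network: connect automatically
```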
  • Exemplary electronic devices provided in the following embodiments of the present application are first introduced below with reference to FIG. 2 .
  • FIG. 2 shows a schematic structural diagram of the electronic device 100.
  • the electronic device 100 may be the second device 12 shown in FIG. 1 .
  • the electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, a headphone jack 170D, a sensor module 180, buttons 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a subscriber identification module (SIM) card interface 195, and so on.
  • the sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
  • the structures illustrated in the embodiments of the present application do not constitute a specific limitation on the electronic device 100 .
  • the electronic device 100 may include more or fewer components than shown, or combine some components, or split some components, or have different component arrangements.
  • the illustrated components may be implemented in hardware, software, or a combination of software and hardware.
  • the processor 110 may include one or more processing units. For example, the processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc. Different processing units may be independent devices, or may be integrated into one or more processors.
  • the controller can generate an operation control signal according to an instruction operation code and a timing signal, to complete the control of instruction fetching and instruction execution.
  • a memory may also be provided in the processor 110 for storing instructions and data.
  • the memory in processor 110 is cache memory. This memory may hold instructions or data that have just been used or recycled by the processor 110 . If the processor 110 needs to use the instruction or data again, it can be called directly from the memory. Repeated accesses are avoided and the latency of the processor 110 is reduced, thereby increasing the efficiency of the system.
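The cache behavior described here (keep recently used data close to the processor so a repeated access skips the slower store) can be sketched abstractly. The following Python snippet is only an illustrative analogy; the class and names are hypothetical and not part of the device:

```python
# Illustrative sketch of the caching behavior described above: recently used
# data is kept in a small fast store so repeated accesses avoid slow memory.
class SimpleCache:
    def __init__(self, backing_store):
        self.backing_store = backing_store  # models the slower main memory
        self.cache = {}                     # models the processor-side cache
        self.misses = 0

    def read(self, address):
        if address in self.cache:           # hit: served without a slow access
            return self.cache[address]
        self.misses += 1                    # miss: fetch once, then keep a copy
        value = self.backing_store[address]
        self.cache[address] = value
        return value
```

Reading the same address twice incurs only one miss, which is the latency saving the text attributes to the processor cache.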
  • the processor 110 may include one or more interfaces.
  • the interface may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a universal serial bus (USB) interface, etc.
  • the I2C interface is a bidirectional synchronous serial bus that includes a serial data line (SDA) and a serial clock line (SCL).
  • the processor 110 may contain multiple sets of I2C buses.
  • the processor 110 can be respectively coupled to the touch sensor 180K, the charger, the flash, the camera 193 and the like through different I2C bus interfaces.
  • the processor 110 may couple the touch sensor 180K through the I2C interface, so that the processor 110 and the touch sensor 180K communicate with each other through the I2C bus interface, so as to realize the touch function of the electronic device 100 .
  • the I2S interface can be used for audio communication.
  • the processor 110 may contain multiple sets of I2S buses.
  • the processor 110 may be coupled with the audio module 170 through an I2S bus to implement communication between the processor 110 and the audio module 170 .
  • the audio module 170 can transmit audio signals to the wireless communication module 160 through the I2S interface, so as to realize the function of answering calls through a Bluetooth headset.
  • the PCM interface can also be used for audio communication, to sample, quantize, and encode an analog signal.
  • the audio module 170 and the wireless communication module 160 may be coupled through a PCM bus interface.
  • the audio module 170 can also transmit audio signals to the wireless communication module 160 through the PCM interface, so as to realize the function of answering calls through the Bluetooth headset. Both the I2S interface and the PCM interface can be used for audio communication.
  • the UART interface is a universal serial data bus used for asynchronous communication.
  • the bus may be a bidirectional communication bus. It converts the data to be transmitted between serial communication and parallel communication.
  • a UART interface is typically used to connect the processor 110 with the wireless communication module 160.
  • the processor 110 communicates with the Bluetooth module in the wireless communication module 160 through the UART interface to implement the Bluetooth function.
  • the audio module 170 can transmit audio signals to the wireless communication module 160 through the UART interface, so as to realize the function of playing music through the Bluetooth headset.
  • the MIPI interface can be used to connect the processor 110 with peripheral devices such as the display screen 194 and the camera 193 .
  • MIPI interfaces include camera serial interface (CSI), display serial interface (DSI), etc.
  • the processor 110 communicates with the camera 193 through a CSI interface, so as to realize the photographing function of the electronic device 100 .
  • the processor 110 communicates with the display screen 194 through the DSI interface to implement the display function of the electronic device 100 .
  • the GPIO interface can be configured by software.
  • the GPIO interface can be configured as a control signal or as a data signal.
  • the GPIO interface may be used to connect the processor 110 with the camera 193, the display screen 194, the wireless communication module 160, the audio module 170, the sensor module 180, and the like.
  • the GPIO interface can also be configured as I2C interface, I2S interface, UART interface, MIPI interface, etc.
  • the USB interface 130 is an interface that conforms to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type C interface, and the like.
  • the USB interface 130 can be used to connect a charger to charge the electronic device 100, and can also be used to transmit data between the electronic device 100 and peripheral devices. It can also be used to connect headphones to play audio through the headphones.
  • the interface can also be used to connect other electronic devices, such as AR devices.
  • the interface connection relationship between the modules illustrated in the embodiments of the present application is only a schematic illustration, and does not constitute a structural limitation of the electronic device 100 .
  • the electronic device 100 may also adopt different interface connection manners in the foregoing embodiments, or a combination of multiple interface connection manners.
  • the charging management module 140 is used to receive charging input from the charger.
  • the charger may be a wireless charger or a wired charger.
  • the charging management module 140 may receive charging input from the wired charger through the USB interface 130 .
  • the charging management module 140 may receive wireless charging input through the wireless charging coil of the electronic device 100 . While the charging management module 140 charges the battery 142 , it can also supply power to the electronic device through the power management module 141 .
  • the power management module 141 is used to connect the battery 142 , the charging management module 140 and the processor 110 .
  • the power management module 141 receives input from the battery 142 and/or the charging management module 140, and supplies power to the processor 110, the internal memory 121, the display screen 194, the camera 193, and the wireless communication module 160.
  • the power management module 141 can also be used to monitor parameters such as battery capacity, battery cycle times, battery health status (leakage, impedance).
  • the power management module 141 may also be provided in the processor 110 .
  • the power management module 141 and the charging management module 140 may also be provided in the same device.
  • the wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, the modulation and demodulation processor, the baseband processor, and the like.
  • Antenna 1 and Antenna 2 are used to transmit and receive electromagnetic wave signals.
  • Each antenna in electronic device 100 may be used to cover a single or multiple communication frequency bands. Different antennas can also be reused to improve the utilization of the antennas.
  • the antenna 1 can be multiplexed as a diversity antenna of the wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
  • the mobile communication module 150 may provide wireless communication solutions including 2G/3G/4G/5G etc. applied on the electronic device 100 .
  • the mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (LNA) and the like.
  • the mobile communication module 150 can receive electromagnetic waves from the antenna 1, filter and amplify the received electromagnetic waves, and transmit them to the modulation and demodulation processor for demodulation.
  • the mobile communication module 150 can also amplify the signal modulated by the modulation and demodulation processor, and then turn it into an electromagnetic wave for radiation through the antenna 1 .
  • at least part of the functional modules of the mobile communication module 150 may be provided in the processor 110 .
  • at least part of the functional modules of the mobile communication module 150 may be provided in the same device as at least part of the modules of the processor 110.
  • the modem processor may include a modulator and a demodulator.
  • the modulator is used to modulate the low frequency baseband signal to be sent into a medium and high frequency signal.
  • the demodulator is used to demodulate the received electromagnetic wave signal into a low frequency baseband signal. Then the demodulator transmits the demodulated low-frequency baseband signal to the baseband processor for processing.
  • the low frequency baseband signal is processed by the baseband processor and passed to the application processor.
  • the application processor outputs sound signals through audio devices (not limited to the speaker 170A, the receiver 170B, etc.), or displays images or videos through the display screen 194 .
  • the modem processor may be a separate device.
  • the modem processor may be independent of the processor 110, and may be provided in the same device as the mobile communication module 150 or other functional modules.
  • the wireless communication module 160 can provide wireless communication solutions applied on the electronic device 100, including wireless local area network (WLAN) (such as a wireless fidelity (Wi-Fi) network), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR) technology, and the like.
  • the wireless communication module 160 may be one or more devices integrating at least one communication processing module.
  • the wireless communication module 160 receives electromagnetic waves via the antenna 2 , frequency modulates and filters the electromagnetic wave signals, and sends the processed signals to the processor 110 .
  • the wireless communication module 160 can also receive the signal to be sent from the processor 110 , perform frequency modulation on it, amplify it, and convert it into electromagnetic waves for radiation through the antenna 2 .
  • the antenna 1 of the electronic device 100 is coupled with the mobile communication module 150, and the antenna 2 is coupled with the wireless communication module 160, so that the electronic device 100 can communicate with the network and other devices through wireless communication technology.
  • the wireless communication technologies may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division synchronous code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technology, etc.
  • the GNSS may include a global positioning system (GPS), a global navigation satellite system (GLONASS), a BeiDou navigation satellite system (BDS), a quasi-zenith satellite system (QZSS), and/or a satellite based augmentation system (SBAS).
  • the electronic device 100 implements a display function through a GPU, a display screen 194, an application processor, and the like.
  • the GPU is a microprocessor for image processing, and is connected to the display screen 194 and the application processor.
  • the GPU is used to perform mathematical and geometric calculations for graphics rendering.
  • Processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
  • Display screen 194 is used to display images, videos, and the like.
  • Display screen 194 includes a display panel.
  • the display panel can be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a MiniLED, a MicroLED, a Micro-OLED, a quantum dot light-emitting diode (QLED), or the like.
  • the electronic device 100 may include one or N display screens 194 , where N is a positive integer greater than one.
  • the electronic device 100 may implement a shooting function through an ISP, a camera 193, a video codec, a GPU, a display screen 194, an application processor, and the like.
  • the ISP is used to process the data fed back by the camera 193 .
  • when the shutter is opened, light is transmitted to the camera photosensitive element through the lens, and the optical signal is converted into an electrical signal; the camera photosensitive element transmits the electrical signal to the ISP for processing, and the ISP converts it into an image visible to the naked eye.
  • ISP can also perform algorithm optimization on image noise, brightness, and skin tone.
  • ISP can also optimize the exposure, color temperature and other parameters of the shooting scene.
  • the ISP may be provided in the camera 193 .
  • Camera 193 is used to capture still images or video.
  • an optical image of an object is generated through the lens and projected onto the photosensitive element.
  • the photosensitive element may be a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor.
  • the photosensitive element converts the optical signal into an electrical signal, and then transmits the electrical signal to the ISP to convert it into a digital image signal.
  • the ISP outputs the digital image signal to the DSP for processing.
  • DSP converts digital image signals into standard RGB, YUV and other formats of image signals.
  • the electronic device 100 may include 1 or N cameras 193 , where N is a positive integer greater than 1.
  • the digital signal processor is used to process digital signals; in addition to digital image signals, it can also process other digital signals. For example, when the electronic device 100 selects a frequency point, the digital signal processor is used to perform Fourier transform on the frequency point energy, and the like.
  • Video codecs are used to compress or decompress digital video.
  • the electronic device 100 may support one or more video codecs.
  • the electronic device 100 can play or record videos in various encoding formats, such as Moving Picture Experts Group (MPEG)-1, MPEG-2, MPEG-3, MPEG-4, and so on.
  • the NPU is a neural-network (NN) computing processor.
  • Applications such as intelligent cognition of the electronic device 100 can be implemented through the NPU, such as image recognition, face recognition, speech recognition, text understanding, and the like.
  • the external memory interface 120 can be used to connect an external memory card, such as a Micro SD card, to expand the storage capacity of the electronic device 100 .
  • the external memory card communicates with the processor 110 through the external memory interface 120 to implement the data storage function, for example, to save files such as music and videos in the external memory card.
  • Internal memory 121 may be used to store computer executable program code, which includes instructions.
  • the internal memory 121 may include a storage program area and a storage data area.
  • the storage program area can store an operating system, an application program required for at least one function (such as a sound playback function, an image playback function, etc.), and the like.
  • the storage data area may store data (such as audio data, phone book, etc.) created during the use of the electronic device 100 and the like.
  • the internal memory 121 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, universal flash storage (UFS), and the like.
  • the processor 110 executes various functional applications and data processing of the electronic device 100 by executing instructions stored in the internal memory 121 and/or instructions stored in a memory provided in the processor.
  • the electronic device 100 may implement audio functions, such as music playing and recording, through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the earphone interface 170D, the application processor, and the like.
  • the audio module 170 is used for converting digital audio information into analog audio signal output, and also for converting analog audio input into digital audio signal. Audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be provided in the processor 110 , or some functional modules of the audio module 170 may be provided in the processor 110 .
  • the speaker 170A, also referred to as a "horn", is used to convert audio electrical signals into sound signals.
  • the electronic device 100 can listen to music through the speaker 170A, or listen to a hands-free call.
  • the receiver 170B, also referred to as an "earpiece", is used to convert audio electrical signals into sound signals.
  • the voice can be answered by placing the receiver 170B close to the human ear.
  • the microphone 170C, also called a "mike" or a "mic", is used to convert sound signals into electrical signals.
  • the user can make a sound by moving the mouth close to the microphone 170C, to input the sound signal into the microphone 170C.
  • the electronic device 100 may be provided with at least one microphone 170C. In other embodiments, the electronic device 100 may be provided with two microphones 170C, which can implement a noise reduction function in addition to collecting sound signals. In other embodiments, the electronic device 100 may further be provided with three, four or more microphones 170C to collect sound signals, reduce noise, identify sound sources, and implement directional recording functions.
  • the earphone jack 170D is used to connect wired earphones.
  • the earphone interface 170D may be the USB interface 130, or may be a 3.5 mm open mobile terminal platform (OMTP) standard interface or a Cellular Telecommunications Industry Association of the USA (CTIA) standard interface.
  • the pressure sensor 180A is used to sense pressure signals, and can convert the pressure signals into electrical signals.
  • the pressure sensor 180A may be provided on the display screen 194 .
  • the capacitive pressure sensor may include at least two parallel plates made of a conductive material. When a force is applied to the pressure sensor 180A, the capacitance between the electrodes changes.
  • the electronic device 100 determines the intensity of the pressure according to the change in capacitance. When a touch operation acts on the display screen 194, the electronic device 100 detects the intensity of the touch operation according to the pressure sensor 180A.
  • the electronic device 100 may also calculate the touched position according to the detection signal of the pressure sensor 180A.
  • touch operations acting on the same touch position but with different touch operation intensities may correspond to different operation instructions. For example, when a touch operation whose intensity is less than the first pressure threshold acts on the short message application icon, the instruction for viewing the short message is executed. When a touch operation with a touch operation intensity greater than or equal to the first pressure threshold acts on the short message application icon, the instruction to create a new short message is executed.
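The two-threshold dispatch described above can be sketched as follows; the threshold value and function names here are hypothetical placeholders, not part of the patent:

```python
# Sketch of intensity-dependent dispatch for a touch on the short message
# (SMS) application icon, mirroring the behavior described above.
FIRST_PRESSURE_THRESHOLD = 0.5  # hypothetical normalized intensity threshold

def handle_sms_icon_touch(intensity):
    """Return the instruction executed for a touch of the given intensity."""
    if intensity < FIRST_PRESSURE_THRESHOLD:
        return "view_messages"       # light press: open and view messages
    return "create_new_message"      # firm press: create a new message
```

The same touch position therefore maps to different operation instructions depending only on the detected pressure.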
  • the gyro sensor 180B may be used to determine the motion attitude of the electronic device 100 .
  • in some embodiments, the angular velocities of the electronic device 100 about three axes (namely, the x, y, and z axes) may be determined by means of the gyro sensor 180B.
  • the gyro sensor 180B can be used for image stabilization.
  • the gyro sensor 180B detects the shake angle of the electronic device 100, calculates, according to the angle, the distance that the lens module needs to compensate for, and allows the lens to offset the shake of the electronic device 100 through reverse motion, thereby implementing image stabilization.
  • the gyro sensor 180B can also be used for navigation and somatosensory game scenarios.
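The angle-to-compensation step described above can be illustrated with a simple geometric sketch. The small-angle thin-lens formula below is an assumption made for illustration only; the patent does not specify the actual stabilization algorithm:

```python
import math

def lens_compensation(focal_length_mm, shake_angle_deg):
    """Approximate lateral lens shift (mm) needed to offset a small shake angle.

    Uses the small-angle approximation d = f * tan(theta); the device's real
    optical image stabilization algorithm may differ.
    """
    return focal_length_mm * math.tan(math.radians(shake_angle_deg))
```

For a hypothetical 4 mm focal length, a 1 degree shake calls for a lens shift of roughly 0.07 mm in the opposite direction.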
  • the air pressure sensor 180C is used to measure air pressure.
  • the electronic device 100 calculates the altitude through the air pressure value measured by the air pressure sensor 180C to assist in positioning and navigation.
  • the magnetic sensor 180D includes a Hall sensor.
  • the electronic device 100 can detect the opening and closing of the flip holster using the magnetic sensor 180D.
  • the electronic device 100 can detect the opening and closing of the flip cover according to the magnetic sensor 180D, and can further set features such as automatic unlocking upon flipping open according to the detected opening or closing state of the holster or the flip cover.
  • the acceleration sensor 180E can detect the magnitude of the acceleration of the electronic device 100 in various directions (generally three axes).
  • the magnitude and direction of gravity can be detected when the electronic device 100 is stationary. The acceleration sensor can also be used to identify the posture of the electronic device, and is applicable to applications such as landscape/portrait switching and pedometers.
  • the distance sensor 180F is used to measure distance. The electronic device 100 can measure distance through infrared or laser. In some embodiments, in a shooting scenario, the electronic device 100 can use the distance sensor 180F to measure distance to implement fast focusing.
  • Proximity light sensor 180G may include, for example, light emitting diodes (LEDs) and light detectors, such as photodiodes.
  • the light emitting diodes may be infrared light emitting diodes.
  • the electronic device 100 emits infrared light to the outside through the light emitting diode.
  • Electronic device 100 uses photodiodes to detect infrared reflected light from nearby objects. When sufficient reflected light is detected, it can be determined that there is an object near the electronic device 100 . When insufficient reflected light is detected, the electronic device 100 may determine that there is no object near the electronic device 100 .
  • the electronic device 100 can use the proximity light sensor 180G to detect that the user holds the electronic device 100 close to the ear to talk, so as to automatically turn off the screen to save power.
  • the proximity light sensor 180G can also be used in holster mode and pocket mode to automatically unlock and lock the screen.
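The reflected-light decision described above can be sketched as follows; the threshold value and helper names are hypothetical, chosen only to illustrate the sufficient/insufficient reflected-light logic:

```python
REFLECTED_LIGHT_THRESHOLD = 100  # hypothetical sensor counts

def object_nearby(reflected_light):
    """True when enough emitted infrared light is reflected back,
    i.e. an object (such as the user's ear) is near the screen."""
    return reflected_light >= REFLECTED_LIGHT_THRESHOLD

def screen_should_turn_off(reflected_light, in_call):
    # During a call, a nearby object implies the phone is held to the ear,
    # so the screen is turned off to save power and avoid accidental touches.
    return in_call and object_nearby(reflected_light)
```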
  • the ambient light sensor 180L is used to sense ambient light brightness.
  • the electronic device 100 can adaptively adjust the brightness of the display screen 194 according to the perceived ambient light brightness.
  • the ambient light sensor 180L can also be used to automatically adjust the white balance when taking pictures.
  • the ambient light sensor 180L can also cooperate with the proximity light sensor 180G to detect whether the electronic device 100 is in a pocket, so as to prevent accidental touch.
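The brightness adaptation described above can be illustrated with a simple linear mapping. The actual adaptation curve is not specified in the text, so this is only a hedged sketch with hypothetical level and lux ranges:

```python
def auto_brightness(ambient_lux, min_level=10, max_level=255, max_lux=1000):
    """Map sensed ambient brightness to a display backlight level
    using a simple linear ramp (clamped at both ends)."""
    ratio = min(max(ambient_lux, 0), max_lux) / max_lux
    return round(min_level + ratio * (max_level - min_level))
```

Darkness maps to the minimum backlight level and bright daylight to the maximum, which is the adaptive behavior attributed to the ambient light sensor 180L.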
  • the fingerprint sensor 180H is used to collect fingerprints.
  • the electronic device 100 can use the collected fingerprint characteristics to realize fingerprint unlocking, accessing application locks, taking pictures with fingerprints, answering incoming calls with fingerprints, and the like.
  • the temperature sensor 180J is used to detect the temperature.
  • the electronic device 100 uses the temperature detected by the temperature sensor 180J to execute a temperature processing strategy. For example, when the temperature reported by the temperature sensor 180J exceeds a threshold value, the electronic device 100 reduces the performance of the processor located near the temperature sensor 180J in order to reduce power consumption and implement thermal protection.
  • in some other embodiments, when the temperature is lower than another threshold, the electronic device 100 heats the battery 142 to avoid abnormal shutdown of the electronic device 100 caused by the low temperature.
  • in some other embodiments, when the temperature is lower than still another threshold, the electronic device 100 boosts the output voltage of the battery 142 to avoid abnormal shutdown caused by low temperature.
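The temperature processing strategy described above can be sketched as a simple policy function. All threshold values and action names here are hypothetical placeholders; the patent only states that such thresholds exist:

```python
HIGH_TEMP_THRESHOLD = 45.0       # hypothetical, degrees Celsius
LOW_TEMP_THRESHOLD = 0.0         # hypothetical
VERY_LOW_TEMP_THRESHOLD = -10.0  # hypothetical

def thermal_policy(temp_c):
    """Map a reported temperature to the protective actions described above."""
    actions = []
    if temp_c > HIGH_TEMP_THRESHOLD:
        # reduce performance of the nearby processor for thermal protection
        actions.append("reduce_processor_performance")
    if temp_c < LOW_TEMP_THRESHOLD:
        actions.append("heat_battery")
    if temp_c < VERY_LOW_TEMP_THRESHOLD:
        actions.append("boost_battery_output_voltage")
    return actions
```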
  • the touch sensor 180K is also called a "touch panel".
  • the touch sensor 180K may be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 form a touchscreen.
  • the touch sensor 180K is used to detect a touch operation on or near it.
  • the touch sensor can pass the detected touch operation to the application processor to determine the type of touch event.
  • Visual output related to touch operations may be provided through display screen 194 .
  • the touch sensor 180K may also be disposed on the surface of the electronic device 100, at a location different from that of the display screen 194.
  • the bone conduction sensor 180M can acquire vibration signals.
  • the bone conduction sensor 180M can acquire a vibration signal of a vibrating bone of the human vocal part.
  • the bone conduction sensor 180M can also be in contact with the human pulse and receive a blood pressure pulse signal.
  • the bone conduction sensor 180M can also be disposed in the earphone, combined with the bone conduction earphone.
  • the audio module 170 can parse out a voice signal based on the vibration signal, acquired by the bone conduction sensor 180M, of the vibrating bone of the vocal part, so as to implement a voice function.
  • the application processor can analyze the heart rate information based on the blood pressure beat signal obtained by the bone conduction sensor 180M, and realize the function of heart rate detection.
  • the keys 190 include a power key, a volume key, and the like. The keys 190 may be mechanical keys or touch keys.
  • the electronic device 100 may receive key inputs and generate key signal inputs related to user settings and function control of the electronic device 100 .
  • Motor 191 can generate vibrating cues.
  • the motor 191 can be used for vibrating alerts for incoming calls, and can also be used for touch vibration feedback.
  • touch operations acting on different applications can correspond to different vibration feedback effects.
  • the motor 191 can also correspond to different vibration feedback effects for touch operations on different areas of the display screen 194 .
  • touch operations in different application scenarios (for example, time reminders, receiving information, alarm clocks, and games) can also correspond to different vibration feedback effects.
  • the touch vibration feedback effect can also support customization.
  • the indicator 192 may be an indicator light, which can be used to indicate a charging state or a change in battery level, and can also be used to indicate a message, a missed call, a notification, and the like.
  • the SIM card interface 195 is used to connect a SIM card.
  • the SIM card can be brought into contact with or separated from the electronic device 100 by inserting it into or removing it from the SIM card interface 195 .
  • the electronic device 100 may support 1 or N SIM card interfaces, where N is a positive integer greater than 1.
  • the SIM card interface 195 can support Nano SIM card, Micro SIM card, SIM card and so on. Multiple cards can be inserted into the same SIM card interface 195 at the same time. The types of the plurality of cards may be the same or different.
  • the SIM card interface 195 can also be compatible with different types of SIM cards.
  • the SIM card interface 195 is also compatible with external memory cards.
  • the electronic device 100 interacts with the network through the SIM card to implement functions such as call and data communication.
  • the electronic device 100 employs an eSIM, i.e., an embedded SIM card.
  • the eSIM card can be embedded in the electronic device 100 and cannot be separated from the electronic device 100 .
  • the software system of the electronic device 100 may adopt a layered architecture, an event-driven architecture, a microkernel architecture, a microservice architecture, or a cloud architecture.
  • the embodiments of the present application take an Android system with a layered architecture as an example to exemplarily describe the software structure of the electronic device 100 .
  • the layered architecture divides the software into several layers, and each layer has a clear role and division of labor. Layers communicate with each other through software interfaces.
  • the Android system is divided into four layers, which are, from top to bottom, an application layer, an application framework layer, an Android runtime (Android runtime) and a system library, and a kernel layer.
  • the application layer can include a series of application packages.
  • the application package can include applications such as camera, gallery, calendar, call, map, navigation, WLAN, Bluetooth, music, SMS, and collaborative assistant.
  • the application framework layer provides an application programming interface (application programming interface, API) and a programming framework for applications in the application layer.
  • the application framework layer includes some predefined functions.
  • the application framework layer may include a window manager, a package manager, a resource manager, a notification manager, a view system, a collaboration framework, a display framework, and a stack manager, among others.
  • a window manager is used to manage window programs.
  • the window manager can get the size of the display screen, determine whether there is a status bar, lock the screen, take screenshots, etc.
  • the package manager can be used to manage the installation and uninstallation of the installation package of the application, and the analysis and query of the configuration information of the installation package.
  • the resource manager provides various resources for the application, such as localization strings, icons, pictures, layout files, video files and so on.
  • the notification manager enables applications to display notification information in the status bar. It can be used to convey notification-type messages, which can disappear automatically after a brief pause without user interaction. For example, the notification manager is used to notify of download completion, message reminders, and the like.
  • the notification manager can also display notifications in the status bar at the top of the system in the form of charts or scroll-bar text, such as notifications of applications running in the background, or display notifications on the screen in the form of dialog windows. For example, text information is prompted in the status bar, a prompt sound is issued, the electronic device vibrates, or the indicator light flashes.
  • the view system includes visual controls, such as controls for displaying text, controls for displaying pictures, and so on. View systems can be used to build applications.
  • a display interface can consist of one or more views.
  • the display interface including the short message notification icon may include a view for displaying text and a view for displaying pictures.
  • the collaboration framework is used to notify the "cooperative assistant" of events such as the establishment of a connection between the electronic device 100 and the first device (for example, the first device 11 shown in FIG. 1 ), and to assist the "cooperative assistant" in obtaining data information according to its instructions.
  • the collaboration framework can implement a "OneHop" service and a multicast source discovery protocol (MSDP) service; that is, the electronic device 100 can establish a communication connection with the first device based on the OneHop service and the MSDP service.
  • the display framework is used to obtain the display data of the interface or window of the application being displayed on the electronic device 100 and send it to the "cooperative assistant" through the collaboration framework. It can also be used to obtain, through the collaboration framework, the display data and the like of the first device (for example, the first device 11 shown in FIG. 1 ) received by the "collaborative assistant".
  • the stack manager may be used to store and manage process information of applications running in the electronic device 100 .
  • the stack manager may store information about the active activities of the application, such as storing information such as the package name and class name of each activity.
  • the Android runtime includes core libraries and a virtual machine. Android runtime is responsible for scheduling and management of the Android system.
  • the core library consists of two parts: one part is the utility functions that the Java language needs to call, and the other part is the core library of Android.
  • the application layer and the application framework layer can run in virtual machines.
  • the virtual machine executes the Java files of the application layer and the application framework layer as binary files.
  • a virtual machine can be used to perform functions such as object lifecycle management, stack management, thread management, safety and exception management, and garbage collection.
  • a system library can include multiple functional modules. For example: surface manager (surface manager), media library (Media Libraries), 3D graphics processing library (eg: OpenGL ES), 2D graphics engine (eg: SGL), etc.
  • the Surface Manager is used to manage the display subsystem and provides a fusion of 2D and 3D layers for multiple applications.
  • the media library supports playback and recording of a variety of commonly used audio and video formats, as well as still image files.
  • the media library can support a variety of audio and video encoding formats, such as: MPEG4, H.264, MP3, AAC, AMR, JPG, PNG, etc.
  • the 3D graphics processing library is used to implement 3D graphics drawing, image rendering, compositing, and layer processing.
  • 2D graphics engine is a drawing engine for 2D drawing.
  • the kernel layer is the layer between hardware and software.
  • the kernel layer contains at least display drivers, camera drivers, audio drivers, and sensor drivers.
  • the desktop of the electronic device 100 may be one or more user interfaces displayed in the home screen area after the electronic device is turned on and the system of the electronic device is successfully logged in to; these user interfaces may include the icons and names of applications installed on the electronic device.
  • when a touch operation is detected, a corresponding hardware interrupt is sent to the kernel layer.
  • the kernel layer processes touch operations into raw input events (including touch coordinates, timestamps of touch operations, etc.).
  • Raw input events are stored at the kernel layer.
  • the application framework layer obtains the original input event from the kernel layer, and identifies the control corresponding to the input event.
  • for example, if the touch operation is a touch click operation and the control corresponding to the click operation is the control of the "Cooperative Assistant" application, the "Cooperative Assistant" application calls the collaboration framework interface of the application framework layer to start the "Cooperative Assistant" application.
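The touch-handling flow described above (the kernel layer packages a touch into a raw input event with coordinates and a timestamp, the application framework layer identifies the control at the touch coordinates, and the matching application is started) can be sketched as follows. All class, method, and control names here are illustrative assumptions, not the actual Android implementation.

```java
import java.util.HashMap;
import java.util.Map;

// Illustrative sketch of the input-event flow: the kernel layer
// produces a raw input event, and the application framework layer
// identifies the control corresponding to the event.
public class InputEventFlow {
    // Raw input event as stored at the kernel layer
    // (touch coordinates plus a timestamp).
    static final class RawInputEvent {
        final int x, y;
        final long timestampMs;
        RawInputEvent(int x, int y, long timestampMs) {
            this.x = x; this.y = y; this.timestampMs = timestampMs;
        }
    }

    // Framework-layer registry mapping screen regions to controls.
    private final Map<String, int[]> controls = new HashMap<>();

    void registerControl(String name, int left, int top, int right, int bottom) {
        controls.put(name, new int[] { left, top, right, bottom });
    }

    // Identify the control corresponding to the input event, as the
    // application framework layer does before starting the application.
    String resolveControl(RawInputEvent e) {
        for (Map.Entry<String, int[]> c : controls.entrySet()) {
            int[] r = c.getValue();
            if (e.x >= r[0] && e.x < r[2] && e.y >= r[1] && e.y < r[3]) {
                return c.getKey();
            }
        }
        return null; // no control under the touch point
    }

    public static void main(String[] args) {
        InputEventFlow flow = new InputEventFlow();
        // Assume the assistant's control occupies this hypothetical region.
        flow.registerControl("CollaborativeAssistant", 100, 100, 200, 150);
        RawInputEvent tap = new RawInputEvent(150, 120, System.currentTimeMillis());
        // The click lands on the assistant's control, so that application
        // would then be started via the collaboration framework interface.
        System.out.println(flow.resolveControl(tap)); // prints "CollaborativeAssistant"
    }
}
```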
  • the hardware structure and software structure framework of the first device (for example, the first device 11 shown in FIG. 1 ) and the second device may not be exactly the same as those described above; they are determined according to the actual situation and will not be repeated here.
  • the embodiments of the present application provide a display method and device. In the devices involved in the embodiments of the present application, the "cooperative assistant" can be a service or function provided by the device, which can be used to establish a communication connection between the first device and the second device, realize data transmission between the first device and the second device, and realize instruction transmission between the second device and the first device.
  • the "collaborative assistant” may be an Android installation package (Android Package, APK), which may be installed in the device in the form of a control or an APP.
  • the three functions of establishing a communication connection between the first device and the second device, data transmission between the first device and the second device, and instruction transmission between the second device and the first device need not be integrated into one APK file; these functions can be implemented through one or more APK files.
  • a GUI (graphical user interface) can be an icon, a window, a control, or another interface element displayed on the display screen of the electronic device, wherein the control can include visual interface elements such as icons, buttons, menus, tabs, text boxes, dialog boxes, status bars, navigation bars, and widgets.
  • FIG. 4A illustrates an exemplary user interface 41 on the second device for presenting applications installed on the second device.
  • User interface 41 may include a status bar 401, application icons 402, a page indicator 403, a tray 404 with icons of frequently used applications, and other indicators (not shown in FIG. 4A), wherein:
  • the status bar 401 may include one or more signal strength indicators 401A of a wireless fidelity (Wi-Fi) signal, a Bluetooth indicator 401B, a battery status indicator 401C, and a time indicator 401D.
  • the application icons 402 include icons of a first application, a second application, a third application, a fourth application, a fifth application, a sixth application, a seventh application, and the like, for example, icons of applications such as a video application, Clock, QQ, WeChat, Taobao, and AutoNavi Maps.
  • the page indicator 403 may be used to indicate which page of application icons the user is currently browsing. The user can slide left and right in the area of the application icons 402 to browse application icons on other pages. These pages may also be referred to as the desktop of the second device.
  • a tray 404 with icons of frequently used applications may display: icons of an eighth application, a ninth application, a tenth application, an eleventh application, and the like. These applications can be more commonly used applications, such as settings, music, reading and camera, and so on.
  • the user interface 41 may further include a navigation bar, wherein the navigation bar may include system navigation keys such as a back key, a home screen key, and a multitasking key.
  • when a user operation on the back key is detected, the second device may display the previous page of the current page; when a user operation on the home screen key is detected, the second device may display the home interface; when a user operation on the multitasking key is detected, the second device may display the tasks recently opened by the user.
  • the names of the navigation keys may also be different, which is not limited in this application. The navigation keys are not limited to virtual keys; each navigation key in the navigation bar can also be implemented as a physical key.
  • the second device may also include a physical home screen key.
  • the home screen key can be used to receive an instruction from the user and return the currently displayed UI to the home interface, so that it is convenient for the user to view the home screen at any time.
  • the above instruction may be an operation instruction of the user pressing the home screen key once, an operation instruction of the user pressing the home screen key twice within a short period of time, or an operation instruction of the user pressing and holding the home screen key for a predetermined period of time.
  • the home screen key may also be integrated with a fingerprint reader, so that when the home screen key is pressed, fingerprint collection and identification are subsequently performed.
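The three kinds of home-screen-key instructions mentioned above (pressing once, pressing twice within a short period, and pressing for a predetermined period) can be distinguished by press duration and the gap between presses. The thresholds and names below are illustrative assumptions, not values from this application:

```java
// Illustrative classifier for the three home-key instructions
// described above; thresholds are hypothetical.
public class HomeKeyClassifier {
    static final long LONG_PRESS_MS = 800;       // hold at least this long
    static final long DOUBLE_PRESS_GAP_MS = 300; // two presses within this gap

    // Classify a gesture from the duration of the first press and the
    // gap to a second press (negative gap means no second press).
    static String classify(long firstPressMs, long gapToSecondPressMs) {
        if (firstPressMs >= LONG_PRESS_MS) return "LONG_PRESS";
        if (gapToSecondPressMs >= 0 && gapToSecondPressMs <= DOUBLE_PRESS_GAP_MS) {
            return "DOUBLE_PRESS";
        }
        return "SINGLE_PRESS";
    }

    public static void main(String[] args) {
        System.out.println(classify(100, -1));  // prints "SINGLE_PRESS"
        System.out.println(classify(100, 200)); // prints "DOUBLE_PRESS"
        System.out.println(classify(1000, -1)); // prints "LONG_PRESS"
    }
}
```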
  • FIG. 4A only exemplarily shows the user interface on the second device, and should not constitute a limitation on the embodiments of the present application.
  • FIG. 4A and FIG. 4B exemplarily show an operation of opening a "cooperative assistant" on the second device.
  • when the second device detects a swipe-down gesture on the status bar 401 , the second device may display a window 405 on the user interface 41 in response to the gesture.
  • the window 405 may display a switch control 405A of "cooperative assistant", and may also display switch controls for other functions (eg, WiFi, Bluetooth, flashlight, etc.).
  • when a touch operation on the switch control 405A in the window 405 (for example, a click or touch operation on the switch control 405A) is detected, the second device may turn on the "cooperative assistant".
  • the user can open the window 405 by making a downward swipe gesture at the status bar 401, and can click the switch control 405A of the “Cooperative Assistant” in the window 405 to conveniently open the “Cooperative Assistant”.
  • the representation form of the switch control 405A of the "collaborative assistant” may be, but not limited to, text information and/or icons.
  • the "cooperative assistant" can also be displayed on the desktop of the second device in the form of an application icon, like an application such as a mailbox or a gallery.
  • a prompt message indicating that the "collaborative assistant” has been enabled may also be displayed in the status bar 401 .
  • for example, the icon of the "cooperative assistant" is displayed in the status bar 401 , or the text "cooperative assistant" is directly displayed.
  • the icon 406 is the icon of the "cooperative assistant”. It should be noted that the icon of the “cooperative assistant” is not limited to the icons shown in FIG. 4B and FIG. 4C , which are only an example, and the specific representation form of the icon of the “collaborative assistant” is not limited in this solution.
  • the second device may also enable the "cooperative assistant” by default, for example, automatically enable the "collaborative assistant” after being powered on.
  • the first device can establish a communication connection with the second device, and then transmit data to the second device.
  • the following exemplarily introduces some embodiments of the graphical user interface implemented by the first device in the process of establishing a communication connection with the second device after the first device and the second device enable the "cooperative assistant" function.
  • the following takes the first device being a mobile phone and the second device being a tablet computer (tablet personal computer, Tablet PC) as an example to introduce the process in which the first device and the second device discover each other and establish a connection through near field communication (Near Field Communication, NFC). The process of establishing a connection between the two devices is described below in two cases.
  • Case 1: the first device and the second device do not log in to the same system account.
  • the first device and the second device are devices of the same brand, but the two devices do not log in to the same system account after being turned on; or the first device and the second device are not devices of the same brand. In these cases, the first device and the second device are different-account devices, that is, devices that are not logged in to one system account at the same time. In this case, the first device and the second device can be connected in the following manner.
  • both the first device and the second device have the NFC function. When the NFC functions of the first device and the second device are both turned on, the first device can be brought close to or into contact with the second device; for example, a preset part of the first device, such as the back, can be brought close to or touched against a preset position on the second device, such as a position with a sharing or connection label. The first device and the second device can then discover each other, and the display screen of the first device can show the user interface indicating the discovered second device, such as the interface shown in FIG. 5A . The interface includes a window 501, and the window 501 includes an icon 5011 of the discovered second device, a name 5012 of the second device, prompt information 5013, a "Connect" control 5014, and a "Cancel" control 5015.
  • the icon 5011 of the second device may be, for example, an icon of a tablet computer or the like.
  • the name 5012 of the second device may be, for example, HUAWEI MatePad Pro X or the like.
  • the prompt information 5013 can be used to explain to the user the function of the "Connect" control 5014 and the functions available after connection. For example, the prompt information 5013 can be "Clicking 'Connect' will turn on WLAN and Bluetooth. After connection, you can operate your phone on the HUAWEI MatePad Pro X and share data between devices." and the like.
  • a "connect” control 5014 may be used to send a connection confirmation request to the second device.
  • the "Cancel" control 5015 can be used to cancel the connection operation of the first device with the second device.
  • WLAN and Bluetooth will be turned on, and then the process of establishing a connection between the first device and the second device can be completed through Bluetooth. After the connection is successfully established, data interaction and sharing between the first device and the second device can be realized through WLAN. After the connection is established through Bluetooth, implementing data interaction between the first device and the second device over WLAN can improve the speed of data interaction and the efficiency of mutual response.
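As a sketch of the channel split described above (the connection handshake is completed over Bluetooth, while subsequent data interaction goes over WLAN for speed), the message types and class names below are hypothetical:

```java
// Illustrative routing of messages to transports, per the scheme
// above: handshake messages use Bluetooth, bulk data uses WLAN.
public class ChannelSelector {
    enum Channel { BLUETOOTH, WLAN }

    // Pick a transport channel for a given message type
    // (message-type strings are hypothetical).
    static Channel channelFor(String messageType) {
        switch (messageType) {
            case "CONNECT_REQUEST":
            case "CONNECT_CONFIRM":
                return Channel.BLUETOOTH; // connection establishment
            default:
                return Channel.WLAN;      // e.g. interface display data
        }
    }

    public static void main(String[] args) {
        System.out.println(channelFor("CONNECT_REQUEST")); // prints "BLUETOOTH"
        System.out.println(channelFor("DISPLAY_DATA"));    // prints "WLAN"
    }
}
```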
  • the first device displays the user interface shown in FIG. 5B in response to the touch operation on the “connect” control 5014 .
  • as shown in FIG. 5B , the interface includes a window 502 indicating that the first device has sent a connection confirmation request to the second device and is waiting for confirmation from the second device.
  • the window 502 may also include the icon 5021 of the second device, prompt information 5022 and a "cancel" control 5023.
  • the icon 5021 of the second device may be, for example, an icon of a tablet computer or the like.
  • the prompt information 5022 is used to indicate that the confirmation of the second device is being waited for, for example, the prompt information 5022 may be "Please confirm the connection on the HUAWEI MatePad Pro X end." and so on.
  • the "Cancel" control 5023 can be used to cancel the connection operation between the first device and the second device.
  • the first device sends a connection request to the second device in response to the touch operation on the "Connect" control 5014, and after the second device receives the connection request, the second device displays the user interface shown in FIG. 5C .
  • the user interface shown in FIG. 5C includes a confirmation window 503 for whether the second device is connected to the first device.
  • the window 503 includes the icon 5031 of the second device, the icon 5032 of the first device, the association symbol 5033 between the second device and the first device, prompt information 5034 , a “deny” control 5035 and an “allow” control 5036 .
  • the icon 5031 of the second device may be, for example, an icon of a computer or the like.
  • the icon 5032 of the first device may be, for example, an icon of a mobile phone or the like.
  • the prompt information 5034 can be used to prompt whether to connect and to explain to the user the function of the "Allow" control 5036 and the functions available after connection. For example, the prompt information 5034 can be "Allow HUAWEI Nova 7 to connect to this computer? Click 'Allow', and you can operate your phone on the HUAWEI MatePad Pro X and share data between devices. This feature will turn on WLAN and Bluetooth." and the like.
  • among them, HUAWEI Nova 7 is the name of the first device. The "Reject" control 5035 can be used to reject the connection with the first device, and the "Allow" control 5036 may be used to establish the connection with the first device.
  • the second device confirms establishing the connection with the first device in response to the click or touch operation on the "Allow" control 5036. The user interface of the second device after the connection is established may be, for example, the interface shown in FIG. 6A , which shows that the connection between the first device and the second device has been successfully established, that is, the information of the first device is transmitted to the second device and displayed on its display screen. The interface shown in FIG. 6B will be described in detail below and will not be detailed here.
  • the above-mentioned first device and second device may be connected to the same wireless network. If the first device and/or the second device has already connected to the wireless network, it is not necessary to connect to the wireless network again in the interface shown in FIG. 5A and/or FIG. 5C .
  • the specific implementation of transmitting the information of the first device to the second device and displaying it on the display screen may include:
  • the "cooperation framework" of the first device (for example, the cooperation framework of the application framework layer in FIG. 3 ) notifies the "cooperative assistant" of the first device (for example, the cooperative assistant of the application layer in FIG. 3 ) of the event that the connection is successfully established. In response to the event notification, the "cooperative assistant" obtains the first interface display information on the first device through the resource manager of the first device (for example, the resource manager of the application framework layer in FIG. 3 ) or the "cooperation framework" of the first device.
  • the first interface display information is information of the first user interface of the first application being displayed on the display screen of the first device, and the information may include data used by the first device to display the first user interface.
  • the first interface display information may include stack information of the first user interface being displayed on the display screen, data of content displayed on the interface, and the like.
  • the stack information may include the package name and class name of the displayed application's activity (Activity) service.
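A minimal sketch of such first interface display information, bundling the stack information (the package name and class name of the displayed Activity) with the data of the displayed content; the field names and the header format are assumptions for illustration, not a format defined by this application:

```java
// Illustrative payload for the first interface display information
// sent from the first device to the second device.
public class InterfaceDisplayInfo {
    final String packageName; // package name of the displayed Activity
    final String className;   // class name of the displayed Activity
    final byte[] displayData; // data of the content displayed on the interface

    InterfaceDisplayInfo(String packageName, String className, byte[] displayData) {
        this.packageName = packageName;
        this.className = className;
        this.displayData = displayData;
    }

    // Encode a simple "package|class|<length>" header for transfer
    // over the established WLAN connection (hypothetical format).
    String header() {
        return packageName + "|" + className + "|" + displayData.length;
    }

    public static void main(String[] args) {
        InterfaceDisplayInfo info = new InterfaceDisplayInfo(
                "com.example.appmarket", "MainActivity", new byte[1024]);
        System.out.println(info.header()); // prints "com.example.appmarket|MainActivity|1024"
    }
}
```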
  • the resource manager or "cooperation framework" of the first device may obtain the data of the content displayed on the interface through the "display framework" (for example, the display framework of the application framework layer in FIG. 3 ).
  • after the "collaboration assistant" of the first device obtains the above-mentioned first interface display information, it can send the first interface display information to the second device through WLAN, that is, the above-mentioned connected wireless network. The second device receives the first interface display information through its own "collaboration assistant" and sends it to its own "display framework" through its own "collaboration framework". According to the information, the "display framework" mobilizes the window manager (for example, the window manager of the application framework layer in FIG. 3 ) to display a window on the display screen, and the content displayed in the window includes the content displayed in the above-mentioned first user interface.
  • for the window, reference may be made to the window 601A in FIG. 6A ; the size of the window 601A matches the size of the display screen of the first device.
  • the window 601A shown in FIG. 6A may be called the second window, but the content displayed in the second window is not limited to the content shown in the window 601A.
  • the above-mentioned first application can be any one of the applications installed in the first device. The first application can be a necessary application installed when the first device leaves the factory, such as a desktop application, or a system application such as file management or settings.
  • the first application may also be a selectable application installed in the first device, such as third-party applications such as WeChat, Taobao, AutoNavi Maps, or Meituan.
  • the selectable applications are not limited to third-party applications; they can also be applications produced by the brand of the first device, such as Huawei's "App Market" application.
  • Some applications may sometimes be necessary system applications, and sometimes may be optional applications, for example, Huawei's "App Market” application may be a necessary system application for the first device in some possible embodiments.
  • the first application is mainly described by taking a desktop application and an "application market" application as examples, but this does not constitute a limitation on the technical solution.
  • the window 601B shown in FIG. 6B displays the user interface of the "App Market" application; that is, the user interface being displayed on the first device is also the user interface of the App Market. It should be noted that the user interface displayed in the window 601B in FIG. 6B can also be the user interface of another application, for example, the user interface of WeChat, QQ, or Huawei Mall; the specific application is not limited in this solution.
  • the window 601B shown in FIG. 6B may also be called the second window, but the content displayed in the second window is not limited to the content shown in the window 601B.
  • the window 601A shown in FIG. 6A is a schematic diagram of the desktop of the first device, and the desktop of the first device is also an application, that is, the above-mentioned desktop application. Therefore, the window 601A shown in FIG. 6A can also be said to include the user interface of the desktop application of the first device.
  • the "collaboration assistant" of the first device can also obtain the above-mentioned first interface display information through other modules of the application framework layer in response to the above-mentioned event notification; this embodiment of the present application does not limit this.
  • Case 2: the first device and the second device log in to the same system account.
  • the first device and the second device are devices of the same brand, and the two devices log in to the same system account after being turned on; that is, the two devices are same-account devices. In this case, the first device and the second device may be connected in the following manner.
  • the first device can be brought close to or into contact with the second device; for example, a preset part of the first device, such as the back, can be brought close to or touched against a preset position on the second device, such as a position with a sharing or connection label. The first device and the second device can then discover each other, and the user interface indicating the discovered second device can appear on the display screen of the first device, for example, the interface shown in FIG. 5A . The first device sends a connection request to the second device in response to the touch operation on the "Connect" control 5014. Because the first device and the second device are same-account devices, a trust relationship is established automatically, and the second device automatically confirms the connection after receiving the connection request sent by the first device. After the connection is confirmed, the connection between the two devices is completed. Exemplarily, the user interface shown in FIG. 6A or FIG. 6B is displayed on the second device at this time.
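The two cases differ only in the confirmation step: same-account devices establish trust automatically, while different-account devices require the user to allow the connection on the second device (window 503 in FIG. 5C). A sketch of that decision, with hypothetical account strings and result names:

```java
// Illustrative confirmation policy for the two connection cases
// described above (names and return values are hypothetical).
public class ConnectionPolicy {
    // Decide how a connection request is confirmed: devices logged in
    // to the same system account auto-confirm (Case 2); otherwise the
    // second device prompts the user to allow or deny (Case 1).
    static String confirm(String firstDeviceAccount, String secondDeviceAccount,
                          boolean userAllowed) {
        if (firstDeviceAccount != null && firstDeviceAccount.equals(secondDeviceAccount)) {
            return "AUTO_CONFIRMED"; // Case 2: trust established automatically
        }
        return userAllowed ? "USER_CONFIRMED" : "REJECTED"; // Case 1
    }

    public static void main(String[] args) {
        System.out.println(confirm("user@huawei", "user@huawei", false)); // prints "AUTO_CONFIRMED"
        System.out.println(confirm("a@huawei", "b@huawei", true));        // prints "USER_CONFIRMED"
        System.out.println(confirm("a@huawei", "b@huawei", false));       // prints "REJECTED"
    }
}
```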
  • the way of establishing a communication connection between the first device and the second device to realize data sharing also includes other ways; the embodiments of the present application do not limit this.
  • the following describes some embodiments of the graphical user interface implemented on the second device after the first device establishes the connection with the second device.
  • An exemplary introduction is made with a tablet computer as the second device.
  • FIGS. 6A and 6B exemplarily show that, after the first device establishes the connection with the second device, the first device maps the user interface displayed on its own display screen onto the display screen of the second device.
  • the window 601A in FIG. 6A and the window 601B shown in FIG. 6B may be referred to as collaborative windows.
  • the sizes of the window 601A and the window 601B may not match the size of the display screen of the second device. The mismatch may mean that the aspect ratios of the window 601A and the window 601B are different from the aspect ratio of the display screen of the second device, or that the window 601A and the window 601B occupy only part of the area of the display screen of the second device.
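Because the collaborative window keeps the first device's aspect ratio, it occupies only part of the second device's screen. One way such a window size could be computed is sketched below; the screen resolutions and the 80% cap are illustrative assumptions, not values from this application:

```java
// Illustrative sizing of a collaborative window: preserve the phone's
// aspect ratio while fitting within a fraction of the tablet screen.
public class CollabWindowSize {
    // Returns {width, height} of the window on the tablet.
    static int[] fit(int phoneW, int phoneH, int tabletW, int tabletH,
                     double maxFraction) {
        // Scale uniformly so neither dimension exceeds the allowed area.
        double scale = Math.min(maxFraction * tabletW / phoneW,
                                maxFraction * tabletH / phoneH);
        return new int[] { (int) Math.round(phoneW * scale),
                           (int) Math.round(phoneH * scale) };
    }

    public static void main(String[] args) {
        // Hypothetical example: a 1080x2340 phone mirrored onto a
        // 2560x1600 tablet, using at most 80% of each screen dimension.
        int[] size = fit(1080, 2340, 2560, 1600, 0.8);
        System.out.println(size[0] + "x" + size[1]);
    }
}
```

The window ends up tall and narrow on the landscape tablet, which is why the collaborative window occupies only part of the second device's display area.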
  • the window 601A may also include a title bar 601A2. The title bar 601A2 can include a hide control 6011, a minimize control 6012, a maximize control 6013, and the name 6014 of the first device.
  • the hide control 6011 can be used to hide the window 601A; the second device hides the window 601A in response to a click or touch operation on the hide control 6011, and the second device can restore the display of the window 601A on the display screen in response to a subsequent click or touch operation.
  • alternatively, the hide control 6011 can be used to disconnect the first device from the second device; the second device can actively disconnect from the first device in response to a click or touch operation on the hide control 6011. If the connection between the second device and the first device needs to be re-established, reference may be made to the corresponding description of establishing the connection above, which will not be repeated here.
  • the minimize control 6012 can be used to minimize the window 601A. In response to a click or touch operation on the minimize control 6012, the second device minimizes the window 601A, for example, minimizes the window 601A to the edge of the display screen of the second device as the small window 701 shown in FIG. 7 ; the small window 701 is the minimized window 601A.
  • the small window 701 may include the name of the first device, such as HUAWEI Nova 7 and the like. It should be noted that the position of the small window 701 in the display screen of the second device is not limited to the position shown in FIG. 7 , and may be any position on the edge of the display screen.
  • the second device may restore the window 601A in response to a click or touch operation on the small window 701 .
  • the minimize control 6012 can be used to switch the window 601A to run in the background, and when the window 601A needs to be displayed on the display screen, the window 601A is called out from the background.
  • the maximize control 6013 can be used to maximize the window 601A, and in response to a click or touch operation on the maximize control 6013, the second device maximizes the window 601A, for example, the window 601A fills the full screen and the like.
  • the maximize control will be described in detail below, and will not be described in detail here.
  • the name 6014 of the first device may be, for example, HUAWEI Nova 7 or the like.
  • For the composition and function of the window 601B in FIG. 6B, refer to the description of the window 601A in FIG. 6A, which will not be repeated here.
  • the first device can send the data of the user interface displayed on its own display screen and the information used to display the user interface to the second device through the "cooperative assistant" in real time.
  • the second device can update the collaboration window in real time according to the acquired information, so that the content displayed in the collaboration window always includes the content being displayed on the display screen of the first device.
  • After the second device receives the first interface display information sent by the first device through the "cooperative assistant", the second device displays the window mapped from the first device in full screen, or enters a multi-window mode to display it, according to the first interface display information. The following describes this process by way of example.
  • The multi-window mode described in the embodiments of the present application refers to multiple windows that display different user interfaces of the same application. For example, if the displayed user interface includes both the home page of the "application market" and the browsing page of "boutique applications", the display is in the multi-window mode described in this application. For example, refer to the user interface shown in FIG. 11 or FIG. 14, which will be described in detail below and is not described in detail here.
  • This Embodiment 1 introduces the process of displaying a collaboration window in full screen in conjunction with the above maximize control 6013, taking the window 601A shown in FIG. 6A as an example of the collaboration window.
  • the second device receives a click or touch operation on the maximize control 6013, and in response to the operation, the second device determines whether the window 601A has the property of full-screen display.
  • the first interface display information sent by the first device to the second device through the "cooperative assistant" includes the screen orientation (screenOrientation) attribute of the Activity of the application corresponding to the user interface shown in the window 601A.
  • the screen orientation attribute can be included in the stack information. Therefore, the second device can extract the screen orientation attribute information of the Activity of the application through the stack manager (for example, the stack manager shown in FIG. 3) and then check whether the attribute information is "landscape". "Landscape" indicates a horizontal screen orientation, which means the window can be enlarged to full screen.
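The attribute check described above can be summarized in a short sketch. The following Python model is hypothetical (the real implementation would query the Android stack manager); the stack information is represented here as a plain dictionary, and the key name `screenOrientation` mirrors the attribute named in the text:

```python
def can_fullscreen(stack_info: dict) -> bool:
    """Return True if the Activity's screenOrientation attribute,
    carried in the stack information of the first interface display
    information, is "landscape", i.e. the mapped window may be
    enlarged to full screen."""
    orientation = stack_info.get("screenOrientation", "")
    return orientation == "landscape"

# Hypothetical stack information extracted by the stack manager.
stack_info = {"activity": "com.example.app.MainActivity",
              "screenOrientation": "landscape"}
print(can_fullscreen(stack_info))  # True: the window may be enlarged
```

A missing or "portrait" attribute simply yields False, in which case the second device does not perform the full-screen enlargement.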
  • the second device invokes the window manager to enlarge the window 601A to full screen; a schematic diagram of the window 601A after full-screen enlargement is shown in FIG. 8.
  • the user interface shown in FIG. 8 may include a mapping window 801 and a title window 802.
  • the mapping window 801 is the maximized window in which the second device displays the user interface shown on the display screen of the first device.
  • the title window 802 may include a hide control 6011 , a minimize control 6012 , the name of the first device 6014 and a window restore control 6015 .
  • the window restoration control 6015 can be used to restore the full-screen interface shown in FIG. 8 to the window 601A shown in FIG. 6A .
  • the second device may receive a click or touch operation on the window restoration control 6015, and in response to the operation, the second device may call the window manager to restore the full-screen interface shown in FIG. 8 to the window 601A.
  • As can be seen from FIG. 8, the display positions of the hide control 6011, the minimize control 6012 and the name 6014 of the first device also differ from those in FIG. 6A.
  • the display position and arrangement of the hide control 6011, the minimize control 6012, the name 6014 of the first device and the window restoration control 6015 are not limited to those shown in the figure.
  • the icons of these controls are not limited to the shapes shown in FIG. 8 , and may be other shapes, which are not limited in this solution.
  • the above-mentioned title window 802 may alternatively not be displayed in FIG. 8, in which case the mapping window 801 may be displayed on the entire display screen of the second device; for example, see FIG. 12A. The full-screen display of the mapping window can then be exited by receiving a double-click operation on the mapping window covering the full screen. Alternatively, other methods can be used, such as receiving an Esc key input, to exit the full-screen display of the mapping window.
  • This Embodiment 2 introduces the process of entering the multi-window mode of the collaboration window in combination with the above maximize control 6013, taking the window 601B shown in FIG. 6B as an example of the collaboration window.
  • the second device receives a click or touch operation on the maximize control 6013, and in response to the operation, the second device determines whether the window 601B has the property of full-screen display.
  • the first interface display information sent by the above-mentioned first device to the second device through the "cooperative assistant" includes the screen orientation (screenOrientation) attribute of the Activity of the "application market" shown in the window 601B, and the screen orientation attribute can be included in the stack information. Therefore, the second device can extract the screen orientation attribute information of the Activity of the "application market" through the stack manager (for example, the stack manager shown in FIG. 3) and then check whether the attribute information is "landscape". "Landscape" indicates a horizontal screen orientation, which means the window can be enlarged to full screen.
  • the second device invokes the window manager to switch the window 601B to the state of the multi-window mode according to the first interface display information sent by the first device.
  • the second device can check, through the window manager, how many windows' stack information is included in the stack information in the first interface display information. The following two situations are introduced:
  • the stack information includes stack information of a window.
  • the window manager of the second device displays the home page window on one side of the display screen of the second device, and then customizes a fill window to be displayed on the other side of the display screen.
  • the fill window can be understood as a window customized by the second device and displayed together with the user interface of the "application market". See, for example, FIG. 9.
  • the user interface shown in FIG. 9 may include a home page window 901 , a populated A0 window 902 , and a title window 903 .
  • the home page window 901 is used to display the home page user interface of the "App Market"
  • the filled A0 window 902 is the above-mentioned custom window displayed in the user interface of the "App Market”.
  • for the title window 903, refer to the description of the title window 802, which will not be repeated here.
  • the home page window 901 , the filled A0 window 902 and the title window 903 can cover the full screen of the display screen of the second device.
  • the above-mentioned fill window may also display advertisement content, an introduction to the fill window, or other customized content, etc., which is not limited in this solution.
  • the above-mentioned title window 903 may alternatively not be displayed in FIG. 9, in which case the home page window 901 and the filled A0 window 902 may be displayed on the entire display screen of the second device; for example, see FIG. 13.
  • the full screen display can then be exited by receiving a double-tap on the display.
  • other methods can also be used, such as receiving an instruction of the Esc key to exit the full-screen display.
  • the above stack information includes stack information of multiple windows.
  • the plurality of windows are usually homepage windows of the "application market” and windows of another user interface entered based on the homepage window.
  • For example, assume that the home page window 601B1 of the "App Market" is currently displayed on the first device (because the home page window 601B1 is mapped from the first device, the window displayed on the first device at this time is the home page window 601B1). The first device can then enter the user interface of "boutique applications" in response to a click or touch operation on the "more" control in the window 601B1; that is, the user interface of "boutique applications" is displayed on the display screen of the first device.
  • Correspondingly, the user interface of "boutique applications" is also displayed in the collaboration window of the second device. See, for example, window 601C in FIG. 10.
  • the composition and function of each part of the window 601C in FIG. 10 can also be referred to the description of the window 601A in FIG. 6A , which will not be repeated here.
  • In this case, the stack information sent from the first device includes the stack information of the "boutique applications" window and the stack information of the home page window.
  • the window stack of the "boutique application” is at the top of the stack, and the stack of the home page window is at the bottom of the stack.
  • the window manager of the second device displays the home page window on one side of the display screen of the second device according to the above-mentioned stack information and the display data in the above-mentioned first interface display information, and displays the window of "boutique applications" on the other side of the display screen. See, for example, FIG. 11.
  • the user interface shown in FIG. 11 may include a home page window 1101 , a window 1102 of “Boutique Applications”, and a title window 1103 .
  • the home page window 1101 is used to display the home page of the user interface of the "App Market", and the window 1102 of "boutique applications" displays the above-mentioned user interface of "boutique applications".
  • for the title window 1103, refer to the description of the title window 802, which will not be repeated here.
  • the home page window 1101 , the window 1102 of the “boutique application” and the title window 1103 can fill the full screen of the display screen of the second device.
  • the above-mentioned title window 1103 may alternatively not be displayed in FIG. 11, in which case the home page window 1101 and the window 1102 of "boutique applications" may be displayed on the entire display screen of the second device; for example, see FIG. 14. The full-screen display can then be exited by receiving a double-click operation on the display screen. Alternatively, other methods can be used, such as receiving an Esc key input, to exit the full-screen display.
  • The above description takes stack information that includes two windows as an example for illustration; the stack information may also include three, four, or more windows.
  • the specific implementation process is similar, and reference may be made to the above description, which will not be repeated here.
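The two situations above — a single home page window padded with a customized fill window, versus multiple windows laid out from stack bottom (home page) to stack top (for example, the "boutique applications" window) — can be sketched as follows. This Python model is hypothetical: names such as `fill_window` are illustrative only, and a real implementation would go through the window manager. The window stacks are listed from stack top to stack bottom, as described for the stack information:

```python
def multi_window_layout(window_stacks):
    """Decide the side-by-side layout for multi-window mode from the
    window stacks carried in the first interface display information
    (ordered from stack top to stack bottom)."""
    if not window_stacks:
        raise ValueError("stack information contains no window")
    home = window_stacks[-1]  # bottom of stack: the home page window
    if len(window_stacks) == 1:
        # Situation 1: only the home page window exists; the second
        # device customizes a fill window for the other side.
        return {"left": home, "right": "fill_window"}
    # Situation 2: the window at the top of the stack (e.g. the
    # "boutique applications" window) occupies the other side.
    return {"left": home, "right": window_stacks[0]}

print(multi_window_layout(["app_market_home"]))
print(multi_window_layout(["boutique_apps", "app_market_home"]))
```

With three or more windows in the stack, the same rule applies: the home page window at the stack bottom anchors one side and the stack top is shown beside it.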
  • The above embodiments describe, in combination with the maximize control 6013, the process of displaying the collaboration window in full screen or entering the multi-window mode of the collaboration window.
  • The following describes the process of displaying the collaboration window in full screen or entering the multi-window mode of the collaboration window without the maximize control 6013.
  • the second device may directly display the collaboration window in full screen, or directly enter the multi-window mode, according to the first interface display information. That is, the second device does not need to first generate a smaller collaboration window (for example, the window 601A in FIG. 6A or the window 601B in FIG. 6B, etc.) according to the above-mentioned first interface display information, but can directly display the larger collaboration window (for example, the window shown in FIG. 8, etc.). The tedious operation of switching from a small window to a large window is thus reduced, which further facilitates user operations and improves user experience.
  • The third embodiment and the fourth embodiment below are used as examples for introduction.
  • The third embodiment introduces a process in which the second device directly enters the full-screen display of the collaboration window according to the first interface display information.
  • the first interface display information sent by the first device to the second device through the "cooperative assistant" may include the screenOrientation attribute of the Activity of the application corresponding to the user interface being displayed by the first device, and the screenOrientation attribute may be included in the stack information. Therefore, the second device can extract the screenOrientation attribute information of the Activity of the application through the stack manager (for example, the stack manager shown in FIG. 3 ), and then check whether the attribute information is "landscape".
  • the second device invokes the window manager to display, in full screen, the user interface mapped from the first device according to the above-mentioned first interface display information.
  • The fourth embodiment introduces a process in which the second device directly enters the multi-window mode according to the first interface display information.
  • the first interface display information sent by the first device to the second device through the "cooperative assistant" may include the screenOrientation attribute of the Activity of the application corresponding to the user interface being displayed by the first device, and the screenOrientation attribute can be included in the stack information. Therefore, the second device can extract the screenOrientation attribute information of the Activity of the application through the stack manager (for example, the stack manager shown in FIG. 3) and then check whether the attribute information is "landscape".
  • the second device checks, through the window manager, how many windows' stack information is included in the stack information in the first interface display information. The following two situations are introduced:
  • the stack information includes stack information of a window.
  • the window manager of the second device displays the home page window on one side of the display screen of the second device, and then customizes a fill window to be displayed on the other side of the display screen.
  • the fill window can be understood as a window customized by the second device and displayed together with the user interface of the "application market". See, for example, FIG. 9 or FIG. 13.
  • For the introduction of FIG. 9 or FIG. 13, refer to the above description; details are not repeated here.
  • the above stack information includes stack information of multiple windows.
  • the multiple windows are usually homepage windows of an application corresponding to the user interface being displayed by the first device and windows of another user interface entered based on the homepage window.
  • the window manager of the second device displays the home page window on one side of the display screen of the second device according to the stack information and the display data in the first interface display information, and displays the window of the other user interface, entered from the home page window, on the other side of the display screen. See, for example, FIG. 11 or FIG. 14.
  • The above description takes stack information that includes two windows as an example for illustration; the stack information may also include three, four, or more windows.
  • the specific implementation process is similar, and reference may be made to the above description, which will not be repeated here.
  • In addition, the second device may alternatively not display the collaboration window mapped from the above-mentioned first device in full screen, but instead use only a relatively large portion of the display screen of the second device, such as a three-quarters, four-fifths, or five-sixths area, to display the collaboration window mapped from the first device.
  • The specific display size of the collaboration window can be managed and implemented through the window manager (for example, the window manager shown in FIG. 3).
  • Described above is the process of displaying (or zooming in) the collaborative window mapped by the first device in full screen and the process of entering the multi-window mode of the collaborative window by the second device.
  • The following introduces these two processes in combination with whether the second device itself has installed the application corresponding to the collaboration window.
  • Embodiments 5 to 8 below are used as example introductions.
  • the fifth embodiment introduces the process of entering the full-screen display in conjunction with FIG. 6A and in conjunction with whether the second device is installed with an application corresponding to the user interface displayed by the first device.
  • the second device receives a click or touch operation on the maximize control 6013, and in response to the operation, the second device determines whether the window 601A has the property of full-screen display.
  • the second device can call the package manager (eg, the package manager described in FIG. 3 ) to determine whether the desktop application is installed on the second device.
  • Specifically, the first interface display information sent by the first device and received by the second device includes the package name of the desktop application. The package manager of the second device can obtain the package name of the desktop application and then query whether the package names of the applications installed on the second device include the package name of the desktop application. If so, the desktop application is installed on the second device; if not, the desktop application is not installed on the second device.
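The package-name query described above can be sketched as follows. This is a hypothetical Python model of what the package manager does (on Android the query would go through the PackageManager service); the package names shown are illustrative only:

```python
def is_app_installed(received_package: str, installed_packages) -> bool:
    """Return True if the package name carried in the first interface
    display information matches a package installed on the second
    device."""
    return received_package in set(installed_packages)

# Hypothetical list of packages installed on the second device.
installed = ["com.huawei.appmarket", "com.example.notes"]
print(is_app_installed("com.huawei.appmarket", installed))  # installed
print(is_app_installed("com.example.game", installed))      # not installed
```

The same check applies verbatim in Embodiments 6 to 8, with the desktop application replaced by the "application market" or the target application.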
  • In addition, the above-mentioned second device may, in response to the above-mentioned click or touch operation on the maximize control 6013, simultaneously perform the operation of judging the screenOrientation attribute of the Activity of the desktop application and the operation of judging whether the desktop application is installed on the second device.
  • the second device invokes the window manager to enlarge the window 601A in full screen.
  • For a schematic diagram of the window 601A enlarged to full screen, see FIG. 8 or FIG. 12A.
  • For the introduction of FIG. 8 or FIG. 12A, refer to the above description; details are not repeated here.
  • the second device starts its own desktop application (if it has already been started, it does not need to be started again) and, according to the first interface display information obtained above, displays in the user interface of its desktop application the same content as the user interface of the desktop application displayed by the first device.
  • the user interface of the desktop application displayed on the second device may also be displayed in full screen, for example, refer to the interface shown in FIG. 12A , at this time, the collaboration window 601A returns to the background to run. Alternatively, the collaboration window 601A may be closed at this time.
  • Specifically, the second device can call the window manager to obtain the package name and class name of the Activity of the desktop application from the stack information in the first interface display information, and then combine the package name and class name into an Intent to start the window. The started window will be the same as the interface of the collaboration window 601A, or consistent with the interface content displayed by the first device.
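The Intent composition described above can be sketched as follows. On Android this would correspond to building an explicit Intent from the component names (for example via Intent.setClassName); the Python model below only illustrates combining the two names taken from the stack information into a launch target, and all names shown are hypothetical:

```python
def build_launch_intent(package_name: str, class_name: str) -> dict:
    """Model of combining a package name and an Activity class name
    (taken from the stack information) into an explicit launch Intent."""
    return {
        "action": "android.intent.action.MAIN",
        "component": f"{package_name}/{class_name}",
    }

# Hypothetical names extracted from the stack information.
intent = build_launch_intent("com.huawei.android.launcher", ".ui.MainActivity")
print(intent["component"])
```

Starting the Activity named by this component then produces a window consistent with the interface content displayed by the first device.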
  • Optionally, the second device may provide a selection window to the user, allowing the user to choose whether to enter the full-screen interface (or the enlarged interface) through the collaboration window 601A or through the desktop application installed on the second device itself.
  • Illustratively, see FIG. 12B.
  • a selection window 1201 is displayed based on the user interface shown in FIG. 6A. The selection window 1201 includes a description 12011, a selection button 12012 for the collaboration window, and a selection button 12013 for the desktop application.
  • the content of the description 12011 may be "select to enter a full-screen interface through a collaborative window or a desktop application" and the like.
  • the second device may enter a full-screen interface (or a zoom-in interface) through the collaboration window 601A in response to the user's click or touch operation on the selection button 12012 of the collaboration window.
  • the second device may enter a full-screen interface (or an enlarged interface) through the desktop application installed on the second device itself in response to the user's click or touch operation on the desktop application selection button 12013 .
  • This Embodiment 6 introduces the process of entering multi-window mode display in combination with FIG. 6B and in combination with whether the second device is installed with an application corresponding to the user interface displayed by the first device.
  • the second device receives a click or touch operation on the maximize control 6013, and in response to the operation, the second device determines whether the window 601B has the property of full-screen display.
  • Specifically, the second device can call a package manager (for example, the package manager described in FIG. 3) to determine whether the "application market" is installed on the second device.
  • Specifically, the package manager of the second device can obtain the package name of the "application market" and then check whether the package names of the applications installed on the second device include the package name of the "App Market". If it is included, the "App Market" is installed on the second device; if not, the "App Market" is not installed on the second device.
  • In addition, the above-mentioned second device may, in response to the above-mentioned click or touch operation on the maximize control 6013, simultaneously perform the operation of judging the screenOrientation attribute of the Activity of the "App Market" and the operation of judging whether the "App Market" is installed on the second device.
  • the second device can display the multi-window mode according to the first and second situations in the above-mentioned Embodiment 2, which will not be repeated here.
  • When the "application market" is installed on the second device, the second device starts its own "application market" (if it has already been started, it does not need to be started again) and, according to the first interface display information obtained above, causes the user interface of the "application market" to enter the multi-window mode; the displayed content includes the content of the user interface of the "application market" displayed by the first device.
  • the user interface of the "application market” displayed on the second device may also be displayed in full screen, for example, see the interface shown in FIG. 13 or FIG. 14 .
  • the collaboration window 601B or the collaboration window 601C returns to the background to run.
  • the collaboration window 601B or the collaboration window 601C may also be closed at this time.
  • Specifically, the above-mentioned second device can call the window manager to obtain the package name and class name of the Activity of the "application market" from the stack information in the first interface display information, and then combine the package name and class name into an Intent to start the window. The started window will include the content of the interface of the collaboration window 601B, or the content displayed on the interface of the first device.
  • Optionally, the second device may provide a selection window to the user, allowing the user to choose whether to enter the multi-window mode through the collaboration window 601B or through the application installed on the second device itself.
  • For the specific selection implementation process, refer to the above description of FIG. 12B, which will not be repeated here.
  • This Embodiment 7 introduces a process in which the second device directly enters the full-screen display according to the information of the first interface based on whether the second device is installed with an application corresponding to the user interface displayed by the first device.
  • the first interface display information sent by the first device to the second device through the "cooperative assistant" may include the screenOrientation attribute of the Activity of the application corresponding to the user interface being displayed by the first device (the application may be any application installed on the first device and may be called the target application), and the screenOrientation attribute may be included in the stack information. Therefore, the second device can extract the screenOrientation attribute information of the Activity of the application through the stack manager (for example, the stack manager shown in FIG. 3) and then check whether the attribute information is "landscape".
  • the second device may call a package manager (eg, the package manager described in FIG. 3 ) to determine whether the target application is installed on the second device.
  • Specifically, the first interface display information received by the second device and sent by the first device includes the package name of the target application. The package manager of the second device can obtain the package name of the target application and then query whether the package names of the applications installed on the second device include the package name of the target application. If so, the target application is installed on the second device; if not, the target application is not installed on the second device.
  • In addition, the above-mentioned second device may simultaneously perform an operation of judging the screenOrientation attribute of the Activity of the target application and an operation of judging whether the target application is installed on the second device.
  • the second device invokes the window manager to display the user interface mapped by the first device in full screen according to the above-mentioned first interface display information.
  • For the interface displayed in full screen, see, for example, the interface shown in FIG. 8 or FIG. 12A.
  • For the introduction of FIG. 8 or FIG. 12A, refer to the above description; details are not repeated here.
  • the second device starts its own target application (if it has already been started, it does not need to be started again) and, according to the first interface display information obtained above, displays in the user interface of the target application the same content as the user interface of the target application displayed by the first device.
  • the user interface of the target application displayed on the second device may also be displayed in full screen, for example, refer to the interface shown in FIG. 12A .
  • Specifically, the second device can call the window manager to obtain the package name and class name of the Activity of the target application from the stack information in the first interface display information, and then combine the package name and class name into an Intent to start the window. The started window will include the interface content displayed by the first device.
  • This Embodiment 8 introduces a process in which the second device directly enters the multi-window mode according to the first interface display information, based on whether the second device has installed the application corresponding to the user interface displayed by the first device.
  • the first interface display information sent by the first device to the second device through the "cooperative assistant" may include the screenOrientation attribute of the Activity of the application corresponding to the user interface being displayed by the first device (the application may be any application installed on the first device and may be called the target application), and the screenOrientation attribute may be included in the stack information. Therefore, the second device can extract the screenOrientation attribute information of the Activity of the application through the stack manager (for example, the stack manager shown in FIG. 3) and then check whether the attribute information is "landscape".
  • the second device may call a package manager (eg, the package manager described in FIG. 3 ) to determine whether the target application is installed on the second device.
  • Specifically, the first interface display information received by the second device and sent by the first device includes the package name of the target application. The package manager of the second device can obtain the package name of the target application and then query whether the package names of the applications installed on the second device include the package name of the target application. If so, the target application is installed on the second device; if not, the target application is not installed on the second device.
  • In addition, the above-mentioned second device may simultaneously perform an operation of judging the screenOrientation attribute of the Activity of the target application and an operation of judging whether the target application is installed on the second device.
  • the second device may display the window according to the first and second situations in the fourth embodiment, which will not be repeated here.
  • the second device starts its own target application (if it has already been started, it does not need to be started again) and, according to the first interface display information obtained above, displays in the user interface of the target application the same content as the user interface of the target application displayed by the first device.
  • the user interface of the target application displayed on the second device may also be displayed in full screen, for example, refer to the interface shown in FIG. 13 or FIG. 14 .
  • Specifically, the above-mentioned second device can call the window manager to obtain the package name and class name of the Activity of the target application from the stack information in the first interface display information, and then combine the package name and class name into an Intent to start the window. The started window will include the content displayed on the interface of the first device.
  • the applications exemplified in the foregoing embodiments are only examples; the user interface of any application installed on the first device can be displayed in full screen, across devices, on the display screen of the second device.
  • the second device may also not display the collaboration window mapped from the first device in full screen, but only occupy a larger area of the second device's display screen, such as a three-quarter, four-fifth, or five-sixth area, to display the collaboration window mapped from the first device.
  • this can be implemented under the management of a window manager, such as the one shown in FIG. 3 .
  • entering full-screen display or multi-window mode through the corresponding application can improve the quality of the displayed user interface, reduce display latency, and improve the user experience.
  • the following describes operations performed on the second device in response to a user's instruction after the user interface displayed on the display screen of the first device is mapped to full-screen display on the display screen of the second device or displayed in a multi-window mode.
  • for a user interface that enters full-screen display or multi-window mode through the collaboration window (refer to Embodiment 1 to Embodiment 4 above): after receiving an instruction input by the user on the user interface, the second device sends the instruction to the first device through its own "cooperative assistant". The first device obtains the corresponding display data according to the instruction and sends the display data back to the second device through its own "cooperative assistant", and the second device updates the user interface shown on its display with that data. Of course, the display interface of the first device can also be updated according to the above instruction.
  • for a user interface that enters full-screen display or multi-window mode through the corresponding application (refer to Embodiments 5 to 8 above): after the second device receives an instruction input by the user on the user interface, it can update the user interface directly by interacting with the server of the application, with no need to interact with the first device.
  • by interacting with the server directly, this embodiment of the present application can reduce the delay in displaying the user interface and improve the user experience.
  • after the user interface displayed on the display screen of the first device is mapped to full-screen display on the display screen of the second device or enters multi-window mode display, the display screen of the second device displays a user interface whose typographic layout differs from that of the user interface displayed on the display screen of the first device.
  • FIG. 15 shows a schematic interactive flowchart of a display method provided by an embodiment of the present application.
  • the display method may include the following steps:
  • a first device acquires interface display information of a first application, where a display screen of the first device displays a first user interface of the first application and the interface display information includes the data used by the first device to display the first user interface.
  • the above-mentioned first device sends the interface display information to the above-mentioned second device.
  • the above-mentioned second device receives the interface display information.
  • the above-mentioned second device displays a second user interface according to the interface display information, where the content displayed on the second user interface includes the content displayed on the first user interface; the layout of the second user interface differs from that of the first user interface, and the display area of the second user interface is larger than that of the first user interface.
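The four steps above (acquire, send, receive, display with a re-laid-out larger UI) can be sketched as a minimal message exchange. This is an illustrative model only — the class names and the dict-based "interface display information" are assumptions, not the patent's actual data format:

```python
class FirstDevice:
    def __init__(self):
        # Step 1: the first device holds the data it uses to render
        # the first user interface of the first application.
        self.interface_display_info = {
            "package": "com.example.app",          # hypothetical package name
            "content": ["title bar", "video frame"],
            "landscape": True,
        }

    def send_display_info(self):
        # Step 2: send the interface display information to the second device.
        return dict(self.interface_display_info)


class SecondDevice:
    def receive_and_display(self, info):
        # Step 3: receive the interface display information.
        # Step 4: display a second user interface containing the same
        # content, but with a different layout over a larger display area.
        return {
            "content": info["content"],  # same content as the first UI
            "layout": "landscape-fullscreen" if info["landscape"] else "multi-window",
        }


second_ui = SecondDevice().receive_and_display(FirstDevice().send_display_info())
print(second_ui["layout"])  # landscape-fullscreen
```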
  • the above-mentioned second user interface may be, for example, the display interface shown in FIG. 8 , FIG. 9 , FIG. 11 , FIG. 12A , FIG. 13 or FIG. 14 .
  • the above interface display information is the first interface display information described in the above method embodiments.
  • the above-mentioned first application may be any application installed in the first device, for example, it may be a desktop application, an "application market" application, or a target application in the above-mentioned embodiment.
  • in the embodiments of the present application, a user interface may be a window, and a window may also be a user interface.
  • the method further includes:
  • the above-mentioned second device displays a third user interface according to the interface display information, where the size of the third user interface does not match the size of the display screen of the second device and the content displayed on the third user interface is consistent with the content displayed on the first user interface;
  • the above-mentioned second device receives a first operation input for a first control in the third user interface, where the first operation is used to trigger the second device to display the second user interface according to the interface display information.
  • the third user interface may be, for example, the above-mentioned window 601A1 shown in FIG. 6A or 601B1 shown in FIG. 6B , or the like.
  • the user interface mapped from the first device is first displayed at the size of the first device's display screen, so that the user can choose whether to display it in full screen, giving the user more options.
  • the second device displaying the second user interface according to the interface display information includes: the second device determining, according to the interface display information, that the first user interface has a landscape-display attribute; and the second device displaying the second user interface according to the landscape-display attribute.
  • the second device displaying the second user interface according to the interface display information includes: the second device determining, according to the interface display information, that the first user interface has no landscape-display attribute; and the second device displaying the second user interface according to the absence of that attribute, where the second user interface includes a plurality of small windows, the plurality of small windows include a window of the home page of the first application and a window consistent with the content displayed on the first user interface, and all of the small windows belong to the first application.
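The two branches above (landscape attribute present versus absent) amount to a layout decision on the second device. A hedged sketch follows — the window names and the tuple-based window description are invented for illustration and do not reflect the patent's window manager:

```python
def choose_second_ui_windows(has_landscape_attr, app_home_window, mirrored_window):
    """If the first UI has the landscape-display attribute, show the mirrored
    content as one full-screen window; otherwise compose several small
    windows of the same application (home page + the mirrored content)."""
    if has_landscape_attr:
        return [("fullscreen", mirrored_window)]
    # No landscape attribute: multi-window mode; all windows
    # belong to the first application.
    return [("small", app_home_window), ("small", mirrored_window)]

print(choose_second_ui_windows(True, "home page", "chat page"))
print(choose_second_ui_windows(False, "home page", "chat page"))
```

Either way the second device ends up using its full display area, which is the point of both branches.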
  • whether or not the above-mentioned first user interface has the landscape-display attribute, it can be displayed in full screen on the display screen of the second device, which provides users with workable solutions in multiple cases and improves the user experience.
  • the plurality of small windows include a window of the home page of the first application and a filling window
  • the filling window is a window that the second device defines itself and displays in the user interface of the first application.
  • the filling window may be, for example, the A0 filling window 902 shown in FIG. 9 or the like.
  • a full-screen user interface of the first application is presented to the user by means of the filling window, improving the user's sensory experience.
  • the above-mentioned interface display information includes information in the task stack of the first application on the first device; after the second device receives the first operation for the first control in the third user interface, the method further includes: the second device determining, according to the information in the task stack of the first application on the first device, that the first application is installed on the second device; and the second device displaying the second user interface based on the interface display information through the first application.
  • the second device displaying the second user interface based on the interface display information through the first application includes:
  • the second device displays the second user interface based on the interface display information through the first application in response to a second operation, where the second operation is a touch operation on a selection button of the first application.
  • entering full-screen display through the corresponding application can improve the quality of the displayed user interface, reduce display latency, and improve the user experience.
  • the present application can map the application content displayed by a first device (such as a mobile phone) onto the display screen of a second device (such as a tablet) for large-screen display, making full use of the screen area of the second device, offering users the possibility of operating on a large screen, and improving the user experience.
  • each device includes a corresponding hardware structure and/or software module for executing each function.
  • Those skilled in the art should easily realize that the present application can be implemented in hardware or a combination of hardware and computer software with the units and algorithm steps of each example described in conjunction with the embodiments disclosed herein. Whether a function is performed by hardware or computer software driving hardware depends on the specific application and design constraints of the technical solution. Skilled artisans may implement the described functionality using different methods for each particular application, but such implementations should not be considered beyond the scope of this application.
  • the device may be divided into functional modules according to the foregoing method examples.
  • each functional module may be divided corresponding to each function, or two or more functions may be integrated into one module.
  • the above-mentioned integrated modules can be implemented in the form of hardware, and can also be implemented in the form of software function modules. It should be noted that, the division of modules in the embodiments of the present application is schematic, and is only a logical function division, and there may be other division manners in actual implementation.
  • FIG. 16 shows a schematic diagram of a possible logical structure of a device, and the device may be the second device described in the foregoing embodiment.
  • the device 1600 may include a first receiving unit 1601 and a display unit 1602 , wherein:
  • the first receiving unit 1601 is configured to receive interface display information of a first application from a first device, where the display screen of the first device displays the first user interface of the first application and the interface display information includes the data used by the first device to display the first user interface;
  • the display unit 1602 is configured to display a second user interface according to the interface display information, where the content displayed on the second user interface includes the content displayed on the first user interface; the layout of the second user interface differs from that of the first user interface, and the display area of the second user interface is larger than that of the first user interface.
  • the display unit 1602 is further configured to: after the first receiving unit 1601 receives the interface display information of the first application from the first device, and before the display unit 1602 displays the second user interface according to the interface display information, display a third user interface according to the interface display information, where the size of the third user interface does not match the size of the display screen of the device 1600 and the content displayed on the third user interface is consistent with the content displayed on the first user interface;
  • the device 1600 further includes a second receiving unit configured to receive a first operation input for the first control in the third user interface, where the first operation is used to trigger the device 1600 to display the second user interface according to the interface display information.
  • the display unit 1602 is specifically configured to: determine that the first user interface has a landscape display attribute according to the interface display information; and display the second user interface according to the landscape display attribute.
  • the display unit 1602 is specifically configured to: determine, according to the interface display information, that the first user interface has no landscape-display attribute; and display the second user interface according to the absence of that attribute, where the second user interface includes a plurality of small windows, the plurality of small windows include a window of the home page of the first application and a window consistent with the content displayed on the first user interface, and all of the small windows belong to the first application.
  • the plurality of small windows include a window of the home page of the first application and a filling window
  • the filling window is a window that the device 1600 customizes and displays in the user interface of the first application.
  • the above-mentioned interface display information includes information in the task stack of the first application on the first device; the device 1600 further includes a judging unit configured to, after the second receiving unit receives the first operation for the first control in the third user interface, determine from the information in the task stack of the first application on the first device that the first application is installed on the device 1600 .
  • the above-mentioned display unit 1602 is further configured to display the above-mentioned second user interface based on the above-mentioned interface display information through the above-mentioned first application.
  • the display unit 1602 is specifically configured to: in response to a second operation, display the second user interface based on the interface display information through the first application, where the second operation is a touch operation on a selection button of the first application.
  • the term “when” may be interpreted to mean “if” or “after” or “in response to determining" or “in response to detecting" depending on the context.
  • the phrases “in determining" or “if detecting (the stated condition or event)” can be interpreted to mean “if determining" or “in response to determining" or “on detecting (the stated condition or event)” or “in response to the detection of (the stated condition or event)”.
  • the above-mentioned embodiments may be implemented in whole or in part by software, hardware, firmware, or any combination thereof.
  • when implemented in software, they may be implemented in whole or in part in the form of a computer program product.
  • the computer program product includes one or more computer instructions. When the computer program instructions are loaded and executed on a computer, all or part of the processes or functions described in the embodiments of the present application are generated.
  • the computer may be a general purpose computer, a special purpose computer, a computer network, or other programmable device.
  • the computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another; for example, the computer instructions may be transmitted from a website, computer, server, or data center to another website, computer, server, or data center by wire (e.g., coaxial cable, optical fiber, digital subscriber line) or wirelessly (e.g., infrared, radio, microwave).
  • the computer-readable storage medium can be any available medium that can be accessed by a computer or a data storage device such as a server, a data center, or the like that includes an integration of one or more available media.
  • the usable media may be magnetic media (e.g., floppy disks, hard disks, magnetic tapes), optical media (e.g., DVDs), semiconductor media (e.g., solid state drives), and the like.
  • the process can be completed by a computer program instructing the relevant hardware, and the program can be stored in a computer-readable storage medium; when the program is executed, it may include the processes of the foregoing method embodiments.
  • the aforementioned storage medium includes media that can store program code, such as a ROM, a random access memory (RAM), a magnetic disk, or an optical disc.

Abstract

A display interaction system, display method, and device. The display interaction system includes a first device (11) and a second device (12). The first device (11) is configured to send interface display information of a first application to the second device (12), where the display screen of the first device (11) displays a first user interface of the first application and the interface display information includes the data used by the first device (11) to display the first user interface. The second device (12) is configured to display a second user interface of the first application according to the interface display information; the content displayed on the second user interface includes the content displayed on the first user interface; the layout of the second user interface differs from that of the first user interface, and the display area of the second user interface is larger than that of the first user interface. The display interaction system can make full use of the display area of the second device (12) and improve the user experience.

Description

Display interaction system, display method, and device
This application claims priority to Chinese Patent Application No. 202010708095.8, entitled "Display interaction system, display method, and device", filed with the Chinese Patent Office on July 21, 2020, which is incorporated herein by reference in its entirety.
Technical Field
This application relates to the field of terminal and communication technologies, and in particular, to a display interaction system, a display method, and a device.
Background
Screen projection refers to sharing the content displayed by an electronic device to be projected (usually a mobile terminal such as a mobile phone or tablet computer) to another, second device with a display screen (usually a television, smart interactive whiteboard, projector, or similar device). At present, in the field of screen projection, taking projection onto a smart interactive whiteboard as an example, as long as the electronic device to be projected can communicate with the smart interactive whiteboard and has the corresponding projection application installed, it can project content onto the smart interactive whiteboard through a specific wireless hotspot.
In existing technical solutions, however, the size of the window of the electronic device's desktop displayed on the second device does not change after projection, so the display area of the second device cannot be fully utilized.
Summary
Embodiments of this application disclose a display method and related apparatus that can make full use of the display area of a second device, facilitate user operation, and improve the user experience.
According to a first aspect, this application provides a display interaction system, including a first device and a second device, where:
the first device is configured to send interface display information of a first application to the second device, where the display screen of the first device displays a first user interface of the first application and the interface display information includes the data used by the first device to display the first user interface;
the second device is configured to display a second user interface of the first application according to the interface display information; the content displayed on the second user interface includes the content displayed on the first user interface; the layout of the second user interface differs from that of the first user interface, and the display area of the second user interface is larger than that of the first user interface.
This application can map the application content displayed by the first device (for example, a mobile phone) onto the display screen of the second device (for example, a tablet) for large-screen display, making full use of the screen area of the second device, offering the user the possibility of operating on a large screen, and improving the user experience.
In a possible implementation, after the first device sends the interface display information of the first application to the second device and before the second device displays the second user interface according to the interface display information, the system further includes:
the second device is configured to display a third user interface according to the interface display information, where the size of the third user interface does not match the size of the display screen of the second device and the content displayed on the third user interface is consistent with the content displayed on the first user interface;
the second device is configured to receive a first operation input for a first control in the third user interface, where the first operation is used to trigger the second device to display the second user interface according to the interface display information.
In this embodiment of the application, the user interface mapped from the first device is first displayed at the size of the first device's display screen, so that the user can choose whether to display it in full screen, giving the user more options.
In a possible implementation, the second device being configured to display the second user interface according to the interface display information includes:
the second device is configured to determine, according to the interface display information, that the first user interface has a landscape-display attribute;
the second device is configured to display the second user interface according to the landscape-display attribute.
In a possible implementation, the second device being configured to display the second user interface according to the interface display information includes:
the second device is configured to determine, according to the interface display information, that the first user interface has no landscape-display attribute;
the second device is configured to display the second user interface according to the absence of that attribute, where the second user interface includes a plurality of small windows, the plurality of small windows include a window of the home page of the first application and a window consistent with the content displayed on the first user interface, and all of the small windows belong to the first application.
In the above two possible implementations, whether or not the first user interface has the landscape-display attribute, it can be displayed in full screen on the display screen of the second device, which provides users with workable solutions in multiple cases and improves the user experience.
In a possible implementation, the plurality of small windows include a window of the home page of the first application and a filling window, where the filling window is a window that the second device defines itself and displays in the user interface of the first application.
In this embodiment of the application, a full-screen user interface of the first application is presented to the user by means of the filling window, improving the user's sensory experience.
In a possible implementation, the interface display information includes information in the task stack of the first application on the first device, and after the second device receives the first operation for the first control in the third user interface, the system further includes:
the second device is configured to determine, according to the information in the task stack of the first application on the first device, that the first application is installed on the second device;
the second device is configured to display the second user interface based on the interface display information through the first application.
In a possible implementation, the second device being configured to display the second user interface based on the interface display information through the first application includes:
the second device is configured to display, in response to a second operation, the second user interface based on the interface display information through the first application, where the second operation is a touch operation on a selection button of the first application.
In this embodiment of the application, entering full-screen display through the corresponding application yields a better-displayed user interface, reduces display latency, and improves the user experience.
According to a second aspect, this application provides a display method, including:
a second device receives interface display information of a first application from a first device, where the display screen of the first device displays a first user interface of the first application and the interface display information includes the data used by the first device to display the first user interface;
the second device displays a second user interface according to the interface display information, where the content displayed on the second user interface includes the content displayed on the first user interface; the layout of the second user interface differs from that of the first user interface, and the display area of the second user interface is larger than that of the first user interface.
In a possible implementation, after the second device receives the interface display information of the first application from the first device and before the second device displays the second user interface according to the interface display information, the method further includes:
the second device displays a third user interface according to the interface display information, where the size of the third user interface does not match the size of the display screen of the second device and the content displayed on the third user interface is consistent with the content displayed on the first user interface;
the second device receives a first operation input for a first control in the third user interface, where the first operation is used to trigger the second device to display the second user interface according to the interface display information.
In a possible implementation, the second device displaying the second user interface according to the interface display information includes:
the second device determines, according to the interface display information, that the first user interface has a landscape-display attribute;
the second device displays the second user interface according to the landscape-display attribute.
In a possible implementation, the second device displaying the second user interface according to the interface display information includes:
the second device determines, according to the interface display information, that the first user interface has no landscape-display attribute;
the second device displays the second user interface according to the absence of that attribute, where the second user interface includes a plurality of small windows, the plurality of small windows include a window of the home page of the first application and a window consistent with the content displayed on the first user interface, and all of the small windows belong to the first application.
In a possible implementation, the plurality of small windows include a window of the home page of the first application and a filling window, where the filling window is a window that the second device defines itself and displays in the user interface of the first application.
In a possible implementation, the interface display information includes information in the task stack of the first application on the first device, and after the second device receives the first operation for the first control in the third user interface, the method further includes:
the second device determines, according to the information in the task stack of the first application on the first device, that the first application is installed on the second device;
the second device displays the second user interface based on the interface display information through the first application.
In a possible implementation, the second device displaying the second user interface based on the interface display information through the first application includes:
the second device displays, in response to a second operation, the second user interface based on the interface display information through the first application, where the second operation is a touch operation on a selection button of the first application.
According to a third aspect, this application provides a display device, including:
a first receiving unit, configured to receive interface display information of a first application from a first device, where the display screen of the first device displays a first user interface of the first application and the interface display information includes the data used by the first device to display the first user interface;
a display unit, configured to display a second user interface according to the interface display information, where the content displayed on the second user interface includes the content displayed on the first user interface; the layout of the second user interface differs from that of the first user interface, and the display area of the second user interface is larger than that of the first user interface.
In a possible implementation, the display unit is further configured to: after the first receiving unit receives the interface display information of the first application from the first device and before the display unit displays the second user interface according to the interface display information,
display a third user interface according to the interface display information, where the size of the third user interface does not match the size of the display screen of the display device and the content displayed on the third user interface is consistent with the content displayed on the first user interface;
the display device further includes a second receiving unit, configured to receive a first operation input for a first control in the third user interface, where the first operation is used to trigger the display device to display the second user interface according to the interface display information.
In a possible implementation, the display unit is specifically configured to:
determine, according to the interface display information, that the first user interface has a landscape-display attribute;
display the second user interface according to the landscape-display attribute.
In a possible implementation, the display unit is specifically configured to:
determine, according to the interface display information, that the first user interface has no landscape-display attribute;
display the second user interface according to the absence of that attribute, where the second user interface includes a plurality of small windows, the plurality of small windows include a window of the home page of the first application and a window consistent with the content displayed on the first user interface, and all of the small windows belong to the first application.
In a possible implementation, the plurality of small windows include a window of the home page of the first application and a filling window, where the filling window is a window that the display device defines itself and displays in the user interface of the first application.
In a possible implementation, the interface display information includes information in the task stack of the first application on the first device, and the display device further includes a judging unit, configured to: after the second receiving unit receives the first operation for the first control in the third user interface,
determine, according to the information in the task stack of the first application on the first device, that the first application is installed on the display device;
the display unit is further configured to display the second user interface based on the interface display information through the first application.
In a possible implementation, the display unit is specifically configured to:
display, in response to a second operation, the second user interface based on the interface display information through the first application, where the second operation is a touch operation on a selection button of the first application.
According to a fourth aspect, this application provides a display device, including a processor, a receiving interface, a sending interface, and a memory, where the memory is configured to store a computer program and/or data, and the processor is configured to execute the computer program stored in the memory so that the device performs the method according to any one of the second aspect.
According to a fifth aspect, this application provides a computer-readable storage medium storing a computer program, where the computer program is executed by a processor to implement the method according to any one of the second aspect.
According to a sixth aspect, this application provides a computer program product; when the computer program product is read and executed by a computer, the method according to any one of the second aspect is performed.
According to a seventh aspect, this application provides a computer program; when the computer program is executed on a computer, the computer is caused to implement the method according to any one of the second aspect.
In summary, this application can map the application content displayed by the first device (for example, a mobile phone) onto the display screen of the second device (for example, a tablet) for large-screen display, making full use of the screen area of the second device, offering the user the possibility of operating on a large screen, and improving the user experience.
Brief Description of the Drawings
FIG. 1 is a schematic architectural diagram of a display interaction system according to an embodiment of this application;
FIG. 2 is a schematic diagram of a hardware structure of an electronic device according to an embodiment of this application;
FIG. 3 is a schematic diagram of a software structure of an electronic device according to an embodiment of this application;
FIG. 4A is a schematic diagram of a user interface of an electronic device according to an embodiment of this application;
FIG. 4B is a schematic diagram of a status bar drop-down menu of a user interface of an electronic device according to an embodiment of this application;
FIG. 4C is a schematic diagram of a user interface of another electronic device according to an embodiment of this application;
FIG. 5A to FIG. 5C are schematic diagrams of user interfaces during the establishment of a connection between two devices according to an embodiment of this application;
FIG. 6A and FIG. 6B are schematic diagrams of user interfaces of an electronic device according to an embodiment of this application;
FIG. 7 to FIG. 11 are schematic diagrams of user interfaces on an electronic device according to embodiments of this application;
FIG. 12A and FIG. 12B are schematic diagrams of user interfaces on an electronic device according to an embodiment of this application;
FIG. 13 and FIG. 14 are schematic diagrams of user interfaces on an electronic device according to an embodiment of this application;
FIG. 15 is a schematic interactive flowchart of a display method according to an embodiment of this application;
FIG. 16 is a schematic diagram of a logical structure of a device according to an embodiment of this application.
Detailed Description of Embodiments
The technical solutions in the embodiments of this application are described clearly and completely below with reference to the accompanying drawings in the embodiments of this application.
For a better understanding of the display interaction system, display method, and device provided by the embodiments of this application, the architecture of the display interaction system to which the display method provided by the embodiments of this application is applicable is described first. Referring to FIG. 1, FIG. 1 is a schematic diagram of a display interaction system according to an embodiment of this application. As shown in FIG. 1, the system may include one or more first devices 11 (FIG. 1 shows only one first device by way of example) and one or more second devices 12 (FIG. 1 shows only one second device by way of example), where:
The first device 11 can install and run one or more applications (APPs), for example WeChat, Meituan, or e-mail applications, as well as an application for mapping the display content of the first device 11 to the second device (referred to as the "cooperative assistant" in subsequent embodiments of this application). An application program may also be referred to simply as an application.
The first device 11 may include, but is not limited to, any handheld electronic product based on a smart operating system that can interact with the user through input devices such as a keyboard, virtual keyboard, touchpad, touchscreen, or voice-control device, such as a smartphone, tablet computer, handheld computer, or wearable electronic device. The smart operating system includes, but is not limited to, any operating system that enriches device functions by providing various applications to the device, such as Android, iOS, Windows, and macOS.
The second device 12 may include, but is not limited to, a tablet computer, personal computer, desktop computer, television, vehicle-mounted display, projector display, and the like. In this embodiment, the second device 12 can provide a display service for the first device 11. Corresponding programs need to run on the second device 12 to provide that display service, such as an application that receives and saves the information sent by the first device 11 (which may be called the "cooperative assistant" below) and an application that performs display on the display screen of the second device 12 according to the information sent by the first device 11 (which may be called the "window manager" below), and so on.
The second device 12 can establish a connection with the first device 11 through a data cable, Bluetooth, or a wireless fidelity (WiFi) network, among other means, for data exchange. For example, the first device 11 and the second device 12 can establish a communication connection through WiFi p2p technology: when both devices are connected to the same network, the first device 11 can discover the second device 12 by searching and then establish a communication connection with the second device 12 upon receiving the user's operation instruction; or, when both devices access the same network at the same time, the first device 11 can discover the second device 12 by searching and automatically establish a communication connection with it. The process of establishing a communication connection between the two devices is described in detail below and is not detailed here.
The following first describes, with reference to FIG. 2, an exemplary electronic device provided in the following embodiments of this application.
FIG. 2 shows a schematic structural diagram of an electronic device 100, which may be the second device 12 shown in FIG. 1.
The electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charging management module 150, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, a headset jack 170D, a sensor module 180, buttons 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a subscriber identification module (SIM) card interface 195, and the like. The sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, a barometric pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
It can be understood that the structure illustrated in this embodiment of the application does not constitute a specific limitation on the electronic device 100. In other embodiments of this application, the electronic device 100 may include more or fewer components than shown, combine some components, split some components, or arrange the components differently. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor 110 may include one or more processing units. For example, the processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc. Different processing units may be independent components or may be integrated in one or more processors.
The controller can generate operation control signals according to instruction operation codes and timing signals to complete the control of fetching and executing instructions.
A memory may also be provided in the processor 110 to store instructions and data. In some embodiments, the memory in the processor 110 is a cache, which may hold instructions or data that the processor 110 has just used or uses cyclically. If the processor 110 needs to use the instructions or data again, it can call them directly from this memory, avoiding repeated accesses and reducing the waiting time of the processor 110, thereby improving the efficiency of the system.
In some embodiments, the processor 110 may include one or more interfaces. The interfaces may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a universal serial bus (USB) interface, etc.
The I2C interface is a bidirectional synchronous serial bus including a serial data line (SDA) and a serial clock line (SCL). In some embodiments, the processor 110 may contain multiple groups of I2C buses. The processor 110 may be coupled to the touch sensor 180K, a charger, a flash, the camera 193, and the like through different I2C bus interfaces. For example, the processor 110 may be coupled to the touch sensor 180K through an I2C interface, so that the processor 110 and the touch sensor 180K communicate through the I2C bus interface to implement the touch function of the electronic device 100.
The I2S interface can be used for audio communication. In some embodiments, the processor 110 may contain multiple groups of I2S buses. The processor 110 may be coupled to the audio module 170 through the I2S bus to implement communication between the processor 110 and the audio module 170. In some embodiments, the audio module 170 may transmit audio signals to the wireless communication module 160 through the I2S interface, implementing the function of answering calls through a Bluetooth headset.
The PCM interface can also be used for audio communication, sampling, quantizing, and encoding analog signals. In some embodiments, the audio module 170 and the wireless communication module 160 may be coupled through a PCM bus interface. In some embodiments, the audio module 170 may also transmit audio signals to the wireless communication module 160 through the PCM interface, implementing the function of answering calls through a Bluetooth headset. Both the I2S interface and the PCM interface can be used for audio communication.
The UART interface is a universal serial data bus used for asynchronous communication. The bus may be a bidirectional communication bus, converting the data to be transmitted between serial and parallel communication. In some embodiments, the UART interface is usually used to connect the processor 110 and the wireless communication module 160. For example, the processor 110 communicates with the Bluetooth module in the wireless communication module 160 through the UART interface to implement the Bluetooth function. In some embodiments, the audio module 170 may transmit audio signals to the wireless communication module 160 through the UART interface, implementing the function of playing music through a Bluetooth headset.
The MIPI interface can be used to connect the processor 110 with peripheral components such as the display screen 194 and the camera 193. The MIPI interface includes a camera serial interface (CSI), a display serial interface (DSI), and the like. In some embodiments, the processor 110 and the camera 193 communicate through the CSI interface to implement the shooting function of the electronic device 100, and the processor 110 and the display screen 194 communicate through the DSI interface to implement the display function of the electronic device 100.
The GPIO interface can be configured by software, either as a control signal or as a data signal. In some embodiments, the GPIO interface may be used to connect the processor 110 with the camera 193, the display screen 194, the wireless communication module 160, the audio module 170, the sensor module 180, and the like. The GPIO interface can also be configured as an I2C interface, an I2S interface, a UART interface, a MIPI interface, and so on.
The USB interface 130 is an interface conforming to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type-C interface, or the like. The USB interface 130 can be used to connect a charger to charge the electronic device 100, to transmit data between the electronic device 100 and peripheral devices, or to connect a headset and play audio through the headset. The interface can also be used to connect other electronic devices, such as AR devices.
It can be understood that the interface connection relationships between the modules illustrated in this embodiment of the application are only schematic illustrations and do not constitute a structural limitation on the electronic device 100. In other embodiments of this application, the electronic device 100 may also adopt interface connection manners different from those in the foregoing embodiment, or a combination of multiple interface connection manners.
The charging management module 150 is configured to receive charging input from a charger, which may be a wireless charger or a wired charger. In some wired charging embodiments, the charging management module 150 may receive the charging input of a wired charger through the USB interface 130. In some wireless charging embodiments, the charging management module 150 may receive wireless charging input through a wireless charging coil of the electronic device 100. While charging the battery 142, the charging management module 150 can also supply power to the electronic device through the power management module 141.
The power management module 141 is configured to connect the battery 142, the charging management module 150, and the processor 110. The power management module 141 receives input from the battery 142 and/or the charging management module 150 and supplies power to the processor 110, the internal memory 121, the display screen 194, the camera 193, the wireless communication module 160, and the like. The power management module 141 can also be used to monitor parameters such as battery capacity, battery cycle count, and battery health state (leakage, impedance). In some other embodiments, the power management module 141 may also be provided in the processor 110. In still other embodiments, the power management module 141 and the charging management module 150 may also be provided in the same component.
The wireless communication function of the electronic device 100 can be implemented through the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, the modem processor, the baseband processor, and the like.
The antenna 1 and the antenna 2 are used to transmit and receive electromagnetic wave signals. Each antenna in the electronic device 100 may be used to cover a single communication frequency band or multiple communication frequency bands. Different antennas may also be multiplexed to improve antenna utilization; for example, the antenna 1 can be multiplexed as a diversity antenna of a wireless local area network. In some other embodiments, an antenna may be used in combination with a tuning switch.
The mobile communication module 150 can provide wireless communication solutions applied to the electronic device 100, including 2G/3G/4G/5G. The mobile communication module 150 may include at least one filter, a switch, a power amplifier, a low noise amplifier (LNA), and the like. The mobile communication module 150 can receive electromagnetic waves through the antenna 1, filter and amplify the received electromagnetic waves, and transmit them to the modem processor for demodulation. The mobile communication module 150 can also amplify the signal modulated by the modem processor and convert it into electromagnetic waves radiated through the antenna 1. In some embodiments, at least some functional modules of the mobile communication module 150 may be provided in the processor 110. In some embodiments, at least some functional modules of the mobile communication module 150 may be provided in the same component as at least some modules of the processor 110.
The modem processor may include a modulator and a demodulator. The modulator is used to modulate the low-frequency baseband signal to be sent into a medium- or high-frequency signal, and the demodulator is used to demodulate the received electromagnetic wave signal into a low-frequency baseband signal, which the demodulator then transmits to the baseband processor. After being processed by the baseband processor, the low-frequency baseband signal is passed to the application processor, which outputs a sound signal through an audio device (not limited to the speaker 170A, the receiver 170B, etc.) or displays an image or video through the display screen 194. In some embodiments, the modem processor may be an independent component. In other embodiments, the modem processor may be independent of the processor 110 and provided in the same component as the mobile communication module 150 or other functional modules.
The wireless communication module 160 can provide wireless communication solutions applied to the electronic device 100, including wireless local area networks (WLAN) (such as a wireless fidelity (Wi-Fi) network), Bluetooth (BT), the global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR) technology, and the like. The wireless communication module 160 may be one or more components integrating at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, frequency-modulates and filters the electromagnetic wave signal, and sends the processed signal to the processor 110. The wireless communication module 160 can also receive the signal to be sent from the processor 110, frequency-modulate and amplify it, and convert it into electromagnetic waves radiated through the antenna 2.
In some embodiments, the antenna 1 of the electronic device 100 is coupled to the mobile communication module 150 and the antenna 2 is coupled to the wireless communication module 160, so that the electronic device 100 can communicate with networks and other devices through wireless communication technologies. The wireless communication technologies may include the global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technology. The GNSS may include the global positioning system (GPS), the global navigation satellite system (GLONASS), the BeiDou navigation satellite system (BDS), the quasi-zenith satellite system (QZSS), and/or satellite based augmentation systems (SBAS).
The electronic device 100 implements the display function through the GPU, the display screen 194, the application processor, and the like. The GPU is a microprocessor for image processing, connecting the display screen 194 and the application processor, and is used to perform mathematical and geometric calculations for graphics rendering. The processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
The display screen 194 is used to display images, videos, and the like, and includes a display panel. The display panel may adopt a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a MiniLED, a MicroLED, a Micro-OLED, quantum dot light-emitting diodes (QLED), and the like. In some embodiments, the electronic device 100 may include 1 or N display screens 194, where N is a positive integer greater than 1.
The electronic device 100 can implement the shooting function through the ISP, the camera 193, the video codec, the GPU, the display screen 194, the application processor, and the like.
The ISP is used to process the data fed back by the camera 193. For example, when taking a photo, the shutter is opened, light is transmitted through the lens to the camera's photosensitive element, the optical signal is converted into an electrical signal, and the photosensitive element transmits the electrical signal to the ISP for processing and conversion into an image visible to the naked eye. The ISP can also perform algorithmic optimization on the noise, brightness, and skin tone of the image, and can optimize parameters such as the exposure and color temperature of the shooting scene. In some embodiments, the ISP may be provided in the camera 193.
The camera 193 is used to capture still images or video. An object generates an optical image through the lens, which is projected onto the photosensitive element. The photosensitive element may be a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The photosensitive element converts the optical signal into an electrical signal and then transmits the electrical signal to the ISP for conversion into a digital image signal. The ISP outputs the digital image signal to the DSP for processing, and the DSP converts the digital image signal into an image signal in a standard format such as RGB or YUV. In some embodiments, the electronic device 100 may include 1 or N cameras 193, where N is a positive integer greater than 1.
The digital signal processor is used to process digital signals; in addition to digital image signals, it can process other digital signals. For example, when the electronic device 100 selects a frequency point, the digital signal processor is used to perform a Fourier transform or the like on the frequency point energy.
Video codecs are used to compress or decompress digital video. The electronic device 100 may support one or more video codecs, so that it can play or record videos in multiple encoding formats, such as moving picture experts group (MPEG) 1, MPEG2, MPEG3, and MPEG4.
The NPU is a neural-network (NN) computing processor. By drawing on the structure of biological neural networks, for example the transfer mode between neurons of the human brain, it processes input information quickly and can also learn by itself continuously. Through the NPU, applications such as intelligent cognition of the electronic device 100 can be implemented, for example image recognition, face recognition, speech recognition, and text understanding.
The external memory interface 120 can be used to connect an external memory card, such as a Micro SD card, to expand the storage capability of the electronic device 100. The external memory card communicates with the processor 110 through the external memory interface 120 to implement the data storage function, for example saving files such as music and videos in the external memory card.
The internal memory 121 can be used to store computer-executable program code, which includes instructions. The internal memory 121 may include a program storage area and a data storage area. The program storage area can store the operating system and the applications required for at least one function (such as a sound playing function and an image playing function), and the data storage area can store data created during the use of the electronic device 100 (such as audio data and a phone book). In addition, the internal memory 121 may include a high-speed random access memory and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or a universal flash storage (UFS). The processor 110 executes various functional applications and data processing of the electronic device 100 by running the instructions stored in the internal memory 121 and/or the instructions stored in the memory provided in the processor.
The electronic device 100 can implement audio functions, such as music playback and recording, through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the headset jack 170D, the application processor, and the like.
The audio module 170 is used to convert digital audio information into an analog audio signal output, and also to convert an analog audio input into a digital audio signal. The audio module 170 can also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be provided in the processor 110, or some functional modules of the audio module 170 may be provided in the processor 110.
The speaker 170A, also called the "loudspeaker", is used to convert an audio electrical signal into a sound signal. The electronic device 100 can play music or hold a hands-free call through the speaker 170A.
The receiver 170B, also called the "earpiece", is used to convert an audio electrical signal into a sound signal. When the electronic device 100 answers a call or voice message, the voice can be heard by bringing the receiver 170B close to the ear.
The microphone 170C, also called the "mic" or "mike", is used to convert a sound signal into an electrical signal. When making a call or sending a voice message, the user can speak with the mouth close to the microphone 170C to input the sound signal into the microphone 170C. The electronic device 100 may be provided with at least one microphone 170C. In other embodiments, the electronic device 100 may be provided with two microphones 170C, which can implement a noise reduction function in addition to collecting sound signals. In still other embodiments, the electronic device 100 may be provided with three, four, or more microphones 170C to collect sound signals, reduce noise, identify the sound source, implement a directional recording function, and so on.
The headset jack 170D is used to connect a wired headset. The headset jack 170D may be the USB interface 130, or may be a 3.5 mm open mobile terminal platform (OMTP) standard interface or a cellular telecommunications industry association of the USA (CTIA) standard interface.
压力传感器180A用于感受压力信号,可以将压力信号转换成电信号。在一些实施例中,压力传感器180A可以设置于显示屏194。压力传感器180A的种类很多,如电阻式压力传感器,电感式压力传感器,电容式压力传感器等。电容式压力传感器可以是包括至少两个具有导电材料的平行板。当有力作用于压力传感器180A,电极之间的电容改变。电子设备100根据电容的变化确定压力的强度。当有触摸操作作用于显示屏194,电子设备100根据压力传感器180A检测所述触摸操作强度。电子设备100也可以根据压力传感器180A的检测信号计算触摸的位置。在一些实施例中,作用于相同触摸位置,但不同触摸操作强度的触摸操作,可以对应不同的操作指令。例如:当有触摸操作强度小于第一压力阈值的触摸操作作用于短消息应用图标时,执行查看短消息的指令。当有触摸操作强度大于或等于第一压力阈值的触摸操作作用于短消息应用图标时,执行新建短消息的指令。
陀螺仪传感器180B可以用于确定电子设备100的运动姿态。在一些实施例中,可以通过陀螺仪传感器180B确定电子设备100围绕三个轴(即,x,y和z轴)的角速度。陀螺仪传感器180B可以用于拍摄防抖。示例性的,当按下快门,陀螺仪传感器180B检测电子设备100抖动的角度,根据角度计算出镜头模组需要补偿的距离,让镜头通过反向运动抵消电子设备100的抖动,实现防抖。陀螺仪传感器180B还可以用于导航,体感游戏场景。
气压传感器180C用于测量气压。在一些实施例中,电子设备100通过气压传感器180C测得的气压值计算海拔高度,辅助定位和导航。
磁传感器180D包括霍尔传感器。电子设备100可以利用磁传感器180D检测翻盖皮套的开合。在一些实施例中,当电子设备100是翻盖机时,电子设备100可以根据磁传感器180D检测翻盖的开合。进而根据检测到的皮套的开合状态或翻盖的开合状态,设置翻盖自动解锁等特性。
加速度传感器180E可检测电子设备100在各个方向上(一般为三轴)加速度的大小。当电子设备100静止时可检测出重力的大小及方向。还可以用于识别电子设备姿态,应用于横竖屏切换,计步器等应用。
距离传感器180F,用于测量距离。电子设备100可以通过红外或激光测量距离。在一些实施例中,拍摄场景,电子设备100可以利用距离传感器180F测距以实现快速对焦。
接近光传感器180G可以包括例如发光二极管(LED)和光检测器,例如光电二极管。发光二极管可以是红外发光二极管。电子设备100通过发光二极管向外发射红外光。电子设备100使用光电二极管检测来自附近物体的红外反射光。当检测到充分的反射光时,可以确定电子设备100附近有物体。当检测到不充分的反射光时,电子设备100可以确定电子设备100附近没有物体。电子设备100可以利用接近光传感器180G检测用户手持电子设备100贴近耳朵通话,以便自动熄灭屏幕达到省电的目的。接近光传感器180G也可用于皮套模式,口袋模式自动解锁与锁屏。
环境光传感器180L用于感知环境光亮度。电子设备100可以根据感知的环境光亮度自适应调节显示屏194亮度。环境光传感器180L也可用于拍照时自动调节白平衡。环境光传感器180L还可以与接近光传感器180G配合,检测电子设备100是否在口袋里,以防误触。
指纹传感器180H用于采集指纹。电子设备100可以利用采集的指纹特性实现指纹解锁,访问应用锁,指纹拍照,指纹接听来电等。
温度传感器180J用于检测温度。在一些实施例中,电子设备100利用温度传感器180J检测的温度,执行温度处理策略。例如,当温度传感器180J上报的温度超过阈值,电子设备100执行降低位于温度传感器180J附近的处理器的性能,以便降低功耗实施热保护。在另一些实施例中,当温度低于另一阈值时,电子设备100对电池142加热,以避免低温导致电子设备100异常关机。在其它一些实施例中,当温度低于又一阈值时,电子设备100对电池142的输出电压执行升压,以避免低温导致的异常关机。
触摸传感器180K,也称“触控面板”。触摸传感器180K可以设置于显示屏194,由触摸传感器180K与显示屏194组成触摸屏,也称“触控屏”。触摸传感器180K用于检测作用于其上或附近的触摸操作。触摸传感器可以将检测到的触摸操作传递给应用处理器,以确定触摸事件类型。可以通过显示屏194提供与触摸操作相关的视觉输出。在另一些实施例中,触摸传感器180K也可以设置于电子设备100的表面,与显示屏194所处的位置不同。
骨传导传感器180M可以获取振动信号。在一些实施例中,骨传导传感器180M可以获取人体声部振动骨块的振动信号。骨传导传感器180M也可以接触人体脉搏,接收血压跳动信号。在一些实施例中,骨传导传感器180M也可以设置于耳机中,结合成骨传导耳机。音频模块170可以基于所述骨传导传感器180M获取的声部振动骨块的振动信号,解析出语音信号,实现语音功能。应用处理器可以基于所述骨传导传感器180M获取的血压跳动信号解析心率信息,实现心率检测功能。
按键190包括开机键,音量键等。按键190可以是机械按键。也可以是触摸式按键。电子设备100可以接收按键输入,产生与电子设备100的用户设置以及功能控制有关的键信号输入。
马达191可以产生振动提示。马达191可以用于来电振动提示,也可以用于触摸振动反馈。例如,作用于不同应用(例如拍照,音频播放等)的触摸操作,可以对应不同的振动反馈效果。作用于显示屏194不同区域的触摸操作,马达191也可对应不同的振动反馈效果。不同的应用场景(例如:时间提醒,接收信息,闹钟,游戏等)也可以对应不同的振动反馈效果。触摸振动反馈效果还可以支持自定义。
指示器192可以是指示灯,可以用于指示充电状态,电量变化,也可以用于指示消息,未接来电,通知等。
SIM卡接口195用于连接SIM卡。SIM卡可以通过插入SIM卡接口195,或从SIM卡接口195拔出,实现和电子设备100的接触和分离。电子设备100可以支持1个或N个SIM卡接口,N为大于1的正整数。SIM卡接口195可以支持Nano SIM卡,Micro SIM卡,SIM卡等。同一个SIM卡接口195可以同时插入多张卡。所述多张卡的类型可以相同,也可以不同。SIM卡接口195也可以兼容不同类型的SIM卡。SIM卡接口195也可以兼容外部存储卡。电子设备100通过SIM卡和网络交互,实现通话以及数据通信等功能。在一些实施例中,电子设备100采用eSIM,即:嵌入式SIM卡。eSIM卡可以嵌在电子设备100中,不能和电子设备100分离。
基于图2所示本申请实施例的电子设备100的硬件结构示意图,下面介绍本申请实施例的电子设备100的软件结构框图,如图3所示。
电子设备100的软件系统可以采用分层架构,事件驱动架构,微核架构,微服务架构,或云架构。本申请实施例以分层架构的Android系统为例,示例性说明电子设备100的软件结构。
分层架构将软件分成若干个层,每一层都有清晰的角色和分工。层与层之间通过软件接口通信。在一些实施例中,将Android系统分为四层,从上至下分别为应用程序层,应用程序框架层,安卓运行时(Android runtime)和系统库,以及内核层。
应用程序层可以包括一系列应用程序包。
如图3所示,应用程序包可以包括相机,图库,日历,通话,地图,导航,WLAN,蓝牙,音乐,短信息和协同助手等应用程序。
应用程序框架层为应用程序层的应用程序提供应用编程接口(application programming interface,API)和编程框架。应用程序框架层包括一些预先定义的函数。
如图3所示,应用程序框架层可以包括窗口管理器,包管理器,资源管理器,通知管理器,视图系统,协同框架,显示框架和栈管理器等。
窗口管理器用于管理窗口程序。窗口管理器可以获取显示屏大小,判断是否有状态栏,锁定屏幕,截取屏幕等。
包管理器可以用于管理应用的安装包的安装、卸载以及安装包的配置信息的解析、查询等。
资源管理器为应用程序提供各种资源,比如本地化字符串,图标,图片,布局文件,视频文件等等。
通知管理器使应用程序可以在状态栏中显示通知信息,可以用于传达告知类型的消息,可以短暂停留后自动消失,无需用户交互。比如通知管理器被用于告知下载完成,消息提醒等。通知管理器还可以是以图表或者滚动条文本形式出现在系统顶部状态栏的通知,例如后台运行的应用程序的通知,还可以是以对话窗口形式出现在屏幕上的通知。例如在状态栏提示文本信息,发出提示音,电子设备振动,指示灯闪烁等。
视图系统包括可视控件,例如显示文字的控件,显示图片的控件等。视图系统可用于构建应用程序。显示界面可以由一个或多个视图组成的。例如,包括短信通知图标的显示界面,可以包括显示文字的视图以及显示图片的视图。
协同框架用于将电子设备100与第一设备(例如图1所示的第一设备11)建立连接的各个事件通知到应用程序层的“协同助手”,还可以用于响应于应用程序层的“协同助手”的指令辅助该“协同助手”获取数据信息。示例性地,协同框架可以实现“碰一碰”(Onehop)服务和组播源发现协议(multicast source discovery protocol,MSDP)服务,即电子设备100可以基于Onehop服务和MSDP服务与第一设备建立通信连接。
显示框架用于获取电子设备100中正在显示的应用的界面或窗口的显示数据通过协同框架发送给“协同助手”,也可以用于通过协同框架获取“协同助手”接收的来自第一设备(例如图1所示的第一设备11)的显示数据等。
栈管理器可以用于存储和管理电子设备100中运行的应用程序的进程信息。例如,在本申请实施例中,可以通过栈管理器存储应用程序的活动activity的信息,例如存储每一个activity的包名和类名等信息。
Android runtime包括核心库和虚拟机。Android runtime负责安卓系统的调度和管理。
核心库包含两部分:一部分是java语言需要调用的功能函数,另一部分是安卓的核心库。
应用程序层和应用程序框架层可以运行在虚拟机中。虚拟机将应用程序层和应用程序框架层的java文件执行为二进制文件。虚拟机可以用于执行对象生命周期的管理,堆栈管理,线程管理,安全和异常的管理,以及垃圾回收等功能。
系统库可以包括多个功能模块。例如:表面管理器(surface manager),媒体库(Media Libraries),三维图形处理库(例如:OpenGL ES),2D图形引擎(例如:SGL)等。
表面管理器用于对显示子系统进行管理,并且为多个应用程序提供了2D和3D图层的融合。
媒体库支持多种常用的音频,视频格式回放和录制,以及静态图像文件等。媒体库可以支持多种音视频编码格式,例如:MPEG4,H.264,MP3,AAC,AMR,JPG,PNG等。
三维图形处理库用于实现三维图形绘图,图像渲染,合成,和图层处理等。
2D图形引擎是2D绘图的绘图引擎。
内核层是硬件和软件之间的层。内核层至少包含显示驱动,摄像头驱动,音频驱动,传感器驱动。
下面结合第一设备(例如图1所示的第一设备11)将数据传输到电子设备100的场景,示例性说明电子设备100软件以及硬件的工作流程。需要说明的是,该电子设备100的桌面可以是开启电子设备并成功登陆该电子设备的系统后显示在主屏幕区域的一个或多个用户界面,这些用户界面可以包括该电子设备上安装的应用的图标和名称。
当触摸传感器180K接收到触摸操作,相应的硬件中断被发给内核层。内核层将触摸操作加工成原始输入事件(包括触摸坐标,触摸操作的时间戳等信息)。原始输入事件被存储在内核层。应用程序框架层从内核层获取原始输入事件,识别该输入事件所对应的控件。
以该触摸操作是触摸单击操作,该单击操作所对应的控件为“协同助手”应用程序的控件为例,“协同助手”应用调用应用程序框架层的协同框架接口,启动“协同助手”应用。
需要说明的是,本申请实施例提供的第一设备(例如图1所示的第一设备11)的硬件结构和软件结构框架可以参考上述图2和图3中所示的结构,当然,第一设备与第二设备(上述电子设备100)硬件结构和软件结构框架可以不完全相同,具体根据实际情况确定,此处不再赘述。
基于上述图1所述的系统框架、图2所述的设备的硬件框架和图3所述的设备软件框架,本申请实施例提供了一种显示方法和设备,本申请实施例涉及到的设备安装有“协同助手”的应用程序,在本申请实施例中,需要在设备的“协同助手”开启的条件下,实现将第一设备的信息传输到第二设备上。
“协同助手”可以是设备提供的一种服务或功能,可以用于实现第一设备与第二设备建立通信连接、实现第一设备与第二设备之间的数据传输和实现第二设备与第一设备的指令传输等等功能。示例性地,“协同助手”可以是安卓安装包(AndroidPackage,APK),可以以控件或APP的形式安装于设备中。
在具体的实现中,第一设备与第二设备建立通信连接、第一设备与第二设备之间的数据传输、第二设备与第一设备的指令传输这三个功能可以不集成在一个APK文件中,可以是通过一个或多个APK文件来实现这些功能。
可以理解的是,“协同助手”只是本实施例中所使用的一个词语,其代表的含义在本实施例中已经记载,其名称并不能对本实施例构成任何限制。
下面示例性地介绍本申请实施例中第一设备和第二设备提供的一些用户界面(user interface,UI)。本申请的说明书和权利要求书及附图中的术语“用户界面”,是应用程序或操作系统与用户之间进行交互和信息交换的介质接口,它实现信息的内部形式与用户可以接受形式之间的转换。用户界面常用的表现形式是图形用户界面(graphic user interface,GUI),是指采用图形方式显示的与计算机操作相关的用户界面。GUI可以是在电子设备的显示屏中显示的一个图标、窗口、控件等界面元素,其中控件可以包括图标、按钮、菜单、选项卡、文本框、对话框、状态栏、导航栏、Widget等可视的界面元素。
图4A示例性示出了第二设备上的用于展示第二设备安装的应用程序的示例性用户界面41。
用户界面41可包括:状态栏401,应用程序图标402,页面指示符403,具有常用应用程序图标的托盘404以及其它指示符(图4A中未示出)。其中:
状态栏401可包括:无线高保真(wireless fidelity,Wi-Fi)信号的一个或多个信号强度指示符401A、蓝牙指示符401B、电池状态指示符401C、时间指示符401D。
应用程序图标402包括第一应用、第二应用、第三应用、第四应用、第五应用、第六应用和第七应用等的图标,这些应用可以是邮箱、手机管家、图库、华为商城、视频、互联网、时钟、QQ、微信、淘宝、高德地图等等。
页面指示符403可用于指示用户当前浏览的是哪一个页面中的应用程序图标。用户可以左右滑动应用程序图标402的区域,来浏览其它页面中的应用程序图标。这些页面也可以称为该第二设备的桌面。
具有常用应用程序图标的托盘404可展示:第八应用、第九应用、第十应用和第十一应用等的图标。这些应用可以是比较常用的应用,例如可以是设置、音乐、阅读和相机等等。
在一些实施例中,用户界面41还可包括导航栏,其中导航栏可包括:返回键、主屏幕键、多任务键等系统导航键。当检测到用户点击返回键时,第二设备可显示当前页面的上一个页面。当检测到用户点击主屏幕键时,第二设备可显示主界面。当检测到用户点击多任务键时,第二设备可显示用户最近打开的任务。各导航键的命名还可以为其它,本申请对此不做限制。不限于虚拟按键,导航栏中的各导航键也可以实现为物理按键。
在其它一些实施例中,第二设备还可以包括实体的主屏幕键。该主屏幕键可用于接收用户的指令,将当前显示的UI返回到主界面,这样可以方便用户随时查看主屏幕。上述指令具体可以是用户单次按下主屏幕键的操作指令,也可以是用户在短时间内连续两次按下主屏幕键的操作指令,还可以是用户在预定时间内长按主屏幕键的操作指令。在本申请其它一些实施例中,主屏幕键还可以集成指纹识别器,以便用于在按下主屏幕键的时候,随之进行指纹采集和识别。
可以理解的是,图4A仅仅示例性示出了第二设备上的用户界面,不应构成对本申请实施例的限定。
下面示例性地介绍本申请实施例提供的开启第二设备上的“协同助手”的方式。
图4A及图4B示例性示出了第二设备上的一种开启“协同助手”的操作。
如图4A所示,当第二设备检测到在状态栏401上的向下滑动手势时,响应于该手势,第二设备可以在用户界面41上显示窗口405。如图4B所示,窗口405中可以显示有“协同助手”的开关控件405A,还可以显示有其它功能(如WiFi、蓝牙、手电筒等等)的开关控件。当检测到在窗口405中的开关控件405A上的触控操作(如在开关控件405A上的点击或触摸操作)时,响应于该触控操作,第二设备可以开启“协同助手”。
也即是说,用户可以在状态栏401处做一个向下滑动的手势来打开窗口405,并可以在窗口405中点击“协同助手”的开关控件405A来方便地开启“协同助手”。“协同助手”的开关控件405A的表现形式可以但不限于为文本信息和/或图标。
在其中一种可能的实施方式中,“协同助手”也可以像邮箱、图库等应用一样以应用图标的形式显示在第二设备的桌面上,当第二设备检测到对该“协同助手”图标的点击或触摸等操作后,开启该第二设备上的“协同助手”。
在本申请一些实施例中,第二设备开启“协同助手”之后,还可以在状态栏401中显示“协同助手”已开启的提示信息。例如,在状态栏401中显示“协同助手”的图标或者直接显示文本“协同助手”等。例如,可以参见图4C,在图4C中,图标406即为“协同助手”的图标。需要说明的是,“协同助手”的图标不限于图4B和图4C中所示的图标,这仅为一个示例,本方案对“协同助手”的图标的具体表现形式不做限制。
在本申请实施例中,不限于在上述示出的开启“协同助手”的方式,在一些实施例中,还可以通过其它方式开启“协同助手”。在另一些实施例中,第二设备也可以默认开启“协同助手”,例如在开机后自动开启“协同助手”等。
在第一设备上开启“协同助手”的操作可以参见上述在第二设备上开启“协同助手”的操作,此处不再赘述。开启第一设备和第二设备的“协同助手”后,第一设备可以实现与第二设备建立通信连接,然后将数据传输到第二设备中。
下面示例性地介绍在第一设备和第二设备开启“协同助手”功能后,第一设备在与第二设备建立通信连接的过程中实现的图形用户界面的一些实施例。
首先,下面以第一设备为手机,第二设备为平板电脑或者便携式电脑(tablet personal computer,Tablet PC)示例性地介绍第一设备和第二设备之间通过近场通信(Near Field Communication,NFC)发现设备并通信建立连接的过程。下面分两种情况介绍两个设备建立连接的过程。
情况一、第一设备和第二设备没有登录同一个系统账号的情况。
在具体实施例中,假设第一设备和第二设备都是同一个品牌的设备,但是该两个设备开启之后没有登录到同一个系统账号上;或者第一设备和第二设备不是同一个品牌的设备。在这些情况下表明第一设备和第二设备为异账号设备,即没有同时登陆一个系统账号的设备,此时,第一设备和第二设备可以通过如下方式进行连接。
具体的,第一设备和第二设备上都具备NFC功能,在第一设备和第二设备的NFC功能都打开的情况下,可以将第一设备靠近或接触第二设备,例如可以将第一设备预设部位例如背部靠近或触碰第二设备的预设位置例如带有分享或连接标签的位置,第一设备和第二设备即可互相发现对方,第一设备的显示屏可以出现已发现的第二设备的用户界面,例如图5A所示界面。
在图5A所示用户界面中,包括一个窗口501,在窗口501中包括已发现的第二设备的图标5011、该第二设备的名称5012、提示信息5013、一个“连接”控件5014和一个“取消”控件5015。
其中,第二设备的图标5011例如可以是一个平板电脑的图标等。第二设备的名称5012例如可以是HUAWEI MatePad Pro X等。提示信息5013可以用于向用户说明“连接”控件5014的作用以及连接后的功能,例如提示信息5013可以是“点击‘连接’会开启WLAN和蓝牙。连接后,您可在HUAWEI MatePad Pro X上操作手机,并在设备间共享数据。”等。“连接”控件5014可以用于向第二设备发送连接确认请求。“取消”控件5015可以用于取消第一设备与第二设备的连接操作。
可选的,在图5A中点击“连接”控件5014后会开启WLAN和蓝牙,那么第一设备和第二设备建立连接的过程可以通过蓝牙来完成,当成功建立连接之后,可以通过WLAN实现第一设备和第二设备之间的数据交互和共享。通过蓝牙建立连接后,以WLAN的方式实现第一设备和第二设备之间的数据交互可以提高数据交互的速度,提高彼此响应的效率。
在图5A所示用户界面中,第一设备响应于对“连接”控件5014的触控操作,显示如图5B所示用户界面。在图5B中,包括第一设备向第二设备发送连接确认请求后等待第二设备确认的一个窗口502。该窗口502同样的可以包括第二设备的图标5021,还包括提示信息5022和一个“取消”控件5023。
其中,第二设备的图标5021例如可以是平板电脑的图标等。提示信息5022用于说明正在等待第二设备的确认,例如提示信息5022可以是“请在HUAWEI MatePad Pro X端进行连接确认…”等。“取消”控件5023可以用于取消第一设备与第二设备的连接操作。
在图5A所示用户界面中,第一设备响应于对“连接”控件5014的触控操作之后,向第二设备发送了一个连接请求,第二设备接收到该连接请求后,显示如图5C所示的用户界面。
在图5C所示的用户界面中,包括一个第二设备是否与第一设备连接的确认窗口503。窗口503包括第二设备的图标5031、第一设备的图标5032、第二设备与第一设备的关联符5033、提示信息5034、一个“拒绝”控件5035和一个“允许”控件5036。
其中,第二设备的图标5031例如可以是电脑的图标等。第一设备的图标5032例如可以是手机的图标等。提示信息5034可以用于提示是否连接、向用户说明“允许”控件5036的作用以及连接后的功能,例如提示信息5034可以是“是否允许HUAWEI Nova 7连接此电脑。点击“允许”,您可在HUAWEI MatePad Pro X上操作手机,并在设备间共享数据。此功能会开启WLAN和蓝牙。”等。示例中,HUAWEI Nova 7为第一设备的名称。“拒绝”控件5035可以用于拒绝与第一设备的连接。“允许”控件5036可以用于与第一设备建立连接。
在图5C所示的用户界面中,第二设备响应于对“允许”控件5036的点击或触控操作,确认与第一设备建立连接,建立连接后第二设备的用户界面图例如可以是图6A所示界面图,表明第一设备与第二设备之间已经成功建立连接,即第一设备的信息传输到了第二设备并显示在显示屏上。图6B所示界面图下面会详细介绍,此处暂不详述。
上述第一设备和第二设备可以连接入同一个无线网络中。如果第一设备和/或第二设备已经连接了该无线网络,则在图5A和/或图5C所示界面中则可以不用再次连接该无线网络。
具体将第一设备的信息传输到第二设备并显示在显示屏上的实现方式可以包括:
在第一设备与第二设备成功建立连接后,第一设备的“协同框架”(例如图3中应用程序框架层的协同框架)将该成功建立连接的事件通知给第一设备的“协同助手”(例如图3中应用程序层的协同助手),该“协同助手”响应于该事件通知通过第一设备的资源管理器(例如图3中应用程序框架层的资源管理器)或者第一设备的“协同框架”获取第一设备上的第一界面显示信息。
该第一界面显示信息为该第一设备的显示屏正在显示的第一应用的第一用户界面的信息,该信息可以包括第一设备用于显示该第一用户界面的数据。具体的,该第一界面显示信息可以包括该显示屏正在显示的第一用户界面的栈信息和界面显示的内容的数据等等。该栈信息可以包括显示的应用的活动(Activity)服务的包名和类名等。该第一设备的资源管理器或“协同框架”可以是通过“显示框架”(例如图3中应用程序框架层的显示框架)来获取到该界面显示的内容的数据。
第一设备的“协同助手”获取到上述第一界面显示信息后,可以通过WLAN即上述连接的无线网络将该第一界面显示信息发送给第二设备,第二设备通过自身的“协同助手”接收到该第一界面显示信息,并通过自身的“协同框架”将该第一界面显示信息发送给自身的“显示框架”,该“显示框架”根据这些信息调动窗口管理器(例如图3中应用程序框架层的窗口管理器)在显示屏上显示窗口,该窗口中显示的内容包括上述第一用户界面显示的内容。
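上述“栈信息+界面显示内容”的第一界面显示信息,其组织方式可以用如下简化的Python片段示意(其中的函数名、字段名均为假设,并非本申请的实际实现):

```python
# 示意:第一设备的"协同助手"打包第一界面显示信息(字段名均为假设)
def build_interface_display_info(package, activity_class, orientation, frame_data):
    """把栈信息(包名、类名、屏幕方向属性)和界面显示内容的数据打包在一起。"""
    return {
        "stack_info": [{
            "package_name": package,            # 应用的Activity的包名
            "class_name": activity_class,       # 应用的Activity的类名
            "screen_orientation": orientation,  # screenOrientation属性
        }],
        "display_data": frame_data,             # 界面显示的内容的数据
    }

info = build_interface_display_info(
    "com.example.market", "com.example.market.MainActivity",
    "portrait", b"<frame-bytes>")
```

第二设备的“协同助手”接收到类似的结构后,即可交由“协同框架”和“显示框架”调动窗口管理器显示对应窗口。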
该窗口例如可以参见图6A中的窗口601A,该窗口601A的尺寸大小与所述第一设备的显示屏的尺寸大小相匹配。在本申请实施例中,像图6A中所示的窗口601A可以称为第二窗口,但是第二窗口中显示的内容不限于窗口601A中所示的内容。
上述第一应用可以为第一设备中安装的应用中的任意一个应用,该第一应用可以是第一设备出厂时就安装有的必备的应用,例如桌面应用、文件管理或设置等系统应用。或者,该第一应用也可以是第一设备中安装有的可选择的应用,例如微信、淘宝、高德地图或美团等第三方应用,可选择的应用不限于是第三方应用,也可以是第一设备的品牌生产的应用,例如华为的“应用市场”应用等。有些应用有时可以是必备的系统应用,有时又可以是可选择的应用,例如华为的“应用市场”应用在一些可能的实施例中可以是第一设备必备的系统应用。
在本申请实施例中,第一应用主要以桌面应用和“应用市场”应用为例说明,但这并不构成对本技术方案的限制。
关于第一设备映射到第二设备的窗口示例性地还可以参见图6B,图6B所示的窗口601B为“应用市场”的应用的用户界面,即第一设备中显示的用户界面也为该“应用市场”的用户界面。需要说明的是,图6B中的窗口601B显示的用户界面也可以是其它应用的用户界面,例如可以是微信、QQ或华为商城等等应用的用户界面,具体的应用本方案不做限制。
同样的,在本申请实施例中,像图6B中所示的窗口601B也可以称为第二窗口,但是第二窗口中显示的内容不限于窗口601B中所示的内容。
图6A中所示的映射窗口601A1为第一设备的一个桌面的示意图,第一设备的桌面也是一个应用即上述桌面应用,因此,图6A中所示的窗口601A也可以说是包括了第一设备的桌面应用的用户界面。
需要说明的是,除了资源管理器和“协同框架”之外,第一设备的“协同助手”还可以响应于上述事件通知通过应用框架层的其它模块获取上述第一界面显示信息,本申请实施例 对此不做限制。
情况二、第一设备和第二设备登录同一个系统账号的情况。
在具体实施例中,假设第一设备和第二设备都是同一个品牌的设备,且该两个设备开启之后登录到同一个系统账号上,即该两个设备为同账号设备,那么在这种情况下,第一设备和第二设备可以通过如下方式进行连接。
具体的,在第一设备和第二设备的NFC功能都打开的情况下,可以将第一设备靠近或接触第二设备,例如可以将第一设备预设部位例如背部靠近或触碰第二设备的预设位置例如带有分享或连接标签的位置,第一设备和第二设备即可互相发现对方,第一设备的显示屏可以出现已发现的第二设备的用户界面,例如还是参见图5A所示界面。
在图5A所示的用户界面中,第一设备响应于对“连接”控件5014的触控操作,向第二设备发送连接请求,因为第一设备和第二设备为同账号设备,因此自动建立了信任关系,第二设备接收到第一设备发送的连接请求后会自动确认连接。确认连接后即完成了两个设备之间的连接,示例性地,此时第二设备中显示如图6A或如图6B所示的用户界面。
需要说明的是,第一设备与第二设备之间建立通信连接实现数据共享的方式还包括其它的方式,例如,还可以通过蓝牙、数据线或者近场通信NFC的其它方法等来实现,本申请实施例对此不做限制。
下面介绍第一设备与第二设备建立连接后,第二设备上实现的图形用户界面的一些实施例。以第二设备为平板电脑做示例性地介绍。
参见图6A和图6B,图6A和图6B示例性地示出了第一设备与第二设备建立连接后,第一设备将自身显示屏显示的用户界面映射到第二设备的显示屏上的示例图,第二设备默认显示的示例性用户界面可以参见图6A中的窗口601A和图6B所示的窗口601B。其中,窗口601A和窗口601B可以称为协同窗口。
在图6A和图6B中可以看到,窗口601A和窗口601B的尺寸大小与第二设备的显示屏的尺寸大小不匹配。该不匹配可以指的是窗口601A和窗口601B的长宽比例可以与该第二设备的显示屏的长宽比例不相同,或者指的是窗口601A和窗口601B仅占用第二设备的显示屏的一部分面积。
在图6A中窗口601A除了包括第一设备映射过来的用户界面601A1(该第一设备映射过来的用户界面601A1可以称为映射窗口601A1)之外,还可以包括标题栏601A2,该标题栏601A2中可以包括隐藏控件6011、最小化控件6012、最大化控件6013和第一设备的名称6014。
隐藏控件6011可以用于隐藏窗口601A,响应于对隐藏控件6011的点击或触摸操作,第二设备将窗口601A隐藏;而第二设备可以响应于对图6A中所示的“协同助手”图标602的点击或触摸操作,恢复窗口601A在显示屏的显示界面。
在一种可能的实施方式中,隐藏控件6011可以用于断开第一设备与第二设备的连接,第二设备响应于对隐藏控件6011的点击或触摸操作,可以主动断开与第一设备的连接。如果第二设备与第一设备需要重新建立连接,可以参见上述建立连接的对应的描述,此处不再赘述。
最小化控件6012可以用于最小化窗口601A,响应于对最小化控件6012的点击或触摸操作,第二设备将窗口601A最小化,例如将窗口601A最小化到第二设备的显示屏的边缘,示例性的,可以参见图7。在图7中,小窗口701即为最小化后的窗口601A,示例性地,小窗口701中可以包括第一设备的名称例如HUAWEI Nova 7等。需要说明的是,小窗口701在第二设备的显示屏中的位置不限于图7中所示的位置,可以是该显示屏中的边缘的任意一个位置。第二设备可以响应于对该小窗口701的点击或触摸操作还原窗口601A。
或者,最小化控件6012可以用于将窗口601A切换到后台运行,当需要将窗口601A显示在显示屏上时,再将窗口601A从后台调出来。
最大化控件6013可以用于最大化窗口601A,响应于对最大化控件6013的点击或触摸操作,第二设备将窗口601A最大化,例如将窗口601A铺满全屏等。下面会详细介绍最大化控件的使用和功能,此处暂不详述。
第一设备的名称6014例如可以是HUAWEI Nova 7等。
图6B中的窗口601B的组成及功能可以对应参见对图6A中的窗口601A的描述,此处不再赘述。
在第一设备和第二设备建立通信连接之后,第一设备可以实时地将自身显示屏显示的用户界面的数据以及用于显示该用户界面的信息通过“协同助手”发送给第二设备,第二设备可以实时地根据获取到信息更新协同窗口,使得协同窗口中显示的内容始终包括第一设备的显示屏正在显示的内容。
下面通过几个实施例示例性介绍第二设备通过“协同助手”接收到第一设备发送过来的第一界面显示信息之后,根据该第一界面显示信息全屏显示或进入多窗口模式显示第一设备映射过来的窗口的过程。
需要说明的是,本申请实施例所述的多窗口指的是在同一个应用内包括不同用户界面的多个窗口,例如,以上述的“应用市场”为例,在第二设备的显示屏幕上显示的用户界面包括该“应用市场”的主页和“精品应用”浏览页面,那么该用户界面即处于本申请所述的多窗口模式。例如可以参见图11或图14所示的用户界面,下面会详细介绍,此处暂不详述。
实施例一
本实施例一结合上述最大化控件6013介绍协同窗口全屏显示的过程,以协同窗口为图6A所示的窗口601A示例说明。
在图6A所示的用户界面中,第二设备接收对最大化控件6013的点击或触摸操作,响应于该操作,第二设备判断窗口601A是否具备全屏显示的属性。
在具体实施例中,上述第一设备通过“协同助手”向第二设备发送的第一界面显示信息中,包括窗口601A所示的用户界面对应的应用的Activity的屏幕方向(screenOrientation)属性,该屏幕方向属性可以包括在栈信息中。因此,第二设备可以通过栈管理器(例如图3中所示的栈管理器)提取出该应用的Activity的屏幕方向属性信息,然后,查看该属性信息是否为"landscape",landscape表示横屏的意思,即表示可以全屏放大。
在该属性信息为"landscape"的情况下,第二设备调用窗口管理器将窗口601A全屏放大,窗口601A全屏放大后的示意图可以参见图8。在图8中所示的用户界面可以包括映射窗口801和标题窗口802。
映射窗口801即为第一设备显示屏显示的用户界面在第二设备中最大化的显示窗口。
标题窗口802中可以包括隐藏控件6011、最小化控件6012、第一设备的名称6014和窗口还原控件6015。
可以看到,窗口601A中的最大化控件6013变成了窗口还原控件6015。该窗口还原控件6015可以用于将图8所示的全屏界面还原为图6A所示的窗口601A。具体的,第二设备可以接收对该窗口还原控件6015的点击或触摸操作,响应于该操作,第二设备可以调用窗口管理器将图8所示的全屏界面还原为窗口601A。
另外,在图8中可以看到隐藏控件6011、最小化控件6012和第一设备的名称6014的显示位置相比于图6A中也是有所不同的。隐藏控件6011、最小化控件6012、第一设备的名称6014和窗口还原控件6015的显示位置和排列情况不限于图8所示的情形,也可以是其它的情形,本方案对此不做限制。此外,这些控件的图标也不限于图8所示的形状,也可以是其它的形状,本方案对此亦不限制。
在一种可能的实施方式中,上述标题窗口802可以不显示在图8中,那么映射窗口801可以铺满第二设备的整个显示屏显示,例如可以参见图12A。然后,可以通过接收双击该铺满全屏的映射窗口的操作退出映射窗口的全屏显示。或者,也可以通过其它方法,例如接收Esc按键的指令退出映射窗口的全屏显示等。
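实施例一中“通过栈管理器提取Activity的screenOrientation属性并判断是否为"landscape"”的判断逻辑,可以用如下示意性的Python片段表示(栈信息的字段名为假设,仅为逻辑示意):

```python
def can_fullscreen(interface_display_info):
    """模拟栈管理器从第一界面显示信息的栈信息中提取栈顶窗口的
    screenOrientation属性,并判断其是否为"landscape"(横屏,即可以全屏放大)。"""
    top = interface_display_info["stack_info"][-1]  # 栈顶窗口的栈信息
    return top.get("screen_orientation") == "landscape"

info = {"stack_info": [{"screen_orientation": "landscape"}]}
print(can_fullscreen(info))  # → True,即可调用窗口管理器将协同窗口全屏放大
```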
实施例二
本实施例二结合上述最大化控件6013介绍协同窗口进入多窗口模式的过程,以协同窗口为图6B所示的窗口601B示例说明。
在图6B所示的用户界面中,第二设备接收对最大化控件6013的点击或触摸操作,响应于该操作,第二设备判断窗口601B是否具备全屏显示的属性。
同样的,在具体实施例中,上述第一设备通过“协同助手”向第二设备发送的第一界面显示信息中,包括窗口601B所示的“应用市场”的Activity的屏幕方向(screenOrientation)属性,该屏幕方向属性可以包括在栈信息中。因此,第二设备可以通过栈管理器(例如图3中所示的栈管理器)提取出该“应用市场”的Activity的屏幕方向属性信息,然后,查看该属性信息是否为"landscape",landscape表示横屏的意思,即表示可以全屏放大。
在该属性信息不是"landscape"的情况下,第二设备调用窗口管理器根据上述第一设备发送的第一界面显示信息将窗口601B切换为多窗口模式的状态。
具体的,第二设备可以通过窗口管理器查看第一界面显示信息中的栈信息包括多少个窗口的栈信息。下面分两种情况介绍:
第一种情况、该栈信息包括一个窗口的栈信息。
如果该栈信息只包括一个窗口(该窗口通常为“应用市场”的主页窗口)的栈信息,那么,第二设备的窗口管理器将该主页窗口显示在第二设备的显示屏的一边,然后自定义一个填充窗口显示在该显示屏的另一边。该填充窗口可以说是该第二设备自定义的显示在该“应用市场”的用户界面中的窗口。例如,可以参见图9。
在图9中所示的用户界面可以包括主页窗口901、填充的A0窗口902和标题窗口903。
主页窗口901用于显示“应用市场”的主页用户界面,填充的A0窗口902即为上述自定义的显示在该“应用市场”的用户界面中的窗口,标题窗口903的描述可以参见标题窗口802的描述,此处不再赘述。
在图9中可以看到,该主页窗口901、填充的A0窗口902和标题窗口903可以全屏铺满该第二设备的显示屏。
在一种可能的实施方式中,上述填充窗口也可以显示一些广告的内容、对于该填充窗口的介绍说明的内容或者其它自定义的内容等等,本方案对此不做限制。
在一种可能的实施方式中,上述标题窗口903可以不显示在图9中,那么主页窗口901和填充的A0窗口902可以铺满第二设备的整个显示屏显示,例如可以参见图13。然后,可以通过接收双击该显示屏的操作退出全屏显示。或者,也可以通过其它方法,例如接收Esc按键的指令退出全屏显示等。
第二种情况、上述栈信息包括多个窗口的栈信息。
该多个窗口通常为“应用市场”的主页窗口以及基于该主页窗口进入的另一个用户界面的窗口。例如,结合图6B示例性介绍,如果在第一设备中刚开始显示的是该“应用市场”的主页窗口601B1(由于主页窗口601B1是第一设备映射过来的,因此此时在第一设备中显示的窗口即为该主页窗口601B1),那么该第一设备可以响应于对该窗口601B1中的“更多”控件的点击或触摸操作进入“精品应用”的用户界面,即该第一设备的显示屏中显示的是该“精品应用”的用户界面,那么,协同到第二设备后,第二设备的协同窗口中也显示的是该“精品应用”的用户界面。例如可以参见图10中的窗口601C。图10中的窗口601C的各部分组成及功能也可以对应参见对图6A中的窗口601A的描述,此处不再赘述。
需要说明的是,虽然图10中的窗口601C1只显示了“精品应用”的用户界面的窗口,但是在第一设备发送过来的栈信息中包括了该窗口和主页窗口的栈信息。在第二设备的协同窗口栈中,该“精品应用”的窗口栈在栈顶,该主页窗口的栈在栈底。
那么,第二设备的窗口管理器根据上述栈信息以及上述第一界面显示信息中的显示数据将该主页窗口显示在第二设备的显示屏的一边,然后将“精品应用”的窗口显示在该显示屏的另一边。例如,可以参见图11。
在图11中所示的用户界面可以包括主页窗口1101、“精品应用”的窗口1102和标题窗口1103。
主页窗口1101用于显示“应用市场”的用户界面主页,“精品应用”的窗口1102显示的即为上述“精品应用”的用户界面,标题窗口1103的描述可以参见标题窗口802的描述,此处不再赘述。
在图11中可以看到,该主页窗口1101、“精品应用”的窗口1102和标题窗口1103可以全屏铺满该第二设备的显示屏。
在一种可能的实施方式中,上述标题窗口1103可以不显示在图11中,那么主页窗口1101和“精品应用”的窗口1102可以铺满第二设备的整个显示屏显示,例如可以参见图14。然后,可以通过接收双击该显示屏的操作退出全屏显示。或者,也可以通过其它方法,例如接收Esc按键的指令退出全屏显示等。
当然,这里只是以栈信息中包括两个窗口为例说明,也可以包括三个、四个或者更多的窗口,具体的实现过程类似,可以参见上述的描述,此处不再赘述。
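上述根据栈信息中窗口个数决定多窗口布局的两种情况,可以用如下示意性的Python片段概括(数据结构均为假设,仅为逻辑示意):

```python
FILL_WINDOW = {"type": "fill"}  # 第二设备自定义的填充窗口(示意)

def layout_multi_window(stack_info):
    """模拟窗口管理器根据栈信息决定多窗口布局:
    栈中只有一个窗口时,一边显示主页窗口、另一边显示自定义的填充窗口;
    栈中有多个窗口时,一边显示栈底的主页窗口、另一边显示栈顶窗口。"""
    if len(stack_info) == 1:
        return (stack_info[0], FILL_WINDOW)
    return (stack_info[0], stack_info[-1])  # (栈底主页窗口, 栈顶窗口)

home = {"class_name": "MainActivity"}
featured = {"class_name": "FeaturedActivity"}
layout_multi_window([home])            # 对应图9:主页窗口 + 填充的A0窗口
layout_multi_window([home, featured])  # 对应图11:主页窗口 + "精品应用"窗口
```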
上述实施例为结合上述最大化控件6013介绍的协同窗口全屏显示或者协同窗口进入多窗口模式的过程,下面介绍无需最大化控件6013即可实现协同窗口全屏显示或者协同窗口进入多窗口模式的过程。
在具体实施例中,在上述第一设备将上述第一界面显示信息通过“协同助手”发送给上述第二设备,第二设备通过自身的“协同助手”接收到该第一界面显示信息之后,该第二设备可以直接根据该第一界面显示信息进入协同窗口全屏显示或者协同窗口进入多窗口模式界面。即第二设备无需根据上述第一界面显示信息先生成一个较小的协同窗口(例如图6A中的窗口601A或图6B中的窗口601B等),而是可以直接显示得到较大的协同窗口(例如图8所示的窗口等),减少了中间由小窗口切换为大窗口的繁琐操作,进一步便于用户操作,提高用户体验。下面通过实施例三和实施例四示例介绍。
实施例三
本实施例三介绍该第二设备直接根据该第一界面显示信息进入协同窗口全屏显示的过程。
具体的,上述第一设备通过“协同助手”向第二设备发送的第一界面显示信息中,可以包括第一设备正在显示的用户界面对应的应用的Activity的screenOrientation属性,该screenOrientation属性可以包括在栈信息中。因此,第二设备可以通过栈管理器(例如图3中所示的栈管理器)提取出该应用的Activity的screenOrientation属性信息,然后,查看该属性信息是否为"landscape"。
在该属性信息为"landscape"的情况下,第二设备调用窗口管理器根据上述第一界面显示信息全屏显示第一设备映射过来的用户界面,全屏显示的界面例如可以参见图8或图12A所示界面。对图8或图12A的介绍可以参见上述的描述,此处不再赘述。
实施例四
本实施例四介绍第二设备直接根据该第一界面显示信息进入多窗口模式的过程。
同样的,在具体实施例中,上述第一设备通过“协同助手”向第二设备发送的第一界面显示信息中,可以包括第一设备正在显示的用户界面对应的应用的Activity的screenOrientation属性,该screenOrientation属性可以包括在栈信息中。因此,第二设备可以通过栈管理器(例如图3中所示的栈管理器)提取出该应用的Activity的screenOrientation属性信息,然后,查看该属性信息是否为"landscape"。
在该属性信息不是"landscape"的情况下,第二设备通过窗口管理器查看上述第一界面显示信息中的栈信息包括多少个窗口的栈信息。下面分两种情况介绍:
第一种情况、该栈信息包括一个窗口的栈信息。
如果该栈信息只包括一个窗口(该窗口通常为上述第一设备正在显示的用户界面对应的应用的主页窗口)的栈信息,那么,第二设备的窗口管理器将该主页窗口显示在第二设备的显示屏的一边,然后自定义一个填充窗口显示在该显示屏的另一边。该填充窗口可以说是该第二设备自定义的显示在该应用的用户界面中的窗口。例如,可以参见图9或图13。对图9或图13的介绍可以参见上述的描述,此处不再赘述。
第二种情况、上述栈信息包括多个窗口的栈信息。
该多个窗口通常为上述第一设备正在显示的用户界面对应的应用的主页窗口以及基于该主页窗口进入的另一个用户界面的窗口。这里可以参见上述结合图6B和图10的介绍进行理解,此处不再赘述。
那么,第二设备的窗口管理器根据上述栈信息以及上述第一界面显示信息中的显示数据将该主页窗口显示在第二设备的显示屏的一边,然后将上述基于该主页窗口进入的另一个用户界面的窗口显示在该显示屏的另一边。例如,可以参见图11或图14。对图11或图14的介绍可以参见上述的描述,此处不再赘述。
当然,这里只是以栈信息中包括两个窗口为例说明,也可以包括三个、四个或者更多的窗口,具体的实现过程类似,可以参见上述的描述,此处不再赘述。
在一种可能的实施方式中,上述实施例一至实施例四中,第二设备也可以不将上述第一设备映射过来的协同窗口全屏显示,而是可以只占用第二设备的显示屏的较大一部分的区域,例如四分之三、五分之四区域或者六分之五区域等等来显示该第一设备映射过来的协同窗口,具体的显示该协同窗口的大小可以通过窗口管理器(例如图3中所示的窗口管理器)来管理实现。
上面介绍的是第二设备将第一设备映射过来的协同窗口全屏显示(或放大显示)的过程和该协同窗口进入多窗口模式的过程,下面结合第二设备自身是否安装有该协同窗口对应的应用来介绍这两个过程。下面通过实施例五至实施例八进行示例介绍。
实施例五
本实施例五结合图6A和结合第二设备是否安装有第一设备显示的用户界面对应的应用来介绍进入全屏显示的过程。
在具体实施例中,在图6A所示的用户界面中,第二设备接收对最大化控件6013的点击或触摸操作,响应于该操作,第二设备判断窗口601A是否具备全屏显示的属性。
通过上述实施例一中的方法判断出窗口601A所示的用户界面对应的应用(该应用可以为桌面应用)的Activity的screenOrientation属性信息为"landscape"的情况下,第二设备可以调用包管理器(例如图3中所述的包管理器)判断第二设备上是否安装有该桌面应用。
具体的,由前面的介绍可知,第二设备接收到的第一设备发送的第一界面显示信息中包括该桌面应用的包名,那么第二设备的包管理器可以获取该桌面应用的包名,然后查询自身安装的应用的包名是否包括该桌面应用的包名,如果有包括则表明第二设备安装了该桌面应用,如果没有包括则表明第二设备没有安装该桌面应用。
或者,在一种可能的实施方式中,上述第二设备可以响应于上述对最大化控件6013的点击或触摸操作,同时执行判断该桌面应用的Activity的screenOrientation属性的操作和判断第二设备上是否安装有该桌面应用的操作。
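上述通过包名判断第二设备是否安装有对应应用的查询逻辑,可以用如下示意性的Python片段表示(字段名为假设,并非实际的包管理器接口):

```python
def is_app_installed(local_package_names, interface_display_info):
    """模拟第二设备的包管理器:取出第一界面显示信息的栈信息中的包名,
    查询第二设备自身已安装应用的包名集合中是否包含该包名。"""
    target = interface_display_info["stack_info"][0]["package_name"]
    return target in local_package_names

info = {"stack_info": [{"package_name": "com.example.launcher"}]}
is_app_installed({"com.example.launcher", "com.example.mail"}, info)  # → True
is_app_installed({"com.example.mail"}, info)                          # → False
```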
在第二设备没有安装该桌面应用的情况下,第二设备调用窗口管理器将窗口601A全屏放大,窗口601A全屏放大后的示意图可以参见图8或图12A。对图8或图12A的介绍可以参见上述的描述,此处不再赘述。
在第二设备安装了该桌面应用的情况下,第二设备启动自身的桌面应用(如果已经启动则不用再次启动),并根据上述获取的第一界面显示信息在桌面应用的用户界面显示与第一设备显示的该桌面应用的用户界面相同的内容。该第二设备上显示的该桌面应用的用户界面同样可以是全屏显示的,例如可以参见图12A所示的界面,此时,协同窗口601A退到后台运行。或者,此时也可以将协同窗口601A关闭。
具体的,上述第二设备可以调用窗口管理器,从上述第一界面显示信息的栈信息中获取桌面应用的Activity的包名和类名,然后将这个包名和类名组成一个Intent去启动窗口,这个窗口就会和协同窗口601A的界面保持一样,或者说启动的窗口和第一设备显示的界面内容一致。
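“将包名和类名组成一个Intent去启动窗口”在Android中大致对应于构造显式Intent;下面用Python字典示意该组件名的组装过程(并非Android API,仅为逻辑示意,字段名均为假设):

```python
def build_launch_intent(stack_entry):
    """示意:由栈信息中的Activity的包名和类名组成一个"显式Intent"的组件名,
    用于在第二设备上启动与第一设备显示界面一致的窗口。"""
    return {
        "action": "android.intent.action.MAIN",
        # 显式Intent的组件名通常写作 "包名/类名" 的形式
        "component": stack_entry["package_name"] + "/" + stack_entry["class_name"],
    }

intent = build_launch_intent({
    "package_name": "com.example.launcher",
    "class_name": "com.example.launcher.HomeActivity"})
```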
在一种可能的实施方式中,在第二设备安装了该桌面应用的情况下,第二设备可以提供一个选择窗口给用户,让用户选择是通过协同窗口601A进入全屏界面(或者放大界面)还是通过第二设备自身安装有的该桌面应用进入全屏界面(或者放大界面)。示例性地,可以参见图12B。
在图12B中可以看到,基于图6A所示的用户界面显示了一个选择窗口1201,该选择窗口1201包括说明12011、协同窗口的选择按钮12012和桌面应用的选择按钮12013。示例性地,说明12011的内容可以是“选择通过协同窗口或者桌面应用进入全屏界面”等。
第二设备可以响应于用户对协同窗口的选择按钮12012的点击或触摸操作,通过协同窗口601A进入全屏界面(或者放大界面)。或者,第二设备可以响应于用户对桌面应用的选择按钮12013的点击或触摸操作,通过第二设备自身安装有的该桌面应用进入全屏界面(或者放大界面)。具体进入全屏界面(或者放大界面)的过程可以参见上述的介绍,此处不再赘述。
实施例六
本实施例六结合图6B和结合第二设备是否安装有第一设备显示的用户界面对应的应用来介绍进入多窗口模式显示的过程。
在具体实施例中,在图6B所示的用户界面中,第二设备接收对最大化控件6013的点击或触摸操作,响应于该操作,第二设备判断窗口601B是否具备全屏显示的属性。
通过上述实施例二中的方法判断出窗口601B所示的用户界面对应的应用(该应用可以为“应用市场”应用)的Activity的screenOrientation属性信息不是"landscape"的情况下,第二设备可以调用包管理器(例如图3中所述的包管理器)判断第二设备上是否安装有该“应用市场”。
具体的,由前面的介绍可知,第二设备接收到的第一设备发送的第一界面显示信息中包括该“应用市场”的包名,那么第二设备的包管理器可以获取该“应用市场”的包名,然后查询自身安装的应用的包名是否包括该“应用市场”的包名,如果有包括则表明第二设备安装了该“应用市场”,如果没有包括则表明第二设备没有安装该“应用市场”。
或者,在一种可能的实施方式中,上述第二设备可以响应于上述对最大化控件6013的点击或触摸操作,同时执行判断该“应用市场”的Activity的screenOrientation属性的操作和判断第二设备上是否安装有该“应用市场”的操作。
在第二设备没有安装该“应用市场”的情况下,第二设备可以根据上述实施例二中的第一种情况和第二种情况这两种情况进行多窗口模式的显示,此处不再赘述。
在第二设备安装了该“应用市场”的情况下,第二设备启动自身的“应用市场”(如果已经启动则不用再次启动),并根据上述获取的第一界面显示信息在“应用市场”的用户界面进入多窗口模式,显示的内容包括第一设备显示的该“应用市场”的用户界面的内容。该第二设备上显示的该“应用市场”的用户界面同样可以是全屏显示的,例如可以参见图13或图14所示的界面,此时,协同窗口601B或协同窗口601C退到后台运行。或者,此时也可以将协同窗口601B或协同窗口601C关闭。
具体的,上述第二设备可以调用窗口管理器,从上述第一界面显示信息的栈信息中获取“应用市场”的Activity的包名和类名,然后将这个包名和类名组成一个Intent去启动窗口,这个窗口就会包括协同窗口601B的界面的内容,或者说启动的窗口包括第一设备显示的界面显示的内容。
在一种可能的实施方式中,在第二设备安装了该“应用市场”的情况下,第二设备可以提供一个选择窗口给用户,让用户选择是通过协同窗口601B进入多窗口模式还是通过第二设备自身安装有的该“应用市场”进入多窗口模式。具体选择的实现过程可以参考上述对图12B的描述,此处不再赘述。
实施例七
本实施例七结合第二设备是否安装有第一设备显示的用户界面对应的应用,介绍该第二设备直接根据该第一界面显示信息进入全屏显示的过程。
在具体实施例中,上述第一设备通过“协同助手”向第二设备发送的第一界面显示信息中,可以包括第一设备正在显示的用户界面对应的应用(该应用可以是第一设备上安装的任一个应用,可以称该应用为目标应用)的Activity的screenOrientation属性,该screenOrientation属性可以包括在栈信息中。因此,第二设备可以通过栈管理器(例如图3中所示的栈管理器)提取出该应用的Activity的screenOrientation属性信息,然后,查看该属性信息是否为"landscape"。
在该属性信息为"landscape"的情况下,第二设备可以调用包管理器(例如图3中所述的包管理器)判断第二设备上是否安装有该目标应用。
具体的,由前面的介绍可知,第二设备接收到的第一设备发送的第一界面显示信息中包括该目标应用的包名,那么第二设备的包管理器可以获取该目标应用的包名,然后查询自身安装的应用的包名是否包括该目标应用的包名,如果有包括则表明第二设备安装了该目标应用,如果没有包括则表明第二设备没有安装该目标应用。
或者,在一种可能的实施方式中,上述第二设备接收到上述第一界面显示信息之后,可以同时执行判断该目标应用的Activity的screenOrientation属性的操作和判断第二设备上是否安装有该目标应用的操作。
在第二设备没有安装该目标应用的情况下,第二设备调用窗口管理器根据上述第一界面显示信息全屏显示第一设备映射过来的用户界面,全屏显示的界面例如可以参见图8或图12A所示界面。对图8或图12A的介绍可以参见上述的描述,此处不再赘述。
在第二设备安装了该目标应用的情况下,第二设备启动自身的目标应用(如果已经启动则不用再次启动),并根据上述获取的第一界面显示信息在目标应用的用户界面显示与第一设备显示的该目标应用的用户界面相同的内容。该第二设备上显示的该目标应用的用户界面同样可以是全屏显示的,例如还是参见图12A所示的界面。
具体的,上述第二设备可以调用窗口管理器,从上述第一界面显示信息的栈信息中获取目标应用的Activity的包名和类名,然后将这个包名和类名组成一个Intent去启动窗口,这个窗口就会包括第一设备显示的界面内容。
实施例八
本实施例八结合第二设备是否安装有第一设备显示的用户界面对应的应用,介绍该第二设备直接根据该第一界面显示信息进入多窗口模式的过程。
在具体实施例中,上述第一设备通过“协同助手”向第二设备发送的第一界面显示信息中,可以包括第一设备正在显示的用户界面对应的应用(该应用可以是第一设备上安装的任一个应用,可以称该应用为目标应用)的Activity的screenOrientation属性,该screenOrientation属性可以包括在栈信息中。因此,第二设备可以通过栈管理器(例如图3中所示的栈管理器)提取出该应用的Activity的screenOrientation属性信息,然后,查看该属性信息是否为"landscape"。
在该属性信息不是"landscape"的情况下,第二设备可以调用包管理器(例如图3中所述的包管理器)判断第二设备上是否安装有该目标应用。
具体的,由前面的介绍可知,第二设备接收到的第一设备发送的第一界面显示信息中包括该目标应用的包名,那么第二设备的包管理器可以获取该目标应用的包名,然后查询自身安装的应用的包名是否包括该目标应用的包名,如果有包括则表明第二设备安装了该目标应用,如果没有包括则表明第二设备没有安装该目标应用。
或者,在一种可能的实施方式中,上述第二设备接收到上述第一界面显示信息之后,可以同时执行判断该目标应用的Activity的screenOrientation属性的操作和判断第二设备上是否安装有该目标应用的操作。
在第二设备没有安装该目标应用的情况下,第二设备可以根据上述实施例四中的第一种情况和第二种情况这两种情况进行窗口的显示,此处不再赘述。
在第二设备安装了该目标应用的情况下,第二设备启动自身的目标应用(如果已经启动则不用再次启动),并根据上述获取的第一界面显示信息在目标应用的用户界面进入多窗口模式,显示的内容包括第一设备显示的该目标应用的用户界面的内容。该第二设备上显示的该目标应用的用户界面同样可以是全屏显示的,例如可以参见图13或图14所示的界面。
具体的,上述第二设备可以调用窗口管理器,从上述第一界面显示信息的栈信息中获取目标应用的Activity的包名和类名,然后将这个包名和类名组成一个Intent去启动窗口,这个窗口就会包括第一设备显示的界面显示的内容。
上述各个实施例中举例的应用仅是示例,第一设备中安装有的任意一个应用的用户界面都可以跨设备全屏显示在第二设备的显示屏中。
同样的,在一种可能的实施方式中,上述实施例五至实施例八中,第二设备也可以不将上述第一设备映射过来的协同窗口全屏显示,而是可以只占用第二设备的显示屏的较大一部分的区域,例如四分之三、五分之四区域或者六分之五区域等等来显示该第一设备映射过来的协同窗口,具体的显示该协同窗口的大小可以通过窗口管理器(例如图3中所示的窗口管理器)来管理实现。
上述实施例五至实施例八中,通过对应的应用进入全屏显示或多窗口模式,能够使得显示的用户界面的效果更好,显示延迟降低,提高用户体验。
下面介绍一下第一设备的显示屏显示的用户界面映射到第二设备的显示屏全屏显示或者进入多窗口模式显示之后,在第二设备上响应于用户的指令所做的操作。
在第二设备上,对于通过协同窗口进入全屏显示或者多窗口模式显示的用户界面(可以参见上述实施例一至实施例四),第二设备在该用户界面上接收到用户输入的指令之后,通过自身“协同助手”将该指令发送给第一设备,第一设备根据该指令获取到对应的显示数据之后,将该显示数据通过自身的“协同助手”发给第二设备,第二设备根据接收的显示数据更新显示屏显示的用户界面。当然,第一设备的显示界面也可以根据上述指令进行更新。
在第二设备上,对于通过对应的应用进入全屏显示或多窗口模式显示的用户界面(可以参见上述实施例五至实施例八),第二设备在该用户界面上接收到用户输入的指令之后,可以直接跟该应用的服务器交互更新用户界面,无需再与第一设备进行交互。相比于上述需要与第一设备交互更新用户界面,本申请实施例直接与服务器高速交互,能够降低用户界面显示的延迟,提高用户体验。
可以看到,在上述实施例一至实施例八中,第一设备的显示屏显示的用户界面映射到第二设备的显示屏全屏显示或者进入多窗口模式显示之后,在第二设备的显示屏显示的用户界面和第一设备的显示屏显示的用户界面的排版布局是不同的。
基于上述的描述,图15示出了本申请实施例提供的一种显示方法的交互流程示意图,参见图15,该显示方法可以包括以下步骤:
S101、第一设备获取第一应用的界面显示信息,该第一设备的显示屏显示该第一应用的第一用户界面,该界面显示信息包括该第一设备用于显示该第一用户界面的数据。
S102、上述第一设备向上述第二设备发送该界面显示信息。
S103、上述第二设备接收该界面显示信息。
S104、上述第二设备根据该界面显示信息显示第二用户界面,该第二用户界面显示的内容包括该第一用户界面显示的内容;该第二用户界面的排版与该第一用户界面的排版不同,该第二用户界面的显示区域大于该第一用户界面的显示区域。
具体的,上述第二用户界面例如可以是图8、图9、图11、图12A、图13或图14所示的显示界面。上述界面显示信息即为上述方法实施例中所述的第一界面显示信息。上述第一应用可以是第一设备中安装的任意一个应用,例如可以是上述实施例中的桌面应用、“应用市场”应用或者目标应用等。在本申请实施例中用户界面可以是窗口,窗口也可以是用户界面,具体可以参见上述各个示例图的描述。
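上述S101至S104的显示决策流程,可以用如下示意性的Python片段串联起来(其中的字段名均为假设,仅模拟决策逻辑,并非实际实现):

```python
def render_on_second_device(info, screen_size):
    """示意S103~S104:第二设备接收界面显示信息后的显示决策——
    栈顶Activity具备横屏("landscape")属性则全屏显示,
    否则进入多窗口模式(主页窗口 + 栈顶窗口或自定义填充窗口)。"""
    stack = info["stack_info"]
    if stack[-1].get("screen_orientation") == "landscape":
        return {"mode": "fullscreen", "windows": [stack[-1]], "size": screen_size}
    windows = [stack[0], stack[-1]] if len(stack) > 1 else [stack[0], {"type": "fill"}]
    return {"mode": "multi_window", "windows": windows, "size": screen_size}

ui = render_on_second_device(
    {"stack_info": [{"screen_orientation": "portrait"}]}, (2560, 1600))
# 栈顶无横屏属性 → 进入多窗口模式,第二个窗口为自定义填充窗口
```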
在一种可能的实施方式中,上述第一设备向上述第二设备发送第一应用的界面显示信息之后,上述第二设备根据上述界面显示信息显示第二用户界面之前,还包括:
上述第二设备根据上述界面显示信息显示第三用户界面,上述第三用户界面的尺寸大小与上述第二设备的显示屏的尺寸大小不匹配,上述第三用户界面显示的内容与上述第一用户界面显示的内容一致;上述第二设备接收针对上述第三用户界面中的第一控件输入的第一操作,上述第一操作用于触发上述第二设备根据上述界面显示信息显示上述第二用户界面的操作。
该第三用户界面例如可以是上述图6A中所示的窗口601A1或图6B中所示的窗口601B1等。
在本申请实施例中,先将第一设备映射过来的用户界面按第一设备的显示屏的尺寸大小显示,以便于用户选择是否全屏显示,为用户提供了更多选择的可能性。
在一种可能的实施方式中,上述第二设备根据上述界面显示信息显示第二用户界面,包括:上述第二设备根据上述界面显示信息判断出上述第一用户界面具备横屏显示的属性;上述第二设备根据上述横屏显示的属性显示上述第二用户界面。
在一种可能的实施方式中,上述第二设备根据上述界面显示信息显示第二用户界面,包括:上述第二设备根据上述界面显示信息判断出上述第一用户界面无横屏显示的属性;上述第二设备根据上述无横屏显示的属性显示上述第二用户界面,其中,上述第二用户界面中包括多个小窗口,上述多个小窗口中包括上述第一应用的主页面的窗口以及包括与上述第一用户界面显示的内容一致的窗口,上述多个小窗口均为属于上述第一应用的窗口。
上述两个可能的实施方式中,不管上述第一用户界面是否具备横屏显示属性,都能够全屏显示在第二设备的显示屏中,多方面为用户提供了可实现的方案,提升用户体验。
在一种可能的实施方式中,上述多个小窗口包括上述第一应用的主页面的窗口和填充窗口,上述填充窗口是上述第二设备自定义显示在上述第一应用的用户界面中的窗口。
该填充窗口例如可以是图9所示的A0填充窗口902等。
在本申请实施例中,通过填充窗口的方式为用户展现第一应用全屏显示的用户界面,提升用户的感官体验。
在一种可能的实施方式中,上述界面显示信息包括上述第一应用在上述第一设备中的任务栈内的信息,上述第二设备接收对上述第三用户界面中的第一控件的第一操作之后,还包括:上述第二设备根据上述第一应用在上述第一设备中的任务栈内的信息判断出上述第二设备中安装有上述第一应用;上述第二设备通过上述第一应用基于上述界面显示信息显示上述第二用户界面。
在一种可能的实施方式中,上述第二设备通过上述第一应用基于上述界面显示信息显示上述第二用户界面,包括:
上述第二设备响应第二操作通过上述第一应用基于上述界面显示信息显示上述第二用户界面,上述第二操作为对上述第一应用的选择按钮的触控操作。
在本申请实施例中,通过对应的应用进入全屏显示,能够使得显示的用户界面的效果更好,显示延迟降低,提高用户体验。
本申请实施例的具体实现可以参见上述提供的多个实施例的具体描述,此处不再赘述。
综上所述,本申请能够将第一设备(例如手机等)显示的应用内容映射到第二设备(例如平板等)的显示屏中进行大屏显示,充分利用了第二设备的屏幕区域,为用户提供了可大屏操作的可能,提升用户体验。
基于上述的介绍描述,可以理解的是,各个设备为了实现上述对应的功能,其包含了执行各个功能相应的硬件结构和/或软件模块。本领域技术人员应该很容易意识到,结合本文中所公开的实施例描述的各示例的单元及算法步骤,本申请能够以硬件或硬件和计算机软件的结合形式来实现。某个功能究竟以硬件还是计算机软件驱动硬件的方式来执行,取决于技术方案的特定应用和设计约束条件。专业技术人员可以对每个特定的应用使用不同方法来实现所描述的功能,但是这种实现不应认为超出本申请的范围。
本申请实施例可以根据上述方法示例对设备进行功能模块的划分,例如,可以对应各个功能划分各个功能模块,也可以将两个或两个以上的功能集成在一个模块中。上述集成的模块既可以采用硬件的形式实现,也可以采用软件功能模块的形式实现。需要说明的是,本申请实施例对模块的划分是示意性的,仅仅为一种逻辑功能划分,实际实现时可以有另外的划分方式。
在采用对应各个功能划分各个功能模块的情况下,图16示出了设备的一种可能的逻辑结构示意图,该设备可以是上述实施例所述的第二设备。该设备1600可以包括第一接收单元1601和显示单元1602。其中:
第一接收单元1601,用于接收来自第一设备的第一应用的界面显示信息,其中,上述第一设备的显示屏显示上述第一应用的第一用户界面,上述界面显示信息包括上述第一设备用于显示上述第一用户界面的数据;
显示单元1602,用于根据上述界面显示信息显示第二用户界面,上述第二用户界面显示的内容包括上述第一用户界面显示的内容;上述第二用户界面的排版与上述第一用户界面的排版不同,上述第二用户界面的显示区域大于上述第一用户界面的显示区域。
在一种可能的实施方式中,上述显示单元1602还用于,在上述第一接收单元1601接收来自第一设备的第一应用的界面显示信息之后,上述显示单元1602根据上述界面显示信息显示第二用户界面之前,根据上述界面显示信息显示第三用户界面,上述第三用户界面的尺寸大小与上述设备1600的显示屏的尺寸大小不匹配,上述第三用户界面显示的内容与上述第一用户界面显示的内容一致;
上述设备1600还包括第二接收单元,用于接收针对上述第三用户界面中的第一控件输入的第一操作,上述第一操作用于触发上述设备1600根据上述界面显示信息显示上述第二用户界面的操作。
在一种可能的实施方式中,上述显示单元1602具体用于:根据上述界面显示信息判断出上述第一用户界面具备横屏显示的属性;根据上述横屏显示的属性显示上述第二用户界面。
在一种可能的实施方式中,上述显示单元1602具体用于:根据上述界面显示信息判断出上述第一用户界面无横屏显示的属性;根据上述无横屏显示的属性显示上述第二用户界面,其中,上述第二用户界面中包括多个小窗口,上述多个小窗口中包括上述第一应用的主页面的窗口以及包括与上述第一用户界面显示的内容一致的窗口,上述多个小窗口均为属于上述第一应用的窗口。
在一种可能的实施方式中,上述多个小窗口包括上述第一应用的主页面的窗口和填充窗口,上述填充窗口是上述设备1600自定义显示在上述第一应用的用户界面中的窗口。
在一种可能的实施方式中,上述界面显示信息包括上述第一应用在上述第一设备中的任务栈内的信息,上述设备1600还包括判断单元,用于在上述第二接收单元接收对上述第三用户界面中的第一控件的第一操作之后,根据上述第一应用在上述第一设备中的任务栈内的信息判断出上述设备1600中安装有上述第一应用;
上述显示单元1602,还用于通过上述第一应用基于上述界面显示信息显示上述第二用户界面。
在一种可能的实施方式中,上述显示单元1602具体用于:响应于第二操作通过上述第一应用基于上述界面显示信息显示上述第二用户界面,上述第二操作为对上述第一应用的选择按钮的触控操作。
图16所示设备中各个单元的具体操作以及有益效果可以参见上述图15所示方法实施例的描述,此处不再赘述。
上述实施例中所用,根据上下文,术语“当…时”可以被解释为意思是“如果…”或“在…后”或“响应于确定…”或“响应于检测到…”。类似地,根据上下文,短语“在确定…时”或“如果检测到(所陈述的条件或事件)”可以被解释为意思是“如果确定…”或“响应于确定…”或“在检测到(所陈述的条件或事件)时”或“响应于检测到(所陈述的条件或事件)”。
在上述实施例中,可以全部或部分地通过软件、硬件、固件或者其任意组合来实现。当使用软件实现时,可以全部或部分地以计算机程序产品的形式实现。所述计算机程序产品包括一个或多个计算机指令。在计算机上加载和执行所述计算机程序指令时,全部或部分地产生按照本申请实施例所述的流程或功能。所述计算机可以是通用计算机、专用计算机、计算机网络、或者其它可编程装置。所述计算机指令可以存储在计算机可读存储介质中,或者从一个计算机可读存储介质向另一个计算机可读存储介质传输,例如,所述计算机指令可以从一个网站站点、计算机、服务器或数据中心通过有线(例如同轴电缆、光纤、数字用户线)或无线(例如红外、无线、微波等)方式向另一个网站站点、计算机、服务器或数据中心进行传输。所述计算机可读存储介质可以是计算机能够存取的任何可用介质或者是包含一个或多个可用介质集成的服务器、数据中心等数据存储设备。所述可用介质可以是磁性介质,(例如,软盘、硬盘、磁带)、光介质(例如DVD)、或者半导体介质(例如固态硬盘)等。
本领域普通技术人员可以理解实现上述实施例方法中的全部或部分流程,该流程可以由计算机程序来指令相关的硬件完成,该程序可存储于计算机可读取存储介质中,该程序在执行时,可包括如上述各方法实施例的流程。而前述的存储介质包括:ROM或随机存储记忆体RAM、磁碟或者光盘等各种可存储程序代码的介质。

Claims (23)

  1. 一种显示交互系统,其特征在于,所述系统包括第一设备和第二设备;其中,
    所述第一设备用于向所述第二设备发送第一应用的界面显示信息,其中,所述第一设备的显示屏显示所述第一应用的第一用户界面,所述界面显示信息包括所述第一设备用于显示所述第一用户界面的数据;
    所述第二设备用于根据所述界面显示信息显示所述第一应用的第二用户界面;所述第二用户界面显示的内容包括所述第一用户界面显示的内容;所述第二用户界面的排版与所述第一用户界面的排版不同,所述第二用户界面的显示区域大于所述第一用户界面的显示区域。
  2. 根据权利要求1所述的系统,其特征在于,所述第一设备用于向所述第二设备发送第一应用的界面显示信息之后,所述第二设备用于根据所述界面显示信息显示第二用户界面之前,还包括:
    所述第二设备用于根据所述界面显示信息显示第三用户界面,所述第三用户界面的尺寸大小与所述第二设备的显示屏的尺寸大小不匹配,所述第三用户界面显示的内容与所述第一用户界面显示的内容一致;
    所述第二设备用于接收针对所述第三用户界面中的第一控件输入的第一操作,所述第一操作用于触发所述第二设备根据所述界面显示信息显示所述第二用户界面的操作。
  3. 根据权利要求1或2所述的系统,其特征在于,所述第二设备用于根据所述界面显示信息显示第二用户界面,包括:
    所述第二设备用于根据所述界面显示信息判断出所述第一用户界面具备横屏显示的属性;
    所述第二设备用于根据所述横屏显示的属性显示所述第二用户界面。
  4. 根据权利要求1或2所述的系统,其特征在于,所述第二设备用于根据所述界面显示信息显示第二用户界面,包括:
    所述第二设备用于根据所述界面显示信息判断出所述第一用户界面无横屏显示的属性;
    所述第二设备用于根据所述无横屏显示的属性显示所述第二用户界面,其中,所述第二用户界面中包括多个小窗口,所述多个小窗口中包括所述第一应用的主页面的窗口以及包括与所述第一用户界面显示的内容一致的窗口,所述多个小窗口均为属于所述第一应用的窗口。
  5. 根据权利要求4所述的系统,其特征在于,所述多个小窗口包括所述第一应用的主页面的窗口和填充窗口,所述填充窗口是所述第二设备自定义显示在所述第一应用的用户界面中的窗口。
  6. 根据权利要求2所述的系统,其特征在于,所述界面显示信息包括所述第一应用在所述第一设备中的任务栈内的信息,所述第二设备用于接收对所述第三用户界面中的第一控件的第一操作之后,还包括:
    所述第二设备用于根据所述第一应用在所述第一设备中的任务栈内的信息判断出所述第二设备中安装有所述第一应用;
    所述第二设备用于通过所述第一应用基于所述界面显示信息显示所述第二用户界面。
  7. 根据权利要求6所述的系统,其特征在于,所述第二设备用于通过所述第一应用基于所述界面显示信息显示所述第二用户界面,包括:
    所述第二设备用于响应第二操作通过所述第一应用基于所述界面显示信息显示所述第二用户界面,所述第二操作为对所述第一应用的选择按钮的触控操作。
  8. 一种显示方法,其特征在于,所述方法包括:
    第二设备接收来自第一设备的第一应用的界面显示信息,其中,所述第一设备的显示屏显示所述第一应用的第一用户界面,所述界面显示信息包括所述第一设备用于显示所述第一用户界面的数据;
    所述第二设备根据所述界面显示信息显示第二用户界面,所述第二用户界面显示的内容包括所述第一用户界面显示的内容;所述第二用户界面的排版与所述第一用户界面的排版不同,所述第二用户界面的显示区域大于所述第一用户界面的显示区域。
  9. 根据权利要求8所述的方法,其特征在于,所述第二设备接收来自第一设备的第一应用的界面显示信息之后,所述第二设备根据所述界面显示信息显示第二用户界面之前,还包括:
    所述第二设备根据所述界面显示信息显示第三用户界面,所述第三用户界面的尺寸大小与所述第二设备的显示屏的尺寸大小不匹配,所述第三用户界面显示的内容与所述第一用户界面显示的内容一致;
    所述第二设备接收针对所述第三用户界面中的第一控件输入的第一操作,所述第一操作用于触发所述第二设备根据所述界面显示信息显示所述第二用户界面的操作。
  10. 根据权利要求8或9所述的方法,其特征在于,所述第二设备根据所述界面显示信息显示第二用户界面,包括:
    所述第二设备根据所述界面显示信息判断出所述第一用户界面具备横屏显示的属性;
    所述第二设备根据所述横屏显示的属性显示所述第二用户界面。
  11. 根据权利要求8或9所述的方法,其特征在于,所述第二设备根据所述界面显示信息显示第二用户界面,包括:
    所述第二设备根据所述界面显示信息判断出所述第一用户界面无横屏显示的属性;
    所述第二设备根据所述无横屏显示的属性显示所述第二用户界面,其中,所述第二用户界面中包括多个小窗口,所述多个小窗口中包括所述第一应用的主页面的窗口以及包括与所述第一用户界面显示的内容一致的窗口,所述多个小窗口均为属于所述第一应用的窗口。
  12. 根据权利要求11所述的方法,其特征在于,所述多个小窗口包括所述第一应用的主页面的窗口和填充窗口,所述填充窗口是所述第二设备自定义显示在所述第一应用的用户界面中的窗口。
  13. 根据权利要求9所述的方法,其特征在于,所述界面显示信息包括所述第一应用在所述第一设备中的任务栈内的信息,所述第二设备接收对所述第三用户界面中的第一控件的第一操作之后,还包括:
    所述第二设备根据所述第一应用在所述第一设备中的任务栈内的信息判断出所述第二设备中安装有所述第一应用;
    所述第二设备通过所述第一应用基于所述界面显示信息显示所述第二用户界面。
  14. 根据权利要求13所述的方法,其特征在于,所述第二设备通过所述第一应用基于所述界面显示信息显示所述第二用户界面,包括:
    所述第二设备响应于第二操作通过所述第一应用基于所述界面显示信息显示所述第二用户界面,所述第二操作为对所述第一应用的选择按钮的触控操作。
  15. 一种显示设备,其特征在于,所述设备包括:
    第一接收单元,用于接收来自第一设备的第一应用的界面显示信息,其中,所述第一设备的显示屏显示所述第一应用的第一用户界面,所述界面显示信息包括所述第一设备用于显示所述第一用户界面的数据;
    显示单元,用于根据所述界面显示信息显示第二用户界面,所述第二用户界面显示的内容包括所述第一用户界面显示的内容;所述第二用户界面的排版与所述第一用户界面的排版不同,所述第二用户界面的显示区域大于所述第一用户界面的显示区域。
  16. 根据权利要求15所述的显示设备,其特征在于,所述显示单元还用于,在所述第一接收单元接收来自第一设备的第一应用的界面显示信息之后,所述显示单元根据所述界面显示信息显示第二用户界面之前,
    根据所述界面显示信息显示第三用户界面,所述第三用户界面的尺寸大小与所述显示设备的显示屏的尺寸大小不匹配,所述第三用户界面显示的内容与所述第一用户界面显示的内容一致;
    所述显示设备还包括第二接收单元,用于接收针对所述第三用户界面中的第一控件输入的第一操作,所述第一操作用于触发所述显示设备根据所述界面显示信息显示所述第二用户界面的操作。
  17. 根据权利要求15或16所述的显示设备,其特征在于,所述显示单元具体用于:
    根据所述界面显示信息判断出所述第一用户界面具备横屏显示的属性;
    根据所述横屏显示的属性显示所述第二用户界面。
  18. 根据权利要求15或16所述的显示设备,其特征在于,所述显示单元具体用于:
    根据所述界面显示信息判断出所述第一用户界面无横屏显示的属性;
    根据所述无横屏显示的属性显示所述第二用户界面,其中,所述第二用户界面中包括多个小窗口,所述多个小窗口中包括所述第一应用的主页面的窗口以及包括与所述第一用户界面显示的内容一致的窗口,所述多个小窗口均为属于所述第一应用的窗口。
  19. 根据权利要求18所述的显示设备,其特征在于,所述多个小窗口包括所述第一应用的主页面的窗口和填充窗口,所述填充窗口是所述显示设备自定义显示在所述第一应用的用户界面中的窗口。
  20. 根据权利要求16所述的显示设备,其特征在于,所述界面显示信息包括所述第一应用在所述第一设备中的任务栈内的信息,所述显示设备还包括判断单元,用于在所述第二接收单元接收对所述第三用户界面中的第一控件的第一操作之后,
    根据所述第一应用在所述第一设备中的任务栈内的信息判断出所述显示设备中安装有所述第一应用;
    所述显示单元,还用于通过所述第一应用基于所述界面显示信息显示所述第二用户界面。
  21. 根据权利要求20所述的显示设备,其特征在于,所述显示单元具体用于:
    响应于第二操作通过所述第一应用基于所述界面显示信息显示所述第二用户界面,所述第二操作为对所述第一应用的选择按钮的触控操作。
  22. 一种显示设备,其特征在于,所述设备包括处理器、接收接口、发送接口和存储器,其中,所述存储器用于存储计算机程序和/或数据,所述处理器用于执行所述存储器中存储的计算机程序,使得所述设备执行如权利要求8至14任一项所述的方法。
  23. 一种计算机可读存储介质,其特征在于,所述计算机可读存储介质存储有计算机程序,所述计算机程序被处理器执行以实现权利要求8至14任意一项所述的方法。
PCT/CN2021/107410 2020-07-21 2021-07-20 显示交互系统、显示方法及设备 WO2022017393A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
EP21845720.8A EP4174633A4 (en) 2020-07-21 2021-07-20 DISPLAY INTERACTION SYSTEM, DISPLAY METHOD, AND DEVICE
US18/006,082 US20230350547A1 (en) 2020-07-21 2021-07-20 Display Interaction System, and Display Methodod and Device

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202010708095.8 2020-07-21
CN202010708095.8A CN113961157B (zh) 2020-07-21 2020-07-21 Display interaction system, display method, and device

Publications (1)

Publication Number Publication Date
WO2022017393A1 true WO2022017393A1 (zh) 2022-01-27

Family

ID=79460019

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/107410 WO2022017393A1 (zh) Display interaction system, display method, and device

Country Status (4)

Country Link
US (1) US20230350547A1 (zh)
EP (1) EP4174633A4 (zh)
CN (2) CN116360725B (zh)
WO (1) WO2022017393A1 (zh)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114415877A (zh) * 2022-01-25 2022-04-29 深圳Tcl新技术有限公司 Multi-window interaction method, apparatus, device, and storage medium
CN116560736A (zh) * 2022-01-30 2023-08-08 京东方科技集团股份有限公司 Display system, electronic device, and communication control method
CN116774870A (zh) * 2022-03-15 2023-09-19 荣耀终端有限公司 Screenshot method and apparatus
CN114816624A (zh) * 2022-04-02 2022-07-29 厦门亿联网络技术股份有限公司 Dual-system display interaction method and apparatus
CN115146192B (zh) * 2022-09-02 2023-01-24 荣耀终端有限公司 Content continuation method and related apparatus

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106055327A (zh) * 2016-05-27 2016-10-26 联想(北京)有限公司 Display method and electronic device
CN106897038A (zh) * 2015-12-17 2017-06-27 北京传送科技有限公司 Screen projection system
CN108415645A (zh) * 2018-01-19 2018-08-17 广州视源电子科技股份有限公司 Operation method and apparatus for smart interactive tablet, and smart interactive tablet
CN109445733A (zh) * 2018-10-16 2019-03-08 杭州橙鹰数据技术有限公司 Cross-screen display method and apparatus, computing device, and storage medium
CN110381345A (zh) * 2019-07-05 2019-10-25 华为技术有限公司 Screen projection display method and electronic device
CN111107518A (zh) * 2018-10-25 2020-05-05 上海博泰悦臻电子设备制造有限公司 Display method, vehicle-mounted terminal, display system, and computer-readable storage medium
US10687018B1 (en) * 2019-01-02 2020-06-16 Lg Electronics Inc. Wireless device receiving a mirroring image from an external device and wireless system including wireless device and external device

Family Cites Families (23)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100162128A1 (en) * 2008-12-19 2010-06-24 Nigel Richardson User interfaces and associated apparatus and methods
CN103577058A (zh) * 2012-08-03 2014-02-12 腾讯科技(深圳)有限公司 Multi-window browsing method and browsing apparatus
US9360997B2 (en) * 2012-08-29 2016-06-07 Apple Inc. Content presentation and interaction across multiple displays
CN104423794A (zh) * 2013-09-11 2015-03-18 上海帛茂信息科技有限公司 Smart mobile device with dual-window display function
CN103902161B (zh) * 2014-04-01 2017-05-31 天津三星通信技术研究有限公司 Interface display method and device for application program
US20170255340A1 (en) * 2014-09-16 2017-09-07 Nec Corporation Information processing apparatus, and control method and control program thereof
US20160132992A1 (en) * 2014-11-06 2016-05-12 Microsoft Technology Licensing, Llc User interface scaling for devices based on display size
KR102373170B1 (ko) * 2015-01-07 2022-03-11 삼성전자주식회사 Method for simultaneously displaying one or more items and electronic device therefor
US9916075B2 (en) * 2015-06-05 2018-03-13 Apple Inc. Formatting content for a reduced-size user interface
KR102442527B1 (ko) * 2016-02-26 2022-09-13 엘지전자 주식회사 Wireless device
US10579238B2 (en) * 2016-05-13 2020-03-03 Sap Se Flexible screen layout across multiple platforms
CN106843732A (zh) * 2017-01-24 2017-06-13 维沃移动通信有限公司 Split-screen display method and mobile terminal
US11137968B2 (en) * 2018-03-08 2021-10-05 Mitsubishi Electric Corporation Screen-creation assistance device, display system, screen-creation assistance method, and recording medium
WO2020029306A1 (zh) * 2018-08-10 2020-02-13 华为技术有限公司 Image capturing method and electronic device
CN111566606B (zh) * 2018-08-20 2022-07-26 华为技术有限公司 Interface display method and electronic device
CN111190558B (zh) * 2018-11-15 2022-09-30 腾讯科技(深圳)有限公司 Screen projection control method and apparatus, computer-readable storage medium, and computer device
CN109814767A (zh) * 2018-12-10 2019-05-28 华为技术有限公司 Message processing method and related apparatus
CN109766066B (zh) * 2018-12-29 2022-03-01 华为技术有限公司 Message processing method, related apparatus, and system
CN110381195A (zh) * 2019-06-05 2019-10-25 华为技术有限公司 Screen projection display method and electronic device
CN115629730A (zh) * 2019-07-23 2023-01-20 华为技术有限公司 Display method and related apparatus
CN111343698B (zh) * 2020-02-27 2022-04-26 深圳市信锐网科技术有限公司 Screen projection control method and apparatus, wireless controller, and storage medium
CN111367456A (zh) * 2020-02-28 2020-07-03 青岛海信移动通信技术股份有限公司 Communication terminal and display method in multi-window mode
US11501731B2 (en) * 2020-04-08 2022-11-15 Motorola Solutions, Inc. Method and device for assigning video streams to watcher devices

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP4174633A4

Also Published As

Publication number Publication date
EP4174633A1 (en) 2023-05-03
CN116360725A (zh) 2023-06-30
EP4174633A4 (en) 2023-12-27
CN113961157A (zh) 2022-01-21
US20230350547A1 (en) 2023-11-02
CN116360725B (zh) 2024-02-23
CN113961157B (zh) 2023-04-07

Similar Documents

Publication Publication Date Title
WO2021013158A1 (zh) Display method and related apparatus
EP3872609B1 (en) Application display method and electronic device
WO2021129326A1 (zh) Screen display method and electronic device
WO2020052529A1 (zh) Method for quickly invoking a small window during full-screen video display, graphical user interface, and terminal
WO2020259452A1 (zh) Full-screen display method and device for a mobile terminal
WO2020177622A1 (zh) Method for displaying UI component and electronic device
WO2020134869A1 (zh) Operation method of electronic device and electronic device
WO2021213164A1 (zh) Application interface interaction method, electronic device, and computer-readable storage medium
US20210263564A1 (en) Display Method for Flexible Display, and Terminal
WO2021036571A1 (zh) Desktop editing method and electronic device
WO2021036770A1 (zh) Split-screen processing method and terminal device
WO2022017393A1 (zh) Display interaction system, display method, and device
WO2020062294A1 (zh) Display control method for system navigation bar, graphical user interface, and electronic device
US20230216990A1 (en) Device Interaction Method and Electronic Device
WO2022068483A1 (zh) Application startup method and apparatus, and electronic device
WO2022022575A1 (zh) Display control method and apparatus, and storage medium
WO2022068819A1 (zh) Interface display method and related apparatus
WO2021078032A1 (zh) User interface display method and electronic device
WO2022037726A1 (zh) Split-screen display method and electronic device
WO2022042770A1 (zh) Method for controlling communication service state, terminal device, and readable storage medium
WO2020155875A1 (zh) Display method for electronic device, graphical user interface, and electronic device
WO2022057512A1 (zh) Screen splitting method and apparatus, and electronic device
WO2022042326A1 (zh) Display control method and related apparatus
WO2021143391A1 (zh) Screen sharing method based on video call and mobile device
WO2020238759A1 (zh) Interface display method and electronic device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21845720

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2021845720

Country of ref document: EP

Effective date: 20230126

NENP Non-entry into the national phase

Ref country code: DE