WO2022089294A1 - Inter-device screen collaboration method and device - Google Patents

Inter-device screen collaboration method and device

Info

Publication number
WO2022089294A1
Authority
WO
WIPO (PCT)
Prior art keywords
electronic device
window
screen
content
screen projection
Prior art date
Application number
PCT/CN2021/125218
Other languages
English (en)
French (fr)
Inventor
叶灵洁
阚彬
Original Assignee
华为技术有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 华为技术有限公司
Publication of WO2022089294A1


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/14 Digital output to display device; Cooperation and interconnection of the display device with other functional units
    • G06F 3/1454 Digital output to display device; Cooperation and interconnection of the display device with other functional units involving copying of the display data of a local workstation or window to a remote workstation or window so that an actual copy of the data is displayed simultaneously on two or more displays, e.g. teledisplay
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/14 Digital output to display device; Cooperation and interconnection of the display device with other functional units

Definitions

  • the present application relates to the technical field of electronic devices, and in particular, to a method and device for screen collaboration between devices.
  • The current inter-device collaboration mode supports device A (such as a mobile phone) collaboratively projecting its screen to device B: the user's operations on device A are displayed on device B synchronously, and the user's operations performed on device B within the projected window of device A are also fed back to device A synchronously.
  • However, the current inter-device collaboration mode has a master-slave relationship. For example, after device A projects its screen to device B, device B acts as a slave device and cannot project its screen to device A, resulting in a poor user experience.
  • In a first aspect, an embodiment of the present application provides a method for inter-device screen collaboration, which is applied to a system composed of a first electronic device and a second electronic device.
  • The method includes: the first electronic device and the second electronic device establish a communication connection; the first electronic device displays a first window, and the first window includes first content; the second electronic device displays a second window, and the second window includes second content; the first electronic device sends first screen projection data to the second electronic device, and the first screen projection data includes information of the first content; the second electronic device receives the first screen projection data and displays the second window and a third window, where the third window includes the first content;
  • the second electronic device sends second screen projection data to the first electronic device, and the second screen projection data includes information of the second content
  • the first electronic device receives the second screen projection data, displays the first window and a fourth window, and the fourth window includes the second content.
  • With the above method, when the first electronic device displays the first window and the second electronic device displays the second window, the first electronic device sends the first screen projection data corresponding to the first window to the second electronic device, and the second electronic device can display the screen projection window of the first window, that is, the third window, according to the first screen projection data, while also displaying its own second window, so that the screen of the first electronic device is projected to the second electronic device.
  • Similarly, the first electronic device receives the second screen projection data sent by the second electronic device and, according to the second screen projection data, displays the screen projection window of the second window of the second electronic device, that is, the fourth window, while also displaying its own first window, so that the screen of the second electronic device is projected to the first electronic device. Therefore, the above method realizes bidirectional screen projection between the first electronic device and the second electronic device: both electronic devices can simultaneously display their own windows and the windows of the peer device, which improves the screen collaboration efficiency between the electronic devices and improves the user experience.
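  • As an illustration only, the bidirectional exchange described above can be sketched as two symmetric peers, each keeping its own window and rendering a projection window from the peer's screen projection data; the class and method names below are hypothetical and are not taken from the application.

      import java.util.function.Consumer;

      // Hypothetical sketch of bidirectional screen projection between two peer devices.
      final class ProjectionData {
          final String sourceDevice;
          final String content;              // information of the projected window's content
          ProjectionData(String sourceDevice, String content) {
              this.sourceDevice = sourceDevice;
              this.content = content;
          }
      }

      final class ProjectionPeer {
          private final String name;
          private final String localWindowContent;      // e.g. the "first content"
          private String projectionWindowContent;       // content received from the peer
          private Consumer<ProjectionData> link;        // stands in for the communication connection

          ProjectionPeer(String name, String localWindowContent) {
              this.name = name;
              this.localWindowContent = localWindowContent;
          }

          void connect(ProjectionPeer peer) { this.link = peer::onProjectionData; }

          // Send this device's window content to the peer (the first/second screen projection data).
          void projectToPeer() { link.accept(new ProjectionData(name, localWindowContent)); }

          // Receive the peer's projection data and display it alongside the local window.
          void onProjectionData(ProjectionData data) {
              projectionWindowContent = data.content;
              System.out.println(name + " shows local window [" + localWindowContent
                      + "] and projection window from " + data.sourceDevice
                      + " [" + projectionWindowContent + "]");
          }
      }

      public class BidirectionalProjectionDemo {
          public static void main(String[] args) {
              ProjectionPeer first = new ProjectionPeer("first device", "first content");
              ProjectionPeer second = new ProjectionPeer("second device", "second content");
              first.connect(second);
              second.connect(first);
              first.projectToPeer();   // the second device now shows the third window
              second.projectToPeer();  // the first device now shows the fourth window
          }
      }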
  • In a possible design, before the first electronic device sends the first screen projection data to the second electronic device, the method further includes: the first electronic device, in response to a received first operation, sends a first screen projection request to the second electronic device, where the first screen projection request is used to request projection of the first content to the second electronic device; and before the second electronic device displays the second window and the third window, the method further includes: the second electronic device displays first prompt information, where the first prompt information is used to prompt the user whether to accept the first electronic device projecting the first content to the second electronic device.
  • the first electronic device requests screen projection to the second electronic device according to the user operation, and the second electronic device displays corresponding prompt information after receiving the request, so as to determine whether to accept the screen projection according to the user operation. Therefore, the first electronic device and the second electronic device can perform coordinated control of screens between devices based on user requirements, and the user experience is better.
  • In a possible design, the method further includes: in response to a received second operation, the first electronic device updates the first content in the first window; or, in response to a received third operation, the second electronic device sends a first update request to the first electronic device, where the first update request is used to request the first electronic device to update the first content in the first window; and the first electronic device updates the first content in the first window in response to the received first update request.
  • With the above method, the user can control the update of the content in the first window of the first electronic device on either the first electronic device side or the second electronic device side, which greatly improves the flexibility and convenience of device control during inter-device screen collaboration. At the same time, the second electronic device that accepts the screen projection can control the update of the display window of the first electronic device that initiates the screen projection.
  • In a possible design, the method further includes: the first electronic device sends the updated first screen projection data to the second electronic device; and the second electronic device updates the first content in the third window according to the received updated first screen projection data.
  • With the above method, after the first electronic device updates the display content of its own first window, it indicates the updated content to the second electronic device, and the second electronic device can synchronously update the screen projection window corresponding to the first window, that is, the third window. Therefore, the user can control the corresponding window displayed by the second electronic device while controlling the first electronic device, so as to ensure the consistency of the content displayed by the two electronic devices.
  • In a possible design, the method further includes: the first electronic device displays a first control option; and before the first electronic device updates the first content in the first window in response to the received first update request, the method further includes: the first electronic device receives a fourth operation acting on the first control option.
  • the user can set whether the first electronic device accepts the control of the second electronic device, which improves the flexibility of coordinated control of screens between devices.
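  • Purely as an illustrative sketch under assumed names (none of which come from the application), the update path above can be summarized as: an update is applied either locally or on receipt of an update request, the update request is only honoured when the control option allows it, and the updated projection data is then pushed to the peer so that the first window and the third window stay consistent.

      // Hypothetical sketch of keeping the projected (third) window in sync with the first window.
      public class ProjectedWindowSync {
          private String firstWindowContent;            // content of the first window (source side)
          private String thirdWindowContent;            // mirrored content on the peer (sink side)
          private boolean acceptRemoteControl = true;   // corresponds to the "first control option"

          ProjectedWindowSync(String initialContent) {
              firstWindowContent = initialContent;
              thirdWindowContent = initialContent;
          }

          // A local operation on the source device (the "second operation").
          void updateLocally(String newContent) { applyAndPush(newContent); }

          // An update request coming from the peer device (the "first update request").
          void onUpdateRequest(String newContent) {
              if (acceptRemoteControl) {                // only honoured if the user enabled remote control
                  applyAndPush(newContent);
              }
          }

          private void applyAndPush(String newContent) {
              firstWindowContent = newContent;          // 1. update the first window
              thirdWindowContent = newContent;          // 2. push updated projection data; the peer updates the third window
              System.out.println("first window = " + firstWindowContent
                      + ", third window = " + thirdWindowContent);
          }

          public static void main(String[] args) {
              ProjectedWindowSync sync = new ProjectedWindowSync("first content");
              sync.updateLocally("edited locally on the first device");
              sync.onUpdateRequest("edited remotely from the second device");
          }
      }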
  • In a possible design, before the second electronic device sends the second screen projection data to the first electronic device, the method further includes: the second electronic device, in response to a received fifth operation, sends a second screen projection request to the first electronic device, where the second screen projection request is used to request projection of the second content to the first electronic device; and before the first electronic device displays the first window and the fourth window, the method further includes: the first electronic device displays second prompt information, where the second prompt information is used to prompt the user whether to accept the second electronic device projecting the second content to the first electronic device.
  • the second electronic device requests screen projection to the first electronic device according to the user operation, and the first electronic device displays corresponding prompt information after receiving the request, so as to determine whether to accept the screen projection according to the user operation. Therefore, the first electronic device and the second electronic device can perform inter-device screen coordination control based on user requirements, and the user experience is better.
  • In a possible design, the method further includes: in response to a received sixth operation, the second electronic device updates the second content in the second window; or, in response to a received seventh operation, the first electronic device sends a second update request to the second electronic device, where the second update request is used to request the second electronic device to update the second content in the second window; and the second electronic device updates the second content in the second window in response to the received second update request.
  • With the above method, the user can control the update of the content in the second window of the second electronic device on either the second electronic device side or the first electronic device side, which greatly improves the flexibility and convenience of device control during inter-device screen collaboration. At the same time, the first electronic device that accepts the screen projection can control the update of the display window of the second electronic device that initiates the screen projection.
  • In a possible design, the method further includes: the second electronic device sends the updated second screen projection data to the first electronic device; and the first electronic device updates the second content in the fourth window according to the received updated second screen projection data.
  • With the above method, after the second electronic device updates the display content of its own second window, it indicates the updated content to the first electronic device, and the first electronic device can synchronously update the screen projection window corresponding to the second window, that is, the fourth window. Therefore, the user can control the corresponding window displayed by the first electronic device while controlling the second electronic device, so as to ensure the consistency of the content displayed by the two electronic devices.
  • In a possible design, the method further includes: the second electronic device displays a second control option; and before the second electronic device updates the second content in the second window in response to the received second update request, the method further includes: the second electronic device receives an eighth operation acting on the second control option.
  • the user can set whether the second electronic device accepts the control of the first electronic device, which improves the flexibility of coordinated control of screens between devices.
  • In a possible design, the display areas of the first window and the fourth window are different, or the display area of the first window covers a part of the display area of the fourth window, or the display area of the fourth window covers a part of the display area of the first window.
  • In a possible design, the display areas of the second window and the third window are different, or the display area of the second window covers a part of the display area of the third window, or the display area of the third window covers a part of the display area of the second window.
  • In a possible design, the system further includes a third electronic device, and after the first electronic device displays the first window and the fourth window, the method further includes: the first electronic device and the third electronic device establish a communication connection; the third electronic device displays a fifth window, and the fifth window includes third content; the first electronic device sends the first screen projection data to the third electronic device; the third electronic device receives the first screen projection data and displays the fifth window and a sixth window, where the sixth window includes the first content; the third electronic device sends third screen projection data to the first electronic device, where the third screen projection data includes information of the third content; and the first electronic device receives the third screen projection data and displays the first window and a seventh window, where the seventh window includes the third content.
  • With the above method, after the first electronic device establishes two-way screen projection with the second electronic device, it can also establish two-way screen projection with the third electronic device at the same time. Therefore, the first electronic device can perform screen projection control on multiple electronic devices and accept screen projection control from multiple electronic devices at the same time, so as to realize inter-device screen collaboration in a multi-device scenario.
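  • A minimal sketch, under hypothetical names, of how one device might track several simultaneous two-way projection sessions (one per connected peer device) as described above:

      import java.util.LinkedHashMap;
      import java.util.Map;

      // Hypothetical sketch: one device holding a projection session per connected peer device.
      public class MultiDeviceProjection {
          private final String localContent;
          // peer name -> content most recently received from that peer (shown in a dedicated window)
          private final Map<String, String> peerWindows = new LinkedHashMap<>();

          MultiDeviceProjection(String localContent) { this.localContent = localContent; }

          // The same local projection data can be sent to every connected peer.
          String localProjectionData() { return localContent; }

          // Store projection data received from a peer; each peer gets its own projection window.
          void onPeerProjection(String peerName, String peerContent) {
              peerWindows.put(peerName, peerContent);       // e.g. the fourth window, the seventh window, ...
          }

          void render() {
              System.out.println("local window: " + localContent);
              peerWindows.forEach((peer, content) ->
                      System.out.println("projection window from " + peer + ": " + content));
          }

          public static void main(String[] args) {
              MultiDeviceProjection first = new MultiDeviceProjection("first content");
              first.onPeerProjection("second device", "second content");   // fourth window
              first.onPeerProjection("third device", "third content");     // seventh window
              first.render();
          }
      }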
  • In a second aspect, an embodiment of the present application provides a method for inter-device screen collaboration, which is applied to a first electronic device.
  • The method includes: establishing a communication connection with a second electronic device; displaying a first window, where the first window includes first content; sending first screen projection data to the second electronic device, where the first screen projection data includes information of the first content; receiving second screen projection data from the second electronic device, where the second screen projection data includes information of second content, and the second content is the content included in a second window displayed by the second electronic device; and displaying the first window and a fourth window, where the fourth window includes the second content.
  • In a possible design, before sending the first screen projection data to the second electronic device, the method further includes: in response to a received first operation, sending a first screen projection request to the second electronic device, where the first screen projection request is used to request projection of the first content to the second electronic device.
  • In a possible design, before receiving the second screen projection data from the second electronic device, the method further includes: displaying second prompt information, where the second prompt information is used to prompt the user whether to accept the second electronic device projecting the second content to the first electronic device.
  • In a possible design, the method further includes: in response to a received second operation, updating the first content in the first window; or, in response to a received first update request from the second electronic device, updating the first content in the first window, where the first update request is used to request the first electronic device to update the first content in the first window.
  • In a possible design, when updating the first content in the first window, the method further includes: sending the updated first screen projection data to the second electronic device.
  • In a possible design, the method further includes: displaying a first control option; and before updating the first content in the first window in response to the first update request from the second electronic device, the method further includes: receiving a fourth operation acting on the first control option.
  • In a possible design, the method further includes: in response to a received seventh operation, sending a second update request to the second electronic device, where the second update request is used to request the second electronic device to update the second content in the second window.
  • In a possible design, the method further includes: updating the second content in the fourth window in response to received updated second screen projection data from the second electronic device.
  • In a possible design, the display areas of the first window and the fourth window are different, or the display area of the first window covers a part of the display area of the fourth window, or the display area of the fourth window covers a part of the display area of the first window.
  • In a possible design, the method further includes: establishing a communication connection with a third electronic device; sending the first screen projection data to the third electronic device; receiving third screen projection data from the third electronic device, where the third screen projection data includes information of third content, and the third content is the content included in a fifth window displayed by the third electronic device; and displaying the first window and a seventh window, where the seventh window includes the third content.
  • An embodiment of the present application provides an electronic device, where the electronic device includes a display screen, a memory, and one or more processors; the memory is used to store computer program code, and the computer program code includes computer instructions; when the instructions are invoked and executed by the one or more processors, the instructions enable the electronic device to execute the method described in the second aspect or any possible design of the second aspect.
  • An embodiment of the present application provides a chip, which is coupled to a memory in an electronic device, so that the chip, when running, invokes a computer program stored in the memory to implement the method provided in the first aspect or any possible design of the first aspect of the embodiments of the present application, or the method provided in the second aspect or any possible design of the second aspect of the embodiments of the present application.
  • An embodiment of the present application provides a computer storage medium, where a computer program is stored in the computer storage medium, and when the computer program runs on an electronic device, the electronic device is caused to execute the method provided in the first aspect or any possible design of the first aspect.
  • An embodiment of the present application provides a computer program product which, when running on an electronic device, enables the electronic device to execute the method provided in the first aspect or any possible design of the first aspect, or the method provided in the second aspect or any possible design of the second aspect of the embodiments of the present application.
  • FIG. 1 is a schematic diagram of a cross-device collaborative screen projection provided by an embodiment of the present application
  • FIG. 2 is a schematic diagram of an inter-device screen collaboration system architecture provided by an embodiment of the present application.
  • FIG. 3 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
  • FIG. 4 is a schematic structural diagram of an Android operating system of an electronic device provided by an embodiment of the present application.
  • FIG. 5 is a schematic diagram of a method for inter-device screen collaboration provided by an embodiment of the present application.
  • FIG. 6a is a schematic diagram of a method for establishing screen collaboration between devices according to an embodiment of the present application
  • 6b is a schematic diagram of a display screen of a first electronic device provided by an embodiment of the application.
  • FIG. 6c is a schematic diagram of a display screen of a second electronic device according to an embodiment of the present application.
  • FIG. 6d is a schematic diagram of a display screen of a second electronic device that accepts screen projection according to an embodiment of the present application
  • FIG. 6e is a schematic diagram of a display screen for updating a first electronic device according to an embodiment of the present application.
  • FIG. 6f is a schematic diagram of a display screen updated after a second electronic device accepts screen projection according to an embodiment of the present application
  • FIG. 7a is a schematic diagram of a method for establishing a two-way inter-device screen collaboration provided by an embodiment of the present application
  • FIG. 7b is a schematic diagram of a display screen of a second electronic device that accepts screen projection according to an embodiment of the application;
  • FIG. 7c is a schematic diagram of a display screen of a first electronic device according to an embodiment of the present application.
  • FIG. 7d is a schematic diagram of a display screen of a first electronic device that accepts screen projection according to an embodiment of the present application
  • FIG. 7e is a schematic diagram of a display screen for updating a second electronic device according to an embodiment of the present application.
  • FIG. 7f is a schematic diagram of a display screen updated after the first electronic device accepts screen projection according to an embodiment of the present application
  • FIG. 8 is a schematic flowchart of a two-way inter-device screen collaborative control provided by an embodiment of the present application.
  • FIG. 9 is a schematic diagram of the effect of a two-way inter-device screen collaboration method provided by an embodiment of the present application.
  • FIG. 10 is a schematic diagram of the effect of a two-way inter-device screen collaboration method provided by an embodiment of the present application.
  • FIG. 11 is a schematic diagram of the effect of a two-way inter-device screen collaboration method provided by an embodiment of the application.
  • FIG. 12 is a schematic diagram of a bidirectional inter-device screen coordination control method provided by an embodiment of the present application.
  • FIG. 13a is a schematic diagram of a display screen of a first electronic device that accepts screen projection according to an embodiment of the application;
  • FIG. 13b is a schematic diagram of a display screen of a first electronic device receiving screen projections from multiple electronic devices according to an embodiment of the application;
  • FIG. 14 is a schematic diagram of a method for controlling screen coordination between devices according to an embodiment of the present application.
  • FIG. 15 is a schematic diagram of a screen collaborative control between devices provided by an embodiment of the present application.
  • FIG. 16 is a schematic diagram of a screen collaborative control between devices provided by an embodiment of the present application.
  • FIG. 17 is a schematic diagram of a screen collaboration method between devices provided by an embodiment of the present application.
  • FIG. 18 is a schematic diagram of an electronic device according to an embodiment of the present application.
  • The electronic device may be a portable device, such as a mobile phone, a tablet computer, a wearable device with a wireless communication function (for example, a watch, a wristband, a helmet, or a headset), a vehicle terminal device, an augmented reality (AR)/virtual reality (VR) device, a laptop, an ultra-mobile personal computer (UMPC), a netbook, a personal digital assistant (PDA), a smart home device (for example, a smart TV), or another smart terminal device.
  • the electronic device may also be a portable terminal device that further includes other functions, such as a personal digital assistant and/or a screen display function.
  • Portable terminal devices include, but are not limited to, portable terminal devices carrying … or other operating systems.
  • the above-mentioned portable terminal device may also be other portable terminal devices, such as a laptop computer (Laptop) having a display screen, or the like. It should also be understood that, in some other embodiments of the present application, the above-mentioned electronic device may not be a portable terminal device, but a desktop computer having a display screen.
  • Screen collaboration, also known as multi-screen collaboration, screen projection, same-screen display, flying screen, or screen sharing, refers to displaying, in real time, the screen output shown on screen a of a device a (such as a mobile phone, tablet, notebook, or computer) in a set area of screen b of a device b (such as a tablet, notebook, computer, TV, all-in-one machine, or projector).
  • Screen changes of screen a caused by operating device a are displayed in the set area of screen b synchronously; likewise, operating device b can cause picture changes in the set area of screen b that are displayed on screen a synchronously.
  • Middleware is a type of software between the operating system and application programs; it is used to connect the software components of the operating system and the user's application software, and provides general services between the platform (hardware and operating system) and applications.
  • Middleware uses the basic services (functions) provided by the system software to connect various parts of the application system or different applications on the network to achieve the purpose of resource sharing and function sharing.
  • Middleware is an independent system-level software service program that supports distributed computing and can provide interactive functions of applications or services that are transparent across networks and hardware.
  • "At least one" refers to one or more, and "a plurality" refers to two or more.
  • "And/or" describes the association relationship between associated objects and indicates that three relationships may exist; for example, A and/or B may mean: A exists alone, both A and B exist, or B exists alone, where A and B may be singular or plural.
  • The character "/" generally indicates that the associated objects are in an "or" relationship.
  • "At least one (item) of the following" or similar expressions refer to any combination of these items, including a single item or any combination of plural items.
  • For example, at least one (item) of a, b, or c may represent: a; b; c; a and b; a and c; b and c; or a, b, and c, where a, b, and c may each be singular or plural.
  • FIG. 1 is a schematic diagram of cross-device collaborative screen projection provided by an embodiment of the present application. Device A (such as a mobile phone) projects its screen to device B (such as a tablet computer), and the display screen of device B outputs a display window corresponding to the entire display screen of device A.
  • After device A projects the display window corresponding to its display screen to device B, both device A and device B can control the display window: control operations and their results on device A are synchronously shown in the corresponding display window on the display screen of device B, and control operations and their results performed in the display window on device B are synchronously shown in the corresponding display window on the display screen of device A.
  • an embodiment of the present application provides a screen collaboration method between devices, which is applied in a scenario of cross-device screen projection control.
  • FIG. 2 is a schematic diagram of an inter-device screen collaboration system architecture provided by an embodiment of the present application.
  • the system architecture may include: a first electronic device 201 (such as a mobile phone shown in the figure) and a second electronic device 202 (such as a tablet computer shown in the figure).
  • The second electronic device 202 may be any one of at least one electronic device connected to the first electronic device 201. The first electronic device 201 and the second electronic device 202 can each project their screen to the other party's display screen for display, realizing two-way screen projection cooperation and supporting mutual operation control.
  • the display window on the display screen is an overall window corresponding to the display screen, or a partial window corresponding to an application displayed on the display screen.
  • the first electronic device 201 and the second electronic device 202 can communicate.
  • the first electronic device 201 and the second electronic device 202 are connected to the same local area network.
  • the first electronic device 201 and the second electronic device 202 may communicate through short-range wireless communication technologies such as Bluetooth and Wi-Fi.
  • the first electronic device 201 and the second electronic device 202 are connected in a wired manner and communicate with each other.
  • the first electronic device 201 and the second electronic device 202 may communicate with each other after being connected through a data line such as a Universal Serial Bus (Universal Serial Bus, USB) data line.
  • first electronic device 201 and the second electronic device 202 are connected to the same local area network, it may specifically be that the first electronic device 201 and the second electronic device 202 establish a wireless connection with the same wireless access point.
  • the first electronic device 201 and the second electronic device 202 can access the same wireless fidelity (Wireless Fidelity, Wi-Fi) hotspot.
  • the first electronic device 201 and the second electronic device 202 can also use the Bluetooth protocol Access to the same Bluetooth beacon.
  • a communication connection between the first electronic device 201 and the second electronic device 202 may also be triggered by a Near Field Communication (NFC) tag, and encrypted information may be transmitted through a Bluetooth module for identity authentication. After the authentication is successful, data transmission is carried out in a point-to-point (P2P) manner.
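  • The following is a minimal sketch of the connection step, under the assumption that both devices are reachable over the same local network; a shared token merely stands in for the identity authentication described above (for example, exchanged via an NFC tag or a Bluetooth module), and all names are illustrative.

      import java.io.*;
      import java.net.*;
      import java.nio.charset.StandardCharsets;

      // Hypothetical sketch: one device listens, the other connects, authenticates, and
      // then sends a screen projection request over the established connection.
      public class ConnectionSketch {
          static final String SHARED_TOKEN = "example-token";   // assumption for illustration only

          public static void main(String[] args) throws Exception {
              try (ServerSocket server = new ServerSocket(0)) {         // the "second device" listening
                  int port = server.getLocalPort();
                  Thread firstDevice = new Thread(() -> {
                      try (Socket s = new Socket("127.0.0.1", port);
                           BufferedWriter out = new BufferedWriter(
                                   new OutputStreamWriter(s.getOutputStream(), StandardCharsets.UTF_8))) {
                          out.write(SHARED_TOKEN + "\n");                    // authenticate first
                          out.write("PROJECTION_REQUEST first content\n");   // then request projection
                          out.flush();
                      } catch (IOException e) { throw new UncheckedIOException(e); }
                  });
                  firstDevice.start();
                  try (Socket peer = server.accept();
                       BufferedReader in = new BufferedReader(
                               new InputStreamReader(peer.getInputStream(), StandardCharsets.UTF_8))) {
                      boolean authenticated = SHARED_TOKEN.equals(in.readLine());
                      System.out.println("authenticated = " + authenticated);
                      if (authenticated) {
                          System.out.println("received: " + in.readLine());
                      }
                  }
                  firstDevice.join();
              }
          }
      }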
  • the first electronic device 201 and the second electronic device 202 are smart devices with an output display function, such as a mobile phone, a tablet, a computer, and a smart TV.
  • This application does not limit the number of electronic devices in the inter-device screen collaboration system, which may include two or more electronic devices, for example, 3 or 4 electronic devices.
  • the inter-device screen collaboration method provided by the present application can be executed between any two electronic devices to realize bidirectional screen projection and collaborative control.
  • For example, the inter-device screen collaboration can be that both device A and device B project their screens to device C, and device C projects its screen to device A and device B respectively.
  • the screen collaboration between devices may be that device A projects the screen to device B and device C respectively, device B projects the screen to device A and device C respectively, and device C projects the screen to device A and device B respectively; or , the screen collaboration between devices can be that device A casts the screen to device B, device B casts the screen to device C, and device C casts the screen to device A, etc., which will not be listed here.
  • The electronic device 300 may include a processor 310, an external memory interface 320, an internal memory 321, a USB interface 330, a charging management module 340, a power management module 341, a battery 342, an antenna 1, an antenna 2, a mobile communication module 350, a wireless communication module 360, an audio module 370, a speaker 370A, a receiver 370B, a microphone 370C, a headphone jack 370D, a sensor module 380, buttons 390, a motor 391, an indicator 392, a camera 393, a display screen 394, a SIM card interface 395, and the like.
  • the sensor module 380 may include a gyroscope sensor, an acceleration sensor, a proximity light sensor, a fingerprint sensor, a touch sensor, a temperature sensor, a pressure sensor, a distance sensor, a magnetic sensor, an ambient light sensor, an air pressure sensor, a bone conduction sensor, and the like.
  • The electronic device shown in FIG. 3 is only an example and does not constitute a limitation on the electronic device; the electronic device may have more or fewer components than those shown in the figure, two or more components may be combined, or a different component configuration may be used.
  • the various components shown in Figure 3 may be implemented in hardware, software, or a combination of hardware and software, including one or more signal processing and/or application specific integrated circuits.
  • The processor 310 may include one or more processing units; for example, the processor 310 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a memory, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU). Different processing units may be independent devices, or may be integrated in one or more processors.
  • the controller may be the nerve center and command center of the electronic device 300 . The controller can generate an operation control signal according to the instruction operation code and timing signal, and complete the control of fetching and executing instructions.
  • a memory may also be provided in the processor 310 for storing instructions and data.
  • the memory in processor 310 is cache memory. This memory may hold instructions or data that have just been used or recycled by the processor 310 . If the processor 310 needs to use the instruction or data again, it can be called directly from the memory. Repeated accesses are avoided, and the waiting time of the processor 310 is reduced, thereby increasing the efficiency of the system.
  • The execution of the inter-device screen collaboration method provided by the embodiments of the present application may be controlled by the processor 310 or completed by invoking other components, for example, by invoking the processing program of the embodiments of the present application stored in the internal memory 321 to control the wireless communication module 360 to perform data communication with other electronic devices, so as to realize screen collaboration between devices, improve the collaborative control efficiency of the electronic devices, and improve the user experience.
  • The processor 310 may include different devices. For example, when a CPU and a GPU are integrated, the CPU and the GPU may cooperate to execute the inter-device screen collaboration method provided by the embodiments of the present application; for example, some steps may be executed by the CPU and other steps by the GPU for faster processing efficiency.
  • Display screen 394 is used to display images, videos, and the like.
  • Display screen 394 includes a display panel.
  • The display panel can be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a MiniLED, a MicroLED, a Micro-OLED, a quantum dot light-emitting diode (QLED), or the like.
  • the electronic device 300 may include 1 or N display screens 394 , where N is a positive integer greater than 1.
  • the display screen 394 may be used to display information entered by or provided to the user as well as various graphical user interfaces (GUIs).
  • display screen 394 may display photos, videos, web pages, or files, and the like.
  • the display screen 394 may display a graphical user interface of the electronic device as shown in FIG. 2 .
  • the graphical user interface of the electronic device as shown in FIG. 2 includes a status bar, a Dock bar, a hideable navigation bar, time and weather widgets, and application icons such as browser icons and the like.
  • the status bar includes operator name (eg China Mobile), mobile network (eg 4G), time and remaining battery.
  • the status bar may further include a Bluetooth icon, a Wi-Fi icon, an external device icon, and the like.
  • the graphical user interface of the electronic device shown in FIG. 2 may further include a Dock bar, and the Dock bar may include commonly used application icons and the like.
  • the display screen 394 may be an integrated flexible display screen, or a spliced display screen composed of two rigid screens and a flexible screen located between the two rigid screens.
  • the processor 310 may control the display screen 394 to display the relevant results.
  • The camera 393 may be a front camera or a rear camera, or a camera that can serve as both a front camera and a rear camera.
  • the camera 393 may include a photosensitive element such as a lens group and an image sensor, wherein the lens group includes a plurality of lenses (convex or concave) for collecting the light signal reflected by the object to be photographed, and transmitting the collected light signal to the image sensor .
  • the image sensor generates an original image of the object to be photographed according to the light signal.
  • Internal memory 321 may be used to store computer executable program code, which includes instructions.
  • the processor 310 executes various functional applications and data processing of the electronic device 300 by executing the instructions stored in the internal memory 321 .
  • the internal memory 321 may include a storage program area and a storage data area.
  • the stored program area may store the code of the operating system, the application program (such as the screen coordination function between devices, etc.).
  • the storage data area may store data created during the use of the electronic device 300 (eg, process data generated by executing the inter-device screen coordination function provided by the embodiments of the present application, etc.).
  • the internal memory 321 may also store one or more computer programs corresponding to the inter-device screen collaboration algorithm provided in the embodiment of the present application.
  • The one or more computer programs are stored in the aforementioned internal memory 321 and configured to be executed by the one or more processors 310, and the one or more computer programs include instructions that can be used to perform the steps in the following embodiments.
  • the internal memory 321 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, universal flash storage (UFS), and the like.
  • the code of the inter-device screen collaboration algorithm provided by the embodiment of the present application may also be stored in an external memory.
  • the processor 310 may execute the code of the inter-device screen coordination algorithm stored in the external memory through the external memory interface 320 .
  • the sensor module 380 may include a gyro sensor, an acceleration sensor, a proximity light sensor, a fingerprint sensor, a touch sensor, and the like.
  • A touch sensor is also referred to as a "touch panel".
  • The touch sensor may be disposed on the display screen 394, and the touch sensor and the display screen 394 form a touch screen, also referred to as a "touch screen".
  • a touch sensor is used to detect touch operations on or near it.
  • the touch sensor can pass the detected touch operation to the application processor to determine the type of touch event.
  • Visual output related to touch operations may be provided through display screen 394 .
  • the touch sensor may also be disposed on the surface of the electronic device 300 , which is different from the location where the display screen 394 is located.
  • the display screen 394 of the electronic device 300 displays a main interface, and the main interface includes icons of multiple applications (such as a camera application, a WeChat application, etc.).
  • Display screen 394 displays an interface of a camera application, such as a viewfinder interface.
  • the wireless communication function of the electronic device 300 can be implemented by the antenna 1, the antenna 2, the mobile communication module 350, the wireless communication module 360, the modulation and demodulation processor, the baseband processor, and the like.
  • Antenna 1 and Antenna 2 are used to transmit and receive electromagnetic wave signals.
  • Each antenna in electronic device 300 may be used to cover a single or multiple communication frequency bands. Different antennas can also be reused to improve antenna utilization.
  • the antenna 1 can be multiplexed as a diversity antenna of the wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
  • the mobile communication module 350 may provide a wireless communication solution including 2G/3G/4G/5G, etc. applied on the electronic device 300 .
  • the mobile communication module 350 may include at least one filter, switch, power amplifier, low noise amplifier (LNA), and the like.
  • the mobile communication module 350 can receive electromagnetic waves from the antenna 1, filter and amplify the received electromagnetic waves, and transmit them to the modulation and demodulation processor for demodulation.
  • the mobile communication module 350 can also amplify the signal modulated by the modulation and demodulation processor, and then convert it into electromagnetic waves for radiation through the antenna 1 .
  • at least part of the functional modules of the mobile communication module 350 may be provided in the processor 310 .
  • the mobile communication module 350 may be provided in the same device as at least part of the modules of the processor 310 .
  • The mobile communication module 350 may also be used for information interaction with other electronic devices, for example, sending an instruction for screen projection or for updating a screen projection window to other electronic devices, or receiving an instruction for screen projection or for updating a screen projection window sent by other electronic devices.
  • the modem processor may include a modulator and a demodulator.
  • the modulator is used to modulate the low frequency baseband signal to be sent into a medium and high frequency signal.
  • the demodulator is used to demodulate the received electromagnetic wave signal into a low frequency baseband signal. Then the demodulator transmits the demodulated low-frequency baseband signal to the baseband processor for processing.
  • the low frequency baseband signal is processed by the baseband processor and passed to the application processor.
  • the application processor outputs a sound signal through an audio device (not limited to the speaker 370A, the receiver 370B, etc.), or displays an image or video through the display screen 394 .
  • the modem processor may be a stand-alone device.
  • the modem processor may be independent of the processor 310, and may be provided in the same device as the mobile communication module 350 or other functional modules.
  • The wireless communication module 360 can provide wireless communication solutions applied on the electronic device 300, including wireless local area network (WLAN) (such as a wireless fidelity (Wi-Fi) network), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR) technology, and the like.
  • the wireless communication module 360 may be one or more devices integrating at least one communication processing module.
  • the wireless communication module 360 receives electromagnetic waves via the antenna 2 , frequency modulates and filters the electromagnetic wave signals, and sends the processed signals to the processor 310 .
  • the wireless communication module 360 can also receive the signal to be sent from the processor 310 , perform frequency modulation on it, amplify it, and convert it into electromagnetic waves for radiation through the antenna 2 .
  • the wireless communication module 360 is configured to establish a connection with other electronic devices to perform data interaction.
  • the wireless communication module 360 may be used to access the access point device, send a screen projection or update screen projection window instruction to other electronic devices, or receive a screen projection or screen projection window update instruction sent by other electronic devices.
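  • As a purely illustrative sketch (the instruction names and encoding are assumptions, not defined by the application), the instructions exchanged over the mobile or wireless communication module could be modelled as a small set of typed messages:

      // Hypothetical instruction types for screen projection and window updates.
      enum InstructionType { PROJECTION_REQUEST, PROJECTION_DATA, UPDATE_REQUEST, UPDATED_PROJECTION_DATA }

      final class ProjectionInstruction {
          final InstructionType type;
          final String payload;                 // e.g. serialized window content or request parameters

          ProjectionInstruction(InstructionType type, String payload) {
              this.type = type;
              this.payload = payload;
          }

          // Simple line-based encoding that a communication module could send to the peer device.
          String encode() { return type.name() + "|" + payload; }

          static ProjectionInstruction decode(String line) {
              int sep = line.indexOf('|');
              return new ProjectionInstruction(InstructionType.valueOf(line.substring(0, sep)),
                                               line.substring(sep + 1));
          }
      }

      public class InstructionDemo {
          public static void main(String[] args) {
              String wire = new ProjectionInstruction(InstructionType.UPDATE_REQUEST, "scroll page").encode();
              ProjectionInstruction received = ProjectionInstruction.decode(wire);
              System.out.println(received.encode());   // prints: UPDATE_REQUEST|scroll page
          }
      }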
  • Exemplarily, as shown in FIG. 2, the first electronic device and the second electronic device can receive or send instructions and data related to screen projection through the mobile communication module 350 or the wireless communication module 360, so as to realize the inter-device screen collaboration function.
  • the electronic device 300 may implement audio functions through an audio module 370, a speaker 370A, a receiver 370B, a microphone 370C, an earphone interface 370D, an application processor, and the like. Such as music playback, recording, etc.
  • the electronic device 300 may receive the key 390 input and generate the key signal input related to the user setting and function control of the electronic device 300 .
  • the electronic device 300 may use the motor 391 to generate vibration alerts (eg, vibration alerts for incoming calls).
  • the indicator 392 in the electronic device 300 may be an indicator light, which may be used to indicate a charging state, a change in power, and may also be used to indicate a message, a missed call, a notification, and the like.
  • the SIM card interface 395 in the electronic device 300 is used to connect the SIM card. The SIM card can be contacted and separated from the electronic device 300 by inserting into the SIM card interface 395 or pulling out from the SIM card interface 395 .
  • the electronic device 300 may include more or less components than those shown in FIG. 3 , which are not limited in this embodiment of the present application.
  • the illustrated electronic device 300 is only an example, and the electronic device 300 may have more or fewer components than those shown, two or more components may be combined, or a different configuration of components may be present.
  • the various components shown in the figures may be implemented in hardware, software, or a combination of hardware and software, including one or more signal processing and/or application specific integrated circuits.
  • the software system of the electronic device 300 may adopt a layered architecture, an event-driven architecture, a microkernel architecture, a microservice architecture, or a cloud architecture.
  • the embodiments of the present invention take an Android system with a layered architecture as an example to illustrate the software structure of an electronic device.
  • FIG. 4 it is a block diagram of a software structure of an electronic device according to an embodiment of the present invention.
  • FIG. 4 is a schematic diagram of a software architecture that can run in the above-mentioned first electronic device or the second electronic device.
  • the layered architecture divides the software into several layers, and each layer has a clear role and division of labor. Layers communicate with each other through software interfaces.
  • the software architecture can be divided into five layers, namely the application layer, the application framework layer, the Android runtime and system library, the hardware abstraction layer and the Linux kernel layer.
  • the application layer is the top layer of the operating system and includes the native applications of the operating system, such as email client, bluetooth, camera, music, video, text messages, calls, calendar, browser, contacts, etc.
  • the APP involved in the embodiments of the present application referred to as an application for short, is a software program capable of implementing one or more specific functions.
  • multiple applications can be installed in a terminal device.
  • the applications mentioned below may be system applications that have been installed when the terminal device is shipped from the factory, or may be third-party applications downloaded from the network or obtained from other terminal devices by the user during the use of the terminal device.
  • the application framework layer can include window managers, content providers, view systems, telephony managers, resource managers, notification managers, etc.
  • the window manager is used to manage window programs.
  • the window manager can get the size of the display screen, determine whether there is a status bar, lock the screen, take screenshots, etc.
  • the application layer may include an interface setting service for implementing the presentation of a setting interface
  • the above-mentioned setting interface may be used by a user to set the inter-device screen coordination function of the terminal device.
  • the user can set on or off the screen coordination function between devices in the setting interface.
  • the above-mentioned setting interface may be the content in the status bar or notification bar displayed on the touch screen of the terminal device, or may be the relevant control interface of the device control function displayed on the touch screen of the terminal device.
  • Applications can be developed in the Java language by calling the application programming interface (API) provided by the application framework layer; through the application framework, developers can interact with the underlying layers of the operating system (such as the hardware abstraction layer and the kernel layer) to develop their own applications.
  • the application framework is mainly a series of services and management systems of the operating system.
  • the application framework layer provides an application programming interface (application programming interface, API) and a programming framework for applications in the application layer.
  • the application framework layer can include some predefined functions. As shown in Figure 4, the application framework layer may include window managers, content providers, view systems, telephony managers, resource managers, notification managers, and the like.
  • a window manager is used to manage window programs.
  • the window manager can get the size of the display screen, determine whether there is a status bar, lock the screen, take screenshots, etc.
  • Content providers are used to store and retrieve data and make these data accessible to applications.
  • View systems can be used to build applications.
  • a display interface can consist of one or more views.
  • the telephony manager is used to provide the communication function of the terminal device.
  • the notification manager enables applications to display notification information in the status bar, which can be used to convey notification-type messages, and can disappear automatically after a brief pause without user interaction.
  • the application framework layer may further include an inter-device screen collaboration service, which is used to control and implement the inter-device screen collaboration function.
  • The inter-device screen collaboration service may include a multi-window framework service, which mainly provides device control and window display control functions for inter-device screen collaboration; for example, it can be used to manage the currently connected electronic devices and manage the windows displayed on the display screen.
  • the inter-device screen collaboration service may further include a notification manager, which is used for information interaction with other data layers.
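  • A minimal sketch of such a framework-layer service, assuming hypothetical class and method names (this is not an actual Android framework API), could look as follows:

      import java.util.ArrayList;
      import java.util.List;

      // Hypothetical sketch of the inter-device screen collaboration service described above.
      public class ScreenCollaborationService {
          private final List<String> connectedDevices = new ArrayList<>();   // device management
          private final List<String> displayedWindows = new ArrayList<>();   // window display management

          void onDeviceConnected(String deviceId)    { connectedDevices.add(deviceId); }
          void onDeviceDisconnected(String deviceId) { connectedDevices.remove(deviceId); }

          // Multi-window framework: show a projection window next to the locally displayed windows.
          void addProjectionWindow(String windowId)    { displayedWindows.add(windowId); }
          void removeProjectionWindow(String windowId) { displayedWindows.remove(windowId); }

          // Notification path to other layers (stubbed out here as a log line).
          void notifyOtherLayers(String event) { System.out.println("notify: " + event); }

          public static void main(String[] args) {
              ScreenCollaborationService service = new ScreenCollaborationService();
              service.onDeviceConnected("second device");
              service.addProjectionWindow("third window");
              service.notifyOtherLayers("projection window added");
              System.out.println(service.connectedDevices + " " + service.displayedWindows);
          }
      }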
  • the Android runtime includes core libraries and a virtual machine.
  • the Android runtime is responsible for the scheduling and management of the Android system.
  • The core library of the Android system consists of two parts: one part is the functions that the Java language needs to call, and the other part is the core library of Android.
  • the application layer and the application framework layer run in virtual machines. Taking Java as an example, the virtual machine executes the Java files of the application layer and the application framework layer as binary files.
  • the virtual machine is used to perform functions such as object lifecycle management, stack management, thread management, safety and exception management, and garbage collection.
  • a system library can include multiple functional modules. For example: surface manager, media library, 3D graphics processing library (eg: OpenGL ES), 2D graphics engine (eg: SGL), etc.
  • the Surface Manager manages the display subsystem and provides a fusion of 2D and 3D layers for multiple applications.
  • the media library supports playback and recording of a variety of commonly used audio and video formats, as well as still image files.
  • the media library can support a variety of audio and video encoding formats, such as: MPEG4, H.264, MP3, AAC, AMR, JPG, PNG, etc.
  • the 3D graphics processing library is used to implement 3D graphics drawing, image rendering, compositing, and layer processing.
  • a 2D graphics engine is a drawing engine for 2D drawing.
  • the Hardware Abstraction Layer (HAL) is the support of the application framework and an important link between the application framework layer and the Linux kernel layer. It can provide services for developers through the application framework layer.
  • the kernel (Kernel) layer provides the core system services of the operating system, such as security, memory management, process management, network protocol stack and driver model, all based on the kernel layer.
  • the kernel layer also acts as an abstraction layer between the hardware and the software stack. This layer contains many drivers related to electronic devices; the main drivers include: display driver; Linux-based frame buffer driver; keyboard driver as an input device; flash driver based on memory technology devices; camera driver; audio driver; Bluetooth driver; Wi-Fi driver, etc.
  • the kernel layer, as an abstraction layer between the hardware and the software stack, includes a touch driver service, which is used to obtain operation information that is received by a hardware part (such as a touch screen or a touch sensor) and is related to triggering a window update, and to report it.
  • both the first electronic device and the second electronic device can be implemented through the above hardware architecture and software architecture.
  • the middleware and the soft bus structure are combined with the above-mentioned software structure to realize bidirectional screen projection control between two electronic devices (the first electronic device and the second electronic device).
  • the middleware connects the application program layer and the application program framework layer of the operating system, and provides the operating system with services such as application interface standardization, protocol unification, and shielding specific operation details.
  • the soft bus is located at the link layer and is a module that encapsulates the operating system's handling of inter-process communication resources, shared memory, and other resources commonly used by multiple processes; it provides task processes with standard interfaces for applying for, using, and recycling resources, and allows task processes to share resources through these interfaces and protocol flags.
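  • As a minimal, hypothetical sketch of the soft-bus idea described above (the Java class and method names below are illustrative assumptions, not part of this application), a soft-bus-style module could expose one standard interface through which task processes apply for, use, and recycle a channel to a peer device:

    import java.util.Map;
    import java.util.concurrent.ConcurrentHashMap;

    interface SoftBusResource {
        void send(byte[] data);   // use the resource to transmit data to the peer
        void recycle();           // return the resource to the soft bus
    }

    final class SoftBus {
        private final Map<String, SoftBusResource> channels = new ConcurrentHashMap<>();

        /** Apply for a channel to a peer device; reuse it if it already exists. */
        SoftBusResource apply(String peerDeviceId) {
            return channels.computeIfAbsent(peerDeviceId, id -> new SoftBusResource() {
                @Override public void send(byte[] data) {
                    // A real implementation would hand the bytes to Bluetooth or Wi-Fi here.
                    System.out.println("send " + data.length + " bytes to " + id);
                }
                @Override public void recycle() {
                    channels.remove(id);
                }
            });
        }
    }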
  • the electronic device that initiates screen collaboration between devices sends screen projection information to the application framework layer and the middleware respectively through the application layer, where the screen projection information includes the projection data to be displayed on the screen, the projection screen objects, etc.
  • after the middleware encodes the display stream of the screen projection data, it transmits the encoded data to the link layer, and the link layer sends it over the soft bus to the electronic device that accepts the screen projection.
  • the electronic device that accepts the screen projection decodes the encoded data to obtain the screen projection data and reports it to the application framework layer for window display.
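  • The following is a compact sketch of the pipeline just described, using illustrative class and method names that are not defined by this application: the sending side encodes the display stream, the link layer carries the bytes across the soft bus, and the receiving side decodes them and reports the result for window display.

    import java.nio.charset.StandardCharsets;

    final class ProjectionPipeline {
        /** Middleware on the sending device: encode the display stream. */
        static byte[] encodeDisplayStream(String windowContent) {
            return windowContent.getBytes(StandardCharsets.UTF_8); // stand-in for a real video encoder
        }

        /** Middleware on the receiving device: decode and report to the framework layer. */
        static void onLinkLayerData(byte[] encoded) {
            String windowContent = new String(encoded, StandardCharsets.UTF_8);
            displayProjectionWindow(windowContent); // application framework layer draws the window
        }

        static void displayProjectionWindow(String content) {
            System.out.println("projection window shows: " + content);
        }

        public static void main(String[] args) {
            byte[] encoded = encodeDisplayStream("window A content");
            onLinkLayerData(encoded); // in practice this hop crosses the soft bus
        }
    }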
  • the first electronic device sends, through the application layer, the screen projection data to be displayed on the second electronic device to the middleware; the middleware encodes the display stream of the screen projection data and sends it to the link layer, and the link layer sends the encoded screen projection data to the second electronic device.
  • the link layer uses the soft bus to send the data to the link layer of the second electronic device by means of short-range communication such as Bluetooth or Wi-Fi.
  • the link layer of the second electronic device receives the encoded screen projection data sent by the link layer of the first electronic device, and sends it to the middleware.
  • the middleware decodes the encoded screen projection data and sends it to the application framework.
  • the application framework layer displays the screen projection data.
  • the second electronic device sends, through the application layer, its own window data displayed on the current display screen to the application framework layer, and the application framework layer outputs this window data and the screen projection data sent by the link layer at the same time for display on the display screen.
  • the application layer of the second electronic device sends the screen projection data to be displayed on the first electronic device to the middleware, and the middleware encodes the display stream of the screen projection data and sends it to the link layer.
  • the link layer sends the encoded screen projection data to the first electronic device.
  • the link layer utilizes a soft bus and is sent to the link layer of the first electronic device by means of short-distance communication.
  • the link layer of the first electronic device receives the encoded screen projection data sent by the link layer of the second electronic device, and sends it to the middleware.
  • the middleware decodes the encoded screen projection data and sends it to the application framework layer.
  • the application framework layer displays the screen projection data.
  • the first electronic device sends, through the application layer, its own window data displayed on the current display screen to the application framework layer, and the application framework layer outputs this window data and the screen projection data sent by the link layer at the same time for display on the display screen.
  • the screen projection data that the first electronic device and the second electronic device cooperatively project to each other is the window data that each device itself generates and displays on its display screen, and the window data is the data of the overall window corresponding to the display screen.
  • the above-mentioned methods for coordinating screen projection by the first electronic device to the second electronic device and the method for coordinating screen projection by the second electronic device to the first electronic device may be performed separately; or may be performed simultaneously to achieve bidirectional simultaneous screen projection.
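  • Because the two projection directions are independent sessions, they can run one after the other or at the same time; the sketch below illustrates this with two concurrent tasks (all names are illustrative assumptions, not an actual implementation of this application):

    final class BidirectionalProjection {
        static void project(String from, String to) {
            System.out.println(from + " -> " + to + ": projecting display window");
        }

        public static void main(String[] args) throws InterruptedException {
            Thread aToB = new Thread(() -> project("first device", "second device"));
            Thread bToA = new Thread(() -> project("second device", "first device"));
            aToB.start();
            bToA.start();      // both directions are active simultaneously
            aToB.join();
            bToA.join();
        }
    }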
  • the first electronic device and the second electronic device are in the same local area network, and communicate through a short-distance communication technology.
  • the same local area network may be a WiFi local area network, a Bluetooth local area network, or the like.
  • the first electronic device and the second electronic device each establish an inter-device screen collaboration information transmission route in both directions, through which each device can transmit screen projection data to the peer device and have that data displayed on the display screen of the peer device; this realizes two-way screen projection control and solves the problem that two-way collaborative sharing between devices cannot be achieved.
  • both the first electronic device and the second electronic device are provided with a screen coordination switch that controls whether the screen coordination function between devices is activated, and the screen coordination switch can be turned on or off by the user.
  • when the screen collaboration switch is turned on, the electronic device supports the inter-device screen collaboration function, can execute the inter-device screen collaboration method provided by the embodiments of the present application, and can project its screen to other electronic devices or accept screen projection from other electronic devices; when the screen collaboration switch is turned off, the electronic device does not support the inter-device screen collaboration function.
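  • A minimal sketch of such a switch, assuming hypothetical class and method names, is shown below; when the switch is off, both the projection path and the acceptance path described above are simply skipped.

    final class ScreenCollaborationSwitch {
        private volatile boolean enabled;

        void set(boolean on) { enabled = on; }

        boolean mayStartProjection()  { return enabled; }  // gate for initiating projection
        boolean mayAcceptProjection() { return enabled; }  // gate for accepting projection

        public static void main(String[] args) {
            ScreenCollaborationSwitch sw = new ScreenCollaborationSwitch();
            sw.set(true);
            System.out.println("may project: " + sw.mayStartProjection()
                    + ", may accept: " + sw.mayAcceptProjection());
        }
    }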
  • take as an example the case where the first electronic device is the device that initiates inter-device screen collaboration first and the second electronic device is the device that initiates it later; that is, after the first electronic device initiates inter-device screen collaboration and projects its screen to the second electronic device, the second electronic device initiates inter-device screen collaboration and projects its screen to the first electronic device.
  • establishing screen collaboration between the first electronic device and the second electronic device means that the first electronic device projects the screen to the second electronic device on the basis of establishing a communication connection with the second electronic device ;
  • establishing screen coordination with the first electronic device by the second electronic device means that the second electronic device projects a screen to the first electronic device on the basis of establishing a communication connection with the first electronic device.
  • the first electronic device may automatically initiate the inter-device screen collaboration, or initiate the inter-device screen collaboration according to a user's instruction.
  • when detecting that a communication connection can be established with the second electronic device, the first electronic device can directly establish inter-device screen collaboration with the second electronic device and project its own display window to the display screen of the second electronic device for display.
  • a screen projection button is displayed in the display window on the display screen, and the screen projection button is used to trigger the first electronic device to project its own window displayed on the display screen to the display screen of the second electronic device for display.
  • the first electronic device sends a screen projection request to the second electronic device in response to the user's operation of the screen projection button; after the second electronic device approves the request, the first electronic device cooperatively projects its screen to the second electronic device, projecting its own display window A (also referred to as the first window in this application) on the current display screen to the display screen of the second electronic device for display, and the second electronic device simultaneously outputs on its display screen its own display window B and the screen projection window A1 corresponding to the display window of the first electronic device.
  • the window A shown in FIG. 6a is the first electronic device's own display window on its current display screen, the window B (also referred to as the second window in this application) is the second electronic device's own display window on its current display screen, and the window A1 is the screen projection window corresponding to window A of the first electronic device after it is projected to the display screen of the second electronic device.
  • the first electronic device and the second electronic device are two different tablet computers
  • the first electronic device detects that it can establish a communication connection with the second electronic device (tablet computer 2).
  • a screen projection button 601 is displayed in its own display window.
  • the user can click the screen projection button to trigger the first electronic device and the second electronic device to establish inter-device screen collaboration.
  • the first electronic device in response to the user's operation of clicking the screen projection button, sends a screen projection request to the second electronic device, requesting to establish inter-device screen collaboration with the second electronic device.
  • the second electronic device displays a prompt message in its own display window, prompting the user to choose whether to accept the screen projection of the first electronic device, for example, displaying the query message "Do you accept the screen projection?"; if it is determined according to the user operation to accept the screen projection of the first electronic device, the second electronic device sends feedback information of accepting the screen projection to the first electronic device.
  • after the first electronic device receives the feedback information, it sends the data corresponding to its own window to the second electronic device, and the second electronic device displays the window of the first electronic device according to the received data, as shown in FIG. 6d, where window 1 is the window displayed by the second electronic device before accepting the screen projection, and window 2 is the window that the first electronic device projects to the second electronic device, whose display content is the same as that of the window of the first electronic device itself.
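  • A hypothetical sketch of the handshake described above is given below (the types and messages are illustrative assumptions): the initiating device sends a projection request, the accepting device asks its user, and window data is sent only after positive feedback.

    final class ProjectionHandshake {
        interface UserPrompt { boolean ask(String question); }

        static boolean requestProjection(UserPrompt promptOnPeer) {
            // The accepting device shows the query and returns the user's choice as feedback.
            return promptOnPeer.ask("Do you accept the screen projection?");
        }

        public static void main(String[] args) {
            boolean accepted = requestProjection(question -> true); // user taps "accept"
            if (accepted) {
                System.out.println("first device starts sending its window data");
            } else {
                System.out.println("projection is not established");
            }
        }
    }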
  • the display content in the window includes information displayed on the interface corresponding to the window on the display screen.
  • the first window is the window corresponding to the overall display interface of the display screen, and the first content included in the first window includes all the display elements in the overall display interface of the display screen, including the status bar content, application icons, etc. shown in the figure.
  • the first window is the window corresponding to the overall display interface of the display screen, and the first content included in the first window includes the information in the application interface of the SMS application.
  • the first window is the window 1 shown in the figure, and the first content included in the first window includes the information in the application interface of the SMS application shown in window 1 in the figure.
  • the above-mentioned first electronic device can determine the initiation timing of the screen collaboration between the devices according to the actual scene or user settings, which improves the flexibility of the screen collaboration control between the devices.
  • after the first electronic device and the second electronic device establish inter-device screen collaboration, when the display window of the first electronic device is updated, the corresponding screen projection window on the display screen of the second electronic device is updated synchronously.
  • the first electronic device updates the data displayed in the display window, sends the updated data to the second electronic device, and instructs the second electronic device to synchronously update the corresponding screen projection window; the second electronic device displays the received updated data in the screen projection window according to the instruction of the first electronic device. For example, in the window distribution described above, when the user operates the display window A in the first electronic device to trigger a window update, the content of the screen projection window A1 on the display screen of the second electronic device is updated synchronously and remains consistent with the content of the display window A.
  • a change of the content displayed in a window on the display screen of an electronic device constitutes a window update, and an operation that causes such a change constitutes an operation that triggers a window update.
  • the window displayed by the first electronic device is updated from the desktop window shown in FIG. 6b to the window corresponding to the short message interface shown in FIG. 6e.
  • the first electronic device sends the updated short-message window related data to the second electronic device, and the second electronic device synchronously updates, according to the received data, the screen projection window (window 2) of the first electronic device shown in FIG. 6d, obtaining a display in which window 1 is the second electronic device's own window and window 2 is the updated window projected by the first electronic device to the second electronic device.
  • when the user operates the screen projection window in the second electronic device to trigger a window update, the corresponding display window in the first electronic device is updated synchronously.
  • the second electronic device updates the data displayed in the screen projection window, sends the updated data to the first electronic device, and instructs the first electronic device to synchronously update the corresponding display window; the first electronic device displays the received updated data in the corresponding display window according to the instruction of the second electronic device. For example, when the user operates the screen projection window A1 in the second electronic device to trigger a window update, the content of the display window A in the first electronic device is updated synchronously and remains consistent with the content of the screen projection window A1.
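  • The two synchronization rules just described can be summarized in the following sketch (all class and field names are illustrative assumptions): updating the source window pushes new data to the peer's projection window, and operating the projection window sends an update request back to the source device.

    final class WindowSync {
        static class SourceWindow {
            String content;
            ProjectionWindow mirror;
            void update(String newContent) {
                content = newContent;
                if (mirror != null) mirror.onSourceUpdated(newContent); // push to peer
            }
        }
        static class ProjectionWindow {
            String content;
            SourceWindow source;
            void onSourceUpdated(String newContent) { content = newContent; }
            void operate(String newContent) {
                // operating the projection window asks the source device to update too
                if (source != null) source.update(newContent);
            }
        }

        public static void main(String[] args) {
            SourceWindow a = new SourceWindow();
            ProjectionWindow a1 = new ProjectionWindow();
            a.mirror = a1; a1.source = a;
            a.update("new content");     // first device operates window A; A1 follows
            a1.operate("peer content");  // peer operates A1; window A and A1 follow
            System.out.println(a.content + " / " + a1.content);
        }
    }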
  • at this point, the inter-device screen collaboration established by the first electronic device and the second electronic device is unidirectional; that is, after the two devices establish inter-device screen collaboration, the first electronic device projects its display window to the second electronic device for display, but the second electronic device does not project its display window to the first electronic device for display.
  • the second electronic device can directly establish inter-device screen collaboration with the first electronic device and project its own display window to the display screen of the first electronic device for display.
  • the second electronic device outputs, in its own display window on the display screen, a screen projection button for establishing inter-device screen collaboration with the first electronic device, and the screen projection button is used to trigger the second electronic device to project its own window displayed on the display screen to the display screen of the first electronic device for display.
  • the second electronic device outputs a screen projection button in its own display window B, and the user's click on the screen projection button triggers the second electronic device to project its own window B displayed on the display screen to the display screen of the first electronic device for display.
  • the second electronic device sends a screen projection request message to the first electronic device.
  • the second electronic device cooperatively projects its screen to the first electronic device, projecting its own display window B on the current display screen onto the display screen of the first electronic device for display, and the first electronic device simultaneously displays on its display screen its own display window A and the screen projection window B1 (which may also be referred to as the fourth window in this application) of the display window of the second electronic device, as shown in schematic diagram (b) in FIG. 7a.
  • the window B1 is a corresponding screen projection window after the display window B of the second electronic device is projected onto the display screen of the first electronic device.
  • the above-mentioned second electronic device can determine the initiation timing of the screen collaboration between the devices according to the actual scene or user settings, which improves the flexibility of the screen collaboration control between the devices.
  • the second electronic device displays the screen-casting button 701 in its own display window (window 1), as shown in FIG. 7b .
  • the second electronic device sends a screen-casting request to the first electronic device.
  • the first electronic device displays a prompt message in its own display window, prompting the user to choose whether to accept the screen projection of the second electronic device, for example, displaying the query message "Do you accept the screen projection?"; if it is determined according to the user operation to accept the screen projection of the second electronic device, the feedback information of accepting the screen projection is sent to the second electronic device.
  • after receiving the feedback information, the second electronic device sends the data corresponding to its own window to the first electronic device, and the first electronic device displays the window projected by the second electronic device according to the received data, as shown in FIG. 7d.
  • window 1 is the window displayed by the first electronic device before accepting the screen projection
  • window 2 is the window projected by the second electronic device to the first electronic device
  • the display content is the same as the display content of the window of the second electronic device itself.
  • after the second electronic device and the first electronic device establish inter-device screen collaboration, when the display window of the second electronic device is updated, the corresponding screen projection window on the display screen of the first electronic device is updated synchronously.
  • the second electronic device updates the data displayed in the display window, sends the updated data to the first electronic device, and instructs the first electronic device to synchronously update the corresponding screen projection window; the first electronic device displays the received updated data in the screen projection window according to the instruction of the second electronic device. For example, as shown in schematic diagram (b) in FIG. 7a, when the user operates the display window B in the second electronic device to trigger a window update, the content of the screen projection window B1 on the display screen of the first electronic device is updated synchronously and remains consistent with the content of the display window B.
  • the window displayed by the second electronic device is updated from the desktop window shown in FIG. 7b to the window corresponding to the camera interface shown in FIG. 7e.
  • the second electronic device sends the updated camera window related data to the first electronic device, and the first electronic device synchronously updates, according to the received data, the screen projection window (window 2) of the second electronic device shown in FIG. 7d, obtaining the display window shown in FIG. 7f.
  • the window 1 is the window of the first electronic device itself, and the window 2 is the updated window of the window projected by the second electronic device to the first electronic device.
  • when the user operates the screen projection window in the first electronic device to trigger a window update, the corresponding display window in the second electronic device is updated synchronously. The first electronic device updates the data displayed in the screen projection window, sends the updated data to the second electronic device, and instructs the second electronic device to synchronously update the corresponding display window; the second electronic device displays the received updated data in the corresponding display window according to the instruction of the first electronic device. For example, as shown in schematic diagram (b) of FIG. 7a, when the user operates the screen projection window B1 on the display screen of the first electronic device to trigger a window update, the content of the display window B in the second electronic device is updated synchronously and remains consistent with the content of the screen projection window B1.
  • on the basis that the first electronic device cooperatively projects its screen to the second electronic device, the second electronic device also cooperatively projects its screen to the first electronic device, so that the first electronic device and the second electronic device establish two-way inter-device screen collaboration and realize two-way screen projection control.
  • the display window and the screen projection window on the display screen can each be adaptively adjusted to obtain a better visual display effect; for example, the size, display area and other attributes of the display window and the screen projection window can be adjusted.
  • Step 1 The first electronic device generates corresponding operation information after detecting the user's operation of clicking the screen-casting button in the display window of the display screen.
  • for the screen projection button, refer to the screen projection button in window A of FIG. 6a or the screen projection button 601 in FIG. 6b.
  • Step 2 The first electronic device sends a screen projection request message to the second electronic device according to the operation information, and establishes inter-device screen collaboration with the second electronic device.
  • Step 3 The first electronic device performs display stream encoding on the relevant data in its own display window on the display screen.
  • Step 4 The first electronic device sends the encoded display stream data to the second electronic device.
  • Step 5 After the second electronic device decodes the received encoded display stream data, it generates a corresponding screen projection window, and simultaneously displays its own display window and the screen projection window on the display screen.
  • for the display window of the second electronic device, refer to window B shown in FIG. 6a or window 1 shown in FIG. 6d; the screen projection window may refer to window A1 shown in FIG. 6a or window 2 described in FIG. 6d.
  • Step 6 The second electronic device generates corresponding operation information after detecting the operation of the user clicking the screen-casting button in its own display window on the display screen.
  • for the display window, refer to window B in schematic diagram (a) of FIG. 7a or window 1 in FIG. 7b; for the screen projection button, refer to the screen projection button in window B of schematic diagram (a) of FIG. 7a or the screen projection button 701 in window 1 of FIG. 7b.
  • Step 7 The second electronic device sends a screen projection request message to the first electronic device according to the operation information, and establishes inter-device screen collaboration with the first electronic device.
  • Step 8 The second electronic device performs display stream encoding on the relevant data in its own display window on the display screen.
  • Step 9 The second electronic device sends the encoded display stream data to the first electronic device.
  • Step 10 After the first electronic device decodes the received encoded display stream data, it generates a corresponding screen projection window, and simultaneously displays its own display window and the screen projection window on the display screen.
  • the screen projection window may refer to window B1 in schematic diagram (b) of FIG. 7a or window 2 described in FIG. 7d.
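  • A compact sketch of steps 1 to 10 above for one device pair is given below; the classes are illustrative stand-ins under the assumption of a simple string-based display stream, not an actual implementation of this application.

    final class TwoWayCollaboration {
        static byte[] encode(String window)  { return window.getBytes(); }
        static String decode(byte[] stream)  { return new String(stream); }

        public static void main(String[] args) {
            // Steps 1-2: the first device detects the button tap and requests collaboration.
            System.out.println("first device -> second device: projection request");
            // Steps 3-5: encode window A, send it, second device decodes and shows window A1.
            String windowA1 = decode(encode("display window A"));
            System.out.println("second device shows its window B plus " + windowA1);
            // Steps 6-7: the second device's button tap triggers the reverse request.
            System.out.println("second device -> first device: projection request");
            // Steps 8-10: encode window B, send it, first device decodes and shows window B1.
            String windowB1 = decode(encode("display window B"));
            System.out.println("first device shows its window A plus " + windowB1);
        }
    }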
  • the display windows and screen projection windows in the first electronic device and the second electronic device may be displayed in layers, or may be displayed in partitions.
  • taking the case where the first electronic device and the second electronic device are mobile phones and are displayed in layers as an example, the effect after the devices execute the above method is shown in FIG. 9.
  • layer 1 of the first electronic device is used to display its own display window, and layer 2 is used to display the screen projection window after the display window of the second electronic device is projected to the first electronic device; the content of this screen projection window is the same as that of the display window displayed in layer 3 of the second electronic device, and layer 2 is located above layer 1. Layer 3 of the second electronic device is used to display its own display window, and layer 4 is used to display the screen projection window after the display window of the first electronic device is projected to the second electronic device; the content of this screen projection window is the same as that of the display window displayed in layer 1 of the first electronic device, and layer 4 is located above layer 3.
  • window 1 is the display window on the display screen of the first electronic device; window 2 is the screen projection window after the display window of the second electronic device is projected to the first electronic device, and its display content is the same as that of window 3 in the second electronic device; window 3 is the second electronic device's own display window on its display screen; window 4 is the screen projection window after the first electronic device's own display window is projected to the second electronic device, and its display content is the same as that of window 1.
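  • The layered display just described can be sketched as follows (the types are illustrative assumptions): each device keeps its own display window on a lower layer and the peer's projection window on a layer above it, and both layers are composited onto one screen.

    import java.util.ArrayList;
    import java.util.List;

    final class LayeredScreen {
        record Layer(int z, String window) {}
        private final List<Layer> layers = new ArrayList<>();

        void compose(String ownWindow, String projectionWindow) {
            layers.clear();
            layers.add(new Layer(0, ownWindow));          // e.g. layer 1 / layer 3
            layers.add(new Layer(1, projectionWindow));   // e.g. layer 2 / layer 4, drawn on top
            layers.forEach(l -> System.out.println("z=" + l.z() + " -> " + l.window()));
        }

        public static void main(String[] args) {
            new LayeredScreen().compose("window 1 (own)", "window 2 (projected)");
        }
    }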
  • when the first electronic device or the second electronic device displays multiple windows, for example, simultaneously displaying its own window and the window projected by the peer device, the windows to be displayed can be scaled for display, as in the display modes shown in the figures.
  • after the first electronic device and the second electronic device establish two-way inter-device screen collaboration, operations performed on each window in the display screen of the first electronic device or the second electronic device can be synchronized to the corresponding window in the display screen of the peer device; therefore, the user using the first electronic device and the user using the second electronic device can view the information displayed by both parties in real time, which improves the user experience, and since the first electronic device and the second electronic device can support synchronous operations on each other, the efficiency of collaborative operation between devices is improved. For example, in a game scenario, the first electronic device and the second electronic device each project their own display window onto the display screen of the other party by executing the above inter-device screen collaboration method, so that the game scene interfaces of user C and user D can be displayed simultaneously on the display screens of both devices, realizing collaborative two-way sharing; user C and user D can view each other's status information in real time in the screen projection window on the display screen of their own device without switching scenes.
  • a control button for controlling the two-way inter-device screen collaboration can be output on the display screen, so that the user can control the two-way screen collaboration between the first electronic device and the second electronic device.
  • both the first electronic device and the second electronic device can set, in the screen projection window on the display screen, a control-peer switch button; the control-peer switch button is used to control whether, when the electronic device updates the screen projection window, the corresponding display window on the peer device is synchronously controlled so that the display window is updated synchronously.
  • when the control-peer switch button in the first electronic device is turned on, the first electronic device, when updating the screen projection window, simultaneously controls the update of the corresponding display window in the second electronic device; that is, when the user operates the screen projection window on the display screen of the first electronic device to trigger a window update, the corresponding display window in the second electronic device is updated synchronously. When the control-peer switch button in the second electronic device is turned on, the second electronic device, when updating the screen projection window, simultaneously controls the update of the corresponding display window in the first electronic device; when the user operates the screen projection window on the display screen of the second electronic device to trigger a window update, the corresponding display window in the first electronic device is updated synchronously.
  • both the first electronic device and the second electronic device may set, in their own display window on the display screen, an allow-peer-control switch button (in this application, the allow-peer-control switch button displayed by the first electronic device may also be referred to as the first control option, and the allow-peer-control switch button displayed by the second electronic device may also be referred to as the second control option); the allow-peer-control switch button is used to control whether the electronic device's own display window accepts synchronous control by the corresponding screen projection window and is updated synchronously with the screen projection window.
  • when the allow-peer-control switch button in the first electronic device is turned on, the first electronic device allows the user to control the display window of the first electronic device through the screen projection window of the second electronic device; when the user operates the screen projection window on the display screen of the second electronic device to trigger a window update, the corresponding display window in the first electronic device is updated synchronously. When the allow-peer-control switch button in the second electronic device is turned on, the second electronic device allows the user to control the display window of the second electronic device through the screen projection window of the first electronic device; when the user operates the screen projection window on the display screen of the first electronic device to trigger a window update, the corresponding display window in the second electronic device is updated synchronously.
  • the screen projection window B1 of the first electronic device displays the control-peer switch button, and the second electronic device displays the allow-peer-control switch button in its own display window B.
  • when the user turns on the control-peer switch button in the screen projection window B1 by performing a corresponding operation (such as clicking or sliding) on the button, the first electronic device responds to the operation and is set to the control-peer mode; when the user turns the control-peer switch button off, the first electronic device responds to the operation and is set to the non-control-peer mode. When the user turns on the allow-peer-control switch button in the display window B by performing a corresponding operation (such as clicking or sliding) on the button, the second electronic device responds to the operation and is set to the allow-peer-control mode; when the user turns the allow-peer-control switch button off, the second electronic device responds to the operation and is set to the peer-control-not-allowed mode.
  • when the first electronic device is set to the control-peer mode, it sends request information for controlling the peer to the second electronic device; after the second electronic device receives the request information, if it determines that it has been set to the allow-peer-control mode, it returns feedback information that control is allowed to the first electronic device, and if it determines that it is set to the peer-control-not-allowed mode, it returns feedback information that control is not allowed to the first electronic device. After the first electronic device receives the feedback information that control is allowed, it allows the user to control the second electronic device through the screen projection window B1: when the user performs an operation on the screen projection window B1 of the first electronic device to trigger a window update, the first electronic device, while updating the screen projection window B1, sends to the second electronic device a request for synchronously updating the display window B corresponding to the screen projection window B1, so that the second electronic device updates the display window B synchronously.
  • if the first electronic device receives the feedback information that control is not allowed returned by the second electronic device, it does not allow the user to control the second electronic device through the screen projection window B1; when the user performs an operation on the screen projection window B1 of the first electronic device to trigger a window update, the first electronic device only updates the screen projection window B1. When the first electronic device is set to the non-control-peer mode, the user is likewise not allowed to control the second electronic device through the screen projection window B1, and when the user performs an operation on the screen projection window B1 of the first electronic device to trigger a window update, the first electronic device only updates the screen projection window B1.
  • when the second electronic device is set to the allow-peer-control mode, it sends to the first electronic device prompt information that the corresponding display window B is allowed to be controlled through the screen projection window B1; the first electronic device receives the prompt information, feeds it back to the user, and can be set to the control-peer mode or the non-control-peer mode according to the user's corresponding operation on the control-peer switch button in the screen projection window B1. According to the set mode, the first electronic device may determine to update only the screen projection window B1, or to simultaneously control the second electronic device to update the corresponding display window B.
  • when the second electronic device is set to the peer-control-not-allowed mode, it sends to the first electronic device prompt information that the screen projection window B1 is not allowed to control the corresponding display window B; the first electronic device receives the prompt information and feeds it back to the user, and when the user performs an operation on the screen projection window B1 of the first electronic device to trigger a window update, the first electronic device only updates the screen projection window B1.
  • alternatively, when the second electronic device is set to the peer-control-not-allowed mode, it sends to the first electronic device prompt information that control through the screen projection window is not accepted; the first electronic device blocks, according to the prompt information, the user's control of the display window B through the screen projection window B1, and when the user performs an operation on the screen projection window B1 to trigger a window update, the first electronic device only updates the screen projection window B1.
  • alternatively, when the second electronic device is set to the peer-control-not-allowed mode, the second electronic device itself blocks the user's control of the display window B through the screen projection window B1: when the user performs an operation on the screen projection window B1, the first electronic device updates the screen projection window B1 and requests the second electronic device to update the display window B synchronously, but the second electronic device does not respond after receiving the request and keeps the display window B unchanged.
  • the operation and control methods of the control-peer switch button in the screen projection window A1 of the second electronic device and of the allow-peer-control switch button in the display window A of the first electronic device may respectively refer to the operation and control methods, described above, of the control-peer switch button in the screen projection window B1 of the first electronic device and of the allow-peer-control switch button in the display window B of the second electronic device, and are not repeated here.
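  • The switch logic described above can be summarized in the following sketch (the names are assumptions, not part of this application): the projecting side forwards a synchronous-update request only when its control-peer switch is on, and the owning side applies such a request only when its allow-peer-control switch is on.

    final class PeerControl {
        boolean controlPeerSwitch;       // set in the projection window (e.g. B1)
        boolean allowPeerControlSwitch;  // set in the owning device's display window (e.g. B)

        /** Called on the device showing the projection window when the user operates it. */
        void onProjectionWindowOperated(PeerControl peer) {
            System.out.println("projection window updated locally");
            if (controlPeerSwitch) peer.onRemoteUpdateRequest();
        }

        /** Called on the owning device when a synchronous-update request arrives. */
        void onRemoteUpdateRequest() {
            if (allowPeerControlSwitch) System.out.println("display window updated");
            else System.out.println("request ignored; display window unchanged");
        }

        public static void main(String[] args) {
            PeerControl firstDevice = new PeerControl();   // shows projection window B1
            PeerControl secondDevice = new PeerControl();  // owns display window B
            firstDevice.controlPeerSwitch = true;
            secondDevice.allowPeerControlSwitch = false;   // request will be ignored
            firstDevice.onProjectionWindowOperated(secondDevice);
        }
    }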
  • after the first electronic device and the second electronic device establish two-way screen projection, they are each other's peer devices.
  • when the first electronic device or the second electronic device updates the screen projection window projected by the peer device, the screen projection window can be updated according to the corresponding data; if the display window on the peer device corresponding to the screen projection window also needs to be updated at the same time, a request for synchronously updating the display window is sent to the peer device, and the peer device updates its display window according to the request.
  • if the electronic device itself cannot generate the content to be displayed after the screen projection window is updated, it can request the corresponding data of the screen projection window from the peer device and update the screen projection window according to the data returned by the peer device; alternatively, the peer device can be requested to synchronously update the display window corresponding to the screen projection window, so that when the peer device returns the updated data corresponding to the screen projection window according to the request, it also updates its own display window synchronously.
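  • The update strategy just described can be sketched as follows (illustrative interfaces and parameters only): if the device can generate the updated projection content itself it updates locally, otherwise it asks the peer for the data, and a synchronous update of the peer's display window is requested only when needed.

    final class ProjectionUpdatePolicy {
        interface Peer {
            String fetchUpdatedContent();
            void updateDisplayWindow(String content);
        }

        static String updateProjectionWindow(String locallyGenerated, Peer peer, boolean syncPeer) {
            String content = (locallyGenerated != null) ? locallyGenerated
                                                        : peer.fetchUpdatedContent();
            if (syncPeer) peer.updateDisplayWindow(content);
            return content; // new content of the projection window
        }
    }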
  • the above-mentioned method for establishing a two-way inter-device screen collaboration between a first electronic device and a second electronic device may also be applied to a multi-device scenario including more than two devices.
  • the two electronic devices can be respectively used as the first electronic device and the second electronic device described in the above-mentioned embodiments of the present application, and respectively execute the corresponding inter-device screen coordination method, and project their own display window onto the display screen of the other party. show.
  • any electronic device in the multi-device scenario can be used as the first electronic device described in the foregoing embodiments of the present application, and the other electronic devices that can establish communication connections with it can each be used as the second electronic device described in the foregoing embodiments; each of them cooperates with the first electronic device to execute the inter-device screen collaboration method described in the foregoing embodiments and projects its own display window onto the display screen of the other party for display.
  • when a third electronic device is detected, prompt information asking whether to establish screen collaboration with the third electronic device is displayed, and in response to a received instruction to establish screen collaboration with the third electronic device, a new screen projection relationship is established with the third electronic device on the basis of maintaining the two-way screen projection with the second electronic device.
  • for example, based on the example shown in FIG. 7f above, after the first electronic device, namely tablet computer 1, and the second electronic device, namely tablet computer 2, establish two-way screen projection, tablet computer 1 simultaneously displays on its display screen its own window (window 1) and the window projected from tablet computer 2 (window 2); if tablet computer 1 detects that it can establish multi-screen collaboration with a third electronic device such as mobile phone 1, tablet computer 1 displays a prompt message in its own display window asking the user whether the screen needs to be projected to mobile phone 1, for example, displaying the query message "Should the screen be projected to the mobile phone 1?", as shown in FIG. 13a.
  • in response to the user's confirmation, tablet computer 1 acts as the first electronic device and mobile phone 1 acts as the second electronic device, and, without interrupting the two-way screen projection between tablet computer 1 and tablet computer 2, they execute the inter-device screen collaboration method of this application applied to the first electronic device and the second electronic device, so that tablet computer 1 projects its own display window onto the display screen of mobile phone 1 for display.
  • after tablet computer 1 projects its screen to mobile phone 1, it can also accept the screen projection of mobile phone 1 and simultaneously display on its own display screen its own display window, the window projected by tablet computer 2, and the window projected by mobile phone 1. For example, when tablet computer 1, on the basis of the two-way screen projection established with tablet computer 2, also establishes two-way screen projection with mobile phone 1, the display interface on the display screen of tablet computer 1 is as shown in FIG. 13b.
  • Window 1 is the display window of the tablet computer 1 itself
  • window 2 is the window that the tablet computer 2 projects to the tablet computer 1
  • window 3 is the window that the mobile phone 1 projects to the tablet computer 1.
  • when the electronic device detects that a communication connection can be established with multiple other electronic devices, it outputs, in its own display window on the display screen, a collaboration request button for establishing inter-device screen collaboration with other electronic devices, and sets a corresponding expanded list; the expanded list includes the device identifiers of the multiple electronic devices, and the user can select, according to the device identifiers, the electronic devices with which a multi-screen connection needs to be established.
  • for example, when electronic device 1 detects that a communication connection can be established with multiple other electronic devices, it outputs, in its own display window on the display screen, a screen projection button 1401 for projecting the screen to other electronic devices and sets a corresponding expanded list, as shown in schematic diagram (a) in FIG. 14; after the user clicks the screen projection button, electronic device 1 outputs and displays the expanded list, as shown in schematic diagram (b) in FIG. 14; the user selects at least one electronic device from the expanded list, and electronic device 1 establishes inter-device screen collaboration with the at least one electronic device selected by the user in response to the user's operation. For example, as shown in schematic diagram (b) in FIG. 14, the displayed list includes four electronic devices, namely electronic device 2, electronic device 3, electronic device 4 and electronic device 5, and the user selects electronic device 2 and electronic device 3 among them to establish a multi-screen connection. Then, electronic device 1 is used as the first electronic device described in the above embodiments of the present application, and electronic device 2 and electronic device 3 are each used as the second electronic device described in the above embodiments; each of them cooperates with electronic device 1 to execute the method for inter-device screen collaboration described in the above embodiments of the present application, realizing two-way screen projection control between electronic device 1 and either of electronic device 2 and electronic device 3.
  • window 101 is the display window on the display screen of electronic device 1
  • window 201 is the display window on the display screen of electronic device 2.
  • window 301 is the display window on the display screen of electronic device 3;
  • window 102 is the screen projection window after window 201 in electronic device 2 is projected to electronic device 1, and its display content is the same as that of window 201; window 103 is the screen projection window after window 301 in electronic device 3 is projected to electronic device 1, and its display content is the same as that of window 301;
  • the window 202 is the screen projection window after the window 101 in the electronic device 1 is projected to the electronic device 2, and the display content is the same as that of the window 101;
  • the window 302 is the screen projection window after the window 101 in the electronic device 1 is projected to the electronic device 3 , and the displayed content is the same as that of the window 101 .
  • the electronic device 2 and the electronic device 3 can also be used as the first electronic device and the second electronic device described in the foregoing embodiments of the present application, respectively, and cooperate to execute the inter-device screen collaboration method described in the foregoing embodiments, so as to realize two-way screen projection control between electronic device 2 and electronic device 3. Furthermore, two-way screen projection control between any two electronic devices among electronic device 1, electronic device 2 and electronic device 3 can be realized.
  • FIG. 16 is a schematic diagram of a two-way screen projection control effect of the above inter-device screen collaboration method. As shown in the figure, based on the two-way screen projection control effect shown in FIG. 14, it further includes: window 203 is the screen projection window after window 301 in electronic device 3 is projected to electronic device 2, and its display content is the same as that of window 301.
  • the above-mentioned embodiments are applied in a multi-device scenario, and can perform inter-device screen collaboration among multiple devices.
  • the display window of one device can be projected to multiple other devices, and the device can also accept the screen projection of multiple other devices; therefore, in multi-device scenarios, the application objects of inter-device screen collaboration can be flexibly adjusted according to actual needs, ensuring efficient screen collaboration and improving the practicability of inter-device screen collaboration.
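  • For the multi-device case just described, a minimal sketch (hypothetical types and device names) is to keep one independent projection entry per peer, so a device can show its own window alongside several projected windows at once:

    import java.util.LinkedHashMap;
    import java.util.Map;

    final class MultiDeviceCollaboration {
        private final Map<String, String> projectionWindows = new LinkedHashMap<>();

        void acceptProjection(String peerId, String peerWindowContent) {
            projectionWindows.put(peerId, peerWindowContent);   // one window per peer device
        }

        void render(String ownWindow) {
            System.out.println("own window: " + ownWindow);
            projectionWindows.forEach((peer, w) -> System.out.println("from " + peer + ": " + w));
        }

        public static void main(String[] args) {
            MultiDeviceCollaboration tablet1 = new MultiDeviceCollaboration();
            tablet1.acceptProjection("tablet 2", "window 2");
            tablet1.acceptProjection("mobile phone 1", "window 3");
            tablet1.render("window 1");
        }
    }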
  • the embodiments of the present application provide a method for inter-device screen collaboration, which is applied to a system composed of a first electronic device and a second electronic device. As shown in FIG. 17 , the method includes the following steps:
  • Step S1701 The first electronic device establishes a communication connection with the second electronic device.
  • Step S1702 The first electronic device displays a first window, and the first window includes the first content.
  • Step S1703 The second electronic device displays a second window, and the second window includes the second content.
  • Step S1704 The first electronic device sends first screen projection data to the second electronic device, where the first screen projection data includes information of the first content.
  • Step S1705 The second electronic device receives the first screen projection data, displays the second window and a third window, and the third window includes the first content.
  • Step S1706 The second electronic device sends second screen projection data to the first electronic device, where the second screen projection data includes information of the second content.
  • Step S1707 The first electronic device receives the second screen projection data, and displays the first window and the fourth window, where the fourth window includes the second content.
  • step S1706 may be executed after step S1704 and step S1705, or may be executed simultaneously with step S1704.
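  • Steps S1701 to S1707 can be condensed into the following sketch (all classes are illustrative stand-ins under the assumption of string-valued window content): each device sends screen projection data carrying its own content and displays an additional window for the peer's content.

    final class InterDeviceScreenCollaboration {
        record ScreenProjectionData(String content) {}

        public static void main(String[] args) {
            String firstContent = "first content";    // shown in the first window
            String secondContent = "second content";  // shown in the second window

            // S1704/S1705: first device sends its content; second device shows a third window.
            ScreenProjectionData first = new ScreenProjectionData(firstContent);
            System.out.println("second device: second window + third window(" + first.content() + ")");

            // S1706/S1707: second device sends its content; first device shows a fourth window.
            ScreenProjectionData second = new ScreenProjectionData(secondContent);
            System.out.println("first device: first window + fourth window(" + second.content() + ")");
        }
    }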
  • an embodiment of the present application further provides an electronic device, and the electronic device may be a first electronic device or a second electronic device.
  • the electronic device is used to implement the method for inter-device screen collaboration provided by the embodiment of the present application.
  • the electronic device 1800 may include: a display screen 1801, one or more processors 1802, a memory 1803, and one or more computer programs (not shown in the figure).
  • the various devices described above may be coupled through one or more communication buses 1804 .
  • the display screen 1801 is used to display related user interfaces such as images, videos, and application interfaces.
  • one or more computer programs are stored in the memory 1803, and the one or more computer programs include instructions; the processor 1802 invokes the instructions stored in the memory 1803, so that the electronic device 1800 executes the inter-device screen collaboration method provided by the embodiments of the present application.
  • the methods provided by the embodiments of the present application are introduced from the perspective of an electronic device as an execution subject.
  • the electronic device may include a hardware structure and/or software modules, and implement the above functions in the form of a hardware structure, a software module, or a hardware structure plus a software module. Whether one of the above functions is performed in the form of a hardware structure, a software module, or a hardware structure plus a software module depends on the specific application and design constraints of the technical solution.
  • the steps of the methods described in the embodiments of the present application may be embodied directly in hardware, in a software unit executed by a processor, or in a combination of the two. A software unit may be stored in random access memory (RAM), flash memory, read-only memory (ROM), EPROM memory, EEPROM memory, registers, a hard disk, a removable disk, a CD-ROM, or any other form of storage medium known in the art.
  • a storage medium may be coupled to the processor such that the processor may read information from, and store information in, the storage medium.
  • the storage medium can also be integrated into the processor.
  • the processor and the storage medium may be provided in an ASIC.
  • the above functions described in the embodiments of the present application may be implemented in hardware, software, firmware, or any combination of the three. If implemented in software, the functions may be stored on, or transmitted over, a computer-readable medium in the form of one or more instructions or code.
  • Computer-readable media includes computer storage media and communication media that facilitate the transfer of computer programs from one place to another. Storage media can be any available media that a general-purpose or special-purpose computer can access.
  • Such computer-readable media may include, but are not limited to, RAM, ROM, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other device that can be used to carry or store instructions or data structures and Other media in the form of program code that can be read by a general-purpose or special-purpose computer, or a general-purpose or special-purpose processor.
  • any connection is properly termed a computer-readable medium; for example, if software is transmitted from a website, server or other remote source over a coaxial cable, fiber optic cable, twisted pair or digital subscriber line (DSL), or by wireless means such as infrared, radio and microwave, these are also included in the definition of computer-readable media.
  • disks and discs as used herein include compact discs, laser discs, optical discs, digital versatile discs (DVDs), floppy disks and Blu-ray discs, where disks usually reproduce data magnetically, while discs reproduce data optically with lasers. Combinations of the above may also be included in computer-readable media.

Abstract

The present application provides a method and device for inter-device screen collaboration, applied to a system composed of a first electronic device and a second electronic device. The method includes: the first electronic device establishes a communication connection with the second electronic device; the first electronic device displays a first window including first content; the second electronic device displays a second window including second content; the first electronic device sends to the second electronic device first screen projection data including information of the first content; the second electronic device receives the first screen projection data and displays the second window and a third window including the first content; the second electronic device sends to the first electronic device second screen projection data including information of the second content; and the first electronic device receives the second screen projection data and displays the first window and a fourth window including the second content. In the solution of the present application, the first electronic device and the second electronic device can project their screens to each other at the same time, so two-way screen collaboration between devices can be realized, thereby improving the efficiency of inter-device screen collaboration.

Description

一种设备间屏幕协同方法及设备
相关申请的交叉引用
本申请要求在2020年10月30日提交中国专利局、申请号为202011190260.1、申请名称为“一种设备间屏幕协同方法及设备”的中国专利申请的优先权,其全部内容通过引用结合在本申请中。
技术领域
本申请涉及电子设备技术领域,尤其涉及一种设备间屏幕协同方法及设备。
背景技术
随着科技的进步,电子设备的功能越来越强大,设备多窗口模式和设备间协同模式也逐步推广。目前,用户可以在电子设备中同时开启多个操作窗口,也可以把窗口(应用)共享到其他设备上显示。例如,当前设备间协同模式支持设备A(如手机)协同投屏到设备B上,用户对设备A进行的操作可以同步显示在设备B上,同时用户在设备B上对设备A进行的操作,也能同步反馈到设备A上。但是,当前设备间协同模式存在主从关系,例如,设备A投屏到设备B后,设备B作为从设备,不能再投屏到设备A,导致使用体验较低。
因此,当前跨设备共享时,存在只能支持设备间单向共享,不支持双向共享的问题,导致设备间协同的效率较低。
发明内容
第一方面,本申请实施例提供一种设备间屏幕协同方法,应用于第一电子设备与第二电子设备组成的系统,该方法包括:所述第一电子设备与所述第二电子设备建立通信连接;所述第一电子设备显示第一窗口,所述第一窗口中包括第一内容;所述第二电子设备显示第二窗口,所述第二窗口中包括第二内容;所述第一电子设备向所述第二电子设备发送第一投屏数据,所述第一投屏数据包括所述第一内容的信息;所述第二电子设备接收所述第一投屏数据,显示所述第二窗口和第三窗口,所述第三窗口中包括所述第一内容;所述第二电子设备向所述第一电子设备发送第二投屏数据,所述第二投屏数据包括所述第二内容的信息;所述第一电子设备接收所述第二投屏数据,显示所述第一窗口和第四窗口,所述第四窗口中包括所述第二内容。
在该方法中,第一电子设备显示第一窗口、第二电子设备显示第二窗口时,第一电子设备将第一窗口对应的第一投屏数据发送给第二电子设备,第二电子设备能根据所述第一投屏数据显示第一窗口的投屏窗口即第三窗口,同时显示第二电子设备自身的第二窗口,实现第一电子设备投屏到第二电子设备,同时,第一电子设备接收所述第二电子设备发送的第二投屏数据,根据所述第二投屏数据,显示所述第二电子设备的第二窗口的投屏窗口,同时显示第一电子设备自身的第一窗口,能够实现第二电子设备投屏到第一电子设备。因此,上述方法能实现第一电子设备与第二电子设备之间的双向投屏,两个电子设备均能同 时显示自身和对端设备的窗口,提高了电子设备间的屏幕协同效率,同时能提高用户体验。
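The bidirectional exchange described above can be pictured with a minimal Kotlin sketch (purely illustrative and not part of the original disclosure): each device keeps its own window, exports projection data describing that window's content, and renders the projection data received from the peer in an additional window. All class, field, and function names below are assumptions made for this example.

```kotlin
// Illustrative only: names and shapes are assumptions, not APIs defined by this application.
data class WindowContent(val windowId: String, val frame: ByteArray)

data class ProjectionData(val sourceDevice: String, val content: WindowContent)

class Device(val name: String) {
    var localWindow: WindowContent? = null          // first/second window (own content)
    var projectedWindow: WindowContent? = null      // third/fourth window (peer content)

    // Build the projection data describing the local window (first/second projection data).
    fun exportProjection(): ProjectionData =
        ProjectionData(name, requireNotNull(localWindow) { "no local window to project" })

    // Receive the peer's projection data and show it next to the local window.
    fun importProjection(data: ProjectionData) {
        projectedWindow = data.content
        println("$name now shows its own window and a projected window from ${data.sourceDevice}")
    }
}

fun main() {
    val first = Device("first-device").apply { localWindow = WindowContent("w1", byteArrayOf(1)) }
    val second = Device("second-device").apply { localWindow = WindowContent("w2", byteArrayOf(2)) }
    // Bidirectional projection: both directions can run at the same time.
    second.importProjection(first.exportProjection())
    first.importProjection(second.exportProjection())
}
```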
在一种可能的设计中,所述第一电子设备向所述第二电子设备发送第一投屏数据之前,所述方法还包括:所述第一电子设备响应于接收到的第一操作,向所述第二电子设备发送第一投屏请求,所述第一投屏请求用于请求将所述第一内容投屏到所述第二电子设备;所述第二电子设备显示所述第二窗口和第三窗口之前,所述方法还包括:所述第二电子设备显示第一提示信息,所述第一提示信息用于提示用户是否接受所述第一电子设备将所述第一内容投屏到所述第二电子设备。
在该方法中,第一电子设备根据用户操作,请求投屏到第二电子设备,第二电子设备接收该请求后显示对应的提示信息,从而根据用户操作确定是否接受投屏。因此,第一电子设备和第二电子设备能够基于用户需求进行设备间屏幕协同控制,用户体验较好。
在一种可能的设计中,所述第二电子设备显示所述第二窗口和第三窗口之后,所述方法还包括:所述第一电子设备响应于接收到的第二操作,更新所述第一窗口中的第一内容;或者,所述第二电子设备响应于接收到的第三操作,向所述第一电子设备发送第一更新请求,所述第一更新请求用于请求所述第一电子设备更新所述第一窗口中的第一内容;所述第一电子设备响应于接收到的所述第一更新请求,更新所述第一窗口中的第一内容。
在该方法中,用户在第一电子设备侧或第二电子设备侧均能控制第一电子设备自身的第一窗口中内容的更新,大大提高了设备间屏幕协同过程中设备控制的灵活性和便捷性。同时能够支持接受投屏的第二电子设备对发起投屏的第一电子设备的显示窗口的更新进行控制。
在一种可能的设计中,所述第一电子设备更新所述第一窗口中的第一内容时,所述方法还包括:所述第一电子设备向所述第二电子设备发送更新的第一投屏数据;所述第二电子设备根据接收到的所述更新的第一投屏数据,更新所述第三窗口中的第一内容。
在该方法中,第一电子设备更新自身的第一窗口的显示内容时,将更新的内容指示给第二电子设备,第二电子设备就能同步更新第一窗口对应的投屏窗口即第三窗口,因此,用户能在对第一电子设备进行控制的同时,对第二电子设备显示的对应窗口进行控制,保证两个电子设备显示内容的一致性。
在一种可能的设计中,所述第一电子设备显示所述第一窗口和第四窗口之后,所述方法还包括:所述第一电子设备显示第一控制选项;所述第一电子设备响应于接收到的所述第一更新请求,更新所述第一窗口中的第一内容之前,所述方法还包括:所述第一电子设备接收作用于所述第一控制选项的第四操作。
在该方法中,第二电子设备投屏到第一电子设备之后,用户能够设置第一电子设备是否接受第二电子设备的控制,提高了设备间屏幕协同控制的灵活性。
在一种可能的设计中,所述第二电子设备向所述第一电子设备发送第二投屏数据之前,所述方法还包括:所述第二电子设备响应于接收到的第五操作,向所述第一电子设备发送第二投屏请求,所述第二投屏请求用于请求将所述第二内容投屏到所述第一电子设备;所述第一电子设备显示所述第一窗口和第四窗口之前,所述方法还包括:所述第一电子设备显示第二提示信息,所述第二提示信息用于提示用户是否接受所述第二电子设备将所述第二内容投屏到所述第一电子设备。
在该方法中,第二电子设备根据用户操作,请求投屏到第一电子设备,第一电子设备接收该请求后显示对应的提示信息,从而根据用户操作确定是否接受投屏。因此,第一电 子设备和第二电子设备能够基于用户需求进行设备间屏幕协同控制,用户体验较好。
在一种可能的设计中,所述第一电子设备显示所述第一窗口和第四窗口之后,所述方法还包括:所述第二电子设备响应于接收到的第六操作,更新所述第二窗口中的第二内容;或者,所述第一电子设备响应于接收到的第七操作,向所述第二电子设备发送第二更新请求,所述第二更新请求用于请求所述第二电子设备更新所述第二窗口中的第二内容;所述第二电子设备响应于接收到的所述第二更新请求,更新所述第二窗口中的第二内容。
在该方法中,用户在第二电子设备侧或第一电子设备侧均能控制第二电子设备自身的第二窗口中内容的更新,大大提高了设备间屏幕协同过程中设备控制的灵活性和便捷性。同时能够支持接受投屏的第一电子设备对发起投屏的第二电子设备的显示窗口的更新进行控制。
在一种可能的设计中,所述第二电子设备更新所述第二窗口中的第二内容时,所述方法还包括:所述第二电子设备向所述第一电子设备发送更新的第二投屏数据;所述第一电子设备根据接收到的所述更新的第二投屏数据,更新所述第四窗口中的第二内容。
在该方法中,第二电子设备更新自身的第二窗口的显示内容时,将更新的内容指示给第一电子设备,第一电子设备就能同步更新第二窗口对应的投屏窗口即第四窗口,因此,用户能在对第二电子设备进行控制的同时,对第一电子设备显示的对应窗口进行控制,保证两个电子设备显示内容的一致性。
在一种可能的设计中,所述第二电子设备显示所述第二窗口和第三窗口之后,所述方法还包括:所述第二电子设备显示第二控制选项;所述第二电子设备响应于接收到的所述第二更新请求,更新所述第二窗口中的第二内容之前,所述方法还包括:所述第二电子设备接收作用于所述第二控制选项的第八操作。
在该方法中,第一电子设备投屏到第二电子设备之后,用户能够设置第二电子设备是否接受第一电子设备的控制,提高了设备间屏幕协同控制的灵活性。
在一种可能的设计中,所述第一窗口与所述第四窗口的显示区域不同,或者,所述第一窗口的显示区域覆盖所述第四窗口的显示区域中的部分区域,或者,所述第四窗口的显示区域覆盖所述第一窗口的显示区域中的部分区域。
在该方法中,第一电子设备同时显示自身的第一窗口和第二电子设备投屏来的第四窗口时,可以采用多种不同的显示方式,提高了设备间屏幕协同显示的灵活性,并能适应多种应用场景的显示需求。
在一种可能的设计中,所述第二窗口与所述第三窗口的显示区域不同,或者,所述第二窗口的显示区域覆盖所述第三窗口的显示区域中的部分区域,或者,所述第三窗口的显示区域覆盖所述第二窗口的显示区域中的部分区域。
在该方法中,第二电子设备同时显示自身的第二窗口和第一电子设备投屏来的第三窗口时,可以采用多种不同的显示方式,提高了设备间屏幕协同显示的灵活性,并能适应多种应用场景的显示需求。
在一种可能的设计中,所述系统还包括第三电子设备,所述第一电子设备显示所述第一窗口和第四窗口之后,所述方法还包括:所述第一电子设备与所述第三电子设备建立通信连接;所述第三电子设备显示第五窗口,所述第五窗口中包括第三内容;所述第一电子设备向所述第三电子设备发送所述第一投屏数据;所述第三电子设备接收所述第一投屏数据,显示所述第五窗口和第六窗口,所述第六窗口中包括所述第一内容;所述第三电子设 备向所述第一电子设备发送第三投屏数据,所述第三投屏数据包括所述第三内容的信息;所述第一电子设备接收所述第三投屏数据,显示所述第一窗口和第七窗口,所述第七窗口中包括所述第三内容。
在该方法中,第一电子设备与第二电子设备建立双向投屏之后,还能同时与第三电子设备建立双向投屏,因此,第一电子设备能够对多个电子设备进行投屏控制,并同时接受多个电子设备的投屏控制,进而实现多设备场景下的设备间屏幕协同。
第二方面,本申请实施例提供一种设备间屏幕协同方法,应用于第一电子设备,该方法包括:与第二电子设备建立通信连接;显示第一窗口,所述第一窗口中包括第一内容;向所述第二电子设备发送第一投屏数据,所述第一投屏数据包括所述第一内容的信息;接收来自所述第二电子设备的第二投屏数据,所述第二投屏数据包括第二内容的信息,所述第二内容为所述第二电子设备显示的第二窗口中包括的内容;显示所述第一窗口和第四窗口,所述第四窗口中包括所述第二内容。
在一种可能的设计中,在向所述第二电子设备发送第一投屏数据之前,所述方法还包括:响应于接收到的第一操作,向所述第二电子设备发送第一投屏请求,所述第一投屏请求用于请求将所述第一内容投屏到所述第二电子设备。
在一种可能的设计中,在接收来自所述第二电子设备的第二投屏数据之前,所述方法还包括:显示第二提示信息,所述第二提示信息用于提示用户是否接受所述第二电子设备将所述第二内容投屏到所述第一电子设备。
在一种可能的设计中,在向所述第二电子设备发送第一投屏数据之后,所述方法还包括:响应于接收到的第二操作,更新所述第一窗口中的第一内容;或者,响应于接收到的来自所述第二电子设备的第一更新请求,更新所述第一窗口中的第一内容,其中,所述第一更新请求用于请求所述第一电子设备更新所述第一窗口中的第一内容。
在一种可能的设计中,在更新所述第一窗口中的第一内容时,所述方法还包括:向所述第二电子设备发送更新的第一投屏数据。
在一种可能的设计中,在显示所述第一窗口和第四窗口之后,所述方法还包括:显示第一控制选项;在响应于来自所述第二电子设备的所述第一更新请求,更新所述第一窗口中的第一内容之前,所述方法还包括:接收作用于所述第一控制选项的第四操作。
在一种可能的设计中,在显示所述第一窗口和第四窗口之后,所述方法还包括:响应于接收到的第七操作,向所述第二电子设备发送第二更新请求,所述第二更新请求用于请求所述第二电子设备更新所述第二窗口中的第二内容。
在一种可能的设计中,在向所述第二电子设备发送第二更新请求之后,所述方法还包括:响应于接收到的来自所述第二电子设备的更新的第二投屏数据,更新所述第四窗口中的第二内容。
在一种可能的设计中,所述第一窗口与所述第四窗口的显示区域不同,或者,所述第一窗口的显示区域覆盖所述第四窗口的显示区域中的部分区域,或者,所述第四窗口的显示区域覆盖所述第一窗口的显示区域中的部分区域。
在一种可能的设计中,在显示所述第一窗口和第四窗口之后,所述方法还包括:与第三电子设备建立通信连接;向所述第三电子设备发送所述第一投屏数据;接收来自所述第三电子设备的第三投屏数据,所述第三投屏数据包括第三内容的信息,所述第三内容为所述第三电子设备显示的第五窗口中包括的内容;显示所述第一窗口和第七窗口,所述第七 窗口中包括所述第三内容。
第三方面,本申请实施例提供一种电子设备,所述电子设备包括显示屏幕,存储器和一个或多个处理器;其中,所述存储器用于存储计算机程序代码,所述计算机程序代码包括计算机指令;当所述指令被所述一个或多个处理器调用执行时,使得所述电子设备能够执行上述第二方面或第二方面的任一可能的设计所描述的方法。
第四方面,本申请实施例提供一种芯片,所述芯片与电子设备中的存储器耦合,使得所述芯片在运行时调用所述存储器中存储的计算机程序,实现本申请实施例第一方面或第一方面提供的任一可能设计的方法,或者实现本申请实施例第二方面或第二方面提供的任一可能设计的方法。
第五方面,本申请实施例提供一种计算机存储介质,该计算机存储介质存储有计算机程序,当所述计算机程序在电子设备上运行时,使得电子设备执行上述第一方面或第一方面的任一种可能的设计的方法,或者实现本申请实施例第二方面或第二方面提供的任一可能设计的方法。
第六方面,本申请实施例提供一种计算机程序产品,当所述计算机程序产品在电子设备上运行时,使得所述电子设备执行第一方面或第一方面的任一种可能的设计的方法,或者实现本申请实施例第二方面或第二方面提供的任一可能设计的方法。
附图说明
图1为本申请实施例提供的一种跨设备协同投屏的示意图;
图2为本申请实施例提供的一种设备间屏幕协同系统架构的示意图;
图3为本申请实施例提供的一种电子设备的结构示意图;
图4为本申请实施例提供的一种电子设备的安卓操作系统结构示意图;
图5为本申请实施例提供的一种设备间屏幕协同方法的示意图;
图6a为本申请实施例提供的一种建立设备间屏幕协同方法的示意图;
图6b为本申请实施例提供的一种第一电子设备的显示屏幕示意图;
图6c为本申请实施例提供的一种第二电子设备的显示屏幕示意图;
图6d为本申请实施例提供的一种第二电子设备接受投屏的显示屏幕示意图;
图6e为本申请实施例提供的一种第一电子设备更新的显示屏幕示意图;
图6f为本申请实施例提供的一种第二电子设备接受投屏后更新的显示屏幕示意图;
图7a为本申请实施例提供的一种建立双向设备间屏幕协同方法的示意图;
图7b为本申请实施例提供的一种第二电子设备接受投屏的显示屏幕示意图;
图7c为本申请实施例提供的一种第一电子设备的显示屏幕示意图;
图7d为本申请实施例提供的一种第一电子设备接受投屏的显示屏幕示意图;
图7e为本申请实施例提供的一种第二电子设备更新的显示屏幕示意图;
图7f为本申请实施例提供的一种第一电子设备接受投屏后更新的显示屏幕示意图;
图8为本申请实施例提供的一种双向设备间屏幕协同控制的流程示意图;
图9为本申请实施例提供的一种双向设备间屏幕协同方法的效果示意图;
图10为本申请实施例提供的一种双向设备间屏幕协同方法的效果示意图;
图11为本申请实施例提供的一种双向设备间屏幕协同方法的效果示意图;
图12为本申请实施例提供的一种双向设备间屏幕协同控制方法的示意图;
图13a为本申请实施例提供的一种第一电子设备接受投屏的显示屏幕示意图;
图13b为本申请实施例提供的一种第一电子设备接受多个电子设备投屏的显示屏幕示意图;
图14为本申请实施例提供的一种设备间屏幕协同控制方法的示意图;
图15为本申请实施例提供的一种设备间屏幕协同控制的示意图;
图16为本申请实施例提供的一种设备间屏幕协同控制的示意图;
图17为本申请实施例提供的一种设备间屏幕协同方法示意图;
图18为本申请实施例提供的一种电子设备的示意图。
具体实施方式
为了使本申请实施例的目的、技术方案和优点更加清楚,下面将结合附图对本申请实施例作进一步地详细描述。其中,在本申请实施例的描述中,以下,术语“第一”、“第二”仅用于描述目的,而不能理解为指示或暗示相对重要性或者隐含指明所指示的技术特征的数量。由此,限定有“第一”、“第二”的特征可以明示或者隐含地包括一个或者更多个该特征。
为了便于理解,示例性的给出了与本申请相关概念的说明以供参考,如下所示:
1)电子设备,为具有显示屏幕的设备。本申请一些实施例中电子设备可以是便携式设备,诸如手机、平板电脑、具备无线通讯功能的可穿戴设备(例如,手表、手环、头盔、耳机等)、车载终端设备、增强现实(augmented reality,AR)/虚拟现实(virtual reality,VR)设备、笔记本电脑、超级移动个人计算机(ultra-mobile personal computer,UMPC)、上网本、个人数字助理(personal digital assistant,PDA)、智能家居设备(例如,智能电视等)等智能终端设备。
在本申请一些实施例中，电子设备还可以是还包含其它功能诸如个人数字助理和/或画面显示功能的便携式终端设备。便携式终端设备的示例性实施例包括但不限于搭载如图PCTCN2021125218-appb-000001、图PCTCN2021125218-appb-000002所示操作系统或者其它操作系统的便携式终端设备。上述便携式终端设备也可以是其它便携式终端设备，诸如具有显示屏幕的膝上型计算机（Laptop）等。还应当理解的是，在本申请其它一些实施例中，上述电子设备也可以不是便携式终端设备，而是具有显示屏幕的台式计算机。
2)屏幕协同,又称多屏协同,投屏,同屏、飞屏、屏幕共享,是指设备a(如手机、平板、笔记本、电脑等)的屏幕a中输出显示的画面实时地显示到设备b(如平板、笔记本、电脑、电视、一体机、投影仪等)的屏幕b的设定区域。同时通过操作设备a引起的屏幕a的画面变化会同步显示到屏幕b的设定区域中。而通过操作设备b的屏幕b的设定区域,引起屏幕b的所述设定区域的画面变化,也会同步显示到屏幕a的画面中。
3)中间件,是介于操作系统和应用程序之间、用于连接操作系统的软件组件和用户的应用软件的一类软件,是位于平台(硬件和操作系统)和应用之间的通用服务。中间件使用系统软件所提供的基础服务(功能),衔接网络上应用系统的各个部分或不同的应用,能够达到资源共享、功能共享的目的。中间件是独立的系统级软件服务程序,支持分布式计算,能够提供跨网络、跨硬件的透明性的应用或服务的交互功能。
应理解,本申请实施例中“至少一个”是指一个或者多个,“多个”是指两个或两个以上。“和/或”,描述关联对象的关联关系,表示可以存在三种关系,例如,A和/或B,可以表示: 单独存在A,同时存在A和B,单独存在B的情况,其中A、B可以是单数或者复数。字符“/”一般表示前后关联对象是一种“或”的关系。“以下至少一(项)个”或其类似表达,是指的这些项中的任意组合,包括单项(个)或复数项(个)的任意组合。例如,a、b或c中的至少一项(个),可以表示:a,b,c,a和b,a和c,b和c,或a、b和c,其中a、b、c可以是单个,也可以是多个。
参考图1,为本申请实施例提供的一种跨设备协同投屏的示意图。如图1所示,目前,多设备协同中支持设备A(例如手机)协同投屏到设备B(例如平板电脑)上,从而在设备B的显示屏幕上输出显示设备A的整个显示屏幕对应的显示窗口。设备A将显示屏幕对应的显示窗口投屏到设备B后,设备A和设备B均可以对该显示窗口进行控制,且设备A上对显示窗口的控制操作及结果可以在设备B显示屏幕上对应的显示窗口中同步显示,设备B上对显示窗口的控制操作及结果可以在设备A显示屏幕上对应的显示窗口中同步显示。
但是,上述方法中,设备A的显示窗口投屏到设备B后,设备B自身的显示窗口无法同时投屏到设备A上进行显示,因此,设备A与设备B之间无法同时进行双向协同控制,无法满足用户对双向投屏、双向协同的业务需求。
鉴于此,本申请实施例提供一种设备间屏幕协同方法,应用于跨设备投屏控制的场景中。
图2为本申请实施例提供的一种设备间屏幕协同系统架构的示意图。如图2所示,该系统架构可以包括:第一电子设备201(例如图中所示的手机)和第二电子设备202(例如图中所示的平板电脑)。其中,第二电子设备202可以是与第一电子设备201连接的至少一个电子设备中的任一电子设备;第一电子设备201和第二电子设备202可分别将自身的显示屏幕上的显示窗口投屏到对方的显示屏幕上进行显示,实现双向投屏协作,并可支持相互操作控制。其中,所述显示屏幕上的显示窗口为所述显示屏幕对应的整体窗口,或者为所述显示屏幕上显示的某一应用对应的局部窗口。
在该系统中,所述第一电子设备201与第二电子设备202之间能够进行通信。可选的,第一电子设备201与第二电子设备202接入同一个局域网。第一电子设备201与第二电子设备202可以通过蓝牙、Wi-Fi等短距离无线通信技术进行通信。可选的,第一电子设备201与第二电子设备202以有线方式连接并进行通信。第一电子设备201与第二电子设备202可以通过数据线如通用串行总线(Universal Serial Bus,USB)数据线连接后进行通信。
在第一电子设备201与第二电子设备202接入在同一个局域网的示例中,具体可以为:第一电子设备201和第二电子设备202与同一无线接入点建立无线连接。
另外,第一电子设备201与第二电子设备202可以接入同一个无线保真(Wireless Fidelity,Wi-Fi)热点,再例如,第一电子设备201和第二电子设备202也可以通过蓝牙协议接入同一个蓝牙信标下。再例如,第一电子设备201和第二电子设备202间也可以通过近场通信(Near Field Communication,NFC)标签触发通信连接,通过蓝牙模块传输加密信息进行身份认证。在认证成功后通过点对点(Point to Point,P2P)的方式进行数据传输。
在本申请一些实施例中,第一电子设备201与第二电子设备202为具备输出显示功能的智能设备,例如可以为手机、平板、电脑、智能电视等。
需要说明的是,图2所示的系统并不对本申请提供的设备间屏幕协同方法适用的场景进行限定。
还需要说明的是,本申请也不限定设备间屏幕协同系统中电子设备的数量,其可以包括两个电子设备,也可以包括更多个电子设备,例如可以包括3个或4个电子设备等。
在本申请一些实施例中,设备间屏幕协同系统中包括多个电子设备时,其中任意两个电子设备之间可执行本申请提供的设备间屏幕协同方法,实现双向投屏及协同控制。例如,上述设备间屏幕协同系统中包括设备A、设备B、设备C共3个电子设备时,设备间屏幕协同可以是设备A和设备B均投屏到设备C,设备C分别投屏到设备A和设备B;或者,设备间屏幕协同可以是设备A分别投屏到设备B和设备C,设备B分别投屏到设备A和设备C,设备C分别投屏到设备A和设备B;或者,设备间屏幕协同可以是设备A投屏到设备B,设备B投屏到设备C,设备C投屏到设备A等,此处不再一一列举。
参考图3,对本申请实施例提供的方法适用的电子设备的结构进行介绍。
如图3所示,电子设备300可以包括处理器310,外部存储器接口320,内部存储器321,USB接口330,充电管理模块340,电源管理模块341,电池342,天线1,天线2,移动通信模块350,无线通信模块360,音频模块370,扬声器370A,受话器370B,麦克风370C,耳机接口370D,传感器模块380,按键390,马达391,指示器392,摄像头393,显示屏幕394,以及SIM卡接口395等。其中传感器模块380可以包括陀螺仪传感器、加速度传感器、接近光传感器、指纹传感器、触摸传感器、温度传感器、压力传感器、距离传感器、磁传感器、环境光传感器、气压传感器、骨传导传感器等。
可以理解的是,图3所示的电子设备仅仅是一个范例,并不构成对电子设备的限定,并且电子设备可以具有比图中所示出的更多的或者更少的部件,可以组合两个或更多的部件,或者可以具有不同的部件配置。图3中所示出的各种部件可以在包括一个或多个信号处理和/或专用集成电路在内的硬件、软件、或硬件和软件的组合中实现。
处理器310可以包括一个或多个处理单元,例如:处理器310可以包括应用处理器(application processor,AP),调制解调处理器,图形处理器(graphics processing unit,GPU),图像信号处理器(image signal processor,ISP),控制器,存储器,视频编解码器,数字信号处理器(digital signal processor,DSP),基带处理器,和/或神经网络处理器(Neural-network Processing Unit,NPU)等。其中,不同的处理单元可以是独立的器件,也可以集成在一个或多个处理器中。其中,控制器可以是电子设备300的神经中枢和指挥中心。控制器可以根据指令操作码和时序信号,产生操作控制信号,完成取指令和执行指令的控制。
处理器310中还可以设置存储器,用于存储指令和数据。在一些实施例中,处理器310中的存储器为高速缓冲存储器。该存储器可以保存处理器310刚用过或循环使用的指令或数据。如果处理器310需要再次使用该指令或数据,可从所述存储器中直接调用。避免了重复存取,减少了处理器310的等待时间,因而提高了系统的效率。
本申请实施例提供的设备间屏幕协同方法的执行可以由处理器310来控制或调用其他部件来完成,比如调用内部存储器321中存储的本申请实施例的处理程序,来控制无线通信模块360向其他电子设备进行数据通信,以实现设备间屏幕协同,提高电子设备的协作控制效率,提升用户的体验。处理器310可以包括不同的器件,比如集成CPU和GPU时,CPU和GPU可以配合执行本申请实施例提供的设备间屏幕协同方法,比如设备间屏幕协同方法中部分算法由CPU执行,另一部分算法由GPU执行,以得到较快的处理效率。
显示屏幕394用于显示图像,视频等。显示屏幕394包括显示面板。显示面板可以采用液晶显示屏(liquid crystal display,LCD),有机发光二极管(organic light-emitting diode, OLED),有源矩阵有机发光二极体或主动矩阵有机发光二极体(active-matrix organic light emitting diode的,AMOLED),柔性发光二极管(flex light-emitting diode,FLED),Miniled,MicroLed,Micro-oLed,量子点发光二极管(quantum dot light emitting diodes,QLED)等。在一些实施例中,电子设备300可以包括1个或N个显示屏幕394,N为大于1的正整数。显示屏幕394可用于显示由用户输入的信息或提供给用户的信息以及各种图形用户界面(graphical user interface,GUI)。例如,显示屏幕394可以显示照片、视频、网页、或者文件等。再例如,显示屏幕394可以显示如图2所示的电子设备的图形用户界面。例如,如图2所示的电子设备的图形用户界面上包括状态栏、Dock栏、可隐藏的导航栏、时间和天气小组件(widget)、以及应用的图标,例如浏览器图标等。状态栏中包括运营商名称(例如中国移动)、移动网络(例如4G)、时间和剩余电量。此外,可以理解的是,在一些实施例中,状态栏中还可以包括蓝牙图标、Wi-Fi图标、外接设备图标等。还可以理解的是,在另一些实施例中,图2所示的电子设备的图形用户界面中还可以包括Dock栏,Dock栏中可以包括常用的应用图标等。当处理器310检测到用户的手指(或触控笔等)针对某一应用图标的触摸事件后,响应于该触摸事件,打开与该应用图标对应的应用的用户界面,并在显示屏幕394上显示该应用的用户界面。
在本申请实施例中,显示屏幕394可以是一个一体的柔性显示屏,也可以采用两个刚性屏以及位于两个刚性屏之间的一个柔性屏组成的拼接显示屏。当处理器310运行本申请实施例提供的设备间屏幕协同方法后,处理器310可以控制显示屏幕394对相关结果进行显示。
摄像头393(前置摄像头或者后置摄像头,或者一个摄像头既可作为前置摄像头,也可作为后置摄像头)用于捕获静态图像或视频。通常,摄像头393可以包括感光元件比如镜头组和图像传感器,其中,镜头组包括多个透镜(凸透镜或凹透镜),用于采集待拍摄物体反射的光信号,并将采集的光信号传递给图像传感器。图像传感器根据所述光信号生成待拍摄物体的原始图像。
内部存储器321可以用于存储计算机可执行程序代码,所述可执行程序代码包括指令。处理器310通过运行存储在内部存储器321的指令,从而执行电子设备300的各种功能应用以及数据处理。内部存储器321可以包括存储程序区和存储数据区。其中,存储程序区可存储操作系统,应用程序(比如设备间屏幕协同功能等)的代码等。存储数据区可存储电子设备300使用过程中所创建的数据(比如执行本申请实施例提供的设备间屏幕协同功能产生的过程数据等)等。
内部存储器321还可以存储本申请实施例提供的设备间屏幕协同算法对应的一个或多个计算机程序。该一个或多个计算机程序被存储在上述内部存储器321中并被配置为被一个或多个处理器310执行,该一个或多个计算机程序包括指令,上述指令可以用于执行以下实施例中的各个步骤。
此外,内部存储器321可以包括高速随机存取存储器,还可以包括非易失性存储器,例如至少一个磁盘存储器件,闪存器件,通用闪存存储器(universal flash storage,UFS)等。
当然,本申请实施例提供的设备间屏幕协同算法的代码还可以存储在外部存储器中。这种情况下,处理器310可以通过外部存储器接口320运行存储在外部存储器中的设备间屏幕协同算法的代码。
传感器模块380可以包括陀螺仪传感器、加速度传感器、接近光传感器、指纹传感器、 触摸传感器等。
触摸传感器,也称“触控面板”。触摸传感器可以设置于显示屏幕394,由触摸传感器与显示屏幕394组成显示屏幕,也称“显示屏”。触摸传感器用于检测作用于其上或附近的触摸操作。触摸传感器可以将检测到的触摸操作传递给应用处理器,以确定触摸事件类型。可以通过显示屏幕394提供与触摸操作相关的视觉输出。在另一些实施例中,触摸传感器也可以设置于电子设备300的表面,与显示屏幕394所处的位置不同。
示例性的,电子设备300的显示屏幕394显示主界面,主界面中包括多个应用(比如相机应用、微信应用等)的图标。用户通过触摸传感器点击主界面中相机应用的图标,触发处理器310启动相机应用,打开摄像头393。显示屏幕394显示相机应用的界面,例如取景界面。
电子设备300的无线通信功能可以通过天线1,天线2,移动通信模块350,无线通信模块360,调制解调处理器以及基带处理器等实现。
天线1和天线2用于发射和接收电磁波信号。电子设备300中的每个天线可用于覆盖单个或多个通信频带。不同的天线还可以复用,以提高天线的利用率。例如:可以将天线1复用为无线局域网的分集天线。在另外一些实施例中,天线可以和调谐开关结合使用。
移动通信模块350可以提供应用在电子设备300上的包括2G/3G/4G/5G等无线通信的解决方案。移动通信模块350可以包括至少一个滤波器,开关,功率放大器,低噪声放大器(low noise amplifier,LNA)等。移动通信模块350可以由天线1接收电磁波,并对接收的电磁波进行滤波,放大等处理,传送至调制解调处理器进行解调。移动通信模块350还可以对经调制解调处理器调制后的信号放大,经天线1转为电磁波辐射出去。在一些实施例中,移动通信模块350的至少部分功能模块可以被设置于处理器310中。在一些实施例中,移动通信模块350的至少部分功能模块可以与处理器310的至少部分模块被设置在同一个器件中。在本申请实施例中,移动通信模块350还可以用于与其它电子设备进行信息交互,例如,向其它电子设备发送投屏或更新投屏窗口的指令,或者移动通信模块350可用于接收其它电子设备发送的投屏或更新投屏窗口的指令。
调制解调处理器可以包括调制器和解调器。其中,调制器用于将待发送的低频基带信号调制成中高频信号。解调器用于将接收的电磁波信号解调为低频基带信号。随后解调器将解调得到的低频基带信号传送至基带处理器处理。低频基带信号经基带处理器处理后,被传递给应用处理器。应用处理器通过音频装置(不限于扬声器370A,受话器370B等)输出声音信号,或通过显示屏幕394显示图像或视频。在一些实施例中,调制解调处理器可以是独立的器件。在另一些实施例中,调制解调处理器可以独立于处理器310,与移动通信模块350或其他功能模块设置在同一个器件中。
无线通信模块360可以提供应用在电子设备300上的包括无线局域网(wireless local area networks,WLAN)(如无线保真(wireless fidelity,Wi-Fi)网络),蓝牙(bluetooth,BT),全球导航卫星系统(global navigation satellite system,GNSS),调频(frequency modulation,FM),近距离无线通信技术(near field communication,NFC),红外技术(infrared,IR)等无线通信的解决方案。无线通信模块360可以是集成至少一个通信处理模块的一个或多个器件。无线通信模块360经由天线2接收电磁波,将电磁波信号调频以及滤波处理,将处理后的信号发送到处理器310。无线通信模块360还可以从处理器310接收待发送的信号,对其进行调频,放大,经天线2转为电磁波辐射出去。本申请实施例中,无线通信模块360, 用于与其它电子设备建立连接,进行数据交互。或者无线通信模块360可以用于接入接入点设备,向其它电子设备发送投屏或更新投屏窗口的指令,或者接收其它电子设备发送的投屏或更新投屏窗口的指令。
示例性的,如图2中所示的第一电子设备和第二电子设备可以通过移动通信模块350或无线通信模块360进行与投屏相关的指令和数据的接收或发送,从而实现设备间屏幕协同功能。
另外,电子设备300可以通过音频模块370,扬声器370A,受话器370B,麦克风370C,耳机接口370D,以及应用处理器等实现音频功能。例如音乐播放,录音等。电子设备300可以接收按键390输入,产生与电子设备300的用户设置以及功能控制有关的键信号输入。电子设备300可以利用马达391产生振动提示(比如来电振动提示)。电子设备300中的指示器392可以是指示灯,可以用于指示充电状态,电量变化,也可以用于指示消息,未接来电,通知等。电子设备300中的SIM卡接口395用于连接SIM卡。SIM卡可以通过插入SIM卡接口395,或从SIM卡接口395拔出,实现和电子设备300的接触和分离。
应理解,在实际应用中,电子设备300可以包括比图3所示的更多或更少的部件,本申请实施例不作限定。图示电子设备300仅是一个范例,并且电子设备300可以具有比图中所示出的更多的或者更少的部件,可以组合两个或更多的部件,或者可以具有不同的部件配置。图中所示出的各种部件可以在包括一个或多个信号处理和/或专用集成电路在内的硬件、软件、或硬件和软件的组合中实现。
电子设备300的软件系统可以采用分层架构,事件驱动架构,微核架构,微服务架构,或云架构。本发明实施例以分层架构的Android系统为例,示例性说明电子设备的软件结构。参考图4,为本发明实施例的电子设备的软件结构框图。示例性的,图4是一种可以运行在上述第一电子设备或第二电子设备中的软件架构示意图。分层架构将软件分成若干个层,每一层都有清晰的角色和分工。层与层之间通过软件接口通信。如图4所示,该软件架构可以分为五层,分别为应用程序层,应用程序框架层,安卓运行时和系统库,硬件抽象层和Linux内核层。
应用程序层是操作系统的最上一层,包括操作系统的原生应用程序,例如电子邮件客户端、蓝牙、相机、音乐、视频、短信、通话、日历、浏览器、联系人等。本申请实施例涉及的APP,简称应用,为能够实现某项或多项特定功能的软件程序。通常,终端设备中可以安装多个应用。比如,相机应用、邮箱应用、智能家居控制应用等。下文中提到的应用,可以是终端设备出厂时已安装的系统应用,也可以是用户在使用终端设备的过程中从网络下载或从其他终端设备获取的第三方应用。
当然,对于开发者来说,开发者可以编写应用程序并安装到该层。应用程序框架层可以包括窗口管理器,内容提供器,视图系统,电话管理器,资源管理器,通知管理器等。其中,窗口管理器用于管理窗口程序。窗口管理器可以获取显示屏大小,判断是否有状态栏,锁定屏幕,截取屏幕等。
在本申请一些实施例中,应用程序层可以包括界面设置服务,用于实现设置界面的呈现,上述设置界面可以用于用户设置终端设备的设备间屏幕协同功能。例如,用户可以在设置界面中进行设备间屏幕协同功能的开启或关闭设置。示例性的,上述设置界面可以为终端设备的触摸屏上所显示的状态栏或通知栏中的内容,还可以是终端设备的触摸屏上所显示的设备控制功能的相关控制界面。
一种可能的实现方式中,应用程序可以使用Java语言开发,通过调用应用程序框架层所提供的应用程序编程接口(Application Programming Interface,API)来完成,开发者可以通过应用程序框架来与操作系统的底层(例如硬件抽象层、内核层等)进行交互,开发自己的应用程序。该应用程序框架主要是操作系统的一系列的服务和管理系统。
应用程序框架层为应用程序层的应用程序提供应用编程接口(application programming interface,API)和编程框架。应用程序框架层可以包括一些预先定义的函数。如图4所示,应用程序框架层可以包括窗口管理器,内容提供器,视图系统,电话管理器,资源管理器,通知管理器等。
窗口管理器用于管理窗口程序。窗口管理器可以获取显示屏大小,判断是否有状态栏,锁定屏幕,截取屏幕等。内容提供器用来存放和获取数据,并使这些数据可以被应用程序访问。视图系统可用于构建应用程序。显示界面可以由一个或多个视图组成的。电话管理器用于提供终端设备的通信功能。通知管理器使应用程序可以在状态栏中显示通知信息,可以用于传达告知类型的消息,可以短暂停留后自动消失,无需用户交互。
本申请实施例中,应用程序框架层还可以包含设备间屏幕协同服务,用于进行设备间屏幕协同功能的控制及实现,设备间屏幕协同服务可以包括多窗口框架服务,主要用于配合设备间屏幕协同服务提供设备控制及窗口的显示控制功能。例如可以用于对当前连接的电子设备进行管理,对在显示屏幕上进行显示的窗口进行管理等。
在本申请一些实施例中,设备间屏幕协同服务还可以包括通知管理器,用于与其它数据层进行信息交互。
安卓运行时包括核心库和虚拟机。安卓运行时负责安卓系统的调度和管理。安卓系统的核心库包含两部分:一部分是Java语言需要调用的功能函数,另一部分是安卓系统的核心库。应用程序层和应用程序框架层运行在虚拟机中。以Java举例,虚拟机将应用程序层和应用程序框架层的Java文件执行为二进制文件。虚拟机用于执行对象生命周期的管理,堆栈管理,线程管理,安全和异常的管理,以及垃圾回收等功能。
系统库可以包括多个功能模块。例如:表面管理器,媒体库,三维图形处理库(例如:OpenGL ES),二维图形引擎(例如:SGL)等。表面管理器用于对显示子系统进行管理,并且为多个应用程序提供了二维和三维图层的融合。媒体库支持多种常用的音频,视频格式回放和录制,以及静态图像文件等。媒体库可以支持多种音视频编码格式,例如:MPEG4,H.264,MP3,AAC,AMR,JPG,PNG等。三维图形处理库用于实现三维图形绘图,图像渲染,合成,和图层处理等。二维图形引擎是二维绘图的绘图引擎。
硬件抽象层(Hardware Abstraction Layer,HAL)是应用程序框架的支撑,是连接应用程序框架层与Linux内核层的重要纽带,其可通过应用程序框架层为开发者提供服务。
内核(Kernel)层提供操作系统的核心系统服务,如安全性、内存管理、进程管理、网络协议栈和驱动模型等都基于内核层实现。内核层同时也作为硬件和软件栈之间的抽象层。该层有许多与电子设备相关的驱动程序,主要的驱动有:显示驱动;基于Linux的帧缓冲驱动;作为输入设备的键盘驱动;基于内存技术设备的Flash驱动;照相机驱动;音频驱动;蓝牙驱动;WI-FI驱动等。
在本申请一些实施例中,核心层作为硬件和软件栈之间的抽象层,包括触摸驱动服务,用于获取硬件部分(例如触摸屏、触摸传感器等)接收的与触发窗口更新相关的操作信息,并进行上报。
需要说明的是,在图2所示的设备间屏幕协同系统中,第一电子设备和第二电子设备均可以通过以上硬件架构和软件架构实现。
下面参考图5所示的设备间屏幕协同方法示意图,结合图3及图4所示的电子设备的硬件结构和软件结构,对本申请实施例提供的设备间屏幕协同方法进行说明。
如图5所示,本申请实施例中,通过中间件与软总线结构结合上述的软件结构,实现两个电子设备(第一电子设备与第二电子设备)之间的双向投屏控制。
其中,中间件连接操作系统的应用程序层和应用程序框架层,为操作系统提供应用的接口标准化、协议统一化及屏蔽具体操作细节等服务。
软总线位于链接层,是封装了操作系统对进程间通信资源、共享内存等有多个进程共同使用的资源的操作的模块,能为任务进程提供标准的资源申请、使用及回收接口,任务进程使用该接口及协议的标志进行资源的共享。
本申请实施例中,发起设备间屏幕协同的电子设备通过应用程序层将投屏信息分别发送到应用程序框架层和中间件,其中,投屏信息包括待投屏显示的投屏数据、投屏对象等信息。中间件对投屏数据进行显示流编码后,将已编码数据传入链接层,由软总线发送到接受投屏的电子设备。接受投屏的电子设备接收发起投屏的电子设备发送的已编码数据后,对已编码数据进行显示流解码得到投屏数据,并上报到应用程序框架层,应用框架层根据投屏数据进行多窗口的显示。
具体的,如图5所示,第一电子设备与第二电子设备建立通信连接后,第一电子设备投屏到第二电子设备时,通过应用程序层将待投屏到第二电子设备进行显示的投屏数据发送到中间件,中间件对投屏数据进行显示流编码后发送到链接层,链接层将编码后的投屏数据发送到第二电子设备。
作为一种可选的实施方式,链接层利用软总线,通过短距离通信方式例如蓝牙、Wi-Fi等发送到第二电子设备的链接层。
第二电子设备的链接层接收第一电子设备的链接层发送的已编码的投屏数据,并发送到中间件,中间件对已编码的投屏数据进行显示流解码,并发送到应用程序框架层,应用程序框架层对投屏数据进行显示。其中,第二电子设备的应用程序框架层对第一电子设备发送的投屏数据进行显示时,通过应用程序层将当前显示屏幕上显示的自身的窗口数据发送到应用程序框架层,应用程序框架层将该窗口数据与链接层发送的投屏数据同时输出在显示屏幕上进行显示。
第二电子设备协同投屏到第一电子设备时,通过应用程序层将待投屏到第一电子设备进行显示的投屏数据发送到中间件,中间件对投屏数据进行显示流编码后发送到链接层,链接层将编码后的投屏数据发送到第一电子设备。
作为一种可选的实施方式,链接层利用软总线,通过短距离通信方式发送到第一电子设备的链接层。
第一电子设备的链接层接收第二电子设备的链接层发送的以编码的投屏数据,并发送到中间件,中间件对已编码的投屏数据进行显示流解码,并发送到应用程序框架层,应用程序框架层对投屏数据进行显示。其中,第一电子设备的应用程序框架层对第二电子设备发送的投屏数据进行显示时,通过应用程序层将当前显示屏幕显示的窗口数据发送到应用程序框架层,应用程序框架层将该窗口数据与链接层发送的投屏数据同时输出在显示屏幕上进行显示。
上述第一电子设备与第二电子设备协同投屏到对方的投屏数据为其显示屏幕中显示的自身生成的窗口数据,该窗口数据为所述显示屏幕对应的整体窗口的数据。
上述第一电子设备协同投屏到第二电子设备和第二电子设备协同投屏到第一电子设备的方法可分别单独执行;也可同时执行,以实现双向同时投屏。
本申请实施例中,作为一种可选的实施方式,第一电子设备和第二电子设备处于同一局域网中,并通过短距离通信技术进行通信。其中,同一局域网可以为WiFi局域网、蓝牙局域网等。
上述实施例中,第一电子设备和第二电子设备分别建立双向设备间屏幕协同的信息传输路线,能够分别将投屏数据传输至对端设备,进而在对端设备的显示屏幕上显示所述投屏数据,实现双向投屏控制,解决了设备间无法实现双向协同共享的问题。
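A hedged Kotlin sketch of the layered pipeline described above (application layer → middleware display-stream encoding → link layer/soft bus → peer's middleware decoding → application framework display). The interfaces and the pass-through "codec" below are assumptions for illustration only; a real implementation would use a genuine display-stream encoder and a short-range transport such as Wi-Fi or Bluetooth.

```kotlin
// Illustrative layering; interface names and the trivial codec are assumptions.
interface Middleware {
    fun encode(raw: ByteArray): ByteArray
    fun decode(encoded: ByteArray): ByteArray
}

interface LinkLayer {                 // stands in for the soft bus over Wi-Fi/Bluetooth
    fun send(peer: String, payload: ByteArray)
}

class PassThroughMiddleware : Middleware {       // placeholder for a display-stream codec
    override fun encode(raw: ByteArray) = raw.copyOf()
    override fun decode(encoded: ByteArray) = encoded.copyOf()
}

class LoopbackLink(private val deliver: (String, ByteArray) -> Unit) : LinkLayer {
    override fun send(peer: String, payload: ByteArray) = deliver(peer, payload)
}

class ProjectionEndpoint(val name: String, private val mw: Middleware, private val link: LinkLayer) {
    // Application layer -> middleware (encode) -> link layer (send to the peer).
    fun project(peer: String, windowFrame: ByteArray) = link.send(peer, mw.encode(windowFrame))

    // Link layer -> middleware (decode) -> application framework layer (display).
    fun onReceive(encoded: ByteArray) {
        val frame = mw.decode(encoded)
        println("$name: framework layer displays the local window plus a ${frame.size}-byte projected window")
    }
}
```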
下面结合具体实施例对本申请实施例提供的设备间屏幕协同方法进行详细说明。
在本申请一些实施例中,第一电子设备与第二电子设备中均设置控制设备间屏幕协同功能是否启动的屏幕协同开关,该屏幕协同开关可由用户控制其打开或关闭状态。其中,屏幕协同开关打开时,电子设备支持设备间屏幕协同功能,能够执行本申请实施例提供的设备间屏幕协同方法,投屏到其它电子设备或接受其它电子设备的投屏;屏幕协同开关关闭时,电子设备不支持设备间屏幕协同功能。
本申请实施例中,以第一电子设备为先发起设备间屏幕协同的设备、第二电子设备为后发起设备间屏幕协同的设备为例进行说明,即,第一电子设备发起设备间屏幕协同并投屏到第二电子设备后,第二电子设备再发起设备间屏幕协同并投屏到第一电子设备。
需要说明的是,本申请实施例中,第一电子设备与第二电子设备建立屏幕协同指的是第一电子设备在与第二电子设备建立通信连接的基础上,投屏到第二电子设备;第二电子设备在与第一电子设备建立屏幕协同指的是第二电子设备在与第一电子设备建立通信连接的基础上,投屏到第一电子设备。
在本申请一些实施例中,第一电子设备可以自动发起设备间屏幕协同,或者,根据用户的指示发起设备间屏幕协同。
作为一种可选的实施方式,第一电子设备可在检测到能与第二电子设备建立通信连接时,直接与第二电子设备建立设备间屏幕协同,将自身的显示窗口投屏到第二电子设备的显示屏幕上进行显示。
作为另一种可选的实施方式,第一电子设备在检测到能与第二电子设备建立通信连接时,在显示屏幕上显示窗口中显示投屏按钮,该投屏按钮用于触发第一电子设备将显示屏幕显示的自身的窗口投屏到第二电子设备的显示屏幕上进行显示。示例性的,如图6a所示,第一电子设备响应于用户对该投屏按钮的操作,向第二电子设备发送投屏请求,第二电子设备通过请求后,第一电子设备协同投屏到第二电子设备,将当前显示屏幕上显示的自身的显示窗口A(在本申请中也称为第一窗口)投屏到第二电子设备的显示屏幕上进行显示,第二电子设备在显示屏幕上同时输出显示自身的显示窗口B和第一电子设备的显示窗口对应的投屏窗口A1(在本申请中也称为第三窗口)。其中,图6a中所示的窗口A为第一电子设备当前显示屏幕上自身的显示窗口,窗口B为(在本申请中也称为第二窗口)第二电子设备当前显示屏幕上自身的显示窗口,窗口A1为第一电子设备中窗口A投屏到第二电子设备的显示屏幕后对应的投屏窗口。
示例性的,第一电子设备和第二电子设备为两个不同平板电脑时,若第一电子设备当 前显示窗口为桌面对应的窗口,第一电子设备(平板电脑1)检测到能够与第二电子设备(平板电脑2)建立通信连接时,在自身的显示窗口显示投屏按钮601,如图6b所示,用户可以通过点击该投屏按钮,触发第一电子设备与第二电子设备建立设备间屏幕协同,第一电子设备响应于用户点击该投屏按钮的操作,向第二电子设备发送投屏请求,请求与第二电子设备建立设备间屏幕协同。如图6c所示,第二电子设备接收所述投屏请求后,在自身显示窗口中显示提示信息,提示用户选择是否接受第一电子设备的投屏,例如显示询问信息“是否接受投屏?”,若根据用户操作确定接受第一电子设备的投屏,则向第一电子设备发送接受投屏的反馈信息。第一电子设备接收到所述反馈信息后,将自身窗口对应的数据发送到第二电子设备,第二电子设备根据接收到的数据显示第一电子设备显示的窗口,如图6d所示,其中,窗口1为第二电子设备接受投屏之前显示的自身的窗口,窗口2为第一电子设备投屏到第二电子设备的窗口,显示内容与第一电子设备窗口的显示内容相同。
本申请实施例中,窗口中的显示内容包括显示屏幕上所述窗口对应的界面所展示的信息。例如,如图6b所示的平板电脑1为第一电子设备时,第一窗口为显示屏幕整体显示界面对应的窗口,第一窗口中包括的第一内容则包括显示屏幕整体显示界面中所有的显示元素,包括图中所示的状态栏内容、应用图标等。又例如,如图6e所示的平板电脑1为第一电子设备时,第一窗口为显示屏幕整体显示界面对应的窗口,第一窗口中包括的第一内容则包括短信应用的应用界面中的信息。再例如,如图7f所示的平板电脑1为第一电子设备时,第一窗口为图中所示的窗口1,第一窗口中包括的第一内容则包括图中窗口1所示的短信应用的应用界面中的信息。
上述第一电子设备可根据实际场景或用户设置确定设备间屏幕协同的发起时机,提高了设备间屏幕协同控制的灵活性。
上述第一电子设备与第二电子设备建立设备间屏幕协同后,用户对第一电子设备自身的显示窗口进行操作触发窗口更新时,第二电子设备显示屏幕上对应的投屏窗口同步更新。具体的,第一电子设备接收到用户触发显示窗口更新的操作后,对显示窗口显示的数据进行更新,及,将更新的数据发送到第二电子设备并指示第二电子设备同步更新对应的投屏窗口,第二电子设备根据第一电子设备的指示,将接收到的更新的数据在投屏窗口进行显示。例如,如图6a所示的窗口分布,用户对第一电子设备中显示窗口A进行操作触发窗口更新时,第二电子设备的显示屏幕上投屏窗口A1的内容同步更新,与所述显示窗口A的内容保持一致。
本申请实施例中,电子设备显示屏幕上的窗口中显示的内容发生变化的情况都属于窗口更新,能引起该窗口显示内容发生变化的操作都属于触发窗口更新的操作。
示例性的,基于上述图6a至图6d所示的示例,第一电子设备投屏到第二电子设备之后,若用户点击第一电子设备桌面窗口中的“信息”应用图标打开短信界面,则第一电子设备显示的窗口从图6b所示的桌面窗口更新为如图6e所示的短信界面对应的窗口。第一电子设备将更新后显示的短信窗口相关数据发送到第二电子设备,第二电子设备根据接收的数据同步更新图6d所示的第一电子设备的投屏窗口(窗口2),得到如图6f所示的显示窗口,其中,窗口1为第二电子设备自身的窗口,窗口2为第一电子设备投屏到第二电子设备的窗口更新后的窗口。
上述第一电子设备与第二电子设备建立设备间屏幕协同后,用户对第二电子设备显示屏幕上的投屏窗口进行操作触发窗口更新时,第一电子设备中对应的显示窗口同步更新。 具体的,第二电子设备接收到用户触发投屏窗口更新的操作后,对投屏窗口显示的数据进行更新,及,将更新的数据发送到第一电子设备并指示第一电子设备同步更新对应的显示窗口,第一电子设备根据第二电子设备的指示,将接收到的更新的数据在对应的显示窗口进行显示。例如,如图6a所示的窗口分布,用户对第二电子设备显示屏幕上投屏窗口A1进行操作触发窗口更新时,第一电子设备中显示窗口A的内容同步更新,与所述投屏窗口A1的内容保持一致。
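The synchronization rule in the two preceding paragraphs can be summarized in the following illustrative Kotlin sketch, under the assumption that updates are propagated simply by re-sending the window content: an operation on the source device's own window updates it locally and refreshes the projected copy on the peer, while an operation on the projected copy is forwarded to the source device, which updates its own window and pushes the refresh back. The class and method names are assumptions.

```kotlin
// Illustrative synchronization model; names are assumptions.
class SharedWindowPair(private val ownerDevice: String, private val viewerDevice: String) {
    var ownerContent: String = ""      // content of the owner's own window (e.g. window A)
        private set
    var projectedContent: String = ""  // content of the projected window on the peer (e.g. window A1)
        private set

    // Case 1: the user operates on the owner's own window.
    fun updateOnOwner(newContent: String) {
        ownerContent = newContent
        projectedContent = newContent          // owner pushes the updated projection data
        println("$ownerDevice updated its window; $viewerDevice refreshed the projected window")
    }

    // Case 2: the user operates on the projected window shown on the viewer device.
    fun updateFromViewer(newContent: String) {
        // the viewer sends an update request; the owner applies it and pushes the refresh back
        updateOnOwner(newContent)
    }
}
```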
上述第一电子设备与第二电子设备建立的设备间屏幕协同是单向的,即第一电子设备与第二电子设备建立设备间屏幕协同后,第一电子设备将显示窗口投屏到第二电子设备进行显示,但第二电子设备不会将显示窗口投屏到第一电子设备进行显示。
在本申请一些实施例中,作为一种可选的实施方式,在上述第一电子设备与第二电子设备建立设备间屏幕协同的基础上,第二电子设备直接与第一电子设备建立设备间屏幕协同,将自身的显示窗口投屏到第一电子设备的显示屏幕上进行显示。
作为另一种可选的实施方式,在上述第一电子设备与第二电子设备建立设备间屏幕协同,并将显示窗口投屏到第二电子设备的显示屏幕上进行显示后,第二电子设备在显示屏幕上自身的显示窗口中输出显示与第一电子设备建立设备间屏幕协同的投屏按钮,该投屏按钮用于触发第二电子设备将显示屏幕显示的自身的窗口投屏到第一电子设备的显示屏幕上进行显示。示例性的,如图7a中的示意图(a)所示,第一电子设备协同投屏到第二电子设备之后,第二电子设备在自身的显示窗口B输出显示投屏按钮,用户通过点击该投屏按钮,触发第二电子设备将显示屏幕显示的自身的窗口B投屏到第一电子设备的显示屏幕上进行显示。第二电子设备响应于用户对该投屏按钮的操作,向第一电子设备发送投屏请求消息,第一电子设备通过请求后,第二电子设备协同投屏到第一电子设备,将当前显示屏幕中自身的显示窗口B投屏到第一电子设备的显示屏幕上进行显示,第一电子设备在显示屏幕上同时显示自身的显示窗口A和第二电子设备的显示窗口投屏的投屏窗口B1(在本申请中也可以被称为第四窗口),如图7a中示意图(b)所示。其中,窗口B1为第二电子设备中自身的显示窗口B投屏到第一电子设备的显示屏幕后对应的投屏窗口。
上述第二电子设备可根据实际场景或用户设置确定设备间屏幕协同的发起时机,提高了设备间屏幕协同控制的灵活性。
示例性的,基于上述图6f所示的示例,第一电子设备投屏到第二电子设备之后,第二电子设备在自身显示窗口(窗口1)中显示投屏按钮701,如图7b所示。第二电子设备响应于用户点击该投屏按钮的操作,向第一电子设备发送投屏请求。如图7c所示,第一电子设备接收所述投屏请求后,在自身显示窗口中显示提示信息,提示用户选择是否接受第二电子设备的投屏,例如显示询问信息“是否接受投屏?”,若根据用户操作确定接受第二电子设备的投屏,则向第二电子设备发送接受投屏的反馈信息。第二电子设备接收到所述反馈信息后,将自身窗口对应的数据发送到第一电子设备,第一电子设备根据接收到的数据显示第二电子设备投屏的窗口,如图7d所示。其中,窗口1为第一电子设备接受投屏之前显示的自身的窗口,窗口2为第二电子设备投屏到第一电子设备的窗口,显示内容与第二电子设备自身的窗口的显示内容相同。
上述第二电子设备与第一电子设备建立设备间屏幕协同后,用户对第二电子设备自身的显示窗口进行操作触发窗口更新时,第一电子设备显示屏幕上对应的投屏窗口同步更新。具体的,第二电子设备接收到用户触发显示窗口更新的操作后,对显示窗口显示的数据进 行更新,及,将更新的数据发送到第一电子设备并指示第一电子设备同步更新对应的投屏窗口,第一电子设备根据第二电子设备的指示,将接收到的更新的数据在投屏窗口进行显示。例如,如图7a中示意图(b)所示的窗口分布,用户对第二电子设备中显示窗口B进行操作触发窗口更新时,第一电子设备的显示屏幕上投屏窗口B1的内容同步更新,与所述显示窗口B的内容保持一致。
示例性的,基于上述图7a至图7d所示的示例,第二电子设备投屏到第一电子设备之后,若用户点击桌面中的“相机”应用图标打开拍照界面,则第二电子设备显示的窗口从图7b所示的桌面窗口更新为如图7e所示的拍照界面对应的窗口。第二电子设备将更新后显示的相机窗口相关数据发送到第一电子设备,第一电子设备根据接收的数据同步更新图7d所示的第二电子设备的投屏窗口(窗口2),得到如图7f所示的显示窗口。其中,窗口1为第一电子设备自身的窗口,窗口2为第二电子设备投屏到第一电子设备的窗口更新后的窗口。
上述第二电子设备与第一电子设备建立设备间屏幕协同后,用户对第一电子设备显示屏幕上的投屏窗口进行操作触发窗口更新时,第二电子设备中对应的显示窗口同步更新。具体的,第一电子设备接收到用户触发投屏窗口更新的操作后,对投屏窗口显示的数据进行更新,及,将更新的数据发送到第二电子设备并指示第二电子设备同步更新对应的显示窗口,第二电子设备根据第一电子设备的指示,将接收到的更新的数据在对应的显示窗口进行更新显示。例如,如图7a中示意图(b)所示的窗口分布,用户对第一电子设备显示屏幕上投屏窗口B1进行操作触发窗口更新时,第二电子设备中显示窗口B的内容同步更新,与所述投屏窗口B1的内容保持一致。
上述第一电子设备协同投屏到第二电子设备,同时,第二电子设备协同投屏到第一电子设备后,第一电子设备与第二电子设备建立双向设备间屏幕协同,可实现双向投屏控制。
在本申请一些实施例中,第一电子设备和第二电子设备建立双向设备间屏幕协同后,可分别对显示屏幕上的显示窗口和投屏窗口进行适应性调整,以得到更好的视觉显示效果,例如可以调整显示窗口、投屏窗口的大小、显示区域等信息。
参考图8所示的双向设备间屏幕协同控制流程示意图,对本申请提供的设备间屏幕协同方法进行说明。如图8所示,该方法具体流程包括:
步骤1:第一电子设备检测到用户点击显示屏幕的显示窗口中投屏按钮的操作后,生成相应的操作信息。
其中,所述显示窗口可参见附图6a中窗口A或参见附图6b中窗口,所述投屏按钮可参见附图6a的窗口A中投屏按钮或参见附图6b中投屏按钮601。
步骤2:第一电子设备根据所述操作信息,向第二电子设备发送投屏请求消息,与第二电子设备建立设备间屏幕协同。
步骤3:第一电子设备对显示屏幕上自身的显示窗口中相关数据进行显示流编码。
步骤4:第一电子设备将已编码显示流数据发送到第二电子设备。
步骤5:第二电子设备对接收的已编码显示流数据进行解码后,生成对应的投屏窗口,并在显示屏幕上同时显示自身显示窗口与所述投屏窗口。
其中,第二电子设备自身显示窗口可参见附图6a中所示的窗口B或参见附图6d中所述的窗口1,所述投屏窗口可参见附图6a中所示的窗口A1或参见附图6d中所述的窗口2。
步骤6:第二电子设备检测到用户点击显示屏幕上自身的显示窗口中投屏按钮的操作 后,生成相应的操作信息。
其中,所述显示窗口可参见附图7a的示意图(a)中窗口B或参见附图7b中窗口1,所述投屏按钮可参见附图7a的示意图(a)的窗口B中投屏按钮或参见附图7b的窗口1中投屏按钮701。
步骤7:第二电子设备根据所述操作信息,向第一电子设备发送投屏请求消息,与第一电子设备建立设备间屏幕协同。
步骤8:第二电子设备对显示屏幕上自身的显示窗口中相关数据进行显示流编码。
步骤9:第二电子设备将已编码显示流数据发送到第一电子设备。
步骤10:第一电子设备对接收的已编码显示流数据进行解码后,生成对应的投屏窗口,并在显示屏幕上同时显示自身显示窗口与所述投屏窗口。
其中,第一电子设备自身显示窗口可参见7a的示意图(b)中窗口A或参见附图7d中所述的窗口1,所述投屏窗口可参见附图7a的示意图(b)的窗口B1或参见附图7d中所述的窗口2。
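As a rough, non-authoritative illustration of steps 1–10 above, the Kotlin sketch below models the two symmetric projection setups as one reusable handshake: request projection, encode the local window as a display stream, transmit it, and have the peer decode it and show it alongside its own window. The function names, the string-based "display stream", and the always-accept prompt are assumptions made only for this example.

```kotlin
// Illustrative end-to-end handshake; all names and the trivial encoding are assumptions.
class CollabDevice(val name: String) {
    var ownWindow: String = "home screen"
    val projectedWindows = mutableMapOf<String, String>()   // peer name -> decoded content

    private fun encode(content: String) = content.encodeToByteArray()
    private fun decode(stream: ByteArray) = stream.decodeToString()

    // Steps 1-2 / 6-7: the projection button was tapped, ask the peer to accept.
    fun requestProjection(peer: CollabDevice): Boolean = peer.promptUser(this)

    // The peer shows the "accept projection?" prompt and (in this sketch) always accepts.
    fun promptUser(from: CollabDevice): Boolean {
        println("$name: accept projection from ${from.name}? -> yes")
        return true
    }

    // Steps 3-5 / 8-10: encode the own window, send it, and let the peer decode and display it.
    fun projectTo(peer: CollabDevice) {
        if (!requestProjection(peer)) return
        peer.onProjectionStream(name, encode(ownWindow))
    }

    fun onProjectionStream(fromPeer: String, stream: ByteArray) {
        projectedWindows[fromPeer] = decode(stream)
        println("$name shows its own window ('$ownWindow') and a projected window from $fromPeer")
    }
}

fun main() {
    val a = CollabDevice("first-device")
    val b = CollabDevice("second-device")
    a.projectTo(b)   // steps 1-5
    b.projectTo(a)   // steps 6-10: bidirectional collaboration is now established
}
```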
本申请实施例中,第一电子设备和第二电子设备中的显示窗口与投屏窗口可以分层显示,或者,可以分区显示。
示例性的,假设上述第一电子设备和第二电子设备均为手机,则第一电子设备和第二电子设备中的显示窗口与投屏窗口分层显示时,第一电子设备与第二电子设备执行上述方法后的效果如图9所示。其中,第一电子设备的图层1用于显示自身的显示窗口,图层2用于显示第二电子设备自身的显示窗口投屏到第一电子设备后的投屏窗口,与第二电子设备的图层3中显示的显示窗口的内容相同,且所述图层2位于所述图层1之上。第二电子设备的图层3用于显示自身的显示窗口,图层4用于显示第一电子设备自身的显示窗口投屏到第二电子设备后的投屏窗口,与所述图层1中显示的显示窗口的内容相同,且所述图层4位于所述图层3之上。
示例性的,假设上述第一电子设备和第二电子设备均为手机,则第一电子设备和第二电子设备中的显示窗口与投屏窗口分区显示时,第一电子设备与第二电子设备执行上述方法后的效果如图10所示。其中,窗口1为第一电子设备显示屏幕上自身的显示窗口,窗口2为第二电子设备自身的显示窗口投屏到第一电子设备后的投屏窗口,与第二电子设备中窗口3显示内容相同。窗口3为第二电子设备显示屏幕上自身的显示窗口,窗口4为第一电子设备自身的显示窗口投屏到第二电子设备后的投屏窗口,与所述窗口1显示内容相同。
本申请实施例中,第一电子设备与第二电子设备建立双向投屏后,第一电子设备或第二电子设备在显示多个窗口,例如同时显示自身窗口或对端设备投屏到自身的窗口时,可以对待显示的窗口按比例缩放后进行显示,例如图9中所示的显示方式;或者适应性调整待显示的窗口中内容布局后进行显示,例如图10中所示的显示方式。
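The two presentation choices above (scaled overlay layers versus re-laid-out partitions) can be expressed as a small layout policy. The Kotlin sketch below is illustrative only; the concrete region sizes and the one-third overlay scale are assumptions.

```kotlin
// Illustrative layout policy; the two modes mirror the layered and partitioned displays above.
data class Rect(val x: Int, val y: Int, val w: Int, val h: Int)

enum class CompositionMode { LAYERED, PARTITIONED }

// Returns the regions used for the device's own window and the projected window.
fun layout(screenW: Int, screenH: Int, mode: CompositionMode): Pair<Rect, Rect> = when (mode) {
    // Projected window drawn on a layer above the own window, scaled down (assumed 1/3 size).
    CompositionMode.LAYERED ->
        Rect(0, 0, screenW, screenH) to Rect(screenW * 2 / 3, 0, screenW / 3, screenH / 3)
    // Screen split into two regions, one per window, with contents re-laid-out to fit.
    CompositionMode.PARTITIONED ->
        Rect(0, 0, screenW / 2, screenH) to Rect(screenW / 2, 0, screenW / 2, screenH)
}
```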
上述第一电子设备与第二电子设备建立双向设备间屏幕协同后,第一电子设备或第二电子设备对显示屏幕中各窗口进行的操作都可同步到对端的显示屏幕中对应的窗口。因此,使用第一电子设备的用户和使用第二电子设备的用户可以实时查看双方展示的信息,提高了用户体验度,且第一电子设备与第二电子设备之间可以相互支持同步操作,提高了设备间协同操作的效率。例如,如图11所示,以第一电子设备和第二电子设备为手机为例,假设使用第一电子设备的用户C与使用第二电子设备的用户D在同一游戏场景中时,第一电 子设备与第二电子设备通过执行上述设备间屏幕协同方法,分别将自身显示窗口投屏到对方的显示屏幕上进行显示,能够分别在第一电子设备和第二电子设备的显示屏幕中同时显示用户C和用户D的游戏场景界面,实现协同双向共享,用户C和用户D无需进行场景切换就能在自身设备的显示屏幕的投屏窗口中实时查看对方的状态信息。
上述第一电子设备与第二电子设备建立双向设备间屏幕协同后,可以进一步在显示屏幕中输出显示对双向设备间屏幕协同进行控制的控制按钮,以使用户对第一电子设备与第二电子设备间的双向投屏进行灵活控制。
在本申请一些实施例中,第一电子设备与第二电子设备建立双向设备间屏幕协同后,第一电子设备和第二电子设备均可以在显示屏幕上的投屏窗口中设置控制对端开关按钮,该控制对端开关按钮用于控制电子设备更新投屏窗口时是否同步控制对应的显示窗口,使所述显示窗口同步更新。具体的,第一电子设备中该控制对端开关按钮打开时,第一电子设备更新投屏窗口时,同时控制第二电子设备中对应的显示窗口更新;用户对第一电子设备显示屏幕中的投屏窗口进行操作触发窗口更新时,第二电子设备中对应的显示窗口同步更新。第二电子设备中该控制对端开关按钮打开时,第二电子设备更新投屏窗口时,同时控制第一电子设备中对应的显示窗口更新;用户对第二电子设备显示屏幕中的投屏窗口进行操作触发窗口更新时,第一电子设备中对应的显示窗口同步更新。
在本申请一些实施例中,第一电子设备与第二电子设备建立双向设备间屏幕协同后,第一电子设备和第二电子设备均可以在显示屏幕上自身的显示窗口中设置允许对端控制开关按钮(在本申请中第一电子设备显示的允许对端控制开关按钮也可以被称为第一控制选项,第二电子设备显示的允许对端控制开关按钮也可以被称为第二控制选项),该允许对端控制开关按钮用于控制电子设备自身的显示窗口是否接受对应的投屏窗口的同步控制,与所述投屏窗口同步更新。具体的,第一电子设备中该允许对端控制开关按钮打开时,第一电子设备允许用户通过第二电子设备的投屏窗口对第一电子设备的显示窗口进行控制,用户对第二电子设备显示屏幕中的投屏窗口进行操作触发窗口更新时,第一电子设备中对应的显示窗口同步更新。第二电子设备中该允许对端控制开关按钮打开时,第二电子设备允许用户通过第一电子设备的投屏窗口对第二电子设备的显示窗口进行控制,用户对第一电子设备显示屏幕中的投屏窗口进行操作触发窗口更新时,第二电子设备中对应的显示窗口同步更新。
示例性的,如图12所示,第一电子设备与第二电子设备建立双向投屏后,第一电子设备的投屏窗口B1中显示控制对端开关按钮,第二电子设备在自身的显示窗口B中显示允许对端控制开关按钮。用户通过对所述控制对端开关按钮进行相应操作(例如点击、滑动等)打开所述投屏窗口B1中的控制对端开关按钮时,第一电子设备响应于该操作,设置为控制对端模式,用户通过对所述控制对端开关按钮进行相应操作关闭所述投屏窗口B1中的控制对端开关按钮时,第一电子设备响应于该操作,设置为不控制对端模式。用户通过对所述允许对端控制开关按钮进行相应操作(例如点击、滑动等)打开所述显示窗口B中的允许对端控制开关按钮时,第二电子设备响应于该操作,设置为允许对端控制模式,用户通过对所述允许对端控制开关按钮进行相应操作关闭所述显示窗口B中的允许对端控制开关按钮时,第二电子设备响应于该操作,设置为不允许对端控制模式。
其中,一种可能的情况:第一电子设备设置为控制对端模式时,向第二电子设备发送控制对端的请求信息,第二电子设备接收所述请求信息后,若确定已设置为允许对端控制 模式,则向第一电子设备返回允许控制的反馈信息,若确定已设置为不允许对端控制模式,则向第一电子设备返回不允许控制的反馈信息,第一电子设备接收到第二电子设备返回的允许控制的反馈信息后,允许用户通过所述投屏窗口B1对所述第二电子设备进行控制,则用户在第一电子设备的投屏窗口B1进行操作触发窗口更新时,第一电子设备更新所述投屏窗口B1的同时,向第二电子设备发送同步更新所述投屏窗口B1对应的显示窗口B的请求,以使第二电子设备同步更新所述显示窗口B。第一电子设备接收到第二电子设备返回的不允许控制的反馈信息后,不允许用户通过所述投屏窗口B1对所述第二电子设备进行控制,则用户在第一电子设备的投屏窗口B1进行操作触发窗口更新时,第一电子设备仅更新所述投屏窗口B1。
第一电子设备设置为不控制对端模式时,不允许用户通过所述投屏窗口B1对所述第二电子设备进行控制,则用户在第一电子设备的投屏窗口B1进行操作触发窗口更新时,第一电子设备仅更新所述投屏窗口B1。
另一种可能的情况:第二电子设备设置为允许对端控制模式时,向第一电子设备发送允许通过投屏窗口B1控制对应的显示窗口B的提示信息,第一电子设备接收提示信息后反馈给用户,并可以通过投屏窗口B1中的请求对端控制开关按钮,根据用户相应操作设置控制对端或不控制对端的模式。在用户在第一电子设备的投屏窗口B1进行操作触发窗口更新时,第一电子设备可根据已设置的控制对端或不控制对端模式,确定仅更新所述投屏窗口B1,或者同时控制第二电子设备更新对应的显示窗口B。
第二电子设备设置为不允许对端控制模式时,向第一电子设备发送不允许通过投屏窗口B1控制对应的显示窗口B的提示信息,第一电子设备接收提示信息后将所述提示信息反馈给用户,并且,在用户在第一电子设备的投屏窗口B1进行操作触发窗口更新时,第一电子设备仅更新所述投屏窗口B1。
在本申请一些实施例中,作为一种可选的实施方式,第二电子设备设置为不允许对端控制模式时,向第一电子设备发送不接受投屏窗口控制的提示信息,第一电子设备根据该提示信息阻断用户通过所述投屏窗口B1对B的控制,则用户在所述投屏窗口B1进行操作触发窗口更新时,第一电子设备仅更新所述投屏窗口B1。作为另一种可选的实施方式,第二电子设备设置为不允许对端控制模式时,第二电子设备阻断用户通过所述投屏窗口B1对B的控制,则用户在所述投屏窗口B1进行操作触发窗口更新时,第一电子设备更新所述投屏窗口B1,并请求第二电子设备同步更新显示窗口B,但第二电子设备接收到该请求后不响应,维持所述显示窗口B不变。
第二电子设备投屏窗口A1中的控制对端开关按钮,及第一电子设备自身显示窗口A中的允许对端控制开关按钮的操作及控制方式,可分别参考上述第一电子设备投屏窗口B1中的控制对端开关按钮,及第二电子设备自身显示窗口B中的允许对端控制开关按钮的操作及控制方式,此处不再赘述。
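The toggle behavior described above reduces to a simple rule: an operation on a projected window is propagated back to the source device only when the local "control peer" switch and the source device's "allow peer control" switch are both on; otherwise only the local projected window changes (or, in one variant, the source simply ignores the request). The Kotlin sketch below is illustrative; the names and the boolean-only negotiation are assumptions.

```kotlin
// Illustrative control negotiation; the switch names map to the on-screen toggles described above.
class ProjectionSession(
    var controlPeerEnabled: Boolean,       // toggle shown in the projected window (e.g. B1)
    var allowPeerControlEnabled: Boolean   // toggle shown in the source device's own window (e.g. B)
) {
    var projectedWindowContent = "initial"
    var sourceWindowContent = "initial"

    // The user operates on the projected window and triggers an update.
    fun operateOnProjectedWindow(newContent: String) {
        projectedWindowContent = newContent                    // the local copy always updates
        if (controlPeerEnabled && allowPeerControlEnabled) {
            sourceWindowContent = newContent                   // the peer accepted the update request
            println("source window synchronized to: $newContent")
        } else {
            println("peer control disabled; only the projected window changed")
        }
    }
}
```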
在本申请一些实施例中,第一电子设备与第二电子设备建立双向投屏后,互为对方的对端设备。第一电子设备或第二电子设备更新对端设备投屏来的投屏窗口时,若电子设备自身能够生成所述投屏窗口更新后显示的内容,则可以根据更新后所述投屏窗口对应的数据更新所述投屏窗口。其中,若需控制所述投屏窗口对应的对端设备上的显示窗口同时更新,则向对端设备发送同步更新所述显示窗口的请求,对端设备根据该请求更新其显示的所述显示窗口。若电子设备自身不能生成所述投屏窗口更新后显示的内容,则可以向对端 设备请求所述投屏窗口更新后对应的数据,根据对端设备返回的数据更新所述投屏窗口。其中,若需控制所述投屏窗口对应的对端设备上的显示窗口更新,则可以在向对端设备请求所述投屏窗口更新后对应的数据时,同时请求对端设备同步更新所述投屏窗口对应的所述显示窗口,对端设备根据该请求,返回所述投屏窗口更新后对应到的数据时,同步更新其显示的所述显示窗口。
上述在第一电子设备与第二电子设备建立双向设备间屏幕协同的基础上,通过在第一电子设备和第二电子设备显示屏幕中的投屏窗口增加控制对端的开关按钮,在显示窗口中增加允许对端控制的开关按钮,能够使用户根据需求对双向投屏控制进行功能选择,对是否发起控制和是否接受控制进行灵活切换,提高了双向设备间屏幕协同控制的灵活性。
在本申请实施例中,上述第一电子设备与第二电子设备建立双向设备间屏幕协同的方法还可应用于包括多于两个设备的多设备场景中,在所述多设备场景中的任意两个电子设备可分别作为本申请上述实施例所述的第一电子设备和第二电子设备,并分别执行对应的设备间屏幕协同方法,将自身的显示窗口投屏到对方的显示屏幕上进行显示。
在本申请一些实施例中,在所述多设备场景中的任一电子设备可作为本申请上述实施例所述的第一电子设备,其它能与该电子设备建立通信连接的电子设备可分别作为本申请上述实施例所述的第二电子设备,与所述第一电子设备配合执行本申请上述实施例所述的设备间屏幕协同方法,将自身的显示窗口投屏到对方的显示屏幕上进行显示。
作为一种可选的实施方式,第一电子设备与第二电子设备建立双向投屏后,若检测到能与第三电子设备建立多屏协同,则显示询问是否与所述第三电子设备建立屏幕协同的提示信息,并响应于接收到的与所述第三电子设备建立屏幕协同的指示,在保持与所述第二电子设备间的双向投屏的基础上,与所述第三电子设备建立新的投屏关系。
例如,基于上述图7f所示的示例,第一电子设备即平板电脑1与第二电子设备即平板电脑2建立双向投屏后,平板电脑1在显示屏幕上同时显示自身的窗口(窗口1)和平板电脑2投屏来的窗口(窗口2),若平板电脑1检测到能够与第三电子设备例如手机1建立多屏协同,则平板电脑1在自身显示窗口中显示提示信息,询问用户是否需要投屏到手机1,例如显示询问信息“是否投屏到手机1?”,如图13a所示。若根据用户操作确定投屏到手机1,则平板电脑1在维持和平板电脑2之间的双向投屏不中断的情况下,作为第一电子设备,将手机1作为第二电子设备,执行本申请实施例提供的应用于第一电子设备与第二电子设备的设备间屏幕协同方法,实现平板电脑1将自身的显示窗口投屏到手机1的显示屏幕上进行显示。平板电脑1投屏到手机1后,可以接受手机1的投屏,在自身的显示屏幕上同时显示自身的显示窗口、平板电脑2投屏来的窗口和手机1投屏来的窗口。例如,基于图7f所示的平板电脑1的显示屏幕,手机1中显示窗口为备忘录界面对应的窗口时,平板电脑1在与平板电脑2建立双向投屏的基础上,又与手机1建立双向投屏后,平板电脑1的显示屏幕上显示界面如图13b所示。其中,窗口1为平板电脑1自身的显示窗口,窗口2为平板电脑2投屏到平板电脑1的窗口,窗口3为手机1投屏到平板电脑1的窗口。
作为另一种可选的实施方式,电子设备若检测到能与其它多个电子设备建立通信连接,则在显示屏幕上自身的显示窗口中输出显示与其它电子设备建立设备间屏幕协同的协同请求按钮,并设置对应的展开列表,所述展开列表包括所述多个电子设备的设备标志,用户可根据设备标志选择需要建立多屏连接的电子设备。
例如,假设某一电子设备1检测到能与其它多个电子设备建立通信连接,则在显示屏 幕上自身的显示窗口中输出显示投屏到其它电子设备的投屏按钮1401,并设置对应的展开列表,如图14中示意图(a)所示,用户点击该投屏按钮后,电子设备1输出显示所述展开列表,如图14中示意图(b)所示,用户通过点击操作在所述展开列表中选择至少一个电子设备,第一电子设备响应于用户的操作,与用户选择的至少一个电子设备建立设备间屏幕协同。例如,如图14中示意图(b)所示,展示列表中包括电子设备2、电子设备3、电子设备4、电子设备5共四个电子设备时,用户选择其中的电子设备2和电子设备3建立多屏连接。则将电子设备1作为本申请上述实施例所述的第一电子设备,将电子设备2和电子设备3分别作为本申请上述实施例所述的第二电子设备,分别与所述电子设备1配合执行本申请上述实施例所述的设备间屏幕协同方法,实现电子设备1与电子设备2和电子设备3中任一电子设备之间的双向投屏控制。
图15为上述设备间屏幕协同方法的一种双向投屏控制效果示意图,如图所示,窗口101为电子设备1显示屏幕上自身的显示窗口,窗口201为电子设备2显示屏幕上自身的显示窗口,窗口301为电子设备3显示屏幕上自身的显示窗口;窗口102为电子设备2中窗口201投屏到电子设备1后的投屏窗口,与窗口201显示内容相同,窗口103为电子设备3中窗口301投屏到电子设备1后的投屏窗口,与窗口301显示内容相同;窗口202为电子设备1中窗口101投屏到电子设备2后的投屏窗口,与窗口101显示内容相同;窗口302为电子设备1中窗口101投屏到电子设备3后的投屏窗口,与窗口101显示内容相同。
在此基础上,还可以将电子设备2和电子设备3分别作为本申请上述实施例所述的第一电子设备和第二电子设备,配合执行本申请上述实施例所述的设备间屏幕协同方法,实现电子设备2与电子设备3之间的双向投屏控制。进而,能够实现电子设备1、电子设备2和电子设备3中任意两个电子设备之间的双向投屏控制。
图16为上述设备间屏幕协同方法的一种双向投屏控制效果示意图,如图所示,在图14所示的双向投屏控制效果基础上,还包括:窗口203为电子设备3中窗口301投屏到电子设备2后的投屏窗口,与窗口301显示内容相同;窗口303为电子设备2中窗口201投屏到电子设备3后的投屏窗口,与窗口201显示内容相同。
上述实施例应用于多设备场景中,能够在多个设备间进行设备间屏幕协同,一个设备的显示窗口可以投屏到其它多个设备中,并可以接受其它多个设备的投屏,因此在多设备场景中能够根据实际需求灵活调整设备间屏幕协同的应用对象,保证高效的屏幕协同,提高了设备间屏幕协同的实用性。
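In the multi-device case, a device effectively keeps one bidirectional projection session per connected peer. The following Kotlin sketch (illustrative only; the session bookkeeping is an assumption) shows a device that projects its own window to several peers and displays one projected window per peer, matching the effect of Figures 15 and 16.

```kotlin
// Illustrative multi-peer bookkeeping; class and function names are assumptions.
class MultiScreenDevice(val name: String, var ownWindow: String) {
    private val outgoing = mutableSetOf<String>()           // peers we project to
    private val incoming = mutableMapOf<String, String>()   // peer -> content projected to us

    fun projectTo(peer: MultiScreenDevice) {
        outgoing += peer.name
        peer.incoming[name] = ownWindow
    }

    fun describeScreen(): String =
        "$name shows its own window" +
            incoming.entries.joinToString("") { " + projected window from ${it.key}" }
}

fun main() {
    val tablet1 = MultiScreenDevice("tablet-1", "messages")
    val tablet2 = MultiScreenDevice("tablet-2", "camera")
    val phone1 = MultiScreenDevice("phone-1", "notes")
    // tablet-1 keeps its session with tablet-2 and adds a new one with phone-1.
    tablet1.projectTo(tablet2); tablet2.projectTo(tablet1)
    tablet1.projectTo(phone1);  phone1.projectTo(tablet1)
    println(tablet1.describeScreen())  // own window + windows from tablet-2 and phone-1
}
```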
基于上述实施例,本申请实施例提供了一种设备间屏幕协同方法,应用于第一电子设备和第二电子设备组成的系统,如图17所示,该方法包括如下步骤:
步骤S1701:第一电子设备与第二电子设备建立通信连接。
步骤S1702:所述第一电子设备显示第一窗口,所述第一窗口中包括第一内容。
步骤S1703:所述第二电子设备显示第二窗口,所述第二窗口中包括第二内容。
步骤S1704:所述第一电子设备向所述第二电子设备发送第一投屏数据,所述第一投屏数据包括所述第一内容的信息。
步骤S1705:所述第二电子设备接收所述第一投屏数据,显示所述第二窗口和第三窗口,所述第三窗口中包括所述第一内容。
步骤S1706:所述第二电子设备向所述第一电子设备发送第二投屏数据,所述第二投屏数据包括所述第二内容的信息。
步骤S1707:所述第一电子设备接收所述第二投屏数据,显示所述第一窗口和第四窗口,所述第四窗口中包括所述第二内容。
具体的,该方法中第一电子设备和第二电子设备所执行的具体步骤可参阅前述实施例,在此不做过多赘述。
需要说明的是,上述各步骤的执行顺序或时序是可变化的,例如,上述步骤S1706可以在步骤S1704、步骤S1705之后执行,也可以与步骤S1704同时执行。
基于以上实施例,本申请实施例还提供了一种电子设备,所述电子设备可以为第一电子设备或第二电子设备。所述电子设备用于实现本申请实施例提供的设备间屏幕协同方法。如图18所示,所述电子设备1800可以包括:显示屏幕1801,一个或多个处理器1802,存储器1803,以及一个或多个计算机程序(图中未示出)。上述各器件可以通过一个或多个通信总线1804耦合。
其中,显示屏幕1801用于显示图像、视频、应用界面等相关用户界面。存储器1803中存储有一个或多个计算机程序,所述一个或多个计算机程序包括指令;处理器1802调用存储器1803中存储的所述指令,使得电子设备1800执行本申请实施例提供的设备间屏幕协同方法。
上述本申请提供的实施例中,从电子设备作为执行主体的角度对本申请实施例提供的方法进行了介绍。为了实现上述本申请实施例提供的方法中的各功能,电子设备可以包括硬件结构和/或软件模块,以硬件结构、软件模块、或硬件结构加软件模块的形式来实现上述各功能。上述各功能中的某个功能以硬件结构、软件模块、还是硬件结构加软件模块的方式来执行,取决于技术方案的特定应用和设计约束条件。
以上实施例中所用,根据上下文,术语“当…时”或“当…后”可以被解释为意思是“如果…”或“在…后”或“响应于确定…”或“响应于检测到…”。类似地,根据上下文,短语“在确定…时”或“如果检测到(所陈述的条件或事件)”可以被解释为意思是“如果确定…”或“响应于确定…”或“在检测到(所陈述的条件或事件)时”或“响应于检测到(所陈述的条件或事件)”。另外,在上述实施例中,使用诸如第一、第二之类的关系术语来区份一个实体和另一个实体,而并不限制这些实体之间的任何实际的关系和顺序。
本申请实施例中所描述的方法或算法的步骤可以直接嵌入硬件、处理器执行的软件单元、或者这两者的结合。软件单元可以存储于随机存取存储器(random access memory,RAM)、闪存、只读存储器(read-only memory,ROM)、EPROM存储器、EEPROM存储器、寄存器、硬盘、可移动磁盘、CD-ROM或本领域中其它任意形式的存储媒介中。示例性地,存储媒介可以与处理器连接,以使得处理器可以从存储媒介中读取信息,并可以向存储媒介存写信息。可选地,存储媒介还可以集成到处理器中。处理器和存储媒介可以设置于ASIC中。
这些计算机程序指令也可装载到计算机或其他可编程数据处理设备上,使得在计算机或其他可编程设备上执行一系列操作步骤以产生计算机实现的处理,从而在计算机或其他可编程设备上执行的指令提供用于实现在流程图一个流程或多个流程和/或方框图一个方框或多个方框中指定的功能的步骤。
在一个或多个示例性的设计中,本申请实施例所描述的上述功能可以在硬件、软件、固件或这三者的任意组合来实现。如果在软件中实现,这些功能可以存储与电脑可读的媒介上,或以一个或多个指令或代码形式传输于电脑可读的媒介上。电脑可读媒介包括电脑 存储媒介和便于使得让电脑程序从一个地方转移到其它地方的通信媒介。存储媒介可以是任何通用或特殊电脑可以接入访问的可用媒体。例如,这样的电脑可读媒体可以包括但不限于RAM、ROM、EEPROM、CD-ROM或其它光盘存储、磁盘存储或其它磁性存储装置,或其它任何可以用于承载或存储以指令或数据结构和其它可被通用或特殊电脑、或通用或特殊处理器读取形式的程序代码的媒介。此外,任何连接都可以被适当地定义为电脑可读媒介,例如,如果软件是从一个网站站点、服务器或其它远程资源通过一个同轴电缆、光纤电脑、双绞线、数字用户线(DSL)或以例如红外、无线和微波等无线方式传输的也被包含在所定义的电脑可读媒介中。所述的碟片(disk)和磁盘(disc)包括压缩磁盘、镭射盘、光盘、数字通用光盘(digital versatile disc,DVD)、软盘和蓝光光盘,磁盘通常以磁性复制数据,而碟片通常以激光进行光学复制数据。上述的组合也可以包含在电脑可读媒介中。

Claims (19)

  1. 一种设备间屏幕协同方法,应用于第一电子设备与第二电子设备组成的系统,其特征在于,包括:
    所述第一电子设备与所述第二电子设备建立通信连接;
    所述第一电子设备显示第一窗口,所述第一窗口中包括第一内容;
    所述第二电子设备显示第二窗口,所述第二窗口中包括第二内容;
    所述第一电子设备向所述第二电子设备发送第一投屏数据,所述第一投屏数据包括所述第一内容的信息;
    所述第二电子设备接收所述第一投屏数据,显示所述第二窗口和第三窗口,所述第三窗口中包括所述第一内容;
    所述第二电子设备向所述第一电子设备发送第二投屏数据,所述第二投屏数据包括所述第二内容的信息;
    所述第一电子设备接收所述第二投屏数据,显示所述第一窗口和第四窗口,所述第四窗口中包括所述第二内容。
  2. 根据权利要求1所述的方法,其特征在于,
    所述第一电子设备向所述第二电子设备发送第一投屏数据之前,所述方法还包括:
    所述第一电子设备响应于接收到的第一操作,向所述第二电子设备发送第一投屏请求,所述第一投屏请求用于请求将所述第一内容投屏到所述第二电子设备;
    所述第二电子设备显示所述第二窗口和第三窗口之前,所述方法还包括:
    所述第二电子设备显示第一提示信息,所述第一提示信息用于提示用户是否接受所述第一电子设备将所述第一内容投屏到所述第二电子设备。
  3. 根据权利要求1或2所述的方法,其特征在于,所述第二电子设备显示所述第二窗口和第三窗口之后,所述方法还包括:
    所述第一电子设备响应于接收到的第二操作,更新所述第一窗口中的第一内容;或者
    所述第二电子设备响应于接收到的第三操作,向所述第一电子设备发送第一更新请求,所述第一更新请求用于请求所述第一电子设备更新所述第一窗口中的第一内容;
    所述第一电子设备响应于接收到的所述第一更新请求,更新所述第一窗口中的第一内容。
  4. 根据权利要求3所述的方法,其特征在于,所述第一电子设备更新所述第一窗口中的第一内容时,所述方法还包括:
    所述第一电子设备向所述第二电子设备发送更新的第一投屏数据;
    所述第二电子设备根据接收到的所述更新的第一投屏数据,更新所述第三窗口中的第一内容。
  5. 根据权利要求3或4所述的方法,其特征在于,所述第一电子设备显示所述第一窗口和第四窗口之后,所述方法还包括:
    所述第一电子设备显示第一控制选项;
    所述第一电子设备响应于接收到的所述第一更新请求,更新所述第一窗口中的第一内容之前,所述方法还包括:
    所述第一电子设备接收作用于所述第一控制选项的第四操作。
  6. 根据权利要求1~5任一所述的方法,其特征在于,所述第一窗口与所述第四窗口的显示区域不同,或者,所述第一窗口的显示区域覆盖所述第四窗口的显示区域中的部分区域,或者,所述第四窗口的显示区域覆盖所述第一窗口的显示区域中的部分区域。
  7. 根据权利要求1~6任一所述的方法,其特征在于,所述系统还包括第三电子设备,所述第一电子设备显示所述第一窗口和第四窗口之后,所述方法还包括:
    所述第一电子设备与所述第三电子设备建立通信连接;
    所述第三电子设备显示第五窗口,所述第五窗口中包括第三内容;
    所述第一电子设备向所述第三电子设备发送所述第一投屏数据;
    所述第三电子设备接收所述第一投屏数据,显示所述第五窗口和第六窗口,所述第六窗口中包括所述第一内容;
    所述第三电子设备向所述第一电子设备发送第三投屏数据,所述第三投屏数据包括所述第三内容的信息;
    所述第一电子设备接收所述第三投屏数据,显示所述第一窗口和第七窗口,所述第七窗口中包括所述第三内容。
  8. 一种设备间屏幕协同方法,应用于第一电子设备,其特征在于,包括:
    与第二电子设备建立通信连接;
    显示第一窗口,所述第一窗口中包括第一内容;
    向所述第二电子设备发送第一投屏数据,所述第一投屏数据包括所述第一内容的信息;
    接收来自所述第二电子设备的第二投屏数据,所述第二投屏数据包括第二内容的信息,所述第二内容为所述第二电子设备显示的第二窗口中包括的内容;
    显示所述第一窗口和第四窗口,所述第四窗口中包括所述第二内容。
  9. 根据权利要求8所述的方法,其特征在于,在向所述第二电子设备发送第一投屏数据之前,所述方法还包括:
    响应于接收到的第一操作,向所述第二电子设备发送第一投屏请求,所述第一投屏请求用于请求将所述第一内容投屏到所述第二电子设备。
  10. 根据权利要求8或9所述的方法,其特征在于,在接收来自所述第二电子设备的第二投屏数据之前,所述方法还包括:
    显示第二提示信息,所述第二提示信息用于提示用户是否接受所述第二电子设备将所述第二内容投屏到所述第一电子设备。
  11. 根据权利要求8~10任一所述的方法,其特征在于,在向所述第二电子设备发送第一投屏数据之后,所述方法还包括:
    响应于接收到的第二操作,更新所述第一窗口中的第一内容;或者
    响应于接收到的来自所述第二电子设备的第一更新请求,更新所述第一窗口中的第一内容,其中,所述第一更新请求用于请求所述第一电子设备更新所述第一窗口中的第一内容。
  12. 根据权利要求11所述的方法,其特征在于,在更新所述第一窗口中的第一内容时,所述方法还包括:
    向所述第二电子设备发送更新的第一投屏数据。
  13. 根据权利要求11或12所述的方法,其特征在于,在显示所述第一窗口和第四窗口之后,所述方法还包括:
    显示第一控制选项;
    在响应于来自所述第二电子设备的所述第一更新请求,更新所述第一窗口中的第一内容之前,所述方法还包括:
    接收作用于所述第一控制选项的第四操作。
  14. 根据权利要求8~13任一所述的方法,其特征在于,在显示所述第一窗口和第四窗口之后,所述方法还包括:
    响应于接收到的第七操作,向所述第二电子设备发送第二更新请求,所述第二更新请求用于请求所述第二电子设备更新所述第二窗口中的第二内容。
  15. 根据权利要求14所述的方法,其特征在于,在向所述第二电子设备发送第二更新请求之后,所述方法还包括:
    响应于接收到的来自所述第二电子设备的更新的第二投屏数据,更新所述第四窗口中的第二内容。
  16. 根据权利要求8~15任一所述的方法,其特征在于,所述第一窗口与所述第四窗口的显示区域不同,或者,所述第一窗口的显示区域覆盖所述第四窗口的显示区域中的部分区域,或者,所述第四窗口的显示区域覆盖所述第一窗口的显示区域中的部分区域。
  17. 根据权利要求8~16任一所述的方法,其特征在于,在显示所述第一窗口和第四窗口之后,所述方法还包括:
    与第三电子设备建立通信连接;
    向所述第三电子设备发送所述第一投屏数据;
    接收来自所述第三电子设备的第三投屏数据,所述第三投屏数据包括第三内容的信息,所述第三内容为所述第三电子设备显示的第五窗口中包括的内容;
    显示所述第一窗口和第七窗口,所述第七窗口中包括所述第三内容。
  18. 一种电子设备,其特征在于,所述电子设备包括显示屏幕,存储器和一个或多个处理器;
    其中,所述存储器用于存储计算机程序代码,所述计算机程序代码包括计算机指令;当所述计算机指令被所述一个或多个处理器执行时,使得所述电子设备执行如权利要求8至17中任一项所述的方法。
  19. 一种计算机可读存储介质,其特征在于,所述计算机可读存储介质包括程序指令,当所述程序指令在终端设备上运行时,使得所述终端设备执行如权利要求1至7任一项所述的方法,或者执行如权利要求8至17任一项所述的方法。
PCT/CN2021/125218 2020-10-30 2021-10-21 一种设备间屏幕协同方法及设备 WO2022089294A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202011190260.1A CN114442969A (zh) 2020-10-30 2020-10-30 一种设备间屏幕协同方法及设备
CN202011190260.1 2020-10-30

Publications (1)

Publication Number Publication Date
WO2022089294A1 true WO2022089294A1 (zh) 2022-05-05

Family

ID=81357685

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/125218 WO2022089294A1 (zh) 2020-10-30 2021-10-21 一种设备间屏幕协同方法及设备

Country Status (2)

Country Link
CN (1) CN114442969A (zh)
WO (1) WO2022089294A1 (zh)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2024027238A1 (zh) * 2022-08-05 2024-02-08 荣耀终端有限公司 一种多设备协同方法、电子设备及相关产品

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117480487A (zh) * 2022-05-30 2024-01-30 京东方科技集团股份有限公司 屏幕信息同步方法及系统
WO2024065449A1 (zh) * 2022-09-29 2024-04-04 京东方科技集团股份有限公司 一种数据共享显示的方法及智能显示系统

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106293583A (zh) * 2016-08-03 2017-01-04 广东威创视讯科技股份有限公司 桌面窗口共享方法和系统
US10331394B1 (en) * 2017-12-21 2019-06-25 Logmein, Inc. Manipulating shared screen content
CN110515580A (zh) * 2019-09-02 2019-11-29 联想(北京)有限公司 一种显示控制方法、装置及终端
CN111158624A (zh) * 2019-12-31 2020-05-15 维沃移动通信有限公司 一种应用分享方法、电子设备及计算机可读存储介质

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106293583A (zh) * 2016-08-03 2017-01-04 广东威创视讯科技股份有限公司 桌面窗口共享方法和系统
US10331394B1 (en) * 2017-12-21 2019-06-25 Logmein, Inc. Manipulating shared screen content
CN110515580A (zh) * 2019-09-02 2019-11-29 联想(北京)有限公司 一种显示控制方法、装置及终端
CN111158624A (zh) * 2019-12-31 2020-05-15 维沃移动通信有限公司 一种应用分享方法、电子设备及计算机可读存储介质

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2024027238A1 (zh) * 2022-08-05 2024-02-08 荣耀终端有限公司 一种多设备协同方法、电子设备及相关产品

Also Published As

Publication number Publication date
CN114442969A (zh) 2022-05-06

Similar Documents

Publication Publication Date Title
US20220342850A1 (en) Data transmission method and related device
WO2020238874A1 (zh) 一种vr多屏显示方法及电子设备
WO2021057830A1 (zh) 一种信息处理方法及电子设备
WO2022089294A1 (zh) 一种设备间屏幕协同方法及设备
US10983747B2 (en) Remote desktop mirroring
WO2022100315A1 (zh) 应用界面的生成方法及相关装置
WO2021129253A1 (zh) 显示多窗口的方法、电子设备和系统
WO2022100305A1 (zh) 画面跨设备显示方法与装置、电子设备
WO2022100237A1 (zh) 投屏显示方法及相关产品
WO2021121052A1 (zh) 一种多屏协同方法、系统及电子设备
US11853526B2 (en) Window display method, window switching method, electronic device, and system
WO2022105759A1 (zh) 视频处理方法、装置及存储介质
WO2024016559A1 (zh) 一种多设备协同方法、电子设备及相关产品
CN111221845A (zh) 一种跨设备信息搜索方法及终端设备
WO2022028494A1 (zh) 一种多设备数据协作的方法及电子设备
WO2022127661A1 (zh) 应用共享方法、电子设备和存储介质
WO2022033342A1 (zh) 数据传输方法和设备
WO2022089102A1 (zh) 一种控制方法、装置及电子设备
EP4095723B1 (en) Permission reuse method, permission reuse-based resource access method, and related device
WO2021052488A1 (zh) 一种信息处理方法及电子设备
CN114510186A (zh) 一种跨设备控制方法及设备
WO2023088459A1 (zh) 设备协同方法及相关装置
WO2022105716A1 (zh) 基于分布式控制的相机控制方法及终端设备
WO2022228004A1 (zh) 多屏协同过程中恢复窗口的方法、电子设备和系统
WO2024078337A1 (zh) 一种显示屏选择方法及电子设备

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21885021

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21885021

Country of ref document: EP

Kind code of ref document: A1