WO2023273460A1 - Screen projection display method and electronic device - Google Patents

Screen projection display method and electronic device

Info

Publication number
WO2023273460A1
Authority
WO
WIPO (PCT)
Prior art keywords
window
source device
display
interface
target device
Prior art date
Application number
PCT/CN2022/084100
Other languages
English (en)
French (fr)
Inventor
Ji Wei (吉伟)
Original Assignee
Huawei Technologies Co., Ltd. (华为技术有限公司)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co., Ltd.
Publication of WO2023273460A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/0486 Drag-and-drop
    • G06F3/14 Digital output to display device; Cooperation and interconnection of the display device with other functional units
    • G06F3/1454 Digital output to display device involving copying of the display data of a local workstation or window to a remote workstation or window so that an actual copy of the data is displayed simultaneously on two or more displays, e.g. teledisplay

Definitions

  • the present application relates to the technical field of terminals, and in particular, to a screen projection display method and an electronic device.
  • a source device such as a mobile phone (or called the source end) can project multiple applications to a target device such as a PC (or called the sink end) for display in the form of multiple windows.
  • for example, the mobile phone can project its desktop to the window 101 of the PC for display. Subsequently, the user can open other applications of the mobile phone on the PC by operating the application icons in the window 101. Still as shown in FIG. 1, when the user clicks the icon of the chat APP in the window 101, the mobile phone can project the display interface of the chat APP to the window 102 of the PC for display. For another example, after the user clicks the icon of the video APP in the window 101, the mobile phone can project the display interface of the video APP to the window 103 of the PC for display. At this time, the PC can display the interfaces of multiple applications of the mobile phone through multiple windows.
  • the user can manage each window projected from the mobile phone on the PC, for example, zooming or closing a window.
  • however, the windows displayed on the PC are relatively independent, so the process of managing each window is relatively cumbersome, resulting in a poor user experience.
  • in one aspect, the present application provides a screen projection display method, including: the target device displays a first window and a second window, where the first window includes the interface of a first application task projected by the source device, and the second window includes the interface of a second application task projected by the source device (that is, a multi-window screen projection scene); if the target device detects that the user inputs a window covering operation, which is used to cover the second window with the first window, the target device can send a corresponding window covering event to the source device; then, in response to the window covering event, the source device can obtain the projection data after the first window covers the second window, and send the projection data to the target device; the target device can then display a first interface according to the projection data. At this time, the first interface includes the interface of the first application task but does not include the interface of the second application task.
  • that is, the user can drag a window on the target device (such as a PC) to cover another window, triggering interaction between the target device and the source device (such as a mobile phone) to realize the window covering function: the display content of the covered window is closed while the display content of the dragged window is retained, so that the user can efficiently manage the multiple windows projected onto the target device, improving the user experience.
  • the source device may include a first display module (such as display 1), which is used to provide projection data to the target device during screen projection; the source device can obtain the corresponding projection data from the first display module.
  • before the window covering event is received, both the first application task and the second application task run on the first display module, with the second application task at the top of the stack of the first display module; after receiving the window covering event, the source device can move the first application task to the top of the stack of the first display module to run, so that the first window continues to display the interface of the first application task.
  • the source device may delete the second application task from the first display module (that is, the second application task is killed), and at this time, the source device may close (that is, destroy) the second window corresponding to the second application task.
  • alternatively, the source device can set the second application task to an invisible attribute in the first display module. In this case, the second application task is not killed, but is switched to the background of the source device to continue running. Correspondingly, the source device can also set the window attribute of the second window to invisible. In this way, although the second application task continues to run in the background of the source device, the user visually perceives that the covered second window is closed, while the dragged first window continues to display the interface of the first application task.
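  • as a purely illustrative model of this single-display-module flow (the Task and DisplayModule types below are simple stand-ins, not real Android framework classes), the handling of a window covering event can be sketched as follows:

```java
import java.util.ArrayDeque;
import java.util.Deque;

// Self-contained toy model of the single-display-module flow described above.
// "DisplayModule" is just a task stack here, not a real Android class.
public class WindowCoverDemo {
    static class Task {
        final String name;
        boolean visible = true;
        Task(String name) { this.name = name; }
    }

    static class DisplayModule {
        final Deque<Task> stack = new ArrayDeque<>(); // head == top of stack

        void push(Task t) { stack.addFirst(t); }

        // Move an already-running task to the top of the stack.
        void moveToTop(Task t) {
            stack.remove(t);
            stack.addFirst(t);
        }
    }

    // Handle a window covering event: the dragged task keeps its window
    // content; the covered task is either killed or hidden in the background.
    static void onWindowCover(DisplayModule display1, Task dragged, Task covered,
                              boolean keepCoveredInBackground) {
        display1.moveToTop(dragged);          // dragged window keeps its interface
        if (keepCoveredInBackground) {
            covered.visible = false;          // runs on, but its window looks closed
        } else {
            display1.stack.remove(covered);   // task is killed, window destroyed
        }
    }

    public static void main(String[] args) {
        DisplayModule display1 = new DisplayModule();
        Task video = new Task("video APP");   // first application task
        Task chat = new Task("chat APP");     // second application task
        display1.push(video);
        display1.push(chat);                  // chat is now at the top of the stack

        onWindowCover(display1, video, chat, /* keepCoveredInBackground= */ true);
        System.out.println("top of stack: " + display1.stack.peekFirst().name); // video APP
        System.out.println("chat visible: " + chat.visible);                    // false
    }
}
```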
  • alternatively, the source device may include a first display module (such as display 1) and a second display module (such as display 0). The first display module is used to provide projection data to the target device during screen projection, and the source device can obtain the corresponding projection data from the first display module; the second display module is used to provide display data to the source device and to provide projection data to the target device during screen projection, that is, the source device displays the data obtained from the second display module on its own display screen, and in the screen projection scenario it can also obtain the projection data of the main display interface from the second display module.
  • in this case, the first application task can run in the first display module and the second application task in the second display module; after the source device receives the window covering event, the source device may move the first application task from the first display module to the top of the stack of the second display module to run, that is, perform a stack-shifting operation. After the first application task is moved to the top of the stack of the second display module, the second application task is pushed down the stack, and the interface of the first application task can be displayed in the second windows of the source device and the target device. At this point, the source device can close the first window. In this way, the user visually perceives that the dragged first window is closed, while the covered second window continues to display the interface of the first application task.
  • in addition, to avoid restarting the first application task after the stack shift, the source device can set the configuration information of the first application task to be the same as the configuration information of the application task originally located at the top of the stack of the second display module (for example, the second application task). In this way, when the source device refreshes the second display module, it reads that the configuration information of the task at the top of the stack has not changed, and can therefore continue executing the first application task instead of reopening it from the start, realizing a seamless transition of window content; a toy model of this is sketched below.
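  • continuing in the same illustrative spirit (all types are stand-ins, and the configuration fields are assumptions, not values from this application), the stack shift with configuration hand-over can be sketched as:

```java
import java.util.ArrayDeque;
import java.util.Deque;

// Toy model (not real framework code) of the dual-display-module stack shift:
// the dragged task moves from display 1 to the top of display 0's stack and
// inherits the displaced task's configuration, so a refresh sees "no
// configuration change" and does not relaunch the moved task.
public class StackShiftDemo {
    static class Config {
        final int widthPx, heightPx, densityDpi; // assumed fields, for illustration
        Config(int w, int h, int d) { widthPx = w; heightPx = h; densityDpi = d; }
    }

    static class Task {
        final String name;
        Config config;
        Task(String name, Config config) { this.name = name; this.config = config; }
    }

    static class DisplayModule {
        final Deque<Task> stack = new ArrayDeque<>(); // head == top of stack
    }

    static void shiftStack(DisplayModule from, DisplayModule to, Task dragged) {
        from.stack.remove(dragged);
        Task displaced = to.stack.peekFirst();  // e.g. the second application task
        if (displaced != null) {
            dragged.config = displaced.config;  // seamless hand-over, no relaunch
        }
        to.stack.addFirst(dragged);             // displaced task is pushed down
    }

    public static void main(String[] args) {
        DisplayModule display1 = new DisplayModule(); // feeds the projection stream
        DisplayModule display0 = new DisplayModule(); // feeds the phone's own screen
        Task first = new Task("first application task", new Config(1920, 1080, 240));
        Task second = new Task("second application task", new Config(1080, 2340, 480));
        display1.stack.addFirst(first);
        display0.stack.addFirst(second);

        shiftStack(display1, display0, first);
        System.out.println(display0.stack.peekFirst().name); // first application task
        System.out.println(first.config == second.config);   // true
    }
}
```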
  • the window covering operation may specifically be: after the user drags the first window, a release operation input by the user when the overlapping area between the first window and the second window is greater than an area threshold. Alternatively, the window covering operation may be the user dragging the first window until the overlapping area between the first window and the second window exceeds the area threshold; that is, the window covering function can also be realized without the user inputting a release operation.
  • the above-mentioned window covering operation may also be other predefined operations, and this application does not impose any limitation on this.
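  • for illustration, a minimal target-side check for such a covering operation might look like the following sketch, where the 0.5 overlap ratio is an assumed threshold rather than a value specified by this application:

```java
// Self-contained sketch of target-side detection of the window covering
// operation: when the user releases a dragged window, report a cover event if
// the overlap with another window exceeds an area threshold.
public class CoverDetector {
    static class Win {
        final int left, top, right, bottom;
        Win(int l, int t, int r, int b) { left = l; top = t; right = r; bottom = b; }
        long area() { return (long) (right - left) * (bottom - top); }
    }

    static long overlapArea(Win a, Win b) {
        long w = Math.min(a.right, b.right) - Math.max(a.left, b.left);
        long h = Math.min(a.bottom, b.bottom) - Math.max(a.top, b.top);
        return (w <= 0 || h <= 0) ? 0 : w * h;
    }

    // Called on the release operation at the end of a drag.
    static boolean isCoverOperation(Win dragged, Win other, double ratioThreshold) {
        return overlapArea(dragged, other) > ratioThreshold * other.area();
    }

    public static void main(String[] args) {
        Win dragged = new Win(100, 100, 900, 700);
        Win other = new Win(400, 150, 1200, 750);
        System.out.println(isCoverOperation(dragged, other, 0.5)); // true
    }
}
```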
  • in another aspect, the present application provides a screen projection display method, including: the source device projects the interface of a first application task to a first window of the target device for display, and projects the interface of a second application task to a second window of the target device for display; subsequently, the source device may receive a window covering event sent by the target device, where the window covering event is used to indicate that the user has input a window covering operation for covering the second window with the first window; then, in response to the window covering event, the source device can obtain the projection data after the first window covers the second window, and send the projection data to the target device to realize the above window covering function.
  • in a possible implementation, the source device includes a first display module used to provide projection data to the target device during screen projection; before the source device receives the window covering event, both the first application task and the second application task run on the first display module, with the second application task at the top of the stack of the first display module; after the source device receives the window covering event, the method further includes: the source device moves the first application task to the top of the stack of the first display module to run; and the source device deletes the second application task from the first display module, or sets the second application task to an invisible attribute in the first display module.
  • in another possible implementation, the source device includes a first display module and a second display module; the first display module is used to provide projection data to the target device during screen projection, and the second display module is used to provide display data to the source device and to provide projection data to the target device during screen projection; before the source device receives the window covering event, the first application task runs in the first display module and the second application task runs in the second display module; after the source device receives the window covering event, the method further includes: the source device moves the first application task from the first display module to the top of the stack of the second display module to run.
  • in another aspect, the present application provides a screen projection display method, including: the target device displays a first window and a second window, where the first window includes the interface of a first application task projected by the source device, and the second window includes the interface of a second application task projected by the source device; if the target device detects that the user inputs a window covering operation, which is used to cover the second window with the first window, the target device can send a corresponding window covering event to the source device; the target device then receives the projection data sent by the source device in response to the window covering event, and displays a first interface according to the projection data, where the first interface includes the interface of the first application task but does not include the interface of the second application task.
  • in one case, the interface of the first application task in the first interface is located in the first window; in another case, where the display content of the second window is synchronized with the main display interface of the source device, the interface of the first application task in the first interface is located in the second window.
  • the window covering operation refers to a release operation input by the user when, after dragging the first window, the overlapping area between the first window and the second window is greater than an area threshold.
  • in another aspect, the present application provides a screen projection display method, including: the target device displays a first window and a second window, where the first window includes the interface of a first application task projected by the source device, and the second window includes the interface of a second application task projected by the source device (that is, a multi-window screen projection scene); subsequently, if the target device detects that the user inputs a window merging operation, which is used to merge the first window and the second window, the target device can send a corresponding window merging event to the source device; then, in response to the window merging event, the source device can obtain the projection data after the first window and the second window are merged, and send the projection data to the target device; the target device may then display a first interface according to the projection data, in which the interface of the first application task and the interface of the second application task are displayed in a split-screen manner.
  • that is, the user can drag a window on the target device to merge it with another window, triggering interaction between the target device and the source device to realize the window merging function: the display content of the dragged window and that of the covered window are displayed on the target device in split-screen form, so that the user can efficiently manage the multiple windows projected onto the target device, improving the user experience.
  • in a possible implementation, after the target device sends the window merging event to the source device, the method further includes: in response to the window merging event, the source device sets the window attributes of the first window and the second window to split-screen windows. For example, the WMS in the source device may set the window attributes of the first window and the second window to split-screen windows (or called split-screen attributes).
  • the source device may include a first display module (such as display 1) and a second display module (such as display 0). The first display module is used to provide projection data to the target device during screen projection, and the source device can obtain the corresponding projection data from the first display module; the second display module is used to provide display data to the source device and to provide projection data to the target device during screen projection, that is, the source device displays the data obtained from the second display module on its own display screen, and in the screen projection scenario it can also obtain the projection data of the main display interface from the second display module.
  • the source device may draw the interface of the first application task in the left split-screen window, and draw the interface of the second application task in the right split-screen window.
  • the source device may draw the interface of the first application task in the upper split-screen window, and draw the interface of the second application task in the lower split-screen window.
  • the source device can send the interface containing the two split-screen windows (that is, the projection data) to the target device in the form of a video stream, and the target device can display the split-screen interfaces of the first application task and the second application task through one window (for example, a third window).
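  • as an illustration of how such split-screen bounds might be computed (the class names and the even 50/50 split are assumptions, not details from this application):

```java
// Self-contained sketch of computing split-screen bounds for the merged
// window: dragged task on the left (or top), covered task on the right
// (or bottom).
public class SplitScreenLayout {
    static class Bounds {
        final int left, top, right, bottom;
        Bounds(int l, int t, int r, int b) { left = l; top = t; right = r; bottom = b; }
        @Override public String toString() {
            return "(" + left + "," + top + ")-(" + right + "," + bottom + ")";
        }
    }

    static Bounds[] splitLeftRight(int width, int height) {
        int mid = width / 2;
        return new Bounds[] { new Bounds(0, 0, mid, height),       // first application task
                              new Bounds(mid, 0, width, height) }; // second application task
    }

    static Bounds[] splitTopBottom(int width, int height) {
        int mid = height / 2;
        return new Bounds[] { new Bounds(0, 0, width, mid),        // first application task
                              new Bounds(0, mid, width, height) }; // second application task
    }

    public static void main(String[] args) {
        Bounds[] panes = splitLeftRight(1920, 1080);
        System.out.println("first task pane:  " + panes[0]);  // (0,0)-(960,1080)
        System.out.println("second task pane: " + panes[1]);  // (960,0)-(1920,1080)
    }
}
```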
  • each window displayed on the target device may contain a boundary hot zone; for example, the boundary hot zone may be set in an area close to the edge of the window. The window merging operation may then be: after the user drags the first window, a release operation input by the user when the overlapping area between the boundary hot zone of the first window and the boundary hot zone of the second window is greater than an area threshold. Alternatively, the window merging operation may be the user dragging the first window until the overlapping area between the boundary hot zone of the first window and the boundary hot zone of the second window exceeds the area threshold; that is, the window merging function can also be realized without the user inputting a release operation.
  • the above-mentioned window merging operation may also be other predefined operations, and this application does not impose any limitation on this.
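  • a self-contained sketch of one way the boundary hot zone test could work is shown below; the 24-pixel band width and the area threshold are assumed values, not taken from this application:

```java
// Each window edge owns a thin band just inside the window; a merge is
// signalled when the dragged window's bands overlap another window's bands
// by more than a threshold area.
public class BoundaryHotZone {
    static final int BAND = 24;               // assumed hot-zone width in pixels
    static final long AREA_THRESHOLD = 2000;  // assumed threshold in px^2

    static class Rect {
        final int l, t, r, b;
        Rect(int l, int t, int r, int b) { this.l = l; this.t = t; this.r = r; this.b = b; }
    }

    // The four edge bands that make up a window's boundary hot zone.
    static Rect[] hotZones(Rect w) {
        return new Rect[] {
            new Rect(w.l, w.t, w.r, w.t + BAND),  // top band
            new Rect(w.l, w.b - BAND, w.r, w.b),  // bottom band
            new Rect(w.l, w.t, w.l + BAND, w.b),  // left band
            new Rect(w.r - BAND, w.t, w.r, w.b),  // right band
        };
    }

    static long overlap(Rect a, Rect b) {
        long w = Math.min(a.r, b.r) - Math.max(a.l, b.l);
        long h = Math.min(a.b, b.b) - Math.max(a.t, b.t);
        return (w <= 0 || h <= 0) ? 0 : w * h;
    }

    static boolean hotZonesOverlap(Rect dragged, Rect other) {
        long total = 0;
        for (Rect a : hotZones(dragged))
            for (Rect b : hotZones(other))
                total += overlap(a, b);
        return total > AREA_THRESHOLD;
    }

    public static void main(String[] args) {
        Rect dragged = new Rect(0, 0, 800, 600);
        Rect other = new Rect(790, 0, 1590, 600);  // right edge meets left edge
        System.out.println(hotZonesOverlap(dragged, other)); // true
    }
}
```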
  • in another aspect, the present application provides a screen projection display method, including: the source device projects the interface of a first application task to a first window of the target device for display, and projects the interface of a second application task to a second window of the target device for display; subsequently, the source device may receive a window merging event sent by the target device, where the window merging event is used to indicate that the user has input a window merging operation for merging the first window and the second window; in response to the window merging event, the source device can obtain the projection data after the first window and the second window are merged, and send the projection data to the target device to realize the above window merging function.
  • in a possible implementation, after the source device receives the window merging event sent by the target device, the method further includes: in response to the window merging event, the source device sets the window attributes of the first window and the second window to split-screen windows.
  • in another possible implementation, the source device includes a first display module and a second display module; the first display module is used to provide projection data to the target device during screen projection, and the second display module is used to provide display data to the source device and to provide projection data to the target device during screen projection; before the window merging event is received, the first application task runs in the first display module and the second application task runs in the second display module; after the source device receives the window merging event, the method further includes: the source device moves the first application task from the first display module to the top of the stack of the second display module to run.
  • after the target device sends the window merging event to the source device, the method may further include: in response to the window merging event, the source device displays the interface of the first application task and the interface of the second application task in split screen in a second interface.
  • in another aspect, the present application provides a screen projection display method, including: the target device displays a first window and a second window, where the first window includes the interface of a first application task projected by the source device, and the second window includes the interface of a second application task projected by the source device; the target device detects that the user inputs a window merging operation, which is used to merge the first window and the second window; in response to the window merging operation, the target device can send a window merging event to the source device; the target device receives the projection data sent by the source device in response to the window merging event, and displays a first interface according to the projection data, in which the interface of the first application task and the interface of the second application task are displayed in split screen.
  • each window displayed on the target device may contain a boundary hot zone; the window merging operation refers to: after the user drags the first window, a release operation input by the user when the overlapping area between the boundary hot zone of the first window and the boundary hot zone of the second window is greater than an area threshold.
  • in another aspect, the present application provides a screen projection display method, including: the target device displays a first window and a second window, where the first window includes the interface of a first application task projected by the source device, and the second window includes the interface of a second application task projected by the source device (that is, a multi-window screen projection scene); subsequently, if the target device detects that the user inputs a window floating operation, which is used to float the first window on the second window, the target device can send a corresponding window floating event to the source device; in response to the window floating event, the source device can obtain the projection data of the first window floating on the second window, and send the projection data to the target device; the target device may then display a first interface according to the projection data, in which the interface of the first application task is displayed on the second window in the form of a floating window.
  • that is, the user can drag a window on the target device to float it on another window, triggering interaction between the target device and the source device to realize the window floating function: the display content of the dragged window floats over the other window, so that the user can efficiently manage the multiple windows projected onto the target device, improving the user experience.
  • the method further includes: in response to the window floating event, the source device sets the window attribute of the first window to that of a floating window. For example, the WMS in the source device may set the window attribute of the first window to a floating-window attribute.
  • the source device may include a first display module (such as display 1), which is used to provide projection data to the target device during screen projection; the source device can obtain the corresponding projection data from the first display module.
  • before the window floating event is received, both the first application task and the second application task can run on the first display module, with the second application task at the top of the stack of the first display module; after receiving the window floating event, the source device may move the first application task to the top of the stack of the first display module to run. After the first application task is moved to the top of the stack, the second application task is pushed down the stack but can still be set as visible. Since the first window where the first application task is located has the floating-window attribute, while the second window where the second application task is located has the default attribute, the source device can draw the second window in full screen in the first display module, with the first window floating above it.
  • the source device can then send the interface containing the first application task and the second application task (that is, the projection data) to the target device in the form of a video stream, so that the target device displays an interface in which the first window floats on the second window.
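  • the layering described here can be modeled as a simple back-to-front draw order; the types below are illustrative only, not the real window manager:

```java
import java.util.ArrayList;
import java.util.List;

// Toy model of composing one projected frame when a window has the floating
// attribute: default windows are drawn full screen at the back, floating
// windows are drawn last (on top) at their own bounds.
public class FloatingComposer {
    enum Attr { DEFAULT, SPLIT_SCREEN, FLOATING } // attributes mentioned in the text

    static class Window {
        final String task; final Attr attr;
        Window(String task, Attr attr) { this.task = task; this.attr = attr; }
    }

    // Returns the windows in back-to-front draw order for one frame.
    static List<Window> drawOrder(List<Window> windows) {
        List<Window> order = new ArrayList<>();
        for (Window w : windows) if (w.attr != Attr.FLOATING) order.add(w); // back layer
        for (Window w : windows) if (w.attr == Attr.FLOATING) order.add(w); // top layer
        return order;
    }

    public static void main(String[] args) {
        List<Window> windows = List.of(
            new Window("first application task", Attr.FLOATING),
            new Window("second application task", Attr.DEFAULT));
        for (Window w : drawOrder(windows))
            System.out.println(w.task + " (" + w.attr + ")");
        // prints the second task first (full screen), then the floating first task
    }
}
```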
  • alternatively, the source device may include a first display module (such as display 1) and a second display module (such as display 0). The first display module is used to provide projection data to the target device during screen projection, and the source device can obtain the corresponding projection data from the first display module; the second display module is used to provide display data to the source device and to provide projection data to the target device during screen projection, that is, the source device displays the data obtained from the second display module on its own display screen, and in the screen projection scenario it can also obtain the projection data of the main display interface from the second display module.
  • before the window floating event is received, the first application task can run in the first display module and the second application task in the second display module; after the source device receives the window floating event, the source device can move the first application task from the first display module to the top of the stack of the second display module to run, that is, perform a stack-shifting operation. After the first application task is moved to the top of the stack of the second display module, the second application task is pushed down the stack and can still be set as visible. Since the first window where the first application task is located has the floating-window attribute, while the second window where the second application task is located has the default attribute, the source device can draw the second window in full screen in the second display module.
  • the interface of the first application task is located on the upper layer of the interface of the second application task.
  • the source device can send the interface containing the first application task and the second application task (that is, the projection data) to the target device in the form of a video stream, so that the target device displays the interface in which the first window floats on the second window.
  • the source device can obtain the interface containing the first application task and the second application task from the second display module, and display the first application task and the second application task synchronously with the target device.
  • the window floating operation may be: after the user drags the first window, a release operation input by the user when the overlapping area between the first window and the second window is greater than the area threshold and the duration is greater than a time threshold. Alternatively, the window floating operation may be the user dragging the first window until the overlapping area between the first window and the second window exceeds the area threshold for longer than the time threshold; that is, the window floating function can also be realized without the user inputting a release operation.
  • the above-mentioned window floating operation may also be another predefined operation, and this application does not impose any limitation on this.
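  • the extra time threshold is what distinguishes the floating operation from the covering operation; a minimal detection sketch follows, where both threshold values are assumptions:

```java
// Target-side sketch: the overlap must both exceed an area threshold and
// persist longer than a time threshold before the drag is treated as a
// window floating operation.
public class FloatDetector {
    static final double AREA_RATIO = 0.5; // assumed area threshold (ratio)
    static final long HOLD_MILLIS = 800;  // assumed time threshold

    private long overlapSince = -1;       // -1 => overlap not yet above threshold

    // Called repeatedly while the user drags; ratio is the current overlap
    // ratio between the dragged window and the window beneath it.
    boolean onDragMove(double ratio, long nowMillis) {
        if (ratio <= AREA_RATIO) {
            overlapSince = -1;            // overlap lost: restart the timer
            return false;
        }
        if (overlapSince < 0) overlapSince = nowMillis;
        return nowMillis - overlapSince > HOLD_MILLIS; // true => treat as floating
    }

    public static void main(String[] args) {
        FloatDetector d = new FloatDetector();
        System.out.println(d.onDragMove(0.6, 0));    // false: timer just started
        System.out.println(d.onDragMove(0.6, 1000)); // true: held long enough
    }
}
```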
  • in another aspect, the present application provides a screen projection display method, including: the source device projects the interface of a first application task to a first window of the target device for display, and projects the interface of a second application task to a second window of the target device for display; the source device can receive a window floating event sent by the target device, where the window floating event is used to indicate that the user has input a window floating operation for floating the first window on the second window; in response to the window floating event, the source device can obtain the projection data of the first window floating on the second window, and send the projection data to the target device to realize the above window floating function.
  • the source device may also set the window attribute of the first window as a floating window.
  • in a possible implementation, the source device includes a first display module used to provide projection data to the target device during screen projection; before the source device receives the window floating event, both the first application task and the second application task run on the first display module, with the second application task at the top of the stack of the first display module; after the source device receives the window floating event, the method further includes: the source device moves the first application task to the top of the stack of the first display module to run.
  • in another aspect, the present application provides a screen projection display method, including: the target device displays a first window and a second window, where the first window includes the interface of a first application task projected by the source device, and the second window includes the interface of a second application task projected by the source device; the target device detects that the user inputs a window floating operation, which is used to float the first window on the second window; in response to the window floating operation, the target device can send a window floating event to the source device; the target device receives the projection data sent by the source device in response to the window floating event, and displays a first interface according to the projection data, in which the interface of the first application task is displayed on the second window in the form of a floating window.
  • the above window covering operation, window merging operation, and window floating operation may each be any predefined operation, and this application does not impose any limitation on this.
  • the source device may be triggered to interact with the target device to realize the above window floating function.
  • the source device may be triggered to interact with the target device to realize the above-mentioned window covering function.
  • the source device and the target device can also combine the window covering function, the window merging function, and the window floating function through the above method.
  • the source device and the target device may first merge window 1 and window 2 in response to user operation 1, and then merge the merged window with window 3 in response to user operation 2.
  • the source device and the target device may display window 1 floating on window 2 in response to user operation 1, and then merge the window including the floating window with window 3 in response to user operation 2.
  • in another aspect, the present application provides an electronic device, which is a source device, including: a memory, a display screen, a communication module, and one or more processors; the memory, the display screen, and the processors are coupled. The memory is used to store computer program code, and the computer program code includes computer instructions; when the electronic device runs, the processors execute the computer instructions stored in the memory, so that the electronic device executes the screen projection display method performed by the source device in any one of the above aspects.
  • in another aspect, the present application provides an electronic device, which is a target device, including: a memory, a display screen, and one or more processors; the memory, the display screen, and the processors are coupled. The memory is used to store computer program code, and the computer program code includes computer instructions; when the electronic device runs, the processors execute the computer instructions stored in the memory, so that the electronic device executes the screen projection display method performed by the target device in any one of the above aspects.
  • in another aspect, the present application provides a screen projection display system, including the above source device and target device; through interaction, the source device and the target device can execute the screen projection display method described in any one of the above aspects.
  • in another aspect, the present application provides a computer-readable storage medium, where the computer-readable storage medium includes computer instructions; when the computer instructions run on an electronic device, the electronic device is caused to execute the screen projection display method described in any one of the above aspects.
  • in another aspect, the present application provides a computer program product which, when run on an electronic device, causes the electronic device to execute the screen projection display method described in any one of the above aspects.
  • FIG. 1 is a schematic diagram of a multi-window screen projection scene in the prior art;
  • FIG. 3 is a schematic diagram of application scenario 2 of a screen projection display method provided by an embodiment of the present application;
  • FIG. 4 is a schematic structural diagram of a mobile phone provided by an embodiment of the present application;
  • FIG. 5 is a schematic diagram of the architecture of an operating system in a mobile phone provided by an embodiment of the present application;
  • FIG. 6 is schematic diagram 1 of multi-window screen projection provided by an embodiment of the present application;
  • FIG. 7 is schematic diagram 2 of multi-window screen projection provided by an embodiment of the present application;
  • FIG. 8 is schematic diagram 3 of multi-window screen projection provided by an embodiment of the present application;
  • FIG. 9 is a schematic diagram of application scenario 3 of a screen projection display method provided by an embodiment of the present application;
  • FIG. 10 is a schematic diagram of application scenario 4 of a screen projection display method provided by an embodiment of the present application;
  • FIG. 11 is interaction schematic diagram 1 of a screen projection display method provided by an embodiment of the present application;
  • FIG. 12 is a schematic diagram of application scenario 5 of a screen projection display method provided by an embodiment of the present application;
  • FIG. 13 is a schematic diagram of application scenario 6 of a screen projection display method provided by an embodiment of the present application;
  • FIG. 15 is a schematic diagram of application scenario 7 of a screen projection display method provided by an embodiment of the present application;
  • FIG. 16 is a schematic diagram of application scenario 8 of a screen projection display method provided by an embodiment of the present application;
  • FIG. 17 is a schematic diagram of application scenario 9 of a screen projection display method provided by an embodiment of the present application;
  • FIG. 20 is a schematic diagram of application scenario 11 of a screen projection display method provided by an embodiment of the present application;
  • FIG. 21 is interaction schematic diagram 4 of a screen projection display method provided by an embodiment of the present application;
  • FIG. 22 is a schematic diagram of application scenario 12 of a screen projection display method provided by an embodiment of the present application;
  • FIG. 24 is a schematic diagram of application scenario 14 of a screen projection display method provided by an embodiment of the present application;
  • FIG. 26 is interaction schematic diagram 5 of a screen projection display method provided by an embodiment of the present application;
  • FIG. 27 is a schematic diagram of application scenario 16 of a screen projection display method provided by an embodiment of the present application;
  • FIG. 30 is a schematic diagram of application scenario 18 of a screen projection display method provided by an embodiment of the present application;
  • FIG. 31 is a schematic diagram of application scenario 19 of a screen projection display method provided by an embodiment of the present application;
  • FIG. 33 is a schematic structural diagram of a target device provided by an embodiment of the present application.
  • the terms "first" and "second" are used for descriptive purposes only, and cannot be understood as indicating or implying relative importance or implicitly specifying the quantity of the indicated technical features. Thus, a feature defined with "first" or "second" may explicitly or implicitly include one or more of these features. In the description of the embodiments, unless otherwise specified, "plurality" means two or more.
  • the source device (or called the source end) can establish a connection with the target device (or called the sink end).
  • the source device can establish a Wi-Fi connection, a Bluetooth connection, or a P2P (peer-to-peer) connection with the target device.
  • subsequently, the source device can project images, documents, audio, video, or applications (or tasks in an application) in the source device to the target device for display or playback through the Miracast protocol or the DLNA (Digital Living Network Alliance) protocol, so that the user can use the related functions provided by the source device on the target device.
  • the electronic tag 201 is generally provided with a coil, and the device information of the PC can be written into the coil of the electronic tag 201 in advance when the PC leaves the factory.
  • the device information of the PC may include one or more items such as the name of the PC, a Bluetooth MAC (media access control) address, or an IP address.
  • certainly, the mobile phone can also be triggered to establish a connection with the PC by searching for nearby devices or through gestures such as drag and drop, or the mobile phone can establish a connection with the PC using communication technologies such as UWB (ultra-wideband), which is not limited in this embodiment of the present application.
  • the user can use the keyboard or mouse of the PC to input the operation of opening the video APP in the window 203 .
  • the PC can send the operation input by the user to the mobile phone, triggering the mobile phone to project the video APP to the PC in response to the operation.
  • the mobile phone can run a video APP in the background.
  • the mobile phone can send the display data generated when running various tasks of the video APP (that is, the display interface 205 of the video APP) to the PC in the form of a video stream.
  • the PC can display the display interface 205 of the video APP through the window 206 . At this point, the PC can still display the display interface 202 synchronously with the mobile phone in the window 203.
  • that is, the mobile phone can not only project the interface of the application currently being displayed (subsequently referred to as the main display interface) to a window of the PC for display, but can also project the interfaces of other applications not being displayed on the mobile phone to windows of the PC, so that the PC can display multiple applications projected from the mobile phone in the form of multiple windows.
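  • on Android, this kind of off-screen multi-window projection is commonly built on a virtual display whose output surface feeds a video encoder. The sketch below shows that public-API pattern; it is an assumption that the method described here is implemented this way, and a third-party app would normally obtain the virtual display via MediaProjection#createVirtualDisplay rather than DisplayManager:

```java
import android.hardware.display.DisplayManager;
import android.hardware.display.VirtualDisplay;
import android.media.MediaCodec;
import android.media.MediaCodecInfo;
import android.media.MediaFormat;
import android.view.Surface;
import java.io.IOException;

// Minimal sketch: render projected app windows into an H.264 encoder via a
// virtual display, yielding the video stream sent to the sink device.
// Draining the encoder's output buffers and sending them over the network
// is omitted for brevity.
public class ProjectionStream {
    public static VirtualDisplay start(DisplayManager dm, int width, int height, int dpi)
            throws IOException {
        MediaFormat format = MediaFormat.createVideoFormat(
                MediaFormat.MIMETYPE_VIDEO_AVC, width, height);
        format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
                MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);
        format.setInteger(MediaFormat.KEY_BIT_RATE, 8_000_000);
        format.setInteger(MediaFormat.KEY_FRAME_RATE, 30);
        format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1);

        MediaCodec encoder = MediaCodec.createEncoderByType(MediaFormat.MIMETYPE_VIDEO_AVC);
        encoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
        Surface input = encoder.createInputSurface(); // frames drawn here get encoded
        encoder.start();

        // Apps moved onto this display (e.g. "display 1" in the text) are
        // rendered into the encoder surface instead of the phone's own screen.
        return dm.createVirtualDisplay("projection", width, height, dpi, input,
                DisplayManager.VIRTUAL_DISPLAY_FLAG_PRESENTATION);
    }
}
```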
  • the user can manage the display content of each window in the PC.
  • the user may input a pause operation into the window 206 to trigger the PC to pause and play the video being played by the video APP in the window 206 .
  • the user may click the minimize button in the window 206 to trigger the PC to minimize the window 206 .
  • the mobile phone can include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (universal serial bus, USB) interface 130, an antenna 1, an antenna 2, a mobile communication module 150, and a wireless communication module 160, audio module 170, speaker 170A, receiver 170B, microphone 170C, earphone interface 170D, sensor module 180, etc.
  • the structure shown in the embodiment of the present invention does not constitute a specific limitation on the mobile phone.
  • the mobile phone may include more or fewer components than shown in the figure, or combine certain components, or separate certain components, or arrange different components.
  • the illustrated components can be realized in hardware, software or a combination of software and hardware.
  • the processor 110 may include one or more processing units; for example, the processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a memory, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc. Different processing units may be independent devices, or may be integrated in one or more processors.
  • a memory may also be provided in the processor 110 for storing instructions and data.
  • the memory in processor 110 is a cache memory.
  • the memory may hold instructions or data that the processor 110 has just used or uses cyclically. If the processor 110 needs to use the instructions or data again, they can be called directly from the memory, which avoids repeated access and reduces the waiting time of the processor 110, thereby improving system efficiency.
  • the wireless communication function of the mobile phone can be realized by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, the modem processor and the baseband processor.
  • Antenna 1 and Antenna 2 are used to transmit and receive electromagnetic wave signals.
  • Each antenna in a mobile phone can be used to cover single or multiple communication bands. Different antennas can also be multiplexed to improve the utilization of the antennas.
  • Antenna 1 can be multiplexed as a diversity antenna of a wireless local area network.
  • the antenna may be used in conjunction with a tuning switch.
  • the mobile communication module 150 can provide wireless communication solutions including 2G/3G/4G/5G applied to mobile phones.
  • the mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (low noise amplifier, LNA) and the like.
  • the mobile communication module 150 can receive electromagnetic waves through the antenna 1, filter and amplify the received electromagnetic waves, and send them to the modem processor for demodulation.
  • the mobile communication module 150 can also amplify the signals modulated by the modem processor, and convert them into electromagnetic waves through the antenna 1 for radiation.
  • at least part of the functional modules of the mobile communication module 150 may be set in the processor 110 .
  • at least part of the functional modules of the mobile communication module 150 and at least part of the modules of the processor 110 may be set in the same device.
  • the wireless communication module 160 can provide wireless communication solutions applied to the mobile phone, including wireless local area network (WLAN) (such as a wireless fidelity (Wi-Fi) network), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), and infrared (IR) technology.
  • the wireless communication module 160 may be one or more devices integrating at least one communication processing module.
  • the wireless communication module 160 receives electromagnetic waves via the antenna 2 , frequency-modulates and filters the electromagnetic wave signals, and sends the processed signals to the processor 110 .
  • the wireless communication module 160 can also receive the signal to be sent from the processor 110 , frequency-modulate it, amplify it, and convert it into electromagnetic waves through the antenna 2 for radiation.
  • the antenna 1 of the mobile phone is coupled to the mobile communication module 150, and the antenna 2 is coupled to the wireless communication module 160, so that the mobile phone can communicate with the network and other devices through wireless communication technology.
  • the wireless communication technology may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technology, etc.
  • the GNSS may include a global positioning system (GPS), a global navigation satellite system (GLONASS), a BeiDou navigation satellite system (BDS), a quasi-zenith satellite system (QZSS), and/or a satellite based augmentation system (SBAS).
  • the mobile phone realizes the display function through the GPU, the display screen 194, and the application processor.
  • the GPU is a microprocessor for image processing, and is connected to the display screen 194 and the application processor. GPUs are used to perform mathematical and geometric calculations for graphics rendering.
  • Processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
  • the display screen 194 is used to display images, videos and the like.
  • the display screen 194 includes a display panel.
  • the display panel can be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a MiniLED, a MicroLED, a Micro-OLED, a quantum dot light-emitting diode (QLED), etc.
  • the mobile phone may include 1 or N display screens 194, where N is a positive integer greater than 1.
  • the mobile phone can realize shooting function through ISP, camera 193 , video codec, GPU, display screen 194 and application processor.
  • the ISP is used for processing the data fed back by the camera 193 .
  • the light is transmitted to the photosensitive element of the camera through the lens, and the light signal is converted into an electrical signal, and the photosensitive element of the camera transmits the electrical signal to the ISP for processing, and converts it into an image visible to the naked eye.
  • ISP can also perform algorithm optimization on image noise, brightness, and skin color.
  • ISP can also optimize the exposure, color temperature and other parameters of the shooting scene.
  • the ISP may be located in the camera 193 .
  • Camera 193 is used to capture still images or video.
  • the object generates an optical image through the lens and projects it to the photosensitive element.
  • the photosensitive element may be a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor.
  • the photosensitive element converts the light signal into an electrical signal, and then transmits the electrical signal to the ISP to convert it into a digital image signal.
  • the ISP outputs the digital image signal to the DSP for processing.
  • DSP converts digital image signals into standard RGB, YUV and other image signals.
  • the mobile phone may include 1 or N cameras 193, where N is a positive integer greater than 1.
  • Digital signal processors are used to process digital signals. In addition to digital image signals, they can also process other digital signals. For example, when the mobile phone selects the frequency point, the digital signal processor is used to perform Fourier transform on the frequency point energy.
  • Video codecs are used to compress or decompress digital video.
  • a mobile phone can support one or more video codecs.
  • the mobile phone can play or record videos in multiple encoding formats, such as moving picture experts group (MPEG) 1, MPEG2, MPEG3, and MPEG4.
  • the external memory interface 120 can be used to connect an external memory card, such as a Micro SD card, to expand the storage capacity of the mobile phone.
  • the external memory card communicates with the processor 110 through the external memory interface 120 to implement a data storage function, such as saving music, video, and other files in the external memory card.
  • the internal memory 121 may be used to store computer-executable program codes including instructions.
  • the processor 110 executes various functional applications and data processing of the mobile phone by executing instructions stored in the internal memory 121 .
  • the internal memory 121 may include an area for storing programs and an area for storing data.
  • the stored program area can store an operating system, at least one application program required by a function (such as a sound playing function, an image playing function, etc.) and the like.
  • the storage data area can store data (such as audio data, phone book, etc.) created during the use of the mobile phone.
  • the internal memory 121 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, a universal flash storage (UFS), and the like.
  • the mobile phone can realize the audio function through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the earphone interface 170D, and the application processor, such as music playback and recording.
  • the audio module 170 is used to convert digital audio information into analog audio signal output, and is also used to convert analog audio input into digital audio signal.
  • the audio module 170 may also be used to encode and decode audio signals.
  • the audio module 170 may be set in the processor 110 , or some functional modules of the audio module 170 may be set in the processor 110 .
  • the speaker 170A, also referred to as a "loudspeaker", is used to convert audio electrical signals into sound signals.
  • the mobile phone can play music through the speaker 170A, or answer a call in hands-free mode.
  • the receiver 170B, also called an "earpiece", is used to convert audio electrical signals into sound signals.
  • the receiver 170B can be placed close to the human ear to listen to the voice.
  • the microphone 170C, also called a "mike", is used to convert sound signals into electrical signals.
  • the user can put his mouth close to the microphone 170C to make a sound, and input the sound signal to the microphone 170C.
  • the mobile phone may be provided with at least one microphone 170C.
  • the mobile phone can be provided with two microphones 170C, which can also implement a noise reduction function in addition to collecting sound signals.
  • the mobile phone can also be equipped with three, four or more microphones 170C to realize the collection of sound signals, noise reduction, identification of sound sources, and realization of directional recording functions, etc.
  • the earphone interface 170D is used for connecting wired earphones.
  • the earphone interface 170D can be the USB interface 130, a 3.5 mm open mobile terminal platform (OMTP) standard interface, or a cellular telecommunications industry association of the USA (CTIA) standard interface.
  • the sensor module 180 may include a pressure sensor, a gyroscope sensor, an air pressure sensor, a magnetic sensor, an acceleration sensor, a distance sensor, a proximity light sensor, a fingerprint sensor, a temperature sensor, a touch sensor, an ambient light sensor, a bone conduction sensor, and the like.
  • the mobile phone may also include a charging management module, a power management module, a battery, buttons, an indicator, and one or more SIM card interfaces, etc., which are not limited in this embodiment of the present application.
  • the software system of the above-mentioned mobile phone may adopt a layered architecture, an event-driven architecture, a micro-kernel architecture, a micro-service architecture, or a cloud architecture.
  • the embodiment of this application takes the Android system with a layered architecture as an example to illustrate the software structure of the mobile phone.
  • FIG. 5 is a block diagram of the software structure of the mobile phone according to the embodiment of the present application.
  • the layered architecture divides the software into several layers, and each layer has a clear role and division of labor. Layers communicate through software interfaces.
  • the Android system is divided into four layers; from top to bottom, they are the application layer, the application framework layer, the Android runtime and system library, and the kernel layer. Layers communicate through software interfaces.
  • the application layer can consist of a series of application packages (APPs), such as call, memo, browser, contacts, camera, gallery, calendar, map, Bluetooth, music, video, and short message.
  • the application framework layer provides an application programming interface (application programming interface, API) and a programming framework for applications in the application layer.
  • the application framework layer includes some predefined functions.
  • the application framework layer may include an activity management service (ActivityManagerService, AMS) and a window management service (WindowManagerService, WMS).
  • AMS can be used to manage the life cycle of applications at runtime.
  • Applications usually run in the operating system in the form of Activities; for each Activity, the system maintains a corresponding application record (ActivityRecord).
  • This ActivityRecord records the state of the application's Activity, and the activity manager can use this ActivityRecord as an identifier to schedule the application's Activity process.
  • a mobile phone running a desktop (also called a desktop application, launcher, etc.) is used as an example.
  • AMS can create a corresponding application stack (stack) for the desktop in the default display module (such as display0) of the mobile phone, such as stack A1.
  • display 0 corresponds to the display screen of the mobile phone, that is, the display interface drawn in display 0 will eventually be output to the display screen of the mobile phone.
  • the display module can also be called a virtual screen or a virtual display.
  • display 0 may also include stacks corresponding to other applications in the mobile phone, such as stack A2 and so on.
  • the stack at the top of the stack in display 0 is the stack of the application running in the foreground of the mobile phone.
  • a stack below the top of the stack, such as stack A2, is the stack of an application running in the background of the mobile phone.
  • AMS can set related stack attributes for each stack, such as whether it is visible, whether it is a split-screen application, and so on.
  • Generally, the stack at the top of the stack has the visible attribute, and the display interface generated when this type of stack runs can be displayed on the screen of the mobile phone; a stack that is not at the top of the stack has the invisible attribute, and the display interface it generates will not be displayed on the mobile phone screen.
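  • As an illustrative aid only, the visibility rule above can be modeled with a short sketch; the DisplayModule class and its fields below are assumptions for illustration, not the Android implementation:

      import java.util.ArrayDeque;
      import java.util.Deque;

      // Illustrative model of a display module holding application stacks.
      // Only the stack at the top is treated as visible, matching the rule
      // described above for the phone's default display module (display 0).
      class DisplayModule {
          private final Deque<String> stacks = new ArrayDeque<>();

          void push(String stackName) { stacks.push(stackName); }

          boolean isVisible(String stackName) {
              // Only the top-of-stack application is visible on screen.
              return stackName.equals(stacks.peek());
          }
      }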
  • the WMS can also output the display interface 601 generated by stack A1 at the top of the stack in display 0 to the PC, so that the PC displays, in the corresponding window, the display interface 601 being displayed by the mobile phone, thereby realizing the screen projection function in a multi-device collaboration scenario.
  • the PC can send a corresponding screen projection command 1 to the mobile phone, instructing the mobile phone to project the application task of the video APP in the mobile phone to the PC for display.
  • AMS can create a new display module (such as display 1), and then create a stack B1 corresponding to the video APP in display 1.
  • display 1 corresponds to the display screen of the PC, that is, the display interface drawn in display 1 will eventually be output to the display screen of the PC.
  • stack B1 can include one or more activities that the video app needs to execute.
  • When AMS executes the Activity in stack B1, it can call WMS to draw a corresponding display interface in display 1 in real time (for example, the display interface 701 shown in FIG. 7), and the display interface 701 is associated with window 2 created by the WMS.
  • the WMS can output the display interface 701 generated in display 1 to the PC, and the PC displays the display interface 701 in window 2, thereby projecting the application task of the video APP in the mobile phone to the PC for display.
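  • For illustration, the flow above (creating a new display module such as display 1 and starting an application task on it) resembles the public Android virtual-display APIs; the following is a rough sketch of that analogue, in which the package name, display size, and encoder surface are assumptions:

      import android.app.ActivityOptions;
      import android.content.Context;
      import android.content.Intent;
      import android.hardware.display.DisplayManager;
      import android.hardware.display.VirtualDisplay;
      import android.view.Surface;

      class ProjectionLauncher {
          // Creates a virtual display (akin to "display 1") and launches an
          // application task on it; frames drawn into the display's surface
          // can then be encoded and streamed to the target device.
          static void projectApp(Context context, Surface encoderSurface) {
              DisplayManager dm = context.getSystemService(DisplayManager.class);
              VirtualDisplay vd = dm.createVirtualDisplay(
                      "projection-display", 1920, 1080, 320,
                      encoderSurface, // assumed surface feeding the PC video stream
                      DisplayManager.VIRTUAL_DISPLAY_FLAG_PRESENTATION);

              ActivityOptions options = ActivityOptions.makeBasic();
              options.setLaunchDisplayId(vd.getDisplay().getDisplayId());

              Intent intent = context.getPackageManager()
                      .getLaunchIntentForPackage("com.example.videoapp"); // assumed
              if (intent == null) return;
              intent.addFlags(Intent.FLAG_ACTIVITY_NEW_TASK);
              context.startActivity(intent, options.toBundle());
          }
      }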
  • the user can also project other applications in the mobile phone to the PC for display according to the above method.
  • the PC can send a corresponding screen projection instruction 2 to the mobile phone, instructing the mobile phone to project the application tasks in the chat APP to the PC for display.
  • In response to the screen projection command 2, AMS can create a stack B2 corresponding to the chat APP in display 1. Unlike display 0, all stacks in display 1 can be set to the visible attribute, and the stack at the top of the stack is usually the stack of the application most recently opened or operated by the user.
  • When AMS executes the Activity in stack B2, it can call WMS to draw a corresponding display interface in display 1 in real time (for example, the display interface 801 shown in FIG. 8), and the display interface 801 is associated with window 3 created by the WMS. Furthermore, the WMS can output the display interface 801 generated in display 1 to the PC, and the PC displays the display interface 801 in window 3, thereby projecting the application task of the chat APP in the mobile phone to the PC for display.
  • the PC can display the desktop 601 of the mobile phone running in display 0 in real time through window 1, display the display interface 701 of the video APP running in display 1 of the mobile phone in real time through window 2, and display the display interface 801 of the chat APP running in display 1 of the mobile phone in real time through window 3, so that multiple application tasks in the mobile phone are projected to the PC for display in the form of multiple windows.
  • application tasks in different windows may belong to the same application, or may belong to different applications, which is not limited in this embodiment of the present application.
  • window 1 corresponds to stack A1 at the top of the stack in display 0, and the other windows (such as window 2 and window 3) correspond to the stacks in display 1 respectively.
  • That is to say, in the above screen projection scenario, one window in the PC can synchronously display the display interface of the mobile phone (such as the display interface 601 of the desktop above), and one or more other windows in the PC can be used to display the UIs of other applications projected from the mobile phone.
  • the user can cover or combine multiple windows displayed on the PC in the above screen projection scenario, triggering the mobile phone to establish an association between the multiple windows projected on the PC, so as to manage the multiple windows projected on the PC; this enables the windows on the PC to implement functions such as replacement, split screen, floating, or merging, and improves the user experience in screen projection scenarios.
  • the application framework layer may also include power management services, content provision services, view systems, resource management services, notification management services, etc., which are not limited in this embodiment of the present application.
  • Android runtime includes core library and virtual machine. The Android runtime is responsible for the scheduling and management of the Android system.
  • the core library consists of two parts: one part is the functions that the Java language needs to call, and the other part is the core library of Android.
  • the application layer and the application framework layer run in virtual machines.
  • the virtual machine executes the java files of the application program layer and the application program framework layer as binary files.
  • the virtual machine is used to perform functions such as object life cycle management, stack management, thread management, security and exception management, and garbage collection.
  • a system library can include multiple function modules. For example: surface manager (surface manager), media library (Media Libraries), 3D graphics processing library (eg: OpenGL ES), 2D graphics engine (eg: SGL), etc.
  • the surface manager is used to manage the display subsystem, and provides the fusion of 2D and 3D layers for multiple applications.
  • the media library supports playback and recording of various commonly used audio and video formats, as well as still image files, etc.
  • the media library can support a variety of audio and video encoding formats, such as: MPEG4, H.264, MP3, AAC, AMR, JPG, PNG, etc.
  • the 3D graphics processing library is used to implement 3D graphics drawing, image rendering, compositing, and layer processing, etc.
  • 2D graphics engine is a drawing engine for 2D drawing.
  • the kernel layer is the layer between hardware and software.
  • the kernel layer includes at least a display driver, a camera driver, an audio driver, a sensor driver, etc., which are not limited in this embodiment of the present application.
  • the mobile phone can project the application task running in the foreground (such as the desktop) to the PC, so that the PC displays the display interface 601 of the desktop in window 1.
  • At this time, stack A1 corresponding to the desktop runs in display 0 of the mobile phone. That is to say, the display interface in window 1 is synchronized with the display interface on the mobile phone screen, and the display interface in window 1 in the screen projection scenario may subsequently be called the main display interface of the source device (i.e., the mobile phone).
  • the synchronization of the display interface displayed by the mobile phone and by the PC in window 1 means that the specific content displayed by the two can be the same, but display parameters such as the shape, size, position, arrangement, resolution, or DPI (dots per inch) of the two may be different, which is not limited in this embodiment of the present application.
  • the mobile phone can also project the application task of the video APP in the mobile phone to the PC, so that the PC displays the display interface 701 of the video APP in the window 2 .
  • Stack B1 corresponding to the video APP is running on display 1 of the mobile phone.
  • the mobile phone can also project the application tasks of the chatting APP in the mobile phone to the PC, so that the PC displays the display interface 801 of the chatting APP in the window 3 .
  • the Stack B2 corresponding to the chat APP is also running in display 1 of the mobile phone.
  • window 1 on the PC, which is displayed synchronously with the mobile phone screen, corresponds to the default display 0 of the mobile phone, while other windows projected on the PC, such as window 2 or window 3, correspond to the newly created display 1 of the mobile phone.
  • the display content generated in display 1 can be sent to the display of the PC for display, but generally not sent to the display of the mobile phone for display.
  • the display content generated in display 0 is generally sent to the display screen of the mobile phone for display.
  • the display content generated by the Stack at the top of the stack in display 0 can be sent to the display of the PC for display.
  • the windows displayed on the PC during screen projection may not only include the display interface of the relevant application, but also include components such as a control bar, a title bar, a status bar, or a tool bar.
  • the control bar may include a maximize button, a minimize button, and a close button.
  • the mouse may send a corresponding release event to the PC, indicating that the user has released the mouse to stop dragging the window 2 .
  • the PC may respond to the release event and send a corresponding window covering event 1 to the mobile phone, and the window covering event 1 indicates that the user has performed an event of dragging window 2 to cover window 3 .
  • After the mobile phone receives the above-mentioned window covering event 1, it can call AMS and WMS to control the PC to no longer display window 3 but to keep the dragged window 2 and the display interface 701 in window 2, so that the PC displays the interface shown in (c).
  • Specifically, after receiving the above-mentioned window covering event 1, the mobile phone can instruct the WMS to set window 3 to the invisible attribute, and instruct AMS to move stack B1 corresponding to the video APP in display 1 to the top of the stack.
  • At this time, stack B1 is located at the top of the display 1 stack with the visible attribute, and stack B2 has the invisible attribute.
  • the Stack A1 corresponding to the desktop is still at the top of the display 0 stack, and Stack A1 is a visible attribute.
  • the chat APP is not killed, but switched to run in the background of the mobile phone.
  • the WMS can also send the display interface 601 of the desktop drawn in display 0 to the display screens of the PC and the mobile phone in real time, so that the display interface 601 continues to be displayed synchronously in the window 1 of the mobile phone and the PC.
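  • Putting the two instructions above together, the source-side handling of window covering event 1 can be sketched as follows; WmsFacade and AmsFacade are illustrative stand-ins, since the real WMS and AMS interfaces are internal:

      class WindowCoverHandler {
          // Sketch of the source-side handling of window covering event 1:
          // hide the covered window and raise the retained task's stack.
          interface WmsFacade { void setWindowVisible(int windowId, boolean visible); }
          interface AmsFacade { void moveStackToTop(String stackName, int displayId); }

          static void onWindowCoverEvent(WmsFacade wms, AmsFacade ams) {
              wms.setWindowVisible(3, false);   // window 3 becomes invisible
              ams.moveStackToTop("StackB1", 1); // video APP to top of display 1
          }
      }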
  • the above embodiment is illustrated by the user dragging window 2 on the PC to cover window 3.
  • In some embodiments, the user can also drag a certain window on the PC to cover the window containing the main display interface of the mobile phone (for example, window 1 above), that is, drag a certain window to cover the window on the PC that is synchronized with the display interface of the mobile phone; in this case, the corresponding window covering function can also be implemented on the PC.
  • the PC displays window 1 , window 2 and window 3 in the above screen projection scenario.
  • the PC detects that the user uses the mouse to drag the window 2
  • the PC can monitor the dragged position of the window 2 on the screen in real time.
  • the PC can calculate the overlapping area 2 between the window 2 and the window 1 in real time.
  • When the overlapping area 2 is greater than the preset area threshold, if the PC receives a release event sent by the mouse, the PC can respond to the release event and send a corresponding window covering event 2 to the mobile phone; the window covering event 2 indicates that the user has performed the event of dragging window 2 to cover window 1.
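  • A minimal sketch of the overlap test described above is given below; the Rectangle type is from the Java desktop libraries, and the threshold value is an assumed example:

      import java.awt.Rectangle;

      class OverlapDetector {
          // Assumed example threshold; the patent only requires "a preset
          // area threshold", not a specific value.
          static final int AREA_THRESHOLD = 120_000; // e.g. ~400 x 300 px

          // Returns true when the dragged window overlaps the other window
          // by more than the preset area threshold.
          static boolean coversWindow(Rectangle dragged, Rectangle other) {
              Rectangle overlap = dragged.intersection(other);
              if (overlap.isEmpty()) {
                  return false;
              }
              return overlap.width * overlap.height > AREA_THRESHOLD;
          }
      }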
  • Since the covered window 1 displays the main display interface of the mobile phone, as shown in FIG. 14, when the mobile phone receives the above-mentioned window covering event 2, unlike FIG. 11, the mobile phone can call AMS and WMS to control the PC to no longer display window 2 but to continue displaying the display interface 701 of window 2 in window 1, so that the PC displays the interface shown in (c) in FIG. 13.
  • Stack A1 on the desktop in window 1 runs in display 0 of the mobile phone
  • Stack A1 is a visible attribute
  • Stack B1 of the video app in window 2 runs on display 1 of the phone
  • Stack B2 of the chat app in window 3 also runs on display 1 of the phone. Both Stack B1 and Stack B2 are visible properties, and Stack B2 is at the top of the stack.
  • the mobile phone may send an instruction to close the window 2 to the WMS in response to the window covering event 2 .
  • the mobile phone may send an instruction to move Stack B1 from display 1 to display 0 to AMS in response to window coverage event 2.
  • the WMS can destroy the window 2 in response to the received command.
  • AMS can call a preset stack-moving interface (for example, moveStackToDisplay()) to move Stack B1 from display 1 to the top of the stack of display 0.
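  • moveStackToDisplay() is an internal system interface rather than a public API; purely to illustrate the call shape, a privileged component might reach it via reflection as sketched below, where the hosting object and method signature are assumptions based only on the name given above:

      import java.lang.reflect.Method;

      class StackMover {
          // Rough illustration: invoke a hidden moveStackToDisplay(stackId,
          // displayId) method via reflection. Real system services would
          // call the interface directly inside the framework.
          static void moveStackToDisplay(Object activityTaskManager,
                                         int stackId, int displayId) throws Exception {
              Method m = activityTaskManager.getClass()
                      .getMethod("moveStackToDisplay", int.class, int.class);
              m.invoke(activityTaskManager, stackId, displayId); // e.g. display 1 -> 0
          }
      }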
  • Stack B1 is at the top of the display 0 stack, and Stack B1 is a visible attribute.
  • the Stack B2 corresponding to the chat APP is still located in display 1, and Stack B2 is still a visible attribute.
  • At this time, stack A1 (that is, the stack of the desktop) has the invisible attribute; alternatively, stack A1 can also be deleted, which is not limited in this embodiment of the present application.
  • When the AMS of the mobile phone executes the Activity in stack B1, it can call the WMS to draw the display interface 701 generated by the video APP in display 0 in real time.
  • the display interface 701 is now associated with window 1, which displays the main display interface of the mobile phone. Since stack B1 has been moved into display 0 at this time, as shown in FIG. 16, on the one hand, the WMS can send the display interface 701 generated by the video APP to the display screen of the mobile phone for display; on the other hand, the WMS can send the display interface 701 generated by the video APP to the PC in the form of a video stream.
  • As shown in (c) of the figure, after the PC receives the display interface 701 generated by the video APP, it can display the display interface 701 in window 1 in real time.
  • the display interface displayed by the mobile phone and by the PC in window 1 is synchronized, that is, the display interface in window 1 is the main display interface of the mobile phone during screen projection.
  • the WMS can also send the display interface 801 of the chat APP drawn in display 1 to the PC, and the PC continues to display the display interface 801 in window 3 in real time.
  • the user can drag a window on the PC (that is, the target device) to cover another window, and trigger the interaction between the PC and the mobile phone (that is, the source device) to realize the window overlay function, that is, Close the covered window while retaining the dragged window, which is convenient for the user to manage multiple windows in the target device.
  • It should be noted that, when the mobile phone's AMS performs a stack moving operation on a certain stack, the WMS will refresh the stack in the corresponding display module and re-read the configuration information of the stack at the top of the stack.
  • the configuration information of a stack can record parameters such as the resolution and aspect ratio used when displaying the corresponding display interface. If the configuration information of the stack at the top of the stack has changed after the refresh, the WMS will restart execution of the stack at the top of the stack.
  • Therefore, in the scenario where the user drags window 2 to cover window 1, when the AMS of the mobile phone moves stack B1 from display 1 to the top of the stack of display 0, the AMS can set the configuration information of stack B1 to the same configuration information as stack A1, which was originally located at the top of the display 0 stack. Then, after the WMS refreshes display 0, it can read that the configuration information of the stack at the top of the display 0 stack has not changed, so the WMS will not restart the execution of stack B1 but will continue to execute the Activity that was in stack B1 before the stack moving operation. In this way, after the video APP in window 2 is switched to window 1, the display interface 701 of the video APP can be seamlessly switched to window 1, realizing seamless continuation of the window content and improving the user experience.
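  • The configuration carry-over described above can be sketched as follows; ModelStack is an illustrative model class, and only the copy of the configuration reflects the behavior described:

      import android.content.res.Configuration;

      // Illustrative model: a stack here is just a holder for the
      // configuration (resolution, DPI, aspect ratio) used when its
      // interface is drawn.
      class ModelStack {
          private Configuration config = new Configuration();
          Configuration getConfiguration() { return config; }
          void setConfiguration(Configuration c) { config = c; }
      }

      class ConfigCarryOver {
          // Before stack B1 is moved to the top of display 0, give it the
          // same configuration as stack A1 (the previous top). After the
          // refresh, WMS reads unchanged top-of-stack configuration and
          // keeps executing the Activity instead of restarting it.
          static void adoptConfiguration(ModelStack moving, ModelStack oldTop) {
              moving.setConfiguration(new Configuration(oldTop.getConfiguration()));
          }
      }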
  • the user can also drag a window on the PC (i.e., the target device) to cover another window, triggering the interaction between the PC and the mobile phone (i.e., the source device) to realize the window floating function, which makes the dragged window hover over the covered window.
  • When the overlapping area 1 is greater than the area threshold and the duration is greater than a preset time threshold (for example, 2 s), it indicates that the user may have the operation intention of displaying window 2 floating on window 3.
  • At this time, if the PC receives a release event sent by the mouse, the PC can respond to the release event and send a corresponding window floating event 1 to the mobile phone; the window floating event 1 indicates that the user has performed the event of dragging window 2 to float on window 3.
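  • The dwell rule above (overlap above the area threshold for longer than the time threshold) might be tracked as in the following sketch; the 2 s constant mirrors the example above, and the rest is an assumption:

      class HoverIntentDetector {
          static final long TIME_THRESHOLD_MS = 2_000; // as in the 2 s example

          private long overlapSince = -1;

          // Called on every drag update; reports the float intent only
          // after the overlap has stayed above the area threshold for
          // longer than the time threshold.
          boolean update(boolean overlapAboveThreshold, long nowMs) {
              if (!overlapAboveThreshold) {
                  overlapSince = -1;    // overlap lost: reset the timer
                  return false;
              }
              if (overlapSince < 0) {
                  overlapSince = nowMs; // overlap just began: start timing
              }
              return nowMs - overlapSince > TIME_THRESHOLD_MS;
          }
      }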
  • After the mobile phone receives the above-mentioned window floating event 1, it can call AMS and WMS to control the PC to display window 2 in the form of a floating window on window 3, so that the PC displays the interface shown in (c) in FIG. 17.
  • At this time, stack B1 of the video APP in window 2 runs in display 1 of the mobile phone, and stack B2 of the chat APP in window 3 also runs in display 1 of the mobile phone.
  • Both Stack B1 and Stack B2 are visible properties, and Stack B2 is at the top of the stack.
  • Stack A1 on the desktop in window 1 runs on display 0 of the mobile phone, and Stack A1 is a visible attribute.
  • the mobile phone may respond to the window floating event 1 and send an instruction to the WMS to set window 2 as a floating window.
  • In addition, the mobile phone can respond to the window floating event 1 and send an instruction to AMS to move stack B1 to the top of the stack in display 1.
  • WMS can modify the window attribute of window 2 to the attribute of floating window (FloatingWindow) in response to the received instruction.
  • AMS can move Stack B1 to the top of the stack in display 1 in response to the received command.
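  • The FloatingWindow attribute mentioned above is internal to the source device's WMS; as a loose public-API analogue only, an overlay window could be attached as in this sketch, where the window size, flags, and the granted overlay permission are assumptions:

      import android.graphics.PixelFormat;
      import android.view.Gravity;
      import android.view.View;
      import android.view.WindowManager;

      class FloatingWindowHelper {
          // Marks a projected window's content as floating by attaching it
          // with overlay-type layout params, so it is drawn above the
          // full-screen interface beneath it.
          static void showAsFloating(WindowManager wm, View content) {
              WindowManager.LayoutParams lp = new WindowManager.LayoutParams(
                      960, 540,                                   // assumed size
                      WindowManager.LayoutParams.TYPE_APPLICATION_OVERLAY,
                      WindowManager.LayoutParams.FLAG_NOT_TOUCH_MODAL,
                      PixelFormat.TRANSLUCENT);
              lp.gravity = Gravity.TOP | Gravity.START;
              wm.addView(content, lp); // drawn on the upper layer
          }
      }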
  • Stack B1 is at the top of the stack in display 1
  • Stack B2 and Stack B1 are still visible attributes.
  • the Stack A1 corresponding to the desktop is still at the top of the display 0 stack.
  • When the AMS of the mobile phone executes the Activities in stack B1 and stack B2, it can call the WMS to draw, in display 1, the display interface 701 generated by the video APP and the display interface 801 generated by the chat APP.
  • Since window 2 has been set as a floating window, the display interface 701 is drawn in the form of a floating window on the upper layer of the display interface 801.
  • Furthermore, the WMS can send the video stream including the display interface 701 and the display interface 801 to the PC, and the PC displays, in window 3, the display interface including window 2 (i.e., the floating window).
  • window 2 continues to be displayed on window 3 in the form of a floating window.
  • the above embodiment is illustrated by the user dragging window 2 on the PC to float on window 3.
  • In some embodiments, the user can also drag a certain window on the PC to float on the window containing the main display interface of the mobile phone (for example, window 1 above); in this case, the corresponding window floating function can also be implemented on the PC.
  • the PC displays window 1 , window 2 and window 3 in the above-mentioned screen projection scenario.
  • the PC detects that the user uses the mouse to drag the window 2
  • the PC can monitor the dragged position of the window 2 on the screen in real time.
  • the PC can calculate the overlapping area 2 between window 2 and window 1 in real time; when the overlapping area 2 is greater than the preset area threshold, the PC starts timing.
  • When the timed duration is greater than the preset time threshold, if the PC receives a release event sent by the mouse, the PC can respond to the release event and send a corresponding window floating event 2 to the mobile phone; the window floating event 2 indicates that the user has performed the event of dragging window 2 to hover over window 1.
  • After the mobile phone receives the above-mentioned window floating event 2, it can call AMS and WMS to control the PC to display window 2 in the form of a floating window on window 1, so that the PC displays the interface shown in (c) in FIG. 20.
  • window 2 will also be displayed synchronously in the form of a floating window on the screen of the mobile phone.
  • At this time, stack B1 of the video APP in window 2 runs in display 1 of the mobile phone, and stack B2 of the chat APP in window 3 also runs in display 1 of the mobile phone.
  • Both Stack B1 and Stack B2 are visible properties, and Stack B2 is at the top of the stack.
  • Stack A1 on the desktop in window 1 runs on display 0 of the mobile phone
  • Stack A1 is a visible attribute.
  • the mobile phone may respond to the window floating event 2 and send an instruction to the WMS to set window 2 as a floating window.
  • In addition, the mobile phone may respond to the window floating event 2 and send an instruction to AMS to move stack B1 from display 1 to display 0.
  • the WMS may modify the window attribute of window 2 to the attribute of a floating window (FloatingWindow) in response to the received instruction.
  • It should be noted that window 2 (i.e., the floating window) whose window attribute has been modified is still associated with the display interface 701 generated by the video APP.
  • AMS can call a preset stack-moving interface (for example, moveStackToDisplay()) to move Stack B1 from display 1 to the top of the stack of display 0.
  • Stack B1 is at the top of display 0, and Stack A1, which was originally at the top of display 0, is pushed into the stack.
  • Stack B1 at the top of the stack is a visible attribute.
  • When the AMS of the mobile phone executes the Activities in stack B1 and stack A1, it can call the WMS to draw, in display 0, the display interface 701 generated by the video APP and the display interface 601 of the desktop, and the display interface 701 is drawn in the form of a floating window on the upper layer of the display interface 601.
  • the WMS can send the display interface 701 and the display interface 601 to the mobile phone.
  • the mobile phone may display the display interface 701 on the display interface 601 of the desktop in the form of a floating window.
  • the WMS can send the video stream including the display interface 701 and the display interface 601 to the PC.
  • the PC may continue to display the display interface 601 in the window 1 and display the display interface 701 in the form of a floating window on the window 1 .
  • the WMS can also send the display interface 801 of the chat APP drawn in display 1 to the PC, and the PC continues to display the display interface 801 in window 3 in real time.
  • the user can drag a window on the PC (that is, the target device) to float on another window, triggering the interaction between the PC and the mobile phone (that is, the source device) to realize the window floating function, that is, making the dragged window hover over the covered window.
  • the user can also drag one window onto another window on the PC (i.e., the target device) to trigger the interaction between the PC and the mobile phone (i.e., the source device) to realize the window merging function, which merges the dragged window and the covered window into one window.
  • each window projected on the PC may be preset to include a boundary hot zone.
  • the window 2401 includes a preset boundary hot zone 2402 .
  • the boundary hot zone 2402 is located near the edge of the window 2401 .
  • the area covered by each boundary extending 100 pixels inward in the window 2401 may be set as the boundary hot zone 2402 of the window 2401 .
  • the PC can detect whether the user triggers the above-mentioned window merging function by detecting the coincidence degree of the border hot zone between the two windows.
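  • The 100-pixel boundary hot zone and the coincidence degree described above can be sketched as follows; the normalization of the coincidence degree is an assumption:

      import java.awt.Rectangle;

      class BorderHotZone {
          // Each edge's hot zone is the strip extending 100 px inward from
          // that edge, matching the example above.
          static final int HOT_ZONE_DEPTH = 100;

          static Rectangle leftHotZone(Rectangle window) {
              return new Rectangle(window.x, window.y,
                      HOT_ZONE_DEPTH, window.height);
          }

          static Rectangle rightHotZone(Rectangle window) {
              return new Rectangle(window.x + window.width - HOT_ZONE_DEPTH,
                      window.y, HOT_ZONE_DEPTH, window.height);
          }

          // Coincidence degree: overlap area of the two hot zones,
          // normalized here by the first hot zone's area (assumption).
          static double coincidence(Rectangle hotZoneA, Rectangle hotZoneB) {
              Rectangle overlap = hotZoneA.intersection(hotZoneB);
              if (overlap.isEmpty()) return 0.0;
              double overlapArea = (double) overlap.width * overlap.height;
              return overlapArea / (hotZoneA.width * hotZoneA.height);
          }
      }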
  • the PC displays window 1 , window 2 and window 3 in the above screen projection scenario.
  • the PC when the PC detects that the user uses the mouse to drag the window 2, the PC can monitor the dragged position of the window 2 on the screen in real time.
  • the PC can calculate the coincidence degree of the boundary hot zones between window 2 and window 3 in real time.
  • When the coincidence degree 1 between the boundary hot zone on the left side of window 2 and the boundary hot zone on the right side of window 3 is greater than the preset coincidence degree threshold, it indicates that the user may have the operation intention of merging window 2 and window 3.
  • At this time, if the PC receives the release event sent by the mouse, the PC can respond to the release event and send a corresponding window merging event 1 to the mobile phone; the window merging event 1 indicates that the user has performed the event of merging window 2 and window 3.
  • When the mobile phone receives the above-mentioned window merging event 1, it can call AMS and WMS to control the PC to merge window 2 and window 3 into a new window (such as window 4), as shown in (c) in FIG. 25, so that the PC displays the display interfaces originally located in window 2 and window 3 in the form of split screens in the merged window 4.
  • the mobile phone may send a split-screen instruction of window 2 and window 3 to the WMS.
  • the WMS may respond to the split-screen instruction for window 2 and window 3 and set window 2 and window 3 to the split-screen attribute.
  • window 2 can be set as the right split-screen window
  • window 3 can be set as the left split-screen window.
  • When the AMS of the mobile phone executes stack B1 corresponding to the video APP, it can call the WMS to draw the display interface 701 of the video APP in display 1; at this time, the display interface 701 is associated with the right split-screen window.
  • When the AMS of the mobile phone executes stack B2 corresponding to the chat APP, it can call the WMS to draw the display interface 801 of the chat APP in display 1; at this time, the display interface 801 is associated with the left split-screen window.
  • Subsequently, the WMS can send the display interface 701 and the display interface 801 to the PC as the display interface of one window (such as window 4), and the PC displays the display interface 701 of the video APP and the display interface 801 of the chat APP in the form of split screens in window 4, realizing the window merging function.
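  • As a sketch of the split-screen layout above, the merged window's bounds can be divided into two halves; the equal 50/50 split is an assumption:

      import java.awt.Rectangle;

      class SplitScreenLayout {
          // Computes left/right split-screen bounds inside the merged
          // window (e.g. window 4), so one interface lands in the left
          // half and the other in the right half.
          static Rectangle[] splitLeftRight(Rectangle merged) {
              int half = merged.width / 2;
              Rectangle left = new Rectangle(merged.x, merged.y,
                      half, merged.height);
              Rectangle right = new Rectangle(merged.x + half, merged.y,
                      merged.width - half, merged.height);
              return new Rectangle[] { left, right };
          }
      }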
  • the user can also drag the window 2 to merge with the window 1 including the main display interface of the mobile phone in the PC.
  • the PC displays window 1 , window 2 and window 3 in the above screen projection scenario.
  • the PC can calculate the coincidence degree of the boundary hot zones between window 2 and window 1 in real time. For example, as shown in (b) in FIG. 27, when the coincidence degree 2 between the boundary hot zone on the upper side of window 2 and the boundary hot zone on the lower side of window 1 is greater than the preset coincidence degree threshold, it indicates that the user may have the operation intention of merging window 2 and window 1.
  • the PC can respond to the release event and send a corresponding window merging event 2 to the mobile phone; the window merging event 2 indicates that the user has performed the event of merging window 2 and window 1.
  • After the mobile phone receives the above-mentioned window merging event 2, it can call AMS and WMS to control the PC to merge window 2 and window 1 into one window (such as window 5), as shown in (c) in FIG. 27, so that the PC displays the display interfaces originally in window 2 and window 1 in the form of split screens in window 5.
  • the mobile phone may send a split-screen instruction for window 2 and window 1 to the WMS; in addition, the mobile phone can also send an instruction to AMS to move stack B1 from display 1 to display 0.
  • the WMS may set window 2 and window 1 as split-screen attributes.
  • window 1 can be set as the upper split-screen window
  • window 2 can be set as the lower split-screen window.
  • AMS can respond to the received instruction and call the preset stack-moving interface to move stack B1 from display 1 to the top of the stack of display 0.
  • At this time, stack B1 is located at the top of the display 0 stack, and stack A1 (that is, the stack of the desktop) is pushed into the stack; since window 1 and window 2 are both split-screen windows, AMS can set both stack B1 and stack A1 to the visible attribute.
  • When the AMS of the mobile phone executes stack B1 corresponding to the video APP, it can call the WMS to draw the display interface 701 of the video APP in display 0; at this time, the display interface 701 is associated with the lower split-screen window.
  • When the AMS of the mobile phone executes stack A1 corresponding to the desktop, it can call the WMS to draw the display interface 601 of the desktop in display 0; at this time, the display interface 601 is associated with the upper split-screen window.
  • Subsequently, the WMS can send the display interface 701 and the display interface 601 to the PC in the form of a video stream as the display interface of one window (such as window 5), and the PC displays the display interface 701 of the video APP and the display interface 601 of the desktop in the form of split screens in window 5.
  • the WMS can also send the display interface 701 and display interface 601 to the display screen of the mobile phone.
  • the mobile phone can display the display interface 701 of the video APP and the display interface 601 of the desktop in the form of split screens, so that the display interface in the mobile phone is synchronized with the display interface in the window 5 in the PC.
  • When the PC detects that the user drags window 2 so that the coincidence degree of the boundary hot zones between window 2 and window 3 is greater than the coincidence degree threshold, if the PC receives a release event sent by the mouse, the PC can interact with the mobile phone according to the above method to realize the window merging function.
  • That is, the PC can merge window 2 and window 3 into window 4, and display, in the form of split screens in window 4, the display interface 701 of the video APP originally in window 2 and the display interface 801 of the chat APP originally in window 3.
  • the mobile phone can also project the window 4 being displayed in split screen form to the window of the tablet computer according to the above method for display.
  • the mobile phone as the source device can seamlessly switch the displayed interface to the target device to run, so that the user can continue to use the relevant functions provided by the source device in the target device.
  • the scenarios shown in (a)-(d) in FIG. 30 are scenarios in which the user first merges two windows and then uses the merged window to cover another window. It can be understood that the user can also superimpose and use the above window covering function, window floating function, and window merging function in other ways.
  • the user can use the window merging function to first merge window 1 and window 2, and then merge the merged window with window 3.
  • the user can use the window floating function to float window 1 on top of window 2, and then merge the window containing the floating window with window 3.
  • the mobile phone and the PC can display window 2 floating on window 3 according to the above window floating function. Furthermore, as shown in (b) in FIG. 31, if the user drags window 3 to float on window 1 (that is, the window containing the main display interface of the mobile phone), window 2 floating on window 3 will be dragged along to float over window 1. At this point, if the PC receives the release event sent by the mouse, the PC can instruct the mobile phone to close the window that does not contain the focus application among window 2 and window 3, and display the remaining window in window 1 in the form of a floating window according to the above method.
  • the mobile phone can control the PC to delete window 3 and its display content, and display window 2 in window 1 in the form of a floating window.
  • The above embodiments describe the window covering function, the window floating function, and the window merging function. The user can operate multiple windows on a target device such as a PC, triggering the source device to establish an association between the operated windows on the target device, so that the target device implements functions such as window covering, window floating, and window merging. In this way, the user can manage multiple windows projected on the target device through one operation, thereby managing the projected windows more efficiently and improving the user experience.
  • It should be noted that the above embodiments are illustrated with the mobile phone as the source device and the PC as the target device in the screen projection scenario. It can be understood that the source device may also be another electronic device with a screen projection function, such as a tablet computer, and the target device may also be an electronic device with a display function, such as a TV or a tablet computer, which is not limited in this embodiment of the present application.
  • In addition, the Android system is used as an example to illustrate how the above screen projection display method is implemented between the various functional modules. It can be understood that other operating systems (such as the Hongmeng system, etc.) can also be provided with corresponding functional modules to implement the above method. As long as the functions implemented by the devices and functional modules are similar to those in the embodiments of the present application, they fall within the scope of the claims of the present application and their equivalent technologies.
  • the embodiment of the present application discloses an electronic device, and the electronic device may be the above-mentioned source device (such as a mobile phone).
  • the electronic device may specifically include: a touch screen 3201, where the touch screen 3201 includes a touch sensor 3206 and a display screen 3207; one or more processors 3202; a memory 3203; a communication module 3208; one or more application programs (not shown); and one or more computer programs 3204.
  • the above-mentioned components can be connected by one or more communication buses 3205 .
  • the above-mentioned one or more computer programs 3204 are stored in the above-mentioned memory 3203 and configured to be executed by the one or more processors 3202. The one or more computer programs 3204 include instructions, and the instructions can be used to execute the relevant steps performed by the source device in the above embodiments.
  • the embodiment of the present application discloses an electronic device, and the electronic device may be the above-mentioned target device (such as a PC).
  • the electronic device may specifically include: a display screen 3301; one or more processors 3302; a memory 3303; a communication module 3306; one or more application programs (not shown); and one or more computer programs 3304. The above components may be connected through one or more communication buses 3305.
  • the electronic device can also be equipped with input devices such as a touch screen, a mouse or a keyboard.
  • the above-mentioned one or more computer programs 3304 are stored in the above-mentioned memory 3303 and configured to be executed by the one or more processors 3302. The one or more computer programs 3304 include instructions, and the instructions can be used to execute the relevant steps performed by the target device in the above embodiments.
  • the disclosed devices and methods may be implemented in other ways.
  • the device embodiments described above are only illustrative. For example, the division of the modules or units is only a logical function division, and there may be other division manners in actual implementation.
  • For example, multiple units or components may be combined or integrated into another device, or some features may be omitted or not implemented.
  • the mutual coupling or direct coupling or communication connection shown or discussed may be through some interfaces, and the indirect coupling or communication connection of devices or units may be in electrical, mechanical or other forms.
  • the unit described as a separate component may or may not be physically separated, and the component displayed as a unit may be one physical unit or multiple physical units, that is, it may be located in one place or distributed to multiple different places. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
  • each functional unit in each embodiment of the present application may be integrated into one processing unit, each unit may exist separately physically, or two or more units may be integrated into one unit.
  • the above-mentioned integrated units can be implemented in the form of hardware or in the form of software functional units.
  • If the integrated unit is implemented in the form of a software functional unit and sold or used as an independent product, it can be stored in a readable storage medium.
  • Based on such an understanding, the technical solutions of the embodiments of the present application essentially, or the part contributing to the prior art, or all or part of the technical solutions, can be embodied in the form of a software product. The software product is stored in a storage medium and includes several instructions to cause a device (which may be a single-chip microcomputer, a chip, or the like) or a processor to execute all or part of the steps of the methods described in the embodiments of the present application.
  • the aforementioned storage medium includes various media that can store program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.


Abstract

The present application discloses a screen projection display method and an electronic device, relating to the field of terminal technologies, which can increase the association between multiple projected windows in a screen projection scenario, facilitate the user's efficient management of the multiple windows projected on the target device, and improve the user experience. The method includes: a target device displays a first window and a second window, where the first window includes the interface of a first application task projected by a source device, and the second window includes the interface of a second application task projected by the source device; if the target device detects that the user inputs a window covering operation that covers the second window with the first window, the target device sends a window covering event to the source device; in response to the window covering event, the source device obtains the screen projection data after the first window covers the second window and sends the screen projection data to the target device; the target device displays a first interface according to the screen projection data, where the first interface includes the interface of the first application task and does not include the interface of the second application task.

Description

A screen projection display method and an electronic device
This application claims priority to Chinese Patent Application No. 202110736178.2, filed with the China National Intellectual Property Administration on June 30, 2021 and entitled "A screen projection display method and electronic device", which is incorporated herein by reference in its entirety.
BACKGROUND
In the multi-window screen projection scenario described above, the user can manage each window projected from the mobile phone on the PC, for example, zooming or closing a window. However, the windows displayed on the PC are relatively independent of each other; when the mobile phone projects a large number of windows onto the PC, the process of managing each window individually is cumbersome, and the user experience is not high.
SUMMARY
The present application provides a screen projection display method and an electronic device, which can increase the association between multiple projected windows in a screen projection scenario, facilitate the user's efficient management of the multiple windows projected on the target device, and improve the user experience.
In a first aspect, the present application provides a screen projection display method, including: a target device displays a first window and a second window, where the first window includes the interface of a first application task projected by a source device, and the second window includes the interface of a second application task projected by the source device (that is, a multi-window screen projection scenario); if the target device detects that the user inputs a window covering operation, where the window covering operation is used to cover the second window with the first window, the target device may send a corresponding window covering event to the source device; further, in response to the window covering event, the source device may obtain the screen projection data after the first window covers the second window and send the screen projection data to the target device; the target device may then display a first interface according to the screen projection data, where the first interface includes the interface of the first application task and does not include the interface of the second application task.
That is to say, in a multi-window screen projection scenario, the user can drag one window over another window on the target device (for example, a PC) to trigger the target device and the source device (for example, a mobile phone) to interact to implement the window covering function, that is, close the display content in the covered window while retaining the display content in the dragged window, thereby facilitating the user's efficient management of the multiple windows projected on the target device and improving the user experience.
In a possible implementation, neither the first window nor the second window includes the main display interface of the source device, that is, the first window and the second window operated by the user are not the window in which the main display interface projected by the source device is located; in this case, the interface of the first application task in the first interface is located in the first window. That is to say, the window covering function can close the covered second window while the dragged first window continues to display the interface of the first application task.
In the above scenario, the source device may include a first display module (for example, display 1), which is used to provide screen projection data to the target device during screen projection; the source device may obtain the corresponding screen projection data from the first display module. Before the source device receives the window covering event, the first application task and the second application task both run in the first display module, where the second application task may run at the top of the stack of the first display module; after the source device receives the window covering event, the method further includes: the source device may move the first application task to the top of the stack of the first display module to run, so that the first window can continue to display the interface of the first application task.
In addition, the source device may delete the second application task from the first display module (that is, the second application task is killed); in this case, the source device may close (that is, destroy) the second window corresponding to the second application task. Alternatively, the source device may set the second application task to the invisible attribute in the first display module; in this case, the second application task is not killed but is switched to continue running in the background of the source device, and the source device may also set the window attribute of the second window to invisible. In this way, although the second application task is switched to continue running in the background of the source device, the user visually perceives that the covered second window is closed and the dragged first window continues to display the interface of the first application task.
In a possible implementation, the display content in the second window is synchronized with the main display interface of the source device, that is, the first window operated by the user is not the window in which the main display interface projected by the source device is located, but the covered second window displays the main display interface of the source device; in this case, the interface of the first application task in the first interface is located in the second window. That is to say, the window covering function can close the dragged first window and switch the interface of the first application task displayed in the first window to the second window for continued display.
In the above scenario, the source device may include a first display module (for example, display 1) and a second display module (for example, display 0). The first display module is used to provide screen projection data to the target device during screen projection, and the source device may obtain the corresponding screen projection data from the first display module; the second display module is used to provide display data to the source device and to provide screen projection data to the target device during screen projection, that is, the source device may obtain display data from the second display module to display on its own display screen, and, in the screen projection scenario, the source device may also obtain the screen projection data of the main display interface from the second display module.
Before the source device receives the window covering event, the first application task may run in the first display module and the second application task may run in the second display module; after the source device receives the window covering event, the method further includes: the source device may move the first application task from the first display module to the top of the stack of the second display module to run, that is, perform a stack moving operation. After the first application task is moved to the top of the stack of the second display module, the second application task is pushed into the stack, and the interface of the first application task can be displayed on the source device and in the second window of the target device. At this time, the source device may close the first window. In this way, the user visually perceives that the dragged first window is closed and the covered second window continues to display the interface of the first application task.
In a possible implementation, after the source device moves the first application task from the first display module to the top of the stack of the second display module, the source device may further set the configuration information of the first application task to the same configuration information as the application task originally located at the top of the stack of the second display module (for example, the second application task). In this way, when the source device refreshes the second display module, it can read that the configuration information of the application task at the top of the stack has not changed, so the source device can continue executing the first application task instead of reopening it from the beginning, achieving seamless continuation of the window content.
Exemplarily, the window covering operation may specifically be: a release operation input by the user after the user drags the first window, when the overlapping area between the first window and the second window is greater than an area threshold. Alternatively, the window covering operation may be: an operation in which the user drags the first window so that the overlapping area between the first window and the second window is greater than the area threshold, that is, the window covering function can also be implemented without the user inputting a release operation. Of course, the window covering operation may also be another predefined operation, which is not limited in the present application.
In a second aspect, the present application provides a screen projection display method, including: a source device projects the interface of a first application task to a first window of a target device for display, and projects the interface of a second application task to a second window of the target device for display; subsequently, the source device may receive a window covering event sent by the target device, where the window covering event is used to indicate that the user has input a window covering operation that covers the second window with the first window; further, in response to the window covering event, the source device may obtain the screen projection data after the first window covers the second window and send the screen projection data to the target device, implementing the window covering function described above.
In a possible implementation, the source device includes a first display module, which is used to provide screen projection data to the target device during screen projection; before the source device receives the window covering event, the first application task and the second application task both run in the first display module, and the second application task runs at the top of the stack of the first display module; after the source device receives the window covering event, the method further includes: the source device moves the first application task to the top of the stack of the first display module to run; the source device deletes the second application task from the first display module, or sets the second application task to the invisible attribute in the first display module.
In a possible implementation, the source device includes a first display module and a second display module, where the first display module is used to provide screen projection data to the target device during screen projection, and the second display module is used to provide display data to the source device and to provide screen projection data to the target device during screen projection; before the source device receives the window covering event, the first application task runs in the first display module and the second application task runs in the second display module; after the source device receives the window covering event, the method further includes: the source device moves the first application task from the first display module to the top of the stack of the second display module to run.
In a third aspect, the present application provides a screen projection display method, including: a target device displays a first window and a second window, where the first window includes the interface of a first application task projected by a source device, and the second window includes the interface of a second application task projected by the source device; if the target device detects that the user inputs a window covering operation used to cover the second window with the first window, the target device may send a corresponding window covering event to the source device; further, the target device receives the screen projection data sent by the source device in response to the window covering event; the target device displays a first interface according to the screen projection data, where the first interface includes the interface of the first application task and does not include the interface of the second application task.
In a possible implementation, when neither the first window nor the second window includes the main display interface of the source device, the interface of the first application task in the first interface is located in the first window; when the display content in the second window is synchronized with the main display interface of the source device, the interface of the first application task in the first interface is located in the second window.
In a possible implementation, the window covering operation refers to: a release operation input by the user after the user drags the first window, when the overlapping area between the first window and the second window is greater than an area threshold.
In a fourth aspect, the present application provides a screen projection display method, including: a target device displays a first window and a second window, where the first window includes the interface of a first application task projected by a source device, and the second window includes the interface of a second application task projected by the source device (that is, a multi-window screen projection scenario); subsequently, if the target device detects that the user inputs a window merging operation used to merge the first window and the second window, the target device may send a corresponding window merging event to the source device; further, in response to the window merging event, the source device may obtain the screen projection data after the first window and the second window are merged and send the screen projection data to the target device; the target device may then display a first interface according to the screen projection data, where the interface of the first application task and the interface of the second application task are displayed in the first interface in split-screen form.
That is to say, in a multi-window screen projection scenario, the user can drag one window to merge with another window on the target device, which can trigger the target device and the source device to interact to implement the window merging function, so that the display content in the dragged window and the display content in the covered window are displayed in split-screen form on the target device, thereby facilitating the user's efficient management of the multiple windows projected on the target device and improving the user experience.
In a possible implementation, after the target device sends the window merging event to the source device, the method further includes: in response to the window merging event, the source device may set the window attributes of the first window and the second window to split-screen windows. For example, the WMS in the source device may set the window attributes of the first window and the second window to split-screen windows (or called the split-screen attribute).
In a possible implementation, the source device may include a first display module (for example, display 1), which is used to provide screen projection data to the target device during screen projection; the source device may obtain the corresponding screen projection data from the first display module. If, before the source device receives the window merging event, the first application task and the second application task both run in the first display module, then after receiving the window merging event, the source device does not need to modify the first application task and the second application task in the first display module.
In another possible implementation, the source device may include a first display module (for example, display 1) and a second display module (for example, display 0). The first display module is used to provide screen projection data to the target device during screen projection, and the source device may obtain the corresponding screen projection data from the first display module; the second display module is used to provide display data to the source device and to provide screen projection data to the target device during screen projection, that is, the source device may obtain display data from the second display module to display on its own display screen, and, in the screen projection scenario, the source device may also obtain the screen projection data of the main display interface from the second display module.
Then, if, before the window merging event is received, the first application task runs in the first display module and the second application task runs in the second display module, after the source device receives the window merging event, the source device may move the first application task from the first display module to the top of the stack of the second display module to run, that is, perform a stack moving operation. After the first application task is moved to the top of the stack of the second display module, the second application task is pushed into the stack. Since the first application task and the second application task are both application tasks in split-screen windows, the source device may draw the interface of the first application task and the interface of the second application task in split-screen form in the second display module.
For example, the source device may draw the interface of the first application task in the left split-screen window and draw the interface of the second application task in the right split-screen window. For another example, the source device may draw the interface of the first application task in the upper split-screen window and draw the interface of the second application task in the lower split-screen window. Subsequently, the source device may send the interface containing the two split-screen windows (that is, the screen projection data) to the target device in the form of a video stream, and the target device may display the split-screen interfaces of the first application task and the second application task in one window (for example, a third window).
In a possible implementation, after the target device sends the window merging event to the source device, the method further includes: in response to the window merging event, the source device displays the interface of the first application task and the interface of the second application task in split-screen form in a second interface, that is, the source device may display the split-screen interfaces of the first application task and the second application task in synchronization with a window (for example, the third window) on the target device.
In a possible implementation, each window displayed by the target device may contain a boundary hot zone; for example, the boundary hot zone may be set in an area near the edge of the window. Then, the window merging operation may be: a release operation input by the user after the user drags the first window, when the overlapping area between the boundary hot zone of the first window and the boundary hot zone of the second window is greater than an area threshold. Alternatively, the window merging operation may be: an operation in which the user drags the first window so that the overlapping area between the boundary hot zone of the first window and the boundary hot zone of the second window is greater than the area threshold, that is, the window merging function can also be implemented without the user inputting a release operation. Of course, the window merging operation may also be another predefined operation, which is not limited in the present application.
In a fifth aspect, the present application provides a screen projection display method, including: a source device projects the interface of a first application task to a first window of a target device for display, and projects the interface of a second application task to a second window of the target device for display; subsequently, the source device may receive a window merging event sent by the target device, where the window merging event is used to indicate that the user has input a window merging operation that merges the first window and the second window; in response to the window merging event, the source device may obtain the screen projection data after the first window and the second window are merged and send the screen projection data to the target device, implementing the window merging function described above.
In a possible implementation, after the source device receives the window merging event sent by the target device, the method further includes: in response to the window merging event, the source device sets the window attributes of the first window and the second window to split-screen windows.
In a possible implementation, the source device includes a first display module and a second display module, where the first display module is used to provide screen projection data to the target device during screen projection, and the second display module is used to provide display data to the source device and to provide screen projection data to the target device during screen projection; before the window merging event is received, the first application task runs in the first display module and the second application task runs in the second display module; after the source device receives the window merging event, the method further includes: the source device moves the first application task from the first display module to the top of the stack of the second display module to run.
In a possible implementation, after the target device sends the window merging event to the source device, the method further includes: in response to the window merging event, the source device displays the interface of the first application task and the interface of the second application task in split-screen form in a second interface.
In a sixth aspect, the present application provides a screen projection display method, including: a target device displays a first window and a second window, where the first window includes the interface of a first application task projected by a source device, and the second window includes the interface of a second application task projected by the source device; subsequently, the target device detects that the user inputs a window merging operation used to merge the first window and the second window; in response to the window merging operation, the target device may send a window merging event to the source device; the target device receives the screen projection data sent by the source device in response to the window merging event; the target device displays a first interface according to the screen projection data, where the interface of the first application task and the interface of the second application task are displayed in the first interface in split-screen form.
In a possible implementation, each window displayed by the target device may contain a boundary hot zone; the window merging operation refers to: a release operation input by the user after the user drags the first window, when the overlapping area between the boundary hot zone of the first window and the boundary hot zone of the second window is greater than an area threshold.
In a seventh aspect, the present application provides a screen projection display method, including: a target device displays a first window and a second window, where the first window includes the interface of a first application task projected by a source device, and the second window includes the interface of a second application task projected by the source device (that is, a multi-window screen projection scenario); subsequently, if the target device detects that the user inputs a window floating operation used to float the first window on the second window, the target device may send a corresponding window floating event to the source device; in response to the window floating event, the source device may obtain the screen projection data of the first window floating on the second window and send the screen projection data to the target device; the target device may then display a first interface according to the screen projection data, where, in the first interface, the interface of the first application task is displayed on the second window in the form of a floating window.
That is to say, in a multi-window screen projection scenario, the user can drag one window to float on another window on the target device, which can trigger the target device and the source device to interact to implement the window floating function, so that the display content in the dragged window floats over the other window, thereby facilitating the user's efficient management of the multiple windows projected on the target device and improving the user experience.
In a possible implementation, after the target device sends the window floating event to the source device, the method further includes: in response to the window floating event, the source device sets the window attribute of the first window to the attribute of a floating window. For example, the WMS in the source device may set the window attribute of the first window to the attribute of a floating window.
In a possible implementation, the source device may include a first display module (for example, display 1), which is used to provide screen projection data to the target device during screen projection; the source device may obtain the corresponding screen projection data from the first display module.
Before the source device receives the window floating event, the first application task and the second application task may both run in the first display module, with the second application task running at the top of the stack of the first display module; then, after the source device receives the window floating event, the source device may move the first application task to the top of the stack of the first display module to run. After the first application task is moved to the top of the stack of the first display module, the second application task is pushed into the stack and may still be set to the visible attribute. Since the first window in which the first application task is located has the attribute of a floating window and the second window in which the second application task is located has the default attribute, the source device may draw the interface of the second application task in full-screen form in the first display module and draw the interface of the first application task in the form of a floating window on the upper layer of the interface of the second application task. Subsequently, the source device may send the interface containing the first application task and the second application task (that is, the screen projection data) to the target device in the form of a video stream, so that the target device displays the interface of the first window floating on the second window.
In another possible implementation, the source device may include a first display module (for example, display 1) and a second display module (for example, display 0). The first display module is used to provide screen projection data to the target device during screen projection, and the source device may obtain the corresponding screen projection data from the first display module; the second display module is used to provide display data to the source device and to provide screen projection data to the target device during screen projection, that is, the source device may obtain display data from the second display module to display on its own display screen, and, in the screen projection scenario, the source device may also obtain the screen projection data of the main display interface from the second display module.
Before the window floating event is received, the first application task may run in the first display module and the second application task may run in the second display module; after the source device receives the window floating event, the source device may move the first application task from the first display module to the top of the stack of the second display module to run, that is, perform a stack moving operation. After the first application task is moved to the top of the stack of the second display module, the second application task is pushed into the stack and may still be set to the visible attribute. Since the first window in which the first application task is located has the attribute of a floating window and the second window in which the second application task is located has the default attribute, the source device may draw the interface of the second application task in full-screen form in the second display module and draw the interface of the first application task in the form of a floating window on the upper layer of the interface of the second application task. Subsequently, the source device may send the interface containing the first application task and the second application task (that is, the screen projection data) to the target device in the form of a video stream, so that the target device displays the interface of the first window floating on the second window.
Meanwhile, since the second display module is also used to provide display data to the source device, the source device may obtain the interface containing the first application task and the second application task from the second display module and display the interface of the first window floating on the second window in synchronization with the target device.
In a possible implementation, the window floating operation may be: a release operation input by the user after the user drags the first window, when the overlapping area between the first window and the second window is greater than an area threshold and the duration is greater than a time threshold. Alternatively, the window floating operation may be: an operation in which the user drags the first window so that the overlapping area between the first window and the second window is greater than the area threshold and the duration is greater than the time threshold, that is, the window floating function can also be implemented without the user inputting a release operation. Of course, the window floating operation may also be another predefined operation, which is not limited in the present application.
In an eighth aspect, the present application provides a screen projection display method, including: a source device projects the interface of a first application task to a first window of a target device for display, and projects the interface of a second application task to a second window of the target device for display; the source device may receive a window floating event sent by the target device, where the window floating event is used to indicate that the user has input a window floating operation that floats the first window on the second window; in response to the window floating event, the source device may obtain the screen projection data of the first window floating on the second window and send the screen projection data to the target device, implementing the window floating function described above.
In a possible implementation, in response to the window floating event, the source device may also set the window attribute of the first window to a floating window.
In a possible implementation, the source device includes a first display module, which is used to provide screen projection data to the target device during screen projection; before the source device receives the window floating event, the first application task and the second application task both run in the first display module, and the second application task runs at the top of the stack of the first display module; after the source device receives the window floating event, the method further includes: the source device moves the first application task to the top of the stack of the first display module to run.
In a possible implementation, the source device includes a first display module and a second display module, where the first display module is used to provide screen projection data to the target device during screen projection, and the second display module is used to provide display data to the source device and to provide screen projection data to the target device during screen projection; before the window floating event is received, the first application task runs in the first display module and the second application task runs in the second display module; after the source device receives the window floating event, the method further includes: the source device moves the first application task from the first display module to the top of the stack of the second display module to run.
In a ninth aspect, the present application provides a screen projection display method, including: a target device displays a first window and a second window, where the first window includes the interface of a first application task projected by a source device, and the second window includes the interface of a second application task projected by the source device; the target device detects that the user inputs a window floating operation used to float the first window on the second window; in response to the window floating operation, the target device may send a window floating event to the source device; the target device receives the screen projection data sent by the source device in response to the window floating event; the target device displays a first interface according to the screen projection data, where, in the first interface, the interface of the first application task is displayed on the second window in the form of a floating window.
In a possible implementation, the window floating operation refers to: a release operation input by the user after the user drags the first window, when the overlapping area between the first window and the second window is greater than an area threshold and the duration is greater than a time threshold.
It should be noted that the window covering operation, window merging operation, and window floating operation may be any predefined operations, which are not limited in the present application. For example, after it is detected that the user inputs the window covering operation, the source device and the target device may be triggered to interact to implement the window floating function. For another example, after it is detected that the user inputs the window merging operation, the source device and the target device may be triggered to interact to implement the window covering function.
In addition, the source device and the target device may also use the above methods to superimpose multiple functions among the window covering function, the window merging function, and the window floating function. For example, in response to a user operation 1, the source device and the target device may first merge window 1 and window 2 and then, in response to a user operation 2, merge the merged window with window 3. For another example, in response to a user operation 1, the source device and the target device may display window 1 floating on window 2 and then, in response to a user operation 2, merge the window containing the floating window with window 3.
In a tenth aspect, the present application provides an electronic device, where the electronic device is a source device including: a memory, a display screen, a communication module, and one or more processors, where the memory and the display screen are coupled to the processors. The memory is used to store computer program code, where the computer program code includes computer instructions; when the electronic device runs, the processors are used to execute the one or more computer instructions stored in the memory, so that the electronic device performs the screen projection display method performed by the source device in any of the above aspects.
In an eleventh aspect, the present application provides an electronic device, where the electronic device is a target device including: a memory, a display screen, and one or more processors, where the memory and the display screen are coupled to the processors. The memory is used to store computer program code, where the computer program code includes computer instructions; when the electronic device runs, the processors are used to execute the one or more computer instructions stored in the memory, so that the electronic device performs the screen projection display method performed by the target device in any of the above aspects.
In a twelfth aspect, the present application provides a screen projection display system, including the above source device and target device, where the source device and the target device can, through interaction, perform the screen projection display method described in any of the above aspects.
In a thirteenth aspect, the present application provides a computer-readable storage medium including computer instructions. When the computer instructions run on an electronic device, the electronic device is caused to perform the screen projection display method described in any of the above aspects.
In a fourteenth aspect, the present application provides a computer program product, which, when running on an electronic device, causes the electronic device to perform the screen projection display method described in any of the above aspects.
It can be understood that the electronic devices, the computer-readable storage medium, and the computer program product provided in the above aspects are all applied to the corresponding methods provided above; therefore, for the beneficial effects they can achieve, reference may be made to the beneficial effects of the corresponding methods provided above, and details are not repeated here.
BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1 is a schematic diagram of a multi-window screen projection scenario in the prior art;
FIG. 2 is schematic diagram 1 of an application scenario of a screen projection display method according to an embodiment of the present application;
FIG. 3 is schematic diagram 2 of an application scenario of a screen projection display method according to an embodiment of the present application;
FIG. 4 is a schematic structural diagram of a mobile phone according to an embodiment of the present application;
FIG. 5 is a schematic architectural diagram of an operating system in a mobile phone according to an embodiment of the present application;
FIG. 6 is schematic principle diagram 1 of multi-window screen projection according to an embodiment of the present application;
FIG. 7 is schematic principle diagram 2 of multi-window screen projection according to an embodiment of the present application;
FIG. 8 is schematic principle diagram 3 of multi-window screen projection according to an embodiment of the present application;
FIG. 9 is schematic diagram 3 of an application scenario of a screen projection display method according to an embodiment of the present application;
FIG. 10 is schematic diagram 4 of an application scenario of a screen projection display method according to an embodiment of the present application;
FIG. 11 is interaction schematic diagram 1 of a screen projection display method according to an embodiment of the present application;
FIG. 12 is schematic diagram 5 of an application scenario of a screen projection display method according to an embodiment of the present application;
FIG. 13 is schematic diagram 6 of an application scenario of a screen projection display method according to an embodiment of the present application;
FIG. 14 is interaction schematic diagram 2 of a screen projection display method according to an embodiment of the present application;
FIG. 15 is schematic diagram 7 of an application scenario of a screen projection display method according to an embodiment of the present application;
FIG. 16 is schematic diagram 8 of an application scenario of a screen projection display method according to an embodiment of the present application;
FIG. 17 is schematic diagram 9 of an application scenario of a screen projection display method according to an embodiment of the present application;
FIG. 18 is interaction schematic diagram 3 of a screen projection display method according to an embodiment of the present application;
FIG. 19 is schematic diagram 10 of an application scenario of a screen projection display method according to an embodiment of the present application;
FIG. 20 is schematic diagram 11 of an application scenario of a screen projection display method according to an embodiment of the present application;
FIG. 21 is interaction schematic diagram 4 of a screen projection display method according to an embodiment of the present application;
FIG. 22 is schematic diagram 12 of an application scenario of a screen projection display method according to an embodiment of the present application;
FIG. 23 is schematic diagram 13 of an application scenario of a screen projection display method according to an embodiment of the present application;
FIG. 24 is schematic diagram 14 of an application scenario of a screen projection display method according to an embodiment of the present application;
FIG. 25 is schematic diagram 15 of an application scenario of a screen projection display method according to an embodiment of the present application;
FIG. 26 is interaction schematic diagram 5 of a screen projection display method according to an embodiment of the present application;
FIG. 27 is schematic diagram 16 of an application scenario of a screen projection display method according to an embodiment of the present application;
FIG. 28 is interaction schematic diagram 6 of a screen projection display method according to an embodiment of the present application;
FIG. 29 is schematic diagram 17 of an application scenario of a screen projection display method according to an embodiment of the present application;
FIG. 30 is schematic diagram 18 of an application scenario of a screen projection display method according to an embodiment of the present application;
FIG. 31 is schematic diagram 19 of an application scenario of a screen projection display method according to an embodiment of the present application;
FIG. 32 is a schematic structural diagram of a source device according to an embodiment of the present application;
FIG. 33 is a schematic structural diagram of a target device according to an embodiment of the present application.
DETAILED DESCRIPTION OF EMBODIMENTS
Hereinafter, the terms "first" and "second" are used for descriptive purposes only and cannot be understood as indicating or implying relative importance or implicitly indicating the number of the indicated technical features. Thus, a feature defined with "first" or "second" may explicitly or implicitly include one or more of the features. In the description of the embodiments, unless otherwise specified, "multiple" means two or more.
At present, collaborative use of multiple devices has become a common way of working and entertainment. When multiple devices are used collaboratively, a source device (or called the source) can establish a connection with a target device (or called the sink). For example, the source device can establish a Wi-Fi connection, a Bluetooth connection, or a P2P (peer to peer) connection with the target device. Further, the source device can project images, documents, audio, video, or applications (or tasks in applications) in the source device to the target device for display or playback through the Miracast protocol or the DLNA (Digital Living Network Alliance) protocol, so that the user can use the relevant functions provided by the source device on the target device.
Exemplarily, taking a mobile phone as the source device and a PC as the target device, as shown in FIG. 2, an electronic tag 201 may be provided on the PC; the electronic tag 201 may also be called an NFC (near field communication) tag or an NFC patch. A coil is generally provided in the electronic tag 201, and the device information of the PC may be written into the coil of the electronic tag 201 in advance before the PC leaves the factory. For example, the device information of the PC may include one or more of the PC's name, Bluetooth MAC (media access control) address, or IP address.
When the user needs to project data such as applications and documents in the mobile phone to the PC for display, the user can enable the NFC function of the mobile phone and bring the mobile phone close to or into contact with the electronic tag 201 on the PC. In this way, when the mobile phone and the electronic tag 201 are close to each other, the mobile phone can read the device information of the PC from the electronic tag 201 by transmitting a near-field signal. Further, the mobile phone can establish a wireless communication connection with the PC according to the device information of the PC. For example, the wireless communication connection may specifically be a Bluetooth connection, a Wi-Fi connection, or a Wi-Fi P2P connection, which is not limited in this embodiment of the present application.
Of course, in addition to triggering the mobile phone to establish a connection with the PC through the above "touch" manner, the mobile phone can also trigger the connection by searching for nearby devices or through gestures such as dragging, or the mobile phone can establish a connection with the PC through other communication technologies such as UWB (Ultra Wide Band), which is not limited in this embodiment of the present application.
After the mobile phone establishes the wireless communication connection with the PC, still as shown in FIG. 2, the mobile phone can send the current display interface 202 to the PC in real time through the established wireless communication connection. For example, the mobile phone can transmit the current display interface 202 to the PC in real time in the form of a video stream. The PC can display the display interface 202 through a window 203, and the window 203 may include a control bar 204, which may include buttons such as maximize, minimize, and close. Taking the display interface 202 as the desktop of the mobile phone as an example, after the PC displays the desktop of the mobile phone in the window 203, the user can use the functions provided by the mobile phone in the window 203.
For example, the user can use the keyboard or mouse of the PC to input an operation of opening the video APP in the window 203. The PC can then send the operation input by the user to the mobile phone, triggering the mobile phone to project the video APP to the PC in response to the operation. For example, the mobile phone can run the video APP in the background; as shown in FIG. 3, the mobile phone can send the display data generated when running the tasks of the video APP (that is, the display interface 205 of the video APP) to the PC in the form of a video stream. The PC can then display the display interface 205 of the video APP through a window 206. At this time, the PC can still display the display interface 202 in the window 203 in synchronization with the mobile phone.
That is to say, as the source device, the mobile phone can not only project the interface of the application being displayed (which may subsequently be called the main display interface) to a window of the PC for display, but can also project the interfaces of other applications not being displayed on the mobile phone to windows of the PC for display, so that the PC displays the multiple applications projected from the mobile phone in the form of multiple windows.
In this scenario, the user can manage the display content of each window on the PC. For example, the user can input a pause operation into the window 206 to trigger the PC to pause, in the window 206, the video being played by the video APP. For another example, the user can click the minimize button in the window 206 to trigger the PC to minimize the window 206.
In this embodiment of the present application, in addition to managing each window on the PC (i.e., the target device) individually in the screen projection scenario, the user can also establish associations between the projected windows on the PC through window covering, window merging, or window floating. For example, the user can drag one window projected on the PC to cover another window, so that the covered window is closed. For another example, the user can drag one window projected on the PC to splice with another window, so that the two windows are merged into one window. In this way, the user can manage multiple windows projected on the PC (i.e., the target device) through one operation, thereby managing the multiple windows projected on the target device more efficiently and improving the user experience.
仍以手机为上述投屏场景中的源端设备举例,图4示出了手机的结构示意图。
As shown in FIG. 4, the phone may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, a headset jack 170D, a sensor module 180, and the like.
It can be understood that the structure illustrated in this embodiment does not constitute a specific limitation on the phone. In other embodiments of this application, the phone may include more or fewer components than shown, combine some components, split some components, or use a different component arrangement. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor 110 may include one or more processing units. For example, the processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a memory, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU). Different processing units may be independent components or may be integrated into one or more processors.
A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache. The cache can hold instructions or data that the processor 110 has just used or uses cyclically. If the processor 110 needs those instructions or data again, it can call them directly from the cache. This avoids repeated accesses and reduces the waiting time of the processor 110, thereby improving system efficiency.
In some embodiments, the processor 110 may include one or more interfaces. The interfaces may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a universal serial bus (USB) interface.
The wireless communication function of the phone can be implemented through the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, the modem processor, the baseband processor, and the like.
The antenna 1 and the antenna 2 are used to transmit and receive electromagnetic wave signals. Each antenna in the phone can be used to cover one or more communication frequency bands. Different antennas can also be multiplexed to improve antenna utilization. For example, the antenna 1 can be multiplexed as a diversity antenna for the wireless local area network. In other embodiments, an antenna can be used in combination with a tuning switch.
The mobile communication module 150 can provide wireless communication solutions applied to the phone, including 2G/3G/4G/5G. The mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (LNA), and the like. The mobile communication module 150 can receive electromagnetic waves through the antenna 1, filter and amplify the received electromagnetic waves, and pass them to the modem processor for demodulation. The mobile communication module 150 can also amplify signals modulated by the modem processor and convert them into electromagnetic waves for radiation through the antenna 1. In some embodiments, at least some functional modules of the mobile communication module 150 may be provided in the processor 110. In some embodiments, at least some functional modules of the mobile communication module 150 and at least some modules of the processor 110 may be provided in the same component.
The wireless communication module 160 can provide wireless communication solutions applied to the phone, including wireless local area networks (WLAN) (such as wireless fidelity (Wi-Fi) networks), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), and infrared (IR). The wireless communication module 160 may be one or more components integrating at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, frequency-modulates and filters the electromagnetic wave signals, and sends the processed signals to the processor 110. The wireless communication module 160 can also receive signals to be sent from the processor 110, frequency-modulate and amplify them, and convert them into electromagnetic waves for radiation through the antenna 2.
In some embodiments, the antenna 1 of the phone is coupled with the mobile communication module 150, and the antenna 2 is coupled with the wireless communication module 160, so that the phone can communicate with networks and other devices through wireless communication technologies. The wireless communication technologies may include the global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technologies. The GNSS may include the global positioning system (GPS), the global navigation satellite system (GLONASS), the BeiDou navigation satellite system (BDS), the quasi-zenith satellite system (QZSS), and/or satellite based augmentation systems (SBAS).
The phone implements the display function through the GPU, the display screen 194, the application processor, and the like. The GPU is a microprocessor for image processing and connects the display screen 194 and the application processor. The GPU performs mathematical and geometric calculations for graphics rendering. The processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
The display screen 194 is used to display images, videos, and the like. The display screen 194 includes a display panel. The display panel may use a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, quantum dot light-emitting diodes (QLED), or the like. In some embodiments, the phone may include 1 or N display screens 194, where N is a positive integer greater than 1.
The phone can implement the shooting function through the ISP, the camera 193, the video codec, the GPU, the display screen 194, the application processor, and the like.
The ISP is used to process data fed back by the camera 193. For example, when taking a photo, the shutter opens and light is transmitted through the lens to the camera's photosensitive element; the light signal is converted into an electrical signal, and the photosensitive element passes the electrical signal to the ISP for processing, converting it into an image visible to the naked eye. The ISP can also perform algorithmic optimization on the noise, brightness, and skin tone of the image, and optimize parameters such as the exposure and color temperature of the shooting scene. In some embodiments, the ISP may be provided in the camera 193.
The camera 193 is used to capture static images or video. An object's optical image is generated through the lens and projected onto the photosensitive element. The photosensitive element may be a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The photosensitive element converts the light signal into an electrical signal and then passes the electrical signal to the ISP, which converts it into a digital image signal. The ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into an image signal in a standard format such as RGB or YUV. In some embodiments, the phone may include 1 or N cameras 193, where N is a positive integer greater than 1.
The digital signal processor is used to process digital signals; in addition to digital image signals, it can also process other digital signals. For example, when the phone selects a frequency point, the digital signal processor is used to perform a Fourier transform and the like on the frequency point energy.
The video codec is used to compress or decompress digital video. The phone can support one or more video codecs, so that it can play or record video in multiple encoding formats, for example, moving picture experts group (MPEG) 1, MPEG2, MPEG3, and MPEG4.
The external memory interface 120 can be used to connect an external memory card, such as a Micro SD card, to expand the storage capability of the phone. The external memory card communicates with the processor 110 through the external memory interface 120 to implement the data storage function, for example saving files such as music and videos on the external memory card.
The internal memory 121 can be used to store computer-executable program code, and the executable program code includes instructions. By running the instructions stored in the internal memory 121, the processor 110 performs the various functional applications and data processing of the phone. The internal memory 121 may include a program storage area and a data storage area. The program storage area can store the operating system and the application programs required by at least one function (such as a sound playback function or an image playback function). The data storage area can store data created during use of the phone (such as audio data and a phone book). In addition, the internal memory 121 may include a high-speed random access memory and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or a universal flash storage (UFS).
The phone can implement audio functions, such as music playback and recording, through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the headset jack 170D, the application processor, and the like.
The audio module 170 is used to convert digital audio information into an analog audio signal for output, and also to convert an analog audio input into a digital audio signal. The audio module 170 can also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be provided in the processor 110, or some functional modules of the audio module 170 may be provided in the processor 110.
The speaker 170A, also called a "horn", is used to convert audio electrical signals into sound signals. The phone can play music or take hands-free calls through the speaker 170A.
The receiver 170B, also called an "earpiece", is used to convert audio electrical signals into sound signals. When the phone answers a call or a voice message, the voice can be heard by bringing the receiver 170B close to the ear.
The microphone 170C, also called a "mic" or "mouthpiece", is used to convert sound signals into electrical signals. When making a call or sending a voice message, the user can speak close to the microphone 170C to input the sound signal into it. The phone may be provided with at least one microphone 170C. In other embodiments, the phone may be provided with two microphones 170C, which, in addition to collecting sound signals, can implement a noise reduction function. In still other embodiments, the phone may be provided with three, four, or more microphones 170C to collect sound signals, reduce noise, identify the sound source, implement directional recording, and the like.
The headset jack 170D is used to connect wired headsets. The headset jack 170D may be the USB interface 130, a 3.5 mm open mobile terminal platform (OMTP) standard interface, or a cellular telecommunications industry association of the USA (CTIA) standard interface.
The sensor module 180 may include a pressure sensor, a gyroscope sensor, a barometric pressure sensor, a magnetic sensor, an acceleration sensor, a distance sensor, a proximity light sensor, a fingerprint sensor, a temperature sensor, a touch sensor, an ambient light sensor, a bone conduction sensor, and the like.
Of course, the phone may further include a charging management module, a power management module, a battery, buttons, an indicator, one or more SIM card interfaces, and the like, which is not limited in the embodiments of this application.
The software system of the phone may use a layered architecture, an event-driven architecture, a micro-kernel architecture, a micro-service architecture, or a cloud architecture. The embodiments of this application take the Android system with a layered architecture as an example to illustrate the software structure of the phone.
FIG. 5 is a block diagram of the software structure of the phone according to an embodiment of this application.
The layered architecture divides the software into several layers, each of which has a clear role and division of labor. The layers communicate with each other through software interfaces. In some embodiments, the Android system is divided into four layers: from top to bottom, the application layer, the application framework layer, the Android runtime and system libraries, and the kernel layer.
1. Application layer
The application layer can include a series of application packages.
As shown in FIG. 5, APPs (applications) such as Phone, Memo, Browser, Contacts, Camera, Gallery, Calendar, Maps, Bluetooth, Music, Video, and Messages can be installed in the application layer.
2. Application framework layer
The application framework layer provides an application programming interface (API) and a programming framework for the applications in the application layer. The application framework layer includes some predefined functions.
In the embodiments of this application, as shown in FIG. 5, the application framework layer may include an activity manager service (ActivityManagerService, AMS) and a window manager service (WindowManagerService, WMS).
AMS can be used to manage the life cycle of running applications. An application usually runs in the operating system in the form of Activities. For each Activity, there is a corresponding application record (ActivityRecord) in the activity manager, and this ActivityRecord records the state of that Activity. The activity manager can use this ActivityRecord as an identifier to schedule the application's Activity processes.
WMS can be used to manage the graphical user interface (GUI) resources used on the phone screen, which may specifically include: window creation and destruction, window display and hiding, window layout, focus management, input method management, wallpaper management, and the like.
In the embodiments of this application, take the phone running the desktop (also called the desktop application, launcher, etc.) as an example. After the phone starts running the desktop, as shown in FIG. 6, AMS can create a corresponding application stack for the desktop, for example stack A1, in the phone's default display module (for example, display 0). Generally, display 0 corresponds to the phone's screen; that is, the display interface drawn in display 0 is ultimately output to the phone's screen for display. A display module may also be called a virtual screen or virtual display.
By way of example, stack A1 may include one or more Activities (activities, also called tasks or application tasks) that the desktop needs to execute. When executing the Activities in stack A1, AMS can call WMS to draw the corresponding display interface in display 0 in real time (for example, the display interface 601 shown in FIG. 6); the display interface 601 is associated with window 1 created by WMS. WMS can then output the display interface 601 produced in display 0 to the phone's display, which displays the display interface 601 in window 1, thereby presenting the desktop to the user on the phone screen.
It should be noted that display 0 may also include stacks corresponding to other applications on the phone, such as stack A2. Generally, the stack at the top of display 0 is the stack of the application currently running in the foreground of the phone; the stacks behind the top, such as stack A2, belong to applications running in the background. AMS can set relevant stack attributes for each stack, for example, whether it is visible and whether it is a split-screen application. Generally, the stack at the top has the visible attribute, and the display interface produced when such a stack runs can be shown on the phone screen, while stacks lower in the list have the invisible attribute, and the display interfaces produced when they run are not shown on the phone screen.
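The visibility rule described above can be pictured with a toy model; the class below is invented purely for illustration and is not framework code. It captures only the ordering semantics: on display 0, the stack at the head of the display module's stack list is the visible one.

```java
import java.util.ArrayDeque;
import java.util.Deque;

// Toy model of a display module holding application stacks. The top-most
// stack is the one whose display interface reaches the screen (display 0
// semantics); on the projection display, every stack may stay visible.
class DisplayModule {
    private final Deque<String> stacks = new ArrayDeque<>();

    void moveToTop(String stack) {
        stacks.remove(stack);   // no-op if the stack is not present yet
        stacks.push(stack);     // push to the head, i.e. the top of the list
    }

    boolean isVisibleOnDefaultDisplay(String stack) {
        return stack.equals(stacks.peek());
    }
}
```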
If the phone has enabled the projection function and established a connection with the target device (for example, a PC), still as shown in FIG. 6, WMS can also output the display interface 601 produced by stack A1 at the top of display 0 to the PC, so that the PC can display, in a corresponding window, the display interface 601 currently being displayed by the phone, thereby implementing the projection function in the multi-device collaboration scenario.
In this projection scenario, if it is detected that the user operates the display interface 601 on the PC to open an application on the phone (for example, the video APP), the PC can send a corresponding projection instruction 1 to the phone, instructing the phone to project the application task of the video APP to the PC for display. At this time, as shown in FIG. 7, in response to projection instruction 1, AMS can create a new display module (for example, display 1), and then create stack B1 corresponding to the video APP in display 1. Display 1 corresponds to the PC's screen; that is, the display interface drawn in display 1 is ultimately output to the PC's screen for display.
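A rough public-API analogue of creating such an additional display module is DisplayManager.createVirtualDisplay(); the path actually taken inside AMS is framework-internal and not part of the public SDK, so the following Java sketch only illustrates the idea under that assumption:

```java
import android.content.Context;
import android.hardware.display.DisplayManager;
import android.hardware.display.VirtualDisplay;
import android.view.Surface;

final class ProjectionDisplayFactory {
    // Sketch: back the projection display with a Surface whose buffers are
    // then encoded and streamed to the PC (encoder wiring omitted).
    static VirtualDisplay createProjectionDisplay(Context ctx, Surface sinkSurface,
                                                  int width, int height, int dpi) {
        DisplayManager dm = (DisplayManager) ctx.getSystemService(Context.DISPLAY_SERVICE);
        return dm.createVirtualDisplay(
                "projection-display-1",   // debug name, standing in for "display 1"
                width, height, dpi,
                sinkSurface,
                DisplayManager.VIRTUAL_DISPLAY_FLAG_PRESENTATION);
    }
}
```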
Similar to stack A1, stack B1 may include one or more Activities that the video APP needs to execute. When executing the Activities in stack B1, AMS can call WMS to draw the corresponding display interface in display 1 in real time (for example, the display interface 701 shown in FIG. 7); the display interface 701 is associated with window 2 created by WMS. WMS can then output the display interface 701 produced in display 1 to the PC, which displays the display interface 701 in window 2, thereby projecting the application task of the video APP on the phone to the PC for display.
Similarly, the user can project other applications on the phone to the PC in the above manner. For example, if it is detected that the user operates the PC to open the chat APP on the phone, the PC can send a corresponding projection instruction 2 to the phone, instructing the phone to project the application task of the chat APP to the PC for display. As shown in FIG. 8, in response to projection instruction 2, AMS can create stack B2 corresponding to the chat APP in display 1. Unlike in display 0, all stacks in display 1 can be set to the visible attribute, and the stack at the top is usually that of the application most recently opened or operated by the user. When executing the Activities in stack B2, AMS can call WMS to draw the corresponding display interface in display 1 in real time (for example, the display interface 801 shown in FIG. 8); the display interface 801 is associated with window 3 created by WMS. WMS can then output the display interface 801 produced in display 1 to the PC, which displays the display interface 801 in window 3, thereby projecting the application task of the chat APP on the phone to the PC for display.
At this time, as shown in FIG. 9, the PC can display, in real time, the desktop 601 running in display 0 of the phone through window 1, the display interface 701 of the video APP running in display 1 through window 2, and the display interface 801 of the chat APP running in display 1 through window 3, thereby projecting multiple application tasks of the phone to the PC in the form of multiple windows. It should be noted that the application tasks in different windows may belong to the same application or to different applications, which is not limited in the embodiments of this application.
Window 1 corresponds to stack A1 at the top of display 0, and the other windows (for example, window 2 and window 3) correspond to the respective stacks in display 1. That is, in the above projection scenario, one window on the PC can display the phone's display interface synchronously (for example, the display interface 601 of the desktop above), and one or more other windows on the PC can be used to project the display interfaces of other applications on the phone.
In the embodiments of this application, in the above projection scenario, the user can perform operations such as covering or merging on the multiple windows displayed on the PC, triggering the phone to establish associations between the windows projected on the PC so as to manage them, enabling functions such as replacement, split screen, floating, or merging among the multiple windows on the PC and improving the user experience in the projection scenario.
The specific display process of the phone (the source device) and the PC (the target device) during multi-application projection will be described in detail in the subsequent embodiments and is therefore not elaborated here.
In addition, the application framework layer may further include a power manager service, a content provider service, a view system, a resource manager service, a notification manager service, and the like, which is not limited in the embodiments of this application.
3. Android runtime and system libraries
The Android runtime includes core libraries and a virtual machine. The Android runtime is responsible for the scheduling and management of the Android system.
The core libraries consist of two parts: one part is the functional functions that the Java language needs to call, and the other part is the core libraries of Android.
The application layer and the application framework layer run in the virtual machine. The virtual machine executes the Java files of the application layer and the application framework layer as binary files. The virtual machine performs functions such as object life cycle management, stack management, thread management, security and exception management, and garbage collection.
The system libraries may include multiple functional modules, for example: a surface manager, media libraries, a 3D graphics processing library (for example, OpenGL ES), and a 2D graphics engine (for example, SGL).
The surface manager is used to manage the display subsystem and provides the fusion of 2D and 3D layers for multiple applications. The media libraries support playback and recording in multiple common audio and video formats, as well as static image files and the like. The media libraries can support multiple audio and video encoding formats, for example MPEG4, H.264, MP3, AAC, AMR, JPG, and PNG. The 3D graphics processing library is used to implement 3D graphics drawing, image rendering, compositing, layer processing, and the like. The 2D graphics engine is a drawing engine for 2D drawing.
4. Kernel layer
The kernel layer is the layer between hardware and software. The kernel layer contains at least a display driver, a camera driver, an audio driver, and a sensor driver, which is not limited in the embodiments of this application.
Hereinafter, still taking the phone as the source device and the PC as the target device in the projection scenario, a screen projection display method provided by the embodiments of this application is described in detail with reference to the accompanying drawings.
With reference to the descriptions of FIG. 6 to FIG. 9, after the phone establishes a communication connection with the PC, the phone can project the application task running in the foreground (for example, the desktop) to the PC, so that the PC displays the display interface 601 of the desktop in window 1. At this time, stack A1 corresponding to the desktop runs in display 0 of the phone. That is, the display interface in window 1 is synchronized with the display interface on the phone screen; hereinafter, the display interface in window 1 in the projection scenario may be called the main display interface of the source device (that is, the phone).
Here, the phone and the PC displaying synchronized display interfaces in window 1 means that the specific content displayed by the two can be the same, while display parameters such as shape, size, position, arrangement, resolution, or DPI (dots per inch) can differ, which is not limited in the embodiments of this application.
By way of example, still as shown in FIG. 9, the phone can also project the application task of the video APP on the phone to the PC, so that the PC displays the display interface 701 of the video APP in window 2. At this time, stack B1 corresponding to the video APP runs in display 1 of the phone. Likewise, the phone can project the application task of the chat APP on the phone to the PC, so that the PC displays the display interface 801 of the chat APP in window 3. At this time, stack B2 corresponding to the chat APP also runs in display 1 of the phone.
That is, apart from window 1 on the PC, which is displayed in synchronization with the phone screen and corresponds to the phone's default display 0, the other windows projected onto the PC (for example, window 2 or window 3) correspond to the newly created display 1 of the phone. The display content produced in display 1 can be sent to the PC's display for presentation but is generally not sent to the phone's screen. Correspondingly, the display content produced in display 0 is generally sent to the phone's screen for display; in the projection scenario, the display content produced by the stack at the top of display 0 can also be sent to the PC's display.
It should be noted that, in addition to the display interface of the relevant application, a window displayed by the PC during projection (for example, window 1 to window 3 above) may include components such as a control bar, a title bar, a status bar, or a toolbar. For example, the control bar may include a maximize button, a minimize button, a close button, and the like.
Of course, the phone can also project more application tasks to the PC in the above manner, so that the PC displays the display interfaces of the corresponding application tasks in the form of windows, which is not limited in the embodiments of this application.
In the embodiments of this application, still taking the above projection scenario as an example, the PC, as the target device, can receive the operations input by the user on each projected window. For example, the user can drag one window on the PC to cover another window.
By way of example, as shown in (a) of FIG. 10, the PC can display window 1, window 2, and window 3 in the above projection scenario, where window 1 includes the display interface 601 of the phone's desktop, window 2 includes the display interface 701 of the video APP on the phone, and window 3 includes the display interface 801 of the chat APP on the phone. The user can operate the windows on the PC using input devices such as a mouse and a keyboard.
For example, as shown in FIG. 11, when the PC detects that the user drags window 2 with the mouse, the PC can monitor the dragged position of window 2 on the screen in real time. As shown in (b) of FIG. 10, if the dragged window 2 is detected to overlap another window on the PC (for example, window 3), the PC can calculate the area of the overlapping region between window 2 and window 3 (that is, overlapping area 1). If overlapping area 1 is detected to be greater than a preset area threshold (for example, an area threshold of 80% of window 3), the user probably intends to replace window 3 with the dragged window 2. At this point, if the user releases the mouse, the mouse can send a corresponding release event to the PC, indicating that the user has released the mouse and stopped dragging window 2. Then, still as shown in FIG. 11, the PC can respond to the release event by sending a corresponding window covering event 1 to the phone; window covering event 1 indicates that the user performed the action of dragging window 2 to cover window 3. After receiving window covering event 1, the phone can call AMS and WMS to control the PC to stop displaying window 3 while keeping the dragged window 2 and the display interface 701 in it, so that the PC displays the interface shown in (c) of FIG. 10.
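The overlap test itself reduces to rectangle arithmetic. A minimal sketch in plain Java (the 0.8 ratio mirrors the 80% example above):

```java
import java.awt.Rectangle;

final class OverlapCheck {
    // Returns true when the dragged window covers more than `ratio` of the
    // target window's area, e.g. ratio = 0.8 for the 80% threshold above.
    static boolean coversTarget(Rectangle dragged, Rectangle target, double ratio) {
        Rectangle overlap = dragged.intersection(target);
        if (overlap.isEmpty()) {
            return false;   // no overlapping region at all
        }
        double overlapArea = (double) overlap.width * overlap.height;
        double targetArea = (double) target.width * target.height;
        return overlapArea > ratio * targetArea;
    }
}
```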
For example, as shown in (a) of FIG. 12, before the phone receives window covering event 1, stack B1 of the video APP in window 2 runs in display 1 of the phone, and stack B2 of the chat APP in window 3 also runs in display 1. Both stack B1 and stack B2 have the visible attribute, with stack B2 at the top. Stack A1 of the desktop in window 1 runs in display 0 of the phone. After receiving window covering event 1, the phone can determine from it that the user dragged window 2 to cover window 3. At this point, the phone can respond to window covering event 1 by killing the chat APP. For example, WMS on the phone can close (also called destroy) window 3 corresponding to the chat APP, and AMS on the phone can delete stack B2 corresponding to the chat APP from display 1. After stack B2 is deleted, WMS no longer draws the display interface 801 produced by the chat APP in display 1 and no longer sends the display interface 801 to the PC.
Alternatively, as shown in FIG. 11, after receiving window covering event 1, the phone can instruct WMS to set window 3 to the invisible attribute and instruct AMS to move stack B1 corresponding to the video APP to the top of display 1. At this point, as shown in (b) of FIG. 12, stack B1 is at the top of display 1 with the visible attribute, while stack B2 has the invisible attribute. Stack A1 corresponding to the desktop remains at the top of display 0 with the visible attribute. In this case, the chat APP is not killed but switched to run in the background of the phone.
In this way, when executing the Activities in stack B1, AMS on the phone can call WMS to draw, in display 1 and in real time, the display interface 701 produced by the video APP; the display interface 701 remains associated with window 2. WMS can then send the display interface 701 produced by the video APP to the PC in the form of a video stream. After receiving the display interface 701 produced by the video APP, the PC can continue to display it in window 2 in real time. Since the stack attribute of stack B2 is invisible and the window attribute of window 3 is also invisible, WMS neither continues to draw the display interface 801 produced by the chat APP in display 1 nor sends the display interface 801 to the PC.
Thus, as shown in (c) of FIG. 10, after the user drags window 2, projected from the phone onto the PC, to cover window 3, the window covering function — visually closing window 3 while keeping window 2 — is achieved. Of course, WMS can still send the display interface 601 of the desktop drawn in display 0 to the PC and to the phone's screen in real time, so that the phone and window 1 on the PC continue to display the display interface 601 synchronously.
The above embodiment is described with the example of the user dragging window 2 on the PC to cover window 3. In other embodiments, the user can also drag a window on the PC to cover the window on the PC containing the phone's main display interface (for example, window 1 above), that is, drag a window to cover the window synchronized with the phone's display interface. In this case, the PC can also implement a corresponding window covering function.
By way of example, as shown in (a) of FIG. 13, still take the PC displaying window 1, window 2, and window 3 in the above projection scenario as an example. As shown in FIG. 14, when the PC detects that the user drags window 2 with the mouse, the PC can monitor the dragged position of window 2 on the screen in real time. As shown in (b) of FIG. 13, if the dragged window 2 is detected to overlap window 1 containing the phone's main display interface, the PC can calculate, in real time, overlapping area 2 between window 2 and window 1. When overlapping area 2 is greater than the preset area threshold, if the PC receives a release event sent by the mouse, the PC can respond to the release event by sending a corresponding window covering event 2 to the phone; window covering event 2 indicates that the user performed the action of dragging window 2 to cover window 1.
Since the covered window 1 displays the phone's main display interface, as shown in FIG. 14, after the phone receives window covering event 2 — unlike in FIG. 11 — the phone can call AMS and WMS to control the PC to stop displaying window 2 and instead continue to display window 2's display interface 701 in window 1, so that the PC displays the interface shown in (c) of FIG. 13.
For example, as shown in (a) of FIG. 15, before the phone receives window covering event 2, stack A1 of the desktop in window 1 runs in display 0 of the phone with the visible attribute. Stack B1 of the video APP in window 2 runs in display 1, and stack B2 of the chat APP in window 3 also runs in display 1. Both stack B1 and stack B2 have the visible attribute, with stack B2 at the top. After receiving window covering event 2, the phone can determine from window covering event 2 that the user dragged window 2 to cover window 1 and that window 1 is the phone's main display interface projected onto the PC. At this point, still as shown in FIG. 14, the phone can respond to window covering event 2 by sending WMS an instruction to close window 2, and by sending AMS an instruction to move stack B1 from display 1 to display 0.
Accordingly, WMS can destroy window 2 in response to the received instruction. AMS can respond to the received instruction by calling the preset stack-moving interface (for example, moveStackToDisplay()) to move stack B1 from display 1 to the top of display 0. At this point, as shown in (b) of FIG. 15, stack B1 is at the top of display 0 with the visible attribute. Stack B2 corresponding to the chat APP remains in display 1 and remains visible. Stack A1 (that is, the desktop's stack), originally at the top of display 0, can be pushed down into the stack list (in which case stack A1 has the invisible attribute), or stack A1 can be deleted, which is not limited in the embodiments of this application.
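Note that moveStackToDisplay() is a framework-internal AMS interface rather than public SDK surface. The closest publicly documented analogue, shown here only to make the stack-to-display routing concrete, is launching an activity onto a chosen display via ActivityOptions.setLaunchDisplayId(); inside the framework, AMS instead re-parents the existing stack directly.

```java
import android.app.ActivityOptions;
import android.content.Context;
import android.content.Intent;

final class DisplayRouting {
    // Sketch: start (or bring forward) an application task on a specific
    // display, e.g. displayId 0 for the phone screen in the scenario above.
    static void launchOnDisplay(Context ctx, Intent appIntent, int displayId) {
        ActivityOptions options = ActivityOptions.makeBasic();
        options.setLaunchDisplayId(displayId);
        appIntent.addFlags(Intent.FLAG_ACTIVITY_NEW_TASK);
        ctx.startActivity(appIntent, options.toBundle());
    }
}
```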
In this way, when executing the Activities in stack B1, AMS on the phone can call WMS to draw the display interface 701 produced by the video APP in display 0 in real time; the display interface 701 is now associated with window 1, which displays the phone's main display interface. Since stack B1 has been moved into display 0, as shown in FIG. 16, WMS can, on the one hand, send the display interface 701 produced by the video APP to the phone's screen for display and, on the other hand, send the display interface 701 produced by the video APP to the PC in the form of a video stream. As shown in (c) of FIG. 13, after receiving the display interface 701 produced by the video APP, the PC can display it in window 1 in real time. At this time, the phone and window 1 on the PC display synchronized display interfaces; that is, the display interface in window 1 is the phone's main display interface during projection. Of course, WMS can also send the display interface 801 of the chat APP drawn in display 1 to the PC, which continues to display the display interface 801 in window 3 in real time.
That is, in the multi-window projection scenario, by dragging one window on the PC (the target device) to cover another, the user can trigger the PC and the phone (the source device) to interact to implement the window covering function — closing the covered window while keeping the dragged one — making it convenient for the user to manage the multiple windows on the target device.
Generally, after AMS on the phone performs a stack-moving operation on a stack, WMS refreshes the stacks in the corresponding display module and rereads the configuration information of the stack now at the top. The configuration information of a stack can record parameters such as the resolution and aspect ratio used when displaying the corresponding display interface. If the configuration information of the stack at the top has changed after the refresh, WMS re-executes the stack at the top.
In the embodiments of this application, in the above scenario where the user drags window 2 to cover window 1, when AMS on the phone moves stack B1 from display 1 to the top of display 0, AMS can set the configuration information of stack B1 to be the same as that of stack A1, which was originally at the top of display 0. Then, after WMS refreshes display 0, it reads that the configuration information of the stack at the top of display 0 has not changed, so WMS does not start executing stack B1 from scratch but continues executing the Activities that were in stack B1 before the stack move. In this way, after the video APP in window 2 is switched to window 1, the display interface 701 of the video APP can be switched into window 1 seamlessly, achieving seamless continuation of window content and improving the user experience.
In other embodiments, in the multi-window projection scenario, the user can also drag one window on the PC (the target device) to cover another, triggering the PC and the phone (the source device) to interact to implement the window floating function, so that the dragged window is displayed floating on the covered window.
By way of example, as shown in (a) of FIG. 17, still take the PC displaying window 1, window 2, and window 3 in the above projection scenario as an example. As shown in FIG. 18, when the PC detects that the user drags window 2 with the mouse, the PC can monitor the dragged position of window 2 on the screen in real time. As shown in (b) of FIG. 17, if the dragged window 2 is detected to overlap window 3, the PC can calculate overlapping area 1 between window 2 and window 3 in real time. When overlapping area 1 is greater than the preset area threshold, the PC starts timing. When the duration for which overlapping area 1 stays above the area threshold exceeds a preset time threshold (for example, 2 s), the user probably intends to display window 2 floating on window 3. At this point, still as shown in FIG. 18, if the PC receives a release event sent by the mouse, the PC can respond to the release event by sending a corresponding window floating event 1 to the phone; window floating event 1 indicates that the user performed the action of dragging window 2 to float on window 3. After receiving window floating event 1, the phone can call AMS and WMS to control the PC to display window 2 on window 3 in the form of a floating window, so that the PC displays the interface shown in (c) of FIG. 17.
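What distinguishes the floating gesture from the covering gesture is dwell time: both require the overlap threshold to be exceeded, but floating additionally requires the overlap to persist until release. A sketch of that bookkeeping (the 2 s value mirrors the example above):

```java
// Tracks how long the overlap condition has held before the mouse release.
final class DwellTracker {
    private static final long DWELL_MS = 2_000;   // the 2 s example threshold
    private long overlapSince = -1;               // -1 = not currently overlapping

    // Call on every drag-move event with the current overlap test result.
    void onDragMove(boolean overlapsEnough, long nowMs) {
        if (!overlapsEnough) {
            overlapSince = -1;
        } else if (overlapSince < 0) {
            overlapSince = nowMs;
        }
    }

    // Call when the release event arrives: true -> window floating event,
    // false (while still overlapping) -> window covering event.
    boolean isFloatGesture(long nowMs) {
        return overlapSince >= 0 && nowMs - overlapSince >= DWELL_MS;
    }
}
```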
For example, as shown in (a) of FIG. 19, before the phone receives window floating event 1, stack B1 of the video APP in window 2 runs in display 1 of the phone, and stack B2 of the chat APP in window 3 also runs in display 1. Both stack B1 and stack B2 have the visible attribute, with stack B2 at the top. Stack A1 of the desktop in window 1 runs in display 0 with the visible attribute. After receiving window floating event 1, the phone can determine from it that the user dragged window 2 to float on window 3. At this point, still as shown in FIG. 18, the phone can respond to window floating event 1 by sending WMS an instruction to set window 2 as a floating window, and by sending AMS an instruction to move stack B1 to the top of display 1.
Accordingly, WMS can respond to the received instruction by modifying the window attribute of window 2 to that of a floating window (FloatingWindow); window 2 with the modified attribute (that is, the floating window) remains associated with the display interface 701 produced by the video APP. AMS can respond to the received instruction by moving stack B1 to the top of display 1. At this point, as shown in (b) of FIG. 19, stack B1 is at the top of display 1, and stack B2 and stack B1 both keep the visible attribute. Stack A1 corresponding to the desktop remains at the top of display 0.
In this way, when executing the Activities in stack B1 and stack B2, AMS on the phone can call WMS to draw, in display 1, the display interface 701 produced by the video APP and the display interface 801 produced by the chat APP, with the display interface 701 drawn as a floating window on top of the display interface 801. WMS can then send the video stream containing the display interfaces 701 and 801 to the PC, which displays, in window 3, the display interface containing window 2 (that is, the floating window). At this point, as shown in (c) of FIG. 17, window 2 continues to be displayed on window 3 in the form of a floating window.
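At the framework level, the floating behavior comes from WMS changing the window's attributes. An app-level way to picture a floating window is the public overlay window type; the sketch below assumes the SYSTEM_ALERT_WINDOW permission has been granted, and the size values are placeholders.

```java
import android.content.Context;
import android.graphics.PixelFormat;
import android.view.Gravity;
import android.view.View;
import android.view.WindowManager;

final class FloatingWindowHelper {
    // Sketch: attach `content` as a small always-on-top window, roughly what
    // the "floating window" attribute achieves for window 2 above.
    static void showAsFloatingWindow(Context ctx, View content) {
        WindowManager wm = (WindowManager) ctx.getSystemService(Context.WINDOW_SERVICE);
        WindowManager.LayoutParams lp = new WindowManager.LayoutParams(
                600, 400,                                          // placeholder size
                WindowManager.LayoutParams.TYPE_APPLICATION_OVERLAY,
                WindowManager.LayoutParams.FLAG_NOT_TOUCH_MODAL,   // pass through outside touches
                PixelFormat.TRANSLUCENT);
        lp.gravity = Gravity.TOP | Gravity.END;
        wm.addView(content, lp);
    }
}
```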
The above embodiment is described with the example of the user dragging window 2 on the PC to float on window 3. In other embodiments, the user can also drag a window to float on the window on the PC containing the phone's main display interface (for example, window 1 above). In this case, the PC can also implement a corresponding window floating function.
By way of example, as shown in (a) of FIG. 20, still take the PC displaying window 1, window 2, and window 3 in the above projection scenario as an example. As shown in FIG. 21, when the PC detects that the user drags window 2 with the mouse, the PC can monitor the dragged position of window 2 on the screen in real time. As shown in (b) of FIG. 20, if the dragged window 2 is detected to overlap window 1 containing the phone's main display interface, the PC can calculate overlapping area 2 between window 2 and window 1 in real time. When overlapping area 2 is greater than the preset area threshold, the PC starts timing. Still as shown in FIG. 21, when the duration for which overlapping area 2 stays above the area threshold exceeds the preset time threshold (for example, 2 s), if the PC receives a release event sent by the mouse, the PC can respond to the release event by sending a corresponding window floating event 2 to the phone; window floating event 2 indicates that the user performed the action of dragging window 2 to float on window 1. After receiving window floating event 2, the phone can call AMS and WMS to control the PC to display window 2 on window 1 in the form of a floating window, so that the PC displays the interface shown in (c) of FIG. 20. The difference is that, this time, the phone screen also displays window 2 synchronously in the form of a floating window.
For example, as shown in (a) of FIG. 22, before the phone receives window floating event 2, stack B1 of the video APP in window 2 runs in display 1 of the phone, and stack B2 of the chat APP in window 3 also runs in display 1. Both stack B1 and stack B2 have the visible attribute, with stack B2 at the top. Stack A1 of the desktop in window 1 runs in display 0 with the visible attribute. After receiving window floating event 2, the phone can determine from it that the user dragged window 2 to float on window 1 and that the display interface of window 1 is synchronized with the phone's display interface. At this point, still as shown in FIG. 21, the phone can respond to window floating event 2 by sending WMS an instruction to set window 2 as a floating window, and by sending AMS an instruction to move stack B1 from display 1 to display 0.
Accordingly, WMS can respond to the received instruction by modifying the window attribute of window 2 to that of a floating window (FloatingWindow); window 2 with the modified attribute (that is, the floating window) remains associated with the display interface 701 produced by the video APP. AMS can respond to the received instruction by calling the preset stack-moving interface (for example, moveStackToDisplay()) to move stack B1 from display 1 to the top of display 0. At this point, as shown in (b) of FIG. 22, stack B1 is at the top of display 0, and stack A1, originally at the top of display 0, has been pushed down into the stack list. Stack B1 at the top has the visible attribute, and since window 2 associated with stack B1 is a floating window, stack A1 below stack B1 also has the visible attribute. After stack B1 is moved out of display 1, stack B2 is at the top of display 1 and remains visible.
In this way, when executing the Activities in stack B1 and stack A1, AMS on the phone can call WMS to draw, in display 0, the display interface 701 produced by the video APP and the display interface 601 of the desktop, with the display interface 701 drawn as a floating window on top of the display interface 601. At this point, WMS can send the display interfaces 701 and 601 to the phone; as shown in FIG. 23, the phone can display the display interface 701 as a floating window on the desktop's display interface 601. At the same time, WMS can send the video stream containing the display interfaces 701 and 601 to the PC. As shown in (c) of FIG. 20, the PC can continue to display the display interface 601 in window 1 and display the display interface 701 on window 1 in the form of a floating window. Of course, WMS can also send the display interface 801 of the chat APP drawn in display 1 to the PC, which continues to display the display interface 801 in window 3 in real time.
That is, in the multi-window projection scenario, by dragging one window on the PC (the target device) to float on another, the user can trigger the PC and the phone (the source device) to interact to implement the window floating function, that is, to display the dragged window floating on the covered window.
In other embodiments, in the multi-window projection scenario, the user can also drag one window on the PC (the target device) onto another, triggering the PC and the phone (the source device) to interact to implement the window merging function, so that the dragged window and the covered window are merged into one window.
By way of example, the windows projected onto the PC (the target device) can be preset to contain boundary hot zones. For example, as shown in FIG. 24, the window 2401 includes a preset boundary hot zone 2402. The boundary hot zone 2402 is set close to the edges of the window 2401. For example, the region covered by extending each edge of the window 2401 inward by 100 pixels can be set as the boundary hot zone 2402 of the window 2401.
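Concretely, the boundary hot zone is the 100-pixel ring just inside the window bounds: a point falls in the hot zone when it lies inside the window but outside the inset rectangle. A minimal sketch:

```java
import java.awt.Rectangle;

final class HotZone {
    // Boundary hot zone = window bounds minus the bounds inset by 100 px
    // per side, matching the 100-pixel example above.
    static boolean inBoundaryHotZone(Rectangle window, int x, int y) {
        Rectangle inner = new Rectangle(
                window.x + 100, window.y + 100,
                Math.max(0, window.width - 200), Math.max(0, window.height - 200));
        return window.contains(x, y) && !inner.contains(x, y);
    }
}
```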
In the multi-window projection scenario, when the user drags one window on the PC over another, the PC can judge whether the user has triggered the above window merging function by detecting the degree of coincidence between the boundary hot zones of the two windows.
By way of example, as shown in (a) of FIG. 25, still take the PC displaying window 1, window 2, and window 3 in the above projection scenario as an example. As shown in FIG. 26, when the PC detects that the user drags window 2 with the mouse, the PC can monitor the dragged position of window 2 on the screen in real time. As shown in (b) of FIG. 25, if the boundary hot zone of the dragged window 2 is detected to coincide with the boundary hot zone of window 3, the PC can calculate, in real time, the degree of coincidence between the boundary hot zones of window 2 and window 3. For example, when coincidence degree 1 between the hot zone on the left side of window 2 and the hot zone on the right side of window 3 is greater than a preset coincidence threshold, the user probably intends to merge window 2 and window 3. At this point, still as shown in FIG. 26, if the PC receives a release event sent by the mouse, the PC can respond to the release event by sending a corresponding window merging event 1 to the phone; window merging event 1 indicates that the user performed the action of merging window 2 with window 3. After receiving window merging event 1, the phone can call AMS and WMS to control the PC to merge window 2 and window 3 into a new window (for example, window 4), as shown in (c) of FIG. 25, so that the PC displays, in the merged window 4 and in split-screen form, the display interfaces originally located in window 2 and window 3.
For example, after receiving window merging event 1, the phone can send WMS a split-screen instruction for window 2 and window 3. WMS can then respond to the split-screen instruction by setting window 2 and window 3 to the split-screen attribute, for example setting window 2 as the right split-screen window and window 3 as the left split-screen window.
In this way, when executing stack B1 corresponding to the video APP, AMS on the phone can call WMS to draw the display interface 701 of the video APP in display 1; the display interface 701 is now associated with the right split-screen window. Likewise, when executing stack B2 corresponding to the chat APP, AMS can call WMS to draw the display interface 801 of the chat APP in display 1; the display interface 801 is now associated with the left split-screen window. WMS can then send the display interfaces 701 and 801 to the PC as the display interface of a single window (for example, window 4), and the PC displays the display interface 701 of the video APP and the display interface 801 of the chat APP in split-screen form in window 4, implementing the window merging function.
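The split-screen layout of a merged window reduces to dividing one rectangle into two panes. A sketch of the left/right case used for window 4 (the top/bottom split used later for window 5 is analogous):

```java
import java.awt.Rectangle;

final class SplitLayout {
    // Returns {leftPane, rightPane} for the merged window, matching the
    // left split-screen window / right split-screen window attributes above.
    static Rectangle[] splitLeftRight(Rectangle merged) {
        int half = merged.width / 2;
        Rectangle left = new Rectangle(merged.x, merged.y, half, merged.height);
        Rectangle right = new Rectangle(merged.x + half, merged.y,
                merged.width - half, merged.height);
        return new Rectangle[] { left, right };
    }
}
```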
In other embodiments, the user can also drag window 2 to merge it with window 1 on the PC, which contains the phone's main display interface.
By way of example, as shown in (a) of FIG. 27, still take the PC displaying window 1, window 2, and window 3 in the above projection scenario as an example. As shown in FIG. 28, if the boundary hot zone of the dragged window 2 is detected to coincide with the boundary hot zone of window 1, the PC can calculate, in real time, the degree of coincidence between the boundary hot zones of window 2 and window 1. For example, as shown in (b) of FIG. 27, when coincidence degree 2 between the hot zone on the upper side of window 2 and the hot zone on the lower side of window 1 is greater than the preset coincidence threshold, the user probably intends to merge window 2 and window 1. At this point, if the PC receives a release event sent by the mouse, then, as shown in FIG. 28, the PC can respond to the release event by sending a corresponding window merging event 2 to the phone; window merging event 2 indicates that the user performed the action of merging window 2 with window 1. After receiving window merging event 2, the phone can call AMS and WMS to control the PC to merge window 2 and window 1 into one window (for example, window 5), as shown in (c) of FIG. 27, so that the PC displays, in the merged window 5 and in split-screen form, the display interfaces originally in window 2 and window 1.
For example, still as shown in FIG. 28, after receiving window merging event 2, the phone can send WMS a split-screen instruction for window 2 and window 1. In addition, since the video APP in window 2 and the desktop in window 1 correspond to different display modules on the phone, after receiving window merging event 2, the phone can also send AMS an instruction to move stack B1 from display 1 to display 0.
Accordingly, after receiving the split-screen instruction for window 2 and window 1, WMS can set window 2 and window 1 to the split-screen attribute, for example setting window 1 as the upper split-screen window and window 2 as the lower split-screen window. And AMS can respond to the received instruction by calling the preset stack-moving interface to move stack B1 from display 1 to the top of display 0. At this point, similarly to (b) of FIG. 22, stack B1 is at the top of display 0, and stack A1 (that is, the desktop's stack) is lower in display 0. Since window 2 and window 1 have the split-screen attribute, AMS can set both stack B1 and stack A1 to the visible attribute.
In this way, when executing stack B1 corresponding to the video APP, AMS on the phone can call WMS to draw the display interface 701 of the video APP in display 0; the display interface 701 is now associated with the lower split-screen window. Likewise, when executing stack A1 corresponding to the desktop, AMS can call WMS to draw the display interface 601 of the desktop in display 0; the display interface 601 is now associated with the upper split-screen window. WMS can then send, in the form of a video stream, the display interfaces 701 and 601 to the PC as the display interface of a single window (for example, window 5), and the PC displays the display interface 701 of the video APP and the display interface 601 of the desktop in split-screen form in window 5.
In addition, since stack B1 and stack A1 are both located in display 0, WMS can also send the display interfaces 701 and 601 to the phone's screen. At this point, as shown in FIG. 29, the phone can display the display interface 701 of the video APP and the display interface 601 of the desktop in split-screen form, so that the display interface on the phone is synchronized with the display interface in window 5 on the PC.
It should be noted that when the PC or the phone displays content in a merged window (for example, window 4 or window 5 above), elements of the display interface 801, the display interface 701, or the display interface 601 in the window can be trimmed or rearranged, which is not limited in the embodiments of this application.
The above embodiments describe how, in the multi-window projection scenario, dragging windows on the PC (the target device) implements the window covering function, the window floating function, and the window merging function, respectively. It can be understood that the user can also use these functions in combination, making it convenient to manage the multiple windows in the projection scenario.
By way of example, as shown in (a) of FIG. 30, when the PC detects that the user drags window 2 so that the degree of coincidence between the boundary hot zones of window 2 and window 3 is greater than the coincidence threshold, if the PC receives a release event sent by the mouse, the PC can interact with the phone as described above to implement the window merging function. At this point, as shown in (b) of FIG. 30, the PC can merge window 2 and window 3 into window 4 and display, in window 4 and in split-screen form, the display interface 701 of the video APP originally in window 2 and the display interface 801 of the chat APP originally in window 3.
Subsequently, as shown in (c) of FIG. 30, when the PC detects that the user drags window 4 so that it overlaps window 1 and the overlapping area between window 4 and window 1 is greater than the area threshold, if the PC receives a release event sent by the mouse, the PC can interact with the phone as described above to implement the window covering function. At this point, as shown in (d) of FIG. 30, the PC no longer displays the covered window 1 but instead displays, in window 4 and in synchronization with the phone, the display interface 701 of the video APP and the display interface 801 of the chat APP (the phone's display interface is not shown in FIG. 30).
Subsequently, if the phone establishes a wireless communication connection with another electronic device with the projection function (for example, a tablet computer), the phone can also project window 4, which is being displayed in split-screen form, into a window on the tablet for display in the manner described above. In this way, the phone, as the source device, can seamlessly switch the interface being displayed to run on the target device, so that the user can continue to use, on the target device, the relevant functions provided by the source device.
It should be noted that the scenario shown in (a) to (d) of FIG. 30 is one in which the user first merges two windows and then covers another window with the merged one. It can be understood that the user can also combine the window covering function, the window floating function, and the window merging function in other ways. For example, the user can use the window merging function to first merge window 1 and window 2 and then merge the merged window with window 3. For another example, the user can use the window floating function to display window 1 floating on window 2 and then merge the window containing the floating window with window 3.
In some scenarios, as shown in (a) of FIG. 31, the phone and the PC can display window 2 floating on window 3 using the above window floating function. Then, as shown in (b) of FIG. 31, if the user drags window 3 to float on window 1 (that is, the phone's main display interface), window 2 floating on window 3 is dragged along to float on window 1. At this point, if the PC receives a release event sent by the mouse, the PC can instruct the phone to close whichever of window 2 and window 3 does not contain the focus application and display the remaining window in window 1 in the form of a floating window in the manner described above. For example, if the video APP in window 2 is the current focus application and the chat APP in window 3 is not, then, as shown in (c) of FIG. 31, the phone can control the PC to delete window 3 and its display content and display window 2 in window 1 in the form of a floating window. Of course, those skilled in the art can, based on practical experience or actual application scenarios, devise specific implementations for combining the above window covering function, window floating function, and window merging function, which is not limited in the embodiments of this application.
It can be seen that, in the multi-window projection scenario, the user can operate multiple windows on a target device such as a PC, triggering the source device to establish associations between the operated windows on the target device, thereby implementing functions such as window covering, window floating, and window merging on the target device. The user can thus manage the multiple windows projected onto the target device with a single operation, manage them more efficiently, and enjoy a better user experience.
In addition, the above embodiments are described with a mobile phone as the source device and a PC as the target device in the projection scenario. It can be understood that, in a projection scenario applying the embodiments of this application, the source device can also be an electronic device with the projection function such as a tablet computer, and the target device can also be an electronic device with a display function such as a television or a tablet computer, which is not limited in the embodiments of this application.
It should be noted that the above embodiments use the Android system as an example to describe how the functional modules cooperate to implement the above screen projection display method. It can be understood that corresponding functional modules can also be provided in other operating systems (for example, HarmonyOS) to implement the above method. As long as the functions implemented by the devices and functional modules are similar to those of the embodiments of this application, they fall within the scope of the claims of this application and their technical equivalents.
As shown in FIG. 32, an embodiment of this application discloses an electronic device, which may be the above source device (for example, a mobile phone). The electronic device may specifically include: a touchscreen 3201, where the touchscreen 3201 includes a touch sensor 3206 and a display 3207; one or more processors 3202; a memory 3203; a communication module 3208; one or more application programs (not shown); and one or more computer programs 3204. The above components can be connected through one or more communication buses 3205. The one or more computer programs 3204 are stored in the memory 3203 and configured to be executed by the one or more processors 3202; the one or more computer programs 3204 include instructions that can be used to perform the relevant steps performed by the source device in the above embodiments.
As shown in FIG. 33, an embodiment of this application discloses an electronic device, which may be the above target device (for example, a PC). The electronic device may specifically include: a display 3301; one or more processors 3302; a memory 3303; a communication module 3306; one or more application programs (not shown); and one or more computer programs 3304. The above components can be connected through one or more communication buses 3305. Of course, the electronic device can also be equipped with input devices such as a touchscreen, a mouse, or a keyboard. The one or more computer programs 3304 are stored in the memory 3303 and configured to be executed by the one or more processors 3302; the one or more computer programs 3304 include instructions that can be used to perform the relevant steps performed by the target device in the above embodiments.
From the above description of the implementations, those skilled in the art can clearly understand that, for convenience and brevity of description, only the division of the above functional modules is used as an example. In practical applications, the above functions can be assigned to different functional modules as needed; that is, the internal structure of the apparatus can be divided into different functional modules to complete all or some of the functions described above.
In the several embodiments provided in this application, it should be understood that the disclosed apparatus and method can be implemented in other ways. For example, the apparatus embodiments described above are merely illustrative. For example, the division into modules or units is merely a logical function division; in actual implementation there can be other divisions, for example, multiple units or components can be combined or integrated into another apparatus, or some features can be omitted or not performed. Furthermore, the mutual couplings or direct couplings or communication connections shown or discussed can be indirect couplings or communication connections through some interfaces, apparatuses, or units, and can be electrical, mechanical, or in other forms.
The units described as separate parts may or may not be physically separate, and a part shown as a unit can be one physical unit or multiple physical units; that is, it can be located in one place or distributed across multiple different places. Some or all of the units can be selected according to actual needs to achieve the objectives of the solutions of the embodiments.
In addition, the functional units in the embodiments of this application can be integrated into one processing unit, or each unit can exist alone physically, or two or more units can be integrated into one unit. The integrated unit can be implemented in the form of hardware or in the form of a software functional unit.
If the integrated unit is implemented in the form of a software functional unit and sold or used as an independent product, it can be stored in a readable storage medium. Based on this understanding, the technical solutions of the embodiments of this application essentially, or the part contributing to the prior art, or all or part of the technical solutions, can be embodied in the form of a software product. The software product is stored in a storage medium and includes several instructions for causing a device (which can be a single-chip microcomputer, a chip, or the like) or a processor to perform all or some of the steps of the methods described in the embodiments of this application. The aforementioned storage medium includes various media that can store program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.
The foregoing is merely specific implementations of this application, but the protection scope of this application is not limited thereto. Any variation or replacement within the technical scope disclosed in this application shall be covered by the protection scope of this application. Therefore, the protection scope of this application shall be subject to the protection scope of the claims.

Claims (29)

  1. A screen projection display method, comprising:
    displaying, by a target device, a first window and a second window, wherein the first window comprises an interface of a first application task projected by a source device, and the second window comprises an interface of a second application task projected by the source device;
    detecting, by the target device, a window covering operation input by a user, wherein the window covering operation is used to cover the second window with the first window;
    in response to the window covering operation, sending, by the target device, a window covering event to the source device;
    in response to the window covering event, obtaining, by the source device, projection data produced after the first window covers the second window;
    sending, by the source device, the projection data to the target device; and
    displaying, by the target device, a first interface according to the projection data, wherein the first interface comprises the interface of the first application task and does not comprise the interface of the second application task.
  2. The method according to claim 1, wherein neither the first window nor the second window comprises a main display interface of the source device; and
    in the first interface, the interface of the first application task is located in the first window.
  3. The method according to claim 2, wherein the source device comprises a first display module, and the first display module is configured to provide projection data to the target device during projection;
    before the source device receives the window covering event, both the first application task and the second application task run in the first display module, and the second application task runs at the top of the stack of the first display module; and
    after the source device receives the window covering event, the method further comprises:
    moving, by the source device, the first application task to the top of the stack of the first display module to run; and
    deleting, by the source device, the second application task from the first display module, or setting the second application task to an invisible attribute in the first display module.
  4. The method according to claim 1, wherein display content in the second window is synchronized with a main display interface of the source device; and
    in the first interface, the interface of the first application task is located in the second window.
  5. The method according to claim 4, wherein the source device comprises a first display module and a second display module, the first display module is configured to provide projection data to the target device during projection, and the second display module is configured to provide display data to the source device and to provide projection data to the target device during projection;
    before the source device receives the window covering event, the first application task runs in the first display module and the second application task runs in the second display module; and
    after the source device receives the window covering event, the method further comprises:
    moving, by the source device, the first application task from the first display module to the top of the stack of the second display module to run.
  6. The method according to any one of claims 1 to 5, wherein the window covering operation is: a release operation input by the user after dragging the first window, when the overlapping area between the first window and the second window is greater than an area threshold.
  7. A screen projection display method, comprising:
    projecting, by a source device, an interface of a first application task to a first window of a target device for display, and projecting, by the source device, an interface of a second application task to a second window of the target device for display;
    receiving, by the source device, a window covering event sent by the target device, wherein the window covering event indicates that a user has input a window covering operation of covering the second window with the first window;
    in response to the window covering event, obtaining, by the source device, projection data produced after the first window covers the second window; and
    sending, by the source device, the projection data to the target device.
  8. The method according to claim 7, wherein the source device comprises a first display module, and the first display module is configured to provide projection data to the target device during projection;
    before the source device receives the window covering event, both the first application task and the second application task run in the first display module, and the second application task runs at the top of the stack of the first display module; and
    after the source device receives the window covering event, the method further comprises:
    moving, by the source device, the first application task to the top of the stack of the first display module to run; and
    deleting, by the source device, the second application task from the first display module, or setting the second application task to an invisible attribute in the first display module.
  9. The method according to claim 7, wherein the source device comprises a first display module and a second display module, the first display module is configured to provide projection data to the target device during projection, and the second display module is configured to provide display data to the source device and to provide projection data to the target device during projection;
    before the source device receives the window covering event, the first application task runs in the first display module and the second application task runs in the second display module; and
    after the source device receives the window covering event, the method further comprises:
    moving, by the source device, the first application task from the first display module to the top of the stack of the second display module to run.
  10. A screen projection display method, comprising:
    displaying, by a target device, a first window and a second window, wherein the first window comprises an interface of a first application task projected by a source device, and the second window comprises an interface of a second application task projected by the source device;
    detecting, by the target device, a window covering operation input by a user, wherein the window covering operation is used to cover the second window with the first window;
    in response to the window covering operation, sending, by the target device, a window covering event to the source device;
    receiving, by the target device, projection data sent by the source device in response to the window covering event; and
    displaying, by the target device, a first interface according to the projection data, wherein the first interface comprises the interface of the first application task and does not comprise the interface of the second application task.
  11. The method according to claim 10, wherein:
    when neither the first window nor the second window comprises a main display interface of the source device, the interface of the first application task in the first interface is located in the first window; and
    when display content in the second window is synchronized with the main display interface of the source device, the interface of the first application task in the first interface is located in the second window.
  12. The method according to claim 10 or 11, wherein the window covering operation is: a release operation input by the user after dragging the first window, when the overlapping area between the first window and the second window is greater than an area threshold.
  13. A source device, comprising:
    a display;
    one or more processors;
    a memory; and
    a communication module;
    wherein the memory stores one or more computer programs, the one or more computer programs comprise instructions, and when the instructions are executed by the source device, the source device is caused to perform the screen projection display method according to any one of claims 7 to 9.
  14. A target device, comprising:
    a display;
    one or more processors;
    a memory; and
    a communication module;
    wherein the memory stores one or more computer programs, the one or more computer programs comprise instructions, and when the instructions are executed by the target device, the target device is caused to perform the screen projection display method according to any one of claims 10 to 12.
  15. A computer-readable storage medium storing instructions, wherein, when the instructions run on a source device, the source device is caused to perform the screen projection display method according to any one of claims 7 to 9; or, when the instructions run on a target device, the target device is caused to perform the screen projection display method according to any one of claims 10 to 12.
  16. A screen projection display method, comprising:
    displaying, by a target device, a first window and a second window, wherein the first window comprises an interface of a first application task projected by a source device, and the second window comprises an interface of a second application task projected by the source device;
    detecting, by the target device, a window merging operation input by a user, wherein the window merging operation is used to merge the first window and the second window;
    in response to the window merging operation, sending, by the target device, a window merging event to the source device;
    in response to the window merging event, obtaining, by the source device, projection data produced after the first window and the second window are merged;
    sending, by the source device, the projection data to the target device; and
    displaying, by the target device, a first interface according to the projection data, wherein the interface of the first application task and the interface of the second application task are displayed in split-screen form in the first interface.
  17. The method according to claim 16, wherein, after the target device sends the window merging event to the source device, the method further comprises:
    in response to the window merging event, setting, by the source device, the window attributes of the first window and the second window to split-screen windows.
  18. The method according to claim 16 or 17, wherein the source device comprises a first display module and a second display module, the first display module is configured to provide projection data to the target device during projection, and the second display module is configured to provide display data to the source device and to provide projection data to the target device during projection;
    before the window merging event is received, the first application task runs in the first display module and the second application task runs in the second display module; and
    after the source device receives the window merging event, the method further comprises:
    moving, by the source device, the first application task from the first display module to the top of the stack of the second display module to run.
  19. The method according to claim 18, wherein, after the target device sends the window merging event to the source device, the method further comprises:
    in response to the window merging event, displaying, by the source device, the interface of the first application task and the interface of the second application task in split-screen form in a second interface.
  20. The method according to any one of claims 16 to 19, wherein each window displayed by the target device contains a boundary hot zone; and the window merging operation is: a release operation input by the user after dragging the first window, when the overlapping area between the boundary hot zone of the first window and the boundary hot zone of the second window is greater than an area threshold.
  21. A screen projection display method, comprising:
    projecting, by a source device, an interface of a first application task to a first window of a target device for display, and projecting, by the source device, an interface of a second application task to a second window of the target device for display;
    receiving, by the source device, a window merging event sent by the target device, wherein the window merging event indicates that a user has input a window merging operation of merging the first window and the second window;
    in response to the window merging event, obtaining, by the source device, projection data produced after the first window and the second window are merged; and
    sending, by the source device, the projection data to the target device.
  22. The method according to claim 21, wherein, after the source device receives the window merging event sent by the target device, the method further comprises:
    in response to the window merging event, setting, by the source device, the window attributes of the first window and the second window to split-screen windows.
  23. The method according to claim 22, wherein the source device comprises a first display module and a second display module, the first display module is configured to provide projection data to the target device during projection, and the second display module is configured to provide display data to the source device and to provide projection data to the target device during projection;
    before the window merging event is received, the first application task runs in the first display module and the second application task runs in the second display module; and
    after the source device receives the window merging event, the method further comprises:
    moving, by the source device, the first application task from the first display module to the top of the stack of the second display module to run.
  24. The method according to any one of claims 21 to 23, wherein, after the target device sends the window merging event to the source device, the method further comprises:
    in response to the window merging event, displaying, by the source device, the interface of the first application task and the interface of the second application task in split-screen form in a second interface.
  25. A screen projection display method, comprising:
    displaying, by a target device, a first window and a second window, wherein the first window comprises an interface of a first application task projected by a source device, and the second window comprises an interface of a second application task projected by the source device;
    detecting, by the target device, a window merging operation input by a user, wherein the window merging operation is used to merge the first window and the second window;
    in response to the window merging operation, sending, by the target device, a window merging event to the source device;
    receiving, by the target device, projection data sent by the source device in response to the window merging event; and
    displaying, by the target device, a first interface according to the projection data, wherein the interface of the first application task and the interface of the second application task are displayed in split-screen form in the first interface.
  26. The method according to claim 25, wherein each window displayed by the target device contains a boundary hot zone; and the window merging operation is: a release operation input by the user after dragging the first window, when the overlapping area between the boundary hot zone of the first window and the boundary hot zone of the second window is greater than an area threshold.
  27. A source device, comprising:
    a display;
    one or more processors;
    a memory; and
    a communication module;
    wherein the memory stores one or more computer programs, the one or more computer programs comprise instructions, and when the instructions are executed by the source device, the source device is caused to perform the screen projection display method according to any one of claims 21 to 24.
  28. A target device, comprising:
    a display;
    one or more processors;
    a memory; and
    a communication module;
    wherein the memory stores one or more computer programs, the one or more computer programs comprise instructions, and when the instructions are executed by the target device, the target device is caused to perform the screen projection display method according to claim 25 or 26.
  29. A computer-readable storage medium storing instructions, wherein, when the instructions run on a source device, the source device is caused to perform the screen projection display method according to any one of claims 21 to 24; or, when the instructions run on a target device, the target device is caused to perform the screen projection display method according to claim 25 or 26.
PCT/CN2022/084100 2021-06-30 2022-03-30 Screen projection display method and electronic device WO2023273460A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202110736178.2 2021-06-30
CN202110736178.2A CN115543163A (zh) 2021-06-30 2021-06-30 Screen projection display method and electronic device

Publications (1)

Publication Number Publication Date
WO2023273460A1 true WO2023273460A1 (zh) 2023-01-05

Family

ID=84692447

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/084100 WO2023273460A1 (zh) 2021-06-30 2022-03-30 一种投屏显示方法及电子设备

Country Status (2)

Country Link
CN (1) CN115543163A (zh)
WO (1) WO2023273460A1 (zh)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001325054A (ja) * 2000-05-16 2001-11-22 Fujitsu Ten Ltd Multi-window display method and window management method
CN103853381A (zh) * 2012-12-06 2014-06-11 Samsung Electronics Co., Ltd. Display apparatus and method of controlling the same
CN112017576A (zh) * 2020-08-26 2020-12-01 Beijing ByteDance Network Technology Co., Ltd. Display control method and apparatus, terminal, and storage medium
CN112995727A (zh) * 2019-12-17 2021-06-18 Huawei Technologies Co., Ltd. Multi-screen collaboration method and system, and electronic device
CN113050841A (zh) * 2019-12-26 2021-06-29 Huawei Technologies Co., Ltd. Method for displaying multiple windows, electronic device, and system


Also Published As

Publication number Publication date
CN115543163A (zh) 2022-12-30

Similar Documents

Publication Publication Date Title
WO2020238871A1 (zh) Screen projection method and system, and related apparatus
US20240168624A1 (en) Screen capture method and related device
WO2021052147A1 (zh) Data transmission method and related device
WO2020259452A1 (zh) Full-screen display method for mobile terminal, and device
WO2020224485A1 (zh) Screenshot method and electronic device
WO2021139768A1 (zh) Interaction method for cross-device task processing, electronic device, and storage medium
WO2020052529A1 (zh) Method for quickly invoking a small window during full-screen video display, graphical user interface, and terminal
WO2021000881A1 (zh) Split-screen method and electronic device
WO2020108356A1 (zh) Application display method and electronic device
WO2021036571A1 (zh) Desktop editing method and electronic device
CN112558825A (zh) Information processing method and electronic device
WO2021121052A1 (zh) Multi-screen collaboration method and system, and electronic device
WO2022179405A1 (zh) Screen projection display method and electronic device
CN112527174B (zh) Information processing method and electronic device
WO2022017393A1 (zh) Display interaction system, display method, and device
WO2024016559A1 (zh) Multi-device collaboration method, electronic device, and related products
WO2022127632A1 (zh) Resource management and control method and device
CN112527222A (zh) Information processing method and electronic device
WO2022105803A1 (zh) Camera invoking method and system, and electronic device
WO2024045801A1 (zh) Screenshot method, electronic device, medium, and program product
WO2022063159A1 (zh) File transfer method and related device
WO2021190524A1 (zh) Screenshot processing method, graphical user interface, and terminal
WO2023020012A1 (zh) Data communication method between devices, device, storage medium, and program product
WO2023045597A1 (zh) Cross-device transfer control method and apparatus for large-screen service
CN115242994B (zh) Video call system, method, and apparatus

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22831291

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE