CN115543163A - Screen projection display method and electronic equipment


Info

Publication number: CN115543163A
Application number: CN202110736178.2A
Authority: CN (China)
Legal status: Pending
Other languages: Chinese (zh)
Inventor: 吉伟
Assignee (original and current): Huawei Technologies Co Ltd
Prior art keywords: window, display, interface, source device, application task
Application filed by Huawei Technologies Co Ltd
Priority to CN202110736178.2A
Priority to PCT/CN2022/084100 (WO2023273460A1)
Publication of CN115543163A


Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 - Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/0486 - Drag-and-drop
    • G06F3/14 - Digital output to display device; Cooperation and interconnection of the display device with other functional units
    • G06F3/1454 - Digital output to display device; Cooperation and interconnection of the display device with other functional units involving copying of the display data of a local workstation or window to a remote workstation or window so that an actual copy of the data is displayed simultaneously on two or more displays, e.g. teledisplay

Abstract

The application discloses a screen projection display method and an electronic device, and relates to the field of terminal technologies. The method can increase the association among a plurality of projected windows in a screen projection scene, so that a user can conveniently and efficiently manage the plurality of projected windows on the target device, improving the user experience. The method includes: the target device displays a first window and a second window, where the first window includes an interface of a first application task projected by the source device, and the second window includes an interface of a second application task projected by the source device; if the target device detects a window covering operation input by the user to cover the second window with the first window, the target device sends a window covering event to the source device; in response to the window covering event, the source device obtains screen projection data after the first window covers the second window, and sends the screen projection data to the target device; and the target device displays a first interface according to the screen projection data, where the first interface includes the interface of the first application task and does not include the interface of the second application task.

Description

Screen projection display method and electronic equipment
Technical Field
The application relates to the technical field of terminals, in particular to a screen projection display method and electronic equipment.
Background
In some screen projection scenarios, a source device (also called the source end) such as a mobile phone may project a plurality of applications to a target device (also called the sink end) such as a PC in a multi-window manner for display.
For example, as shown in FIG. 1, a cell phone may project the desktop of the cell phone into a window 101 of a PC for display. Subsequently, the user can further open other applications of the mobile phone in the PC by operating the icons of the applications in the window 101. Still as shown in fig. 1, after the user clicks the icon of the chat APP in the window 101, the mobile phone may project the display interface of the chat APP to the window 102 of the PC for display. For another example, after the user clicks the icon of the video APP in the window 101, the mobile phone may project the display interface of the video APP to the window 103 of the PC for display. At this time, the PC can display the display interfaces of the plurality of applications on the mobile phone through the plurality of windows, respectively.
In this scenario, the user can manage each window projected from the mobile phone on the PC, for example, scaling or closing a window. However, the projected windows displayed on the PC are independent of one another, and when the mobile phone projects a large number of windows onto the PC, managing each window one by one becomes cumbersome and degrades the user experience.
Disclosure of Invention
The application provides a screen projection display method and electronic equipment, which can increase the relevance among a plurality of projected windows in a screen projection scene, facilitate the efficient management of the plurality of projected windows in target equipment by a user, and improve the use experience of the user.
In a first aspect, the present application provides a screen projection display method, including: the target device displays a first window and a second window, where the first window includes an interface of a first application task projected by the source device, and the second window includes an interface of a second application task projected by the source device (that is, a multi-window screen projection scenario); if the target device detects that the user inputs a window covering operation, where the window covering operation is used to cover the second window with the first window, the target device may send a corresponding window covering event to the source device; further, in response to the window covering event, the source device may obtain screen projection data after the first window covers the second window, and send the screen projection data to the target device; and the target device may display a first interface according to the screen projection data, where the first interface includes the interface of the first application task and does not include the interface of the second application task.
That is to say, in a multi-window screen projection scene, a user may drag one window to cover another window in a target device (e.g., a PC), and trigger the target device to interact with a source device (e.g., a mobile phone) to implement a window covering function, that is, to close display content in the covered window and to retain display content in the dragged window, so that the user can conveniently and efficiently manage multiple windows projected in the target device, and the user experience is improved.
In a possible implementation manner, neither the first window nor the second window includes a main display interface of the source device, that is, the first window and the second window operated by the user are not windows where the main display interface projected by the source device is located; at this time, the interface of the first application task in the first interface is located in the first window. That is, the window covering function may close the covered second window, and leave the dragged first window to continue displaying the interface of the first application task.
In the above scenario, the source device may include a first display module (e.g., display 1), where the first display module is configured to provide screen projection data to the target device during screen projection; the source device may obtain the corresponding screen projection data from the first display module. Before the source device receives the window covering event, both the first application task and the second application task run on the first display module, and the second application task runs at the top of the stack of the first display module; after the source device receives the window covering event, the method further includes: the source device may move the first application task to the top of the stack of the first display module to run, so that the first window can continue to display the interface of the first application task.
In addition, the source device may delete the second application task from the first display module (i.e., the second application task is killed), and in this case the source device may close (i.e., destroy) the second window corresponding to the second application task. Alternatively, the source device may set the second application task as an invisible attribute in the first display module; in this case the second application task is not killed but is switched to the background of the source device to continue running, and the source device sets the window attribute of the second window to invisible. In this way, although the second application task is actually switched to the background of the source device to continue running, the user visually perceives that the covered second window is closed and the dragged first window continues to display the interface of the first application task.
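The stack handling described above can be illustrated with a small, self-contained model. The following Java sketch is purely illustrative: the names DisplayModule, moveTaskToTop, and removeTask are hypothetical and do not correspond to any real framework API; it only models a display module as a task stack and the two possible treatments of the covered task (deleting it, or keeping it and marking it invisible).

```java
import java.util.ArrayDeque;
import java.util.Deque;

/** Illustrative model only: a "display module" holding a stack of application tasks. */
class DisplayModule {
    private final Deque<String> taskStack = new ArrayDeque<>(); // head of deque = top of stack

    void launch(String task) { taskStack.push(task); }

    /** Move an existing task to the top of the stack so that its interface is rendered. */
    void moveTaskToTop(String task) {
        if (taskStack.remove(task)) {
            taskStack.push(task);
        }
    }

    /** Option 1: delete the covered task (it is killed and its window destroyed). */
    void removeTask(String task) { taskStack.remove(task); }

    String topTask() { return taskStack.peek(); }
}

public class WindowCoverDemo {
    public static void main(String[] args) {
        DisplayModule display1 = new DisplayModule();   // projection display module
        display1.launch("firstAppTask");                // shown in the first window
        display1.launch("secondAppTask");               // shown in the second window (stack top)

        // The user drags the first window over the second window on the target device,
        // so the source device moves the first task back to the top of the stack...
        display1.moveTaskToTop("firstAppTask");
        // ...and either deletes the second task (option 1) or could instead mark its
        // window invisible and let it keep running in the background (option 2).
        display1.removeTask("secondAppTask");

        System.out.println("Task now projected: " + display1.topTask()); // firstAppTask
    }
}
```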
In a possible implementation manner, the display content in the second window is synchronized with the main display interface of the source device, that is, the first window operated by the user is not the window where the main display interface projected by the source device is located, but the covered second window displays the main display interface of the source device; at this time, the interface of the first application task in the first interface is located in the second window. That is, the window covering function may close the dragged first window, and switch the interface of the first application task displayed in the first window to the second window for continuous display.
In the above scenario, the source device may include a first display module (e.g., display 1) and a second display module (e.g., display 0), where the first display module is configured to provide screen projection data to the target device when a screen is projected, and the source device may obtain corresponding screen projection data from the first display module; the second display module is used for providing display data for the source device and providing screen projection data for the target device during screen projection, the source device can acquire the display data from the second display module to display in a display screen of the source device, and the source device can also acquire the screen projection data of the main display interface from the second display module in a screen projection scene.
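On an Android-based source device, the second display module (display 0) would typically correspond to the built-in display, while the first display module (display 1) could be realized as a virtual display created for projection. The sketch below uses real Android APIs (DisplayManager.createVirtualDisplay and ActivityOptions.setLaunchDisplayId) to show one way such a projection display could be set up; the patent does not name these APIs, and the display name, resolution, density, and intent used here are placeholders.

```java
import android.app.ActivityOptions;
import android.content.Context;
import android.content.Intent;
import android.hardware.display.DisplayManager;
import android.hardware.display.VirtualDisplay;
import android.view.Surface;

/** Hypothetical helper: create a projection display and launch an application task on it. */
public class ProjectionDisplayHelper {

    /** Create a virtual display ("display 1") whose frames are rendered into the given surface. */
    public static VirtualDisplay createProjectionDisplay(Context context, Surface sinkSurface) {
        DisplayManager dm = (DisplayManager) context.getSystemService(Context.DISPLAY_SERVICE);
        return dm.createVirtualDisplay(
                "projection-display",          // name (placeholder)
                1080, 2340, 440,               // width, height, dpi (placeholders)
                sinkSurface,                   // frames rendered here are encoded and sent to the target
                DisplayManager.VIRTUAL_DISPLAY_FLAG_PRESENTATION);
    }

    /** Launch an application task on the projection display instead of the built-in display 0. */
    public static void launchOnDisplay(Context context, Intent appIntent, int displayId) {
        ActivityOptions options = ActivityOptions.makeBasic();
        options.setLaunchDisplayId(displayId);              // route the task's window to the chosen display
        appIntent.addFlags(Intent.FLAG_ACTIVITY_NEW_TASK);
        context.startActivity(appIntent, options.toBundle());
    }
}
```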
Before the source device receives the window covering event, the first application task may run on the first display module, and the second application task may run on the second display module; after the source device receives the window covering event, the method further includes: the source device may move the first application task from the first display module to the top of the stack of the second display module to run, that is, perform a stack moving operation. After the first application task is moved to the top of the stack of the second display module, the second application task is pushed down in the stack, and the interface of the first application task can be displayed in the second windows of both the source device and the target device. At this time, the source device may close the first window. In this way, the user visually perceives that the dragged first window is closed and that the covered second window continues to display the interface of the first application task.
In a possible implementation manner, after the source device moves the first application task from the first display module to the top of the stack of the second display module, the source device may further set the configuration information of the first application task to the same configuration information as the application task originally located at the top of the stack of the second display module (for example, the second application task). In this way, when the source device refreshes the second display module, it reads that the configuration information of the task at the top of the stack has not changed, so the source device continues executing the first application task rather than relaunching it, thereby achieving a seamless handover of the window content.
For example, the window covering operation may specifically be: the user drags the first window and then inputs a release operation when the overlapping area between the first window and the second window is larger than an area threshold. Alternatively, the window covering operation may be: the user drags the first window so that the overlapping area between the first window and the second window is larger than the area threshold; that is, the window covering function can be triggered even if the user does not input a release operation. Of course, the window covering operation may also be another predefined operation, which is not limited in this application.
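As an illustration of how the target device might recognize such an operation, the sketch below computes the overlap between the dragged window and the other window and classifies the gesture when the user releases the drag. The class name, the rectangle representation, and the 0.5 ratio are assumptions; the patent only requires that the overlapping area exceed some area threshold.

```java
/** Minimal sketch of cover-gesture detection on the target device (names and threshold are assumed). */
public class CoverGestureDetector {

    static final double AREA_THRESHOLD_RATIO = 0.5; // assumed: half of the covered window's area

    /** Simple axis-aligned window rectangle. */
    static class Rect {
        final int left, top, right, bottom;
        Rect(int left, int top, int right, int bottom) {
            this.left = left; this.top = top; this.right = right; this.bottom = bottom;
        }
        long area() { return (long) Math.max(0, right - left) * Math.max(0, bottom - top); }
    }

    static long overlapArea(Rect a, Rect b) {
        int w = Math.min(a.right, b.right) - Math.max(a.left, b.left);
        int h = Math.min(a.bottom, b.bottom) - Math.max(a.top, b.top);
        return (w > 0 && h > 0) ? (long) w * h : 0;
    }

    /** Called when the user releases the dragged first window. */
    static boolean isCoverOperation(Rect draggedWindow, Rect otherWindow) {
        return overlapArea(draggedWindow, otherWindow) > otherWindow.area() * AREA_THRESHOLD_RATIO;
    }

    public static void main(String[] args) {
        Rect first = new Rect(100, 100, 700, 600);   // dragged first window
        Rect second = new Rect(150, 120, 750, 620);  // second window underneath
        if (isCoverOperation(first, second)) {
            // The target device would now send a window covering event to the source device.
            System.out.println("window covering operation detected");
        }
    }
}
```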
In a second aspect, the present application provides a screen projection display method, including: the source device projects an interface of a first application task into a first window of the target device for display, and projects an interface of a second application task into a second window of the target device for display; subsequently, the source device may receive a window covering event sent by the target device, where the window covering event indicates that the user has input a window covering operation for covering the second window with the first window; further, in response to the window covering event, the source device may obtain screen projection data after the first window covers the second window, and send the screen projection data to the target device, so as to implement the window covering function.
In a possible implementation manner, the source device includes a first display module, where the first display module is configured to provide screen projection data to the target device when a screen is projected; before a source end device receives a window covering event, a first application task and a second application task both run on a first display module, and the second application task runs on the top of a stack of the first display module; after the source device receives the window covering event, the method further includes: the source end device moves the first application task to the stack top of the first display module to run; the source device deletes the second application task from the first display module, or sets the second application task as an invisible attribute in the first display module.
In a possible implementation manner, the source device includes a first display module and a second display module, where the first display module is configured to provide screen projection data to the target device during screen projection; the second display module is used for providing display data to the source end device and providing screen projection data to the target device during screen projection; before the source device receives the window covering event, a first application task runs on a first display module, and a second application task runs on a second display module; after the source device receives the window covering event, the method further includes: and the source device moves the first application task from the first display module to the stack top of the second display module to run.
In a third aspect, the present application provides a screen projection display method, including: the target device displays a first window and a second window, where the first window includes an interface of a first application task projected by the source device, and the second window includes an interface of a second application task projected by the source device; if the target device detects that the user inputs a window covering operation for covering the second window with the first window, the target device may send a corresponding window covering event to the source device; further, the target device receives screen projection data sent by the source device in response to the window covering event, and displays a first interface according to the screen projection data, where the first interface includes the interface of the first application task and does not include the interface of the second application task.
In a possible implementation manner, when neither the first window nor the second window includes a main display interface of the source device, an interface of a first application task in the first interface is located in the first window; and when the display content in the second window is synchronous with the main display interface of the source device, the interface of the first application task in the first interface is positioned in the second window.
In one possible implementation, the window covering operation refers to: and after the user drags the first window, when the overlapping area between the first window and the second window is larger than the area threshold, the user inputs a release operation.
In a fourth aspect, the present application provides a screen projection display method, including: the target device displays a first window and a second window, wherein the first window comprises an interface of a first application task projected by the source device, and the second window comprises an interface of a second application task projected by the source device (namely a multi-window screen projection scene); subsequently, if the target device detects that the user inputs a window merging operation, the window merging operation is used for merging the first window and the second window; the target device may send a corresponding window merge event to the source device; furthermore, in response to the window merging event, the source device can acquire screen projection data obtained by merging the first window and the second window, and send the screen projection data to the target device; furthermore, the target device can display the first interface according to the screen projection data, and at the moment, the interface of the first application task and the interface of the second application task are displayed in the first interface in a split screen mode.
That is to say, in a multi-window screen projection scenario, the user may drag one window in the target device to merge it with another window, which triggers the target device and the source device to interact to implement a window merging function; that is, the display content of the dragged window and the display content of the window it is merged with are displayed in the target device in a split-screen manner, so that the user can conveniently and efficiently manage the multiple windows projected onto the target device, improving the user experience.
In a possible implementation manner, after the target device sends the window merge event to the source device, the method further includes: in response to the window merge event, the source device may set the window attributes of the first window and the second window as split-screen windows. For example, the WMS in the source device may set the window attributes of the first window and the second window as split-screen windows (or referred to as split-screen attributes).
In one possible implementation, the source device may include a first display module (e.g., display 1), where the first display module is configured to provide, to the target device, screen projection data when a screen is projected; the source device may obtain corresponding screen projection data from the first display module. Before the source device receives the window merging event, if the first application task and the second application task are both running in the first display module, after receiving the window merging event, the source device does not need to modify the first application task and the second application task in the first display module.
In another possible implementation manner, the source device may include a first display module (e.g., display 1) and a second display module (e.g., display 0), where the first display module is configured to provide screen projection data to the target device when a screen is projected, and the source device may obtain corresponding screen projection data from the first display module; the second display module is configured to provide display data to the source device and provide screen projection data to the target device when a screen is projected, that is, the source device may obtain, from the second display module, the display data to display on its own display screen, and in a screen projection scenario, the source device may also obtain, from the second display module, the screen projection data of the main display interface.
Before the window merging event is received, the first application task may run on the first display module and the second application task may run on the second display module. After the source device receives the window merging event, the source device may move the first application task from the first display module to the top of the stack of the second display module to run, that is, perform a stack moving operation. After the first application task is moved to the top of the stack of the second display module, the second application task is pushed down in the stack; because both the first application task and the second application task are application tasks in split-screen windows, the source device can draw the interface of the first application task and the interface of the second application task in the second display module in a split-screen manner.
For example, the source device may draw an interface for a first application task in a left split-screen window and draw an interface for a second application task in a right split-screen window. For another example, the source device may draw an interface of the first application task in the upper split-screen window and draw an interface of the second application task in the lower split-screen window. Subsequently, the source device may send an interface (i.e., screen projection data) including two split-screen windows to the target device in a video stream manner, and the target device may display the interface of the first application task and the interface of the second application task after the split-screen windows through one window (e.g., a third window).
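As a minimal sketch of the split-screen arrangement just described, the following Java snippet divides the projection display into a left half for the dragged first application task and a right half for the second application task (an upper/lower split would be analogous). All names and the example resolution are assumptions for illustration.

```java
/** Illustrative sketch: compute left/right split-screen bounds for the merged projection interface. */
public class SplitScreenLayout {

    static class Bounds {
        final int left, top, right, bottom;
        Bounds(int left, int top, int right, int bottom) {
            this.left = left; this.top = top; this.right = right; this.bottom = bottom;
        }
        @Override public String toString() {
            return "[" + left + "," + top + " - " + right + "," + bottom + "]";
        }
    }

    /** Left half of the display, assigned to the dragged first application task. */
    static Bounds leftHalf(int displayWidth, int displayHeight) {
        return new Bounds(0, 0, displayWidth / 2, displayHeight);
    }

    /** Right half of the display, assigned to the second application task. */
    static Bounds rightHalf(int displayWidth, int displayHeight) {
        return new Bounds(displayWidth / 2, 0, displayWidth, displayHeight);
    }

    public static void main(String[] args) {
        int w = 1920, h = 1080; // assumed resolution of the projection display
        System.out.println("first application task  -> " + leftHalf(w, h));
        System.out.println("second application task -> " + rightHalf(w, h));
    }
}
```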
In a possible implementation manner, after the target device sends the window merging event to the source device, the method further includes: in response to the window merging event, the source device displays the interface of the first application task and the interface of the second application task in a second interface in a split-screen manner; that is, the source device displays the split-screen interfaces of the two application tasks in synchronization with the window (for example, a third window) on the target device.
In a possible implementation manner, each window displayed by the target device may include a boundary hot zone; for example, the boundary hot zone may be an area near the edge of the window. The window merging operation may then be: the user drags the first window and inputs a release operation when the overlapping area between the boundary hot zone of the first window and the boundary hot zone of the second window is larger than an area threshold. Alternatively, the window merging operation may be: the user drags the first window so that the overlapping area between the boundary hot zone of the first window and the boundary hot zone of the second window is larger than the area threshold; that is, the window merging function can be triggered even if the user does not input a release operation. Of course, the window merging operation may also be another predefined operation, which is not limited in this application.
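One way to express the boundary hot zone check is sketched below: each hot zone is modeled as a band along a window edge, and the merge gesture is recognized when the hot zones of the two windows overlap by more than a threshold. The band width, the threshold value, and the choice of right/left edges are assumptions made only for illustration.

```java
/** Sketch of boundary-hot-zone based merge detection (band width and threshold are assumed values). */
public class BoundaryHotZone {

    static final int HOT_ZONE_WIDTH_PX = 20;    // assumed width of the band along each window edge
    static final long AREA_THRESHOLD = 2000;    // assumed overlap threshold in square pixels

    /** The right-edge hot zone of a window given as {left, top, right, bottom}. */
    static int[] rightEdgeHotZone(int[] win) {
        return new int[]{win[2] - HOT_ZONE_WIDTH_PX, win[1], win[2], win[3]};
    }

    /** The left-edge hot zone of a window. */
    static int[] leftEdgeHotZone(int[] win) {
        return new int[]{win[0], win[1], win[0] + HOT_ZONE_WIDTH_PX, win[3]};
    }

    static long overlapArea(int[] a, int[] b) {
        long w = Math.min(a[2], b[2]) - Math.max(a[0], b[0]);
        long h = Math.min(a[3], b[3]) - Math.max(a[1], b[1]);
        return (w > 0 && h > 0) ? w * h : 0;
    }

    /** True if dragging the first window against the second window should merge them. */
    static boolean isMergeOperation(int[] draggedWindow, int[] otherWindow) {
        long overlap = overlapArea(rightEdgeHotZone(draggedWindow), leftEdgeHotZone(otherWindow));
        return overlap > AREA_THRESHOLD;
    }
}
```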
In a fifth aspect, the present application provides a screen projection display method, including: the source device projects an interface of the first application task into a first window of the target device for display, and the source device projects an interface of the second application task into a second window of the target device for display; subsequently, the source device may receive a window merging event sent by the target device, where the window merging event is used to indicate that a user inputs a window merging operation for merging the first window and the second window; responding to the window merging event, and enabling the source end equipment to acquire screen projection data after the first window and the second window are merged; and screen projection data is sent to the target equipment, so that the window merging function is realized.
In a possible implementation manner, after the source device receives the window merging event sent by the target device, the method further includes: in response to the window merging event, the source device sets the window attributes of the first window and the second window as split-screen windows.
In a possible implementation manner, a source device includes a first display module and a second display module, where the first display module is configured to provide screen projection data to a target device when a screen is projected; the second display module is used for providing display data to the source end device and providing screen projection data to the target device during screen projection; before a window merging event is received, a first application task is operated on a first display module, and a second application task is operated on a second display module; after the source device receives the window merge event, the method further includes: and the source device moves the first application task from the first display module to the stack top of the second display module to run.
In a possible implementation manner, after the target device sends the window merge event to the source device, the method further includes: and responding to the window merging event, and displaying the interface of the first application task and the interface of the second application task in a second interface in a split screen mode by the source end equipment.
In a sixth aspect, the present application provides a screen projection display method, including: the target device displays a first window and a second window, wherein the first window comprises an interface of a first application task projected by the source device, and the second window comprises an interface of a second application task projected by the source device; subsequently, the target device detects that a user inputs a window merging operation, wherein the window merging operation is used for merging the first window and the second window; in response to the window merging operation, the target device may send a window merging event to the source device; the target device receives screen projection data sent by the source device in response to the window merging event; and the target equipment displays a first interface according to the screen projection data, wherein the interface of the first application task and the interface of the second application task are displayed in the first interface in a split screen mode.
In one possible implementation, each window displayed by the target device may contain a boundary hotspot; the window merging operation is as follows: and after the user drags the first window, when the overlapping area between the boundary hot area of the first window and the boundary hot area of the second window is larger than the area threshold value, the user inputs a release operation.
In a seventh aspect, the present application provides a screen projection display method, including: the target device displays a first window and a second window, where the first window includes an interface of a first application task projected by the source device, and the second window includes an interface of a second application task projected by the source device (that is, a multi-window screen projection scenario); then, if the target device detects that the user inputs a window suspension operation for suspending the first window on the second window, the target device may send a corresponding window suspension event to the source device; in response to the window suspension event, the source device obtains screen projection data with the first window suspended on the second window and sends the screen projection data to the target device; further, the target device may display a first interface according to the screen projection data, where the interface of the first application task is displayed on the second window in the form of a floating window.
That is to say, in a multi-window screen projection scenario, the user can drag one window in the target device to suspend it on another window, which triggers the target device and the source device to interact to implement a window suspension function; that is, the display content of the dragged window is displayed floating over the other window, so that the user can conveniently and efficiently manage the multiple windows projected onto the target device, improving the user experience.
In a possible implementation manner, after the target device sends the window floating event to the source device, the method further includes: in response to the window floating event, the source device sets the window attribute of the first window to the attribute of the floating window. For example, a WMS in the source device may set a window attribute of the first window to an attribute of the floating window.
In one possible implementation, the source device may include a first display module (e.g., display 1), where the first display module is configured to provide, to the target device, screen projection data when a screen is projected; the source device may obtain corresponding screen projection data from the first display module.
Before the source device receives the window suspension event, both the first application task and the second application task may run on the first display module, with the second application task at the top of the stack of the first display module; then, after the source device receives the window floating event, the source device may move the first application task to the top of the stack of the first display module to run. After the first application task moves to the top of the stack of the first display module, the second application task is pushed down in the stack but can still be set as a visible attribute. Because the first window where the first application task is located has the floating-window attribute, and the second window where the second application task is located has the default attribute, the source device can draw the interface of the second application task in full-screen form in the first display module and draw the interface of the first application task in floating-window form, with the interface of the first application task on top of the interface of the second application task. Subsequently, the source device may send the interface containing the first application task and the second application task (i.e., the screen projection data) to the target device in the form of a video stream, so that the target device displays an interface in which the first window is suspended on the second window.
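The layering described in this implementation can be modeled as an ordered list of window layers drawn from bottom to top: the second application task fills the projection display, and the first application task is drawn above it as a floating window. The following sketch uses hypothetical names and only illustrates the z-ordering and the window attributes involved.

```java
import java.util.ArrayList;
import java.util.List;

/** Illustrative model of how the projection frame is composed after a window suspension operation. */
public class FloatingWindowComposer {

    enum WindowAttr { DEFAULT_FULLSCREEN, FLOATING, SPLIT_SCREEN, INVISIBLE }

    static class WindowLayer {
        final String task;
        final WindowAttr attr;
        WindowLayer(String task, WindowAttr attr) { this.task = task; this.attr = attr; }
    }

    /** Layers are listed bottom-to-top; the last entry is drawn on top. */
    static List<WindowLayer> composeAfterFloat(String firstAppTask, String secondAppTask) {
        List<WindowLayer> layers = new ArrayList<>();
        layers.add(new WindowLayer(secondAppTask, WindowAttr.DEFAULT_FULLSCREEN)); // full-screen base
        layers.add(new WindowLayer(firstAppTask, WindowAttr.FLOATING));            // floating on top
        return layers;
    }

    public static void main(String[] args) {
        for (WindowLayer layer : composeAfterFloat("firstAppTask", "secondAppTask")) {
            System.out.println(layer.task + " drawn as " + layer.attr);
        }
    }
}
```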
In another possible implementation manner, the source device may include a first display module (e.g., display 1) and a second display module (e.g., display 0), where the first display module is configured to provide screen projection data to the target device when a screen is projected, and the source device may obtain corresponding screen projection data from the first display module; the second display module is configured to provide display data to the source device and provide screen projection data to the target device when a screen is projected, that is, the source device may obtain, from the second display module, the display data to display on its own display screen, and in a screen projection scenario, the source device may also obtain, from the second display module, the screen projection data of the main display interface.
Before receiving the window suspension event, the first application task may run on a first display module, and the second application task may run on a second display module; after the source device receives the window suspension event, the source device may move the first application task from the first display module to the stack top of the second display module to run, that is, execute the stack moving operation. After the first application task moves to the top of the stack of the second display module, the second application task is pushed into the stack, and the second application task can still be set as a visible attribute. Because the first window where the first application task is located is the attribute of the floating window, and the second window where the second application task is located is the default attribute, the source device can draw the interface of the second application task in a full-screen form in the second display module, and draw the interface of the first application task in a floating window form, and the interface of the first application task is located at the upper layer of the interface of the second application task. Subsequently, the source device may send an interface (i.e., screen projection data) including the first application task and the second application task to the target device in a video stream manner, so that the target device displays an interface in which the first window is suspended on the second window.
Meanwhile, the second display module is further configured to provide display data to the source device, so that the source device can obtain an interface including the first application task and the second application task from the second display module, and synchronously display, with the target device, an interface in which the first window is suspended on the second window.
In a possible implementation manner, the window floating operation may be: the user drags the first window and then inputs a release operation when the overlapping area between the first window and the second window is larger than an area threshold and the overlap has lasted longer than a time threshold. Alternatively, the window floating operation may be: the user drags the first window so that the overlapping area between the first window and the second window is larger than the area threshold and the overlap lasts longer than the time threshold; that is, the window floating function can be triggered even if the user does not input a release operation. Of course, the window floating operation may also be another predefined operation, which is not limited in this application.
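Compared with the window covering operation, the floating operation adds a time condition: the overlap must persist for longer than a time threshold. The sketch below tracks how long the dragged window has overlapped the other window during the drag and reports a floating operation once both conditions hold; the class name and the threshold values are assumptions.

```java
/** Sketch: detect a window floating operation from overlap plus dwell time (values are assumed). */
public class FloatGestureDetector {

    static final double AREA_THRESHOLD_RATIO = 0.5;  // assumed overlap ratio
    static final long TIME_THRESHOLD_MS = 1500;      // assumed dwell-time threshold

    private long overlapStartMs = -1;                // -1 means "not currently overlapping enough"

    /** Called for every drag-move event with the current overlap ratio between the two windows. */
    boolean onDragMove(double overlapRatio, long nowMs) {
        if (overlapRatio <= AREA_THRESHOLD_RATIO) {
            overlapStartMs = -1;                     // overlap broken: restart the timer
            return false;
        }
        if (overlapStartMs < 0) {
            overlapStartMs = nowMs;                  // sufficient overlap just started
        }
        // Floating operation once the overlap has lasted longer than the time threshold.
        return nowMs - overlapStartMs > TIME_THRESHOLD_MS;
    }
}
```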
In an eighth aspect, the present application provides a screen projection display method, including: the source device projects an interface of the first application task into a first window of the target device for display, and the source device projects an interface of the second application task into a second window of the target device for display; the source end device can receive a window suspension event sent by the target device, wherein the window suspension event is used for indicating that a user inputs a window suspension operation of suspending a first window on a second window; responding to the window suspension event, the source end equipment can acquire screen projection data of the first window suspended on the second window; and screen projection data is sent to the target equipment, so that the window suspension function is realized.
In one possible implementation manner, in response to the window floating event, the source device may further set the window attribute of the first window as the floating window.
In a possible implementation manner, the source device includes a first display module, where the first display module is configured to provide screen projection data to the target device when a screen is projected; before a source device receives a window suspension event, a first application task and a second application task both run on a first display module, and the second application task runs on the top of a stack of the first display module; after the source device receives the window floating event, the method further includes: and the source device moves the first application task to the stack top of the first display module to run.
In a possible implementation manner, the source device includes a first display module and a second display module, where the first display module is configured to provide screen projection data to the target device during screen projection; the second display module is used for providing display data to the source end device and providing screen projection data to the target device during screen projection; before a window suspension event is received, a first application task is operated on a first display module, and a second application task is operated on a second display module; after the source device receives the window floating event, the method further includes: the source device moves the first application task from the first display module to the top of the stack of the second display module to run.
In a ninth aspect, the present application provides a screen projection display method, including: the target equipment displays a first window and a second window, wherein the first window comprises an interface of a first application task projected by the source equipment, and the second window comprises an interface of a second application task projected by the source equipment; the target equipment detects that a user inputs window suspension operation, and the window suspension operation is used for suspending a first window on a second window; in response to the window floating operation, the target device may send a window floating event to the source device; the target device receives screen projection data sent by the source device in response to the window suspension event; and the target equipment displays a first interface according to the screen projection data, wherein in the first interface, the interface of the first application task is displayed on the second window in a floating window mode.
In a possible implementation manner, the window floating operation refers to: and after the user drags the first window, when the overlapping area between the first window and the second window is larger than the area threshold and the duration time is larger than the time threshold, releasing operation input by the user.
It should be noted that the window covering operation, the window merging operation, and the window floating operation may be any predefined operations, which is not limited in this application. For example, after detecting that the user inputs the window covering operation, the source device may be triggered to interact with the target device to implement the window suspension function. For another example, after detecting that the user inputs the window merging operation, the source device and the target device may be triggered to interact to implement the window covering function.
In addition, the source device and the target device can also realize the superposition of multiple functions in the window covering function, the window merging function and the window suspension function by the method. For example, the source device and the target device may first merge the window 1 and the window 2 in response to the operation 1 of the user, and then merge the merged window with the window 3 in response to the operation 2 of the user. For another example, the source device and the target device may hover and display the window 1 on the window 2 in response to the operation 1 of the user, and merge the window containing the hovering window with the window 3 in response to the operation 2 of the user.
In a tenth aspect, the present application provides an electronic device, where the electronic device is a source device, and the source device includes: the system comprises a memory, a display screen, a communication module and one or more processors; the memory and the display screen are coupled with the processor. Wherein the memory is to store computer program code, the computer program code comprising computer instructions; when the electronic device is running, the processor is configured to execute the one or more computer instructions stored in the memory, so as to cause the electronic device to perform the screen projection display method performed by the source device in any of the above aspects.
In an eleventh aspect, the present application provides an electronic device, where the electronic device is a target device, and the target device includes: a memory, a display screen, and one or more processors; the memory and the display screen are coupled to the processor. The memory is configured to store computer program code, and the computer program code includes computer instructions; when the electronic device runs, the processor is configured to execute the one or more computer instructions stored in the memory, so that the electronic device performs the screen projection display method performed by the target device in any of the above aspects.
In a twelfth aspect, the present application provides a screen projection display system, where the system includes the above source device and target device, and the source device and the target device may perform, through interaction, the screen projection display method according to any of the above aspects.
In a thirteenth aspect, the present application provides a computer-readable storage medium comprising computer instructions. When the computer instructions are run on the electronic equipment, the electronic equipment is caused to execute the screen projection display method of any one aspect.
In a fourteenth aspect, the present application provides a computer program product, which when run on an electronic device, causes the electronic device to execute the screen projection display method according to any one of the above aspects.
It can be understood that the electronic device, the computer-readable storage medium, and the computer program product provided in the foregoing aspects are all configured to perform the corresponding methods provided above. Therefore, for the beneficial effects that they can achieve, refer to the beneficial effects of the corresponding methods provided above. Details are not described herein again.
Drawings
FIG. 1 is a schematic diagram of a multi-window screen projection scene in the prior art;
FIG. 2 is a schematic diagram of a first application scenario of a screen projection display method according to an embodiment of the present application;
FIG. 3 is a schematic diagram of a second application scenario of a screen projection display method according to an embodiment of the present application;
FIG. 4 is a schematic structural diagram of a mobile phone according to an embodiment of the present application;
FIG. 5 is a schematic diagram of an operating system architecture in a mobile phone according to an embodiment of the present application;
FIG. 6 is a first schematic diagram of a multi-window screen projection principle according to an embodiment of the present application;
FIG. 7 is a second schematic diagram of a multi-window screen projection principle according to an embodiment of the present application;
FIG. 8 is a third schematic diagram of a multi-window screen projection principle according to an embodiment of the present application;
FIG. 9 is a schematic diagram of a third application scenario of a screen projection display method according to an embodiment of the present application;
FIG. 10 is a schematic diagram of a fourth application scenario of a screen projection display method according to an embodiment of the present application;
FIG. 11 is a first interaction diagram of a screen projection display method according to an embodiment of the present application;
FIG. 12 is a schematic diagram of a fifth application scenario of a screen projection display method according to an embodiment of the present application;
FIG. 13 is a schematic diagram of a sixth application scenario of a screen projection display method according to an embodiment of the present application;
FIG. 14 is a second interaction diagram of a screen projection display method according to an embodiment of the present application;
FIG. 15 is a schematic diagram of a seventh application scenario of a screen projection display method according to an embodiment of the present application;
FIG. 16 is a schematic diagram of an eighth application scenario of a screen projection display method according to an embodiment of the present application;
FIG. 17 is a schematic diagram of a ninth application scenario of a screen projection display method according to an embodiment of the present application;
FIG. 18 is a third interaction diagram of a screen projection display method according to an embodiment of the present application;
FIG. 19 is a schematic diagram of a tenth application scenario of a screen projection display method according to an embodiment of the present application;
FIG. 20 is a schematic diagram of an eleventh application scenario of a screen projection display method according to an embodiment of the present application;
FIG. 21 is a fourth interaction diagram of a screen projection display method according to an embodiment of the present application;
FIG. 22 is a schematic diagram of a twelfth application scenario of a screen projection display method according to an embodiment of the present application;
FIG. 23 is a schematic diagram of a thirteenth application scenario of a screen projection display method according to an embodiment of the present application;
FIG. 24 is a schematic diagram of a fourteenth application scenario of a screen projection display method according to an embodiment of the present application;
FIG. 25 is a schematic diagram of a fifteenth application scenario of a screen projection display method according to an embodiment of the present application;
FIG. 26 is a fifth interaction diagram of a screen projection display method according to an embodiment of the present application;
FIG. 27 is a schematic diagram of a sixteenth application scenario of a screen projection display method according to an embodiment of the present application;
FIG. 28 is a sixth interaction diagram of a screen projection display method according to an embodiment of the present application;
FIG. 29 is a schematic diagram of a seventeenth application scenario of a screen projection display method according to an embodiment of the present application;
FIG. 30 is a schematic diagram of an eighteenth application scenario of a screen projection display method according to an embodiment of the present application;
FIG. 31 is a schematic diagram of a nineteenth application scenario of a screen projection display method according to an embodiment of the present application;
FIG. 32 is a schematic structural diagram of a source device according to an embodiment of the present application;
FIG. 33 is a schematic structural diagram of a target device according to an embodiment of the present application.
Detailed Description
In the following, the terms "first", "second" are used for descriptive purposes only and are not to be understood as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include one or more of that feature. In the description of the present embodiment, "a plurality" means two or more unless otherwise specified.
At present, the cooperative use of multiple devices has become a common way of working and entertainment. When multiple devices are used cooperatively, a source device (also called the source end) may establish a connection with a target device (also called the sink end). For example, the source device may establish a Wi-Fi connection, a Bluetooth connection, or a P2P (peer-to-peer) connection with the target device. Further, the source device may project, through the Miracast protocol or the DLNA (Digital Living Network Alliance) protocol, an image, a document, audio, video, or an application (or a task in the application) from the source device to the target device for display or playback, so that the user can use the related functions provided by the source device on the target device.
For example, a mobile phone is used as a source device, a PC is used as a target device, as shown in fig. 2, an electronic tag 201 may be set on the PC, and the electronic tag 201 may also be referred to as an NFC (near field communication) tag or an NFC patch. The electronic tag 201 is generally provided with a coil, and the device information of the PC can be written in the coil of the electronic tag 201 in advance when the PC is shipped. For example, the device information of the PC may include one or more items of a name of the PC, a bluetooth MAC (media access control) address, or an IP address.
When the user needs to project data such as applications and documents from the mobile phone to the PC for display, the user may turn on the NFC function of the mobile phone and bring the mobile phone close to or into contact with the electronic tag 201 on the PC. In this way, when the mobile phone and the electronic tag 201 are close to each other, the mobile phone can read the device information of the PC from the electronic tag 201 by exchanging a near field signal. Further, the mobile phone can establish a wireless communication connection with the PC according to the device information of the PC. For example, the wireless communication connection may specifically be a Bluetooth connection, a Wi-Fi connection, or a Wi-Fi P2P connection, which is not limited in this embodiment of the present application.
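On an Android phone, reading the device information written into such a tag could look roughly like the following. The NFC reader-mode APIs used here are real Android APIs, but the payload layout (a single record carrying the PC's name, Bluetooth MAC address, and IP address separated by semicolons) is an assumption made purely for illustration.

```java
import android.app.Activity;
import android.nfc.NdefMessage;
import android.nfc.NdefRecord;
import android.nfc.NfcAdapter;
import android.nfc.tech.Ndef;
import java.nio.charset.StandardCharsets;

/** Hedged sketch: read the target device's info from its NFC tag using Android reader mode. */
public class TagReader {

    public static void startReading(Activity activity) {
        NfcAdapter adapter = NfcAdapter.getDefaultAdapter(activity);
        if (adapter == null) {
            return; // device has no NFC hardware
        }
        adapter.enableReaderMode(activity, tag -> {
            Ndef ndef = Ndef.get(tag);
            if (ndef == null) {
                return; // tag is not NDEF formatted
            }
            try {
                ndef.connect();
                NdefMessage message = ndef.getNdefMessage();
                for (NdefRecord record : message.getRecords()) {
                    // Assumed payload format: "name;btMac;ipAddress" written before the PC is shipped.
                    String payload = new String(record.getPayload(), StandardCharsets.UTF_8);
                    String[] deviceInfo = payload.split(";");
                    // deviceInfo would then be used to establish the wireless connection.
                }
            } catch (Exception e) {
                // I/O or NDEF format error while reading the tag; ignored in this sketch.
            } finally {
                try { ndef.close(); } catch (Exception ignored) { }
            }
        }, NfcAdapter.FLAG_READER_NFC_A | NfcAdapter.FLAG_READER_NFC_B, null);
    }
}
```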
Certainly, in addition to triggering the connection by tapping the mobile phone against the PC in the above manner, the mobile phone may also be triggered to establish a connection with the PC by searching for nearby devices or through a gesture such as dragging, or the mobile phone may establish a connection with the PC through other communication technologies such as UWB (ultra-wideband), which is not limited in this embodiment of the present application.
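As an example of establishing the wireless communication connection mentioned above, the following outline uses Android's Wi-Fi P2P APIs (WifiP2pManager) to connect to the PC whose MAC address was read from the electronic tag. The APIs are real, but permission checks, the broadcast receiver that reports when the connection is actually formed, and error handling are omitted, so this is only a simplified sketch.

```java
import android.content.Context;
import android.net.wifi.p2p.WifiP2pConfig;
import android.net.wifi.p2p.WifiP2pManager;

/** Outline of requesting a Wi-Fi P2P connection to the target device (simplified, no error handling). */
public class P2pConnector {

    public static void connectToPeer(Context context, String peerMacAddress) {
        WifiP2pManager manager = (WifiP2pManager) context.getSystemService(Context.WIFI_P2P_SERVICE);
        WifiP2pManager.Channel channel = manager.initialize(context, context.getMainLooper(), null);

        WifiP2pConfig config = new WifiP2pConfig();
        config.deviceAddress = peerMacAddress;   // MAC address read from the electronic tag

        manager.connect(channel, config, new WifiP2pManager.ActionListener() {
            @Override public void onSuccess() {
                // Connection request sent; completion is reported via a
                // WIFI_P2P_CONNECTION_CHANGED_ACTION broadcast (receiver not shown).
            }
            @Override public void onFailure(int reason) {
                // The connection request could not be sent (e.g. Wi-Fi P2P unsupported or busy).
            }
        });
    }
}
```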
After the wireless communication connection is established between the mobile phone and the PC, as shown in FIG. 2, the mobile phone can send the current display interface 202 to the PC in real time through the established wireless communication connection. For example, the mobile phone may transmit the current display interface 202 to the PC in real time in the form of a video stream. The PC may display the display interface 202 in a window 203; the window 203 may include a control bar 204, and the control bar 204 may include maximize, minimize, and close buttons. Taking the display interface 202 as the desktop of the mobile phone as an example, after the PC displays the desktop of the mobile phone in the window 203, the user can use the various functions provided by the mobile phone in the window 203.
For example, the user may input an operation of opening the video APP in the window 203 using a keyboard or a mouse of the PC. Furthermore, the PC can send the operation input by the user to the mobile phone, and the mobile phone is triggered to respond to the operation to project the video APP into the PC. For example, the mobile phone may run the video APP in the background, and as shown in fig. 3, the mobile phone may send display data (i.e., the display interface 205 of the video APP) generated when each task of the video APP is run to the PC in the form of a video stream. Further, the PC can display a display interface 205 of the video APP through a window 206. At this point, the PC can still display the display interface 202 in the window 203 in synchronization with the handset.
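Sending a display interface "in the form of a video stream" typically means rendering the projected content into the input surface of a hardware video encoder. The sketch below wires an H.264 MediaCodec encoder to a virtual display used for projection; the MediaCodec and DisplayManager calls are real Android APIs, but the codec parameters are placeholders and the transport of the encoded packets to the PC is not shown.

```java
import android.content.Context;
import android.hardware.display.DisplayManager;
import android.media.MediaCodec;
import android.media.MediaCodecInfo;
import android.media.MediaFormat;
import android.view.Surface;
import java.io.IOException;

/** Sketch: encode everything rendered on a projection display as an H.264 video stream. */
public class ProjectionEncoder {

    public static void startProjection(Context context) throws IOException {
        int width = 1080, height = 2340;                               // placeholder resolution

        // 1. Configure an H.264 encoder that takes its input from a Surface.
        MediaFormat format = MediaFormat.createVideoFormat(MediaFormat.MIMETYPE_VIDEO_AVC, width, height);
        format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
                MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);
        format.setInteger(MediaFormat.KEY_BIT_RATE, 8_000_000);        // placeholder bit rate
        format.setInteger(MediaFormat.KEY_FRAME_RATE, 30);             // placeholder frame rate
        format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1);        // one key frame per second

        MediaCodec encoder = MediaCodec.createEncoderByType(MediaFormat.MIMETYPE_VIDEO_AVC);
        encoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
        Surface inputSurface = encoder.createInputSurface();
        encoder.start();

        // 2. Create a virtual display whose frames are rendered into the encoder's input surface.
        DisplayManager dm = (DisplayManager) context.getSystemService(Context.DISPLAY_SERVICE);
        dm.createVirtualDisplay("projection-display", width, height, 440, inputSurface,
                DisplayManager.VIRTUAL_DISPLAY_FLAG_PRESENTATION);

        // 3. Encoded packets would now be drained from the encoder's output buffers
        //    (MediaCodec.dequeueOutputBuffer) and sent to the PC over the established
        //    wireless connection; that transport is omitted from this sketch.
    }
}
```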
That is to say, the mobile phone as the source device may not only project the interface of the application being displayed (which may be referred to as a main display interface hereinafter) into the window of the PC for display, but also project the interfaces of other applications that are not displayed in the mobile phone into the window of the PC for display, so that the PC displays multiple applications that are projected by the mobile phone in a multi-window manner.
In this scenario, the user can manage the display content of each window in the PC. For example, the user may input a pause operation into the window 206, triggering the PC to pause playing the video being played by the video APP in the window 206. As another example, the user may click on a minimize button in window 206, triggering the PC to minimize window 206.
In the embodiment of the application, a user can separately manage each window on a PC (namely, a target device) in a screen projection scene, and can establish association among the windows during screen projection on the PC through window covering, window merging, window suspension or other forms. For example, the user may drag one window projected in the PC to overlay another window, causing the overlaid window to close. For another example, the user may drag one window projected in the PC to splice with another window, so that the two windows are merged into one window. Therefore, a user can manage a plurality of windows projected in the PC (target equipment) through one-time operation, so that the plurality of windows projected in the target equipment can be managed more efficiently, and the use experience of the user is improved.
Still taking a mobile phone as the source device in the above screen projection scenario as an example, FIG. 4 shows a schematic structural diagram of the mobile phone.
As shown in fig. 4, the mobile phone may include a processor 110, an external memory interface 120, an internal memory 121, a Universal Serial Bus (USB) interface 130, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, and the like.
It is to be understood that the structure illustrated in the embodiment of the present application does not constitute a specific limitation on the mobile phone. In other embodiments of the present application, the mobile phone may include more or fewer components than illustrated, or combine certain components, or split certain components, or use a different arrangement of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Processor 110 may include one or more processing units, such as: the processor 110 may include an Application Processor (AP), a modem processor, a Graphics Processor (GPU), an Image Signal Processor (ISP), a controller, a memory, a video codec, a Digital Signal Processor (DSP), a baseband processor, and/or a neural-Network Processing Unit (NPU), among others. The different processing units may be separate devices or may be integrated into one or more processors.
A memory may also be provided in processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that have just been used or recycled by the processor 110. If the processor 110 needs to reuse the instruction or data, it can be called directly from the memory. Avoiding repeated accesses reduces the latency of the processor 110, thereby increasing the efficiency of the system.
In some embodiments, processor 110 may include one or more interfaces. The interfaces may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a universal serial bus (USB) interface, etc.
The wireless communication function of the mobile phone can be realized by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, the modem processor, the baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the handset may be used to cover a single or multiple communication bands. Different antennas can also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed as a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 may provide a solution including wireless communication of 2G/3G/4G/5G, etc. applied to a mobile phone. The mobile communication module 150 may include at least one filter, a switch, a power amplifier, a Low Noise Amplifier (LNA), and the like. The mobile communication module 150 may receive the electromagnetic wave from the antenna 1, filter, amplify, etc. the received electromagnetic wave, and transmit the electromagnetic wave to the modem processor for demodulation. The mobile communication module 150 may also amplify the signal modulated by the modem processor, and convert the signal into electromagnetic wave through the antenna 1 to radiate the electromagnetic wave. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the same device as at least some of the modules of the processor 110.
The wireless communication module 160 may provide solutions for wireless communication applied to a mobile phone, including Wireless Local Area Networks (WLANs) (e.g., wireless fidelity (Wi-Fi) networks), bluetooth (BT), global Navigation Satellite System (GNSS), frequency Modulation (FM), near Field Communication (NFC), infrared (IR), and the like. The wireless communication module 160 may be one or more devices integrating at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, performs frequency modulation and filtering processing on electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, perform frequency modulation and amplification on the signal, and convert the signal into electromagnetic waves through the antenna 2 to radiate the electromagnetic waves.
In some embodiments, the antenna 1 of the handset is coupled to the mobile communication module 150 and the antenna 2 is coupled to the wireless communication module 160 so that the handset can communicate with the network and other devices through wireless communication techniques. The wireless communication technology may include global system for mobile communications (GSM), general Packet Radio Service (GPRS), code division multiple access (code division multiple access, CDMA), wideband Code Division Multiple Access (WCDMA), time-division code division multiple access (time-division code division multiple access, TD-SCDMA), long Term Evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technologies, etc. The GNSS may include a Global Positioning System (GPS), a global navigation satellite system (GLONASS), a beidou satellite navigation system (BDS), a quasi-zenith satellite system (QZSS), and/or a Satellite Based Augmentation System (SBAS).
The mobile phone realizes the display function through the GPU, the display screen 194, the application processor and the like. The GPU is a microprocessor for image processing, connected to the display screen 194 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. The processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
The display screen 194 is used to display images, video, and the like. The display screen 194 includes a display panel. The display panel may be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), or the like. In some embodiments, the mobile phone may include 1 or N display screens 194, where N is a positive integer greater than 1.
The mobile phone can realize shooting functions through the ISP, the camera 193, the video codec, the GPU, the display screen 194, the application processor and the like.
The ISP is used to process the data fed back by the camera 193. For example, when a photo is taken, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, the optical signal is converted into an electrical signal, and the camera photosensitive element transmits the electrical signal to the ISP for processing and converting into an image visible to naked eyes. The ISP can also carry out algorithm optimization on noise, brightness and skin color of the image. The ISP can also optimize parameters such as exposure, color temperature and the like of a shooting scene. In some embodiments, the ISP may be provided in camera 193.
The camera 193 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image to the photosensitive element. The photosensitive element may be a Charge Coupled Device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The light sensing element converts the optical signal into an electrical signal, which is then passed to the ISP where it is converted into a digital image signal. And the ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into an image signal in a standard RGB, YUV and other formats. In some embodiments, the handset may include 1 or N cameras 193, N being a positive integer greater than 1.
The digital signal processor is used for processing digital signals, and can process digital image signals and other digital signals. For example, when the handset is in frequency bin selection, the digital signal processor is used for performing fourier transform and the like on the frequency bin energy.
Video codecs are used to compress or decompress digital video. The handset may support one or more video codecs. Thus, the mobile phone can play or record videos in various encoding formats, such as: moving Picture Experts Group (MPEG) 1, MPEG2, MPEG3, MPEG4, and the like.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to extend the storage capability of the mobile phone. The external memory card communicates with the processor 110 through the external memory interface 120 to implement a data storage function. For example, files such as music, video, etc. are saved in an external memory card.
The internal memory 121 may be used to store computer-executable program code, which includes instructions. The processor 110 executes various functional applications of the cellular phone and data processing by executing instructions stored in the internal memory 121. The internal memory 121 may include a program storage area and a data storage area. The storage program area may store an operating system, an application program (such as a sound playing function, an image playing function, and the like) required by at least one function, and the like. The data storage area can store data (such as audio data, phone book and the like) created in the use process of the mobile phone. In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory, such as at least one magnetic disk storage device, a flash memory device, a universal flash memory (UFS), and the like.
The mobile phone can implement an audio function through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the earphone interface 170D, and the application processor. Such as music playing, recording, etc.
The audio module 170 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be disposed in the processor 110, or some functional modules of the audio module 170 may be disposed in the processor 110.
The speaker 170A, also called a "horn", is used to convert the audio electrical signal into an acoustic signal. The handset can listen to music, or to hands-free talk, through the speaker 170A.
The receiver 170B, also called "earpiece", is used to convert the electrical audio signal into an acoustic signal. When the mobile phone receives a call or voice information, the receiver 170B can be close to the ear to receive voice.
The microphone 170C, also referred to as a "microphone," is used to convert sound signals into electrical signals. When making a call or transmitting voice information, the user can input a voice signal to the microphone 170C by speaking near the microphone 170C through the mouth. The handset may be provided with at least one microphone 170C. In other embodiments, the mobile phone may be provided with two microphones 170C to achieve the noise reduction function in addition to collecting the sound signal. In other embodiments, the mobile phone may further include three, four or more microphones 170C to collect sound signals, reduce noise, identify sound sources, and implement directional recording functions.
The earphone interface 170D is used to connect a wired earphone. The earphone interface 170D may be the USB interface 130, or may be a 3.5 mm open mobile terminal platform (OMTP) standard interface or a Cellular Telecommunications Industry Association of the USA (CTIA) standard interface.
The sensor module 180 may include a pressure sensor, a gyroscope sensor, an air pressure sensor, a magnetic sensor, an acceleration sensor, a distance sensor, a proximity light sensor, a fingerprint sensor, a temperature sensor, a touch sensor, an ambient light sensor, a bone conduction sensor, and the like.
Certainly, the mobile phone may further include a charging management module, a power management module, a battery, a key, an indicator, 1 or more SIM card interfaces, and the like, which is not limited in this embodiment of the present application.
The software system of the mobile phone can adopt a layered architecture, an event-driven architecture, a micro-kernel architecture, a micro-service architecture, or a cloud architecture. The embodiment of the present application takes a layered-architecture Android system as an example to illustrate the software structure of the mobile phone.
Fig. 5 is a block diagram of a software structure of a mobile phone according to an embodiment of the present application.
The layered architecture divides the software into several layers, each of which has a clear role and division of labor. The layers communicate with each other through software interfaces. In some embodiments, the Android system is divided into four layers, namely, from top to bottom, an application layer, an application framework layer, Android runtime (Android runtime) and system libraries, and a kernel layer.
1. Application layer
The application layer may include a series of application packages.
As shown in fig. 5, applications such as calls, memo, browser, contacts, camera, gallery, calendar, map, bluetooth, music, video, and short message may be installed in the application layer.
2. Application framework layer
The application framework layer provides an Application Programming Interface (API) and a programming framework for the application program of the application layer. The application framework layer includes a number of predefined functions.
In the embodiment of the present application, as shown in fig. 5, an Activity Management Service (AMS) and a Window Management Service (WMS) may be included in the application framework layer.
Among other things, the AMS may be used to manage the lifecycle of running applications. Applications typically run in the operating system in the form of Activities. For each Activity, there is a corresponding application record (ActivityRecord) in the Activity manager, which records the state of that Activity. The Activity manager can use this ActivityRecord as an identifier to schedule the Activity processes of the application.
The WMS may be configured to manage Graphical User Interface (GUI) resources used on a mobile phone screen, and specifically may include: the method comprises the following steps of creating and destroying windows, displaying and hiding windows, layout of windows, management of focuses, input method and wallpaper management and the like.
In the embodiment of the present application, the mobile phone running a desktop (also referred to as a desktop application, launcher, etc.) is taken as an example. After the mobile phone starts running the desktop, as shown in fig. 6, the AMS may create a corresponding application stack (stack), such as stack A1, for the desktop in a default display module (e.g., display 0) of the mobile phone. Generally, display 0 corresponds to the display screen of the mobile phone, that is, a display interface drawn in display 0 is finally output to the display screen of the mobile phone for display. The display module may also be referred to as a virtual screen, virtual display, or the like.
For example, the Stack A1 may include one or more activities (activities, or tasks or application tasks) that the desktop needs to execute. When the AMS executes Activity in stack A1, the WMS may be invoked to draw a corresponding display interface (e.g., display interface 601 shown in fig. 6) in real time in display 0, where the display interface 601 is associated with window 1 created by the WMS. Further, the WMS may output the display interface 601 generated in display 0 to the display of the mobile phone, and display the display interface 601 in the window 1 by the display of the mobile phone, thereby presenting a desktop on the screen of the mobile phone to the user.
It should be noted that display 0 may further include stacks corresponding to other applications in the mobile phone, for example, stack A2. Generally, the stack at the top of display 0 is the stack of the application running in the foreground of the mobile phone, and stacks below the top, such as stack A2, are the stacks of applications running in the background of the mobile phone. The AMS may set associated stack attributes for each stack, e.g., whether it is visible, whether it is a split-screen application, etc. Generally, the stack at the top has a visible attribute, and the display interface generated when it runs can be displayed on the mobile phone screen, whereas a stack below the top has an invisible attribute, and the display interface generated when it runs is not displayed on the mobile phone screen. A small illustrative data model of this bookkeeping is sketched below.
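The sketch below is purely illustrative: the class and field names are invented for this description and are not actual Android framework classes. It only restates the idea that each display module keeps an ordered list of stacks, and that on display 0 only the top stack is marked visible.

import java.util.ArrayDeque;
import java.util.Deque;

// Illustrative model only: names are hypothetical, not real framework classes.
final class StackRecord {
    final String appName;
    boolean visible;
    StackRecord(String appName) { this.appName = appName; }
}

final class DisplayModule {
    final int displayId;
    final Deque<StackRecord> stacks = new ArrayDeque<>(); // head of deque = top of stack

    DisplayModule(int displayId) { this.displayId = displayId; }

    // Push a stack to the top; on display 0 only the top stack stays visible.
    void moveToTop(StackRecord s, boolean onlyTopVisible) {
        stacks.remove(s);
        stacks.addFirst(s);
        if (onlyTopVisible) {
            for (StackRecord r : stacks) r.visible = (r == s);
        } else {
            s.visible = true;
        }
    }
}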
If the mobile phone starts the screen projection function and establishes connection with a target device (for example, a PC), as shown in fig. 6, the WMS may further output a display interface 601 generated by a stack A1 located at the top of the display 0 to the PC, so that the PC may display the display interface 601 being displayed by the mobile phone in a corresponding window, thereby implementing the screen projection function in the multi-device collaborative scene.
In such a screen projection scene, if it is detected that a user operates the display interface 601 in the PC to open a certain application (for example, a video APP) in the mobile phone, the PC may send a corresponding screen projection instruction 1 to the mobile phone, instructing the mobile phone to project an application task of the video APP to the PC for display. At this time, as shown in fig. 7, in response to the screen projection instruction 1, the AMS may create a new display module (e.g., display 1), and then create a stack B1 corresponding to the video APP in display 1. Display 1 corresponds to the display screen of the PC, that is, the display interface drawn in display 1 is finally output to the display screen of the PC for display.
Similar to stack A1, one or more Activities that the video APP needs to execute can be included in stack B1. The AMS may invoke the WMS to draw a corresponding display interface (e.g., the display interface 701 shown in FIG. 7) in real time in display 1 when executing Activity in stack B1, and the display interface 701 is associated with the window 2 created by the WMS. Further, the WMS may output the display interface 701 generated in the display 1 to the PC, and the PC may display the display interface 701 in the window 2, thereby projecting the application task of the video APP in the mobile phone to the PC for display.
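For an application-level illustration of creating a new display module and starting an application task on it, the public APIs DisplayManager.createVirtualDisplay() and ActivityOptions.setLaunchDisplayId() can be combined as sketched below. The package name, dimensions, and the permission to launch activities on the virtual display are assumptions; the system-side AMS/WMS implementation described in this application would not be limited to these public interfaces.

import android.app.ActivityOptions;
import android.content.Context;
import android.content.Intent;
import android.hardware.display.DisplayManager;
import android.hardware.display.VirtualDisplay;
import android.view.Surface;

public final class ProjectionLaunchSketch {
    // Creates a secondary display backed by an encoder surface and launches the
    // target app's task on it, so its interface is rendered for the PC rather
    // than for the phone screen.
    public static VirtualDisplay projectApp(Context context, Surface encoderSurface,
            String packageName) {
        DisplayManager dm = (DisplayManager) context.getSystemService(Context.DISPLAY_SERVICE);
        VirtualDisplay display = dm.createVirtualDisplay("projection-display",
                1280, 720, 320, encoderSurface,
                DisplayManager.VIRTUAL_DISPLAY_FLAG_PRESENTATION);

        // May be null if the package is not installed; checks omitted for brevity.
        Intent launch = context.getPackageManager().getLaunchIntentForPackage(packageName);
        launch.addFlags(Intent.FLAG_ACTIVITY_NEW_TASK);

        ActivityOptions options = ActivityOptions.makeBasic();
        options.setLaunchDisplayId(display.getDisplay().getDisplayId());
        context.startActivity(launch, options.toBundle());
        return display;
    }
}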
Similarly, the user can project other applications in the mobile phone to the PC for display according to the above method. For example, if it is detected that the user operates the PC to open the chat APP in the mobile phone, the PC may send a corresponding screen projection instruction 2 to the mobile phone to instruct the mobile phone to project the application task of the chat APP to the PC for display. As shown in fig. 8, in response to the screen projection instruction 2, the AMS may create a stack B2 corresponding to the chat APP in display 1. Unlike display 0, all the stacks in display 1 can be set to the visible attribute, and the stack at the top is typically the stack of the application that the user most recently opened or operated. The AMS, when executing the Activity in stack B2, may invoke the WMS to render a corresponding display interface (e.g., display interface 801 shown in fig. 8) in display 1 in real time, where the display interface 801 is associated with the window 3 created by the WMS. Further, the WMS may output the display interface 801 generated in display 1 to the PC, and the PC displays the display interface 801 in window 3, thereby projecting the application task of the chat APP in the mobile phone to the PC for display.
At this time, as shown in fig. 9, the PC may display, in real time, the desktop 601 running in display 0 of the mobile phone through window 1, the display interface 701 of the video APP running in display 1 through window 2, and the display interface 801 of the chat APP running in display 1 through window 3, so that multiple application tasks in the mobile phone are projected to the PC in a multi-window form for display. It should be noted that application tasks in different windows may belong to the same application or to different applications, which is not limited in this embodiment of the present application.
The window 1 corresponds to the stack A1 at the top of the display 0, and the other windows (for example, the window 2 and the window 3) correspond to the respective stacks in the display 1. That is, in the above-mentioned screen projection scenario, one window in the PC may synchronously display the display interface of the mobile phone (for example, the display interface 601 of the desktop), and another window or windows in the PC may be used to project the display interfaces of other applications in the mobile phone.
In the embodiment of the present application, the user can perform operations such as covering or merging on the multiple windows displayed on the PC in the screen projection scene, triggering the mobile phone to establish associations among the multiple windows projected on the PC so as to manage them, so that the multiple windows on the PC can implement functions such as replacement, split screen, floating, or merging, thereby improving the user experience in the screen projection scene.
The specific display process of the mobile phone (source device) and the PC (target device) during multi-application screen projection will be described in detail in the following embodiments, and therefore, details are not repeated here.
In addition, the application framework layer may further include a power management service, a content providing service, a view system, a resource management service, a notification management service, and the like, which is not limited in this embodiment of the present application.
3. Android runtime and system library
The Android runtime comprises a core library and a virtual machine. The Android runtime is responsible for scheduling and managing an Android system.
The core library comprises two parts: one part is a function which needs to be called by java language, and the other part is a core library of android.
The application layer and the application framework layer run in a virtual machine. The virtual machine executes java files of the application layer and the application framework layer as binary files. The virtual machine is used for performing the functions of object life cycle management, stack management, thread management, safety and exception management, garbage collection and the like.
The system library may include a plurality of functional modules. For example: surface managers (surface managers), media Libraries (Media Libraries), three-dimensional graphics processing Libraries (e.g., openGL ES), 2D graphics engines (e.g., SGL), and the like.
Wherein the surface manager is used for managing the display subsystem and providing the fusion of the 2D and 3D layers for a plurality of application programs. The media library supports a variety of commonly used audio, video format playback and recording, and still image files, among others. The media library may support a variety of audio-video encoding formats such as MPEG4, h.264, MP3, AAC, AMR, JPG, PNG, etc. The three-dimensional graphic processing library is used for realizing three-dimensional graphic drawing, image rendering, synthesis, layer processing and the like. The 2D graphics engine is a drawing engine for 2D drawing.
4. Inner core layer
The kernel layer is a layer between hardware and software. The kernel layer at least comprises a display driver, a camera driver, an audio driver, a sensor driver and the like, and the embodiment of the application does not limit the display driver, the camera driver, the audio driver, the sensor driver and the like.
Hereinafter, a screen projection display method provided by the embodiment of the present application is described in detail with reference to the accompanying drawings, by taking a mobile phone as a source device and a PC as a target device in a screen projection scene as examples.
As shown in fig. 6-9, after the mobile phone establishes a communication connection with the PC, the mobile phone may project an application task (e.g., the desktop) running in the foreground to the PC, so that the PC displays the display interface 601 of the desktop in window 1. At this time, stack A1 corresponding to the desktop runs in display 0 of the mobile phone. That is to say, the display interface in window 1 is synchronized with the display interface on the screen of the mobile phone, and the display interface in window 1 in the screen projection scene may subsequently be referred to as the main display interface of the source device (i.e., the mobile phone).
The mobile phone and the PC display the display interface in the window 1 synchronously, which means: the specific contents displayed by the two display devices may be the same, but the shapes, sizes, positions, arrangements, resolutions, DPI (Dots Per Inch), and other display parameters of the two display devices may be different, which is not limited in this embodiment of the present invention.
Illustratively, as also shown in fig. 9, the mobile phone may also project an application task of the video APP in the mobile phone to the PC, so that the PC displays a display interface 701 of the video APP in the window 2. At this time, stack B1 corresponding to the video APP runs in display 1 of the handset. For example, the mobile phone may also project an application task of the chat APP in the mobile phone to the PC, so that the PC displays the display interface 801 of the chat APP in the window 3. At this time, stack B2 corresponding to the chat APP is also running in display 1 of the handset.
That is, except that the window 1 displayed on the PC in synchronization with the screen of the mobile phone corresponds to the default display 0 of the mobile phone, other windows (e.g., window 2 or window 3) projected on the PC correspond to the newly created display 1 of the mobile phone. The display content generated in display 1 may be sent to the display of the PC for display, but is not typically sent to the display of the handset for display. Correspondingly, the display content generated in the display 0 is generally sent to the display screen of the mobile phone for displaying, and in a screen projection scene, the display content generated by the Stack at the top of the display 0 can be sent to the display of the PC for displaying.
It should be noted that the windows displayed on the PC during screen projection (for example, the windows 1 to 3) may include components such as a control bar, a title bar, a status bar, or a toolbar in addition to the display interface of the relevant application. For example, a maximize button, a minimize button, and a close button, etc. may be included in the control bar.
Of course, the mobile phone may also project more application tasks into the PC according to the above method, so that the PC displays the display interface corresponding to the application task in the form of a window, which is not limited in this embodiment of the present application.
In the embodiment of the present application, still taking the above-described screen projection scenario as an example, the PC as the target device may receive an operation input by the user on each of the projected windows. For example, a user may drag one window over another on a PC.
Illustratively, as shown in fig. 10 (a), the PC may display window 1, window 2, and window 3 in the above-described screen-projection scene. The window 1 includes a display interface 601 of a mobile phone desktop, the window 2 includes a display interface 701 of a video APP in a mobile phone, and the window 3 includes a display interface 801 of a chat APP in a mobile phone. The user may operate various windows on the PC using an input device such as a mouse, keyboard, etc.
For example, as shown in fig. 11, when the PC detects that the user drags the window 2 using the mouse, the PC may monitor the position of the dragged window 2 in the screen in real time. As shown in (b) of fig. 10, if it is detected that the dragged window 2 overlaps with other windows (e.g., window 3) in the PC, the PC may calculate the area of the overlapping region between window 2 and window 3 (i.e., overlap area 1). If it is detected that the overlap area 1 is greater than a preset area threshold (e.g., 80% of the area of window 3), this indicates that the user may intend to replace window 3 with the dragged window 2. At this point, if the user releases the mouse, the mouse may send a corresponding release event to the PC, indicating that the user has released the mouse button to stop dragging window 2. Further, as also shown in fig. 12, the PC may send a corresponding window covering event 1 to the mobile phone in response to the release event, the window covering event 1 indicating that the user performed an event of dragging window 2 to cover window 3. After the mobile phone receives the window covering event 1, the AMS and the WMS may be invoked to control the PC to no longer display window 3, but to keep the dragged window 2 and the display interface 701 in window 2, so that the PC displays the interface shown in (c) of fig. 10.
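On the PC side, the overlap test described above is simple rectangle geometry. The following sketch (plain Java using java.awt.Rectangle, with the 80% threshold mentioned above as an assumed example value) shows how the target device might decide, on mouse release, whether to report a window covering event to the source device.

import java.awt.Rectangle;

public final class WindowCoverDetector {
    private static final double COVER_RATIO = 0.80; // assumed threshold: 80% of the covered window

    // Returns true if the dragged window covers enough of the other window that,
    // on mouse release, a "window covering" event should be sent to the phone.
    public static boolean shouldReportCover(Rectangle dragged, Rectangle other) {
        Rectangle overlap = dragged.intersection(other);
        if (overlap.isEmpty()) {
            return false;
        }
        double overlapArea = (double) overlap.width * overlap.height;
        double otherArea = (double) other.width * other.height;
        return overlapArea > COVER_RATIO * otherArea;
    }

    public static void main(String[] args) {
        Rectangle window2 = new Rectangle(100, 100, 400, 600); // dragged window
        Rectangle window3 = new Rectangle(150, 120, 380, 580); // window being covered
        System.out.println(shouldReportCover(window2, window3)); // true in this layout
    }
}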
For example, as shown in fig. 12 (a), before the mobile phone receives the window coverage event 1, the Stack B1 of the video APP in the window 2 runs in the display 1 of the mobile phone, and the Stack B2 of the chat APP in the window 3 also runs in the display 1 of the mobile phone. Stack B1 and Stack B2 are both visible attributes, and Stack B2 is at the top of the Stack. Stack A1 of the desktop in the window 1 runs in display 0 of the mobile phone. After receiving the window covering event 1, the mobile phone can determine that the window 3 is covered by the user dragging the window 2 according to the window covering event 1. At this point, the handset may kill (kill) the chat APP in response to window covering event 1. For example, the WMS of the cell phone may close (also referred to as destroy) window 3 corresponding to chat APP. In addition, the AMS of the mobile phone may delete the Stack B2 corresponding to the chat APP in the display 1. After Stack B2 is deleted, WMS will not continue drawing display interface 801 generated by chat APP in display 1, nor will WMS send display interface 801 to PC.
Alternatively, as shown in fig. 11, after receiving the window covering event 1, the mobile phone may instruct the WMS to set window 3 to the invisible attribute, and may instruct the AMS to move the Stack B1 corresponding to the video APP to the top of the stack in display 1. At this time, as shown in fig. 12 (b), Stack B1 is located at the top of display 1, Stack B1 has the visible attribute, and Stack B2 has the invisible attribute. Stack A1 corresponding to the desktop is still at the top of display 0, and Stack A1 has the visible attribute. In this case, the chat APP is not killed, but is switched to run in the background of the mobile phone.
In this way, when the AMS of the mobile phone executes Activity in stack B1, the WMS may be called to draw the display interface 701 generated by the video APP in real time in display 1, and the display interface 701 is still associated with the window 2. Further, the WMS may transmit the display interface 701 generated by the video APP to the PC in a video stream. After receiving the display interface 701 generated by the video APP, the PC may continue to display the display interface 701 in real time in the window 2. Since the Stack attribute of Stack B2 is invisible and the window attribute of window 3 is also invisible, the WMS does not continue to draw the display interface 801 generated by the chat APP in display 1, and does not send the display interface 801 to the PC any more.
In this way, as shown in fig. 10 (c), after the user drags the window 2 projected from the mobile phone in the PC to cover the window 3, the window covering function of closing the window 3 while retaining the window 2 can be visually realized. Certainly, the WMS may also send the display interface 601 of the desktop drawn in the display 0 to the PC and the display screens of the mobile phone in real time, so that the display interface 601 continues to be synchronously displayed in the windows 1 of the mobile phone and the PC.
In other embodiments, the user may also drag a window to cover the window (e.g., window 1) in the PC that contains the main display interface of the mobile phone, that is, to cover the window in the PC that is displayed in synchronization with the display interface of the mobile phone. In this case, a corresponding window covering function may also be implemented on the PC.
Illustratively, as shown in fig. 13 (a), the PC is still used to display the window 1, the window 2, and the window 3 in the above-described screen projection scene. As shown in fig. 14, when the PC detects that the user drags the window 2 using the mouse, the PC can monitor the position of the window 2 dragged in the screen in real time. As shown in (b) of fig. 13, if it is detected that the dragged window 2 overlaps the window 1 including the main display interface of the mobile phone, the PC may calculate the overlap area 2 between the window 2 and the window 1 in real time. When the overlapping area 2 is larger than the preset area threshold, if the PC receives a release event sent by the mouse, the PC can respond to the release event and send a corresponding window covering event 2 to the mobile phone, and the window covering event 2 indicates that the user executes an event of dragging the window 2 to cover the window 1.
Since the covered window 1 displays the main display interface of the mobile phone, as shown in fig. 14, after the mobile phone receives the window covering event 2, unlike in fig. 11, the mobile phone may invoke the AMS and the WMS to control the PC to no longer display window 2, but to continue displaying the display interface 701 of window 2 in window 1, so that the PC displays the interface shown in (c) of fig. 13.
For example, as shown in (a) of fig. 15, before the mobile phone receives the window covering event 2, Stack A1 of the desktop in window 1 runs in display 0 of the mobile phone, and Stack A1 has the visible attribute. The Stack B1 of the video APP in window 2 runs in display 1 of the mobile phone, and the Stack B2 of the chat APP in window 3 also runs in display 1 of the mobile phone. Stack B1 and Stack B2 both have the visible attribute, and Stack B2 is at the top of the stack. After the mobile phone receives the window covering event 2, it can be determined, according to the window covering event 2, that the user has dragged window 2 to cover window 1, and that window 1 contains the main display interface projected by the mobile phone to the PC. At this point, as also shown in fig. 14, the mobile phone may send an instruction to close window 2 to the WMS in response to the window covering event 2. Also, the mobile phone may send an instruction to move Stack B1 from display 1 to display 0 to the AMS in response to the window covering event 2.
Accordingly, the WMS may destroy window 2 in response to the received instruction. The AMS may call a preset stack moving interface (e.g., moveStackToDisplay()) to move Stack B1 from display 1 to the top of the stack of display 0 in response to the received instruction. At this time, as shown in (b) of fig. 15, Stack B1 is located at the top of display 0, and Stack B1 has the visible attribute. Stack B2 corresponding to the chat APP is still located in display 1, and Stack B2 still has the visible attribute. Stack A1 (i.e., the stack of the desktop) originally located at the top of display 0 may be pushed down into the stack (at this time, Stack A1 has the invisible attribute), or Stack A1 may be deleted, which is not limited in this embodiment of the present application.
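The stack moving interface named here is not part of the public SDK, so the following is only a schematic restatement of the source-device handling in code. The controller class, the stub interfaces, and the method signature (stack id plus target display id) are hypothetical names introduced for illustration, not documented APIs.

// Schematic only: all names below are hypothetical, not real framework APIs.
final class SourceDeviceController {
    private final ActivityManagerServiceStub ams; // assumed handle to the AMS
    private final WindowManagerServiceStub wms;   // assumed handle to the WMS

    SourceDeviceController(ActivityManagerServiceStub ams, WindowManagerServiceStub wms) {
        this.ams = ams;
        this.wms = wms;
    }

    // Window covering event 2: window 2 was dragged over window 1 (the main display interface).
    void handleWindowCoverOnMainWindow(int stackB1Id) {
        wms.destroyWindow("window2");                           // stop projecting the dragged window
        ams.moveStackToDisplay(stackB1Id, /* displayId= */ 0);  // bring stack B1 to display 0
        // From now on the video APP is drawn in display 0, so its interface is sent both
        // to the phone screen and to the PC, where it appears in window 1.
    }
}

interface ActivityManagerServiceStub { void moveStackToDisplay(int stackId, int displayId); }
interface WindowManagerServiceStub { void destroyWindow(String windowName); }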
Thus, when the AMS of the mobile phone executes the Activity in stack B1, the WMS may be invoked to draw the display interface 701 generated by the video APP in real time in display 0, where the display interface 701 is associated with window 1 that displays the main display interface of the mobile phone. Since stack B1 has been moved into display 0 at this time, as shown in fig. 16, on the one hand, the WMS may send the display interface 701 generated by the video APP to the display screen of the mobile phone for display. On the other hand, the WMS may transmit the display interface 701 generated by the video APP to the PC in a video stream. As shown in (c) of fig. 13, after the PC receives the display interface 701 generated by the video APP, the display interface 701 may be displayed in real time in window 1. At this time, the display interface on the mobile phone and the display interface in window 1 of the PC are synchronized, that is, the display interface in window 1 is now the main display interface of the mobile phone during screen projection. Of course, the WMS may also send the display interface 801 of the chat APP drawn in display 1 to the PC, and the PC continues to display the display interface 801 in real time in window 3.
That is to say, in a multi-window screen-casting scene, a user may drag one window to cover another window in a PC (i.e., a target device) to trigger the PC and a mobile phone (i.e., a source device) to interact to implement a window covering function, that is, close the covered window while keeping the dragged window, so that the user can manage multiple windows in the target device conveniently.
Generally, when the AMS of the mobile phone performs a stack moving operation on a stack, the WMS refreshes the stacks in the corresponding display module and re-reads the configuration information of the stack at the top. The configuration information of a stack may record parameters such as the resolution and aspect ratio used when displaying the corresponding display interface. If the configuration information of the stack at the top has changed after the refresh, the WMS may relaunch (re-execute) the stack at the top.
In this embodiment of the application, in the above-mentioned scene where the user drags the window 2 to cover the window 1, when the AMS of the mobile phone moves the Stack B1 from the display 1 to the Stack top of the display 0, the AMS may set the configuration information of the Stack B1 to the same configuration information as the Stack A1 originally located at the Stack top of the display 0. Then, after the WMS refreshes display 0, it may be read that the configuration information of Stack at the top of display 0 has not changed, and the WMS does not resume executing Stack B1, but continues to execute Activity in Stack B1 before the Stack moving operation. Therefore, after the video APP in the window 2 is switched to the window 1, the display interface 701 of the video APP can be seamlessly switched to the window 1, seamless connection of window content is achieved, and use experience of a user is improved.
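A corresponding sketch of this configuration-preserving step is below. The MovableStack interface and its methods are hypothetical names introduced only to restate the idea: stack B1 inherits a copy of the configuration of the stack it replaces at the top of display 0, so the later refresh sees no change and does not relaunch the Activity.

import android.content.res.Configuration;

// Hypothetical sketch: MovableStack and its methods are illustrative names only.
final class ConfigPreservingMove {
    // Give stack B1 a copy of the configuration of the stack it replaces at the top
    // of display 0, so the WMS refresh sees no configuration change and does not
    // relaunch the Activity already running in stack B1.
    static void moveStackKeepingConfig(MovableStack stackB1, MovableStack oldTop, int displayId) {
        stackB1.setConfiguration(new Configuration(oldTop.getConfiguration()));
        stackB1.moveToDisplay(displayId);
    }
}

interface MovableStack {
    Configuration getConfiguration();
    void setConfiguration(Configuration config);
    void moveToDisplay(int displayId);
}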
In other embodiments, in a multi-window screen-projection scenario, a user may also drag one window in a PC (i.e., a target device) to cover another window, and trigger the PC to interact with a mobile phone (i.e., a source device) to implement a window floating function, so that the dragged window is floating and displayed on the covered window.
Illustratively, as shown in fig. 17 (a), window 1, window 2, and window 3 are displayed in the above-described screen projection scene by the PC. As shown in fig. 18, when the PC detects that the user drags the window 2 using the mouse, the PC can monitor the position of the window 2 dragged in the screen in real time. As shown in (b) of fig. 17, if it is detected that the window 2 dragged overlaps the window 3, the PC may calculate the overlap area 1 between the window 2 and the window 3 in real time. When the overlapping area 1 is larger than the preset area threshold, the PC starts timing. When the duration that the overlap area 1 is greater than the area threshold is greater than a preset time threshold (for example, 2 s), it indicates that the user may have an operation intention to display the window 2 on the window 3 in a floating manner. At this time, as also shown in fig. 18, if the PC receives a release event sent by the mouse, the PC may send a corresponding window hover event 1 to the mobile phone in response to the release event, where the window hover event 1 indicates that the user performed an event of dragging the window 2 to hover over the window 3. When the mobile phone receives the window floating event 1, the AMS and the WMS may be invoked to control the PC to display a window 2 on a window 3 in the form of a floating window, so that the PC displays an interface as shown in fig. 17 (c).
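The difference from the covering case is the extra dwell-time condition. A minimal PC-side sketch of that timing check is shown below (plain Java; the 80% area threshold and 2-second dwell time are the example values given above).

public final class HoverIntentDetector {
    private static final double AREA_RATIO = 0.80;   // example area threshold
    private static final long DWELL_MILLIS = 2_000;  // example time threshold (2 s)

    private long overlapStartMillis = -1;            // -1 means "not currently overlapping enough"

    // Called repeatedly while the window is being dragged.
    public void onDragUpdate(double overlapArea, double coveredWindowArea, long nowMillis) {
        if (overlapArea > AREA_RATIO * coveredWindowArea) {
            if (overlapStartMillis < 0) {
                overlapStartMillis = nowMillis;      // start timing the dwell
            }
        } else {
            overlapStartMillis = -1;                 // overlap broken, reset the timer
        }
    }

    // Called when the mouse button is released: true means "report a window floating event".
    public boolean isFloatIntent(long nowMillis) {
        return overlapStartMillis >= 0 && (nowMillis - overlapStartMillis) > DWELL_MILLIS;
    }
}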
For example, as shown in (a) in fig. 19, before the cell phone receives the window hover event 1, stack B1 of the video APP in window 2 runs in display 1 of the cell phone, and Stack B2 of the chat APP in window 3 also runs in display 1 of the cell phone. Stack B1 and Stack B2 are both visible attributes, and Stack B2 is at the top of the Stack. Stack A1 of the desktop in the window 1 runs in display 0 of the mobile phone, and the Stack A1 is a visible attribute. After receiving the window suspension event 1, the mobile phone can determine that the user drags the window 2 to suspend on the window 3 according to the window suspension event 1. At this point, as also shown in fig. 18, the handset may send an instruction to set window 2 to the floating window to the WMS in response to window floating event 1. And the mobile phone can respond to the window floating event 1 and send an instruction of moving the Stack B1 to the top of the Stack in display 1 to the AMS.
Accordingly, the WMS may modify the window attribute of window 2 to the attribute of a floating window in response to the received instruction; at this time, window 2 with the modified window attribute (i.e., the floating window) is still associated with the display interface 701 generated by the video APP. The AMS may move Stack B1 to the top of the stack in display 1 in response to the received instruction. At this time, as shown in fig. 19 (b), Stack B1 is located at the top of the stack in display 1, and Stack B2 and Stack B1 still have the visible attribute. Stack A1 corresponding to the desktop is still at the top of display 0.
In this way, when the AMS of the mobile phone executes the Activity in Stack B1 and Stack B2, the WMS may be called to draw the display interface 701 generated by the video APP and the display interface 801 generated by the chat APP in display 1, and the display interface 701 may be drawn on the upper layer of the display interface 801 in the form of a floating window. Further, the WMS may transmit a video stream including the display interface 701 and the display interface 801 to the PC, and the PC may display the display interface including the window 2 (i.e., the floating window) in the window 3. At this time, as shown in (c) of fig. 17, the window 2 continues to be displayed on the window 3 in the form of a floating window.
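The floating-window attribute itself is handled inside the WMS, but at the application level a comparable effect can be illustrated with the public WindowManager API, as sketched below. This is only an analogy to the attribute change described above, not the internal mechanism: it assumes the overlay (SYSTEM_ALERT_WINDOW) permission has been granted, and the window size and position are example values.

import android.content.Context;
import android.graphics.PixelFormat;
import android.view.Gravity;
import android.view.View;
import android.view.WindowManager;

public final class FloatingWindowSketch {
    // Shows contentView as a small floating window layered above other app windows.
    public static void showAsFloatingWindow(Context context, View contentView) {
        WindowManager wm = (WindowManager) context.getSystemService(Context.WINDOW_SERVICE);
        WindowManager.LayoutParams lp = new WindowManager.LayoutParams(
                600, 400,                                            // assumed size in pixels
                WindowManager.LayoutParams.TYPE_APPLICATION_OVERLAY, // floats above app windows
                WindowManager.LayoutParams.FLAG_NOT_TOUCH_MODAL,
                PixelFormat.TRANSLUCENT);
        lp.gravity = Gravity.TOP | Gravity.START;
        lp.x = 100;
        lp.y = 100;
        wm.addView(contentView, lp);
    }
}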
The above embodiment is described by taking the user dragging window 2 in the PC to float on window 3 as an example. In other embodiments, the user may also drag a window in the PC to float on the window (for example, window 1) of the PC that contains the main display interface of the mobile phone. In this case, the PC may also implement a corresponding window floating function.
Illustratively, as shown in fig. 20 (a), window 1, window 2, and window 3 are displayed in the above-described screen projection scene by the PC. As shown in fig. 21, when the PC detects that the user drags the window 2 using the mouse, the PC can monitor the position of the window 2 dragged in the screen in real time. As shown in (b) of fig. 20, if it is detected that the dragged window 2 overlaps the window 1 including the main display interface of the mobile phone, the PC may calculate the overlap area 2 between the window 2 and the window 1 in real time. When the overlapping area 2 is larger than the preset area threshold, the PC starts timing. Still as shown in fig. 22, when the duration that the overlapping area 2 is greater than the area threshold is greater than a preset time threshold (e.g., 2 s), if the PC receives a release event sent by the mouse, the PC may send a corresponding window hovering event 2 to the mobile phone in response to the release event, where the window hovering event 2 indicates that the user performed an event of dragging the window 2 to hover over the window 1. When the mobile phone receives the window hovering event 2, the AMS and WMS may be invoked to control the PC to display the window 2 on the window 1 in the form of a hovering window, so that the PC displays an interface as shown in fig. 20 (c). The difference is that, at this time, the window 2 is also displayed in the form of a floating window in the mobile phone screen.
For example, as shown in (a) in fig. 22, before the cell phone receives the window hover event 2, stack B1 of the video APP in the window 2 runs in display 1 of the cell phone, and Stack B2 of the chat APP in the window 3 also runs in display 1 of the cell phone. Stack B1 and Stack B2 are both visible attributes, and Stack B2 is at the top of the Stack. Stack A1 of the desktop in the window 1 runs in display 0 of the mobile phone, and the Stack A1 is a visible attribute. After the mobile phone receives the window suspension event 2, it can be determined that the user drags the window 2 to suspend on the window 1 according to the window suspension event 2, and the display interface of the window 1 is synchronous with that of the mobile phone. At this time, as also shown in fig. 21, the handset may send an instruction to set window 2 as the floating window to the WMS in response to window floating event 2. And, the handset may send an instruction to move Stack B1 from display 1 to display 0 to the AMS in response to the window hover event 2.
Accordingly, the WMS may modify the window attribute of window 2 to the attribute of a floating window in response to the received instruction. At this time, window 2 with the modified window attribute (i.e., the floating window) is still associated with the display interface 701 generated by the video APP. The AMS may call a preset stack moving interface (e.g., moveStackToDisplay()) to move Stack B1 from display 1 to the top of the stack of display 0 in response to the received instruction. At this time, as shown in fig. 22 (b), Stack B1 is located at the top of display 0, and Stack A1 originally located at the top of display 0 is pushed down into the stack. Stack B1 at the top has the visible attribute. Since window 2 associated with Stack B1 is a floating window, Stack A1 below Stack B1 also has the visible attribute. After Stack B1 moves out of display 1, Stack B2 is at the top of display 1, and Stack B2 still has the visible attribute.
In this way, when the AMS of the mobile phone executes Activity in stack B1 and stack A1, WMS may be invoked to draw a display interface 701 generated by the video APP and a display interface 601 of the desktop in display 0, and the display interface 701 is drawn on an upper layer of the display interface 601 in the form of a floating window. At this time, the WMS may send the display interface 701 and the display interface 601 to the cell phone. As shown in fig. 23, the mobile phone may display a display interface 701 on a display interface 601 of the desktop in the form of a floating window. Meanwhile, the WMS may transmit a video stream including the display interface 701 and the display interface 601 to the PC. As shown in (c) in fig. 20, the PC may continue to display the display interface 601 in the window 1, and display the display interface 701 in the form of a floating window on the window 1. Of course, the WMS may also send the display interface 801 of the chat APP drawn in display 1 to the PC, and the PC continues to display the display interface 801 in real time in window 3.
That is to say, in a multi-window screen projection scenario, a user may drag one window in the PC (i.e., the target device) to float on another window, triggering the PC to interact with the mobile phone (i.e., the source device) to implement a window floating function, that is, the dragged window is displayed floating over the covered window.
In other embodiments, in a multi-window screen-projection scenario, a user may also drag one window to another window in a PC (i.e., a target device) to trigger the PC and a mobile phone (i.e., a source device) to interact to implement a window merging function, so that the dragged window and the covered window are merged into one window.
For example, the projected window in the PC (i.e., the target device) may be preset to contain a boundary hotspot. For example, as shown in fig. 24, a window 2401 includes a predetermined boundary hotspot 2402 therein. The boundary hot zone 2402 is disposed near the edge of the window 2401. For example, the area covered by 100 pixels extending inward from each boundary in the window 2401 may be set as the boundary hotspot 2402 of the window 2401.
In a multi-window screen-casting scene, when a user drags one window on a PC to cover another window, the PC can judge whether the user triggers the window merging function or not by detecting the contact ratio of a boundary hot area between the two windows.
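The boundary hot zone and the coincidence calculation can likewise be expressed with plain rectangle geometry. In the sketch below (Java), the 100-pixel inset is the example value given above, and the coincidence-degree formula (overlap of the two hot zones relative to the area of the dragged window's hot zone) is an assumption about one reasonable way to compute it.

import java.awt.Rectangle;

public final class BoundaryHotZone {
    private static final int INSET_PX = 100; // example: hot zone extends 100 px inward from each edge

    // Hot zone along the left edge of a window (other edges are built the same way).
    public static Rectangle leftHotZone(Rectangle window) {
        return new Rectangle(window.x, window.y, INSET_PX, window.height);
    }

    public static Rectangle rightHotZone(Rectangle window) {
        return new Rectangle(window.x + window.width - INSET_PX, window.y, INSET_PX, window.height);
    }

    // Assumed definition of "coincidence degree": overlap area of the two hot zones
    // divided by the area of the dragged window's hot zone.
    public static double coincidenceDegree(Rectangle draggedHotZone, Rectangle targetHotZone) {
        Rectangle overlap = draggedHotZone.intersection(targetHotZone);
        if (overlap.isEmpty()) {
            return 0.0;
        }
        double overlapArea = (double) overlap.width * overlap.height;
        double zoneArea = (double) draggedHotZone.width * draggedHotZone.height;
        return overlapArea / zoneArea;
    }

    public static void main(String[] args) {
        Rectangle window2 = new Rectangle(500, 100, 400, 600); // dragged window
        Rectangle window3 = new Rectangle(150, 100, 400, 600); // window to merge with
        double degree = coincidenceDegree(leftHotZone(window2), rightHotZone(window3));
        System.out.println("coincidence degree = " + degree); // 0.5 in this layout
    }
}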
Illustratively, as shown in fig. 25 (a), window 1, window 2, and window 3 are displayed in the above-described screen projection scene by the PC. As shown in fig. 26, when the PC detects that the user drags the window 2 using the mouse, the PC can monitor the position of the window 2 dragged in the screen in real time. As shown in (b) of fig. 25, if it is detected that the boundary hotspot of the window 2 being dragged coincides with the boundary hotspot of the window 3, the PC may calculate the coincidence of the boundary hotspots between the window 2 and the window 3 in real time. For example, when the degree of coincidence 1 between the boundary hot areas on the left side of the window 2 and the boundary hot areas on the right side of the window 3 is greater than the preset coincidence degree threshold value, it indicates that the user may have an operation intention to merge the window 2 and the window 3. At this time, as also shown in fig. 26, if the PC receives a release event sent by the mouse, the PC may send a corresponding window merge event 1 to the handset in response to the release event, where the window merge event 1 indicates that the user performed an event of merging the window 2 with the window 3. When the mobile phone receives the window merging event 1, the AMS and the WMS may be invoked to control the PC to merge the window 2 and the window 3 into a new window (e.g., window 4), as shown in fig. 25 (c), so that the PC displays the display interfaces originally located in the window 2 and the window 3 in the merged window 4 in a split-screen manner.
For example, after receiving the window merge event 1, the mobile phone may send a screen split command for the window 2 and the window 3 to the WMS. Further, the WMS may set the window 2 and the window 3 to the split-screen attribute in response to a split-screen instruction of the window 2 and the window 3. For example, window 2 may be set as a right split screen window and window 3 may be set as a left split screen window.
Thus, when the AMS of the mobile phone executes the stack B1 corresponding to the video APP, the WMS may be called to draw the display interface 701 of the video APP in the display 1, and at this time, the display interface 701 is associated with the right split-screen window. Similarly, when the AMS of the mobile phone executes the stack B2 corresponding to the chat APP, the WMS may be called to draw the display interface 801 of the chat APP in the display 1, and at this time, the display interface 801 is associated with the left split screen window. Furthermore, the WMS may send the display interface 701 and the display interface 801 to the PC as a display interface in one window (for example, window 4), and the PC displays the display interface 701 of the video APP and the display interface 801 of the chat APP in the window 4 in a split-screen manner, thereby implementing a screen merging function.
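On the target device side, displaying the merged window 4 then amounts to splitting its client area into two panes. A trivial geometric sketch is given below (Java); the left/right assignment matches the example above, where window 3's content goes to the left half and window 2's content to the right half.

import java.awt.Rectangle;

public final class MergedWindowLayout {
    // Splits the merged window's client area into a left pane (former window 3 content,
    // i.e. the chat APP) and a right pane (former window 2 content, i.e. the video APP).
    public static Rectangle[] splitLeftRight(Rectangle mergedClientArea) {
        int halfWidth = mergedClientArea.width / 2;
        Rectangle left = new Rectangle(mergedClientArea.x, mergedClientArea.y,
                halfWidth, mergedClientArea.height);
        Rectangle right = new Rectangle(mergedClientArea.x + halfWidth, mergedClientArea.y,
                mergedClientArea.width - halfWidth, mergedClientArea.height);
        return new Rectangle[] { left, right };
    }

    public static void main(String[] args) {
        Rectangle window4 = new Rectangle(0, 0, 1200, 700);
        Rectangle[] panes = splitLeftRight(window4);
        System.out.println("chat pane:  " + panes[0]);  // left half for display interface 801
        System.out.println("video pane: " + panes[1]);  // right half for display interface 701
    }
}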
In other embodiments, the user may also drag window 2 to merge with window 1 of the PC that contains the main display interface of the handset.
Illustratively, as shown in fig. 27 (a), window 1, window 2, and window 3 are displayed in the above-described screen projection scene by the PC. As shown in fig. 28, if the boundary hotspot of the dragged window 2 is detected to coincide with the boundary hotspot of the window 1, the PC can calculate the degree of coincidence of the boundary hotspots between the window 2 and the window 1 in real time. For example, as shown in (b) of fig. 27, when the degree of overlap 2 between the boundary hotspot on the upper side of the window 2 and the boundary hotspot on the lower side of the window 1 is greater than the preset degree of overlap threshold, it indicates that the user may have an operation intention to merge the window 2 and the window 1. At this time, if the PC receives a release event sent by the mouse, as shown in fig. 28, the PC may send a corresponding window merge event 2 to the mobile phone in response to the release event, where the window merge event 2 indicates that the user performed an event of merging the window 2 with the window 1. When the mobile phone receives the window merging event 2, the AMS and the WMS may be invoked to control the PC to merge the window 2 and the window 1 into one window (e.g., window 5), as shown in fig. 27 (c), so that the PC displays the display interfaces in the original window 2 and the original window 1 in the merged window 5 in a split-screen manner.
For example, as shown in fig. 28, after receiving the window merge event 2, the mobile phone may send a window 2 and window 1 split command to the WMS. Moreover, since the video APP in the window 2 and the desktop in the window 1 respectively correspond to different display modules in the mobile phone, after receiving the window merge event 2, the mobile phone may further send an instruction to move the Stack B1 from the display 1 to the display 0 to the AMS.
Correspondingly, after receiving the split-screen instruction of the window 2 and the window 1, the WMS may set the window 2 and the window 1 as the split-screen attribute. For example, window 1 may be set as an upper split screen window and window 2 may be set as a lower split screen window. And, the AMS may call a preset Stack moving interface to move the Stack B1 from display 1 to the top of the Stack of display 0 in response to the received instruction. At this time, similar to (B) in fig. 22, stack B1 is located at the top of the Stack of display 0, and Stack A1 (i.e., the Stack of the desktop) is located within the Stack of display 0. Since window 2 and window 1 are split attributes, the AMS may set both the corresponding Stack B1 and Stack A1 as visible attributes.
Thus, when the AMS of the mobile phone executes the stack B1 corresponding to the video APP, the WMS can be called to draw the display interface 701 of the video APP in the display 0, and at the moment, the display interface 701 is associated with the lower split screen window. Similarly, when the AMS of the mobile phone executes stack A1 corresponding to the desktop, the WMS may be invoked to draw a display interface 601 of the desktop in display 0, and at this time, the display interface 601 is associated with the upper split screen window. Furthermore, the WMS may transmit the display interface 701 and the display interface 601 to the PC as a display interface in one window (e.g., window 5) in the form of a video stream, and the PC displays the display interface 701 of the video APP and the display interface 601 of the desktop in the form of split screens in the window 5.
In addition, since both Stack B1 and Stack A1 are located in display 0, the WMS may also send the display interface 701 and the display interface 601 to the display screen of the mobile phone. At this time, as shown in fig. 29, the mobile phone may display the display interface 701 of the video APP and the display interface 601 of the desktop in a split-screen manner, so that the display interface on the mobile phone is synchronized with the display interface in window 5 on the PC.
It should be noted that, when the PC or the mobile phone displays content in a merged window (for example, window 4 or window 5), elements of the display interface 801, the display interface 701, or the display interface 601 in that window may be deleted or rearranged, which is not limited in this embodiment of the present application.
The foregoing embodiments describe how, in a multi-window screen projection scenario, the window covering function, the window floating function, and the window merging function are each implemented by dragging a window on the PC (i.e., the target device). It can be understood that the user may also use these functions in combination, which makes it convenient to manage multiple windows in a screen projection scenario.
For example, as shown in (a) of fig. 30, when the PC detects that the user drags window 2 such that the degree of overlap between the boundary hot zones of window 2 and window 3 is greater than the overlap threshold, and the PC then receives a release event sent by the mouse, the PC may interact with the mobile phone according to the above method to implement the window merging function. At this time, as shown in (b) of fig. 30, the PC may merge window 2 and window 3 into window 4, and display the display interface 701 of the video APP in the original window 2 and the display interface 801 of the chat APP in the original window 3 in window 4 in a split-screen manner.
Subsequently, as shown in fig. 30 (c), when the PC detects that the user drags window 4 so that it overlaps window 1 and the overlapping area between window 4 and window 1 is greater than the area threshold, and the PC then receives a release event sent by the mouse, the PC may interact with the mobile phone according to the method described above to implement the window covering function. At this time, as shown in (d) of fig. 30, the PC no longer displays the covered window 1, but instead displays, in window 4, the display interface 701 of the video APP and the display interface 801 of the chat APP in synchronization with the mobile phone (the display interfaces of the mobile phone are not shown in fig. 30).
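The chained operations of fig. 30 suggest that the PC routes one and the same drag-and-release gesture to different window events depending on what overlaps. A minimal fragment of such a dispatcher is sketched below; the helper predicates hotZoneOverlap and bodyOverlapArea, the thresholds, and the event types follow the earlier sketches and are assumptions rather than the embodiment's actual implementation.

```java
// Assumed event types and helpers from the earlier sketches.
WindowEvent classifyRelease(ProjectedWindow dragged, ProjectedWindow target) {
    if (hotZoneOverlap(dragged, target) > HOT_ZONE_OVERLAP_THRESHOLD) {
        // Boundary hot zones overlap enough: merge the dragged window into the target (fig. 30 (a)-(b)).
        return new WindowMergeEvent(dragged.id(), target.id());
    }
    if (bodyOverlapArea(dragged, target) > AREA_THRESHOLD) {
        // Window bodies overlap enough: cover the target window (fig. 30 (c)-(d)).
        // A window floating branch could be added here in the same way; it is omitted from this sketch.
        return new WindowCoverEvent(dragged.id(), target.id());
    }
    return null;   // plain reposition handled locally by the PC; no event is sent
}
```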
Subsequently, if the mobile phone establishes a wireless communication connection with another electronic device with a screen projection function (e.g., a tablet computer), the mobile phone may further project window 4, which is being displayed in a split-screen manner, into a window of the tablet computer for display according to the method described above. In this way, the mobile phone, as the source device, can seamlessly switch the displayed interface to a target device, so that the user can continue to use the related functions provided by the source device on the target device.
It should be noted that the scenes shown in (a)-(d) of fig. 30 are scenes in which the user first merges two windows and then overlaps the merged window with another window. It can be understood that the user may also combine the above-described window covering function, window floating function, and window merging function in other ways. For example, the user may use the window merging function to merge window 1 and window 2 first, and then merge the merged window with window 3. For another example, the user may use the window floating function to float window 1 on window 2, and then merge the window containing the floating window with window 3.
In some scenarios, as shown in (a) of fig. 31, the mobile phone and the PC may make window 2 float over window 3 according to the above-described window floating function. Further, as shown in fig. 31 (b), if the user drags window 3 onto window 1 (i.e., the main display interface of the mobile phone), window 2, which floats over window 3, is dragged onto window 1 along with it. At this time, if the PC receives a release event sent by the mouse, the PC may instruct the mobile phone to close whichever of window 2 and window 3 does not contain the focus application, and to display the remaining window as a floating window in window 1 according to the above method. For example, if the video APP in window 2 is the current focus application and the chat APP in window 3 is not, the mobile phone may control the PC to delete window 3 together with its display interface and to display window 2 in window 1 as a floating window, as shown in (c) of fig. 31. Of course, a person skilled in the art may set, according to actual experience or an actual application scenario, a specific implementation scheme for combining the window covering function, the window floating function, and the window merging function, which is not limited in this embodiment of the present application.
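The focus-application rule of fig. 31 (c) can be sketched as the fragment below; FocusTracker, ProjectionController, and the other helper names are assumed for illustration and are not classes named in this embodiment.

```java
// Of the windows dropped onto the main-display window, only the one whose application
// currently has focus is kept as a floating window; the others are closed.
void onStackedWindowsDropped(List<ProjectedWindow> droppedWindows, ProjectedWindow mainWindow,
                             FocusTracker focusTracker, ProjectionController controller) {
    for (ProjectedWindow window : droppedWindows) {
        if (focusTracker.isFocusedApplication(window.packageName())) {
            // e.g. the video APP in window 2 keeps running as a floating window in window 1
            controller.showAsFloatingWindow(window, mainWindow);
        } else {
            // e.g. the chat APP in window 3 is closed together with its display interface
            controller.closeWindow(window);
        }
    }
}
```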
It can be seen that, in a multi-window screen projection scenario, a user may operate multiple windows on a target device such as a PC and thereby trigger the source device to establish associations between the operated windows on the target device, so that functions such as window covering, window floating, and window merging are implemented on the target device. The user can thus manage multiple projected windows through a single operation, making the management of the windows projected onto the target device more efficient and improving the user experience.
In addition, in the foregoing embodiments, a mobile phone is taken as an example of the source device in the screen projection scenario, and a PC is taken as an example of the target device. It can be understood that the source device in the embodiments of the present application may also be another electronic device with a screen projection function, such as a tablet computer, and the target device may also be another electronic device with a display function, such as a television or a tablet computer; this is not limited in the embodiments of the present application.
It should be noted that, in the foregoing embodiments, the screen projection display method is implemented among functional modules described by taking the Android system as an example. It can be understood that corresponding functional modules may also be provided in other operating systems (for example, the Hongmeng (HarmonyOS) system) to implement the method. As long as the functions implemented by the respective devices and functional modules are similar to those in the embodiments of the present application, they fall within the scope of the claims of the present application and their equivalents.
As shown in fig. 32, an embodiment of the present application discloses an electronic device, which may be the above-mentioned source device (e.g., a mobile phone). The electronic device may specifically include: a touch screen 3201, where the touch screen 3201 includes a touch sensor 3206 and a display screen 3207; one or more processors 3202; a memory 3203; a communication module 3208; one or more application programs (not shown); and one or more computer programs 3204. The above components may be connected by one or more communication buses 3205. The one or more computer programs 3204 are stored in the memory 3203 and configured to be executed by the one or more processors 3202, and the one or more computer programs 3204 include instructions that may be used to perform the relevant steps performed by the source device in the foregoing embodiments.
As shown in fig. 33, an embodiment of the present application discloses an electronic device, which may be the above-mentioned target device (e.g., a PC). The electronic device may specifically include: a display screen 3301; one or more processors 3302; a memory 3303; a communication module 3306; one or more application programs (not shown); and one or more computer programs 3304. The above components may be connected via one or more communication buses 3305. Of course, the electronic device may also be equipped with an input device such as a touch screen, a mouse, or a keyboard. The one or more computer programs 3304 are stored in the memory 3303 and configured to be executed by the one or more processors 3302, and the one or more computer programs 3304 include instructions that may be used to perform the relevant steps performed by the target device in the foregoing embodiments.
Through the description of the above embodiments, it is clear to those skilled in the art that the foregoing division of the functional modules is merely used as an example for convenience and simplicity of description. In practical applications, the above functions may be allocated to different functional modules as required; that is, the internal structure of the device may be divided into different functional modules to complete all or part of the functions described above.
In the several embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the above-described device embodiments are merely illustrative, and for example, the division of the modules or units is only one type of logical functional division, and other divisions may be realized in practice, for example, multiple units or components may be combined or integrated into another device, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may be one physical unit or multiple physical units, that is, may be located in one place, or may be distributed in multiple different places. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a readable storage medium. Based on such understanding, the technical solutions of the embodiments of the present application, or the portions thereof that substantially contribute to the prior art, or all or part of the technical solutions, may be embodied in the form of a software product. The software product is stored in a storage medium and includes several instructions for enabling a device (which may be a single-chip microcomputer, a chip, or the like) or a processor to execute all or part of the steps of the methods described in the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.
The above description is only an embodiment of the present application, but the scope of the present application is not limited thereto, and any changes or substitutions within the technical scope of the present disclosure should be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (29)

1. A screen projection display method is characterized by comprising the following steps:
a target device displays a first window and a second window, wherein the first window comprises an interface of a first application task projected by a source device, and the second window comprises an interface of a second application task projected by the source device;
the target device detects that a user inputs a window covering operation, wherein the window covering operation is used for covering the second window with the first window;
in response to the window covering operation, the target device sends a window covering event to the source device;
in response to the window covering event, the source device acquires screen projection data of the first window covering the second window;
the source device sends the screen projection data to the target device;
the target device displays a first interface according to the screen projection data, wherein the first interface comprises an interface of the first application task and does not comprise an interface of the second application task.
2. The method of claim 1, wherein neither the first window nor the second window includes a main display interface of the source device;
wherein the interface of the first application task in the first interface is located in the first window.
3. The method of claim 2, wherein the source device comprises a first display module for providing screen projection data to the target device when projecting a screen;
before the source device receives the window coverage event, the first application task and the second application task are both run on the first display module, and the second application task is run on the stack top of the first display module;
after the source device receives the window coverage event, the method further includes:
the source device moves the first application task to the stack top of the first display module to run;
and the source device deletes the second application task from the first display module, or sets the second application task to be an invisible attribute in the first display module.
4. The method of claim 1, wherein the display content in the second window is synchronized with a main display interface of the source device;
wherein the interface of the first application task in the first interface is located in the second window.
5. The method of claim 4, wherein the source device comprises a first display module and a second display module, the first display module being configured to provide screen projection data to the target device when projecting a screen; the second display module is used for providing display data to the source device and providing screen projection data to the target device during screen projection;
before the source device receives the window covering event, the first application task runs on the first display module, and the second application task runs on the second display module;
after the source device receives the window coverage event, the method further includes:
and the source device moves the first application task from the first display module to the stack top of the second display module to run.
6. The method according to any one of claims 1-5, wherein the window covering operation is: a release operation input by the user when, after the user has dragged the first window, the overlapping area between the first window and the second window is greater than an area threshold.
7. A screen projection display method is characterized by comprising the following steps:
the source device projects an interface of a first application task into a first window of a target device for display, and projects an interface of a second application task into a second window of the target device for display;
the source device receives a window covering event sent by the target device, wherein the window covering event is used for indicating that a user has input a window covering operation for covering the second window with the first window;
in response to the window covering event, the source device acquires screen projection data of the first window covering the second window;
and the source device sends the screen projection data to the target device.
8. The method of claim 7, wherein the source device comprises a first display module for providing screen projection data to the target device when projecting a screen;
before the source device receives the window covering event, the first application task and the second application task both run on the first display module, and the second application task runs on the stack top of the first display module;
after the source device receives the window coverage event, the method further includes:
the source end device moves the first application task to the stack top of the first display module to run;
the source device deletes the second application task from the first display module, or sets the second application task as an invisible attribute in the first display module.
9. The method of claim 7, wherein the source device comprises a first display module and a second display module, the first display module being configured to provide screen projection data to the target device when projecting a screen; the second display module is used for providing display data to the source device and providing screen projection data to the target device during screen projection;
before the source device receives the window covering event, the first application task runs on the first display module, and the second application task runs on the second display module;
after the source device receives the window coverage event, the method further includes:
and the source end device moves the first application task from the first display module to the stack top of the second display module to run.
10. A screen projection display method is characterized by comprising the following steps:
a target device displays a first window and a second window, wherein the first window comprises an interface of a first application task projected by a source device, and the second window comprises an interface of a second application task projected by the source device;
the target device detects that a user inputs a window covering operation, wherein the window covering operation is used for covering the second window with the first window;
in response to the window covering operation, the target device sends a window covering event to the source device;
the target device receives screen projection data sent by the source device in response to the window covering event;
the target device displays a first interface according to the screen projection data, wherein the first interface comprises an interface of the first application task and does not comprise an interface of the second application task.
11. The method of claim 10, wherein:
when neither the first window nor the second window includes the main display interface of the source device, the interface of the first application task in the first interface is located in the first window;
and when the display content in the second window is synchronous with the main display interface of the source device, the interface of the first application task in the first interface is located in the second window.
12. The method according to claim 10 or 11, wherein the window covering operation is: a release operation input by the user when, after the user has dragged the first window, the overlapping area between the first window and the second window is greater than an area threshold.
13. A source device, comprising:
a display screen;
one or more processors;
a memory;
a communication module;
wherein one or more computer programs are stored in the memory, the one or more computer programs comprising instructions which, when executed by the source device, cause the source device to carry out the screen projection display method of any of claims 7-9.
14. A target device, comprising:
a display screen;
one or more processors;
a memory;
a communication module;
wherein the memory has stored therein one or more computer programs, the one or more computer programs comprising instructions, which when executed by the target device, cause the target device to perform the screen projection display method of any of claims 10-12.
15. A computer-readable storage medium having instructions stored therein, which when run on a source device, cause the source device to perform the screen projection display method of any one of claims 7-9; or, when run on a target device, cause the target device to perform the screen projection display method of any of claims 10-12.
16. A screen projection display method is characterized by comprising the following steps:
a target device displays a first window and a second window, wherein the first window comprises an interface of a first application task projected by a source device, and the second window comprises an interface of a second application task projected by the source device;
the target device detects that a user inputs a window merging operation, wherein the window merging operation is used for merging the first window and the second window;
in response to the window merging operation, the target device sends a window merging event to the source device;
in response to the window merging event, the source device acquires screen projection data obtained after the first window and the second window are merged;
the source device sends the screen projection data to the target device;
the target device displays a first interface according to the screen projection data, wherein the interface of the first application task and the interface of the second application task are displayed in the first interface in a split-screen manner.
17. The method of claim 16, wherein after the target device sends the window merging event to the source device, the method further comprises:
in response to the window merging event, the source device sets the window attributes of the first window and the second window to split-screen windows.
18. The method of claim 16 or 17, wherein the source device comprises a first display module and a second display module, the first display module being configured to provide the target device with screen projection data when projecting a screen; the second display module is used for providing display data to the source device and providing screen projection data to the target device when a screen is projected;
prior to the receiving the window merge event, the first application task running at the first display module, the second application task running at the second display module;
after the source device receives the window merging event, the method further includes:
and the source device moves the first application task from the first display module to the stack top of the second display module to run.
19. The method of claim 18, wherein after the target device sends the window merging event to the source device, the method further comprises:
in response to the window merging event, the source device displays the interface of the first application task and the interface of the second application task in a second interface in a split-screen manner.
20. The method of any one of claims 16-19, wherein each window displayed by the target device contains a boundary hot zone; the window merging operation is: a release operation input by the user when, after the user has dragged the first window, the overlapping area between the boundary hot zone of the first window and the boundary hot zone of the second window is greater than an area threshold.
21. A screen projection display method is characterized by comprising the following steps:
the source device projects an interface of a first application task into a first window of a target device for display, and projects an interface of a second application task into a second window of the target device for display;
the source device receives a window merging event sent by the target device, wherein the window merging event is used for indicating that a user inputs a window merging operation for merging the first window and the second window;
in response to the window merging event, the source device acquires screen projection data obtained after the first window and the second window are merged;
and the source device sends the screen projection data to the target device.
22. The method of claim 21, wherein after the source device receives the window merging event sent by the target device, the method further comprises:
in response to the window merging event, the source device sets the window attributes of the first window and the second window to split-screen windows.
23. The method of claim 22, wherein the source device comprises a first display module and a second display module, the first display module being configured to provide screen projection data to the target device when projecting a screen; the second display module is used for providing display data to the source device and providing screen projection data to the target device during screen projection;
prior to said receiving said window merge event, said first application task running on said first display module, said second application task running on said second display module;
after the source device receives the window merging event, the method further includes:
and the source end device moves the first application task from the first display module to the stack top of the second display module to run.
24. The method of any one of claims 21-23, wherein after the target device sends the window merging event to the source device, the method further comprises:
in response to the window merging event, the source device displays the interface of the first application task and the interface of the second application task in a second interface in a split-screen manner.
25. A screen projection display method is characterized by comprising the following steps:
a target device displays a first window and a second window, wherein the first window comprises an interface of a first application task projected by a source device, and the second window comprises an interface of a second application task projected by the source device;
the target device detects that a user inputs a window merging operation, wherein the window merging operation is used for merging the first window and the second window;
in response to the window merging operation, the target device sends a window merging event to the source device;
the target device receives screen projection data sent by the source device in response to the window merging event;
the target device displays a first interface according to the screen projection data, wherein the interface of the first application task and the interface of the second application task are displayed in the first interface in a split-screen manner.
26. The method of claim 25, wherein each window displayed by the target device contains a boundary hot zone; the window merging operation is: a release operation input by the user when, after the user has dragged the first window, the overlapping area between the boundary hot zone of the first window and the boundary hot zone of the second window is greater than an area threshold.
27. A source device, comprising:
a display screen;
one or more processors;
a memory;
a communication module;
wherein the memory has stored therein one or more computer programs comprising instructions which, when executed by the source device, cause the source device to carry out the screen projection display method of any one of claims 21-24.
28. A target device, comprising:
a display screen;
one or more processors;
a memory;
a communication module;
wherein the memory has stored therein one or more computer programs comprising instructions which, when executed by the target device, cause the target device to perform the screen projection display method of claim 25 or 26.
29. A computer-readable storage medium having instructions stored therein, which when run on a source device, cause the source device to perform the screen projection display method of any of claims 21-24; alternatively, the instructions, when executed on a target device, cause the target device to perform the screen projection display method of claim 25 or 26.