CN116156229A - Screen projection method, user interface and electronic equipment


Info

Publication number
CN116156229A
Authority
CN
China
Prior art keywords
screen
user interface
user
throwing
gesture
Prior art date
Legal status
Pending
Application number
CN202210151376.7A
Other languages
Chinese (zh)
Inventor
任国锋
石金得
Current Assignee
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd
Publication of CN116156229A

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41Structure of client; Structure of client peripherals
    • H04N21/4104Peripherals receiving signals from specially adapted client devices
    • H04N21/4126The peripheral being portable, e.g. PDAs or mobile phones
    • H04N21/41265The peripheral being portable, e.g. PDAs or mobile phones having a remote control device for bidirectional communication between the remote control device and client device
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00Arrangements for program control, e.g. control units
    • G06F9/06Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44Arrangements for executing specific programs
    • G06F9/451Execution arrangements for user interfaces
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41Structure of client; Structure of client peripherals
    • H04N21/4104Peripherals receiving signals from specially adapted client devices
    • H04N21/4122Peripherals receiving signals from specially adapted client devices additional display device, e.g. video projector
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/436Interfacing a local distribution network, e.g. communicating with another STB or one or more peripheral devices inside the home
    • H04N21/4363Adapting the video or multiplex stream to a specific local network, e.g. a IEEE 1394 or Bluetooth® network
    • H04N21/43637Adapting the video or multiplex stream to a specific local network, e.g. a IEEE 1394 or Bluetooth® network involving a wireless protocol, e.g. Bluetooth, RF or wireless LAN [IEEE 802.11]

Abstract

The application discloses a screen projection method, a user interface, and an electronic device. The screen projection method involves a screen-casting device (the source) and a screen-cast device (the destination). While the screen-cast device displays content cast from the screen-casting device, the screen-cast device can receive a user operation and send information indicating that operation to the screen-casting device. The screen-casting device derives, from the movement track of the user operation, the relevant parameters of the hand-following animation triggered by the operation, and sends those parameters to the screen-cast device. The screen-cast device then refreshes its display according to the parameters and switches the user interface with the hand-following animation, so that the cast content changes in response to the user operation and the hand-following animation is displayed. In this way, the user can control the cast content by operating the screen-cast device, and the hand-following animation shown while the display interface switches provides the user with an imperceptible screen-casting effect.

Description

Screen projection method, user interface and electronic equipment
Technical Field
This application relates to the field of terminal and communication technologies, and in particular to a screen projection method, a user interface, and an electronic device.
Background
With the development of multimedia technology, people can watch videos, play games, hold online conferences, and more through electronic devices such as mobile phones and tablet computers, and screen casting has become one of the mainstream ways of viewing such content.
Screen casting means that content on one device (the screen-casting device) is cast onto another device (the screen-cast device) for display. For example, in a typical application scenario, a user may cast a video from a mobile phone onto a television for playback, obtaining a more comfortable viewing experience.
Traditionally, the screen-cast device serves only as a display, and the user can control the cast content only by operating the screen-casting device. When the screen-casting device is not at the user's side or is inconvenient to operate, the user cannot control the cast content at all, which leaves the user with little operability and degrades the screen-casting experience.
Disclosure of Invention
The screen projection method, user interface, and electronic device provided in this application enable the screen-cast device to detect user operations while it displays content provided by the screen-casting device, complete the user interface switch according to those operations, and display a hand-following animation during the switch, giving the user the feeling of controlling a local device and an imperceptible screen-casting effect.
In a first aspect, the present application provides a screen projection method, including: a first device establishes a screen-casting connection with a second device; the first device sends first content to the second device based on the screen-casting connection; the second device displays a first user interface including the first content; the second device detects a first operation acting on the second device; the second device sends first information indicating the first operation to the first device; the first device sends animation switching parameters of a first animation to the second device, the animation switching parameters being determined by the first device according to the movement track of the first operation; the second device displays the first animation and switches the first user interface to a second user interface.
In this embodiment of the present application, the first device may be the screen-casting device and the second device the screen-cast device: the screen-casting device can establish a screen-casting connection with the screen-cast device and cast its content onto the display screen of the screen-cast device for display.
In the method provided by the first aspect, while the screen-cast device displays cast content provided by the screen-casting device, the screen-cast device can detect a user operation performed on it and send information indicating that operation to the screen-casting device. The screen-casting device can then derive, from the sliding track of the operation, the hand-following animation the operation triggers, and send the animation switching parameters of that animation to the screen-cast device, so that the screen-cast device displays the hand-following animation and completes the user interface switch. The screen-cast device is therefore no longer a mere display for content provided by the screen-casting device: it can control the cast content according to user operations and complete user interface switching, which loosens the coupling between the two devices during screen casting and provides the user with an imperceptible screen-casting experience.
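By way of illustration only, the following Java sketch traces this round trip between the two devices. It is a minimal sketch under assumed names: the record types, the device identifier, and the stubbed animation resolution are hypothetical illustrations, not an API defined by this application.

```java
import java.util.List;

public class CastRoundTrip {

    // "First information" indicating the user operation (hypothetical shape).
    record FirstInformation(String eventType, float x, float y,
                            long timeMillis, String secondDeviceId) {}

    // Animation switching parameters of the first (hand-following) animation.
    record AnimationParams(float windowScale, float moveSpeed,
                           float endX, float endY) {}

    // Runs on the second (screen-cast) device: describe the detected operation.
    static FirstInformation describeOperation(String type, float x, float y, long t) {
        return new FirstInformation(type, x, y, t, "second-device-id");
    }

    // Runs on the first (screen-casting) device: resolve the animation from
    // the movement track (the derivation itself is stubbed here).
    static AnimationParams resolveAnimation(List<FirstInformation> track) {
        FirstInformation last = track.get(track.size() - 1);
        return new AnimationParams(0.6f, 1.2f, last.x(), last.y());
    }

    public static void main(String[] args) {
        // Steps 1-3: connection established, first content already displayed.
        // Steps 4-5: the second device detects a swipe and sends first information.
        List<FirstInformation> swipe = List.of(
                describeOperation("down", 540, 2300, 0),
                describeOperation("move", 540, 1700, 50),
                describeOperation("up",   540, 1100, 100));
        // Step 6: the first device answers with animation switching parameters.
        AnimationParams params = resolveAnimation(swipe);
        // Step 7: the second device plays the hand-following animation and
        // switches the first user interface to the second user interface.
        System.out.println("play animation, then switch UI: " + params);
    }
}
```

Under this split, the screen-cast device only reports raw operations; all interpretation of the movement track stays on the screen-casting device, which keeps the hand-following animation consistent with the casting device's own interface logic.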
With reference to the first aspect, in some embodiments, the first operation is a gesture operation acting on a display screen of the second device.
In some embodiments, the user operation on the screen-cast device may be a gesture operation on its display screen. In that case the screen-cast device can display, according to the touch point where the user acts on the display screen, a hand-following animation in which the window holding the user interface moves along with the touch point, providing an interaction experience that is indistinguishable before and after screen casting.
Further, the first operation may be a gesture operation acting on the second content displayed by the second device.
With reference to the first aspect, in some embodiments, the second user interface includes content local to the second device.
That is, the user interface displayed by the screen-cast device after the switch may contain the screen-cast device's own local content. Even while the screen-casting connection with the screen-casting device is in place, the screen-cast device can still receive user operations and switch to local content, so it can offer the user richer content and more convenient operation during screen casting, and the user can switch quickly between cast content and local content as needed.
With reference to the first aspect, in some embodiments, the first user interface and the second user interface are upper and lower pages of a same application; or the first user interface and the second user interface are user interfaces provided by different applications, the second user interface being a user interface provided by an application in the second device; or the second user interface is a desktop main interface provided by the first device or the second device, or a multi-tasking interface that includes history browsing windows of one or more applications in the second device.
In some embodiments, the first operation may include a Back gesture, a Home gesture, a Recents gesture, a QuickSwitch gesture, and so on. By applying these gestures to the display screen of the screen-cast device, the user can achieve interface-switching effects such as returning to the previous page, returning to the desktop main interface, entering recent tasks, and quickly switching applications. The user can thus enjoy a convenient cross-device system navigation function on the screen-cast device during screen casting.
Further, when the first operation triggers a return to the desktop main interface, the screen-cast device may return to its own desktop main interface. When the first operation triggers application switching, the switched user interface may be a user interface provided by an application browsed in history or installed on the screen-casting device. When the first operation triggers entry into recent tasks, the multi-tasking interface may include browsing records of applications browsed in history or installed on the screen-casting device. The screen-cast device can thus switch conveniently between cast content and local content during screen casting, which loosens the coupling between the two devices and provides the user with a cross-device gesture navigation function.
With reference to the first aspect, in some embodiments, the first information includes one or more of: an event type, coordinates, a time, and an identification of the second device; the event type is used to determine the gesture operation corresponding to the first operation, the coordinates are the position of the first operation on the display screen of the second device, and the time is the time at which the first operation acts on the display screen of the second device; the animation switching parameters include one or more of the following: the scaling, movement track, movement speed, and movement position of the windows holding the first user interface and the second user interface.
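As an illustration of the enumerated fields, the following sketch gives one possible in-memory shape for the first information and the animation switching parameters. The field names and example values are assumptions: the claims list only the kinds of information carried, not a concrete format.

```java
import java.util.List;

public class CastPayloads {

    record FirstInformation(String eventType,   // maps to a gesture operation
                            float x, float y,   // position on the second device's display
                            long timeMillis,    // when the operation acted on the display
                            String deviceId) {} // identification of the second device

    record AnimationParams(float windowScale,        // scaling of the window
                           List<float[]> moveTrack,  // movement track
                           float moveSpeed,          // movement speed
                           float[] movePosition) {}  // movement (end) position

    public static void main(String[] args) {
        FirstInformation info =
                new FirstInformation("move", 540f, 1700f, 50L, "device-2");
        AnimationParams anim = new AnimationParams(
                0.75f,
                List.of(new float[] {540f, 2300f}, new float[] {540f, 1700f}),
                1.2f,
                new float[] {540f, 1700f});
        System.out.println(info + " -> window scale " + anim.windowScale());
    }
}
```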
With reference to the first aspect, in some embodiments, before the first device establishes the screen-casting connection with the second device, the method further includes: the first device displays a third user interface including the first content.
That is, after establishing the screen-casting connection with the screen-cast device, the screen-casting device may send the content it is currently displaying to the screen-cast device through that connection so that the screen-cast device can display it.
With reference to the first aspect, in some embodiments, before or after the first device sends the animation switching parameters of the first animation to the second device, the method further includes: the first device displays the first animation and switches the third user interface to a fourth user interface.
In the embodiment of the present application, the screen casting modes between the screen-casting device and the screen-cast device may include, but are not limited to, the following two:
1) Same-source screen casting
Same-source screen casting means that the display content of the screen-casting device is "copied" onto the display screen of the screen-cast device and shown there completely, full screen or in a partial window. That is, the screen-casting device and the screen-cast device can display the same content at the same time.
In addition, in the embodiment of the application, the screen-cast device can receive a user operation and complete the user interface switch in response to it. Meanwhile, the screen-casting device can also respond to the user operation and complete the user interface switch synchronously.
When the screen casting mode between the two devices is same-source casting, the screen-cast device displays the same content as the screen-casting device, and when the screen-cast device plays the hand-following animation in response to the user's operation, the screen-casting device displays the same hand-following animation and completes the same user interface switch. For example, the user can cast the display content of a large-screen device onto a mobile phone for synchronous display and operate on the phone, changing the display content of the phone and the large screen at the same time. The user can thus present the cast content on several devices and, when the screen-casting device is inconvenient to operate, control its display content from the screen-cast device, which offers a more convenient means of controlling device content and improves the operability and shareability of screen casting.
2) Different-source screen casting
Different-source screen casting means that the screen-cast device displays content provided by the screen-casting device, but the content displayed by the screen-casting device itself may differ from the content displayed by the screen-cast device. That is, the two devices may display different content.
In addition, in the embodiment of the application, the screen-cast device can receive a user operation and complete the user interface switch in response to it. Moreover, the screen-casting device may be unaffected by the user operation; that is, its displayed content need not change with the operation.
When the screen casting mode between the two devices is different-source casting, the screen-cast device can display content different from the screen-casting device. A screen-casting connection exists between them, but the two behave like independent devices: each can receive user operations and change its own display content accordingly. For example, a portable screen may display the cast content provided by a mobile phone and receive user operations that change what the portable screen displays, while the phone's display content is unaffected by operations received on the portable screen. The user can thus cast content provided by the phone onto a portable screen with a larger display for a better viewing effect while the phone's display remains unaffected and available for other activities, which loosens the coupling between the casting end and the cast end during screen casting.
With reference to the first aspect, in some embodiments, the third user interface and the fourth user interface are upper and lower pages of a same application or user interfaces provided by different applications; alternatively, the fourth user interface is a desktop main interface or a multi-tasking interface.
When the screen casting mode between the two devices is same-source casting, the screen-casting device can trigger user interface switching effects such as returning to the previous page, returning to the desktop main interface, entering recent tasks, and quickly switching applications according to the user operation acting on the screen-cast device. In this way, a user operation acting on the screen-cast device changes the display content of both devices at the same time.
With reference to the first aspect, in some embodiments, before the first device establishes the screen-casting connection with the second device, the method further includes: the second device displays a fifth user interface containing second content, the second content and the first content being the same content; the second device detects a second operation, the second operation and the first operation being the same operation; the second device displays a sixth user interface, the sixth user interface being different from the second user interface.
In some embodiments, when the screen-cast device does not support the system navigation function, receiving the first operation before the screen-casting connection is established does not trigger a user interface switch; the first operation can only be responded to as an application-level gesture instruction. After the screen-casting connection is established, if the screen-casting device supports the system navigation function, the screen-casting device can respond to the first operation received from the screen-cast device as a system-level gesture instruction, generate the hand-following animation that instruction triggers, and have the screen-cast device display the animation; in this case the first operation is responded to as a system-level gesture instruction. It can be seen that system-level gesture instructions have a higher priority than application-level gesture instructions. That is, during screen casting, when a user operation received by the screen-cast device could be responded to either as an application-level gesture or as a system-level gesture, the system-level gesture is responded to preferentially, so that the user's need to switch user interfaces across applications during screen casting is met first.
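The following sketch illustrates only this priority rule; the handler types are hypothetical and not part of the claimed method. A system-level handler, when present (as it is during screen casting), is preferred over an application-level handler for the same operation.

```java
import java.util.Optional;

public class GestureDispatch {

    interface Handler { void run(); }

    static void dispatch(Optional<Handler> systemLevel, Optional<Handler> appLevel) {
        if (systemLevel.isPresent()) {
            systemLevel.get().run();          // preferred: switches user interfaces
        } else {
            appLevel.ifPresent(Handler::run); // fallback: responds inside the app
        }
    }

    public static void main(String[] args) {
        // Before casting: no system navigation support, only the app responds.
        dispatch(Optional.empty(),
                 Optional.of(() -> System.out.println("app-level response")));
        // During casting: the first device responds at system level.
        dispatch(Optional.of(() -> System.out.println("system-level response")),
                 Optional.of(() -> System.out.println("app-level response")));
    }
}
```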
With reference to the first aspect, in some embodiments, after the first device establishes the screen-casting connection with the second device, the method further includes: the first device detects a third operation and displays a seventh user interface comprising third content; the first device sends the third content to the second device based on the screen-casting connection; the second device displays an eighth user interface containing the third content.
In addition, during same-source screen casting, the screen-casting device can also receive user operations that change the display content of both devices, improving the operability of both devices during screen casting.
With reference to the first aspect, in some embodiments, before the first device establishes the screen-casting connection with the second device, the method further includes: the first device displays a first window containing a first control; the first device detects a fourth operation acting on the first control, the fourth operation being used to trigger the first device to establish the screen-casting connection with the second device.
That is, the screen-casting device can provide a screen-casting control through which the user can turn the screen-casting function on or off, so the user can decide under what conditions to establish a screen-casting connection with other devices, improving operability for the user.
With reference to the first aspect, in some embodiments, after the first device detects the fourth operation acting on the first control, the method further includes: the first device displays one or more device options, the one or more device options including a first device option corresponding to the second device; the first device detects a fifth operation acting on the first device option, the fifth operation being used to trigger the first device to establish the screen-casting connection with the second device.
With reference to the first aspect, in some embodiments, before the first device sends the first content to the second device based on the screen-casting connection, the method further includes: the second device displays first prompt information, the first prompt information being used to ask the user whether to agree to establish the screen-casting connection with the first device; the second device detects a sixth operation, the sixth operation being used to trigger the first device to send the first content to the second device.
In a second aspect, an embodiment of the present application provides a screen projection method, the method including: a first device establishes a screen-casting connection with a second device; the first device sends first content to the second device based on the screen-casting connection; the first device receives first information indicating a first operation, the first operation being an operation acting on the first content in a first user interface that is displayed by the second device and includes the first content; the first device sends animation switching parameters of a first animation to the second device, the animation switching parameters being determined by the first device according to the movement track of the first operation.
At the casting end, the screen-casting device can serve as the computing device that resolves the hand-following animation triggered by the user operation acting on the screen-cast device, so that the screen-cast device can trigger a hand-following animation according to a user operation performed on it, achieving the effect that the screen-cast device controls the cast content according to the user operation.
With reference to the second aspect, in some embodiments, before the first device establishes the screen-casting connection with the second device, the method further includes: the first device displays a third user interface including the first content.
That is, after establishing the screen-casting connection with the screen-cast device, the screen-casting device may send the content it is currently displaying to the screen-cast device through that connection so that the screen-cast device can display it.
With reference to the second aspect, in some embodiments, before or after the first device sends the animation switching parameters of the first animation to the second device, the method further includes: the first device displays the first animation and switches the third user interface to a fourth user interface.
When the screen casting mode between the two devices is same-source casting, the screen-casting device can display the same hand-following animation while the screen-cast device plays the hand-following animation in response to the user's operation, completing the user interface switch. The user can thus present the cast content on several devices and, when the screen-casting device is inconvenient to operate, control its display content from the screen-cast device, which offers a more convenient means of controlling device content and improves the operability and shareability of screen casting.
With reference to the second aspect, in some embodiments, the third user interface and the fourth user interface are upper and lower pages of a same application or user interfaces provided by different applications; alternatively, the fourth user interface is a desktop main interface or a multi-tasking interface.
When the screen casting mode between the two devices is same-source casting, the screen-casting device can trigger user interface switching effects such as returning to the previous page, returning to the desktop main interface, entering recent tasks, and quickly switching applications according to the user operation acting on the screen-cast device. In this way, a user operation acting on the screen-cast device changes the display content of both devices at the same time.
With reference to the second aspect, in some embodiments, after the first device establishes the screen-casting connection with the second device, the method further includes: the first device detects a third operation and displays a seventh user interface comprising third content; the first device sends the third content to the second device based on the screen-casting connection.
In addition, during same-source screen casting, the screen-casting device can also receive user operations that change the display content of both devices, improving the operability of both devices during screen casting.
With reference to the second aspect, in some embodiments, before the first device establishes the screen-casting connection with the second device, the method further includes: the first device displays a first window containing a first control; the first device detects a fourth operation acting on the first control, the fourth operation being used to trigger the first device to establish the screen-casting connection with the second device.
That is, the screen-casting device can provide a screen-casting control through which the user can turn the screen-casting function on or off, so the user can decide under what conditions to establish a screen-casting connection with other devices, improving operability for the user.
With reference to the second aspect, in some embodiments, after the first device detects the fourth operation acting on the first control, the method further includes: the first device displays one or more device options, the one or more device options including a first device option corresponding to the second device; the first device detects a fifth operation acting on the first device option, the fifth operation being used to trigger the first device to establish the screen-casting connection with the second device.
In a third aspect, an embodiment of the present application provides a screen projection method, the method including: a second device establishes a screen-casting connection with a first device; the second device receives first content sent by the first device based on the screen-casting connection; the second device displays a first user interface including the first content; the second device detects a first operation acting on the second device; the second device sends first information indicating the first operation to the first device; the second device receives animation switching parameters of a first animation sent by the first device, the animation switching parameters being determined by the first device according to the movement track of the first operation; the second device displays the first animation and switches the first user interface to a second user interface.
At the cast end, the screen-cast device can display the content provided by the screen-casting device and send information about the user operation detected on the screen-cast device to the screen-casting device, so that the screen-casting device resolves the hand-following animation triggered by the user operation; the screen-cast device then displays that hand-following animation, allowing the user to change the cast content by controlling the screen-cast device and providing an imperceptible screen-casting experience.
With reference to the third aspect, in some embodiments, the first operation is a gesture operation acting on a display screen of the second device.
In some embodiments, the user operation on the screen-cast device may be a gesture operation on its display screen. In that case the screen-cast device can display, according to the touch point where the user acts on the display screen, a hand-following animation in which the window holding the user interface moves along with the touch point, providing an interaction experience that is indistinguishable before and after screen casting.
Further, the first operation may be a gesture operation acting on the second content displayed by the second device.
With reference to the third aspect, in some embodiments, the second user interface includes content local to the second device.
That is, the user interface displayed by the screen-cast device after the switch may contain the screen-cast device's own local content. Even while the screen-casting connection with the screen-casting device is in place, the screen-cast device can still receive user operations and switch to local content, so it can offer the user richer content and more convenient operation during screen casting, and the user can switch quickly between cast content and local content as needed.
With reference to the third aspect, in some embodiments, the first user interface and the second user interface are upper and lower pages of a same application; or the first user interface and the second user interface are user interfaces provided by different applications, the second user interface being a user interface provided by an application in the second device; or the second user interface is a desktop main interface provided by the first device or the second device, or a multi-tasking interface that includes history browsing windows of one or more applications in the second device.
In some embodiments, the first operation may include a Back gesture, a Home gesture, a Recents gesture, a QuickSwitch gesture, and so on. By applying these gestures to the display screen of the screen-cast device, the user can achieve interface-switching effects such as returning to the previous page, returning to the desktop main interface, entering recent tasks, and quickly switching applications. The user can thus enjoy a convenient cross-device system navigation function on the screen-cast device during screen casting.
Further, when the first operation triggers a return to the desktop main interface, the screen-cast device may return to its own desktop main interface. When the first operation triggers application switching, the switched user interface may be a user interface provided by an application browsed in history or installed on the screen-casting device. When the first operation triggers entry into recent tasks, the multi-tasking interface may include browsing records of applications browsed in history or installed on the screen-casting device. The screen-cast device can thus switch conveniently between cast content and local content during screen casting, which loosens the coupling between the two devices and provides the user with a cross-device gesture navigation function.
With reference to the third aspect, in some embodiments, the first information includes one or more of: an event type, coordinates, a time, and an identification of the second device; the event type is used to determine the gesture operation corresponding to the first operation, the coordinates are the position of the first operation on the display screen of the second device, and the time is the time at which the first operation acts on the display screen of the second device; the animation switching parameters include one or more of the following: the scaling, movement track, movement speed, and movement position of the windows holding the first user interface and the second user interface.
With reference to the third aspect, in some embodiments, before the second device establishes the screen-casting connection with the first device, the method further includes: the second device displays a fifth user interface containing second content, the second content and the first content being the same content; the second device detects a second operation, the second operation and the first operation being the same operation; the second device displays a sixth user interface, the sixth user interface being different from the second user interface.
In some embodiments, when the screen-cast device does not support the system navigation function, receiving the first operation before the screen-casting connection is established does not trigger a user interface switch; the first operation can only be responded to as an application-level gesture instruction. After the screen-casting connection is established, if the screen-casting device supports the system navigation function, the screen-casting device can respond to the first operation received from the screen-cast device as a system-level gesture instruction, generate the hand-following animation that instruction triggers, and have the screen-cast device display the animation; in this case the first operation is responded to as a system-level gesture instruction. It can be seen that system-level gesture instructions have a higher priority than application-level gesture instructions. That is, during screen casting, when a user operation received by the screen-cast device could be responded to either as an application-level gesture or as a system-level gesture, the system-level gesture is responded to preferentially, so that the user's need to switch user interfaces across applications during screen casting is met first.
With reference to the third aspect, in some embodiments, after the second device establishes the screen-casting connection with the first device, the method further includes: the second device receives third content sent by the first device based on the screen-casting connection; the second device displays an eighth user interface containing the third content.
During screen casting, when the screen-casting device changes the cast content, the screen-cast device can display the changed cast content synchronously. The user can thus display the same content, and the content changed at the user's trigger, synchronously on several devices, improving content shareability during screen casting.
With reference to the third aspect, in some embodiments, before the second device receives the first content sent by the first device based on the screen-casting connection, the method further includes: the second device displays first prompt information, the first prompt information being used to ask the user whether to agree to establish the screen-casting connection with the first device; the second device detects a sixth operation, the sixth operation being used to trigger the first device to send the first content to the second device.
In a fourth aspect, embodiments of the present application provide an electronic device comprising a memory, one or more processors, and one or more programs; when the one or more processors execute the one or more programs, the electronic device is caused to implement the method described in the second aspect or any implementation of the second aspect, or the third aspect or any implementation of the third aspect.
In a fifth aspect, embodiments of the present application provide a computer-readable storage medium comprising instructions that, when run on an electronic device, cause the electronic device to perform the method described in the second aspect or any implementation of the second aspect, or the third aspect or any implementation of the third aspect.
In a sixth aspect, embodiments of the present application provide a computer program product that, when run on a computer, causes the computer to perform the method described in the second aspect or any implementation of the second aspect, or the third aspect or any implementation of the third aspect.
Drawings
Fig. 1 is a schematic structural diagram of a communication system according to an embodiment of the present application;
figs. 2A and 2B show specific application scenarios of the screen projection method provided in the embodiments of the present application;
figs. 3A-3F, 4A-4D, 5A-5F, 6A-6D, 7A-7B, and 8A-8D show some of the user interfaces provided by embodiments of the present application;
fig. 9 is a schematic diagram of the hardware structure of an electronic device according to an embodiment of the present application;
fig. 10 is a schematic diagram of module interaction between devices under different-source screen casting according to an embodiment of the present application;
fig. 11 is a schematic diagram of module interaction between devices under same-source screen casting according to an embodiment of the present application;
fig. 12 is a schematic flow chart of a screen projection method according to an embodiment of the present application;
fig. 13 is a schematic structural diagram of a screen projection system according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application are described clearly and thoroughly below with reference to the accompanying drawings. In the description of the embodiments of this application, unless otherwise indicated, "/" means "or"; for example, A/B may represent A or B. "And/or" merely describes an association between associated objects and indicates that three relationships may exist; for example, "A and/or B" may indicate the three cases where only A exists, where both A and B exist, and where only B exists. In addition, in the description of the embodiments of this application, "plural" means two or more.
The terms "first", "second", and the like below are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features. Thus, a feature defined with "first" or "second" may explicitly or implicitly include one or more such features. In the description of the embodiments of this application, unless otherwise indicated, "a plurality of" means two or more.
The term "user interface (UI)" in the following embodiments of this application refers to a media interface for interaction and information exchange between an application program or operating system and a user; it converts an internal form of information into a form acceptable to the user. A user interface is source code written in a specific computer language such as Java or the extensible markup language (XML); the interface source code is parsed and rendered on the electronic device and finally presented as content the user can recognize. A commonly used presentation form of the user interface is the graphical user interface (GUI), which refers to a user interface, displayed in a graphical manner, related to computer operations. It may comprise visual interface elements such as text, icons, buttons, menus, tabs, text boxes, dialog boxes, status bars, navigation bars, and widgets displayed on the display of the electronic device.
To improve the display effect of content on devices, screen casting technology can cast content from one device (the screen-casting device) onto another device (the screen-cast device) for display. However, the screen-cast device has so far served only as a display for the cast content and could not receive user operations controlling that content, so how to improve the operability of the screen-cast device is a problem that currently needs to be solved.
The embodiment of the application provides a screen projection method involving a screen-casting device and a screen-cast device. While the screen-cast device displays the cast content of the screen-casting device, the screen-cast device can receive a user operation, generate an Input event based on the operation, and send the Input event to the screen-casting device. The screen-casting device determines the interaction parameters of the screen-cast device according to the device information carried by the Input event, resolves, from the interaction parameters and the Input event, the relevant parameters of the hand-following animation triggered by the user operation, and sends those parameters to the screen-cast device. The screen-cast device refreshes its display according to the parameters and completes the user interface switch with the hand-following animation, achieving the effect that the screen-cast device changes the cast content according to the user operation and displays the hand-following animation. The user can thus control the cast content by operating the screen-cast device, which provides an imperceptible screen-casting effect.
In the embodiment of this application, the user operation may be gesture navigation within system navigation, including but not limited to: a Back gesture, a Home gesture, a Recents gesture, a QuickSwitch gesture, and the like. By applying these gestures to the display screen of the screen-cast device, the user can achieve interface-switching effects such as returning to the previous page, returning to the desktop main interface, entering recent tasks, and quickly switching applications, which makes user interface switching easier and offers a more convenient touch experience. In addition, the hand-following animation means that during the user interface switch, the switching track of the user interface follows the movement track of the user's gesture, presenting to the user the effect of locally manipulating the content displayed by the screen-casting device. A specific description of the gesture navigation above can be found in the subsequent UI embodiments and is not expanded here.
The Input event may include information such as an event type, coordinates, a time, and device information. The event types may include a down event, a move event, and an up event: the down event indicates the start of a user gesture, the up event indicates the end of a user gesture, and the move events indicate the course of a user gesture. An Input event sequence triggered by a user gesture may include one down event, a plurality of move events, and one up event. The specific user operation, namely a Back gesture, a Home gesture, a Recents gesture, or a QuickSwitch gesture, can be identified from the event types. The coordinates are the position of the user operation on the display screen, the time is when the user triggered the operation, and the device information includes a device ID from which the screen-casting device can determine the device source of the Input event. The screen-casting device can recognize the user's gesture from the information carried in the Input event and determine the relevant parameters of the hand-following animation according to that gesture.
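The following Java sketch shows one plausible way to classify such an Input event sequence into a navigation gesture. The edge regions, distance thresholds, and hold time are illustrative assumptions; the text above does not specify concrete recognition rules.

```java
import java.util.List;

public class GestureClassifier {

    record InputEvent(String type, float x, float y, long timeMillis,
                      String deviceId) {}

    enum Gesture { BACK, HOME, RECENTS, QUICK_SWITCH, NONE }

    static Gesture classify(List<InputEvent> events, float screenW, float screenH) {
        InputEvent down = events.get(0);                 // start of the gesture
        InputEvent up = events.get(events.size() - 1);   // end of the gesture
        float dx = up.x() - down.x(), dy = up.y() - down.y();
        long dt = up.timeMillis() - down.timeMillis();

        boolean fromSideEdge =
                down.x() < 0.05f * screenW || down.x() > 0.95f * screenW;
        boolean fromBottomEdge = down.y() > 0.95f * screenH;

        if (fromSideEdge && Math.abs(dx) > 0.1f * screenW)
            return Gesture.BACK;                    // inward horizontal swipe
        if (fromBottomEdge && -dy > 0.2f * screenH)
            return dt > 400 ? Gesture.RECENTS       // swipe up and hold
                            : Gesture.HOME;         // quick swipe up
        if (fromBottomEdge && Math.abs(dx) > 0.2f * screenW)
            return Gesture.QUICK_SWITCH;            // horizontal swipe along the bottom
        return Gesture.NONE;
    }

    public static void main(String[] args) {
        List<InputEvent> swipeUp = List.of(
                new InputEvent("down", 540, 2380, 0,   "device-2"),
                new InputEvent("move", 540, 1900, 60,  "device-2"),
                new InputEvent("up",   540, 1500, 120, "device-2"));
        System.out.println(classify(swipeUp, 1080, 2400)); // prints HOME
    }
}
```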
In general, the screen projection method provided by the embodiments of this application enables the screen-cast device to respond to user operations and control the cast content of the screen-casting device, displaying the hand-following animation during the interface switch triggered by the operation. This loosens the coupling between the screen-casting device and the screen-cast device during screen casting, provides the user with an imperceptible cross-device system navigation function during screen casting, and offers the user a more convenient screen-casting experience.
Fig. 1 illustrates a communication system 1000 provided by an embodiment of the present application.
The communication system 1000 may include one or more screen-casting devices and one or more screen-cast devices. Illustratively, in the communication system 1000 shown in fig. 1, there is one of each: the communication system 1000 may include an electronic device 1001 and an electronic device 1002. The electronic device 1002 can display the cast content provided by the electronic device 1001; that is, the electronic device 1001 is the screen-casting device and the electronic device 1002 is the screen-cast device.
The electronic device 1001 may establish a communication connection with the electronic device 1002 and cast the content it provides onto the display screen of the electronic device 1002 for display.
While the electronic device 1002 displays the cast content provided by the electronic device 1001, the electronic device 1002 may be configured to receive a user operation performed by the user on the cast content, generate an Input event based on the operation, send the Input event to the screen-casting device, receive the hand-following animation parameters returned by the screen-casting device, and refresh the cast content according to those parameters to complete the user interface switch with the hand-following animation.
While the electronic device 1002 displays the cast content provided by the electronic device 1001, the electronic device 1001 may be configured to receive the Input event, determine the interaction parameters of the electronic device 1002 according to the device information carried in the Input event, resolve, from the interaction parameters and the Input event, the hand-following animation parameters triggered by the user operation, and send those parameters to the electronic device 1002.
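As a sketch of the analysis step performed by the electronic device 1001, the following code derives hand-following parameters (window scale, movement track, speed, end position) from the screen-cast device's interaction parameters and an Input event track. The formulas and constants are assumptions chosen for illustration, not values given in this application.

```java
import java.util.ArrayList;
import java.util.List;

public class AnimationResolver {

    record InputEvent(float x, float y, long timeMillis) {}
    record InteractionParams(float screenW, float screenH) {}
    record AnimationParams(float windowScale, List<float[]> moveTrack,
                           float moveSpeed, float[] movePosition) {}

    static AnimationParams resolve(InteractionParams device, List<InputEvent> track) {
        InputEvent first = track.get(0);
        InputEvent last = track.get(track.size() - 1);

        // Movement speed from the displacement over the gesture duration.
        float dist = (float) Math.hypot(last.x() - first.x(), last.y() - first.y());
        float speed = dist / Math.max(1, last.timeMillis() - first.timeMillis());

        // Shrink the window in proportion to the upward travel, so the
        // interface appears to follow the finger (e.g., during a Home gesture).
        float travel = Math.max(0, first.y() - last.y());
        float scale = Math.max(0.4f, 1f - 0.6f * travel / device.screenH());

        List<float[]> path = new ArrayList<>();
        for (InputEvent e : track) path.add(new float[] { e.x(), e.y() });

        return new AnimationParams(scale, path, speed,
                new float[] { last.x(), last.y() });
    }

    public static void main(String[] args) {
        InteractionParams phone = new InteractionParams(1080, 2400);
        List<InputEvent> swipe = List.of(
                new InputEvent(540, 2300, 0),
                new InputEvent(540, 1700, 50),
                new InputEvent(540, 1100, 100));
        AnimationParams p = resolve(phone, swipe);
        System.out.printf("scale=%.2f speed=%.2f px/ms%n",
                p.windowScale(), p.moveSpeed());
    }
}
```

The interaction parameters matter because the same finger travel should produce a proportionally similar animation on displays of different sizes; here that is reflected by normalizing the travel against the reported screen height.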
In addition, the communication connection between the electronic device 1001 and the electronic device 1002 may be wired or wireless. The electronic device 1001 may send the cast content to the electronic device 1002 over this connection, and the electronic device 1002 may display the cast content on its display screen, completing the screen cast between the two devices. The wireless connection may be a short-range connection such as wireless fidelity (Wi-Fi), Bluetooth, infrared, NFC, or ZigBee, or a long-range connection, including but not limited to a long-range connection over a mobile network based on 2G, 3G, 4G, 5G, and subsequent standard protocols. For example, the electronic device 1001 and the electronic device 1002 may log in to the same user account (e.g., a Huawei account) and then connect remotely through a server.
In this embodiment of the present application, the following two screen casting modes are mainly involved:
1) Same-source screen casting
Same-source screen casting means that the display content of the screen-casting device is "copied" onto the display screen of the screen-cast device and displayed there completely, full screen or in a partial window. In other words, the screen-casting device may also display the same cast content while casting it onto the screen-cast device, so that the two devices display the same content at the same time.
In addition, in the embodiment of the application, the screen-cast device can also receive a user operation acting on the cast content and complete the user interface switch in response to it. Meanwhile, the screen-casting device can also respond to the user operation and complete the user interface switch.
By way of example, fig. 2A illustrates a specific application scenario of same-source screen casting. As shown in fig. 2A, the screen-casting device is a large-screen device and the screen-cast device is a mobile phone. The large-screen device can cast its displayed content onto the display screen of the phone for synchronous display, and the phone can receive user operations that change the display content of both the phone and the large screen. The user can thus present the cast content on several devices and, when the large-screen device is inconvenient to operate, control its display content from the phone, which offers the user a more convenient means of controlling device content and improves the operability and shareability of screen casting.
2) Different-source screen casting
Different-source screen casting means that the screen-cast device presents display content provided by the screen-casting device, but the content presented by the screen-casting device itself may differ from the content presented by the screen-cast device. In other words, the screen-casting device need not display the cast content while casting it onto the screen-cast device, so the two devices may display different content.
In addition, in the embodiment of the application, the screen-cast device can also receive a user operation acting on the cast content and complete the user interface switch in response to it. At this time, the screen-casting device may be unaffected by the user operation; that is, its displayed content need not change with the user operation.
By way of example, fig. 2B illustrates a specific application scenario of different-source screen casting. As shown in fig. 2B, the screen-casting device is a mobile phone and the screen-cast device is a portable screen. The portable screen can display the cast content provided by the phone and receive user operations that change what the portable screen displays, while the phone's display content is unaffected by the portable screen and may differ from the cast content shown there. The user can thus cast content provided by the phone onto a portable screen with a larger display for a better viewing effect while the phone's own display is unaffected, allowing the user to act directly on the portable screen to change its content and loosening the coupling between the casting end and the cast end during screen casting.
Taking the electronic device 1001 and the electronic device 1002 in the communication system 1000 shown in fig. 1 as an example, some user interfaces provided in the embodiments of the present application are described below in conjunction with fig. 3A-3F, fig. 4A-4D, fig. 5A-5F, fig. 6A-6D, fig. 7A-7B, and fig. 8A-8D.
Among these user interfaces, the electronic device 1001 may be, for example, a large-screen device, and the electronic device 1002 may be a mobile phone. It may be understood that the embodiments of the present application do not limit the device types of the screen-casting device and the screen-cast device; each of them may be a mobile phone, a tablet, a computer, a portable screen, a bracelet, a watch, a large-screen device, or the like.
Fig. 3A-3F illustrate some of the user interfaces involved in establishing a screen-cast connection between an electronic device 1001 and an electronic device 1002. Wherein fig. 3A-3D are related user interfaces on the electronic device 1001, and fig. 3E-3F are related user interfaces on the electronic device 1002.
Fig. 3A illustrates a user interface 1-1 provided by the settings application when the electronic device 1001 opens the settings application. The user interface 1-1 may include function options such as: flight mode, Wi-Fi, Bluetooth, hotspot, and mobile network. The electronic device 1001 may detect a touch operation by the user on these different function options and, in response, turn on, turn off, or further configure the functions of flight mode, Wi-Fi, Bluetooth, hotspot, mobile network, and so on.
As shown in fig. 3A and 3B, when the electronic device 1001 detects a downward sliding operation on the display screen, in response to the sliding operation the electronic device 1001 may display the window 111 shown in fig. 3B on the user interface 1-1. As shown in fig. 3B, a control 111A may be displayed in the window 111; the control 111A may receive a user operation (e.g., a touch operation or a click operation) to turn on/off the screen-casting function of the electronic device 1001. The control 111A may be presented as an icon and/or text (e.g., the text "screen cast", "wireless projection", "multi-screen interaction", etc.). Switch controls for other functions, such as Wi-Fi, Bluetooth, and flashlight, may also be displayed in the window 111.
As shown in fig. 3B, the electronic device 1001 may detect a click operation on the control 111A and turn on the screen-casting function. In some embodiments, upon detecting the user operation on the control 111A, the electronic device 1001 may alter the display form of the control 111A, for example by adding a shadow to the displayed control 111A.
The triggering interface is not limited to the user interface 1-1 shown in fig. 3A: the user may also input the downward sliding operation on other interfaces of the settings application or on the user interfaces of other applications to trigger the electronic device 1001 to display the window 111.
Nor is the triggering operation limited to the user operation on the control 111A in the window 111 shown in fig. 3A and 3B. Alternatively, the user operation for turning on the screen-casting function may be an opening operation on a function option in the settings application. For example, the user interface 1-1 shown in fig. 3A may further include a function option corresponding to the screen-casting function, and the electronic device 1001 may detect the user acting on that function option to turn the screen-casting function on/off. For another example, the user may touch the electronic device 1001 against an NFC tag of the electronic device 1002 to trigger the electronic device 1001 to turn on the screen-casting function. The embodiments of the present application do not limit the user operation for turning on the screen-casting function.
In response to the user operation for turning on the screen-casting function, the electronic device 1001 starts one or more of WLAN, Bluetooth, or NFC in the wireless communication module 160, and may discover castable devices with which a screen-casting connection can be established through one or more wireless communication technologies among Wi-Fi direct, Bluetooth, and NFC.
After the electronic device 1001 discovers the castable devices with which a screen-casting connection can be established, it may display the identifications of these castable devices, illustratively in the window 112 shown in fig. 3C.
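As a rough illustration of this discovery step, the sketch below aggregates device identifications found over several wireless transports into one list such as the one shown in the window 112. The Transport interface and all names are assumptions made for the sketch, not a concrete platform API.

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical multi-transport discovery sketch.
public class CastDeviceDiscovery {

    public interface Transport {
        String name();        // e.g. "Wi-Fi direct", "Bluetooth", "NFC"
        List<String> scan();  // identifications of reachable castable devices
    }

    /** Collects castable-device identifications over every enabled transport. */
    public static List<String> discover(List<Transport> enabledTransports) {
        List<String> found = new ArrayList<>();
        for (Transport t : enabledTransports) {
            for (String id : t.scan()) {
                if (!found.contains(id)) {
                    found.add(id); // de-duplicate devices seen on several transports
                }
            }
        }
        return found;          // e.g. ["HUAWEI 20", ...], as listed in window 112
    }
}
```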
As shown in fig. 3C, the window 112 may display the identifications and connection options of one or more castable devices. The electronic device 1001 may detect a user operation acting on a connection option and establish a screen-casting connection with the castable device indicated by the identification corresponding to that connection option. Here, the identifications and connection options of the one or more castable devices include an identification 112A and a connection option 112B. When the electronic device 1001 detects a user operation on the connection option 112B, in response to that operation the electronic device 1001 may send a screen-casting connection request to the device whose identification 112A reads "HUAWEI 20". The connection option 112B may then be updated to the option 112C shown in fig. 3D.
It may be appreciated that the embodiments of the present application do not limit the user operation by which the electronic device 1001 selects a device for establishing the screen-casting connection, and the electronic device 1001 may display other information besides the identification of a castable device, such as its device type.
After the device identified as "HUAWEI 20" receives the screen-casting connection request, it may display a user interface 2-0 as shown in fig. 3E. The user interface 2-0 includes a window 201, which prompts the user to confirm whether to agree to establish the screen-casting connection. The window 201 may include a confirm control 201A and a cancel control 201B. The confirm control 201A may, in response to a user operation, establish the screen-casting connection with the electronic device 1001; the device identified as "HUAWEI 20" is then the electronic device 1002 that has established a screen-casting connection with the electronic device 1001, and the electronic device 1002 may display a user interface provided by the electronic device 1001, specifically the user interface 2-1 shown in fig. 3F. The cancel control 201B may, in response to a user operation, refuse to establish the screen-casting connection with the electronic device 1001.
As shown in fig. 3F, after the electronic device 1001 establishes the screen-casting connection with the electronic device 1002, the user interface 2-1 displayed by the electronic device 1002 shows the same content as the user interface 1-1 displayed by the electronic device 1001, namely the content provided by the settings application in the electronic device 1001.
In some embodiments, the cast content may be adaptively laid out according to the screen size of the electronic device 1002. For example, in the user interfaces shown in fig. 3A and fig. 3F, the function options presented in the user interface 1-1 of the electronic device 1001 may be fewer than the function options presented in the user interface 2-1 of the electronic device 1002, and the lateral length of the function options may match the lateral length of the screen of each device. Alternatively, in other embodiments, the cast content may be displayed on the screen of the electronic device 1002 at the size ratio used on the electronic device 1001, that is, without adaptive layout according to the screen size of the electronic device 1002; the embodiments of the present application do not limit this.
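The two layout behaviors described above can be pictured with the following Java sketch, a minimal illustration assuming a simple rectangular content area; the record and method names are hypothetical.

```java
// Illustrative only: two ways a cast frame could be fitted to the
// screen-cast device, matching the two behaviors described above.
public class CastLayout {

    /** Width and height of a rectangular area, in pixels. */
    public record Size(int width, int height) { }

    /** Adaptive mode: re-lay the content out against the target's own width. */
    public static Size adaptToTarget(Size targetScreen, int contentHeight) {
        return new Size(targetScreen.width(), contentHeight);
    }

    /** Fixed-ratio mode: keep the source aspect ratio, letterboxing if needed. */
    public static Size keepSourceRatio(Size source, Size targetScreen) {
        double scale = Math.min(
                (double) targetScreen.width() / source.width(),
                (double) targetScreen.height() / source.height());
        return new Size((int) (source.width() * scale), (int) (source.height() * scale));
    }
}
```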
Alternatively, after receiving the screen-casting connection request, the electronic device 1002 may directly establish the screen-casting connection with the electronic device 1001 and display the cast content provided by the electronic device 1001 without displaying prompt information, that is, without displaying the window 201 shown in fig. 3E.
As can be seen from fig. 3A-3F, the electronic device 1001 may establish a screen-casting connection with the electronic device 1002 through a screen-casting control, and content in the electronic device 1001 may be cast to the electronic device 1002 for display.
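The connection flow of fig. 3C-3F, including the optional confirmation prompt of fig. 3E and the prompt-free variant, can be sketched as below. The message strings and the Channel interface are assumptions for illustration, not a protocol defined by this application.

```java
// Hypothetical handshake sketch for establishing a screen-casting connection.
public class CastHandshake {

    public interface Channel {
        void send(String message);
        String receive();
    }

    /** Source side: request the connection and wait for the answer (fig. 3C-3E). */
    public static boolean requestConnection(Channel channel, String targetId) {
        channel.send("CAST_REQUEST:" + targetId);
        // The cast device answers after the user taps the confirm control (201A)
        // or the cancel control (201B), or immediately when it is configured to
        // accept without prompting.
        return "CAST_ACCEPT".equals(channel.receive());
    }

    /** Cast-device side: compute the answer to a received request. */
    public static String answer(boolean userAgreed, boolean autoAccept) {
        return (autoAccept || userAgreed) ? "CAST_ACCEPT" : "CAST_REJECT";
    }
}
```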
Fig. 4A-4D illustrate that, while the screen-casting connection between the electronic device 1001 and the electronic device 1002 is established, the electronic device 1001 may change the cast content displayed by both the electronic device 1001 and the electronic device 1002 according to a user operation. Fig. 4A-4B are the related user interfaces on the electronic device 1001, and fig. 4C-4D are the related user interfaces on the electronic device 1002.
As shown in fig. 4A, the user interface 1-1 is a user interface provided by the settings application in the electronic device 1001. The user interface 1-1 includes an option 113 used to set the Wi-Fi function. The electronic device 1001 may detect a user operation on the option 113 and, in response to that operation, display the detailed setting interface of the Wi-Fi function, that is, the user interface 1-2 shown in fig. 4B. The user interface 1-2 is the next-level user interface of the user interface 1-1 and is used to display the Wi-Fi network the electronic device 1001 is connected to and the connectable Wi-Fi networks.
In some embodiments, when the electronic device 1001 detects the operation on the option 113, the option 113 may enter a selected state and change its background color, for example displaying a dark background as shown in fig. 4A.
Because the electronic device 1001 and the electronic device 1002 have established the screen-casting connection, the electronic device 1001 may send, over that connection, the content of the user interface it displays after the change driven by the user operation (i.e., the user interface 1-2 shown in fig. 4B) to the electronic device 1002, so that the electronic device 1002 displays the changed content synchronously.
As shown in fig. 4C-4D, the electronic device 1002 changes its display from the user interface 2-1 shown in fig. 4C to the user interface 2-2 shown in fig. 4D based on the operation the user performs on the electronic device 1001. Specifically, as shown in fig. 4C, in the user interface 2-1 displayed on the electronic device 1002, the option 211 may change its background color synchronously with the background-color change of the option 113 in fig. 4A. The electronic device 1002 then displays the user interface 2-2 shown in fig. 4D; the user interface 2-2 is the next-level interface of the user interface 2-1 displayed after the option 113 is selected, and its content is the same as that of the user interface 1-2 shown in fig. 4B, namely a user interface provided by the settings application in the electronic device 1001.
As can be seen from fig. 4A-4D, while the screen-casting connection between the electronic device 1001 and the electronic device 1002 is established, the electronic device 1001 synchronously casts the content it displays to the electronic device 1002 over the connection, and it likewise synchronously casts the content changed by user operations to the electronic device 1002 for display.
It should be noted that fig. 4A-4D illustrate the user interfaces involved on the electronic device 1001 and the electronic device 1002 when the established connection is a homologous screen-casting connection. When the connection established by the electronic device 1001 and the electronic device 1002 is a heterologous screen-casting connection, the content the electronic device 1001 changes according to user operations may not affect the content displayed on the electronic device 1002. For example, while the electronic device 1001 displays the user interfaces shown in fig. 4A-4B, the electronic device 1002 may keep displaying the user interface shown in fig. 4C.
Fig. 5A-5F, fig. 6A-6D, fig. 7A-7B, and fig. 8A-8D illustrate some user interfaces that the electronic device 1002 and the electronic device 1001 may involve when the electronic device 1002 receives a Back gesture, a Home gesture, a Recents gesture, and a QuickSwitch gesture, respectively. Fig. 5A-5B, 5E-5F, 6A-6B, 7A, and 8A-8B are the related user interfaces on the electronic device 1002, and fig. 5C-5D, 6C-6D, 7B, and 8C-8D are the related user interfaces on the electronic device 1001.
As shown in fig. 5A and 5B, the electronic device 1002 may receive a Back gesture applied by the user to the user interface 2-2. The Back gesture may be expressed as a user operation sliding inward from the left edge of the display screen, as shown in fig. 5A; in response to that operation, the electronic device 1002 switches between upper and lower pages of the application and returns to the previous-level user interface shown in fig. 5B, that is, the user interface 2-1.
It may be appreciated that the Back gesture may also be expressed as a user operation sliding inward from the right edge of the display screen; the embodiments of the present application do not limit the specific operation of the Back gesture.
In some embodiments, when the Back gesture is received while the electronic device 1002 is displaying the main interface provided by an application, the Back gesture may be used to trigger a return to the desktop main interface of the electronic device 1002.
In addition, while the electronic device 1002 receives the Back gesture, the electronic device 1002 may display the return indicator 221 shown in fig. 5A. The return indicator 221 may change its display shape according to the distance between the user's touch point on the display screen and the screen edge, thereby presenting an animation effect in which the return indicator 221 follows the user's gesture. Moreover, as the electronic device 1002 returns to the previous-level user interface in response to the Back gesture, the electronic device 1002 may display a window switching effect from the user interface 2-2 to the user interface 2-1. For example, the transparency of the content displayed in the user interface 2-2 gradually increases while the transparency of the content displayed in the user interface 2-1 gradually decreases; or the content displayed in the user interface 2-2 moves in the direction of the user's gesture and gradually shrinks until it disappears, while the content displayed in the user interface 2-1 moves in the direction of the user's gesture and gradually enlarges until it occupies the entire display screen of the electronic device 1002. The embodiments of the present application do not limit the follow-hand animation the electronic device 1002 displays following the user operation.
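A minimal sketch of how such a follow-hand return indicator could be driven is given below, assuming the indicator simply grows with the finger's distance from the screen edge up to an invented threshold; none of the constants or names here come from this application.

```java
// Minimal follow-hand return-indicator sketch under stated assumptions.
public class ReturnIndicator {

    private static final float MAX_DISTANCE_PX = 120f; // assumed full-size distance

    /**
     * Display scale in [0, 1] for a touch point that has moved
     * distanceFromEdgePx inward from the screen edge.
     */
    public static float scaleFor(float distanceFromEdgePx) {
        if (distanceFromEdgePx <= 0f) {
            return 0f;
        }
        return Math.min(1f, distanceFromEdgePx / MAX_DISTANCE_PX);
    }

    /** Cross-fade between the outgoing and incoming interfaces (2-2 to 2-1). */
    public static float outgoingAlpha(float progress) { return 1f - progress; }
    public static float incomingAlpha(float progress) { return progress; }
}
```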
While the electronic device 1002 returns to the previous-level user interface according to the user's Back gesture, the electronic device 1001 may also return to its previous-level user interface in response to the Back gesture, that is, sequentially display the user interfaces shown in fig. 5C-5D.
As shown in fig. 5C-5D, following the user operation acting on the electronic device 1002, the electronic device 1001 may display the return indicator 121 in the user interface 1-2 shown in fig. 5C, and the animation effect of the return indicator 121 may be the same as that of the return indicator 221 shown in fig. 5A. In addition, while the electronic device 1002 returns to the previous-level user interface in response to the Back gesture, the electronic device 1001 may also switch, in step with the interface switching of the electronic device 1002, from the user interface 1-2 shown in fig. 5C to the previous-level user interface shown in fig. 5D, that is, the user interface 1-1; the window switching effect from the user interface 1-2 to the user interface 1-1 may be similar to the window switching effect from the user interface 2-2 to the user interface 2-1 shown in fig. 5A-5B.
As can be seen from fig. 5A-5D, the electronic device 1002 may switch from the currently displayed user interface to the previous-level user interface according to the user's Back gesture on the electronic device 1002. At the same time, the electronic device 1001 may also switch from its currently displayed user interface to the previous-level user interface according to the Back gesture applied to the electronic device 1002. Moreover, the electronic device 1002 and the electronic device 1001 may display similar follow-hand animations according to the Back gesture.
In some embodiments, gesture navigation instructions belonging to system navigation have a higher priority than gesture navigation instructions at the application level. Gesture navigation in system navigation is used to implement switching and management operations between applications, such as the return to the desktop main interface triggered by a Back gesture; gesture navigation at the application level is used to implement page management operations within an application, such as a left-slide operation that deletes an entry in the chat list of a chat application. That is, when the same gesture can be interpreted either as a system navigation gesture or as an application navigation gesture, the electronic device 1002 responds to it preferentially as system navigation.
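The priority rule can be made concrete with the following Java sketch: a dispatcher consults the system-navigation handler before the application-level handler, so a gesture that both could claim is consumed as system navigation. The Handler interface and the String gesture representation are assumptions for illustration.

```java
// Sketch of the system-navigation-first priority rule described above.
public class GestureDispatcher {

    public interface Handler {
        boolean matches(String gesture);
        void handle(String gesture);
    }

    private final Handler systemNavigation;  // Back / Home / Recents / QuickSwitch
    private final Handler applicationLevel;  // e.g. left-slide deleting a chat entry

    public GestureDispatcher(Handler systemNavigation, Handler applicationLevel) {
        this.systemNavigation = systemNavigation;
        this.applicationLevel = applicationLevel;
    }

    public void dispatch(String gesture) {
        // System navigation is consulted first; the application only sees the
        // gesture when system navigation does not claim it.
        if (systemNavigation.matches(gesture)) {
            systemNavigation.handle(gesture);
        } else if (applicationLevel.matches(gesture)) {
            applicationLevel.handle(gesture);
        }
    }
}
```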
Fig. 5E-5F illustrate some user interfaces involved when the electronic device 1002, before establishing the screen-casting connection with the electronic device 1001, displays the user interface provided by the settings application, i.e., the user interface 2-2, and receives a user operation sliding inward from the left edge of the display screen (the same operation as the Back gesture shown in fig. 5A).
The user interface 2-2 shown in fig. 5E is used to display the Wi-Fi network the electronic device 1002 is connected to and the connectable Wi-Fi networks. The user interface 2-2 may include one or more network options corresponding to the connectable Wi-Fi networks discovered by the electronic device 1002. In addition, in the settings application of the electronic device 1002, the one or more network options may receive a leftward-sliding user operation and display a setting window for the network option.
As shown in fig. 5E, when the electronic device 1002, before establishing the screen-casting connection with the electronic device 1001, detects on the user interface 2-2 a user operation sliding inward from the left edge of the display screen, that is, the same operation as the Back gesture shown in fig. 5A, the area on which the user operation acts lies within the area of the network option 222; the network option 222 therefore responds to the left-slide operation acting on it by displaying the setting window 223 shown in fig. 5F.
As shown in fig. 5F, the setting window 223 may be used to add, delete, and set Wi-Fi networks corresponding to the network option 222.
It should be noted that the electronic device 1002 displays different user interfaces when it receives the same operation before and after establishing the screen-casting connection with the electronic device 1001 because, before the connection is established, the electronic device 1002 may not itself support the system navigation function, so the operation can only trigger an application-level response and display the user interfaces shown in fig. 5E-5F. After the electronic device 1002 establishes the screen-casting connection, the screen-casting device supports the system navigation function, so the operation is responded to as system navigation, and the electronic device 1002 displays the interface switching effect shown in fig. 5A-5B.
As can be seen by comparing fig. 5A-5B with fig. 5E-5F, after the electronic device 1002 and the electronic device 1001 establish the screen-casting connection, when a user operation received on the electronic device 1002 could serve both as system-level navigation for triggering a user interface switch and as application-level navigation for triggering a control response within an application, the user operation is responded to preferentially as system-level navigation, so that the user's need to switch user interfaces between applications during screen casting is satisfied first.
Similarly, when any other system navigation gesture mentioned below conflicts with application navigation, it is likewise responded to with the highest priority; this will not be repeated for the user interfaces mentioned below.
It may be appreciated that, when the electronic device 1002 itself supports the system navigation function, the situation shown in fig. 5A-5B and fig. 5E-5F, in which the same operation triggers different responses before and after screen casting, does not arise; that is, the user interfaces shown in fig. 5E-5F do not occur.
As shown in fig. 6A and 6B, the electronic device 1002 may receive a Home gesture applied by the user to the user interface 2-1. The Home gesture may be expressed as a user operation sliding up from the bottom edge of the display screen, as shown in fig. 6A; in response to that operation, the electronic device 1002 returns to its desktop main interface, for example displaying the user interface 2-0 shown in fig. 6B.
In addition, while the electronic device 1002 receives the Home gesture, the content of the currently displayed user interface 2-1 may change its display scale and display position following the user's touch point on the display screen; once the electronic device 1002 can no longer detect the touch point, it switches the currently displayed user interface 2-1 to the user interface 2-0. A window switching effect may also be present during the interface switch: for example, the transparency of the content displayed in the user interface 2-1 gradually increases while the transparency of the content displayed in the user interface 2-0 gradually decreases, and the window containing the user interface 2-1 may shrink back toward its desktop icon as a return animation.
While the electronic device 1002 returns to the desktop main interface according to the user's Home gesture, the electronic device 1001 may also return to the desktop main interface of the electronic device 1001 in response to the Home gesture. As shown in fig. 6C-6D, following the user operation acting on the electronic device 1002, the electronic device 1001 may display an animation effect in which the content displayed in the user interface 1-1 shown in fig. 6C shrinks and moves, and then display the desktop main interface shown in fig. 6D, that is, the user interface 1-0. The user interface 1-0 displayed by the electronic device 1001 and the user interface 2-0 displayed by the electronic device 1002 in fig. 6B are the desktop main interfaces provided by the electronic device 1001 and the electronic device 1002, respectively.
As can be seen from fig. 6A-6D, the electronic device 1002 may return from the currently displayed user interface to the desktop main interface according to the user's Home gesture on the electronic device 1002. At the same time, the electronic device 1001 may return from its currently displayed user interface to the desktop main interface according to the Home gesture acting on the electronic device 1002. Moreover, the electronic device 1001 and the electronic device 1002 may display similar follow-hand animations according to the Home gesture.
It should be noted that, although the electronic device 1002 may display the same cast content as the electronic device 1001 once the screen-casting connection is established, when the electronic device 1002 receives the Home gesture, the electronic device 1001 and the electronic device 1002 each return, in response to the gesture, to the desktop main interface they themselves provide. In this way, on the electronic device 1002, the user can not only view the content provided by the electronic device 1001 but also, by controlling the electronic device 1002, view the desktop main interface provided by the electronic device 1002 itself, giving the user of the electronic device 1002 the feeling of controlling a local device and providing a seamless cross-device screen-casting experience.
As shown in fig. 7A, the electronic device 1002 may receive a Recents gesture applied by the user to the user interface 2-1. The Recents gesture may be expressed as a user operation sliding up from the bottom edge of the display screen and pausing, as shown in fig. 7A; in response to that operation, the electronic device 1002 displays the user interface 2-3 shown in fig. 7A. The user interface 2-3 is a multi-task interface used to display the recent tasks of the electronic device 1002, that is, the user's application browsing records over the most recent period of time. The user interface 2-3 may display content contained in the user interface provided by the settings application as well as content contained in the user interfaces provided by other applications, and may receive a leftward-sliding user operation to reveal the content provided by those other applications, as shown in fig. 7A.
In addition, while the electronic device 1002 receives the Recents gesture, the currently displayed user interface (i.e., the user interface 2-1) may show a gradually blurring animation, and the window containing the user interface 2-1 may follow the position of the user's touch point on the display screen, gradually reducing its display scale and changing its display position until it shrinks to a fixed card size positioned in the middle of the display screen; at the same time, other cards may fly in from the left side and approach the card in the middle of the display screen. At this point the electronic device 1002 may display the user interface 2-3, which may contain one or more cards. Each card displays the content shown after the window containing a user interface is scaled down, that is, the content of an application the user has browsed historically; the one or more cards may receive a user operation (such as a click operation) to switch the application corresponding to the card to the foreground and display the user interface corresponding to the card's content.
While the electronic device 1002 displays the recent tasks according to the user's Recents gesture, the electronic device 1001 may also display, in response to the Recents gesture, the user interface 1-3 shown in fig. 7B. This user interface is a multi-task interface used to display the recent tasks of the electronic device 1001, that is, the application browsing records of the electronic device 1001 over the most recent period of time.
As can be seen from fig. 7A and 7B, the electronic device 1002 may switch from the currently displayed user interface to its recent tasks according to the user's Recents gesture on the electronic device 1002. At the same time, the electronic device 1001 may switch from its currently displayed user interface to its own recent tasks according to the Recents gesture acting on the electronic device 1002. Moreover, the electronic device 1001 and the electronic device 1002 may display similar follow-hand animations according to the Recents gesture.
It should be noted that the electronic device 1001 and the electronic device 1002 each manage and operate a local task stack; therefore, in response to the Recents gesture, the recent tasks displayed by the electronic device 1001 and the electronic device 1002 may contain different application browsing records. For example, suppose the applications the electronic device 1001 has historically run include application C, the applications the electronic device 1002 has historically run include application A and application B, and the content over the screen-casting connection between the electronic device 1001 and the electronic device 1002 is provided by application D. The user interface 1-3 displayed by the electronic device 1001 then includes the historical browsing interfaces provided by applications C and D, and the user interface 2-3 displayed by the electronic device 1002 includes the historical browsing interfaces provided by applications A, B, and D. In this way, the user of the electronic device 1002 is given the seamless screen-casting experience of operating a local device.
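The example above follows directly from each device keeping its own task stack; a minimal Java sketch of that bookkeeping is shown below, with the application names taken from the example and everything else invented for illustration.

```java
import java.util.ArrayList;
import java.util.List;

// Sketch of per-device task stacks; the cast application D appears in both.
public class RecentTasks {

    /** Applications run locally on this device, most recent last. */
    private final List<String> localStack = new ArrayList<>();

    public void onTaskStarted(String app) {
        localStack.remove(app); // an existing entry moves back to the top
        localStack.add(app);
    }

    /** Recent tasks shown for a Recents gesture; purely local state. */
    public List<String> recents() {
        return new ArrayList<>(localStack);
    }

    public static void main(String[] args) {
        RecentTasks device1001 = new RecentTasks(); // screen-casting device
        RecentTasks device1002 = new RecentTasks(); // screen-cast device
        device1001.onTaskStarted("C");
        device1001.onTaskStarted("D"); // D provides the cast content
        device1002.onTaskStarted("A");
        device1002.onTaskStarted("B");
        device1002.onTaskStarted("D"); // D is also a task on the cast device
        System.out.println(device1001.recents()); // [C, D]
        System.out.println(device1002.recents()); // [A, B, D]
    }
}
```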
As shown in fig. 8A, the electronic device 1002 may receive a QuickSwitch gesture applied by the user to the user interface 2-1. The QuickSwitch gesture may be expressed as a user operation sliding laterally along the bottom edge of the display screen, as shown in fig. 8A; in response to that operation, the electronic device 1002 quickly switches applications and displays a window transition animation as shown in fig. 8A. When the QuickSwitch gesture slides from left to right, the window containing the currently displayed user interface follows the user's touch point on the display screen, and the window containing the next user interface follows the currently displayed user interface, moving from left to right; after the user's finger leaves the display screen, the electronic device 1002 displays the switched user interface, that is, the user interface 2-4 shown in fig. 8B, which may be a user interface provided by an application the electronic device 1002 has browsed historically (such as a calculator application).
It may be appreciated that the QuickSwitch gesture may also be a right-to-left slide, in which case the window transition animation comprises the currently displayed user interface moving from right to left following the user's touch point on the display screen, and the next user interface appearing from the right and moving leftward following the currently displayed user interface. The embodiments of the present application do not limit the moving direction of the QuickSwitch gesture.
While the electronic device 1002 switches the user interface according to the user's QuickSwitch gesture, the electronic device 1001 may also display, in response to the QuickSwitch gesture, a window switching animation as shown in fig. 8C, which may be similar to the window transition animation shown in fig. 8A; after the user's finger leaves the display screen, the electronic device 1001 displays the switched user interface, that is, the user interface 1-4 shown in fig. 8D, which may be a user interface provided by an application the electronic device 1001 has browsed historically (such as a music playing application).
As can be seen from fig. 8A-8D, the electronic device 1002 can quickly switch the user interface between applications according to the user's QuickSwitch gesture on the electronic device 1002. At the same time, the electronic device 1001 can quickly switch the user interface between applications according to the QuickSwitch gesture acting on the electronic device 1002. Moreover, the electronic device 1001 and the electronic device 1002 may display similar follow-hand animations according to the QuickSwitch gesture.
Similar to the task stacks mentioned for fig. 7A-7B, because the electronic device 1001 and the electronic device 1002 each manage and operate a local task stack, when they respond to the QuickSwitch gesture the switched-to user interfaces may be provided by different applications. In this way, the user of the electronic device 1002 is given the seamless screen-casting experience of operating a local device.
It should be noted that fig. 5A-5D, fig. 6A-6D, fig. 7A-7B, and fig. 8A-8D above show user interfaces involved on the electronic device 1001 and the electronic device 1002 when the screen-casting mode between them is homologous screen projection. When the screen-casting mode between the electronic device 1001 and the electronic device 1002 is heterologous screen projection, the user interface displayed by the electronic device 1001 is not affected by the user operations the user applies to the electronic device 1002: the electronic device 1002 may still display the user interfaces shown in fig. 5A-5B, fig. 6A-6B, fig. 7A, and fig. 8A-8B, while the electronic device 1001 may leave its displayed user interface unchanged, for example always displaying the user interface 1-1 shown in fig. 3A; or the electronic device 1001 may detect user operations applied directly to the electronic device 1001 and change the content it displays. In addition, the embodiments of the present application do not limit the specific operations of the Back gesture, the Home gesture, the Recents gesture, or the QuickSwitch gesture, nor the follow-hand animations triggered by these gestures.
Fig. 9 shows a hardware configuration diagram of the electronic device 100.
The electronic device 100 may be a mobile phone, a tablet, a desktop computer, a laptop computer, a handheld computer, a notebook computer, an ultra-mobile personal computer (ultra-mobile personal computer, UMPC), or a netbook, and may also be a cellular telephone, a personal digital assistant (personal digital assistant, PDA), an augmented reality (augmented reality, AR) device, a virtual reality (virtual reality, VR) device, an artificial intelligence (artificial intelligence, AI) device, a wearable device, a vehicle-mounted device, a smart home device, and/or a smart city device; the embodiments of the present application do not particularly limit the specific type of the electronic device.
The electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (universal serial bus, USB) interface 130, a charge management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, keys 190, a motor 191, an indicator 192, a camera 193, a display 194, and a subscriber identity module (subscriber identification module, SIM) card interface 195, etc. The sensor module 180 may include a pressure sensor 180A, a gyro sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
It should be understood that the illustrated structure of the embodiment of the present invention does not constitute a specific limitation on the electronic device 100. In other embodiments of the present application, electronic device 100 may include more or fewer components than shown, or certain components may be combined, or certain components may be split, or different arrangements of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor 110 may include one or more processing units, such as: the processor 110 may include an application processor (application processor, AP), a modem processor, a graphics processor (graphics processing unit, GPU), an image signal processor (image signal processor, ISP), a controller, a video codec, a digital signal processor (digital signal processor, DSP), a baseband processor, and/or a neural network processor (neural-network processing unit, NPU), etc. Wherein the different processing units may be separate devices or may be integrated in one or more processors.
The controller can generate operation control signals according to the instruction operation codes and the time sequence signals to finish the control of instruction fetching and instruction execution.
A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that the processor 110 has just used or recycled. If the processor 110 needs to reuse the instruction or data, it can be called directly from the memory. Repeated accesses are avoided and the latency of the processor 110 is reduced, thereby improving the efficiency of the system.
The charge management module 140 is configured to receive a charge input from a charger.
The power management module 141 is used for connecting the battery 142, and the charge management module 140 and the processor 110.
The wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the electronic device 100 may be used to cover a single or multiple communication bands. Different antennas may also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed into a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 may provide a solution for wireless communication including 2G/3G/4G/5G, etc., applied to the electronic device 100. The mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (low noise amplifier, LNA), etc. The mobile communication module 150 may receive electromagnetic waves from the antenna 1, perform processes such as filtering, amplifying, and the like on the received electromagnetic waves, and transmit the processed electromagnetic waves to the modem processor for demodulation. The mobile communication module 150 can amplify the signal modulated by the modem processor, and convert the signal into electromagnetic waves through the antenna 1 to radiate. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be provided in the same device as at least some of the modules of the processor 110.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating the low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then transmits the demodulated low frequency baseband signal to the baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then transferred to the application processor. The application processor outputs sound signals through an audio device (not limited to the speaker 170A, the receiver 170B, etc.), or displays images or video through the display screen 194. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be provided in the same device as the mobile communication module 150 or other functional module, independent of the processor 110.
The wireless communication module 160 may provide solutions for wireless communication including wireless local area network (wireless local area networks, WLAN) (e.g., wireless fidelity (wireless fidelity, wi-Fi) network), bluetooth (BT), global navigation satellite system (global navigation satellite system, GNSS), frequency modulation (frequency modulation, FM), near field wireless communication technology (near field communication, NFC), infrared technology (IR), etc., as applied to the electronic device 100. The wireless communication module 160 may be one or more devices that integrate at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, demodulates and filters the electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, frequency modulate it, amplify it, and convert it to electromagnetic waves for radiation via the antenna 2.
In some embodiments, antenna 1 and mobile communication module 150 of electronic device 100 are coupled, and antenna 2 and wireless communication module 160 are coupled, such that electronic device 100 may communicate with a network and other devices through wireless communication techniques. The wireless communication techniques may include the Global System for Mobile communications (global system for mobile communications, GSM), general packet radio service (general packet radio service, GPRS), code division multiple access (code division multiple access, CDMA), wideband code division multiple access (wideband code division multiple access, WCDMA), time division code division multiple access (time-division code division multiple access, TD-SCDMA), long term evolution (long term evolution, LTE), BT, GNSS, WLAN, NFC, FM, and/or IR techniques, among others. The GNSS may include a global satellite positioning system (global positioning system, GPS), a global navigation satellite system (global navigation satellite system, GLONASS), a beidou satellite navigation system (beidou navigation satellite system, BDS), a quasi zenith satellite system (quasi-zenith satellite system, QZSS) and/or a satellite based augmentation system (satellite based augmentation systems, SBAS).
The electronic device 100 implements display functions through a GPU, a display screen 194, an application processor, and the like. The GPU is a microprocessor for image processing, and is connected to the display 194 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
The display screen 194 is used to display images, videos, and the like. The display screen 194 includes a display panel. The display panel may employ a liquid crystal display (liquid crystal display, LCD), an organic light-emitting diode (organic light-emitting diode, OLED), an active-matrix organic light-emitting diode (active-matrix organic light emitting diode, AMOLED), a flexible light-emitting diode (flexible light-emitting diode, FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (quantum dot light emitting diodes, QLED), or the like. In some embodiments, the electronic device may include 1 or N display screens 194, where N is a positive integer greater than 1.
The electronic device 100 may implement photographing functions through an ISP, a camera 193, a video codec, a GPU, a display screen 194, an application processor, and the like.
The ISP is used to process data fed back by the camera 193.
The camera 193 is used to capture still images or video.
The digital signal processor is used for processing digital signals, and can process other digital signals besides digital image signals.
Video codecs are used to compress or decompress digital video.
The NPU is a neural-network (NN) computing processor, and can rapidly process input information by referencing a biological neural network structure, for example, referencing a transmission mode between human brain neurons, and can also continuously perform self-learning.
The internal memory 121 may include one or more random access memories (random access memory, RAM) and one or more non-volatile memories (NVM).
The external memory interface 120 may be used to connect external non-volatile memory to enable expansion of the memory capabilities of the electronic device 100. The external nonvolatile memory communicates with the processor 110 through the external memory interface 120 to implement a data storage function. For example, files such as music and video are stored in an external nonvolatile memory.
The electronic device 100 may implement audio functions through an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, an application processor, and the like. Such as music playing, recording, etc.
The audio module 170 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal.
The speaker 170A, also referred to as a "horn," is used to convert audio electrical signals into sound signals.
A receiver 170B, also referred to as a "earpiece", is used to convert the audio electrical signal into a sound signal.
The microphone 170C, also referred to as a "mike" or a "mic", is used to convert sound signals into electrical signals.
The earphone interface 170D is used to connect a wired earphone.
The pressure sensor 180A is used to sense a pressure signal, and may convert the pressure signal into an electrical signal.
The gyro sensor 180B may be used to determine a motion gesture of the electronic device 100.
The air pressure sensor 180C is used to measure air pressure. In some embodiments, electronic device 100 calculates altitude from barometric pressure values measured by barometric pressure sensor 180C, aiding in positioning and navigation.
The magnetic sensor 180D includes a hall sensor. The electronic device 100 may detect the opening and closing of the flip cover using the magnetic sensor 180D.
The acceleration sensor 180E may detect the magnitude of acceleration of the electronic device 100 in various directions (typically three axes). The magnitude and direction of gravity may be detected when the electronic device 100 is stationary. The electronic equipment gesture recognition method can also be used for recognizing the gesture of the electronic equipment, and is applied to horizontal and vertical screen switching, pedometers and other applications.
A distance sensor 180F for measuring a distance.
The proximity light sensor 180G may include, for example, a Light Emitting Diode (LED) and a light detector, such as a photodiode.
The ambient light sensor 180L is used to sense ambient light level.
The fingerprint sensor 180H is used to collect a fingerprint.
The temperature sensor 180J is for detecting temperature.
The touch sensor 180K is also referred to as a "touch device". The touch sensor 180K may be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 form a touch screen, also called a "touchscreen". The touch sensor 180K is used to detect a touch operation acting on or near it. The touch sensor may transfer the detected touch operation to the application processor to determine the touch event type. Visual output related to the touch operation may be provided through the display screen 194. In other embodiments, the touch sensor 180K may also be disposed on the surface of the electronic device 100 at a position different from the display screen 194.
The bone conduction sensor 180M may acquire a vibration signal.
The keys 190 include a power-on key, a volume key, etc.
The motor 191 may generate a vibration cue.
The indicator 192 may be an indicator light, may be used to indicate a state of charge, a change in charge, a message indicating a missed call, a notification, etc.
The SIM card interface 195 is used to connect a SIM card.
In the embodiment of the present application, when the electronic device 100 is the electronic device 1001:
the processor 110 may be configured to determine the interaction parameters of the screen-cast device according to the device information carried in an Input event, identify the user gesture according to the interaction parameters and the Input event, compute from the identified gesture the parameters of the follow-hand animation it triggers, and perform the follow-hand drawing according to those parameters.
The mobile communication module 150 and the wireless communication module 160 may be configured to send a screen-casting connection request to other devices and establish a screen-casting connection with them, to obtain the Input events sent by the screen-cast device, and to send the follow-hand drawing parameters to the screen-cast device.
The display screen 194 may be used to display the user interfaces involved in the screen-casting process as well as the follow-hand animation triggered by the user operation acting on the screen-cast device.
In the embodiment of the present application, when the electronic device 100 is the electronic device 1002:
the processor 110 may be configured to generate Input events based on user operations acting on the electronic device 1002, and to perform the follow-hand drawing based on the parameters associated with the follow-hand animation.
The mobile communication module 150 and the wireless communication module 160 may be configured to obtain the screen-casting connection request sent by another device and establish a screen-casting connection with it, to send Input events to the screen-casting device, and to obtain the follow-hand drawing parameters sent by the screen-casting device.
The touch sensor 180K may be used to detect user operations acting on or near it. The user operations may include, but are not limited to, a Back gesture, a Home gesture, a Recents gesture, or a QuickSwitch gesture.
The display screen 194 may be used to display the user interfaces involved in the screen-casting process as well as the follow-hand animation triggered by the user operation the user applies on the electronic device 1002.
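Taken together, the module responsibilities above amount to a simple exchange: the electronic device 1002 forwards device-tagged Input events, and the electronic device 1001 answers with follow-hand drawing parameters. The sketch below illustrates that exchange; the record types, the Link interface, and the placeholder gesture computation are all assumptions, not the recognition logic of this application.

```java
// Hypothetical sketch of the cross-device Input event exchange.
public class CastInputProtocol {

    /** An input event tagged with the device it originated on. */
    public record InputEvent(String deviceId, float x, float y, String action) { }

    /** Parameters the screen-casting device computes for follow-hand drawing. */
    public record DrawParams(float windowScale, float windowX, float windowY) { }

    public interface Link {
        void sendEvent(InputEvent e);
        void sendParams(DrawParams p);
    }

    /** Screen-cast device side: wrap a local touch and send it to the source. */
    public static void forwardTouch(Link link, float x, float y, String action) {
        link.sendEvent(new InputEvent("electronic-device-1002", x, y, action));
    }

    /** Screen-casting device side: recognize the gesture, answer with parameters. */
    public static void onEvent(Link link, InputEvent e) {
        // A real implementation would run gesture recognition here; this
        // placeholder merely scales the window in proportion to the y coordinate.
        float scale = Math.max(0.3f, 1f - e.y() / 2000f);
        link.sendParams(new DrawParams(scale, e.x(), e.y()));
    }
}
```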
The following describes the module interaction process between devices under heterologous screen projection and homologous screen projection with reference to fig. 10 and fig. 11.
Fig. 10 shows a module interaction diagram between devices under heterologous screen projection, and fig. 11 shows a module interaction diagram between devices under homologous screen projection.
As shown in fig. 10, the module interaction diagram illustrates the module interaction between the electronic device 1001 and the electronic device 1002 after they establish a heterologous screen-casting connection. Illustratively, in the module interaction diagram shown in fig. 10, the electronic device 1001 may be a mobile phone, and the electronic device 1002 may be a portable screen. For the description of heterologous screen projection between the mobile phone and the portable screen, reference may be made to the foregoing related description of fig. 2B, which is not repeated here.
The module interaction diagram relates to the application layer, framework layer, underlying library, and driver layer of the electronic device 1001. Wherein:
The application layer may be used to provide recognition of user gestures, navigation driving, and follow-hand animation for application interfaces. The application layer may include a series of application packages; in the electronic device 1001, the application layer may include a user interface, settings, application programs, and a gesture navigation service module (not shown). After the screen-casting function is started, the gesture navigation service module may generate two instances by instantiation: a local instance and a screen-casting instance. The local instance is used to recognize user gestures received locally by the electronic device 1001 and to compute the user interface switched to by the gesture and the associated follow-hand drawing parameters; the screen-casting instance is used to recognize user gestures received by another device, such as the electronic device 1002, and to compute the user interface switched to by the gesture and the associated follow-hand drawing parameters.
Table 1 shows the detailed differences and relations between the local instance and the screen-casting instance:
TABLE 1

Item                        Local instance                                 Screen-casting instance
Bound user input            Input channel of the screen-casting device     Input channel of the screen-cast device
Bound Display ID            Display ID of the screen-casting device        Display ID of the screen-cast device
Application switched by     Application in the task stack of the           Application in the task stack of the
user operation              screen-casting end                             screen-cast end
Follow-hand animation       Computed from the window display parameters    Computed from the window display parameters
parameters                  of the screen-casting end; drawn on the        of the screen-cast end; drawn on the
                            screen-casting end                             screen-cast end
As can be seen from Table 1, for the local instance serving the local screen-casting device: the bound user input is the input channel of the screen-casting device, that is, the local instance receives the Input events generated by the user's operations on the electronic device 1001. The bound Display ID is the Display ID of the screen-casting device, that is, the output the local instance computes from the Input event is finally delivered to the electronic device 1001 for display. For the application switching triggered by a user operation, the switching object bound and driven is an application present in a task of the screen-casting end, that is, when the local instance triggers an application switch according to the Input event, the switched-to application is an application in the task stack managed and operated in the electronic device 1001. The parameters of the follow-hand animation triggered by the Input event (such as the window scaling ratio, the moving track, the speed, and the moving position of the window) are computed according to the window display parameters of the screen-casting end (such as the size of the application display window and the position of the application icon on the desktop), and the follow-hand drawing is displayed on the screen-casting end; that is, the local instance computes the follow-hand animation parameters triggered by the Input event according to the interaction parameters of the electronic device 1001 and displays the follow-hand drawing on the electronic device 1001.
Similarly, for the screen-casting instance serving the screen-cast device: the bound user input is the input channel of the screen-cast device, that is, the screen-casting instance receives the Input events generated by the user's operations on the electronic device 1002. The bound Display ID is the Display ID of the screen-cast device, that is, the output the screen-casting instance computes from the Input event is finally delivered to the electronic device 1002 for display. For the application switching triggered by a user operation, the switching object bound and driven is an application present in a task of the screen-cast end, that is, when the screen-casting instance triggers an application switch according to the Input event, the switched-to application is an application in the task stack managed and operated in the electronic device 1002. The parameters of the follow-hand animation triggered by the Input event (such as the window scaling ratio, the moving track, the speed, and the moving position of the window) are computed according to the window display parameters of the screen-cast end, and the follow-hand animation is displayed on the screen-cast end; that is, the screen-casting instance computes the follow-hand animation parameters triggered by the Input event according to the interaction parameters of the electronic device 1002 and displays the follow-hand animation on the electronic device 1002.
It should be noted that the local instance and the screen-casting instance serve, respectively, the local screen-casting device, i.e., the electronic device 1001, and the screen-cast device connected to it, i.e., the electronic device 1002. When the electronic device 1001 establishes screen-casting connections with multiple devices, that is, when the cast content provided by the electronic device 1001 is cast to multiple devices for display, the application layer of the electronic device 1001 may include multiple screen-casting instances, for example screen-casting instance 1, screen-casting instance 2, and so on, each serving one screen-cast device with which a screen-casting connection has been established; the embodiments of the present application do not limit the number of local instances or screen-casting instances.
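A minimal sketch of this instantiation scheme, one local instance plus one screen-casting instance per cast connection, each bound to its own input channel and Display ID, is given below; all names are invented for illustration.

```java
import java.util.ArrayList;
import java.util.List;

// Sketch of per-connection instance creation under stated assumptions.
public class GestureNavigationService {

    /** One recognition pipeline bound to a single input channel and display. */
    public static class Instance {
        final String inputChannel; // whose touches this instance consumes
        final int displayId;       // where its output is finally displayed

        Instance(String inputChannel, int displayId) {
            this.inputChannel = inputChannel;
            this.displayId = displayId;
        }
    }

    private final Instance localInstance;
    private final List<Instance> castingInstances = new ArrayList<>();

    public GestureNavigationService(int localDisplayId) {
        this.localInstance = new Instance("local-input", localDisplayId);
    }

    /** Called once per screen-cast device, so several connections coexist. */
    public Instance addCastingInstance(String castDeviceInput, int castDisplayId) {
        Instance instance = new Instance(castDeviceInput, castDisplayId);
        castingInstances.add(instance);
        return instance;
    }
}
```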
Specifically, the local instance and the screen-throwing instance each comprise: a gesture service module, a gesture interaction management module, a gesture recognition module, a gesture event processing module, and a gesture animation module. The gesture service module may be configured to receive the Input event sent by the gesture navigation management module. The gesture interaction management module is the controller used for recognizing and analyzing the Input event within the instance. The gesture recognition module may be used to recognize the gesture corresponding to the Input event. The gesture event processing module may be used to determine, according to the gesture, the switching of the application and the hand-following animation during the switching. The gesture animation module may be used to determine the relevant parameters of the hand-following animation according to the parameters of the gesture and the switched application.
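By way of illustration only, the pipeline formed by these five modules might be sketched as follows in Kotlin. All class, function, and field names are assumptions made for exposition, not definitions from this application, and the recognition logic is a placeholder.

```kotlin
// Hypothetical sketch of the module pipeline inside a local or screen-throwing instance.
data class InputEvent(val type: Int, val x: Float, val y: Float,
                      val timeMs: Long, val deviceId: String)

enum class Gesture { BACK, HOME, RECENTS, QUICK_SWITCH }
enum class SwitchType { PREVIOUS_PAGE, DESKTOP, RECENT_TASKS, ADJACENT_APP }

class GestureNavigationInstance(private val boundDisplayId: Int) {

    // Gesture service module: receives Input events from the gesture
    // navigation management module.
    fun onInputEvent(e: InputEvent) = manageInteraction(e)

    // Gesture interaction management module: the controller that drives
    // recognition and analysis of the Input event.
    private fun manageInteraction(e: InputEvent) {
        val gesture = recognize(e) ?: return
        val switchType = processEvent(gesture)
        animate(gesture, switchType)
    }

    // Gesture recognition module: classifies the Input event as a gesture.
    // A real classifier would use coordinates, time, and thresholds; this
    // placeholder only sketches the interface.
    private fun recognize(e: InputEvent): Gesture? =
        if (e.type == 0) Gesture.HOME else null

    // Gesture event processing module: determines the application switch.
    private fun processEvent(g: Gesture): SwitchType = when (g) {
        Gesture.BACK -> SwitchType.PREVIOUS_PAGE
        Gesture.HOME -> SwitchType.DESKTOP
        Gesture.RECENTS -> SwitchType.RECENT_TASKS
        Gesture.QUICK_SWITCH -> SwitchType.ADJACENT_APP
    }

    // Gesture animation module: derives the hand-following animation
    // parameters for the bound Display ID (computation omitted).
    private fun animate(g: Gesture, s: SwitchType) {
        println("animate $s for gesture $g on display $boundDisplayId")
    }
}
```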
The framework layer provides an application programming interface (application programming interface, API) and a programming framework for the application programs of the application layer. The framework layer includes some predefined functions. In the electronic device 1001, the framework layer may include: an Input framework, a gesture navigation management module, an application management module, a window management module, a view management module, and a screen-throwing management module (screen-throwing Source end).
Specifically, the Input framework may be used to further distribute the Input event and identify the service corresponding to the Input event. The gesture navigation management module may be used to further transmit the Input event to the gesture navigation service module of the application layer according to the service type of the Input event for further processing; in addition, the gesture navigation management module may also determine, according to the device information carried in the Input event, whether to transmit the Input event to the local instance or to a screen-throwing instance. The application management module may be used to manage the running of applications during application switching. The window management module may be used to manage window programs; it may obtain the size of the display screen, determine whether there is a status bar, lock the screen, capture the screen, and so on. The view management module may include visual controls, such as controls for displaying text and controls for displaying pictures, and may be used to build an application. A display interface may be composed of one or more views; for example, a display interface including a text notification icon may include a view displaying text and a view displaying a picture. The screen-throwing management module (screen-throwing Source end) may be used to transmit the relevant parameters of the hand-following animation to the screen-thrown device through the screen-throwing connection, so that the screen-thrown device can refresh the screen-throwing content according to the relevant parameters and display the hand-following animation triggered by the user gesture.
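The routing decision made by the gesture navigation management module can be illustrated with a minimal sketch, assuming the Input event carries a device identifier; the names and the callback-based wiring are illustrative assumptions only.

```kotlin
data class InputEvent(val type: Int, val x: Float, val y: Float,
                      val timeMs: Long, val deviceId: String)

// Routes an Input event to the local instance or the screen-throwing instance
// that serves the device the event originated on.
class GestureNavigationManager(
    private val localDeviceId: String,
    private val localInstance: (InputEvent) -> Unit,
    private val castInstances: Map<String, (InputEvent) -> Unit> // one per screen-thrown device
) {
    fun dispatch(e: InputEvent) {
        if (e.deviceId == localDeviceId) {
            localInstance(e)                      // event originated on the local touch screen
        } else {
            castInstances[e.deviceId]?.invoke(e)  // e.g. screen-throwing instance 1, 2, ...
        }
    }
}
```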
The underlying library may comprise a plurality of functional modules. Specifically, the underlying library may include: an Input distribution module, a graphics drawing module, and a hardware channel. The Input distribution module may be configured to filter Input events according to the different event processors registered for the Input events. The graphics drawing module may be used to draw a view according to the relevant parameters of the hand-following animation. The hardware channel may be used to implement cross-device transmission of Input events over the FlashLight connection.
The driver layer may be used to provide the device with input recognition and output display capabilities. Specifically, the driver layer may include: a TP driver and a display module. The TP driver may be used to detect a user gesture acting on the display screen and generate an Input event based on the user gesture. The display module may be used to drive the display screen of the device to display the user interface and the hand-following animation triggered by the user gesture.
In addition, the module interaction diagram relates to the framework layer, underlying library, and driver layer of the electronic device 1002. Wherein:
the framework layer includes some predefined functions. In the electronic device 1002, the framework layer may include: a window management module and a screen-throwing management module (screen-throwing Sink end). Specifically, the window management module may be used to manage window programs. The screen-throwing management module (screen-throwing Sink end) may be used to receive, through the screen-throwing connection, the relevant parameters of the hand-following animation sent by the screen-throwing device.
The underlying library may comprise: an Input distribution module, a graphics drawing module, and a hardware channel.
The driver layer may include: a TP driver and a display module.
For a description of each module mentioned in the electronic device 1002, reference may be made to the corresponding description of the same module in the electronic device 1001, which is not repeated here.
The following describes the interaction process of each module in the screen projection process through the data flow directions of 4 data flows in the modules shown in fig. 10.
Wherein the local event delivery flow and the local gesture control flow described below are, respectively, the input data flow and the output data flow from when the screen-throwing device locally detects the user gesture to when it displays the hand-following animation in response to the user gesture. In addition, the screen-throwing event delivery flow and the screen-throwing gesture control flow are, respectively, the input data flow and the output data flow from when the screen-thrown device detects the user gesture to when the hand-following animation is displayed in response to the user gesture.
1) Local event delivery flow
In the electronic device 1001, the TP driver may detect a user gesture acting on the touch screen of the electronic device 1001 and generate an Input event based on the user gesture, where the Input event may include information such as the event type, coordinates, time, and device information. The TP driver transmits the Input event to the Input distribution module, which filters the Input event according to the event processors registered for Input events and transmits the unfiltered Input event to the Input framework. The Input framework identifies the type of the Input event and transmits the Input event belonging to the gesture navigation type to the gesture navigation management module, which transmits the Input event to the local instance according to the device information carried in the Input event. In the local instance, the gesture service module sends the Input event to the gesture interaction management module, and the gesture interaction management module sends the Input event to the gesture recognition module.
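A minimal sketch of this delivery chain, under the assumption that event processors are simple predicates that consume events, might look as follows; the types and constants are hypothetical.

```kotlin
data class InputEvent(val type: Int, val x: Float, val y: Float,
                      val timeMs: Long, val deviceId: String)

// An event processor returns true when it consumes (filters out) the event.
typealias EventProcessor = (InputEvent) -> Boolean

const val TYPE_GESTURE_NAVIGATION = 1

class InputDistribution(private val processors: List<EventProcessor>) {
    fun distribute(e: InputEvent, inputFramework: (InputEvent) -> Unit) {
        // Only events that no registered processor filters continue onward.
        if (processors.none { it(e) }) inputFramework(e)
    }
}

fun inputFramework(e: InputEvent, toGestureNavigation: (InputEvent) -> Unit) {
    // The Input framework identifies the event type and forwards gesture
    // navigation events to the gesture navigation management module.
    if (e.type == TYPE_GESTURE_NAVIGATION) toGestureNavigation(e)
}
```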
2) Local gesture control flow
In the electronic device 1001, after acquiring the Input event, the gesture recognition module in the local instance may recognize, according to the information carried in the Input event, such as the event type, coordinates, and time, the gesture corresponding to the Input event, including but not limited to: a Back gesture, a Home gesture, a Recents gesture, a QuickSwitch gesture, and the like. The gesture recognition module in the local instance then passes the recognized gesture to the gesture event processing module in the local instance, which determines the application switching type from the gesture, including but not limited to: returning to the upper page, returning to the desktop, entering the recent tasks, quickly switching applications, and the like. According to the application switching type, the gesture event processing module controls the running of the switched application through the application management module, and sends the parameters of the gesture and the switched application to the gesture animation module in the local instance. The gesture animation module determines the relevant parameters of the hand-following animation according to the gesture and sends them to the window management module, the window management module sends the relevant parameters to the graphics drawing module, the graphics drawing module draws the view according to the relevant parameters, and the hand-following animation is displayed through the display module, refreshing the display content in the display screen of the electronic device 1001.
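The mapping from recognized gesture to application switching type might be sketched as follows; the interface methods are assumed names for illustration, not APIs of this application.

```kotlin
enum class Gesture { BACK, HOME, RECENTS, QUICK_SWITCH }

// Stand-in for the application management module, which controls the running
// of the switched application.
interface ApplicationManagement {
    fun returnToUpperPage()
    fun returnToDesktop()
    fun enterRecentTasks()
    fun quickSwitchApplication()
}

// Gesture event processing step: the switch type decided by the gesture is
// carried out through the application management module.
fun applySwitch(gesture: Gesture, appManagement: ApplicationManagement) = when (gesture) {
    Gesture.BACK -> appManagement.returnToUpperPage()
    Gesture.HOME -> appManagement.returnToDesktop()
    Gesture.RECENTS -> appManagement.enterRecentTasks()
    Gesture.QUICK_SWITCH -> appManagement.quickSwitchApplication()
}
```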
3) Screen-throwing event delivery flow
In the electronic device 1002, the TP driver may detect a user gesture acting on the touch screen of the electronic device 1002 and generate an Input event based on the user gesture, where the Input event may include information such as the event type, coordinates, time, and device information. The TP driver transmits the Input event to the Input distribution module, which filters the Input event according to the event processors registered for Input events and sends the unfiltered Input event to the hardware channel, and the hardware channel sends the Input event to the electronic device 1001 through the FlashLight connection.
In the electronic device 1001, the hardware channel may receive the Input event sent by the electronic device 1002 and send it to the Input distribution module, the Input distribution module sends the Input event to the Input framework, the Input framework identifies the type of the Input event and sends the Input event belonging to the gesture navigation type to the gesture navigation management module, and the gesture navigation management module sends the Input event to the screen-throwing instance according to the device information carried in the Input event. In the screen-throwing instance, the gesture service module sends the Input event to the gesture interaction management module, and the gesture interaction management module sends the Input event to the gesture recognition module.
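Assuming a simple socket-based hardware channel and an ad-hoc wire format, the cross-device transmission of an Input event might be sketched as follows; the FlashLight transport details are not specified by the application, so everything shown here is an assumption.

```kotlin
import java.io.DataInputStream
import java.io.DataOutputStream
import java.net.Socket

data class InputEvent(val type: Int, val x: Float, val y: Float,
                      val timeMs: Long, val deviceId: String)

// Screen-thrown side: serialize the Input event and ship it through the
// hardware channel over the FlashLight connection.
fun sendInputEvent(socket: Socket, e: InputEvent) {
    val out = DataOutputStream(socket.getOutputStream())
    out.writeInt(e.type)
    out.writeFloat(e.x)
    out.writeFloat(e.y)
    out.writeLong(e.timeMs)
    out.writeUTF(e.deviceId)   // device information identifies the sender device
    out.flush()
}

// Screen-throwing side: the hardware channel reconstructs the event and hands
// it to the local Input distribution module.
fun receiveInputEvent(socket: Socket): InputEvent {
    val inp = DataInputStream(socket.getInputStream())
    return InputEvent(inp.readInt(), inp.readFloat(), inp.readFloat(),
                      inp.readLong(), inp.readUTF())
}
```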
4) Screen-throwing gesture control flow
In the electronic device 1001, after acquiring the Input event, the gesture recognition module in the screen-throwing instance may recognize, according to the information carried in the Input event, such as the event type, coordinates, and time, the gesture corresponding to the Input event, including but not limited to: a Back gesture, a Home gesture, a Recents gesture, a QuickSwitch gesture, and the like. The gesture recognition module in the screen-throwing instance then passes the recognized gesture to the gesture event processing module in the screen-throwing instance, which determines the application switching type from the gesture, including but not limited to: returning to the upper page, returning to the desktop, entering the recent tasks, quickly switching applications, and the like. According to the application switching type, the gesture event processing module controls the running of the switched application through the application management module, and sends the parameters of the gesture and the switched application to the gesture animation module in the screen-throwing instance. The gesture animation module determines the relevant parameters of the hand-following animation according to the gesture and sends them to the window management module, the window management module sends the relevant parameters to the screen-throwing management module (screen-throwing Source end), and the screen-throwing management module (screen-throwing Source end) sends the relevant parameters of the hand-following animation to the electronic device 1002.
In the electronic device 1002, the screen-throwing management module (screen-throwing Sink end) may receive the relevant parameters of the hand-following animation sent by the electronic device 1001 and send them to the window management module, and the window management module may send the relevant parameters to the graphics drawing module. The graphics drawing module draws the view according to the relevant parameters, and the hand-following animation is displayed through the display module, refreshing the display content in the display screen of the electronic device 1002.
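The Sink-end handoff from the received parameters to the drawing path might be sketched as follows; the callback wiring and the printed stand-in for rendering are illustrative assumptions.

```kotlin
data class AnimationParams(val windowScale: Float, val windowX: Float,
                           val windowY: Float, val speed: Float)

// Screen-throwing Sink end: hands received parameters to the window management
// module, which forwards them to the graphics drawing module for the redraw.
class SinkEnd(private val windowManagement: (AnimationParams) -> Unit) {
    fun onParamsReceived(p: AnimationParams) = windowManagement(p)
}

fun main() {
    val graphicsDrawing = { p: AnimationParams ->
        // Stand-in for drawing the view and refreshing the display screen.
        println("redraw window at (${p.windowX}, ${p.windowY}), scale ${p.windowScale}")
    }
    val sink = SinkEnd(graphicsDrawing)
    sink.onParamsReceived(AnimationParams(0.6f, 120f, 480f, 1.0f))
}
```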
From the above 4 data flows, it can be seen that, whether the screen-throwing device detects a user operation and displays the hand-following animation in response to it, or the screen-thrown device detects a user operation and displays the hand-following animation in response to it, the modules and the order through which the data flows pass are substantially the same, and the recognition and analysis of the gesture are both implemented in the corresponding modules of the screen-throwing device. The difference is that the generation of the Input event and the display of the hand-following animation are each realized through the corresponding modules in the respective devices, and in the process in which the screen-thrown device detects the user operation and displays the hand-following animation in response to it, the Input event and the relevant parameters of the hand-following animation are transmitted across devices through cross-device communication modules (such as the screen-throwing management module and the hardware channel).
As shown in fig. 11, the module interaction diagram illustrates the module interaction between the electronic device 1002 and the electronic device 1001 after the two establish a homologous screen-throwing connection. Illustratively, in the module interaction diagram shown in fig. 11, the electronic device 1001 may be a mobile phone, and the electronic device 1002 may be a large screen. For a description of the homologous screen projection between the mobile phone and the large screen, reference may be made to the foregoing related description of fig. 2A, which is not repeated here.
The module interaction diagram relates to the application layer, framework layer, underlying library, and driver layer of the electronic device 1002, and the application layer, framework layer, underlying library, and driver layer of the electronic device 1001. The modules included in the electronic device 1002 and the electronic device 1001 are basically the same, except that the screen-throwing management module in the electronic device 1002 is a screen-throwing Sink end, while the screen-throwing management module in the electronic device 1001 is a screen-throwing Source end. For details about each module, reference may be made to the related content in fig. 10, which is not repeated here.
The following describes the interaction process of each module before and during screen projection by the data flow directions of 4 data flows in the modules shown in fig. 11.
The local event delivery flow and the local gesture control flow described below are, respectively, the input data flow and the output data flow from when the electronic device 1002 detects a user gesture before screen projection to when it displays the hand-following animation in response to the user gesture. In addition, the screen-throwing event delivery flow and the screen-throwing gesture control flow are, respectively, the input data flow and the output data flow from when the electronic device 1002 detects a user gesture during screen projection to when the electronic device 1002 and the electronic device 1001 display the hand-following animation in response to the user gesture.
1) Local event delivery flow
In the electronic device 1002, the TP driver may detect a user gesture applied to a touch screen of the electronic device 1002 by a user, and generate an Input event based on the user gesture, where the TP driver may transmit the Input event to an Input distribution module, the Input distribution module may transmit the Input event to an Input frame, the Input frame may transmit the Input event to a gesture navigation management module, the gesture navigation management module may transmit the Input event to a local instance, and in the local instance, the gesture service module may transmit the Input event to a gesture interaction management module, and the gesture interaction management module may transmit the Input event to a gesture recognition module.
2) Local gesture control flow
In the electronic device 1002, after the gesture recognition module in the local instance acquires the Input event, it passes the recognized gesture to the gesture event processing module in the local instance. The gesture event processing module controls the running of the switched application through the application management module according to the application switching type, and may also pass the parameters of the gesture and the switched application to the gesture animation module in the local instance. The gesture animation module determines the relevant parameters of the hand-following animation according to the gesture and passes them to the window management module, the window management module passes them to the graphics drawing module, the graphics drawing module draws the view according to the relevant parameters, and the hand-following animation is displayed through the display module, refreshing the display content in the display screen of the electronic device 1002.
It should be noted that the local event delivery flow and the local gesture control flow in the electronic device 1002 are consistent with the local event delivery flow and the local gesture control flow in the electronic device 1001 shown in fig. 10. For details not mentioned here about the local event delivery flow and the local gesture control flow in the electronic device 1002, reference may be made to the related description for the electronic device 1001, which is not repeated.
3) Screen-throwing event delivery flow
In the electronic device 1002, the TP driver may detect a user gesture acting on the touch screen of the electronic device 1002, generate an Input event based on the user gesture, and transmit the Input event to the Input distribution module. The difference is that the Input distribution module no longer transmits the Input event to the Input framework, but transmits it to the hardware channel, and the hardware channel sends the Input event to the electronic device 1001 through the FlashLight connection.
In the electronic device 1001, a hardware channel may receive an Input event sent by the electronic device 1002 and send the Input event to an Input distribution module, where the Input distribution module sends the Input event to an Input frame, the Input frame sends the Input event to a gesture navigation management module, the gesture navigation management module transfers the Input event to a screen-throwing instance, in the screen-throwing instance, a gesture service module transfers the Input event to a gesture interaction management module, and the gesture interaction management module transfers the Input event to a gesture recognition module.
4) Screen-throwing gesture control flow
In the electronic device 1001, the gesture recognition module recognizes the user gesture according to the Input event and passes it to the gesture event processing module. The gesture event processing module controls the running of the switched application through the application management module according to the application switching type, and may also pass the parameters of the gesture and the switched application to the gesture animation module. The gesture animation module determines the relevant parameters of the hand-following animation according to the gesture and passes them to the window management module, and the window management module passes the relevant parameters to the screen-throwing management module (screen-throwing Source end), which sends the relevant parameters of the hand-following animation to the electronic device 1002. In addition, the window management module may also pass the relevant parameters to the local graphics drawing module, which draws the view according to the relevant parameters; the hand-following animation is displayed through the display module of the electronic device 1001, refreshing the display content in the display screen of the electronic device 1001.
In the electronic device 1002, the screen-throwing management module (screen-throwing Sink end) may receive the relevant parameters of the hand-following animation sent by the electronic device 1001 and send them to the window management module, and the window management module may send the relevant parameters to the graphics drawing module. The graphics drawing module draws the view according to the relevant parameters, and the hand-following animation is displayed through the display module, refreshing the display content in the display screen of the electronic device 1002.
It should be noted that the screen-throwing event delivery flow and the screen-throwing gesture control flow in fig. 11 are substantially the same as the data flow directions of the screen-throwing event delivery flow and the screen-throwing gesture control flow in fig. 10. The difference is that the window management module in the electronic device 1001 shown in fig. 11, in addition to sending the relevant parameters of the hand-following animation to the screen-throwing management module, also sends them to the graphics drawing module, so that not only can the electronic device 1002 display the hand-following animation, but the electronic device 1001 can also display the hand-following animation in response to the user gesture, thereby achieving the effect of homologous screen projection.
Fig. 12 shows a flowchart of a screen projection method according to an embodiment of the present application.
As shown in fig. 12, the method includes:
step one: establishing a screen-throwing connection
S101, the first device displays a user interface comprising first content.
In this embodiment of the present application, the first device may be configured to establish a screen-throwing connection with another device, and provide screen-throwing content (e.g., first content) to the other device (e.g., second device) for the other device to display the screen-throwing content. The description of the first device may refer to the related description of the foregoing screen-projection device, which is not repeated herein. In the implementation of the present application, the user interface including the first content displayed by the first device may also be referred to as a third user interface.
The first content can be content provided by an application installed in the first device, and the first device can send the first content to other devices for display through the screen-throwing connection after the screen-throwing connection with the other devices is established. Illustratively, the first content may be content provided by a setup application as shown in FIG. 3A, and the user interface may refer to user interface 1-1 as shown in FIG. 3A.
S102, the first device detects operation of starting a screen throwing function.
The screen dropping function may be used to drop screen dropping content of one device (e.g., a first device) onto another device (e.g., a second device) for display. In this way, the user can view the screen content provided by the device across the devices.
In this embodiment of the present application, the first device may provide a window (for example, a first window) including a screen-throwing control (for example, a first control), where the screen-throwing control may be used to trigger to open or close a screen-throwing function, and the user may determine under what condition to open or close the screen-throwing function through the screen-throwing control, so as to improve operability of the user.
By way of example, the operation of opening the screen-projecting function may refer to a series of operations as shown in fig. 3A to 3B, including a down-slide operation acting on the user interface 1-1 shown in fig. 3A, and a click operation acting on the control 111A shown in fig. 3B.
In response to this operation, the first device may turn on one or more of WLAN, bluetooth, NFC, or mobile networks in the wireless communication module 160 and may discover other devices that may establish a screen-cast connection through one or more of Wi-Fi direct, bluetooth, NFC, mobile networks.
In some embodiments, after the first device discovers a plurality of devices that can establish a screen-throwing connection, the identifiers of these devices may be displayed in a user interface, so that the user selects one or more devices from them to establish the screen-throwing connection, and the screen-throwing content of the first device is cast onto the display interfaces of the one or more devices for display. Illustratively, upon discovering a plurality of devices that can establish a screen-throwing connection, the first device may display a window 112 as shown in fig. 3C, where the window 112 may be used to display the identifiers of one or more screen-thrown devices and connection options, and the connection options may be used to select different devices to establish the screen-throwing connection. In this embodiment of the present application, the connection options of the one or more screen-thrown devices may also be referred to as one or more device options, where the one or more device options may include a first device option corresponding to the second device, and the first device may detect an operation (e.g., a fifth operation) acting on the first device option, which triggers the first device to establish the screen-throwing connection with the second device.
S103, responding to the detected operation of selecting the second device, and establishing screen-throwing connection between the first device and the second device.
Illustratively, this operation may refer to a user operation on the connection option 112B as shown in fig. 3C.
In response to the operation, the first device may establish a screen-drop connection with the second device through one or more of Wi-Fi, bluetooth, NFC, wireless communication technology in the mobile network.
After the first device and the second device establish the screen-throwing connection, capability negotiation may be performed based on the screen-throwing connection, and the first device may acquire the interaction parameters of the second device, so that the first device can identify the user operation corresponding to an Input event according to the interaction parameters. The interaction parameters may include, but are not limited to: display parameters, the designated position of a preset operation, and a threshold. The display parameters may indicate the size of the display area of the second device.
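For illustration only, the interaction parameters obtained through capability negotiation might be modeled as a simple record; the field names and pixel units are assumptions, not definitions from this application.

```kotlin
// Hypothetical record of the second device's interaction parameters.
data class InteractionParams(
    val displayWidthPx: Int,        // display parameters: size of the display area
    val displayHeightPx: Int,
    val backGestureEdgePx: Int,     // designated position of a preset operation
    val slideThresholdPx: Int       // threshold the sliding distance must reach
)
```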
It may be understood that the first device and the second device may also skip capability negotiation; the interaction parameters of different types of devices may be preset before the first device starts the screen-throwing function, for example, at the factory. The embodiments of the present application are not limited in this regard.
In some embodiments, after detecting the operation of selecting the second device, the first device may send a screen-throwing connection request to the second device. The second device may directly establish the screen-throwing connection with the first device according to the request, or the second device may display a prompt message asking the user whether to agree to establish the screen-throwing connection, and establish the screen-throwing connection with the first device after receiving the operation by which the user agrees. Illustratively, after receiving the screen-throwing connection request, the second device may display a window 201 as shown in fig. 3E, where the window 201 is used to prompt the user whether to agree to establish the screen-throwing connection. In the embodiments of the present application, the prompt information displayed in the window 201 may also be referred to as first prompt information.
S104, the second device displays a user interface comprising the first content based on the screen projection connection.
The first device may send the currently displayed content to the second device via the screen-throwing connection, so that the second device may display the screen-throwing content provided by the first device, i.e. the user interface including the first content, based on the screen-throwing connection. The user interface displayed by the second device based on the screen-throwing connection may be, for example, the user interface 2-1 shown in fig. 3F or the user interface 2-2 shown in fig. 4D.
It should be noted that when the screen-throwing content is displayed on the first device and the second device, the layout of the interface and the display proportion of the interface may differ, because the second device may adaptively adapt the screen-throwing content according to its own screen size when receiving it. Therefore, a certain difference may exist between different devices when displaying the same content, which provides the user with a more coordinated visual experience during screen projection.
It will be appreciated that after the first device and the second device establish the screen-throwing connection, the first device and the second device may simultaneously display the same screen-throwing content, e.g., the first content. In this way, the user can share the screen-throwing content to a plurality of devices for display. Alternatively, after the first device and the second device establish the screen-throwing connection, the first device and the second device may display different contents; for example, the first device may stop displaying the first content after sending it to the second device, in which case the second device displays a user interface containing the first content while the first device displays a user interface not containing the first content. In this way, the user can switch the display content of one device to another device for display, providing the user with different display effects on different devices.
In some embodiments, in the process that the first device and the second device synchronously display the same screen-throwing content, the first device can also receive a user operation to change the first content, and meanwhile, the second device synchronously displays the changed first content. Thus, the user can change the screen throwing content synchronously displayed on the screen throwing device and the screen throwing device by controlling the screen throwing device. For example, the user interface of the first device for changing the screen content according to the user operation may be referred to in fig. 4A to 4B, and the user interface of the second device for synchronously changing the screen content following the first device may be referred to in fig. 4C to 4D.
Step two: switching user interfaces according to user operations
S105, the second device detects user operation.
In the embodiments of the present application, the user operation may refer to a touch operation acting on the display screen of the second device, and the second device may detect the touch operation acting on or near it through the touch sensor 180K. The user operation may refer to a gesture navigation operation in system navigation; specifically, the user operation may include, but is not limited to, the following four gesture types: a Back gesture, a Home gesture, a Recents gesture, a QuickSwitch gesture, and the like. The Back gesture may be used to trigger returning to the previous user interface, the Home gesture may be used to trigger returning to the desktop main interface, the Recents gesture may be used to trigger displaying the multi-task interface, and the QuickSwitch gesture may be used to trigger quickly switching applications. That is, the user operation may be used to trigger the switching of the user interface.
Illustratively, the Back gesture may refer to a user operation sliding inward from the left edge of the display screen as shown in fig. 5A, the Home gesture may refer to a user operation sliding upward from the bottom edge of the display screen as shown in fig. 6A, the Recents gesture may refer to a user operation sliding upward from the bottom edge of the display screen and pausing as shown in fig. 7A, and the QuickSwitch gesture may refer to a user operation sliding laterally along the bottom edge of the display screen as shown in fig. 6B. It will be appreciated that the Back gesture, Home gesture, Recents gesture, and QuickSwitch gesture may also be presented as other gesture operations, which are not limited by the embodiments of the present application.
S106, the second device generates first information based on the user operation.
The second device may generate an Input event based on the user operation, and the first information may refer to information of an event type, coordinates, time, device information, and the like included in the Input event, and the first information is used to indicate the user operation. The first information may be used for the first device to determine the gesture type corresponding to the user operation, and details regarding the information included in the Input event may be referred to in the foregoing, which is not described herein again.
In a specific implementation, the second device may detect a user operation of the user on the display screen through a TP driver of the driver layer, and generate the Input event based on the user operation.
And S107, the second device sends the first information to the first device.
Specifically, the second device may send an Input event carrying the first information to the first device based on the FlashLight connection.
It should be noted that, in the embodiments of the present application, the second device may display, in real time, the hand-following animation triggered by the user gesture. Because the hand-following animation needs to display, in real time, the switching track of the user interface following the movement track of the user gesture, the second device needs to transmit the information of the user operation to the first device in real time after receiving the user operation, so that the first device can quickly analyze the hand-following animation. The FlashLight connection is different from the screen-throwing connection mentioned above: the FlashLight connection can realize data transmission directly through the bottom layer of the device, its transmission speed is faster, and user operations acting on the device can be transmitted across devices in real time, so that the device can respond to the user's cross-device operation faster and display the hand-following animation triggered by the user gesture in real time. This reduces the time delay with which the device changes the display content triggered by the user operation, and provides the user with an experience close to operating the device locally in real time.
In a specific implementation, the second device may send the Input event to the first device through a hardware channel in the underlying library, and the description of the hardware channel may refer to relevant content in fig. 10 and 11, which is not repeated herein.
S108, the first device recognizes user gestures corresponding to user operations acting on the second device according to the first information.
The first device may determine, according to the device information in the first information, that the source of the first information is the second device, so as to identify, according to the interaction parameters of the second device, the user gesture corresponding to the user operation shown in step S105, i.e., a Back gesture, a Home gesture, a Recents gesture, or a QuickSwitch gesture. The interaction parameters may include the designated position of the gesture and a sliding threshold. For example, taking the Back gesture as an example, the interaction parameters may include the designated position and the sliding threshold of the Back gesture; when the coordinates in the first information match the designated position of the Back gesture preset for the second device, and the sliding distance of the user operation meets the sliding threshold of the Back gesture preset for the second device, the first device may determine that the user operation acting on the second device is a Back gesture, which may be used to trigger returning to the previous user interface.
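The Back-gesture check just described might be sketched as follows, assuming the designated position is a left-edge band and the threshold is a minimum horizontal sliding distance; the names and the geometry are illustrative assumptions.

```kotlin
data class InteractionParams(val backGestureEdgePx: Int, val slideThresholdPx: Int)

// Coordinates taken from the first information of the Input events that make
// up the slide (press point and release point).
data class Slide(val startX: Float, val endX: Float)

fun isBackGesture(slide: Slide, p: InteractionParams): Boolean {
    val startsAtDesignatedPosition = slide.startX <= p.backGestureEdgePx // left edge band
    val meetsSlideThreshold = slide.endX - slide.startX >= p.slideThresholdPx
    return startsAtDesignatedPosition && meetsSlideThreshold
}
```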
In a specific implementation, the first device may first determine, through the gesture navigation management module in the framework layer, whether the user operation detected on the second device is gesture navigation according to the first information; after determining that it is gesture navigation, the recognition of the user operation is completed through the gesture navigation service module in the application layer. After the screen-throwing function is started, the gesture navigation service module may generate one or more instances by instantiation, and the one or more instances may be used to complete the recognition of the gesture and the event processing of the gesture, including but not limited to: a local instance and a screen-throwing instance. The local instance may be used to serve the user operations detected locally (i.e., on the first device), and the screen-throwing instance may be used to serve the user operations detected on the screen-thrown end (i.e., the second device). That is, after the first device obtains the first information, the user gesture may be recognized from the first information through the screen-throwing instance.
In addition, the first device can also detect the user operation acting on the local, generate an Input event, and identify the user operation according to the Input event through the local instance, so that the first device can trigger to change the display content according to the user operation acting on the local. By way of example, the user operation acting locally may refer to a user operation as shown in fig. 4A, and the user interfaces before and after modification may be the user interfaces shown in fig. 4A and 4B.
S109, the first device determines animation switching parameters of the first animation according to the gesture of the user.
The animation switching parameters include, but are not limited to: the window scaling, the window movement track, the speed, the movement position, and the like. The first device may calculate the animation switching parameters by combining the sliding parameters of the user gesture with the window display parameters of the second device, for example, the size of the application display window and the position of the application icon in the desktop, so that when the second device triggers the hand-following animation according to the user operation, the application window can move along with the user gesture, giving the user the effect of operating the second device locally and triggering the hand-following animation. The sliding parameters of the user gesture include, but are not limited to: the sliding speed, sliding acceleration, sliding track, and the like of the user gesture, which may be calculated by the first device according to the coordinates, time, and other information in the first information.
Taking the Home gesture as an example, the first device may calculate the sliding track of the Home gesture, such as a track moving from the bottom of the screen upward, according to the Home gesture the user applies to the second device; the first device then calculates the movement track of the window, i.e. a track moving from the bottom of the screen upward, according to the sliding track of the Home gesture, so that when the second device displays the hand-following animation, there is a hand-following effect of the window moving from the bottom of the screen upward with the user's Home gesture. In addition, the first device may calculate a finer movement track of the window according to the position of the application icon in the desktop of the second device: when the user triggers the Home gesture, the window may move toward the application icon during its movement; if the application icon is located far to the left on the desktop main interface, the movement track of the window moves from the bottom of the screen toward the upper left.
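One possible interpolation scheme matching this description is sketched below; the linear blend and the scaling factor are assumptions made for illustration, not formulas given by this application.

```kotlin
data class Point(val x: Float, val y: Float)
data class Frame(val windowScale: Float, val windowPos: Point)

// progress runs from 0 (gesture start) to 1 (gesture end).
fun homeAnimationFrame(finger: Point, iconPos: Point, progress: Float): Frame {
    require(progress in 0f..1f)
    // The window tracks the finger and drifts toward the icon's desktop
    // position, e.g. toward the upper left when the icon sits far left.
    val x = finger.x + (iconPos.x - finger.x) * progress
    val y = finger.y + (iconPos.y - finger.y) * progress
    return Frame(windowScale = 1f - 0.8f * progress, windowPos = Point(x, y))
}
```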
In addition, since the user operation is a gesture navigation operation in system navigation, different gesture navigation operations correspond to different application switching types, including but not limited to: returning to the upper page, returning to the desktop, entering the recent tasks, quickly switching applications, and the like. It can be seen that these application switching types can alter the running state of an application. For example, since the Home gesture can quickly switch from the user interface provided by an application to the desktop main interface, the application can be quickly switched from running in the foreground to running in the background, or to a closed running state. Thus, while determining the animation switching parameters according to the user gesture, the first device may also manage the lifecycle of the application according to the user gesture. In a specific implementation, the first device may manage the lifecycle of the application through the application management module according to the recognized user gesture, so as to ensure the normal running of the application and control the normal switching of each application.
S110, the first device sends animation switching parameters to the second device.
The first device may send the animation switching parameter to the second device through the screen connection. In a specific implementation, the first device may send the animation switching parameter to the second device through a screen-throwing management module (screen-throwing Source end), so that the second device triggers the following animation according to the gesture of the user.
S111, the second device uses the first animation to switch the user interface.
The second device may determine the first animation according to the animation switching parameters and complete the switching of the user interface using the first animation. In a specific implementation, the second device may draw the hand-following animation through the graphics drawing module in the second device according to the animation switching parameters, and display the hand-following animation through the display module.
Specifically, the second device switches from the first user interface to the second user interface using the first animation. The first user interface may include the first content that the first device sends to the second device via the screen-throwing connection. That is, the second device may control the switching of the screen-throwing content according to the user gesture acting on the second device and display the hand-following animation of the switching process.
When the user gesture is a Back gesture, the Back gesture is used for triggering switching of upper and lower pages of an application, and the first user interface and the second user interface can be upper and lower pages provided by the same application. Illustratively, the user gesture may refer to a user operation as shown in FIG. 5A, the first user interface may refer to user interface 2-2 as shown in FIG. 5A, the second user interface may refer to user interface 2-1 as shown in FIG. 5B, and the first animation may be described with reference to the interface switching process in FIGS. 5A-5B.
When the user gesture is a Home gesture, the Home gesture is used to trigger a return to the desktop main interface, and the second user interface may be the desktop main interface. Illustratively, the user gesture may refer to a user operation as shown in FIG. 6A, the first user interface may refer to a user interface 2-1 as shown in FIG. 3F or FIG. 5B, the second user interface may refer to a user interface 2-0 as shown in FIG. 6B, and the first animation may be referred to in connection with the interface switching process as shown in FIG. 6A.
When the user gesture is a Recents gesture, the Recents gesture is used to trigger entering the recent tasks, and the second user interface may be a multi-task interface. Illustratively, the user gesture may refer to a user operation as shown in fig. 7A, the first user interface may refer to the user interface 2-1 as shown in fig. 3F or fig. 5B, the second user interface may refer to the user interface 2-3 as shown in fig. 7A, and for the first animation, reference may be made to the interface switching process shown in fig. 7A.
When the user gesture is a QuickSwitch gesture, the QuickSwitch gesture is used to trigger a switch between applications, and the first user interface and the second user interface may be user interfaces provided for different applications. Illustratively, the user gesture may refer to a user operation as shown in FIG. 8A, the first user interface may refer to a user interface 2-1 as shown in FIG. 3F or FIG. 5B, the second user interface may refer to a user interface 2-4 as shown in FIG. 8B, and the first animation may be referred to in connection with the interface switching process as shown in FIG. 8A.
S112, the first device switches the user interface using the first animation.
When the screen projection manner between the first device and the second device is homologous screen projection, the first device may synchronously complete the switching of the user interface using the first animation according to the user operation the user applies to the second device. In a specific implementation, the first device may draw the hand-following animation through the graphics drawing module in the first device according to the animation switching parameters, and display the hand-following animation through the display module.
Specifically, the first device switches from the third user interface to the fourth user interface using the first animation. The third user interface may include the first content that the first device sends to the second device over the screen-throwing connection. That is, the first device may alter the content displayed on the first device based on the user gesture the user applies to the second device.
When the user gesture is a Back gesture, the Back gesture is used to trigger switching between the upper and lower pages of an application, and the third user interface and the fourth user interface may be the upper and lower pages provided by the same application. Illustratively, the user gesture may refer to a user operation as shown in fig. 5A, the third user interface may refer to the user interface 1-2 as shown in fig. 5C, the fourth user interface may refer to the user interface 1-1 as shown in fig. 5D, and for the first animation, reference may be made to the description of the interface switching process in fig. 5C.
When the user gesture is a Home gesture, the Home gesture is used to trigger a return to the desktop main interface, and the fourth user interface may be the desktop main interface. Illustratively, the user gesture may refer to a user operation as shown in FIG. 6A, the third user interface may refer to user interface 1-1 as shown in FIG. 3A, the fourth user interface may refer to user interface 1-0 as shown in FIG. 6D, and the first animation may be referred to in connection with the description of the interface switching process as shown in FIG. 6C.
When the user gesture is a Recents gesture, the Recents gesture is used to trigger entering the recent tasks, and the fourth user interface may be a multi-task interface. Illustratively, the user gesture may refer to a user operation as shown in fig. 7A, the third user interface may refer to the user interface 1-1 as shown in fig. 3A, the fourth user interface may refer to the user interface 1-3 as shown in fig. 7B, and for the first animation, reference may be made to the description of the interface switching process shown in fig. 7B.
When the user gesture is a QuickSwitch gesture, the QuickSwitch gesture is used to trigger a switch between applications, and the third and fourth user interfaces may be user interfaces provided for different applications. Illustratively, the user gesture may refer to a user operation as shown in FIG. 8A, the third user interface may refer to user interface 1-1 as shown in FIG. 3A, the fourth user interface may refer to user interface 1-4 as shown in FIG. 8D, and the first animation may be described with reference to the interface switching process as shown in FIG. 8C.
It should be noted that step S112 is an optional step. When the screen-casting manner between the first device and the second device is a heterogeneous screen-casting manner, the content displayed by the first device may not be affected by the user operation on the second device, and for example, the first device may always display a third user interface.
As can be seen from step S111 and step S112, in the screen-throwing process, both the first device and the second device may complete the switching of the user interface according to the user gesture the user applies to the second device, but the user interface after switching may include content other than the screen-throwing content sent by the first device. The user interface after the second device is switched (i.e., the second user interface) may include content provided locally by the second device. For example, when the user gesture is a Home gesture, the desktop main interface the second device returns to may be the desktop main interface provided by the second device; when the user gesture is a Recents gesture, the multi-task interface displayed by the second device may include application browsing windows historically browsed on the second device; and when the user gesture is a QuickSwitch gesture, the user interface after the second device is switched may be a user interface provided by an application historically browsed on the second device. Therefore, when the user controls the screen-thrown device in the screen-throwing process, the connection between the screen-thrown device and the screen-throwing device can be weakened, giving the user the feeling of locally controlling the device.
In addition, the user gesture acting on the second device mentioned in the embodiments of the present application is a system-level navigation gesture, and the priority of a system-level navigation gesture is higher than that of an application-level gesture. In other words, when the user gesture received on the second device could be responded to either as system-level navigation or as application-level navigation, the second device preferentially responds to it as system-level navigation, so as to implement the switching of the user interface. Illustratively, as shown in fig. 5E-5F, before the second device and the first device establish the screen-throwing connection, a user gesture received by the first device may trigger the response of a control within the application; after the second device and the first device establish the screen-throwing connection, when the first device receives the same user gesture, the gesture is preferentially responded to as system-level navigation, i.e., the switching of the user interface is completed, as shown in fig. 5A-5B. Therefore, the second device can conveniently implement switching between applications in the screen-throwing process, providing the user with a more convenient screen-throwing experience.
It will be appreciated that the parts not mentioned in fig. 12 above may be referred to in the foregoing description of the user interfaces shown in fig. 3A-3F, fig. 4A-4D, fig. 5A-5F, fig. 6A-6D, fig. 7A-7B, fig. 8A-8D, and the module interaction diagrams shown in fig. 10 and 11, and will not be repeated herein.
The principle by which the screen-thrown device changes the screen-throwing content according to the user gesture while displaying the screen-throwing content provided by the screen-throwing device is described below in connection with the screen-projection system shown in fig. 13.
As shown in fig. 13, the screen projection system may include: an event recognition module 001, an event transmission module 002, an event processing module 003, a gesture recognition module 004, a local display module 005, a display transmission module 006, and a projection display module 007. Wherein:
the event recognition module 001 is configured to recognize a user gesture acting on the display screen and generate an Input event based on the user gesture, where the Input event may include the relevant parameters of the user gesture, and these parameters may be used to recognize and distinguish different user gestures. Illustratively, the Input event may include the event type, coordinates, time, and so on. In addition, the event recognition module 001 may also pass the Input event to the event transmission module 002.
The event transmission module 002 is configured to label an Input event with a device tag, so that the Input event carries the device information of the sender device, realizing cross-device transmission of the Input event.
The event processing module 003 may be configured to distribute and process Input events according to the device tag, so as to distinguish the Input events generated by different devices, including the screen-throwing device and the screen-thrown device, and thereby recognize the user gesture according to the interaction parameters of the different devices and analyze the hand-following animation triggered on the corresponding device based on the user gesture.
The gesture recognition module 004 may be configured to recognize, according to the Input event sent by the event processing module 003 and in combination with the interaction parameters of the device indicated by the device tag, the user gesture corresponding to the Input event, convert the user gesture into a navigation operation of the corresponding system interface, determine the user interfaces before and after switching, and calculate the relevant parameters of the hand-following animation triggered by the navigation operation.
The local display module 005 may be configured to execute, on the screen-throwing end corresponding to the screen-throwing device, the switching of the user interface triggered by the user gesture and the hand-following animation during the switching, according to the relevant parameters of the hand-following animation.
The display transmission module 006 may include a Source end corresponding to the screen-throwing device and a Sink end corresponding to the screen-thrown device, and may be configured to send the relevant parameters of the hand-following animation during the switching from the Source end to the Sink end, so as to implement cross-device transmission of the content to be displayed.
The projection display module 007 may be configured to receive the relevant parameters of the hand-following animation during the switching, execute the switching of the user interface triggered by the user gesture on the screen-thrown end corresponding to the screen-thrown device, and display the hand-following animation during the switching.
It should be noted that the above-mentioned local display module 005 is an optional module. When the screen-throwing manner between the screen-throwing device and the screen-thrown device is homologous screen projection, the local display module 005 may trigger, according to the user gesture acting on the screen-thrown device, the switching of the user interface on the screen-throwing device and the display of the hand-following animation during the switching. When the screen-throwing manner between the screen-throwing device and the screen-thrown device is heterologous screen projection, the screen-projection system may not include the local display module 005; the gesture recognition module 004 may directly send the user interface to be displayed and the hand-following animation to the display transmission module 006, so that the display transmission module 006 sends them to the Sink end corresponding to the screen-thrown device, achieving the purpose that the screen-thrown device changes the screen-throwing content according to the user gesture.
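The branch between homologous and heterologous screen projection might be sketched as follows; the mode enum and the callbacks standing in for modules 005 and 006 are illustrative assumptions.

```kotlin
enum class CastMode { HOMOLOGOUS, HETEROLOGOUS }

data class AnimationParams(val windowScale: Float, val x: Float, val y: Float)

fun routeAnimation(
    mode: CastMode,
    params: AnimationParams,
    localDisplay: (AnimationParams) -> Unit,        // stands in for local display module 005
    displayTransmission: (AnimationParams) -> Unit  // stands in for display transmission module 006
) {
    displayTransmission(params)                           // always refresh the screen-thrown device
    if (mode == CastMode.HOMOLOGOUS) localDisplay(params) // mirror locally only when homologous
}
```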
It will be appreciated that, for what is not described in the screen projection system shown in fig. 13, reference may be made to the foregoing fig. 3A-3D, fig. 4A-4B, fig. 5A-5F, fig. 6A-6D, fig. 7A-7B, and fig. 8A-8D, the user interfaces shown in fig. 10 and fig. 11, the module interaction schematic diagram shown in fig. 12, and the method flowcharts; details are not repeated herein.
The embodiments of the present application may be arbitrarily combined to achieve different technical effects.
The above embodiments may be implemented in whole or in part by software, hardware, firmware, or any combination thereof. When software is used for implementation, the embodiments may be implemented in whole or in part in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer program instructions are loaded and executed on a computer, the processes or functions described in the present application are produced in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable apparatus. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another computer-readable storage medium; for example, the computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center in a wired manner (e.g., coaxial cable, optical fiber, digital subscriber line) or a wireless manner (e.g., infrared, radio, microwave). The computer-readable storage medium may be any available medium accessible to a computer, or a data storage device such as a server or a data center that integrates one or more available media. The available medium may be a magnetic medium (e.g., a floppy disk, a hard disk, a magnetic tape), an optical medium (e.g., a DVD), a semiconductor medium (e.g., a solid-state drive (SSD)), or the like.
Those of ordinary skill in the art will appreciate that all or part of the processes of the above method embodiments may be implemented by a computer program instructing related hardware. The program may be stored in a computer-readable storage medium and, when executed, may include the processes of the above method embodiments. The foregoing storage medium includes a ROM, a random access memory (RAM), a magnetic disk, an optical disc, or the like.
In summary, the foregoing descriptions are merely exemplary embodiments of the present invention and are not intended to limit the protection scope of the present invention. Any modification, equivalent replacement, improvement, or the like made according to the disclosure of the present invention shall fall within the protection scope of the present invention.

Claims (21)

1. A screen projection method, the method comprising:
a first device establishes a screen projection connection with a second device;
the first device sends first content to the second device based on the screen projection connection;
the second device displays a first user interface including the first content;
the second device detects a first operation acting on the second device;
the second device sends first information indicating the first operation to the first device;
the first device sends animation switching parameters of a first animation to the second device, wherein the animation switching parameters are determined by the first device according to a movement track of the first operation;
the second device displays the first animation and switches the first user interface to a second user interface.
2. The method of claim 1, wherein the first operation is a gesture operation acting on a display screen of the second device.
3. The method of claim 1 or 2, wherein the second user interface comprises content local to the second device.
4. The method of any one of claims 1-3, wherein the first user interface and the second user interface are an upper-level page and a lower-level page of a same application;
or, the first user interface and the second user interface are user interfaces provided by different applications, wherein the second user interface is a user interface provided by an application in the second device;
or, the second user interface is a desktop main interface or a multi-task interface, wherein the desktop main interface is a user interface provided by the first device or the second device, and the multi-task interface comprises history browsing windows of one or more applications in the second device.
5. The method of any one of claims 1-4, wherein the first information comprises one or more of the following: an event type, coordinates, a time, and an identification of the second device, wherein the event type is used for determining the gesture operation corresponding to the first operation, the coordinates are a position at which the first operation acts on the display screen of the second device, and the time is a time at which the first operation acts on the display screen of the second device; and the animation switching parameters comprise one or more of the following: a scaling ratio, a movement track, a movement speed, and a movement position of a window in which the first user interface and the second user interface are located.
6. The method of any one of claims 1-5, wherein before the first device establishes the screen projection connection with the second device, the method further comprises:
the first device displays a third user interface including the first content.
7. The method of claim 6, wherein before or after the first device sends the animation switching parameters of the first animation to the second device, the method further comprises:
the first device displays the first animation and switches the third user interface to a fourth user interface.
8. The method of claim 7, wherein:
the third user interface and the fourth user interface are an upper-level page and a lower-level page of a same application, or are user interfaces provided by different applications;
or, the fourth user interface is a desktop main interface or a multi-task interface.
9. A screen projection method, the method comprising:
a first device establishes a screen projection connection with a second device;
the first device sends first content to the second device based on the screen projection connection;
the first device receives first information indicating a first operation, wherein the first operation is an operation acting on the first content in a first user interface that is displayed by the second device and that comprises the first content;
the first device sends animation switching parameters of a first animation to the second device, wherein the animation switching parameters are determined by the first device according to a movement track of the first operation.
10. The method of claim 9, wherein before the first device establishes the screen projection connection with the second device, the method further comprises:
the first device displays a third user interface including the first content.
11. The method of claim 10, wherein before or after the first device sends the animation switching parameters of the first animation to the second device, the method further comprises:
the first device displays the first animation and switches the third user interface to a fourth user interface.
12. The method of claim 11, wherein the third user interface and the fourth user interface are an upper-level page and a lower-level page of a same application, or are user interfaces provided by different applications;
or, the fourth user interface is a desktop main interface or a multi-task interface.
13. The method of any one of claims 9-12, wherein after the first device establishes the screen projection connection with the second device, the method further comprises:
the first device detects a third operation and displays a seventh user interface comprising third content;
the first device sends the third content to the second device based on the screen projection connection.
14. A screen projection method, the method comprising:
a second device establishes a screen projection connection with a first device;
the second device receives, based on the screen projection connection, first content sent by the first device;
the second device displays a first user interface including the first content;
the second device detects a first operation acting on the second device;
the second device sends first information indicating the first operation to the first device;
the second device receives animation switching parameters of a first animation sent by the first device, wherein the animation switching parameters are determined by the first device according to a movement track of the first operation;
the second device displays the first animation and switches the first user interface to a second user interface.
15. The method of claim 14, wherein the first operation is a gesture operation acting on a display screen of the second device.
16. The method of claim 14 or 15, wherein the second user interface comprises content local to the second device.
17. The method of any one of claims 14-16, wherein the first user interface and the second user interface are an upper-level page and a lower-level page of a same application;
or, the first user interface and the second user interface are user interfaces provided by different applications, wherein the second user interface is a user interface provided by an application in the second device;
or, the second user interface is a desktop main interface or a multi-task interface, wherein the desktop main interface is a user interface provided by the first device or the second device, and the multi-task interface comprises history browsing windows of one or more applications in the second device.
18. The method of any one of claims 14-17, wherein the first information comprises one or more of the following: an event type, coordinates, a time, and an identification of the second device, wherein the event type is used for determining the gesture operation corresponding to the first operation, the coordinates are a position at which the first operation acts on the display screen of the second device, and the time is a time at which the first operation acts on the display screen of the second device; and the animation switching parameters comprise one or more of the following: a scaling ratio, a movement track, a movement speed, and a movement position of a window in which the first user interface and the second user interface are located.
19. An electronic device, comprising a memory, one or more processors, and one or more programs, wherein the one or more processors, when executing the one or more programs, cause the electronic device to implement the method of any one of claims 9-13 or 14-18.
20. A computer-readable storage medium comprising instructions that, when run on an electronic device, cause the electronic device to perform the method of any one of claims 9-13 or 14-18.
21. A computer program product that, when run on a computer, causes the computer to perform the method of any one of claims 9-13 or 14-18.
CN202210151376.7A 2021-11-22 2022-02-18 Screen projection method, user interface and electronic equipment Pending CN116156229A (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202111383799 2021-11-22
CN2021113837993 2021-11-22

Publications (1)

Publication Number Publication Date
CN116156229A true CN116156229A (en) 2023-05-23

Family

ID=86356915

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210151376.7A Pending CN116156229A (en) 2021-11-22 2022-02-18 Screen projection method, user interface and electronic equipment

Country Status (1)

Country Link
CN (1) CN116156229A (en)

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination