CN117631902A - Focus switching method, electronic device, chip, storage medium, and program product


Publication number: CN117631902A
Application number: CN202210989423.5A
Authority: CN (China)
Prior art keywords: focus, electronic device, window, interface element, interface
Other languages: Chinese (zh)
Inventors: 王波, 芮江, 朱培, 黄德才
Current assignee: Huawei Technologies Co., Ltd.
Original assignee: Huawei Technologies Co., Ltd.
Application filed by: Huawei Technologies Co., Ltd.
Legal status: Pending

Abstract

Embodiments of this application disclose a focus switching method, an electronic device, a chip, a storage medium, and a program product, applied in the field of computer technology. The focus switching method is applied to a first electronic device that is communicatively connected to a second electronic device, and includes: displaying a first interface, where the first interface includes a first interface element in the focus state; receiving a first input, where the first input indicates a first direction for focus switching; and, in response to the first input, switching a second interface element to the focus state according to the first interface and the first direction and canceling the focus state of the first interface element, where the second interface element is displayed on the second electronic device. When the first electronic device has no interface element available for focus switching, the focus is switched from the first electronic device to the second electronic device. Because the focus can be switched between electronic devices, the method opens up more possibilities for interaction between the user interface and the user.

Description

Focus switching method, electronic device, chip, storage medium, and program product
Technical Field
Embodiments of this application relate to the field of electronic technology, and in particular to a focus switching method, an electronic device, a chip, a storage medium, and a program product.
Background
A user may input a direction event to an electronic device (e.g., device A) by operating the direction keys of an input device such as a keyboard. Device A can switch the focus within a window on device A in response to the direction event. Currently, when device A and device B are used in a collaborative scenario, device A cannot switch the focus from device A to device B in response to a direction event; the user can switch the focus only with a pointing device such as a mouse or a touchscreen, for example by clicking an edit box so that it obtains focus. This degrades the user experience.
Disclosure of Invention
Embodiments of this application provide a focus switching method, an electronic device, a chip, a storage medium, and a program product. In response to the first direction indicated by an input event, when no interface element available for focus switching exists on the first electronic device, the focus is switched from the first electronic device to a second electronic device. Because the focus can be switched between electronic devices, the method offers the user a brand-new experience, improves usability, brings convenience, and opens up more possibilities for interaction between the user interface and the user.
In a first aspect, an embodiment of this application provides a focus switching method applied to a first electronic device that is communicatively connected to a second electronic device. The method includes: displaying a first interface, where the first interface includes a first interface element in the focus state; receiving a first input, where the first input indicates a first direction for focus switching; and, in response to the first input, switching a second interface element to the focus state according to the first interface and the first direction, and canceling the focus state of the first interface element, where the second interface element is displayed on the second electronic device.
Here, an interface element in the focus state is an interface element that has obtained focus. An interface element in the non-focus state is one that has not obtained focus. An interface element in the focus state carries a visual cue that distinguishes it from interface elements in the non-focus state.
Switching an interface element to the focus state means switching it from the non-focus state to the focus state; that is, the element was in the non-focus state before the switch. Canceling the focus state of the first interface element means switching it from the focus state to the non-focus state. After the second interface element is switched to the focus state, it is in the focus state; after the focus state of the first interface element is canceled, the first interface element is in the non-focus state.
A change of focus state represents a switch of focus. For example, switching the second interface element to the focus state and canceling the focus state of the first interface element means that the focus is switched from the first interface element (the element whose focus state is canceled) to the second interface element (the element switched to the focus state).
The first direction is the direction in which the user wants the focus to move, that is, the direction of the desired focus switch. Focus switching may occur between different interface elements on the same electronic device, or between different electronic devices. Switching between interface elements on the same electronic device includes one or both of the following: switching between interface elements of the same window, or switching between interface elements of different windows. Switching between electronic devices means, for example, that the focus moves between a first interface element on the first electronic device and a second interface element on the second electronic device, either from the first interface element to the second or from the second to the first.
In this embodiment of the application, the first electronic device displays, on the first interface, the first interface element in the focus state. The user provides a first input to the first electronic device that indicates a first direction for focus switching, that is, instructs the first electronic device to move the focus in the first direction to another interface element; that element may be on the first electronic device or on another electronic device that is in a collaborative state with the first electronic device. The first electronic device receives the first input and, in response, determines a second interface element according to the first interface and the first direction, where the second interface element is displayed on the second electronic device. In other words, an interface element located on the second electronic device is determined to be the second interface element. In response to this determination, the second interface element is set to the focus state and the focus state of the first interface element is canceled, so the focus is switched from the first interface element to the second interface element, that is, from the first electronic device to the second electronic device. Focus switching is therefore no longer confined to a single window, nor even to different windows of the same electronic device. When the user connects the first electronic device to the second electronic device, the user wants to operate both devices at the same time; with the focus switching method provided in this application, the focus can be switched between the devices, which gives the user a brand-new experience, improves usability, brings convenience, and opens up more possibilities for interaction between the user interface and the user.
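As a rough illustration of this first-aspect flow, the following Kotlin sketch searches the local interface first and hops to the elements displayed on the second device only when the local search fails. The names (Direction, UiElement, FocusController) and the list-based search are assumptions for illustration, not the patent's actual implementation.

```kotlin
// Minimal sketch of the first-aspect flow under assumed names; the directional
// search is a placeholder for a real geometry-based search.
enum class Direction { LEFT, RIGHT, UP, DOWN }

class UiElement(val name: String, val focusable: Boolean) {
    var focused = false
}

class FocusController(
    private val localElements: List<UiElement>,    // elements of the first interface
    private val remoteElements: List<UiElement>,   // elements displayed on the second device
) {
    private var current: UiElement? = localElements.firstOrNull { it.focused }

    // First input: a direction indicating where the user wants the focus to move.
    fun onDirectionalInput(direction: Direction) {
        val next = findNext(localElements, direction)      // 1) search the first interface
            ?: findNext(remoteElements, direction)         // 2) else hop to the second device
            ?: return                                      // nothing available for focus switching
        current?.focused = false                           // cancel the first element's focus state
        next.focused = true                                // the new element enters the focus state
        current = next
    }

    // Placeholder search: a real implementation would compare element geometry
    // against the requested direction; here we take the next focusable element
    // after the current one in list order.
    private fun findNext(elements: List<UiElement>, direction: Direction): UiElement? {
        val start = current?.let { elements.indexOf(it) + 1 } ?: 0
        return elements.drop(start).firstOrNull { it.focusable }
    }
}
```

A real implementation would compare element geometry against the requested direction and then drive either the projection path or the messaging path described in the embodiments below.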
Furthermore, because the focus switch is completed simply by inputting the first direction, input devices such as a remote control, a keyboard, or a touchpad can be used far more efficiently.
In some embodiments, the first electronic device switching the second interface element to the focus state and canceling the focus state of the first interface element according to the first interface and the first direction, in response to the first input, includes: in response to the first input, the first electronic device searches the first interface in the first direction for an interface element available for focus switching; when the first interface has no such interface element in the first direction, the first electronic device determines a first virtual interface element on a first virtual screen according to the first direction; and the first electronic device switches the first virtual interface element to the focus state and cancels the focus state of the first interface element, where the first virtual interface element is projected to the second electronic device and displayed there as the second interface element.
An interface element available for focus switching is one that is currently in the non-focus state and can be switched to the focus state; in other words, it does not currently hold focus but is able to obtain it.
Searching the first interface in the first direction for an interface element available for focus switching and finding none may mean either of the following: 1) in the first direction, there is no interface element on the first interface other than the first interface element; 2) in the first direction, there are other interface elements on the first interface besides the first interface element, but none of them can obtain focus.
The first virtual interface element is a virtual interface element on the first virtual screen. The first electronic device projects the first virtual screen to the second electronic device, and correspondingly the second electronic device displays the content of the first virtual screen; for example, the first virtual interface element is displayed as the second interface element.
The first electronic device determining a first virtual interface element on the first virtual screen according to the first direction includes: the first electronic device searches the first virtual screen in the first direction for a virtual interface element available for focus switching, and if one is found, that virtual interface element is the first virtual interface element. The first electronic device then switches the first virtual interface element from the non-focus state to the focus state.
In this embodiment of the application, the first electronic device is communicatively connected to the second electronic device and projects a screen to it in a heterogeneous manner, and a main screen (Main Display) and a first virtual screen (Virtual Display) are created on the first electronic device. The main screen shows the picture presented on the first electronic device's own display (that is, the first interface), while the interface rendered on the first virtual screen is the projection interface sent from the first electronic device to the second electronic device, where it is displayed as the second interface; the second electronic device therefore displays the second interface. The content of the first virtual screen is not visible to the user on the first electronic device; it runs in the background of the first electronic device and becomes visible to the user only after being projected to the second electronic device.
In this embodiment of the application, the first electronic device projects the first virtual screen it has created, together with the first virtual interface element, to the second electronic device, where the first virtual interface element is displayed as the second interface element. When there is no interface element available for focus switching on the first interface, the first electronic device switches the focus to the first virtual interface element on the virtual screen it created (for example, the first virtual screen); correspondingly, the second interface element displayed on the second electronic device is also in the focus state. Switching the second interface element to the focus state means that the corresponding first virtual interface element obtains focus and is selected to receive input; the first virtual interface element receives the input, and the interface change it produces is displayed by the second electronic device. Take the case where the second interface element and the first virtual interface element are input boxes. After the first virtual interface element is switched to the focus state, the second interface element on the second electronic device shows a visual cue that distinguishes it from other interface elements in the non-focus state; the user learns from this cue that the second interface element currently has focus and enters "A" into it. If the user enters "A" directly on the first electronic device, the first virtual interface element of the first electronic device receives "A" and processes it accordingly. If the user enters "A" through the second electronic device, the second electronic device transmits "A" to the first virtual interface element of the first electronic device by reverse control. After the first virtual interface element receives "A", it produces the corresponding result according to its actual processing logic; the first electronic device then renders a new interface on the first virtual screen according to that result and projects it to the second electronic device, which displays the new interface. From the user's point of view, it is as if the second interface element had focus and could receive input. In this way, in response to the user's first input, the first electronic device moves the focus from the first interface element (displayed on the first electronic device) to the second interface element (displayed on the second electronic device), that is, switches the focus from the first electronic device to the second electronic device, which provides a new focus switching experience, brings convenience to the user, and opens up more possibilities for interaction between the user interface and the user.
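The virtual-screen loop described above (focus lands on a background virtual element, input arrives locally or by reverse control from the second device, and every change is re-rendered and cast back) could be modeled roughly as below. All class names and the cast callback are assumptions; the sketch only mirrors the data flow, not any real projection API.

```kotlin
// Illustrative model of the virtual-screen / reverse-control loop; all names are assumptions.
data class Frame(val lines: List<String>)                  // a rendered picture of the virtual screen

class VirtualElement(var text: String = "", var focused: Boolean = false) {
    fun render() = if (focused) "[$text]" else " $text "   // focused elements carry a visual cue
    fun onInput(ch: Char) { text += ch }                   // e.g. the user types "A"
}

class ProjectedVirtualScreen(
    private val elements: List<VirtualElement>,
    private val castToSecondDevice: (Frame) -> Unit,       // projection channel to device B
) {
    private fun renderAndCast() = castToSecondDevice(Frame(elements.map { it.render() }))

    // Focus arrives on the virtual screen: mark the virtual element focused and
    // cast a new frame so the second device shows the element in the focus state.
    fun focus(index: Int) {
        elements.forEach { it.focused = false }
        elements[index].focused = true
        renderAndCast()
    }

    // Reverse control: input entered on the second device is sent back here,
    // delivered to the focused virtual element, and the updated frame is cast again.
    fun onReverseInput(ch: Char) {
        elements.firstOrNull { it.focused }?.onInput(ch)
        renderAndCast()
    }
}

fun main() {
    val screen = ProjectedVirtualScreen(listOf(VirtualElement(), VirtualElement())) { frame ->
        println("cast to second device: ${frame.lines}")   // stand-in for the real projection
    }
    screen.focus(0)
    screen.onReverseInput('A')                             // prints a frame containing "[A]"
}
```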
In some embodiments, in response to the first virtual interface element being in the focus state, the second electronic device may set the second interface element (the projection of the first virtual interface element displayed on the second electronic device) to the focus state; the user can then operate the second electronic device directly and input the corresponding instruction to it, and the second interface element receives the instruction.
In some embodiments, the number of electronic devices communicatively connected to the first electronic device is N, where N is an integer greater than or equal to 1. Before the first virtual interface element is determined on the first virtual screen according to the first direction, the method further includes: determining, according to the positional relationship between the N electronic devices and the first electronic device, the second electronic device located in the first direction; and determining the first virtual screen that is projected to the second electronic device.
The positional relationship includes the distance between each of the N electronic devices and the first electronic device, and the direction of each electronic device relative to the first electronic device (for example, whether the electronic device is located on the left or right side of the first electronic device).
Illustratively, the N electronic devices include a device A and a device B, where device A is on the left side of the first electronic device and device B is on the right side. In response to a first input that switches the focus to the right, that is, with the first direction being rightward, device B is determined to be the second electronic device located in the first direction, and the first electronic device determines that the virtual screen projected to device B is the first virtual screen.
As another example, the N electronic devices include a device A and a device B that are both on the right of the first electronic device, with device A between the first electronic device and device B. In response to a first input that switches the focus to the right, that is, with the first direction being rightward, device A is determined to be the second electronic device located in the first direction, and the first electronic device determines that the virtual screen projected to device A is the first virtual screen.
Optionally, when the first electronic device searches the first interface in the first direction for an interface element available for focus switching and finds none, it preferentially selects the electronic device that lies in the first direction and is closest to the first electronic device, so as to obtain the virtual screen projected to that electronic device.
In this embodiment of the application, the first electronic device searches the first interface in the first direction for an interface element available for focus switching. When there is none, the first electronic device may determine, according to the positional relationship between the N electronic devices and the first electronic device, the second electronic device located in the first direction, and, having determined it, obtain the first virtual screen that the first electronic device projects to the second electronic device. The first electronic device then searches the first virtual screen in the first direction for an interface element available for focus switching. In other words, in response to the first input the first electronic device looks for the second electronic device in the first direction, which ensures that the second interface element it finds meets the user's focus switching intention.
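Selecting the target device among the N connected devices, as described in this embodiment, amounts to filtering by direction and preferring the nearest device. A minimal sketch, with hypothetical Device and Side types:

```kotlin
// Sketch of choosing the target device among the N connected devices:
// keep only devices lying in the requested direction, then prefer the nearest.
// Device, Side and the fields are illustrative assumptions.
enum class Side { LEFT, RIGHT, ABOVE, BELOW }

data class Device(val name: String, val side: Side, val distanceCm: Int)

fun pickTargetDevice(connected: List<Device>, wanted: Side): Device? =
    connected
        .filter { it.side == wanted }        // devices located in the first direction
        .minByOrNull { it.distanceCm }       // the one closest to the first electronic device

fun main() {
    val devices = listOf(Device("deviceA", Side.RIGHT, 40), Device("deviceB", Side.RIGHT, 90))
    println(pickTargetDevice(devices, Side.RIGHT)?.name)   // prints "deviceA", as in the example above
}
```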
In some embodiments, the focus switching method further includes: receiving a second input, where the second input indicates a second direction for focus switching; in response to the second input, searching the first virtual screen in the second direction for a virtual interface element available for focus switching; and, when the first virtual screen has no virtual interface element available for focus switching and there is no second virtual screen available for focus switching, switching a third interface element to the focus state and canceling the focus state of the second interface element, where the third interface element is displayed on the first electronic device.
Wherein the third interface element may be the same as or different from the first interface element. The first direction may be the same as or different from the second direction. The second virtual screen can be projected to the second electronic device or the third electronic device.
A second virtual screen available for focus switching may be absent in the following cases: 1) there is no second virtual screen, that is, the first electronic device has created only the first virtual screen; 2) a second virtual screen exists, but it has no interface element available for focus switching, for example no interface element is rendered on it, or interface elements are rendered but cannot obtain focus. In some embodiments, a third case is also possible: 3) a second virtual screen exists, but the electronic device to which the first electronic device projects it does not lie in the second direction relative to the first electronic device; for example, the second direction is rightward but that electronic device is on the left of the first electronic device.
The first electronic device switching the third interface element to the focus state when the first virtual screen has no virtual interface element available for focus switching and there is no second virtual screen available for focus switching includes: in that case, the first electronic device searches the first interface in the second direction for a third interface element available for focus switching and switches the third interface element to the focus state.
In this embodiment of the application, taking the first interface element as the starting point, the first electronic device searches in the second direction for an interface element available for focus switching. If it determines that, in the second direction, no interface element available for focus switching exists on the first electronic device or on the other electronic devices communicatively connected to it, it then takes the edge of the first interface as the starting point and searches the first interface again for an interface element available for focus switching. The focus is thus switched in response to the user's input, giving the user a switchable-focus experience.
In some embodiments, a third electronic device is also communicatively connected to the first electronic device, and the method further includes: when the first virtual screen has no virtual interface element available for focus switching but a second virtual screen available for focus switching exists, determining a second virtual interface element on the second virtual screen according to the second direction; and switching the second virtual interface element to the focus state and canceling the focus state of the first virtual interface element, where the second virtual screen is projected to the second electronic device or to the third electronic device.
If the second virtual screen is projected to the second electronic device, the second virtual interface element is projected to the second electronic device; if the second virtual screen is projected to the third electronic device, the second virtual interface element is projected to the third electronic device. In this embodiment of the application, the first electronic device may switch the focus from the first interface element (displayed on the first electronic device) to the second interface element (displayed on the second electronic device) in response to the first input, and, in response to the second input, may switch the focus from the second interface element to the third interface element (displayed on the third electronic device). The focus can therefore be switched among more than two electronic devices, which provides a new focus switching experience, brings convenience to the user, and opens up more possibilities for interaction between the user interface and the user.
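Seen from a distance, the multi-hop embodiments above walk an ordered set of screens (the first virtual screen, then a second virtual screen, each possibly projected to a different device) and fall back to the first interface when none of them offers anything focusable. The sketch below captures only that ordering logic; Screen and the flat element lists are illustrative assumptions.

```kotlin
// Hypothetical sketch: virtual screens are ordered outward from the first device in one
// direction; the focus hops to the first screen that still has a focusable element,
// and wraps back to the local interface when none does.
data class Screen(val label: String, val focusableElements: List<String>)

fun nextFocusTarget(localScreen: Screen, virtualScreens: List<Screen>): Pair<Screen, String>? {
    for (screen in virtualScreens) {                          // first virtual screen, then second...
        val element = screen.focusableElements.firstOrNull()
        if (element != null) return screen to element         // e.g. projected to device B or C
    }
    // No virtual screen available for focus switching: fall back to the first interface.
    return localScreen.focusableElements.firstOrNull()?.let { localScreen to it }
}

fun main() {
    val local = Screen("first interface", listOf("third interface element"))
    val v1 = Screen("first virtual screen", emptyList())      // nothing focusable any more
    val v2 = Screen("second virtual screen", listOf("second virtual interface element"))
    println(nextFocusTarget(local, listOf(v1, v2)))           // hops to the second virtual screen
}
```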
In a second aspect, an embodiment of this application provides a focus switching method applied to a first electronic device and a second electronic device, where the first electronic device is communicatively connected to the second electronic device. The method includes: the first electronic device displays a first interface and the second electronic device displays a second interface, where the first interface includes a first interface element in the focus state; the first electronic device receives a first input, where the first input indicates a first direction for focus switching; in response to the first input, the first electronic device outputs first information to the second electronic device according to the first interface and the first direction, where the first information includes the first direction; and, in response to the first information, the second electronic device switches a second interface element to the focus state according to the second interface and the first direction, where the second interface element is displayed on the second interface.
In the embodiment of the application, the first electronic device displays the first interface element in the focus state through the first interface. The user inputs a first input to the first electronic device, wherein the first input indicates a first direction of focus switching, i.e., the first electronic device switches the focus to other interface elements according to the first direction. The first electronic device receives a first input, and in response to the first input, the first electronic device outputs first information to the second electronic device according to a first interface and a first direction, wherein the first information includes the first direction. The second electronic device determines a second interface element according to the second interface and the first direction in response to the first information, wherein the second interface element is displayed on the second interface. The second electronic device switches the second interface element to the focus state in response to determining the second interface element. Thereby enabling a switching of focus from the first electronic device to the second electronic device. So that the focus switching is no longer limited to the same window or between different windows of the same electronic device. The user allows the first electronic equipment to be in communication connection with the second electronic equipment, namely the user hopes to operate the first electronic equipment and the second electronic equipment at the same time, through the focus switching method provided by the application, the focus can be switched between the electronic equipment, the user use experience is improved while brand new experience is brought to the user, convenience is brought to the user, and more possibility is brought to interaction between the user interface and the user. Furthermore, the switching of the input focus can be rapidly completed by inputting the first direction of the switching of the indication focus, so that the use efficiency of the input equipment such as a remote controller, a touch pad on a keyboard and the like is greatly improved.
In some embodiments, the first electronic device outputting the first information to the second electronic device according to the first interface and the first direction in response to the first input includes: in response to the first input, the first electronic device searches the first interface in the first direction for an interface element available for focus switching; when the first interface has no such interface element, the first electronic device outputs the first information to the second electronic device.
In this embodiment of the application, in response to the first input the first electronic device searches the first interface in the first direction for an interface element available for focus switching. When the search finds no such element, the first electronic device outputs the first information to the second electronic device, so that the second electronic device searches for an interface element available for focus switching in the first direction. The focus is thus allowed to switch to the second electronic device that is communicatively connected to the first electronic device, which provides a new focus switching experience, brings convenience to the user, and opens up more possibilities for interaction between the user interface and the user. In addition, because the focus switch is completed simply by inputting the first direction, input devices such as a remote control, a keyboard, or a touchpad can be used far more efficiently.
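A minimal sketch of the sending side of this second aspect: search the first interface in the first direction and, when nothing focusable is found, emit the first information (carrying the first direction) to the second device. DirectionMessage, the direction-to-elements map, and the transport callback are all made-up names.

```kotlin
// Sketch of the sending side: search the first interface in the first direction;
// if nothing focusable is found, forward the direction to the second device.
data class DirectionMessage(val direction: String)            // the "first information"

class FirstDevice(
    private val focusableToThe: Map<String, List<String>>,    // direction -> focusable elements
    private val sendToSecondDevice: (DirectionMessage) -> Unit,
) {
    fun onFirstInput(direction: String) {
        val candidates = focusableToThe[direction].orEmpty()
        if (candidates.isNotEmpty()) {
            println("switch focus locally to ${candidates.first()}")
        } else {
            // No interface element available for focus switching on the first interface:
            // output the first information (containing the first direction) to device B.
            sendToSecondDevice(DirectionMessage(direction))
        }
    }
}

fun main() {
    val device = FirstDevice(mapOf("left" to listOf("control 1"))) { msg ->
        println("send to second device: ${msg.direction}")     // stand-in for the real channel
    }
    device.onFirstInput("right")                               // nothing to the right -> message is sent
}
```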
In some embodiments, the number of electronic devices communicatively connected to the first electronic device is N, where N is an integer greater than or equal to 1. Before outputting the first information to the second electronic device, the method further includes: the first electronic device determines, according to the positional relationship between the N electronic devices and the first electronic device, the second electronic device located in the first direction.
In this embodiment of the application, when the first electronic device finds no interface element available for focus switching on the first interface in the first direction, it looks for the second electronic device in the first direction, which ensures that the second interface element it finds meets the user's focus switching intention.
In some embodiments, the second electronic device switching the second interface element to the focus state according to the second interface and the first direction in response to the first information includes: in response to the first information, the second electronic device creates a virtualized input device, where the virtualized input device is used to receive the information of the first direction; and, when the virtualized input device receives the information of the first direction, the second electronic device determines a second interface element on the second interface according to the first direction and switches it to the focus state.
In this embodiment of the application, the first electronic device and the second electronic device share the same input device; that is, no screen is projected between them. The focus is still allowed to switch to the second electronic device that is communicatively connected to the first electronic device, which provides a new focus switching experience, brings convenience to the user, and opens up more possibilities for interaction between the user interface and the user. In addition, because the focus switch is completed simply by inputting the first direction, input devices such as a remote control, a keyboard, or a touchpad can be used far more efficiently.
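On the receiving side, the second device could create a virtualized input device and inject the received direction as if it were a local key press, then move its own focus accordingly. The following sketch assumes hypothetical names and a simple left-to-right element order:

```kotlin
// Sketch of the receiving side: the second device turns the received "first
// information" into a locally injected direction event via a virtualized input
// device, then switches focus on its own interface. All names are assumptions.
class VirtualizedInputDevice(private val onDirection: (String) -> Unit) {
    fun inject(direction: String) = onDirection(direction)    // behaves like a local key press
}

class SecondDevice(private val elementsLeftToRight: List<String>) {
    private var focusIndex = 0
    private val virtualInput = VirtualizedInputDevice { dir -> moveFocus(dir) }

    // Called when the first information (containing the first direction) arrives.
    fun onFirstInformation(direction: String) = virtualInput.inject(direction)

    private fun moveFocus(direction: String) {
        focusIndex = when (direction) {
            "right" -> minOf(focusIndex + 1, elementsLeftToRight.lastIndex)
            "left" -> maxOf(focusIndex - 1, 0)
            else -> focusIndex
        }
        println("second device: focus now on ${elementsLeftToRight[focusIndex]}")
    }
}

fun main() {
    SecondDevice(listOf("edit box", "button")).onFirstInformation("right")  // focus lands on "button"
}
```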
In some embodiments, the focus switching method further includes: the first electronic device receives a second input, where the second input indicates a second direction for focus switching; in response to the second input, the first electronic device outputs second information to the second electronic device, where the second information includes the second direction; in response to the second information, the second electronic device searches the second interface in the second direction for an interface element available for focus switching; and, when the second interface has no such interface element in the second direction, the first electronic device switches a third interface element to the focus state according to the second direction and the second electronic device cancels the focus state of the second interface element, where the third interface element is displayed on the first electronic device.
In some embodiments, the first electronic device receives a second input, where the second input indicates a second direction for focus switching; in response to the second input, the first electronic device outputs second information to the second electronic device, where the second information includes the second direction; in response to the second information, the second electronic device searches the second interface in the second direction for an interface element available for focus switching; when there is no such interface element on the second interface, the first electronic device searches for a third electronic device available for focus switching; when no third electronic device available for focus switching exists, the first electronic device switches a third interface element to the focus state, where the third interface element is displayed on the first electronic device; and, in response to the first electronic device switching the third interface element to the focus state, the second electronic device cancels the focus state of the second interface element.
A third electronic device available for focus switching is an electronic device that is in a collaborative state with the first electronic device and that has an interface element available for focus switching.
In some embodiments, when there is no interface element available for focus switching on the second interface, the first electronic device searches in the second direction for a third electronic device available for focus switching.
In this embodiment of the application, in response to the second input, the first electronic device outputs the second information to the second electronic device when it determines that, in the second direction, there is no interface element available for focus switching on the first interface. In response to the second information, the second electronic device searches the second interface in the second direction for an interface element available for focus switching, and notifies the first electronic device when it determines that there is none. The first electronic device then searches for a third electronic device available for focus switching. If no interface element available for focus switching exists in the second direction, the first electronic device searches the first interface in the second direction for an interface element available for focus switching, and if a third interface element is determined on the first interface according to the second direction, it switches the third interface element to the focus state. The focus is thus switched in response to the user's input, giving the user a switchable-focus experience.
In some embodiments, a third electronic device is communicatively connected to the first electronic device and displays a third interface. The focus switching method further includes: when a third electronic device available for focus switching exists, the first electronic device outputs third information to the third electronic device, where the third information includes the second direction; in response to the third information, the third electronic device switches a fourth interface element to the focus state according to the third interface and the second direction, where the fourth interface element is displayed on the third electronic device; and, in response to the third electronic device switching the fourth interface element to the focus state, the second electronic device cancels the focus state of the second interface element.
In this embodiment of the application, in response to inputs for focus switching, the focus is switched from the first electronic device to the second electronic device and then, according to the second direction of focus switching, from the second electronic device to the third electronic device, which provides a new focus switching experience, brings convenience to the user, and opens up more possibilities for interaction between the user interface and the user.
In a third aspect, an embodiment of the present application provides an electronic device, including a display screen, a processor, and a memory, where the memory is configured to store instructions, and the processor is configured to invoke the instructions in the memory, so that the electronic device performs the method as described above.
In a fourth aspect, embodiments of the present application provide a chip coupled to a memory in an electronic device, the chip configured to control the electronic device to perform a method as described above.
In a fifth aspect, embodiments of the present application provide a computer storage medium comprising computer instructions which, when run on an electronic device, cause the electronic device to perform the method as above.
In a sixth aspect, embodiments of the present application provide a computer program product for, when run on a computer, causing the computer to perform the method as above.
Drawings
Fig. 1A to fig. 1B are schematic application scenarios of the focus switching method provided in the embodiments of the present application.
Fig. 2A to fig. 2B are schematic diagrams of a focus switching method applied to a multi-window display scene according to an embodiment of the present application.
Fig. 3 is a schematic diagram of a software architecture of an electronic device in a multi-window display scenario according to an embodiment of the present application.
Fig. 4 is a schematic software interaction flow chart of focus switching in a multi-window display scene according to an embodiment of the present application.
Fig. 5A to fig. 5C are schematic diagrams of a policy for searching a target window according to an embodiment of the present application.
Fig. 6 is a schematic diagram of a policy for finding a control according to an embodiment of the present application.
Fig. 7A to fig. 7C are schematic diagrams illustrating a scenario in which the focus switching method provided in the embodiments of the present application is applied to a plurality of electronic devices in cooperation.
Fig. 8 is a diagram illustrating an example of software architecture of the first electronic device and the second electronic device in the mirror mode according to the embodiment of the present application.
Fig. 9A to fig. 9C are schematic diagrams of a focus switching method applied to a mirror mode scene according to an embodiment of the present application.
Fig. 10A is a schematic software interaction flow diagram of focus switching shown in fig. 9A to 9B in a mirror mode according to an embodiment of the present application.
Fig. 10B is a schematic software interaction flow diagram of the focus switching shown in fig. 9B to 9C in the mirror mode according to the embodiment of the present application.
Fig. 11A to 11B are schematic views of a focus switching method applied to another mirror mode scene according to an embodiment of the present application.
Fig. 12 is a schematic software interaction flow diagram of focus switching in another mirror mode according to an embodiment of the present application.
Fig. 13 is a diagram illustrating an example of a software architecture of a first electronic device and a second electronic device when a screen is projected in a heterogeneous manner according to an embodiment of the present application.
Fig. 14A to fig. 14B are schematic diagrams of a focus switching method applied to a heterogeneous screen projection scene according to an embodiment of the present application.
Fig. 15 is a schematic software interaction flow chart of focus switching under heterogeneous screen projection according to an embodiment of the present application.
Fig. 16 is a diagram illustrating an example of a software architecture of the first electronic device and the second electronic device in the sharing mode according to the embodiment of the present application.
Fig. 17A to 17B are schematic diagrams of a focus switching method applied to a sharing mode scenario according to an embodiment of the present application.
Fig. 18 is a schematic software interaction flow diagram of focus switching in a sharing mode according to an embodiment of the present application.
Fig. 19 is a schematic flow chart of a focus switching method according to an embodiment of the present application.
Fig. 20 is a flowchart of another focus switching method according to an embodiment of the present application.
Fig. 21 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
For the purposes of making the objects, technical solutions and advantages of the embodiments of the present application more clear, the technical solutions of the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is apparent that the described embodiments are some embodiments of the present application, but not all embodiments. All other embodiments, which can be made by one of ordinary skill in the art without undue burden from the present disclosure, are within the scope of the present disclosure.
The term "plurality" as used herein refers to two or more. In addition, it should be understood that in the description of this application, the words "first," "second," and the like are used merely for distinguishing between the descriptions and not for indicating or implying any relative importance or order.
For ease of understanding, some concepts related to the embodiments of this application are first described by way of example.
The electronic device may display, through a display screen, a user interface (UI) that includes one or more interface elements.
Interface elements include any suitable elements or objects in a user interface. Non-limiting examples of interface elements include digital images, windows, videos, text, icons (e.g., application launch icons, shortcut icons, etc.), controls (e.g., buttons, menus, check boxes, edit boxes, dialog boxes, etc.), links, and the like. Wherein the interface element is visible such that it can be seen and selected by the user.
The focus indicates the interface element that is currently selected to receive input. Saying that an interface element has focus, has obtained focus, or is in the focus state all mean the same thing: the element is currently selected and can receive input. For example, an edit box with focus can receive text entered by the user.
The interface elements of a user interface may include elements in the focus state and elements in the non-focus state. An element in the focus state has obtained focus and can be selected and receive input; an element in the non-focus state does not currently hold focus.
An interface element in the focus state has a visual cue that distinguishes it from interface elements that have not obtained focus; non-limiting examples of the visual cue include highlighting, a background change, and a bolded border. In some embodiments, the visual cue may also be applied to the element's content (e.g., text or an image), such as magnifying the content, changing its color, or changing its font. The embodiments of this application do not specifically limit the visual cue.
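As a trivial illustration, changing the focus state might simply toggle whichever style attributes implement the visual cue; the attributes below (bold border, highlight) are examples only, since the cue itself is not limited:

```kotlin
// Tiny sketch: the focus state change toggles whatever attributes realize the visual cue.
// ElementStyle and the chosen attributes are illustrative assumptions.
data class ElementStyle(var borderBold: Boolean = false, var highlighted: Boolean = false)

fun applyFocusCue(style: ElementStyle, focused: Boolean) {
    style.borderBold = focused       // e.g. a bolded border as the cue...
    style.highlighted = focused      // ...or highlighting; the cue itself is not limited
}

fun main() {
    val control1 = ElementStyle()
    applyFocusCue(control1, focused = true)    // control 1 obtains focus
    applyFocusCue(control1, focused = false)   // focus canceled: cue removed
    println(control1)
}
```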
An embodiment of this application provides a focus switching method executed by an electronic device. The method can be applied to a multi-window display scenario and to a scenario in which multiple electronic devices are used collaboratively. In a multi-window display scenario, multiple windows are displayed on one electronic device and the focus can be switched between those windows. In a collaborative scenario, multiple electronic devices are communicatively connected to each other and the focus can be switched between them. The method can also be applied to a combination of the two scenarios, that is, multiple electronic devices used collaboratively where at least one of them displays multiple windows.
Non-limiting examples of electronic devices include cell phones, notebook computers, tablet computers, personal computers (personal computer, PC), laptop computers, foldable electronic devices, handheld computers, ultra-mobile personal computers (UMPC), netbooks, televisions, smart screens, displays, all-in-one machines, etc., and the specific types of electronic devices are not particularly limited by the embodiments of the present application.
The focus switching method provided in the embodiments of the present application will be exemplarily described below with reference to the accompanying drawings. The following figures depict exemplary user interfaces herein as Two-Dimensional (2D) graphical user interfaces (Graphical User Interface, GUIs). In some embodiments, the user interface may be a three-dimensional (Three Dimensional, 3D) graphical user interface.
Taking an electronic device as a notebook computer as an example, please refer to fig. 1A to 1B, an application scenario of the focus switching method provided in the embodiments of the present application is exemplarily described.
As shown in fig. 1A, a user interface 11 is displayed on the notebook computer 100. The user interface 11 includes a window 1 and a window 2. Window 1, and control 1 within window 1, have focus; the visual cue is that the borders of window 1 and control 1 are bolded. The user 200 controls the focus movement using the direction keys 13 of the keyboard 12. The user 200 clicks the right key 14 for the first time, giving the notebook computer 100 a first input indicating that the direction of focus switching is rightward, that is, the user wants the focus to move (switch) to the right.
It will be appreciated that the first direction is the direction indicated by the direction key 13. For example, if the user 200 clicks the left key 15, the first direction is leftward, indicating that the focus should move to the left; if the user 200 clicks the up key 16, the first direction is upward, indicating that the focus should move up; if the user 200 clicks the down key 17, the first direction is downward, indicating that the focus should move down.
As shown in fig. 1B, in response to the first input the notebook computer 100 searches to the right, starting from control 1, for an interface element (e.g., a control) available for focus switching. If control 2 can obtain focus, control 2 is determined to be the found control, and the focus is switched from control 1 to control 2. In the user interface 18 displayed by the notebook computer 100, control 2 obtains focus and its visual cue is a bolded border; the focus state of control 1 is canceled, its visual cue is removed, and its border returns to normal.
In the following figures, the visual cue is illustrated as a bolded border; it should be understood that the visual cue may also be highlighting or the like, which is not specifically limited in this application.
The user 200 clicks the right key 14 a second time. As shown in fig. 1B, starting from control 2, there is no interface element available for focus switching in the rightward direction within window 1, so the focus cannot move to the right. In response to the second click of the right key 14, the notebook computer 100 cannot switch the focus from control 2 in window 1 to any interface element on the right and continues to display the user interface 18.
The user interface in this embodiment includes interface elements that can obtain focus and interface elements that cannot. An interface element that cannot obtain focus cannot be selected, and it cannot display a visual cue that distinguishes it from other elements that have not obtained focus.
An interface element available for focus switching is an interface element that can obtain focus. Finding, on the user interface and according to the first direction, an interface element to which the focus is switched means finding, according to the first direction, an interface element to which the focus is allowed to be switched.
As in the example above, "no interface element available for focus switching in the rightward direction" may mean either of the following: 1) there is no interface element to the right of control 2, which holds the focus; as shown in fig. 1B, control 2 is at the edge of window 1, and within window 1 there is nothing to its right; 2) control 2 is not at the edge of window 1 and there is an interface element to its right within window 1, but that element cannot obtain focus.
The above embodiment can switch the focus only within the same window: when the focus is at the edge of a window and the search within the focus window (the window holding the focus) in the first direction finds no interface element available for focus switching, the focus cannot be switched in the first direction. In view of this, an embodiment of this application provides a focus switching method for multi-window display scenarios in which the focus can be switched between multiple windows. Even when the search within the focus window in the first direction finds no interface element available for focus switching, another window can be determined according to the first direction and the focus can be switched to that window, thereby realizing focus switching between different windows (such as window 1 and window 2 in figs. 2A-2B).
Multi-window display scene
Referring to fig. 2A to 2B, an application of the focus switching method provided in the embodiment of the present application to a multi-window display scene is exemplarily described.
As shown in fig. 2A, the focus is on control 2 of window 1. The user 200 clicks the right key 14, inputting a rightward focus switching direction to the notebook computer 100. As shown in fig. 2B, in response to this input, the notebook computer 100 starts from control 2, where the focus is, in window 1 and searches to the right for an interface element available for focus switching. Because window 2 lies to the right of window 1, window 2 is determined to be the interface element for focus switching, and the focus is switched from control 2 to window 2. The notebook computer 100 may then continue to search within window 2 for an interface element (e.g., a control) available for focus switching and finds control 21. The notebook computer 100 switches (sets) window 2 to the focus state and switches control 21 to the focus state, canceling the focus states of window 1 and control 2. As shown in fig. 2B, the notebook computer 100 displays the user interface 19, on which the visual cue is that the borders of window 2 and control 21 are bolded, while the borders of window 1 and control 2 return to normal.
Setting or switching an interface element to a focus state herein refers to switching an interface element in a non-focus state to a focus state.
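The window-to-window switch shown in figs. 2A-2B (leave window 1 at its right edge, pick window 2 as the target, then focus its first focusable control) can be sketched as follows. The geometry test and the "first focusable control" rule are simplifications; the actual search policies are described later with figs. 5A to 6.

```kotlin
// Simplified sketch of multi-window focus switching: when the focus window has
// nothing focusable in the requested direction, pick the nearest window in that
// direction and focus a focusable control inside it. Names and geometry are assumptions.
data class Control(val name: String, val focusable: Boolean)
data class Window(val name: String, val leftEdge: Int, val controls: List<Control>)

fun switchFocusRight(focusWindow: Window, allWindows: List<Window>): Pair<Window, Control>? {
    // 1) Candidate target windows: strictly to the right of the focus window, nearest first.
    val target = allWindows
        .filter { it.leftEdge > focusWindow.leftEdge }
        .minByOrNull { it.leftEdge - focusWindow.leftEdge }
        ?: return null                                         // no window to the right
    // 2) Inside the target window, pick a control that can obtain focus.
    val control = target.controls.firstOrNull { it.focusable } ?: return null
    return target to control                                   // e.g. (window 2, control 21)
}

fun main() {
    val window1 = Window("window 1", 0, listOf(Control("control 2", true)))
    val window2 = Window("window 2", 100, listOf(Control("control 21", true)))
    println(switchFocusRight(window1, listOf(window1, window2)))  // -> (window 2, control 21)
}
```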
Referring to fig. 3, a schematic diagram of a software architecture of the electronic device shown in fig. 2A to 2B is exemplarily described.
The layered architecture divides the software into several layers, each with a clear role and division of labor. The layers communicate with each other through software interfaces. In some embodiments, the system is divided into four layers: from top to bottom, the application layer, the application framework layer, the runtime and system libraries, and the kernel layer.
The application layer may include a series of application packages.
As shown in fig. 3, the application package may include applications for cameras, gallery, calendar, phone calls, maps, navigation, WLAN, bluetooth, music, video, short messages, etc. For convenience of description, an application program will be hereinafter simply referred to as an application. The application on the electronic device may be a native application or a third party application, and the embodiments of the present application are not limited.
The application framework layer provides an application programming interface (application programming interface, API) and programming framework for application programs of the application layer. The application framework layer includes a number of predefined functions.
As shown in fig. 3, the application framework layer may include a layout subsystem, a window management service (window manager service, WMS) and an input event management service (input manager service, IMS).
The layout subsystem is used for managing mapping controls, including creating and destroying mapping controls, managing the association between mapping controls and source controls, and laying out mapping controls in the system auxiliary window.
The layout subsystem may load the mapping control configuration file into memory. The mapping control configuration file is configured with information of a source control supporting mapping. In the mapping control configuration file, the source control supporting mapping may be identified by using an application package name and a control name, or may be identified by using an application package name, a control name and a control unique identifier, which is not limited herein.
In some embodiments, the layout subsystem being responsible for the association management of mapping controls and source controls may mean that the layout subsystem maintains a mapping table between mapping controls and source controls. In the mapping table, mapping controls correspond to source controls one to one, that is, the electronic device can look up the source control corresponding to a mapping control from the table.
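As an illustration only, such a one-to-one association could be kept in a simple registry like the following sketch; the class and identifier names (for example, a key built from package name plus control name) are assumptions and not the actual table format used by the layout subsystem.

```java
import java.util.HashMap;
import java.util.Map;

/** Hypothetical registry associating mapping controls with their source controls. */
public class ControlMappingTable {
    // Key: identifier of the mapping control; value: identifier of the source control.
    private final Map<String, String> mappingToSource = new HashMap<>();

    /** Registers a one-to-one association between a mapping control and its source control. */
    public void associate(String mappingControlId, String sourceControlId) {
        mappingToSource.put(mappingControlId, sourceControlId);
    }

    /** Looks up the source control corresponding to a mapping control, or null if none. */
    public String findSourceControl(String mappingControlId) {
        return mappingToSource.get(mappingControlId);
    }

    /** Removes the association when the mapping control is destroyed. */
    public void dissociate(String mappingControlId) {
        mappingToSource.remove(mappingControlId);
    }
}
```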
In some embodiments, the layout subsystem communicates with the input event management service and is responsible for managing which control in a window obtains focus in response to input events from the input event management service.
The window management service is used for managing the display of the window and the management of the task stack of the window. The window management service can set the size and window state of the window, and can also be used for updating the size, display content and display state of the window.
The window management service may also obtain the size of the display screen, determine whether there is a status bar, lock the screen, capture the screen, and the like.
The window management service is also responsible for the switching management of the window with focus, and provides various window display capabilities such as full-screen display, multi-window display and the like.
The input event management service may translate and package the original input event to obtain an input event containing more information, and send the input event to the window management service. The window management service stores the clickable areas (such as controls) of each application program, the location information of the window in the focus state (focus window for short), and the like. With this information, input events can be properly distributed to a designated control or the focus window.
In some embodiments, the application framework layer may also include an activity management service (activity manager service, AMS), a content provider, a view system, and the like.
The Activity management service is used for managing Activity and is responsible for the work of starting, switching, scheduling and the like of each component in the system and the management and scheduling of application programs.
The content provider is used to store and retrieve data and make such data accessible to applications. The data may include video, images, audio, calls made and received, browsing history and bookmarks, phonebooks, etc.
The view system includes visual controls, such as controls to display text, controls to display pictures, and the like. The view system may be used to build applications. The display interface may be composed of one or more views. For example, a display interface including a text message notification icon may include a view displaying text and a view displaying a picture.
The Runtime (run time) includes core libraries and virtual machines. Run time is responsible for scheduling and management of the system.
The core library consists of two parts: one part is the function that the programming language (e.g., java language) needs to call, and the other part is the core library of the system.
The application layer and the application framework layer run in a virtual machine. The virtual machine executes the programming files (e.g., java files) of the application layer and the application framework layer as binary files. The virtual machine is used for executing the functions of object life cycle management, stack management, thread management, security and exception management, garbage collection and the like.
The system library may include a plurality of functional modules. For example: surface manager (surface manager), media library (Media Libraries), two-dimensional graphics processing library, three-dimensional graphics processing library (e.g., openGL ES), etc.
The surface manager is used to manage the display subsystem and provides a fusion of two-dimensional and three-dimensional layers for multiple applications.
Media libraries support a variety of commonly used audio, video format playback and recording, still image files, and the like. The media library may support a variety of audio and video encoding formats, such as MPEG4, h.264, MP3, AAC, AMR, JPG, PNG, etc.
The three-dimensional graphic processing library is used for realizing 3D graphic drawing, image rendering, synthesis, layer processing and the like.
The 2D graphics engine is a drawing engine for 2D drawing.
The kernel layer is a layer between hardware and software. The kernel layer may contain display drivers, input/output device drivers (e.g., keyboard, camera, remote control, touch pad on keyboard, touch screen, headphones, speakers, microphones, etc.), device nodes, camera drivers, audio drivers, and sensor drivers, among others. The user performs input operation through the input device, and the kernel layer can generate corresponding original input events according to the input operation and store the corresponding original input events in the device node.
It is emphasized that fig. 3 is merely a schematic example; the software structure of the electronic device provided in the embodiment of the application may also adopt other software architectures, such as Linux or other operating system software architectures.
Based on the software modules in fig. 3, please refer to fig. 4, which exemplarily illustrates a software interaction flow for focus switching in a multi-window display scene according to an embodiment of the present application.
Step S41, the input/output device driver acquires an input event.
As shown in fig. 2A, the user 200 presses the right key 14 of the keyboard 12, i.e., the user 200 enters a first input. The input/output device driver detects an input event and acquires the input event.
Wherein the first input is an input regarding a focus switching direction. The input event is an event regarding the focus switching direction for indicating the direction of focus switching. The input event includes a first direction for indicating a focus switch. As shown in fig. 2A, the user 200 presses the right key 14 of the keyboard 12, and the notebook computer 100 obtains an input event, and the first direction is right, that is, indicates to switch focus to the right.
Input events include, but are not limited to, the following: 1) Key events of the keyboard input device include up-down-left-right directional key input events. 2) The key event of the remote controller comprises an up-down left-right direction key input event. 3) The sliding event of the touch pad on the keyboard comprises an up-down left-right sliding event. 4) Gesture events recognized by the camera device comprise gesture events in the up-down and left-right directions.
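For illustration only, these different input sources all reduce to a directional event carrying the first direction; a minimal sketch of such an event is shown below. The type and field names are assumptions, not the framework's actual types.

```java
/** Hypothetical direction carried by a directional input event. */
enum FocusDirection {
    UP, DOWN, LEFT, RIGHT
}

/** Hypothetical wrapper unifying keyboard keys, remote-control keys, touch-pad swipes
 *  and camera-recognized gestures into a single event carrying the first direction. */
class DirectionalInputEvent {
    final FocusDirection direction;   // the first direction indicated by the input
    final long timestampMillis;       // when the input occurred

    DirectionalInputEvent(FocusDirection direction, long timestampMillis) {
        this.direction = direction;
        this.timestampMillis = timestampMillis;
    }
}
```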
Step S42, the input/output device driver reports the input event.
After the input/output device driver detects an input event, the input event is reported to the input event management service.
Step S43, the input event management service distributes the input event to the first window in the focus state.
The window management service is responsible for managing the focus window, and stores the clickable area (such as controls) of each application program, the position information of the window where the focus is located, and the like. After an input event is received, the window management service typically decides that the input event is to be processed by the window in focus. Each time the electronic device updates the focus window, the window management service sends the information of the focus window to the input event management service. Thus, the input event management service can properly distribute the input event to the control in focus, a designated control, or the window in focus, for example to the first window in the focus state.
Specifically, the input event management service sends the received input event to the window management service, the window management service determines that the window for processing the input event is a first window in a focus state, and then the input event management service sends the input event to a client corresponding to the first window according to the first window determined by the window management service. The client of the first window processes the input event.
As shown in fig. 2A, the first window in focus is window 1. The window management service determines that the window that handles the input event is window 1, and the input event management service distributes the input event to window 1.
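A simplified sketch of this dispatch decision is given below, under the assumption that the window management service exposes a focus-window query and that the directional event is the DirectionalInputEvent type sketched earlier; all interface and method names are illustrative, not the actual service APIs.

```java
/** Hypothetical distribution of a directional input event to the focus window's client. */
class InputEventDispatcher {
    private final WindowManagerLike windowManager;

    InputEventDispatcher(WindowManagerLike windowManager) {
        this.windowManager = windowManager;
    }

    void dispatch(DirectionalInputEvent event) {
        // Ask the window management service which window currently has focus.
        WindowLike focusWindow = windowManager.getFocusWindow();
        if (focusWindow != null) {
            // Hand the event to the client of the focus window for processing.
            focusWindow.getClient().handleInputEvent(event);
        }
    }
}

interface WindowManagerLike { WindowLike getFocusWindow(); }
interface WindowLike { ClientLike getClient(); }
interface ClientLike { void handleInputEvent(DirectionalInputEvent event); }
```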
Step S44, the layout subsystem searches for a control in the first window according to the first direction corresponding to the input event.
In step S44, the electronic device searches for an interface element for focal point switching in the first window according to the first direction corresponding to the input event. Specifically, after the input event management service distributes the input event to the first window in the focus state in response to the decision of the window management service, the window management service calls the interface of the layout subsystem to find (query) whether other interface elements for focus switching exist in the first window according to the first direction of the input event. As shown in fig. 2A, the window management service invokes the interface of the layout subsystem to look right within window 1 for a control available for focus switching.
In the embodiment of the application, the strategies for searching for a control for focus switching include, but are not limited to, the following: 1) Search for the control according to the first direction. For example, at least part of the content of the control being sought is in the first direction of the interface element (e.g., control) where the focus is located. As shown in fig. 2A, the first direction is rightward and the control with focus is control 2, so part of the content of the found control is on the right of control 2. 2) The control found for focus switching is in a visible state, that is, the control is set to visible.
In some embodiments, the control found by the layout subsystem within the first window according to the first direction must be able to obtain focus. If the control cannot obtain focus, the focus cannot be switched to that control.
Illustratively, as shown in FIG. 2A, the first direction is rightward and the focus control is control 2. Starting from control 2, a control for focus switching is searched for to the right according to the first direction. Since control 2 is at the edge of window 1, there is no control on the right side of control 2, and therefore no control is available for focus switching in the first window.
Illustratively, as shown in FIG. 1A, the first direction is to the right and the focus control is control 1. And searching a control for focal point switching to the right according to the first direction by taking the control 1 as a starting point, finding a control 2 at the right side of the control 1, and if the control 2 cannot acquire the focal point, then no control for focal point switching exists in the first window.
If the layout subsystem finds a control for focus switching in the first window according to the first direction, the layout subsystem switches the control to the focus state. As a visual cue, the border of the control may be bolded, or the control may be set to a highlighted state, which may be realized by modifying the background of the control or by re-coloring and redrawing its border, so as to highlight the focus attribute of the control.
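A condensed sketch of searching within the focus window for the next focusable control in the first direction is given below, under the assumptions that each control exposes its bounds, visibility, and whether it can obtain focus, and reusing the FocusDirection enum from the earlier sketch; all names are illustrative and do not describe the layout subsystem's real implementation.

```java
import java.awt.Rectangle;
import java.util.List;

/** Hypothetical in-window search for the next control in the given direction. */
class WindowFocusSearch {

    /** Returns the nearest focusable control whose content lies in the first direction, or null. */
    static ControlInfo findNext(ControlInfo current, List<ControlInfo> controls, FocusDirection dir) {
        ControlInfo best = null;
        for (ControlInfo candidate : controls) {
            if (candidate == current || !candidate.visible || !candidate.focusable) {
                continue;  // only visible controls that can obtain focus are eligible
            }
            if (!isInDirection(current.bounds, candidate.bounds, dir)) {
                continue;  // part of the candidate must lie in the first direction
            }
            if (best == null || distance(current.bounds, candidate.bounds)
                    < distance(current.bounds, best.bounds)) {
                best = candidate;  // prefer the closest eligible control
            }
        }
        return best;  // null means no control is available for focus switching in this window
    }

    private static boolean isInDirection(Rectangle from, Rectangle to, FocusDirection dir) {
        switch (dir) {
            case RIGHT: return to.getMaxX() > from.getMaxX();
            case LEFT:  return to.getMinX() < from.getMinX();
            case DOWN:  return to.getMaxY() > from.getMaxY();
            default:    return to.getMinY() < from.getMinY();  // UP
        }
    }

    private static double distance(Rectangle a, Rectangle b) {
        return Math.hypot(a.getCenterX() - b.getCenterX(), a.getCenterY() - b.getCenterY());
    }
}

class ControlInfo {
    Rectangle bounds;
    boolean visible;
    boolean focusable;
}
```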
If the control available for focus switching is not found in step S44, step S45 is continued.
And step S45, when no control for focus switching exists, the window management service searches for a target window according to the first direction.
In step S45, when no control available for focus switching is found in the first window according to the first direction in step S44, the layout subsystem invokes an interface of the window management service and notifies the window management service to find an interface element (e.g., a target window) available for focus switching according to the first direction. As shown in fig. 2A, since there is no control available for focus switching on the right of control 2, the window management service looks for a target window for focus switching according to the first direction.
Strategies for finding target windows include, but are not limited to, the following:
1) The target window is found according to the first direction. For example, at least part of the content of the target window is in the first direction of the focus window. As shown in fig. 2A, the first direction is rightward and the focus window is window 1, so part of the content of the found target window is to the right of the current focus window (window 1).
In some embodiments, a special case allows the target window not to be in the first direction, for example when the focus window is already at the edge of the user interface. As shown in fig. 2B, the focus window is window 2, and window 2 is at the edge of the user interface 19. If the user 200 continues to press the right key 14, the notebook computer 100 may, in response, choose to search for a target window starting from the left side of the user interface 19 and determine that the target window is window 1; in this case the target window (window 1) is in the direction opposite to the first direction of the focus window.
2) The window state of the target window is visible (visible). I.e. the target window is at least partially visible on the user interface, e.g. in special scenarios the target window may be partially obscured by other windows.
The target window may be an application window or a system window, where the system window may include special windows such as a drop-down notification bar and a system pop-up window; the embodiment of the present application is not limited in this regard.
The following examples illustrate the policy of finding a target window according to the first direction:
example one:
referring to fig. 5A, the user 200 presses the right key 14, and the direction of the input event is rightward, i.e., the first direction is rightward. Taking the first window as the starting point, a target window is searched for to the right. The target window may be any one of window A, window B, window C, and window D, as long as the target window is in the first direction of the first window. Preferably, the target window is window A or window B.
Example two:
referring to fig. 5B, the user 200 presses the right key 14, and the direction of the input event is rightward, i.e., the first direction is rightward. Taking the first window as the starting point, a target window is searched for to the right. The target window may be any one of window A, window B, window C, and window D. Since part of the content of window A is to the right of the first window and window A is closest to the first window, the target window is preferably window A.
Example three:
referring to fig. 5C, the user 200 presses the left key 15, and the direction of the input event is leftward, i.e., the first direction is leftward. Taking the first window as the starting point, a target window is searched for to the left. If no target window is found to the left of the first window, searching in the direction opposite to the first direction may be considered, in which case window A, window B, window C, and window D in fig. 5C can all be target windows. The preferred target windows are window C and window D.
In the embodiment of the present application, if the window management service cannot find the target window in step S45, the flow is terminated, and the current input event is not processed.
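A minimal sketch of selecting a target window in the first direction is shown below, assuming the window management service knows the bounds and visibility of each window, and reusing the FocusDirection enum sketched earlier; the wrap-around to the opposite side of the user interface described for fig. 2B is omitted, and all names are illustrative.

```java
import java.awt.Rectangle;
import java.util.List;

/** Hypothetical selection of the target window in the first direction. */
class TargetWindowSearch {

    /** Returns the closest visible window with content in the given direction, or null if none. */
    static WindowInfo findTargetWindow(WindowInfo focusWindow, List<WindowInfo> windows,
                                       FocusDirection dir) {
        WindowInfo best = null;
        for (WindowInfo candidate : windows) {
            if (candidate == focusWindow || !candidate.visible) {
                continue;  // the target window must be at least partially visible
            }
            if (!partlyInDirection(focusWindow.bounds, candidate.bounds, dir)) {
                continue;  // part of the target window must lie in the first direction
            }
            if (best == null || gap(focusWindow.bounds, candidate.bounds, dir)
                    < gap(focusWindow.bounds, best.bounds, dir)) {
                best = candidate;  // prefer the window nearest to the focus window
            }
        }
        return best;  // null: no target window; the current input event is not processed
    }

    private static boolean partlyInDirection(Rectangle from, Rectangle to, FocusDirection dir) {
        switch (dir) {
            case RIGHT: return to.getMaxX() > from.getMaxX();
            case LEFT:  return to.getMinX() < from.getMinX();
            case DOWN:  return to.getMaxY() > from.getMaxY();
            default:    return to.getMinY() < from.getMinY();  // UP
        }
    }

    private static double gap(Rectangle from, Rectangle to, FocusDirection dir) {
        switch (dir) {
            case RIGHT: return to.getMinX() - from.getMaxX();
            case LEFT:  return from.getMinX() - to.getMaxX();
            case DOWN:  return to.getMinY() - from.getMaxY();
            default:    return from.getMinY() - to.getMaxY();  // UP
        }
    }
}

class WindowInfo {
    Rectangle bounds;
    boolean visible;
}
```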
Step S46, the window management service sets the target window to the focus state.
After the window management service finds the target window in step S45, the following settings may be made for the target window: 1) The target window is set to the focus state, i.e., the target window is switched to be the focus window. As shown in fig. 2B, the window management service finds that the target window is window 2 and bolds the border of window 2. 2) The level of the target window is raised, and may optionally be set higher than that of other application windows.
Step S47, the layout subsystem searches for a control in the target window.
After the window management service finds the target window, the window management service can call the interface of the layout subsystem to find a control for focus switching in the target window.
In the embodiment of the application, strategies for searching for a control available for focus switching in the window include, but are not limited to, the following: 1) Searching in the first direction (i.e., the direction of the input event). Referring to fig. 6, the user 200 presses the right key 14, and the direction of the input event is rightward. After the window management service determines the target window according to the first direction, it calls the interface of the layout subsystem to search for the target focus control in the target window, and any one of control 1, control 3, or control 5 can be determined according to the first direction (rightward). 2) Searching in the first direction while searching from top to bottom; as shown in fig. 6, control 1 can be determined. 3) Searching in the first direction while searching from bottom to top; as shown in fig. 6, control 5 can be determined. 4) Not searching in the first direction; as shown in fig. 6, when searching for the target focus control in the target window, any one of controls 1-5 can be found. The present application is not particularly limited thereto. A sketch of these strategies follows below.
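For illustration only, choosing the entry control of a newly focused target window can be expressed as an ordering over the window's focusable controls; the sketch below reuses the ControlInfo type from the earlier sketch and shows a rightward entry that prefers the leftmost controls, breaking ties top-to-bottom or bottom-to-top. All names are assumptions, and the real layout subsystem is not limited to this form.

```java
import java.util.Comparator;
import java.util.List;
import java.util.Optional;

/** Hypothetical choice of the first control to receive focus inside the target window. */
class TargetControlSearch {

    /** Entering a window while moving right: prefer the leftmost controls,
     *  breaking ties from top to bottom (strategy 2) or from bottom to top (strategy 3). */
    static Optional<ControlInfo> entryControlMovingRight(List<ControlInfo> controls,
                                                         boolean topToBottom) {
        Comparator<ControlInfo> byX = Comparator.comparingDouble(c -> c.bounds.getMinX());
        Comparator<ControlInfo> byY = Comparator.comparingDouble(c -> c.bounds.getMinY());
        Comparator<ControlInfo> order = topToBottom ? byX.thenComparing(byY)
                                                    : byX.thenComparing(byY.reversed());
        return controls.stream()
                .filter(c -> c.visible && c.focusable)  // only visible, focusable controls
                .min(order);
    }
}
```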
Step S48, the layout subsystem sets the control to be in a focus state.
In this embodiment of the present application, after the interface of the layout subsystem is called by the window management service to find a control for focal point switching, the layout subsystem may directly set the state of the control to be the focal point state.
In the embodiment of the application, after the layout subsystem sets the control for focus switching to the focus state and the window management service sets the target window to the focus state, the layout subsystem and the window management service call the display driver to display the visual cues of the control where the focus is located and of the target window. As shown in fig. 2B, the display driver bolds the border of control 21. In some embodiments, the focus states of window 1 and control 2, where the focus was originally located, may be canceled.
In the embodiment of the application, the focus can be switched quickly by an input indicating the first direction of focus switching, which greatly improves the efficiency of devices that provide direction input, such as a remote controller, a touch pad on a keyboard, or a camera device. The focus can be switched between different windows, so that focus switching is no longer limited to the same window. This brings convenience to the user, improves the user experience, and brings more possibilities to the interaction between the user interface and the user.
In some embodiments, such as on the user interface 19 shown in fig. 2B, the user 200 presses the right key 14 and the focus switches to control 22. The user then presses the right key 14 again. Control 22 is the rightmost control of window 2, and window 2 is the rightmost window of the user interface 19, so when the electronic device continues to search for interface elements available for focus switching according to the first direction, no interface element is available for focus switching.
The above embodiments can only switch focus within the same electronic device. When the focus is at the edge of the user interface (display screen), an interface element available for focus switching is searched for in the user interface where the focus is located according to the first direction; when the search result is that no interface element is available for focus switching, the focus cannot be switched according to the first direction. In view of this, an embodiment of the present application provides a focus switching method applied to a scenario in which a plurality of electronic devices are used cooperatively, in which the focus can be switched between the plurality of electronic devices. When the focus is at the edge of the user interface, an interface element for focus switching is searched for in the focus window according to the first direction; if the search result is that no interface element is available for focus switching, the focus can be switched to another electronic device according to the first direction, for example to another electronic device located in the first direction.
In the following, the application of focus switching in a scenario in which multiple electronic devices are used cooperatively is described. The content related to focus switching in the multi-window display scene described above (for example, the policy of searching for an interface element (a control or a target window) for focus switching according to the first direction) is also applicable to the scenario in which multiple electronic devices are used cooperatively; that is, in such a scenario, an interface element may be found with reference to the above policy of searching for an interface element for focus switching according to the first direction. If any one or more of the plurality of electronic devices displays multiple windows, the one or more electronic devices may apply the above focus switching method for the multi-window display scene, and the focus may be switched between multiple windows of the same electronic device. The present application is not particularly limited thereto.
Scene in which multiple electronic devices are used cooperatively
In a scenario in which multiple electronic devices are used cooperatively, the multiple electronic devices are communicatively connected, where the plurality of electronic devices includes two or more electronic devices. Take as an example the case where the plurality of electronic devices includes a first electronic device and a second electronic device. The first electronic device and the second electronic device can be connected together through a communication network to realize functions such as screen projection, multi-screen collaboration, screen expansion, sharing of input devices and the like. The communication network may be a wired network or a wireless network. The communication network may be implemented using any known network communication protocol, which may be various wired or wireless communication protocols such as ethernet, universal serial bus (universal serial bus, USB), high definition multimedia interface (High Definition Multimedia Interface, HDMI), global system for mobile communications (global system for mobile communications, GSM), general packet radio service (general packet radio service, GPRS), code division multiple access (code division multiple access, CDMA), wideband code division multiple access (wideband code division multiple access, WCDMA), time division code division multiple access (time-division code division multiple access, TD-SCDMA), long term evolution (long term evolution, LTE), bluetooth, wireless fidelity (wireless fidelity, Wi-Fi), NFC, voice over internet protocol (voice over Internet protocol, VoIP), communication protocols supporting a network slice architecture, or any other suitable communication protocol.
In the embodiments of the present application, an electronic device that projects a display interface of the electronic device may be referred to as a source device, and an electronic device that receives the projection of the source device and displays the display interface of the source device may be referred to as a sink device. The interface projected by the source device displayed on the receiving device is referred to as a screen projection interface, and the window used by the receiving device to display the screen projection interface is referred to as a screen projection window.
In a scenario in which multiple electronic devices are used cooperatively, the multiple electronic devices may use, but are not limited to, the following cooperative approach:
1) Mirror mode. The source device projects the display interface on its main display to the receiving device, and the receiving device may display the screen projection interface of the source device either full screen or non-full screen. The mirror mode uses mirroring technology, which includes a variety of technology types, such as Miracast, Airplay, private mirroring techniques, and the like.
2) And (5) heterogeneous screen projection. The source device creates a virtual screen (virtual display) and screens the interface on the virtual screen to the sink device. The receiving device may display an interface of the virtual screen either full screen or non-full screen.
3) Sharing mode. The electronic devices do not throw screen, and only the input devices are shared. The user may operate the first electronic device as well as the second electronic device using the input device. If the user uses the input device to control the first electronic device, the first electronic device can share the input device to the second electronic device, so that the user can control the second electronic device.
The mirror image screen-throwing technology, the heterogeneous screen-throwing mode and the sharing mode are the prior art, and are not described herein.
The difference between mirror screen projection and heterogeneous screen projection is that in mirror screen projection the audio and video rendered by the source device and by the receiving device are identical: when a picture, audio, or video is opened on the source device, the receiving device also displays the picture or plays the audio or video. In heterogeneous screen projection, the screen projection interface in the screen projection window displayed on the receiving device is different from the content of the display interface of the source device.
It can be appreciated that the embodiment of the present application does not specifically limit the device configuration and the number of electronic devices in a coordinated state.
In embodiments of the present application, the resources shared between electronic devices (mirrored or heterologous) may include any one or a combination of the following: video, text, pictures, photographs, audio, or forms, etc. For example, it may be an image resource such as a movie, a television show, a short video, a musical show, etc.
In a scenario in which a plurality of electronic devices are used cooperatively, the plurality of electronic devices have previously established communication connections, and the plurality of electronic devices are in a cooperative state. The manner of establishing the communication connection between the electronic devices is not particularly limited. The following embodiments are described taking as an example that communication connections have been established between the plurality of electronic devices, which are in a cooperative state.
Take the case where the first electronic device is a notebook computer and the second electronic device is a tablet as an example.
As shown in fig. 7A, the notebook computer 100 is communicatively connected to the tablet 300, and the notebook computer 100 and the tablet 300 are in a cooperative state. The notebook computer 100 displays a user interface 110 with window 2 and control 22 in the focus state. In response to the user 200 pressing the right key 14, the notebook computer 100 looks to the right for an interface element available for focus switching. Since control 22 is the rightmost control of window 2 and window 2 is located at the right edge of the user interface 110, i.e., window 2 is the rightmost window of the display screen of the notebook computer 100, no interface element available for focus switching is found to the right. As shown in fig. 7B, the tablet 300 is positioned to the right of control 22. Starting from control 22, the tablet 300 is found to the right. After the tablet 300 is determined, an interface element for focus switching, such as window 3, is found on the tablet 300. A control for focus switching can also be found in window 3, for example the leftmost control 31 in window 3. Window 3 and control 31 are switched to the focus state, and the focus states of window 2 and control 22 are canceled.
When the notebook computer 100 and the tablet 300 adopt the mirror screen projection mode, the notebook computer 100 projects window 2 to the tablet 300, window 3 is an example of a screen projection window, and the contents of window 2 and window 3 are the same. When the notebook computer 100 and the tablet 300 adopt the heterogeneous screen projection mode, one or more of window 3 and window 4 are content on the virtual screen of the notebook computer 100. When the notebook computer 100 and the tablet 300 adopt the sharing mode, window 1 and window 2 are native windows running on the notebook computer 100, and window 3 and window 4 are native windows running on the tablet 300.
In some embodiments, the plurality of electronic devices may further include a third electronic device. The first electronic device, the second electronic device and the third electronic device are in a cooperative state, the second electronic device is located on the right side of the first electronic device, and the third electronic device is located on the right side of the second electronic device. In response to user 200 input indicating a focus switch direction (e.g., indicating to switch focus to the right), focus may be switched from the first electronic device to the second electronic device and then from the second electronic device to the third electronic device.
Taking the third electronic device as a mobile phone, as shown in fig. 7C, the notebook computer 100, the tablet pc 300 and the mobile phone 400 are in a collaborative state, the tablet pc 300 is located on the right side of the notebook computer 100, and the mobile phone 400 is located on the right side of the tablet pc 300. The user 200 presses the right key 14 and the focus moves to the right until the focus falls on the control 42 of the window 4 of the tablet 300. The user 200 continues to press the right key 14, as shown in fig. 7C, switching the window 5 and the control 51 to the focus state, canceling the focus state of the window 4 and the control 42. The focus is switched from notebook 100 to tablet 300, then from tablet 300 to handset 400, and the focus is switched from control 42 of window 4 to control 51 of window 5.
In some embodiments, the electronic device where the focus is located at the beginning is the first electronic device; if there is no interface element available for focus switching on the electronic devices cooperating with the first electronic device, the focus may be switched back to the first electronic device. As shown in fig. 7C, the user 200 continues to press the right key 14, switching the focus from control 51 of window 5 to control 52 of window 5. The user 200 continues to press the right key 14; there is no interface element available for focus switching in the right direction on the mobile phone 400, and there is no electronic device available for focus switching to the right, so the focus is switched from control 52 of window 5 to control 1 of window 1 of the notebook computer 100.
In some embodiments, the focus may skip the second electronic device located between the first electronic device and the third electronic device, the focus switching from the first electronic device to the third electronic device. As shown in fig. 7C, the user 200 presses the right key 14, interface elements available for focus switching are searched for to the right, and the tablet 300 located on the right side of the notebook computer 100 is determined. An interface element for focus switching is searched for on the tablet 300; if no interface element for focus switching is available on the user interface 111 of the tablet 300, interface elements for focus switching continue to be searched for to the right, and the mobile phone 400 located on the right side of the notebook computer 100 is determined as the second electronic device. If an interface element for focus switching is found, for example control 51 of window 5, the focus is switched from control 22 of window 2 of the notebook computer 100 to control 51 of window 5 of the mobile phone 400.
The technical implementation of the above-described cooperative manner will be described in detail below.
Mirror mode:
referring to fig. 8, an example of a software architecture in the mirror mode of the first electronic device and the second electronic device is given.
The left half of fig. 8 is a software architecture diagram of the source device. The software architecture of the source device may include an application layer, an application framework layer, and a driver layer; it should be noted that more or fewer layers may be included, and embodiments of the present application are not limited.
The content of the application layer may refer to the application layer in fig. 3, and will not be described herein. The application layer of fig. 8 differs from the application layer of fig. 3 in that: the application layer of fig. 8 includes an application of screen-casting management for managing screen-casting. Applications for screen-cast management include, for example, device connection applications. The device connection application may send instructions to a cross-device connection management framework in an application framework layer for controlling connection of a source device with a sink device.
The application framework layer includes a base framework, a cross-device connection management framework, and a data management framework.
The basic framework includes an input event management service, a layout subsystem, and a window management service, where the content of the input event management service, the layout subsystem, and the window management service refer to the related description of fig. 3, and are not described herein.
And the cross-device connection management framework is responsible for controlling the driving layer to realize the functions of proximity discovery, authentication, connection and the like between the source device and the receiving device.
In the embodiment of the application, the cross-device connection management framework stores connection information, direction information, distance information, and the like of the electronic devices that are communicatively connected to it. The connection information may be a device identifier of the electronic device, such as an internet protocol (internet protocol, IP) address, a port number, or an account logged in on the electronic device. The account logged in on the electronic device may be an account provided by an operator for the user, such as a Huawei account, or an application account, such as a WeChat account. The direction information includes information about the up, down, left, and right directions, and the distance information includes the distance between electronic devices. For example, in fig. 7C, the cross-device connection management framework of the notebook computer 100 may store the device identifiers of the tablet 300 and the mobile phone 400, the direction information indicating that the tablet 300 and the mobile phone 400 are to the right of the notebook computer 100, the distance between the tablet 300 and the notebook computer 100, and the distance between the mobile phone 400 and the notebook computer 100.
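For illustration only, the per-device information kept by the cross-device connection management framework might be modeled as the following record, together with a lookup of the nearest device in a given direction; the field and method names are assumptions, not the framework's real schema.

```java
import java.util.Comparator;
import java.util.List;
import java.util.Optional;

/** Hypothetical per-device entry kept by the cross-device connection management framework. */
class PeerDeviceInfo {
    String deviceId;          // e.g. an IP address and port, or a logged-in account identifier
    Direction direction;      // where the peer device lies relative to this device
    double distanceMeters;    // distance between the two devices

    enum Direction { UP, DOWN, LEFT, RIGHT }

    /** Returns the nearest connected device in the given direction, if any. */
    static Optional<PeerDeviceInfo> nearestInDirection(List<PeerDeviceInfo> peers,
                                                       Direction dir) {
        return peers.stream()
                .filter(p -> p.direction == dir)
                .min(Comparator.comparingDouble(p -> p.distanceMeters));
    }
}
```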
And the data management framework is responsible for data code streams such as audio, video, layers and the like transmitted between the source equipment and the receiving equipment, and can also be responsible for reverse control and the like. The reverse control is reverse event control, and the reverse event is an event triggered by the receiving device and used for controlling the source device.
The driving layer includes an underlying driver layer and is responsible for discovery, authentication, connection, and the like; for example, the driving layer receives commands issued by the cross-device connection management framework in the application framework layer and performs actions such as connecting and disconnecting. Specifically, the driving layer includes a device discovery module, a device authentication module, and a device connection module, which are responsible for device discovery, device authentication, and device connection respectively. The driving layer may also include hardware drivers such as a display driver.
The right half of fig. 8 is a software architecture diagram of the receiving device. The software architecture of the receiving device may include an application layer, an application framework layer, and a driver layer. The function modules of the layers and the roles of the function modules can be referred to the source device. That is, the roles of the receiving device and the source device may be interchanged, and the receiving device may also screen the source device.
The receiving device differs from the source device in that the application layer of the receiving device has a projection display application. The projection display application is used for displaying the screen projection content of the remote device. It will be appreciated that the source device may also have a projection display application installed thereon.
Taking the first electronic device as a tablet, the second electronic device as a notebook computer as an example.
Referring to fig. 9A to 9C, an application of the focus switching method provided in the embodiment of the present application to a mirror mode scene is exemplarily described.
As shown in fig. 9A, the tablet 300 serves as the source device and the notebook computer 100 serves as the receiving device. A user interface 302 is displayed on the display screen 301 of the tablet 300 and includes a native window 303. The tablet 300 projects the image data (the memo interface) of the native window 303 to the notebook computer 100. The notebook computer 100 displays the projected data in non-full-screen mode: the display screen 101 of the notebook computer 100 displays a user interface 102 that includes a screen projection window 103 and a native window 104, where the native window 104 displays a video interface and the screen projection window 103 displays the memo interface projected from the tablet 300. The native window 303 and the note control 304 on the tablet 300 are in the focus state. The screen projection window 103 in fig. 9A and the note control on the memo interface displayed in the screen projection window 103 are in the focus state. The user 200 presses the right key 14. As shown in fig. 9B, since the notebook computer 100 is located on the right side of the tablet 300, the native window 303 and the note control 304 of the tablet 300 cancel the focus state, and the notebook computer 100 switches the native window 104 and the control 105 to the focus state. That is, according to the rightward direction indicated by the right key 14, the focus is switched from the native window 303 and the note control 304 of the tablet 300 to the native window 104 and the control 105 of the notebook computer 100. The focus is switched between the tablet 300 and the notebook computer 100; as seen on the notebook computer 100, the focus is switched between the screen projection window 103 and the native window 104.
As shown in fig. 9B, the user 200 presses the left key 15 on the notebook computer 100. As shown in fig. 9C, the native window 104 and the control 105 of the notebook computer 100 cancel the focus state, and the native window 303 and the note control 304 on the tablet 300 are switched to the focus state. That is, according to the leftward direction indicated by the left key 15, the focus is switched from the native window 104 and the control 105 of the notebook computer 100 to the native window 303 and the note control 304 of the tablet 300. The focus is switched between the tablet 300 and the notebook computer 100; as seen on the notebook computer 100, the focus is switched between the screen projection window 103 and the native window 104. Further, since the tablet 300 and the notebook computer 100 are in a cooperative state, the notebook computer 100 can reversely control the tablet 300: in response to the user 200 pressing the left key 15 on the notebook computer 100, and since the tablet 300 is located to the left of the notebook computer 100, the focus is switched to the tablet 300.
While the native window 303 on the tablet 300 is in the focus state, the screen projection window 103 projected to the notebook computer 100 may or may not be in the focus state, and the screen projection window 103 on the notebook computer 100 may or may not display a visual cue of the focus state. The level of the screen projection window 103 may or may not be higher than that of the native window 104. The present application is not particularly limited thereto.
Referring to fig. 10A, a software interaction flow of focus switching shown in fig. 9A to 9B in the mirror mode is exemplarily described based on the software modules shown in fig. 8.
In step S101, the input event management service of the notebook computer 100 acquires an input event.
As shown in fig. 9A, the user 200 presses the right key 14 of the notebook computer 100, and the input/output device of the notebook computer 100 detects an input event pressing the right key 14 and reports the input event to the input event management service.
In step S102, the input event management service of the notebook computer 100 distributes the input event to the projection display application.
In some embodiments, the input event management service of the notebook computer 100 forwards the input event to the window management service of the notebook computer 100. The window management service of the notebook computer 100 decides to distribute the input event to the window in the focus state or to the application corresponding to that window. As shown in fig. 9A, the window currently in the focus state on the notebook computer 100 is the screen projection window 103, so the input event management service distributes the input event to the application corresponding to the screen projection window 103 (i.e., the application corresponding to the interface element currently in the focus state). Since the screen projection interface displayed in the screen projection window 103 corresponds to the projection display application, the input event is distributed to the projection display application. In some embodiments, if the notebook computer 100 and the tablet 300 are the same network element, the notebook computer 100 and the tablet 300 have only one focus. As shown in fig. 9A, the native window 303 and the note control on the tablet 300 are in the focus state, while the screen projection window 103 on the notebook computer 100 is not in the focus state. After the input event management service of the notebook computer 100 receives the input event, the window management service of the notebook computer 100 determines that the window in the focus state is the native window 303 on the tablet 300, that is, the focus is not on the notebook computer 100, and therefore the input event needs to be distributed to the projection display application.
Step S103, the projection display application distributes the input event to the tablet 300.
In the embodiment of the present application, when the notebook computer 100 is used cooperatively with the tablet 300, the projection display application knows that its screen projection interface comes from the tablet 300, and therefore distributes the input event to the tablet 300. The projection display application transmits the input event to the cross-device connection management framework of the notebook computer 100, and the cross-device connection management framework of the notebook computer 100 transmits the received input event to the cross-device connection management framework of the tablet 300.
In other embodiments, the cross-device connection management framework of the notebook computer 100 stores the connection information of the electronic devices communicatively connected to it; the projection display application transmits the input event to the cross-device connection management framework of the notebook computer 100, and the cross-device connection management framework of the notebook computer 100 transmits the input event to the tablet 300 based on the stored connection information.
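A high-level sketch of this forwarding step on the receiving device is shown below, assuming a send method on a cross-device connection channel, a plain string payload, and reusing the FocusDirection enum sketched earlier; all names are hypothetical and only illustrate the idea of handing the directional event over to the source device.

```java
/** Hypothetical forwarding of a directional input event from the receiving device to the source device. */
class ProjectionInputForwarder {
    private final CrossDeviceChannel channel;  // stands in for the cross-device connection framework
    private final String sourceDeviceId;       // connection information of the projecting device

    ProjectionInputForwarder(CrossDeviceChannel channel, String sourceDeviceId) {
        this.channel = channel;
        this.sourceDeviceId = sourceDeviceId;
    }

    /** Called when the focus is not on this device: hand the event to the source device. */
    void forward(FocusDirection direction) {
        channel.send(sourceDeviceId, "INPUT_EVENT:" + direction.name());
    }
}

interface CrossDeviceChannel {
    void send(String deviceId, String payload);
}
```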
Step S104, the cross-device connection management framework of the tablet 300 injects the received input event into the input event management service of the tablet 300.
In step S105, the input event management service of the tablet pc 300 distributes the input event to the first window in the focus state.
As shown in fig. 9A, the input event management service distributes the input event to the native window 303 based on the current focus on the native window 303 on the tablet 300, that is, the first window in focus state is the native window 303. The native window 303 and note control 304 are in focus, with their borders bolded.
Step S106, the layout subsystem of the tablet 300 searches for a control in the first window according to the first direction corresponding to the input event.
As shown in fig. 9A, the first direction corresponding to the input event of the user 200 pressing the right key 14 is rightward. The control that obtains focus on the native window 303 is a note control 304, the tablet 300 starts with the note control 304, and the window management service of the tablet 300 calls the interface of the layout subsystem to find the interface element for focus switching rightward on the native window 303.
In step S107, when there is no control, the window management service of the tablet pc 300 searches for the target window according to the first direction.
As shown in fig. 9A, based on the note control 304 being located at the right edge of the native window 303, there is no interface element (such as a control) available for focus switching to the right on the native window 303, then the layout subsystem of the tablet 300 invokes the interface of the window management service to find other interface elements available for focus switching on the user interface 302, such as finding a target window available for focus switching.
In step S108, when there is no target window on the tablet pc 300, the window management service of the tablet pc 300 notifies the notebook pc 100 to find the target window.
As shown in fig. 9A, since the native window 303 is located at the right edge of the user interface 302, that is, at the right edge of the display screen 301, there is no interface element (target window) available for focus switching to the right on the user interface 302. The tablet 300 therefore looks to the right for an electronic device (cooperative device) to which it is communicatively connected. Based on its cross-device connection management framework (the tablet 300 and the notebook computer 100 are used cooperatively), the tablet 300 knows that the cooperative devices include the notebook computer 100 and that the notebook computer 100 is located to its right. After the tablet 300 determines the cooperative device notebook computer 100 according to the first direction, it notifies the notebook computer 100 to find the target window by transmitting first information to the notebook computer 100 through the cross-device connection management framework.
In this embodiment, the first information may be the search result of the tablet 300 searching for a target window on the user interface 302 according to the first direction, for example that there is no interface element available for focus switching on the user interface 302 in the first direction. The first information may also be information for controlling the notebook computer 100 to search for a target window on the user interface 102 according to the first direction.
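As an illustration only, the first information exchanged between the devices could be modeled as a small message carrying either a local search result or a search request; the structure and field names below are assumptions, not the actual message format, and the sketch reuses the FocusDirection enum from the earlier example.

```java
/** Hypothetical "first information" sent to the peer device when the local search fails. */
class FocusHandoverMessage {
    final FocusDirection direction;   // the first direction indicated by the original input
    final boolean searchResultEmpty;  // true: no interface element for focus switching was found locally
    final boolean requestSearch;      // true: the peer is asked to look for a target window itself

    FocusHandoverMessage(FocusDirection direction, boolean searchResultEmpty, boolean requestSearch) {
        this.direction = direction;
        this.searchResultEmpty = searchResultEmpty;
        this.requestSearch = requestSearch;
    }
}
```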
In step S109, the window management service of the notebook computer 100 searches for the target window according to the first direction.
After the notebook computer 100 receives the first information, in response to the first information, the window management service of the notebook computer 100 searches the user interface 102 for an interface element (such as a target window) for focus switching according to a first direction.
As shown in fig. 9B, the notebook computer 100 searches for an interface element for focus switching according to a first direction on the user interface 102, for example, searches for an interface element for focus switching from left to right on the user interface 102, and finds the native window 104.
In other embodiments, the notebook computer 100 starts with the screen projection window 103 and searches the user interface 102 for interface elements for focus switching to the right. Since the screen projection window 103 is located on the right side of the user interface 102, the notebook computer 100 may look for a target window to the left, such as the native window 104, with reference to the target window search strategy described above. The target window is determined to be the native window 104.
In step S110, when the target window is found, the window management service of the notebook computer 100 sets the target window to the focus state.
As shown in fig. 9B, the window management service may set the native window 104 to a focus state with its visual cues bolded for borders and raise the native window 104 level, which may be selected to be higher than other application windows.
Step S111, the window management service of the notebook computer 100 calls the layout subsystem to search for a control in the target window.
In some embodiments, after the notebook computer 100 finds the interface element (the native window 104) for focus switching, it may also find the target control on the native window 104. As shown in fig. 9B, the control is looked for from left to right within the native window 104, finding the control 105.
The notebook computer 100 may search for the control according to the above strategy for searching for the control for focus switching.
Step S112, the layout subsystem of the notebook computer 100 sets the control to the focus state.
As shown in fig. 9B, the layout subsystem of the notebook computer 100 sets the control 105 to the focus state with the visual cues bolded for the bezel.
Step S113, the window management service of the notebook computer 100 notifies the tablet pc 300 to cancel the focus state of the first window.
As shown in FIG. 9B, the native window 303 and the note control 304 on the tablet 300 have canceled the focus state.
Referring to fig. 10B, a software interaction flow of focus switching shown in fig. 9B to 9C in the mirror mode is exemplarily described based on the software modules shown in fig. 8.
In step S10, the input event management service of the notebook computer 100 obtains an input event.
As shown in fig. 9B, the user 200 presses the left key 15 of the notebook computer 100, and the input/output device of the notebook computer 100 detects an input event pressing the left key 15 and reports the input event to the input event management service of the notebook computer 100.
In step S11, the input event management service of the notebook computer 100 distributes the input event to the first window in the focus state.
As shown in fig. 9B, the input event management service distributes the input event to the native window 104 based on the current focus on the native window 104 of the notebook computer 100, that is, the first window in focus state is the native window 104. The native window 104 and control 105 are in focus, with the borders bolded.
In step S12, the notebook computer 100 searches for a control in the first window according to the first direction corresponding to the input event.
The window management service of the notebook computer 100 invokes an interface of the layout subsystem to find a control in the first window according to the first direction.
As shown in fig. 9B, the first direction corresponding to the input event of the user 200 pressing the left key 15 is left. The notebook computer 100 starts with a control 105 and searches the local window 104 for interface elements for focus switching to the left.
In step S13, when there is no control, the window management service of the notebook computer 100 searches for the target window according to the first direction.
As shown in fig. 9B, based on the control 105 being located at the left edge of the native window 104, there is no interface element (e.g., control) available for focus switching to the left on the native window 104. When the layout subsystem of the notebook computer 100 determines that there is no interface element available for focus switching in the native window 104, the layout subsystem of the notebook computer 100 invokes an interface of the window management service to search for other interface elements available for focus switching on the user interface 102, such as searching for a target window available for focus switching.
In step S14, when there is no target window on the notebook computer 100, the window management service of the notebook computer 100 notifies the tablet to find the target window.
As shown in fig. 9B, since the native window 104 is located at the left edge of the user interface 102, that is, at the left edge of the display screen 101, there is no interface element (target window) available for focus switching to the left on the user interface 102. The notebook computer 100 therefore looks to the left for an electronic device (cooperative device) to which it is communicatively connected. Based on its cross-device connection management framework (the tablet 300 and the notebook computer 100 are used cooperatively), the notebook computer 100 knows that the cooperative devices include the tablet 300 and that the tablet 300 is located to its left. After the notebook computer 100 determines the tablet 300 according to the first direction, it notifies the tablet 300 to find the target window by transmitting first information to the tablet 300 through the cross-device connection management framework. Specifically, when the window management service of the notebook computer 100 determines that there is no window available for focus switching on the notebook computer 100, the first information is transmitted to the window management service of the tablet 300 through the cross-device connection management frameworks of the notebook computer 100 and the tablet 300.
The first information may be a search result of the notebook computer 100 searching for the target window according to the first direction on the user interface 102, for example, an indication that there is no interface element available for focus switching in the first direction on the user interface 102. The first information may also be information for controlling the tablet 300 to find a target window according to the first direction.
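The following Kotlin sketch shows one possible way to model the first information exchanged through the cross-device connection management framework; the FocusHandoverInfo fields, the CrossDeviceChannel interface, and the device identifier are assumptions for illustration and do not correspond to the framework's actual interface.

```kotlin
// Illustrative payload for the "first information"; field names are assumptions.
enum class Direction { LEFT, RIGHT, UP, DOWN }

data class FocusHandoverInfo(
    val direction: Direction,              // first direction of the focus switch
    val sourceDeviceId: String,            // device that ran out of focusable elements
    val searchExhausted: Boolean = true    // no element/window found on the source interface
)

// Hypothetical sender on the source device; CrossDeviceChannel is not a real API.
interface CrossDeviceChannel {
    fun send(targetDeviceId: String, payload: FocusHandoverInfo)
}

fun notifyPeerToFindTarget(channel: CrossDeviceChannel, peerId: String, dir: Direction) {
    channel.send(peerId, FocusHandoverInfo(direction = dir, sourceDeviceId = "notebook-100"))
}
```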
In step S15, the window management service of the tablet pc 300 searches for the target window according to the first direction.
After the tablet 300 receives the first information, in response to the first information, the window management service of the tablet 300 searches the second interface for an interface element (such as a target window) available for focus switching according to the first direction.
As shown in fig. 9C, the tablet 300 searches for an interface element for focus switching according to a first direction on the user interface 302, for example, searches for an interface element for focus switching from right to left on the user interface 302, and finds the native window 303.
In step S16, when the target window is found, the window management service of the tablet pc 300 sets the target window to the focus state.
As shown in fig. 9C, the window management service of the tablet 300 may set the native window 303 to the focus state, its visual cue being a bolded border, and raise the level of the native window 303. If there are other windows on the tablet 300, the level of the native window 303 may be set higher than the other application windows.
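The following Kotlin sketch illustrates the kind of bookkeeping described here, namely granting focus to the target window and raising its level above other application windows; the WindowRecord type and the z-order rule are assumptions for illustration, not the actual window management service interface.

```kotlin
// Sketch of how a window management service might grant focus and raise a window;
// WindowRecord and its fields are illustrative, not a platform API.
data class WindowRecord(val name: String, var hasFocus: Boolean = false, var zOrder: Int = 0)

class WindowManagerSketch(private val windows: MutableList<WindowRecord>) {
    fun setFocus(target: WindowRecord) {
        windows.forEach { it.hasFocus = false }           // only one window holds focus
        target.hasFocus = true
        // Raise the newly focused window above other application windows.
        target.zOrder = (windows.maxOfOrNull { it.zOrder } ?: 0) + 1
    }
}
```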
Step S17, the layout subsystem of the tablet 300 searches for a control in the target window.
In some embodiments, the window management service of tablet 300 may invoke the interface of the layout subsystem to find a target control on native window 303 for focus switching. As shown in fig. 9C, the control is searched from right to left in the window corresponding to the native window 303, and the note control 304 is found.
Wherein tablet 300 may find a control with reference to the above-described policies for finding controls for focus switching.
Step S18, the layout subsystem of the tablet 300 sets the control to the focus state.
As shown in FIG. 9C, the layout subsystem of tablet 300 sets found note control 304 to a focus state with its visual cues framed.
In step S19, the window management service of the tablet pc 300 notifies the notebook computer 100 to cancel the focus state of the first window.
As shown in fig. 9C, the native window 104 and control 105 on the notebook computer 100 have been out of focus.
The following takes the first electronic device as a notebook computer and the second electronic device as a tablet as an example.
Referring to fig. 11A to 11B, an application of the focus switching method provided in the embodiment of the present application to another mirror mode scene is exemplarily described.
Fig. 11A differs from fig. 9A in that: the native window 104 and the control 105 of the notebook computer 100 are in the focus state. The user 200 presses the left key 15, and as shown in fig. 11B, the notebook computer 100 cancels the focus state of the native window 104 and the control 105. The focus falls on the native window 303 of the tablet 300. The screen-projection window 103 of the notebook computer 100, to which the native window 303 in the focus state is mirrored, may or may not itself be in the focus state, and the note control on the projected interface in the screen-projection window 103 displays a visual cue of the focus state. In the focus switching scenario shown in fig. 11A to 11B, the focus is switched between the tablet pc 300 and the notebook computer 100, and, as can be seen on the notebook computer 100, the focus is switched between the screen-projection window 103 and the native window 104.
Based on the software modules shown in fig. 8, please refer to fig. 12, which illustrates a software interaction flow for focus switching in another mirror mode.
In step S121, the input event management service of the notebook computer 100 acquires an input event.
As shown in fig. 11A, the user 200 presses the left key 15 of the notebook computer 100, and the input/output device of the notebook computer 100 detects an input event pressing the left key 15 and reports the input event to the input event management service.
In step S122, the input event management service of the notebook computer 100 distributes the input event to the first window in the focus state.
As shown in fig. 11A, the current focus is on the native window 104 of the notebook computer 100, that is, the first window in the focus state is the native window 104; accordingly, the input event management service distributes the input event to the native window 104. The native window 104 and the control 105 are in the focus state, with the borders bolded.
In step S123, the layout subsystem of the notebook computer 100 searches for a control in the first window according to the first direction corresponding to the input event.
The window management service of the notebook computer 100 invokes an interface of the layout subsystem to find a control in the first window according to the first direction.
As shown in fig. 11A, the first direction corresponding to the input event of the user 200 pressing the left key 15 is leftward. Starting from the control 105, the notebook computer 100 searches the native window 104 to the left for interface elements available for focus switching.
In step S124, when there is no control, the window management service of the notebook computer 100 searches for the target window according to the first direction.
As shown in fig. 11A, because the control 105 is located at the left edge of the native window 104, there is no interface element (such as a control) available for focus switching to the left on the native window 104. When the layout subsystem of the notebook computer 100 determines that there is no interface element available for focus switching in the native window 104, it invokes an interface of the window management service to search for other interface elements available for focus switching on the user interface 102, such as a target window available for focus switching.
In step S125, when there is no target window on the notebook computer 100, the window management service of the notebook computer 100 notifies the tablet to find the target window.
As shown in fig. 11A, because the native window 104 is located at the left edge of the user interface 102, i.e., at the left edge of the display screen 101, there is no interface element (target window) available for focus switching to the left on the user interface 102. The notebook computer 100 searches to the left for an electronic device (cooperative device) communicatively connected to it. Based on its cross-device connection management framework (the tablet 300 and the notebook computer 100 are used cooperatively), the notebook computer 100 knows that the cooperative devices include the tablet 300 and that the tablet 300 is located on its left. After the notebook computer 100 determines the cooperative device tablet 300 according to the first direction, the first information is transmitted to the tablet 300 through the cross-device connection management framework to notify the tablet 300 to find the target window. When the window management service of the notebook computer 100 determines that there is no window available for focus switching on the notebook computer 100, the first information is transmitted to the window management service of the tablet pc 300 through the cross-device connection management framework of the notebook computer 100 and the tablet pc 300.
The first information may be a search result of the notebook computer 100 searching for the target window according to the first direction on the user interface 102, for example, an indication that there is no interface element available for focus switching in the first direction on the user interface 102. The first information may also be information for controlling the tablet 300 to find a target window according to the first direction.
In step S126, the window management service of the tablet pc 300 searches for the target window according to the first direction.
After the tablet 300 receives the first information, in response to the first information, the window management service of the tablet 300 searches the second interface for an interface element (such as a target window) available for focus switching according to the first direction.
As shown in fig. 11B, the tablet 300 searches for an interface element available for focus switching according to the first direction on the user interface 302, for example, searches for an interface element available for focus switching from right to left on the user interface 302, and finds the native window 303.
Step S127, when the target window is found, the window management service of the tablet pc 300 sets the target window to the focus state.
As shown in fig. 11B, the window management service of the tablet 300 may set the native window 303 to the focus state, its visual cue being a bolded border, and raise the level of the native window 303. If there are other windows on the tablet 300, the level of the native window 303 may be set higher than the other application windows.
Step S128, the layout subsystem of the tablet 300 searches for a control within the target window.
In some embodiments, the window management service of the tablet 300 may call the interface of the layout subsystem to find the target control on the native window 303. As shown in fig. 11B, the control is searched from right to left in the window corresponding to the native window 303, and the note control 304 is found.
Wherein tablet 300 may find a control with reference to the above-described policies for finding controls for focus switching.
Step S129, the layout subsystem of tablet 300 sets the control to the focus state.
As shown in FIG. 11B, the note control 304 is set to a focus state with its visual cues framed.
In step S130, the window management service of the tablet pc 300 notifies the notebook computer 100 to cancel the focus state of the first window.
As shown in fig. 11B, the native window 104 and control 105 on the notebook computer 100 have been out of focus.
Heterogeneous screen projection:
referring to fig. 13, an exemplary diagram of a software architecture when the first electronic device and the second electronic device are in a heterogeneous screen projection scene is shown.
The contents of the application layer, the application framework layer and the driver layer in fig. 13 may be described with reference to fig. 3, and will not be described herein. Fig. 13 differs from fig. 8 in that: the infrastructure of fig. 13 includes a display management module.
The display management module manages tasks such as creating virtual screens, updating their state, and destroying them. On the source device, collaboration with the picture of another device is achieved by creating a virtual screen. There may be multiple screens (a home screen and virtual screens) on the source device.
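A minimal sketch of such virtual-screen lifecycle management is given below, assuming an Android-style DisplayManager on the source device; the projection Surface, the display name, and the VirtualScreenManager wrapper are assumptions for illustration.

```kotlin
// Sketch of virtual-screen lifecycle on the source device, assuming an
// Android-style DisplayManager; the projection Surface would come from the
// cross-device screen-casting pipeline and is only a placeholder here.
import android.hardware.display.DisplayManager
import android.hardware.display.VirtualDisplay
import android.view.Surface

class VirtualScreenManager(private val displayManager: DisplayManager) {
    private val screens = mutableMapOf<String, VirtualDisplay>()

    fun create(peerDeviceId: String, surface: Surface, w: Int, h: Int, dpi: Int): VirtualDisplay {
        // The virtual screen stays in the background; its content is rendered
        // into 'surface' and streamed to the sink device.
        val display = displayManager.createVirtualDisplay("projection-$peerDeviceId", w, h, dpi, surface, 0)
        screens[peerDeviceId] = display
        return display
    }

    fun destroy(peerDeviceId: String) {
        screens.remove(peerDeviceId)?.release()   // tear down when collaboration ends
    }
}
```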
Take the first electronic device as a tablet and the second electronic device as a notebook computer as an example.
Referring to fig. 14A to 14B, an application of the focus switching method provided in the embodiment of the present application to a heterogeneous screen scene is exemplarily described.
As shown in fig. 14A, the tablet pc 300 serves as the source device and the notebook computer 100 serves as the sink device. The tablet 300 is connected to a keyboard 305. The contents of the home screen (i.e., the user interface 302) are displayed on the display screen 301 of the tablet 300. The user interface 302 includes a video window 306. The tablet 300 creates a virtual screen 307; a virtual interface is rendered on the virtual screen 307, virtual interface elements such as a memo window 310 are rendered on the virtual interface, and a memo interface is displayed in the memo window 310. The virtual screen 307 runs in the background of the tablet 300 and is invisible to the user. Content on the virtual screen 307 is displayed on the notebook computer 100 through cross-device screen projection. The user interface 102 is displayed on the display screen 101 of the notebook computer 100, the user interface 102 includes a screen-projection window 103, and the content in the screen-projection window 103 is the content of the virtual screen 307 on the tablet 300.
The video window 306 and the control 309 of the tablet 300 are in the focus state. The user 200 presses the right key 308 on the keyboard 305. As shown in fig. 14B, the memo window 310 and the to-do control 311 on the virtual screen 307 are switched to the focus state, and the video window 306 and the control 309 of the tablet 300 cancel the focus state. The focus switches between the home screen and the virtual screen 307 on the tablet 300, and the user 200 sees the focus switching between the tablet 300 and the notebook computer 100. When the memo window 310 and the to-do control 311 are in the focus state, the screen-projection window 103 and the to-do control 106 projected onto the notebook computer 100 may also be in the focus state, and the screen-projection window 103 and the to-do control 106 on the notebook computer 100 display visual cues of the focus state.
Based on the software modules shown in fig. 13, please refer to fig. 15, which exemplarily illustrates a software interaction flow of the focus switching scenario shown in fig. 14A to 14B.
In step S151, the input/output device driver of the tablet pc 300 acquires an input event.
As shown in fig. 14A, the user 200 presses the right key 308 on the keyboard 305 of the tablet 300, and the input/output device of the tablet 300 detects an input event of pressing the right key 308 and reports the input event to the input event management service.
In step S152, the input event management service of the tablet pc 300 distributes the input event to the home screen.
The tablet 300 manages two displays, among which only one input focus exists. The input event management service of the tablet 300 distributes the input event to the display management module. Because the default display (i.e., the home screen) is the display where the focus is located, the display management module distributes the input event to the home screen.
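The following Kotlin sketch illustrates this routing decision, namely dispatching an input event to the display that currently holds the input focus and falling back to the default (home) display; the DisplayRecord and InputEvent types are assumptions for illustration.

```kotlin
// Sketch of dispatching an input event to the display that currently holds focus;
// DisplayRecord and InputEvent here are illustrative types.
data class DisplayRecord(val name: String, val isDefault: Boolean, var holdsFocus: Boolean)
data class InputEvent(val keyCode: Int)

fun dispatchToFocusedDisplay(displays: List<DisplayRecord>, event: InputEvent): DisplayRecord {
    // Only one input focus exists across the home screen and the virtual screen;
    // fall back to the default (home) display when none is marked.
    val target = displays.firstOrNull { it.holdsFocus } ?: displays.first { it.isDefault }
    // ... forward 'event' to the window management service of 'target' ...
    return target
}
```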
Step S153, the display management module distributes the input event to the first window in the focus state.
The display management module distributes the input event to the window management service, and the window management service determines that the window for processing the input event is the first window, and for this purpose, distributes the input event to the first window in the focus state. As shown in fig. 14A, the first window in focus is video window 306, video window 306 and control 309 are in focus, and their borders are bolded.
Step S154, the layout subsystem searches for a control in the first window according to a first direction corresponding to the input event.
After the first window in the focus state receives the input event, an interface of the layout subsystem is called to find (inquire) whether other interface elements for focus switching exist in the first window according to the first direction of the input event. The window management service invokes an interface of the layout subsystem to find interface elements (e.g., controls) available for focus switching in the first window according to the first direction. As shown in fig. 14A, the first direction corresponding to the input event of the user 200 pressing the right key 308 is rightward. Tablet 300 starts with control 309 and looks right for interface elements available for focus switching on video window 306.
Step S155, when no control exists, the window management service searches for a target window on the main screen.
As shown in fig. 14A, because the control 309 is located at the right edge of the video window 306, there is no interface element available for focus switching to the right on the video window 306. The layout subsystem invokes the window management service to find other interface elements available for focus switching on the home screen (i.e., the user interface 302), such as a target window available for focus switching.
In step S156, when there is no target window on the home screen, the window management service searches for a target window on the virtual screen.
As shown in fig. 14A, because the video window 306 is located at the right edge of the user interface 302, i.e., at the right edge of the display screen 301, there is no interface element available for focus switching to the right on the user interface 302. The tablet 300 searches to the right for an electronic device (cooperative device) communicatively connected to it. Based on its cross-device connection management framework (the tablet 300 and the notebook computer 100 are used cooperatively), the tablet 300 knows that the cooperative devices include the notebook computer 100 and that the notebook computer 100 is located on its right. After the tablet pc 300 determines the cooperative device notebook computer 100 according to the first direction, it searches for the virtual screen 307 that is projected onto the notebook computer 100 and determines the virtual screen 307 as the first virtual screen. The window management service searches for interface elements available for focus switching on the first virtual screen (the virtual screen 307).
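The following Kotlin sketch illustrates this escalation step, namely selecting the cooperative device lying in the first direction and resolving the virtual screen projected to it as the first virtual screen; the PeerDevice type and the lookup map are assumptions for illustration.

```kotlin
// Sketch of escalating the search from the home screen to a projected virtual
// screen; PeerDevice and the lookup map are assumptions for illustration.
enum class Direction { LEFT, RIGHT, UP, DOWN }
data class PeerDevice(val id: String, val direction: Direction)

fun findFirstVirtualScreen(
    peers: List<PeerDevice>,
    virtualScreenByPeer: Map<String, String>,   // peer device id -> virtual screen id
    dir: Direction
): String? {
    val peer = peers.firstOrNull { it.direction == dir } ?: return null
    // The virtual screen projected to that peer becomes the first virtual screen
    // on which the window management service continues the focus search.
    return virtualScreenByPeer[peer.id]
}
```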
Step S157, when the target window is found on the virtual screen, the window management service sets the target window to the focus state.
As shown in fig. 14B, the window management service searches for a target window according to the first direction on the first virtual screen (the virtual screen 307). With reference to the above-described policy for finding a target window, the window management service determines that the memo window 310 on the virtual screen 307 is the target window, and may set the memo window 310 to the focus state with its border bolded.
Step S158, the layout subsystem searches for a control in the target window.
The tablet 300 looks for a target control in the memo window 310, and the window management service invokes the layout subsystem to look for a control in the memo window 310. As shown in fig. 14B, the to-do control 311 may be found in the memo window 310 with reference to the above-described policy for finding controls.
Step S159, the layout subsystem sets the control to be in a focus state.
As shown in fig. 14B, the layout subsystem sets the to-do control 311 to the focus state with its border bolded. The memo window 310, which is set to the focus state on the virtual screen 307, is projected onto the notebook computer 100 and displayed in the screen-projection window 103. The to-do control 311 is displayed as the to-do control 106 projected onto the notebook computer 100. Because the memo window 310 and the to-do control 311 are in the focus state and are rendered with bolded borders on the virtual screen 307, the visual cues displayed on the notebook computer 100 by the screen-projection window 103 and the to-do control 106 are likewise rendered as bolded borders.
Step S160, the window management service informs the layout subsystem to cancel the focus state of the control on the main screen.
In the embodiment of the present application, the window management service cancels the focus state of the window corresponding to the video interface on the main screen, and notifies the layout subsystem to cancel the focus state of the control 309 on the video window 306 on the main screen. As shown in fig. 14B, the borders of the video window 306 and the control 309 revert to their original appearance.
Sharing mode:
referring to fig. 16, an exemplary diagram of a software architecture when the first electronic device and the second electronic device are in the sharing mode is shown.
The contents of the application layer, the application framework layer and the driver layer in fig. 16 may be described with reference to fig. 3, and will not be described herein. Fig. 16 differs from fig. 8 in that: the infrastructure of fig. 16 does not include a data management framework. In some embodiments, the application layer of fig. 16 may also not include a screen-projection display application. When the first electronic device and the second electronic device cooperate in the sharing mode, screen projection is not performed between the first electronic device and the second electronic device.
In some embodiments, the first electronic device and the second electronic device may be different software systems, for example, the software system of the first electronic device is a windows system and the software system of the second electronic device is an Android system.
Take the first electronic device as a tablet and the second electronic device as a notebook computer as an example.
Referring to fig. 17A to 17B, an application scenario diagram of a focus switching method in a sharing mode according to an embodiment of the present application is exemplarily described.
The tablet 300 and the notebook computer 100 do not perform screen projection between each other. The tablet 300 is connected to an input device, such as the keyboard 305, through which the user 200 may operate the tablet 300 and the notebook computer 100. When the user 200 operates the notebook computer 100 through the keyboard 305, the tablet 300 needs to map the input event into the notebook computer 100 through the cross-device connection management framework. As shown in fig. 17A, a user interface 302 is displayed on the display screen 301 of the tablet 300, and a video window 306 is displayed on the user interface 302. A user interface 102 is displayed on the display screen 101 of the notebook computer 100, and a memo window 107 is displayed on the user interface 102. The video window 306 and the control 309 of the tablet 300 are both in the focus state. The user 200 presses the right key 308 of the tablet 300. As shown in fig. 17B, the focus falls on the memo window 107 and the to-do control 108 of the notebook computer 100, and the memo window 107 and the to-do control 108 on the notebook computer 100 are in the focus state. The video window 306 and the control 309 of the tablet 300 cancel the focus state.
Referring to fig. 18, a software interaction flow at the time of focus switching according to an embodiment of the present application will be exemplarily described based on the software modules shown in fig. 16.
In step S181, the input/output device driver of the tablet pc 300 acquires an input event.
As shown in fig. 17A, the user 200 presses a right key 308 on the keyboard 305 of the tablet 300, and the input/output device of the tablet 300 detects an input event pressing the right key 308, and reports the input event to the input event management service.
In step S182, the input event management service of the tablet pc 300 distributes the input event to the first window in the focus state.
The input event management service of the tablet 300 distributes the input event to the window management service, which determines the window that processes the input event as the first window. The input event management service distributes the input event to the first window.
As shown in fig. 17A, the first window in focus is a video window 306 on the tablet 300.
In step S183, the layout subsystem of the tablet pc 300 searches for a control in the first window according to the first direction corresponding to the input event.
As shown in fig. 17A, the first direction corresponding to the input event of the user 200 pressing the right key 308 is rightward. The control that obtains focus on video window 306 of tablet 300 is control 309, tablet 300 starts with control 309 and searches interface elements for focus switching to the right on video window 306. The window management service of tablet 300 invokes an interface of the layout subsystem to find a control in the first window for focus switching according to the first direction.
In step S184, when there is no control, the window management service of the tablet pc 300 searches for the target window according to the first direction.
When the layout subsystem determines that the control for focus switching does not exist in the first window, the layout subsystem calls an interface of the window management service, and searches a target window for focus switching according to a first direction.
As shown in fig. 17A, because the control 309 on the tablet 300 is located at the right edge of the video window 306, there is no interface element available for focus switching to the right on the video window 306. The tablet 300 searches for other interface elements available for focus switching on the user interface 302, such as a target window available for focus switching.
In step S185, when there is no target window on the tablet pc 300, the window management service of the tablet pc 300 notifies the notebook pc 100 to find the target window.
As shown in fig. 17A, because the video window 306 is located at the right edge of the user interface 302, i.e., at the right edge of the display screen 301, there is no interface element available for focus switching to the right on the user interface 302. The tablet 300 searches to the right for a device (cooperative device) to which it is communicatively connected, and the tablet 300 knows that the notebook computer 100 is located on its right based on its cross-device connection management framework (the tablet 300 and the notebook computer 100 are used cooperatively). After the tablet pc 300 determines the cooperative device notebook computer 100 according to the first direction, the notebook computer 100 is notified to find the target window by transmitting the first information to the notebook computer 100 through the cross-device connection management framework.
Wherein the first information may be a search result of the tablet 300 searching for the target window according to the first direction on the user interface 302. The first information may also be information for controlling the notebook computer 100 to search for the target window according to the first direction.
In step S186, the notebook computer 100 creates a virtualized input device, and injects an input event into the virtualized input device.
After the notebook computer 100 receives the first information, a virtualized input device is created in response to the first information, and an input event is injected into the virtualized input device. The virtualized input device is used for receiving information of a first direction corresponding to an input event.
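The following Kotlin sketch illustrates the sink-side behavior described here: a virtualized input device is created when the first information arrives, and the first direction is injected into it so that the local focus search can proceed. The VirtualInputDevice interface and the SinkFocusReceiver class are assumptions for illustration, not an actual platform API.

```kotlin
// Sketch of the sink side in sharing mode: create a virtualized input device when
// the first information arrives and inject the direction into it.
// The VirtualInputDevice interface is an assumption, not a real platform API.
enum class Direction { LEFT, RIGHT, UP, DOWN }

interface VirtualInputDevice { fun inject(direction: Direction) }

class SinkFocusReceiver(private val deviceFactory: () -> VirtualInputDevice) {
    private var virtualInput: VirtualInputDevice? = null

    fun onFirstInformation(direction: Direction) {
        // Lazily create the virtualized input device the first time a cross-device
        // focus handover arrives, then replay the direction so the local window
        // management service can run its normal focus search.
        val device = virtualInput ?: deviceFactory().also { virtualInput = it }
        device.inject(direction)
    }
}
```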
In step S187, the window management service of the notebook computer 100 searches for the target window according to the first direction.
After receiving the input event, the virtualized input device of the notebook computer 100 reports the input event to the window management service of the notebook computer 100. The window management service of the notebook computer 100 looks for a target window on the user interface 102 according to a first direction.
In step S188, when the target window is found, the window management service of the notebook computer 100 sets the target window to the focus state.
As shown in fig. 17B, the notebook computer 100 may determine the memo window 107 on the user interface 102 as the target window with reference to the above-described policy of finding the target window. When setting the memo window 107 to the focus state, the notebook computer 100 may further raise the window level corresponding to the memo window 107, for example, setting it higher than other application windows.
In step S189, the layout subsystem of the notebook computer 100 searches for a control in the target window.
The window management service of the notebook computer 100 invokes the interface of the layout subsystem to find the target control in the window corresponding to the memo window 107. As shown in fig. 17B, the control is searched for to the right in the window corresponding to the memo window 107, and the to-do control 108 is found.
Step S190, the layout subsystem of the notebook computer 100 sets the control to the focus state.
As shown in FIG. 17B, the to-Do control 108 is switched to the focus state with the visual cues framed.
In step S191, the window management service of the notebook computer 100 notifies the tablet pc 300 to cancel the focus state of the first window.
As shown in fig. 17B, the video window 306 and the control 309 on the tablet 300 have been out of focus.
In the embodiment of the present application, the tablet pc 300 displays its own screen, and the notebook computer 100 displays its own screen. In the sharing mode, focus switching is accomplished by cross-device collaboration. The user 200 operates the notebook computer 100 through the keyboard 305, so a virtualized input device is required on the notebook computer 100. When the virtualized input device on the notebook computer 100 acquires an input event, the tablet 300 directly forwards the input event received by the keyboard 305 to the notebook computer 100, and the tablet 300 itself does not respond to the input event received by the keyboard 305. The tablet pc 300 and the notebook computer 100 belong to the same network element, so there is only one focus control in the network element; when the focus is switched to the next control, the focus state of the control where the original focus is located needs to be canceled.
In the embodiment of the application, in a cross-device collaboration scenario, the focus can be switched seamlessly across devices in response to the first direction of an input event, so that the switching of the input focus can be completed rapidly by devices that provide directional input, such as keyboards.
In the embodiment of the application, the input device can work in a normal mode or a multi-connection mode. The multi-connection mode refers to a working mode in which communication connections with a plurality of electronic devices are maintained simultaneously; in contrast, the normal mode refers to a working mode in which a communication connection is maintained with only a single electronic device. In the embodiment of the application, an input device in the multi-connection mode keeps communication connections with a plurality of electronic devices at the same time, which allows the focus to migrate seamlessly and synchronously among the plurality of electronic devices.
In the embodiment shown in fig. 17A to 17B, both the tablet pc 300 and the notebook computer 100 are connected to the same wireless local area network (WLAN), i.e., the tablet pc 300 and the notebook computer 100 are communicatively connected. The keyboard 305 operates in the normal mode, and the keyboard 305 is connected only to the tablet 300.
In other embodiments, the keyboard 305 operates in the multi-connection mode, and the keyboard 305 may migrate freely among a plurality of electronic devices. The tablet 300 and the notebook computer 100 are both connected to the same wireless local area network (WLAN), and in an initial state, the keyboard 305 establishes a wireless communication connection (such as a Bluetooth connection) with the tablet 300 to operate or control the tablet 300. When the tablet 300 receives an input event entered by the keyboard 305 and the interface element where the focus is located is at the edge of the display screen 301 of the tablet 300, the tablet 300 instructs the keyboard 305 to switch the host connected to it by sending a connection command to the keyboard 305. Assuming that the tablet 300 instructs the keyboard 305 to switch to the notebook computer 100, the keyboard 305 then establishes a wireless communication connection (such as a Bluetooth connection) with the notebook computer 100. At this point, the keyboard 305 is switched from the tablet pc 300 to the notebook computer 100, and the keyboard 305 can operate or control the notebook computer 100. The tablet pc 300 does not need to transmit the first information to the notebook computer 100. After the keyboard 305 is switched to connect with the notebook computer 100, the keyboard 305 directly inputs the input event to the notebook computer 100. In response to the input event, the notebook computer 100 searches for an interface element available for focus switching on the notebook computer 100 and sets the found interface element to the focus state.
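The following Kotlin sketch illustrates the handover condition described in this embodiment: the source device asks the multi-connection keyboard to switch hosts only when the focused element sits at the display edge in the first direction and a cooperative device exists in that direction. The KeyboardLink interface, the Bounds type, and the edge test are assumptions for illustration.

```kotlin
// Sketch of the source device asking a multi-connection keyboard to switch hosts
// once the focused element sits at the display edge in the requested direction.
// The KeyboardLink interface and edge test are assumptions for illustration.
enum class Direction { LEFT, RIGHT, UP, DOWN }
data class Bounds(val left: Int, val top: Int, val right: Int, val bottom: Int)

interface KeyboardLink { fun connectTo(hostId: String) }

fun maybeSwitchKeyboardHost(
    keyboard: KeyboardLink,
    focused: Bounds,
    display: Bounds,
    dir: Direction,
    peerInDirection: String?
) {
    val atEdge = when (dir) {
        Direction.LEFT  -> focused.left <= display.left
        Direction.RIGHT -> focused.right >= display.right
        Direction.UP    -> focused.top <= display.top
        Direction.DOWN  -> focused.bottom >= display.bottom
    }
    // Only hand the keyboard over when focus cannot move further on this device
    // and a cooperative device exists in the first direction.
    if (atEdge && peerInDirection != null) keyboard.connectTo(peerInDirection)
}
```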
In other embodiments, after the input device is migrated from one electronic device (i.e., the source host) to another electronic device (i.e., the target host), the communication connection between the input device and the source host is not broken. In an initial state, the keyboard 305 establishes a wireless communication connection (such as a Bluetooth connection) with the tablet 300 to operate or control the tablet 300. When the tablet 300 receives an input event entered by the keyboard 305 and the interface element where the focus is located is at the edge of the display screen 301 of the tablet 300, the tablet 300 instructs the keyboard 305 to establish a connection with another electronic device by sending a connection command to the keyboard 305. For example, the keyboard 305 discovers other connectable electronic devices around it based on device discovery (such as near-field discovery or discovery of devices logged in to the same account) and performs authentication pairing to establish a wireless connection with them. Assume that the keyboard 305 discovers the surrounding connectable notebook computer 100 based on device discovery and performs authentication pairing with the notebook computer 100; the keyboard 305 then establishes a wireless communication connection (such as a Bluetooth connection) with the notebook computer 100. At this point, the keyboard 305 is communicatively connected to both the tablet pc 300 and the notebook computer 100, but the tablet pc 300 and the notebook computer 100 may not be communicatively connected to each other. The keyboard 305 may operate or control the tablet 300 and the notebook computer 100. The tablet pc 300 does not need to transmit the first information to the notebook computer 100. After the keyboard 305 is switched to connect with the notebook computer 100, the keyboard 305 directly inputs the input event to the notebook computer 100. In response to the input event, the notebook computer 100 searches for an interface element available for focus switching on the notebook computer 100 and sets the found interface element to the focus state.
Referring to fig. 19, fig. 19 is a flowchart of a focus switching method according to an embodiment of the present application. The focus switching method can be applied to the first electronic equipment, the first electronic equipment is in communication connection with the second electronic equipment, and the application scene can be a heterogeneous screen projection scene.
Step S1: the first electronic device displays a first interface, wherein the first interface includes a first interface element in a focus state.
As shown in fig. 14A to 14B, the first electronic device is a tablet pc 300, and the user interface 302 displayed on the tablet pc 300 is a first interface. If the focus is only on video window 306, then the first interface element is video window 306. If the focus is on a control within the video window 306, the first interface element is control 309.
Step S2: the first electronic device receives a first input, wherein the first input is for indicating a first direction of focus switching.
Wherein the first input is for indicating a first direction of focus switching. The first direction is the direction in which the user desires to switch focus in the user interface (including the same user interface of the same electronic device and different user interfaces between different electronic devices), i.e. the direction in which the user desires to move focus.
The first input indicates a first direction, which is the basis for focus switching. The first input is not indicative of a particular interface element. The user may input a first input to the electronic device through the input device. Thus, non-limiting examples of input devices in embodiments of the present application include devices that provide directional input through keyboards, remote controls, touch pads on keyboards, cameras, and the like.
In some embodiments, the input device does not include a device having a pointing function through a mouse, touch screen, or the like.
It will be appreciated that if the user indicates, via a mouse or a touch screen, to switch focus to a particular interface element (e.g., a particular interface element clicked by the mouse), then the input is not the first input in the embodiments of the present application. If the user indicates the input in the specific direction through the mouse or the touch screen, for example, the user slides up or down or left or right on the touch screen to indicate that the first direction is up or down or left or right, the input indicating the specific direction is the first input in the embodiment of the present application.
Illustratively, a user may manipulate directional keys (including up, down, left, and right directional keys) on a keyboard in order to input a first input to an electronic device. As illustrated in fig. 2A, the user presses the right key 14 of the keyboard 12 to enter a first input into the notebook computer 100, the first direction being to the right. For another example, the user may manipulate direction keys (including up, down, left, and right direction keys) on the remote control. For another example, the user may slide (including up and down, left and right, etc.) on a touch pad on the keyboard. For example, the user inputs a gesture indicating a specific direction (including up, down, left, and right directions) before the image capturing apparatus, and it is understood that the specific direction may be set according to the actual situation, which is not specifically limited in this application.
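The following Kotlin sketch illustrates how different directional inputs might be normalized into a first direction; the arrow-key mapping uses the Android KeyEvent key codes, while the swipe threshold and the treatment of camera gestures (omitted here) are assumptions for illustration.

```kotlin
// Sketch of normalizing directional inputs (arrow keys, touch-pad swipes) into a
// single first direction; key codes follow the Android KeyEvent convention, the
// swipe threshold is an assumed value.
import android.view.KeyEvent

enum class Direction { LEFT, RIGHT, UP, DOWN }

fun directionFromKey(keyCode: Int): Direction? = when (keyCode) {
    KeyEvent.KEYCODE_DPAD_LEFT  -> Direction.LEFT
    KeyEvent.KEYCODE_DPAD_RIGHT -> Direction.RIGHT
    KeyEvent.KEYCODE_DPAD_UP    -> Direction.UP
    KeyEvent.KEYCODE_DPAD_DOWN  -> Direction.DOWN
    else -> null                            // not a directional first input
}

fun directionFromSwipe(dx: Float, dy: Float, threshold: Float = 48f): Direction? = when {
    dx <= -threshold && kotlin.math.abs(dx) >= kotlin.math.abs(dy) -> Direction.LEFT
    dx >=  threshold && kotlin.math.abs(dx) >= kotlin.math.abs(dy) -> Direction.RIGHT
    dy <= -threshold -> Direction.UP
    dy >=  threshold -> Direction.DOWN
    else -> null                            // too short to count as a swipe
}
```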
Step S3: the first electronic device responds to the first input, switches the second interface element to a focus state according to the first interface and the first direction, and cancels the focus state of the first interface element, wherein the second interface element is displayed on the second electronic device.
In some embodiments, the first electronic device switching the second interface element to the focus state and canceling the focus state of the first interface element in response to the first input according to the first interface and the first direction includes: the first electronic device, in response to the first input, searches the first interface for an interface element available for focus switching according to the first direction; when the first electronic device determines that there is no interface element available for focus switching on the first interface according to the first direction, the first electronic device determines the first virtual interface element on the first virtual screen according to the first direction; and the first electronic device switches the first virtual interface element to the focus state and cancels the focus state of the first interface element, wherein the first virtual interface element is projected to the second electronic device and displayed as the second interface element on the second electronic device.

In some embodiments, the number of electronic devices communicatively connected to the first electronic device is N, N being an integer greater than or equal to 1. Before determining the first virtual interface element on the first virtual screen according to the first direction, the method further includes: determining the second electronic device located in the first direction according to the positional relationship between the N electronic devices and the first electronic device; and determining the first virtual screen that the first electronic device projects to the second electronic device.
The N electronic devices are in communication connection with the first electronic device, namely, the N electronic devices and the first electronic device are in a cooperative state. Wherein determining that the second electronic device is in the first orientation may be: and searching the electronic equipment along the first direction by taking the position of the first electronic equipment as a starting point, and determining the found electronic equipment as the second electronic equipment. And acquiring the equipment identifier of the second electronic equipment, and determining a first virtual screen of the first electronic equipment to be projected to the second electronic equipment according to the equipment identifier of the electronic equipment.
In this embodiment of the present application, when N electronic devices are in a cooperative state with the first electronic device in communication connection, the first electronic device may obtain a positional relationship between the N electronic devices and the first electronic device, including a distance between the N electronic devices and the first electronic device, and a direction of the N electronic devices relative to the first electronic device. The first electronic device may preferentially determine a second electronic device from among the N electronic devices located in the first direction and closest to the first electronic device, so as to find an interface element for focal point switching in the determined second electronic device. For example, after determining the second electronic device according to the first direction, determining that the virtual screen of the first electronic device projected to the second electronic device is the first virtual screen.
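The following Kotlin sketch illustrates this selection rule: among the N cooperative devices, the one lying in the first direction and closest to the first electronic device is preferred. The PeerPosition type and its fields are assumptions for illustration.

```kotlin
// Sketch of picking the second electronic device: among the N cooperative devices,
// take the one lying in the first direction that is closest to the first electronic
// device. PeerPosition and its fields are illustrative.
enum class Direction { LEFT, RIGHT, UP, DOWN }
data class PeerPosition(val deviceId: String, val direction: Direction, val distance: Float)

fun pickSecondDevice(peers: List<PeerPosition>, dir: Direction): PeerPosition? =
    peers.filter { it.direction == dir }
        .minByOrNull { it.distance }        // prefer the nearest cooperative device
```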
Specifically, as shown in fig. 14A to 14B, the notebook computer 100 is located on the right side of the tablet pc 300. The tablet 300 (first electronic device) creates a virtual screen 307 and projects the content on the virtual screen 307 to the notebook computer 100 (second electronic device). When the tablet pc 300 queries the interface elements available for focus switching on the main screen according to the first direction and determines that there is no interface element available for focus switching, the tablet pc 300 determines, according to the positional relationship between the N communicatively connected electronic devices and the tablet pc 300, that the second electronic device located in the first direction is the notebook computer 100. The tablet 300 then obtains the virtual screen 307 that is projected onto the notebook computer 100, and the virtual screen 307 projected onto the notebook computer 100 is the first virtual screen. The tablet 300 looks for interface elements available for focus switching on the virtual screen 307. If the virtual interface element memo window 310 is found, the memo window 310 is the first virtual interface element, and the memo window 310 is set to the focus state. Further, the tablet pc 300 continues to search the memo window 310 for a virtual interface element that can obtain the focus; if the to-do control 311 is found, the to-do control 311 is the first virtual interface element, and the to-do control 311 is set to the focus state.
As shown in fig. 14A to 14B, the focus is switched from the control 309 on the video window 306 (on the main screen) to the to-do control 311 on the memo window 310 (on the virtual screen 307), and the focus visible to the user is switched between the user interface 302 (first interface) of the tablet 300 and the screen-projection window 103 (second interface) displayed on the notebook computer 100.

In some embodiments, the focus switching method further comprises: the first electronic device receives a second input, wherein the second input is used for indicating a second direction of focus switching; the first electronic device, in response to the second input, searches for a virtual interface element available for focus switching on the first virtual screen according to the second direction; when the first virtual screen has no virtual interface element available for focus switching and there is no second virtual screen available for focus switching, the first electronic device switches the third interface element to the focus state and cancels the focus state of the second interface element, wherein the second virtual screen is projected to the second electronic device or the third electronic device.
The content of the second input may refer to the first input, which is not described herein. The first direction may be the same as or different from the second direction.
In some embodiments, a third electronic device is communicatively connected to the first electronic device, and the method further includes: when the first virtual screen has no virtual interface element available for focus switching and there is a second virtual screen available for focus switching, the first electronic device determines the second virtual interface element on the second virtual screen according to the second direction; and the first electronic device switches the second virtual interface element to the focus state and cancels the focus state of the first virtual interface element, wherein the second virtual interface element is projected to the second electronic device or the third electronic device.
Referring to fig. 20, fig. 20 is a schematic flow chart of another focus switching method according to an embodiment of the present application. The focus switching method can be applied to a plurality of electronic devices in a cooperative state, the first electronic device is in communication connection with the second electronic device, and the application scene can be a mirror image mode scene and a sharing mode scene.
Step S201: the first electronic device displays a first interface and the second electronic device displays a second interface, wherein the first interface includes a first interface element in a focus state.
The first electronic device is the electronic device that processes the first input, that is, the electronic device that responds to the first input to find an interface element available for focus switching. The first electronic device may receive the first input directly from the user, or may receive the first input forwarded by another electronic device. As shown in fig. 11A, the notebook computer 100 receives the input; as shown in fig. 17A, the tablet receives the input. The first input of the user may also be received by the second electronic device, which transmits the first input to the first electronic device. As shown in fig. 9A, the notebook computer 100 receives the input from the user, and the notebook computer 100 transmits the input to the tablet pc 300 for processing.
Step S202: the first electronic device receives a first input, wherein the first input is for indicating a first direction of focus switching.
The content of the first input may be referred to above, and will not be described herein.
Step S203: the first electronic device is responsive to the first input to output first information to the second electronic device according to the first interface and a first direction, wherein the first information includes the first direction.
In some embodiments, the first electronic device outputting the first information to the second electronic device according to the first interface and the first direction in response to the first input comprises: the first electronic device is responsive to the first input to find an interface element on the first interface that is settable to a focus state according to a first direction. And outputting the first information to the second electronic equipment when the searching result is that the interface element which can be set to the focus state is not available.
The first information is used for indicating the second electronic equipment to search an interface element for focal point switching according to the first direction. The first information may be a search result of the first electronic device searching for an interface element available for focus switching according to the first direction on the first interface. The first information may also be control information output by the first electronic device according to the search result.
In some embodiments, the number of electronic devices communicatively coupled to the first electronic device is N, N being an integer greater than or equal to 1; the method further includes, prior to outputting the first information to the second electronic device: the first electronic device determines a second electronic device located in a first direction according to the position relation between the N electronic devices and the first electronic device.
The information related to determining the second electronic device may refer to the foregoing, which is not described herein.
Step S204: the second electronic device switches the second interface element to a focus state according to the second interface and the first direction in response to the first information, wherein the second interface element is displayed on the second interface.
In some embodiments, the second electronic device determining, in response to the first information, a second interface element from the second interface and the first direction includes: the second electronic device is responsive to the first information to create a virtualized input device, wherein the virtualized input device is to receive a first direction; the second electronic device determines a second interface element on the second interface according to the first direction in response to the virtualized input device receiving the first direction. As shown in fig. 17A to 17B, when the first electronic device and the second electronic device are in the sharing mode, the second electronic device virtualizes the input device after receiving the first information, and the virtual input device searches for an interface element for focal point switching on a second interface (such as the user interface 102 of the notebook computer 100) of the second electronic device according to the first direction.
In some embodiments, as shown in fig. 9A to 9B, the notebook computer 100 directly searches the user interface 102 for interface elements available for focus switching according to the first information, and switches the found interface elements (such as the native window 104 and the control 105) to the focus state. The focus state of the first interface element is canceled, for example, the focus state of the native window 303 and the note control 304 on the tablet 300 is canceled.

As shown in fig. 9B to 9C, the tablet 300 directly searches the user interface 302 for interface elements available for focus switching according to the first information, and switches the found interface elements (such as the native window 303 and the note control 304) to the focus state. The focus state of the first interface element is canceled, for example, the focus state of the native window 104 and the control 105 on the notebook computer 100 is canceled.

As shown in fig. 11A to 11B, the tablet pc 300 directly searches the user interface 302 for interface elements available for focus switching according to the first information, switches the found interface elements (such as the native window 303 and the note control 304) to the focus state, and cancels the focus state of the native window 104 and the control 105 on the notebook computer 100.

In some embodiments, the focus switching method further comprises: the first electronic device receives a second input, wherein the second input is used for indicating a second direction of focus switching; the first electronic device outputs second information to the second electronic device in response to the second input, wherein the second information includes the second direction; the second electronic device, in response to the second information, searches the second interface for an interface element available for focus switching according to the second direction; when there is no interface element available for focus switching on the second interface, the first electronic device searches for a third electronic device available for focus switching; when there is no third electronic device available for focus switching, the first electronic device switches the third interface element to the focus state, wherein the third interface element is displayed on the first electronic device; and the second electronic device, in response to the first electronic device switching the third interface element to the focus state, cancels the focus state of the second interface element.
Wherein the second input may refer to the content of the first input, and the first direction may be the same as or different from the second direction.
In some embodiments, a third electronic device is communicatively connected to the first electronic device, and the third electronic device displays a third interface. The focus switching method further comprises the following steps: the first electronic device, in response to the existence of a third electronic device available for focus switching, outputs third information to the third electronic device, wherein the third information includes the second direction; the third electronic device, in response to the third information, switches a fourth interface element to the focus state according to the third interface and the second direction, wherein the fourth interface element is displayed on the third electronic device; and the second electronic device, in response to the third electronic device switching the fourth interface element to the focus state, cancels the focus state of the second interface element.
Referring to fig. 21, an embodiment of the present application provides a schematic structural diagram of an electronic device, which may be used to execute the focus switching method in the embodiment of the present application.
As shown in fig. 21, the electronic device 210 may include: processor 211, memory 212, display 213, etc. In addition, these components may also be connected and communicate via one or more buses or the like.
The processor 211 is the control center of the electronic device 210; it connects the various parts of the entire electronic device 210 using various interfaces and lines, and performs the various functions of the electronic device 210 and/or processes data by running or executing software programs and/or modules stored in the memory 212 and invoking data stored in the memory 212. The processor 211 may be composed of an integrated circuit (Integrated Circuit, IC for short), for example, a single packaged IC, or may be composed of a plurality of packaged ICs with the same function or different functions connected together. For example, the processor 211 may be a central processing unit.
The display 213 is used for displaying corresponding environmental change information, such as information of the user interface shown in fig. 1A to 1B, 2A to 2B, 5A to 5C, 7A to 7C, 9A to 9C, 11A to 11B, 14A to 14B, and 17A to 17B.
In a specific implementation, the present application further provides a computer storage medium, where the computer storage medium may store a program, where the program may include some or all of the steps in each embodiment of the focus switching method provided in the present application when executed. The storage medium may be a magnetic disk, an optical disk, a read-only memory (ROM), a random access memory (random access memory, RAM), or the like.
It should also be appreciated that the memory in embodiments of the present application may be either volatile memory or nonvolatile memory, or may include both volatile and nonvolatile memory. The nonvolatile memory may be a read-only memory (ROM), a Programmable ROM (PROM), an Erasable PROM (EPROM), an electrically Erasable EPROM (EEPROM), or a flash memory. The volatile memory may be random access memory (random access memory, RAM) which acts as an external cache. By way of example, and not limitation, many forms of RAM are available, such as Static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchronous Link DRAM (SLDRAM), and direct memory bus RAM (DRRAM). It should be noted that the memory of the systems and methods described herein is intended to comprise, without being limited to, these and any other suitable types of memory.
The embodiments of the present application also provide a computer program product which, when run on a computer, causes the computer to perform the above related steps, so as to implement the focus switching method in the above method embodiments.
The embodiments of the present application also provide a computer storage medium comprising computer instructions which, when run on a terminal device, cause the terminal device to perform the focus switching method of the above embodiments.
The terminal device, the computer storage medium, the computer program product, and the chip system provided in the embodiments of the present application are all configured to execute the corresponding methods provided above. Therefore, for the beneficial effects they can achieve, reference may be made to the beneficial effects of the corresponding methods provided above, which are not described herein again.
From the foregoing description of the embodiments, it will be apparent to those skilled in the art that the division into the functional modules described above is given only for convenience and brevity of description. In practical applications, the above functions may be allocated to different functional modules as needed; that is, the internal structure of the apparatus may be divided into different functional modules to implement all or part of the functions described above.
In the several embodiments provided in this application, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the apparatus embodiments described above are merely illustrative, e.g., the division of the modules or units is merely a logical function division, and there may be additional divisions when actually implemented, e.g., multiple units or components may be combined or integrated into another apparatus, or some features may be omitted or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be an indirect coupling or communication connection via some interfaces, devices or units, which may be in electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and the parts displayed as units may be one physical unit or a plurality of physical units, may be located in one place, or may be distributed in a plurality of different places. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in each embodiment of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
The integrated unit may be stored in a readable storage medium if it is implemented in the form of a software functional unit and sold or used as a stand-alone product. Based on such understanding, the technical solutions of the embodiments of the present application essentially, or the part contributing to the prior art, or all or part of the technical solutions, may be embodied in the form of a software product stored in a storage medium, including several instructions for causing a device (which may be a single-chip microcomputer, a chip, or the like) or a processor to perform all or part of the steps of the methods described in the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.
The foregoing is merely a specific embodiment of the present application, but the protection scope of the present application is not limited thereto; any change or substitution within the technical scope of the present disclosure shall fall within the protection scope of the present application.

Claims (14)

1. A focus switching method, applied to a first electronic device, the first electronic device being communicatively connected to a second electronic device, the method comprising:
Displaying a first interface, wherein the first interface comprises a first interface element in a focus state;
receiving a first input, wherein the first input is used for indicating a first direction of focus switching;
and responding to the first input, switching a second interface element to the focus state according to the first interface and the first direction, and canceling the focus state of the first interface element, wherein the second interface element is displayed on the second electronic device.
2. The focus switching method of claim 1, wherein said switching a second interface element to the focus state and canceling the focus state of the first interface element in accordance with the first interface and the first direction in response to the first input comprises:
responsive to the first input, searching for an interface element available for focus switching on the first interface according to the first direction;
when the first interface has no interface element available for focus switching in the first direction, determining a first virtual interface element on a first virtual screen according to the first direction;
and switching the first virtual interface element to the focus state, and canceling the focus state of the first interface element, wherein the first virtual interface element is projected to the second electronic device and displayed as the second interface element on the second electronic device.
3. The focus switching method according to claim 2, wherein the number of electronic devices communicatively connected to the first electronic device is N, N being an integer greater than or equal to 1; and before the determining of the first virtual interface element on the first virtual screen according to the first direction, the method further comprises:
determining, according to a positional relationship between the N electronic devices and the first electronic device, the second electronic device located in the first direction;
and determining the first virtual screen to be projected to the second electronic device.
4. The focus switching method according to claim 2 or 3, wherein the method further comprises:
receiving a second input, wherein the second input is used for indicating a second direction of focus switching;
responding to the second input, and searching for a virtual interface element available for focus switching on the first virtual screen according to the second direction;
and when the first virtual screen has no virtual interface element available for focus switching and there is no second virtual screen available for focus switching, switching a third interface element to the focus state and canceling the focus state of the second interface element, wherein the third interface element is displayed on the first electronic device.
5. The focus switching method according to claim 4, wherein a third electronic device is communicatively connected to the first electronic device, and the method further comprises:
when the first virtual screen has no virtual interface element available for focus switching and a second virtual screen available for focus switching exists, determining a second virtual interface element on the second virtual screen according to the second direction, wherein the second virtual screen is projected to the second electronic device or the third electronic device;
and switching the second virtual interface element into the focus state, and canceling the focus state of the first virtual interface element.
6. A focus switching method, applied to a first electronic device and a second electronic device, where the first electronic device is communicatively connected to the second electronic device, the method comprising:
the first electronic device displays a first interface, and the second electronic device displays a second interface, wherein the first interface comprises a first interface element in a focus state;
the first electronic device receives a first input, wherein the first input is used for indicating a first direction of focus switching;
The first electronic device responds to the first input and outputs first information to the second electronic device according to the first interface and the first direction, wherein the first information comprises the first direction;
and the second electronic equipment responds to the first information, and switches a second interface element to the focus state according to the second interface and the first direction, wherein the second interface element is displayed on the second interface.
7. The focus switching method of claim 6, wherein the first electronic device outputting first information to the second electronic device according to the first interface and the first direction in response to the first input comprises:
the first electronic device responds to the first input and searches for an interface element available for focus switching on the first interface according to the first direction;
and when the first interface has no interface element available for focus switching, the first electronic device outputs the first information to the second electronic device.
8. The focus switching method according to claim 6 or 7, wherein the number of electronic devices communicatively connected to the first electronic device is N, N being an integer greater than or equal to 1; and before the outputting of the first information to the second electronic device, the method further comprises:
the first electronic device determining the second electronic device located in the first direction according to a positional relationship between the N electronic devices and the first electronic device.
9. The focus switching method according to any one of claims 6 to 8, wherein the second electronic device switching a second interface element to the focus state according to the second interface and the first direction in response to the first information comprises:
the second electronic device responds to the first information to create a virtualized input device, wherein the virtualized input device is used for receiving the information of the first direction;
and in response to the virtualized input device receiving the information of the first direction, the second electronic device determines the second interface element on the second interface according to the first direction and switches the second interface element to the focus state.
10. The focus switching method according to any one of claims 6 to 9, characterized in that the method further comprises:
the first electronic device receives a second input, wherein the second input is used for indicating a second direction of focus switching;
the first electronic device outputting second information to the second electronic device in response to the second input, wherein the second information includes the second direction;
the second electronic device responds to the second information and searches for an interface element available for focus switching on the second interface according to the second direction;
and when the second interface has no interface element available for focus switching in the second direction, the first electronic device switches a third interface element to the focus state according to the second direction, and the second electronic device cancels the focus state of the second interface element, wherein the third interface element is displayed on the first electronic device.
11. An electronic device comprising a display screen, a processor and a memory, the memory for storing instructions, the processor for invoking the instructions in the memory to cause the electronic device to perform the focus switching method of any of claims 1-10.
12. A chip coupled to a memory in an electronic device, wherein the chip is configured to control the electronic device to perform the focus switching method of any one of claims 1-10.
13. A computer storage medium comprising computer instructions which, when run on an electronic device, cause the electronic device to perform the focus switching method of any one of claims 1-10.
14. A computer program product, characterized in that the computer program product, when run on a computer, causes the computer to perform the focus switching method of any one of claims 1-10.
CN202210989423.5A 2022-08-17 2022-08-17 Focus switching method, electronic device, chip, storage medium, and program product Pending CN117631902A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210989423.5A CN117631902A (en) 2022-08-17 2022-08-17 Focus switching method, electronic device, chip, storage medium, and program product

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210989423.5A CN117631902A (en) 2022-08-17 2022-08-17 Focus switching method, electronic device, chip, storage medium, and program product

Publications (1)

Publication Number Publication Date
CN117631902A true CN117631902A (en) 2024-03-01

Family

ID=90030894

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210989423.5A Pending CN117631902A (en) 2022-08-17 2022-08-17 Focus switching method, electronic device, chip, storage medium, and program product

Country Status (1)

Country Link
CN (1) CN117631902A (en)

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination