CN114327324A - Distributed display method of interface, electronic equipment and communication system - Google Patents


Publication number
CN114327324A
Authority
CN
China
Prior art keywords
electronic device
interface
functional area
electronic
display
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202011149035.3A
Other languages
Chinese (zh)
Inventor
彭玉卓
王红军
Current Assignee
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd filed Critical Huawei Technologies Co Ltd
Priority to PCT/CN2021/119324 priority Critical patent/WO2022068628A1/en
Priority to EP21874289.8A priority patent/EP4206897A4/en
Priority to US18/246,986 priority patent/US11995370B2/en
Publication of CN114327324A publication Critical patent/CN114327324A/en


Landscapes

  • User Interface Of Digital Computer (AREA)

Abstract

The application discloses a distributed display method for an interface, an electronic device, and a communication system, relating to the field of electronic technology. The method enables distributed cooperative display of different content of the same interface, improving convenience of use and the user experience. In this application, a first electronic device may display the content of a plurality of functional areas (e.g., a first functional area and a second functional area) on the first electronic device and another electronic device (e.g., a second electronic device) in a distributed and cooperative manner. Specifically, while the first electronic device displays the content of the first functional area and the content of the second functional area, it may send the content of one of the functional areas (e.g., the first functional area) to the second electronic device for display. With this solution, a user can conveniently handle different tasks on multiple electronic devices.

Description

Distributed display method of interface, electronic equipment and communication system
The present application claims priority to two Chinese patent applications filed on September 29, 2020: one entitled "An interface distributed display method, electronic device, and communication system", and application No. 202011053974.8, entitled "A multitask cooperative display method, electronic device, and communication system", the entire contents of which are incorporated herein by reference.
Technical Field
The embodiment of the application relates to the technical field of electronics, in particular to a distributed display method of an interface, electronic equipment and a communication system.
Background
With the development of terminal technology, multi-device display brings more and more convenience to people's lives. Multi-device display means presenting an interface across the display screens of a plurality of electronic devices.
In general, multi-device display may take the form of mirror display, transfer display, or extended display. For example, as shown in (a) of fig. 1, mirror display means mirroring an interface a displayed on a device A to a device B, so that device B displays the same interface as device A on its own display screen. As shown in (b) of fig. 1, transfer display means transferring the interface a displayed on device A to device B for display. Extended display means displaying one interface across the display screens of device A and device B in a spliced manner. For example, as shown in (c) of fig. 1, the interface a is geometrically split into an interface a1 and an interface a2, and the interface a1 and the interface a2 are displayed on the display screen of device A and the display screen of device B, respectively.
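The three conventional modes can be sketched as follows. This is an illustrative model only, not code from the patent; the `Interface` string and the `plan_display` function are assumptions used to make the distinction concrete.

```python
# Illustrative sketch (not from the patent): the three conventional
# multi-device display modes, modeled as a function that decides what
# each of two devices shows. An interface is represented by a string.
from enum import Enum


class DisplayMode(Enum):
    MIRROR = "mirror"      # device B shows a copy of device A's interface
    TRANSFER = "transfer"  # the interface moves from device A to device B
    EXTENDED = "extended"  # the interface is split across both screens


def plan_display(mode: DisplayMode, interface: str) -> dict:
    """Return what device A and device B each display under a given mode."""
    if mode is DisplayMode.MIRROR:
        return {"A": interface, "B": interface}
    if mode is DisplayMode.TRANSFER:
        return {"A": None, "B": interface}
    # EXTENDED: geometrically split the interface into two halves
    half = len(interface) // 2
    return {"A": interface[:half], "B": interface[half:]}
```

Note that in all three modes both devices show the *same* interface (or geometric pieces of it); none of them lets different functional content of one interface be operated on cooperatively across devices, which is the gap the application addresses.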
However, neither mirror display nor transfer display in the conventional technologies above can achieve distributed cooperative display, on different devices, of different content of one interface. Extended display can splice the same interface across multiple devices, but still cannot achieve distributed cooperative display and cooperative operation, on different devices, of different content of the same interface. In some scenarios, distributed display and cooperative operation of interface content may be important to enhancing the user experience.
Disclosure of Invention
The application provides a distributed display method for an interface and an electronic device, which enable distributed cooperative display of different content of the same interface, improving convenience of use and the user experience.
In order to achieve the above purpose, the embodiment of the present application adopts the following technical solutions:
In a first aspect, a method for distributed display of an interface is provided. The method includes: a first electronic device displays a first interface of a first application, the first interface including at least a first functional area and a second functional area; the first electronic device detects a touch operation with a second electronic device; in response to the detected touch operation, the first electronic device sends the content of the first functional area to the second electronic device for display; and the first electronic device displays a second interface of the first application, the second interface including the second functional area and not including the first functional area.
In the solution provided by the first aspect, the first electronic device may display the content of a plurality of functional areas (e.g., the first functional area and the second functional area) on the first electronic device and another electronic device (e.g., the second electronic device) in a distributed and cooperative manner. Specifically, while the first electronic device displays the content of the first functional area and the content of the second functional area, it may send the content of one of the functional areas (e.g., the first functional area) to the second electronic device for display. With this solution, a user can conveniently handle different tasks on multiple electronic devices.
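The first-aspect flow can be sketched in a few lines. The names (`FunctionalArea`, `Device`, `on_tap`) are hypothetical and for illustration only; the patent does not specify an implementation.

```python
# Hedged sketch of the first-aspect flow: on a touch operation with the
# second device, the first device hands off one functional area and
# then shows its interface without that area. All names are assumptions.
from dataclasses import dataclass, field


@dataclass
class FunctionalArea:
    name: str
    content: str


@dataclass
class Device:
    name: str
    areas: list = field(default_factory=list)  # areas currently displayed


def on_tap(first: Device, second: Device, area_name: str) -> None:
    """Handle a touch operation: move one functional area to the peer."""
    area = next(a for a in first.areas if a.name == area_name)
    first.areas.remove(area)   # the second interface omits this area
    second.areas.append(area)  # the peer now displays its content
```

For example, tapping a tablet while a phone shows both a "lyrics" area and a "controls" area would leave the phone displaying only "controls" and the tablet displaying "lyrics".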
In a possible implementation manner, after the first electronic device sends the content of the first functional area to the second electronic device for display, the method further includes: the first electronic device reallocates the display layout of the remaining functional areas on the first electronic device. This supports flexible adjustment of the display layout of a plurality of functional areas.
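One way such a reallocation could work is to let the remaining areas share the screen evenly. This is a minimal sketch under that assumption; a real implementation would follow the framework template described later, not this equal-split rule.

```python
# Illustrative only: after one functional area leaves, the remaining
# areas split the screen height into equal horizontal bands.
def reallocate(areas: list[str], screen_height: int) -> dict[str, tuple[int, int]]:
    """Assign each remaining area an equal (top, height) band."""
    if not areas:
        return {}
    band = screen_height // len(areas)
    return {name: (i * band, band) for i, name in enumerate(areas)}
```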
In a possible implementation manner, the first functional area is determined by the first electronic device according to a user's selection operation on a functional area; or the first functional area is determined by the first electronic device according to the function implemented by the functional area on the first interface. This application supports sending the content of a functional area to another electronic device for display according to the user's actual selection, or according to the specific functions implemented by different functional areas.
In a possible implementation manner, the first functional area is determined by the first electronic device according to a user's selection operation on the content of a functional area; or the first functional area is determined by the first electronic device according to the task attribute of the content of the functional area on the first interface. This application supports sending the content of a functional area to another electronic device for display according to the user's actual selection, or according to the specific task attributes of the content of different functional areas.
In a possible implementation manner, the first functional area and the second functional area on the first interface are arranged on the first electronic device in a preset relative position relationship.
In a possible implementation manner, the first functional area and the second functional area support adjustment of adaptation capabilities, including: stretching, zooming, hiding, folding, sharing, proportional scaling, and extension capabilities. In this application, each functional area supports flexible adjustment of these adaptation capabilities.
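One plausible way to represent these per-area capabilities is as a set of flags that each functional area advertises. The flag names below are illustrative interpretations of the listed capabilities, not terms defined by the patent.

```python
# Hypothetical sketch: the adaptation capabilities above as bit flags,
# so a functional area can declare which adjustments it supports.
from enum import Flag, auto


class Adaptation(Flag):
    STRETCH = auto()
    ZOOM = auto()
    HIDE = auto()
    FOLD = auto()
    SHARE = auto()
    PROPORTION = auto()  # proportional scaling
    EXTEND = auto()


# e.g., a toolbar area that can stretch or be hidden, but not zoom:
toolbar = Adaptation.STRETCH | Adaptation.HIDE
```

A layout engine could then test `Adaptation.HIDE in toolbar` before hiding the area when screen space runs out.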
In a possible implementation manner, the method further includes: the first electronic equipment responds to the operation that a user triggers the retraction of the collaborative display, and the content of the first functional area is retracted from the second electronic equipment; after the first electronic equipment withdraws the content of the first functional area from the second electronic equipment, the first electronic equipment displays a first interface of the first application program, wherein the first interface at least comprises a first functional area and a second functional area. In the present application, the primary device (i.e., the first device) can retrieve the content of the functional area sent to the secondary device (i.e., the second device) at any time.
In a possible implementation manner, the content of the first functional area and the content of the second functional area on the first interface are jointly rendered by the first electronic device on a virtual screen of the first electronic device, in the form of one or more atomic services, according to a preset framework template. Because the content of the plurality of functional areas is jointly rendered in this way, the content can be split at the granularity of functional areas.
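The idea of template-driven joint rendering can be sketched as follows. The template shape, region names, and pixel geometry are all assumptions made for illustration; the patent does not define a concrete data structure.

```python
# Hedged sketch: each functional area (an "atomic service") is rendered
# into its own rectangle of a shared virtual screen, at the position
# given by a preset framework template. Names and geometry are assumed.
TEMPLATE = {  # hypothetical framework template: name -> (x, y, w, h)
    "first_area": (0, 0, 1080, 960),
    "second_area": (0, 960, 1080, 960),
}


def render_to_virtual_screen(contents: dict) -> dict:
    """Jointly render every functional area at its template position."""
    return {
        name: {"rect": TEMPLATE[name], "content": contents[name]}
        for name in TEMPLATE
    }
```

Because every area occupies a known rectangle of the virtual screen, the rendered output can later be split per functional area, which is what makes the functional-area-granularity handoff possible.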
In a possible implementation manner, the sending, by the first electronic device, of the content of the first functional area to the second electronic device for display includes: the first electronic device sends a standard video stream corresponding to the content of the first functional area on the virtual screen to the second electronic device for display.
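Conceptually, this amounts to cropping the first functional area's rectangle out of each virtual-screen frame and streaming the result. The sketch below stands in for that step with nested lists as frames; a real device would encode the cropped region as a standard video stream (e.g., H.264) rather than send raw pixels.

```python
# Illustrative sketch: extract only the first functional area's region
# of the virtual screen, frame by frame. Frames are lists of pixel rows.
def crop_region(frame, rect):
    """Crop rect=(x, y, w, h) out of one virtual-screen frame."""
    x, y, w, h = rect
    return [row[x:x + w] for row in frame[y:y + h]]


def region_stream(frames, rect):
    """Yield the cropped frames that make up the per-area stream."""
    for frame in frames:
        yield crop_region(frame, rect)
```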
In a second aspect, a first electronic device is provided. The first electronic device includes: a memory for storing computer program code, the computer program code comprising instructions; a radio frequency circuit for transmitting and receiving wireless signals; and a processor for executing the instructions to cause the first electronic device to: display a first interface of a first application including at least a first functional area and a second functional area; when a touch operation with a second electronic device is detected, send, in response to the detected touch operation, the content of the first functional area to the second electronic device through the radio frequency circuit for display; and, after the content of the first functional area is sent, display a second interface of the first application, the second interface including the second functional area and not including the first functional area.
In the solution provided by the second aspect, the first electronic device may display the content of a plurality of functional areas (e.g., the first functional area and the second functional area) on the first electronic device and another electronic device (e.g., the second electronic device) in a distributed and cooperative manner. Specifically, while the first electronic device displays the content of the first functional area and the content of the second functional area, it may send the content of one of the functional areas (e.g., the first functional area) to the second electronic device through the radio frequency circuit for display. With this solution, a user can conveniently handle different tasks on multiple electronic devices.
In a possible implementation manner, the processor is further configured to execute the instructions so that, after the first electronic device sends the content of the first functional area to the second electronic device through the radio frequency circuit for display, the display layout of the remaining functional areas on the first electronic device is reallocated. This supports flexible adjustment of the display layout of a plurality of functional areas.
In a possible implementation manner, the first functional area is determined by the processor, executing the instructions, according to a user's selection operation on a functional area; or is determined according to the function implemented by the functional area on the first interface. This application supports sending the content of a functional area to another electronic device for display according to the user's actual selection, or according to the specific functions implemented by different functional areas.
In a possible implementation manner, the first functional area is determined by the processor, executing the instructions, according to a user's selection operation on the content of a functional area; or is determined according to the task attribute of the content of the functional area on the first interface. This application supports sending the content of a functional area to another electronic device for display according to the user's actual selection, or according to the specific task attributes of the content of different functional areas.
In a possible implementation manner, the first functional area and the second functional area on the first interface are arranged on the first electronic device in a preset relative position relationship.
In a possible implementation manner, the first functional area and the second functional area support adjustment of adaptation capabilities, including: stretching, zooming, hiding, folding, sharing, proportional scaling, and extension capabilities. In this application, each functional area supports flexible adjustment of these adaptation capabilities.
In a possible implementation manner, the processor is further configured to execute the instructions to retract, in response to an operation by which a user triggers retraction of the cooperative display, the content of the first functional area from the second electronic device; after the retraction, the first electronic device displays the first interface of the first application, which includes at least the first functional area and the second functional area. In this application, the primary device (i.e., the first electronic device) can retract the content of a functional area sent to the secondary device (i.e., the second electronic device) at any time.
In a possible implementation manner, the content of the first functional area and the content of the second functional area on the first interface are jointly rendered, by the processor executing the instructions, on the virtual screen of the first electronic device, in the form of one or more atomic services, according to the preset framework template. Because the content of the plurality of functional areas is jointly rendered in this way, the content can be split at the granularity of functional areas.
In a possible implementation manner, the processor is specifically configured to execute the instruction, so that the first electronic device sends a standard video stream corresponding to the content of the first functional area on the virtual screen to the second electronic device through the radio frequency circuit for display.
In a third aspect, a first electronic device is provided. The first electronic device includes: a display unit for displaying a first interface of a first application including at least a first functional area and a second functional area; and a processing unit for, when a touch operation with a second electronic device is detected, sending, in response to the detected touch operation, the content of the first functional area to the second electronic device through a transceiver unit for display; after the content of the first functional area is sent, the first electronic device displays a second interface of the first application, the second interface including the second functional area and not including the first functional area.
In the solution provided by the third aspect, the first electronic device may display the content of a plurality of functional areas (e.g., the first functional area and the second functional area) on the first electronic device and another electronic device (e.g., the second electronic device) in a distributed and cooperative manner. Specifically, while the first electronic device displays the content of the first functional area and the content of the second functional area, it may send the content of one of the functional areas (e.g., the first functional area) to the second electronic device for display. With this solution, a user can conveniently handle different tasks on multiple electronic devices.
In a possible implementation manner, the processing unit is further configured to, after the content of the first functional area is sent to the second electronic device through the transceiver unit for display, reallocate the display layout of the remaining functional areas on the first electronic device. This supports flexible adjustment of the display layout of a plurality of functional areas.
In a possible implementation manner, the first functional area is determined by the processing unit according to a user's selection operation on a functional area; or is determined according to the function implemented by the functional area on the first interface. This application supports sending the content of a functional area to another electronic device for display according to the user's actual selection, or according to the specific functions implemented by different functional areas.
In a possible implementation manner, the first functional area is determined by the processing unit according to a user's selection operation on the content of a functional area; or is determined according to the task attribute of the content of the functional area on the first interface. This application supports sending the content of a functional area to another electronic device for display according to the user's actual selection, or according to the specific task attributes of the content of different functional areas.
In a possible implementation manner, the first functional area and the second functional area on the first interface are arranged on the first electronic device in a preset relative position relationship.
In a possible implementation manner, the first functional area and the second functional area support adjustment of adaptation capabilities, including: stretching, zooming, hiding, folding, sharing, proportional scaling, and extension capabilities. In this application, each functional area supports flexible adjustment of these adaptation capabilities.
In a possible implementation manner, the processing unit is further configured to retract, in response to an operation by which a user triggers retraction of the cooperative display, the content of the first functional area from the second electronic device; after the retraction, the first electronic device displays the first interface of the first application, which includes at least the first functional area and the second functional area. In this application, the primary device (i.e., the first electronic device) can retract the content of a functional area sent to the secondary device (i.e., the second electronic device) at any time.
In a possible implementation manner, the content of the first functional area and the content of the second functional area on the first interface are jointly rendered by the processing unit on the virtual screen of the first electronic device, in the form of one or more atomic services, according to the preset framework template. Because the content of the plurality of functional areas is jointly rendered in this way, the content can be split at the granularity of functional areas.
In a possible implementation manner, the processing unit is specifically configured to send a standard video stream corresponding to the content of the first functional area on the virtual screen to the second electronic device through the transceiving unit for displaying.
In a fourth aspect, a method for distributed display of an interface is provided. The method includes: a first electronic device displays a first interface including at least a first application interface, a second application interface, and a third application interface; the first electronic device detects a touch operation with a second electronic device; in response to the detected touch operation, the first electronic device sends the content of the first application interface to the second electronic device for display; and the first electronic device displays a second interface that includes the second application interface and the third application interface and does not include the first application interface.
In the solution provided by the fourth aspect, the first electronic device may display the content of a plurality of application interfaces (e.g., the first application interface, the second application interface, and the third application interface) on the first electronic device and another electronic device (e.g., the second electronic device) in a distributed and cooperative manner. Specifically, while the first electronic device displays the first application interface, the second application interface, and the third application interface, it may send the content of one of the application interfaces (e.g., the first application interface) to the second electronic device for display. With this solution, a user can conveniently handle different tasks on multiple electronic devices.
In a possible implementation manner, after the first electronic device sends the content of the first application interface to the second electronic device for display, the method further includes: the first electronic device reallocates the display layout of the remaining application interfaces on the first electronic device. This supports flexible adjustment of the display layout of a plurality of application interfaces.
In a possible implementation manner, the first application interface is determined by the first electronic device according to a user's selection operation on an application interface; or the first application interface is determined by the first electronic device according to the functions implemented by the first application interface, the second application interface, and the third application interface. This application supports sending an application interface to another electronic device for display according to the user's actual selection, or according to the specific functions implemented by different application interfaces.
In a possible implementation manner, the windows of the first application interface, the second application interface, and the third application interface support adjustment of adaptation capabilities, including: stretching, zooming, hiding, folding, sharing, proportional scaling, and extension capabilities.
In a possible implementation manner, the method further includes: the first electronic equipment responds to the operation that a user triggers the retraction of the collaborative display, and retracts the first application interface from the second electronic equipment; after the first electronic device withdraws the first application interface from the second electronic device, the first electronic device displays a first interface, wherein the first interface at least comprises a first application interface, a second application interface and a third application interface. In the present application, the primary device (i.e. the first device) can retrieve the application interface sent to the secondary device (i.e. the second device) at any time.
In a fifth aspect, a first electronic device is provided. The first electronic device includes: a memory for storing computer program code, the computer program code comprising instructions; a radio frequency circuit for transmitting and receiving wireless signals; and a processor for executing the instructions to cause the first electronic device to: display a first interface including at least a first application interface, a second application interface, and a third application interface; when a touch operation with a second electronic device is detected, send, in response to the detected touch operation, the content of the first application interface to the second electronic device for display; and, after the content of the first application interface is sent, display a second interface that includes the second application interface and the third application interface and does not include the first application interface.
In the solution provided by the fifth aspect, the first electronic device may display the content of a plurality of application interfaces (e.g., the first application interface, the second application interface, and the third application interface) on the first electronic device and another electronic device (e.g., the second electronic device) in a distributed and cooperative manner. Specifically, while the first electronic device displays the first application interface, the second application interface, and the third application interface, it may send the content of one of the application interfaces (e.g., the first application interface) to the second electronic device for display. With this solution, a user can conveniently handle different tasks on multiple electronic devices.
In a possible implementation manner, the processor is further configured to execute the instructions so that, after the first electronic device sends the content of the first application interface to the second electronic device for display, the display layout of the remaining application interfaces on the first electronic device is reallocated. This supports flexible adjustment of the display layout of a plurality of application interfaces.
In a possible implementation manner, the first application interface is determined by the processor, executing the instructions, according to a user's selection operation on an application interface; or is determined according to the functions implemented by the first application interface, the second application interface, and the third application interface. This application supports sending an application interface to another electronic device for display according to the user's actual selection, or according to the specific functions implemented by different application interfaces.
In a possible implementation manner, the windows of the first application interface, the second application interface, and the third application interface support adjustment of adaptation capabilities, including: stretching, zooming, hiding, folding, sharing, proportional scaling, and extension capabilities.
In a possible implementation manner, the processor is further configured to execute the instructions to retract, in response to an operation by which a user triggers retraction of the cooperative display, the first application interface from the second electronic device; after the retraction, the first electronic device displays the first interface, which includes at least the first application interface, the second application interface, and the third application interface. In this application, the primary device (i.e., the first electronic device) can retract an application interface sent to the secondary device (i.e., the second electronic device) at any time.
In a sixth aspect, a first electronic device is provided, the first electronic device comprising: a display unit, configured to display a first interface including at least a first application interface, a second application interface and a third application interface; and a processing unit, configured to, in response to detecting a touch operation with a second electronic device, send the content of the first application interface to the second electronic device through a transceiver unit for display; after the first electronic device sends the content of the first application interface to the second electronic device for display, the first electronic device displays a second interface, where the second interface includes the second application interface and the third application interface and does not include the first application interface.
The solution provided by the above sixth aspect is that the first electronic device may display the content of the multiple application interfaces (such as the first application interface, the second application interface, and the third application interface) on the first electronic device and the other electronic devices (such as the second electronic device) in a distributed and coordinated manner. Specifically, when the first electronic device displays the first application interface, the second application interface, and the third application interface, the first electronic device may send the content of one of the application interfaces (e.g., the first application interface) to the second electronic device for display. Through the scheme, a user can process different tasks through the plurality of electronic devices conveniently, and the user operation is facilitated.
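The send-and-retract flow of the sixth aspect can be sketched as a small state model. The class and method names below are assumptions for illustration only; the real behaviour involves inter-device transmission, which is elided here.

```python
# Illustrative model of the flow: a touch operation hands one application
# interface to the secondary device, and the user can later retract it.

class CollaborativeDisplay:
    def __init__(self, interfaces):
        self.primary = list(interfaces)   # interfaces shown on the first device
        self.secondary = []               # interfaces shown on the second device

    def on_touch(self, interface):
        """Touch operation detected: send `interface` to the secondary device."""
        self.primary.remove(interface)
        self.secondary.append(interface)

    def retract(self, interface):
        """User triggers retraction: take the interface back to the primary."""
        self.secondary.remove(interface)
        self.primary.append(interface)

d = CollaborativeDisplay(["first", "second", "third"])
d.on_touch("first")   # second interface: no first application interface
d.retract("first")    # first interface restored on the primary device
```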
In a possible implementation manner, the processing unit is further configured to, after the content of the first application interface is sent to the second electronic device for display, reallocate a display layout of the remaining application interfaces on the first electronic device. The method and the device support flexible adjustment of display layouts of a plurality of application interfaces.
In a possible implementation manner, the first application interface is determined by the processing unit according to a selection operation performed by the user on the application interfaces; or is determined according to the functions implemented by the first application interface, the second application interface and the third application interface. The present application supports sending a certain application interface to another electronic device for display according to the actual selection of the user, or according to the specific functions implemented by the different application interfaces.
In a possible implementation manner, the windows of the first application interface, the second application interface and the third application interface support adjustment of their adaptation capabilities; the adaptation capabilities include: a stretching capability, a zooming capability, a hiding capability, a folding capability, a sharing capability, a proportion capability and an extension capability.
In a possible implementation manner, the processing unit is further configured to retrieve the first application interface from the second electronic device in response to an operation of a user triggering to retrieve the collaborative display; after the first electronic device withdraws the first application interface from the second electronic device, the first electronic device displays a first interface, wherein the first interface at least comprises a first application interface, a second application interface and a third application interface. In the present application, the primary device (i.e. the first device) can retrieve the application interface sent to the secondary device (i.e. the second device) at any time.
In a seventh aspect, a communication system is provided, where the communication system includes a first electronic device and a second electronic device as in any possible implementation manner of the second, third, fifth or sixth aspect.
In an eighth aspect, a distributed display method of an interface is provided, the method comprising: displaying, by a first electronic device, a first interface, where the first interface includes an electronic document list, and the electronic document list includes at least one unread electronic document and at least one read electronic document; detecting, by the first electronic device, a touch operation with a second electronic device, and determining that the second electronic device is a secondary device of the first electronic device; searching, by the first electronic device, for the latest unread electronic document among the at least one unread electronic document, and displaying a detail interface of the latest unread electronic document on the second electronic device; and continuing, by the first electronic device, to display the electronic document list, where the latest unread electronic document in the electronic document list is marked as read.
In the solution provided by the above eighth aspect, the first electronic device may display the list of electronic documents and the detail interface of the unread electronic document (e.g., the latest unread electronic document) on the first electronic device and the other electronic devices (e.g., the second electronic device) in a distributed and coordinated manner. Specifically, when the first electronic device displays the electronic document list, the first electronic device may display a detailed interface of a latest unread electronic document on the second electronic device. Through the scheme, a user can process different tasks through the plurality of electronic devices conveniently, and the user operation is facilitated.
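The lookup-and-mark behaviour of the eighth aspect can be sketched as follows. The document structure (dicts with `time` and `read` keys) is an assumption for illustration; the patent does not specify a data model.

```python
# Sketch: find the latest unread document, hand its detail interface to the
# secondary device, and mark it as read in the list kept on the primary device.

def open_latest_unread(documents):
    """Return the latest unread document and mark it as read in place."""
    unread = [d for d in documents if not d["read"]]
    if not unread:
        return None
    latest = max(unread, key=lambda d: d["time"])
    latest["read"] = True   # the list on the primary device shows it as read
    return latest           # its detail interface goes to the secondary device

docs = [
    {"id": 1, "time": 10, "read": True},
    {"id": 2, "time": 20, "read": False},
    {"id": 3, "time": 30, "read": False},
]
opened = open_latest_unread(docs)   # the most recent unread entry
```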
In a possible implementation manner, the searching, by the first electronic device, for the latest unread electronic document among the at least one unread electronic document, and displaying the detail interface of the latest unread electronic document on the second electronic device, includes: in response to the touch operation, searching, by the first electronic device, for the latest unread electronic document among the at least one unread electronic document, and sending the detail interface of the latest unread electronic document to the second electronic device for display. In the present application, when displaying the electronic document list, in response to receiving a preset operation (for example, a touch operation between the second electronic device and the first electronic device), the first electronic device may send the detail interface of the latest unread electronic document to the second electronic device for display, so that the user can process different tasks through the plurality of electronic devices, which facilitates user operation.
In one possible implementation, the electronic document list is any one of the following: an email list, a short message list, or a memo list. The scheme provided by the application supports distributed cooperative display of the list interface and the detail interface of electronic documents such as e-mails, short messages or memos and the like on different electronic equipment, so that a user can process different tasks through a plurality of electronic equipment, and the operation of the user is facilitated.
In a possible implementation manner, the electronic document list is an email list or a short message list; the method further comprises the following steps: in response to receiving the editing operation of the user on the detail interface of the latest unread electronic document, the second electronic device sends the input keyboard to the first electronic device for displaying, and the second electronic device only displays the editing interface of the latest unread electronic document. In the application, when the second electronic device displays the editing interface of one electronic document in the electronic document list on the first electronic device in a distributed and cooperative manner, the second electronic device can also display the input keyboard on the first electronic device in a distributed and cooperative manner, so that a user can edit the electronic document through the first electronic device conveniently, and the operation of the user is facilitated.
In a possible implementation manner, the electronic document list is an email list or a short message list; the method further comprises the following steps: in response to receiving an editing operation of a user on the detail interface of the latest unread electronic document, the second electronic device displays an input keyboard on the editing interface of the latest unread electronic document in a floating or overlaying manner; and responding to the detection of the touch operation with the first electronic equipment, the second electronic equipment sends the input keyboard to the first electronic equipment for displaying, and the second electronic equipment only displays the editing interface of the latest unread electronic document. In the application, when the second electronic device displays the editing interface of one electronic document in the electronic document list on the first electronic device in a distributed and cooperative manner, the second electronic device can also display the input keyboard displayed in a suspended or superposed manner on the first electronic device in a distributed and cooperative manner, so that a user can edit the electronic document through the first electronic device, and the operation of the user is facilitated.
In a possible implementation manner, the electronic document list is an email list or a short message list; the method further comprises the following steps: and in response to receiving a reply operation of the user to the latest unread electronic document, the second electronic equipment sends the input keyboard to the first electronic equipment for displaying, and the second electronic equipment only displays a reply interface of the latest unread electronic document. In the application, when the second electronic device displays the reply interface of one electronic document in the electronic document list on the first electronic device in a distributed and cooperative manner, the second electronic device can also display the input keyboard on the first electronic device in a distributed and cooperative manner, so that a user can reply the electronic document through the first electronic device, and the operation of the user is facilitated.
In a possible implementation manner, the electronic document list is an email list or a short message list; the method further comprises: in response to receiving a reply operation of the user on the latest unread electronic document, displaying, by the second electronic device, an input keyboard on the reply interface of the latest unread electronic document in a floating or overlaid manner; and in response to detecting the touch operation with the first electronic device, sending, by the second electronic device, the input keyboard to the first electronic device for display, where the second electronic device then displays only the reply interface of the latest unread electronic document. In the present application, when the second electronic device displays, in a distributed and cooperative manner, the reply interface of one electronic document in the electronic document list together with the first electronic device, the second electronic device may also hand the floating or overlaid input keyboard over to the first electronic device for display, so that the user can reply to the electronic document through the first electronic device, which facilitates user operation.
In one possible implementation, the electronic document list is a memo list; the method further comprises the following steps: and in response to receiving the editing operation of the user on the latest unprocessed memo note, the second electronic equipment sends the input keyboard to the first electronic equipment for displaying, and the second electronic equipment only displays the editing interface of the latest unprocessed memo note. In the application, when the second electronic device distributively and cooperatively displays the editing interface of one electronic document (such as the latest unprocessed memo note) in the electronic document list on the first electronic device, the second electronic device may also distributively and cooperatively display the suspended or overlaid displayed input keyboard on the first electronic device, so that a user can conveniently process the latest unprocessed memo note through the first electronic device, and the user operation is facilitated.
In one possible implementation, the electronic document list is a memo list; the method further comprises: in response to receiving an editing operation of the user on the latest unprocessed memo note, displaying, by the second electronic device, an input keyboard on the editing interface of the latest unprocessed memo note in a floating or overlaid manner; and in response to detecting the touch operation with the first electronic device, sending, by the second electronic device, the input keyboard to the first electronic device for display, where the second electronic device then displays only the editing interface of the latest unprocessed memo note. In the present application, when the second electronic device displays, in a distributed and cooperative manner, the editing interface of one electronic document (such as the latest unprocessed memo note) in the electronic document list together with the first electronic device, the second electronic device may also hand the floating or overlaid input keyboard over to the first electronic device for display, so that the user can conveniently process the electronic document through the first electronic device, which facilitates user operation.
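The keyboard hand-off shared by the implementations above can be sketched in a few lines. The screen contents are modeled as simple lists of element names; all names are assumptions for illustration.

```python
# Sketch: when the user starts editing or replying on the secondary device,
# the input keyboard is moved to the primary device, leaving only the
# editing (or reply) interface on the secondary device.

def on_edit_start(primary_screen, secondary_screen):
    """Move the input keyboard from the secondary to the primary device."""
    if "keyboard" in secondary_screen:
        secondary_screen.remove("keyboard")
    primary_screen.append("keyboard")
    return primary_screen, secondary_screen

primary = ["document_list"]
secondary = ["edit_interface", "keyboard"]  # keyboard floated over the editor
primary, secondary = on_edit_start(primary, secondary)
```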
In a possible implementation manner, the determining, by the first electronic device, that the second electronic device is a secondary device of the first electronic device includes: determining, by the first electronic device according to a second operation of the user, that the second electronic device is a secondary device of the first electronic device; the second operation is a selection operation performed by the user on a prompt box; the prompt box is displayed by the first electronic device on a display screen of the first electronic device in response to the touch operation, and the prompt box is used for the user to select the primary device and the secondary device. In the present application, the first electronic device may determine, according to a selection of the user, the second electronic device (i.e. the secondary device) that performs distributed cooperative display with the first electronic device.
In a possible implementation manner, the determining, by the first electronic device, that the second electronic device is a secondary device of the first electronic device includes: determining, by the first electronic device according to a second operation of the user, that the second electronic device is a secondary device of the first electronic device; the second operation is a selection operation performed by the user on a prompt box; the prompt box is displayed by the first electronic device on a display screen of the first electronic device in response to the user dragging a draggable icon, the prompt box is used for the user to select the primary device and the secondary device, and the draggable icon is displayed by the first electronic device on the display screen of the first electronic device in response to the touch operation. In the present application, the first electronic device may determine, according to a selection of the user, the second electronic device (i.e. the secondary device) that performs distributed cooperative display with the first electronic device.
In a possible implementation manner, the determining, by the first electronic device, that the second electronic device is a secondary device of the first electronic device includes: determining, by the first electronic device according to motion data of the first electronic device, that the second electronic device is a secondary device of the first electronic device; the motion data of the first electronic device indicates that the first electronic device is static or that its motion acceleration is smaller than a preset threshold. In the present application, the first electronic device may determine, according to its own motion data, the second electronic device (i.e. the secondary device) that performs distributed cooperative display with the first electronic device.
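The motion-data rule above amounts to a simple threshold check: the device that is stationary (or nearly so) when the touch occurs keeps the primary role, so the device tapped against it is treated as the secondary device. The threshold value below is an illustrative assumption.

```python
# Sketch of the motion-data rule: a stationary device stays primary, so the
# moving device that was tapped against it becomes the secondary device.

ACCEL_THRESHOLD = 0.5  # m/s^2, assumed preset threshold

def is_primary_on_touch(own_acceleration):
    """Return True if this device's motion data indicates it is static
    (acceleration below the preset threshold) and it should stay primary."""
    return abs(own_acceleration) < ACCEL_THRESHOLD

# A phone lying on a desk (acceleration ~0) stays primary; the tablet tapped
# against it reports a larger acceleration and becomes the secondary device.
```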
In a ninth aspect, a communication system is provided, comprising: a first electronic device, configured to display a first interface, where the first interface includes an electronic document list, and the electronic document list includes at least one unread electronic document and at least one read electronic document; detect a touch operation with a second electronic device, and determine that the second electronic device is a secondary device of the first electronic device; and search for the latest unread electronic document among the at least one unread electronic document, and display a detail interface of the latest unread electronic document on the second electronic device; and the second electronic device, configured to display the detail interface of the latest unread electronic document from the first electronic device; where after the first electronic device sends the detail interface of the latest unread electronic document to the second electronic device, the first electronic device continues to display the electronic document list, in which the latest unread electronic document is marked as read.
In the method provided by the ninth aspect, the first electronic device may display the electronic document list and the detail interface of the unread electronic document (e.g., the latest unread electronic document) on the first electronic device and the other electronic devices (e.g., the second electronic device) in a distributed and coordinated manner. Specifically, when the first electronic device displays the electronic document list, the first electronic device may display a detailed interface of a latest unread electronic document on the second electronic device. Through the scheme, a user can process different tasks through the plurality of electronic devices conveniently, and the user operation is facilitated.
In a possible implementation manner, the first electronic device is specifically configured to, in response to the touch operation, search for a latest unread electronic document in the at least one unread electronic document, and send a detailed interface of the latest unread electronic document to the second electronic device for displaying. In the application, when the electronic document list is displayed, in response to receiving a preset operation (for example, a touch operation between the second electronic device and the first electronic device), the first electronic device may send the detail interface of the latest unread electronic document to the second electronic device for displaying, so that a user can process different tasks through the plurality of electronic devices, and the user operation is facilitated.
In one possible implementation, the electronic document list is any one of the following: an email list, a short message list, or a memo list. The scheme provided by the application supports distributed cooperative display of the list interface and the detail interface of electronic documents such as e-mails, short messages or memos and the like on different electronic equipment, so that a user can process different tasks through a plurality of electronic equipment, and the operation of the user is facilitated.
In a possible implementation manner, the electronic document list is an email list or a short message list; the second electronic device is further used for responding to the editing operation of the receiving user on the detail interface of the latest unread electronic document, and sending the input keyboard to the first electronic device for displaying; after the second electronic device sends the input keyboard to the first electronic device for displaying, the first electronic device displays the input keyboard, and the second electronic device only displays the editing interface of the latest unread electronic document. In the application, when the second electronic device displays the editing interface of one electronic document in the electronic document list on the first electronic device in a distributed and cooperative manner, the second electronic device can also display the input keyboard on the first electronic device in a distributed and cooperative manner, so that a user can edit the electronic document through the first electronic device conveniently, and the operation of the user is facilitated.
In a possible implementation manner, the electronic document list is an email list or a short message list; the second electronic device is further configured to, in response to receiving an editing operation of the user on the detail interface of the latest unread electronic document, display an input keyboard on the editing interface of the latest unread electronic document in a floating or overlaid manner; and, in response to detecting the touch operation with the first electronic device, send the input keyboard to the first electronic device for display; after the second electronic device sends the input keyboard to the first electronic device for display, the first electronic device displays the input keyboard, and the second electronic device displays only the editing interface of the latest unread electronic document. In the present application, when the second electronic device displays, in a distributed and cooperative manner, the editing interface of one electronic document in the electronic document list together with the first electronic device, the second electronic device may also hand the floating or overlaid input keyboard over to the first electronic device for display, so that the user can edit the electronic document through the first electronic device, which facilitates user operation.
In a possible implementation manner, the electronic document list is an email list or a short message list; the second electronic device is further configured to send the input keyboard to the first electronic device for display in response to receiving a reply operation of the user to the latest unread electronic document; after the second electronic device sends the input keyboard to the first electronic device for displaying, the first electronic device displays the input keyboard, and the second electronic device only displays the reply interface of the latest unread electronic document. In the application, when the second electronic device displays the reply interface of one electronic document in the electronic document list on the first electronic device in a distributed and cooperative manner, the second electronic device can also display the input keyboard on the first electronic device in a distributed and cooperative manner, so that a user can reply the electronic document through the first electronic device, and the operation of the user is facilitated.
In a possible implementation manner, the electronic document list is an email list or a short message list; the second electronic device is further configured to, in response to receiving a reply operation of the user on the latest unread electronic document, display an input keyboard on the reply interface of the latest unread electronic document in a floating or overlaid manner; and, in response to detecting the touch operation with the first electronic device, send the input keyboard to the first electronic device for display; after the second electronic device sends the input keyboard to the first electronic device for display, the first electronic device displays the input keyboard, and the second electronic device displays only the reply interface of the latest unread electronic document. In the present application, when the second electronic device displays, in a distributed and cooperative manner, the reply interface of one electronic document in the electronic document list together with the first electronic device, the second electronic device may also hand the floating or overlaid input keyboard over to the first electronic device for display, so that the user can reply to the electronic document through the first electronic device, which facilitates user operation.
In one possible implementation, the electronic document list is a memo list; the second electronic device is further used for responding to the received editing operation of the user on the latest unprocessed memorandum note, and sending the input keyboard to the first electronic device for displaying; after the second electronic device sends the input keyboard to the first electronic device for displaying, the first electronic device displays the input keyboard, and the second electronic device only displays the editing interface of the latest unprocessed memo note. In the application, when the second electronic device distributively and cooperatively displays the editing interface of one electronic document (such as the latest unprocessed memo note) in the electronic document list on the first electronic device, the second electronic device may also distributively and cooperatively display the suspended or overlaid displayed input keyboard on the first electronic device, so that a user can conveniently process the latest unprocessed memo note through the first electronic device, and the user operation is facilitated.
In one possible implementation, the electronic document list is a memo list; the second electronic device is further used for displaying an input keyboard on an editing interface of the latest unprocessed memo note in a floating or overlapping manner in response to receiving the editing operation of the latest unprocessed memo note by the user; responding to the detected touch operation with the first electronic equipment, and sending an input keyboard to the first electronic equipment for displaying; after the second electronic device sends the input keyboard to the first electronic device for displaying, the first electronic device displays the input keyboard, and the second electronic device only displays the editing interface of the latest unprocessed memo note. In the application, when the second electronic device distributively and cooperatively displays the editing interface of one electronic document (such as the latest unprocessed memo note) in the electronic document list on the first electronic device, the second electronic device may also distributively and cooperatively display the suspended or overlaid and displayed input keyboard on the first electronic device, so that a user can conveniently process the electronic document through the first electronic device, and the operation of the user is facilitated.
In a possible implementation manner, the determining, by the first electronic device, that the second electronic device is a secondary device of the first electronic device includes: determining, by the first electronic device according to a second operation of the user, that the second electronic device is a secondary device of the first electronic device; the second operation is a selection operation performed by the user on a prompt box; the prompt box is displayed by the first electronic device on a display screen of the first electronic device in response to the touch operation, and the prompt box is used for the user to select the primary device and the secondary device. In the present application, the first electronic device may determine, according to a selection of the user, the second electronic device (i.e. the secondary device) that performs distributed cooperative display with the first electronic device.
In a possible implementation manner, the determining, by the first electronic device, that the second electronic device is a secondary device of the first electronic device includes: determining, by the first electronic device according to a second operation of the user, that the second electronic device is a secondary device of the first electronic device; the second operation is a selection operation performed by the user on a prompt box; the prompt box is displayed by the first electronic device on a display screen of the first electronic device in response to the user dragging a draggable icon, the prompt box is used for the user to select the primary device and the secondary device, and the draggable icon is displayed by the first electronic device on the display screen of the first electronic device in response to the touch operation. In the present application, the first electronic device may determine, according to a selection of the user, the second electronic device (i.e. the secondary device) that performs distributed cooperative display with the first electronic device.
In a possible implementation manner, the determining, by the first electronic device, that the second electronic device is a secondary device of the first electronic device includes: determining, by the first electronic device according to motion data of the first electronic device, that the second electronic device is a secondary device of the first electronic device; the motion data of the first electronic device indicates that the first electronic device is static or that its motion acceleration is smaller than a preset threshold. In the present application, the first electronic device may determine, according to its own motion data, the second electronic device (i.e. the secondary device) that performs distributed cooperative display with the first electronic device.
In a tenth aspect, there is provided a first electronic device comprising: a memory for storing computer program code, the computer program code comprising instructions; the radio frequency circuit is used for transmitting and receiving wireless signals; a processor configured to execute the instructions to cause a first electronic device to display a first interface, wherein the first interface includes a list of electronic documents, and the list of electronic documents includes at least one unread electronic document and at least one read electronic document; detecting a touch operation with second electronic equipment, and determining that the second electronic equipment is a secondary device of the first electronic equipment; searching a latest unread electronic document in the at least one unread electronic document, and displaying a detail interface of the latest unread electronic document on the second electronic equipment; wherein after displaying the detail interface of the latest one of the unread electronic documents on the second electronic device, the first electronic device continues to display the list of electronic documents, wherein the latest one of the unread electronic documents in the list of electronic documents is marked as a read state.
In the solution provided by the tenth aspect above, the first electronic device may display the electronic document list and the detail interface of an unread electronic document (e.g., the latest unread electronic document) on the first electronic device and another electronic device (e.g., the second electronic device) in a distributed and cooperative manner. Specifically, when the first electronic device displays the electronic document list, the first electronic device may display the detail interface of the latest unread electronic document on the second electronic device. Through this scheme, a user can conveniently process different tasks on a plurality of electronic devices, which facilitates user operation.
In a possible implementation manner, the processor is specifically configured to execute the instructions, so that the first electronic device searches, in response to the touch operation, for the latest unread electronic document among the at least one unread electronic document, and sends, through the radio frequency circuit, the detail interface of the latest unread electronic document to the second electronic device for display. In this application, when displaying the electronic document list, in response to receiving a preset operation (for example, a touch operation between the second electronic device and the first electronic device), the first electronic device may send the detail interface of the latest unread electronic document to the second electronic device for display, so that a user can process different tasks on a plurality of electronic devices, which facilitates user operation.
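Purely as an illustrative sketch of the flow described in this implementation (the data model and the `send_to_secondary` callback are hypothetical and not part of this application), the touch-triggered distribution might be modeled as:

```python
from dataclasses import dataclass

@dataclass
class ElectronicDocument:
    title: str
    received_at: int      # e.g. a Unix timestamp
    read: bool = False

def on_touch_operation(document_list, send_to_secondary):
    """On detecting the touch operation with the second device: find the
    latest unread document, send its detail interface to the secondary
    device for display, then mark it as read in the primary's list."""
    unread = [d for d in document_list if not d.read]
    if not unread:
        return None                      # nothing to distribute
    latest = max(unread, key=lambda d: d.received_at)
    send_to_secondary(latest)            # e.g. via the radio frequency circuit
    latest.read = True                   # list now shows it in the read state
    return latest

# Usage: an inbox with two unread messages; the newest one is pushed to the
# secondary device while the primary keeps showing the (updated) list.
inbox = [ElectronicDocument("Weekly report", 100, read=True),
         ElectronicDocument("Meeting notes", 200),
         ElectronicDocument("Travel plan", 300)]
sent = []
on_touch_operation(inbox, sent.append)
```

Note that the primary device retains the list and only flips the read state locally; the secondary device receives just the detail interface of the one selected document.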
In one possible implementation, the electronic document list is any one of the following: an email list, a short message list, or a memo list. The solution provided by this application supports distributed cooperative display of the list interface and the detail interface of electronic documents such as emails, short messages, or memos on different electronic devices, so that a user can process different tasks on a plurality of electronic devices, which facilitates user operation.
In a possible implementation manner, the processor is specifically configured to execute the instructions, so that the first electronic device determines, according to a second operation of a user, that the second electronic device is a secondary device of the first electronic device; the second operation is a selection operation performed by the user on a prompt box; the prompt box is displayed on a display screen of the first electronic device by the first electronic device in response to the touch operation, and is used by the user to select the primary device and the secondary device. In this application, the first electronic device may determine, according to a selection of the user, the second electronic device (i.e., the secondary device) that performs distributed cooperative display together with the first electronic device.
In a possible implementation manner, the processor is specifically configured to execute the instructions, so that the first electronic device determines, according to a second operation of a user, that the second electronic device is a secondary device of the first electronic device; the second operation is a selection operation performed by the user on a prompt box; the prompt box is displayed on a display screen of the first electronic device by the first electronic device in response to a drag of a draggable icon, and is used by the user to select the primary device and the secondary device, and the draggable icon is displayed on the display screen of the first electronic device by the first electronic device in response to the touch operation. In this application, the first electronic device may determine, according to a selection of the user, the second electronic device (i.e., the secondary device) that performs distributed cooperative display together with the first electronic device.
In a possible implementation manner, the processor is specifically configured to execute the instructions, so that the first electronic device determines, according to the motion data of the first electronic device, that the second electronic device is a secondary device of the first electronic device; the motion data of the first electronic device indicates that the first electronic device is static or that the motion acceleration of the first electronic device is smaller than a preset threshold. In this application, the first electronic device may determine, according to its own motion data, the second electronic device (i.e., the secondary device) that performs distributed cooperative display together with the first electronic device.
In an eleventh aspect, a first electronic device is provided, including: a display screen configured to display a first interface, where the first interface includes an electronic document list, and the electronic document list includes at least one unread electronic document and at least one read electronic document; and a processing unit configured to: when the first electronic device detects a touch operation with a second electronic device, determine that the second electronic device is a secondary device of the first electronic device; and search for the latest unread electronic document among the at least one unread electronic document, and display a detail interface of the latest unread electronic document on the second electronic device; where after the detail interface of the latest unread electronic document is displayed on the second electronic device, the first electronic device continues to display the electronic document list, in which the latest unread electronic document is marked as being in a read state.
In the solution provided by the eleventh aspect above, the first electronic device may display the electronic document list and the detail interface of an unread electronic document (e.g., the latest unread electronic document) on the first electronic device and another electronic device (e.g., the second electronic device) in a distributed and cooperative manner. Specifically, when the first electronic device displays the electronic document list, the first electronic device may display the detail interface of the latest unread electronic document on the second electronic device. Through this scheme, a user can conveniently process different tasks on a plurality of electronic devices, which facilitates user operation.
In a possible implementation manner, the first electronic device further includes a transceiver unit; the processing unit is specifically configured to enable the first electronic device to search, in response to the touch operation, for the latest unread electronic document among the at least one unread electronic document, and send, through the transceiver unit, the detail interface of the latest unread electronic document to the second electronic device for display. In this application, when displaying the electronic document list, in response to receiving a preset operation (for example, a touch operation between the second electronic device and the first electronic device), the first electronic device may send the detail interface of the latest unread electronic document to the second electronic device for display, so that a user can process different tasks on a plurality of electronic devices, which facilitates user operation.
In one possible implementation, the electronic document list is any one of the following: an email list, a short message list, or a memo list. The solution provided by this application supports distributed cooperative display of the list interface and the detail interface of electronic documents such as emails, short messages, or memos on different electronic devices, so that a user can process different tasks on a plurality of electronic devices, which facilitates user operation.
In a possible implementation manner, the processing unit is specifically configured to enable the first electronic device to determine, according to a second operation of a user, that the second electronic device is a secondary device of the first electronic device; the second operation is a selection operation performed by the user on a prompt box; the prompt box is displayed on a display screen of the first electronic device by the first electronic device in response to the touch operation, and is used by the user to select the primary device and the secondary device. In this application, the first electronic device may determine, according to a selection of the user, the second electronic device (i.e., the secondary device) that performs distributed cooperative display together with the first electronic device.
In a possible implementation manner, the processing unit is specifically configured to enable the first electronic device to determine, according to a second operation of a user, that the second electronic device is a secondary device of the first electronic device; the second operation is a selection operation performed by the user on a prompt box; the prompt box is displayed on a display screen of the first electronic device by the first electronic device in response to a drag of a draggable icon, and is used by the user to select the primary device and the secondary device, and the draggable icon is displayed on the display screen of the first electronic device by the first electronic device in response to the touch operation. In this application, the first electronic device may determine, according to a selection of the user, the second electronic device (i.e., the secondary device) that performs distributed cooperative display together with the first electronic device.
In a possible implementation manner, the processing unit is specifically configured to enable the first electronic device to determine, according to the motion data of the first electronic device, that the second electronic device is a secondary device of the first electronic device; the motion data of the first electronic device indicates that the first electronic device is static or that the motion acceleration of the first electronic device is smaller than a preset threshold. In this application, the first electronic device may determine, according to its own motion data, the second electronic device (i.e., the secondary device) that performs distributed cooperative display together with the first electronic device.
In a twelfth aspect, a computer-readable storage medium is provided, having stored thereon computer-executable instructions that, when executed by a processor, implement a method as in any one of the possible implementations of the first, fourth or eighth aspect.
In a thirteenth aspect, a chip system is provided, including a processor, a memory, and instructions stored in the memory; when the instructions are executed by the processor, the method in any one of the possible implementations of the first, fourth, or eighth aspect is implemented. The chip system may consist of a chip, or may include a chip and other discrete components.
In a fourteenth aspect, a computer program product is provided, which, when run on a computer, causes the computer to perform the method in any one of the possible implementations of the first, fourth, or eighth aspect.
Drawings
FIG. 1 is a schematic diagram of three multi-device display modes;
FIG. 2 is a schematic diagram of a hardware structure of an electronic device according to an embodiment of the present application;
FIG. 3 is a schematic diagram of a software structure of an electronic device according to an embodiment of the present application;
FIG. 4 is a first diagram of an example application scenario of distributed display of two interfaces according to an embodiment of the present application;
FIG. 5 is a second diagram of an example application scenario of distributed display of two interfaces according to an embodiment of the present application;
FIG. 6 is a third diagram of an example application scenario of distributed display of two interfaces according to an embodiment of the present application;
FIG. 7 is a diagram of an example of a first operation according to an embodiment of the present application;
FIG. 8 is a diagram of three examples of the first operation according to an embodiment of the present application;
FIG. 9 is a diagram of three other examples of the first operation according to an embodiment of the present application;
FIG. 10 is a schematic diagram of a method for determining a primary device and a secondary device by an electronic device according to an embodiment of the present application;
FIG. 11A is a schematic diagram of another method for determining a primary device and a secondary device by an electronic device according to an embodiment of the present application;
FIG. 11B is a schematic diagram of a method for determining a primary device and a secondary device by a third electronic device according to an embodiment of the present application;
FIG. 12 is a schematic diagram of an operation method for removing a draggable icon according to an embodiment of the present application;
FIG. 13 is a first diagram of an example of distributed display of an interface according to an embodiment of the present application;
FIG. 14 is a second diagram of an example of distributed display of an interface according to an embodiment of the present application;
FIG. 15A is a third diagram of an example of distributed display of an interface according to an embodiment of the present application;
FIG. 15B is a fourth diagram of an example of distributed display of an interface according to an embodiment of the present application;
FIG. 15C is a fifth diagram of an example of distributed display of an interface according to an embodiment of the present application;
FIG. 16 is a sixth diagram of an example of distributed display of an interface according to an embodiment of the present application;
FIG. 17 is a seventh diagram of an example of distributed display of an interface according to an embodiment of the present application;
FIG. 18 is an eighth diagram of an example of distributed display of an interface according to an embodiment of the present application;
FIG. 19 is a ninth diagram of an example of distributed display of an interface according to an embodiment of the present application;
FIG. 20 is a tenth diagram of an example of distributed display of an interface according to an embodiment of the present application;
FIG. 21 is an eleventh diagram of an example of distributed display of an interface according to an embodiment of the present application;
FIG. 22 is a twelfth diagram of an example of distributed display of an interface according to an embodiment of the present application;
FIG. 23 is a thirteenth diagram of an example of distributed display of an interface according to an embodiment of the present application;
FIG. 24 is a diagram of examples of several adaptation capabilities according to an embodiment of the present application;
FIG. 25 is a schematic diagram of selecting a distributed interface according to an embodiment of the present application;
FIG. 26 is a schematic diagram of retrieving a distributed interface according to an embodiment of the present application;
FIG. 27 is a schematic diagram of another example of retrieving a distributed interface according to an embodiment of the present application;
FIG. 28 is a block diagram of an electronic device according to an embodiment of the present application;
FIG. 29 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described below with reference to the drawings in the embodiments of the present application. In the description of the embodiments of the present application, unless otherwise specified, "/" means "or"; for example, A/B may mean A or B. "And/or" herein merely describes an association relationship between associated objects, and indicates that three relationships may exist; for example, A and/or B may mean: only A exists, both A and B exist, or only B exists. In addition, in the description of the embodiments of the present application, "a plurality of" means two or more.
In the following, the terms "first" and "second" are used for descriptive purposes only, and shall not be understood as indicating or implying relative importance or implicitly indicating the number of indicated technical features. Therefore, a feature defined with "first" or "second" may explicitly or implicitly include one or more of the features. In the description of the embodiments, unless otherwise specified, "a plurality of" means two or more.
The embodiment of the application provides a distributed display method of an interface, which can be used for displaying different contents on one interface on different electronic devices in a distributed and cooperative manner.
The electronic device in this application includes one or more display screens. For example, the electronic device may be a smart phone, a netbook, a tablet computer, a smart camera, a palmtop computer, a personal digital assistant (PDA), a portable multimedia player (PMP), an augmented reality (AR)/virtual reality (VR) device, a notebook computer, a personal computer (PC), an ultra-mobile personal computer (UMPC), or the like. Alternatively, the electronic device may be an electronic device of another type or structure that includes a display screen, which is not limited in this application.
Referring to fig. 2, fig. 2 shows a hardware structure diagram of an electronic device according to an embodiment of the present application, taking a smart phone as an example. As shown in fig. 2, the electronic device may include a processor 210, a memory (including an external memory interface 220 and an internal memory 221), a Universal Serial Bus (USB) interface 230, a charging management module 240, a power management module 241, a battery 242, an antenna 1, an antenna 2, a mobile communication module 250, a wireless communication module 260, an audio module 270, a speaker 270A, a receiver 270B, a microphone 270C, an earphone interface 270D, a sensor module 280, keys 290, a motor 291, an indicator 292, a camera 293, a display screen 294, and a Subscriber Identity Module (SIM) card interface 295, and the like. The sensor module 280 may include a gyro sensor 280A, an acceleration sensor 280B, a magnetic sensor 280C, a touch sensor 280D, a fingerprint sensor 280E, and a pressure sensor 280F, among others. In some embodiments, the sensor module 280 may also include a barometric pressure sensor, a distance sensor, a proximity light sensor, a temperature sensor, an ambient light sensor, a bone conduction sensor, and the like.
It can be understood that the structure illustrated in this embodiment of the present application does not constitute a specific limitation on the electronic device. In other embodiments of the present application, the electronic device may include more or fewer components than those shown, or some components may be combined, some components may be split, or a different arrangement of components may be used. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Processor 210 may include one or more processing units. For example: the processor 210 may include an Application Processor (AP), a modem processor, a Graphics Processing Unit (GPU), an Image Signal Processor (ISP), a flight controller, a video codec, a Digital Signal Processor (DSP), a baseband processor, and/or a neural-Network Processing Unit (NPU), among others. The different processing units may be separate devices or may be integrated into one or more processors.
A memory may also be provided in processor 210 for storing instructions and data. In some embodiments, the memory in the processor 210 is a cache memory. The memory may hold instructions or data that have just been used or recycled by processor 210. If the processor 210 needs to use the instruction or data again, it can be called directly from the memory. Avoiding repeated accesses reduces the latency of the processor 210, thereby increasing the efficiency of the system.
In some embodiments, processor 210 may include one or more interfaces. The interface may include an integrated circuit (I2C) interface, an integrated circuit built-in audio (I2S) interface, a Pulse Code Modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a Mobile Industry Processor Interface (MIPI), a general-purpose input/output (GPIO) interface, a Subscriber Identity Module (SIM) interface, and/or a Universal Serial Bus (USB) interface, etc.
The charge management module 240 is configured to receive a charging input from a charger. The charger may be a wireless charger or a wired charger. In some wired charging embodiments, the charging management module 240 may receive charging input from a wired charger via the USB interface 230. In some wireless charging embodiments, the charging management module 240 may receive a wireless charging input through a wireless charging coil of the electronic device. The charging management module 240 may also supply power to the electronic device through the power management module 241 while charging the battery 242.
The power management module 241 is used to connect the battery 242, the charging management module 240 and the processor 210. The power management module 241 receives input from the battery 242 and/or the charging management module 240, and provides power to the processor 210, the internal memory 221, the display 294, the camera 293, and the wireless communication module 260. The power management module 241 may also be used to monitor parameters such as battery capacity, battery cycle number, battery state of health (leakage, impedance), etc. In some embodiments, the power management module 241 may also be disposed in the processor 210. In other embodiments, the power management module 241 and the charging management module 240 may be disposed in the same device.
The wireless communication function of the electronic device may be implemented by the antenna 1, the antenna 2, the mobile communication module 250, the wireless communication module 260, the modem processor, the baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in an electronic device may be used to cover a single or multiple communications bands. Different antennas can also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed as a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 250 may provide a solution including 2G/3G/4G/5G wireless communication applied on the electronic device. The mobile communication module 250 may include at least one filter, a switch, a power amplifier, a Low Noise Amplifier (LNA), and the like. The mobile communication module 250 may receive the electromagnetic wave from the antenna 1, filter, amplify, etc. the received electromagnetic wave, and transmit the electromagnetic wave to the modem processor for demodulation. The mobile communication module 250 may also amplify the signal modulated by the modem processor, and convert the signal into electromagnetic wave through the antenna 1 to radiate the electromagnetic wave. In some embodiments, at least some of the functional modules of the mobile communication module 250 may be disposed in the processor 210. In some embodiments, at least some of the functional modules of the mobile communication module 250 may be disposed in the same device as at least some of the modules of the processor 210.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating a low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then passes the demodulated low frequency baseband signal to a baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then transferred to the application processor. The application processor outputs sound signals through an audio device (not limited to the speaker 270A, the receiver 270B, etc.) or displays images or video through the display screen 294. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be separate from the processor 210, and may be disposed in the same device as the mobile communication module 250 or other functional modules.
The wireless communication module 260 may provide solutions for wireless communication applied to the electronic device, including wireless local area networks (WLANs) (such as Wi-Fi networks), Bluetooth (BT), global navigation satellite systems (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR), and the like. The wireless communication module 260 may be one or more devices integrating at least one communication processing module. The wireless communication module 260 receives electromagnetic waves via the antenna 2, performs frequency modulation and filtering processing on the electromagnetic wave signals, and transmits the processed signals to the processor 210. The wireless communication module 260 may also receive a signal to be transmitted from the processor 210, frequency-modulate and amplify the signal, and convert the signal into electromagnetic waves via the antenna 2 for radiation.
In some embodiments, the antenna 1 of the electronic device is coupled to the mobile communication module 250, and the antenna 2 is coupled to the wireless communication module 260, so that the electronic device can communicate with a network and other devices through wireless communication technologies. The wireless communication technologies may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technologies, and the like. The GNSS may include a global positioning system (GPS), a global navigation satellite system (GLONASS), a BeiDou navigation satellite system (BDS), a quasi-zenith satellite system (QZSS), and/or a satellite based augmentation system (SBAS).
The electronic device implements display functions via the GPU, the display screen 294, and the application processor. The GPU is a microprocessor for image processing, and is connected to the display screen 294 and an application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 210 may include one or more GPUs that execute program instructions to generate or alter display information.
The display screen 294 is used to display images, videos, and the like. The display screen 294 includes a display panel. The display panel may be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), or the like. In some embodiments, the electronic device may include 1 or N display screens 294, where N is a positive integer greater than 1.
The electronic device may implement a shooting function through the ISP, the camera 293, the video codec, the GPU, the display screen 294, and the application processor.
The external memory interface 220 may be used to connect an external memory card, such as a Micro SD card, to extend the storage capability of the electronic device. The external memory card communicates with the processor 210 through the external memory interface 220 to implement a data storage function. For example, files such as music, video, etc. are saved in an external memory card.
The internal memory 221 may be used to store computer-executable program code, where the executable program code includes instructions. The internal memory 221 may include a program storage area and a data storage area. The program storage area may store an operating system, an application program required by at least one function (such as a sound playing function or an image playing function), and the like. The data storage area may store data (such as audio data and a phone book) created during use of the electronic device. In addition, the internal memory 221 may include a high-speed random access memory, and may further include a nonvolatile memory, such as at least one magnetic disk storage device, a flash memory device, or a universal flash storage (UFS). The processor 210 executes various functional applications and data processing of the electronic device by running the instructions stored in the internal memory 221 and/or the instructions stored in the memory provided in the processor.
The gyro sensor 280A may be used to determine a pose during motion of the electronic device. In some embodiments, the angular velocity of the electronic device in the preset coordinate system may be determined by the gyro sensor 280A.
The acceleration sensor 280B may detect the direction and magnitude of the motion acceleration of the electronic device. When the electronic device is stationary, it may detect the magnitude and direction of gravity. It may also be used to recognize the posture of the electronic device, and is applied in applications such as pedometers.
The magnetic sensor 280C is a device that converts a change in a magnetic property of a sensitive element, caused by an external factor such as a magnetic field, a current, stress strain, temperature, or light, into an electrical signal, thereby detecting the corresponding physical quantity. In some embodiments, the magnetic sensor can measure the angle between the electronic device and each of the four directions: east, south, west, and north.
The touch sensor 280D may also be referred to as a "touch panel". The touch sensor 280D may be disposed on the display screen 294, and the touch sensor 280D and the display screen 294 form a touch screen, which is also called a "touch screen". The touch sensor 280D is used to detect a touch operation applied thereto or nearby. The touch sensor 280D can communicate the detected touch operation to the application processor to determine the touch event type. The electronic device may provide visual output related to touch operations, etc. through the display screen 294. In other embodiments, the touch sensor 280D can be disposed on a surface of the electronic device at a different location than the display screen 294.
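As a rough, assumed illustration of how an application processor might turn the raw data reported by a touch sensor into a touch event type (the thresholds below are invented for the sketch and do not come from this application):

```python
# Invented thresholds; real systems tune these per device.
LONG_PRESS_MS = 500
MOVE_TOLERANCE_PX = 10

def classify_touch_event(down_ms, up_ms, moved_px):
    """Classify one finger-down..finger-up sequence into a coarse event type
    from its duration and total movement, roughly as the application
    processor might do with the touch operation communicated by the
    touch sensor."""
    if moved_px > MOVE_TOLERANCE_PX:
        return "swipe"          # the contact travelled too far to be a press
    duration = up_ms - down_ms
    return "long_press" if duration >= LONG_PRESS_MS else "tap"
```

For example, a quick stationary contact classifies as a tap, a held contact as a long press, and a travelling contact as a swipe; the resulting event type is then what the system uses to drive visual output on the display screen 294.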
The fingerprint sensor 280E is used to capture a fingerprint. The electronic equipment can utilize the collected fingerprint characteristics to realize fingerprint unlocking, application lock access, fingerprint photographing, incoming call answering and the like.
The pressure sensor 280F is used to sense a pressure signal and convert the pressure signal into an electrical signal. For example, the pressure sensor 280F may be disposed on the display screen 294. Touch operations that act on the same touch position but with different touch operation intensities may correspond to different operation instructions.
The electronic device may implement audio functions through the audio module 270, the speaker 270A, the receiver 270B, the microphone 270C, the application processor, and the like. Such as music playing, recording, etc. As to the specific operation principle and action of the audio module 270, the speaker 270A, the receiver 270B and the microphone 270C, reference may be made to the description in the conventional art.
The keys 290 include a power-on key, a volume key, etc. The keys 290 may be mechanical keys. Or may be touch keys. The electronic device may receive a key input, and generate a key signal input related to user settings and function control of the electronic device.
The motor 291 may generate a vibration cue. The motor 291 can be used for both incoming call vibration prompting and touch vibration feedback. For example, touch operations applied to different applications (e.g., photographing, audio playing, etc.) may correspond to different vibration feedback effects. The motor 291 may also respond to different vibration feedback effects for touch operations on different areas of the display 294. Different application scenes (such as time reminding, receiving information, alarm clock, game and the like) can also correspond to different vibration feedback effects. The touch vibration feedback effect may also support customization.
Indicator 292 may be an indicator light that may be used to indicate a state of charge, a change in charge, or may be used to indicate a message, missed call, notification, etc.
The SIM card interface 295 is used to connect a SIM card. A SIM card can be attached to or detached from the electronic device by being inserted into or pulled out of the SIM card interface 295. The electronic device can support 1 or N SIM card interfaces, where N is a positive integer greater than 1. The SIM card interface 295 may support a Nano SIM card, a Micro SIM card, a standard SIM card, and the like. Multiple cards can be inserted into the same SIM card interface 295 at the same time; the types of the cards may be the same or different. The SIM card interface 295 may also be compatible with different types of SIM cards, as well as with external memory cards. The electronic device implements functions such as calls and data communication through the interaction between the SIM card and the network. In some embodiments, the electronic device employs an eSIM, i.e., an embedded SIM card. The eSIM card can be embedded in the electronic device and cannot be separated from it.
It should be noted that the hardware modules included in the electronic device shown in fig. 2 are only described by way of example, and do not limit the specific structure of the electronic device. For example, the electronic device may also include other functional modules.
In the present application, the operating system of the electronic device may include, but is not limited to, Symbian, Android, iOS, Blackberry, HarmonyOS (Hongmeng), and other operating systems; the present application is not limited thereto.
Referring to fig. 3, fig. 3 specifically describes a software structure diagram of an electronic device in an embodiment of the present application, taking an Android operating system as an example.
As shown in fig. 3, the Android operating system may include an application layer, an application framework layer (FWK), a system library, an Android runtime, and a kernel layer (kernel).
The application layer may provide some core applications. For convenience of description, an application program is simply referred to as an application hereinafter. The applications in the application layer may include native applications (i.e., applications installed in the electronic device when the operating system was installed before the device left the factory), such as the camera, map, music, short message, gallery, address book, and Bluetooth applications shown in fig. 3. The applications in the application layer may also include third-party applications (i.e., applications downloaded and installed by the user through an application store), such as the Douyin, email, Toutiao, and video applications shown in fig. 3.
The application framework layer provides an application programming interface (API) and a programming framework for the applications in the application layer, and includes a number of predefined functions. As shown in fig. 3, the application framework layer may include a window manager service (WMS), an activity manager service (AMS), and an input event manager service (IMS). In some embodiments, the application framework layer may also include a content provider, a view system, a telephony manager, a resource manager, a notification manager, and the like (not shown in fig. 3).
The window manager service is used to manage window programs. The window manager service can obtain the size of the display screen, determine whether there is a status bar, lock the screen, capture the screen, and the like.
The activity manager service (AMS) is responsible for managing activities, that is, for starting, switching, and scheduling the components in the system, and for managing and scheduling applications.
The input event manager service (IMS) may be used to translate and encapsulate a raw input event to obtain an input event containing more information, and send the input event to the window manager service. The window manager service stores the clickable areas (such as controls) of each application, the position information of the focus window, and the like, and can therefore correctly dispatch the input event to the designated control or focus window.
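The dispatch path described above, in which a raw event is enriched and then routed using the window manager's stored hit regions, can be modeled with a minimal sketch. This is an illustrative model only, not the actual framework implementation; all class and field names here are assumptions.

```python
from dataclasses import dataclass

@dataclass
class RawInputEvent:
    x: int          # touch coordinates reported by the input driver
    y: int
    action: str     # e.g. "down", "up"

@dataclass
class InputEvent:
    x: int
    y: int
    action: str
    target: str     # window/control resolved by the window manager

class WindowManagerService:
    """Stores each window's clickable region, as the text describes."""
    def __init__(self):
        self.windows = []   # list of (name, (left, top, right, bottom))

    def add_window(self, name, rect):
        self.windows.append((name, rect))

    def dispatch(self, event):
        # Route the event to the topmost window whose region contains the point.
        for name, (l, t, r, b) in reversed(self.windows):
            if l <= event.x < r and t <= event.y < b:
                return InputEvent(event.x, event.y, event.action, name)
        return None   # no window claims this point

class InputManagerService:
    """Translates/encapsulates the raw event, then hands it to the WMS."""
    def __init__(self, wms):
        self.wms = wms

    def on_raw_event(self, raw):
        return self.wms.dispatch(raw)

wms = WindowManagerService()
wms.add_window("video_play_box", (0, 0, 1080, 600))
wms.add_window("comment_area", (0, 600, 1080, 1920))
ims = InputManagerService(wms)
evt = ims.on_raw_event(RawInputEvent(x=500, y=800, action="down"))
print(evt.target)  # -> comment_area
```

In the real framework the enriched event carries far more state (timestamps, pointer IDs, pressure); the sketch keeps only what is needed to show the routing decision.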
The content provider is used to store and retrieve data and make it accessible to applications. The data may include video, images, audio, calls made and received, browsing history and bookmarks, phone books, etc.
The view system includes visual controls such as controls to display text, controls to display pictures, and the like. The view system may be used to build applications. The display interface may be composed of one or more views. For example, the display interface including the short message notification icon may include a view for displaying text and a view for displaying pictures.
The phone manager is used to provide communication functions of the electronic device. Such as management of call status (including on, off, etc.).
The resource manager provides various resources for the application, such as localized strings, icons, pictures, layout files, video files, and the like.
The notification manager enables an application to display notification information in the status bar. It can be used to convey notification-type messages that disappear automatically after a short stay without requiring user interaction, for example, to announce that a download is complete or to provide a message alert. The notification manager may also present notifications in the form of a chart or scroll-bar text in the status bar at the top of the system (such as notifications of applications running in the background), or in the form of a dialog window on the screen. For example, it may prompt text information in the status bar, play a prompt tone, vibrate the electronic device, or flash an indicator light.
The system library and the Android runtime contain the functions that the FWK needs to call, the Android core libraries, and the Android virtual machine. The system library may include a plurality of functional modules, for example: a surface manager, media libraries, a three-dimensional graphics processing library (e.g., OpenGL ES), a 2D graphics engine (e.g., SGL), a browser kernel, and a font library.
The surface manager is used to manage the display subsystem and provide fusion of 2D and 3D layers for multiple applications.
The media libraries support playback and recording of a variety of commonly used audio and video formats, as well as still image files. The media libraries may support a variety of audio and video encoding formats, such as MPEG-4, H.264, MP3, AAC, AMR, JPG, and PNG.
The three-dimensional graphic processing library is used for realizing three-dimensional graphic drawing, image rendering, synthesis, layer processing and the like.
The 2D graphics engine is a drawing engine for 2D drawing.
The kernel layer is the basis of the Android operating system, and the final functions of the Android operating system are completed through the kernel layer. The kernel layer may contain a display driver, input/output device drivers (e.g., for the keyboard, touch screen, headphones, speakers, and microphones), device nodes, a Bluetooth driver, a camera driver, an audio driver, a sensor driver, and the like. When the user performs an input operation through an input device, the kernel layer generates a corresponding raw input event from the input operation and stores it in a device node.
As described above, a conventional multi-device display can implement a mirror display, a transfer display, or a tiled extended display. For example, based on the Sidecar function of macOS, the display screen of an iPad can be used as an extended screen of a Mac to implement a mirror display (the interface displayed on the macOS device is mirrored on the iPad), a transfer display (the interface displayed on the macOS device is transferred to the iPad), or a tiled extended display (the interface displayed on the macOS device is extended by tiling onto the iPad). Briefly, the Sidecar function allows the iPad to serve as an external display for the Mac. To implement the Sidecar function, the iPad and the Mac need to be connected wirelessly in the same network environment or connected through a USB-C data cable.
However, mirror display, transfer display, and tiled extended display do not involve the distributed display of different contents of one interface. For example, the extended display can tile one interface across multiple devices, but still cannot implement distributed collaborative display and collaborative operation of different contents of the same interface on different devices. In some scenarios, distributed collaborative display and collaborative operation of different contents of one interface can bring great convenience to users and is very important for improving user experience.
For example, referring to fig. 4, fig. 4 shows example diagrams of two application scenarios according to an embodiment of the present application. As shown in fig. 4 (a), in an electronic classroom, the live video of the teacher, the electronic lecture, and the class communication group interface are displayed simultaneously on the electronic device. In some scenarios, as shown in fig. 4 (a), a memo of the student may also be displayed on the electronic device. During the live broadcast of the electronic classroom, students can refer to the content of the electronic lecture while watching the teacher's live video. In some scenarios, students can record classroom notes, such as knowledge points and questions, in the memo. In some scenarios, students can view messages from the teacher or other students, such as notifications and questions, on the class communication group interface.
On a multitasking interface such as that shown in fig. 4 (a), a teacher needs to switch among multiple task interfaces on the same electronic device to complete live video, presentation or annotation of the electronic lecture, group message publishing, and the like. Students likewise need to switch among multiple task interfaces on the same electronic device to watch the live video, view or annotate the electronic lecture, view or publish group messages, record classroom notes, and so on. Because there are so many task interfaces, some of them (such as the group message interface and the memo interface) cannot be fully displayed on one display screen. Moreover, switching among these task interfaces is cumbersome for the user (such as a teacher or a student).
In addition, to display the task interfaces clearly and make them easy to view, they are usually displayed on a large-screen device (such as a computer). However, because a user (e.g., a teacher or a student) needs to operate the large-screen device at any time, the user must stay close to it. Staying close to a large-screen device for a long time can affect the user's eyesight.
As shown in fig. 4 (b), a memo interface and a short message interface are displayed simultaneously on the electronic device in a split-screen manner, so the user can process memo-related tasks and short-message-related tasks at the same time. However, on a small-screen device such as a smartphone, the display areas of the memo interface and the short message interface on the multitasking interface shown in fig. 4 (b) may be small. The screen area available for processing the memo-related and short-message-related tasks is therefore also small, which can greatly reduce the user's efficiency in handling these tasks and can affect the user's eyesight.
As shown in fig. 5 (a), a video playing interface is displayed on the electronic device. The video playing interface includes a video playing box, a video collection area, a shortcut processing area (for making comments, collecting, downloading, sharing, or the like), a more-videos area, and the like. The user can watch the video in the video playing box and perform other operations related to the video playing interface, such as making comments and browsing more videos. However, on a small-screen device such as a smartphone, with a multitasking interface like that shown in fig. 5 (a), the user cannot maximize the video playing box while making a comment or browsing more videos, which greatly affects the experience of watching the video.
As shown in fig. 5 (b), a Douyin short-video interface is displayed on the electronic device. The short-video interface includes a video playing box and a comment area. As shown in fig. 5 (b), when the user posts a comment in the comment area, the Douyin application invokes the input keyboard, which is displayed floating on, or superimposed over, the short-video interface. The user can use the electronic device to enter comments via the input keyboard while watching the short video and browsing the comments. However, the floating or superimposed input keyboard may obscure most of the comment area, which affects the user's experience of browsing the comments.
As shown in fig. 6 (a), a picture list is displayed on the electronic device, and the picture list includes thumbnails of a plurality of pictures. The user can select a picture in the picture list to enter the details interface of the selected picture and browse the picture details. However, this way of viewing picture details interfaces is cumbersome: the user needs to select pictures one by one, view a picture's details interface, return to the picture list, select another picture to enter its details interface, and so on. It also reduces the efficiency with which the user can browse the details interfaces of the pictures.
As shown in fig. 6 (b), an email list is displayed on the electronic device, including an abbreviated list of multiple emails (unread and/or read). The user can select an email in the list to enter the details interface of the selected email and browse the email details. However, this way of viewing email details interfaces is cumbersome: the user needs to select emails one by one, process an email in its details interface, return to the email list, select another email to enter its details interface, and so on. It also reduces the efficiency with which the user can process emails in their details interfaces.
In view of the above, the present application provides a distributed display method of an interface, which is specifically used for displaying different contents of one interface displayed on a first electronic device on a plurality of different electronic devices (such as the first electronic device and a second electronic device) in a distributed and collaborative manner. In this way, the user can process different tasks on a plurality of electronic devices, which facilitates user operations.
For example, the one interface is a first interface. The first interface may be an interface of any application. For example, the first interface may be an interface of an application installed in the first electronic device when the operating system was installed before the device left the factory (i.e., a native application), such as an interface of the short message, address book, or gallery application shown in fig. 3. As another example, the first interface may be an interface of an application downloaded and installed by the user through an application store (i.e., a third-party application), such as the Douyin, email, Toutiao, or video application shown in fig. 3. The present application does not limit the specific application corresponding to the first interface or the specific content displayed on the interface. In some embodiments, in response to an operation used by the user to trigger distributed collaborative display (e.g., a first operation), the first electronic device may display part of the content of the first interface on the second electronic device in a distributed and collaborative manner. For example, the first electronic device displays the video collection area, the shortcut processing area (for making comments, collecting, downloading, sharing, or the like), and the more-videos area shown in fig. 5 (a), while the second electronic device displays the video playing box shown in fig. 5 (a). As another example, the first electronic device displays the video playing box and the comment area shown in fig. 5 (b), while the second electronic device displays the input keyboard shown in fig. 5 (b).
Further, in some embodiments, the first interface may include a plurality of task interfaces. In some embodiments, in response to an operation used by the user to trigger distributed collaborative display (e.g., the first operation), the first electronic device may display the plurality of task interfaces on the second electronic device in a distributed and collaborative manner. For example, the first electronic device displays a first interface of a first application, and the first interface includes at least a first functional area and a second functional area. The content of the first functional area is a first task interface, and the content of the second functional area is a second task interface. The first electronic device may send the first task interface of the first functional area to the second electronic device for display. After sending the first task interface of the first functional area to the second electronic device for display, the first electronic device displays a second interface of the first application, where the second interface includes the second functional area but does not include the first functional area. For example, the first electronic device displays the live video shown in fig. 4 (a), while the second electronic device displays the electronic lecture, the class communication group interface, and the memo shown in fig. 4 (a). As another example, the first electronic device displays the memo interface shown in fig. 4 (b), while the second electronic device displays the short message interface shown in fig. 4 (b).
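The transfer step described in the preceding paragraph, where the first device hands the first functional area's task interface to the second device and then shows a second interface without that area, can be modeled minimally as follows. The names and data structures are illustrative assumptions, not taken from the patent.

```python
from dataclasses import dataclass, field

@dataclass
class FunctionalArea:
    name: str
    task_interface: str   # content of the area, e.g. "live video"

@dataclass
class Device:
    name: str
    areas: list = field(default_factory=list)

def transfer_area(first_device, second_device, area_name):
    """Send one functional area's task interface to the second device;
    the first device then displays a 'second interface' without it."""
    kept, moved = [], None
    for area in first_device.areas:
        if area.name == area_name and moved is None:
            moved = area
        else:
            kept.append(area)
    if moved is None:
        raise ValueError(f"no functional area named {area_name!r}")
    first_device.areas = kept            # second interface: area removed
    second_device.areas.append(moved)    # now displayed on the second device

phone = Device("first", [FunctionalArea("first_area", "live video"),
                         FunctionalArea("second_area", "electronic lecture")])
tablet = Device("second")
transfer_area(phone, tablet, "first_area")
print([a.name for a in phone.areas])             # ['second_area']
print([a.task_interface for a in tablet.areas])  # ['live video']
```

The sketch deliberately keeps the area object intact when it moves, mirroring the idea that the task interface itself, not a copy of its pixels, is what the second device takes over.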
It should be noted that the first electronic device and the second electronic device in the present application may be any type or structure of electronic devices including a display screen. For example, the first electronic device and the second electronic device may be a smart phone, a netbook, a tablet, a smart camera, a palm top computer, a PDA, a PMP, an AR/VR device, a notebook, a PC, a UMPC, etc. as described above, which is not limited in this application.
In addition, in the present application, the first electronic device and the second electronic device may be the same type of electronic device. For example, the first electronic device is a smartphone 1, and the second electronic device is a smartphone 2. The first electronic device and the second electronic device may also be different types of electronic devices. For example, the first electronic device is a smart phone, and the second electronic device is a tablet computer.
In addition, in the present application, the number n of electronic devices of the distributed collaborative display is greater than or equal to 2, where n is an integer. For example, n is 2, which means that different contents on one interface displayed on the first electronic device are distributively and cooperatively displayed on 2 electronic devices, namely, the first electronic device and one second electronic device. For another example, if n is 3, the different content on one interface displayed on the first electronic device is distributedly and cooperatively displayed on the first electronic device and two second electronic devices, which are 3 electronic devices in total.
The following describes a distributed display method of an interface provided in an embodiment of the present application in detail with reference to the accompanying drawings.
In this embodiment, the first interface may include a plurality of functional areas (e.g., a first functional area and a second functional area), where the contents of different functional areas are used to implement different functions. The first electronic device may also be referred to as the master device. The master device may transfer one or more of the plurality of task interfaces displayed on it to one or more other devices, for example, by transferring the task interfaces of one or more functional areas to the second electronic device for display. The counterpart of the master device is the slave device (i.e., the second electronic device).
As shown in fig. 5 (a), the video playing interface (i.e., the first interface) includes a functional area for playing a video, namely the video playing box, and also includes functional areas for commenting, collecting, downloading, or sharing, namely the video collection area and the shortcut processing area. As another example, as shown in fig. 5 (b), the Douyin short-video interface (i.e., the first interface) includes a functional area for playing a video, namely the video playing box, and also includes a functional area for making comments, namely the comment area. When the user posts a comment in the comment area, the short-video interface (i.e., the first interface) also includes a functional area for entering the comment, namely the input keyboard.
As another example, a plurality of application interfaces or a plurality of applet interfaces may be included on the first interface, as shown in fig. 4 (a) or fig. 4 (b). The plurality of application interfaces may be composed of task interfaces of a plurality of applications, or may be composed of different task interfaces of the same application. The plurality of applet interfaces may be composed of task interfaces of a plurality of applets, or may be composed of different task interfaces of the same applet. The application does not limit the specific source of the plurality of task interfaces displayed on the first electronic device.
In some embodiments, the plurality of task interfaces displayed on the first electronic device may be different application interfaces. For example, the plurality of task interfaces may be any plurality of application interfaces installed in the first electronic device.
In other embodiments, the plurality of task interfaces displayed on the first electronic device may be interfaces of the same application. It can be understood that the same application may include different application interfaces at different levels, where different levels may or may not have an upper-lower hierarchical relationship. In response to a user operation on an interface, that interface may invoke its next-level interface. Therefore, the plurality of task interfaces displayed on the first electronic device may be interfaces of the same application at the same or different levels. For example, two application interfaces are displayed on the first electronic device: one is the friend-circle (Moments) interface of WeChat, and the other is an official-accounts interface.
In other embodiments, the plurality of task interfaces displayed on the first electronic device may be applet interfaces, for example, the interfaces of applets running in a host application (such as WeChat).
The plurality of task interfaces in the embodiment of the present application may be displayed in the same window, or may be displayed in a plurality of windows, respectively, and the present application is not limited.
In some embodiments, multiple task interfaces in embodiments of the present application may be displayed in one window. For example, if the multiple task interfaces in the embodiment of the present application are combinations of multiple applet interfaces, multiple task interfaces may be displayed in one window. As shown in fig. 4 (a), a live video interface, an electronic lecture interface, a class communication group interface, and a memo interface are displayed in one application (an electronic classroom application as shown in fig. 4 (a)) window.
In other embodiments, a plurality of task interfaces in the embodiments of the present application may be displayed in a plurality of windows, respectively. For example, if the plurality of task interfaces in the embodiment of the present application include a plurality of application interfaces, the plurality of task interfaces may be displayed in a plurality of windows. As shown in (b) of fig. 4, a memo interface is displayed in the memo application window, and a short message interface is displayed in the short message application window.
In some embodiments, the plurality of task interfaces of the plurality of functional areas (e.g., the first functional area and the second functional area) on the first interface may be laid out on the display screen of the first electronic device in a preset relative positional relationship. For example, a plurality of frame templates are preset in the first electronic device for specifying the number of functional areas, the position of each functional area, and the shape and size of each functional area; or for specifying the number of task interfaces that can be displayed, the display position of each task interface, and the shape and size of each task interface. The present application is not limited in this respect. The plurality of task interfaces may be arranged on the display screen of the first electronic device according to a suitable frame template.
For example, in this embodiment of the application, the first electronic device may render the first interface onto a virtual screen of the first electronic device in the form of one or more atomic capabilities (AA) (also called atomic services) according to a preset frame template, and then send the rendered virtual screen to the display screen of the first electronic device. Taking a first interface that includes a first functional area and a second functional area as an example, the first electronic device may render first content in the form of one atomic service in the first functional area of the virtual screen, render second content in the form of another atomic service in the second functional area of the virtual screen, and send the first content and the second content, rendered together on the virtual screen according to the preset frame template, to the display screen of the first electronic device.
It can be understood that, in the embodiments of the present application, the business logic of an application can be decoupled and split into atomic capabilities that can run independently on any device. Based on the different atomic capabilities, migration across devices can be completed on a distributed infrastructure. Each atomic capability can implement a program function, and can expose an interface to developers to be called directly by the system or by other systems. In addition, different atomic capabilities can be flexibly assembled to form an interface of an application, such as the first interface in the embodiments of the present application. In some embodiments, the first electronic device may determine which frame template to use based on the number and functions of the functional areas, and so on. For example, if the first interface includes 2 functional areas, the 2 functional areas are displayed using a frame template with a top-bottom or left-right split-screen layout. As another example, if the first interface includes 5 functional areas, of which 1 implements a core function and the remaining 4 implement non-core functions, the 5 functional areas are displayed using a frame template with 1 large functional area and 4 small functional areas.
In other embodiments, the first electronic device may determine which frame template to use based on the number and attributes of the task interfaces, and so on. For example, if there are 2 task interfaces, they are displayed using a frame template with a top-bottom or left-right split-screen layout, as shown in fig. 4 (b). As another example, if there are 5 task interfaces, of which 1 is a core task interface and the remaining 4 are non-core task interfaces, the 5 task interfaces are displayed using the frame template with 1 large interface and 4 small interfaces shown in fig. 4 (a).
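The template-selection rules in the last two paragraphs (2 areas map to a split-screen layout; 1 core plus 4 non-core areas map to a "1 large + 4 small" layout) amount to a simple lookup. The sketch below illustrates that rule set; the function name, the fallback rule, and the template identifiers are assumptions for illustration, not part of the patent.

```python
def choose_frame_template(num_areas, num_core=0):
    """Pick a frame template from the number of functional areas (or task
    interfaces) and how many of them are 'core', as the text describes."""
    if num_areas == 2:
        return "split_screen"          # top-bottom or left-right halves
    if num_areas == 5 and num_core == 1:
        return "1_large_4_small"       # one large core area, four small ones
    return "grid"                      # assumed fallback for other counts

print(choose_frame_template(2))              # split_screen
print(choose_frame_template(5, num_core=1))  # 1_large_4_small
```

A real implementation would presumably also weigh each area's shape, size, and display position, which the preset templates are said to specify.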
In the embodiment of the application, when the first electronic device displays the first interface, the first electronic device can transfer part of the content of the first interface to the second electronic device in response to a first operation of the user, for example, by sending one or more task interfaces of the first functional area to the second electronic device for display. The first functional area is any one of the functional areas on the first interface of the first electronic device, for example, a functional area selected by the user or a functional area determined by the first electronic device.
In this embodiment of the application, the transferring, by the first electronic device, of part of the content on the first interface to the second electronic device may specifically include: the first electronic device sends a standard video stream corresponding to part of the content on the first interface to the second electronic device for display. For example, the first electronic device sends a standard video stream corresponding to the contents of one or more functional areas to the second electronic device for display. The contents of different functional areas correspond to different task interfaces.
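A minimal sketch of this transfer step follows: the content of each selected functional area is wrapped, together with its area identifier, into a message carrying the corresponding video stream. The data model is an assumption, and the encode step is simulated here; a real implementation would produce, e.g., an H.264 elementary stream.

```python
from dataclasses import dataclass


@dataclass
class FunctionalArea:
    area_id: str
    frames: list  # rendered frames of the area's task interface


def encode(frames):
    # Placeholder for a standard video encoder (e.g., H.264).
    return b"".join(frames)


def build_transfer_messages(areas):
    """One message per functional area; different areas map to
    different task interfaces, hence different streams."""
    return [{"area_id": a.area_id, "stream": encode(a.frames)} for a in areas]
```

The second electronic device would then decode and display each received stream in the corresponding region of its own frame template.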
In some embodiments, the first operation may include, but is not limited to: a touch operation of the first electronic device with the second electronic device (e.g., an operation of the user touching (i.e., "bumping") the first electronic device against the second electronic device), an operation of the user "shaking" the first electronic device and the second electronic device simultaneously, an operation of the user "shaking" the first electronic device, a preset gesture operation of the user on the display screen of the first electronic device, a preset operation of the user on a physical key of the first electronic device (such as a preset operation on the power key, the volume up ("+") key, or the volume down ("-") key), a selection operation of the user (including a selection operation on one or more options), and the like. Alternatively, the first operation may also be another preset operation for triggering the distributed display of the interface, and the present application does not limit the specific operation form.
Referring to fig. 7, fig. 7 is a diagram illustrating an example of the first operation provided by an embodiment of the present application. Assume that the first interface shown in (a) of fig. 5 is displayed on the smartphone 701 (i.e., the first electronic device). As shown in fig. 7, the first operation is an operation 703 in which the user, holding the smartphone 702, "bumps" it against the smartphone 701.
It should be noted that fig. 7 only takes a "bump" between one corner of the smartphone 701 and one corner of the smartphone 702 as an example. In the embodiment of the present application, the "bump" may occur between any portion of the first electronic device and any portion of the second electronic device.
In this embodiment of the application, an electronic device (for example, the smartphone 701 or the smartphone 702 shown in fig. 7) may acquire its motion direction, motion acceleration, and motion speed in real time through an acceleration sensor and/or a gravity sensor, so as to determine whether a "bump" operation has occurred. For example, if the electronic device suddenly stops while moving at a certain speed, the electronic device presumes that it may have bumped against another electronic device.
In other embodiments, the electronic device may acquire, in real time, its rotation direction, rotation angular velocity, rotation angle, and the like through a gyroscope sensor, so as to determine whether a "bump" operation has occurred. For example, if the electronic device suddenly stops while rotating at a certain angular velocity, the electronic device presumes that it may have bumped against another electronic device.
In other embodiments, the electronic device may determine whether a "bump" operation has occurred by analyzing audio data collected by the microphone. For example, if the audio data received by the microphone matches a certain pitch, a certain loudness, and a characteristic impact sound, the electronic device presumes that it may have bumped against another electronic device. As for the basis on which the electronic device determines whether a bump operation with another electronic device has occurred, reference may be made to the conventional technology, which is not enumerated here.
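The sudden-stop heuristic described in the paragraphs above can be sketched as follows. The threshold values and the sampling model are assumptions for illustration, not values from this application.

```python
def detect_bump(speed_samples, moving_threshold=0.5, stop_threshold=0.05):
    """Presume a "bump" when the device, moving above moving_threshold,
    suddenly drops below stop_threshold between consecutive samples.

    speed_samples: chronological speed readings (m/s) derived from the
    acceleration/gravity sensor pipeline.
    """
    for prev, curr in zip(speed_samples, speed_samples[1:]):
        if prev > moving_threshold and curr < stop_threshold:
            return True  # sudden stop while moving -> possible bump
    return False
```

The same shape of check could be applied to gyroscope angular-velocity samples for the rotation-based variant described above.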
As another example, fig. 8 illustrates three example diagrams of the first operation provided by the embodiment of the present application. Assume that the first interface is displayed on the smartphone 701 (i.e., the first electronic device). As shown in fig. 8 (a), the first operation is a preset gesture operation 801 by the user on the touch screen (i.e., the display screen) of the smartphone 701. As shown in (b) in fig. 8, the first operation is an operation 802 in which the user presses the power key and the volume up ("+") key of the smartphone 701 at the same time. As shown in (c) of fig. 8, the first operation is a click operation 803 (including a single click, a double click, a long press, and the like) on a virtual button on the first interface by the user.
In fig. 8, (a) takes a "one-line" preset gesture operation at the middle position of the touch screen of the smartphone 701 as an example. The embodiment of the present application does not limit the specific trajectory of the preset gesture; for example, the preset gesture may also be of any shape, such as an "O" shape or a "+" shape. The embodiment of the present application also does not limit the specific form of the gesture; for example, the gesture may also be a multi-finger (e.g., three-finger) sliding gesture. Further, the present application does not limit the input position of the preset gesture; for example, the preset gesture may also be the user sliding inward from the edge of the touch screen of the smartphone 701. In addition, (b) in fig. 8 takes as an example a layout in which the volume up ("+") key and the volume down ("-") key are located on the long side on the right of the screen of the smartphone 701 and the power key is located on the short side above the screen, showing a first operation in which the user presses the power key and the volume up ("+") key of the smartphone 701 simultaneously. The embodiment of the present application does not limit the specific positions of the physical keys on the electronic device. In addition, (c) in fig. 8 takes the virtual button displayed at the lower left corner of the touch screen of the smartphone 701 as an example; in this embodiment, the virtual button may also be displayed at any other position of the touch screen, which is not limited in this application.
As another example, taking the electronic classroom multitasking scenario illustrated in (a) in fig. 4 as an example, fig. 9 illustrates another three example diagrams of the first operation provided in the embodiment of the present application.
As shown in fig. 9 (a), a virtual key 901 for triggering a multitasking distributed display is displayed on the display screen of the first electronic device. In response to the operation of clicking the virtual key 901 by the user, the first electronic device triggers the multitask distributed display and transfers one or more of the plurality of task interfaces displayed on the display screen to the second electronic device.
As shown in fig. 9 (b), a virtual key 901 for triggering the multitasking distributed display is displayed on the display screen of the first electronic device. In response to an operation of clicking the virtual key 901 by the user, the first electronic device displays an option box 902. As shown in fig. 9 (b), a second electronic device list is included in the option box 902 for the user to select second electronic devices to be displayed in a distributed manner with the first electronic device, such as "my smartphone", "my tablet", and "my television" shown in fig. 9 (b). In response to a user selecting a second electronic device (e.g., "my tablet") from the list of second electronic devices, the first electronic device triggers a multitask distributed display to transfer one or more of the plurality of task interfaces displayed on the display screen to the user-selected second electronic device (e.g., "my tablet").
As shown in fig. 9 (c), an option box 902 for triggering the multitasking distributed display is displayed on the display screen of the first electronic device. A second electronic device list is included in the option box 902 for the user to select the second electronic device to be displayed in a distributed manner with the first electronic device, such as "my smartphone", "my tablet", and "my television" shown in fig. 9 (c). In response to the user selecting a second electronic device (e.g., "my tablet") from the second electronic device list, the first electronic device triggers the multitask distributed display to transfer one or more of the plurality of task interfaces displayed on the display screen to the second electronic device (e.g., "my tablet") selected by the user.
In some examples, as shown in fig. 9 (b) and fig. 9 (c), options of a multi-device display mode, such as a mirror display, a transfer display, or a distributed collaborative display, may also be displayed in the option box 902. In response to the user selecting the second electronic device (e.g., "my tablet") from the second electronic device list, selecting the multitask collaborative display, and clicking the confirm option, the first electronic device triggers the multitask distributed collaborative display and transfers one or more of the plurality of task interfaces displayed on the display screen to the second electronic device (e.g., "my tablet") selected by the user.
In other examples, as shown in fig. 9 (b) and fig. 9 (c), a device option for audio output may also be displayed in the option box 902. This option is used for the user to select the device that outputs audio. For example, in response to the user selecting the second electronic device (e.g., "my tablet") from the second electronic device list, selecting the multitask collaborative display, selecting output of audio through "my tablet", and then clicking the confirm option, the first electronic device triggers the multitask distributed collaborative display and transfers one or more of the plurality of task interfaces displayed on the display screen to the second electronic device (e.g., "my tablet") selected by the user. Here, "my tablet" is responsible for outputting audio.
It should be noted that fig. 9 (a), fig. 9 (b), and fig. 9 (c) only illustrate several possible first operations in the embodiment of the present application, taking the first operation in the form of clicking a virtual key or selecting an option as an example. The present application does not limit the specific icon form or display position of the virtual key, nor the specific display form, display position, or specific content of the options. For example, the option box 902 may also display a device option for the camera that captures the user's avatar, or for the microphone that captures the user's voice, and the like.
In addition, fig. 9 (a), fig. 9 (b), and fig. 9 (c) take the electronic classroom multitasking scenario shown in (a) in fig. 4 as an example; the first operation based on a virtual key or an option shown in these figures is applicable to any multitasking scenario, which the embodiments of the present application do not enumerate.
In some embodiments, the virtual keys or options may always be displayed on the first electronic device in a floating manner. As shown in fig. 9 (a) and 9 (b), the virtual key 901 is displayed on the first electronic device interface in a floating manner.
In other embodiments, the virtual key or the option may be hidden on the first electronic device. For example, in response to the user clicking the retract key 903 illustrated in (b) in fig. 9 or (c) in fig. 9, the first electronic device hides the option box 902 on the interface.
In the present application, the second electronic device for distributed cooperative display with the first electronic device may be determined by the first electronic device according to the first operation of the user, or may be determined by the first electronic device itself in response to the first operation of the user, which is not limited in the present application.
For the case that the first electronic device determines the second electronic device according to the first operation of the user, for example, assuming that the first operation is an operation of "hitting" the second electronic device with the first electronic device held by the user, the first electronic device may determine the second electronic device by detecting a device whose distance from the first electronic device is smaller than a preset threshold (e.g., 5 cm). For another example, assuming that the first operation is an operation of the user "shaking" the first electronic device and the second electronic device at the same time, the first electronic device may determine the second electronic device by detecting a device that is "shaking" near the first electronic device.
As another example, assuming that the first operation is a selection operation (including a selection operation on one or more options) by the user on an option displayed on the display screen of the first electronic device, the second electronic device may also be determined by the first electronic device according to that selection operation. For example, for the example shown in (b) in fig. 9 or (c) in fig. 9, the second electronic device is determined by the first electronic device according to the option selected by the user.
For the case that the first electronic device determines the second electronic device by itself in response to the first operation of the user, for example, assuming that the first operation is an operation of "shaking" the first electronic device by the user, a preset gesture operation on the first electronic device by the user, a preset operation on a physical key of the first electronic device by the user, or an operation on a virtual key of the first electronic device by the user, the first electronic device may obtain, in response to the first operation of the user, electronic device information that displays interface content in a distributed manner with the first electronic device within a preset time period, so as to determine the second electronic device according to the electronic device information. For example, the second electronic device may be an electronic device that distributively displays the interface content with the first electronic device the most number of times within a preset time period. For another example, the second electronic device may be the electronic device that was most recently distributed with the first electronic device to display interface content. The preset time period may include the last three days, the last week, the last month or the last three months, and the application is not limited.
As another example, for a case where the first operation is a selection operation (including a selection operation of one or more options) by the user on an option displayed on the display screen of the first electronic device as shown in (a) in fig. 9, the second electronic device may be determined by the first electronic device according to the electronic device information of the multitasking interface displayed in distributed cooperation with the first electronic device within the preset time period. For example, the second electronic device may be an electronic device that displays the multitasking interface the most times in a distributed cooperation with the first electronic device within a preset time period. For another example, the second electronic device may be the electronic device that most recently displayed the multi-tasking interface in distributed cooperation with the first electronic device. The preset time period may include the last three days, the last week, the last month or the last three months, and the application is not limited.
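The history-based selection of the second electronic device described in the paragraphs above ("most times within a preset period, or most recently") can be sketched as follows. The record format and the tie-breaking rule are assumptions for illustration.

```python
def pick_second_device(history, now, window_seconds):
    """Pick the device that collaborated most often within the window;
    break ties by the most recent collaboration.

    history: list of (device_id, timestamp) collaboration records.
    Returns a device_id, or None if no record falls in the window.
    """
    recent = [(d, t) for d, t in history if now - t <= window_seconds]
    if not recent:
        return None
    counts = {}
    for d, _ in recent:
        counts[d] = counts.get(d, 0) + 1
    best = max(counts.values())
    top = [d for d, c in counts.items() if c == best]
    if len(top) == 1:
        return top[0]
    # Tie-break: most recently used among the most frequent devices.
    return max((r for r in recent if r[0] in top), key=lambda r: r[1])[0]
```

The window would correspond to the preset time period mentioned above (e.g., the last three days, week, or month).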
In some embodiments of the present application, it is desirable to first determine the primary device and the secondary device. For different forms of first operation, the primary device and the secondary device may be determined in at least two ways:
mode 1: the primary device and the secondary device are determined according to the user's selection.
In some embodiments, the primary device and the secondary device may be determined according to a selection operation (i.e., a second operation) by a user on an interface (e.g., a second interface) displayed by the electronic device in response to a first operation by the user. Wherein the method is applicable to any of the above-described forms of the first operation.
In some examples, in response to a first operation by the user, the electronic device may display a prompt box for the user to select the primary device and the secondary device.
For example, assuming that the first operation is an operation in which the user, holding the first electronic device, "bumps" it against the second electronic device (operation 703 shown in fig. 7), in response to the first operation the interface displayed by the first electronic device (e.g., the second interface) may be the interface 1001 (i.e., the second interface) shown in (a) of fig. 10. In some examples, a prompt box 1002 is displayed on the interface 1001 for prompting the user with the device model of the device "bumped" against (it may also be a device identifier or a user-defined device name, which is not shown in fig. 10 and is not limited in this application). In other embodiments, as shown in the interface 1001, the prompt box 1002 is further used for the user to select whether to extend the interface content of the local device (i.e., the first electronic device) to the second electronic device or to extend the interface content of the second electronic device to the local device (i.e., the first electronic device).
Based on a prompt box 1002 on an interface 1001 shown in fig. 10 (a), if the user selects an option of "the local device is extended to the second electronic device" (i.e., the second operation), the first electronic device determines that the first electronic device is the primary device and the second electronic device is the secondary device. That is, the first electronic device determines to distributively display a portion of the content on the first interface of the first electronic device to the second electronic device.
If the user selects the option of "the second electronic device is extended to the local device" (i.e. the second operation), the first electronic device determines that the second electronic device is the primary device and the first electronic device is the secondary device. That is, the first electronic device determines that the first electronic device distributively displays a portion of content on the first interface of the second electronic device.
As another example, assuming that the first operation is an operation (operation 703 shown in fig. 7) of "bumping against" the second electronic device while holding the first electronic device by the user, the interface (e.g., the second interface) displayed by the second electronic device may be the interface 1003 (i.e., the second interface) shown in (b) of fig. 10 in response to the first operation by the user. In some examples, a prompt box 1004 is displayed on the interface 1003, and is used to prompt the user of a device model (which may also be a device identifier or a device name defined by the user, and is not shown in fig. 10, but is not limited in this application) of "hit-and-hit" with the device. In other embodiments, as shown in interface 1003, prompt box 1004 is further used for the user to select whether to extend the interface content of the native device (i.e., the second electronic device) to the first electronic device or to extend the interface content of the first electronic device to the native device (i.e., the second electronic device).
Based on a prompt box 1004 on the interface 1003 shown in fig. 10 (b), if the user selects an option of "the local device is expanded to the first electronic device" (i.e., the second operation), the second electronic device determines that the second electronic device is the primary device and the first electronic device is the secondary device. That is, the second electronic device determines to distributively display a portion of the content on the first interface of the second electronic device to the first electronic device.
If the user selects the option of "the first electronic device is extended to the local device" (i.e. the second operation), the second electronic device determines that the first electronic device is the primary device and the second electronic device is the secondary device. That is, the second electronic device determines that the second electronic device distributively displays a portion of the content on the first interface of the first electronic device.
It should be noted that, in some embodiments, assuming that the first operation is an operation in which the user, holding the first electronic device, "bumps" it against the second electronic device (operation 703 shown in fig. 7), prompt boxes may be displayed on both the first electronic device and the second electronic device in response to the first operation. For example, the first electronic device displays the interface 1001 shown in (a) of fig. 10, and the second electronic device displays the interface 1003 shown in (b) of fig. 10. In this case, the user may select, through the prompt box on either the interface 1001 or the interface 1003, whether to extend the interface content of the first electronic device to the second electronic device or to extend the interface content of the second electronic device to the first electronic device. The response received by the device on which the user makes the selection is taken as the final result.
In another example, in response to the first operation by the user, the electronic device may display a draggable icon, and determine the primary device and the secondary device according to the user's drag action in combination with the user's selection operation on a third interface that the electronic device displays in response to the drag action.
For example, in response to a first operation (e.g., operation 703 shown in fig. 7, operation 801 shown in (a) in fig. 8, operation 802 shown in (B) in fig. 8, or operation 803 shown in (c) in fig. 8, etc.) of the first electronic device by the user, the first electronic device displays an interface 1101 (i.e., a second interface) shown in (a) in fig. 11A or (a) in fig. 11B. In which an icon "transmission angle" 1102 is displayed on the interface 1101. The "transmission angle" 1102 is used for the user to determine interface content to be distributively displayed on other devices through a drag action.
Based on the "transmission angle" 1102 of the interface 1101 shown in fig. 11A or fig. 11B, if the first electronic device receives a drag operation of the "transmission angle" 1102 by the user, it may be preliminarily determined that the first electronic device is the master device. The dragging operation of the "transmission angle" 1102 by the user may include, but is not limited to, the user dragging the "transmission angle" 1102 to a content area to be expanded to the distributed display of the other device, or the user dragging the content to be expanded to the distributed display of the other device to the "transmission angle" 1102, which is not limited in the present application.
In some embodiments, when the "transmission angle" 1102 first appears, a usage description about the "transmission angle" 1102 may also be displayed on the interface 1101 (i.e., the second interface), as shown by the guiding box 1103 in (a) in fig. 11A or (a) in fig. 11B.
In some embodiments, the first electronic device may determine that the second electronic device is a secondary device of the first electronic device according to a second operation of the user. The second operation is a selection operation of the user on the prompt box; the prompt box is displayed on the first electronic device in response to the dragging of a draggable icon by the first electronic device, the prompt box is used for the user to select the main device and the auxiliary device, and the draggable icon is displayed on the first electronic device in response to the touch operation by the first electronic device. For example, in response to a user's drag operation on the "transmission angle" 1102, the first electronic device displays an interface 1104 (i.e., a third interface) as shown in (b) of fig. 11A. Wherein, a prompt box 1105 is displayed on the interface 1104. In some embodiments, as shown in fig. 11A (b), a prompt box 1105 is used for a user to select whether to extend the interface content of the native device (i.e., the first electronic device) to the second electronic device or to extend the interface content of the second electronic device to the native device (i.e., the first electronic device).
Based on the prompt box 1105 on the interface 1104 shown in fig. 11A (b), if the user selects the option "the native device is extended to the second electronic device" (i.e., the third operation), it may be further determined that the first electronic device is the primary device and the second electronic device is the secondary device. That is, the first electronic device determines to distributively display a portion of the content on the first interface of the first electronic device to the second electronic device.
If the user selects the option "second electronic device extends to the local device" (i.e. the third operation), it may be further determined that the second electronic device is the primary device and the first electronic device is the secondary device. That is, the first electronic device determines that the first electronic device distributively displays a portion of content on the first interface of the second electronic device.
In other embodiments, in response to a user dragging operation (i.e., the second operation) to the "transmission angle" 1102, the first electronic device displays an interface 1106 (i.e., a third interface) as shown in (B) of fig. 11B. Wherein, a prompt box 1107 is displayed on the interface 1106. In some embodiments, as shown in fig. 11B (B), a prompt box 1107 is used for the user to determine whether to extend the interface content of the native device (i.e., the first electronic device) to the second electronic device.
Based on a prompt box 1107 on the interface 1106 shown in fig. 11B (B), if the user determines to extend the interface content of the native device (i.e., the first electronic device) to the second electronic device, the second electronic device may be determined to be the secondary device. That is, the first electronic device determines to distributively display a portion of the content on the first interface of the first electronic device to the second electronic device. If the user cancels the expansion of the interface content of the local device (namely the first electronic device) to the second electronic device, the current distributed display process is stopped.
In another example, in response to a first operation by a user, the electronic device may display a draggable icon to determine the primary device from a drag action (i.e., a second operation) by the user, and to determine the secondary device by itself.
For example, in response to a first operation (e.g., operation 703 shown in fig. 7, operation 801 shown in (a) in fig. 8, operation 802 shown in (B) in fig. 8, or operation 803 shown in (c) in fig. 8, etc.) of the first electronic device by the user, the first electronic device displays an interface 1101 (i.e., a second interface) shown in (a) in fig. 11A or (a) in fig. 11B. In which an icon "transmission angle" 1102 is displayed on the interface 1101. The "transmission angle" 1102 is used for the user to determine interface content to be distributively displayed on other devices through a drag action.
Based on the "transmission angle" 1102 of the interface 1101 shown in fig. 11A or fig. 11B, if the first electronic device receives a drag operation (i.e., a second operation) of the "transmission angle" 1102 by the user, it may be determined that the first electronic device is the main device.
In another example, the electronic device may also display a prompt box 1002 shown in (a) of fig. 10 or a prompt box 1004 shown in (b) of fig. 10 for the user to select the primary device and the secondary device when discovering that a device establishes a connection (e.g., a wired connection or a wireless connection) with it.
In other examples, the electronic device may further display a draggable icon when discovering that the device establishes a connection (e.g., a wired connection or a wireless connection) with the electronic device, to determine the primary device according to a dragging action (i.e., a second operation) of the user, and to determine the secondary device by itself. Or, according to the dragging action (namely, the second operation) of the user, in combination with the selection operation (namely, the third operation) of the user on a third interface displayed by the electronic device in response to the dragging action (namely, the second operation) of the user, the main device and the auxiliary device are determined.
Further, the first electronic device may acquire electronic device information that displays interface content in a distributed manner with the first electronic device within a preset time period, so as to determine the second electronic device according to the electronic device information. For example, the second electronic device may be an electronic device that distributively displays the interface content with the first electronic device the most number of times within a preset time period. For another example, the second electronic device may be an electronic device that has displayed interface content in a distributed manner with the first electronic device last time, and the application is not limited thereto.
In some embodiments, the draggable icon (e.g., the "transmission angle") may be always displayed on the display screen of the primary device (i.e., the first electronic device) in a floating manner during the distributed display of the primary device and the secondary device (i.e., the second electronic device), so that the user may adjust the interface content to be transferred to the secondary device at any time.
In other embodiments, the primary device may display an option to remove the draggable icon for selection by the user when the primary device (i.e., the first electronic device) and the secondary device (i.e., the second electronic device) begin the distributed display. In some examples, as shown in fig. 12 (a), in response to the user clicking the remove option 1201, the primary device removes the draggable icon (e.g., the "transmission angle" 1202) together with the icon 1201. In other examples, as shown in (b) of fig. 12, if the primary device does not receive an operation of the user clicking the icon 1201 within a preset time (e.g., within 10 seconds), the primary device removes the icon 1201 but still displays the "transmission angle" 1202.
Note that the prompt box 1004 shown in fig. 10 (a) and fig. 10 (b) is only an example; this application does not limit the specific display form of the prompt box, including its position and size on the display screen, the specific wording in the prompt box, and the like. Likewise, the "transmission angle" 1102 in fig. 11A and fig. 11B and the "transmission angle" 1202 in fig. 12 (a) and fig. 12 (b) are merely examples of one type of draggable icon; this application does not limit the specific display form of the draggable icon, including its icon style, its position on the display screen, its size, and the like.
In addition, fig. 12 (a) and fig. 12 (b) show an example in which the option to remove the draggable icon is displayed floating on the draggable icon itself; this application does not limit the specific display form of this option. For example, the option may also be displayed elsewhere, independently of the draggable icon, and may be presented to the user as another form of icon.
Mode 2: the master device and the slave device are determined from motion data of the electronic device.
This mode is applicable to the "touch" operation, in which one device is tapped against another.
Take as an example an operation (i.e., the first operation) in which a user holds the first electronic device and "touches" it against the second electronic device: the first electronic device and/or the second electronic device may collect its own motion data in response to the user's first operation. The motion data may include, but is not limited to, the device's motion direction, motion acceleration, motion speed, rotation direction, rotation angular speed, rotation angle, and the like.
In some examples, a stationary device may be determined to be a primary device and a moving device may be determined to be a secondary device based on motion data of the first electronic device and/or the second electronic device.
In other examples, according to the motion data of the first electronic device and/or the second electronic device, the device with the smaller motion acceleration (e.g., less than a preset threshold) may be determined as the master device, and the device with the larger motion acceleration (e.g., greater than the preset threshold) as the slave device.
In other examples, if the motion accelerations of the first electronic device and the second electronic device are the same, the primary device and the secondary device may be determined according to a user selection. For example, the first electronic device and/or the second electronic device may pop up a prompt box (as shown in fig. 10 (a) or fig. 10 (b)) for the user to select which device the interface content is to be expanded to, i.e., for the user to select which device is the primary device and which device is the secondary device.
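The role decision described for Mode 2 can be sketched as follows. All names here (`MotionSample`, `decide_roles`, `ACCEL_THRESHOLD`) and the threshold value are illustrative assumptions for this sketch, not the patent's actual implementation.

```python
from dataclasses import dataclass

ACCEL_THRESHOLD = 2.0  # m/s^2; a hypothetical "preset threshold"

@dataclass
class MotionSample:
    device_id: str
    acceleration: float  # magnitude of motion acceleration during the "touch"

def decide_roles(a: MotionSample, b: MotionSample):
    """Return (master_id, slave_id), or None if the user must choose.

    The stationary (or slower-moving) device becomes the master; the
    device that was moved to "touch" it becomes the slave. Equal
    accelerations fall back to a user prompt, as in fig. 10.
    """
    if a.acceleration == b.acceleration:
        return None
    master, slave = (a, b) if a.acceleration < b.acceleration else (b, a)
    return master.device_id, slave.device_id
```

For example, tapping a moving phone against a stationary tablet would make the tablet the master device and the phone the slave device.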
In this application, after the primary device (i.e., the first electronic device) and the secondary device (i.e., the second electronic device) are determined, the primary device transfers part of the content on the first interface to the secondary device for display.
The number of the task interfaces displayed on each second electronic device is not limited in the embodiment of the application.
In some embodiments, when the first electronic device and the second electronic device display in a distributed and cooperative manner, one of the task interfaces is displayed on each second electronic device. For example, when the first electronic device and the second electronic device cooperatively display the plurality of task interfaces shown in fig. 4 (a) in a distributed manner, any one of the live video interface, the electronic lecture interface, the class communication group interface, or the memo interface shown in fig. 4 (a) may be displayed on the second electronic device. For another example, when the first electronic device and the second electronic device cooperatively display the multiple task interfaces shown in fig. 4 (b) in a distributed manner, the memo interface or the short message interface shown in fig. 4 (b) may be displayed on the second electronic device.
As shown in fig. 13, fig. 13 is an exemplary diagram of a distributed interface display provided in an embodiment of this application, taking as an example that the first electronic device is a notebook computer, the second electronic device is a tablet computer, and the notebook computer displays the multitask electronic-classroom scene shown in fig. 4 (a). As shown in fig. 13, in response to the user selecting "my tablet" from the second-electronic-device list of the option box 902 with the mouse cursor, selecting the multitask collaborative display, selecting audio output through "my tablet", and then clicking the confirmation option, the notebook computer triggers the multitask distributed collaborative display. As shown in fig. 13, the notebook computer transfers the memo interface to the tablet computer for display, while the notebook computer displays the live video interface, the electronic lecture interface, and the class communication group interface.
In the distributed collaborative display example shown in fig. 13, in response to the user selecting "my tablet" from the second-electronic-device list of the option box 902 with the mouse cursor, selecting the multitask collaborative display, selecting audio output through "my tablet", and then clicking the confirmation option, the notebook computer may decouple the displayed task interfaces: for example, the live video interface, the electronic lecture interface, and the class communication group interface are decoupled into one atomic capability (e.g., a first atomic capability), and the memo interface into another atomic capability (e.g., a second atomic capability). The memo interface corresponding to the second atomic capability is transferred to the tablet computer for display, while the notebook computer still displays the live video interface, the electronic lecture interface, and the class communication group interface corresponding to the first atomic capability.
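The decoupling step above can be sketched as a simple partition of the displayed task interfaces into a local atomic capability and a remote one. The function name `decouple` and the string labels are illustrative assumptions; in the real system each capability would carry rendering state, not just names.

```python
def decouple(task_interfaces, to_transfer):
    """Split task interfaces into (local_capability, remote_capability).

    `local_capability` stays on the first electronic device; the
    `remote_capability` is transferred to the second electronic device.
    """
    transfer_set = set(to_transfer)
    remote = [t for t in task_interfaces if t in transfer_set]
    local = [t for t in task_interfaces if t not in transfer_set]
    return local, remote

# Example matching fig. 13: the memo interface is sent to the tablet.
local, remote = decouple(
    ["live_video", "lecture", "class_group", "memo"], ["memo"]
)
```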
As shown in fig. 14, fig. 14 is an exemplary diagram of a distributed interface display provided in an embodiment of this application, taking as an example that the first electronic device is a smartphone A, the second electronic device is a smartphone B, and the smartphone A displays the split-screen scene shown in fig. 4 (b). As shown in fig. 14, in response to the user selecting "my smartphone" from the second-electronic-device list of the option box 902, selecting the multitask collaborative display, selecting audio output through "my smartphone", and then clicking the confirmation option, the smartphone A triggers the multitask distributed collaborative display. As shown in fig. 14, smartphone A transfers the memo interface to smartphone B for display, while smartphone A displays the short message interface.
In other embodiments, each of the second electronic devices displays a plurality of the task interfaces. For example, when the first electronic device and the second electronic device cooperatively display the plurality of task interfaces shown in (a) of fig. 4 in a distributed manner, two or more of a live video interface, an electronic lecture interface, a class communication group interface, or a memo interface shown in (a) of fig. 4 may be displayed on the second electronic device.
As shown in fig. 15A, fig. 15A is an exemplary diagram of a distributed interface display provided in an embodiment of this application, taking as an example that the first electronic device is a notebook computer, the second electronic device is a tablet computer, and the notebook computer displays the multitask electronic-classroom scene shown in fig. 4 (a). As shown in fig. 15A, in response to the user selecting "my tablet" from the second-electronic-device list of the option box 902 with the mouse cursor, selecting the multitask collaborative display, selecting audio output through "my tablet", and then clicking the confirmation option, the notebook computer triggers the multitask distributed collaborative display. As shown in fig. 15A, the notebook computer transfers the memo interface and the class communication group interface to the tablet computer for display, while the notebook computer displays the live video interface and the electronic lecture interface.
As shown in fig. 15B, fig. 15B is an exemplary diagram of a distributed interface display provided in an embodiment of this application, taking as an example that the first electronic device is a smartphone A, the second electronic device is a smartphone B, and the smartphone A displays in a split-screen manner. Assume that a memo interface, a short message interface, and a video interface are displayed on smartphone A in a split-screen manner. In response to the user selecting "my smartphone" from the second-electronic-device list of the option box 902, selecting the multitask collaborative display, selecting audio output through "my smartphone", and then clicking the confirmation option, smartphone A triggers the multitask distributed collaborative display. As shown in fig. 15B, smartphone A transfers the video interface to smartphone B for display, while smartphone A displays the short message interface and the memo interface, which may be displayed on smartphone A in a split-screen manner as shown in fig. 15B.

In this embodiment of the application, after the task interfaces displayed on the first electronic device are displayed in a distributed, cooperative manner on the first electronic device and the second electronic device, that is, after one or more of those task interfaces are transferred to the second electronic device, the first electronic device reallocates the display layout of the remaining functional areas. For example, the first electronic device may reallocate the functional areas of its display screen and rearrange the remaining task interfaces in the reallocated functional areas.
For example, if the first electronic device sends the task interface of the first functional area to the second electronic device for display, the first electronic device renders the remaining functional areas together on a virtual screen of the first electronic device according to the newly determined frame template, and then sends the rendered content to the display screen of the first electronic device.
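The re-layout step above can be sketched as follows. The frame templates, the `(x, y, w, h)` fraction-of-screen rectangles, and the name `relayout` are illustrative assumptions; the patent's actual implementation renders the regions together on a virtual screen before sending them to the physical display.

```python
# Hypothetical frame templates: region count -> list of (x, y, w, h)
# rectangles expressed as fractions of the display screen.
FRAME_TEMPLATES = {
    1: [(0.0, 0.0, 1.0, 1.0)],
    2: [(0.0, 0.0, 0.5, 1.0), (0.5, 0.0, 0.5, 1.0)],
    3: [(0.0, 0.0, 0.5, 1.0), (0.5, 0.0, 0.5, 0.5), (0.5, 0.5, 0.5, 0.5)],
}

def relayout(regions, transferred):
    """Drop the transferred functional area, then fit the remaining task
    interfaces into the frame template for the new region count."""
    remaining = [r for r in regions if r != transferred]
    template = FRAME_TEMPLATES[len(remaining)]
    return dict(zip(remaining, template))
```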
The distributed display method of an interface provided by the embodiments of this application can support, but is not limited to, the following specific distributed display modes: mode (I), the second electronic device displays the focused attention content of the first interface; mode (II), the second electronic device displays the lower-layer interface information of the first interface; mode (III), the second electronic device displays the floating or superimposed content of the first interface; and mode (IV), the second electronic device displays the task interface of a functional area selected by the user on the first interface. The four modes are described below with reference to the drawings.
Mode (I): the second electronic device displays the focused attention content of the first interface.
In this application, focused attention content may be understood as the interface content that receives relatively high user attention, or the core content of the first interface. In some embodiments, the core content may be defined by business attributes, application attributes, or the like.
In some examples, the first electronic device may determine, according to the application attribute corresponding to the first interface, the interface content or core content on the first interface that receives a higher degree of user attention. For example, for a map application interface, the interface content that receives the most user attention is the interface content corresponding to the navigation window, which is also the core content of the map application interface. In this case, the interface content corresponding to the navigation window may be transferred from the display screen of the first electronic device to the second electronic device for display, while the remaining related information of the first interface (e.g., other selectable routes, nearby supermarkets, banks, etc.) is displayed on the first electronic device.
For another example, for a video playing interface, the interface content that receives the most user attention is the interface content corresponding to the video playing frame, which is also the core content of the video playing interface. In this case, the interface content corresponding to the video playing frame may be transferred from the display screen of the first electronic device to the second electronic device for display, while the remaining related information of the first interface (e.g., the episode list, more information about the video) is displayed on the first electronic device.
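The attribute-driven split in the two examples above can be sketched as a lookup from the application attribute to its focus region. The mapping table, attribute strings, and the name `split_interface` are illustrative assumptions only.

```python
# Hypothetical mapping from application attribute to the functional area
# that carries the "focused attention content".
FOCUS_REGION_BY_APP = {
    "map": "navigation_window",
    "video_player": "video_playback_frame",
}

def split_interface(app_attr, regions):
    """Return (focus_region, remaining_regions) for distributed display.

    The focus region is transferred to the second electronic device;
    the remaining regions stay on the first electronic device.
    """
    focus = FOCUS_REGION_BY_APP.get(app_attr)
    if focus is None or focus not in regions:
        return None, regions  # no transfer; keep everything local
    return focus, [r for r in regions if r != focus]
```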
Referring to fig. 16, fig. 16 is a diagram illustrating an example of a distributed interface display according to an embodiment of the present disclosure. As shown in fig. 16 (a), assume that a video playing interface (i.e., the first interface) is displayed in portrait mode on a smartphone 1601 (i.e., the first electronic device). The interface content corresponding to the video playing frame 1602 is the focused attention content of the first interface, and the remaining content is video-related content. In this case, as shown in fig. 16 (b), the smartphone 1601 may transfer the interface content corresponding to the video playing frame 1602 to the tablet computer 1603 (i.e., the second electronic device) for playback, while the remaining video-related content is still displayed on the smartphone 1601 (i.e., the first electronic device).
It should be noted that, in this application, the transferred interface content that receives relatively high user attention, or the core content of the first interface (e.g., the video playing frame 1602 shown in fig. 16 (b)), may be displayed on the second electronic device (e.g., the tablet computer 1603) in full-screen mode or in a window; this application does not limit the display form.
For example, referring to fig. 17, the smartphone 1601 (i.e., the first electronic device) in fig. 17 transfers the video playing frame 1602 on its portrait-mode video playing interface (i.e., the first interface) to the tablet computer 1603. The tablet computer 1603 displays the video playing frame 1602 transferred by the smartphone 1601 in the form of a video playback window (shown as the video playback window 1701 in fig. 17). A window adjustment button, a close button, and a drag button are displayed on the video playback window 1701. The window adjustment button is used to adjust the shape and size of the video playback window 1701; the close button is used to close the video playback window 1701 on the tablet computer 1603 (i.e., the second electronic device); and the drag button is used to change the position of the video playback window 1701 by dragging.
The scenario shown in fig. 16 or fig. 17, in which the first electronic device transfers the video playing frame to the second electronic device, is applicable to any of the above embodiments of this application. Through the distributed display scheme shown in fig. 16 or fig. 17, the user can watch the video on the tablet computer 1603 while browsing the video-related content on the smartphone 1601, which greatly improves the convenience of watching a video while browsing other video-related information.
For another example, in some embodiments of this application, the first electronic device determines the first functional area according to the functions implemented by the plurality of functional areas, and thereby determines the task interface to transfer from the first electronic device to the second electronic device. For example, the task interface of a functional area that implements a main function (i.e., an interface containing the focused attention content of the first interface) may preferentially be displayed on a large-screen device, while the task interface of a functional area that implements a secondary function may preferentially be displayed on a small-screen device. The main function can be understood as a function realized by a task that receives a relatively high degree of user attention, such as the electronic lecture interface and the live video interface shown in fig. 4 (a).
In some embodiments, the user may also return the current video playback interface on the smartphone 1601 to browse the previous level of interface. In other embodiments, the user may click on an option on the current video playback interface to enter the next level of interface. In other embodiments, the user may also exit the current video playing interface and open the interface of other applications.
In some embodiments, the form in which the second electronic device (i.e., the secondary device) displays the interface content transferred by the first electronic device (i.e., the primary device), including a full-screen form or a window form, may be determined by the application attribute corresponding to the first interface, or may be selected or adjusted by the user; this application does not limit this.
In some embodiments, the second electronic device may also accept user-defined adjustments (e.g., dragging a corner of the window) to change the shape, size, and so forth of the window. For details, reference may be made to window-adjustment methods in the conventional technology, which are not described here again.
Mode (II): the second electronic device displays the lower-layer interface information of the first interface.
The lower-layer interface information of the first interface can be understood as hidden interface information of the first interface, such as detail interface information. It should be appreciated that the same application can include different application interfaces at different levels, where the levels may or may not have an upper-lower hierarchical relationship. In response to a user operation on an interface, that interface may invoke its next-level interface.
For example, the lower-level interface information of the first interface refers to a lower-level interface of a part of content on the first interface, such as detail interface information. Illustratively, the first interface includes an electronic document list including at least one electronic document. Wherein the electronic document list may be any one of the following: an email list, a picture list, a short message list, or a memo list. The first electronic device may send a corresponding interface, such as an email detail interface, a picture detail interface, a short message detail interface, or a memo detail interface, to the second electronic device for display. In some embodiments, the electronic document list may include at least one unread electronic document. For example, the electronic document list may be an email list or a short message list, etc.
For example, in some embodiments, the first electronic device displays a first interface that includes an electronic document list containing at least one unread electronic document. After the first electronic device detects a touch operation with the second electronic device, it determines that the second electronic device is its secondary device. The first electronic device then finds the most recent of the unread electronic documents and displays the detail interface of that document on the second electronic device. Afterwards, the first electronic device continues to display the electronic document list, in which the most recent unread electronic document is now marked as read.
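The latest-unread behavior described above can be sketched as follows. The `Document` fields and the name `open_latest_unread` are assumptions for illustration; only the behavior (pick the most recent unread document, display it remotely, mark it read locally) comes from the description above.

```python
from dataclasses import dataclass

@dataclass
class Document:
    doc_id: str
    timestamp: int  # larger means more recent
    unread: bool

def open_latest_unread(doc_list):
    """Return the id of the document to display on the secondary device,
    marking it read in the list kept on the primary device."""
    unread = [d for d in doc_list if d.unread]
    if not unread:
        return None
    latest = max(unread, key=lambda d: d.timestamp)
    latest.unread = False  # the list on the first device now shows it as read
    return latest.doc_id
```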
In some embodiments, if the electronic document list includes at least one unread electronic document, the electronic document list may further include at least one read electronic document.
For example, the first interface includes a picture list containing thumbnails of multiple pictures, and the lower-layer interface information of the first interface may be the detail interface of a certain picture (e.g., the picture selected by the user, or the latest picture). Referring to fig. 18, as shown in fig. 18 (a), assume that a thumbnail interface 1802 (i.e., the first interface) of a gallery application is displayed on a tablet computer 1801 (i.e., the first electronic device). The thumbnail interface 1802 includes thumbnails of multiple pictures. In this case, assuming the second electronic device (i.e., the secondary device) is determined to be the smartphone 1803, if the user selects (e.g., clicks) the thumbnail of picture A on the thumbnail interface 1802, then, as shown in fig. 18 (b), the tablet computer 1801 (i.e., the first electronic device) may transfer the next-level interface of picture A (i.e., the picture A detail interface 1804) to the smartphone 1803 (i.e., the second electronic device) for presentation, while the original thumbnail interface 1802 (i.e., the first interface) is still displayed on the tablet computer 1801 (i.e., the first electronic device).
Through the distributed display scheme of the interface shown in fig. 18, a user can view the picture detail interface on the smart phone 1803 while browsing the thumbnails on the tablet 1801. The method and the device greatly reduce the operation complexity of the user when using the gallery and improve the convenience of the user when using the gallery.
For another example, if the first interface includes an email list that includes an abbreviated list of multiple emails (including unread emails and/or read emails), the underlying interface information of the first interface may be a detailed interface of a particular email (e.g., an email selected by the user). Referring to fig. 19, as shown in (a) of fig. 19, it is assumed that a mailing list interface 1902 (i.e., a first interface) of the mailing application is displayed on the tablet computer 1901 (i.e., the first electronic device). The mailing list interface 1902 includes therein thumbnail information of a plurality of mails. In this case, assuming that the second electronic device (i.e., the secondary device) is determined to be the smart phone 1903, if the user selects (e.g., clicks) the thumbnail information of the email a on the email list interface 1902, as shown in fig. 19 (b), the tablet computer 1901 (i.e., the first electronic device) may transfer the next-level interface of the email a (i.e., the email a details interface 1904) to the smart phone 1903 (i.e., the second electronic device) for presentation, and the original email list interface 1902 (i.e., the first interface) is still displayed on the tablet computer 1901 (i.e., the first electronic device).
Alternatively, assume that a mail list interface (i.e., the first interface) of the mail application is displayed on the first electronic device, the mail list interface including thumbnail information of a plurality of mails, at least one of which is unread. In this case, if the first electronic device and the second electronic device establish a communication connection, the second electronic device may by default display the mail detail interface of the latest unread mail. That is, the first electronic device transfers the next-level interface (i.e., the mail detail interface) of the latest unread mail to the second electronic device for presentation, while the original mail list interface is still displayed on the first electronic device, with the latest unread mail marked as read. Alternatively, if the plurality of mails include at least one unread mail and at least one read mail, the second electronic device may by default display the mail detail interface of the latest mail. That is, the first electronic device transfers the next-level interface (i.e., the mail detail interface) of the latest mail to the second electronic device for presentation, while the original mail list interface is still displayed on the first electronic device; if the latest mail was unread, it is marked as read on the original mail list interface.
In the present application, if the first electronic device transfers the lower-layer interface information of the first interface to the second electronic device, the lower-layer interface information of the first interface may be displayed on the second electronic device in a full screen manner (as shown in fig. 18 (b) and fig. 19 (b)), or may be displayed on the second electronic device in a window display manner. If the lower-layer interface information of the first interface is displayed on the second electronic device in a window display form, in some embodiments, the shape, size, or position of the window may also be adjusted in a user-defined manner (for example, by dragging the window), which is not described herein again.
Take fig. 20 as an example, in which the tablet computer 1901 transfers the next-level interface of email A (i.e., the email A detail interface 2001) to the smartphone 1903 (i.e., the second electronic device). As shown in fig. 20, the email A detail interface 2001 is displayed on the smartphone 1903 in the form of a window. A window adjustment button, a close button, and a drag button are displayed on the window (the window 2001 shown in fig. 20 (b)). The window adjustment button is used to adjust the shape and size of the window 2001; the close button is used to close the window 2001 on the smartphone 1903 (i.e., the second electronic device); and the drag button is used to change the position of the window 2001 by dragging.
Through the distributed display scheme of the interface shown in fig. 19 or fig. 20, a user can view the mail detail interface on the smartphone 1903 while browsing the mail thumbnail information on the tablet computer 1901. The operation complexity of the user in processing the mails is greatly reduced, and the convenience of the user in processing the mails is improved.
For another example, the first interface includes a short message list. If the first electronic device and the second electronic device establish a communication connection and the short message list includes at least one short message, the first electronic device may by default send the latest short message, or the short message selected by the user, to the second electronic device for display, while the first electronic device still displays the original short message list. Further, if the short message list includes at least one unread short message, the first electronic device may by default send the latest unread short message to the second electronic device for display, while the original short message list is still displayed on the first electronic device, with the latest unread short message marked as read on the original short message list interface.
For another example, the first interface includes a memo list. If the first electronic device and the second electronic device establish a communication connection and the memo list includes at least one memo note, the first electronic device may by default send the latest memo note, or the memo note selected by the user, to the second electronic device for display, while the memo list is still displayed on the first electronic device.
Mode (III): the second electronic device displays the floating content or the superimposed content of the first interface.
In some embodiments, the floating content or the superimposed content on the first interface may be interface content corresponding to an auxiliary function, such as an input keyboard.
As shown in fig. 21 (a), assume that a short-video interface 2102 (i.e., the first interface) of a short-video application (e.g., Douyin) is displayed on a smartphone 2101 (i.e., the first electronic device). The short-video interface 2102 includes a video playing box and a comment area. When the user posts a comment in the comment area, the application invokes the input keyboard. Here, as shown in fig. 21 (a), the input keyboard 2103 is displayed in floating or superimposed form on the short-video interface 2102. In this case, assuming the second electronic device (i.e., the secondary device) is determined to be the smartphone 2104, then, as shown in fig. 21 (b), the smartphone 2101 (i.e., the first electronic device) may transfer the input keyboard 2103 to the smartphone 2104 (i.e., the second electronic device) for display, while the remaining short-video-related content (e.g., the video playing box and the comment area) is still displayed on the smartphone 2101 (i.e., the first electronic device).
Through the distributed display scheme shown in fig. 21, the user can watch the video and comments on the smartphone 2101 while using the smartphone 2104 as an input keyboard to post comments, which greatly improves the convenience of watching short videos and posting comments.
In some embodiments, in response to receiving an editing operation of a user on the detail interface of the latest unread electronic document, the second electronic device sends the input keyboard to the first electronic device for display, and the second electronic device displays only the editing interface of the latest unread electronic document.
For another example, in some embodiments, in response to receiving the user's editing operation on the detail interface of the latest unread electronic document, the second electronic device floats or superimposes the input keyboard on the editing interface of the latest unread electronic document; then, in response to detecting a touch operation with the first electronic device, the second electronic device sends the input keyboard to the first electronic device for display, and the second electronic device displays only the editing interface of the latest unread electronic document.
For example, assuming that the electronic document list is an email list or a short message list, in response to receiving the user's reply operation on the latest unread electronic document (such as an unread email or an unread short message), the second electronic device sends the input keyboard to the first electronic device for display and displays only the reply interface of the latest unread electronic document. Alternatively, in response to receiving the user's reply operation on the latest unread electronic document, the second electronic device displays the input keyboard floating or superimposed on the reply interface of the latest unread electronic document; and, in response to detecting a touch operation with the first electronic device, the second electronic device sends the input keyboard to the first electronic device for display and displays only the reply interface of the latest unread electronic document. As shown in fig. 22 (a), it is assumed that the tablet computer 1901 transfers the next-level interface of email A (i.e., the email A detail interface 2001) to the smartphone 1903 (i.e., the second electronic device) for display, where the email A detail interface 2001 is displayed in a window on the smartphone 1903. Further, assuming that the user clicks an edit button below the email A detail interface 2001, the smartphone 1903 (i.e., the second electronic device) sends the input keyboard to the tablet computer 1901 (i.e., the first electronic device) for display.
In some embodiments, assuming that the user clicks an edit button below the email A detail interface 2001, the smartphone 1903 (i.e., the second electronic device) displays the input keyboard 2202 floating or superimposed on the email A editing interface 2201, as shown in fig. 22 (b).
Further, assuming that the tablet computer 1901 and/or the smartphone 1903 receives a first operation of the user (for example, a device-to-device touch, or "tap", operation) while the tablet computer 1901 and the smartphone 1903 are in the state shown in fig. 22 (b), then, as shown in fig. 22 (c), the smartphone 1903 sends the input keyboard 2202 to the tablet computer 1901 for display, and the smartphone 1903 still displays the email A editing interface 2201.
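The touch-triggered hand-back can be sketched as a small event handler. This is illustrative only: device surfaces are modeled as plain lists of interface names, and the function name is an assumption, not part of the scheme described above.

```python
def on_touch_event(first_device, second_device):
    """On a device-to-device touch ('tap') operation, move the floating
    input keyboard from the second device back to the first, while the
    editing interface stays on the second device."""
    if "input_keyboard" in second_device:
        second_device.remove("input_keyboard")
        first_device.append("input_keyboard")
    return first_device, second_device

# Before the touch: the second device holds both the editing interface
# and the keyboard; afterwards only the editing interface remains there.
first, second = on_touch_event(["home_screen"], ["email_a_editing", "input_keyboard"])
```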
As another example, assume that the first electronic device transfers the next-level interface of a certain short message (e.g., a default short message, the latest unread short message, or a short message selected by the user), i.e., the short message detail interface, to the second electronic device for display. Further, assuming that the user clicks a reply button on the short message detail interface, the second electronic device sends the input keyboard to the first electronic device for display, and the short message reply interface is displayed on the second electronic device.
In some embodiments, assuming that the user clicks a reply button on the short message detail interface, the second electronic device displays the input keyboard floating or superimposed on the short message reply interface. Further, assuming that the second electronic device receives a first operation of the user (for example, a touch operation with the first electronic device), the second electronic device sends the input keyboard to the first electronic device for display, and the short message reply interface is still displayed on the second electronic device.
For another example, assume that the first electronic device transfers the next-level interface of a certain memo note (e.g., the latest memo note or a memo note selected by the user), i.e., the memo note detail interface, to the second electronic device for display. Further, assuming that the user clicks an edit button on the memo note detail interface, the second electronic device sends the input keyboard to the first electronic device for display, and the memo note editing interface is displayed on the second electronic device.
In some embodiments, assuming that the user clicks an edit button on the memo note detail interface, the second electronic device displays the input keyboard floating or superimposed on the memo note editing interface. Further, assuming that the second electronic device receives a first operation of the user (for example, a touch operation with the first electronic device), the second electronic device sends the input keyboard to the first electronic device for display, and the memo note editing interface is still displayed on the second electronic device.
In the present application, the transferred floating or superimposed content of the first interface (for example, the input keyboard 2202 shown in fig. 22 (c)) may be displayed on the tablet computer 1901 in the landscape state or in the portrait state; this is not limited in the present application.
Through the distributed display scheme of the interface shown in fig. 22, the user can use the tablet computer 1901 as an input keyboard to edit an email while viewing the email interface on the smartphone 1903. This greatly improves the convenience of processing emails.
(IV) The second electronic device displays the task interface of the functional area selected by the user on the first interface.
For example, in other embodiments of the present application, the first electronic device may determine the first functional area according to the user's selection operation on a functional area of the first interface, thereby determining the task interface transferred from the first electronic device to the second electronic device. For example, the first electronic device may determine the functional area to which the touch screen region receiving the user's selection operation belongs, and determine all task interfaces in that functional area.
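Mapping the touched screen region to a functional area can be sketched as a point-in-rectangle lookup. This is a sketch under assumptions: functional areas are modeled as named rectangles in `(left, top, right, bottom)` pixel coordinates, a layout format the patent does not specify.

```python
def area_of_touch(areas, x, y):
    """Return the name of the functional area whose rectangle contains the
    touch point (x, y), or None if the point falls outside every area."""
    for name, (left, top, right, bottom) in areas.items():
        if left <= x < right and top <= y < bottom:
            return name
    return None

# A two-area layout: the video playing box above the comment area.
areas = {"video_play_box": (0, 0, 640, 360), "comment_area": (0, 360, 640, 720)}
selected = area_of_touch(areas, 100, 400)   # a tap inside the comment area
```

The device would then collect all task interfaces belonging to the returned area and transfer them together.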
In some examples, the tablet computer 1901 may determine whether to display the input keyboard 2202 in the landscape state or the portrait state according to its device state. As shown in fig. 22 (c), if the device state of the tablet computer 1901 is the landscape state, the tablet computer 1901 displays the input keyboard 2202 in the landscape state; if the device state of the tablet computer 1901 is the vertical screen state, the tablet computer 1901 may display the input keyboard in the vertical screen state.
As another example, when the smartphone 2101 (i.e., the first electronic device) in fig. 23 transfers the input keyboard 2103, which is floating or superimposed on the short video interface 2102 (i.e., the first interface), to the smartphone 2104, the smartphone 2104 may determine according to its own device state to display the transferred input keyboard in the landscape state. As shown in fig. 23 (b), the smartphone 2104 displays the input keyboard 2301 in the landscape state, and the remaining short-video-related content (such as the video playing box and the comment area) remains displayed on the smartphone 2101 (i.e., the first electronic device).
With the distributed display scheme of the interface shown in fig. 23, the user can watch the video and read comments on the smartphone 2101 while using the smartphone 2104 as an input keyboard to post comments. This greatly improves the convenience of watching short videos and posting comments.
In some embodiments, after the content displayed on the first electronic device is distributively and cooperatively displayed on the first electronic device and the second electronic device, if one task interface (e.g., the task interface in the second functional area) remains on the first electronic device, the first electronic device may display the remaining task interface in full screen. Alternatively, the first electronic device may display the remaining task interface in a window of a preset size, or at the optimal display scale of the interface; this is not limited in the present application.
In other embodiments, after the content displayed on the first electronic device is cooperatively and distributively displayed on the first electronic device and the second electronic device, if two task interfaces (for example, the task interface in the second functional area and the task interface in the third functional area) remain on the first electronic device, the two task interfaces may be displayed in a left-right split screen manner or an up-down split screen manner, which is not limited in the present application. For example, the short message interface and the memo interface remaining on smartphone A after the transfer display shown in fig. 15B are split up and down on smartphone A.
In some embodiments, the first electronic device may determine the split screen mode of the remaining content interfaces (e.g., the task interface in the second functional area and the task interface in the third functional area) according to its device state. For example, if the first electronic device is in the vertical screen state, the remaining content interfaces are split up and down (as shown in fig. 15B); if the first electronic device is in the landscape state, the remaining content interfaces are displayed in a left-right split screen manner. In this embodiment of the application, the interface transferred from the first electronic device to the second electronic device may be displayed on the second electronic device in landscape or in portrait; this is not limited in the present application. For example, the video interface that smartphone A shown in fig. 15B transfers to smartphone B is displayed on smartphone B in portrait. The video window is provided with a window adjusting button, a closing button and a dragging button: the window adjusting button is used to adjust the shape and size of the video window, the closing button is used to close the video window, and the dragging button is used to change the position of the video window by dragging.
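The layout rules for the interfaces that remain on the first device can be sketched as one small decision function. The mode names (`full_screen`, `top_bottom_split`, and so on) are illustrative assumptions.

```python
def layout_remaining(interfaces, device_state):
    """Sketch of the rules above: one remaining interface goes full screen;
    two are split according to the device state (up-down when in portrait,
    left-right when in landscape); more fall back to a preset frame template."""
    if len(interfaces) == 1:
        return "full_screen"
    if len(interfaces) == 2:
        return "top_bottom_split" if device_state == "portrait" else "left_right_split"
    return "frame_template"

# Two interfaces remain on a portrait-state phone, so they split up and down,
# as in the smartphone A example of fig. 15B.
mode = layout_remaining(["short_message", "memo"], "portrait")
```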
In some embodiments, the second electronic device may determine, according to its device state, how to display the interface transferred by the first electronic device. For example, if the second electronic device is in the vertical screen state, it displays the transferred interface in portrait (as shown in fig. 15B); if the second electronic device is in the landscape state, it displays the transferred interface in landscape (as shown in fig. 15C).
In other embodiments, after the content displayed on the first electronic device is distributedly and cooperatively displayed on the first electronic device and the second electronic device, if more than two task interfaces remain on the first electronic device, the remaining task interfaces may be arranged on the display screen of the first electronic device in a preset relative position relationship. For example, the first electronic device may determine (e.g., according to the number, attributes, functions, and the like of the task interfaces) a suitable frame template, from multiple frame templates preset in the first electronic device, with which to display the task interfaces on its display screen. Similarly, if more than two task interfaces (i.e., multiple task interfaces) are displayed on the second electronic device, the second electronic device may determine (e.g., according to the number, attributes, functions, and the like of the task interfaces) a suitable frame template, from multiple frame templates preset in the second electronic device, with which to display the multiple task interfaces on its display screen.
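Template selection can be sketched as a lookup keyed, in the simplest case, by the number of task interfaces. The template names and the fallback are illustrative assumptions; as the passage notes, a real selection could also weigh interface attributes and functions.

```python
# Hypothetical preset frame templates, keyed by task-interface count.
FRAME_TEMPLATES = {
    1: "full_screen",
    2: "two_pane",
    3: "one_large_two_small",
    4: "grid_2x2",
}

def pick_template(task_interfaces):
    """Choose a preset frame template by the number of task interfaces,
    falling back to an assumed scrollable arrangement when none fits."""
    return FRAME_TEMPLATES.get(len(task_interfaces), "scrollable_grid")

# Three remaining interfaces select the one-large-two-small arrangement.
template = pick_template(["live_video", "lecture", "class_group"])
```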
It should be noted that the above embodiments take one first electronic device and one second electronic device displaying the multitask interface in a distributed manner as an example; the present application does not limit the number of second electronic devices that cooperate with the first electronic device to display the multitask interface.
For example, take the first electronic device as a notebook computer, the second electronic devices as a tablet computer and a smartphone, and the notebook computer displaying the distributed collaborative electronic classroom scene shown in (a) in fig. 4. In response to the user's operation of selecting "my tablet computer" from the second electronic device list of the option box 502 with the mouse cursor, the operation of selecting multitask collaborative display, the operation of selecting audio output through "my tablet computer", and the operation of clicking the confirm option, the notebook computer triggers multitask distributed collaborative display. For example, the notebook computer transfers the memo interface to the tablet computer for display, transfers the class communication group interface to the smartphone for display, and itself displays the live video interface and the electronic lecture interface.
Through the interface distributed display method provided by the embodiments of the present application, the content displayed on one electronic device (e.g., the first electronic device) can be displayed on multiple different electronic devices (e.g., the first electronic device and the second electronic device) in a distributed and cooperative manner, so that the user can process different tasks through the multiple electronic devices, which facilitates user operation. In addition, this can solve problems such as incomplete task interface display and inconvenient operation caused by displaying multiple tasks on one display screen.
In some embodiments, the windows of the task interfaces displayed on the first electronic device and the second electronic device, or the task interfaces themselves, may accept adaptation adjustments. An adaptation adjustment adapts the display effect of a task interface to the size of the device's display screen. The adjustment may include, but is not limited to, automatic adjustment by the device and adjustment in response to a user operation (e.g., a drag operation). The adaptation capabilities may include, but are not limited to, a stretch capability, a zoom capability, a hide capability, a fold capability, an equal-split capability, a proportion capability, and an extend capability.
The stretch capability refers to the ability to change the shape and size of a window/interface, for example through device adaptation or through a user's stretch operation on the window/interface; it may include a transverse stretch capability and a longitudinal stretch capability. The zoom capability refers to the ability to change the size of a window/interface without changing its shape, for example through device adaptation or through a user's operation of dragging a corner of the window/interface; it may include a zoom-out capability and a zoom-in capability. The hide capability refers to the ability to adjust the display form of a window/interface, and may include the ability to hide a window/interface and the ability to display a hidden window/interface, for example through device adaptation or through a user's selection operation. The fold capability refers to the ability to adjust multiple windows/interfaces from a one-row display to a multi-row display, for example through device adaptation or through a user's operation of dragging a window/interface. The equal-split capability refers to the ability to display multiple windows/interfaces equally sharing a functional area, for example through device adaptation or through a user's selection of an equal-split option. The proportion capability refers to the ability to adjust the proportion of the device's display screen occupied by the functional areas of multiple windows/interfaces, for example through device adaptation or through a user's input of a ratio. The extend capability refers to the ability to adjust multiple windows/interfaces from a multi-row (e.g., two-row) display to a single-row display, for example through device adaptation or through a user's operation of dragging a window/interface.
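Two of these capabilities, stretch and zoom, can be sketched with a minimal window model. The class and method names are assumed for illustration; the point is the distinction the text draws: stretching may change both shape and size, while zooming changes size only.

```python
class Window:
    """Minimal sketch of a window supporting the stretch and zoom
    adaptation capabilities described above."""

    def __init__(self, width, height):
        self.width, self.height = width, height

    def stretch(self, dw=0, dh=0):
        # Transverse and/or longitudinal stretch; the aspect ratio may change.
        self.width += dw
        self.height += dh

    def zoom(self, factor):
        # Uniform scale; the shape (aspect ratio) is preserved.
        self.width = int(self.width * factor)
        self.height = int(self.height * factor)

w = Window(100, 50)
w.zoom(2)          # zoom-in: 200 x 100, same 2:1 shape
w.stretch(dw=40)   # transverse stretch: 240 x 100, shape changed
```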
Referring to fig. 24, fig. 24 shows exemplary diagrams of several adaptation capabilities provided by an embodiment of the present application, where (a) in fig. 24 shows the stretch capability, (b) in fig. 24 shows the zoom capability, (c) in fig. 24 shows the hide capability, (d) in fig. 24 shows the fold capability, (e) in fig. 24 shows the equal-split capability, (f) in fig. 24 shows the proportion capability, and (g) in fig. 24 shows the extend capability. The several adaptation capabilities shown in fig. 24 are merely examples; the adaptation capabilities described in the embodiments of the present application may further include other window/interface adjustment capabilities for adapting the display effect of the task interface to the size of the device's display screen.
In the embodiments of the present application, multiple task interfaces may be jointly rendered on a virtual screen of the first electronic device and then sent to the display screen of the first electronic device for display. For example, the first electronic device renders different task interfaces to different functional areas on the virtual screen according to a preset frame template.
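The template-driven placement can be sketched as assigning each task interface to a functional-area rectangle of the virtual screen. Rectangles in `(left, top, right, bottom)` coordinates are an assumed format; the actual rendering and compositing pipeline is not described by the source.

```python
def render_to_virtual_screen(interfaces, template):
    """Assign each task interface to a functional-area rectangle of the
    virtual screen according to a preset frame template, returning a
    mapping from interface name to rectangle."""
    if len(interfaces) > len(template):
        raise ValueError("frame template has too few functional areas")
    return dict(zip(interfaces, template))

# A two-pane template on a 1280x720 virtual screen: left and right halves.
two_pane = [(0, 0, 640, 720), (640, 0, 1280, 720)]
layout = render_to_virtual_screen(["live_video", "lecture"], two_pane)
```

The resulting mapping would then be composited on the virtual screen and shown on the physical display, or re-computed when content is transferred away.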
In an embodiment of the present application, the first electronic device may determine the first functional area from the plurality of functional areas in any one of the following manners: determining the first functional area according to the task attributes corresponding to the plurality of task interfaces; determining the first functional area according to a user's selection operation on one or more task interfaces, where the one or more task interfaces are located in the first functional area; determining the first functional area according to the functions implemented by the plurality of functional areas; or determining the first functional area according to a user's selection operation on a functional area.
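The four strategies can be sketched as a single dispatch function. The `priority` and `function` fields and the concrete matching rules (lowest priority hands off first, input-type areas suit a handheld slave) are assumptions for illustration; the source lists the strategies without fixing their internals.

```python
def determine_first_area(areas, method, user_selection=None):
    """Pick the functional area to transfer, using one of the four
    strategies listed above. Each area is a dict with assumed
    'name', 'priority', and 'function' fields."""
    if method == "task_attribute":
        # e.g. hand off the lowest-priority (secondary) task area
        return min(areas, key=lambda a: a["priority"])["name"]
    if method in ("interface_selection", "area_selection"):
        # the user directly selected a task interface or a functional area
        return user_selection
    if method == "function":
        # e.g. an input-type area suits a handheld slave device
        return next(a["name"] for a in areas if a["function"] == "input")
    raise ValueError("unknown selection method: " + method)

areas = [
    {"name": "video", "priority": 2, "function": "playback"},
    {"name": "keyboard", "priority": 1, "function": "input"},
]
chosen = determine_first_area(areas, "task_attribute")
```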
In some embodiments of the present application, the first electronic device determines the first functional area according to the task attributes of the content in the functional areas of the first interface, thereby determining the task interface transferred from the first electronic device to the second electronic device, namely the task interface in the functional area determined by the first electronic device. For example, a key task interface or main task interface may be preferentially displayed on a large-screen device, while a secondary task interface may be preferentially displayed on a small-screen device. A key task interface or main task interface may be understood as a task interface that the user pays more attention to, or a core task interface among the multiple task interfaces displayed by the first electronic device, such as the electronic lecture interface and the live video interface shown in (a) of fig. 4. As another example, assuming that the second electronic device is a large-screen device (e.g., a television or a notebook computer), the first electronic device may send content suitable for display on a large screen (e.g., the video playing interface in a video functional area) to the second electronic device for display.
In other embodiments of the present application, the first electronic device determines the first functional area according to a user's selection operation (e.g., a click operation or a drag operation) on a functional area of the first interface, thereby determining the task interface transferred from the first electronic device to the second electronic device, namely the task interface in the functional area selected by the user.
For example, in response to the user's operation of selecting an interface on the multitask interface following the prompt of the first electronic device, the first electronic device transfers the task interface selected by the user to the second electronic device for display. For example, as shown in fig. 25, in response to the user selecting "my tablet computer" from the second electronic device list in the option box 902, selecting the multitask collaborative display operation, selecting audio output through "my tablet computer", and clicking the confirm option, the first electronic device displays a prompt box 2501 shown in fig. 25, which prompts the user to select, by a click operation, a task interface on the multitask interface to be transferred to the second electronic device (i.e., "my tablet computer").
In some embodiments, when the first electronic device has determined the content to transfer to the second electronic device, the first electronic device may transfer the corresponding content to the second electronic device, re-render the remaining content (e.g., the remaining task interfaces) to the virtual screen of the first electronic device, and re-display it on the first electronic device. For example, the first electronic device re-renders the remaining task interfaces to different areas of the virtual screen according to a re-determined frame template.
In some embodiments, the first electronic device may retrieve the content transferred to the second electronic device at any time while the first electronic device and the second electronic device display the multitask interface in a distributed and coordinated manner. Illustratively, the first electronic device may retrieve the content (i.e., the task interface transferred to the second electronic device) in response to a user operation.
For example, the user may retract the task interface transferred to the second electronic device by clicking the corresponding device button again in the option box 902 shown in fig. 9 (b) or fig. 9 (c). As shown in fig. 26, it is assumed that the notebook computer and the tablet computer cooperatively display a multitask interface in a distributed manner, where the notebook computer displays the live video interface, the electronic lecture interface and the class communication group interface, and the tablet computer displays the memo interface. In response to the user clicking "my tablet computer" again in the option box 502 displayed on the notebook computer, the notebook computer retracts the memo interface transferred to the tablet computer. After retracting the memo interface, the notebook computer displays the live video interface, the electronic lecture interface, the class communication group interface and the memo interface (i.e., the first interface).
As another example, the user may retrieve the task interface transferred to the second electronic device by clicking "retrieve to main device" in the option box 902 shown in fig. 9 (b) or fig. 9 (c). As shown in fig. 27, it is assumed that smartphone A and smartphone B cooperatively display a multitask interface in a distributed manner, where smartphone A displays a short message interface and smartphone B displays a memo interface. In response to the user clicking a retract button 903 of the hidden option box on smartphone A, smartphone A displays the option box 902. In response to the user clicking "retrieve to main device" in the option box 902, smartphone A retrieves the memo interface transferred to smartphone B. After retrieving the memo interface, smartphone A displays the short message interface and the memo interface (i.e., the first interface).
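The retract operation can be sketched as moving every interface assigned to a slave device back to the main device. The assignment structure (a dict from device name to interface list, with `"main"` for the first device) is an illustrative assumption.

```python
def retract_from_device(assignments, device):
    """Retract every task interface transferred to `device` back to the
    main device, as when the user clicks the device button again or
    clicks 'retrieve to main device' in the option box."""
    retracted = assignments.pop(device, [])
    assignments.setdefault("main", []).extend(retracted)
    return assignments

# Smartphone A retrieves the memo interface transferred to smartphone B,
# after which it displays both the short message and memo interfaces.
assignments = {"main": ["short_message"], "smartphone_b": ["memo"]}
assignments = retract_from_device(assignments, "smartphone_b")
```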
It is to be understood that the electronic device (e.g. the first electronic device or the second electronic device) comprises corresponding hardware structures and/or software modules for performing the respective functions in order to realize the functions of any of the above embodiments. Those of skill in the art would readily appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as hardware or combinations of hardware and computer software. Whether a function is performed as hardware or computer software drives hardware depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiment of the present application, the electronic device (such as the first electronic device or the second electronic device) may be divided into the functional modules, for example, each functional module may be divided corresponding to each function, or two or more functions may be integrated into one processing module. The integrated module can be realized in a hardware mode, and can also be realized in a software functional module mode. It should be noted that, in the embodiment of the present application, the division of the module is schematic, and is only one logic function division, and there may be another division manner in actual implementation.
For example, in the case where each functional module is divided in an integrated manner, fig. 28 shows a block diagram of an electronic device provided in an embodiment of the present application. The electronic device may be the first electronic device or the second electronic device. The electronic device may include a processing unit 2810, a transceiving unit 2820, and a display unit 2830.
When the electronic device is the first electronic device, the display unit 2830 is configured to support the first electronic device in displaying a first interface of a first application program including at least a first functional area and a second functional area. The processing unit 2810 is configured to support the first electronic device in, upon detecting a touch operation with the second electronic device, sending the content of the first functional area to the second electronic device for display through the transceiving unit 2820 in response to the detected touch operation. After the transceiving unit 2820 sends the content of the first functional area to the second electronic device for display, the first electronic device displays a second interface of the first application program, where the second interface includes the second functional area and does not include the first functional area.
Alternatively, when the electronic device is the first electronic device, the display unit 2830 is configured to support the first electronic device in displaying a first interface including an electronic document list, where the electronic document list includes at least one unread electronic document and at least one read electronic document. The processing unit 2810 is configured to support the first electronic device in detecting a touch operation with a second electronic device, determining that the second electronic device is a slave device of the first electronic device, searching the at least one unread electronic document for the latest unread electronic document, and causing the detail interface of the latest unread electronic document to be displayed on the second electronic device. The transceiving unit 2820 is configured to support the first electronic device in sending the detail interface of the latest unread electronic document to the second electronic device for display. After the transceiving unit 2820 sends the detail interface of the latest unread electronic document to the second electronic device for display, the display unit 2830 continues to display the electronic document list, in which the latest unread electronic document is marked as read.
It should be noted that the transceiving unit 2820 may include a radio frequency circuit. Specifically, the electronic device may receive and transmit wireless signals through the radio frequency circuit. Typically, the radio frequency circuit includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like. In addition, the radio frequency circuit may also communicate with other devices via wireless communication. The wireless communication may use any communication standard or protocol, including but not limited to the global system for mobile communications, general packet radio service, code division multiple access, wideband code division multiple access, long term evolution, email, and the short message service.
It should be understood that the modules in the electronic device may be implemented in software and/or hardware, which is not particularly limited herein. In other words, the electronic device is presented in the form of functional modules. As used herein, a "module" may refer to an application specific integrated circuit (ASIC), an electronic circuit, a processor and memory that execute one or more software or firmware programs, an integrated logic circuit, and/or other devices that can provide the described functionality. Alternatively, in a simple embodiment, those skilled in the art will appreciate that the electronic device may take the form shown in fig. 29. The processing unit 2810 may be implemented by the processor 2910 shown in fig. 29, and the transceiving unit 2820 may be implemented by the transceiver 2920 shown in fig. 29. Specifically, the functions of the processing unit may be implemented by the processor executing a computer program stored in the memory. Optionally, the memory is a storage unit in the chip, such as a register or a cache; the storage unit may also be a storage unit located outside the chip in the computer device, such as the memory 2930 shown in fig. 29.
In an alternative, when the above functions are implemented using software, they may be implemented in whole or in part in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer program instructions are loaded and executed on a computer, the processes or functions described in the embodiments of the present application are implemented in whole or in part. The computer may be a general purpose computer, a special purpose computer, a computer network, or another programmable device. The computer instructions may be stored in a computer readable storage medium or transmitted from one computer readable storage medium to another, for example, from one website, computer, server, or data center to another website, computer, server, or data center via wired (e.g., coaxial cable, optical fiber, digital subscriber line (DSL)) or wireless (e.g., infrared, radio, microwave) means. The computer readable storage medium can be any available medium that can be accessed by a computer, or a data storage device such as a server or data center that incorporates one or more available media. The available medium may be a magnetic medium (e.g., floppy disk, hard disk, magnetic tape), an optical medium (e.g., DVD), or a semiconductor medium (e.g., solid state disk (SSD)), among others.
The steps of a method or algorithm described in connection with the embodiments disclosed herein may be embodied in hardware or in software instructions executed by a processor. The software instructions may consist of corresponding software modules, which may be stored in RAM, flash memory, ROM, EPROM, EEPROM, a register, a hard disk, a removable disk, a CD-ROM, or any other form of storage medium known in the art. An exemplary storage medium is coupled to the processor such that the processor can read information from, and write information to, the storage medium. Of course, the storage medium may also be integral to the processor. The processor and the storage medium may reside in an ASIC. Additionally, the ASIC may reside in an electronic device. Of course, the processor and the storage medium may also reside as discrete components in an electronic device.
Through the foregoing description of the embodiments, it is clear to those skilled in the art that, for convenience and brevity of description, the division into the foregoing functional modules is merely used as an example. In practical applications, the foregoing functions may be allocated to different functional modules as needed; that is, the internal structure of the device may be divided into different functional modules to complete all or part of the functions described above.
In an alternative, the present application provides a communication system including a first electronic device and a second electronic device, where the first electronic device and the second electronic device are configured to implement the method in any one of the possible implementations provided by the present application.
In an alternative aspect, the present application provides a chip system, where the chip system includes a processor and a memory, and the memory stores instructions; when the instructions are executed by the processor, the method in any one of the possible implementations provided by the present application is implemented. The chip system may consist of a chip, or may include a chip and other discrete devices.
In the several embodiments provided in the present application, it should be understood that the disclosed electronic device and method may be implemented in other ways. For example, the device embodiments described above are merely illustrative. For example, the division into modules or units is only one type of logical functional division; in actual implementation there may be other divisions. For example, a plurality of units or components may be combined or integrated into another device, or some features may be omitted or not performed. In addition, the mutual couplings, direct couplings, or communication connections shown or discussed may be indirect couplings or communication connections through some interfaces, devices, or units, and may be in electrical, mechanical, or other forms.
The units described as separate parts may or may not be physically separate, and parts displayed as units may be one physical unit or a plurality of physical units; that is, they may be located in one place, or may be distributed over a plurality of different places. Some or all of the units may be selected according to actual needs to achieve the objectives of the solutions of the embodiments.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each of the units may exist alone physically, or two or more units may be integrated into one unit. The integrated unit may be implemented in the form of hardware, or may be implemented in the form of a software functional unit.
If the integrated unit is implemented in the form of a software functional unit and sold or used as a stand-alone product, it may be stored in a readable storage medium. Based on such an understanding, the technical solutions of the embodiments of the present application, in essence, or the part contributing to the prior art, or all or part of the technical solutions, may be embodied in the form of a software product. The software product is stored in a storage medium and includes several instructions for enabling a device (which may be a single-chip microcomputer, a chip, or the like) or a processor to execute all or part of the steps of the methods described in the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.
The above description is merely specific embodiments of the present application, but the protection scope of the present application is not limited thereto. Any variation or replacement within the technical scope disclosed in the present application shall fall within the protection scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (27)

1. A method for distributed display of an interface, the method comprising:
a first electronic device displays a first interface of a first application program, wherein the first interface comprises at least a first functional area and a second functional area;
the first electronic device detects a touch operation with a second electronic device;
in response to the touch operation, the first electronic device sends the content of the first functional area to the second electronic device for display;
the first electronic device displays a second interface of the first application program, wherein the second interface comprises the second functional area and does not comprise the first functional area.
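For illustration only (not part of the claims), the sequence of claim 1 can be sketched in a few lines of Python. All class and function names below are hypothetical stand-ins, not identifiers from the patent:

```python
# Illustrative sketch of the claim-1 flow: a first device shows an
# interface split into functional areas; on a touch operation with a
# second device it hands one area off and re-renders without it.

class Device:
    def __init__(self, name):
        self.name = name
        self.areas = []          # functional areas currently displayed

    def display(self, areas):
        self.areas = list(areas)

    def receive_area(self, area):
        # The second device displays the handed-off content.
        self.areas.append(area)

def on_touch(first, second, area_to_send):
    # In response to the touch operation, the first device sends the
    # content of one functional area to the second device for display...
    second.receive_area(area_to_send)
    # ...then displays a second interface that no longer contains it.
    first.display([a for a in first.areas if a != area_to_send])

phone, tablet = Device("first"), Device("second")
phone.display(["toolbar", "canvas"])    # first interface: two areas
on_touch(phone, tablet, "toolbar")
print(phone.areas)   # ['canvas']
print(tablet.areas)  # ['toolbar']
```

The sketch only models which device holds which area; the actual transport (e.g., the standard video stream of claim 9) is abstracted away.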
2. The method of claim 1, wherein after the first electronic device sends the content of the first functional area to the second electronic device for display, the method further comprises:
the first electronic device reallocates the display layout of the remaining functional regions on the first electronic device.
3. The method according to claim 1 or 2,
the first functional area is determined by the first electronic device according to a selection operation of a user on the functional area; or the first functional area is determined by the first electronic device according to the function implemented by the functional area on the first interface.
4. The method according to claim 1 or 2,
the first functional area is determined by the first electronic device according to a selection operation of a user on the content of the functional area; or,
the first functional area is determined by the first electronic device according to a task attribute of the content of the functional area on the first interface.
5. The method according to any of claims 1-4, wherein the first functional area and the second functional area on the first interface are laid out on the first electronic device in a pre-set relative positional relationship.
6. The method according to any one of claims 1-5, wherein the first functional area and the second functional area support adjustment of adaptation capabilities; the adaptation capabilities include: a stretching capability, a zooming capability, a hiding capability, a folding capability, a sharing capability, a proportion capability, and an extension capability.
7. The method according to any one of claims 1-6, further comprising:
in response to an operation by which a user triggers withdrawal of the collaborative display, the first electronic device withdraws the content of the first functional area from the second electronic device;
after the first electronic device withdraws the content of the first functional area from the second electronic device, the first electronic device displays the first interface of the first application program, wherein the first interface comprises at least the first functional area and the second functional area.
8. The method according to any one of claims 1-7, wherein the content of the first functional area and the content of the second functional area on the first interface are rendered jointly by the first electronic device on a virtual screen of the first electronic device in the form of one or more atomic services according to a preset frame template.
9. The method of claim 8, wherein the first electronic device sends the content of the first functional area to the second electronic device for display, and wherein the sending comprises:
the first electronic device sends a standard video stream corresponding to the content of the first functional area on the virtual screen to the second electronic device for display.
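For illustration only (not part of the claims), claims 8-9 can be sketched as follows: both areas are composed onto one virtual screen according to a preset frame template, so the pixel region of any single area can later be cropped out and streamed. The template format and every name below are assumptions for the sketch, not from the patent:

```python
# Hypothetical sketch of claims 8-9: joint rendering on a virtual
# screen per a preset frame template, then selecting one region of
# that screen as the payload of a video stream to the second device.

FRAME_TEMPLATE = {                 # slot -> (x, y, width, height)
    "first_area":  (0,   0, 400, 800),
    "second_area": (400, 0, 680, 800),
}

def render_virtual_screen(contents):
    """Place each area's content at its template slot on the virtual screen."""
    return {slot: (FRAME_TEMPLATE[slot], contents[slot]) for slot in contents}

def stream_region(virtual_screen, slot):
    """Return the rect and content that would be encoded as a video stream."""
    rect, content = virtual_screen[slot]
    return {"rect": rect, "frames": content}

screen = render_virtual_screen({"first_area": "map", "second_area": "chat"})
packet = stream_region(screen, "first_area")
print(packet["rect"])    # (0, 0, 400, 800)
```

Keeping both areas on one virtual screen means a single renderer produces the pixels, and handing off an area reduces to cropping its rect and encoding it as an ordinary video stream.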
10. A first electronic device, wherein the first electronic device comprises:
a memory for storing computer program code, the computer program code comprising instructions;
a radio frequency circuit, configured to transmit and receive wireless signals; and
a processor, configured to execute the instructions to cause the first electronic device to display a first interface of a first application program, wherein the first interface comprises at least a first functional area and a second functional area; and,
when a touch operation with a second electronic device is detected, send, in response to the touch operation, the content of the first functional area to the second electronic device for display;
wherein after the first electronic device sends the content of the first functional area to the second electronic device for display, the first electronic device displays a second interface of the first application program, wherein the second interface comprises the second functional area and does not comprise the first functional area.
11. The first electronic device according to claim 10, wherein the processor is further configured to execute the instructions to cause the first electronic device to reallocate the display layout of the remaining functional areas on the first electronic device after sending the content of the first functional area to the second electronic device for display.
12. The first electronic device according to claim 10 or 11, wherein the first functional area is determined, by the processor executing the instructions, according to a selection operation of a user on the functional area; or is determined according to the function implemented by the functional area on the first interface.
13. The first electronic device according to claim 10 or 11, wherein the first functional area is determined, by the processor executing the instructions, according to a selection operation of a user on the content of the functional area; or is determined according to a task attribute of the content of the functional area on the first interface.
14. The first electronic device of any of claims 10-13, wherein the first functional area and the second functional area on the first interface are laid out on the first electronic device in a pre-set relative positional relationship.
15. The first electronic device according to any one of claims 10-14, wherein the first functional area and the second functional area support adjustment of adaptation capabilities; the adaptation capabilities include: a stretching capability, a zooming capability, a hiding capability, a folding capability, a sharing capability, a proportion capability, and an extension capability.
16. The first electronic device according to any one of claims 10-15, wherein the processor is further configured to execute the instructions to cause the first electronic device to withdraw the content of the first functional area from the second electronic device in response to an operation by which a user triggers withdrawal of the collaborative display;
after the first electronic device withdraws the content of the first functional area from the second electronic device, the first electronic device displays the first interface of the first application program, wherein the first interface comprises at least the first functional area and the second functional area.
17. The first electronic device of any of claims 10-16,
the content of the first functional area and the content of the second functional area on the first interface are rendered jointly, by the processor executing the instructions, on a virtual screen of the first electronic device in the form of one or more atomic services according to a preset frame template.
18. The first electronic device according to claim 17, wherein the processor is further configured to execute the instructions to cause the first electronic device to send a standard video stream corresponding to the content of the first functional area on the virtual screen to the second electronic device for display.
19. A method for distributed display of an interface, the method comprising:
a first electronic device displays a first interface, wherein the first interface comprises at least a first application interface, a second application interface, and a third application interface;
the first electronic device detects a touch operation with a second electronic device;
in response to the touch operation, the first electronic device sends the content of the first application interface to the second electronic device for display;
the first electronic device displays a second interface, wherein the second interface comprises the second application interface and the third application interface and does not comprise the first application interface.
20. The method of claim 19, wherein after the first electronic device sends the content of the first application interface to the second electronic device for display, the method further comprises:
the first electronic device reallocates the display layout of the rest application interfaces on the first electronic device.
21. The method of claim 19 or 20,
the first application interface is determined by the first electronic device according to a selection operation of a user on the application interface; or the first application interface is determined by the first electronic device according to the functions implemented by the first application interface, the second application interface, and the third application interface.
22. The method according to any one of claims 19-21, wherein the windows of the first application interface, the second application interface, and the third application interface support adjustment of adaptation capabilities; the adaptation capabilities include: a stretching capability, a zooming capability, a hiding capability, a folding capability, a sharing capability, a proportion capability, and an extension capability.
23. The method according to any one of claims 19-22, further comprising:
in response to an operation by which a user triggers withdrawal of the collaborative display, the first electronic device withdraws the first application interface from the second electronic device;
after the first electronic device withdraws the first application interface from the second electronic device, the first electronic device displays the first interface, wherein the first interface comprises at least the first application interface, the second application interface, and the third application interface.
24. A first electronic device, wherein the first electronic device comprises:
a memory for storing computer program code, the computer program code comprising instructions;
a radio frequency circuit, configured to transmit and receive wireless signals;
a processor configured to execute the instructions to cause the first electronic device to implement the method of any of claims 19-23.
25. A communication system, the communication system comprising:
a first electronic device as claimed in any one of claims 10-18, or a first electronic device as claimed in claim 24; and a second electronic device.
26. A computer-readable storage medium having computer-executable instructions stored thereon which, when executed by a processing circuit, implement the method of any one of claims 1-9 or 19-23.
27. A computer program product, characterized in that it comprises program instructions which, when executed, implement the method according to any one of claims 1-9 or 19-23.
CN202011149035.3A 2020-09-29 2020-10-23 Distributed display method of interface, electronic equipment and communication system Pending CN114327324A (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
PCT/CN2021/119324 WO2022068628A1 (en) 2020-09-29 2021-09-18 Distributed display method of interface, and electronic device and communication system
EP21874289.8A EP4206897A4 (en) 2020-09-29 2021-09-18 Distributed display method of interface, and electronic device and communication system
US18/246,986 US11995370B2 (en) 2020-09-29 2021-09-18 Distributed interface display method, electronic device, and communication system

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
CN2020110505737 2020-09-29
CN202011050573 2020-09-29
CN2020110539748 2020-09-29
CN202011053974 2020-09-29

Publications (1)

Publication Number Publication Date
CN114327324A true CN114327324A (en) 2022-04-12

Family

ID=81031777

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011149035.3A Pending CN114327324A (en) 2020-09-29 2020-10-23 Distributed display method of interface, electronic equipment and communication system

Country Status (1)

Country Link
CN (1) CN114327324A (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102027450A (en) * 2008-05-20 2011-04-20 思杰系统有限公司 Methods and systems for using external display devices with a mobile computing device
US20130027404A1 (en) * 2011-07-29 2013-01-31 Apple Inc. Systems, methods, and computer-readable media for managing collaboration on a virtual work of art
CN109271121A (en) * 2018-08-31 2019-01-25 维沃移动通信有限公司 A kind of application display method and mobile terminal
CN110515576A (en) * 2019-07-08 2019-11-29 华为技术有限公司 Display control method and device
CN111078091A (en) * 2019-11-29 2020-04-28 华为技术有限公司 Split screen display processing method and device and electronic equipment



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination