CN118259995A - Cross-equipment split-screen method and related device - Google Patents


Info

Publication number
CN118259995A
CN118259995A (application CN202211695536.0A)
Authority
CN
China
Prior art keywords
task
terminal
application
window
split
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211695536.0A
Other languages
Chinese (zh)
Inventor
倪银堂 (Ni Yintang)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd
Publication of CN118259995A
Legal status: Pending


Abstract

The application discloses a cross-device split-screen method. A first terminal displays a task management interface that includes device identifiers for the first terminal and at least one cooperative device; a first area of the interface displays task identifiers of preset tasks of the currently selected terminal. After receiving a first input operation on the task identifier of a first task, the first terminal adds the first task to a split-screen window; after receiving a second input operation on the task identifier of a second task, it adds the second task to the split-screen window. The first task and the second task may each be a task of the first terminal or of any one of the cooperative devices. After receiving a third input operation, the first terminal instructs a third terminal to display the split-screen window, with the first task shown in a first split-screen area and the second task in a second split-screen area. This realizes cross-device split screen of tasks from other devices and enriches the collaborative modes of the displays of multiple devices.

Description

Cross-equipment split-screen method and related device
Technical Field
The present application relates to the field of electronic technologies, and in particular, to a cross-device split-screen method and a related device.
Background
With the continuous development of information technology, the variety of terminals keeps growing, and users interact across terminals more and more. Because different terminals generally have displays of different sizes, and displays of different sizes present different visual effects, multi-screen collaboration between terminals is widely used.
Current multi-screen collaboration technology can project the display content of one terminal onto the screen of another. However, the collaboration modes currently available between the displays of multiple devices are limited, and the user experience needs improvement.
Disclosure of Invention
The application provides a cross-device split-screen method and a related device, which can perform cross-device split screen, migrate multiple tasks from at least one device at one time, enrich the collaborative modes of the displays of multiple devices, and improve the user experience.
In a first aspect, the present application provides a cross-device split-screen method applied to a first terminal, where the first terminal has at least one cooperative device, and the at least one cooperative device includes a third terminal. The method includes the following steps: the first terminal displays a task management interface, where the task management interface includes device identifiers corresponding to the first terminal and to each of the at least one cooperative device; the display state of a device identifier is either selected or unselected, where the selected state indicates that the terminal corresponding to the device identifier is currently selected; a first area of the task management interface displays task identifiers corresponding to at least one preset task of the currently selected terminal. The first terminal receives a first input operation on the task identifier of a first task in the first area and, in response, adds the first task to a split-screen window; the first terminal receives a second input operation on the task identifier of a second task in the first area and, in response, adds the second task to the split-screen window; the first task and the second task are each a task of the first terminal or of any one of the at least one cooperative device. The first terminal then receives a third input operation that designates the third terminal as the target device of the cross-device split screen and, in response, sends a first instruction to the third terminal. The first instruction instructs the third terminal to display the split-screen window, where a first split-screen area of the split-screen window displays the first task and a second split-screen area displays the second task.
By implementing this embodiment of the application, the first terminal can add at least two tasks from the local terminal and/or any cooperative device to the split-screen window and stream the split-screen window to any cooperative device (such as the third terminal) for display. This realizes cross-device split screen of tasks from other devices on the third terminal and migrates multiple tasks from at least one device at one time, which enriches the collaborative modes of the displays of multiple devices, improves the efficiency with which users use their screens, and improves the user experience.
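As a rough illustration only (not part of the patent; every class, field, and device name below is a hypothetical assumption), the claimed flow on the first terminal — accumulate tasks into a split-screen window, then instruct the target terminal — can be sketched as:

```python
class SplitScreenWindow:
    """Hypothetical model of the cross-device split-screen window."""

    def __init__(self):
        # Each entry: (source_device, task_id). Tasks may come from the
        # local terminal or from any cooperative device.
        self.tasks = []

    def add_task(self, source_device, task_id):
        # Corresponds to the first/second input operations on a task identifier.
        self.tasks.append((source_device, task_id))

    def build_instruction(self, target_device):
        # Corresponds to the third input operation: instruct the target
        # terminal to display the window, one split-screen area per task.
        return {
            "target": target_device,
            "areas": [
                {"area": i + 1, "source": src, "task": tid}
                for i, (src, tid) in enumerate(self.tasks)
            ],
        }

# First terminal adds a local task and a task of a cooperative device,
# then streams the window to the third terminal.
win = SplitScreenWindow()
win.add_task("tablet", "document_interface_1")
win.add_task("phone", "chat_interface_1")
instruction = win.build_instruction("pc")
```

The one-to-one mapping between tasks and split-screen areas in `build_instruction` mirrors the N-tasks/N-areas correspondence described later in the description.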
In one implementation, the task management interface further includes a first control. The first input operation includes long-pressing the task identifier of the first task and then dragging it to the first control; the second input operation includes long-pressing the task identifier of the second task and then dragging it to the first control. By implementing this embodiment, a task can be added to the cross-device split-screen window by dragging its task identifier to the first control, an operation that is simple, intuitive, and engaging, improving the user experience.
In one implementation, the third input operation includes long-pressing the first control and then dragging it to the device identifier of the third terminal. By implementing this embodiment, the target device can be triggered to display the cross-device split-screen window by dragging the first control to the device identifier of the target device, an operation that is simple, intuitive, and engaging, improving the user experience.
In one implementation, the third terminal is configured with at least two display screens, and the method further includes: when the first terminal detects that the first control is dragged to the device identifier of the third terminal, it displays the display-screen identifiers corresponding to the display screens of the third terminal. The third input operation includes dragging the first control onto the display-screen identifier of a first display screen, in which case the first instruction instructs the third terminal to display the split-screen window on the first display screen only, where the first display screen is one of the at least two display screens. By implementing this embodiment, when the target device has multiple display screens, a specific display screen can be selected to show the cross-device split-screen window. This meets the needs of users who want to split the screen across devices on a particular display, improving the user experience.
In one implementation, the at least one cooperative device further includes a second terminal, and the first task is a task of the second terminal. Before the first terminal receives the first input operation on the task identifier of the first task in the first area, the method further includes: the first terminal receives a fourth input operation on the device identifier of the second terminal; in response, the first terminal switches the display state of the device identifier of the second terminal from unselected to selected, and the first area then displays task identifiers corresponding to at least one preset task of the second terminal, where the at least one preset task of the second terminal includes the first task. By implementing this embodiment, the preset tasks of the local terminal and of the cooperative devices can be viewed through the task management interface, so the user can conveniently pick a preset task of any device for cross-device split screen, improving the user experience.
In one implementation, the preset tasks include some or all of the following: recently run tasks, preset application windows of installed applications, and collection tasks. The recently run tasks include background tasks and/or foreground tasks. The collection tasks include some or all of the following: tasks manually collected by the user, the M tasks with the longest historical run time within a preset duration, and the M tasks run most frequently within a preset duration, where M is a positive integer. By implementing this embodiment, cross-device split screen can be performed on recently run tasks, preset application windows of installed applications, and collection tasks, meeting users' diverse split-screen needs and improving the user experience.
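For instance, the automatically collected "M tasks run most frequently within a preset duration" could be computed as follows (a minimal sketch; the helper name and log format are assumptions, not from the patent):

```python
from collections import Counter

def auto_collect_tasks(run_log, m):
    # run_log: one task id per recorded run within the preset duration.
    # Returns the m task ids with the highest run frequency.
    counts = Counter(run_log)
    return [task for task, _ in counts.most_common(m)]

# Example: "chat" ran 3 times, "doc" twice, "mail" once.
log = ["mail", "chat", "chat", "doc", "chat", "doc"]
top_two = auto_collect_tasks(log, 2)  # ["chat", "doc"]
```

The M tasks with the longest historical run time could be picked the same way by accumulating durations instead of run counts.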
In one implementation, the preset application window includes the home page of the application, the application window of the application run most frequently within a preset duration, or the application window of the application with the longest historical run time within the preset duration. By implementing this embodiment, cross-device split screen can be performed on a preset application window (such as the home page) of an application through the application identifier (such as the application icon) of that application. The preset application window may be set by the first terminal or by the user. This meets users' diverse split-screen needs and improves the user experience.
In one implementation, the preset tasks include recently run tasks and preset application windows of installed applications, and the first task is a preset application window of an application. Before the first terminal receives the first input operation on the task identifier of the first task in the first area, the first terminal displays, in the first area of the task management interface, the task identifiers corresponding to the recently run tasks of the selected device, and the method further includes: receiving a fifth input operation; in response, the first terminal displays in the first area the application identifier of at least one application installed on the selected device, where an application identifier serves as the task identifier of the preset application window of that application, and the at least one application includes the application to which the first task belongs. By implementing this embodiment, the recently run tasks of the selected device are displayed first, and the user can switch to the application identifiers of the selected device's applications through a specific operation. Viewing preset tasks grouped by type in this way keeps the interface clear and improves the user experience.
In one implementation, when the first terminal displays the task identifiers corresponding to the recently run tasks of the selected device in the first area of the task management interface, a second control is also displayed in the task management interface, and the fifth input operation acts on the second control.
In one implementation, the preset tasks further include collection tasks. When the collection tasks of the currently selected device include a task of a first application, the first terminal displays a collection marker on the application identifier of the first application, indicating that the first application has at least one collection task. The method further includes: the first terminal receives a sixth input operation on the application identifier of the first application; in response, the first terminal displays in the first area the task identifiers corresponding to the collection tasks of the first application installed on the selected device. By implementing this embodiment, collection tasks are grouped by application, and applications that have collection tasks are indicated by the collection marker. Viewing collection tasks grouped by application in this way keeps the interface clear and improves the user experience.
In one implementation, the preset tasks include recently run tasks, and the task identifier corresponding to a recently run task is a task card that includes the application identifier of the application to which the task belongs and an interface snapshot captured when the task was switched to the background.
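A task card as described here might be modeled as follows (field names are hypothetical illustrations, not from the patent):

```python
from dataclasses import dataclass

@dataclass
class TaskCard:
    # Task identifier for a recently run task: the identifier of the
    # owning application plus an interface snapshot captured when the
    # task was switched to the background.
    app_id: str
    snapshot_png: bytes

card = TaskCard(app_id="com.example.chat", snapshot_png=b"\x89PNG...")
```

Rendering such a card would show the app icon resolved from `app_id` over the decoded snapshot image.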
In one implementation, the first task and the second task are both tasks of the first terminal. This meets the scenario in which a user wants to split-screen two tasks of the first terminal across devices on the third terminal.
In one implementation, the at least one cooperative device further includes a second terminal, the first task is a task of the first terminal, and the second task is a task of the second terminal. This meets the scenario in which a user wants to split-screen a task of the first terminal and a task of the second terminal across devices on the third terminal.
In one implementation, before displaying in the first area the task identifiers corresponding to at least one preset task of the second terminal, the method further includes: the first terminal acquires multi-task data of the second terminal, where the multi-task data includes task information of at least one preset task of the second terminal.
In a second aspect, the present application provides a communications apparatus comprising one or more processors and one or more memories. The one or more memories are coupled to the one or more processors, the one or more memories being configured to store computer program code comprising computer instructions that, when executed by the one or more processors, cause the communications apparatus to perform the cross-device split-screen method in any of the possible implementations of the first aspect described above.
In a third aspect, an embodiment of the present application provides a computer storage medium including computer instructions that, when run on an electronic device, cause the electronic device to perform the cross-device split-screen method in any of the possible implementations of the first aspect.
In a fourth aspect, embodiments of the present application provide a computer program product, which when run on a computer causes the computer to perform the cross-device split-screen method in any one of the possible implementations of the first aspect.
Drawings
Fig. 1 is a schematic view of a cross-device split screen provided in an embodiment of the present application;
Fig. 2 is a schematic diagram of a cross-device split-screen system according to an embodiment of the present application;
Fig. 3 is a schematic structural diagram of a terminal according to an embodiment of the present application;
Figs. 4A to 4D are schematic interface diagrams of a task management interface according to an embodiment of the present application;
Figs. 5A to 5D are schematic interface diagrams of adding tasks to the cross-device split-screen window according to an embodiment of the present application;
Figs. 5E to 5G are schematic interface diagrams of triggering the target device to display the cross-device split-screen window according to an embodiment of the present application;
Figs. 6A to 6F show various display forms of the application windows in the split-screen window according to an embodiment of the present application;
Fig. 7A is a schematic interface diagram of a cross-device split-screen widget according to an embodiment of the present application;
Fig. 7B is a schematic interface diagram of a target device configured with multiple display screens according to an embodiment of the present application;
Figs. 8A to 8C are schematic diagrams of a specific implementation of cross-device split screen according to an embodiment of the present application;
Figs. 9A to 9C are schematic interface views of a cross-device split screen of three tasks according to an embodiment of the present application;
Fig. 10A is a schematic interface diagram of previewing the cross-device split screen according to an embodiment of the present application;
Figs. 10B to 10G are schematic diagrams of adjusting the display positions and display sizes of the tasks in the split-screen window according to an embodiment of the present application;
Fig. 11A is a schematic interface diagram of the target device controlling tasks in the split-screen window according to an embodiment of the present application;
Figs. 11B to 11G are schematic interface views of the target device managing tasks in the split-screen window according to an embodiment of the present application;
Figs. 12A to 12D are schematic interface views of cross-device split screen of an application home page according to an embodiment of the present application;
Figs. 13A to 13C are schematic interface views of cross-device split screen of a collection task according to an embodiment of the present application;
Figs. 14A to 14C are schematic interface views of viewing applications and collection tasks according to an embodiment of the present application.
Detailed Description
The technical solutions of the embodiments of the present application will be described clearly and thoroughly below with reference to the accompanying drawings. In the description of the embodiments of the present application, unless otherwise indicated, "/" means "or"; for example, A/B may represent A or B. The term "and/or" merely describes an association relation between the associated objects and indicates that three relations may exist; for example, "A and/or B" may indicate the three cases where A exists alone, A and B exist together, or B exists alone. Furthermore, in the description of the embodiments of the present application, "plural" means two or more.
The terms "first," "second," and the like are used below for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features. Thus, a feature qualified by "first" or "second" may explicitly or implicitly include one or more such features. In the description of the embodiments of the application, unless otherwise indicated, "a plurality" means two or more.
The display content of terminal 1 can be shown on the screen of terminal 2 using multi-screen collaboration technology. Currently, the following scenario requirements exist: scenario requirement 1, in which the user wants to view user interface 1 of terminal 1 and user interface 2 of terminal 2 on terminal 3; and scenario requirement 2, in which the user wants to view user interface 1 and user interface 3 of terminal 1 on terminal 3.
In one possible scheme, for scenario requirement 1, the user can open user interface 1 of terminal 1 and trigger terminal 1 to project user interface 1 onto the screen of terminal 3; the user then opens user interface 2 of terminal 2 and triggers terminal 2 to project user interface 2 onto the screen of terminal 3. In this way, the user can view user interface 1 of terminal 1 and user interface 2 of terminal 2 on terminal 3. However, this scheme has the following problems: user interface 2 may obscure user interface 1, so the user cannot view both at the same time; the user must operate terminal 1 and terminal 2 separately to perform the projection operations, which is cumbersome; and terminal 3 may not support receiving projections from terminal 1 and terminal 2 simultaneously.
In another possible scheme, for scenario requirement 2, the user can operate terminal 1 to display user interface 1 and user interface 3 simultaneously, and then mirror the display content of terminal 1 to terminal 3. In this way, the user can view user interface 1 and user interface 3 of terminal 1 on terminal 3. However, this scheme has the following problems: with mirror projection, the display contents of terminal 1 and terminal 3 are identical, so while viewing user interface 1 and user interface 3 on terminal 3, the user cannot normally use terminal 1 to display other content; and terminal 1 may not support displaying two user interfaces simultaneously.
In the cross-device split-screen scheme provided by the embodiment of the application, terminal 1 can configure a cross-device split-screen window. When terminal 1 adds N tasks to the split-screen window, the window is divided into N split-screen areas that display the N tasks respectively, where N is a positive integer. For example, terminal 1 adds user interface 1 of the local device and user interface 2 of the cooperatively connected terminal 2 to the split-screen window; terminal 1 can then stream the split-screen window to the cooperatively connected terminal 3 for display. When user interface 1 of terminal 1 and user interface 2 of terminal 2 are added to the split-screen window, scenario requirement 1 is met; when user interface 1 and user interface 3 of terminal 1 are added, scenario requirement 2 is met. In this scheme, terminal 1 is the control device of the cross-device split screen, terminal 3 is the target device, the source device of user interface 1 is terminal 1, and the source device of user interface 2 is terminal 2.
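As a simple illustration of "divided into N split-screen areas" (one equal-width side-by-side layout; the patent's figures show several other arrangements, and the function name and pixel geometry here are assumptions):

```python
def split_areas(width, height, n):
    # Divide a split-screen window of the given pixel size into n
    # equal-width vertical areas, one per task.
    area_w = width // n
    return [{"x": i * area_w, "y": 0, "w": area_w, "h": height}
            for i in range(n)]

# Two tasks -> two side-by-side 600x800 areas in a 1200x800 window.
areas = split_areas(1200, 800, 2)
```

With N = 2, the first area would hold, say, user interface 1 of terminal 1 and the second area user interface 2 of terminal 2.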
Illustratively, referring to fig. 1, the devices cooperatively connected with the tablet computer include a mobile phone and a personal computer (PC). The Word application of the tablet computer includes document interface 1, and the instant messaging application of the mobile phone includes chat interface 1. The tablet computer can acquire task information of chat interface 1 of the mobile phone, add chat interface 1 of the mobile phone and document interface 1 of the local device to the cross-device split-screen window, and stream the split-screen window to the PC for display, thereby realizing cross-device split screen on the PC.
Split screen in the prior art is limited to splitting the screen among applications of the local device. The scheme provided by the embodiment of the application can display at least one application from at least one other device in the cross-device split-screen window, migrating multiple tasks from at least one device at one time. This enriches the collaborative modes of the displays of multiple devices, improves the efficiency with which users use their screens, and improves the user experience.
The following describes a cross-device split-screen system 10 related to a cross-device split-screen method provided by an embodiment of the present application.
Fig. 2 schematically illustrates a cross-device split-screen system 10 provided in an embodiment of the present application. As shown in fig. 2, the cross-device split-screen system 10 includes a terminal 100 configured with a display device and at least one terminal currently cooperatively connected with the terminal 100 and configured with a display device (also simply called a cooperative device), such as a terminal 200 and a terminal 300. In some embodiments, the terminals in the cross-device split-screen system 10 are cooperative devices of one another.
The display device may be the terminal's own display screen or an external display device (for example, an external expansion screen or an external projector), which is not limited herein. For example, the terminals in the cross-device split-screen system 10 may be cell phones, tablet computers, desktop computers, laptop computers, handheld computers, notebook computers, ultra-mobile personal computers (UMPCs), netbooks, personal digital assistants (PDAs), augmented reality (AR) devices, virtual reality (VR) devices, artificial intelligence (AI) devices, wearable devices (such as smart bracelets), vehicle-mounted devices, projectors, smart home devices (smart televisions, smart screens, large-screen devices, etc.), and/or smart city devices; the specific types of the terminals are not particularly limited by the embodiments of the present application. The "terminal" in the embodiments of the present application may also be called a "terminal device" or an "electronic device". In the cross-device split-screen system 10, any two terminals may be of the same or different types, and may run the same or different operating systems, for example, iOS, Android, Microsoft Windows, or HarmonyOS.
In the embodiment of the present application, the terminal 100 may acquire and display the task identifiers (e.g., task cards, application identifiers) corresponding to the preset tasks of the local device and of any current cooperative device of the terminal 100 (e.g., the terminal 200). The preset tasks include, for example, some or all of the tasks the terminal 200 has recently run (foreground and/or background tasks), tasks the user has manually collected, tasks the terminal has automatically collected, and the home pages of applications installed on the terminal. Through these task identifiers, the terminal 100 may add at least two tasks to the cross-device split-screen window; the at least two tasks may include tasks on the terminal 100 as well as tasks on any cooperative device of the terminal 100 (for example, the terminal 200 and the terminal 300). The terminal 100 may then stream the display content of the cross-device split-screen window to the terminal 300 for display. When N tasks are added to the cross-device split-screen window, the window displayed by the terminal 300 includes N split-screen areas in one-to-one correspondence with the N tasks. Through this cross-device split-screen scheme, multiple tasks from at least one device are migrated at one time.
In the embodiment of the present application, establishing a cooperative connection with the terminal 100 means establishing a mutual-trust relationship with the terminal 100. In some embodiments, taking the terminal 300 as an example, for the terminal 100 and the terminal 300 to establish a cooperative connection, some or all of the following conditions (1) to (3) must be satisfied. (1) The terminal 100 and the terminal 300 access the same preset network (for example, a Wi-Fi local area network or a Bluetooth mesh network), and devices in the preset network are mutually trusted. (2) The terminal 100 and the terminal 300 log in to the same account (for example, a Huawei account), to accounts belonging to the same group (for example, the same family account), or to accounts with a preset binding relationship (for example, friends). (3) The terminal 100 and the terminal 300 establish a mutual-trust relationship in another way; for example, the terminal 300 is connected to a hotspot shared by the terminal 100. The communication process and the related user interfaces for establishing the cooperative connection between the terminal 100 and the terminal 300 are not particularly limited in the embodiments of the present application. In some embodiments, when the terminal 100 and the terminal 300 satisfy some or all of conditions (1) to (3), the two devices have automatically established a cooperative connection.
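The three mutual-trust conditions could be checked roughly like this (a sketch over assumed device-record fields; none of the names come from the patent):

```python
def is_trusted(dev_a, dev_b):
    # Condition (1): both devices are on the same preset network.
    same_network = (dev_a.get("network") is not None
                    and dev_a.get("network") == dev_b.get("network"))
    # Condition (2): same account (group/binding checks omitted for brevity).
    same_account = (dev_a.get("account") is not None
                    and dev_a.get("account") == dev_b.get("account"))
    # Condition (3): another trust relation, e.g. dev_b joined dev_a's hotspot.
    hotspot = dev_b.get("hotspot_host") == dev_a.get("id")
    return same_network or same_account or hotspot

tablet = {"id": "tablet-100", "network": "home-wifi", "account": "user@example.com"}
pc = {"id": "pc-300", "network": "home-wifi", "account": None}
```

Here `is_trusted(tablet, pc)` holds via condition (1), since both records share the same network.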
In some embodiments, after the terminal 100 discovers a terminal 300 satisfying the above conditions and establishes a communication connection with it, the terminal 100 may send a cooperative connection request to the terminal 300. Based on the request, the terminal 300 may feed back a cooperative connection response to the terminal 100, confirming that the cooperative connection is established. The cooperative connection response may carry cooperative parameters indicating, for example, the display size of the terminal 300 and whether it supports cross-device split screen. When the terminal 100 receives the cooperative connection response, the two devices have established a cooperative connection.
In some embodiments, taking the terminal 300 as an example, the terminal 100 and the terminal 300 may be directly connected through a short-range wireless communication connection or a local wired connection. By way of example, the terminal 100 and the terminal 300 may each have one or more of a wireless fidelity (Wi-Fi) communication module, an ultra-wideband (UWB) communication module, an infrared communication module, a Bluetooth communication module, a near-field communication (NFC) module, and a ZigBee communication module. Taking the terminal 100 as an example, the terminal 100 may detect and scan nearby electronic devices by transmitting signals through a short-range communication module (e.g., the Bluetooth communication module), so that the terminal 100 can discover nearby electronic devices (e.g., the terminal 300), establish wireless communication connections with them through a short-range wireless communication protocol, and transmit data to them.
In some embodiments, referring to fig. 2, the terminal 100 and the terminal 300 may also communicate indirectly through at least one electronic device in a communication network, the communication network including a local area network (LAN) and/or a wide area network (WAN). In one implementation, the terminal 100 and the terminal 300 may be connected to a local area network through at least one electronic device 300 based on a wired or wireless fidelity (Wi-Fi) connection. For example, the electronic device 300 may include a third-party device such as a router, a gateway, or a smart device controller. In one implementation, the terminal 100 and the terminal 300 may also be indirectly connected through at least one electronic device 400 in a wide area network (for example, the Internet). For example, the electronic device 400 may include one or more hardware servers, cloud servers embedded in a virtualized environment, and the like. It can be understood that the terminal 100 and the terminal 300 may be indirectly connected for wireless communication and data transmission via the electronic device 300 and/or the electronic device 400.
Illustratively, the cross-device split-screen system 10 includes a tablet computer 100, and a mobile phone 200 and a PC 300 cooperatively connected with the tablet computer 100; the following description uses this as an example, which should not be construed as limiting the embodiments of the present application. In the embodiment of the present application, the user may perform input operations on the tablet computer 100, the mobile phone 200, and the PC 300 using one or more input modes such as voice, gesture, finger, stylus, and mouse, which is not particularly limited in the embodiment of the present application.
It will be appreciated that the configuration shown in this embodiment does not constitute a particular limitation of the cross-device split-screen system 10. In other embodiments of the present application, cross-device split-screen system 10 may include more or fewer devices than shown.
Fig. 3 shows a schematic structure of the terminal 100.
The terminal 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (universal serial bus, USB) interface 130, a charge management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, keys 190, a motor 191, an indicator 192, a camera 193, a display 194, and a subscriber identity module (subscriber identification module, SIM) card interface 195, etc. The sensor module 180 may include a pressure sensor 180A, a gyro sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
It should be understood that the structure illustrated in the embodiments of the present application does not constitute a specific limitation on the terminal 100. In other embodiments of the present application, the terminal 100 may include more or fewer components than shown, or some components may be combined, or some components may be split, or the components may be arranged differently. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor 110 may include one or more processing units. For example, the processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU). The different processing units may be separate devices or may be integrated in one or more processors.
The controller may generate an operation control signal according to an instruction operation code and a timing signal, to complete control of instruction fetching and instruction execution.
A memory may also be disposed in the processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache. The memory may hold instructions or data that the processor 110 has just used or uses cyclically. If the processor 110 needs to use the instructions or data again, it can call them directly from the memory. This avoids repeated accesses and reduces the waiting time of the processor 110, thereby improving system efficiency.
In some embodiments, the processor 110 may include one or more interfaces. The interfaces may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a universal serial bus (USB) interface, among others.
The I2C interface is a bi-directional synchronous serial bus comprising a serial data line (SDA) and a serial clock line (SCL). In some embodiments, the processor 110 may contain multiple sets of I2C buses. The processor 110 may be coupled to the touch sensor 180K, a charger, a flash, the camera 193, and the like, respectively, through different I2C bus interfaces. For example, the processor 110 may be coupled to the touch sensor 180K through an I2C interface, so that the processor 110 and the touch sensor 180K communicate through the I2C bus interface to implement a touch function of the terminal 100.
The I2S interface may be used for audio communication. In some embodiments, the processor 110 may contain multiple sets of I2S buses. The processor 110 may be coupled to the audio module 170 via an I2S bus to enable communication between the processor 110 and the audio module 170. In some embodiments, the audio module 170 may transmit an audio signal to the wireless communication module 160 through the I2S interface, to implement a function of answering a call through a Bluetooth headset.
The PCM interface may also be used for audio communication, to sample, quantize, and encode an analog signal. In some embodiments, the audio module 170 and the wireless communication module 160 may be coupled through a PCM bus interface. In some embodiments, the audio module 170 may also transmit an audio signal to the wireless communication module 160 through the PCM interface, to implement a function of answering a call through a Bluetooth headset. Both the I2S interface and the PCM interface may be used for audio communication.
The UART interface is a universal serial data bus used for asynchronous communication. The bus may be a bi-directional communication bus. It converts the data to be transmitted between serial communication and parallel communication. In some embodiments, the UART interface is typically used to connect the processor 110 with the wireless communication module 160. For example, the processor 110 communicates with a Bluetooth module in the wireless communication module 160 through the UART interface to implement a Bluetooth function. In some embodiments, the audio module 170 may transmit an audio signal to the wireless communication module 160 through the UART interface, to implement a function of playing music through a Bluetooth headset.
The MIPI interface may be used to connect the processor 110 to peripheral devices such as the display 194 and the camera 193. The MIPI interfaces include a camera serial interface (CSI), a display serial interface (DSI), and the like. In some embodiments, the processor 110 and the camera 193 communicate through the CSI interface to implement the photographing function of the terminal 100. The processor 110 and the display 194 communicate through the DSI interface to implement the display function of the terminal 100.
The GPIO interface may be configured by software. The GPIO interface may be configured as a control signal or as a data signal. In some embodiments, a GPIO interface may be used to connect the processor 110 with the camera 193, the display 194, the wireless communication module 160, the audio module 170, the sensor module 180, and the like. The GPIO interface may also be configured as an I2C interface, an I2S interface, a UART interface, an MIPI interface, etc.
The USB interface 130 is an interface conforming to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type-C interface, or the like. The USB interface 130 may be used to connect a charger to charge the terminal 100, or may be used to transfer data between the terminal 100 and a peripheral device. It may also be used to connect a headset and play audio through the headset. The interface may also be used to connect other electronic devices, such as AR devices.
It should be understood that the interfacing relationship between the modules illustrated in the embodiment of the present application is only illustrative, and does not limit the structure of the terminal 100. In other embodiments of the present application, the terminal 100 may also use an interfacing manner different from those in the above embodiments, or a combination of multiple interfacing manners.
The charge management module 140 is configured to receive a charge input from a charger. The charger can be a wireless charger or a wired charger. In some wired charging embodiments, the charge management module 140 may receive a charging input of a wired charger through the USB interface 130. In some wireless charging embodiments, the charge management module 140 may receive wireless charging input through a wireless charging coil of the terminal 100. The charging management module 140 may also supply power to the electronic device through the power management module 141 while charging the battery 142.
The power management module 141 is used for connecting the battery 142, and the charge management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140 to power the processor 110, the internal memory 121, the display 194, the camera 193, the wireless communication module 160, and the like. The power management module 141 may also be configured to monitor battery capacity, battery cycle number, battery health (leakage, impedance) and other parameters. In other embodiments, the power management module 141 may also be provided in the processor 110. In other embodiments, the power management module 141 and the charge management module 140 may be disposed in the same device.
The wireless communication function of the terminal 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in terminal 100 may be configured to cover a single or multiple communication bands. Different antennas may also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed into a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 may provide a solution including 2G/3G/4G/5G wireless communication applied to the terminal 100. The mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (low noise amplifier, LNA), etc. The mobile communication module 150 may receive electromagnetic waves from the antenna 1, perform processes such as filtering, amplifying, and the like on the received electromagnetic waves, and transmit the processed electromagnetic waves to the modem processor for demodulation. The mobile communication module 150 can amplify the signal modulated by the modem processor, and convert the signal into electromagnetic waves through the antenna 1 to radiate. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be provided in the same device as at least some of the modules of the processor 110.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating the low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then transmits the demodulated low frequency baseband signal to the baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then transferred to the application processor. The application processor outputs sound signals through an audio device (not limited to the speaker 170A, the receiver 170B, etc.), or displays images or video through the display screen 194. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be provided in the same device as the mobile communication module 150 or other functional module, independent of the processor 110.
The wireless communication module 160 may provide solutions for wireless communication applied on the terminal 100, including wireless local area network (WLAN) (for example, wireless fidelity (Wi-Fi) network), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR), and the like. The wireless communication module 160 may be one or more devices that integrate at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, demodulates and filters the electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, frequency modulate and amplify it, and convert it to electromagnetic waves for radiation via the antenna 2.
In some embodiments, the antenna 1 and the mobile communication module 150 of the terminal 100 are coupled, and the antenna 2 and the wireless communication module 160 are coupled, so that the terminal 100 may communicate with a network and other devices via wireless communication technologies. The wireless communication technologies may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technologies, among others. The GNSS may include a global positioning system (GPS), a global navigation satellite system (GLONASS), a BeiDou navigation satellite system (BDS), a quasi-zenith satellite system (QZSS), and/or a satellite based augmentation system (SBAS).
Terminal 100 implements display functions via a GPU, display 194, and application processor, etc. The GPU is a microprocessor for image processing, and is connected to the display 194 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
The display screen 194 is used to display images, videos, and the like. The display 194 includes a display panel. The display panel may employ a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), or the like. In some embodiments, the terminal 100 may include 1 or N display screens 194, N being a positive integer greater than 1.
The terminal 100 may implement photographing functions through an ISP, a camera 193, a video codec, a GPU, a display 194, an application processor, and the like.
The ISP is used to process data fed back by the camera 193. For example, when photographing, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, the optical signal is converted into an electric signal, and the camera photosensitive element transmits the electric signal to the ISP for processing and is converted into an image visible to naked eyes. ISP can also perform algorithm optimization on noise and brightness of the image. The ISP can also optimize parameters such as exposure, color temperature and the like of a shooting scene. In some embodiments, the ISP may be provided in the camera 193.
The camera 193 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image onto the photosensitive element. The photosensitive element may be a charge coupled device (charge coupled device, CCD) or a Complementary Metal Oxide Semiconductor (CMOS) phototransistor. The photosensitive element converts the optical signal into an electrical signal, which is then transferred to the ISP to be converted into a digital image signal. The ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into an image signal in a standard RGB, YUV, or the like format. In some embodiments, terminal 100 may include 1 or N cameras 193, N being a positive integer greater than 1.
The digital signal processor is used to process digital signals, and can process other digital signals in addition to digital image signals. For example, when the terminal 100 selects a frequency bin, the digital signal processor is used to perform a Fourier transform on the frequency bin energy, and the like.
Video codecs are used to compress or decompress digital video. The terminal 100 may support one or more video codecs. In this way, the terminal 100 may play or record videos in multiple encoding formats, such as moving picture experts group (MPEG)-1, MPEG-2, MPEG-3, and MPEG-4.
The NPU is a neural-network (NN) computing processor, and can rapidly process input information by referencing a biological neural network structure, for example, referencing a transmission mode between human brain neurons, and can also continuously perform self-learning. Applications such as intelligent cognition of the terminal 100 can be implemented by the NPU, for example: image recognition, face recognition, speech recognition, text understanding, etc.
The internal memory 121 may include one or more random access memories (random access memory, RAM) and one or more non-volatile memories (NVM).
The random access memory may include a static random-access memory (SRAM), a dynamic random-access memory (DRAM), a synchronous dynamic random-access memory (SDRAM), a double data rate synchronous dynamic random-access memory (DDR SDRAM; for example, fifth-generation DDR SDRAM is commonly referred to as DDR5 SDRAM), and the like. The nonvolatile memory may include a disk storage device and a flash memory.
The flash memory may be divided according to operating principle into NOR flash, NAND flash, 3D NAND flash, and the like; divided according to memory cell potential levels into single-level cell (SLC), multi-level cell (MLC), triple-level cell (TLC), quad-level cell (QLC), and the like; and divided according to storage specification into universal flash storage (UFS), embedded multimedia card (eMMC), and the like.
The random access memory may be read directly from and written to by the processor 110, may be used to store executable programs (e.g., machine instructions) for an operating system or other recently running programs, may also be used to store data for users and applications, and the like.
The nonvolatile memory may store executable programs, store data of users and applications, and the like, and may be loaded into the random access memory in advance for the processor 110 to directly read and write.
The external memory interface 120 may be used to connect an external nonvolatile memory, to extend the storage capability of the terminal 100. The external nonvolatile memory communicates with the processor 110 through the external memory interface 120 to implement a data storage function. For example, files such as music and videos are stored in the external nonvolatile memory.
The terminal 100 may implement audio functions through an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, an application processor, and the like. Such as music playing, recording, etc.
The audio module 170 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be disposed in the processor 110, or a portion of the functional modules of the audio module 170 may be disposed in the processor 110.
The speaker 170A, also referred to as a "horn," is used to convert audio electrical signals into sound signals.
The receiver 170B, also referred to as an "earpiece", is used to convert an audio electrical signal into a sound signal.
The microphone 170C, also referred to as a "mike" or "mic", is used to convert a sound signal into an electrical signal.
The earphone interface 170D is used to connect a wired earphone.
The pressure sensor 180A is used to sense a pressure signal, and may convert the pressure signal into an electrical signal. In some embodiments, the pressure sensor 180A may be disposed on the display screen 194. The pressure sensor 180A is of various types, such as a resistive pressure sensor, an inductive pressure sensor, a capacitive pressure sensor, and the like.
The gyro sensor 180B may be used to determine a motion gesture of the terminal 100. In some embodiments, the angular velocity of terminal 100 about three axes (i.e., x, y, and z axes) may be determined by gyro sensor 180B.
The air pressure sensor 180C is used to measure air pressure.
The magnetic sensor 180D includes a hall sensor.
The acceleration sensor 180E may detect the magnitude of acceleration of the terminal 100 in various directions (typically three axes).
A distance sensor 180F for measuring a distance. The terminal 100 may measure the distance by infrared or laser.
The proximity light sensor 180G may include, for example, a Light Emitting Diode (LED) and a light detector, such as a photodiode. The light emitting diode may be an infrared light emitting diode.
The ambient light sensor 180L is used to sense ambient light level. The terminal 100 may adaptively adjust the brightness of the display 194 according to the perceived ambient light level.
The fingerprint sensor 180H is used to collect a fingerprint.
The temperature sensor 180J is for detecting temperature. In some embodiments, terminal 100 performs a temperature processing strategy using the temperature detected by temperature sensor 180J.
The touch sensor 180K, also referred to as a "touch device". The touch sensor 180K may be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 form a touch screen, which is also called a "touch screen". The touch sensor 180K is for detecting a touch operation acting thereon or thereabout. The touch sensor may communicate the detected touch operation to the application processor to determine the touch event type. Visual output related to touch operations may be provided through the display 194. In other embodiments, the touch sensor 180K may be disposed on the surface of the terminal 100 at a different location than the display 194.
The bone conduction sensor 180M may acquire a vibration signal.
The keys 190 include a power-on key, a volume key, etc. The terminal 100 may receive key inputs, generating key signal inputs related to user settings and function controls of the terminal 100.
The motor 191 may generate a vibration cue.
The indicator 192 may be an indicator light that may be used to indicate a state of charge, a change in charge, a message, a missed call, etc.
The SIM card interface 195 is used to connect a SIM card.
The term "user interface (UI)" in the following embodiments of the present application is an interface for interaction and information exchange between an application program or an operating system and a user. A user interface is implemented by source code written in a specific computer language such as Java or the extensible markup language (XML); the interface source code is parsed and rendered on the terminal device, and is finally presented as content that the user can recognize. A commonly used presentation form of a user interface is a graphical user interface (GUI), which refers to a graphically displayed user interface related to computer operations. It may include visual interface elements such as text, icons, buttons, menus, tabs, text boxes, dialog boxes, status bars, navigation bars, and widgets displayed in the display screen of the terminal device. In the embodiments of the present application, a "user interface" may also be simply referred to as a "page" or an "interface".
An exemplary graphical user interface implemented on tablet computer 100 provided by embodiments of the present application is described below.
Fig. 4A illustrates a main interface 11 for displaying applications installed on the tablet computer 100. The main interface 11 may include: a status bar 201, a calendar indicator, a display area 202 of application icons, and a navigation bar 203. Wherein:
The display area 202 of the application icons may show icons of Gallery, Word, Browser, Camera, Cloud Share, Memo, and the like. The main interface 11 may also include a page indicator 204. The application icons may be distributed across multiple pages, and the page indicator 204 may be used to indicate in which page the user is currently viewing the application icons. The user may slide left and right in the display area of the application icons to view the application icons in other pages.
The navigation bar 203 may include system navigation keys such as a return key 203A, a home screen key 203B, and a multitasking key 203C. Upon detecting that the user clicks the return key 203A, the tablet computer 100 may display the previous page of the current page. Upon detecting that the user clicks the home screen key 203B, the tablet computer 100 may display the main interface 11. As shown in fig. 4B, upon detecting that the user clicks the multitasking key 203C, the tablet computer 100 may display the task management interface 12. The navigation keys may also have other names, which is not limited in the present application.
In the embodiment of the present application, the main interface 11 may alternatively not include the virtual keys in the navigation bar 203, and each navigation key in the navigation bar 203 may instead be implemented as a physical key or a navigation gesture. In some embodiments, the tablet computer 100 implements the function of each navigation key in the navigation bar 203 through a specified navigation gesture. For example, the bottom edge of the display screen 194 of the tablet computer is divided into a left bottom edge, a middle bottom edge, and a right bottom edge; the navigation gesture corresponding to the return key 203A includes a gesture sliding upward from the left bottom edge of the display screen 194; the navigation gesture corresponding to the home screen key 203B includes a gesture sliding upward from the middle bottom edge of the display screen 194; and the navigation gesture corresponding to the multitasking key 203C includes a gesture sliding upward from the right bottom edge of the display screen 194.
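The three-way split of the bottom edge described above can be sketched as a simple mapping from the horizontal start coordinate of an upward swipe to a navigation action. The enum and class names here (NavAction, NavGestureMapper) are illustrative assumptions for the example, not framework APIs.

```java
// Navigation actions corresponding to the three navigation keys.
enum NavAction { BACK, HOME, MULTITASK }

final class NavGestureMapper {
    private final int screenWidth;

    NavGestureMapper(int screenWidth) { this.screenWidth = screenWidth; }

    // x: horizontal start coordinate of an upward swipe from the bottom edge.
    NavAction map(int x) {
        int third = screenWidth / 3;
        if (x < third) return NavAction.BACK;     // left bottom edge -> return key
        if (x < 2 * third) return NavAction.HOME; // middle bottom edge -> home screen key
        return NavAction.MULTITASK;               // right bottom edge -> multitasking key
    }
}
```

An equal three-way split is an assumption; a real gesture recognizer would also consider swipe velocity and vertical travel before committing to an action.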
The task management interface according to the embodiment of the present application is described below as an example.
In the embodiment of the present application, the conventional task management interface 12 is an interface in the system user interface for displaying the recently run tasks (i.e., background running tasks) of this device.
Taking the tablet computer 100 as an example, when a user opens a new application on the tablet computer 100, the tablet computer 100 may switch the application previously running in the foreground to running in the background, sequentially add the application window (i.e., a task) that the application most recently ran into multitask queue 1, and store a snapshot of the user interface last displayed in that application window; this snapshot serves as the interface snapshot of the application window in the corresponding task card of the task management interface. Alternatively, the snapshot of the user interface may be a screenshot of the user interface. As shown in fig. 4A and fig. 4B, the user may call up the conventional task management interface 12 by using the multitasking key 203C. The task management interface 12 may display the task cards corresponding to the tasks in multitask queue 1 of the tablet computer, and through these task cards, each application window (i.e., task) in multitask queue 1 of the tablet computer 100 can be previewed, closed, and quickly switched to. Alternatively, closing a task may include ending the task's associated process and freeing the running memory occupied by the task. Illustratively, taking the task card 205 of the Word application in the task management interface 12 as an example, the user can preview application window 1, which the Word application recently ran, through the interface snapshot 205A in the task card 205; upon detecting that the user clicks the task card 205, the tablet computer 100 may switch application window 1 to the foreground, i.e., display application window 1; upon detecting that the user drags the task card 205 upward, the tablet computer 100 may close application window 1 running in the background.
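The behavior of multitask queue 1 described above (recording a window and a snapshot of its last interface when it goes to the background, then previewing, switching, or closing it via a task card) can be sketched as follows. All names are illustrative assumptions; a real implementation would live inside the system UI and manage actual window objects and screenshots.

```java
import java.util.ArrayDeque;
import java.util.Deque;

// One entry of the multitask queue: a background window plus its snapshot.
final class TaskEntry {
    final String windowId;
    final String interfaceSnapshot; // e.g. a screenshot of the last user interface
    TaskEntry(String windowId, String interfaceSnapshot) {
        this.windowId = windowId;
        this.interfaceSnapshot = interfaceSnapshot;
    }
}

final class MultitaskQueue {
    private final Deque<TaskEntry> queue = new ArrayDeque<>();

    // Called when a window moves to the background: most recent task goes first.
    void onMovedToBackground(String windowId, String snapshot) {
        queue.removeIf(t -> t.windowId.equals(windowId));
        queue.addFirst(new TaskEntry(windowId, snapshot));
    }

    // Tapping a task card switches the window back to the foreground.
    TaskEntry switchToForeground(String windowId) {
        TaskEntry hit = queue.stream()
                .filter(t -> t.windowId.equals(windowId))
                .findFirst().orElse(null);
        if (hit != null) queue.remove(hit); // no longer a background task
        return hit;
    }

    // Dragging a task card upward closes the task and frees its entry.
    boolean close(String windowId) {
        return queue.removeIf(t -> t.windowId.equals(windowId));
    }

    int size() { return queue.size(); }
}
```

The snapshot is kept as a string here purely for illustration; the patent only requires that the last-displayed interface be captured so the task card can preview it.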
Note that an application window may be an Activity window in the Android system, an application window in the iOS system, or an application window in another operating system; this is not limited herein. An application includes a plurality of application windows, and an application window may correspond to a user interface (i.e., a page). For example, in the Android system, an Activity is an interface for interaction between the user and an application program, and each Activity component is associated with a Window object that describes a specific application window (which may be simply referred to as an Activity window). In Android, the Activity is a highly abstract user interface component that represents a user interface and the business logic centered on that user interface; events triggered by the user can be monitored and processed through controls in the user interface. It will be appreciated that in an Android application, one or more Activities may be presented as user interfaces, and an Android application may have multiple Activities. Displaying an application window, as referred to in subsequent embodiments, means displaying the user interface corresponding to the application window.
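The Activity/Window relation described above can be sketched minimally: each Activity is associated with one Window object describing a concrete application window, and one application may hold several Activities. The classes below are toy stand-ins for the purpose of the sketch, not the real Android framework classes.

```java
// Toy stand-in for the Window object associated with an Activity.
final class Window {
    final String description;
    Window(String description) { this.description = description; }
}

// Toy stand-in for an Activity: one Activity, one associated Window.
final class Activity {
    final String name;
    final Window window; // each Activity component is associated with a Window
    Activity(String name) {
        this.name = name;
        this.window = new Window(name + ":window");
    }
}

// An application may have multiple Activities.
final class AndroidApp {
    final java.util.List<Activity> activities = new java.util.ArrayList<>();

    Activity startActivity(String name) {
        Activity a = new Activity(name);
        activities.add(a);
        return a;
    }
}
```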
The task related to the embodiment of the application can be an application, an application window or a user interface in the application, and the subsequent embodiment mainly takes the application window as an example for illustration. The "task card" may also be referred to as "task identification", "task window", "task page", etc., and the embodiments of the present application are not limited in this regard.
The embodiment of the application also provides a task management interface of the multi-device. Unlike conventional task management interfaces, the task management interface of the multiple devices can not only check and manage the latest running task of the device, but also check the latest running task of the cooperative device (i.e., foreground running task and/or background running task), and perform cross-device split-screen on the latest running task of the device and/or the cooperative device. In one implementation, as shown in fig. 4A and 4C, the task management interface 13 of the multi-device may be invoked directly through the multi-tasking key 203C. In another implementation, only the conventional task management interface 12 can be invoked via the multitasking key 203C; then, through a specific operation, the conventional task management interface 12 can be switched to the task management interface 13 of the multiple devices. In another implementation, the multi-device task management interface 13 may be presented through other system interfaces (e.g., negative two-screen). The calling-out mode of the task management interface 13 of the multiple devices in the embodiment of the present application is not particularly limited.
As shown in fig. 4C, the area 1 of the multi-device task management interface 13 displays the device identifier 301 of the tablet 100, and the device identifier of the cooperative device of the tablet 100, such as the device identifier 302 of the mobile phone 200 and the device identifier 303 of the PC 300. The device identification may include a device icon and a device name. The device identifier 301 of the tablet computer 100 is in a selected state, and the device identifiers of the cooperative devices are in an unselected state; the area 2 of the task management interface 13 is used for displaying a task card corresponding to the most recently operated task of the currently selected device, and the clear control 304 in the area 2 is used for closing part or all of the most recently operated tasks of the currently selected device at one time. As shown in fig. 4C, the area 2 of the task management interface 13 displays a task card corresponding to the background running task of the tablet computer 100, and a clear control 304. The task card corresponding to one task includes an application identifier and/or an interface snapshot of an application program corresponding to the task, and the clearance control 304 is used to close a background running task of the tablet computer 100 at one time.
Illustratively, before the tablet computer 100 calls up the task management interface 13, the user viewed document 1 of the Word application, the gallery application, the browser application, and document 2 of the Word application in sequence. As shown in fig. 4C, the task cards of the tablet computer 100 include a task card 305 corresponding to document 1 of the Word application, a task card 306 corresponding to the gallery application, a task card 307 corresponding to the browser application, and a task card 308 corresponding to document 2 of the Word application. Taking the task card 308 as an example, the task card 308 includes an application identifier 308A of the Word application and an interface snapshot 308B of document 2.
As shown in fig. 4C and fig. 4D, after detecting that the user clicks on the device identifier of the other cooperative device (for example, the device identifier 302 of the mobile phone 200), the tablet computer 100 displays task cards corresponding to each task of the multitasking queue 2 of the mobile phone 200 in the area 2, where the multitasking queue 2 is used to cache task information of the task that the mobile phone 200 has recently operated. For example, the task card 309 corresponding to the instant messaging application and the task card corresponding to the music application running in the background of the mobile phone, and the task card 310 corresponding to the gallery application running in the foreground of the mobile phone 200. Through the task cards of the mobile phone 200 displayed in the area 2, the tablet pc 100 can preview, close and rapidly switch each application window in the multitasking queue 2 of the mobile phone 200. Taking a task card 309 of the instant messaging application in the task management interface 13 as an example, a user may preview the recently running application window 2 of the instant messaging application of the mobile phone 200 through an interface snapshot in the task card 309; when detecting that the user clicks the task card 309, the tablet computer 100 can switch the application program window 2 of the mobile phone 200 to the foreground of the tablet computer 100 for running, and display the user interface 2 corresponding to the application program window 2; upon detecting that the user drags the task card 309 upward, the tablet 100 may instruct the cell phone 200 to close the background operation of the application window 2.
In the embodiment of the present application, the tablet pc 100 may acquire multi-task data of each cooperative device (for example, the mobile phone 200), where the multi-task data includes task information of each task in the multi-task queue 2 of the cooperative device; further, the multi-task queue 2 based on the cooperative device may display a task card corresponding to a task that the cooperative device has recently operated on the multi-task management interface 13. In the embodiment of the present application, the manner and timing for obtaining the multi-task data of the cooperative device by the tablet pc 100 are not particularly limited.
In some embodiments, when the tablet computer 100 determines that a cooperative device is online, it periodically sends a task acquisition request to the cooperative device to request the latest multi-task data of the cooperative device. In one implementation, the tablet computer 100 determining that a cooperative device is online specifically includes: the tablet computer 100 determines that it can communicate directly or indirectly with the cooperative device. In some embodiments, when the tablet computer 100 detects the user's operation of calling up the multitask management interface 13, or detects that the user clicks the device identifier of a cooperative device in the multitask management interface 13, it sends a task acquisition request to the cooperative device to request the latest multi-task data of the cooperative device. In some embodiments, a cooperative device of the tablet computer 100 periodically sends its latest multi-task data to the tablet computer 100, or synchronizes the latest multi-task data to the tablet computer 100 when its multi-task data is updated. In some embodiments, a cooperative device of the tablet computer 100 synchronizes its latest multi-task data to a multi-task data management module of a third-party device (e.g., an application server) when its multi-task data is updated or periodically; when the tablet computer 100 senses that the multi-task data of the cooperative device in the multi-task data management module has been updated, or detects the user's operation of calling up the multitask management interface 13, or detects that the user clicks the device identifier of the cooperative device in the multitask management interface 13, the tablet computer 100 requests the latest multi-task data of the cooperative device from the third-party device.
The following uses the application window 2 of the instant messaging application in the multi-task data of the mobile phone 200 as an example to describe task migration and seamless connection between devices.
In some embodiments, the task information of the application window 2 may include some or all of the following: the state information of the application window 2, the startup information required for starting the application window 2, the user data of the application window 2, and the Context information (Context) of the application window 2, so as to ensure that the application window 2 can perform task flow and seamless connection between the mobile phone 200 and the tablet computer 100. The tablet pc 100 is provided with an instant messaging application corresponding to the application window 2, and based on the task information of the application window 2, the instant messaging application can be started, the application window 2 in the application is opened, and a user interface of the application window 2 when the application window 2 runs on the mobile phone 200 last time is displayed. In the embodiment of the present application, after the multi-task data (for example, the task information of the application window 2) of the mobile phone 200 is updated, the tablet pc 100 can synchronize the updated multi-task data of the mobile phone 200, so as to ensure that the application window 2 can perform seamless connection between the mobile phone 200 and the tablet pc 100.
The status information is used to record the last running state of the application window 2, and the startup information may include part or all of the following: the package name, class name, and cleaning mode of the application, and the window identifier of the application window 2. The user data of the application window 2 may include: the display position of the long page corresponding to the application window 2 when it was most recently run, the address (for example, a uniform resource identifier (Uniform Resource Identifier, URI)) of an online resource or a local resource in the application window 2, and the viewing progress (for example, video progress or audio progress) of a specific resource.
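As a hedged illustration only, the task information fields listed above could be grouped into a structure like the following; all field names and values are assumptions for this sketch, not the patent's actual data format:

```python
from dataclasses import dataclass, field

@dataclass
class TaskInfo:
    """Hypothetical grouping of the task information of an application
    window: startup information, last running state, and user data."""
    package_name: str                 # startup info: application package name
    class_name: str                   # startup info: window class name
    window_id: str                    # startup info: window identifier
    state: dict = field(default_factory=dict)      # last running state
    user_data: dict = field(default_factory=dict)  # scroll position, URI, progress

# Example task info for an instant messaging window
# (all values are made up for illustration).
info = TaskInfo(
    package_name="com.example.im",
    class_name="ChatWindow",
    window_id="window-2",
    user_data={"resource_uri": "content://chat/42", "scroll_y": 1280},
)
```

With such a record synchronized to the tablet computer, the instant messaging application there could be started with the same window identifier and restored to the same scroll position.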
In some embodiments, the state information of the application may further include a feature ability (FA) file; even if the tablet computer 100 does not have the instant messaging application to which the application window 2 belongs installed, it can run the instant messaging application directly according to the FA file. FA files can be packaged and released independently, and support one or more of installation-free use, independent running of the application, cross-device UI migration, and cross-device binary migration, making task migration between devices more convenient.
Referring to the task cards shown in fig. 4C and fig. 4D, the present application may involve two types of applications. For the first type of application, the tasks that the terminal has recently run can include only one recently run task of the application, and the multitask management interface 13 can display only one task card corresponding to that task, for example, the task card 309 corresponding to the instant messaging application. For the second type of application, the terminal may run multiple tasks of the application at the same time, and the multitask management interface 13 may display the task cards corresponding to the multiple recently run tasks of the application, for example, the task card 305 corresponding to document 1 and the task card 308 corresponding to document 2 of the Word application. The same application may present different types on different kinds of terminals; for example, a browser application may be a first-type application on a tablet and a second-type application on a PC. Whether the embodiments of the present application involve the second type of application is not specifically limited herein.
Referring to fig. 4C and fig. 4D, the terminal according to the embodiment of the present application may support landscape display and/or portrait display, where landscape display means that the aspect ratio of the user interface displayed by the terminal (i.e., the ratio of the horizontal edge length of the user interface to its vertical edge length) is greater than 1, and portrait display means that the aspect ratio of the user interface displayed by the terminal is less than 1. If a task is switched to the background while the terminal is in landscape display, the aspect ratio of the interface snapshot in that task's task card is greater than 1; for example, see the task card 305 of the Word application shown in fig. 4C. If a task is switched to the background while the terminal is in portrait display, the aspect ratio of the interface snapshot in that task's task card is less than 1; for example, see the task card 307 of the browser application shown in fig. 4C.
Referring to fig. 4C and fig. 4D, when, limited by the size of the screen, area 2 cannot display the task cards of all tasks in the multi-task queue of the currently selected device, the user can view more task cards through a left/right sliding operation in area 2. The arrangement order and layout of the task cards of each device in area 2 are not limited; the arrangement order and layout of the task cards of two devices may be different or the same. As shown in fig. 4C, the task cards of the tablet computer 100 are displayed in a two-row, multi-column layout, and the task cards corresponding to the tasks are arranged in area 2 from top to bottom and from right to left in order of each task's last foreground running time, with the most recently foreground-run task in the first row of the first column. As shown in fig. 4D, the task cards of the mobile phone 200 are displayed in a one-row, multi-column layout, and the task cards corresponding to the tasks are arranged in area 2 from right to left in order of each task's last foreground running time, with the most recently foreground-run task in the first column. The embodiment of the application may also sort the task cards corresponding to the tasks according to factors such as the historical running duration and historical running frequency of each task within a preset time period, which is not specifically limited in the present application.
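The card ordering described above (most recently foreground-run task first) can be sketched as a simple sort over hypothetical task records; the field names are assumptions:

```python
def order_task_cards(tasks):
    """Return task records most-recently-foreground-run first, matching
    the card ordering described above (field names are illustrative)."""
    return sorted(tasks, key=lambda t: t["last_foreground_ts"], reverse=True)

cards = order_task_cards([
    {"name": "browser", "last_foreground_ts": 30},
    {"name": "word-doc2", "last_foreground_ts": 40},
    {"name": "gallery", "last_foreground_ts": 20},
])
# The most recently foreground-run task ("word-doc2") comes first.
```

A layout engine would then place the ordered cards into one or two rows depending on the device type; sorting by historical running duration or frequency would only change the sort key.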
In some embodiments, the task management interface 13 of the tablet computer 100 further includes a configuration control 401 of a split screen window of a cross-device, and after detecting that a user drags any task card displayed by the task management interface 13 to the configuration control, the tablet computer 100 may add a task corresponding to the task card to the split screen window. After adding one or more tasks to the split screen window, after detecting that the user drags the configuration control to the device identifier of any cooperative device (for example, PC 300), tablet computer 100 may send a display instruction to PC 300 to trigger PC 300 to display the split screen window of the cross-device, so as to implement split screen display on the PC of one or more tasks of other devices.
In some embodiments, when only one task is added (e.g., the task of the handset 200), the task may be displayed full screen across the split screen window of the device. Thus, the effect of cross-device screen projection of the mobile phone 200 to the PC 300 can also be realized through the tablet personal computer 100.
Fig. 5A-5G illustrate one example of tablet computer 100 adding two tasks across a split window of a device and triggering PC 300 to display the split window of the device.
As shown in fig. 4D, fig. 5A and fig. 5B, the task management interface 13 includes a configuration control 401 for a cross-device split-screen window, and a task card 309 for the instant messaging application of the mobile phone 200; when detecting a sliding operation of dragging the task card 309 after the user long-presses it, the tablet computer 100 displays a card identifier corresponding to the task card 309 moving along with the user's finger; when detecting that the user drags the card identifier onto the configuration control 401, or detecting that the user drags the card identifier onto the configuration control 401 and releases the finger (i.e. stops touching the task card 309), the tablet computer 100 adds the task corresponding to the task card 309 (i.e. the application window 2) to the cross-device split-screen window, and makes the configuration control 401 present display state 1; display state 1 is used to indicate that one task has been added to the split-screen window. The card identifier corresponding to the task card 309 may be the task card 309 itself or a thumbnail of the task card 309, which is not particularly limited herein.
In some embodiments, after the task corresponding to the task card 309 is added to the split window across the device, the task card 309 switches from the un-added state to the added state. For example, the background color and/or transparency of the added state changes as compared to the non-added state.
As shown in fig. 5B and 5C, after detecting that the user clicks the device identifier 301 of the tablet pc 100, the tablet pc 100 displays a task card of the device, such as a task card 305 of a Word application, on the task management interface 13; as shown in fig. 5C to 5E, similar to the task card 309, after the user presses the task card 305 for a long time and drags the configuration control 401, the tablet computer 100 adds the task (the application window 1) corresponding to the task card 305 to the split-screen window of the cross-device, and makes the configuration control 401 present the display state 2, where the display state 2 is used to indicate that two tasks have been added to the split-screen window. In some embodiments, after adding two tasks, the split screen window is divided into two split screen areas for displaying the two tasks respectively; the display state 2 also indicates the positions of the two split screen areas in the split screen window. For example, referring to fig. 5C, the display state 2 of the configuration control 401 indicates that the split-screen window is divided longitudinally into left and right split-screen areas.
In the embodiment of the present application, the tasks added to the cross-device split window by the tablet PC 100 may be all tasks of the tablet PC 100, or may be all tasks of at least one cooperative device (for example, the mobile phone 200 and/or the PC 300) of the tablet PC 100, which are not limited herein.
As shown in fig. 5E and 5F, when a sliding operation of dragging the configuration control 401 after the user presses the configuration control 401 for a long time is detected, the tablet computer 100 displays that the configuration control 401 moves with the finger of the user; as shown in fig. 5F and 5G, upon detecting that the user drags the configuration control 401 to the device identifier 303 of the PC 300 and releases his hand, the tablet 100 sends a display instruction to the PC 300 to trigger the PC 300 to display and run the split window 14 across devices. The split screen area 1 of the split screen window 14 displays the application window 1 of the tablet computer 100, and the split screen area 2 displays the application window 2 of the mobile phone 200.
In some embodiments, the split window size of the split window 14 across devices is determined according to the target device (i.e., PC 300), and the split window size corresponding to different target devices may be different, with the split window size indicating the aspect ratio of the split window. For example, the size of the split window corresponding to the PC 300 is the size of the display screen of the PC 300. For example, the size of the split-screen window corresponding to the mobile phone 200 may be the size of the display screen when the mobile phone 200 is displayed on a horizontal screen. In one implementation, the tablet computer 100 may store the size of the split screen window corresponding to each collaborative device, and when detecting that the user drags the configuration control 401 to the device identifier of the PC 300, the tablet computer 100 obtains the size of the split screen window corresponding to the PC 300, and determines the division of each split screen region in the split screen window 14 based on the size of the split screen window. In some embodiments, the size of the split-screen window is preset by the control device (i.e., tablet 100) or the user, and the sizes of the split-screen windows corresponding to different target devices are the same.
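A minimal sketch of the per-target lookup described above, assuming the control device stores a width and height for each cooperative device; all device names and sizes here are illustrative, and the fallback models the preset-size case:

```python
# Hypothetical per-target sizes of the cross-device split-screen window;
# the aspect ratio follows from width / height.
SPLIT_WINDOW_SIZES = {
    "PC 300": (1920, 1080),     # e.g. the PC's full display size
    "phone 200": (2400, 1080),  # e.g. the phone's landscape display size
}

def split_window_size(target, default=(1280, 720)):
    """Look up the stored split-screen window size for a target device,
    falling back to a preset default size."""
    return SPLIT_WINDOW_SIZES.get(target, default)
```

The control device would call this when the user drops the configuration control on a device identifier, then divide the returned window into split screen regions.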
In the embodiment of the present application, according to the original display size and the original aspect ratio of the added application window in the source device, and part or all of the display size and the aspect ratio of the split screen window, the tablet computer 100 may adaptively plan the split screen area corresponding to each application window in the split screen window, and the display area of each application window in the split screen area. Optionally, when two application windows that are displayed laterally are added across the split-screen window of the device, the split-screen window may be divided laterally into two split-screen regions, one above the other. Optionally, the aspect ratio of the split screen window of the cross-device is smaller than 1, and when two application program windows are added, the split screen window can be transversely divided into an upper split screen region and a lower split screen region.
Fig. 6A-6F also illustrate, by way of example, two split-screen regions in the split-screen window 14 across devices and more display forms of corresponding application windows.
As shown in fig. 5G, and fig. 6A to 6F, the aspect ratio of the split-screen window 14 across the device is smaller than 1, and the split-screen window 14 may include two split-screen areas longitudinally divided, namely, a split-screen area 1 on the right side and a split-screen area 2 on the left side; the split screen area 1 is used for displaying an application window 1 of a Word application of the tablet computer 100, and the split screen area 2 is used for displaying an application window 2 of an instant messaging application of the mobile phone 200.
The present application relates to two types of application windows. The size of the first type of application window is not adjustable, i.e. its aspect ratio is not adjustable. The size of the second type of application window is adjustable, i.e. its aspect ratio is adjustable; specifically, the horizontal edge and/or the vertical edge of the application window is adjustable. When the vertical edge of a second-type application window is adjustable, its corresponding user interface is a vertically long page: limited by the vertical edge length of the application window, the display screen of the terminal cannot display all the content of the user interface at once, and the user needs to slide the user interface up and down to view more content; by increasing (or decreasing) the vertical edge length of the application window, the content currently viewable in the window can be increased (or decreased). When the horizontal edge of a second-type application window is adjustable, its corresponding user interface is a horizontally long page: limited by the horizontal edge length of the application window, the display screen of the terminal cannot display all the content of the user interface at once, and the user needs to slide the user interface left and right to view more content; by increasing (or decreasing) the horizontal edge length of the application window, the content currently viewable in the window can be increased (or decreased). For some second-type application windows, both the horizontal edge and the vertical edge are adjustable, and the corresponding user interface is both a horizontally and a vertically long page.
In the embodiment of the application, when the tablet personal computer determines the display position of the application program window in the split screen window, the size of the application program window can be integrally reduced or enlarged. If the horizontal edge or the vertical edge of the application program window is adjustable, the panel computer can integrally reduce or enlarge the display size of the application program window in the split screen window, and adjust the horizontal edge length or the vertical edge length of the application program window, so that the split screen window reduces blank areas, and the split screen window utilization rate is improved.
Referring to the display form shown in fig. 5G, the aspect ratios of the application window 1 and the application window 2 are not adjustable, and the split screen area 1 and the split screen area 2 have the same size; the application window 1 and the application window 2 are displayed maximized in their corresponding split screen areas at their original aspect ratios. An application window being displayed maximized in a split screen area means that the application window is uniformly reduced or enlarged until the application window and the split screen area satisfy one of the following conditions: their horizontal edges are equal in length, or their vertical edges are equal in length. The original aspect ratio refers to the aspect ratio of the application window when running on the source device. As shown in fig. 5G, since the original aspect ratio of the application window 1 of the Word application is greater than the aspect ratio of the split screen area 1, after the application window 1 is displayed maximized in the split screen area 1, the horizontal edges of the application window 1 and the split screen area 1 are equal in length; since the original aspect ratio of the application window 2 of the instant messaging application is smaller than the aspect ratio of the split screen area 2, after the application window 2 is displayed maximized in the split screen area 2, the vertical edges of the application window 2 and the split screen area 2 are equal in length. It will be appreciated that when the original aspect ratio of an application window is the same as the aspect ratio of the split screen area, the application window can fill the split screen area. In this display form, as shown in fig. 5G, both split screen areas have unused blank areas.
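The "maximized display" rule above is a uniform scale that preserves the original aspect ratio until one pair of edges matches the split screen area. A minimal sketch, with the function name and units being assumptions:

```python
def maximize_in_region(orig_w, orig_h, region_w, region_h):
    """Uniformly scale a window (keeping its original aspect ratio) until
    either its horizontal or its vertical edges match the split screen
    area's, i.e. the 'maximized display' condition described above."""
    scale = min(region_w / orig_w, region_h / orig_h)
    return orig_w * scale, orig_h * scale
```

A landscape window placed in a taller region ends with equal horizontal edges (e.g. a 1600x900 window in a 500x1000 region scales to 500x281.25), while a window narrower than its region ends with equal vertical edges; either way the remainder is the blank area.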
Referring to the display form shown in fig. 6A, neither the aspect ratio of the application window 1 nor that of the application window 2 is adjustable; compared with the display form shown in fig. 5G, the sizes of the split screen area 1 and the split screen area 2 are determined according to the original aspect ratios of the application window 1 and the application window 2 and the aspect ratio of the split screen window, and the sizes of the two split screen areas may be different; the application window 1 and the application window 2 are displayed maximized in their corresponding split screen areas at their original aspect ratios, and the horizontal edge of each application window is equal to the horizontal edge of the split screen area in which it is located. In this way, the blank area in the split screen window can be reduced, and the utilization of the split screen window is improved. As shown in fig. 6A, compared with fig. 5G, the horizontal edge lengths of the application window 2 and the split screen area 2 are now also equal, and there is no blank area in the split screen area 2; the horizontal edge length of the split screen area 1 increases, so the display size of the application window 1 is larger.
Referring to the display form shown in fig. 6B, the longitudinal edge of the application window 1 is adjustable, and the aspect ratio of the application window 2 is not adjustable; compared with the display form shown in fig. 6A, when the longitudinal side length of the application window 1 is smaller than the longitudinal side length of the split screen area 1 (i.e., when the split screen area 1 is blank), the longitudinal side length of the application window 1 is increased until it is equal to the longitudinal side length of the split screen area 1. Therefore, the split screen area where the application program window 1 with the adjustable transverse-longitudinal ratio is located is free from blank, and the utilization rate of the split screen window is further improved.
Referring to the display form shown in fig. 6C, the longitudinal edge of the application window 1 is adjustable, the transverse edge of the application window 2 is adjustable, and the sizes of the split screen area 1 and the split screen area 2 are the same; compared with the display form shown in fig. 5G, when the longitudinal side length of the application window 1 is smaller than the longitudinal side length of the split screen area 1 after the maximized display (i.e., when the split screen area 1 is blank), the longitudinal side length of the application window 1 is increased until the longitudinal side length is equal to the longitudinal side length of the split screen area 1; when the lateral side length of the application window 2 after the maximized display is smaller than the lateral side length of the split screen area 2 (i.e. when the split screen area 2 is blank), the lateral side length of the application window 2 is increased until the lateral side length of the application window is equal to the lateral side length of the split screen area 2. Therefore, the split screen area where the size-adjustable application program window is located is not blank, and the utilization rate of the split screen window is further improved.
Referring to the display form shown in fig. 6D, the longitudinal edges of the application window 1 and the application window 2 are adjustable, and the size of the split screen area 1 is the same as that of the split screen area 2; compared with the display form shown in fig. 5G, when the longitudinal side length of the application window 1 is smaller than the longitudinal side length of the split screen area 1 after the maximized display (i.e., when the split screen area 1 is blank), the longitudinal side length of the application window 1 is increased until the longitudinal side length is equal to the longitudinal side length of the split screen area 1; when the lateral side length of the application window 2 after the maximized display is smaller than the lateral side length of the split screen area 2 (i.e. when the split screen area 2 is blank), the application window 2 is continuously enlarged until the lateral side length of the application window is equal to the lateral side length of the split screen area 2, and the longitudinal side length of the application window 2 is reduced to be equal to the longitudinal side length of the split screen area 2. Therefore, the split screen area where the size-adjustable application program window is located is not blank, and the utilization rate of the split screen window is further improved.
Referring to the display form shown in fig. 6E, both the longitudinal edges of the application window 1 and the application window 2 are adjustable, and the sizes of the split screen area 1 and the split screen area 2 may be different; in comparison with the display form shown in fig. 5G, the sizes of the split screen area 1 and the split screen area 2 are determined based on the original resolution of the application window 2 at the source device and the difference in the original resolution of the application window 1 at the source device. For example, when the original resolution of the application window 1 is N times that of the application window 2, it is determined that the lateral side length of the split screen region 1 is N times that of the split screen region 2; and after each application program window is maximally displayed in the corresponding split screen area, adjusting the longitudinal side length of the application program window 1 and the application program window 2 until the longitudinal side length of the application program window is equal to the longitudinal side length of the split screen area. Therefore, on the premise that the split screen window is not blank, the resolution of each application program window in the split screen window is closer to the original resolution, and better visual effect and user experience are achieved.
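The N-times rule above can be sketched as dividing the split screen window's horizontal edge in proportion to the two windows' original horizontal resolutions; the function name and arguments are assumptions:

```python
def partition_widths(window_w, res_w1, res_w2):
    """Split the window's horizontal edge between two vertically divided
    split screen areas in proportion to the two application windows'
    original horizontal resolutions (the N-times rule described above)."""
    w1 = window_w * res_w1 / (res_w1 + res_w2)
    return w1, window_w - w1
```

For instance, if application window 1's original horizontal resolution is twice that of application window 2, split screen area 1 gets two thirds of the split screen window's width; each window is then maximized in its area and its vertical edge stretched to match, as described above.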
Referring to the display form shown in fig. 6F, when the size of at least one of the N application windows added to the split-screen window is not adjustable, or when the sizes of all N application windows added to the split-screen window are not adjustable, each application window is displayed in one of N small windows on the PC 300, and the user can control each small window independently. For example, application window 1 is displayed via small window 14A and application window 2 via small window 14B.
In some embodiments, as shown in FIG. 7A, the task management interface 13 may also be provided with a separate configuration control 402 for cross-device small windows; similar to the configuration control 401 for the cross-device split-screen window, the user adds the task corresponding to a small window by dragging its task card onto the configuration control 402; after N tasks have been added (e.g., application window 1 corresponding to task card 305 and application window 2 corresponding to task card 309), upon detecting that the user drags the configuration control 402 to the device identifier of the PC 300, the tablet computer 100 sends a display instruction to the PC 300 to trigger the PC 300 to display the N tasks in N floating small windows, respectively.
The display forms shown in fig. 5G and figs. 6A to 6F take as an example a split-screen window whose aspect ratio (width to height) is less than 1 and which is divided longitudinally into two split screen areas. In some display forms, the lateral and longitudinal edges of an application window may be adjusted simultaneously so that the application window achieves a better visual effect within the split-screen window. In some embodiments, the aspect ratio of the split-screen window may be greater than 1, and after two tasks are added the split-screen window may be divided laterally into two split screen areas. Similarly, the two laterally divided split screen areas and their corresponding application windows may also take various display forms, which are not described again here.
In some embodiments, when the target device of the cross-device split screen (e.g., the PC 300) is configured with multiple display devices, such as a built-in main screen plus an external auxiliary screen 1 and an external auxiliary screen 2, the tablet computer 100 may trigger a designated display device of the PC 300 to display the cross-device split-screen window 14.
Illustratively, as shown in fig. 7B, when it is detected that the user drags the configuration control 401 to the device identifier 303 of the PC 300, display identifiers corresponding to the plurality of display devices of the PC 300 are displayed, for example, the display identifier 701 of the main screen, the display identifier 702 of auxiliary screen 1, and the display identifier 703 of auxiliary screen 2. Dragging the configuration control 401 onto any display identifier (e.g., the display identifier 701) and releasing may trigger the tablet computer 100 to send a display instruction to the PC 300, instructing the PC 300 to display the split-screen window 14 through the display device corresponding to the display identifier 701.
In some embodiments, since the display screen sizes of different display devices of the PC 300 are different, the sizes of the split-screen windows corresponding to the display devices are also different; when determining that the user selects the auxiliary screen 1, the tablet computer 100 may specify the layout of the split screen windows 14 according to the size of the split screen window corresponding to the auxiliary screen 1, that is, determine the position of each split screen region, and the display position and display size of each application window in the split screen region.
In some embodiments, taking the example of the cross-device split window 14 added with two tasks, the tablet computer 100 triggers the PC 300 to display the cross-device split window 14 through the display instruction, which may include the following 3 implementations.
Implementation one: the tablet computer 100 runs the split-screen window 14 on a virtual display screen and projects it to the PC 300; the PC 300 displays and runs the split-screen window 14 based on the screen content of the virtual display screen.
Referring to fig. 8A, upon detecting that the user has added a task to the split-screen window 14 across devices, the tablet 100 creates a virtual display screen that is used to run the split-screen window 14. The tablet computer 100 adds the application program window 1 of the local machine and the application program window 2 of the mobile phone 200 in the split-screen window 14, and can determine layout information of the split-screen window 14 based on the added application program window, wherein the layout information indicates the size of the split-screen window 14, the position of each split-screen area in the split-screen window 14 and the display area of the application program window in each split-screen area. The tablet computer 100 determines the display content of the application window 1 in the split-screen window 14 based on the task information of the application window 1 in the local multi-task data; the display content of the application window 2 in the split-screen window 14 is determined based on the task information of the application window 2 in the multitasking data of the mobile phone 200. After determining that the target device of the cross-device split screen is the PC 300, the display content of the virtual display screen is projected to the PC 300 through the display instruction 1, where the display instruction may include image data of the split screen window 14 in the virtual display screen.
Implementation two: the tablet computer 100 sends the layout information of the split-screen window 14 and the task information corresponding to the task of each split screen area of the split-screen window 14 to the PC 300; based on this information, the PC 300 can display and run the split-screen window 14.
Referring to fig. 8B, the display instruction transmitted from the tablet computer 100 to the PC 300 may include layout information of the split screen window 14, task information of the native application window 1, and task information of the application window 2 of the mobile phone 200. Based on the task information of the application window 1, the PC 300 can determine the display content of the application window 1, and further can run the application window 1 in the split screen area 1 of the split screen window 14; based on the task information of the application window 2, the display content of the application window 2 can be determined, and the application window 2 can be operated in the split screen area 2 of the split screen window 14. In this way, the PC 300 can determine the display content of the split-screen window 14, display and operate the split-screen window 14.
Implementation three: the tablet computer 100 sends the layout information of the split-screen window 14, the task information of its own application window 1, and the window identifier of application window 2 of the mobile phone 200 to the PC 300; based on this information, the PC 300 can display and run the split-screen window 14.
Referring to fig. 8C, unlike implementation two, in implementation three the tablet computer 100 sends the window identifier of application window 2 of the mobile phone 200 to the PC 300. Since the PC 300 and the mobile phone 200 are also cooperative devices, the PC 300 can acquire the multi-task data of the mobile phone 200; based on the window identifier of application window 2, the PC 300 can acquire the task information of application window 2 from the multi-task data of the mobile phone 200. For the rest, implementation three may refer to the related description of implementation two, which is not repeated here.
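The difference between the task payloads of implementation two and implementation three can be sketched as follows; a minimal Python illustration in which the class and field names are assumptions for this example:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class TaskPayload:
    """One task entry carried by the display instruction (names hypothetical)."""
    window_id: str                    # identifies the application window
    task_info: Optional[dict] = None  # full task info (implementation two) or None (implementation three)

def resolve_task(payload: TaskPayload, codevice_multitask_data: dict) -> dict:
    """On the target device: use the task info carried in the display
    instruction if present (implementation two); otherwise look it up by
    window identifier in the co-device's multi-task data (implementation
    three), which the target device can access as a cooperative device."""
    if payload.task_info is not None:
        return payload.task_info
    return codevice_multitask_data[payload.window_id]
```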
Without being limited to the 3 implementations described above, the embodiments of the present application may also implement cross-device split-screen from the control device (i.e., tablet 100) to the target device (i.e., PC 300) through other implementations.
Figs. 9A to 9C illustrate an example in which three tasks are added to the cross-device split-screen window through the tablet computer 100 and the PC 300 is triggered to display the cross-device split-screen window.
As shown in fig. 9A and 9B, the user has added tasks corresponding to the task card 305 and the task card 309 to the split screen window through the configuration control 401; after the tablet computer 100 detects that the user drags the task card 306 of the gallery application to the configuration control 401, adding a task corresponding to the task card 306 (i.e. the application program window 3 of the gallery application) to the split screen window of the cross-device, and making the configuration control 401 present a display state 3, where the display state 3 is used for indicating that 3 tasks have been added to the split screen window. Optionally, the display state 3 further indicates the positions of the split screen areas corresponding to the 3 tasks in the split screen window. As shown in fig. 9B and 9C, after the user drags the configuration control 401 to the device identifier 303 of the PC 300 and releases his hand, the tablet PC 100 triggers the PC 300 to display the split-screen window 15 across devices, where the split-screen window 15 includes the application window 1 corresponding to the task card 305, the application window 2 corresponding to the task card 309, and the application window 3 corresponding to the task card 306.
In the embodiments of the present application, when the tablet computer 100 adds a plurality of tasks to the cross-device split-screen window through the configuration control 401, the display position of each task in the split-screen window may be specified automatically according to some or all of: the original display size and original aspect ratio of each added task on its source device, and the display size and aspect ratio of the split-screen window. In some embodiments, the user may also preview the split-screen window through the tablet computer 100 and adjust the display position and display size of each task in the split-screen window.
Taking the split-screen window 15 with three tasks added as shown in fig. 9C as an example, fig. 10A to 10C show an example of adjusting the display positions of the tasks in the split-screen window when the split-screen window 15 is previewed by the tablet pc 100.
As shown in fig. 10A, in the case that 3 application windows are added to the split-screen window of the cross-device, after detecting that the user clicks the configuration control 401, the tablet computer 100 displays the split-screen window 15 of the cross-device, where the split-screen window 15 includes the application window 1 of the tablet computer 100, the application window 2 of the mobile phone 200, and the application window 3 of the tablet computer 100.
In some embodiments, after the user previews the split window 15 displayed by the tablet computer 100, any application window (e.g., the application window 1) may be dragged in the split window 15 to adjust the display position of the application window in the split window 15, and adaptively adjust the display positions of other application windows. In one implementation, the size of each application window is unchanged when the user adjusts the display position of the application window in split-screen window 15. Depending on the size of each application window in the split window 15, the application windows in the split window 15 may have 4 layouts (i.e., layout 1 to layout 4) shown in fig. 10B; the current layout mode of the split screen window 15 is a layout mode 1; the user may adjust the layout 1 of the split window 15 to other layouts by dragging the application window in the split window 15. It can be understood that the more tasks are added to the split screen window, the more split screen areas the split screen window is divided into, and the more layout modes each application program window in the split screen window can traverse; when the split window 15 is adjusted from the layout mode 1 to the layout mode 2, the division of the split area in the split window 15 is unchanged, and when the split window 15 is adjusted from the layout mode 1 to the layout mode 3 or the layout mode 4, the division of the split area in the split window 15 is changed.
In some embodiments, as shown in fig. 10C, after detecting input operation 1 of the user on the split-screen window 15 (e.g., long-pressing application window 3), the split-screen window 15 displayed by the tablet computer 100 enters a layout adjustment mode. In the layout adjustment mode, upon detecting that the user drags application window 3 to position 1, the tablet computer 100 traverses the 4 layout modes, determines based on position 1 that the dragged application window 3 matches application window 3 in layout mode 3, and adjusts the display position of each application window in the split-screen window 15 according to layout mode 3. Optionally, dragging application window 3 to position 1 may mean that a preset point of application window 3 (e.g., its center point) is dragged to position 1; among the 4 layout modes, when the distance between the preset point of application window 3 in layout mode 3 and position 1 is the smallest, it is determined that the dragged application window 3 matches application window 3 in layout mode 3.
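The minimum-distance matching rule just described can be sketched as follows — a minimal Python illustration in which the layout representation (a mapping from window identifier to an x/y/width/height rectangle) is an assumption for this example:

```python
import math

def match_layout(drop_point, layouts, window_id):
    """Choose the candidate layout whose placement of the dragged window
    puts the window's preset point (here its centre) closest to where
    the user dropped it. Names and structures are hypothetical."""
    def centre(rect):
        x, y, w, h = rect
        return (x + w / 2.0, y + h / 2.0)

    def dist(layout):
        cx, cy = centre(layout[window_id])
        return math.hypot(cx - drop_point[0], cy - drop_point[1])

    return min(layouts, key=dist)
```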
For example, as shown in fig. 10D, after detecting that the user long-presses application window 3 and drags it upwards, the tablet computer 100 adjusts the display position of each application window in the split-screen window 15 according to layout mode 2 shown in fig. 10B, based on the position of application window 3 after the drag. In terms of visual effect, after application window 3 in split screen area 3 is dragged into split screen area 1 where application window 1 is located, application window 3 is displayed in split screen area 1, and application window 1 is squeezed down into split screen area 3.
Not limited to the display-position adjustment methods described in figs. 10A to 10C, in the embodiments of the present application, after detecting that the user drags an application window, the display position of each application window may also be adjusted in other ways.
Taking the split-screen window 15 with three tasks added as shown in fig. 9C as an example, fig. 10D to 10F show one example of adjusting the display size of each task in the split-screen window when the split-screen window is previewed by the tablet computer 100.
In some embodiments, at least one size adjustment control is provided in the split-screen window 15; a size adjustment control is used to adjust the size of each split screen area in the split-screen window 15, and thereby the size of the application window displayed in each split screen area. As shown in fig. 10E, split screen area 1 where application window 1 is located and split screen area 3 where application window 3 is located are separated by a transverse dividing line parallel to the lateral edge of the split-screen window; a size adjustment control 403 is provided on this transverse dividing line. When detecting that the user drags the adjustment control 403 upwards, the tablet computer 100 moves the transverse dividing line upwards with the user's finger, reducing the longitudinal side length of split screen area 1 and increasing that of split screen area 3. When detecting that the user drags the adjustment control 403 downwards, the tablet computer 100 moves the transverse dividing line downwards with the user's finger, increasing the longitudinal side length of split screen area 1 and reducing that of split screen area 3.
In some embodiments, when it is detected that the user drags the adjustment control 403 to the top of the split window (e.g., the adjustment control 403 is less than the preset distance from the top), the longitudinal length of the split region 1 is reduced to zero (i.e., the split region 1 disappears), and the tablet 100 deletes the application window 1 displayed in the split region 1. Similarly, when the user drags the adjustment control 403 to the bottom of the split window, the longitudinal length of the split area 3 is reduced to zero, and the tablet 100 deletes the application window 3 displayed in the split area 3.
As shown in fig. 10E, split screen area 2 where application window 2 is located is separated from split screen area 1 where application window 1 is located, and from split screen area 3, by a longitudinal dividing line parallel to the longitudinal (side) edge of the split-screen window; a size adjustment control 404 may be disposed on this longitudinal dividing line. When detecting that the user drags the adjustment control 404 to the left, the tablet computer 100 moves the longitudinal dividing line to the left with the user's finger, reducing the lateral side length of split screen area 2 and increasing the lateral side lengths of split screen area 1 and split screen area 3. When detecting that the user drags the adjustment control 404 to the right, the tablet computer 100 moves the longitudinal dividing line to the right with the user's finger, increasing the lateral side length of split screen area 2 and reducing the lateral side lengths of split screen area 1 and split screen area 3.
In some embodiments, when the user is detected to drag the adjustment control 404 to the left side of the split window, the lateral side length of the split region 2 is reduced to zero, and the tablet 100 deletes the application window 2 displayed in the split region 2. Similarly, when the user drags the adjustment control 404 to the right side of the split window, the lateral side lengths of the split region 1 and the split region 3 decrease to zero, and the tablet computer 100 deletes the application windows displayed in the split region 1 and the split region 3.
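The divider-drag behaviour of the size adjustment controls, including removing a region whose edge length shrinks to zero, can be sketched as follows; a minimal Python illustration for the transverse dividing line, where the threshold constant and return convention are assumptions for this example:

```python
DELETE_THRESHOLD = 20  # px; stands in for "less than a preset distance from the edge"

def drag_horizontal_divider(divider_y, delta_y, window_top, window_bottom):
    """Move the transverse dividing line with the drag, clamped to the
    split-screen window. When the line comes within the threshold of an
    edge, the split screen area on that side shrinks to zero and the
    application window it displayed is removed."""
    new_y = max(window_top, min(window_bottom, divider_y + delta_y))
    if new_y - window_top < DELETE_THRESHOLD:
        return window_top, "remove_upper_window"
    if window_bottom - new_y < DELETE_THRESHOLD:
        return window_bottom, "remove_lower_window"
    return new_y, None
```

The longitudinal dividing line would be handled symmetrically on the x axis.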
Taking application window 1 displayed in split screen area 1 as an example: if the aspect ratio of application window 1 is adjustable, when the longitudinal or lateral side length of split screen area 1 changes, the aspect ratio of application window 1 is adjusted to be equal or close to that of split screen area 1, so that application window 1 adapts to the adjusted split screen area 1. If the aspect ratio of application window 1 is not adjustable, when the longitudinal or lateral side length of split screen area 1 changes, application window 1 is reduced or enlarged as a whole so as to adapt to the adjusted split screen area 1.
As shown in figs. 10E and 10F, when the aspect ratio of application window 2 is adjustable and the lateral side length of split screen area 2 is reduced, the aspect ratio of application window 2 is reduced (i.e., after application window 2 is scaled down as a whole to the lateral side length of split screen area 2, its longitudinal side length is increased) until it equals that of split screen area 2, so that application window 2 adapts to the adjusted split screen area 2. As shown in figs. 10E and 10G, when the aspect ratio of application window 2 is not adjustable and the lateral side length of split screen area 2 is reduced, application window 2 is scaled down as a whole to adapt to the adjusted split screen area 2; in this case the adjusted aspect ratios of application window 2 and split screen area 2 generally differ, and a blank area exists in split screen area 2.
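The adaptive adjustment described above can be sketched as follows; a minimal Python illustration, with the function name and parameters chosen for this example:

```python
def fit_window(region_w, region_h, win_w, win_h, ratio_adjustable):
    """Fit an application window into its split screen area. If the
    window's aspect ratio is adjustable it simply fills the area;
    otherwise it is scaled uniformly to the largest size that fits,
    which may leave a blank band in the area."""
    if ratio_adjustable:
        return region_w, region_h
    scale = min(region_w / win_w, region_h / win_h)
    return win_w * scale, win_h * scale
```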
Not limited to the display-size adjustment manners described in figs. 10E to 10F, in the embodiments of the present application, the display size of each application window may also be adjusted in other manners.
Similarly, after the PC 300 displays the split window 15 across devices, the user may also adjust the display position and display size of each task in the split window 15 through the split window 15 displayed by the PC 300. Specifically, reference may be made to the related description on the tablet pc 100 side, which is not repeated here.
In some embodiments, taking the cross-device split window 15 as an example, after the PC 300 displays the split window 15, a user may control any application window in the split window 15 on the PC 300 to implement a function of the application window in the source device.
For the different cross-device split-screen implementations described above (i.e., implementation one to implementation three), the principles of reverse control and management of the split-screen window 15 may also differ. This is exemplified below with application window 2 in the split-screen window 15.
Illustratively, as shown in fig. 11A, the application window 2 of the instant messaging application in the split-screen window 15 includes a chat input box 801; after detecting the input operation 2, the PC 300 switches the display state of the chat input box 801 to an inputtable state, and displays the virtual keyboard 802 in the application window 2; input operation 2 is a click operation of clicking on the chat input box 801.
In some embodiments, the aforementioned implementation one is employed for the cross-device split screen (i.e., the virtual display screen of the tablet computer 100 is projected onto the PC 300). The PC 300 detects input operation 2 acting on coordinate 1 of its display screen; after determining that coordinate 1 is within the display area of the split-screen window 15, the PC 300 converts coordinate 1 into relative coordinate 2 of input operation 2 within the split-screen window 15; the PC 300 then sends the operation information of input operation 2 (e.g., operation type, operation duration) and relative coordinate 2 to the tablet computer 100. Based on the operation information, relative coordinate 2, and the current user interface layout of application window 2, the tablet computer 100 determines that input operation 2 is a click operation acting on the chat input box 801; in response to the click operation, the tablet computer 100 updates the display content of application window 2 in the split-screen window 15 on the virtual display screen, i.e., switches the display state of the chat input box 801 to the inputtable state and displays the virtual keyboard 802 in application window 2. The tablet computer 100 projects the updated display content of the virtual display screen to the PC 300 in real time, so that the PC 300 displays the updated split-screen window 15. In one implementation, the tablet computer 100 may synchronize the updated task information of application window 2 to the mobile phone 200, ensuring that the copies of application window 2 running on the mobile phone 200 and the tablet computer 100 remain synchronized.
In one implementation, when the PC 300 displays the split window 15 full screen, the coordinates 1 of the input operation 2 on the display screen of the PC 300 and the relative coordinates 2 of the input operation 2 in the split window 15 are the same.
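The hit test and coordinate conversion used in reverse control can be sketched as follows; a minimal Python illustration, with the function name and rectangle parameters chosen for this example (when the split-screen window is displayed full screen at the origin, the two coordinates coincide, as noted above):

```python
def to_window_coords(screen_x, screen_y, win_x, win_y, win_w, win_h):
    """Convert a display-screen coordinate (coordinate 1) into a
    coordinate relative to the split-screen window (relative coordinate
    2). Returns None when the input falls outside the window, so the
    event is handled locally instead of being forwarded."""
    inside = (win_x <= screen_x < win_x + win_w
              and win_y <= screen_y < win_y + win_h)
    if not inside:
        return None
    return screen_x - win_x, screen_y - win_y
```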
In some embodiments, the foregoing implementation two or implementation three is employed for the cross-device split screen (i.e., the PC 300 obtains the layout information of the split-screen window and the task information of application window 2, and runs application window 2 in split screen area 2 of the split-screen window 15). The PC 300 detects input operation 2 acting on coordinate 1 of its display screen; after determining that coordinate 1 is within the display area of the split-screen window 15, the PC 300 converts coordinate 1 into relative coordinate 2 of input operation 2 within the split-screen window 15. Based on the operation information, relative coordinate 2, and the current user interface layout of application window 2, the PC 300 determines that input operation 2 is a click operation acting on the chat input box 801; in response to the click operation, the PC 300 updates the display content of application window 2 in the split-screen window 15, i.e., switches the display state of the chat input box 801 to the inputtable state and displays the virtual keyboard 802 in application window 2. In one implementation, the PC 300 may further synchronize the updated task information of application window 2 to the mobile phone 200, either directly or via the tablet computer 100, ensuring that the copies of application window 2 running on the mobile phone 200, the tablet computer 100 and the PC 300 remain synchronized.
In some embodiments, taking the cross-device split-screen window 15 as an example, after displaying the split-screen window, the PC 300 may manage the application windows in it. For example, taking application window 1 in the split-screen window 15: application window 1 may be displayed full screen within the split-screen window 15; after full-screen display, the split-screen display of application window 1 within the split-screen window 15 may be restored; and application window 1 in the split-screen window 15 may be closed.
Fig. 11B to 11G illustrate some examples of the PC 300 managing the application window in the split-screen window 15, for example.
As shown in fig. 11B, the split-screen window 15 displayed by the PC 300 further includes a trigger control for a management mode, which may be presented as a drop-down indicator bar 501 at the top of the split-screen window 15, or displayed in other forms at other positions of the split-screen window 15. As shown in figs. 11B and 11C, after the PC 300 detects an input operation of the user clicking the drop-down indicator bar 501, it displays the management box 502, the minimize control 503, the zoom-out control 504, and the close control 505 of the split-screen window 15, together with introduction information corresponding to each application window, for example, introduction information 506 corresponding to application window 2, which may indicate the source device and the application to which application window 2 belongs. The minimize control 503 is used to hide the split-screen window 15; the zoom-out control 504 is used to shrink the split-screen window 15; the close control 505 is used to close the split-screen window 15 and stop the cross-device split screen.
As shown in fig. 11C, the management box 502 may include a pull-up indicator 502A; upon detecting that the user clicks the pull-up indicator 502A, the PC 300 may stop displaying the management box 502 and return to the split-screen window 15 shown in fig. 11B.
As shown in FIG. 11C, the management box 502 may also include options corresponding to each application window, such as option 502B corresponding to application window 1 of the Word application, as well as full screen control 502C and close control 502D. As shown in fig. 11C and 11D, the PC 300 switches the option 502B from the unselected state to the selected state upon detecting an input operation of clicking the option 502B corresponding to the application window 1. As shown in fig. 11D and 11E, in the case where the option 502B of the application window 1 is selected, after detecting an input operation of clicking the full-screen control 502C, the PC 300 displays the application window 1 of the Word application in the full-screen in the split-screen window 15, and switches the full-screen control 502C to the resume split-screen control 502E. The restore split control 502E is used to restore the application window 1 displayed in full screen in the split window 15 to each application window displayed in split screen shown in fig. 11D.
As shown in fig. 11F and 11G, in the case where the option 502B of the application window 1 is selected, after detecting an input operation of clicking the close control 502D, the PC 300 closes the application window 1 in the split window 15, and repartitions the split window 15 into 2 split areas, and adaptively adjusts the display areas of other application windows in the split window 15 in the split areas.
In some embodiments, cross-device split screen is not limited to recently run tasks of the local device or a cooperative device; it may also be performed on a collection task of the local device or a cooperative device, or on a preset application window of an application program installed on the local device or a cooperative device. In the following, taking the mobile phone 200 as an example, other tasks of the mobile phone 200 that can be split across devices are described by way of example.
In some embodiments, the preset application window of an application program may include the home page of the application program, the application window of the application program with the highest historical running frequency within a preset duration, or the application window with the longest historical running duration within the preset duration. Area 2 of the multitasking interface 13 may also display an application icon of the currently selected application of the mobile phone 200; by dragging the application icon onto the configuration control 401, a preset application window of the application may be added to the cross-device split-screen window, and the tablet computer 100 may then trigger the target device (e.g., the PC 300) to perform cross-device split-screen display of the preset application window.
By way of example, taking the example where the preset application window is the home page, fig. 12A to 12D illustrate one example of a device-across split screen for the home page in the application.
As shown in fig. 12A, the user has added application window 2 of the instant messaging application to the cross-device split-screen window through the configuration control 401; the device identifier 302 of the mobile phone 200 displayed on the multitasking interface 13 is selected, and area 2 of the multitasking interface 13 further includes a control 601. After detecting that the user clicks the control 601, the tablet computer 100 displays, in area 2, application icons of a plurality of application programs on the mobile phone 200, for example, the application icon 602 of a video application, where the application icon 602 may be regarded as the task identifier corresponding to the home page of the video application.
As shown in figs. 12A and 12B, similar to the task cards described above, upon detecting a sliding operation in which the user long-presses and then drags the application icon 602, the tablet computer 100 displays the application icon 602 moving with the user's finger; upon detecting that the application icon 602 is dragged onto the configuration control 401, the tablet computer 100 adds the home page of the video application to the cross-device split-screen window 16. As shown in fig. 12C or fig. 12D, the cross-device split-screen window 16 includes the home page of the video application. In fig. 12C, the home page of the video application is displayed according to its original aspect ratio. In fig. 12D, the aspect ratio of the home page of the video application is adjustable, and the tablet computer 100 adjusts it for display according to the aspect ratio of the split screen area in which it is located.
In some embodiments, the collection tasks of the mobile phone 200 include tasks manually collected by the user and/or tasks automatically collected by the device. A collection task may include a user interface of any application installed on the mobile phone 200, such as an activity page or a Tab page. Area 2 of the multitask management interface 13 may also display the task cards of the collection tasks of the currently selected mobile phone 200; by dragging the task card corresponding to a collection task to the configuration control 401, the collection task can be added to the cross-device split-screen window, so that the tablet computer 100 can trigger the target device (e.g., the PC 300) to display the collection task in a cross-device split screen.
A manual collection task may refer to a user interface that the user collects through a specific operation while browsing an application of interest on the mobile phone 200. In one implementation, the mobile phone 200 stores a multi-task queue 3 for maintaining task information of manual collection tasks (e.g., collection task 1); each time the mobile phone 200 exits collection task 1, it may automatically obtain and update the task information of collection task 1 in the multi-task queue 3, and the task information may include an interface snapshot of collection task 1.
An automatic collection task may refer to a commonly used task determined by the mobile phone 200 according to historical usage data of each application window; for example, the M tasks with the highest historical running frequency within a preset duration, and/or the M tasks with the longest historical running duration within the preset duration, where M is a positive integer. In one implementation, the mobile phone 200 stores a multi-task queue 4 for maintaining task information of automatic collection tasks (e.g., task 2); the mobile phone 200 records historical usage data of the tasks it has run within a preset period (e.g., 5 days). Each time the mobile phone 200 exits a task (for example, task 2), it determines whether task 2 is a commonly used task according to the historical usage data of each task; if task 2 is a commonly used task, the mobile phone 200 obtains the task information of task 2 and adds task 2 to the multi-task queue 4 of automatic collection tasks, where the task information of task 2 includes an interface snapshot of the task.
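The automatic collection described above amounts to a top-M filter over per-task usage records. A non-normative sketch; the record fields and function name are assumptions, while the frequency/duration criteria and the positive integer M come from the disclosure:

```python
from collections import namedtuple

# Per-task historical usage within the tracked period (e.g., 5 days).
UsageRecord = namedtuple("UsageRecord", "task_id run_count total_duration")

def automatic_collection(records, m):
    """Treat a task as 'commonly used' if it is among the M tasks with
    the highest run frequency or among the M tasks with the longest
    total run duration within the tracked period."""
    by_freq = sorted(records, key=lambda r: r.run_count, reverse=True)[:m]
    by_dur = sorted(records, key=lambda r: r.total_duration, reverse=True)[:m]
    return {r.task_id for r in by_freq} | {r.task_id for r in by_dur}
```

On exiting a task, the device would check membership in this set before adding the task to multi-task queue 4.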
Fig. 13A to Fig. 13C illustrate one example of cross-device split screen for a collection task.
As shown in fig. 12A and 13A, the device identifier 302 of the mobile phone 200 displayed on the multitask management interface 13 is selected, and area 2 of the interface further includes a control 601. After detecting that the user clicks the control 601, the tablet computer 100 displays, in area 2, application icons of a plurality of application programs of the mobile phone 200, for example, an application icon 602 of a video application; when the collection tasks of the mobile phone 200 include at least one task of the video application, a collection identifier 603 is displayed on the application icon 602 of the video application. In one implementation, dragging the application icon 602 of the video application to the configuration control 401 may trigger the tablet computer 100 to add the home page of the video application to the cross-device split-screen window; clicking the application icon 602 may trigger the tablet computer 100 to display the collection tasks of the video application of the mobile phone 200.
As shown in fig. 13A and 13B, upon detecting that the user clicks the application icon 602, the tablet computer 100 displays the task cards respectively corresponding to at least one collection task of the video application of the mobile phone 200, such as the task card 604. Similarly, as shown in fig. 13B and 13C, upon detecting that the user long-presses the task card 604 and then drags it to the configuration control 401, the tablet computer 100 adds the collection task corresponding to the task card 604 (i.e., the application window corresponding to a popular video of the video application) to the cross-device split-screen window 17.
Fig. 14A to Fig. 14C illustrate another example of viewing collection tasks and application icons in area 2.
As shown in fig. 14A, when the multitask management interface 13 displays, in area 2, the most recently run tasks of the selected device (the mobile phone 200), prompt information 605 may be displayed; the prompt information 605 prompts the user to view collection tasks through a sliding operation. As shown in fig. 14A and 14B, upon detecting the user's slide-down operation in area 2, the tablet computer 100 displays, in area 2, the task cards corresponding to the collection tasks of the mobile phone 200. The arrangement order of these task cards in area 2 may be determined based on one or more factors such as the most recent use time, the historical running frequency, or the historical running duration of the collection tasks, which is not limited herein. When the task cards corresponding to all collection tasks of the mobile phone 200 cannot be displayed due to the screen size, the user can view more of them by sliding leftward/rightward in area 2.
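The ordering of task cards in area 2 is left open by the embodiment; one possible realisation is a lexicographic sort over the three factors mentioned. A hypothetical sketch, with field names that are assumptions:

```python
def order_collection_cards(tasks):
    """Order collection task cards for area 2: most recently used first,
    ties broken by historical run frequency, then by run duration.
    (One possible weighting; the embodiment leaves the factors open.)"""
    return sorted(
        tasks,
        key=lambda t: (t["last_used"], t["run_count"], t["duration"]),
        reverse=True,
    )
```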
As shown in fig. 14B, when the task cards corresponding to the collection tasks of the selected device (the mobile phone 200) are displayed in area 2, the multitask management interface 13 may display prompt information 606; the prompt information 606 prompts the user to view the application icons of the application programs of the mobile phone 200 through a sliding operation. As shown in fig. 14B and 14C, upon detecting the user's slide-down operation in area 2, the tablet computer 100 displays, in area 2, application icons of the application programs installed on the mobile phone 200. When, limited by the screen size, the application icons of all applications of the mobile phone 200 cannot be displayed, the user can view more application icons by sliding leftward/rightward in area 2.
In the embodiments of the present application, the tablet computer 100 may obtain multi-task data of the local device and of each cooperative device (for example, the mobile phone 200). Based on the multi-task data of the mobile phone 200, the tablet computer 100 may display, in area 2 of the multitask management interface 13, the task identifier (for example, a task card or an application icon) corresponding to a preset task of the mobile phone 200, so as to perform cross-device split screen on that preset task through the task identifier. The multi-task data of the mobile phone 200 may include part or all of: the multi-task queue 2 corresponding to recently run tasks, the multi-task queue 3 corresponding to manual collection tasks, the multi-task queue 4 corresponding to automatic collection tasks, and a task list corresponding to the preset application windows of installed applications.
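The multi-task data enumerated above can be modelled as one record per device aggregating the queues. A rough sketch with illustrative field names (queue 2 = recent tasks, queue 3 = manual collection, queue 4 = automatic collection); the class and attribute names are not from the disclosure:

```python
from dataclasses import dataclass, field

@dataclass
class TaskInfo:
    task_id: str
    app_id: str
    snapshot: bytes = b""  # interface snapshot of the task

@dataclass
class MultiTaskData:
    recent: list = field(default_factory=list)            # multi-task queue 2
    manual_collected: list = field(default_factory=list)  # multi-task queue 3
    auto_collected: list = field(default_factory=list)    # multi-task queue 4
    preset_windows: list = field(default_factory=list)    # home pages, etc.

    def all_task_ids(self):
        # Flatten every queue into the identifiers shown in area 2.
        queues = (self.recent, self.manual_collected,
                  self.auto_collected, self.preset_windows)
        return [t.task_id for q in queues for t in q]
```

The first terminal would hold one such record for itself and one per cooperative device, refreshed from the device that owns the tasks.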
The present application provides a cross-device split-screen method, applied to a first terminal, where the first terminal has at least one cooperative device and the at least one cooperative device includes a third terminal; the method includes, but is not limited to, steps S101 to S104.
S101: the first terminal displays a task management interface, where the task management interface includes a device identifier of the first terminal and device identifiers respectively corresponding to the at least one cooperative device; the display state of a device identifier includes a selected state and an unselected state, and the selected state indicates that the terminal corresponding to the device identifier is selected; a first area of the task management interface is used for displaying task identifiers respectively corresponding to at least one preset task of the currently selected terminal.
Illustratively, the first terminal may be the aforementioned terminal 100 (e.g., the tablet computer 100), and the third terminal may be the aforementioned terminal 300 (e.g., the PC 300). The task management interface may be the aforementioned multi-device task management interface 13. The first area may be the aforementioned area 2; the task identifier corresponding to a task may be the aforementioned task card or application identifier.
In one implementation, the preset task includes part or all of the following: recently run tasks, preset application windows in installed application programs, and collection tasks; the recently run tasks include background running tasks and/or foreground running tasks; the collection tasks include part or all of the following: tasks manually collected by the user, the M tasks with the longest historical running duration within a preset duration, and the M tasks with the highest historical running frequency within the preset duration, where M is a positive integer.
In one implementation, the preset application window includes a home page of the application program, the application window of the application program with the highest historical running frequency within a preset duration, or the application window of the application program with the longest historical running duration within the preset duration.
In one implementation, the preset task includes a recently run task, the task identifier corresponding to the recently run task is a task card, and the task card includes the application identifier of the application to which the recently run task belongs and an interface snapshot taken when the recently run task was switched to the background.
S102: the first terminal receives a first input operation acting on the task identifier of a first task in the first area; in response to the first input operation, the first task is added as a task of a split-screen window; the first task is a task of any one of the first terminal and the at least one cooperative device.
S103: the first terminal receives a second input operation acting on the task identifier of a second task in the first area; in response to the second input operation, the second task is added as a task of the split-screen window; the second task is a task of any one of the first terminal and the at least one cooperative device.
In one implementation, the task management interface further includes a first control; the first input operation includes a sliding operation of long-pressing the task identifier of the first task and then dragging it to the first control; the second input operation includes a sliding operation of long-pressing the task identifier of the second task and then dragging it to the first control. The first input operation and the second input operation may also be other operations, such as a long-press operation, which is not particularly limited in the embodiments of the present application.
For example, referring to the related embodiments of fig. 5A-5D, the first control may be the configuration control 401 described previously; the first task may be an application window 2 of an instant messaging application of the cooperative device (i.e. the mobile phone 200), and the corresponding task identifier may be a task card 309; the second task may be the application window 1 of the Word application of the local (i.e., tablet 100), and the corresponding task identifier may be the task card 305. Or the first task may be the aforementioned application window 1 and the second task may be the aforementioned application window 2.
For example, referring to the related embodiments of fig. 9A-9C, the first control may be the configuration control 401 described previously; the first task and the second task may be any two of the application window 2, the application window 1, and the application window 3 of the gallery application of the local (i.e., the tablet 100).
In one implementation, the at least one cooperative device further includes a second terminal, and the first task is a task of the second terminal; before the first terminal receives the first input operation acting on the task identifier of the first task in the first area, the method further includes: the first terminal receives a fourth input operation acting on the device identifier of the second terminal; in response to the fourth input operation, the first terminal switches the display state of the device identifier of the second terminal from the unselected state to the selected state, and the first area is used for displaying task identifiers corresponding to at least one preset task of the second terminal, where the at least one preset task of the second terminal includes the first task.
For example, referring to the related embodiments of fig. 4C to 5B, the second terminal may be the mobile phone 200, and the first task may be the application window 2 of the mobile phone 200; the fourth input operation may include a click operation on the device identifier of the mobile phone 200. When the tablet computer 100 opens the multitask management interface 13, the initial state of the device identifier of the tablet computer 100 is the selected state; that is, the tablet computer 100 is selected by default, and the multitask management interface 13 displays the task cards of the preset tasks of the tablet computer 100. After the user clicks the device identifier of the mobile phone 200, the multitask management interface 13 displays the task cards of the preset tasks of the mobile phone 200.
In one implementation, the third terminal is configured with at least two display screens; the method further includes: upon detecting that the first control is dragged to the device identifier of the third terminal, displaying display screen identifiers respectively corresponding to the two display screens; the third input operation includes dragging the first control to the display screen identifier of a first display screen, in which case the first instruction is used for instructing the third terminal to display the split-screen window only through the first display screen; the two display screens include the first display screen.
For example, referring to the related embodiment of fig. 7B, the two display screens may be display screens configured by the PC 300, such as the main screen and the auxiliary screen 1.
S104: the first terminal receives a third input operation, where the third input operation is used for determining that the target device of the cross-device split screen is the third terminal; in response to the third input operation, the first terminal sends a first instruction to the third terminal; the first instruction is used for instructing the third terminal to display the split-screen window, a first split-screen area of the split-screen window is used for displaying the first task, and a second split-screen area of the split-screen window is used for displaying the second task.
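On the first terminal, steps S101 to S104 amount to accumulating the dragged tasks and then sending a single instruction to the chosen target. A minimal, non-normative sketch; `send_instruction` and the message format are invented for illustration and are not from the disclosure:

```python
class SplitScreenController:
    """Sketch of steps S101-S104 on the first terminal. Device/task
    identifiers and the message schema are illustrative assumptions."""

    def __init__(self, send_instruction):
        self.split_tasks = []               # tasks added to the split-screen window
        self.send_instruction = send_instruction

    def on_task_dragged(self, task):
        # S102 / S103: first and second input operations add a task,
        # which may belong to the local or any cooperative device.
        self.split_tasks.append(task)

    def on_target_chosen(self, target_device):
        # S104: third input operation; the first instruction tells the
        # target to show one split-screen region per added task.
        self.send_instruction(target_device, {
            "type": "show_split_window",
            "regions": list(self.split_tasks),
        })
```

Dragging application window 2 and then application window 1 onto the configuration control, followed by dragging it to the PC's device identifier, would produce one `show_split_window` message with two regions.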
In one implementation, the third input operation includes a sliding operation of long-pressing the first control and then dragging it to the device identifier of the third terminal.
In one implementation, the preset task includes a recently run task and a preset application window in an installed application program; the first task is the preset application window in the application program; before the first terminal receives the first input operation acting on the task identifier of the first task in the first area, the first terminal displays, in the first area of the task management interface, the task identifiers corresponding to the recently run tasks of the selected device, and the method further includes: receiving a fifth input operation; in response to the fifth input operation, the first terminal displays, in the first area, the application identifier of at least one application program installed on the selected device, where the application identifier is the task identifier of the preset application window in the application program; the at least one application program includes the application to which the first task belongs.
In one implementation, when the first terminal displays, in the first area of the task management interface, the task identifiers corresponding to the recently run tasks of the selected device, a second control is further displayed in the task management interface, and the fifth input operation acts on the second control.
For example, referring to the related embodiments of fig. 12A to 13A, when the selected terminal is the mobile phone 200, the second control may be the control 601, and the fifth input operation may include a click operation on the control 601. When the mobile phone 200 is selected, area 2 of the multitask management interface first displays the task identifiers corresponding to the recently run tasks of the mobile phone 200; after the user clicks the control 601, area 2 displays the application identifiers (e.g., application icons) of the application programs of the mobile phone 200, such as the application identifier of the instant messaging application.
For example, referring to the related embodiment of fig. 14A to 14C, when the selected terminal is the mobile phone 200, the fifth input operation may include a slide-down operation acting on the area 2. The fifth input operation may be other operations, and is not particularly limited herein.
In one implementation, the preset task further includes a collection task; when the collection tasks of the currently selected device include a task of a first application program, the first terminal displays a collection identifier corresponding to the application identifier of the first application program, where the collection identifier indicates that the first application program has at least one collection task; the method further includes: the first terminal receives a sixth input operation acting on the application identifier of the first application program; in response to the sixth input operation, the first terminal displays, in the first area, the task identifiers corresponding to the collection tasks of the first application program installed on the selected device.
For example, referring to the related embodiments of fig. 13A to 13C, when the selected terminal is the mobile phone 200, the first application program may be the video application of the mobile phone 200, and the sixth input operation may be a click operation on the application icon 602 of the video application. If the video application of the mobile phone 200 has a collection task, the multitask management interface also displays a collection identifier 603 on the application icon 602 of the video application when displaying the application icons of the application programs of the mobile phone 200.
In one implementation, the first task and the second task are tasks of the first terminal.
In one implementation manner, the at least one cooperative device further includes a second terminal, the first task is a task of the first terminal, and the second task is a task of the second terminal.
In one implementation, before the task identifiers corresponding to the at least one preset task of the second terminal are displayed in the first area, the method further includes: the first terminal obtains multi-task data of the second terminal, where the multi-task data includes task information of the at least one preset task of the second terminal.
The embodiments of the present application may be arbitrarily combined to achieve different technical effects.
The above embodiments may be implemented in whole or in part by software, hardware, firmware, or any combination thereof. When implemented in software, they may be realized in whole or in part in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer program instructions are loaded and executed on a computer, the processes or functions according to the present application are produced in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable apparatus. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another; for example, the computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center in a wired (e.g., coaxial cable, optical fiber, or digital subscriber line) or wireless (e.g., infrared, radio, or microwave) manner. The computer-readable storage medium may be any available medium that can be accessed by a computer, or a data storage device such as a server or a data center integrating one or more available media. The available medium may be a magnetic medium (e.g., a floppy disk, a hard disk, or a magnetic tape), an optical medium (e.g., a DVD), or a semiconductor medium (e.g., a solid-state drive (SSD)), or the like.
Those of ordinary skill in the art will appreciate that all or part of the above-described method embodiments may be implemented by a computer program instructing related hardware; the program may be stored in a computer-readable storage medium and, when executed, may include the procedures of the above-described method embodiments. The aforementioned storage medium includes: a ROM, a random access memory (RAM), a magnetic disk, an optical disk, or the like.
In summary, the foregoing description covers only exemplary embodiments of the present application and is not intended to limit its scope. Any modification, equivalent replacement, or improvement made according to the disclosure of the present application shall fall within the protection scope of the present application.

Claims (16)

1. A cross-device split-screen method, characterized in that the method is applied to a first terminal, the first terminal has at least one cooperative device, and the at least one cooperative device comprises a third terminal; the method comprises the following steps:
The first terminal displays a task management interface, wherein the task management interface comprises a device identifier of the first terminal and device identifiers respectively corresponding to the at least one cooperative device; the display state of a device identifier comprises a selected state and an unselected state, wherein the selected state is used for indicating that the terminal corresponding to the device identifier is selected; a first area of the task management interface is used for displaying task identifiers respectively corresponding to at least one preset task of the currently selected terminal;
The first terminal receives a first input operation acting on the task identifier of a first task in the first area; in response to the first input operation, the first task is added as a task of a split-screen window; the first task is a task of any one of the first terminal and the at least one cooperative device;
The first terminal receives a second input operation acting on the task identifier of a second task in the first area; in response to the second input operation, the second task is added as a task of the split-screen window; the second task is a task of any one of the first terminal and the at least one cooperative device;
the first terminal receives a third input operation, wherein the third input operation is used for determining that the target device of the cross-device split screen is the third terminal; and in response to the third input operation, sending a first instruction to the third terminal;
the first instruction is used for indicating the third terminal to display the split screen window, a first split screen area of the split screen window is used for displaying the first task, and a second split screen area of the split screen window is used for displaying the second task.
2. The method of claim 1, wherein the task management interface further comprises a first control;
The first input operation comprises a sliding operation of long-pressing the task identifier of the first task and then dragging it to the first control;
The second input operation comprises a sliding operation of long-pressing the task identifier of the second task and then dragging it to the first control.
3. The method of claim 2, wherein the third input operation comprises a sliding operation of long-pressing the first control and then dragging it to a device identifier of the third terminal.
4. A method according to claim 3, wherein the third terminal is configured with at least two display screens; the method further comprises the steps of:
When detecting that the first control is dragged to the device identifier of the third terminal, displaying the display screen identifiers corresponding to the two display screens respectively;
The third input operation comprises dragging the first control to a display screen identifier of a first display screen, in which case the first instruction is used for instructing the third terminal to display the split-screen window only through the first display screen; the two display screens comprise the first display screen.
5. The method of claim 1, wherein the at least one cooperative device further comprises a second terminal, and the first task is a task of the second terminal; before the first terminal receives the first input operation acting on the task identifier of the first task in the first area, the method further comprises:
The first terminal receives a fourth input operation acting on the device identifier of the second terminal;
In response to the fourth input operation, the first terminal switches the display state of the device identifier of the second terminal from the unselected state to the selected state, and the first area is used for displaying task identifiers corresponding to at least one preset task of the second terminal, wherein the at least one preset task of the second terminal comprises the first task.
6. The method according to claim 1, wherein the preset task comprises part or all of the following: recently run tasks, preset application windows in installed application programs, and collection tasks; the recently run tasks comprise background running tasks and/or foreground running tasks;
The collection tasks comprise part or all of the following: tasks manually collected by the user, the M tasks with the longest historical running duration within a preset duration, and the M tasks with the highest historical running frequency within the preset duration, wherein M is a positive integer.
7. The method of claim 6, wherein the preset application window comprises a home page of the application program, the application window of the application program with the highest historical running frequency within a preset duration, or the application window of the application program with the longest historical running duration within the preset duration.
8. The method of claim 6, wherein the preset task comprises a recently run task and a preset application window in an installed application program; the first task is the preset application window in the application program;
Before the first terminal receives the first input operation acting on the task identifier of the first task in the first area, the first terminal displays, in the first area of the task management interface, the task identifiers corresponding to the recently run tasks of the selected device, and the method further comprises:
Receiving a fifth input operation;
responding to the fifth input operation, the first terminal displays an application identifier of at least one application program installed by the selected equipment in the first area, wherein the application identifier is a task identifier of a preset application program window in the application program; the at least one application program includes an application to which the first task belongs.
9. The method according to claim 8, wherein when the first terminal displays a task identifier corresponding to a most recently operated task of the selected device in the first area of the task management interface, a second control is further displayed in the task management interface, and the fifth input operation acts on the second control.
10. The method of claim 8, wherein the preset tasks further comprise the collection task; when the collection task of the currently selected device comprises the task of the first application program, the first terminal displays a collection identifier corresponding to the application identifier of the first application program, and the collection identifier is used for indicating that at least one collection task exists in the first application program;
The method further comprises the steps of:
the first terminal receives a sixth input operation of an application identifier acting on the first application program;
And responding to the sixth input operation, the first terminal displays a task identifier corresponding to a collection task of the first application program installed by the selected equipment in the first area.
11. The method of claim 6, wherein the preset task comprises the recently run task, the task identifier corresponding to the recently run task is a task card, and the task card comprises the application identifier of the application to which the recently run task belongs and an interface snapshot taken when the recently run task was switched to the background.
12. The method of claim 1, wherein the first task and the second task are tasks of the first terminal.
13. The method of claim 1, wherein the at least one cooperating device further comprises a second terminal, the first task being a task of the first terminal, the second task being a task of the second terminal.
14. The method of claim 1, wherein before displaying, in the first area, a task identifier corresponding to at least one preset task of the second terminal, the method further comprises:
the first terminal acquires multi-task data of the second terminal, wherein the multi-task data comprises task information of at least one preset task of the second terminal.
15. A terminal, comprising a display screen, a memory, one or more processors, and one or more programs, wherein the one or more programs are stored in the memory; and the one or more processors, when executing the one or more programs, cause the terminal to implement the method according to any one of claims 1 to 14.
16. A computer storage medium, comprising computer instructions, which, when run on a terminal, cause the terminal to implement the method according to any one of claims 1 to 14.
CN202211695536.0A 2022-12-28 Cross-equipment split-screen method and related device Pending CN118259995A (en)

Publications (1)

Publication Number Publication Date
CN118259995A 2024-06-28
