CN112703472A - Terminal equipment and graphical user interface and multitask interaction control method thereof - Google Patents


Info

Publication number
CN112703472A
CN112703472A (application CN201880096040.7A)
Authority
CN
China
Prior art keywords: screen, display area, application program, application, interface
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201880096040.7A
Other languages
Chinese (zh)
Inventor
潘英强 (Pan Yingqiang)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Royole Technologies Co Ltd
Original Assignee
Shenzhen Royole Technologies Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Royole Technologies Co Ltd filed Critical Shenzhen Royole Technologies Co Ltd
Publication of CN112703472A publication Critical patent/CN112703472A/en
Pending legal-status Critical Current

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 1/00: Details not covered by groups G06F 3/00 - G06F 13/00 and G06F 21/00
    • G06F 1/16: Constructional details or arrangements
    • G06F 1/1613: Constructional details or arrangements for portable computers
    • G06F 1/1633: Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F 1/1615 - G06F 1/1626
    • G06F 1/1637: Details related to the display arrangement, including those related to the mounting of the display in the housing
    • G06F 1/1652: Details related to the display arrangement, the display being flexible, e.g. mimicking a sheet of paper, or rollable
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487: Interaction techniques based on GUIs using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488: Interaction techniques based on GUIs using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487: Interaction techniques based on GUIs using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488: Interaction techniques based on GUIs using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04886: Interaction techniques based on GUIs using a touch-screen or digitiser, by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 2203/00: Indexing scheme relating to G06F 3/00 - G06F 3/048
    • G06F 2203/048: Indexing scheme relating to G06F 3/048
    • G06F 2203/04803: Split screen, i.e. subdividing the display area or the window area into separate subareas

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Hardware Design (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application discloses a terminal device including a processor and a touch screen. The touch screen includes at least a first screen, a third screen, and a second screen located between the first screen and the third screen; the second screen is a flexible touch screen. The processor controls the first screen to display a graphical user interface that presents a multi-application interface, the multi-application interface comprising at least two application interfaces arranged in sequence. In response to an operation of dragging an application interface displayed on the first screen to the second screen, the processor controls that application interface to be displayed in the foreground on the third screen. The application also discloses the graphical user interface and a multitask interaction control method thereof. According to the application, left-right split-screen display can be achieved by long-pressing an application interface.

Description

Terminal device, graphical user interface thereof, and multitask interaction control method

Technical Field

The present application relates to the field of graphical user interface control, and in particular to a terminal device based on a touch screen, a graphical user interface of the terminal device, and a multitask interaction control method.
Background
The flexible touch screen, a new generation of display following the liquid crystal display, is made of soft material that can be deformed and bent, bringing users a novel experience. However, a multitask interaction control method based on such a touch screen is still lacking.
Disclosure of Invention
The embodiments of the present application disclose a terminal device based on a touch screen, a graphical user interface thereof, and a multitask interaction control method, so as to address the above problem.
The terminal device disclosed in the embodiments of the present application includes a processor and a touch screen. The touch screen includes at least a first screen, a third screen, and a second screen located between the first screen and the third screen, the second screen being a flexible touch screen. The processor controls the first screen to display a graphical user interface, which displays a multi-application interface comprising at least two application interfaces arranged in sequence. In response to an operation of dragging an application interface displayed on the first screen to the second screen, the processor controls that application interface to be displayed in the foreground on the third screen.

The graphical user interface disclosed in the embodiments of the present application is applied to a terminal device having a touch screen. The touch screen includes at least a first screen, a third screen, and a second screen located between the first screen and the third screen, the second screen being a flexible touch screen. The first screen displays the graphical user interface, which displays a multi-application interface comprising at least two application interfaces arranged in sequence. The graphical user interface is controlled and implemented by a processor of the terminal device: in response to an operation of dragging an application interface displayed on the first screen to the second screen, the application interface is displayed in the foreground on the third screen.

The embodiments of the present application disclose a multitask interaction control method applied to a terminal device having a touch screen. The touch screen includes at least a first screen, a third screen, and a second screen located between the first screen and the third screen, the second screen being a flexible touch screen. The first screen displays a graphical user interface, which displays a multi-application interface comprising at least two application interfaces arranged in sequence. The multitask interaction control method includes: in response to an operation of dragging an application interface displayed on the first screen to the second screen, controlling the application interface to be displayed in the foreground on the third screen.
With the terminal device and the multitask interaction control method described above, the operation of dragging an application interface displayed on the first screen to the second screen can be responded to, and that application interface can be brought to the foreground on the third screen. Split-screen display can thus be achieved in the folded state, bringing more convenience to the user.
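The drag interaction summarized above can be sketched as a simple event handler. This is an illustrative sketch only: the names (`DragEvent`, `handle_drag`, the screen labels) are assumptions and do not appear in the disclosure.

```python
from dataclasses import dataclass

# Hypothetical labels for the three screen regions of the touch screen.
FIRST, SECOND, THIRD = "first", "second", "third"

@dataclass
class DragEvent:
    app_id: str
    start_screen: str  # region where the drag began
    end_screen: str    # region where the finger was lifted

def handle_drag(event: DragEvent) -> dict:
    """Dragging an application interface from the first screen onto the
    flexible second screen brings that application to the foreground of
    the third screen; any other drag leaves the display unchanged."""
    if event.start_screen == FIRST and event.end_screen == SECOND:
        return {"app": event.app_id, "show_on": THIRD, "foreground": True}
    return {"app": event.app_id, "show_on": event.start_screen, "foreground": False}
```

A drag that does not cross from the first screen onto the second screen simply keeps the interface where it started.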
Drawings
To illustrate the technical solutions in the embodiments of the present application more clearly, the drawings needed for the embodiments are briefly described below. The following drawings show only some embodiments of the present application; those skilled in the art can obtain other drawings from them without creative effort.
Fig. 1 is a schematic block diagram of a hardware module of a terminal device in an embodiment of the present application.
Fig. 2 is a schematic structural diagram of a terminal device in a folded state according to an embodiment of the present application.
Fig. 3 is a schematic structural diagram of a terminal device in a folded state according to another embodiment of the present application.
Fig. 4 is a schematic diagram illustrating a state in which the terminal device displays the graphical user interface from the first screen in the folded state according to another embodiment of the present application.
Fig. 5 is a schematic diagram illustrating a state in which the terminal device displays the multi-application interface from the first screen in the folded state according to an embodiment of the present application.
Fig. 6 is a schematic diagram of the terminal device displaying a taskbar from a first side curved screen of a first screen in a folded state according to an embodiment of the present application.
Fig. 7 is a schematic diagram illustrating another state in which the terminal device displays the taskbar from the first side curved screen of the first screen in the folded state according to an embodiment of the present application.
Fig. 8 is a schematic diagram illustrating a state in which the terminal device displays the taskbar from the second side curved screen of the second screen in the folded state according to an embodiment of the present application.
Fig. 9 is a schematic diagram illustrating a state in which the terminal device displays a multi-application interface from the first screen in a folded state according to an embodiment of the present application.
Fig. 10 is a schematic view of a terminal device from a folded state to an unfolded state in an embodiment of the present application.
Fig. 11 is a schematic diagram illustrating a state in which a terminal device displays a multi-application interface in an unfolded state according to an embodiment of the present application.
Fig. 12 is an interface schematic diagram of a terminal device performing a screen splitting operation in an unfolded state in an embodiment of the present application.
Fig. 13 is a schematic diagram illustrating a state in which the terminal device is split and displays a split screen bar in an unfolded state according to an embodiment of the present application.
Fig. 14 is a schematic diagram illustrating a state in which the terminal device is split and displays a split toolbar in the unfolded state according to an embodiment of the present application.
Fig. 15 is a schematic diagram illustrating a state in which the terminal device is split and displays a split application list in an unfolded state according to an embodiment of the present application.
Fig. 16 is an interface schematic diagram of the terminal device in the folded state according to an embodiment of the present application.
Fig. 17 is a schematic diagram of the terminal device in an embodiment of the present application in a folded state.
Fig. 18 is a schematic view of the first screen of the terminal device in the folded state in the up-down split state in an embodiment of the present application.
Fig. 19 is a schematic diagram illustrating a state in which the terminal device is split left and right in a folded state according to an embodiment of the present application.
Fig. 20 is a schematic view of a state in which the terminal device in an embodiment of the present application is horizontally placed in a folded state.
Fig. 21 is a schematic diagram illustrating a terminal device displaying a task bar and a task shortcut operation bar in a folded state according to an embodiment of the present application.
Fig. 22 is a schematic diagram of the terminal device in an embodiment of the present application performing shortcut operation addition in a folded state.
Fig. 23 is a flowchart of a multitask interaction control method of a terminal device in an embodiment of the present application.
Fig. 24 is a flowchart of a multitask interaction control method for dynamically adapting a terminal device when a display window size is changed in an embodiment of the present application.
Fig. 25 is a flowchart of a multitask interaction control method for performing a left-right split screen operation on a terminal device in an unfolded state according to an embodiment of the present application.
Fig. 26 is a flowchart of a multitask interactive control method for the terminal device to perform the left-right/up-down split screen operation in the folded state in an embodiment of the present application.
Fig. 27 is a flowchart of a multitask interaction control method for performing shortcut operation by a terminal device in an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The terms "comprises" and "comprising," and any variations thereof, in the description and claims of this application and the drawings described above, are intended to cover non-exclusive inclusions. For example, a process, method, system, article, or apparatus that comprises a list of steps or elements is not limited to only those steps or elements listed, but may alternatively include other steps or elements not listed, or inherent to such process, method, article, or apparatus. The terms "first," "second," and "third," etc. in the description and claims of this application and the above-described drawings are used for distinguishing between different objects and not for describing a particular order.
Referring to fig. 1, fig. 1 is a schematic block diagram of a hardware module of a terminal device according to an embodiment of the present application. The terminal device 100 may be, but is not limited to, a bendable terminal device such as a mobile phone, a tablet computer, an e-reader, or a wearable electronic device. The terminal device 100 includes, but is not limited to, a processor 10, and a memory 20, a touch screen 30, an angle sensor 40, a gravitational acceleration sensor 50, and/or a direction sensor 60 each electrically connected to the processor 10. It should be understood by those skilled in the art that fig. 1 is only an example of the terminal device 100 and does not constitute a limitation on it; the terminal device 100 may include more or fewer components than shown in fig. 1, combine some components, or have different components. For example, the terminal device 100 may further include an input-output device, a network access device, a data bus, and the like.
The processor 10 may be a central processing unit (CPU), another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, discrete gate or transistor logic, discrete hardware components, etc. The general-purpose processor may be a microprocessor or any conventional processor. The processor 10 is the control center of the terminal device 100 and connects the various parts of the entire terminal device 100 by using various interfaces and lines.
The memory 20 may be used to store computer programs and/or modules, and the processor 10 implements the various functions of the terminal device 100 by running or executing the computer programs and/or modules stored in the memory 20 and calling data stored in the memory 20. The memory 20 may mainly include a program storage area and a data storage area: the program storage area may store an operating system and the application programs required by various functions (such as a sound playing function and an image playing function); the data storage area may store data created during use of the terminal device 100 (such as audio data and image data). In addition, the memory 20 may include high-speed random access memory, and may also include non-volatile memory, such as a hard disk, a plug-in hard disk, a smart media card (SMC), a secure digital (SD) card, a flash card, one or more magnetic disk storage devices, a flash memory device, or other non-volatile solid-state storage device.
Referring to fig. 2, in an embodiment, the touch screen 30 includes at least a first screen A and a second screen B located at one side of the first screen A, wherein the second screen B is a flexible touch screen that can be bent and assumes an arc shape in the bent state. In this embodiment, the second screen B is formed by bending and extending from one side of the first screen A. It is understood that in other embodiments, the second screen B and the first screen A may be separate screens connected to each other.
The angle sensor 40 is disposed on the second screen B and senses the relative unfold angle between the first screen A and the second screen B. The unfold angle of the first screen A and the second screen B refers to the included angle between the display surface of the first screen A and the display surface of the end of the second screen B far away from the first screen A. At least one angle sensor 40 is disposed on the second screen B for sensing this angle. It is understood that, in other embodiments, the angle sensor 40 may be disposed at any other position of the terminal device 100 where the unfold angle between the first screen A and the second screen B can be sensed, which is not limited herein. When the second screen B is bent so that the display surface of the end of the second screen B far from the first screen A is parallel to the display surface of the first screen A, the unfold angle sensed by the angle sensor 40 is 0 degrees. When the second screen B and the first screen A are completely unfolded so that their display surfaces lie in the same plane, the unfold angle sensed by the angle sensor 40 is 180 degrees. As the second screen B is bent with respect to the first screen A, the sensed angle varies between 0 and 180 degrees.
The gravitational acceleration sensor 50 and/or the direction sensor 60 are disposed on the first screen A, and the gravitational acceleration sensor 50 senses the gravitational acceleration of the first screen A. In this embodiment, the gravitational acceleration is a three-axis gravitational acceleration, the three axes being the X axis, the Y axis, and the Z axis: the X axis is the transverse axis, the Y axis is the longitudinal axis, and the Z axis is the vertical axis. When the terminal device 100 is unfolded and placed on a horizontal plane with the display surface of the screen facing upward, the X axis and the Y axis are perpendicular to each other and lie in the horizontal plane, and the Z axis is perpendicular to the horizontal plane and directed away from the display surface of the screen. When the terminal device 100 is folded or unfolded, the three axes remain stationary with respect to the terminal device 100, but the gravitational acceleration detected on each axis changes. When the Z-axis component of the three-axis gravitational acceleration is positive, the processor 10 determines that the display surface of the first screen A of the terminal device 100 faces upward; when it is negative, the processor 10 determines that the display surface faces downward. The direction sensor 60 senses the orientation of the display surface of the first screen A. When the inclination angle sensed by the direction sensor 60 is between 0 and 180 degrees and the rotation angle is between 0 and 90 degrees, the processor 10 determines that the display surface of the first screen A faces upward; otherwise, the processor 10 determines that it faces downward.
The orientation of the display surface may be upright vertical, inverted vertical, left horizontal, right horizontal, elevation, depression, and the like.
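The two orientation checks above, the sign of the Z-axis gravitational acceleration and the tilt/rotation ranges from the direction sensor, can be sketched as follows. The function names are hypothetical and the sketch assumes the sign and angle conventions stated in this embodiment.

```python
def facing_from_z_accel(z_accel: float) -> str:
    # Positive Z-axis gravitational acceleration: display surface faces up;
    # negative: faces down (per this embodiment's convention).
    return "up" if z_accel > 0 else "down"

def facing_from_direction(tilt_deg: float, rotation_deg: float) -> str:
    # Facing up when the inclination angle is within [0, 180] degrees AND
    # the rotation angle is within [0, 90] degrees; otherwise facing down.
    if 0 <= tilt_deg <= 180 and 0 <= rotation_deg <= 90:
        return "up"
    return "down"
```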
Referring to fig. 3, in another embodiment, the touch screen 30 further includes a third screen C. The third screen C is disposed at the other side of the second screen B, and the second screen B connects the first screen A and the third screen C. The first screen A, the second screen B, and the third screen C are different regions of one integral screen, and the second screen B corresponds to the bending region of the touch screen 30 in the folded state. Alternatively, in yet another embodiment, the first screen A, the second screen B, and the third screen C are screens independent of one another, with adjacent screens connected to each other. In other embodiments, the touch screen 30 may include one first screen A, two second screens B, and two third screens C, wherein the first screen A is located between the two third screens C and is connected to each of them through a second screen B. It is to be understood that the number of first screens A, second screens B, and third screens C is not limited to one and may be two or more.
In an embodiment, in the folded state of the touch screen 30, the aspect ratio of the display window of the first screen A is 16:9, with a resolution of 1440 × 810; the aspect ratio of the display window of the third screen C is 16:8, with a resolution of 1440 × 720; and the aspect ratio of the display window of the second screen B is 48:13, with a resolution of 1440 × 390. In the unfolded state of the touch screen 30, the aspect ratio of the display window is 4:3, with a resolution of 1440 × 1080. It is understood that in other embodiments, the aspect ratios and resolutions of the first screen A, the second screen B, and the third screen C may vary according to the actual situation, and are not described in detail herein.
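The stated ratios follow directly from the resolutions. A quick check, reducing width:height by the greatest common divisor, reproduces them, with the note that 1440 × 720 reduces to 2:1, which the text writes equivalently as 16:8.

```python
from math import gcd

def aspect_ratio(width: int, height: int) -> str:
    """Reduce a width x height resolution to its lowest-terms ratio."""
    g = gcd(width, height)
    return f"{width // g}:{height // g}"

# Display windows stated in the embodiment (width x height):
#   first screen A:  1440 x 810  -> 16:9
#   third screen C:  1440 x 720  -> 2:1  (i.e. 16:8)
#   second screen B: 1440 x 390  -> 48:13
#   unfolded window: 1440 x 1080 -> 4:3
```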
Further, the second screen B includes a first side curved screen B1 adjacent to the first screen A and a second side curved screen B3 adjacent to the third screen C. When the first screen A displays content, the first side curved screen B1 may display in cooperation with it; when the third screen C displays content, the second side curved screen B3 may display in cooperation with it. The first side curved screen B1 thus better complements the display of the first screen A, and the second side curved screen B3 better complements the display of the third screen C, making the displayed content richer and the user interaction experience better.
The angle sensor 40 is disposed on the second screen B, senses the unfold angle between the first screen A and the third screen C, and determines the screen state of the terminal device 100 according to that angle. The screen state includes a folded state, an unfolded state, or a transition state. The "folded state" refers to a state in which the unfold angle of the touch screen 30 is in the range of 0 to 30 degrees; the "unfolded state" refers to a state in which the unfold angle is in the range of 150 to 180 degrees; and the "transition state" refers to a state in which the unfold angle is between 30 and 150 degrees. It is understood that, in other embodiments, the angle sensor 40 may be disposed at any other position of the terminal device 100 where the bending angle between the first screen A and the third screen C can be sensed, and the angle ranges defined above for the folded, unfolded, and transition states can be adjusted according to practical requirements; neither is limited herein. The folded state and the transition state may be collectively referred to as a bent state. A gravitational acceleration sensor 50 and/or a direction sensor 60 may be disposed on the third screen C for sensing the gravitational acceleration and/or direction of the third screen C. The processor 10 can thus determine the display surface orientations of the first screen A and the third screen C (upright vertical, inverted vertical, left horizontal, right horizontal, pitch, and the like) by combining their gravitational acceleration and/or direction.
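The classification of the screen state from the sensed unfold angle can be written as a small helper, using the thresholds given in this embodiment (the function name and boundary handling are illustrative; other embodiments may adjust the ranges).

```python
def fold_state(unfold_angle_deg: float) -> str:
    """Map the unfold angle between the first and third screens to a
    screen state, using this embodiment's thresholds:
    0-30 degrees folded, 150-180 degrees unfolded, otherwise transition."""
    if unfold_angle_deg <= 30:
        return "folded"
    if unfold_angle_deg >= 150:
        return "unfolded"
    return "transition"
```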
Referring to figs. 4 to 5, when the terminal device 100 is in the folded state, the processor 10 controls the first screen A to display a graphical user interface G. The graphical user interface G is a window through which the user interacts with the applications installed on the terminal device 100, and includes a navigation bar N. In this embodiment, the navigation bar N is located at the bottom of the graphical user interface G and includes a return key N1, a return-to-main-menu key N2, and a multitask trigger key N3, arranged in order from left to right. Performing a click operation on the multitask trigger key N3 invokes the multi-application interface D. The click operation here, and those described below, may be gesture operations that select an application through actions such as a double click, a single click, or a slide.
The processor 10 controls the first screen A to display the multi-application interface D in response to the user's multitask trigger operation. The multi-application interface D includes at least two application interfaces arranged in sequence, and an application interface earlier in the sequence partially occludes the one immediately after it. The application interfaces may be arranged left to right, right to left, top to bottom, bottom to top, from top left to bottom right, from top right to bottom left, and so on. In this embodiment, the application interfaces are arranged from left to right: one of the at least two application interfaces is located at the leftmost side of the multi-application interface D, and each subsequent application interface is located to the right of, and partially covered by, the previous one.
The application interface referred to in the embodiments of the present application is the picture an application displays on the display screen, including but not limited to the application icon, a scaled-down or enlarged picture of the application running in the foreground, a scaled-down or enlarged screenshot of the last working picture taken when the application was switched to the background, and the like.
It will be appreciated that in one embodiment, the multi-task trigger operation is a clicking operation performed on multi-task trigger key N3. In another embodiment, the multitask trigger operation may be a preset trigger operation performed at any position of the touch screen 30, where the preset trigger operation includes, but is not limited to, continuously sliding twice on the touch screen 30, performing a preset sliding gesture, such as a circle gesture "O", and the like, and is not limited herein.
Further, in this embodiment, the processor 10 controls all application interfaces in the multi-application interface D to be arranged on the graphical user interface G in order of last run time, that is, the most recently run application interface is displayed at the leftmost side of the multi-application interface D, and so on. It is understood that in other embodiments, the processor 10 may arrange all application interfaces in the multi-application interface D in chronological order of last run time and/or in a combined ordering of priority and time, which is not limited herein.
Specifically, referring to fig. 5, in the present embodiment, in the folded state of the touch screen 30, the multi-application interface D displayed by the first screen a or the third screen C includes three application interfaces, that is, a first application interface, a second application interface and a third application interface. The first application interface is positioned at the forefront, the second application interface is partially obscured by the first application interface, and the third application interface is partially obscured by the second application interface.
It will be appreciated that in one embodiment, the width of the portion of the first application interface that obscures the second application interface is greater than the width of the portion of the second application interface that obscures the third application interface.
Further, in an embodiment, the unobscured width of the second application interface is half the width of the first application interface, and the unobscured width of the third application interface is half the unobscured width of the second application interface. This gives the multi-application interface D a clearer sense of layering and makes it easier for the user to make a selection.
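The half-width rule above fixes the geometry of the stack: with equal-width interfaces, each left edge sits one visible strip to the right of the previous left edge. The following is a minimal geometric sketch; the function names and the generalization to n interfaces are illustrative assumptions.

```python
def stack_lefts(card_width, n):
    """Left edges of n equal-width interfaces stacked left to right, where
    interface k's unobscured strip is card_width / 2**k wide (interface 0 is
    frontmost and fully visible; each later one shows half as much as the
    one before, matching the embodiment's half-width rule)."""
    lefts = [0.0]
    for k in range(1, n):
        # next left edge = previous left edge + this interface's visible width
        lefts.append(lefts[k - 1] + card_width / (2 ** k))
    return lefts

def visible_widths(card_width, lefts):
    """Width of the strip of each interface not covered by the one in front.
    For k > 0 this equals the offset between neighbouring left edges."""
    widths = [card_width]
    for k in range(1, len(lefts)):
        widths.append(lefts[k] - lefts[k - 1])
    return widths
```

For three interfaces of width 4, the visible strips come out as 4, 2, and 1, i.e. each half the previous one.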
It is understood that in other embodiments, the multi-application interface D displayed by the first screen A or the third screen C may include more than three or fewer than three application interfaces in the folded state of the touch screen 30.
In another embodiment, the multi-application interface D that the processor 10 controls the touch screen 30 to display in the expanded state includes five application interfaces (as shown in fig. 11), that is, a first application interface, a second application interface, a third application interface, a fourth application interface and a fifth application interface, with no overlap between the five. In one embodiment, the five application interfaces are arranged at equal intervals. It can be understood that, when the number of application interfaces in the multi-application interface D displayed in the expanded state exceeds a preset value, one application interface is located at the leftmost side of the multi-application interface D, and each remaining application interface is located to the right of, and partially covered by, the previous one.
It is understood that, whether the touch screen 30 is in the folded state or the unfolded state, the processor 10 can respond to a left-right sliding operation performed by the user on the touch screen 30 by sliding all the application interfaces of the multi-application interface D currently displayed on the first screen A or the third screen C together with the sliding operation. When one of the application interfaces slides into a preset area at one side edge of the first screen A or the third screen C, it is hidden, and a new application interface is called out on the other side of the screen to fill the space vacated by the hidden interface. In this way, the user can view more application interfaces by sliding left and right on the touch screen 30. It is understood that an application interface hidden by sliding off one side of the first screen A or the third screen C can be brought back into view from the other side.
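The hide-on-one-side, reappear-on-the-other behaviour above can be modeled as a ring of interfaces with a sliding visible window. This is a sketch under the assumption that the interfaces cycle (a hidden interface eventually re-enters from the opposite side); the names are illustrative.

```python
from collections import deque

def make_ring(names):
    """All application interfaces of the multi-application interface D,
    visible ones listed first."""
    return deque(names)

def slide(ring, visible_count, direction):
    """Slide all interfaces one slot. 'left' hides the leftmost visible
    interface at the edge and pulls a new one in on the right; 'right'
    does the opposite, bringing a hidden interface back from the left."""
    ring.rotate(-1 if direction == "left" else 1)
    return list(ring)[:visible_count]
```

With five interfaces and three visible, one slide left shows the next window of three; sliding right restores the original view.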
Further, the multi-application interface D is displayed above the navigation bar N, and the processor 10 controls the display of the multi-application interface D on the graphical user interface G while displaying a clear key C between the multi-application interface D and the navigation bar N. The processor 10 controls the clearing of all application interfaces on the multi-application interface D in response to the user's clicking operation of the clear key C.
The multi-application interface D also displays the name of each application interface above it; these names may be, for example, "first application", "second application", "third application", etc., as shown in fig. 5. It is understood that these names are merely examples; in practical applications, they may be "QQ", "WeChat", "Mailbox", and the like, and are not limited herein.
The following interaction description takes three applications as an example, namely a first application, a second application and a third application, which are distinguished by shading or by labeling their corresponding application icons. It is to be understood that the interaction described below can be extended to more than three applications or reduced to fewer than three, and is not limited thereto.
Further, referring to fig. 5, when the touch screen 30 is in the folded state and the first screen A displays the multi-application interface D, then, referring to fig. 6, the processor 10 responds to the selection of one application interface from the multi-application interface D by displaying the selected application interface in the foreground of the first screen A and displaying the unselected application interfaces on the second screen B. The unselected application interfaces displayed on the second screen B are located on the curved display surface of the second screen B, while the application interface displayed in the foreground of the first screen A is located on the planar display surface of the first screen A. Further, the unselected application interfaces displayed on the second screen B run in the background; and/or the selected application interface is displayed on the first screen A in full screen.
Specifically, when the processor 10 controls the touch screen 30 to display the multi-application interface D in the folded state, in response to a click operation by which the user clicks any application interface in the multi-application interface D, the clicked application interface is switched to the foreground of the first screen A (where it may be displayed in full screen or partially, according to its preset display size), and the other application interfaces in the multi-application interface D are displayed in sequence on the taskbar T1 on the second screen B. It will be appreciated that the other applications displayed on the taskbar T1 may appear as application icons or as scaled-down application interfaces. Specifically, when the multi-application interface D is displayed on the first screen A and one of its application interfaces receives a click operation from the user, the processor 10 switches that application interface to the foreground of the first screen A, preferably in full screen; meanwhile, the processor 10 also generates the taskbar T1 and displays it on the first side curved screen B1 of the second screen B, where the taskbar T1 shows the application interfaces of the multi-application interface D that were not clicked by the user, preferably as their corresponding application icons. In one embodiment, the application icons are arranged in the taskbar T1 in the same order as their corresponding application interfaces, from top to bottom or from bottom to top.
Specifically, in this embodiment, the processor 10 controls the other application interfaces in the multi-application interface D to be displayed in sequence, from top to bottom, on the taskbar T1 located on the first side curved screen B1. For example, in response to a click operation by which the user clicks the first application interface, the processor 10 displays the first application interface on the first screen A in full screen, and displays the second and third application interfaces of the multi-application interface D, in the form of application icons, from top to bottom on the taskbar T1 on the first side curved screen B1.
Referring also to fig. 7, in response to a click operation by which the user clicks an application interface on the taskbar T1, the processor 10 removes the clicked application interface from the taskbar T1, moves the application interface currently in the foreground of the first screen A onto the taskbar T1, and displays the clicked application interface in the foreground of the first screen A. Specifically, in one embodiment, when the user clicks the application icon of an application interface on the taskbar T1, the processor 10 responds by switching the application currently displayed on the first screen A to run in the background, and generates and displays an application icon for it on the taskbar T1; meanwhile, the processor 10 switches the application corresponding to the clicked icon from background to foreground operation and displays its application interface on the first screen A in full screen. In one embodiment, the icon of the application newly switched to the background is inserted into the icon queue displayed in the taskbar T1 at the position corresponding to its original order in the multi-application interface D.
In other embodiments, all the application icons in the taskbar T1, together with the icon of the application newly switched to the background, may be re-ordered according to a preset rule and then displayed in the taskbar T1 in that new order. The preset rule may be ordering by the last usage time of the application corresponding to each icon, ordering by the system priority of the application, ordering by a rule set by the user, or the like. For example, in response to the user clicking the application icon corresponding to the second application interface on the taskbar T1, the processor 10 removes that icon from the taskbar T1, displays the first application (currently shown in full screen) in the form of an application icon on the taskbar T1, above the icon of the third application, and displays the second application interface corresponding to the clicked icon in full screen on the first screen A.
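The swap described in this example (clicked icon's app to the foreground, previous foreground app back into the taskbar, original ordering preserved) can be sketched as follows. The function name and list representation are illustrative assumptions, not the patent's terminology.

```python
def switch_foreground(original_order, foreground, clicked):
    """Tap on icon `clicked` in the taskbar: its app goes to the foreground,
    and the previous foreground app returns to the taskbar. Icons keep the
    order their apps had in the multi-application interface D (the embodiment
    where the original ordering is preserved)."""
    assert clicked in original_order and clicked != foreground
    taskbar = [app for app in original_order if app != clicked]
    return clicked, taskbar
```

Replaying the example from the text: with original order first/second/third and the first application in the foreground, tapping the second application's icon yields the second application in the foreground, with the first application's icon above the third's in the taskbar.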
In this way, any application interface in the multi-application interface D can be quickly switched to the foreground of the first screen A through the taskbar T1, achieving fast application switching.
Further, in one embodiment, the processor 10 arranges the plurality of application icons in the taskbar T1 at equal intervals along the length direction of the second screen B. It is understood that the taskbar T1 may occupy the upper half, the lower half, or the whole of the first side curved screen B1. When there are so many icons that the length of the taskbar T1 exceeds the length of the second screen B, some of the icons on the taskbar T1 are displayed on the second screen B and the rest are not; the user can then view the icons not yet shown through a gesture operation such as sliding, and hide some or all of the previously displayed icons as necessary. Here, hiding means that the icon is not displayed on the second screen B.
Referring to fig. 5 and 8 together, in another embodiment, when the processor 10 responds to a click operation by which the user selects any application interface in the multi-application interface D on the first screen A, and further responds to a drag operation dragging the selected application interface to the first side curved screen B1, the processor 10 recognizes the drag operation as an operation of sending the selected application interface to the third screen C, and controls the third screen C to display that application interface (e.g., the second application interface) in the foreground. When the user flips the terminal device 100 so that the third screen C faces upward and views or uses the application interface (e.g., the second application interface) on the third screen C, the processor 10 also controls the taskbar T1 to be displayed on the second side curved screen B3. Of course, after recognizing the drag operation as sending the selected application interface to the third screen C, the processor 10 may not immediately control the third screen C to display it, but instead display it in full screen only once the terminal device 100 is flipped with the third screen C facing upward, so as to save power and prevent accidental touches. Referring to fig. 9, when the user flips the terminal device 100 so that the first screen A faces upward again and views or uses the first screen A, the processor 10 controls the multi-application interface D displayed on the first screen A not to include the application interface (e.g., the second application interface) currently displayed on the third screen C.
Please refer to fig. 5 again, together with fig. 10 and fig. 11; in fig. 10, the first view represents the folded state, the second view the transition state, and the third view the unfolded state. When the touch screen 30 changes from the folded state to the unfolded state, that is, when the expansion angle changes, the processor 10 adjusts the size of the graphical user interface G according to the change in the expansion angle of the touch screen 30.
Specifically, in an embodiment, when the touch screen 30 includes a first screen A and a second screen B, the first screen A and the second screen B together form one continuous flexible display screen. The processor 10 controls the graphical user interface G to extend from the first screen A onto the second screen B as the expansion angle changes. It is understood that, in another embodiment, when the touch screen 30 includes a first screen A, a second screen B and a third screen C, the three screens together form one continuous flexible display screen, and the processor 10 controls the graphical user interface G to extend from the first screen A onto the third screen C as the expansion angle changes.
Specifically, in an embodiment, the processor 10 moves the window boundary E of the display window W according to the change of the expansion angle to adjust the size of the display window W and controls the graphical user interface G to adapt to the size of the display window W according to the movement of the window boundary E.
Further, in an embodiment, the processor 10 controls all application interfaces of the multi-application interface D to dynamically adapt to the size of the display window W.
Specifically, the processor 10 acquires the expansion angle between the first screen A and the third screen C sensed by the angle sensor 40. The processor 10 moves the window boundary line E according to the change in the expansion angle acquired by the angle sensor 40 to adjust the size of the display window W, and controls all application interfaces of the multi-application interface D to adapt to the size of the display window W according to the movement of the window boundary line E. Specifically, the processor 10 can dynamically adjust the number of application interfaces inside the display window W, the width of the occluded portion between adjacent application interfaces, and the distance between adjacent application interfaces according to the size of the display window W. When the display window W becomes smaller, any application interface that can no longer fit moves out of the display window W from the side away from the window boundary line E. When the display window W becomes larger, application interfaces not previously displayed move into the display window W, from the side away from the window boundary line E, to fill the added space. In this way, the number of application interfaces in the display window W adapts to the size of the display window W.
Specifically, in one embodiment, as the expansion angle increases from 0 degrees to 180 degrees, the processor 10 moves the window boundary line E so that the display window W grows with the expansion angle; the number of application interfaces in the display window W increases, the width of the occluded portion between adjacent application interfaces gradually decreases, and the distance between adjacent application interfaces gradually increases. Conversely, as the expansion angle decreases from 180 degrees to 0 degrees, the processor 10 moves the window boundary line E so that the display window W shrinks with the expansion angle; the number of application interfaces in the display window W decreases, the width of the occluded portion between adjacent application interfaces gradually increases, and the distance between adjacent application interfaces gradually decreases.
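One simple way to realize the monotonic relationships above is to interpolate each layout parameter linearly between its folded (0 degree) and unfolded (180 degree) values. This is a sketch under that linearity assumption; the patent does not specify the interpolation, and all parameter names and numbers are illustrative.

```python
def layout_for_angle(angle_deg, folded, unfolded):
    """Interpolate multi-application-interface layout parameters between the
    folded (0 deg) and unfolded (180 deg) configurations: as the angle grows,
    the display window W widens, more interfaces fit, the occluded strip
    between neighbours shrinks, and the gap between neighbours grows.
    `folded`/`unfolded` are dicts with keys window_width, count,
    occluded_width, gap; count is rounded to the nearest whole interface."""
    t = max(0.0, min(angle_deg / 180.0, 1.0))  # clamp to [0, 1]
    lerp = lambda a, b: a + (b - a) * t
    return {
        "window_width": lerp(folded["window_width"], unfolded["window_width"]),
        "count": round(lerp(folded["count"], unfolded["count"])),
        "occluded_width": lerp(folded["occluded_width"], unfolded["occluded_width"]),
        "gap": lerp(folded["gap"], unfolded["gap"]),
    }
```

Halfway open, every parameter sits halfway between its folded and unfolded values, matching the gradual behaviour the embodiment describes.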
Specifically, in one embodiment, when the first screen A faces upward and the expansion angle of the touch screen 30 increases from 0 degrees to 180 degrees, the processor 10 moves the window boundary line E to the left and moves the application interfaces left along with it, so that newly added application interfaces enter from the right side, away from the window boundary line E. When the third screen C faces upward and the expansion angle increases from 0 degrees to 180 degrees, the processor 10 moves the window boundary line E to the right and moves the application interfaces right along with it, so that newly added application interfaces enter from the left side, away from the window boundary line E. When the first screen A faces upward and the expansion angle decreases from 180 degrees to 0 degrees, the processor 10 moves the window boundary line E to the right and moves the application interfaces right along with it, so that application interfaces that no longer fit in the shrinking display window W exit from the right side, away from the window boundary line E. When the third screen C faces upward and the expansion angle decreases from 180 degrees to 0 degrees, the processor 10 moves the window boundary line E to the left and moves the application interfaces left along with it, so that application interfaces that no longer fit in the shrinking display window W exit from the left side, away from the window boundary line E.
Specifically, in one embodiment, as the expansion angle increases from 0 degrees to 180 degrees, i.e., during unfolding, the processor 10 controls the moving speed of the application interfaces to be proportional to the increasing speed of the expansion angle, and controls the decreasing speed of the width of the occluded portion between adjacent application interfaces to be proportional to the increasing speed of the expansion angle. As the expansion angle decreases from 180 degrees to 0 degrees, i.e., during folding, the processor 10 controls the moving speed of the application interfaces to be proportional to the decreasing speed of the expansion angle, and controls the increasing speed of the width of the occluded portion between adjacent application interfaces to be proportional to the decreasing speed of the expansion angle. The moving speed of an application interface refers to the speed at which it moves along with the window boundary line E.
Specifically, in one embodiment, the processor 10 keeps the size of each application interface unchanged while the expansion angle changes.
Specifically, as shown in fig. 11, when the expansion angle of the touch screen 30 is 180 degrees, the processor 10 controls the multi-application interface D to be displayed across the touch screen 30. As shown in fig. 5, when the expansion angle is 0 degrees, the processor 10 controls the multi-application interface D to be displayed on the first screen A. It is understood that, in other embodiments, if the third screen C of the touch screen 30 faces upward and is in use, the processor 10 controls the multi-application interface D displayed at an expansion angle of 0 degrees to appear on the third screen C.
The touch screen 30 can also realize left-right split screen display in the unfolded state, and even can directly realize up-down split screen display on the flexible touch screen 30. The method comprises the following specific steps:
referring to fig. 11, the touch screen 30 displays a graphical user interface G in an expanded state, where the graphical user interface G includes a multi-application interface D, and the multi-application interface D includes at least two application interfaces arranged at intervals in sequence. In this embodiment, the multiple application program interfaces D include a first application program interface, a second application program interface, a third application program interface, a fourth application program interface, and a fifth application program interface, which are arranged in sequence.
Referring to fig. 13, in response to the user's screen splitting operation, the processor 10 divides the touch screen 30 into a first display area W1 and a second display area W2, which are adjacently disposed, and controls different application interfaces to be displayed in the first display area W1 and the second display area W2, respectively.
In this embodiment, the first display area W1 and the second display area W2 are disposed adjacent to each other in the left-right direction. It is understood that, in other embodiments, the first display area W1 and the second display area W2 may be disposed adjacent to each other in the up-down direction.
In one embodiment, the processor 10 divides the touch screen 30 into a first display area W1 and a second display area W2 in response to the operation of the corresponding application interface, controls the operated application interface to be displayed in the first display area W1, and keeps the multi-application interface D displayed in the second display area W2.
In an embodiment, when the processor 10 displays the operated application interface in the first display area W1, it further controls the multi-application interface D to move away from the first display area W1 until the multi-application interface D crosses the boundary between the first display area W1 and the second display area W2 and enters the second display area W2. The processor 10 also responds to an operation on another application interface in the second display area W2 by displaying that application interface in the foreground of the second display area W2.
Specifically, please refer to fig. 12, where the screen-splitting operation includes a long-press operation of long-pressing the corresponding application interface and a drag operation of dragging that application interface into a predetermined range of the split-screen prompt bar. When the processor 10 responds to the long-press operation by displaying the split-screen prompt bar at the top of the graphical user interface G, and continues to respond to the drag operation dragging the application interface into the preset range of the split-screen prompt bar, it controls the touch screen 30 to split into a first display area W1 and a second display area W2, displays the operated application interface in the first display area W1, and keeps the multi-application interface D displayed in the second display area W2, or vice versa. When the application interface is displayed in the first display area W1, the processor 10 further controls the multi-application interface D to move away from the first display area W1 until it crosses the boundary between the two areas and enters the second display area W2. The processor 10 also responds to an operation on another application interface in the second display area W2 by displaying that application interface in the foreground of the second display area W2.
For example, referring to fig. 12, when the user long-presses the second application interface, the split-screen prompt bar appears at the top of the touch screen 30; the processor 10 then responds to the drag operation by which the user drags the second application interface onto the split-screen prompt bar by dividing the touch screen 30 into the adjacently arranged first display area W1 and second display area W2, and controls the second application interface to be displayed in the second display area W2.
It should be noted that the "long-press operation" refers to pressing the same position on the screen for longer than a preset duration, and the "drag operation" refers to the application interface sliding along with the sliding of the finger. The long-press operation and the drag operation are continuous, that is, there is no pause between the two actions and the finger does not need to be lifted.
Alternatively, referring again to fig. 12, for an application interface that supports split-screen, splitting may also be triggered through a split-screen icon K displayed at a corresponding position of that application interface. It will be appreciated that the split-screen icon K may be displayed below the application interface by default, or may be brought up by a predetermined gesture, such as long-pressing the application interface. It will also be appreciated that the split-screen icon K may instead be displayed above the application interface. In this embodiment, the split-screen icon K consists of two small squares spaced apart from each other, which together form a single icon indicating that split-screen display between this application interface and other application interfaces can be achieved through it. It is understood that, in another embodiment, the two small squares of the split-screen icon K are used to let the user choose whether to place the interface in the first display area W1 or the second display area W2; for example, an L or R label beside the split-screen icon K indicates that the corresponding application interface is split to the left or right display area by default. It is understood that in other embodiments the split-screen icon K may have other shapes, which are not limited herein.
Specifically, in one embodiment, the processor 10 controls the split-screen icon K to be displayed below an application interface that supports the split-screen function. In response to a click operation on the split-screen icon K, the processor 10 controls the touch screen 30 to split into the first display area W1 and the second display area W2, displays the application interface corresponding to the split-screen icon K in the first display area W1, and displays the other application interfaces of the multi-application interface D in the second display area W2. When the application interface is displayed in the first display area W1, the processor 10 further controls the multi-application interface D to move away from the first display area W1 until it crosses the boundary between the two areas and enters the second display area W2. The processor 10 also responds to a click on one of the application interfaces in the second display area W2, or on the split-screen icon K below it, by displaying that application interface in the second display area W2.
In this embodiment, the first display area W1 is located on the left side of the touch screen 30, and the second display area W2 is located on the right side of the touch screen 30. It is understood that in other embodiments, the first display area W1 may be disposed on the right side of the touch screen 30 and the second display area W2 may be disposed on the left side of the touch screen 30.
Further, referring again to fig. 13, when the display area of the touch screen 30 is divided into the first display area W1 and the second display area W2, a boundary is formed between them, and the processor 10 controls a split bar S1 to be displayed on that boundary. The processor 10 adjusts the ratio of the display sizes of the first display area W1 and the second display area W2 in response to a drag operation by which the user drags the split bar S1 left or right on the touch screen 30.
Specifically, when the user drags the split-screen bar S1 to the left, the width h1 of the first display area W1 becomes smaller and the width h2 of the second display area W2 becomes larger; that is, the first display area W1 shrinks and the second display area W2 grows. When the user drags the split-screen bar S1 to the right, the width h1 of the first display area W1 becomes larger and the width h2 of the second display area W2 becomes smaller; that is, the first display area W1 grows and the second display area W2 shrinks. More specifically, when the split-screen bar S1 is dragged to the leftmost side of the touch screen 30, i.e., one end of the touch screen 30, the processor 10 adjusts the display size of the second display area W2 to occupy the entire display area of the touch screen 30, and the application program interface previously displayed in the first display area W1 is switched to run in the background. When the split-screen bar S1 is dragged to the rightmost side of the touch screen 30, i.e., the other end of the touch screen 30, the processor 10 adjusts the display size of the first display area W1 to occupy the entire display area of the touch screen 30, and the application program interface previously displayed in the second display area W2 is switched to run in the background. It is to be understood that the same function may be implemented by directly dragging the boundary between the first display area W1 and the second display area W2, without displaying the split-screen bar S1.
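The drag behaviour described above can be sketched as a small state model. This is an illustrative sketch only; the class and method names (`SplitScreen`, `drag_split_bar`) are invented and do not come from the patent.

```python
# Hypothetical model of split-screen bar S1 dragging: resizing W1/W2,
# and snapping to full screen when the bar reaches either edge.
class SplitScreen:
    def __init__(self, total_width, split_x):
        self.total_width = total_width   # full width of touch screen 30
        self.split_x = split_x           # x position of split-screen bar S1
        self.fullscreen = None           # which area occupies the whole screen

    def drag_split_bar(self, new_x):
        """Dragging S1 resizes W1 (left) and W2 (right); dragging it to an
        edge makes the opposite area occupy the entire display area, and
        the hidden area's application switches to the background."""
        self.split_x = max(0, min(new_x, self.total_width))
        if self.split_x == 0:
            self.fullscreen = "W2"       # W1's interface runs in background
        elif self.split_x == self.total_width:
            self.fullscreen = "W1"       # W2's interface runs in background
        else:
            self.fullscreen = None

    @property
    def widths(self):
        """Current widths h1 (of W1) and h2 (of W2)."""
        return self.split_x, self.total_width - self.split_x
```

Dragging the boundary directly, without a visible bar, would use the same arithmetic.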
Further, referring to fig. 14, the processor 10 controls a split-screen toolbar S2 to be displayed in response to a click operation on the split-screen bar S1. Specifically, the split-screen toolbar S2 includes a left-right swap option S21. The processor 10 swaps the application program interfaces displayed in the first display area W1 and the second display area W2 in response to the user's click operation on the left-right swap option S21; that is, when the left-right swap option S21 receives the user's click operation, the processor 10 controls the first application program interface previously displayed in the first display area W1 to be replaced with the second application program interface previously displayed in the second display area W2, and controls the second display area W2 to display the first application program interface previously displayed in the first display area W1.
Further, the split-screen toolbar S2 also includes an exit split-screen option S22. Specifically, in one embodiment, the exit split-screen option S22 is an x-shaped symbol. In response to a click operation on the exit split-screen option S22, the processor 10 controls the split-screen display to be exited, i.e., the display manner in which the first display area W1 and the second display area W2 are shown side by side is ended, and the display returns to the multi-application program interface D shown before the split, for example the multi-application program interface D shown in fig. 11. Of course, it is also possible to return to the home page, or to the multi-application program interface previously displayed in one of the first display area W1 and the second display area W2.
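The two toolbar actions just described, swap and exit, can be sketched as follows. All names are illustrative assumptions; the split-screen state is modelled as a plain dictionary rather than any real windowing API.

```python
# Hypothetical sketch of split-screen toolbar S2 actions.
def swap_areas(state):
    """Left-right swap option S21: exchange the application program
    interfaces shown in the first (W1) and second (W2) display areas."""
    state["W1"], state["W2"] = state["W2"], state["W1"]
    return state

def exit_split(state):
    """Exit split-screen option S22: end the side-by-side display and
    return to the view recorded before the split (one possible target
    named in the description is the multi-application interface D)."""
    return {"fullscreen": state.get("before_split", "multi-app interface D")}
```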
Further, the split-screen toolbar S2 also includes a split-screen application list S23. Referring also to fig. 15, in response to a click operation on the split-screen application list S23, the processor 10 controls an icon list to be displayed on the second display area W2 or the first display area W1, which may be a list of icons of the application programs that support split screen.
Further, in an embodiment, the processor 10 controls the split-screen toolbar S2 to be switched back to the split-screen bar S1 when the icon list is displayed on the second display area W2 or the first display area W1. It will be appreciated that in other embodiments, the processor 10 may control the split-screen toolbar S2 to be switched back to the split-screen bar S1 in response to a click operation on the left-right swap option S21, the exit split-screen option S22, or the split-screen application list S23, and re-awaken the split-screen toolbar S2 when the split-screen bar S1 is clicked again.
It can be understood that, in an embodiment, when the touch screen 30 is in the unfolded state and is divided into the first display area W1 and the second display area W2 arranged side by side, and the touch screen 30 then starts to be folded from the unfolded state into the folded state, the touch screen 30 retains the screen distribution of the unfolded state; that is, the first display area W1 and the second display area W2 end up on the two opposite sides of the terminal device 100.
Further, when the touch screen 30 is in the folded state and is divided into the first display area W1 and the second display area W2 in response to the user's split-screen operation, the processor 10 controls the boundary line between the first display area W1 and the second display area W2 to be located at the bending position of the touch screen 30, that is, on the second screen B.
It is understood that, in other embodiments, the touch screen 30 may implement not only the left-right split screen display in the unfolded state, but also the up-down split screen display in the first display area W1 or the second display area W2. Further, the touch screen 30 may also implement a top-bottom split screen display and/or a left-right split screen display of the first screen a and/or the third screen C in the folded state. The details are as follows:
referring to fig. 16, when the touch screen 30 is in the folded state, the processor 10 responds to a long-press operation in which the user long-presses a corresponding application program interface by controlling a split-screen prompt to be displayed on the first screen A, and responds to a drag operation that drags the selected application program interface to the position of the split-screen prompt, thereby implementing the split-screen display. Specifically, in one embodiment, the split-screen prompt is the text "drag here to split the screen up and down" displayed at the top of the first screen A.
Referring also to fig. 17, when the user drags the selected application program interface to the split-screen prompt in the top area of the first screen A, the processor 10 responds to this drag operation by controlling the first screen A to be divided into a third display area W3 and a fourth display area W4, and by controlling the corresponding application program interfaces to be displayed in one or both of the third display area W3 and the fourth display area W4. The third display area W3 and the fourth display area W4 are arranged one above the other.
Specifically, in this embodiment, the processor 10 controls the selected application program interface, for example the first application program interface, to be displayed in the third display area W3, while controlling the other application program interfaces of the multi-application program interface D to remain displayed in the fourth display area W4. When the application program interface is displayed in the third display area W3, the processor 10 further controls the multi-application program interface D to move away from the third display area W3 until the multi-application program interface D crosses the boundary between the third display area W3 and the fourth display area W4 and enters the fourth display area W4. Further, referring to fig. 18, in response to a click operation on one of the application program interfaces of the multi-application program interface D located in the fourth display area W4, for example the second application program interface, the processor 10 controls the selected application program interface to be displayed in the fourth display area W4 and controls the multi-application program interface D to be hidden.
Optionally, for an application program interface supporting the split screen function, split screen display can be realized through the split screen icon K. Specifically, the processor 10 controls the split screen icon K to be displayed at a corresponding position of the application program interface supporting the split screen function, and the processor 10 also controls the touch screen 30 to split the screen into the third display area W3 and the fourth display area W4 in response to a click operation on the split screen icon K, and controls different application program interfaces to be displayed in the third display area W3 and the fourth display area W4, respectively.
It is understood that, in other embodiments, a split-screen bar is further displayed between the third display area W3 and the fourth display area W4, and the split-screen bar can be used to adjust the ratio of the display size between the third display area W3 and the fourth display area W4, which can be referred to the corresponding embodiments described above.
Specifically, in one embodiment, the split-screen prompt is the text "send to secondary screen" displayed on the second screen B. Referring to fig. 16 again, when the application program interface is dragged to the second screen B, the processor 10 responds to this drag operation by controlling the application program interface to be displayed in the foreground on the third screen C.
Further, in an embodiment, the application program interfaces of the multi-application program interface displayed on the first screen A correspond to applications running in the background.
Further, in an embodiment, the application program interface displayed in the foreground on the third screen C is displayed full screen on the third screen C.
Further, in an embodiment, the processor 10 recognizes the operation of dragging an application program interface on the first screen A beyond the boundary between the first screen A and the second screen B as a split-screen operation, and controls the application program interface to be displayed in the foreground on the third screen C.
Further, in an embodiment, when the first screen A displays a plurality of application program interfaces and the processor 10 responds to a split-screen operation on one of them, the processor 10 controls the remaining application program interfaces to be reordered so as to fill the position of the removed application program interface.
Further, in an embodiment, the second screen B is a flexible screen. When the second screen B is in a bent state, the processor 10 triggers screen splitting in response to an operation of dragging the application program interface displayed on the first screen a to the second screen B; when the second screen B is in the expanded state, the processor 10 does not trigger the split screen for the operation of dragging the application program interface displayed on the first screen a to the second screen B.
Specifically, the processor 10 controls the touch screen 30 to be divided into the second display area W2 displayed on the first screen a and the first display area W1 displayed on the third screen C in response to a drag operation of dragging an application interface to the second screen B, and controls different application interfaces to be displayed in the first display area W1 and the second display area W2, respectively.
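The bend-state gate and the resulting layout described in the two paragraphs above can be sketched together. This is a hedged illustration, not the patent's implementation: the function name and return shape are invented, and only the stated rules are modelled.

```python
# Hypothetical sketch: dragging an application program interface from the
# first screen A onto the flexible second screen B triggers a split only
# while B is bent; a successful split places the second display area W2
# on screen A and the first display area W1 on the third screen C.
def handle_drag_to_b(second_screen_bent):
    if not second_screen_bent:
        return None                   # B expanded: the drag does not split
    return {"A": "W2", "C": "W1"}     # split layout after the drag
```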
Alternatively, the left-right split-screen display may also be realized in the following manner. Specifically, the processor 10 invokes the multi-application program interface D on the first screen A and, in response to the user's click operation on one of its application program interfaces, displays that application program interface in the foreground in the second display area W2 of the first screen A. In an embodiment, the application program interface displayed in the foreground on the first screen A is displayed full screen on the first screen A. The processor 10 further invokes the multi-application program interface D on the third screen C and, in response to the user's click operation on one of its application program interfaces, displays that application program interface in the foreground in the first display area W1 of the third screen C, thereby implementing the split-screen display. In an embodiment, the application program interface displayed in the foreground on the third screen C is displayed full screen on the third screen C.
Referring also to fig. 19, when the mobile terminal is in the folded state, the processor 10 controls a split-screen operation bar S3 to be displayed on the second screen B. The split-screen operation bar S3 includes a split-screen swap option S31. The processor 10 swaps the application program interfaces displayed in the first display area W1 and the second display area W2 in response to the user's click operation on the split-screen swap option S31 in the split-screen operation bar S3.
The split-screen operation bar S3 also includes an exit split-screen option S33. In response to the user's click operation on the exit split-screen option S33 in the split-screen operation bar S3, the processor 10 controls the split-screen display to be exited and returns to the multi-application program interface D shown before the split, the home page, or an arbitrarily set application program interface.
It can be understood that, when the touch screen 30 in the folded state displays application program interfaces on the first screen A and the third screen C respectively, the processor 10 further obtains the data of the first screen A and the third screen C sensed by the gravitational acceleration sensor 50 and/or the direction sensor 60, determines the usage state of the first screen A and the third screen C from this data, and adjusts the display direction of the content on the first screen A and the third screen C accordingly. Specifically, when the first screen A and the third screen C stand on a desk roughly vertically, like an opened book, the processor 10 controls the first screen A and the third screen C to display in portrait orientation. When the first screen A and the third screen C are in the landscape orientation shown in fig. 20, the processor 10 controls the first screen A and the third screen C to display in landscape orientation. In this way, two different users can view different contents on the first screen A and the third screen C, respectively.
Referring to fig. 13 again, when the touch screen 30 is in the split-screen state and is gradually switched from the folded state to the unfolded state, the first screen A rotates relative to the third screen C and is unfolded while the first display area W1 and the second display area W2 remain in the split-screen state. The first display area W1 displayed on the third screen C and the second display area W2 displayed on the first screen A gradually approach each other, their display areas increasing, and each extends toward the second screen B until the first display area W1 and the second display area W2 are connected, with the split-screen bar S1 formed between them.
It should be noted that the extension of the first display region W1 and the second display region W2 is related to the unfolding angle, and the larger the unfolding angle is, the closer the first display region W1 and the second display region W2 are to the central axis of the screen. When the spread angle is 180 degrees, the first display region W1 and the second display region W2 are connected.
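The relation between the unfolding angle and the position of the two areas can be sketched numerically. The patent only states that a larger angle brings W1 and W2 closer to the central axis and that they connect at 180 degrees; the linear interpolation and the function name below are assumptions for illustration.

```python
# Hypothetical sketch: inner edges of W1 (left of the central axis) and
# W2 (right of it), as a function of the unfolding angle. At 180 degrees
# both edges reach the axis and the two display areas are connected.
def area_edges(angle_deg, half_width):
    """Return (inner edge of W1, inner edge of W2), measured from the
    central axis, for an unfolding angle clamped to [0, 180] degrees.
    Linear interpolation is assumed, not stated in the description."""
    t = max(0.0, min(angle_deg, 180.0)) / 180.0
    gap = (1.0 - t) * half_width      # remaining distance to the axis
    return -gap, gap
```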
The processor 10 displays the split toolbar S2 in response to the clicking operation of the split bar S1, the split toolbar S2 including a split toggle option S21, an exit split option S22, and a split application list S23.
The processor 10 may also control the touch screen 30 to implement some shortcut operations to enhance the interaction experience. The details are as follows.
The graphical user interface G includes an application, and the processor 10 controls the second screen B to display a task shortcut bar in response to an operation of the application, and displays an application shortcut in the task shortcut bar.
Specifically, referring also to fig. 21, the task shortcut bar includes a task shortcut switching bar T2, and the application shortcut includes a shortcut of the application that was run last before the operated application, which is displayed in the task shortcut switching bar T2.
Specifically, in an embodiment, when the touch screen 30 is in the folded state, the processor 10 controls the first screen A to display the multi-application program interface D or the currently selected application program interface, controls the second screen B to display the task shortcut switching bar T2, and displays in the task shortcut switching bar T2 a shortcut of the application that was run last before the operated application. It is understood that in one embodiment, the shortcut of an application refers to the application icon of the application, a reduced view of its application program interface, or the like.
Specifically, in one embodiment, the processor 10 controls the execution of the operated application program in response to the operation of the application program, and controls the shortcut of the last executed application program to be displayed in the task shortcut switching field T2.
Specifically, in an embodiment, when the current application is an application used for the first time within a preset time period, the processor 10 controls the task shortcut switching bar T2 to be displayed in a blank state or not displayed.
Further, in an embodiment, the processor 10 controls the shortcut to be removed from the task shortcut switching bar T2 when the application icon in the task shortcut switching bar T2 is not used again within a preset time period.
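The bookkeeping behind the task shortcut switching bar T2, recording when each shortcut was last used and removing those idle beyond a preset period, can be sketched as follows. The class name, field names, and injectable clock are all illustrative assumptions.

```python
import time

# Hypothetical model of the task shortcut switching bar T2: shortcuts not
# used again within a preset time period are removed from the bar.
class ShortcutBar:
    def __init__(self, timeout_s, clock=time.monotonic):
        self.timeout_s = timeout_s
        self.clock = clock
        self.last_used = {}            # app name -> last-use timestamp

    def touch(self, app):
        """Record that the application was just used (adds its shortcut)."""
        self.last_used[app] = self.clock()

    def prune(self):
        """Remove shortcuts that were not used within the preset period."""
        now = self.clock()
        for app, t in list(self.last_used.items()):
            if now - t > self.timeout_s:
                del self.last_used[app]

    @property
    def shortcuts(self):
        """Shortcuts ordered most recently used first."""
        return sorted(self.last_used, key=self.last_used.get, reverse=True)
```

An injectable clock makes the expiry rule easy to verify without waiting in real time.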
Specifically, the processor 10 controls the first screen A to display the currently selected application program interface in the foreground and controls the first side curved screen B1 to display the task shortcut switching bar T2, so that the first screen A can be better assisted in its display and task switching becomes more convenient.
It is understood that, while controlling the first side curved screen B1 to display the task shortcut switching bar T2, the processor 10 may also control the first side curved screen B1 to display the taskbar T1, with the task shortcut switching bar T2 and the taskbar T1 displayed at different positions of the first side curved screen B1. In this way, application task pages can be switched quickly and conveniently.
Optionally, in an embodiment, the task shortcut bar includes a task shortcut operation bar T3, and the application shortcut includes a shortcut of the operated application, which is displayed in the task shortcut operation bar T3.
Specifically, in one embodiment, the processor 10 displays the shortcut of the operated application in the task shortcut operation bar T3 in response to the operation of the application.
Specifically, in one embodiment, the applications of the graphical user interface G are displayed on the first screen A in the form of application program interfaces, and the processor 10 displays the shortcut of the operated application in the task shortcut operation bar T3 in response to an operation on the application program interface of that application.
Specifically, referring to fig. 22, when the multi-application program interface D is displayed in the graphical user interface G, the processor 10 responds to a pull-down operation that pulls an application program interface down by a predetermined length by controlling the graphical user interface G to display, on the application program interface, a virtual key for adding a shortcut; when responding to the user's click operation on the virtual key, it controls a shortcut to the application program interface to be added to the task shortcut operation bar T3. For example, the processor 10 controls the graphical user interface G to display a virtual key reading "add to shortcut bar" above the application program interface in response to the user pulling the application program interface down by a predetermined length, and controls a shortcut to the application program interface to be added to the task shortcut operation bar T3 in response to the user clicking that virtual key. In one embodiment, the processor 10 controls the task shortcut operation bar T3 to be displayed on the second screen B in response to the user pulling the application program interface down by a predetermined length. Specifically, when the first screen A faces upward, the processor 10 controls the task shortcut operation bar T3 to be displayed on the first side curved screen B1; when the third screen C faces upward, the processor 10 controls the task shortcut operation bar T3 to be displayed on the second side curved screen B3. It is understood that, in other embodiments, the processor 10 controls the task shortcut operation bar T3 to be displayed on the first side curved screen B1 when the first screen A displays the multi-application program interface D, and controls the taskbar T1 to be displayed on the first side curved screen B1 instead of the task shortcut operation bar T3 when one of the application program interfaces is displayed in the foreground on the first screen A.
It is understood that the processor 10 controls the task shortcut bar T3 to be displayed on the second side curved screen B3 when controlling the third screen C to display the multi-application interface D, and controls the task bar T1 to be displayed on the second side curved screen B3 instead of the task shortcut bar T3 when one of the application interfaces is displayed in the foreground of the third screen C.
It is understood that, in other embodiments, when the second screen B displays the task shortcut bar T3, the processor 10 may further control adding a shortcut in the task shortcut bar T3 for the application program interface in response to a dragging operation of dragging the application program interface to the second screen B.
Note that the task shortcut operation bar T3 and the taskbar T1 are displayed on the second screen B in a mutually exclusive manner; that is, when the second screen B displays the task shortcut operation bar T3, the taskbar T1 is not displayed, and when the second screen B displays the taskbar T1, the task shortcut operation bar T3 is not displayed. When the second screen B displays the task shortcut operation bar T3, the task shortcut operation bar T3 and the task shortcut switching bar T2 are located at different positions of the second screen B; when the second screen B displays the taskbar T1, the taskbar T1 and the task shortcut switching bar T2 are located at different positions of the second screen B. The taskbar T1 is a bar that, when one application program interface of the multi-application program interface D runs in the foreground, displays the other application program interfaces of the multi-application program interface D. The task shortcut switching bar T2 is a bar that displays a shortcut to the application run last within a predetermined period of time. The task shortcut operation bar T3 is a bar that displays shortcuts to the applications the user has actively added.
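The mutual-exclusion rule for T1 and T3, with T2 shown alongside either, can be sketched in a few lines. This is a hedged model of the rule only; the function name and list representation are invented.

```python
# Hypothetical sketch of which bars the second screen B shows: T3 while
# the multi-application program interface D is displayed, T1 once one
# interface runs in the foreground; T2 is shown alongside either bar
# (at a different position on the second screen B).
def second_screen_bars(multi_app_view):
    bar = "T3" if multi_app_view else "T1"
    return [bar, "T2"]
```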
It is understood that in other embodiments, the task shortcut operation bar T3, the task bar T1, and the task shortcut switching bar T2 are simultaneously displayed on different positions of the second screen B.
It will be appreciated that, in one embodiment, after the processor 10 adds a shortcut to an application program interface in the task shortcut operation bar T3, the display still remains on the multi-application program interface D.
It will be appreciated that, in one embodiment, the processor 10 may delete a selected shortcut in the task shortcut bar T3 in response to a user deleting any shortcut in the task shortcut bar T3.
It will be appreciated that in one embodiment, the processor 10 may simultaneously delete the shortcut of the application program interface in the task shortcut bar T3 in response to a user deleting an application program interface in the multi-application interface D.
Further, the processor 10 controls the application interface to be displayed on the first screen a in a full screen manner in response to a dragging operation of the user dragging the shortcut corresponding to the application in the task shortcut operation bar T3 to the first screen a.
Further, the processor 10 controls the application interface to be displayed on the third screen C in a full screen manner in response to the user dragging the shortcut corresponding to the application program in the task shortcut operation bar T3 to the third screen C.
Further, the processor 10 controls the application program interface to be displayed in full screen on the first screen a or the third screen C facing the user when the user clicks the shortcut corresponding to the application program in the task shortcut operation bar T3.
Therefore, the corresponding application program interface can be opened quickly through the shortcut arranged in the task shortcut operation bar T3, the user does not need to search the application program to be opened in a plurality of application icons and/or application program interfaces, the operation is simpler and more convenient, and convenience is brought to the user.
It is to be understood that, when the corresponding application program interface on the first screen A or the third screen C is long-pressed for the split-screen operation, the processor 10 responds to the long-press operation by controlling the second screen B to no longer display the task shortcut operation bar T3, and instead to display the "send to secondary screen" split-screen prompt on the second screen B.
It is understood that when the first screen a is used, various contents required to be displayed on the second screen B are correspondingly displayed on the first side curved screen B1 closer to the first screen a. When the third screen C is used, various contents required to be displayed on the second screen B are correspondingly displayed on the second side curved screen B3 closer to the third screen C. Therefore, the first side curved screen B1 and the second side curved screen B3 can better assist the content display of the corresponding first screen A and the third screen C, so that the man-machine interaction becomes simpler and more direct, more humanized and meets the requirement of ergonomics.
Please refer to fig. 23, which is a flowchart illustrating a multitask interaction control method of the terminal device 100 according to an embodiment of the present application. The multitask interaction control method is applied to the terminal device 100 described above. It is to be understood that the order of execution of the multitask interactive control method is not limited to the order shown in fig. 23. Specifically, the multitask interaction control method comprises the following steps:
In response to a multitask trigger operation of a user when the second screen B is in the bent state, control the first screen A to display the multi-application program interface D, where the multi-application program interface D includes at least two application program interfaces arranged in sequence (step 2301).
It will be appreciated that the application program interfaces may be arranged left to right, right to left, top to bottom, bottom to top, or diagonally, for example from the top left to the bottom right or from the top right to the bottom left. In this embodiment, the application program interfaces are arranged from left to right: one of the at least two application program interfaces is located at the leftmost side of the multi-application program interface D, and each of the other application program interfaces is located to the right of, and partially covered by, the preceding one.
In an embodiment, the processor 10 controls the first screen A to display the graphical user interface G when the second screen B is in the bent state, and controls the at least two application program interfaces of the multi-application program interface to be displayed on the graphical user interface G in chronological order. Specifically, the processor 10 controls all the application program interfaces of the multi-application program interface D to be arranged on the graphical user interface G in the order of their last running time. It will be appreciated that in other embodiments, the processor 10 may control all the application program interfaces of the multi-application program interface D to be ordered on the graphical user interface G according to a composite ranking of last running time and/or priority.
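The ordering just described can be sketched as a sort over per-application records. The field names (`last_run`, `priority`) are assumptions; the description only specifies recency-first ordering, with priority as an optional secondary key.

```python
# Hypothetical sketch: order the application program interfaces of the
# multi-application program interface D by most recent run time, breaking
# ties with an optional priority value (higher = more important).
def order_interfaces(apps):
    """apps: list of dicts with 'name', 'last_run' (higher = more recent)
    and optional 'priority'. Returns interface names in display order."""
    return [a["name"] for a in
            sorted(apps,
                   key=lambda a: (a["last_run"], a.get("priority", 0)),
                   reverse=True)]
```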
Specifically, in an embodiment, whether the touch screen 30 is in the folded state or the unfolded state, the processor 10 can respond to the user's left or right sliding operation on the touch screen 30 by controlling all the application program interfaces of the multi-application program interface D currently displayed on the first screen A or the third screen C to slide together with the sliding operation. When one of the application program interfaces slides into a preset area at one side edge of the first screen A or the third screen C, it is hidden, and a new application program interface is called out at the other side of the first screen A or the third screen C to fill the space vacated by the hidden interface. Thus, the user can view more application program interfaces through sliding operations on the touch screen 30 in order to select a desired application program interface.
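The sliding behaviour above amounts to moving a fixed-size window over the ordered list of interfaces: one interface is hidden off one edge while the next is revealed at the other. The sketch below is an illustrative model with invented names, not the patent's implementation.

```python
# Hypothetical sketch of the sliding carousel: a window of `count`
# application program interfaces over the ordered list, starting at
# index `start` (clamped so the window never runs off the list).
def visible_window(all_ifaces, start, count):
    start = max(0, min(start, len(all_ifaces) - count))
    return all_ifaces[start:start + count]
```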
In response to one application program interface being selected from the multi-application program interface D, control the selected application program interface to be displayed in the foreground on the first screen A, and control the unselected application program interfaces to be displayed on the second screen B (step 2302).
Specifically, the unselected application program interfaces displayed on the second screen B are located on the curved display surface of the second screen B, while the application program interface displayed in the foreground on the first screen A is located on the flat display surface of the first screen A.
Further, the unselected application program interfaces displayed on the second screen B run in the background; and/or displaying the selected application program interface on the first screen A in a full screen mode.
Specifically, when the processor 10 controls the touch screen 30 to display the multi-application interface D in the folded state, in response to a clicking operation of clicking any one of the application interfaces on the multi-application interface D by a user, the clicked application interface is controlled to be switched to the foreground display on the first screen a, and other application interfaces in the multi-application interface D are sequentially displayed on the taskbar T1 on the second screen B.
In response to the user's click operation on an application program interface on the taskbar T1, the processor 10 cancels the display of the clicked application program interface on the taskbar T1, controls the application program interface currently displayed in the foreground on the first screen A to be displayed on the taskbar T1, and controls the clicked application program interface to be displayed in the foreground on the first screen A.
Specifically, the touch screen 30 further includes a third screen C, the second screen B is located between the first screen a and the third screen C, the second screen B includes a first side curved screen B1 adjacent to the first screen a and a second side curved screen B3 adjacent to the third screen C, and the processor 10 controls the task bar T1 to be displayed on the first side curved screen B1 of the second screen B when the first screen a is controlled by the foreground to display the application interface.
Further, the multitask interaction control method further comprises the following steps:
When one of the application program interfaces in the multi-application interface D located on the first screen A is dragged to the third screen C for foreground display, the unselected application program interfaces are controlled to be displayed on the second screen B (step 2303).
Specifically, when one of the application program interfaces in the multi-application interface D located on the first screen A is dragged to the third screen C for foreground display, the unselected application program interfaces are displayed on the taskbar T1, and the taskbar T1 is displayed on the second side curved screen B3 of the second screen B.
When the user flips the terminal device 100 so that the first screen A faces up again and views or uses the first screen A, the processor 10 controls the multi-application interface D displayed on the first screen A not to include the application program interface currently displayed on the third screen C (step 2304).
Therefore, in response to a multitask trigger operation by the user, the multi-application interface D can be displayed on the graphical user interface G, where the multi-application interface D includes at least two application program interfaces arranged in sequence, and an application program interface earlier in the sequence partially occludes the next one. When the selected application program interface is displayed in the foreground of the first screen A, the second screen B can also display the taskbar T1, so that other application program interfaces can be rapidly switched into view through the taskbar T1. In addition, when an application program interface is displayed on the first screen A, the taskbar T1 is displayed on the first side curved screen B1 closer to the first screen A; when an application program interface is displayed on the third screen C, the taskbar T1 is displayed on the second side curved screen B3 closer to the third screen C, which further improves user convenience.
Please refer to fig. 24, which is a flowchart illustrating a multitask interaction control method of the terminal device 100 according to an embodiment of the present application. The multitask interaction control method is applied to the terminal device 100 described above. The execution order of the multitask interaction control method is not limited to the order shown in fig. 24. Specifically, the processor 10 controls the touch screen 30 to display a graphical user interface G, and the multitask interaction control method includes:
The unfolding angle between the first screen A and the second screen B sensed by the angle sensor 40 is acquired (step 2401).
The size of the graphical user interface G is controlled according to the change of the unfolding angle (step 2402).
Specifically, in an embodiment, when the touch screen 30 includes a first screen A and a second screen B that together form one continuous flexible display screen, the processor 10 controls the graphical user interface G to extend from the first screen A to the second screen B as the unfolding angle changes. When the touch screen 30 includes a first screen A, a second screen B, and a third screen C that together form one continuous flexible display screen, the processor 10 controls the graphical user interface G to extend from the first screen A to the third screen C as the unfolding angle changes.
Specifically, in an embodiment, the processor 10 moves the window boundary line E of the display window W according to the change of the unfolding angle to adjust the size of the display window W, and controls the graphical user interface G to adapt to the size of the display window W according to the movement of the window boundary line E.
Specifically, in one embodiment, the processor 10 controls all application program interfaces of the multi-application interface D to dynamically adapt to the size of the display window W.
Specifically, in one embodiment, the processor 10 acquires the unfolding angle between the first screen A and the third screen C sensed by the angle sensor 40. The processor 10 moves the window boundary line E according to the change of the unfolding angle acquired by the angle sensor 40 to adjust the size of the display window W, and controls all application program interfaces of the multi-application interface D to adapt to the size of the display window W according to the movement of the window boundary line E.
Specifically, in one embodiment, as the unfolding angle increases from 0 degrees to 180 degrees, the processor 10 moves the window boundary line E so that the display window W grows, increases the number of application program interfaces located in the display window W, gradually decreases the width of the occluded portion between adjacent application program interfaces, and gradually increases the distance between adjacent application program interfaces. Conversely, as the unfolding angle decreases from 180 degrees to 0 degrees, the processor 10 moves the window boundary line E so that the display window W shrinks, decreases the number of application program interfaces located in the display window W, gradually increases the width of the occluded portion between adjacent application program interfaces, and gradually decreases the distance between adjacent application program interfaces.
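The monotone relationships above (window width, interface count, occlusion width, and spacing as functions of the unfolding angle) can be sketched with simple linear interpolation. The function name and every numeric constant below are illustrative assumptions; the patent specifies only the directions of change, not concrete values.

```python
def window_metrics(angle_deg, max_width=2000, max_apps=5,
                   max_occlusion=120, max_spacing=300):
    """Map an unfolding angle in [0, 180] degrees to display-window metrics.
    As the angle grows: window W and spacing grow, occlusion shrinks,
    and more application interfaces fit in the window."""
    t = max(0.0, min(angle_deg, 180.0)) / 180.0
    return {
        "window_width": int(max_width * t),
        "visible_apps": 1 + int((max_apps - 1) * t),
        "occlusion_width": int(max_occlusion * (1.0 - t)),
        "spacing": int(max_spacing * t),
    }


closed, opened = window_metrics(0), window_metrics(180)
assert opened["window_width"] > closed["window_width"]
assert opened["visible_apps"] > closed["visible_apps"]
assert opened["occlusion_width"] < closed["occlusion_width"]
```

Any monotone mapping with these directions satisfies the described behavior; the linear form is only the simplest choice.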
Specifically, in an embodiment, when the first screen A faces upward and the unfolding angle of the touch screen 30 increases from 0 degrees to 180 degrees, the processor 10 moves the window boundary line E to the left, moves the application program interfaces to the left along with it, and moves newly added application program interfaces in from the right side away from the window boundary line E. When the third screen C faces upward and the unfolding angle increases from 0 degrees to 180 degrees, the processor 10 moves the window boundary line E to the right, moves the application program interfaces to the right along with it, and moves newly added application program interfaces in from the left side away from the window boundary line E. When the first screen A faces upward and the unfolding angle decreases from 180 degrees to 0 degrees, the processor 10 moves the window boundary line E to the right, moves the application program interfaces to the right along with it, and, because the display window W becomes smaller, moves the application program interfaces that can no longer fit in the display window W out toward the right side away from the window boundary line E. When the third screen C faces upward and the unfolding angle decreases from 180 degrees to 0 degrees, the processor 10 moves the window boundary line E to the left, moves the application program interfaces to the left along with it, and, because the display window W becomes smaller, moves the application program interfaces that can no longer fit in the display window W out toward the left side away from the window boundary line E.
Specifically, in one embodiment, as the unfolding angle increases from 0 degrees to 180 degrees, the processor 10 controls the moving speed of the application program interfaces to be proportional to the increasing speed of the unfolding angle, and controls the decreasing speed of the width of the occluded portion between adjacent application program interfaces to be proportional to the increasing speed of the unfolding angle. As the unfolding angle decreases from 180 degrees to 0 degrees, the processor 10 controls the moving speed of the application program interfaces to be proportional to the decreasing speed of the unfolding angle, and controls the increasing speed of the width of the occluded portion between adjacent application program interfaces to be proportional to the decreasing speed of the unfolding angle.
Specifically, in one embodiment, the processor 10 keeps the size of each application program interface constant while the unfolding angle changes.
Therefore, the unfolding angle between the first screen A and the second screen B sensed by the angle sensor 40 can be acquired, the window boundary line E can be moved according to the change of the unfolding angle to adjust the size of the display window W, and all application program interfaces in the multi-application interface D can be controlled to adapt to the size of the display window W according to the movement of the window boundary line E. In this way, the number of application program interfaces in the display window W, the width of the occluded portion between adjacent application program interfaces, and the distance between adjacent application program interfaces can all be dynamically adjusted according to the unfolding angle, and the speed at which the application program interfaces move along with the window boundary line E adapts smoothly as well.
Please refer to fig. 25, which is a flowchart illustrating a multitask interaction control method of the terminal device 100 according to an embodiment of the present application. The multitask interaction control method is applied to the terminal device 100 described above. The execution order of the multitask interaction control method is not limited to the order shown in fig. 25. Specifically, the multitask interaction control method comprises the following steps:
the touch screen 30 is divided into a first display area W1 and a second display area W2 which are adjacently arranged in response to a screen-splitting operation of a user, and different application program interfaces are controlled to be respectively displayed in the first display area W1 and the second display area W2 (step 2501).
Specifically, in one embodiment, the processor 10 divides the touch screen 30 into the first display area W1 and the second display area W2 in response to an operation on the corresponding application program interface, controls the operated application program interface to be displayed in the first display area W1, and keeps the multi-application interface D displayed in the second display area W2.
In an embodiment, when the processor 10 displays the operated application program interface in the first display area W1, the multi-application interface D is further controlled to move away from the first display area W1 until it crosses the boundary between the first display area W1 and the second display area W2 and enters the second display area W2. The processor 10 also displays another application program interface in the foreground of the second display area W2 in response to an operation on that application program interface.
Specifically, the screen-splitting operation includes a long-press operation on the corresponding application program interface and a drag operation that drags the application program interface into a predetermined range of a split-screen prompt bar. When the processor 10 displays the split-screen prompt bar at the top of the graphical user interface G in response to the long-press operation, and then drags the application program interface into the predetermined range of the split-screen prompt bar in response to the drag operation, it controls the touch screen 30 to split into a first display area W1 and a second display area W2, controls the operated application program interface to be displayed in the first display area W1, and keeps the multi-application interface D displayed in the second display area W2, or vice versa.
Specifically, in one embodiment, the processor 10 controls a split-screen icon K to be displayed below each application program interface that supports the split-screen function. In response to a clicking operation on the split-screen icon K, the processor 10 controls the touch screen 30 to split into the first display area W1 and the second display area W2, displays the application program interface corresponding to the split-screen icon K in the first display area W1, and displays the other application program interfaces of the multi-application interface D in the second display area W2. When that application program interface is displayed in the first display area W1, the processor 10 further controls the multi-application interface D to move away from the first display area W1 until the multi-application interface D crosses the boundary between the first display area W1 and the second display area W2 and enters the second display area W2. The processor 10 also controls one of the application program interfaces located in the second display area W2 to be displayed in the foreground of the second display area W2 in response to a clicking operation on that application program interface or on the split-screen icon K below it.
A split-screen bar S1 is controlled to be displayed on the boundary line between the first display area W1 and the second display area W2 (step 2502).
In response to a drag operation in which the user drags the split-screen bar S1 to the left or right on the touch screen 30, the display size ratio of the first display area W1 to the second display area W2 is adjusted (step 2503).
Specifically, in one embodiment, when the split-screen bar S1 is dragged to the left, the processor 10 makes the first display area W1 smaller and the second display area W2 larger. When the split-screen bar S1 is dragged to the right, the processor 10 makes the first display area W1 larger and the second display area W2 smaller. More specifically, when the split-screen bar S1 is dragged to one end of the touch screen 30, the processor 10 adjusts the second display area W2 to occupy the entire display area of the touch screen 30, and the application program interface previously displayed in the first display area W1 is switched to running in the background. When the split-screen bar S1 is dragged to the other end of the touch screen 30, the processor 10 adjusts the first display area W1 to occupy the entire display area of the touch screen 30, and the application program interface previously displayed in the second display area W2 is switched to running in the background. It is to be understood that the above function may also be implemented by directly dragging the boundary between the first display area W1 and the second display area W2 without displaying the split-screen bar S1.
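A minimal sketch of this drag behavior, including the snap to full screen at either edge. The 0-to-1 coordinate convention, the function name, and the placeholder labels `"W1_app"`/`"W2_app"` are assumptions for illustration, not identifiers from the patent.

```python
def drag_split_bar(position):
    """position: the split bar's horizontal location in [0, 1] across the
    touch screen. Returns the (W1, W2) width ratio; at either edge one
    area takes the whole screen and the other area's app goes to the
    background."""
    position = max(0.0, min(position, 1.0))  # clamp to the screen
    if position == 0.0:
        # Bar dragged fully left: W2 occupies everything, W1's app backgrounds.
        return {"W1": 0.0, "W2": 1.0, "background": "W1_app"}
    if position == 1.0:
        # Bar dragged fully right: W1 occupies everything, W2's app backgrounds.
        return {"W1": 1.0, "W2": 0.0, "background": "W2_app"}
    return {"W1": position, "W2": 1.0 - position, "background": None}


mid = drag_split_bar(0.3)
assert abs(mid["W1"] - 0.3) < 1e-9 and mid["background"] is None
left = drag_split_bar(0.0)
assert left["W2"] == 1.0 and left["background"] == "W1_app"
```

Dragging past the edge is clamped, so an overshooting gesture behaves the same as releasing exactly at the end of the screen.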
The split-screen toolbar S2 is controlled to be displayed in response to a clicking operation on the split-screen bar S1; the split-screen toolbar S2 includes a left-right swap option S21, an exit-split-screen option S22, and a split-screen application list S23 (step 2504).
In response to the user's clicking operation on the left-right swap option S21, the application program interfaces displayed in the first display area W1 and the second display area W2 are swapped; in response to a clicking operation on the exit-split-screen option S22, the split-screen display is exited; and in response to a clicking operation on the split-screen application list S23, an icon list, which may be a list of icons of applications that support split screen, is displayed in the second display area W2 or the first display area W1 (step 2505).
Specifically, in one embodiment, exiting the split-screen display in response to the clicking operation on the exit-split-screen option S22 means exiting the display mode in which the first display area W1 and the second display area W2 are displayed in split screen, and returning to the multi-application interface D shown before the split screen, or returning the interface displayed in one of the first display area W1 and the second display area W2 to the home page.
The split-screen toolbar S2 is switched back to the split-screen bar S1 (step 2506).
Specifically, in one embodiment, the processor 10 switches the split-screen toolbar S2 back to the split-screen bar S1 when the icon list is displayed in the second display area W2 or the first display area W1. It will be appreciated that, in other embodiments, the processor 10 may switch the split-screen toolbar S2 back to the split-screen bar S1 in response to the clicking operation on the left-right swap option S21, the exit-split-screen option S22, or the split-screen application list S23, and re-awaken the split-screen toolbar S2 when the split-screen bar S1 is clicked again.
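The toggling between the split-screen bar S1 and the split-screen toolbar S2, together with the swap and exit options, can be sketched as a small state machine. The class and method names and the app names are illustrative assumptions; the collapse-after-any-option rule follows the alternative embodiment described above.

```python
class SplitScreenUI:
    """Sketch: clicking S1 wakes the toolbar S2; using any S2 option
    collapses S2 back to the thin bar S1."""

    def __init__(self, left_app, right_app):
        self.areas = {"W1": left_app, "W2": right_app}
        self.bar = "S1"  # currently showing the thin split bar

    def click_bar(self):
        self.bar = "S2"  # wake the toolbar

    def click_option(self, option):
        assert self.bar == "S2", "toolbar must be visible first"
        if option == "swap":       # S21: swap the two areas
            self.areas["W1"], self.areas["W2"] = (
                self.areas["W2"], self.areas["W1"])
        elif option == "exit":     # S22: leave split screen entirely
            self.areas = None
        self.bar = "S1"            # collapse back to the bar


ui = SplitScreenUI("mail", "notes")
ui.click_bar()
ui.click_option("swap")
assert ui.areas == {"W1": "notes", "W2": "mail"} and ui.bar == "S1"
```

The assertion inside `click_option` encodes the precondition that S2 options are only reachable after the bar has been clicked.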
Therefore, in response to the user's screen-splitting operation, the touch screen 30 can be divided into a first display area W1 and a second display area W2 arranged adjacently left and right, different application program interfaces can be displayed in the first display area W1 and the second display area W2 respectively, and the split-screen interfaces can be swapped, the split screen exited, or another application switched into split-screen display. Therefore, even after entering the left-right split-screen state, the user can swap the split-screen interfaces, exit the split screen, or switch another application into split-screen display without first exiting and then re-entering the split screen, which brings more convenience to the user.
Referring to fig. 26, a flowchart of a multitask interaction control method of the terminal device 100 in an embodiment of the present application is shown. The multitask interaction control method is applied to the terminal device 100 described above. The execution order of the multitask interaction control method is not limited to the order shown in fig. 26. Specifically, the multitask interaction control method comprises the following steps:
In response to an operation of dragging an application program interface displayed on the first screen A to the second screen B, the application program interface is controlled to be displayed in the foreground on the third screen C (step 2601).
Specifically, in an embodiment, the processor 10 recognizes an operation in which the application program interface on the first screen A is dragged beyond the boundary between the first screen A and the second screen B as a screen-splitting operation, and controls the application program interface to be displayed in the foreground on the third screen C.
Specifically, in an embodiment, the application program interface displayed in the foreground on the third screen C is displayed in full screen on the third screen C.
Specifically, in an embodiment, the applications of the multi-application interface D displayed on the first screen A run in the background.
Specifically, in one embodiment, the first screen A displays a plurality of application program interfaces, and in response to the screen-splitting operation on one of them, the processor 10 controls the remaining application program interfaces to reorder so as to fill the position of the removed application program interface.
Specifically, in an embodiment, the second screen B is a flexible screen. When the second screen B is in the bent state, the processor 10 triggers the split screen in response to an operation of dragging an application program interface displayed on the first screen A to the second screen B; when the second screen B is in the unfolded state, the processor 10 does not trigger the split screen for an operation of dragging an application program interface displayed on the first screen A to the third screen C.
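The condition in this embodiment, where a drag onto the second screen B triggers a split only while the flexible screen B is bent, can be sketched as a single predicate. The function name and the screen labels as string arguments are illustrative assumptions.

```python
def should_trigger_split(second_screen_bent, drag_target):
    """Sketch of the embodiment above: dragging an interface from the
    first screen A triggers a split only when the flexible second
    screen B is bent and the drag lands on B. In the unfolded state,
    a drag toward the third screen C does not trigger a split."""
    return second_screen_bent and drag_target == "B"


assert should_trigger_split(True, "B") is True   # bent state: split triggered
assert should_trigger_split(False, "C") is False  # unfolded: no split
```

Gating the gesture on the fold posture avoids accidental splits when B is merely an interior region of the flat, fully unfolded display.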
A split-screen operation bar S3 is controlled to be displayed on the second screen B, where the split-screen operation bar S3 includes a split-screen swap option S31 and an exit-split-screen option S33 (step 2602).
In response to the user's clicking operation on the split-screen swap option S31 in the split-screen operation bar S3, the application program interfaces displayed in the first display area W1 and the second display area W2 are swapped; and in response to the user's clicking operation on the exit-split-screen option S33 in the split-screen operation bar S3, the split-screen display is exited (step 2603).
Specifically, in one embodiment, the processor 10 swaps the application program interfaces displayed in the first display area W1 and the second display area W2 in response to the user clicking the split-screen swap option S31 in the split-screen operation bar S3; and, in response to the user clicking the exit-split-screen option S33 in the split-screen operation bar S3, exits the split-screen display and returns to the multi-application interface D shown before the split-screen display, the home page, or an arbitrarily set application program interface.
When the touch screen 30 is in the folded state, the processor 10 divides the first screen A into a third display area W3 and a fourth display area W4 arranged adjacently in response to a screen-splitting operation of the user, and controls different application program interfaces to be displayed in the third display area W3 and the fourth display area W4 respectively (step 2604).
Specifically, the third display area W3 and the fourth display area W4 are arranged one above the other.
Specifically, in an embodiment, when the processor 10 responds to a long-press operation on the corresponding application program interface until a split-screen prompt appears on the first screen A, and then responds to a drag operation that drags the application program interface into a predetermined range of the split-screen prompt, the processor 10 controls the first screen A to be split into the third display area W3 and the fourth display area W4, controls the operated application program interface to be displayed in the third display area W3, and keeps the multi-application interface D displayed in the fourth display area W4.
Specifically, in an embodiment, the processor 10 controls a split screen icon K to be displayed at a corresponding position of an application interface supporting split screen, and the processor 10 further controls the first screen a to be split screen into the third display area W3 and the fourth display area W4 in response to a click operation on the split screen icon K, controls an operated application interface to be displayed in the third display area W3, and keeps the multi-application interface D displayed in the fourth display area W4.
Specifically, in an embodiment, when the processor 10 displays the operated application program interface in the third display area W3, it further controls the multi-application interface D to move away from the boundary between the third display area W3 and the fourth display area W4 until the multi-application interface D enters the fourth display area W4; the processor 10 also displays another application program interface in the foreground of the fourth display area W4 in response to an operation on that application program interface in the fourth display area W4.
Therefore, when the touch screen 30 is in the folded state, the method can respond to the user's long press on a selected application program interface by displaying, at the top of the first screen A, a split-screen prompt for splitting up and down, and displaying, on the first side curved screen B1, a split-screen prompt for sending the interface to the auxiliary screen. In response to a drag operation that drags the application program interface to the top area of the first screen A, the first screen A is split up and down into the third display area W3 and the fourth display area W4; in response to a drag operation that drags the application program interface to the first side curved screen B1, the touch screen 30 is divided into the second display area W2 located on the first screen A and the first display area W1 located on the third screen C. Therefore, a left-right split screen or an up-down split screen can be produced in multiple ways in the folded state, which brings more convenience to the user.
Please refer to fig. 27, which is a flowchart illustrating a multitask interaction control method of the terminal device 100 according to an embodiment of the present application. The multitask interaction control method is applied to the terminal device 100 described above. The execution order of the multitask interaction control method is not limited to the order shown in fig. 27. The multitask interaction control method comprises the following steps:
When the touch screen 30 is in the folded state, the processor 10 controls the first screen A to display a graphical user interface G, where the graphical user interface G includes an application program (step 2701);
an operation on the application program is responded to (step 2702).
The second screen B is controlled to display a task shortcut bar, and an application shortcut is displayed in the task shortcut bar (step 2703).
Specifically, the task shortcut bar includes a task shortcut switching bar T2, and the application shortcut includes a shortcut of the last application program executed before the operated application program; this shortcut is displayed in the task shortcut switching bar T2.
Specifically, in an embodiment, the processor 10 controls the touch screen 30 to display the multi-application interface D or the currently selected application program interface on the first screen A in the folded state, controls the second screen B to display the task shortcut switching bar T2, and displays, in the task shortcut switching bar T2, a shortcut of the last application program executed before the operated application program. It is understood that, in one embodiment, the shortcut of an application program refers to its application icon, a reduced view of its application program interface, and the like.
Specifically, in one embodiment, the processor 10 controls the execution of the operated application program in response to the operation of the application program, and controls the shortcut of the last executed application program to be displayed in the task shortcut switching field T2.
Specifically, in an embodiment, when the current application is an application used for the first time within a preset time period, the processor 10 controls the task shortcut switching bar T2 to be displayed in a blank state or not displayed.
Specifically, in an embodiment, the processor 10 controls the shortcut to be removed from the task shortcut switching bar T2 when the application icon in the task shortcut switching bar T2 is not used again within a preset time period.
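The lifetime rules for the task shortcut switching bar T2 described above (show the previously used application; show a blank bar on a first use within the period; evict a shortcut after a preset idle period) can be sketched as follows. The class name, the keyword arguments, and the `IDLE_LIMIT` constant are illustrative assumptions; the patent only specifies "a preset time period".

```python
IDLE_LIMIT = 600  # assumed eviction window, in seconds

class ShortcutSwitchBar:
    """Sketch of T2: remembers the app used immediately before the
    current one and evicts it once it has been idle too long."""

    def __init__(self):
        self.current = None
        self.shortcut = None   # (app, last_used_time) or None

    def open_app(self, app, now):
        if self.current is not None:
            self.shortcut = (self.current, now)  # previous app goes onto T2
        self.current = app

    def visible_shortcut(self, now):
        if self.shortcut is None:
            return None        # first use in the period: T2 blank/not shown
        app, t = self.shortcut
        if now - t > IDLE_LIMIT:
            self.shortcut = None  # evict after the preset idle period
            return None
        return app


bar = ShortcutSwitchBar()
bar.open_app("mail", now=0)
assert bar.visible_shortcut(now=0) is None     # first app: blank bar
bar.open_app("browser", now=10)
assert bar.visible_shortcut(now=20) == "mail"  # previous app shown
assert bar.visible_shortcut(now=1000) is None  # idle too long: removed
```

A real implementation would key eviction off actual reuse of the shortcut rather than wall-clock time alone, but the state transitions are the same.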
Specifically, in an embodiment, the processor 10 controls the first screen A to display the currently selected application program interface in the foreground and controls the first side curved screen B1 to display the task shortcut switching bar T2, so that the display on the first screen A is better assisted and task switching is more convenient.
Optionally, in an embodiment, the task shortcut bar includes a task shortcut operation bar T3, and the application shortcut includes a shortcut of the operated application program, which is displayed in the task shortcut operation bar T3.
Specifically, in one embodiment, the processor 10 displays the shortcut of the operated application in the task shortcut operation bar T3 in response to the operation of the application.
Specifically, in one embodiment, the applications of the gui G are displayed on the first screen a in an application interface manner, and the processor 10 displays the shortcuts of the operated applications in the task shortcut bar T3 in response to the operation of the application interface of the application.
Specifically, when the multi-application interface D is displayed in the graphical user interface G and the processor 10 responds to a pull-down operation that pulls an application program interface down by a predetermined length, the graphical user interface G is controlled to display, on the application program interface, a virtual key for adding a shortcut; and when the user's clicking operation on the virtual key is responded to, a shortcut of the application program interface is controlled to be added to the task shortcut operation bar T3.
It is understood that, in other embodiments, when the second screen B displays the task shortcut operation bar T3, the processor 10 may further add a shortcut of an application program interface to the task shortcut operation bar T3 in response to a drag operation that drags the application program interface to the second screen B.
It will be appreciated that, in one embodiment, after the processor 10 adds a shortcut of an application program interface to the task shortcut operation bar T3, the application program interface still remains in the multi-application interface D.
It will be appreciated that, in one embodiment, the processor 10 may delete a selected shortcut from the task shortcut operation bar T3 in response to the user's deletion operation on any shortcut in the task shortcut operation bar T3.
It will be appreciated that, in one embodiment, the processor 10 may also delete the shortcut of an application program interface from the task shortcut operation bar T3 when the user deletes that application program interface from the multi-application interface D.
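The relationship between the multi-application interface D and the task shortcut operation bar T3 described in the last few paragraphs (adding a shortcut leaves the interface in D; deleting a shortcut leaves D untouched; deleting an interface from D also removes its T3 shortcut) can be sketched as follows. The class and method names and the app names are illustrative assumptions.

```python
class TaskShortcutModel:
    """Sketch: the multi-application interface D and the task shortcut
    operation bar T3 share app identities. Deleting an interface from D
    also removes its shortcut from T3, but adding a shortcut to T3
    keeps the interface in D."""

    def __init__(self, apps):
        self.multi_app = list(apps)  # interfaces in D
        self.t3 = []                 # shortcuts in T3

    def add_shortcut(self, app):
        if app in self.multi_app and app not in self.t3:
            self.t3.append(app)      # the app remains in D as well

    def delete_shortcut(self, app):
        if app in self.t3:
            self.t3.remove(app)      # D is untouched

    def delete_interface(self, app):
        self.multi_app.remove(app)
        if app in self.t3:           # keep T3 consistent with D
            self.t3.remove(app)


m = TaskShortcutModel(["mail", "maps", "music"])
m.add_shortcut("maps")
assert "maps" in m.multi_app and "maps" in m.t3
m.delete_interface("maps")
assert "maps" not in m.t3 and "maps" not in m.multi_app
```

The one-way dependency (D deletions propagate to T3, T3 deletions do not propagate to D) is the consistency rule these embodiments describe.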
Further, the multitask control method further comprises the following step: according to the user's touch operation on a shortcut of an application program interface in the task shortcut operation bar T3, the corresponding first screen A or third screen C is controlled to display the application program interface corresponding to the shortcut in full screen.
Specifically, when the processor 10 responds to a drag operation in which the user drags the shortcut corresponding to an application program in the task shortcut operation bar T3 to the first screen A, the application program interface is controlled to be displayed in full screen on the first screen A; or
when the processor 10 responds to a drag operation in which the user drags the shortcut corresponding to an application program in the task shortcut operation bar T3 to the third screen C, the application program interface is controlled to be displayed in full screen on the third screen C; or
when the processor 10 responds to the user's clicking operation on the shortcut corresponding to an application program in the task shortcut operation bar T3, the application program interface is controlled to be displayed in full screen on whichever of the first screen A and the third screen C faces the user.
Specifically, when the first screen A is controlled to display the currently selected application program interface in full screen, the first side curved screen B1 is controlled to display the task shortcut switching bar T2, so that the display on the first screen A is better assisted and task switching is more convenient.
It is understood that, while the first side curved screen B1 is controlled to display the task shortcut switching bar T2, the first side curved screen B1 may also be controlled to display the taskbar T1, with the task shortcut switching bar T2 and the taskbar T1 displayed at different positions of the first side curved screen B1. Therefore, tasks can be switched quickly and conveniently.
In this way, the user can conveniently switch between the task shortcut switching bar T2 and/or the task shortcut operation bar T3 displayed on the second screen B of the touch screen 30, which brings further convenience to the user.
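The shortcut handling just described (drag a shortcut from bar T3 onto screen A or screen C for a targeted full-screen launch; tap it to launch on the screen facing the user) can be sketched as follows. This is a minimal illustration only; the function `handle_shortcut_event`, the event fields, and the string screen labels are hypothetical and not part of the disclosure:

```python
# Hypothetical sketch of the shortcut dispatch described for bar T3.
# Screens: "A" (first screen) and "C" (third screen); an event carries a
# type ("drag" or "click") and, for drags, the screen it was dropped on.

def handle_shortcut_event(event, facing_screen):
    """Return the screen on which the app interface is shown full screen.

    event: dict with key "type" ("drag" or "click") and, for drags,
           key "target" ("A" or "C").
    facing_screen: the screen currently facing the user ("A" or "C").
    """
    if event["type"] == "drag" and event.get("target") in ("A", "C"):
        # Dragging the shortcut onto screen A or C launches it there.
        return event["target"]
    if event["type"] == "click":
        # A plain tap launches the app on the screen facing the user.
        return facing_screen
    raise ValueError("unsupported shortcut operation")
```

For example, dragging a shortcut to screen C while facing screen A returns `"C"`, while a tap returns whichever screen faces the user.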
It should be noted that, for simplicity of description, the above method embodiments are described as a series of action combinations, but those skilled in the art should understand that the present application is not limited by the order of actions described, as some steps may be performed in other orders or simultaneously according to the present application. Further, those skilled in the art should also appreciate that the embodiments described in the specification are preferred embodiments, and that the actions and modules involved are not necessarily required by the present application.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to the related descriptions of other embodiments.
The steps in the method of the embodiment of the application can be sequentially adjusted, combined and deleted according to actual needs.
It will be understood by those skilled in the art that all or part of the processes of the methods of the embodiments described above can be implemented by a computer program instructing relevant hardware; the program can be stored in a computer-readable storage medium and, when executed, can include the processes of the embodiments of the methods described above. The storage medium may be a magnetic disk, an optical disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), or the like.
The foregoing detailed description of the embodiments of the present application has been presented to illustrate the principles and embodiments of the present application, and the above description of the embodiments is only provided to help understand the method and the core concept of the present application; meanwhile, for a person skilled in the art, according to the idea of the present application, there may be variations in the specific embodiments and application scope, and in summary, the content of the present specification should not be construed as a limitation to the present application.
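The bend-gated split-screen trigger recited in the claims that follow (dragging an application interface past the boundary between the first screen and the second screen triggers split screen only while the flexible second screen is bent) can be sketched as follows. A minimal illustration under assumed names; `should_split_screen` and its coordinate parameters are hypothetical:

```python
# Hypothetical sketch of the bend-gated split-screen trigger: dragging an
# application interface past the A/B boundary starts split screen only
# when the flexible second screen B is in the bent state.

def should_split_screen(drag_end_x, boundary_x, second_screen_bent):
    """Return True when the drag should trigger split-screen display.

    drag_end_x: final x coordinate of the dragged application interface.
    boundary_x: x coordinate of the boundary between screen A and screen B.
    second_screen_bent: True when screen B is bent, False when unfolded.
    """
    crossed_boundary = drag_end_x > boundary_x
    # In the unfolded state the same drag does not trigger split screen.
    return crossed_boundary and second_screen_bent
```

A drag that ends short of the boundary, or any drag performed while the second screen is unfolded, leaves the display unchanged under this sketch.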

Claims (32)

  1. A terminal device is characterized by comprising a processor and a touch screen, wherein the touch screen at least comprises a first screen, a third screen and a second screen positioned between the first screen and the third screen, the second screen is a flexible touch screen, the processor controls the first screen to display a graphical user interface and displays a multi-application interface on the graphical user interface, the multi-application interface comprises at least two application interfaces which are sequentially arranged, and the processor:
    responding to an operation of dragging the application program interface displayed on the first screen to the second screen, and controlling the application program interface to be displayed in the foreground on the third screen.
  2. The terminal device according to claim 1, wherein the processor recognizes an operation when the application interface of the first screen is dragged beyond a boundary between the first screen and the second screen as a split screen operation, and controls the application interface to be displayed in the foreground on the third screen.
  3. The terminal device according to any one of claims 1 to 2, wherein the application program interface displayed in the foreground on the third screen is displayed in full screen on the third screen.
  4. The terminal device of any of claims 1-2, wherein the multi-application interface displayed on the first screen comprises applications running in the background.
  5. The terminal device of claim 1, wherein the first screen displays a plurality of application interfaces, and the processor responds to a split-screen operation on one of the application interfaces and controls the remaining application interfaces to reorder so as to fill the position of the removed application interface.
  6. The terminal device according to claim 1, wherein, when the second screen is in a bent state, the processor triggers split screen in response to an operation of dragging the application program interface displayed on the first screen to the second screen; when the second screen is in an unfolded state, an operation of dragging the application program interface displayed on the first screen to the third screen does not trigger split screen.
  7. The terminal device according to claim 1, wherein the processor controls the touch screen to be divided into a second display area on the first screen and a first display area on the third screen in response to an operation of dragging the application interface displayed on the first screen to the second screen, and controls different application interfaces to be displayed in the first display area and the second display area, respectively.
  8. The terminal device according to claim 7, wherein the processor controls the second screen to display a split screen operation bar, the split screen operation bar comprising a split screen swap option and/or an exit split screen option; the processor responds to a click operation by the user on the split screen swap option in the split screen operation bar and controls the application program interfaces displayed in the first display area and the second display area to be swapped; and/or the processor responds to a click operation by the user on the exit split screen option in the split screen operation bar and controls the split-screen display to be exited and a return to the multi-application program interface displayed before the split-screen display, the home page, or an arbitrarily set application program interface.
  9. The terminal device according to claim 7, wherein, after the first screen is rotated and unfolded relative to the third screen, the first display area and the second display area respectively continue the split-screen state of the folded state and each extends toward the second screen until the two are connected, and a split screen bar is further arranged between the first display area and the second display area; and the processor responds to a click operation on the split screen bar to display a split screen toolbar, wherein the split screen toolbar comprises at least one of a split screen swap option, an option to exit the left-right split screen, and a split-screen application list.
  10. The terminal device according to claim 1, wherein, when the second screen is in a bent state, the processor divides the first screen into a third display area and a fourth display area which are adjacently arranged in response to a split-screen operation by the user, and controls different application program interfaces to be respectively displayed in the third display area and the fourth display area, the third display area and the fourth display area being arranged one above the other.
  11. The terminal device according to claim 10, wherein, in response to a long-press operation on an application program interface that continues until a split-screen prompt appears on the first screen, followed by a drag operation dragging the application program interface into a predetermined range of the split-screen prompt, the processor controls the first screen to be split into the third display area and the fourth display area, controls the operated application program interface to be displayed in the third display area, and keeps the multi-application program interface displayed in the fourth display area.
  12. The terminal device according to claim 10, wherein the processor controls a split-screen icon to be displayed at a corresponding position of an application interface that supports split screen, and, in response to a click operation on the split-screen icon, further controls the first screen to be split into the third display area and the fourth display area, controls the operated application interface to be displayed in the third display area, and keeps the multi-application interface displayed in the fourth display area.
  13. The terminal device according to claim 11 or 12, wherein the processor, when displaying the operated application interface in the third display area, further controls the multi-application interface to move in a direction away from the boundary between the third display area and the fourth display area until the multi-application interface enters the fourth display area, and further responds to an operation on another application interface in the fourth display area to display that application interface in the foreground in the fourth display area.
  14. A graphical user interface is applied to a terminal device with a touch screen, and is characterized in that the touch screen at least comprises a first screen, a third screen and a second screen positioned between the first screen and the third screen, the second screen is a flexible touch screen, the first screen displays a graphical user interface, and displays a multi-application interface on the graphical user interface, the multi-application interface comprises at least two application interfaces arranged in sequence, and the graphical user interface is controlled by a processor of the terminal device and is realized as follows:
    responding to an operation of dragging the application program interface displayed on the first screen to the second screen, and controlling the application program interface to be displayed in the foreground on the third screen.
  15. A graphical user interface as recited in claim 14, wherein said graphical user interface is controlled by a processor of said terminal device and implements:
    identifying an operation in which the application program interface of the first screen is dragged beyond the boundary between the first screen and the second screen as a split-screen operation, and controlling the application program interface to be displayed in the foreground on the third screen.
  16. A graphical user interface according to claim 14, wherein the first screen displays a plurality of application interfaces, the graphical user interface being controlled by the processor of the terminal device and implementing:
    responding to a split-screen operation on one of the application interfaces, and controlling the remaining application interfaces to reorder so as to fill the position of the removed application interface.
  17. A graphical user interface as claimed in any one of claims 14 to 16, wherein the graphical user interface is controlled by a processor of the terminal device and effects:
    when the second screen is in a bent state, responding to the operation of dragging the application program interface displayed by the first screen to the second screen to trigger split screen; and
    and when the second screen is in an expanded state, the operation of dragging the application program interface displayed by the first screen to the third screen does not trigger screen splitting.
  18. A graphical user interface as recited in claim 14, wherein said graphical user interface is controlled by a processor of said terminal device and implements:
    dividing, when the second screen is in a bent state, the first screen into a third display area and a fourth display area which are adjacently arranged in response to a split-screen operation by the user, and controlling different application program interfaces to be respectively displayed in the third display area and the fourth display area, the third display area and the fourth display area being arranged one above the other.
  19. A graphical user interface as recited in claim 18, wherein said graphical user interface is controlled by a processor of said terminal device and implements:
    responding to a long-press operation on an application program interface until a split-screen prompt appears on the first screen;
    when a drag operation dragging the application program interface into a predetermined range of the split-screen prompt continues to be responded to, controlling the first screen to be split into the third display area and the fourth display area; and
    controlling the operated application program interface to be displayed in the third display area, and keeping the multi-application program interface displayed in the fourth display area.
  20. A graphical user interface as recited in claim 18, wherein said graphical user interface is controlled by a processor of said terminal device and implements:
    controlling a split-screen icon to be displayed at a corresponding position of an application program interface that supports split screen;
    responding to a click operation on the split-screen icon to control the first screen to be split into the third display area and the fourth display area; and
    controlling the operated application program interface to be displayed in the third display area, and keeping the multi-application program interface displayed in the fourth display area.
  21. A graphical user interface as claimed in claim 19 or 20, wherein the graphical user interface is controlled by a processor of the terminal device and implements:
    when the operated application program interface is displayed in the third display area, controlling the multi-application program interface to move in a direction away from the boundary line between the third display area and the fourth display area until the multi-application program interface enters the fourth display area; and
    responding to an operation on another application program interface in the fourth display area to display that application program interface in the foreground in the fourth display area.
  22. A multitask interaction control method applied to a terminal device with a touch screen, wherein the touch screen at least comprises a first screen, a third screen, and a second screen located between the first screen and the third screen, the second screen being a flexible touch screen; the first screen displays a graphical user interface and displays a multi-application program interface on the graphical user interface, the multi-application program interface comprising at least two application program interfaces which are sequentially arranged; the multitask interaction control method comprises:
    responding to an operation of dragging an application program interface displayed on the first screen to the second screen, and controlling the application program interface to be displayed in the foreground on the third screen.
  23. The multitask interaction control method according to claim 22, wherein the step of controlling the application program interface to be displayed in the foreground on the third screen in response to the operation of dragging the application program interface displayed on the first screen to the second screen comprises:
    identifying an operation in which the application program interface of the first screen is dragged beyond the boundary between the first screen and the second screen as a split-screen operation, and controlling the application program interface to be displayed in the foreground on the third screen.
  24. The multitask interaction control method according to claim 22, wherein the first screen displays a plurality of application program interfaces, and the method further comprises:
    responding to a split-screen operation on one of the application program interfaces, and controlling the remaining application program interfaces to reorder so as to fill the position of the removed application program interface.
  25. The multitask interaction control method according to claim 22, wherein said multitask interaction control method further comprises:
    when the second screen is in a bent state, responding to the operation of dragging the application program interface displayed by the first screen to the second screen to trigger split screen; and
    and when the second screen is in an expanded state, the operation of dragging the application program interface displayed by the first screen to the third screen does not trigger screen splitting.
  26. The multitask interaction control method according to claim 22, wherein the multitask interaction control method further comprises:
    responding to the operation of dragging the application program interface displayed on the first screen to the second screen, controlling the touch screen to be divided into a second display area located on the first screen and a first display area located on the third screen, and controlling different application program interfaces to be respectively displayed in the first display area and the second display area.
  27. The multitask interaction control method according to claim 26, wherein the multitask interaction control method further comprises:
    controlling a split screen operation bar to be displayed on the second screen, wherein the split screen operation bar comprises a split screen swap option and/or an exit split screen option; responding to a click operation by the user on the split screen swap option in the split screen operation bar, and controlling the application program interfaces displayed in the first display area and the second display area to be swapped; or
    responding to a click operation by the user on the exit split screen option in the split screen operation bar, and controlling the split-screen display to be exited and a return to the multi-application program interface displayed before the split-screen display, the home page, or an arbitrarily set application program interface.
  28. The multitask interaction control method according to claim 26, wherein the multitask interaction control method further comprises:
    after the first screen is rotated and unfolded relative to the third screen, the first display area and the second display area respectively continue the split-screen state of the folded state, and each extends toward the second screen until the first display area and the second display area are connected.
  29. The multitask interaction control method according to claim 22, wherein the multitask interaction control method further comprises:
    dividing, when the second screen is in a bent state, the first screen into a third display area and a fourth display area which are adjacently arranged in response to a split-screen operation by the user; and
    controlling different application program interfaces to be respectively displayed in the third display area and the fourth display area, the third display area and the fourth display area being arranged one above the other.
  30. The multitask interaction control method according to claim 29, wherein the multitask interaction control method further comprises:
    responding to a long-press operation on an application program interface until a split-screen prompt appears on the first screen;
    when a drag operation dragging the application program interface into a predetermined range of the split-screen prompt continues to be responded to, controlling the first screen to be split into the third display area and the fourth display area; and
    controlling the operated application program interface to be displayed in the third display area, and keeping the multi-application program interface displayed in the fourth display area.
  31. The multitask interaction control method according to claim 29, wherein the multitask interaction control method further comprises:
    controlling a split-screen icon to be displayed at a corresponding position of an application program interface that supports split screen;
    responding to a click operation on the split-screen icon to control the first screen to be split into the third display area and the fourth display area; and
    controlling the operated application program interface to be displayed in the third display area, and keeping the multi-application program interface displayed in the fourth display area.
  32. The multitask interaction control method according to claim 30 or 31, wherein the multitask interaction control method further comprises:
    when the operated application program interface is displayed in the third display area, further controlling the multi-application program interface to move in a direction away from the boundary line between the third display area and the fourth display area until the multi-application program interface enters the fourth display area; and
    responding to an operation on another application program interface in the fourth display area to display that application program interface in the foreground in the fourth display area.
CN201880096040.7A 2018-10-30 2018-10-30 Terminal equipment and graphical user interface and multitask interaction control method thereof Pending CN112703472A (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2018/112788 WO2020087304A1 (en) 2018-10-30 2018-10-30 Terminal device and graphical user interface thereof, and multi-task interactive control method

Publications (1)

Publication Number Publication Date
CN112703472A true CN112703472A (en) 2021-04-23

Family

ID=70463530

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201880096040.7A Pending CN112703472A (en) 2018-10-30 2018-10-30 Terminal equipment and graphical user interface and multitask interaction control method thereof

Country Status (2)

Country Link
CN (1) CN112703472A (en)
WO (1) WO2020087304A1 (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20220010995A (en) * 2020-07-20 2022-01-27 삼성전자주식회사 Electronic device for displaying an execution screen of an application and method for operating thereof
CN114415894A (en) * 2022-01-25 2022-04-29 京东方科技集团股份有限公司 Terminal split screen processing method, device, equipment and medium

Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103336651A (en) * 2013-06-18 2013-10-02 深圳市金立通信设备有限公司 Method for realizing multi-task function interface and terminal
US20140053097A1 (en) * 2012-08-17 2014-02-20 Pantech Co., Ltd. Method for providing user interface having multi-tasking function, mobile communication device, and computer readable recording medium for providing the same
CN106020682A (en) * 2016-05-05 2016-10-12 北京小米移动软件有限公司 Multi-task management method and device
CN106537319A (en) * 2016-10-31 2017-03-22 北京小米移动软件有限公司 Screen-splitting display method and device
KR20170079549A (en) * 2015-12-30 2017-07-10 엘지전자 주식회사 Mobile terminal and method for controlling the same
CN107678724A (en) * 2017-10-19 2018-02-09 广东欧珀移动通信有限公司 A kind of method for information display, device, mobile terminal and storage medium
CN107704177A (en) * 2017-11-07 2018-02-16 广东欧珀移动通信有限公司 interface display method, device and terminal
CN107728885A (en) * 2017-10-24 2018-02-23 广东欧珀移动通信有限公司 Control method, device, mobile terminal and the storage medium of multitask
CN108196618A (en) * 2017-11-30 2018-06-22 努比亚技术有限公司 A kind of terminal split screen method, terminal and computer readable storage medium
CN108255378A (en) * 2018-02-09 2018-07-06 维沃移动通信有限公司 A kind of display control method and mobile terminal
CN108345425A (en) * 2018-02-09 2018-07-31 维沃移动通信有限公司 A kind of management method and mobile terminal of application
CN108614677A (en) * 2018-04-28 2018-10-02 努比亚技术有限公司 Method for information display, mobile terminal and computer readable storage medium
CN108664185A (en) * 2018-04-28 2018-10-16 努比亚技术有限公司 Picture display process, mobile terminal and computer readable storage medium

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8413040B2 (en) * 2009-02-13 2013-04-02 Microsoft Corporation Creating and inserting links by drag and drop

Also Published As

Publication number Publication date
WO2020087304A1 (en) 2020-05-07

Similar Documents

Publication Publication Date Title
US11853523B2 (en) Display device and method of indicating an active region in a multi-window display
US11048404B2 (en) Information processing apparatus, information processing method, and program
KR102133410B1 (en) Operating Method of Multi-Tasking and Electronic Device supporting the same
KR101527827B1 (en) Split-screen display method and apparatus, and electronic device thereof
KR101608183B1 (en) Arranging display areas utilizing enhanced window states
US9977523B2 (en) Apparatus and method for displaying information in a portable terminal device
US8860672B2 (en) User interface with z-axis interaction
KR101720849B1 (en) Touch screen hover input handling
KR101387270B1 (en) Mobile terminal for displaying menu information accordig to trace of touch signal
US9213477B2 (en) Apparatus and method for touch screen user interface for handheld electric devices part II
KR100783552B1 (en) Input control method and device for mobile phone
EP2613244A2 (en) Apparatus and method for displaying screen on portable device having flexible display
EP2889740A1 (en) Method, apparatus and computer program product for zooming and operating screen frame
CN107704157A (en) A kind of multi-screen interface operation method, device and storage medium
CN103455245A (en) Method and device for regulating area of widget
CN112703472A (en) Terminal equipment and graphical user interface and multitask interaction control method thereof
CN112689821A (en) Terminal equipment and graphical user interface and multitask interaction control method thereof
CN112703476A (en) Terminal equipment and graphical user interface and multitask interaction control method thereof
CN112689809A (en) Terminal equipment and graphical user interface and multitask interaction control method thereof
CN112703477A (en) Terminal equipment and graphical user interface and multitask interaction control method thereof
CN114879872A (en) Display method, display device, electronic equipment and storage medium
KR20150098366A (en) Control method of virtual touchpadand terminal performing the same
CN112764623B (en) Content editing method and device
KR101692848B1 (en) Control method of virtual touchpad using hovering and terminal performing the same
CN116301506A (en) Content display method, device, terminal and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20210423