CN111026303A - Interface display method and terminal equipment - Google Patents

Interface display method and terminal equipment Download PDF

Info

Publication number
CN111026303A
Authority
CN
China
Prior art keywords
area
icon
input
terminal device
sub
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201911193970.7A
Other languages
Chinese (zh)
Inventor
郑世卓
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd filed Critical Vivo Mobile Communication Co Ltd
Priority to CN201911193970.7A priority Critical patent/CN111026303A/en
Publication of CN111026303A publication Critical patent/CN111026303A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/04817 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance using icons
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The embodiment of the invention provides an interface display method and a terminal device, applied to the technical field of communications, to solve the problem that certain edge areas of a touch screen cannot be reached when a user operates the touch screen of the terminal device with one hand. The method comprises the following steps: receiving a first input to a first area of a target interface under the condition that M icons are displayed on the target interface, wherein the target interface comprises N areas, and the first area is one of the N areas; and in response to the first input, displaying an interface of a first application program corresponding to a first icon in a second area; the second area is the area indicated by the first input, and the first icon is at least one icon displayed in the second area.

Description

Interface display method and terminal equipment
Technical Field
The embodiment of the invention relates to the technical field of communication, in particular to an interface display method and terminal equipment.
Background
With the continuous evolution of terminal devices, the terminal device has become an indispensable tool in people's lives and brings great convenience to many aspects of a user's daily life. To give users a better visual experience, the screens of terminal devices (e.g., smartphones) are also getting larger and larger.
However, while a larger touch screen gives people a wider viewing angle, it also makes the terminal device increasingly difficult to operate with one hand; that is, when a user operates the touch screen of the terminal device with one hand, some edge areas of the touch screen are hard to reach.
If the user nevertheless forces a one-handed touch on those edge areas, the terminal device may slip out of the hand, causing unnecessary loss to the user.
Disclosure of Invention
The embodiment of the invention provides an interface display method and terminal equipment, and aims to solve the problem that certain edge areas of a touch screen cannot be touched when a user touches the touch screen of the terminal equipment with one hand.
In order to solve the technical problem, the present application is implemented as follows:
in a first aspect, an embodiment of the present invention provides an interface display method, where the method includes:
receiving a first input to a first area of a target interface under the condition that M icons are displayed on the target interface, wherein the target interface comprises N areas, and the first area is one of the N areas;
in response to the first input, displaying an interface of a first application program corresponding to a first icon in a second area;
the second area is an area indicated by the first input, and the first icon is at least one icon displayed in the second area.
In a second aspect, an embodiment of the present invention further provides a terminal device, where the terminal device includes:
the terminal device comprises a receiving module and a display module, wherein the receiving module is used for receiving a first input to a first area of a target interface under the condition that M icons are displayed on the target interface, the target interface comprises N areas, and the first area is one of the N areas;
the display module is used for displaying, in response to the first input received by the receiving module, an interface of a first application program corresponding to a first icon in a second area;
the second area is an area indicated by the first input, and the first icon is at least one icon displayed in the second area.
In a third aspect, an embodiment of the present invention provides a terminal device, which includes a processor, a memory, and a computer program stored on the memory and operable on the processor, where the computer program, when executed by the processor, implements the steps of the interface display method according to the first aspect.
In a fourth aspect, an embodiment of the present invention provides a computer-readable storage medium, on which a computer program is stored, where the computer program, when executed by a processor, implements the steps of the interface display method according to the first aspect.
In the embodiment of the present invention, when M icons are displayed on a target interface (including N areas), after receiving a first input to a first area (one of the N areas) of the target interface, the terminal device may display an interface of a first application corresponding to the first icon in a second area indicated by the first input, where the first icon is at least one icon displayed in the second area. Therefore, the user can display the application program interfaces corresponding to the icons in other areas on the target interface by controlling the first area in the target interface, and further realize the control of the whole target interface.
Drawings
Fig. 1 is a schematic diagram of an architecture of a possible android operating system according to an embodiment of the present invention;
fig. 2 is a schematic flow chart of an interface display method according to an embodiment of the present invention;
fig. 3 is one of schematic diagrams of an interface applied by an interface display method according to an embodiment of the present invention;
fig. 4 is a second schematic diagram of an interface applied by the interface display method according to the embodiment of the present invention;
fig. 5 is a third schematic view of an interface applied by an interface display method according to an embodiment of the present invention;
fig. 6 is a fourth schematic view of an interface applied by the interface display method according to the embodiment of the present invention;
fig. 7 is a schematic structural diagram of a terminal device according to an embodiment of the present invention;
fig. 8 is a second schematic structural diagram of a terminal device according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
It should be noted that "/" in this context means "or", for example, A/B may mean A or B; "and/or" herein is merely an association describing an associated object, and means that there may be three relationships, e.g., a and/or B, which may mean: a exists alone, A and B exist simultaneously, and B exists alone.
It should be noted that "a plurality" herein means two or more than two.
It should be noted that, in the embodiments of the present invention, words such as "exemplary" or "for example" are used to indicate examples, illustrations or explanations. Any embodiment or design described as "exemplary" or "for example" in the embodiments of the present invention is not to be construed as preferred over or more advantageous than other embodiments or designs. Rather, use of the words "exemplary" or "for example" is intended to present related concepts in a concrete fashion.
It should be noted that, for the convenience of clearly describing the technical solutions of the embodiments of the present invention, in the embodiments of the present invention, words such as "first" and "second" are used to distinguish the same items or similar items with substantially the same functions or actions, and those skilled in the art can understand that the words such as "first" and "second" do not limit the quantity and execution order. For example, the first input and the second input are for distinguishing different inputs, rather than for describing a particular order of inputs.
The one-handed operation area in the embodiment of the invention specifically refers to: the area of the current interface that the user's holding hand can reach and operate when the user holds and operates the terminal device with one hand.
It should be noted that, in the embodiment of the present invention, when a user holds the terminal device with one hand, the holding position may be the central point (or any point) of the area where the web between the thumb and index finger of the holding hand contacts the side edge of the terminal device. Of course, the holding position may also be the point where that web between the thumb and index finger contacts the terminal device, or the middle of the region the hand touches when holding the terminal device with one hand; this is not specifically limited in the embodiment of the present invention.
In one example, different users may hold the terminal device with one hand in different ways: some users hold it with the left hand and some with the right hand; some prefer to hold it near the top, some in the middle, and some near the bottom. As a result, the one-handed operation areas of different users may differ.
In this way, the terminal device can detect the gesture of holding the terminal device by one hand of the user, and determine the one-hand operation area of the terminal device according to the gesture of holding the terminal device by one hand of the user.
In another example, the user may shake the other side of the terminal device with one side of the terminal device as an axis, thereby determining the one-handed operation region of the terminal device. For example, the user shakes the left side of the terminal device upward with the right side of the terminal device as an axis, and determines the lower right area of the current interface of the terminal device as the one-handed operation area.
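For illustration only, the following Kotlin sketch shows one possible way to map a detected holding gesture to a one-handed operation area. The embodiment does not specify an algorithm, so the inputs (holding hand, grip height) and the region labels are assumptions of this sketch.

```kotlin
// Illustrative sketch only: the embodiment does not prescribe an algorithm,
// so the Hand/GripHeight inputs and region labels are assumptions.
enum class Hand { LEFT, RIGHT }
enum class GripHeight { UPPER, MIDDLE, LOWER }

// Picks the part of the current interface most reachable by the holding thumb.
fun oneHandedRegion(hand: Hand, height: GripHeight): String {
    val horizontal = if (hand == Hand.RIGHT) "right" else "left"
    val vertical = when (height) {
        GripHeight.UPPER -> "upper"
        GripHeight.MIDDLE -> "middle"
        GripHeight.LOWER -> "lower"
    }
    return "$vertical $horizontal area"
}

fun main() {
    // A right hand holding the lower side yields the lower right area,
    // matching the tilt example above.
    println(oneHandedRegion(Hand.RIGHT, GripHeight.LOWER)) // "lower right area"
}
```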
The interface display method provided by the embodiment of the present invention may be executed by the terminal device (including a mobile terminal device and a non-mobile terminal device), or by a functional module and/or functional entity in the terminal device capable of implementing the interface display method; this may be determined according to actual use requirements and is not limited by the embodiment of the present invention. The following takes the terminal device as an example to explain the interface display method provided by the embodiment of the present invention.
The terminal device in the embodiment of the invention can be a mobile terminal device and can also be a non-mobile terminal device. The mobile terminal device may be a mobile phone, a tablet computer, a notebook computer, a palm computer, a vehicle-mounted terminal device, a wearable device, an ultra-mobile personal computer (UMPC), a netbook or a Personal Digital Assistant (PDA), etc.; the non-mobile terminal device may be a Personal Computer (PC), a Television (TV), a teller machine, a self-service machine, or the like; the embodiments of the present invention are not particularly limited.
The terminal device in the embodiment of the present invention may be a terminal device having an operating system. The operating system may be an Android (Android) operating system, an ios operating system, or other possible operating systems, and embodiments of the present invention are not limited in particular.
The following describes a software environment to which the interface display method provided by the embodiment of the present invention is applied, by taking an android operating system as an example.
Fig. 1 is a schematic diagram of an architecture of a possible android operating system according to an embodiment of the present invention. In fig. 1, the architecture of the android operating system includes 4 layers, which are respectively: an application layer, an application framework layer, a system runtime layer, and a kernel layer (specifically, a Linux kernel layer).
The application program layer comprises various application programs (including system application programs and third-party application programs) in an android operating system.
The application framework layer is a framework of the application, and a developer can develop some applications based on the application framework layer under the condition of complying with the development principle of the framework of the application.
The system runtime layer includes libraries (also called system libraries) and android operating system runtime environments. The library mainly provides various resources required by the android operating system. The android operating system running environment is used for providing a software environment for the android operating system.
The kernel layer is an operating system layer of an android operating system and belongs to the bottommost layer of an android operating system software layer. The kernel layer provides kernel system services and hardware-related drivers for the android operating system based on the Linux kernel.
Taking an android operating system as an example, in the embodiment of the present invention, a developer may develop a software program for implementing the interface display method provided in the embodiment of the present invention based on the system architecture of the android operating system shown in fig. 1, so that the interface display method may operate based on the android operating system shown in fig. 1. Namely, the processor or the terminal device can implement the interface display method provided by the embodiment of the invention by running the software program in the android operating system.
The interface display method according to the embodiment of the present invention is described below with reference to a flowchart of the interface display method shown in fig. 2, where fig. 2 is a schematic flowchart of the interface display method according to the embodiment of the present invention, and the method includes steps 201 and 202:
step 201: under the condition that M icons are displayed on a target interface, the terminal equipment receives first input of a first area of the target interface.
In an embodiment of the present invention, the target interface includes N regions, the first region is one of the N regions, N is a positive integer greater than 1, and M is a positive integer greater than or equal to 1.
In an embodiment of the present invention, the first area belongs to a one-handed operation area of a user in the target interface.
In an embodiment of the present invention, the first input specifically includes: a click input for the first area, a press input for the first area, a slide input for the first area, a drag input for the first area, or another feasible input for the first area, which may be determined according to actual usage requirements; the embodiment of the present invention is not limited thereto.
For example, in the embodiment of the present invention, the click input in the present application may be a single-click input, a double-click input, or a click input of any number of taps; the press input may be a long-press input or a short-press input. The slide input in the present application may be a slide in any direction, for example upward, downward, leftward, or rightward, and the slide trajectory may be a straight line or a curve. The drag input in the present application may be a drag in any direction, for example clockwise, counterclockwise, upward, downward, leftward, or rightward, and may be set according to actual requirements. A slide's direction can be derived from its displacement, as in the sketch below.
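The Kotlin sketch below classifies a slide input's direction from its displacement; the 2:1 dominance threshold is an assumption of this sketch, since the embodiment leaves gesture recognition open.

```kotlin
import kotlin.math.abs

// Rough direction classifier for slide inputs; the 2:1 dominance threshold
// is an assumption of this sketch, not a value given by the embodiment.
enum class Direction { UP, DOWN, LEFT, RIGHT, UP_LEFT, UP_RIGHT, DOWN_LEFT, DOWN_RIGHT }

fun classifySlide(dx: Float, dy: Float): Direction {
    val horizontal = if (dx < 0) "LEFT" else "RIGHT"
    val vertical = if (dy < 0) "UP" else "DOWN" // screen y grows downward
    return when {
        abs(dx) > 2 * abs(dy) -> Direction.valueOf(horizontal)
        abs(dy) > 2 * abs(dx) -> Direction.valueOf(vertical)
        else -> Direction.valueOf("${vertical}_$horizontal")
    }
}

fun main() {
    println(classifySlide(dx = -40f, dy = -35f)) // UP_LEFT: toward the upper left
}
```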
Optionally, in the embodiment of the present invention, the terminal device may perform area division on the target interface, and the division may be performed in at least two possible implementation manners.
In a first possible implementation (active partitioning):
Illustratively, before the step 201, the method may further include the following steps 201a and 201b:
Step 201a: the terminal device receives a third input from the user.
Step 201b: in response to the third input, the terminal device divides the target interface into N areas.
Illustratively, the third input may include: the input of the user to the display screen or the input of the user to the target control may be determined specifically according to actual use requirements, and the embodiment of the present invention is not limited. It should be noted that the target control is used to trigger the terminal device to divide the target interface into N regions.
For example, the user input to the target control specifically includes: the embodiment of the present invention is not limited to click input of a user on a target control, or slide input of the user on the target control, or other feasibility inputs of the user on the target control.
It should be noted that, the above-mentioned click input and slide input may refer to the description of the first input in the present application, and are not described herein again.
For example, taking the terminal device as an example to divide the target interface into 4 regions, the target interface of the terminal device displays a "one-handed operation" control, and when the user clicks the control (i.e., the third input), as shown in fig. 3 (a), the terminal device divides the target interface (i.e., 31 in fig. 3 (a)) into 4 regions, namely, a region a (i.e., 32 in fig. 3 (a)), a region B (i.e., 33 in fig. 3 (a)), a region C (i.e., 34 in fig. 3 (a)), and a region D (i.e., 35 in fig. 3 (a)).
For example, the N regions may be preset by default in the terminal device; alternatively, an "area setting" option may be added to the "Settings" application and configured by the user. This may be determined according to actual use requirements and is not limited by the embodiment of the present invention. A minimal sketch of this division step follows.
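The Kotlin sketch below splits the interface rectangle into the four regions of fig. 3 (a). The Rect type and the letter placement (D in the lower right, inferred from the slide-direction examples given later) are assumptions of this sketch, not details fixed by the embodiment.

```kotlin
// Minimal sketch of active partitioning into the 2x2 grid of fig. 3 (a).
// The Rect type and the letter placement (D lower right, per the later
// slide-direction examples) are assumptions of this sketch.
data class Rect(val left: Int, val top: Int, val right: Int, val bottom: Int)

fun divideTargetInterface(screen: Rect): Map<String, Rect> {
    val midX = (screen.left + screen.right) / 2
    val midY = (screen.top + screen.bottom) / 2
    return mapOf(
        "B" to Rect(screen.left, screen.top, midX, midY),    // upper left
        "C" to Rect(midX, screen.top, screen.right, midY),   // upper right
        "A" to Rect(screen.left, midY, midX, screen.bottom), // lower left
        "D" to Rect(midX, midY, screen.right, screen.bottom) // lower right
    )
}
```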
In a second possible implementation (passive partitioning):
Illustratively, before the step 201, the method may further include the following steps 201c and 201d:
Step 201c: the terminal device receives a sliding input of the user on the target interface.
Step 201d: in response to the sliding input, the terminal device divides the target interface by taking the sliding track of the sliding input as a region boundary.
For example, in a case where the terminal device displays the target interface, the user slides twice along the horizontal central axis and the vertical central axis of the display screen, and the terminal device divides the target interface into 4 regions by taking the sliding track of the user as a region boundary, as shown in (a) of fig. 3.
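As a sketch of this "passive partitioning", the Kotlin below assumes the two slide tracks have already been snapped to one vertical and one horizontal straight boundary line; the Rect shape matches the earlier sketch and is likewise an assumption.

```kotlin
// Sketch of passive partitioning: the user's two slide tracks are assumed to
// have been snapped to one vertical (x) and one horizontal (y) boundary line.
data class Rect(val left: Int, val top: Int, val right: Int, val bottom: Int)

fun divideBySlideTracks(screen: Rect, boundaryX: Int, boundaryY: Int): List<Rect> =
    listOf(
        Rect(screen.left, screen.top, boundaryX, boundaryY),    // upper left
        Rect(boundaryX, screen.top, screen.right, boundaryY),   // upper right
        Rect(screen.left, boundaryY, boundaryX, screen.bottom), // lower left
        Rect(boundaryX, boundaryY, screen.right, screen.bottom) // lower right
    )
```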
Further optionally, in the embodiment of the present invention, after the terminal device divides the target interface into N regions, the terminal device may determine, based on a preset first mapping relationship table, another region mapped by any one region. The first mapping relation table is used for representing the mapping relation between any area and other areas.
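One possible encoding of the first mapping relation table is sketched below in Kotlin; the region names and the "every region maps to every other region" policy are assumptions for illustration.

```kotlin
// A possible encoding of the first mapping relation table: each region is
// associated with every other region it can stand in for (names assumed).
val firstMappingTable: Map<String, Set<String>> = mapOf(
    "A" to setOf("B", "C", "D"),
    "B" to setOf("A", "C", "D"),
    "C" to setOf("A", "B", "D"),
    "D" to setOf("A", "B", "C")
)

// Returns the other regions mapped by the given region, per the table.
fun mappedRegions(region: String): Set<String> = firstMappingTable[region] ?: emptySet()
```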
Optionally, in the embodiment of the present invention, after the terminal device divides the target interface into N regions, each of the N regions may be divided into sub regions.
It should be noted that, in the specific process of dividing each of the N regions into sub regions, reference may be made to a process of dividing the target interface into regions by the terminal device, and details are not described here again.
Further optionally, in the embodiment of the present invention, after the terminal device divides each of the N regions into sub regions, the terminal device may determine, based on a preset second mapping relationship table, the sub region in the other region to which the sub region in any one region is mapped. The second mapping relation table is used for representing the mapping relation between the sub-area in any area and the sub-area in other areas.
In one example, the number of sub-regions included in each of the N regions is the same.
Example 1, as shown in (a) of fig. 3, the target interface 31 of the terminal device includes 4 regions: region A32, region B33, region C34, and region D35. Region A32 includes 6 sub-regions, sub-regions A1 to A6; region B33 includes 6 sub-regions, sub-regions B1 to B6; region C34 includes 6 sub-regions, sub-regions C1 to C6; region D35 includes 6 sub-regions, sub-regions D1 to D6. The terminal device may map sub-region A1, sub-region B1, sub-region C1 and sub-region D1 to one another; map sub-region A2, sub-region B2, sub-region C2 and sub-region D2 to one another; and so on, up to mapping sub-region A6, sub-region B6, sub-region C6 and sub-region D6.
In another example, the number of sub-regions included in each of some or all of the N regions is not all the same.
Example 2, as shown in fig. 3 (B), the target interface of the terminal device (i.e., 31 in fig. 3 (B)) includes 4 regions: region A32, region B33, region C34, and region D35. Region A32 includes 8 sub-regions, sub-regions A1 to A8; region B33 includes 4 sub-regions, sub-regions B1 to B4; region C34 includes 4 sub-regions, sub-regions C1 to C4; region D35 includes 8 sub-regions, sub-regions D1 to D8. The terminal device may map sub-region A1, sub-region B1, sub-region C1 and sub-region D1 to one another; map sub-region A2, sub-region B2, sub-region C2 and sub-region D2 to one another; and so on, up to mapping sub-region A4, sub-region B4, sub-region C4 and sub-region D4. Sub-region A5 is mapped to sub-region D5, sub-region A6 to sub-region D6, sub-region A7 to sub-region D7, and sub-region A8 to sub-region D8.
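A possible encoding of the second mapping relation table for example 2 is sketched below: sub-regions are matched by index, and an index beyond a region's sub-region count simply has no peer there (so A5 to A8 map only to D5 to D8). The counts are taken from example 2; the index-matching policy is an assumption of the sketch.

```kotlin
// Sketch of the second mapping relation table for example 2: sub-regions are
// matched by index; an index past a region's count has no peer in that region.
val subRegionCounts = linkedMapOf("A" to 8, "B" to 4, "C" to 4, "D" to 8)

fun mappedSubRegions(region: String, index: Int): List<String> =
    subRegionCounts
        .filterKeys { it != region }              // only the other regions
        .filterValues { count -> index <= count } // peer must exist at this index
        .keys
        .map { other -> "$other$index" }

fun main() {
    println(mappedSubRegions("A", 2)) // [B2, C2, D2]
    println(mappedSubRegions("A", 6)) // [D6]
}
```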
Step 202: and responding to the first input, and displaying an interface of the first application program corresponding to the first icon in the second area by the terminal equipment.
In an embodiment of the present invention, the second area is an area indicated by the first input, the first icon is at least one icon displayed in the second area, and any one icon in the target interface corresponds to at least one application program.
For example, the second region may be at least one region of the N regions except for the first region, or may be all regions of the N regions, which is not limited in the embodiment of the present invention.
For example, the first icon may be an application icon of an application corresponding to the first icon, may be an icon for triggering a shortcut of the application corresponding to the first icon, and may also be another icon capable of uniquely identifying the application corresponding to the first icon, which may be determined specifically according to actual use requirements, and the embodiment of the present invention is not limited.
For example, when there is a single first icon, the terminal device may display the interface of the first application program corresponding to that icon in a full-screen manner; when there are multiple first icons, the terminal device may display the interfaces of the first applications corresponding to the multiple first icons in a stacked, partially overlapping manner, or in a split-screen manner.
Optionally, in the embodiment of the present invention, when the first input is a slide input for the first region, the second region is the region, among the N regions, toward which the sliding direction of the slide input points.
For example, in a case that a first region and a second region of the N regions in the target interface of the terminal device each include at least one sub-region, the first input may be: a slide input to a first sub-area in the first area. The first icon is then: an icon in a second sub-area of the second area, where the second sub-area is the sub-area corresponding to the first sub-area.
Illustratively, if the first input is an upward slide on the first area, the second area is the area above the first area; if it is a rightward slide, the second area is the area to the right of the first area; if it is a leftward slide, the second area is the area to the left of the first area; and if it is a downward slide, the second area is the area below the first area. This may be determined according to actual usage requirements.
For example, as shown in (a) of fig. 3, when the user slides upward (i.e., the first input) on the region D35 (i.e., the first region) in the target interface 31 of the terminal device, this corresponds to the region C34; when the user slides toward the upper left on the region D35, this corresponds to the region B33; and when the user slides leftward on the region D35, this corresponds to the region A32. The sketch below hard-codes these relationships.
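The Kotlin sketch below resolves the second region from the slide direction using the fig. 3 (a) relationships just described; the layout is inferred from those examples, and the enum names are assumptions.

```kotlin
// Sketch of resolving the second region from the slide direction, hard-coding
// the fig. 3 (a) relationships (layout inferred from the examples, not claimed).
enum class Slide { UP, UP_LEFT, LEFT }

fun secondRegion(firstRegion: String, slide: Slide): String? =
    when (firstRegion to slide) {
        "D" to Slide.UP -> "C"      // region above D
        "D" to Slide.UP_LEFT -> "B" // region diagonally above-left of D
        "D" to Slide.LEFT -> "A"    // region left of D
        else -> null                // slide points outside the target interface
    }
```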
Optionally, in this embodiment of the present invention, when any one of the N regions in the target interface of the terminal device includes at least one sub-region, the first input may be: a touch input for a first sub-area in the first area. The first icon is then: an icon in a second sub-area of the second area, where the second sub-area is the sub-area corresponding to the first sub-area.
For example, taking the case where the user holds the terminal device in the right hand and the first icon is the application icon of an application, with reference to (a) in fig. 3 and as shown in fig. 4, the terminal device displays the application icon of "application 1" in the sub-area A1 in the area A32 of the target interface 31; the application icon of "application 2" in the sub-area B1 in the area B33; the application icon of "application 3" in the sub-area C1 in the area C34; and the application icon of "application 4" in the sub-area D1 in the area D35.
Example 1, when the user slides upward on the application icon of "application 4", the interface of "application 3" is displayed on the display screen of the terminal device; when the user slides toward the upper left on the application icon of "application 4", the interface of "application 2" is displayed; and when the user slides leftward on the application icon of "application 4", the interface of "application 1" is displayed.
Example 2, when the user double-clicks the application icon of "application 4", as shown in fig. 5, the interfaces of "application 1" to "application 4" are displayed in a stacked, partially overlapping manner on the display screen (i.e., 41 in fig. 5) of the terminal device. The user may then tap the stacked interface of "application 1", whereupon the terminal device displays the interface of "application 1" in a full-screen manner and the other applications run in the background.
In the interface display method provided by the embodiment of the present invention, after receiving a first input to a first area (one of N areas) of a target interface when a terminal device displays M icons on the target interface (including N areas), the terminal device may display an interface of a first application program corresponding to the first icon in a second area indicated by the first input, where the first icon is at least one icon displayed in the second area. Therefore, the user can display the application program interfaces corresponding to the icons in other areas on the target interface by controlling the first area in the target interface, and further realize the control of the whole target interface.
Optionally, in the embodiment of the present invention, when the second area in the target interface of the terminal device does not include an icon, the terminal device may remind the user in a variety of ways.
Illustratively, after step 201, the method may further include the following step a 1:
step A1: and if the icon is not displayed in the second area, the terminal equipment outputs reminding information.
For example, the reminding information may be vibration information, text information, or voice information, which is not limited in the embodiment of the present invention.
In an example, in a case that any one of the N regions in the target interface of the terminal device includes at least one sub-region, the first input may be: a touch input for a first sub-area in the first area. When the second sub-area in the second area of the target interface does not include an icon, the terminal device vibrates. The second sub-area is the sub-area corresponding to the first sub-area in the second area.
In this way, the terminal device can use the reminder information to tell the user directly that there is no application program interface to display, as in the sketch below.
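A minimal Kotlin sketch of step A1 follows, with the reminder channel abstracted away; the Reminder interface and the vibration stub are assumptions of the sketch, not a platform API.

```kotlin
// Sketch of step A1; the Reminder abstraction and the vibration stub are
// assumptions of this sketch, not a real platform API.
fun interface Reminder { fun remind(message: String) }

val vibrationReminder = Reminder { msg -> println("[vibrate] $msg") }

fun handleFirstInput(secondSubAreaIcon: String?, reminder: Reminder) {
    if (secondSubAreaIcon == null) {
        reminder.remind("No application interface can be displayed for this input")
    }
}
```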
Optionally, in the embodiment of the present invention, the terminal device may also exchange the icons displayed in the first area and the second area, so that the icons of the second area are displayed in the first area.
Illustratively, after step 201, the method may further include the following step B1 to step B3:
step B1: and responding to the first input, and displaying a second icon in the first area by the terminal equipment.
For example, the second icon is an icon displayed in the second area.
For example, the second icon may be an application icon of an application corresponding to the second icon, may be an icon for triggering a shortcut of the application of the second icon, and may also be another icon capable of uniquely identifying the application corresponding to the second icon, which may be determined specifically according to actual use requirements, and the embodiment of the present invention is not limited.
Step B2: and the terminal equipment receives a second input aiming at the first icon.
Illustratively, the first icon is at least one of the second icons.
For example, the second input may specifically include: the embodiment of the present invention is not limited to a click input by a user on the first icon, or a slide input by the user on the first icon, or other feasible inputs by the user on the first icon.
Step B3: and responding to the second input, and displaying an interface of the first application program corresponding to the first icon by the terminal equipment.
Further optionally, in an embodiment of the present invention, the step B1 may specifically include the step B1a:
B1a: the terminal device displays the second icon in the first area and displays a third icon in the second area.
Illustratively, the third icon is an icon displayed in the first area.
For example, the third icon may be an application icon of an application corresponding to the third icon, may be an icon for triggering a shortcut of the application of the third icon, and may also be another icon capable of uniquely identifying the application corresponding to the third icon, which may be determined specifically according to actual use requirements, and the embodiment of the present invention is not limited.
For example, referring to fig. 3 and 4, after the user slides leftward in the blank space of the area D (i.e., the first area), the application icon of "application 1" (i.e., the second icon) previously displayed in the area A (i.e., the second area) is displayed in the area D of the target interface 31, as shown in fig. 6; and the application icon of "application 4" previously displayed in the area D (i.e., the third icon) is displayed in the area A. When the user clicks the application icon of "application 1" (i.e., the second input), the terminal device displays the interface of "application 1".
Therefore, when the user touches the first area of the terminal device with one hand, the terminal device can exchange the icons displayed in the first area and the second area, and the user can then control all N areas of the target interface by operating only the first area with one hand. A minimal sketch of this exchange follows.
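The Kotlin sketch below shows the icon exchange of step B1a: the icon lists shown in the first and second areas simply trade places. The map shape and names are assumptions for illustration.

```kotlin
// Minimal sketch of the icon exchange in step B1a: the icons shown in the
// first and second areas trade places (the map shape is an assumption).
fun swapAreaIcons(icons: MutableMap<String, List<String>>, first: String, second: String) {
    val firstIcons = icons[first].orEmpty()
    icons[first] = icons[second].orEmpty()
    icons[second] = firstIcons
}

fun main() {
    val icons = mutableMapOf("A" to listOf("application 1"), "D" to listOf("application 4"))
    swapAreaIcons(icons, "A", "D")
    println(icons) // {A=[application 4], D=[application 1]}
}
```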
Fig. 7 is a schematic diagram of a possible structure of a terminal device according to an embodiment of the present invention, and as shown in fig. 7, the terminal device 600 includes: a receiving module 601 and a display module 602, wherein:
the receiving module 601 is configured to receive a first input to a first area of a target interface when M icons are displayed on the target interface, where the target interface includes N areas, and the first area is one of the N areas.
A display module 602, configured to display an interface of a first application program corresponding to a first icon in a second area in response to the first input received by the receiving module 601; the second area is an area indicated by the first input, and the first icon is at least one icon displayed in the second area.
Optionally, the first input is: a slide input to the first area; the second region is: and a region of the N regions to which the sliding direction of the sliding input is directed.
Optionally, the first region and the second region each include at least one sub-region; the first input is: touch input aiming at a first subarea in the first area; the first icon is: and the second sub-area is a sub-area corresponding to the first sub-area in the second area.
Optionally, the display module 602 is further configured to display a second icon in the first area in response to the first input received by the receiving module 601, where the second icon is an icon displayed in the second area; the receiving module 601 is further configured to receive a second input for the first icon, where the first icon is at least one of the second icons; the display module 602 is further configured to display an interface of the first application program corresponding to the first icon in response to the second input received by the receiving module 601.
Optionally, as shown in fig. 7, the terminal device 600 further includes: an execution module 603, wherein: the executing module 603 is configured to output a reminding message if no icon is displayed in the second area.
It should be noted that, as shown in fig. 7, modules that the terminal device 600 necessarily includes are illustrated with solid-line boxes, such as the receiving module 601; modules that the terminal device 600 may or may not include are illustrated with dashed boxes, such as the execution module 603.
In the terminal device provided in the embodiment of the present invention, when M icons are displayed on a target interface (including N areas), after receiving a first input to a first area (one of the N areas) of the target interface, the terminal device may display an interface of a first application program corresponding to the first icon in a second area indicated by the first input, where the first icon is at least one icon displayed in the second area. Therefore, the user can display the application program interfaces corresponding to the icons in other areas on the target interface by controlling the first area in the target interface, and further realize the control of the whole target interface.
The terminal device provided by the embodiment of the present invention can implement each process implemented by the terminal device in the above method embodiments, and is not described here again to avoid repetition.
Fig. 8 is a schematic diagram of a hardware structure of a terminal device for implementing various embodiments of the present invention, where the terminal device 100 includes, but is not limited to: radio frequency unit 101, network module 102, audio output unit 103, input unit 104, sensor 105, display unit 106, user input unit 107, interface unit 108, memory 109, processor 110, and power supply 111. Those skilled in the art will appreciate that the configuration of the terminal device 100 shown in fig. 8 does not constitute a limitation of the terminal device, and that the terminal device 100 may include more or less components than those shown, or combine some components, or arrange different components. In the embodiment of the present invention, the terminal device 100 includes, but is not limited to, a mobile phone, a tablet computer, a notebook computer, a palm computer, a vehicle-mounted terminal device, a wearable device, a pedometer, and the like.
The user input unit 107 is configured to receive a first input to a first area of a target interface when M icons are displayed on the target interface, where the target interface includes N areas, and the first area is one of the N areas; a display unit 106, configured to display an interface of the first application program corresponding to the first icon in the second area in response to the first input received by the user input unit 107; the second area is an area indicated by the first input, and the first icon is at least one icon displayed in the second area.
In the terminal device provided in the embodiment of the present invention, when M icons are displayed on a target interface (including N areas), after receiving a first input to a first area (one of the N areas) of the target interface, the terminal device may display an interface of a first application program corresponding to the first icon in a second area indicated by the first input, where the first icon is at least one icon displayed in the second area. Therefore, the user can display the application program interfaces corresponding to the icons in other areas on the target interface by controlling the first area in the target interface, and further realize the control of the whole target interface.
It should be understood that, in the embodiment of the present invention, the radio frequency unit 101 may be used for receiving and sending signals during a message transmission or call process, and specifically, after receiving downlink data from a base station, the downlink data is processed by the processor 110; in addition, the uplink data is transmitted to the base station. Typically, radio frequency unit 101 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like. In addition, the radio frequency unit 101 can also communicate with a network and other devices through a wireless communication system.
The terminal device 100 provides the user with wireless broadband internet access via the network module 102, such as helping the user send and receive e-mails, browse web pages, and access streaming media.
The audio output unit 103 may convert audio data received by the radio frequency unit 101 or the network module 102 or stored in the memory 109 into an audio signal and output as sound. Also, the audio output unit 103 may also provide audio output related to a specific function performed by the terminal device 100 (e.g., a call signal reception sound, a message reception sound, etc.). The audio output unit 103 includes a speaker, a buzzer, a receiver, and the like.
The input unit 104 is used to receive an audio or video signal. The input unit 104 may include a Graphics Processing Unit (GPU) 1041 and a microphone 1042; the graphics processor 1041 processes image data of still pictures or video obtained by an image capturing device (e.g., a camera) in a video capturing mode or an image capturing mode. The processed image frames may be displayed on the display unit 106. The image frames processed by the graphics processor 1041 may be stored in the memory 109 (or other storage medium) or transmitted via the radio frequency unit 101 or the network module 102. The microphone 1042 can receive sound and process it into audio data. In a phone call mode, the processed audio data may be converted into a format transmittable to a mobile communication base station and output via the radio frequency unit 101.
The terminal device 100 also includes at least one sensor 105, such as a light sensor, a motion sensor, and other sensors. Specifically, the light sensor includes an ambient light sensor that can adjust the brightness of the display panel 1061 according to the brightness of ambient light, and a proximity sensor that can turn off the display panel 1061 and/or the backlight when the terminal device 100 is moved to the ear. As one of the motion sensors, the accelerometer sensor can detect the magnitude of acceleration in each direction (generally three axes), detect the magnitude and direction of gravity when stationary, and can be used to identify the terminal device posture (such as horizontal and vertical screen switching, related games, magnetometer posture calibration), vibration identification related functions (such as pedometer, tapping), and the like; the sensors 105 may also include fingerprint sensors, pressure sensors, iris sensors, molecular sensors, gyroscopes, barometers, hygrometers, thermometers, infrared sensors, etc., which are not described in detail herein.
The display unit 106 is used to display information input by a user or information provided to the user. The Display unit 106 may include a Display panel 1061, and the Display panel 1061 may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED), or the like.
The user input unit 107 may be used to receive input numeric or character information and generate key signal inputs related to user settings and function control of the terminal device 100. Specifically, the user input unit 107 includes a touch panel 1071 and other input devices 1072. Touch panel 1071, also referred to as a touch screen, may collect touch operations by a user on or near the touch panel 1071 (e.g., operations by a user on or near touch panel 1071 using a finger, stylus, or any suitable object or attachment). The touch panel 1071 may include two parts of a touch detection device and a touch controller. The touch detection device detects the touch direction of a user, detects a signal brought by touch operation and transmits the signal to the touch controller; the touch controller receives touch information from the touch sensing device, converts the touch information into touch point coordinates, sends the touch point coordinates to the processor 110, and receives and executes commands sent by the processor 110. In addition, the touch panel 1071 may be implemented in various types, such as a resistive type, a capacitive type, an infrared ray, and a surface acoustic wave. In addition to the touch panel 1071, the user input unit 107 may include other input devices 1072. Specifically, other input devices 1072 may include, but are not limited to, a physical keyboard, function keys (e.g., volume control keys, switch keys, etc.), a trackball, a mouse, and a joystick, which are not described in detail herein.
Further, the touch panel 1071 may be overlaid on the display panel 1061, and when the touch panel 1071 detects a touch operation thereon or nearby, the touch panel 1071 transmits the touch operation to the processor 110 to determine the type of the touch event, and then the processor 110 provides a corresponding visual output on the display panel 1061 according to the type of the touch event. Although in fig. 8, the touch panel 1071 and the display panel 1061 are two independent components to implement the input and output functions of the terminal device 100, in some embodiments, the touch panel 1071 and the display panel 1061 may be integrated to implement the input and output functions of the terminal device 100, and is not limited herein.
The interface unit 108 is an interface for connecting an external device to the terminal apparatus 100. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The interface unit 108 may be used to receive input (e.g., data information, power, etc.) from an external device and transmit the received input to one or more elements within the terminal apparatus 100 or may be used to transmit data between the terminal apparatus 100 and the external device.
The memory 109 may be used to store software programs as well as various data. The memory 109 may mainly include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required by at least one function (such as a sound playing function, an image playing function, etc.), and the like; the storage data area may store data (such as audio data, a phonebook, etc.) created according to the use of the cellular phone, and the like. Further, the memory 109 may include high speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other volatile solid state storage device.
The processor 110 is a control center of the terminal device 100, connects various parts of the entire terminal device 100 by various interfaces and lines, and performs various functions of the terminal device 100 and processes data by running or executing software programs and/or modules stored in the memory 109 and calling data stored in the memory 109, thereby performing overall monitoring of the terminal device 100. Processor 110 may include one or more processing units; alternatively, the processor 110 may integrate an application processor, which mainly handles operating systems, user interfaces, application programs, etc., and a modem processor, which mainly handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into the processor 110.
The terminal device 100 may further include a power supply 111 (such as a battery) for supplying power to each component, and optionally, the power supply 111 may be logically connected to the processor 110 through a power management system, so as to implement functions of managing charging, discharging, and power consumption through the power management system.
In addition, the terminal device 100 includes some functional modules that are not shown, and are not described in detail here.
Optionally, an embodiment of the present invention further provides a terminal device, which includes a processor, a memory, and a computer program stored in the memory and capable of running on the processor, where the computer program, when executed by the processor, implements each process of the above interface display method embodiment and can achieve the same technical effect; details are not repeated here to avoid repetition.
The embodiment of the present invention further provides a computer-readable storage medium, where a computer program is stored on the computer-readable storage medium, and when the computer program is executed by a processor, the computer program implements each process of the interface display method embodiment, and can achieve the same technical effect, and in order to avoid repetition, details are not repeated here. The computer-readable storage medium may be a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element preceded by the phrase "comprising a …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises that element.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present application may be embodied in the form of a software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal device (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the method according to the embodiments of the present application.
While the present embodiments have been described with reference to the accompanying drawings, it is to be understood that the invention is not limited to the precise embodiments described above, which are meant to be illustrative and not restrictive, and that various changes may be made therein by those skilled in the art without departing from the spirit and scope of the invention as defined by the appended claims.

Claims (12)

1. An interface display method is applied to terminal equipment, and is characterized by comprising the following steps:
receiving a first input to a first area of a target interface under the condition that M icons are displayed on the target interface, wherein the target interface comprises N areas, and the first area is one of the N areas;
in response to the first input, displaying an interface of a first application program corresponding to a first icon in a second area;
the second area is an area indicated by the first input, and the first icon is at least one icon displayed in the second area.
2. The method of claim 1, wherein the first input is: a slide input for the first region; the second area is: an area of the N areas toward which a sliding direction of the sliding input is directed.
3. The method of claim 1 or 2, wherein the first region and the second region each comprise at least one sub-region;
the first input is: a touch input directed to a first sub-area in the first area;
the first icon is: and icons in a second sub-area in the second area, wherein the second sub-area is a sub-area corresponding to the first sub-area in the second area.
4. The method of claim 1, wherein after receiving the first input for the first region of the target interface, the method further comprises:
displaying a second icon in the first area in response to the first input, the second icon being an icon displayed in the second area;
the interface for displaying the first application program corresponding to the first icon in the second area comprises:
receiving a second input for the first icon, the first icon being at least one of the second icons;
and responding to the second input, and displaying an interface of the first application program corresponding to the first icon.
5. The method of claim 1, wherein after receiving the first input for the first region of the target interface, the method further comprises:
and if the icon is not displayed in the second area, outputting reminding information.
6. A terminal device, characterized in that the terminal device comprises:
a receiving module, configured to receive a first input to a first area of a target interface in a case where M icons are displayed on the target interface, wherein the target interface comprises N areas, and the first area is one of the N areas; and
a display module, configured to display, in a second area, an interface of a first application program corresponding to a first icon in response to the first input received by the receiving module;
wherein the second area is an area indicated by the first input, and the first icon is at least one icon displayed in the second area.
7. The terminal device of claim 6, wherein the first input is a slide input to the first area, and the second area is the area, among the N areas, toward which the sliding direction of the slide input points.
8. The terminal device of claim 6 or 7, wherein the first area and the second area each comprise at least one sub-area; the first input is a touch input to a first sub-area in the first area; and the first icon is an icon in a second sub-area in the second area, wherein the second sub-area is the sub-area in the second area corresponding to the first sub-area.
9. The terminal device of claim 6,
the display module is further configured to display a second icon in the first area in response to the first input received by the receiving module, wherein the second icon is an icon displayed in the second area;
the receiving module is further configured to receive a second input on the first icon, wherein the first icon is at least one of the second icons; and
the display module is further configured to display the interface of the first application program corresponding to the first icon in response to the second input received by the receiving module.
10. The terminal device of claim 6, wherein the terminal device further comprises:
an execution module, configured to output reminder information if no icon is displayed in the second area.
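Claims 6 through 10 restate the method as a device built from a receiving module, a display module, and an execution module. One way to picture that split as plain interfaces; the module decomposition is the claims' own, but the Kotlin shapes are assumptions:

    interface ReceivingModule { fun onInput(x: Float, y: Float) }      // claims 6, 9: accepts the first/second inputs
    interface DisplayModule   { fun showAppInterface(icon: String) }   // claims 6, 9: renders the app's interface
    interface ExecutionModule { fun outputReminder() }                 // claim 10: reminder when the area is empty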
11. A terminal device, characterized by comprising a processor, a memory, and a computer program stored in the memory and executable on the processor, wherein the computer program, when executed by the processor, implements the steps of the interface display method according to any one of claims 1 to 5.
12. A computer-readable storage medium, on which a computer program is stored, wherein the computer program, when executed by a processor, implements the steps of the interface display method according to any one of claims 1 to 5.
CN201911193970.7A 2019-11-28 2019-11-28 Interface display method and terminal equipment Pending CN111026303A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911193970.7A CN111026303A (en) 2019-11-28 2019-11-28 Interface display method and terminal equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911193970.7A CN111026303A (en) 2019-11-28 2019-11-28 Interface display method and terminal equipment

Publications (1)

Publication Number Publication Date
CN111026303A 2020-04-17

Family

ID=70203247

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911193970.7A Pending CN111026303A (en) 2019-11-28 2019-11-28 Interface display method and terminal equipment

Country Status (1)

Country Link
CN (1) CN111026303A (en)

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103019568A (en) * 2012-12-21 2013-04-03 Dongguan Yulong Telecommunication Technology Co Ltd Terminal and icon display method
CN103593136A (en) * 2013-10-21 2014-02-19 Guangdong Oppo Mobile Telecommunications Corp Ltd Touch terminal, and one-hand operation method and device of large-screen touch terminal
CN104345887A (en) * 2014-10-31 2015-02-11 Guangdong Oppo Mobile Telecommunications Corp Ltd Position regulation method and device for desktop icons
WO2016145832A1 (en) * 2015-08-04 2016-09-22 ZTE Corp Method of operating terminal and device utilizing same
CN108196748A (en) * 2017-12-28 2018-06-22 Nubia Technology Co Ltd Terminal display control method, terminal and computer readable storage medium
CN109814794A (en) * 2018-12-13 2019-05-28 Vivo Mobile Communication Co Ltd Interface display method and terminal device
CN110221761A (en) * 2019-04-24 2019-09-10 Vivo Mobile Communication Co Ltd Display method and terminal device

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2022063123A1 (en) * 2020-09-27 2022-03-31 Vivo Mobile Communication Co Ltd Interface displaying method, device, and electronic device

Similar Documents

Publication Publication Date Title
CN110851051B (en) Object sharing method and electronic equipment
CN108255378B (en) Display control method and mobile terminal
CN110069178B (en) Interface control method and terminal equipment
CN110062105B (en) Interface display method and terminal equipment
CN108762634B (en) Control method and terminal
CN108762705B (en) Information display method, mobile terminal and computer readable storage medium
CN108897486B (en) Display method and terminal equipment
CN109032486B (en) Display control method and terminal equipment
CN109828705B (en) Icon display method and terminal equipment
CN110489029B (en) Icon display method and terminal equipment
CN110502162B (en) Folder creating method and terminal equipment
CN109857289B (en) Display control method and terminal equipment
CN109933252B (en) Icon moving method and terminal equipment
CN110007822B (en) Interface display method and terminal equipment
CN110752981B (en) Information control method and electronic equipment
CN109408072B (en) Application program deleting method and terminal equipment
CN109407949B (en) Display control method and terminal
CN108681427B (en) Access right control method and terminal equipment
CN110231972B (en) Message display method and terminal equipment
WO2021093772A1 (en) Notification message processing method and electronic device
CN111190517B (en) Split screen display method and electronic equipment
CN111459350B (en) Icon sorting method and device and electronic equipment
CN111198637B (en) Operation control method and electronic equipment
CN109885242B (en) Method for executing operation and electronic equipment
CN108897477B (en) Operation control method and terminal equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20200417