CN108833679B - Object display method and terminal equipment - Google Patents

Object display method and terminal equipment

Info

Publication number
CN108833679B
CN108833679B CN201810510386.9A
Authority
CN
China
Prior art keywords
objects
terminal device
target
interface
input
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201810510386.9A
Other languages
Chinese (zh)
Other versions
CN108833679A (en)
Inventor
汪劼文
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd filed Critical Vivo Mobile Communication Co Ltd
Priority to CN201810510386.9A priority Critical patent/CN108833679B/en
Publication of CN108833679A publication Critical patent/CN108833679A/en
Application granted granted Critical
Publication of CN108833679B publication Critical patent/CN108833679B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M1/00 Substation equipment, e.g. for use by subscribers
    • H04M1/72 Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724 User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72469 User interfaces specially adapted for cordless or mobile telephones for operating the device by selecting functions from two or more displayed items, e.g. menus or icons
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M1/00 Substation equipment, e.g. for use by subscribers
    • H04M1/02 Constructional features of telephone sets
    • H04M1/0202 Portable telephone sets, e.g. cordless phones, mobile phones or bar type handsets
    • H04M1/0279 Improving the user comfort or ergonomics
    • H04M1/0281 Improving the user comfort or ergonomics for providing single handed use or left/right hand conversion
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M1/00 Substation equipment, e.g. for use by subscribers
    • H04M1/72 Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724 User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72448 User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M1/00 Substation equipment, e.g. for use by subscribers
    • H04M1/72 Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724 User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72469 User interfaces specially adapted for cordless or mobile telephones for operating the device by selecting functions from two or more displayed items, e.g. menus or icons
    • H04M1/72472 User interfaces specially adapted for cordless or mobile telephones for operating the device by selecting functions from two or more displayed items, e.g. menus or icons wherein the items are sorted according to specific criteria, e.g. frequency of use
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M2250/00 Details of telephonic subscriber devices
    • H04M2250/22 Details of telephonic subscriber devices including a touch pad, a touch sensor or a touch detector

Abstract

The embodiment of the invention discloses an object display method and a terminal device, relates to the field of terminal technology, and can solve the problem of poor flexibility when a terminal device displays application icons. The specific scheme is as follows: receiving a first input of a user; in the case that at least one object of M objects in a first interface of the terminal device is in other areas, in response to the first input, merging P objects of the M objects to obtain N objects, where one object is used to indicate at least one application program, the other areas are the areas of the first interface other than a target operation area preset by the terminal device, M, P and N are positive integers, P is less than or equal to M, and N is less than P; and displaying the N objects and the remaining M-P objects in the target operation area.

Description

Object display method and terminal equipment
Technical Field
The embodiment of the invention relates to the technical field of terminals, in particular to an object display method and terminal equipment.
Background
With the rapid development of terminal technology, the full screen has become a new trend in the development of terminal devices (e.g., mobile phones). However, as mobile phone display screens grow larger, it becomes inconvenient for users to operate the phone with one hand.
In order to solve the above problem, many current mobile phones use a screen gesture (e.g., a three-finger pull-down) to pull several rows of application icons from the upper half of the screen down to the lower half, while the application icons originally on the lower half of the screen are no longer displayed on the current interface. In this way, the user can conveniently operate the mobile phone with one hand.
However, in the above method, after several rows of application icons on the upper half of the screen are pulled down to the lower half by a screen gesture, the application icons originally on the lower half of the screen cannot be displayed on the current interface, so the flexibility with which the mobile phone displays application icons is poor.
Disclosure of Invention
The embodiment of the invention provides an object display method and a terminal device, which can solve the problem of poor flexibility when a terminal device displays application icons.
In order to solve the technical problem, the embodiment of the invention adopts the following technical scheme:
In a first aspect of the embodiments of the present invention, an object display method is provided. The object display method may include: receiving a first input of a user; in the case that at least one object of M objects in a first interface of the terminal device is in other areas, in response to the first input, merging P objects of the M objects to obtain N objects, where one object is used to indicate at least one application program, the other areas are the areas of the first interface other than a target operation area preset by the terminal device, M, P and N are positive integers, P is less than or equal to M, and N is less than P; and displaying the N objects and the M-P objects in the target operation area.
In a second aspect of the embodiments of the present invention, a terminal device is provided. The terminal device may include: a receiving unit, a processing unit and a display unit. The receiving unit may be configured to receive a first input from a user. The processing unit may be configured to, when at least one object of the M objects in the first interface of the terminal device is in other areas, respond to the first input received by the receiving unit by merging P objects of the M objects to obtain N objects, where one object is used to indicate at least one application program, the other areas are the areas of the first interface other than a target operation area preset by the terminal device, M, P and N are positive integers, P is less than or equal to M, and N is less than P. The display unit may be configured to display, in the target operation area, the N objects obtained by the processing unit and the M-P objects.
In a third aspect of the embodiments of the present invention, a terminal device is provided, where the terminal device may include a processor, a memory, and a computer program stored in the memory and running on the processor, and the computer program, when executed by the processor, implements the steps of the object display method according to the first aspect.
A fourth aspect of embodiments of the present invention provides a computer-readable storage medium, on which a computer program is stored, which, when executed by a processor, implements the steps of the object display method according to the first aspect.
In the embodiment of the present invention, the terminal device may perform merging processing on P objects of the M objects to obtain N objects when at least one object of the M objects in the first interface is in another area (an area other than a target operation area preset by the terminal device in the first interface), and display the N objects and the M-P objects in the target operation area. The terminal equipment can display N objects and M-P objects in the target operation area, so that a user can conveniently operate the terminal equipment by one hand; in addition, since the N objects are obtained by combining the P objects in the M objects by the terminal device, the N objects may include the P objects, and thus, when the terminal device displays the N objects and the M-P objects in the target operation area, the M objects may all be located in the target operation area, and thus, the flexibility of displaying the M objects by the terminal device is improved.
Drawings
Fig. 1 is a schematic structural diagram of an android operating system according to an embodiment of the present invention;
fig. 2 is a first flowchart of an object display method according to an embodiment of the present invention;
fig. 3 is a first schematic diagram illustrating an example of an interface of a mobile phone according to an embodiment of the present invention;
FIG. 4 is a diagram illustrating an example of a target operating region according to an embodiment of the present invention;
fig. 5 is a schematic diagram of an example of an interface of a mobile phone according to an embodiment of the present invention;
fig. 6 is a flowchart of a method for displaying an object according to an embodiment of the present invention;
FIG. 7 is a diagram illustrating an example of a set of objects according to an embodiment of the present invention;
FIG. 8 is a diagram illustrating an example of N objects according to an embodiment of the present invention;
fig. 9 is a third schematic diagram of an example of an interface of a mobile phone according to an embodiment of the present invention;
fig. 10 is a fourth schematic diagram of an example of an interface of a mobile phone according to an embodiment of the present invention;
fig. 11 is a flowchart of a third method for displaying an object according to an embodiment of the present invention;
fig. 12 is a schematic diagram of an example of an interface of a mobile phone according to an embodiment of the present invention;
fig. 13 is a first schematic structural diagram of a terminal device according to an embodiment of the present invention;
fig. 14 is a schematic structural diagram of a terminal device according to an embodiment of the present invention;
fig. 15 is a hardware schematic diagram of a terminal device according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
The terms "first" and "second," and the like, in the description and claims of the embodiments of the present invention are used to distinguish between different objects, not to describe a particular order of the objects. For example, the first input and the second input are used to distinguish different inputs, not to describe a particular order of inputs. In the description of the embodiments of the present invention, "a plurality" means two or more unless otherwise specified.
The term "and/or" herein describes an association relationship between associated objects, indicating that three relationships may exist; for example, A and/or B may mean: A exists alone, A and B exist simultaneously, or B exists alone. The symbol "/" herein denotes an "or" relationship between the associated objects; for example, A/B denotes A or B.
In the embodiments of the present invention, words such as "exemplary" or "for example" are used to indicate an example, illustration or description. Any embodiment or design described as "exemplary" or "for example" in the embodiments of the present invention is not to be construed as preferred or advantageous over other embodiments or designs. Rather, use of the words "exemplary" or "for example" is intended to present related concepts in a concrete fashion.
The embodiment of the invention provides an object display method and terminal equipment, wherein the terminal equipment can combine P objects in M objects to obtain N objects under the condition that at least one object in the M objects in a first interface is in other areas (areas except a target operation area preset by the terminal equipment in the first interface), and display the N objects and M-P objects in the target operation area. The terminal equipment can display N objects and M-P objects in the target operation area, so that a user can conveniently operate the terminal equipment by one hand; in addition, since the N objects are obtained by combining the P objects in the M objects by the terminal device, the N objects may include the P objects, and thus, when the terminal device displays the N objects and the M-P objects in the target operation area, the M objects may all be located in the target operation area, and thus, the flexibility of displaying the M objects by the terminal device is improved.
The embodiment of the invention provides an object display method and terminal equipment, which can be applied to the process of operating the terminal equipment by a single hand of a user. Specifically, the method can be applied to a process that the terminal device displays the M objects in the first interface of the terminal device in the target operation area when the user operates the terminal device with one hand.
The terminal device in the embodiment of the present invention may be a terminal device having an operating system. The operating system may be an Android (Android) operating system, an ios operating system, or other possible operating systems, and embodiments of the present invention are not limited in particular.
The following describes a software environment to which the object display method provided by the embodiment of the present invention is applied, by taking an android operating system as an example.
Fig. 1 is a schematic diagram of an architecture of a possible android operating system according to an embodiment of the present invention. In fig. 1, the architecture of the android operating system includes 4 layers, which are respectively: an application layer, an application framework layer, a system runtime layer, and a kernel layer (specifically, a Linux kernel layer).
The application program layer comprises various application programs (including system application programs and third-party application programs) in an android operating system.
The application framework layer is a framework of the application, and a developer can develop some applications based on the application framework layer under the condition of complying with the development principle of the framework of the application.
The system runtime layer includes libraries (also called system libraries) and android operating system runtime environments. The library mainly provides various resources required by the android operating system. The android operating system running environment is used for providing a software environment for the android operating system.
The kernel layer is an operating system layer of an android operating system and belongs to the bottommost layer of an android operating system software layer. The kernel layer provides kernel system services and hardware-related drivers for the android operating system based on the Linux kernel.
Taking an android operating system as an example, in the embodiment of the present invention, a developer may develop a software program for implementing the object display method provided in the embodiment of the present invention based on the system architecture of the android operating system shown in fig. 1, so that the object display method may operate based on the android operating system shown in fig. 1. Namely, the processor or the terminal device can implement the object display method provided by the embodiment of the invention by running the software program in the android operating system.
An object display method and a terminal device provided by the embodiments of the present invention are described in detail below with reference to the accompanying drawings through specific embodiments and application scenarios thereof.
In the prior art, after a mobile phone uses a screen gesture to pull several rows of application icons from the upper half of the screen down to the lower half, the application icons originally on the lower half of the screen cannot be displayed on the current interface, so the flexibility with which the mobile phone displays application icons is poor.
In order to solve the above technical problem, an embodiment of the present invention provides an object display method. Exemplarily, fig. 2 illustrates an object display method provided by an embodiment of the present invention, which may be applied to a terminal device having an android operating system as illustrated in fig. 1. Although a logical order of the object display method is illustrated in the method flowchart, in some cases the steps shown or described may be performed in an order different from the one here. For example, as shown in fig. 2, the object display method includes steps 201 to 203 described below.
Step 201, the terminal device receives a first input of a user.
In this embodiment of the present invention, the first input may be used to trigger the terminal device to update the display of M objects in the first interface of the terminal device, and one object may be used to indicate at least one application.
Optionally, in the embodiment of the present invention, when the user holds the terminal device with one hand, the user may perform the first input on the terminal device to trigger the terminal device to update and display the M objects in the first interface of the terminal device.
It is to be understood that, in the embodiment of the present invention, in the case that one object is one folder (e.g., a first folder), the one object may be used to indicate at least two applications; in the case where one object is one application icon, the one object may be used to indicate one application corresponding to the one application icon.
For example, a terminal device is taken as a mobile phone for explanation. Let M be 15. As shown in fig. 3, a user holds the mobile phone 10 with one hand, and 15 objects are displayed on the first interface of the mobile phone 10, where the 15 objects are application icon 1 (i.e., an icon corresponding to application 1) -application icon 12, and folder 1-folder 3, respectively; each of the three folders (i.e., folder 1-folder 3) includes at least two application icons, and one application icon corresponds to one application.
Optionally, in the embodiment of the present invention, the first input may be a pressing operation of a user on a physical key of the terminal device, a clicking operation of the user on a display screen of the terminal device, a shaking input of the user on the terminal device, or the like. The specific method can be determined according to actual use requirements, and the embodiment of the invention is not limited.
Illustratively, the clicking operation may be a single click, a double click, or a preset number of continuous clicks. The specific method can be determined according to actual use requirements, and the embodiment of the invention is not limited.
Illustratively, assume that the first input is a shaking input of the mobile phone by the user. Referring to fig. 3, when holding the mobile phone 10 with one hand, the user may shake the mobile phone 10 to the left (from the user's perspective), to the right, upwards or downwards to trigger the mobile phone 10 to update the 15 objects in the first interface of the mobile phone 10.
Step 202, the terminal device responds to the first input and performs merging processing on P objects in the M objects to obtain N objects when at least one object in the M objects in the first interface of the terminal device is in another area.
Wherein M, P and N are all positive integers, P is less than or equal to M, and N is less than P.
In the embodiment of the present invention, one object is used to indicate at least one application program, and the other area is an area other than a target operation area preset by the terminal device in the first interface.
Optionally, in this embodiment of the present invention, the target operation area may be an area preset by the terminal device, and the shape of the target operation area may be a rectangle, a circle, an ellipse, or a sector. The method and the device can be specifically set according to actual use requirements, and the embodiment of the invention is not limited.
It can be understood that, in the embodiment of the present invention, the target operation area may be an area that is convenient for the user to operate with one hand when the user holds the terminal device with one hand in a first interface preset by the terminal device, and the other area may be an area that is inconvenient for the user to operate with one hand when the user holds the terminal device with one hand in the first interface.
It should be noted that, in the embodiment of the present invention, when a user holds the terminal device with one hand, the holding position may be the central point of the area where the web between the thumb and index finger (the "tiger's mouth") of the holding hand contacts the side edge of the terminal device (or any point in that area). Of course, the holding position may also be the point where the thumb and index finger contact the terminal device when the user holds it with one hand, or the middle of the positions where the user touches the terminal device when holding it with one hand; this is not specifically limited in the embodiment of the present invention.
It can be understood that, in the embodiment of the present invention, the gesture of holding the terminal device with one hand may be different for different users, for example, some users use the left hand to hold the terminal device, and some users use the right hand to hold the terminal device; some users prefer to hold the terminal device on the upper side, some users prefer to hold the terminal device in the middle, and some users prefer to hold the terminal device on the lower side, so that target operation areas of different users may be different when the users hold the terminal device with one hand.
Generally, the terminal device may detect a gesture of holding the terminal device with one hand of a user, and determine a target operation region of the terminal device according to the gesture of holding the terminal device with one hand of the user.
For example, the target operation area is described as a sector. The gesture of the user holding the mobile phone 10 is shown in (A) of fig. 4, that is, the user holds the mobile phone 10 with the right hand and the holding position is relatively low; the mobile phone 10 may determine, according to this gesture, that the target operation area in the first interface of the mobile phone 10 is the area 1, and the other area in the first interface of the mobile phone 10 is the area 2, i.e. the area other than the target operation area (e.g., the area 1).
As another example, as shown in (B) of fig. 4, when the user holds the mobile phone 10 with the right hand, and the position of the user holding the mobile phone 10 is relatively close to the middle, the mobile phone 10 may determine that the target operation area in the first interface of the mobile phone 10 is the area 3, and the other areas in the first interface are the areas 4 except the target operation area (e.g., the area 3) according to the posture of the user holding the mobile phone 10.
As another example, the gesture of the user holding the mobile phone 10 is shown in (C) in fig. 4, that is, the user holds the mobile phone 10 with the left hand, and the position of the user holding the mobile phone 10 is relatively low, the mobile phone 10 may determine, according to the gesture of the user holding the mobile phone 10, that the target operation area in the first interface of the mobile phone 10 is the area 5, and the other areas in the first interface are the areas 6 except the target operation area (e.g., the area 5).
As another example, the gesture of the user holding the mobile phone 10 is shown in (D) in fig. 4, that is, the user holds the mobile phone 10 with the left hand, and the position of the user holding the mobile phone 10 is relatively middle, the mobile phone 10 may determine that the target operation region in the first interface of the mobile phone 10 is the region 7, and the other regions in the first interface are the regions 8 except the target operation region (e.g., the region 7) according to the gesture of the user holding the mobile phone 10.
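The sector-shaped target operation area in fig. 4 can be modeled as the set of screen points within one-handed reach of the holding position. A minimal sketch, assuming a distance-based sector; the screen size, holding position and reach radius below are hypothetical values, not taken from the patent:

```python
import math

def in_target_area(x, y, grip_x, grip_y, radius):
    """Return True if point (x, y) lies inside the sector-shaped target
    operation area: within `radius` of the holding position."""
    return math.hypot(x - grip_x, y - grip_y) <= radius

# Right-hand grip near the lower edge of a 1080x2340 screen (cf. area 1 in (A) of fig. 4)
grip = (1080, 2000)   # hypothetical holding position on the right side edge
reach = 900           # hypothetical one-handed thumb reach, in pixels

print(in_target_area(600, 1900, *grip, reach))  # icon near the thumb -> True
print(in_target_area(100, 200, *grip, reach))   # icon in the upper-left corner -> False
```

A real implementation would derive the grip point and radius from the detected holding gesture, as described above for areas 1 through 8.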
Optionally, in the embodiment of the present invention, the terminal device may first detect a current region where each object in the M objects is located, and then determine whether at least one object in the M objects is located in another region, except the target operation region, in the first interface according to the current region where each object is located.
Illustratively, in conjunction with fig. 3 and (A) in fig. 4, as shown in fig. 5, the mobile phone 10 may detect that, of the 15 objects (e.g., application icon 1-application icon 12, and folder 1-folder 3) in the first interface, application icon 7-application icon 12 and folder 3 are in the area 1 (i.e., the target operation area), and application icon 1-application icon 6, folder 1 and folder 2 are in the area 2 (i.e., the other area except the target operation area).
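The detection described above — walk the M objects and split them by region — can be sketched as follows. The object names, coordinates and region predicate are illustrative, loosely mirroring fig. 5:

```python
def split_by_region(objects, in_target_area):
    """Partition objects into those inside the target operation area and
    those in the other areas, given a region-membership predicate."""
    inside, outside = [], []
    for name, (x, y) in objects.items():
        (inside if in_target_area(x, y) else outside).append(name)
    return inside, outside

# Hypothetical layout: y >= 1200 is the target area, y < 1200 the other areas
objects = {
    "application icon 1": (100, 200), "folder 1": (500, 200),
    "application icon 7": (100, 1500), "folder 3": (500, 1500),
}
inside, outside = split_by_region(objects, lambda x, y: y >= 1200)
print(inside)   # objects already in the target operation area
print(outside)  # a non-empty list here triggers the merging of step 202
```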
Optionally, in an implementation manner of the embodiment of the present invention, the first input may be a shaking input of the terminal device by the user. With reference to fig. 2, as shown in fig. 6, before the step 202, the object display method provided in the embodiment of the present invention may further include the following step 301 and step 302, and the step 202 may be specifically implemented by the following step 202 a.
Step 301, the terminal device responds to the first input, and obtains a parameter of shaking input.
Wherein the parameter may include a shaking force value.
Optionally, in the embodiment of the present invention, the terminal device may obtain the shake input parameter through a sensor of the terminal device.
And step 302, the terminal equipment acquires a target rule according to the shaking force value.
Optionally, in an embodiment of the present invention, the target rule may include single-row merging, merging every two rows, merging all rows, single-column merging, merging every two columns, and merging all columns.
Optionally, in this embodiment of the present invention, the step 302 may be specifically implemented by the following step 302a and step 302 b.
Step 302a, the terminal equipment determines the force range to which the shaking force value belongs.
In the embodiment of the invention, the terminal equipment can preset a plurality of force ranges, so that the terminal equipment can determine the force range to which the shaking force value belongs from the plurality of force ranges according to the obtained shaking force value.
And step 302b, the terminal equipment acquires a target rule corresponding to the strength range according to the strength range.
In the embodiment of the invention, the terminal equipment can preset a plurality of target rules, so that the terminal equipment can acquire the target rule corresponding to the strength range from the plurality of target rules according to the determined strength range.
For example, it is assumed that the plurality of target rules preset by the terminal device are single-row merging, two-row-every-row merging, and all-row merging, respectively. As shown in table 1, an example of a correspondence relationship between the shaking force value, the force range, and the target rule according to the embodiment of the present invention is shown.
TABLE 1

Shaking force value    Force range      Target rule
a                      Force range 1    Single-row merging
b                      Force range 2    Merging every two rows
c                      Force range 3    Merging all rows
In table 1, the force range to which the shaking force value a belongs is force range 1, and the target rule corresponding to the force range 1 is single-line combination; the shaking force value b belongs to a force range of 2, and the target rule corresponding to the force range of 2 is that every two lines are combined; the shaking force value c belongs to a force range 3, and the target rule corresponding to the force range 3 is that all rows are combined.
For example, in combination with table 1, assuming that the shaking force value obtained by the terminal device is a, the terminal device may determine, according to the shaking force value a, a force range 1 to which the shaking force value a belongs, and determine, according to the force range 1, that the target rule corresponding to the force range 1 is single-line combination.
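Steps 302a and 302b amount to a range lookup over table 1. A sketch with hypothetical numeric thresholds (the patent only names the ranges, not their boundaries):

```python
# Hypothetical upper bounds delimiting the three force ranges of table 1
RULES = [
    (2.0, "single-row merging"),         # force range 1: value <= 2.0
    (4.0, "merging every two rows"),     # force range 2: 2.0 < value <= 4.0
    (float("inf"), "merging all rows"),  # force range 3: value > 4.0
]

def target_rule(force_value):
    """Step 302a/302b: determine the force range the shaking force value
    belongs to, then return the target rule preset for that range."""
    for upper_bound, rule in RULES:
        if force_value <= upper_bound:
            return rule

print(target_rule(1.5))  # a mild shake falls in force range 1
print(target_rule(5.0))  # a hard shake falls in force range 3
```

In practice the force value would come from the terminal device's accelerometer, as noted in step 301.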
Step 202a, in the case that at least one object of the M objects in the first interface of the terminal device is in other areas, the terminal device, in response to the first input, merges the P objects of the M objects according to a target rule corresponding to the first input, so as to obtain N objects.
Optionally, in this embodiment of the present invention, the step 202a may be specifically implemented by the following step 202a' and step 202a''.
Step 202a', in the case that at least one object of the M objects in the first interface of the terminal device is in other areas, the terminal device, in response to the first input, determines the objects that meet a target rule among the M objects according to the target rule, so as to obtain N object groups.
Wherein the N object groups include P objects.
For example, it is assumed that the target rule obtained by the terminal device is single-row merging. The terminal device may determine, among the P objects, all objects in the same row of the first interface as one object group, so as to obtain N object groups.
For example, assuming that M is 15, P is 12, N is 5, and the target rule is single-row merging. In conjunction with fig. 3, as shown in (a) of fig. 7, the mobile phone 10 may determine the application icon 1 and the application icon 2 in the first row as the object group 1, determine the application icon 3 and the application icon 4 in the second row as the object group 2, determine the application icon 5, the application icon 6, and the application icon 7 in the third row as the object group 3, determine the application icon 8 and the folder 3 in the fourth row as the object group 4, and determine the application icon 10, the application icon 11, and the application icon 12 in the fifth row as the object group 5, so as to obtain 5 object groups (i.e., the object group 1 to the object group 5), where the 5 object groups include 12 objects.
As another example, assuming that M is 15, P is 15, N is 5, and the target rule is single-row merging. In conjunction with fig. 3, as shown in (B) of fig. 7, the mobile phone 10 may determine all objects in the first row (i.e., the application icon 1, the application icon 2, and the folder 1) as the object group 1, all objects in the second row (i.e., the folder 2, the application icon 3, and the application icon 4) as the object group 2, all objects in the third row (i.e., the application icon 5, the application icon 6, and the application icon 7) as the object group 3, all objects in the fourth row (i.e., the application icon 8, the folder 3, and the application icon 9) as the object group 4, and all objects in the fifth row (i.e., the application icon 10, the application icon 11, and the application icon 12) as the object group 5, so as to obtain 5 object groups (i.e., the object group 1 to the object group 5), which include 15 objects.
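The single-row grouping of step 202a' can be sketched as below, assuming each object is represented as a (name, row) pair; this representation is an illustrative assumption, not part of the embodiment.

```python
from collections import defaultdict

def group_by_row(objects):
    """Group (name, row) objects so that all objects sharing a row of
    the first interface form one object group (single-row merging)."""
    groups = defaultdict(list)
    for name, row in objects:
        groups[row].append(name)
    # Return the groups ordered by row index, as in fig. 7.
    return [groups[row] for row in sorted(groups)]
```

For instance, two icons in the first row and one in the second row yield two object groups, matching the row-by-row grouping described above.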
Step 202a'', the terminal device respectively merges the objects in each of the N object groups to obtain N objects.
It can be understood that, in the embodiment of the present invention, when at least one object in an object group is a folder (for example, a first folder, where the first folder includes at least two application icons), the terminal device may first obtain at least two application icons in each first folder, and then merge other application icons in the object group except the first folder with the at least two application icons in each first folder; or, the terminal device may also directly merge the at least one first folder in the object group and the application icons other than the at least one first folder.
For example, referring to fig. 7 (B), as shown in fig. 8 (a), the mobile phone 10 may merge the application icon 1, the application icon 2, and the folder 1 in the object group 1 to obtain a folder 4, merge the folder 2, the application icon 3, and the application icon 4 in the object group 2 to obtain a folder 5, merge the application icon 5, the application icon 6, and the application icon 7 in the object group 3 to obtain a folder 6, merge the application icon 8, the folder 3, and the application icon 9 in the object group 4 to obtain a folder 7, and merge the application icon 10, the application icon 11, and the application icon 12 in the object group 5 to obtain a folder 8, so as to obtain 5 (N = 5) objects (i.e., the folder 4 to the folder 8).
Further illustratively, assume that folder 1 includes application icon 13 and application icon 14, folder 2 includes application icon 15, application icon 16, and application icon 17, and folder 3 includes application icon 18 and application icon 19. Referring to fig. 7 (B), as shown in fig. 8 (B), the mobile phone 10 may merge the application icon 1, the application icon 2, the application icon 13, and the application icon 14 in the object group 1 to obtain the folder 4; merge the application icon 15, the application icon 16, the application icon 17, the application icon 3, and the application icon 4 in the object group 2 to obtain the folder 5; merge the application icon 5, the application icon 6, and the application icon 7 in the object group 3 to obtain the folder 6; merge the application icon 8, the application icon 18, the application icon 19, and the application icon 9 in the object group 4 to obtain the folder 7; and merge the application icon 10, the application icon 11, and the application icon 12 in the object group 5 to obtain the folder 8, so as to obtain 5 (N = 5) objects (namely, the folder 4 to the folder 8).
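The second merging variant of step 202a'' (expanding each first folder before merging) can be sketched as below; the dictionary-based folder representation is an assumption for illustration.

```python
def merge_group(group, folder_contents):
    """Merge one object group into a flat list of application icons.

    folder_contents maps a folder name to the application icons it
    contains; any folder in the group is expanded (flattened) first,
    while plain application icons are kept as-is."""
    merged = []
    for obj in group:
        merged.extend(folder_contents.get(obj, [obj]))
    return merged
```

Applied to the object group 1 of fig. 8 (B), this reproduces a folder 4 holding the application icons 1, 2, 13, and 14.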
And step 203, the terminal device displays the N objects and the M-P objects in the target operation area.
It can be understood that, in the embodiment of the present invention, M-P objects are objects that are not combined by the terminal device in the M objects; after the terminal device merges the P objects to obtain N objects, the terminal device may display the N objects and M-P objects in the target operation area.
For example, referring to fig. 7 (a), P is 12 and M-P is 3. The mobile phone 10 may merge the application icon 1 and the application icon 2 in the object group 1 to obtain a folder 4, merge the application icon 3 and the application icon 4 in the object group 2 to obtain a folder 5, merge the application icon 5, the application icon 6, and the application icon 7 in the object group 3 to obtain a folder 6, merge the application icon 8 and the folder 3 in the object group 4 to obtain a folder 7, and merge the application icon 10, the application icon 11, and the application icon 12 in the object group 5 to obtain a folder 8, so as to obtain 5 (N = 5) objects (i.e., the folder 4 to the folder 8). The mobile phone 10 may then display the 5 merged objects (i.e., the folder 4 to the folder 8) and the 3 objects that were not merged (i.e., the folder 1, the folder 2, and the application icon 9) in the target operation area.
The embodiment of the invention provides an object display method, wherein a terminal device can combine P objects in M objects to obtain N objects under the condition that at least one object in the M objects in a first interface is in other areas (areas except a target operation area preset by the terminal device in the first interface), and display the N objects and M-P objects in the target operation area. The terminal equipment can display N objects and M-P objects in the target operation area, so that a user can conveniently operate the terminal equipment by one hand; in addition, since the N objects are obtained by combining the P objects in the M objects by the terminal device, the N objects may include the P objects, and thus, when the terminal device displays the N objects and the M-P objects in the target operation area, the M objects may all be located in the target operation area, and thus, the flexibility of displaying the M objects by the terminal device is improved.
Furthermore, in the embodiment of the present invention, when the terminal device displays N objects and M-P objects in the target operation area, the first interface of the terminal device is not reduced, so that the display effect of the large screen of the terminal device can be ensured, that is, the display effect of the terminal device is better.
Optionally, in an embodiment of the present invention, the parameters in step 301 may further include a shaking direction, and the N objects may be N first folders. In combination with step 301 and step 302, before step 203, the object display method provided in the embodiment of the present invention may further include step 401 to step 403 described below, and step 203 may be specifically realized by step 203a described below.
Step 401, the terminal device obtains a target display mode corresponding to the shaking direction according to the shaking direction.
Optionally, in the embodiment of the present invention, when the user holds the terminal device with one hand, the user may shake the terminal device in a certain direction (for example, a left direction, a right direction, an upward direction, or a downward direction) with the user as a reference object, and the terminal device may obtain a target display mode corresponding to the shake direction.
Optionally, in this embodiment of the present invention, the target display modes may include an obliquely increasing display mode (for example, increasing obliquely from the bottom right to the top left, or from the bottom left to the top right), a circular display mode, a display mode of an irregular figure, and the like. The display mode may be specifically set according to actual use requirements, and the embodiment of the present invention is not limited thereto.
Optionally, in an embodiment of the present invention, if the shaking direction is a rightward direction, the target display mode corresponding to the rightward direction may be a display mode that increases obliquely from the bottom right to the top left; if the shaking direction is a leftward direction, the target display mode corresponding to the leftward direction may be a display mode that increases obliquely from the bottom left to the top right.
Step 402, the terminal device adds a name to each of the N first folders according to a preset naming mode.
Optionally, in this embodiment of the present invention, the preset naming mode may include naming according to the type of the objects included in each first folder, naming according to the size of each first folder, naming according to the creation time of each first folder, naming according to the position (e.g., the row and/or the column) of the objects included in each first folder in the first interface, and the like.
Illustratively, in conjunction with (B) in fig. 8, N is 5, and the 5 first folders are the folders 4 to 8. The mobile phone 10 may name the 5 first folders according to the row, in the first interface, of the objects included in each of the 5 first folders. For example, if the objects included in the folder 4 (i.e., the application icon 1, the application icon 2, the application icon 13, and the application icon 14) are in the first row of the first interface, the mobile phone 10 may add the name row 1 to the folder 4, and so on: the folder 5 is named row 2, the folder 6 is named row 3, the folder 7 is named row 4, and the folder 8 is named row 5.
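Naming each first folder after the row its members occupied (one of the preset naming modes above) can be sketched as a simple mapping; the folder-to-row dictionary is an assumed representation.

```python
def name_by_row(folder_rows):
    """folder_rows maps a merged folder to the row index, in the first
    interface, of the objects it contains; each folder is named after
    that row, e.g. "row 1" for the first row."""
    return {folder: "row {}".format(row) for folder, row in folder_rows.items()}
```

For the folders 4 and 5 of the example above, this yields the names row 1 and row 2.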
And step 403, the terminal device sorts the N first folders and the M-P objects according to the name of each first folder and the names of the M-P objects to obtain a sorting result.
Optionally, in this embodiment of the present invention, the terminal device may sort the N first folders and the M-P objects according to the order of the initial letters of the name of each first folder and of the names of the M-P objects; or the terminal device may sort the N first folders and the M-P objects according to the order of the numbers included in the name of each first folder and in the names of the M-P objects; or the terminal device may sort the N first folders and the M-P objects by both the initial letters of those names and the numbers included in them. The sorting may be specifically set according to actual use requirements, and the embodiment of the present invention is not limited thereto.
For example, in conjunction with (a) in fig. 7, it is assumed that the mobile phone 10 merges the objects in each object group shown in (a) in fig. 7, obtaining 5 (N = 5) objects (i.e., the folder 4 to the folder 8), and that the objects that are not merged are M-P = 3 objects (i.e., the folder 1, the folder 2, and the application icon 9). The mobile phone 10 may add the name row 1 to the folder 4, the name row 2 to the folder 5, the name row 3 to the folder 6, the name row 4 to the folder 7, and the name row 5 to the folder 8; then, the mobile phone 10 may sort the 5 folders and the 3 un-merged objects according to the order of the numbers included in the names of the 5 folders (i.e., the folder 4 to the folder 8) and in the names of the 3 un-merged objects (i.e., the folder 1, the folder 2, and the application icon 9), and according to the order of the initial letters of those names, obtaining a sorting result of row 1, folder 1, row 2, folder 2, row 3, row 4, row 5, and application icon 9.
Further exemplarily, in connection with (B) in fig. 8, N is 5 and M-P is 0. The mobile phone 10 may add a name of row 1 to the folder 4, a name of row 2 to the folder 5, a name of row 3 to the folder 6, a name of row 4 to the folder 7, and a name of row 5 to the folder 8 shown in (B) of fig. 8; the handset 10 may then sort the 5 folders in order of the numbers included in the names of the 5 folders (i.e., row 1, row 2, row 3, row 4, and row 5), resulting in a sort result of row 1, row 2, row 3, row 4, and row 5.
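A simplified sketch of the sorting of step 403: order primarily by the first number appearing in a name and secondarily by the name itself. The exact tie-breaking between initial letters and numbers is not fully specified in the embodiment, so this key is an illustrative assumption.

```python
import re

def sort_key(name):
    """Primary key: the first number in the name (names without a
    number sort last); secondary key: the name itself."""
    m = re.search(r"\d+", name)
    return (int(m.group()) if m else float("inf"), name)

def sort_objects(names):
    """Sort folder and object names, as in step 403."""
    return sorted(names, key=sort_key)
```

For example, sorting row 3, row 1, folder 2, and row 5 by this key yields row 1, folder 2, row 3, row 5, interleaving numbered folders and objects as in the sorting results above.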
And 203a, the terminal equipment displays the N first folders and the M-P objects in the target operation area according to the sequencing result and the target display mode.
For example, assuming that N is 5 and M-P is 3, the mobile phone 10 sorts the 5 named folders (the folder 4 to the folder 8) and the 3 un-merged objects (i.e., the folder 1, the folder 2, and the application icon 9); the obtained sorting result is row 1, folder 1, row 2, folder 2, row 3, row 4, row 5, and application icon 9, and the target display mode is a display mode that increases obliquely from the bottom right to the top left. In conjunction with (a) in fig. 4, as shown in (a) in fig. 9, the mobile phone 10 may display the row 1, the folder 1, the row 2, the folder 2, the row 3, the row 4, the row 5, and the application icon 9 in the area 1 (i.e., the target operation area) in a display manner that increases obliquely from the bottom right to the top left (indicated by a line with an arrow in (a) in fig. 9) according to the sorting result.
Further exemplarily, assuming that N is 5 and M-P is 0, the mobile phone 10 sorts the 5 named folders (the folder 4 to the folder 8); the obtained sorting result is row 1, row 2, row 3, row 4, and row 5, and the target display mode is a display mode that increases obliquely from the bottom left to the top right. In conjunction with (a) in fig. 4, as shown in (B) in fig. 9, the mobile phone 10 may display the row 1, the row 2, the row 3, the row 4, and the row 5 in the area 1 (i.e., the target operation area) in a display manner that increases obliquely from the bottom left to the top right (indicated by a line with an arrow in (B) in fig. 9) according to the sorting result.
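The obliquely increasing display mode can be sketched as placing the sorted objects at evenly spaced points along a diagonal of the target operation area. The coordinate convention (x measured from the left edge, y from the bottom edge of the area) is an assumption for illustration.

```python
def diagonal_positions(n, width, height):
    """Evenly space n objects along the bottom-right-to-top-left
    diagonal of a width x height target operation area."""
    if n == 1:
        return [(float(width), 0.0)]
    step = 1.0 / (n - 1)
    return [(width * (1 - i * step), height * i * step) for i in range(n)]
```

The first object lands at the bottom-right corner and the last at the top-left corner; swapping the x expression gives the bottom-left-to-top-right variant.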
In the embodiment of the invention, the terminal equipment can display the N first folders and the M-P objects in the target operation area according to the sequencing result and the target display mode, so that a user can conveniently operate the terminal equipment by one hand; in addition, since the N first folders are obtained by the terminal device after merging the P objects in the M objects according to the target rule, the N first folders may include the P objects, and thus, when the terminal device displays the N first folders and the M-P objects in the target operation region, the M objects may all be located in the target operation region, and thus, the flexibility of the terminal device in displaying the M objects is improved.
It should be noted that, in the embodiment of the present invention, the step 401 may be executed first, and then the step 402 and the step 403 are executed, or the step 402 and the step 403 may be executed first, and then the step 401 is executed, or the step 401, the step 402, and the step 403 may be executed at the same time, and the order of executing the step 401, the step 402, and the step 403 is not limited in the embodiment of the present invention.
Optionally, in this embodiment of the present invention, the N objects may be N first folders. After step 203, the object display method provided in the embodiment of the present invention may further include step 501 and step 502 described below.
Step 501, the terminal device receives a second input of the user.
In this embodiment of the present invention, the second input may be used to trigger the terminal device to display at least two application icons in the second folder on the first interface, where the second folder is one of the N first folders.
In the embodiment of the present invention, a user may perform a second input on one folder (e.g., a second folder) of the N first folders at the first interface of the terminal device, so as to trigger the terminal device to display at least two application icons in the second folder at the first interface.
Optionally, in this embodiment of the present invention, the second input may be a click operation of the second folder by the user. For example, the clicking operation may be a single-click operation, a double-click operation, or a continuous clicking operation for a preset number of times. The specific method can be determined according to actual use requirements, and the embodiment of the invention is not limited.
And 502, the terminal equipment responds to the second input and displays a target interface in a target operation area in an overlapping mode.
The target interface can be used for displaying at least two application icons, the target interface can comprise at least one sub-interface, and the at least two application icons are located on the at least one sub-interface.
It can be understood that, in the embodiment of the present invention, in a case that the target interface includes a plurality of application icons, and the plurality of application icons cannot be displayed in the target operation area, the target interface may include at least two sub-interfaces; the user can perform sliding input on the target interface to trigger the terminal device to scroll and display at least two sub-interfaces.
It should be noted that, in the embodiment of the present invention, when the terminal device displays the target interface in the target operation area in an overlapping manner, the transparency of the target interface may be between 0 and 100%.
Illustratively, assume that the second folder is row 3, and that row 3 includes application icon 5, application icon 6, and application icon 7. Referring to fig. 9 (B), as shown in fig. 10 (a), the mobile phone 10 may display a target interface in the area 1 (i.e., the target operation area) in an overlapping manner, where the target interface includes a sub-interface 11, and the application icon 5, the application icon 6, and the application icon 7 are displayed on the sub-interface 11.
Further illustratively, assume that the second folder is row 2, and that row 2 includes application icon 15, application icon 16, application icon 17, application icon 3, and application icon 4. With reference to fig. 9 (B), as shown in fig. 10 (B), the mobile phone 10 may display a target interface in an area 1 (i.e., a target operation area) in an overlapping manner, where the target interface includes two sub-interfaces (e.g., a sub-interface 12 and a sub-interface 13), an application icon 15, an application icon 16, an application icon 17, and an application icon 3 are displayed on the sub-interface 12, an application icon 4 is displayed on the sub-interface 13, and the mobile phone 10 may display the sub-interface 12 in the area 1 in an overlapping manner; after the user makes a slide input (e.g., a slide in the left direction) on the sub-interface 12, the cellular phone 10 may display the sub-interface 13 in the area 1 as shown in (C) of fig. 10.
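Splitting a second folder's application icons across the sub-interfaces of the target interface can be sketched as fixed-size pagination; the per-page capacity of 4 matches the example above but is otherwise an assumption.

```python
def paginate(icons, per_page=4):
    """Split a folder's icons into sub-interfaces holding at most
    per_page icons each; a sliding input scrolls between the pages."""
    return [icons[i:i + per_page] for i in range(0, len(icons), per_page)]
```

For the 5 icons of row 2 above, this yields two sub-interfaces: four icons on the first and one on the second.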
In the embodiment of the invention, the terminal device can display the target interface in the target operation area in an overlapping manner according to the second input of the user, the target interface can comprise at least one sub-interface, and the at least one sub-interface can be used for displaying all the application icons in the second folder.
Optionally, in the embodiment of the present invention, as shown in fig. 11 in combination with fig. 2, after the step 201, in a case that it is detected that each of the M objects is in the target operation area, the step 202 and the step 203 may be replaced with a step 601 described below.
Step 601, the terminal device responds to the first input and displays the M objects at the default positions of the system under the condition that each object in the M objects is in the target operation area.
In the embodiment of the present invention, in a case that each of the M objects is in the target operation area, the terminal device may display each object at the system default position corresponding to that object.
It can be understood that, in the embodiment of the present invention, the terminal device may determine whether each object of the M objects is in the target operation area according to the area where each object of the M objects is currently located. The terminal device may determine, for each object, a system default position corresponding to the object first and then display the object at the system default position corresponding to the object when detecting that each object of the M objects is in the target operation area.
For example, after the mobile phone 10 receives the first input of the user, the mobile phone 10 detects whether each of the 15 (M = 15) objects is within the target operation area. As shown in fig. 12 (a), the mobile phone 10 detects that each of the 15 objects is within the area 1 (the target operation area); as shown in fig. 12 (B), the mobile phone 10 displays each of the 15 objects at the system default position corresponding to that object.
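Step 601 can be sketched as a check-and-restore: if every object already lies inside the target operation area, each object is moved back to its system default position. The position and area representations below are assumptions for illustration.

```python
def display_positions(current, defaults, in_target_area):
    """current and defaults map object name -> position; in_target_area
    is a predicate on positions. If every object is already inside the
    target operation area, return the system default positions (step
    601); otherwise leave the current layout unchanged."""
    if all(in_target_area(pos) for pos in current.values()):
        return dict(defaults)
    return dict(current)
```

With a target operation area covering the lower-left quadrant, an object at (10, 10) inside the area is restored to its default position, while an object outside the area keeps its current position.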
Fig. 13 shows a schematic diagram of a possible structure of a terminal device involved in the embodiment of the present invention, and as shown in fig. 13, the terminal device 130 may include: a receiving unit 131, a processing unit 132, and a display unit 133.
The receiving unit 131 may be configured to receive a first input from a user. The processing unit 132 may be configured to, in a case that at least one object of the M objects in the first interface of the terminal device is in another area, in response to the first input received by the receiving unit 131, perform merging processing on P objects of the M objects to obtain N objects, where one object is used to indicate at least one application program, the other areas are areas of the first interface other than a target operation area preset by the terminal device, M, P and N are positive integers, P is less than or equal to M, and N < P. The display unit 133 may be configured to display the N objects and the M-P objects obtained by the processing unit 132 in the target operation region.
In a possible implementation manner, the display unit 133 may be further configured to display the M objects at default positions of the system in response to the first input received by the receiving unit 131 when each of the M objects is in the target operation region.
In a possible implementation manner, the processing unit 132 may be specifically configured to determine, according to a target rule corresponding to the first input, an object that meets the target rule in the M objects, so as to obtain N object groups, where the N object groups include P objects; and respectively combining the objects in each object group in the N object groups to obtain N objects.
In a possible implementation manner, the first input may be a shaking input of the terminal device by the user, and the N objects may be N first folders. With reference to fig. 13, as shown in fig. 14, the terminal device 130 according to the embodiment of the present invention may further include: an acquisition unit 134, an addition unit 135 and a sorting unit 136. The acquiring unit 134 may be configured to acquire a parameter of the shake input before the displaying unit 133 displays the N objects and the M-P objects in the target operating area, where the parameter may include a shake direction; and acquiring a target display mode corresponding to the shaking direction according to the shaking direction. The adding unit 135 may be configured to add a name to each of the N first folders obtained by the processing unit 132 according to a preset naming manner. The sorting unit 136 may be configured to sort the N first folders and the M-P objects according to the name of each first folder and the names of the M-P objects obtained by the adding unit 135, so as to obtain a sorting result. The display unit 133 may be specifically configured to display the N first folders and the M-P objects in the target operation area according to the sorting result obtained by the sorting unit 136 and the target display manner obtained by the obtaining unit.
In a possible implementation manner, the receiving unit 131 may be further configured to receive a second input from the user after the displaying unit 133 displays the N objects and the M-P objects in the target operation region. The display unit 133 may be further configured to, in response to the second input received by the receiving unit 131, superimpose and display a target interface in the target operation area, where the target interface is configured to display icons of at least two applications in the second folder, and the target interface includes at least one sub-interface, where the icons of the at least two applications are located on the at least one sub-interface, and the second folder is one of the N first folders.
The terminal device provided by the embodiment of the present invention can implement each process implemented by the terminal device in the above method embodiments, and for avoiding repetition, detailed description is not repeated here.
The embodiment of the invention provides a terminal device, which can merge P objects in M objects to obtain N objects under the condition that at least one object in the M objects in a first interface is in other areas (areas except a target operation area preset by the terminal device in the first interface), and display the N objects and M-P objects in the target operation area. The terminal equipment can display N objects and M-P objects in the target operation area, so that a user can conveniently operate the terminal equipment by one hand; in addition, since the N objects are obtained by combining the P objects in the M objects by the terminal device, the N objects may include the P objects, and thus, when the terminal device displays the N objects and the M-P objects in the target operation area, the M objects may all be located in the target operation area, and thus, the flexibility of displaying the M objects by the terminal device is improved.
Fig. 15 is a hardware schematic diagram of a terminal device for implementing various embodiments of the present invention. As shown in fig. 15, the terminal device 100 includes, but is not limited to: radio frequency unit 101, network module 102, audio output unit 103, input unit 104, sensor 105, display unit 106, user input unit 107, interface unit 108, memory 109, processor 110, and power supply 111.
It should be noted that, as those skilled in the art will appreciate, the terminal device structure shown in fig. 15 does not constitute a limitation to the terminal device, and the terminal device may include more or less components than those shown in fig. 15, or may combine some components, or may arrange different components. In the embodiment of the present invention, the terminal device includes, but is not limited to, a mobile phone, a tablet computer, a notebook computer, a palm computer, a vehicle-mounted terminal, a wearable device, a pedometer, and the like.
The user input unit 107 may be configured to receive a first input from a user.
The processor 110 may be configured to, in a case that at least one object of M objects in the first interface of the terminal device is in another area, in response to a first input received by the user input unit 107, merge P objects of the M objects to obtain N objects, where one object is used to indicate at least one application program, the other area is an area of the first interface other than a target operation area preset by the terminal device, M, P and N are positive integers, P is less than or equal to M, and N is less than P; and displaying the N objects and the M-P objects in the target operation area.
The embodiment of the invention provides a terminal device, which can merge P objects in M objects to obtain N objects under the condition that at least one object in the M objects in a first interface is in other areas (areas except a target operation area preset by the terminal device in the first interface), and display the N objects and M-P objects in the target operation area. The terminal equipment can display N objects and M-P objects in the target operation area, so that a user can conveniently operate the terminal equipment by one hand; in addition, since the N objects are obtained by combining the P objects in the M objects by the terminal device, the N objects may include the P objects, and thus, when the terminal device displays the N objects and the M-P objects in the target operation area, the M objects may all be located in the target operation area, and thus, the flexibility of displaying the M objects by the terminal device is improved.
It should be understood that, in the embodiment of the present invention, the radio frequency unit 101 may be used for receiving and sending signals during a message transmission or call process, and specifically, after receiving downlink data from a base station, the downlink data is processed by the processor 110; in addition, the uplink data is transmitted to the base station. Typically, radio frequency unit 101 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like. In addition, the radio frequency unit 101 can also communicate with a network and other devices through a wireless communication system.
The terminal device provides wireless broadband internet access to the user through the network module 102, such as helping the user send and receive e-mails, browse webpages, access streaming media, and the like.
The audio output unit 103 may convert audio data received by the radio frequency unit 101 or the network module 102 or stored in the memory 109 into an audio signal and output as sound. Also, the audio output unit 103 may also provide audio output related to a specific function performed by the terminal device 100 (e.g., a call signal reception sound, a message reception sound, etc.). The audio output unit 103 includes a speaker, a buzzer, a receiver, and the like.
The input unit 104 is used to receive an audio or video signal. The input unit 104 may include a graphics processing unit (GPU) 1041 and a microphone 1042. The graphics processor 1041 processes image data of a still picture or video obtained by an image capturing device (e.g., a camera) in a video capturing mode or an image capturing mode. The processed image frames may be displayed on the display unit 106. The image frames processed by the graphics processor 1041 may be stored in the memory 109 (or other storage medium) or transmitted via the radio frequency unit 101 or the network module 102. The microphone 1042 may receive sound and may be capable of processing such sound into audio data. In the case of a phone call mode, the processed audio data may be converted into a format transmittable to a mobile communication base station via the radio frequency unit 101 and then output.
The terminal device 100 also includes at least one sensor 105, such as a light sensor, a motion sensor, and other sensors. Specifically, the light sensor includes an ambient light sensor that can adjust the brightness of the display panel 1061 according to the brightness of ambient light, and a proximity sensor that can turn off the display panel 1061 and/or the backlight when the terminal device 100 is moved to the ear. As one of the motion sensors, the accelerometer sensor can detect the magnitude of acceleration in each direction (generally three axes), detect the magnitude and direction of gravity when stationary, and can be used to identify the terminal device posture (such as horizontal and vertical screen switching, related games, magnetometer posture calibration), vibration identification related functions (such as pedometer, tapping), and the like; the sensors 105 may also include fingerprint sensors, pressure sensors, iris sensors, molecular sensors, gyroscopes, barometers, hygrometers, thermometers, infrared sensors, etc., which are not described in detail herein.
The display unit 106 is used to display information input by the user or information provided to the user. The display unit 106 may include a display panel 1061, and the display panel 1061 may be configured in the form of a liquid crystal display (LCD), an organic light-emitting diode (OLED) display, or the like.
The user input unit 107 may be used to receive input numeric or character information and to generate key signal inputs related to user settings and function control of the terminal device. Specifically, the user input unit 107 includes a touch panel 1071 and other input devices 1072. The touch panel 1071, also referred to as a touch screen, may collect touch operations by the user on or near it (e.g., operations performed on or near the touch panel 1071 using a finger, a stylus, or any suitable object or attachment). The touch panel 1071 may include two parts: a touch detection device and a touch controller. The touch detection device detects the orientation of the user's touch, detects the signal brought by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into touch point coordinates, sends the coordinates to the processor 110, and receives and executes commands sent by the processor 110. The touch panel 1071 may be implemented in various types, such as resistive, capacitive, infrared, and surface acoustic wave types. In addition to the touch panel 1071, the user input unit 107 may include other input devices 1072. Specifically, the other input devices 1072 may include, but are not limited to, a physical keyboard, function keys (e.g., volume control keys, a switch key, etc.), a trackball, a mouse, and a joystick, which are not described in detail herein.
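The detection-device/controller split described above can be sketched as a toy pipeline in Python; the class name, the normalized-signal representation, and the callback standing in for processor 110 are all illustrative assumptions:

```python
class TouchController:
    """Toy model of the touch pipeline: the detection layer reports a
    raw signal, and the controller converts it into touch-point
    coordinates and forwards them to the processor callback."""

    def __init__(self, width_px, height_px, on_touch):
        self.width_px = width_px
        self.height_px = height_px
        self.on_touch = on_touch  # plays the role of processor 110

    def handle_raw(self, u, v):
        # u, v are assumed normalized sensor readings in [0, 1];
        # convert them to screen coordinates before dispatching.
        x = round(u * (self.width_px - 1))
        y = round(v * (self.height_px - 1))
        self.on_touch(x, y)
        return (x, y)
```

In this sketch the processor's role is reduced to a callback; on a real device the controller would raise an interrupt and the processor would classify the touch event before updating the display.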
Further, the touch panel 1071 may be overlaid on the display panel 1061. When the touch panel 1071 detects a touch operation on or near it, it transmits the operation to the processor 110 to determine the type of the touch event, and the processor 110 then provides a corresponding visual output on the display panel 1061 according to the type of the touch event. Although in fig. 15 the touch panel 1071 and the display panel 1061 are two independent components that implement the input and output functions of the terminal device, in some embodiments the touch panel 1071 and the display panel 1061 may be integrated to implement the input and output functions of the terminal device, which is not limited herein.
The interface unit 108 is an interface for connecting an external device to the terminal device 100. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The interface unit 108 may be used to receive input (e.g., data information, power, etc.) from an external device and transmit the received input to one or more elements within the terminal device 100, or may be used to transmit data between the terminal device 100 and the external device.
The memory 109 may be used to store software programs as well as various data. The memory 109 may mainly include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required by at least one function (such as a sound playing function, an image playing function, etc.), and the like, and the storage data area may store data (such as audio data, a phonebook, etc.) created according to the use of the mobile phone, and the like. Further, the memory 109 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device.
The processor 110 is the control center of the terminal device; it connects the various parts of the entire terminal device using various interfaces and lines, and performs the various functions of the terminal device and processes data by running or executing the software programs and/or modules stored in the memory 109 and calling the data stored in the memory 109, thereby monitoring the terminal device as a whole. The processor 110 may include one or more processing units; preferably, the processor 110 may integrate an application processor, which mainly handles the operating system, user interfaces, application programs, etc., and a modem processor, which mainly handles wireless communication. It can be understood that the modem processor may alternatively not be integrated into the processor 110.
The terminal device 100 may further include a power supply 111 (such as a battery) for supplying power to each component, and preferably, the power supply 111 may be logically connected to the processor 110 through a power management system, so as to implement functions of managing charging, discharging, and power consumption through the power management system.
In addition, the terminal device 100 includes some functional modules that are not shown, and are not described in detail here.
Preferably, an embodiment of the present invention further provides a terminal device, including the processor 110 shown in fig. 15, the memory 109, and a computer program stored in the memory 109 and executable on the processor 110. When executed by the processor 110, the computer program implements each process of the foregoing method embodiment and can achieve the same technical effect; to avoid repetition, details are not described here again.
An embodiment of the present invention further provides a computer-readable storage medium storing a computer program. When executed by a processor, the computer program implements the processes of the foregoing method embodiments and can achieve the same technical effects; to avoid repetition, details are not repeated here. The computer-readable storage medium may be a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such a process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of other like elements in the process, method, article, or apparatus that comprises the element.
Through the above description of the embodiments, those skilled in the art will clearly understand that the methods of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases the former is the better implementation. Based on such an understanding, the technical solutions of the present invention may be embodied in the form of a software product, which is stored in a storage medium (such as a ROM/RAM, a magnetic disk, or an optical disk) and includes instructions for enabling a terminal (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the methods according to the embodiments of the present invention.
While the present invention has been described with reference to the embodiments shown in the drawings, the present invention is not limited to the embodiments, which are illustrative and not restrictive, and it will be apparent to those skilled in the art that various changes and modifications can be made therein without departing from the spirit and scope of the invention as defined in the appended claims.

Claims (9)

1. An object display method, characterized in that the method comprises:
receiving a first input of a user;
under the condition that at least one object in M objects in a first interface of terminal equipment is in other areas, in response to the first input, combining P objects in the M objects to obtain N objects, wherein the N objects are folders, one object is used for indicating at least two application programs, the other areas are areas except a target operation area preset by the terminal equipment in the first interface, M, P and N are positive integers, P is less than or equal to M, and N is less than or equal to P;
displaying the N objects and the M-P objects in the target operation area;
the merging the P objects in the M objects to obtain N objects includes:
determining an object which accords with the target rule in the M objects according to a target rule corresponding to the first input to obtain N object groups, wherein the N object groups comprise the P objects;
respectively merging the objects in each object group in the N object groups to obtain the N objects;
the target rule comprises any one of single-row combination, every two-row combination, all-row combination, single-column combination, every two-column combination and all-column combination.
2. The method of claim 1, further comprising:
in a case where each of the M objects is in the target operation area, displaying the M objects at system default positions in response to the first input.
3. The method according to claim 1, wherein the first input is a shaking input of the terminal device by a user, and the N objects are N first folders;
before the displaying the N objects and the M-P objects in the target operation region, the method further includes:
acquiring parameters of the shaking input, wherein the parameters comprise a shaking direction;
acquiring a target display mode corresponding to the shaking direction according to the shaking direction;
adding a name to each first folder in the N first folders according to a preset naming mode;
sorting the N first folders and the M-P objects according to the name of each first folder and the names of the M-P objects to obtain a sorting result;
the displaying the N objects and the M-P objects in the target operation area comprises:
and displaying the N first folders and the M-P objects in the target operation area according to the sequencing result and the target display mode.
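The naming-and-sorting steps of claim 3 can be sketched as follows; the "Folder N" scheme is an assumed example of a preset naming mode, and alphabetical ordering by name is the assumed sort:

```python
def name_and_sort(folders, loose_objects, prefix="Folder "):
    """Name each newly created folder with a preset scheme and sort
    the folders together with the remaining loose objects by name.

    folders: list of folder contents (each a list of app names).
    loose_objects: names of the M-P objects that were not merged.
    Returns (name, contents) pairs in display order; contents is
    None for a loose object.
    """
    named = [(f"{prefix}{i + 1}", contents)
             for i, contents in enumerate(folders)]
    items = named + [(name, None) for name in loose_objects]
    return sorted(items, key=lambda item: item[0])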
4. The method of claim 3, wherein after the N objects and the M-P objects are displayed in the target operation area, the method further comprises:
receiving a second input of the user;
and in response to the second input, displaying a target interface in the target operation area in an overlapping manner, wherein the target interface is used for displaying icons of at least two applications in a second folder, the target interface comprises at least one sub-interface, the icons of the at least two applications are located on the at least one sub-interface, and the second folder is one of the N first folders.
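The distribution of a second folder's application icons across the sub-interfaces of the target interface in claim 4 can be sketched as simple pagination; the page size is an assumed parameter, not specified by the patent:

```python
def paginate_icons(icons, per_page):
    """Split a folder's application icons across sub-interfaces
    (pages) of the target interface; per_page is illustrative."""
    if per_page <= 0:
        raise ValueError("per_page must be positive")
    return [icons[i:i + per_page] for i in range(0, len(icons), per_page)]
```

Each returned chunk corresponds to one sub-interface, so a folder with more icons than fit on one sub-interface simply yields more pages.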
5. A terminal device, characterized in that the terminal device comprises: a receiving unit, a processing unit, and a display unit;
the receiving unit is used for receiving a first input of a user;
the processing unit is configured to, in response to the first input received by the receiving unit, merge P objects of the M objects when at least one object of the M objects in a first interface of the terminal device is in another area, to obtain N objects, where the N objects are folders, one object is used to indicate at least two applications, the another area is an area of the first interface other than a target operation area preset by the terminal device, M, P and N are positive integers, P is less than or equal to M, and N is less than or equal to P;
the display unit is used for displaying the N objects and the M-P objects obtained by the processing unit in the target operation area;
the processing unit is specifically configured to determine, according to a target rule corresponding to the first input, an object that meets the target rule among the M objects to obtain N object groups, where the N object groups include the P objects; respectively merging the objects in each object group in the N object groups to obtain N objects;
the target rule comprises any one of single-row combination, every two-row combination, all-row combination, single-column combination, every two-column combination and all-column combination.
6. The terminal device according to claim 5, wherein the display unit is further configured to display the M objects at system default positions in response to the first input received by the receiving unit if each of the M objects is in the target operation region.
7. The terminal device of claim 5, wherein the first input is a shaking input of the terminal device by a user, and the N objects are N first folders;
the terminal further comprises: the device comprises an acquisition unit, an adding unit and a sorting unit;
the acquisition unit is used for acquiring the parameters of the shaking input before the display unit displays the N objects and the M-P objects in the target operation area, wherein the parameters comprise the shaking direction; acquiring a target display mode corresponding to the shaking direction according to the shaking direction;
the adding unit is used for adding a name to each of the N first folders obtained by the processing unit according to a preset naming mode;
the sorting unit is configured to sort the N first folders and the M-P objects according to the name of each first folder and the names of the M-P objects, so as to obtain a sorting result;
the display unit is specifically configured to display the N first folders and the M-P objects in the target operation area according to the sorting result obtained by the sorting unit and according to the target display manner.
8. The terminal device according to claim 7, wherein the receiving unit is further configured to receive a second input from the user after the displaying unit displays the N objects and the M-P objects in the target operation region;
the display unit is further configured to display, in response to the second input received by the receiving unit, a target interface in the target operation area in an overlapping manner, where the target interface is configured to display icons of at least two applications in a second folder, the target interface includes at least one sub-interface, the icons of the at least two applications are located on the at least one sub-interface, and the second folder is one of the N first folders.
9. A terminal device comprising a processor, a memory and a computer program stored on the memory and executable on the processor, the computer program, when executed by the processor, implementing the steps of the object display method according to any one of claims 1 to 4.
CN201810510386.9A 2018-05-24 2018-05-24 Object display method and terminal equipment Active CN108833679B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810510386.9A CN108833679B (en) 2018-05-24 2018-05-24 Object display method and terminal equipment


Publications (2)

Publication Number Publication Date
CN108833679A CN108833679A (en) 2018-11-16
CN108833679B true CN108833679B (en) 2021-02-23

Family

ID=64145511

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810510386.9A Active CN108833679B (en) 2018-05-24 2018-05-24 Object display method and terminal equipment

Country Status (1)

Country Link
CN (1) CN108833679B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110099296B (en) * 2019-03-27 2021-02-23 维沃移动通信有限公司 Information display method and terminal equipment
CN110069185B (en) * 2019-04-29 2021-04-23 上海连尚网络科技有限公司 Method and apparatus for co-running hosted applications

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100735376B1 (en) * 2006-03-29 2007-07-04 삼성전자주식회사 Method of executing fast application in mobile communication terminal
CN105573622A (en) * 2015-12-15 2016-05-11 广东欧珀移动通信有限公司 Single-hand control method and device of user interface and terminal device
CN105917300A (en) * 2014-01-20 2016-08-31 三星电子株式会社 User interface for touch devices
CN106293332A (en) * 2016-07-29 2017-01-04 维沃移动通信有限公司 The processing method of a kind of singlehanded location application and mobile terminal



Similar Documents

Publication Publication Date Title
WO2020258929A1 (en) Folder interface switching method and terminal device
CN109491738B (en) Terminal device control method and terminal device
CN108446058B (en) Mobile terminal operation method and mobile terminal
CN111338530B (en) Control method of application program icon and electronic equipment
CN109032486B (en) Display control method and terminal equipment
CN109710349B (en) Screen capturing method and mobile terminal
CN108920069B (en) Touch operation method and device, mobile terminal and storage medium
CN110752981B (en) Information control method and electronic equipment
CN110196668B (en) Information processing method and terminal equipment
CN110502162B (en) Folder creating method and terminal equipment
CN111064848B (en) Picture display method and electronic equipment
CN109933252B (en) Icon moving method and terminal equipment
CN108681427B (en) Access right control method and terminal equipment
CN111459349B (en) Application recommendation method and electronic equipment
CN110231972B (en) Message display method and terminal equipment
CN111026350A (en) Display control method and electronic equipment
CN108920040B (en) Application icon sorting method and mobile terminal
CN108762613B (en) State icon display method and mobile terminal
CN110989896A (en) Control method and electronic equipment
CN110647277A (en) Control method and terminal equipment
CN110703972A (en) File control method and electronic equipment
CN107728923B (en) Operation processing method and mobile terminal
CN110531903B (en) Screen display method, terminal device and storage medium
CN111338525A (en) Control method of electronic equipment and electronic equipment
CN109067975B (en) Contact person information management method and terminal equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant