CN108646958B - Application program starting method and terminal - Google Patents

Application program starting method and terminal

Info

Publication number
CN108646958B
CN108646958B (application number CN201810247839.3A)
Authority
CN
China
Prior art keywords
target
input
icon
control
area
Prior art date
Legal status
Active
Application number
CN201810247839.3A
Other languages
Chinese (zh)
Other versions
CN108646958A (en)
Inventor
Cui Xiaodong (崔晓东)
Current Assignee
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd
Priority to CN201810247839.3A
Publication of CN108646958A
Application granted
Publication of CN108646958B

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04817Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance using icons
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00Arrangements for program control, e.g. control units
    • G06F9/06Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44Arrangements for executing specific programs
    • G06F9/445Program loading or initiating
    • G06F9/44505Configuring for program initiating, e.g. using registry, configuration files

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Telephone Function (AREA)

Abstract

The invention provides an application program starting method and a terminal. The method includes: receiving a first input of a user; in response to the first input, linking a target control and a target icon displayed on a current interface, and updating the display of the target control and the target icon; acquiring a target operation characteristic of the first input; and starting a target avatar application program of the target icon associated with the target operation characteristic. In this way, by performing a first input that triggers the target control and the target icon displayed on the current interface, the user can trigger the terminal to start the target avatar application program of the target icon according to the target operation characteristic of the first input, which simplifies operation and saves time.

Description

Application program starting method and terminal
Technical Field
The embodiment of the invention relates to the technical field of communication, in particular to an application program starting method and a terminal.
Background
With the development of terminal technology, the types of application programs installed on a terminal are increasingly rich, and users place more and more demands on these applications. At present, a terminal provides an application avatar (app cloning) function, that is, two or more identical applications can be started on the same terminal and run simultaneously and independently, to meet the requirement that a user logs in with multiple accounts at the same time.
In an existing terminal, a user cannot tell, just by looking at an application icon, which account each copy of the same application is logged in to. Therefore, when the user wants to use the copy logged in with a certain account, the user has to start the application and check its interface to confirm, which easily leads to repeated operations and is cumbersome.
Disclosure of Invention
Embodiments of the present invention provide an application program starting method and a terminal, to solve the problems of repeated operations and cumbersome operation in existing application program starting methods.
In order to solve the technical problem, the invention is realized as follows:
in a first aspect, an embodiment of the present invention provides an application program starting method, where the method includes:
receiving a first input of a user;
in response to the first input, linking a target control and a target icon displayed on a current interface, and updating the display of the target control and the target icon;
acquiring a target operation characteristic of the first input;
and starting a target avatar application program of the target icon associated with the target operation characteristic.
In a second aspect, an embodiment of the present invention further provides a terminal, where the terminal includes:
the first receiving module is used for receiving a first input of a user;
the first response module is used for responding to the first input, linking a target control and a target icon displayed on the current interface and updating the display of the target control and the target icon;
the first acquisition module is used for acquiring the target operation characteristics of the first input;
and the starting module is used for starting the target avatar application program of the target icon associated with the target operation characteristic.
In a third aspect, an embodiment of the present invention further provides a terminal, where the terminal includes a processor, a memory, and a computer program stored on the memory and operable on the processor, and when the computer program is executed by the processor, the steps of the application starting method described above are implemented.
In a fourth aspect, the embodiment of the present invention further provides a computer-readable storage medium, where a computer program is stored on the computer-readable storage medium, and when the computer program is executed by a processor, the steps of the application starting method are implemented as described above.
In the embodiments of the present invention, a first input of a user is received; in response to the first input, a target control and a target icon displayed on a current interface are linked, and the display of the target control and the target icon is updated; a target operation characteristic of the first input is acquired; and a target avatar application program of the target icon associated with the target operation characteristic is started. In this way, by performing a first input that triggers the target control and the target icon displayed on the current interface, the user can start the target avatar application program of the target icon according to the target operation characteristic of the first input, which simplifies operation and saves time.
Drawings
Fig. 1 is a flowchart of an application starting method according to an embodiment of the present invention;
FIG. 2 is a schematic diagram of an interface provided by an embodiment of the present invention;
FIG. 3a is a second schematic view of an interface provided by an embodiment of the present invention;
FIG. 3b is a third schematic diagram of an interface provided by the embodiment of the present invention;
FIG. 4a is a fourth schematic view of an interface provided by an embodiment of the present invention;
FIG. 4b is a fifth schematic view of an interface provided by the present invention;
FIG. 5a is a sixth schematic view of an interface provided by an embodiment of the present invention;
FIG. 5b is a seventh schematic view of an interface provided by an embodiment of the present invention;
FIG. 6 is an eighth schematic view of an interface provided by an embodiment of the present invention;
FIG. 7 is a ninth schematic view of an interface provided by an embodiment of the present invention;
FIG. 8 is a tenth schematic view of an interface provided by an embodiment of the present invention;
fig. 9 is one of the structural diagrams of a terminal provided in the embodiment of the present invention;
fig. 10 is a second structural diagram of a terminal according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
The application program starting method provided by the embodiments of the present invention is mainly applied to a terminal and is used to start a target avatar application of a target icon. The terminal may be a mobile phone, a tablet personal computer, a laptop computer, a personal digital assistant (PDA), a mobile Internet device (MID), a wearable device, or the like.
The following describes an application startup method according to an embodiment of the present invention.
Referring to fig. 1, fig. 1 is a flowchart of an application starting method according to an embodiment of the present invention, and as shown in fig. 1, the application starting method according to the embodiment includes the following steps:
Step 101: receiving a first input of a user.
In this embodiment, the first input is used to trigger the terminal to start the target avatar application of the target icon. The first input may be a voice input, a gesture input, or a touch input, but is not limited thereto.
Specifically, step 101 may include:
a first input of a user in a display area of a display screen of the terminal is received.
In this way, the user can trigger the terminal to start the target avatar application of the target icon by performing the first input in the display area of the display screen of the terminal, which simplifies operation. In addition, compared with voice input or gesture input, touch input reduces misoperation and allows more accurate control; moreover, it diversifies the user's input operation modes.
In addition, it should be understood that, when the first input is represented as a touch input, the first input may be input by a finger of a user, or input by other tools, such as a stylus, and may be determined according to actual needs, which is not limited in the embodiment of the present invention.
In particular implementations, the first input may include (but is not limited to) at least one of:
dragging operation of a user on a target icon displayed on a current interface;
stretching operation of a user on a target control displayed on a current interface;
the user stretches the target control displayed on the current interface to the target icon and continues to drag the target icon or the target control;
dragging the target icon displayed on the current interface to the target control by the user, and continuing the operation of dragging the target icon; and so on.
The stretching operation performed by the user on the target control may specifically be: the user performs a sliding operation starting from a preset area of the target control, and during the sliding, the target control is stretched and deformed following the user's gesture. It should be noted that the preset area of the target control may be any area of the target control and may be determined according to actual needs, which is not limited in the embodiments of the present invention.
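To make the distinction between these input forms concrete, the following is a minimal Kotlin sketch (not part of the patent) that classifies the first input by where the touch sequence begins; the types and names (TouchPoint, Bounds, FirstInputKind, classifyFirstInput) are illustrative assumptions.

```kotlin
// Minimal sketch (not from the patent): classify the first input by its start point.
data class TouchPoint(val x: Float, val y: Float)

data class Bounds(val left: Float, val top: Float, val right: Float, val bottom: Float) {
    fun contains(p: TouchPoint) = p.x in left..right && p.y in top..bottom
}

enum class FirstInputKind { DRAG_TARGET_ICON, STRETCH_TARGET_CONTROL, OTHER }

// iconBounds: bounds of the target icon; controlPresetArea: the preset (grab) area of the target control.
fun classifyFirstInput(start: TouchPoint, iconBounds: Bounds, controlPresetArea: Bounds): FirstInputKind =
    when {
        iconBounds.contains(start) -> FirstInputKind.DRAG_TARGET_ICON              // drag starts on the icon
        controlPresetArea.contains(start) -> FirstInputKind.STRETCH_TARGET_CONTROL // stretch starts on the control
        else -> FirstInputKind.OTHER
    }
```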
Step 102: in response to the first input, linking a target control and a target icon displayed on the current interface, and updating the display of the target control and the target icon.
In this step, linking the target control and the target icon displayed on the current interface means that the terminal establishes a relationship between the target control displayed on the current interface and the target icon. Therefore, it can be understood that "link" may be replaced with any word that expresses that the target control is associated with the target icon, such as "bind" or "associate", which may be determined according to actual needs; this is not limited in the embodiments of the present invention.
In addition, after receiving the first input of the user, the terminal can respond to the first input and update the display of the target control and the target icon, so that the user can be prompted about the current input operation, and the user can conveniently determine whether the adjustment operation is needed.
Specifically, if the first input includes an operation of stretching the target control, the terminal may update the display form and the display position of the target control; if the first input includes an operation of dragging the target icon, the terminal may update the display position of the target icon, but is not limited thereto.
The target control may be a linear control, such as an arc-shaped control or a straight-line control, and the line width of the target control may range, for example, from 1 cm to 5 cm. For ease of understanding, referring to fig. 2, the target control 11 in fig. 2 is a straight-line control; in this case, the target control 11 may be named, but is not limited to, "Touch-line" or "Touch line", and may be interpreted, but is not limited to, as a touch line, a control line, an operation line, or an operation control.
It should be understood that the form and display position of the target control in fig. 2 are merely examples; in other embodiments, the target control may be a control of another shape, such as a circular control, a rectangular control, or a fan-shaped control. The display position of the target control may also be set according to actual needs and is not limited herein. In addition, the display form and the display position of the target control in the embodiments of the present invention can be changed.
In addition, it can be understood that the target control may be always displayed on the display screen of the terminal, such as being displayed on the display screen of the terminal in a floating manner, or may be displayed on the display screen of the terminal after receiving the user input.
In addition, the current interface may be any interface, such as a system desktop, an application interface, or a screen-off interface. In practical applications, when the current interface is the interface of an application other than application A whose avatar the user wants to start, the user can perform the first input on the current interface to trigger the terminal to quickly start the avatar application of application A. In this way, the user can quickly start an avatar application from any application interface, which is convenient to operate.
Step 103: acquiring the target operation characteristic of the first input.
In this step, the target operation characteristic may be expressed as an operation direction of the first input.
Specifically, if the first input includes a drag operation, the target operation characteristic may be expressed as the drag direction of the first input; if the first input includes a stretch operation, the target operation characteristic may be, but is not limited to, the stretch direction of the first input.
In addition, the target operation characteristic may also be expressed as the target area of the target control pointed to by the operation direction of the first input.
Step 104: starting a target avatar application of the target icon associated with the target operation characteristic.
In this embodiment, the target icon is an icon of an application program whose avatar function has been enabled. It should be understood that when the avatar function of an application is enabled, the terminal may generate one or more avatar applications with exactly the same functions as that application, so that the mobile terminal can run two or more applications with exactly the same functions at the same time, allowing the user to log in to different accounts. It should be noted that, in the embodiments of the present invention, any one of at least two application programs with identical functions is referred to as an "avatar application program".
Therefore, it can be understood that, in the present embodiment, the target icon is mapped with at least two avatar applications, that is, the user can start any avatar application of the mapped at least two avatar applications by operating the target icon.
In this step, the terminal may determine, as the target avatar application, an avatar application associated with the target operation feature from among the at least two avatar applications mapped by the target icon, and start the target avatar application.
The terminal may acquire, in advance, the association relationship between operation characteristics and the avatar applications mapped by the target icon. In this way, after the terminal obtains the target operation characteristic, it can determine, by looking up the association relationship, the avatar application associated with the target operation characteristic from among the at least two avatar applications mapped by the target icon as the target avatar application.
Specifically, the terminal may pre-establish the association relationship between operation characteristics and the avatar applications mapped by the target icon, which may specifically include:
acquiring the number k of avatar applications mapped by the target icon;
dividing the operation characteristic into k sub-operation characteristics;
respectively establishing association relationships between the k avatar applications mapped by the target icon and the k sub-operation characteristics.
It should be understood that, in the association relationship, the k avatar applications mapped by the target icon correspond to the k sub-operation characteristics one to one, where k is an integer greater than 1.
Of course, in other embodiments, the terminal may also receive, from another terminal, the association relationship between the operation characteristics and the avatar applications mapped by the target icon, but this is not limited thereto.
In addition, when the terminal determines to start the target avatar application, it may first run the target avatar application in the background, which reduces the time the user waits for the target avatar application's resources to load.
According to the application program starting method of this embodiment, a first input of a user is received; in response to the first input, a target control and a target icon displayed on a current interface are linked, and the display of the target control and the target icon is updated; a target operation characteristic of the first input is acquired; and a target avatar application of the target icon associated with the target operation characteristic is started. Therefore, by performing a first input that triggers the target control and the target icon displayed on the current interface, the user can trigger the terminal to start the target avatar application of the target icon according to the target operation characteristic of the first input, which simplifies operation and saves time.
In the embodiment of the present invention, the target control displayed on the current interface may be always displayed on the terminal display screen, for example, displayed on the terminal display screen in a floating manner, or displayed on the terminal display screen after receiving the user input.
Optionally, before step 101, the method further includes:
receiving a second input of the user;
in response to the second input, displaying a target control.
In this embodiment, the second input is used to trigger the terminal to display the target control. The second input may be a voice input, a gesture input, or a touch input to a display area of the terminal display screen.
The target control may be a preset control, or a control generated based on the input trajectory of the second input. For the specific form of the target control, refer to the description in the foregoing embodiments; details are not repeated here to avoid repetition.
Displaying the target control only when the second input of the user is received reduces the occlusion of the interface displayed on the terminal display screen by the target control, and thus reduces the impact of the target control on the user's browsing of the displayed content.
In particular implementations, the second input may include (but is not limited to) at least one of:
double-click operation of a user on the current interface;
long-time pressing operation of a user on the current interface;
sliding operation of a user on a current interface; and so on.
The current interface may include, but is not limited to, a system desktop, an application program interface, or a screen-off interface. For example, as shown in fig. 2, the user performs a pull-down-and-hover action from the top of the screen, that is, presses the top of the screen and, after sliding a certain distance in the direction indicated by the first arrow 21, keeps the finger still for a moment, so as to trigger the terminal to display the target control 11. The dwell time of the pull-down-and-hover process may be determined according to actual needs, which is not limited in the embodiments of the present invention.
It should be understood that the display form and the display position of the target control 11 in fig. 2 are only examples, and do not limit the display form and the display position of the target control according to the embodiment of the present invention.
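As a rough illustration of the pull-down-and-hover trigger described above, the following Kotlin sketch checks a sequence of touch samples against assumed thresholds (start near the top, minimum pull distance, minimum dwell time); none of these values or names come from the patent.

```kotlin
// Rough sketch with assumed thresholds: detect "press at the top, pull down a
// certain distance, then keep the finger still" from a list of touch samples.
import kotlin.math.abs

data class Sample(val y: Float, val timeMs: Long)

fun isPullDownAndHover(
    samples: List<Sample>,
    topEdgeY: Float = 50f,      // press must begin near the top of the screen (assumption)
    minDistance: Float = 200f,  // minimum pull-down distance in pixels (assumption)
    minHoverMs: Long = 300L     // minimum dwell time at the end of the gesture (assumption)
): Boolean {
    if (samples.size < 2) return false
    val first = samples.first()
    val last = samples.last()
    if (first.y > topEdgeY) return false              // did not start at the top
    if (last.y - first.y < minDistance) return false  // did not slide far enough downwards
    // Hover: the finger has stayed within 10 px of its final position for at least minHoverMs.
    val hoverStart = samples.takeLastWhile { abs(it.y - last.y) < 10f }.first()
    return last.timeMs - hoverStart.timeMs >= minHoverMs
}
```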
In the embodiments of the present invention, the display form of the target control is variable. Therefore, optionally, displaying the target control includes:
displaying a target control in a first state;
after the target control is displayed, the method further comprises:
receiving a third input of the user;
and in response to the third input, displaying the target control in a second state, and displaying N icons in a first display area enclosed by the target control in the second state and a side edge of the display screen of the terminal, where N is a positive integer.
In this embodiment, the third input is used to trigger the terminal to switch the display state of the target control from the first state to the second state, and to display N icons in the first display area enclosed by the target control and the side edge of the terminal display screen. The third input may be a voice input, a gesture input, or a touch input on the target control.
In particular implementations, the third input may include (but is not limited to) at least one of:
stretching operation of a user on the target control;
dragging operation of a user on the target control;
clicking operation of a user on the target control;
sliding operation of a user on a current interface; and so on.
For example, please refer to fig. 3a. In fig. 3a, reference numeral 111 designates the target control in the first state, and reference numeral 112 designates the target control in the second state. The target control in the first state is displayed as a straight line, and the target control in the second state is displayed as an arc.
As shown in fig. 3a, after the user triggers the terminal to display the target control in the first state, the user may slide on the current interface in the direction indicated by the second arrow 22 to trigger the terminal to display the target control in the second state, and 2 icons, namely a first icon 41 and a second icon 42, are displayed in the first display area 31 enclosed by the target control in the second state and the side edge of the display screen of the terminal.
The first icon 41 and/or the second icon 42 may be an icon of an application program whose avatar function has not been enabled, or an icon of an application program whose avatar function has been enabled. For an icon of an application program whose avatar function has been enabled, at least two avatar applications are mapped to that icon.
In addition, in the drawings of the embodiments of the present invention, the first icon 41 is represented as "○" containing the letter "A" and maps to application A, and the second icon 42 is represented as "○" containing the letter "B" and maps to application B.
Of course, in other embodiments, the terminal may also display the target control and display the N icons after receiving the second input from the user.
Optionally, the target control divides a display area of a display screen of the terminal into a first sub-area and a second sub-area, and displays N icons in the first sub-area.
For ease of understanding, please refer to FIG. 3 b. As shown in fig. 3b, the target control 11 divides the display area of the display screen of the terminal into a first sub-area 32 and a second sub-area 33, and displays 2 icons, namely a first icon 41 and a second icon 42, in the first sub-area 32.
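A simple way to realize such a split is sketched below in Kotlin, assuming the target control is a horizontal straight line at a given y coordinate and the N icons are laid out in one row inside the first sub-area; the layout and names are illustrative only.

```kotlin
// Simplified sketch (assumption: the target control is a horizontal straight line at
// controlY): the area above the line is the first sub-area, the area below it is the
// second sub-area, and the N icons are placed in one row inside the first sub-area.
data class AppIcon(val name: String)

fun layoutIconsInFirstSubArea(
    icons: List<AppIcon>,
    controlY: Float,
    iconSize: Float = 96f
): Map<AppIcon, Pair<Float, Float>> {
    val y = (controlY - iconSize).coerceAtLeast(0f)  // keep the row inside the first sub-area
    return icons.mapIndexed { i, icon -> icon to (i * iconSize to y) }.toMap()
}
```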
In the embodiment of the invention, the terminal can display the target control in the specific display area. Optionally, the display screen of the terminal is a special-shaped screen;
the display target control comprises:
displaying a target control in a second display area of the special-shaped screen;
wherein the second display area comprises two areas separated by a tip area of the shaped screen.
In this embodiment, the display screen of the terminal may be a special-shaped screen. For example, the structure of the terminal's special-shaped screen may be as shown in fig. 2, but is not limited thereto. In fig. 2, the special-shaped screen of the terminal (a mobile phone) may be called a screen with a notch ("bang") area; the bang area is a non-display area recessed at the top of the screen, used for accommodating a camera and the like, and may also be called the top area. The regions on both sides of the bang area may generally be called the ear areas.
The second display area includes the two areas separated by the top area of the special-shaped screen, namely the two ear areas separated by the bang area of the special-shaped screen.
In the embodiments of the present invention, the first input may take a plurality of forms.
Optionally, N icons are displayed in a first preset area corresponding to the target control;
step 101 may include:
receiving a first input of a user dragging a target icon of the N icons.
The first preset area may be the first display area 31 in fig. 3a, or may be the first sub-area 32 in fig. 3b, but is not limited thereto.
In this embodiment, if the terminal detects that the user drags the target icon of the N icons displayed on the current interface, the target control and the target icon displayed on the current interface may be linked, and the display position of the target icon may be updated.
Since the first input includes an operation of dragging the target icon, in the application scenario, the target operation characteristic of the first input may be represented as a dragging direction of the first input, or a target area of a target control pointed by the dragging direction of the first input.
In a scenario where the target operation characteristic is represented by the first input dragging direction, the terminal may determine, as the target avatar application, an avatar application associated with the first input dragging direction from among the at least two avatar applications mapped by the target icon, and start the target avatar application.
In a scenario where the target operation characteristic is expressed as the target area of the target control pointed to by the dragging direction of the first input, the terminal may determine, as the target avatar application, the avatar application associated with that target area from among the at least two avatar applications mapped by the target icon, and start it. In this way, the target avatar application can be started quickly through the first input as long as the target area is associated with a certain avatar application, and the user is not required to repeatedly confirm or search for the avatar application corresponding to the icon to be started.
Further, step 101 may be represented as:
and receiving a first input of dragging a target icon in the N icons to a target control by a user.
In this way, the terminal starts the target avatar application of the target icon associated with the target operation characteristic of the first input only when it detects that the user drags the target icon among the N icons so that the target icon is in contact with or overlaps the target control, which reduces the rate of misoperations.
In addition, since the first input causes the target icon to contact or have an overlapping region with the target control, in the application scenario, the target operation feature of the first input may appear as a target region of the target control pointed to by the first input. The terminal may determine, as the target avatar application, an avatar application associated with the target area of the target control, of the at least two avatar applications mapped by the target icon, and start the target avatar application.
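The area-based selection described above might look roughly like the following Kotlin sketch, in which the avatar application associated with the first control area that the dragged icon overlaps is chosen; the names and the overlap test are assumptions for illustration.

```kotlin
// Minimal sketch (illustrative names): when the dragged target icon overlaps an area
// of the target control, return the avatar application pre-associated with that area;
// if there is no contact or overlap, nothing is started.
data class Box(val left: Float, val top: Float, val right: Float, val bottom: Float) {
    fun overlaps(o: Box) = left < o.right && o.left < right && top < o.bottom && o.top < bottom
}

data class ControlArea(val id: Int, val bounds: Box)

fun avatarToStart(
    iconBounds: Box,
    controlAreas: List<ControlArea>,
    areaToAvatar: Map<Int, String>   // area id -> identifier of the associated avatar application (assumed)
): String? {
    val hitArea = controlAreas.firstOrNull { it.bounds.overlaps(iconBounds) } ?: return null
    return areaToAvatar[hitArea.id]  // the target avatar application to start, if any
}
```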
It should be noted that, in a specific practical application, as shown in fig. 4a, the first input may further be represented by an operation that the user drags the first icon 41 of the N icons to the target area of the target control, and continues to drag the target icon so as to change the display form of the target control.
Optionally, N icons are displayed in a second preset area corresponding to the target control;
step 101 may include:
and receiving a first input of a user for stretching a target control displayed on the current interface to a target icon in the N icons.
The second predetermined area may be the first display area 31 in fig. 3a or the first sub-area 32 in fig. 3b, but is not limited thereto. In addition, the second preset area corresponding to the target control may be the same area as the first preset area, or may be a different area, which may be determined according to actual needs.
In this embodiment, if the terminal detects an operation of stretching the target control displayed on the current interface to a target icon of the N icons, the target control displayed on the current interface and the target icon may be linked, and the display position and the display form of the target control may be updated.
Since the first input causes the target icon to contact or have an overlapping region with the target control, in this application scenario, the target operation feature of the first input may appear as a target region of the target control pointed to by the first input. The terminal may determine, as the target avatar application, an avatar application associated with the target area of the target control, of the at least two avatar applications mapped by the target icon, and start the target avatar application.
It should be noted that, in a specific practical application, as shown in fig. 4b, the first input may also be represented by an operation that the user stretches the target control displayed in the current interface to the first icon 41 in the N icons, and continues to drag the target icon or the target control in a certain direction.
In some implementations of the embodiment of the present invention, optionally, the first input is used to control the target icon to contact with or have an overlapping area with a target area of the target control, and when the first input operation is finished, the target icon is not separated from the target control;
the responding to the first input, linking a target control and a target icon displayed on the current interface, and updating the display of the target control and the target icon includes:
under the condition that the target icon is in contact with a target area of the target control or an overlapping area exists, the target control and the target icon displayed on the current interface are linked;
and updating the display form of the target control and the display position of the target icon in the operation process of the first input.
In this embodiment, the first input may be an operation of a user stretching a target control displayed on a current interface to a target icon, so that the target icon is in contact with a target area of the target control or has an overlapping area. In the application scenario, the terminal may link the target control and the target icon displayed on the current interface in response to the first input, and update the display form of the target control. It can be understood that when the display form of the target control changes, the display position of the target control changes accordingly, so that the terminal can update the display position of the target control.
The first input may also be an operation of dragging a target icon displayed on the current interface to a target control by a user, so that the target icon is in contact with a target area of the target control or has an overlapping area. In the application scenario, the terminal may link the target control and the target icon displayed on the current interface in response to the first input, and update a display position of the target icon.
Therefore, in the operation process of the first input, the display form of the target control and the display position of the target icon are updated, so that the current input operation of the user can be prompted, and the user can conveniently determine whether the adjustment operation is needed.
For example, the first input may be a connecting operation performed by the user between the target icon and the target control, that is, the input trajectory of the first input connects the target icon and the target control. In this application scenario, since neither the target icon nor the target control is directly operated by the first input, the positional relationship between them does not change, but this is not limited thereto. Of course, in other embodiments, the first input may not control the target icon to contact or overlap the target control at all.
In the embodiments of the present invention, based on the target operation characteristic of the first input, the terminal determines and starts, from among the at least two avatar applications mapped by the target icon, the target avatar application pre-associated with the target operation characteristic.
Optionally, the first input includes an operation of dragging a target icon;
the acquiring the target operation characteristic of the first input comprises:
acquiring the target area of the target control pointed to by the operation direction of the first input;
the starting a target avatar application of the target icon associated with the target operation characteristic comprises:
starting the target avatar application of the target icon associated with the target area.
In the present embodiment, the operation direction of the first input may be understood as a drag direction in which the user drags the target icon.
For example, the first input including the operation of dragging the target icon may be embodied as: the user drags a target icon among the N icons to a target area of the target control, and continues to drag the target icon so that the display form of the target control changes.
For ease of understanding, please refer to FIG. 4 a. In fig. 4a, it is assumed that the first icon 41 maps 3 divided applications, namely, a first divided application, a second divided application and a third divided application; the target control 11 includes a first area previously associated with the first avatar application, a second area previously associated with the second avatar application, and a third area previously associated with the third avatar application.
As shown in fig. 4a, if the user drags the first icon 41 to the first area of the target control 11, and continues to drag the first icon 41 in a certain direction, so that the display form of the target control 11 changes, and when the first input operation is finished, that is, when the user releases (stops dragging) the first icon 41, the first icon 41 is not separated from the target control 11, the terminal may start the first avatar application program mapped by the first icon 41 and pre-associated with the first area.
It should be noted that reference numerals 111, 112, and 113 in fig. 4a are all used to identify the target control 11, and the difference is that the target controls 11 in different states are identified, specifically, reference numeral 111 is used to identify the target control 11 in a first state, reference numeral 112 is used to identify the target control 11 in a second state, and reference numeral 113 is used to identify the target control 11 in a third state. The time sequence of the display state of the target control 11 is switched from the first state to the second state and then to the third state.
Therefore, by dragging the target icon to a target area of the target control, the user can trigger the terminal to start the target avatar application mapped by the target icon and associated with that target area, which simplifies operation.
Optionally, the first input includes an operation of stretching the target control;
the acquiring the target operation characteristic of the first input comprises:
acquiring the target area of the target control on which the first input operates;
the starting a target avatar application of the target icon associated with the target operation characteristic comprises:
starting the target avatar application of the target icon associated with the target area.
In this embodiment, if the user stretches the target control with a target area of the target control as the starting point, the avatar application pre-associated with that target area, among the avatar applications mapped by the target icon, may be determined as the target avatar application and started.
Therefore, by stretching the target control with a target area of the target control as the starting point, the user can trigger the terminal to start the target avatar application mapped by the target icon and associated with that target area, which simplifies operation.
Optionally, the first input includes an operation of stretching the target control;
the acquiring the target operation characteristic of the first input comprises:
acquiring a stretching direction of the first input;
the starting a target avatar application of the target icon associated with the target operation characteristic comprises:
starting the target avatar application of the target icon associated with the stretching direction.
In this embodiment, the stretching operation may be understood as follows: the user performs a sliding operation starting from the preset area of the target control, and during the sliding, the target control is stretched and deformed following the user's gesture.
Correspondingly, the stretching direction of the first input may be understood as the direction from the first display position of the preset area of the target control before the first input is received to the second display position of that preset area after the first input is received.
Therefore, by stretching the target control, the user can trigger the terminal to start the target avatar application mapped by the target icon and associated with the stretching direction of the target control, which simplifies operation.
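As an illustration of the stretching-direction case, the following Kotlin sketch computes the direction from the preset area's position before the first input to its position afterwards and maps it to an avatar application by equal direction sectors; the sector mapping and all names are assumptions rather than the patent's method.

```kotlin
// Illustrative sketch: the stretching direction is the direction from the preset
// area's first display position (before the first input) to its second display
// position (after the first input); here it is mapped to an avatar application by
// equal direction sectors.
import kotlin.math.atan2

data class Pos(val x: Float, val y: Float)

// Angle of the stretch in degrees, in [0, 360).
fun stretchAngle(before: Pos, after: Pos): Double {
    val deg = Math.toDegrees(atan2((after.y - before.y).toDouble(), (after.x - before.x).toDouble()))
    return (deg + 360.0) % 360.0
}

// Pick the avatar application whose direction sector contains the stretch angle.
fun avatarForDirection(angle: Double, avatarsBySector: List<String>): String {
    val sector = 360.0 / avatarsBySector.size
    val index = (angle / sector).toInt().coerceAtMost(avatarsBySector.size - 1)
    return avatarsBySector[index]
}
```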
Optionally, the first input includes an operation of dragging a target icon;
the acquiring the target operation characteristic of the first input comprises:
acquiring the dragging direction of the first input;
the starting a target avatar application of the target icon associated with the target operation characteristic comprises:
starting the target avatar application of the target icon associated with the dragging direction.
In this embodiment, if the user drags the target icon, the avatar application pre-associated with the dragging direction in which the target icon is dragged, among the avatar applications mapped by the target icon, may be determined as the target avatar application and started.
Therefore, by dragging the target icon in a target direction, the user can trigger the terminal to start the target avatar application mapped by the target icon and associated with that direction, which simplifies operation.
It should be noted that, when the first input includes both an operation of stretching the target control and an operation of dragging the target icon, in order to avoid ambiguity when the terminal starts the target avatar application, the terminal may uniformly handle such input in the same manner as a first input that only includes stretching the target control, or only includes dragging the target icon; this improves the accuracy and response speed with which the terminal starts the target avatar application.
For example, the first input includes a stretching operation of stretching the target control to the target icon, followed by a dragging operation that changes the display positions of the target icon and the target control.
In this embodiment, the first input including the operation of stretching the target control may be expressed as: the user stretches the target control displayed on the current interface to a target icon among the N icons, and continues to drag in a certain direction so that the display positions of the target icon and the target control change.
For ease of understanding, please refer to FIG. 4 b. In fig. 4b, it is assumed that the first icon 41 maps 3 divided applications, which are a first divided application, a second divided application and a third divided application, respectively, wherein the first divided application is pre-associated with a first direction, the second divided application is pre-associated with a second direction, and the third divided application is pre-associated with a third direction.
As shown in fig. 4b, if the user stretches the target control 11 to the first icon 41 and continues to perform the dragging operation along the second direction, so that the display positions of the first icon 41 and the target control 11 are changed, and when the first input operation is finished, that is, when the user releases (stops dragging), the first icon 41 is not separated from the target control 11, the terminal may start the second avatar application program mapped by the first icon 41 and pre-associated with the second direction.
It should be noted that reference numerals 111, 112, and 113 in fig. 4b are all used to identify the target control 11, and the difference is that the target controls 11 in different states are identified, specifically, reference numeral 111 is used to identify the target control 11 in a first state, reference numeral 112 is used to identify the target control 11 in a second state, and reference numeral 113 is used to identify the target control 11 in a third state. The time sequence of the display state of the target control 11 is switched from the first state to the second state and then to the third state.
Optionally, in this embodiment of the present invention, after step 101, the method further includes:
displaying, in a case that the target icon is in contact with or overlaps a target area of the target control, program detail information of the target avatar application of the target icon associated with the target area.
Optionally, the program detail information includes account information of the target avatar application of the target icon, and/or reminder task information, input by the user, related to the target avatar application of the target icon.
Through the program detail information, the terminal can prompt the user which target avatar application is mapped to the target area pointed to by the current operation, in case the user has forgotten which avatar application is mapped to which target area.
Preferably, in the case that the first input is that the user stretches the target control displayed on the current interface to a target icon of the N icons, the program detail information of the target avatar application corresponding to the stretching direction may also be displayed around the target control, and the same effect is achieved.
Specifically, the program detail information may include an account identifier of the target avatar application, such as a login account or a login name. Further, the program detail information may also include remark information of the target avatar application, such as reminder task information. For example, if the target icon is an icon of an instant messaging application, the reminder task information of the target avatar application may be expressed as text information prompting the user to contact a certain contact at a certain time, but is not limited thereto.
For convenience of understanding, referring to fig. 5a, if the user drags the first icon 41 to the first area of the target control 11, so that the first icon 41 is in contact with or has an overlapping area with the first area of the target control 11, the terminal may display the program detail information 51 of the first avatar application mapped by the first icon 41 associated with the first area of the target control 11.
Specifically, in fig. 5a, the content of the program detail information 51 includes: "Application A, avatar 1, encrypted", which indicates that the first area of the target control 11 is associated with the first avatar application mapped by the first icon 41, and that the first avatar application is an encrypted application.
Referring to fig. 5b, if the user stretches the target control 11 to the first icon 41, so that the first icon 41 contacts with or has an overlapping area with the second area of the target control 11, the terminal may display the program detail information 51 of the second avatar application program mapped by the first icon 41 associated with the second area of the target control 11.
Specifically, in fig. 5b, the content of the program detail information 51 includes: "Application A, avatar 2, unencrypted", which indicates that the second area of the target control 11 is associated with the second avatar application mapped by the first icon 41, and that the second avatar application is an unencrypted application.
It should be understood that the display positions and the included specific contents of the program detail information 51 in fig. 5a and 5b are merely examples, and are not limited thereto.
In this way, the user can be prompted with the program detail information of the target avatar application associated with the target area, so that the user can conveniently determine whether the avatar application of the target icon that he or she expects to start matches the target avatar application, and further determine whether the avatar application associated with the target area needs to be changed.
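A possible shape for the program detail information, with assumed field names, is sketched below in Kotlin; it simply gathers the label, account identifier, encryption state, and optional reminder text into one displayable string.

```kotlin
// Sketch with assumed field names: the program detail information could carry the
// avatar's label, login account, encryption state, and an optional reminder task.
data class ProgramDetail(
    val avatarLabel: String,        // e.g. "Application A, avatar 1"
    val loginAccount: String?,      // account identifier of the target avatar application
    val encrypted: Boolean,         // whether the avatar application is encrypted
    val reminder: String? = null    // reminder task information entered by the user
)

fun detailText(d: ProgramDetail): String = buildString {
    append(d.avatarLabel)
    d.loginAccount?.let { append(", account: ").append(it) }
    append(if (d.encrypted) ", encrypted" else ", unencrypted")
    d.reminder?.let { append("\nTo do: ").append(it) }
}
```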
In addition, in this embodiment, the terminal only needs the target area to be associated with a certain avatar application of the target icon, and it can start the target avatar application associated with the target area when the target icon is in contact with or overlaps the target area of the target control; the user is not required to look up the association relationships between the areas of the target control and the avatar applications mapped by the target icon, which simplifies the operation.
Similarly, in a case where the first input includes an operation of stretching the target control, the terminal may display program detail information of the target avatar application of the target icon associated with the stretching direction of the first input.
In this embodiment, the terminal only needs the stretching direction of the first input to be associated with a certain avatar application of the target icon, and it can then start the target avatar application associated with that stretching direction; the user is not required to look up the association relationship between each stretching direction and each avatar application mapped by the target icon, which simplifies the operation.
Meanwhile, the user can be prompted with the program detail information of the target avatar application associated with the stretching direction of the first input, so that the user can conveniently determine whether the avatar application of the target icon that he or she expects to start matches the target avatar application, and further determine whether the avatar application associated with the stretching direction of the first input needs to be changed.
Optionally, in some embodiments, in a case where the target icon is in contact with the target control or has an overlapping area, program detail information of the avatar application of the target icon is displayed.
Compared with the above embodiment, in the present embodiment, when the target icon is in contact with the target control or has an overlapping area, program detail information of a part or all of the avatar applications mapped by the target icon may be displayed.
Therefore, knowing the account logged in by each avatar application and/or its pending tasks, the user can conveniently perform the corresponding operation for the avatar application to be started according to his or her own needs, which reduces the rate of repeated operations and the processing burden of the terminal.
Optionally, after displaying the program detail information of the target avatar application of the target icon associated with the target area, the method further includes:
receiving a fourth input of the user on the program detail information;
updating the display content of the program detail information in response to the fourth input.
The user can edit the display content of the program detail information, for example to record to-do items for the corresponding target avatar application, and will be reminded of the to-do items shown in the program detail information the next time the target avatar application is started, thereby effectively reminding the user of pending matters.
The fourth input is used to trigger the terminal to update the display content of the program detail information. The fourth input may be a voice input, a gesture input, or a touch input, but is not limited thereto.
In particular implementations, the fourth input may include (but is not limited to) at least one of:
an operation of the user modifying the program detail information;
an operation of the user adding program detail information;
an operation of the user deleting the program detail information; and so on.
As shown in fig. 6, the user can click the program detail information 51 to change its display content. Specifically, in fig. 6, the updated content of the program detail information 51 includes remark information such as "send a reminder message to Wang at 10 o'clock".
It should be understood that the display position and the included specific content of the program detail information 51 in fig. 6 are merely examples, and are not limited thereto.
Therefore, the user can trigger the terminal to update the display content of the program detail information according to his or her actual needs, for example by recording corresponding to-do items, which increases interest, meets the user's actual needs, and effectively reminds the user.
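The fourth-input editing described above could be modelled as in the following Kotlin sketch, where a reminder text per avatar application is added, modified, or deleted; the store and its names are illustrative assumptions.

```kotlin
// Minimal sketch (names assumed): the fourth input adds, modifies, or deletes the
// reminder text stored for one avatar application, so that the note is shown again
// the next time that avatar application is started.
class ReminderStore {
    private val reminders = mutableMapOf<String, String>()   // avatar id -> reminder text

    fun update(avatarId: String, newText: String?) {
        if (newText.isNullOrBlank()) reminders.remove(avatarId)  // deletion
        else reminders[avatarId] = newText                       // addition or modification
    }

    fun reminderFor(avatarId: String): String? = reminders[avatarId]
}
```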
Optionally, after step 101, the method further includes:
and under the condition that the target control is separated from the target icon when the first input operation is finished, canceling the starting of the target body-splitting application program of the target icon.
For example, please refer to fig. 7. In fig. 7, if the user wants to cancel the starting of the target avatar application of the first icon 41 midway, the user may drag the first icon 41 into the first display area 31 enclosed by the target control 11 and the side of the display screen of the terminal, and detach the first icon 41 from the target control 11, so as to trigger the terminal to cancel the starting of the target avatar application.
It should be noted that reference numerals 111, 112, 113, and 114 in fig. 7 are all used to identify the target control 11, and the difference is that the target controls 11 in different states are identified, specifically, reference numeral 111 is used to identify the target control 11 in a first state, reference numeral 112 is used to identify the target control 11 in a second state, reference numeral 113 is used to identify the target control 11 in a third state, and reference numeral 114 is used to identify the target control 11 in a fourth state. The time sequence of the display state of the target control 11 is switched from the first state to the second state, then to the third state, and then to the fourth state.
It should be understood that fig. 7 is only an example, and in any case that the target control is separated from the target icon at the end of the first input operation, the terminal may cancel the launch of the target avatar application of the target icon.
It should be noted that, for a scenario in which the first input does not control the target icon to come into contact with or overlap the target control, the terminal may cancel the starting of the target avatar application of the target icon when it detects that the user drags the target icon back to the initial display position of the target icon, or to a position whose distance from the initial display position of the target icon is smaller than a preset value.
Therefore, compared with a case in which the terminal first starts a target avatar application that the user does not expect to start and then closes it, canceling the starting of the target avatar application reduces the processing burden of the terminal and reduces its power consumption.
In addition, the user can cancel, at any time, an application that temporarily does not need to be started, and this can be done in a single step, which is convenient and fast.
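As a rough illustration of the cancel logic described above, the following Kotlin sketch checks, when the first input ends, whether the target icon is still in contact with or overlapping the target control before launching; the types and function names here (Rect, onFirstInputEnded) are illustrative assumptions only.

```kotlin
// Hypothetical sketch: at the end of the first input, launch the target avatar
// application only if the icon is still attached to the target control.
data class Rect(val left: Int, val top: Int, val right: Int, val bottom: Int) {
    // Contact or overlap: touching edges count as contact.
    fun touchesOrOverlaps(other: Rect): Boolean =
        left <= other.right && other.left <= right &&
        top <= other.bottom && other.top <= bottom
}

fun onFirstInputEnded(
    iconBounds: Rect,
    controlBounds: Rect,
    launchAvatar: () -> Unit,
    cancelLaunch: () -> Unit
) {
    if (iconBounds.touchesOrOverlaps(controlBounds)) {
        launchAvatar()   // icon still attached: start the target avatar application
    } else {
        cancelLaunch()   // icon dragged away, e.g. back into the first display area 31
    }
}
```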
In the embodiment of the present invention, in a case where the target icon is in contact with the target control or has an overlapping area with it, attribute information of the target icon may further be displayed, that is, whether the target icon is an icon of an application whose avatar function has been enabled or an icon of an application whose avatar function has not been enabled, so that the user can perform the corresponding operation according to the avatar application that the user needs to start.
The attribute information of the target icon can be expressed as a number: if the attribute information of the target icon is "1", the application mapped by the target icon has no avatar; if the attribute information of the target icon is "2", the target icon maps to 2 avatar applications, and so on. However, it should be understood that the present invention does not limit the representation of the attribute information of the target icon; for example, the attribute information may also be represented as "with avatar" and "without avatar", and may be determined according to actual needs, which is not limited in the embodiment of the present invention.
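Purely as an illustration of the numeric convention above, and not of the disclosed implementation, a label for the attribute information could be derived as in the following sketch, where avatarCount is an assumed per-icon count of avatar applications.

```kotlin
// Illustrative sketch only: render the attribute information of a target icon,
// "1" when the application has no avatar yet, otherwise the number of avatar
// applications mapped by the icon; a textual convention is shown as well.
fun attributeLabel(avatarCount: Int, numeric: Boolean = true): String =
    when {
        numeric && avatarCount == 0 -> "1"                      // no avatar started yet
        numeric                     -> avatarCount.toString()   // e.g. "2" = 2 avatar applications
        avatarCount == 0            -> "without avatar"         // textual alternative
        else                        -> "with avatar"
    }
```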
Optionally, the target icon is an icon of an application program which does not start the body-separating function;
after receiving the first input of the user, the method further comprises:
displaying a split account adding control in a third preset area of the target icon;
receiving a fifth input of the user on the split account adding control;
displaying an account number input box in response to the fifth input;
acquiring account information input by a user in the account input box;
and creating a first body-splitting application program of the target icon, and logging in the first body-splitting application program through the account information.
For ease of understanding, please refer to fig. 8. In fig. 8, the third icon 43 is an icon of an application program that does not start the split function, the terminal may display a split account addition control 61 above the third icon 43, and the user may trigger the terminal to display an account input box by clicking, long-pressing, re-pressing, double-clicking, or sliding the split account addition control 61. Therefore, the user can input account information, such as an account and a password, in the account input box, trigger the terminal to create the first body-splitting application program of the third icon 43, and log in the first body-splitting application program through the account information.
It should be understood that the representation and the display position of the split account addition control 61 in fig. 8 are only examples, and the present invention does not limit the representation and the display position of the split account addition control 61.
Therefore, compared with the prior art, in which the user needs to enable the avatar function of the application of the target icon on a settings interface, then enter the interface of the created first avatar application, input the account information, and log in the first avatar application with the account information, this implementation can effectively simplify the user operation.
In addition, it should be noted that, when the target icon is an icon of an application whose avatar function has already been enabled, the terminal may also display the split account adding control in the third preset area of the target icon after receiving the first input of the user. The user can then operate the split account adding control to create a new avatar application of the target icon and directly log in to the new avatar application through the input account information, without entering a setting interface to perform cumbersome operations such as setting and saving, which is convenient and fast and simplifies the user operation.
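A minimal sketch of this flow, assuming a hypothetical AvatarManager interface (createAvatar and login are illustrative stand-ins, not a real system API), might look as follows.

```kotlin
// Hypothetical sketch: the fifth input confirms the account input box, after
// which a new avatar of the application behind the target icon is created and
// logged in directly with the entered account information.
data class Account(val name: String, val password: String)

interface AvatarManager {
    fun createAvatar(packageName: String): String           // returns the new avatar id
    fun login(avatarId: String, account: Account): Boolean
}

fun onAvatarAccountConfirmed(
    manager: AvatarManager,
    packageName: String,
    account: Account
): Boolean {
    val avatarId = manager.createAvatar(packageName)        // create the first avatar application
    return manager.login(avatarId, account)                 // log in without visiting a settings interface
}
```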
Optionally, the linking the target control and the target icon displayed on the current interface includes:
and under the condition that the target icon is in contact with or has an overlapping area with the target area of the target control, locking the target icon in the target area of the target control.
Therefore, whether the user drags the target icon or the target control, the target icon and the target control are controlled to move together and are prevented from separating, which reduces separation of the target icon and the target control caused by dragging and thus reduces the probability that the starting of the target avatar application is canceled unintentionally.
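The locking behaviour can be pictured as in the brief sketch below; LinkedPair and its fields are illustrative assumptions, intended only to show that once locked, the icon position is derived from the control position so the two cannot drift apart.

```kotlin
// Hypothetical sketch: after the icon touches or overlaps the target area,
// lock it to the control so that dragging either element moves both together.
class LinkedPair(
    private var controlPos: Pair<Int, Int>,     // current position of the target control
    private val iconOffset: Pair<Int, Int>      // fixed offset of the icon inside the target area
) {
    fun iconPos(): Pair<Int, Int> =
        Pair(controlPos.first + iconOffset.first, controlPos.second + iconOffset.second)

    // Any drag, on the icon or on the control, is applied to the shared position.
    fun dragBy(dx: Int, dy: Int) {
        controlPos = Pair(controlPos.first + dx, controlPos.second + dy)
    }
}
```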
Optionally, the linking the target control and the target icon displayed on the current interface includes:
displaying a locking identifier in a fourth preset area of the target icon; the locking identifier is used to indicate that the target icon is the currently selected icon whose avatar application is to be started.
Therefore, the user can be prompted that the target icon is currently selected, so that the user can conveniently determine whether it is the icon the user expects to select and whether the operation needs to be changed, thereby reducing the misoperation rate.
It should be noted that, various optional implementations described in the embodiments of the present invention may be implemented in combination with each other or implemented separately, and the embodiments of the present invention are not limited thereto.
In addition, it should be understood that fig. 2, 3a, 4a, 5a, 6 and 7 may be combined into a complete embodiment; fig. 2, 4b, 5b and 8 may be combined into another complete embodiment.
Referring to fig. 9, fig. 9 is a structural diagram of a terminal according to an embodiment of the present invention, and as shown in fig. 9, a terminal 900 includes: a first receiving module 901, a first responding module 902, a first obtaining module 903 and a starting module 904.
The first receiving module 901 is configured to receive a first input of a user;
a first response module 902, configured to, in response to the first input, link a target control and a target icon displayed on a current interface, and update display of the target control and the target icon;
a first obtaining module 903, configured to obtain a target operation characteristic of the first input;
a starting module 904, configured to start a target avatar application of the target icon associated with the target operating feature.
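For orientation only, the cooperation of the four modules can be sketched as a small pipeline in Kotlin; Terminal900Sketch, FirstInput, and OperationFeature are names invented for this illustration and do not correspond to actual code of the terminal.

```kotlin
// Hypothetical sketch of how modules 901-904 hand data to one another.
class FirstInput              // placeholder for the received first input (drag / stretch)
class OperationFeature        // placeholder for the target operation feature

class Terminal900Sketch(
    private val receiveFirstInput: () -> FirstInput,                      // first receiving module 901
    private val linkAndUpdateDisplay: (FirstInput) -> Unit,               // first response module 902
    private val obtainOperationFeature: (FirstInput) -> OperationFeature, // first obtaining module 903
    private val startAvatar: (OperationFeature) -> Unit                   // starting module 904
) {
    fun run() {
        val input = receiveFirstInput()
        linkAndUpdateDisplay(input)          // link target control and target icon, refresh display
        val feature = obtainOperationFeature(input)
        startAvatar(feature)                 // launch the avatar application associated with the feature
    }
}
```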
On the basis of fig. 9, the following describes modules further included in the terminal 900, sub-modules included in each module, and/or units included in the sub-modules.
Optionally, the terminal 900 further includes:
the second receiving module is used for receiving the second input of the user before receiving the first input of the user;
a second response module to display a target control in response to the second input.
Optionally, the second response module is specifically configured to: in response to the second input, displaying a target control in a first state;
the terminal 900 further includes:
the third receiving module is used for receiving a third input of the user after the target control is displayed;
and the third response module is used for responding to the third input, displaying the target control in the second state, and displaying N icons in a first display area formed by enclosing the target control in the second state and the side edge of the display screen of the terminal, wherein N is a positive integer.
Optionally, the target control divides a display area of a display screen of the terminal into a first sub-area and a second sub-area, and displays N icons in the first sub-area.
Optionally, the display screen of the terminal is a special-shaped screen;
the second response module is specifically configured to: in response to the second input, displaying a target control in a second display area of the shaped screen;
wherein the second display area comprises two areas separated by a tip area of the shaped screen.
Optionally, a first preset region corresponding to the target control displays N icons;
the first receiving module 901 is specifically configured to:
receiving a first input of a user dragging a target icon of the N icons.
Optionally, a second preset region corresponding to the target control displays N icons;
the first receiving module 901 is specifically configured to:
and receiving a first input of a user for stretching a target control displayed on the current interface to a target icon in the N icons.
Optionally, the first input is used to control the target icon to contact with or overlap with a target area of the target control, and when the first input operation is finished, the target icon is not separated from the target control;
the first response module 902 includes:
the linking module is used for linking the target control and the target icon displayed on the current interface in a case where the target icon is in contact with or has an overlapping area with the target area of the target control;
and the updating module is used for updating the display form of the target control and the display position of the target icon in the operation process of the first input.
Optionally, the first input includes an operation of dragging a target icon;
the first obtaining module 903 is specifically configured to:
acquiring a target area of the target control pointed by the first input operation direction;
the starting module 904 is specifically configured to:
and starting a target body-separating application program of the target icon associated with the target area.
Optionally, the first input includes an operation of stretching the target control;
the first obtaining module 903 is specifically configured to:
acquiring a stretching direction of the first input;
the starting module 904 is specifically configured to:
and starting a target body-splitting application program of the target icon associated with the stretching direction.
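Both variants reduce to looking up an avatar application by the operation feature of the first input; the following sketch shows one possible lookup table, where OperationKey, AvatarResolver, and the enum constants are assumptions of this illustration rather than the disclosed implementation.

```kotlin
// Hypothetical sketch: the drag's target area or the control's stretching
// direction selects which avatar application of the icon to start.
enum class OperationKey { AREA_LEFT, AREA_RIGHT, STRETCH_UP, STRETCH_DOWN }

class AvatarResolver(private val table: Map<String, Map<OperationKey, String>>) {
    // Returns the avatar application id mapped to this icon and feature,
    // or null if no avatar application has been associated with it yet.
    fun resolve(iconId: String, key: OperationKey): String? = table[iconId]?.get(key)
}

// Usage: the obtaining module extracts the feature, the starting module launches it.
fun launchForFeature(
    resolver: AvatarResolver,
    iconId: String,
    key: OperationKey,
    start: (String) -> Unit
) {
    resolver.resolve(iconId, key)?.let(start)
}
```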
Optionally, the terminal 900 further includes:
the first display module is used for displaying the program detail information of the target individual application program of the target icon associated with the target area under the condition that the target icon is in contact with the target area of the target control or has an overlapped area after receiving the first input of the user.
Optionally, the terminal 900 further includes:
a fourth receiving module, configured to receive a fourth input of the user on the program detail information after the program detail information of the target avatar application of the target icon associated with the target area is displayed;
and the fourth response module is used for responding to the fourth input and updating the display content of the program detail information.
Optionally, the program detail information includes avatar account information of the target avatar application of the target icon, and/or reminding task information, input by the user, associated with the target avatar application of the target icon.
Optionally, the terminal 900 further includes:
and the canceling module is used for canceling the starting of the target body-splitting application program of the target icon under the condition that the target control is separated from the target icon when the first input operation is finished after the first input of the user is received.
Optionally, the target icon is an icon of an application program which does not start the body-separating function;
the terminal 900 further includes:
the second display module is used for displaying the split account adding control in a third preset area of the target icon after receiving the first input of the user;
the fifth receiving module is used for receiving a fifth input of the user on the split account adding control;
a fifth response module, configured to respond to the fifth input and display an account entry box;
the second acquisition module is used for acquiring the account information input by the user in the account input box;
and the creating module is used for creating a first avatar application of the target icon and logging in the first avatar application through the account information.
Optionally, the target control is a linear control.
Optionally, the first response module 902 is specifically configured to:
and responding to the first input, locking the target icon in the target area of the target control under the condition that the target icon is in contact with the target area of the target control or has an overlapping area, and updating the display of the target control and the target icon.
Optionally, the first response module 902 is specifically configured to:
responding to the first input, displaying a locking identifier in a fourth preset area of a target icon, and updating the display of the target control and the target icon;
the locking identifier is used to indicate that the target icon is the currently selected icon whose avatar application is to be started.
The terminal 900 can implement each process in the method embodiment of the present invention and achieve the same beneficial effects, and is not described herein again to avoid repetition.
Referring to fig. 10, fig. 10 is a structural diagram of a terminal according to another embodiment of the present invention, where the terminal may be a hardware structural diagram of a terminal for implementing various embodiments of the present invention. As shown in FIG. 10, terminal 1000 can include, but is not limited to: a radio frequency unit 1001, a network module 1002, an audio output unit 1003, an input unit 1004, a sensor 1005, a display unit 1006, a user input unit 1007, an interface unit 1008, a memory 1009, a processor 1010, and a power supply 1011. Those skilled in the art will appreciate that the terminal configuration shown in fig. 10 is not intended to be limiting, and that the terminal may include more or fewer components than shown, or some components may be combined, or a different arrangement of components. In the embodiment of the present invention, the terminal includes, but is not limited to, a mobile phone, a tablet computer, a notebook computer, a palm computer, a vehicle-mounted terminal, a wearable device, a pedometer, and the like.
Wherein, the processor 1010 is configured to: receiving a first input of a user; responding to the first input, linking a target control and a target icon displayed on a current interface, and updating the display of the target control and the target icon; acquiring a target operation characteristic of the first input; and starting a target body-splitting application program of the target icon associated with the target operation characteristic.
Optionally, the processor 1010 is further configured to: receiving a second input of the user; in response to the second input, displaying a target control.
Optionally, the processor 1010 is further configured to: displaying a target control in a first state; receiving a third input of the user; and responding to the third input, displaying a target control in a second state, and displaying N icons in a first display area formed by enclosing the target control in the second state and the side edge of the display screen of the terminal, wherein N is a positive integer.
Optionally, the target control divides a display area of a display screen of the terminal into a first sub-area and a second sub-area, and displays N icons in the first sub-area.
Optionally, the display screen of the terminal is a special-shaped screen; a processor 1010, further configured to: displaying a target control in a second display area of the special-shaped screen; wherein the second display area comprises two areas separated by a tip area of the shaped screen.
Optionally, a first preset region corresponding to the target control displays N icons; a processor 1010, further configured to: receiving a first input of a user dragging a target icon of the N icons.
Optionally, a second preset region corresponding to the target control displays N icons; a processor 1010, further configured to: and receiving a first input of a user for stretching a target control displayed on the current interface to a target icon in the N icons.
Optionally, the first input is used to control the target icon to contact with or overlap with a target area of the target control, and when the first input operation is finished, the target icon is not separated from the target control; a processor 1010, further configured to: under the condition that the target icon is in contact with a target area of the target control or an overlapping area exists, the target control and the target icon displayed on the current interface are linked; and updating the display form of the target control and the display position of the target icon in the operation process of the first input.
Optionally, the first input includes an operation of dragging a target icon; a processor 1010, further configured to: acquiring a target area of the target control pointed by the first input operation direction; and starting a target body-separating application program of the target icon associated with the target area.
Optionally, the first input includes an operation of stretching the target control; a processor 1010, further configured to: acquiring a stretching direction of the first input; and starting a target body-splitting application program of the target icon associated with the stretching direction.
Optionally, the processor 1010 is further configured to: in a case where the target icon is in contact with or has an overlapping area with the target area of the target control, display the program detail information of the target avatar application of the target icon associated with the target area.
Optionally, the processor 1010 is further configured to: receiving a fourth input of the user on the program detail information; updating the display content of the program detail information in response to the fourth input.
Optionally, the program detail information includes avatar account information of the target avatar application of the target icon, and/or reminding task information, input by the user, associated with the target avatar application of the target icon.
Optionally, the processor 1010 is further configured to: and under the condition that the target control is separated from the target icon when the first input operation is finished, canceling the starting of the target body-splitting application program of the target icon.
Optionally, the target icon is an icon of an application program which does not start the body-separating function; a processor 1010, further configured to: displaying a split account adding control in a third preset area of the target icon; receiving a fifth input of the user on the split account adding control; displaying an account number input box in response to the fifth input; acquiring account information input by a user in the account input box; and creating a first body-splitting application program of the target icon, and logging in the first body-splitting application program through the account information.
Optionally, the target control is a linear control.
Optionally, the processor 1010 is further configured to: and under the condition that the target icon is in contact with or has an overlapping area with the target area of the target control, locking the target icon in the target area of the target control.
Optionally, the processor 1010 is further configured to: display a locking identifier in a fourth preset area of the target icon; the locking identifier is used to indicate that the target icon is the currently selected icon whose avatar application is to be started.
It should be noted that, in this embodiment, the terminal 1000 may implement each process in the method embodiment of the present invention and achieve the same beneficial effects, and for avoiding repetition, details are not described here.
It should be understood that, in the embodiment of the present invention, the radio frequency unit 1001 may be used for receiving and sending signals during a message transmission or a call; specifically, the radio frequency unit 1001 receives downlink data from a base station and then delivers the received downlink data to the processor 1010 for processing, and also transmits uplink data to the base station. In general, the radio frequency unit 1001 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like. Further, the radio frequency unit 1001 may also communicate with a network and other devices through a wireless communication system.
The terminal provides the user with wireless broadband internet access through the network module 1002, such as helping the user send and receive e-mails, browse webpages, access streaming media, and the like.
The audio output unit 1003 may convert audio data received by the radio frequency unit 1001 or the network module 1002 or stored in the memory 1009 into an audio signal and output as sound. Also, the audio output unit 1003 can provide audio output (e.g., a call signal reception sound, a message reception sound, etc.) related to a specific function performed by the terminal 1000. The audio output unit 1003 includes a speaker, a buzzer, a receiver, and the like.
The input unit 1004 is used to receive an audio or video signal. The input Unit 1004 may include a Graphics Processing Unit (GPU) 10041 and a microphone 10042, the Graphics processor 10041 Processing image data of still pictures or video obtained by an image capturing device (such as a camera) in a video capturing mode or an image capturing mode. The processed image frames may be displayed on the display unit 1006. The image frames processed by the graphic processor 10041 may be stored in the memory 1009 (or other storage medium) or transmitted via the radio frequency unit 1001 or the network module 1002. The microphone 10042 can receive sound and can process such sound into audio data. The processed audio data may be converted into a format output transmittable to a mobile communication base station via the radio frequency unit 1001 in case of a phone call mode.
Terminal 1000 can also include at least one sensor 1005 such as a light sensor, motion sensor, and other sensors. Specifically, the light sensor includes an ambient light sensor that can adjust the brightness of the display panel 10061 according to the brightness of ambient light, and a proximity sensor that can turn off the display panel 10061 and/or a backlight when the terminal 1000 moves to the ear. As one of the motion sensors, the accelerometer sensor can detect the magnitude of acceleration in each direction (generally three axes), detect the magnitude and direction of gravity when stationary, and can be used to identify the terminal posture (such as horizontal and vertical screen switching, related games, magnetometer posture calibration), vibration identification related functions (such as pedometer, tapping), and the like; the sensors 1005 may also include a fingerprint sensor, a pressure sensor, an iris sensor, a molecular sensor, a gyroscope, a barometer, a hygrometer, a thermometer, an infrared sensor, etc., which will not be described in detail herein.
The display unit 1006 is used to display information input by the user or information provided to the user. The Display unit 1006 may include a Display panel 10061, and the Display panel 10061 may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED), or the like.
The user input unit 1007 may be used to receive input numeric or character information and generate key signal inputs related to user settings and function control of the terminal. Specifically, the user input unit 1007 includes a touch panel 10071 and other input devices 10072. The touch panel 10071, also referred to as a touch screen, may collect touch operations by a user on or near the touch panel 10071 (e.g., operations by a user on or near the touch panel 10071 using a finger, a stylus, or any other suitable object or attachment). The touch panel 10071 may include two parts, a touch detection device and a touch controller. The touch detection device detects the touch direction of a user, detects a signal brought by touch operation and transmits the signal to the touch controller; the touch controller receives touch information from the touch sensing device, converts the touch information into touch point coordinates, sends the touch point coordinates to the processor 1010, and receives and executes commands sent by the processor 1010. In addition, the touch panel 10071 may be implemented by various types, such as a resistive type, a capacitive type, an infrared ray, and a surface acoustic wave. In addition to the touch panel 10071, the user input unit 1007 can include other input devices 10072. Specifically, the other input devices 10072 may include, but are not limited to, a physical keyboard, function keys (such as volume control keys, switch keys, etc.), a track ball, a mouse, and a joystick, which are not described herein again.
Further, the touch panel 10071 can be overlaid on the display panel 10061, and when the touch panel 10071 detects a touch operation thereon or nearby, the touch operation is transmitted to the processor 1010 to determine the type of the touch event, and then the processor 1010 provides a corresponding visual output on the display panel 10061 according to the type of the touch event. Although in fig. 10, the touch panel 10071 and the display panel 10061 are two independent components for implementing the input and output functions of the terminal, in some embodiments, the touch panel 10071 and the display panel 10061 may be integrated for implementing the input and output functions of the terminal, which is not limited herein.
Interface unit 1008 is an interface for connecting an external device to terminal 1000. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. Interface unit 1008 can be used to receive input from external devices (e.g., data information, power, etc.) and transmit the received input to one or more elements within terminal 1000 or can be used to transmit data between terminal 1000 and external devices.
The memory 1009 may be used to store software programs as well as various data. The memory 1009 may mainly include a program storage area and a data storage area, wherein the program storage area may store an operating system, an application program required by at least one function (such as a sound playing function, an image playing function, and the like), and the like; the storage data area may store data (such as audio data, a phonebook, etc.) created according to the use of the cellular phone, and the like. Further, the memory 1009 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other volatile solid state storage device.
The processor 1010 is a control center of the terminal, connects various parts of the entire terminal using various interfaces and lines, and performs various functions of the terminal and processes data by operating or executing software programs and/or modules stored in the memory 1009 and calling data stored in the memory 1009, thereby integrally monitoring the terminal. Processor 1010 may include one or more processing units; preferably, the processor 1010 may integrate an application processor, which mainly handles operating systems, user interfaces, application programs, etc., and a modem processor, which mainly handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into processor 1010.
Terminal 1000 can also include a power supply 1011 (e.g., a battery) for powering the various components, and preferably, power supply 1011 can be logically coupled to processor 1010 through a power management system that provides management of charging, discharging, and power consumption.
In addition, terminal 1000 can include some functional blocks not shown, which are not described herein.
Preferably, an embodiment of the present invention further provides a terminal, including a processor 1010, a memory 1009, and a computer program stored in the memory 1009 and capable of running on the processor 1010, where the computer program, when executed by the processor 1010, implements each process of the above-mentioned application program starting method embodiment, and can achieve the same technical effect, and in order to avoid repetition, details are not described here again.
The embodiment of the present invention further provides a computer-readable storage medium, where a computer program is stored on the computer-readable storage medium, and when the computer program is executed by a processor, the computer program implements each process of the above-mentioned application program starting method embodiment, and can achieve the same technical effect, and in order to avoid repetition, details are not repeated here. The computer-readable storage medium may be a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present invention may be embodied in the form of a software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the method according to the embodiments of the present invention.
While the present invention has been described with reference to the embodiments shown in the drawings, the present invention is not limited to the embodiments, which are illustrative and not restrictive, and it will be apparent to those skilled in the art that various changes and modifications can be made therein without departing from the spirit and scope of the invention as defined in the appended claims.

Claims (38)

1. An application program starting method is applied to a terminal and is characterized by comprising the following steps:
receiving a first input of a user;
responding to the first input, linking a target control and a target icon displayed on a current interface, and updating the display of the target control and the target icon;
acquiring a target operation characteristic of the first input;
starting a target body-separating application program of the target icon associated with the target operation characteristic;
the first input is used for controlling the target icon to be in contact with or have an overlapping area with a target area of the target control, and when the first input operation is finished, the target icon is not separated from the target control.
2. The method of claim 1, wherein prior to receiving the first input from the user, further comprising:
receiving a second input of the user;
in response to the second input, displaying a target control.
3. The method of claim 2, wherein displaying the target control comprises:
displaying a target control in a first state;
after the target control is displayed, the method further comprises:
receiving a third input of the user;
responding to the third input, displaying a target control in a second state, and displaying N icons in a first display area formed by enclosing the target control in the second state and the side edge of a display screen of the terminal,
wherein N is a positive integer.
4. The method of claim 2, wherein the target control divides a display area of a display screen of the terminal into a first sub-area and a second sub-area, and wherein N icons are displayed in the first sub-area.
5. The method according to claim 2, wherein the display screen of the terminal is a shaped screen;
the display target control comprises:
displaying a target control in a second display area of the special-shaped screen;
wherein the second display area comprises two areas separated by a tip area of the shaped screen.
6. The method according to claim 1, wherein a first preset area corresponding to the target control displays N icons;
the receiving a first input of a user comprises:
receiving a first input of a user dragging a target icon of the N icons.
7. The method according to claim 1, wherein a second preset area corresponding to the target control displays N icons;
the receiving a first input of a user comprises:
and receiving a first input of a user for stretching a target control displayed on the current interface to a target icon in the N icons.
8. The method of claim 1, wherein the first input is used for controlling the target icon to be in contact with or have an overlapping area with a target area of the target control, and the target icon is not separated from the target control at the end of the first input operation;
the responding to the first input, linking a target control and a target icon displayed on the current interface, and updating the display of the target control and the target icon includes:
under the condition that the target icon is in contact with a target area of the target control or an overlapping area exists, the target control and the target icon displayed on the current interface are linked;
and updating the display form of the target control and the display position of the target icon in the operation process of the first input.
9. The method according to claim 1, wherein the first input includes an operation of dragging a target icon;
the acquiring the target operation characteristic of the first input comprises:
acquiring a target area of the target control pointed by the first input operation direction;
the target split-body application program for starting the target icon associated with the target operation characteristic comprises the following steps:
and starting a target body-separating application program of the target icon associated with the target area.
10. The method of claim 1, wherein the first input comprises an operation to stretch a target control;
the acquiring the target operation characteristic of the first input comprises:
acquiring a stretching direction of the first input;
the target split-body application program for starting the target icon associated with the target operation characteristic comprises the following steps:
and starting a target body-splitting application program of the target icon associated with the stretching direction.
11. The method of claim 1, wherein after receiving the first input from the user, further comprising:
and under the condition that the target icon is in contact with or has an overlapping area with the target area of the target control, displaying the program detail information of the target personal application program of the target icon associated with the target area.
12. The method of claim 11, wherein after displaying the program detail information of the target avatar application of the target icon associated with the target area, further comprising:
receiving a fourth input of the user on the program detail information;
updating the display content of the program detail information in response to the fourth input.
13. The method according to claim 11, wherein the program detail information comprises the target avatar account information of the target avatar application of the target icon and/or the reminding task information input by the user and associated with the target avatar application of the target icon.
14. The method of claim 1, wherein after receiving the first input from the user, further comprising:
and under the condition that the target control is separated from the target icon when the first input operation is finished, canceling the starting of the target body-splitting application program of the target icon.
15. The method of claim 1, wherein the target icon is an icon of an application that does not launch a split function;
after receiving the first input of the user, the method further comprises:
displaying a split account adding control in a third preset area of the target icon;
receiving a fifth input of the user on the split account adding control;
displaying an account number input box in response to the fifth input;
acquiring account information input by a user in the account input box;
and creating a first body-splitting application program of the target icon, and logging in the first body-splitting application program through the account information.
16. The method of claim 1, wherein the target control is a line type control.
17. The method of claim 1, wherein the linking the target control and the target icon displayed by the current interface comprises:
and under the condition that the target icon is in contact with or has an overlapping area with the target area of the target control, locking the target icon in the target area of the target control.
18. The method of claim 1, wherein the linking the target control and the target icon displayed by the current interface comprises:
displaying a locking identifier in a fourth preset area of the target icon;
the locking mark is used for indicating that the target icon is the selected icon of the to-be-started self-body application program.
19. A terminal, comprising:
the first receiving module is used for receiving a first input of a user;
the first response module is used for responding to the first input, linking a target control and a target icon displayed on the current interface and updating the display of the target control and the target icon;
the first acquisition module is used for acquiring the target operation characteristics of the first input;
the starting module is used for starting a target self-body-distinguishing application program of the target icon associated with the target operation characteristic;
the first input is used for controlling the target icon to be in contact with or have an overlapping area with a target area of the target control, and when the first input operation is finished, the target icon is not separated from the target control.
20. The terminal of claim 19, further comprising:
the second receiving module is used for receiving the second input of the user before receiving the first input of the user;
a second response module to display a target control in response to the second input.
21. The terminal of claim 20, wherein the second response module is specifically configured to: in response to the second input, displaying a target control in a first state;
the terminal further comprises:
the third receiving module is used for receiving a third input of the user after the target control is displayed;
the third response module is used for responding to the third input, displaying a target control in a second state, and displaying N icons in a first display area formed by enclosing the target control in the second state and the side edge of the display screen of the terminal;
wherein N is a positive integer.
22. The terminal of claim 20, wherein the target control divides a display area of a display of the terminal into a first sub-area and a second sub-area, and wherein N icons are displayed in the first sub-area.
23. The terminal of claim 20, wherein the display of the terminal is a shaped screen;
the second response module is specifically configured to: in response to the second input, displaying a target control in a second display area of the shaped screen;
wherein the second display area comprises two areas separated by a tip area of the shaped screen.
24. The terminal according to claim 19, wherein a first preset area corresponding to the target control displays N icons;
the first receiving module is specifically configured to:
receiving a first input of a user dragging a target icon of the N icons.
25. The terminal according to claim 19, wherein a second preset area corresponding to the target control displays N icons;
the first receiving module is specifically configured to:
and receiving a first input of a user for stretching a target control displayed on the current interface to a target icon in the N icons.
26. The terminal of claim 19, wherein the first input is used to control the target icon to contact or have an overlapping region with a target region of the target control, and when the first input operation is finished, the target icon is not separated from the target control;
the first response module includes:
the link module is used for linking the target control and the target icon displayed on the current interface under the condition that the target icon is in contact with the target area of the target control or the target area has an overlapping area;
and the updating module is used for updating the display form of the target control and the display position of the target icon in the operation process of the first input.
27. The terminal of claim 19, wherein the first input comprises an operation of dragging a target icon;
the first obtaining module is specifically configured to:
acquiring a target area of the target control pointed by the first input operation direction;
the starting module is specifically configured to:
and starting a target body-separating application program of the target icon associated with the target area.
28. The terminal of claim 19, wherein the first input comprises an operation to stretch a target control;
the first obtaining module is specifically configured to:
acquiring a stretching direction of the first input;
the starting module is specifically configured to:
and starting a target body-splitting application program of the target icon associated with the stretching direction.
29. The terminal of claim 19, further comprising:
the first display module is used for displaying the program detail information of the target individual application program of the target icon associated with the target area under the condition that the target icon is in contact with the target area of the target control or has an overlapped area after receiving the first input of the user.
30. The terminal of claim 29, further comprising:
a fourth receiving module, configured to receive a fourth input of the user on program detail information after the program detail information of the target personal application of the target icon associated with the target area is displayed;
and the fourth response module is used for responding to the fourth input and updating the display content of the program detail information.
31. The terminal according to claim 29, wherein the program detail information comprises the target avatar account information of the target avatar application of the target icon and/or the user-input reminding task information associated with the target avatar application of the target icon.
32. The terminal of claim 19, further comprising:
and the canceling module is used for canceling the starting of the target body-splitting application program of the target icon under the condition that the target control is separated from the target icon when the first input operation is finished after the first input of the user is received.
33. The terminal of claim 19, wherein the target icon is an icon of an application program that does not start an avatar function;
the terminal further comprises:
the second display module is used for displaying the split account adding control in a third preset area of the target icon after receiving the first input of the user;
the fifth receiving module is used for receiving a fifth input of the user on the split account adding control;
a fifth response module, configured to respond to the fifth input and display an account entry box;
the second acquisition module is used for acquiring the account information input by the user in the account input box;
and the creating module is used for creating a first body-divided application program of the target icon and logging in the first body-divided application program through the account information.
34. The terminal of claim 19, wherein the target control is a line type control.
35. The terminal of claim 19, wherein the first response module is specifically configured to:
and responding to the first input, locking the target icon in the target area of the target control under the condition that the target icon is in contact with the target area of the target control or has an overlapping area, and updating the display of the target control and the target icon.
36. The terminal of claim 19, wherein the first response module is specifically configured to:
responding to the first input, displaying a locking identifier in a fourth preset area of a target icon, and updating the display of the target control and the target icon;
the locking mark is used for indicating that the target icon is the selected icon of the to-be-started self-body application program.
37. A terminal, characterized in that it comprises a processor, a memory and a computer program stored on said memory and executable on said processor, said computer program, when executed by said processor, implementing the steps of the application launching method according to any of claims 1 to 18.
38. A computer-readable storage medium, characterized in that a computer program is stored on the computer-readable storage medium, which computer program, when being executed by a processor, carries out the steps of the application launching method as claimed in any one of the claims 1 to 18.
CN201810247839.3A 2018-03-23 2018-03-23 Application program starting method and terminal Active CN108646958B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810247839.3A CN108646958B (en) 2018-03-23 2018-03-23 Application program starting method and terminal

Publications (2)

Publication Number Publication Date
CN108646958A CN108646958A (en) 2018-10-12
CN108646958B true CN108646958B (en) 2020-06-23

Family

ID=63744623

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810247839.3A Active CN108646958B (en) 2018-03-23 2018-03-23 Application program starting method and terminal

Country Status (1)

Country Link
CN (1) CN108646958B (en)

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111198629B (en) * 2018-11-19 2023-09-15 青岛海信移动通信技术有限公司 Method for processing touch operation of mobile terminal and mobile terminal
CN109634480B (en) * 2018-11-30 2020-08-25 维沃移动通信有限公司 User interface editing method and terminal equipment
CN109902679B (en) * 2019-02-26 2021-01-29 维沃移动通信有限公司 Icon display method and terminal equipment
CN110008011B (en) * 2019-02-28 2021-07-16 维沃移动通信有限公司 Task switching method and terminal equipment
CN110263515B (en) * 2019-04-26 2021-12-24 荣耀终端有限公司 Opening method of encrypted application and terminal equipment
CN111368188B (en) * 2020-02-27 2023-09-05 维沃移动通信有限公司 Application processing method and electronic device
CN111857997A (en) * 2020-06-30 2020-10-30 维沃移动通信有限公司 Application program body-separating identification display method, device, equipment and medium
CN114265635A (en) * 2021-12-21 2022-04-01 维沃移动通信有限公司 Application program starting method and device

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103092464A (en) * 2012-12-28 2013-05-08 东莞宇龙通信科技有限公司 Terminal device and icon operation method thereof
CN103500057A (en) * 2013-10-08 2014-01-08 百度在线网络技术(北京)有限公司 Mobile terminal and control method and device thereof
CN105224322A (en) * 2015-09-25 2016-01-06 维沃移动通信有限公司 A kind of attend to anything else method and terminal of application program
CN106201523A (en) * 2016-07-15 2016-12-07 宇龙计算机通信科技(深圳)有限公司 The control method of application program, actuation means and terminal
CN107168602A (en) * 2017-04-07 2017-09-15 深圳市金立通信设备有限公司 One kind control application drawing calibration method and terminal
CN107506109A (en) * 2017-08-16 2017-12-22 维沃移动通信有限公司 A kind of method and mobile terminal for starting application program
CN107566638A (en) * 2017-08-31 2018-01-09 维沃移动通信有限公司 The display control method and mobile terminal of a kind of application program

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9003318B2 (en) * 2011-05-26 2015-04-07 Linden Research, Inc. Method and apparatus for providing graphical interfaces for declarative specifications

Also Published As

Publication number Publication date
CN108646958A (en) 2018-10-12

Similar Documents

Publication Publication Date Title
CN108646958B (en) Application program starting method and terminal
WO2019174611A1 (en) Application configuration method and mobile terminal
CN108491133B (en) Application program control method and terminal
US20210044556A1 (en) Message management method and terminal
CN108491149B (en) Split screen display method and terminal
CN110196667B (en) Notification message processing method and terminal
US20220300302A1 (en) Application sharing method and electronic device
CN109525710B (en) Method and device for accessing application program
CN110830363B (en) Information sharing method and electronic equipment
CN110888707A (en) Message sending method and electronic equipment
CN110196668B (en) Information processing method and terminal equipment
CN108228902B (en) File display method and mobile terminal
CN108681427B (en) Access right control method and terminal equipment
CN110442279B (en) Message sending method and mobile terminal
CN111610904B (en) Icon arrangement method, electronic device and storage medium
CN111338530A (en) Control method of application program icon and electronic equipment
WO2020024770A1 (en) Method for determining communication object, and mobile terminal
CN109725789B (en) Application icon archiving method and terminal equipment
CN109062634B (en) Application starting method and mobile terminal
WO2019120190A1 (en) Dialing method and mobile terminal
CN110768804A (en) Group creation method and terminal device
CN111007976B (en) Application control method and terminal equipment
CN109885242B (en) Method for executing operation and electronic equipment
CN111367450A (en) Application program control method, electronic device and medium
CN109067975B (en) Contact person information management method and terminal equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant