CN108897477B - Operation control method and terminal equipment - Google Patents


Publication number
CN108897477B
CN108897477B
Authority
CN
China
Prior art keywords
target
touch input
area
sub
fingerprint
Prior art date
Legal status
Active
Application number
CN201810650873.5A
Other languages
Chinese (zh)
Other versions
CN108897477A (en)
Inventor
杨其豪
Current Assignee
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd
Priority to CN201810650873.5A
Publication of CN108897477A
Application granted
Publication of CN108897477B
Legal status: Active


Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04886: Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484: Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/04847: Interaction techniques to control parameter settings, e.g. interaction with sliders or dials

Abstract

The invention discloses an operation control method applied to a terminal device, comprising: receiving a first touch input from a first finger of a user on the display screen of the terminal device; in response to the first touch input, displaying N control sub-areas in a target area; receiving a second touch input from the user in a target control sub-area; and, in response to the second touch input, performing a target control operation corresponding to the second touch input on a target object in the target sub-display area associated with the target control sub-area. The display area of the display screen of the terminal device comprises N sub-display areas; the target control sub-area is one of the N control sub-areas; each of the N control sub-areas is associated with one of the N sub-display areas; and N is an integer greater than 1. The operation control approach provided by the embodiments of the invention is simple to implement and makes it easier for a user to operate a larger touch screen.

Description

Operation control method and terminal equipment
Technical Field
Embodiments of the invention relate to the field of communication technologies, and in particular to an operation control method and a terminal device.
Background
With the development of terminal devices such as smartphones and tablet computers, the screens of terminal devices have been designed larger and larger in order to provide a better viewing and operating experience for users.
In existing operation modes, a user is typically required to tap a corresponding control on the touch screen of the terminal device, or to perform a multi-touch operation following a preset gesture, in order to trigger the terminal to execute a corresponding function. However, while ever-larger screens improve the viewing experience, they also introduce drawbacks. For example, it is difficult for a user to reach different parts of the entire screen with one hand. Likewise, for operations that require multi-touch, such as zooming and screen capture, the distance the fingers must travel grows with the screen, which makes these operations cumbersome.
Disclosure of Invention
The embodiments of the invention provide an operation control method and a terminal device, aiming to solve the problem of cumbersome operation caused by large screens.
To solve the above technical problem, the invention is implemented as follows:
In a first aspect, an operation control method is provided, applied to a terminal device, comprising:
receiving a first touch input from a first finger of a user on a display screen of the terminal device;
in response to the first touch input, displaying N control sub-areas in a target area;
receiving a second touch input from the user in a target control sub-area; and
in response to the second touch input, performing a target control operation corresponding to the second touch input on a target object in a target sub-display area associated with the target control sub-area;
wherein the display area of the display screen of the terminal device comprises N sub-display areas; the target control sub-area is one of the N control sub-areas; each of the N control sub-areas is associated with one of the N sub-display areas; and N is an integer greater than 1.
In a second aspect, a terminal device is provided, comprising:
a first receiving module, configured to receive a first touch input from a first finger of a user on a display screen of the terminal device;
a first response module, configured to display, in response to the first touch input, N control sub-areas in a target area;
a second receiving module, configured to receive a second touch input from the user in a target control sub-area; and
a second response module, configured to perform, in response to the second touch input, a target control operation corresponding to the second touch input on a target object in a target sub-display area associated with the target control sub-area;
wherein the display area of the display screen of the terminal device comprises N sub-display areas; the target control sub-area is one of the N control sub-areas; each of the N control sub-areas is associated with one of the N sub-display areas; and N is an integer greater than 1.
In a third aspect, a terminal device is provided, comprising a memory, a processor, and a computer program stored on the memory and executable on the processor, the computer program, when executed by the processor, implementing the steps of the operation control method provided in the first aspect.
In a fourth aspect, a computer-readable storage medium is provided, on which a computer program is stored, the computer program, when executed by a processor, implementing the steps of the operation control method provided in the first aspect.
In the embodiments of the invention, the display area of the display screen of the terminal device comprises N sub-display areas, the target control sub-area is one of N control sub-areas, and each of the N control sub-areas is associated with one of the N sub-display areas. On this basis, the terminal device can display the control sub-areas in the target area in response to a first touch input from a user's finger on its display screen, and can then, in response to a second touch input, perform the target control operation corresponding to that input on a target object in the target sub-display area associated with the target control sub-area. With the operation control method of the embodiments of the invention, a touch input in the target control sub-area is therefore enough for the user to perform the corresponding target control operation on a target object in the associated sub-display area, providing operation control over the entire display screen of the terminal device. This control mode is simple to implement and makes it easier for a user to operate a larger touch screen.
Drawings
FIG. 1 is a schematic flowchart of an operation control method according to an embodiment of the present invention;
FIG. 2 is a schematic diagram of a first touch input gesture according to an embodiment of the present invention;
FIG. 3 is a schematic diagram of a second touch input gesture according to an embodiment of the present invention;
FIG. 4 is a schematic diagram of a third touch input gesture according to an embodiment of the present invention;
FIG. 5 is a schematic diagram of a fourth touch input gesture according to an embodiment of the present invention;
FIG. 6 is a schematic diagram of a fifth touch input gesture according to an embodiment of the present invention;
FIG. 7 is a schematic diagram of a sixth touch input gesture according to an embodiment of the present invention;
FIG. 8 is a schematic diagram of a seventh touch input gesture according to an embodiment of the present invention;
FIG. 9 is a schematic diagram of an eighth touch input gesture according to an embodiment of the present invention;
FIG. 10 is a schematic diagram of a ninth touch input gesture according to an embodiment of the present invention;
FIG. 11 is a schematic diagram of a tenth touch input gesture according to an embodiment of the present invention;
FIG. 12 is a schematic diagram of an eleventh touch input gesture according to an embodiment of the present invention;
FIG. 13 is a schematic diagram of a twelfth touch input gesture according to an embodiment of the present invention;
FIG. 14 is a schematic structural diagram of a terminal device according to an embodiment of the present invention;
FIG. 15 is a schematic structural diagram of another terminal device according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Referring to fig. 1, an embodiment of the present invention provides an operation control method, applied to a terminal device, including:
step 101: receiving a first touch input of a first finger of a user on a display screen of the terminal equipment.
The touch input of the first finger of the user on the display screen of the terminal device may be a sliding input, a pressing input, or a combination of multiple input modes. The embodiment of the present invention is not limited thereto.
Optionally, in the embodiments of the present invention, the first fingerprint of the first finger may comprise T fingerprint partitions, where T is a positive integer. A fingerprint partition of the first fingerprint is formed where the user's first finger contacts the display screen (also referred to as the screen or touch screen) of the terminal device, and is a part or all of the complete fingerprint (i.e., the first fingerprint) formed by the user's first finger.
For example, the complete fingerprint formed by the user's first finger is shown in fig. 2. Here T may be 5, and the first fingerprint of the first finger may be divided into 5 fingerprint partitions:
a left fingerprint partition, formed where the left side of the user's finger contacts the screen, as shown in fig. 2;
a right fingerprint partition, formed where the right side of the user's finger contacts the screen, as shown in fig. 2;
an upper fingerprint partition, formed where the upper side of the user's finger contacts the screen, as shown in fig. 2;
a lower fingerprint partition, formed where the underside of the user's finger contacts the screen, as shown in fig. 2;
a central fingerprint partition, formed where the central part of the user's finger (which can be understood as the finger pad) contacts the screen, as shown in fig. 2.
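The five fingerprint partitions described above (with T = 5), plus the complete fingerprint formed by a full-contact press, can be represented as a simple enumeration. A hypothetical sketch; the names are illustrative:

```python
from enum import Enum

class FingerprintPartition(Enum):
    # The five partitions of the first fingerprint when T = 5 (see fig. 2),
    # plus the complete fingerprint formed by full contact with the screen.
    LEFT = "left"      # left side of the finger contacts the screen
    RIGHT = "right"    # right side of the finger contacts the screen
    UPPER = "upper"    # upper side of the finger contacts the screen
    LOWER = "lower"    # underside of the finger contacts the screen
    CENTER = "center"  # finger pad contacts the screen
    FULL = "full"      # entire fingerprint area contacts the screen

T = 5  # number of partitions into which the first fingerprint is divided
```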
Optionally, the touch input of the first finger on the screen may include a pressing input and/or a sliding input performed with a first area of the user's first finger, where the first area is a partial or entire fingerprint area of the first fingerprint. It can be understood that when the touch input is a sliding input, the first finger remains in contact with the screen throughout the input. From the touch input performed with the first area, the terminal can determine which fingerprint partition the first finger forms on the screen where the first area contacts it. For example:
when the first area is the left fingerprint area of the first fingerprint, the fingerprint partition formed on the screen is the left fingerprint partition, as shown in fig. 2, fig. 3, or fig. 5;
when the first area is the right fingerprint area, the partition formed is the right fingerprint partition, as shown in fig. 2, fig. 3, or fig. 5;
when the first area is the upper fingerprint area, the partition formed is the upper fingerprint partition, as shown in fig. 2 or fig. 4;
when the first area is the lower fingerprint area, the partition formed is the lower fingerprint partition, as shown in fig. 2 or fig. 4;
when the first area is the central fingerprint area, the partition formed is the central fingerprint partition, as shown in fig. 2 or fig. 6;
when the first area is the entire fingerprint area, the partition formed is the complete fingerprint partition, as shown in fig. 2.
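The correspondence above is a direct one-to-one mapping from the contacted area of the first fingerprint to the partition formed on the screen. A hypothetical sketch; the string keys are illustrative:

```python
# Hypothetical lookup: which fingerprint partition is formed on the screen
# for each contacted area of the first fingerprint (see fig. 2).
AREA_TO_PARTITION = {
    "left_area": "left_partition",
    "right_area": "right_partition",
    "upper_area": "upper_partition",
    "lower_area": "lower_partition",
    "center_area": "center_partition",
    "entire_area": "complete_partition",
}

def partition_formed(first_area: str) -> str:
    """Return the fingerprint partition formed on the screen."""
    return AREA_TO_PARTITION[first_area]
```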
Optionally, while the first area of the first finger performs a pressing input on the screen, the touch input may further include a rotation input in which the user rotates the first finger. During the rotation input, the fingerprint information of the first area changes, and the fingerprint partition formed by the contact between the first finger and the screen changes accordingly.
For example, if during the pressing input the first finger rotates from a left-tilted press to an upright press, the fingerprint information of the first area changes from the left fingerprint area to the entire fingerprint area, and the fingerprint partition formed on the screen changes from the left fingerprint partition to the complete fingerprint partition. Conversely, if the first finger rotates from an upright press to a left-tilted press, the fingerprint information changes from the entire fingerprint area to the left fingerprint area, and the partition changes from the complete fingerprint partition to the left fingerprint partition.
For another example, if during the pressing input the first finger rotates from a right-tilted press to an upright press, the fingerprint information changes from the right fingerprint area to the entire fingerprint area, and the partition changes from the right fingerprint partition to the complete fingerprint partition; rotating back from an upright press to a right-tilted press reverses the change.
For another example, if during the pressing input the first finger rotates from pressing with its upper side to an upright press, the fingerprint information changes from the upper fingerprint area to the entire fingerprint area, and the partition changes from the upper fingerprint partition to the complete fingerprint partition; rotating back from an upright press to an upper-side press reverses the change.
For another example, if during the pressing input the first finger rotates from pressing with its underside to an upright press, the fingerprint information changes from the lower fingerprint area to the entire fingerprint area, and the partition changes from the lower fingerprint partition to the complete fingerprint partition; rotating back from an upright press to an underside press reverses the change.
For another example, if during the pressing input the force of an upright press increases, the fingerprint information changes from the central fingerprint area to the entire fingerprint area, and the partition changes from the central fingerprint partition to the complete fingerprint partition; if the force of an upright press decreases, the fingerprint information changes from the entire fingerprint area to the central fingerprint area, and the partition changes from the complete fingerprint partition to the central fingerprint partition.
It can be understood that the terminal can recognize the different fingerprint information and touch characteristics corresponding to different first touch inputs. Here, the fingerprint information may refer to at least one of the T fingerprint partitions of the first fingerprint. The touch characteristics may include fingerprint-partition change information, for example at least one of a change in the area of a fingerprint partition and a switch between fingerprint partitions, and may also include at least one of the sliding direction, sliding speed, sliding distance, and sliding track of the first finger on the screen. Based on the recognized fingerprint information and touch characteristics, the terminal can determine different response modes for the first touch input.
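The recognition step described above, in which fingerprint information plus touch characteristics select a response mode, can be sketched as a dispatch table. This is a hypothetical illustration; the table entries and names are assumptions:

```python
# Hypothetical dispatch: (fingerprint partition, touch characteristic)
# pairs mapped to the response mode the terminal selects.
RESPONSES = {
    ("left_partition", "slide_left"): "show_control_subareas",
    ("center_partition", "press"): "move_indicator",
}

def respond(fingerprint_partition: str, touch_characteristic: str):
    # The terminal determines a response mode from the recognized
    # fingerprint information and touch characteristics; unrecognized
    # combinations yield no response.
    return RESPONSES.get((fingerprint_partition, touch_characteristic))
```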
In one optional form, the first touch input includes a first sliding input, in a first direction, performed on the display screen of the terminal device with a first fingerprint partition of the user's first finger, and a second sliding input, in a second direction, performed with a second fingerprint partition of the first finger. Both the first fingerprint partition and the second fingerprint partition are among the T fingerprint partitions of the first fingerprint.
In a specific implementation, the first touch input may take various forms: the first and second fingerprint partitions may be the same or different, and the first direction of the first sliding input and the second direction of the second sliding input may be the same or different.
For example, referring to fig. 3, the first fingerprint partition is the left fingerprint partition, the first direction is a leftward swipe, the second fingerprint partition is the right fingerprint partition, and the second direction is a rightward swipe. The first touch input thus consists of the user first performing a leftward first sliding input with the left fingerprint partition of the first finger, then a rightward second sliding input with the right fingerprint partition.
For another example, referring to fig. 4, the first fingerprint partition is the upper fingerprint partition, the first direction is an upward swipe, the second fingerprint partition is the lower fingerprint partition, and the second direction is a downward swipe: the user first slides upward with the upper fingerprint partition, then downward with the lower fingerprint partition.
For another example, referring to fig. 5, the first fingerprint partition is the left fingerprint partition, the first direction is counterclockwise, the second fingerprint partition is the right fingerprint partition, and the second direction is clockwise: the user first slides counterclockwise with the left fingerprint partition, then clockwise with the right fingerprint partition.
For another example, referring to fig. 6, both the first and second fingerprint partitions are the central fingerprint partition, the first direction is an upward swipe, and the second direction is a rightward swipe: the user first slides upward with the central fingerprint partition, then rightward with the same partition.
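Each example above defines the first touch input as an ordered pair of swipes, each characterized by a fingerprint partition and a direction. A hypothetical matcher; the pattern encoding is an assumption, not part of the patent:

```python
# Hypothetical two-stage gesture patterns for the first touch input,
# as ordered (fingerprint partition, swipe direction) pairs (figs. 3-6).
FIRST_TOUCH_PATTERNS = [
    [("left", "left"), ("right", "right")],                   # fig. 3
    [("upper", "up"), ("lower", "down")],                     # fig. 4
    [("left", "counterclockwise"), ("right", "clockwise")],   # fig. 5
    [("center", "up"), ("center", "right")],                  # fig. 6
]

def is_first_touch_input(events: list) -> bool:
    """Return True if the observed (partition, direction) sequence
    matches any recognized first-touch-input pattern."""
    return events in FIRST_TOUCH_PATTERNS
```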
Step 103: in response to the first touch input, display N control sub-areas in the target area.
After receiving and recognizing the first touch input, the terminal device may execute step 103 and display N control sub-areas (N being an integer greater than 1) in the target area. It can be understood that the target area may be a terminal default, may be preconfigured in the terminal by the user, or may be determined from the operation and/or area corresponding to the first touch input; the embodiments of the present invention do not limit this.
Optionally, if the main purpose of displaying the control sub-areas is to facilitate one-handed operation of a large-screen terminal, the target area is preferably set in the lower-middle region of the screen. If the user's left- or right-handedness is further taken into account, the target area may instead be set in the lower-left region (for left-handed users) or the lower-right region (for right-handed users).
Optionally, if the main purpose of displaying the control sub-areas is to facilitate multi-touch operations on a large-screen terminal, the target area is preferably set according to the user's habits, or preset manually by the user, so as to better meet the user's needs.
It should be noted that the display area of the display screen of the terminal device correspondingly comprises N sub-display areas, and each of the N control sub-areas is associated with one of the N sub-display areas. Taking the display screens shown in figs. 3 to 6 and figs. 11 to 13 as examples, with N = 4, the display area of the screen is divided by dotted lines into 4 sub-display areas: an upper-left, an upper-right, a lower-left, and a lower-right sub-display area. The target area 120 displays 4 control sub-areas, each corresponding to a numbered virtual identifier (also referred to as a virtual button). It can be understood that each of the 4 control sub-areas is associated with one of the 4 sub-display areas, specifically:
the control sub-area corresponding to virtual identifier No. 1 is associated with the upper-left sub-display area;
the control sub-area corresponding to virtual identifier No. 2 is associated with the upper-right sub-display area;
the control sub-area corresponding to virtual identifier No. 3 is associated with the lower-left sub-display area;
the control sub-area corresponding to virtual identifier No. 4 is associated with the lower-right sub-display area.
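For the N = 4 example above, the association between numbered control sub-areas and sub-display areas reduces to a simple lookup. A hypothetical sketch; the names are illustrative:

```python
# Hypothetical association for N = 4 (see figs. 3-6 and 11-13):
# virtual identifier number -> associated sub-display area.
CONTROL_TO_SUBDISPLAY = {
    1: "upper_left",
    2: "upper_right",
    3: "lower_left",
    4: "lower_right",
}

def associated_subdisplay(virtual_id: int) -> str:
    """Return the sub-display area associated with a control sub-area."""
    return CONTROL_TO_SUBDISPLAY[virtual_id]
```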
Step 105: receive a second touch input from the user in the target control sub-area.
Step 107: in response to the second touch input, perform the target control operation corresponding to the second touch input on the target object in the target sub-display area associated with the target control sub-area.
It can be understood that the target control sub-area is one or more of the N control sub-areas, the target sub-display area associated with it is one or more of the N sub-display areas, and the target object is one or more touch objects within the target sub-display area. With the operation control method of the embodiments of the invention, a second touch input in the target control sub-area lets the user make the terminal device perform the corresponding target control operation on the target object in the associated target sub-display area.
The second touch input in the target control sub-area may be one or a combination of several operation modes, such as sliding, tapping, pressing, and rotating. Different second touch inputs may likewise correspond to different fingerprint information and touch characteristics, and the terminal's recognition result determines the response mode corresponding to each second touch input.
Optionally, after the terminal device executes step 105 and receives the second touch input in the target control sub-area, it may display a target indication identifier in the target sub-display area associated with the target control sub-area, and then, based on the second touch input, control the target indication identifier to move within the target sub-display area. Taking fig. 11 as an example, the user moves virtual identifier No. 2 within control sub-area No. 2 (i.e., the target control sub-area); correspondingly, in the upper-right sub-display area (i.e., the target sub-display area) associated with control sub-area No. 2, the target indication identifier 110 moves along with the user's movement in control sub-area No. 2.
Specifically, in order to control the target indication identifier to move in the target sub-display area based on the second touch input, the terminal device receives the second touch input of the user in the target control sub-area, which may specifically be to receive the second touch input of a second finger of the user in the target control sub-area. The second finger of the user for performing the second touch input may be the same as or different from the first finger for performing the first touch input.
In an alternative form, the second touch input includes a press input of a first area of the second finger of the user on the display screen of the terminal device, the first area being a partial or full fingerprint partition of the second fingerprint of the second finger. In addition, while the first area of the second finger performs the press input on the display screen of the terminal device, the second touch input may further include a rotation input in which the user rotates the second finger; during the rotation of the second finger, the area characteristic information (i.e., the touch characteristic) of the first area changes. The area characteristic information may specifically include an area characteristic or a fingerprint characteristic: the area characteristic may be the area of the fingerprint partition, and the fingerprint characteristic may be the type of the fingerprint partition and/or a switching between fingerprint partitions.
Further, when the target indication identifier is controlled to move in the target sub-display area based on the second touch input, the method may specifically include:
first, area characteristic change information corresponding to a second touch input is acquired. Specifically, the area characteristic change information may include area change information or fingerprint change information, the area change information may be an increase or decrease in the area of the fingerprint partition, and the fingerprint change information may be a switch from one fingerprint partition to another fingerprint partition.
Secondly, the moving direction of the target indication identifier is determined based on the area characteristic change information. For example, when the area of the left fingerprint partition increases, the target indication identifier moves to the right, and when that area decreases, it moves to the left. Conversely, when the area of the right fingerprint partition increases, the target indication identifier moves to the left, and when that area decreases, it moves to the right. As another example, the target indication identifier may move to the right when the left fingerprint partition switches to the full fingerprint partition, and move to the left when the right fingerprint partition switches to the full fingerprint partition.
And finally, controlling the target indication mark to move in the target sub-display area according to the moving direction.
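The three steps above — acquire the area characteristic change information, determine the moving direction, then move the indicator — can be sketched as follows. The rules encode the examples given in the text (left partition growing moves the indicator right, and so on); all function and variable names are illustrative assumptions:

```python
# Step 2: determine the moving direction from the area-change information.
def direction_from_area_change(partition, area_delta):
    """Return the moving direction of the target indication identifier."""
    if partition == "left":
        return "right" if area_delta > 0 else "left"
    if partition == "right":
        return "left" if area_delta > 0 else "right"
    return None  # other partitions: no rule in this example

# Step 3: move the indicator within the target sub-display area.
def move_indicator(pos, direction, step=10):
    x, y = pos
    dx = {"right": step, "left": -step}.get(direction, 0)
    return (x + dx, y)

# Left fingerprint partition grows while the finger rotates: indicator moves right.
d = direction_from_area_change("left", +120)
assert d == "right"
assert move_indicator((50, 50), d) == (60, 50)
```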
The following exemplifies the process of controlling the movement of the target indication identifier according to a second touch input (specifically, a press input followed by a rotation input). Referring to fig. 11, the user uses the thumb of the right hand as the second finger to perform the second touch input in the control sub-area No. 2. The second touch input includes a press input of the left fingerprint partition of the user's right thumb (in this example, the first area is specifically the left area, i.e., the left fingerprint partition is the part in contact with the screen) in the control sub-area No. 2. Further, during the press input, the user can rotate the right thumb to change the area characteristic information of the left fingerprint partition, so that the terminal device can determine the moving direction of the target indication identifier according to the change of the area characteristic information and control the target indication identifier to move in the target sub-display area in that direction.
For example, if the user rotates the right thumb to the right, the area of the left fingerprint partition gradually increases; the area characteristic change information corresponding to the second touch input is then an increase in the area of the fingerprint partition, and the moving direction of the target indication identifier may be determined as a direction consistent with the rotation direction of the finger. In this way, the user can intuitively control the target indication identifier to move within the target sub-display area through the touch input in the target control sub-area.
Yet another alternative form of the second touch input is as follows: the second touch input includes a press input of a first area of the second finger of the user on the display screen of the terminal device, the first area being a partial or full fingerprint partition of the second fingerprint of the second finger. After the first area of the second finger performs the press input on the display screen of the terminal device, the second touch input further includes a slide input in which the user slides the second finger. The slide input corresponds to slide parameters such as a slide speed, a slide direction and a slide distance.
Further, when the target indication identifier is controlled to move in the target sub-display area based on the second touch input, the method may specifically include:
first, a sliding parameter corresponding to a second touch input is obtained, wherein the sliding parameter includes at least one of a sliding speed, a sliding direction and a sliding distance.
Secondly, based on the sliding parameter, the moving direction of the target indication mark is determined. For example, if the sliding direction of the second touch input is leftward sliding, the moving direction of the target indication mark is determined to be leftward moving. For another example, if the sliding direction of the second touch input is an upward sliding, the moving direction of the target indication mark is determined to be an upward movement.
And finally, controlling the target indication mark to move in the target sub-display area according to the moving direction.
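The slide-parameter variant above can be sketched as follows. Here the slide direction determines the moving direction, and (as an added assumption not stated in the text) the slide distance and speed scale the displacement; all names are illustrative:

```python
# Hypothetical sketch: move the target indication identifier according to the
# slide parameters (direction, distance, speed) of the second touch input.

def move_by_slide(pos, slide_direction, slide_distance, slide_speed=1.0):
    """Move the indicator within the sub-display area per the slide parameters."""
    dx, dy = {
        "left": (-1, 0), "right": (1, 0),
        "up": (0, -1), "down": (0, 1),
    }[slide_direction]
    step = slide_distance * slide_speed
    x, y = pos
    return (x + dx * step, y + dy * step)

# A leftward slide of 30 px moves the indicator 30 px to the left.
assert move_by_slide((100, 40), "left", 30) == (70, 40)
```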
By adopting the mode, the target indication mark is controlled to move in the target sub-display area, so that the user can more intuitively control the target indication mark to move in the target sub-display area by utilizing the touch input in the target control sub-area.
On the basis that the touch input in the target control sub-area controls the target indication identifier to move in the target sub-display area, when the target control operation corresponding to the second touch input is performed on the target object in the target sub-display area associated with the target control sub-area, the target object located at the current display position of the target indication identifier may be acquired first, and the target control operation corresponding to the second touch input may then be performed on that target object. It should be noted that when the second touch input of the user is received, the target indication identifier 110 is first controlled to move according to the second touch input, and the target object located at the current display position of the target indication identifier is then determined.
Optionally, after the target indication identifier is moved based on the second touch input, the target object located at the current display position of the target indication identifier may be determined. At this time, the user may continue the second touch input, for example, by maintaining the press input after the sliding operation or the rotating operation; when the press duration reaches a preset time (for example, 3 seconds), it is determined that the target control operation to be performed on the target object is a click operation, and the terminal accordingly performs the click operation on the target object.
Optionally, after the target indication identifier is moved based on the second touch input, the target object located at the current display position of the target indication identifier may be determined. At this time, the user may also interrupt the second touch input and perform a third touch input, for example, a click operation at an arbitrary position on the screen, in the target control sub-area, or in the target area. Upon receiving the third touch input, the terminal device may regard it as a continuation of the second touch input, recognize that the target control operation to be performed on the target object is a click operation, and then perform the click operation on the target object. It should be noted that, to prevent false triggering of the terminal device, the interval between the second touch input and the third touch input may be preset (for example, only a click operation received within 5 seconds after the second touch input is regarded as the click operation required on the target object).
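The two optional confirmation paths above — a sustained press reaching a preset duration, or a third touch input arriving within a preset interval — can be sketched as follows. The thresholds mirror the examples in the text (3 seconds, 5 seconds); the function names are assumptions:

```python
# Hypothetical confirmation logic for the click operation on the target object.
PRESS_HOLD_S = 3.0            # preset press duration (example from the text)
CONTINUATION_WINDOW_S = 5.0   # preset interval before a third input is ignored

def confirm_click_by_hold(hold_duration_s):
    """Sustained press after the slide/rotate confirms a click."""
    return hold_duration_s >= PRESS_HOLD_S

def confirm_click_by_third_input(second_end_t, third_start_t):
    """A third touch input counts only within the preset interval (anti false-trigger)."""
    return 0 <= third_start_t - second_end_t <= CONTINUATION_WINDOW_S

assert confirm_click_by_hold(3.2)
assert not confirm_click_by_hold(1.0)
assert confirm_click_by_third_input(10.0, 13.5)      # within 5 s: treated as click
assert not confirm_click_by_third_input(10.0, 16.0)  # too late: ignored
```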
Optionally, referring to fig. 13, receiving a second touch input of the user on a target control sub-area in the N control sub-areas may specifically be: receiving second touch input of M fingers of a user in M target control subregions, wherein the N control subregions comprise M target control subregions, and M is more than or equal to 2 and less than or equal to N; correspondingly, one target indication mark can be respectively displayed in the M target sub-display areas associated with the M target control sub-areas.
Further, when the target control operation corresponding to the second touch input is performed on the target object in the target sub-display area associated with the target control sub-area, the target control operation corresponding to the second touch input may be performed on the M target sub-display areas associated with the M target control sub-areas. Wherein the target control operation comprises at least one of zooming operation and updating the display content of at least one target sub-display area in the M target sub-display areas.
Taking the example shown in fig. 13, M is taken to be 2. 2 fingers of the user respectively perform second touch input in the No. 1 control subarea and the No. 2 control subarea; correspondingly, in the 2 target sub-display areas (upper left sub-display area and upper right sub-display area, respectively) associated with the control sub-area No. 1 and the control sub-area No. 2, a target indication mark 111 (associated with the control sub-area No. 1) and a target indication mark 112 (associated with the control sub-area No. 2) are also displayed, respectively.
When the target control operation corresponding to the second touch input is executed on the target object in the target sub-display area associated with the target control sub-area, the target control operation is correspondingly executed on the 2 target sub-display areas associated with the 2 target control sub-areas. Optionally, if the second touch input is the 2 fingers sliding in opposite directions, the target control operation corresponding to the second touch input may be a zoom operation; taking the example shown in fig. 13, the target object in the target sub-display area will be zoomed in under the control of the second touch input.
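For the M = 2 case above, the check that the two fingers slide in opposite directions can be sketched as follows. The vector encoding and the dot-product test are illustrative assumptions, not the patent's wording:

```python
# Hypothetical detection of the two-finger opposite-slide gesture that maps to
# a zoom operation on the associated target sub-display areas.

def is_opposite_slide(v1, v2):
    """True if the two slide vectors point in roughly opposite directions."""
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    return dot < 0

def target_operation(v1, v2):
    return "zoom" if is_opposite_slide(v1, v2) else "update_content"

# Finger 1 slides left, finger 2 slides right -> zoom.
assert target_operation((-12, 0), (15, 1)) == "zoom"
# Both fingers slide the same way -> some other operation, e.g. a content update.
assert target_operation((10, 0), (12, 2)) == "update_content"
```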
In the embodiment of the present invention, after a target control operation corresponding to a second touch input is performed on a target object in a target sub-display area associated with a target control sub-area, a fourth touch input of a third finger of a user on a display screen of a terminal device may be further received, and in response to the fourth touch input, the N control sub-areas displayed in the target area are cancelled.
It is to be understood that the third finger may be the same as the first finger and the second finger, or may be different from the first finger and the second finger, as long as the fingerprint information and/or the touch characteristics of the fourth touch input can be distinguished from other touch inputs.
For the convenience of user operation, it is preferable that the fourth touch input and the first touch input are reciprocal operations. A reciprocal operation may reverse the order of the operations, reverse the sliding directions, keep the fingerprint partitions in correspondence, and so on, as exemplified below.
For example, referring to fig. 3, the first touch input includes a user first performing a first slide input on the display screen of the terminal device to the left with the left fingerprint partition of the first finger, and then performing a second slide input on the display screen of the terminal device to the right with the right fingerprint partition of the first finger. And after receiving the first touch input, the terminal equipment displays N control subregions in the target region. Correspondingly, referring to fig. 7, the fourth touch input includes that the user performs a first sliding input rightward on the terminal device display screen by using the right fingerprint partition of the first finger, and then performs a second sliding input leftward on the terminal device display screen by using the left fingerprint partition of the first finger. And after receiving the fourth touch input, the terminal equipment cancels the display of the N control sub-areas in the target area.
For another example, referring to fig. 4, the first touch input includes the user first performing a first slide input on the terminal device display screen with the upper fingerprint partition of the first finger, and then performing a second slide input with the lower fingerprint partition of the first finger. After receiving the first touch input, the terminal device displays the N control sub-areas in the target area. Correspondingly, referring to fig. 8, the fourth touch input includes the user first performing a downward first slide input on the terminal device display screen with the lower fingerprint partition of the first finger, and then performing an upward second slide input with the upper fingerprint partition of the first finger. After receiving the fourth touch input, the terminal device cancels the display of the N control sub-areas in the target area.
For another example, referring to fig. 5, the first touch input includes the user first performing a first slide input in the counterclockwise direction on the terminal device display screen with the left fingerprint partition of the first finger, and then performing a second slide input in the clockwise direction with the right fingerprint partition of the first finger. After receiving the first touch input, the terminal device displays the N control sub-areas in the target area. Correspondingly, referring to fig. 9, the fourth touch input includes the user first performing a first slide input in the clockwise direction on the terminal device display screen with the right fingerprint partition of the first finger, and then performing a second slide input in the counterclockwise direction with the left fingerprint partition of the first finger. After receiving the fourth touch input, the terminal device cancels the display of the N control sub-areas in the target area.
As yet another example, referring to fig. 6, the first touch input includes the user first performing an upward first slide input on the terminal device display screen with the center fingerprint partition of the first finger, and then performing a rightward second slide input with the center fingerprint partition of the first finger. After receiving the first touch input, the terminal device displays the N control sub-areas in the target area. Correspondingly, referring to fig. 10, the fourth touch input includes the user first performing a leftward first slide input on the terminal device display screen with the center fingerprint partition of the first finger, and then performing a downward second slide input with the center fingerprint partition of the first finger. After receiving the fourth touch input, the terminal device cancels the display of the N control sub-areas in the target area.
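The reciprocal-operation idea illustrated by figs. 3-10 can be sketched as follows. Two of the notions named above are modeled: reversing the order of the slide steps (as in figs. 3/7), and reversing both order and slide direction (as in figs. 6/10). The tuple encoding of (fingerprint partition, direction) is an assumption for illustration:

```python
# Hypothetical check that a fourth touch input is the reciprocal of the first.
OPPOSITE = {"left": "right", "right": "left", "up": "down", "down": "up"}

def is_reciprocal_reverse_order(first, fourth):
    """Reciprocal by reversing the order of the slide steps (cf. figs. 3 and 7)."""
    return fourth == list(reversed(first))

def is_reciprocal_reverse_direction(first, fourth):
    """Reciprocal by reversing both order and slide direction (cf. figs. 6 and 10)."""
    return fourth == [(part, OPPOSITE[d]) for part, d in reversed(first)]

# Figs. 3 / 7: left partition slides left, then right partition slides right;
# the fourth input performs the same steps in reverse order.
first_3 = [("left_partition", "left"), ("right_partition", "right")]
fourth_7 = [("right_partition", "right"), ("left_partition", "left")]
assert is_reciprocal_reverse_order(first_3, fourth_7)

# Figs. 6 / 10: center partition slides up then right; the fourth input slides
# left then down.
first_6 = [("center_partition", "up"), ("center_partition", "right")]
fourth_10 = [("center_partition", "left"), ("center_partition", "down")]
assert is_reciprocal_reverse_direction(first_6, fourth_10)
```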
In the embodiment of the invention, the display area of the display screen of the terminal equipment comprises N sub-display areas, the target control sub-area is one of the N control sub-areas, and each control sub-area in the N control sub-areas is associated with one sub-display area in the N sub-display areas. On the basis, the terminal device can respond to a first touch input of a finger of a user on a display screen of the terminal device, display a control subarea in the target area, and further respond to a second touch input of the finger of the user on the display screen of the terminal device, and execute a target control operation corresponding to the second touch input on a target object in a target sub-display area associated with the target control subarea. Therefore, by adopting the operation control method of the embodiment of the invention, the user can execute the corresponding target control operation on the target object in the target sub-display area associated with the target control sub-area through the touch input in the target control sub-area, thereby realizing the operation control on the whole terminal equipment display screen. The operation control mode is simple and easy to implement, and a user can operate the touch screen with a larger size more conveniently.
Corresponding to the above operation control method, an embodiment of the present invention further provides a terminal device, as shown in fig. 14, including:
the first receiving module 201 is configured to receive a first touch input of a first finger of a user on a display screen of the terminal device;
a first response module 203, configured to respond to the first touch input and display N control sub-areas in the target area;
a second receiving module 205, configured to receive a second touch input of the user in the target control sub-area;
a second response module 207, configured to respond to a second touch input, execute a target control operation corresponding to the second touch input on a target object in a target sub-display area associated with the target control sub-area;
the display area of the display screen of the terminal equipment comprises N sub-display areas; the target control subregion is one of N control subregions; each of the N control subregions is associated with one of the N sub-display regions; n is an integer greater than 1.
Preferably, in the terminal device provided in the embodiment of the present invention, the first receiving module 201 may specifically include:
the terminal device comprises a first receiving unit, a second receiving unit and a control unit, wherein the first receiving unit is used for receiving a first sliding input of a first fingerprint partition of a first finger of a user in a first direction on a display screen of the terminal device and a second sliding input of a second fingerprint partition of the first finger in a second direction on the display screen of the terminal device;
the first touch input comprises a first sliding input and a second sliding input; the first fingerprint of the first finger comprises T fingerprint partitions, wherein T is a positive integer; the T fingerprint partitions include a first fingerprint partition and a second fingerprint partition.
Preferably, the terminal device provided in the embodiment of the present invention may further include:
the first mark display module is used for displaying a target indication mark in a target sub-display area associated with the target control sub-area;
and the moving module is used for controlling the target indication mark to move in the target sub-display area based on the second touch input.
Preferably, in the terminal device provided in the embodiment of the present invention, the second receiving module 205 may specifically include:
the second receiving unit is used for receiving second touch input of a second finger of the user in the target control subarea;
the second touch input comprises a pressing input of a first area of a second finger of the user on a display screen of the terminal equipment, and the first area is a partial or whole fingerprint partition of the second fingerprint of the second finger; the second touch input further comprises a rotation input of rotating the second finger by a user in the process of performing press input on the display screen of the terminal equipment by the first area of the second finger, and the area characteristic information of the first area changes in the process of rotating the second finger; the region feature information comprises an area feature or a fingerprint feature;
the mobile module may specifically include:
the change acquiring unit is used for acquiring regional characteristic change information corresponding to the second touch input;
a direction determination unit for determining a moving direction of the target indication mark based on the region characteristic change information;
the moving unit is used for controlling the target indication mark to move in the target sub-display area according to the moving direction;
the region feature change information includes area change information or fingerprint change information.
Preferably, in the terminal device provided in the embodiment of the present invention, the second response module 207 may specifically include:
the object acquisition unit is used for acquiring a target object positioned at the current display position of the target indication mark;
and the first operation execution unit is used for executing target control operation corresponding to the second touch input on the target object.
Preferably, in the terminal device provided in the embodiment of the present invention, the second receiving module 205 may specifically include:
the third receiving unit is used for receiving second touch input of M fingers of a user in the M target control sub-areas;
wherein the N control subregions comprise M target control subregions; m is more than or equal to 2 and less than or equal to N;
the terminal device may further include:
the second identifier display module is used for respectively displaying a target indication identifier in M target sub-display areas associated with the M target control sub-areas;
the second response module 207 may specifically include:
the second operation execution unit is used for executing target control operation corresponding to second touch input on the M target sub-display areas associated with the M target control sub-areas;
wherein the target control operation comprises at least one of zooming operation and updating the display content of at least one target sub-display area in the M target sub-display areas.
It can be understood that the terminal device provided in this embodiment may implement the operation control method given in the foregoing embodiment, and relevant descriptions and examples in the foregoing embodiment are all applicable to this embodiment, and are not described herein again.
In the embodiment of the invention, the display area of the display screen of the terminal equipment comprises N sub-display areas, the target control sub-area is one of the N control sub-areas, and each control sub-area in the N control sub-areas is associated with one sub-display area in the N sub-display areas. On the basis, the terminal device can respond to a first touch input of a finger of a user on a display screen of the terminal device, display a control subarea in the target area, and further respond to a second touch input of the finger of the user on the display screen of the terminal device, and execute a target control operation corresponding to the second touch input on a target object in a target sub-display area associated with the target control subarea. Therefore, by adopting the operation control method of the embodiment of the invention, the user can execute the corresponding target control operation on the target object in the target sub-display area associated with the target control sub-area through the touch input in the target control sub-area, thereby realizing the operation control on the whole terminal equipment display screen. The operation control mode is simple and easy to implement, and a user can operate the touch screen with a larger size more conveniently.
Figure 15 is a schematic diagram of a hardware structure of a terminal device implementing various embodiments of the present invention.
The terminal device 700 includes, but is not limited to: a radio frequency unit 701, a network module 702, an audio output unit 703, an input unit 704, a sensor 705, a display unit 706, a user input unit 707, an interface unit 708, a memory 709, a processor 710, a power supply 711, and the like. Those skilled in the art will appreciate that the terminal device configuration shown in fig. 15 does not constitute a limitation of the terminal device, and that the terminal device may include more or fewer components than shown, combine certain components, or use a different arrangement of components. In the embodiment of the present invention, the terminal device includes, but is not limited to, a mobile phone, a tablet computer, a notebook computer, a palmtop computer, a vehicle-mounted terminal, a wearable device, a pedometer, and the like.
Wherein, the user input unit 707 is configured to perform the following processing steps:
receiving a first touch input of a first finger of a user on a display screen of the terminal equipment;
and receiving a second touch input of the user in the target control subarea.
A processor 710 for performing the following processing steps:
responding to the first touch input, and displaying N control subregions in a target region;
responding to the second touch input, and executing target control operation corresponding to the second touch input on a target object in a target sub display area associated with the target control sub area;
the display area of the display screen of the terminal equipment comprises N sub-display areas; a target control sub-region is one of the N control sub-regions; each of the N control subregions is associated with one of the N sub-display regions; and N is an integer greater than 1.
In the embodiment of the invention, the display area of the display screen of the terminal equipment comprises N sub-display areas, the target control sub-area is one of the N control sub-areas, and each control sub-area in the N control sub-areas is associated with one sub-display area in the N sub-display areas. On the basis, the terminal device can respond to a first touch input of a finger of a user on a display screen of the terminal device, display a control subarea in the target area, and further respond to a second touch input of the finger of the user on the display screen of the terminal device, and execute a target control operation corresponding to the second touch input on a target object in a target sub-display area associated with the target control subarea. Therefore, by adopting the operation control method of the embodiment of the invention, the user can execute the corresponding target control operation on the target object in the target sub-display area associated with the target control sub-area through the touch input in the target control sub-area, thereby realizing the operation control on the whole terminal equipment display screen. The operation control mode is simple and easy to implement, and a user can operate the touch screen with a larger size more conveniently.
It should be understood that, in the embodiment of the present invention, the radio frequency unit 701 may be used to receive and send signals during message transmission and reception or during a call; specifically, it receives downlink data from a base station and forwards it to the processor 710 for processing, and transmits uplink data to the base station. In general, the radio frequency unit 701 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like. In addition, the radio frequency unit 701 may also communicate with a network and other devices through a wireless communication system.
The terminal device provides the user with wireless broadband internet access through the network module 702, such as helping the user send and receive e-mails, browse webpages, access streaming media, and the like.
The audio output unit 703 may convert audio data received by the radio frequency unit 701 or the network module 702 or stored in the memory 709 into an audio signal and output as sound. Also, the audio output unit 703 may also provide audio output related to a specific function performed by the terminal device 700 (e.g., a call signal reception sound, a message reception sound, etc.). The audio output unit 703 includes a speaker, a buzzer, a receiver, and the like.
The input unit 704 is used to receive audio or video signals. The input unit 704 may include a Graphics Processing Unit (GPU) 7041 and a microphone 7042; the graphics processor 7041 processes image data of still pictures or video obtained by an image capturing device (e.g., a camera) in a video capturing mode or an image capturing mode. The processed image frames may be displayed on the display unit 706. The image frames processed by the graphics processor 7041 may be stored in the memory 709 (or other storage medium) or transmitted via the radio frequency unit 701 or the network module 702. The microphone 7042 may receive sounds and process them into audio data. In a phone call mode, the processed audio data may be converted into a format transmittable to a mobile communication base station and output via the radio frequency unit 701.
The terminal device 700 further comprises at least one sensor 705, such as light sensors, motion sensors and other sensors. Specifically, the light sensor includes an ambient light sensor that adjusts the luminance of the display panel 7061 according to the brightness of ambient light, and a proximity sensor that turns off the display panel 7061 and/or a backlight when the terminal device 700 is moved to the ear. As one of the motion sensors, the accelerometer sensor can detect the magnitude of acceleration in each direction (generally three axes), detect the magnitude and direction of gravity when stationary, and can be used to identify the terminal device posture (such as horizontal and vertical screen switching, related games, magnetometer posture calibration), vibration identification related functions (such as pedometer, tapping), and the like; the sensors 705 may also include fingerprint sensors, pressure sensors, iris sensors, molecular sensors, gyroscopes, barometers, hygrometers, thermometers, infrared sensors, etc., which are not described in detail herein.
The display unit 706 is used to display information input by the user or information provided to the user. The Display unit 706 may include a Display panel 7061, and the Display panel 7061 may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED), or the like.
The user input unit 707 may be used to receive input numeric or character information and generate key signal inputs related to user settings and function control of the terminal device. Specifically, the user input unit 707 includes a touch panel 7071 and other input devices 7072. The touch panel 7071, also referred to as a touch screen, may collect touch operations by a user on or near the touch panel 7071 (e.g., operations by a user on or near the touch panel 7071 using a finger, a stylus, or any other suitable object or attachment). The touch panel 7071 may include two parts of a touch detection device and a touch controller. The touch detection device detects the touch direction of a user, detects a signal brought by touch operation and transmits the signal to the touch controller; the touch controller receives touch information from the touch sensing device, converts the touch information into touch point coordinates, sends the touch point coordinates to the processor 710, receives a command from the processor 710, and executes the command. In addition, the touch panel 7071 can be implemented by various types such as resistive, capacitive, infrared, and surface acoustic wave. The user input unit 707 may include other input devices 7072 in addition to the touch panel 7071. In particular, the other input devices 7072 may include, but are not limited to, a physical keyboard, function keys (such as volume control keys, switch keys, etc.), a trackball, a mouse, and a joystick, which are not described herein again.
Further, the touch panel 7071 may be overlaid on the display panel 7061. When the touch panel 7071 detects a touch operation on or near it, it transmits the operation to the processor 710 to determine the type of the touch event, and the processor 710 then provides a corresponding visual output on the display panel 7061 according to the type of the touch event. Although in Fig. 15 the touch panel 7071 and the display panel 7061 are implemented as two independent components to realize the input and output functions of the terminal device, in some embodiments the touch panel 7071 and the display panel 7061 may be integrated to realize the input and output functions of the terminal device, which is not limited here.
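The touch pipeline described above — raw detection, conversion to touch-point coordinates by the controller, and event-type classification by the processor — can be sketched as follows. The scaling factor, press-duration threshold, and all names are illustrative assumptions, not taken from the embodiment.

```python
class TouchController:
    """Converts raw sensor-cell touches into touch-point coordinates (sketch)."""

    def __init__(self, scale: int = 10):
        # Assumed scale: one sensor cell maps to a 10x10 pixel block.
        self.scale = scale

    def to_coordinates(self, raw_cell):
        """raw_cell is an assumed (row, col) sensor cell; return pixel coords."""
        row, col = raw_cell
        return (row * self.scale, col * self.scale)


def classify_event(duration_ms: int) -> str:
    """Processor-side classification of the touch event type.

    The 500 ms long-press threshold is an assumed convention.
    """
    return "long_press" if duration_ms >= 500 else "tap"
```

The processor would then pick the visual output (e.g., open a context menu for a long press) based on the classified event type.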
The interface unit 708 is an interface for connecting an external device to the terminal apparatus 700. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The interface unit 708 may be used to receive input (e.g., data information, power, etc.) from an external device and transmit the received input to one or more elements within the terminal apparatus 700 or may be used to transmit data between the terminal apparatus 700 and the external device.
The memory 709 may be used to store software programs as well as various data. The memory 709 may mainly include a program storage area and a data storage area, wherein the program storage area may store an operating system, an application program required by at least one function (such as a sound playing function or an image playing function), and the like; the data storage area may store data created according to the use of the terminal device (such as audio data and a phonebook), and the like. Further, the memory 709 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device.
The processor 710 is a control center of the terminal device, connects various parts of the entire terminal device by using various interfaces and lines, and performs various functions of the terminal device and processes data by running or executing software programs and/or modules stored in the memory 709 and calling data stored in the memory 709, thereby performing overall monitoring of the terminal device. Processor 710 may include one or more processing units; preferably, the processor 710 may integrate an application processor, which mainly handles operating systems, user interfaces, application programs, etc., and a modem processor, which mainly handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into processor 710.
The terminal device 700 may further include a power supply 711 (e.g., a battery) for supplying power to various components, and preferably, the power supply 711 may be logically connected to the processor 710 through a power management system, so as to implement functions of managing charging, discharging, and power consumption through the power management system.
In addition, the terminal device 700 includes some functional modules that are not shown, and are not described in detail herein.
Preferably, an embodiment of the present invention further provides a terminal device, which includes a processor 710, a memory 709, and a computer program stored in the memory 709 and executable on the processor 710. When executed by the processor 710, the computer program implements each process of the above operation control method embodiment and can achieve the same technical effects; to avoid repetition, details are not repeated here.
An embodiment of the present invention further provides a computer-readable storage medium on which a computer program is stored. When executed by a processor, the computer program implements each process of the above operation control method embodiment and can achieve the same technical effects; to avoid repetition, details are not repeated here. The computer-readable storage medium may be a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other like elements in the process, method, article, or apparatus that comprises the element.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and can certainly also be implemented by hardware; in many cases, however, the former is the better implementation. Based on such understanding, the technical solutions of the present invention may be embodied in the form of a software product, which is stored in a storage medium (such as a ROM/RAM, a magnetic disk, or an optical disk) and includes instructions for enabling a terminal (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the methods according to the embodiments of the present invention.
While the present invention has been described with reference to the embodiments shown in the drawings, the present invention is not limited to the embodiments, which are illustrative and not restrictive, and it will be apparent to those skilled in the art that various changes and modifications can be made therein without departing from the spirit and scope of the invention as defined in the appended claims.

Claims (12)

1. An operation control method is applied to a terminal device, and is characterized by comprising the following steps:
receiving a first touch input of a first finger of a user on a display screen of the terminal equipment;
responding to the first touch input, and displaying N control subregions in a target region;
receiving a second touch input of the user in the target control subarea;
responding to the second touch input, and executing target control operation corresponding to the second touch input on a target object in a target sub display area associated with the target control sub area;
the display area of the display screen of the terminal equipment comprises N sub-display areas; a target control sub-region is one of the N control sub-regions; each of the N control subregions is associated with one of the N sub-display regions; n is an integer greater than 1;
the receiving a second touch input of the user on a target control sub-area of the N control sub-areas includes:
receiving second touch input of M fingers of a user in the M target control subareas;
wherein the N control sub-regions comprise the M target control sub-regions; m is more than or equal to 2 and less than or equal to N;
after receiving the second touch input of the user in the target control sub-area, the method further comprises:
respectively displaying a target indication mark in M target sub-display areas associated with the M target control sub-areas;
the executing, on the target object in the target sub-display area associated with the target control sub-area, the target control operation corresponding to the second touch input includes:
executing target control operation corresponding to the second touch input on M target sub-display areas associated with the M target control sub-areas;
wherein the target control operation comprises at least one of a zoom operation and an update of display content of at least one of the M target sub-display areas.
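A minimal sketch of the region mapping recited in claim 1: the display is divided into N sub-display areas, the first touch input summons N control sub-regions, each associated one-to-one with a sub-display area, and a second touch inside a control sub-region resolves to the index of its associated sub-display area. All geometry here (vertical splits, a bottom control bar) and all names are assumptions for illustration, not the claimed layout.

```python
def build_regions(screen_w: int, screen_h: int, n: int, bar_h: int = 100):
    """Split the screen into n vertical sub-display areas, and lay out
    n equally wide control sub-regions along an assumed bottom bar.

    Each region is an (x, y, width, height) rectangle; control
    sub-region i is associated with sub-display area i.
    """
    areas = [(i * screen_w // n, 0, screen_w // n, screen_h - bar_h)
             for i in range(n)]
    controls = [(i * screen_w // n, screen_h - bar_h, screen_w // n, bar_h)
                for i in range(n)]
    return areas, controls


def hit_control(controls, x: int, y: int):
    """Return the index of the control sub-region containing (x, y), or None."""
    for i, (cx, cy, cw, ch) in enumerate(controls):
        if cx <= x < cx + cw and cy <= y < cy + ch:
            return i
    return None
```

A second touch at a point inside control sub-region i would then have its target control operation (for example, a zoom or a content update) applied to sub-display area i.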
2. The method of claim 1, wherein receiving a first touch input of a first finger of a user on a display screen of a terminal device comprises:
receiving a first sliding input of a first fingerprint partition of a first finger of a user in a first direction on a display screen of a terminal device and a second sliding input of a second fingerprint partition of the first finger in a second direction on the display screen of the terminal device;
wherein the first touch input comprises the first sliding input and the second sliding input; the first fingerprint of the first finger comprises T fingerprint partitions, wherein T is a positive integer; the T fingerprint partitions include the first fingerprint partition and the second fingerprint partition.
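The two-partition sliding gesture of claim 2 can be sketched as a check that one fingerprint partition slid in a first direction while another partition of the same finger slid in a second direction. The partition identifiers and direction names below are illustrative assumptions.

```python
def is_trigger_gesture(slides) -> bool:
    """slides: list of (fingerprint_partition, direction) tuples observed
    during the first touch input.

    Returns True only when the assumed first partition slid in the
    assumed first direction and the second partition slid in the
    assumed second direction.
    """
    directions = {partition: direction for partition, direction in slides}
    return (directions.get("partition_1") == "left"
            and directions.get("partition_2") == "right")
```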
3. The method of claim 1, wherein receiving the second touch input by the user on the target control sub-area further comprises:
displaying a target indication mark in a target sub-display area associated with the target control sub-area;
controlling the target indication mark to move in the target sub-display area based on the second touch input.
4. The method of claim 3, wherein receiving a second touch input from the user at the target control sub-area comprises:
receiving a second touch input of a second finger of the user in the target control subarea;
the second touch input comprises a pressing input of a first area of a second finger of a user on the display screen of the terminal equipment, and the first area is a partial or whole fingerprint partition of the second fingerprint of the second finger; the second touch input further comprises a rotation input of rotating the second finger by a user in the process of performing the press input on the display screen of the terminal equipment by the first area of the second finger, and the area characteristic information of the first area changes in the process of rotating the second finger; the region feature information comprises an area feature or a fingerprint feature;
the controlling the target indication mark to move in the target sub-display area based on the second touch input comprises:
acquiring area characteristic change information corresponding to the second touch input;
determining the moving direction of the target indication mark based on the regional characteristic change information;
controlling the target indication mark to move in the target sub-display area according to the moving direction;
wherein the region characteristic change information includes area change information or fingerprint change information.
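The rotation-driven movement of claim 4 can be sketched by sampling the contact area of the first region over the course of the press and mapping the sign of the area change to a moving direction, then stepping the target indicator inside its sub-display area. The growth-means-right convention, step size, and bounds are assumed for illustration.

```python
def moving_direction(area_samples) -> str:
    """Derive the indicator's moving direction from area-change information.

    area_samples: contact areas sampled while the finger rotates.
    Assumed convention: a growing contact area moves the indicator
    right, a shrinking one moves it left.
    """
    delta = area_samples[-1] - area_samples[0]
    if delta > 0:
        return "right"
    if delta < 0:
        return "left"
    return "none"


def move_indicator(pos: int, direction: str, step: int = 5,
                   bounds=(0, 360)) -> int:
    """Move the target indicator along one axis of its sub-display area,
    clamped to the area's assumed bounds."""
    lo, hi = bounds
    if direction == "right":
        return min(pos + step, hi)
    if direction == "left":
        return max(pos - step, lo)
    return pos
```

The same shape would apply to fingerprint-feature change information: only the `moving_direction` input would differ.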
5. The method according to claim 3, wherein the performing a target control operation corresponding to the second touch input on a target object in a target sub-display area associated with the target control sub-area further comprises:
acquiring a target object located at the current display position of the target indication mark;
and executing target control operation corresponding to the second touch input on the target object.
6. A terminal device, comprising:
the first receiving module is used for receiving a first touch input of a first finger of a user on a display screen of the terminal equipment;
the first response module is used for responding to the first touch input and displaying N control subregions in the target region;
the second receiving module is used for receiving a second touch input of the user in the target control subarea;
a second response module, configured to, in response to the second touch input, execute a target control operation corresponding to the second touch input on a target object in a target sub-display area associated with the target control sub-area;
the display area of the display screen of the terminal equipment comprises N sub-display areas; a target control sub-region is one of the N control sub-regions; each of the N control subregions is associated with one of the N sub-display regions; n is an integer greater than 1;
the second receiving module includes:
the third receiving unit is used for receiving second touch input of M fingers of a user in the M target control sub-areas;
wherein the N control sub-regions comprise the M target control sub-regions; m is more than or equal to 2 and less than or equal to N;
the terminal device further includes:
a second identifier display module, configured to display a target indication identifier in each of M target sub-display regions associated with the M target control sub-regions;
the second response module includes:
a second operation execution unit, configured to execute a target control operation corresponding to the second touch input on the M target sub display areas associated with the M target control sub areas;
wherein the target control operation comprises at least one of a zoom operation and an update of display content of at least one of the M target sub-display areas.
7. The terminal device of claim 6, wherein the first receiving module comprises:
the terminal device comprises a first receiving unit, a second receiving unit and a control unit, wherein the first receiving unit is used for receiving a first sliding input of a first fingerprint partition of a first finger of a user in a first direction on a display screen of the terminal device and a second sliding input of a second fingerprint partition of the first finger in a second direction on the display screen of the terminal device;
wherein the first touch input comprises the first sliding input and the second sliding input; the first fingerprint of the first finger comprises T fingerprint partitions, wherein T is a positive integer; the T fingerprint partitions include the first fingerprint partition and the second fingerprint partition.
8. The terminal device of claim 6, further comprising:
the first mark display module is used for displaying a target indication mark in a target sub-display area associated with the target control sub-area;
and the moving module is used for controlling the target indication mark to move in the target sub-display area based on the second touch input.
9. The terminal device of claim 8, wherein the second receiving module comprises:
the second receiving unit is used for receiving second touch input of a second finger of the user in the target control subarea;
the second touch input comprises a pressing input of a first area of a second finger of a user on the display screen of the terminal equipment, and the first area is a partial or whole fingerprint partition of the second fingerprint of the second finger; the second touch input further comprises a rotation input of rotating the second finger by a user in the process of performing the press input on the display screen of the terminal equipment by the first area of the second finger, and the area characteristic information of the first area changes in the process of rotating the second finger; the region feature information comprises an area feature or a fingerprint feature;
the mobile module includes:
a change acquiring unit, configured to acquire region feature change information corresponding to the second touch input;
a direction determining unit, configured to determine a moving direction of the target indicator based on the region feature change information;
the moving unit is used for controlling the target indication mark to move in the target sub-display area according to the moving direction;
wherein the region characteristic change information includes area change information or fingerprint change information.
10. The terminal device of claim 8, wherein the second response module comprises:
the object acquisition unit is used for acquiring a target object positioned at the current display position of the target indication mark;
and the first operation execution unit is used for executing target control operation corresponding to the second touch input on the target object.
11. A terminal device, comprising: memory, processor and computer program stored on the memory and executable on the processor, which computer program, when executed by the processor, carries out the steps of the method according to any one of claims 1 to 5.
12. A computer-readable storage medium, characterized in that a computer program is stored on the computer-readable storage medium, which computer program, when being executed by a processor, carries out the steps of the method according to any one of claims 1 to 5.
CN201810650873.5A 2018-06-22 2018-06-22 Operation control method and terminal equipment Active CN108897477B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810650873.5A CN108897477B (en) 2018-06-22 2018-06-22 Operation control method and terminal equipment


Publications (2)

Publication Number Publication Date
CN108897477A CN108897477A (en) 2018-11-27
CN108897477B true CN108897477B (en) 2020-11-17

Family

ID=64345628

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810650873.5A Active CN108897477B (en) 2018-06-22 2018-06-22 Operation control method and terminal equipment

Country Status (1)

Country Link
CN (1) CN108897477B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109840129A (en) * 2019-01-30 2019-06-04 维沃移动通信有限公司 A kind of display control method and electronic equipment
CN109857241B (en) * 2019-02-27 2021-04-23 维沃移动通信有限公司 Display control method, terminal equipment and computer readable storage medium
CN114063838A (en) * 2020-08-07 2022-02-18 青岛海信商用显示股份有限公司 Processing method of touch operation of display equipment and display equipment

Citations (5)

Publication number Priority date Publication date Assignee Title
CN102830917A (en) * 2012-08-02 2012-12-19 上海华勤通讯技术有限公司 Mobile terminal and touch control establishing method thereof
CN102855066A (en) * 2012-09-26 2013-01-02 东莞宇龙通信科技有限公司 Terminal and terminal control method
CN105204756A (en) * 2014-06-30 2015-12-30 阿尔卡特朗讯 Method and device used for operating screen of touch screen device
CN106569723A (en) * 2016-10-28 2017-04-19 努比亚技术有限公司 Device and method for controlling cursor movement
CN107340966A (en) * 2017-06-28 2017-11-10 珠海市魅族科技有限公司 Terminal control method and device, computer installation and storage medium

Family Cites Families (1)

Publication number Priority date Publication date Assignee Title
KR101788051B1 (en) * 2011-01-04 2017-10-19 엘지전자 주식회사 Mobile terminal and method for controlling thereof




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant