CN106527903B - Touch control method and device

Touch control method and device

Info

Publication number
CN106527903B
Authority
CN
China
Prior art keywords
arm
touch
area
indication
display screen
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201611124493.5A
Other languages
Chinese (zh)
Other versions
CN106527903A (en)
Inventor
孙世嘉
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hisense Visual Technology Co Ltd
Original Assignee
Hisense Electric Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hisense Electric Co Ltd filed Critical Hisense Electric Co Ltd
Priority to CN201611124493.5A priority Critical patent/CN106527903B/en
Publication of CN106527903A publication Critical patent/CN106527903A/en
Application granted granted Critical
Publication of CN106527903B publication Critical patent/CN106527903B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04847 Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M1/00 Substation equipment, e.g. for use by subscribers
    • H04M1/72 Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724 User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72403 User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present application provides a touch control method and device applied to a terminal that has a touch display screen. The method includes: after a trigger event for displaying an indication arm is detected in a preset area of the touch display screen, displaying an indication arm with a preset length, where the indication arm has an indication end and the preset area lies near the bottom end of the touch display screen; after a sliding touch operation is detected on the preset area, moving the indication arm according to the sliding direction and the sliding distance of the sliding touch operation; and after an application operation event is detected on the touch display screen, executing the operation corresponding to the application operation event on the target application pointed to by the indication end of the indication arm. With this method, the user's touch control of the terminal's touch display screen is no longer constrained by the size of the screen, and the user's operating experience is improved.

Description

Touch control method and device
Technical Field
The present application relates to the field of communications technologies, and in particular, to a touch control method and apparatus.
Background
With the development of technology, the screens of smart terminals (such as smart phones) keep getting larger, and more and more applications can be installed on them. While this satisfies the user's visual experience and requirements, it also enlarges the user's touch operation area. Taking a smart phone as an example, a user usually holds the lower half of the phone and performs touch control on its touch display screen with the thumb, for example tapping an application icon to start an application. In this case, however, the area of the touch display screen that the user's thumb can reach is limited, and if the user wants to touch an application icon located near the upper end of the touch display screen with the thumb, the user has to change the holding gesture or operate with both hands, which affects the operating experience.
To solve this problem, in the prior art, so that the application icons on the touch display screen of the smart terminal fall within the area the user's thumb can reach, the smart terminal may, upon detecting a corresponding touch instruction from the user, move the application icons located near the upper end of the touch display screen into a blank area reachable by the thumb. As a result, when the user's touch operations are frequent, the smart terminal has to move the application icons frequently, which consumes terminal performance; at the same time the normal positions of the application icons on the touch display screen are changed, so the user's operating experience is still not well served.
Disclosure of Invention
In view of this, the present application provides a touch control method and device, so that the touch control of a user on a terminal touch display screen is not affected by the size of the touch display screen, and the operation experience of the user is improved.
Specifically, the method is realized through the following technical scheme:
according to a first aspect of an embodiment of the present application, there is provided a touch control method applied to a terminal, where the terminal has a touch display screen, the method including:
after a trigger event for displaying an indication arm is detected in a preset area of the touch display screen, displaying the indication arm with a preset length, wherein the indication arm is provided with an indication end, and the preset area is located in an area close to the bottom end of the touch display screen;
after the sliding touch operation is detected on the preset area, moving the indicating arm according to the sliding direction and the sliding distance of the sliding touch operation;
after an application operation event is detected on the touch display screen, executing an operation corresponding to the application operation event on a target application pointed by the indicating end of the indicating arm.
Optionally, the preset area includes at least two preset sub-areas; a correspondence between each sub-area and a preset length of the indication arm is stored on the terminal, and the closer a sub-area is to the top end of the touch display screen, the shorter the preset length of the indication arm corresponding to that sub-area;
after a trigger event for displaying an indication arm is detected in a preset area of the touch display screen, displaying the indication arm with a preset length comprises:
after a trigger event for displaying an indication arm is detected on one of the sub-regions of the touch display screen, determining the preset length of the indication arm corresponding to the sub-region where the trigger event is detected according to the corresponding relationship between the sub-regions and the preset length of the indication arm;
and displaying the indicating arm with the preset length.
Optionally, after detecting a trigger event for displaying an indication arm on a preset area of the touch display screen, displaying the indication arm with a preset length includes:
after a trigger event for displaying an indication arm is detected in a preset area of the touch display screen, determining whether the touch area corresponding to the trigger event in the preset area is a blank area;
and if the touch area corresponding to the trigger event on the preset area is a blank area, displaying an indication arm with a preset length.
Optionally, displaying the indication arm with the preset length includes:
creating a transparent display layer in a current display interface of the touch display screen;
and displaying an indication arm with a preset length in the transparent display layer.
Optionally, after the sliding touch operation is detected on the preset area, moving the indication arm according to the sliding direction and the sliding distance of the sliding touch operation includes:
determining a touch starting point and a touch end point corresponding to the sliding touch operation;
determining a sliding direction and a sliding distance of the sliding touch operation according to the touch starting point and the touch end point;
and controlling the indicating arm to move according to the sliding direction and the sliding distance.
Optionally, a sliding track of the sliding touch operation on the touch display screen coincides with a sliding track of the operation end of the indication arm.
Optionally, after detecting an application operation event on the touch display screen, before performing an operation corresponding to the application operation event on a target application pointed to by the indication end of the indication arm, the method further includes:
detecting whether a plurality of application icons exist in the indication direction of the indication end of the indication arm;
and if so, determining the application program corresponding to the application icon closest to the indicating end of the indicating arm as the target application.
According to a second aspect of the embodiments of the present application, there is provided a touch control apparatus for a terminal, the terminal having a touch display screen, the apparatus including:
the display module is used for displaying an indication arm with a preset length after a trigger event for displaying the indication arm is detected in a preset area of the touch display screen, wherein the indication arm is provided with an indication end, and the preset area is located in an area close to the bottom end of the touch display screen;
the moving module is used for moving the indicating arm according to the sliding direction and the sliding distance of the sliding touch operation after the sliding touch operation is detected in the preset area;
and the execution module is used for executing the operation corresponding to the application operation event on the target application pointed by the indicating end of the indicating arm after the application operation event is detected on the touch display screen.
Optionally, the preset area includes at least two preset sub-areas; a correspondence between each sub-area and a preset length of the indication arm is stored on the terminal, and the closer a sub-area is to the top end of the touch display screen, the shorter the preset length of the indication arm corresponding to that sub-area;
the display module includes:
the first determining submodule is used for determining the preset length of the indicating arm corresponding to the sub-region in which the triggering event is detected according to the corresponding relation between the sub-region and the preset length of the indicating arm after the triggering event for displaying the indicating arm is detected on one of the sub-regions of the touch display screen;
and the first display submodule is used for displaying the indication arm with the preset length.
Optionally, the display module includes:
the second determining submodule is used for determining, after a trigger event for displaying an indication arm is detected on a preset area of the touch display screen, whether the touch area corresponding to the trigger event on the preset area is a blank area;
and the second display submodule is used for displaying an indication arm with a preset length if the touch area corresponding to the trigger event on the preset area is a blank area.
Optionally, the display module includes:
the creating submodule is used for creating a transparent display layer in a current display interface of the touch display screen;
and the third display submodule is used for displaying an indication arm with a preset length in the transparent display layer.
Optionally, the moving module includes:
the third determining submodule is used for determining a touch starting point and a touch end point corresponding to the sliding touch operation;
a fourth determining submodule, configured to determine a sliding direction and a sliding distance of the sliding touch operation according to the touch start point and the touch end point;
and the movement control submodule is used for controlling the indication arm to move according to the sliding direction and the sliding distance.
Optionally, a sliding track of the sliding touch operation on the touch display screen coincides with a sliding track of the operation end of the indication arm.
Optionally, the apparatus further comprises:
the detection module is used for detecting whether a plurality of application icons exist in the indication direction of the indication end of the indication arm;
and the target determining module is used for determining the application program corresponding to the application icon closest to the indicating end of the indicating arm as the target application if a plurality of application icons exist in the indicating direction of the indicating end of the indicating arm.
As can be seen from the above embodiments, after the terminal detects a trigger event for displaying an indication arm on a preset area of the touch display screen, it displays an indication arm with a preset length. Because the user can move the indication arm through a sliding touch operation so that its indication end points at a target application, and can then operate that target application through the indication arm, the user can still reach and operate the target application even when the touch display screen is large and the thumb of the single hand holding the terminal cannot touch the target application directly, which improves the user's operating experience. Meanwhile, the arrangement and size of the application icons on the terminal's main interface remain unchanged, so the user's visual experience and operating experience are not affected.
Drawings
FIG. 1A is a flowchart of an embodiment of a touch control method of the present application;
fig. 1B is a schematic view of an application scenario for implementing the touch control method according to the embodiment of the present application;
fig. 1C is a schematic view of another application scenario for implementing the touch control method according to the embodiment of the present application;
fig. 1D is a schematic view of another application scenario for implementing the touch control method according to the embodiment of the present application;
FIG. 2A is a flowchart of another embodiment of a touch control method of the present application;
fig. 2B is a schematic view of another application scenario for implementing the touch control method according to the embodiment of the present application;
fig. 2C is a schematic view of another application scenario for implementing the touch control method according to the embodiment of the present application;
FIG. 3 is a hardware structure diagram of a terminal where the touch control device is located according to the present application;
FIG. 4 is a block diagram of an embodiment of a touch control device according to the present application.
Detailed Description
Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The implementations described in the following exemplary embodiments do not represent all implementations consistent with the present application. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the present application, as detailed in the appended claims.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the application. As used in this application and the appended claims, the singular forms "a", "an", and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used herein refers to and encompasses any and all possible combinations of one or more of the associated listed items.
It is to be understood that although the terms first, second, third, etc. may be used herein to describe various information, such information should not be limited to these terms. These terms are only used to distinguish one type of information from another. For example, first information may also be referred to as second information, and similarly, second information may also be referred to as first information, without departing from the scope of the present application. Depending on the context, the word "if" as used herein may be interpreted as "when", "upon", or "in response to a determination".
Please refer to fig. 1A, which is a flowchart of an embodiment of a touch control method of the present application, fig. 1B is a schematic view of an application scenario for implementing the touch control method of the present application, fig. 1C is a schematic view of another application scenario for implementing the touch control method of the present application, and fig. 1D is a schematic view of another application scenario for implementing the touch control method of the present application. The method illustrated in fig. 1A may be applied to a terminal having a touch display screen, such as the smartphone 11 illustrated in fig. 1B, 1C, and 1D, and the method illustrated in fig. 1A may include the following steps:
step S101: after a trigger event for displaying the indicating arm is detected in a preset area of the touch display screen, the indicating arm with a preset length is displayed, the indicating arm is provided with an indicating end, and the preset area is located in an area close to the bottom end on the touch display screen.
Taking the smart phone 11 illustrated in fig. 1B as an example, several application programs are installed on the smart phone 11. Typically, the application icons of these applications are arranged on the main interface (not shown in fig. 1B) of the smartphone 11 in order from top to bottom, from left to right.
In general, when the user uses the smartphone 11, the user holds the lower half of the smartphone 11 with his left or right hand, and performs touch control on a touch display (not shown in fig. 1B) of the smartphone 11 by using the thumb of the left or right hand.
However, when the user holds the smartphone 11 in the gesture described above, the area of the touch display screen of the smartphone 11 that the thumb can reach is limited and lies close to the bottom end of the screen. If the user wants to touch, with the thumb, an application icon such as "App 1" located near the top of the main interface, the user has to change the holding gesture or operate with both hands, which affects the operating experience.
So that the user's touch control of the smartphone 11 is not affected by the size of the touch display screen, and to improve the user's operating experience, in the present application the manufacturer of the smartphone 11 may define a preset area on the touch display screen of the smartphone 11. The preset area represents the region the thumb can reach when the user holds the smartphone 11 in the gesture described above. For example, as shown in fig. 1B, the touch display screen of the smartphone 11 includes a preset area 111, and the preset area 111 lies in the region of the touch display screen close to the bottom end.
In an embodiment, the user may perform a specific operation on the preset area 111 with the thumb to generate a trigger event for displaying the indication arm; the specific operation may include, for example, a long-press operation or a double-tap operation.
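For illustration only, the following sketch (not part of the original disclosure, and assuming an Android-style implementation) shows one way such a trigger could be recognized; `presetArea` and `showIndicationArm` are hypothetical placeholders for the preset area 111 and for the display step that follows:

```kotlin
import android.content.Context
import android.graphics.RectF
import android.view.GestureDetector
import android.view.MotionEvent
import android.view.View

// Hypothetical helper: forwards long-press / double-tap gestures that start
// inside the preset area to a callback that shows the indication arm.
class TriggerDetector(
    context: Context,
    private val presetArea: RectF,                            // preset area near the bottom of the screen
    private val showIndicationArm: (x: Float, y: Float) -> Unit
) : View.OnTouchListener {

    private val detector = GestureDetector(context, object : GestureDetector.SimpleOnGestureListener() {
        // A long press inside the preset area is treated as the trigger event.
        override fun onLongPress(e: MotionEvent) {
            if (presetArea.contains(e.x, e.y)) showIndicationArm(e.x, e.y)
        }

        // A double tap inside the preset area is treated the same way.
        override fun onDoubleTap(e: MotionEvent): Boolean {
            if (presetArea.contains(e.x, e.y)) {
                showIndicationArm(e.x, e.y)
                return true
            }
            return false
        }
    })

    override fun onTouch(v: View, event: MotionEvent): Boolean = detector.onTouchEvent(event)
}
```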
When the smartphone 11 detects the trigger event for displaying the indication arm on the preset area 111, it may display an indication arm with a preset length on the main interface. For example, as shown in fig. 1B, an indication arm 112 is displayed on the main interface of the smartphone 11, and the indication arm 112 has an indication end 1121 and an operation end 1122.
FIG. 1B shows the indication arm 112 perpendicular to the bottom end of the smartphone 11 only as an example; the present application does not limit the inclination direction of the indication arm 112 relative to the bottom end of the smartphone 11.
In one embodiment, the manufacturer of the smartphone 11 may install an application program corresponding to the indication arm 112 on the smartphone 11. After the smartphone 11 detects the trigger event for displaying the indication arm on the preset area 111, it creates a transparent display layer in the current display interface of the touch display screen by calling the application program corresponding to the indication arm 112, and then draws the indication arm 112 in that transparent display layer. In the transparent display layer, every region other than the one occupied by the indication arm 112 is transparent, so the user intuitively sees the indication arm 112 on top of the current interface.
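As a rough sketch of this transparent layer (an assumption of this description, not code from the patent), a custom Android View with a transparent background could draw nothing but the arm and be attached on top of the current interface, for example via Activity.addContentView:

```kotlin
import android.content.Context
import android.graphics.Canvas
import android.graphics.Color
import android.graphics.Paint
import android.view.View
import kotlin.math.cos
import kotlin.math.sin

// Hypothetical transparent layer: only the indication arm is drawn; the rest
// of the view stays transparent so the home screen remains visible beneath it.
class IndicationArmView(context: Context) : View(context) {

    private val paint = Paint(Paint.ANTI_ALIAS_FLAG).apply {
        color = Color.BLACK
        strokeWidth = 8f
    }

    // Operation end 1122 (near the thumb); the indication end 1121 is derived
    // from it using the preset length and the arm's inclination angle.
    var operationX = 0f
    var operationY = 0f
    var presetLength = 0f
    var angleRad = (-Math.PI / 2).toFloat()   // pointing straight up by default

    init {
        setBackgroundColor(Color.TRANSPARENT)
    }

    override fun onDraw(canvas: Canvas) {
        super.onDraw(canvas)
        val indicationX = operationX + presetLength * cos(angleRad)
        val indicationY = operationY + presetLength * sin(angleRad)
        // Arm body plus a small dot marking the indication end.
        canvas.drawLine(operationX, operationY, indicationX, indicationY, paint)
        canvas.drawCircle(indicationX, indicationY, 12f, paint)
    }

    // Moves the whole arm by the given offsets and redraws it.
    fun moveBy(dx: Float, dy: Float) {
        operationX += dx
        operationY += dy
        invalidate()
    }
}
```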
Step S102: and after the sliding touch operation is detected on the preset area, moving the indicating arm according to the sliding direction and the sliding distance of the sliding touch operation.
In the following, the case in which the user wants to touch the application icon "App 3" on the main interface of the smartphone 11 is taken as an example:
in an alternative implementation, after the indication arm 112 is displayed on the current display interface of the touch display screen of the smartphone 11, the user may touch the operation end 1122 of the indication arm 112 with the thumb, and then the user slides the thumb on the preset area 111 to generate the sliding touch operation. After the smartphone 11 detects the sliding touch operation on the preset area 111, the pointing arm 112 may be moved according to the sliding direction and the sliding distance of the sliding touch operation.
In the above process, the slide trajectory of the slide touch operation by the user on the touch display screen coincides with the slide trajectory of the operation end 1122 of the instruction arm 112. When the user determines that the pointing end 1121 of the pointing arm 112 has pointed at the application icon "app 3", the user may lift the thumb so that the smartphone 11 stops moving the pointing arm 112, at which point the position of the pointing arm 112 on the smartphone 11 home interface may be as shown in fig. 1C.
It should be noted that after the user's thumb leaves the touch display screen of the smartphone 11, the position of the indication arm 112 does not change, so the indication end 1121 of the indication arm 112 still points at the application icon "App 3".
In another alternative implementation, after the indication arm 112 is displayed on the current display interface of the touch display screen of the smartphone 11, the user may touch a blank spot in the preset area 111 with the thumb and slide the thumb within the preset area 111 to generate a sliding touch operation. After the smartphone 11 detects the sliding touch operation on the preset area 111, it may move the indication arm 112 according to the sliding direction and the sliding distance of the sliding touch operation.
Specifically, after the smartphone 11 detects the sliding touch operation on the preset area 111, it may determine a touch start point and a touch end point corresponding to the sliding touch operation, and determine a sliding direction and a sliding distance of the sliding touch operation according to the touch start point and the touch end point, so as to control the indication arm 112 to move according to the sliding direction and the sliding distance of the sliding touch operation.
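The following fragment is a hedged sketch of that control logic (again assuming the Android-style pieces above; `presetArea` stands for the preset area 111 and `moveArm` for whatever actually repositions the arm, e.g. the moveBy of the earlier sketch):

```kotlin
import android.graphics.RectF
import android.view.MotionEvent

// Hypothetical helper: turns a slide inside the preset area into arm movement.
class SlideTracker(
    private val presetArea: RectF,
    private val moveArm: (dx: Float, dy: Float) -> Unit
) {
    private var startX = 0f
    private var startY = 0f
    private var tracking = false

    fun onTouchEvent(event: MotionEvent): Boolean {
        when (event.action) {
            MotionEvent.ACTION_DOWN -> {
                // Touch starting point; only slides that begin in the preset
                // area are allowed to move the arm.
                tracking = presetArea.contains(event.x, event.y)
                startX = event.x
                startY = event.y
                return tracking
            }
            MotionEvent.ACTION_MOVE, MotionEvent.ACTION_UP -> {
                if (!tracking) return false
                // Touch end point of this segment. The vector (dx, dy) carries
                // both the sliding direction and the sliding distance; moving
                // the arm by the same vector keeps its trajectory coincident
                // with the thumb's slide.
                val dx = event.x - startX
                val dy = event.y - startY
                if (dx != 0f || dy != 0f) moveArm(dx, dy)
                startX = event.x
                startY = event.y
                if (event.action == MotionEvent.ACTION_UP) tracking = false
                return true
            }
        }
        return false
    }
}
```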
Step S103: after the application operation event is detected on the touch display screen, the operation corresponding to the application operation event is executed on the target application pointed by the indicating end of the indicating arm.
In one embodiment, as shown in fig. 1C, after determining that the indication end 1121 of the indication arm 112 has pointed at the target application, the user may tap the operation end 1122 of the indication arm 112 with the thumb to generate an application operation event. After detecting the application operation event on the touch display screen, the smartphone 11 may execute the operation corresponding to that event on the target application pointed to by the indication end 1121 of the indication arm 112, for example, starting the target application.
In an embodiment, after the user taps the operation end 1122 of the indication arm 112 with the thumb to generate an application operation event, and the smartphone 11 detects that application operation event on the touch display screen, the smartphone may first detect whether more than one application icon exists in the indication direction of the indication end 1121 of the indication arm 112. Here, "the indication direction of the indication end 1121 of the indication arm 112" refers to the direction extending from the operation end 1122 of the indication arm 112 toward the indication end 1121, and "whether more than one application icon exists in the indication direction" means whether the virtual extension line along that direction passes through more than one application icon.
For example, as shown in fig. 1D, the smartphone 11 may detect that the application icons "App 7" and "App 3" both lie in the indication direction of the indication end 1121 of the indication arm 112. In this case, the smartphone 11 may determine the application program corresponding to the application icon closest to the indication end 1121 of the indication arm 112, namely "App 7", as the target application.
After that, in response to the detected application operation event, the smartphone 11 may execute the operation corresponding to that event on the target application pointed to by the indication end 1121 of the indication arm 112, for example, starting the target application.
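For illustration only, choosing between several icons in the indication direction could be sketched as follows; `AppIcon`, the circular hit area, and the tolerance used below are assumptions of this sketch, not details given in the patent:

```kotlin
import kotlin.math.abs
import kotlin.math.hypot

// Hypothetical icon description: centre point plus an approximate radius.
data class AppIcon(val name: String, val centerX: Float, val centerY: Float, val radius: Float)

fun pickTargetApp(
    operationX: Float, operationY: Float,      // operation end 1122
    indicationX: Float, indicationY: Float,    // indication end 1121
    icons: List<AppIcon>
): AppIcon? {
    // Unit vector of the indication direction (operation end -> indication end).
    val dx = indicationX - operationX
    val dy = indicationY - operationY
    val len = hypot(dx, dy)
    if (len == 0f) return null
    val ux = dx / len
    val uy = dy / len

    return icons
        .filter { icon ->
            // Position of the icon centre along the virtual extension line,
            // measured from the indication end, and its distance from the line.
            val along = (icon.centerX - indicationX) * ux + (icon.centerY - indicationY) * uy
            val offLine = abs((icon.centerX - indicationX) * uy - (icon.centerY - indicationY) * ux)
            // Keep icons the extension line passes through (including one the
            // indication end already overlaps).
            along >= -icon.radius && offLine <= icon.radius
        }
        // Several candidates, such as "App 7" and "App 3" in fig. 1D: the one
        // nearest the indication end becomes the target application.
        .minByOrNull { hypot(it.centerX - indicationX, it.centerY - indicationY) }
}
```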
As can be seen from the above embodiments, after the terminal detects a trigger event for displaying an indication arm on a preset area of the touch display screen, it displays an indication arm with a preset length. Because the user can move the indication arm through a sliding touch operation so that its indication end points at a target application, and can then operate that target application through the indication arm, the user can still reach and operate the target application even when the touch display screen is large and the thumb of the single hand holding the terminal cannot touch the target application directly, which improves the user's operating experience. Meanwhile, the arrangement and size of the application icons on the terminal's main interface remain unchanged, so the user's visual experience and operating experience are not affected.
Please refer to fig. 2A, which is a flowchart of another embodiment of the touch control method of the present application; fig. 2B and fig. 2C are schematic views of further application scenarios for implementing the touch control method of the present application. Building on the flow shown in fig. 1A, the flow shown in fig. 2A focuses on the process of displaying the indication arm and may include the following steps:
step S201: after a trigger event of a display indication arm is detected in a preset area of a touch display screen, determining whether a first touch area corresponding to the trigger event in the preset area is a blank area, and if the first touch area is the blank area, executing step S202; if the first touch area is not the blank area, step S203 is executed.
Step S202: and displaying an indicating arm with a preset length, and ending the process.
Step S203: the display of the indicating arm is inhibited.
In the present application, if the first touch area corresponding to the trigger event on the touch display screen is a blank area, that is, no application icon lies within that touch area, the smartphone 11 may display an indication arm with a preset length; if the first touch area is not a blank area, that is, an application icon lies within it, the display of the indication arm may be prohibited. It should be noted that the specific operation performed by the smartphone 11 in the latter case is not limited in the present application.
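A minimal sketch of this blank-area check, under the assumption that the implementation knows the bounding box of every icon on the current interface (the `iconBounds` map below is such an assumption):

```kotlin
import android.graphics.RectF

// Returns true only when the trigger point lies inside the preset area and
// does not overlap any application icon, i.e. the touched spot is blank.
fun shouldShowIndicationArm(
    touchX: Float,
    touchY: Float,
    presetArea: RectF,
    iconBounds: Map<String, RectF>   // icon name -> bounding box on screen
): Boolean {
    if (!presetArea.contains(touchX, touchY)) return false
    return iconBounds.values.none { it.contains(touchX, touchY) }
}
```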
In addition, to keep the size of the touch display screen from affecting the user's touch control of the terminal even more effectively and to further improve the operating experience, in the present application the preset area 111 on the smartphone 11 may be further divided into at least two sub-areas, each corresponding to a different preset length of the indication arm: the closer a sub-area is to the top end of the smartphone 11, the shorter the preset length of the corresponding indication arm may be. The correspondence between the sub-areas and the preset lengths of the indication arm may be stored on the smartphone 11.
In the following, the preset area 111 on the smartphone 11 is divided into two sub-areas as an example. As shown in fig. 2B or fig. 2C, the preset area 111 on the smartphone 11 includes a sub-area 1111 and a sub-area 1112, where the sub-area 1111 lies above the sub-area 1112. For convenience of description, the preset length of the indication arm corresponding to the sub-area 1111 is referred to as the first preset length, and the preset length of the indication arm corresponding to the sub-area 1112 as the second preset length; since the sub-area 1111 is closer to the top end of the smartphone 11 than the sub-area 1112, the first preset length may be smaller than the second preset length.
As shown in fig. 2B, if the smartphone 11 detects a trigger event for displaying the indication arm on the sub-area 1111, it may determine the first preset length of the indication arm corresponding to the sub-area 1111 according to the stored correspondence between the sub-areas and the preset lengths of the indication arm, and then display an indication arm with the first preset length on the main interface; as shown in fig. 2C, if the smartphone 11 detects a trigger event for displaying the indication arm on the sub-area 1112, it may determine the second preset length of the indication arm corresponding to the sub-area 1112 according to the same correspondence, and then display an indication arm with the second preset length on the main interface.
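The stored correspondence could look roughly like the following sketch; the concrete boundaries and lengths are invented for illustration, and the only property carried over from the description above is that the sub-area nearer the top maps to the shorter arm:

```kotlin
import android.graphics.RectF

// Hypothetical lookup table from trigger position to preset arm length.
class PresetLengthTable(screenWidth: Float, screenHeight: Float) {

    // Preset area = bottom quarter of the screen, split into an upper and a
    // lower sub-area (playing the roles of 1111 and 1112 in fig. 2B/2C).
    private val upperSubArea = RectF(0f, screenHeight * 0.75f, screenWidth, screenHeight * 0.875f)
    private val lowerSubArea = RectF(0f, screenHeight * 0.875f, screenWidth, screenHeight)

    // The upper sub-area (closer to the top) gets the shorter first preset
    // length; the lower sub-area gets the longer second preset length.
    private val lengths = listOf(
        upperSubArea to screenHeight * 0.45f,
        lowerSubArea to screenHeight * 0.60f
    )

    // Returns the preset length for a trigger point, or null when the point
    // lies outside every sub-area.
    fun lengthFor(x: Float, y: Float): Float? =
        lengths.firstOrNull { (area, _) -> area.contains(x, y) }?.second
}
```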
By dividing the preset area into sub-areas in this way, the length of the indication arm can be controlled flexibly, which further improves the user experience.
As can be seen from the above embodiment, after the terminal detects a trigger event for displaying the indication arm on the preset area of the touch display screen, it displays the indication arm with the preset length only when it determines that the first touch area corresponding to the trigger event on the preset area is a blank area, which effectively avoids misoperation and improves the user's operating experience.
Corresponding to the embodiment of the touch control method, the application also provides an embodiment of the touch control device.
The embodiment of the touch control device can be applied to the terminal. The device embodiment may be implemented by software, by hardware, or by a combination of hardware and software. Taking a software implementation as an example, the device, as a logical device, is formed by the processor of the terminal where the device is located reading the corresponding computer program instructions from the non-volatile memory into memory and running them. In terms of hardware, fig. 3 is a hardware structure diagram of a terminal where the touch control device of the present application is located; in addition to the processor 31, the memory 32, the network interface 33, and the non-volatile memory 34 shown in fig. 3, the terminal in this embodiment may also include other hardware according to the actual functions of the terminal, which is not described again here.
Referring to fig. 4, a block diagram of an embodiment of a touch control apparatus according to the present application, where the apparatus may be applied to a terminal, the terminal has a touch display screen, and the apparatus may include: a display module 41, a moving module 42 and an execution module 43.
The display module 41 may be configured to display an indication arm with a preset length after detecting a trigger event for displaying the indication arm on a preset area of the touch display screen, where the indication arm has an indication end, and the preset area is located in an area on the touch display screen close to a bottom end;
the moving module 42 may be configured to, after detecting the sliding touch operation on the preset area, move the indication arm according to a sliding direction and a sliding distance of the sliding touch operation;
the executing module 43 may be configured to, after detecting an application operation event on the touch display screen, execute an operation corresponding to the application operation event on a target application pointed to by the indication end of the indication arm.
In an embodiment, the preset area may include at least two preset sub-areas, a correspondence between each sub-area and a preset length of the indication arm is stored on the terminal, and the closer a sub-area is to the top end of the touch display screen, the shorter the preset length of the indication arm corresponding to that sub-area; the display module 41 may include (not shown in fig. 4) a first determining submodule and a first display submodule.
The first determining sub-module may be configured to, after a trigger event for displaying an indication arm is detected on one of the sub-regions of the touch display screen, determine a preset length of the indication arm corresponding to the sub-region where the trigger event is detected according to a correspondence between the sub-region and a preset length of the indication arm;
the first display sub-module may be configured to display the indication arm with the preset length.
In one embodiment, the display module 41 may include (not shown in fig. 4): a second determining submodule and a second displaying submodule.
The second determining submodule may be configured to determine, after a trigger event for displaying the indication arm is detected on a preset area of the touch display screen, whether the touch area corresponding to the trigger event on the preset area is a blank area;
the second display sub-module may be configured to display an indication arm with a preset length if the touch area corresponding to the trigger event on the preset area is a blank area.
In one embodiment, the display module 41 may include (not shown in fig. 4): creating a sub-module and a third display sub-module.
The creating submodule can be used for creating a transparent display layer in a current display interface of the touch display screen;
the third display sub-module may be configured to display an indication arm with a preset length in the transparent display layer.
In one embodiment, the movement module 42 may include (not shown in fig. 4): a third determining submodule, a fourth determining submodule and a movement control submodule.
The third determining submodule may be configured to determine a touch start point and a touch end point corresponding to the sliding touch operation;
the fourth determining submodule can be used for determining the sliding direction and the sliding distance of the sliding touch operation according to the touch starting point and the touch end point;
the movement control submodule can be used for controlling the indicating arm to move according to the sliding direction and the sliding distance.
In one embodiment, a sliding track of the sliding touch operation on the touch display screen coincides with a sliding track of the operation end of the indication arm.
In an embodiment, the apparatus may further comprise (not shown in fig. 4): the device comprises a detection module and a target determination module.
The detection module may be configured to detect whether a plurality of application icons exist in the indication direction of the indication end of the indication arm;
the target determination module may be configured to determine, as the target application, an application program corresponding to an application icon closest to the indicating end of the indicating arm if a plurality of application icons exist in the indicating direction of the indicating end of the indicating arm.
The implementation process of the functions and actions of each unit in the above device is specifically described in the implementation process of the corresponding step in the above method, and is not described herein again.
For the device embodiments, since they substantially correspond to the method embodiments, reference may be made to the partial description of the method embodiments for relevant points. The above-described embodiments of the apparatus are merely illustrative, and the units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the modules can be selected according to actual needs to achieve the purpose of the scheme of the application. One of ordinary skill in the art can understand and implement it without inventive effort.
The above description is only exemplary of the present application and should not be taken as limiting the present application, as any modification, equivalent replacement, or improvement made within the spirit and principle of the present application should be included in the scope of protection of the present application.

Claims (10)

1. A touch control method is applied to a terminal, the terminal is provided with a touch display screen, and the method is characterized by comprising the following steps:
after a trigger event for displaying an indication arm is detected in a preset area of the touch display screen, displaying the indication arm with a preset length, wherein the indication arm is provided with an indication end, and the preset area is located in an area close to the bottom end of the touch display screen;
after the sliding touch operation is detected on the preset area, moving the indicating arm according to the sliding direction and the sliding distance of the sliding touch operation;
after an application operation event is detected on the touch display screen, executing an operation corresponding to the application operation event on a target application pointed to by the indicating end of the indicating arm, wherein the application operation event is generated when a click operation on the operation end of the indicating arm is detected.
2. The method of claim 1, wherein the preset area comprises at least two sub-areas; a correspondence between each sub-area and a preset length of the indication arm is stored on the terminal, and the closer a sub-area is to the top end of the touch display screen, the shorter the preset length of the indication arm corresponding to that sub-area;
after a trigger event for displaying an indication arm is detected in a preset area of the touch display screen, displaying the indication arm with a preset length comprises:
after a trigger event for displaying an indication arm is detected on one of the sub-regions of the touch display screen, determining the preset length of the indication arm corresponding to the sub-region where the trigger event is detected according to the corresponding relationship between the sub-regions and the preset length of the indication arm;
and displaying the indicating arm with the preset length.
3. The method of claim 1, wherein displaying a pointing arm of a preset length after detecting a triggering event for displaying a pointing arm on a preset area of the touch display screen comprises:
after a trigger event for displaying an indication arm is detected in a preset area of the touch display screen, determining whether the touch area corresponding to the trigger event in the preset area is a blank area;
and if the touch area corresponding to the trigger event on the preset area is a blank area, displaying an indication arm with a preset length.
4. The method of claim 1, wherein displaying the indication arm with the preset length comprises:
creating a transparent display layer in a current display interface of the touch display screen;
and displaying an indication arm with a preset length in the transparent display layer.
5. The method according to claim 1, wherein moving the pointing arm according to the sliding direction and the sliding distance of the sliding touch operation after the sliding touch operation is detected on the preset area comprises:
determining a touch starting point and a touch end point corresponding to the sliding touch operation;
determining a sliding direction and a sliding distance of the sliding touch operation according to the touch starting point and the touch end point;
and controlling the indicating arm to move according to the sliding direction and the sliding distance.
6. The method according to claim 1, wherein a sliding trajectory of the sliding touch operation on the touch display screen coincides with a sliding trajectory of the operation end of the pointing arm.
7. The method according to claim 1, wherein after detecting an application operation event on the touch display screen, before performing an operation corresponding to the application operation event on a target application pointed to by the indicating end of the indicating arm, the method further comprises:
detecting whether a plurality of application icons exist in the indication direction of the indication end of the indication arm;
and if so, determining the application program corresponding to the application icon closest to the indicating end of the indicating arm as the target application.
8. A touch control device is applied to a terminal, the terminal is provided with a touch display screen, and the device is characterized by comprising:
the display module is used for displaying an indication arm with a preset length after a trigger event for displaying the indication arm is detected in a preset area of the touch display screen, wherein the indication arm is provided with an indication end, and the preset area is located in an area close to the bottom end of the touch display screen;
the moving module is used for moving the indicating arm according to the sliding direction and the sliding distance of the sliding touch operation after the sliding touch operation is detected in the preset area;
the execution module is used for executing, after an application operation event is detected on the touch display screen, the operation corresponding to the application operation event on the target application pointed to by the indicating end of the indicating arm, wherein the application operation event is generated when a click operation on the operation end of the indicating arm is detected.
9. The apparatus of claim 8, wherein the preset area comprises at least two sub-areas; a correspondence between each sub-area and a preset length of the indication arm is stored on the terminal, and the closer a sub-area is to the top end of the touch display screen, the shorter the preset length of the indication arm corresponding to that sub-area;
the display module includes:
the first determining submodule is used for determining the preset length of the indicating arm corresponding to the sub-region in which the triggering event is detected according to the corresponding relation between the sub-region and the preset length of the indicating arm after the triggering event for displaying the indicating arm is detected on one of the sub-regions of the touch display screen;
and the first display submodule is used for displaying the indication arm with the preset length.
10. The apparatus of claim 8, wherein the display module comprises:
the second determining submodule is used for determining, after a trigger event for displaying an indication arm is detected on a preset area of the touch display screen, whether the touch area corresponding to the trigger event on the preset area is a blank area;
and the second display submodule is used for displaying an indication arm with a preset length if the touch area corresponding to the trigger event on the preset area is a blank area.
CN201611124493.5A 2016-12-08 2016-12-08 Touch control method and device Active CN106527903B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201611124493.5A CN106527903B (en) 2016-12-08 2016-12-08 Touch control method and device

Publications (2)

Publication Number Publication Date
CN106527903A CN106527903A (en) 2017-03-22
CN106527903B true CN106527903B (en) 2020-04-07

Family

ID=58342553

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201611124493.5A Active CN106527903B (en) 2016-12-08 2016-12-08 Touch control method and device

Country Status (1)

Country Link
CN (1) CN106527903B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108536291A (en) * 2018-03-29 2018-09-14 努比亚技术有限公司 A kind of application operating method, wearable device and storage medium
CN111338494B (en) * 2018-12-19 2022-03-25 华为技术有限公司 Touch display screen operation method and user equipment
CN109885242B (en) * 2019-01-18 2021-07-27 维沃移动通信有限公司 Method for executing operation and electronic equipment
CN111752425B (en) * 2019-03-27 2022-02-15 北京外号信息技术有限公司 Method for selecting an interactive object on a display medium of a device

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103677556A (en) * 2012-09-24 2014-03-26 北京三星通信技术研究有限公司 Method and device for locating application program quickly
JP2014026666A (en) * 2013-09-19 2014-02-06 Aplix Ip Holdings Corp User interface device
CN104380238A (en) * 2013-12-03 2015-02-25 华为技术有限公司 Processing method, device and terminal
CN105320420A (en) * 2014-07-31 2016-02-10 中兴通讯股份有限公司 Mobile terminal and method for realizing one-hand operation of mobile terminal
CN104714726A (en) * 2015-04-01 2015-06-17 王明 Control device capable of operating touch screen of mobile terminal by one hand, and control method of control device

Also Published As

Publication number Publication date
CN106527903A (en) 2017-03-22

Similar Documents

Publication Publication Date Title
RU2523169C2 (en) Panning content using drag operation
CN106527903B (en) Touch control method and device
CN105335048B (en) Electronic equipment with hidden application icon and method for hiding application icon
US20130080951A1 (en) Device and method for moving icons across different desktop screens and related computer readable storage media comprising computer executable instructions
KR101328202B1 (en) Method and apparatus for running commands performing functions through gestures
CN107704157B (en) Multi-screen interface operation method and device and storage medium
CN103207750A (en) Method and device for scaling icon
CN103955331A (en) Display processing method and device of application icon
JP2016529635A (en) Gaze control interface method and system
JP5449630B1 (en) Programmable display and its screen operation processing program
WO2011157527A1 (en) Contextual hierarchical menu system on touch screens
CN102402375A (en) Display terminal and display method
CN104536643A (en) Icon dragging method and terminal
CN107688420B (en) Method for starting floating object and mobile terminal
US9739995B2 (en) Operating system and method for displaying an operating area
KR20160068623A (en) Method and apparatus for reconfiguring icon location
CN112379958A (en) Sliding control method and device for application program page
TWI483172B (en) Method and system for arranging a user interface of the electronic device
CN113721819B (en) Man-machine interaction method and device and electronic equipment
US10642481B2 (en) Gesture-based interaction method and interaction apparatus, and user equipment
EP3371686B1 (en) Improved method for selecting an element of a graphical user interface
CN108132721B (en) Method for generating drag gesture, touch device and portable electronic equipment
CN103279304A (en) Method and device for displaying selected icon and mobile device
CN107211056B (en) Terminal wallpaper control method and terminal
KR20150049716A (en) Method and apparatus for changing a displaying magnification of an object on touch-screen display

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CP01 Change in the name or title of a patent holder

Address after: 266555, No. 218, Bay Road, Qingdao economic and Technological Development Zone, Shandong

Patentee after: Hisense Visual Technology Co., Ltd.

Address before: 266555, No. 218, Bay Road, Qingdao economic and Technological Development Zone, Shandong

Patentee before: QINGDAO HISENSE ELECTRONICS Co.,Ltd.
