CN108132721B - Method for generating drag gesture, touch device and portable electronic equipment - Google Patents


Info

Publication number: CN108132721B (application CN201711403387.5A)
Authority: CN (China)
Prior art keywords: touch, single finger, movement, finger, drag gesture
Legal status: Active (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Other languages: Chinese (zh)
Other versions: CN108132721A
Inventor: Liu Xiaojun (刘晓军)
Original and current assignee: Lenovo Beijing Ltd (the listed assignees may be inaccurate)
Application filed by Lenovo Beijing Ltd
Publication of application: CN108132721A
Application granted; publication of grant: CN108132721B

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03: Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033: Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; accessories therefor
    • G06F3/0354: Pointing devices with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
    • G06F3/03547: Touch pads, in which fingers can move on a surface
    • G06F3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487: Interaction techniques using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488: Interaction techniques using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F2203/00: Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/041: Indexing scheme relating to G06F3/041 - G06F3/045
    • G06F2203/04104: Multi-touch detection in digitiser, i.e. details about the simultaneous detection of a plurality of touching locations, e.g. multiple fingers or pen and finger

Abstract

A method of generating a drag gesture, comprising: after a first touch event of a single finger is detected, determining a first movement trajectory generated by the single finger touching the touch device; when another finger is detected while the single finger is touching the touch device, determining the end point of the first movement trajectory; after the single finger is detected again at a position different from the end point of the first movement trajectory, determining a second movement trajectory generated by the single finger touching the touch device; and generating a drag gesture based on at least the first and second movement trajectories.

Description

Method for generating drag gesture, touch device and portable electronic equipment
Technical Field
The present invention relates to the field of electronic devices, and more particularly, to a method for generating a drag gesture, a touch device and a portable electronic device.
Background
Electronic devices such as notebook computers are now in widespread use, and users can interact with a notebook computer through its touch pad to carry out various operations. For example, a user may generate a drag gesture by double-clicking the touch pad with a single finger and then moving the finger across the pad, so that the notebook computer performs a corresponding action, such as selecting an area on the screen. Interaction between the user and the notebook computer is thus achieved through the drag gesture.
However, the area of a notebook touch pad is limited, and the drag gesture is considered finished as soon as the finger is detected leaving the touch pad. As a result, the drag range corresponding to a drag gesture is very limited if the operating system is to recognize the gesture with reasonable accuracy. For example, when a drawing application runs on the notebook computer and the user wants to select a region of a picture, the user can only make the drag gesture within the limited area of the touch pad, so the corresponding drag range in the drawing application is very limited and a large region cannot be selected, which degrades the user experience.
Disclosure of Invention
In order to solve the above technical problems in the prior art, according to an aspect of the present invention, there is provided a method of generating a drag gesture, including: after a first touch event of a single finger is detected, determining a first movement trajectory generated by the single finger touching the touch device; when another finger is detected while the single finger is touching the touch device, determining the end point of the first movement trajectory; after the single finger is detected again at a position different from the end point of the first movement trajectory, determining a second movement trajectory generated by the single finger touching the touch device; and generating a drag gesture based on at least the first and second movement trajectories.
Further, according to an embodiment of the present invention, the first touch event is at least one of: a double-click operation of a single finger on the touch device, a touch having a predetermined contact area, or a touch lasting for a predetermined time.
Further, according to an embodiment of the present invention, the end point of the second movement trajectory is determined when the single finger is detected leaving the touch device.
Further, according to an embodiment of the present invention, the end point of the second movement trajectory is determined when another finger is detected while the single finger is touching the touch device and a predetermined time then elapses.
Further, according to an embodiment of the present invention, the end point of the second movement trajectory is determined when another finger is detected while the single finger is touching the touch device. In this case, the method further includes: after the single finger is detected again at a position different from the end point of the second movement trajectory, determining a third movement trajectory generated by the single finger touching the touch device; and generating the drag gesture based on the first, second, and third movement trajectories.
Further, according to an embodiment of the present invention, the drag gesture is generated while the first movement trajectory and/or the second movement trajectory is being determined.
Further, according to an embodiment of the present invention, the drag gesture is generated after the first and second movement trajectories are determined.
According to another aspect of the present invention, there is provided a touch device, including: a touch detection unit configured to detect a touch of a user's finger on the touch device; a trajectory determination unit configured to determine a movement trajectory of a single finger on the touch device; a control unit configured to: after a first touch event of a single finger is detected, determine a first movement trajectory generated by the single finger touching the touch device; when another finger is detected while the single finger is touching the touch device, determine the end point of the first movement trajectory; and after the single finger is detected again at a position different from the end point of the first movement trajectory, determine a second movement trajectory generated by the single finger touching the touch device; and a drag gesture generation unit configured to generate a drag gesture based on at least the first and second movement trajectories.
Further, according to an embodiment of the present invention, the touch device also includes a drag gesture transmission unit configured to transmit the drag gesture to a remote device connected to the touch device, so that the remote device operates according to the drag gesture.
According to another aspect of the present invention, there is provided a portable electronic device including a display, a central processing unit, and a touch pad. The touch pad includes: a touch detection unit configured to detect a touch of a user's finger on the touch pad; a trajectory determination unit configured to determine a movement trajectory of a single finger on the touch pad; a control unit configured to: after a first touch event of a single finger is detected, determine a first movement trajectory generated by the single finger touching the touch pad; when another finger is detected while the single finger is touching the touch pad, determine the end point of the first movement trajectory; and after the single finger is detected again at a position different from the end point of the first movement trajectory, determine a second movement trajectory generated by the single finger touching the touch pad; and a drag gesture generation unit configured to generate a drag gesture based on at least the first and second movement trajectories. The central processing unit operates according to the drag gesture generated by the touch pad.
According to the method, touch device, and portable electronic device of the above aspects of the invention, when another finger is detected while a single finger is touching the touch device, a pause of the drag gesture can be determined; the user can then form another trajectory with a single finger at a position different from the end point of the previously formed trajectory, and a corresponding drag gesture is generated based on all of the trajectories. The drag gesture is thus generated from discontinuous movement trajectories, which extends the drag range corresponding to the gesture, solves the problem that only a limited drag range can be produced because of the limited touch pad area, and improves the user experience.
Drawings
These and/or other aspects and advantages of the present invention will become more apparent and more readily appreciated from the following detailed description of the embodiments of the invention, taken in conjunction with the accompanying drawings of which:
FIG. 1 is a schematic diagram of an exemplary application scenario of an embodiment of the present invention;
FIG. 2 is a flow diagram illustrating a method of generating a drag gesture in accordance with an embodiment of the present invention;
FIG. 3 is a diagram illustrating a movement trace generated by a user touching a touch device with a single finger according to an embodiment of the invention;
FIG. 4 is a block diagram illustrating a touch device according to an embodiment of the present invention; and
FIG. 5 is a block diagram illustrating a portable electronic device according to an embodiment of the present invention.
Detailed Description
Various embodiments according to the present invention will be described in detail with reference to the accompanying drawings. Here, it is to be noted that, in the drawings, the same reference numerals are given to constituent parts having substantially the same or similar structures and functions, and repeated description thereof will be omitted.
As described above, because the area of a notebook touch pad is limited and the drag gesture is considered finished once the finger is detected leaving the touch pad, the drag range corresponding to a drag gesture made on the touch pad is very limited, which constrains the user's operation experience. To solve this problem, the present invention generates the drag gesture by combining discontinuous movement trajectories of a single finger of the user on the touch device, thereby extending the drag range corresponding to the gesture. One illustrative scenario in which embodiments of the present disclosure may generate a drag gesture is described below with reference to FIG. 1.
As shown in FIG. 1, assume that a user is running a drawing application on a notebook computer and wants to select a portion of the screen displayed by the application for editing. In such a case, the user may generate a drag gesture according to the method of the present disclosure so that the drag area corresponding to the gesture is selected. It should be understood that the method may also be applied in various other scenarios, including but not limited to the following: a plurality of operable objects (such as file icons or shortcuts) are displayed on the screen of the notebook computer, and the user selects some of them in a batch for editing by making a drag gesture on the touch pad; or content that can be browsed by moving a scroll bar, such as pictures or documents, is displayed on the screen, and the user quickly browses the content by making a drag gesture on the touch pad.
It will be appreciated by those skilled in the art that the application scenarios listed above are merely examples, and that the method of generating a drag gesture of the present invention is equally applicable to scenarios analogous to moving a mouse while holding the left mouse button. Further, although the applicable scenarios have been described taking the touch pad of a notebook computer as an example, the present invention is equally applicable to various other electronic apparatuses equipped with a touch device that can detect the touch of a user's finger.
Hereinafter, a method of generating a drag gesture according to an embodiment of the present invention will be described with reference to fig. 2. The method can be applied to a notebook computer or any other type of electronic equipment with a touch device.
As shown in FIG. 2, the method of generating a drag gesture of this embodiment may include the steps of:
In step S101, after a first touch event of a single finger is detected, a first movement trajectory generated by the single finger touching the touch device is determined. The first touch event may be configured by the user as desired; for example, it may be a double-click operation of a single finger on the touch device, a touch having a predetermined contact area, or a touch lasting for a predetermined time. Once the first touch event of the user's single finger is detected in this step, it may be determined that the user has started to make a drag gesture, and accordingly the first movement trajectory generated as the single finger moves on the touch device is determined.
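As one illustration, the first-touch-event check of step S101 might be sketched as a simple predicate. The threshold values and function name below are hypothetical placeholders, not values from the patent, which leaves the trigger conditions user-configurable:

```python
# Illustrative thresholds (assumptions; the patent leaves them configurable)
DOUBLE_CLICK_INTERVAL = 0.3   # max seconds between taps of a double click
MIN_CONTACT_AREA = 80.0       # minimum contact area, arbitrary sensor units
MIN_HOLD_TIME = 0.5           # minimum hold duration in seconds

def is_first_touch_event(tap_times, contact_area, hold_time):
    """Return True if any of the three example trigger conditions holds:
    a double click, a touch with a large enough contact area, or a touch
    held for at least a predetermined time."""
    double_click = (
        len(tap_times) >= 2
        and tap_times[-1] - tap_times[-2] <= DOUBLE_CLICK_INTERVAL
    )
    large_area = contact_area >= MIN_CONTACT_AREA
    long_hold = hold_time >= MIN_HOLD_TIME
    return double_click or large_area or long_hold
```

For example, two taps 0.2 s apart qualify as a double click and therefore as a first touch event, while a single light, brief tap does not.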
In step S102, when another finger is detected while the single finger is touching the touch device, the end point of the first movement trajectory is determined. In this step, if another finger is detected while the single finger that made the first touch event and formed the first movement trajectory is still moving, or while that finger has stopped moving but has not yet left the touch device, it may be determined that the user has paused generation of the drag gesture, and accordingly the end point of the first trajectory formed by the single finger is determined. During the pause following step S102, all of the user's fingers may leave the touch device without this being regarded as the end of the drag gesture.
In step S103, after the single finger is detected again at a position different from the end point of the first movement trajectory, a second movement trajectory generated by the single finger touching the touch device is determined. As described above, the user may lift the fingers from the touch device after step S102; in step S103, the user touches again with a single finger at a position different from the end point of the first trajectory, whereupon it is determined that the user is continuing the drag gesture, and the second movement trajectory formed by the single finger moving on the touch device is determined. It should be noted that the single finger in step S103 may be the same finger as in step S101 or a different one; the invention is not limited in this respect.
The end point of the second movement trajectory in step S103 is likewise determined by detecting the movement of the user's single finger on the touch device. For example, when the single finger detected again in step S103 is detected leaving the touch device, it may be determined that the user has ended the drag gesture, and thus the end point of the second movement trajectory may be determined. As another example, when another finger is detected while the single finger detected again in step S103 is touching the touch device and a predetermined time then elapses, that is, when the predetermined time has passed since the drag gesture was paused, it may likewise be determined that the user has ended the drag gesture, and the end point of the second movement trajectory may be determined.
In step S104, a drag gesture is generated based on at least the first and second movement trajectories. In this step, the drag gesture may be generated while the first movement trajectory and/or the second movement trajectory is being determined. In other words, while the user's single finger is moving on the touch device, the notebook computer can perform corresponding actions in real time in response to the drag gesture generated by the finger movement. For example, in the scenario where a drawing application runs on the notebook computer, a corresponding drag gesture may be generated while the user's single finger is still moving, so that the selected range on the screen changes dynamically in real time.
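The real-time variant can be pictured as streaming per-move deltas to the host while a trajectory is still being formed. The sketch below is our own illustration (function name and (x, y)-tuple point representation are assumptions), not the patent's implementation:

```python
def incremental_deltas(points):
    """Yield a (dx, dy) delta for each consecutive pair of sampled
    points, so the host can react while the finger is still moving."""
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        yield (x1 - x0, y1 - y0)
```

Feeding the partial trajectory [(0, 0), (1, 2), (3, 3)] yields (1, 2) and then (2, 1), allowing, for example, a drawing application's selection rectangle to grow with each sample.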
Alternatively, in step S104, the drag gesture may be generated after the first and second movement trajectories have been determined. For example, after the two trajectories are formed, the displacement and direction between the start and end points of each trajectory may be computed, and the displacements of the individual movement trajectories accumulated to produce a drag gesture corresponding to both trajectories. In this case, the drag gesture corresponding to the first and second movement trajectories is generated once the user's single finger is no longer moving on the touch device, and the notebook computer then performs the corresponding action in response to the drag gesture produced from the finger's movement trajectories.
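The accumulate-then-generate variant can be sketched by summing each trajectory's start-to-end displacement into one net drag vector. This is a minimal illustration under the assumption that each trajectory is a list of (x, y) tuples; the patent does not prescribe this representation:

```python
def accumulate_drag(trajectories):
    """Sum the start-to-end displacement of each movement trajectory
    into one net (dx, dy) drag vector, as in the variant where the
    gesture is generated after all trajectories are determined.

    trajectories: list of point lists, each point an (x, y) tuple.
    """
    dx = dy = 0.0
    for traj in trajectories:
        (x0, y0), (x1, y1) = traj[0], traj[-1]
        dx += x1 - x0
        dy += y1 - y0
    return dx, dy
```

For instance, a first trajectory from (0, 0) to (3, 1) followed by a second from (10, 10) to (12, 14) accumulates to a net drag of (5, 5), even though the finger was lifted between the two segments.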
According to the method of generating a drag gesture described above, when another finger is detected while a single finger is touching the touch device, a pause of the drag gesture can be determined; the user can then form another trajectory with a single finger at a position different from the end point of the previously formed trajectory, and a corresponding drag gesture is generated based on all of the trajectories. The drag gesture is thus generated from discontinuous movement trajectories, which extends the drag range corresponding to the gesture, solves the problem that only a limited drag range can be produced because of the limited touch pad area, and improves the user experience.
The generation of a drag gesture from two trajectories, the first and second movement trajectories, has been described above, but the invention is not limited thereto: a drag gesture may also be formed from a larger number of movement trajectories, with the notebook computer performing the corresponding action in response. A method of generating a drag gesture based on a third movement trajectory in addition to the first and second trajectories is described below.
Specifically, when a drag gesture is formed from more than two movement trajectories, the end point of the second movement trajectory is determined in the same way as the end point of the first movement trajectory in step S102: when another finger is detected while the single finger forming the second movement trajectory is touching the touch device, the end point of the second trajectory is determined. For example, when another finger is detected while the single finger forming the second trajectory is still moving, or while that finger has stopped moving but has not left the touch device, it is determined that the user has paused generation of the drag gesture, and accordingly the end point of the second trajectory may be determined. As before, after the end point of the second trajectory is determined, all of the user's fingers may leave the touch device without this being considered the end of the drag gesture.
Next, after the single finger is detected again at a position different from the end point of the second movement trajectory, it is determined that the user is continuing the drag gesture, and accordingly a third movement trajectory generated by the single finger touching the touch device is determined. As before, the single finger forming the third trajectory may be the same finger that formed the first and second trajectories or a different one; the invention is not limited in this respect.
Next, a drag gesture is generated based on at least the first, second, and third movement trajectories. In this step, as in step S104 above, the drag gesture may be generated while the first, second, and/or third movement trajectory is being determined; alternatively, it may be generated after all three trajectories have been determined. In either case, the notebook computer acts accordingly in response to the drag gesture generated by the movement of the user's single finger.
The process of the method of generating a drag gesture according to an embodiment of the present disclosure has been described above in conjunction with FIG. 2. A schematic diagram of the user's operations on the touch pad and the resulting drag trajectories, corresponding to the process of FIG. 2, is described below in conjunction with FIG. 3.
Specifically, the user's operations on the touch pad and the movement trajectories formed in each step are as follows:
(a) finger F1 double-clicks at point 1 (an example of a first touch event); upon detecting this, the touch pad determines that a drag gesture is to be initiated;
(b) finger F1 keeps touching and moves to point 2; the touch pad determines the movement trajectory between point 1 and point 2 to be the first movement trajectory;
(c) while finger F1 is touching point 2, finger F2 touches the touch pad at point 2'; upon detecting the two fingers, the touch pad determines that generation of the drag gesture is paused and that the end point of the first movement trajectory is point 2;
(d) fingers F1 and F2 leave the touch pad;
(e) finger F1 touches at point 3; upon again detecting a single finger, the touch pad determines that generation of the drag gesture continues;
(f) finger F1 keeps touching and moves to point 4; the touch pad determines the movement trajectory from point 3 to point 4 to be the second movement trajectory;
(g) while finger F1 is touching point 4, finger F2 touches the touch pad at point 4'; upon detecting the two fingers, the touch pad again determines that generation of the drag gesture is paused and that the end point of the second movement trajectory is point 4;
(h) fingers F1 and F2 leave the touch pad;
(i) finger F1 touches at point 5; upon again detecting a single finger, the touch pad determines that generation of the drag gesture continues;
(j) finger F1 keeps touching and moves to point 6; the touch pad determines the movement trajectory between point 5 and point 6 to be the third movement trajectory;
(k) when finger F1 leaves the touch pad at point 6, or when finger F2 touches the touch pad at point 6' (not shown) while finger F1 holds point 6 and a predetermined time then elapses, the touch pad determines that the drag gesture has ended and that the end point of the third movement trajectory is point 6.
It should be noted that the lengths and directions of the trajectories shown in FIG. 3 are only illustrative; a user may produce movement trajectories of different lengths and directions as needed, thereby generating different drag gestures. Further, although generating a drag gesture from three movement trajectories of the user's single finger has been described in connection with FIG. 3, the invention is not limited thereto, and a drag gesture may be generated from a larger or smaller number of trajectories. For example, at step (g) it is equally possible to determine that the drag gesture has ended and to fix the end point of the second movement trajectory, either when finger F1 leaves the touch pad at point 4 or when finger F2 touches the touch pad at point 4' while finger F1 holds point 4 and a predetermined time then elapses, and then to generate the drag gesture from the first and second movement trajectories alone. As another example, steps (e)-(h) may be repeated as many times as needed, so that a drag gesture is generated from more than three movement trajectories of the user's single finger on the touch pad.
In summary, the process of generating a drag gesture described above proceeds as follows: if a first touch event of a single finger is detected, the start of a drag gesture may be determined, and the movement trajectory produced by the single finger's movement is then determined; if another finger is detected while the single finger is touching the touch device, the drag gesture may be determined to be paused, during which time the user's fingers may leave the touch device without this being deemed the end of the gesture; if a single finger is detected again after the pause, the drag gesture may be determined to continue, and another trajectory produced by the single finger's movement is correspondingly determined; if the single finger leaves the touch device while forming a trajectory, the drag gesture is determined to have ended; alternatively, the end of the drag gesture may be determined if a predetermined time elapses after the gesture was determined to be paused. The method of generating a drag gesture of the present invention thus enables good interaction between the user and the notebook computer.
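The rules summarized above form a small state machine. The sketch below is one possible reading under our own naming, with the pause-timeout end condition omitted for brevity; it is an illustration, not the patent's implementation:

```python
from enum import Enum, auto

class Drag(Enum):
    IDLE = auto()      # no drag gesture in progress
    DRAGGING = auto()  # single finger forming a movement trajectory
    PAUSED = auto()    # second finger seen; fingers may leave the pad

class DragRecognizer:
    """Minimal sketch of the start/pause/resume/end logic."""

    def __init__(self):
        self.state = Drag.IDLE
        self.trajectories = []   # completed trajectories (point lists)
        self._current = []

    def first_touch_event(self, point):
        # e.g. a double click: the drag gesture starts
        self.state = Drag.DRAGGING
        self._current = [point]

    def move(self, point):
        if self.state is Drag.DRAGGING:
            self._current.append(point)

    def second_finger_down(self):
        # another finger while a single finger touches: pause the gesture
        if self.state is Drag.DRAGGING:
            self.trajectories.append(self._current)
            self._current = []
            self.state = Drag.PAUSED

    def single_finger_down(self, point):
        # a single finger detected again after a pause: resume
        if self.state is Drag.PAUSED:
            self.state = Drag.DRAGGING
            self._current = [point]

    def finger_up(self):
        # the single finger leaves mid-trajectory: the gesture ends
        if self.state is Drag.DRAGGING:
            self.trajectories.append(self._current)
            self._current = []
            self.state = Drag.IDLE
```

Replaying steps (a)-(k) of FIG. 3 against this recognizer would accumulate the first, second, and third movement trajectories in `trajectories`, from which a drag gesture could then be generated.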
Next, a block diagram of a touch device according to an embodiment of the present disclosure will be described with reference to FIG. 4, which shows an exemplary structural block diagram of a touch device 400. As shown in FIG. 4, the touch device 400 may include a touch detection unit 401, a trajectory determination unit 402, a control unit 403, and a drag gesture generation unit 404. Only the main functions of these units are described below; details already described above are omitted. In addition, the touch device 400 may include other components not shown in FIG. 4, such as physical keys, as needed.
The touch device 400 may be, for example, the touch pad of a notebook computer, a touch pad on a keyboard separate from the computer, a touch controller for controlling a remote device, or any similar touch device that can detect a user touching and sliding on its surface and thereby generate a corresponding drag gesture. The notebook computer or the controlled remote device can then perform the corresponding action according to the drag gesture, realizing interaction with the user. For example, a user may select a specific region of the screen by making a drag gesture on the touch pad, select several operable objects in a batch for editing from among many objects displayed on the screen, or quickly browse content that is displayed with a scroll bar, such as pictures and documents, by making a drag gesture on the touch pad.
The touch detection unit 401 may detect a touch of a finger of a user on the touch device 400. For example, the touch detection unit 401 may detect a touch action of a finger of the user on the surface of the touch device 400, including but not limited to: a single-click touch operation, a slide operation, a double-click operation, a touch having a predetermined contact area, a touch lasting for a predetermined time.
The trajectory determination unit 402 may determine the movement trajectory of a single finger on the touch device 400. For example, it may detect the coordinate position of a finger's single-click or double-click touch on the touch device 400, the successive coordinate positions along the path of a finger sliding on the device, the coordinate positions of the respective fingers when another finger is detected while a single finger is touching, and the coordinate position at which a finger leaves the device, and form the corresponding movement trajectory accordingly.
The control unit 403 may determine a first movement trajectory generated by the single finger touching the touch device 400 after detecting a first touch event of the single finger. The control unit 403 may also determine the end point of the first movement trajectory when another finger is detected while the single finger is touching the touch device 400. The control unit 403 may further determine a second movement trajectory generated by the single finger touching the touch device 400 after the single finger is detected again at a position different from the end point of the first movement trajectory. The control unit 403 may be implemented by any processor, microprocessor, or other component with computing capability, and may execute preset processing based on instructions of preset software or firmware.
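The control unit's pause-and-resume behavior can be sketched as a small state machine. The Python sketch below is illustrative only: the event-handler names and the point representation are assumptions, not part of the disclosure.

```python
from dataclasses import dataclass, field

@dataclass
class DragTracker:
    """Sketch of the control-unit logic: a single-finger trajectory is
    paused when a second finger touches down, and a new trajectory begins
    when the single finger lands again at a different position."""
    trajectories: list = field(default_factory=list)  # completed trajectories
    current: list = field(default_factory=list)       # points of the active trajectory
    paused: bool = False

    def on_first_touch_event(self, point):
        # e.g. a double-tap or a sufficiently long touch starts the first trajectory
        self.current = [point]
        self.paused = False

    def on_move(self, point):
        if not self.paused and self.current:
            self.current.append(point)

    def on_second_finger_down(self):
        # another finger while the single finger is down: the end point of
        # the current trajectory is fixed and the drag is paused
        if self.current:
            self.trajectories.append(self.current)
            self.current = []
        self.paused = True

    def on_single_finger_down_again(self, point):
        # the single finger lands at a position different from the previous
        # end point: a new (discontinuous) trajectory begins
        last_end = self.trajectories[-1][-1] if self.trajectories else None
        if self.paused and point != last_end:
            self.current = [point]
            self.paused = False
```

A reading of claims 5–7 suggests the same pause/resume cycle can repeat for a third trajectory; the sketch above supports that by appending each completed trajectory to `trajectories`.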
The drag gesture generation unit 404 may generate a drag gesture based on at least the first and second movement trajectories. For example, the drag gesture generation unit 404 may generate the drag gesture as the control unit 403 determines the first movement trajectory and/or the second movement trajectory. As another example, the drag gesture generation unit 404 may generate the drag gesture after the control unit 403 has determined both the first and second movement trajectories.
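One plausible way for a drag gesture generation unit to combine discontinuous trajectories is to accumulate the per-trajectory displacement vectors, so that several short strokes add up to one drag longer than the pad itself. This representation is an assumption for illustration; the disclosure does not prescribe one.

```python
def combine_trajectories(trajectories):
    """Sum the start-to-end displacement of each movement trajectory so
    that several short strokes accumulate into one long drag vector."""
    dx = dy = 0.0
    for traj in trajectories:
        if len(traj) >= 2:
            (x0, y0), (x1, y1) = traj[0], traj[-1]
            dx += x1 - x0
            dy += y1 - y0
    return dx, dy

# Two strokes of 80 units each yield a 160-unit drag,
# even though each stroke fits within the touch pad.
print(combine_trajectories([[(0, 0), (80, 0)], [(10, 5), (90, 5)]]))  # → (160.0, 0.0)
```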
Alternatively, the touch device 400 may also serve as a touch controller and accordingly include a drag gesture transmission unit (not shown). The drag gesture may be transmitted through the drag gesture transmission unit to a remote device connected to the touch device 400, so that the remote device operates according to the drag gesture.
The touch device according to the embodiments of the present disclosure has been described above with reference to the drawings. With this touch device, a pause of a drag gesture can be determined when another finger is detected while a single finger is touching the touch device; the user can then form another trajectory with the single finger, starting at a position different from the end point of the previously formed trajectory, and a corresponding drag gesture is generated from the trajectories. The drag gesture is thus generated from discontinuous movement trajectories, which extends the drag range of the gesture, solves the problem that only a limited drag range can be produced due to the limited touch pad area, and thereby improves the user experience.
Next, a portable electronic device according to an embodiment of the present disclosure will be described with reference to fig. 5. Fig. 5 illustrates an exemplary block diagram of a portable electronic device 500 according to an embodiment of the present disclosure. As shown in fig. 5, the portable electronic device 500 may include a display 501, a central processing unit 502, and a touch panel 503. The touch panel 503 may include a touch detection unit 5031, a trajectory determination unit 5032, a control unit 5033, and a drag gesture generation unit 5034. Only the main functions of the units of the portable electronic device 500 are described below; details already described above are omitted. In addition, the portable electronic device 500 may include other components not shown in fig. 5, such as a keyboard, as desired.
The display 501 may be any type of display such as a liquid crystal display, a light emitting diode display, or the like.
The central processing unit 502 may be a general-purpose processor, a microprocessor, or the like, providing data processing, arithmetic, and overall control functions of the portable electronic device 500, and may operate according to a drag gesture generated by the touch panel 503, described below.
The touch panel 503 is similar to the touch device 400 described above with reference to fig. 4 and includes a touch detection unit 5031, a trajectory determination unit 5032, a control unit 5033, and a drag gesture generation unit 5034. The touch detection unit 5031 may detect a touch of the user's finger on the touch panel 503. The trajectory determination unit 5032 may determine the movement trajectory of a single finger on the touch panel 503. The control unit 5033 may determine a first movement trajectory generated by the single finger touching the touch panel 503 after detecting a first touch event of the single finger, determine the end point of the first movement trajectory when another finger is detected while the single finger is touching the touch panel 503, and determine a second movement trajectory generated by the single finger touching the touch panel 503 after the single finger is detected again at a position different from the end point of the first movement trajectory. The drag gesture generation unit 5034 may generate a drag gesture based on at least the first and second movement trajectories.
The portable electronic device according to the embodiments of the present disclosure has been described above with reference to the drawings. With this device, a pause of a drag gesture can be determined when another finger is detected while a single finger is touching the touch panel; the user can then form another trajectory with the single finger, starting at a position different from the end point of the previously formed trajectory, and a corresponding drag gesture is generated from the trajectories. The drag gesture is thus generated from discontinuous movement trajectories, which expands the drag range of the gesture, solves the problem that only a limited drag range can be produced due to the limited area of the touch panel, and thereby improves the user experience.
Another embodiment of the present disclosure also provides a computer-readable storage medium for storing non-transitory computer-readable instructions that, when executed by a computer, may perform the method of generating a drag gesture of an embodiment of the present disclosure. The storage medium may be a volatile or nonvolatile storage medium capable of storing program code, such as a USB flash drive, a read-only memory (ROM), a random access memory (RAM), a removable hard disk, a magnetic disk, or an optical disk.
It should be understood that each functional unit in the embodiments of the present disclosure may be integrated into one processing unit, each unit may exist alone physically, or two or more units may be integrated into one unit. The integrated unit can be implemented in the form of hardware or in the form of a software functional unit. Furthermore, those skilled in the art will appreciate that various modifications, combinations, or sub-combinations of the embodiments can be made without departing from the spirit and scope of the invention, and such modifications are intended to be within the scope of the invention.

Claims (10)

1. A method of generating a drag gesture, comprising:
after detecting a first touch event of a single finger, determining a first movement trajectory generated by the single finger touching a touch device;
determining an end point of the first movement trajectory when another finger is detected while the single finger is touching the touch device;
determining a second movement trajectory generated by the single finger touching the touch device after the single finger is detected again at a position different from the end point of the first movement trajectory; and
generating a drag gesture based on at least the first and second movement trajectories.
2. The method of claim 1, wherein the first touch event is at least one of a double-tap operation of a single finger on a touch device, a touch having a predetermined contact area, or a touch lasting a predetermined time.
3. The method of claim 1, wherein an end point of the second movement trajectory is determined when a single finger is detected to be off the touch device.
4. The method of claim 1, wherein an end point of the second movement trajectory is determined when another finger is detected while the single finger is touching the touch device and a predetermined time elapses.
5. The method of claim 1, wherein an end point of the second movement trajectory is determined when another finger is detected while the single finger is touching the touch device, the method further comprising:
determining a third movement trajectory generated by the single finger touching the touch device after the single finger is detected again at a position different from the end point of the second movement trajectory; and
generating a drag gesture based on the first, second, and third movement trajectories.
6. The method of claim 1, wherein
generating the drag gesture while determining the first and/or second movement trajectories.
7. The method of claim 1, wherein
after determining the first and second movement trajectories, generating the drag gesture.
8. A touch device, comprising:
a touch detection unit configured to detect a touch of a finger of a user on a touch device;
a track determining unit configured to determine a movement track of a single finger on the touch device;
a control unit configured to:
after detecting a first touch event of a single finger, determining a first movement trajectory generated by the single finger touching the touch device;
determining an end point of the first movement trajectory when another finger is detected while the single finger is touching the touch device;
determining a second movement trajectory generated by the single finger touching the touch device after the single finger is detected again at a position different from the end point of the first movement trajectory; and
a drag gesture generating unit configured to generate a drag gesture based on at least the first and second movement trajectories.
9. The touch device of claim 8, further comprising:
a drag gesture transmission unit configured to transmit the drag gesture to a remote device connected to the touch device, so that the remote device operates according to the drag gesture.
10. A portable electronic device, comprising:
a display;
a central processing unit;
a touch panel comprising:
a touch detection unit configured to detect a touch of a finger of a user on the touch panel;
a trajectory determination unit configured to determine a movement trajectory of a single finger on the touch panel;
a control unit configured to:
after detecting a first touch event of a single finger, determining a first movement trajectory generated by the single finger touching the touch panel;
determining an end point of the first movement trajectory when another finger is detected while the single finger is touching the touch panel;
determining a second movement trajectory generated by the single finger touching the touch panel after the single finger is detected again at a position different from the end point of the first movement trajectory; and
a drag gesture generation unit configured to generate a drag gesture based on at least the first and second movement trajectories;
wherein the central processing unit operates according to the drag gesture generated by the touch panel.
CN201711403387.5A 2017-12-22 2017-12-22 Method for generating drag gesture, touch device and portable electronic equipment Active CN108132721B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201711403387.5A CN108132721B (en) 2017-12-22 2017-12-22 Method for generating drag gesture, touch device and portable electronic equipment

Publications (2)

Publication Number Publication Date
CN108132721A CN108132721A (en) 2018-06-08
CN108132721B true CN108132721B (en) 2020-10-27

Family

ID=62392274

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201711403387.5A Active CN108132721B (en) 2017-12-22 2017-12-22 Method for generating drag gesture, touch device and portable electronic equipment

Country Status (1)

Country Link
CN (1) CN108132721B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110688044B (en) * 2019-09-30 2021-04-13 联想(北京)有限公司 Input method and electronic equipment

Citations (2)

Publication number Priority date Publication date Assignee Title
CN102566908A (en) * 2011-12-13 2012-07-11 鸿富锦精密工业(深圳)有限公司 Electronic equipment and page zooming method for same
CN103677577A (en) * 2013-12-12 2014-03-26 宇龙计算机通信科技(深圳)有限公司 Terminal object operating method and terminal

Family Cites Families (5)

Publication number Priority date Publication date Assignee Title
US8924892B2 (en) * 2008-08-22 2014-12-30 Fuji Xerox Co., Ltd. Multiple selection on devices with many gestures
CN101556524A (en) * 2009-05-06 2009-10-14 苏州瀚瑞微电子有限公司 Display method for controlling magnification by sensing area and gesture operation
CN103902216B (en) * 2012-12-29 2017-09-12 深圳雷柏科技股份有限公司 Use gesture the method and system for realizing that file is pulled on a kind of peripheral hardware touch pad
CN103324440B (en) * 2013-07-05 2016-06-08 广东欧珀移动通信有限公司 A kind of method utilizing multi-point touch to select word content
CN105260061B (en) * 2015-11-20 2019-09-06 深圳市奇客布达科技有限公司 A kind of the back touch device and its back touch gesture of hand-hold electronic equipments



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant