CN103246462B - Detection method and terminal for a vertical gesture - Google Patents

Detection method and terminal for a vertical gesture

Info

Publication number
CN103246462B
CN103246462B
Authority
CN
China
Prior art keywords
endpoint
input
end point
judging result
vertical gesture
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201210031950.1A
Other languages
Chinese (zh)
Other versions
CN103246462A (en)
Inventor
阳光
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Lenovo Beijing Ltd
Original Assignee
Lenovo Beijing Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Lenovo Beijing Ltd filed Critical Lenovo Beijing Ltd
Priority to CN201210031950.1A priority Critical patent/CN103246462B/en
Publication of CN103246462A publication Critical patent/CN103246462A/en
Application granted granted Critical
Publication of CN103246462B publication Critical patent/CN103246462B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • User Interface Of Digital Computer (AREA)

Abstract

The invention discloses a detection method and terminal for a vertical gesture. The method includes: when a user input is detected, obtaining a first endpoint and a second endpoint of the input; judging whether the areas of the first endpoint and the second endpoint satisfy a first preset range, and generating a first judging result, where the area of the second endpoint is not smaller than the area of the first endpoint; judging whether the distance between the first endpoint and the second endpoint falls within a second preset range, and generating a second judging result; and when both the first judging result and the second judging result are positive, determining that the input is a vertical gesture. By obtaining the first and second endpoints of a user input and matching their areas, and the distance between them, against preset ranges, the invention detects the vertical gesture and provides a basis for subsequent operations performed with the vertical gesture.

Description

Detection method and terminal for a vertical gesture
Technical field
The present invention relates to the field of input technology, and more particularly to a detection method and terminal for a vertical gesture.
Background art
A vertical gesture is a gesture in which the little finger and the part of the palm at the base of the little finger contact the touch screen of a terminal while the other four fingers and the corresponding parts of the palm do not, or in which the thumb and the part of the palm at the base of the thumb contact the touch screen while the other four fingers and the corresponding parts of the palm do not. Fig. 1 is a schematic diagram of a vertical gesture made with the little-finger side of the hand. This gesture, and especially the little-finger vertical gesture, is very natural for a user. If it is used as an input mode, it can, for example, change the positions of several operation objects at once. For instance, moving the gesture in parallel up, down, left, or right could be mapped to an operation for deleting desktop icons, so that one gesture deletes several desktop icons at the same time; alternatively, a fist-closing motion of the gesture could select or archive all icons in the contacted region at once. A vertical gesture would therefore be a convenient input method. However, as an input mode the gesture is not easy to distinguish: for example, when the base of the user's index finger and the whole palm contact the touch screen, the terminal has difficulty telling the two apart. If the gesture is to be used for input, the input must therefore first be detected, so that operations that are not vertical gestures are not treated as vertical gestures and do not cause erroneous operations.
Summary of the invention
The present invention provides a detection method and terminal for a vertical gesture. By matching the areas of the first and second endpoints of an acquired input, and the distance between the two endpoints, against preset ranges, the vertical gesture is detected, which provides a basis for using the vertical gesture as input.
The present invention provides a detection method for a vertical gesture, the method including:
when a user input is detected, obtaining a first endpoint and a second endpoint of the input;
judging whether the areas of the first endpoint and the second endpoint satisfy a first preset range, and generating a first judging result, wherein the area of the second endpoint is not smaller than the area of the first endpoint;
judging whether the distance between the first endpoint and the second endpoint falls within a second preset range, and generating a second judging result;
when both the first judging result and the second judging result are positive, determining that the input is a vertical gesture.
Preferably, the method further includes:
obtaining the movement track of the input;
judging whether the track is a continuous track, and generating a third judging result;
wherein the determining that the input is a vertical gesture when both the first judging result and the second judging result are positive includes:
determining that the input is a vertical gesture when the first judging result, the second judging result, and the third judging result are all positive.
Preferably, the method further includes:
learning the user's vertical gesture input, and obtaining a standard area of the first endpoint and the second endpoint of the user's vertical gesture and a standard distance between the first endpoint and the second endpoint;
wherein the judging whether the areas of the first endpoint and the second endpoint fall within the first preset range includes:
judging whether the areas of the first endpoint and the second endpoint satisfy the standard area;
and the judging whether the distance between the first endpoint and the second endpoint falls within the preset range includes:
judging whether the distance between the first endpoint and the second endpoint satisfies the standard distance.
Preferably, the method further includes:
learning the user's vertical gesture input, and obtaining a standard force of the user's vertical gesture input;
obtaining the force of the input;
judging whether the force satisfies the standard force, and generating a fourth judging result;
wherein the determining that the input is a vertical gesture when both the first judging result and the second judging result are positive includes:
determining that the input is a vertical gesture when the first judging result, the second judging result, and the fourth judging result are all positive.
Preferably, when the input is determined to be a vertical gesture, the method further includes:
obtaining the direction of motion of the vertical gesture according to the changes in position of the first endpoint and the second endpoint;
executing a corresponding instruction according to the direction of motion.
The present invention also provides a detection terminal for a vertical gesture, the terminal including:
an endpoint acquiring unit, configured to obtain a first endpoint and a second endpoint of a user input when the input is detected;
a first judging unit, configured to judge whether the areas of the first endpoint and the second endpoint satisfy a first preset range and generate a first judging result, wherein the area of the second endpoint is not smaller than the area of the first endpoint;
a second judging unit, configured to judge whether the distance between the first endpoint and the second endpoint falls within a preset range and generate a second judging result;
a determining unit, configured to determine that the input is a vertical gesture when both the first judging result and the second judging result are positive.
Preferably, the terminal further includes:
a track acquiring unit, configured to obtain the movement track of the input;
a third judging unit, configured to judge whether the track is a continuous track and generate a third judging result;
wherein the determining unit is further configured to determine that the input is a vertical gesture when the first judging result, the second judging result, and the third judging result are all positive.
Preferably, the terminal further includes:
a learning unit, configured to learn the user's vertical gesture input and obtain a standard area of the first endpoint and the second endpoint of the user's vertical gesture and a standard distance between the first endpoint and the second endpoint;
wherein the first judging unit is further configured to judge whether the area of the second endpoint satisfies the standard area;
and the second judging unit is further configured to judge whether the distance between the first endpoint and the second endpoint satisfies the standard distance.
Preferably, the learning unit is further configured to learn the user's vertical gesture input and obtain a standard force of the user's vertical gesture input;
and the terminal further includes:
a force acquiring unit, configured to obtain the force of the input;
a fourth judging unit, configured to judge whether the force satisfies the standard force and generate a fourth judging result;
wherein the determining unit is further configured to determine that the input is a vertical gesture when the first judging result, the second judging result, and the fourth judging result are all positive.
Preferably, when the input is determined to be a vertical gesture, the terminal further includes:
a motion direction acquiring unit, configured to obtain the direction of motion of the vertical gesture according to the changes in position of the first endpoint and the second endpoint;
an operation unit, configured to execute a corresponding instruction according to the direction of motion.
The present invention has the following advantages:
by obtaining the first endpoint and the second endpoint of an input, matching the areas of the first and second endpoints against a first preset range, and matching the distance between the first and second endpoints against a second preset range, the present invention accurately detects the vertical gesture and provides a basis for using the vertical gesture as input.
Description of the drawings
To explain the embodiments of the present invention or the technical solutions in the prior art more clearly, the drawings needed in the description of the embodiments are briefly introduced below. Obviously, the drawings in the following description show only some embodiments of the present invention, and those of ordinary skill in the art can obtain other drawings from them without creative effort.
Fig. 1 is a schematic diagram of the vertical gesture made with the little finger in the present invention;
Fig. 2 is a flowchart of Embodiment 1 of the method of the present invention;
Fig. 3 is a schematic diagram of the first endpoint and the second endpoint in the present invention;
Fig. 4 is a structural diagram of the terminal of Embodiment 7 of the present invention.
Detailed description of the embodiments
The technical solutions in the embodiments of the present invention are described clearly and completely below with reference to the drawings in the embodiments. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by those of ordinary skill in the art based on the embodiments of the present invention fall within the scope protected by the present invention.
Embodiment 1 of the present invention provides a detection method for a vertical gesture. Referring to Fig. 2, the method specifically includes:
S1: when a user input is detected, obtain a first endpoint and a second endpoint of the input.
Most existing terminal touch screens are capacitive touch screens, which work by sensing the current induced by the human body. In the prior art, most capacitive touch screens cannot capture the complete shape of a user input; instead, the input is aggregated into two endpoints, as shown in Fig. 3. The present invention builds on this technique.
When a user input is detected, the first endpoint and the second endpoint of the input are formed.
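As an illustration only (not part of the original disclosure), a minimal Python sketch of how such an endpoint pair might be represented is given below; the names Endpoint, TouchInput, x, y, and area are assumptions made for the example.

```python
from dataclasses import dataclass

@dataclass
class Endpoint:
    """One aggregated contact reported by the touch screen (illustrative)."""
    x: float      # x coordinate of the endpoint's center on the screen
    y: float      # y coordinate of the endpoint's center on the screen
    area: float   # contact area of this endpoint

@dataclass
class TouchInput:
    """The two endpoints the touch screen forms from a single user input."""
    first: Endpoint    # smaller endpoint (little finger or thumb)
    second: Endpoint   # larger endpoint (palm base)
```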
S2: judge whether the areas of the first endpoint and the second endpoint satisfy a first preset range, and generate a first judging result, wherein the area of the second endpoint is not smaller than the area of the first endpoint.
As stated in the background section, a vertical gesture is a gesture in which the little finger (or thumb) and the part of the palm at its base contact the touch screen. Of the two endpoints that are formed, the area of the endpoint corresponding to the little finger (or thumb) is therefore smaller than the area of the endpoint corresponding to the base of the palm. For most users, the areas of the first endpoint and the second endpoint each fall within a certain range.
For a user, a vertical gesture made with the little finger is more natural than one made with the thumb. In the following embodiments of the present invention, the vertical gesture made with the little finger is therefore used as an example.
Regarding the judgment of the area ranges of the two endpoints: in Embodiment 2 of the present invention, the first endpoint can be compared with one preset area range while the second endpoint is compared with another preset area range. Understandably, the two area ranges differ, because the area of the second endpoint is not smaller than the area of the first endpoint; in the concrete judgment the area of the second endpoint should therefore be compared with the larger preset area range. Note that the first judging result is positive only when the areas of both endpoints satisfy their preset ranges.
Through this comparison, an input whose first-endpoint area does not correspond to the area of the user's little finger, and/or whose second-endpoint area does not correspond to the area of the palm base at the user's little finger, can be detected. Obviously, such an input does not conform to a vertical gesture.
In Embodiment 3 of the present invention, because the areas of the two endpoints of a vertical gesture differ, it is also possible to first judge whether the difference between the area of the first endpoint and the area of the second endpoint exceeds a certain threshold, and if so, to continue judging whether the area of either endpoint falls within a certain preset range, for example whether the area of the larger second endpoint falls within a preset area range.
The first approach excludes, for example, an input in which the two endpoint areas are identical, such as an input made with two fingers at the same time. The second approach excludes an input whose two endpoint areas differ but in which one endpoint area does not conform to a vertical gesture.
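A minimal sketch of the two area-check variants described above, for illustration only; the specific thresholds and the function names are assumptions, since the patent only requires "preset ranges".

```python
def area_check_embodiment_2(first_area, second_area,
                            first_range=(0.2, 1.5), second_range=(1.5, 8.0)):
    """First judging result, Embodiment 2 variant: each endpoint area must fall
    in its own preset range (values in cm^2 are illustrative)."""
    return (first_range[0] <= first_area <= first_range[1]
            and second_range[0] <= second_area <= second_range[1])

def area_check_embodiment_3(first_area, second_area,
                            min_difference=1.0, second_range=(1.5, 8.0)):
    """First judging result, Embodiment 3 variant: the areas must differ by more
    than a threshold, and the larger (second) endpoint is then range-checked."""
    if second_area - first_area <= min_difference:
        return False            # e.g. two fingertips of similar area
    return second_range[0] <= second_area <= second_range[1]
```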
S3: judge whether the distance between the first endpoint and the second endpoint falls within a second preset range, and generate a second judging result.
The area judgment above yields inputs whose areas conform to a vertical gesture, but not every input whose areas conform is actually a vertical gesture. For example, if both hands touch the screen in the vertical-gesture posture, placed end to end, one of the two resulting endpoints corresponds to the little finger of one hand and the other to the palm base of the other hand. Such a case cannot be excluded by the area judgment alone. Clearly, in this situation the distance between the two endpoints is far greater than the distance between the two endpoints of a normal vertical gesture. In the present invention it is therefore further judged whether the distance between the two endpoints falls within a preset distance range.
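A sketch of the distance check, again with an illustrative preset range; the Euclidean distance between the endpoint centers is one reasonable reading of "distance" here, though the patent does not fix the metric.

```python
import math

def distance_check(first_xy, second_xy, distance_range=(3.0, 12.0)):
    """Second judging result: the distance between the endpoint centers must fall
    within the second preset range (values in cm are illustrative)."""
    d = math.hypot(second_xy[0] - first_xy[0], second_xy[1] - first_xy[1])
    return distance_range[0] <= d <= distance_range[1]
```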
S4: when both the first judging result and the second judging result are positive, determine that the input is a vertical gesture.
When the areas of the two endpoints and the distance between them all satisfy the preset ranges, the input is determined to be a vertical gesture.
Users sometimes touch the screen accidentally, so accidental touches need to be distinguished from normal input. Because an accidental touch is momentary, it produces no continuous track, whereas a normal vertical-gesture input produces a continuous track of some length. In Embodiment 4 of the present invention, to avoid determining an accidental touch by the user to be a vertical gesture, the track of the input can be examined. The specific method is: obtain the movement track of the input, judge whether the track is a continuous track, and generate a third judging result.
In this case, the determining that the input is a vertical gesture when both the first judging result and the second judging result are positive includes:
determining that the input is a vertical gesture when the first judging result, the second judging result, and the third judging result are all positive.
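One way to read "continuous track" is a sampled sequence of touch points with no large temporal or spatial gaps; the sketch below assumes time-stamped samples, and both gap thresholds are illustrative values not specified by the patent.

```python
import math

def is_continuous_track(samples, max_time_gap=0.05, max_jump=1.0):
    """Third judging result: the sampled track contains no large gap in time or
    position. samples is a time-ordered list of (t, x, y) tuples; the gap
    thresholds (seconds, cm) are illustrative."""
    if len(samples) < 2:
        return False                                    # a momentary accidental touch
    for (t0, x0, y0), (t1, x1, y1) in zip(samples, samples[1:]):
        if t1 - t0 > max_time_gap:
            return False                                # the contact was interrupted
        if math.hypot(x1 - x0, y1 - y0) > max_jump:
            return False                                # the contact jumped rather than moved
    return True
```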
Different users have different endpoint areas and different distances between the endpoints. To judge the vertical gesture more accurately, in Embodiment 5 of the present invention the user's vertical gesture can be learned in advance to obtain a standard area of the first and second endpoints of the user's vertical gesture and a standard distance between the first endpoint and the second endpoint. In this case, judging whether the areas of the first endpoint and the second endpoint fall within the first preset range includes judging whether the areas of the first endpoint and the second endpoint satisfy the standard area, and judging whether the distance between the first endpoint and the second endpoint falls within the preset range includes judging whether the distance between the first endpoint and the second endpoint satisfies the standard distance.
Note that the standard area and the standard distance here can each refer to a certain range rather than a single value. In addition, depending on how the area is judged, the standard area can refer to the standard areas of the first endpoint and the second endpoint, or to the standard area difference between the two endpoints together with the standard area of one of them.
In a preferred variant of Embodiment 5, the vertical gesture of the same user can be learned several times to obtain the endpoint areas and the distance between the two endpoints over several inputs, and the areas and distances obtained over the several inputs are averaged to obtain the standard area and the standard distance.
A user's vertical gesture can take many concrete forms. For example, the little finger and the palm base at the little finger may be stretched out in a straight line, or the little finger may be bent and the palm base curved into an arc. The areas of the two endpoints, and especially the distance between them, can then vary considerably. It is therefore preferable to learn several forms of the user's vertical gesture input so that the standard area and standard distance obtained are more accurate.
It is of course also possible to learn the vertical gesture inputs of multiple users and store the correspondence between each user and the standard area and standard distance of that user's vertical gesture input. In that case, when the touch screen detects an input, the user's information is obtained and the user is identified; the standard area and standard distance of that user's vertical gesture input are then determined according to the stored correspondence, and the detected endpoint areas and endpoint distance are compared with them to judge whether the input is a vertical gesture input.
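A sketch of the calibration described in Embodiment 5, assuming each training sample supplies the two endpoint areas and the endpoint distance; averaging and the relative tolerance used to turn averages into ranges are illustrative choices, as is the per-user dictionary for the multi-user variant.

```python
from statistics import mean

def learn_standards(samples, tolerance=0.25):
    """Learn a per-user standard area and standard distance from repeated
    vertical-gesture inputs. samples is a list of
    (first_area, second_area, distance) tuples; each average is widened into a
    (min, max) band by a relative tolerance (illustrative)."""
    def band(values):
        m = mean(values)
        return (m * (1 - tolerance), m * (1 + tolerance))

    return {
        "first_area": band([s[0] for s in samples]),
        "second_area": band([s[1] for s in samples]),
        "distance": band([s[2] for s in samples]),
    }

# Multi-user variant: store the learned standards keyed by the identified user.
user_standards = {
    "user_a": learn_standards([(0.8, 4.0, 6.0), (0.9, 4.2, 6.3), (0.7, 3.9, 5.8)]),
}
```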
A vertical gesture is input by a person, so its force is not especially large. To distinguish a vertical gesture from other inputs, such as non-human input, and to judge the vertical gesture more accurately, in Embodiment 6 of the present invention the force of the input can also be obtained and compared with a preset force range; whether it falls within the preset range is judged and a judging result is generated. When this judging result and the judging results in the above embodiments are all positive, the input is determined to be a vertical gesture.
More preferably, since the input force differs somewhat from user to user, the user's input force can be learned to obtain a standard force. The specific learning process is similar to Embodiment 5 and is not described in detail here.
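A sketch of the force check of Embodiment 6, assuming the touch screen reports a pressure value for the input; the range is illustrative and, as with area and distance, could be replaced by a learned per-user standard force.

```python
def force_check(force, force_range=(0.1, 0.8)):
    """Fourth judging result: the input force must fall within the preset range,
    or within a learned per-user standard force band (units and values are
    illustrative, in whatever scale the touch screen reports)."""
    return force_range[0] <= force <= force_range[1]
```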
It should be clear that, in the present invention, the order of the judging steps can be interchanged; for example, the distance between the two endpoints can be judged first and then the areas of the two endpoints. However, once the result of any step is negative, the next step is not performed.
After the input has been detected and determined to be a vertical gesture, the direction of the vertical gesture can further be detected so that a corresponding instruction can be executed.
The direction of the vertical gesture can be obtained from the changes in the positions of the two endpoints. One concrete approach in the present invention is:
obtain the coordinate changes of the first endpoint and the second endpoint;
work out the change of the coordinates of the midpoint of the two endpoints;
connect the midpoints: the direction of the line from the starting midpoint to the ending midpoint is the direction of motion of the vertical gesture.
Suppose that, from the coordinates of the first endpoint and the second endpoint, it is detected that the y coordinate of their midpoint does not change while the x coordinate gradually increases. The line from the starting midpoint to the ending midpoint then points horizontally to the right, and this is the direction of motion of the vertical gesture: the hand moves horizontally to the right. Concretely, this gesture can be set as an instruction to archive all icons to the right of the hand, so that the instruction is executed when this direction of motion is detected.
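A sketch of the midpoint-based direction detection, assuming each call supplies the coordinates of both endpoints at the start and end of the gesture; the mapping of the rightward direction to an "archive icons" instruction is the illustrative example from the text.

```python
def motion_direction(start_first, start_second, end_first, end_second):
    """Return the displacement of the midpoint of the two endpoints between the
    start and end of the gesture; its direction is the gesture's direction of
    motion. Each argument is an (x, y) tuple."""
    start_mid = ((start_first[0] + start_second[0]) / 2,
                 (start_first[1] + start_second[1]) / 2)
    end_mid = ((end_first[0] + end_second[0]) / 2,
               (end_first[1] + end_second[1]) / 2)
    return (end_mid[0] - start_mid[0], end_mid[1] - start_mid[1])

# The example from the text: y unchanged, x increasing -> hand moves right.
dx, dy = motion_direction((1.0, 5.0), (1.0, 1.0), (6.0, 5.0), (6.0, 1.0))
if dx > 0 and abs(dx) > abs(dy):
    print("horizontal right -> e.g. archive all icons to the right of the hand")
```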
This embodiment shows that the vertical gesture lets a user operate on several items very naturally and simply; the vertical gesture therefore provides a convenient input method, which makes detecting it all the more important.
Embodiment 7 of the present invention provides a detection terminal for a vertical gesture. Referring to Fig. 4, the terminal includes:
an endpoint acquiring unit 11, configured to obtain a first endpoint and a second endpoint of a user input when the input is detected;
a first judging unit 12, configured to judge whether the areas of the first endpoint and the second endpoint satisfy a first preset range and generate a first judging result, wherein the area of the second endpoint is not smaller than the area of the first endpoint;
a second judging unit 13, configured to judge whether the distance between the first endpoint and the second endpoint falls within a preset range and generate a second judging result.
As explained for the method, the area judgment yields inputs whose areas conform to a vertical gesture, but not every such input is a vertical gesture. For example, if both hands touch the screen end to end in the vertical-gesture posture, one of the two resulting endpoints corresponds to the little finger of one hand and the other to the palm base of the other hand, a case the area judgment alone cannot exclude. In such a gesture the distance between the two endpoints is far greater than in a normal vertical gesture, so the second judging unit judges whether the distance between the two endpoints falls within the preset distance range.
The terminal further includes a determining unit 14, configured to determine that the input is a vertical gesture when both the first judging result and the second judging result are positive.
Users sometimes touch the screen accidentally, so accidental touches need to be distinguished from normal input. Because an accidental touch is momentary, it produces no continuous track, whereas a normal vertical-gesture input produces a continuous track of some length. In a preferred embodiment of the present invention, to avoid determining an accidental touch by the user to be a vertical gesture, the track of the input is examined, and the terminal further includes:
a track acquiring unit, configured to obtain the movement track of the input;
a third judging unit, configured to judge whether the track is a continuous track and generate a third judging result;
and the determining unit is further configured to determine that the input is a vertical gesture when the first judging result, the second judging result, and the third judging result are all positive.
Different users have different endpoint areas and different distances between the endpoints. To judge the vertical gesture more accurately, in a preferred embodiment of the present invention the terminal further includes a learning unit, configured to learn the user's vertical gesture input and obtain a standard area of the first and second endpoints of the user's vertical gesture and a standard distance between the first endpoint and the second endpoint. The first judging unit is further configured to judge whether the area of the second endpoint satisfies the standard area, and the second judging unit is further configured to judge whether the distance between the first endpoint and the second endpoint satisfies the standard distance.
Again, the standard area and the standard distance can each refer to a certain range rather than a single value, and depending on how the area is judged, the standard area can refer to the standard areas of the first and second endpoints or to the standard area difference between the two endpoints together with the standard area of one of them.
More preferably, the learning unit can learn the vertical gesture of the same user several times to obtain the endpoint areas and the distance between the two endpoints over several inputs, and average the areas and distances obtained to produce the standard area and the standard distance.
A user's vertical gesture can take many concrete forms. For example, the little finger and the palm base at the little finger may be stretched out in a straight line, or the little finger may be bent and the palm base curved into an arc. The areas of the two endpoints, and especially the distance between them, can then vary considerably. More preferably, the learning unit is therefore further configured to learn several forms of the user's vertical gesture input so that the standard area and standard distance obtained are more accurate.
Of course, the learning unit can also learn the vertical gesture inputs of multiple users and store the correspondence between each user and the standard area and standard distance of that user's vertical gesture input. When the touch screen detects an input, the user's information is obtained and the user is identified; the standard area and standard distance of that user's vertical gesture input are then determined according to the stored correspondence, and the first judging unit and the second judging unit respectively compare the detected endpoint areas and endpoint distance with them to judge whether the input is a vertical gesture input.
A vertical gesture is input by a person, so its force is not especially large. To distinguish a vertical gesture from other inputs, such as non-human input, and to judge the vertical gesture more accurately, in an embodiment of the present invention the terminal can further include a force acquiring unit and a fourth judging unit. The force acquiring unit is configured to obtain the force of the input; the fourth judging unit compares it with a preset force range, judges whether it falls within the preset range, and generates a judging result. The determining unit is further configured to determine that the input is a vertical gesture when this judging result and the judging results in the above embodiments are all positive.
More preferably, since the input force differs somewhat from user to user, the learning unit is further configured to learn the user's vertical gesture input and obtain a standard force of the user's vertical gesture input; the force acquiring unit obtains the force of the input; the fourth judging unit judges whether the force satisfies the standard force and generates a fourth judging result; and the determining unit is further configured to determine that the input is a vertical gesture when the first judging result, the second judging result, and the fourth judging result are all positive.
After the input has been detected and determined to be a vertical gesture, the direction of the vertical gesture can further be detected so that a corresponding instruction can be executed. For this the terminal further includes:
a motion direction acquiring unit, configured to obtain the direction of motion of the vertical gesture according to the changes in position of the first endpoint and the second endpoint;
and an operation unit, configured to execute a corresponding instruction according to the direction of motion.
The direction of the vertical gesture can be obtained from the changes in the positions of the two endpoints. In the present invention, the motion direction acquiring unit includes:
a coordinate acquiring unit, configured to obtain the coordinate changes of the first endpoint and the second endpoint;
a computing unit, configured to compute the change of the coordinates of the midpoint of the two endpoints;
and a connecting unit, configured to connect the midpoints; the direction of the line from the starting midpoint to the ending midpoint is the direction of motion of the vertical gesture.
Suppose that, from the coordinates of the first endpoint and the second endpoint, it is detected that the y coordinate of their midpoint does not change while the x coordinate gradually increases. The line from the starting midpoint to the ending midpoint then points horizontally to the right, and this is the direction of motion of the vertical gesture: the hand moves horizontally to the right. Concretely, this gesture can be set as an instruction to archive all icons to the right of the hand, so that the operation unit executes the instruction when this direction of motion is detected.
It is worth noting that the method of the present invention corresponds to the terminal of the present invention, so the terminal is not described in further detail; for the relevant parts, refer to the method embodiments.
The detection method and terminal for a vertical gesture provided by the present invention have been described above. Specific examples have been used herein to explain the principle and implementation of the present invention, and the description of the above embodiments is only intended to help understand the method of the present invention and its core idea. Meanwhile, those of ordinary skill in the art may, according to the idea of the present invention, make changes to the specific implementation and the scope of application. In summary, the contents of this specification should not be construed as limiting the present invention.

Claims (10)

1. A detection method for a vertical gesture, characterized in that the method includes:
when a user input is detected, obtaining a first endpoint and a second endpoint of the input, wherein a touch screen aggregates the user input to form the first endpoint and the second endpoint;
judging whether the areas of the first endpoint and the second endpoint satisfy a first preset range, and generating a first judging result, wherein the area of the second endpoint is not smaller than the area of the first endpoint;
judging whether the distance between the first endpoint and the second endpoint falls within a second preset range, and generating a second judging result;
when both the first judging result and the second judging result are positive, determining that the input is a vertical gesture.
2. The method according to claim 1, characterized in that the method further includes:
obtaining the movement track of the input;
judging whether the track is a continuous track, and generating a third judging result;
wherein the determining that the input is a vertical gesture when both the first judging result and the second judging result are positive includes:
determining that the input is a vertical gesture when the first judging result, the second judging result, and the third judging result are all positive.
3. The method according to claim 1, characterized in that the method further includes:
learning the user's vertical gesture input, and obtaining a standard area of the first endpoint and the second endpoint of the user's vertical gesture and a standard distance between the first endpoint and the second endpoint;
wherein the judging whether the areas of the first endpoint and the second endpoint fall within the first preset range includes:
judging whether the areas of the first endpoint and the second endpoint satisfy the standard area;
and the judging whether the distance between the first endpoint and the second endpoint falls within the preset range includes:
judging whether the distance between the first endpoint and the second endpoint satisfies the standard distance.
4. The method according to claim 3, characterized in that the method further includes:
learning the user's vertical gesture input, and obtaining a standard force of the user's vertical gesture input;
obtaining the force of the input;
judging whether the force satisfies the standard force, and generating a fourth judging result;
wherein the determining that the input is a vertical gesture when both the first judging result and the second judging result are positive includes:
determining that the input is a vertical gesture when the first judging result, the second judging result, and the fourth judging result are all positive.
5. The method according to any one of claims 1 to 4, characterized in that, when the input is determined to be a vertical gesture, the method further includes:
obtaining the direction of motion of the vertical gesture according to the changes in position of the first endpoint and the second endpoint;
executing a corresponding instruction according to the direction of motion.
6. A detection terminal for a vertical gesture, characterized in that the terminal includes:
an endpoint acquiring unit, configured to obtain a first endpoint and a second endpoint of a user input when the input is detected, wherein a touch screen aggregates the user input to form the first endpoint and the second endpoint;
a first judging unit, configured to judge whether the areas of the first endpoint and the second endpoint satisfy a first preset range and generate a first judging result, wherein the area of the second endpoint is not smaller than the area of the first endpoint;
a second judging unit, configured to judge whether the distance between the first endpoint and the second endpoint falls within a preset range and generate a second judging result;
a determining unit, configured to determine that the input is a vertical gesture when both the first judging result and the second judging result are positive.
7. The terminal according to claim 6, characterized in that the terminal further includes:
a track acquiring unit, configured to obtain the movement track of the input;
a third judging unit, configured to judge whether the track is a continuous track and generate a third judging result;
wherein the determining unit is further configured to determine that the input is a vertical gesture when the first judging result, the second judging result, and the third judging result are all positive.
8. The terminal according to claim 6, characterized in that the terminal further includes:
a learning unit, configured to learn the user's vertical gesture input and obtain a standard area of the first endpoint and the second endpoint of the user's vertical gesture and a standard distance between the first endpoint and the second endpoint;
wherein the first judging unit is further configured to judge whether the area of the second endpoint satisfies the standard area;
and the second judging unit is further configured to judge whether the distance between the first endpoint and the second endpoint satisfies the standard distance.
9. The terminal according to claim 8, characterized in that the learning unit is further configured to learn the user's vertical gesture input and obtain a standard force of the user's vertical gesture input;
and the terminal further includes:
a force acquiring unit, configured to obtain the force of the input;
a fourth judging unit, configured to judge whether the force satisfies the standard force and generate a fourth judging result;
wherein the determining unit is further configured to determine that the input is a vertical gesture when the first judging result, the second judging result, and the fourth judging result are all positive.
10. The terminal according to any one of claims 6 to 9, characterized in that, when the input is determined to be a vertical gesture, the terminal further includes:
a motion direction acquiring unit, configured to obtain the direction of motion of the vertical gesture according to the changes in position of the first endpoint and the second endpoint;
an operation unit, configured to execute a corresponding instruction according to the direction of motion.
CN201210031950.1A 2012-02-13 2012-02-13 Detection method and terminal for a vertical gesture Active CN103246462B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201210031950.1A CN103246462B (en) 2012-02-13 2012-02-13 Detection method and terminal for a vertical gesture

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201210031950.1A CN103246462B (en) 2012-02-13 2012-02-13 Detection method and terminal for a vertical gesture

Publications (2)

Publication Number Publication Date
CN103246462A CN103246462A (en) 2013-08-14
CN103246462B true CN103246462B (en) 2018-08-10

Family

ID=48926004

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201210031950.1A Active CN103246462B (en) 2012-02-13 2012-02-13 Detection method and terminal for a vertical gesture

Country Status (1)

Country Link
CN (1) CN103246462B (en)

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090267906A1 (en) * 2008-04-25 2009-10-29 Nokia Corporation Touch sensitive apparatus
EP2524490A4 (en) * 2010-02-04 2016-03-02 Nokia Technologies Oy User input

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101198925A (en) * 2004-07-30 2008-06-11 Apple Inc. Gestures for touch sensitive input devices
WO2007089766A2 (en) * 2006-01-30 2007-08-09 Apple Inc. Gesturing with a multipoint sensing device
CN102129289A (en) * 2010-01-19 2011-07-20 Avaya Inc. Event generation based on print portion identification
CN101882042A (en) * 2010-06-08 2010-11-10 苏州瀚瑞微电子有限公司 Palm judgment method of capacitive touch screen

Also Published As

Publication number Publication date
CN103246462A (en) 2013-08-14

Similar Documents

Publication Publication Date Title
TWI579734B (en) 3d visualization
US9588621B2 (en) Touch screen controller and method for controlling thereof
CN109634415B (en) It is a kind of for controlling the gesture identification control method of analog quantity
CN104077003B (en) A kind of method and device for merging mobile phone return key, home keys and Menu key
CN104699545B (en) The method and apparatus of misjudgment proximity event
JP2008052729A (en) Apparatus and method of sensing movement of multi-touch point, and mobile device using the same
CN106104434A (en) Touch panel device is used to determine user's handedness and orientation
CN108235747A (en) Edge false-touch prevention method, touch control equipment and storage medium
TWI511012B (en) Method of recognizing touch
TWI528271B (en) Method, apparatus and computer program product for polygon gesture detection and interaction
AU2015202763A1 (en) Glove touch detection
CN107346206A (en) The control method and mobile terminal of a kind of mobile terminal
Surale et al. Experimental analysis of mode switching techniques in touch-based user interfaces
CN105278734B (en) The control method and control device of touch sensor panel
JP2015026375A (en) Input signal identification method of touch panel
CN109582126A (en) Input interface unit, control method and non-transient computer readable media
Dang et al. Usage and recognition of finger orientation for multi-touch tabletop interaction
CN103246462B (en) A kind of detection method and terminal of vertical gesture
US20150042586A1 (en) Input Device
Zhang et al. Left and right hand distinction for multi-touch tabletop interactions
CN104007865B (en) Recognition method and electronic device
CN104063173A (en) Mobile terminal and unlocking method of mobile terminal
US20130328617A1 (en) Touch device and method for detecting touch thereof
CN104063179B (en) Control method, device and the electronic equipment of a kind of electronic equipment
Edwin et al. Hand detection for virtual touchpad

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant