CN103218167B - Single-point touch gesture recognition method for a vehicle-mounted terminal - Google Patents
Single-point touch gesture recognition method for a vehicle-mounted terminal
- Publication number
- CN103218167B CN103218167B CN201310114048.0A CN201310114048A CN103218167B CN 103218167 B CN103218167 B CN 103218167B CN 201310114048 A CN201310114048 A CN 201310114048A CN 103218167 B CN103218167 B CN 103218167B
- Authority
- CN
- China
- Prior art keywords
- designated
- ordinate
- point
- gesture
- touch
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Abstract
The invention provides a single-point touch gesture recognition method for a vehicle-mounted terminal. The method establishes a rectangular coordinate system, obtains the starting point, the extreme points and the ending point of the touch trajectory, and computes the gesture figure type from these points. It requires neither a pre-built gesture figure library nor storage of the collected gesture points: the moment the user's finger leaves the touch screen, the point-by-point judgement is complete and the user's gesture figure is recognized. Compared with the graphic-matching method, the recognition speed of the method is roughly doubled. The method reduces the space complexity of the algorithm, increases the recognition speed, and lets the user operate quickly and conveniently.
Description
Technical field
The invention belongs to the field of touch screen technology, and specifically relates to a single-point touch gesture recognition method for a vehicle-mounted terminal.
Background technology
Vehicle-mounted terminals are widely used while a vehicle is travelling: the driver performs different touch gestures (such as a horizontal line, a vertical line, a check mark, a circle, etc.) on the terminal's touch screen to express different meanings. While driving, the driver does not have enough time to perform detailed operations on the terminal, so a simple and efficient gesture recognition method is needed for convenient operation.
Existing touch-screen gesture recognition methods match the detected gesture figure against figures in a known figure library and judge the gesture by the degree of matching; alternatively, multi-touch methods recognize operations such as rotating and stretching by comparing the rotation or stretch angle against fixed thresholds. Although such methods can recognize gesture figures accurately, their algorithms are complicated and slow, demand much in time and space complexity, and place high requirements on the hardware. A simpler and more efficient implementation is therefore needed to solve these problems.
The existing graphic-matching method collects 300 points of the gesture figure and applies translation, rotation and scaling to judge the similarity between the input gesture and the figures in the library, thereby recognizing the gesture figure.
Summary of the invention
In view of the deficiencies of the prior art, the object of the present invention is to provide a single-point touch gesture recognition method for a vehicle-mounted terminal that recognizes operating gestures simply, quickly and conveniently.
A single-point touch gesture recognition method for a vehicle-mounted terminal, carried out according to the following steps:
Step 1, establish a rectangular coordinate system:
With the upper-left corner of the capacitive touch screen as the origin, the positive abscissa direction is horizontally to the right and the positive ordinate direction is vertically downward;
Step 2, obtain the starting point:
When the user begins a touch input, record the starting point of the input, denoted P_s, with its abscissa denoted P_s(x) and its ordinate denoted P_s(y);
Step 3, obtain the extreme points:
Define temporary extreme points P_t1 and P_t2: the abscissa of P_t1 is denoted P_t1(x) and its ordinate P_t1(y); likewise the abscissa of P_t2 is denoted P_t2(x) and its ordinate P_t2(y). Initialize: P_t1(x) = P_t2(x) = P_s(x) and P_t1(y) = P_t2(y) = P_s(y). During the touch input, each touch point is denoted P_i, with abscissa P_i(x) and ordinate P_i(y); then:
If P_i(x) - P_t1(x) > 0, then P_t1(x) = P_i(x); otherwise P_t1(x) is unchanged;
If P_i(x) - P_t2(x) < 0, then P_t2(x) = P_i(x); otherwise P_t2(x) is unchanged;
If P_i(y) - P_t1(y) > 0, then P_t1(y) = P_i(y); otherwise P_t1(y) is unchanged;
If P_i(y) - P_t2(y) < 0, then P_t2(y) = P_i(y); otherwise P_t2(y) is unchanged.
When the touch ends, the abscissa maximum P_t1(x), the abscissa minimum P_t2(x), the ordinate maximum P_t1(y) and the ordinate minimum P_t2(y) are obtained. The point with the minimum abscissa is denoted P_xmin, the point with the maximum abscissa P_xmax, the point with the minimum ordinate P_ymin, and the point with the maximum ordinate P_ymax;
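The running min/max updates of Step 3 can be sketched in a single pass over the trajectory. This is an illustrative Python sketch under the strict comparisons stated above (ties keep the earlier point); the function and variable names are assumptions, not from the patent:

```python
def track_extremes(points):
    """One pass over (x, y) touch points; returns the four extreme points
    (x-minimum point, x-maximum point, y-minimum point, y-maximum point)."""
    x_min_pt = x_max_pt = y_min_pt = y_max_pt = points[0]  # init with start
    for p in points[1:]:
        if p[0] < x_min_pt[0]:   # strictly smaller abscissa
            x_min_pt = p
        if p[0] > x_max_pt[0]:   # strictly larger abscissa
            x_max_pt = p
        if p[1] < y_min_pt[1]:   # strictly smaller ordinate
            y_min_pt = p
        if p[1] > y_max_pt[1]:   # strictly larger ordinate
            y_max_pt = p
    return x_min_pt, x_max_pt, y_min_pt, y_max_pt

# Rough circular trajectory: four distinct extreme points
pts = [(5, 5), (9, 2), (12, 5), (9, 8)]
print(track_extremes(pts))  # ((5, 5), (12, 5), (9, 2), (9, 8))
```

Only the four extreme points and the start/end points are kept, which is what lets the method avoid storing the full trajectory.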
Step 4, obtain the ending point:
When the user's finger leaves the touch screen, the touch input ends; record the ending point, denoted P_e, with its abscissa denoted P_e(x) and its ordinate denoted P_e(y);
Step 5, compute the gesture type:
The gesture is classified from the coordinates of the starting point, the ending point and the four extreme points:
When P_xmin, P_xmax, P_ymin and P_ymax are four distinct points, the gesture figure input by the user is a circle;
When P_xmin = P_ymin, or P_xmin = P_ymax, or P_xmax = P_ymin, or P_xmax = P_ymax, the gesture figure input by the user is a check mark or a straight line, and is further distinguished:
(1) when P_ymax(y) = P_s(y) or P_ymax(y) = P_e(y), the gesture figure input by the user is a straight line;
(2) when P_ymax(y) ≠ P_s(y) and P_ymax(y) ≠ P_e(y), the gesture figure input by the user is a check mark.
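The decision rules of Step 5 reduce to a few point comparisons. A minimal Python sketch, assuming the start point, end point and four extreme points are already available as (x, y) tuples, with the ordinate growing downward as in Step 1 (names are illustrative, not from the patent):

```python
def classify_gesture(start, end, p_xmin, p_xmax, p_ymin, p_ymax):
    """Classify a single-point touch gesture from its five key points."""
    if len({p_xmin, p_xmax, p_ymin, p_ymax}) == 4:
        return "circle"       # all four extreme points are distinct
    # Some extreme points coincide: the figure is a line or a check mark.
    if p_ymax[1] == start[1] or p_ymax[1] == end[1]:
        return "line"         # ordinate maximum lies on an endpoint
    return "check mark"       # ordinate maximum is interior to the stroke

# Horizontal line: the y extremes coincide with the endpoints
print(classify_gesture((0, 0), (10, 0), (0, 0), (10, 0), (0, 0), (0, 0)))  # line
# Check mark: the bottom vertex (largest y, since y grows downward) is interior
print(classify_gesture((0, 2), (8, 0), (0, 2), (8, 0), (8, 0), (3, 8)))    # check mark
```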
Compared with the prior art, the beneficial effects of the present invention are:
The single-point touch gesture recognition method provided by the invention requires neither a pre-built gesture figure library nor storage of the collected gesture points; the moment the user's finger leaves the touch screen, the point-by-point judgement is complete and the user's gesture figure is recognized. Compared with the graphic-matching method, the recognition speed of the method is roughly doubled. The method reduces the space complexity of the algorithm, increases the recognition speed, and lets the user operate quickly and conveniently.
Accompanying drawing explanation
Fig. 1 is a flow chart of the single-point touch gesture recognition method for a vehicle-mounted terminal.
Fig. 2 is a coordinate schematic diagram of a circular gesture figure.
Fig. 3 is a coordinate schematic diagram of a horizontal straight-line gesture figure.
Fig. 4 is a coordinate schematic diagram of a check-mark gesture figure.
The particular content of the present invention is described in more detail below in conjunction with the drawings and embodiments.
Embodiment
In accordance with the above technical scheme, Fig. 1 shows the flow chart of the touch-screen gesture input recognition method of the embodiment of the present invention. First, the upper-left corner of the touch screen is taken as the origin O, with the positive X-axis horizontally to the right and the positive Y-axis vertically downward. The user then inputs a gesture figure (circle, check mark or straight line); as the gesture trajectory moves, the abscissa and ordinate of each touch point are compared against the running extremes. When the touch ends, the four extreme points of the whole trajectory and the end point are obtained, giving the key points of the trajectory. If no two of the four extreme points are identical, the input gesture is a circle; otherwise it is a check mark or a straight line. The ordinate maximum of the input gesture figure is then compared with the ordinates of the start and end points: if neither is equal, the gesture is a check mark; otherwise it is a straight line.
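The flow just described, from extreme tracking to classification, can be combined into one self-contained sketch (illustrative Python; the names and sample trajectories are assumptions, not from the patent):

```python
def recognize(points):
    """Recognize a gesture from an ordered list of (x, y) touch points,
    with the origin at the top-left and y growing downward."""
    start, end = points[0], points[-1]
    x_min = x_max = y_min = y_max = start
    for p in points[1:]:                 # running extremes over the trajectory
        if p[0] < x_min[0]: x_min = p
        if p[0] > x_max[0]: x_max = p
        if p[1] < y_min[1]: y_min = p
        if p[1] > y_max[1]: y_max = p
    if len({x_min, x_max, y_min, y_max}) == 4:
        return "circle"                  # no coinciding extreme points
    if y_max[1] in (start[1], end[1]):
        return "line"                    # ordinate maximum on an endpoint
    return "check mark"

print(recognize([(5, 0), (10, 5), (5, 10), (0, 5), (5, 1)]))  # circle
print(recognize([(0, 0), (5, 0), (10, 0)]))                   # line
print(recognize([(0, 2), (3, 8), (8, 0)]))                    # check mark
```

Note the constant memory footprint: only six points are ever held, which matches the method's claimed space-complexity advantage over storing and matching full trajectories.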
Specific embodiments of the invention are given below. It should be noted that the invention is not limited to the following specific embodiments; all equivalents made on the basis of the technical scheme fall within the protection scope of the invention.
Embodiment 1:
The user inputs a gesture figure as shown in Fig. 2. Following the recognition method above, the starting point is S(X_s, Y_s), the ending point is E(X_e, Y_e), the abscissa minimum point is D(X_4, Y_4), the abscissa maximum point is B(X_2, Y_2), the ordinate minimum point is A(X_1, Y_1), and the ordinate maximum point is C(X_3, Y_3). Calculation shows:
The four extreme points are all distinct, i.e. D(X_4, Y_4) ≠ B(X_2, Y_2) ≠ C(X_3, Y_3) ≠ A(X_1, Y_1);
It is therefore concluded that this gesture figure is a circle.
Embodiment 2:
The user inputs a gesture figure as shown in Fig. 3. Following the recognition method above, the starting point is A(X_1, Y_1), the ending point is B(X_2, Y_2), the abscissa minimum point is A(X_1, Y_1), the abscissa maximum point is B(X_2, Y_2), the ordinate minimum point is B(X_2, Y_2), and the ordinate maximum point is A(X_1, Y_1). Calculation shows:
The abscissa minimum point coincides with the ordinate maximum point, both being A(X_1, Y_1);
The abscissa maximum point coincides with the ordinate minimum point, both being B(X_2, Y_2);
Identical points exist, so the calculation continues:
The ordinate maximum point coincides with the starting point, both being A(X_1, Y_1);
It is therefore concluded that this gesture figure is a straight line.
Embodiment 3:
The user inputs a gesture figure as shown in Fig. 4. Following the recognition method above, the starting point is A(X_1, Y_1), the ending point is C(X_3, Y_3), the abscissa minimum point is A(X_1, Y_1), the abscissa maximum point is C(X_3, Y_3), the ordinate minimum point is C(X_3, Y_3), and the ordinate maximum point is B(X_2, Y_2). Calculation shows:
The abscissa maximum point coincides with the ordinate minimum point, both being C(X_3, Y_3);
Identical points exist, so the calculation continues:
The ordinate maximum point differs from both the starting point and the ending point, i.e. B(X_2, Y_2) ≠ A(X_1, Y_1) and B(X_2, Y_2) ≠ C(X_3, Y_3);
It is therefore concluded that this gesture figure is a check mark.
Claims (1)
1. A single-point touch gesture recognition method for a vehicle-mounted terminal, characterized in that the method is carried out according to the following steps:
Step 1, establish a rectangular coordinate system:
With the upper-left corner of the capacitive touch screen as the origin, the positive abscissa direction is horizontally to the right and the positive ordinate direction is vertically downward;
Step 2, obtain the starting point:
When the user begins a touch input, record the starting point of the input, denoted P_s, with its abscissa denoted P_s(x) and its ordinate denoted P_s(y);
Step 3, obtain the extreme points:
Define temporary extreme points P_t1 and P_t2: the abscissa of P_t1 is denoted P_t1(x) and its ordinate P_t1(y); likewise the abscissa of P_t2 is denoted P_t2(x) and its ordinate P_t2(y). Initialize: P_t1(x) = P_t2(x) = P_s(x) and P_t1(y) = P_t2(y) = P_s(y). During the touch input, each touch point is denoted P_i, with abscissa P_i(x) and ordinate P_i(y); then:
If P_i(x) - P_t1(x) > 0, then P_t1(x) = P_i(x); otherwise P_t1(x) is unchanged;
If P_i(x) - P_t2(x) < 0, then P_t2(x) = P_i(x); otherwise P_t2(x) is unchanged;
If P_i(y) - P_t1(y) > 0, then P_t1(y) = P_i(y); otherwise P_t1(y) is unchanged;
If P_i(y) - P_t2(y) < 0, then P_t2(y) = P_i(y); otherwise P_t2(y) is unchanged.
When the touch ends, the abscissa maximum P_t1(x), the abscissa minimum P_t2(x), the ordinate maximum P_t1(y) and the ordinate minimum P_t2(y) are obtained. The point with the minimum abscissa is denoted P_xmin, the point with the maximum abscissa P_xmax, the point with the minimum ordinate P_ymin, and the point with the maximum ordinate P_ymax;
Step 4, obtain the ending point:
When the user's finger leaves the touch screen, the touch input ends; record the ending point, denoted P_e, with its abscissa denoted P_e(x) and its ordinate denoted P_e(y);
Step 5, compute the gesture type:
The gesture is classified from the coordinates of the starting point, the ending point and the four extreme points:
When P_xmin, P_xmax, P_ymin and P_ymax are four distinct points, the gesture figure input by the user is a circle;
When P_xmin = P_ymin, or P_xmin = P_ymax, or P_xmax = P_ymin, or P_xmax = P_ymax, the gesture figure input by the user is a check mark or a straight line, and is further distinguished:
(1) when P_ymax(y) = P_s(y) or P_ymax(y) = P_e(y), the gesture figure input by the user is a straight line;
(2) when P_ymax(y) ≠ P_s(y) and P_ymax(y) ≠ P_e(y), the gesture figure input by the user is a check mark.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201310114048.0A CN103218167B (en) | 2013-04-02 | 2013-04-02 | A kind of car-mounted terminal single-point touch gesture pattern recognition |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201310114048.0A CN103218167B (en) | 2013-04-02 | 2013-04-02 | A kind of car-mounted terminal single-point touch gesture pattern recognition |
Publications (2)
Publication Number | Publication Date |
---|---|
CN103218167A CN103218167A (en) | 2013-07-24 |
CN103218167B true CN103218167B (en) | 2015-09-02 |
Family
ID=48816022
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201310114048.0A Active CN103218167B (en) | 2013-04-02 | 2013-04-02 | A kind of car-mounted terminal single-point touch gesture pattern recognition |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN103218167B (en) |
Families Citing this family (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107145295A (en) * | 2017-05-04 | 2017-09-08 | 浙江路港互通信息技术有限公司 | The implementation method of gesture positioning and the device of gesture positioning |
CN111399740B (en) * | 2020-03-11 | 2021-10-01 | 上海科世达-华阳汽车电器有限公司 | Touch gesture recognition method and system |
CN112115853A (en) * | 2020-09-17 | 2020-12-22 | 西安羚控电子科技有限公司 | Gesture recognition method and device, computer storage medium and electronic equipment |
CN112363613A (en) * | 2020-09-25 | 2021-02-12 | 惠州市德赛西威汽车电子股份有限公司 | Infrared sliding gesture induction recognition method |
CN112770130B (en) * | 2020-12-30 | 2022-10-14 | 咪咕互动娱乐有限公司 | Live broadcast control method, electronic equipment and storage equipment |
CN112926518A (en) * | 2021-03-29 | 2021-06-08 | 上海交通大学 | Gesture password track restoration system based on video in complex scene |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102147707A (en) * | 2011-03-30 | 2011-08-10 | 中国科学院软件研究所 | Method for identifying multi-finger touch gestures based on strokes |
CN102289318A (en) * | 2011-07-06 | 2011-12-21 | 广东威创视讯科技股份有限公司 | Method and device for processing writing of touch screen |
CN102622225A (en) * | 2012-02-24 | 2012-08-01 | 合肥工业大学 | Multipoint touch application program development method supporting user defined gestures |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP5569062B2 (en) * | 2010-03-15 | 2014-08-13 | オムロン株式会社 | Gesture recognition device, method for controlling gesture recognition device, and control program |
CN103098012B (en) * | 2010-09-15 | 2016-06-08 | 先进矽有限公司 | For detecting the method that in multi-touch device, any amount touches |
- 2013-04-02: CN201310114048.0A — granted as patent CN103218167B (active)
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102147707A (en) * | 2011-03-30 | 2011-08-10 | 中国科学院软件研究所 | Method for identifying multi-finger touch gestures based on strokes |
CN102289318A (en) * | 2011-07-06 | 2011-12-21 | 广东威创视讯科技股份有限公司 | Method and device for processing writing of touch screen |
CN102622225A (en) * | 2012-02-24 | 2012-08-01 | 合肥工业大学 | Multipoint touch application program development method supporting user defined gestures |
Also Published As
Publication number | Publication date |
---|---|
CN103218167A (en) | 2013-07-24 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN103218167B (en) | A kind of car-mounted terminal single-point touch gesture pattern recognition | |
US8674958B1 (en) | Method and apparatus for accurate coordinate calculation of objects in touch applications | |
CN101408824A (en) | Method for recognizing mouse gesticulation | |
US9910541B2 (en) | Mis-touch recognition method and device | |
CN103294401A (en) | Icon processing method and device for electronic instrument with touch screen | |
KR20160005656A (en) | Method of performing a touch action in a touch sensitive device | |
CN106650648B (en) | Recognition method and system for erasing handwriting | |
US9778780B2 (en) | Method for providing user interface using multi-point touch and apparatus for same | |
CN102902407B (en) | A kind of touch-screen output display touches the method and apparatus of person's handwriting | |
CN102768595B (en) | A kind of method and device identifying touch control operation instruction on touch-screen | |
WO2019062243A1 (en) | Identification method and apparatus for touch operation, and electronic device | |
WO2014088722A1 (en) | Multi-touch symbol recognition | |
CN103902086A (en) | Curve fitting based touch trajectory smoothing method and system | |
CN101807130B (en) | Touch control position correcting method | |
US20150355769A1 (en) | Method for providing user interface using one-point touch and apparatus for same | |
CN113342208B (en) | Railway line selection method based on multi-point touch equipment, terminal and storage medium | |
CN102426491A (en) | Multipoint touch realization method and system for touch screen | |
CN103324410A (en) | Method and apparatus for detecting touch | |
CN102426483B (en) | Multi-channel accurate target positioning method for touch equipment | |
CN105426729A (en) | Information processing method and electronic equipment | |
CN102129321A (en) | Touch screen-based track recording and comparing method | |
CN104460999A (en) | Method and device for recognizing gesture with inflection point | |
CN103809912A (en) | Tablet personal computer based on multi-touch screen | |
CN110442266A (en) | Object identification method, device, electronic equipment and storage medium | |
CN113296616B (en) | Pen point selection method and device and intelligent terminal |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
C14 | Grant of patent or utility model | ||
GR01 | Patent grant | ||
TR01 | Transfer of patent right | ||
TR01 | Transfer of patent right |
Effective date of registration: 2021-06-10
Address after: Room 1008, block C, Saigao city square, Xi'an Economic and Technological Development Zone, Shaanxi 710000
Patentee after: Shaanxi intelligent networked automobile Research Institute Co.,Ltd.
Address before: 710064 middle section of south 2nd Ring Road, Xi'an, Shaanxi
Patentee before: CHANG'AN University