CN103093189A - Touch screen writing pen identifying method - Google Patents
- Publication number
- CN103093189A CN103093189A CN2012105573602A CN201210557360A CN103093189A CN 103093189 A CN103093189 A CN 103093189A CN 2012105573602 A CN2012105573602 A CN 2012105573602A CN 201210557360 A CN201210557360 A CN 201210557360A CN 103093189 A CN103093189 A CN 103093189A
- Authority
- CN
- China
- Prior art keywords
- touch
- camera
- writing pen
- screen
- actual size
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Landscapes
- User Interface Of Digital Computer (AREA)
Abstract
The invention relates to a writing pen identifying method, and discloses a touch screen writing pen identifying method. The method acquires the light spots in the images captured by three cameras arranged at the upper left corner, the middle, and the upper right corner of a touch screen, together with the touch point coordinates on the touch screen, and computes the actual size of the touch body with respect to each camera. If the actual sizes corresponding to all three light spots are smaller than their respective preset thresholds, the current touch body is a writing pen and writing is allowed; otherwise writing is not allowed. Errors caused by unintentional touches by the operator can thus be effectively avoided, and touch operations can be carried out accurately.
Description
Technical field
The present invention relates to writing pen recognition methods, and more specifically to a writing pen recognition method for an interactive touch screen.
Background technology
For interactive touch screens, traditional input methods include writing pen input and finger input. Writing directly on the touch screen with a finger is typical, but it is awkward, wears on the finger, and introduces many writing disturbances: for example, when writing with the forefinger, one or more of the middle finger, ring finger, and little finger may unintentionally touch the screen, and a sleeve may also brush against the screen, interfering with the forefinger's strokes and producing wrong lines. Writing pen input largely overcomes these problems. There are at present devices for touch screens that identify whether the object touching the screen is a finger, a writing pen, or some other object. If such a device can accurately identify whether the touch object currently on the screen is a writing pen, the writing pen can be set as the touch screen's writing input tool while fingers and other touch objects are not accepted as writing input tools, thereby improving the recognition accuracy of the touch screen.
The patent application with application number 201110289113.4 discloses a writing pen recognition method for a touch screen. In that method an identification vector is obtained as the product of the distance from the touch point to a camera and the corresponding spot width at that camera; the identification vector is compared with a preset coordinate threshold to determine a judgement factor, and the judgement factor is then compared with a fixed preset threshold to finally decide whether the touch object is a writing pen. That method is based on statistics: it requires collecting and analysing a large amount of touch data and obtains its computing formula by data fitting. It lacks a rigorous mathematical derivation, its recognition accuracy is unstable, and it is not conducive to the improvement and application of touch technology.
Summary of the invention
To overcome the deficiencies of the prior art, the present invention proposes a writing pen recognition method for a touch screen. The method can accurately identify whether the touch object currently on the touch screen is a writing pen, so that the writing pen can be set as the touch screen's writing input tool while fingers and other touch objects are not accepted as writing input tools, improving the recognition accuracy of the touch screen.
To achieve these goals, the technical scheme of the present invention is as follows:
A writing pen recognition method for a touch screen comprises the following steps:
S1. Acquire the light spots in the images captured by the three cameras arranged at the upper left corner, the middle, and the upper right corner of the touch screen, and the touch point coordinates on the touch screen;
S2. Calculate the distances dl, dm, dr from the touch body to the left, middle, and right cameras, and the widths wl, wm, wr of the light spots in the images captured by the left, middle, and right cameras;
S3. Calculate the actual size of the touch body with respect to each camera;
S4. Set thresholds th1, th2, th3 for the size of the touch body with respect to the left, middle, and right cameras respectively, and compare the actual sizes computed in step S3 with th1, th2, th3. If the actual size with respect to every camera is smaller than the corresponding threshold, the current touch body is a writing pen and writing is allowed; otherwise it is not a writing pen and writing is not allowed (a sketch of this decision logic follows these steps).
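The decision logic of steps S1-S4 can be summarised in a short sketch. The following Python sketch is illustrative only: the camera positions, correction factors kl, km, kr, helper names, and example inputs are assumptions, while the size formula and the thresholds th1 = 26, th2 = 22, th3 = 26 follow formulas (1)-(3) and the embodiment described later.

```python
import math

# Assumed camera positions (screen pixel coordinates) and correction factors;
# the embodiment only fixes the relation 2*kl = 2*kr = km, so the values are illustrative.
CAMERAS = {
    "l": {"pos": (0.0, 0.0),    "k": 640.0},   # upper-left corner
    "m": {"pos": (960.0, 0.0),  "k": 1280.0},  # middle of the top edge
    "r": {"pos": (1920.0, 0.0), "k": 640.0},   # upper-right corner
}
THRESHOLDS = {"l": 26.0, "m": 22.0, "r": 26.0}  # th1, th2, th3 from the embodiment

def actual_size(d, w, k):
    """Actual spot size seen by one camera: R = d * sin(pi/2 * w / k), in pixels."""
    return d * math.sin(math.pi / 2 * w / k)

def is_writing_pen(touch_xy, spot_widths):
    """Steps S2-S4: True if the touch body is judged to be a writing pen.

    touch_xy:    (x, y) touch point coordinates from step S1
    spot_widths: {"l": wl, "m": wm, "r": wr} spot widths in camera pixels
    """
    x, y = touch_xy
    for name, cam in CAMERAS.items():
        cx, cy = cam["pos"]
        d = math.hypot(x - cx, y - cy)                    # distance to this camera (S2)
        r = actual_size(d, spot_widths[name], cam["k"])   # actual size (S3)
        if r >= THRESHOLDS[name]:                         # threshold test (S4)
            return False                                  # too large: not a writing pen
    return True                                           # all three sizes below threshold

# Example: a narrow spot seen from the centre of a 1920x1080 screen
print(is_writing_pen((960.0, 540.0), {"l": 8, "m": 20, "r": 8}))  # True
```

Under these assumed parameters a finger, which produces a wider spot and hence a larger computed size, fails at least one of the three comparisons and is rejected.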
Compared with the prior art, the writing pen recognition method for a touch screen of the present invention makes a judgement before any writing action is carried out: the actual size of the touch body, derived from the light spots captured by the cameras, is compared with the preset thresholds to decide whether the touch body currently on the touch screen is a writing pen. Writing is carried out only if it is; otherwise no writing action is performed. Errors caused by unintentional touches by the operator are thus effectively avoided, and touch operations can be carried out more accurately.
Description of drawings
Fig. 1 is a structural schematic diagram of the touch positioning device used by the method of the invention.
Fig. 2 is a structural schematic diagram of the writing pen in the present invention.
Fig. 3 is a flow chart of the writing pen recognition method of the present invention.
Fig. 4 is a schematic diagram of the writing pen size calculation in the present invention.
Fig. 5 is a schematic diagram of the spot width calculation method in the present invention.
Embodiment
The present invention is further described below with reference to the accompanying drawings, but embodiments of the present invention are not limited to these examples.
As shown in Fig. 1, in the touch positioning device for a large-size touch screen, 101, 102, and 103 are three cameras mounted respectively at the upper left corner, the middle, and the upper right corner of the touch screen frame 104. The field of view of cameras 101 and 103 is 90 degrees, and the field of view of camera 102 is 170 degrees. In this embodiment the resolution of the images captured by the three cameras is 1280*8 and the image pixel color depth is 8 bits. The imaged background is the touch screen frame, which is made of a black light-absorbing material, so when nothing touches the touch screen 105 the images collected by the three cameras show only a black background. As soon as something blocks the light path over the touch screen, a light spot appears in the images collected by the cameras. The present invention identifies from this light spot whether the touch object is a writing pen. The resolution of the touch screen used in this embodiment is 1920*1080.
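As a small illustration of this imaging setup, a touch can be detected by checking whether a camera frame, which images the black light-absorbing frame as a dark background, contains any bright pixels. The binarization threshold and frame size below are assumptions:

```python
import numpy as np

def touch_present(frame, threshold=128):
    """True if the camera frame contains a light spot, i.e. something touches the screen.

    frame: 2-D uint8 array from one camera; with nothing on the screen the frame is black.
    """
    return bool(np.any(frame >= threshold))

# Example: an all-black frame versus a frame with a small bright spot
black = np.zeros((8, 1280), dtype=np.uint8)
touched = black.copy()
touched[0, 640:652] = 255
print(touch_present(black), touch_present(touched))  # False True
```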
As shown in Fig. 2, the structure of the writing pen is roughly divided into two parts: a pen body 202 and a nib 201. In the method described in this patent the nib of the writing pen is circular, and writing pens with a nib diameter in the range of 2.5-3.5 mm can all be identified accurately.
As shown in Fig. 3, the writing pen recognition method for a touch screen of the present invention comprises the following steps:
S1. Acquire the light spots in the images captured by the three cameras arranged at the upper left corner, the middle, and the upper right corner of the touch screen, and the touch point coordinates (x, y) on the touch screen;
S2. Calculate the distances dl, dm, dr from the touch body to the left, middle, and right cameras, and the widths wl, wm, wr of the light spots in the images captured by the left, middle, and right cameras;
S3. Calculate the actual size of the touch body with respect to each camera;
S4. Set thresholds th1, th2, th3 for the size of the touch body with respect to the left, middle, and right cameras respectively, and compare the actual sizes computed in step S3 with th1, th2, th3. If the actual size with respect to every camera is smaller than the corresponding threshold, the current touch body is a writing pen and writing is allowed; otherwise it is not a writing pen and writing is not allowed.
From the light spots of step S1, the spot widths wl, wm, wr of the writing object at the three cameras are obtained. Using the touch point coordinates (x, y) and the position parameters (Xl, Yl), (Xm, Ym), (Xr, Yr) of the three cameras (l, m, r denoting the left, middle, and right cameras respectively), the distances dl, dm, dr from the touch point of the touch body on the touch screen to the three cameras are calculated.
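The patent does not spell out the distance computation; a minimal sketch, assuming plain Euclidean distance between the touch point (x, y) and each camera position, is:

```python
import math

def distances_to_cameras(touch_xy, camera_positions):
    """Distances dl, dm, dr from the touch point to the left, middle, and right cameras.

    touch_xy:         (x, y) touch point on the screen, in screen pixels
    camera_positions: {"l": (Xl, Yl), "m": (Xm, Ym), "r": (Xr, Yr)}
    """
    x, y = touch_xy
    return {name: math.hypot(x - cx, y - cy)
            for name, (cx, cy) in camera_positions.items()}

# Example with assumed camera positions on a 1920x1080 screen
cams = {"l": (0.0, 0.0), "m": (960.0, 0.0), "r": (1920.0, 0.0)}
print(distances_to_cameras((960.0, 540.0), cams))  # dl ≈ dr ≈ 1101.5, dm = 540.0
```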
The calculation in step S3 of the actual size of the touch body with respect to each camera, i.e. the actual size of the writing pen, is carried out according to formulas (1)-(3); the unit is pixels:
Rl = dl*sin(π/2*wl/kl) (1)
Rm = dm*sin(π/2*wm/km) (2)
Rr = dr*sin(π/2*wr/kr) (3)
where Rl is the actual size of the light spot captured by the left camera, Rm is the actual size of the light spot captured by the middle camera, and Rr is the actual size of the light spot captured by the right camera;
and where kl, km, kr are the correction factors of the left, middle, and right cameras respectively, which keep the values of (π/2*wl/kl), (π/2*wm/km), and (π/2*wr/kr) within the range [0, π/2]. The three parameters kl, km, kr depend on the lenses of the camera system; in this embodiment the relation between kl, km, kr is 2*kl = 2*kr = km.
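A minimal sketch of formulas (1)-(3). The correction factors below are illustrative values chosen to satisfy 2*kl = 2*kr = km and to keep the sine argument inside [0, π/2]; the actual values depend on the camera lenses:

```python
import math

def pen_size(d, w, k):
    """Formulas (1)-(3): R = d * sin(pi/2 * w / k), with R in pixels."""
    angle = math.pi / 2 * w / k
    assert 0.0 <= angle <= math.pi / 2, "w/k must keep the angle inside [0, pi/2]"
    return d * math.sin(angle)

# Illustrative correction factors satisfying 2*kl = 2*kr = km
kl, km, kr = 640.0, 1280.0, 640.0
Rl = pen_size(d=1101.5, w=8,  k=kl)   # left camera
Rm = pen_size(d=540.0,  w=20, k=km)   # middle camera
Rr = pen_size(d=1101.5, w=8,  k=kr)   # right camera
print(round(Rl, 1), round(Rm, 1), round(Rr, 1))  # approx. 21.6 13.3 21.6
```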
Fig. 4 is a schematic diagram of the principle behind the formulas for Rl, Rm, Rr. Light reflected by the writing pen is received by the camera; rays 1 and 2 in the figure are tangent to the circular cross-section of the writing pen, the angle between the two tangent rays is θ, and the distance from the circle to the camera is d. From the angle relations of the right triangle, the radius of the circle is:
r=d*sin(θ/2)
The spot width reflects the distance of the touch object from the camera: the larger the spot width, the closer the touch object is to the camera, and the smaller the spot width, the farther away it is. In this touch system the spot width and the angle satisfy the following relation (written here for the right camera; the left and middle cameras are analogous):
θ/2 = π/2*wr/kr.
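Substituting the spot-width relation into r = d*sin(θ/2) recovers formulas (1)-(3) directly; written once for a generic camera, with d, w, k being that camera's distance, spot width, and correction factor:

```latex
R \;=\; d\,\sin\!\left(\frac{\theta}{2}\right)
  \;=\; d\,\sin\!\left(\frac{\pi}{2}\cdot\frac{w}{k}\right)
```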
To improve the stability of the system, step S4 requires the three conditions Rl < th1, Rm < th2, and Rr < th3 to hold simultaneously, where th1, th2, and th3 are set according to the actual requirements.
In step S4 a threshold is set for each camera: th1, th2, th3, with value ranges of [20, 30] for th1, [20, 30] for th2, and [20, 30] for th3.
In this embodiment, th1 = 26, th2 = 22, and th3 = 26.
The concrete procedure for calculating the spot width in the currently captured image (step S2) is as follows: the collected image is first binarized. As shown in Fig. 5, the image is divided into two parts, the background (region 2 in the figure) and the light spot (region 1 in the figure). The number of pixels w spanned by the light spot in the first row of the image, from its left-most pixel x0 to its right-most pixel x1, is then computed as w = |x1 - x0|; this pixel count is the spot width.
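A minimal sketch of this spot width computation. The binarization threshold of 128 and the use of the left-most and right-most bright pixels of the first row are assumptions for illustration:

```python
import numpy as np

def spot_width(image, threshold=128):
    """Spot width in pixels, measured on the first row of one camera image.

    image: 2-D uint8 array (rows x columns); bright pixels belong to the light spot.
    Returns 0 if the first row contains no spot.
    """
    binary = image >= threshold           # binarization: spot vs. black background
    cols = np.flatnonzero(binary[0])      # column indices of spot pixels in row 0
    if cols.size == 0:
        return 0
    x0, x1 = cols[0], cols[-1]            # left-most and right-most spot pixels
    return int(abs(x1 - x0))              # w = |x1 - x0|

# Example: a spot covering columns 600..611 of the first row of an 8 x 1280 image
img = np.zeros((8, 1280), dtype=np.uint8)
img[0, 600:612] = 255
print(spot_width(img))  # 11
```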
The embodiments described above do not limit the scope of protection of the present invention. Any modification, equivalent replacement, improvement, and the like made within the spirit and principles of the present invention shall be included within the scope of protection of the claims of the present invention.
Claims (7)
1. A writing pen recognition method for a touch screen, characterized in that it comprises the following steps:
S1. Acquire the light spots in the images captured by the three cameras arranged at the upper left corner, the middle, and the upper right corner of the touch screen, and the touch point coordinates on the touch screen;
S2. Calculate the distances dl, dm, dr from the touch body to the left, middle, and right cameras, and the widths wl, wm, wr of the light spots in the images captured by the left, middle, and right cameras;
S3. Calculate the actual size of the touch body with respect to each camera;
S4. Set thresholds th1, th2, th3 for the size of the touch body with respect to the left, middle, and right cameras respectively, and compare the actual sizes computed in step S3 with th1, th2, th3. If the actual size with respect to every camera is smaller than the corresponding threshold, the current touch body is a writing pen and writing is allowed; otherwise it is not a writing pen and writing is not allowed.
2. The writing pen recognition method for a touch screen according to claim 1, characterized in that in step S3 the actual size of the touch body with respect to each camera is:
Rl=dl*sin(π/2*wl/kl);
Rm=dm*sin(π/2*wm/km);
Rr=dr*sin(π/2*wr/kr);
wherein Rl is the actual size of the light spot captured by the left camera, Rm is the actual size of the light spot captured by the middle camera, and Rr is the actual size of the light spot captured by the right camera; the unit is pixels;
and wherein kl, km, kr are the correction factors of the left, middle, and right cameras respectively, which keep the values of (π/2*wl/kl), (π/2*wm/km), and (π/2*wr/kr) within the range [0, π/2].
3. The writing pen recognition method for a touch screen according to claim 2, characterized in that the relation between kl, km, kr is 2*kl = 2*kr = km.
4. The writing pen recognition method for a touch screen according to claim 3, characterized in that in step S4 a threshold th1, th2, th3 is set for each camera, with value ranges of [20, 30] for th1, [20, 30] for th2, and [20, 30] for th3.
5. The writing pen recognition method for a touch screen according to claim 4, characterized in that th1 = 26, th2 = 22, and th3 = 26.
6. The writing pen recognition method for a touch screen according to claim 1, characterized in that the field of view of the cameras located at the upper left corner and upper right corner of the touch screen is 90°, and the field of view of the middle camera is 170°.
7. The writing pen recognition method for a touch screen according to claim 1, characterized in that in step S2 the width of the light spot in the captured image is calculated by binarizing the collected image and then counting the number of pixels spanned by the light spot in the first row of the image; this pixel count is the spot width.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN2012105573602A CN103093189A (en) | 2012-12-19 | 2012-12-19 | Touch screen writing pen identifying method |
Publications (1)
Publication Number | Publication Date |
---|---|
CN103093189A true CN103093189A (en) | 2013-05-08 |
Family
ID=48205737
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN2012105573602A Pending CN103093189A (en) | 2012-12-19 | 2012-12-19 | Touch screen writing pen identifying method |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN103093189A (en) |
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7692639B2 (en) * | 2006-02-10 | 2010-04-06 | Microsoft Corporation | Uniquely identifiable inking instruments |
CN101441542A (en) * | 2008-11-21 | 2009-05-27 | 广东威创视讯科技股份有限公司 | Method and apparatus for recognizing multiple target objects by interactive input apparatus |
CN102360417A (en) * | 2011-09-26 | 2012-02-22 | 广东威创视讯科技股份有限公司 | Recognition method of touch screen by use of writing pen |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| C06 | Publication | |
| PB01 | Publication | |
| C10 | Entry into substantive examination | |
| SE01 | Entry into force of request for substantive examination | |
| RJ01 | Rejection of invention patent application after publication | Application publication date: 20130508 |