CN102096528A - Touch input device and touch input method - Google Patents
- Publication number
- CN102096528A CN102096528A CN2010106073224A CN201010607322A CN102096528A CN 102096528 A CN102096528 A CN 102096528A CN 2010106073224 A CN2010106073224 A CN 2010106073224A CN 201010607322 A CN201010607322 A CN 201010607322A CN 102096528 A CN102096528 A CN 102096528A
- Authority
- CN
- China
- Prior art keywords
- aperture
- image acquiring
- acquiring unit
- coordinate
- felt pen
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Abstract
The invention discloses a touch input device comprising a stylus and an input panel. The stylus carries a light source for emitting light. The input panel comprises a touch input area, a first aperture, a second aperture, and a first and a second image acquisition unit corresponding to the first and second apertures respectively; the two apertures are arranged on two adjacent sides of the input panel, and each image acquisition unit is arranged on its side at the position corresponding to its aperture. Light emitted by the stylus passes through the first and second apertures and forms a projection on the first and second image acquisition units respectively; from the position coordinates of these projections the device calculates the coordinates of the stylus position, and thereby determines the position and movement track of the stylus. The invention also provides a touch input method. The touch input device is simple in structure, convenient to use, and unaffected by electromagnetic interference.
Description
Technical field
The present invention relates to touch input devices and touch input methods, and in particular to an optical touch input device and an optical touch input method.
Background technology
Current text input methods mainly include keyboard input, voice input, and touch-pad input. A touch input device generally comprises a stylus and a touch panel, and currently falls into one of several types: resistive, capacitive, electromagnetic-induction, and so on. In a resistive or capacitive touch input device, a plurality of resistors or capacitors are arranged in the touch panel, and the stylus can be any object that changes the resistance or capacitance, for example a metal or plastic rod. However, when one or more of the capacitors or resistors on the substrate are damaged, touch input becomes erroneous, which shortens the service life of the device and inconveniences the user. In an electromagnetic-induction touch input device, a circuit board beneath the touch panel generates a magnetic field within a certain range above the panel, and the stylus also contains a circuit capable of generating a magnetic field. Writing with this technology is fluent, but it is easily disturbed by electromagnetic interference (EMI) from the external environment; for example, it can barely work next to an active mobile-phone speaker.
Summary of the invention
In view of this, the invention provides a touch input device to address the above problems.
The touch input device of the invention comprises a stylus and an input panel. The stylus is provided with a light source for emitting light. The input panel comprises a touch input area, a first aperture, a second aperture, a first image acquisition unit, and a second image acquisition unit. The first and second apertures are arranged on two adjacent sides of the input panel respectively, and the first and second image acquisition units are located on those adjacent sides at positions corresponding to the apertures, so that light emitted by the stylus passes through the two apertures and forms a projection on each image acquisition unit. The input panel also corresponds to an X-Y coordinate system, and each aperture corresponds to a coordinate in that system. The touch input device further comprises a processing unit connected to the two image acquisition units. From the positions of the projections on the first and second image acquisition units and the distances between each image acquisition unit and its corresponding aperture, the processing unit calculates the angle α between the X-axis and the line joining the stylus and the first aperture, and the angle β between the Y-axis and the line joining the stylus and the second aperture. From the coordinate of the first aperture and the angle α it determines a first straight line through the touch position, and from the coordinate of the second aperture and the angle β a second straight line through the touch position. The coordinate of the intersection of these two lines is the coordinate of the position touched by the stylus in the touch input area, and by continuously detecting this position coordinate the device determines the position and movement track of the stylus.
The invention also provides a touch input method, comprising the steps of: light emitted by a stylus passes through a first and a second aperture and projects onto a first and a second image acquisition unit respectively; determining the projection position on the first image acquisition unit and, from that position and the distance between the first aperture and the first image acquisition unit, calculating the angle between the horizontal direction and the line (hereinafter the first straight line) joining the stylus position and the first aperture; determining the coordinate set of the first straight line from the position coordinate of the first aperture and the angle between the first straight line and the horizontal direction; determining the projection position on the second image acquisition unit and, from that position and the distance between the second aperture and the second image acquisition unit, calculating the angle between the vertical direction and the line (hereinafter the second straight line) joining the stylus position and the second aperture; determining the coordinate set of the second straight line from the coordinate of the second aperture and the angle between the second straight line and the vertical direction; determining the common coordinate point of the two coordinate sets, which is the coordinate of the position touched by the stylus; and continuously detecting the position coordinates of the stylus to determine its movement track.
The touch input device of the invention is simple in structure, convenient to use, and unaffected by electromagnetic interference (EMI).
Description of drawings
Fig. 1 is a schematic diagram of a touch input device in an embodiment of the present invention.
Fig. 2 is a functional block diagram of the touch input device in an embodiment of the present invention.
Fig. 3 is a schematic diagram illustrating the calculation principle of the touch input device in an embodiment of the present invention.
Fig. 4 is a flowchart of a touch input method in an embodiment of the present invention.
Main element symbol description
Touch input device | 100
Stylus | 10
Light source | 101
Input panel | 20
Touch input area | 201
First aperture | 202
Second aperture | 203
First image acquisition unit | 204
Second image acquisition unit | 205
Processing unit | 30
First straight line | L1
Second straight line | L2
Embodiment
Referring to Fig. 1 and Fig. 2, the touch input device 100 comprises a stylus 10 and an input panel 20. The stylus 10 is provided with a light source 101, which emits light in all directions. In the present embodiment, the light source 101 is an infrared source arranged at the tip of the stylus 10; in other embodiments, the light source 101 is a laser source. The input panel 20 comprises a touch input area 201, a first aperture 202, a second aperture 203, a first image acquisition unit 204, and a second image acquisition unit 205. In the present embodiment, the input panel 20 is a rectangular structure with a raised frame; the raised frame is hollow and surrounds the touch input area 201. The first aperture 202 and the second aperture 203 are located on the raised frames of two adjacent side walls of the input panel 20, each aperture oriented perpendicular to the frame it sits in, that is, facing the touch input area 201.
In the present embodiment, the first aperture 202 is on a horizontal side of the input panel 20 and the second aperture 203 is on a vertical side. The first image acquisition unit 204 is arranged inside the raised frame of the input panel 20 at the position corresponding to the first aperture 202, spaced from it by a first preset distance; the second image acquisition unit 205 is arranged inside the raised frame at the position corresponding to the second aperture 203, spaced from it by a second preset distance. The first and second preset distances may be equal or different. When the stylus 10 is within the touch input area 201, the light it emits passes through the first aperture 202 and the second aperture 203 and projects onto the first image acquisition unit 204 and the second image acquisition unit 205 respectively; when the stylus 10 touches different positions, the positions of the projections on the corresponding image acquisition units differ. The touch input device 100 further comprises a processing unit 30 electrically connected to the first image acquisition unit 204 and the second image acquisition unit 205.
In the present embodiment, each touchable point of the touch input area 201 maps one-to-one onto a coordinate in a coordinate system, for example an X-Y coordinate system; the first aperture 202 and the second aperture 203 likewise correspond to coordinates in the X-Y coordinate system. From the positions of the projections that the light forms through the first aperture 202 and the second aperture 203 on the first image acquisition unit 204 and the second image acquisition unit 205, and from the distances between each image acquisition unit and its corresponding aperture, the processing unit 30 calculates the angle α between the X-axis and the line joining the stylus 10 and the first aperture 202, and the angle β between the Y-axis and the line joining the stylus 10 and the second aperture 203. From the coordinate of the first aperture 202 and the angle α it determines a first straight line L1 through the first aperture 202 and the touch position, and from the coordinate of the second aperture 203 and the angle β a second straight line L2 through the second aperture 203 and the touch position. The coordinate of the intersection of L1 and L2 is the coordinate of the position touched by the stylus 10 in the touch input area 201, from which the processing unit 30 determines the position and movement track of the stylus 10.
In the present embodiment, the touch input device 100 further comprises a display panel (not shown) located above the input panel 20, and the processing unit 30 also controls the display panel to show the movement track of the stylus 10. In other embodiments, the input panel 20 is a transparent panel and the display panel is located beneath the input panel 20.
Referring to Fig. 3, a coordinate schematic of the input panel 20 in the present embodiment: an X-Y coordinate system is established with its origin at the center of the touch input area 201. The first aperture 202 and the first image acquisition unit 204 are positioned on the X-axis with coordinates (0, a) and (0, c) respectively, and the second aperture 203 and the second image acquisition unit 205 are positioned on the Y-axis with coordinates (b, 0) and (d, 0) respectively; thus the distance between the first image acquisition unit 204 and the first aperture 202 is c-a, and the distance between the second image acquisition unit 205 and the second aperture 203 is b-d. In the present embodiment, the first image acquisition unit 204 and the second image acquisition unit 205 are each formed by a plurality of optical inductors arranged in a line; when the light emitted by the stylus 10 passes through an aperture and strikes an optical inductor, that inductor produces an induced signal. The optical inductors of the first image acquisition unit 204 are parallel to the X-axis and those of the second image acquisition unit 205 are parallel to the Y-axis, and each optical inductor in the image acquisition units 204, 205 corresponds to a coordinate value. In the first image acquisition unit 204, optical inductors to the right of the Y-axis have positive coordinates and those to its left have negative coordinates; in the second image acquisition unit 205, optical inductors above the X-axis have positive coordinates and those below it have negative coordinates.
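The mapping from a struck optical inductor to a projection coordinate can be sketched as follows. This is a hypothetical helper, not code from the patent: it assumes a uniformly spaced linear array, picks the inductor producing the strongest induced signal, and converts its index into a coordinate along the array.

```python
def projection_coordinate(signals, pitch, origin):
    """Return the coordinate of the optical inductor with the strongest
    induced signal in a linear image acquisition unit.

    signals: induced-signal amplitudes, one per inductor along the array
    pitch:   spacing between adjacent inductors (units of the X-Y system)
    origin:  coordinate of the inductor at index 0
    """
    # The struck inductor is taken to be the one with the largest signal.
    idx = max(range(len(signals)), key=lambda i: signals[i])
    return origin + idx * pitch
```

For the first image acquisition unit 204, the returned value would play the role of the abscissa z used in the calculation below; the sign convention (negative left of the Y-axis, positive to its right) follows automatically once `origin` is negative.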
In the present embodiment, the position coordinate of the stylus 10, denoted (x, y), is computed as follows. When the light emitted by the stylus 10 passes through the first aperture 202 and strikes an optical inductor of the first image acquisition unit 204, that inductor produces an induced signal; on receiving the signal, the processing unit 30 determines the coordinate of that inductor, which is the projection position of the light on the first image acquisition unit 204, denoted (z, c). From the abscissa z and the distance c-a between the first image acquisition unit 204 and the first aperture 202, the angle α is obtained: since tan α = (c-a)/z, α can be computed, where c-a is an absolute value, so tan α is negative when z is negative and positive when z is positive. The angle β is obtained in the same way. The processing unit 30 then determines, from the coordinate of the first aperture 202 and the angle α between the first straight line L1 and the horizontal direction, that the coordinate set of L1 is all coordinates satisfying the equation y = tan α (x+a); likewise, from the coordinate of the second aperture 203 and the angle β between the second straight line L2 and the vertical direction, that the coordinate set of L2 is all coordinates satisfying the equation x = tan β (y+b). By comparing the coordinate sets of L1 and L2, the processing unit 30 finds their common coordinate point, which is the coordinate of the position touched by the stylus 10. By continuously detecting the coordinate of the position touched by the stylus 10, the processing unit 30 obtains the movement track of the stylus 10.
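The computation above can be condensed into a small numerical sketch. This is not code from the patent; the function below simply implements the two stated relations, tan α = (c-a)/z and (by symmetry) tan β = (b-d)/w, where w is a hypothetical name for the projection coordinate on the second image acquisition unit, and solves the pair y = tan α (x+a), x = tan β (y+b) for the intersection point.

```python
def stylus_position(z, w, a, b, c, d):
    """Intersect the two sight lines of the embodiment.

    z, w: projection coordinates on the first / second image acquisition unit
    a, b: aperture coordinates as they appear in the line equations
    c, d: image acquisition unit coordinates
    """
    ta = (c - a) / z  # tan(alpha); its sign follows the sign of z
    tb = (b - d) / w  # tan(beta);  its sign follows the sign of w
    # Substituting y = ta*(x + a) into x = tb*(y + b) and solving for x:
    x = tb * (a * ta + b) / (1.0 - ta * tb)
    y = ta * (x + a)
    return x, y
```

The division degenerates only when tan α · tan β = 1, i.e. when the two sight lines coincide rather than crossing at a single point.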
Referring to Fig. 4, the touch input method of the touch input device 100 comprises the following steps:
Step S301: when the stylus 10 carrying the light source touches the input panel 20, the light it emits passes through the first aperture 202 and the second aperture 203 and forms a projection on the first image acquisition unit 204 and the second image acquisition unit 205 respectively.
Step S302: determine the position of the projection point on the first image acquisition unit 204 and, from that position and the distance between the first aperture 202 and the first image acquisition unit 204, calculate the angle between the horizontal direction and the line (hereinafter the first straight line L1) joining the position of the stylus 10 and the first aperture 202.
Step S303: determine the coordinate set of the first straight line L1 from the coordinate of the first aperture 202 and the angle between L1 and the horizontal direction.
Step S304: determine the position of the projection point on the second image acquisition unit 205 and, from that position and the distance between the second aperture 203 and the second image acquisition unit 205, calculate the angle between the vertical direction and the line (hereinafter the second straight line L2) joining the position of the stylus 10 and the second aperture 203.
Step S305: determine the coordinate set of the second straight line L2 from the coordinate of the second aperture 203 and the angle between L2 and the vertical direction.
Step S306: determine the common coordinate point of the coordinate sets of L1 and L2, which is the coordinate of the position touched by the stylus 10.
Step S307: continuously detect the position coordinates of the stylus 10 to determine the movement track of the stylus 10.
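The continuous detection of step S307 can be sketched as a small accumulator. This is a hypothetical illustration, not part of the patent: successive per-frame stylus coordinates are grouped into strokes, with `None` standing for frames in which no projection, and hence no touch, was detected.

```python
def collect_strokes(samples):
    """Group a stream of per-frame stylus coordinates into strokes.

    samples: iterable of (x, y) tuples, or None when the stylus is lifted
    Returns a list of strokes, each a list of consecutive (x, y) points.
    """
    strokes, current = [], []
    for point in samples:
        if point is None:  # stylus lifted: close the current stroke
            if current:
                strokes.append(current)
                current = []
        else:
            current.append(point)
    if current:  # flush the last open stroke
        strokes.append(current)
    return strokes
```

The movement track of the stylus 10 is then each stroke rendered as a polyline, for example on the display panel described above.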
Claims (9)
1. A touch input device, comprising a stylus and an input panel, characterized in that:
the stylus is provided with a light source for emitting light;
the input panel comprises a touch input area, a first aperture, a second aperture, a first image acquisition unit, and a second image acquisition unit, wherein the first and second apertures are arranged on two adjacent sides of the input panel respectively, and the first and second image acquisition units are located on those adjacent sides at positions corresponding to the apertures respectively, so that light emitted by the stylus passes through the first and second apertures and forms a projection on the first and second image acquisition units respectively; the input panel also corresponds to an X-Y coordinate system, and the first and second apertures each correspond to a coordinate in that system;
the touch input device further comprises a processing unit connected to the first and second image acquisition units; from the positions of the projections on the first and second image acquisition units and the distances between each image acquisition unit and its corresponding aperture, the processing unit calculates the angle α between the X-axis and the line joining the stylus and the first aperture and the angle β between the Y-axis and the line joining the stylus and the second aperture, determines a first straight line through the touch position from the coordinate of the first aperture and the angle α and a second straight line through the touch position from the coordinate of the second aperture and the angle β, and determines the coordinate of the intersection of the first and second straight lines, which is the coordinate of the position touched by the stylus in the touch input area; by continuously detecting this position coordinate, the processing unit determines the position and movement track of the stylus.
2. The touch input device as claimed in claim 1, characterized in that the light source of the stylus is arranged at the pen tip.
3. The touch input device as claimed in claim 2, characterized in that the light source on the stylus is an infrared source.
4. The touch input device as claimed in claim 2, characterized in that the light source on the stylus is a laser source.
5. The touch input device as claimed in claim 1, characterized in that the input panel comprises a raised frame of hollow structure; the first and second apertures are arranged on the raised sides of the input panel respectively, and the first and second image acquisition units are arranged inside the raised frame at the positions corresponding to the first and second apertures respectively.
6. The touch input device as claimed in claim 1, characterized in that each image acquisition unit is composed of a plurality of optical inductors arranged in a line, each optical inductor corresponding to a coordinate value in the coordinate system; when the light emitted by the stylus passes through an aperture and strikes an optical inductor, that inductor produces an induced signal, and on receiving the signal the processing unit determines the coordinate of the inductor that produced it, thereby determining the projection position of the light on the image acquisition unit.
7. The touch input device as claimed in claim 1, characterized in that the touch input device is an electronic drawing board; after the processing unit determines the position touched by the stylus, it controls the user's input trajectory to be displayed on the touch input panel of the electronic drawing board.
8. A touch input method applied to a touch input device, the touch input device comprising a stylus and a touch input panel, the stylus being provided with a light source for emitting light, two adjacent sides of the touch input panel being respectively provided with a first aperture and a second aperture and with first and second image acquisition units corresponding to the first and second apertures respectively, a processing unit being connected to the first and second image acquisition units, and the input panel corresponding to an X-Y coordinate system in which the first and second apertures each have a coordinate, characterized in that the touch input method comprises the steps of:
light emitted by the stylus passes through the first and second apertures and projects onto the first and second image acquisition units respectively;
determining the projection position on the first image acquisition unit and, from that position and the distance between the first aperture and the first image acquisition unit, calculating the angle between the horizontal direction and the first line joining the stylus position and the first aperture;
determining the coordinate set of the first line from the position coordinate of the first aperture and the angle between the first line and the horizontal direction;
determining the projection position on the second image acquisition unit and, from that position and the distance between the second aperture and the second image acquisition unit, calculating the angle between the vertical direction and the second line joining the stylus position and the second aperture;
determining the coordinate set of the second line from the coordinate of the second aperture and the angle between the second line and the vertical direction;
determining the common coordinate point of the coordinate sets of the first and second lines, which is the coordinate of the position touched by the stylus;
continuously detecting the position coordinates of the stylus to determine the movement track of the stylus.
9. The touch input method as claimed in claim 8, characterized in that, in the touch input device using the method, each image acquisition unit is composed of a plurality of optical inductors arranged in a line, each optical inductor corresponding to a coordinate value in the coordinate system; when the light emitted by the stylus passes through an aperture and strikes an optical inductor, that inductor produces an induced signal, and on receiving the signal the processing unit determines the coordinate of the inductor that produced it, thereby determining the projection position of the light on the image acquisition unit.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN2010106073224A CN102096528A (en) | 2010-12-27 | 2010-12-27 | Touch input device and touch input method |
Publications (1)
Publication Number | Publication Date |
---|---|
CN102096528A (en) | 2011-06-15
Family
ID=44129640
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN2010106073224A Pending CN102096528A (en) | 2010-12-27 | 2010-12-27 | Touch input device and touch input method |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN102096528A (en) |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103617642A (en) * | 2013-11-22 | 2014-03-05 | 深圳市掌网立体时代视讯技术有限公司 | Digital painting and writing method and device |
CN103617642B (en) * | 2013-11-22 | 2017-03-15 | 深圳市掌网科技股份有限公司 | A kind of digital book drawing method and device |
CN106201117A (en) * | 2015-01-30 | 2016-12-07 | 中强光电股份有限公司 | Optical object positioning device and positioning method thereof |
CN106201117B (en) * | 2015-01-30 | 2019-09-10 | 中强光电股份有限公司 | Optical object positioning device and positioning method thereof |
CN113449627A (en) * | 2021-06-24 | 2021-09-28 | 深兰科技(武汉)股份有限公司 | Personnel tracking method based on AI video analysis and related device |
Legal Events
Date | Code | Title | Description
---|---|---|---
| C06 | Publication | |
| PB01 | Publication | |
| C02 | Deemed withdrawal of patent application after publication (patent law 2001) | |
| WD01 | Invention patent application deemed withdrawn after publication | Application publication date: 20110615 |