TW201104533A - Multi-touch input apparatus and method thereof - Google Patents

Multi-touch input apparatus and method thereof

Info

Publication number
TW201104533A
Authority
TW
Taiwan
Prior art keywords
touch
detection signal
angle
detection
viewing angle
Prior art date
Application number
TW98124231A
Other languages
Chinese (zh)
Other versions
TWI393038B (en)
Inventor
Yi-long Hu
Zhao-yu Chen
Kun-xun Li
Original Assignee
Lite On Semiconductor Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Lite On Semiconductor Corp filed Critical Lite On Semiconductor Corp
Priority to TW98124231A priority Critical patent/TWI393038B/en
Publication of TW201104533A publication Critical patent/TW201104533A/en
Application granted granted Critical
Publication of TWI393038B publication Critical patent/TWI393038B/en

Landscapes

  • Position Input By Displaying (AREA)

Abstract

This invention relates to a multi-touch input apparatus that uses an optical detection method. According to the present invention, a touch panel receives input of at least one touch point, at least one light source surrounding the touch panel provides detection light, and a plurality of detecting systems positioned at adjacent corners of the touch panel detect the angle at which the detection light is shielded by the touch point, the detecting systems having a first visual angle and a second visual angle. The multi-touch input apparatus compares the shielding angles detected under the different visual angles of the detecting systems to determine which touch point each angle corresponds to, and then computes the coordinates of the touch points by trigonometric formulas.

Description

201104533 VI. Description of the Invention:

[Technical Field] The present invention relates to a multi-touch input device, and more particularly to an optical multi-touch input device and a detection method thereof.

[Prior Art] With the spread of high-technology products, consumer devices such as desktop computers, notebooks, personal digital assistants (PDAs) and mobile phones have come into wide use. As these products have developed and their functions have diversified, input tools have become correspondingly important. Existing input methods, such as the conventional keyboard and mouse, directly operated touch panels, touch pens (styluses) and speech recognition, are already widely applied in many products; among them, the directly operated touch panel has attracted the most public attention.

In recent years, following the commercial success of Apple's iPod, multi-touch input on touch panels has also drawn wide attention. Because multi-touch input lets an ordinary user operate a device through intuitive gestures, without typing training or any additional peripheral, it improves convenience and further reduces the space occupied by a keyboard and mouse. Recent products of many kinds have therefore adopted this form of touch input, and Microsoft's next-generation Windows operating system specifically supports multi-touch input on touch panels, which is expected to accelerate the trend. Input methods are thus expected to evolve from the conventional mouse and keyboard toward intuitive touch input.

FIG. 1 is a schematic diagram of a conventional touch input device that uses optical sensing, and FIG. 2 is a schematic diagram of its detection signals. As shown in FIG. 1, the touch input device 10 only requires a reflector 12 around the touch panel 11 and a plurality of light-source emission/detection systems 13a, 13b. The emission/detection systems 13a, 13b emit detection beams that travel nearly parallel to the surface of the touch panel 11 and sequentially scan the whole panel; the reflector 12 reflects each beam back along its original emission path to the emission/detection systems 13a, 13b, whose detectors sense the returned beams. When a user inputs a touch point P1 on the touch panel 11 with a finger or a stylus, the beams emitted by the emission/detection systems 13a, 13b are blocked, the reflected beams cannot be received, and the emission/detection systems 13a, 13b therefore each produce a beam-shielding signal that senses the touch point P1.

At this time the emission/detection systems 13a, 13b each produce a detection signal, namely the beam-shielding signals S(θL1) and S(θR1) shown in FIG. 2. From the corresponding angles θL1 and θR1, together with the width W of the touch panel 10 (which is also the distance W between the emission/detection systems 13a, 13b), the touch input device 10 can compute the coordinates (x1, y1) of the touch point P1 with simple trigonometric formulas. The coordinates (x2, y2) of a touch point P2 can be obtained by the same principle and calculation.

However, a touch input device 10 of this kind cannot handle simultaneous multi-point input. When the user inputs two touch points P1 and P2 on the touch panel 11 at the same time, the emission/detection systems 13a, 13b detect the four detection signals S(θL1), S(θL2) and S(θR1), S(θR2) shown in FIG. 2. The touch input device 10 cannot tell which of these angles belongs to which touch point, so it may report ghost points P3 and P4. That is, when the emission/detection system 13a detects the signals S(θL1), S(θL2) and the emission/detection system 13b detects S(θR1), S(θR2), the pairing of the angles θL1, θL2 with θR1, θR2 is ambiguous; after computation the device may identify the coordinates of the touch point P1 correctly but also report the coordinates of the ghost point P3. The device 10 therefore cannot correctly resolve the detection signals S(θL1), S(θL2), S(θR1), S(θR2) into correct touch coordinates, and an improved detection method is needed so that this kind of optical touch input device can be used for multi-touch input.
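The Background refers to computing the touch coordinates "with simple trigonometric formulas" but does not write them out. The sketch below is an illustrative reconstruction only: the sensor placement, the axis convention, and the function name triangulate are assumptions made for the example, not taken from the patent.

```python
import math

def triangulate(theta_l, theta_r, width):
    """Estimate a touch point from two beam-shielding angles (radians).

    Assumed geometry: the left sensor sits at (0, 0), the right sensor at
    (width, 0), and each angle is measured from the line joining the two
    sensors toward the interior of the panel.
    """
    denom = math.tan(theta_l) + math.tan(theta_r)
    if denom == 0:
        raise ValueError("degenerate geometry: the two rays never meet")
    # Intersection of the ray y = x*tan(theta_l) with y = (width - x)*tan(theta_r).
    x = width * math.tan(theta_r) / denom
    y = width * math.tan(theta_l) * math.tan(theta_r) / denom
    return x, y

# Both sensors report a 45-degree shielding angle on a 30-unit-wide panel;
# the point lands at the middle of the panel, a quick sanity check on the geometry.
print(triangulate(math.radians(45), math.radians(45), width=30.0))  # ~ (15.0, 15.0)
```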
[Summary of the Invention] The present invention provides a touch input device with multi-touch capability, comprising: a touch panel for receiving at least one touch point; at least one light source, arranged around the touch panel, for providing a detection beam; a plurality of imaging systems having a first spatial viewing angle, located at adjacent corners of the touch panel, which receive the detection beam and generate a first detection signal according to the detection beam shielded by the touch point; and at least one viewing-angle changing unit, connected to the imaging systems, which gives the imaging systems a second spatial viewing angle, the imaging systems with the second spatial viewing angle generating a second detection signal according to the detection beam shielded by the touch point. A computing system receives the first detection signal and the second detection signal and computes the coordinates of the touch point from them.

The present invention also provides a touch input device comprising: a touch panel for receiving a touch point; at least one light source providing a detection beam; a plurality of first imaging systems having a first spatial viewing angle, located at adjacent corners of the touch panel, which generate a first detection signal according to the detection beam shielded by the touch point; and a plurality of second imaging systems having a second spatial viewing angle, which generate a second detection signal according to the detection beam shielded by the touch point. A computing system receives the first and second detection signals and computes the coordinates of the touch point from them.

The present invention further provides a touch method for detecting multiple touch positions with a touch input device, comprising: providing a touch panel for inputting multiple touch points; sensing the touch points with a plurality of imaging systems having a first spatial viewing angle to generate a first detection signal; changing the imaging systems with a viewing-angle changing unit so that they have a second spatial viewing angle, and sensing the touch points again with the imaging systems to generate a second detection signal; and receiving the first and second detection signals with an angle selection unit of a computing system, which selects, from the change between the angles corresponding to the first detection signal and the second detection signal, the touch point corresponding to each angle of the first detection signal, whereupon a computation unit computes the coordinates of that touch point.

The present invention further provides a touch method for detecting multiple touch positions with a touch input device, comprising: providing a touch panel for inputting multiple touch points; sensing the touch points with a plurality of first imaging systems having a first spatial viewing angle to generate a first detection signal; sensing the touch points with second imaging systems having a second spatial viewing angle to generate a second detection signal; and receiving the first and second detection signals with an angle selection unit of a computing system, which selects, from the change between the angles corresponding to the first detection signal and the second detection signal, the touch point corresponding to each angle of the first detection signal, whereupon a computation unit computes the coordinates of that touch point.

The inventors propose the inventive concept of the present application, whose mechanism differs markedly from the known art, so as to provide a multi-touch-capable touch input device and a touch method for detecting multiple positions and thereby promote industrial progress. The summary above and the detailed description and drawings that follow further explain the means adopted by the present invention to achieve its intended objects and the resulting effects; other objects and advantages of the invention are set forth in the later description and drawings.

[Embodiments] FIG. 3 is a schematic diagram of a touch input device using optical sensing according to the first embodiment of the present invention, and FIGS. 4A and 4B are schematic diagrams of the detection signals of the touch input device using optical sensing according to the first embodiment of the present invention.

FIGS. 5A and 5B are schematic diagrams of the beam-shielding angles of the touch points detected by the imaging systems according to the first embodiment of the present invention.

According to the first embodiment, the touch input device 30 has a touch panel 11 on which the user touches and inputs at least one touch point, a light source 32 surrounding the touch panel 11 to provide a detection beam travelling nearly parallel to the panel surface, and at least two imaging systems 31a, 31b arranged at two adjacent corners of the touch panel 11 to receive the beam produced by the light source 32. The light source 32 may consist of a plurality of LEDs, a combination of LEDs and a light guide plate, a combination of LEDs and a reflector, or other light-emitting elements or reflectors. Each imaging system 31a, 31b comprises at least one optical lens and at least one photo sensor, and the distance between the imaging systems 31a, 31b is W, the width of the touch panel. In this embodiment the imaging systems 31a, 31b have a first spatial viewing angle.

According to the first embodiment, each imaging system 31a, 31b can be moved by a viewing-angle changing unit (not shown), for example an actuator driven by a stepping motor, a voice coil motor (VCM) or a piezo actuator, an adjustable-focus element such as a liquid lens, or a software viewing-angle technique such as digital zoom. When the positions of the imaging systems 31a, 31b are moved by a distance d in a direction parallel to the travel of the beam from the light source 32, the imaging systems 31a, 31b acquire a second spatial viewing angle, relative to the first spatial viewing angle they had before the viewing-angle changing unit moved them.

When the user inputs two touch points P1 and P2 on the touch panel 11 with fingers or a stylus, the detection beam from the light source 32 is shielded and each of the imaging systems 31a, 31b produces first detection signals, namely the signals S(θL1), S(θL2) and S(θR1), S(θR2) shown in FIG. 4A, which shows the detection-signal waveforms captured by the imaging systems 31a and 31b respectively. The beam-shielding angles represented by these first detection signals S(θL1), S(θL2), S(θR1), S(θR2) are θL1, θL2, θR1 and θR2.

According to the first embodiment, after the imaging systems 31a, 31b have sensed the first detection signals of the touch points P1 and P2, they are driven backward by the viewing-angle changing unit through the distance d. The relative distances from the touch points P1, P2 to the imaging systems 31a, 31b therefore change, and the second detection signals S(θL1′), S(θL2′), S(θR1′) and S(θR2′) shown in FIG. 4B are produced.

FIG. 4B shows the detection-signal waveforms captured by the imaging systems 31a and 31b after the move. The beam-shielding angles represented by the second detection signals S(θL1′), S(θL2′), S(θR1′) and S(θR2′) are θL1′, θL2′, θR1′ and θR2′.

As shown in FIG. 5A, suppose the touch point P1 is relatively close to the imaging system 31a (compared with the touch point P2). The beam-shielding angle sensed by the imaging system 31a for P1 is θL1. When the imaging system 31a is driven backward by the distance d, the position of P1 relative to the imaging system effectively recedes to P1′, and the shielding angle sensed by the imaging system 31a becomes θL1′; the change in the shielding angle observed for P1 is ΔθL1 = |θL1 − θL1′|.

As shown in FIG. 5B, the touch point P2 is relatively far from the imaging system 31a (compared with P1), and its shielding angle is θL2. After the imaging system 31a moves back by the distance d, P2 effectively recedes to P2′, the sensed shielding angle becomes θL2′, and the change is ΔθL2 = |θL2 − θL2′|.

Because the touch point P1 is closer to the imaging system 31a, moving the imaging system back by the distance d changes the detected shielding angle of P1 more than that of the more distant touch point P2, that is, ΔθL1 is greater than ΔθL2. This lets the device determine that the first detection signal S(θL1) and the second detection signal S(θL1′) of FIGS. 4A and 4B belong to the touch point P1, which is close to the imaging system 31a, and not to the more distant ghost point P3. Likewise, the smaller shielding-angle change ΔθL2 shows that the first detection signal S(θL2) and the second detection signal S(θL2′) belong to the touch point P2, which is farther from the imaging system 31a, and not to the nearer ghost point P4.
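As an illustration of the angle-change comparison just described (not part of the specification), one way to rank a sensor's blocked-beam angles by how much they shift when the sensor is moved back by d is sketched below; pairing the "before" and "after" angles by list order and the function name are assumptions made for the example.

```python
def rank_by_angle_shift(angles_before, angles_after):
    """Rank blocked-beam angles by how much they change after the sensor moves back.

    angles_before / angles_after: shielding angles (radians) reported by one
    imaging system before and after the move by distance d, assumed to be
    listed in matching order. The entry with the largest change is attributed
    to the touch point nearest that imaging system.
    """
    shifts = [abs(after - before) for before, after in zip(angles_before, angles_after)]
    # Indices into the input lists, nearest touch point first.
    return sorted(range(len(shifts)), key=lambda i: shifts[i], reverse=True)

# Left imaging system: theta_L1/theta_L2 before the move, theta_L1'/theta_L2' after.
order = rank_by_angle_shift([0.82, 0.35], [0.74, 0.33])
# order == [0, 1]: the first angle changed more, so it is taken as the nearer point P1.
print(order)
```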

In the same way, the method can associate the first detection signals S(θR1), S(θR2) and the second detection signals S(θR1′), S(θR2′) of the imaging system 31b with the touch points P1 and P2: the signals S(θR1) and S(θR1′) of FIGS. 4A and 4B belong to the touch point P2, which is closer to the imaging system 31b, and not to the more distant ghost point P3, while S(θR2) and S(θR2′) belong to the touch point P1, which is farther from the imaging system 31b, and not to the nearer ghost point P4.

The touch input device 30 can then take the angles corresponding to the first or second detection signals, together with the spacing W between the imaging systems 31a, 31b, and compute the coordinates (x1, y1) of the touch point P1 and (x2, y2) of the touch point P2 with simple trigonometric formulas.

FIG. 6 is a schematic structural diagram of the imaging system according to the second embodiment of the present invention, and FIGS. 7A and 7B are schematic diagrams of the beam-shielding angles of the touch points detected by the imaging system according to the second embodiment.

According to the second embodiment, the imaging system 60a shown in FIG. 6 is placed at the position of the imaging system 31a of FIG. 3, and a further imaging system 60b (not shown) is placed at the position of the imaging system 31b. The imaging system 60a has at least one optical lens 61 and at least one photo-sensing element 62, and the imaging systems 60a and 60b have the same structure.

This embodiment uses a viewing-angle changing unit, for example an actuator driven by a stepping motor, a voice coil motor (VCM) or a piezo actuator, an adjustable-focus element such as a liquid lens, or a software viewing-angle technique such as digital zoom, to move the position of the optical lens 61 or of the photo-sensing element 62 of the imaging system 60a, the movement distances being e and f respectively. Because changing the relative distance between the optical lens 61 and the photo-sensing element 62 changes the maximum angular range the imaging system 60a can detect, the imaging system 60a of this embodiment can produce two or more different maximum detection angle ranges, that is, two different spatial viewing angles: a first maximum detection angle range θ1 and a second maximum detection angle range θ2.

As shown in FIG. 3, when the user inputs two touch points P1, P2 on the touch panel 11 with fingers or a stylus, the beam from the light source 32 is shielded and the imaging system 60a produces light-shielding signals, namely the first detection signals S(θL1), S(θL2), S(θR1), S(θR2) of FIG. 4A. At this time the imaging system 60a operates with the first maximum detection angle range θ1, that is, with the first spatial viewing angle.

After the imaging system 60a has detected these signals, the viewing-angle changing unit moves the optical lens 61 or the photo-sensing element 62 of the imaging system 60a to the position 61′ or 62′. Because the relative distance between the optical lens 61 and the photo-sensing element 62 changes, the imaging system 60a then has a second maximum detection angle range θ2, that is, a second spatial viewing angle.

Because the touch point P1 is closer to the imaging system 60a than the touch point P2, the shielding angle detected for P1 is θL1, as shown in FIG. 7A, and the shielding angle detected for the more distant P2 is θL2, as shown in FIG. 7B. For the near point P1, the absolute difference between the proportions that the shielding angle θL1 occupies within the first maximum detection angle range θ1 and within the second maximum detection angle range θ2 is α(%) = |θL1/θ1 − θL1/θ2|; for the far point P2, the corresponding absolute difference is β(%) = |θL2/θ1 − θL2/θ2|. Because the touch point P1 is closer to the imaging system 60a and the touch point P2 is farther, α(%) is greater than β(%).

Therefore, when α(%) is greater than β(%), the first detection signal S(θL1) and the second detection signal S(θL1′) of FIG. 4A belong to the touch point P1, which is close to the imaging system 60a, and not to the more distant ghost point P3; and the first detection signal S(θL2) and the second detection signal S(θL2′) belong to the more distant touch point P2 and not to the nearer ghost point P4.

In the same way, the method associates the first detection signals S(θR1), S(θR2) and the second detection signals S(θR1′), S(θR2′) of the imaging system 60b with the touch points: S(θR1) and S(θR1′) belong to the touch point P2, which is closer to the imaging system 60b, rather than to the more distant ghost point P3, while S(θR2) and S(θR2′) belong to the more distant touch point P1 rather than to the nearer ghost point P4.

The angles corresponding to the first detection signals S(θL1), S(θL2), S(θR1), S(θR2), together with the spacing W between the imaging systems 60a, 60b, then give the coordinates (x1, y1) of the touch point P1 and (x2, y2) of the touch point P2 through simple trigonometric formulas.
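A minimal sketch of the ratio comparison used by the second embodiment follows. It is illustrative only: the exact percentage formula is reconstructed from the wording above, and the function name and the numeric angle ranges are assumptions.

```python
def ratio_difference(theta, range_1, range_2):
    """Absolute difference, in percent, between the proportions a shielding
    angle occupies within two maximum detection angle ranges."""
    return abs(theta / range_1 - theta / range_2) * 100.0

# Assumed first and second maximum detection angle ranges theta_1, theta_2 (radians):
theta_1, theta_2 = 1.57, 1.90
alpha = ratio_difference(0.80, theta_1, theta_2)  # shielding angle of the near point P1
beta = ratio_difference(0.30, theta_1, theta_2)   # shielding angle of the far point P2
# The signal with the larger ratio difference is attributed to the nearer touch point.
print(alpha > beta)  # True
```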
FIG. 8 is a schematic diagram of a touch input device using optical sensing according to the third embodiment of the present invention.

According to the third embodiment, the touch input device 80 has a structure similar to the touch input device 30 of FIG. 3, but adds a set of second imaging systems 31c, 31d whose maximum detection angle range differs from that of the first imaging systems 31a, 31b, so that the two sets have different spatial viewing angles. Specifically, the first imaging systems 31a, 31b have the first maximum detection angle range θ1 and the second imaging systems 31c, 31d have the second maximum detection angle range θ2; the first imaging systems therefore have a first spatial viewing angle and the second imaging systems a second spatial viewing angle. The second imaging systems 31c, 31d can be placed near the first imaging systems 31a, 31b, for example above them, so as to receive the beam emitted by the light source 32.

The third embodiment uses the detection method described with reference to FIGS. 7A and 7B. Because two sets of imaging systems 31a, 31b and 31c, 31d with the different maximum detection angle ranges θ1 and θ2 are used, the device obtains the first maximum detection angle range θ1 and the second maximum detection angle range θ2 as in FIGS. 7A and 7B. With this detection method, the first and second imaging systems 31a, 31b and 31c, 31d detect the first detection signals S(θL1), S(θL2), S(θR1), S(θR2) of FIG. 4A; the signal S(θL1) is attributed to the touch point P1, which is closer to the imaging systems 31a, 31c, rather than to the more distant ghost point P3, and the signal S(θL2) is attributed to the more distant touch point P2 rather than to the nearer ghost point P4. The first and second imaging systems 31b, 31d distinguish the touch point P2 from the ghost points P3, P4 by the same detection method. Finally, the angles θL1, θL2, θR1, θR2 corresponding to the first detection signals S(θL1), S(θL2), S(θR1), S(θR2), together with the spacing W between the first imaging systems 31a, 31b, give the coordinates (x1, y1) of the touch point P1 and (x2, y2) of the touch point P2 through simple trigonometric formulas.

FIG. 9 is a schematic diagram of the computing system according to an embodiment of the present invention. As described above, the present invention applies optical detection to realize a multi-touch function and uses angle comparison to distinguish near from far touch points and real points from ghost points. Each of the embodiments above has a computing system 90 comprising an angle selection unit 92 and a computation unit 93; the imaging system 91 corresponds to the imaging systems 31a, 31b, 31c, 31d and 60a of FIGS. 3, 6 and 8. When the imaging system 91 detects the input of the touch points P1, P2 of FIG. 3, it generates the first detection signals S(θL1), S(θL2), S(θR1), S(θR2) and the second detection signals S(θL1′), S(θL2′), S(θR1′), S(θR2′) and sends them to the angle selection unit 92 of the computing system 90. The angle selection unit 92 compares them according to the methods of the embodiments above to determine which of the touch points P1, P2 each of the first detection signals S(θL1), S(θL2), S(θR1), S(θR2) corresponds to, and sends the result to the computation unit 93. The computation unit 93 then uses the first detection signals, their corresponding angles θL1, θL2, θR1, θR2 and the spacing W between the imaging systems 31a, 31b of FIG. 3 to compute, with simple trigonometric formulas, the coordinates (x1, y1) of the touch point P1 and (x2, y2) of the touch point P2. The above takes the first embodiment as the example; the same applies to the second and third embodiments of the present invention.
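Tying the pieces together, the flow through the computing system 90 of FIG. 9 (angle selection unit 92 followed by computation unit 93) could be sketched as below. This is an illustrative reconstruction rather than code from the patent; the class name, the pairing rule that the point nearest one sensor is the point farthest from the other (true for the two-point layout of FIG. 3 but assumed here in general), and the sample angles are all assumptions.

```python
import math

class ComputingSystem:
    """Sketch of computing system 90: angle selection (unit 92) plus coordinates (unit 93)."""

    def __init__(self, width):
        self.width = width  # spacing W between the two imaging systems

    def select_angles(self, left_before, left_after, right_before, right_after):
        # Unit 92: attribute each angle to a touch point by the size of its change.
        left = sorted(range(len(left_before)),
                      key=lambda i: abs(left_after[i] - left_before[i]), reverse=True)
        right = sorted(range(len(right_before)),
                       key=lambda i: abs(right_after[i] - right_before[i]), reverse=True)
        # Assumed pairing: the point nearest the left sensor is the one farthest
        # from the right sensor, so pair the left ranking with the reversed right ranking.
        return [(left_before[l], right_before[r]) for l, r in zip(left, reversed(right))]

    def compute_coordinates(self, pairs):
        # Unit 93: triangulate each (theta_L, theta_R) pair against the spacing W.
        points = []
        for theta_l, theta_r in pairs:
            denom = math.tan(theta_l) + math.tan(theta_r)
            x = self.width * math.tan(theta_r) / denom
            y = self.width * math.tan(theta_l) * math.tan(theta_r) / denom
            points.append((x, y))
        return points

system = ComputingSystem(width=30.0)
pairs = system.select_angles([0.82, 0.35], [0.74, 0.33], [0.40, 0.95], [0.38, 0.85])
print(system.compute_coordinates(pairs))
```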
The foregoing is only a detailed description, with drawings, of specific embodiments of the present invention and is not intended to limit the invention; the full scope of the invention is defined by the appended claims, and any change or modification that a person skilled in the art can readily conceive within the field of the invention falls within the scope of the patent defined below.

[Brief Description of the Drawings]
FIG. 1 is a schematic diagram of a conventional touch input device using optical sensing.
FIG. 2 is a schematic diagram of the detection signals of a conventional touch input device using optical sensing.
FIG. 3 is a schematic diagram of a touch input device using optical sensing according to the first embodiment of the present invention.
FIGS. 4A and 4B are schematic diagrams of the detection signals of the touch input device using optical sensing according to the first embodiment of the present invention.
FIGS. 5A and 5B are schematic diagrams of the beam-shielding angles of the touch points detected by the imaging systems according to the first embodiment of the present invention.
FIG. 6 is a schematic structural diagram of the imaging system according to the second embodiment of the present invention.
FIGS. 7A and 7B are schematic diagrams of the beam-shielding angles of the touch points detected by the imaging system according to the second embodiment of the present invention.
FIG. 8 is a schematic diagram of a touch input device using optical sensing according to the third embodiment of the present invention.
FIG. 9 is a schematic diagram of the computing system according to an embodiment of the present invention.

[Description of Reference Numerals]
10, 30, 80: touch input device
11: touch panel
12: reflector
13a, 13b: light-source emission/detection system
31a, 31b, 60a: imaging system
31c, 31d: second imaging system
32: light source
61, 61′: optical lens
62, 62′: photo-sensing element
90: computing system
91: imaging system
92: angle selection unit

93: computation unit
P1, P2, P1′, P2′: touch point
P3, P4: ghost touch point
W: width
d, e, f: movement distance

Claims (1)

201104533 VII. Claims:

1. A touch input device with multi-touch capability, comprising: a touch panel for receiving at least one touch point; at least one light source, arranged around the touch panel, for providing a detection beam; a plurality of imaging systems having a first spatial viewing angle, located at adjacent corners of the touch panel, which receive the detection beam and generate a first detection signal according to the detection beam shielded by the touch point; and at least one viewing-angle changing unit, connected to the imaging systems, which gives the imaging systems a second spatial viewing angle, the imaging systems having the second spatial viewing angle generating a second detection signal according to the detection beam shielded by the touch point; wherein a computing system receives the first detection signal and the second detection signal and computes therefrom the coordinates of the touch point.

2. The touch input device of claim 1, wherein the light source is a combination of a plurality of LEDs, a combination of LEDs and a light guide plate, a combination of LEDs and a reflector, another light source, or a reflector.

3. The touch input device of claim 1, wherein the imaging systems comprise at least one optical lens and at least one photo-sensing element.

4. The touch input device of claim 3, wherein the viewing-angle changing unit moves the positions of the imaging systems so that the imaging systems have the second spatial viewing angle, the direction of movement being parallel to the traveling direction of the detection beam.

5. The touch input device of claim 3, wherein the viewing-angle changing unit moves the position of the optical lens or of the photo-sensing element so that the imaging systems have the second spatial viewing angle.

6. The touch input device of claim 1, wherein the first spatial viewing angle differs from the second spatial viewing angle.

7. The touch input device of claim 1, wherein the computing system further comprises an angle selection unit electrically connected to a computation unit; the angle selection unit receives the first detection signal and the second detection signal and, according to the absolute difference between the angles represented by the first detection signal and the second detection signal, determines the touch point to which the first detection signal corresponds; and the computation unit computes the coordinates of the touch point accordingly.

8. The touch input device of claim 1, wherein the computing system further comprises an angle selection unit electrically connected to a computation unit; the angle selection unit receives the first detection signal and the second detection signal and, according to the absolute difference between the proportions that the angles represented by the first detection signal and the second detection signal respectively occupy within the first spatial viewing angle and the second spatial viewing angle, determines the touch point to which the first detection signal corresponds; and the computation unit computes the coordinates of the touch point accordingly.

9. The touch input device of claim 1, wherein the viewing-angle changing unit is an actuator, for example one driven by a stepping motor, a voice coil motor (VCM) or a piezo actuator, an adjustable-focus lens such as a liquid lens, or a software viewing-angle technique such as digital zoom.

10. A touch input device with multi-touch capability, comprising: a touch panel for receiving a touch point; at least one light source, arranged around the touch panel, for providing a detection beam; a plurality of first imaging systems having a first spatial viewing angle, located at adjacent corners of the touch panel, which receive the detection beam and generate a first detection signal according to the detection beam shielded by the touch point; and a plurality of second imaging systems having a second spatial viewing angle, located at adjacent corners of the touch panel, which receive the detection beam and generate a second detection signal according to the detection beam shielded by the touch point; wherein a computing system receives the first detection signal and the second detection signal and computes therefrom the coordinates of the touch point.

11. The touch input device of claim 10, wherein the light source is a combination of a plurality of LEDs, a combination of LEDs and a light guide plate, a combination of LEDs and a reflector, another light source, or a reflector.

12. The touch input device of claim 10, wherein the first spatial viewing angle differs from the second spatial viewing angle.

13. The touch input device of claim 10, wherein the computing system further comprises an angle selection unit electrically connected to a computation unit; the angle selection unit receives the first detection signal and the second detection signal and, according to the absolute difference between the proportions that the angles represented by the first detection signal and the second detection signal respectively occupy within the first spatial viewing angle and the second spatial viewing angle, determines the touch point to which the first detection signal corresponds; and the computation unit computes the coordinates of the touch point accordingly.

14. A touch method for detecting multiple touch positions with a touch input device, comprising: providing a touch panel for inputting a plurality of touch points; sensing the touch points with a plurality of imaging systems to generate a first detection signal, the imaging systems having a first spatial viewing angle; changing the imaging systems with a viewing-angle changing unit so that the imaging systems have a second spatial viewing angle, and sensing the touch points with the imaging systems to generate a second detection signal; and receiving the first detection signal and the second detection signal with an angle selection unit of a computing system and, according to the change between the angles corresponding to the first detection signal and the second detection signal, selecting the touch point corresponding to the angle of the first detection signal, whereupon a computation unit computes the coordinates of that touch point.

15. The touch method of claim 14, wherein the imaging systems comprise at least one optical lens and at least one photo-sensing element.

16. The touch method of claim 14, wherein the viewing-angle changing unit moves the positions of the imaging systems so that the imaging systems have the second spatial viewing angle, the direction of movement being parallel to the traveling direction of the detection beam.

17. The touch method of claim 15, wherein the viewing-angle changing unit moves the position of the optical lens or of the photo-sensing element so that the imaging systems have the second spatial viewing angle.

18. The touch method of claim 14, wherein the first spatial viewing angle differs from the second spatial viewing angle.

19. The touch method of claim 14, wherein the angle selection unit receives the first detection signal and the second detection signal and determines the touch point to which the first detection signal corresponds according to the absolute difference between the angles represented by the first detection signal and the second detection signal.

20. The touch method of claim 14, wherein the angle selection unit is electrically connected to a computation unit; the angle selection unit receives the first detection signal and the second detection signal and determines the touch point to which the first detection signal corresponds according to the absolute difference between the proportions that the angles represented by the first detection signal and the second detection signal respectively occupy within the first spatial viewing angle and the second spatial viewing angle.

21. The touch method of claim 14, wherein the viewing-angle changing unit is an actuator, for example one driven by a stepping motor, a voice coil motor (VCM) or a piezo actuator, an adjustable-focus lens such as a liquid lens, or a software viewing-angle technique such as digital zoom.

22. A touch method for detecting multiple touch positions with a touch input device, comprising: providing a touch panel for inputting a plurality of touch points; sensing the touch points with a plurality of first imaging systems to generate a first detection signal, the first imaging systems having a first spatial viewing angle; sensing the touch points with a second imaging system to generate a second detection signal, the second imaging system having a second spatial viewing angle; and receiving the first detection signal and the second detection signal with an angle selection unit of a computing system and, according to the change between the angles corresponding to the first detection signal and the second detection signal, selecting the touch point corresponding to the angle of the first detection signal, whereupon a computation unit computes the coordinates of that touch point.

23. The touch method of claim 22, wherein the first spatial viewing angle differs from the second spatial viewing angle.

24. The touch method of claim 22, wherein the angle selection unit of the computing system is electrically connected to the computation unit; the angle selection unit receives the first detection signal and the second detection signal and determines the touch point to which the first detection signal corresponds according to the absolute difference between the proportions that the angles represented by the first detection signal and the second detection signal respectively occupy within the first spatial viewing angle and the second spatial viewing angle.
TW98124231A 2009-07-17 2009-07-17 Multi-touch input apparatus and method thereof TWI393038B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
TW98124231A TWI393038B (en) 2009-07-17 2009-07-17 Multi-touch input apparatus and method thereof

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
TW98124231A TWI393038B (en) 2009-07-17 2009-07-17 Multi-touch input apparatus and method thereof

Publications (2)

Publication Number Publication Date
TW201104533A true TW201104533A (en) 2011-02-01
TWI393038B TWI393038B (en) 2013-04-11

Family

ID=44813678

Family Applications (1)

Application Number Title Priority Date Filing Date
TW98124231A TWI393038B (en) 2009-07-17 2009-07-17 Multi-touch input apparatus and method thereof

Country Status (1)

Country Link
TW (1) TWI393038B (en)


Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4546224B2 (en) * 2004-11-24 2010-09-15 キヤノン株式会社 Coordinate input method and apparatus
US8259078B2 (en) * 2006-06-09 2012-09-04 Apple Inc. Touch screen liquid crystal display
TWI372349B (en) * 2007-06-13 2012-09-11 Egalax Empia Technology Inc Device and method for determining function represented by relative motion between/among multitouch inputs on scan signal shielding for position acquisition-based touch screen
TWI339808B (en) * 2007-09-07 2011-04-01 Quanta Comp Inc Method and system for distinguishing multiple touch points

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI448918B (en) * 2011-09-09 2014-08-11 Pixart Imaging Inc Optical panel touch system
US9229579B2 (en) 2011-09-09 2016-01-05 Pixart Imaging Inc. Optical touch system
CN103092429A (en) * 2011-11-02 2013-05-08 时代光电科技股份有限公司 Method and device determining positions of objects
CN103092429B (en) * 2011-11-02 2015-10-14 音飞光电科技股份有限公司 Measure the device and method of object space
US9235293B2 (en) 2012-04-19 2016-01-12 Wistron Corporation Optical touch device and touch sensing method
TWI579579B (en) * 2012-06-14 2017-04-21 英特希爾美國公司 Method,system and optoelectronics apparatus for simple gesture detection using multiple photodetector segments

Also Published As

Publication number Publication date
TWI393038B (en) 2013-04-11

Similar Documents

Publication Publication Date Title
TWI423096B (en) Projecting system with touch controllable projecting picture
CN102150117B (en) Determining the location of one or more objects on a touch surface
US8743089B2 (en) Information processing apparatus and control method thereof
JP5703644B2 (en) Optical detection system, electronic equipment
TWI457798B (en) Method and device for identifying multipoint rotating movement
WO2013104219A1 (en) Laser virtual keyboard
US20080259050A1 (en) Optical touch control apparatus and method thereof
JP2002236544A (en) Coordinate input device and its control method, and computer-readable memory
TW201135553A (en) Contact sensitive device for detecting temporally overlapping traces
TW201137704A (en) Optical touch-control screen system and method for recognizing relative distance of objects
US20110216041A1 (en) Touch panel and touch position detection method of touch panel
CN102467298A (en) Implementation mode of virtual mobile phone keyboard
TWM434260U (en) Apparatus for identifying multipoint rotating movement
TW201104533A (en) Multi-touch input apparatus and method thereof
JP2010160772A (en) Electronic apparatus with virtual input device
TW200832200A (en) Optical positioning input method and device
CN104850275A (en) Projection terminal and projection touch-control implementation method therefor
US20160092032A1 (en) Optical touch screen system and computing method thereof
Hachisu et al. HACHIStack: dual-layer photo touch sensing for haptic and auditory tapping interaction
TW201037579A (en) Optical input device with multi-touch control
CN104571726B (en) Optical touch system, touch detection method and computer program product
KR20100066671A (en) Touch display apparatus
CN101963867B (en) Touch input device with multi-point touch function and touch method
Motamedi, HD Touch: multi-touch and object sensing on a high definition LCD TV
US20120205166A1 (en) Sensing system

Legal Events

Date Code Title Description
MM4A Annulment or lapse of patent due to non-payment of fees