US20140078087A1 - Method for touch contact tracking - Google Patents

Method for touch contact tracking

Info

Publication number
US20140078087A1
US20140078087A1 US14/028,832 US201314028832A
Authority
US
United States
Prior art keywords
location
detected
reported
virtual
coefficient
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US14/028,832
Other languages
English (en)
Inventor
Shun-Lung Ho
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Egalax Empia Technology Inc
Original Assignee
Egalax Empia Technology Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Egalax Empia Technology Inc filed Critical Egalax Empia Technology Inc
Priority to US14/028,832 priority Critical patent/US20140078087A1/en
Assigned to EGALAX_EMPIA TECHNOLOGY INC. reassignment EGALAX_EMPIA TECHNOLOGY INC. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: HO, SHUN-LUNG
Publication of US20140078087A1 publication Critical patent/US20140078087A1/en
Abandoned legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/0416Control or interface arrangements specially adapted for digitisers
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/0416Control or interface arrangements specially adapted for digitisers
    • G06F3/0418Control or interface arrangements specially adapted for digitisers for error correction or compensation, e.g. based on parallax, calibration or alignment
    • G06F3/04186Touch location disambiguation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/0412Digitisers structurally integrated in a display
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/044Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by capacitive means

Definitions

  • the present invention relates to a method for touch contact tracking, and more particularly, to a method for prediction-based touch contact tracking for touch screens.
  • an input track Itrack is the track of an external conductive object moving on a touch screen.
  • an output track Otrack is the track detected by the touch screen.
  • the output track Otrack may exhibit jitter due to noise interference; in particular, when the external conductive object is stationary, the detected location may jitter around the object. The larger the noise, the greater the jitter.
  • a coefficient is employed to carry out a filtering process, whereby a new reported location derived from a detected location is shifted back towards the latest reported location, to reduce or filter the jitter in the reported locations caused by noise in the signals of the touch sensor.
  • the coefficient is adjusted such that the more external objects that are detected, the closer a reported location is to a detected location.
  • a method for touch contact tracking in accordance with the present invention may include: continuously detecting the number of external objects touching or approaching a touch screen and a detected location corresponding to each external object; continuously determining a coefficient less than one based on the number, wherein the larger the number of the external objects, the smaller the coefficient; and continuously generating a reported location based on the coefficient and the respective detected location of each external object, wherein the smaller the coefficient, the closer the reported location is to the detected location.
  • a method for touch contact tracking in accordance with the present invention may include: continuously detecting the number of external objects touching or approaching a touch screen and a detected location corresponding to each external object; continuously determining a coefficient less than one based on the number, wherein the larger the number of the external objects, the smaller the coefficient; continuously generating a virtual detected location between a latest detected location corresponding to the same external object and a previously detected location corresponding to the same external object; continuously generating a respective reported location based on the coefficient and the detected location of each external object, wherein the smaller the coefficient, the closer the reported location is to the detected location; and continuously generating a virtual reported location based on the coefficient and the respective virtual detected location of each external object, wherein the smaller the coefficient, the closer the virtual reported location is to the virtual detected location.
  • the present invention includes at least the following advantages and benefits: jittering is addressed by filtering processes, and the degree of filtering is adjusted according to the number of external objects detected, so that when jittering is prominent, jittering can be reduced. On the other hand, when jittering is not prominent, the distance between a reported location and a detected location can be reduced.
  • FIG. 1 is a schematic diagram illustrating reporting points for tracking of the prior art
  • FIGS. 2 and 9 are schematic diagrams illustrating reporting points for tracking that employs a filtering algorithm
  • FIG. 3 is a schematic diagram illustrating reporting points for prediction-based touch contact tracking of the present invention.
  • FIG. 4 is a schematic diagram illustrating reporting points for prediction-based touch contact tracking of the present invention with the addition of virtual reported locations;
  • FIGS. 5 and 6 are flowcharts illustrating prediction-based touch contact tracking methods in accordance with a first embodiment of the present invention;
  • FIGS. 7 and 8 are flowcharts illustrating prediction-based touch contact tracking methods in accordance with a second embodiment of the present invention; and
  • FIGS. 10 and 11 are flowcharts illustrating touch contact tracking methods in accordance with the present invention.
  • a new reported location can be determined by performing interpolation between the latest reported location (the last reported location) and the latest detected location (the current detected location) with a certain ratio.
  • the touch screen sequentially detects the detected locations I1, I2, …, I5 along an input track Itrack, wherein when I2 is detected, a filtering process is carried out based on the latest reported location (assuming the latest reported location is at I1) and the latest detected location I2 with a certain ratio (e.g. 40%) to generate a new reported location O2.
  • an output track Otrack can be generated from the continuously generated reported locations. It can be seen from FIG. 2 that the output track Otrack lags the input track Itrack by a considerable distance.
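The interpolation at the heart of this filtering process can be sketched as follows; the function name and the default 0.4 ratio (mirroring the 40% example above) are illustrative rather than taken from the claims.

```python
def filter_location(last_reported, detected, ratio=0.4):
    """Generate a new reported location between the latest reported location
    and the latest detected location, shifted toward the detected location
    by `ratio` (0 < ratio < 1)."""
    return tuple(r + (d - r) * ratio for r, d in zip(last_reported, detected))
```

With a small ratio the output track changes slowly (strong filtering, more lag); with a ratio near 1 the reported location follows the detected location closely.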
  • a prediction-based touch contact tracking method is provided to address the issue of linear jittering while shortening the distance between the output and input tracks.
  • a touch screen sequentially detects continuous detected locations (e.g. detected locations I1, I2, …, I5), and predicts continuous predicted locations (e.g. predicted locations P3, P4 and P5) based on the detected locations.
  • a filtering process is carried out based on the latest reported location (the last reported location) and the latest predicted location (the current predicted location) to generate a new reported location.
  • a predicted location can be determined from at least two detected locations.
  • the predicted location P3 is determined from the detected locations I1 and I3; the predicted location P4 is determined from the detected locations I2 and I4; and the predicted location P5 is determined from the detected locations I3 and I5.
  • predicted locations are determined assuming uniform velocity.
  • a translational velocity or vector is calculated from a previous detected location and a following detected location that is one, two or more time units away from the previous detected location.
  • the location after k time units is predicted based on the translational velocity or vector and based on the following detected location, wherein k may be a natural or real number, for example, k may be 1 or 1.5.
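A minimal sketch of this uniform-velocity prediction, assuming locations are 2-D coordinate tuples; the function name and defaults are illustrative:

```python
def predict_location(prev, curr, k=1.0):
    """Predict the location k time units after `curr` by extrapolating the
    translational vector from `prev` to `curr`, assuming uniform velocity.
    k may be a natural or real number, e.g. 1 or 1.5."""
    return tuple(c + (c - p) * k for p, c in zip(prev, curr))
```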
  • the filtering process performing linear interpolation is carried out to generate a new reported location.
  • the prediction and filtering processes just described may employ a linear Bézier curve, B(t) = (1 − t)L0 + tL1.
  • for prediction, L0 and L1 are the detected locations (e.g. the detected locations I1 and I3 above), wherein t > 1 (e.g. 1.5), and the resulting B(t) is a new predicted location.
  • for filtering, L0 and L1 are the latest reported location and the latest predicted location (e.g. the reported location O2 and the predicted location P3), respectively, wherein the coefficient t ∈ [0, 1], and the resulting B(t) is a new reported location.
  • a quadratic Bézier curve can be used.
  • B(t) = (1 − t)²L0 + 2t(1 − t)L1 + t²L2.
  • for prediction, L0, L1 and L2 are the detected locations, wherein t > 1, and the resulting B(t) is a new predicted location.
  • for filtering, L0 and L1 are the two latest reported locations and L2 is the latest predicted location, wherein the coefficient t ∈ [0, 1], and the resulting B(t) is a new reported location.
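Both curves can be evaluated directly. The helper names below are illustrative; t > 1 extrapolates (prediction), while t ∈ [0, 1] interpolates (filtering):

```python
def linear_bezier(L0, L1, t):
    """B(t) = (1 - t)*L0 + t*L1, evaluated component-wise on coordinate tuples."""
    return tuple((1 - t) * a + t * b for a, b in zip(L0, L1))

def quadratic_bezier(L0, L1, L2, t):
    """B(t) = (1 - t)^2*L0 + 2t(1 - t)*L1 + t^2*L2, component-wise."""
    return tuple((1 - t) ** 2 * a + 2 * t * (1 - t) * b + t ** 2 * c
                 for a, b, c in zip(L0, L1, L2))
```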
  • At least one virtual detected location is added between each pair of adjacent detected locations among the detected locations.
  • the detected locations and the virtual detected location form continuous input locations.
  • the virtual detected location can be at the middle of each pair of adjacent detected locations, or generated in accordance with the above Bézier curves. Accordingly, the number of detected locations becoming the input locations can be effectively doubled.
  • the number of virtual detected locations added between each pair of adjacent detected locations can be one, two or more.
  • the touch screen sequentially detects continuous detected locations (e.g. detected locations I1, I2, …, I5), generates continuous virtual detected locations (I1-1, I2-1, …, I4-1) to form continuous input locations (I1, I1-1, I2, I2-1, …, I4, I4-1, I5), and continuous predicted locations (e.g. predicted locations P3, P3-1, P4, P4-1 and P5) are predicted based on the input locations.
  • a filtering process is carried out based on the latest reported location (the last reported location) and the latest predicted location (the current predicted location) to generate a new reported location (e.g. reported location O3, O3-1, O4, O4-1 or O5).
  • for example, the reported location O2-1 is the reported location immediately before the reported location O3.
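One way to realize the midpoint variant of virtual detected locations described above, assuming locations are coordinate tuples (the function name is illustrative):

```python
def insert_virtual_midpoints(detected):
    """Insert one virtual detected location at the midpoint of each pair of
    adjacent detected locations, effectively doubling the input locations."""
    if not detected:
        return []
    out = []
    for a, b in zip(detected, detected[1:]):
        out.append(a)
        # virtual detected location at the middle of the adjacent pair
        out.append(tuple((x + y) / 2 for x, y in zip(a, b)))
    out.append(detected[-1])
    return out
```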
  • the technical means of the present invention can be applied to most types of touch screens, for example, resistive, surface acoustic wave, infrared, optical, surface capacitive, projected capacitive, or other types of touch screens that are capable of reporting locations in order to display an output track.
  • the present invention can be used to report the output track (or tracks) of one or more external conductive objects.
  • a flowchart illustrating a prediction-based touch contact tracking method in accordance with a first embodiment of the present invention is shown.
  • in step 510, when an external object touches or approaches a touch screen, a detected location corresponding to the external object is continuously generated from signals received by the touch screen.
  • in step 530, a predicted location is continuously generated based on a most recently generated detected location and at least one previously generated detected location.
  • in step 550, a new reported location is continuously generated based on at least one predicted location, including the most recently generated predicted location, and at least one reported location, including the most recently generated reported location.
  • a flowchart illustrating another prediction-based touch contact tracking method in accordance with the first embodiment of the present invention is shown.
  • in step 610, when an external object touches or approaches a touch screen, a plurality of continuous detected locations corresponding to the external object are generated from signals received by the touch screen.
  • in step 630, a plurality of predicted locations are generated based on the detected locations, wherein each predicted location is generated based on at least two detected locations.
  • in step 650, a plurality of new reported locations are generated based on the predicted locations, wherein each reported location is generated based on at least one predicted location and at least one reported location.
  • a predicted location is generated by multiplying the vector of the two most recently generated detected locations by a predetermined factor greater than 1.
  • in other words, a new vector is generated by scaling the vector from the older detected location (the starting point) to the newer detected location (the end point) by a predetermined factor greater than 1.
  • the predicted location is the end point of this new vector, which starts at the older detected location.
  • generating another location from two locations can be carried out using the example shown above or other methods; the present invention is not limited as such.
  • a predicted location is generated by multiplying the vector of the two detected locations by a predetermined factor greater than 1, wherein there is at least one detected location between the two detected locations.
  • the two detected locations for generating a predicted location can be adjacent or not adjacent to each other; the present invention is not limited as such.
  • a reported location is likewise generated by multiplying a vector by a predetermined factor that is less than 1.
  • a reported location is generated by multiplying the vector of the most recently generated reported location and the most recently generated predicted location by a predetermined factor that is less than 1.
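Putting the two scaling steps together, a sketch of the predict-then-filter pipeline; the factor values are placeholders for the predetermined factors described above, and the function name is an assumption:

```python
def track(detected, predict_factor=1.5, report_factor=0.4):
    """Run the predict-then-filter pipeline over a sequence of detected
    locations: each predicted location scales the vector between consecutive
    detected locations by predict_factor (> 1), and each reported location
    scales the vector from the last reported location to the predicted
    location by report_factor (< 1)."""
    reported = [detected[0]]  # seed with the first detected location
    for prev, curr in zip(detected, detected[1:]):
        # predicted location: older location + (vector to newer) * factor > 1
        predicted = tuple(p + (c - p) * predict_factor for p, c in zip(prev, curr))
        last = reported[-1]
        # reported location: last reported + (vector to predicted) * factor < 1
        reported.append(tuple(l + (q - l) * report_factor
                              for l, q in zip(last, predicted)))
    return reported
```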
  • input locations I1-1, I2-1, …, I4-1 can be regarded as virtual input locations.
  • predicted locations P3-1 and P4-1 can be regarded as virtual predicted locations.
  • reported locations O3-1 and O4-1 can be regarded as virtual reported locations. The virtual reported locations increase the reporting rate, so the line becomes smoother.
  • a virtual detected location is continuously generated based on a most recently generated detected location and an adjacent detected location.
  • a virtual predicted location is generated continuously based on a most recently generated virtual detected location and at least one previously generated virtual detected location.
  • a new virtual reported location is generated continuously based on at least one virtual predicted location including the most recently generated virtual predicted location and at least one virtual reported location including the most recently generated virtual reported location.
  • the method illustrated in FIG. 6 further includes additional steps as follows.
  • in step 620, a plurality of continuous virtual detected locations are generated based on the detected locations, wherein each virtual detected location resides between two adjacent detected locations.
  • in step 640, a plurality of virtual predicted locations are generated based on the virtual detected locations, wherein each virtual predicted location is generated based on at least two virtual detected locations.
  • in step 660, a plurality of new virtual reported locations are generated based on the virtual predicted locations, wherein each virtual reported location is generated based on at least one virtual predicted location and at least one virtual reported location.
  • a virtual predicted location is generated by multiplying the vector of the two most recently generated virtual detected locations by a predetermined factor greater than 1.
  • a virtual reported location is generated by multiplying the vector of the most recently generated virtual reported location and the most recently generated virtual predicted location by a predetermined factor that is less than 1.
  • the two virtual detected locations for generating a virtual predicted location can be adjacent or not adjacent to each other; the present invention is not limited as such.
  • similarly, a virtual reported location is generated by multiplying a vector by a predetermined factor that is less than 1.
  • a virtual predicted location is generated by multiplying the vector of the two virtual detected locations by a predetermined factor that is greater than 1, wherein at least one virtual detected location resides between the two virtual detected locations.
  • the present invention further includes continuously providing a latest virtual reported location and a latest reported location, wherein the latest virtual reported location is provided before the latest reported location.
  • the predicted locations, the reported locations, the virtual predicted locations and the virtual reported locations mentioned before can be generated by quadratic Bézier curves.
  • the number of reported locations provided in a unit time is called the reporting rate.
  • the reporting rate can be fixed or variable. For example, in the detection method disclosed in U.S. patent application Ser. No. 12/499,981, the duration of detection varies with the number of external objects approaching or touching a touch screen: the larger the number of external objects, the longer the detection duration. Even though the detection duration is longer, the number of locations detected is greater, so the overall reporting rate is not necessarily lower, but it is not fixed.
  • a touch contact tracking method is proposed by the present invention, wherein the coefficient t is adjusted according to the number of external objects detected from the signals of a touch screen, such that when the number of external objects detected is larger, the reported locations will be closer to the detected locations. For example, when the number of external objects is small, for example, 1, a larger coefficient is used, as shown in FIG. 2. As another example, when the number of external objects is larger, for example, 5, a smaller coefficient is used, as shown in FIG. 9.
  • in step 1010, the number of external objects touching or approaching a touch screen and a detected location corresponding to each external object are continuously detected. Since the detected locations of more than one external object (touching or approaching) may be detected on the touch screen, the external object to which each new detected location corresponds can be determined based on the history of detected locations of each previously detected external object. In contrast to the newly detected location, any previously detected locations corresponding to the same external object belong to the history of detected locations of that external object.
  • in step 1020, a coefficient less than 1 is continuously determined according to the number, wherein the larger the number of external objects, the smaller the coefficient.
  • a reported location is continuously generated based on the coefficient and the respective detected location of each external object, wherein the smaller the coefficient, the closer the reported location is to the detected location. In other words, the more external objects there are, the closer the reported locations will be to the detected locations. Conversely, the fewer external objects there are, the larger the coefficient, and the further the reported locations will be from the detected locations.
  • the reported locations can also be generated by quadratic Bézier curves; the present invention is not limited as such.
  • the reported location can also be generated by prediction-based methods such as those described in FIGS. 5 and 7 , and will not be repeated herein.
  • a virtual detected location is continuously generated between the latest detected location corresponding to the same external object and a previously detected location corresponding to the same external object.
  • a virtual detected location may reside between two adjacent (former and latter) detected locations corresponding to the same external object.
  • a virtual reported location is continuously generated based on the coefficient and the respective virtual detected location of each external object, wherein the smaller the coefficient, the closer the virtual reported location is to the virtual detected location. In other words, the more external objects there are, the closer the virtual reported locations will be to the virtual detected locations. Conversely, the fewer external objects there are, the larger the coefficient, and the further the virtual reported locations will be from the virtual detected locations.
  • the virtual reported location of each external object is generated based on the latest virtual reported location and a virtual detected location.
  • virtual reported location = (latest virtual reported location × (1 − coefficient)) + (virtual detected location × coefficient).
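A sketch of this reporting formula. The coefficient schedule below (how the coefficient shrinks as the object count grows) is an assumed example, since the text specifies only that the coefficient is less than 1 and varies with the number of objects; the function names and numeric defaults are illustrative.

```python
def coefficient_for(num_objects, base=0.6, step=0.1, floor=0.2):
    """Assumed schedule: a coefficient less than 1 that decreases as the
    number of detected external objects increases (values are illustrative)."""
    return max(floor, base - step * (num_objects - 1))

def report_location(last_reported, detected, coeff):
    """Reported location = last_reported * (1 - coeff) + detected * coeff,
    per the formula quoted above, evaluated component-wise."""
    return tuple(l * (1 - coeff) + d * coeff
                 for l, d in zip(last_reported, detected))
```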
  • the virtual reported locations can also be generated by quadratic Bézier curves; the present invention is not limited as such.
  • the virtual reported location can also be generated by prediction-based methods such as those described in FIGS. 5 and 7 , and will not be repeated herein.
  • the present invention further includes continuously providing a newly generated virtual reported location corresponding to each external object, and continuously providing a newly generated reported location corresponding to each external object, wherein the newly generated virtual reported location corresponding to each external object is provided before the newly generated reported location corresponding to each external object.
  • jittering is addressed, and the coefficient is adjusted according to the number of external objects detected, so that when jittering is prominent, jittering can be reduced.
  • the distance between a reported location and a detected location can be reduced.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
US14/028,832 2012-09-18 2013-09-17 Method for touch contact tracking Abandoned US20140078087A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US14/028,832 US20140078087A1 (en) 2012-09-18 2013-09-17 Method for touch contact tracking

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201261702301P 2012-09-18 2012-09-18
US14/028,832 US20140078087A1 (en) 2012-09-18 2013-09-17 Method for touch contact tracking

Publications (1)

Publication Number Publication Date
US20140078087A1 true US20140078087A1 (en) 2014-03-20

Family

ID=50273965

Family Applications (3)

Application Number Title Priority Date Filing Date
US14/028,713 Active 2034-08-16 US9606656B2 (en) 2012-09-18 2013-09-17 Prediction-based touch contact tracking
US14/028,832 Abandoned US20140078087A1 (en) 2012-09-18 2013-09-17 Method for touch contact tracking
US15/432,215 Active 2034-04-03 US10359872B2 (en) 2012-09-18 2017-02-14 Prediction-based touch contact tracking

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US14/028,713 Active 2034-08-16 US9606656B2 (en) 2012-09-18 2013-09-17 Prediction-based touch contact tracking

Family Applications After (1)

Application Number Title Priority Date Filing Date
US15/432,215 Active 2034-04-03 US10359872B2 (en) 2012-09-18 2017-02-14 Prediction-based touch contact tracking

Country Status (3)

Country Link
US (3) US9606656B2 (zh)
CN (2) CN103677382B (zh)
TW (1) TWI486837B (zh)

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150091832A1 (en) * 2013-10-02 2015-04-02 Sony Corporation Information processing apparatus, information processing method, and program
US20150355740A1 (en) * 2013-01-09 2015-12-10 Sharp Kabushiki Kaisha Touch panel system
US9323449B2 (en) * 2014-05-09 2016-04-26 Htc Corporation Electronic apparatus and drawing method using the same
CN105955525A (zh) * 2016-04-21 2016-09-21 青岛海信电器股份有限公司 Touch trajectory tracking method and apparatus, and display device
US9508166B2 (en) * 2014-09-15 2016-11-29 Microsoft Technology Licensing, Llc Smoothing and GPU-enabled rendering of digital ink
EP3115874A1 (en) * 2015-07-09 2017-01-11 Alps Electric Co., Ltd. Input device, method for controlling them and program, that adapt the filtering process according to the number of touches
CN106502459A (zh) * 2016-10-31 2017-03-15 北京交通大学 Smoothing filtering method for capacitive touch trajectory noise signals
US20170090672A1 (en) * 2015-09-30 2017-03-30 Apple Inc. Unified Drawing Framework
US10359872B2 (en) * 2012-09-18 2019-07-23 Egalax_Empia Technology Inc. Prediction-based touch contact tracking
US20230205368A1 (en) * 2021-12-24 2023-06-29 Lx Semicon Co., Ltd. Touch sensing device and coordinate correction method

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB201618288D0 (en) * 2016-10-28 2016-12-14 Remarkable As Interactive displays
CN107704128A (zh) * 2017-09-26 2018-02-16 北京集创北方科技股份有限公司 Data processing method and device, storage medium, and processor
CN112527139A (zh) * 2019-09-17 2021-03-19 北京小米移动软件有限公司 Method, apparatus, device and storage medium for determining the reported position of a touch point
CN112835455A (zh) * 2019-11-22 2021-05-25 华为技术有限公司 Method and device for predicting stylus drawing points
DE102020007256A1 2020-11-27 2022-06-02 Daimler Ag Device for correcting and forwarding captured coordinates
US11720205B2 (en) 2021-12-27 2023-08-08 Stmicroelectronics Ltd. Touch report rate of touchscreen
US11755149B1 (en) 2022-04-18 2023-09-12 Samsung Electronics Co., Ltd. Systems and methods for using statistical inference to enhance the precision of sparsified capacitive-touch and other human-interface devices

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120194444A1 (en) * 2011-01-31 2012-08-02 Tpk Touch Solutions Inc. Method of Tracing Touch Paths for a Multi-Touch Panel
US20130314358A1 (en) * 2011-02-16 2013-11-28 Nec Casio Mobile Communications Ltd. Input apparatus, input method, and recording medium
US8704792B1 (en) * 2012-10-19 2014-04-22 Google Inc. Density-based filtering of gesture events associated with a user interface of a computing device
US20140333583A1 (en) * 2012-01-31 2014-11-13 Fujitsu Component Limited Position detection method in touch panel and touch panel

Family Cites Families (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3052997B2 (ja) * 1996-01-12 2000-06-19 日本電気株式会社 Handwriting input display device
US6295378B1 (en) * 1996-02-29 2001-09-25 Sanyo Electric Co., Ltd. Handwriting stroke information encoder which encodes handwriting stroke information by sampling
AUPR890201A0 (en) * 2001-11-16 2001-12-06 Silverbrook Research Pty. Ltd. Methods and systems (npw005)
US7228227B2 (en) * 2004-07-07 2007-06-05 The Boeing Company Bezier curve flightpath guidance using moving waypoints
US20070018966A1 (en) * 2005-07-25 2007-01-25 Blythe Michael M Predicted object location
JP5204381B2 (ja) * 2006-05-01 2013-06-05 Nintendo Co., Ltd. Game program, game device, game system, and game processing method
JP4800873B2 (ja) * 2006-08-04 2011-10-26 Okuma Corp. Program and method for generating an approximate curve from approximate point cloud data
US7969440B1 (en) * 2007-05-02 2011-06-28 Evernote Corporation Method and system for curve fitting using digital filtering
CN101615291B (zh) * 2008-06-27 2012-10-10 Vivotek Inc. Feedback-based object detection method
KR100958643B1 (ko) * 2008-10-17 2010-05-20 Samsung Mobile Display Co., Ltd. Touch screen display device and driving method thereof
TWI563442B (en) * 2009-09-23 2016-12-21 Egalax Empia Technology Inc Method and device for position detection
US9189147B2 (en) * 2010-06-22 2015-11-17 Microsoft Technology Licensing, Llc Ink lag compensation techniques
CN102096530B (zh) * 2011-01-28 2013-09-18 Guangdong Vtron Technologies Co., Ltd. Multi-touch trajectory tracking method
US9542092B2 (en) * 2011-02-12 2017-01-10 Microsoft Technology Licensing, Llc Prediction-based touch contact tracking
CN102331873B (zh) * 2011-06-01 2014-06-18 Guangzhou Shirui Electronics Co., Ltd. Touch point tracking and positioning correction method and system
CN102890576B (zh) * 2011-07-22 2016-03-02 TPK Touch Solutions (Xiamen) Inc. Touch trajectory detection method and detection device for a touch screen
US8760423B2 (en) * 2011-10-28 2014-06-24 Nintendo Co., Ltd. Computer-readable storage medium, information processing apparatus, information processing system, and information processing method
US10452188B2 (en) * 2012-01-13 2019-10-22 Microsoft Technology Licensing, Llc Predictive compensation for a latency of an input device
US9218094B1 (en) * 2012-06-21 2015-12-22 Parade Technologies, Ltd. Sense position prediction for touch sensing methods, circuits and systems
US8487896B1 (en) * 2012-06-27 2013-07-16 Google Inc. Systems and methods for improving image tracking based on touch events
TWI486837B (zh) * 2012-09-18 2015-06-01 Egalax Empia Technology Inc Prediction-based position tracking method

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120194444A1 (en) * 2011-01-31 2012-08-02 Tpk Touch Solutions Inc. Method of Tracing Touch Paths for a Multi-Touch Panel
US20130314358A1 (en) * 2011-02-16 2013-11-28 Nec Casio Mobile Communications Ltd. Input apparatus, input method, and recording medium
US20140333583A1 (en) * 2012-01-31 2014-11-13 Fujitsu Component Limited Position detection method in touch panel and touch panel
US8704792B1 (en) * 2012-10-19 2014-04-22 Google Inc. Density-based filtering of gesture events associated with a user interface of a computing device

Cited By (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10359872B2 (en) * 2012-09-18 2019-07-23 Egalax_Empia Technology Inc. Prediction-based touch contact tracking
US20150355740A1 (en) * 2013-01-09 2015-12-10 Sharp Kabushiki Kaisha Touch panel system
US20150091832A1 (en) * 2013-10-02 2015-04-02 Sony Corporation Information processing apparatus, information processing method, and program
US9323449B2 (en) * 2014-05-09 2016-04-26 Htc Corporation Electronic apparatus and drawing method using the same
US9508166B2 (en) * 2014-09-15 2016-11-29 Microsoft Technology Licensing, Llc Smoothing and GPU-enabled rendering of digital ink
CN106687891A (zh) * 2014-09-15 2017-05-17 Microsoft Technology Licensing, LLC Smoothing and GPU-enabled rendering of digital ink
US9697625B2 (en) 2014-09-15 2017-07-04 Microsoft Technology Licensing, Llc Smoothing and GPU-enabled rendering of digital ink
EP3115874A1 (en) * 2015-07-09 2017-01-11 Alps Electric Co., Ltd. Input device, control method, and program that adapt the filtering process according to the number of touches
JP2017021518A (ja) * 2015-07-09 2017-01-26 Alps Electric Co., Ltd. Input device, control method therefor, and program
EP3141990A1 (en) * 2015-07-09 2017-03-15 Alps Electric Co., Ltd. Threshold adaptation for a multi-touch input device
US20170090672A1 (en) * 2015-09-30 2017-03-30 Apple Inc. Unified Drawing Framework
US10739911B2 (en) * 2015-09-30 2020-08-11 Apple Inc. Unified drawing framework
CN105955525A (zh) * 2016-04-21 2016-09-21 Qingdao Hisense Electronics Co., Ltd. Touch trajectory tracking method, apparatus, and display device
CN106502459A (zh) * 2016-10-31 2017-03-15 Beijing Jiaotong University Smoothing filter method for capacitive touch trajectory noise signals
US20230205368A1 (en) * 2021-12-24 2023-06-29 Lx Semicon Co., Ltd. Touch sensing device and coordinate correction method
US11960682B2 (en) * 2021-12-24 2024-04-16 Lx Semicon Co., Ltd. Touch sensing device and coordinate correction method

Also Published As

Publication number Publication date
CN107422893B (zh) 2021-03-02
TW201413525A (zh) 2014-04-01
CN103677382A (zh) 2014-03-26
US20140078085A1 (en) 2014-03-20
CN103677382B (zh) 2017-04-12
TWI486837B (zh) 2015-06-01
US10359872B2 (en) 2019-07-23
CN107422893A (zh) 2017-12-01
US20170153768A1 (en) 2017-06-01
US9606656B2 (en) 2017-03-28

Similar Documents

Publication Publication Date Title
US10359872B2 (en) Prediction-based touch contact tracking
CA3008371C (en) Coordinate correction apparatus
US9063611B2 (en) Systems and methods for improving image tracking based on touch events
US8687848B2 (en) Techniques for context-enhanced confidence adjustment for gesture
KR101813061B1 (ko) Data reporting method and apparatus, and terminal device
CN102292981B (zh) Frame rate conversion device and method
CN102622120A (zh) Touch trajectory tracking method for a multi-touch panel
JP2013531305A (ja) Touch event determination method and touch-sensitive device
JP2013025788A (ja) Touch tracking device and method for a touch screen
CN102170567A (zh) Adaptive motion estimation algorithm based on predictive motion vector search
US20120308080A1 (en) Image processing apparatus, image processing method, and program
CN102999192B (zh) Touch system and method with trajectory detection function
US9262009B2 (en) Touch device and method for detecting touch point thereof
CN102035996B (zh) Image processing device and control method thereof
TWI470482B (zh) Position tracking method
TW201514770A (zh) Touch screen
CN103150051A (zh) Touch operation response method, system, and touch screen terminal
JP2011130128A (ja) Image processing device, control method thereof, and program
JP2021086602A (ja) Visual search method, apparatus, and electronic device
JP2009272780A (ja) Frame rate conversion device, method, and program
JP6289071B2 (ja) Touch panel device and touch coordinate processing method
JP6219708B2 (ja) Touch detection device and touch detection method
CN111930156B (zh) Vibration suppression method, system, apparatus, and robot device
CN103941899A (zh) Position tracking method
Vanga et al. Multi stage based time series analysis of user activity on touch sensitive surfaces in highly noise susceptible environments

Legal Events

Date Code Title Description
AS Assignment

Owner name: EGALAX_EMPIA TECHNOLOGY INC., TAIWAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:HO, SHUN-LUNG;REEL/FRAME:031221/0644

Effective date: 20130917

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION