JP2017529582A - Touch classification - Google Patents
Touch classification
- Publication number
- JP2017529582A (application JP2016575483A)
- Authority
- JP
- Japan
- Prior art keywords
- classification
- touch
- blob
- frame
- rating
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N20/00—Machine learning
- G06N20/20—Ensemble learning
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/0416—Control or interface arrangements specially adapted for digitisers
- G06F3/0418—Control or interface arrangements specially adapted for digitisers for error correction or compensation, e.g. based on parallax, calibration or alignment
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/0416—Control or interface arrangements specially adapted for digitisers
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/0416—Control or interface arrangements specially adapted for digitisers
- G06F3/0418—Control or interface arrangements specially adapted for digitisers for error correction or compensation, e.g. based on parallax, calibration or alignment
- G06F3/04186—Touch location disambiguation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/044—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by capacitive means
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N20/00—Machine learning
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N5/00—Computing arrangements using knowledge-based models
- G06N5/02—Knowledge representation; Symbolic representation
- G06N5/027—Frames
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Software Systems (AREA)
- Human Computer Interaction (AREA)
- Computing Systems (AREA)
- Artificial Intelligence (AREA)
- Evolutionary Computation (AREA)
- Data Mining & Analysis (AREA)
- Mathematical Physics (AREA)
- Medical Informatics (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Computational Linguistics (AREA)
- Image Analysis (AREA)
- User Interface Of Digital Computer (AREA)
- Information Retrieval, Db Structures And Fs Structures Therefor (AREA)
Abstract
Description
Blob classification rating = finger table rating − palm table rating
As a result, the blob classification rating falls in the range of −3 to +3, where a positive value indicates that the blob is more likely to be an intended touch, and a negative value indicates that the blob is more likely to be an unintended touch. The absolute value of the rating is an indication of the confidence, or ambiguity level, of the classification. A blob classification rating is computed for each blob in the touch image.
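The per-frame rating above can be sketched as follows. This is a minimal illustration, not the patented implementation: the feature bucketing in `quantize` and the table contents are hypothetical placeholders, whereas in practice the finger and palm lookup tables would be populated from labeled training data.

```python
# Sketch of the per-frame blob classification rating described above.
# The quantization buckets and example features are invented for
# illustration; only the finger-minus-palm structure follows the text.

def quantize(features):
    """Bucket a blob's characteristic set into a lookup-table key."""
    area, intensity, roundness = features
    return (min(int(area // 50), 3),        # blob size bucket
            min(int(intensity // 64), 3),   # contact intensity bucket
            min(int(roundness * 4), 3))     # roundness bucket

def blob_classification_rating(features, finger_table, palm_table):
    """Return a rating in -3..+3: positive means likely intended touch."""
    key = quantize(features)
    finger_rating = finger_table.get(key, 0)  # 0..3, learned offline
    palm_rating = palm_table.get(key, 0)      # 0..3, learned offline
    return finger_rating - palm_rating
```

A small, finger-like blob that the (hypothetical) finger table rates 3 and the palm table rates 1 would thus score +2, a fairly confident "intended" rating, while a key absent from both tables scores 0, i.e. maximally ambiguous.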
Claims (15)
- A computer-implemented method comprising:
obtaining frame data representing a plurality of frames captured by a touch-sensitive device;
analyzing the frame data to define a respective blob in each frame of the plurality of frames, the blob being indicative of a touch event;
computing a plurality of characteristic sets for the touch event, each characteristic set specifying properties of the respective blob in each frame of the plurality of frames; and
determining a type of the touch event via machine-learned classification configured to provide a number of non-bimodal classification scores based on the plurality of characteristic sets for the plurality of frames, each non-bimodal classification score being indicative of an ambiguity level in the machine-learned classification. - The computer-implemented method of claim 1, wherein the machine-learned classification is configured to generate the non-bimodal classification scores such that each non-bimodal classification score is capable of representing a probability that the touch event is of a respective type.
- The computer-implemented method of claim 2, wherein each of the non-bimodal classification scores is generated by a machine-learned classifier configured to receive the plurality of characteristic sets as input.
- The computer-implemented method of claim 3, wherein the machine-learned classifier comprises a random decision forest classifier.
- The computer-implemented method of claim 1, further comprising:
defining a track of the blob across the plurality of frames for the touch event; and
computing a track characteristic set for the track;
wherein determining the type comprises applying the track characteristic set to a machine-learned classifier. - The computer-implemented method of claim 1, wherein computing the plurality of characteristic sets comprises aggregating data indicative of the plurality of characteristic sets, before application of the plurality of characteristic sets to a machine-learned classifier, in determining the type of the touch event.
- The computer-implemented method of claim 1, wherein the machine-learned classification comprises lookup-table-based classification.
- The computer-implemented method of claim 1, wherein determining the type comprises applying the characteristic set for each respective frame of the plurality of frames to a number of lookup tables, each lookup table providing a respective individual non-bimodal classification score of the number of non-bimodal classification scores.
- The computer-implemented method of claim 8, wherein determining the type comprises combining each of the individual non-bimodal classification scores for the respective frame to generate a blob classification rating score for the respective frame.
- The computer-implemented method of claim 9, wherein the number of lookup tables comprises a first lookup table configured to provide a first rating that the touch event is an intended touch, and a second lookup table for determining a second rating that the touch event is an unintended touch, and
wherein determining the type comprises subtracting the second rating from the first rating to determine the blob classification rating score for the respective frame. - The computer-implemented method of claim 9, wherein determining the type comprises aggregating the blob classification rating scores across the plurality of frames to determine a cumulative multi-frame classification score for the touch event.
- The computer-implemented method of claim 11, wherein determining the type comprises:
determining whether the cumulative multi-frame classification score passes one of a number of classification thresholds; and
if not, repeating the acts of applying the characteristic sets, combining the classification scores, and aggregating the rating scores in connection with a further characteristic set of the plurality of characteristic sets. - The computer-implemented method of claim 1, wherein each characteristic set comprises data indicative of an appearance of an image patch disposed at the respective blob in each frame.
- The computer-implemented method of claim 1, wherein each characteristic set comprises data indicative of intensity gradients in the frame data for the respective blob in each frame.
- The computer-implemented method of claim 1, wherein each characteristic set comprises data indicative of a perimeter quotient or other metric of roundness of the respective blob in each frame.
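The multi-frame aggregation described in claims 11 and 12 can be illustrated with a short sketch. The threshold values and the per-frame ratings are hypothetical; only the accumulate-and-compare structure is taken from the claims:

```python
def classify_touch(frame_ratings, intended_threshold=5, unintended_threshold=-5):
    """Aggregate per-frame blob classification ratings (-3..+3 each) across
    frames until the cumulative score passes a classification threshold.
    The +/-5 thresholds are illustrative values, not from the patent."""
    cumulative = 0
    for rating in frame_ratings:
        cumulative += rating
        if cumulative >= intended_threshold:
            return "intended"
        if cumulative <= unintended_threshold:
            return "unintended"
    # No threshold passed within the available frames: per claim 12,
    # the classification would continue with further characteristic sets.
    return "ambiguous"

# classify_touch([1, 2, 3]) accumulates 1, 3, 6 and passes the +5 threshold,
# so a blob that looks increasingly finger-like is classified as intended.
```

Deferring the decision until a threshold is passed is what lets weakly rated (ambiguous) frames contribute evidence without forcing a premature hard classification.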
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/329,052 | 2014-07-11 | ||
US14/329,052 US9558455B2 (en) | 2014-07-11 | 2014-07-11 | Touch classification |
PCT/US2015/039282 WO2016007450A1 (en) | 2014-07-11 | 2015-07-07 | Touch classification |
Publications (2)
Publication Number | Publication Date |
---|---|
JP2017529582A true JP2017529582A (ja) | 2017-10-05 |
JP6641306B2 JP6641306B2 (ja) | 2020-02-05 |
Family
ID=53758517
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
JP2016575483A Active JP6641306B2 (ja) | 2014-07-11 | 2015-07-07 | タッチ分類 |
Country Status (11)
Country | Link |
---|---|
US (2) | US9558455B2 (ja) |
EP (1) | EP3167352B1 (ja) |
JP (1) | JP6641306B2 (ja) |
KR (1) | KR102424803B1 (ja) |
CN (1) | CN106537305B (ja) |
AU (1) | AU2015288086B2 (ja) |
BR (1) | BR112016029932A2 (ja) |
CA (1) | CA2954516C (ja) |
MX (1) | MX2017000495A (ja) |
RU (1) | RU2711029C2 (ja) |
WO (1) | WO2016007450A1 (ja) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2016177343A (ja) * | 2015-03-18 | 2016-10-06 | 株式会社トヨタIt開発センター | 信号処理装置、入力装置、信号処理方法、およびプログラム |
Families Citing this family (50)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11327599B2 (en) | 2011-04-26 | 2022-05-10 | Sentons Inc. | Identifying a contact type |
EP2769291B1 (en) | 2011-10-18 | 2021-04-28 | Carnegie Mellon University | Method and apparatus for classifying touch events on a touch sensitive surface |
US11262253B2 (en) | 2017-08-14 | 2022-03-01 | Sentons Inc. | Touch input detection using a piezoresistive sensor |
KR101648143B1 (ko) | 2011-11-18 | 2016-08-16 | 센톤스 아이엔씨. | 터치 입력 힘 검출 |
KR20140114766A (ko) | 2013-03-19 | 2014-09-29 | 퀵소 코 | 터치 입력을 감지하기 위한 방법 및 장치 |
US9013452B2 (en) | 2013-03-25 | 2015-04-21 | Qeexo, Co. | Method and system for activating different interactive functions using different types of finger contacts |
US9612689B2 (en) | 2015-02-02 | 2017-04-04 | Qeexo, Co. | Method and apparatus for classifying a touch event on a touchscreen as related to one of multiple function generating interaction layers and activating a function in the selected interaction layer |
US10380434B2 (en) * | 2014-01-17 | 2019-08-13 | Kpit Technologies Ltd. | Vehicle detection system and method |
US9558455B2 (en) * | 2014-07-11 | 2017-01-31 | Microsoft Technology Licensing, Llc | Touch classification |
CN104199572B (zh) * | 2014-08-18 | 2017-02-15 | 京东方科技集团股份有限公司 | 一种触摸显示装置的触摸定位方法及触摸显示装置 |
US9329715B2 (en) | 2014-09-11 | 2016-05-03 | Qeexo, Co. | Method and apparatus for differentiating touch screen users based on touch event analysis |
US11619983B2 (en) | 2014-09-15 | 2023-04-04 | Qeexo, Co. | Method and apparatus for resolving touch screen ambiguities |
US9864453B2 (en) * | 2014-09-22 | 2018-01-09 | Qeexo, Co. | Method and apparatus for improving accuracy of touch screen event analysis by use of edge classification |
US10606417B2 (en) | 2014-09-24 | 2020-03-31 | Qeexo, Co. | Method for improving accuracy of touch screen event analysis by use of spatiotemporal touch patterns |
US10282024B2 (en) | 2014-09-25 | 2019-05-07 | Qeexo, Co. | Classifying contacts or associations with a touch sensitive device |
US10642404B2 (en) | 2015-08-24 | 2020-05-05 | Qeexo, Co. | Touch sensitive device with multi-sensor stream synchronized data |
US10083378B2 (en) * | 2015-12-28 | 2018-09-25 | Qualcomm Incorporated | Automatic detection of objects in video images |
TWI606376B (zh) * | 2016-08-08 | 2017-11-21 | 意象無限股份有限公司 | 觸控感測裝置及濾除誤觸的觸控方法 |
US10313348B2 (en) * | 2016-09-19 | 2019-06-04 | Fortinet, Inc. | Document classification by a hybrid classifier |
CN106708317A (zh) * | 2016-12-07 | 2017-05-24 | 南京仁光电子科技有限公司 | 判断触控点的方法和装置 |
US11580829B2 (en) | 2017-08-14 | 2023-02-14 | Sentons Inc. | Dynamic feedback for haptics |
US11057238B2 (en) | 2018-01-08 | 2021-07-06 | Brilliant Home Technology, Inc. | Automatic scene creation using home device control |
CN112041799A (zh) * | 2018-02-19 | 2020-12-04 | 拉普特知识产权公司 | 触摸敏感装置中的不需要的触摸管理 |
CN110163460B (zh) * | 2018-03-30 | 2023-09-19 | 腾讯科技(深圳)有限公司 | 一种确定应用分值的方法及设备 |
KR102606766B1 (ko) | 2018-06-01 | 2023-11-28 | 삼성전자주식회사 | Em 센서 및 이를 포함하는 모바일 기기 |
KR102104275B1 (ko) * | 2018-06-01 | 2020-04-27 | 경희대학교 산학협력단 | 스타일러스 펜을 이용하는 터치 시스템 및 이를 이용한 터치 검출 방법 |
US11009989B2 (en) * | 2018-08-21 | 2021-05-18 | Qeexo, Co. | Recognizing and rejecting unintentional touch events associated with a touch sensitive device |
US11747290B2 (en) * | 2019-02-27 | 2023-09-05 | Li Industries, Inc. | Methods and systems for smart battery collection, sorting, and packaging |
JP7513019B2 (ja) * | 2019-03-29 | 2024-07-09 | ソニーグループ株式会社 | 画像処理装置および方法、並びに、プログラム |
US10942603B2 (en) | 2019-05-06 | 2021-03-09 | Qeexo, Co. | Managing activity states of an application processor in relation to touch or hover interactions with a touch sensitive device |
US11231815B2 (en) | 2019-06-28 | 2022-01-25 | Qeexo, Co. | Detecting object proximity using touch sensitive surface sensing and ultrasonic sensing |
KR20190104101A (ko) * | 2019-08-19 | 2019-09-06 | 엘지전자 주식회사 | 전자 장치의 터치 스크린에서 오류 터치를 판정하는 방법, 장치 및 시스템 |
CN111881287B (zh) * | 2019-09-10 | 2021-08-17 | 马上消费金融股份有限公司 | 一种分类模糊性分析方法及装置 |
US11301099B1 (en) * | 2019-09-27 | 2022-04-12 | Apple Inc. | Methods and apparatus for finger detection and separation on a touch sensor panel using machine learning models |
EP3835929A1 (en) * | 2019-12-13 | 2021-06-16 | Samsung Electronics Co., Ltd. | Method and electronic device for accidental touch prediction using ml classification |
US11528028B2 (en) | 2020-01-05 | 2022-12-13 | Brilliant Home Technology, Inc. | Touch-based control device to detect touch input without blind spots |
EP4085307A4 (en) * | 2020-01-05 | 2024-01-24 | Brilliant Home Technology, Inc. | TOUCH BASED CONTROL DEVICE |
US11592423B2 (en) | 2020-01-29 | 2023-02-28 | Qeexo, Co. | Adaptive ultrasonic sensing techniques and systems to mitigate interference |
US11620294B2 (en) * | 2020-01-30 | 2023-04-04 | Panasonic Avionics Corporation | Dynamic media data management |
GB2591764B (en) * | 2020-02-04 | 2024-08-14 | Peratech Holdco Ltd | Classifying pressure inputs |
US11599223B1 (en) | 2020-03-13 | 2023-03-07 | Apple Inc. | System and machine learning method for separating noise and signal in multitouch sensors |
CN111524157B (zh) * | 2020-04-26 | 2022-07-01 | 南瑞集团有限公司 | 基于摄像头阵列的触屏物体分析方法、系统及存储介质 |
US11899881B2 (en) | 2020-07-17 | 2024-02-13 | Apple Inc. | Machine learning method and system for suppressing display induced noise in touch sensors using information from display circuitry |
KR20220023639A (ko) | 2020-08-21 | 2022-03-02 | 삼성전자주식회사 | 전자 장치 및 그 제어 방법 |
US11954288B1 (en) | 2020-08-26 | 2024-04-09 | Apple Inc. | System and machine learning method for separating noise and signal in multitouch sensors |
US11481070B1 (en) | 2020-09-25 | 2022-10-25 | Apple Inc. | System and method for touch sensor panel with display noise correction |
EP4099142A4 (en) | 2021-04-19 | 2023-07-05 | Samsung Electronics Co., Ltd. | ELECTRONIC DEVICE AND METHOD OF OPERATION |
AU2022296590A1 (en) | 2021-06-24 | 2024-01-04 | Icu Medical, Inc. | Infusion pump touchscreen with false touch rejection |
US11537239B1 (en) * | 2022-01-14 | 2022-12-27 | Microsoft Technology Licensing, Llc | Diffusion-based handedness classification for touch-based input |
NL2031789B1 (en) * | 2022-05-06 | 2023-11-14 | Microsoft Technology Licensing Llc | Aggregated likelihood of unintentional touch input |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20120007821A1 (en) * | 2010-07-11 | 2012-01-12 | Lester F. Ludwig | Sequential classification recognition of gesture primitives and window-based parameter smoothing for high dimensional touchpad (hdtp) user interfaces |
WO2013166513A2 (en) * | 2012-05-04 | 2013-11-07 | Oblong Industries, Inc. | Cross-user hand tracking and shape recognition user interface |
US20140104225A1 (en) * | 2012-10-17 | 2014-04-17 | Perceptive Pixel, Inc. | Input Classification for Multi-Touch Systems |
Family Cites Families (30)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8018440B2 (en) | 2005-12-30 | 2011-09-13 | Microsoft Corporation | Unintentional touch rejection |
US8103109B2 (en) | 2007-06-19 | 2012-01-24 | Microsoft Corporation | Recognizing hand poses and/or object classes |
US8519965B2 (en) * | 2008-04-23 | 2013-08-27 | Motorola Mobility Llc | Multi-touch detection panel with disambiguation of touch coordinates |
US8502787B2 (en) | 2008-11-26 | 2013-08-06 | Panasonic Corporation | System and method for differentiating between intended and unintended user input on a touchpad |
KR101648747B1 (ko) * | 2009-10-07 | 2016-08-17 | 삼성전자 주식회사 | 복수의 터치 센서를 이용한 ui 제공방법 및 이를 이용한 휴대 단말기 |
US8587532B2 (en) * | 2009-12-18 | 2013-11-19 | Intel Corporation | Multi-feature interactive touch user interface |
KR20110138095A (ko) | 2010-06-18 | 2011-12-26 | 삼성전자주식회사 | 터치 시스템에서 좌표 보정 방법 및 장치 |
GB201011146D0 (en) * | 2010-07-02 | 2010-08-18 | Vodafone Ip Licensing Ltd | Mobile computing device |
WO2012057887A1 (en) | 2010-10-28 | 2012-05-03 | Cypress Semiconductor Corporation | Capacitive stylus with palm rejection |
US9244545B2 (en) | 2010-12-17 | 2016-01-26 | Microsoft Technology Licensing, Llc | Touch and stylus discrimination and rejection for contact sensitive computing devices |
AU2012214445A1 (en) * | 2011-02-08 | 2013-08-01 | Haworth, Inc. | Multimodal touchscreen interaction apparatuses, methods and systems |
EP2769291B1 (en) * | 2011-10-18 | 2021-04-28 | Carnegie Mellon University | Method and apparatus for classifying touch events on a touch sensitive surface |
US20130106761A1 (en) | 2011-10-28 | 2013-05-02 | Atmel Corporation | Touch Sensor with Lookup Table |
US20130176270A1 (en) | 2012-01-09 | 2013-07-11 | Broadcom Corporation | Object classification for touch panels |
JP5760100B2 (ja) | 2012-02-02 | 2015-08-05 | 株式会社タカラトミー | 放射線測定装置 |
US8973211B2 (en) * | 2012-02-04 | 2015-03-10 | Hsi Fire & Safety Group, Llc | Detector cleaner and/or tester and method of using same |
US8902181B2 (en) | 2012-02-07 | 2014-12-02 | Microsoft Corporation | Multi-touch-movement gestures for tablet computing devices |
WO2013126005A2 (en) | 2012-02-21 | 2013-08-29 | Flatfrog Laboratories Ab | Touch determination with improved detection of weak interactions |
US9542045B2 (en) | 2012-03-14 | 2017-01-10 | Texas Instruments Incorporated | Detecting and tracking touch on an illuminated surface using a mean-subtracted image |
EP2662756A1 (en) | 2012-05-11 | 2013-11-13 | BlackBerry Limited | Touch screen palm input rejection |
WO2013171747A2 (en) | 2012-05-14 | 2013-11-21 | N-Trig Ltd. | Method for identifying palm input to a digitizer |
US8902170B2 (en) * | 2012-05-31 | 2014-12-02 | Blackberry Limited | Method and system for rendering diacritic characters |
US20140232679A1 (en) * | 2013-02-17 | 2014-08-21 | Microsoft Corporation | Systems and methods to protect against inadvertant actuation of virtual buttons on touch surfaces |
US10578499B2 (en) * | 2013-02-17 | 2020-03-03 | Microsoft Technology Licensing, Llc | Piezo-actuated virtual buttons for touch surfaces |
SG11201506843PA (en) | 2013-03-15 | 2015-09-29 | Tactual Labs Co | Fast multi-touch noise reduction |
KR102143574B1 (ko) * | 2013-09-12 | 2020-08-11 | 삼성전자주식회사 | 근접 터치를 이용한 온라인 서명 인증 방법 및 이를 위한 장치 |
US9329727B2 (en) * | 2013-12-11 | 2016-05-03 | Microsoft Technology Licensing, Llc | Object detection in optical sensor systems |
US9430095B2 (en) * | 2014-01-23 | 2016-08-30 | Microsoft Technology Licensing, Llc | Global and local light detection in optical sensor systems |
US9558455B2 (en) * | 2014-07-11 | 2017-01-31 | Microsoft Technology Licensing, Llc | Touch classification |
US9818043B2 (en) * | 2015-06-24 | 2017-11-14 | Microsoft Technology Licensing, Llc | Real-time, model-based object detection and pose estimation |
-
2014
- 2014-07-11 US US14/329,052 patent/US9558455B2/en active Active
-
2015
- 2015-07-07 RU RU2017100249A patent/RU2711029C2/ru active
- 2015-07-07 BR BR112016029932A patent/BR112016029932A2/pt not_active Application Discontinuation
- 2015-07-07 EP EP15742432.6A patent/EP3167352B1/en active Active
- 2015-07-07 CA CA2954516A patent/CA2954516C/en active Active
- 2015-07-07 AU AU2015288086A patent/AU2015288086B2/en active Active
- 2015-07-07 MX MX2017000495A patent/MX2017000495A/es unknown
- 2015-07-07 CN CN201580037941.5A patent/CN106537305B/zh active Active
- 2015-07-07 WO PCT/US2015/039282 patent/WO2016007450A1/en active Application Filing
- 2015-07-07 KR KR1020177003813A patent/KR102424803B1/ko active IP Right Grant
- 2015-07-07 JP JP2016575483A patent/JP6641306B2/ja active Active
-
2017
- 2017-01-03 US US15/397,336 patent/US10679146B2/en active Active
Also Published As
Publication number | Publication date |
---|---|
KR102424803B1 (ko) | 2022-07-22 |
KR20170030613A (ko) | 2017-03-17 |
RU2017100249A (ru) | 2018-07-16 |
CN106537305B (zh) | 2019-12-20 |
US20170116545A1 (en) | 2017-04-27 |
US20160012348A1 (en) | 2016-01-14 |
CA2954516C (en) | 2022-10-04 |
MX2017000495A (es) | 2017-05-01 |
EP3167352A1 (en) | 2017-05-17 |
JP6641306B2 (ja) | 2020-02-05 |
AU2015288086A1 (en) | 2017-01-05 |
CA2954516A1 (en) | 2016-01-14 |
US9558455B2 (en) | 2017-01-31 |
CN106537305A (zh) | 2017-03-22 |
AU2015288086B2 (en) | 2020-07-16 |
WO2016007450A1 (en) | 2016-01-14 |
BR112016029932A2 (pt) | 2017-08-22 |
RU2711029C2 (ru) | 2020-01-14 |
RU2017100249A3 (ja) | 2019-02-12 |
US10679146B2 (en) | 2020-06-09 |
EP3167352B1 (en) | 2021-09-15 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6641306B2 (ja) | Touch classification | |
US20220383535A1 (en) | Object Tracking Method and Device, Electronic Device, and Computer-Readable Storage Medium | |
JP6649947B2 (ja) | Classification of touch input as being unintentional or intentional | |
JP2018524734A (ja) | System for recognizing input of multiple objects, and method and product therefor | |
WO2014127697A1 (en) | Method and terminal for triggering application programs and application program functions | |
CN109977906B (zh) | Gesture recognition method and system, computer device, and storage medium | |
KR101559502B1 (ko) | Contactless input interface method using real-time hand pose recognition, and recording medium | |
Joo et al. | Real‐Time Depth‐Based Hand Detection and Tracking | |
CN111492407B (zh) | System and method for drawing beautification | |
CN111625157A (zh) | Fingertip keypoint detection method, apparatus, device, and readable storage medium | |
US20140232672A1 (en) | Method and terminal for triggering application programs and application program functions | |
CN109582171A (zh) | Finger identification for new touch gestures using a capacitive hover mode | |
WO2024164486A1 (zh) | Touch handwriting generation method and apparatus, electronic device, and storage medium | |
Alam et al. | A unified learning approach for hand gesture recognition and fingertip detection | |
Bai et al. | Dynamic hand gesture recognition based on depth information | |
US20120299837A1 (en) | Identifying contacts and contact attributes in touch sensor data using spatial and temporal features | |
CN114743038A (zh) | Data clustering method and apparatus, computer device, and storage medium | |
CN111639573A (zh) | Gesture recognition method based on the ORB algorithm, storage medium, and electronic device | |
KR101687941B1 (ko) | Online handwriting data line segmentation method and apparatus using same | |
CN111626364B (zh) | Gesture image classification method and apparatus, computer device, and storage medium | |
CN116301361B (zh) | Smart-glasses-based target selection method and apparatus, and electronic device | |
JP2023504319A (ja) | Method, apparatus, device, and storage medium for associating a human body with a human hand | |
CN112596603A (zh) | Gesture control method, apparatus, device, and storage medium for a nuclear power plant control system |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
A621 | Written request for application examination |
Free format text: JAPANESE INTERMEDIATE CODE: A621 Effective date: 20180528 |
|
A977 | Report on retrieval |
Free format text: JAPANESE INTERMEDIATE CODE: A971007 Effective date: 20190218 |
|
A131 | Notification of reasons for refusal |
Free format text: JAPANESE INTERMEDIATE CODE: A131 Effective date: 20190326 |
|
A521 | Request for written amendment filed |
Free format text: JAPANESE INTERMEDIATE CODE: A523 Effective date: 20190619 |
|
TRDD | Decision of grant or rejection written | ||
A01 | Written decision to grant a patent or to grant a registration (utility model) |
Free format text: JAPANESE INTERMEDIATE CODE: A01 Effective date: 20191203 |
|
A61 | First payment of annual fees (during grant procedure) |
Free format text: JAPANESE INTERMEDIATE CODE: A61 Effective date: 20191227 |
|
R150 | Certificate of patent or registration of utility model |
Ref document number: 6641306 Country of ref document: JP Free format text: JAPANESE INTERMEDIATE CODE: R150 |
|
R250 | Receipt of annual fees |
Free format text: JAPANESE INTERMEDIATE CODE: R250 |
|
R250 | Receipt of annual fees |
Free format text: JAPANESE INTERMEDIATE CODE: R250 |