JP6464138B2 - Enhancing Touch Inputs with Gestures - Google Patents
Enhancing Touch Inputs with Gestures
- Publication number
- JP6464138B2 (Application JP2016501321A)
- Authority
- JP
- Japan
- Prior art keywords
- touch
- screen
- user device
- input data
- gesture
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/0416—Control or interface arrangements specially adapted for digitisers
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/0482—Interaction with lists of selectable items, e.g. menus
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04847—Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/041—Indexing scheme relating to G06F3/041 - G06F3/045
- G06F2203/04101—2.5D-digitiser, i.e. digitiser detecting the X/Y position of the input means, finger or stylus, also when it does not touch, but is proximate to the digitiser's interaction surface and also measures the distance of the input means within a short range in the Z direction, possibly with a separate measurement setup
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/041—Indexing scheme relating to G06F3/041 - G06F3/045
- G06F2203/04104—Multi-touch detection in digitiser, i.e. details about the simultaneous detection of a plurality of touching locations, e.g. multiple fingers or pen and finger
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/041—Indexing scheme relating to G06F3/041 - G06F3/045
- G06F2203/04106—Multi-sensing digitiser, i.e. digitiser using at least two different sensing technologies simultaneously or alternatively, e.g. for detecting pen and finger, for saving power or for improving position detection
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/048—Indexing scheme relating to G06F3/048
- G06F2203/04806—Zoom, i.e. interaction techniques or interactors for controlling the zooming operation
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- User Interface Of Digital Computer (AREA)
- Position Input By Displaying (AREA)
Description
102 user device
104 target item
201 point
202 user device
207 downward motion
301 point
302 user device
302B item
316 arrow
1500 system
1502 bus
1504 processing component
1506 system memory component
1508 static storage component
1510 disk drive component
1512 network interface component
1514 display component
1516 input component
1518 cursor control component
1520 communication link
Claims (15)
- A method for detecting interactive inputs, comprising: capturing touch input data on a screen of a user device, wherein the touch input data represents a touch performed on the screen of the user device; determining an input command based on the touch input data; affecting an operation of the user device based on the touch input data, wherein the operation affects a display of content on the screen of the user device; capturing non-touch gesture input data, wherein the non-touch gesture input data represents an off-screen gesture performed, while the touch is maintained, in an area offset from the screen of the user device, and wherein the non-touch gesture input data is related to the determined input command; and further affecting the operation of the user device based on the determined input command and the non-touch gesture input data.
- The method of claim 1, wherein the touch input data identifies a target item on the screen of the user device, and the input command comprises an adjustment of an aspect of the target item based on the non-touch gesture input data.
- The method of claim 2, wherein the input command comprises a variable-amount adjustment of the target item, the variable-amount adjustment being determined from the non-touch gesture input data.
- The method of claim 1, wherein capturing the touch input data further comprises: receiving, from a user, a touch on a desired target item to be affected on the screen of the user device; and determining that the target item has been released by detecting a release of the touch on the screen of the user device.
- The method of claim 1, wherein capturing the non-touch gesture input data comprises detecting a position and a movement of an object.
- The method of claim 5, wherein the movement of the object further comprises movement in substantially the same plane as the screen of the user device.
- The method of claim 1, wherein capturing the non-touch gesture input data comprises using one or more sensors adapted to detect an object beyond a surface of the user device by ultrasound technology, image or video capture technology, or infrared technology.
- The method of claim 1, wherein the determined input command comprises one of a plurality of different types of inputs representing a selection of a target item on the screen of the user device.
- The method of claim 8, wherein the different types of inputs further comprise a right mouse click (RMC) generated by a first pose of a hand touching the screen, or a left mouse click (LMC) or an alternative click generated by a second pose of the hand touching the screen, the first pose being different from the second pose.
- The method of claim 1, wherein capturing the non-touch gesture input data comprises capturing a hand pose or a hand motion.
- An apparatus for detecting interactive inputs, comprising: means for capturing touch input data on a screen of a user device, wherein the touch input data represents a touch performed on the screen of the user device; means for determining an input command based on the touch input data; means for affecting an operation of the user device based on the touch input data, wherein the operation affects a display of content on the screen of the user device; means for capturing non-touch gesture input data, wherein the non-touch gesture input data represents an off-screen gesture performed, while the touch is maintained, in an area offset from the screen of the user device, and wherein the non-touch gesture input data is related to the determined input command; and means for further affecting the operation of the user device based on the determined input command and the non-touch gesture input data.
- The apparatus of claim 11, wherein the touch input data identifies a target item on the screen of the user device, and the input command comprises an adjustment of an aspect of the target item based on the non-touch gesture input data.
- The apparatus of claim 12, wherein the input command comprises a variable-amount adjustment of the target item, the variable-amount adjustment being determined from the non-touch gesture input data.
- The apparatus of claim 11, wherein the means for capturing the touch input data comprises means for receiving, from a user, a touch on a desired target item to be affected on the screen of the user device, the apparatus further comprising determining that the target item has been released by detecting a release of the touch on the screen of the user device.
- A computer-readable recording medium storing computer-readable instructions that, when executed by a processor, cause the processor to perform the steps of any one of claims 1 to 10.
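The interaction claimed above can be pictured as simple event-handling logic. The following is a minimal, hypothetical Python sketch of claims 1-4, not an implementation from the patent: the event handlers, the `"adjust"` command mapping, the hit test, and the scaling formula are all illustrative assumptions.

```python
from dataclasses import dataclass

@dataclass
class Item:
    x: float
    y: float
    scale: float = 1.0

@dataclass
class Screen:
    items: list

    def item_at(self, x, y):
        # Nearest-item hit test; a real device would use touch regions.
        return min(self.items, key=lambda i: (i.x - x) ** 2 + (i.y - y) ** 2)

class InteractiveInputDetector:
    """Sketch of claims 1-4: a touch on the screen identifies a target
    item and determines an input command; an off-screen gesture made
    while the touch is held supplies a variable amount for that command;
    releasing the touch releases the target item."""

    def __init__(self, screen):
        self.screen = screen
        self.target = None    # item identified by the current touch
        self.command = None   # input command determined from the touch

    def on_touch(self, x, y, down):
        if down:
            self.target = self.screen.item_at(x, y)
            self.command = "adjust"  # hypothetical command mapping
        else:
            # Claim 4: detecting release of the touch releases the item.
            self.target = None
            self.command = None

    def on_offscreen_gesture(self, delta):
        # Claims 1 and 3: gesture data captured off-screen, while the
        # touch is maintained, further affects the selected item by a
        # variable amount derived from the gesture.
        if self.target is not None and self.command == "adjust":
            self.target.scale *= 1.0 + 0.01 * delta
```

In a real device the `delta` would come from the ultrasound, camera, or infrared sensors of claim 7 rather than being passed in directly; the key point the sketch illustrates is that gesture input only modifies the operation while the touch is maintained.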
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US13/842,891 US9170676B2 (en) | 2013-03-15 | 2013-03-15 | Enhancing touch inputs with gestures |
US13/842,891 | 2013-03-15 | ||
PCT/US2014/023704 WO2014150588A1 (en) | 2013-03-15 | 2014-03-11 | Enhancing touch inputs with gestures |
Publications (3)
Publication Number | Publication Date |
---|---|
JP2016511487A JP2016511487A (ja) | 2016-04-14 |
JP2016511487A5 JP2016511487A5 (ja) | 2017-03-30 |
JP6464138B2 true JP6464138B2 (ja) | 2019-02-06 |
Family
ID=50639925
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
- JP2016501321A Active JP6464138B2 (ja) | 2014-03-11 | Enhancing touch inputs with gestures |
Country Status (6)
Country | Link |
---|---|
US (2) | US9170676B2 (ja) |
EP (1) | EP2972676A1 (ja) |
JP (1) | JP6464138B2 (ja) |
KR (1) | KR102194272B1 (ja) |
CN (1) | CN105009035B (ja) |
WO (1) | WO2014150588A1 (ja) |
Families Citing this family (43)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9152306B2 (en) * | 2011-03-29 | 2015-10-06 | Intel Corporation | Techniques for touch and non-touch user interaction input |
GB2490108B (en) * | 2011-04-13 | 2018-01-17 | Nokia Technologies Oy | A method, apparatus and computer program for user control of a state of an apparatus |
US9792017B1 (en) | 2011-07-12 | 2017-10-17 | Domo, Inc. | Automatic creation of drill paths |
US9202297B1 (en) * | 2011-07-12 | 2015-12-01 | Domo, Inc. | Dynamic expansion of data visualizations |
US9170676B2 (en) | 2013-03-15 | 2015-10-27 | Qualcomm Incorporated | Enhancing touch inputs with gestures |
US20140267142A1 (en) * | 2013-03-15 | 2014-09-18 | Qualcomm Incorporated | Extending interactive inputs via sensor fusion |
US10481769B2 (en) * | 2013-06-09 | 2019-11-19 | Apple Inc. | Device, method, and graphical user interface for providing navigation and search functionalities |
US20150062056A1 (en) * | 2013-08-30 | 2015-03-05 | Kobo Incorporated | 3d gesture recognition for operating an electronic personal display |
US20150077345A1 (en) * | 2013-09-16 | 2015-03-19 | Microsoft Corporation | Simultaneous Hover and Touch Interface |
US20150091841A1 (en) * | 2013-09-30 | 2015-04-02 | Kobo Incorporated | Multi-part gesture for operating an electronic personal display |
US10048762B2 (en) | 2013-11-05 | 2018-08-14 | Intuit Inc. | Remote control of a desktop application via a mobile device |
US9965170B2 (en) * | 2013-11-11 | 2018-05-08 | Lenovo (Singapore) Pte. Ltd. | Multi-touch inputs for input interface control |
US10855911B2 (en) * | 2014-01-15 | 2020-12-01 | Samsung Electronics Co., Ltd | Method for setting image capture conditions and electronic device performing the same |
US9607139B1 (en) * | 2014-03-27 | 2017-03-28 | EMC IP Holding Company LLC | Map-based authentication |
CN103941873B (zh) * | 2014-04-30 | 2017-05-10 | 北京智谷睿拓技术服务有限公司 | Recognition method and device |
KR102251541B1 (ko) * | 2014-06-23 | 2021-05-14 | 엘지전자 주식회사 | Mobile terminal and control method thereof |
TWI549504B (zh) * | 2014-08-11 | 2016-09-11 | 宏碁股份有限公司 | Image capture device and auto-focus compensation method thereof |
US10015402B2 (en) * | 2014-09-08 | 2018-07-03 | Nintendo Co., Ltd. | Electronic apparatus |
JP6684042B2 (ja) * | 2014-09-08 | 2020-04-22 | 任天堂株式会社 | Electronic apparatus |
JP6519074B2 (ja) * | 2014-09-08 | 2019-05-29 | 任天堂株式会社 | Electronic apparatus |
US20160139628A1 (en) * | 2014-11-13 | 2016-05-19 | Li Bao | User Programable Touch and Motion Controller |
CN104536766B (zh) * | 2015-01-09 | 2018-01-26 | 京东方科技集团股份有限公司 | Control method for an electronic device, and electronic device |
JP6539816B2 (ja) * | 2015-01-30 | 2019-07-10 | Sony Depthsensing Solutions SA/NV | Multi-modal gesture-based interactive system and method using a single sensing system |
JP6519075B2 (ja) * | 2015-02-10 | 2019-05-29 | 任天堂株式会社 | Information processing apparatus, information processing program, information processing system, and information processing method |
JP6573457B2 (ja) * | 2015-02-10 | 2019-09-11 | 任天堂株式会社 | Information processing system |
US9946395B2 (en) * | 2015-02-16 | 2018-04-17 | Samsung Electronics Co., Ltd. | User interface method and apparatus |
CN105988695B (zh) * | 2015-02-16 | 2019-06-28 | 北京三星通信技术研究有限公司 | Smart device and operation response method thereof |
CN105100409A (zh) * | 2015-05-26 | 2015-11-25 | 努比亚技术有限公司 | Method and apparatus for controlling a mobile terminal |
US10444819B2 (en) * | 2015-06-19 | 2019-10-15 | Intel Corporation | Techniques to control computational resources for an electronic device |
US10488975B2 (en) * | 2015-12-23 | 2019-11-26 | Intel Corporation | Touch gesture detection assessment |
CN105867764A (zh) * | 2016-03-25 | 2016-08-17 | 乐视控股(北京)有限公司 | Multimedia adjustment method, apparatus, and mobile device |
KR20170138279A (ko) * | 2016-06-07 | 2017-12-15 | 엘지전자 주식회사 | Mobile terminal and control method thereof |
CN106648102A (zh) * | 2016-12-26 | 2017-05-10 | 珠海市魅族科技有限公司 | Method and system for controlling a terminal device through non-touch gestures |
US10635291B2 (en) | 2017-02-20 | 2020-04-28 | Microsoft Technology Licensing, Llc | Thumb and pen interaction on a mobile device |
KR102316024B1 (ko) * | 2017-03-02 | 2021-10-26 | 삼성전자주식회사 | Display apparatus and method for displaying a user interface of the display apparatus |
CN109697043B (zh) * | 2017-10-20 | 2020-05-22 | 北京仁光科技有限公司 | Multi-screen interactive display method, apparatus, system, and computer-readable storage medium |
WO2019126290A1 (en) * | 2017-12-20 | 2019-06-27 | Hubbell Incorporated | Gesture control for in-wall device |
FR3080688A1 (fr) * | 2018-04-26 | 2019-11-01 | Stmicroelectronics Sa | Motion detection device |
CN109493821B (zh) * | 2018-12-20 | 2021-07-06 | 惠州Tcl移动通信有限公司 | Screen brightness adjustment method, apparatus, and storage medium |
US11930439B2 (en) | 2019-01-09 | 2024-03-12 | Margo Networks Private Limited | Network control and optimization (NCO) system and method |
US11442621B2 (en) * | 2019-10-01 | 2022-09-13 | Microsoft Technology Licensing, Llc | Extensions to global keyboard shortcuts for computing devices having multiple display regions |
US11782522B1 (en) * | 2022-03-25 | 2023-10-10 | Huawei Technologies Co., Ltd. | Methods and systems for multimodal hand state prediction |
WO2023224680A1 (en) | 2022-05-18 | 2023-11-23 | Margo Networks Pvt. Ltd. | Peer to peer (p2p) encrypted data transfer/offload system and method |
Family Cites Families (19)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9292111B2 (en) | 1998-01-26 | 2016-03-22 | Apple Inc. | Gesturing with a multipoint sensing device |
US20050052427A1 (en) | 2003-09-10 | 2005-03-10 | Wu Michael Chi Hung | Hand gesture interaction with touch surface |
JP2009042796A (ja) * | 2005-11-25 | 2009-02-26 | Panasonic Corp | Gesture input device and method |
US8432372B2 (en) * | 2007-11-30 | 2013-04-30 | Microsoft Corporation | User input using proximity sensing |
JP4609543B2 (ja) * | 2008-07-25 | 2011-01-12 | ソニー株式会社 | Information processing apparatus and information processing method |
US8433138B2 (en) * | 2008-10-29 | 2013-04-30 | Nokia Corporation | Interaction using touch and non-touch gestures |
KR101593727B1 (ko) | 2008-12-29 | 2016-02-15 | Hewlett-Packard Development Company, L.P. | Gesture detection system, method, and computer-readable medium |
JP2011008424A (ja) * | 2009-06-24 | 2011-01-13 | Sharp Corp | Electronic device, operation mode setting method, and program |
KR101633332B1 (ko) * | 2009-09-30 | 2016-06-24 | 엘지전자 주식회사 | Terminal and control method thereof |
US8633902B2 (en) * | 2009-11-23 | 2014-01-21 | Microsoft Corporation | Touch input for hosted applications |
US20110209098A1 (en) * | 2010-02-19 | 2011-08-25 | Hinckley Kenneth P | On and Off-Screen Gesture Combinations |
CA2735325C (en) | 2010-03-25 | 2015-01-20 | User Interface In Sweden Ab | System and method for gesture detection and feedback |
JP5658500B2 (ja) * | 2010-07-26 | 2015-01-28 | キヤノン株式会社 | Information processing apparatus and control method therefor |
US8890818B2 (en) | 2010-09-22 | 2014-11-18 | Nokia Corporation | Apparatus and method for proximity based input |
WO2012048007A2 (en) | 2010-10-05 | 2012-04-12 | Citrix Systems, Inc. | Touch support for remoted applications |
US9423876B2 (en) * | 2011-09-30 | 2016-08-23 | Microsoft Technology Licensing, Llc | Omni-spatial gesture input |
US20130257753A1 (en) * | 2012-04-03 | 2013-10-03 | Anirudh Sharma | Modeling Actions Based on Speech and Touch Inputs |
CN102681774B (zh) * | 2012-04-06 | 2015-02-18 | 优视科技有限公司 | Method, apparatus, and mobile terminal for controlling an application interface through gestures |
US9170676B2 (en) | 2013-03-15 | 2015-10-27 | Qualcomm Incorporated | Enhancing touch inputs with gestures |
-
2013
- 2013-03-15 US US13/842,891 patent/US9170676B2/en active Active
-
2014
- 2014-03-11 KR KR1020157028176A patent/KR102194272B1/ko active IP Right Grant
- 2014-03-11 WO PCT/US2014/023704 patent/WO2014150588A1/en active Application Filing
- 2014-03-11 JP JP2016501321A patent/JP6464138B2/ja active Active
- 2014-03-11 EP EP14721594.1A patent/EP2972676A1/en not_active Withdrawn
- 2014-03-11 CN CN201480013281.2A patent/CN105009035B/zh active Active
-
2015
- 2015-09-24 US US14/864,567 patent/US9360965B2/en active Active
Also Published As
Publication number | Publication date |
---|---|
CN105009035A (zh) | 2015-10-28 |
US20160011718A1 (en) | 2016-01-14 |
CN105009035B (zh) | 2017-11-14 |
US20140267084A1 (en) | 2014-09-18 |
US9360965B2 (en) | 2016-06-07 |
WO2014150588A1 (en) | 2014-09-25 |
US9170676B2 (en) | 2015-10-27 |
EP2972676A1 (en) | 2016-01-20 |
KR102194272B1 (ko) | 2020-12-22 |
JP2016511487A (ja) | 2016-04-14 |
KR20150130431A (ko) | 2015-11-23 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6464138B2 (ja) | Enhancing touch inputs with gestures | |
US11599154B2 (en) | Adaptive enclosure for a mobile computing device | |
DK179350B1 (en) | Device, Method, and Graphical User Interface for Navigating Media Content | |
US9360962B2 (en) | Electronic apparatus and a method for controlling the same | |
US20140267142A1 (en) | Extending interactive inputs via sensor fusion | |
US20130222329A1 (en) | Graphical user interface interaction on a touch-sensitive device | |
KR101608423B1 (ko) | Full 3D interaction on a mobile device | |
US8775958B2 (en) | Assigning Z-order to user interface elements | |
US20110199387A1 (en) | Activating Features on an Imaging Device Based on Manipulations | |
DK201670755A1 (en) | User Interface for Camera Effects | |
KR20150107528A (ko) | Method for providing a user interface, and electronic device | |
KR20160060109A (ko) | Presentation of a control interface on a touch-based device based on motion or the absence thereof | |
DK201670753A1 (en) | User Interface for Camera Effects | |
WO2014024396A1 (en) | Information processing apparatus, information processing method, and computer program | |
US20180260044A1 (en) | Information processing apparatus, information processing method, and program | |
US10656746B2 (en) | Information processing device, information processing method, and program | |
WO2016183912A1 (zh) | Menu layout method and apparatus | |
US10599326B2 (en) | Eye motion and touchscreen gestures | |
EP2750016A1 (en) | Method of operating a graphical user interface and graphical user interface | |
JP7091159B2 (ja) | Electronic device and control method therefor | |
Procházka et al. | Mainstreaming gesture based interfaces | |
US20160342280A1 (en) | Information processing apparatus, information processing method, and program | |
TW201523428A (zh) | Interface operation method and portable electronic device applying the method | |
WO2014166044A1 (en) | Method and device for user input | |
JP2016167171A (ja) | Electronic apparatus | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
A521 | Request for written amendment filed |
Free format text: JAPANESE INTERMEDIATE CODE: A523 Effective date: 20170222 |
|
A621 | Written request for application examination |
Free format text: JAPANESE INTERMEDIATE CODE: A621 Effective date: 20170222 |
|
A131 | Notification of reasons for refusal |
Free format text: JAPANESE INTERMEDIATE CODE: A131 Effective date: 20180115 |
|
A521 | Request for written amendment filed |
Free format text: JAPANESE INTERMEDIATE CODE: A523 Effective date: 20180302 |
|
A131 | Notification of reasons for refusal |
Free format text: JAPANESE INTERMEDIATE CODE: A131 Effective date: 20180611 |
|
A521 | Request for written amendment filed |
Free format text: JAPANESE INTERMEDIATE CODE: A523 Effective date: 20180831 |
|
TRDD | Decision of grant or rejection written | ||
A01 | Written decision to grant a patent or to grant a registration (utility model) |
Free format text: JAPANESE INTERMEDIATE CODE: A01 Effective date: 20181210 |
|
A61 | First payment of annual fees (during grant procedure) |
Free format text: JAPANESE INTERMEDIATE CODE: A61 Effective date: 20190107 |
|
R150 | Certificate of patent or registration of utility model |
Ref document number: 6464138 Country of ref document: JP Free format text: JAPANESE INTERMEDIATE CODE: R150 |
|
R250 | Receipt of annual fees |
Free format text: JAPANESE INTERMEDIATE CODE: R250 |
|
R250 | Receipt of annual fees |
Free format text: JAPANESE INTERMEDIATE CODE: R250 |
|
R250 | Receipt of annual fees |
Free format text: JAPANESE INTERMEDIATE CODE: R250 |