WO2013192454A2 - Fingertip location for gesture input - Google Patents

Fingertip location for gesture input

Info

Publication number
WO2013192454A2
Authority
WO
WIPO (PCT)
Prior art keywords
fingertip
computing device
finger
user
location
Prior art date
Application number
PCT/US2013/046906
Other languages
English (en)
French (fr)
Other versions
WO2013192454A3 (en)
Inventor
Kenneth M. Karakotsios
Isaac S. Noble
Dong Zhou
Original Assignee
Amazon Technologies, Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Amazon Technologies, Inc.
Priority to JP2015518581A (JP6072237B2)
Priority to EP13806783.0A (EP2864932B1)
Priority to CN201380033135.1A (CN104662558B)
Publication of WO2013192454A2
Publication of WO2013192454A3

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/0416 Control or interface arrangements specially adapted for digitisers
    • G06F3/0418 Control or interface arrangements specially adapted for digitisers for error correction or compensation, e.g. based on parallax, calibration or alignment
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/0416 Control or interface arrangements specially adapted for digitisers
    • G06F3/0418 Control or interface arrangements specially adapted for digitisers for error correction or compensation, e.g. based on parallax, calibration or alignment
    • G06F3/04186 Touch location disambiguation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/042 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
    • G06F3/0425 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means using a single imaging device like a video camera for tracking the absolute position of a single or a plurality of objects with respect to an imaged reference surface, e.g. video camera imaging a display or a projection screen, a table or a wall surface, on which a computer generated image is displayed or projected
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00 Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/041 Indexing scheme relating to G06F3/041 - G06F3/045
    • G06F2203/04101 2.5D-digitiser, i.e. digitiser detecting the X/Y position of the input means, finger or stylus, also when it does not touch, but is proximate to the digitiser's interaction surface and also measures the distance of the input means within a short range in the Z direction, possibly with a separate measurement setup
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00 Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/041 Indexing scheme relating to G06F3/041 - G06F3/045
    • G06F2203/04106 Multi-sensing digitiser, i.e. digitiser using at least two different sensing technologies simultaneously or alternatively, e.g. for detecting pen and finger, for saving power or for improving position detection
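The G06F3/0418 and G06F2203/04101 classes above describe hover-capable ("2.5D") digitisers that report a fingertip's X/Y position together with its distance above the surface, and error compensation such as parallax correction. As a rough illustration only, the following Python sketch shows one way such a reading could be mapped to an on-screen target by projecting the hovering fingertip onto the screen plane along an estimated eye-to-fingertip line; the HoverSample type, the sensor units, and the eye-position estimate are all assumptions for the example, not details taken from this publication.

```python
# Illustrative sketch only -- not the method claimed in this publication.
# Assumes a hypothetical "2.5D" sensor reading (x, y, z): fingertip X/Y over the
# screen plus hover height z above the surface (cf. G06F2203/04101), and a rough
# estimate of the user's eye position, applying a simple parallax correction of
# the kind alluded to by G06F3/0418.

from dataclasses import dataclass


@dataclass
class HoverSample:
    x: float  # fingertip X over the screen, in millimetres
    y: float  # fingertip Y over the screen, in millimetres
    z: float  # hover height above the screen surface, in millimetres


def parallax_corrected_target(sample: HoverSample,
                              eye_x: float, eye_y: float,
                              eye_z: float) -> tuple[float, float]:
    """Project the hovering fingertip onto the screen plane (z = 0) along the
    line from the estimated eye position through the fingertip, so the reported
    target matches where the user perceives the finger to point."""
    if sample.z <= 0 or eye_z <= sample.z:
        # Finger is touching the surface (or the geometry is degenerate):
        # fall back to the raw sensor position.
        return sample.x, sample.y
    # Parameter t where the eye-to-fingertip ray meets the screen plane z = 0.
    t = eye_z / (eye_z - sample.z)
    px = eye_x + t * (sample.x - eye_x)
    py = eye_y + t * (sample.y - eye_y)
    return px, py


if __name__ == "__main__":
    # Fingertip hovering 20 mm above the screen, eye roughly 300 mm away and offset.
    print(parallax_corrected_target(HoverSample(50.0, 80.0, 20.0), 0.0, 150.0, 300.0))
```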

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • User Interface Of Digital Computer (AREA)
  • Position Input By Displaying (AREA)

Priority Applications (3)

Application Number Priority Date Filing Date Title
JP2015518581A JP6072237B2 (ja) 2012-06-20 2013-06-20 Fingertip location for gesture input
EP13806783.0A EP2864932B1 (en) 2012-06-20 2013-06-20 Fingertip location for gesture input
CN201380033135.1A CN104662558B (zh) 2012-06-20 2013-06-20 Fingertip positioning for gesture input

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US13/528,522 2012-06-20
US13/528,522 US9213436B2 (en) 2012-06-20 2012-06-20 Fingertip location for gesture input

Publications (2)

Publication Number Publication Date
WO2013192454A2 true WO2013192454A2 (en) 2013-12-27
WO2013192454A3 WO2013192454A3 (en) 2014-01-30

Family

ID=49769708

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2013/046906 WO2013192454A2 (en) 2012-06-20 2013-06-20 Fingertip location for gesture input

Country Status (5)

Country Link
US (1) US9213436B2 (en)
EP (1) EP2864932B1 (en)
JP (1) JP6072237B2 (ja)
CN (1) CN104662558B (zh)
WO (1) WO2013192454A2 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2015102974A1 (en) * 2014-01-03 2015-07-09 Microsoft Technology Licensing, Llc Hangle-based hover input method
EP3242190A1 (en) * 2016-05-06 2017-11-08 Advanced Silicon SA System, method and computer program for detecting an object approaching and touching a capacitive touch device

Families Citing this family (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140123077A1 (en) * 2012-10-29 2014-05-01 Intel Corporation System and method for user interaction and control of electronic devices
EP2749996B1 (en) * 2012-12-28 2018-05-30 Sony Mobile Communications Inc. Electronic device and method for improving accuracy of location determination of a user input on a touch panel
US20140282269A1 (en) * 2013-03-13 2014-09-18 Amazon Technologies, Inc. Non-occluded display for hover interactions
US9110541B1 (en) * 2013-03-14 2015-08-18 Amazon Technologies, Inc. Interface selection approaches for multi-dimensional input
US9122916B2 (en) 2013-03-14 2015-09-01 Honda Motor Co., Ltd. Three dimensional fingertip tracking
US20150002275A1 (en) * 2013-06-26 2015-01-01 Nokia Corporation Methods, apparatuses, and computer program products for data transfer between wireless memory tags
US10152136B2 (en) * 2013-10-16 2018-12-11 Leap Motion, Inc. Velocity field interaction for free space gesture interface and control
US10126822B2 (en) 2013-12-16 2018-11-13 Leap Motion, Inc. User-defined virtual interaction space and manipulation of virtual configuration
DE102013022123A1 (de) * 2013-12-28 2015-07-02 Lutz Herkner Verfahren zur einfacheren Bedienung von Geräten mit Touchscreen, insbesondere bei der Bedienung mit einer Hand ("Magnetischer Daumen")
WO2015102658A1 (en) * 2014-01-03 2015-07-09 Intel Corporation Systems and techniques for user interface control
DE102014005064A1 (de) * 2014-04-05 2015-10-08 Audi Ag Eingabesystem und Verfahren zum Bereitstellen einer Eingabeinformation
US10684707B2 (en) 2014-06-25 2020-06-16 Sony Corporation Display control device, display control method, and program
CN105635776B (zh) * 2014-11-06 2019-03-01 Shenzhen TCL New Technology Co., Ltd. Remote control method and system for a virtual operation interface
JP2016091457A (ja) * 2014-11-10 2016-05-23 Fujitsu Limited Input device, fingertip position detection method, and computer program for fingertip position detection
US10564770B1 (en) * 2015-06-09 2020-02-18 Apple Inc. Predictive touch detection
US20180239509A1 (en) * 2017-02-20 2018-08-23 Microsoft Technology Licensing, Llc Pre-interaction context associated with gesture and touch interactions
US10824293B2 (en) * 2017-05-08 2020-11-03 International Business Machines Corporation Finger direction based holographic object interaction from a distance
US10860088B2 (en) * 2018-05-03 2020-12-08 Microsoft Technology Licensing, Llc Method and system for initiating application and system modal control based on hand locations
US11875012B2 (en) 2018-05-25 2024-01-16 Ultrahaptics IP Two Limited Throwable interface for augmented reality and virtual reality environments
US11132060B2 (en) * 2018-12-04 2021-09-28 International Business Machines Corporation Collaborative interactions and feedback with midair interfaces
FR3124872A1 (fr) * 2021-07-02 2023-01-06 Faurecia Interieur Industrie Dispositif électronique et procédé d'affichage de données sur un écran d’affichage, système d’affichage, véhicule et programme d’ordinateur associés

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2284655A2 (en) 2009-07-27 2011-02-16 Samsung Electronics Co., Ltd. Method and apparatus for controlling electronic device using user interaction

Family Cites Families (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7170492B2 (en) * 2002-05-28 2007-01-30 Reactrix Systems, Inc. Interactive video display system
US7411575B2 (en) * 2003-09-16 2008-08-12 Smart Technologies Ulc Gesture recognition method and touch system incorporating the same
WO2006086508A2 (en) * 2005-02-08 2006-08-17 Oblong Industries, Inc. System and method for gesture based control system
US8614669B2 (en) 2006-03-13 2013-12-24 Navisense Touchless tablet method and system thereof
US8180114B2 (en) * 2006-07-13 2012-05-15 Northrop Grumman Systems Corporation Gesture recognition interface system with vertical display
JP4390002B2 (ja) * 2007-03-16 2009-12-24 Sony Corporation Display device and control method thereof
CN101689244B (zh) 2007-05-04 2015-07-22 Qualcomm Incorporated Camera-based user input for compact devices
JP4991458B2 (ja) 2007-09-04 2012-08-01 Canon Inc. Image display device and control method thereof
US8432372B2 (en) 2007-11-30 2013-04-30 Microsoft Corporation User input using proximity sensing
CN102165396B (zh) * 2008-07-25 2014-10-29 Qualcomm Incorporated Enhanced detection of waving engagement gesture
CA2737251A1 (en) 2008-09-15 2010-03-18 Smart Technologies Ulc Touch input with image sensor and signal processor
KR101537596B1 (ko) 2008-10-15 2015-07-20 LG Electronics Inc. Mobile terminal and touch recognition method thereof
CN202142005U (zh) * 2009-07-22 2012-02-08 Logitech Europe S.A. System for remote, virtual on-screen input
CN101719015B (zh) * 2009-11-03 2011-08-31 Shanghai University Fingertip locating method for pointing gestures
US8593402B2 (en) * 2010-04-30 2013-11-26 Verizon Patent And Licensing Inc. Spatial-input-based cursor projection systems and methods
JP2012003724A (ja) * 2010-06-21 2012-01-05 Nippon Telegr & Teleph Corp <Ntt> Three-dimensional fingertip position detection method, three-dimensional fingertip position detection device, and program
JP2012048393A (ja) * 2010-08-25 2012-03-08 Canon Inc Information processing apparatus and operation method thereof
EP2628069B1 (en) * 2010-10-12 2020-12-02 New York University Apparatus for sensing utilizing tiles, sensor having a set of plates, object identification for multi-touch surfaces, and method
US20120314899A1 (en) * 2011-06-13 2012-12-13 Microsoft Corporation Natural user interfaces for mobile image viewing
US10684768B2 (en) 2011-10-14 2020-06-16 Autodesk, Inc. Enhanced target selection for a touch-based input enabled user interface

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2284655A2 (en) 2009-07-27 2011-02-16 Samsung Electronics Co., Ltd. Method and apparatus for controlling electronic device using user interaction

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2015102974A1 (en) * 2014-01-03 2015-07-09 Microsoft Technology Licensing, Llc Hangle-based hover input method
US9262012B2 (en) 2014-01-03 2016-02-16 Microsoft Corporation Hover angle
EP3242190A1 (en) * 2016-05-06 2017-11-08 Advanced Silicon SA System, method and computer program for detecting an object approaching and touching a capacitive touch device
US10139962B2 (en) 2016-05-06 2018-11-27 Advanced Silicon Sa System, method and computer program for detecting an object approaching and touching a capacitive touch device

Also Published As

Publication number Publication date
EP2864932B1 (en) 2020-08-12
EP2864932A2 (en) 2015-04-29
US9213436B2 (en) 2015-12-15
WO2013192454A3 (en) 2014-01-30
JP2015520471A (ja) 2015-07-16
CN104662558B (zh) 2018-04-10
EP2864932A4 (en) 2016-02-17
US20130342459A1 (en) 2013-12-26
JP6072237B2 (ja) 2017-02-01
CN104662558A (zh) 2015-05-27

Similar Documents

Publication Publication Date Title
US9213436B2 (en) Fingertip location for gesture input
US11175726B2 (en) Gesture actions for interface elements
US9910505B2 (en) Motion control for managing content
JP6605000B2 (ja) Approach for three-dimensional object display
US9378581B2 (en) Approaches for highlighting active interface elements
US9696859B1 (en) Detecting tap-based user input on a mobile device based on motion sensor data
US9591295B2 (en) Approaches for simulating three-dimensional views
US9268407B1 (en) Interface elements for managing gesture control
EP2707835B1 (en) Using spatial information with device interaction
US10037614B2 (en) Minimizing variations in camera height to estimate distance to objects
US9262067B1 (en) Approaches for displaying alternate views of information
US9400575B1 (en) Finger detection for element selection
US9075514B1 (en) Interface selection element display
US9303982B1 (en) Determining object depth information using image data
US10019140B1 (en) One-handed zoom
US9110541B1 (en) Interface selection approaches for multi-dimensional input
US9471154B1 (en) Determining which hand is holding a device
US9377866B1 (en) Depth-based position mapping
US9041689B1 (en) Estimating fingertip position using image analysis
US9350918B1 (en) Gesture control for managing an image view display
US10082936B1 (en) Handedness determinations for electronic devices
US9898183B1 (en) Motions for object rendering and selection
US9690384B1 (en) Fingertip location determinations for gesture input

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 13806783

Country of ref document: EP

Kind code of ref document: A2

ENP Entry into the national phase

Ref document number: 2015518581

Country of ref document: JP

Kind code of ref document: A

WWE Wipo information: entry into national phase

Ref document number: 2013806783

Country of ref document: EP
