JP6353160B2 - Modal body touch using ultrasound - Google Patents
Modal body touch using ultrasound
- Publication number
- JP6353160B2 (application JP2017508617A)
- Authority
- JP
- Japan
- Prior art keywords
- user
- input
- ultrasonic
- state
- ultrasound
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Expired - Fee Related
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G04—HOROLOGY
- G04G—ELECTRONIC TIME-PIECES
- G04G21/00—Input or output devices integrated in time-pieces
-
- G—PHYSICS
- G04—HOROLOGY
- G04G—ELECTRONIC TIME-PIECES
- G04G21/00—Input or output devices integrated in time-pieces
- G04G21/08—Touch switches specially adapted for time-pieces
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/163—Wearable computers, e.g. on a belt
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/043—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means using propagating acoustic waves
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/41—Structure of client; Structure of client peripherals
- H04N21/422—Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/041—Indexing scheme relating to G06F3/041 - G06F3/045
- G06F2203/04108—Touchless 2D- digitiser, i.e. digitiser detecting the X/Y position of the input means, finger or stylus, also when it does not touch, but is proximate to the digitiser's interaction surface without distance measurement in the Z direction
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/048—Indexing scheme relating to G06F3/048
- G06F2203/04808—Several contacts: gestures triggering a specific function, e.g. scrolling, zooming, right-click, when the user establishes several contacts with the surface simultaneously; e.g. using several fingers or a combination of fingers and pen
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/41—Structure of client; Structure of client peripherals
- H04N21/422—Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
- H04N21/42201—Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS] biosensors, e.g. heat sensor for presence detection, EEG sensors or any limb activity sensors worn by the user
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- Computer Hardware Design (AREA)
- Acoustics & Sound (AREA)
- Signal Processing (AREA)
- Multimedia (AREA)
- User Interface Of Digital Computer (AREA)
- Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Chemical & Material Sciences (AREA)
- Analytical Chemistry (AREA)
- Biomedical Technology (AREA)
- Biophysics (AREA)
- General Health & Medical Sciences (AREA)
- Neurosurgery (AREA)
Claims (13)
- Claim 1. A method comprising: identifying, using data from a sensor, a particular state of a user's limb that indicates activation of ultrasonic sensing; detecting, by a device wearable by the user, an ultrasonic event, including receiving an ultrasonic signal that is propagated transdermally over the user's body and that results from an on-body touch in the propagated region; determining the state of the limb based on the data from the sensor; and selecting, by the device, an input based on the reception of the ultrasonic signal and the state of the limb, wherein the sensor includes an accelerometer that detects muscle tremor of the user, and determining the state of the limb includes determining, based on the muscle tremor data received from the accelerometer, whether the limb is in a contracted state or a relaxed state.
- Claim 2. The method of claim 1, further comprising: transmitting the input to a second device; and performing, via the second device, an action specified by the input.
- Claim 3. The method of claim 1, wherein the input is one of the inputs that can be made via a mouse, a keyboard, or a touch display.
- Claim 4. The method of claim 1, further comprising: analyzing characteristics of the received ultrasonic signal; and identifying a type of on-body touch based on the analyzing, wherein the characteristics include at least one of frequency, amplitude, or propagation speed.
- Claim 5. The method of claim 1, wherein the input is application-specific.
- Claim 6. The method of claim 1, wherein the on-body touch is a tap or a slide gesture.
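The tremor-based limb-state determination in claim 1 can be illustrated with a short sketch. This is a minimal illustration under stated assumptions, not the patented implementation: the tremor band, sampling rate, and power threshold are hypothetical values, not taken from the patent.

```python
import numpy as np

# Assumed physiological tremor band; the sketch assumes contraction raises tremor
# power, matching the claim's contracted-vs-relaxed distinction. Values are not
# from the patent.
TREMOR_BAND_HZ = (8.0, 12.0)
POWER_THRESHOLD = 1e-3  # hypothetical decision threshold

def limb_state(accel, fs=100.0):
    """Classify the limb as 'contracted' or 'relaxed' from accelerometer tremor power."""
    accel = np.asarray(accel, dtype=float)
    # Power spectrum of the mean-removed acceleration trace
    spectrum = np.abs(np.fft.rfft(accel - accel.mean())) ** 2
    freqs = np.fft.rfftfreq(len(accel), d=1.0 / fs)
    band = (freqs >= TREMOR_BAND_HZ[0]) & (freqs <= TREMOR_BAND_HZ[1])
    band_power = spectrum[band].sum() / len(accel)
    return "contracted" if band_power > POWER_THRESHOLD else "relaxed"

# Demo with synthetic samples: a 10 Hz tremor versus a quiet baseline.
t = np.arange(200) / 100.0
tremor = 0.1 * np.sin(2 * np.pi * 10.0 * t)
quiet = np.zeros_like(t)
```

In this sketch, the same accelerometer reading serves double duty, as in the claims: it both gates activation of ultrasonic sensing and modulates which input a given touch selects.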
- Claim 7. A device comprising: an ultrasonic transducer; an accelerometer; a memory storing software; and a processor, wherein the processor executes the software to: identify, using data from the accelerometer, a particular state of a limb of the user wearing the device that indicates activation of ultrasonic sensing; detect, via the ultrasonic transducer, an ultrasonic event, including receiving an ultrasonic signal that is propagated transdermally over the user's body and that results from an on-body touch in the propagated region; detect the state of the limb and a degree of contraction; and select an input based on the reception of the ultrasonic signal, the state of the limb, and the degree of contraction; and wherein the processor executes the software to determine, based on the muscle tremor of the user detected by the accelerometer, whether the limb is in a contracted state or a relaxed state during the on-body touch.
- Claim 8. The device of claim 7, further comprising a communication interface, wherein the processor further executes the software to transmit the input to another device via the communication interface.
- Claim 9. The device of claim 7, wherein the input is one of the inputs that can be made via a mouse, a keyboard, or a touch display.
- Claim 10. The device of claim 7, wherein the processor further executes the software to: analyze characteristics of the received ultrasonic signal; and identify the type of the on-body touch based on the analysis of the characteristics, wherein the characteristics include at least one of frequency, amplitude, or propagation speed.
- Claim 11. The device of claim 7, wherein the processor further executes the software to: store a database including profiles of the ultrasonic signals, limb states, and inputs; and select the input using the database.
- Claim 12. The device of claim 7, further comprising a machine learning module that enables the user to train the device to recognize particular on-body touch events performed by the user and to select inputs corresponding to those on-body touch events.
- Claim 13. A non-transitory storage medium storing instructions executable by a processor of a computing device worn by a user, the instructions, when executed, causing the computing device to: identify, using data from sensors including an accelerometer, a state of a limb of the user on which the computing device is worn that indicates activation of ultrasonic sensing; detect, via an ultrasonic transducer, an ultrasonic event, including receiving an ultrasonic signal that is propagated transdermally over the user's body and that results from an on-body touch in the propagated region; determine, based on the muscle tremor of the user detected by the accelerometer, whether the limb is in a contracted state or a relaxed state during the on-body touch; detect the state of the limb of the user on which the computing device is worn; and select an input based on the reception of the ultrasonic signal and the state of the limb.
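Across the method, device, and medium claims, the selection step maps characteristics of the received ultrasonic signal (claims 4 and 10) together with the limb state to an input, optionally via a stored database (claim 11). A hedged sketch follows, assuming a hypothetical two-gesture vocabulary, amplitude threshold, and input mapping; none of these names or values come from the patent.

```python
import numpy as np

def touch_characteristics(signal, fs=1_000_000.0):
    """Extract two of the characteristics named in claims 4/10: amplitude and
    dominant frequency. (Propagation speed would need a second transducer and
    is omitted from this sketch.)"""
    signal = np.asarray(signal, dtype=float)
    amplitude = float(np.abs(signal).max())
    spectrum = np.abs(np.fft.rfft(signal))
    dominant_hz = float(np.fft.rfftfreq(len(signal), d=1.0 / fs)[int(np.argmax(spectrum))])
    return amplitude, dominant_hz

def touch_type(signal):
    # Hypothetical rule: a tap is a short, strong transient; a slide is weaker.
    amplitude, _ = touch_characteristics(signal)
    return "tap" if amplitude > 0.5 else "slide"

# Hypothetical database of (touch type, limb state) -> input, standing in for
# the signal-profile/limb-state/input database of claim 11.
INPUT_DB = {
    ("tap", "relaxed"): "left_click",
    ("tap", "contracted"): "right_click",
    ("slide", "relaxed"): "scroll",
    ("slide", "contracted"): "zoom",
}

def select_input(signal, limb_state):
    """Select an input from the ultrasonic signal and the limb state (claim 1's final step)."""
    return INPUT_DB[(touch_type(signal), limb_state)]
```

The limb state acts as a mode switch: the same on-body tap selects a different input depending on whether the limb is contracted or relaxed, which is what makes the touch "modal". Claim 12's machine learning module could replace the fixed `INPUT_DB` with a per-user trained classifier.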
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/461,621 | 2014-08-18 | ||
US14/461,621 US9389733B2 (en) | 2014-08-18 | 2014-08-18 | Modal body touch using ultrasound |
PCT/US2015/037332 WO2016028385A1 (en) | 2014-08-18 | 2015-06-24 | Modal body touch using ultrasound |
Publications (2)
Publication Number | Publication Date |
---|---|
JP2017530443A (ja) | 2017-10-12 |
JP6353160B2 (ja) | 2018-07-04 |
Family
ID=53541932
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
JP2017508617A JP6353160B2 (ja) Expired - Fee Related | Modal body touch using ultrasound | 2014-08-18 | 2015-06-24 |
Country Status (6)
Country | Link |
---|---|
US (1) | US9389733B2 (ja) |
EP (1) | EP3183625B1 (ja) |
JP (1) | JP6353160B2 (ja) |
KR (1) | KR20170042693A (ja) |
CN (1) | CN106662898B (ja) |
WO (1) | WO2016028385A1 (ja) |
Families Citing this family (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP3523782A4 (en) * | 2016-10-05 | 2020-06-24 | Magic Leap, Inc. | PERIOCULAR TEST FOR MIXED REALITY CALIBRATION |
EP3533161B1 (en) * | 2016-10-25 | 2020-07-29 | Telefonaktiebolaget LM Ericsson (PUBL) | User-worn device and touch-device for ultrasonic data transmission |
US10481699B2 (en) | 2017-07-27 | 2019-11-19 | Facebook Technologies, Llc | Armband for tracking hand motion using electrical impedance measurement |
TW201928604 (zh) * | 2017-12-22 | 2019-07-16 | Butterfly Network, Inc. | Methods and apparatuses for identifying gestures based on ultrasound data |
CN111936959 (zh) * | 2018-03-23 | 2020-11-13 | Facebook Technologies, LLC | Method, device, and system for displaying a user interface on a user and detecting touch gestures |
WO2020023542A1 (en) | 2018-07-24 | 2020-01-30 | Magic Leap, Inc. | Display systems and methods for determining registration between a display and eyes of a user |
JPWO2022249253A1 (ja) * | 2021-05-24 | 2022-12-01 | ||
EP4176335A1 (en) * | 2021-09-27 | 2023-05-10 | Google LLC | Touch control of wearable devices |
Family Cites Families (30)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5252951A (en) * | 1989-04-28 | 1993-10-12 | International Business Machines Corporation | Graphical user interface with gesture recognition in a multiapplication environment |
US5745719A (en) * | 1995-01-19 | 1998-04-28 | Falcon; Fernando D. | Commands functions invoked from movement of a control input device |
JP3587034B2 (ja) * | 1997-10-21 | 2004-11-10 | Omron Corporation | Electronic device and electronic game device |
US7148879B2 (en) * | 2000-07-06 | 2006-12-12 | At&T Corp. | Bioacoustic control system, method and apparatus |
JP2002358149A (ja) * | 2001-06-01 | 2002-12-13 | Sony Corp | User input device |
US6984208B2 (en) * | 2002-08-01 | 2006-01-10 | The Hong Kong Polytechnic University | Method and apparatus for sensing body gesture, posture and movement |
KR100785764B1 (ko) * | 2005-05-11 | 2007-12-18 | Electronics and Telecommunications Research Institute | Terrestrial DMB receiving apparatus using a human body antenna and method thereof |
JP4039530B1 (ja) * | 2006-11-30 | 2008-01-30 | Rikkyo Gakuin | Visual-tactile perception interaction experience device |
KR100793079B1 (ko) * | 2006-12-08 | 2008-01-10 | Electronics and Telecommunications Research Institute | Wrist-worn user command input apparatus and method thereof |
KR101572768B1 (ko) * | 2007-09-24 | 2015-11-27 | Apple Inc. | Embedded authentication systems in an electronic device |
US20110071419A1 (en) * | 2008-05-26 | 2011-03-24 | Koninklijke Philips Electronics N.V. | Location indicating device |
US9037530B2 (en) * | 2008-06-26 | 2015-05-19 | Microsoft Technology Licensing, Llc | Wearable electromyography-based human-computer interface |
US20100056873A1 (en) * | 2008-08-27 | 2010-03-04 | Allen Paul G | Health-related signaling via wearable items |
TW201015099A (en) * | 2008-09-10 | 2010-04-16 | Koninkl Philips Electronics Nv | System, device and method for emergency presence detection |
JP2012511274A (ja) * | 2008-12-05 | 2012-05-17 | Koninklijke Philips Electronics N.V. | User identification based on body-coupled communication |
US8386028B2 (en) * | 2009-03-24 | 2013-02-26 | Biospace Co., Ltd. | Method of analyzing body composition with measurement of voltage signals at multiple positions of body |
JP4988016B2 (ja) * | 2009-08-27 | 2012-08-01 | Electronics and Telecommunications Research Institute | Finger motion detection apparatus and method thereof |
CN102597931B (zh) * | 2009-11-09 | 2016-01-27 | Rohm Co., Ltd. | Display with a touch sensor, control circuit, and electronic device using the same |
JP5856566B2 (ja) * | 2009-11-11 | 2016-02-10 | Niveus Medical, Inc. | Synergistic muscle activation device |
US8421634B2 (en) * | 2009-12-04 | 2013-04-16 | Microsoft Corporation | Sensing mechanical energy to appropriate the body for data input |
KR101890717B1 (ko) * | 2010-07-20 | 2018-08-23 | Samsung Electronics Co., Ltd. | Apparatus and method for manipulating a virtual world using biometric information |
US8665210B2 (en) * | 2010-12-22 | 2014-03-04 | Microsoft Corporation | Sensing user input using the body as an antenna |
WO2012110042A1 (en) * | 2011-02-17 | 2012-08-23 | Sense A/S | A method of and a system for determining a cardiovascular quantity of a mammal |
US8908894B2 (en) * | 2011-12-01 | 2014-12-09 | At&T Intellectual Property I, L.P. | Devices and methods for transferring data through a human body |
US8988373B2 (en) * | 2012-04-09 | 2015-03-24 | Sony Corporation | Skin input via tactile tags |
US9170674B2 (en) * | 2012-04-09 | 2015-10-27 | Qualcomm Incorporated | Gesture-based device control using pressure-sensitive sensors |
EP2909699A1 (en) * | 2012-10-22 | 2015-08-26 | VID SCALE, Inc. | User presence detection in mobile devices |
EP2909755B1 (en) | 2012-10-22 | 2019-08-28 | Sony Corporation | User interface with location mapping |
US9226094B2 (en) * | 2013-07-25 | 2015-12-29 | Elwha Llc | Systems and methods for receiving gesture indicative data at a limb wearable computing device |
US9594433B2 (en) * | 2013-11-05 | 2017-03-14 | At&T Intellectual Property I, L.P. | Gesture-based controls via bone conduction |
- 2014-08-18: US application US14/461,621 granted as US9389733B2 (not active, Expired - Fee Related)
- 2015-06-24: EP application EP15736733.5A granted as EP3183625B1 (active)
- 2015-06-24: WO application PCT/US2015/037332 published as WO2016028385A1 (active, application filing)
- 2015-06-24: CN application CN201580044160.9A granted as CN106662898B (not active, Expired - Fee Related)
- 2015-06-24: JP application JP2017508617A granted as JP6353160B2 (not active, Expired - Fee Related)
- 2015-06-24: KR application KR1020177006983A published as KR20170042693A (status unknown)
Also Published As
Publication number | Publication date |
---|---|
CN106662898A (zh) | 2017-05-10 |
WO2016028385A1 (en) | 2016-02-25 |
EP3183625A1 (en) | 2017-06-28 |
US20160048231A1 (en) | 2016-02-18 |
JP2017530443A (ja) | 2017-10-12 |
EP3183625B1 (en) | 2021-10-27 |
KR20170042693A (ko) | 2017-04-19 |
CN106662898B (zh) | 2020-09-11 |
US9389733B2 (en) | 2016-07-12 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6353160B2 (ja) | Modal body touch using ultrasound | |
US9727133B2 (en) | Ultrasound-based facial and modal touch sensing with head worn device | |
CN102640086B (zh) | Sensing mechanical energy to appropriate the body for data input | |
US20170052596A1 (en) | Detector | |
US9857879B2 (en) | Finger gesture sensing device | |
US20210397267A1 (en) | Systems and methods for gesture-based control | |
JP2011048818A (ja) | Finger motion detection apparatus and method thereof | |
KR101707734B1 (ko) | Determination of device body position | |
US11241197B2 (en) | Apparatus for blood pressure estimation using photoplethysmography and contact pressure | |
WO2018131251A1 (ja) | Information processing apparatus, information processing method, and program | |
US11347320B1 (en) | Gesture calibration for devices | |
Agarwal et al. | Opportunistic sensing with MIC arrays on smart speakers for distal interaction and exercise tracking | |
US20160202788A1 (en) | Multi-on-body action detection based on ultrasound | |
US11262850B2 (en) | No-handed smartwatch interaction techniques | |
US20180267618A1 (en) | Method for gesture based human-machine interaction, portable electronic device and gesture based human-machine interface system | |
Ibrahim et al. | PRESS/HOLD/RELEASE ultrasonic gestures and low complexity recognition based on TCN | |
Yang et al. | MAF: Exploring Mobile Acoustic Field for Hand-to-Face Gesture Interactions | |
Cao et al. | IPand: accurate gesture input with ambient acoustic sensing on hand | |
US20230144825A1 (en) | Detecting input gestures using onboard microphones | |
Bose et al. | A hands free browser using EEG and voice Inputs | |
JP6391486B2 (ja) | Information processing apparatus, operation control system, and operation control method | |
CN106658339A (zh) | Audio signal processing method, apparatus, and system | |
WO2024191491A1 (en) | Gesture-based control using active acoustic sensing |
Legal Events
Date | Code | Title | Description
---|---|---|---
2017-10-24 | A131 | Notification of reasons for refusal | JAPANESE INTERMEDIATE CODE: A131
2017-12-13 | A521 | Request for written amendment filed | JAPANESE INTERMEDIATE CODE: A523
| TRDD | Decision of grant or rejection written |
2018-05-08 | A01 | Written decision to grant a patent or to grant a registration (utility model) | JAPANESE INTERMEDIATE CODE: A01
2018-06-07 | A61 | First payment of annual fees (during grant procedure) | JAPANESE INTERMEDIATE CODE: A61
| R150 | Certificate of patent or registration of utility model | Ref document number: 6353160; Country of ref document: JP; JAPANESE INTERMEDIATE CODE: R150
| LAPS | Cancellation because of no payment of annual fees |