WO2012099584A1 - Method and system for multimodal and gestural control - Google Patents
Method and system for multimodal and gestural control Download PDFInfo
- Publication number
- WO2012099584A1 WO2012099584A1 PCT/US2011/021716 US2011021716W WO2012099584A1 WO 2012099584 A1 WO2012099584 A1 WO 2012099584A1 US 2011021716 W US2011021716 W US 2011021716W WO 2012099584 A1 WO2012099584 A1 WO 2012099584A1
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- gesture
- destination device
- user
- devices
- command
- Prior art date
Links
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/02—Input arrangements using manually operated switches, e.g. using keyboards or dials
- G06F3/0227—Cooperation and interconnection of the input arrangement with other functional units of a computer
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/0304—Detection arrangements using opto-electronic means
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/038—Control and interface arrangements therefor, e.g. drivers or device-embedded control circuitry
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/038—Indexing scheme relating to G06F3/038
- G06F2203/0381—Multimodal input, i.e. interface arrangements enabling the user to issue commands by simultaneous use of input devices of different nature, e.g. voice plus gesture on digitizer
Definitions
- FIG. 1 is a three-dimensional perspective view of a multimodal and gestural control system according to an example of the present invention.
- FIG. 2 is a simplified block diagram of the multimodal and gestural control system according to an example of the present invention.
- FIGS. 3A - 3I illustrate various gesture commands in accordance with an example of the present invention.
- FIGS. 4A - 4E illustrate various meta-interaction approaches for allowing a user to specify the particular device for which the interactions are intended according to examples of the present invention.
- FIG. 5 is a simplified flow diagram of the processing steps for enabling multimodal and gestural control according to an example of the present invention.
- a common solution to the aforementioned problem is the use of a universal remote controller to control multiple entertainment devices.
- a single controller is configured to control multiple devices.
- the set of devices and the corresponding commands that the controller can issue over a communication channel can be pre-configured or customized by the user.
- the physical controller is put in a particular mode such that all subsequent commands are meant for a particular device unless the mode is changed.
- physical controllers require frequent changing of the batteries, are often misplaced, and eventually wear down over time.
- Another solution to the aforementioned problem is to enable control of multiple devices without the need of a physical controller through use of gesture control.
- each device must contain a controller for detecting gestures from the user in addition to having its own unique interaction vocabulary (e.g. gesture command database).
- when a physical controller is missing, specifying the destination device for which a gesture is destined often becomes a challenge.
- Examples of the present invention enable a user to specify a particular device when a user-action (i.e. gesture command) can be acted upon by multiple co-present devices. That is, examples of the present invention provide a single gesture and multimodal controller that can be used to control multiple entertainment devices. In particular, examples of the present invention utilize meta-interactions, or interactions which do not result in any action but are useful for interpreting the interaction itself (i.e. the destination device of the interaction). Stated differently, when a user interacts with multiple devices of which two or more can be controlled via the same interaction mechanism, the multimodal and gestural control system helps to distinguish which device a particular interaction is intended for. As such, meta-interactions and the gestural control system associated therewith are able to span several interaction modalities (i.e. multimodal) and enable fluid interactions with multiple proximate devices.
- FIG. 1 is a three-dimensional perspective view of a multimodal and gestural control system according to an example of the present invention.
- the system 100 includes a display unit 124 coupled to a multitude of entertainment devices 105, 110, 115.
- display unit 124 is a digital television configured to facilitate the transmission of audio and video signals to an operating user.
- Entertainment devices 105, 110, 115 represent multimedia consumer electronic devices for pairing with the display unit 124, such as a satellite receiver, a digital video recorder (DVR), a digital video disc player (DVD/Blu-ray), a video game console, an audio-video (AV) receiver, and the like.
- a gesture detection controller may be embedded in at least one of the entertainment devices 105, 110, 115 along with the requisite sensors to feed the controller (e.g. cameras, microphones, infrared cameras, etc.).
- the other entertainment devices may include standard wireless communication means such as infrared-based or radio frequency (RF) protocols. Examples of the present invention utilize both the spatial proximity of the multitude of entertainment devices 105, 110, and 115, and the broadcast nature of the wireless protocol.
- when the gesture detection controller of the host device recognizes a gesture command from a user which is not destined for the host device, the host device is configured to convert the command into an alternate command code (e.g. infrared) which may be understood by other devices.
- the alternate command is then broadcast by the host device to the destination device using the appropriate code or media (e.g. infrared).
- entertainment device 110 may then create an infrared code (i.e. alternate command) associated with the received gesture command and broadcast the alternate command to the device 105. Since device 110 and device 105 are in close proximity, the infrared receiver of device 105 receives this alternate infrared command and acts on it accordingly. Furthermore, entertainment device 115 is unaffected by this command since the alternate command or infrared code is unique to entertainment device 105. Still further, display unit 124 may be one of the entertainment devices controlled via the host device.
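The conversion-and-broadcast behavior described above can be sketched as follows. This is an illustrative sketch, not the patent's implementation: the device names (`device_105`, `device_115`) follow the figure's reference numerals, and the infrared code values are hypothetical placeholders.

```python
# Hypothetical table of device-unique infrared codes, keyed by
# (destination device, operation). In the patent's system, this role is
# played by the host's signal processor and its command-code mappings.
IR_CODES = {
    ("device_105", "BACK"): 0x20DF10EF,
    ("device_105", "FORWARD"): 0x20DF906F,
    ("device_115", "BACK"): 0x10EF58A7,
}

def to_alternate_command(destination: str, operation: str) -> int:
    """Convert a recognized gesture operation into the infrared code the
    destination device understands."""
    try:
        return IR_CODES[(destination, operation)]
    except KeyError:
        raise KeyError(f"no IR code for {operation!r} on {destination!r}")

def broadcast(destination: str, operation: str) -> str:
    # The broadcast itself is unaddressed; spatial proximity plus the
    # device-unique code ensure only the intended device reacts.
    code = to_alternate_command(destination, operation)
    return f"IR broadcast: 0x{code:08X}"
```

Because each code is unique to one device, a neighboring device that also receives the broadcast simply ignores it.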
- FIG. 2 is a simplified block diagram of the multimodal and gestural control system according to an example of the present invention.
- the system 200 includes a plurality of entertainment devices 205, 210, 215 (A, B, ... N), a display unit 224, a gesture database 223, and a computer-readable storage medium 219.
- the host or entertainment device 210 includes a gesture detection module 211 and a signal processor 209 for facilitating the gesture and multimodal control in accordance with examples of the present invention. More particularly, the gesture detection module 211 is coupled to a gesture database 223, which stores a list of gestures along with an associated operation command and destination device.
- the signal processor 209 is configured to convert the gesture command to an alternate code or command (e.g. radio frequency), and emit the converted command for detection by the appropriate destination device.
- Devices 205 and 215 include a signal receiver 207 and 217 respectively for detecting the alternate signal emitted from the host entertainment device 210.
- Display unit 224 represents an electronic visual display configured to present video and images to a user such as a liquid crystal display (LCD) television, plasma display panel, cathode ray tube (CRT) display, video projector, or the like. Furthermore, display unit 224 may also include a signal receiver for detecting the alternate command signal broadcast from the host device.
- Storage medium 219 represents volatile storage (e.g. random access memory), non-volatile storage (e.g. hard disk drive, read-only memory, compact disc read-only memory, flash storage, etc.), or combinations thereof. Furthermore, storage medium 219 may include software 213 that is executable by the host device to perform some or all of the functionality described herein.
- FIGS. 3A - 3I illustrate various gesture commands in accordance with an example of the present invention.
- FIG. 3A depicts a gesture command in which the user's hand forms a closed fist and a thumb pointing in a westward direction.
- Such a gesture command may be mapped to a "BACK" operation such as skipping to a previous chapter on a DVD player device, or rewinding thirty seconds on a DVR device.
- FIG. 3B depicts a gesture command in which the user's hand forms a closed fist and a thumb is pointing in an eastward direction.
- This gesture may be mapped to a "FORWARD" operation such as skipping to a next chapter on a DVD player, or skipping forward thirty seconds on a DVR device.
- FIG. 3C depicts yet another gesture command in which the user's hand is open, fingers close together, and the thumb is perpendicular to the index finger.
- Such a gesture may be mapped to a "STOP" operation such as stopping the playback of a Blu-ray movie on a Blu-ray entertainment device, for example.
- FIG. 3D depicts a gesture command in which the attributes of the user's hand include an open hand, fingers essentially equidistant apart, and the palm facing the user. Accordingly, this gesture may be mapped to an "OPEN" operation such as opening a saved file on a connected video game console for example.
- FIG. 3E depicts a gesture command in which the attributes of the user's hand mimic those of FIG. 3D, except that the palm faces away from the user. According to one example, this gesture may be mapped to a "MAXIMIZE" operation for increasing the visual display area of an image or video shown on the display unit.
- FIG. 3F depicts a gesture command in which the user's hand forms a closed fist. Here, this gesture command may be mapped to a "MINIMIZE" operation for decreasing the visual display area of an image or video shown on a connected display unit.
- FIG. 3G depicts yet another gesture command in which the user's hand is closed except for the middle and index fingers, with the index finger pointing in an eastward direction and the middle finger pointing in a northward direction.
- a gesture may be mapped to an "UNDO" operation in which the last received command is canceled or removed.
- FIG. 3H depicts a similar gesture command as FIG. 3G except that the index finger points in a westward direction.
- the gesture of FIG. 3H may be mapped to a "REDO" operation such that the previous command is re-applied.
- FIG. 3I depicts a gesture command in which each finger of the user's hand is curled inward so as to form a claw-type gesture.
- Such a gesture command may be mapped, via the gesture controller of the host device, to a "SAVE" operation for saving a data file on a particular destination device (e.g. saving a photo on the hard drive of a video game console).
- the above-described gesture commands represent only a small subset of the types of gestures that may be utilized by the system of the present invention. Furthermore, these gesture commands are used for example purposes only, as each gesture may be mapped to any type of operation command.
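The FIG. 3A-3I vocabulary can be pictured as a small lookup table. The feature encoding below (shape plus an optional modifier) is a hypothetical sketch; the patent's gesture database additionally stores a destination device per entry.

```python
# Hypothetical encoding of the FIG. 3A-3I hand postures as feature
# tuples mapped to operation commands.
GESTURE_DB = {
    ("closed_fist", "thumb_west"): "BACK",          # FIG. 3A
    ("closed_fist", "thumb_east"): "FORWARD",       # FIG. 3B
    ("open_flat", "thumb_perpendicular"): "STOP",   # FIG. 3C
    ("open_spread", "palm_toward_user"): "OPEN",    # FIG. 3D
    ("open_spread", "palm_away"): "MAXIMIZE",       # FIG. 3E
    ("closed_fist", None): "MINIMIZE",              # FIG. 3F
    ("index_east_middle_north", None): "UNDO",      # FIG. 3G
    ("index_west_middle_north", None): "REDO",      # FIG. 3H
    ("claw", None): "SAVE",                         # FIG. 3I
}

def lookup_operation(shape, modifier=None):
    """Return the operation mapped to a recognized hand posture, or
    None when the posture is not in the vocabulary."""
    return GESTURE_DB.get((shape, modifier))
```

Keeping the vocabulary this small is what the text later credits with improving both user recall and recognition accuracy.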
- Examples of the present invention provide a method that allows the user to overload a particular gesture command and then specify the device for which the command is meant or destined. Specifically, meta-interactions, or interactions which do not result in any action but are useful for interpreting the interaction itself, may be utilized to identify the destination device of a particular user interaction or gesture.
- FIGS. 4A - 4E illustrate various meta-interaction approaches for allowing a user to specify the particular device for which the interactions are intended according to examples of the present invention.
- FIG. 4A depicts a "non-dominant hand" meta-interaction approach.
- the non-dominant hand 434 of the user 426 may be used for specifying the destination device while the dominant hand 432 is being used for gesture interactions. That is, certain postures or gestures made using the non-dominant hand 434 can be used to qualify the destination device for gesture commands being made using the dominant hand 432.
- the user 426 is forming a "BACK" gesture command with the dominant hand 432, while simultaneously holding up one finger with the non-dominant hand 434 so as to indicate the destination entertainment device (i.e. device 1) for execution of the "BACK" operation.
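The non-dominant-hand qualifier can be sketched as a simple pairing rule. This is an illustrative sketch under the assumption that one raised finger selects the first device in an ordered list, two fingers the second, and so on; the device list and return shape are not specified by the text.

```python
def resolve_two_handed(dominant_command, non_dominant_fingers, devices):
    """Pair a dominant-hand gesture command with the destination device
    indicated by the number of fingers held up on the non-dominant hand
    (1 finger -> first device, 2 -> second, ...)."""
    if not 1 <= non_dominant_fingers <= len(devices):
        return None  # destination unclear; the system could prompt the user
    return (devices[non_dominant_fingers - 1], dominant_command)
```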
- FIG. 4B depicts a "cross-modal" meta-interaction approach.
- the meta-interaction may include a visual hand gesture accompanied by another modality (e.g. speech) which specifies the destination device.
- a visual hand gesture for a "BACK" command may be accompanied by a speech tag such as "DVD" or "DVR" to specify which device the "BACK" command is meant to operate or control.
- FIG. 4C depicts yet another meta-interaction called the "temporal" approach.
- some gestures within the same interaction domain may act as meta-interactions that set the destination device for subsequent commands, until the destination is changed by a future meta-interaction or a given time period elapses.
- the user 426 may select "device 2" as the destination device and then, within a predetermined time threshold (e.g. 5 seconds) and within the same interaction domain, form a gesture command (e.g. "GO BACK", "VOLUME UP", etc.) for operating the selected destination device.
- a particular gesture may act as a "toggle meta-interaction" to switch between different devices.
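The temporal approach described above amounts to a small piece of routing state with a timeout. The sketch below is illustrative: the 5-second threshold follows the example in the text, while the class shape and the injectable clock are assumptions made for clarity and testability.

```python
import time

class TemporalRouter:
    """Route commands to a destination device selected by a prior
    meta-interaction, falling back to a default once the selection
    expires (temporal meta-interaction sketch)."""

    def __init__(self, default_device, threshold_s=5.0, clock=time.monotonic):
        self.default = default_device
        self.threshold = threshold_s
        self.clock = clock
        self._device = None
        self._stamp = None

    def select(self, device):
        """A selection (or toggle) meta-interaction sets the destination."""
        self._device = device
        self._stamp = self.clock()

    def route(self, command):
        """Return (device, command), using the selected device while the
        time threshold has not elapsed."""
        if self._device is not None and self.clock() - self._stamp <= self.threshold:
            return (self._device, command)
        return (self.default, command)
```

A "toggle meta-interaction" would then simply call `select` with the next device in a cycle.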
- the gesture recognition space of the interaction may be sub-divided into different spatial regions which are mapped to different devices such that an interaction triggered in a particular spatial region is destined towards a particular device. For example, if two entertainment devices are stacked on top of each other, visual hand gestures above a predetermined threshold (e.g. user's face or shoulders) may be assigned to the top-most device, while gesture commands below the threshold may be assigned to the device below the top-most device.
- a gesture command 432a within spatial region 436a (i.e. above the eye level of the user 426) may be assigned to entertainment device 405
- a gesture command 432b within spatial region 436b (i.e. between the eye level and shoulder level of the user 426) may be assigned to entertainment device 410
- a gesture command 432c within spatial region 436c (i.e. between the shoulder level and waist level of the user 426) may be assigned to entertainment device 415.
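The spatial-region mapping can be sketched as a threshold check on the gesturing hand's vertical position. This sketch assumes image coordinates where y grows downward (typical for camera frames) and uses generic device names, since the exact device assignments are an illustration.

```python
def device_for_region(hand_y, eye_y, shoulder_y, waist_y):
    """Map the vertical position of the gesturing hand to one of the
    stacked devices, using eye/shoulder/waist levels as region
    boundaries (spatial meta-interaction sketch)."""
    if hand_y < eye_y:
        return "top_device"       # above the user's eye level
    if hand_y < shoulder_y:
        return "middle_device"    # between eye and shoulder level
    if hand_y < waist_y:
        return "bottom_device"    # between shoulder and waist level
    return None                   # below the waist: no device assigned
```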
- FIG. 4E depicts yet another meta-interaction approach ("gesture attributes") for controlling multiple devices.
- the particular attributes of the gesture are analyzed for determining the appropriate destination device.
- the meta-interaction may be embedded within the gesture command itself.
- a hand swipe gesture from left to right may mean "increase volume" of a particular device, while the number of fingers held-out while making the gesture may specify whether the gesture command is destined for the first device 405, second device 410, or third device 415.
- the hand swipe gesture 432 includes two fingers so as to indicate to the host device to perform a volume increase operation on the second device 410 (the host device may even be the second device).
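The gesture-attribute approach embeds the destination inside the command gesture itself. The following sketch models only the volume-up swipe from the example; the direction encoding and device ordering are assumptions.

```python
def interpret_swipe(direction, finger_count, devices):
    """Gesture-attribute meta-interaction sketch: a left-to-right swipe
    means "increase volume", and the number of extended fingers made
    while swiping selects the destination device."""
    if direction != "left_to_right":
        return None  # only the volume-up swipe is modeled here
    if not 1 <= finger_count <= len(devices):
        return None  # ambiguous; could fall back to an on-screen chooser
    return (devices[finger_count - 1], "VOLUME_UP")
```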
- meta-interactions may also be interactive: in the event that the destination of a gesture command is unclear, the destination may be determined by letting the user choose among the plurality of entertainment devices using a display or graphical user interface.
- FIG. 5 is a simplified flow diagram of the processing steps for enabling multimodal and gestural control according to an example of the present invention.
- a gesture command from a user is detected by the gesture detection module of the host entertainment device.
- the host device analyzes the gesture command (e.g. its meta-interactions) and determines whether the destination device is the host device itself. If so, the gesture database is queried to determine the operation command associated with the received gesture command in step 512, and this operation command is executed on the host device in step 514. If the host device is not the desired destination device, however, then in step 508 the host device determines the appropriate destination device based, for example, on the meta-interactions described above.
- the gesture command is then converted to an alternate command signal (e.g. infrared) via the signal processor of the host device in step 508. Thereafter, the alternate command signal is broadcast or transmitted by the host device to the destination device in step 510. Based on the received command signal at the destination device, the associated operation command is executed thereon in step 514.
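The end-to-end flow of FIG. 5 can be sketched as one dispatch function. All names below are hypothetical stand-ins for the patent's modules (the gesture database, signal processor, and broadcast path), wired together so the branch structure matches the steps just described.

```python
def handle_gesture(gesture, host, gesture_db, convert, broadcast, execute):
    """Detect -> decide host vs. remote -> execute locally, or convert
    to an alternate signal and broadcast to the destination device."""
    entry = gesture_db.get(gesture)
    if entry is None:
        return None                               # unrecognized gesture
    destination, operation = entry
    if destination == host:
        execute(operation)                        # run on the host itself
    else:
        signal = convert(operation, destination)  # e.g. an infrared code
        broadcast(signal, destination)            # destination executes it
    return destination
```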
- the gestural control system provides the user with the ability to control multiple devices with a single embedded gesture controller.
- the expense of manufacturing and deploying multiple sensors and gesture controllers on each entertainment device can be eliminated.
- the use of a shared interaction language/vocabulary across multiple devices allows users to learn a small set of gestures, thus increasing recall and use of these gesture commands.
- a small interaction vocabulary helps to improve recognition of gesture commands by the embedded gesture controller of the host device.
- examples of the present invention are extensible to more than two or three entertainment devices and to interaction modalities other than visual gestures.
- the gesture control system may be equally effective when each device has its own gesture controller but the devices are so close as to make existing gaze and pointing solutions unreliable.
- examples of the present invention allow for manual configuration, or manual assignment of gestures and meta-interactions to a particular operation command.
- the user can also add new gestures which are mapped to particular devices only, or can add new meta-interactions when more devices are added to the setup. Meta-interactions can also specify that a particular command is meant for more than one device, thus allowing a single gestural command to trigger action in multiple devices.
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- User Interface Of Digital Computer (AREA)
- Details Of Television Systems (AREA)
Abstract
Description
Claims
Priority Applications (6)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2013550458A JP5723462B2 (en) | 2011-01-19 | 2011-01-19 | Method and system for multimodal and gesture control |
EP11856002.8A EP2666070A4 (en) | 2011-01-19 | 2011-01-19 | Method and system for multimodal and gestural control |
US13/978,033 US9778747B2 (en) | 2011-01-19 | 2011-01-19 | Method and system for multimodal and gestural control |
KR1020137020458A KR101690117B1 (en) | 2011-01-19 | 2011-01-19 | Method and system for multimodal and gestural control |
CN201180065522.4A CN103329066B (en) | 2011-01-19 | 2011-01-19 | For the method and system of multi-mode gesture control |
PCT/US2011/021716 WO2012099584A1 (en) | 2011-01-19 | 2011-01-19 | Method and system for multimodal and gestural control |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/US2011/021716 WO2012099584A1 (en) | 2011-01-19 | 2011-01-19 | Method and system for multimodal and gestural control |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2012099584A1 true WO2012099584A1 (en) | 2012-07-26 |
Family
ID=46515983
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/US2011/021716 WO2012099584A1 (en) | 2011-01-19 | 2011-01-19 | Method and system for multimodal and gestural control |
Country Status (6)
Country | Link |
---|---|
US (1) | US9778747B2 (en) |
EP (1) | EP2666070A4 (en) |
JP (1) | JP5723462B2 (en) |
KR (1) | KR101690117B1 (en) |
CN (1) | CN103329066B (en) |
WO (1) | WO2012099584A1 (en) |
Cited By (23)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20130194180A1 (en) * | 2012-01-27 | 2013-08-01 | Lg Electronics Inc. | Device and method of controlling the same |
WO2014064025A1 (en) * | 2012-10-25 | 2014-05-01 | Mis-Robotics Gmbh | Manually operated robot control system and method for controlling a robot system |
JP2014086085A (en) * | 2012-10-19 | 2014-05-12 | Samsung Electronics Co Ltd | Display device and control method thereof |
WO2014117895A1 (en) * | 2013-01-29 | 2014-08-07 | Robert Bosch Gmbh | Method and device for controlling garage equipment |
CN104102335A (en) * | 2013-04-15 | 2014-10-15 | 中兴通讯股份有限公司 | Gesture control method, device and system |
WO2015011703A1 (en) * | 2013-07-21 | 2015-01-29 | Pointgrab Ltd. | Method and system for touchless activation of a device |
WO2015039050A1 (en) * | 2013-09-13 | 2015-03-19 | Nod, Inc | Using the human body as an input device |
WO2016121052A1 (en) * | 2015-01-29 | 2016-08-04 | 三菱電機株式会社 | Multimodal intent understanding device and multimodal intent understanding method |
EP3062196A1 (en) * | 2015-02-26 | 2016-08-31 | Xiaomi Inc. | Method and apparatus for operating and controlling smart devices with hand gestures |
EP2755111A3 (en) * | 2013-01-11 | 2016-10-19 | Samsung Electronics Co., Ltd | System and method for detecting three dimensional gestures to initiate and complete the transfer of application data between networked devices |
US9477313B2 (en) | 2012-11-20 | 2016-10-25 | Samsung Electronics Co., Ltd. | User gesture input to wearable electronic device involving outward-facing sensor of device |
EP3112984A4 (en) * | 2014-02-25 | 2017-02-22 | ZTE Corporation | Hand gesture recognition method, device, system, and computer storage medium |
US10185416B2 (en) | 2012-11-20 | 2019-01-22 | Samsung Electronics Co., Ltd. | User gesture input to wearable electronic device involving movement of device |
US10194060B2 (en) | 2012-11-20 | 2019-01-29 | Samsung Electronics Company, Ltd. | Wearable electronic device |
US10423214B2 (en) | 2012-11-20 | 2019-09-24 | Samsung Electronics Company, Ltd | Delegating processing from wearable electronic device |
US10551928B2 (en) | 2012-11-20 | 2020-02-04 | Samsung Electronics Company, Ltd. | GUI transitions on wearable electronic device |
US10585478B2 (en) | 2013-09-13 | 2020-03-10 | Nod, Inc. | Methods and systems for integrating one or more gestural controllers into a head mounted wearable display or other wearable devices |
US10691332B2 (en) | 2014-02-28 | 2020-06-23 | Samsung Electronics Company, Ltd. | Text input on an interactive display |
US10732723B2 (en) | 2014-02-21 | 2020-08-04 | Nod, Inc. | Location determination and registration methodology for smart devices based on direction and proximity and usage of the same |
JP2021009619A (en) * | 2019-07-02 | 2021-01-28 | 富士ゼロックス株式会社 | Information processing system and program |
US11157436B2 (en) | 2012-11-20 | 2021-10-26 | Samsung Electronics Company, Ltd. | Services associated with wearable electronic device |
US11237719B2 (en) | 2012-11-20 | 2022-02-01 | Samsung Electronics Company, Ltd. | Controlling remote electronic device with wearable electronic device |
US11372536B2 (en) | 2012-11-20 | 2022-06-28 | Samsung Electronics Company, Ltd. | Transition and interaction model for wearable electronic device |
Families Citing this family (36)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9292112B2 (en) * | 2011-07-28 | 2016-03-22 | Hewlett-Packard Development Company, L.P. | Multimodal interface |
US9142182B2 (en) * | 2011-10-07 | 2015-09-22 | Lg Electronics Inc. | Device and control method thereof |
US11493998B2 (en) | 2012-01-17 | 2022-11-08 | Ultrahaptics IP Two Limited | Systems and methods for machine control |
KR101984154B1 (en) * | 2012-07-16 | 2019-05-30 | 삼성전자 주식회사 | Control method for terminal using touch and gesture input and terminal thereof |
KR20140014548A (en) * | 2012-07-24 | 2014-02-06 | 삼성전자주식회사 | Electronic device, method for controlling the same, and computer-readable recoding medium |
US20140245200A1 (en) * | 2013-02-25 | 2014-08-28 | Leap Motion, Inc. | Display control with gesture-selectable control paradigms |
US9766709B2 (en) * | 2013-03-15 | 2017-09-19 | Leap Motion, Inc. | Dynamic user interactions for display control |
US10338685B2 (en) * | 2014-01-07 | 2019-07-02 | Nod, Inc. | Methods and apparatus recognition of start and/or stop portions of a gesture using relative coordinate system boundaries |
US10338678B2 (en) * | 2014-01-07 | 2019-07-02 | Nod, Inc. | Methods and apparatus for recognition of start and/or stop portions of a gesture using an auxiliary sensor |
US10725550B2 (en) | 2014-01-07 | 2020-07-28 | Nod, Inc. | Methods and apparatus for recognition of a plurality of gestures using roll pitch yaw data |
US20150220158A1 (en) | 2014-01-07 | 2015-08-06 | Nod Inc. | Methods and Apparatus for Mapping of Arbitrary Human Motion Within an Arbitrary Space Bounded by a User's Range of Motion |
WO2015105919A2 (en) * | 2014-01-07 | 2015-07-16 | Nod, Inc. | Methods and apparatus recognition of start and/or stop portions of a gesture using an auxiliary sensor and for mapping of arbitrary human motion within an arbitrary space bounded by a user's range of motion |
CN103728906B (en) * | 2014-01-13 | 2017-02-01 | 江苏惠通集团有限责任公司 | Intelligent home control device and method |
KR20150084524A (en) * | 2014-01-14 | 2015-07-22 | 삼성전자주식회사 | Display apparatus and Method for controlling display apparatus thereof |
CN104866083B (en) * | 2014-02-25 | 2020-03-17 | 中兴通讯股份有限公司 | Gesture recognition method, device and system |
KR101556521B1 (en) * | 2014-10-06 | 2015-10-13 | 현대자동차주식회사 | Human Machine Interface apparatus, vehicle having the same and method for controlling the same |
CN105589550A (en) * | 2014-10-21 | 2016-05-18 | 中兴通讯股份有限公司 | Information publishing method, information receiving method, information publishing device, information receiving device and information sharing system |
CN104635537B (en) * | 2014-12-24 | 2017-10-20 | 北京元心科技有限公司 | A kind of control method of intelligent appliance |
US10048749B2 (en) | 2015-01-09 | 2018-08-14 | Microsoft Technology Licensing, Llc | Gaze detection offset for gaze tracking models |
US9864430B2 (en) | 2015-01-09 | 2018-01-09 | Microsoft Technology Licensing, Llc | Gaze tracking via eye gaze model |
US9653075B1 (en) * | 2015-11-06 | 2017-05-16 | Google Inc. | Voice commands across devices |
CN107329602A (en) * | 2016-04-28 | 2017-11-07 | 珠海金山办公软件有限公司 | A kind of touch-screen track recognizing method and device |
US10558341B2 (en) * | 2017-02-20 | 2020-02-11 | Microsoft Technology Licensing, Llc | Unified system for bimanual interactions on flexible representations of content |
US10684758B2 (en) * | 2017-02-20 | 2020-06-16 | Microsoft Technology Licensing, Llc | Unified system for bimanual interactions |
CN107122053A (en) * | 2017-04-27 | 2017-09-01 | 奥英光电(苏州)有限公司 | A kind of control method of display, device and display |
JP7006198B2 (en) * | 2017-12-01 | 2022-01-24 | 富士フイルムビジネスイノベーション株式会社 | Information processing equipment, information processing systems and programs |
KR102181499B1 (en) * | 2019-03-14 | 2020-11-23 | 주식회사 듀코젠 | Method and system for authoring virtual reality contents with two hands motion input |
KR102192051B1 (en) * | 2019-04-16 | 2020-12-16 | 경북대학교 산학협력단 | Device and method for recognizing motion using deep learning, recording medium for performing the method |
US11476894B2 (en) | 2019-12-10 | 2022-10-18 | AR & NS Investment, LLC | Edge communication system with cascaded repeater devices over wired medium |
US11010129B1 (en) * | 2020-05-08 | 2021-05-18 | International Business Machines Corporation | Augmented reality user interface |
US11177872B1 (en) | 2020-06-24 | 2021-11-16 | AR & NS Investment, LLC | Managing a network of radio frequency (RF) repeater devices |
US11283510B2 (en) | 2020-06-24 | 2022-03-22 | AR & NS Investment, LLC | Phase noise removal in a network of radio frequency (RF) repeaters |
US11989965B2 (en) * | 2020-06-24 | 2024-05-21 | AR & NS Investment, LLC | Cross-correlation system and method for spatial detection using a network of RF repeaters |
US11711126B2 (en) | 2020-06-24 | 2023-07-25 | AR & NS Investment, LLC | Wireless communication system based on mmWave RF repeaters |
US11789542B2 (en) | 2020-10-21 | 2023-10-17 | International Business Machines Corporation | Sensor agnostic gesture detection |
US11763809B1 (en) * | 2020-12-07 | 2023-09-19 | Amazon Technologies, Inc. | Access to multiple virtual assistants |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20030128244A1 (en) * | 2001-09-19 | 2003-07-10 | Soichiro Iga | Information processing apparatus, method of controlling the same, and program for causing a computer to execute such a method |
US7028269B1 (en) * | 2000-01-20 | 2006-04-11 | Koninklijke Philips Electronics N.V. | Multi-modal video target acquisition and re-direction system and method |
US20090031240A1 (en) * | 2007-07-27 | 2009-01-29 | Gesturetek, Inc. | Item selection using enhanced control |
US20100199228A1 (en) * | 2009-01-30 | 2010-08-05 | Microsoft Corporation | Gesture Keyboarding |
Family Cites Families (26)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR19990008158A (en) * | 1995-04-28 | 1999-01-25 | Yoichi Morishita | Interface device |
JPH0981309A (en) | 1995-09-13 | 1997-03-28 | Toshiba Corp | Input device |
JP3521187B2 (en) * | 1996-10-18 | 2004-04-19 | Toshiba Corporation | Solid-state imaging device |
JPH11327753A (en) | 1997-11-27 | 1999-11-30 | Matsushita Electric Ind Co Ltd | Control method and program recording medium |
JP2001216069A (en) | 2000-02-01 | 2001-08-10 | Toshiba Corp | Operation inputting device and direction detecting method |
EP1305791A1 (en) * | 2000-07-21 | 2003-05-02 | Koninklijke Philips Electronics N.V. | Speech control over a plurality of devices |
JP3997392B2 (en) * | 2001-12-13 | 2007-10-24 | Seiko Epson Corporation | Display device and input method of display device |
US7231609B2 (en) | 2003-02-03 | 2007-06-12 | Microsoft Corporation | System and method for accessing remote screen content |
CN100473152C (en) * | 2003-07-11 | 2009-03-25 | 株式会社日立制作所 | Image processing camera system and image processing camera controlling method |
JP2005242759A (en) * | 2004-02-27 | 2005-09-08 | National Institute Of Information & Communication Technology | Action/intention presumption system, action/intention presumption method, action/intention presumption program and computer-readable recording medium with program recorded thereon |
TWI412392B (en) | 2005-08-12 | 2013-10-21 | Koninkl Philips Electronics Nv | Interactive entertainment system and method of operation thereof |
KR100630806B1 (en) | 2005-11-29 | 2006-10-04 | 한국전자통신연구원 | Command input method using motion recognition device |
JP2007318319A (en) * | 2006-05-24 | 2007-12-06 | Seiko Epson Corporation | Remote controller, and control method therefor |
US9319741B2 (en) * | 2006-09-07 | 2016-04-19 | Rateze Remote Mgmt Llc | Finding devices in an entertainment system |
JP5207513B2 (en) | 2007-08-02 | 2013-06-12 | Tokyo Metropolitan University | Control device operation gesture recognition device, control device operation gesture recognition system, and control device operation gesture recognition program |
US7991896B2 (en) | 2008-04-21 | 2011-08-02 | Microsoft Corporation | Gesturing to select and configure device communication |
US8154428B2 (en) | 2008-07-15 | 2012-04-10 | International Business Machines Corporation | Gesture recognition control of electronic devices using a multi-touch device |
US20100083189A1 (en) * | 2008-09-30 | 2010-04-01 | Robert Michael Arlein | Method and apparatus for spatial context based coordination of information among multiple devices |
US9134798B2 (en) | 2008-12-15 | 2015-09-15 | Microsoft Technology Licensing, Llc | Gestures, interactions, and common ground in a surface computing environment |
US20100287513A1 (en) | 2009-05-05 | 2010-11-11 | Microsoft Corporation | Multi-device gesture interactivity |
US8487871B2 (en) | 2009-06-01 | 2013-07-16 | Microsoft Corporation | Virtual desktop coordinate transformation |
WO2010147600A2 (en) * | 2009-06-19 | 2010-12-23 | Hewlett-Packard Development Company, L.P. | Qualified command |
US8487888B2 (en) * | 2009-12-04 | 2013-07-16 | Microsoft Corporation | Multi-modal interaction on multi-touch display |
US9268404B2 (en) * | 2010-01-08 | 2016-02-23 | Microsoft Technology Licensing, Llc | Application gesture interpretation |
CN101777250B (en) * | 2010-01-25 | 2012-01-25 | 中国科学技术大学 | General remote control device and method for household appliances |
US20120110456A1 (en) * | 2010-11-01 | 2012-05-03 | Microsoft Corporation | Integrated voice command modal user interface |
2011
- 2011-01-19 KR KR1020137020458A patent/KR101690117B1/en active IP Right Grant
- 2011-01-19 JP JP2013550458A patent/JP5723462B2/en not_active Expired - Fee Related
- 2011-01-19 CN CN201180065522.4A patent/CN103329066B/en not_active Expired - Fee Related
- 2011-01-19 US US13/978,033 patent/US9778747B2/en active Active
- 2011-01-19 WO PCT/US2011/021716 patent/WO2012099584A1/en active Application Filing
- 2011-01-19 EP EP11856002.8A patent/EP2666070A4/en not_active Withdrawn
Non-Patent Citations (1)
Title |
---|
See also references of EP2666070A4 * |
Cited By (35)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20130194180A1 (en) * | 2012-01-27 | 2013-08-01 | Lg Electronics Inc. | Device and method of controlling the same |
JP2014086085A (en) * | 2012-10-19 | 2014-05-12 | Samsung Electronics Co Ltd | Display device and control method thereof |
WO2014064025A1 (en) * | 2012-10-25 | 2014-05-01 | Mis-Robotics Gmbh | Manually operated robot control system and method for controlling a robot system |
US9582079B2 (en) | 2012-10-25 | 2017-02-28 | Abb Gomtec Gmbh | Manually operated robot control system and method for controlling a robot system |
US9477313B2 (en) | 2012-11-20 | 2016-10-25 | Samsung Electronics Co., Ltd. | User gesture input to wearable electronic device involving outward-facing sensor of device |
US11372536B2 (en) | 2012-11-20 | 2022-06-28 | Samsung Electronics Company, Ltd. | Transition and interaction model for wearable electronic device |
US11237719B2 (en) | 2012-11-20 | 2022-02-01 | Samsung Electronics Company, Ltd. | Controlling remote electronic device with wearable electronic device |
US11157436B2 (en) | 2012-11-20 | 2021-10-26 | Samsung Electronics Company, Ltd. | Services associated with wearable electronic device |
US10551928B2 (en) | 2012-11-20 | 2020-02-04 | Samsung Electronics Company, Ltd. | GUI transitions on wearable electronic device |
US10423214B2 (en) | 2012-11-20 | 2019-09-24 | Samsung Electronics Company, Ltd | Delegating processing from wearable electronic device |
US10194060B2 (en) | 2012-11-20 | 2019-01-29 | Samsung Electronics Company, Ltd. | Wearable electronic device |
US10185416B2 (en) | 2012-11-20 | 2019-01-22 | Samsung Electronics Co., Ltd. | User gesture input to wearable electronic device involving movement of device |
US9910499B2 (en) | 2013-01-11 | 2018-03-06 | Samsung Electronics Co., Ltd. | System and method for detecting three dimensional gestures to initiate and complete the transfer of application data between networked devices |
EP2755111A3 (en) * | 2013-01-11 | 2016-10-19 | Samsung Electronics Co., Ltd | System and method for detecting three dimensional gestures to initiate and complete the transfer of application data between networked devices |
WO2014117895A1 (en) * | 2013-01-29 | 2014-08-07 | Robert Bosch Gmbh | Method and device for controlling garage equipment |
KR101969624B1 (en) * | 2013-04-15 | 2019-04-16 | 지티이 코포레이션 | Gesture control method, apparatus and system |
KR20150143724A (en) * | 2013-04-15 | 2015-12-23 | 지티이 코포레이션 | Gesture control method, apparatus and system |
CN104102335A (en) * | 2013-04-15 | 2014-10-15 | 中兴通讯股份有限公司 | Gesture control method, device and system |
US10013067B2 (en) | 2013-04-15 | 2018-07-03 | Zte Corporation | Gesture control method, apparatus and system |
CN104102335B (en) * | 2013-04-15 | 2018-10-02 | 中兴通讯股份有限公司 | A kind of gestural control method, device and system |
JP2016518657A (en) * | 2013-04-15 | ZTE Corporation | Gesture control method, apparatus and system |
EP2988210A4 (en) * | 2013-04-15 | 2016-05-25 | Zte Corp | Gesture control method, apparatus and system |
WO2015011703A1 (en) * | 2013-07-21 | 2015-01-29 | Pointgrab Ltd. | Method and system for touchless activation of a device |
US10139914B2 (en) | 2013-09-13 | 2018-11-27 | Nod, Inc. | Methods and apparatus for using the human body as an input device |
WO2015039050A1 (en) * | 2013-09-13 | 2015-03-19 | Nod, Inc | Using the human body as an input device |
US10585478B2 (en) | 2013-09-13 | 2020-03-10 | Nod, Inc. | Methods and systems for integrating one or more gestural controllers into a head mounted wearable display or other wearable devices |
US11231786B1 (en) | 2013-09-13 | 2022-01-25 | Nod, Inc. | Methods and apparatus for using the human body as an input device |
US10732723B2 (en) | 2014-02-21 | 2020-08-04 | Nod, Inc. | Location determination and registration methodology for smart devices based on direction and proximity and usage of the same |
US10591999B2 (en) | 2014-02-25 | 2020-03-17 | Zte Corporation | Hand gesture recognition method, device, system, and computer storage medium |
EP3112984A4 (en) * | 2014-02-25 | 2017-02-22 | ZTE Corporation | Hand gesture recognition method, device, system, and computer storage medium |
US10691332B2 (en) | 2014-02-28 | 2020-06-23 | Samsung Electronics Company, Ltd. | Text input on an interactive display |
WO2016121052A1 (en) * | 2015-01-29 | 2016-08-04 | 三菱電機株式会社 | Multimodal intent understanding device and multimodal intent understanding method |
EP3062196A1 (en) * | 2015-02-26 | 2016-08-31 | Xiaomi Inc. | Method and apparatus for operating and controlling smart devices with hand gestures |
US10007354B2 (en) | 2015-02-26 | 2018-06-26 | Xiaomi Inc. | Method and apparatus for controlling smart device |
JP2021009619A (en) * | 2019-07-02 | 2021-01-28 | 富士ゼロックス株式会社 | Information processing system and program |
Also Published As
Publication number | Publication date |
---|---|
EP2666070A1 (en) | 2013-11-27 |
US20130290911A1 (en) | 2013-10-31 |
KR20140014129A (en) | 2014-02-05 |
CN103329066A (en) | 2013-09-25 |
EP2666070A4 (en) | 2016-10-12 |
US9778747B2 (en) | 2017-10-03 |
JP2014507714A (en) | 2014-03-27 |
CN103329066B (en) | 2017-03-29 |
JP5723462B2 (en) | 2015-05-27 |
KR101690117B1 (en) | 2016-12-27 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US9778747B2 (en) | Method and system for multimodal and gestural control | |
US10782816B2 (en) | Electronic apparatus and method for implementing user interface | |
KR101364849B1 (en) | Directional touch remote | |
US9176590B2 (en) | Systems and methods for hand gesture control of an electronic device | |
US8963847B2 (en) | User interface for a remote control device | |
US9363549B2 (en) | Gesture and voice recognition for control of a device | |
US8638198B2 (en) | Universal remote control systems, methods, and apparatuses | |
KR101379118B1 (en) | System and method for capturing remote control device command signals | |
US11862010B2 (en) | Apparatus, system and method for using a universal controlling device for displaying a graphical user element in a display device | |
KR101258026B1 (en) | System and method for capturing remote control device command signals | |
CN107801074B (en) | Display system and control method thereof | |
KR20240076411A (en) | integrated controller |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 11856002 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 13978033 Country of ref document: US |
|
REEP | Request for entry into the european phase |
Ref document number: 2011856002 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2011856002 Country of ref document: EP |
|
ENP | Entry into the national phase |
Ref document number: 2013550458 Country of ref document: JP Kind code of ref document: A |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
ENP | Entry into the national phase |
Ref document number: 20137020458 Country of ref document: KR Kind code of ref document: A |