WO2017168186A1 - Methods and apparatus for "tangible gesture" and simulation/instruments that may require finger movement/gesture skills and related training technology - Google Patents

Methods and apparatus for "tangible gesture" and simulation/instruments that may require finger movement/gesture skills and related training technology

Info

Publication number
WO2017168186A1
Authority
WO
WIPO (PCT)
Prior art keywords
user
gesture
finger
tactile
feedback
Prior art date
Application number
PCT/IB2016/000384
Other languages
English (en)
Inventor
Quan Xiao
Original Assignee
Quan Xiao
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Quan Xiao
Publication of WO2017168186A1


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/014 Hand-worn input/output arrangements, e.g. data gloves
    • G06F3/016 Input arrangements with force or tactile feedback as computer generated output to the user
    • G06F3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0482 Interaction with lists of selectable items, e.g. menus

Definitions

  • This invention is about "tangible gesture" and effectively simulating (hand-based input) devices/instruments such as but not limited to keyboards, musical instruments (existing or fictional) or other devices, for the purpose of "tangible" interaction with (simulated) devices/instruments that require interaction via the user's finger/hand/arm gestures/movements, maybe in a coordinated way (such as multiple fingers, or finger movement together with wrist movement); it provides prompt/quick response/feedback (such as but not limited to tactile) for such gestures (of fingers, hand, etc.), together with technologies to provide related training (for example gesture and hand/finger skills, etc.).
  • using gesture-related finger/hand/arm movement and/or position information to determine input (which could simulate a keyboard, musical instrument or other device that needs direct hand/finger operation), and using tactile "patterns"/effects (such as "virtual tactile points" as well as other tactile senses such as frequency, strength and location of vibration), or other tangible/"feelable" feedback/prompts, to help the user "navigate" the device/instrument, for example (but not limited to) locating key(s), button(s) or the component the user can operate with a finger, prompting the limits/boundary of (effective) operations (such as for the simulated control), and providing response or "feedback" output for the user's operation.
  • This also includes using force/pressure/tactile prompts (such as but not limited to dynamic patterns) and other tangible/"feelable" prompts to help teaching/training of the related skills (e.g. finger exercises for piano), which enhances the user's skills and "muscle memory".
  • a (dynamic) pressure pattern consists of a time sequence of frames of static pressure patterns "presented" to the user (for example, it can be used to indicate the direction or movement of something on the displaying surface; see the pattern-playback sketch after this list)
  • IMU here refers to an inertial measurement unit; this is currently usually an integrated-circuit MEMS device/sensor that can provide multiple degree-of-freedom (DOF) inertial and navigational signals such as acceleration (x, y, z), rotation (angular speed around the x, y, z axes), 3D magnetic compass (x, y, z direction), etc.
  • DOF: degree of freedom
  • a first method/means for making gesture input/recognition "context aware" includes:
  • commands or events/messages corresponding to a gesture (such as but not limited to a hand/finger gesture, which could be relative/local or "global") that is categorized based on (multiple) contexts which can be switched (dynamically),
  • the gesture recognition/input system might only "expect" these possible/expected inputs and filter out all others (for example for the "current" or "local" context),
  • the gesture recognition/input system might check for "context determining/switching" commands/signals or "short-cut" signals for some "context-less" activities (such as copy-paste). Once these "non-local context commands" are found, a context switch might be triggered or shortcut/global activities might be performed.
  • the context switch might be triggered by gestures from "lower order" body parts (those segments closer to the body in the connection sequence, such as the arm) taking priority over gestures of "higher order" body parts (such as those of the fingers)
  • the context switch might be triggered by determining the pose or "facing direction"/"pitch" of the back of the hand: for example, when the back of the hand is vertical (to the ground), the context might be switched to "joystick" or "pointing" mode (possibly depending on another finger, such as the index finger's gesture); when the back of the hand is facing up, the context might be switched to "mouse" or "keyboard" mode, depending on another finger such as the index finger's gesture; and when the back of the hand is facing down (palm facing up), the context might be switched to "TV remote control" mode, etc. (see the context-selection sketch after this list).
  • the context switch might use a "hint" or "natural gesture" that comes from the typical action being simulated, such as the (typical) gesture of typing, grabbing a mouse, flipping a page on a touch screen, using a joystick, grabbing a remote control, etc.
  • hand gestures such as "OK", "good" (thumbs up) or thumbs down can be directly interpreted as commands to the system (such as in a dialog, where "OK" or "good" means confirm and thumbs down means cancel).
  • such context-determining or context-switching activity might include (but is not limited to) using the rotation/orientation (or pitch) of the back of the hand to determine which mode/context we are in, and then determining "commands" based on gesture (such as those of the palm, wrist and fingers).
  • an action/command might be a "click" from the pinky, which means bringing up the "context" menu, similar to a right mouse click.
  • feedback (such as but not limited to tactile/force/feelable, visual, audio, etc.) is given to the user after the context has changed.
  • the context might also be determined by the object being interacted with.
  • a wearable platform for controlling or simulating control of device(s), comprising: an elastic wearable frame that can accommodate the user's hand, for example shaped like a glove (it might not cover all fingers or the entire hand);
  • motion or position sensors, such as but not limited to MEMS IMU sensors, magnetic, ultrasound or optical/infrared sensors, located on (or attached to) the "movable sections" of the frame accommodating the corresponding movable parts of the fingers/hand;
  • the elastic wearable frame could use highly breathable materials/fabrics and/or a highly breathable structure, such as a porous surface or a design with multiple holes/openings, to leave part of the skin (of the user's hand) exposed.
  • multi-modal sensors such as flex resistors, optical fiber, and/or other bend/flex sensor systems, as well as magnetic detectors, can be used in addition to position/motion sensors (such as an IMU, gyroscope, or MEMS compass) to precisely determine finger gesture and location.
  • the wearable platform further includes using a "context aware" mechanism for intuitive gesture recognition, which might include (but is not limited to) using the rotation/orientation (or pitch) of the back of the hand, or another "context-determining or context-switching activity/gesture", to determine the context/mode for interpreting the meaning of the user gestures that follow (or are performed simultaneously by other parts of the hand/arm), and then determining the actual meaning based on that context (such as gestures of the palm, wrist and fingers); if it is a valid gesture, then the corresponding "commands" or "events/messages" will be triggered.
  • a universal "context-less" action/command might be a "click" from the pinky, which means bringing up the "context" menu, similar to a right mouse click.
  • "dynamic calibration” might be used by the wearable device to improve the data accuracy from the sensors ⁇ since from some "known” or “well defined” gesture (once we know from other channel) the position of each movable sections or joints of a hand might be determined or inferred (maybe dynamically), it is possible to use these positions (at that moment) as “starting point” or “calibration point” for calibrating the system / reducing the “accumulative error” from using the
  • inertia/acceleration/rotation data for calculating finger positions. For example when know (for example from other message channel) user are making a fist or when all five fingers are spread out straight, then this typical gesture can be used for dynamically calibrate the "model of hand", so that any accumulated errors from EVIU can be removed.
  • a hand information model, such as a "hand skeleton" (3D) model, is established based on the motion/pose information from the sensors and on calibration gesture(s), which could be performed once in the beginning or multiple times (maybe dynamically) from some "known" or "well defined" gesture; for example, when we know from another message channel (or instruct the user to perform such a gesture) that the user is making a fist or that all five fingers are spread out straight, this typical gesture can be used to dynamically calibrate the "model" (it can be done dynamically, as mentioned in the above paragraph)
  • a UI system uses such input to simulate existing or new devices for interacting with a machine in multiple modes.
  • the wearable platform comprises force/pressure/tactile or other "feelable" actuators/effectors to provide "direct"/multi-modal (possibly including visual and audio) feedback based on the gesture, for simulating a "virtual" input device;
  • the method to provide feedback includes:
  • using the sensor data, for example the IMU's acceleration and pose data and magnetic "direction" data (or possibly data from the "hand skeleton model", which captures and integrates these data), maybe together with other data from additional bend/flex sensors or (optionally) optical sensors (or a combination of such data, to get a more reliable location).
  • such relationship data could come from relative positional or movement data, or it could be calculated from the location of each moving section or from the position of the object being interacted with.
  • providing "direct"/multi-modal feedback includes providing a ("specific" or "corresponding") "response" or "feedback" (such as but not limited to pressure/tactile patterns, vibration patterns/strength/frequency changes, etc.) to the user's finger tip with tactile means located on the part of the wearable accommodating the finger tip (and other parts), based on the ("specific") spatial relationship between the finger and the virtual object (such as but not limited to keys, buttons, or boundaries of the operational limit); for example, one activated "spot"/"fixel" in the pattern could mean (barely) contacting the virtual object (at such a location, which might be off-center) but not pressing it (see the fixel-pattern sketch after this list),
  • different "feelable" dynamic patterns might be used to "represent" the relationship between the finger and the object (such as but not limited to a key or button) it interacts with; for example, as in Fig. 2 with the 5-spot pattern (201) (from feelable actuators), one of them could be activated (center, left, right, up, down) to provide tactile feedback simulating the "barely touched" situation,
  • the feelable/tactile/effector means mentioned above could include (but is not limited to): pneumatic, hydraulic, electromagnetic, piezo, and other inducers and micro actuators. It is desirable to provide "static" pressure, but it is also acceptable to use vibration (whose strength and/or frequency might change according to location) to indicate the location of the keys.
  • different shapes (such as flat, round, edgy, hollow) can be represented by the dynamic pattern according to the shape's relation to the finger(s).
  • motion/position sensors and feelable feedback, such as tactile/pressure patterns, are provided on all fingers (finger tips) of the wearable frame
  • the glove tip that fits the finger tip has a little bit of space and does not tightly wrap the finger in the vertical direction, so that the finger tip is not "bound" to the "tactor" but instead only contacts it when the user tries to press some key or "reach out".
  • This can be done by making a pre-shaped glove or related sections (such as but not limited to between the middle section and the end section, and between the lower (base) section of the finger that connects to the palm and the middle section).
  • more than one feelable effector, such as vibration actuators or acoustic transducers (such as electromagnetic or piezo components), might work together (same frequency, similar "phase") to create a "virtual tactile point" in the area between the transducers, or they can work at different frequencies/timings (not coordinated) and create the feeling of 2 or more tactile points simultaneously (see the phantom-point sketch after this list).
  • This is useful to provide prompts and feedback (of the relative position to a key or zone).
  • Fig. 3 provides a more detailed explanation and illustration. It is also possible for the vibration type actuator/transducer to work together with a pneumatic actuator, which might press the flexible "mask" carrying the vibration type actuator/transducer onto the skin of the finger when there is a "contact" or "prompt" that needs to be passed to the user (determined by the control system, for example from the spatial relationship of the finger and the virtual object being interacted with).
  • the wearable frame might work together with camera/optical/infrared sensor(s) to provide "optical tracking/correction" for more precise location requirements, for example in coordination with the visual system.
  • a UI system (such as but not limited to a VR/AR system, wearable UI, or mobile UI) uses such "tangible gesture" input/recognition for input, or to simulate existing or new devices for interacting with a machine in multiple modes.
  • besides indicating keys/buttons, parts of the hand might also have feelable actuators/tactors that can serve as prompts indicating the desired moving/turning direction of that part of the hand (such as 202 in Figure 2), or the desired pose/gesture: the actuator is not activated/prompting when the user's hand/finger on which the actuator sits is already in the desired position/gesture, and when it is not (such as but not limited to drifting from the center of the desired location or approaching/reaching the operation boundary/limit), it prompts the user with tactile feedback (such as but not limited to vibration, a static pressure/pattern, or the movement of the "virtual tactile point" formed as mentioned in earlier paragraphs).
  • These prompters (actuators), not serving the purpose of indicating key/button locations (but rather, more towards the limits/boundary of the desired operation), can all be of the vibration type if needed.
  • the prompt/feedback could take various forms; for example, gesture detection and prompting can be useful for detecting the shape of the hand from the sensors to correct an inaccurate gesture:
  • the tutoring system "expects" the user's movement and/or final hit position of the palm and fingers from the "tutoring material" (because we know what the correct next position is, and approximately what the correct move should be in general); if in a given time period the movement we detect is "not there", "reversed"/"erroneous", or "not enough/too much", then we can judge it as wrong and might not need to judge where it actually hit. This is called the reliable "hit expectation mode" (see the stroke-judging sketch after this list).
  • the simulated keyboard might move with the body, arm or even the palm (which means only the relative position change of the finger to the palm might be considered), or it can be relatively fixed (such as for simulating a piano keyboard; even so, general shaking, drifting or any "whole body global movement" can be filtered out, as we can use sensor data such as an IMU on other parts of the body besides the hands; such a "detachable" sensor might be in the form of an "add-on" to the wearable input system).
  • the recognition/input feature might be set up like this: for a single hand, parallel wrist movement means traveling along the keyboard, so the keyboard is in a sense "semi" fixed to the hand with some "inertia" that allows the user to use a single hand to input across the whole keyboard; the visual/display might provide additional clues for the user for the simulation.
  • a wearable (such as a smart watch) or a mobile device (such as a smart phone) could have a user interface that communicates with this glove-like wearable platform and lets the user use gestures for input on a "virtual keyboard/instrument" this platform simulates.
  • the UI might work together with the virtual keyboard, for example showing/highlighting the keys the user's fingers are over, touching or pressing, in addition to the tactile feedback this glove-like wearable platform provides.
  • when the user performs the action of using a finger to reach a certain key, touch it and press it, the glove-like wearable platform processes the gesture info from sensors such as the IMU, calculates the positional relation of that finger to the virtual keyboard, and determines whether the finger touches any key. If a part of the finger tip, such as the center part, touches the virtual key, the control system then activates a tactile actuator to provide feedback at the center of the finger tip.
  • if the finger is not centrally aligned with the key, say only the left part of the finger tip touches the key, then only the left-side "fixel" or actuator is activated to provide the tactile effect indicating the relative position of the key when touched (but not yet pressed); this resolves the navigation issue, and the key the user touched might be highlighted on the visual system. When the user performs the press action with the finger (meaning a pressing-down action towards the key is detected), the pressure pattern provided to the user's finger changes to reflect the pressed situation: both the left and center "fixels" or actuators might be actuated to provide a tactile effect showing a "pressed" pattern.
  • the tactile effect could be static pressure provided by pneumatic, hydraulic or electromagnetic actuators, but it can also be provided as vibration by inducers or piezo actuators. Furthermore, such effects can be induced by 2 or more transducers to create the illusion of a vibration effect in the middle of the transducers.
  • the gesture recognition could be context aware; for example, when the back of the hand turns sideways, it means not using the keyboard but instead using a joystick, and when the palm faces up, it might mean using a remote key pad.
  • the tutoring system can also expect the user to strike a series of keys. When the user successfully hits the correct key, the system might play the correct sound and provide the correct force feedback. When not entirely correct, the system might give the user "weakened" feedback (or 2 sounds together if the user hits the middle of 2 keys), and when the user hits the wrong key, the system might not provide the correct tactile feedback at all.
  • a 4-"effector"/"fixel"/tactor pattern around the finger can be used to provide a "prompt" for finger movement (such as moving directions, etc.) according to the current position/movement of the finger and the "desired" position/movement, which can be used in skill training (such as piano, typing, or other musical instrument skill training)
  • Fig. 1 shows a glove-like elastic wearable frame with a highly breathable (porous) design that has multiple holes/openings (106) exposing part of the skin of the user's hand.
  • Micro motion/position sensors 103, linked to controller 105, are embedded in the skin of the "movable sections" of the frame accommodating the corresponding movable parts of the fingers/hand.
  • Controller 105, which processes information from the sensors on the elastic wearable frame, may use communication channels (such as but not limited to WiFi, mobile, Bluetooth, infrared, or NFC) to interact with other systems to provide input/interaction; it may have processing capabilities to run gesture recognition locally (or in coordination with a remote UI system), it might have related capacities (such as storage and management) for running 3rd-party apps, and it may further have a display screen.
  • Fig. 2 shows some possible arrangements of "effectors"/"fixels"/micro tactors on the glove-like wearable frame so that the finger tip and other parts of the finger can feel them (and the dynamic pattern they form) when activated.
  • 201 shows a 5-spot pattern that can be used to simulate different situations in touching an object: no spot is activated when nothing is touched; 1 spot could be activated (center, left, right, up, down) to provide tactile feedback simulating the "barely touched" situation; and more adjacent spots can be further actuated to simulate the "pressed" situation. It is also possible to simulate a "gap" when the two spots on the sides (L-R, or U-D) are actuated but not the center, which could simulate a finger hitting/passing the gap between 2 keys.
  • the simulated pattern changes according to the relationship between the specific finger and the virtual object(s) (such as keys) being simulated.
  • 202 shows another form of pattern: 4 "effectors"/"fixels"/tactors around the finger that can be used to provide a "prompt" for finger movement (such as moving directions, etc.) according to the current position/movement of the finger and the "desired" position/movement, which can be used in skill training (such as piano, typing, or other musical instrument skill training)
  • Fig. 3 shows using "vibration" type fixels/tactors/effectors to create a "virtual tactor" or "phantom actuator" as described in Ali Israr and Ivan Poupyrev's paper
  • a) shows a typical pair of vibration actuators/tactors 301 arranged at the finger tip (on the elastic frame), where a "virtual tactor" (or "tactile brush") simulating a "virtual touch point" 302 can be created between them (under the correct conditions, as described in Ali Israr and Ivan Poupyrev's paper), and the location and strength of the "virtual touch point" can be moved/controlled according to the relationship with the virtual (3D) object (such as but not limited to a key or button).
  • b) shows an innovative "3 spot" system based on a), creating a virtual touch point 302 in the area between them; or it can simulate 2 or 3 separate spots simultaneously using the method described above.
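
A minimal sketch, in Python, of the dynamic pressure pattern described above (the pattern-playback sketch): a time sequence of static frames, each frame being a set of activated spots. The frame duration, spot names, and the `activate` callback are illustrative assumptions rather than details from the patent.

```python
import time

# One frame per spot: sweeping left -> center -> right can suggest
# rightward movement of something on the displaying surface.
RIGHTWARD_SWEEP = [{"left"}, {"center"}, {"right"}]

def play_pattern(frames, frame_ms=80, activate=print):
    """Present each static frame in turn; `activate` stands in for
    whatever routine energizes the actuators for one frame."""
    for frame in frames:
        activate(frame)                 # energize the spots in this frame
        time.sleep(frame_ms / 1000.0)   # hold the static frame briefly

play_pattern(RIGHTWARD_SWEEP)
```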
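A minimal sketch (the context-selection sketch) of the "context aware" mechanism above: the pose of the back of the hand selects an input context, and only gestures expected in that context are accepted, apart from "context-less" shortcuts such as the pinky click. The pitch threshold, mode names, and gesture vocabulary are assumptions for illustration.

```python
from dataclasses import dataclass

@dataclass
class HandPose:
    back_pitch_deg: float  # 0 = back of hand facing up, 90 = back vertical
    palm_up: bool          # True when the palm faces up (back facing down)

def select_context(pose, index_extended):
    """Map back-of-hand orientation (plus an index-finger hint) to a mode."""
    if pose.palm_up:
        return "tv_remote"                    # back of hand facing down
    if pose.back_pitch_deg > 60.0:            # back of hand roughly vertical
        return "pointing" if index_extended else "joystick"
    return "keyboard" if index_extended else "mouse"

EXPECTED = {
    "keyboard":  {"key_press", "key_release"},
    "mouse":     {"move", "click"},
    "joystick":  {"tilt", "trigger"},
    "pointing":  {"point", "select"},
    "tv_remote": {"button_press"},
}

def interpret(context, gesture):
    """Filter out gestures not expected in the current context, but let
    "context-less" shortcut signals through (here, the pinky click)."""
    if gesture == "pinky_click":              # universal context-menu shortcut
        return "open_context_menu"
    return gesture if gesture in EXPECTED.get(context, set()) else None
```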
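A sketch of the dynamic-calibration idea (the calibration sketch), under stated assumptions: joint angles dead-reckoned from per-segment gyro rates accumulate drift, and recognizing a "known" gesture (a fist, or all fingers spread) lets the model snap back to that canonical pose, discarding the accumulated error. The joint layout and canonical angles are invented for illustration.

```python
FINGERS = ("thumb", "index", "middle", "ring", "pinky")

CANONICAL_POSES = {
    # per-finger flexion angles in degrees: (base, middle, tip) joints
    "fist":   {f: (90.0, 100.0, 70.0) for f in FINGERS},
    "spread": {f: (0.0, 0.0, 0.0) for f in FINGERS},
}

class HandSkeleton:
    """Tiny stand-in for the "hand skeleton" model."""

    def __init__(self):
        self.joint_angles = dict(CANONICAL_POSES["spread"])

    def integrate_gyro(self, finger, rates_dps, dt):
        """Dead-reckon one finger's joint angles from angular rates
        (degrees/second); this is where drift accumulates."""
        self.joint_angles[finger] = tuple(
            angle + rate * dt
            for angle, rate in zip(self.joint_angles[finger], rates_dps)
        )

    def calibrate(self, known_gesture):
        """Snap to a known pose, zeroing the accumulated IMU error."""
        if known_gesture in CANONICAL_POSES:
            self.joint_angles = dict(CANONICAL_POSES[known_gesture])
```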
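A sketch of the 5-spot "fixel" logic (the fixel-pattern sketch): which spots are driven depends on where the finger tip sits relative to the virtual key and on whether the key is merely touched or pressed, covering the off-center, pressed, and between-keys "gap" cases described above. The key geometry and thresholds are illustrative assumptions.

```python
def fixel_pattern(finger_x_mm, key_center_mm, key_width_mm, state):
    """Return the set of fixels to activate for a horizontal offset;
    the up/down spots would be handled analogously on the other axis."""
    offset = finger_x_mm - key_center_mm
    half = key_width_mm / 2.0
    if abs(offset) > half:
        # over the gap between two keys: side spots only, center silent
        return {"left", "right"} if state in ("touch", "press") else set()
    if state == "touch":                       # barely touched: one spot
        if offset < -half / 2.0:
            return {"left"}
        if offset > half / 2.0:
            return {"right"}
        return {"center"}
    if state == "press":                       # pressed: adjacent spots join
        if offset < -half / 2.0:
            return {"left", "center"}
        if offset > half / 2.0:
            return {"right", "center"}
        return {"left", "center", "right"}
    return set()                               # no contact: nothing active

# Example: finger slightly left of an 18 mm wide key, touched but not pressed.
print(fixel_pattern(-5.0, 0.0, 18.0, "touch"))   # {'left'}
```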
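The phantom-point sketch below uses the energy-based amplitude interpolation from Israr and Poupyrev's "Tactile Brush" work, which the text cites: two tactors at the same frequency and similar phase, driven with amplitudes sqrt(1-b) and sqrt(b) times the virtual amplitude, are perceived as a single vibration at normalized position b between them. Treat the exact formula as an assumption here; the description only requires that a "virtual tactile point" can be placed and moved between transducers.

```python
import math

def phantom_amplitudes(b, virtual_amplitude):
    """Amplitudes for two physical tactors (at normalized positions 0 and 1)
    so a phantom vibration of the given strength is felt at position b."""
    b = min(max(b, 0.0), 1.0)
    a1 = math.sqrt(1.0 - b) * virtual_amplitude   # tactor at position 0
    a2 = math.sqrt(b) * virtual_amplitude         # tactor at position 1
    return a1, a2

# Sweep the phantom point across the finger tip, e.g. as a moving
# directional cue (a travelling "virtual tactile point").
for step in range(5):
    a1, a2 = phantom_amplitudes(step / 4.0, 1.0)
    print(f"b={step / 4.0:.2f}  tactor1={a1:.2f}  tactor2={a2:.2f}")
```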
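Finally, the stroke-judging sketch: a minimal version of the "hit expectation mode", classifying a detected stroke against the expected hit position as absent, reversed, insufficient, excessive, or correct, without needing to resolve exactly which key was hit. The tolerance value is an illustrative assumption.

```python
def judge_stroke(start_mm, end_mm, expected_mm, tolerance_mm=5.0):
    """Classify one finger stroke against the expected hit position."""
    moved = end_mm - start_mm
    needed = expected_mm - start_mm
    if abs(moved) < 1e-6:
        return "not there"     # no movement detected in the time window
    if moved * needed < 0:
        return "reverse"       # moved away from the expected key
    if abs(end_mm - expected_mm) <= tolerance_mm:
        return "correct"       # full feedback: correct sound and tactile
    return "not enough" if abs(moved) < abs(needed) else "too much"

# Example: the expected key is 30 mm to the right, but the stroke stops short.
print(judge_stroke(0.0, 18.0, 30.0))   # "not enough"
```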

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present invention relates to "tangible gesture" and to effectively simulating devices such as, but not limited to, keyboards, musical instruments or other devices for the purpose of "tangible" interaction with simulated devices that require interaction via the user's finger/hand/arm gestures/movements, possibly in a coordinated way, providing prompt/quick response/feedback for such gestures, and to technologies for providing related training. More precisely, the devices use gesture-related finger/hand/arm movement and/or position information to determine input, and use a tactile "pattern" or prompt, or other tangible/"feelable" feedback/prompts, to help the user "navigate" the device/instrument, for example (but not limited to) locating keys, buttons or the component the user can operate with a finger, prompting the limits/boundary of (effective) operations (such as for the simulated control), and providing response or "feedback" output for the user's operation. Force/pressure/touch and other tangible/"feelable" prompts are also used to help teaching/training of the related skills (for example, finger exercises for piano), which enhance the user's skills and "muscle memory".
PCT/IB2016/000384 2016-03-28 2016-03-28 Methods and apparatus for "tangible gesture" and simulation/instruments that may require finger movement/gesture skills and related training technology WO2017168186A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201662139718P 2016-03-28 2016-03-28
US 62/139,718 2016-03-28

Publications (1)

Publication Number Publication Date
WO2017168186A1 (fr) 2017-10-05

Family

ID=59963577

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2016/000384 WO2017168186A1 (fr) 2016-03-28 Methods and apparatus for "tangible gesture" and simulation/instruments that may require finger movement/gesture skills and related training technology

Country Status (1)

Country Link
WO (1) WO2017168186A1 (fr)



Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101038504A (zh) * 2006-03-16 2007-09-19 Xu Feng Human manipulation method, software and hardware device
US20130093703A1 (en) * 2010-03-30 2013-04-18 Korea Institute Of Science And Technology Tactile transmission system using glove type actuator device and method thereof
CN102323856A (zh) * 2011-08-09 2012-01-18 Dalian Nationalities University Data glove based on acceleration sensors and ZigBee
US20140134575A1 (en) * 2012-11-15 2014-05-15 Samsung Electronics Co., Ltd Wearable device to represent braille and control method thereof
CN103226398A (zh) * 2013-03-25 2013-07-31 Shanghai Jiao Tong University Data glove based on micro-inertial sensor network technology
CN103677289A (zh) * 2013-12-09 2014-03-26 Shenzhen Institutes of Advanced Technology, Chinese Academy of Sciences Intelligent interactive glove and interaction method
CN103677273A (zh) * 2013-12-21 2014-03-26 南通芯迎设计服务有限公司 Smart home control system based on infrared technology

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3506059A1 (fr) * 2017-12-28 2019-07-03 Immersion Corporation Systems and methods for providing haptic effects related to touching and grasping a virtual object
CN109976509A (zh) * 2017-12-28 2019-07-05 Immersion Corporation Systems and methods for providing haptic effects related to touching and grasping a virtual object
US10583359B2 (en) 2017-12-28 2020-03-10 Immersion Corporation Systems and methods for providing haptic effects related to touching and grasping a virtual object
CN108789454A (zh) * 2018-08-17 2018-11-13 成都跟驰科技有限公司 Control system for an automobile equipped with a robotic arm
CN110515509A (zh) * 2018-08-17 2019-11-29 中山叶浪智能科技有限责任公司 Gesture interaction method, system, platform and storage medium that avoids exceeding the field of view
CN110515509B (zh) * 2018-08-17 2023-01-13 中山叶浪智能科技有限责任公司 Gesture interaction method, system, platform and storage medium that avoids exceeding the field of view
CN110989838A (zh) * 2019-11-29 2020-04-10 Vivo Mobile Communication Co., Ltd. Wearable device, control method therefor, and computer-readable storage medium
CN115273583A (zh) * 2022-05-16 2022-11-01 Union Hospital, Tongji Medical College, Huazhong University of Science and Technology Mixed-reality-based multi-person interactive orthopedic clinical teaching method

Similar Documents

Publication Publication Date Title
US10564730B2 (en) Non-collocated haptic cues in immersive environments
US11144121B2 (en) Wearable interactive user interface
JP6553136B2 (ja) Systems and methods for multi-pressure interaction on touch-sensitive surfaces
US10509469B2 (en) Devices for controlling computers based on motions and positions of hands
WO2017168186A1 (fr) Methods and apparatus for "tangible gesture" and simulation/instruments that may require finger movement/gesture skills and related training technology
JP6314134B2 (ja) User interface for robot training
JP6275839B2 (ja) Remote control device, information processing method and system
JP6545955B2 (ja) Haptic device incorporating stretch characteristics
WO2016097841A2 (fr) Methods and apparatus for a highly intuitive human-computer interface and human-centered, wearable "hyper" user interface that could be cross-platform/cross-device and possibly with local "feelable"/tangible feedback
WO2017173386A1 (fr) Human-computer interface system
US20020126026A1 (en) Information input system using bio feedback and method thereof
KR20170069936A (ko) Systems and methods for location-based haptic effects
JP2019537084A (ja) Touch-sensitive keyboard
RU179301U1 (ru) Virtual reality glove
RU187548U1 (ru) Virtual reality glove
JP2008203911A (ja) Pointing device and computer
KR102170638B1 (ko) Method for controlling interaction in virtual reality by tracking finger joints, and VR system using the same
KR20110044391A (ко) Input device and method
KR20020072081A (ко) Input device using finger movements and method therefor
KR102239469B1 (ко) Object control method and object control apparatus
RU2670649C9 (ru) Method of manufacturing a virtual reality glove (variants)
CN110609615A (zh) System and method for integrating haptic overlays in augmented reality
JP2006031732A (ja) Signal input device and force-to-electricity conversion device
KR102322968B1 (ко) Command input device based on the user's hand gestures and command input method using the same
JP2018181185A (ja) Force sense presentation device and virtual force sense presentation apparatus

Legal Events

Date Code Title Description
NENP Non-entry into the national phase

Ref country code: DE

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16896662

Country of ref document: EP

Kind code of ref document: A1

32PN Ep: public notification in the ep bulletin as address of the addressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 19.2.19)

122 Ep: pct application non-entry in european phase

Ref document number: 16896662

Country of ref document: EP

Kind code of ref document: A1