WO2022200686A1 - An apparatus and a method for detecting a hand in contact with an object based on haptic feedback from wrist - Google Patents


Info

Publication number
WO2022200686A1
Authority
WO
WIPO (PCT)
Prior art keywords
hand
wristband
wrist
interaction
signals
Prior art date
Application number
PCT/FI2022/050186
Other languages
French (fr)
Inventor
Jamin HU
Eemil Visakorpi
Lauri Tuominen
Ville KLAR
Felix Bade
Original Assignee
Port 6 Oy
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Port 6 Oy filed Critical Port 6 Oy
Publication of WO2022200686A1 publication Critical patent/WO2022200686A1/en


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/014 Hand-worn input/output arrangements, e.g. data gloves
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00 Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16 Constructional details or arrangements
    • G06F1/1613 Constructional details or arrangements for portable computers
    • G06F1/163 Wearable computers, e.g. on a belt
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/015 Input arrangements based on nervous system activity detection, e.g. brain waves [EEG] detection, electromyograms [EMG] detection, electrodermal response detection
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00 Manipulating 3D models or images for computer graphics
    • G06T19/006 Mixed reality
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/764 Arrangements for image or video recognition or understanding using pattern recognition or machine learning using classification, e.g. of video objects
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/60 Type of objects
    • G06V20/64 Three-dimensional objects
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/20 Movements or behaviour, e.g. gesture recognition
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M1/00 Substation equipment, e.g. for use by subscribers
    • H04M1/72 Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724 User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72448 User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions
    • H04M1/72454 User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions according to context-related or environment-related conditions

Definitions

  • Various example embodiments generally relate to the field of wearable devices.
  • some example embodiments relate to determining a type of a hand interaction with an object based on haptic feedback from the object detected by an apparatus comprising at least one sensor attached to a wristband and configured to detect signals from a wrist.
  • a computer can be operated with a mouse or a keyboard, or a game running on a game console can be controlled with a handheld controller.
  • the user interface devices are controlled with hands, wherein the user interface device may comprise sensors and/or buttons for receiving inputs from the user based on the user moving the interface device or pressing the buttons.
  • the user interface devices may be dedicated for control of a specific device or devices.
  • the user interface devices may not be always at hand when needed and they may be prone to defects caused by, for example, faulty electronics or dead batteries.
  • Example embodiments provide an apparatus and a method for detecting when a hand of a user is in contact with an object.
  • the hand interaction with the object is detected with at least one sensor of a wristband configured to monitor haptic feedback from the object resulting from the hand contact.
  • the object may not comprise electronic means for providing the haptic feedback.
  • the apparatus may comprise a signal processing device configured to determine based on the sensor data which type of hand-object interaction the user performed and/or determine an input to a device based on the sensor data.
  • the user may manipulate any object in order to provide inputs for devices in a simple and intuitive manner, wherein the inputs may result from the touches the user makes with the objects.
  • an apparatus may comprise a wristband; at least one sensor mounted at an inner circle of the wristband and configured to detect signals from a wrist of a user wearing the wristband; at least one processor; and at least one memory comprising program code which, when executed on the at least one processor, causes the apparatus at least to determine a hand interaction with an object based on characteristics of the signals detected by the at least one sensor from the wrist indicating physical changes on the wrist resulting from the object in contact with the hand; and output data on the hand interaction with the object to be used as an input to a device.
  • the at least one memory and the program code are further configured to, when executed on the at least one processor, cause the apparatus to determine a type of the hand interaction with the object based on characteristics of the signals; and determine the input for the device based on the type of the hand interaction with the object.
  • the type of hand interaction comprises at least one of the hand touching the object, the hand holding the object, the object in contact with the hand making a sound, an impact of the object to the hand, the hand moving the object, the hand sliding on a surface of the object, the hand tapping the object, the hand touching an object with the object, or the hand releasing the object.
  • the at least one sensor mounted at the inner circle of the wristband comprises at least one of a microphone, an accelerometer, an acoustic transducer, an optical sensor, or an electrode.
  • the apparatus further comprises at least one sensor mounted at an outer circle of the wristband and configured to detect signals from other sources than the wrist; and wherein determining the hand interaction with the object is further based on signals received from the at least one sensor mounted at the outer circle.
  • the at least one sensor mounted at the outer circle of the wristband comprises at least one of a microphone configured to detect sound waves from the air, an accelerometer configured to monitor vibrations on the wristband, or a camera configured to monitor the hand.
  • the at least one memory and the program code are further configured to, when executed on the at least one processor, cause the apparatus to detect an indication of a falsely detected hand interaction with an object based on signals received from the at least one sensor mounted at the outer circle of the wristband.
  • the at least one memory and the program code are further configured to, when executed on the at least one processor, cause the apparatus to determine a time of the hand interaction based on a detection time of the signals from the at least one sensor mounted at the inner circle of the wristband.
  • the at least one sensor mounted at the inner circle of the wristband is configured to detect signals in response to at least one of vibration on the wrist, interior waves from the wrist, surface waves from the wrist, changes in a shape of a skin surface of the wrist, changes in a tendon pattern of the wrist, or deformation of an electric field on the wrist.
  • the wristband comprises a plurality of sensors mounted at the inner circle and wherein the at least one memory and the program code are further configured to, when executed by the at least one processor, cause the apparatus to localize the hand interaction with the object based on a signal profile combination on the plurality of sensors.
  • the at least one memory and the program code are further configured to, when executed by the at least one processor, cause the apparatus to determine a type of the object the hand interacts with based on the signals from the at least one sensor mounted at the inner and/or outer circle of the wristband.
  • the type of object is determined based on at least one of weight of the object, surface texture of the object or acoustic properties of the object determined based on the signals.
  • a method to detect a hand interaction with an object may be executed with the apparatus according to the first aspect.
  • the method may comprise determining a hand interaction with an object based on characteristics of signals detected from a wrist indicating physical changes on the wrist resulting from the object in contact with the hand by at least one sensor mounted at an inner circle of a wristband.
  • a computer program is configured, when executed by an apparatus, to cause the apparatus at least to determine a hand interaction with an object based on characteristics of signals detected from a wrist indicating physical changes on the wrist resulting from the object in contact with the hand by at least one sensor mounted at an inner circle of a wristband.
  • a computer-readable medium comprising instructions which, when executed by the apparatus of the first aspect, cause the apparatus to execute the method of the second aspect.
  • FIG. 1 illustrates an example of a wristband comprising a sensor configured to detect a hand interaction with an object, according to an embodiment
  • FIG. 2 illustrates an example of a wristband comprising different types of sensors configured to detect a hand interaction with an object, according to an embodiment
  • FIG. 3A illustrates an example of a wristband comprising a plurality of sensors mounted along the wristband for detection of a hand interaction with an object, according to an embodiment
  • FIG. 3B illustrates the wristband of FIG. 3A depicted from another perspective, according to an embodiment
  • FIG. 4A illustrates another example of a wristband comprising a plurality of sensors mounted along the wristband for detection of a hand interaction with an object, according to an embodiment
  • FIG. 4B illustrates the wristband of FIG. 4A depicted from another perspective, according to an embodiment
  • FIG. 5 illustrates a first example of a type of interaction a wristband is configured to detect wherein an object and a hand are in contact, according to an embodiment
  • FIG. 6 illustrates a second example of a type of interaction a wristband is configured to detect wherein an object and a hand are in contact, according to an embodiment
  • FIG. 7 illustrates a third example of a type of interaction a wristband is configured to detect wherein an object and a hand are in contact, according to an embodiment
  • FIG. 8 illustrates a fourth example of a type of interaction a wristband is configured to detect wherein an object and a hand are in contact, according to an embodiment
  • FIG. 9 illustrates a first example use case of a wristband configured to detect hand interaction with an object, according to an embodiment
  • FIG. 10 illustrates a second example use case of a wristband configured to detect hand interaction with an object, according to an embodiment
  • FIG. 11 illustrates a third example use case of a wristband configured to detect hand interaction with an object, according to an embodiment
  • FIG. 12 illustrates an example embodiment of an apparatus configured to practice one or more example embodiments
  • FIG. 13 illustrates an example of a method for detecting hand interaction with an object, according to an embodiment.
  • User input wristbands may continuously register commands by detecting a position of a hand or by detecting a specific gesture of the hand.
  • the hand position-based detection may not take into account physical interaction with objects, which may give valuable information. This may involve knowing the exact moment the hand touches something, detecting what type of object the hand is in contact with, as well as detecting how the foreign object reacts to the hand's manipulation.
  • a hand movement alone is not a very intuitive way of interacting.
  • Tactile feedback, or haptic feedback, may refer to a physical response on an object from a user input.
  • the weight of a book, for example, causes resistance to the movement of the hand, which may be detected by the user as tactile feedback from the book.
  • the tactile feedback may transmit information to the user, for example, with vibration patterns or waves.
  • Optical hand tracking may be used to improve hand interaction detection, but it faces a similar issue. Even though an optical hand tracking device may 'see' its surroundings, including the hand, as well as what the hand is holding, it may be hard to see the precise position of the hand and exactly what the hand and its fingers are doing due to the hand becoming occluded by the object it is holding. It may be difficult, for example, to see whether or not a trigger of a water gun is pulled and the exact time of the triggering using optical hand tracking only.
  • the wristband may comprise at least one sensor configured to be in contact with skin of a wrist of the user.
  • the wristband may comprise a plurality of different types of sensors for improved accuracy of the detection.
  • an apparatus may detect how an object reacts haptically (through vibrations and sound, for example) to what the hands of a user are doing to it.
  • the apparatus may be able to tell the exact moment the user has touched the object, and/or the moment a trigger on the object has been clicked, and/or whether or not the user's fingers are sliding across a surface of the object or just hovering.
  • this may enable using everyday objects as controllers.
  • the object may not need to comprise electronics or be powered on to produce any feedback, but the haptic feedback may be received from the object inherently in response to contacting the hand of the user.
  • an apparatus may enable controlling digital content, for example in augmented reality (AR) or in virtual reality (VR), without a need for dedicated controllers.
  • a user may 'borrow' everyday objects and use them to control a digital environment the user sees through AR or VR.
  • the user may be able to use a real-life toilet paper roll as a light sabre in a Star Wars video game.
  • the user may be able to use a plate as a controller for a virtual Formula 1 car, or a water gun as a virtual pistol.
  • a further possible use case comprises using an old wireless mouse that does not have batteries anymore, as a clicker for a slideshow.
  • these physical objects may become empty interaction shells that may map interactions to digital content once the objects are physically sensed in real life by the user and the wristband.
  • FIG. 1 illustrates an example of a wristband 100 comprising at least one sensor 102 configured to detect a hand interaction with an object, according to an embodiment.
  • the at least one sensor 102 may be placed at a suitable position to extract signals (e.g. light, sound, conductivity, temperature, and/or weight) that reveal as much as possible about what a hand wearing the wristband 100 is interacting with.
  • Positions of the one or more sensors, and the sensor types, may depend on which types of signals the sensors are configured to monitor. In an embodiment, some sensors may be configured to monitor physiological signals from the wrist based on tendon and skin surface movements.
  • Some sensors may be configured to listen to acoustic signals traveling along the hand, such as vertical and/or horizontal movement of acoustic waves originating from the object interacting with the hand and detected from the wrist.
  • One or more sensors 102 may be configured to be in contact with skin of the wrist and detect at least one of interior waves or surface waves from the wrist. For example, a sound originating from the object may induce the interior and/or surface waves, which may be sensed via the hand by the wristband 100.
  • a vertical movement in relation to the skin surface caused by the interior waves may be detected from the wrist, for example, with an acoustic transducer.
  • a horizontal movement in relation to the skin surface caused by the surface waves may be detected from the wrist, for example, with an accelerometer in contact with the wrist skin.
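The separation of skin-normal ("interior wave") and tangential ("surface wave") components described above can be sketched with a simple vector projection. This is an illustrative helper only, not taken from the patent; it assumes the sensor's skin-normal direction is known as a unit vector.

```python
# Sketch: split a 3-axis acceleration sample into the component normal to the
# skin surface and its tangential remainder. Assumes `normal` is a unit vector
# pointing away from the skin (a hypothetical calibration input).

def split_components(sample, normal):
    """Project an (x, y, z) acceleration onto the skin normal and return
    (normal_part, tangential_part)."""
    dot = sum(s * n for s, n in zip(sample, normal))
    normal_part = tuple(dot * n for n in normal)
    tangential_part = tuple(s - p for s, p in zip(sample, normal_part))
    return normal_part, tangential_part

# Example: skin normal along z; a sample containing both components.
normal_part, tangential_part = split_components((0.3, 0.0, 1.2), (0.0, 0.0, 1.0))
print(normal_part)      # (0.0, 0.0, 1.2)
print(tangential_part)  # (0.3, 0.0, 0.0)
```

The normal component would then feed the "interior wave" channel and the tangential component the "surface wave" channel.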
  • the waves may indicate a haptic feedback from the object in contact with the hand.
  • the haptic feedback may be received in response to the hand manipulating the object or in response to the hand initiating a contact with the object.
  • the object may not comprise any electronics for haptics, but the haptic feedback may occur naturally based on the interaction between the hand and the object.
  • the at least one sensor 102 may comprise a microphone.
  • the microphone may be con figured to be in contact with skin on a dorsal side of the wrist.
  • the wristband 100 may house the microphone.
  • the wristband 100 may be configured to sit comfortably tight around the wrist and in contact on the wrist.
  • the wristband 100 may further house a signal processing device.
  • the at least one sensor 102, such as the microphone, may pick up vibrations such as the interior and/or surface waves caused by touch events on the skin of the hand.
  • the at least one sensor 102 may be configured to send at least characteristics of the waves to a processing unit to be classified and/or further processed.
  • the signals comprising the characteristics of the waves may be dispatched to an external signal processing device, for example, in a wireless manner.
  • the signal processing device may be trained to classify hand interactions with objects based on previously known hand interactions using a learning model, such as k-means clustering, similarity matching, or another machine learning method. Once the model has learned to classify hand interactions with objects from a few original users, transfer learning may be used for generalization so that the classifier may not need to be retrained for each new user. At first use, a touch calibration sequence may be performed by the new user.
  • the classification model may be trained to classify many sorts of hand interaction events with objects, such as touch events, gestures, touch release events, slide events, sound events, and the like.
  • Signals received by the at least one sensor indicative of the hand interaction events with objects may be labelled, for example, in order to train the supervised machine learning model.
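The classification of labelled signal windows can be sketched as follows. The features (RMS energy and zero-crossing rate) and the nearest-centroid rule are illustrative assumptions standing in for whatever model the apparatus actually uses; the patent does not specify them.

```python
import math

def features(window):
    """Two toy features of a signal window: RMS energy and zero-crossing rate."""
    rms = math.sqrt(sum(x * x for x in window) / len(window))
    zc = sum(1 for a, b in zip(window, window[1:]) if a * b < 0) / (len(window) - 1)
    return (rms, zc)

def train_centroids(labelled_windows):
    """Average the feature vectors per label (a simple stand-in for the
    patent's unspecified supervised model)."""
    sums, counts = {}, {}
    for label, window in labelled_windows:
        f = features(window)
        s = sums.setdefault(label, [0.0, 0.0])
        s[0] += f[0]
        s[1] += f[1]
        counts[label] = counts.get(label, 0) + 1
    return {lab: (s[0] / counts[lab], s[1] / counts[lab]) for lab, s in sums.items()}

def classify(window, centroids):
    """Assign the label whose centroid is nearest in feature space."""
    f = features(window)
    return min(centroids, key=lambda lab: (f[0] - centroids[lab][0]) ** 2
                                          + (f[1] - centroids[lab][1]) ** 2)

# Synthetic training data: a tap is a short high-energy burst; a slide is a
# low-energy oscillation.
tap = [0.0, 0.9, -0.8, 0.6, -0.3, 0.1, 0.0, 0.0]
slide = [0.1, -0.1, 0.1, -0.1, 0.1, -0.1, 0.1, -0.1]
centroids = train_centroids([("tap", tap), ("slide", slide)])
print(classify([0.0, 1.0, -0.7, 0.5, -0.2, 0.1, 0.0, 0.0], centroids))  # tap
```

A real system would use many more labelled windows per event type and richer features, but the train-then-classify shape is the same.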
  • the events may be manually labelled.
  • touch events may be labelled by asking the user to tap on various surfaces with one hand wearing the wristband 100 while simultaneously pressing a button with their other, free hand whenever a touch with the hand wearing the wristband 100 occurs. This may produce labels as discrete touch events which may be used to train the machine learning model.
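The button-press labelling procedure above can be sketched as aligning press timestamps with the sensor stream. Window length and the timestamps are illustrative assumptions.

```python
# Sketch: mark each sensor window 'touch' if a button press from the free hand
# falls inside it, otherwise 'no_touch'. Times are in seconds (hypothetical).

def label_windows(window_starts, window_len, press_times):
    labels = []
    for start in window_starts:
        hit = any(start <= t < start + window_len for t in press_times)
        labels.append("touch" if hit else "no_touch")
    return labels

# Windows of 0.5 s starting every 0.5 s; the user pressed the button at
# 0.7 s and 2.1 s.
print(label_windows([0.0, 0.5, 1.0, 1.5, 2.0], 0.5, [0.7, 2.1]))
# ['no_touch', 'touch', 'no_touch', 'no_touch', 'touch']
```

The resulting label list pairs directly with the sensor windows to form the supervised training set.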
  • optical hand tracking cameras in an AR heads up display may be used to label tap/touch events or slide events.
  • the user may be instructed to first demonstrate tap events in a way that the taps made by fingers are not occluded from the optical hand tracking view so that the optical hand tracking may approximate and determine the tap event to generate a label for the occurrence of a tap which is also measured by the wristband.
  • the generated labels may be then used to detect tap events when the tap is occluded from the optical hand tracking view.
  • touch screens and/or electrically conductive tape of varying textures may be used to label, for example, touch and/or slide events.
  • the user may be instructed to tap the touch screen in different hand orientations.
  • the touch screen may register the events, which may be used as labels for the machine learning model to be able to detect such events even on surfaces that are not touch screens, as long as the acoustic signature of the surface to be touched is similar or can be generalised to.
  • in the wireless mouse use case, labels for click events can be collected by using the mouse itself while it is still powered on and electrically operational.
  • the user may be asked to use the mouse normally with the hand that has the wristband 100 on.
  • the clicks digitally produced by the mouse may be collected and used as labels for the machine learning model for acoustic signals of the wristband.
  • the user may switch off the mouse and continue using it as a mouse (with the aid of computer vision hand tracking to sense the location of the mouse) now that the wristband has learned what a click sounds like in terms of the acoustic signals sensed from the wrist via the wristband 100.
  • robustness of the classification may be improved by introducing noise to the machine learning training process.
  • the user may tap their wristband and/or move their hand around to introduce noise.
  • the machine learning algorithm may then have labels that indicate whether a received signal was a true hand interaction event with an object or noise, and it may learn to filter out the noise as long as there is a significant difference in the signal.
  • the wristband may comprise, for example, infrared optical distance sensors configured to provide the machine learning system with information on how tightly the wristband is placed against the skin.
  • the infrared optical sensors may detect when an impact, such as a tap, is received by the wristband (and not the hand) when at least some of the infrared optical sensors are pressed tighter to the skin in response to the impact.
  • the machine learning algorithm may be continuously observing both the sensors configured to detect the hand-object interactions and the sensors configured to detect the impacts to the wristband.
  • the algorithm may learn to distinguish these false taps (to the wristband) from a true tap (to an object in hand) because the tightness information provided by the infrared sensor may provide this new information that the acoustic sensor may not have on its own.
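The fusion idea above can be sketched as a simple rule: an acoustic tap detection is vetoed when the tightness channel spikes at the same time, suggesting a knock on the wristband itself. A trained model would learn this boundary from labels; the fixed threshold here is purely an illustrative assumption.

```python
# Sketch: veto acoustically detected taps when the wristband-tightness channel
# spikes simultaneously. The threshold value is a hypothetical placeholder.

TIGHTNESS_SPIKE = 0.5  # assumed threshold on the change in tightness reading

def classify_tap(acoustic_tap_detected, tightness_delta):
    """Combine an acoustic detection with the tightness change over the same
    window to separate object taps from knocks on the wristband."""
    if not acoustic_tap_detected:
        return "no_tap"
    if abs(tightness_delta) > TIGHTNESS_SPIKE:
        return "false_tap_on_wristband"
    return "tap_on_object"

print(classify_tap(True, 0.1))   # tap_on_object
print(classify_tap(True, 0.9))   # false_tap_on_wristband
print(classify_tap(False, 0.9))  # no_tap
```

In the learned version, the tightness delta would simply be another input feature rather than a hard-coded veto.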
  • Touch events of at least medium strength and/or velocity, as well as slide events, may be detected from an object contact with any part of the hand or even the forearm.
  • the touch event may refer to when the user touches an object or an item with their hand, or the object or item touches the user's hand.
  • the slide event may refer to the user moving their hand or finger along a surface of an object or item.
  • vibration of a click from a real button may be detected.
  • the vibration of the click may be caused by movement of mechanical components causing the click sound.
  • the wristband 100 may house a plurality of sensors to enable detection of touch release events and/or to localise any of the interaction events being detected.
  • FIG. 2 illustrates an example of a wristband 100 comprising a plurality of sensors 102, 104, according to an embodiment.
  • the plurality of sensors 102, 104 may comprise, for example, at least one microphone and at least one accelerometer.
  • the at least one microphone may be mounted on a dorsal side of the wristband 100.
  • the at least one accelerometer may be mounted on an opposite side of the wristband 100 than the microphone such that the microphone is mounted on an inner circle of the wristband 100 and the accelerometer on an outer circle of the wristband 100.
  • the at least one microphone may be configured to be in contact with skin.
  • the at least one accelerometer sensor may improve robustness to wristband 100 movement, and against noise and taps on the wristband 100 itself. Noise may originate from hand movement which causes the microphone-to-skin contact to change. Further, knocks or taps on the wristband 100 housing the microphone may be detected with the accelerometer to decrease false hand interaction detections.
  • the falsely detected hand interactions with objects may refer to situations where the sensors configured to detect the hand interactions with the objects may register signals indicative of the hand interactions even though the hand is not in contact with an object. Signals detected by the accelerometer may be transmitted to a signal processing device, and these signals may provide information on when a signal from the microphone correlates with "non-interaction events", i.e. falsely detected hand interactions.
  • the signal processing device may be trained to classify detected hand interactions with objects, as described above. In addition, the signal processing device may be trained to distinguish between the hand interaction events and the non-interaction events based on, for example, characteristics of signals received from at least two of the plurality of sensors 102, 104 at the same time.
  • FIGS. 3A and 3B illustrate an example of a wristband 100 comprising a plurality of sensors 102, 104 mounted along the wristband 100, according to an embodiment.
  • the wristband 100 may correspond to the wristband 100 described in figure 2, but the sensors may be distributed along the wristband 100 to gain more precise data of hand interactions with objects.
  • the sensors may be mounted along a whole length of the wristband 100.
  • an array of sensors 102 may be mounted on an inner circle of the wristband 100.
  • the sensors on the inner circle may be configured to be in contact with skin.
  • one or more sensors 104 may be mounted on an outer circle of the wristband 100.
  • the array of sensors 102 may comprise, for example, microphones, which may be used to allow detected hand interactions such as touch, slide, or haptic events to be localised. For example, a tap in one location may produce a signal profile combination on the multiple sensors that is distinct from a tap in another location.
  • the wristband 100 may comprise a signal processing device configured to determine which part of the hand interacted with an object, based on the signal profile combination. For example, the strength of the signals near the source of the sound may be higher than that of the signals received from sensors located farther away from the source.
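The signal-profile localisation can be sketched as matching the observed per-sensor amplitude vector against per-location template profiles (e.g. recorded during calibration). All sensor counts, amplitudes, and location names here are illustrative assumptions.

```python
# Sketch: localise a tap by finding the calibration template whose amplitude
# profile across the sensor array is closest to the observed one.

def localise(observed, templates):
    """Return the location whose template profile has the smallest squared
    distance to the observed amplitude vector."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(templates, key=lambda loc: dist(observed, templates[loc]))

# Four sensors around the wrist; amplitudes are strongest near the source.
templates = {
    "index_finger": [0.9, 0.5, 0.2, 0.4],
    "little_finger": [0.2, 0.4, 0.9, 0.5],
}
print(localise([0.8, 0.6, 0.3, 0.4], templates))  # index_finger
```

More sensors and normalised amplitudes would sharpen the distinction between nearby tap locations.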
  • the one or more sensors 104 on the outer circle may comprise accelerometers. With multiple accelerometers, a more accurate representation of a state and a movement of the wristband 100 may be determined, compared to when only one accelerometer is used, to further filter out noise and make the system more robust.
  • the plurality of sensors 102, 104 may comprise a plurality of different types of microphones.
  • the microphones may be configured to detect characteristic vibrations released through the air and/or the inside of the hand. The vibrations may be registered as a certain type of hand interaction based on their characteristics. When vibrations are detected from the air, the microphone may be located on an outer circle of the wristband.
  • the different types of microphones may comprise, for example, a piezo contact microphone, a MEMS (micro-electro-mechanical systems) microphone, and/or a vibration sensor, such as an accelerometer detecting accelerations produced by acoustically vibrating structures (e.g. the hand or the wristband 100).
  • FIGS. 4A and 4B illustrate an example of a wristband 100 comprising a plurality of different types of sensors 104, 106, 108 mounted in or on the wristband 100, according to an embodiment.
  • the plurality of different types of sensors 104, 106, 108 may comprise, for example, accelerometers, electrodes, and/or cameras.
  • the plurality of sensors 104, 106, 108 may be mounted, for example, along a whole length of the wristband 100 configured to be placed around a wrist of a user. At least some of the sensors 106 may be configured to be in contact with the skin on the wrist.
  • the wristband 100 may comprise, for example, an array of electrodes configured to be in contact with the skin. At least some of the sensors 104, 108, such as the one or more accelerometers and/or cameras, may be located on the opposite side of the wristband 100 from the skin. In an embodiment, the one or more cameras 108 may face towards the fingertips to get a view of the hand wearing the wristband 100.
  • the electrical capacitance and impedance of the hand may change, which may be measured by the array of electrodes.
  • the one or more cameras may be used to form a contact probability distribution for all locations on the hand.
  • the probability data combined with hand interaction events detected from the electrodes may provide real-time localised hand interaction event detection, such as localizing taps performed by the user on an item.
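The combination of the camera's contact-probability distribution with an electrode-based touch detection can be sketched as follows. The combination rule (report the most probable location once the electrodes register an event) and the location names are illustrative assumptions, not the patent's specified method.

```python
# Sketch: when the electrodes signal a touch, use the camera's per-location
# contact probabilities to decide where on the hand the touch happened.

def localise_touch(electrode_event, contact_probs):
    """Return the most probable contact location, or None if the electrodes
    registered no event."""
    if not electrode_event:
        return None
    return max(contact_probs, key=contact_probs.get)

probs = {"thumb": 0.05, "index": 0.7, "middle": 0.2, "palm": 0.05}
print(localise_touch(True, probs))   # index
print(localise_touch(False, probs))  # None
```

A probabilistic variant could instead multiply the camera distribution with a per-location likelihood derived from the electrode signal profile before taking the maximum.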
  • the wristband 100 may use the at least one camera 108 and/or the at least one accelerometer 104 to take into account such movement. For example, signals received from the at least one camera and/or accelerometer may be used to correct for the movements while reading signals from the electrodes.
  • the at least one sensor mounted at the inner circle of the wristband may comprise one or more optical sensors and/or pressure sensors.
  • the optical/pressure sensors may be configured to measure the contact the wristband has with the skin.
  • the distance between various parts of the wristband and specific points on the user's skin surface may change when fingers move or the object in the hand moves.
  • the detected varying brightness levels may be used to distinguish occasions when the other sensors at the inner circle may falsely detect a movement of the wristband as a hand interaction with an object. For example, the user may tap the wristband itself by accident.
  • This tap may be "heard" by the microphones and/or electrodes, and the tap may be misclassified as a tap on a finger of the hand wearing the wristband.
  • the wristband may be pressed slightly tighter to the skin of the wrist for a short duration, which may be detected by the optical sensors as an indication of a tap which may be ignored.
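The false-tap rejection described above can be sketched as follows. The channel names and the threshold are hypothetical assumptions; a real device would calibrate them against recorded sensor data.

```python
# Illustrative sketch (not the patented algorithm): rejecting a tap candidate
# when the inner-circle pressure/optical channel indicates the wristband
# itself was pressed against the skin.

PRESSURE_SPIKE = 0.5  # assumed normalized pressure increase for a band press

def classify_tap(acoustic_tap_detected: bool, pressure_delta: float) -> str:
    """Label a tap candidate using the inner-circle pressure channel.

    A short pressure increase means the band was pushed against the skin,
    i.e. the user likely tapped the wristband, not a finger on an object.
    """
    if not acoustic_tap_detected:
        return "no_event"
    if pressure_delta > PRESSURE_SPIKE:
        return "ignored_band_tap"
    return "hand_tap"

print(classify_tap(True, 0.8))   # band pressed, so the tap is ignored
print(classify_tap(True, 0.1))   # genuine finger tap on an object
```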
  • FIG. 5 illustrates an example of a first type of interaction a wristband 100 is configured to detect wherein an object 502 and a hand 500 are in contact.
  • the hand interaction event may comprise, for example, a touch of an object.
  • the hand interaction event may further comprise releasing the object.
  • the hand interaction event may comprise localisation of the touch, e.g. which part of the object 502 a finger of the user has touched, or which finger touched the object 502.
  • objects causing an impact to the hand, e.g. when something is thrown at the hand, may be detected based on the sensor data the wristband produces.
  • FIG. 6 illustrates an example of a second type of interaction a wristband 100 is configured to detect wherein an object 502 and a hand 500 are in contact.
  • the hand interaction event may comprise touch and/or release of an object 600 with the object 502 a user is holding.
  • haptic feedback received from the object 502 the user is holding may indicate an interaction of the hand 500 with another object 600, such as a floor, sensed via the object 502 the user is holding.
  • FIG. 7 illustrates an example of a third type of interaction a wristband 100 is configured to detect wherein an object 502 and a hand 500 are in contact.
  • the hand interaction event may comprise, for example, a slide interaction with an object.
  • the slide interaction may refer to activities wherein at least a part of the hand 500 wearing the wristband 100 slides on a surface of an object, such as running a finger on a table.
  • Detection of the sliding may comprise localisation of the touch on the object.
  • the localisation may comprise determining which part of the hand 500 was in contact with the object 502.
  • FIG. 8 illustrates an example of a fourth type of interaction a wristband 100 is configured to detect wherein an object 502 and a hand 500 are in contact.
  • the hand interaction event may comprise, for example, detection of sounds the object 502 the user is holding makes.
  • the sounds may be detected based on haptic feedback from the object 502, wherein the haptic feedback travels as internal signals such as acoustic waves in the hand 500 or on the skin surface of the hand 500.
  • the haptic feedback may be detected by sensors positioned around a wrist of the user.
  • the user may click a button of a mouse controller, and the click caused by a pressed and released mechanical switch induces vibrations via the pressing finger and along the hand 500 to the wrist area, which can be sensed by sensors mounted on the wristband 100.
  • the vibrations may be characteristic to the mouse click and mapped to an input event which may comprise a mouse click or any other programmed input.
  • the programmed input may depend on the device for which the input is provided, and the mapped input may be changed.
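The mapping of a characteristic vibration to a programmed input could be sketched, for instance, as template matching. The templates, the normalized-correlation measure, and the threshold below are assumptions for illustration; a real system would learn the signatures from recorded sensor data.

```python
import math

# Hypothetical sketch of mapping a sensed vibration pattern to a programmed
# input event via template matching against known signatures.

TEMPLATES = {
    "mouse_click": [0.0, 1.0, -0.8, 0.3, -0.1],
    "scroll_tick": [0.0, 0.4, 0.4, -0.4, -0.4],
}

def _corr(a, b):
    """Normalized correlation between two equal-length signals."""
    na = math.sqrt(sum(x * x for x in a)) or 1.0
    nb = math.sqrt(sum(x * x for x in b)) or 1.0
    return sum(x * y for x, y in zip(a, b)) / (na * nb)

def map_to_input(signal, threshold=0.9):
    """Return the input event whose template best matches, or None."""
    best = max(TEMPLATES, key=lambda k: _corr(signal, TEMPLATES[k]))
    return best if _corr(signal, TEMPLATES[best]) >= threshold else None

sensed = [0.0, 0.9, -0.7, 0.35, -0.05]  # resembles the click template
print(map_to_input(sensed))
```

Because the mapping is a lookup, the same vibration signature could be remapped to a different programmed input per target device, as the bullet above notes.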
  • the wristband 100 may be configured to dispatch the sensor data to a signal processing device for determining properties of the touched object. For example, based on trembling of the hand, a weight of a lifted object may be estimated. Further, the weight may be used to determine which type of an object the user lifted. In addition, material properties such as a surface texture of the object may be determined, for example, based on the shape and/or pattern of the signals received when the user slides a finger across the surface. Reliability of the detections may be improved by providing robustness to incoherent skin contact, to noise, and to taps on the wristband 100 itself, as well as universality. In an embodiment, the properties of the object may be detected using optical tracking technology.
  • the optical tracking technology may comprise a computer vision system placed, for example, on a heads-up display or on the wristband.
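A surface-texture estimate from the shape of a slide signal, as mentioned above, might be sketched with a crude frequency proxy such as the zero-crossing count. The feature choice and threshold are assumptions, not the patented method: rough surfaces are simply assumed to produce more oscillatory vibrations than smooth ones.

```python
# Illustrative sketch (assumed features): classifying surface texture from
# the vibration signal produced when a finger slides across a surface.

def zero_crossings(signal):
    """Count sign changes, a crude proxy for vibration frequency."""
    return sum(1 for a, b in zip(signal, signal[1:]) if a * b < 0)

def texture_class(signal, crossings_threshold=4):
    """Label the slide signal as 'rough' or 'smooth' by oscillation count."""
    return "rough" if zero_crossings(signal) > crossings_threshold else "smooth"

rough_slide = [0.2, -0.3, 0.4, -0.2, 0.3, -0.4, 0.1, -0.1]     # oscillatory
smooth_slide = [0.05, 0.06, 0.04, 0.05, 0.03, 0.04, 0.05, 0.04]  # near-constant
print(texture_class(rough_slide), texture_class(smooth_slide))
```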
  • FIG. 9 illustrates an example of a use case of a wristband 100 configured to detect hand interaction events, according to an example embodiment.
  • the wristband 100 may be, for example, any of the wristbands 100 illustrated in the figures 1 to 4.
  • the wristband 100 may be worn by a user in one or both hands 500 while they play a virtual game, such as a driving game. In the game, the user may need to turn a steering wheel and shift gears up or down.
  • the wristband 100 may detect hand interaction events such as finger taps on an object 502, such as a plate, used as the steering wheel by listening for them through vibrations sensed from the wrist.
  • the wristband may be configured to detect gestures, such as turning the plate, for example based on signals from one or more accelerometers mounted on the wristband.
  • the gestures may be registered only in response to detecting the user is holding the plate based on the haptic feedback from the plate.
  • the user could grab a wheel and hold it in front of them as a substitute for a real steering wheel to provide physical feedback, and the user's large actions on the wheel, like turning the wheel, could be detected optically.
  • small actions such as finger taps, however, may not be visible to the optical system. The wristband 100 may enable detection of such non-visible taps based on haptic feedback from the plate detected by the at least one sensor of the wristband 100 in response to the taps made by the user's hand wearing the wristband 100.
  • the taps may be registered substantially in real-time, i.e. substantially at the moment they are made.
  • the wristband 100 provides a simple arrangement to also detect manipulation of objects which may not be easy to detect even with complicated or costly devices.
  • the wristband 100 may be used in combination with a camera-based hand tracking system to complement the hand tracking and/or make hand interaction detection more accurate.
  • the camera-based hand tracking system may be mounted, for example, to a heads-up display.
  • the heads-up display may refer to any transparent display that presents data without requiring users to look away from their usual viewpoints.
  • the heads-up display may comprise a head-mounted display, wherein the display element may move with the orientation of the user's head.
  • the signal processing device associated with the wristband 100 may be configured to receive hand tracking data from the camera-based hand tracking system to determine inputs based on data from both the wristband 100 and the camera-based hand tracking system. Alternatively, the signal processing device may be configured to provide inputs based on the wristband data to a device receiving the hand tracking data, which device may be the heads-up display.
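Determining an input from both data sources could be sketched as pairing a wristband-detected touch with the tracked fingertip closest in time. The data shapes, field names, and the 50 ms pairing window are assumptions made for the example.

```python
# Minimal sketch: fusing camera-based hand tracking (which finger is where,
# and when) with a wristband-detected touch event (that a touch happened,
# and when) into a single localized input.

def fuse(tracking, wrist_event, max_skew_ms=50):
    """Pair a wristband touch with the tracked fingertip closest in time."""
    if wrist_event is None:
        return None
    candidates = [f for f in tracking
                  if abs(f["t_ms"] - wrist_event["t_ms"]) <= max_skew_ms]
    if not candidates:
        return None  # no tracking sample close enough in time
    nearest = min(candidates, key=lambda f: abs(f["t_ms"] - wrist_event["t_ms"]))
    return {"finger": nearest["finger"], "t_ms": wrist_event["t_ms"]}

tracking = [{"finger": "index", "t_ms": 1000}, {"finger": "thumb", "t_ms": 1200}]
touch = {"t_ms": 1010}
print(fuse(tracking, touch))
```

Note that the touch timestamp comes from the wristband, which senses the contact moment directly, while the identity of the finger comes from the camera.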
  • FIG. 10 illustrates an example of another use case of a wristband 100 configured to detect hand interaction events, according to an example embodiment.
  • the user may wear the wristband in at least one hand 500.
  • the wristband 100 may be, for example, any of the wristbands illustrated in the figures 1 to 4.
  • a user may use an existing physical object 502 to provide haptic feedback for the user's digital input.
  • the user may want to type words into a heads-up display.
  • the user may use an existing keyboard that they own and type with that, with words showing up on the heads-up display.
  • the keyboard may not need to be electronically connected to the heads-up display.
  • an optical tracking system may be configured to know where a finger of the user is with respect to each key on the keyboard and approximate which key is intended to be pressed.
  • the wristband 100 may be configured to detect the exact moment the user touches and then presses a key down to confirm the key press together with the optical tracking system, such as a camera on the wristband 100. Without the wristband 100, the press events may not be detectable with the optical hand tracking on its own, since the hand may occlude the camera's view of the fingers and their actions.
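The keyboard use case above can be sketched as follows: the optical tracker proposes a candidate key from fingertip position, and the wristband confirms the press moment. The key layout, coordinates, and the confirmation flag are assumptions for the example.

```python
# Hypothetical sketch of key-press confirmation: the camera approximates which
# key the fingertip hovers over; the wristband supplies the press event.

KEY_CENTERS = {"f": (4.0, 2.0), "g": (5.0, 2.0), "h": (6.0, 2.0)}

def candidate_key(fingertip_xy):
    """Nearest key center to the optically tracked fingertip position."""
    x, y = fingertip_xy
    return min(KEY_CENTERS,
               key=lambda k: (KEY_CENTERS[k][0] - x) ** 2
                           + (KEY_CENTERS[k][1] - y) ** 2)

def confirmed_press(fingertip_xy, wrist_press_detected):
    """Emit the key only when the wristband senses the press vibration."""
    return candidate_key(fingertip_xy) if wrist_press_detected else None

print(confirmed_press((4.9, 2.1), True))   # wristband confirms the press
print(confirmed_press((4.9, 2.1), False))  # hovering only: no key emitted
```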
  • the user may thus use an object to provide haptic feedback, interacting with the object 502 to produce inputs to one or more electronic devices.
  • FIG. 11 illustrates an example of another use case of a wristband 100 configured to detect hand interaction events, according to an embodiment.
  • the wristband 100 may be, for example, the wristband 100 illustrated in any of figures 1 to 4.
  • the detection of touches and other hand interactions may enable users to start using physical objects 502 that have no electronics in them at all to provide inputs for one or more devices.
  • the physical objects may be everyday objects, or they may be custom designed with the main purpose of providing haptic feedback to the user.
  • an ergonomic physical 3D structure may be provided. The 3D structure may be held by the user in hand 500 while wearing the wristband 100.
  • the object 502 may have buttons 1100, scroll wheels 1102, and/or trackballs 1104 on it for the user to be able to feel haptic feedback while clicking, scrolling, and rolling with the 3D structure.
  • These hand interaction events, which may otherwise be too discreet for optical systems to detect accurately, may be detected by the wristband 100, for example, based on vibrations travelling from the 3D structure via the hand 500, and may thus make input events feel extremely physical, real, and responsive.
  • different sounds and vibrations may be mapped by a signal processing device configured to receive sensor data from the wristband 100 to determine the input associated with the specific sound and/or vibration pattern.
  • the signal processing device may further provide a time of the hand interaction with the object.
  • the input may be provided to the device substantially in real-time as the user input (contact/manipulation of the object) may be detected immediately (for example, with a latency of less than 100 ms) when the contact happens.
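The real-time dispatch with the latency figure mentioned above could be sketched as a freshness check before forwarding the event. The event shape, the send callback, and the policy of dropping stale events are assumptions made for the example.

```python
# Illustrative sketch (assumed API): forwarding a detected hand-interaction
# event to a target device only while it is fresh enough to feel immediate.

LATENCY_BUDGET_MS = 100  # the latency figure mentioned in the text

def dispatch(event, now_ms, send):
    """Forward the event if still fresh; return whether it was sent."""
    latency = now_ms - event["t_ms"]
    if latency <= LATENCY_BUDGET_MS:
        send(event)
        return True
    return False  # stale event: too old to feel responsive

sent = []
ok = dispatch({"type": "tap", "t_ms": 0}, now_ms=40, send=sent.append)
late = dispatch({"type": "tap", "t_ms": 0}, now_ms=250, send=sent.append)
print(ok, late, len(sent))
```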
  • FIG. 12 illustrates an example embodiment of an apparatus 1200 configured to practice one or more example embodiments.
  • the apparatus 1200 may comprise a wristband comprising at least one sensor.
  • the apparatus 1200 may comprise at least one processor 1202.
  • the at least one processor 1202 may comprise, for example, one or more of various processing devices, such as for example a co-processor, a microprocessor, a controller, a digital signal processor (DSP), a processing circuitry with or without an accompanying DSP, or various other processing devices including integrated circuits such as, for example, an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA), a microcontroller unit (MCU), a hardware accelerator, a special-purpose computer chip, or the like.
  • the apparatus 1200 may further comprise at least one memory 1204.
  • the memory 1204 may be configured to store, for example, computer program code 1206 or the like, for example operating system software and application software.
  • the memory 1204 may comprise one or more volatile memory devices, one or more non-volatile memory devices, and/or a combination thereof.
  • the memory 1204 may be embodied as magnetic storage devices (such as hard disk drives, magnetic tapes, etc.), optical magnetic storage devices, or semiconductor memories (such as mask ROM, PROM (programmable ROM), EPROM (erasable PROM), flash ROM, RAM (random access memory), etc.).
  • the apparatus 1200 may further comprise a communication interface 1208 configured to enable the apparatus 1200 to transmit and/or receive information to/from other apparatuses.
  • the communication interface 1208 may be configured to provide at least one wireless radio connection, such as for example a 3GPP mobile broadband connection (e.g. 3G, 4G, 5G).
  • the communication interface 1208 may be configured to provide one or more other types of connections, for example a wireless local area network (WLAN) connection such as for example standardized by IEEE 802.11 series or Wi-Fi Alliance; a short-range wireless network connection such as for example a Bluetooth, NFC (near-field communication), or RFID connection; a wired connection such as for example a local area network (LAN) connection, a universal serial bus (USB) connection or an optical network connection, or the like; or a wired Internet connection.
  • the communication interface 1208 may comprise, or be configured to be coupled to, at least one antenna to transmit and/or receive radio frequency signals.
  • the apparatus 1200 may further comprise a user interface 1210 comprising an input device and/or an output device.
  • the input device may take various forms such as a touch screen, or one or more embedded control buttons.
  • the output device may for example comprise a display, a speaker, or the like.
  • some component and/or components of the apparatus 1200, such as for example the at least one processor 1202 and/or the memory 1204, may be configured to implement this functionality.
  • Furthermore, when the at least one processor 1202 is configured to implement some functionality, this functionality may be implemented using program code 1206 comprised, for example, in the memory 1204.
  • the functionality described herein may be performed, at least in part, by one or more computer program product components such as software components.
  • the apparatus 1200 comprises a processor or processor circuitry, such as for example a microcontroller, configured by the program code, when executed, to execute the embodiments of the operations and functionality described.
  • the functionality described herein can be performed, at least in part, by one or more hardware logic components.
  • illustrative types of hardware logic components include field-programmable gate arrays (FPGAs), application-specific integrated circuits (ASICs), application-specific standard products (ASSPs), system-on-a-chip systems (SOCs), complex programmable logic devices (CPLDs), and graphics processing units (GPUs).
  • the apparatus 1200 comprises means for performing at least one method described herein.
  • the means comprises the at least one processor 1202, the at least one memory 1204 including program code 1206 configured to, when executed by the at least one processor 1202, cause the apparatus 1200 to perform the method.
  • the apparatus 1200 may comprise, for example, a computing device such as a signal processing device, a client device, a wearable device, or the like. Although the apparatus 1200 is illustrated as a single device, it is appreciated that, wherever applicable, functions of the apparatus 1200 may be distributed to a plurality of devices.
  • the apparatus comprises a wristband comprising a sensor setup and a signal processing device configured to detect hand interactions with objects, such as touch events.
  • the touch events may comprise, for example, the moment a user touches something, the moment the user feels a physical click from an object, and/or the moments the user is sliding a finger over an object.
  • the detected events may trigger various programmable input events to any other digital device such as a smartwatch on the wristband itself, a smartphone, an external monitor, or a heads-up display for AR and/or VR.
  • usage of real-world devices/objects as virtual devices may be enabled.
  • FIG. 13 illustrates an example of a method for detecting hand interaction with an object, according to an embodiment.
  • the method may be executed, for example, with the wristband 100 or the apparatus 1200.
  • the method may comprise determining a hand interaction with an object based on characteristics of signals, detected from a wrist by at least one sensor mounted at an inner circle of a wristband, indicating physical changes on the wrist.
  • when the hand interacts with the object, a haptic feedback occurs, such as a physical click or vibrations caused by the contact when sliding a finger on the object.
  • characteristic vibrations may be released through the air and/or the inside of the hand which can be picked up by the sensors, such as microphones.
  • the vibrations may be registered, for example, as touch events and mapped to any input.
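The registration of a touch event from the wrist signal might be sketched as a burst-detection heuristic: a short, sharp excursion in the signal is treated as a touch, while sustained activity (e.g. a slide) or quiet noise is not. The amplitude and duration thresholds below are invented for illustration, not taken from the patent.

```python
# Minimal sketch of the detection step, under assumed features: a short,
# high-amplitude burst in the wrist signal is registered as a touch event.

def detect_touch(samples, amp_threshold=0.5, max_burst_len=5):
    """Return the sample index where a touch-like burst starts, or None."""
    above = [i for i, s in enumerate(samples) if abs(s) > amp_threshold]
    if above and (above[-1] - above[0]) < max_burst_len:
        return above[0]  # short burst: likely a touch impact
    return None  # quiet signal, or activity too long to be a single touch

quiet = [0.01, -0.02, 0.03, 0.01, -0.01, 0.02]
tap = [0.02, 0.01, 0.9, -0.7, 0.4, 0.05, 0.02]
print(detect_touch(quiet), detect_touch(tap))
```

The returned sample index also gives the detection time used below to timestamp the interaction.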
  • data on the hand interaction with the object may be stored and output to be used as an input to a device.
  • the device may be, for example, a monitor, an AR or VR device, a computer, a game console, or the like.
  • the analysis may comprise determination of a type of the hand interaction and/or properties of the object.
  • the type of hand interaction may comprise at least one of the hand holding the object, an impact of the object to the hand, the hand moving the object, the hand sliding on a surface of the object, the hand tapping the object, the hand touching an object with the object, or the hand releasing the object.
  • the data may comprise a time of the hand interaction. The time may be accurately determined based on a time of reception of the signals via the wrist.
  • An apparatus may be configured to perform or cause performance of any aspect of the method(s) described herein.
  • a computer program may comprise instructions for causing, when executed, an apparatus to perform any aspect of the method(s) described herein.
  • an apparatus may comprise means for performing any aspect of the method(s) described herein.
  • the means comprises at least one processor, and memory including program code, the at least one processor and program code configured to, when executed by the at least one processor, cause performance of any aspect of the method(s). Any range or device value given herein may be extended or altered without losing the effect sought. Also, any embodiment may be combined with another embodiment unless explicitly disallowed.
  • Although subjects may be referred to as 'first' or 'second' subjects, this does not necessarily indicate any order or importance of the subjects. Instead, such attributes may be used solely for the purpose of making a difference between subjects.


Abstract

Various example embodiments relate to detecting hand interaction with an object based on signals received from a wristband comprising at least one sensor configured to monitor signals from a wrist of a user. At least one of a type, a time, or a location of the hand interaction with the object may be determined. The produced hand interaction data may be provided to any device, for example, to enable using real-world objects as virtual objects. An apparatus, a method, a computer program product, and a computer readable medium are disclosed.

Description

AN APPARATUS AND A METHOD FOR DETECTING A HAND IN CONTACT WITH AN OBJECT BASED ON HAPTIC FEEDBACK FROM WRIST

TECHNICAL FIELD
Various example embodiments generally relate to the field of wearable devices. In particular, some example embodiments relate to determining a type of a hand interaction with an object based on haptic feedback from the object detected by an apparatus comprising at least one sensor attached to a wristband and configured to detect signals from a wrist.
BACKGROUND

Multiple types of electronic user interface devices can be used to provide input commands to devices. For example, a computer can be operated with a mouse or a keyboard, or a game running on a game console can be controlled with a handheld controller. Usually the user interface devices are controlled with hands, wherein the user interface device may comprise sensors and/or buttons for receiving inputs from the user based on the user moving the interface device or pressing the buttons. The user interface devices may be dedicated for control of a specific device or devices. Furthermore, the user interface devices may not be always at hand when needed and they may be prone to defects caused by, for example, faulty electronics or dead batteries.

SUMMARY
This summary is provided to introduce a selection of concepts in a simplified form that are further described below in the detailed description. This summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter.
Example embodiments provide an apparatus and a method for detecting when a hand of a user is in contact with an object. The hand interaction with the object is detected with at least one sensor of a wristband configured to monitor haptic feedback from the object resulting from the hand contact. The object may not comprise electronic means for providing the haptic feedback. The apparatus may comprise a signal processing device configured to determine, based on the sensor data, which type of hand-object interaction the user performed and/or determine an input to a device based on the sensor data. Hence, the user may manipulate any object in order to provide inputs for devices in a simple and intuitive manner, wherein the inputs may result from the touches the user makes with the objects.
According to a first aspect, an apparatus is provided. The apparatus may comprise a wristband; at least one sensor mounted at an inner circle of the wristband and configured to detect signals from a wrist of a user wearing the wristband; at least one processor; and at least one memory comprising program code which, when executed on the at least one processor, causes the apparatus at least to determine a hand interaction with an object based on characteristics of the signals detected by the at least one sensor from the wrist indicating physical changes on the wrist resulting from the object in contact with the hand; and output data on the hand interaction with the object to be used as an input to a device.
In an embodiment, the at least one memory and the program code are further configured to, when executed on the at least one processor, cause the apparatus to determine a type of the hand interaction with the object based on characteristics of the signals; and determine the input for the device based on the type of the hand interaction with the object.
In an embodiment, in addition or alternatively, the type of hand interaction comprises at least one of the hand touching the object, the hand holding the object, the object in contact with the hand making a sound, an impact of the object to the hand, the hand moving the object, the hand sliding on a surface of the object, the hand tapping the object, the hand touching an object with the object, or the hand releasing the object.
In an embodiment, in addition or alternatively, the at least one sensor mounted at the inner circle of the wristband comprises at least one of a microphone, an accelerometer, an acoustic transducer, an optical sensor, or an electrode.
In an embodiment, in addition or alternatively, the apparatus further comprises at least one sensor mounted at an outer circle of the wristband and configured to detect signals from other sources than the wrist; and wherein determining the hand interaction with the object is further based on signals received from the at least one sensor mounted at the outer circle.
In an embodiment, in addition or alternatively, the at least one sensor mounted at the outer circle of the wristband comprises at least one of a microphone configured to detect sound waves from the air, an accelerometer configured to monitor vibrations on the wristband, or a camera configured to monitor the hand.
In an embodiment, in addition or alternatively, the at least one memory and the program code are further configured to, when executed on the at least one processor, cause the apparatus to detect an indication of a falsely detected hand interaction with an object based on signals received from the at least one sensor mounted at the outer circle of the wristband.
In an embodiment, in addition or alternatively, the at least one memory and the program code are further configured to, when executed on the at least one processor, cause the apparatus to determine a time of the hand interaction based on a detection time of the signals from the at least one sensor mounted at the inner circle of the wristband.
In an embodiment, in addition or alternatively, the at least one sensor mounted at the inner circle of the wristband is configured to detect signals in response to at least one of vibration on the wrist, interior waves from the wrist, surface waves from the wrist, changes in a shape of a skin surface of the wrist, changes in a tendon pattern of the wrist, or deformation of an electric field on the wrist.
In an embodiment, in addition or alternatively, the wristband comprises a plurality of sensors mounted at the inner circle and wherein the at least one memory and the program code are further configured to, when executed by the at least one processor, cause the apparatus to localize the hand interaction with the object based on a signal profile combination on the plurality of sensors.
In an embodiment, in addition or alternatively, the at least one memory and the program code are further configured to, when executed by the at least one processor, cause the apparatus to determine a type of the object the hand interacts with based on the signals from the at least one sensor mounted at the inner and/or outer circle of the wristband.
In an embodiment, in addition or alternatively, the type of object is determined based on at least one of a weight of the object, a surface texture of the object, or acoustic properties of the object determined based on the signals.
According to a second aspect, a method to detect a hand interaction with an object is provided. The method may be executed with the apparatus according to the first aspect. The method may comprise determining a hand interaction with an object based on characteristics of signals detected from a wrist indicating physical changes on the wrist resulting from the object in contact with the hand by at least one sensor mounted at an inner circle of a wristband.
According to a third aspect, a computer program is configured, when executed by an apparatus, to cause the apparatus at least to determine a hand interaction with an object based on characteristics of signals detected from a wrist indicating physical changes on the wrist resulting from the object in contact with the hand by at least one sensor mounted at an inner circle of a wristband.
According to a fourth aspect, a computer-readable medium is provided, comprising instructions which, when executed by the apparatus of the first aspect, cause the apparatus to execute the method of the second aspect.
Many of the attendant features will be more readily appreciated as they become better understood by reference to the following detailed description considered in connection with the accompanying drawings.
BRIEF DESCRIPTION OF THE DRAWINGS
The accompanying drawings, which are included to provide a further understanding of the example embodiments and constitute a part of this specification, illustrate example embodiments and together with the description help to explain the example embodiments. In the drawings:
FIG. 1 illustrates an example of a wristband comprising a sensor configured to detect a hand interaction with an object, according to an embodiment;
FIG. 2 illustrates an example of a wristband comprising different types of sensors configured to detect a hand interaction with an object, according to an embodiment;
FIG. 3A illustrates an example of a wristband comprising a plurality of sensors mounted along the wristband for detection of a hand interaction with an object, according to an embodiment;
FIG. 3B illustrates the wristband of FIG. 3A depicted from another perspective, according to an embodiment;
FIG. 4A illustrates another example of a wristband comprising a plurality of sensors mounted along the wristband for detection of a hand interaction with an object, according to an embodiment;
FIG. 4B illustrates the wristband of FIG. 4A depicted from another perspective, according to an embodiment;
FIG. 5 illustrates a first example of a type of interaction a wristband is configured to detect wherein an object and a hand are in contact, according to an embodiment;
FIG. 6 illustrates a second example of a type of interaction a wristband is configured to detect wherein an object and a hand are in contact, according to an embodiment;
FIG. 7 illustrates a third example of a type of interaction a wristband is configured to detect wherein an object and a hand are in contact, according to an embodiment;
FIG. 8 illustrates a fourth example of a type of interaction a wristband is configured to detect wherein an object and a hand are in contact, according to an embodiment;
FIG. 9 illustrates a first example use case of a wristband configured to detect hand interaction with an object, according to an embodiment;
FIG. 10 illustrates a second example use case of a wristband configured to detect hand interaction with an object, according to an embodiment;
FIG. 11 illustrates a third example use case of a wristband configured to detect hand interaction with an object, according to an embodiment;
FIG. 12 illustrates an example embodiment of an apparatus configured to practice one or more example embodiments;
FIG. 13 illustrates an example of a method for detecting hand interaction with an object, according to an embodiment.
DETAILED DESCRIPTION
Reference will now be made in detail to example embodiments, examples of which are illustrated in the accompanying drawings. The detailed description provided below in connection with the appended drawings is intended as a description of the present examples and is not intended to represent the only forms in which the present example may be constructed or utilized. The description sets forth the functions of the example and the sequence of operations for constructing and operating the example. However, the same or equivalent functions and sequences may be accomplished by different examples.
User input wristbands may continuously register commands by detecting a position of a hand or by detecting a specific gesture of the hand. However, the hand position-based detection may not take into account physical interaction with objects, which may give valuable information. This may involve knowing the exact moment the hand touches something, detection of what type of object the hand is in contact with, as well as detecting how the foreign object reacts to the hand's manipulation. In general, a hand movement alone is not a very intuitive way of interacting. When people interact with objects around them in everyday life, they may heavily rely on tactile feedback of the physical objects exerting a force on their appendages. Tactile feedback, or haptic feedback, may refer to a physical response on an object from a user input. For example, when a user picks up a book from a table, the weight of the book causes resistance to movement of the hand, which may be detected by the user as tactile feedback from the book. The tactile feedback may transmit information to the user, for example, with vibration patterns or waves.
Optical hand tracking may be used to improve hand interaction detection, but it faces a similar issue. Even though an optical hand tracking device may 'see' its surroundings, including the hand, as well as what the hand is holding, it may be hard to see the precise position of the hand and exactly what the hand and its fingers are doing due to the hand becoming occluded by the object it is holding. It may be difficult, for example, to see whether or not a trigger of a water gun is triggered and the exact time of the triggering using optical hand tracking only.
It is an objective to provide an apparatus comprising a wristband which is able to detect the way a user of the wristband interacts with other objects. This may comprise, for example, detection of touch events of foreign objects, detection of what type of object the user is in contact with (e.g. based on a weight, surface texture, acoustic properties, and the like) and/or detecting how the foreign object reacts to hand manipulation. The wristband may comprise at least one sensor configured to be in contact with skin of a wrist of the user. In an embodiment, the wristband may comprise a plurality of different types of sensors for improved accuracy of the detection.
According to an embodiment, an apparatus is provided which may detect how an object is reacting haptically (through vibrations and sound, for example) to what hands of a user are doing to it. The apparatus may be able to tell the exact moment the user has touched the object, and/or the moment a trigger on the object has been clicked, and/or whether or not the user's fingers are sliding across a surface of the object or just hovering. Advantageously, this may enable using everyday objects as controllers. The object may not need to comprise electronics or be powered on to produce any feedback, but the haptic feedback may be received from the object inherently in response to contacting the hand of the user.
In an embodiment, an apparatus may enable controlling digital content, for example in augmented reality (AR) or in virtual reality (VR), without a need for dedicated controllers. Instead, a user may 'borrow' everyday objects and use them to control a digital environment the user sees through AR or VR. For example, the user may be able to use a real-life toilet paper roll as a light sabre in a Star Wars video game. In addition, the user may be able to use a plate as a controller for a virtual Formula 1 car, or a water gun as a virtual pistol. A further possible use case comprises using an old wireless mouse that no longer has batteries as a clicker for a slideshow. In general, these physical objects may become empty interaction shells that may map interactions to digital content once the objects are physically sensed in real life by the user and the wristband.
FIG. 1 illustrates an example of a wristband 100 comprising at least one sensor 102 configured to detect a hand interaction with an object, according to an embodiment. The at least one sensor 102 may be placed at a suitable position to extract signals (e.g. light, sound, conductivity, temperature, and/or weight), in order to understand as much as possible about what a hand wearing the wristband 100 is interacting with. Positions of the one or more sensors, and sensor types, may depend on which types of signals the sensors are configured to monitor. In an embodiment, some sensors may be configured to monitor physiological signals from the wrist based on tendon and skin surface movements. Some sensors may be configured to listen to acoustic signals traveling along the hand, such as vertical and/or horizontal movement of acoustic waves originating from the object interacting with the hand and detected from the wrist. One or more sensors 102 may be configured to be in contact with skin of the wrist and detect at least one of interior waves or surface waves from the wrist. For example, a sound originating from the object may induce the interior and/or surface waves, which may be sensed via the hand by the wristband 100. A vertical movement in relation to the skin surface caused by the interior waves may be detected from the wrist, for example, with an acoustic transducer. A horizontal movement in relation to the skin surface caused by the surface waves may be detected from the wrist, for example, with an accelerometer in contact with the wrist skin. The waves may indicate a haptic feedback from the object in contact with the hand. The haptic feedback may be received in response to the hand manipulating the object or in response to the hand initiating a contact with the object. The object may not comprise any electronics for haptics, but the haptic feedback may occur naturally based on the interaction between the hand and the object.
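The onset of such a touch-induced wave can be picked out of a sensor stream with a simple short-time energy scan. The sketch below is a minimal illustration; the function name, window size, and threshold value are assumptions for demonstration, and a practical implementation would add band-pass filtering and adaptive thresholds.

```python
import math

def detect_touch_onset(samples, window=64, threshold=0.05):
    """Scan a wrist-microphone sample stream and return the start index
    of the first window whose RMS energy exceeds `threshold`, or None.

    A toy onset detector: real hardware would also filter the signal to
    the frequency band of the interior/surface waves described above.
    """
    for start in range(0, len(samples) - window + 1, window):
        chunk = samples[start:start + window]
        rms = math.sqrt(sum(s * s for s in chunk) / window)
        if rms > threshold:
            return start
    return None
```

Applied to a quiet stream followed by a burst of vibration, the function reports the sample index where the burst begins, which can then be time-stamped as the moment of contact.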
In an embodiment, the at least one sensor 102 may comprise a microphone. The microphone may be configured to be in contact with skin on a dorsal side of the wrist. The wristband 100 may house the microphone. The wristband 100 may be configured to sit comfortably tight around the wrist and in contact with the wrist.
The wristband 100 may further house a signal processing device. The at least one sensor 102, such as the microphone, may pick up vibrations such as the interior and/or surface waves caused by touch events on the skin of the hand. The at least one sensor 102 may be configured to send at least characteristics of the waves to a processing unit to be classified and/or further processed. In an embodiment, the signals comprising the characteristics of the waves may be dispatched to an external signal processing device, for example, in a wireless manner.
The signal processing device may be trained to classify hand interactions with objects based on previously known hand interactions using any supervised learning model, such as k-means clustering, similarity matching, or machine learning. Once the used model has learned to classify hand interactions with objects from a few original users, transfer learning may be used for generalization so that the classifier may not need to be retrained for any new user. At first use, a touch calibration sequence may be performed by the new user. The classification model may be trained to classify many sorts of hand interaction events with objects, such as touch events, gestures, touch release events, slide events, sound events, and the like.
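In its most minimal form, the classification step described above can be sketched as a nearest-centroid lookup over feature vectors extracted from the sensor signals. The sketch below is illustrative only; the event labels, feature vectors, and centroid values are assumptions, and the feature extraction itself is out of scope.

```python
def nearest_centroid(feature, centroids):
    """Classify a feature vector by the nearest centroid (squared
    Euclidean distance).

    `centroids` maps an event label ("touch", "slide", ...) to a mean
    feature vector learned from labelled recordings — a stand-in for
    the trained supervised model described in the text.
    """
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda label: dist(feature, centroids[label]))
```

Transfer learning for a new user would then amount to shifting these centroids slightly during the touch calibration sequence rather than retraining from scratch.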
Signals received by the at least one sensor indicative of the hand interaction events with objects may be labelled, for example, in order to train the supervised machine learning model. In an embodiment, the events may be manually labelled. For example, touch events may be labelled by asking the user to tap on various surfaces with one hand wearing the wristband 100 while simultaneously pressing a button with their other, free hand whenever a touch with the hand wearing the wristband 100 occurs. This may produce labels as discrete touch events which may be used to train the machine learning model.
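The manual labelling above reduces to aligning button-press timestamps with windows of the sensor stream. A minimal sketch, assuming both share one clock (that synchronisation is itself non-trivial in practice):

```python
def label_windows(window_starts, window_len, press_times):
    """Label each sensor window 1 if a button-press timestamp falls
    inside it, else 0 — producing the discrete touch labels used to
    train the model.  Window starts and press times are in the same
    time units on the same clock (an assumed synchronisation).
    """
    labels = []
    for start in window_starts:
        end = start + window_len
        hit = any(start <= t < end for t in press_times)
        labels.append(1 if hit else 0)
    return labels
```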
In an embodiment, optical hand tracking cameras in an AR heads-up display may be used to label tap/touch events or slide events. The user may be instructed to first demonstrate tap events in a way that the taps made by fingers are not occluded from the optical hand tracking view, so that the optical hand tracking may approximate and determine the tap event to generate a label for the occurrence of a tap which is also measured by the wristband. The generated labels may then be used to detect tap events when the tap is occluded from the optical hand tracking view.
In an embodiment, touch screens and/or electrically conductive tape of varying textures may be used to label, for example, touch and/or slide events. The user may be instructed to tap the touch screen in different hand orientations. When the user makes the tap and/or slide, the touch screen may register the events and use that as a label, so that the machine learning model is able to detect such events even on surfaces that are not touch screens, as long as the acoustic signature of the surface to be touched is similar or can be generalised to.
In an embodiment, to detect and classify sounds originating from an object the user is holding, such as inanimate mouse clicks, labels for such events can be collected by using the mouse itself while it is still on and electrically operational. The user may be asked to use the mouse normally with the hand that has the wristband 100 on. The clicks digitally produced by the mouse may be collected and used as labels for the machine learning model for acoustic signals of the wristband. Hence, once the wristband has learned the clicks to a high enough accuracy, the user may switch off the mouse and continue using it as a mouse (with the aid of computer vision hand tracking to sense the location of the mouse), now that the wristband has learned what a click sounds like in terms of the acoustic signals sensed from a wrist via the wristband 100.
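Collecting those click labels amounts to cutting a short acoustic window around each digitally reported click timestamp and using it as a positive training example. A hedged sketch, with illustrative names and window sizes:

```python
def cut_click_examples(samples, rate_hz, click_times_s, half_win_s=0.05):
    """Cut a short acoustic window (default ±50 ms) around each
    digitally reported mouse-click timestamp, yielding positive
    training examples for the acoustic classifier.

    `samples` is the wrist-microphone stream at `rate_hz`; timestamps
    are assumed to be on the same clock as the stream.
    """
    examples = []
    half = int(half_win_s * rate_hz)
    for t in click_times_s:
        centre = int(t * rate_hz)
        lo, hi = max(0, centre - half), min(len(samples), centre + half)
        examples.append(samples[lo:hi])
    return examples
```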
In an embodiment, robustness of the classification may be improved by introducing noise to the machine learning training process. During the signal and label collecting processes described above, the user may tap their wristband and/or move their hand around to introduce noise. The algorithm of the machine learning model may then have labels that indicate whether a received signal was a true hand interaction event with an object or noise, and it may learn to filter out the noise as long as there is a significant difference in the signal. By increasing the number of sensors in the wristband 100, the difference in the signal between a true event and noise may be increased. In an embodiment, the wristband may comprise, for example, infrared optical distance sensors configured to provide the machine learning system with information on how tightly the wristband is placed against the skin. For example, the infrared optical sensors may detect when an impact, such as a tap, is received by the wristband (and not the hand), when at least some of the infrared optical sensors are pressed tighter to the skin in response to the impact. The machine learning algorithm may be continuously observing both the sensors configured to detect the hand-object interactions and the sensors configured to detect the impacts to the wristband. Hence, the algorithm may learn to distinguish these false taps (on the wristband) from a true tap (on an object in hand), because the tightness information provided by the infrared sensor may provide new information that the acoustic sensor may not have on its own.
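The fusion of the acoustic channel with the infrared tightness channel can be sketched as a simple rule; in the actual system this decision would be learned rather than hand-coded, and the thresholds here are illustrative assumptions.

```python
def classify_event(acoustic_energy, tightness_spike):
    """Fuse two channels: a loud acoustic event that coincides with a
    sudden wristband-tightness spike (infrared sensors pressed closer
    to the skin) is treated as a tap on the band itself, not on an
    object in the hand.
    """
    if acoustic_energy < 0.1:          # illustrative quiet threshold
        return "none"
    return "band_tap" if tightness_spike else "object_tap"
```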
When one sensor is used, at least medium strength and/or velocity touch events, as well as slide events, may be detected from an object contact with any part of the hand or even the forearm. The touch event may refer to when the user touches an object or an item with their hand, or the object or item touches the user's hand. The slide event may refer to the user moving their hand or finger along a surface of an object or item. Further, for example, the vibration of a click from a real button may be detected. The vibration of the click may be caused by movement of mechanical components causing the click sound. In an embodiment, the wristband 100 may house a plurality of sensors to enable detection of touch release events and/or to localise any of the interaction events being detected.
FIG. 2 illustrates an example of a wristband 100 comprising a plurality of sensors 102, 104, according to an embodiment. When a plurality of sensors is used, robustness against wristband 100 movement, as well as against noise and taps on the wristband 100 itself, may be improved. The plurality of sensors 102, 104 may comprise, for example, at least one microphone and at least one accelerometer. The at least one microphone may be mounted on a dorsal side of the wristband 100. The at least one accelerometer may be mounted on the opposite side of the wristband 100 from the microphone, such that the microphone is mounted on an inner circle of the wristband 100 and the accelerometer on an outer circle of the wristband 100. The at least one microphone may be configured to be in contact with skin.
The at least one accelerometer sensor may improve robustness to wristband 100 movement, and against noise and taps on the wristband 100 itself. Noise may originate from hand movement which causes the microphone-to-skin contact to change. Further, knocks or taps on the wristband 100 housing the microphone may be detected with the accelerometer to decrease false hand interaction detections. The falsely detected hand interactions with objects may refer to situations where the sensors configured to detect the hand interactions with the objects register signals indicative of the hand interactions even though the hand is not in contact with an object. Signals detected by the accelerometer may be transmitted to a signal processing device, which signals may provide information of when a signal from the microphone correlates with "non-interaction events", i.e. the falsely detected hand interactions, such as the hand movements and taps on the wristband 100 itself. The signal processing device may be trained to classify detected hand interactions with objects, as described above. In addition, the signal processing device may be trained to distinguish between the hand interaction events and the non-interaction events based on, for example, characteristics of signals received from at least two of the plurality of sensors 102, 104 at the same time.
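A minimal form of this cross-check is a veto: when the outward-facing accelerometer records strong motion at the same moment as a microphone event, the event is flagged as a non-interaction event. The function name and threshold below are illustrative assumptions.

```python
def veto_by_motion(accel_window, motion_threshold=0.2):
    """Return True when the outward-facing accelerometer saw motion
    strong enough that a simultaneous microphone event should be
    treated as a non-interaction event (band movement, or a knock on
    the band itself) rather than a hand-object interaction.
    """
    return max(abs(a) for a in accel_window) > motion_threshold
```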
FIGS. 3A and 3B illustrate an example of a wristband 100 comprising a plurality of sensors 102, 104 mounted along the wristband 100, according to an embodiment. The wristband 100 may correspond to the wristband 100 described in FIG. 2, but the sensors may be distributed along the wristband 100 to gain more precise data of hand interactions with objects. In an embodiment, the sensors may be mounted along a whole length of the wristband 100. In an embodiment, an array of sensors 102 may be mounted on an inner circle of the wristband 100. The sensors on the inner circle may be configured to be in contact with skin. In addition, or alternatively, one or more sensors 104 may be mounted on an outer circle of the wristband 100.
The array of sensors 102 may comprise, for example, microphones, which may be used to allow detected hand interactions such as touch, slide, or haptic events to be localised. For example, a tap in one location may produce a signal profile combination on the multiple sensors that is distinct from a tap in another location. The wristband 100 may comprise a signal processing device configured to determine which part of the hand interacted with an object, based on the signal profile combination. For example, the strength of the signals near the source of the sound may be higher than that of the signals received from sensors located farther away from the source.
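In the simplest case, localisation from the per-microphone energy profile reduces to picking the strongest sensor; a real system would match the whole profile against learned templates. The sensor positions below are hypothetical placeholders.

```python
def localise_tap(energies, sensor_positions):
    """Estimate where on the hand a tap landed from the per-microphone
    energy profile: the microphone nearest the source tends to receive
    the strongest signal, so return that sensor's nominal position.
    """
    strongest = max(range(len(energies)), key=lambda i: energies[i])
    return sensor_positions[strongest]
```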
The one or more sensors 104 on the outer circle may comprise accelerometers. With multiple accelerometers, a more accurate representation of a state and a movement of the wristband 100 may be determined, compared to when only one accelerometer is used, to further filter out noise and make the system more robust.
In an embodiment, the plurality of sensors 102, 104 may comprise a plurality of different types of microphones. The microphones may be configured to detect characteristic vibrations released through the air and/or the inside of the hand. The vibrations may be registered as a certain type of hand interaction based on their characteristics. When vibrations are detected from the air, the microphone may be located on an outer circle of the wristband. The different types of microphones may comprise, for example, a piezo contact microphone, a MEMS (micro-electro-mechanical systems) microphone, and/or a vibration sensor, such as an accelerometer detecting accelerations produced by acoustically vibrating structures (e.g. the hand or the wristband 100).
FIGS. 4A and 4B illustrate an example of a wristband 100 comprising a plurality of different types of sensors 104, 106, 108 mounted in or on the wristband 100, according to an embodiment. The plurality of different types of sensors 104, 106, 108 may comprise, for example, accelerometers, electrodes and/or cameras.
The plurality of sensors 104, 106, 108 may be mounted, for example, along a whole length of the wristband 100 configured to be placed around a wrist of a user. At least some of the sensors 106 may be configured to be in contact with the skin on the wrist. The wristband 100 may comprise, for example, an array of electrodes configured to be in contact with the skin. At least some of the sensors 104, 108, such as the one or more accelerometers and/or cameras, may be located on the opposite side of the wristband 100 from the skin. In an embodiment, the one or more cameras 108 may face towards the tips of the fingers to get a view of the hand wearing the wristband 100.
When the hand makes contact with foreign objects or with itself, the electrical capacitance and impedance of the hand may change, which may be measured by the array of electrodes. To improve localization of the contact, the one or more cameras may be used to form a contact probability distribution for all locations on the hand. The probability data combined with hand interaction events detected from the electrodes may provide real-time localised hand interaction event detection, such as localizing taps performed by the user on an item.
To increase robustness against changing electrode-to-skin contact, e.g. when the wristband 100 moves around the wrist during use, the wristband 100 may use the at least one camera 108 and/or the at least one accelerometer 104 to take such movement into account. For example, signals received from the at least one camera and/or accelerometer may be used to correct for the movements while reading signals from the electrodes.
In an embodiment, the at least one sensor mounted at the inner circle of the wristband may comprise one or more optical sensors and/or pressure sensors. The optical/pressure sensors may be configured to measure the contact the wristband has with the skin. For example, the tighter the wristband is against the skin, the brighter the optical sensor may perceive the light reflecting back from the skin. The distance between various parts of the wristband and specific points on the user's skin surface may change when fingers move or the object in the hand moves. The detected varying brightness levels may be used to distinguish occasions when the other sensors at the inner circle may falsely detect a movement of the wristband as a hand interaction with an object. For example, the user may tap the wristband itself by accident. This tap may be "heard" by the microphones and/or electrodes, and the tap may be misclassified as a tap on a finger of the hand wearing the wristband. However, when the wristband is tapped by accident, the wristband may be pressed slightly tighter to the skin of the wrist for a short duration, which may be detected by the optical sensors as an indication of a tap which may be ignored.
FIG. 5 illustrates an example of a first type of interaction a wristband 100 is configured to detect, wherein an object 502 and a hand 500 are in contact. The hand interaction event may comprise, for example, a touch of an object. In an embodiment, the hand interaction event may further comprise releasing the object. The hand interaction event may comprise localisation of the touch, e.g. which part of the object 502 a finger of the user has touched, or which finger touched the object 502. In addition to hand-object interactions detected in response to user inputs towards the objects, also objects causing an impact to the hand, e.g. when something is thrown at the hand, may be detected based on sensor data the wristband produces.
FIG. 6 illustrates an example of a second type of interaction a wristband 100 is configured to detect, wherein an object 502 and a hand 500 are in contact. The hand interaction event may comprise touch and/or release of another object 600 with the object 502 a user is holding. Hence, haptic feedback received from the object 502 the user is holding may indicate an interaction of the hand 500 with another object 600, such as a floor, sensed via the object 502 the user is holding.
FIG. 7 illustrates an example of a third type of interaction a wristband 100 is configured to detect, wherein an object 502 and a hand 500 are in contact. The hand interaction event may comprise, for example, a slide interaction with an object. The slide interaction may refer to activities wherein at least a part of the hand 500 wearing the wristband 100 is slid along a surface of an object, such as running a finger on a table. Detection of the sliding may comprise localisation of the touch on the object. The localisation may comprise determining which part of the hand 500 was in contact with the object 502.
FIG. 8 illustrates an example of a fourth type of interaction a wristband 100 is configured to detect, wherein an object 502 and a hand 500 are in contact. The hand interaction event may comprise, for example, detection of sounds made by the object 502 the user is holding. The sounds may be detected based on haptic feedback from the object 502, wherein the haptic feedback travels as internal signals, such as acoustic waves, in the hand 500 or on the skin surface of the hand 500. The haptic feedback may be detected by sensors positioned around a wrist of the user. For example, the user may click a button of a mouse controller, and the click caused by a pressed and released mechanical switch induces vibrations via the pressing finger and along the hand 500 to the wrist area, which can be sensed by sensors mounted on the wristband 100. Further, the vibrations may be characteristic to the mouse click and mapped to an input event which may comprise a mouse click or any other programmed input. The programmed input may depend on the device for which the input is provided, and the mapped input may be changed.
The wristband 100 may be configured to dispatch the sensor data to a signal processing device for determining properties of the touched object. For example, based on trembling of the hand, a weight of a lifted object may be estimated. Further, the weight may be used to determine which type of object the user lifted. In addition, material properties such as a surface texture of the object may be determined, for example, based on the shape and/or pattern of the signals received when the user slides a finger across the surface. Reliability of the detections may be improved by providing robustness to incoherent skin contact, universality and/or noise and taps on the wristband 100 itself. In an embodiment, the properties of the object may be detected using optical tracking technology. The optical tracking technology may comprise a computer vision system placed, for example, on a heads-up display or on the wristband.
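One crude texture feature for such a slide signal is its zero-crossing rate: a rougher surface tends to produce a busier signal. This is an illustrative assumption on my part; the text leaves the exact signal features open.

```python
def zero_crossing_rate(samples):
    """Fraction of adjacent sample pairs whose sign changes — a crude
    proxy for surface roughness during a slide event.  Higher values
    suggest a busier, rougher-texture signal.
    """
    crossings = sum(
        1 for a, b in zip(samples, samples[1:]) if (a < 0) != (b < 0)
    )
    return crossings / max(1, len(samples) - 1)
```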
FIG. 9 illustrates an example of a use case of a wristband 100 configured to detect hand interaction events, according to an example embodiment. The wristband 100 may be, for example, any of the wristbands 100 illustrated in FIGS. 1 to 4. In an embodiment, the wristband 100 may be worn by a user on one or both hands 500 while they play a virtual game, such as a driving game. In the game, the user may need to turn a steering wheel and shift gears up or down. Advantageously, the wristband 100 may detect hand interaction events such as finger taps on an object 502, such as a plate, used as the steering wheel by listening for them through vibrations sensed from the wrist. Further, the wristband may be configured to detect gestures, such as turning the plate, for example based on signals from one or more accelerometers mounted on the wristband. The gestures may be registered only in response to detecting that the user is holding the plate based on the haptic feedback from the plate.
For comparison, with optical hand tracking systems, the user could grab a wheel and hold it in front of them as a substitute for a real steering wheel to provide physical feedback, and the user's large actions on the wheel, like turning the wheel, could be detected optically. However, what may not be possible is for the optical system to detect the user tapping the back of the plate in order to shift gears, or to detect the exact moment of the tapping when the fingertip is not visible. The wristband 100 may enable detection of such non-visible taps based on haptic feedback from the plate detected by the at least one sensor of the wristband 100 in response to the taps made by the user's hand wearing the wristband 100. Moreover, the taps may be registered substantially in real time, i.e. as soon as the fingers touch the plate and the resulting wave/vibration movement travels along the hand to sensors at the wristband. Hence, the wristband 100 provides a simple arrangement to also detect manipulation of objects which may not be easy to detect even with complicated or costly devices.
In an embodiment, the wristband 100 may be used in combination with a camera-based hand tracking system to complement the hand tracking and/or make hand interaction detection more accurate. The camera-based hand tracking system may be mounted, for example, to a heads-up display. The heads-up display may refer to any transparent display that presents data without requiring users to look away from their usual viewpoints. The heads-up display may comprise a head-mounted display, wherein the display element may move with the orientation of the user's head. The signal processing device associated with the wristband 100 may be configured to receive hand tracking data from the camera-based hand tracking system to determine inputs based on data from both the wristband 100 and the camera-based hand tracking system. Alternatively, the signal processing device may be configured to provide inputs based on the wristband data to a device receiving the hand tracking data, which device may be the heads-up display.
FIG. 10 illustrates an example of another use case of a wristband 100 configured to detect hand interaction events, according to an example embodiment. The user may wear the wristband on at least one hand 500. The wristband 100 may be, for example, any of the wristbands illustrated in FIGS. 1 to 4. In an embodiment, a user may use an existing physical object 502 to provide haptic feedback for the user's digital input. For example, the user may want to type words into a heads-up display. With the wristband 100, the user may use an existing keyboard that they own and type with that, with words showing up on the heads-up display. Importantly, the keyboard may not need to be electronically connected to the heads-up display. This is because, first, an optical tracking system may be configured to know where a finger of the user is with respect to each key on the keyboard and approximate which key is intended to be pressed. The wristband 100 may be configured to detect the exact moment the user touches and then presses a key down, to confirm the key press together with the optical tracking system, such as a camera on the wristband 100. Without the wristband 100, the press events may not be detectable with the optical hand tracking on its own, since the hand itself may occlude the camera's view of the fingers and their actions. Hence, the user may use an object to provide haptic feedback in response to the user interacting with the object 502 to produce inputs to one or more electronic devices.
FIG. 11 illustrates an example of another use case of a wristband 100 configured to detect hand interaction events, according to an embodiment. The wristband 100 may be, for example, the wristband 100 illustrated in any of FIGS. 1 to 4. The detection of touches and other hand interactions may enable users to start using physical objects 502 that have no electronics in them at all to provide inputs for one or more devices. The physical objects may be everyday objects, or they may be custom designed with the main purpose of providing haptic feedback to the user. For example, an ergonomic physical 3D structure may be provided. The 3D structure may be held by the user in hand 500 while wearing the wristband 100. The object 502 may have buttons 1100, scroll wheels 1102, and/or trackballs 1104 on it for the user to be able to feel haptic feedback while clicking, scrolling, and rolling with the 3D structure. These hand interaction events, which may otherwise be too discreet for optical systems to detect accurately, may be detected by the wristband 100, for example, based on vibrations travelling from the 3D structure via the hand 500, and thus make input events feel extremely physical, real, and responsive.
In an embodiment, different sounds and vibrations may be mapped by a signal processing device configured to receive sensor data from the wristband 100 to determine the input associated with the specific sound and/or vibration pattern. The signal processing device may further provide a time of the hand interaction with the object. The input may be provided to the device substantially in real time, as the user input (contact/manipulation of the object) may be detected immediately (for example, with a latency of less than 100 ms) when the contact happens.
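The final mapping step can be sketched as a configurable dispatch table from a classified vibration signature to a programmed input, stamped with the time of the interaction. The signature names and event format below are hypothetical.

```python
import time

def dispatch_input(signature, mapping, emit):
    """Map a classified vibration signature to a programmed input and
    emit it with a monotonic timestamp, so downstream devices receive
    the event close to the moment of contact.  `mapping` is
    user-configurable, matching the text's note that the mapped input
    may be changed.  Returns the dispatched action, or None.
    """
    action = mapping.get(signature)
    if action is not None:
        emit({"action": action, "t": time.monotonic()})
    return action
```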
FIG. 12 illustrates an example embodiment of an apparatus 1200 configured to practice one or more example embodiments. The apparatus 1200 may comprise a wristband comprising at least one sensor.
The apparatus 1200 may comprise at least one processor 1202. The at least one processor 1202 may comprise, for example, one or more of various processing devices, such as for example a co-processor, a microprocessor, a controller, a digital signal processor (DSP), a processing circuitry with or without an accompanying DSP, or various other processing devices including integrated circuits such as, for example, an application specific integrated circuit (ASIC), a field programmable gate array (FPGA), a microcontroller unit (MCU), a hardware accelerator, a special-purpose computer chip, or the like.
The apparatus 1200 may further comprise at least one memory 1204. The memory 1204 may be configured to store, for example, computer program code 1206 or the like, for example operating system software and application software. The memory 1204 may comprise one or more volatile memory devices, one or more non-volatile memory devices, and/or a combination thereof. For example, the memory 1204 may be embodied as magnetic storage devices (such as hard disk drives, magnetic tapes, etc.), optical magnetic storage devices, or semiconductor memories (such as mask ROM, PROM (programmable ROM), EPROM (erasable PROM), flash ROM, RAM (random access memory), etc.).
The apparatus 1200 may further comprise a communication interface 1208 configured to enable the apparatus 1200 to transmit and/or receive information to/from other apparatuses. The communication interface 1208 may be configured to provide at least one wireless radio connection, such as for example a 3GPP mobile broadband connection (e.g. 3G, 4G, 5G). However, the communication interface 1208 may be configured to provide one or more other types of connections, for example a wireless local area network (WLAN) connection such as for example standardized by the IEEE 802.11 series or the Wi-Fi alliance; a short range wireless network connection such as for example a Bluetooth, NFC (near-field communication), or RFID connection; a wired connection such as for example a local area network (LAN) connection, a universal serial bus (USB) connection or an optical network connection, or the like; or a wired Internet connection. The communication interface 1208 may comprise, or be configured to be coupled to, at least one antenna to transmit and/or receive radio frequency signals.
The apparatus 1200 may further comprise a user interface 1210 comprising an input device and/or an output device. The input device may take various forms such as a touch screen, or one or more embedded control buttons. The output device may for example comprise a display, a speaker, or the like.
When the apparatus 1200 is configured to implement some functionality, one or more components of the apparatus 1200, such as for example the at least one processor 1202 and/or the memory 1204, may be configured to implement this functionality. Furthermore, when the at least one processor 1202 is configured to implement some functionality, this functionality may be implemented using program code 1206 comprised, for example, in the memory 1204.
The functionality described herein may be performed, at least in part, by one or more computer program product components such as software components. According to an embodiment, the apparatus 1200 comprises a processor or processor circuitry, such as for example a microcontroller, configured by the program code when executed to execute the embodiments of the operations and functionality described. Alternatively, or in addition, the functionality described herein can be performed, at least in part, by one or more hardware logic components. For example, and without limitation, illustrative types of hardware logic components that can be used include Field-Programmable Gate Arrays (FPGAs), Application-Specific Integrated Circuits (ASICs), Application-Specific Standard Products (ASSPs), System-on-a-Chip systems (SOCs), Complex Programmable Logic Devices (CPLDs), and Graphics Processing Units (GPUs).
The apparatus 1200 comprises means for performing at least one method described herein. In one example, the means comprises the at least one processor 1202 and the at least one memory 1204 including program code 1206 configured to, when executed by the at least one processor 1202, cause the apparatus 1200 to perform the method.
The apparatus 1200 may comprise, for example, a computing device such as a signal processing device, a client device, a wearable device, or the like. Although the apparatus 1200 is illustrated as a single device, it is appreciated that, wherever applicable, functions of the apparatus 1200 may be distributed to a plurality of devices.
According to an example embodiment, the apparatus comprises a wristband comprising a sensor setup and a signal processing device configured to detect hand interactions with objects, such as touch events. The touch events may comprise, for example, the moment a user touches something, the moment the user feels a physical click from an object, and/or the moments the user is sliding a finger over an object. The detected events may trigger various programmable input events to any other digital device such as a smartwatch on the wristband itself, a smartphone, an external monitor, or a heads-up display for AR and/or VR. Hence, usage of real-world devices/objects as virtual devices may be enabled.
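The idea of routing detected touch events to programmable input events on another digital device can be illustrated with a minimal dispatcher. The event names and the bound actions below are hypothetical, chosen only to mirror the examples in the text; they are not defined by the disclosure.

```python
class InputRouter:
    """Route named touch events, detected by the wristband, to
    programmable actions on a target digital device."""

    def __init__(self):
        self.bindings = {}          # event name -> callback

    def bind(self, event, action):
        """Associate a detected event with a programmable action."""
        self.bindings[event] = action

    def dispatch(self, event):
        """Run the bound action, or return None for unbound events."""
        action = self.bindings.get(event)
        return action() if action else None

router = InputRouter()
router.bind("physical_click", lambda: "select")   # e.g. AR menu select
router.bind("finger_slide", lambda: "scroll")     # e.g. smartwatch scroll
result = router.dispatch("physical_click")
```

Because the bindings are arbitrary callbacks, the same detected event can drive a smartwatch, a smartphone, or an AR/VR display interchangeably, which is the "programmable" aspect described above.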
FIG. 13 illustrates an example of a method for detecting hand interaction with an object, according to an embodiment. The method may be executed, for example, with the wristband 100 or the apparatus 1200.
At 1300, the method may comprise determining a hand interaction with an object based on characteristics of signals, indicating physical changes on the wrist, detected from the wrist by at least one sensor mounted at an inner circle of a wristband. When haptic feedback occurs, such as a physical click or vibrations caused by the contact when sliding a finger on the object, characteristic vibrations may be released through the air and/or the inside of the hand, which can be picked up by the sensors, such as microphones. The vibrations may be registered, for example, as touch events and mapped to any input.
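Picking up such vibration bursts from a stream of wrist-sensor samples can be sketched as a short-window energy onset detector. The window size and threshold below are invented for illustration; a real implementation would tune them to the sensor and sampling rate.

```python
def detect_onsets(samples, window=4, threshold=0.5):
    """Return sample indices where the mean absolute amplitude of a
    window jumps above `threshold` after a quieter window - each such
    jump is a candidate touch event."""
    energies = [
        sum(abs(s) for s in samples[i:i + window]) / window
        for i in range(0, len(samples) - window + 1, window)
    ]
    onsets = []
    for i in range(1, len(energies)):
        if energies[i] >= threshold and energies[i - 1] < threshold:
            onsets.append(i * window)   # sample index where the burst starts
    return onsets

# A quiet stretch, a sharp click-like burst, then quiet again.
quiet = [0.01, -0.02, 0.02, -0.01]
click = [0.8, -0.9, 0.7, -0.6]
onsets = detect_onsets(quiet + click + quiet)
```

The onset index doubles as the event time in samples, which is how the detection time of the signals can anchor the timestamp of the touch event.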
In an embodiment, data on the hand interaction with the object, such as the signals, characteristics of the signals, and/or a result of an analysis of the signals on the hand interaction, may be stored and output to be used as an input to a device. The device may be, for example, a monitor, an AR or VR device, a computer, a game console, or the like. The analysis may comprise determination of a type of the hand interaction and/or properties of the object. The type of hand interaction may comprise at least one of the hand holding the object, an impact of the object to the hand, the hand moving the object, the hand sliding on a surface of the object, the hand tapping the object, the hand touching another object with the object, or the hand releasing the object. In an embodiment, the data may comprise a time of the hand interaction. The time may be accurately determined based on a time of reception of the signals via the wrist.
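One way to picture the analysis step, determining an interaction type from coarse signal characteristics, is a rule-based classifier. The feature names and thresholds below are assumptions made for illustration; a practical system would more likely learn such a mapping from recorded sensor data.

```python
def classify_interaction(peak, duration_ms, periodic):
    """Map coarse signal features to a hand-interaction type.
    peak: normalized peak amplitude of the burst,
    duration_ms: burst duration, periodic: repetitive texture noise."""
    if peak < 0.1:
        return "no_interaction"     # below the noise floor
    if periodic:
        return "sliding"            # sustained, repetitive contact
    if duration_ms < 50:
        return "tap"                # short, sharp impulse
    return "holding"                # long, non-periodic contact

# Data record as output to a device: the detected type plus the
# reception time of the signals at the wrist.
record = {
    "type": classify_interaction(peak=0.9, duration_ms=20, periodic=False),
    "time_ms": 1234,
}
```

Bundling the classified type with the reception time matches the embodiment above, where the stored data comprises both the interaction type and an accurately determined time of the hand interaction.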
Further features of the method(s) directly result, for example, from functionalities of the apparatuses described throughout the specification and in the appended claims, and are therefore not repeated here. Different variations of the method(s) may also be applied, as described in connection with the various example embodiments. It is obvious to a person skilled in the art that with the advancement of technology, the basic idea of the disclosure may be implemented in various ways. The disclosure and the embodiments are thus not limited to the examples described above; instead they may vary within the scope of the claims.
An apparatus may be configured to perform or cause performance of any aspect of the method(s) described herein. Further, a computer program may comprise instructions for causing, when executed, an apparatus to perform any aspect of the method(s) described herein. Further, an apparatus may comprise means for performing any aspect of the method(s) described herein. According to an example embodiment, the means comprises at least one processor and memory including program code, the at least one processor and program code configured to, when executed by the at least one processor, cause performance of any aspect of the method(s). Any range or device value given herein may be extended or altered without losing the effect sought. Also, any embodiment may be combined with another embodiment unless explicitly disallowed.
Although the subject matter has been described in language specific to structural features and/or acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are disclosed as examples of implementing the claims, and other equivalent features and acts are intended to be within the scope of the claims.
It will be understood that the benefits and advantages described above may relate to one embodiment or may relate to several embodiments. The embodiments are not limited to those that solve any or all of the stated problems or those that have any or all of the stated benefits and advantages. It will further be understood that reference to 'an' item may refer to one or more of those items.
The operations of the methods described herein may be carried out in any suitable order, or simultaneously where appropriate. Additionally, individual blocks may be deleted from any of the methods without departing from the scope of the subject matter described herein. Aspects of any of the embodiments described above may be combined with aspects of any of the other embodiments described to form further embodiments without losing the effect sought.
The term 'comprising' is used herein to mean including the method blocks or elements identified, but such blocks or elements do not constitute an exclusive list, and a method or apparatus may contain additional blocks or elements.
Although subjects may be referred to as 'first' or 'second' subjects, this does not necessarily indicate any order or importance of the subjects. Instead, such attributes may be used solely for the purpose of distinguishing between subjects.
It will be understood that the above description is given by way of example only and that various modifications may be made by those skilled in the art. The above specification, examples and data provide a complete description of the structure and use of exemplary embodiments. Although various embodiments have been described above with a certain degree of particularity, or with reference to one or more individual embodiments, those skilled in the art could make numerous alterations to the disclosed embodiments without departing from the scope of this specification.

Claims

1. An apparatus, comprising: a wristband; at least one sensor mounted at an inner circle of the wristband and configured to detect signals from a wrist of a user wearing the wristband; at least one processor; and at least one memory comprising program code which, when executed by the at least one processor, causes the apparatus at least to: determine a hand interaction with an object based on characteristics of the signals detected by the at least one sensor from the wrist indicating physical changes on the wrist resulting from the object in contact with the hand; and output data on the hand interaction with the object to be used as an input to a device.
2. The apparatus of claim 1, wherein the at least one memory and the program code are further configured to, when executed by the at least one processor, cause the apparatus to: determine a time of the hand interaction based on a detection time of the signals from the at least one sensor mounted at the inner circle of the wristband.
3. The apparatus of claim 1 or 2, wherein the at least one memory and the program code are further configured to, when executed by the at least one processor, cause the apparatus to: determine a type of the hand interaction with the object based on characteristics of the signals; and determine the input for the device based on the type of the hand interaction with the object.
4. The apparatus of claim 3, wherein the type of hand interaction comprises at least one of the hand touching the object, the hand holding the object, the object in contact with the hand making a sound, an impact of the object to the hand, the hand moving the object, the hand sliding on a surface of the object, the hand tapping the object, the hand touching an object with the object, or the hand releasing the object.
5. The apparatus of any preceding claim, wherein the at least one sensor mounted at the inner circle of the wristband comprises at least one of a microphone, an accelerometer, an acoustic transducer, an optical sensor, or an electrode.
6. The apparatus of any preceding claim, further comprising at least one sensor mounted at an outer circle of the wristband and configured to detect signals from other sources than the wrist; and wherein determining the hand interaction with the object is further based on signals received from the at least one sensor mounted at the outer circle.
7. The apparatus of claim 6, wherein the at least one sensor mounted at the outer circle of the wristband comprises at least one microphone configured to detect sound waves from the air, an accelerometer configured to monitor vibrations on the wristband, or a camera configured to monitor the hand.
8. The apparatus of claim 6 or 7, wherein the at least one memory and the program code are further configured to, when executed by the at least one processor, cause the apparatus to: detect an indication of a falsely detected hand interaction with an object based on signals received from the at least one sensor mounted at the outer circle of the wristband.
9. The apparatus of any preceding claim, wherein the at least one sensor mounted at the inner circle of the wristband is configured to detect signals in response to at least one of vibration on the wrist, interior waves from the wrist, surface waves from the wrist, changes in a shape of a skin surface of the wrist, changes in a tendon pattern of the wrist, or deformation of an electric field on the wrist.
10. The apparatus of any preceding claim, wherein the wristband comprises a plurality of sensors mounted at the inner circle and wherein the at least one memory and the program code are further configured to, when executed by the at least one processor, cause the apparatus to: localize the hand interaction with the object based on a signal profile combination on the plurality of sensors.
11. The apparatus of any preceding claim, wherein the at least one memory and the program code are further configured to, when executed by the at least one processor, cause the apparatus to: determine a type of the object the hand interacts with based on the signals from the at least one sensor mounted at the inner and/or outer circle of the wristband.
12. The apparatus of claim 11, wherein the type of object is determined based on at least one of weight of the object, surface texture of the object, or acoustic properties of the object determined based on the signals.
13. A method to detect a hand interaction with an object, comprising: determining a hand interaction with an object based on characteristics of signals, detected from a wrist by at least one sensor mounted at an inner circle of a wristband, indicating physical changes on the wrist resulting from the object in contact with the hand.
14. A computer program product, comprising instructions to cause the apparatus of any of claims 1 to 12 to execute the method of claim 13.
15. A computer-readable medium comprising instructions which, when executed by the apparatus of any of claims 1 to 12, cause the apparatus to execute the method of claim 13.
PCT/FI2022/050186 2021-03-23 2022-03-23 An apparatus and a method for detecting a hand in contact with an object based on haptic feedback from wrist WO2022200686A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
FI20215321 2021-03-23
FI20215321A FI130792B1 (en) 2021-03-23 2021-03-23 An apparatus and a method for detecting a hand in contact with an object based on haptic feedback from wrist

Publications (1)

Publication Number Publication Date
WO2022200686A1 true WO2022200686A1 (en) 2022-09-29

Family

ID=81325584

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/FI2022/050186 WO2022200686A1 (en) 2021-03-23 2022-03-23 An apparatus and a method for detecting a hand in contact with an object based on haptic feedback from wrist

Country Status (2)

Country Link
FI (1) FI130792B1 (en)
WO (1) WO2022200686A1 (en)

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20210064132A1 (en) * 2019-09-04 2021-03-04 Facebook Technologies, Llc Systems, methods, and interfaces for performing inputs based on neuromuscular control


Also Published As

Publication number Publication date
FI130792B1 (en) 2024-03-22
FI20215321A1 (en) 2022-09-24

Similar Documents

Publication Publication Date Title
CN210573659U (en) Computer system, head-mounted device, finger device, and electronic device
US9958942B2 (en) Data input device
WO2017215375A1 (en) Information input device and method
TWI457793B (en) Real-time motion recognition method and inertia sensing and trajectory
CN104919391B (en) Calculate interface system
CN104769522B (en) The remote controllers with gesture identification function are pointed to 3D
US10877558B1 (en) Wearable gesture input device using electric impedance measurement
CN102016765A (en) Method and system of identifying a user of a handheld device
US10838508B2 (en) Apparatus and method of using events for user interface
US20110133934A1 (en) Sensing Mechanical Energy to Appropriate the Body for Data Input
CN109005336B (en) Image shooting method and terminal equipment
CN108196668B (en) Portable gesture recognition system and method
CN110113116B (en) Human behavior identification method based on WIFI channel information
CN109313502A (en) Utilize the percussion state event location of selection device
US20190049558A1 (en) Hand Gesture Recognition System and Method
CN107145232B (en) A kind of method and apparatus using capacitance detecting identification user's hand behavior
CN108523281B (en) Glove peripheral, method, device and system for virtual reality system
WO2014106862A2 (en) A method and system enabling control of different digital devices using gesture or motion control
CN107533371A (en) Controlled using the user interface for influenceing gesture
CN106055958B (en) A kind of unlocking method and device
JP4677585B2 (en) Communication robot
CN111405361B (en) Video acquisition method, electronic equipment and computer readable storage medium
Saha et al. Gesture based improved human-computer interaction using Microsoft's Kinect sensor
WO2022200686A1 (en) An apparatus and a method for detecting a hand in contact with an object based on haptic feedback from wrist
CN112395921A (en) Abnormal behavior detection method, device and system

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22714472

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 19.01.2024)