GB2552219A - Wearable input device - Google Patents

Wearable input device

Info

Publication number
GB2552219A
Authority
GB
United Kingdom
Prior art keywords
processing
wearable
controller
operable
user
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
GB1612304.4A
Other versions
GB201612304D0 (en)
Inventor
Antony Hack Stephen
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sony Interactive Entertainment Inc
Original Assignee
Sony Interactive Entertainment Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Interactive Entertainment Inc filed Critical Sony Interactive Entertainment Inc
Priority to GB1612304.4A priority Critical patent/GB2552219A/en
Publication of GB201612304D0 publication Critical patent/GB201612304D0/en
Publication of GB2552219A publication Critical patent/GB2552219A/en
Withdrawn legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16Constructional details or arrangements
    • G06F1/1613Constructional details or arrangements for portable computers
    • G06F1/163Wearable computers, e.g. on a belt
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F15/00Digital computers in general; Data processing equipment in general
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Computer Hardware Design (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Processing system for generating inputs comprising a wearable control device 100 operable to detect changes of the surface of a first body part on which it is worn, and a processing device 110 operable to receive from the control device surface change information and with it identify a gesture associated with a second body part, and generate inputs based on an identified gesture. The first part may be an arm and the second part a hand. The controller may comprise a plurality of sensors operably coupled with the user's skin. It may comprise a sensor for determining the position and/or orientation of the controller. The system may comprise a camera 130 operable to capture an image of the controller, which may be processed to determine position and/or orientation. The controller may comprise a marker to be imaged. Information about position and/or orientation may be used to generate inputs. Gestures may be identified using a model and/or a look-up table. Also provided is a method for operating the system, which may be realised as a computer program.

Description

(54) Title of the Invention: Wearable input device
Abstract Title: Input processing system with wearable control device for gesture recognition
[Drawings: Figure 1 shows the entertainment system including the display 120; Figure 2 shows a plurality of wearable controllers on a user's arm 200; Figure 3 shows a cut-away view of a controller with control unit 310; Figure 4 shows the method of generating inputs; Figure 5 shows a block diagram of the wearable controller 100 (sensors 500, sensor input unit 510, communication unit 520) and the processing device (communication unit 540, gesture identification unit 550, input generation unit 560, storage 570).]
WEARABLE INPUT DEVICE
This disclosure relates to wearable input devices.
In many processing arrangements it may be desirable to provide an apparatus that allows intuitive inputs to be used for issuing commands to a processing device. Conventional controllers may be seen as limiting in this respect: the number of inputs is restricted by the number of buttons on the controller (or the complexity of using the controller increases if the number of inputs is high), and having an action correspond to a button press may appear unintuitive to a user, especially in a virtual reality (VR) or augmented reality (AR) application, where users may expect more naturalistic interaction. Conventional controllers may therefore decrease the sense of immersion that a user experiences when playing a game, for example, or at least limit the number or type of inputs that a user can use.
In previous known arrangements, such as the EyeToy® system, a camera was provided that was operable to capture images of the user during the playing of a game, for example. Image processing was then performed on the images captured of the user in order to detect movements made by the user which could then be interpreted as gestures. Processing corresponding to the user’s gesture may then be performed in response to this. Arrangements such as this may place a large processing burden on a processor in performing the image processing, and also may not be sufficiently accurate for many applications. For example, in many situations either the camera resolution or lighting conditions limited the fidelity of the captured image and hence what movements or gestures could be detected by the system.
Some arrangements have sought to alleviate the processing burden by providing recognisable markers, such as the PlayStation® Move ® motion controller which comprises a light source as an active marker. By providing such a recognisable marker in the image captured by the camera associated with the processing device a gesture may be more easily detected in the image processing step. However, providing handheld devices may not be suitable for a number of applications, as the number of gestures that may be performed is limited to movements of the device - actions such as clenching a fist or pointing are not distinguishable, for example, or are simply not possible due to the user holding a controller comprising the marker.
A further arrangement that has been proposed is that of a glove (worn by a user) that is operable to detect hand motions as gestures. However using gloves may restrict the ability of the user to interact with physical objects, such as picking up a prop or a drink or the like. In addition to this, gloves that are able to detect hand gestures may need to be well fitted to the user’s hand; therefore gloves cannot be easily shared between a plurality of users and may be expensive to produce due to the more bespoke nature of the product.
The Myo® gesture control armband is a recent arrangement that has been proposed in order to address these problems. This armband comprises a plurality of units that each comprise EMG (electromyography) sensors that are operable to detect the use of muscles in the user’s arm using an electrical connection. By recording the use of particular muscles, it is possible to determine the arm motion that is being performed by the user. However, such an arrangement may not be appropriate for all applications - for example, the use of an electrical connection may be unsafe for some users, and the costs associated with such a device can be prohibitively high.
In view of the above problems, an arrangement is provided that may be advantageous in a number of respects, which seeks to give a user the ability to perform a number of possible gestures that may be recognised by a processing device and interpreted to generate instructions to control an application being run by a processing device, which furthermore does not rely on the direct detection of the use of muscles.
The present disclosure is defined by claims 1 and 11, with further respective aspects and features of the disclosure being defined in the appended claims.
Embodiments of the disclosure will now be described with reference to the accompanying drawings, in which:
Figure 1 schematically illustrates an entertainment system;
Figure 2 schematically illustrates a plurality of wearable controllers;
Figure 3 schematically illustrates a wearable controller;
Figure 4 schematically illustrates a method of generating inputs for an application; and
Figure 5 schematically illustrates a wearable controller and a processing system.
A wearable input device and processing method are disclosed. In the following description, a number of specific details are presented in order to provide a thorough understanding of the embodiments of the present invention. It will be apparent, however, to a person skilled in the art that these specific details need not be employed to practice the present invention. Conversely, specific details known to the person skilled in the art are omitted for the purposes of clarity where appropriate.
Figure 1 schematically illustrates an entertainment system comprising a controller 100, a processing device 110, a display 120 and a camera 130.
The controller 100 is operable to detect changes of the surface of a first body part upon which the controller 100 is worn, and to provide inputs to the processing device 110 that may be interpreted so as to cause processing to be performed. The controller 100 may provide inputs to the processing system responsive to a user's gestures, button presses (if the controller 100 comprises one or more buttons), and/or any other suitable inputs such as voice control (if the controller 100 comprises a microphone). In the present disclosure, the controller 100 is wearable upon the user's arm and is operable to detect hand motions. Hence the controller is worn on a first part of the body such as the arm, but is operable to detect gestures performed by a second part of the body such as the hand. A plurality of wearable controllers 100 may be provided, such that a user may wear one or more controllers 100 on either or both of their arms. The wearable controller 100 may optionally comprise a handheld component, such as an additional unit with extra buttons, to provide a greater range of possible inputs to the user.
While this disclosure discusses the wearing of such a controller on the user’s arm, it would be immediately apparent to the skilled person upon reading this disclosure that the controller could be worn on any part of the body; an example being a controller that is wearable on the user’s leg that is able to determine the motion of the user’s foot. It may be possible to detect, from muscle motion, which part of the body the wearable controller is worn upon; for example, forearm muscles and upper-arm muscles will cause different sensor readings due to the different muscles present. Alternatively, or in addition, the user may manually inform the processing device 110 of the location of the controller 100 on their body, and/or controllers may be preset for different parts of the body.
By providing a device that may be able to detect hand motions without being in contact with the user’s hand, the user’s hands are left free to either hold other objects (such as a prop) or make a greater range of gestures that would not be possible when wearing a controller on the hand or holding a controller.
The processing device 110 is operable to receive information from the wearable controller 100 relating to the detected surface changes of a first part of the user’s body, and to identify a gesture associated with a second part of the user’s body using the received information. An example of this is receiving information about motion of the surface of the user’s arm (for example their forearm) and identifying a hand gesture associated with that motion. The processing device is also operable to generate inputs for controlling processing based upon the identified gesture. It will be understood that motion of the surface of the user’s arm may in whole or part also constitute deformation of the profile of the user’s arm. Consequently where reference to one is made herein, the other may be inferred as appropriate.
The processing device is also used to perform processing, such as executing a game program or other application, which may be controllable by the controller 100. The processing device 110 may generate images that are to be output to the display 120. The processing device 110 is described in more detail below, with reference to Figure 5.
The display 120 is operable to display images generated by the processing device 110.
The optional camera 130 is operable to capture images of the wearable controller 100, wherein the processing device 110 is operable to perform processing to determine a position and/or orientation of the wearable controller 100 from the images. These images may be used to perform motion tracking or otherwise locate the controller 100 in order to help recognise gestures. In order to assist with such tracking, the wearable controller may comprise a marker that may be imaged by the camera that is more easily recognisable in an image processing step.
Alternatively or in addition, the controller may comprise one or more sensors in the form of accelerometers / gyroscopes to provide position, motion and/or orientation telemetry to the processing device, again to help recognise gestures.
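As a loose illustration of how such telemetry might travel alongside the surface-change data, a controller report could bundle both kinds of reading into one packet; the field names below are assumptions made for illustration only, not part of the disclosed device:

```python
import time
from dataclasses import dataclass, field

@dataclass
class ControllerSample:
    """One hypothetical telemetry packet sent from the wearable controller."""
    surface_readings: list[float]                                 # strain-gauge values (surface changes)
    acceleration: tuple[float, float, float] = (0.0, 0.0, 0.0)    # from an accelerometer, if fitted
    angular_rate: tuple[float, float, float] = (0.0, 0.0, 0.0)    # from a gyroscope, if fitted
    timestamp: float = field(default_factory=time.monotonic)      # when the sample was taken
```

The processing device could then use the inertial fields to estimate where the arm is pointing or how it is moving, while the surface readings indicate what the hand is doing.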
Figure 2 schematically illustrates a plurality of wearable controllers of different sizes arranged on a user’s arm 200. Each one of these may be suitable for a different application, and in some applications it may be suitable to provide the user with multiple controllers (either of the same or different sizes), for example to allow more accurate tracking. Each of the controllers illustrated here may be worn at any location on the arm; although it is expected that a smaller controller may be worn around the wrist, it may also be worn higher up the arm for example. The sensors that are embedded in the controllers will be discussed below with reference to Figure 3, as well as the method of communicating the results of sensor measurements to the processing device.
The controller 210 is a smaller controller that may be worn as a bracelet, allowing a user to be relatively unhindered by the controller 210 due to the light weight and small profile. This may be suitable for applications in which a lower level of functionality is acceptable; for example, only coarse gesture recognition and optionally measurements such as a user’s pulse rate. An example of such an application is the control of movie playback in which a user may only commonly use pause/play and stop commands - such a small number of corresponding gestures may be easily distinguished by the controller 210 as the gestures may be defined as being suitably distinct. For example, the commands could be related to the actions of making a fist or a flat palm with the hand with which the controller 210 is associated.
The controller 220 is a larger size than the controller 210, and as such may comprise a greater number of sensors that are able to monitor a larger portion of the user’s arm. As a result, the controller 220 may be able to detect more refined gestures; for example, rather than a binary choice of gestures as in the play/pause example the controller 220 may be able to distinguish between several different hand gestures. This is because the use of a greater number of muscles may be detected, in addition to being able to map the use over a greater area of the user’s arm. The controller 220 may therefore be more suitable as a gaming controller than the controller 210, due to the increased number of possible inputs.
The controller 230 is the largest of those illustrated in Figure 2, and generally (due to its greater size) will comprise a greater number of sensors than either of the controllers 210 or 220, over a greater area. As a result, the controller 230 may be most suitable for detecting a large number of gestures, as the increased number of sensors may allow more similar gestures to be distinguished from one another. While this may lead to an increased cost of the controller 230, the improved functionality provides an advantage over smaller controllers.
The wearable controllers may also comprise an AR marker in order to aid tracking of the controller using images obtained by the optional camera 130 associated with the processing device. This may be used instead of, or in addition to, other methods for determining the position of the controller. Determining the position of the controller may be useful in providing additional information to be used for identifying gestures performed by the user of the controller.
Figure 3 schematically illustrates a cut-away view of the controller 230. A plurality of sensors 300 that are in contact with the user’s skin (when in use) are shown, in addition to a control unit 310. The control unit 310 may comprise a number of further sensors, such as an accelerometer, as well as a communication unit in order to transmit sensor information to the processing device 110.
Each of the sensors 300 is operable to detect surface changes of the user’s arm; for example, a movement of the skin. Such changes may be indicative of the operation of the underlying muscles, and therefore by identifying such changes the gesture that is made by the user may be derived by an associated processing device. The sensors 300 may of course be provided in any suitable arrangement; for example, a mesh of sensors may provide more accurate and precise measurements than the concentric ring arrangement of sensors 300 shown in Figure 3.
An example of such a sensor is a strain gauge, a component whose electrical resistance, and hence the voltage read from it, varies in response to deformation. Each sensor 300 may comprise a number of these, or each sensor 300 may correspond to a single strain gauge; the arrangement of the strain gauges is dependent on the desired resolution of the arrangement. By arranging these strain gauges in a suitable manner, it is possible to generate a map of the user's arm at a desired time of measurement by detecting the value of the output voltage for each strain gauge. These values may be used with a model or a look-up table in order to determine which gesture corresponds to the detected motion. The magnitude of the deformation may also be used to determine the magnitude of the gesture; for example, a lightly clenched fist and a strongly clenched fist result in different amounts of movement of the user's arm.
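As a rough sketch of the look-up approach just described, the following Python fragment matches a vector of strain-gauge readings against per-gesture templates and derives a crude magnitude from the overall deformation. The gesture names, template values and function name are entirely hypothetical and are not taken from the disclosure:

```python
import math

# Hypothetical per-gesture reference vectors of normalised strain-gauge
# readings (one value per gauge), e.g. produced during calibration.
GESTURE_TEMPLATES = {
    "fist":      [0.82, 0.75, 0.40, 0.10],
    "flat_palm": [0.05, 0.10, 0.12, 0.08],
    "point":     [0.60, 0.15, 0.55, 0.20],
}

def identify_gesture(readings):
    """Return (gesture, magnitude) for a vector of strain-gauge readings.

    The gesture is the template with the smallest Euclidean distance to the
    readings; the magnitude compares the overall deformation with the matched
    template, so a lightly clenched fist scores lower than a strong one.
    """
    best_gesture, best_dist = None, float("inf")
    for name, template in GESTURE_TEMPLATES.items():
        dist = math.dist(readings, template)
        if dist < best_dist:
            best_gesture, best_dist = name, dist
    template = GESTURE_TEMPLATES[best_gesture]
    magnitude = sum(readings) / max(sum(template), 1e-6)
    return best_gesture, magnitude

print(identify_gesture([0.78, 0.70, 0.35, 0.12]))  # -> ('fist', ~0.94)
```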
It will be appreciated that whilst the sensors are illustrated in Figure 3 as being in contact with the user's skin, this is not essential; instead of being directly coupled to the skin, the sensors may for example be embedded within or bonded to an outer surface of a material that indirectly couples (transfers or translates) movement / deformation of the skin surface to the sensors, such as a latex, silicone, rubber or neoprene sleeve, or more generally any skin-tight or other suitable material that engages with the user's skin (for example due to friction) and consequently tracks its motion and/or deformation.

In order to accurately identify the gesture being performed by a user, a calibration period may be required in which the user is requested to perform a series of gestures. Information gathered during this calibration period may be used to generate a model of the user's arms, for example by identifying the range of motion experienced by the user for different actions and then applying this to a model of the interactions between muscles in the arms and corresponding hand gestures, which may be used to identify gestures from a pre-determined database. Alternatively, or in addition, a look-up table may be generated which relates each of the series of gestures to a set of corresponding detected readings. Calibration for a particular gesture may be performed upon launching an application that makes use of the gesture, rather than upon an initial system start-up or first use of the wearable controller.
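One way such a calibration pass might be organised is sketched below, purely as an assumption: the read_sensors callable and the sample count stand in for whatever interface the wearable controller actually exposes, and the averaging is one simple choice among many:

```python
from statistics import mean

def calibrate(requested_gestures, read_sensors, samples_per_gesture=50):
    """Build a per-user look-up table mapping each requested gesture to the
    average sensor reading vector recorded while the user holds that gesture."""
    table = {}
    for gesture in requested_gestures:
        print(f"Please hold the '{gesture}' gesture...")
        # Collect several reading vectors while the gesture is held.
        samples = [read_sensors() for _ in range(samples_per_gesture)]
        # Average each gauge across the captured samples.
        table[gesture] = [mean(column) for column in zip(*samples)]
    return table
```

A table produced in this way could then serve as the set of per-gesture templates used by the matching sketch above.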
The control unit 310 may comprise a processor to generate information from the results obtained from the sensors that may be transmitted to the processing device using a communication unit also present in the control unit 310. The control unit 310, as noted above, may also comprise one or more further sensors, such as a sensor that is operable to determine the position and/or orientation of the wearable control device. An example of such a sensor is an accelerometer that may be used to detect the position, motion and/or acceleration of the user’s arm; measurements from one or more such sensors may be used to refine identification of or otherwise characterise the gesture being performed by the wearer of the controller.
The control unit 310 is also operable to transmit sensor data from the wearable controller to the associated processing device. This may be via any wired or wireless connection that is appropriate for the application.
Figure 4 schematically illustrates a method for generating inputs for an application using the disclosed arrangement.
At a step 400 motion of the surface of the user’s arm is detected by the sensors in the wearable controller. Additionally, the motion of the user’s arm itself may be detected via an accelerometer in the wearable controller for example.
At a step 410 data about the detected motion is transmitted by the wearable controller to the processing device with which the wearable controller is associated.
At a step 420 the information received by the processing device is used to identify the gesture that is made by the user. As described above, this may comprise the steps of either using a model in conjunction with the data about the detected motion to determine which gesture is performed, or by consulting a look-up table or any other suitable method.
At a step 430 the input corresponding to the identified gesture is determined; this may be achieved using a look-up table that relates gestures to input commands, which in turn may have a one-to-one relationship or may vary with context in-game. These inputs are used to control the processing of the application currently being executed by the processing device.
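Taken together, steps 400 to 430 amount to a sense, transmit, identify and map pipeline. The sketch below is one hypothetical way of expressing that flow; none of the function names come from the disclosure, and the transmission step is reduced to building a payload dictionary:

```python
def generate_input(read_sensors, identify_gesture, gesture_to_input):
    """One pass through steps 400-430, with the controller-side and
    device-side roles collapsed into a single function for illustration."""
    readings = read_sensors()                   # step 400: detect surface motion
    payload = {"readings": readings}            # step 410: data sent to the processing device
    gesture, _magnitude = identify_gesture(payload["readings"])  # step 420: identify the gesture
    return gesture_to_input.get(gesture)        # step 430: look up the corresponding input command
```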
Figure 5 schematically illustrates an entertainment system comprising the wearable controller 100 and processing device 110.
The wearable controller 100 comprises the sensors 500, sensor input unit 510 and communication unit 520.
The sensors 500 include the sensors 300 of Figure 3 in addition to any accelerometers and the like that are used to supplement the detections performed by these sensors. The inputs from the sensors 500 are provided to the sensor input unit 510, which may perform any appropriate processing on the inputs such as changing the format or selecting particular measurements. The communication unit 520 is then operable to transmit the inputs to the processing device 110 via the connection 530.
The processing device 110 comprises a communication unit 540, a gesture identification unit 550, an input generation unit 560 and storage 570.
The communication unit 540 is operable to receive sensor information from the wearable controller 100. This information is then provided to the gesture identification unit 550, along with any additional information such as images from the optional camera 130 of Figure 1. The gesture identification unit 550 is operable to identify a gesture that corresponds to the sensor data recorded by the wearable controller 100, using either a look-up table or a suitable model that relates sensor readings to gestures. As previously noted, information about the position and/or orientation of the wearable controller may also be used to generate inputs for controlling processing.
The input generation unit 560 is operable to use the gesture information generated by the gesture identification unit 550 to identify an input for the application that is currently being executed by the processing unit 110. This may be achieved by referring to a look-up table that relates gestures to inputs for an application; such a look-up table may be a part of the application that is currently being executed (such that applications may have unique gesture/input pairings), a system-level table (such that each application uses the same gesture/input pairings) or a combination of the two (such that the system-level table is supplemented with application-specific pairings). Alternatively, the gesture itself may be an input for the application and as such can simply be provided as an input to the application that is being executed.
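One convenient way to realise the "combination of the two" behaviour, offered only as an illustrative assumption rather than the disclosed implementation, is to layer an application-specific table over a system-level one so that application entries take precedence:

```python
from collections import ChainMap

# Hypothetical system-level gesture/input pairings shared by every application.
SYSTEM_BINDINGS = {"flat_palm": "pause", "fist": "stop"}

# Hypothetical pairings shipped with one particular game; entries here
# override or extend the system-level table.
APPLICATION_BINDINGS = {"point": "fire", "fist": "reload"}

# ChainMap consults the application table first, then falls back to the system table.
bindings = ChainMap(APPLICATION_BINDINGS, SYSTEM_BINDINGS)

print(bindings["point"])      # 'fire'   (application-specific)
print(bindings["fist"])       # 'reload' (application override)
print(bindings["flat_palm"])  # 'pause'  (system-level fallback)
```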
Within an application a gesture may be assigned to a particular action - a 'fire' action in a first person shooter may be linked to a corresponding motion of the user's index finger, for example. This may be defined by the application itself, or there may be a system-level set of look-up tables that define such an action depending on the type of application. Alternatively, or in addition, a specific gesture by the user may replace a button press from a gamepad; this may be particularly suitable for use with older games that do not specifically support inputs from the wearable controller 100. A direct mapping between gestures and button presses may be particularly suitable for mappings that are associated with the wearable controller 100 or the processing device 110, although they could also be associated with the application that is being executed. As noted above, such assignments may be fixed or may vary depending on in-game / OS context.
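For the legacy-game case, the mapping could be as simple as a dictionary from gestures to gamepad button codes; the button names and the press_button callable below are placeholders for whatever pad-emulation interface the processing device provides, not a documented API:

```python
# Hypothetical mapping used when an older title only understands gamepad
# button codes: each recognised gesture is simply replayed as a button press.
GESTURE_TO_BUTTON = {"point": "R2", "fist": "CROSS", "flat_palm": "OPTIONS"}

def inject_button(gesture, press_button):
    """Translate a gesture into a legacy button press, if one is mapped."""
    button = GESTURE_TO_BUTTON.get(gesture)
    if button is not None:
        press_button(button)  # press_button is a placeholder for the pad-emulation call
```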
Storage 570 is operable to store one or more look-up tables and models that may be referenced as a part of the input generation process, in addition to application information and the like.
The techniques described above may be implemented in hardware, software or combinations of the two. In the case that a software-controlled data processing apparatus is employed to implement one or more features of the embodiments, it will be appreciated that such software, and a storage or transmission medium such as a non-transitory machine-readable storage medium by which such software is provided, are also considered as embodiments of the disclosure.

Claims (15)

1. A processing system for generating inputs for an application being executed by a processing device, the system comprising:
a wearable control device operable to detect changes of the surface of a first body part upon which the wearable control device is worn; and
a processing device operable to receive information from the wearable control device relating to the detected surface changes and identify a gesture associated with a second body part using the received information,
wherein the processing device is operable to generate inputs for controlling processing based upon an identified gesture.
2. A processing system according to claim 1, wherein the wearable control device comprises a plurality of sensors operably coupled with the user’s skin.
3. A processing system according to claim 1, wherein the wearable control device comprises a sensor that is operable to determine the position and/or orientation of the wearable control device.
4. A processing system according to claim 1, comprising a camera that is associated with the processing device that is operable to capture an image of the wearable controller, wherein the processing device is operable to perform processing to determine a position and/or orientation of the wearable controller from the image.
5. A processing system according to claim 4, wherein the wearable controller comprises a marker that may be imaged by the camera.
6. A processing system according to either of claims 3 or 4, wherein information about the position and/or orientation of the wearable controller is used to generate inputs for controlling processing.
7. A processing system according to claim 1, wherein the first body part is the user’s arm and the second body part is the user’s hand.
8. A processing system according to claim 1, wherein gestures are identified from the information received from the wearable device using one or more from the list consisting of:
i. a model of the correspondence between surface changes and muscle motion; and
ii. a look-up table.
9. A processing system according to claim 1, wherein a model and/or a look-up table is used by the processing device to identify which body part the wearable controller is being worn upon.
10. A processing system according to claim 1, wherein identified gestures are related to inputs to control processing via a look-up table associated with one or more of the wearable controller, the processing device and an application that is being executed.
11. A processing method for generating inputs for an application being executed by a processing device, the method comprising:
detecting changes of the surface of a first body part upon which a wearable control device is worn;
transmitting information to the processing device from the wearable control device, the information relating to the detected surface changes;
identifying a gesture associated with a second body part using the transmitted information; and
generating inputs for controlling processing based upon an identified gesture.
12. A processing method according to claim 11, wherein the wearable control device comprises a plurality of sensors that are operably coupled with the user’s skin.
13. A processing method according to claim 11 wherein gestures are identified from the information received from the wearable device using one or more from the list consisting of:
i. a model of the correspondence between surface changes and muscle motion; and ii. a look-up table.
14. A computer program that, when executed by a computer, causes the computer to carry out the method of claim 11.
15. A machine-readable non-transitory storage medium which stores computer software according to claim 14.
Intellectual Property Office
Application No: GB1612304.4
Claims searched: 1-15
GB1612304.4A 2016-07-15 2016-07-15 Wearable input device Withdrawn GB2552219A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
GB1612304.4A GB2552219A (en) 2016-07-15 2016-07-15 Wearable input device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB1612304.4A GB2552219A (en) 2016-07-15 2016-07-15 Wearable input device

Publications (2)

Publication Number Publication Date
GB201612304D0 (en) 2016-08-31
GB2552219A true GB2552219A (en) 2018-01-17

Family

ID=56890586

Family Applications (1)

Application Number Title Priority Date Filing Date
GB1612304.4A Withdrawn GB2552219A (en) 2016-07-15 2016-07-15 Wearable input device

Country Status (1)

Country Link
GB (1) GB2552219A (en)

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120157886A1 (en) * 2010-12-17 2012-06-21 Industrial Technology Research Institute Mechanomyography Signal Input Device, Human-Machine Operating System and Identification Method Thereof
US20130265229A1 (en) * 2012-04-09 2013-10-10 Qualcomm Incorporated Control of remote device based on gestures
US20140334083A1 (en) * 2013-05-13 2014-11-13 Thalmic Labs Inc. Systems, articles and methods for wearable electronic devices that accommodate different user forms
WO2015033327A1 (en) * 2013-09-09 2015-03-12 Belfiori Alfredo Wearable controller for wrist
WO2015119637A1 (en) * 2014-02-10 2015-08-13 Bodhi Technology Ventures Llc Motion gesture input detected using optical sensors
KR20150112741A (en) * 2014-03-27 2015-10-07 전자부품연구원 Wearable device and information input method using the same
US20160091980A1 (en) * 2014-09-30 2016-03-31 Apple Inc. Motion and gesture input from a wearable device

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113167576A (en) * 2018-04-19 2021-07-23 泰克萨维技术有限公司 Method and system for estimating topography of at least two parts of body
EP3781903A4 (en) * 2018-04-19 2022-04-20 Texavie Technologies Inc. Methods of and systems for estimating a topography of at least two parts of a body

Also Published As

Publication number Publication date
GB201612304D0 (en) 2016-08-31

Similar Documents

Publication Publication Date Title
US10970936B2 (en) Use of neuromuscular signals to provide enhanced interactions with physical objects in an augmented reality environment
CN110325947B (en) Haptic interaction method, tool and system
US11016569B2 (en) Wearable device and method for providing feedback of wearable device
US9360944B2 (en) System and method for enhanced gesture-based interaction
US20170336870A1 (en) Foot gesture-based control device
JP2022500729A (en) Neuromuscular control of augmented reality system
CN107789803B (en) Cerebral stroke upper limb rehabilitation training method and system
US20170262056A1 (en) Selection of optimally positioned sensors in a glove interface object
US20120157263A1 (en) Multi-user smartglove for virtual environment-based rehabilitation
CN104023802B (en) Use the control of the electronic installation of neural analysis
JPWO2016038953A1 (en) DETECTING DEVICE, DETECTING METHOD, CONTROL DEVICE, AND CONTROL METHOD
US20120133581A1 (en) Human-computer interaction device and an apparatus and method for applying the device into a virtual world
Song et al. Activities of daily living-based rehabilitation system for arm and hand motor function retraining after stroke
US20230142242A1 (en) Device for Intuitive Dexterous Touch and Feel Interaction in Virtual Worlds
Wang et al. HapticSphere: Physical support to enable precision touch interaction in mobile mixed-reality
Novacek et al. Overview of controllers of user interface for virtual reality
KR102438347B1 (en) Smart wearable devices and smart wearable equipment
GB2552219A (en) Wearable input device
US20220253140A1 (en) Myoelectric wearable system for finger movement recognition
CN114255511A (en) Controller and method for gesture recognition and gesture recognition device
US20230316620A1 (en) System and method for generating a virtual avatar
EP4354258A1 (en) Virtual reality control method for avoiding motion sickness
TWI835155B (en) Virtual reality control method for avoiding occurrence of motion sickness
WO2015067481A1 (en) Wearable electronic device, electronic system, as well as associated method and computer program product

Legal Events

Date Code Title Description
WAP Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1)