GB2428802A - Wearable motion sensor device with RFID tag - Google Patents

Wearable motion sensor device with RFID tag

Info

Publication number
GB2428802A
GB2428802A GB0614404A
Authority
GB
United Kingdom
Prior art keywords
network
data
motion
mcu
storage memory
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
GB0614404A
Other versions
GB0614404D0 (en)
Inventor
Peter Mccarthy
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual
Publication of GB0614404D0
Publication of GB2428802A
Withdrawn

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0346Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/103Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16Constructional details or arrangements
    • G06F1/1613Constructional details or arrangements for portable computers
    • G06F1/163Wearable computers, e.g. on a belt
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/038Control and interface arrangements therefor, e.g. drivers or device-embedded control circuitry
    • G06F3/0383Signal control means within the pointing device
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/039Accessories therefor, e.g. mouse pads
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/038Indexing scheme relating to G06F3/038
    • G06F2203/0384Wireless input, i.e. hardware and software details of wireless interface arrangements for pointing devices

Abstract

A wearable device includes an accelerometer 17 and a gyroscope 18 to measure motion, a passive RFID (radio frequency identification) tag 24 to identify the wearer of the device, a memory 15 and a wireless transceiver 20 to transmit measured data via an ad-hoc mesh network to a host computer 41 for processing. The host computer 41 may transmit new software programs via the network to the device for storage and execution. The device may be in the form of a cufflink, wristwatch, popper stud etc. (see Figure 3). The device may be used for example to capture hand 44 motion for gesture recognition. The device includes a battery 22 or a rechargeable battery, power manager and solar cell (see Figure 2).

Description

A Motion Capture And Identification Device
Description
This Description claims priority over Patent Application Number GB 0515796.1, filed on 30th July 2005.
BACKGROUND OF THE INVENTION
Field of The Invention
This invention relates to an electronic device that: [i] captures human (i.e. user) motion, and [ii] uniquely identifies the user. The invention allows for the device to sense and record (i.e. store) user motion, and to transmit motion and identification information as digital data - via a wireless network - to a host computer for processing. The invention also allows for the device to be worn comfortably and discreetly by the user.
Description of Related Art
The technique of sensing, recording (i.e. storing), and transmitting human motion - as digital data - is commonly known as motion capture. The main purpose of motion capture is to enable the creation of three dimensional (3D) computer models that represent human motion occurring within a calibrated environment. These models are then processed by computer software for use in a number of scenarios, for example: Computer Graphics Imagery (CGI) animation, and gait/gesture recognition.
One method for capturing motion relates to a body suit worn by a human (i. e. an actor). Body suits are frequently used in the movie and computer game industries to model human motion for the purpose of creating lifelike CGI animation. The body suit is typically made from a flexible material - such as lycra - and comprises a number of sensors distributed on the arms, legs, torso, and head.
Sensors can be either passive or active. In the case of passive sensors, reflective markers - or small ping-pong type balls - are attached to the body suit. Installed within the motion capture environment are a light source and a number of video sensors (i.e. cameras) - each calibrated to cover various positions and angles where motion will be conducted. When the actor enters the motion capture environment, video from all cameras is recorded and analysed frame-by-frame by software running on a host computer. The image processing algorithms, employed by the software, look for reflections from the markers - or for the presence of the small balls - in order to build a set of reference points. These points are then connected together to form a 3D model that represents the motion made by the actor's body.
In the case of active sensors, small electronic sensing devices - such as accelerometers and gyroscopes - are attached to the body suit instead. Accelerometer sensors measure acceleration in multiple directions, from which calculations such as velocity, direction, and position can be derived. Gyroscope sensors, on the other hand, measure angular rates of rotation. Each sensor attached to the body suit is connected - normally via a wired link - to a wearable control unit. The control unit is responsible for providing a power supply to all sensors, and for collecting motion metrics as the actor moves his arms, legs, torso, and head. The control unit then transmits motion metrics - either by a wired or wireless link - to the host computer, so that software can build suitable 3D models and analyse the actor's motion.
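The derivation step mentioned above - obtaining velocity and position from acceleration measurements - can be sketched by numerical integration. This is a minimal illustration only; the function name, sample values, and sampling interval are assumptions, not taken from the patent.

```python
# Sketch: deriving velocity and position from accelerometer samples by
# cumulative trapezoidal integration. Sample data and the 10 ms
# sampling interval are illustrative only.

def integrate(samples, dt):
    """Cumulative trapezoidal integral of a regularly sampled signal."""
    out = [0.0]
    for i in range(1, len(samples)):
        out.append(out[-1] + 0.5 * (samples[i - 1] + samples[i]) * dt)
    return out

accel = [0.0, 1.0, 2.0, 2.0, 1.0, 0.0]  # one-axis acceleration, m/s^2
dt = 0.01                               # 10 ms between samples

velocity = integrate(accel, dt)     # m/s at each sample
position = integrate(velocity, dt)  # metres at each sample
```

In practice each integration step also accumulates sensor noise and bias, which is one reason inertial estimates are usually combined with other references before 3D models are built.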
Another method for capturing human motion relates to a glove worn by a user. Gloves are most commonly used in Virtual Reality (VR) applications, and are made from a flexible material - such as lycra - onto which a number of sensors are distributed (usually near the wrist and finger/thumb joints). A combination of flexure and abduction sensors is typically employed. Flexure sensors, made from optical fibres, measure the bend and twist of the user's fingers and thumbs. Abduction sensors, meanwhile, measure the 'spread' of the fingers and thumbs - i.e. measure the amount that the fingers/thumbs move laterally in the plane of the palm.
As with the body suit method, sensors are connected - via wired links - to a wearable control unit responsible for collecting motion metrics from each sensor. The control unit then transmits motion metrics, typically via a wired link, to the host computer so that software can build 3D models of hand motion. These motion models are then processed to generate animation sequences, or analysed to determine hand signals and other gestures.
A further method for capturing human motion relates to the use of video sensors (i.e. cameras) and Computer Vision (CV) techniques. This method does not require a person to wear any special clothing, and is commonly employed within covert security/surveillance scenarios to detect the presence - or suspicious behaviour - of people within sensitive areas.
Cameras are positioned to cover specific locations in which movement can occur. If the cameras generate analog video, the host computer will first digitise each frame into a digital data format that it can process. On the other hand, if the cameras generate digitised video - particularly if they participate in a TCP/IP network - each frame will already be in a data format that the computer can process. To understand human motion, the computer's software uses established Computer Vision techniques - such as Hidden Markov Models and Kalman filters - to analyse the data from each video frame. This analysis process results in the identification of a person's direction and speed of movement, a person's gestures - such as hand waving, or a person's gait (pose) - for example loitering.
Although the above methods are well established, they suffer from a number of issues that inhibit their general flexibility and usability:
* In the case of the body suit method, passive sensors - such as reflective markers - require an optimal operational environment, where lighting and video sensors are carefully calibrated to achieve accurate results. If calibration is not done properly, or the actor wearing the body suit moves out of this environment, the accuracy of motion capture deteriorates dramatically
* In the case of the video sensor and Computer Vision method, a person's motion can easily be misinterpreted due to occlusion, lighting anomalies, and so forth. Also, this method invariably requires lengthy configuration before it can be used effectively
* With respect to all of the above methods, if sensors - whether they be low bit-rate (e.g. accelerometers, gyroscopes) or high bit-rate (e.g. video cameras) - require wired links, the environment where motion is captured will be physically limited to the length of those wires. In other words, mobility of the capture method is compromised
* Regardless of the wired or wireless nature of the sensors used in the above methods, the requirement for battery and/or mains power supplies limits the operational lifetime and mobility of the motion capture method
* With respect to the body suit and glove methods, their uncomfortable and obtrusive nature limits their practical use in normal, everyday life - for example, at home and in the workplace.
With respect to uniquely identifying a human, a well-known method relates to the use of Radio Frequency IDentification (RFID) technologies - similar to those employed in many modern access control systems. In this scenario, an environmental space - such as a building - will require controlled access to various locations, such as rooms and offices. To be granted access to particular locations, a person is required to wear an RFID chip (i.e. a transponder) - normally embedded within a form factor such as a smartcard, badge, or a key fob. Transponders can be passive or active. A passive transponder requires no local power supply to operate, whereas an active transponder requires an integrated battery to power its components. Due to size and other considerations, passive transponders are the most widely used form of identification. Regardless of whether a transponder is passive or active, it stores a globally unique, factory-generated identification number typically of the order 2. Also, a transponder will be capable of storing data within internal, non-volatile memory - say, up to 1024 bytes (1KB) in size.
Located throughout the building environment are a multitude of transponder reader units, each attached to an antenna. Normally, each reader unit/antenna combination will be placed close to a point of access - such as a doorway. Assuming a system that utilises passive transponders only, each antenna will generate an electromagnetic field. When a person enters the proximity of a doorway, and therefore the person's transponder enters the antenna's electromagnetic field, the transponder induces current from the field - which in turn provides sufficient power for the transponder to operate. Once powered, the transponder will wirelessly transmit its unique identification number - or any data stored in its internal storage memory - to the reader unit. Upon receiving the transponder's identification number, or memory data, the reader unit forwards this information to the host computer for processing via software. Once received, the software will authenticate the person and - if necessary - release a door locking mechanism so that the person can proceed to the next location. Since communication between the transponder and the reader unit is bi-directional, it is possible for the computer to store data back within the transponder's internal memory. In this situation, when the transponder is induced by an antenna's electromagnetic field, the computer identifies the person and sends appropriate data back to the reader unit for transmission and storage in the person's transponder. As a result, data can 'follow' the person from location to location.
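The host-side authentication step described above can be reduced to a lookup against an access table. The sketch below is purely illustrative: the tag ID values, table, and function names are assumptions, not part of the patent.

```python
# Sketch of the host software's authentication decision: look up the
# transponder's unique ID and decide whether to release the door lock.
# The ID values and the access table are illustrative only.

ACCESS = {"0xA1B2C3": {"lobby", "office"}}  # tag ID -> permitted doors

def authenticate(tag_id, door):
    """True if this transponder ID grants access through this door."""
    return door in ACCESS.get(tag_id, set())

ok = authenticate("0xA1B2C3", "office")      # known tag, permitted door
denied = authenticate("0xDEAD00", "office")  # unknown tag is refused
```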
SUMMARY OF THE INVENTION
An objective of this invention is to provide a method to accurately capture human (i.e. user) motion for specific parts of the body - such as the hands - in a wide range of operating environments. In order to do this, the invention intends to overcome various flexibility and usability issues associated with existing motion capture methods, as listed in the Related Art (above). The choice of motion sensors will ensure that the invention works in disparate operating environments (e.g. at home or in the workplace), and will help overcome operational issues such as occlusion and lighting. The invention will not require complicated calibration, nor will its operating environment require continual manual configuration to maintain optimal conditions. The invention will utilise wireless communications in order to establish and participate in ad-hoc networks, thus enabling the invention to send captured motion metrics - regardless of location - to a host computer for processing. The invention will be extremely energy efficient, requiring only a low-voltage battery to power its components. As a result, continual operation of the invention can be expected without the need for maintenance - for example, battery replacements.
Another objective of this invention is to provide a method to uniquely identify the user. To do this, the invention will utilise Radio Frequency IDentification (RFID) techniques, as discussed in the Related Art (above).
A further objective of this invention is to provide a method that enables the user to wear the invention comfortably and discreetly on his/her person. An application for the invention is to capture hand motion. In this case, the user would be able to wear the invention close to his/her hand (on or around the wrist) without any discomfort or hindrance.
To achieve the objectives stated, the invention will comprise an electronics aspect, and a wearable aspect.
With respect to the electronics aspect of the invention, very low-power Micro Electrical Mechanical Systems (MEMS) inertial sensors - such as Accelerometers and Gyroscopes - will accurately measure motion in multiple directions.
With respect to the electronics aspect of the invention, a very low-power Microcontroller Unit (MCU) will sample and filter motion measurements generated by the MEMS inertial sensors. The MCU will then record (i.e. store) the samples - as digital motion metrics data - in non-volatile memory, in preparation for wireless transmission.
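The sample-filter-store loop described above can be sketched as follows. The patent does not specify a particular filtering algorithm; a simple moving average, the window size, and the sample values are all assumptions standing in for whatever filter the MCU would run.

```python
# Sketch of the MCU's sample-filter-store loop. The moving-average
# filter, window size, and sample values are assumptions: the patent
# does not name a specific filtering algorithm.
from collections import deque

def capture(readings, window_size=4):
    """Smooth raw sensor readings; return the motion metrics to store."""
    window = deque(maxlen=window_size)
    metrics = []
    for r in readings:
        window.append(r)
        metrics.append(sum(window) / len(window))  # moving average
    return metrics

raw = [10, 12, 11, 30, 12, 11]  # raw samples; the 30 is a noise spike
flash_log = capture(raw)        # smoothed metrics, ready for storage
```

Smoothing before storage reduces both the effect of sensor noise and the volume of data the transceiver later has to send.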
With respect to the electronics aspect of the invention, a very low-power wireless communications transceiver (TX/RX) will provide the capability to establish and participate in an ad-hoc network - with other peers - in order to send motion metrics data to a host computer for processing.
With respect to the electronics aspect of the invention, a very low-voltage battery will provide the power needed to operate the MCU, MEMS inertial sensors, non-volatile storage memory, and TX/RX. This battery will be capable of providing enough power to operate the invention in its entirety for weeks, months, perhaps even years, with minimal human intervention or maintenance required.
With respect to the electronics aspect of the invention, the MCU, MEMS inertial sensors, and TX/RX may implement a 'sleep mode' capability. Therefore, when the invention is deemed not in use, the MCU will place itself - and other components - into this mode to conserve battery power, thus further enhancing the invention's operational lifetime.
With respect to the electronics aspect of the invention, a Power Manager may be incorporated. The Power Manager will control any battery-recharging operations by harvesting environmental stimuli (such as natural light or vibration) in order to create enough electrical voltage to drive the recharging process.
With respect to the electronics aspect of the invention, a Radio Frequency IDentification (RFID) chip (i.e. transponder) will uniquely identify the user. This transponder will be passive - i.e. will not require a local power supply to function - and will store a globally unique, factory-generated identification number. The transponder will also include internal storage memory, capable of storing - for example - 1024 bytes (1KB) or more of data.
With respect to the electronics aspect of the invention, a software program can be sent from a host computer, via a wireless, ad-hoc network, for storage in non-volatile memory. Once received, the program will be loaded and executed by the MCU. Programs can be sent and uploaded at any time, thus enabling the invention to dynamically change its behaviour to react to changes in the operating environment.
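The over-the-air reprogramming behaviour described above can be sketched as a store-then-load sequence. This is purely illustrative: the dict stands in for non-volatile memory, Python's `exec` stands in for the MCU's firmware loader, and all names are assumptions.

```python
# Illustrative sketch of over-the-air reprogramming: a program received
# via the network is stored in non-volatile memory and then loaded for
# execution. The dict stands in for flash storage and exec stands in
# for the MCU's program loader; all names are assumptions.

flash = {}  # stand-in for non-volatile storage memory

def receive_program(name, source):
    """Store a program sent by the host computer over the network."""
    flash[name] = source

def run_program(name):
    """Load a stored program, execute it, and return its result."""
    env = {}
    exec(flash[name], env)
    return env.get("result")

receive_program("gesture_v2", "result = 2 + 2")  # new behaviour arrives
value = run_program("gesture_v2")                # device switches to it
```

Because programs can arrive at any time, the device's behaviour can be updated in the field without physical access to it.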
With respect to the wearable aspect of the invention, a tough casing - typically made from a plastic material - will be used to protect the electronics aspect of the invention from breakage and environmental interference.
With respect to the wearable aspect of the invention, the casing will be attached to a fitting mechanism, for example a clasp, a bracelet, a strap, or a press-stud fastener, to enable the invention to be worn comfortably and discreetly by the user.
BRIEF DESCRIPTION OF THE DRAWINGS
There now follows a brief description of the drawings:
* Figure 1 is a block diagram illustrating the preferred embodiment of the electronics aspect of the invention
* Figure 2 is a block diagram illustrating another embodiment of the electronics aspect of the invention. This embodiment adds a Power Manager and a Solar Cell, and replaces the battery shown in Figure 1 with a rechargeable alternative
* Figure 3a illustrates the preferred embodiment of the wearable aspect of the invention - i.e. an electronics device designed to be attached to the cuffs of a user's shirt for the purpose of capturing hand motion
* Figure 3b is a diagram illustrating the preferred embodiment for the wearable aspect of the invention
* Figure 3c is a diagram illustrating another embodiment for the wearable aspect of the invention. Slight changes are made to support a power manager and rechargeable battery
* Figure 3d is a diagram illustrating yet another embodiment for the wearable aspect of the invention. This is virtually the same as the embodiment illustrated in 3c, except that an alternative fitting mechanism (i.e. a press-stud fastener) is utilised instead
* Figure 4a illustrates a simple mesh network
* Figure 4b illustrates how the mesh network adapts when a new peer joins the network
* Figure 4c illustrates how the mesh network adapts when existing peers leave the network
* Figure 4d illustrates how the mesh network adapts when a peer rejoins the network
* Figure 4e illustrates an example path for a peer's data as it traverses the mesh network
* Figure 5 illustrates how the invention achieves a wireless mesh network with a host computer
* Figure 6 illustrates how the invention achieves a wireless mesh network with peers and a host computer
* Figure 7 illustrates an example RFID environment - i.e. a building comprising a number of rooms interconnected by doorways. Each doorway is fitted with an RFID chip (i.e. transponder) reader unit/antenna combination
* Figure 8 illustrates how the invention wirelessly transmits its unique identification number, via an RFID reader unit/antenna network, to a host computer
DETAILED DESCRIPTION OF THE INVENTION
Preferred Embodiment
This description relates to a preferred embodiment of the invention. The invention comprises two aspects: [i] an electronics aspect, and [ii] a wearable aspect.
Figure 1 illustrates a preferred embodiment of the electronics aspect of the invention. This is delivered in the form of a self-contained device 10. The device 10 comprises a Microcontroller Unit (MCU) 13, an Accelerometer Sensor (AS) 17, a Gyroscope Sensor (GS) 18, non-volatile storage memory (Flash) 19, a wireless transceiver (TX/RX) 20, a battery 22, and a Radio Frequency IDentification (RFID) chip 24.
The MCU 13 controls the operation and functional behaviour of the device 10. It comprises a Central Processing Unit (CPU) 14, internal storage memory (RAM, ROM, EEPROM, Flash) 15, and an Analog to Digital Converter (ADC) 16. The MCU 13 has very low-power requirements, and may be capable of suspending its operation - when not in use - to enter a 'sleep mode' in order to conserve power.
The CPU 14 is the 'brains' of the device 10. It executes programs comprising a series of instructions - stored in internal memory 15. These programs interact with components - such as the ADC 16 and the TX/RX 20 - in order to collect and process data, to store results in internal memory 15 or non-volatile memory 19, and to communicate wirelessly.
The ADC 16 provides a method to convert continuous analog signals (typically voltages) into digital data samples. In the preferred embodiment, the ADC is used to connect sensors AS 17 and GS 18, and convert their voltage outputs into digital data that can be processed and stored by programs executed by the CPU 14.
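The conversion the ADC 16 performs can be sketched as quantising a bounded voltage into an n-bit code. The 3.3 V reference and 10-bit resolution below are assumed values for illustration; the patent does not specify them.

```python
# Sketch of analog-to-digital conversion: a sensor voltage in
# [0, vref] is quantised to an n-bit code. The 3.3 V reference and
# 10-bit resolution are assumed values, not taken from the patent.

def adc_convert(voltage, vref=3.3, bits=10):
    """Quantise a voltage into a digital code of the given width."""
    full_scale = (1 << bits) - 1
    code = int(voltage / vref * full_scale)
    return max(0, min(full_scale, code))  # clamp out-of-range inputs

sample = adc_convert(1.65)  # a mid-scale sensor voltage
```

The resolution (number of bits) and sampling rate together bound how finely, and how often, the CPU's programs can observe the sensors' analog outputs.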
The AS 17 is a very low-power component that senses and measures acceleration in three axial directions (X, Y, and Z). The AS 17 outputs motion measurements as a voltage stream that is connected to the ADC 16. Programs executed by the CPU 14 interact with the ADC 16 to sample - at regular intervals - the voltages. These samples are returned from the ADC 16 as digital data that can then be processed and stored by the programs. The AS 17 will typically come in the form of a Micro Electrical Mechanical Systems (MEMS) inertial sensor, which may be capable of entering a 'sleep mode' in order to conserve power when not in operation.
The GS 18 is a very low-power component that senses and measures angular rates of rotation in one, two, or all three axial directions. The GS 18 outputs motion measurements as a voltage stream that is connected to the ADC 16. Programs executed by the CPU 14 interact with the ADC 16 to sample - at regular intervals - the voltages. These samples are returned from the ADC 16 as digital data that can then be processed and stored by the programs. The GS 18 will typically come in the form of a Micro Electrical Mechanical Systems (MEMS) inertial sensor, which may be capable of entering a 'sleep mode' in order to conserve power when not in operation.
The MCU 13 has its own internal memory 15 for the purpose of storing programs and data.
However, this memory is limited in size. Therefore, the device 10 provides for larger-capacity non-volatile storage memory, implemented - for example - as a Flash component 19. This enables programs to store greater quantities of data, and to retain that data should power - in this case provided by a battery 22 - fail or be disconnected.
The TX/RX 20 provides a method for the device 10 to communicate with the 'outside world' without the need for wires. The TX/RX 20 may implement wireless communications in a number of ways, for example via radio frequency or optical methods. In the preferred embodiment for the invention, though, radio frequency communications is the method of choice. The TX/RX 20 is a very low-power component, which may be capable of entering a 'sleep mode' in order to conserve power usage when not in operation. The TX/RX 20 implements a protocol stack, suited to very low-power wireless communications, to establish and maintain a secure network with peers (i.e. other devices connected to the same network). An example protocol stack, highly suited to this type of networking, is provided as a technology known as ZigBee. The MCU 13 uses the TX/RX 20 to transmit and receive motion metrics data (i.e. captured motion), and to receive new software programs for execution by the CPU 14.
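The multi-hop behaviour of an ad-hoc mesh - a packet travelling from the device to the host via whichever peers are currently in range - can be sketched as a path search over a link table. Real ZigBee routing is considerably more involved; a breadth-first search over an assumed topology stands in for it here, and all node names are illustrative.

```python
# Sketch of multi-hop delivery across an ad-hoc mesh: breadth-first
# search over an illustrative link table stands in for ZigBee's actual
# routing. All node names and the topology are assumptions.
from collections import deque

def route(links, src, dst):
    """Return a hop-by-hop path from src to dst, or None if unreachable."""
    seen, queue = {src}, deque([[src]])
    while queue:
        path = queue.popleft()
        if path[-1] == dst:
            return path
        for peer in links.get(path[-1], ()):
            if peer not in seen:
                seen.add(peer)
                queue.append(path + [peer])
    return None

links = {  # who can currently hear whom (assumed topology)
    "device": ["peer1"],
    "peer1": ["peer2", "host"],
    "peer2": ["host"],
}
path = route(links, "device", "host")
```

When a peer leaves the network, the corresponding links disappear and the next search simply finds a different path - which is the adaptive behaviour Figures 4a to 4e illustrate.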
To connect and interact with components - such as the AS 17, GS 18, Flash memory 19, and the TX/RX 20 - the MCU 13 utilises a data bus 21, which enables control instructions and data to be passed around efficiently.
To provide power to the MCU 13, AS 17, GS 18, Flash memory 19, and the TX/RX 20, a battery 22 is used. This provides a low-voltage power supply 23 to the components. To prolong the lifetime of the battery 22, some components may support 'sleep mode' functionality. In this case, when the device 10 is deemed not in use, the MCU 13 will place itself - and any other applicable components - into this mode to conserve battery 22 power. Then, when the device 10 is deemed to be back in use, the MCU 13 will - along with other components - leave 'sleep mode' and return to normal powered operation.
In order to facilitate unique identification, the device 10 incorporates a Radio Frequency IDentification (RFID) chip 24 - i.e. a transponder. The transponder 24 is passive, not active, which means that it has no integrated power supply (for example, a battery). The transponder 24 contains a globally unique, factory-generated identification number, a small amount of non-volatile internal memory, and an antenna. When the transponder 24 passes within the proximity of an RFID reader unit (not shown), it is momentarily powered to transmit its unique identification number - and its internal storage memory data - to the reader unit. Also while powered, the transponder 24 is capable of receiving data from the reader unit for storage within its internal memory. Please refer to the Related Art for more information.
In another embodiment - shown in Figure 2 - the device 11 incorporates device 10 (see Figure 1), but replaces the battery 22 with a rechargeable battery 25 instead.
In order to ensure that the battery 25 is operating at optimal charge as often as possible, a Power Manager (PM) 26 component is incorporated. The PM 26 regularly checks the status of the battery's 25 charge. Should the PM 26 decide that the battery 25 needs recharging, it will utilise an electrical voltage - generated by harvesting some form of external stimulus - to drive the recharging process. In the preferred embodiment, the external stimulus is provided in the form of a natural light source. In another embodiment, however, the external stimulus may be present in the form of - say - vibration. In yet another embodiment, the external stimulus could be provided by motion - harvested in a similar way as to that implemented by battery-less wristwatches. In the context of device 11, however, the external stimulus is provided by a natural light source. A Solar Cell (SC) 12 is incorporated into the device 11, comprising a set of photovoltaic modules (not shown) that convert natural light - for example sunlight - into the electrical voltage 27 required by the PM 26 to recharge the battery 25. Apart from managing battery recharging issues, the PM 26 interacts with the MCU 13 via the data bus 21. This enables the MCU 13 and PM 26 to co-ordinate power related operations, and for the MCU 13 to finely tune the control of the 'sleep mode' based on the current battery 25 charge levels.
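The PM 26's recharge cycle reduces to a threshold check: compare the battery's charge against a low-water mark and apply harvested energy when it falls below. The threshold, charge scale, and harvest amount below are assumptions for illustration only.

```python
# Sketch of the Power Manager's recharge decision: check the battery
# charge against a threshold and apply harvested energy when it is
# low. The threshold and harvest amounts are assumed for illustration.

def manage(charge, harvested, low=0.30, full=1.00):
    """Return the battery charge level after one management cycle."""
    if charge < low:  # battery needs recharging
        charge = min(full, charge + harvested)
    return charge

level = manage(0.25, harvested=0.10)  # below threshold: harvest applied
```

The same charge reading could also feed the 'sleep mode' tuning mentioned above, since the MCU 13 and PM 26 share the data bus 21.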
Another aspect of the invention relates to wearability - i.e. the wearable aspect. A preferred application for the invention relates to the capture (i.e. sensing, recording, and transmission) of human hand motion. A host computer, running appropriate software, can then create 3D motion models for the purposes of recognising gestures - such as hand waving. In view of this, the electronics aspect of the invention (embodied as devices 10 or 11) will need to be worn comfortably and discreetly on a person's (i.e. user's) hand or around the wrist.
Figure 3a illustrates the preferred wearable embodiment in a form similar in design to a cufflink 28/29, allowing the invention to be worn on the cuffs of a shirt 30. In another embodiment, a wearable form may be similar in design to a wristwatch, or a bracelet, allowing the invention to be worn comfortably around the wrists instead.
The preferred embodiment of the cufflink design is illustrated in Figure 3b. The cufflink 28 comprises a casing 31 with an affixed clasp 33. The device 10 is embedded within the casing 31.
The clasp 33 enables the casing 31 to be securely attached and worn on a user's clothing, namely on the cuff 30 of a shirt - near the wrist area. It is envisaged that the casing 31 will be made of a tough, protective material - excluding metal. In the case of a metal casing, a level of interference with the transponder 24 may occur due to a Faraday cage effect. Therefore, a plastic material is preferred for the casing 31 instead.
Another embodiment of the cufflink design is shown in Figure 3c. The cufflink 29 comprises a casing 32 with an affixed clasp 33. The device 11 is embedded within the casing 32. The surface of the casing 32 is designed such that a Solar Cell 12 (a constituent of device 11) is exposed to a natural light source - for example sunlight. The clasp 33 enables the casing 32 to be securely attached and worn on a user's clothing, namely on the cuff 30 of a shirt - near the wrist area.
Again, it is envisaged that the casing 32 will be made of a tough, protective material - such as plastic - but excluding metal.
The cufflink embodiments illustrated in Figure 3b and Figure 3c utilise a clasp 33 to attach the casing 31/32 onto a user's clothing. Another method of attachment to consider is that of a press-stud fastener, already used extensively in applications such as luggage and clothing. In this embodiment, illustrated in Figure 3d, a small stud 34 would be woven into the user's clothing - near the wrist area - allowing the casing to be pressed in and securely attached.
Both devices 10 and 11 - shown in Figure 1 and Figure 2 respectively - carry out a number of important functional tasks that define the operational behaviour of the invention. These tasks include: [i] the creation of a wireless, ad-hoc network to transmit and receive data; [ii] to accurately sense human (i.e. user) motion, and transmit motion metrics data to a host computer for software analysis; [iii] to receive motion metrics data from a network peer and forward that data to another peer in the network; [iv] to receive software programs from a host computer; and [v] to uniquely identify the user. Each of these functions is now described.
[The Creation of a Wireless, Ad-Hoc Network to Transmit And Receive Data] A fundamental functional aspect of the invention is to wirelessly communicate human motion to a host computer for modelling and gesture recognition via software.
In order for the invention to communicate motion to the host computer - as X, Y, and Z metrics data for example - the invention must be capable of establishing and participating in a wireless network with the computer. And, as previously stated, the preferred method for achieving wireless communications is via radio frequency.
Due to the size and very low-power constraints of the invention's electronics aspect (embodied as devices 10 or 11), and the requirement for the electronics aspect to draw power from a battery, communications via radio frequency will be limited to short ranges and low bit-rate data transmissions. Furthermore, due to the invention's wearable aspect, the electronics aspect will be physically mobile during most of its operation. As a result, the invention will need to constantly communicate reliably and seamlessly in a multitude of orientations and different locations.
In order to handle the requirement for low-power, short-range communications, and cater for the mobile nature of the invention, a flexible and robust network topology is needed. As a result, it is proposed that the network is designed to operate as a wireless, ad-hoc, mesh network. The principles of this mesh network are now described.
Figure 4a illustrates a simple mesh network. The network comprises two or more nodes 35, connected together via wireless radio links 36. A node 35 represents any device capable of some form of program execution, data processing, and wireless communications. Each node 35 connects to one or more of its closest peer nodes in the network. For example, node A connects to peer nodes B and C. In the case of node C, it connects to its closest peer nodes - namely nodes A, B, D, E, and F. Apart from maintaining active network connections with peer nodes, each node will periodically attempt to discover and connect to new peer nodes. Figure 4b illustrates what happens when a new node H joins the network. Nodes D, E, and G maintain network connections with each other and other peer nodes. However, they also attempt to discover new nodes close to them. When node H is placed close enough in proximity to the network, nodes D, E, and G discover node H and node H discovers nodes D, E, and G in return. As a result, nodes D and H, E and H, and G and H negotiate network connections instantly - and node H joins the network.
Figure 4c illustrates a scenario where nodes are removed from the network. This could be due to the re-location of nodes, or even the technical failure of nodes. In this situation, the network must be flexible and robust, and automatically re-configure its topology to maintain integrity and facilitate continuous, uninterrupted operation of all other nodes. In other words, the network must be 'self-healing'. In the illustration, nodes B and E are no longer present. However, instead of the network simply failing due to broken communication links, nodes in the network automatically re-configure their connections so that the network self-heals and continues operation. In the case where node B is removed, nodes A, C, and D automatically re-configure connections between themselves to compensate for node B's removal. In the case of node E, nodes C, D, F, G, and H all re-configure their connections automatically. Notice how nodes C and G establish a new connection between themselves to compensate for the removal of node E. Figure 4d illustrates what happens if node B is later re-introduced to the network - perhaps after repair or re-location. Nodes A, C, and D immediately re-discover node B, and node B re-discovers A, C, and D in return. Consequently, network connections between these peers are re-established once again.
The purpose of any network is to enable nodes to connect together and communicate - i.e. to transmit and receive data between each other. In a mesh network, nodes behave like a 'bucket brigade' - but instead of transporting water, they transport data. When a node wishes to send data through the network, peer nodes forward the data - one by one - until it reaches its destination. In Figure 4e, node A wants to send/receive data to/from node H. An example path 37 that the data takes is illustrated. Notice that node A first sends the data to one of its closest peers - node C. Node C then forwards the data to one of its closest peers - node E. Node E then forwards the data to node H - the destination node. The same path 37 can be used by node H to send data back to node A.

In order to establish and participate in a wireless, ad-hoc, mesh network, the MCU 13 and TX/RX components of the invention's electronics aspect (see Figure 1 and Figure 2) must implement a suitable wireless communications protocol stack. The preferred embodiment of this protocol stack is provided by the ZigBee Alliance, and is known simply as the ZigBee protocol.
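The mesh principles described above - peer discovery on joining, 'self-healing' on node removal, and 'bucket brigade' forwarding - can be sketched in a few lines. This is an illustrative model only, not the ZigBee protocol stack; the node names and the breadth-first route search are assumptions made for the example.

```python
from collections import deque

# Illustrative model of the ad-hoc mesh behaviour: nodes join by
# connecting to nearby peers, the mesh heals when a node is removed,
# and data is forwarded hop by hop from source to destination.

class Mesh:
    def __init__(self):
        self.links = {}  # node -> set of connected peer nodes

    def join(self, node, nearby_peers):
        """A new node discovers and connects to its closest peers."""
        self.links.setdefault(node, set())
        for peer in nearby_peers:
            if peer in self.links:
                self.links[node].add(peer)
                self.links[peer].add(node)

    def remove(self, node):
        """Node fails or is relocated; peers drop their links to it."""
        for peer in self.links.pop(node, set()):
            self.links[peer].discard(node)

    def route(self, src, dst):
        """Breadth-first 'bucket brigade' path from src to dst."""
        frontier, seen = deque([[src]]), {src}
        while frontier:
            path = frontier.popleft()
            if path[-1] == dst:
                return path
            for peer in sorted(self.links.get(path[-1], ())):
                if peer not in seen:
                    seen.add(peer)
                    frontier.append(path + [peer])
        return None  # no route: the mesh is partitioned
```

Removing a node and observing that traffic still reaches its destination via another peer mirrors the self-healing behaviour of Figure 4c.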
[Accurately Sense Human (i.e. User) Motion And Transmit Motion Metrics Data to a Host Computer For Software Analysis] Having successfully established a mesh network with one or more peer nodes (i.e. devices), the invention can then sense and wirelessly transmit motion metrics data through the network to its primary destination - i.e. a host computer running modelling and analysis software.
Figure 5 illustrates an example scenario. A user is wearing the invention - embodied as a cufflink 28/29 - on his shirt cuff. A wireless (radio) network connection 38 has been established with a single peer device 39, and device 39 has a wired connection 40 to the host computer 41. The wired connection 40 can be implemented in a number of ways, for example as an Ethernet or a serial RS232/RS422/RS485 link.
When the user moves his hand, the AS 17 and GS 18 sensors (see Figure 1 and Figure 2) measure acceleration and angular rates of rotation respectively. The MCU 13 samples the measurements from both sensors, stores the samples as motion metrics data in Flash memory 19, and instructs the TX/RX 20 to wirelessly transmit the metrics data (via connection 38) to device 39.
Upon receiving the metrics data, device 39 forwards the data (via connection 40) to the host computer 41 for processing via software.
Device 39 represents a special node in the mesh network. Its role is to act as a network controller.
All data transmitted by peer devices in the network is received and buffered by device 39, and sent - via the wired connection 40 - to the host computer 41.
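The capture path just described - sample the AS 17 and GS 18, buffer the samples as motion metrics data, and hand frames to the TX/RX 20 - can be sketched as follows. The 24-byte frame layout (six little-endian floats) is an assumption made for illustration; this specification does not define a wire format.

```python
import struct

# Hypothetical sketch of the capture path: the MCU 13 samples the
# AS 17 (acceleration) and GS 18 (angular rates), buffers each frame
# (standing in for Flash memory 19), then hands it to the TX/RX 20.

def pack_sample(accel_xyz, gyro_xyz):
    """Pack one sample as six little-endian floats (24 bytes)."""
    return struct.pack('<6f', *accel_xyz, *gyro_xyz)

class CaptureDevice:
    def __init__(self, transmit):
        self.flash = []           # buffered motion metrics data
        self.transmit = transmit  # TX/RX 20 send function

    def sample(self, accel_xyz, gyro_xyz):
        frame = pack_sample(accel_xyz, gyro_xyz)
        self.flash.append(frame)  # store, then forward to a peer
        self.transmit(frame)
```

A small fixed frame suits the low bit-rate, short-range links noted earlier, since each transmission stays well within a single radio packet.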
[Receive Motion Metrics Data From a Network Peer And Forward That Data to Another Peer In The Network] Apart from capturing user motion and transmitting motion metrics data across the network, the invention also acts as a receiver - so that data transmitted by peer devices can be routed through the network. Consider Figure 6.
This network comprises an ad-hoc mesh of wirelessly connected peer devices. Device X 42 is an adaptation of device 10 (see Figure 1) - the preferred embodiment of the electronics aspect of the invention. It does not contain AS 17 or GS 18 sensors, nor is it designed to be wearable by a user.
Instead, it is designed to be stored in a small housing (not shown), preferably made of a durable plastic material, which can be easily attached to a fixed entity - for example, a wall or a ceiling.
Similarly, device Y 43 is an adaptation of device 11 (see Figure 2) - another embodiment of the electronics aspect of the invention. It also contains no AS 17 or GS 18 sensors, nor is it designed to be wearable by a user. Instead, it is designed to be stored in a small housing (not shown), preferably made of a durable plastic material, which can be attached to a fixed entity - for example, a wall or a ceiling.
The mesh network also comprises multiple wearable instances 28/29 of the invention, a network controller device 39, and a host computer 41.
When a user moves her hand 44, the AS 17 and GS 18 sensors (see Figure 1 and Figure 2) measure acceleration and angular rates of rotation respectively. The MCU 13 samples the measurements from both sensors and instructs the TX/RX 20 to wirelessly transmit the motion measurements as metrics data (via connection 38) to a suitable peer device. Peer devices then communicate with each other to propagate the metrics data through the network until it reaches the network controller device 39 (and then the host computer 41). An example data path 46 is illustrated in Figure 6. In this path, peer device Y receives metrics data from the invention attached to the wrist of the user's hand 44. Device Y then transmits the metrics data to peer device X in the path 46. Device X receives the metrics data from peer device Y and transmits it to the network controller device 39. Finally, the network controller device 39 receives and forwards the metrics data - via connection 40 - to the computer 41 for processing.
In the network depicted in Figure 6, also notice the presence of another instance of the invention - worn on the wrist of another user's hand 45. It is quite conceivable that motion metrics data generated from the user's hand 44 could be forwarded through the network via the invention attached to the other user's hand 45 instead - as shown by path 47.
[Receive Software Programs From a Host Computer] The primary role of the host computer 41 is to receive and process motion metrics data captured by the invention. In a preferred embodiment, the invention is used to capture human hand motion.
Therefore, the computer's 41 software processes and analyses motion metrics data to recognise particular hand gestures.
However, in some circumstances, it may be necessary to dynamically change the operational characteristics or parameters of the invention. Or, it may be necessary to re-purpose the invention entirely to perform different motion capture tasks. As a result, the invention's programmed functionality will need to be updated. In this situation, the host computer will initiate communications with individual, or all, devices within the network in order to send software program updates.
For the invention to receive software program updates from the host computer 41, programs are sent from the computer 41 - via the network controller device 39 - as small chunks of data. These chunks propagate through the network - via peer devices, such as X and Y - until they reach their destination(s). The invention will receive the data chunks - via the TX/RX 20 component of its electronics aspect - and store them in non-volatile memory - i.e. Flash memory 19. Once all chunks are received, validated, and re-assembled as the original program, the program will then be loaded into internal storage memory 15 and executed by the MCU 13.
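The chunked update scheme described above can be sketched minimally, under stated assumptions: the 16-byte chunk size and the SHA-256 checksum used for validation are illustrative choices, not part of this specification.

```python
import hashlib

# Illustrative sketch of the update path: a program is split into
# small chunks, propagated through the mesh, then validated and
# reassembled on the device before loading and execution.

CHUNK_SIZE = 16  # assumed small chunk size for low bit-rate links

def split_program(program: bytes):
    """Host side: split the program and compute its checksum."""
    chunks = [program[i:i + CHUNK_SIZE]
              for i in range(0, len(program), CHUNK_SIZE)]
    return chunks, hashlib.sha256(program).hexdigest()

def reassemble(chunks, checksum):
    """Device side: reassemble and validate before execution."""
    program = b''.join(chunks)
    if hashlib.sha256(program).hexdigest() != checksum:
        raise ValueError('program update failed validation')
    return program
```

Validating the whole image before loading it, rather than trusting individual chunks, is what allows a device to survive a partial or corrupted transfer over the mesh.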
[To Uniquely Identify The User] To uniquely identify the user of the invention, the invention's electronics aspect employs a Radio Frequency IDentification (RFID) chip 24 - i.e. a transponder.
Figure 7 illustrates a typical environment where RFID identification would be installed - i.e. a building. The building 48 comprises a number of locations (e.g. rooms) interconnected by doorways 49. Located at each doorway 49 is a transponder reader unit 50, connected to an antenna 51.
When the user of the invention approaches a doorway 49, the transponder 24 enters the proximity of its reader unit 50 and antenna 51. The electromagnetic field generated by the antenna 51 causes a process of induction to occur within the transponder. As a result, a small electrical voltage is generated which provides enough power to enable the transponder 24 to wirelessly transmit its unique identification number to the reader unit 50.
Figure 8 illustrates a preferred RFID environment for the invention to operate in. The implementation comprises a network of reader units 50 attached to antennae 51. Each reader unit is connected - via a wired or wireless link 52 - to a network controller device 53. Examples of wired links are Ethernet and serial RS232/RS422/RS485. In the case of Ethernet, the network controller device 53 would represent a hub or a switch. In the case of serial RS232/RS422/RS485, the network controller device 53 would represent a serial port hub, or a multi-port serial expansion card installed within the host computer 41. Depending on the nature of the network controller device 53, the connection 54 between it and the computer 41 may be wired or wireless. In the case of an Ethernet hub or switch, a built-in Wireless Access Point (WAP) would transmit data to/from the computer 41 via standards such as 802.11g. In the case of a serial port hub, a wired serial RS232/RS422/RS485 connection 54 would probably be favoured. In the case of a multi-port serial expansion card - installed within the computer 41 - the connection 54 between it and the computer 41 would be achieved directly through the computer's 41 internal data bus (e.g. PCI bus).
The implementation shown in Figure 8 also illustrates the circumstances required for the transponder 24 to communicate with the reader unit network. A reader unit 50 is connected to an antenna 51 - usually constructed from copper wiring. The antenna 51 generates an electromagnetic field 55. At the same time, the reader unit's 50 wireless transceiver (not shown) begins to interrogate the radio waves to determine the proximity of the transponder 24. When the user of the invention moves her hand such that the transponder 24 enters the electromagnetic field 55, the transponder 24 induces current from the electromagnetic field 55. This current is then used to power the transponder 24 sufficiently in order to: [i] transmit its unique identification number to the reader unit 50, or [ii] transmit data from its internal storage memory to the reader unit 50, or [iii] receive data from the reader unit 50 for storage in its internal memory. The transmitting and receiving of data between the transponder 24 and the reader unit 50 is performed wirelessly 56, via a designated radio frequency - for example, 13.56 MHz.
Once the reader unit 50 receives data from the transponder 24, it sends the data to the network controller device 53, which in turn forwards the data to the host computer 41. By associating each reader unit 50 with a specific location (room) within the environment (building) 48, it is possible for the computer's 41 software to identify the invention's user and track his/her movements.
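The host-side tracking described above - associating each reader unit 50 with a room in the building 48 and logging each transponder read - can be sketched as follows. The reader and room identifiers, and the transponder ID format, are hypothetical.

```python
# Hypothetical sketch of the host computer 41 software: each reader
# unit 50 is mapped to a room, and every read event (a transponder ID
# seen at a reader) extends that user's movement trail.

class Tracker:
    def __init__(self, reader_rooms):
        self.reader_rooms = reader_rooms  # reader id -> room name
        self.trails = {}                  # transponder id -> rooms visited

    def on_read(self, reader_id, transponder_id):
        """Called when a reader forwards a transponder ID to the host."""
        room = self.reader_rooms[reader_id]
        self.trails.setdefault(transponder_id, []).append(room)

    def last_seen(self, transponder_id):
        """Most recent room in which this transponder was read."""
        trail = self.trails.get(transponder_id)
        return trail[-1] if trail else None
```

Because each doorway 49 hosts its own reader unit 50, the ordered trail of rooms is sufficient to reconstruct the user's movements through the building.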
This Description merely illustrates the principles of the invention and a specific (i.e. preferred) embodiment of it. It should, therefore, be appreciated that those skilled in the Related Art will be able to devise various other embodiments that - although not explicitly described or shown herein - embody the principles of this invention and are, therefore, within its scope.

Claims (30)

  1. A motion capture and identification device (hereinafter referred to as a device) that includes an electronics aspect comprising: a method to capture the motion of the user of the device, as digital motion metrics data; a method to participate in a wireless, ad-hoc, mesh network (hereinafter referred to as a network); a method to transmit motion metrics data - via a network of peer devices - to a host computer for processing via software; a method to receive software programs sent - via a network - from a host computer, and to change the device's operation by executing that program; and a method to uniquely identify the user of the device.
  2. The electronics aspect, according to Claim 1, comprises the following components: a Microcontroller Unit (MCU); a number of Micro Electrical Mechanical Systems (MEMS) inertial sensors; a wireless communications transceiver (TX/RX); non-volatile storage memory; a Radio Frequency IDentification (RFID) chip; a battery; and a data bus.
  3. The electronics aspect, according to Claim 2, wherein the MCU comprises a Central Processing Unit (CPU), internal storage memory, and an Analog to Digital Converter (ADC).
  4. The MCU, according to Claim 3, has very low power requirements, has the capability to enter a 'sleep mode' which consumes far less power than normally required, and has the capability to leave 'sleep mode' to return to its normal mode of operation.
  5. The electronics aspect, according to Claim 2, wherein the MEMS inertial sensors comprise an Accelerometer sensor for sensing acceleration in three axial directions (X, Y, and Z), and a Gyroscope sensor for sensing angular rates of rotation in one, two, or all three axial directions.
  6. The MEMS inertial sensors, according to Claim 5, have very low power requirements, have the capability to enter a 'sleep mode' which consumes far less power than normally required, and have the capability to leave 'sleep mode' to return to their normal modes of operation.
  7. The electronics aspect, according to Claim 2, wherein the TX/RX provides a method to join or leave a network, and to transmit and receive data to/from peer devices in the network.
  8. The TX/RX, according to Claim 7, has very low power requirements, has the capability to enter a 'sleep mode' which consumes far less power than normally required, and has the capability to leave 'sleep mode' to return to its normal mode of operation.
  9. The electronics aspect, according to Claim 2, wherein non-volatile storage memory stores data and executable software programs.
  10. The electronics aspect, according to Claim 2, wherein the RFID chip: is passive - not active - and thus has no requirement for an integrated power supply such as a battery; contains a globally unique, factory-generated identification number; has an internal storage memory capability; and has a wireless data communications method that enables it to transmit its unique identification number (and any data stored in its memory) to a reader unit, and to receive data - from the reader unit - for storage in its memory.
  11. The electronics aspect, according to Claim 2, wherein the battery provides a low-voltage power supply to the MCU, MEMS inertial sensors, non-volatile storage memory, and TX/RX.
  12. The electronics aspect, according to Claim 2, wherein the data bus provides a mechanism for data and control instructions to be passed between the MCU, MEMS inertial sensors, non-volatile storage memory, and TX/RX.
  13. A device that includes an electronics aspect comprising: a Microcontroller Unit (MCU); a number of Micro Electrical Mechanical Systems (MEMS) inertial sensors; a wireless communications transceiver (TX/RX); non-volatile storage memory; a Radio Frequency IDentification (RFID) chip; a solar cell (SC); a power manager (PM); a rechargeable battery; and a data bus.
  14. The electronics aspect, according to Claim 13, wherein the MCU is as claimed in Claim 4.
  15. The electronics aspect, according to Claim 13, wherein the MEMS inertial sensors are as claimed in Claim 6.
  16. The electronics aspect, according to Claim 13, wherein the TX/RX is as claimed in Claim 8.
  17. The electronics aspect, according to Claim 13, wherein non-volatile storage memory is as claimed in Claim 9.
  18. The electronics aspect, according to Claim 13, wherein the RFID chip is as claimed in Claim 10.
  19. The electronics aspect, according to Claim 13, wherein the SC comprises an array of photovoltaic modules, and a method for converting natural light - harvested by the photovoltaic modules - into an electrical voltage.
  20. The electronics aspect, according to Claim 13, wherein the PM comprises: a method to measure the charge level of the rechargeable battery; a method to instruct the MCU, MEMS inertial sensors, and TX/RX to enter a 'sleep mode' to conserve power, and to return to their normal modes of operation thereafter; and a method to utilise the electrical voltage generated by the SC to recharge the rechargeable battery.
  21. The electronics aspect, according to Claim 13, wherein the rechargeable battery: delivers a low-voltage power supply to the MCU, MEMS inertial sensors, non-volatile storage memory, PM, and TX/RX; and provides a mechanism for the PM to deliver an electrical voltage to facilitate its recharging.
  22. The electronics aspect, according to Claim 13, wherein the data bus provides a mechanism for data and control instructions to be passed between the MCU, MEMS inertial sensors, non-volatile storage memory, PM, and TX/RX.
  23. A method to capture motion, according to Claim 1, comprises: sensing acceleration and angular rates of rotation by using MEMS Accelerometer and Gyroscope sensors respectively, as claimed in Claim 6; converting sensed acceleration and angular rates of rotation into digital motion metrics data by using the ADC as claimed in Claim 3; and storing motion metrics data in non-volatile storage memory, as claimed in Claim 9.
  24. A method, according to Claim 1, to participate in a network comprises a TX/RX as claimed in Claim 8, installed with a suitable network communications protocol stack.
  25. A method, according to Claim 1, to transmit motion metrics data to a host computer for processing comprises: motion metrics data stored in non-volatile storage memory, as claimed in Claim 23; a TX/RX enabling the device to participate in a network, as claimed in Claim 24; a MCU, as claimed in Claim 4, that reads data from non-volatile storage memory and passes it - via the data bus as claimed in Claim 12 - to the TX/RX for transmission to a peer device in the network; and a host computer, connected to the network, which is installed with software that collects motion metrics data for analysis.
  26. A method, according to Claim 1, to receive software programs from a host computer comprises: a host computer connected to a network, which transmits a software program - as sequential data chunks - to the device; a TX/RX enabling the device to participate in a network, as claimed in Claim 24; a number of peer network devices which route data chunks to the device; and a MCU - as claimed in Claim 4 - which reads data chunks received by the TX/RX, stores and assembles the chunks in non-volatile storage memory, and - once the original software program is fully reassembled - controls the loading and execution of the program.
  27. A method, according to Claim 1, to uniquely identify the device's user comprises: an RFID chip as claimed in Claim 10; and a reader unit that receives the RFID chip's identification number, and routes it to a host computer for processing via software.
  28. A device that includes a wearable aspect comprising: a method to securely attach the device to a user's clothing; and a tough, protective, non-metallic casing to house the electronics aspect - as claimed in Claim 2 - so as to protect it from breakage and environmental interference.
  29. A method, according to Claim 28, to securely attach the device to a user's clothing comprises a fitting mechanism that affixes the device's casing to the user's attire.
  30. A motion capture and identification device substantially as herein described above and illustrated in the accompanying Drawings.
GB0614404A 2005-07-30 2006-07-20 Wearable motion sensor device with RFID tag Withdrawn GB2428802A (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GBGB0515796.1A GB0515796D0 (en) 2005-07-30 2005-07-30 A motion capture and identification device

Publications (2)

Publication Number Publication Date
GB0614404D0 GB0614404D0 (en) 2006-08-30
GB2428802A true GB2428802A (en) 2007-02-07

US8539835B2 (en) 2008-09-12 2013-09-24 Invensense, Inc. Low inertia frame for detecting coriolis acceleration
EP2353065A4 (en) * 2008-10-29 2012-06-27 Invensense Inc Controlling and accessing content using motion processing on mobile devices
EP2353065A1 (en) * 2008-10-29 2011-08-10 Invensense Inc. Controlling and accessing content using motion processing on mobile devices
US8503932B2 (en) 2008-11-14 2013-08-06 Sony Mobile Communications AB Portable communication device and remote motion input device
WO2010056392A1 (en) * 2008-11-14 2010-05-20 Sony Ericsson Mobile Communications Ab Portable communication device and remote motion input device
CN102576417A (en) * 2009-10-20 2012-07-11 Ids有限责任公司 RFID label comprising an interface to external sensors
WO2011049533A1 (en) 2009-10-20 2011-04-28 Ids D.O.O. Rfid label comprising an interface to external sensors
CN102576417B (en) * 2009-10-20 2014-12-31 ams R&D d.o.o. RFID label comprising an interface to external sensors
US9239981B2 (en) 2009-10-20 2016-01-19 Ams R&D D.O.O. RFID label comprising an interface to external sensors
US20110314425A1 (en) * 2010-06-16 2011-12-22 Holy Stone Enterprise Co., Ltd. Air gesture recognition type electronic device operating method
DE102010032775A1 (en) 2010-07-29 2012-02-02 Giesecke & Devrient Gmbh Sensor node for a wireless sensor network used for condition monitoring of industrial machinery, having a microprocessor that encodes and decodes sensor measurement data read out securely by a reader under microprocessor control
US9610908B2 (en) 2011-09-02 2017-04-04 Audi Ag Device for setting at least one operating parameter of at least one vehicle system in a motor vehicle
US9684821B2 (en) 2012-01-20 2017-06-20 Thomson Licensing Method and apparatus for user recognition
WO2013107038A1 (en) * 2012-01-20 2013-07-25 Thomson Licensing Method and apparatus for user recognition
CN103999133A (en) * 2012-01-20 2014-08-20 汤姆逊许可公司 Method and apparatus for user recognition
US9636509B2 (en) 2012-01-27 2017-05-02 Medtronic, Inc. Retrieval of information from an implantable medical device
CN103619090A (en) * 2013-10-23 2014-03-05 深迪半导体(上海)有限公司 System and method of automatic stage lighting positioning and tracking based on micro inertial sensor
CN104574426B (en) * 2015-02-03 2017-08-18 大连恒锐科技股份有限公司 Method and device for human body feature analysis based on barefoot or sock-clad footprint images
CN104574426A (en) * 2015-02-03 2015-04-29 大连恒锐科技股份有限公司 Method and device for human body feature analysis based on barefoot or stocking-wearing footprint images
EP3185110A1 (en) * 2015-12-24 2017-06-28 Epawn Hybrid mobile element, method and device for interfacing a plurality of hybrid mobile elements with a computer system, and assembly for virtual or augmented reality system
FR3046261A1 (en) * 2015-12-24 2017-06-30 Epawn HYBRID MOBILE ELEMENT, METHOD AND DEVICE FOR INTERFACING A PLURALITY OF HYBRID MOBILE ELEMENTS WITH A COMPUTER SYSTEM, AND ASSEMBLY FOR A VIRTUAL OR AUGMENTED REALITY SYSTEM
US10527400B2 (en) 2015-12-24 2020-01-07 Starbreeze Paris Hybrid mobile entity, method and device for interfacing a plurality of hybrid mobile entities with a computer system, and a set for a virtual or augmented reality system
RU168861U1 (en) * 2016-03-30 2017-02-21 Игорь Юрьевич Лобортас STUD
CN106267781A (en) * 2016-09-19 2017-01-04 清华大学 Water proof type motion capture system for athletic training
CN108536314A (en) * 2017-03-06 2018-09-14 华为技术有限公司 Identity recognition method and device
EP3885964A1 (en) * 2020-03-27 2021-09-29 Hand Held Products, Inc. Methods and systems for improved tag identification
US11625547B2 (en) 2020-03-27 2023-04-11 Hand Held Products, Inc. Methods and systems for improved tag identification

Also Published As

Publication number Publication date
GB0614404D0 (en) 2006-08-30
GB0515796D0 (en) 2005-09-07

Similar Documents

Publication Publication Date Title
GB2428802A (en) Wearable motion sensor device with RFID tag
US11231786B1 (en) Methods and apparatus for using the human body as an input device
Foxlin et al. VIS-Tracker: A Wearable Vision-Inertial Self-Tracker.
KR101728052B1 (en) Iot system recognizing action of user and controlling things by using the same
CN108717271A (en) Human-machine interaction control method, device, system and computer-readable storage medium
US20140285416A1 (en) Short Range Wireless Powered Ring for User Interaction and Sensing
Van Acht et al. Miniature wireless inertial sensor for measuring human motions
US20080081656A1 (en) Mobile communication device and method for controlling component activation based on sensed motion
CN108196686A (en) Hand motion posture capture device, method and virtual reality interaction system
CN104473618A (en) Body data acquisition and feedback device and method for virtual reality
Nasif et al. Wireless head gesture controlled wheelchair for disabled persons
CN102332203A (en) System for operating and controlling other apparatuses through motion behavior
US10725550B2 (en) Methods and apparatus for recognition of a plurality of gestures using roll pitch yaw data
Liu et al. The design of wearable wireless inertial measurement unit for body motion capture system
WO2018129041A1 (en) Remote image capture and mounting ecosystem
US10338685B2 (en) Methods and apparatus recognition of start and/or stop portions of a gesture using relative coordinate system boundaries
Farella et al. MOCA: A Low‐Power, Low‐Cost Motion Capture System Based on Integrated Accelerometers
EP3915735A1 (en) Robot control system
Shi et al. Human motion capture system and its sensor analysis
Roggen et al. Poster: BlueSense-Designing an extensible platform for wearable motion sensing, sensor research and IoT applications.
Chen et al. Human motion tracking with wireless wearable sensor network: experience and lessons.
Barbieri et al. A low-power motion capture system with integrated accelerometers [gesture recognition applications]
Scholl et al. Jnode: a sensor network platform that supports distributed inertial kinematic monitoring
CN110077590B (en) Home service robot
CN114756130A (en) Hand virtual-real interaction system

Legal Events

Date Code Title Description
WAP Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1)