WO2007008930A2 - Orientation and motion sensing in athletic training systems, physical rehabilitation and evaluation systems, and hand-held devices - Google Patents

Orientation and motion sensing in athletic training systems, physical rehabilitation and evaluation systems, and hand-held devices

Info

Publication number
WO2007008930A2
WO2007008930A2 PCT/US2006/026954 US2006026954W
Authority
WO
WIPO (PCT)
Prior art keywords
orientation
body part
gravity force
apparent gravity
user
Prior art date
Application number
PCT/US2006/026954
Other languages
French (fr)
Other versions
WO2007008930A3 (en)
Inventor
Christopher R. Noble
Kenneth S. Lyons
Original Assignee
Ultimate Balance, Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ultimate Balance, Inc. filed Critical Ultimate Balance, Inc.
Publication of WO2007008930A2 publication Critical patent/WO2007008930A2/en
Publication of WO2007008930A3 publication Critical patent/WO2007008930A3/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/103Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B5/1116Determining posture transitions
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/103Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B5/1121Determining geometric values, e.g. centre of rotation or angular range of movement
    • A61B5/1122Determining geometric values, e.g. centre of rotation or angular range of movement of movement trajectories
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/103Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B5/1126Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb using a particular sensing technique
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63BAPPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B24/00Electric or electronic controls for exercising apparatus of preceding groups; Controlling or monitoring of exercises, sportive games, training or athletic performances
    • A63B24/0003Analysing the course of a movement or motion sequences during an exercise or trainings sequence, e.g. swing for golf or tennis
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63BAPPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B69/00Training appliances or apparatus for special sports
    • A63B69/36Training appliances or apparatus for special sports for golf
    • A63B69/3608Attachments on the body, e.g. for measuring, aligning, restraining
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/012Head tracking input arrangements
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/016Input arrangements with force or tactile feedback as computer generated output to the user
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B2503/00Evaluating a particular growth phase or type of persons or animals
    • A61B2503/10Athletes
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B2505/00Evaluating, monitoring or diagnosing in the context of a particular type of medical care
    • A61B2505/09Rehabilitation or training
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B2560/00Constructional details of operational features of apparatus; Accessories for medical measuring apparatus
    • A61B2560/02Operational features
    • A61B2560/0223Operational features of calibration, e.g. protocols for calibrating sensors
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B2562/00Details of sensors; Constructional details of sensor housings or probes; Accessories for sensors
    • A61B2562/02Details of sensors specially adapted for in-vivo measurements
    • A61B2562/0219Inertial sensors, e.g. accelerometers, gyroscopes, tilt switches
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/68Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
    • A61B5/6801Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient specially adapted to be attached to or worn on the body surface
    • A61B5/6802Sensor mounted on worn items
    • A61B5/6803Head-worn items, e.g. helmets, masks, headphones or goggles
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63BAPPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B24/00Electric or electronic controls for exercising apparatus of preceding groups; Controlling or monitoring of exercises, sportive games, training or athletic performances
    • A63B24/0003Analysing the course of a movement or motion sequences during an exercise or trainings sequence, e.g. swing for golf or tennis
    • A63B24/0006Computerised comparison for qualitative assessment of motion sequences or the course of a movement
    • A63B2024/0012Comparing movements or motion sequences with a registered reference
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63BAPPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B71/00Games or sports accessories not covered in groups A63B1/00 - A63B69/00
    • A63B71/06Indicating or scoring devices for games or players, or for other sports activities
    • A63B71/0619Displays, user interfaces and indicating devices, specially adapted for sport equipment, e.g. display mounted on treadmills
    • A63B71/0622Visual, audio or audio-visual systems for entertaining, instructing or motivating the user
    • A63B2071/0625Emitting sound, noise or music
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63BAPPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B2209/00Characteristics of used materials
    • A63B2209/10Characteristics of used materials with adhesive type surfaces, i.e. hook and loop-type fastener
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63BAPPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B2230/00Measuring physiological parameters of the user
    • A63B2230/62Measuring physiological parameters of the user posture
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63BAPPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B69/00Training appliances or apparatus for special sports
    • A63B69/36Training appliances or apparatus for special sports for golf
    • A63B69/3623Training appliances or apparatus for special sports for golf for driving
    • A63B69/3629Visual means not attached to the body for aligning, positioning the trainee's head or for detecting head movement, e.g. by parallax

Definitions

  • the present invention relates generally to the fields of athletic training, physical rehabilitation and evaluation, and physical activity monitoring, and more specifically to apparatus and methods of monitoring the orientation of body parts, measuring the range of motion of joints or limbs of the body, measuring levels of physical activity, and providing cuing and measurement feedback for training and rehabilitation purposes .
  • the present invention also relates to hand-held devices for sensing the orientation and motion of body parts or other objects.
  • Athletic training systems and apparatus are known that may be employed to monitor the orientation or movement of a user's body as he or she engages in a particular sporting activity.
  • a conventional athletic training system may be attached to the user's head or any other suitable body part, and may include a number of tilt sensors for detecting the direction of tilt of the user's head relative to a user reference orientation (such as a "straight ahead" reference orientation) and/or to an adjustable tilt threshold magnitude.
  • a conventional system may provide the user with one or more visible or audible indications of the orientation or movement of his or her body in real time.
  • the user may employ the system to train his or her body to maintain a desired body posture or to execute a desired movement while performing a particular sporting activity.
  • patients receiving physical therapy for balance disorders or other posture-related conditions may use the system to monitor their progress while performing rehabilitation exercises, or to monitor their posture as they go about their daily activities .
  • the tilt sensors include accelerometers, which, because accelerometers are responsive to both acceleration and tilt, can generate misleading signals when the body part is accelerating.
  • the sensitivity and accuracy of the accelerometer are generally high when the sensitive axis of the accelerometer is close to horizontal, i.e., parallel to the earth's surface, but typically worsen as the sensitive axis of the accelerometer becomes vertical, i.e., perpendicular to the earth's surface.
  • conventional systems also typically require the sensitive axes of the tilt sensors to be precisely aligned relative to corresponding axes of the user.
  • the sensitive axis of one tilt sensor may have to be precisely aligned with the left/right axis of the user's head, while the sensitive axis of another tilt sensor may have to be precisely aligned with the front/back axis of the user's head.
  • some users of conventional athletic training systems may be incapable of recognizing or responding to the visible or audible indications provided by the system.
  • the type of visible or audible feedback provided by the conventional system may be insufficient in some applications, e.g. , when an attending therapist requires quantitative feedback relating to the user's balance skill level, range of motion, conformance to a requested motion or sequence of motions, and/or in applications where users may require guidance or instruction from the training system itself in the absence of the trainer or therapist .
  • the visible or audible feedback may also be inappropriate or unduly distracting to others, e.g., when the system is used in public places.
  • angular rate sensors can be more expensive and larger than accelerometers, can consume more power, and can exhibit significant drift errors .
  • Athletic training systems are also known that employ techniques to "arm" the system, i.e., to initiate monitoring activity based upon an analysis of user movement. To initiate the monitoring activity, they typically require the user to maintain a steady position for a specified time-period, which can lead to errors because the user can sometimes remain motionless with no intention of initiating the monitoring activity.
  • a number of systems for measuring the range of motion (ROM) of a body part about a joint or limb of the user's body are also known.
  • one such system that may be employed in physical rehabilitation applications includes a pair of accelerometers to compensate for the reduction in sensitivity and accuracy that can occur as the sensitive axis of a single accelerometer becomes vertical.
  • the pair of accelerometers of this system needs to be aligned with the intended axis of rotation of the measured body part.
  • the system monitors the outputs of each accelerometer for either a varying signal or an over-range signal, which can be indicative of such acceleration.
  • the above-described conventional system for measuring range of motion also has drawbacks.
  • the acceleration of a body part can cause a distortion in the sensor reading that is not characterized by an over-range or varying signal output, and the system may be incapable of detecting such a condition.
  • the system must typically be manipulated while the measurements are being taken, for example, to trigger a reading when determining the initial orientation or maximum extension during range of motion (ROM) measurements.
  • the system must typically be repositioned to perform multiple measurements on a single joint to re-establish precise alignment of the sensors with each new axis of motion.
  • Another known system for monitoring physical activity may be employed in pedometers and other activity-monitoring devices.
  • the primary objective is to measure accurately the magnitude of an oscillating acceleration, such as an up-down acceleration of a runner or a front-back acceleration of a rower, which is subsequently used to estimate activity level and/or for other purposes.
  • the system includes a plurality of accelerometers disposed in different directions . Signals generated by the accelerometers are compared, and, in response to the signal comparison, one of the accelerometers is selected as being aligned closest to the direction of user acceleration of interest.
  • this system has drawbacks in that there is a practical limit to the number of accelerometers that may be employed in the system. Further, the likelihood that any one of the accelerometers will be oriented precisely in the direction of user acceleration may be low.
  • one such hand-held device includes a 2-axis accelerometer operative to control the position of a graphical pointer on a display screen.
  • the device filters out the DC and low frequency components of the accelerometer output, and inserts a new DC component in the system output with a slow feedback loop to maintain correspondence between the average tilt of the accelerometer and the center of the screen.
  • One drawback of this device is that it does not provide a measurement of the actual magnitude of the accelerometer output.
  • this device fails to address the reduction in sensitivity and accuracy that can occur as the sensitive axis of the accelerometer becomes vertical.
  • an apparatus for monitoring the orientation of an object in 3-dimensional space including a 3-axis sensor, at least one memory, and at least one processor.
  • the apparatus is configured to be attached to, mounted to, held against, or otherwise disposed in contact with the object to be monitored.
  • the 3-axis sensor is configured to sense a magnitude of tilt along each of a first axis, a second axis, and a third axis.
  • the memory is operative to store data representing the sensed magnitude of tilt along each of the three axes
  • the processor is operative to process the data stored in the memory. Specifically, the processor determines an angle between each of the first, second, and third axes and the horizontal plane, and selects the two axes with the two smallest such angles.
  • the processor then generates an indication of the orientation of the object based upon the sensed magnitude of tilt along the two selected axes. In this way, the apparatus provides increased sensitivity and accuracy in substantially any orientation relative to the object to which it is attached, even when one of the sensitive axes of the 3-axis sensor becomes vertical.
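  • As an illustration of the axis-selection step just described, the following is a minimal sketch in Python; it assumes tilt readings already normalized to units of g along three orthogonal sensitive axes, and the function name and tie-breaking are illustrative rather than taken from the patent.

```python
import math

def select_best_axes(x, y, z):
    """Pick the two sensitive axes closest to the horizontal plane.

    x, y, z are tilt readings (in g) along the three orthogonal axes; each
    reading is the projection of the apparent gravity vector onto that axis,
    so a small magnitude means the axis is nearly horizontal and therefore
    most sensitive to changes in tilt.
    """
    readings = {"x": x, "y": y, "z": z}
    g = math.sqrt(x * x + y * y + z * z) or 1.0
    # Angle between each axis and the horizontal plane: asin(|projection| / |G|).
    angles = {name: math.asin(min(abs(v) / g, 1.0)) for name, v in readings.items()}
    # Keep the two axes with the smallest such angles.
    best = sorted(angles, key=angles.get)[:2]
    return {name: readings[name] for name in best}

# Example: the z axis is nearly vertical, so x and y are selected.
print(select_best_axes(0.10, -0.05, 0.99))
```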
  • This first embodiment of the present invention may also be employed to detect the presence of acceleration.
  • the apparatus is attached to, mounted to, or held against the object to be monitored.
  • the apparent gravity force acting on the apparatus is measured.
  • the direction of the actual gravity force is determined by analyzing the variation in the apparent gravity force.
  • a first vector representing the actual gravity force is then subtracted from a second vector representing the apparent gravity force to obtain a third vector representing the acceleration of the object.
  • an indication of the direction and/or the magnitude of the third vector is generated, thereby providing an indication of the acceleration of the object.
  • Another embodiment of the present invention may be employed in athletic training or any other suitable physical activity or exercise to determine a reference orientation of a user.
  • the direction of tilt of a body part of the user can then be determined relative to the user's reference orientation, independently of the mounted orientation of the sensing apparatus.
  • This embodiment of the present invention may be employed, for example, to monitor the direction and magnitude of tilt of the user's head while he or she plays tennis or golf.
  • the user's body part is positioned in a first orientation, and an apparent gravity force acting on the body part is measured to obtain a first direction of the apparent gravity force.
  • the body part undergoes an angular displacement about at least one axis from the first orientation to a second orientation, and the apparent gravity force acting on the body part is measured again to obtain a second direction of the apparent gravity force.
  • the reference orientation of the user is then determined based upon the first and second directions of the apparent gravity force, and stored in memory. Because the user's reference orientation is stored in memory, directions of subsequent angular displacements of the body part can be determined relative to the stored reference orientation.
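  • A hedged sketch of this two-measurement calibration, assuming the two apparent-gravity readings G1 and G2 are available as 3-vectors in device coordinates (helper names and the sign convention for "right" are illustrative): the user's forward direction is taken as the component of G2 orthogonal to G1, and later tilts are resolved against that reference.

```python
import numpy as np

def unit(v):
    return v / np.linalg.norm(v)

def calibrate_reference(g1, g2):
    """Derive the user's forward direction from two gravity measurements.

    g1: apparent gravity measured while the user holds a first (e.g. upright)
    orientation; g2: apparent gravity measured after a deliberate forward
    tilt.  The forward tangent direction at g1 is the component of g2
    orthogonal to g1.
    """
    g1, g2 = unit(np.asarray(g1, float)), unit(np.asarray(g2, float))
    forward = unit(g2 - np.dot(g2, g1) * g1)
    return g1, forward

def tilt_relative_to_reference(gn, g1, forward):
    """Return (magnitude_deg, direction_deg) of a later tilt reading gn,
    with direction measured from the calibrated forward direction."""
    gn = unit(np.asarray(gn, float))
    magnitude = np.degrees(np.arccos(np.clip(np.dot(gn, g1), -1.0, 1.0)))
    tangent = gn - np.dot(gn, g1) * g1        # direction of this tilt at g1
    if np.linalg.norm(tangent) < 1e-9:
        return 0.0, 0.0
    tangent = unit(tangent)
    right = unit(np.cross(g1, forward))       # completes the user's frame
    direction = np.degrees(np.arctan2(np.dot(tangent, right),
                                      np.dot(tangent, forward)))
    return magnitude, direction
```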
  • Still another embodiment of the present invention may be employed in physical rehabilitation and evaluation applications .
  • this embodiment of the present invention may be employed to measure the extension of a body part around a fixed joint fulcrum.
  • a housing including a sensor is disposed against the body part.
  • the body part is positioned in a first orientation relative to the joint.
  • the sensor measures an apparent gravity force acting on the housing disposed against the body part to obtain a first direction of the apparent gravity force.
  • the body part is positioned in a second orientation relative to the joint.
  • the sensor measures the apparent gravity force acting on the housing at the second orientation to obtain a second direction of the apparent gravity force .
  • a magnitude of rotation of the body part from the first orientation to the second orientation can then be determined based upon the first and second directions of the apparent gravity force, independent of the alignment between the body part and the housing.
  • the monitoring of the orientation of a body part can be initiated by a specified sequence of user motions, thereby obviating the need to manipulate the orientation and motion-sensing apparatus directly.
  • a sensor is disposed against the body part.
  • the body part is positioned in a first orientation, and the sensor is operated to provide data representing a first position of the body part.
  • the body part is then positioned in at least one second orientation, and the sensor is operated to provide data representing at least one second position of the body part. If the first and second positions of the body part correspond to a specified sequence of user positions, then monitoring of the orientation of the body part by the sensor is initiated.
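  • A minimal sketch of such sequence-based initiation, assuming each orientation is summarized as a unit apparent-gravity direction and that a tolerance circle (a 10° circle is used later in this description as an example) defines a match; the function names and the streaming loop are illustrative.

```python
import numpy as np

def matches(current, target, tolerance_deg=10.0):
    """True if the measured gravity direction lies within a tolerance circle
    around the stored target direction (both unit 3-vectors)."""
    angle = np.degrees(np.arccos(np.clip(np.dot(current, target), -1.0, 1.0)))
    return angle <= tolerance_deg

def check_arming_sequence(samples, targets, tolerance_deg=10.0):
    """Return True once the stream of gravity directions visits each target
    orientation in order (e.g. address ball -> look at horizon -> address
    ball), which initiates the monitoring mode."""
    step = 0
    for g in samples:
        if matches(g, targets[step], tolerance_deg):
            step += 1
            if step == len(targets):
                return True      # full sequence recognized: start monitoring
    return False
```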
  • the apparatus includes a sensor, at least one memory, at least one processor, and an audio output system.
  • the sensor is configured to sense an angular orientation of the body part, and to provide data representing the sensed angular orientation.
  • the memory is operative to store data representing a plurality of words or phrases, and the audio output system generates an audible message in response to an electronic input.
  • the processor monitors the data provided by the sensor, and accesses data stored in the memory corresponding to at least one word or phrase relating to the sensed angular orientation of the body part.
  • In cooperation with the audio output system, the processor generates a message audible to the user that corresponds to the accessed word or phrase.
  • the word or phrase may include at least one instructional word or phrase for the user, or a confirmation of the start or completion of a specified act performed by the user or the apparatus during the course of monitoring the orientation of the body part.
  • the orientation and motion-sensing apparatus may provide feedback to the user in the form of one or more visible and/or tactile outputs.
  • Fig. 1 illustrates a conventional orientation and motion- sensing device attached to the head of a user
  • Fig. 2a illustrates an orientation and motion-sensing device according to the present invention
  • Figs. 2b-2e depict a back end view, a back side view, a top view, and a front side view, respectively, of the device of Fig. 2a;
  • Fig. 3 illustrates a feedback pattern of the device of Fig. 2a with reference to various head orientations of a user;
  • Fig. 4 is a block diagram of the device of Fig. 2a;
  • Fig. 5 is a diagram of a geometric model illustrating the operation of the device of Fig. 2a;
  • Fig. 6 is a diagram of a geometric model illustrating a sequence of user movements that may be performed to calibrate the alignment between the device of Fig. 2a and a reference orientation of the user, and how to determine the direction and magnitude of subsequent user deviation from the reference orientation;
  • Fig. 7 is a flow diagram of a method of calibrating the device of Fig. 2a, corresponding to the sequence of user movements of Fig. 6;
  • Fig. 8 is a diagram of a geometric model illustrating an alternate sequence of user movements that may be performed when calibrating the device of Fig. 2a, and how to determine the magnitude of subsequent deviation from the calibrated orientation;
  • Fig. 9a is a diagram of a geometric model illustrating the operation of the device of Fig. 2a when the device is being subjected to a combined stimulus of tilt and periodic acceleration;
  • Fig. 9b is a schematic diagram illustrating a technique for discriminating between the tilt and motion stimuli of Fig. 9a.
  • Fig. 10 is a flow diagram of a method of performing the technique of Fig. 9b.
  • FIG. 1 depicts a conventional orientation and motion- sensing device 110 attached to the head of a user.
  • the conventional device 110 may be attached to a headband 120 using one or more Velcro™ fasteners.
  • the device 110 may be attached to the user's headband to position the device over his or her right ear.
  • the device 110 includes a number of tilt indicators (not shown) operative to detect and monitor the orientation of the user's head, i.e., the direction and magnitude of tilt, relative to a reference orientation and/or to an adjustable tilt magnitude threshold.
  • the device 110 may be employed to monitor the direction and magnitude of tilt of the user's head while the user plays tennis.
  • the user mounts the device 110 over his or her right ear, and assumes a suitable tennis posture such as a vertical stance.
  • the device 110 measures the tilt of the device relative to X and Y-axes 130 and 140 of the user's head to establish the reference orientation.
  • the device 110 monitors the tilt of the user's head, e.g., from left to right, from right to left, from front to back, and/or from back to front, while the user plays tennis. If the magnitude of tilt in any direction exceeds the adjustable magnitude threshold, then a visible and/or audible alarm is generated to indicate the dominant direction of tilt.
  • the conventional orientation and motion-sensing device 110 of Fig. 1 depends highly upon the positioning and orientation of the device 110 relative to the user's head. For example, if the user were to position the device 110 over the left ear instead of the right ear, as depicted in Fig. 1, then the device 110 would incorrectly interpret head tilting from front to back as tilting from back to front, and head tilting from left to right as tilting from right to left.
  • the X-axis 150 of the device 110 is not precisely aligned with the X-axis 130 of the user's head, i.e., the X-axis 130 of the user's head points slightly to the right of the X-axis 150 of the device 110. As a result, the device 110 may provide inaccurate directional feedback to the user, especially when the user tilts his or her head in the front-left, front-right, back-left, or back-right directions.
  • Figs. 2a-2e depict an illustrative embodiment of an orientation and motion-sensing device 210, in accordance with the present invention.
  • the orientation and motion-sensing device 210 provides proper operation and increased accuracy when attached in substantially any orientation relative to the user, and accurately measures changes in body-part orientation whether or not these changes are in directions aligned with the device's sensitive axes.
  • the device 210 provides a technique for calibrating a reference orientation of the user so that the device correctly tracks changes in the user's posture or body orientation.
  • the device 210 also maintains high sensitivity and accuracy as a sensitive axis of the device becomes vertical, reduces the generation of erroneous or misleading signals in the presence of acceleration, and estimates the magnitude and direction of the acceleration.
  • the device 210 provides visible, audible (e.g., human speech), and/or tactile feedback to the user.
  • Fig. 2a depicts the orientation and motion-sensing device 210 attached to the head of a user.
  • the device 210 may be attached to a golf cap 220 using one or more Velcro™ fasteners.
  • the device 210 may alternatively be attached directly or indirectly to any other suitable body part of the user, or any suitable article of clothing or accessory of the user, using any other suitable type of fastener.
  • the device 210 may be incorporated into an article of clothing or accessory, may be held by the user, may be held against the user by an attendant, or may be incorporated into a hand-held device such as a cell phone, a computer game control, or any other suitable hand-held device.
  • the device 210 is configured to provide one or more visible and/or audible indications of the orientation or movement of the user's body in real time.
  • the device 210 may be employed to monitor the orientation or movement of the user as he or she engages in a sporting or leisure activity such as golf, tennis, fencing, sculling, running, walking, bicycling, dancing, or any other suitable activity.
  • the device 210 may also be employed by physical therapy patients as an aid in performing rehabilitation exercises or to palliate the effects of a loss of balance ability, which may have resulted from an accident, physical and/or mental degradation, or illness.
  • the device 210 may be used to monitor the tilt of the user's head while he or she plays golf.
  • the device 210 is operative to monitor the tilt of the user's head relative to an X-axis 230 (see Fig. 2e) and a Y- axis 235 (see Fig. 2b) of the device 210.
  • the X-axis 230 points approximately straight ahead of the user and the Y-axis 235 points approximately toward the right side of the user when the device 210 is attached to the user's cap, as depicted in Fig. 2a.
  • the device 210 monitors the tilt of the user's head, e.g., from left to right, from right to left, from front to back, and from back to front, while the user plays golf.
  • Fig. 3 is a diagram illustrating the various approximate head tilts (i.e., front, back, left, right) of the user relative to the X and Y-axes 310 and 320.
  • Fig. 2b depicts a back-end view of the orientation and motion-sensing device 210, illustrating a connector 240 for receiving a headphone or earphone jack (not shown) .
  • connector 240 may be designed to accommodate a battery-charger connector or a network connector, or may be excluded from the system.
  • Fig. 2c depicts a backside view of the device 210 including an inner surface 250 that would normally be disposed against the user, and a speaker 260.
  • one or more visible, alphanumerical, tactile, and/or graphical outputs may be provided instead of, or in addition to, the audible output provided through a headphone or earphone connected to the connector 240 (not shown) or through the speaker 260.
  • FIG. 2d depicts a top view of the device 210 including four user controls 270 (e.g., cal, ∨, mode, ∧) implemented as pushbuttons, and a light emitting diode (LED; not numbered).
  • Fig. 2e depicts an exemplary front side view of the device 210.
  • Fig. 4 depicts exemplary functional components 400 included in the orientation and motion-sensing device 210 (see Figs. 2a- 2e) .
  • the functional components 400 include an X-axis sensor 402, a Y-axis sensor 404, a signal multiplexer and analog-to-digital converter (A/D) 406, a data processor 408, a program memory 410, a data memory 412, a user display and controls section 414, a voice data memory 416, an audio output converter 418, a speaker 420, a PC/network interface 422, a wireless networking antenna 424, a wired networking connector 426, a battery and power supply 428, a battery charger connector 430, one or more tactile vibration outputs 432, and an over-range sensor 440.
  • A/D analog-to-digital converter
  • Front/Back/Left/Right tilt indications can be signaled, for example, by the activation of a tactile vibration transducer attached to the inside of a headband at the forehead, the back of the skull, the left temple, and the right temple, respectively.
  • the battery charger connector 430 is connected to a battery charger (not shown) .
  • the PC/network interface 422 is connected to a personal computer (PC; not shown) , and/or to a point-to-point network/remote-control unit, a local area network (LAN) , or a wide area network (WAN) through either the wireless networking antenna 424 or the wired networking connector 426.
  • PC personal computer
  • LAN local area network
  • WAN wide area network
  • the orientation and motion-sensing device 210 may include all or a subset of the functional components illustrated in Fig. 4.
  • the components 416, 418, 420, 422, 424, 426, 430, 432, and/or 440 may be omitted.
  • the X-axis sensor 402 and the Y-axis sensor 404 are accelerometers oriented within the device 210 so that their respective sensitive axes, namely, the X-axis 230 and the Y-axis 235, are positioned 90° to one another. It is noted, however, that the X and Y-axis sensors 402 and 404 may employ any other suitable technique for sensing tilt, and may be oriented at any other suitable angle relative to one another.
  • the X-axis sensor 402 is operative to sense tilt along the X-axis 230 (see Fig. 2e) of the device 210
  • the Y-axis sensor 404 is operative to sense tilt along the Y-axis 235 (see Fig. 2b) of the device 210.
  • the X and Y-axis sensors 402 and 404 sense tilt and acceleration along the X and Y axes 230 and 235, respectively, by measuring the projection of a force vector on their respective axes that is the sum of the force of gravity at the location of the device 210 and a force of acceleration applied to the device 210 during use.
  • This force vector is known interchangeably as an apparent acceleration vector or an apparent gravity vector.
  • FIG. 3 depicts the relationship between the X and Y axes 310 and 320 and an exemplary apparent gravity vector G 330.
  • the frame of reference is the X and Y axes 310 and 320 of the device 210, while the direction of the apparent gravity vector G 330 relative to the X and Y axes 310 and 320 can change over time, as indicated by a directional arrow 340.
  • each of the X and Y-axis sensors 402, 404 may be a micro-machined accelerometer such as the ADXL103 accelerometer sold by Analog Devices Inc., Norwood, Massachusetts, U.S.A.
  • the X and Y-axis sensors 402, 404 may be implemented using a single dual-axis accelerometer such as the ADXL322 dual-axis accelerometer sold by Analog Devices Inc.
  • the signal multiplexer and analog-to-digital converter (A/D) 406 and the data processor 408 may be implemented using the PIC16F777 microcontroller sold by Microchip Technology Inc., Chandler, Arizona, U.S.A., or any other suitable microcontroller or microprocessor.
  • the audio converter 418 and the voice data memory 416 may be implemented using the ML22Q54 signal processor sold by OKI Semiconductor, Sunnyvale, California, U.S.A., or any other suitable device for storing and processing audio files .
  • conversion of the voice files is performed using software executing on the data processor 408 instead of being implemented as the separate functional block 418.
  • the PC/network interface 422 may be a wired or wireless (e.g., infrared or RF) interface for downloading or uploading content to or from the program memory 410, the data memory 412, and/or the voice data memory 416.
  • the PC/network interface 422 may also be configured for controlling the device 210 remotely.
  • Time- stamps and/or sequences of measurements performed by the orientation and motion-sensing device 210 may be stored within the data memory 412 for subsequent local processing, for subsequent feedback to the user, and/or for subsequent uploading to a computer via the PC/network interface 422.
  • application-specific user feedback phrases, measurement algorithms, and/or cuing sequences may be downloaded to the device 210 from a computer or over a communications network such as the Internet .
  • the over-range sensor 440 operates as a third tilt sensor, which is oriented at a specified angle to the X-Y plane defined by the sensitive X and Y-axes 310, 320.
  • the sensitive axis of the over-range sensor 440 is oriented at 90° to the X-Y plane.
  • the over-range sensor 440 may be a micro-machined accelerometer such as the ADXL103 accelerometer sold by Analog Devices Inc.
  • Fig. 5 is a diagram of a geometric model 500 that may be employed to illustrate the operation of the orientation and motion-sensing device 210 (see Figs. 2a-2e).
  • the geometric model 500, which is constructed with reference to both the spherical and Cartesian coordinate systems, may be used to quantify the device's orientation and acceleration, and to quantify the alignment of the device 210 with the body part to which it is attached.
  • the geometric model 500 includes the entire unit sphere, of which one octal portion is shown. Further, an X- axis 502 represents the sensitive axis of the X-axis sensor 402 (see Fig. 4) , a Y-axis 504 represents the sensitive axis of the Y- axis sensor 404 (see Fig. 4) , and a Z-axis 506 represents the sensitive axis of the over-range sensor 440 (see Fig. 4) .
  • the X and Y-axes 502 and 504 define an X-Y plane, and the Z-axis 506 is oriented 90° to the X-Y plane.
  • Fig. 5 depicts one possible direction of an exemplary apparent gravity vector G 508.
  • the origin of the vector G corresponds to the origin of the unit sphere . Because the actual gravity vector always points in the same direction, i.e., toward the center of the earth, the device 210 can determine changes in the orientation and acceleration of the body part to which it is attached by monitoring and analyzing changes in the magnitude and direction of the apparent gravity vector G 508 relative to axes X 502, Y 504 and Z 506.
  • the length x of an X-vector 510 represents the magnitude of the apparent gravity vector G 508 as measured by the X-axis sensor 402 along the X-axis 502
  • the length y of a Y-vector 512 represents the magnitude of the apparent gravity vector G 508 measured by the Y-axis sensor 404 along the Y-axis 504.
  • the direction of the apparent gravity vector G 508 can be determined using the measurements provided by the X and Y-axis sensors 402 and 404 (see Fig. 4) . It is noted that, in an alternative embodiment, the X- axis 502 and the Y-axis 504 may be oriented at an angle different from 90° to one another, in which case the formulas (1) and (2) above and the other formulas below may be modified as appropriate using known trigonometric identities.
  • an accelerometer's sensitivity to changes in tilt is at a maximum when the sensitive axis of the accelerometer is close to horizontal, and is at a minimum when the sensitive axis of the accelerometer becomes vertical.
  • the orientation and motion-sensing device 210 employs the over-range sensor 440 (see Fig. 4) in conjunction with the X and Y-axis sensors 402 and 404 (see Fig. 4) to determine the direction of the apparent gravity vector G 508 over the entire unit sphere, thereby allowing the device 210 to provide an accurate measurement of tilt in any orientation of the device .
  • the length of the X- vector 510, the length of the Y-vector 512, and the length of the Z-vector 514 represent the magnitudes of acceleration measured by the X-axis sensor 402, the Y-axis sensor 404, and the over-range sensor 440, respectively, when the device 210 is acted upon by the apparent gravity vector G 508.
  • the length ρ 524 (see Fig. 5) of the apparent gravity vector G 508, normalized to the gravitational field at the earth's surface, may be expressed as ρ = √(x² + y² + z²).
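  • The following is a small sketch of how the length and spherical direction of the apparent gravity vector might be computed from the three axis readings; the variable names and the particular angle conventions are illustrative, and readings are assumed to be normalized to units of g.

```python
import math

def apparent_gravity_direction(x, y, z):
    """Length and spherical direction of the apparent gravity vector G from
    the projections x, y, z measured along the three sensitive axes.

    Returns (rho, theta_deg, phi_deg): rho is the length of G in g, theta_deg
    its angle from the Z axis, and phi_deg its azimuth in the X-Y plane
    measured from the X axis.
    """
    rho = math.sqrt(x * x + y * y + z * z)
    theta = math.degrees(math.acos(z / rho)) if rho else 0.0
    phi = math.degrees(math.atan2(y, x))
    return rho, theta, phi

# A device lying flat (Z axis vertical) reads roughly (0, 0, 1):
print(apparent_gravity_direction(0.02, -0.01, 0.99))
```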
  • the accuracy of the measurement of the direction of the apparent gravity vector G by the orientation and motion-sensing device 210 can decrease when the device undergoes acceleration. Whether or not the device 210 is undergoing acceleration, and how much the acceleration is affecting the accuracy of the measurement of the apparent gravity vector G by the device, can be determined based upon the length ρ of the vector G. Applying known rules of trigonometry, the direction φ21 in the spherical coordinate system may likewise be determined from these measurements.
  • the device 210 generates an audible message "Front" if tilts subsequent to the calibration are within ±45° of the user's Front direction, an audible message "Left" if tilts are within ±45° of the Left direction, an audible message "Back" if tilts are within ±45° of the Back direction, and an audible message "Right" if tilts are within ±45° of the Right direction.
  • the correct feedback from the device 210 can thus be expressed mathematically in terms of the angle θ2n, which is the length of the great circle segment G2-Gn.
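  • A sketch of the ±45° sector feedback described above, assuming the tilt direction has already been expressed as an angle measured from the user's calibrated Front direction (the clockwise-toward-Right sign convention is an assumption of this sketch):

```python
def sector_feedback(direction_deg):
    """Quantize a tilt direction (degrees clockwise from the calibrated Front
    direction) into the audible message used for feedback."""
    sectors = ["Front", "Right", "Back", "Left"]
    index = int(((direction_deg % 360.0) + 45.0) // 90.0) % 4
    return sectors[index]

assert sector_feedback(10.0) == "Front"    # within +/-45 degrees of Front
assert sector_feedback(-120.0) == "Left"
```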
  • Fig. 8 is a diagram of a geometric model 800 that may be employed to illustrate the operation of the orientation and motion-sensing device 210 in applications including a physical rehabilitation and evaluation application involving the determination of a patient's range of motion (ROM).
  • ROM range of motion
  • the orientation of the device 210 relative to the body part is fixed but indeterminate, and the direction of the body part motion to be measured is in a vertical plane, but in an unknown direction. This method is thus useful when range of motion in several different directions is to be measured without having to reposition the device between measurements.
  • the geometric model 800 of Fig. 8 is constructed with reference to both the spherical and Cartesian coordinate systems. Only one octal portion of the unit sphere is shown.
  • An X- axis 802 represents the sensitive axis of the X-axis sensor 402 (see Fig. 4)
  • a Y-axis 804 represents the sensitive axis of the Y- axis sensor 404 (see Fig. 4)
  • a Z-axis 806 represents the sensitive axis of the over-range sensor 440.
  • the device 210 employs a single orientation calibration at time 1, as indicated by the apparent gravity vector G 1 830 (see Fig. 8) .
  • the vector G1 830 may correspond to the resting or starting orientation of a limb extension. As the patient extends his or her limb, the end of the apparent gravity vector G moves away from the end of the vector G1 830.
  • the end of the apparent gravity vector Gn 860 may be located anywhere on a circle 870.
  • the magnitude of extension, which is represented by the magnitude of tilt θ1n 850 (see Fig. 8), is the length of the great circle segment G1-Gn, where θ1n can be calculated from the relationship cos θ1n = G1 · Gn (taking G1 and Gn as unit vectors).
  • the device 210 may be configured to monitor, capture and store the maximum value of the magnitude of tilt θ1n for subsequent feedback, thereby allowing the patient to track his or her progress over time.
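  • A hedged sketch of the range-of-motion calculation, assuming G1 and Gn are available as 3-vectors in device coordinates: the great-circle angle between the two measured gravity directions gives the magnitude of rotation, and the running maximum is retained for feedback (class and function names are illustrative).

```python
import numpy as np

def great_circle_angle_deg(g1, gn):
    """Angle between two measured apparent-gravity directions (3-vectors)."""
    g1 = g1 / np.linalg.norm(g1)
    gn = gn / np.linalg.norm(gn)
    return float(np.degrees(np.arccos(np.clip(np.dot(g1, gn), -1.0, 1.0))))

class RangeOfMotionTracker:
    """Track the largest extension seen since the starting orientation g1."""

    def __init__(self, g1):
        self.g1 = np.asarray(g1, float)
        self.max_extension_deg = 0.0

    def update(self, gn):
        angle = great_circle_angle_deg(self.g1, np.asarray(gn, float))
        self.max_extension_deg = max(self.max_extension_deg, angle)
        return angle
```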
  • it may be concluded that the device 210 is being subjected to acceleration in addition to the force of gravity when, for example, the measured length ρ of the apparent gravity vector G deviates from unity. It is noted that for small changes in the value of ρ, the worst-case error in the calculation of the direction of the apparent gravity vector G is about 1° for a 1.75% change in ρ. In one embodiment, if the error in the calculation of the apparent gravity vector G is significant for a given application, then the device 210 provides a suitable visible, tactile and/or audible warning to the user.
  • Fig. 6 is a diagram of a geometric model 600 that may be employed to illustrate a technique of determining the orientation of the device 210 (see Figs. 2a-2e) relative to a body part of a user to which it is attached.
  • the geometric model 600 of Fig. 6 is constructed with reference to both the spherical and Cartesian coordinate systems and displays one octal portion of a unit sphere.
  • an X-axis 602 represents the sensitive axis of the X-axis sensor 402 (see Fig. 4)
  • a Y-axis 604 represents the sensitive axis of the Y-axis sensor 404 (see Fig. 4)
  • a Z-axis 606 represents the sensitive axis of the over-range sensor 440.
  • Fig. 7 provides a sequence of steps for quantifying the orientation of the device 210 relative to the body part to which it is attached, and allows the determination of a reference orientation of the user.
  • the device is attached to the user's golf cap above his or her right ear (see, e.g., illustration 715 of Fig. 7) .
  • the device 210 may alternatively be attached to any other suitable body part (e.g., the user's chest, back, elbow, etc.), and in any other suitable orientation relative to the user, because the X and Y-axes of the device 210 are not required to be aligned with the corresponding axes of the body part to which it is attached, nor does the degree of misalignment need to be known.
  • the user first stands vertically, looking towards the horizon, as depicted in step 705. It is noted that the user may alternatively look in any other suitable direction.
  • the user triggers a first calibration of the device 210 at time 1 by depressing one or more suitable user controls (see, e.g., the cal pushbutton 270 of Fig. 2), as depicted in step 710.
  • the user then holds his or her vertical standing orientation, as depicted in step 720, while the device 210 captures the first calibration direction of the apparent gravity vector G, as indicated by the vector G1 610 (see Fig. 6).
  • the vector G1 610 does not necessarily coincide with the Z-axis 606 of the device 210.
  • the X and/or Y axes 602, 604 of the device 210 are not required to be horizontal.
  • the user tilts his or her head a number of degrees toward the front or forward direction, and triggers a second calibration of the device 210 at time 2 by depressing the cal pushbutton 270, as depicted in step 725.
  • the user may alternatively tilt his or her head in any other suitable direction.
  • the device 210 may be configured to execute the triggering steps 710 and 725 under program control, allowing the calibration procedure to be performed without requiring the user to manipulate the device .
  • The user then holds the tilted orientation of his or her head, as depicted in step 735, while the device 210 captures the second calibration direction of the apparent gravity vector G, as indicated by the apparent gravity vector G2 620 (see Fig. 6).
  • the data-capture phase of the calibration is then complete as indicated in step 740.
  • the device 210 employs the first and second calibration directions of the apparent gravity vector G to determine the orientation of the device relative to the body part to which it is attached.
  • an arc G1-G2 extending from the end of the apparent gravity vector G1 610 to the end of the apparent gravity vector G2 620 is a great circle segment on the unit sphere whose direction at each point is the direction of forward angular tilt of the user at that point. It is noted that subsequent tilting of the user's head exactly in the forward direction will cause the end of the apparent gravity vector G to extend the path defined by the great circle arc G1-G2.
  • FIG. 6 illustrates an example of a continued tilt beyond the vector G 2 620, slightly to the left of straight ahead at time n, resulting in the apparent movement of the end of the vector G to a point corresponding to the end of the vector G n 630 at time n.
  • Left and Right directions are inverted because the perspective of Fig. 6 is from outside the sphere looking in, whereas the user's perspective is from the center of the sphere looking out.
  • This method is valid for end-points of the vectors G1 610, G2 620, and Gn 630 located anywhere on the unit sphere, so the device can be mounted in any orientation relative to the user, while allowing accurate determination of the orientation of the device relative to the body part to which it is attached, and of the reference orientation of the user.
  • the magnitude of forward tilt of the user's head below the horizontal plane 770 (see Fig. 7) at time 2 corresponds to an angle θ12 680 representing the change in direction from G1 610 to G2 620.
  • the angle θ12 is equivalent to the length of the great circle segment G1-G2 and, applying known rules of trigonometry, may be determined from the relationship cos θ12 = G1 · G2 (taking G1 and G2 as unit vectors).
  • a back tilt 640 is in the direction φ21, which is the direction of the great circle arc G2-G1 at the point located at the end of the apparent gravity vector G2 620.
  • the presently disclosed orientation and motion-sensing device 210 includes the voice data memory 416, the data processor 408, the audio processor 418, and the speaker 420 (see Fig. 4) , which may be configured to provide a sequence of distinguishable audible cues and action confirmations to the user while he or she performs the calibration method of Fig. 7 or any other suitable function of the device 210.
  • Fig. 7 depicts exemplary audible cues and action confirmations 750 in the English language that may be provided by the device 210 after the user performs the acts depicted in steps 710, 720, 725, and 735.
  • the audible cues and action confirmations 750 are designed to facilitate and confirm proper execution of the various steps in the calibration procedure.
  • the device 210 may be configured to provide the audible cue "Look straight ahead" after step 710. Further, the device 210 may provide the audible confirmation "Level set” after step 720, the audible cue “Lean” after step 725, and the audible confirmation "Direction set” after step 735. It is understood that in alternative embodiments, the device 210 may be configured to provide any other suitable audible, visible, and/or tactile cues and action confirmations to the user, using any other suitable language, in order to facilitate device operation.
  • the device 210 may include one or more vibrating transducers (not shown) to provide one or more tactile cues and/or action confirmations against the user's skin.
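  • A sketch of how the cue/confirmation sequence of Fig. 7 might be driven in software, assuming a blocking speech call and a simple "hold still" detector; all function names are hypothetical, and the hold-still detector stands in for the cal pushbutton or program-controlled triggering mentioned above.

```python
import time

def speak(phrase):
    # Placeholder for the device's voice output (stored or synthesized speech).
    print(f"[audio] {phrase}")

def wait_until_still(read_gravity, hold_seconds=1.0, tolerance_g=0.02):
    """Block until successive apparent-gravity readings stay within a small
    band for hold_seconds, i.e. the user is holding the requested pose."""
    reference, start = read_gravity(), time.time()
    while time.time() - start < hold_seconds:
        sample = read_gravity()
        if max(abs(a - b) for a, b in zip(sample, reference)) > tolerance_g:
            reference, start = sample, time.time()   # restart the hold timer
        time.sleep(0.05)
    return reference

def run_calibration(read_gravity):
    """Cue the user through the two-pose calibration and return (G1, G2)."""
    speak("Look straight ahead")
    g1 = wait_until_still(read_gravity)
    speak("Level set")
    speak("Lean")
    time.sleep(2.0)   # give the user time to start leaning; a real device
                      # would instead detect the onset of movement
    g2 = wait_until_still(read_gravity)
    speak("Direction set")
    return g1, g2
```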
  • the orientation and motion-sensing device 210 may also be configured to provide user feedback in the form of audible phrases stored in the voice data memory 416 or synthesized by the device.
  • the audible user feedback phrases may be constructed and selected by the device 210 under control of the data processor 408, which may sequence the phrases in response to user motions monitored by the device.
  • the calibration method of Fig. 7 includes an exemplary use of such phrases as cues to guide the user in executing specific and desired motions (e.g., "Look straight ahead"), and to confirm to the user the proper or improper execution of a step or sequence of steps (e.g., "Level set", "Direction set").
  • Such audible user feedback phrases may also be employed in physical rehabilitation and evaluation applications to cue the user while performing physical therapy exercises, e.g., "Raise your arm slowly as far as it can go", "Stand on your right foot until you are told to stop", "The left elbow flexion will now be measured", or "Attach the device to the left wrist and stabilize the humerus".
  • Suitable sequences of user guidance and feedback phrases can be programmed into the device 210, for example through the PC/Network Interface 422 (see Fig. 4) according to a specific plan of desired user motions, in response to an analysis of user motions, or a combination thereof.
  • the orientation and motion-sensing device 210 may be configured to provide audible performance feedback to the user that is contextual to a specific application.
  • the desired performance feedback in response to a tilt in the forward direction may be "You are leaning forward"
  • the desired performance feedback in response to the same forward tilt may be “Go Back” , "Keep your head up” , or "You are about to fall over” .
  • the desired performance feedback in response to a maximum limb extension that is below a specified lower limit may be "Stretch a little farther”
  • the desired performance feedback in response to exceeding a specified upper limit may be "You've gone too far”.
  • the desired performance feedback may be "Your extension is 85°", "Your maximum extension was 135°” or, in the case of blind measurements, the desired performance feedback may be "Measurement number 4 has been recorded” .
  • the device 210 may also provide feedback that tracks user progress, using phrases such as "Repetition three completed, seven more to go” , or "Your average head tilt over the past five minutes was 5° and your average direction was 45° to the right of straight ahead” .
  • the device 210 may provide user feedback corresponding to the number of times a local minimum or maximum point satisfying certain specified conditions has been reached.
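  • A small sketch of counting repetitions as local maxima of the tilt magnitude that satisfy a specified condition (here, exceeding a minimum peak angle); the threshold and function name are illustrative.

```python
def count_repetitions(tilt_history_deg, min_peak_deg=30.0):
    """Count local maxima of the tilt magnitude that exceed min_peak_deg.

    tilt_history_deg is a sequence of tilt magnitudes sampled over time;
    each qualifying peak is treated as one completed repetition.
    """
    reps = 0
    for prev, cur, nxt in zip(tilt_history_deg,
                              tilt_history_deg[1:],
                              tilt_history_deg[2:]):
        if prev <= cur > nxt and cur >= min_peak_deg:
            reps += 1
    return reps

print(count_repetitions([5, 20, 40, 25, 10, 35, 45, 30, 8]))  # -> 2
```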
  • the orientation and motion-sensing device 210 may be incorporated into a hand-held device such as a cell-phone or a computer game control.
  • the device 210 may be configured to announce the phrase "Call sent to voice-mail" in response to an outward flick of the user's wrist, e.g., when there is a call waiting.
  • the device 210 may be configured to announce the phrase "Your opponent is defeated" after the user has moved the device through a correct sequence of target orientations.
  • the device 210 may be configured to allow selection and/or programming, via the PC/Network Interface 422, of a particular individual's voice, e.g. , a teacher, a sports celebrity, etc., or a particular language, e.g. , English, French, German, Italian, Chinese, Japanese, Korean, etc., to provide the user feedback.
  • the orientation and motion-sensing device 210 may be configured to initiate a particular operational mode in response to a specified sequence of user movements.
  • the device 210 may be configured to initiate a posture-monitoring operational mode in response to a specified sequence of movements while the user is practicing or participating in a round of golf. In this way, the user can initiate the posture-monitoring mode of the device 210 without having to release his or her golf club.
  • the sequence of user movements includes at least two steps performed in a specified order, in which each step requires the user to look in a specified direction.
  • the device 210 may provide audible, visible, and/or tactile confirmation of the proper execution of the ordered steps.
  • the sequence of user movements is designed to assure that the user is unlikely to perform the movements unintentionally.
  • a user engaged in a round of golf may initiate the posture monitoring mode of the device 210 by performing a specified sequence of movements, which, based on the resulting orientations of the device 210 relative to the user, effectively causes the apparent gravity vector G to retrace the path from a direction corresponding to the vector G 2 620 (see Fig. 6) to a direction corresponding to the vector G 1 610 (see Fig. 6) , and back to the direction corresponding to the vector G 2 620.
  • the specified and corresponding sequence of user movements may include addressing the golf ball, looking at the horizon, and addressing the golf ball again.
  • the device 210 may provide audible, visible, and/or tactile confirmations of the proper execution of each user movement in the specified sequence.
  • a tolerance circle may be provided around the locations of the vectors G2 and/or G1 so that the user is not required to look exactly at a particular point on the horizon or to address the golf ball in a precise manner in order for the device 210 to recognize the user's intent to initiate a particular operational mode.
  • a tolerance circle of 10° or any other suitable size may be provided.
  • the directions of the vectors G 1 and G 2 corresponding to the first and second target orientations of the device 210 may be replaced by two other orientations that are related geometrically to the directions of the vectors G 1 and G 2 , so long as these orientations correspond to convenient visual targets for the user.
  • Fig. 9a is a diagram of a geometric model 900 that may be employed to illustrate a technique of distinguishing the effects of acceleration on the orientation and motion-sensing device 210 (see Figs. 2a-2e) from the effects of tilt.
  • the accuracy of the measurement of the apparent gravity vector G by the device 210 can decrease in the presence of acceleration, and, in certain applications (such as physical activity monitors) , it is useful to estimate the magnitude and direction of the acceleration vector as precisely as possible in order to improve the accuracy of the measurement of physical activity.
  • the geometric model 900 of Fig. 9a is constructed with reference to both the spherical and Cartesian coordinate systems and only one octal portion of the unit sphere is shown.
  • an X-axis 902 represents the sensitive axis of the X-axis sensor 402 (see Fig. 4)
  • a Y-axis 904 represents the sensitive axis of the Y-axis sensor 404 (see Fig. 4)
  • a Z-axis 906 represents the sensitive axis of the over-range sensor 440.
  • an apparent gravity vector G 910 is the sum of an actual gravity vector G a 920 and an oscillating acceleration vector ⁇ 930, which has its origin at the endpoint of the vector G a 920, i.e.,
  • such an oscillating acceleration vector ⁇ 930 may occur when the user is running and the dominant direction of the acceleration vector ⁇ 930 is up-down relative to the user, or when the user is rowing and the dominant direction of the vector ⁇ 930 is front-back relative to the user.
  • signals representing the angles θG 914 and φG 916 are low-pass filtered by low-pass filter components 940 and 950 (see Fig. 9b), respectively, to suppress just the effects of the acceleration vector μ 930 (see Fig. 9a) from the apparent gravity vector G 910 (see Fig. 9a).
  • the architectures and coefficient values of the filters 940 and 950 can be chosen to perform this step if the magnitude variation of the acceleration vector μ 930 is in a sufficiently higher frequency band than the directional variation of the actual gravity vector Ga 920 (a numerical sketch of this separation follows this list).
  • a 15-tap, 0.5 Hz FIR filter with a 6 Hz sampling rate will attenuate a periodic acceleration vector μ 930 with a period of 1 second by 33 dB, while attenuating a periodic Ga 920 with a period of 9 seconds by less than 1 dB.
  • the low-pass filters 940 and 950 generate output signals corresponding to angles θGa and φGa, respectively, which define the direction of the actual gravity vector Ga.
  • signals representing the angles ⁇ Ga and ⁇ G a are converted to Cartesian coordinates by the converter 960 (see Fig. 9b) , as depicted in step 1004.
  • the signals representing the angles θG 914 and φG 916, and the length ρG, are converted to Cartesian coordinates by the converter 970 (see Fig. 9b), as depicted in step 1006.
  • a representation 980 (see Fig. 9b) of the acceleration vector μ 930 is obtained at a summation node 990 (see Fig. 9b) by subtracting the actual gravity vector Ga from the apparent gravity vector G, as depicted in step 1008.
  • the method of Fig. 10 allows accurate measurements of the direction and magnitude of acceleration of the device 210 to be obtained without having to calibrate the alignment of the device to the user, without knowing a priori the direction of the acceleration relative to the device 210, and without requiring the device's orientation relative to the user to remain constant.
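The low-pass separation outlined in the items above can be summarized in a short numerical sketch. The sketch below is illustrative only: the function and variable names are hypothetical, the spherical-to-Cartesian convention is assumed to match the one used by the device, and the filter parameters follow the 15-tap, 0.5 Hz, 6 Hz example cited above.

```python
import numpy as np
from scipy.signal import firwin, lfilter

FS = 6.0        # sampling rate (Hz), per the 6 Hz example above
NUM_TAPS = 15   # 15-tap FIR filter
CUTOFF = 0.5    # 0.5 Hz cutoff

def to_cartesian(theta, phi, rho=1.0):
    # Assumed convention: cos(phi) = z / rho and cos(theta) = x / sqrt(x^2 + y^2),
    # consistent with formulas (1)-(2) of the description.
    x = rho * np.sin(phi) * np.cos(theta)
    y = rho * np.sin(phi) * np.sin(theta)
    z = rho * np.cos(phi)
    return np.stack([x, y, z], axis=-1)

def split_gravity_and_acceleration(theta_g, phi_g, rho_g):
    """Estimate the actual gravity vector Ga and the residual acceleration mu
    from sampled apparent-gravity angles, by low-pass filtering the angles
    (filters 940/950), converting both results to Cartesian coordinates
    (converters 960/970), and subtracting (summation node 990)."""
    taps = firwin(NUM_TAPS, CUTOFF, fs=FS)      # low-pass FIR design
    theta_ga = lfilter(taps, 1.0, theta_g)      # slow variation: gravity direction
    phi_ga = lfilter(taps, 1.0, phi_g)
    ga = to_cartesian(theta_ga, phi_ga, 1.0)    # actual gravity, unit length
    g = to_cartesian(np.asarray(theta_g), np.asarray(phi_g), np.asarray(rho_g))
    mu = g - ga                                 # acceleration estimate
    return ga, mu
```

In practice the group delay of the FIR filter would also have to be compensated before the subtraction; that detail is omitted from this sketch.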

Landscapes

  • Health & Medical Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Health & Medical Sciences (AREA)
  • Pathology (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Physiology (AREA)
  • Dentistry (AREA)
  • Veterinary Medicine (AREA)
  • General Physics & Mathematics (AREA)
  • Biophysics (AREA)
  • Human Computer Interaction (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • Public Health (AREA)
  • Physical Education & Sports Medicine (AREA)
  • Geometry (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)

Abstract

Improved apparatus and methods of sensing or monitoring body orientation and motion, and measuring range of motion (ROM) for use in athletic training, physical rehabilitation and evaluation. The apparatus is attachable to an object for monitoring, and includes a 3-axis sensor, at least one memory, and at least one processor. The sensor senses a magnitude of tilt along first, second, and third axes, the memory stores data representing the sensed magnitudes of tilt, and the processor processes the data stored in the memory. In one embodiment, the processor determines an angle between each of the three axes and a horizontal plane, and selects the two axes corresponding to the two smallest angles between the three axes and the horizontal plane. The processor then generates an indication of the orientation of the object based upon the sensed magnitudes of tilt along the two selected axes.

Description

TITLE OF THE INVENTION
ORIENTATION AND MOTION SENSING IN ATHLETIC TRAINING SYSTEMS, PHYSICAL REHABILITATION AND EVALUATION SYSTEMS, AND HAND-HELD
DEVICES
CROSS REFERENCE TO RELATED APPLICATIONS
This application claims benefit of U.S. Provisional Patent Application No. 60/698,995 filed July 13, 2005 entitled MONITORING, EVALUATION AND TRAINING SYSTEM FOR ATHLETICS AND PHYSICAL REHABILITATION INCLUDING STUDENT UNIT AND REMOTE UNIT COMMUNICABLE THEREWITH, and U.S. Provisional Patent Application No. 60/719,161 filed September 21, 2005 entitled MONITORING, EVALUATION AND TRAINING SYSTEM FOR ATHLETICS AND PHYSICAL REHABILITATION INCLUDING STUDENT UNIT AND REMOTE UNIT COMMUNICABLE THEREWITH.
STATEMENT REGARDING FEDERALLY SPONSORED RESEARCH OR
DEVELOPMENT -- Not applicable --
BACKGROUND OF THE INVENTION
The present invention relates generally to the fields of athletic training, physical rehabilitation and evaluation, and physical activity monitoring, and more specifically to apparatus and methods of monitoring the orientation of body parts, measuring the range of motion of joints or limbs of the body, measuring levels of physical activity, and providing cuing and measurement feedback for training and rehabilitation purposes. The present invention also relates to hand-held devices for sensing the orientation and motion of body parts or other objects. Athletic training systems and apparatus are known that may be employed to monitor the orientation or movement of a user's body as he or she engages in a particular sporting activity. For example, a conventional athletic training system may be attached to the user's head or any other suitable body part, and may include a number of tilt sensors for detecting the direction of tilt of the user's head relative to a user reference orientation (such as a "straight ahead" reference orientation) and/or to an adjustable tilt threshold magnitude. Such a conventional system may provide the user with one or more visible or audible indications of the orientation or movement of his or her body in real time. The user may employ the system to train his or her body to maintain a desired body posture or to execute a desired movement while performing a particular sporting activity. In addition, patients receiving physical therapy for balance disorders or other posture-related conditions may use the system to monitor their progress while performing rehabilitation exercises, or to monitor their posture as they go about their daily activities.
Although athletic training systems like the conventional system described above may be employed in various sporting and physical therapy applications, such systems have drawbacks. For example, in some conventional athletic training systems, the tilt sensors include accelerometers, which, because accelerometers are responsive to both acceleration and tilt, can generate misleading signals when the body part is accelerating. Further, when an accelerometer is used as a tilt sensor, the sensitivity and accuracy of the accelerometer are generally high when the sensitive axis of the accelerometer is close to horizontal, i.e., parallel to the earth's surface, but typically worsen as the sensitive axis of the accelerometer becomes vertical, i.e., perpendicular to the earth's surface. Moreover, it is often desirable to mount such athletic training systems in various orientations and/or on different parts of the user's body to suit a particular application and/or for aesthetic reasons. However, conventional athletic training systems typically require the sensitive axes of the tilt sensors to be precisely aligned relative to corresponding axes of the user. For example, when the system is attached to the user's headband, baseball helmet, or golf cap, the sensitive axis of one tilt sensor may have to be precisely aligned with the left/right axis of the user's head, while the sensitive axis of another tilt sensor may have to be precisely aligned with the front/back axis of the user's head. In addition, some users of conventional athletic training systems may be incapable of recognizing or responding to the visible or audible indications provided by the system. Alternatively, the type of visible or audible feedback provided by the conventional system may be insufficient in some applications, e.g. , when an attending therapist requires quantitative feedback relating to the user's balance skill level, range of motion, conformance to a requested motion or sequence of motions, and/or in applications where users may require guidance or instruction from the training system itself in the absence of the trainer or therapist . The visible or audible feedback may also be inappropriate or unduly distracting to others, e.g., when the system is used in public places.
Athletic training systems are also known that employ tilt sensors in combination with one or more angular rate sensors such as gyroscopes for sensing and analyzing sequences of movement rather than just monitoring orientation. However, in addition to the drawbacks of conventional athletic training systems listed above, angular rate sensors can be more expensive and larger than accelerometers, can consume more power, and can exhibit significant drift errors .
Athletic training systems are also known that employ techniques to "arm" the system, i.e., to initiate monitoring activity based upon an analysis of user movement. To initiate the monitoring activity, they typically require the user to maintain a steady position for a specified time-period, which can lead to errors because the user can sometimes remain motionless with no intention of initiating the monitoring activity.
A number of systems for measuring the range of motion (ROM) of a body part about a joint or limb of the user's body are also known. For example, one such system that may be employed in physical rehabilitation applications includes a pair of accelerometers to compensate for the reduction in sensitivity and accuracy that can occur as the sensitive axis of a single accelerometer becomes vertical. The pair of accelerometers of this system needs to be aligned with the intended axis of rotation of the measured body part. In addition, to reduce the generation of erroneous or misleading signals when detecting the tilt of a body part that is undergoing acceleration, the system monitors the outputs of each accelerometer for either a varying signal or an over-range signal, which can be indicative of such acceleration.
However, the above-described conventional system for measuring range of motion also has drawbacks. For example, the acceleration of a body part can cause a distortion in the sensor reading that is not characterized by an over-range or varying signal output, and the system may be incapable of detecting such a condition. Further, the system must typically be manipulated while the measurements are being taken, for example, to trigger a reading when determining the initial orientation or maximum extension during range of motion (ROM) measurements. Moreover, the system must typically be repositioned to perform multiple measurements on a single joint to re-establish precise alignment of the sensors with each new axis of motion. As a result, it can be difficult to establish and/or maintain a precise alignment of the system with an axis and/or fulcrum of a joint or bone. Such alignment and re-alignment of the system may also interfere with or slow down the measurement process, thereby making the measurement process inaccurate, or painful for the user. In addition, when this system is used for diagnostic or physical rehabilitation purposes, the measurement process may interfere with the visual and/or tactile communication between a physical therapist and his or her patient, and/or an additional attendant may be required to take the actual measurement readings .
Another known system for monitoring physical activity may be employed in pedometers and other activity-monitoring devices. In such a system, the primary objective is to measure accurately the magnitude of an oscillating acceleration, such as an up-down acceleration of a runner or a front-back acceleration of a rower, which is subsequently used to estimate activity level and/or for other purposes. The system includes a plurality of accelerometers disposed in different directions . Signals generated by the accelerometers are compared, and, in response to the signal comparison, one of the accelerometers is selected as being aligned closest to the direction of user acceleration of interest. However, this system has drawbacks in that there is a practical limit to the number of accelerometers that may be employed in the system. Further, the likelihood that any one of the accelerometers will be oriented precisely in the direction of user acceleration may be low.
A number of hand-held devices for sensing motion are also known. For example, one such hand-held device includes a 2-axis accelerometer operative to control the position of a graphical pointer on a display screen. To reduce undesirable pointer movements when responding to the tilt of the accelerometer as it also undergoes acceleration, the device filters out the DC and low frequency components of the accelerometer output, and inserts a new DC component in the system output with a slow feedback loop to maintain correspondence between the average tilt of the accelerometer and the center of the screen. One drawback of this device is that it does not provide a measurement of the actual magnitude of the accelerometer output. In addition, this device fails to address the reduction in sensitivity and accuracy that can occur as the sensitive axis of the accelerometer becomes vertical.
It would therefore be desirable to have improved apparatus and methods of sensing or monitoring body orientation and motion and measuring range of motion (ROM) , for use in athletic training, physical rehabilitation and evaluation, and any other suitable physical activity or exercise . Such improved apparatus for sensing orientation and motion would avoid the drawbacks of the above- described conventional systems and apparatus . It would also be desirable to have an improved method of sensing orientation and motion that can be used in hand-held devices .
BRIEF SUMMARY OF THE INVENTION
In accordance with the present invention, improved apparatus and methods of sensing or monitoring body orientation and motion and measuring range of motion (ROM) are disclosed, for use in athletic training, physical rehabilitation and evaluation, and any other suitable physical activity or exercise. In one embodiment of the present invention, an apparatus for monitoring the orientation of an object in 3-dimensional space is provided, including a 3-axis sensor, at least one memory, and at least one processor. The apparatus is configured to be attached to, mounted to, held against, or otherwise disposed in contact with the object to be monitored. The 3-axis sensor is configured to sense a magnitude of tilt along each of a first axis, a second axis, and a third axis, the memory is operative to store data representing the sensed magnitude of tilt along each of the three axes, and the processor is operative to process the data stored in the memory. Specifically, the processor determines an angle between each of the first, second, and third axes and the horizontal plane, and selects the two axes with the two smallest such angles. The processor then generates an indication of the orientation of the object based upon the sensed magnitude of tilt along the two selected axes. In this way, the apparatus provides increased sensitivity and accuracy in substantially any orientation relative to the object to which it is attached, even when one of the sensitive axes of the 3-axis sensor becomes vertical.
This first embodiment of the present invention may also be employed to detect the presence of acceleration. Specifically, the apparatus is attached to, mounted to, or held against the object to be monitored. Next, the apparent gravity force acting on the apparatus is measured. Next, the direction of the actual gravity force is determined by analyzing the variation in the apparent gravity force. A first vector representing the actual gravity force is then subtracted from a second vector representing the apparent gravity force to obtain a third vector representing the acceleration of the object. Next, an indication of the direction and/or the magnitude of the third vector is generated, thereby providing an indication of the acceleration of the object.
Another embodiment of the present invention may be employed in athletic training or any other suitable physical activity or exercise to determine a reference orientation of a user. The direction of tilt of a body part of the user can then be determined relative to the user's reference orientation, independently of the mounted orientation of the sensing apparatus. This embodiment of the present invention may be employed, for example, to monitor the direction and magnitude of tilt of the user's head while he or she plays tennis or golf. Specifically, the user's body part is positioned in a first orientation, and an apparent gravity force acting on the body part is measured to obtain a first direction of the apparent gravity force. Next, the body part undergoes an angular displacement about at least one axis from the first orientation to a second orientation, and the apparent gravity force acting on the body part is measured again to obtain a second direction of the apparent gravity force. The reference orientation of the user is then determined based upon the first and second directions of the apparent gravity force, and stored in memory. Because the user's reference orientation is stored in memory, directions of subsequent angular displacements of the body part can be determined relative to the stored reference orientation.
Still another embodiment of the present invention may be employed in physical rehabilitation and evaluation applications . For example, this embodiment of the present invention may be employed to measure the extension of a body part around a fixed joint fulcrum. First, a housing including a sensor is disposed against the body part. Next, the body part is positioned in a first orientation relative to the joint. The sensor then measures an apparent gravity force acting on the housing disposed against the body part to obtain a first direction of the apparent gravity force. Next, the body part is positioned in a second orientation relative to the joint. The sensor then measures the apparent gravity force acting on the housing at the second orientation to obtain a second direction of the apparent gravity force . A magnitude of rotation of the body part from the first orientation to the second orientation can then be determined based upon the first and second directions of the apparent gravity force, independent of the alignment between the body part and the housing.
In yet another embodiment of the present invention, the monitoring of the orientation of a body part can be initiated by a specified sequence of user motions, thereby obviating the need to manipulate the orientation and motion-sensing apparatus directly. In this embodiment, a sensor is disposed against the body part. Next, the body part is positioned in a first orientation, and the sensor is operated to provide data representing a first position of the body part. The body part is then positioned in at least one second orientation, and the sensor is operated to provide data representing at least one second position of the body part. If the first and second positions of the body part correspond to a specified sequence of user positions, then monitoring of the orientation of the body part by the sensor is initiated.
In another embodiment of the presently disclosed invention, useful feedback is provided to a user based upon the direction and/or extent of one or more rotations of a body part to which it is attached. In this embodiment, the apparatus includes a sensor, at least one memory, at least one processor, and an audio output system. The sensor is configured to sense an angular orientation of the body part, and to provide data representing the sensed angular orientation. The memory is operative to store data representing a plurality of words or phrases, and the audio output system generates an audible message in response to an electronic input. The processor monitors the data provided by the sensor, and accesses data stored in the memory corresponding to at least one word or phrase relating to the sensed angular orientation of the body part. In cooperation with the audio output system, the processor generates a message audible to the user that corresponds to the accessed word or phrase. For example, the word or phrase may include at least one instructional word or phrase for the user, or a confirmation of the start or completion of a specified act performed by the user or the apparatus during the course of monitoring the orientation of the body part. In alternative embodiments, the orientation and motion-sensing apparatus may provide feedback to the user in the form of one or more visible and/or tactile outputs.
Other features, functions, and aspects of the invention will be evident from the Detailed Description of the Invention that follows .
BRIEF DESCRIPTION OF THE SEVERAL VIEWS OF THE DRAWINGS
The invention will be more fully understood with reference to the following Detailed Description of the Invention in conjunction with the drawings of which:
Fig. 1 illustrates a conventional orientation and motion- sensing device attached to the head of a user;
Fig. 2a illustrates an orientation and motion-sensing device according to the present invention;
Fig. 2b-2e depict a back end view, a back side view, a top view, and a front side view, respectively, of the device of Fig. 2a;
Fig. 3 illustrates a feedback pattern of the device of Fig. 2a with reference to various head orientations of a user;
Fig. 4 is a block diagram of the device of Fig. 2a;
Fig. 5 is a diagram of a geometric model illustrating the operation of the device of Fig. 2a;
Fig. 6 is a diagram of a geometric model illustrating a sequence of user movements that may be performed to calibrate the alignment between the device of Fig. 2a and a reference orientation of the user, and how to determine the direction and magnitude of subsequent user deviation from the reference orientation;
Fig. 7 is a flow diagram of a method of calibrating the device of Fig. 2a, corresponding to the sequence of user movements of Fig. 6;
Fig. 8 is a diagram of a geometric model illustrating an alternate sequence of user movements that may be performed when calibrating the device of Fig. 2a, and how to determine the magnitude of subsequent deviation from the calibrated orientation;
Fig. 9a is a diagram of a geometric model illustrating the operation of the device of Fig. 2a when the device is being subjected to a combined stimulus of tilt and periodic acceleration;
Fig. 9b is a schematic diagram illustrating a technique for discriminating between the tilt and motion stimuli of Fig. 9a; and
Fig. 10 is a flow diagram of a method of performing the technique of Fig. 9b.
DETAILED DESCRIPTION OF THE INVENTION
The disclosures of U.S. Provisional Patent Application No. 60/698,995 filed July 13, 2005 entitled MONITORING, EVALUATION AND TRAINING SYSTEM FOR ATHLETICS AND PHYSICAL REHABILITATION INCLUDING STUDENT UNIT AND REMOTE UNIT COMMUNICABLE THEREWITH, and U.S. Provisional Patent Application No. 60/719,161 filed September 21, 2005 entitled MONITORING, EVALUATION AND TRAINING SYSTEM FOR ATHLETICS AND PHYSICAL REHABILITATION INCLUDING STUDENT UNIT AND REMOTE UNIT COMMUNICABLE THEREWITH, are incorporated herein by reference in their entirety. Fig. 1 depicts a conventional orientation and motion- sensing device 110 attached to the head of a user. For example, the conventional device 110 may be attached to a headband 120 using one or more Velcro™ fasteners. As shown in Fig. 1, the device 110 may be attached to the user's headband to position the device over his or her right ear. The device 110 includes a number of tilt indicators (not shown) operative to detect and monitor the orientation of the user's head, i.e., the direction and magnitude of tilt, relative to a reference orientation and/or to an adjustable tilt magnitude threshold. For example, the device 110 may be employed to monitor the direction and magnitude of tilt of the user's head while the user plays tennis. To establish the reference orientation, the user mounts the device 110 over his or her right ear, and assumes a suitable tennis posture such as a vertical stance. The device 110 then measures the tilt of the device relative to X and Y-axes 130 and 140 of the user's head to establish the reference orientation. After the reference orientation has been established, the device 110 monitors the tilt of the user's head, e.g., from left to right, from right to left, from front to back, and/or from back to front, while the user plays tennis. If the magnitude of tilt in any direction exceeds the adjustable magnitude threshold, then a visible and/or audible alarm is generated to indicate the dominant direction of tilt.
Proper operation of the conventional orientation and motion-sensing device 110 of Fig. 1 depends highly upon the positioning and orientation of the device 110 relative to the user's head. For example, if the user were to position the device 110 over the left ear instead of the right ear, as depicted in Fig. 1, then the device 110 would incorrectly interpret head tilting from front to back as tilting from back to front, and head tilting from left to right as tilting from right to left. In addition, as shown in Fig. 1, the X-axis 150 of the device 110 is not precisely aligned with the X-axis 130 of the user's head, i.e., the X-axis 130 of the user's head points slightly to the right of the X-axis 150 of the device 110. As a result, the device 110 may provide inaccurate directional feedback to the user, especially when the user tilts his or her head in the front-left, front-right, back-left, or back-right directions.
Figs. 2a-2e depict an illustrative embodiment of an orientation and motion-sensing device 210, in accordance with the present invention. The orientation and motion-sensing device 210 provides proper operation and increased accuracy when attached in substantially any orientation relative to the user, and accurately measures changes in body-part orientation whether or not these changes are in directions aligned with the device's sensitive axes. In addition, the device 210 provides a technique for calibrating a reference orientation of the user so that the device correctly tracks changes in the user's posture or body orientation. The device 210 also maintains high sensitivity and accuracy as a sensitive axis of the device becomes vertical, reduces the generation of erroneous or misleading signals in the presence of acceleration, and estimates the magnitude and direction of the acceleration. In addition, the device 210 provides visible, audible (e.g., human speech), and/or tactile feedback to the user.
Fig. 2a depicts the orientation and motion-sensing device 210 attached to the head of a user. As shown in Fig. 2a, the device 210 may be attached to a golf cap 220 using one or more Velcro™ fasteners. It should be understood, however, that the device 210 may alternatively be attached directly or indirectly to any other suitable body part of the user, or any suitable article of clothing or accessory of the user, using any other suitable type of fastener. Further, the device 210 may be incorporated into an article of clothing or accessory, may be held by the user, may be held against the user by an attendant, or may be incorporated into a hand-held device such as a cell phone, a computer game control, or any other suitable hand-held device. In the presently disclosed embodiment, the device 210 is configured to provide one or more visible and/or audible indications of the orientation or movement of the user's body in real time. For example, the device 210 may be employed to monitor the orientation or movement of the user as he or she engages in a sporting or leisure activity such as golf, tennis, fencing, sculling, running, walking, bicycling, dancing, or any other suitable activity. The device 210 may also be employed by physical' therapy patients as an aid in performing rehabilitation exercises or to palliate the effects of a loss of balance ability, which may have resulted from an accident, physical and/or mental degradation, or illness.
For example, if the device 210 is attached to the user's head, as depicted in Fig. 2a, then the device 210 may be used to monitor the tilt of the user's head while he or she plays golf. Specifically, the device 210 is operative to monitor the tilt of the user's head relative to an X-axis 230 (see Fig. 2e) and a Y- axis 235 (see Fig. 2b) of the device 210. In the illustrated embodiment, the X-axis 230 points approximately straight ahead of the user and the Y-axis 235 points approximately toward the right side of the user when the device 210 is attached to the user's cap, as depicted in Fig. 2a. In this exemplary embodiment, the device 210 monitors the tilt of the user's head, e.g., from left to right, from right to left, from front to back, and from back to front, while the user plays golf. Fig. 3 is a diagram illustrating the various approximate head tilts (i.e., front, back, left, right) of the user relative to the X and Y-axes 310 and 320.
Fig. 2b depicts a back-end view of the orientation and motion-sensing device 210, illustrating a connector 240 for receiving a headphone or earphone jack (not shown) . In other embodiments, connector 240 may be designed to accommodate a battery-charger connector or a network connector, or may be excluded from the system. Fig. 2c depicts a backside view of the device 210 including an inner surface 250 that would normally be disposed against the user, and a speaker 260. In alternative embodiments, one or more visible, alphanumerical, tactile, and/or graphical outputs may be provided instead of, or in addition to, the audible output provided by the headphone/earphone 240 (not shown) or the speaker 260. Fig. 2d depicts a top view of the device 210 including four user controls 270 (e.g., cal, V, mode, Δ) implemented as pushbuttons, and a light emitting diode (LED; not numbered) . Fig. 2e depicts an exemplary front side view of the device 210.
Fig. 4 depicts exemplary functional components 400 included in the orientation and motion-sensing device 210 (see Figs. 2a-2e). As shown in Fig. 4, the functional components 400 include an X-axis sensor 402, a Y-axis sensor 404, a signal multiplexer and analog-to-digital converter (A/D) 406, a data processor 408, a program memory 410, a data memory 412, a user display and controls section 414, a voice data memory 416, an audio output converter 418, a speaker 420, a PC/network interface 422, a wireless networking antenna 424, a wired networking connector 426, a battery and power supply 428, a battery charger connector 430, one or more tactile vibration outputs 432, and an over-range sensor 440. In embodiments that include more than one tactile vibration output, Front/Back/Left/Right tilt indications can be signaled respectively by, for example, the activation of a tactile vibration output attached to the inside of a headband at the forehead, the back of the skull, the left temple, and the right temple. When charging, the battery charger connector 430 is connected to a battery charger (not shown). When in a network configuration, the PC/network interface 422 is connected to a personal computer (PC; not shown), and/or to a point-to-point network/remote-control unit, a local area network (LAN), or a wide area network (WAN) through either the wireless networking antenna 424 or the wired networking connector 426. It is noted that alternative embodiments of the orientation and motion-sensing device 210 may include all or a subset of the functional components illustrated in Fig. 4. For example, in some embodiments, the components 416, 418, 420, 422, 424, 426, 430, 432, and/or 440 may be omitted. In the presently disclosed embodiment, the X-axis sensor 402 and the Y-axis sensor 404 are accelerometers oriented within the device 210 so that their respective sensitive axes, namely, the X-axis 230 and the Y-axis 235, are positioned 90° to one another. It is noted, however, that the X and Y-axis sensors 402 and 404 may employ any other suitable technique for sensing tilt, and may be oriented at any other suitable angle relative to one another.
The X-axis sensor 402 is operative to sense tilt along the X-axis 230 (see Fig. 2e) of the device 210, and the Y-axis sensor 404 is operative to sense tilt along the Y-axis 235 (see Fig. 2b) of the device 210. The X and Y-axis sensors 402 and 404 sense tilt and acceleration along the X and Y axes 230 and 235, respectively, by measuring the projection of a force vector on their respective axes that is the sum of the force of gravity at the location of the device 210 and a force of acceleration applied to the device 210 during use. This force vector is known interchangeably as an apparent acceleration vector or an apparent gravity vector. Fig. 3 depicts the relationship between the X and Y axes 310 and 320 and an exemplary apparent gravity vector G 330. In the description of the operation of the device 210 provided below, the frame of reference is the X and Y axes 310 and 320 of the device 210, while the direction of the apparent gravity vector G 330 relative to the X and Y axes 310 and 320 can change over time, as indicated by a directional arrow 340.
For example, each of the X and Y-axis sensors 402, 404 may be a micro-machined accelerometer such as the ADXL103 accelerometer sold by Analog Devices Inc., Norwood, Massachusetts, U.S.A. Alternatively, the X and Y-axis sensors 402, 404 may be implemented using a single dual-axis accelerometer such as the ADXL322 dual-axis accelerometer sold by Analog Devices Inc. In addition, the signal multiplexer and analog-to-digital converter (A/D) 406 and the data processor 408 may be implemented using the PIC16F777 microcontroller sold by Microchip Technology Inc., Chandler, Arizona, U.S.A., or any other suitable microcontroller or microprocessor. In addition, the audio converter 418 and the voice data memory 416 may be implemented using the ML22Q54 signal processor sold by OKI Semiconductor, Sunnyvale, California, U.S.A., or any other suitable device for storing and processing audio files. In one embodiment, conversion of the voice files is performed using software executing on the data processor 408 instead of being implemented as the separate functional block 418. In addition, the PC/network interface 422 may be a wired or wireless (e.g., infrared or RF) interface for downloading or uploading content to or from the program memory 410, the data memory 412, and/or the voice data memory 416. The PC/network interface 422 may also be configured for controlling the device 210 remotely. Time-stamps and/or sequences of measurements performed by the orientation and motion-sensing device 210 may be stored within the data memory 412 for subsequent local processing, for subsequent feedback to the user, and/or for subsequent uploading to a computer via the PC/network interface 422. In addition, application-specific user feedback phrases, measurement algorithms, and/or cuing sequences may be downloaded to the device 210 from a computer or over a communications network such as the Internet.
The over-range sensor 440 operates as a third tilt sensor, which is oriented at a specified angle to the X-Y plane defined by the sensitive X and Y-axes 310, 320. In one embodiment, the sensitive axis of the over-range sensor 440 is oriented at 90° to the X-Y plane. Like the X and Y-axis sensors 402 and 404, the over-range sensor 440 may be a micro-machined accelerometer such as the ADXL103 accelerometer sold by Analog Devices Inc. Alternatively, the X-axis sensor 402, the Y-axis sensor 404, and the over-range sensor 440 may be implemented using a single micro-machined 3 -axis accelerometer such as the ADXL330 accelerometer sold by Analog Devices Inc. Fig. 5 is a diagram of a geometric model 500 that may be employed to illustrate the operation of the orientation and motion-sensing device 210 (see Pigs. 2a-2e) . Specifically, the geometric model 500, which is constructed with reference to both the spherical and Cartesian coordinate systems, may be used to quantify the device's orientation and acceleration, and to quantify the alignment of the device 210 with the body part to which it is attached. The geometric model 500 includes the entire unit sphere, of which one octal portion is shown. Further, an X- axis 502 represents the sensitive axis of the X-axis sensor 402 (see Fig. 4) , a Y-axis 504 represents the sensitive axis of the Y- axis sensor 404 (see Fig. 4) , and a Z-axis 506 represents the sensitive axis of the over-range sensor 440 (see Fig. 4) . The X and Y-axes 502 and 504 define an X-Y plane, and the Z-axis 506 is oriented 90° to the X-Y plane. In addition, Fig. 5 depicts one possible direction of an exemplary apparent gravity vector G 508. The origin of the vector G corresponds to the origin of the unit sphere . Because the actual gravity vector always points in the same direction, i.e., toward the center of the earth, the device 210 can determine changes in the orientation and acceleration of the body part to which it is attached by monitoring and analyzing changes in the magnitude and direction of the apparent gravity vector G 508 relative to axes X 502, Y 504 and Z 506.
Within the geometric model 500, the length x of an X-vector 510 represents the magnitude of the apparent gravity vector G 508 as measured by the X-axis sensor 402 along the X-axis 502, and the length y of a Y-vector 512 represents the magnitude of the apparent gravity vector G 508 measured by the Y-axis sensor 404 along the Y-axis 504. Similarly, the length z of a Z-vector 514 represents the magnitude of the apparent gravity vector G 508 measured by the over-range sensor 440 along the Z-axis 506. It is noted that the direction of the apparent gravity vector G 508 can be defined by angles θ 520 and φ 522, which may be determined using the formulas
(cos θ)² = x²/(x² + y²)   (1)
(cos φ)² = 1 - (x² + y²)   (2)
Accordingly, using the formulas (1)-(2) above, the direction of the apparent gravity vector G 508 can be determined using the measurements provided by the X and Y-axis sensors 402 and 404 (see Fig. 4). It is noted that, in an alternative embodiment, the X-axis 502 and the Y-axis 504 may be oriented at an angle different from 90° to one another, in which case the formulas (1) and (2) above and the other formulas below may be modified as appropriate using known trigonometric identities.
Those of ordinary skill in this art will appreciate that an accelerometer' s sensitivity to changes in tilt is at a maximum when the sensitive axis of the accelerometer is close to horizontal, and is at a minimum when the sensitive axis of the accelerometer becomes vertical. In the presently disclosed embodiment, the orientation and motion-sensing device 210 (see Figs. 2a-2e) employs the over-range sensor 440 (see Fig. 4) in conjunction with the X and Y-axis sensors 402 and 404 (see Fig. 4) to determine the direction of the apparent gravity vector G 508 over the entire unit sphere, thereby allowing the device 210 to provide an accurate measurement of tilt in any orientation of the device .
Specifically, as discussed above, the length of the X- vector 510, the length of the Y-vector 512, and the length of the Z-vector 514 represent the magnitudes of acceleration measured by the X-axis sensor 402, the Y-axis sensor 404, and the over-range sensor 440, respectively, when the device 210 is acted upon by the apparent gravity vector G 508. In the absence of acceleration, the length p 524 (see Fig. 5) of the apparent gravity vector G 508, normalized to the gravitational field at the earth's surface, may be expressed as
ρ² = x² + y² + z² = 1.   (3)
It is noted that the representation of the direction θ 520 and φ 522, and the length ρ 524, of G 508 is in the spherical coordinate system for illustrative purposes only, and that all of the angles and formulas expressed in this application can be represented and expressed equivalently in other 3-dimensional coordinate systems by those of ordinary skill in this art.
To extend the calculation of the angles θ 520 and φ 522 of the vector G 508 to orientations of the device 210 (see Figs. 2a-2e) where the sensitive axis of either the X-axis sensor 402 or the Y-axis sensor 404 is more vertical than the sensitive axis of the over-range sensor 440, an appropriate substitution between the variables x, y and z is performed using formula (3) so that the two most-horizontal sensors are used for each calculation. The formulas (1)-(2) above may then be employed to determine the angles θ 520 and φ 522 of the vector G 508. In this way, the measurement provided by the single over-range sensor 440 can be used to extend the calculation of the angles θ 520 and φ 522 of the vector G 508 over the entire unit sphere without loss of precision.
It will be apparent to those of ordinary skill in this art that, in a Cartesian system, two of the three x, y, z axes are always within arcsin(2/3) = 41.81° of the horizontal plane. The direction of the vector G 508 can therefore be determined over the entire unit sphere using just three accelerometers, and with measurements that are always taken within 41.81° of the horizontal plane .
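To make the axis-substitution step concrete, the following sketch (not taken from the patent itself) selects the two most nearly horizontal axes from the three accelerometer readings and applies formulas (1)-(2) to them. Sign and quadrant bookkeeping, and the relabeling of the resulting angles back onto the device axes, are simplified, and the function name is illustrative only.

```python
import math

def tilt_direction(x, y, z):
    """Estimate the direction of the apparent gravity vector using only the
    two most nearly horizontal sensor readings, in the spirit of formulas
    (1)-(3).  Returns the name of the excluded (most vertical) axis together
    with theta and phi in radians."""
    readings = {"x": x, "y": y, "z": z}
    # The most vertical axis carries the largest share of gravity, so it is
    # the one excluded from the ratio in formula (1).
    vertical = max(readings, key=lambda k: abs(readings[k]))
    a, b = (readings[k] for k in ("x", "y", "z") if k != vertical)
    horiz_sq = a * a + b * b
    # Formula (1), applied to the two most horizontal readings a and b.
    theta = math.acos(math.sqrt(a * a / horiz_sq)) if horiz_sq > 0.0 else 0.0
    # Formula (2): cos^2(phi) = 1 - (a^2 + b^2), assuming |G| is close to 1.
    phi = math.acos(math.sqrt(max(0.0, 1.0 - horiz_sq)))
    return vertical, theta, phi
```

With the device lying flat (x = y = 0, z ≈ 1), for instance, the Z-axis is identified as the vertical one and the calculation relies on the X and Y sensors, which are then operating in their most sensitive region.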
It is noted that the accuracy of the measurement of the direction of the apparent gravity vector G by the orientation and motion-sensing device 210 (see Figs. 2a-2e) can decrease when the device undergoes acceleration. Whether or not the device 210 is undergoing acceleration, and how much the acceleration is affecting the accuracy of the measurement of the apparent gravity vector G by the device, can be determined based upon the calculated length ρ 524 of the apparent gravity vector G, using formula (3) without substitution between the three variables x, y and z. For example, if the length ρ 524 is greater than or less than 1 (ρ > 1, ρ < 1), then it may be concluded that the device 210 is being subjected to acceleration in addition to the force of gravity. It is noted that, for small changes in the value of ρ, the worst-case error in the calculation of the direction of the apparent gravity vector G is about 1° for a 1.75% change in ρ. In one embodiment, if the error in the calculation of the apparent gravity vector G is significant for a given application, then the device 210 provides a suitable visible, tactile and/or audible warning to the user.
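As a minimal sketch of this check (with illustrative names and an adjustable threshold), the length ρ can be computed from the three readings and compared against the roughly 1.75% deviation at which the worst-case direction error reaches about 1°:

```python
import math

def apparent_gravity_length(x, y, z):
    """Length rho of the apparent gravity vector from the three sensor
    readings, normalized so that rho equals 1 at rest (formula (3))."""
    return math.sqrt(x * x + y * y + z * z)

def acceleration_present(x, y, z, tolerance=0.0175):
    """Flag samples whose length deviates from 1 by more than about 1.75%,
    the point at which the worst-case direction error reaches roughly one
    degree according to the description; the threshold is adjustable."""
    return abs(apparent_gravity_length(x, y, z) - 1.0) > tolerance
```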
Fig. 6 is a diagram of a geometric model 600 that may be employed to illustrate a technique of determining the orientation of the device 210 (see Figs. 2a-2e) relative to a body part of a user to which it is attached. Like the geometric model 500 of Fig. 5, the geometric model 600 of Fig. 6 is constructed with reference to both the spherical and Cartesian coordinate systems and displays one octal portion of a unit sphere. Further, an X-axis 602 represents the sensitive axis of the X-axis sensor 402 (see Fig. 4) , a Y-axis 604 represents the sensitive axis of the Y-axis sensor 404 (see Fig. 4) , and a Z-axis 606 represents the sensitive axis of the over-range sensor 440.
An illustrative method of calibrating the alignment of the device 210 with the body part to which it is attached is described below with reference to Figs. 6 and 7. The method of Fig. 7 provides a sequence of steps for quantifying the orientation of the device 210 relative to the body part to which it is attached, and allows the determination of a reference orientation of the user. In this illustrative method, the device is attached to the user's golf cap above his or her right ear (see, e.g., illustration 715 of Fig. 7) . It is understood, however, that the device 210 may alternatively be attached to any other suitable body part (e.g. , the user's chest, back, elbow, etc.), and in any other suitable orientation relative to the user. This is because
the X and Y-axes of the device 210 are not required to be aligned with the corresponding axes of the body part to which it is attached, nor does the degree of misalignment need to be known.
According to the calibration method of Fig. 7, the user first stands vertically, looking towards the horizon, as depicted in step 705. It is noted that the user may alternatively look in any other suitable direction. Next, the user triggers a first calibration of the device 210 at time 1 by depressing one or more suitable user controls (see, e.g., the cal pushbutton 270 of Fig. 2), as depicted in step 710. The user then holds his or her vertical standing orientation, as depicted in step 720, while the device 210 captures the first calibration direction of the apparent gravity vector G, as indicated by the vector Gi 610 (see Fig. 6) . As shown in Fig. 6, the vector Gi 610 does not necessarily coincide with the Z-axis 606 of the device 210. In other words, the X and/or Y axes 602, 604 of the device 210 are not required to be horizontal. Next, the user tilts his or her head a number of degrees toward the front or forward direction, and triggers a second calibration of the device 210 at time 2 by depressing the cal pushbutton 270, as depicted in step 725. It is noted that the user may alternatively tilt his or her head in any other suitable direction. It is further noted that, in an alternative embodiment, the device 210 may be configured to execute the triggering steps 710 and 725 under program control, allowing the calibration procedure to be performed without requiring the user to manipulate the device . The user then holds the tilted orientation of his or her head, as depicted in step 735, while the device 210 captures the second calibration direction of the apparent gravity vector G, as indicated by the apparent gravity vector G2 620 (see Fig. 6) . The data-capture phase of the calibration is then complete as indicated in step 740.
The device 210 employs the first and second calibration directions of the apparent gravity vector G to determine the orientation of the device relative to the body part to which it is
attached, and the reference orientation of the user. As illustrated in Fig. 6, an arc G1-G2 extending from the end of the apparent gravity vector G1 610 to the end of the apparent gravity vector G2 620 is a great circle segment on the unit sphere whose direction at each point is the direction of forward angular tilt of the user at that point. It is noted that subsequent tilting of the user's head exactly in the forward direction will cause the end of the apparent gravity vector G to extend the path defined by the great circle arc G1-G2. Fig. 6 illustrates an example of a continued tilt beyond the vector G2 620, slightly to the left of straight ahead at time n, resulting in the apparent movement of the end of the vector G to a point corresponding to the end of the vector Gn 630 at time n. Left and Right directions are inverted because the perspective of Fig. 6 is from outside the sphere looking in, whereas the user's perspective is from the center of the sphere looking out.
This method is valid for end-points of the vectors G1 610, G2 620, and Gn 630 located anywhere on the unit sphere, so the device can be mounted in any orientation relative to the user, while allowing accurate determination of the orientation of the device relative to the body part to which it is attached, and of the reference orientation of the user.
In the illustrated embodiment, the magnitude of forward tilt of the user's head below the horizontal plane 770 (see Fig. 7) at time 2 corresponds to an angle ψ12 680 representing the change in direction from G1 610 to G2 620. The angle ψ12 is equivalent to the length of the great circle segment G1-G2 and, applying known rules of trigonometry, may be determined from the expression
cos ψ12 = (sin φ1*sin φ2) + (cos(θ2 - θ1)*cos φ1*cos φ2).   (4)
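As a worked sketch of formula (4) (the function name is illustrative, and the angles are assumed to be in radians in the device's own spherical convention):

```python
import math

def great_circle_angle(theta1, phi1, theta2, phi2):
    """Angle psi12 between two apparent-gravity directions G1 and G2,
    computed per formula (4)."""
    cos_psi = (math.sin(phi1) * math.sin(phi2)
               + math.cos(theta2 - theta1) * math.cos(phi1) * math.cos(phi2))
    return math.acos(max(-1.0, min(1.0, cos_psi)))  # clamp against rounding error
```

The same form, with the calibration index changed, yields formulas (11) and (12) below.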
With respect to the user's orientation, a back tilt 640 is in the direction η21, which is the direction of the great circle arc G2-G1 at the point located at the end of the apparent gravity vector G2. Applying known rules of trigonometry, the direction η21 in the spherical coordinate system may be determined from the expression
cos η21 = (sin φ1 - (sin φ2*cos ψ12)) / (cos φ2*sin ψ12)   (5)
and, similarly, the direction of the user's subsequent tilt η2n 695 at time n can be determined by:
cos η2n = (sin φn - (sin φ2*cos ψn2)) / (cos φ2*sin ψn2)   (6)
In the illustrated embodiment of the method, the device 210 generates an audible message "Front" if tilts subsequent to the calibration are ±45° from the user's Front direction, an audible message "Left" if tilts are ±45° from the Left direction, an audible message "Back" if tilts are ±45° from the Back direction, and an audible message "Right" if tilts are ±45° from the Right direction. The correct feedback from the device 210 can thus be expressed mathematically as
"Back" if η21 - 45° < η2n < η21 + 45°   (7)
"Right" if η21 + 45° < η2n < η21 + 135°   (8)
"Forward" if η21 + 135° < η2n < η21 + 225°   (9)
"Left" if η21 + 225° < η2n < η21 + 315°,   (10)
and the magnitude of tilt ψ2n 690 (see Fig. 6) at time n relative to the second calibration orientation may be determined from the expression
cos ψ2n = (sin φ2*sin φn) + (cos(θn - θ2)*cos φ2*cos φn),   (11)
where the angle ψ2n is the length of the great circle segment G2-Gn.
Fig. 8 is a diagram of a geometric model 800 that may be employed to illustrate the operation of the orientation and motion-sensing device 210 (see Figs. 2a-2e) in applications including a physical rehabilitation and evaluation application involving the determination of a patient's range of motion (ROM). In this illustrative mode of operation, the orientation of the device 210 relative to the body part is fixed but indeterminate, and the direction of the body part motion to be measured is in a vertical plane, but in an unknown direction. This method is thus useful when range of motion in several different directions is to be measured without having to reposition the device between measurements. The geometric model 800 of Fig. 8 is constructed with reference to both the spherical and Cartesian coordinate systems. Only one octal portion of the unit sphere is shown. An X-axis 802 represents the sensitive axis of the X-axis sensor 402 (see Fig. 4), a Y-axis 804 represents the sensitive axis of the Y-axis sensor 404 (see Fig. 4), and a Z-axis 806 represents the sensitive axis of the over-range sensor 440. In this application, the device 210 employs a single orientation calibration at time 1, as indicated by the apparent gravity vector G1 830 (see Fig. 8). For example, the vector G1 830 may correspond to the resting or starting orientation of a limb extension. As the patient extends his or her limb, the end of the apparent gravity vector G moves away from the end of the vector G1 830. Because the orientation of the device relative to the body part is fixed but indeterminate, the end of the apparent gravity vector Gn 860, corresponding to an intermediary or maximum extension, may be located anywhere on a circle 870. The magnitude of extension, which is represented by the magnitude of tilt ψ1n 850 (see Fig. 8), is the length of the great circle segment G1-Gn, where ψ1n can be calculated using the expression
cos ψ1n = (sin φ1*sin φn) + (cos(θn - θ1)*cos φ1*cos φn).   (12)
It is noted that the device 210 may be configured to monitor, capture and store the maximum value of the magnitude of tilt ψ1n for subsequent feedback, thereby allowing the patient to reduce the amount of time needed to hold a limb extension, potentially to a fraction of a second.
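A compact sketch of how relations (7)-(10) and the range-of-motion maximum might be applied in software is shown below; the helper names are hypothetical and the angle bookkeeping is simplified to degrees.

```python
def tilt_direction_label(eta21_deg, eta2n_deg):
    """Map a measured tilt direction eta2n onto the four feedback words,
    following relations (7)-(10); both angles are in degrees and eta21 is
    the calibrated Back direction."""
    d = (eta2n_deg - eta21_deg) % 360.0
    if d < 45.0 or d >= 315.0:
        return "Back"
    if d < 135.0:
        return "Right"
    if d < 225.0:
        return "Forward"
    return "Left"

def make_rom_tracker():
    """Running maximum of the extension angle psi1n (formula (12)), as used
    in the range-of-motion mode described above."""
    best = 0.0
    def update(psi1n_deg):
        nonlocal best
        best = max(best, psi1n_deg)
        return best
    return update
```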
As described above, the presently disclosed orientation and motion-sensing device 210 (see Figs. 2a-2e) includes the voice data memory 416, the data processor 408, the audio processor 418, and the speaker 420 (see Fig. 4) , which may be configured to provide a sequence of distinguishable audible cues and action confirmations to the user while he or she performs the calibration method of Fig. 7 or any other suitable function of the device 210. Fig. 7 depicts exemplary audible cues and action confirmations 750 in the English language that may be provided by the device 210 after the user performs the acts depicted in steps 710, 720, 725, and 735. The audible cues and action confirmations 750 are designed to facilitate and confirm proper execution of the various steps in the calibration procedure. For example, the device 210 may be configured to provide the audible cue "Look straight ahead" after step 710. Further, the device 210 may provide the audible confirmation "Level set" after step 720, the audible cue "Lean" after step 725, and the audible confirmation "Direction set" after step 735. It is understood that in alternative embodiments, the device 210 may be configured to provide any other suitable audible, visible, and/or tactile cues and action confirmations to the user, using any other suitable language, in order to facilitate device operation. For example, the device 210 may include one or more vibrating transducers (not shown) to provide one or more tactile cues and/or action confirmations against the user's skin.
The orientation and motion-sensing device 210 may also be configured to provide user feedback in the form of audible phrases stored in the voice data memory 416 or synthesized by the device. The audible user feedback phrases may be constructed and selected by the device 210 under control of the data processor 408, which may sequence the phrases in response to user motions monitored by the device. The calibration method of Fig. 7 includes an exemplary use of such phrases as cues to guide the user in executing specific and desired motions {e.g. , "Look straight ahead"), and to confirm to the user the proper or improper execution of a step or sequence of steps {e.g., "Level set", "Direction set"). Such audible user feedback phrases may also be employed in physical rehabilitation and evaluation applications to cue the user while performing physical therapy exercises, e.g. , "Raise your arm slowly as far as it can go" , "Stand on your right foot until you are told to stop" , "The left elbow flexion will now be measured" , or "Attach the device to the left wrist and stabilize the humerus" . Suitable sequences of user guidance and feedback phrases can be programmed into the device 210, for example through the PC/Network Interface 422 (see Fig. 4) according to a specific plan of desired user motions, in response to an analysis of user motions, or a combination thereof.
In addition, the orientation and motion-sensing device 210 may be configured to provide audible performance feedback to the user that is contextual to a specific application. For example, in a sports training application, the desired performance feedback in response to a tilt in the forward direction may be "You are leaning forward", while in a balance training exercise, the desired performance feedback in response to the same forward tilt may be "Go Back" , "Keep your head up" , or "You are about to fall over" . In a physical therapy application, the desired performance feedback in response to a maximum limb extension that is below a specified lower limit may be "Stretch a little farther" , while the desired performance feedback in response to exceeding a specified upper limit may be "You've gone too far". In an application for determining a patient's range of motion (ROM), the desired performance feedback may be "Your extension is 85°", "Your maximum extension was 135°" or, in the case of blind measurements, the desired performance feedback may be "Measurement number 4 has been recorded" . The device 210 may also provide feedback that tracks user progress, using phrases such as "Repetition three completed, seven more to go" , or "Your average head tilt over the past five minutes was 5° and your average direction was 45° to the right of straight ahead" . In addition, the device 210 may provide user feedback corresponding to the number of times a local minimum or maximum point satisfying certain specified conditions has been reached.
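As an illustration of this kind of context-dependent phrasing (the thresholds and exact wording here are placeholders, not values fixed by the description):

```python
def rom_feedback(extension_deg, lower_limit=None, upper_limit=None):
    """Choose a spoken range-of-motion phrase in the style of the examples
    above.  The limits and exact wording are illustrative placeholders."""
    if lower_limit is not None and extension_deg < lower_limit:
        return "Stretch a little farther"
    if upper_limit is not None and extension_deg > upper_limit:
        return "You've gone too far"
    return "Your extension is %d degrees" % round(extension_deg)
```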
As described above, the orientation and motion-sensing device 210 may be incorporated into a hand-held device such as a cell phone or a computer game control. For example, in a cell phone application, the device 210 may be configured to announce the phrase "Call sent to voice-mail" in response to an outward flick of the user's wrist, e.g., when there is a call waiting. In a computer game application, the device 210 may be configured to announce the phrase "Your opponent is defeated" after the user has moved the device through a correct sequence of target orientations. In addition, the device 210 may be configured to allow selection and/or programming, via the PC/Network Interface 422, of a particular individual's voice, e.g., a teacher, a sports celebrity, etc., or a particular language, e.g., English, French, German, Italian, Chinese, Japanese, Korean, etc., to provide the user feedback.
In addition, the orientation and motion-sensing device 210 (see Figs. 2a-2e) may be configured to initiate a particular operational mode in response to a specified sequence of user movements. For example, the device 210 may be configured to initiate a posture-monitoring operational mode in response to a specified sequence of movements while the user is practicing or participating in a round of golf. In this way, the user can initiate the posture-monitoring mode of the device 210 without having to release his or her golf club. In one embodiment, the sequence of user movements includes at least two steps performed in a specified order, in which each step requires the user to look in a specified direction. The device 210 may provide audible, visible, and/or tactile confirmation of the proper execution of the ordered steps. The sequence of user movements is designed to assure that the user is unlikely to perform the movements unintentionally.
For example, after performing the calibration method of Fig. 7, a user engaged in a round of golf may initiate the posture monitoring mode of the device 210 by performing a specified sequence of movements, which, based on the resulting orientations of the device 210 relative to the user, effectively causes the apparent gravity vector G to retrace the path from a direction corresponding to the vector G2 620 (see Fig. 6) to a direction corresponding to the vector G1 610 (see Fig. 6), and back to the direction corresponding to the vector G2 620. In a golfing application, the specified and corresponding sequence of user movements may include addressing the golf ball, looking at the horizon, and addressing the golf ball again. Further, the device 210 may provide audible, visible, and/or tactile confirmations of the proper execution of each user movement in the specified sequence. Moreover, a tolerance circle may be provided around the locations of the vectors G2 and/or G1 so that the user is not required to look exactly at a particular point on the horizon or to address the golf ball in a precise manner in order for the device 210 to recognize the user's intent to initiate a particular operational mode. For example, a tolerance circle of 10° or any other suitable size may be provided. In an alternative embodiment, the directions of the vectors G1 and G2 corresponding to the first and second target orientations of the device 210 may be replaced by two other orientations that are related geometrically to the directions of the vectors G1 and G2, so long as these orientations correspond to convenient visual targets for the user.
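For purposes of illustration only, one way to recognize such a sequence is sketched below: each incoming apparent gravity direction is compared against the stored reference directions using a simple angular-distance test and a 10° tolerance circle. The vector comparison and the state-machine structure are assumptions of the sketch, not a description of the disclosed firmware.

    # Sketch: recognize the "address ball -> look at horizon -> address ball"
    # activation sequence (G2 -> G1 -> G2) within a 10 degree tolerance circle.
    # The angular-distance test and state machine are illustrative assumptions.
    import numpy as np

    TOLERANCE_DEG = 10.0

    def angle_between_deg(u: np.ndarray, v: np.ndarray) -> float:
        """Angle, in degrees, between two apparent-gravity direction vectors."""
        cos_ang = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
        return float(np.degrees(np.arccos(np.clip(cos_ang, -1.0, 1.0))))

    class ModeActivator:
        def __init__(self, g1_ref: np.ndarray, g2_ref: np.ndarray):
            self.targets = [g2_ref, g1_ref, g2_ref]   # retrace G2 -> G1 -> G2
            self.stage = 0

        def update(self, g_measured: np.ndarray) -> bool:
            """Feed each new gravity sample; returns True when the mode should start."""
            if angle_between_deg(g_measured, self.targets[self.stage]) <= TOLERANCE_DEG:
                self.stage += 1                 # a confirmation cue could be issued here
                if self.stage == len(self.targets):
                    self.stage = 0
                    return True                 # full sequence recognized
            return False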
Fig. 9a is a diagram of a geometric model 900 that may be employed to illustrate a technique of distinguishing the effects of acceleration on the orientation and motion-sensing device 210 (see Figs. 2a-2e) from the effects of tilt. As described above, the accuracy of the measurement of the apparent gravity vector G by the device 210 can decrease in the presence of acceleration, and, in certain applications (such as physical activity monitors), it is useful to estimate the magnitude and direction of the acceleration vector as precisely as possible in order to improve the accuracy of the measurement of physical activity. Like the geometric model 600 of Fig. 6, the geometric model 900 of Fig. 9a is constructed with reference to both the spherical and Cartesian coordinate systems and only one octal portion of the unit sphere is shown. Further, an X-axis 902 represents the sensitive axis of the X-axis sensor 402 (see Fig. 4), a Y-axis 904 represents the sensitive axis of the Y-axis sensor 404 (see Fig. 4), and a Z-axis 906 represents the sensitive axis of the over-range sensor 440. As illustrated in Fig. 9a, an apparent gravity vector G 910 is the sum of an actual gravity vector Ga 920 and an oscillating acceleration vector μ 930, which has its origin at the endpoint of the vector Ga 920, i.e.,
G = Ga + μ. (13)
For example, such an oscillating acceleration vector μ 930 may occur when the user is running and the dominant direction of the acceleration vector μ 930 is up-down relative to the user, or when the user is rowing and the dominant direction of the vector μ 930 is front-back relative to the user.
An illustrative method of distinguishing the effects of acceleration on the device 210 from the effects of tilt is described below with reference to Figs. 9a, 9b, and 10. In this method, it is assumed that the time average of the acceleration vector μ 930 is zero, and that the magnitude variation of the acceleration vector μ 930 is in a higher frequency band than the directional variation of the actual gravity vector Ga 920. As described above, the apparent gravity vector G can be specified in spherical coordinates by angles ΘG 914 and φG 916 using formulas (1)-(3) above.
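Formulas (1)-(3) appear earlier in the specification and are not reproduced in this passage. For orientation, the sketch below assumes the conventional conversion from the three accelerometer outputs to spherical coordinates, which is consistent with the squared-cosine relations recited later in the claims when the vector has unit magnitude; the function name and variable names are illustrative only.

    # Sketch: apparent-gravity vector (x, y, z) from the three sensor axes
    # expressed in spherical coordinates (rho, theta, phi). The conventional
    # conversion is assumed; it satisfies (cos theta)^2 = x^2/(x^2 + y^2) and,
    # at unit magnitude, (cos phi)^2 = 1 - (x^2 + y^2).
    import math

    def to_spherical(x: float, y: float, z: float):
        rho = math.sqrt(x * x + y * y + z * z)        # magnitude of apparent gravity
        theta = math.atan2(y, x)                      # azimuth in the X-Y sensor plane
        phi = math.acos(z / rho) if rho > 0 else 0.0  # polar angle from the Z axis
        return rho, theta, phi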
As depicted in step 1002 (see Fig. 10), signals representing the angles ΘG 914 and φG 916 (see Fig. 9a) are low-pass filtered by low-pass filter components 940 and 950 (see Fig. 9b), respectively, to suppress the effects of the acceleration vector μ 930 (see Fig. 9a) on the apparent gravity vector G 910 (see Fig. 9a).
Those of ordinary skill in this art will appreciate that appropriate architectures and coefficient values for the filters 940 and 950 can be chosen to perform this step if the magnitude variation of the acceleration vector μ 930 is in a sufficiently higher frequency band than the directional variation of the actual gravity vector Ga 920. In one exemplary embodiment, a 15-tap, 0.5 Hz FIR filter with a 6 Hz sampling rate will attenuate a periodic acceleration vector μ 930 with a period of 1 second by 33 dB, while attenuating a periodic Ga 920 with a period of 9 seconds by less than 1 dB.
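The attenuation figures quoted above can be checked approximately with a few lines of signal-processing code. The sketch below designs a 15-tap low-pass FIR filter with a 0.5 Hz cutoff at a 6 Hz sampling rate and evaluates its response at a 1-second and a 9-second period; because the exact tap values and window used in the embodiment are not given, the design method here (scipy's windowed-sinc firwin) is an assumption, and the resulting decibel figures will differ somewhat from those quoted.

    # Approximate check of the quoted FIR attenuation figures. The firwin
    # windowed-sinc design is an assumption; the embodiment's taps are not given.
    import numpy as np
    from scipy.signal import firwin, freqz

    fs = 6.0                              # sampling rate, Hz
    taps = firwin(15, 0.5, fs=fs)         # 15-tap low-pass FIR, 0.5 Hz cutoff

    for period_s in (1.0, 9.0):           # 1 s acceleration vs. 9 s tilt variation
        f_hz = 1.0 / period_s
        _, h = freqz(taps, worN=[f_hz], fs=fs)
        print(f"{f_hz:.3f} Hz: {20 * np.log10(abs(h[0])):.1f} dB")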
As shown in Fig. 9b, the low-pass filters 940 and 950 generate output signals corresponding to angles θGa and φGa, respectively, which define the direction of the actual gravity vector Ga. Next, assuming that the length pa of Ga is equal to 1 (pa = 1), signals representing the angles θGa and φGa are converted to Cartesian coordinates by the converter 960 (see Fig. 9b), as depicted in step 1004. Similarly, the signals representing the angles ΘG 914 and φG 916, and the length pG, are converted to Cartesian coordinates by the converter 970 (see Fig. 9b), as depicted in step 1006. Finally, a representation 980 (see Fig. 9b) of the acceleration vector μ 930 is obtained at a summation node 990 (see Fig. 9b) by subtracting the actual gravity vector Ga from the apparent gravity vector G, as depicted in step 1008.
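Taken together, steps 1002-1008 amount to the short processing chain sketched below: low-pass filter the two direction angles to estimate the actual-gravity direction (taken to have unit length), convert both the apparent and actual gravity vectors to Cartesian coordinates, and subtract. The filter choice, function names, and the handling of angle wrap-around and filter delay are simplifying assumptions of the sketch, not details taken from the specification.

    # Sketch of the Fig. 9b / Fig. 10 chain: separate the acceleration vector mu
    # from tilt by low-pass filtering the spherical direction angles of the
    # apparent gravity vector G. Angle wrap-around and filter group delay are
    # ignored here for brevity.
    import numpy as np
    from scipy.signal import firwin, lfilter

    def spherical_to_cartesian(rho, theta, phi):
        return np.array([rho * np.sin(phi) * np.cos(theta),
                         rho * np.sin(phi) * np.sin(theta),
                         rho * np.cos(phi)])

    def separate_acceleration(samples: np.ndarray, fs: float = 6.0) -> np.ndarray:
        """samples: (N, 3) array of accelerometer readings (x, y, z) per sample."""
        x, y, z = samples.T
        rho = np.sqrt(x**2 + y**2 + z**2)            # length of apparent gravity G
        theta = np.arctan2(y, x)                     # azimuth of G
        phi = np.arccos(z / rho)                     # polar angle of G
        taps = firwin(15, 0.5, fs=fs)                # illustrative 15-tap FIR
        theta_a = lfilter(taps, 1.0, theta)          # step 1002: filter direction angles
        phi_a = lfilter(taps, 1.0, phi)
        g = spherical_to_cartesian(rho, theta, phi)          # step 1006: G in Cartesian
        g_a = spherical_to_cartesian(1.0, theta_a, phi_a)    # step 1004: Ga with rho_a = 1
        return (g - g_a).T                                   # step 1008: mu = G - Ga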
The method of Fig. 10 allows accurate measurements of the direction and magnitude of acceleration of the device 210 to be obtained without having to calibrate the alignment of the device to the user, without knowing a priori the direction of the acceleration relative to the device 210, and without requiring the device's orientation relative to the user to remain constant.
It should be appreciated that the functions necessary to implement the present invention may be embodied in whole or in part using hardware, software, or some combination thereof, including micro-controllers, microprocessors, digital signal processors, programmable logic arrays, and/or any other suitable hardware and/or software.
It will further be appreciated by those of ordinary skill in this art that modifications to and variations of the above-described systems and methods of monitoring body orientation, posture, and motion, and providing cueing and feedback thereof, may be made without departing from the inventive concepts disclosed herein. Accordingly, the invention should not be viewed as limited except as by the scope and spirit of the appended claims.

Claims

What is claimed is:
1. An apparatus for monitoring an orientation of an object in 3-dimensional space, said apparatus being mountable to said object, comprising: a 3-axis sensor configured to sense a magnitude of tilt along each of a first axis, a second axis, and a third axis; at least one memory operative to store data representative of the sensed magnitude of tilt along each of said first, second, and third axes; and at least one processor operative to process the data stored in said at least one memory, wherein said at least one processor is operative: to determine an angle between each of said first, second, and third axes and a horizontal plane; to select two of said first, second, and third axes corresponding to two smallest angles between said first, second, and third axes and the horizontal plane; and to generate an indication of the orientation of said object based upon the sensed magnitude of tilt along the two selected axes.
2. The apparatus of claim 1 wherein said 3-axis sensor includes at least one accelerometer, said 3-axis sensor being operative to measure a magnitude of an apparent gravity force along each of said first, second, and third axes.
3. The apparatus of claim 2 wherein said at least one processor is operative to monitor an acceleration of said object by monitoring the magnitude of said apparent gravity force.
4. The apparatus of claim 2 wherein said first and second axes are disposed at a first specified angle to one another and define a first plane, said third axis being disposed at a second specified angle to the first plane; wherein each of said first and second specified angles is equal to 90°, and wherein said at least one processor is operative to determine a direction of said apparent gravity force, said apparent gravity force being defined in a spherical coordinate system by angles θ and φ such that
(cos θ)² = x²/(x² + y²), and (cos φ)² = 1 - (x² + y²),
wherein "x" represents a magnitude of said apparent gravity force measured along said first axis, and wherein "y" represents a magnitude of said apparent gravity force measured along said second axis.
5. The apparatus of claim 2 wherein said first and second axes are disposed at a first specified angle to one another and define a first plane, said third axis being disposed at a second specified angle to the first plane; wherein each of said first and second specified angles is equal to 90°, and wherein said at least one processor is operative to determine the magnitude of said apparent gravity force, the magnitude of said apparent gravity force being expressible as
p = √(x² + y² + z²),
wherein "x" represents a magnitude of said apparent gravity force measured along said first axis, wherein "y" represents a magnitude of said apparent gravity force measured along said second axis, and wherein "z" represents a magnitude of said apparent gravity- force measured along said third axis .
6. The apparatus of claim 5 wherein said at least one processor is operative to monitor an acceleration of said object by monitoring the magnitude of said apparent gravity force, and wherein the magnitude p of said apparent gravity force, normalized to earth's gravity at the earth's surface, is expressible as
p = √(x² + y² + z²) > 1, or p = √(x² + y² + z²) < 1
in a presence of an acceleration of said object.
7. The apparatus of claim 5 wherein said at least one processor is operative to monitor an acceleration of said object by monitoring the magnitude of said apparent gravity force, and wherein the magnitude p of said apparent gravity force, normalized to earth's gravity at the earth's surface, is expressible as
p = √(x² + y² + z²) = 1
in an absence of an acceleration of said object.
8. A method of monitoring an orientation of an object in 3-dimensional space, comprising the steps of: in a positioning step, positioning a housing including a 3-axis sensor against said object; in a sensing step, sensing, by said 3-axis sensor, a magnitude of tilt along a first axis, a second axis, and a third axis; in a determining step, determining an angle between each of said first, second, and third axes and a horizontal plane; in a selecting step, selecting two of said first, second, and third axes corresponding to two smallest angles between said first, second, and third axes and the horizontal plane; and in a first generating step, generating an indication of the orientation of said object based upon the sensed magnitude of tilt along the two selected axes.
9. The method of claim 8 wherein said sensing step includes measuring, by said 3-axis sensor, a magnitude of an apparent gravity force along each of said first, second, and third axes.
10. The method of claim 9 further including the step of monitoring the magnitude p of said apparent gravity force to determine a presence or an absence of an acceleration of said object.
11. The method of claim 9 further including the steps of filtering a signal representing a direction of said apparent gravity force to determine a direction of an actual gravity force, subtracting a first vector representing said actual gravity force from a second vector representing said apparent gravity force to obtain a third vector, said third vector representing a periodic acceleration of said object, and in a second generating step, generating an indication of at least one of a direction and a magnitude of the third vector representing the acceleration of said object.
12. The method of claim 8 further including generating an audible message corresponding to at least one word or phrase at a start or completion of at least one of said positioning step, said sensing step, said determining step, said selecting step, and said first generating step.
13. The method of claim 12 wherein the at least one word or phrase corresponding to said audible message comprises at least one instructional word or phrase.
14. The method of claim 12 wherein the at least one word or phrase corresponding to said audible message comprises a confirmation of the start or completion of a specified act.
15. A method of monitoring an orientation of a body part of a user in 3-dimensional space, comprising the steps of: in a positioning step, positioning said body part in a first orientation within the 3-dimensional space; in a first measuring step, measuring an apparent gravity force acting on said body part at the first orientation to obtain a first direction of said apparent gravity force; in a first causing step, causing a first angular displacement of said body part about at least one axis from the first orientation to a second orientation within the 3-dimensional space; in a second measuring step, measuring said apparent gravity force acting on said body part at the second orientation to obtain a second direction of said apparent gravity force; in a first determining step, determining a reference orientation of said user within the 3-dimensional space based upon the first and second directions of said apparent gravity force acting on said body part at the first and second orientations, respectively; and in a storing step, storing an indication of the reference orientation of said user.
16. The method of claim 15 further including the steps of in a second causing step, causing at least one next angular displacement of said body part about said at least one axis to at least one next orientation within the 3-dimensional space; in a third measuring step, measuring said apparent gravity force acting on said body part at the next orientation to obtain a next direction of said apparent gravity force; and in a second determining step, determining, relative to the reference orientation of said user, a direction corresponding to said next angular displacement based upon the second and next orientations of said body part.
17. The method of claim 15 wherein the reference orientation of said user determined in said first determining step corresponds to a direction along a great circle arc from an end of a first vector representing said apparent gravity force measured in said first measuring step to an end of a second vector representing said apparent gravity force measured in said second measuring step.
18. The method of claim 15 wherein the first angular displacement corresponds to a length of a great circle arc from an end of a first vector representing said apparent gravity force measured in said first measuring step to an end of a second vector representing said apparent gravity force measured in said second measuring step.
19. The method of claim 16 wherein the next angular displacement corresponds to a length of a great circle arc from an end of a first vector representing said apparent gravity force measured in said second measuring step to an end of a second vector representing said apparent gravity force measured in said third measuring step.
20. The method of claim 16 wherein the orientation of said user determined in said second determining step corresponds to a direction along a great circle arc from an end of a first vector representing said apparent gravity force measured in said second measuring step to an end of a second vector representing said apparent gravity force measured in said third measuring step.
21. The method of claim 15 further including generating an audible message corresponding to at least one word or phrase at a start or completion of at least one of said positioning step, said first measuring step, said first causing step, said second measuring step, said first determining step, and said storing step.
22. The method of claim 21 wherein the at least one word or phrase corresponding to said audible message comprises at least one instructional word or phrase.
23. The method of claim 21 wherein the at least one word or phrase corresponding to said audible message comprises a confirmation of the start or completion of a specified act.
24. A method of monitoring a range of motion of a body part, said body part being rotatable about a joint to which said body part is coupled, comprising the steps of: in a first positioning step, positioning a housing including a sensor against said body part; in a second positioning step, positioning said body part in a first orientation relative to the joint; in a first measuring step, measuring, by said sensor in 3-dimensional space, an apparent gravity force acting on said housing disposed against said body part at the first orientation to obtain a first direction of said apparent gravity force; in a third positioning step, positioning said body part in a second orientation relative to the joint; in a second measuring step, measuring, by said sensor in 3-dimensional space, said apparent gravity force acting on said housing disposed against said body part at the second orientation to obtain a second direction of said apparent gravity force; in a determining step, determining a magnitude of rotation of said body part from the first orientation to the second orientation based upon the first and second directions of said apparent gravity force; and in a providing step, providing an indication of the magnitude of rotation of said body part.
25. The method of claim 24 wherein said first positioning step includes positioning said housing including said sensor against said body part, wherein said sensor has at least two axes.
26. The method of claim 24 wherein the magnitude of rotation of said body part corresponds to a length of a great circle arc from an end of a first vector representing said apparent gravity force measured in said first measuring step to an end of a second vector representing said apparent gravity force measured in said second measuring step.
27. The method of claim 24 wherein said second measuring step includes performing a plurality of measurements of said apparent gravity force at substantially the second orientation to obtain a plurality of second directions of said apparent gravity force, wherein the determining step includes determining a plurality of magnitudes of rotation of said body part from the first orientation to substantially the second orientation based upon the first direction and the plurality of second directions of said apparent gravity force, and further including storing a maximum of the plurality of magnitudes of rotation.
28. The method of claim 24 further including generating an audible message corresponding to at least one word or phrase at a start or completion of at least one of said first positioning step, said second positioning step, said first measuring step, said third positioning step, said second measuring step, said determining step, and said providing step.
29. The method of claim 28 wherein the at least one word or phrase corresponding to said audible message comprises at least one instructional word or phrase.
30. The method of claim 28 wherein the at least one word or phrase corresponding to said audible message comprises a confirmation of the start or completion of a specified act.
31. A method of initiating monitoring of an orientation of a body part in 3-dimensional space, said monitoring being performed using a sensor disposed against said body part, comprising the steps of: in a first positioning step, positioning said body part in a first orientation within the 3-dimensional space; in a first providing step, providing, by said sensor, data representing a first position of said body part at the first orientation; in a second positioning step, positioning said body part in at least one second orientation within the 3-dimensional space; in a second providing step, providing, by said sensor, data representing at least one second position of said body part at the at least one second orientation; and in an initiating step, in the event the first position of said body part at the first orientation and the at least one second position of said body part at the at least one second orientation correspond to a specified sequence of positions of said body part, initiating said monitoring of the orientation of said body part by said sensor.
32. The method of claim 31 wherein the first providing step includes measuring, by said sensor, an apparent gravity force acting on said body part at the first orientation to obtain a first direction of said apparent gravity force, the first direction of said apparent gravity force being indicative of the first position of said body part at the first orientation, and wherein the second providing step includes measuring, by said sensor, said apparent gravity force acting on said body part at the at least one second orientation to obtain at least one second direction of said apparent gravity force, wherein the at least one second direction of said apparent gravity force is indicative of the at least one second position of said body part at the at least one second orientation.
33. The method of claim 31 further including generating an audible message corresponding to at least one word or phrase at a start or completion of at least one of said first positioning step, said first providing step, said second positioning step, said second providing step, and said initiating step.
34. The method of claim 33 wherein the at least one word or phrase corresponding to said audible message comprises at least one instructional word or phrase.
35. The method of claim 33 wherein the at least one word or phrase corresponding to said audible message comprises a confirmation of the start or completion of a specified act.
36. An apparatus for monitoring an orientation of a body part, the apparatus being attachable to said body part, comprising: a sensor configured to sense an angular orientation of said body part, and to provide data representing the sensed angular orientation; at least one memory operative to store data representing a plurality of words or phrases; an audio output system operative to generate an audible message in response to an electronic input; and at least one processor operative: to monitor data provided by said sensor; to access data stored in said memory corresponding to at least one word or phrase, said at least one word or phrase relating to the sensed angular orientation of said body part; and to generate, in cooperation with said audio output system, an audible message corresponding to said at least one word or phrase.
37. The apparatus of claim 36 wherein the at least one word or phrase corresponding to said audible message comprises at least one instructional word or phrase.
38. The apparatus of claim 36 wherein the at least one word or phrase corresponding to said audible message comprises a confirmation of a start or completion of a specified act.
39. The apparatus of claim 36 wherein the at least one word or phrase corresponding to said audible message comprises a representation of a voice of one of a plurality of predetermined individuals.
40. The apparatus of claim 39 wherein one of said plurality of predetermined individuals is selectable by a user.
41. The apparatus of claim 36 wherein the data representing said plurality of words or phrases is stored in said memory in a plurality of different languages.
42. The apparatus of claim 41 wherein one of said plurality of different languages is selectable by a user.
43. The apparatus of claim 36 wherein said at least one processor is communicably coupleable to a data communications network.
44. The apparatus of claim 43 wherein said data communications network comprises one of a local area network (LAN), a wide area network (WAN), and a point-to-point network/remote-control unit.
45. The apparatus of claim 43 wherein said at least one processor is operative to receive data corresponding to said plurality of words or phrases over said network, and to store said data in said memory.
46. The apparatus of claim 36 further including an interface configured to communicate information over a data communications network, and wherein said at least one processor is operative to store said data provided by said sensor, to generate at least one data message containing said data provided by said sensor, and to communicate said data message to said interface for subsequent transmittal over said network.
47. An apparatus for monitoring an orientation of a body part of a user, the apparatus being attachable to said body part, comprising: a sensor configured to sense an angular orientation of said body part, and to provide data representing the sensed angular orientation; a memory operative to store data representative of a plurality of indications of the sensed angular orientation of said body part; an output system operative to generate a plurality of indications perceptible by the user; and at least one processor operative: to monitor data provided by said sensor; and to access data corresponding to at least one of said plurality of indications of the sensed angular orientation of said body part, and, in cooperation with said output system, to generate one of said indications perceptible by the user to provide the user with feedback pertaining to the orientation of said body part within the 3-dimensional space, wherein said output system comprises at least one vibrating transducer, and said indications perceptible by the user comprise a plurality of predetermined vibration patterns.
PCT/US2006/026954 2005-07-13 2006-07-12 Orientation and motion sensing in athletic training systems, physical rehabilitation and evaluation systems, and hand-held devices WO2007008930A2 (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US69899505P 2005-07-13 2005-07-13
US60/698,995 2005-07-13
US71916105P 2005-09-21 2005-09-21
US60/719,161 2005-09-21

Publications (2)

Publication Number Publication Date
WO2007008930A2 true WO2007008930A2 (en) 2007-01-18
WO2007008930A3 WO2007008930A3 (en) 2007-12-13

Family

ID=37637907

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2006/026954 WO2007008930A2 (en) 2005-07-13 2006-07-12 Orientation and motion sensing in athletic training systems, physical rehabilitation and evaluation systems, and hand-held devices

Country Status (2)

Country Link
US (1) US7383728B2 (en)
WO (1) WO2007008930A2 (en)

Families Citing this family (121)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5065275B2 (en) 2005-09-02 2012-10-31 エムセンス コーポレイション Apparatus and method for detecting electrical activity in an organization
US8016776B2 (en) * 2005-12-02 2011-09-13 Medtronic, Inc. Wearable ambulatory data recorder
JP4967368B2 (en) * 2006-02-22 2012-07-04 ソニー株式会社 Body motion detection device, body motion detection method, and body motion detection program
KR100772539B1 (en) * 2006-12-07 2007-11-01 한국전자통신연구원 Apparatus and method for golf swing form guiding
US8353854B2 (en) * 2007-02-14 2013-01-15 Tibion Corporation Method and devices for moving a body joint
US9215996B2 (en) * 2007-03-02 2015-12-22 The Nielsen Company (Us), Llc Apparatus and method for objectively determining human response to media
US20090253996A1 (en) * 2007-03-02 2009-10-08 Lee Michael J Integrated Sensor Headset
US8230457B2 (en) 2007-03-07 2012-07-24 The Nielsen Company (Us), Llc. Method and system for using coherence of biological responses as a measure of performance of a media
US8473044B2 (en) * 2007-03-07 2013-06-25 The Nielsen Company (Us), Llc Method and system for measuring and ranking a positive or negative response to audiovisual or interactive media, products or activities using physiological signals
US20080221969A1 (en) * 2007-03-07 2008-09-11 Emsense Corporation Method And System For Measuring And Ranking A "Thought" Response To Audiovisual Or Interactive Media, Products Or Activities Using Physiological Signals
US8764652B2 (en) * 2007-03-08 2014-07-01 The Nielson Company (US), LLC. Method and system for measuring and ranking an “engagement” response to audiovisual or interactive media, products, or activities using physiological signals
US8782681B2 (en) * 2007-03-08 2014-07-15 The Nielsen Company (Us), Llc Method and system for rating media and events in media based on physiological data
US7634379B2 (en) * 2007-05-18 2009-12-15 Ultimate Balance, Inc. Newtonian physical activity monitor
US7458943B1 (en) * 2007-06-25 2008-12-02 The Hong Kong Polytechnic University Spine tilt monitor with biofeedback
US20080319351A1 (en) * 2007-06-25 2008-12-25 The Hong Kong Polytechnic University Spine tilt monitor with biofeedback
US7850537B2 (en) * 2007-08-21 2010-12-14 Stern Ben D Vibration-based training device and method
US20090111598A1 (en) * 2007-10-31 2009-04-30 O'brien Scott Systems and methods for improving golf swing
US8052629B2 (en) * 2008-02-08 2011-11-08 Tibion Corporation Multi-fit orthotic and mobility assistance apparatus
US20090306548A1 (en) 2008-06-05 2009-12-10 Bhugra Kern S Therapeutic method and device for rehabilitation
US9549585B2 (en) 2008-06-13 2017-01-24 Nike, Inc. Footwear having sensor system
US10070680B2 (en) 2008-06-13 2018-09-11 Nike, Inc. Footwear having sensor system
CN105768322A (en) 2008-06-13 2016-07-20 耐克创新有限合伙公司 Footwear Having Sensor System
US8058823B2 (en) * 2008-08-14 2011-11-15 Tibion Corporation Actuator system with a multi-motor assembly for extending and flexing a joint
US8274244B2 (en) * 2008-08-14 2012-09-25 Tibion Corporation Actuator system and method for extending a joint
US20100204620A1 (en) * 2009-02-09 2010-08-12 Smith Jonathan A Therapy and mobility assistance system
US8639455B2 (en) 2009-02-09 2014-01-28 Alterg, Inc. Foot pad device and method of obtaining weight data
TR201908933T4 (en) * 2009-02-13 2019-07-22 Koninklijke Philips Nv Head motion tracking for mobile applications.
US20100228158A1 (en) * 2009-03-05 2010-09-09 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Postural information system and method including device level determining of subject advisory information based on subject status information and postural influencer status information
DE102009002547A1 (en) * 2009-04-21 2010-10-28 Robert Bosch Gmbh Patient-to-wear device for controlling movements of the patient
US8739599B2 (en) 2010-03-02 2014-06-03 Bio-Applications, LLC Intra-extra oral shock-sensing and indicating systems and other shock-sensing and indicating systems
US20110219852A1 (en) * 2010-03-10 2011-09-15 Kasten Stephen P Impact monitoring apparatus
US20120116714A1 (en) * 2010-08-03 2012-05-10 Intellisysgroup Llc Digital Data Processing Systems and Methods for Skateboarding and Other Social Sporting Activities
US9111460B2 (en) 2010-08-17 2015-08-18 Judith Dawn Whyte Peri-mid
EP4138095A1 (en) 2010-11-10 2023-02-22 Nike Innovate C.V. Systems and methods for time-based athletic activity measurement and display
CN112545101B (en) 2011-02-17 2022-05-03 耐克创新有限合伙公司 Footwear with sensor system
US9381420B2 (en) 2011-02-17 2016-07-05 Nike, Inc. Workout user experience
WO2012112903A2 (en) 2011-02-17 2012-08-23 Nike International Ltd. Location mapping
CN107224026B (en) 2011-02-17 2020-04-21 耐克创新有限合伙公司 Shoe with sensor system
US8784274B1 (en) 2011-03-18 2014-07-22 Thomas C. Chuang Athletic performance monitoring with body synchronization analysis
US9140637B2 (en) 2011-03-31 2015-09-22 Mihaly Kis, JR. Method and apparatus for simulating head impacts for helmet testing
US8460001B1 (en) 2011-04-14 2013-06-11 Thomas C. Chuang Athletic performance monitoring with overstride detection
US20120296601A1 (en) * 2011-05-20 2012-11-22 Graham Paul Eatwell Method and apparatus for monitoring motion of a substantially rigid
US9615994B2 (en) 2011-07-06 2017-04-11 LELO Inc. Motion-based control for a personal massager
WO2013010040A1 (en) 2011-07-13 2013-01-17 Zero2One System and method of biomechanical posture detection and feedback
US9128521B2 (en) 2011-07-13 2015-09-08 Lumo Bodytech, Inc. System and method of biomechanical posture detection and feedback including sensor normalization
US8702529B2 (en) * 2011-08-24 2014-04-22 QPutt, LLC Systems and methods for improving a golf swing or putting stroke
US8944940B2 (en) 2011-08-29 2015-02-03 Icuemotion, Llc Racket sport inertial sensor motion tracking analysis
US11684111B2 (en) 2012-02-22 2023-06-27 Nike, Inc. Motorized shoe with gesture control
US20130213147A1 (en) 2012-02-22 2013-08-22 Nike, Inc. Footwear Having Sensor System
US11071344B2 (en) 2012-02-22 2021-07-27 Nike, Inc. Motorized shoe with gesture control
US9292858B2 (en) 2012-02-27 2016-03-22 The Nielsen Company (Us), Llc Data collection system for aggregating biologically based measures in asynchronous geographically distributed public environments
US9451303B2 (en) 2012-02-27 2016-09-20 The Nielsen Company (Us), Llc Method and system for gathering and computing an audience's neurologically-based reactions in a distributed framework involving remote storage and computing
US8436737B1 (en) 2012-03-13 2013-05-07 Steelhead Innovations, Llc Postural state attitude monitoring, caution, and warning systems and methods
US10716510B2 (en) 2013-09-17 2020-07-21 Medibotics Smart clothing with converging/diverging bend or stretch sensors for measuring body motion or configuration
US10321873B2 (en) 2013-09-17 2019-06-18 Medibotics Llc Smart clothing for ambulatory human motion capture
US9588582B2 (en) 2013-09-17 2017-03-07 Medibotics Llc Motion recognition clothing (TM) with two different sets of tubes spanning a body joint
US9582072B2 (en) 2013-09-17 2017-02-28 Medibotics Llc Motion recognition clothing [TM] with flexible electromagnetic, light, or sonic energy pathways
US10602965B2 (en) 2013-09-17 2020-03-31 Medibotics Wearable deformable conductive sensors for human motion capture including trans-joint pitch, yaw, and roll
US8989835B2 (en) 2012-08-17 2015-03-24 The Nielsen Company (Us), Llc Systems and methods to gather and analyze electroencephalographic data
WO2014052874A1 (en) * 2012-09-27 2014-04-03 X2 Biosystems, Inc. Adhesive shock patch
WO2014070799A1 (en) 2012-10-30 2014-05-08 Truinject Medical Corp. System for injection training
US9792836B2 (en) 2012-10-30 2017-10-17 Truinject Corp. Injection training apparatus using 3D position sensor
US20140142442A1 (en) * 2012-11-19 2014-05-22 Judy Sibille SNOW Audio Feedback for Medical Conditions
US9743861B2 (en) 2013-02-01 2017-08-29 Nike, Inc. System and method for analyzing athletic activity
US10926133B2 (en) 2013-02-01 2021-02-23 Nike, Inc. System and method for analyzing athletic activity
US11006690B2 (en) 2013-02-01 2021-05-18 Nike, Inc. System and method for analyzing athletic activity
US9161708B2 (en) * 2013-02-14 2015-10-20 P3 Analytics, Inc. Generation of personalized training regimens from motion capture data
US9384671B2 (en) 2013-02-17 2016-07-05 Ronald Charles Krosky Instruction production
US9320450B2 (en) 2013-03-14 2016-04-26 The Nielsen Company (Us), Llc Methods and apparatus to gather and analyze electroencephalographic data
US9410857B2 (en) 2013-03-15 2016-08-09 Nike, Inc. System and method for analyzing athletic activity
WO2014151584A1 (en) 2013-03-15 2014-09-25 Alterg, Inc. Orthotic device drive system and method
US9226707B2 (en) * 2013-04-26 2016-01-05 Chiming Huang Device and system to reduce traumatic brain injury
US8961440B2 (en) * 2013-04-26 2015-02-24 Chiming Huang Device and system to reduce traumatic brain injury
WO2014179507A1 (en) * 2013-04-30 2014-11-06 White Chester Body impact bracing apparatus
US9591996B2 (en) 2013-06-07 2017-03-14 Lumo BodyTech, Inc System and method for detecting transitions between sitting and standing states
US9586116B2 (en) * 2013-08-19 2017-03-07 David Churchman Training system and method
US20150109129A1 (en) * 2013-10-18 2015-04-23 Brain Sentry Llc System and method for measuring bodily impact events
US10664690B2 (en) 2013-11-21 2020-05-26 Mo' Motion Ventures Jump shot and athletic activity analysis system
US9589207B2 (en) 2013-11-21 2017-03-07 Mo' Motion Ventures Jump shot and athletic activity analysis system
US11045366B2 (en) * 2013-12-05 2021-06-29 Now Technologies Zrt. Personal vehicle, and control apparatus and control method therefore
US9474681B2 (en) 2013-12-09 2016-10-25 LELO, Inc. Wearable massager for couples
JP6447514B2 (en) * 2013-12-25 2019-01-09 ソニー株式会社 Attitude measurement apparatus and attitude measurement method, image processing apparatus and image processing method, display apparatus and display method, computer program, and image display system
US9922578B2 (en) 2014-01-17 2018-03-20 Truinject Corp. Injection site training system
US10290231B2 (en) 2014-03-13 2019-05-14 Truinject Corp. Automated detection of performance characteristics in an injection training system
US9622702B2 (en) 2014-04-03 2017-04-18 The Nielsen Company (Us), Llc Methods and apparatus to gather and analyze electroencephalographic data
JP2017519917A (en) * 2014-07-03 2017-07-20 レイザー スポルト ナムローゼ フェンノートシャップ Helmet providing position feedback
DK2962586T3 (en) * 2014-07-03 2017-03-20 Lazer Sport Nv Helmet with position feedback
US10668353B2 (en) 2014-08-11 2020-06-02 Icuemotion Llc Codification and cueing system for sport and vocational activities
US10232242B2 (en) * 2014-09-19 2019-03-19 Kelvin Guerrero Devices to improve swing technique, and methods of use thereof
GB2551238B (en) 2014-09-30 2019-04-10 270 Vision Ltd Mapping trajectories of the anatomy of the human or animal body for comparitive analysis
US10235904B2 (en) 2014-12-01 2019-03-19 Truinject Corp. Injection training tool emitting omnidirectional light
US10188311B2 (en) * 2015-12-04 2019-01-29 Chiming Huang Device to reduce traumatic brain injury
US11298040B2 (en) * 2014-12-05 2022-04-12 Chiming Huang Device to reduce traumatic brain injury
US9452338B1 (en) * 2014-12-31 2016-09-27 Leg Up Industries LLC Golf swing head movement detection system
US10631793B1 (en) * 2015-04-14 2020-04-28 Eric Levell Luster Impact indicator
US10548510B2 (en) * 2015-06-30 2020-02-04 Harrison James BROWN Objective balance error scoring system
US10854104B2 (en) * 2015-08-28 2020-12-01 Icuemotion Llc System for movement skill analysis and skill augmentation and cueing
US10314520B2 (en) 2015-10-02 2019-06-11 Seismic Holdings, Inc. System and method for characterizing biomechanical activity
EP3365049A2 (en) 2015-10-20 2018-08-29 Truinject Medical Corp. Injection system
WO2017083661A1 (en) * 2015-11-11 2017-05-18 Tour Pro Tech, Llc Head movement detection method and system for training in sports requiring a swing
EP3384845B1 (en) * 2015-11-30 2019-09-18 Alps Alpine Co., Ltd. Calibration method, portable device, and program
US10463909B2 (en) 2015-12-27 2019-11-05 Seismic Holdings, Inc. System and method for using performance signatures
US10959647B2 (en) 2015-12-30 2021-03-30 Seismic Holdings, Inc. System and method for sensing and responding to fatigue during a physical activity
WO2017151441A2 (en) 2016-02-29 2017-09-08 Truinject Medical Corp. Cosmetic and therapeutic injection safety systems, methods, and devices
US10648790B2 (en) 2016-03-02 2020-05-12 Truinject Corp. System for determining a three-dimensional position of a testing tool
WO2017151963A1 (en) 2016-03-02 2017-09-08 Truinject Medical Corp. Sensory enhanced environments for injection aid and social training
WO2018029064A1 (en) * 2016-08-09 2018-02-15 Peter Sonntag System and method for posture and movement regulation
US9950239B1 (en) * 2016-11-01 2018-04-24 Kevin Harvey Hitting training device
WO2018106220A1 (en) 2016-12-06 2018-06-14 Vuelosophy Inc. Systems and methods for tracking motion and gesture of heads and eyes
US10650703B2 (en) 2017-01-10 2020-05-12 Truinject Corp. Suture technique training system
US10269266B2 (en) 2017-01-23 2019-04-23 Truinject Corp. Syringe dose and position measuring apparatus
US20180228403A1 (en) * 2017-02-13 2018-08-16 Conghua Li Wearable aparatus for monitoring head posture, and method of using the same
US11559252B2 (en) 2017-05-08 2023-01-24 Starkey Laboratories, Inc. Hearing assistance device incorporating virtual audio interface for therapy guidance
US10695611B2 (en) 2017-08-14 2020-06-30 AssessLink LLC Physical education kinematic motor skills testing system
US10085507B1 (en) * 2018-02-12 2018-10-02 Raymond C. Jarvis Cap alarm system
US10147299B1 (en) * 2018-06-04 2018-12-04 Raymond C. Jarvis Cap alarm system
EP3863738A1 (en) * 2018-10-09 2021-08-18 Brian Francis Mooney Coaching, assessing or analysing unseen processes in intermittent high-speed human motions, including golf swings
KR20210039875A (en) * 2019-10-02 2021-04-12 주식회사 모아이스 Method, device and non-transitory computer-readable recording medium for estimating information about golf swing
US11580837B2 (en) * 2020-04-19 2023-02-14 Pedro Pachuca Rodriguez Head orientation training devices
KR20240068771A (en) * 2021-10-05 2024-05-17 카스턴 매뉴팩츄어링 코오포레이숀 System and method for predicting ball flight data to produce a set of consistently gapped golf clubs
US20240001195A1 (en) * 2022-06-30 2024-01-04 bOMDIC Inc. Method and device for predicting sports performance, and computer readable storage medium

Family Cites Families (69)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US1169188A (en) 1915-01-20 1916-01-25 Arthur E Peck Golf-ball-addressing device.
US3945646A (en) 1974-12-23 1976-03-23 Athletic Swing Measurement, Inc. Athletic swing measurement system and method
US4502035A (en) 1983-07-11 1985-02-26 Obenauf James E Golfer's head motion sensor
US4991850A (en) 1988-02-01 1991-02-12 Helm Instrument Co., Inc. Golf swing evaluation system
US5348519A (en) 1988-02-04 1994-09-20 Loredan Biomedical, Inc. Exercise and diagnostic apparatus and method
US5243998A (en) 1989-05-25 1993-09-14 The Rockefeller University Automatic operant conditioning system
US5221088A (en) 1991-01-22 1993-06-22 Mcteigue Michael H Sports training system and method
US5197489A (en) 1991-06-17 1993-03-30 Precision Control Design, Inc. Activity monitoring apparatus with configurable filters
US5621922A (en) 1992-01-10 1997-04-22 Rush, Iii; Gus A. Sports helmet capable of sensing linear and rotational forces
US5251902A (en) 1992-03-16 1993-10-12 John Federowicz Golfer's head rotation indicating means and method
US5253870A (en) 1992-03-25 1993-10-19 Bedney Reginald C Golf practicing device with head motion detector
US5338036A (en) 1993-06-09 1994-08-16 Universal System Control, Inc. Golf exercising aid device
US5373857A (en) 1993-06-18 1994-12-20 Forte Technologies, Inc. Head tracking apparatus
US5553857A (en) 1993-12-06 1996-09-10 Fish; Leonard A. Physical activity training device and method
US6032530A (en) 1994-04-29 2000-03-07 Advantedge Systems Inc. Biofeedback system for sensing body motion and flexure
US5826578A (en) 1994-05-26 1998-10-27 Curchod; Donald B. Motion measurement apparatus
US8280682B2 (en) 2000-12-15 2012-10-02 Tvipr, Llc Device for monitoring movement of shipped goods
US6885971B2 (en) 1994-11-21 2005-04-26 Phatrat Technology, Inc. Methods and systems for assessing athletic performance
US5524894A (en) 1994-11-23 1996-06-11 Shannon; Allan P. Head movement sensor for golf practice
US5592401A (en) 1995-02-28 1997-01-07 Virtual Technologies, Inc. Accurate, rapid, reliable position sensing using multiple sensing technologies
US5913727A (en) 1995-06-02 1999-06-22 Ahdoot; Ned Interactive movement and contact simulation game
US6196932B1 (en) 1996-09-09 2001-03-06 Donald James Marsh Instrumented sports apparatus and feedback method
AU5457598A (en) 1996-11-27 1998-06-22 Princeton Video Image, Inc. Image insertion in video streams using a combination of physical sensors and pattern recognition
US5836829A (en) 1997-03-25 1998-11-17 Van Cott; Robert Golf swing training device
US6038074A (en) * 1997-05-20 2000-03-14 Ricoh Company, Ltd. Three-dimensional measuring apparatus and method, image pickup apparatus, and apparatus and method for inputting image
SE514795C2 (en) 1997-10-03 2001-04-23 Ericsson Telefon Ab L M Device and method for up and down conversion
US6331168B1 (en) 1997-10-24 2001-12-18 Creative Sports Technologies, Inc. Golf training head gear for detecting head motion and providing an indication of head movement
US5916181A (en) 1997-10-24 1999-06-29 Creative Sports Designs, Inc. Head gear for detecting head motion and providing an indication of head movement
US6730047B2 (en) * 1997-10-24 2004-05-04 Creative Sports Technologies, Inc. Head gear including a data augmentation unit for detecting head motion and providing feedback relating to the head motion
DE19750441C2 (en) 1997-11-14 2000-01-27 Markus Becker Device for detecting and controlling postures for therapeutic use in a sitting position
US6059576A (en) 1997-11-21 2000-05-09 Brann; Theodore L. Training and safety device, system and method to aid in proper movement during physical activity
US5895328A (en) 1997-11-21 1999-04-20 Pahio; Pete Golf swing training apparatus
US6871413B1 (en) 1997-12-15 2005-03-29 Microstrain, Inc. Miniaturized inclinometer for angle measurement with accurate measurement indicator
US5891180A (en) 1998-04-29 1999-04-06 Medtronic Inc. Interrogation of an implantable medical device using audible sound communication
US5993323A (en) 1998-08-26 1999-11-30 Golf Tutor, Inc. Golf training apparatus
US6491647B1 (en) 1998-09-23 2002-12-10 Active Signal Technologies, Inc. Physiological sensing device
US6307481B1 (en) 1999-09-15 2001-10-23 Ilife Systems, Inc. Systems for evaluating movement of a body and methods of operating the same
US6568396B1 (en) 1999-10-26 2003-05-27 Philip F. Anthony 3 dimensional head apparatus and method for the treatment of BPPV
US7127370B2 (en) 2000-01-07 2006-10-24 Nocwatch International Inc. Attitude indicator and activity monitoring device
US6611783B2 (en) 2000-01-07 2003-08-26 Nocwatch, Inc. Attitude indicator and activity monitoring device
AU2768001A (en) 2000-01-07 2001-07-24 Paul B. Kelly Jr. Attitude indicator and activity monitoring device
US6582380B2 (en) 2000-01-24 2003-06-24 Ambulatory Monitoring, Inc. System and method of monitoring and modifying human activity-based behavior
US6546291B2 (en) 2000-02-16 2003-04-08 Massachusetts Eye & Ear Infirmary Balance prosthesis
US20040006747A1 (en) 2000-03-13 2004-01-08 Tyler Joseph C. Electronic publishing system and method
US6826509B2 (en) 2000-10-11 2004-11-30 Riddell, Inc. System and method for measuring the linear and rotational acceleration of a body part
US20030207718A1 (en) 2000-10-20 2003-11-06 Perlmutter Michael S. Methods and systems for analyzing the motion of sporting equipment
CA2364919A1 (en) 2000-12-14 2002-06-14 Kevin Tuer Proprioceptive golf club with analysis, correction and control capabilities
AU2001297825A1 (en) 2000-12-15 2002-11-25 Phatrat Technology, Inc. Movement and event systems and associated methods related applications
US6607497B2 (en) 2000-12-18 2003-08-19 The Research Foundation Of The State University Of New York (Suny) Non-invasive method for treating postural instability
WO2002076293A1 (en) 2001-03-26 2002-10-03 Maryrose Cusimano Combined physiological monitoring system
US6678549B2 (en) 2001-03-26 2004-01-13 Cusimano Maryrose Combined physiological monitoring system
US20020134153A1 (en) 2001-03-26 2002-09-26 Grenlund Aaron E. Instrumented athletic device for coaching and like purposes
US20020173364A1 (en) 2001-05-17 2002-11-21 Bogie Boscha Apparatus for measuring dynamic characteristics of golf game and method for asessment and analysis of hits and movements in golf
US20020187860A1 (en) 2001-06-07 2002-12-12 Shoane George K. Method and apparatus for analyzing a golf stroke
WO2002102475A1 (en) 2001-06-07 2002-12-27 Rutgers, The State University Of New Jersey Method and apparatus for analyzing a golf stroke
US20030028377A1 (en) 2001-07-31 2003-02-06 Noyes Albert W. Method and device for synthesizing and distributing voice types for voice-enabled devices
GB2395251B (en) 2001-10-03 2005-04-13 Long Shot Products Ltd A tilt indicator for firearms
US6955542B2 (en) 2002-01-23 2005-10-18 Aquatech Fitness Corp. System for monitoring repetitive movement
GB0203240D0 (en) 2002-02-12 2002-03-27 Worrall Christopher Movement detection apparatus and method of use thereof
US20030216228A1 (en) 2002-05-18 2003-11-20 Rast Rodger H. Systems and methods of sports training using specific biofeedback
TWI282092B (en) 2002-06-28 2007-06-01 Brilliance Semiconductor Inc Nonvolatile static random access memory cell
US6816326B2 (en) 2002-07-12 2004-11-09 Schott Glas Optical system with compensated spatial dispersion
US20040014531A1 (en) 2002-07-17 2004-01-22 Ziener-Gundersen Dag H. Device for training the correct swing for a club
AU2002368068A1 (en) 2002-07-17 2004-02-02 Yoram Baram Closed-loop augmented reality apparatus
US20040033843A1 (en) * 2002-08-19 2004-02-19 Miller John Clifford Motion evaluation system for golf swing and sports training
US20040077438A1 (en) 2002-10-21 2004-04-22 In Choi Racket orientation indicator device and associated method of operation
US20050288119A1 (en) * 2004-06-28 2005-12-29 Hongchuan Wang Real-time measurements for establishing database of sporting apparatus motion and impact parameters
JP5028751B2 (en) * 2005-06-09 2012-09-19 ソニー株式会社 Action recognition device
KR100653081B1 (en) * 2005-11-25 2006-12-04 삼성전자주식회사 Geomagnetic sensor and azimuth computation method thereof

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5430435A (en) * 1992-11-13 1995-07-04 Rhys Resources Adjustable athletic training system
US20040259651A1 (en) * 2002-09-27 2004-12-23 Imego Ab Sporting equipment provided with a motion detecting arrangement
US6836971B1 (en) * 2003-07-30 2005-01-04 Honeywell International Inc. System for using a 2-axis magnetic sensor for a 3-axis compass solution

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2010084440A1 (en) 2009-01-22 2010-07-29 Koninklijke Philips Electronics N.V. Interpreting angular orientation data
US8818751B2 (en) 2009-01-22 2014-08-26 Koninklijke Philips N.V. Interpreting angular orientation data
CN102836543A (en) * 2012-08-28 2012-12-26 洪德伟 Fitness sport recognizing method and related devices
US10744371B2 (en) 2014-09-21 2020-08-18 Stryd, Inc. Methods and apparatus for power expenditure and technique determination during bipedal motion
US11278765B2 (en) 2014-09-21 2022-03-22 Stryd, Inc. Methods and apparatus for power expenditure and technique determination during bipedal motion
US11406291B2 (en) 2016-09-30 2022-08-09 Goertek Inc. Method for monitoring user gesture of wearable device
US12059267B2 (en) 2019-06-26 2024-08-13 Stryd, Inc. Expenditure to overcome air resistance during bipedal motion
CN110514178A (en) * 2019-09-03 2019-11-29 北京源清慧虹信息科技有限公司 Inclination angle measurement method and device based on single-axis acceleration sensors
CN110514178B (en) * 2019-09-03 2021-11-26 北京源清慧虹信息科技有限公司 Inclination angle measuring method and device based on single-axis acceleration sensor

Also Published As

Publication number Publication date
WO2007008930A3 (en) 2007-12-13
US20070015611A1 (en) 2007-01-18
US7383728B2 (en) 2008-06-10

Similar Documents

Publication Publication Date Title
US7383728B2 (en) Orientation and motion sensing in athletic training systems, physical rehabilitation and evaluation systems, and hand-held devices
US10736545B1 (en) Portable system for vestibular testing and/or training of a user
US10966606B1 (en) System and method for measuring the head position and postural sway of a subject
US20110054782A1 (en) Method and apparatus of measuring and analyzing user movement
US9814430B1 (en) System and method for measuring eye movement and/or eye position and postural sway of a subject
US20100156653A1 (en) Assessment device
KR102712457B1 (en) Sports training aid with motion detector
KR102712460B1 (en) Real-time golf swing training aid device
JP6027038B2 (en) Measuring system and measuring device
KR102712454B1 (en) Real-time sports motion training assistant device
US11185736B2 (en) Systems and methods for wearable devices that determine balance indices
US11273354B2 (en) Real time sports motion training aid
WO2024068170A1 (en) Improved golf swing detector

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 06786933

Country of ref document: EP

Kind code of ref document: A2